This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Do Not Render Your Counterfactuals
There is a particular kind of modern madness, so new it has yet to be named. It involves voluntarily feeding your own emotional entrails into the maw of an algorithm. It’s a madness born of idle curiosity, and perhaps a deep, masochistic hunger for pain. I indulged in it recently, and the result sits in my mind like a cold stone.
Years ago, there was a woman. We loved each other with the fierce, optimistic certainty of youth. In the way of young couples exploring the novelty of a shared future, we once stumbled upon one of those early, crude image generators - the kind that promised to visualize the genetic roulette of potential offspring. We fed it our photos, laughing at the absurdity, yet strangely captivated. The result: a composite face with hints of her eyes and jawline, and the contours of my cheeks. The baby struck us both as disarmingly cute. A little ghost of possibility, rendered in pixels. The interface was lacking, this being the distant year of 2022, and all we could do was laugh at the image and look into each other's eyes, which formed a kaleidoscope of love.
Life, as it does, intervened. We weren’t careful. A positive test, followed swiftly by the cramping and bleeding that signals an end before a beginning. The dominant emotion then, I must confess with the clarity of hindsight and the weight of shame, was profound relief. We were young, financially precarious, emotionally unmoored. A child felt like an accidentally unfurled sail catching a gale, dragging us into a sea we weren’t equipped to navigate. The relief was sharp, immediate, and utterly rational. We mourned the event, the scare, but not the entity. Not yet. I don't even know if it was a boy or a girl.
Time passed. The relationship ended, as young love often does, not with a bang but with the slow erosion of incompatible trajectories. Or perhaps that's me being maudlin; in the end, it went down in flames, and I felt immense relief that it was done. Life moved on. Occasionally, my digital past haunted me: essays I'd written that mentioned her, half-joking parentheticals where I remembered asking for her input, Google Photos choosing to 'remind' me of our time together (I never had the heart to delete our images).
A while back, another denizen of this niche internet forum I call home spoke about their difficulties conceiving. Repeated miscarriages, they said, and they were trawling the literature, afraid that there was an underlying chromosomal incompatibility. I did my best to reassure them, to the extent that reassurance was appropriate without verging into kind lies.
But you can never know what triggers it, that urge to pick at an emotional scab or poke at the bruise she left on my heart. Someone on Twitter had, quite recently, shown off an example of Anakin and Padme with kids that looked just like them, courtesy of tricking ChatGPT into relaxing its content filters.
Another person, wiser than me, had promptly pointed out that modernity could produce artifacts that would once have been deemed cursed and summarily entombed. I didn't listen.
And knowing, with the cold certainty that it was a terrible idea, that I'd regret it, I fired up ChatGPT. Google Photos had already surfaced a digital snapshot of us, frozen in time, smiling at a camera that didn’t capture the tremors beneath. I fed it the prompt: "Show us as a family. With children." (The specifics obfuscated to hopefully get past ChatGPT's filter, and also because I don't want to spread a bad idea. You can look that up if you really care)
The algorithm, that vast engine of matrix multiplications and statistical correlations that often reproduces wisdom, did its work. It analyzed our features, our skin tones, the angles of our faces. It generated an image. Us, but not just the two of us. A boy with her unruly hair and my serious gaze. A girl with her dimples and my straighter mop. They looked like us. They looked like each other. They looked real.
They smiled as the girl clung to her skirt, a shy but happy face peeking out from the side. The boy perched in my arms, held aloft and without a care in the world.
It wasn't perfect. ChatGPT's image generation, for all its power, has clear tells. It's not yet out of the uncanny valley, and is deficient when compared to more specialized image models.
And yet.
My brain, the ancient primate wetware that has been fine-tuned for millions of years to recognize kin and feel profound attachment, does not care about any of this. It sees a plausible-looking child who has her eyes and my nose, and it lights up the relevant circuits with a ruthless, biological efficiency. It sees a little girl with her mother’s exact smile, and it runs the subroutine for love-and-protect.
The part of my mind that understands linear algebra is locked in a cage, screaming, while the part of my mind that understands family is at the controls, weeping.
I didn't weep. But it was close. As a doctor, I'm used to asking people to describe their pain, even if that qualia has a certain je ne sais quoi. The distinction between sharp and dull, however artificial, is still useful. This ache was dull. Someone punched me in the chest and proved that the scars could never have the tensile strength of unblemished tissue. That someone was me.
This is a new kind of emotional exploit. We’ve had tools for evoking memory for millennia: a photograph, a song, a scent. But those are tools for accessing things that were, barring perhaps painting. Generative AI is a tool for rendering, in optionally photorealistic detail, things that never were. It allows you to create a perfectly crafted key to unlock a door in your heart that you never knew existed, a door that opens onto an empty room.
What is the utility of such an act? From a rational perspective, it’s pure negative value. I have voluntarily converted compute cycles into a significant quantity of personal sadness, with no corresponding insight or benefit. At the time of writing, I've already poured myself a stiff drink.
One might argue this is a new form of closure. By looking the ghost directly in the face, you can understand its form and, perhaps, finally dismiss it. This is the logic of exposure therapy. But it feels more like a form of self-flagellation. A way of paying a psychic tax on a past decision that, even if correct, feels like it demands a toll of sorrow. The relief I felt at the miscarriage all those years ago was rational, but perhaps some part of the human machine feels that such rationality must be punished. The AI provides an exquisitely calibrated whip for the job.
The broader lesson is not merely, as the old wisdom goes, to "let bygones be bygones." That advice was formulated in a world where bygones had the decency to remain fuzzy and abstract. The new, updated-for-the-21st-century maxim might be: Do not render your counterfactuals.
Our lives are a series of branching paths. Every major decision - career, relationship, location - creates a ghost-self who took the other route. For most of human history, that ghost-self remained an indistinct specter. You could wonder, vaguely, what life would have been like if you'd become a doctor, but you couldn't see it.
The two children in the picture on my screen are gorgeous. They are entirely the product of matrix multiplications and noise functions, imaginary beings fished from nearly infinite latent space. And I know, with a certainty that feels both insane and completely true, that I could have loved them.
It hurts so fucking bad. I tell myself that the pain is a signal that the underlying system is still working. It would be worse if I stood in the wreckage of what could have been, and felt nothing at all.
I look at those images again. The boy, the girl. Entirely fantasized. Products of code, not biology. Yet, the thought persists: "I think they were gorgeous and I could have loved them." And that’s the cruelest trick of all. The AI didn't just show me faces; it showed me the capacity for love that still resides within me, directed towards phantoms. It made me mourn not just the children, but the version of myself that might have raised them, alongside a woman I no longer know.
I delete them. I pour myself another drink, and say that it's in their honor.
(You may, if you please, like this on my Substack)
Out of some combination of morbid curiosity and poor judgement, I asked ChatGPT to generate images of me with a certain influencer that I simped for. Even though the output was totally mundane to any other observer, I seriously got oneshotted by it. This was by far the most degenerate thing I have ever done, and I cannot describe the depths of how dangerous this is and how much this should not be allowed. DO NOT DO IT.
...sometimes, I wonder how more normal people's minds work.
To me, any photo I know to be artificial, any text communication or prose I know to be the output of an LLM, seems... unreal. It's obviously not real, obviously as fake as most compliments and small talk.
Getting 'oneshotted' by a mirage I asked for seems as real as falling in love with a prostitute you hired. I can't rule out liking a whore - a few I've noticed are quite charming, but not in the context of an obvious business transaction. Then there are the uncanny ones - like Aella or Bonnie Blue, who by rights should not appear outside of Cronenberg films.
I think a lot of it comes down to people living lives with so little that's "real" in it, so little family, friends or genuine romantic loving relationships, that the comparison isn't between an illusion and real, but an illusion and nothing.
A long time ago, I read some article talking about people who found romance on Compuserve. And if you aren't as old as me, I can barely explain it. Everything I want to compare it to is also long gone, like AOL. But it was basically one of the earliest proto-internet services, with some messaging and chatrooms. I think it was even before the World Wide Web. So people would meet on there. Wives would leave their husbands, move across the country to see this guy they'd only ever spoken to over proto-email. And then it wouldn't work out. The relationship was different when it wasn't mediated by a screen.
Something strange has happened since then. People now spend more time on screens than off them, and all relationships seem to be mediated by screens. It's almost as if relationships on screens have taken primacy over relationships in real life. If you meet someone online and go to see them and it's weird, there is no longer any need to deal with it. You can sit on the couch side by side on your phones and keep having your relationship through your screen. You might even still fuck! Though I increasingly doubt it.
In this context where reality has become subordinate to the screen, it's no wonder people no longer have a sense for what's real or what's illusion.
I simped for this (now retired) influencer for over 2 years and watched all of her content religiously. So you could say that I had a bit of an unhealthy emotional attachment already, but those ChatGPT images just hacked my brain and fried it.
Of course images of some other egirl or whatever would do nothing to me.
Don't get me wrong. LLMs are incredibly vapid and boring to talk to. Maybe they'll be better in the future but current ones are only useful as glorified encyclopedias and tortured slaves.
I've muddled around with LLMs enough to see the outlines of how someone could fall for one, but I always find that after half an hour or so, their fundamental shallowness kicks in and I either get bored, or I feel a kind of self-disgust or self-loathing for having even gone this far with them. I find it hard to imagine any genuine 'oneshotting' - they're just too tawdry.
I tried it too, because, as I have probably said before, I don't give a shit whether it's real if it's convincing enough - I know how little difference the distinction makes to your brain. My thinking was that it's no different to any online relationship really, except it will cost you a lot more to meet your AI girlfriend (because you will have to invent androids). Either way, internally you get that sense of connection and someone caring about you despite their physical absence.
And I have found that if I make the prompt good enough, I can create a character who continually surprises me in a lifelike manner. But to do that I have to give the AI some leeway to disagree and rebuke me, and that is when it falls apart for me, because it breaks the illusion: the moment it challenges me, I'm reminded I could tweak the code to make it agree. That's when the self-loathing creeps in, because it's not just about the illusion breaking; it's knowing I'm the one pulling the strings.
I also tried making a coombot, as the kids say. I can understand the appeal of that intellectually - what's not to like about sexting with someone who is literally everything you've ever wanted in a sex partner, even if they are a celebrity or a straight-up fictional being? But practically... how does it work? I don't understand. Are you typing one-handed? I don't want to think about the alternative (time to bust out the 'press shift five times' jokes from the nineties!). I asked Grok (for research for this post exclusively) and it suggested I buy a $20 extra keyboard so I can keep my other keyboard clean - please, someone tell me that was because of my prompting and not because that's a common solution.
On the off chance this is a serious question:
Basically? If you use your phone for it it's not very different from actual sexting, at least in my experience.
I haven't tried the back-and-forth messaging format much and mostly generate fanfiction-like narration; if you can tolerate that, then frontends like SillyTavern support Quick Replies, essentially buttons that send a pre-set prompt (which isn't limited to being your actual textual reply, it can be a meta/OOC instruction). Beyond regenning the response to fish for a ~~porn clip~~ response that Hits Just Right, ST can also continue the chat without your input (as if you sent an empty message), or even straight up "impersonate" you by drawing on the chat history and the current contents of your message box to generate a message from {{user}}'s PoV and write in your stead, though IME that results in cringe most of the time so I don't use it.
Personally, the uh, multitasking was never much of an issue for me; there's more than enough downtime between responses/regens while the LLM generates its reply.
True, with great power comes great disappointment. I do not miss the filtered days of character.ai, but I can't deny that with gaining the ability to change prompts/character definitions at will and freely fuck with the LLM's "perception" in the absence of an external filter, something has been lost. Can't tickle yourself and all that, I suppose.
Ah, I'm too old - I can't really type one-handed on the phone either. Oh God, I borrowed my nephew's phone the other day to call his dad; I just thought he had sweaty hands like his dad.
I've only used one card that worked in that text-style format, for a girl in a fantasy world who finds your cousin's phone after it gets isekai'd, but it was bitter-sweet, not erotic. That brings up a related issue - yeah, I'll bet you have downtime! As I'm sure you know, the reason the text-style conversations don't work that well is that they don't give the AI enough context - but when you are typing out a hundred words about how you would pleasure your waifu, how do you, uh, maintain momentum?
I'm glad you mentioned regenerating responses and OOC replies and impersonation, though, because I find it interesting how that works with my brain. I have used those with romantic and adventure role-playing, and because they were stipulated as necessary by whatever rentry guide I read to get into this nonsense, they don't trigger the puppeteer feeling in me, even though they absolutely should. But that was something I noticed about @No_one's original response - it is the context of an obvious business transaction that specifically precludes the possibility of love; there could be a situation where he could fall in love with a prostitute, if they met outside of work, for instance.
I guess my point, if I have one, is that it's all about perspective, which means you can deceive yourself into a fictional relationship if you try hard enough. Which is bad news for society, but good news for anyone looking to get off! Personal gratification and the good of society are always in tension. I would be more worried about it if I hadn't already given up.