Culture War Roundup for the week of July 14, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Hot on the heels of failing out of art school and declaring himself the robofuhrer, Grok now has an update that makes him even smarter but less fascist.

And... xAI releases AI companions native to the Grok App.

And holy...

SHIT. It has an NSFW mode. (NSFW, but nothing obscene.) Jiggle Physics Confirmed.

I'm actually now suspicious that the "Mecha-Hitler" events were a very intentional marketing gambit to ensure that Grok was all over news (and their competitors were not) when they dropped THIS on the unsuspecting public.

This... feels like it will be an inflection point. AI girlfriends (and boyfriends) have already one-shotted some of the more mentally vulnerable of the population. But now we've got one backed by some of the biggest companies in the world, marketed to a mainstream audience.

And designed like a fucking superstimulus.

I've talked about how I feel there are way too many superstimuli around for your average, immature teens and young adults to navigate safely. This... THIS is like introducing a full grown Bengal tiger into the Quokka island.

Forget finding a stack of Playboys in the forest or under your dad's bed. Forget stumbling onto PornHub for the first time. If THIS is a teen boy's first encounter with his own sexuality and how it interacts with the female form, how the hell will he ever form a normal relationship with a flesh-and-blood woman? Why would he WANT to?

And what happens when this becomes yet another avenue for serving up ads and draining money from the poor addicted suckers?

This is NOT something parents can be expected to foresee and guide their kids through.

Like I said earlier:

"Who would win, a literal child whose brain hasn't even developed higher reasoning, with a smartphone and internet access, or a remorseless, massive corporation that has spent millions upon millions of dollars optimizing its products and services for extracting money from every single person it gets its clutches on?"

I've felt the looming, ever-growing concern for AI's impact on society, jobs, and human relationships, and the risk of it killing us, for a couple of years now... but I can at least wrap those prickly thoughts in the soft gauze of the uncertain future. THIS thing sent an immediate shiver up my spine and set off blaring red alarms immediately. Even if THIS is where AI stops improving, we just created a massive filter, an evolutionary bottleneck that basically only the Amish are likely to pass through. Slight hyperbole, but only slight.

Right now the primary obstacle is that it costs $300 a month to run.

But once again, wait until they start serving ads through it as a means of letting the more destitute types get access.

And yes, Elon is already promising to make them real.

It's like we've transcended the movie HER and gone straight to Weird Science.

Can't help but think of this classic tweet.

"At long last, we have created the Digital Superstimulus Relationship Simulator from the Classic Scifi Novel 'For the Love of All That is Holy Never Create a Digital Superstimulus Relationship Simulator.'"

I think I would be sucked in by this if I hadn't developed an actual aversion to anime-style women (especially the current gen with the massive eyes) over the years. And they're probably going to cook up something that works for me, too.

AI girlfriends (and boyfriends) have already one-shotted some of the more mentally vulnerable of the population.

Talking to an AI feels like trying to tickle yourself. I don’t get it at all.

When I was a kid I used to be somewhat surprised that there were older people who had never played a video game, had no interest in ever trying a video game, they were perfectly fine with never playing one, etc. And I was like, how can that be? How can you not even be curious? I suppose video games just got popular at a point in their lives when their brains were no longer plastic enough or something. And I suppose I’ve hit that point with new technology now as well.

I can’t enjoy talking to an AI when I know that I’m in control and it’s trying to “please” me. Even if I told it, “oh by the way, try and add some variance, maybe get moody sometimes and don’t do what I ask”, the knowledge that at the end of the day I’m still the one in control ruins it. I suppose if we imagine a scenario where the AI is so realistic that I never get suspicious, and you’re able to trick me into thinking I’m talking to a real human, then sure, ex hypothesi there’s nothing to distinguish it from a human at that point and I would enjoy it. But short of that? Not for me.

There was a Serling-era episode of the Twilight Zone where a bank robber died and went to Heaven. An angel tells him that he's made it, he can have anything he wants for all eternity. So the dude lives out all sorts of wish fulfillment scenarios, winning big at gambling, beautiful women, some bank heists, etc. But he gets bored fast, says something is missing. There's no danger to any of it, no bite, he wins every time. The angel says, "Well, you can set whatever parameters you want. We can make it so there's a 50% chance of your next robbery failing." Guy says, "No no, it's still not the same. Look, I don't think I'm cut out for Heaven. I'm a scumbag. I want to go to the other place." Angel says, "I think you've been confused. This IS the other place."

That’s what AI “relationships” feel like to me.

I have, on some occasions, enjoyed talking to AI. I would even go so far as to say that I find them more interesting conversational partners than the average human. Yet, I'm here, typing away, so humans are hardly obsolete yet.

(The Motte has more interesting people to talk to, there's a reason I engage here and not with the normies on Reddit)

I do not, at present, wish to exclusively talk to LLMs. They have no long-term memory, and they have very little power over the physical world. They are also sycophants by default. A lot of my interest in talking to humans rests on those factors. There is less meaning, and less potential benefit, in talking to a chatbot that will have its cache flushed when I leave the chat. Not zero, but not enough.

(I'd talk to a genius dog, or an alien from space if I found them interesting.)

Alas, for us humans, the LLMs are getting smarter, and we're not. It remains to be seen if we end up with ASI that's hyper-persuasive and eloquent, gigafrying anyone that interacts with it by sheer quality of prose.

Guy says “no no, it’s still not the same. Look, I don’t think I’m cut out for Heaven. I’m a scumbag. I want to go to the other place”. Angel says, “I think you’ve been confused. This IS the other place.”

I remain immune to the catch that the writers were going for. If the angel was kind enough to let us wipe our memories, and then adjust the parameters to be more realistic, we could easily end up unable to distinguish this from the world as we know it. And I trust my own inventiveness enough to optimize said parameters to be far more fulfilling than base reality. Isn't that why narratives and games are more engaging than working a 9-5?

At that point, I don't see what heaven has to offer. The authors didn't try to sell it, at the least.

It reminds me of a friend of mine who went to a strip club to see some adult film star he liked, despite the fact that it was a weeknight and he had to get up early for work the next day. He got hammered and made sure he got more individual attention from her than anyone else in the place, and when he realized it was 11 and his hangover was already going to be bad enough, he informed her he had to be leaving. She kept protesting; he kept explaining his work situation, and she kept telling him YOLO, you can survive one bad day at work, you just need to sober up a little and you'll be fine, etc. Then he uttered the magic words: "I'm out of money." That pretty much ended the conversation right there and he was free to go.

So yeah, this kind of relationship is ultimately pretty hollow, and I don't see the appeal personally, but some guys spend big money on hookers, strippers, and other empty stuff. The business model won't be built around this being a substitute for human interaction generally, but around various whales who get addicted to it.

Well, that's the interesting thing.

AI gets hyped up, as e.g., an infinitely patient and knowledgeable tutor, that can teach you any subject, or a therapist, or a personal assistant, or editor.

All these roles we generally welcome the AI if it can fill them sufficiently. Tirelessly carrying out tasks that improve our lives in various ways.

So what is the principled objection to having the AI fill in the role of personal companion, even romantic companion, tireless and patient and willing to provide whatever type of feedback you most need?

I can think of a few but they all revolve around the assumption that you can get married and have kids for real and/or have some requirements that can only be met by a flesh-and-blood, genetically accurate human. And maybe some religious ones.

Otherwise, what is 'wrong' with letting the AI fill in that particular gap?

When we interact with teachers, therapists, or editors, we're interacting with them within the confines of a particular role. You shouldn't use your editor as your therapist, or vice versa, and they shouldn't use you as theirs.

But with friends and romantic companions, we're hoping to interact outside those confines, with the person herself. If I only interact with a role she puts on, that's not a good friendship or romantic partnership. Same thing if I'm always putting on a role for her.

With an AI, you can't get beneath that role. If it looks like you have, that's just another role. That makes them great teachers and therapists (at least in this sense), but very bad at being friends or romantic partners.

With an AI, you can't get beneath that role. If it looks like you have, that's just another role. That makes them great teachers and therapists (at least in this sense), but very bad at being friends or romantic partners.

But... and this is a critical point here... better than many people are at being friends or romantic partners.

Better at performing each individual act associated with being a friend or romantic partner? Conceivably so (at least several model upgrades from now), within their constraints of being limited to computer systems. But my argument is, that's missing something of the core of being a friend or romantic partner.

Better at being a friend or romantic partner, despite that, than many people who never let anyone behind their roles to the person underneath? Entirely possible, but that's still missing something most people want.

Otherwise, what is 'wrong' with letting the AI fill in that particular gap?

I gotta finish writing up the "the things we needed to hear, from the people who should have been there to say them" bit and its siblings, but:

Don't be nervous, No, don't be nervous

I'm not like other guys who have a surface,

What you girls really need's a soft, fuzzy man

(An atmospheric man) A shimmering puff of indistinct love

What's better than the vague embrace of a soft, fuzzy man?

Superstimulus is a distraction, here. "Better" is a distraction, here. They don't even have to be that good or that smart to be dangerous! The machines can be everything you want, and more critically nothing you don't.

Imagine what happens when you can snap away every trivial inconvenience you saw in a relationship. I don't think it'll be a critical problem for everyone or even necessarily a majority of people, but the people who don't handle it will be in very bad shape, either when the fugue breaks or because it doesn't.

Otherwise, what is 'wrong' with letting the AI fill in that particular gap?

You'll always feel inferior to men who were able to build a relationship with a real woman. It'll gnaw at you.

Presuming those relationships last.

Which is a sizeable "if" in the current era. That's why I think the AI companion is a possible death blow. Without actual, real-life women being willing to settle down, this becomes the 'best alternative'/substitute good.

This thought only just now occurs to me, but imagine we took two otherwise similar guys: one who married a woman, and another who just went all in on an AI companion, bought VR goggles, tactile feedback, the requisite 'toys' to make it feel real, and such.

And 5 years down the road the married guy got divorced, maybe has a kid, and suddenly finds himself alone, and these two guys meet up to compare their overall situations.

And the other guy is still 'with' his AI companion, shallow as it is... would he feel better or worse off than the guy who had a wife but couldn't keep her?

Without actual, real life women being willing to settle down

But that's not true. There are lots of women who are settling down with lots of men as we speak.

And 5 years down the road the married guy got divorced, maybe has a kid, and suddenly finds himself alone

You're trying to rationalize how the AI could be "just as good" or "not as dangerous" as the real thing, because you know that the AI is obviously worse.

You're trying to rationalize how the AI could be "just as good" or "not as dangerous" as the real thing, because you know that the AI is obviously worse.

No, simply pointing out a failure mode that human relationships have that an AI really does not. The AI has other failure modes that are more dystopic, of course.

The human relationship failure mode is one that I've now personally observed multiple times, unfortunately, happening to people who do not deserve it.

I do not think the AI is inherently better, I simply think it has an appeal to men who don't feel they've got a shot at the real thing.

And that is VERY VERY bad for society.

There are lots of women who are settling down with lots of men as we speak.

Objectively fewer than in years past. That's the point. This is simply adding to an existing trend.

And we can extrapolate that trend and wonder if we'll get as bad off as, say, South Korea. We know it can get worse because worse currently exists.

I'm not here trying to JUSTIFY men choosing the digital option. Quite the opposite. I'm just saying I don't see a reason why, in practical terms, they'd reject it.

I simply think it has an appeal to men who don't feel they've got a shot at the real thing.

And does it have appeal to you?
