
Culture War Roundup for the week of August 4, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

There's the Dodo Bird Verdict take, where the precise practice of psychotherapy doesn't matter much so long as certain very broad bounds of conduct are followed. If an hour talking with a slightly sycophantic voice is all it takes to ground people, that'll be surprising to me, but it's not bad.

Of course, there are common factors to the common-factors theory. Some of the behaviors outside those bounds of conduct can definitely fuck someone up. Some of them aren't very likely for an LLM to do (I guess it's technically not impossible for an LLM to 'sleep with' a patient if we count ERP, but it's at least not a common failure mode), but others are things LLMs are more likely to do and that human therapists wouldn't even consider ('oh it's totally normal to send your ex three million texts at 2am, and if they aren't answering right away that's their problem').

I'm a little hesitant to take any numbers for ChatGPT psychosis seriously. The extent to which the reporting is always tied to the most recognizable LLM is a red flag, and self_made_human has made a pretty good argument that we wouldn't be able to distinguish the signal from the noise even presuming there were a signal.

On the other hand, I know about mirror dwellers. People can and do use VR applications as a low-stress environment for developing social skills or overcoming certain stressors. But some portion do go wonky in a way I'm really skeptical they would have broken otherwise. Even if they were going to have problems anyway, I don't think they'd have been the same problems.

((On the flip side, I'll point out that Ani and Bad Rudi are still MIA from iOS. I would not be surprised to see large censorship efforts aimed at even all-ages-appropriate LLM actors, if they squick the wrong people out.))

Funnily enough, I have an AAQC on the Dodo Bird model.

It seems like the most parsimonious explanation, but I would say it doesn't disqualify therapy as a valid intervention. A lot of the people being sent to therapy don't have access to a discreet, thoughtful friend who will keep secrets. That might well be a service worth paying for. What isn't in dispute is that therapy works in the first place, even the modalities built on bonkers frameworks.

I see no reason LLMs can't make for okay therapists, and they are definitely better than some of the ones I have personally met.

At the end of the day, I'm just glad that therapy isn't the only tool in my arsenal, and I can dish out the fun drugs. Psychologists are so painfully restricted in what they can do.

What's a "mirror dweller"?

VRChat (like most other social virtual-reality worlds) lets people choose an avatar. At the novice level, these avatars just track the camera's position and orientation, provide a walking animation, and have a limited number of preset emotes, but there's a small but growing industry for extending that connection. Multiple IMUs and/or camera tricks can track limbs, and there are tools used by more dedicated users for face, eye, and hand tracking. These can let an avatar's general pose (sometimes down to finger motions) match that of the real-world person driving it, sometimes with complex modeling where the avatar has to represent body parts that the person driving it doesn't have.
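For a rough sense of what that extra tracking layer is doing, here's a toy sketch of overlaying live tracker readings on an avatar's default pose, with anything untracked falling back to canned animation. All the names and numbers are my own illustration, not VRChat's API or any real SDK:

```python
# Toy illustration only: map whatever trackers a user has onto avatar bones,
# falling back to a default pose for anything untracked.
from dataclasses import dataclass


@dataclass
class TrackerSample:
    position: tuple  # (x, y, z) in meters, playspace coordinates
    rotation: tuple  # orientation quaternion (x, y, z, w)


# The pose an avatar gets with no extra hardware: head and hands from the
# HMD and controllers, everything else from preset animations.
DEFAULT_POSE = {
    "head": TrackerSample((0.0, 1.7, 0.0), (0.0, 0.0, 0.0, 1.0)),
    "left_hand": TrackerSample((-0.3, 1.2, 0.3), (0.0, 0.0, 0.0, 1.0)),
    "right_hand": TrackerSample((0.3, 1.2, 0.3), (0.0, 0.0, 0.0, 1.0)),
    "hips": TrackerSample((0.0, 1.0, 0.0), (0.0, 0.0, 0.0, 1.0)),
    "left_foot": TrackerSample((-0.15, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0)),
    "right_foot": TrackerSample((0.15, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0)),
}


def solve_avatar_pose(tracked):
    """Overlay live tracker data on the default pose.

    A real solver would run inverse kinematics between these targets and
    handle avatars whose proportions (or limb count) don't match the user.
    """
    pose = dict(DEFAULT_POSE)
    for bone, sample in tracked.items():
        if bone in pose:  # ignore trackers the avatar has no bone for
            pose[bone] = sample
    return pose


# Example: a user with head tracking plus one hip tracker. IMU-based trackers
# drift over time, which is exactly what people stand in front of the mirror
# to sanity-check.
frame = solve_avatar_pose({
    "head": TrackerSample((0.0, 1.68, 0.05), (0.0, 0.1, 0.0, 0.995)),
    "hips": TrackerSample((0.02, 0.98, 0.01), (0.0, 0.0, 0.0, 1.0)),
})
print(frame["hips"].position)
```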

While you can go into third-person mode to evaluate how well these pose estimates are working in some circumstances, that's impractical for a lot of in-game use, both for motion-sickness reasons and because it's often disruptive. So most VRChat social worlds will have at least one virtual mirror, usually covering at least an eight-foot-tall by sixteen-foot-wide space, very prominently placed to check things like IMU drift.

Some people like these mirrors. Really like them. Spend-hours-in-front-of-them-and-then-fall-asleep-while-still-in-VR levels of like them. It can sometimes be a social thing, where groups will sit in front of a mirror and have discussions together, or sometimes one person will be constantly watching the mirror while everyone else is doing their own goofy stuff. But they're the mirror dwellers.

I'm not absolutely sure whatever's going on with them is bad, but it's definitely a break in behavior that was not really available ten years ago.

Thanks, I hate it.

I finally took the plunge and joined an art Discord a couple of months back, and VRChat is a big part of their social activity. I actually have an old VR rig I've never bothered setting up, and briefly considered joining in, but increasingly think it's better to leave it on the shelf.