Culture War Roundup for the week of August 4, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Today, OpenAI released an update that puts more mental health guardrails on ChatGPT. I'd been hearing about chatbots inducing psychosis, and I assume the update was a response to that. But looking into the topic more, I'm astounded by how much of average people's mental health is becoming tied to these chatbots. Not just people with severe mental illness, but perhaps even the majority of all people who use chatbots.

A recent survey shows:

  • 49% of LLM users who self-report an ongoing mental health condition use LLMs for mental health support.

  • 73% use LLMs for anxiety management, 63% for personal advice, 60% for depression support, 58% for emotional insight, 56% for mood improvement, 36% to practice communication skills and 35% to feel less lonely.

A quick browse down this particular reddit rabbithole makes me realize how many people are already talking to chatbots as a friend or therapist. My impression is that it's similar to the early days of online dating: people are somewhat embarrassed to admit they have an AI friend, but the numbers increase rapidly the younger you go. I've seen estimates that between 10 and 50(!) percent of young people have used AI for companionship.

In retrospect, it would be shocking if AI therapy didn't take off. Probably the biggest barrier to getting therapy is cost and availability. Chatbots are available 24/7, essentially free, and will never judge you. The rate of mental illness is rising particularly among young people so the demand is there. But it's not just that, the idea of therapy is ingrained into today's culture. There's a sense that everyone should get therapy, who among us is truly mentally healthy, etc. I could easily see it becoming as ubiquitous as online dating is today.

I admit I'm personally relatively skeptical of therapy in general. As I understand it, the results are about the same regardless of which therapeutic method you use, so probably most of the benefit comes from just having someone you can vent to who is empathetic and won't judge you or get bored. If that's the case, then AI therapy is probably as good as or better than human therapy for cases that are not severe. On reddit I see a lot of comments saying that AI therapy has helped people more than years of human therapy, and I can believe that.

So if AI therapy is helping so many people, is that a good thing? I see a lot of parallels between AI therapy and AGI's alignment problem. I believe people when they say they went to therapy and came out feeling better; I'm not confident they came out with a more accurate view of reality. Recently I went down another tangentially related rabbithole about an online therapist who goes by the name of Dr. K and who has publicly streamed his therapy sessions (for legal reasons he doesn't actually call them therapy sessions). What struck me is just how vulnerable a state of mind people are in during therapy, and how subtly assumptions about reality can be pushed on them.

So when you consider how impressionable people are when receiving therapy, how it's becoming increasingly common for adults to use chatbots for therapy, and how it's becoming increasingly common for kids to grow up with chatbots as friends, the potential impact of subtle value assumptions in these models looms large.

I admit I'm personally relatively skeptical of therapy in general. As I understand it, the results are about the same regardless of which therapeutic method you use, so probably most of the benefit comes from just having someone you can vent to who is empathetic and won't judge you or get bored.

What do you think of as therapy?

From time to time I ask Chat something like "my kid did this to his younger sister, even though he knows it's wrong and we've told him 100x what the consequences will be, what's the standard psychology take for what's going on?" or "my wife seems to get really frustrated by my not bringing the dishes up after I'm done with dinner even though I do it 9/10 times and when I don't it's because I got distracted; we split a ton of the housework so it's not like I loaf all day and she knows it, what are the top theories of mind around why she's losing her shit"

I find it more helpful than not.