Culture War Roundup for the week of June 9, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I've been thinking the same thing. AI text seems so fundamentally uninteresting to me. The reasons I'm interested in humans talking are either to find out what people think or to learn actual information/insight about the rest of the world. AI doesn't do the former at all, because there's nobody writing it, so it doesn't let me know anyone's thoughts or feelings, and it's not reliable enough to be good at the latter. On rare occasions I've gotten use out of it as a search engine pointing me towards information I can verify myself, and I don't doubt its various other uses as a tool, but beyond that? Back in the early days of GPT-2 through to GPT-4 I was interested in the samples posted by others, but that was because of what they indicated about the state of AI. Is it that some people enjoy the act of conversation itself even if they know there's nobody on the other end? I wonder which side is the majority, and by how much.

@Fruck compared it to parasociality, but to me it's almost the opposite. For example, I like reading other people discuss the same media I'm interested in. So do a lot of other people; that's presumably why people read Reddit or 4chan threads discussing media, read reviews for books they've already read, watch YouTubers like RedLetterMedia, watch reaction videos, etc. People want to know what other people thought; they want to empathize with their reactions to key moments. AI-generated text has none of that appeal, and if people are having parasocial relationships with it, then their parasociality is completely different from anything I've felt. I guess the closest comparison is parasocial feelings for fictional characters? If AI were capable of good fiction-writing I might be interested in reading it, the same way I can appreciate good-looking AI art, but currently it's not. Especially not when the character it's writing is "helpful AI assistant", hardly a font of interesting characterization or witty dialogue, yet a lot of people seem to find conversations with that character interesting.

...

I'm glad you said this, because I both agree with what you said and, from another perspective, disagree with it. And maybe I'm using parasocial wrong.

I wouldn't consider reading user reviews on Reddit or watching RLM reviews parasocial at all, although I guess they are one-sided relationships. But like you said, the valence almost goes the other way - I know that when I read Reddit I idgaf about the stranger whose post I'm reading (unless they consistently knock it out of the park enough for me to notice), but if I post on Reddit I use even more casual language than I do normally - I write for the hypothetical audience. But the parasociality with AI I was thinking of, oh yes, that's different. That's parasocial in the same sense as those crazy ladies who attack soap stars for cheating on their lover in the show. That's true parasociality: a relationship entirely imagined by the viewer, as great or as terrible as they desire.

Because I would say you are right that there fundamentally isn't anyone writing it, so you don't get anyone's thoughts and feelings - but you do get the zeitgeist position, which is an amalgamation of everyone's thoughts and feelings. It won't tell you what is true, but it is fantastic at telling you what popular consensus thinks is true. Forming a relationship with that is bonkers, but the narcissist in me sure sees the appeal.

And when I use it as a search engine I do prefer a conversation, even though there's no one at the other end. I have always thought better with someone to bounce off; I always viewed taking notes to read the next day as a sort of bouncing off myself, so using AI that way was a natural fit. And for general information that is easy to find, AI is much better than a search engine - that's why Google and Microsoft put it at the top of the search page. Yeah, you have to verify it's real, but you already had to do that with Google and Wikipedia! Or you should have been, anyway.

That's why I wanted to know if my examples count as 'talking just to talk' - that's how I would describe them, but it's not about company, it's about information and novelty. But maybe I'm just flattering myself by saying that, in the eyes of those squicked out by AI? I know I feel like I've been typical-minding, just assuming everyone is as enamoured with words as I am. I was aware I have a broader tolerance for slop than most, but I figured if anyone here was a slow AI adopter it would be me, and that most people here would be running their own LLMs already while I'm still playing around with the public models.