This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
One of my favorite bands recently caught a bunch of AI accusations, and the lead singer wrote a somewhat-pissed Substack post in response. He doesn't often step into culture war stuff, but this was close enough, I think:
and he goes on to say that fighting AI art in this way is fruitless:
I regret that, over the last couple of years, the culture war has found a new way to poke at random people, and I can't help but cynically laugh at it. Not to mention how short-sighted the hostility is. In that post, the lead singer details what a pain it is to do graphic design for music, videos, and other art, and how much he hates it. Imagine if you could get a machine to do it for you. AI also lifts up people who don't have money, letting them make art the way people with money can. Look at this VEO 3 shitpost. Genuinely funny, and the production value would be insane if it were real, for a joke that probably wouldn't have been worth the cost. But now, someone with some Gemini credits can make it. This increases the number of people making things.
I'm not sure I have any real thesis for this post, but I haven't been very good at directing discussion for my own posts, so, reply to this anecdote in any way you see fit. I thought it was interesting, and a little sad.
I agree that this stuff is becoming more and more difficult to tell apart. We even had one of our own posters falsely accused by the mods of using AI recently. People are going to claim many things are "obviously AI" when they actually aren't, and the mania of false accusations is going to tick a lot of people off. When you're accused of using AI, not only are people saying you're committing artistic fraud, they're also implying that even if you aren't, your output is still generic enough to be mistaken for machine output.
I wish the Luddites would go away and we could all just judge things by quality rather than trying to read tea leaves on whether AI had a hand in creating something.
This also 100% applies to this forum's rule effectively banning AI. It's a bad rule overall.
If you want to talk to an AI, there's already a place where you can do that.
This rhetorical question actually caused me to have a think. Why do people want to talk to an AI? I mean productivity I can understand, all the usual "as a tool" excuses. But I've felt no compulsion, not even curiosity, to talk to an LLM just to talk. And yet I see people casually mentioning doing that all over the place. It's like something straight out of Her, a film which thoroughly squicked me out. Is there anyone here who just casually socializes with an LLM who can explain why they do it?
I've been thinking the same thing. AI text seems fundamentally uninteresting to me. The reasons I'm interested in what humans say are either to find out what people think or to learn actual information/insight about the rest of the world. AI doesn't do the former at all, because there's nobody writing it, so it doesn't let me know anyone's thoughts or feelings; and it's not reliable enough to be good at the latter. On rare occasions I've gotten use out of it as a search engine pointing me toward information I can verify myself, and I don't doubt its various other uses as a tool, but beyond that? Back in the early days, from GPT-2 through GPT-4, I was interested in the samples posted by others, but only because of what they indicated about the state of AI. Is it that some people enjoy the act of conversation itself, even knowing there's nobody on the other end? I wonder which side is the majority, and by how much.
@Fruck compared it to parasociality, but to me it's almost the opposite. For example, I like reading other people discuss the same media I'm interested in. So do a lot of other people; that's presumably why people read Reddit or 4chan threads discussing media, read reviews for books they've already read, watch youtubers like RedLetterMedia, watch reaction videos, etc. People want to know what other people thought; they want to empathize with their reactions to key moments. AI-generated text has none of that appeal, so if people are having parasocial relationships with it, their parasociality is completely different from anything I've felt. I guess the closest comparison is parasocial feelings for fictional characters? If AI were capable of good fiction-writing I might be interested in reading it, the same way I can appreciate good-looking AI art, but currently it's not. Especially not when the character it's writing is "helpful AI assistant", hardly a font of interesting characterization or witty dialogue — yet a lot of people seem to find conversations with that character interesting.
LLMs are a great way to research things, because they have a surface-level understanding on par with a median professional in a given field. You'll be taken for a ride in some way if you don't know the topic yourself, but you can still get a lot out of them.