This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

And this, unfortunately, is why I now skim past your posts without reading them.
I am aware. I find it most unfortunate, since I do genuinely believe that LLMs help make my writing even stronger.
That's not the point. LLMs would make many people's writing stronger (for some value of "strong"). I'd rather read your writing, weaker or not. Now when I read you, every point you make, every turn of phrase, every word choice, I don't know if it was you or the LLM. Sure, maybe 80% to 90% of it was you. I can't know, and that makes me not care. I can prompt ChatGPT for its sparkling shiny opinions all day long.
I genuinely do not understand the intuition at play here. Let's imagine someone who has an instinctual aversion to the use of AI image gen: is using Adobe Firefly to change a single pixel with it sufficient to taint a larger painting? Two pixels? Ten? To finish the blocked-in background that the artist would have been too lazy to finish had he not had the tools at hand?
What if the artist deletes the AI pixel and reinserts one himself, with the exact same hexcode?
(It is worth noting that at one point, in the not-so-distant past, even Photoshop itself was treated with similar suspicion.)
Where is your threshold for "too much"? When you recognize an AI fingerprint? The problem is that once you begin suspecting it in a particular user, it is easy to imagine that there is more of it than in reality. Of course, if you have an all-or-nothing attitude, then I suppose that sounds less horrible to you than it does to me. I skew closer to a linear-no-threshold model, or perhaps one where, for the average writer, there exists an x% of AI usage that will increase overall quality as measured by multiple observers. Preferably blinded ones.
This x% can be very high for the truly average writer. I'm talking the average Redditor. It can be very low, vanishingly so, for others. Scott has mentioned that he has tried using LLMs to imitate his own style and has been consistently disappointed in the outcome.
I think, for me, the optimal amount is 1-10%. 20% is pushing it. This essay is closer to 20%. But even that 20% is closely vetted for factuality. Alas, it has not been vetted for style as hard, or else this topic wouldn't have arisen. In fact, I didn't particularly try. Performing edits to launder AI commentary as my own strikes me as dishonest.
I envision myself as the artist using the tool to finish painting that unfinished background. Sometimes it makes something so good it's worth bringing to prominence in the foreground. The day when I can see no conceivable value-add from my own contribution is the day I pack my bags as a writer. I suppose it is fortunate that I've been at it so long that there is a sizeable corpus of time-stamped, archived evidence showing that I am damn good without it. That I don't need it. I still think I benefit from it, though I'm not sure I can change your mind on the topic.
After all, there are a lot of people making pure slop. I try not to ever become one of them.
There's a minor scandal in the Tumblr video game sphere, because Larian Studios discussed the use of AI tools in its development pipeline. It's not clear exactly what they were using the tools for, but most critics have interpreted it as only using AI-gen for concept art that won't even get a pixel in the final game, and they're still very unhappy with it.
((I've been trying to put together a top-level post on the topic, but I dunno if it'll be interesting enough, or if it'll be me going full Gell-Mann Amnesia given that we have actual video game artist experts around.))
It's a shame they're being shamed. One of my takeaways from GPT-4 was that it was good enough to beat a lot of video game text and dialogue. Filler content, conversation with NPC #987, and side quests? AI can jazz up the things the budget doesn't allow for. It should no longer be cost prohibitive to develop the 120 filler fetch quests into something slightly more meaningful and engaging. Extra flair, storytelling, or character development where there was barebones effort. Someone needs to weather the criticism, raise the bar, and get paid for it.
Disagree on all counts. When AI does content, it's ultra generic, has no sense of tone or effect, and lacks any of the idiosyncratic spontaneity of even sloppily put-together human content. There's also a lack of broader complexity, meaning that any 'character development' it adds to a given story doesn't correspond to a grander vision of what that story aspires to be or is about, but is essentially just the cut-and-pasted clichés, tropes, and emblems of other works, which only incidentally contain some of their meaning. I'd prefer content to be cut rather than given over to AI.
That's what I think about many video games! Granted, RPGs were never my forte, but I have played enough games, including enough recent-ish titles.
Take any Bethesda game, or probably most other open-world titles. They all have mountains of generic filler called content that doesn't get cut despite being generic filler. The content doesn't get cut because it needs to be there. Players like wandering into an interaction and they like doing the thing. That's the appeal. Developers can reward players with a do-the-thing-get-thing loop, and writers can reward players for doing the thing by getting them invested in some story or consequence behind it. A great game rewards the player both with the dopamine of did-thing-got-thing and with an engaging story. There are not very many great games, and there are only so many opportunities for great writing in a given game.
In Starfield, there's a common loop. The player meets a character NPC. The player may have up to 4 distinct interactions with the character. Possibly one or two of those interactions have 2 different variations. The player is provided with a few sentences of backstory in some way, then the player is expected to recognize the shape of a familiar story and fill in the blanks. These storytelling opportunities come in a few familiar phases translated to video game format.
There's nothing in there that can't be improved upon by a writer working with an LLM. If nothing else, it offers the opportunity to add depth to a bland and boring A-B experience. The generic shape of the story, where the player is expected to recognize it and fill in the blanks, gets more filling.
Starfield is a bad game, and it had so many of these generic fetch quests, generic storylines, and generic dialogues that I don't think I got close to finishing it. And hey, I know this developer; I expect some level of generic human slop, but boy did it seem bad. On the other hand, I've also played most of Baldur's Gate 3. BG3 is a much higher quality game. As I understand it, it's considered a once-in-a-generation class of game. My generation's equivalent in this genre would be The Witcher 3 which, as I recall, had relatively rich stories and writing in parts. There's still plenty of bland, formulaic, or marginal content that wouldn't be harmed by curated robot slop.
Now, I may not be aware of the newfangled indie RPGs true gamers play these days, but I have played enough games to know that writers often phone it in, producing nothing a good prompt couldn't match. That may be due to the workloads typical of the industry, or it might be that video game writers write games for reasons other than greatness. In either case, I bet there's a use case for this now. Today! Someone could go find a banal interaction in a game, feed in a few prompts, and get something that enriches that experience. No question in my mind. AI will not single-handedly create a cohesive BG3 story board and 100 hours of dialogue in one go, but even a free model can help a mediocre writer enrich their 15-minute mini-story side quest #121.
Starfield was heavily panned as having by far the worst writing out of all Bethesda games. In a game like Skyrim, even the useless fetch quests have some variety in the framing.