Culture War Roundup for the week of February 2, 2026

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

In a way, AI is harder on nerds than it is on anyone else.

At a closed-door meeting in Princeton, leading researchers said agentic AI tools now handle up to 90% of their intellectual workload—forcing a reckoning over who, or what, drives scientific discovery.

It is interesting to see, now that AI is ingrained in the personal and professional lives of vast numbers of ‘normal’ people, how mundanely it slots into the daily existence of the average person. I don’t mean that critically; I mean that the average person (especially globally, but probably also in the rich world) probably already believed there were ‘computers’ that were ‘smarter than them’. ChatGPT isn’t so different from, say, Jarvis in Iron Man (or countless other AIs in fiction), and the median 90-100 IQ person may even have believed in 2007 that technology like that actually existed “for rich people”, or at least that it didn’t seem much more advanced than what they already had.

Most people do not seek or find intellectual satisfaction in their work. Intellectual achievement is not central to their identity. This is true even for many people with decent-IQ white-collar jobs. They may be concerned (like many of us) with things like technological unemployment, but the fact that an AI might do everything intellectual that they can, faster and better, doesn’t cause them much consternation. A tool that builds their website from a prompt is a tool, like a microwave or a computer. To a lot of users of LLMs, the lines between human and AI aren’t so much blurring together as becoming irrelevant; the things most people seek from others, like physical intimacy, family and children, good food and mirth, are not intellectual.

This is much more emotionally healthy than the nerd’s response. A version of the Princeton story is now increasingly common on ‘intellectual’ forums and in online spaces, as more and more intelligent people realize that the social and cultural implications of mass automation go beyond the coming economic challenge. Someone whose identity is built around being a member of their local community, a religious organization or a small sports team, around their spouse and children, a small group of friends with whom they go drinking a couple of times a month, a calendar of festivals and birthdays, will fare much better than someone who has spent a lifetime cultivating an identity built around an intellect that is no longer useful to anyone, least of all themselves.

I was thinking recently that I’m proud of what I’ve done in my short career, but that smart-ish people in their mid/late twenties to perhaps mid/late forties are in the worst position with regard to the impact of AI on our personal identities. Those much older than us have lived and experienced full careers at a time when their work was useful and important, when they had value. Those much younger will either never work or, if they’re say 20 or 22 now, work for only a handful of years before AI can do all intellectual labor - and they have in any case already had three years of LLMs for their own career funeral planning. But those of us in this age range, baited into completing the long, painful, tiresome and often menial slog that characterizes the first decade of a white-collar career, face the double humiliation of never getting further than that and of having wasted so much of our lives preparing for a future that isn’t going to happen.

In a way, AI is harder on nerds than it is on anyone else.

I'd say it's actually harder on artists than on anyone else (assuming you aren't counting artists as a subset of nerds). 90% is not 100%. At least for programmers, reviewing code and structuring the solution were always part of the job; people who were fond of code-golfing CRUD in Rust ("look how much more elegant I can make this by using a right fold") are going to suffer, but only a little bit.

I imagine the same is true for physicists - maybe not, but the fact that they are willingly adopting it motu proprio (of their own accord) suggests it is.

Maybe in a few years things will change: AI will be able to do everything fully autonomously, and we'll all end up at the bottom of the totem pole (or "the balls on the dick", as some would say). But so far that's not the case and, to be honest, the last big improvement to text-generation models I've seen happened in early 2024.

Meanwhile I see artists collectively having a full-blown psychotic break about AI, hence indie game dev awards banning any and all uses of AI, etc. I think this is because it changes their job substantially, on top of eliminating most of those jobs, and also because it came completely out of left field: nobody expected art to be one of the main things AI would be good at. Quite the opposite - people expected art to be impossible for AI because it doesn't have imagination or soul or whatever. In fact, the problem with AI is that it has too much imagination. And revealed preference strikes here too: you don't see many artists talking about how they are integrating AI into their workflow.

Most art was already commodified, and it was commodity artists, not creative artists, who got the most brutal axe.

Essentially, contrary to your point about AI having imagination, creativity is the primary skill it lacks. It's basically a machine for producing median outcomes based on its training data, which is about as far away from creativity as you can get.

But for most artists, their jobs were based on providing quotidian, derivative artworks for enterprises that were soulless to begin with. To the extent that creativity was involved in their finished products, it was at a higher level than their own input, e.g. a director or the like commissioning preset quotidian assets as components in their own ‘vision’, the vision being the creative part of the whole deal.

However, I do believe creative artists will be threatened too. It's a little complicated to get into, but I think creative art depends not just on lone individuals or a consumer market, but on a social and cultural basis of popular enthusiasm and involvement in a given artform. I'm talking about dilettantes, critics, aficionados here. It's a social and cultural pursuit as much as it's an individual or commercial one, and I think that AI will contribute to the withering away of these sorts of underpinnings the same way corporate dominance and other ongoing trends previously have.

So for the artistic field, I envision complete and total commoditized slop produced by machines, once the human spirit has finally been crushed.

If your market consists of 99 derivative rip-offs and one legitimately interesting and fresh idea, the fresh idea will take half the market and the 99 rip-offs will fight over the other half. If there are 999,999 derivative rip-offs, then they'll have to split their half a lot more ways but they still won't be able to push in on the fresh idea's cut.

Art is a winner-takes-all industry. The J.K. Rowlings and Terry Pratchetts of the world have many thousands of times as many sales as a Joe Average churning out derivative slop that's merely so-so. The addition of more slop won't change the core dynamic. Fundamentally, anyone trying to get the audience to accept a lower-quality product isn't pitting themselves against the ingenuity of the artist, but against the ingenuity of the audience. Trying to hide information from a crowd that has you outnumbered thousands to one is not easy.
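As a quick back-of-the-envelope sketch of the split being described - a toy assumption of my own, not necessarily the commenter's exact model - suppose the one fresh idea holds half the market and the rip-offs divide the remainder evenly:

    # Toy sketch, not a real market model: the single fresh idea keeps a fixed
    # share (50% here), and the derivative rip-offs split the rest evenly.
    def per_ripoff_share(n_ripoffs: int, fresh_share: float = 0.5) -> float:
        """Market share left to each individual rip-off."""
        return (1.0 - fresh_share) / n_ripoffs

    print(f"{per_ripoff_share(99):.4%}")       # ~0.5051% each with 99 rip-offs
    print(f"{per_ripoff_share(999_999):.7%}")  # ~0.00005% each with 999,999 rip-offs

Under that assumption, adding more rip-offs only shrinks each rip-off's slice; it never touches the fresh idea's half.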

If you get 999,999 rip-offs, the market simply collapses.

If the market collapses, then that creates demand for someone to create a new market with less crap.