
Culture War Roundup for the week of April 13, 2026

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I was planning to write up a larger top-level effort-post on this topic, but since you've already made the top-level I'll post the notes I was drafting.

For the last few days, I've been reading about the Sam Altman attack drama and the recent warehouse fire attack, and I've been finding the reactions pretty scary. General sentiment on HN is something along the lines of "Altman deserved it", and even in my generally leftish acquaintance bubble the vibe is "they shouldn't have missed" or "we need more of this, fuck the rich", which doesn't really bode well for the stability of society.

Whether or not you believe the more bombastic claims of AI CEOs, I do think it's clear that, at minimum, AI is going to exacerbate the trend of technology centralizing power, wealth and status, even as absolute material standards continue to improve beyond the wildest dreams of 99.9% of humanity throughout history. For better or for worse, human happiness seems to be tied only lightly to absolute material standards and heavily tied to relative status, position, and feelings of fairness, and the internet and social media are super-stimuli for a human sense of status calibrated to the Dunbar number.

Ruling out FOOM levels of societal disruption, I can think of a few ways that this plays out.

Left-wing communist populist marxist social democratic total victory: public outcry reaches all-time highs, perhaps with some peasant revolts sprinkled in, and the AOC/Mamdani coalition gets voted in to dismantle the AI labs, big tech and the icky billionaires. Leaving aside the fact that this would annihilate the economy and living standards by proxy, I'm not really convinced that with mass internet and social media there's any Gini index or amount of redistribution that would leave the status-anxious public satisfied. First they came for the billionaires and then they came for the homeowners.... Certainly comparable democratic countries with half the Gini index of America are still constantly flooded with rhetoric about eating the rich.

Right-wing AI strongman technofeudal democratic backsliding: political violence becomes normalised as a part of day-to-day life, and in response, perhaps after a significant assassination or riot, a strongman or group of technocrats uses the violence as an excuse to seize absolute power, abetted by AI in part or in full. The lumpenproles are kept under control via mass surveillance, drones and guns, or killed off entirely. The worst ending, but one that seems depressingly realistic looking at the history of inequality and failed revolutions.

Nothing ever happens: whether mass unemployment happens or not, most people end up with sinecures or welfare to keep them relatively pacified. Social media and concentrating wealth inequality continue to make people miserable even as absolute material conditions begin to reach sci-fi levels, and competition for zero-sum goods like housing in desirable areas, prestigious educations and sinecures becomes even more red in tooth and claw, in the vein of the East Asian countries. Political violence gets somewhat more normalised, perhaps to Latin American or 20th-century standards, but it's limited to isolated incidents.

Generally I consider myself libertarian and think that billionaires are good, actually, but I do think that inequality, and society's response to it, is likely to be one of the defining questions of the 21st century. While Sam Altman is the most visible face of AI to normies, pure game theory dictates that technological progress will continue with or without the consent of any individual person, company or nation-state. If the capability exists, someone (or something...) is going to be the one holding the reins to wealth, status and power, and as long as those reins are held, the holder will inevitably be the target of the green-eyed masses. I don't think we yet have the social technology to deal with this, and it's not clear that we ever will; I've seriously been wondering lately whether this might be one way the Fermi Paradox manifests.

For better or for worse, human happiness seems to be tied only lightly to absolute material standards and heavily tied to relative status,

Yeah, it's kinda depressing to realize that some of the most optimistic scenarios for AI will still result in a lot of human misery. It's fun to be a trust fund baby, but if all the hoi polloi are trust fund babies too, it kinda loses its shine. You are just another unemployed loser who can't get a reservation at any of the best restaurants. And if you want to earn extra money beyond your UBI, you need to take some demeaning job as a personal servant for the grandchild of some schmuck who was lucky enough to put $10,000 into the right stock at the right time.

There is a story they used to teach in American history classes in high school that many of the early immigrants to the United States were people who had been locked out of European status hierarchies and decided to make a fresh start of things. Perhaps a similar sentiment will drive migration to the stars.

Yeah, it's kinda depressing to realize that some of the most optimistic scenarios for AI will still result in a lot of human misery.

At some point, a Matrix-style world where everyone is just dumped into a virtual reality simulator where each can become the hero of their own tale switched from one of the most dystopian outcomes imaginable to one of the better ones.

The realization hit me this weekend as I was hanging out with some friends at an artificial lagoon with temperature-controlled water, lifeguards on duty, and basically everything optimized for keeping guests from getting hurt (and keeping them spending money).

This is precisely how a 'beneficent' superintelligence is most likely to resolve the problem. Stick humans into a simulation, or maybe a completely artificial environment with all the edges that cause death and misery sanded off.

A permanent Disney World vacation. Maybe swap out the aesthetics often enough to make it feel novel.

Call me John the Savage, but I always thought The Culture was a human zoo dystopia.

Life without struggle seems positively meaningless.

That is 5000% my own objection to the Culture as portrayed.

The ONLY entities with true volition in that universe are the Minds. No human ever makes a meaningful choice, and whatever influence humans have on their own fate is inherently pre-calculated by the Minds.

And somehow the humans are 100% aware of the arrangement and there are few dissenters, although they can get uppity from time to time.

It honestly makes me sympathetic to Culture opponents just on the basis of "yes, maybe they're sadistic, evil, and backwards, but at least they're the masters of their own fate dammit!"

I think that's the precise objection leveled by the main character of the first book, actually.

And somehow the humans are 100% aware of the arrangement and there are few dissenters, although they can get uppity from time to time.

It seems to me that the Culture deals with this by letting the dissenters interact with other cultures/societies on its behalf as part of Contact. Also, humans live extended lifespans but are not immortal, so far as Wikipedia tells me, so the problem eventually solves itself; even the most fiery rebel can't sustain meaningful rebellion within the Culture, and if they leave to join a different world, then they are no longer a problem:

Since the Culture's biological population commonly live as long as 400 years and have no need to work, they face the difficulty of giving meaning to their lives when the Minds and other intelligent machines can do almost anything better than the biological population can. Many try—few successfully—to join Contact, the Culture's combined diplomatic / military / government service, and fewer still are invited to the even more elite Special Circumstances (SC), Contact's secret service and special operations division. Normal Culture citizens vicariously derive meaning from their existence via the works of Contact and SC. Banks described the Culture as "some incredibly rich lady of leisure who does good, charitable works... Contact does that on a large scale."

Yeah, and that's the existential horror of the situation to me.

You can dissent from the Culture, you can rebel, you can even try to kill yourself.

But none of that will change the outcome.

It's still there. Everywhere. Inevitable. And all alternatives are inherently worse.

I have said before that the inverse of the Culture might be a civilization of pure P-zombies whose whole, entire goal is removing sentience from the universe. Not intelligence, just sentience.

Assuming they're technologically equivalent to the Culture, would the Culture win that fight?

That is also our real world situation.
