Culture War Roundup for the week of June 30, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


After Zizians and the efilist bombing I have tried to pay more attention to the cross section of ethical veganism, rationalists, and nerdy utilitarian blogs.

A Substack post titled "Don't Eat Honey" was published. It argues that buying or consuming honey is an unethical act, for insect-suffering-at-scale reasons. According to the essay, bees, like livestock, suffer quite a lot at the hands of beekeepers, and there are a lot of bees. Thus the title: don't eat honey.

The median estimate, from the most detailed report ever done on the intensity of pleasure and pain in animals, was that bees suffer 7% as intensely as humans. The mean estimate was around 15% as intensely as people. Bees were guessed to be more intensely conscious than salmon!

If we assume conservatively that a bee’s life is 10% as unpleasant as chicken life, and then downweight it by the relative intensity of their suffering, then consuming a kg of honey is over 500 times worse than consuming a kg of chicken! And these estimates were fairly conservative. I think it’s more plausible that eating honey is thousands of times worse than eating comparable amounts of chicken.
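For what it's worth, the shape of that back-of-envelope claim is just a chain of multiplications. A minimal sketch, where only the 10% unpleasantness and 15% intensity figures come from the quoted text; the bee-days and chicken-days per kilogram are hypothetical placeholders, not the post's actual inputs:

```python
# Sketch of the honey-vs-chicken comparison's structure.
# Only the 0.10 and 0.15 weights come from the quoted essay;
# the per-kg day counts below are made-up illustrative numbers.

intensity_weight = 0.15            # bees suffer ~15% as intensely as humans (mean estimate)
unpleasantness_vs_chicken = 0.10   # a bee-day assumed 10% as unpleasant as a chicken-day

bee_days_per_kg_honey = 200_000    # hypothetical: bee-days of labor behind 1 kg of honey
chicken_days_per_kg_meat = 25      # hypothetical: chicken-days behind 1 kg of meat

# Chicken-day suffering is the baseline unit; bee-days are scaled down twice.
honey_cost = bee_days_per_kg_honey * unpleasantness_vs_chicken * intensity_weight
chicken_cost = chicken_days_per_kg_meat

ratio = honey_cost / chicken_cost
print(f"honey is ~{ratio:.0f}x worse per kg under these assumptions")
```

The point of the sketch is that the conclusion is almost entirely driven by the enormous bee-days-per-kilogram term; the intensity and unpleasantness weights only shave a couple of orders of magnitude off it.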

This particular post is heavy on assumption and light on rigor, and it drew outrage. Another post on Bentham's blog about insect suffering I recall as higher-quality material for understanding the position. Did you know that composting is an unethical abomination? I'd never considered it!

'Suffering' presents an incommensurable problem. Suffering is a social construct. Suffering is the number and intensity of firing pain receptors over time. Suffering is how many days in a row I experienced boredom as a teenager. Still, science attempts to define and quantify suffering. An equation works out the math: how conscious a cricket is in relation to man, a cricket's assumed capacity to feel pain, the length of time it spends feeling pain, and so on. My prediction is we will figure out the consciousness part of the equation with stable meaning before we ever do so for suffering.
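The equation described above can be sketched as a simple product of discount factors. Every name and number here is an illustrative assumption of mine, not taken from any report:

```python
# Hypothetical sketch of the kind of suffering equation described above.
# All factor names and values are made up for illustration.

def expected_suffering(consciousness_vs_human, pain_capacity, hours_in_pain):
    """Human-equivalent pain-hours: each factor scales the raw duration down."""
    return consciousness_vs_human * pain_capacity * hours_in_pain

# A cricket, under entirely guessed inputs:
cricket = expected_suffering(
    consciousness_vs_human=0.01,  # guessed: 1% as conscious as a human
    pain_capacity=0.5,            # guessed: probability/capacity of feeling pain
    hours_in_pain=2.0,            # guessed: two hours of distress
)
print(f"{cricket:.3f} human-equivalent pain-hours")
```

Note that each factor is itself a contested estimate, which is why the output inherits the incommensurability problem rather than resolving it.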

We will manage to rethink, remeasure, and find additional forms of suffering. People always have. Today, plants do not feel "pain", but tomorrow, pain may not be a prerequisite for suffering. Maybe starvation becomes a moral imperative. If the slope sounds too slippery, consider that people have already built a (relatively unpopular) scaffolding to accept and impose costs at the expense of human comfort, life, and survival. Admittedly, that suffering may present an incommensurable problem doesn't negate any imperative to reduce it. Find more suffering? Reduce that, too. It does give me reason to question the limitations and guard rails of the social technology.

According to Wikipedia, negative utilitarians (NUs) are sometimes categorized as strong NUs and weak NUs. This differentiates what I'd call fundamentalists, who follow suffering-minimizer logic to whatever end, from the milder "weak" utilitarians. The fundamentalist may advocate for suffering reduction at a cost that includes death, your neighbor's dog, or the continued existence of Slovenia, the honey bee capital of the world. Our anti-honey, anti-suffering advocate has previously demonstrated he values some positive utility when it comes to natalism, but much of his commenting audience appears to fall in the fundamentalist category.

One vibe I pick up from the modern vegans is that the anti-suffering ethics are the ethics of the future. That our great-grandchildren will look backwards and wonder how we ever stooped so low as to tolerate farming practice A or B. I don't doubt we'll find cost effective, technological solutions that will be accepted as moral improvements in the future. I am not opposed to those changes on principle. Increase shrimp welfare if you want, fine.

My vague concern is that this social technology doesn't appear limited to spawning technological or charitable solutions. With things like lab meat showing up more frequently in the culture war, I'd expect the social technology to spread. So far, however, vegans remain a stable share of the US population. Nerdy utilitarian bloggers have yet to impose their will on me. They just don't think I should eat honey.

Of all the things I did not expect to see in a "J'Accuse!" post, composting would have been high on the list if I had ever contemplated the ethical and moral issues involved. In letting worms break down food scraps to create soil. Like they've been doing ever since the first worms crawled through soil breaking down humus.

When I read stuff like that (if your food scraps are already fly-infested, be sure to humanely kill the insects before disposing of your rubbish), I have to wonder: are these people living in the world of nature at all? They write as though they were born and raised on a space station and never saw a crumb of non-metallic, non-artificial surface in all their born days.

I swear, I am getting N.I.C.E. vibes from this attitude of "nature, ugh, organic life is so gross and icky" about, well, every darn natural process in the world of animal life. From "That Hideous Strength":

...The Italian was in good spirits and talkative. He had just given orders for the cutting down of some fine beech trees in the grounds.

"Why have you done that, Professor?" said a Mr. Winter who sat opposite. "I shouldn't have thought they did much harm at that distance from the house. I'm rather fond of trees myself."

"Oh yes, yes," replied Filostrato. "The pretty trees, the garden trees. But not the savages. I put the rose in my garden, but not the briar. The forest tree is a weed. But I tell you I have seen the civilised tree in Persia. It was a French attaché who had it, because he was in a place where trees do not grow. It was made of metal. A poor, crude thing. But how if it were perfected? Light, made of aluminium. So natural, it would even deceive."

"It would hardly be the same as a real tree," said Winter.

"But consider the advantages! You get tired of him in one place: two workmen carry him somewhere else: wherever you please. It never dies. No leaves to fall, no twigs, no birds building nests, no muck and mess."

"I suppose one or two, as curiosities, might be rather amusing."

"Why one or two? At present, I allow, we must have forest for the atmosphere. Presently we find a chemical substitute. And then, why any natural trees? I foresee nothing but the art tree all over the earth. In fact, we clean the planet."

"Do you mean," put in a man called Gould, "that we are to have no vegetation at all?"

"Exactly. You shave your face: even, in the English fashion, you shave him every day. One day we shave the planet."

"I wonder what the birds will make of it?"

"I would not have any birds either. On the art tree I would have the art birds all singing when you press a switch inside the house. When you are tired of the singing you switch them off. Consider again the improvement. No feathers dropped about, no nests, no eggs, no dirt."

"It sounds," said Mark, "like abolishing pretty well all organic life."

"And why not? It is simple hygiene. Listen, my friends. If you pick up some rotten thing and find this organic life crawling over it, do you not say, 'Oh, the horrid thing. It is alive,' and then drop it?"

"Go on," said Winter.

"And you, especially you English, are you not hostile to any organic life except your own on your own body? Rather than permit it you have invented the daily bath."

"That's true."

"And what do you call dirty dirt? Is it not precisely the organic? Minerals are clean dirt. But the real filth is what comes from organisms--sweat, spittles, excretions. Is not your whole idea of purity one huge example? The impure and the organic are interchangeable conceptions."

"What are you driving at, Professor?" said Gould. "After all we are organisms ourselves."

"I grant it. That is the point. In us organic life has produced Mind. It has done its work. After that we want no more of it. We do not want the world any longer furred over with organic life, like what you call the blue mould--all sprouting and budding and breeding and decaying. We must get rid of it. By little and little, of course; slowly we learn how. Learn to make our brains live with less and less body: learn to build our bodies directly with chemicals, no longer have to stuff them full of dead brutes and weeds. Learn how to reproduce ourselves without copulation."

..."There is a world for you, no?" said Filostrato. "There is cleanness, purity. Thousands of square miles of polished rock with not one blade of grass, not one fibre of lichen, not one grain of dust. Not even air. Have you thought what it would be like, my friend, if you could walk on that land? No crumbling, no erosion. The peaks of those mountains are real peaks: sharp as needles, they would go through your hand. Cliffs as high as Everest and as straight as the wall of a house. And cast by those cliffs, acres of shadow black as ebony, and in the shadow hundreds of degrees of frost. And then, one step beyond the shadow, light that would pierce your eyeballs like steel and rock that would burn your feet. The temperature is at boiling-point. You would die, yes? But even then you would not become filth. In a few moments you are a little heap of ash; clean, white powder. And mark, no wind to blow that powder about. Every grain in the little heap remain in its place, just where you died, till the end of the world . . . but that is nonsense. The universe will have no end."

"Yes. A dead world," said Mark, gazing at the moon.

"No!" said Filostrato. He had come close to Mark and spoke almost in a whisper, the bat-like whisper of a voice that is naturally high-pitched. "No. There is life there."

"Do we know that?" asked Mark.

"Oh, si. Intelligent life. Under the surface. A great race, further advanced than we. An inspiration. A pure race. They have cleaned their world, broken free (almost) from the organic."

"But how----?"

"They do not need to be born and breed and die; only their common people, their canaglia do that. The Masters live on. They retain their intelligence: they can keep it artificially alive after the organic body has been dispensed with--a miracle of applied biochemistry. They do not need organic food. You understand? They are almost free of Nature, attached to her only by the thinnest, finest cord."

"Do you mean that all that," Mark pointed to the mottled white globe of the moon, "is their own doing?"

"Why not? If you remove all the vegetation, presently you have no atmosphere, no water."

"But what was the purpose?"

"Hygiene. Why should they have their world all crawling with organisms? And specially, they would banish one organism. Her surface is not all as you see. There are still surface-dwellers--savages. One great dirty patch on the far side of her where there is still water and air and forests--yes, and germs and death. They are slowly spreading their hygiene over their whole globe. Disinfecting her. The savages fight against them. There are frontiers, and fierce wars, in the caves and galleries down below. But the great race press on. If you could see the other side you would see year by year the clean rock--like this side of the moon--encroaching: the organic stain, all the green and blue and mist, growing smaller. Like cleaning tarnished silver."

Anyone else reading that excerpt and thinking 'Based'? Wouldn't it be excellent to carve out a new artificial world, make better animals and plants according to one's wishes? Live as long as one likes without regard for age?

Not the specifics of perfectly cleaning the world, that could take many angles. One might make a jungle of talking animals, or an endless lived-in leafy suburbia or a Willy Wonka wonderland or all of those things simulated within a ball of computronium. But isn't that the logical endpoint of ever increasing mastery and control of the world? What's the alternative, stasis?

I can sense that many people don't like this vision, but isn't this what we're doing regardless of objections? Unless you think 'no, people mustn't live forever' or 'we mustn't have children' or 'technological advancement must stop', you endorse indefinite growth in numbers and in the power of worldshaping and knowledge ("All stable processes we shall predict. All unstable processes we shall control"), so eventually something like this will happen.

Anyone else reading that excerpt and thinking 'Based'?

That is why he wrote it that way. He's describing a character, a type of character even, not just a caricature.

Wouldn't it be excellent to carve out a new artificial world, make better animals and plants according to one's wishes? Live as long as one likes without regard for age?

I'm all for building artificial worlds. I'm skeptical "better" plants and animals are possible; we've altered plants and animals before, and we can doubtless alter them far more radically in the future, but what makes those alterations "better"? "Living as long as one wants, regardless of age" used to be something I was very excited for, less so after contemplating the downsides. All the pathways to serious immortality I'm aware of involve making the sum of me fully legible, and the risks of that very likely outweigh any possible benefit, assuming it's even possible.

But isn't that the logical endpoint of ever increasing mastery and control of the world? What's the alternative, stasis?

The alternative is thinking that our mastery is not ever-increasing in the way you seem to mean. Technology can and has greatly increased, and maybe it will greatly increase even more, but technology is not the same thing as mastery. If you want a highly reductive example of the difference between the two, compare the original Snow White film to the remake. The people who made the remake had vastly more technology, vastly more resources, vastly more experience in filmmaking to draw on; more "mastery", right? So why was the original a masterpiece, and the remake a trash disaster? Again, that's a highly reductive example, but it seems to me that the principle generalizes quite widely.

I don't think we are moving toward ever-increasing mastery. I don't think we have to stop tech advancement either. I think what will happen next is pretty similar to what has happened before: we'll build something wondrous, and then the contradictions will assert themselves and it will all fall apart.

Technology is the concentration of power. Concentrated power is individual power. There is almost certainly a level of individual power that society, as we understand the term, can't contain or channel, and once that level is achieved society will simply fail. Society maintains technology; when society fails, likely the technology will fail as well, and then it's back down the curve for the survivors.

Maybe this time will be different. I wouldn't bet on it, though.

I think they could've made a better Snow White film than the original, it's just that they didn't want to. They wanted to make a bad film and did so.

Mastery isn't the problem; it's bad people using great resources to achieve bad goals. Now that I notice it, there's a pleasing symmetry between our tags, "Just build nuclear plants" and "nuclear levels of sour", and what we're saying.

However, I do agree that there are serious risks with progress and power concentration; it will probably end in tears for the vast majority of us, for the same fundamental reason: bad people wanting bad things.

I don't see a collapse pathway though, only greater acceleration. Technology forms society. Writing and agriculture enabled settled states, steam engines enabled modern society. Powerful AI will enable transhuman or posthuman society. Maybe that does look more like an oligarchy where a few enjoy limitless technological power and can suppress everyone else. It may well be bad for those who aren't a chosen few or a singular one. Nevertheless I expect that it'd be much more highly developed than modern civilization in technological sophistication and scale.

Even if there's a full nuclear exchange induced by destabilizing technology, would the survivors really give up on securing more wealth, more power, more security through technological superiority? I believe they'd think 'damn, we should've struck first' or 'this time let's hide our schemes more effectively' or 'at least we've got the most remaining resources, we can try again'. They'd still know all the things we know, and they'd be back at it again sooner or later, probably sooner and with a more ferocious sense of determination. A full nuclear exchange isn't certain either; it's hard to foresee what happens. I agree that there will be ever-greater instability and disruptions, but that's just part of the transition from one kind of society to the next. The general trend is that even occasional setbacks (usually rooted in social decline) are overcome: the Bronze Age Collapse, the fall of Rome, and the Black Death only temporarily inhibited a larger trend of acceleration. Ideally acceleration should be channelled in a more pro-social way than it is, but it seems an irresistible trend. Only if this time is different should we expect it to fail.

I think they could've made a better Snow White film than the original, it's just that they didn't want to. They wanted to make a bad film and did so.

I'm pretty sure no one involved in the process actually said "Our goal is to make a bad film". I'm pretty sure a lot of people involved in the process were trying as hard as they possibly could to make a blockbuster. Maybe all of them. And again, they had orders of magnitude more technology than Walt Disney had, but the technology didn't actually solve the problem of making a good movie even a little bit.

Mastery isn't the problem, it's bad people using great resources to achieve bad goals.

Just so. Humans inevitably human, for good or ill. They'll human with sticks and rocks, and they'll human just as hard with nanocircuitry and orbital launch vehicles and nuclear fusion.

Even if there's a full nuclear exchange induced by destabilizing technology, would the survivors really give up on securing more wealth, more power, more security through technological superiority?

Are you familiar with Bostrom's Vulnerable World Hypothesis? If not, I'd recommend it. The standard assumption is that tech advancements proceed in a stable fashion, that the increase in individual/breaking power is balanced by an increase in communal/binding power. I don't think that assumption is valid, not only for future tech, but very likely for tech that already exists. What we have available to us at this moment is probably enough to crash society as we know it; all that is required is for the dice to come up snake-eyes. Adding more tech just means we roll more dice. Maybe, as you say, some future development jacks the binding power up, and we get stable dystopia, but honestly I'd prefer collapse.

You're correct that we bounced back from the Black Death and so on. But consider something like Bostrom's "easy nukes" example. There, the threat is baked into the tech itself. There's no practical way to defend against it. There's no practical way to live with it. You can suppress the knowledge, likely at grievous cost, but the longer you keep it suppressed, the more likely someone rediscovers it independently. Bostrom's example is of course a parable about AI, because he's a Rationalist and AI parables are what Rationalists do. It seems to me, though, that their Kurzweilian origins deny them the perspective needed to see the other ways the shining future might be dismayed.