
Culture War Roundup for the week of April 27, 2026

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

But there are tipping-point complexity requirements for other kinds of things, right?

If you have a large enough group of people who are part of one organization, you end up with politics in some form or another. There’s something there that simply does not exist in an organization consisting of 1 person. Somewhere between 1 and 1,000,000, there is a tipping point where politics (or whatever you want to call the coalition-building, power-seeking formation of hierarchies) forms.

Something similar is true of concepts such as “liquid”. A single molecule of water is not a liquid, nor properly in any state of matter as commonly conceived. Yet liquids are undeniably a real thing, distinct from solids, no?

So I don’t see any reason why you can’t say that somewhere along the spectrum of computational complexity there’s a tipping point. Presumably it’s below the godly number of parameters and calculations that make up the human brain, but that’s probably not the exact limit either. We already have a pretty good spectrum of complexity among animals, and it seems pretty clear that what people generally consider consciousness correlates nearly perfectly with computational complexity (adjusted for our ability to actually perceive that complexity; we aren’t mind readers, and I’m sure there’s some highly complex barnacle-like animal out there that moves once a lifetime or something).

As to our inability to look inside the computer’s mind, that’s just the p-zombie problem. You can’t look in my mind either.

That doesn’t mean its saying “I’m conscious” is evidence of consciousness, but the fact that it can do so much might be. What would actually count as evidence of consciousness to you?

There is some pretty strong evidence that intelligence is a prerequisite for consciousness/sentience, since you have to be able to actually process thoughts, feelings, and emotions in order to experience them. But we have no evidence that they are the same thing, at least not if we are using a definition of "intelligence" broad enough to include AI.

An organization of 1 person expanding to 1,000,000 will inevitably lead to the emergence of politics, with people fighting over status and hierarchy, but an organization of 1,000,000 ants will not, at least not for a definition of politics that encompasses things like competing over positions in a hierarchy and gaining increased authority from them. Ants all cooperate within their own colony because they are a different type of being with different incentives and behaviors: their reproductive success, and thus their evolutionary incentives, center on cooperating to serve the queen. Having a large number of entities is necessary for the emergence of politics, and it is highly correlated with it: if you get 1,000,000 monkeys or parakeets or cats, you're going to get something analogous to politics. But it is not sufficient. If you generalize too far outside your initial observations, some of the underlying supports go away, even if it's not obvious what they are.

I suspect, though I am not certain, that monkeys and dolphins are not quite sentient as they are now. I suspect, though I am not certain, that if you somehow made a monkey or dolphin very, very smart, it would become sentient. I suspect, though I am not certain, that computers have generalized far enough outside of this area that the same does not hold for them. And nobody else is certain either. We do not fully understand sentience, so all evidence carries very low Bayesian weight. The vast majority of things are not sentient, so that's my prior for all non-humans. We have almost no reason to suspect that computers might be sentient, especially if that sentience flips on and off depending on whether they're currently implementing a structure we named "neural networks", and essentially all claims that they are sentient are based on superficial characteristics that shouldn't be necessary components of actual sentience, so my prior has not moved.
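To make "low Bayesian weight" concrete, here is a minimal sketch of the arithmetic; the prior and likelihood ratio are made-up numbers purely for illustration, not estimates of anything. In odds form, weak evidence multiplies the prior odds by a small likelihood ratio, so a low prior barely budges.

```python
# Illustrative only: assumed prior and likelihood ratio, not real estimates.
prior = 0.01              # assumed low prior probability of machine sentience
likelihood_ratio = 1.5    # assumed: evidence only 1.5x likelier if sentient

prior_odds = prior / (1 - prior)                  # probability -> odds
posterior_odds = prior_odds * likelihood_ratio    # Bayes' rule in odds form
posterior = posterior_odds / (1 + posterior_odds) # odds -> probability

print(round(posterior, 4))  # ~0.0149: still "quite low but nonzero"
```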

Evidence of consciousness that could convince me would essentially require a convincing theory of consciousness: one that made me understand it, or at least persuaded me that the people making the argument understood it and that it was likely to be true, and that included AI as being conscious. If the arguments boil down to "you can't prove it's not conscious" and/or "it uses words that imitate humans", then my priors will not move from "quite low but nonzero".