Culture War Roundup for the week of February 13, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

It really looks to me like there's something particular in the rationalist brain that makes it susceptible to, say, believing that computer programs might in fact be people. Insofar as I've seen, normies - when exposed to these new LLM-utilizing programs - go "Ooh, neat toy!" or "I thought it already did that?" or, at the smarter end, start pondering legal implications, how this might be misused by humans, or what sort of biases get programmed into the software. However, rationalists seem to get uniquely scared about things like "Will this AI persuade me, personally, to do something immoral?" or "Will we at some point reach the point where we should grant rights to these creations?" or even "Will it be humanity's fate to just get replaced by a greater intelligence, and maybe that's a good thing?" or something like that.

For me, at least, it's obvious that something like Bing replicating existential dread (discussed upthread) makes it no more human or unnerving than before - beyond the fact that it's unnerving that some people with potential and actual social power, such as those in charge of inputting values to AI, would find it unnerving - because it's not human. Then again, I have often taken a pretty cavalier tone toward animal rights (a major topic especially in EA-connected rationalist circles, I've found, incidentally), and if we actually encountered intelligent extraterrestrials, it would be obvious to me that they shouldn't get human rights either, because they're not human. I guess I'm just a pro-human chauvinist.

I tire of people taking potshots at rationalists. Yes, some seem too fixated on questions like "is the LLM conscious and morally equivalent to a human?" - I feel the same way about their fascination with animal rights. But they seem to be the only group that grokked, long ago and consistently to this day, the magnitude of this golem we summon. People who see LLMs and think "Ooh, neat toy!" or "I thought it already did that?" lack any kind of foresight, and the people pondering bias have only slightly more. We've discovered that silicon can do the neat trick that got us total dominance of this planet, and that it can be scaled. This is not some small thing; it is not destined to be trivia relegated to a footnote in a history book about the '20s a few decades from now. It is going to be bigger and faster than the industrial revolution, and most people seem to think it's going to be comparable to facebook.com. Tool or being, it doesn't really matter: the debate over whether they have rights is going to seem like a discussion of whether steam engines should get mandatory break time, by some crude analogy between overheating and human exhaustion.

Fuck rights; they are entirely a matter of political power, and if you see a spacefaring alien, I dare you to deny it its equality. This is not the problem.

Normies easily convince themselves, Descartes-like, that non-primate animals, great apes, human races, and even specific humans they don't like do not have subjective experiences, despite ample and sometimes painful evidence to the contrary. They're not authorities on such questions just because their consensus defines common sense.

I am perfectly ready to believe that animals and apes have subjective experiences. That does not make me any more likely to consider them subjects worthy of being treated as equal to humans, or of being taken into account in the same way humans are. For me, personally, this should be self-evident, axiomatic.

Of course it's not self-evident in general, since I've encountered a fair number of people who think otherwise. That's pretty harmless when we're talking about animals, for example, but evidently not so harmless when we're talking about computer programs.

It really looks to me like there's something particular in the rationalist brain that makes it susceptible to, say, believing that computer programs might in fact be people

It's the belief that *we* - our essence - are just the sum of physical processes, and that if you reproduce the process, you reproduce the essence. It's what makes them fall for bizarre ideas like Roko's Basilisk and focus on precisely the wrong thing ("acausal blackmail") when dismissing them; it's what makes them think uploading their consciousness to the cloud will actually prolong their lives in some way; and so on.