Culture War Roundup for the week of April 24, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

The ultra sci-fi utopian scenario and the mass extinction nanobots scenario are both a) exciting and b) relieve humanity of responsibility. If the god AI comes then it comes, nothing you can do about it. So naturally people gravitate towards scenarios like that.

More realistic AI concerns about things like job loss and surveillance are more mundane, and, crucially, they are things that we as a society might actually have some degree of control over, if we make the right choices. So that sounds boring and hard. So people would rather not think about it.

I think the issue here is that you are imagining current and historic types of oppression, but worse. With AI technology, 1984-style thought control becomes obsolete. Imagine a world where all human-level tasks can be automated. That old military adage about needing boots on the ground to hold and secure territory? Poof. Gone. The killbots can search your house for contraband and identify friend from foe. The Hobbesian constraints on political power disappear. Whoever controls the killbots controls the world.

But wait, isn’t this exactly, “current and historic types of oppression, but worse”? No. There isn’t any demand for labor anymore. That means there isn’t any reason for the common man to even exist. No political power + no economic value = you are a waste of calories. Sure, a human in charge is more likely to have sentimental attachment to other humans than a fully-agentic shoggoth AI would, but I still don’t like our chances.

So I guess that makes me an optimist, because I believe we would land merely in a 1984 dystopia.

With AI technology, 1984-style thought control becomes obsolete. Imagine a world where all human-level tasks can be automated. That old military adage about needing boots on the ground to hold and secure territory? Poof. Gone. The killbots can search your house for contraband and identify friend from foe. The Hobbesian constraints on political power disappear. Whoever controls the killbots controls the world.

None of this is going to happen in the foreseeable future, and it is not guaranteed even in the most feverish "recursive improvement" scenario. Comparative advantage is still a thing. Rationalists have a long history of pooh-poohing it, brushing it off, and coming up with convoluted arguments for why it no longer applies, but it's still a thing. This is why they spent the 2010s preaching doom at blue-collar workers (I knew a guy who was literally reaching out to truck drivers and giving them vocational guidance, because self-driving trucks were going to replace them any day now), only for it to turn out that it's the brainy white-collar workers who will be first on the chopping block. Killbots? Yeah, very impressive, but why should I waste my precious GPUs doing stupid grunt work when I can get a bunch of the more unscrupulous humans to do the same thing, and pay them in guaranteed rations of actual meat instead of the usual Uncle Klaus' Bugz?
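The comparative-advantage point can be made concrete with some made-up numbers (all figures hypothetical, just for illustration): even if the AI is absolutely better at every task, what matters is opportunity cost, i.e. how much high-value output each party gives up to do a unit of grunt work.

```python
# Toy illustration of comparative advantage with hypothetical numbers.
# Output per unit of effort: how much of each task type gets produced.
ai = {"brainwork": 100.0, "gruntwork": 10.0}   # per GPU-hour
human = {"brainwork": 1.0, "gruntwork": 2.0}   # per human-hour

def opportunity_cost(producer):
    """Brainwork forgone to produce one unit of gruntwork."""
    return producer["brainwork"] / producer["gruntwork"]

ai_cost = opportunity_cost(ai)        # 10.0: each unit of gruntwork costs 10 brainwork
human_cost = opportunity_cost(human)  # 0.5: each unit of gruntwork costs 0.5 brainwork

# The grunt work goes to whoever gives up less brainwork to do it --
# here the human, even though the AI is better at both tasks in absolute terms.
assign_grunt_to = "human" if human_cost < ai_cost else "ai"
print(ai_cost, human_cost, assign_grunt_to)  # 10.0 0.5 human
```

Which is exactly the "why waste my precious GPUs" logic: the AI is ten times better at grunt work in absolute terms, but doing it costs the AI far more brainwork than it costs the human.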

On the other hand, 1984-style surveillance is absolutely going to happen. The whole idea of AI is almost tailor-made for it, and we are currently slapping digitization on top of every aspect of our lives, even when it brings no practical value, just to make sure we can feed more data to the AI and nothing escapes its gaze. Once the technology takes off, they'll have insights into what makes us tick that will make Big Brother blush.

And at that point I just want to ask the Basilisk what I can do to help.