
Culture War Roundup for the week of April 24, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


A heads-up: Yudkowsky will be talking AI with YouTube long-timer Ross Scott on the 3rd. This comes after Ross's last videochat with fans [warning: long, use the timestamps in one of the comments to skip around], where AI and Big Yud came up.

I expect that Ross will be able to wring some sort of explanation about AI risk out of Yudkowsky that will be palatable to the everyman. Ross has talked about things like Peak Oil before (here's an old, old video on the subject), so it will be interesting to see how he handles this. I'll have to see if I can find out what Ross's position on AI risk has been so far.

I admit this is surprising. I would've predicted that the Butlerian Jihad movement would deprioritize Yud as a crank who might blurt out some risky political take, but he keeps establishing himself as the Rightful Caliph. Have the Yuddites discovered a stash of SBF's lunch money to buy up a bunch of podcasters, including some crypto has-beens looking for a new grift? Or is this simply a snowball effect, where Yud becomes more credible and attractive the more podcasts he appears on?

On the other hand, this is all show for the plebs anyway; policy people never lack for experts to cite. And «rationalists» can straight-up lie to their audiences even about the words of those experts.

I should accelerate my work on a dunk on Yudkowsky's whole paradigm, even though it honestly feels hopeless and pointless. If anyone has better ideas, I'm all ears.

Yud's message is aligned with the powers that be, so his voice will be magically amplified by the algorithm. The state is scrambling to ramp up its AI capabilities, and it needs to keep the boot on any ambitious small companies in the form of a "six month pause". Yud thinks he's advocating for a less dangerous arms race; in reality he's just helping the most dangerous people catch up.

This makes sense if you consider that Yud takes Roko's Basilisk seriously. He's clearly realized this is his best contribution to its existence.

Well, how did Big Yud react back then, when Roko posted his idea on Less Wrong?

Called it wrong?

No, Yud went into full loud screaming mode.

https://basilisk.neocities.org/

I don't usually talk like this, but I'm going to make an exception for this case.

Listen to me very closely, you idiot.

YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT SUPERINTELLIGENCES CONSIDERING WHETHER OR NOT TO BLACKMAIL YOU. THAT IS THE ONLY POSSIBLE THING WHICH GIVES THEM A MOTIVE TO FOLLOW THROUGH ON THE BLACKMAIL.

and then put a total ban on any further basilisk discussion on LW.

Not a reaction of someone who is not even slightly worried.

If Big Y had dismissed the thing or just stayed silent, the whole idea would have been forgotten in a few days like other LW thought experiments. The Streisand effect bites hard even if you are a super genius.

Not a reaction of someone who is not even slightly worried.

Sure it is. Yudkowsky is exactly the sort of person who would be outraged at the idea of someone sharing what that person claims is a basilisk, regardless of whether he thinks the specific argument makes any sense. He is also exactly the sort of person who would approach internet moderation with hyper-abstract ideas like "anything which claims to be a basilisk should be censored like one" rather than in terms of PR.

Speaking or writing in a way that makes it difficult to smear you with your own statements, even after combing through decades of remarks, is hard. It's why politicians use every question as a jumping-off point to launch into prepared talking points. Part of Yudkowsky's appeal is that he's a very talented writer who doesn't tend to do that; instead you get the weirdness of his actual thought processes. When presented with Roko's dumb argument, his thoughts ran to the "correct procedure to handle things claiming to be basilisks", rather than to "since the argument claims it should be censored, censoring it could be used to argue that I believe it, so I should focus on presenting the minimum attack surface against someone trying to smear me that way".

https://archive.is/nM0yJ

Again, I deleted that post not because I had decided that this thing probably presented a real hazard, but because I was afraid some unknown variant of it might, and because it seemed to me like the obvious General Procedure For Handling Things That Might Be Infohazards said you shouldn't post them to the Internet.

If you look at the original SF story where the term "basilisk" was coined, it's about a mind-erasing image and the.... trolls, I guess, though the story predates modern trolling, who go around spraypainting the Basilisk on walls, using computer guidance so they don't know themselves what the Basilisk looks like, in hopes the Basilisk will erase some innocent mind, for the lulz. These people are the villains of the story. The good guys, of course, try to erase the Basilisk from the walls. Painting Basilisks on walls is a crap thing to do.

Since there was no upside to being exposed to Roko's Basilisk, its probability of being true was irrelevant. And Roko himself had thought this was a thing that might actually work. So I yelled at Roko for violating basic sanity about infohazards for stupid reasons, and then deleted the post. He, by his own lights, had violated the obvious code for the ethical handling of infohazards, conditional on such things existing, and I was indignant about this.