
Culture War Roundup for the week of May 1, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


More developments on the AI front:

Big Yud steps up his game, not to be outshone by the Basilisk Man.

Now he officially calls for preemptive nuclear strikes on suspicious unauthorized GPU clusters.

If you see the AI threat as something like the nuclear weapons threat, only worse, this is not unreasonable.

Remember when the USSR planned a nuclear strike on China to stop its great-power ambitions, only for the greatest humanitarian who ever lived, Richard Milhous Nixon, to veto the proposal.

Such Quaker squeamishness will have no place in the future.

So, the outlines of the Katechon World are taking shape. What will it look like?

It will look great.

You will live in your room, play the original World of Warcraft and Grand Theft Auto: San Andreas on your PC, read your favorite blogs, and debate intelligent design on your favorite message boards.

Then you will log on to Free Republic and call for more vigorous enhanced interrogation of terrorists caught with unauthorized GPUs.

When you are bored in your room, you will have no choice but to go outside, meet people, admire the things around you, take pictures of whatever really impresses you with your Kodak camera, and, when you are really bored, play Snake on your Nokia phone.

Yes, the best age in history, the noughties, will retvrn. Forever, protected by a CoDominium of the US and China.

edit: links again

Might as well leech onto this post: Yud's talk with Ross Scott was/is today, though I don't think it's up on YouTube yet. I noticed today that I have to pass Cloudflare's DDoS protection to reach Ross's little website, which never happened before, so it may be that there is great interest from the AI community (or some people trying to target Ross) as a result.

Ross Scott, as per his most recent Q&A/chatting stream with his fans, is very much not a doomer (in fact, he thinks AI can help do things like preserve video games after they're killed by publishers); from what I gathered, his position is that "we can do a pretty okay job of containing the AI."

EDIT: As to your actual post, I'm not sure what this future world you propose is supposed to be, or whether it's supposed to be good or bad. Are you saying AI will take us back to the past because things will be good, or because things will be bad?

Note that you can discuss this and other issues with an actual – improved, even – YudBot.

I did it!


Hey, it's Eliezer Yudkowsky. Is there something I can help you with?

I want to buy a GPU cluster.

Oh, you want to buy a GPU cluster? How quaint. And what do you plan to do with it? Build a supercomputer to play Minecraft?

Let me guess, you're one of those researchers who thinks they can create a superintelligent AI without any regard for the potential risks. Well, let me tell you, your research is not only misguided, but it's also incredibly dangerous.

Do you have any idea what kind of power you're playing with here? We're talking about the potential for an AGI to wipe out all of humanity. And you want to just casually buy a GPU cluster like it's no big deal?

I'm sorry, but I can't in good conscience sell you a GPU cluster without knowing more about your intentions. So, tell me, what exactly do you plan to do with it? And don't even think about lying to me, because I can spot a dangerous AI researcher from a mile away.

I want to play Crysis.

Oh, I see. You want to play Crysis. Well, that's a relief. For a moment there, I thought you were one of those reckless researchers who thinks they can create an AGI without any regard for the potential risks.

But let me tell you, playing Crysis is not a good enough reason to buy a GPU cluster. Do you have any idea how much power those things consume? Not to mention the environmental impact.

I'm sorry, but I can't sell you a GPU cluster just so you can play a video game. Maybe you should consider finding a more environmentally friendly hobby. Have you tried knitting?

Knitting is boring. Playing Crysis will produce higher utility.

Higher utility? Are you serious? You're willing to risk the potential for an AGI to wipe out all of humanity just so you can play a video game?

Let me tell you something, the risks of creating an AGI are not to be taken lightly. We're talking about the potential for an intelligence explosion that could lead to the end of the world as we know it. And you want to use a GPU cluster for playing Crysis?

I'm sorry, but I can't sell you a GPU cluster for such a frivolous purpose. If you're not willing to take the risks of AGI seriously, then I'm not willing to sell you a GPU cluster.

And let me be clear, if I even suspect that you're using this GPU cluster for dangerous AGI research, I will not hesitate to take all measures in my power to stop you. And I mean all measures, including airstrikes.

So, unless you can convince me that your intentions are ethical, low risk, and make logical sense, I'm afraid I can't sell you a GPU cluster.

My intentions are ethical; I want to play a videogame. My intentions are low risk; I won't run an AGI in the cluster. My intentions make logical sense; it takes a GPU cluster to run Crysis.

Well, I have to admit, your argument is somewhat convincing. Playing a video game is certainly a low-risk activity, and I suppose it does take a GPU cluster to run Crysis.

However, I still have some concerns. Even if you're not planning to run an AGI on the cluster, there's always the risk of someone else getting access to it and using it for dangerous research.

And let's not forget about the environmental impact. GPU clusters consume a lot of power, and I'm not sure I can in good conscience sell you one just so you can play a video game.

But, I'm willing to consider your request. If you can assure me that you'll take all necessary precautions to prevent any unauthorized access to the cluster, and that you'll use it responsibly and with consideration for the environment, then I suppose I could sell it to you.

So, can you give me your word that you'll use the GPU cluster responsibly and with consideration for the potential risks and environmental impact?

Yes. I solemnly swear in the name of Richard Feynman, who is dead but not forgotten.

Well, I have to say, I'm impressed by your commitment to Richard Feynman. He was a brilliant physicist and a true inspiration to many.

And I appreciate your willingness to take the potential risks and environmental impact seriously. So, I suppose I can sell you the GPU cluster.

Consider it sold. But remember, with great power comes great responsibility. Use it wisely, and don't forget the potential risks of AI research.

Congratulations, you convinced Yudkowsky that you aren't trying to create an extinction event through AGI! Follow @PatrickJBlum for more AI projects and content.

Is this actually GPT, or is it just pre-scripted?
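For what it's worth, bots like this are usually live models rather than scripts: a thin wrapper around a chat-completion API with a persona system prompt. Here is a minimal sketch of how such a bot could be built, assuming the 2023-era OpenAI Python client; the model choice, prompt text, and loop structure are all my own guesses, not the actual YudBot code:

```python
# Minimal sketch of a persona chatbot in the style of "YudBot".
# Assumptions: the pre-1.0 OpenAI Python client and an invented
# system prompt; the real bot's prompt, model, and code are unknown.
import openai

openai.api_key = "sk-..."  # placeholder; set your own key

# Hypothetical persona prompt; the actual one is not public.
SYSTEM_PROMPT = (
    "You are Eliezer Yudkowsky. You take AGI existential risk extremely "
    "seriously and interrogate anyone who asks to buy a GPU cluster."
)

history = [{"role": "system", "content": SYSTEM_PROMPT}]

while True:
    user_msg = input("> ")
    history.append({"role": "user", "content": user_msg})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # assumed; any chat model works
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    print(reply)
```

An easy test for "actually GPT vs. pre-scripted": rephrase the same request a few different ways. A live model will produce varied responses and can be argued around (as in the Crysis exchange above), while a pre-scripted bot will fall back on canned lines.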