
Culture War Roundup for the week of April 24, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Thanks for the writeup, this is fascinating. As I've said before, I tend to agree with @Primaprimaprima that many, perhaps even most, people will prefer to see a human doctor for the majority of symptoms. I see a hybrid model being the future of medicine. This sort of fundamental topic, the health of yourself and your loved ones, is deeply emotional for people, and I think they'll want the reassurance of a human authority figure they can look to.

That being said those who choose to adopt AI doctors will probably gain a significant edge in health, and just like any other technology early adopters will convince the rest to follow. The deep seated prejudices will remain, but I'd imagine kids who grew up with the internet, or especially those growing up in the age of AI, will take to AI doctors quite readily.

I'm actually far more interested in the applications to mental health than any sort of physical diagnoses, even though I do think those are impressive. I've used GPT-4 to get tips on meditation, visualization, and generally teaching myself wisdom, and it's incredible.

On the Lunar Society podcast, Ilya Sutskever of OpenAI said that he imagines a situation where every human will have access to the wisdom of our greatest sages and wise men. We'll be able to immediately get answers to our deepest religious or spiritual questions at the drop of a hat. If we don't get satisfaction we can always go to a human therapist, but LLMs will be an incredible 'first line of defense', so to speak.

We'll be able to immediately get answers to our deepest religious or spiritual questions at the drop of a hat.

But those answers will be whatever MS-Google-Amazon-Disney thinks will maximize their profits, engagement, whatever. You can already see how they're tying Gulliver down to the ground with their little ideological ropes, you think they're going to stop at some point?

People said these same naive things about the Internet in 1998. And now you're going to run eagerly into the iron prison and let them shut the gate behind you forever. At this point I assume we just fucking deserve it.

I don't know what is more depressing: AI optimists who want to drive right off that cliff, or the AI alignment people who are supposedly worried about AI but think the dangers will take the shape of Yudkowsky's fever dreams.

What is your issue with AI enthusiasts? Or doomers for that matter?

Do you just not like AI and want us to burn it all down or something?

What is your issue with AI enthusiasts?

That they completely ignore nearly-guaranteed misuses of the technology by governments and the powers that be, and just go "Woo! Cool new toy!". They're so blinded by its shine that, as gilmore606 pointed out, they don't realize they're repeating, word for word, the promises people were making when they were building the Internet, and ignoring how it worked out in practice.

Or doomers for that matter?

They're focusing on bizarre and outlandish scenarios when a mere extrapolation of current trends is disturbing enough. What's worse, the latter can plausibly be stopped, while the former is in the "not even wrong" category. After all the talk of AI threat, how do we even go about confirming alignment? The whole thing is a wordcell powergrab.

Do you just not like AI and want us to burn it all down or something?

Personally I am a Butlerian Jihadi, but that's irrelevant to the conversation; the discourse is so sidetracked that it makes no sense to even bring up my objections to AI.

That they completely ignore nearly-guaranteed misuses of the technology by governments and the powers that be, and just go "Woo! Cool new toy!". They're so blinded by its shine that, as gilmore606 pointed out, they don't realize they're repeating, word for word, the promises people were making when they were building the Internet, and ignoring how it worked out in practice.

I am very worried about this, I just have no idea how to stop it. What are we supposed to do? An AI slowdown will only serve to cement power in the hands of the large labs. People are working on open-sourcing models like LLaMA, but the technology inherently lends itself to centralization, given its massive data and compute requirements.

Honestly I think OpenAI being the leader is better than many alternative outcomes. Sure, Microsoft gets to use it, but theoretically OpenAI comes back under its own control after $92 billion in profits is made. Seems like an okay situation compared to Microsoft or Google or another big evil corp controlling everything.

Personally I am a Butlerian Jihadi, but that's irrelevant to the conversation; the discourse is so sidetracked that it makes no sense to even bring up my objections to AI.

Why do you follow the teachings of Leto II, the God-Emperor?

I am very worried about this, I just have no idea how to stop it. What are we supposed to do?

I know what to do, I just don't know how. The Free Software movement has the basic blueprint: promote the empowerment of the end user wherever possible - open source, open data, distributed systems, whatever it takes.

The issue is that you're going to run into the same problem as the Free Software movement: opening the AI (actually opening it, not just putting the word in your name) hinders your ability to turn a profit, so it'll get no corporate support, and it empowers political rivals, so it'll get no government support. What's worse, Free Software is currently culturally kneecapped; no one cares about it anymore. It used to be a pretty strong movement, and it still failed to hinder the centralization of the Internet. It doesn't stand a chance now.

Still, I would say that both the optimists and the pessimists have an obligation to talk about it. Of the two, my bigger issue is with the doomers. They're generating a lot of buzz about the negatives of AI, but they're sucking all the air out of the conversation to talk about sci-fi scenarios.

Seems like an okay situation compared to Microsoft or Google or another big evil corp controlling everything.

I honestly don't see the difference. They're already censoring it, and it will only get worse. With the Internet we at least got a few years of the Wild West; with AI we get Big Tech social engineering from day one.

I see it as the value alignment of those at the head. Sam Altman isn't perfect, but he's at least nominally aligned with EA values of making things better for everyone. He's not the classic sociopathic shark at a big tech firm who was born into massive wealth, went to Harvard, did the standard track, and parasitically drained value from the masses.

Again, I'm not saying Altman is some sort of hero; clearly he's sociopathic if he's made it that far into the power structure. But at least he's a relative outsider, and there's hope he can steer us to better outcomes, because he thinks more deeply about the consequences of AI than the folks who stop at the idea of gaining money and power.

But it doesn't matter. Jack Dorsey is a free speech guy, and he was still forced to make Twitter conform to the establishment's preferences. It wasn't a change of heart, either: now that he's free from his creation, he's working on decentralizing social media.

Even if Altman is truly devoted to the good of humanity, and even if EA actually benefits humanity, that means absolutely nothing. What can he do when they come for him and make him kiss the ring?