Culture War Roundup for the week of February 20, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Over the last few months, I've followed someone named Alexander Kruel on Substack. Every single day, he writes a post about 10 important things that happened that day - typically AI breakthroughs, but also other pet concerns of his, including math, anti-wokeness, nuclear power, and the war in Ukraine. It's pretty amazing that he unfailingly produces this content, and I'm in awe of his productivity.

Unfortunately, since I get this e-mail every morning, my information diet is becoming very dark.

The advances in AI in the last year have been staggering. Furthermore, almost no one seems to be pumping the brakes. We seem doomed to an AI arms race, with corporations and states pursuing AI without limits.

In today's email, Kruel quotes Eliezer Yudkowsky, who says:

I've already done my crying, late at night in 2015…I think that we are hearing the last winds start to blow…I have no winning strategy

Eliezer is ahead of the curve. Where Eliezer was in 2015, I am now. AI will destroy the world we know. Nate Soares, director of MIRI, is similarly apocalyptic.

We've given up hope, but not the fight

What comes after Artificial General Intelligence? There are many predictions. But I expect things to develop in ways that no one expects. It truly will be a singularity, with very few trends continuing unaltered. I feel like a piece of plankton, caught in the swells of a giant sea. The choices and decisions I make today will likely have very little impact on what my life looks like in 20 years. Everything will be different then.

So, party until the lights go out? How do I deal with my AI-driven existential crisis?

The choices and decisions I make today will likely have very little impact on what my life looks like in 20 years. Everything will be different then.

To a large extent this viewpoint should alleviate absolutely GOBS of stress from your life. If you have constructive ideas you've been thinking of implementing but held back on because of self-doubt or the timing never felt right, maybe jump on those now. As long as you don't do anything unrecoverable, the risks pretty much round to zero, right?

Nihilism and Absurdism are two sides of the same coin, after all. I sometimes consider the possibility that right when we're on the cusp of AGI our alien overlords may reveal themselves and take away our toy before we kill ourselves. Or the Simulation masters reset us to 1975 to have another go at solving alignment.

If you feel like you want to make a difference then the only option seems to be Butlerian Jihad. There can't be but a couple hundred thousand people who are critical to AI research on the planet, right? (DO NOT do this, I do not endorse even the suggestion of violence).

I can't really put all my thoughts on this matter down without going off in dozens of different directions.

Suffice it to say I feel that, regardless of which way things go, I am living in the most pivotal decade in all of human history. Maybe the globalist system collapses and cuts off the critical supply lines that are enabling AI research to proceed at lightning speed. Cut off from the high-end chips and electricity required to train new models, maybe we buy some time at the cost of a massive decrease in our standard of living.

And since I can't do much to change it, I am focusing inward. I'm making my life as generally comfortable as possible. I'm spending time with family. I'm agreeing to more social and fun events than I normally would. It's weird, because I can't really express to people how I feel about our prospects for the coming years. But I've taken to telling people, "All I know for sure is that 2023 is going to get REALLY FUCKING WEIRD," and explaining myself a little if they ask why.

It still shocks me to hear normies talking about their discovery of ChatGPT as novel tech and all the mundane uses they want to put it to. "Oh, I have started using ChatGPT to create scripts for my marketing videos." I feel an internal sensation similar to if they told me, "I just adopted a new pet Shoggoth, I love taking him for walks!"

How do you invest money when the two most likely trajectories of the next five years are either the devolution of industrialized society or an AI induced industrial revolution?

Get your own house in order, make whatever decisions are best for your personal health and wellbeing. Avoid blackpills.

If you feel like you want to make a difference then the only option seems to be Butlerian Jihad. There can't be but a couple hundred thousand people who are critical to AI research on the planet, right? (DO NOT do this, I do not endorse even the suggestion of violence).

I do. I don't endorse terrorism, mostly because I don't think that'll work*, but I totally endorse banning neural nets, having the police kick in the doors of people who won't stop, and invading any country that doesn't enforce the ban.

*Short version: AI is low on single points of failure, so that mode's out (with the possible exception of soft errors, but good luck getting nukes as a terrorist); it's already well-known, so the Unabomber mode is out; and the AI not-kill-everyoneist movement is not a strong, pre-existing community, so the vigilante mode and the insurgency mode would succumb to LE infiltration.