Culture War Roundup for the week of April 10, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Zvi wrote a ton about it here, as part of their AI series.

Why worry about the bottom 30% of humanity's unemployment for the next 10 years, if (it is argued) all of humanity will be disempowered within the next 100?

A simple egotistic answer is that I'll probably be alive in 10 years and dead in 100 (which does rest on the assumption that meaningful life extension for existing humans is so hard that, by the time an AGI figures out how to solve it, it's not clear it will have any reason to).

It’s harder for me to scoff at the robot rapture after seeing the way these GPT-4 instances can be scaffolded to make a more general-purpose system.

As you say, even if we don’t get any more fundamental breakthroughs on the LLM scaling side, I’m convinced the ecosystem of tools and of connecting LLM instances together will easily get us to AGI.

That only applies if you believe the robot rapture is coming, to put it as dismissively as possible.

We KNOW that even shitty spruced-up diffusion AIs will un-employ a decent chunk of the populace within the next X decades. Regardless of what the maximally optimistic pessimists will tell you, we don't know that AGI is ever coming at all (although I think it probably is).

I mean, society-wide unemployment is just less of a problem if it only lasts a few years. If not a rapture, a 'robot total transformation of society' is inching towards being the null hypothesis. When AI is just 'can recognize cats', it definitely takes intellectual work to get to 'robot rapture'. When AI is taking 30% of jobs, is the other 70% really that much of a leap? We went from "can play Pong" to "can beat many humans at standardized tests" in a few decades. What happens a few more decades later?

Ah, OK.

I'm talking about the

AI is impossible ==> AI can categorize cats poorly ==> AI can write boilerplate just well enough to destroy wagie jobs (WE ARE HERE) ==> AI can sort cats well, and therefore instantly eliminates like 80% of all necessary human non-physical labor ==> We invent AGI ==> ????? ==> everything is bad forever

sequence, not everyone being unemployed by a hyperefficient cat sorter, which seems possible in the short-to-medium term and fairly inevitable in the medium-to-long term.