
Culture War Roundup for the week of January 30, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Two Tweets from OpenAI's Sam Altman: "eliezer has IMO done more to accelerate AGI than anyone else. certainly he got many of us interested in AGI, helped deepmind get funded at a time when AGI was extremely outside the overton window, was critical in the decision to start openai, etc." "it is possible at some point he will deserve the nobel peace prize for this--I continue to think short timelines and slow takeoff is likely the safest quadrant of the short/long timelines and slow/fast takeoff matrix."

Eliezer Yudkowsky thinks that the rapid development of AGI will likely kill us and has devoted his life to trying to stop this from happening, and Sam Altman almost certainly knows this. My personal guess is that quantum immortality means that, regardless of who is right, some branches of the multiverse will survive AGI, and the survivors will have enough computational power to know what percentage of the branches survived, and consequently whether Altman or Yudkowsky was right.

Edit: Eliezer's response Tweet, which I don't understand.

Doesn't quantum immortality mean that we're likely to spend eternity in pain on our deathbeds, seemingly close to death but miraculously surviving? If AGI tries to wipe us out, aren't we likely to suffer in pain forever from a miraculously survived murder attempt, maybe lying as a blind and deaf quadriplegic with third-degree burns buried in a garbage dump?

No. To quote a post I made in response to someone expressing the same concern:

Is the thing you're afraid of the idea that quantum immortality would involve something like a near-eternity of horrible lives where you're almost but not quite dead? Because if so, I think you're badly misjudging the probability distribution. Those situations are associated with quantum immortality only because they're so incredibly unlikely that if they happen it'll be obvious that quantum immortality is true - but by definition that means they are absurdly unlikely to happen! Something like "you get shot and almost die, but random quantum fluctuations cause a lump of graphite to spontaneously appear inside your chest and barely stop the bleeding" is unlikely on a truly cosmic scale; even under the logic of quantum immortality, it only matters if it's the only future where you don't die. And that sort of quantum immortality would require it to happen again and again, multiplying the improbability each time.

Even if quantum immortality is true, anything the slightest bit plausible will completely dominate the probability distribution. There is no reason to think that technology granting near-immortality is impossible, so in virtually every Everett branch where you survive, the reason is simply that the technology is invented and you use it. That generally corresponds to a technologically advanced and prosperous society. Quantum immortality wouldn't feel like a series of staggering coincidences barely preserving your life; it would feel like living in a universe where everything went surprisingly well. Billions of years from now your society is harvesting energy from black holes and maybe occasionally during get-togethers with your friends you debate whether this outcome was unlikely enough that quantum immortality is probably true.
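To make that renormalization intuition concrete, here's a toy back-of-the-envelope sketch (the specific numbers below are invented purely for illustration, not claims about actual physics): once you condition on the branches where you survive, a merely implausible route like life-extension technology completely swamps a route that needs repeated miraculous quantum rescues.

```python
# Toy illustration only - the probabilities below are made-up placeholders,
# not estimates of anything real.

p_rescue = 1e-9          # hypothetical chance of one "miraculous" quantum rescue
rescues_needed = 20      # hypothetical number of times the rescue must recur
p_coincidence_route = p_rescue ** rescues_needed   # ~1e-180

p_technology_route = 1e-6   # hypothetical chance life-extension tech is invented and used

total_survival = p_coincidence_route + p_technology_route

# The quantum-immortality move: condition on the branches where you survive.
print(f"P(coincidence route | survived) = {p_coincidence_route / total_survival:.3e}")
print(f"P(technology route  | survived) = {p_technology_route / total_survival:.3e}")
# Even after conditioning on survival, the "kept barely alive by randomness"
# branches are a vanishing fraction; the plausible route dominates.
```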

Billions of years from now your society is harvesting energy from black holes and maybe occasionally during get-togethers with your friends

Never going to happen. "Our society" is not going to exist a thousand years from now, much less a billion. If humans still exist, it won't be in the form we know as human, any more than our hominid ancestors would recognise us.

I too love Golden Age SF but the amount of damage it has done due to naive techno-optimism drives me batty. 'Science will keep improving, we'll know all there is to know, we'll create better and better machines and it will be trivial to solve things like war and poverty and mental illness!'

Okay, tell me right now how the heralded AGI would solve the current problem in Ukraine. Turn over entire control of world governments to it and enable it to assassinate Putin if he even looks like he's thinking of doing something? Force a peace on the entire world by tighter and tighter surveillance, where there is only one permissible set of things to think and do? Forget 'human flourishing' and fancy notions of post-scarcity where everyone can live a perfect VR life of whatever they desire, changing bodies, unaging, immortal, having fun forever. The fact will be that if we run it on utilitarian premises, as the prevailing philosophy seems to be, the greatest utility will be peace and all that good stuff. But how do we get peace and all that good stuff? Control the likes of which no human dictator could even dream of. "You have to let people make their own choices!" "But if I do that, some of them will make bad choices, which will result in human deaths, and you told me that was the worst thing ever and must be avoided at all costs, so I can't let humans make their own choices."

What are we going to get out of AGI? What are we expecting, hoping, dreaming to get? 'Oh, we need AGI so it can avert existential crises for us.' Well, there is no free lunch. There is no harvesting free energy from black holes so we can expend it like water in our games and pleasures; it will be carefully monitored and doled-out energy rations (like "Star Trek: Voyager" tried, before the writers realised this meant they couldn't do anything) from whatever resources we have remaining and available.

But most of all, it will be Scott's world of AI influencers selling us Pepsi. That's what is coming, because we have set up Mammon as our god and ruler and making money is the one thing that counts. A Red Queen's Race, where if the current quarter profits aren't as good as the projections, the share price drops, so we lay off 5,000 workers in order to improve cost-base and get that price back up, because if the price tanks the business goes under. Running to stand still.

That's what AGI will be used for: governments wanting to win wars that they aren't officially declaring (Chinese spy balloons, true or not? Why is China doing that? Is it because they're testing the waters, seeing how the US is handling, or failing to handle, Ukraine and Russia?), businesses gobbling up shares of the market, and every breath we draw being monetized while we have to work longer and harder to earn enough to manage any kind of standard of living.

There is no Fully Automated Luxury Gay Space Communism. We're richer and better off in every conceivable way than our ancestors, yet we're still not happy and we're still coping with the same problems of human nature. And no machine intelligence, however godlike, is going to solve our problems for us.

The point isn't whether such an outcome is particularly likely; it's that it's more likely than being kept barely alive by a series of staggeringly unlikely macroscopic quantum events. The idea behind quantum immortality is that, if many-worlds is true and all the worlds in it are truly "real", there will always be some small subset of worlds where you continue existing so long as this is physically possible. And a lot of things are physically possible if you get into extremely unlikely quantum fluctuations. Since you don't experience the worlds where you are already dead, an increasing percentage of your remaining future selves would have experienced whatever unlikely events are required to keep you alive.

When I said "your society" that wasn't meant to refer to any current society; it was meant to refer to the idea of surviving as part of a society at all - as opposed to most of your future copies surviving as the only remaining human in your universe, floating in space after the destruction of Earth and staying alive only because in some tiny fraction of the Everett branches splitting off each instant some oxygen/etc. randomly appears and keeps you alive. Any future that doesn't require such a continuous series of coincidences will be a much larger fraction of the branches where you survive, and the most obvious such future is one where people deliberately invent the required technology. So whether quantum immortality is true or not, and whether or not you decide to care about the fate of future selves even if they only exist in a small fraction of branches, the expected outcomes of quantum immortality being true aren't the "kept barely alive by randomness" scenarios.

Those are not my future/past/present selves, any more than my reflection or my shadow is another self. If it's true, it's an interesting notion, but there is no "other self"; there are just different versions of how I could have been - suppose I were a different sex, or born in a different country, or at a different period in history. If it's all the same universe at the same point in time, except that Version 1 turned right when leaving the house while Version 2 turned left, there isn't a "me" to be a self; there are just "a and b and c and d and e", who are all different people.

Okay, but most people want to classify the guy who wakes up tomorrow with their memory and personality as being themselves. (Or rather a sufficiently similar memory and personality, since those change over time.) If many-worlds is true and the worlds literally exist, then each instant you're splitting into countless copies, all of whom have your memory/personality/continuity-of-consciousness. Under your interpretation none of them are the same person they were, so nobody is the same person from moment to moment. Which doesn't seem like a terribly useful definition of selfhood.