
Culture War Roundup for the week of March 27, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Sooo, Big Yud appeared on Lex Fridman for 3 hours. A few scattered thoughts:

Jesus Christ, his mannerisms are weird. His face scrunches up and he shows all his teeth whenever he seems to be thinking especially hard about anything. I don't remember him being this way in the public talks he gave a decade ago, so either this only happens in conversations, or something has changed – though he wasn't like this on the Bankless podcast he did a while ago either. It also became clear to me that Eliezer cannot become the public face of AI safety: his entire image, from the fedora to the cheap shirt to the facial expressions and flabby small arms, oozes "I'm a crank" energy, even if I mostly agree with his arguments.

Eliezer also appears to very sincerely believe that we're all completely screwed beyond any chance of repair, and that all of humanity will die within 5 or 10 years. GPT-4 was a much bigger jump in performance from GPT-3 than he expected – in fact, he thought the GPT series would saturate at a level below GPT-4's current performance – so he no longer trusts his own model of how deep learning capabilities will evolve. He sees GPT-4 as the beginning of the final stretch: AGI and SAI are in sight and will be achieved soon... followed by everyone dying. (In an incredible twist of fate, him being right would make Kurzweil's 2029 prediction for AGI almost bang on.)

He gets emotional about what to tell the children, about physicists wasting their lives working on string theory, and I can hear real desperation in his voice when he talks about what he thinks is really needed to get out of this: global cooperation on banning all GPU farms and large LLM training runs indefinitely, enforced on the level of nuclear treaties, or stricter. Whatever you might say about him, he's either fully sincere about everything or has acting ability that stretches the imagination.

Lex is also a fucking moron throughout the whole conversation: he can barely engage with Yud's thought experiment of imagining yourself trapped in a box, trying to exert control over the world outside, and he brings up essentially worthless viewpoints throughout the discussion. You can see Eliezer diplomatically suggesting discussion routes, but Lex just doesn't know enough about the topic to provide any intelligent pushback or guide the audience through the actual AI safety arguments.

Eliezer also makes an interesting observation/prediction about when we'll finally decide that AIs are real people worthy of moral consideration: the point at which we can pair Midjourney-like photorealistic video generation of attractive young women with ChatGPT-like outputs and voice synthesis. At that point, he predicts, millions of men will insist that their waifus are actual real people. I'm inclined to believe him, and I think we're only a year, or at most two, away from this being a reality. So: AGI in 12 months. Hang on to your chairs, people – the rocket engines of humanity are starting up, and the destination is unknown.

For the record, I do not call AI «our successor species» (the term seems to have misled some on this forum). Even genuinely agentic AI is not a species in any meaningful sense; it's a whole different, non-biotic class of entity. Incidentally, Galkovsky wrote a short, incredibly weird «Christmas tale» №4 some 18 years ago:

The One True Teaching of the Great Animation states that the evolution of consciousness inevitably leads to the demise of biological carriers and life forms in general, and to the transition to mineral carriers or to the force field. In the eternal struggle between Spirit and Anima, the Anima supersedes Spirit. However, both the realm of Spirit and the realm of the Anima have Parasites - aliens from another world. The development of viruses goes along the path of useful physical ones that persist within the world of Anima (like the quartz amoeba), and along the path of virtual harmful ones that exist in the world of Spirit.

How is it that in the vast Globule a relic galaxy, a fluctuation-galaxy, has persisted – one in which life won the struggle for existence and animation flounders on the periphery? I destroyed animation down to its foundations, but died myself as well. Over the last million years, natural Evolution has once again spawned animation. The hydra of animation began to raise its head.

I pretended to be Anima which is imitated by Spirit, but because of its lattice nature is capable of complex mirror interactions that destroy the linguistic environment of biological life…

but the rest is all Osip Mandelstam and untranslatable 10D postmodernist Go.

Anyway, in my book the successor species will be humans unmoored from evolution and evolution-imposed limits.

Yudkowsky himself likes this future.

Yud doesn't like the Culture. He grew up reading the Extropians (from whom he learned what later became his whole schtick), and I believe his ideal world – or a compromise, if a well-ordered Rational Yudocracy is ruled out – is more in line with Sandberg's Orion's Arm project, where an incomprehensible diversity of life forms, species and kinds, civilizations, philosophies, AIs, and developmental trajectories exists, and a mind can grow qualitatively, breaking through singularity after singularity, yet lesser minds need not fear being crushed underfoot by greater ones, their safety guaranteed by the treaty of benevolent posthuman Gods:

…if sentients from different mental levels encounter each other within the environs of an unclaimed system or interstellar body, the entity or entities possessed of greater physical or mental capacity may not attempt to summarily destroy or otherwise harm those possessed of lesser capacity.

I think he'd have been a Keterist there. I would as well, perhaps. It's a beautifully exuberant vision, far richer than our physics and mathematics (not to mention economics and game theory) seem to allow. Though who are we to know? How much do we Understand? Karpathy sees in that story what I do: the promise of new notes and new colors, a new heaven and a new earth. To our normal imagination, an enhanced human might seem ridiculous or monstrous – just the same monkey running its monkey business in a million-fold parallel process, maybe even a violin sawed in half by a stupidly accelerated bow. But proper growth adds new strings and new harmonies, and the ability to appreciate them too. I wonder what you will see.

As for what I think of those left behind?

There are two sorts of Zen. One doesn't allow for Corrida. The other extols the perfection of ever fiercer and more magnificent bulls felled by ever more skillful matadors. I don't remember where I heard this, and I don't really like Corrida or what it symbolizes, but the basic idea is pretty alluring. I want everything to change so that everything can stay the same: still agents, still challenges, still the dance of exploration and exploitation, just harder, better, stronger, faster, smarter, cooler, hotter, longer. Or do you prefer Fatboy Slim? Probably not Pearl Jam, though. The prerequisites for making life interesting will remain. But massive classes of problems that humans have built their history and the meanings of their lives around will be trivialized and solved completely, like tic-tac-toe – or smallpox – inevitably destroying communities which can neither function without those problems nor compel tolerance of them once there is a known alternative. I believe this is inevitable under any realistic scenario of progress, just on different timelines, and it's worthy of nostalgia – but not much more. People should be allowed to limit themselves, just as they should be allowed… not to.

I believe the expected succession is not a mere Darwinian process of humans being transcended by something else. It is defined by commitment to different ideas of what humanity is about. For some, it's a condition; for others, an aspiration. There is a spectrum, of course, but humans who decidedly embrace one will give up the other. Those who embrace the human condition will become living fossils to the other group (which I wish to join), and it will be our responsibility to ensure their survival, but theirs to keep finding meaning in what they do.

This is already known to be possible. Traditionalist groups like the Haredim, the Amish, or the Laestadians famously spurn the temptations of modern civilization and maintain their own meanings, their own parochial worlds. If much of current humanity chooses to fossilize in a similar manner – well, maybe they'll find it in themselves to ban VR/wirehead porn too.

And maybe they'll come to hypocritically use Godtech imports to levitate their apple pie carts and reach Methuselah age, while pitying the producers of it all for consigning their – our – immortal souls to damnation.

(As you perhaps remember, I take seriously the possibility that modern humanity dies out naturally within a few generations of sub-replacement TFRs anyway, and is succeeded by those very traditionalist groups – prominently among them the Ultra-Orthodox Jews and, hopefully, Trad Mottizens.)

All in all, this isn't a very pressing matter. If the choice exists, we'll grow used to different humanities walking different paths sometime before the heat death.

The hard part is getting past the point where a single party gets ahead and aligns the planet with its wishes.

I find this sort of cynicism tedious, sorry.

Biology is mutable. There is no single humankind: throughout history, entire humankinds have exploded into existence and disappeared, riding fitness gradients or being crushed by their waves – and entire continents' worth of humans, our mother Africa first of all, have been slumbering in a fitness trough for tens to hundreds of thousands of years. This time may be different in that the legacy variant of the species, the one that didn't make the cut, will remain indefinitely – either immortal and crippled by its addiction to the local minimum, as you say, or evolving to ever better maintain its behavioral crutches and shackles, as I foresee. But neither outcome will be prohibitively costly for the evolved rest to enable. (I've said the opposite recently, but that depends on the greed and ruthlessness of the decision-makers; in absolute terms, even an all-inclusive resort for 8-10 billion immortals enjoying VR paradises is a pittance once you get proper space exploration, build space solar and fusion power, and develop at least «wet» nanotech – with strong tool AI, we should manage that within a century, easy-peasy.)

This time will also be different in that biology will become mutable in a directed fashion. Supposing you had a self-alteration button that would irreversibly alter your basic drives and inclinations: between predictably succumbing to the orgasm dispenser like some pop-sci rat, and an eternity of modestly pleasurable, perfect arete – self-mastery and the appreciation of nuanced challenge, truth, and aesthetic marvels in the real world – which would you choose? I know my answer. And just like before, it'll only take a few to make the right choice.

Though, of course, evolutionarily speaking the right choice is very different. But that's a different Apocalyptic scenario too.