Culture War Roundup for the week of May 15, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Disclaimer: this is a serious exercise in shady thinking. My apologies. Consider this a strawman, and please try to confront the steelman instead.

Note: see disclaimer above. This is shady thinking in note format.

EDIT: This is mostly in response to https://astralcodexten.substack.com/p/why-is-the-academic-job-market-so, particularly Scott's analysis of how the academic job market actually works. I bet that analysis is super annoying to many of those in the market, and likewise super satisfying to others. My thesis is that the others are rationalists and the many are not.

idea

  • rationalists explain foreign things from "first principles"

  • they liken themselves to Newton and Hooke, exploring new frontiers

  • for better or worse

  • to the experts in the field, they are cringe dilettantes, sneer-worthy

the problem

  • within every field, there are certain "touchy areas"

  • everyone understands the truth but pretends not to

a bigger problem

  • rationalists home in on touchy areas

  • rationalists can't "understand the truth but pretend not to"

  • rationalists "say the quiet part out loud"

the solution

  • demonize the rationalists

  • sneer at the rationalists

  • how cringe, what a baby

This sounds like something a rationalist would say. I don't think we should ask rationalists what makes rationalists special.

One big problem with human movements is that they're founded on ideas but select for traits, as the phenotype matching their objective function – if imperfectly – is discovered. Martial arts turn out to be mostly about finding the guy with the most ape-like skeleton, social studies reward cutthroat political sloganeering, and rationalists become a particular breed of neurotic high-IQ systemizers – plus people who want to bask in their light. They can do well at analyzing bullshit in some domain simply because they're smart enough; their interest in discussing it publicly is explained by incentives in their community (or the lack thereof; as we can see, prestige rats have excised HBD bros from their ranks). But intelligence isn't the special part, the part that makes them a cohesive set.

I like this thread, very illuminating:

when i came across lesswrong as a senior in college i was in some sense an empty husk waiting to be filled by something. i had not thought, ever, about who i was, what i wanted, what was important to me, what my values were, what was worth doing. just basic basic stuff

the Sequences were the single most interesting thing i'd ever read. eliezer yudkowsky was saying things that made more sense and captivated me more than i'd ever experienced. this is, iirc, where i was first exposed to the concept of a cognitive bias

i remember being horrified by the idea that my brain could be systematically wrong about something. i needed my brain a lot! i depended on it for everything! so whatever "cognitive biases" were, they were obviously the most important possible thing to understand

"but wait, isn't yud the AI guy? what's all this stuff about cognitive biases?"

the reason this whole fucking thing exists is that yud tried to talk to people about AI and they disagreed with him and he concluded they were insane and needed to learn how to think better

so he wrote a ton of blog posts and organized them and put them on a website and started a whole little subculture whose goal was - as coy as everyone wanted to be about this - thinking better because we were all insane and our insanity was going to kill us

…yud's writing was screaming to the rooftops in a very specific way: whatever you're doing by default, it's bad and wrong and you need to stop doing it and do something better hurry hurry you idiots we don't have time we don't have TIME we need to THINK

i had no defenses against something like this. i'd never encountered such a coherent memeplex laid out in so much excruciating detail, and - in retrospect - tailored so perfectly to invade my soul in particular. (he knew math! he explained quantum mechanics in the Sequences!)

an egg was laid inside of me and when it hatched the first song from its lips was a song of utter destruction, of the entire universe consumed in flames, because some careless humans hadn't thought hard enough before they summoned gods from the platonic deeps to do their bidding

(To be honest, sometimes I have nightmares about this. Like, yesterday.)

This is an almost perfect description of someone like @Quantumfreakonomics, I think.

The intersection of people who identify strongly with their intelligence and have an OCD-like anxiety about doing wrong is the natural substrate for rationalism, the tradition of cultivating correct thought processes. It's as peculiar as some Gnostic sect, and while there's a central doctrine about questioning priors (including, of course, political taboos), all these trappings do not define the memeplex. There's an entire ecosystem of elements to it: the very notion of a meme, the meme about infohazards, the assumption that thought ought to be recursively interpretable, the contempt for normie beliefs, and the hope/fear that More Is Possible. Underneath it all, together with socialization, identity and morality, are some very general intuitions, probably following from neurological kinks that work like inductive biases in machine learning.

For example, one key part is the uncomfortable but giddy conviction, informed by pure math I guess, that spaces – any spaces, really, or perhaps any manifolds, such as all mind designs, or all thoughts, or all physics – have… higher dimensionality than they seem to have. They are rougher, weirder, full of navigable hairline cracks. Accordingly, processes in such spaces are non-convex in the Talebian sense. So if you know just the input, if you're smart enough to derive the exact fitting passkey with correct values in every register, you'll be able to chart a trajectory that's shorter than what normies believe is the direct one – or to go farther than they'd say you can. You'd be able to do magic. This is what Yud has in mind when writing:

here's a concrete list of stuff I expect superintelligence can or can't do:

…

– Train an AI system with capability at least equivalent to GPT-4, from the same dataset GPT-4 used, starting from at most 50K of Python code, using 1000x less compute than was used to train GPT-4: YES

– Starting from current human tech, bootstrap to nanotechnology in an hour: GOSH WOW IDK, I DON'T ACTUALLY KNOW HOW, BUT DON'T WANT TO CLAIM I CAN SEE ALL PATHWAYS, THIS ONE IS REALLY HARD FOR ME TO CALL

– Hack a human [in the sense of getting the human to carry out any desired course of action] given a week of video footage of the human in its natural environment; plus an hour of A/V exposure with the human, unsupervised and unimpeded: YES

– I'd currently consider the scenario my 20(?)yo self mentioned – trigger off a difference between known physics and actual physics from inside a computer – a "NO" but not quite a "DEFINITE NO". You'd really really think it'd take greater physical extremes, but hey, I'm just a human.

(To be clear, the scenario was: «One of the things you can't rule out - and I mean this seriously - is "magic". Magic plain and simple. Running a particular sequence of instructions through the CPU so that a certain transistor turns on and off in a certain sequence and modulates the quantum cheat codes of the Universe.»)

When you feel this way, then fucking sure, you want to know how to think right. And surprisingly many people do.

I don't know where I'm going with this, but I feel that a proper account of «rationalism» has to include an analysis of this axis.

If you're smart enough [...] you'd be able to do magic

From an empirical perspective, this has mostly turned out to be true. Telephones, horseless carriages, Haber-Bosch fertilizer – insert here the same feelgood rant you've heard a thousand times. Maybe rationalists would be very different if technological progress were slowed 10x or 100x.

It's hard to predict exactly what form the magic will take, but very predictable that something about the future will feel like magic to us moderns. Probably most spaces don't have a hairline-crack shortcut through the manifold – but it only takes one.

How do you secure your position as the world makes an important technological transition? If you're politically savvy, you'll be as fine as anyone else. For the rest of us, the best bet is to be one of the builders, and that's best accomplished by neurotic high-IQ systematization. Unless you have a better suggestion?

I do feel uncertain, seeing that QC has been through all this and decided to do something else instead. He's smart – maybe he knows something I don't? My current best bet is that he's a tragic case of someone who wandered off the true path, lured by the siren song of postrat woo. But I do sometimes wonder if he's taken some Kegan step that I've failed to comprehend.