
Culture War Roundup for the week of May 15, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Disclaimer: this is seriously shady thinking. My apologies. Consider this a strawman, and please try to confront a steelman instead.

Note: see disclaimer above. This is shady thinking in note format.

EDIT: This is mostly in response to https://astralcodexten.substack.com/p/why-is-the-academic-job-market-so and particularly Scott's analysis of how the academic job market actually works. I bet Scott's analysis is super annoying to many of those in the market, and likewise super satisfying to others. My thesis is that the others are rationalists and the many are not.

idea

  • rationalists explain foreign things from "first principles"

  • they liken themselves to newton and hooke, exploring new frontiers

  • for better or worse

  • to the experts in the field, they are cringe dilettantes, sneer-worthy

the problem

  • within every field, there are certain "touchy areas"

  • everyone understands the truth but pretends not to

a bigger problem

  • rationalists home in on touchy areas

  • rationalists can't "understand the truth but pretend not to"

  • rationalists "say the quiet part out loud"

the solution

  • demonize the rationalists

  • sneer at the rationalists

  • how cringe, what a baby

Damn browser ate my reply, and I have no desire to retype it. In brief:

The IPEDS Completions Survey shows more than 23,000 Americans obtain a bachelor's degree in history every year. That's just too damn many. For most of them, becoming a university researcher is the best career choice that makes use of their degree. There's no industry outside academia that can absorb that many graduates, the way there is for STEM, law, and theology. Of course the universities exploit these rubes by dangling the carrot of tenure in front of them and letting anyone who wants to major in history.

The best advice an adjunct teacher of history (or of English lit) can give to his or her students is: if you can dream of not being a historian and aren't independently wealthy, go major in something else. 2,300 history majors is a much more sustainable number. 230 is an even better one: everyone will be able to get a job as a historian.

Honestly, I think it’s a self-solving problem, as long as the politicians can stop messing with the loans. The high cost of college is starting to make students more aware of major-related job prospects, which should drastically cut down on vanity degrees. If you’re unqualified for a good-paying job, yet spent (over a lifetime) more than $100K on a degree, you’re shutting yourself out of a lot of normal life experiences. That negative consequence, when seen by others, will lower the demand for those degrees.

I suspect that’s why the sudden push to simply wipe away the debt. It’s not for the students; it’s because the schools’ economic model depends on fleecing humanities students to pay for the labs of students seeking actually useful degrees. Every student in the history department is paying just as much to read books and learn things useful in case they’re ever invited onto Jeopardy as the kid in a STEM class learning to build useful technology. Unless the debts get picked up by the government, this model will eventually fail when kids are no longer willing to pay big money to read books in the library.

And honestly, I think most humanities should be autodidact pursuits. There are tons of resources, including online courses, books (and, if they’re old enough, free on Project Gutenberg), videos, museums, and so on. These resources are cheap if not free, and widely available. There’s not much added value in going through a history program at a university - you can ask the TA questions, and you’ll be assigned readings that help direct your study, but even this can be emulated by other means if you are motivated enough.

This sounds like something a rationalist would say. I don't think we should ask rationalists about what makes rationalists special.

One big problem with human movements is that they're founded on ideas but select for traits, as the phenotype matching their objective function – if imperfectly – is discovered. Martial arts turn out to be mostly about finding the guy with the most ape-like skeleton, social studies reward cutthroat political sloganeering, and rationalists become a particular breed of neurotic high-IQ systemizers – plus people who want to bask in their light. They can do well in analyzing bullshit in some domain just because they're smart enough; their interest in discussing it publicly is explained by incentives in their community (or the lack thereof; as we can see, prestige rats have excised HBD bros from their ranks). But this isn't the special part of what makes them a cohesive set.

I like this thread, very illuminating:

when i came across lesswrong as a senior in college i was in some sense an empty husk waiting to be filled by something. i had not thought, ever, about who i was, what i wanted, what was important to me, what my values were, what was worth doing. just basic basic stuff

the Sequences were the single most interesting thing i'd ever read. eliezer yudkowsky was saying things that made more sense and captivated me more than i'd ever experienced. this is, iirc, where i was first exposed to the concept of a cognitive bias

i remember being horrified by the idea that my brain could be systematically wrong about something. i needed my brain a lot! i depended on it for everything! so whatever "cognitive biases" were, they were obviously the most important possible thing to understand

"but wait, isn't yud the AI guy? what's all this stuff about cognitive biases?"

the reason this whole fucking thing exists is that yud tried to talk to people about AI and they disagreed with him and he concluded they were insane and needed to learn how to think better

so he wrote a ton of blog posts and organized them and put them on a website and started a whole little subculture whose goal was - as coy as everyone wanted to be about this - thinking better because we were all insane and our insanity was going to kill us

…yud's writing was screaming to the rooftops in a very specific way: whatever you're doing by default, it's bad and wrong and you need to stop doing it and do something better hurry hurry you idiots we don't have time we don't have TIME we need to THINK

i had no defenses against something like this. i'd never encountered such a coherent memeplex laid out in so much excruciating detail, and - in retrospect - tailored so perfectly to invade my soul in particular. (he knew math! he explained quantum mechanics in the Sequences!)

an egg was laid inside of me and when it hatched the first song from its lips was a song of utter destruction, of the entire universe consumed in flames, because some careless humans hadn't thought hard enough before they summoned gods from the platonic deeps to do their bidding

(To be honest, sometimes I have nightmares about this. Like, yesterday).

This is an almost perfect description of someone like @Quantumfreakonomics, I think.

The intersection of people who identify strongly with their intelligence and have an OCD-like anxiety about doing wrong is the natural substrate for rationalism, the tradition of cultivating correct thought processes. It's as peculiar as some Gnostic sect, and while there's a central doctrine about questioning priors (including, of course, political taboos), all these trappings do not define the memeplex. There's an entire ecosystem of elements to it, like the very notion of a meme, the meme about infohazards, the assumption that thought ought to be recursively interpretable, the contempt for normie beliefs, and the hope/fear that More Is Possible. Underneath it all, together with socialization, identity and morality, are some very general intuitions, probably following from neurological kinks that work like inductive biases in machine learning. For example, one key part is the uncomfortable but giddy conviction, informed by pure math I guess, that spaces – any spaces, really, or perhaps any manifolds, such as all mind designs, or all thoughts, or all physics – have… higher dimensionality than they seem to have. That they are rougher, weirder, full of navigable hairline cracks. And accordingly, processes in such spaces are non-convex in the Talebian sense. So if you know just the input, if you're smart enough to derive the exact fitting passkey with correct values in every register, you'll be able to chart a trajectory that's shorter than what normies believe is the direct one – or longer than they'd say you can go. You'd be able to do magic. This is what Yud has in mind when writing:

here's a concrete list of stuff I expect superintelligence can or can't do:

… Train an AI system with capability at least equivalent to GPT-4, from the same dataset GPT-4 used, starting from at most 50K of Python code, using 1000x less compute than was used to train GPT-4: YES

– Starting from current human tech, bootstrap to nanotechnology in an hour: GOSH WOW IDK, I DON'T ACTUALLY KNOW HOW, BUT DON'T WANT TO CLAIM I CAN SEE ALL PATHWAYS, THIS ONE IS REALLY HARD FOR ME TO CALL

– Hack a human [in the sense of getting the human to carry out any desired course of action] given a week of video footage of the human in its natural environment; plus an hour of A/V exposure with the human, unsupervised and unimpeded: YES

– I'd currently consider the scenario my 20(?)yo self mentioned - trigger off a difference between known physics and actual physics from inside a computer - a "NO" but not quite "DEFINITE NO". You'd really really think it'd take greater physical extremes, but hey, I'm just a human.

(To be clear, the scenario was: «One of the things you can't rule out - and I mean this seriously - is "magic". Magic plain and simple. Running a particular sequence of instructions through the CPU so that a certain transistor turns on and off in a certain sequence and modulates the quantum cheat codes of the Universe.»)

When you feel this way, then fucking sure, you want to know how to think right. And surprisingly many people do.

I don't know where I'm going with that, but I feel that a proper account of «rationalism» has to include analysis of this axis.

What do you mean by "the assumption that thought ought to be recursively interpretable"?

That might have been a good description had I discovered rationalism ~10 years ago. By the time I came across a Slatestarcodex article on /r/drama I was already a jaded veteran of internet culture wars who had learned the hard lesson that you can’t outsource epistemology.

If you're smart enough [...] you'd be able to do magic

From an empirical perspective, this has mostly turned out to be true. Telephones, horseless carriages, Haber-Bosch fertilizer - insert here the same feel-good rant you've heard a thousand times. Maybe rationalists would be very different if technological progress were slowed 10x or 100x.

It's hard to predict exactly what form the magic will take, but very predictable that something about the future will feel like magic to us moderns. Probably most spaces don't have a hairline crack shortcut through the manifold -- but it only takes one.

How do you secure your position as the world makes an important technological transition? If you're politically savvy, you'll be as fine as anyone else. For the rest of us, the best bet is to be one of the builders, and that's best accomplished by neurotic high-IQ systematization. Unless you have a better suggestion?

I do feel uncertain, seeing that QC has been through all this and decided to do something else instead. He's smart, maybe he knows something I don't? My current best bet is that he's a tragic case of someone who wandered off the true path, lured by the siren song of postrat woo. But I do sometimes wonder if he's made some Kegan step that I've failed to comprehend.

I found that thread very interesting. Reading between the lines and overthinking until I can see what isn't really there, I see two big issues.

First, QC sees the issues of cognitive bias and running on untrusted hardware as specifically human issues. Yudkowsky is a space alien, of a superior species, so he is unaffected by these issues. His takes on AI risk are gospel truth.

Second, I'm reminded of testimony before Congress about unconscious racial bias. The witness claims that everyone harbours unconscious racial bias. The Congressman asks: which races are you unconsciously biased against? This leads to a deer-in-headlights moment rather than an answer. I want to ask QC whether his own judgement is subject to cognitive biases and whether his mind runs on untrusted hardware. Specifically, is his judgement that Yudkowsky is telling the gospel truth from a position of superiority itself the gospel truth? QC seems to think that he too is a space alien, free from human failure modes.

The thread seems like a living-out of the Zen parable about the Dharma being a finger pointing at the moon. QC has studied hard and knows all about the finger, its joints, and its nail.

What I have written comes across as unsympathetic to QC. Or does it? The impression of a lack of sympathy comes from inferring that I see myself as a space alien, of a superior species, unlike Yudkowsky and QC, who are merely human. Actually, I think that I suffer from cognitive biases and am running on untrusted hardware. I'm writing from a position of despair. How do we know anything? Epistemology is difficult. Epistemology is harder than that: we read the Sequences and still don't get it. We encounter arguments about AI risk and never stop to think: Well, that has been crafted by Moloch to suck me in; maybe I should stay away and leave it to less vulnerable people to wrestle with the issue.

My antidote to epistemological despair is reading the history of science. There are ways round biases. The double-blind, randomized controlled trial is one route, available to a well-funded team. There are other instructive stories. I particularly like Blaise Pascal's 1647 pamphlet on barometers. One of the experiments involved a six-foot-tall mercury barometer. Why six feet, when three feet is tall enough? So that he could fit a three-foot-tall mercury barometer inside it, and watch the mercury run out of the inner one when it was in the vacuum. The mad lad actually went the extra mile to check what was really going on.
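(A quick back-of-the-envelope check of the "three feet is tall enough" bit, using standard textbook values rather than anything from the pamphlet itself: the mercury column height just balances atmospheric pressure against hydrostatic pressure,

\[
h = \frac{P_0}{\rho_{\mathrm{Hg}}\, g}
  = \frac{101{,}325\ \mathrm{Pa}}{13{,}590\ \mathrm{kg/m^3} \times 9.81\ \mathrm{m/s^2}}
  \approx 0.76\ \mathrm{m} \approx 2.5\ \mathrm{ft},
\]

so a three-foot tube suffices for a working barometer, and six feet leaves room to fit the whole inner barometer inside the vacuum at the top.)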

I don't see a clever hack that lets me cross-check AI alarmism to see if it is for real. I'll wait. For me, the core of "rationality" is studying clever cross-checks. Get a feel for what we can know despite cognitive biases if we are willing and able to do the extra work. Get a feel for what we cannot know, and learn patience.

One thing that people hate, of course, is the name "rationalism". It's conceited. Doesn't everyone think they are rational? What makes rationalists so special?

Relevant: Hanania has a good answer to this question, defining rationalism thusly:

"Rationalism: The belief that fewer topics and ideas in the areas of politics, morality, ethics, and science should be considered taboo or sacred and not subject to cost-benefit analysis.

I think this gets at what you are saying. "Rationalism" is primarily about the ability or desire to examine things that others consider taboo.

This is also how a rationalist layperson can make insights and connections that a scientist with decades of training cannot. The taboos in academia are so strong that there is low-hanging fruit everywhere. Infamously, HBD has great explanatory power yet is completely tabooed inside academia.

Now, in academia, does everyone really "understand the truth but pretend not to"? I don't think so. People's ability to self-deceive is very strong. For every person in the social sciences who secretly believes in HBD, there are 10 blank-slaters. Social conformity bias shapes beliefs. It's just less powerful in rationalists, whose lack of conformity also makes them irritating.

Which article is that Hanania quote from? I’d love to read it.

https://www.richardhanania.com/p/why-ea-will-be-anti-woke-or-die

Hanania is up there with Scott as a must-read blogger, IMO.

"Rationalism: The belief that fewer topics and ideas in the areas of politics, morality, ethics, and science should be considered taboo or sacred and not subject to cost-benefit analysis.

I somewhat disagree with this. If we consider LessWrong to be the hub/mecca of rationalism, there is little on there that is particularly taboo. I would feel comfortable sharing most links there with a general audience. I think rationalism has more to do with systematic thinking, heuristics, long-form social commentary, etc. Dropping red pills about IQ and other aspects of HBD is a small part of rationalism. There are plenty of rationalists who don't think IQ is that important, or are indifferent about it, or don't subscribe to a purely deterministic view of IQ vs. individual achievement.

As Yudkowsky himself pointed out, tell one lie and the truth will be your enemy. Without the uncomfortable truths, rationalism just gets eaten by rationalization.

I think both are aspects of rationalism. You start with the whole "thinking better" aspect, but from there it's an easy hop to recognizing "thought-terminating clichés", "arguments as soldiers", and "political thinking" as things that impede rational thought. So an honest rationalist sees someone's sacred cow and immediately goes full matador, before his superego manages to stop him from saying stuff like, "yes, I agree this is the right policy to promote equity as you defined it, but why do you consider equity as you defined it the most desirable societal outcome?"

I would be curious whether rationalists are even less "social conformity" biased. I'd guess the average rationalist grew up an outcast who became (often irrationally) suspicious of the ingroup, gravitated to outgroups to fulfill their social needs, and went on to justify their continual social exclusion via their own intelligence, whether they were intelligent or not.

They are just as influenced by social conformity, but through an inverted/rejection/wound/resentment model that leaves them able to see through the blind spots of the normies, yet just as biased when it comes to the particular outgroups they identify with. Which is still valuable to have, but the self-congratulation is probably unwarranted.

This reminds me of how, it seems to me, a disproportionate number of people who make an identity out of being atheists (as opposed to just being atheists) are people who grew up in oppressive religious households and rebelled against them*. Similarly, there is a difference between people who just take some good ideas from rationalism, on the one hand, and people who make an identity out of being rationalists, on the other.

It does not help matters that rationalism was started by a guy who has what could fairly be called very extreme and dogmatic feelings about a certain topic (artificial intelligence) and that so many prominent rationalists are from the same Bay Area milieu, both of which things give the movement a sort of cloistered vibe that to some observers contrasts oddly (although not necessarily fairly) with its professed goal of pure reason.

*This is even more, and more obviously, the case for people who are into stuff like black metal and/or Satanism. Not that I think there is anything wrong with the vast majority of Satanism - which, from what I can tell, is mostly a fun hobby rather than some serious desire to do evil.

One thing that people hate, of course, is the name "rationalism". It's conceited. Doesn't everyone think they are rational? What makes rationalists so special?

There was a time when I would have agreed with you here, but I have been shocked by how many times I've seen and heard things that basically amount to, "look at these morons who think being correct is the most important thing in the universe." I think "rationalism" is a pretty good description of the ideology that hears that and goes "yes".