Culture War Roundup for the week of December 11, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Three months ago, LessWrong admin Ben Pace wrote a long thread on the EA forums: Sharing Info About Nonlinear, in which he shared the stories of two former employees at an EA startup who had bad experiences and left determined to warn others about the company. The startup is an "AI x-risk incubator," which in practice seems to look like a few people traveling around exotic locations, connecting with other effective altruists, and brainstorming new ways to save the world from AI. Very EA. The post contains wide-ranging allegations of misconduct, mostly centering on the treatment of two employees they hired who started traveling with them, ultimately concluding that "if Nonlinear does more hiring in the EA ecosystem it is more-likely-than-not to chew up and spit out other bright-eyed young EAs who want to do good in the world."

He and, it seems to some extent, fellow admin Oliver Habryka mentioned spending hundreds of hours interviewing dozens of people over the course of six months to pull the article together, ultimately paying the two main sources $5000 each for their trouble. It made huge waves in the EA community, torching Nonlinear's reputation.

A few days ago, Nonlinear responded with a wide-ranging tome of a post: 15,000 words in the main body, plus a 134-page appendix. I had never heard of either Lightcone (the organization behind the callout post) or Nonlinear before a few days ago, since I don't pay incredibly close attention to the EA sphere, but the response bubbled up into my sphere of awareness.

The response provides concrete evidence, in the form of contemporaneous screenshots, against some of the most damning-sounding claims in the original article:

  • accusations that when one employee, "Alice", was sick with COVID in a foreign country, nobody would get her vegan food, so she barely ate for two days, turned into "There was vegan food in the house and they picked food up for her, but on one of the days they wanted to go to a Mexican place instead of getting a vegan burger from Burger King."

  • accusations that they promised another, "Chloe", compensation around $75,000 and stiffed her on it in various ways turned into "She had a written contract to be paid $1,000/month with all expenses covered, which we estimated would add up to around $70,000."

  • accusations that they asked Alice to "bring a variety of illegal drugs across the border" turned into "They asked Alice, who regularly traveled with LSD and marijuana of her own accord, to pick up ADHD medicine and antibiotics at a pharmacy. When she told them the meds still required a prescription in Mexico, they said not to worry about it."

The narrative the Nonlinear team presents is of one employee with mental health issues and a long history of making accusations against the people around her, who came on board, lost trust in them due to a series of largely imagined slights, and ultimately left and spread provable lies against them, while another, hired to be an assistant, was never quite satisfied with being an assistant and left frustrated as a result.

As amusing a collective picture as these events paint of what daily life at the startup actually looked like, they also make it pretty clear that the original article contained multiple demonstrable falsehoods alongside its unrebutted claims. More, the Nonlinear team emphasized that they had been given only a few days to respond to the claims before publication, and that when they asked for a week to compile hard evidence against the falsehoods, the writers told them the piece would come out on schedule no matter what. Spencer Greenberg warned the authors of a number of misrepresentations the day before publication and sent screenshots correcting the vegan-food claim; they fixed some of the misrepresentations, but said that by the time the screenshots arrived it was too late to change anything.

That's the part that caught my interest: how did the rationalist community, with its obsession with establishing better epistemics than those around it, wind up writing, embracing, and spreading a callout article with shoddy fact-checking?

From a long conversation with Habryka, my impression is that a lot of EA community members were left scarred and paranoid after the FTX implosion, correcting towards "We must identify and share any early warning signs possible to prevent another FTX." More directly, he told me that he wasn't too concerned with whether they shared falsehoods originally so long as they were airing out the claims of their sources and making their level of epistemic confidence clear. In particular, the organization threatened a libel suit shortly before publication, which they took as a threat of retaliation that meant they should and must hold to their original release schedule.

My own impression is that this is a case of rationalist first-principles thinking gone awry, applied to a domain where it can do real damage. Journalism doesn't have the greatest reputation these days, and for good reason, but his approach contrasts starkly with its aspiration to heavily prioritize accuracy and verify information before releasing it. I mention this not to claim that journalists do so successfully, but because his approach is a conscious deviation from that: an assertion that if something is important enough, it's worth airing allegations without pausing to examine the contrary evidence your subjects are asking you to consider.

I'd like to write more about the situation at some point, because I have a lot to say about it even beyond the flood of comments I left on the LessWrong and EA mirrors of the article, and because I think it presses at some important tension points. It's a bit discouraging to watch communities who try so hard to be good from first principles speedrun so many of the pitfalls broader society built guardrails around.

which in practice seems to look like a few people traveling around exotic locations, connecting with other effective altruists, and brainstorming new ways to save the world from AI

While snarky, this is indeed my impression of the (current) EA movement. At the start, with the mosquito nets, this at least was practical, boots-on-the-ground charity, and they could be forgiven for their slightly smug 'we're doing charidee right (unlike the mugs who went before we appeared fully-formed from the head of Zeus)' attitude because they were indeed helping the poor and deprived.

But helping the poor and deprived wasn't the full gamut of EA activity and philosophy, and the crank stuff (sorry, people, I do not care if insects suffer) was there from the start, albeit as a minor part. AI risk was one of the LessWrong and other rationalist/rationalist-adjacent bugbears, and because of the cross-over between EA and the rationalist bubble, that was there too.

And it was sexy! and modern! and interesting! in a way that plain, bread-and-butter, 'help the poor with an ongoing problem that, despite all the fancy technical attempts to solve it, looks to remain intractable: malaria by mosquito-borne transmission' wasn't, because all the former mugs had been doing 'missions to Africa' and the likes for decades, so what makes you so special?

And it involved flying around and going to conferences and hob-nobbing with Big Names and getting yourself known in those circles, and was way more appealing to the SF nerd in us all (c'mon, if we're hanging round these parts, even if we're not rationalists or EA, we're SF nerds).

So EA the movement seems, to me at least looking in from the outside, to have subtly but definitely transformed into 'making a living by taking in each other's washing' - going to conferences to network about getting an internship to get into a programme about signing people up to attend EA conferences.

(Here's where I mention the manor house in Oxford).

That's why, while I understand Scott doing an apologia for EA and appealing to all the lives it (presumably/allegedly) has saved, I don't think he has yet entirely grappled with the criticisms from the outside about 'travelling around exotic locations and brainstorming for projects which are not practical, boots-on-the-ground, charity'. If (and it's one hell of a big if) AI is going to Doom Us All unless it's perfectly aligned with nice, liberal, 21st century middle-to-upper middle class San Franciscan values, then their work is important.

If AI screws us over because (deep breath) the free market capitalist system incentivises greed and the gold rush is on to get to market first and grab the majority share with your product, and just ignore that right now the product you're peddling makes shit up and is totally unreliable but people are being sold on the notion that it's super-ultra-mega-accurate, just believe all it says but the thing is never going to become self-aware and have its own goals and I highly doubt even smarter than human intelligence (exhale) - then all the fancy conferences mean nothing. Except pleasant trips to Oxfordshire manor houses for EA talking sessions where you pretend to be doing something meaningful - junkets, in other words.

And if you've reached the point of junkets, you are not "doing charidee right unlike those other mugs".

As for the rest of it? Sounds like the typical EA over-sensitivity/scrupulousness where small things get blown up into microaggressions, unfulfilled promises, and 'you said I'd get X and then I never got X' pouting where all kinds of accommodations for neurodiversity, gender diversity, I don't know what diversity, are expected implicitly.

EDIT:

That's the part that caught my interest: how did the rationalist community, with its obsession with establishing better epistemics than those around it, wind up writing, embracing, and spreading a callout article with shoddy fact-checking?

I think, and this is only a vague impression so don't take it as Gospel, that it's a case of the pendulum over-correcting and swinging too far to the other side. There have been previous internal scandals among rationalist groups, and subsequent accusations of cover-ups and people in charge not taking the complaints seriously/not acting quickly enough/doing their best to hush it up.

So I think there's a sensitivity around being seen to 'victim blame' and not immediately strike while the iron is hot when you hear people accusing EA/EA-aligned groups of wrongdoing, and this perhaps led in this instance into jumping the gun. Fact-checking could be seen as denying the truth, trying to delay embarrassing revelations, and even a form of harassing the victims by making them respond to little, nit-picky details.

The whole "my vegan diet/my money that I was promised" and so on sounds exactly like what I've come to expect from these types, to be frank (and a little mean) about it. Wanting a whole specific vegan product from one place and kicking up about not getting it. If you're sick with Covid, you're likely not to be eating much anyway, and if you can eat to the point that you're fussy about "I only eat this not that", then you're not that sick. One of my siblings got Covid and couldn't even keep down water because she vomited everything she consumed straight back up, so I was genuinely worried about her getting dangerously dehydrated; that's not at all the same as "I didn't get my vegan din-dins".

EDIT EDIT: To be fair, if she was that sick, and her stomach was sensitive, it may well have been that she could only eat that one particular Burger King vegan burger; Mexican food does sound like it would be too much. But building it up into The Persecution of the Vegan Joan of Arc is the kind of overly dramatic, self-regarding navel-gazing that a lot of the writing by EAs and LessWrongers and lesser lights exhibits. That's one of the attractions of Scott's writing for me: he's never (or barely ever) indulged in that slightly whiny "I have this entire laundry list of Things wrong with me and I need and demand these special accommodations and I continuously gaze into the mirror of my soul and you lot get the reports from the frontier on that every five minutes and any criticism no matter how mild is hate speech".