Culture War Roundup for the week of February 20, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Effective Altruism drama update:

You may remember a few weeks ago the article Effective Altruism Promises to Do Good Better. These Women Say It Has a Toxic Culture Of Sexual Harassment and Abuse was published in TIME (Motte discussion here).

It's been a hectic two weeks on the EA forum. Meta community posts have been consistently getting more engagement than object-level posts about actual charity. There is a palpable tension on the site between the hardcore rationalists and the mainstream liberals. Vote counts swing on an hourly basis depending on who has the upper hand, but overall the discussion has remained (mostly) civil. A few days ago, the (in)famous Aella posted "People Will Sometimes Just Lie About You", a devastating screed against prudes, anonymous allegations, and haters of eccentric Bay Area parties. Eliezer himself even showed up, taking a break from doomscrolling to deliver a supporting bombardment against the mainstream press.

There's nothing EAs care about more than cute poly girls and AI. Once Aella and Eliezer weigh in, case closed, right? WRONG.

A statement and an apology

EV UK board statement on Owen's resignation

In a recent TIME Magazine article, a claim of misconduct was made about an “influential figure in EA”:

"A third [woman] described an unsettling experience with an influential figure in EA whose role included picking out promising students and funneling them towards highly coveted jobs. After that leader arranged for her to be flown to the U.K. for a job interview, she recalls being surprised to discover that she was expected to stay in his home, not a hotel. When she arrived, she says, “he told me he needed to masturbate before seeing me.”"

Shortly after the article came out, Julia Wise (CEA’s community liaison) informed the EV UK board that this concerned behaviour of Owen Cotton-Barratt;[1] the incident occurred more than 5 years ago and was reported to her in 2021.[2] (Owen became a board member in 2020.)

One of the perpetrators from the article has been identified. So who wins?

Well, it's too soon to say. This seems to be the first sexual misconduct allegation confirmed against an official EA leader, so you can't really call the TIME story that broke it a complete pile of journalistic garbage. It does seem like a pretty minor infraction, though. After reading Owen's statement, it seems like it could fall under the "weird nerds trying to date" umbrella, but maybe you can't use that excuse when you're a board member.

One aspect I haven't seen discussed is that this is the same guy who was behind the controversial decision to buy Wytham Abbey for 15 million pounds (see here). In light of current events, it sure looks to me like EA officials decided to blow millions on a luxury venue in Oxford in order to impress women.

Iron Law of Institutions comes for us all.

No exceptions.

Who are you accusing of seeking power within EA? Or, within what other institution is power being sought?

I would assume Keerthana Gopalakrishnan, plus probably others with mainstream views/ordinary philanthropy grifters who are backing her. There's speculation it's a group of people out of Oxford.

Chronologically, the first thing that happened was her making a post on eaforums that ended with a bunch of demands that EA change to make her happy. She admitted that she knew it wasn't up to the normal epistemic standards of EA:

Also, the post is not optimized for analytical/argumentative quality. My only goal is to speak my mind

After a bit of entirely polite pushback she demanded the post be taken down and went to the media.

https://ea.greaterwrong.com/posts/NacFjEJGoFFWRqsc8/women-and-effective-altruism

(Or maybe she had gone to the media prior and the reporter suggested a badly received post on the forums would look good in the story. Keerthana does seem to be struggling pretty hard to interpret the post as badly received in spite of half the responses being "you're so brave".)

Another question arises: why does she even want to be part of EA? She clearly does not align at all with EA epistemics or values:

For a community that is so alarmist for 5 or 1 or 0.1 percentage of X-risk from AI, giving a wide berth for sexual harassment is utterly hypocritical.

The most obvious answer is that she simply viewed EA as a place she could effectively grift, probably by subverting it with mainstream memes and turning the eye of Sauron on it.

Yeah, reading the essay (via Wayback Machine), it rings a lot of alarm bells, and the "oh no I feel unsafe now" rings more.

Yeah, taking a look at her and knowing the psyche of these types of people (I'm sort of one of them myself) I'm 90+% convinced that this woman doesn't care about EA at all, rather what she cares about is power.

In fact I'm 70+% convinced she doesn't even believe what she's saying in the allegations, but rather is using it as an effective tool to gain power and EA is just an available niche she found, it could as easily have been tennis or some nameless corporation had the dice rolled slightly differently.

And the reason this tool is effective is that white people have given it power; were it not effective, she wouldn't be using it at all, so you can't really blame her for what's happening either. Yes, you can blame her for being a manipulative bitch, but we all have a bit of a manipulative bitch inside of us, and you can't blame her for using the most effective tools for the job. If you're going to blame someone for what the tools specifically are, then that blame falls squarely upon white people.

I chatted with her for a week or so on an OLD app, and, for what that level of interaction is worth, that doesn't align with my read on her, which is something on the "smart, somewhere on the spectrum, and a bit odd" side of things. Which seems a pretty solid fit with EA even ignoring formal ideological beliefs, TBH.

Fair, I don't know her personally, so your impression is probably more accurate. It's just that there are a lot of South Asians who have this type of belief system: note how, even though tons of East Asians are present at the lower levels of high-tech firms, at the top there is a disproportionate number of South Asians, and there are reasons why they rose so high (competence + the killer instinct).

the killer instinct

...what does that mean, exactly? Being faster at stabbing your competition in the back to get ahead?

I'd imagine it means more willing to (and therefore faster to), yes. You can wait to be uplifted and invited to the upper echelons or you can climb up a mountain of bodies to get there.

You might be shocked to discover that people seeking power often learn how to emulate the behaviors and values of those they wish to manipulate. You usually don't recognize it until after you've been burned. (Learned from experience.)