Culture War Roundup for the week of March 4, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Point #2, #3, and the first half of #4 are reasonable for EAs and potential-EAs to know, but it's unclear that any of them constitute problems. For instance, to what extent is it bad that one billionaire moves a majority of the resources? To the extent it is bad, what realistic alternative would be better?

The second half of #4 refers to a problem that will plague anyone who does hard-to-evaluate work at any large organization - i.e. nearly all white collar work, and a significant amount of government and blue collar work: namely, that success depends on the perception of your work's value rather than its actual value, which leaves you with a dichotomy:

  • myopically focus on growing the pie (i.e. providing value) while studiously ignoring how the pie is distributed (i.e. whether you get grants, raises, etc)
  • learn to play the game, doing work to the extent it helps you sell it

Obviously, there is a spectrum here. This is frustrating for the more scrupulous people, but beyond bad actors gaming the system, there are a number of causal reasons this dynamic persists:

  • the person most knowledgeable about your work is you - your manager (or customer) frequently knows ~10% as much as you
  • unlike in school, it is typically difficult to tell whether you are 2x slower than your coworker or if your task is 2x harder - this is made especially difficult when skillsets are diverse
  • managers don't typically like evaluating people, so they tend to avoid it by minimizing the amount of energy they put into it
  • and, yes, managers who optimize for signal rather than value tend to get promoted

The parallels for the hard-to-measure parts of EA are straightforward. This sucks, and I agree it's a "problem", but it's hard for me to imagine a clear solution. You seem to think it would be better if

  1. powerful EAs spent more time responding to comments on EA forum
  2. more grassroots-esque grants were given like Scott Alexander's

I intuitively agree with #2, but #1 seems really unclear to me. Commenters are nearly always less well-informed than the decision-makers, so it's unclear to me that this is actually a good use of the decision-makers' time. Maybe they could hire PR people to do this for them? Is that a good use of EA money? idk - maybe. But I suspect this would make you more upset rather than less.

blistering, white-hot competence

Can you give an example of any multi-billion dollar movement or organization that displays "blistering, white-hot competence"? If not, maybe your standard is unreasonable?

To sometimes take their funding, but to do your own thing and preserve your ability to comfortably leave

This seems blatantly anti-social and immoral.

Ultimately, this critique seems fundamentally to be an attempt to take someone whose genuine values match EA-the-philosophy and warn them that EA-the-movement differs, which is all well and good. However, it might be stronger if you

  1. provided concrete evidence that interventions are less effective than claimed
  2. offered concrete alternatives to this target audience.

hard-to-evaluate work at any large organization... learn to play the game

You can also be on the lookout for different games to play.

You seem to think it would be better if powerful EAs spent more time responding to comments on EA forum

I think this is too much of a simplification. I am making the argument that EA is structured such that leaders don't really aggregate the knowledge of their followers.

Can you give an example of any multi-billion dollar movement or organization that displays "blistering, white-hot competence"?

Some that come to mind: the Catholic Church in Spain from 1910 to the early 2000s, Apple, Amazon, SpaceX, the Manhattan Project, the Israeli nuclear weapons project, Peter Thiel's general machinations, Linus Torvalds's stewardship of the Linux project, competent Hollywood directors, Marcus Aurelius, Bismarck's unification of Germany and his web of alliances, the Chicago school, MIT's J-PAL (endowment size uncertain though), the Jesuits, the World Central Kitchen.

provided concrete evidence that interventions are less effective than claimed

I discussed a previous one on the Motte here; here is a more recent one: CEA spends ~$1-2M/year to host the equivalent of a medium subreddit, or a forum with probably less discussion than The Motte itself.

offered concrete alternatives to this target audience.

Here are some blue-sky alternatives, Auftragstaktik is one particular thing I'd want to see more of.

You can also be on the lookout for different games to play.

Do you mean leaving the company and/or deciding to put your energy into non-work things? Or something else?

leaders don't really aggregate the knowledge of their followers.

Hmm. I'm imagining something like an explicit set of users who are gatekeepers, so if I have a 10x idea, I can just convince one person to have The Powers That Be consider it? Something along those lines?

Some which could come to mind...

I think it's important to decide whether we're judging these from the inside or the outside.

If you went to work for Apple, I feel pretty sure you'd come away thinking it is woefully incompetent. From the outside, however, it largely appears competent. Not unlike the other FAANG companies, imo. Likewise, if you actually worked as a priest in the Catholic Church in Spain in the 20th century, I'd be shocked if you felt this was what "blistering, white-hot competence" looked like. From the outside, I think EA is pretty clearly amazingly competent, saving more counterfactual lives per dollar than nearly any other organization, even if you round everything hard-to-value to zero. From the inside, however, ...

Re EA being less effective. Alas, it is tedious, but I fear the only way for us to reach a common understanding is point by point, starting with

The Forum

First, re moderation policy - this is something we discuss occasionally here. Blunt people think it's crazy to mod someone just because they were blunt - it drives away such people and we lose their valuable opinions! Other people think the reverse effect is more powerful: blunt people drive away blunt-averse people and cause the loss of their valuable opinions. I'm unfamiliar with any actual evidence on the matter.

Next, spending. The comment you link to explicitly says they would not accept 2x funding, which imo puts them head and shoulders above the default of outside society (e.g. a team at an S&P 500 company, in the government, or at a typical nonprofit). I personally put a fair amount of weight on that kind of signal (I loved that Evidence Action closed down their busing program for not-enough-impact reasons). I think it's quite plausible that the forum's benefit of fostering an EA community creates new EAs and retains old ones to the extent that the value outweighs the $2M cost.

That being said, I think you are probably correct in your own comment in that thread in pointing out that there is a margin-average distinction being elided, so the $2M probably really is too high.

That comment also links to a page on how they're trying to have impact. The task they rate as the most promising is running job ads on the forum. The second-most promising is helping recruiters find potential candidates. Those seem reasonably valuable to me, but I'd still guess the EV is less than $2M.

That being said, there are some ameliorating factors:

  • The whole analysis depends on how much you think EA is money-constrained versus talent-constrained - fwiw, Scott leans more towards the latter. This takes the cake for the biggest misconception that new-to-EA people have - that money-constraints are the primary issue.
  • Building on that, the budget appears to have absolutely ballooned with the help of FTX funding. If this is true, it's unclear what exactly the counterfactual alternative was - i.e. was this money earmarked specifically for this forum? for community outreach? idk. Certainly, SBF's donations were not entirely effectiveness-driven.

Ultimately, I'm inclined to agree that $2M is too much, but without context on how the budget was determined, I'm not sure how much of a black eye this should be for EA as a whole.

Criminal Justice Reform

When I went through Open Philanthropy's database of grants a couple years ago, I felt only about half its spending would fall under conventional Effective Altruist priorities (e.g. global health, animal welfare, X-risk). That is, I've felt for a couple years that Open Philanthropy is only about half-EA - which, to be clear, is still dramatically better than the typical nonprofit - but I don't personally count them funding a cause as equivalent to the EA community thinking the cause is effective. #NoTrueScotsman

I'm going to be honest - I do not, tonight, have the time to go through the two "alternatives" links with the thoroughness they deserve.