Scott Alexander on Sam Bankman-Fried, FTX and Effective Altruism

astralcodexten.substack.com

I made this a top-level post because I think people here might want to discuss it, but you can remove it if it doesn't meet your standards.

Edit: removed my opinion of Scott from the body


It's nice to see #5: various examples of EA-adjacent figures arguing that shady dealings aren't justifiable for effectiveness. A common thread in responses (such as the whole "Chernobyl" thread on the frontpage) has been exactly that sort of suspicion of EA: they're utilitarians, which means they must be able to justify anything, and SBF is proof of what we all suspected was going on. No. This isn't Nam, this is bowling, there are rules. Thinking that EA cause area so-and-so is underserved is not license to steal/deceive/etc., and I'll be glad to have these statements on hand the next time someone asserts that it is.

This doesn't really feel all that reassuring. MacAskill's "in the vast majority of cases" and Eliezer's "(For Humans)" are explicit caveats. Scott doesn't even bother conceding more than "well shucks, you can't do much good if you've made all the deontologists mad enough that they get in your way".

For all of them there's still an implicit out: that lying/cheating/stealing for the greater good is okay if you can get away with it. Eliezer goes the furthest to emphasize that, no, really, you won't get away with it, but he still leaves that out there. He has to.

So all it takes is a little narcissism for a 140 IQ altruist to rationalize that he can in fact part the 90 IQ rubes from their money and have them be none the wiser. Sure, amongst humans such a thing is unthinkable, but are those two even really the same species?

Those caveats are critically important, though, and are near-universally endorsed by anyone who isn't a philosopher and takes things seriously.

For all of them there's still an implicit out: that lying/cheating/stealing for the greater good is okay if you can get away with it.

The explicit example on the EA Forum was Schindler. Another fun example is intelligence/spying, which is somewhat critical for a nation-state, especially during wars (which happen often), yet is lying/cheating/stealing.

"well but in those cases the greater good is just TOO IMPORTANT because it's an actually important thing" yeah that's the point, 'means' and 'ends' are both causes and effects that interact and mix together!