Scott Alexander on Sam Bankman-Fried, FTX and Effective Altruism

astralcodexten.substack.com

I made this a top level post because I think people here might want to discuss it but you can remove it if it doesn't meet your standards.

Edit: removed my opinion of Scott from the body

This doesn't really feel all that reassuring. MacAskill's "in the vast majority of cases" and Eliezer's "(For Humans)" are explicit caveats. Scott doesn't even bother making more concessions than "well shucks you can't do much good if you've made all the deontologists mad enough that they get in your way".

For all of them there's still an implicit out: that lying/cheating/stealing for the greater good is okay if you can get away with it. Eliezer goes the furthest to emphasize that, no, really, you won't get away with it, but he still leaves that out there. He has to.

So all it takes is a little narcissism for a 140 IQ altruist to rationalize that he can in fact part the 90 IQ rubes from their money and have them be none the wiser. Sure, amongst humans such a thing is unthinkable, but are those two even really the same species?

So all it takes is a little narcissism for a 140 IQ altruist to rationalize that he can in fact part the 90 IQ rubes from their money and have them be none the wiser. Sure, amongst humans such a thing is unthinkable, but are those two even really the same species?

Okay? How is this related to utilitarianism? If I'm just going to ignore all moral advice to make up rules and do whatever I feel like because I think I'm better than everybody else, I can do that as a deontologist or a utilitarian.

People with no care for morality are not in question. Of course it doesn't matter to them. They simply do as they will.

The people in question aren't ignoring any moral axioms of utilitarianism. "The ends justify the means" is fundamental to it. You can thus be a committed utilitarian and do evil simply via bad calculations.

Other moral systems fail in other ways. But a strict deontologist is not going to rob Peter to pay Paul.

Aside from the strictly amoral or immoral, one of the worst conceptual types is the one who knows a very long list of moral systems, from various flavors of deontology to different types of consequentialism to the ethics of many virtues, and applies whichever moral justification fits his situation. In any given instance, he has a perfectly cogent explanation for why his choice was justifiable, and even an argument for why his justification was the best type for the context, and yet the central point of morality was lost along the way: fencing off bad choices from good ones. Different systems will have somewhat different fence patterns, but hotswapping fences to fit your behavior ends up destroying the concept and purpose of the fence.

This does sound like a neat rejection of Scott's deontology in the streets, utilitarianism in the sheets approach.

But I can't think of a coherent way for one to hot swap morality. For anyone's system to be coherent, there must ultimately be some moral facts or axioms underlying it that reign supreme in the meta-morality calculus.

People who think they're doing this are probably just utilitarians who think deontology is nothing more than rule utilitarianism by heuristic.

Committed theists don't have a backup morality for when God tells them to do evil. They either go whole hog or come up with biblical copes.

People who want the hodgepodge pick virtue ethics.

Those caveats are critically important though, and are fairly universally endorsed by anyone who isn't a philosopher and takes things seriously.

For all of them there's still an implicit out: that lying/cheating/stealing for the greater good is okay if you can get away with it.

The explicit example on the EA forum was Schindler. Another fun example is intelligence/spying, which is somewhat critical for a nation-state, especially during wars (which happen often), yet consists of lying/cheating/stealing.

"well but in those cases the greater good is just TOO IMPORTANT because it's an actually important thing" yeah that's the point, 'means' and 'ends' are both causes and effects that interact and mix together!

Eliezer goes the furthest to emphasize that, no, really, you won't get away with it, but he still leaves that out there. He has to.

Eliezer points out in HPMOR that perfect crimes probably happen all the time; they just get ruled a suicide, or an electrical accident.