FTX is Rationalism's Chernobyl

You may be familiar with Curtis Yarvin's idea that Covid is science's Chernobyl. Just as Chernobyl was Communism's Chernobyl and Covid was science's, the FTX disaster is rationalism's Chernobyl.

The people at FTX were the best of the best, Ivy League graduates from academic families, yet free-thinking enough to see through the most egregious of the Cathedral's lies. They were market natives; most of them met on Wall Street. Much has been made of the SBF-Effective Altruism connection, but these people have no doubt read the Sequences too. FTX was a glimmer of hope in a doomed world, a place where the nerds were in charge and had the funding to do what had to be done, social desirability bias be damned.

They blew everything.

It will be said that "they weren't really EA," and you can point to precepts of effective altruism they violated, but by that standard no one is really EA. Everyone violates some of the precepts some of the time. These people were EA/rationalist to the core. They might not have been part of the Berkeley polycules, but they sure tried to recreate them in Nassau. Here's Alameda Research CEO Caroline Ellison's Tumblr page, filled with rationalist shibboleths. She would have fit right in on The Motte.

That leaves the $10 billion question: How did this happen? Perhaps they were intellectual frauds just as they were financial frauds, adopting the language and opinions of those who are truly intelligent. That would be the personally flattering option, since it leaves open the possibility that if only someone actually smart had been involved, the whole catastrophe would have been avoided. But what if they really were smart? What if they are millennial versions of Ted Kaczynski, taking the maximum expected-value path towards acquiring the capital to do a pivotal act? If humanity's chances of survival really are best measured in log odds, maybe the FTX team are the only ones with their eyes on the prize?


Yeah, it's pretty bad. I'm fairly involved with EA, and while I knew SBF was a big donor, I had no idea how bad the hero worship had gotten, both within EA and among big financial institutions. To my eyes this reflects even more poorly on VC funders and high finance/trading in general: they were supposed to have done due diligence on FTX (which presumably they did, with the use of $10b in customer funds coming later), but they didn't see this coming either.

For instance, look at this archived profile of SBF from Sequoia, a VC fund that helped make FTX happen and memoryholed the article after the disaster. The hero worship in there is cringey, and in retrospect it's horrifying:

In devoting every waking moment of his life to work, SBF doesn’t feel he’s doing anything unusual or extraordinary. He’s doing what he feels every right-minded person should do—if they were big-hearted and clear-headed. He’s attempting to maximize the amount of good in the world.... SBF, on the other hand, seems qualitatively different: He seems utterly driven, as if by a lash.

and

It’s hard to see SBF in a clear light. The glitter of the self-made billions are blinding. His intellect is as awesome as it is intimidating.

and

“It’s not some pathological thing,” he continues. “For some bizarre reason—and I don’t get it—they want to help.” We’re talking about EAs in general, but also about SBF in particular. “They want to make a difference, and they want to devote their lives to doing that,” says Lerner, who is definitely not an EA. “But that still doesn’t clarify why, or if, that somehow diminishes the pleasure drive.” SBF is a mystery even to his own therapist.

it goes on

Yet again, I’m reminded of a certain novel. In Fitzgerald’s day, there was the real-life character of John Pierpont Morgan, who steered the nation as it made the transition from the 19th to the 20th centuries—the transition from an agricultural economy, powered by horses, to an industrial economy, run on rails. Who is going to do the same for us, in this new century?

Of course, the answer to that last question is: SBF. The blatant pedestalizing of the man throughout the piece is inherently disgusting to me, and the fact that it comes from a well-respected VC firm really lowers my faith in that entire class of folks, especially after the WeWork Adam Neumann disaster and all the other startup-founder debacles.

Either way, I've been trying to beat the drum in EA spaces for a long time that EAs put far too much focus on credentials. It's ironic that so many folks in the movement will tell you to your face that they don't care about credentials, only impact, and yet the entire leadership is nothing but blue-blooded Ivy League grads and other well-connected elites. It's a shame, because I think most people in EA have their hearts in the right place; they just can't take off the credential blinders and see that most of the academic/elite class is specialized in bullshitting and not much else.

It's the worship of intellect in rationalist circles. They are very smart, in general; they do tend to be well-intentioned, in general; and they do think they are working on "how to win," where that means putting their smarts and good hearts to work for the benefit of all humanity. They forget that brains alone are not enough; you also need a leavening of common sense or practical experience.

Whereas old-fashioned types like me were pointing out all along that thinking you know how to do charity better than all the groups that have ever done it over the history of humanity is boundless conceit, and no, it doesn't matter if you use financial analysis and statistics and all the rest of the jargony tools. They dug a pit, and fell into it themselves.

I'm not happy this happened, but I think a little chastening about "all those other sets of individuals did it wrong and made dumb mistakes, but not us" is no harm in the long run - provided they learn the correct lessons: don't believe your own hype, and even if Sam or Caroline or whoever were your good old pals from college, that doesn't mean a tap when it comes to running a billion-dollar business with no real experience.

This is a good example.

I don't think this is a good example, considering it was skewered on LessWrong itself.