FTX is Rationalism's Chernobyl

You may be familiar with Curtis Yarvin's idea that Covid is science's Chernobyl. Just as Chernobyl was Communism's Chernobyl, and Covid was science's Chernobyl, the FTX disaster is rationalism's Chernobyl.

The people at FTX were the best of the best, Ivy League graduates from academic families, yet free-thinking enough to see through the most egregious of the Cathedral's lies. Market natives, most of them met on Wall Street. Much has been made of the SBF-Effective Altruism connection, but these people have no doubt read the sequences too. FTX was a glimmer of hope in a doomed world, a place where the nerds were in charge and had the funding to do what had to be done, social desirability bias be damned.

They blew everything.

It will be said that "they weren't really EA," and you can point to precepts of effective altruism they violated, but by that standard no one is really EA. Everyone violates some of the precepts some of the time. These people were EA/rationalist to the core. They might not have been part of the Berkeley polycules, but they sure tried to recreate them in Nassau. Here's Alameda Research CEO Caroline Ellison's Tumblr page, filled with rationalist shibboleths. She would have fit right in on The Motte.

That leaves the $10 billion question: How did this happen? Perhaps they were intellectual frauds just as they were financial frauds, adopting the language and opinions of those who are truly intelligent. That would be the personally flattering option. It leaves open the possibility that if only someone actually smart had been involved, the whole catastrophe would have been avoided. But what if they really were smart? What if they are millennial versions of Ted Kaczynski, taking the maximum expected-value path towards acquiring the capital to do a pivotal act? If humanity's chances of survival really are best measured in log odds, maybe the FTX team were the only ones with their eyes on the prize?


I have mixed feelings about this. On one hand, some people in rationalist circles made very inflated claims about how their methods of rationality give them almost superpowers and make them far better than regular humans at things regular humans are bad at. So for a regular human it's a bit reassuring that those supposed gods have exactly the same failure modes as mere mortals. On the other hand, it was nice to imagine there were gods walking among us, for a short time.

That said, I don't see how it really should hurt rationalism that much. I mean, yeah, a bunch of rationalists got taken in by a scam. Smart people falling for scams is not that rare. Isaac Newton - an intellectual titan by any measure - lost a fortune in the South Sea Company. There have been numerous other cases. It's not good, it's ok to feel bad about it - in fact it's probably necessary to feel bad about it - but it's hardly some kind of huge global catastrophe for rationalism or EA, IMHO. Stand up, dust off, analyze and admit your errors, try to do better next time.

It's not that they got taken in by a scam. They were the scam.

Nah. The scam is unrelated. Yeah, they overtly endorsed "double or nothing coin flips and high leverage", but that wasn't the scam. The scam was what they did when things went bad. If SBF had just let Alameda collapse in August (and massively damage FTX with it), that would have been a vindication of their philosophy and a success by their standards and those of most EAs.

The only thing that went wrong was that FTX, seeing Alameda on the verge of bankruptcy, gave it $8 billion of customer money rather than let it happen. That was the classic panic response of someone running a bad hedge fund. Everything before that was fine.

I'd argue that FTX's financials were broken before that point (even assuming no other wacky transfers show up, which I wouldn't be surprised by), but in a more boring way: extremely high (and unnecessary!) expenditures compared to a business model that would very likely never support them. This is boring compared to the fraud, but it's also a very high-visibility problem.

Maybe FTX could have pulled a rabbit out of a hat in some alternate universe where Alameda had broken even or only taken small losses. It's possible they could have scaled up another order of magnitude, and/or kept some sort of premium where people would be willing to pay five times the rates other exchanges offered. Even if they hadn't, the collapse almost certainly wouldn't have been as abrupt or as entertaining. But they at least had a clear and present risk from nearly any pause or drop in crypto, or even a decrease in growth, and it's important to notice that.

I don't think so. The EA work of FTX/SBF was rather incidental to the main scam activities, and I don't see much input from the EA/rat circles into the scam. They got some outputs from the scam, and maybe lent it a small measure of credibility, but I don't see any way they were an inherent part of the scamming or had any substantial input into the decisions FTX investors and clients made.

I don't see much input from the EA/rat circles into the scam.

You would be wrong. Here is Caroline explaining her trading philosophy literally replying to Scott Alexander.

The money quote: "Those people are lame and not EAs; this blog endorses double or nothing coin flips and high leverage". That's the smoking gun.

Maybe you misunderstood me. That person is the CEO of the FTX-affiliated hedge fund Alameda Research. She's the one making the trades (with Sam's approval).

OK, yes, I didn't realize whose messages those were - my mistake. It's more of a smoking gun than I realized; I'll need to research this more deeply.

It's not. Double or nothing with leverage is just normal finance with EA goals. The scam is what they did when they hit "nothing" - rather than take the L, they siphoned off customer money for one last big bet. (And it might have worked, if they'd gotten another six months.)

I don't agree this is "normal finance". It's a high-risk game that is sometimes played in finance, but they committed fraud in the way they played this game and the way they presented it. My question was whether the EA/rat angle played any role in it. Initially I thought the connection was purely coincidental; now, reading more about it, I'm not so sure. It looks like there are aspects of both a con aimed at the EA/rat world and of using EA/rat premises as justification for doing evil things - or at least for tearing down all the fences that usually keep one from doing evil things, which eventually and inevitably led to doing evil things.