FTX is Rationalism's Chernobyl

You may be familiar with Curtis Yarvin's idea that Covid is science's Chernobyl. Just as Chernobyl was Communism's Chernobyl, and Covid was science's Chernobyl, the FTX disaster is rationalism's Chernobyl.

The people at FTX were the best of the best, Ivy League graduates from academic families, yet free-thinking enough to see through the most egregious of the Cathedral's lies. Market natives, most of them met on Wall Street. Much has been made of the SBF-Effective Altruism connection, but these people have no doubt read the Sequences too. FTX was a glimmer of hope in a doomed world, a place where the nerds were in charge and had the funding to do what had to be done, social desirability bias be damned.

They blew everything.

It will be said that "they weren't really EA," and you can point to precepts of effective altruism they violated, but by that standard no one is really EA. Everyone violates some of the precepts some of the time. These people were EA/rationalist to the core. They might not have been part of the Berkeley polycules, but they sure tried to recreate them in Nassau. Here's Alameda Research CEO Caroline Ellison's Tumblr page, filled with rationalist shibboleths. She would have fit right in on The Motte.

That leaves the $10 billion question: How did this happen? Perhaps they were intellectual frauds just as they were financial frauds, adopting the language and opinions of those who are truly intelligent. That would be the personally flattering option. It leaves open the possibility that if only someone actually smart had been involved, the whole catastrophe would have been avoided. But what if they really were smart? What if they are millennial versions of Ted Kaczynski, taking the maximum expected-value path towards acquiring the capital to do a pivotal act? If humanity's chances of survival really are best measured in log odds, maybe the FTX team were the only ones with their eyes on the prize?
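For readers unfamiliar with the jargon: "log odds" is just the logit of a probability, and the numbers in the example below are my own illustration, not anything claimed in the post:

$$\operatorname{logit}(p) = \log\frac{p}{1-p}$$

On this scale, raising a survival probability from 1% to 2% is a gain of $\log\frac{0.02/0.98}{0.01/0.99} \approx 0.70$ nats, which is why someone maximizing log odds can treat tiny absolute shifts in a near-hopeless probability as enormous wins.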


Rationality at least has an ethos you can boil it down to in order to determine whether you're doing it 'correctly.'

"Rationality is About Winning."

If you are trying to practice rationality and find yourself losing, or constantly being defected against, or otherwise in a disadvantageous Nash equilibrium, you know you're doing something wrong and need to step back and reassess. Whether most rationalists do this, I dunno.
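A "disadvantageous Nash equilibrium" is easiest to see in the one-shot Prisoner's Dilemma, where mutual defection is the unique equilibrium even though both players would prefer mutual cooperation. This is my own minimal sketch (the payoff numbers are the standard textbook ones, not anything from the comment):

```python
# (row action, col action) -> (row payoff, col payoff)
# C = cooperate, D = defect; standard Prisoner's Dilemma payoffs.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def is_nash(row, col):
    """True if neither player gains by unilaterally deviating."""
    r_pay, c_pay = PAYOFFS[(row, col)]
    best_row = max(PAYOFFS[(a, col)][0] for a in "CD")
    best_col = max(PAYOFFS[(row, a)][1] for a in "CD")
    return r_pay >= best_row and c_pay >= best_col

equilibria = [cell for cell in PAYOFFS if is_nash(*cell)]
print(equilibria)  # [('D', 'D')] -- the only equilibrium is mutual defection
```

Mutual defection pays (1, 1) while mutual cooperation would pay (3, 3), so if you keep finding yourself at the (D, D) cell, that is exactly the "constantly being defected against" signal the comment describes.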

But EA seems to have a much squishier set of metrics by which it measures success, and I think the strongest critiques against it are mostly around the lack of serious tracking of outcomes in a way that is truly reliable and legible.

Which rationalists can be comfortable with since merely increasing the probability that people's lives are improved is just as good as 'actually' improving them.

But in practice, I would guess that fraud and simple inability to measure outcomes accurately interferes with the amount of good they're truly doing in the world, and they seem to lack what I would call 'defensive cynicism' about the world and the people they end up trusting.