Culture War Roundup for the week of November 21, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

The Bankman-Fried/FTX Saga just gets better and better. A "why oh why" article in the Wall Street Journal has plums to be plucked out, such as these.

(And if Will MacAskill wants to repair his reputation, he better make some moves sharpish because the media are painting him as Sam's guru who encouraged and indeed enabled him).

Mr. Bankman-Fried has said his law-professor parents instilled in him an interest in utilitarianism, the philosophy of trying to do the greatest good for the greatest number of people. He said he started putting those ideals into practice while majoring in physics at MIT. Concerned with the suffering of animals on factory farms, he said, he stopped eating meat.

Will MacAskill, then a philosophy graduate student, pitched Mr. Bankman-Fried on the idea of effective altruism, a way of applying some utilitarian ideas to charitable giving.

...Mr. Bankman-Fried had considered different career paths, he said in the “80,000 Hours” interview, but Mr. MacAskill suggested he could do the most good by making a lot of money and giving it away, a popular idea in the community.

Yeah, does anyone think that someone who doesn't know the first thing about EA or any of the people here, when reading this, is going to come away with a good view of all concerned? Personally I'm very amused that veganism has been dragged into this: "guy who swindled billions is against meat eating" 🤣 So let's count 'em up: that's utilitarianism, Effective Altruism, cryptocurrency, and veganism all tainted by association!

As for MacAskill, it sounds like he was in contact with Bankman-Fried up until quite recently:

The FTX Foundation’s favored causes included pandemic prevention and protecting humanity from the potential downsides of artificial intelligence. At a July meeting of the foundation, Mr. Bankman-Fried became deeply engaged in a discussion on how lightbulbs equipped with a particular frequency of ultraviolet light could eradicate airborne pathogens, Mr. MacAskill told the Journal this summer.

He has distanced himself now, but unfortunately that may be too little, too late:

[Future Fund’s] two largest public grants, of $15 million and $13.9 million, were awarded to effective altruism groups where Mr. MacAskill held roles. Mr. MacAskill, now a professor at Oxford University, wasn’t paid for his involvement in those organizations “other than expenses,” a spokeswoman for one of them said.

...Mr. MacAskill distanced himself from FTX as it was crumbling. In a string of tweets, he accused Mr. Bankman-Fried of personal betrayal and abandoning the principles of effective altruism. He was also one of the Future Fund staffers who quit.

But wait, that isn't the best bit:

Mr. MacAskill at times advised Mr. Bankman-Fried on more than just philanthropic matters. When Elon Musk started his campaign to buy Twitter, Mr. MacAskill sent the Tesla chief executive a text message, according to documents made public in the litigation over his purchase of the social-media firm. “My collaborator Sam Bankman-Fried has for a while been potentially interested in purchasing it and then making it better for the world,” he wrote.

Oh yes. Just imagine it. Instead of Musk buying Twitter, it could have been Bankman-Fried. If people are getting het-up about Twitter potentially collapsing, what would they think if Twitter was caught up in the undertow of the FTX collapse? 😈

that's utilitarianism, Effective Altruism, cryptocurrency, and veganism all tainted by association!

Well, if veganism survived association with Hitler (who wasn't vegan and, according to many accounts, wasn't even consistently vegetarian, though the popular meme holds that he was), SBF would be no more than a tiny blip.

Not an apples-to-apples comparison here.

Vegetarians might have "weird" ethical presuppositions, and they probably did in Hitler's time as well. But there is little overlap between what made vegetarians weird and what made Nazis bad.

EA has weird ethical presuppositions, and there is much overlap with what made SBF bad. I can't read his mind, but I'd wager he probably did think that a large (or, less charitably, merely positive) expected value was worth the risk (and the lies).
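
To make that concrete, here is a minimal sketch of the "naive EV" failure mode. The numbers are my own toy illustration, not anything SBF actually computed: a bet with a 51% chance to triple your stake has positive expected value every time, so a pure EV maximizer keeps going all-in, and yet repeated all-in play ends in ruin almost surely.

```python
import random

def simulate_all_in(rounds: int = 50, trials: int = 100_000) -> float:
    """Fraction of trials ending in ruin when the whole bankroll rides on every bet.

    Each round (toy numbers): 51% chance to triple the stake, 49% chance to lose it.
    Expected value per round is 0.51 * 3 = 1.53x, so naive EV maximization says
    "always bet everything" -- and almost every trial still ends at zero.
    """
    ruined = 0
    for _ in range(trials):
        bankroll = 1.0
        for _ in range(rounds):
            if random.random() < 0.51:
                bankroll *= 3.0   # win: stake triples
            else:
                bankroll = 0.0    # loss: everything is gone
                ruined += 1
                break
    return ruined / trials

if __name__ == "__main__":
    print(f"EV per round: {0.51 * 3:.2f}x")
    print(f"Chance of surviving 50 all-in bets: {0.51 ** 50:.2e}")
    print(f"Simulated ruin rate: {simulate_all_in():.4f}")
```

The expected value of the whole strategy is astronomical, but it is concentrated in a vanishingly small sliver of outcomes; that mismatch is exactly what people mean when they call raw EV maximization weird.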

And EA's beliefs are really weird! Mosquito nets are well and good. AI taking over the world, marrying the rich to be a better altruist, and giving a shit about ants are not. Many of their recent concerns read like madness to the layman. Yes, I think the naive utilitarianism of just maximizing EV is weird; I'm not the only one, so I won't waste much digital ink on that.

Also, I'm sure Hitler was one among many vegetarians. It's a common enough lifestyle that you don't draw too many inferences if you meet one. EA is not: SBF accounts for a much larger share of the EAs most people have ever heard of, perhaps the only one.

I'd wager EA is fucked. Whether for good, I don't know, but at least in the medium term.

I think the arguments about insect welfare at least deserve consideration.

They do. But my point was that they cost too many weird points to put front and center.

giving a shit about ants

Thanks for linking this. What a read.

At one point, on the topic of the ants, I said, in passing, something like: “may we be forgiven.” My girlfriend responded seriously, saying something like: “We won’t be. There’s no forgiveness.”

Quokkas are much too fierce for these people.

Jesus, Mary and Joseph. Not even the ants, the dust mites.

It’s easy, upon learning about dust-mites, for the question of whether to adjust one’s sheet related practices, in view of these mites, to never get genuinely asked. Washing one’s sheets is “what one does.” If we learn that it kills thousands of tiny creatures we didn’t know existed, it’s easy to conclude that apparently, killing these creatures is OK; it’s also, apparently, “what one does.” The idea that we might better protect and promote what we care about by changing our sheet related practices (Tomasik reports: “To be safe, I daily flap out my bed sheet into an unused area of my house in an effort to remove some of the dead skin on it”) is treated as a reductio ad absurdum of the idea that the welfare of the mites justifies such a change.

I have no idea who this Tomasik person is, but what an idiot. Unused area of your house? What area is that? And he really thinks piling up dirt in it is better somehow, more moral? If he's that freakin' worried about dust mites, do what generations of maidservants and housewives did: bring the sheets outside and shake them out (in your back yard or back garden, if you have such a place). Just like tablecloths, or beating the dust out of mats and carpets.

This is not morality, this is poisoned narcissism. "Oh my sense of responsibility is so large, I care about dust mites". If the guy was a Jain or something of that nature, I could respect this viewpoint. But it's just inflamed sensibilities, and that inflammation needs to be brought down. It's not healthy or moral.

And it didn't prevent him and his tender-hearted housemates from waiting too long, until the situation became intolerable: the house was infested with ants and the problem could no longer be ignored. Whereas if he hadn't been a wuss and had put down a line of ant powder around entry points (doors, windows), he would have killed fewer ants in the long run. Cruel to be kind, but the bleeding-heart sentimentality meant he ended up killing even more ants when the infestation finally had to be dealt with.

You have to have a high level of wealth and comfort to be this obsessed with "oooh mustn't kill the ants". This loolah has no idea how hugely privileged he is, in relation to other people, to have the luxury of waiting until his home gets overrun with ants before acting.

And then he ends up with "Imagine there was a sentient slime civilisation around us that we destroyed just by playing the guitar (?), should we stop using fire? Well, we already commit tons of harm anyway, so the best thing is just to be aware of it - and go on the same as before, only patting ourselves on the back for our exquisitely tuned sensibilities". Yes, this is So Deep, you are Very Aware and much superior to clods like me 🙄

Remember, folks, only you can adjust your sheet-related practices! Save the dust mites today! (If we do get replaced by the sentient cockroaches, we bloody well deserve it having come to a pitch of neurotic navel-gazing such as this).

I had to do some looking here. I figured the fixation on not killing insects was an allegory for empathy for humans outside your tribe.

Nope. Dude has spent a lot of time really fixated on insect suffering. But not just suffering caused by humans. Oh no: he is seriously considering how to intervene in nature to reduce predation in all contexts. As in, he is torn up about spiders killing flies.

New solution to Fermi paradox: intelligent aliens become so opposed to suffering and so empathetic with microorganisms that they thoroughly sterilize all life that they encounter. Can't have microscopic mites suffering as they die.

But I had doubts as well. In a 17 Oct. 2005 letter to Peter Singer, I asked:

Finally, what implications would follow if insects did turn out to be sentient? Might it be possible that the net balance of utility of their lives would be negative, considering that many insects live only for a few days or weeks before enduring what I presume is generally a painful death? If I could choose between not existing or experiencing the life of an insect under the assumed circumstances, I might prefer the former. If insect life were actually a great source of disutility, would a utilitarian not then be obligated to support, for instance, destruction of the rainforest, since that would destroy insect habitat and prevent many painful lives?

God damn it, Flannery O'Connor is right:

In this popular pity, we mark our gain in sensibility and our loss in vision. If other ages felt less, they saw more, even though they saw with the blind, prophetical, unsentimental eye of acceptance, which is to say, of faith. In the absence of this faith now, we govern by tenderness. It is a tenderness which, long since cut off from the person of Christ, is wrapped in theory. When tenderness is detached from the source of tenderness, its logical outcome is terror. It ends in forced-labor camps and in the fumes of the gas chamber.

Tomasik's 'tenderness' and squeamishness about things like the spider web in his cellar leads him to - mass extinction of all the animals for which he feels this 'tenderness'. "To save the village, we had to destroy it". To ameliorate the suffering of mindless creatures, we must kill them all and make them extinct so none of their kind ever lives - and thus suffers - again. And this is his notion of compassion.

And if we should kill the mindless, what about those with minds? A greater capacity for suffering and awareness of suffering surely means we are obligated to kill them all - without even a God to sort them out, save the gods that we have set ourselves up as, dealing out judgement as to who lives and who dies.

Of course, Tomasik's 'compassion' and 'tenderness' are ultimately for himself. Did he not feel upset and distressed by the idea of spiders killing and eating flies, he would not entertain the notion of killing all insects. So his real objection is not to their suffering, it is to "their suffering makes me feel bad and I don't like feeling bad, so to make it stop we must kill them all".

(I don't think he really means 'kill them all', this is just him grappling with his scrupulosity, but the easier thing for him and for the insects would be to choke off this over-sensitivity and be less upset about it. That way he doesn't feel so bad, and wild animals don't have to die so he can feel better).

So his real objection is not to their suffering, it is to "their suffering makes me feel bad and I don't like feeling bad, so to make it stop we must kill them all".

But isn't that why everyone wants to prevent suffering? Except most people don't take it as seriously, or don't think about it as much, so it doesn't bother them as much.

I don't see anything particularly illogical in what he's saying. If you are obliged to save a drowning child, that obligation does imply a string of increasingly absurd things. The proper answer is "I am not obliged to save that drowning child unless it has literally no cost to me. If it has almost no cost, then I am only almost obligated."

But this 'solution' is on the level of "in order to prevent drowning children, I will shoot every child I see". It's bonkers.

If the guy was a Jain or something of that nature, I could respect this viewpoint

Why would him being a Jain change anything?

Short answer: if it's based on religious beliefs and practices, then you can factor in the irrationality. It may annoy you or you may be able to tolerate it, but you expect something like this in all religions. And Jainism has applied its principles practically and understands the limitations involved, as well as the costs of actually practising ahimsa.

This isn't religion, even if it's like a secular version of one. This is a philosophy, and fine, a philosophy of life, but really it's just "my superior feeeeeels" at work. There's a lot of dancing around to avoid the fact that it is goddamn stupid to be worrying about dust mites. Everything lives at the expense of something else, and taking his attitude to the absurd extreme, he ought to kill himself in order to prevent any more harm, damage and death caused to microscopic organisms.

The end parable about the slime civilisation and guitars and fires is so dumb, it makes me want to scream. The conclusion he draws is at once flattering to his exquisite sensibility about "even the very least of these" and absolves him of having to actually make those major changes he talks about that humanity would have to make (if we found out that there was an invisible slime civilisation around us).

I'm fairly sure dust mites do not have sentience, sapience, or a sophisticated civilisation. Being so boastful about your tender heart that you admit you let your house become unhygienically infested by insects does not make me want to hang out with you or yours. This philosophy does not seem to stop him doing what he really wants to do, so it involves unnecessary hardship (let your house get over-run by ants while you dither), doing in the end what you should have done in the first place (and crushing individual ants is a lot crueller than putting down poison or bait) and is just a useless generator of worry that goes nowhere and does nothing. If he had real problems in his life, he wouldn't have the time or energy to spare to worry about "there is no forgiveness for killing ants".

No, there isn't, because the universe does not deal in blessings or curses. And it does not care a straw about your masturbation with ethics to get that climax of "ooooh yeah, me so moral, uh uh uh!" while you and your housemates all sit around agonising over how to deal with the ants that are walking over all the kitchen surfaces, in your food, in the bedroom, etc.

This ties into my hot-take theory that veganism is only acceptable if you're not doing it for ethical reasons. (Opposition to factory farming, etc., is, I think, different.)

It's veganism, not vegetarianism, and my point, if any, is that veganism is often promulgated on an ethical basis (let's ignore PETA and the cow-rape people and other extremists for the moment).

So veganism is presented as the more ethical, moral alternative to eating meat, and by association with that, vegans are more moral and ethical.

Then along comes someone like Sam, and for us carnist bloodmouths it is irresistible to say "Well all your superior morals and ethics didn't stop this guy being a thief, conman and swindler, did it?"

I'm not seriously saying veganism is tainted, I do understand that there are bad people in good places, but it is funny. Particularly this time of the year with Thanksgiving and Christmas and the usual fol-de-rol over eating meat coming up.

To quote from "The Ball and the Cross":

"Why, you great fool!” cried MacIan, rising to the top of his tremendous stature, “did you think I would have doubted only for that rap with a sword? I know that noble orders have bad knights, that good knights have bad tempers, that the Church has rough priests and coarse cardinals; I have known it ever since I was born. You fool! you had only to say, 'Yes, it is rather a shame,' and I should have forgotten the affair. But I saw on your mouth the twitch of your infernal sophistry; I knew that something was wrong with you and your cathedrals. Something is wrong; everything is wrong. You are not an angel. That is not a church. It is not the rightful king who has come home.”

AFAIR, the studies linking disgust sensitivity to conservative tendencies didn't replicate. But that's more of a mess than I am willing to get into now.

The Bankman-Fried/FTX Saga just gets better and better. A "why oh why" article in the Wall Street Journal has plums to be plucked out, such as these.

It is just getting started, too. Eventually there will be some sort of indictment and likely a legal battle to get him back to the US; if that succeeds, the actual criminal trial. I think he will be extradited successfully. Given that SBF worked alone and had no co-conspirators like we see with organized crime, there will likely be no plea deal, so we can expect a lengthy trial and sentencing.

He definitely has co-conspirators. His gf and a few of the other top people at FTX. Potentially his parents.

But not people who are higher up. His girlfriend could rat on him, though.

Was specifically replying to this

“Given that SBF worked alone and had no co-conspirators like we see with organized crime,”

That's factually untrue. While it would be weird to make a deal with the number 1 so you can arrest everyone else, it's also untrue that he had no co-conspirators. If he's looking at life with no parole, he could probably make a deal and give up a half dozen or more other people for 30 years.

I feel like I missed something when Sam Trabucco quit, that should have been a sign something was wrong.

It's hard to know. A lot of the articles are pointing out that Bankman-Fried kept tight control, only a few people had access to things, and he was the ultimate guy who knew where the money was coming in and going out. So co-founders, Ellison, etc. have a case about "We believed Sam when he told us it was all kosher". They may end up sounding like fools ("so you are saying you had no idea what was going on with all these billions even though you were technically in charge?") but better to be judged a fool than a knave.

If what I'm reading is right, there is already a lawsuit in preparation suing the people in the ads for "invest your money with FTX". This is gonna be a juicy one, exile!

Honestly, I haven't seen that he kept tight control. I thought they all lived together and had sex together and worked 24/7 together. And the other people aren't lowly fools; they are credentialed folks who would be able to notice things.

I thought they all lived together and had sex together and worked 24/7 together.

Not quite as simplistic as that. There was a luxury penthouse apartment with six bedrooms where several of them, including Bankman-Fried lived, but who was romantically involved with whom isn't clear (was Ellison still his girlfriend or his ex at the time? Unclear).

The bankruptcy filing is very informative, because it digs into the corporate structure and makes it pretty clear that Bankman-Fried had the controlling share of everything and it was he who set up how things worked/didn't work. There are a heap of companies/entities under the umbrella of the FTX group, with head offices/registrations all over the world. There was an 'inner circle' of Bankman-Fried and maybe as few as two others, Wang and Singh, or maybe/maybe not Salame, Ellison, Trabucco were included. It's hard to know who was in the know and who wasn't, but it's pretty clear Bankman-Fried owned the majority of everything, even if others thought it was some kind of communal set-up:

For purposes of managing the Debtors’ affairs, I have identified four groups of businesses, which I refer to as “Silos.” These Silos include: (a) a group composed of Debtor West Realm Shires Inc. and its Debtor and non-Debtor subsidiaries (the “WRS Silo”), which includes the businesses known as “FTX US,” “LedgerX,” “FTX US Derivatives,” “FTX US Capital Markets,” and “Embed Clearing,” among other businesses; (b) a group composed of Debtor Alameda Research LLC and its Debtor subsidiaries (the “Alameda Silo”); (c) a group composed of Debtor Clifton Bay Investments LLC, Debtor Clifton Bay Investments Ltd., Debtor Island Bay Ventures Inc. and Debtor FTX Ventures Ltd. (the “Ventures Silo”); and (d) a group composed of Debtor FTX Trading Ltd. and its Debtor and non-Debtor subsidiaries (the “Dotcom Silo”), including the exchanges doing business as “FTX.com” and similar exchanges in non-U.S. jurisdictions. These Silos together are referred to by me as the “FTX Group.”

Each of the Silos was controlled by Mr. Bankman-Fried. ( To my knowledge, Mr. Bankman-Fried owns (a) directly, approximately 53% of the equity in Debtor West Realm Shires Inc.; (b) indirectly, approximately 75% of the equity in Debtor FTX Trading Ltd.; (c) directly, approximately 90% of the equity in Debtor Alameda Research LLC; and (d) directly, approximately 67% of the equity in Clifton Bay Investments LLC). Minority equity interests in the Silos were held by Zixiao “Gary” Wang and Nishad Singh, the co-founders of the business along with Mr. Bankman-Fried. The WRS Silo and Dotcom Silo also have third party equity investors, including investment funds, endowments, sovereign wealth funds and families. To my knowledge, no single investor other than the co-founders owns more than 2% of the equity of any Silo.

The parent company and primary operating company in the Alameda Silo is Alameda Research LLC, which is organized in the State of Delaware. Before the Petition Date (as defined below), the Alameda Silo operated quantitative trading funds specializing in crypto assets. Strategies included arbitrage, market making, yield farming and trading volatility. The Alameda Silo also offered over-the-counter trading services, and made and managed other debt and equity investments. In short, the Alameda Silo was a “crypto hedge fund” with a diversified business trading and speculating in digital assets and related loans and securities for the account of its owners, Messrs. Bankman-Fried (90%) and Wang (10%).

Related Party Loans Receivable of $4.1 billion at Alameda Research (consolidated) consisted primarily of a loan by Euclid Way Ltd. to Paper Bird Inc. (a Debtor) of $2.3 billion and three loans by Alameda Research Ltd.: one to Mr. Bankman-Fried, of $1 billion; one to Mr. Singh, of $543 million; and one to Ryan Salame, of $55 million.

Mr. Bankman-Fried and Mr. Wang controlled access to digital assets of the main businesses in the FTX Group (with the exception of LedgerX, regulated by the CFTC, and certain other regulated and/or licensed subsidiaries). … the absence of independent governance as between Alameda (owned 90% by Mr. Bankman-Fried and 10% by Mr. Wang) and the Dotcom Silo (in which third parties had invested).

One of the most pervasive failures of the FTX.com business in particular is the absence of lasting records of decision-making. Mr. Bankman-Fried often communicated by using applications that were set to auto-delete after a short period of time, and encouraged employees to do the same.

more like weasel on him, if you get my drift

As the resident apologist of EA, even I find it somewhat hilarious how much spotlight MacAskill has been getting throughout this entire fiasco. He has consistently said from the beginning that he doesn't think he is smart or qualified enough to lead the movement or have as much impact as he does. It seems like he was right.

That being said, I wonder if this could be a sort of “no press is bad press” moment for Effective Altruism. As someone who has been involved for years, the amount of buzz I see around EA, both online and in my personal life, has been incredible. Between the media push for MacAskill's book and now this juicy scandal, I wouldn't be surprised if the number of people who know what EA is has doubled in just the last six months.

Time will tell if this is a good thing for EA or not, but speaking from my personal experience, the main issue in talking to people about it was apathy. At least now the space will be a bit more entertaining and will potentially show normies that there can be drama and fun to be had even in doing good. Maybe we'll even pull in some rdrama folks, who knows.

I am sorta sorry for MacAskill, and in good conscience who could have forecast something like this coming down the pike, but hooo baby. It is so funny for an inveterate "yeah yeah X-risk this, AI-risk that my royal Irish backside" sceptic like myself to see the high-minded big brains who were working on Saving The World getting a 2x4 to the side of the head like this. Yes, you were so busy thinking about "what disaster are we not preparing for that could threaten the future of humanity?" that you didn't notice the disaster unfolding under your very own nose.

I can't help thinking "Were there no damn adults in the room? Where are all the conservative old fuddy-duddies in suits who worked in accounts for forty years who would have said 'you can't make money off magic beans', for the love of Pete?" And of course there weren't any, because that's not how fast-paced, modern, cutting-edge, thinking big, really smart people with big philosophies work nowadays in our global connected world wotsit. No place for the conservative or cautious! Move fast, break things! We're not going to colonise the galaxy and live the transhumanist Fully Automated Luxury Gay Space Communism life if we are held back by the red tape and regulations of the bean counters!

Look at that ad mentioned in the lawsuit:

“I’m never wrong about this stuff, never,” said a dismissive and scoffing Larry David earlier this year in that now infamous Super Bowl ad for investing in cryptocurrency exchange FTX.

While the Seinfeld co-creator rejected the wheel, coffee, the U.S. Constitution, electricity, putting a man on the moon and more innovations in the much praised commercial, looks like David might have been right about the now collapsed FTX, for all the good it’s going to do him.

The guy who is sceptical about the new crypto get-rich-quick scheme is the model of an unreasonable conservative old-fashioned type; he doesn't believe in the new currency, but neither does he believe in any other innovations of the past. He is, dare I say it, on the wrong side of history. And yet - look how that turned out, as the article points out. What was meant as poking mild fun at any cautious types who might be hesitant about this new form of finance turned out to be prescient, though none of them knew it.

It's very mean of me, but us commons like watching the nobles dragged off in tumbrils, you know?

Hah well I appreciate your takes. At least you’re entertaining and light hearted unlike some of the other sharks circling EA right now.

For what it’s worth I’ve also been in the anti AI safety camp of EA for a while now. Quite a few people within EA don’t like the elitism, credentialism, and general galaxy brain takes that the leadership of the movement have succumbed to in recent years. Unfortunately we don’t get frequent interviews or become media darlings. We also don’t have posh British accents so it’s harder to be taken seriously in some circles.

We’ll see if EA can learn the right lessons from this, I’m hopeful that internal reform will be more possible now. Or at least a bifurcation of the salt of the earth EA types away from the navel gazing longtermists.

Or at least a bifurcation of the salt of the earth EA types away from the navel gazing longtermists.

Then what, exactly, is novel about EA once you get past the navel-gazers? I still do not understand what is new or interesting about EA.

Pledging to donate part of your income up front? Well, tithing is a well-known concept, and automatic paycheck withdrawals to your retirement account is a pretty well-established and useful concept.

Having metrics and quality control in regards to charities? It’s debatable when exactly effective altruism cohered as a concept, but critiquing donations to red tape-burdened inefficient charities is certainly not a new concept.

Like…once you strip out the funding for battling paperclip-optimizing super AIs, which still seems a silly concept given we can barely build a good Roomba to vacuum up my dog hair, let alone grey goo that will reshape physical reality…what is EA besides common sense? We don’t need Scottish philosophers to construct an elaborate taxonomy and praxis for automatic savings accounts and spending a solid 5 minutes to ponder that we are in fact quite well-aware of breast cancer by this point, and don’t need a month of NFL games to remind us.

Pledging to donate part of your income up front? Well, tithing is a well-known concept, and automatic paycheck withdrawals to your retirement account is a pretty well-established and useful concept.

Having metrics and quality control in regards to charities? It’s debatable when exactly effective altruism cohered as a concept, but critiquing donations to red tape-burdened inefficient charities is certainly not a new concept.

It seems obvious to me that the new concept here is actually taking these issues seriously. As others have pointed out, no these aren't entirely new concepts EA created out of thin air. The importance of EA comes in that they actually raise awareness and push for more people to understand the importance of these ideas.

'Common sense' as a concept is highly overrated; it essentially means whatever you think is right. If you think all of this is so easy to understand and obvious on its face, then, as @Tarnstellung pointed out, why is so much of charity extremely inefficient and just bad? And do you think that should change?

And yet people still donate millions to breast cancer awareness and similar rubbish. Maybe we need a movement to get people to put a bit more thought into how they donate, to make sure their money is used in the best way possible, to make their... altruism effective?

Maybe we need a movement to get people to put a bit more thought into how they donate, to make sure their money is used in the best way possible, to make their... altruism effective?

Maybe we need a single organization dedicated to charity and peace and love, who can coordinate collection of donations, provide social services, and lead outreach efforts to drum up support. It can even have local, regional, national, and supranational levels, led by a leader selected for wisdom and benevolence from a conclave of regional leaders!

Oh wait, we just reinvented the Catholic church.

The Catholic Church is much more than its social programs. Arguably, the social programs are one of the less important things it does, especially nowadays. The Church also doesn't do any research on cost effectiveness (or do you think ornate cathedrals do more good than malaria nets?).

There is no need for a centralized EA organization. Cost effectiveness research, of the kind performed by GiveWell, isn't that expensive. You could reasonably have multiple groups each doing their own analysis and outreach efforts, and these meta-charities would be separate from those that actually implement the programs. GiveWell doesn't directly help anyone as far as I know, they do the research and direct any extra money to charities they find to be effective.

This may be a bit oversimplified, but a core part of utilitarian ethical calculus revolves around optimizing tradeoffs. EA takes that ethos, and tries to apply it to the real world, identifying places where minimal resources may be expended to reduce the most human suffering--mosquito nets in Africa is one of the central examples. EA looks for efficiency, opportunities for optimization, and benefits that scale cheaply and easily. Very much for better and for worse, EA is a big-picture-focused approach to practical philanthropy. Many other strains of philanthropy are individual-focused; the EA critique of such measures is that they are inefficient and do not scale well.
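
As a toy illustration of that cost-effectiveness calculus, here is a minimal sketch with made-up dollar figures (not GiveWell's actual estimates): rank the options by cost per life saved and notice how much the choice of intervention matters for a fixed budget.

```python
# Made-up cost-per-life-saved figures, for illustration only; real estimates
# (e.g. GiveWell's) differ and change over time.
interventions = {
    "insecticide-treated bednets": 5_000,        # $ per life saved (hypothetical)
    "seasonal malaria chemoprevention": 7_000,   # hypothetical
    "disease 'awareness' campaign": 2_000_000,   # hypothetical
}

budget = 1_000_000  # dollars available to allocate

# The EA-style move: rank by cost-effectiveness, then fund the top of the list.
for name, cost_per_life in sorted(interventions.items(), key=lambda kv: kv[1]):
    lives = budget / cost_per_life
    print(f"{name:35s} ~{lives:8.1f} lives saved per $1M (toy numbers)")
```

The only point of the toy ranking is that a gap of several orders of magnitude in cost-effectiveness swamps almost every other consideration, which is the scaling argument made above.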

EA looked at American politics and decided the cheap, low hanging fruit to easily have large effects with small amounts of money was..........

Giving hard-left organizations millions to make an indie film about racism and millions more to the richest liberal slush fund in the world to write one paragraph.

My half-assed review of OPP can be found in the comments here:

https://old.reddit.com/r/TheMotte/comments/vdwwso/a_critical_review_of_open_philanthropys_bet_on/

It sounds like you are skeptical of the efficacy of magic beans? Yeah, in that case, it looks like just the worst combination of some form of quokka-ness with enough of a veneer of "trust me, I've done the utilitarian math, and it says that Democrats are surprisingly underfunded!" High-minded principles got suckered by a grift (well, to the extent that principles were involved at all beyond providing a fig leaf, which is worth questioning).

That said, as a practical matter, EA looks especially vulnerable to grifting exactly because its focus on big-picture analysis tends to dismiss individual failures. On the principles side, EA is an attempt to speedrun morality, which to me raises more red flags than a Chinese military parade.

EA looked at American politics

The irony here is that EA in general was extremely allergic to politics for over a decade. To the point where there were multiple lectures and serious discussions about avoiding political alignment.

Not everyone in EA wanted SBF to start meddling in politics, in fact I'd argue a majority didn't. And a majority right now certainly agree that political meddling is a bad idea.

EA is not a hivemind. I don't recall GiveWell endorsing politicians, for example.

Honestly, the best thing for EA would be to dump the AI stuff and other galaxy-brain projects, run away from the politics/lobbying stuff, and go back to "doing good better". That's boring and tedious, it's not exciting like "we're trying to save the world from long-term risk of human extinction and colonise the galaxy for a billion years!", but fuck that SF shit. Feed the hungry, clothe the naked, heal the sick. Basic dull, boring stuff that has to be done over and over again when you get up every day, but it has the most effect on real people that you want to help.

(And I'm speaking as someone who loves that SF shit. There are thousands of political lobbying groups, EA doesn't need to be one more fish in the pond. There are already groups worrying about AI risk and pandemic risk and getting hit by the sweet meteor of death risk. Doing efficient, effective work to help the needy is there to be done, are they going to leave it up to religious groups? And if they are, then they don't get to complain about missionaries on church trips to distribute Bibles instead of mosquito nets, because there's a good chance the church mission group is also doing that).

@big-city-gay this is another good explanation of why EA is important even if it doesn't have a large amount of novel theory behind it.

At least now the space will be a bit more entertaining and potentially show normies that there can be drama and fun to be had even in doing good.

No, this is exactly why we’re in this mess in the first place. Mosquito nets and clean water are boring. High-leverage crypto trading is fun and exciting. Living in a penthouse with your buddies is cool and sexy. Drama appeals to the deepest impulses of the primate brain. A world without drama would be very boring indeed. But drama, living in penthouses, and high-leverage crypto trading are not altruism. In case you think that no smart or rational person would ever make such a category error, consider the following primary source:

five years ago it felt like a really serious conflict in my mind that what was maximally good often seemed weird, or sketchy, or tacky, or unpleasant, or just plain aesthetically distasteful

these days it feels like the good is shockingly aligned with the fun, the beautiful, the awesome, the exciting, the sexy and shiny and cool

… I … don’t know what to think or how to feel about that tbh

-Caroline Ellison, October 5, 2021.

Pull up a Bitcoin price chart and find early October 2021. She was making bank in a bull market and confused getting rich for doing good. She is a smart girl, and noticed that something seemed off. As we now know, she did not correctly resolve the problem. This should tell you something about the darkness and corruption inherent in the human soul. Altruism is not natural and must be either rigorously guarded or lost.

Yeah I agree that EA has been leaning a lot more towards mental masturbation and things that are fun to think about like AI safety. That being said you have to bend a bit to the drama if you want to have massive societal impact and change the way everyday people view doing good.

I’m optimistic that even if EA gets watered down in its quest for more adoption, it will at least inject some good ideas/mental frameworks into the mainstream branch of philanthropy.

Congratulations to Effective Altruism. For those of you not understanding why this whole thing isn't good for Effective Altruism: You are now officially a group! Just like Atheism became a group through the infamous fedora/quote maker incident, you too are now an easily identifiable and targetable entity. No one will ever need to engage with what you say.

If anyone cares to ask what the difference is between Effective Altruism and just Altruism: Altruism is when you create a pyramid scam to rob people with money and give the money to powerless people who don't have money. Like Robin Hood. And then you go the way of Robin Hood, either getting killed or jailed for life.

Effective Altruism, on the other hand, is when you create a pyramid scam to rob people with money and give very powerful people with even more money all of the money. That way, unlike Robin Hood, when the jig is up nothing will happen to you, and you can go back to your polycule.

Do you actually have this cynical of a view on humanity? I personally would hate to live with the idea that all altruism and kindness in the world is fake and done for selfish reasons.

I personally would hate to live with the idea that all altruism and kindness in the world is fake and done for selfish reasons.

The Litany of Gendlin:

What is true is already so.

Owning up to it doesn't make it worse.

Not being open about it doesn't make it go away.

And because it's true, it is what is there to be interacted with.

Anything untrue isn't there to be lived.

People can stand what is true,

for they are already enduring it.

What is true is already so

I reject this as an absolute statement for all knowledge. Social constructs have a very different relationship to truth than object-level facts. Social consensus on altruism brings altruism into being and gives it meaning.

What cynical view of humanity? Altruism in effect is just called love. You get it from those who love you and you give it to those you love. Beyond that people are not giving anything they can't afford to give. The same way a gambler isn't really gambling if he is only betting money he can afford to lose.

I think people are shrouding their own nature in these words. 'Altruism' 'kindness'. I don't see it as cynicism to recognize that I am not giving away any of the money I can't afford to give away and that I am no different from anyone who finds themselves under the umbrella of 'Effective Altruism'.

can’t afford to give

This is relative though. I could introduce you to many people who make six figures and “can’t afford to give” any to charity, despite wasting money on lavish lifestyles that don’t even make them happy.

The point of EA, being a good person, altruism in general, or heck even being a good Christian, is to expand the circle of those you love, ideally without expecting love back in return.

I didn't say it wasn't relative. What I am saying is that the emotions you feel are real and you are not going to fake it till you make it. As is demonstrated by people on six-figure salaries choosing to live lifestyles that make them unable to 'expand their circle of love' more effectively.

What I am saying is that the emotions you feel are real and you are not going to fake it till you make it

Are you saying that emotions are unchanging and immutable? I agree that emotions are real, and that 'faking it till you make it' isn't a great strategy, although we probably disagree about what that means.

What I'm arguing is that humans can change the way they think and feel, and I like to think we should change by trying to become better people over time. To me, effective altruism presents a compelling argument as to how to become a better person.

humans can change the way they think and feel

This is proving to be very difficult, to the point that we have billions-of-dollars-per-year industries and millennia-old social phenomena devoted to various methods of doing this (religion, meditation, self-help, therapy, pharmacology, alcohol, narcotics), and we are, by some measures, less happy, more lonely, and less loving than we've ever been.

The realities of 'being a good person' instead of being a selfish person aren't a mystery, and your stated beliefs are not relevant to me next to the lived reality of people who have actually gone out of their way to be a good person, putting in all the hard work and effort, only to find themselves completely burned out in a year or two as the seriousness of the situation becomes clear to them. Examples of 'being good' where I have first-hand accounts of people completely giving up on the situation: missionary and general aid work in Africa, helping the homeless, helping stray animals.

In all of the examples there was a common line that, eventually, the people doing the aid hit a wall. If you have some way of helping people get over the wall, in whatever capacity it is that you imagine these folks becoming 'better persons' I am all ears. Because as far as I can tell, nothing about EA involves a replacement for actual people doing actual physical work on the ground.

I may not expect them to love me, but much of the 'help'/love for others in the Bible is of a spiritual nature rather than practical. There's no requirement to enable others' sloth.

There are many Bible verses that say that there is such a requirement for practical, monetary help. Matthew 19:21, for example:

Jesus said to him, “If you would be perfect, go, sell what you possess and give to the poor, and you will have treasure in heaven; and come, follow me.”

He was speaking to a rich young man. He was telling Zuckerberg to sell up, give to the poor, and go on the road with him.

The direction to support your family/household is much clearer in 1 Timothy 5:8.

The church's help for widows came with conditions: 1 Timothy 5:4 and 5:9–10.

The moral of that story isn't that you shouldn't covet material wealth, but rather you'll get even more of it later if you follow Jesus.

There's a philosophical question whether genuine altruism even exists; e.g., there may be evolutionary advantages to being generous, even to complete strangers. If that's the case, then there are underlying self-serving reasons behind the apparent selflessness. That said, I find the attacks on EA overdone. The media has been unbelievably soft on SBF, often going along with his portrayals of himself as hapless (rather than malicious) and more than happy to spread the blame to the wider EA movement instead of focusing on his personal culpability.

There are plenty of people who spend evolutionarily disadvantageous amounts of time (which is to say, nonzero) thinking and talking about whether killing mosquitoes and tapeworms is ethical or not; it neither increases their own sexual fitness nor indirectly helps genes shared with near relatives propagate. You might say that's a result of evolved altruism gone awry, but it still indicates that altruism is something that has an independent existence from the evolutionary drive.

There's an option you're missing, which is that signaling a higher level of ethics may have, historically, been evolutionarily beneficial actually.

Now, being ethical isn't necessarily required, but thinking about ethics is a prerequisite to signaling ethics.

There's an option you're missing, which is that signaling a higher level of ethics may have, historically, been evolutionarily beneficial actually.

I suppose I can get behind this. My ends are not necessarily aligned to evolution, but I do think altruism is important in a sort of 'coordination is better than constant conflict' sort of way.

In men I thought it was signaling in an attempt to win the mad 'Save the Mosquitos' pussy or anti-GMO or whatever. In women I've no idea.

I’ve heard that line of reasoning before, but just because something is evolutionarily advantageous doesn’t make it lose all meaning. We draw our intelligence from evolution, does that mean human intelligence is pointless or fake?

And yeah, I find it odd as well. I think EA makes people feel bad, especially white-collar journalists, for paying lip service to humble values while spending all of their high five-figure salaries on eating out, etc., when in reality they could save multiple lives a year with that money. It's probably quite nice to be able to turn and say "I told you so! Being a good person is fake and gay!"

SBF doesn't have a very vegetarian/vegan aesthetic. I find they typically have a slimmer, gaunt, unhealthy appearance, not his pudgy unhealthy appearance. Maybe he's a vegan fraud too. Isn't it more likely that it's just fraud all the way down? Is anyone really surprised at another finance-capitalism fraud? Fraud starts to look more like a feature.

At least the industrial capitalists produce widgets that actually exist, mostly.

The 'vegetarian / vegan aesthetic' you mention comes from trying to eat 'healthy' and vegan. There's plenty of vegetarian fast food, and you'll get just as fat if you're force-fed avocado oil as you will with lard!

Yeah, there's a pretty simple answer to how a vegan could be fat: carbs.

The Logical Vegetarian (from "The Flying Inn", Chesterton):

You will find me drinking rum,
Like a sailor in a slum,
You will find me drinking beer like a Bavarian
You will find me drinking gin
In the lowest kind of inn
Because I am a rigid Vegetarian.

So I cleared the inn of wine,
And I tried to climb the sign,
And I tried to hail the constable as “Marion.”
But he said I couldn’t speak,
And he bowled me to the Beak
Because I was a Happy Vegetarian.

I am silent in the Club,
I am silent in the pub.,
I am silent on a bally peak in Darien;
For I stuff away for life
Shoving peas in with a knife,
Because I am a rigid Vegetarian.

No more the milk of cows
Shall pollute my private house
Than the milk of the wild mares of the Barbarian
I will stick to port and sherry,
For they are so very, very,
So very, very, very, Vegetarian.

Oreos and french fries.

The illusion of veganism as 'healthy' persists because a lot of vegans are 'health nuts' compared to the normal person. If the normal person was vegan they could easily supplement their diets with certified vegan High Fructose Corn Syrup and Palm Seed Oil.

Well, chips and chocolate can be vegan... Generally veganism comes with a concern for one's health, but the two are distinct: just not eating animal products doesn't at all guarantee you won't be pudgy.

I am guessing that breads, pasta, and junk food can have plenty of calories while still being vegetarian. The good news for him is that he should have plenty of opportunities to lose weight in prison.

I keep not seeing my perspective brought up in these threads so I guess I'll say what I can't believe isn't obvious. Why should I care at all about any of the people or things connected to this dude who I had not heard of until the collapse? If they had a knowing hand in the scam then sure, but what's next? Are we going to hear about his favorite ice cream shop and their complicity in selling such a monster ice cream? We don't even have two cases of EA adjacent scammers with which to draw a pattern, the only conceivable way I see this being bad for EA is that they're now down a lot of money that they expected.

We don't even have two cases of EA adjacent scammers with which to draw a pattern, the only conceivable way I see this being bad for EA is that they're now down a lot of money that they expected.

We have that Oregon political campaign as well.

The condemnation of EA over this is kinda funny. If SBF showed up to any other charity, person, or organization, including everyone criticizing him, and offered billions of dollars over a decade, I don't think they'd say "uh, what if this large, heavily invested business is secretly committing fraud, though? I demand an invasive audit of your books to make sure, and I won't take the money otherwise." And that goes double for just 'advising SBF'.

Ah ah ah, EA built their entire case around being "not like those other charities, we investigate first and have all these tools to assess what is the bestest bang for the buck".

So they are on the hook for "so why didn't you check out how this guy was promising tons of money out of thin air then, clever-clogs?"

Susie Maye who runs the Save Our Fluffy Friends campaign out of her own house and is active mainly in the fifty mile radius around the town, accepting a hunk of cash from an anonymous donor who says "I love puppies and kitties and bunny rabbits, here's a thousand to help your good work, keep it up!" isn't held to the same standard as "I am an Oxford philosopher who is so important to the world I can't take time out to have kids because that would distract from my Important World-Saving Work, oh here's $30 million for Good Causes I am involved with and you want me to be on the team of your Big Donation Fund that is funded by you making money via magic beans? Don't mind if I do!"

Is it not intuitive that people looking to give money are more concerned with the output than the input? I don't quite understand why it should matter all that much, to the people who aren't going to die of malaria because of the movement, whether the funding came from a perfect, loving, cute grandmother who loves all people equally or a scumbag serial-murdering rapist. Can someone actually articulate the harm of taking the money of bad people to do good things?

The only thing that really concerns me about the effective altruism angle is how poorly a lot of that money got spent.

That is intuitive, I agree. The thing that makes me hesitate to sign on to it, though, is that EA as a movement is actually involved not just in taking people's money but in encouraging people to make that money in the first place. So, if someone was going to be a bad person anyway, then, yes, you might as well take their money. But if you are (perhaps inadvertently) encouraging people to become bad in order to make lots of money that they then give to you, then it gets a bit more complicated. Especially if your movement then goes around loudly praising these (actually bad) people.

I'll admit that's a failure mode, but it's not exactly a unique one. "Be fruitful and multiply" can advocate for some quite nasty practices if taken loosely enough. Earning more to give more is a possible way to summarize a tale as classic as Robin Hood, with the same moral ambiguity. Whether the ends justify the means is something that cuts through the heart of every member of every movement, and I don't think it's fair to declare which side has won from the actions of one member.

Derek Thompson makes a related point, when he says:

I also think the scandal ought to compel EA-aligned institutions to be much more critical of their financiers. This movement, which is so clear-eyed about outputs and outcomes, ought to be equally concerned about inputs and incomes. Where is this movement’s money coming from? Whom is it coming from?

Thus far, EA has not been big on investigating where the money comes from -- only where it goes. That may change. We'll see.

SBF didn’t just show up one day at an EA event with billions of dollars. He was homegrown. He was in effective altruism before he was rich. He literally worked for The Center for Effective Altruism before leaving to start Alameda Research. Caroline Ellison ran a semi-anonymous blog jam packed with rationalist and EA references for years before she even started at Alameda or got into crypto. (Seriously, read it. This is a person who really thought they were maximizing expected utility by running an undercollateralized hedge fund into the ground. Fascinating.) These were true believers in the movement. Effective altruism was central to their identities. The fact that FTX not only failed but caused billions of dollars in collateral damage suggests that something is deeply wrong with the core philosophy.

The Future Fund site is still up and yep, everyone on this page is part of the "taking in one another's washing" circles that EA has become.

To take the example of Ellison, here's the potted bio for her:

Caroline Ellison is the co-CEO of Alameda Research, a cryptocurrency trading firm. Before joining Alameda in 2018, Caroline worked at Jane Street as a trader on the equities desk. She's been involved in the effective altruism community since college and started working in finance with the goal of earning money to donate to good causes.

Gary Wang with the 200 IQ play to never be photographed at FTX. He knew.

The fact that FTX not only failed but caused billions of dollars in collateral damage suggests that something is deeply wrong with the core philosophy.

Maybe there's something deeply wrong with the core philosophy of crypto? FTX is EA's one big scandal - but FTX was a crypto company, and while the rest of EA is not obviously fraudulent, the rest of crypto is covered in frauds! Crypto scams are prevalent; exchanges collapsing and coins rugpulling are a constant; the most popular exchange, Binance, was generally regarded as shadier than FTX and continues to operate with, iirc, unclear legal status. Most of crypto's popularity and price come from people seeing the coin go up and wanting to get rich, then making up weird-sounding 'decentralization' reasons for why it's making them rich. It makes much more sense to blame crypto for FTX than EA.

To be clear, this doesn't prove crypto is bad - it's clearly significantly populated with scammers and groundless speculators, but that doesn't mean the original ideals of ethereum or w/e aren't useful anyway. And in a broader sense - again, frauds happen everywhere; much of accounting and finance exists as regulation to prevent fraud that would otherwise be ever-present, and crypto's lack of that is just revealing why those rules were necessary in the first place: "crypto speedrunning the financial system".

"EA has both fraudsters and people willing to go along with them in it" isn't surprising, it's expected. "Being willing to do morally gray / black things is associated with success in crypto" also isn't surprising, it's expected. EA doesn't need to have any causal impact on the fraud for it to happen.

And - say one believes that HFT is just a legal way of taking other people's money via financial alchemy. Is what Sam did really worse than that? Are SBF's ethics materially worse than those of a fast-food company or 'actively managed fund' CEO, or are they just less well regulated?

To be clear - I'm not arguing Sam isn't bad, and I'm not arguing EA is based, just that people who otherwise dislike EA (which I may also dislike, for very different reasons!) are using this to prove EA BAD in the same way that, like, Iraq or Epstein prove (from different angles) LIBS ARE MORALLY BANKRUPT, despite it not really connecting to their initial reasons for disliking it.

say one believes that HFT is just a legal way of taking other people's money via financial alchemy. Is what Sam did really worse than that?

Not that this undermines the main thrust of your point, but Chris Stucchio put out a series of posts explaining the mechanics and social value of HFT, as well as some of its problems. The gist is that HFT charges people for liquidity, which compensates the market makers for taking on risk, and also helps expose the "true" value of assets that large buyers try to keep hidden. So there's at least a solid argument that it's not just "a legal way of taking other people's money".

Furthermore, it's worth noting that a non-fraudulent FTX would have been providing a similar sort of value to HFT (liquidity), which suggests that SBF actually running a fraudulent FTX is that much worse.
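
For what it's worth, here's a toy sketch of the "charging for liquidity" idea, with entirely made-up numbers - this is just to illustrate the mechanism, not anything taken from Stucchio's posts:

```python
# Toy illustration of the "HFT charges for liquidity" idea. A market maker
# quotes a bid and an ask around its estimate of fair value; impatient
# traders pay the half-spread to transact immediately, and the spread
# compensates the maker for the risk of holding inventory while prices move.
# All numbers are invented for the example.

fair_value = 100.00   # market maker's estimate of the asset's value
half_spread = 0.05    # what takers pay for immediate execution

bid = fair_value - half_spread   # price at which the maker will buy
ask = fair_value + half_spread   # price at which the maker will sell

# If a seller hits the bid and a buyer lifts the ask before fair value moves,
# the maker earns the spread -- the "fee" for providing liquidity.
round_trip_profit = ask - bid    # 0.10 per share

# But if fair value drops to 99.50 while the maker is stuck holding the share
# it just bought at 99.95, it eats the loss -- the risk the spread pays for.
inventory_loss = 99.50 - bid     # -0.45

print(f"bid={bid:.2f} ask={ask:.2f} "
      f"spread={round_trip_profit:.2f} loss_if_value_drops={inventory_loss:.2f}")
```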

The problem here is that price discovery for crypto doesn’t really make sense the way that it does for, say, stocks or bonds. It’s not just a risk asset with a speculative component; the whole thing is speculation.

Maybe there's something deeply wrong with the core philosophy of crypto? FTX is EA's one big scandal

FTX is EA's only big anything, because nothing else EA has done has captured mainstream attention like FTX has.

My sense is that EA does not lack for scandals (CFAR and MIRI are basically walking scandals), they're just not big enough to be interesting, and it feels kind of unsportsmanlike to publicize them, because they're basically all about maladjusted misfits hurting themselves and one another.

Can you clarify what you mean here? I thought MIRI just wrote not-very-exciting math papers and CFAR was just another "self-help seminars for $5,000" program.

Check this shit out (hat tip @DaseindustriesLtd). I don't know the objective truth behind the accusations, but there's more than enough grist for CFAR/MIRI to get absolutely dragged by the mainstream press if they had anywhere near the public prominence or power that FTX had.

Good god, this and the Leverage scandals have made me want to basically entirely disavow rationality as a movement. I knew they were into some weird stuff, but the links I found after going down that rabbit hole, especially this blog by Ziz(?), are just beyond the pale.

Actual specifics I've identified in this story:

  1. A mentally ill person who did psychedelics thinks MIRI is like Leverage and currently has assorted vague mental-illness symptoms similar to the typical TikTok "I'm mentally ill" influencer.

  2. Said mentally ill person attributes their mental problems to MIRI. "I had paranoid fantasies about a MIRI executive assassinating me...."

  3. Multiple people at CFAR had mental health issues. One person took a seminar then did some crimes.

  4. The word "corruption" is used with no specifics.

  5. "I had disagreements...and there was quite a lot of effort to convince me of their position..." :O

  6. Some anti-CFAR protesters were unruly and local cops in a rural area got really excited about trying out their SWAT gear: https://sfist.com/2019/11/19/four-people-in-guy-fawkes-masks-scare-sonoma-retreat/

  7. "I was certainly socially discouraged from revealing things that would harm the “brand” of MIRI and CFAR, by executive people." :O

  8. Leverage leverage leverage, it was like leverage. Did you hear leverage was bad? I want some of the attention that leverage girl got.

The allegations against Leverage are equally non-specific and mostly come down to "I didn't get the social status I was seeking."

https://medium.com/@zoecurzi/my-experience-with-leverage-research-17e96a8e540b

I do agree that some of these folks would be better off attending church than CFAR.


I still haven’t seen or heard anything that can’t be explained by the fact that MIRI is a less-than-110%-woke organization in Berkeley, California. These are exactly the kind of fake “scandals” that I would have expected given such an arrangement.

There are two things here. First, hell yeah, there is something deeply wrong with crypto.

But secondly, also hell yeah, EA needed this kick in the pants. They started off doing concrete good, but that's boring. So the movement drifted off into this kind of pie-in-the-sky theorising (AI risk may or may not Doom Us All, but right now starving, sick and poor people need to be helped even if the Paperclip Maximiser is around the corner), and once it got involved in politics, that was it.

For all the lofty ideals, the dull slog of practical charity in, yes, your local neighbourhoods and beyond remains to be done. EA pivoted to the big leagues where the big money was, money they needed, and so they fell prey to the likes of Bankman-Fried, because they wanted to be the big, exciting, world-changing, shaping-the-future deal. This is part of the fall-out of that philosophy, and the movement has to accept it and address it.

I know it's hard. I know the defensive reaction is natural and inevitable. I know the temptation is there to say "This has nothing to do with us and he wasn't really one of us, anyhow". But like it or lump it, this is now the image people have of EA due to Bankman-Fried and this financial scandal. Maybe going back to feeding the hungry, clothing the naked, and visiting the sick is better than "let's all get jobs in each other's foundations so we can lobby Congress about pandemic prevention and big glamorous events".

The local, "charity begins at home" organisations that are going out with soup kitchens and the like are still operating today, doing the same dull, boring, tedious, repetitive work of hands-on interaction with the poor and needy. Even if it is inefficient by GiveWell metrics. Even if it doesn't have Oxford philosophers working on utilitarian theory to be the official philosophy.

If one feels that local widows and orphans are sufficiently well taken care of, there has been another strategy used to deal with the "starving children in Africa" problem. A local group decides to subsidize sending one of their members to a village in rural sub-Saharan Africa, and their agent is to live in the village and exchange knowledge with the villagers over a period of months or years. The agent reports back things like "the villagers want to dig a new well, and a sturdy pump would improve their lives significantly," so the home group fundraises and delivers payment for the needed pump.

This process is hard work, but note how well it accounts for information management (specific and individualized needs are identified and communicated to the backers) and cash security (cash passes through very few hands, and is assigned on a project-by-project basis, not commingled into a slush fund). More organized charitable infrastructure usually claims to be more efficient, but it's important not to lose the benefits of direct personal accountability and immediate knowledge of hyperlocal needs that almost certainly vary from place to place.

Yeah, the Sequoia article makes it clear he needed EA to get his initial wealth. The reason he became a sensation was arbitraging the bitcoin price premium on Japanese exchanges - and the only way he managed that was by finding another EA who was a Japanese citizen.
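
For anyone unfamiliar with how that arbitrage worked, here's a back-of-the-envelope sketch - the price, premium and exchange rate below are illustrative placeholders, not the actual figures from the Sequoia piece:

```python
# Back-of-the-envelope sketch of the bitcoin "Japan premium" arbitrage.
# Prices, premium and FX rate are assumed for illustration, not the real
# 2017/2018 numbers.

usd_price = 10_000.0    # BTC price on a US exchange, in USD (assumed)
premium = 0.10          # Japanese exchanges paying ~10% more (assumed)
usd_jpy = 110.0         # JPY per USD (assumed)

jpy_price = usd_price * (1 + premium) * usd_jpy   # BTC price in JPY

btc = 1.0
cost_usd = btc * usd_price             # buy 1 BTC in the US
proceeds_jpy = btc * jpy_price         # sell it in Japan
proceeds_usd = proceeds_jpy / usd_jpy  # convert the yen back to dollars

gross_profit = proceeds_usd - cost_usd   # ~$1,000 per BTC before costs
print(f"gross arbitrage profit per BTC: ${gross_profit:,.2f}")

# The arithmetic was never the hard part; moving dollars into Japan and yen
# back out quickly was, which is why having a Japanese counterpart with local
# bank accounts mattered so much.
```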

He donated to multiple media outlets (including some that reported on him, like Vox) and politicians.

It would be interesting if this ends up hurting EA more than it hurts them.

Such a catastrophe for EA in general and MacAskill in particular, possibly unrecoverable. Terrible timing for MacAskill, too, just after he spent a fortune on publicists for WWOTF and did a huge round of interviews. If this had happened in the run-up to the book he could have at least used the launch as damage control.

Unrecoverable? From a funding side, Moskovitz/Open Phil still exist. From a PR perspective, it's not great, but it doesn't seem like "EA is literally Hitler" is the main takeaway, so it will hurt, but I don't think it'll massively slow the inflow of new people.

Unrecoverable just in the sense that this will forever be a part of the narrative around EA, and will be brought up as an easy gotcha by people looking to take potshots. I think the movement will survive, for sure, but it’s a chastening experience. It also wouldn’t surprise me if at least some significant faction of EA attempt to rebrand their branch of the movement in response (even if the underlying ideology doesn’t change). “Oh, we’re Compassionate Altruists not Effective Altruists. We still believe in optimising philanthropy to reduce suffering, but we learned a lot of lessons from the FTX fiasco and have come a long way since then.”

This is getting bad. I have my issues with effective altruism, but I wonder how far the media intends to spread the contagion. I believe that Sam himself is only allowed to blather in an attempt to make his case because they want to get some more juicy material to discredit other spheres and people with. Crypto – obviously, that's in the starter pack. EAs, MacAskill, meh, we'll do without. Veganism, most people who don't eat meat globally act this way for economic reasons. But the very idea of non-kin altruism, perhaps? Or of planning long-term? Worst of all, caution about AI and AI-empowered state-level actors, except the spook-aligned «what if China makes it first or uses our opensource, we gotta give our all to NSA closed model»?

Potentially, this isn't even throwing the baby out with the bathwater; this is throwing a powered toaster into the baby's bath because the baby has shat itself. The media is far too powerful to be allowed to play with these matters so frivolously or maliciously.

To me the biggest issue is the media taking down the idea of definite optimism, to borrow a phrase from Thiel. Now these journalists can sit back in their chairs, scoff and say “I told you so! There’s no point in trying to advance technology, solve problems, or do real things in the world. The truly smart people sit around and spend all of their time playing online status games.”

It’s a shame that you almost have to create a cult-like movement nowadays to get groups of people to have enough optimism and motivation to change anything or try to solve any sort of big problem. Larger, society-wide belief systems have truly failed us and fallen to irony and nihilism.

When the SPAC fad happened, the question was which company taken public via a SPAC would be the first to see its share price implode via fraud. (A: Nikola.) SPACs have valid advantages over IPOs in terms of speed and cost, but they also have lower oversight. The latter was guaranteed to eventually attract someone like Trevor Milton.

I think about EA in the same way. In particular, encouraging massive wealth accumulation in the hands of a small few and prioritizing solutions for problems that might (AI) or won’t (the need to leave the solar system before the sun goes nova) threaten people in their lifetime was bound to attract an SBF eventually.

Edit: And since it’s dubious how sincere SBF was, to qualify: "attract" as in entice some fraudster to espouse it.