
Culture War Roundup for the week of November 21, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


The Bankman-Fried/FTX Saga just gets better and better. A "why oh why" article in the Wall Street Journal has plums to be plucked out, such as these.

(And if Will MacAskill wants to repair his reputation, he had better make some moves sharpish, because the media are painting him as Sam's guru who encouraged and indeed enabled him.)

Mr. Bankman-Fried has said his law-professor parents instilled in him an interest in utilitarianism, the philosophy of trying to do the greatest good for the greatest number of people. He said he started putting those ideals into practice while majoring in physics at MIT. Concerned with the suffering of animals on factory farms, he said, he stopped eating meat.

Will MacAskill, then a philosophy graduate student, pitched Mr. Bankman-Fried on the idea of effective altruism, a way of applying some utilitarian ideas to charitable giving.

...Mr. Bankman-Fried had considered different career paths, he said in the “80,000 Hours” interview, but Mr. MacAskill suggested he could do the most good by making a lot of money and giving it away, a popular idea in the community.

Yeah, does anyone think that someone who doesn't know the first thing about EA or any of the people here is going to come away from reading this with a good view of all concerned? Personally, I'm very amused that veganism has been dragged into this: "guy who swindled billions is against meat eating" 🤣 So let's count 'em up: that's utilitarianism, Effective Altruism, cryptocurrency, and veganism all tainted by association!

As for MacAskill, it sounds like he was in contact with Bankman-Fried up until quite recently:

The FTX Foundation’s favored causes included pandemic prevention and protecting humanity from the potential downsides of artificial intelligence. At a July meeting of the foundation, Mr. Bankman-Fried became deeply engaged in a discussion on how lightbulbs equipped with a particular frequency of ultraviolet light could eradicate airborne pathogens, Mr. MacAskill told the Journal this summer.

He has distanced himself now, but unfortunately that may be too little, too late:

[Future Fund’s] two largest public grants, of $15 million and $13.9 million, were awarded to effective altruism groups where Mr. MacAskill held roles. Mr. MacAskill, now a professor at Oxford University, wasn’t paid for his involvement in those organizations “other than expenses,” a spokeswoman for one of them said.

...Mr. MacAskill distanced himself from FTX as it was crumbling. In a string of tweets, he accused Mr. Bankman-Fried of personal betrayal and abandoning the principles of effective altruism. He was also one of the Future Fund staffers who quit.

But wait, that isn't the best bit:

Mr. MacAskill at times advised Mr. Bankman-Fried on more than just philanthropic matters. When Elon Musk started his campaign to buy Twitter, Mr. MacAskill sent the Tesla chief executive a text message, according to documents made public in the litigation over his purchase of the social-media firm. “My collaborator Sam Bankman-Fried has for a while been potentially interested in purchasing it and then making it better for the world,” he wrote.

Oh yes. Just imagine it. Instead of Musk buying Twitter, it could have been Bankman-Fried. If people are getting het up about Twitter potentially collapsing, what would they think if Twitter were caught up in the undertow of the FTX collapse? 😈

I keep not seeing my perspective brought up in these threads, so I guess I'll say what I can't believe isn't obvious. Why should I care at all about any of the people or things connected to this dude, whom I had not heard of until the collapse? If they had a knowing hand in the scam, then sure, but what's next? Are we going to hear about his favorite ice cream shop and its complicity in selling ice cream to such a monster? We don't even have two cases of EA-adjacent scammers from which to draw a pattern; the only conceivable way I see this being bad for EA is that they're now down a lot of money they were expecting.

The condemnation of EA over this is kinda funny. If SBF had shown up to any other charity, person, or organization, including everyone criticizing him, and offered billions of dollars over a decade, I don't think they'd have said, "uh, what if this large, heavily invested business is secretly committing fraud, though? I demand an invasive audit of your books to make sure, and I won't take the money otherwise." And that goes double for merely 'advising SBF'.

SBF didn’t just show up one day at an EA event with billions of dollars. He was homegrown. He was in effective altruism before he was rich. He literally worked for the Centre for Effective Altruism before leaving to start Alameda Research. Caroline Ellison ran a semi-anonymous blog jam-packed with rationalist and EA references for years before she even started at Alameda or got into crypto. (Seriously, read it. This is a person who really thought they were maximizing expected utility by running an undercollateralized hedge fund into the ground. Fascinating.) These were true believers in the movement. Effective altruism was central to their identities. The fact that FTX not only failed but caused billions of dollars in collateral damage suggests that something is deeply wrong with the core philosophy.

The fact that FTX not only failed but caused billions of dollars in collateral damage suggests that something is deeply wrong with the core philosophy.

Maybe there's something deeply wrong with the core philosophy of crypto? FTX is EA's one big scandal - but FTX was a crypto company, and while the rest of EA is not obviously fraudulent, the rest of crypto is covered in frauds! Crypto scams are prevalent, and exchanges collapsing and coins rug-pulling are a constant - the most popular exchange, Binance, was generally regarded as shadier than FTX and continues to operate with, iirc, unclear legal status. Most of crypto's popularity and price come from people seeing the coin go up and wanting to get rich, then making up weird-sounding 'decentralization' reasons for why it's making them rich. It makes much more sense to blame crypto for FTX than to blame EA.

To be clear, this doesn't prove crypto is bad - it's clearly significantly populated with scammers and groundless speculators, but that doesn't mean the original ideals, or Ethereum, or w/e aren't useful anyway. And in a broader sense - again, frauds happen everywhere; much of accounting and finance exists as regulation to prevent fraud that would otherwise be ever-present, and crypto's lack of that is just revealing why those rules were necessary in the first place: "crypto speedrunning the financial system".

"EA has both fraudsters and people willing to go along with them in it" isn't surprising, it's expected. "Being willing to do morally gray / black things is associated with success in crypto" also isn't surprising, it's expected. EA doesn't need to have any causal impact on the fraud for it to happen.

And - say one believes that HFT is just a legal way of taking other people's money via financial alchemy. Is what Sam did really worse than that? Are SBF's ethics materially worse than those of a fast-food company or 'actively managed fund' CEO, or are they just less well regulated?

To be clear - I'm not arguing Sam isn't bad, and I'm not arguing EA is based, just that people who otherwise dislike EA (which I may also, for very different reasons!) are using this to prove EA BAD in the same way that, like, Iraq or Epstein proves (from different angles) LIBS ARE MORALLY BANKRUPT, despite it not really connecting to their initial reasons for disliking it.

say one believes that HFT is just a legal way of taking other people's money via financial alchemy. Is what Sam did really worse than that?

Not that this undermines the main thrust of your point, but Chris Stucchio put out a series of posts explaining the mechanics and social value of HFT, as well as some of its problems. The gist is that HFT charges people for liquidity, which compensates the firms for taking on risk, and also helps expose the "true" value of assets that large buyers try to keep hidden. So there's at least a solid argument that it's not just "a legal way of taking other people's money".
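Not drawn from Stucchio's posts, but here's a minimal toy sketch (Python, with invented numbers) of the "charging for liquidity" idea: the market maker earns the bid-ask spread in exchange for standing ready to trade immediately, and that spread is the price impatient traders pay for immediacy rather than money conjured by alchemy.

```python
# Toy illustration (hypothetical numbers): a market maker quotes a bid/ask
# around its estimate of fair value and earns the spread as a fee for
# providing immediacy, compensating it for the risk of holding inventory.

fair_value = 100.00     # the maker's estimate of the asset's value (made up)
half_spread = 0.02      # the "liquidity fee" charged to whoever demands immediacy

bid = fair_value - half_spread   # price at which the maker stands ready to buy
ask = fair_value + half_spread   # price at which the maker stands ready to sell

# An impatient seller hits the bid and an impatient buyer lifts the ask:
maker_pnl_per_round_trip = ask - bid   # 0.04 earned for quoting both sides
fee_paid_per_trader = half_spread      # each side paid 0.02 for immediacy

print(f"bid={bid:.2f} ask={ask:.2f} maker earns {maker_pnl_per_round_trip:.2f} per round trip")
```

Whether that fee is fair compensation for risk or rent extraction is exactly the dispute, but it's at least a service being priced, not pure alchemy.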

Furthermore, it's worth noting that a non-fraudulent FTX would have been providing a similar sort of value to HFT (liquidity), which suggests that the actual SBF, running a fraudulent FTX, is much worse.

The problem here is that price discovery for crypto doesn’t really make sense the way that it does for say, stocks or bonds. It’s not just a risk asset with a speculative component, the whole thing is speculation.

Maybe there's something deeply wrong with the core philosophy of crypto? FTX is EA's one big scandal

FTX is EA's only big anything, because nothing else EA has done has captured mainstream attention like FTX has.

My sense is that EA does not lack for scandals (CFAR and MIRI are basically walking scandals); they're just not big enough to be interesting, and it feels kind of unsportsmanlike to publicize them, because they're basically all about maladjusted misfits hurting themselves and one another.

Can you clarify what you mean here? I thought MIRI just wrote not-very-exciting math papers and CFAR was just another self-help-seminars-for-$5,000 program.

Check this shit out (hat tip @DaseindustriesLtd). I don't know the objective truth behind the accusations, but there's more than enough grist for CFAR/MIRI to get absolutely dragged by the mainstream press if they had anywhere near the public prominence or power that FTX had.

Good god, this and the Leverage scandals have made me want to basically entirely disavow rationality as a movement. I knew they were into some weird stuff, but the links I found after going down that rabbit hole, especially this blog by Ziz(?), are just beyond the pale.

Actual specifics I've identified in this story:

  1. A mentally ill person who did psychedelics thinks MIRI is like Leverage and currently has assorted vague mental-illness symptoms similar to the typical TikTok "I'm mentally ill" influencer.

  2. Said mentally ill person attributes their mental problems to MIRI. "I had paranoid fantasies about a MIRI executive assassinating me...."

  3. Multiple people at CFAR had mental health issues. One person took a seminar then did some crimes.

  4. The word "corruption" is used with no specifics.

  5. "I had disagreements...and there was quite a lot of effort to convince me of their position..." :O

  6. Some anti-CFAR protesters were unruly and local cops in a rural area got really excited about trying out their SWAT gear: https://sfist.com/2019/11/19/four-people-in-guy-fawkes-masks-scare-sonoma-retreat/

  7. "I was certainly socially discouraged from revealing things that would harm the “brand” of MIRI and CFAR, by executive people." :O

  8. Leverage leverage leverage, it was like leverage. Did you hear leverage was bad? I want some of the attention that leverage girl got.

The allegations against Leverage are equally non-specific and mostly come down to "I didn't get the social status I was seeking."

https://medium.com/@zoecurzi/my-experience-with-leverage-research-17e96a8e540b

I do agree that some of these folks would be better off attending church than CFAR.


I still haven’t seen or heard anything that can’t be explained by the fact that MIRI is a less-than-110%-woke organization in Berkeley, California. These are exactly the kind of fake “scandals” I would have expected given such an arrangement.

There are two things here. First, hell yeah there is something deeply wrong with crypto.

But secondly, also hell yeah, EA needed this kick in the pants. They started off doing concrete good, but that's boring. So the movement drifted off into this kind of pie-in-the-sky theorising (AI risk may or may not Doom Us All, but right now starving, sick, and poor people need to be helped even if the Paperclip Maximiser is around the corner), and once it got involved in politics, that was it.

For all the lofty ideals, the dull slog of practical charity in, yes, your local neighbourhoods and beyond remains to be done. EA pivoted to the big leagues, where the big money they needed was, and so they fell prey to the likes of Bankman-Fried, because they wanted to be the big, exciting, world-changing, shaping-the-future deal. This is part of the fall-out of that philosophy, and the movement has to accept it and address it.

I know it's hard. I know the defensive reaction is natural and inevitable. I know the temptation is there to say "This has nothing to do with us and he wasn't really one of us, anyhow". But like it or lump it, this is now the image people have of EA due to Bankman-Fried and this financial scandal. Maybe going back to feeding the hungry, clothing the naked, and visiting the sick is better than "let's all get jobs in each other's foundations so we can lobby Congress about pandemic prevention and big glamorous events".

The local, "charity begins at home" organisations that go out with soup kitchens and the like are still operating today, doing the same dull, boring, tedious, repetitive work of hands-on interaction with the poor and needy. Even if it is inefficient by GiveWell metrics. Even if it doesn't have Oxford philosophers working on utilitarian theory as its official philosophy.

If one feels that local widows and orphans are sufficiently well taken care of, there has been another strategy used to deal with the "starving children in Africa" problem. A local group decides to subsidize sending one of their members to a village in rural sub-Saharan Africa, and their agent is to live in the village and exchange knowledge with the villagers over a period of months or years. The agent reports back things like "the villagers want to dig a new well, and a sturdy pump would improve their lives significantly," so the home group fundraises and delivers payment for the needed pump.

This process is hard work, but note how well it accounts for information management (specific and individualized needs are identified and communicated to the backers) and cash security (cash passes through very few hands, and is assigned on a project-by-project basis, not commingled into a slush fund). More organized charitable infrastructure usually claims to be more efficient, but it's important not to lose the benefits of direct personal accountability and immediate knowledge of hyperlocal needs that almost certainly vary from place to place.