
Culture War Roundup for the week of November 21, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


The Bankman-Fried/FTX Saga just gets better and better. A "why oh why" article in the Wall Street Journal has plums to be plucked out, such as these.

(And if Will MacAskill wants to repair his reputation, he'd better make some moves sharpish, because the media are painting him as Sam's guru, the one who encouraged and indeed enabled him.)

Mr. Bankman-Fried has said his law-professor parents instilled in him an interest in utilitarianism, the philosophy of trying to do the greatest good for the greatest number of people. He said he started putting those ideals into practice while majoring in physics at MIT. Concerned with the suffering of animals on factory farms, he said, he stopped eating meat.

Will MacAskill, then a philosophy graduate student, pitched Mr. Bankman-Fried on the idea of effective altruism, a way of applying some utilitarian ideas to charitable giving.

...Mr. Bankman-Fried had considered different career paths, he said in the “80,000 Hours” interview, but Mr. MacAskill suggested he could do the most good by making a lot of money and giving it away, a popular idea in the community.

Yeah, does anyone think that someone who doesn't know the first thing about EA or any of the people here is going to read this and come away with a good view of all concerned? Personally, I'm very amused that veganism has been dragged into this: "guy who swindled billions is against meat eating" 🤣 So let's count 'em up: that's utilitarianism, Effective Altruism, cryptocurrency, and veganism all tainted by association!

As for MacAskill, it sounds like he was in contact with Bankman-Fried up until quite recently:

The FTX Foundation’s favored causes included pandemic prevention and protecting humanity from the potential downsides of artificial intelligence. At a July meeting of the foundation, Mr. Bankman-Fried became deeply engaged in a discussion on how lightbulbs equipped with a particular frequency of ultraviolet light could eradicate airborne pathogens, Mr. MacAskill told the Journal this summer.

He has distanced himself now, but unfortunately that may be too little, too late:

[Future Fund’s] two largest public grants, of $15 million and $13.9 million, were awarded to effective altruism groups where Mr. MacAskill held roles. Mr. MacAskill, now a professor at Oxford University, wasn’t paid for his involvement in those organizations “other than expenses,” a spokeswoman for one of them said.

...Mr. MacAskill distanced himself from FTX as it was crumbling. In a string of tweets, he accused Mr. Bankman-Fried of personal betrayal and abandoning the principles of effective altruism. He was also one of the Future Fund staffers who quit.

But wait, that isn't the best bit:

Mr. MacAskill at times advised Mr. Bankman-Fried on more than just philanthropic matters. When Elon Musk started his campaign to buy Twitter, Mr. MacAskill sent the Tesla chief executive a text message, according to documents made public in the litigation over his purchase of the social-media firm. “My collaborator Sam Bankman-Fried has for a while been potentially interested in purchasing it and then making it better for the world,” he wrote.

Oh yes. Just imagine it. Instead of Musk buying Twitter, it could have been Bankman-Fried. If people are getting het up about Twitter potentially collapsing, what would they think if Twitter were caught up in the undertow of the FTX collapse? 😈

As the resident apologist for EA, even I find it somewhat hilarious how much of the spotlight MacAskill has been getting throughout this entire fiasco. He has said consistently, from the beginning, that he doesn’t think he is smart or qualified enough to lead the movement or to have as much impact as he does. It seems like he was right.

That being said, I wonder if this could be a sort of “no press is bad press” moment for Effective Altruism. As someone who has been involved for years, the amount of buzz I see around EA, both online and in my personal life, has been incredible. Between the media push for MacAskill’s book and now this juicy scandal, I wouldn’t be surprised if the number of people who know what EA is has doubled in just the last six months.

Time will tell whether this is a good thing for EA or not, but speaking from my personal experience, the main issue in talking to people about it was apathy. At least now the space will be a bit more entertaining, and it might even show normies that there can be drama and fun to be had in doing good. Maybe we’ll even pull in some rdrama folks, who knows.

I am sorta sorry for MacAskill (in good conscience, who could have forecast something like this coming down the pike?), but hooo baby. It is so funny for an inveterate "yeah yeah X-risk this, AI-risk that my royal Irish backside" sceptic like myself to see the high-minded big brains who were working on Saving The World getting a 2x4 to the side of the head like this. Yes, you were so busy thinking about "what disaster are we not preparing for that could threaten the future of humanity?" that you didn't notice the disaster unfolding under your very own nose.

I can't help thinking "Were there no damn adults in the room? Where are all the conservative old fuddy-duddies in suits who worked in accounts for forty years who would have said 'you can't make money off magic beans', for the love of Pete?" And of course there weren't any, because that's not how fast-paced, modern, cutting-edge, thinking big, really smart people with big philosophies work nowadays in our global connected world wotsit. No place for the conservative or cautious! Move fast, break things! We're not going to colonise the galaxy and live the transhumanist Fully Automated Luxury Gay Space Communism life if we are held back by the red tape and regulations of the bean counters!

Look at that ad mentioned in the lawsuit:

“I’m never wrong about this stuff, never,” said a dismissive and scoffing Larry David earlier this year in that now infamous Super Bowl ad for investing in cryptocurrency exchange FTX.

While the Seinfeld co-creator rejected the wheel, coffee, the U.S. Constitution, electricity, putting a man on the moon and more innovations in the much praised commercial, looks like David might have been right about the now collapsed FTX, for all the good it’s going to do him.

The guy who is sceptical about the new crypto get-rich-quick scheme is presented as the model of an unreasonable, conservative, old-fashioned type; he doesn't believe in the new currency, but neither does he believe in any other innovations of the past. He is, dare I say it, on the wrong side of history. And yet - look how that turned out, as the article points out. What was meant as poking mild fun at any cautious types who might be hesitant about this new form of finance turned out to be prescient, though none of them knew it.

It's very mean of me, but us commons like watching the nobles dragged off in tumbrils, you know?

Hah, well, I appreciate your takes. At least you’re entertaining and lighthearted, unlike some of the other sharks circling EA right now.

For what it’s worth, I’ve also been in the anti-AI-safety camp of EA for a while now. Quite a few people within EA don’t like the elitism, credentialism, and general galaxy-brain takes that the leadership of the movement have succumbed to in recent years. Unfortunately, we don’t get frequent interviews or become media darlings. We also don’t have posh British accents, so it’s harder to be taken seriously in some circles.

We’ll see if EA can learn the right lessons from this; I’m hopeful that internal reform will be more possible now, or at least a bifurcation of the salt-of-the-earth EA types away from the navel-gazing longtermists.

Honestly, the best thing for EA would be to dump the AI stuff and other galaxy-brain projects, run away from the politics/lobbying stuff, and go back to "doing good better". That's boring and tedious; it's not exciting like "we're trying to save the world from the long-term risk of human extinction and colonise the galaxy for a billion years!", but fuck that SF shit. Feed the hungry, clothe the naked, heal the sick. Basic, dull, boring stuff that has to be done over and over again when you get up every day, but it has the most effect on the real people you want to help.

(And I'm speaking as someone who loves that SF shit. There are thousands of political lobbying groups; EA doesn't need to be one more fish in the pond. There are already groups worrying about AI risk and pandemic risk and getting-hit-by-the-sweet-meteor-of-death risk. Efficient, effective work to help the needy is there to be done - are they going to leave it up to religious groups? And if they are, then they don't get to complain about missionaries on church trips distributing Bibles instead of mosquito nets, because there's a good chance the church mission group is also doing that.)

@big-city-gay this is another good explanation of why EA is important even if it doesn't have a large amount of novel theory behind it.