This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
The Bankman-Fried/FTX Saga just gets better and better. A "why oh why" article in the Wall Street Journal has plums to be plucked out, such as these.
(And if Will MacAskill wants to repair his reputation, he better make some moves sharpish because the media are painting him as Sam's guru who encouraged and indeed enabled him).
Yeah, does anyone think that someone who doesn't know the first thing about EA or any of the people here, when reading this, is going to come away with a good view of all concerned? Personally I'm very amused that veganism has been dragged into this: "guy who swindled billions is against meat eating" 🤣 So let's count 'em up: that's utilitarianism, Effective Altruism, cryptocurrency, and veganism all tainted by association!
As for MacAskill, it sounds like he was in contact with Bankman-Fried up until quite recently:
He has distanced himself now, but unfortunately that may be too little, too late:
But wait, that isn't the best bit:
Oh yes. Just imagine it. Instead of Musk buying Twitter, it could have been Bankman-Fried. If people are getting het-up about Twitter potentially collapsing, what would they think if Twitter was caught up in the undertow of the FTX collapse? 😈
As the resident apologist of EA, even I find it somewhat hilarious how much spotlight MacAskill has been getting throughout this entire fiasco. He has said consistently from the beginning that he doesn't think he is smart or qualified enough to lead the movement or have as much impact as he does. It seems like he was right.
That being said, I wonder if this could be a sort of "no press is bad press" moment for Effective Altruism. As someone who has been involved for years, the amount of buzz I see around EA both online and in my personal life has been incredible. Between the media push for MacAskill's book and now this juicy scandal, I wouldn't be surprised if the number of people who know what EA is has doubled in just the last six months.
Time will tell if this is a good thing for EA or not, but speaking from my personal experience the main issue in talking to people about it was apathy. At least now the space will be a bit more entertaining, and it may show normies that there can be drama and fun to be had even in doing good. Maybe we'll even pull in some rdrama folks, who knows?
I am sorta sorry for MacAskill, and in good conscience who could have forecast something like this coming down the pike, but hooo baby. It is so funny for an inveterate "yeah yeah X-risk this, AI-risk that my royal Irish backside" sceptic like myself to see the high-minded big brains who were working on Saving The World getting a 2x4 to the side of the head like this. Yes, you were so busy thinking about "what disaster are we not preparing for that could threaten the future of humanity?" that you didn't notice the disaster unfolding under your very own nose.
I can't help thinking "Were there no damn adults in the room? Where are all the conservative old fuddy-duddies in suits who worked in accounts for forty years who would have said 'you can't make money off magic beans', for the love of Pete?" And of course there weren't any, because that's not how fast-paced, modern, cutting-edge, thinking big, really smart people with big philosophies work nowadays in our global connected world wotsit. No place for the conservative or cautious! Move fast, break things! We're not going to colonise the galaxy and live the transhumanist Fully Automated Luxury Gay Space Communism life if we are held back by the red tape and regulations of the bean counters!
Look at that ad mentioned in the lawsuit:
The guy who is sceptical about the new crypto get-rich-quick scheme is the model of an unreasonable conservative old-fashioned type; he doesn't believe in the new currency, but neither does he believe in any other innovations of the past. He is, dare I say it, on the wrong side of history. And yet - look how that turned out, as the article points out. What was meant as poking mild fun at any cautious types who might be hesitant about this new form of finance turned out to be prescient, though none of them knew it.
It's very mean of me, but us commons like watching the nobles dragged off in tumbrils, you know?
Hah well I appreciate your takes. At least you’re entertaining and light hearted unlike some of the other sharks circling EA right now.
For what it’s worth I’ve also been in the anti AI safety camp of EA for a while now. Quite a few people within EA don’t like the elitism, credentialism, and general galaxy brain takes that the leadership of the movement have succumbed to in recent years. Unfortunately we don’t get frequent interviews or become media darlings. We also don’t have posh British accents so it’s harder to be taken seriously in some circles.
We’ll see if EA can learn the right lessons from this, I’m hopeful that internal reform will be more possible now. Or at least a bifurcation of the salt of the earth EA types away from the navel gazing longtermists.
Then what, exactly, is novel about EA once you get past the navel-gazers? I still do not understand what is new or interesting about EA.
Pledging to donate part of your income up front? Well, tithing is a well-known concept, and automatic paycheck withdrawals to your retirement account is a pretty well-established and useful concept.
Having metrics and quality control for charities? It's debatable when exactly effective altruism cohered as a concept, but critiquing donations to red tape-burdened, inefficient charities is certainly nothing new.
Like…once you strip out the funding for battling paperclip-optimizing super AIs, which still seems a silly concept given we can barely build a good Roomba to vacuum up my dog hair, let alone grey goo that will reshape physical reality…what is EA besides common sense? We don’t need Scottish philosophers to construct an elaborate taxonomy and praxis for automatic savings accounts and spending a solid 5 minutes to ponder that we are in fact quite well-aware of breast cancer by this point, and don’t need a month of NFL games to remind us.
It seems obvious to me that the new concept here is actually taking these issues seriously. As others have pointed out, no, these aren't entirely new concepts EA created out of thin air. The importance of EA is that it actually raises awareness and pushes for more people to understand the importance of these ideas.
'Common sense' as a concept is highly overrated, it essentially means whatever you think is right. If you think all of this is so easy to understand and obvious on the face, then as @Tarnstellung pointed out, why is so much of charity extremely inefficient and just bad? And do you think that should change?
And yet people still donate millions to breast cancer awareness and similar rubbish. Maybe we need a movement to get people to put a bit more thought into how they donate, to make sure their money is used in the best way possible, to make their... altruism effective?
Maybe we need a single organization dedicated to charity and peace and love, who can coordinate collection of donations, provide social services, and lead outreach efforts to drum up support. It can even have local, regional, national, and supranational levels, led by a leader selected for wisdom and benevolence from a conclave of regional leaders!
Oh wait, we just reinvented the Catholic church.
The Catholic Church is much more than its social programs. Arguably, the social programs are one of the less important things it does, especially nowadays. The Church also doesn't do any research on cost effectiveness (or do you think ornate cathedrals do more good than malaria nets?).
There is no need for a centralized EA organization. Cost effectiveness research, of the kind performed by GiveWell, isn't that expensive. You could reasonably have multiple groups each doing their own analysis and outreach efforts, and these meta-charities would be separate from those that actually implement the programs. GiveWell doesn't directly help anyone as far as I know, they do the research and direct any extra money to charities they find to be effective.
This may be a bit oversimplified, but a core part of utilitarian ethical calculus revolves around optimizing tradeoffs. EA takes that ethos, and tries to apply it to the real world, identifying places where minimal resources may be expended to reduce the most human suffering--mosquito nets in Africa is one of the central examples. EA looks for efficiency, opportunities for optimization, and benefits that scale cheaply and easily. Very much for better and for worse, EA is a big-picture-focused approach to practical philanthropy. Many other strains of philanthropy are individual-focused; the EA critique of such measures is that they are inefficient and do not scale well.
EA looked at American politics and decided the cheap, low hanging fruit to easily have large effects with small amounts of money was..........
Giving hard-left organizations millions to make an indie film about racism and millions more to the richest liberal slush fund in the world to write one paragraph.
My half-assed review of OPP can be found in the comments here:
https://old.reddit.com/r/TheMotte/comments/vdwwso/a_critical_review_of_open_philanthropys_bet_on/
It sounds like you are skeptical of the efficacy of magic beans? Yeah, in that case, it looks like just the worst combination of some form of quokka-ness with enough of a veneer of "trust me, I've done the utilitarian math, and it says that Democrats are surprisingly underfunded!" High-minded principles got suckered by a grift (well, to the extent that principles were involved at all beyond providing a fig leaf, which is worth questioning).
That said, as a practical matter, EA looks especially vulnerable to grifting exactly because its focus on big-picture analysis tends to dismiss individual failures. On the principles side, EA is an attempt to speedrun morality, which to me raises more red flags than a Chinese military parade.
The irony here is that EA in general was extremely allergic to politics for over a decade. To the point where there were multiple lectures and serious discussions about avoiding political alignment.
Not everyone in EA wanted SBF to start meddling in politics, in fact I'd argue a majority didn't. And a majority right now certainly agree that political meddling is a bad idea.
EA is not a hivemind. I don't recall GiveWell endorsing politicians, for example.