Culture War Roundup for the week of December 4, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Scott Alexander has recently argued in favor of Effective Altruism in the wake of the latest scandal, in which effective altruists tried to oust Sam Altman from OpenAI.

His argument starts by focusing on how different factions attack EA from contradictory perspectives: those on the right call them woke, while those on the left call them fascists and white supremacists. The implication seems to be that they will be attacked by all sides no matter what, so we shouldn't take such criticisms seriously. He then focuses mostly on an estimated 200,000 lives saved in the developing world.

My problem with this is that it obscures something that isn't a mystery: EA's politics align much more with the Democratic establishment than with the right, and there isn't any substantial confrontation of what that means.

The biggest donor to Effective Altruism, according to my brief research and claims I found in a 2022 Effective Altruism forum discussion in which he participated, is Asana CEO Dustin Moskovitz.

His company, Asana, contributed $45 million in the 2020 election, and he also made a multi-million dollar contribution to the Future Forward PAC:

https://www.opensecrets.org/2020-presidential-race/joe-biden/contributors?id=N00001669
https://www.opensecrets.org/news/2020/10/pro-biden-super-pac-darkmon/
https://www.cnbc.com/2020/11/02/tech-billionaire-2020-election-donations-final-tally.html
https://bluetent.us/articles/campaigns-elections/dustin-moskovitz-cari-tuna-democratic-donor-2020/

If one looks at Open Philanthropy or the EA forum and searches for controversial cultural issues, there is occasionally some small dissent, but for the most part they follow the liberal party line.

Let's look at Open Philanthropy, an EA organization and Dustin Moskovitz's organization. Scott certainly wants to give EA and Open Philanthropy credit for promoting YIMBY.

However, this organization has also funded decriminalization policies and pro-migration policies:

https://www.openphilanthropy.org/focus/criminal-justice-reform/
https://www.openphilanthropy.org/focus/immigration-policy/

I wonder if the well-funded caravans of migrants we see in some areas of the world have to some extent to do with funding related to EA.

Recently there was a mini EA scandal in which one individual had expressed HBD views in the past; this was made into an issue, and he was condemned by many in the movement, though not entirely unanimously. https://forum.effectivealtruism.org/posts/8zLwD862MRGZTzs8k/a-personal-response-to-nick-bostrom-s-apology-for-an-old

https://forum.effectivealtruism.org/posts/kuqgJDPF6nfscSZsZ/thread-for-discussing-bostrom-s-email-and-apology

Basically, this individual wrote an email 26 years ago that used naughty language to make the point that you should use less offensive language when arguing for race realism.

Then he apologized due to pressure and argued:

What are my actual views? I do think that provocative communication styles have a place—but not like this! I also think that it is deeply unfair that unequal access to education, nutrients, and basic healthcare leads to inequality in social outcomes, including sometimes disparities in skills and cognitive capacity. This is a huge moral travesty that we should not paper over or downplay. Much of my personal charitable giving over the years has gone to fighting exactly this problem: I’ve given many thousands of pounds to organizations including to the SCI Foundation, GiveDirectly, the Black Health Alliance, the Iodine Global Network, BasicNeeds, and the Christian Blind Mission.

Then there is OpenAI and ChatGPT. Effective altruists have been influential in OpenAI, and ChatGPT has a liberal bias. https://www.foxnews.com/media/chatgpt-faces-mounting-accusations-woke-liberal-bias

Another thing to observe is the demographics of effective altruists.

Only 0.9% are right wing and 2.5% center right, with the majority on the left: 40% identify as center left and 32% as left. But that is self-identification, just as Biden could be identified by some as center left and by others, including myself, as far left. They are also 46% vegan, and 85.9% are atheists.

https://rethinkpriorities.org/publications/eas2019-community-demographics-characteristics

I haven't encountered any group with such a small representation of right-wingers that is actually fair when promoting a political agenda toward the right wing, or toward groups more commonly associated with the right. Yet if you search their forum, effective altruists are much more concerned about the lack of racial and ethnic diversity than about ideological diversity.

Climate change and veganism are two issues that could well lead to hardcore authoritarian policies and restrictions. Considering the demographics of EA, and the fact that Peter Singer is an important figure in it and helped coin the term, I do wonder whether EA influence on those issues would push toward imposing such policies on us. In the moral framing of animal liberation activists like Singer we see a moral urgency, and as with all identity movements, elevating one group, such as animals, ends up reducing the position of another group, such as humans, or those who aren't vegans.

The issue is that the networks reinforced through EA might already have the promotion of their political agenda as part of their mission. And these networks, which developed in part through EA and put like-minded ideologues together to organize, can also expand further to promote that agenda outside the EA banner.

It does seem that at least a few of the people involved with effective altruism think that it fell victim to its coastal college demographics. https://www.fromthenew.world/p/what-the-hell-happened-to-effective

My other conclusion, also related to the OpenAI incident, is that these people's idea of themselves as the ones who will put humanity first will lead them to oust others and attempt to grab more power in the future too. And when they do, will they ever relinquish it?

Scott Alexander himself argued that putting humanity first is the priority, and he had some faith that they were thinking rationally when they tried to oust Sam Altman, even though Altman had invited them in. He might not necessarily agree with their action, but he sympathizes with the motive. https://twitter.com/slatestarcodex/status/1726132072031641853#m

That this action was dishonorable matters because, as with Sam Bankman-Fried, it continues the pattern of important ethical issues being pushed aside under the idea that effective altruists know best.

This means that Sam Altman won't be the last. It also means we have a movement very susceptible to the same problems as authoritarian far-left movements in general: extreme self-confidence in their own vision and a will to power. This, in addition to the whole issue of the road to hell being paved with good intentions, inevitably encourages the power-hungry to join as well.

There does seem to be an important side of it that is about people donating in more unobjectionable ways, but in general effective altruism is not separate from a political agenda that fits a political tribe. That agenda should be judged on its own merits, without the 200,000 lives saved in the developing world being accepted as an adequate answer for policies that affect the developed world. The short version of all this is that if you have a problem with leftist or far-left NGOs, you should consider the effective altruism movement and some of its key players to be contributing in the same direction.

One thing that's always bugged me about progressivism, and especially EA, is that despite all their claims of being empathetic and humanistic, they completely ignore the human. They are, ironically, the paperclip maximizers of philanthropy.

The argument is that despite some of the questionable things EA has been caught up in lately, they've saved 200 thousand lives! But did they save good lives? What have they saved, really? More mouths to feed? Doctors and lawyers? Someone who cares about humanity would want to ask these questions. A paperclip maximizer that discounts a person's humanity entirely and just sees each life as a widget to maximize the count of would not.

The purpose of empathy is to be able to put yourself in someone else's shoes, to understand their feelings. Except to do that, you have to have some level of understanding of how they function, some mental model of their mind; otherwise you are simply projecting. It's easy to imagine what you'd feel like if you were in Palestine or Israel, etc., except that isn't empathy. Even just listening to what a person says isn't truly empathy. If I were an alcoholic and said I wanted a drink, to someone with no knowledge of me giving me one might seem a nice thing to do, but clearly it would not be. I'm not sure what it even means to have empathy for someone you don't know. I'm not sure it's possible. What is it really that you are feeling? Do you believe people are all the same, with the same wants? Same needs? Same values? It's such a dim view of people and of the world.

I suppose some people do. "We're all human" is something you'll hear espoused by this ideology, but that is literally the least you can have in common with another person. Trying to apply it to any other human interaction is instantly ridiculous. You wouldn't apply that logic anywhere else in life: you don't hire someone just because they're human, and you don't befriend someone, care about someone, or hate someone for that reason either. It's basically an open admission that you have nothing convincing to say. Even someone forced to compliment their worst enemy would manage to ad-lib something more convincing than "he's human."

Anyone who has had relationships with other humans, so basically everyone, knows how complicated it is to actually know someone. You can have spent years living with a partner and still be completely caught off guard when your mental model goes awry and your attempt at empathy falls completely flat. The idea that some ideological group is more moral or more caring because of the sheer number of lives they've saved discredits and belittles one of the pillars of being human: getting to know each other, socializing, learning friend from foe. It discounts humanity itself by denying that it's even necessary to get to know or understand someone before you can help them. Your wants and needs don't matter; you are a widget; you need x calories and y oxygen to continue existing, and I will supply these needs. Such altruism, wow.

Looking around at social media and world events, I can't help but wonder if this is some major glitch in human psychology in the digital age. Too many strangers, too much opportunity for "selflessness." So many people caught up in an empty and self-serving empathy that has no imagination for others. Meanwhile, people who have normal empathy are dismissed because they aren't as "selfless" as the newer movements: spending time with and focusing on people who share your values isn't altruistic, because if they share your values then you are less selfless than the progressive who cares about the stranger. (Not to mention the Bay Area tech bro who managed to save 0.0345 persons per dollar spent, blowing away the nearest tech bro competitor, who only saved 0.0321.)

This logic seems mad, though. Taken to its extreme, the most altruistic move would be to help someone who shares none of your values, and since altruism is a core value, you should be exclusively helping the least altruistic of people, as that is the most selfless thing you could do. Of course this is obviously ridiculous and self-defeating (like the LGBT groups supporting Hamas).

More cynically, I think this sort of caring is just a way to whitewash your past wrongs. It's PR maximizing: spend x dollars and get the biggest number you can put next to your shady Bay Area tech movement, which is increasingly under society's microscope given the immense power things like social networks and AI give your group. If you really want to help others you need to understand them, and that means spending time with people, not with concepts. If you're lucky, you might eventually find a few people you understand well enough that more often than not your actions are positive and beneficial to them. Congratulations, you have now invented the family and the traditional community.

The argument is that despite some of the questionable things EA has been caught up in lately, they've saved 200 thousand lives! But did they save good lives? What have they saved, really? More mouths to feed?

Yep. Some of those "mouths to feed" might end up becoming doctors and lawyers, but that's not why we saved them, and they would still be worth saving even if they all ended up living ordinary lives as farmers and fishermen and similar.

If you don't think that the lives of ordinary people are worth anything, that needless suffering and death are fine as long as they don't affect you and yours, and that you would not expect any help if the positions were flipped since they would have no moral obligation to help you... well, that's your prerogative. You can have your local community with close internal ties, and that's fine.

More cynically, I think this sort of caring is just a way to whitewash your past wrongs. It's PR maximizing: spend x dollars and get the biggest number you can put next to your shady Bay Area tech movement, which is increasingly under society's microscope given the immense power things like social networks and AI give your group.

I don't think effective altruism is particularly effective PR. Effective PR techniques are pretty well known, and they don't particularly look like "spend your PR budget on a few particular cause areas that aren't even agreed upon to be important and don't substantially help anyone with power or influence".

The funny thing is that PR maximizing would probably make effective altruism more effective than it currently is, but people in the EA community (myself included) are put off by things that look like advertising and don't actually do it.

Yep. Some of those "mouths to feed" might end up becoming doctors and lawyers, but that's not why we saved them, and they would still be worth saving even if they all ended up living ordinary lives as farmers and fishermen and similar.

If you don't think that the lives of ordinary people are worth anything, that needless suffering and death are fine as long as they don't affect you and yours, and that you would not expect any help if the positions were flipped since they would have no moral obligation to help you... well, that's your prerogative. You can have your local community with close internal ties, and that's fine.

And some of them will become rapists and murderers. Maybe they already are. Have you stopped to check? Are they worth saving as well, despite the harm they have done or will do?

Of course I wouldn't expect a stranger to help me; I'm arguing that it isn't really possible, after all. In retrospect, even people who do know and care about me have had some pretty spectacular failures on that front, though I don't blame them as long as they forgive me my own.

Death is necessary. We live in a world with physical limits; without death, the resources eventually run out. Most of life, from the realm of the microscopic to the complex workings of human society, is just the process of determining what is worthy of those limited resources. When the determination is subjective we call it morality or justice, and when it's objective we call it nature.

It seems trivial to me that human lives aren't worth saving per se. It's the content of those lives that matters, and if you don't know the content then you can't prove that you've done anything of value, let alone something "effective." I mean, if you had the choice between saving 1,000 people in a persistent vegetative state or a dozen people you know to be good and functioning, you'd choose the functioning people, right? It's not the lives that matter; it's the person, the content. If you could have more people living by putting everyone in a low-energy state in some kind of feeding pod, where they undergo minimal activity to reduce calorie expenditure and receive just enough calories to keep them alive, is that good because more people are living? It seems cartoonishly evil.

And those are just overly simple demonstrations; in reality the world is more complex than that. Value is a human thing, and though nature occasionally forces our hand, the more advanced we get, the more leeway we have to be subjective. There really isn't even a way to maximize value, because people have different values and therefore competing interests.

That's the problem I have with EA: the whole "we're saving more people than anyone" thing. Stopping needless suffering. Why is their suffering needless? Suffering can be important; it teaches us things. It leads to improvement. When you are saving them, what are you saving? Do you know any of them? It's so surface-level, such a philosophically empty, paperclip-maximizing kind of ethos.

I do agree that it hasn't been very effective PR for the tech bros so far. I think it worked better for progressives (though people are growing resistant to it), and EA seems to be a Silicon Valley version that has made the whole process too efficient and made its contradictions too apparent. It feels too inhuman for most.

And some of them will become rapists and murderers. Maybe they already are. Have you stopped to check? Are they worth saving as well, despite the harm they have done or will do?

This is a retarded standard that nobody who has to work with more than a handful of people at a time holds. Do you think doctors look up new arrivals to the ER to ascertain whether they're accidentally treating murderers and rapists?

It's the net impact that matters, and unless you're exclusively attempting to save the denizens of a prison, or maybe Hamas, you will find almost no population where they predominate, such that by saving the entire lot you've done something worse.

I mean, if you had the choice between saving 1,000 people in a persistent vegetative state or a dozen people you know to be good and functioning, you'd choose the functioning people, right? It's not the lives that matter; it's the person, the content. If you could have more people living by putting everyone in a low-energy state in some kind of feeding pod, where they undergo minimal activity to reduce calorie expenditure and receive just enough calories to keep them alive, is that good because more people are living? It seems cartoonishly evil.

Great. An accusation that, of all people in the world, EAs don't know the concepts of disability-adjusted life years (DALYs) and quality-adjusted life years (QALYs).
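To make that concrete, here is a minimal sketch of QALY-style arithmetic applied to the hypothetical above. The function and the weights are made-up illustrations for this comment, not figures used by any EA or health organization:

```python
# Illustrative QALY arithmetic. The quality weights and years gained below are
# assumed for the sake of the hypothetical, not any organization's actual figures.

def qalys(people: int, years_gained: float, quality_weight: float) -> float:
    """Total quality-adjusted life years for a group of identical beneficiaries."""
    return people * years_gained * quality_weight

# The hypothetical from the comment above:
vegetative = qalys(people=1000, years_gained=10, quality_weight=0.01)  # near-zero quality of life
functioning = qalys(people=12, years_gained=50, quality_weight=0.9)    # healthy, functioning adults

print(f"1000 vegetative patients: {vegetative:.0f} QALYs")  # 100 QALYs
print(f"12 functioning people:    {functioning:.0f} QALYs")  # 540 QALYs
```

Under any weighting like this, the dozen functioning people come out ahead, which is the point: the framework already discounts mere biological survival rather than treating every life as an identical widget.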

That's the problem I have with EA: the whole "we're saving more people than anyone" thing. Stopping needless suffering. Why is their suffering needless? Suffering can be important; it teaches us things. It leads to improvement. When you are saving them, what are you saving? Do you know any of them? It's so surface-level, such a philosophically empty, paperclip-maximizing kind of ethos.

I will go with the "good things are good, and bad things are bad, actually" over this galaxy-brained advocacy for letting people starve to death or die of malaria.

I'm sure those are all laudable, character-building exercises.

I'm not an EA; I just think that, of all the people I strongly disagree with, they're doing what they believe to be the right thing with the right amount of rigor, as opposed to nothing but vibes.

I will go with the "good things are good, and bad things are bad, actually" over this galaxy-brained advocacy for letting people starve to death or die of malaria.

Which is a lazy, dismissive assumption. You have faith that lives are good, or that they are good in aggregate and therefore maximizing them is positive; you don't know that. As far as I can tell, you can't know that.

I'm not arguing against helping people, just that helping people you actually know is better, especially en masse (what if everyone logged off social media and did that?), than industrial philanthropy or whatever.

You are welcome to demonstrate your conviction that lives are terrible and worth terminating on average, as they must be if the aggregate is, but I suspect you can't, for the same odd reason most antinatalists or misanthropes don't start with themselves.

You have faith that lives are good, or that they are good in aggregate and therefore maximizing them is positive; you don't know that. As far as I can tell, you can't know that.

Faith? Why? I can clearly see that most people have lives worth living and extending, at least when it comes to the expenditure of funds I can't repurpose for things I personally care about more. To the extent that governments and charities spend their money on that, I'd prefer they save as many lives as cheaply and effectively as possible, and EAs do that. It would be even better if they handed all the cash to me, but since there's no advocacy group for that, I'll take it.

Go ahead and help whoever you like, if you care to. By the same process by which you don't care about most people, I don't particularly care about you and yours, and thus EA beats you in terms of the net number of people I minimally consider worth existing who get saved. Sure, it sucks that a large number of them are Sub-Saharan Africans with low IQs, I suppose, but that's hardly all of them; there is a non-zero tradeoff for the same with Westerners or any other kind of human, really.

Ah yes, "KYS." Nice to see the Motte's standard of petty insults in as many words as possible is still around.

I mean, it's more that it's quite obvious that "kys" is bad advice for you, so maybe you should examine the reasons why it's bad advice for you and see whether they're also true of a random farmer's kid in Mali.