Culture War Roundup for the week of November 7, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Does the Sam Bankman-Fried transformation into Bankrupt Fraud tell us something about the failures of effective altruism?

I saw Bankman mentioned on themotte a number of times over the past two years. I’m pretty sure he was mentioned over on SSC, too. After Scott, he was the person who immediately came to mind when I thought of figures associated with EA. Many normies and finance types will only think of Bankman when EA is brought up. (I refuse to use the “SBF” acronym because it was consciously chosen as an imitation of HSBC and other institutions, and despite his name the man is not a bank.)

I think the reason for EA’s failure to have any effective impact on Bankman’s moral calculus is its complete absence of emotional salience. Traditional moral systems usually try to maximize moral salience. (Stoicism was short-lived and immersed in a Hellenistic culture that emphasized honor through salient stories, and while “mindfulness” is emotionally neutral, traditional Buddhism emphasizes benevolence through stories.)

Consider Christianity. Its stories are designed for emotional salience, using novelty/paradox/shock in key moments to illustrate the moral point. Mankind’s Hero was born in a manger to a lowly family, faced persecution from the very people who claimed moral superiority, took on followers who were poor and irrelevant, and died the painful death of a criminal for the purpose of saving all of humanity. The paradoxes and surprises are meant to enhance the emotional experience, and thus the effect, of the moral point. Within the Gospel narrative, we have parables, also emphasizing salience. There is the wealthy, high-status patrician who looks down on his lower-class, sinful neighbor, and it is the latter, not the former, who is pronounced just. We have metaphors involving specks in the eye, wheat cultivation, farm animals, and storing grain, all of which would be immediately understood by the target audience. The parable form itself can be construed as the most expedient way of expressing a moral point to the largest possible audience.

While Effective Altruism may be logically sound, in the sense that the optimal actions are clearly delineated and argued, it may also not be very effective in obtaining an end result. There is an ocean of difference between a logical assessment of morality and the effectively felt transformation of an individual into a moral actor who follows the moral commandments. To walk over this ocean of difference, or to part its waters, requires a moral system (if not a religion, then something close to it) that is focused on making morality felt. Otherwise, as in the case of Bankman-Fried, our passions and our greeds prevent us from following through on what we ought to do. This conflict between Ought and Will is, of course, explored throughout the New Testament, where the inability to perfectly follow the moral commandments (the law) is solved in the Person of Christ, who makes morality possible to follow through his being born (as a human) and through his friendship (fellowship), which effects the salience necessary to turn the follower moral.

Maybe I haven't followed the story closely enough, but wasn't Sam getting high on his own supply more than betraying his principles? It seems to me that Sam was doing exactly what Utilitarianism/EA would teach: either it agreed with his genetic/existing temperament, or it penetrated his soul so thoroughly that he did what he was taught to do.

-- Utilitarianism teaches risk neutrality. A 51% chance of 100 is better than a 100% chance of 50 (see the expected-value sketch at the end of this comment). Sam took massive risks; they failed, but that doesn't mean they didn't align with EA principles.

-- Utilitarianism teaches that everything is permitted, provided it is justified by numbers that increase overall utility (risk-weighted!); there is no prohibition on lying to your partners and customers. If Sam thought the whole thing would work out just fine for everyone (and there's no reason to think otherwise), then it was within EA principles to lie to others about where their money was going. After all, they'd get their money back and then some!

-- EA (especially its associated x-risk theorists) teaches that making money is to be done so that you can give it away; earning to give is lauded. One doesn't make money to build a long-lasting company; one makes money as fast as one can so that one can support AI alignment research to the greatest extent possible. Sam wasn't trying to build Goldman Sachs or JP Morgan; what would be the point? 100 years from now we'll be paperclips if he doesn't give enough money to Yud, and we'll have fully automated luxury gay space communism if he does.

-- EA does not value ownership rights; if your money could do more good somewhere else it would be positive for it to be taken from you and directed somewhere else.

It seems to me that, if we're going to drag EA into this, we're seeing exactly what it teaches in action, not a betrayal of its principles because they were insufficiently inculcated.
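
To make the risk-neutrality arithmetic in the first bullet explicit, here is a minimal expected-value sketch, assuming the losing branch of the gamble pays nothing and utility is linear in the payoff:

$$ \mathbb{E}[\text{gamble}] = 0.51 \times 100 + 0.49 \times 0 = 51 > 50 = \mathbb{E}[\text{sure thing}] $$

A risk-neutral agent takes the gamble every time; an agent with a concave (risk-averse) utility such as u(x) = √x would decline it, since 0.51 × √100 = 5.1 < √50 ≈ 7.07.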

EA does not value ownership rights; if your money could do more good somewhere else it would be positive for it to be taken from you and directed somewhere else.

I think there's this idea that utilitarianism is all like "sure, go ahead, rob people iff you can use that money better" but that's dumb strawman-utilitarianism.

The reason it's dumb is that you have to take into account the second-order effects of whatever it is you're doing, and those second-order effects for dishonest and coercive actions are nearly always profoundly negative, generally resulting in a society where nobody can trust anyone well enough to coordinate (and where nobody would want to live).

There is a reason why nobody on the EA side is defending Bankman.

that's dumb strawman-utilitarianism.

As we see with live examples, no, it's not. And that's how it works in general. Utilitarians have not noticed the skulls; nobody ever notices the skulls. Putin's Russia is simply fascist, Xi is not a Palladiumpilled Governance Futurist but a two-bit dictator, your enemies simply hate you, there are more malicious conspiracies and frauds than «emergent memetic attractors», and the simplest and crudest and cringiest explanation is usually correct.

The reason it's dumb is because you have to take into account second-order effects in doing whatever it is you're doing, and those second-order effects for dishonest and coercive actions are nearly always profoundly negative

It's not dumb, and this is a fig leaf for useful idiots with their prosocial Hajnal brains. Every time EAs try to explain how they are acktchually rationally compliant with conventional morality, it screams «but in less tortured scenarios we can calculate when to defect and get maximum utility» just as strongly as this yield farming, aka «Ponzi with extra steps», story does. It's from April, as you see; a lot of SBF-glorifying EA-backed content came after that. EAs know they are smart and can trick commoners; they believe themselves to be even smarter than that and able to calculate ripple effects. In fact, calculating ripple effects using raw brainpower is what they do under the rubric of longtermism.

There is a reason why nobody on the EA side is defending Bankman.

The reason is that he has apparently failed them and burned their piggy bank, decreasing the utility they could have produced.

And it's the same reason they did not renounce him before his fraudulent empire crashed. They are not deontologists, after all. They assumed he was a math whiz who had it all figured out.