Culture War Roundup for the week of May 22, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

A sleep-deprived rant that may be missing your argument:

I agree with this meta-post. But, in fairness, I suspect that the biggest intellectual deficit of rationalist culture is precisely this preoccupation with the meta. Half a high school debate club, half a nerd cult of reason, with very little exposure to empirical matters and gratuitous amounts of speculative fiction plus a sprinkling of Talmudic education – this is our foundation. This hothouse allowed nuanced, hypothetical concepts to thrive, concepts which would not have lasted a day in a challenging object-level environment; indeed, some even became apex predators, like «infohazard». Or steelmanning. More terms and tools aren't always good; they can get reified and multiply confusion.

I don't think that arguments which get called out as «weakmanning» typically address a weak point, or a nonrepresentative point, per se. They address, and attack, that which their speaker thinks the other party actually believes, on the basis of reasoning about that party's broader philosophy; specific illustrations are only for convenience. If the attributed belief doesn't, at the moment, match the other party's consensus (however we establish it), that is immaterial. We can protest the bringing of subpar evidence for an assertion about such an essential belief, but it's important to know whether the assertion is even meant to depend on the evidence presented.

How is «weakman» different from «the bailey»? «Steelman» from «sanewashing» or «the motte»? Those are all loaded terms (sanewashing implies the insanity of the main body of belief), and some comments offer a sensible perspective, e.g. @DuplexFields here. But in real scenarios each politically significant group has a wide gamut of opinions (as in, comprehensive interpretations of what the group «is about»), from the most idiotically extremist and unsubstantiated, to fruitlessly anodyne, to plain dumb (with tiny pockets of well-reasoned extremists); it's an exercise in futility to classify whether a particular take X' falls within the normal range of the X-affiliated distribution of takes, or sits somewhat on its fringes (weakman), or lies at the very edge of self-parody, unrecognized as a legitimate variant by the consensus (strawman). (Language allows endless compositionality, so even a crazy rant can be construed as an inapt appeal to common sense.)

The futility goes deeper. Yes, there are ways for bizarre outliers to arise that are basically unrelated to the position's essence, so strawmen do not tell us much about the big tent or the merits of its ideology, and in this vein it could seem meaningful to identify them. But I'd argue it doesn't even matter whether any such caricatures provably exist. So long as the position is well-documented enough, it is appropriate to discuss its implications even if no proponent (again, recognized as such by the group's consensus) is currently willing to bite the bullet. And in fact this method is constantly applied to non-mainstream views.

The obvious example: consider the beeline from HBD to eugenics to racism and fascism; this is not so much a matter of historical association as it's recognition that these object-level views can be used to support illiberal policies; they don't have to, but it can work.

Consider, also, that merely a few months ago, back in the gentle age when plans like «inflating the regulatory burden to obstruct AI research» were considered beyond the pale, Yudkowsky disavowed all violence in pursuit of AI risk reduction, and cranks like me were saying that utilitarianism plus the AI Doom doctrine implies that such disavowals cannot be trusted – that in the end AI Doomers will be willing to embrace totalitarianism. Lo and behold, now he advocates for airstrikes and worse. It wasn't the threat model that changed, nor his philosophy; what changed was the Overton window, and the logic of his doctrine realigned his expressed views accordingly. Could Yudkowsky himself have predicted this?

I don't know. I don't even know if I could have predicted my own reaction to the invasion of Ukraine on February 23rd, 2022. But I was saying then that it probably wouldn't happen, because it was so absurd, and I mocked Western intelligence for playing up all the saber-rattling on the border; and many people who seemed to be on the same page as me then proceeded to enthusiastically support the Z operation, to the point of cheering for unhinged barbarism. It turned out that I was the fringe, not them; that the logic of our stated ideology, in a timeline like this, flows like that – just as predicted by our oh-so-despicable opponents. They were correct to interpret the outspoken outliers as signs of things to come; they were correct to dismiss me when I was saying that those strawmen or weakmen were not representative of the whole. I can steelman Russian Nationalism. I cannot redeem it.

I don't think people who can operate at many layers of abstraction are fools (and people who can't generally can't benefit from appeals to these epistemological categories). They can understand positions fine, and they can see the distribution of voices as well as anyone; pointing out to them that a specific voice is X deviations from the median on the axis of quality or popularity is not a very good use of time. The disagreement is mainly about how a given position, adopted by a given group, works out in reality – what it collapses into.