
Culture War Roundup for the week of July 24, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Since my 'don't trust Science' threads were already toeing the line between 'Pepe Silvia!' and schizophrenic (fair!) (I didn't even touch the four-part follow-up), Nate Silver summarizes it better than I can:

Here’s the scandal. In March 2020, a group of scientists — in particular, Kristian G. Andersen of The Scripps Research Institute, Andrew Rambaut of The University of Edinburgh, Edward C. Holmes of the University of Sydney, and Robert F. Garry of Tulane University — published a paper in Nature Medicine that seemingly contradicted their true beliefs about COVID’s origins and which they knew to be misleading. The paper, “The proximal origin of SARS-CoV-2”, has been cited more than 5,900 times and was enormously influential in shaping the debate about the origins of COVID-19.

We know this because of a series of leaked and FOIAed emails and Slack messages that have been reported on by Public, Racket News, The Intercept and The Nation along with other small, independent media outlets. You can find a detailed summary of the claims and a copy of the emails and messages here at Public. There’s also good context around the messages here (very detailed) or here and here (more high-level).

((Silver's links carry the touchstones of conspiracy paranoia, like an emphasis on coverups and literally-by-the-minute analysis of claimed coordinated action, which would normally discourage me from pointing to them, except they also happen to be reasonable factual descriptions.))

To be clear, this isn't a case of some barely-related scientists from nearby offices in slightly-related fields being somewhat more open-minded. These documents demonstrate that each and every author of the paper held some of the exact same concerns about the proposed wet market origin as piles of shitposters and too-online dogs, often pointing to the exact same evidence... privately. In public, they branded opponents who raised these possibilities as conspiracy theorists, for naming options the authors were privately entertaining, or for drawing out a web of connections that actually existed. Jeremy Farrar was sending e-mails giving 50:50 odds on natural (and non-natural, mostly serial passage) origins on the same day he was shopping around early drafts of the paper; while he isn't on the author list, that's its own mess. To be fair, they did change positions in private, as new information came in and as debate unfolded. But they remained far less convinced than they pretended to be in public, not just at publication but months later, and it's exceptionally clear that political and pragmatic ramifications drove that.

Nor was the paper filled with caveats, or used or intended to be used merely as a small opinion piece. It contains a few limited cautions about the available data's ability to discriminate between evolution at the wet market and cryptic adaptation among humans, but serial passage was actively dismissed with an incoherent mush that jumps from animal models to purely in vitro considerations. The paper's authors and 'unrelated' academics (who had been heavily involved in discussions with the paper's authors behind closed doors) cited this not-a-paper at length to justify treating anyone even considering the possibility of mere serial passage or an accidental lab leak as a conspiracy theorist who must be shut down, all the way from casual shitposters to federal politicians, including those who advocated specifically serial passage or a purely transport-focused accident. These private messages make clear that this wasn't some unintentional side effect, but a specific goal, if not the specific goal.

Nor was this limited to the broadest strokes: at best, these otherwise closely-knit scientists mentioned to each other important information not widely available to random shitposters, such as the rarity of live pangolin trafficking, the animal makeup of the wet market's official shipments, or a variety of details about possible serial passage techniques, all of which were carefully excluded from the final paper. Some writers received confidential notice of the discovery of RmYN02, and after finding that it wasn't itself any more helpful to their point than other already-known genomes, decided to instead reference it as obliquely as possible to set up a 'prediction', because the Texas Sharpshooter approach would have been too on the nose.

And that's just the stuff that came through FOIA-able emails or broad and leakable Slack channels. The messages show many people involved transitioning to private e-mails, to phone calls, to unrecorded Zoom meetings, often dropping to very clipped wording during that transition: they knew these conversations could eventually become public, and they knew others would not.

None of this amounts, as many COVID skeptics are calling it, to research fraud; I'm not even sure it fits most definitions of academic misconduct. But that's mostly because the publication didn't have enough numbers or analysis to need to actively lie: this paper has no pixels to check for signs of photoshopping, nor specific population numbers to hit with GRIM (the granularity test that checks whether a reported mean is even arithmetically possible given the sample size). Silver has joined calls to retract the paper, but Nature's staff have already said that "Neither previous out-of-context remarks by the authors nor disagreements with the authors’ stated views, are, on their own, grounds for retraction." It ain't happening.
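
For anyone unfamiliar with GRIM, here's a minimal sketch of the idea (the function below is my own illustration, not anything from the paper or from Silver's post): a mean of n integer-valued scores can only take values in increments of 1/n, so a reported mean that no integer total can produce is arithmetically impossible.

```python
def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """GRIM check: could a mean reported to `decimals` places arise
    from n integer-valued observations? Integer data summed over n
    items can only yield means that are multiples of 1/n, so we test
    whether any nearby integer total rounds to the reported value."""
    target = round(reported_mean, decimals)
    nearest_total = round(reported_mean * n)
    return any(
        round(total / n, decimals) == target
        for total in (nearest_total - 1, nearest_total, nearest_total + 1)
    )

# A mean of 3.48 over 25 integer scores is possible (total = 87)...
assert grim_consistent(3.48, 25)
# ...but 3.47 over 25 is not: no integer total rounds to it.
assert not grim_consistent(3.47, 25)
```

The point being: the proximal origin paper offers nothing this mechanical to falsify, which is exactly why 'fraud' is so hard to pin on it.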

Silver proposes that the scientists were motivated by some combination of:

  • Evidence of a lab leak could cause a political backlash — understandably, given that COVID has killed almost 7 million people — resulting in a reduction in funding for gain-of-function research and other virological research. That’s potentially important to the authors or the authors’ bosses — and the authors were very aware of the career implications for how the story would play out;
  • Evidence of a lab leak could upset China and undermine research collaborations;
  • Evidence of a lab leak could provide validation to Trump and Republicans who touted the theory — remember, all of this was taking place during an election year, and medical, epidemiological and public health experts had few reservations about weighing in on political matters.

These aren't exactly the most charitable framings for each possibility, if perhaps more charitable than focusing on Andersen's certainty that this paper got him tenure. But with a more forgiving description, I get something along the lines of:

  • Prohibitions on gain-of-function and other virological research could undermine pandemic responses (and we wouldn't know about past prevented pandemics, after all), or drive research to locations with worse biosecurity or oversight (than BSL2?).
  • Bad relations with China could undermine future pandemic responses or escalate to a 'hot' war.
  • Trump and Republicans responding to China on the strength of marginal scientific research could result in another Korematsu, undermine future pandemic responses, or escalate to a 'hot' war.

Perhaps @Chrisprattalpharaptor can do better. But even if it was these somewhat earnest reasons, rather than business or political tribe, that controlled what these scientists were willing to say publicly, or if there was some more noble cause that they held above providing an accurate model of the world, it's still something other than providing an accurate model of the world. Which was, supposedly, their job.

Worse, few of these matters stop here. Trivially, a lot of academics and casual observers are saying that even if the Nature op-ed authors were playing fast-and-loose with the facts at the time, we have since gotten a ton of evidence in favor of the wet market/natural origin side and very little recently published in favor of serial passage or any intentional manipulation, and normally drawing big charts claiming almost all the experts in a field were conspiring to hide The Truth would be the sorta thing you do shortly before the nice men give you a coat with extra-long sleeves and take you to get some anti-psychotics. Except all of the above.

normally drawing big charts claiming almost all the experts in a field were conspiring to hide The Truth would be the sorta thing you do shortly before the nice men give you a coat with extra-long sleeves and take you to get some anti-psychotics.

Have you considered that this might be true largely because the people who engage in conspiracies have made it so? (There is, of course, a conspiracy theory to this effect).

Oh, absolutely. Speaking Truth to Power is no longer fashionable - it's illegal and demonized. As @official_techsupport mentioned elsewhere, the fashion is now to speak Power to Truth.

How could you say something so controversial, yet so brave?

“Speaking truth to power” became a cultural touchstone precisely because it was always demonized. It is social technology, and like any technology, there is an immense hindsight bias. Of course firms which implemented Stop Work Authority outperformed those which didn’t. Of course armies which lost turned out to be full of yes-men. Winning ideas are oh so obvious in hindsight.

As a corollary, no one* tries to “speak power to truth.” It is unfashionable, and to most Americans, signals something akin to a cult.

Curiously, cults continue to exist. Overbearing workplaces were never eliminated. People kept doing things they knew were ~~wrong~~ untrue, because social incentives are a helluva drug. This was true for every decade which could later be praised as the peak of freedom.

What do you think is so special about this one?


* Yeah, yeah. I’m sure there’s at least one countersignaling contrarian who disagrees. And they’re probably on this board.

As a corollary, no one* tries to “speak power to truth.” It is unfashionable, and to most Americans, signals something akin to a cult.

I think you can make a compelling case that the suppression and treatment of HBD et al as a field constitutes speaking power to truth. HBD theories aren't suppressed because they're wrong - rather, they would not need to be suppressed if they were wrong.

Is it really more likely that academics believe HBD but resolve to fight it anyway? I think genuine disbelief is more parsimonious. That would include feeling motivated not to look too closely.

If you genuinely believe X -- as in, all the evidence you've seen points towards X, you have no inkling that X might be false, you would be willing to bet at strong odds that X is true -- then there may not be any motivation to look any more closely but there certainly isn't any motivation to avoid looking closely -- because what's the worst that can happen? You find more evidence that you're right?

In order to know that you need to be motivated to not look too closely, you need to know that there's at least a significant chance of learning that a thing you want to be true is not true. And that means you already know there's at least a significant chance of this thing not being true. At this point, if you act as if you "believe" X with any confidence then you are merely acting.

Genuine (dis)belief and motivated reasoning do not fit together.

Sure there is. You’ve got a limited amount of time on this earth, and aren’t obligated to spend it debating people you think are trolls or at least cranks. The worst that can happen is you waste your time, feel stupider for having engaged, or encourage your opponents. It’s also possible to tar yourself as an outsider, because tribalism isn’t always (ever?) open-minded.

There is some level of belief where it becomes rational to write off the rest as a rounding error, rather than spend time on it. There is a lesser level at which most people start to do this!

If I think something is 99% likely to be true, and I don’t want to spend time debating heretics, it’s still fair to say that I believe that thing.

You’ve got a limited amount of time on this earth, and aren’t obligated to spend it debating people you think are trolls or at least cranks. [...] There is some level of belief where it becomes rational to write off the rest as a rounding error, rather than spend time on it.

This is justification for not having any motivation to talk to them. It is not motivation to avoid looking more closely at your beliefs.

I believe the earth is round. I could be wrong, but I find it sufficiently unlikely that I'm going to learn anything worthwhile from the average flat earther that I'm not really interested in debating them. However, if I ever find myself wanting to debate them, and also feeling like I need to avoid doing that, then that's a sign that something about what I claim to believe is wrong.

Having watched a couple of actual debates on this topic, I find that the "round earther" often has no idea how to justify their (correct, IMO) beliefs, and, instead of honestly admitting that they are essentially taking people's word for it, tries to pretend that they understand things more than they actually do. That cognitive dissonance doesn't necessarily mean you're wrong on the object of contention, but it's a pretty good bet that you're wrong somewhere (perhaps in how confident and justified you actually are), and this sign marks the trailhead.

feel stupider for having engaged, or encourage your opponents.

These are both signs that your story isn't adding up. Why did you feel tempted to do something stupid? What roped you in?

Why would your opponent leave feeling "encouraged" rather than humiliated? If you actually know the topic so well, and their beliefs are so dumb, shouldn't you be able to address their points so well that they are the ones who leave feeling dumb?

If I think something is 99% likely to be true, and I don’t want to spend time debating heretics, it’s still fair to say that I believe that thing.

That's 99% fair. And that 1% lie can be an acceptable rounding error.

But that 1% lie can also be part of a much bigger lie to avoid having to deal with the fact that the belief is a hell of a lot more than 1% motivated and likely to be false.

Any time you find yourself actively wanting to avoid engagement (and not simply lacking motivation to engage), you're actively up against the part of your belief which isn't genuine. Even if it's only 1%, it's proven that it's not small enough to be irrelevant. And if it's making itself relevant, that's good evidence that it isn't really as small as you might like to believe.