Culture War Roundup for the week of May 29, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

I considered making this an "inferential distance" post but it's more an idle thought that occurred to me and a bit too big of a question to go in the small questions thread.

That being: are the replication crisis in academia, the Russian military's apparent fecklessness in Ukraine, and GPT hallucinations (along with rationalists' propensity to chase them) all manifestations of the same underlying noumenon?

Without going into details, I had to have a sit-down with one of my subordinates this week about how he had dropped the ball on his portion of a larger project. The kid is clearly smart and clearly trying, but he's also "a kid" fresh out of school and working his first proper "grown-up" job. The fact that he's clearly trying is why I felt the need to ask him "what the hell happened?", and the answer he gave me was essentially that he hadn't wanted to tell me he didn't understand the assignment because he didn't want me to think he was stupid.

This reminded me of some of the conversations that have happened here on theMotte regarding GPT's knowledge and/or lack thereof. A line of thinking I've seen come up multiple times here is something to the effect of: as a GPT user, I don't ever want it to say "I don't know." This strikes me as obviously stupid and ultimately dangerous. The people using GPT don't want to be told "sorry, there are no cases that match your criteria"; they want a list of cases that match their criteria, and the more I think about it, the more I come to believe that this sort of thinking is the root of so many modern pathologies.

For a bit of context, my professional background since graduating college has been in signal processing. Specifically, signal processing in contested environments, i.e. those environments where the signal you are trying to recognize, isolate, and track is actively trying to avoid being tracked, because being tracked is often a prelude to catching a missile to the face. Being able to assess confidence levels and recognize when you may have lost the plot is a critical component of being good at this job, as nothing can be assumed to be what it looks like. If anything, assumption is the mother of all cock-ups. Scott talks about bounded distrust and IMO gets the reality of the situation exactly backwards. It is trust, not distrust, that needs to be kept strictly bounded if you are to achieve anything close to making sense of the world. My best friend is an attorney; we drink and trade war stories from our respective professions, and from what he tells me, the first thing he does after every deposition or discovery is go through every single factual claim, no matter how seemingly minute or irrelevant, and try to establish what can be confirmed, what can't, and what may have been strategically omitted. He just takes it as a given that witnesses are unreliable, that opposing counsel wants to win, and that they may be willing to lie and cheat to do so. These are lawyers we're talking about, after all, absolute shysters and moral degenerates the lot of them ;-). For better or worse, this approach strikes me as obviously correct, and I think the apparent lack of this impulse amongst academics in general and rationalists in particular is why rationalists get memed as quokkas. I don't endorse 0 HP's entire position in that thread, but I do think he has correctly identified some nugget of truth.

So what does any of this have to do with the replication crisis or the War in Ukraine? Think about it. How often does an academic get applauded for publishing a negative result? The simple fact is that in a post-modern setting it is far more important to publish something that is new and novel than it is to publish something that is true. Nobody gets promoted for replicating someone else's experiment or publishing a negative result, and thus the people inclined to do so get weeded out of the institutions. By the same token, I've seen a similar trend in intel reports out of Russia. To put it bluntly, their organic ISR and BDA are apparently terrible bordering on non-existent, and a good portion of this seems to stem from an issue that the US was dealing with back in the early 2010s, i.e. soldiers getting punished for reporting true information. Just as the US State Department didn't want to be told how precarious the situation with ISIL was, the Russian MOD doesn't want to hear that a given battalion is anything other than at full strength and advancing. Ukrainian commanders will do things like confiscate their men's cell phones and put them all in a box in an empty field. When Russian bombers get dispatched to blow up that empty field, the last thing anyone in the chain of command wants to believe is that they just wasted a bunch of expensive ordnance. They want to believe that 500 cell-phone signals going dark equates to 500 Ukrainian soldiers killed. It's an understandable desire, but the thing about contested environments is that the other guy also gets to vote.

In short, something that I think a lot of people here (most notably Scott, Caplan, deBoer, Sailer, Yud, and a lot of other rationalist "thought leaders") have forgotten is that appeals to authority, scientific consensus, and the "sense-making apparatus" are all ultimately hollow. It is the combative elements of science that keep it honest and producing useful knowledge.

As clarification for others, ISR is Intelligence, Surveillance, Reconnaissance and BDA is Bomb Damage Assessment.

Just as the US State Department didn't want to be told how precarious the situation with ISIL was

Afghanistan is another good example - superiors were happy to hear about how they were running over children with MRAPs(!), since that was something they could fix. The huge systemic problems with corruption that threatened the very basis of the campaign, not so much: https://twitter.com/RichardHanania/status/1204178295618621440/photo/1

The huge systemic problems with corruption that threatened the very basis of the campaign, not so much

Superiors didn't want to know because the administration back home didn't want to know. Everyone, regardless of political party, wanted to wave the flag and say "we're bringing democracy and liberation to the people", and so: "maybe the warlords who we're supporting/arming/paying to be our notional allies are functional paedophiles, but it's their Cultural Tradition and let's not rock the boat; meanwhile we're selling the story back home that we're enabling women to be liberated, letting girls get an education, and bringing the benefits of Westernisation to the backwards nation".

Then the withdrawal happened fast, the so-called national government folded like wet cardboard because outside of a couple of the cities it never existed, the systemic corruption meant that there was no independent organisation to stand on its own two feet, and the Taliban rolled back in. And nobody wanted to hear that this was the most likely outcome, because of the time and money spent and because it would contradict the happy, rosy, fake narrative crafted back home.