
Culture War Roundup for the week of July 14, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


The trouble with untruth is that it is hard in advance to know when it will be harmless and when it will lead to disaster.

Myths work okayish even if most people do not believe that they are literally true. Most people who partake in the Star Wars subculture do not believe that there was a historical person named Luke Skywalker in a galaxy far away. They can still dress up as Wookiees, go to conventions, or debate minor points of Jedi philosophy online, but they are much less likely to engage in harmful actions than a subculture which believes its myths are literally true.

The trouble with untruth is that it is hard in advance to know when it will be harmless and when it will lead to disaster.

The same can be said about the truth. In a sense, the sentence itself is highly paradoxical, as it is itself not true, just a rationalist myth - the vast majority of rationalists would prefer lies if they increased utils, since they are utilitarians. This can even be trivially demonstrated by people who refuse to tell white lies and thereby make their own lives unnecessarily harder and other people's lives miserable as well. I am sure that even rationalists can be employed in, say, sales or the service sector and pretend that they are thrilled to serve their customers instead of telling the "truth". The only thing the truth destroys in that case is their job prospects, with no upside.

I think that the sentence is generally more understood to express a preference for true beliefs for oneself and in cooperative settings. "Of course I told the Gestapo where the Jews were hiding, and destroyed them with the truth" is very much not a standard interpretation. Nor is there an imperative to destroy any respect your coworker might have for you by blurting "whenever I see you I fantasize about your tits". Same for consumer service.

Nor is it imperative to rub the truth into the face of an unappreciative audience. A religious person is very likely already aware of the fact that agnostic atheism is a thing. Telling them they are wrong once a day is not helpful.

A better example of a seemingly benign untruth might be homeopathy. Obviously it is bollocks. But the placebo effect is real, and larger if the patient is not aware of the fact that they are getting a placebo. So from a utilitarian perspective, it might seem beneficial to let your community believe some horseshite if it improves their health outcomes, and as long as you consider only direct effects, this might even be true (if you outlaw homeopathic "cures" for cancer and the like).

But the indirect epistemic consequences are devastating. "You know that orthodox medicine is wrong to deny homeopathy, why should you believe them if they claim that vaccines do not cause autism? Or why should you believe some adjacent ivory tower autofellating scientists that climate change is a thing?"

But the indirect epistemic consequences are devastating

The consequences are devastating for what? Some cosmic sense of justice and rightness? As long as the consequences are beneficial for utility, lies are absolutely okay for utilitarians. Are they not? Of course you may argue that a specific lie is detrimental to utility, but then it is not my argument. Go and find some utility-improving lie as an example, and defend destroying that one from a utilitarian standpoint.

Act utilitarianism is not the only kind of utilitarianism there is; there are also rule utilitarianism and two-level utilitarianism. Utilitarians can be against believing false things in the same way that they can be against child rape: while it is certainly possible to conjure hypothetical scenarios where the thing they are against has the better outcome, in practice these situations do not seem to arise.

Go and find some utility-improving lie as an example

Hey, I am not the one who claims that there is such a thing as a false belief which improves utility. You seem to claim that such things exist, so you should come up with examples.

One example comes from Pratchett:

"For example, there was the Raddles' privy. Miss Level had explained carefully to Mr. and Mrs. Raddle several times that it was far too close to the well, and so the drinking water was full of tiny, tiny creatures that were making their children sick. They'd listen very carefully, every time they heard the lecture, and still they'd never move the privy. But Mistress Weatherwax told them it was caused by goblins who were attracted to the smell, and by the time they left that cottage, Mr. Raddle and three of his friends were already digging a new well at the other end of the garden."

There are several possible defenses of Granny Weatherwax's behavior:

  0. Operating on simulacrum level 2 is fine; truth does not matter. Obviously I reject this.

  1. It could be argued that she wanted to transport the true belief that the distance between well and privy was too small (but I do not find that very convincing).
  2. It could be argued that this was the closest thing to the truth the Raddles could grasp. Consider:

Medieval peasant: "Where do you come from?"
Literally-truthtelling alien: "To understand the answer to that question, you first have to understand that your cosmology is all wrong. While you believe that your world is planar, it is actually a sphere, strike that, a roughly sphere-shaped body. You do not fall off from that sphere because there is a force called gravity which pulls you towards the center of that body, even though calling it a force is an oversimplification as in reality it is more accurately described as bent space-time. Gravity is also causing your world to rotate around ..."
Conceptually-truthtelling alien: "We come from the stars."
Literally-truthtelling alien: "We most certainly do not. The surface temperature of stellar bodies is much too high to support life."

I would be rather sympathetic to the second alien here, because while he lies in a very technical sense, he is trying to answer in the most truthful way the peasant will understand.

  3. One might argue that both the Raddles and my peasant are not so much suffering from a false belief as trapped in a whole world-view full of falsehoods. Where normally spreading false beliefs is like salting the fertile earth, replacing one falsehood with another in an endless sea of falsehoods is like dumping salt into the ocean, so the lie is not morally wrong.

However, none of these arguments apply to believing falsehoods yourself or spreading them in your epistemic peer community. The peasant who tries to understand general relativity, fails, and ends up vaguely believing that the aliens come from the stars, but not exactly, is more virtuous than the peasant who just goes "sure, you come from the stars, whatever."

Act utilitarianism is not the only kind of utilitarianism there is; there are also rule utilitarianism and two-level utilitarianism. Utilitarians can be against believing false things in the same way that they can be against child rape: while it is certainly possible to conjure hypothetical scenarios where the thing they are against has the better outcome, in practice these situations do not seem to arise.

Of course, but in the end they still want to increase utils - be it through acts, rules, etc. This does not weaken my argument: however you calculate utils, the sentence is stupid if destroying a lie decreases utils by that metric.

Hey, I am not the one who claims that there is such a thing as a false belief which improves utility. You seem to claim that such things exist, so you should come up with examples.

Sure, I can use a hypothetical. If a utilitarian of any sort - act, rule, or two-level - made a calculation and found that, let's say, believing in Christianity increases utils, then he would be obliged not to destroy it even if he thought Christian belief was based on a lie. Is that not a true statement?

My criticism of your "homeopathy" example was that you actually think homeopathy decreases utility. That is not an argument for anything; you are just affirming that saying what you think is true increases utility. It does not tackle my argument at all.

EDIT: you lost me with Pratchett, aliens, and peasants. Was it supposed to be some long-winded explanation for why you hold truth as an ultimate good instead of utils?