
Culture War Roundup for the week of July 14, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I do not understand why rationalists love this sentence, as it obviously goes against their main moral philosophy of utilitarianism. Most people - even rationalists - are not against what they consider white lies, either individually (e.g. lying to Kant's axe murderer asking where your wife is) or as society-wide myths (e.g. everybody is equal before the law, every vote counts, etc.). The actual sentence should therefore be something like

That which can be destroyed by the truth should be, except if destroying it would have huge cost in terms of negative utils

Which is basically in line with other moral philosophies as well - most of them like the truth unless it goes against other key values in that system.

I do not understand why rationalists love this sentence, as it obviously goes against their main moral philosophy of utilitarianism.

If you are the Czar and you're the only one who needs to be a utilitarian, sure. If you need there to be lots of utilitarians, then assuming some commonality of interests lies are terrible because they cause people to calculate utility incorrectly. All moral systems are somewhat sensitive to false information, of course, but utilitarianism is particularly and notoriously so.

If you need there to be lots of utilitarians, then assuming some commonality of interests lies are terrible because they cause people to calculate utility incorrectly

Or lies can cause people to calculate utility correctly, especially if they have some sort of bias. Is that not the whole point of rationalist thinking - Overcoming Bias? If a white lie can do that, then it will increase utility and the general good.

Truth and utility are different concepts that are independent of each other - rationalists could say that if they are not exactly orthogonal, they are at least at a steep angle to each other. I am not sure why rationalists cannot understand this argument - are they not supposed to be impersonal calculators? If Yudkowsky calculated that spewing lie after lie for the rest of his life would enable humanity to align the AI, he would 100% do it to usher in his utopia. Would he not?

Or lies can cause people to calculate utility correctly, especially if they have some sort of bias.

I mean, this is basically just saying "sometimes lying results in people believing the truth". And, okay, this is not actually impossible, but it's not very likely, especially in the long run. COVID is the obvious recent example of people trying this shit and it blowing up in their faces.

More generally, you say "I am not sure why rationalists cannot understand this argument". Notice that if you're not sure why somebody doesn't accept something, one of the possible answers is in fact "they understand it just fine, but there's a counterargument that they understand and you don't".

Spoiler alert: there's literally a Yudkowsky article from 2008 about this. And another. Probably others I haven't read or can't recall offhand!

I mean, this is basically just saying "sometimes lying results in people believing the truth".

No. It says that lies can increase utils. Truth and utility are independent concepts, why do you have it confused?

More generally, you say "I am not sure why rationalists cannot understand this argument". Notice that if you're not sure why somebody doesn't accept something, one of the possible answers is in fact "they understand it just fine, but there's a counterargument that they understand and you don't".

Then demonstrate it. The very first sentence of your reply shows that I was right.

why do you have it confused?

I don't.

You said "calculate utility correctly". To calculate utility correctly requires knowing the truth. Different worldstates result in different expected utilities for the same course of action, so a utilitarian with a bad understanding of reality will act suboptimally according to his own utility metric - often wildly so. The obvious example is that genocide looks utile if you think the relevant demographic are all evil.

Hence, "lies can cause people to calculate utility correctly" = "sometimes lying results in people believing the truth".

This is distinct from "lies can be utile", which is broader and covers things like people having different utility metrics and/or people not actually being utilitarians and/or direct, non-choice-based belief effects (e.g. stress). That condition of "if you need there to be lots of utilitarians" is actually relevant to my point, y'know.

So we are back to square one. It seems to me that you are a truthitarian and not a utilitarian, which is fair game. Let's investigate it with my previous example of Kant's axe murderer asking for your wife. Since she can be destroyed by [you telling] the truth about her whereabouts, then she should be destroyed, right? Because telling a lie could hamper your and the murderer's ability to correctly calculate utility in the future, with immense impacts. Or you should tell a lie, because the death of your wife would be more negative utility compared to whatever impact telling a lie has on the correct calculation of utility. What is your answer to the axe murderer? Is the truth the ultimate value that should destroy all and everything in its path? Or is it subordinate to other values, such as your best estimation of utility in a given moment?

This is distinct from "lies can be utile", which is broader and covers things like people having different utility metrics and/or people not actually being utilitarians and/or direct, non-choice-based belief effects (e.g. stress). That condition of "if you need there to be lots of utilitarians" is actually relevant to my point, y'know.

That is why I amended Sagan's mantra:

That which can be destroyed by the truth should be - except if it clashes with some other higher value (e.g. causing negative utility), in that case you should not destroy that thing by the truth.

That which can be destroyed by the truth should be, except if destroying it would have huge cost in terms of negative utils

This seems functionally identical to "I will exercise complete arbitrary freedom to pick and choose to destroy something that a truth would destroy while also feeling morally virtuous along the way." Human bias being what it is, if you dislike any outcome for any reason, any good-faith honest calculation of utils of that outcome will certainly come out negative, and sufficiently so to meet whatever bar it needs to to justify not getting that outcome.

The shorter quotation is going to be wrong sometimes, but that's expected of any simplistic pithy line that tries to describe huge, overarching principles in ethics. I think it's more useful than this longer one which makes no concessions or commitments at all to any principles beyond one's own whims and preferences.

Human bias being what it is, if you dislike any outcome for any reason, any good-faith honest calculation of utils of that outcome will certainly come out negative, and sufficiently so to meet whatever bar it needs to to justify not getting that outcome.

Exactly, I could not have said it better. Despite their posturing, they weaponize their dogmas - such as this Sagan quip - to destroy what they do not like, while selectively not applying them to things they like, such as polyamory.

I think it's more useful than this longer one which makes no concessions or commitments at all to any principles beyond one's own whims and preferences.

Yeah, it may be a useful white lie. Which again paradoxically is the exact thing that the sentence rails against.

Exactly, I could not have said it better. Despite their posturing, they weaponize their dogmas - such as this Sagan quip - to destroy what they do not like, while selectively not applying them to things they like, such as polyamory.

Right, and the point of a pithy, simplistic mantra like "That which can be destroyed by the truth should be" is to explicitly condemn such behavior of selectively applying and not applying principles based on whims and preferred outcomes. Which makes it very different from a line that adds something like "except if destroying it would have huge cost in terms of negative utils," which doesn't condemn such behavior and, in fact, is openly supportive of selectively applying principles based on whims and preferred outcomes. That's what makes the shorter line actually different and better.

Yeah, it may be a useful white lie. Which again paradoxically is the exact thing that the sentence rails against.

Statements of "should" and "ought" like this - which lack anything that can be measured to determine the effects of the behaviors that people "should" do - are subjective judgment calls that don't really fit into a "lie," useful, white, or otherwise. Possibly "misguided" or "wrong." But if it is indeed true that this statement is useful, then it certainly doesn't seem like truth would destroy the statement. Why would it?

Right, and the point of a pithy, simplistic mantra like "That which can be destroyed by the truth should be" is to explicitly condemn such behavior of selectively applying and not applying principles based on whims and preferred outcome

But the point is that it is exactly what is expected - utilitarians ought to apply the mantra selectively.

But if it is indeed true that this statement is useful, then it certainly doesn't seem like truth would destroy the statement. Why would it?

Because the sentence is false and thus should be destroyed by its own prescription. Unless you selectively apply it based on your whims and preferences - exactly what you condemned in the paragraph before.

Because the sentence is false and thus should be destroyed by its own prescription. Unless you selectively apply it based on your whims and preferences - exactly what you condemned in the paragraph before.

But it isn't false. Again, you can consider a judgment call like this "wrong" or "misguided," but that's, just, like your opinion, man. If the truth is that people listening to this phrase is useful, then why would the truth destroy it? Why would we want to destroy something that's truthfully useful?

Here's Yudkowsky saying exactly that (but worse and with more words, as is his style). A common rationalist stance is that utilitarianism is what's correct, but deontology is what works for humans.

The irony inherent in that decision tickles me something fierce.

Utilitarianism holds that what is best for the largest group of people is correct (summarizing). What is best for the largest group of people is apparently, deontology. Thus utilitarianism can be explained as the belief that deontology is correct.

Well, Yudkowsky's interested in moral theories for nonhumans as well.

And, once you get detached enough, or more realistically, when deontology doesn't give a clear enough answer, you do get to do some utility calculation anyway. Effective altruists may have their 10% charity rule, but they use utility calculations to decide on the charity. Which can lead to both GiveWell's $/life saved and shrimp welfare, so not exactly perfectly reliable either, but nothing is.

Sure, I agree. Which is exactly my point. Rationalists are a deontological cult of reason with a lot of, let's say, idiosyncrasies. I just noted that they love this Sagan quip and cite it quite often as some kind of mantra. I do not deny its utility for their ideology, but it is still a little bit cringey in many contexts. It is equivalent to some religious believer just writing that Jesus is the way, the truth, and the life randomly in the middle of some argument about healthcare or whatnot - exactly like the OP of this thread felt the need to write the sentence as part of his argument.

Actually, I think it is even worse for rationalists. Religious believers are mostly self-aware to the extent that they do understand it is a religious statement and that nonbelievers or Muslims etc. will disagree. Rationalists can sometimes forget that it is just a mantra with symbolic meaning, and they may take it too literally - as if it were actually a good argument to present in a debate.

If your point was that rationalists are deontological in practice, why did your first post in this thread express confusion as to why rationalists like the pithy phrase expressing this rule, rather than a useless utilitarian tautology? 07mk gave the rationalist answer to why the shorter version is preferable. I do agree that you shouldn't mistake a moral rule for an argument, though. But it's going to be a popular rule in rationalist and ex-rationalist communities, as they do select for people who highly value epistemic rationality.

If your point was that rationalists are deontological in practice, why did your first post in this thread express confusion as to why rationalists like the pithy phrase expressing this rule, not a useless utilitarian tautology?

Because they pretend to be utilitarian, but are in practice quite dogmatic. This Sagan quip is actually a good example of that, because it is a self-defeating paradox. If taken literally, it should destroy itself. It is a very poor choice of deontological rule for a wannabe utilitarian. There are much better rules - e.g. give 10% of your income to charity.

07mk gave the rationalist answer to why prefer the shorter version.

I think 07mk did a pretty good job of showing why rationalists should ditch the whole sentence. He pretends that the shortened version is somehow better because it leaves less room for individual whims and preferences. But he also basically admits that it should not be applied all the time - of course, subject to individual whims and preferences. How is that better? I focused more on the paradox side, but that does not mean 07mk's explanation is satisfying in any way.

We need phrases like that because the last few years have shown that if you leave people any wiggle room, they will lie every chance they get. I hear 'it's just a white lie' all the time now, and there are no limits and no brakes on its runaway use. The thing I found most shocking about the woke establishment is not that they would lie (about corona, discrimination, race), but that they would casually justify it if caught. And MAGA/Trump casually lie even more, and then deny it, so there's not even the attempt at coherence left.

I at least hope that you appreciate the paradox here - that the sentence itself is a white lie or a myth if you will. But it has utility as a mantra preventing people from lying too much either to others or to themselves.

I hear ‘it’s just a white lie’ all the time now, and there are no limits and no brakes on its runaway use.

Exactly. Like some autistic rationalist "telling the truth" about some weapon of mass destruction - if humanity can be destroyed by the truth, then it should be. Right?

People jump on slogans, they want the ten commandments version, not the 2000 page discourse. Under those constraints, they should err on the side of telling the truth no matter what, instead of 'white lies' and 'necessary myths'. I don't think giving them the cliff's notes version counts as a lie. They're welcome to explore the topic in depth if they have the time.

everybody is equal before law, every vote counts

Those aren't lies. The first one is an aspiration: "Everyone should be treated as if they are equal before the law" - it just means there are no special categories of people, like aristocrats, as far as the law is concerned. As for the second one, well, every vote is literally counted.

lying to Kant's axe murderer asking where your wife is

Obviously that is justified, but I don't consider that a white lie.

(wiki definition: A white lie is a harmless or trivial lie, especially one told in order to be polite or to avoid hurting someone's feelings or stopping them from being upset by the truth. A white lie also is considered a lie to be used for greater good (pro-social behavior))

It's not trivial nor polite, nor do I justify it on simplistic utilitarian grounds ("I just lied because my wife shouldn't die/the greater good." That's too weak a justification imo.) Essentially, a state of war or permanent defection exists between this evildoer and you, and acts which would be immoral normally, like murder and lying, are permitted or required in this case.

Like some autistic rationalist "telling the truth" about some weapon of mass destruction

I think some exceptions can be made in the face of nuclear armageddon, at the margins of neurotypical society.

By contrast, our personal lives and our politics are saturated with unnecessary, harmful lies.