
Culture War Roundup for the week of November 13, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I think the issue is that most people in favor of rent control policies don't understand the economic arguments against them. They have mistaken factual beliefs. They correctly perceive the first-order effect of reducing rent for people covered by such policies and think it is desirable. I think it takes a fairly specific kind of economics education to see prices as outputs of a system rather than as inputs, and to reason from the implications of that.
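The "prices as outputs" point can be made concrete with a toy model. This is a minimal sketch with made-up linear supply and demand curves and invented parameters, not a claim about any real housing market; it only illustrates why capping a price below its equilibrium value produces a shortage that the first-order view misses.

```python
# Stylized linear housing market. All numbers are illustrative assumptions.

def demand(p):
    """Units renters want at rent p: falls as rent rises."""
    return 1000 - 2 * p

def supply(p):
    """Units landlords offer at rent p: rises as rent rises."""
    return 4 * p - 200

# Unconstrained equilibrium: demand(p) == supply(p)
# 1000 - 2p = 4p - 200  ->  p = 200, with 600 units rented.
eq_rent = 200

# Rent control: cap the rent below the equilibrium level.
cap = 150
offered = supply(cap)          # 400 units supplied at the capped rent
wanted = demand(cap)           # 700 units demanded at the capped rent
shortage = wanted - offered    # 300 households find no unit at all

print(f"equilibrium: rent {eq_rent}, units {demand(eq_rent)}")
print(f"under cap {cap}: supplied {offered}, demanded {wanted}, shortage {shortage}")
```

The first-order view sees only that sitting tenants pay 150 instead of 200; treating the price as an output of the system also reveals that fewer units are supplied at the capped price, so some renters are shut out entirely.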

The issue here, to me, is that the obvious follow-up question is: why do they have mistaken factual beliefs? Surely some of them are simply stupid, and others are in situations of forced ignorance, but I doubt that covers more than a tiny fraction. That leaves most people choosing to remain ignorant of the truth, which leads them to false conclusions; but why would they do that? My pet theory isn't a "gut instinct" about what sounds good or bad to oneself, but rather another sort of "gut instinct" about which beliefs earn more praise and less punishment in one's social world. People figure out what to remain ignorant of, so as to steer their own beliefs in a direction that benefits their social standing. (This may look like Bulverism, but since the people in this hypothetical were stipulated to be mistaken, i.e. wrong, from the start, discussing how and why they landed on this wrongness, rather than whether they are wrong, seems appropriate.)

That said, my pet theory might just be equivalent to the original assertion about what "sounds good," since one of the most common ways I can detect someone learning to steer their own beliefs toward social praise and away from punishment is that they modulate what "sounds good" (in an intuitive, ethical sense) to themselves.

I think you overestimate how motivated people are to seek out the truth about something that isn't interesting or important to them, especially if it means getting into long online debates, understanding technical arguments, and possibly reading academic papers. It's not that people are aware of their ignorance and consciously choosing to be ignorant; it's that they were told incorrect things by some source they trust and have rarely been presented with counterevidence or reasoning. Or worse, they have ended up in some epistemic closure that prevents them from considering alternative reasons and evidence. They are ignorant of their ignorance! I think certain kinds of arguments having social status attached to them might make sense as a cause of why people come to form particular beliefs, but I think it is rarely a reason, in the sense of something subjectively experienced.

It's not that people are aware of their ignorance and consciously choosing to be ignorant; it's that they were told incorrect things by some source they trust and have rarely been presented with counterevidence or reasoning. Or worse, they have ended up in some epistemic closure that prevents them from considering alternative reasons and evidence. They are ignorant of their ignorance!

I'd agree with this, but, again, I think this just moves things a step back: why are they ignorant of their own ignorance? Most of it, I think, is that it's really hard to pay attention to your own ignorance; given that difficulty, it's natural to choose not to do the work and simply remain ignorant of it. The cost-benefit calculation doesn't work out compared to the far less costly, and usually more beneficial, strategy of just following the "does this earn me social praise or punishment?" measurement. So they choose (a good choice in most cases, almost certainly) not to put in the hard work needed to confront their own ignorance, and so they remain ignorant of their own ignorance, which lands them at wrong conclusions. But they do land at conclusions somehow; it might be a bit too pithy to call it "gut instinct" or "what feels good," but if so, I don't think it's too pithy by much.

I think certain kinds of arguments having certain social status attached might make sense as a cause for why people come to form particular beliefs but I think it is rarely a reason, in the sense of something subjectively experienced.

I haven't thought of this sort of distinction between "cause" versus "reason," but if you break it up that way, I think what you wrote here makes sense. It's just, I think the "reason" in the sense of something subjectively experienced is one of the least interesting and least impactful factors when it comes to exploring the way people think about things, because the flexibility that people have for using anything they want as a reason to support anything they want is effectively infinite.

I haven't thought of this sort of distinction between "cause" versus "reason," but if you break it up that way, I think what you wrote here makes sense. It's just, I think the "reason" in the sense of something subjectively experienced is one of the least interesting and least impactful factors when it comes to exploring the way people think about things, because the flexibility that people have for using anything they want as a reason to support anything they want is effectively infinite.

I wonder if this is at the heart of our disagreement. When I read a sentence like "how X thinks about Y," I interpret it less as "what kinds of causal forces act on X's beliefs about Y" and more as "what kind of subjective state does X have when considering Y." So when I read the claim that people come to conclusions based on what "feels good" in a social-status sense, I interpret it as a claim about their subjective reasons to believe Y.