
Culture War Roundup for the week of January 22, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

I'm having trouble understanding the idea that "the onus is on the person making the positive claim to provide sufficient evidence to prove their case". It seems obvious why this is a good rule, but it also seems completely open to a rhetorical trick: put the onus on the other party to prove you wrong even though your own case is unproven (perhaps because the question is a hard one and whoever is tasked with proving anything will have a hard time).

What got me thinking about this was an internet argument about immigration and crime. Half a century ago the status quo was restricted immigration, and the onus was on the person advocating for more of it to prove that it was a good thing; nowadays the status quo is liberal immigration, and the onus is on the person advocating restrictions to prove that it is a bad thing. No scientifically relevant change has taken place, only a change in government policy, yet one side can now invoke a basic principle of science to bolster its case even if it knows nothing more than the other party.

The due diligence question, then, is whether this really is a fundamental principle of science as stated, or a misrepresentation of a more nuanced principle.

I'm having trouble understanding the idea that "the onus is on the person making the positive claim to provide sufficient evidence to prove their case".

I'd treat this like a social norm. The practical problem being solved is that people have finite time. This rule is useful because it keeps conversations productive. It's not necessarily a philosophical claim; "ignore claims that don't have evidence" is probably a good heuristic but isn't proof of anything.

Suppose I enter a conversation and make a surprising statement, contrary to the common wisdom. To pick an example, if I'm hanging out with a bunch of libertarians, I might say:

I believe that raising the Federal Minimum Wage by 75¢/hour won't result in any measurable change in employment among poor people.

If we're being super-literal, my statement is technically about what's going on in my own head; I either believe that, or I don't. So, if my conversation partners want to be jerks, they could say "That's true." and mean, "Yes, you probably do believe that because you are an idiot and believe any number of false things."

But, that's a generally obnoxious way to approach conversations. The commonly-understood subtext of my statement is something like:

Raising the Federal Minimum Wage by 75¢/hour won't result in any measurable change in employment among poor people. I believe this with moderate-to-high confidence. You should also believe this.

So, now I'm not just making a statement about what's going on in my own head. I'm (1) making a claim about reality and (2) telling my conversation partner that THEY should adopt my view. In the context of a conversation, it's perfectly reasonable for them to ask, basically, "Ok, why? Why should I think that?"

There are two practical reasons for this:

  1. Investigating this kind of question can easily take an hour, and so if my conversation partner spent an hour looking up facts every time someone said something strange on the internet, they'd never get anything else done. ("Cows are not made of stone.")
  2. Since I'm the one making the claim, I want to talk about it, and likely have information to hand.

There are a couple of exceptions to this principle.

  • There are some contexts where we really do have ~unlimited time to investigate important questions.

Philosophy journals have unlimited pages and can take up questions where "everyone knows" the answer. Economics journals also have lots of space. It's appropriate for the National Bureau of Economic Research to investigate all kinds of questions along the lines of "What happens if we continue current policy?" / "What happens if we change policy?"

  • There are some times when people are NOT adding the implicit "... and you should believe this, too."

There are all kinds of things that I believe where, if you caught me at a cocktail party, I'd have a hard time mustering evidence. For example:

Conventional wisdom overstates the risks associated with smoking while pregnant. This is because most medical studies focus on the impact on average birthweight. But reduced birthweight is only a major risk when the resulting weight is unusually low, and the effects of smoking are non-linear: large babies tend to have their birthweight reduced a lot, while smoking doesn't do very much to smaller babies. Smoking is still probably not a good idea, but the risks are overstated.

It happens that my belief comes from an econometrics paper I read a decade ago. I don't remember the title of the paper, and certainly don't carry a copy with me. So, if someone (say, a doctor) disagreed with me, I'd pretty much shrug and acknowledge that I don't have any convincing-to-them evidence to hand. But I'm also not going to change my viewpoint back to the 'common belief' simply because I left an econometrics paper in a drawer. So, this gets us to "Agree to Disagree," which is a good way to handle social conversations when no one has evidence to hand.