
Culture War Roundup for the week of December 26, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I know this may not be the usual place to get feedback on academic research, but there's a paper idea I've been mulling over for a while that I wanted to run past the community, since it dovetails nicely with many of your interests (and I'm sure you'll have some interesting things to say). In short, I'm increasingly thinking that genuine beliefs may be a lot rarer than people think.

The inspiration for this came partly from conversations I've had with friends and family members, where I've noticed that people sincerely say and profess to believe shit all the time while simultaneously failing to exhibit most or all of the conventional features we'd expect in cases of genuine belief. Consider my sister, who is a staunch activist in the domain of climate change, yet recently bought a new gas-guzzling car, has never given any serious thought to reducing her meat consumption, and takes 12+ international flights a year. Or consider my dad, who says extremely negative things about Muslims (not just Islam), yet who has a large number of Muslim friends about whom he'd never dream of saying a bad word. Or consider me, who claims to believe that AI risk is a deep existential threat to humanity, yet gets very excited and happy whenever a shiny new AI model is released.

I'm not saying that any of the above positions are strictly contradictory (and people are very good at papering over apparent tensions in their beliefs), but they all have more than a whiff of hypocrisy to me. There are a lot of famous cases like this in the heuristics and biases literature, and to be fair, psychologists and philosophers have been investigating and theorising about this stuff for a while, from Festinger's famous cognitive dissonance framework to contemporary belief fragmentation and partial belief accounts.

However, one view that I don't think anyone has properly explored yet is the idea that beliefs - at least as classically understood by psychologists and philosophers - may be surprisingly rare (compare the view of philosophers like John Doris who argue that virtues are very rare). Usually, if someone sincerely professes to believe that P, and we don't think they're lying, we assume that they do believe that P. Maybe in extreme cases, we might point to ways in which they fail to live up to their apparent belief that P, and suggest that they can't believe P all that strongly. However, for the purposes of folk psychology, we normally take this as sufficient grounds for ascribing them the relevant belief that P.

Contrast this with how psychologists and philosophers have traditionally thought about the demands of belief. When you believe that P, we expect you to make your other beliefs consistent with P. We expect that P will be "inferentially promiscuous", meaning that you'll draw all sorts of appropriate inferences on the basis that P. And finally, we expect that your behaviour will largely align with what people who believe that P typically do (ceteris paribus in all these cases, of course).

To be sure, we recognise all sorts of ways in which people fall short of these demands, but they're still regulatory norms for believing. And simply put, I think that many of the standard cases where we ascribe beliefs to someone (e.g., a relative saying "no-one trusts each other any more") don't come close to these standards, nor do people feel much if any obligation to make them come close to these standards.

Instead, I think a lot of what we standardly call beliefs might be better characterised as "context-sensitive dispositions to agree or disagree with assertions". Call these S-dispositions. I think S-dispositions have a normative logic all of their own, far more closely linked to social cues and pressures than the conventional demands of epistemology. The view I'm describing says that S-dispositions should be understood as a distinctive kind of psychological state from beliefs.

However, they're a state that we frequently confuse for beliefs, both in the case of other people and even ourselves. That's partly because when we do truly believe that P, we're also inclined to agree with assertions that P. However, I don't think it works the other way round - there are lots of times we're inclined to agree with assertions that P without meeting any of the common normative criteria for strict belief. But this isn't something that's immediately transparent to us; figuring out whether you really believe something is hard, and requires a lot of self-reflection and self-observation.

Consider someone, John, who sincerely claims to believe that meat is murder. John may find himself very inclined to agree with statements like "animal farming is horrific", "it's murder to kill an animal for food", and so on. But let's say John is reflective about his own behaviour. He notices that he only started asserting this kind of thing after he fell in love with a vegan woman and wanted to impress her. He also notes that despite making some basic efforts to be a vegan, he frequently fails, and doesn't feel too bad about it. He also notes that it's never occurred to him to stop wearing leather or make donations to charities trying to reduce animal suffering. In this case, John might well think something like the following: "I had a strong disposition to agree to statements like 'Meat is murder', but my behaviour and broader mindset weren't really consistent with someone who truly believed that. Whatever state it is that makes me inclined to agree to statements like that, then, is probably not a sincere belief."

I think an obvious objection here is that this is a semantic issue: I'm essentially no-true-scotsmanning the concept of belief. However, I'd push back against this. My broader philosophical and psychological framework for understanding the mind is a "psychological natural kinds" model: I think that there really are important divisions in kind in the mind between different kinds of psychological state, and a big part of the job of cognitive science is to discover them. The view I'm describing here, then, is that a lot of the states we conventionally call beliefs aren't in fact beliefs at all - they're a different psychological natural kind with its own norms and functions, which I've termed S-dispositions. There may be some interesting connections between S-dispositions and strict beliefs, but they're weak enough and complicated enough that a good ontology of the mind should consider them separate kinds of psychological states.

I also think this 'sparse beliefs' view I'm describing has some interesting potential upshots for how we think about speech and epistemic virtue, including the simple point that S-dispositions are ubiquitous and strict beliefs are rare. I'm still figuring these out, and I'd like to hear others' views on this, but it raises some interesting questions. For example, should we have a different set of norms for rewarding/punishing S-dispositions from those we apply to beliefs? If someone says "Russians are a bunch of fucking savages", and we have reason to believe that it's merely an S-disposition rather than a belief, should we judge them less harshly? Or similarly, if someone has two contradictory S-dispositions, is that necessarily a bad thing in the same way that having two contradictory beliefs would be? Should social media platforms make an effort to distinguish between users who casually assert problematic or dangerous things ("men should all be killed") versus those whose broader pattern of online interactions suggests they truly believe those things? What sort of epistemic obligation if any do we have to make sure our S-dispositions line up with our strict beliefs? Is there something epistemically or morally problematic about someone who casually says things like "Americans are idiots" in specific social contexts yet in practice holds many Americans in high esteem?

In any case, I'm in the early stages of writing a paper on this, but I'd love feedback from you all.

First and foremost, this seems absurdly difficult to measure rigorously. It is easy to determine whether someone professes a belief: you just ask them in a poll. It is highly nontrivial to determine whether someone "truly believes" something in the way you describe in any sort of objective sense. You can make a bunch of inferences that you think ought to logically follow from the true belief and also ask them about those in a poll, but that's incredibly subjective in what "counts", and someone with genuine true belief could disagree with some of your logical implications, or disagree with those particular statements because they also have other beliefs you didn't expect them to have. And someone without genuine true belief could agree with those statements for other reasons.

Similar issues come up if you try to track real-world behavior like "does this person buy a gas-guzzling car?" Maybe they really believe in climate change but they're just selfish and care more about their own convenience. Maybe they have a consistent belief that only the 1% of people with the most demand for hauling heavy things around should have large trucks, and they genuinely believe they qualify as one of those people. Maybe that belief is in part selfishly motivated and in part genuine, and it's not a binary thing. Similarly, lots of people who don't believe in climate change still have low carbon impact simply by coincidence. Any attempt to measure hypocrisy is going to be incredibly subjective and could produce completely different answers depending on the methodology.

Second, I think a lot of the perceived sparseness is availability bias. You are thinking of positive examples where people have hypocritical-seeming behavior, and controversial issues that people disagree on, but if you look at a broader and less interesting class of beliefs I expect you'd find 99%+ of beliefs are genuine. Everyone believes the sun will come up tomorrow, and acts accordingly. Everyone believes that wearing clothes in public is good behavior, and acts accordingly. Everyone believes that using doorknobs is the optimal way to open most doors, and acts accordingly. There are millions of minor facts that everyone genuinely believes, acts on, and takes for granted, rarely even thinking about them except when educating children. It's only controversial concepts - which some people believe and some do not - that draw your attention when making these considerations. So if you're trying to make some sort of claim about the rarity of genuine beliefs, you need to be careful about what class of beliefs you are considering.

Additionally, controversial issues where there is mixed evidence are precisely the issues where a good Bayesian ought to have a nontrivial probabilistic belief. Maybe someone thinks there's a 60% chance that anthropogenic climate change is a big problem, and so they make some high-efficiency efforts that they think have a high value per cost, but not others, because the expected value is lower than someone with a 99% belief would perceive. Does this 60% belief count as "genuine"? And would your study be able to tell the difference between that and someone with a hypocritical professed 99% belief?
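The point above can be made concrete with a toy expected-value calculation. All the numbers here are hypothetical, chosen only to illustrate how a 60% credence and a 99% credence can rationally produce different behavior from the same decision rule:

```python
# Toy sketch (hypothetical numbers): how credence in a claim changes which
# mitigation actions clear an expected-value bar for the same rational agent.

def expected_net_value(credence, benefit_if_true, cost):
    """Expected net value of taking an action whose benefit only accrues
    if the belief turns out true; the cost is paid either way."""
    return credence * benefit_if_true - cost

# Two hypothetical actions a climate-concerned person might weigh.
actions = {
    "cheap, high-impact effort":  {"benefit": 100, "cost": 20},
    "costly, modest-impact effort": {"benefit": 50, "cost": 40},
}

for credence in (0.60, 0.99):
    for name, a in actions.items():
        ev = expected_net_value(credence, a["benefit"], a["cost"])
        verdict = "worth doing" if ev > 0 else "not worth doing"
        print(f"credence {credence:.2f}: {name}: EV {ev:+.1f} ({verdict})")
```

Under these made-up numbers, the 60% believer takes the cheap action but skips the costly one (its expected value is negative at their credence), while the 99% believer takes both. To an observer who assumes near-certain belief, the 60% believer's behavior looks "hypocritical" even though it is perfectly coherent.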

In theory something along the lines of your study, done extremely carefully, could be useful. In practice it is incredibly likely to be muddled with subjective biases to the point of unusability except as a cudgel by some people to bash their political opponents with and call them hypocrites with "scientific evidence", and nobody learns anything they didn't already know.

I expect you'd find 99%+ of beliefs are genuine

Counting issues above aside, I'm not sure. And it's a much more interesting question when approached practically - what do many people's held beliefs mean, and should they hold the supposedly non-genuine ones - as opposed to a philosophical "how real are they" approach.

Are beliefs about, say, the attractiveness of clothing genuine? Not that it's entirely fake, but the history of fashion and its trends shows it is, at best, highly contingent - does its simulacrum-like, groundless nature mean anything? What about men or women who find women with heavy, garish makeup attractive? (One could respond "they're just making trivial claims about their experience", but... say I'm enlightened, and can redirect the rivers of perception at will - I look at an apple, "perceive it" instead as a pear, and then honestly claim "I see that as a pear". Something's not quite right there! Wouldn't something like that apply to socially-modified, rather than intentionally-modified, beliefs?)

If someone says (and really does believe it, as opposed to it being a straight lie) "I think my wife is the most beautiful person in the world"?

Beliefs of the form "my race is superior", or "my country is superior to other ones"? Even if some races were superior, e.g. whites or Jews being smarter, most folk beliefs posit supremacy in areas where it doesn't exist, whether that be the historical "British good, German bad", "whites are much more honest and freedom-loving than blacks", or funny-to-us Balkan or African rivalries. There are plenty of overtly nationalistic or racist people alive today.

"I want to lose weight, but just can't manage to, I try to eat less but I still don't lose any!" or "I want to lose weight but don't have the willpower to"?

"<my favorite player> is the BEST <sport> PLAYER!" What does that mean?

It's easy to put politics into the 'just one of many things' box, but looking at a broader scope of human activity, a lot of beliefs don't seem to be "fully updating" or "broadly applied". IMO, that's born of their meaninglessness, and said faux-beliefs should be abandoned by those who hold them.

You make a good point that there is a wide range of possible fake, or at least questionable, beliefs in a broad range of areas. But I don't think that invalidates my point that there is an absurdly large number of genuine beliefs about banal things. Any number of anecdotes does little to provide statistical weight when, for every suspicious "My wife is the most beautiful person in the world" you cherry-pick, there are literally hundreds of trivial beliefs like "My wrinkly grandpa is not the most beautiful person in the world", "my neighbor's dog is not the most beautiful creature in the world", "My wife's red scarf is more beautiful than her brown purse", "My wife's red scarf is more beautiful than mud"... that never get questioned and are rarely even mentioned, because they're just so obvious to the person holding them and relatively uninteresting.

I'm not arguing that nongenuine beliefs don't exist, or are super rare in some global sense. Just that they are vastly outnumbered mathematically if you consider the full set of ordinary beliefs that people have continuously throughout the day that let them function as human beings.

Agree with that, and made the same point lol. It gets worse - what about locally-correct beliefs that are held for the same reasons as pseudobeliefs? One might avoid poisonous plants because they're "cursed", and also burn incense to avoid curses. Say you, in the interest of 'health', or just because it's what everyone in your family does, brush your teeth each night and also use antimicrobial mouthwash each night, believing both to be equally effective means of teeth cleaning - and yet you don't actively pursue 'cleaning stuff off of teeth' while brushing, just go through the motions without cleaning effectively, and also eat lots of donuts.