This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which in turn becomes a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
Attempting to 'build consensus' or enforce ideological conformity.
Making sweeping generalizations to vilify a group you dislike.
Recruiting for a cause.
Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
Be as precise and charitable as you can. Don't paraphrase unflatteringly.
Don't imply that someone said something they did not say, even if you think it follows from what they said.
Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I know this may not be the usual place to get feedback on academic research, but there's a paper idea I've been mulling over for a while that I wanted to run past the community, since it dovetails nicely with many of your interests (and I'm sure you'll have some interesting things to say). In short, I'm increasingly of the view that genuine beliefs may be a lot rarer than people assume.
The inspiration for this came partly from conversations with friends and family members: I've noticed that people sincerely profess to believe shit all the time while simultaneously failing to exhibit most or all of the conventional features we'd expect in cases of genuine belief. Consider my sister, a staunch climate change activist, who recently bought a new gas-guzzling car, has never given any serious thought to reducing her meat consumption, and takes 12+ international flights a year. Or consider my dad, who says extremely negative things about Muslims (not just Islam), yet has a large number of Muslim friends about whom he'd never dream of saying a bad word. Or consider me: I claim to believe that AI risk is a deep existential threat to humanity, yet I get very excited and happy whenever a shiny new AI model is released.
I'm not saying that any of the above positions are strictly contradictory (and people are very good at papering over apparent tensions in their beliefs), but they all have more than a whiff of hypocrisy to me. There are a lot of well-known cases like this in the heuristics and biases literature, and to be fair, psychologists and philosophers have been investigating and theorising about this stuff for a while, from Festinger's famous cognitive dissonance framework to contemporary belief fragmentation and partial belief accounts.
However, one view that I don't think anyone has properly explored yet is the idea that beliefs - at least as classically understood by psychologists and philosophers - may be surprisingly rare (compare the view of philosophers like John Doris, who argue that virtues are very rare). Usually, if someone sincerely professes to believe that P and we don't think they're lying, we assume that they do believe that P. In extreme cases, we might point to ways in which they fail to live up to their apparent belief that P and suggest that they can't believe P all that strongly. But for the purposes of folk psychology, we normally take sincere profession as sufficient grounds for ascribing to them the relevant belief that P.
Contrast this with how psychologists and philosophers have traditionally thought about the demands of belief. When you believe that P, we expect you to make your other beliefs consistent with P. We expect that P will be "inferentially promiscuous", meaning that you'll draw all sorts of appropriate inferences on the basis that P. And finally, we expect that your behaviour will largely align with what people who believe that P typically do (ceteris paribus in all these cases, of course).
To be sure, we recognise all sorts of ways in which people fall short of these demands, but they remain regulative norms for believing. And simply put, I think that many of the standard cases where we ascribe beliefs to someone (e.g., a relative saying "no-one trusts each other any more") don't come close to meeting these standards, nor do people feel much, if any, obligation to bring them closer.
Instead, I think a lot of what we standardly call beliefs might be better characterised as "context-sensitive dispositions to agree or disagree with assertions". Call these S-dispositions. I think S-dispositions have a normative logic all of their own, far more closely linked to social cues and pressures than to the conventional demands of epistemology. The view I'm describing says that S-dispositions should be understood as a kind of psychological state distinct from beliefs.
However, they're a state that we frequently mistake for belief, both in other people and even in ourselves. That's partly because when we do truly believe that P, we're also inclined to agree with assertions that P. But I don't think it works the other way round: there are lots of times we're inclined to agree with assertions that P without meeting any of the common normative criteria for strict belief. And this isn't something that's immediately transparent to us; figuring out whether you really believe something is hard, and requires a lot of self-reflection and self-observation.
Consider someone, John, who sincerely claims to believe that meat is murder. John may find himself very inclined to agree with statements like "animal farming is horrific", "it's murder to kill an animal for food", and so on. But let's say John is reflective about his own behaviour. He notices that he only started asserting this kind of thing after he fell in love with a vegan woman and wanted to impress her. He notes that despite making some basic efforts to be vegan, he frequently fails, and doesn't feel too bad about it. He also notes that it has never occurred to him to stop wearing leather or to donate to charities trying to reduce animal suffering. In this case, John might well think something like the following: "I had a strong disposition to agree to statements like 'Meat is murder', but my behaviour and broader mindset weren't really consistent with someone who truly believed that. Whatever state it is that makes me inclined to agree to statements like that, then, is probably not a sincere belief."
I think an obvious objection here is that this is a semantic issue: I'm essentially no-true-Scotsmanning the concept of belief. However, I'd push back against this. My broader philosophical and psychological framework for understanding the mind is a "psychological natural kinds" model: I think there really are important divisions in the mind between different kinds of psychological state, and a big part of the job of cognitive science is to discover them. The view I'm describing here, then, is that a lot of the states we conventionally call beliefs aren't in fact beliefs at all - they're a different psychological natural kind with its own norms and functions, which I've termed S-dispositions. There may be some interesting connections between S-dispositions and strict beliefs, but those connections are weak enough and complicated enough that a good ontology of the mind should treat the two as separate kinds of psychological states.
I also think this 'sparse beliefs' view has some interesting potential upshots for how we think about speech and epistemic virtue, starting with the simple point that S-dispositions are ubiquitous and strict beliefs are rare. I'm still figuring these out, and I'd like to hear others' views, but it raises some interesting questions. For example, should we have a different set of norms for rewarding/punishing S-dispositions from those we apply to beliefs? If someone says "Russians are a bunch of fucking savages", and we have reason to believe that it's merely an S-disposition rather than a belief, should we judge them less harshly? Similarly, if someone has two contradictory S-dispositions, is that necessarily a bad thing in the way that having two contradictory beliefs would be? Should social media platforms make an effort to distinguish between users who casually assert problematic or dangerous things ("men should all be killed") and those whose broader pattern of online interactions suggests they truly believe those things? What sort of epistemic obligation, if any, do we have to make sure our S-dispositions line up with our strict beliefs? Is there something epistemically or morally problematic about someone who casually says things like "Americans are idiots" in specific social contexts yet in practice holds many Americans in high esteem?
In any case, I'm in the early stages of writing a paper on this, but I'd love feedback from you all.
Where does power, or the personal perception of power, come into all this? It seems to me that what you call out as hypocrisy could just as easily be explained by a belief in one's own power, or helplessness, to implement one's beliefs: the people you identify as either not holding beliefs, or as hypocrites, may instead be rationally biding their time until they can implement their ideas en masse to greater benefit.
I'd like to take a moment to appreciate that you provided one Blue Tribe, one Red Tribe, and one Grey Tribe example, so that we all will tend to see one "moral" take, one "immoral" take, and one neutral-weird one.
The unifying factor across these beliefs doesn't seem to be hypocrisy, but a perception of a lack of power to implement change. Your sister sees no point in limiting her own consumption of carbon-intensive goods and services when her individual actions will mean little without regulatory change to enforce mass movement towards those goals.* The real win is governments implementing industrial carbon limits, not limiting your own flights to achieve nothing. Your father might see no point in being cruel to Muslims who are here and whom he has no power to expel, but if I were a Muslim I certainly wouldn't count on his good will. I would imagine that, given the power to do so, he might choose to ban Muslim immigration or deport already-present Muslims, even though he behaves as he does while lacking that power. There's no benefit to him in personally excluding Muslims or being mean; there might be a benefit in ultimately removing all Muslims or Islam from the world.**
Thus a lot of what you identify as hypocrisy is better seen as a rejection of the Guidance Counselor Office Poster advice to "Be the Change You Wish to See in the World." Instead, they might hold a belief closer to Big Yud's "Be Nice, Until You Can Coordinate Meanness." Perhaps: "Be selfish in the circumstances you find yourself in, but be willing to advocate for coordinated actions that might go against your selfish goals; don't be selfless unilaterally." This is a fairly common set of circumstances: a liberal billionaire might politically advocate higher taxes on himself while not overpaying the taxes he owes; Reagan believed strongly in nuclear disarmament while continuing to invest in and maintain the USA's nuclear arsenal to preserve MAD and pressure the Soviets; one might believe a gun-free society would be superior, but own a gun to defend oneself against others with guns whom one has no power to disarm.
Another example: a lot of people who conspicuously complain about the modern dating/romance/marriage/sex scene still participate in it for their own selfish gain, but if we had a big Constitutional Convention of Sex to decide how we were going to do things going forward, they might choose a different system altogether. Saying that one can't date if one doesn't approve of the entire social system veers dangerously close to the meme about "Oh, you critique society while participating in society!" One must do what one must do to live in society, and then seek to implement change by obtaining and exercising power over the collective. Your system requires all dissidents to Benedict Option themselves (at a minimum!) or be called hypocrites or non-believers.
Friedman feels relevant here, to view it in a more systematic way: "Only a crisis, actual or perceived, produces real change. When that crisis occurs, the actions that are taken depend on the ideas that are lying around. That, I believe, is our basic function: to develop alternatives to existing policies, to keep them alive and available until the politically impossible becomes the politically inevitable."
So I might hold a genuine belief, but have no interest in marginalizing myself by advocating it or implementing it in a useless way, while having an ultimate interest in implementing the idea in an effective way.
That feels much less organized than I'd like; maybe I need to chew on this idea more.
*For what it's worth, I tend to believe that most climate change activists operate on the basis of banning things they didn't like anyway. Climate change is at core about restricting people, and obviously some things will be justifiable and some will not be under a carbon framework. People who get woke to climate issues tend to restrict things that they/their class didn't want to do anyway: drive pickup trucks, run industrial concerns, have American children. They tend to ignore or justify the climate impacts of things that they did want to do anyway: fly to foreign countries, import fancy food from abroad, live in (and allow people to live in) places that are more carbon-intensive. Right-wing Malthusian overpopulation types similarly tend to be most conspicuously concerned about preventing the birth of too many of the kinds of people they didn't like to begin with.
**I feel like your AI example can be mapped to this as well, but it didn't write out well, so I omitted it. There are reasons for a grey-tribe individual to be selfishly excited at each new AI advancement even while frightened of an AGI apocalypse: empowering tech people, confirming beliefs so people will take them seriously, or just the joy of saying I Told You So. Idk, I'm not one of y'all.