This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Notes -
This is true, and important if we're trying to come up with rules that we can directly audit, but this objection also applies any time we are reasoning outside of a formal system - the fact that I can believe falsely does not mean I shouldn't use my beliefs in downstream reasoning. If "my estimate of how reasonable the origin of a belief is" produces useful clusters, I'll probably have a hard time selling it to a journal, but it will still be useful.
Also true, but I think overstated - we can say quite a bit about what it is like to be a bat, and statements like this can't be thrown out immediately - especially when the difference in cognitive architecture is as minor as that between a pair of identical twins (in the religiosity case, I'm sure we can find at least one instance). We can think about questions like this and achieve certainty to our own satisfaction because this is what we have to do constantly - if everyone believed they needed absolute certainty to make a statement, only the insane would speak.
I mean, again I largely agree, but I think you're discounting the sheer space of possible beliefs that have been selected away for being too falsifiable. In the salt case, I would be extremely surprised if anyone involved was highly confident that some immediately visible malady would occur. If that were the belief, it would have been falsified enough times in enough communities that the idea would have been outcompeted. Even the very religious do respond to evidence. For example, we see this with new religious movements / cults (Debunking "When Prophecy Fails") - it's interesting how major, long-lived religious movements tend to avoid these kinds of situations. It's hard to say that membership in a flying saucer cult selects for especially good epistemology. These priors don't look stuck, exactly, so much as insensitive.
More broadly, almost all evidence is filtered evidence. This is good and necessary - "we" understand a ton about the world, whereas I understand only what I have the time/energy/ability to really look into. All the rest is impressions filtering through my peers and favored media. I'm surprised it works as well as it does! Somehow we've created a system where global understanding increases while almost no one understands almost anything - "someone seems moderately too insensitive to evidence against their favored belief" is the default.
If we phrase the distinction as a stuck prior, sensitivity to evidence, etc., like Scott tends to, the difference does seem quantitative rather than qualitative. But we do also have, within the rat canon, 0 And 1 Are Not Probabilities, which makes the opposite point. If a few of our parameter choices lead to vastly different behavior than all the others, we really want to point that out! The reason I want to draw the line at "true delusion" is because of this qualitative difference.
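To make the 0-and-1 point concrete, here's a minimal sketch of the update rule (illustrative Python of my own; the helper `bayes_update` is made up for this comment, nothing here comes from the linked posts): a prior of exactly 0 or 1 is an absorbing state that no likelihood ratio can move, while even an extreme prior strictly inside (0, 1) still updates.

```python
# Minimal sketch: Bayesian updating on a binary hypothesis H.
# likelihood_ratio = P(E|H) / P(E|~H) for an observed piece of evidence E.

def bayes_update(prior: float, likelihood_ratio: float) -> float:
    if prior in (0.0, 1.0):  # absorbing states: Bayes' rule cannot move them
        return prior
    odds = prior / (1 - prior)
    posterior_odds = odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

for prior in (0.0, 1.0, 0.999):
    p = prior
    for _ in range(10):           # ten pieces of strong contrary evidence,
        p = bayes_update(p, 0.1)  # each favoring ~H ten to one
    print(f"prior {prior}: posterior after ten updates = {p:.3g}")

# prior 0.0: posterior after ten updates = 0
# prior 1.0: posterior after ten updates = 1
# prior 0.999: posterior after ten updates = 9.99e-08
```

The 0.999 prior looks insensitive for the first few observations but eventually collapses; the 0 and 1 priors are frozen forever, which is the qualitative break being pointed at.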
This does, however, require you to assume that they weren't sane to begin with. To be clear, being stuck in a self-reinforcing loop of negative affect is a pretty good reason to believe someone isn't sane, but in the examples I brought up that's the entire point in contention. We could easily imagine analogous scenarios where a direct improvement in affect would make someone markedly less sane.
Fair points. If I were less lazy, I would have linked Yudkowsky's post myself - it was definitely in my head.
That's why I throw up my hands and argue for a pragmatic approach instead of some clearly defined mathematical framework. Bayesian agents are, as far as I'm aware, allowed to have axiomatic priors. Reasonable ones will be exceedingly careful about which priors they accept as axiomatic, or even which ones they contingently treat as axiomatic unless truly walloped with evidence.
(Mathematicians use axioms, but are open to reconsidering said axioms. Some things are merely axiom-ish.)
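To put rough numbers on "truly walloped" (a back-of-the-envelope sketch of my own; the helper `bits_to_even_odds` is invented for this comment): in log-odds terms an axiom-ish prior sits only finitely far from even odds, and each extra nine of confidence costs a constant ~3.3 bits of contrary evidence to undo.

```python
import math

def bits_to_even_odds(prior: float) -> float:
    """Bits of contrary evidence needed to drag a prior down to p = 0.5."""
    return math.log2(prior / (1 - prior))

for prior in (0.99, 0.9999, 0.999999):
    print(f"p = {prior}: ~{bits_to_even_odds(prior):.1f} bits")

# p = 0.99: ~6.6 bits
# p = 0.9999: ~13.3 bits
# p = 0.999999: ~19.9 bits
```

A true axiom sits at p = 1, infinitely many bits from anywhere - which is exactly the gap between axiom-ish and axiomatic.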
I do not contest that religious people are capable of changing their minds. Conversion and apostasy do happen, albeit at the margins. The reason for my disdain is largely that they started with little evidence and do not sufficiently update on the enormous amount of evidence to the contrary.
It would be trivial to find someone who is "high-functioning religious", in the sense that they are devout, successful, and also willing to consider empirical arguments when said arguments aren't directly opposed to their core religious beliefs. They might acknowledge the power of science and empiricism by default, but I shake my head when I present them with RCT evidence of the inefficacy of prayer and they keep praying. At that point, I am content to note the enormous amount of cognitive siloing involved and wash my hands of the matter.
As I often do, I bring up the example of an uncle of mine who is an accomplished microbiologist while also believing in homeopathy (I think he's a bit religious, but let's ignore that for now). He applies the standards of empiricism rigorously in one context, and is specifically trained to apply them everywhere. He does not, however, apply them at home. If that's not irrational, I don't know what is.
There are many other kinds of cognitive dissonance and motivated reasoning: the same people who claim that the question of God is undecidable through material evidence often happily accept what they deem physical evidence in its favor. Separate and non-overlapping magisteria? Only when convenient. The logically consistent stance would be that even an incontrovertible miracle is null evidence in favor.
No substantive disagreement from me; I certainly don't think the average (or, well, any) human is an ideal reasoner. I even agree we have the ability to do better.
I just can't feel that disdain - any mistakes I can catch others making are mistakes I've made and will continue to make. It's no fun being a finite being.