This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

This makes me think there might be a cleaner line between "true delusion" and the other two proposed categories than I had initially expected. Why not consider the "you can't empathize your way into it" criterion as a (if not the) major boundary of the concept?
Considering both the Christians and the salt-based curse believers, both seem to be engaged in perfectly normal cognition - that is, I suspect that what both groups are doing is reasoning off of the apparent beliefs of people they trust at some point in their pasts. This is partially captured in the cultural congruity aspect, but seems distinct.
We could imagine my friends and family conspiring to convince me that my wife is cheating on me. They may use weak arguments and no evidence, but I would certainly still update in the direction they're pushing (unless, of course, I was aware of the conspiracy). Keep this up for long enough and deny me any opportunity to see evidence to the contrary (a notable feature of most popular supernatural beliefs: they are not easily and obviously falsifiable) and I expect I would have a strongly fixed, false, unjustified, non-culturally-determined belief that my wife is cheating on me.
Conversely, I could imagine a devout Christian hitting his head and suddenly losing all belief in the immaterial. Despite his beliefs coming closer to what I expect to be correctness, I find it very easy to rate him as less sane than the curse believers - something has clearly gone wrong with his cognition in a way that I cannot model as reasoning in the normal sense.
I expect also that this distinction is materially useful - the ways in which I'd interact with someone with strongly held false beliefs obtained via ordinary methods are very different from how I would interact with the truly delusional (at least concerning the areas of their maps that clearly have holes). As you say, the former can be pressed.
Because the ability to empathize is subjective, hopelessly so. And just because you think you can empathize with someone doesn't mean you are accurately simulating their inner cognition.
I can try and empathize with an octopus. I can try and imagine having tentacles, but I do not think I could capture the qualia of an octopus even if I tried my best. I can dream of being a butterfly, but that is not the same as actually being a butterfly.
Alternatively, a society of autistic people might be fully functional (if they're high-functioning autists). They might have severe theory-of-mind deficits and be unable to understand how a neurotypical person in their midst actually feels. They might well call him broken or insane. Or a religious enclave might consider an unbeliever in their midst to be the crazy one, and feel very confident in that belief.
The autists might, after a great deal of empirical research, be able to accurately predict the behavior of neurotypical people. Actually autistic people do often learn how to "mask", but passing as neurotypical does not necessarily make them neurotypical. Similarly, psychiatrists can predict the behavior of the psychotic (to a degree), even if we do not "understand" them in the Jaspersian sense.
I am not an expert on phenomenology, but I do not fully agree with Jaspers and his supporters. I think I can empathize with the insane or the religious, at least to some degree, even if I do not agree with them. Am I right? I don't know. Who does? On what grounds?
It is still a kludge. I would say that our understanding of the universe is at a point where we can look at both the salt-aversive and the typical Christian and confidently say that both are incorrect. The world simply does not behave the way their beliefs would imply it does. The evidence is abundant; there are anti-cathedrals everywhere for those with eyes to see.
Now, social consensus is evidence, in the Bayesian sense. It makes holding erroneous beliefs more defensible, or at least more understandable, than when they arise in a vacuum. A black person in America might well believe that thousands of black people are unjustly shot by the popo on an annual basis, because of media bias and their own in-group consensus. I would not call that a central example of delusion, it is possible for people to just be plain old wrong because of the bad luck of existing in an environment that does not optimize for truth. I just think that the evidence against the claims of the typical religion is even stronger, but that is more of a quantitative difference than a qualitative one.
("What Evidence Filtered Evidence?")
If I was less lazy, I'd expand on the implications of/for Bayesianism. But the delusional, in the standard psychiatric sense, can be modeled as having stuck priors that do not update on new evidence. Scott has discussed this with more depth and rigor than I can ape.
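To make the "stuck prior" framing concrete, here's a toy sketch of my own construction (not from Scott's posts, and the numbers are arbitrary): two agents perform odds-form Bayesian updates on the same evidence stream, but the "stuck" one discounts each observation's likelihood ratio almost to neutrality, so the same data barely registers.

```python
def update(prior, likelihood_ratio):
    """One Bayesian update in odds form: posterior odds = prior odds * LR."""
    odds = prior / (1 - prior)
    odds *= likelihood_ratio
    return odds / (1 + odds)

def run(prior, lr, n):
    """Apply n updates, each with the same likelihood ratio."""
    p = prior
    for _ in range(n):
        p = update(p, lr)
    return p

# Each observation favors "the belief is false" 2:1 (LR = 0.5 for the belief).
normal = run(prior=0.9, lr=0.5, n=10)   # updates down readily, ends near 0.01
# The stuck agent sees the same stream but treats each datum as nearly neutral.
stuck = run(prior=0.9, lr=0.93, n=10)   # still above 0.8 after ten observations
```

The point of the sketch is just that "stuckness" here is a dial (how much each datum is discounted), not a different kind of machinery.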
I disagree! I see it as the equivalent of percussive maintenance: sometimes a sufficient shock to the system can break it out of a maladaptive pattern.
Within psychiatry, consider ECT. Let's say you're depressed and think you're an awful human being who deserves to die. I take you, put you under anesthesia, then induce seizures in your brain through the application of electric voltage.
You wake up, you no longer feel depressed, and you no longer want to kill yourself. Is an electric shock a valid argument against your former position? Hardly. Nonetheless, you're doing better, more functional at the very least. I would happily say that the process has made you more sane.
This is true, and important if we're trying to come up with rules that we can directly audit, but this objection also applies any time we are reasoning outside of a formal system - the fact I can believe falsely does not mean I shouldn't use my beliefs in downstream reasoning. If "my estimate of how reasonable the origin of a belief is" produces useful clusters I'll probably have a hard time selling it to a journal, but it will still be useful.
Also true, but I think overstated - we can say quite a bit about what it is like to be a bat, and statements like this can't be thrown out immediately - especially when the difference in cognitive architecture is as minor as that between (in the religiosity case, I'm sure we can find at least one instance) a pair of identical twins. We can think about questions like this and achieve certainty to our own satisfaction because this is what we have to do constantly - if everyone believed they needed absolute certainty to make a statement, only the insane would speak.
I mean, again I largely agree, but I think you're discounting the sheer space of possible beliefs that's been selected away for being too falsifiable. In the salt case, I would be extremely surprised if anyone involved was highly confident that some immediately visible malady would occur. If that were the belief, it would have been falsified enough times in enough communities that the idea would have been outcompeted. Even the very religious do respond to evidence. We see this with new religious movements / cults (Debunking "When Prophecy Fails") - it's interesting how major, long-lived religious movements tend to avoid these kinds of situations. It's hard to say that membership in a flying saucer cult selects for especially good epistemology. These priors don't look stuck exactly, more insensitive.
More broadly, almost all evidence is filtered evidence. This is good and necessary - "we" understand a ton about the world, whereas I understand only what I have the time/energy/ability to really look into. All the rest is impressions filtering through my peers and favored media. I'm surprised it works as well as it does! Somehow we've created a system where global understanding increases while almost no one understands almost anything - "someone seems moderately too insensitive to evidence against their favored belief" is the default.
If we phrase the distinction as a stuck prior, sensitivity to evidence, etc., as Scott tends to, the difference does seem quantitative rather than qualitative. We do also have, within the rat canon, "0 And 1 Are Not Probabilities", which makes the opposite point. If a few of our parameter choices lead to vastly different behavior than all of our others, we really want to point that out! The reason I want to draw the line at "true delusion" is this quantitative difference.
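To illustrate why 0 and 1 sit apart from every other parameter choice, a minimal sketch (my own, with arbitrary numbers): a prior of exactly 1 has infinite odds, so no finite likelihood ratio can move it, while a prior merely very close to 1 eventually collapses under repeated counterevidence.

```python
def update(p, lr):
    # Odds-form Bayes: posterior odds = prior odds * likelihood ratio.
    # p == 1.0 means infinite prior odds; no finite evidence moves it.
    if p == 1.0:
        return 1.0
    odds = p / (1 - p) * lr
    return odds / (1 + odds)

near_certain = 1 - 1e-9   # extreme but honest prior
certain = 1.0             # "probability" 1: no longer updateable
for _ in range(100):
    near_certain = update(near_certain, 0.5)  # each datum is 2:1 against
    certain = update(certain, 0.5)
# near_certain has collapsed toward 0; certain is untouched at exactly 1.0
```

Every value in (0, 1) differs from its neighbors only quantitatively; 0 and 1 differ from all of them qualitatively, which is the behavior worth flagging.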
This does, however, require you to assume that they weren't sane to begin with. To be clear, being stuck in a negative-feedback loop of affect is a pretty good reason to believe someone isn't sane, but in the examples I brought up that's the entire point in contention. We could easily imagine analogous scenarios where a direct improvement in affect would make one markedly less sane.
Fair points. If I was less lazy, I would have linked Yudkowsky's post myself, it was definitely in my head.
That's why I throw up my hands and argue for a pragmatic approach instead of some clearly defined mathematical framework. Bayesian agents are, as far as I'm aware, allowed to have axiomatic priors. Reasonable ones will be exceedingly careful about which priors they accept as axiomatic, or will treat them as only contingently axiomatic, to be revised if truly walloped with evidence.
(Mathematicians use axioms, but are open to reconsidering said axioms. Some things are merely axiom-ish)
I do not contest that religious people are capable of changing their minds. Conversion and apostasy do happen, albeit on the margin. The reason for my disdain is largely that they started with little evidence and do not sufficiently update on the enormous amount of evidence to the contrary.
It would be trivial to find someone who is "high functioning religious", in the sense that they are devout, successful, and also willing to consider empirical arguments when said arguments aren't directly opposed to their core religious beliefs. They might acknowledge the power of science and empiricism by default, but I shake my head when I present them with RCT evidence of the inefficacy of prayer and they keep praying. At that point, I am content to note the enormous amount of cognitive siloing involved and wash my hands of the matter.
As I often do, I bring up the example of an uncle of mine who is an accomplished microbiologist while also believing in homeopathy (I think he's a bit religious, but let's ignore that for now). He applies the standards of empiricism rigorously in one context, and is specifically trained to apply them everywhere. He does not, however, apply them at home. If that's not irrational, I don't know what is.
There are many other kinds of cognitive dissonance and motivated reasoning: the same people who claim that the question of God is undecidable through material evidence often happily accept what they deem physical evidence in favor. Separate and non-overlapping magisteria? Only when convenient. The logically consistent stance would be that even an incontrovertible miracle is null evidence in favor.
No substantive disagreement from me, I certainly don't think the average (or, well, any) human is an ideal reasoner. I even agree that we have the ability to do better.
I just can't feel that disdain - any mistakes I can catch others making are mistakes I've made and will continue to make. It's no fun being a finite being.