
Culture War Roundup for the week of November 27, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


The Dunning-Kruger effect is autocorrelation

Huh. I can only apologize for the relatively bare link, but I feel like it's worth drawing attention to something so widely accepted yet utterly worthless, especially when it comes up so often here.

The thing about the DK effect is that it makes intuitive sense. To extend it past the realm of typical human intelligence, an ant or a chimp isn't very good at knowing how dumb it is, because it isn't very good at most things. However, I suspect that the average dumb (human) person does know they're a bit dim, so it confuses me how this finding can even arise.

The problem with the Dunning-Kruger chart is that it violates a fundamental principle in statistics. If you’re going to correlate two sets of data, they must be measured independently. In the Dunning-Kruger chart, this principle gets violated. The chart mixes test score into both axes, giving rise to autocorrelation.

Realizing this mistake, Edward Nuhfer and colleagues asked an interesting question: what happens to the Dunning-Kruger effect if it is measured in a way that is statistically valid? According to Nuhfer’s evidence, the answer is that the effect disappears.
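
To make the autocorrelation point concrete, here's a minimal sketch (Python with numpy; the uniform distributions and the quartile binning are my own illustrative choices, not the original study's exact procedure) in which test scores and self-assessments are generated completely independently, yet the familiar chart still falls out:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Test performance and self-assessment are completely independent by construction.
actual = rng.uniform(0, 100, n)      # actual percentile on the test
perceived = rng.uniform(0, 100, n)   # self-assessed percentile

# Bin people into quartiles by ACTUAL score, as the classic chart does.
quartile = np.digitize(actual, np.percentile(actual, [25, 50, 75]))

for q in range(4):
    mask = quartile == q
    print(f"Quartile {q + 1}: mean actual = {actual[mask].mean():5.1f}, "
          f"mean perceived = {perceived[mask].mean():5.1f}")

# Mean perceived ability hovers around 50 in every quartile, so the bottom
# quartile looks wildly "overconfident" and the top quartile "underconfident":
# the textbook Dunning-Kruger picture, from data containing no effect at all.
```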

Is it possible to salvage a non-trivial version of the DKE? The one we know and once loved literally works for random data, so that's right out. In other words, what's the cut-off where a stupid person becomes smart enough to know they're stupid, or at least worse than their peers?*

*In a more general sense than a chimp knowing he's not as strong or big as the alpha male.

So, so much of pop science is utter garbage that it's amazing it ever got the reach it did.

Another example is the "human brains don't mature until 25" BS, where even the authors of the original paper this statement comes from don't agree with it and are surprised that this, out of all the claims in the paper, was the takeaway that entered public consciousness...

Another of my most hated ones is the idea that only 7% of meaning comes from words (the rest from body language and tone). This doesn't even make any fucking sense, much less have any evidence that it's true. It's a terrible description (like, you'd have to be borderline-illiterate to go to the original source and have this as your takeaway) of a study that was making a completely different, much narrower point. (Specifically, that if your verbal and nonverbal communication don't match, people will generally believe the nonverbal portion - e.g. if you say to your spouse "everything's fine!" in an angry, aggressive tone, it will not persuade them that everything's fine.)

Thinking for ten seconds about the last non-trivial conversation you had (roughly speaking, one where novel information was exchanged on a topic other than the participants' current emotional states) should be enough to disprove this idea decisively. How, you should ask yourself, could 93% of that information have been exchanged in a way that was independent of the words used? In most cases, you will find that the question not only does not have a good answer, but is hard to even make sense of.

This has not prevented the idea from showing up in training materials from major multinational corporations, not to mention the Web sites of universities that should damn well know better.

Yeah, if only 7% of the meaning comes from words, then this dude who has Wernicke's aphasia is communicating just as much as a video link of a normal conversation where, 7% of the time, the screen goes blank and the person is muted. I find that basically impossible to believe.

Thinking for ten seconds about the last non-trivial conversation you had (roughly speaking, one where novel information was exchanged on a topic other than the participants' current emotional states) should be enough to disprove this idea decisively.

Yeah, this seems absolutely impossible to believe. The information content of someone telling me, "I would like pizza for dinner" via text message is almost identical to them saying the same thing face-to-face. I might glean some additional information about their enthusiasm for said pizza in the latter case, but that's the 7% rather than the 93%. I'm not familiar with the study, but it sure seems like it's just a measurement of a claim that everyone already believes, which is that words aren't great at overriding non-verbal signals that are generally more honest when the two are contradictory.

even the authors of the original paper this statement comes from don't agree with it

Do you have a reference for this?

Sure, here's an article from Slate from last year: https://slate.com/technology/2022/11/brain-development-25-year-old-mature-myth.html

Quoting:

They also found important clues to brain function. For instance, a 2016 study found that when faced with negative emotion, 18- to 21-year-olds had brain activity in the prefrontal cortices that looked more like that of younger teenagers than that of people over 21. Alexandra Cohen, the lead author of that study and now a neuroscientist at Emory University, said the scientific consensus is that brain development continues into people’s 20s.

But, she wrote in an email, “I don’t think there’s anything magical about the age of 25.”

Yet we’ve seen that many people do believe something special happens at 25. That’s the result of pop culture telephone: As people reference the takeaways from Cohen and other researchers’ work, the nuance gets lost. For example, to add an air of credibility to its DiCaprio theory, YourTango excerpts a passage from a 2012 New York Times op-ed written by the psychologist Larry Steinberg: “Significant changes in brain anatomy and activity are still taking place during young adulthood, especially in prefrontal regions that are important for planning ahead, anticipating the future consequences of one’s decisions, controlling impulses, and comparing risk and reward,” he wrote.

And further down:

To complicate things further, there’s a huge amount of variability between individual brains. Just as you might stop growing taller at 23, or 17—or, if you’re like me, 12—the age that corresponds with brain plateaus can differ greatly from person to person. In one study, participants ranged from 7 to 30 years old, and researchers tried to predict each person’s “brain age” by mapping the connections in each person’s brain. Their age predictions accounted for about 55 percent of the variance among the participants, but far from all of it. “Some 8-year-old brains exhibited a greater ‘maturation index’ than some 25 year old brains,” Somerville wrote in her Neuron review. Some of those differences might be random genetic variation, but people’s behavior and lived experience contribute as well. “Childhood experiences, epigenetics, substance use, genetics related to anxiety, psychosis, and ADHD—all that affects brain development as well,” said Sarah Mallard Wakefield, a forensic psychiatrist.

Bolding mine.

Seriously, read the whole article; it's not too long and definitely worth it.

Considering that we're in an aging society dominated by boomers, where ageism is real, I don't find it surprising at all.