This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I wrote a much better comment a few minutes ago, but one of the cats I'm fostering because my girlfriend foisted them upon me jumped on my keyboard and deleted it. So I apologize in advance if this is a low effort comment.
I think that the distrust of experts on this site goes way too far. 99% of the topics experts agree on, or that you find on places like Wikipedia, are true. If you look up something like the Central Limit Theorem on Wikipedia, the answer will be more or less correct. But most things are boring. The controversial ideas we focus on and don't trust the experts on are the ones that cause experts to lose their minds and lose the ability to be impartial. Some examples are HBD and Covid. But if you open up a biology textbook, you can take most of that knowledge to the bank.
I want to give an example of this guy I know who worked at Best Buy with me in college. He is a Muslim guy and the elusive moderate Muslim: more or less progressive on every topic. I saw him recently at a tech meetup in Austin and he sounded more or less like a straight-up Jihadist. And I helped this guy get a job as a project manager at a major networking company after he got his law degree, so I can confirm I thought he was a rational and trustworthy person. Which he is, except on the Israel-Palestine issue. He literally can't be rational about it. I think for "experts", this is the same thing. They literally can't be rational on a few issues, and it causes them to act insane and makes people lose trust in institutions.
I'll give a less controversial example. I have a CS degree and I worked for a company that sold software that helped people automate things. We'd get this guy on a call with potential customers after the sales people and sales engineers did their thing, and he would just shit on Azure and AWS and go on about what he could do if they switched to Linux and open source, and the customers hated it. I had to pull him aside and say: dude, we make software that works with Azure, wtf are you doing? He was incapable of putting that hammer and nail away. Who gives a shit if a company uses Microsoft? But he literally couldn't be rational about it.
I think a lot of people default to something similar to Foucault's theories on knowledge and power, where knowledge and power are so linked that they end up being essentially the same thing. I completely agree with him, and I think power and knowledge combine to influence, manipulate and create NPCs that don't think. But in the case of experts, I think it is their biases causing this top-down gaslighting rather than anything coming from the regime. The simplest and most likely answer is that these people just believe this stuff due to ideology and are incredibly biased on hot culture war issues. It's not a conspiracy; they literally just can't think about these issues rationally.
I don't have (and don't have the expertise to make) a biology textbook example written up, but I do have a pretty serious nuclear science one written up for you, where a man with no serious bias and far greater technical knowledge than I can legally get still has bizarre faults that I can verify. (If Atomic Accidents doesn't count as a textbook, Radium should, and it's probably the source of one of Mahaffey's miscites.)
In one sense, you're not wrong -- if you scrawled every single fact mentioned in the book onto sticky notes, and put the ones that were strictly true on one side of the scale and the ones that were strictly false on the other, the scales would lean toward truth. But if you looked only at the ones that are core claims for the book's theme, and then separated those sticky notes into the ones you can confirm and the ones you can't... truth might still win out, but it wouldn't be a lopsided victory.
And this gets worse the less concrete the topic. The problems for psychology are legendary (and often hilarious), and even the strictest focus on textbooks has pretty sizable faults. Nutrition science is a joke, both in academia and on Wikipedia. We've hit the point where the literal definitions of words get redefined for political relevance.
It also gets bad where a lot of the 'experts' aren't. That's most obvious for the political stuff, where the same people who bash randos for 'doing their own research' will take a long stiff drink of water before opining on topics where their expertise is limited to having gone to journalism school and slammed too much alcohol down there. Yet there are fields where the experts and textbook writers are plain liars and no one cares, because it's not going to end up on the news even if it does end up on television, or where they leverage training in one field as expertise for an entirely different one.
But even for material science, as concrete as can be, there's a lot of stuff that's just a mess; not that anyone's lying, but that they genuinely don't know. I've got an essay I've been working on, but the punchline revolves around this stuff. It's been around for a decade, has no Wikipedia page, won't be in any textbooks, and is the subject of literally millions of dollars in grants for analysis of a material that can be mixed in a garage, and I don't know if it 'reals' or not, or to what extent the claimed benefits are of the claimed magnitude. There are a lot of red flags in all the nano-whatsis stuff, and especially the stuff around this material, so it's weird for the DoE and Argonne and a dozen other labs to be looking at it seriously. But they've also all been looking at it seriously and not publishing a ton, for a material that you'd expect to see in cars and boats and household electronics.
The problem isn't a lack of people I could point to who would be more serious investigators of this than myself. It's that they don't have an answer, or to the extent they do, they have a half-dozen different ones.
The occasional news item about a fraudulent researcher just reinforces the idea that scientific malpractice consists of a tiny number of evil researchers who clearly violate scientific standards by fabricating data and that all other researchers do a great job.
In reality, most bad science consists of fairly subtle manipulations or bad practices like p-hacking, tiny data sets, misrepresenting the actual findings, measuring the wrong thing, etc. Much of this is due to incompetence, where the researchers get taught 'this one weird trick' which is good enough to get their papers accepted, but without actually understanding the strengths and weaknesses of their method(s).
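To make the p-hacking point concrete, here's a minimal simulation sketch (my own illustration, not from the comment above; the sample size and the twenty outcomes are arbitrary choices): a "study" with no real effect at all, where the researcher measures many outcomes and reports only the best p-value.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def p_hacked_study(n_per_group=30, n_outcomes=20):
    """Simulate one null study: both groups come from the same distribution,
    but twenty different outcomes are tested and only the smallest p-value
    is 'reported'."""
    p_values = []
    for _ in range(n_outcomes):
        a = rng.normal(size=n_per_group)
        b = rng.normal(size=n_per_group)
        p_values.append(stats.ttest_ind(a, b).pvalue)
    return min(p_values)

n_studies = 1000
false_positives = sum(p_hacked_study() < 0.05 for _ in range(n_studies))
print(f"'Significant' findings with no real effect: {false_positives / n_studies:.0%}")
# Roughly 1 - 0.95**20 ≈ 64% of these null studies come out 'significant',
# versus the nominal 5% you'd expect from a single honest test.
```

Nothing here is fabricated data in the criminal sense; it's just selective reporting, which is exactly why it's so much harder to catch than outright fraud.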
This incompetence is fueled by the scientific reward system, which rewards those who do bad science and punishes those who do good science (limited by the ability to get away with BS, which is why fields like physics are a lot better: engineers and the companies that employ them call out scientists when they can't build working things based on the scientific discoveries).
Convincing people whose worldview is based on trusting that our elites take good care of us, guided by mostly solid science, that science is fundamentally broken and most money spent on it is wasted, is quite hard though.
Perhaps, if every fraudulent researcher actually became a news item, or if academic research appropriately handled the well-established cases as they're discovered.
This is an old post, but I'll highlight it for three reasons: you've never heard of the researcher (and I'd never heard of the entire university), it hit a 'hard' science field, and (most unusually) the publisher explicitly and publicly said they weren't going to treat fraud as fraud where usually that's just decided privately.
I can only demonstrate clear intentional fault in a small portion of all papers and don't know how prevalent this is; that's fair. No one knows how prevalent this is. Attempts to discover research fraud occur almost entirely at the hobbyist level, and the people trying to catch that overt fraud are dependent on tells like Photoshop goof-ups or division errors, with only rare opportunities to see the raw data. Ariely only came to light after failed replications, followed by the man sending over Excel spreadsheets with the fakest data imaginable. There's basically zero institutional interest in discussing even the highest-profile and most explicit fraudsters. It's not that we're only seeing the crashes; we're only seeing the crashes that happen in the middle of the city, after which the pilot steps out of the aircraft and recites a five-stanza poem about how they mismanaged the flight. We don't know if the fraud is extremely rare or if it's as common as the bad-but-not-fraudulent science.
I agree with and recognize that a lot of people have been trained to do bad-but-not-fraudulent science. I'll caveat that this division isn't always so cut-and-dry -- Wansink is my go-to for salami-slicing, but there's some evidence at least a couple of his studies depended on fabricated data rather than 'just' p-hacking -- but it is relevant to keep in mind.
As I perhaps mention too often, you can come at it from the other end as well. The scientific method, as commonly understood, should preclude the creation and maintenance of entirely fictitious fields of study. If such fields can be observed to exist, that's strong evidence that the process as a whole is fundamentally broken, even if you can't identify the specific steps where the problem lies.
There is actually no common understanding of the scientific method. What Judith Butler does is not comparable in any meaningful way to the work of Ferenc Krausz (and his team). Yet they both claim to do science, even though the work of one of them is not falsifiable.
Part of the charade that allows these nonsensical fields and subfields to exist is the claim that all the professors who work at universities do proper science, even when they do no such thing.
And the ideal scientific method is aspirational anyway. In reality we cannot achieve that perfection even for physics. When it comes to fields that do not provide the preconditions for applying anything close to the ideal scientific method, people simply use less rigorous methods. Until you get to Judith Butler, where claims just get conjured up with bad logic, misrepresentations of what others actually proved, etc.
And like gattsuru says, there is a disturbing lack of interest by institutions (and voters) in even figuring out how well those who call themselves scientists actually do their jobs. Researchers with good morals who do look into it invariably find highly disturbing results.