
Culture War Roundup for the week of May 12, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Honestly, I think the article does itself a disservice by not breaking the problem down into the two major but separate issues detailed below. Instead it bounces between them in an effort to stay engaging, but it's important to realize that these are largely separable problems. Both involve AI, but that's the extent of the overlap.

Problem One: Scientific research clearly indicates that the difficulty of and engagement with a task are directly related to how much is learned. Neuroscience shows that different parts of the brain are activated when performing "recall" rather than mere "recognition." Unfortunately, many students are unable to tell the difference! Recall is something like "tell me about this topic," where you work from scratch; recognition might be looking at your notes or a nice summary and going "oh yeah, that makes sense," or answering a multiple-choice question where you have plenty of cues to work with. Some have even argued that it's possible to create in-class notes that are too good at their purpose, "offloading" the work to an external knowledge-storage device, in a sense. The key point is this: not only is recall far more potent than recognition at getting information into long-term memory in the first place, but the more connections made during the learning process, the more likely the brain is to be able to retrieve that information from long-term memory later.

ChatGPT, in its most common use case, entirely "short-circuits" this process, depriving a student of the chance to form connections and develop the kind of "base knowledge" that could be helpful on less foundational topics later. This doesn't have to be the case - a good prompter might use ChatGPT to self-quiz, ask smart follow-up questions, or elicit deeper explanations that trigger more connections (ignoring hallucinations for now). But I think this kind of advanced usage describes only a small minority of college users. In short, this is the most serious problem for AI in college.

Problem Two: How important are essays, anyway? We can't escape the classic "calculator problem": remember plaintively asking your math teachers why you needed to learn this when a calculator or graphing software could do it just fine? That's a complicated question, and this one is too. A certain level of familiarity with numbers and how they work is critical for any kind of later applied math - not knowing your times tables can cripple your ability to engage with algebra - but frankly, some problems were absolutely designed to be deliberately difficult rather than to emulate any real-world situation.

So, essays. What good is an essay? Honestly, I think the evidence for them has always been a little hand-wavy and weak. Not only did virtually all humanities professors go way overboard on strict formatting in a misguided attempt to help students (I've seen horror stories where well-written essays got absolutely demolished by stupid rules like "you must exactly rephrase your thesis at the start of the conclusion"), but it's hard to tell whether the act of writing essays noticeably improves vague notions like "thinking critically." I might be behind the times on this particular area of research (if it even meaningfully exists), but it has always seemed to me that essays were crude attempts at prompting students to do plenty of recall via independent research and synthesis, thus increasing learning. But that was always an artifact of how difficult assembling an essay from scratch is - something clearly no longer difficult with AI.

Thus, the essay must die. Perhaps professors should ask for a wider variety of writing formats, more applicable to life. Perhaps the standard should shift to the end result of the writing - is it enjoyable to read, factual, and the right length and complexity? Perhaps live or oral assessments should be more prominent. Or maybe professors should focus on teaching smaller, more broadly useful tips about the writing itself, or even teach students how best to prompt an AI to assemble a piece of writing. Is there any evidence that writing essays actually increases the capacity to wield "critical thought"? I say no; if you want to teach critical thinking, you might as well attempt to do so directly rather than default to weak proxies like essays.

What good is an essay?

When you wanted to explain an idea you had to people you don't know, you sat down and wrote this essay. Maybe that's the joke, but in all seriousness, that is the good of an essay: it's a way of conveying your thoughts in a timeless, self-contained fashion.

Essays are also a way of helping yourself think. Have you read Paul Graham on essays? https://www.paulgraham.com/essay.html

Unless, that is, you are defining essays very strictly as 'five-part theses of twenty pages as written by humanities students'. I am quite prepared to believe that essay writing is taught badly.