Culture War Roundup for the week of October 27, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Polarizing figures like Musk often doom their own projects to niche appeal by the very fact of their involvement, for one. Nearly any other mainstream tech-famous figure would have far more cachet right away, as would even a determined but unknown media whore. This matters not just for attracting users to the site and retaining them (obviously important - see the failure of Truth Social), but also because, at the current state of AI, you functionally need human volunteers to supervise said AI, and so you want to cast that net widely. You want curious and motivated people, not tech castoffs with an axe to grind against the “establishment”. Making an encyclopedia is foundationally an establishment thing to do anyway; the two ideas are not very compatible. Wikipedia’s faults are in execution, not a flaw in the core mission or even necessarily in its processes. One reason all challengers have failed is that they attempted to reject that - too-similar projects have their oxygen stolen by the more mature free product. But that’s obviously not a concern for an AI encyclopedia, which is a novelty in and of itself, and at least theoretically could offer some things Wikipedia cannot.

And don’t get me wrong: given the recent history of Grok models, not only would Grok need a lot of hand-holding, it’s quite possible that even with said help it would be flatly incapable of producing an acceptable final product. Some smart engineering might allow current-generation models to achieve some sort of success, but that’s again a case where the engineering is often the point, not the final output. As an example, it would be genuinely interesting to see whether a horde of slightly differently tuned and varied models could produce an emergent AI “wisdom of the crowds” equivalent, or whether they would get stuck in certain failure states. Musk gets this paradigm all wrong, because he is plainly treating the project both as advertising for his specific shitty model and as a partisan vehicle to launder his sociopolitical complaints into greater coherence or acceptability. Neither direction is sustainable, on multiple fronts.