Culture War Roundup for the week of June 16, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


The Blind Men and an Elephant

The parable of the blind men and an elephant tells of a group of blind men who have never encountered an elephant before and who learn and imagine what it is like by touching it. Each man feels a different part of the animal's body, but only one part, such as the side or the tusk. They then describe the animal based on their limited experience, and their descriptions contradict one another. In some versions, they come to suspect the others are being dishonest and come to blows. The moral of the parable is that humans have a tendency to claim absolute truth based on their limited, subjective experience, while ignoring other people's limited, subjective experiences, which may be equally true. (Summarized from Wikipedia.)

As someone who travels between cultures frequently, I find myself thinking a lot about this parable. Everywhere I go, different people in different places have developed different views and interpretations of the world, but the underlying fundamentals of reality remain unaffected by mere human perception and interpretation. In other words, the elephant remains the same regardless of the spot we’re poking at, rubbing against or cutting into.

I find myself reorienting what I experience and perceive from the viewpoint of my background and upbringing, shaped to some degree by my current context. When I meet new people, I compare them to the people I was raised around, my friends and family back home. When I try new foods, I orient them in relation to the foods I was raised with and am most used to. When I experience new weather patterns, I compare them to the climate of my birth. I am inextricably linked to the time and place of my upbringing.

I was raised in a chaotic home environment between divorced parents. My mother was very strict and had many rules, while my father was very lax and enforced very few. My mother raised me in the Protestant church, while I attended Catholic school for two years before being switched to public school in third grade. The inconsistency between Protestant, Catholic, and secular worldviews left me deeply disenchanted with competing narratives that each assert their own contradictory universal reality, and I remain suspicious of such claims today.

Artificial general intelligence could be capable of synthesizing the perspectives and contexts of every place and time into one universal viewpoint. Mapping out the elephant of the world with real objectivity seems more plausible than ever before. The self-assuredness of modernity and the arrogance of postmodernity (Fukuyama's end of history, for example) are likely to be dwarfed by the self-assurance of whatever newly synthesized panopticon of awareness an AGI could run on.

But would an AGI be capable of synthesizing every view of the elephant into one accurate rendering of reality at all, or would it merely be able to switch from one perspective to another? The Japanese conception of reality works well enough in the Japanese context, and my limited exposure to it is amusing enough to me as an outsider, but start poking at it a bit and the construction begins to fall apart. We Westerners are just as bound by the false or skewed construction of the Western viewpoint, whose limits and contradictions are difficult for us to perceive.

I wonder if AGI will be a Tower of Babel of sorts, one that gives the illusion of unity and progress but ends up dividing us further than ever before.

Actually, the thought of a universal synthesized view of the world is what frightens me most, because it is so utterly foreign to anything we've ever come up with ourselves. Either we will discover things we never wanted to know about ourselves and the universe, or we will fail to discover those things and create an even more dystopian world, one that further reinforces the skewed, convenient beliefs I suspect we already build our societies on.

——

Many people on the right believe that right wing thinking is fundamentally the position of believing in the power to change things: the power to make different decisions, free will, and so on. But in my years of reading right wing thought, the concept that feels most foundational is the idea that nature remains constant. That is, that the elephant remains the elephant regardless of our interpretation. This is the most reassuring concept to me in right wing thinking: that I don't need to make the Sisyphean effort to rewire my reaction to things outside my control, that I can just accept them as immutable forces of nature and move on with my life. I also think this is a more loving, understanding view of the fundamentals of reality than the left's struggle to undermine them.

Your sense of suspicion at all these competing narratives reminds me of my own experiences, though perhaps my upbringing was even more atypical. I grew up in Malaysia and came into contact with many parts of the culture there, but was raised by parents who'd spent time in Britain and who homeschooled me in a very different environment than most other people would ever experience. I knew people who were staunchly Christian and prayed often to the Lord, de facto Taoists who actively made offerings to spirits and arranged their homes in line with feng shui geomantic principles, staunch atheists who somehow still clung to hints of superstition here and there, and so on. When I was sixteen, I moved to Australia (where I now live) and interacted with yet another cultural milieu.

A consequence of this strange muddled background is that I don't really feel kinship with any way of thinking and virtually never identify with any major group or subculture, so there's this persistent tendency for me to feel like an alien wherever I go. I travel for fun a lot and come into contact with many people from different cultural backgrounds, and it often seems like the ways they mentally structure and interpret the world are completely incongruent with each other.

Anyway, reality (at least from our perspective) isn't so much an elephant as it is a Necker cube. There are two possible 3D interpretations of the cube, but we can't see both at once: the interpretations are so diametrically opposed that it's impossible to hold both orientations in our minds simultaneously. Yes, there is the problem of the Elephant, where different people come across different information about the world and draw different conclusions on that basis, but even when everyone agrees on the fundamental factual points of contention, there is inevitably going to be subjectivity in how one puts them together and fits them into an internal narrative of the world.

Oftentimes we don't have direct access to seemingly simple things like cause and effect (insert quote about how all science is actually just correlation here), and even the same data points can lead to wildly different understandings of the world depending on the system interpreting them. At this point we can model quantum phenomena very well, but what those models actually imply is untestable and completely beyond us. So much of what we know about reality lies on the surface of a black box. We don't, and perhaps never will, have direct access to many aspects of how things work, and until that changes I suspect it will be like the Necker cube: analysing a 3D object through the lens of a 2D plane, and debating how it's actually oriented. There is a capital-T Truth out there, but whether it's accessible to us is another question.

That's before we can even get into things like moral outlook, which... well, that's a crapshoot. Hume's is-ought problem remains intractable today. I seriously doubt an AGI would be able to synthesise many aspects of worldview together as a result; there will always be big Unknowns (in more domains than people think, IMO) where all we can do is gesture at an answer.

moral outlook

Actually, I find this to be the most universal piece of the puzzle, beyond any more objective measurements. For example, half the world drives on the right and half on the left, but the moral fundamentals beneath which side of the road you personally decide to drive on are universal regardless: you choose depending on whether you want to safely reach your destination or create chaos and accidents around you. The moral goals and the is-ought problem lead to the same or similar results whether you drive on the right in America or on the left in the UK. That is a simple example for illustration's sake, but I believe most problems follow this pattern as well. Treating people kindly and with love and trust is always the solution to any is-ought problem in any culture I've been to, because it absolves you of the guilt of having acted unkindly or unlovingly, and if someone interprets it incorrectly, it is not because your underlying intentions were wrong. Maybe this is too much of a consequentialist view that collapses morality too far into the mind of the actor, but again we arrive at the uniqueness of the self's actions apart from any others, which could potentially be overcome in an artificial, universal, consolidated worldview.

Other than that, I agree with everything you said and relate to your experiences as well. I agree that we each individually lack the ability to fully describe the capital-T Truth, but a general AI with vast knowledge and sources of data, interpreted outside the frame of any one individual, would either be a step toward a new integrated model of understanding or perhaps just the false appearance of one.

Treating people kindly and with love and trust is always the solution to any is-ought problem in any culture I've been to, because it absolves you of the guilt of having acted unkindly or unlovingly, and if someone interprets it incorrectly, it is not because your underlying intentions were wrong.

This is only true if you tautologically define the term "kindly and with love and trust" to contain all of the complexities and nuances of the broader "is-ought".

Is it kind, loving, and trustful to lock your house or your car? Well, it's kind to the people inside the house, less kind to the thieves that want your stuff.

Is it kind, loving, and trustful to guard your wallet from pickpockets in a crime-ridden area and stop one if you catch them mid-theft?

Is it kind, loving, and trustful to punish someone for a crime? You can argue that it's kind to the victim but unkind to the perpetrator being punished. Or you can make a complicated argument about how it's ultimately "kind and loving" to the perpetrator because the punishment will help them learn the error of their ways and become a better person, which will ultimately be for their own good.

I'm not saying generally acting with kindness, love, and trust is wrong. They're good guidelines to look to when trying to ground your decisions, but those words alone do not automatically solve all of the potential ethical dilemmas and tradeoffs inherent in the complexity of the real world.