Culture War Roundup for the week of April 20, 2026

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Opening up the discussion on Palantir CEO Alex Karp's 22-point manifesto (original twitter link), which is an excerpt from his new book The Technological Republic.

First impulses when reading through the list: it's melding nationalism and civic responsibility with some kind of tech elitism (?) and culture war critiques.

Anyways, here is what I got from it after thinking about it more and talking with various AIs (ChatGPT, Gemini, DeepSeek, Qwen):

  1. Stop criticizing the elites that actually do something (go away, we know what's good for you)
  2. The grunts should share more of the burden (you're not doing your part)
  3. Argues for inclusion, but implies there should be a defined national culture to be included in and assimilate into (this "minority" thought is good, that "minority" thought is bad) (btw, I will tell you which is which)
  4. Civic religion for everyone! (and remember to be nice to your tech priests)
  5. You all are fat and weak because of all this peace stuff (although I benefitted, you are going the wrong way, btw)

I think I am influenced because I am currently reading through Seeing Like a State, but I get the feeling that Alex Karp believes he is a leader in a vanguard of tech elites that knows what's best (even if many are distracted from the real issues right now) and everyone should listen and just follow this vanguard. Oh and throw in some "woe is me, only I can save the republic, they just don't understand me, so read my book because then you will".

I think this is an interesting view into the CEO of one of the most important companies. My impression of the man has decreased, and my concern has increased about the kind of leaders and elites that are brewing up within American society.

the CEO of one of the most important companies.

Citation needed.

Palantir is a very valuable company in the strict dollars and cents ... sense, but I don't know how "important" they are in the sense of a Ford, General Electric, US Steel, IBM, Standard Oil etc. Even within the post 2008 tech world, I wouldn't put them in front of Google, Facebook, Netflix, or the legacies-turned-cool-again Apple and Microsoft.

To shed some light on what Palantir actually does: they have a data "platform" that combines a bunch of open source technologies with their own tooling and integration layer. To be fair to them, this isn't something that anyone could vibe code. A lot of it is hard-won engineering knowledge.

Their greatest strength is their greatest weakness -- it's kind of a "do anything" platform. Which sounds fun and cool and amazing until you consider that it does nothing out of the box. A big BIG part of Palantir is a role called "the forward deployed engineer." This is a software engineer - a team of them, usually - that sits on site with customers and builds, within the Palantir platform, purpose-based "applications." Once the app is up and running, the Forward Deployed Engineers also, sometimes, try to "build back in" whatever they just built into the core Palantir platform.

Sounds confusing and kind of like a shitty way to do software development? You're not wrong. The Federal market loves this because it's how they've done software for ages -- by paying other people unending dollars to write it for them. The big Beltway Bandit firms like CACI, Booz Allen, Leidos, Deloitte Federal, and literally hundreds of smaller players do more or less what Palantir does, but with shittier marketing and without selling a required software license the way Palantir does. All the way back in 2016, this got so bad that Palantir SUED THE ARMY for not giving them a "fair shot" at a contract.

(Again, to be totally generously fair to Palantir, protests over contract awards are common and all large players will use them from time to time. I think actually suing the gov't, however, was quite unusual).

An interesting note about Palantir is that several of its current and former executives are very publicly prominent, especially in tech spaces like X/Twitter and the podcast circuit. You have Alex Karp, Shyam Shankar, and Trae Stephens (now at Anduril), among others. They capture a lot of attention and, frankly, a lot of what they say is smart and forward thinking. Still, you can't say they don't market themselves well. The cherry on top (crown jewel) is, of course, that Peter Thiel was an early Palantir investor and J.D. Vance worked for Thiel's investment company before running for Senate. In the good old fashioned DC tradition, a lot of Palantir's success has been because of Who They Know.


In terms of these culture-war-adjacent manifestos, I don't see how they make any sense from a risk/return perspective. Companies that get involved in culture war stuff often face blowback sooner or later without seeing much bottom- or top-line growth. If you're familiar with the hilariously tone-deaf "All In" podcast, you'll know there's a tradition among Silicon Valley types: because they're highly competent in one domain, they think they can easily use "first principles thinking" (what in the actual fuck?) to transfer that competency to another domain. Elon's DOGE experiment was his flirtation with this.

they think they can easily use "first principles thinking" (what in the actual fuck?) to transfer that competency to another domain

In fairness to people who are intelligent high-decouplers, we're generally correct when we make that observation. We're really good at basically anything that requires you to match patterns (because that's what intelligence is); that's why the middling members of this class are also especially paranoid about being replaced by a computer system that can do the same. When we take a step back and examine a system's inputs and outputs without being butthurt about the way things are [a person who can't is a "low decoupler", for short], and apply our reason to the way a system emerges from that, we call that "first principles thinking".

And yes, that means we do know better than you, about most everything, most of the time (and not internalizing this posture is destructive for us; it is a fail-state for us to ever communicate that sentiment to you, obviously, but I see no alternative here). But because of that, there are a bunch of challenges that - if not accounted for [the shorthand for this is "the human element"] - will end up causing more harm than good. One of the pitfalls unique to us is that we end up creating the 1 Corinthians 8 problem, where what we're doing is, from first principles, correct [and we know that - if we shut ourselves off from knowing it, or otherwise permit low-decouplers to dictate our morality for us, we self-destruct], but mere correctness isn't the only factor in a solution, or in what is acceptable to do or say when. INT outwardly resembles, but isn't actually fungible with, WIS. (And yes, it takes someone who has both to teach that, and yes, they are very rare. Accommodation is following God's example; He did not grumble while dragging the cross - the ultimate instrument of accommodation for humankind - and as such, neither will you.)

On a broader level, this is also kind of why different cultures end up with different perspectives on things; different starting conditions reveal different answers to different questions, and also create different problems. Which is also why we tend to be given to weeabooism and other weird/offensive nonsense; part of the appeal of spicy or particularly unpalatable food is enjoying the fact nobody else can eat it, and the same applies to certain kinds of information for the same reason. Of course, if you were a restaurant and made every dish that way, your restaurant would close and you'd end up serving nobody, unless you had a sufficient hard-core customer base that hung around enough to sustain you. (This is why 4chan is the way it is.)


You do ultimately have to accommodate the low-decouplers and the people who take time to come around to things. Which I think is why

A big BIG part of Palantir is a role called "the forward deployed engineer." This is a software engineer - a team of them, usually - that sits on site with customers and builds, within the Palantir platform, purpose based "applications." Once the app is up and running, the Forward Deployed Engineers also, sometimes, try to "build back in" whatever they just built into the core Palantir platform.

would probably be an effective strategy to combat/work with that type of person. You actually have to observe the customer, how they work, how they communicate, and how they think and reason, to turn Knows Better into effective service.

it is a fail-state for us to ever communicate that sentiment to you,

but mere correctness isn't the only factor in a solution, or what is acceptable to do or say when.

Have you ever thought that this, in itself, is a problem that needs to be solved?

Which is why the line after that is “there’s no other way to really say this”. If there was, I would have said that instead.

The problem with ‘you’re just going to have to trust me; what I’m doing is too hard for you’ is that you have to be right, both objectively and (as much as is otherwise possible) subjectively.

Most people don’t think about it as much or that way; they generally outsource what they think to others and match that (best case, those who that was outsourced to are following these rules and keeping things palatable for the average person, but that doesn’t help the people who have already mastered that part).

Which is why the line after that is “there’s no other way to really say this”. If there was, I would have said that instead.

By this, I meant that we should cower when speaking the truth and that the truth should be subordinated to acceptability. Isn't that a serious problem that needs urgent solving?

Actually, no; but I'd characterize it less as "subordinated to acceptability" and more as "the necessary translation layer to get as much of the truth across as possible".

You can't say the phrase "daily bread" to a people who don't know what bread is, so if you have something to deliver them you have to find words that do mean that and then say those.

This saves you, so that once/if you get to the part you can't do that for (specifically, at 7:19) - when your audience has to make the special effort to understand something, and you have to tell them which part to focus on, since they don't naturally know that - they spend, and are still willing to spend, that limited effort [and time] only on the irreducible/important part.