This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Opening up the discussion on the Palantir CEO's 22-point manifesto (original twitter link), an excerpt from his new book The Technological Republic.
My first impulse reading through the list: it's melding nationalism and civic responsibility with some kind of tech-elite-ism (?) and culture war critiques.
Anyways, here is what I got from it after thinking about it more and talking with various AI (ChatGPT, Gemini, DeepSeek, Qwen):
I think I am influenced because I am currently reading through Seeing Like a State, but I get the feeling that Alex Karp believes he is a leader in a vanguard of tech elites that knows what's best (even if many are distracted from the real issues right now) and everyone should listen and just follow this vanguard. Oh and throw in some "woe is me, only I can save the republic, they just don't understand me, so read my book because then you will".
I think this is an interesting view into the CEO of one of the most important companies. My impression of the man has decreased, and my concern has increased about the kind of leaders and elites that are brewing up within American society.
Citation needed.
Palantir is a very valuable company in the strict dollars and cents ... sense, but I don't know how "important" they are in the sense of a Ford, General Electric, US Steel, IBM, Standard Oil etc. Even within the post 2008 tech world, I wouldn't put them in front of Google, Facebook, Netflix, or the legacies-turned-cool-again Apple and Microsoft.
To shed some light on what Palantir actually does: they have a data "platform" that combines a bunch of open source technologies with their own tooling and integration layer. To be fair to them, this isn't something that anyone could vibe code. A lot of it is hard-won engineering knowledge.
Their greatest strength is their greatest weakness -- it's kind of a "do anything" platform. Which sounds fun and cool and amazing until you consider that it does nothing out of the box. A big BIG part of Palantir is a role called "the forward deployed engineer." This is a software engineer - a team of them, usually - that sits on site with customers and builds, within the Palantir platform, purpose-built "applications." Once the app is up and running, the Forward Deployed Engineers also, sometimes, try to "build back in" whatever they just built into the core Palantir platform.
Sound confusing and kind of like a shitty way to do software development? You're not wrong. The Federal market loves this because it's how they've done software for ages -- by paying other people unending dollars to write it for them. The big Beltway Bandit firms like CACI, Booz Allen, Leidos, Deloitte Federal, and literally hundreds of smaller players do more or less what Palantir does, but with shittier marketing and without selling a required software license the way Palantir does. All the way back in 2016, this got so bad that Palantir SUED THE ARMY for not giving them a "fair shot" at a contract.
(Again, to be totally generously fair to Palantir, protests over contract awards are common and all large players will use them from time to time. I think actually suing the gov't, however, was quite unusual).
An interesting note about Palantir is that several of its current and former executives are very publicly prominent, especially in tech spaces like X/Twitter and the podcast circuit. You have Alex Karp, Shyam Shankar, and Trae Stephens (now at Anduril), among others. They capture a lot of attention and, frankly, a lot of what they say is smart and forward thinking. Still, you can't say they don't market themselves well. The cherry on top (crown jewel) is, of course, that Peter Thiel was an early Palantir investor and J.D. Vance worked for Thiel's investment company before running for Senate. In the good old fashioned DC tradition, a lot of Palantir's success has been because of Who They Know.
In terms of these culture war adjacent manifestos, I don't see how they make any sense from a risk/return perspective. Companies that get involved in culture war stuff often face blowback sooner or later without seeing much bottom or top line growth. If you're familiar with the hilariously tone-deaf "All In" podcast, you'll know that there's a tradition of Silicon Valley types thinking that because they're highly competent in one domain, they can easily use "first principles thinking" (what in the actual fuck?) to transfer that competency to another domain. Elon's DOGE experiment was his flirtation with this.
In fairness to people who are intelligent high-decouplers, we're generally correct when we make that observation. We're really good at basically anything that requires you to match patterns (because that's what intelligence is); that's why the middling among this class of person are also especially paranoid about being replaced with a computer system that can do this. When we take a step back and examine a system's inputs and outputs without being butthurt about the way things are [or "low decoupler" for short], and apply our reason to the way a system emerges from that, we call that "first principles thinking".
And yes, that means we do know better than you, about most everything, most of the time (and not internalizing this posture is destructive for us; it is a fail-state for us to ever communicate that sentiment to you, obviously, but I see no alternative here), but because of that there are a bunch of challenges that- if not accounted for [the shorthand for this is "the human element"]- will end up causing more harm than good. One of the pitfalls unique to us is that we end up creating the 1 Corinthians 8 problem, where what we're doing is, from first principles, correct [and we know that- if we shut ourselves off from knowing it, or otherwise permit low-decouplers to dictate our morality for us, we self-destruct] but mere correctness isn't the only factor in a solution, or what is acceptable to do or say when. INT outwardly resembles, but isn't actually fungible with, WIS. (And yes, it takes someone who has both to teach that, and yes, they are very rare. Accommodation is following God's example; He does not grumble dragging the cross- the ultimate instrument of accommodation for humankind- and as such, neither will you.)
On a broader level, this is also kind of why different cultures end up with different perspectives on things; different starting conditions reveal different answers to different questions, and also create different problems. Which is also why we tend to be given to weeabooism and other weird/offensive nonsense; part of the appeal of spicy or particularly unpalatable food is enjoying the fact nobody else can eat it, and the same applies to certain kinds of information for the same reason. Of course, if you were a restaurant and made every dish that way, your restaurant would close and you'd end up serving nobody, unless you had a sufficient hard-core customer base that hung around enough to sustain you. (This is why 4chan is the way it is.)
You do ultimately have to accommodate the low-decouplers and the people who take time to come around to things. Which I think is why something like the forward deployed engineer model described above would probably be an effective strategy to combat/work with that type of person. You actually have to observe the customer, how they work, how they communicate, and how they think and reason, to turn Knows Better into effective service.
The question isn't whether we know more than "you" (i.e. the man in the street - the idea that Silicon Valley-based Motteposters have more intelligence and rationality than other Motteposters is straightforwardly silly); it is whether we know more than relevant domain experts.
How long it takes for a smart generalist to come unstuck is notoriously a measure of how legitimate a field of knowledge is, and there are plenty of legitimate fields of knowledge outside the core competence of "tech" - most obviously all the non-software engineering disciplines. Before Musk founded a rocket company, he found as many smart rocket guys as he could and listened to them. When he bought into and refounded a car company, he hired car guys and listened to them. But there is legitimate subject-matter expertise to be had outside STEM. When Musk took over large parts of the US government, he didn't bother to talk to people who understood governing, and DOGE came unstuck - and not just, or even mostly, because Musk was too autistic to maintain public and political support for what he was doing per @ThisIsSin, but because he was taking an approach (what P J O'Rourke would call "balancing the budget by cutting helium funds") which everyone who understands the budget knows can't work because the math doesn't math.
Sounds like Musk didn't apply engineering first principles to his DOGE initiative. And as you pointed out, it takes time for a generally smart person to become an expert in a field, but I would like to point out that it all starts with first principles, and that becoming a domain expert always proceeds by applying first principles to one's given interests.
Personally, my explanation for why Musk deviated from his successful formula is that Musk was ideologically captured, had increased drug use, and couldn't keep a cool head.