Culture War Roundup for the week of May 19, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

There has been a lot of hyped news in robotics + AI lately, as the AI updates just continue to come at a blinding pace. From Tesla/XAI we have the Optimus robot, and I can't tell whether it's a major breakthrough or just another marketing splash driven by Elon.

On the other side of the fence, you have Nvidia releasing an open foundation model for robotics and partnering with Disney, of all companies, to make a droid robot.

You also have Google's I/O, which I haven't had the energy to look into.

With the speed of AI updates and the hype wars, it's always hard to tell who is actually advancing the frontier. But it does seem that robotics in particular is advancing quite rapidly compared to even a couple of years ago. Personally I think that while automating white-collar work is useful and such, AI entering robotics will be the real game changer. If we can begin to massively automate building things like housing, roads, and mass-manufactured goods, all of a sudden we get onto an explosive growth curve.

Of course, this is where AGI doomer fears do become more salient, so that's something to watch out for.

Either way, another day, another AI discourse. What do you think of this current crop of news?

It is crazy to me that most people alive today will be around to see how this - this journey of civilization, this grand process of technological development - ends, or at least moves far, far beyond us. There is a millenarian tension in the air. Paradise or extinction (at least for most people), it seems increasingly clear it will be one or the other.

I've been feeling this vibe lately. Humanoid Robots, Starship, and LLMs are the three things that make this feel more like the future than it ever has.

We are SO CLOSE. To being multiplanetary, interacting with computers that pass the Turing test for 95% of the population, and finally being able to own a golden retriever without having to sweep every single day.

But it all seems to be hanging on the edge of a knife. Our governments are so obscenely powerful, people are so scared and stupid, weapons continue to become so much more lethal.

At the end of the day I'm just in wait-and-see mode. Whatever happens happens. I don't think I grew powerful enough to meaningfully affect the outcome in time, so might as well worry about what I can control.

so might as well worry about what I can control.

It feels like a triage problem, doesn't it?

When your emergency center has too many victims to work on them all right away, you quickly assess them all and mark each person with one of 3 (in the original "tri"-age) tags: one group is going to live without your help, one is going to die regardless of your help, and one is borderline enough that they'll die without your help but live with it. You don't help the victims who need the most help, you help the victims where your help does the most good.

There's a bunch of wild possible futures where humanity dies out regardless of what I do, and there's a bunch of wild possible futures where humanity becomes so rich that we all end up fine no matter what I do. I might as well continue to focus on the more mundane possible futures that fall in between those extremes, even as the in-between category (which once felt nearly certain) becomes less and less likely, because the in-between futures are the only ones where my actions would have made a difference.

It is crazy to me that most people alive today will be around to see how this - this journey of civilization, this grand process of technological development - ends, or at least moves far, far beyond us.

Maybe you're right, but there are plenty of times in history where people have felt this way before, and most of those examples I can think of are from long before I was born. Most recently, the Atomic Scientists would have you believe we've been on a knife-edge of nuclear armageddon -- maybe they're right, but it never materialized during the Cold War. Or you could go back and look at any number of doomsday cults, even including early Christians anxiously awaiting Christ's return in their lifetimes.

The pattern has held long enough that I'd personally discourage making any huge life changes on the assumption that the future won't matter.

maybe they're right, but it never materialized during the Cold War.

I mean, if they were right, then in most timelines, a ton of humans died/were never born, and thus it is just more probable that you were born into the timeline where we narrowly avoided the doomsday scenario.

Or you could go back and look at any number of doomsday cults, even including early Christians anxiously awaiting Christ's return in their lifetimes.

My issue here is that we can see and interact with the 'messiah' this time. There are compelling arguments for why it will keep getting smarter. And if it gets smarter, there are plausible ways it can wipe out decent swaths of humanity.

I will grant that it is almost impossible to take anyone completely seriously because in both the scenario where we get Utopia AND the scenario where we get annihilated, nobody will care about the accuracy of the predictions that led up to it, so the incentive to be truthful and honest is minimal.

Just be cautious about normalcy bias, when things have been getting rapidly weirder for a while now.

the incentive to be truthful and honest is minimal.

Except to the extent we can avoid doom through correct perception and action.

Yeah, a lot of contradictory thought patterns emerge if I ruminate about the future more than like 2 years out.

Should I live as conservatively, frugally, and healthily as possible to ensure I make it there in good shape, or should I be more reckless and try to enjoy life as much as possible since it could all end? (Obviously if EVERYBODY does the latter, we might not make it there at all.)

Assuming we survive, are we bound for a future of exploring the stars and colonizing new worlds, or do we get stuffed in VR experience machines that satisfy every psychological desire we have without going anywhere? Will I even have a choice?

Is there any point in breathlessly following every notable development in the AI/Robotics space to try and guess when the big moment will arrive, or would it be more constructive and mentally healthy to divorce almost entirely from it and just read escapist fiction all the time so I don't worry about something I can't really control?

Should I continue to behave as though I expect society to persist into the next century and thus be very concerned about e.g. birth rates, pollution, governments' fiscal policies, and/or immigration policies? Or does none of this matter in 10-15 years, and thus should I just do the bare minimum to keep things running and, hey, let the kids do what they want in the meantime? The AI can fix the mess later.

It is in my nature to prepare, both mentally and financially, for things to go south. I don't buy the hype and promises uncritically, but I can't deny that every 6 months for the past, what, 3 years, the SOTA models have demonstrated new capabilities that check another box off my "is it smarter than humans?" list. The temptation to just give up 'trying' and go with the flow is strong.

A bit of optimism: I do believe that I'm young and healthy enough that I'm likely going to be around when we reach Longevity Escape Velocity. Even if the AGI stuff never fully manifests, we've got all the pieces to fix most age-related problems in humans so as to give us functional immortality by 2050. Which will create a whole host of new and exciting issues if the AGI isn't already in charge.

I'm not entirely sure I believe this but the bear case isn't that hard to articulate.

We hit the top of the S-curve for LLMs and they merely become very useful tools instead of ushering in the singularity or obviating human labor. Frontier research starts stagnating as a result of having picked all the low-hanging fruit / the competency crisis / failure of higher ed / loss of state capacity / pick your favourite boogeymen, and life in 2050 looks about as similar to 2025 as life in 2025 looks to 2000. This hypothetical world is probably worse for the median westerner compared to 2025, owing to some combination of immigration / climate / financialisation etc., and the path of civilization doesn't look great without some sort of technological salvation, but it's unlikely that any of these issues will be back-breaking in one generation.

Re preparation: it's interesting to me that a lot of people's reaction to millenarianism is to make life choices that would be very dumb in any world where Nothing Happens. Assuming you're already reasonably happy with your current life, quitting your job and liquidating your savings seems to me to have low upside (how much happier is burning all your bridges to be a hedonist really going to make you?) and very high downside (I can see lots of worlds where having a decade or more of savings gets you through the worst of AI societal upheaval, and of course if Nothing Happens your life is knocked significantly off-track).

This is kind of where I'm at personally. I still save a large amount of my income even though I'm not quite sure life will look the same in 25 or 50 years. In any world where it does look similar I'm set; I'm more prepared for any medium-level scenario where investments are still useful in getting through the transition period or retain relevance post-transformation; and in worlds where life becomes so good or so bad that property and index funds become worthless, there's really nothing I could have done anyway. Liquidating everything to do drugs or travel doesn't seem like it has a great ROI when I'm already broadly content with my current life.

As a side note: why are you bullish on LEV? It's my understanding as a complete medical layman that we've pretty much made zero progress on life extension. We're much better at keeping the very young, mothers in childbirth, the unlucky (genetic diseases, trauma, infection, disease etc) and the ill-and-probably-should-be-dead elderly alive, but modern technology hasn't really meaningfully moved the quality-adjusted lifespan of the average healthy person afaik.

As a side note: why are you bullish on LEV?

Keeping up with the literature, it seems very much like the 'code' of why aging occurs/affects us the way it does has almost been cracked. In short, the information that our cells use to reproduce starts to accumulate errors from both internal and external causes, and their ability to repair those errors diminishes in kind (the more errors to repair, the more strain on the repair system). This leads our cells to A) become cancerous, B) become senescent (nonfunctional but still 'active'), and C) change/mutate into a different type of cell, which obviously isn't helpful.

Eventually this cascades to full organ failure, and we die.

i.e. the science seems to 'know' the reason we 'get old.' The systems behind it are becoming better understood, and now the hunt is on for various methods or drugs or therapies that can trigger or reinforce natural repair systems or otherwise keep the cells reproducing accurately for much longer.

This is the actually tricky part, but a LOT more interest in this area has led to increased funding. It does seem likely that a couple of silver bullets might emerge in the near future.

There's the obvious question of "where are the immortal mice?" And I think that's probably the thing that gives me the most doubt. If there's a surefire solution, then labs should be able to demonstrate it by pumping some mice full of it and showing that they just don't die naturally.

But watch out for interventions to extend canine lives. There's clearly something brewing.

And of course, "where are the immortal Billionaires," who could obviously afford any treatment they want, regardless of how experimental or illegal? Although I'd certainly suggest that the Billionaires just hitting their 50s and 60s these days are looking less decrepit than usual.

And I want a Goddamn explanation for how Tom Cruise is still hanging from airplanes in his early 60s that doesn't invoke Thetans.

However, I am reserving some bearishness for the possibility that the whole field is suffering from the current scientific crises where p-hacking, fraud, and failed replications are running rampant. For instance, studies of Blue Zones, where extreme human longevity seems to be more common, seem likely to suffer from poor record-keeping, which is to say we can't be sure anyone is really as old as they say.

And that means the information gleaned from studying them will be inherently flawed. This might have ripple effects on the field's validity if its model of 'extreme' longevity (and thus the metrics being chased) is on shaky ground.

But the motivation to solve this issue is huge, and AI drug discovery is already a thing, so I'd expect some breakthroughs to emerge relatively soon. Maybe we get those immortal mice.

And in the meantime there are definitely a number of smaller interventions that, when done consistently, can up your chances of staying healthy long enough to survive until aging reversal becomes feasible.

Interesting, thanks.

The current evidence seems to align with my preconception that absolutely nothing has happened so far for humans, although I wasn't aware of that dog trial, which does seem promising. Perhaps it's true that AI will lead to further innovation in the space, but personally I'd at least like to see some immortal mice before I start hoping to overcome the human condition.

It's not that 'absolutely nothing' has happened, but more that every advance has been marginal, so even if you follow ALL the best advice, you're getting an extra 10-15% of lifespan at best.

If you want to see the absolute extreme limit of human longevity science, follow Bryan Johnson.

Should I continue to behave as though I expect society to persist into the next century and thus be very concerned about e.g. birth rates, pollution, governments' fiscal policies, and/or immigration policies? Or does none of this matter in 10-15 years, and thus should I just do the bare minimum to keep things running and, hey, let the kids do what they want in the meantime? The AI can fix the mess later.

I have the same doubts. It’s hard not to care, because for now the problems still exist. It will take solving them to end those concerns.

As for the rest of life, we brought some things forward and are probably living a little faster. There are things I want to do and experience, but most are regular life milestones. Going full hedonist and spending all the money / becoming a drug / sex / gambling / food addict doesn’t seem to make the people who do it happy, end of the world or not.

It’s hard not to care, because for now the problems still exist. It will take solving them to end those concerns.

Add on that there are so many people I care about who are just living life without much awareness of what appears to be on the horizon... and it feels literally impossible to explain to them why they should perhaps care a bit about what we're seeing. There are so many disparate chapters of lore I'd have to catch them up on before they could see the whole picture like I do that I'd look like the crazed conspiracy theorist with red thread until they finally got up to speed and it clicked for them too... if it ever does.

Bit lonely being unable to bridge the gap on a topic that I find important. Hence why I'm here.

There are things I want to do and experience, but most are regular life milestones. Going full hedonist and spending all the money / becoming a drug / sex / gambling / food addict doesn’t seem to make the people who do it happy, end of the world or not.

There's wisdom in that, but I can think of certain things I could be indulging in that would ABSOLUTELY make me happier, and I would do more of them if my time preference were about 5-10% higher.

One thought that has sprung forth recently: if I quit my job and sold my house and everything in it, I could afford to buy a decent camper van and then take a year, maybe two, to drive around the country with my dog. And why not? It won't hurt anybody, and I'll rack up a pretty fulfilling experience that will take my mind off the pending event. And that's without touching my own (modest) retirement savings. Which reminds me: what the FUCK am I supposed to do with a 401(k) as someone who is under 40?

Seriously, although I understand the benefit of having a money stash that you can't easily touch, the idea that I will want to keep adding to this pile of money that I will be unable to draw from until I'm in my 60s feels farcical under current expectations. Like, I just do not believe that the future is one where I diligently tap away at a series of steady jobs, watch my savings grow over a couple of decades, and then have to draw on that money in old age for a peaceful retirement.

Can someone lay out the path to 2050 where the most likely outcome is that the market grows about 5-7% every year on average, we don't have a debt crisis or catastrophic event, OR an AI-fueled industrial revolution that pushes things parabolic for a bit, and I, when I hit 67, will be SUPER grateful to my past self for diligently squirreling away U.S. Dollars (rather than betting on BTC, for example) over that whole period?
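For scale, the boring compounding math that advice assumes looks something like this (a minimal sketch; the contribution, horizon, and return numbers are made up purely to illustrate the magnitudes):

    # Hypothetical back-of-envelope: save a fixed amount at the end of each year,
    # compounding at a steady nominal return. Illustrative only, not advice.
    def future_value(annual_contribution: float, annual_return: float, years: int) -> float:
        total = 0.0
        for _ in range(years):
            total = total * (1 + annual_return) + annual_contribution
        return total

    # e.g. $15k/year for 25 years at 5% vs 7% nominal growth
    for r in (0.05, 0.07):
        print(f"{r:.0%}: ${future_value(15_000, r, 25):,.0f}")
    # roughly $716k at 5% and $949k at 7%, before inflation, taxes, fees, or sequence risk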

I will grant, if I cash in all my chips now and the "NOTHING EVER HAPPENS" brigade is right, I'd look very stupid later. And the Gods of the Copybook Headings have been undefeated for centuries.

But even if 'NOTHING EVER HAPPENS,' there are still enough small happenings piling up that it really seems like the standard assumptions behind the ol' "Put aside 15% of your pretax income in an index fund and never touch it" advice are not going to hold in the future. I don't think there's a reason to give up on saving entirely, but it suggests one should be taking wilder risks and being much less concerned with historic returns as a guide to probable future outcomes.

Not that the advice of a random internet stranger should mean anything, but I think you should take the camper trip with your dog.

grateful to my past self for diligently squirreling away U.S. Dollars (rather than betting on BTC, for example)

These both sound terrifying to me.

The investment value of BTC is either an underlying "BTC will become so convenient to transact with that everyone will want to keep balances in it" (which looks less likely to happen the longer it goes without happening) or a meta "you can sell your BTC to someone who'll pay even more for it for some reason" (which happens, but can't happen forever without a non-circular reason). The investment value of USD has an underlying "everyone in the US needs some to pay their taxes instead of going to jail", and that's great, but at some point either we're going to get the federal debt under control or we're going to monetize it and dilute your USD to nothing, and I'm not betting on "get the federal debt under control".

an index fund

This is less terrifying. Sure, if the ASI kills everyone and/or mandates a Socialist Utopia then you're wasting a sweet camping-with-the-dog opportunity, but if property rights retain any respect then it'll be good to have equity in a wide enough array of investments to definitely include some companies who'll manage to surf the tidal wave rather than be crushed by it. A crashing dollar is going to hurt stocks but not as badly as it's going to hurt dollars.

Personally, I just wish I knew what to advise my kids. My index funds are at the "can pay for college if they don't go to med school" level, not the "idle rich" level. Even if AI progress levels off below superhuman, it looks like it will level off at somewhere around "can interpolate within the manifold of all existing human knowledge", and how much economic room is there for the vast majority of human knowledge workers in a world like that? Being able to personally push the boundaries of knowledge into previously uncharted territory used to be what you needed to do to get a PhD, not what you needed to do on a regular basis just to remain economically viable.

The investment value of BTC is either an underlying "BTC will become so convenient to transact with that everyone will want to keep balances in it" (which looks less likely to happen the longer it goes without happening) or a meta "you can sell your BTC to someone who'll pay even more for it for some reason" (which happens, but can't happen forever without a non-circular reason).

The Bitcoin Maxi case at this point is that it is digital gold: more fungible and easier to store, and readily convertible to whatever currency you need. I don't fully buy it myself, but Bitcoin's ongoing survival is proof of something.

And if you think a dollar collapse is pending, then BTC is probably where people flee to in at least the short term.

Personally, I just wish I knew what to advise my kids.

I think I can make a case for NOT following certain paths, but as for actionable "Do this to prosper in the future" advice I am at a loss. It's not like you can just say "Plastics" and nudge them off in the direction of the next big technological gold rush.

10-15 years ago "learn to code" would have been SOLID advice. No longer. I'm increasingly reading that AI models are really good at various parts of the practice of medicine... and SUPPOSEDLY robot surgery is here. So the Med school investment looks a bit questionable.

And as for college funds... why should colleges even be a thing, at least with their current business model, when AIs are generally capable of teaching at the level of even the best professors, across any subject?

Yeah, for Gen Alpha, there is probably NO career advice that previous generations can offer them based on experience other than "wait and see."

Being able to personally push the boundaries of knowledge into previously uncharted territory used to be what you needed to do to get a PhD, not what you needed to do on a regular basis just to remain economically viable.

I can imagine a world where the AI is doing all the knowledge work but keeps giving humans various tasks that it needs completed in order to push the boundaries of knowledge forward. Tasks that will seem completely nonsensical to the individual performing them, but that in the aggregate allow the AI to improve things, iteration after iteration. It thereby keeps most humans 'employed,' paying them in some currency they can spend with other humans, and thus the 'economy' chugs along, but in effect everyone on earth is a 'gig worker' who gets tasks assigned to them as needed and gets rewarded for performance.

A scarier version is that the AI requires you to be Neuralinked up to it so it can inject arbitrary commands into your brain as needed, but also rewards you handsomely for helping out.

That only makes your refusal to let us know what men yearn for but can never admit all the more cruel.

It’s my theory that there is a psychosexual component to male envy. In any case I know I’ll just get flatly disagreed with if I make the case here, and I lack unfalsifiable / objectively compelling and comprehensive evidence. I’ve just always believed it.

What else is there to be envious of, his fucking watch?

And you can’t take the hassle? That’s new. Although if I have to choose, I much prefer that reason for censoring yourself to the implication that it’s to protect our fragile egos and ids. The latter could be interpreted as a challenge, or as passing the buck.

I know posters who use objectively compelling and comprehensive evidence, and they’re all cowards.

Like a candle in the wind - unreliable.

Yeah a lot of Christians I know seem to think it’s the Apocalypse. I disagree but there is that vibe.

Even the Pope seems to, which is interesting.