
Culture War Roundup for the week of May 19, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


There has been a lot of hype news in robotics + AI lately, as the AI updates just continue to come at a blinding pace. From Tesla/XAI we have the Optimus robot, and I can't tell whether it's a major breakthrough or just another marketing splash driven by Elon.

On the other side of the fence, you have Nvidia releasing an open foundational model for robotics and partnering with Disney of all companies to make a droid robot.

You also have Google's I/O, which I haven't had the energy to look into.

With the speed of AI updates and the wars of hype, it's always hard to tell who is actually advancing the frontier. But robotics in particular does seem to be advancing quite rapidly compared to even a couple of years ago. Personally, I think that while automating white-collar work is useful, AI entering robotics will be the real game changer. If we can begin to massively automate building things like housing, roads, and mass-manufactured goods, all of a sudden we get onto an explosive growth curve.

Of course, this is where AGI doomer fears do become more salient, so that's something to watch out for.

Either way, another day, another AI discourse. What do you think of this current crop of news?

It is crazy to me that most people alive today will be around to see how this - this journey of civilization, this grand process of technological development - ends, or at least moves far, far beyond us. There is a millenarian tension in the air. Paradise or extinction (at least for most people), it seems increasingly clear it will be one or the other.

Yeah, a lot of contradictory thought patterns emerge if I ruminate about the future more than like 2 years out.

Should I live as conservatively, frugally and healthily as possible to ensure I make it to there in good shape, or should I be more reckless and try to enjoy life as much as possible since it could all end? (obviously if EVERYBODY does the latter, we might not make it there at all).

Assuming we survive, are we bound for a future of exploring the stars and colonizing new worlds, or do we get stuffed in VR experience machines that satisfy every psychological desire we have without going anywhere? Will I even have a choice?

Is there any point in breathlessly following every notable development in the AI/Robotics space to try and guess when the big moment will arrive, or would it be more constructive and mentally healthy to divorce almost entirely from it and just read escapist fiction all the time so I don't worry about something I can't really control?

Should I continue to behave as though I expect society to persist into the next century, and thus be very concerned about e.g. birth rates, pollution, governments' fiscal policies, and/or immigration policies? Or does none of this matter in 10-15 years, in which case I should just do the bare minimum to keep things running and let the kids do what they want in the meantime? The AI can fix the mess later.

It is in my nature to prepare, both mentally and financially, for things to go south. I don't buy the hype and promises without skepticism, but I can't deny that every 6 months for the past, what, 3 years, the SOTA models have demonstrated new capabilities that check another box off my "is it smarter than humans?" list. The temptation to just give up 'trying' and go with the flow is strong.

On a more optimistic note: I do believe I'm young and healthy enough that I'm likely to be around when we reach Longevity Escape Velocity. Even if the AGI stuff never fully manifests, we've got all the pieces to fix most age-related problems in humans and give us functional immortality by 2050. Which will create a whole host of new and exciting issues if the AGI isn't already in charge.

I'm not entirely sure I believe this but the bear case isn't that hard to articulate.

We hit the top of the S-curve for LLMs and they merely become very useful tools instead of ushering in the singularity or obviating human labor. Frontier research starts stagnating as a result of having picked all the low-hanging fruit / the competency crisis / the failure of higher ed / loss of state capacity / pick your favourite boogeymen, and life in 2050 looks about as similar to 2025 as life in 2025 does to 2000. This hypothetical world is probably worse for the median westerner compared to 2025, owing to some combination of immigration / climate / financialisation etc., and the path of civilization doesn't look great without some sort of technological salvation, but it's unlikely that any of these issues will be back-breaking in one generation.

Re preparation: it's interesting to me that a lot of people's reaction to millenarianism is to make life choices that would be very dumb in any world where Nothing Happens. Assuming you're already reasonably happy with your current life, quitting your job and liquidating your savings seems to me to have low upside (how much happier is burning all your bridges to be a hedonist really going to make you?) and very high downside (I can see lots of worlds where having a decade or more of savings gets you through the worst of AI societal upheaval, and of course if Nothing Happens your life is knocked significantly off-track).

This is kind of where I'm at personally. I still save a large amount of my income even though I'm not quite sure life will look the same in 25 or 50 years. In any world where it does look similar, I'm set; I'm also prepared for any medium-level scenario where investments are still useful in getting through the transition period or retain relevance post-transformation. And in worlds where life becomes so good or so bad that property and index funds become worthless, there's really nothing I could have done anyway. Liquidating everything to do drugs or travel doesn't seem like it has a great ROI when I'm already broadly content with my current life.

As a side note: why are you bullish on LEV? It's my understanding as a complete medical layman that we've pretty much made zero progress on life extension. We're much better at keeping the very young, mothers in childbirth, the unlucky (genetic diseases, trauma, infection, disease etc) and the ill-and-probably-should-be-dead elderly alive, but modern technology hasn't really meaningfully moved the quality-adjusted lifespan of the average healthy person afaik.


As a medical professional:

This is an accurate statement. Or close enough to not need any hairs split.

That being said, I still expect LEV because:

  1. ASI.

  2. Even in the absence of ASI, we're finally making good progress on things like cybernetics, replacement organs and the like. The only organ we can't currently replace, in any meaningful sense, is the brain; for obvious reasons, a transplant wouldn't work. But we are at, or close to, the point of being able to replace other organs. We've also made progress in applied gene therapy, even if rudimentary. More importantly, enough real money (i.e. billions of dollars) is being invested in SENS or regenerative medicine that we can hope to see a difference.

I weakly expect us to achieve LEV in my nominal life expectancy, and probably yours, without ASI. With ASI, that becomes a far stronger expectation, only balanced by the risk of it killing us all.