Culture War Roundup for the week of May 26, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

I've suddenly seen AI blackpilling break out into the normie space around me. Not so much about FOOM, paperclipping, or Terminator scenarios, but about the sudden disruptive nature of it, and especially about economic upheaval. Not exactly sure why. Veo3 has been part of it.

For example, coworkers suddenly aware that AI is going to completely disrupt the job market and economy, and very soon. People are organically arriving at @2rafa's wonderment at how precariously, and even past due, a great deal of industry and its surrounding B2B services stands to be dominoed over. If my observation generalizes, and middle-class normies are waking up to the doompill on AI economic disruption, what is going to happen?

Let's consider it from two points of view: (1) they're right, and (2) they're wrong. (1) is pretty predictable fodder here: massive, game-changing social and economic disruption, with a difficult-to-predict state on the other side.

But is (2) that much less worrisome? Even if everyone is 'wrong', and AI is somehow not going to take away 'careers', people en masse worrying that it will can still manifest serious disruption. People are already starting to hold their breath: stopping hiring, stopping spending, running Hail Marys, checking out.

Somehow, it's only senior management who doesn't realize the impact. (They keep framing it as 'If we can cut costs, we'll come out on top', instead of following the logical conclusion: if everyone stops spending, the B2B economy collapses.) I have a nontechnical coworker who recently recreated a complex business intelligence tool we purchased not long ago, using readily available AI and a little bit of coaching. He had an oh-shit moment when he realized how cannibalized the software industry is about to get. The film industry seems about to completely topple, not because Veo3 will replace it immediately, but because who's going to make a giant investment in that space right now?

I suspect the macroeconomic shock is going to hit faster than most are expecting, and faster than actual GDP gains will be made, but maybe I'm just an idiot.

My gut feeling is that this AI wave will be a short panic and then basically blow over. To predict massive job loss, you have to assume that jobs are already distributed rationally, and that companies are good at adapting to and using new technology efficiently. Neither of these is even remotely true!

If you've ever seen how the sausage gets made at a major company, jobs are very much withheld and created on more of an internal, political basis than any actual needs the companies have. On top of that, most major organizations are still barely adapted to using spreadsheets and the most simple algorithmic techniques that were created decades ago. Literally just using excel to automate tasks could save these companies tens of millions of dollars a year. And yet... they don't?

So the fact that a new technology is coming out that can do a bunch of fancy new stuff does not convince me that we'll have massive job loss at all. What will likely happen, and what has already been happening for a while, is that the people in white-collar roles who can use these tools will just shift more and more work onto them privately, and pretend they are still just as busy. The roles might get a tad more competitive.

But we're not going to be in a doomsday scenario where everyone loses jobs, even IF AGI comes out tomorrow.

If you've ever seen how the sausage gets made at a major company, jobs are very much withheld and created on more of an internal, political basis than any actual needs the companies have.

Well said.

I'm inclined to take the indictment of economic inefficiency even further, and point out that jobs at major companies are also created on an EXTERNAL political basis. As @hydroacetylene says, the government has a variety of incentives (which may or may not include his schizo one) to keep the (formal) employment rate high. While I am skeptical that LLM-based AI will ever get good (or at least reliable) enough to make mass unemployment a realistic possibility, even if we grant the hypothetical that it actually will, I fully expect the government to disrupt the disruption by writing legislation which obliges corps to employ human rubber-stampers ("supervisory oversight") on AI processes, thereby neatly regenerating all the white-collar jobs which have been automated. Legal compliance hits startups just as hard as it hits Status Quo Inc.

The year is 2050. Humans have long since ceased to do productive work. Amazon and Walmart are giant government contractors competing to provide unemployed people with 'basic': a ration of groceries, clothes, household goods, etc. The two megacorps keep lobbying for basic to increase; their primary bargaining tool is offering to hire more 'process supervisors' and 'account managers' to sit in rubber rooms.

The year is 2050. Mass unemployment has been forestalled by government revenue-per-employee maximums. Practically speaking, this means the vast majority of jobs are sinecures, but the social prestige of being sinecured to a particular firm or brand has skyrocketed in value, and likewise the PR cost of sinecured associates can potentially be Bud-Light-level apocalyptic. As a result, sinecure-holders at highly desirable firms often take on substantial public-relations efforts on behalf of the firm for free: to maintain their sinecure, to acquire additional sinecures (and potentially rake in social-media influencer consulting fees), and to increase the social prestige of the firm overall, and by proxy, themselves. Having multiple sinecures is possible for people who are motivated to do things that increase brand value or mindshare, and people with exceptional social desirability may acquire many more. Income supplementation via gig work will still be possible for things that can't be done as well by computers and robots, or that require the human touch, though naturally the returns will be low since the supply is huge. Of course, anyone with valuable sinecures must be on good behavior in public, for bringing disrepute or negative attention to the firm will mean instant termination, and potential blackballing.

Those who have been blackballed, or are otherwise unable to merit a sinecure on the basis of their social worthiness, must provide for themselves by doing actual labor: probably many hours of grinding gig work, or finding a valuable contracting niche. Most actual work below executive level done at the firms is performed by independent contractors, who supplement their sinecures (if they have one) with hourly contracting fees. And of course there's always welfare, which only provides at subsistence levels.