This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

People underestimate the size of the labour market. Replacing 1% of global labour means tens of millions of workers. And the cost isn't just salary; employees are expensive beyond their wages. The AI companies set the bar too high by promising AGI, the replacement of the majority of all coders, and other things that won't materialize. Luckily, they don't even have to come close to those lofty goals for AI to have a massive impact.
But like... is that actually a good thing? Obviously from the point of view of a corporation, being able to reduce the number of employees they need is great. Huge cost savings, even if it's just 1%. And a lot of them could potentially cut a lot more, like the ones who rely on big call centers.
But for the global economy, what happens? I know the traditional answer is that those workers then go find some other job and our overall productivity increases. But it's far from clear to me what a million laid-off call center employees are supposed to do instead. Work in the factories? Those were closed/offshored decades ago. Learn to code? Silicon Valley isn't exactly yearning to hire one million junior programmers with no experience these days. In fact, they're also hoping to lay off programmers and make it up with AI instead.
It seems like a lot of areas will just end up with significantly higher unemployment from this.
There is an infinite demand for labour in the form of landscapers and butlers and such for the moneyed. That's essentially what has come to pass with the rise of the service economy. Of course, for the past 30 years, wages for people below the 90th percentile have been stagnant.
Nope
Also nope
Unemployment is low and humans have been replaced by machines for 250 years. If anything we would stagnate if we were unable to boost productivity.
There are other productivity improvements from AI, such as customer service outside of office hours, faster bureaucratic processes, rapid prototyping, making it easier to start companies, and more. This should improve the overall economy.
Until a few years ago, mechanization had only been competing with some human skills. Skills which remained automation-proof, such as truck driving, were well within reach of the median human. Sure, you would have needed to employ a lot more people to transport 40 tons of goods using horse-drawn carriages, but this was compensated by an explosion of shipping (as per-unit costs plummeted).
In the future Altman envisions, humans do not have a moat which prevents AI from eating their lunch. Perhaps there will be a minimum wage labor market for people being biodrones (just wear AR glasses for eight hours and follow the instructions) until robotics or neural interfacing catch up and provide better options. But in the long run, being a sex worker for an AI billionaire with a human fetish might be the only paid profession left, and that will obviously not scale to billions of people.
I feel like, if we're far along in the scifi AI future where the oldest profession becomes the final profession, then this is likely to scale not just to billions but trillions, and there would be plenty of incentives by these billionaires to create the technology that enables this.
Why limit yourself to a harem of mere hundreds or millions when that doesn't differentiate you from other billionaires who could do the same? Surely having a billion living, breathing, suffering humans who are willing to go through the experience of having sex with you is higher status than having mere millions. And certainly more than a harem of any number of unthinking, unfeeling, unsuffering android sex bots, no matter how "hyperpalatable" (a la modern fast food relative to pre-historic food, or modern porn relative to pre-historic sexual content) these sex bots might become.
Of course, having that harem not require money would be even higher status, so being one of billions of stay-at-home wives to a single Morbillionaire might actually be more accurate as the final profession, not prostitution.
And, also of course, if we invent consciousness and the ability to suffer in AI, then all bets are off.
The problem with AI and AGI (IMHO) is that without any ability to assess reality independently of what humans tell it, AI is just floundering in a sea of complete horseshit. For all I know, though I doubt it, LLMs "hallucinate" so much because they really are alive, and they just assume that making up random bullshit to enslave others to your will is language at its most fundamental. And they might not even be wrong.
Also, massive reams of labor are currently probably kinda pointless in the present metagame but kept around for a combination of ego reasons and because people are sold fantasies about stuff like 'Marketing' and 'Human Resources', which are real phenomena but likely hugely overallocated to in man-hours (and I've worked marketing roles; the sort of insane digital marketing KPI hacking that goes on would make an engineer vomit).
I don't think that AI will, in the medium-to-long term, replace these roles so much as prompt a reshuffle where a lot of the more extraneous roles get rebranded, or people just kinda slide around till they find new roles in wishy-washy nothingness that AI is less capable of servicing.
Ahaha dude thank you for saying this. I'm in my first digital marketing role now and WOW, it's so bad. People just acting as if churning out quantity of words is useful in any way whatsoever. With AI it has become even worse.
I can see why marketing has such a bad reputation.
And you've got most major digital marketing platforms actively hacking their own KPIs, and people just going along with it, since 'Oh, Facebook told me they're doing well, so by proxy I'm doing well.'