Culture War Roundup for the week of January 20, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Are we all going to work fake jobs

From a Yarvin blog a few weeks ago:

[As a result of AI] Human populations, to governments, [will be] left as residual liabilities that need to be fed. And, in some sense, pleasured. Almost everyone is a zero-marginal product employee.

At this point, to protect any kind of humane existence, it is necessary to restrict the technology of production in a way that maximizes high-quality labor demand. We are essentially turning real life into a videogame—a maximally-engaging videogame. We may even have professional dragon-fighters (once we can bio-engineer real dragons).

The only alternative I can see to such a policy of artificial difficulty is one of “fully automatic luxury communism,” in which the total absence of meaning makes humanity suffer a moral and then political collapse. This political collapse seems likely to result in civilizational suicide, subjugation by barbarians, and a return to pre-pre-industrial technology levels (having lost all the secrets of pre-industrial technology as well).

Yarvin is influential, but many others, including people in the Silicon Valley VC, AI lab, and X.com e/acc research space, have made similar comments over the last few months. This is in part because AI researchers, senior lab figures and others increasingly believe that multimodal performance and robotics will both benefit from an extreme uplift in investment and intelligence (which makes mobility inherently easier even if the mechanical components don’t change), and that the mass automation of all employment will therefore happen as one Happening, over a brief period of a few years, rather than over the prolonged 20+ year period predicted by early-2010s mass-automation projectors like CGP Grey in 2014.

I argue there are three core schools of post-AGI economics:

  1. It Doesn’t Matter Because We’re [Almost] All Going To Die. This category encompasses the three primary groups of AGI doomers: (a) malicious and/or paper-clip-maximizer AGIs will destroy the human race, (b) AGI will help human terrorists or factions destroy the human race by, e.g., assisting in genetically engineering a deadly pandemic that kills most or all humans, and (c) AGI, by eliminating most/all jobs and therefore making all of us economically superfluous, will lead to rich and/or powerful people exterminating or starving the majority of the population, and then perhaps eventually each other.

  2. UBI or some other form of by-default, low-obligation distribution. To maintain consumption as productivity increases and employment decreases, governments transition their populations onto welfare, until eventually everybody is on it. Ignoring significant implementation issues (like the classic Soviet ‘who gets the most beautiful apartments with the high ceilings’), this runs the risk, Moldbug argues, of a loss of meaning so extreme that it leads to a form of civilizational suicide. Proponents argue that this kind of true freedom will allow people to find their own meaning: in leisure, in raising families, in falling in love, in learning about and understanding the world and themselves. But what did humans do with their hugely increased leisure time starting from the mid-20th century? Spent much of it watching TV, porn, consooming products and scrolling online. Wall-E is about this, although the obesity will seemingly be avoidable by then.

  3. The Gamification of Life. Something interesting happened as ‘live service’ video games developed over the last thirty or so years: players increasingly demanded ‘progression’ in their competitive multiplayer games. It is not enough that the game you play for 20 hours a week is fun; it must involve your character’s statistical advancement, the slow grind for rare armor or skins or minute stat increases. Players demand ‘progression’. Man yearns to labor. Huge categories of modern-day employment are already fake jobs that exist to reduce the number of welfare recipients in overall terms, the result of regulation and government spending in everything from compliance and college administration to the DMV and HR. Under this school, the post-AGI economy involves fake work in fake jobs, perhaps with stratification in terms of resource allocation and progression: a kind of gamified simulation of pre-AGI labor that most people engage with to a greater or lesser extent and which confers status and resources.

If (1) occurs, it was probably almost always inevitable (perhaps as a solution to the Fermi paradox). There is little anyone here can likely do to stop it. The choice between (2) and (3) is much more interesting. If you were the absolute ruler of a country that transitioned from widespread employment to mass automation of all labor, would you really give up on any incentives to encourage prosocial behavior beyond ‘obey the law’? Would you really trust people to live dignified, meaningful lives? Would you care?

For a long time, my dream job has been “game show host”. (Other professions near the top of the list have been “professional stage actor for a repertory theater company” and “tenured academic lecturer”.) My current side job is “local bar trivia host”, which is a small-scale version of that.

What do all these jobs have in common? Well, for one, they’re stable; you’re set up at an institution for a long-term contract, instead of having to constantly move around to chase better opportunities. You develop relationships with the other employees, and with the customers (audience members, students, contestants, etc.), such that you become a sort of local institution.

You’re also not having to constantly compete to keep your job. Obviously there’s competition to obtain one of these positions in the first place, but once you’ve got it, it’s pretty much yours for life until you decide to move on. The biggest reason I ultimately decided not to pursue professional acting, despite having both the training and talent for it, was that I realized that I would hate a life where half of my job is relentlessly auditioning for new gigs, with each audition being extremely competitive and high-pressure. I would much prefer a job where in exchange for accepting fairly low pay, I get to avoid the stress of competition and uncertainty.

These are also jobs where your charisma — your ability to cultivate a cozy and engaging social atmosphere, to present ideas creatively, and to generally be pleasant to spend time around — is the core of what you bring to the table. I would love being in academia if it meant I could just focus on being a competent lecturer, and not have to worry about constantly publishing “groundbreaking new works” within my chosen field. I don’t want to do a bunch of independent research to discover some new thing nobody’s ever discussed before. I just want to be really good at telling people interesting facts and crafting a compelling narrative presentation of information which, if they’d really wanted to, they could have found on their own.

Under an economic system in which people do not have to ruthlessly compete for scarce financial resources and job opportunities, and in which workers are under less pressure to produce quantifiable monetary value, careers like these would be more viable for more people. People could focus on being valued pillars of their local communities, instead of moving around to chase bigger paychecks. They could care more about cultivating reciprocal social bonds with those who enjoy and benefit from their work.

They will still want to constantly hone their respective crafts, both because they want to impress others, and because they find their professions intrinsically interesting, but there will not be any pressure to be “the best in the world”, nor even necessarily “the best” in one’s local context! I wouldn’t have to compete against strivers from around the world, nor would my job be outsourceable.

If AI can allow people like me — unambitious, head-in-the-clouds wordcels who primarily want to get along by being affable and verbally-loquacious — to ply our trades without having to produce economic value, then selfishly it is very appealing to me. What that would mean for the vast majority of actually-existing human beings is a different story.

AGI will make things much worse for people like you.

Already, we see that some large percentage of teenagers want to become influencers. But the number of people who want human attention is much greater than the available supply.

This will get worse.

In the future, instead of watching Ryan Seacrest, we'll watch an AI-generated super host who works even harder. At first, the existing celebrities will be able to maintain their audiences. Bruce Willis might make some extra coin selling his likeness. But corporations will cut out the middle man and create AI celebrities who they own and control outright.

This will extend to the local level too. As digital entertainment options increase, people go outside less. Why go to bar trivia when there is a digital host specifically tailored to my needs? The obvious rebuttal here is "people crave human connection". Sadly, I don't think this is a good argument. As technology has improved, people have gone outside less, and this trend won't suddenly reverse with even more engaging, addictive technology. In the post-AGI world, there will be no one at bar trivia, because everyone will be at home, on their devices.

Here's what AGI could do though.

It could give people a fake audience of AI humans who appreciate their wit and wisdom. This technology is definitely coming soon. Already, we see a small group of mostly neurodivergent people who spend hours a day talking to AI chatbots. There's no reason to think this won't grow. In the future, everyone will have an audience of adoring robot fans, hanging on their every word. If you can get over the fact that it's all fake, it might be the best of all worlds.

I'm reminded of... I wouldn't call it a study, but a post I remember that characterized many of the most popular video game companions as professional sycophants, whose role in the video-game power fantasy of the self-insert protagonist was to affirm how awesome and attractive you are.

The example I remember was the Bioware RPG Mass Effect, where the player plays the Super Awesome Special Forces Secret Agent Officer, Commander Shepard, in a multi-species galaxy where you are (allegedly) an amazing leader ready to make the Tough Choices. The first game's gimmick was not only the claim that your Big Decisions would matter in the future, but also the morality system that let you play a heroic, virtuous Paragon (who consistently deferred to / agreed with the Alien UN authority figures) or an ends-justify-the-means Renegade (who could be a raging racist). There was even a romance system where you could sleep with your subordinates, including a Star Trek-esque blue alien woman.

The second game's gimmick, among other things, was the ability to re-recruit most of your alien squadmates from the first game and sleep with them... even if you were a raging racist in front of them. The player romance fantasy for the totally-not-gypsy-coded geeky tech girl might be the dashing captain, a white knight who saved her late father's reputation (by covering up crimes that got a lot of people killed), and hey, it's totally romantic if she loves you so much that she's willing to risk killing herself before a critical mission just to sleep with you...

...but she'd make the same doe eyes and declarations of love, and of how irresistibly attractive you were, even if you were a genocidal bigot who punched women for mouthing off on live television, turned over an autistic child to have his eyes stapled open and be tortured for Science (TM), and slept with an abused trauma victim tormented by the same racial-supremacist organization that you are currently working for and can repeatedly voice support for.

The virtual waifu was, in other words, incredibly popular. And like most of the most popular characters in the franchise, she was never anything but supportive and/or adoring of the player self-insert protagonist.

So when you say fake audiences fawning over the player/protagonist... I believe it, because we've already seen it. It was just far more limited and harder to program and write for a decade ago... which is to say, it should be in the LLMs' training data.

Now, the real capitalism question will be how we get someone to pay for and profit from it, without being so crass as to expect the hosts to. Figure that out, and then we're talking.

Don't we already have wAIfu chatbot companies, with scores upon scores of paying customers who suddenly go on suicide watch when their chatbot doesn't want to have virtual sex with them anymore?

Anyway, this is precisely the source of my boundless disdain for Yudkowsky and all the Rat-adjacent AI safety people. All that talk about "x-risks", only to overlook all the most obvious scenarios that could actually threaten humanity.

What are you talking about? Rationalists have totally noticed. Some even think it is a good thing: if we are not going to force women to have sex with incels, we can at least allow virtual waifus to ease the pain.

I'm not the person you replied to, but I share his dislike for most of the Rat community. I think Yudkowsky gets it, though. (For a wire-header to be able to enjoy life, they'd have to create artificial challenges for themselves, not unlike those found in real life. So they might as well just engage with real life.)

One of the obvious scenarios which threaten humanity is that some people have bad social skills and that they don't have an environment in which they can improve (or alternatively, that they can wirehead the reward of socializing, which is much more pleasant for them than actual improvement).

Giving "virtual waifus" to "incels" doesn't solve any issues, it just suppresses symptoms. By the way, I find it strange that, in an imaginary scenario where we approach AGI-level intelligence, we cannot seem to imagine coming up with a way to help people who have terrible social skills. I'm very puzzled by how a community can have so many knowledgeable (and sometimes intelligent) people and still have such shallow, naive, and simplified takes on serious topics.

It's not about social skills. It's about the fact that women are only attracted to a small minority of men. Any society where women are free to make their own sexual choices is going to be a society where the majority of men end up as incels.

Completely normal guys who shower and hold jobs and have friends and aren't obese or autistic get lectured by feminists that doing the bare minimum doesn't entitle them to a girlfriend, while a small number of men plow their way through entire harems. And not even good men, but terrible human beings like Henry and Dean Moriarty, because not only are women only attracted to a small minority of men, but the minority of men they are attracted to are cocky assholes with options.

We increasingly live in a world where the average guy's best chance of getting married is to wife-up a 30+ single mother after she falls off the bottom of Chad's booty call list. And if you are an average man in your teens and twenties, you don't even get that, you get a "fuck you" and told to wait your turn.

If you wanted to fix this with advanced technology, and you did not want to resort to wire-heading or something morally equivalent like creating non-conscious sycophantic cat girls, digital or otherwise... well, you could create conscious male-complements that required some effort and level of social skills to successfully court but who were not impossible to please the way human women are, what Eliezer calls verthandi... or you could modify human women to actually be satisfiable by regular men, perhaps with human men being modified in some other way in return... or, you know, we could just go back to what worked for the last 5,000 years and force women to get married, while they are still young, to hard-working, law-abiding men, who would then be allowed to take their marital rights whenever they wanted (hey, you don't even need the advanced technology for that one!)

But if you don't want to wirehead, and you don't want to create sycophantic cat girls who will fuck and cuddle you at the drop of a hat, and you don't want to create bespoke Belldandys who will act like the love interest of a shonen romcom and get together with the nerdy loser after a few years of character growth and sexual tension, and you don't want to edit existing human women to make them something that could ever be satisfied with not being the exclusive wife of Chad, and you are not willing to bite the bullet and force young women to get married and perform their damn marital duties...

...then the problem is over-constrained and has no solution.

(hey, you don't even need the advanced technology for that one!)

That's the problem. Advanced technology is good. Humanity having more power and control over itself and its biological/planetary destiny is good. Retvrning to the kind of society that can happily (half of it, at least) paddle on for millennia with zero incentive to evolve beyond its feudal agricultural formula, until it encounters an out-of-context problem, is, simply put, rank stagnation and unbecoming of creatures with brains larger than walnuts.