Culture War Roundup for the week of March 6, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

It is my belief that after the AI takeover, there will be increasingly less human-to-human interaction. This is partially because interacting with AI will be much preferable in every way, but it is also because safetyism will become ever more powerful. Any time two humans interact, there is the potential for someone to be harmed, at least emotionally. With no economic woes and nothing to do, moral busybodies will spend their time interfering with how other people spend their time, until the point where interacting with another human is so morally fraught and alienating that there is no point. Think about it, who would you rather spend time with: an AI who will do whatever you want and be whatever you want, anytime, or a grumpy human on her own schedule who wants to complain about someone who said "hi" to her without her consent? The choice seems obvious to me.

It is my belief that after the AI takeover, there will be increasingly less human-to-human interaction.

This is a major concern, yes.

One of the worst possible outcomes of ASI/singularity would be everyone plugging into their own private simulated worlds. Yudkowskian doom at the hands of the paperclip maximizers may be preferable. I'm undecided.

who would you rather spend time with: an AI who will do whatever you want and be whatever you want, anytime, or a grumpy human on her own schedule who wants to complain about someone who said "hi" to her without her consent?

Freedom is boring, not to mention aesthetically milquetoast, if not outright ugly in some cases. I have always been opposed to trends towards greater freedom and democratization in the arts - open world video games, audience participation in performance art and installations, and of course AI painting and photo editing recently - I find it all quite distasteful.

Is Tolstoy applicable here? Free men are all alike in their freedom; but to each unfree man we may bestow a most uniquely and ornately crafted set of shackles.

You sound like the exact kind of person I'm trying to wake up with my statements. You want to put humanity in shackles because you are afraid that freedom might be boring? You want to force me to spend eternity shackled to my psychological abusers because you're worried that I might not use my time in the most aesthetic way? No one is forcing you to play open world video games, but you want me to be forced to play a closed world video game. Why?

I view plans of giving humans GOD mode (but not really) via AI as fundamentally removing all sort of meaning from life.

Things are easier now than they were, yes, but we still suffer. Suffering is, in my opinion, a core pillar of what it means to be human.

Great, suffer then. That doesn't give you the right to impose suffering on others.

What about his children? Will you send men with guns to snatch them away from him to hook them into AI-fueled hypermodernity?

Good, complicated question. We are, I think, agreed that adults should (usually) be able to do whatever they want. We are probably also agreed that very young children should not have their life outcome dominated by whatever decision they hold at any given moment. I believe it is also uncontroversial that children in plainly abusive (violent/sexual) households should be removed. Between those two poles, I think this worry is overstated - parenting is also a skill whose scarcity will be reduced by the singularity.

Maybe if his children want to leave for a month, they can; it is then his problem to avoid this. I don't know where the actual degree shakes out; I suspect the actual numbers will be relative to circumstances. Presumably an AI will be able to analyze if an intention to leave is temporary or stable; this should affect decisionmaking. (Imagine how uncontroversial trans would be if satisfaction and outcome could be perfectly forecast.) But in sum, I simply think we have a warped picture of the tradeoffs involved in liberty vs parenthood due to the fact that we live in a very broken world filled with people who are very bad at what they do.

I'd send whatever needed to be sent to hear from the children themselves, give them an informed opinion of the state of the wider world, and hand them a ticket to leave whatever Neo-Malthusian hellscape Panem wants them to dwell in.

It's completely up to them whether they want to leave, but I fully support their exit rights.

I'm sure there will be people insane enough to want to dwell in such places, and that's their prerogative, but the opinion of the father shouldn't override the desires of the son. Offer to wipe their memories of the outer world after they decline if the knowledge it's out there is so unpleasant.

What age? 2? 5? 15?

Age is unlikely to be a meaningful signifier of mental maturity at that point in the future.

If a baseline human, then I'd go with 16ish, otherwise when they can be reasonably expected to have the maturity of a baseline 16yo human.

Rumspringa rules, so 16 or 17? The closest analogy seems to be that modern life is to the experience machine as Amish life is to modern life.

but you want me to be forced to play a closed world video game. Why?

Network effects.

I'm not going to plug into the experience machine, so if everyone else does, the world outside the simulation is going to become a much less pleasant place to live in.

I also endorse the response from @RenOS below.

if everyone else does, the world outside the simulation is going to become a much less pleasant place to live in.

Why?

No one is forcing you to play open world video games, but you want me to be forced to play a closed world video game.

I don't disagree, and I know where you're coming from, but there's a bit more to it than that. It's possible for a style of game (or movie, etc) to become so popular that almost all you can find is something of that style. For example if you didn't like WW2 shooters in the early 2000s, it was real slim pickings. Sure, nobody forced you to play one - but other people did cause your options to be sharply limited.

Additionally, it hits harder when a series you used to like takes a turn in a direction you don't like while trying to chase trends, too. For example I used to love FF, but they haven't made a good game in that series in 20 years because they keep chasing the Western market (which in turn means they push the action, more Western-style fantasy, etc). I would never suggest that the devs at Square-Enix should be constrained to only make traditional FF games from now on. They aren't interested in making the games I want to play, and I have to accept that. But it is still kind of sad, and it does mean that I lost something fun in my life because of their artistic choices.

If AI allows for infinite content generation it's hard to see how this would be a problem. Just say "give me Final Fantasy 7 remastered with stunning graphics in FDVR and oh also change this character... etc"

One of the worst possible outcomes of ASI/singularity would be everyone plugging into their own private simulated worlds. Yudkowskian doom at the hands of the paperclip maximizers may be preferable.

What???

Being able to do whatever you want, all the time, that's roughly as bad as death?

What's a good outcome then, if endless human autonomy is such a terrible fate? Working on a commune all day with 19th-century technology? Chattel slavery? A happy-clappy Borg hive like in Foundation's Edge? Low-wage jobs in the modern day? If you want an aesthetically ugly job, I can describe mine to you.

It depends on what you see as "autonomy". I think a world where everyone is plugged into a simulated world is, if not exactly zero, at least pretty close to zero autonomy. You do not provide for yourself in any meaningful way, you are not capable of substantively changing the material world around you, you are not capable of protecting yourself and instead depend on protection. Of course your examples aren't positive, either. I would like a future where humans are improving their capabilities, try their best to colonize the universe, are meaningful members of society (not just "a" society like an online guild, but "the" society, the one that creates the infrastructure we use, the food we use, etc.) and in a fully general sense are in control of their destiny.

One of the worst possible futures is them becoming glorified pets of safetyist AIs that make sure no harm comes to them and allows them to play in a little golden cage of their own making, one so nice that they don't even consider leaving anyway.

I have heard rumors on Twitter from philosophy professors that undergraduates these days don't have the same reaction to Nozick's Experience Machine as previous generations. They are much more willing to get in the pod.

My personal ideal utopia-pod is definitely going to have

-extensive awareness of the universe outside the pod, albeit largely delegated to subservient AI as most information coming from that vector will be uninteresting.

-self-sufficiency. I won't be farming up the calories I consume/the kilowatts my upload consumes, but I already don't do that.

-ample self-defense capabilities.

I agree that I wouldn't be substantially altering the outside world on a regular basis, and this is probably the crux of the issue. But I for one would likely bite that bullet, depending on the particulars of the technology available.

But what's the point of colonizing the universe? I agree that it's good and should be done. However, expanding our material resources and technical capabilities is a means to an end. The end should be human enjoyment, whether that's conversation, art, games or whatever we can come up with given immense intellect and resources. I'm in favor of working out how to make Matrioshka brains (playing Dyson Sphere Program IRL) or whatever's more practical/efficient than that. A defence fund for dealing with aliens, entropy and so on is also a good idea.

I suppose I can't imagine how human input would be necessary or even useful. Once you figure out a nearly optimal way to assemble your megastructures, what can you do then that's useful? I'm envisioning shooting a few trillion tiny seeds that hopefully reach the target star-system and self-assemble into a factory that produces the megastructure. It's all automatic. That then receives a beam of light containing copies of people's minds. They then reproduce. They probably never have a physical body. Why would they need one? They've tapped the star or whatever energy source they're using as much as they can. All the minerals are being processed automatically. Is there make-work for them, consciously operating iron refinery #39990120347?

Say I'm one of a trillion trillion superintelligent posthumans, how can I contribute to anything 'meaningful' (aside from making art)? Do we just hope that the tech tree, so to speak, extends forever?

Even if I choose to stay on Earth, it would be very meaningful to me to know that real live biological humans actually made the million year journey to another galaxy. And people will make that journey, unless the AI prevents them of course. Maybe the adventure is like a kind of art.

But how would that even work? They'd be overtaken on the way by something more efficient. Whatever a biological human can do, a posthuman or AI can do better. When it comes to accelerating objects to near-lightspeed, it's easier to do it to smaller, tougher objects. I imagine if we figure out FTL travel, similar principles will apply.

And what do they do when they get there, when they find that galaxy's already been taken? Every star reprocessed by the time they reach it, indistinguishable from whatever they left? I get a sad vibe from it, like the Incan army sallying out against the Spanish. It was so over from the moment the Spanish arrived. If you're a biological human in this far future, it's like living your whole life as a joke or a zoo animal. Posthumans can mess with you whenever they see fit, in ways you can't even perceive, using technologies you can't imagine. They truly have the least autonomy, being totally at the mercy of more powerful beings.

Well, if it's an aligned AI, then the nanoprobes which yes, will beat us to the stars, will simply prepare the way for us, including in some regions not-preparing (leaving untouched). I'll be excited when humans do arrive.

If it's an unaligned AI, welp.

I guess I'm personally resigned to the golden cage but I just want to make sure that I'm not denied my basic needs (esp sex) for some safetyist nonsense. I mean, do you think that a woke feminist or conservative Christian, if they managed to get in control of the AI, would allow us the sexual utopia that we have a right to? This is what I'm afraid of.

But I totally agree with you: humans should be free to, for example, colonize Mars even when it's dangerous because the infrastructure isn't there yet. I too worry that safetyism will prevent us from taking risks, being part of the forefront of civilization, exploring the universe.

the sexual utopia that we have a right to

Who says you have a right? From whence do you derive this right? Explain to me how this is a right akin to the right to life?

It does depend on whether the assertion is being made from the position of "I can't get anyone in the real world, so my only hope is the simulated world where an AI character will pretend to love me" or "I want to have all kinds of sex beyond what is possible now, I am jaded and want infinite stimulation, hyper porn".

While one set of circumstances might be treated more sympathetically than the other, what right are you claiming? You can survive without sex, you can't survive without food, water or air. Maybe we all have a right to $50 million, to be tall and handsome/slender and beautiful and incredibly smart and successful and all the rest of it, but we're not all going to get that. Unless you are pinning your hopes on AI magic producing abundance and a way to get humans into some kind of "better than reality" virtual world where they can all be tall, handsome, successful, smart, rich people with tons of loving and willing partners, you can declaim about your "right to X, Y or Z" all you like, but you're not gonna get it.

The right to sex is not really about sex. It's about protecting normal people from moral busybodies that will ruin our lives by publicly proving that we have a sex drive. Once "so and so said something sexual once" or "so and so had sex (in an unapproved way)" is not a basis for public humiliation or losing your job, our lives will be so much better.

It's interesting to me that asserting a right to sex can provoke such a reaction from you. Are you afraid of people getting their needs met? Or is power over other people's sex lives something you need for some reason?

It does depend on whether the assertion is being made from the position of "I can't get anyone in the real world, so my only hope is the simulated world where an AI character will pretend to love me" or "I want to have all kinds of sex beyond what is possible now, I am jaded and want infinite stimulation, hyper porn".

While one set of circumstances might be treated more sympathetically than the other, what right are you claiming?

This is a complete aside: I agree with you that one would be treated more sympathetically than the other, but I'm curious what's your intuition on which one that would be? My intuition points to the latter being the one to receive far more sympathy than the former, which would actually receive close to none and actually attract antipathy.

Haha, this is exactly why I don't give a shit about these people's sympathy.

I'm pretty sure this is an assertion of a negative right, derived from (among others) the right to privacy. I.e., if he can make it himself (or convince others to make it for him), what gives you the right to prevent him from doing so?

I think you're viewing this as "A says they have rights to B's body", whereas parent is viewing it as "C is saying they have the right to prevent what A and B want to do with their bodies."

Exactly, thank you.

denied my basic needs (esp sex) for some safetyist nonsense

Besides mods which alter the creator's vision getting banned, as @tikimixoligist shows, mods which adhere to it more closely are forbidden from being distributed by mainstream sites: https://gamebanana.com/mods/430053, https://varishangout.com/index.php?threads/fire-emblem-engage-localization-fix-mod-removed-by-loverslab-gamebanana.1737/

I suspect sex would be reasonably safe. But we already have a preview of what might happen if your personal utopia is not what the zeitgeist wants. There was a Rimworld mod called "European Phenotype and Names Only (White Humans)" which modifies a single-player game. It's banned.

https://www.eurogamer.net/paradox-pulls-discriminatory-stellaris-mod-that-made-all-humans-white

We embrace the idea that players mod the game to best represent how they want to play, we do NOT however wish to enable discriminatory practices.

They want to force us to spend eternity with people who hate us and will psychologically abuse us. They'd rather see us dead than allow us to escape. I know that's extreme, but that's how I see it.

Well you and I are in agreement that that's a bad outcome. I personally expect to be killed as a result of strategic incentives encouraging monopolizing all available resources (which applies even if people are in charge).

But in principle, perfect autonomy is surely preferable.

I too am concerned about Yud-style misaligned AI, but I don't think it's more than 10% likely. Either way, if it's our fate, it's our fate. I'd rather be killed by emotionless AI than be psychologically tortured by feminists forever.

You really think that's the most accurate summary of their beliefs?

I think it's a hostile phrasing but correct in structure. I guess it could be accused of being an extrapolation. At any rate, it's hard to see how one would avoid it.

One man's "let's preserve human society" is another's "let's preserve the status games that unceasingly victimize me."

How would you summarize "their" beliefs? (We might have to decide who we mean by "their".)

I was referring to @Primaprimaprima's contention that if everyone dives into their own personal virtual world, that's a dystopia. I tend to agree, simply because I think interpersonal interactions are extremely important.