Culture War Roundup for the week of March 6, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

It is my belief that after the AI takeover, there will be less and less human-to-human interaction. This is partly because interacting with AI will be preferable in every way, but also because safetyism will grow ever more powerful. Any time two humans interact, there is the potential for someone to be harmed, at least emotionally. With no economic woes and nothing to do, moral busybodies will spend their time interfering with how other people spend theirs, until interacting with another human is so morally fraught and alienating that there is no point. Think about it: who would you rather spend time with, an AI who will do whatever you want and be whatever you want, anytime, or a grumpy human on her own schedule who wants to complain about someone who said "hi" to her without her consent? The choice seems obvious to me.

I expect AI to reduce safetyism, because safetyism is, optimistically, a result of uncertainty and miscommunication. If you have poor eyesight, you wear glasses; if you have poor hearing, you wear a hearing aid. My expectation is that many, if not most, people will opt into prosthetics that give them improved social cognition: a feeling, in advance, for how something you're intending to say will be received. Alternatively, you can literally get the AI to translate vernacular, sentiment, and idioms, which will be useful when leaving your peer group. Furthermore, it will be much easier to stay up to date on shibboleths or to judge cultural fit in advance.
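To make the "reception forecast" idea a little more concrete, here is a deliberately crude toy sketch in Python. Everything in it, the word lists, the scoring, the thresholds, is invented purely for illustration; an actual prosthetic would model the specific listener and context rather than counting words.

    # Toy sketch of a "reception forecast": score a draft message before
    # sending it. Word lists and scoring are invented for illustration only.

    HEATED = {"stupid", "wrong", "lazy", "obviously", "actually"}
    SOFTENERS = ("maybe", "perhaps", "i think", "it seems", "could")

    def forecast_reception(draft: str) -> str:
        """Crudely predict how a draft message will land."""
        text = draft.lower()
        heat = sum(word.strip(".,!?") in HEATED for word in text.split())
        hedges = sum(phrase in text for phrase in SOFTENERS)
        if hedges > heat:
            return "likely to land well"
        if hedges == heat:
            return "neutral; depends on the listener's mood"
        return "likely to land badly; consider rephrasing"

    print(forecast_reception("You're obviously wrong about this."))  # lands badly
    print(forecast_reception("Maybe I'm missing something here."))   # lands well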

Humanity suffers from a massive lack of competence on every axis imaginable. We cannot now imagine how nice the post-singularity will be, but for a floor consider a world where everyone is good at everything at will, including every social skill.

Humanity suffers from a massive lack of competence on every axis imaginable. We cannot now imagine how nice the post-singularity will be, but for a floor consider a world where everyone is good at everything at will, including every social skill.

Your last paragraph sounds extremely dystopian and unappealing to me. It's completely inhumane; in my mind it renders the experiences of everyone who lived before that point hollow, and makes all the suffering they and we have been through needless and pointless, just so endless generations of human beings can enjoy a life free of inadequacy. The fact that another commenter says it sounds grand is so distressing to me. If things truly begin moving in this direction, all I can do is hope to move as far away from people living this way as possible. The biggest joys in my life are gratifying human experiences I've earned: the sun on my skin at a beach I've worked hard to visit, exploring distant cities and meeting new people from all walks of life. I will feel no gratification in being good at social skills through some technological enhancement, even if the AI enhances my gratification as well. I just want none of it. All I can do is hope that opting out will be possible.

I think you're reading it as "you will be forced to have power X", which was not my intent. I'm sure there will be subgroups that opt out. The difference is that their lack of ability will be entirely voluntary. (Which, in the long run, may even make things better?)

The one thing that the Singularity cannot provide is a feeling of overcoming scarcity in an absolute sense; of advancing the cause of humanity. Because to advance is to struggle to get from here to there, and "there" is the absence of scarcity. The journey may be the goal, but the goal of a journey is still to progress; this is inherent and unavoidable.

Funny that your username is aiislove when you sound very un-transhumanist.

I don't understand this "life has to be hard to be meaningful" attitude. You can always turn the difficulty dial to whatever you want. Personally, I don't want AI-aided social skills; I just want Westerners to stop treating me like shit for no reason. I would be interested in AI-aided social skills if I believed they were the only way I could get love and friendship, but they are not (thank you, rest-of-world!), so I don't need them.

I thought up my username in a few seconds; it's just a pun on "ai" meaning love in Japanese, plus the fact that I like making AI-generated art. It has nothing to do with wanting to use AI to game social interactions. I'm not a transhumanist.

You can always turn the difficulty dial to whatever you want.

That's not the problem. The problem is that there will be people using the difficulty dial to begin with, that I will have to make the decision not to turn it, and that we'll all have to live with the effects of there being a difficulty dial at all... It's just a mess, and I'm ready to live in the woods without any of it. Using a dial to make yourself popular is the definition of cringe in my opinion; it is so pathetic that I'd rather be unpopular than use a transhumanist means to buy friends.

I really hope you're right; that sounds grand!

My expectation is that many to most people will opt into prosthetics that give them improved social cognition: a feeling, in advance, for how something you're intending to say will be received.

I think you have a fundamental misunderstanding of why some utterances are received poorly.

It's not about knowing enough cultural sensitivities to avoid faux pas, because faux pas aren't really caused by cultural insensitivity (which would at least be legible to an AI). Whether or not offense is taken is a choice of the listener, not a condition of the zeitgeist. If your interlocutor woke up on the right side of the bed this morning, conversation will go smoothly. If they woke up on the wrong side, they'll claim to be offended by your aspie stutterings. It depends on the fundamentally invisible qualia of your conversation partner, not on a legible, predictable, objective feature of language.

I am reminded of the fall of Lord Rennard, brought down because he made "unwanted sexual advances". How could he know they were going to be unwanted? Sorry, pal; whether or not they're unwanted can only be decided inside the woman's head, unfalsifiably. I don't think anyone's going to agree to give up the power to destroy people at will because "Shucks, his AI told him she was asking for it, I guess he's off the hook!"

As such, I predict that "a prosthesis for social cognition" is impossible. Unless it's a maxillofacial prosthetic; that one will successfully produce the desired effect.

Do you think it's okay that some people have AI companions, or do you think that those people should be forced to suffer eternally for no fucking reason?

I hardly know where to start with this, mostly because the part after the comma bears no connection to the part before the comma.

Do I think it's OK for some people to have AI companions? What do you mean by "companions"? Do you mean AI GFs, or the AI social-cognition prostheses discussed previously? In any case, I think AI GFs are bad because they edge towards wireheading, and wireheading is bad. And I think AI social-cognition prostheses are impossible.

As for the people without AI companions being forced to suffer eternally for no fucking reason:

  • Why is tfw no AI gf "eternal suffering"?

  • Who's forcing them?

  • There are very good reasons for people not to have AI GFs: they're expensive to run, they make it harder for a man to get a real gf, and there are moral problems with creating arguably semiconscious entities if you're only going to let them be an incel's ERP plaything.