
Culture War Roundup for the week of March 6, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


It is my belief that after the AI takeover, there will be less and less human-to-human interaction. This is partly because interacting with AI will be much preferable in every way, but also because safetyism will become ever more powerful. Any time two humans interact, there is the potential for someone to be harmed, at least emotionally. With no economic woes and nothing to do, moral busybodies will spend their time interfering with how other people spend theirs, until interacting with another human is so morally fraught and alienating that there is no point. Think about it: who would you rather spend time with, an AI who will do whatever you want and be whatever you want, anytime, or a grumpy human on her own schedule who wants to complain about someone who said "hi" to her without her consent? The choice seems obvious to me.

I expect AI to reduce safetyism, because safetyism is, optimistically, a result of uncertainty and miscommunication. If you have poor eyesight, you wear glasses; if you have poor hearing, you wear a hearing aid. My expectation is that many if not most people will opt into prosthetics that give them improved social cognition: a feeling, in advance, for how something you're intending to say will be received. Alternatively, you can literally get the AI to translate vernacular, sentiment, and idioms; this will be useful when leaving your peer group. Furthermore, it will be much easier to stay up to date on shibboleths or to judge cultural fit in advance.
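As a rough sketch of what such a prosthetic might look like, here is a minimal, hypothetical example: a pre-send check that asks a model how a draft message is likely to land with a given audience, and a helper that rewrites the draft in that audience's idiom. The names (ask_model, preview_reception, translate_register) are made up for illustration, and the model call is stubbed with a canned reply so the sketch runs on its own.

    # Hypothetical "social prosthetic" sketch. ask_model is a stand-in for
    # whatever language-model API you actually use; here it returns a canned
    # string so the example runs without any external service.

    def ask_model(prompt: str) -> str:
        """Stand-in for a real model call; replace with your preferred API."""
        return "[model reply to: " + prompt[:60] + "...]"

    def preview_reception(draft: str, audience: str) -> str:
        """Ask, in advance, how the draft is likely to be received by the audience."""
        return ask_model(
            f"Audience: {audience}\n"
            f"Message: {draft}\n"
            "In one sentence, how is this likely to be received "
            "(tone, possible misreadings, shibboleths it might trip)?"
        )

    def translate_register(draft: str, audience: str) -> str:
        """Rewrite the draft in the audience's own vernacular and idioms."""
        return ask_model(
            f"Rewrite this message in the idiom of {audience}, keeping its meaning:\n{draft}"
        )

    if __name__ == "__main__":
        msg = "hi, nice weather today"
        print(preview_reception(msg, "a stranger on a commuter train"))
        print(translate_register(msg, "a stranger on a commuter train"))

In a real version, ask_model would be wired to an actual chat model and preview_reception would run automatically before a message is sent; the point is only to make the "feeling in advance for how it will be received" idea concrete.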

Humanity suffers from a massive lack of competence on every axis imaginable. We cannot now imagine how nice the post-singularity will be, but for a floor, consider a world where everyone is good at everything at will, including every social skill.


Your last paragraph sounds extremely dystopian and unappealing to me. It's completely inhumane, and in my mind it renders the experiences of everyone who has lived before that point hollow, and makes all of the suffering they and we have been through completely needless and pointless, just so endless generations of human beings can enjoy a life free of inadequacy. The fact that another commenter says it sounds grand is so distressing to me. If things truly begin moving in this direction, all I can do is hope to move as far away from people living that way as possible. The biggest joys in my life are gratifying human experiences that I've earned: the sun on my skin at a beach I've worked hard to visit, exploring distant cities and meeting new people from all walks of life. I will feel no gratification in being good at social skills through some technological enhancement, even if the AI enhances my gratification as well. I just want none of it. All I can do is hope that opting out will be possible.

Funny that your username is aiislove when you sound very un-transhumanist.

I don't understand this "life has to be hard to be meaningful" attitude. You can always turn the difficulty dial to whatever you want. Personally I don't want AI-aided social skills. I just want Westerners to stop treating me like shit for no reason. I would be interested in AI-aided social skills if I believed that it was the only way I could get love and friendship, but it is not (thank you rest-of-world!) so I don't need it.

I thought up my username in a few seconds; it's just a pun on "ai" meaning love in Japanese, plus I like making AI-generated art. It has nothing to do with wanting to use AI to game social interactions. I'm not a transhumanist.

You can always turn the difficulty dial to whatever you want.

That's not the problem. The problem is that there will be people using the difficulty dial to begin with, that I will have to make the decision not to turn the difficulty dial, and that we'll all have to live with the effects of there being a difficulty dial... It's just a mess, and I'm ready to live in the woods without it all. Using a dial to make yourself popular is the definition of cringe in my opinion; it is so pathetic. I'd rather be unpopular than use a transhumanist means to buy friends.