
Culture War Roundup for the week of September 5, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

To which tribe shall the gift of AI fall?

In a not particularly surprising move, FurAffinity has banned AI content from their website. The ostensible justification is the presence of copied artist signatures in AI art pieces, indicating a lack of authenticity. Ilforte has skinned the «soul-of-the-artist» argument enough and I do not wish to dwell on it.

What's more important, in my view, is what this rejection means for the political future of AI. Previous discussions on TheMotte have demonstrated the polarizing effects of AI-generated content — some are deathly afraid of it, others are practically AI-supremacists. Extrapolating outward from this admittedly selective community, I expect the use of AI tools to become a hotly debated culture war topic within the next 5 years.

If you agree on this much, then I have one question: which party ends up as the Party of AI?

My kneejerk answer to this was, "The Left, of course." Left-wingers dominate the technological sector. AI development is getting pushed forward by a mix of grey/blue tribers, and the null hypothesis is that things keep going this way. But the artists and the musicians and the writers and so on are all vaguely left-aligned as well, and they are currently the main reactionary force against AI.

I'm mostly going to say "It doesn't matter," because I don't think an AI can be designed to have allegiance to any ideology or party. If it is capable of making 'independent' decisions, then those decisions will not resemble the ones that any party/tribe/ideology would actually want it to make, so neither side will be able to claim the AI as 'one of them.'

But I think your question is more about which tribe will be the first to wholeheartedly accept AI into its culture and proactively adapt its policies to favor AI use and development?

It's weird: the grey tribe is probably the one that is most reflexively scared of AI ruin and most likely to try to restrict AI development for safety purposes, even though they're probably the most technophilic of the tribes.

Blue tribe (as currently instantiated) may end up being the most vulnerable to replacement by AI. Blue tribers mostly work in the 'knowledge economy,' manipulating words and numbers, and include artists, writers, and middle management types whose activities are ripe for the plucking by a well-trained model. I think blue tribe's base will (too late) sense the 'threat' posed by AI to their comfortable livelihoods and will demand some kind of action to preserve their status and income.

So I will weakly predict that there will be backlash/crackdowns on AI development by Blue tribe forces, explicitly aimed at bringing the AI 'to heel' so as to continue serving blue tribe goals and protecting blue tribers' status: think policies that attempt to prevent automation of certain areas of the economy, or that require X% of a corporation's earnings be spent on employing 'real' human beings.

Red tribe, to the extent that their jobs involve manipulating the physical world directly, may turn out to be relatively robust against AI replacement. I think it will take substantially longer for an AI/robotic replacement for a plumber, a roofer, or a police officer to arise, since the 'real world' isn't so easy to render legible to computer brains, and the 'decision tree' one has to follow to, e.g., diagnose a leak in a plumbing stack or install shingles on a new roof requires incorporating copious amounts of real-world data and acting upon it. Full self-driving AI has been stalled out for a decade now because of this.

So there will likely be AI assistants that augment the worker in performing their task whilst not replacing them, and red tribers may find this new tool extremely useful and appealing, even if they do not understand it.

So perhaps red tribe, despite being poorly positioned to create the AI revolution, may be the one that initially welcomes it?

I dunno. I simply do not foresee Republicans being likely to make AI regulation (or deregulation) a major policy issue in any near-term election, whilst I absolutely COULD see Democrats doing so.

Most of your post is in line with what I believe. The information workers in blue tribe will turn to protectionism as AI-generated content supersedes them. Red tribe blue-collar workers will suffer the least, and the Republicans will have their first and last opportunity to lure techbros away from the progressive sphere of influence.

There is one thing, though.

I simply do not foresee Republicans being likely to make AI regulation (or deregulation) a major policy issue in any near-term election, whilst I absolutely COULD see Democrats doing so.

It only takes one partisan to start a conflict. Republicans might not initially care, but once the Democrats do, I expect it'll be COVID all over again -- a sudden flip and clean split of the issue between parties.

But this is just nitpicking on my part.

It only takes one partisan to start a conflict. Republicans might not initially care, but once the Democrats do, I expect it'll be COVID all over again -- a sudden flip and clean split of the issue between parties.

Not nitpicking; this is a very salient point. Will the concept of "AI" in the abstract become a common enemy that both sides ultimately oppose, or will it be like COVID, where one's position on the disease, the treatments, and the correct policies is an instantaneous 'snap to grid' based on which party you're in? And will it end up divided as neatly down the middle as COVID was?

I could see it happening!

When AI becomes salient enough for Democrats to make it a policy issue (it is already salient, but as with cryptocurrency, the government is usually 5-10 years behind in noticing), the GOP will find some way to take the opposite position.

I think my central point, though, is that I don't see any Republican candidate choosing to make AI a centerpiece of their campaign out of nowhere, whereas I could imagine a Democratic candidate deciding to add AI policy to their platform and using it to drive their campaign.