Culture War Roundup for the week of September 12, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Finally something that explicitly ties AI into the culture war: Why I HATE A.I. Art - by Vaush

This AI art thing. Some people love it, some people hate it. I hate it.

I endorse pretty much all of the points he makes in this video. I do recommend watching the whole thing all the way through, if you have time.

I went into this curious to see exactly what types of arguments he would make, as I've been interested in the relationship between AI progress and the left/right divide. His arguments fall into roughly two groups.

First is the "material impact" arguments - that this will be bad for artists, that you're using their copyrighted work without their permission, that it's not fair to have a machine steal someone's personal style that they worked for years to develop, etc. I certainly feel the force of these arguments, but it's also easy for AI advocates to dismiss them with a simple "cry about it". Jobs getting displaced by technology is nothing new. We can't expect society to defend artists' jobs forever, if they are indeed capable of being easily automated. Critics of AI art need to provide more substantial arguments about why AI art is bad in itself, rather than simply pointing out that it's bad for artists' incomes. Which Vaush does make an attempt at.

The second group of arguments could perhaps be called "deontological arguments" as they go beyond the first-person experiential states of producers and consumers of AI art, and the direct material harm or benefit caused by AI. The main concern here is that we're headed for a future where all media and all human interaction is generated by AI simulations, which would be a hellish dystopia. We don't want things to just feel good - we want to know that there's another conscious entity on the other end of the line.

It's interesting to me how strongly attuned Vaush is to the "spiritual" dimension of this issue, which I would not have expected from an avowed leftist. It's clearly something that bothers him on an emotional level. He goes so far as to say:

If you don't see stuff like this [AI art] as a problem, I think you're a psychopath.

and, what was the real money shot for me:

It's deeply alienating, and if you disagree, you cannot call yourself a Marxist. I'm drawing a line.

Now, on the one hand, "leftism" and "Marxism" are absolutely massive intellectual traditions with a lot of nuance and disagreement, and I certainly don't expect all leftists to hold the same views on everything. On the other hand, I really do think that what we're seeing now with AI content generation is a natural consequence of the leftist impulse, which has always been focused on the ceaseless improvement and elevation of man in his ascent towards godhood. What do you think "fully automated luxury gay space communism" is supposed to mean? It really does mean fully automated. If everyone is to be a god unto themselves, untrammeled by external constraints, then that also means they have the right to shirk human relationships and form relationships with their AI buddies instead (and also flood the universe with petabytes of AI-generated art). At some point, there seems to be a tension between progress on the one hand and traditional authenticity on the other.

It was especially amusing when he said:

This must be how conservatives feel when they talk about "bugmen".

I guess everyone becomes a reactionary at some point - the only thing that differs is how far you have to push them.

The left sees the artistic class as fundamentally theirs, and AI disenfranchising them is a loss of social capital. That's all there is to it, and the appropriate response is to not care.

Did you actually watch the video?

I don’t see how you can walk away from it thinking that Vaush doesn’t deeply care about this issue on a personal level. And I went in skeptical, assuming that he didn’t care about it on a personal level.

I'm sure he does care about it for many reasons, much like how there are many things I care about for many good reasons. Yet those reasons all boil down to justification thrown over deeply personally-felt losses of status and power.

People don't like being blunt or crass. They want to decorate their selfishness with good-sounding justifications. Ultimately, why Vaush cares is that AI weakens his allies. He needs a more palatable reason, both for others and for his own lofty self-image. It's very human.

It's also very unimportant.

Do you think there are any psychological motivations that don’t ultimately reduce to personal power?

Absolutely. Sometimes we're motivated by hunger, sometimes we're horny, etc. If you mean do I believe in any motivations that don't reduce to primal drives, no. It's all the accumulated layers of deniability, obfuscation, and justification we heap on very primitive and simplistic desires.

Which isn't the same as being insincere. I trust that Vaush truly does feel strongly about this. I also trust he wouldn't if the AI was obsoleting his enemies instead.

I think the issue here is: even if people are driven to do algebraic topology "because being curious and smart got ancestors food and other ancestors iron weapons," algebraic topology has complexities and effects that aren't really related to "status" or "hunger." Same for Vaush and art. What does it even mean to 'reduce' a desire or action to a primal drive? What does it tell us about that action? If a society can electrify itself solely because of "status drives" and "hunger" and "horny" ... maybe it doesn't tell us much about an action or desire to say it's built from "primal drives".

Honestly, I'd like to understand the philosophy informing your schtick, is this some devil advocacy, Williams syndrome, or do you just like making people explain themselves?

What does it even mean to 'reduce' a desire or action to a primal drive? What does it tell us about that action?

Reduction implies the absence of inherent interest in the action. At a minimum it tells us that some other inputs satisfying that primal drive just as well could be accepted as full compensation in the case of the action becoming non-viable, not just in the sense of being a good trade offer, but literally satisfying the specific motivation to pursue that action.

I don't know much about Vaush and see no point to demonizing him, so let's discuss a hypothetical Traush. Traush says the same things as Vaush here, but is a fanatical leftist from a reactionary's nightmare, who values human-made art and despises AI art for one reason only: most people with marketable artistic skill are leftists, therefore their effective monopoly on art (i.e. means of indoctrination) advances his political agenda, and AI jeopardizes that. Put another way, his motivation to protest AI art genuinely reduces to will to political power. Assuming that Traush is a rational actor, he'd agree to some credible offer that advances the leftist agenda through no new effort on his part, and to an extent that offsets (or more than offsets) the political loss induced by the democratization of AI-based art-making tools.

For example: after working with expert committees, the state proposes two possible solutions. A) human art is subsidized on every level from special free courses to giant exhibitions, people who want to dedicate their lives to artistic pursuits are getting a premium UBI package, commercialization of AI art is prohibited. B) starting at the age of 2 years, all children go to boarding schools with robo-nannies and heavy ideological propaganda the slant of which is determined by Traush's party, but for the next 20+ years nobody subsidizes art nor regulates AI tools for art. There's a referendum to vote for one or the other option.

Under the assumptions stated above, I believe Traush would pick option B in a heartbeat.

Edit: in rationalist parlance, all that is to say that reducible goals are trivially instrumental, whereas non-reducible ones are either terminal, perceived as such, or it's hard to see how they could be substituted without compromising some terminal goal.

Honestly, I'd like to understand the philosophy informing your schtick, is this some devil advocacy, Williams syndrome, or do you just like making people explain themselves?

Somewhere back on Reddit, I think someone explained that CSCA is a neoreactionary. At risk of mod intervention: I almost have to wonder if they went so far down the NRX wormhole that they teleported back to neoliberalism, because the vibe I got was "everything-is-actually-fine/nothing-ever-happens centrism."

It doesn't tell you much, but what it tells you is very important: it tells you how much you should care based on who that person is and what drive they're pursuing. You can spend time and effort trying to parse the particulars of someone's ideology and motives, and figure out all the myriad sensible reasons they despise or oppose you, or you can recognize at the beginning of that process this is just a tribal antagonist pursuing a simple hunger, and you don't need to know the details beyond that.

There is no glimmering insight to be found, there is no revelation to be uncovered when one understands the essence of Vaush. In the end, Vaush is just making arguments that back up his team. The details are masturbatory.

If the AI is going to obsolete anyone, then it will quite literally be my enemies. I am very much opposed to the twitterati ruling class. I make a living by writing code. And yet I am in complete agreement with Vaush's views on AI art. How do you explain that?

You are probably weird, in the way so many of us here are, in that you have a modestly coherent worldview arrived at in significant part through reason. I have no personal knowledge of the man, but by hearsay, Vaush is some sort of Wish.com Marxist Twitter/Youtube intellectual, right?

My impression of Vaush was that, while he may not be a lot better at intellectual rigor than your typical progressive, he does oppose idpol to some degree. He collaborated with Sh0eonhead at least once, I think, and both of them caught some shit for that.

Hating progressive excess doesn't make you not on their side. Are you against leftist hegemony and what it has produced, and would be comfortable in a world with actual right-wingers in charge, or do you just not like the fringe of your own people?

You're really grasping at straws here.
