
Culture War Roundup for the week of September 12, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I'm sure he does care about it for many reasons, much like how there are many things I care about for many good reasons. Yet those reasons all boil down to justification thrown over deeply personally-felt losses of status and power.

People don't like being blunt or crass. They want to decorate their selfishness with good-sounding justifications. Ultimately, why Vaush cares is that AI weakens his allies. He needs a more palatable reason, both for others and for his own lofty self-image. It's very human.

It's also very unimportant.

Do you think there are any psychological motivations that don’t ultimately reduce to personal power?

Absolutely. Sometimes we're motivated by hunger, sometimes we're horny, etc. If you mean do I believe in any motivations that don't reduce to primal drives, no. It's all the accumulated layers of deniability, obfuscation, and justification we heap on very primitive and simplistic desires.

Which isn't the same as being insincere. I trust that Vaush truly does feel strongly about this. I also trust he wouldn't if the AI were obsoleting his enemies instead.

I think the issue here is that even if people are driven to do algebraic topology "because being curious and smart got ancestors food and other ancestors iron weapons", algebraic topology has complexities and effects that aren't really related to "status" or "hunger". Same for Vaush and art. What does it even mean to 'reduce' a desire or action to a primal drive? What does it tell us about that action? If a society can electrify itself solely because of "status drives" and "hunger" and "horny", then maybe knowing that an action or desire is built from "primal drives" doesn't tell us much about it.

Honestly, I'd like to understand the philosophy informing your schtick. Is this some devil's advocacy, Williams syndrome, or do you just like making people explain themselves?

What does it even mean to 'reduce' a desire or action to a primal drive? What does it tell us about that action?

Reduction implies the absence of inherent interest in the action. At a minimum, it tells us that some other input satisfying that primal drive just as well could be accepted as full compensation if the action became non-viable: not just in the sense of being a good trade offer, but in the sense of literally satisfying the specific motivation to pursue that action.

I don't know much about Vaush and see no point in demonizing him, so let's discuss a hypothetical Traush. Traush says the same things as Vaush here, but is a fanatical leftist from a reactionary's nightmare, who values human-made art and despises AI art for one reason only: most people with marketable artistic skill are leftists, therefore their effective monopoly on art (i.e. means of indoctrination) advances his political agenda, and AI jeopardizes that. Put another way, his motivation to protest AI art genuinely reduces to will to political power. Assuming that Traush is a rational actor, he'd agree to some credible offer that advances the leftist agenda through no new effort on his part and to an extent that offsets (or more than offsets) the political loss induced by the democratization of AI-based art-making tools.

For example: after working with expert committees, the state proposes two possible solutions. A) Human art is subsidized on every level, from special free courses to giant exhibitions; people who want to dedicate their lives to artistic pursuits get a premium UBI package; and commercialization of AI art is prohibited. B) Starting at the age of 2, all children go to boarding schools with robo-nannies and heavy ideological propaganda whose slant is determined by Traush's party, but for the next 20+ years nobody subsidizes art or regulates AI tools for art. There's a referendum to vote for one option or the other.

Under the assumptions stated above, I believe Traush would pick option B in a heartbeat.


Edit: in rationalist parlance, all that is to say that reducible goals are trivially instrumental, whereas non-reducible ones are either terminal, perceived as terminal, or at least hard to substitute without compromising some terminal goal.

Honestly, I'd like to understand the philosophy informing your schtick. Is this some devil's advocacy, Williams syndrome, or do you just like making people explain themselves?

Somewhere back on Reddit, I think someone explained that CSCA is a neoreactionary. At risk of mod intervention: I almost have to wonder if they went so far down the NRX wormhole that they teleported back to neoliberalism, because the vibe I got was "everything-is-actually-fine/nothing-ever-happens centrism."

It doesn't tell you much, but what it does tell you is very important: it tells you how much you should care, based on who that person is and what drive they're pursuing. You can spend time and effort trying to parse the particulars of someone's ideology and motives, and figure out all the myriad sensible reasons they despise or oppose you, or you can recognize at the beginning of that process that this is just a tribal antagonist pursuing a simple hunger, and you don't need to know the details beyond that.

There is no glimmering insight to be found, there is no revelation to be uncovered when one understands the essence of Vaush. In the end, Vaush is just making arguments that back up his team. The details are masturbatory.

If the AI is going to obsolete anyone, then it will quite literally be my enemies. I am very much opposed to the twitterati ruling class. I make a living by writing code. And yet I am in complete agreement with Vaush's views on AI art. How do you explain that?

You are probably weird, in the way so many of us here are, in that you have a modestly coherent worldview arrived at in significant part through reason. I have no personal knowledge of the man, but by hearsay, Vaush is some sort of Wish.com Marxist Twitter/YouTube intellectual, right?

My impression of Vaush was that, while he may not be a lot better at intellectual rigor than your typical progressive, he does oppose idpol to some degree. He collaborated with Sh0eonhead at least once, I think, and both of them caught some shit for that.

Hating progressive excess doesn't make you not on their side. Are you against leftist hegemony and what it has produced, and would be comfortable in a world with actual right-wingers in charge, or do you just not like the fringe of your own people?

You're really grasping at straws here.

You're evading a simple question, likely because the answer would indicate you are broadly aligned with the leftist hegemon.

I literally, unironically support the establishment of white ethnostates. Is that sufficiently opposed to the hegemon for you?

It is, though your initial evasiveness now leads to my being unavoidably skeptical of your assertions. Assuming you are telling the truth, and you would indeed rather see the right flourish above the left (rather than the left above the more-left), then you are evidence against my theory. I do not have an explanation for you; you could be evidence of my being wrong, or there could be an explanation I'm simply not seeing at the moment, given my inability to holistically examine you.

So let's proceed from the position you're sincere, and I am wrong. You see before you something that is going to transform the creative landscape - by empowering people who don't yield to the progressive hegemon to create things they like. You see an evolution of expression that will offer infinitely more creative freedoms to people.

You loathe this. Why?
