Culture War Roundup for the week of September 12, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Finally something that explicitly ties AI into the culture war: Why I HATE A.I. Art - by Vaush

This AI art thing. Some people love it, some people hate it. I hate it.

I endorse pretty much all of the points he makes in this video. I do recommend watching the whole thing all the way through, if you have time.

I went into this curious to see exactly what types of arguments he would make, as I've been interested in the relationship between AI progress and the left/right divide. His arguments fall into roughly two groups.

First is the "material impact" arguments - that this will be bad for artists, that you're using their copyrighted work without their permission, that it's not fair to have a machine steal someone's personal style that they worked for years to develop, etc. I certainly feel the force of these arguments, but it's also easy for AI advocates to dismiss them with a simple "cry about it". Jobs getting displaced by technology is nothing new. We can't expect society to defend artists' jobs forever, if they are indeed capable of being easily automated. Critics of AI art need to provide more substantial arguments about why AI art is bad in itself, rather than simply pointing out that it's bad for artists' incomes. Which Vaush does make an attempt at.

The second group of arguments could perhaps be called "deontological arguments" as they go beyond the first-person experiential states of producers and consumers of AI art, and the direct material harm or benefit caused by AI. The main concern here is that we're headed for a future where all media and all human interaction is generated by AI simulations, which would be a hellish dystopia. We don't want things to just feel good - we want to know that there's another conscious entity on the other end of the line.

It's interesting to me how strongly attuned Vaush is to the "spiritual" dimension of this issue, which I would not have expected from an avowed leftist. It's clearly something that bothers him on an emotional level. He goes so far as to say:

If you don't see stuff like this [AI art] as a problem, I think you're a psychopath.

and, what was the real money shot for me:

It's deeply alienating, and if you disagree, you cannot call yourself a Marxist. I'm drawing a line.

Now, on the one hand, "leftism" and "Marxism" are absolutely massive intellectual traditions with a lot of nuance and disagreement, and I certainly don't expect all leftists to hold the same views on everything. On the other hand, I really do think that what we're seeing now with AI content generation is a natural consequence of the leftist impulse, which has always been focused on the ceaseless improvement and elevation of man in his ascent towards godhood. What do you think "fully automated luxury gay space communism" is supposed to mean? It really does mean fully automated. If everyone is to be a god unto themselves, untrammeled by external constraints, then that also means they have the right to shirk human relationships and form relationships with their AI buddies instead (and also flood the universe with petabytes of AI-generated art). At some point, there seems to be a tension between progress on the one hand and traditional authenticity on the other.

It was especially amusing when he said:

This must be how conservatives feel when they talk about "bugmen".

I guess everyone becomes a reactionary at some point - the only thing that differs is how far you have to push them.

Man, an hour-long video of a guy for some reason telling his chat to stop asking questions rather than just looking away is an awful medium for making a point. I'm 20 minutes in and all I've got so far is the super basic "it took 'er jorbs" point. I'm sorry that I'm a "subhuman piece of shit" because I'd rather have near-limitless access to new art than keep artists employed digging holes and filling them back in again. Oh god, he's trying to claim that this is going to favor corporations over the little guy? He knows that's not a huge budget item for corporations but can be half the budget for the little guys, right? Very frustrating watch.

I think objection to AI art will gradually be more coded as right-wing than left, and I think it goes hand in hand with the left caring more about art as "symbol" or its usefulness, rather than as something like contact with the real, which I think is more right-wing and implicitly rejects death of the author.

I think Vaush is basically holding a reactionary opinion here, because something he likes is threatened. But I think the pro-AI art position is the view that's going to be rewarded most on the left. It's what young people will be doing, it's equal-opportunity, etc. I think the hair-splitting he does across tech progress in art kind of gives away that he's not holding a cohesive worldview. I don't think art as "communication" solves the riddle here, especially if AI art could allow us to communicate better or more easily.

Instead I think that there's a pretty cohesive argument that every technology that led to making art easier to produce was eventually exploited to make cheaper, broadly appealing, "worse" art. Even oil paint fits, especially if you argue that the time between introduction and exploitation got shorter and shorter over time, possibly due to a weaker institutional reactionary resistance each time. But you see it with photography, synths, digital cameras, and I'm sure I could go on. And what you're seeing right now is that there is absolutely no friction against someone exploiting this new tool, to the point they are exploiting it before it's even any good.

But I think inevitably a leftist or liberal would accept a pro-AI art position. In a leftist utopia you'd have both, and they'd be paid the same. And a liberal would just challenge you to make your art better and challenge the AI on its own terms. Is that an incorrect characterization?

Sidenote: The way these youtube debaters interact with chat or play videogames when they talk (not in this video) just completely reads decadent society to me.

Sidenote: The way these youtube debaters interact with chat or play videogames when they talk (not in this video) just completely reads decadent society to me.

Like what?

I'm not sure what you mean by your question

I was asking you to explain what you mean by "the way [they] interact".

That sidenote came more from a feeling than thought out logic, so I kind of have to analyze it to answer your question.

One case in a different video was Destiny getting a heartfelt call from a therapist talking about trans issues, while Destiny just "uh-huh"ed through it playing Terraria.

In this video I feel like the constant chat feed is used as this sort of distraction in order to kind of reinforce the speaker's socially dominant role, while allowing him to kind of skip through an unthorough argument.

In both cases, there is kind of a conflation of entertainment with politics and philosophy, which obviously has only been growing over the past 10 years. But it's not a marriage of those things, like a well-written political book that makes you think. It's like a series of orthogonal, unrelated abominations, each a different style of dopamine hit.

The main concern here is that we're headed for a future where all media and all human interaction is generated by AI simulations, which would be a hellish dystopia. We don't want things to just feel good - we want to know that there's another conscious entity on the other end of the line.

I can see this as a Future Problem, but right now the "conscious entity on the other end" is simply a prompt writer. There is a sense of community to be gained from indulging in and working on AI generation together. I think it is misleading to apply the bugman/we-will-be-in-pods argument to text-to-image tools, because new means of human interaction are forming as a result of them.

Also, some of us just hate the majority of conscious entities and are happier with what simulations we can get. This obviously doesn't apply to you or Vaush, but I wonder what brings you both to so viciously condemn the estranged, the alienated, the anti-social.

artistic talent is a perfect example of haves versus have nots; the communist ideal would necessarily include systems that allow anyone to generate art as they desire. when these systems advance to allowing individuals to generate entire animated series and films, society will undergo a historic decoupling of capital from entertainment. ie, furthering the communist ideal.

on a long enough timeline capitalism will compete itself into socialism. systems like this are exactly why. opposition necessarily carries water for entrenched power and capital.

that which can be automated will be.

AI art can be thought of as the next level of Photoshop. Like Photoshop, it does not replace the artist but rather gives artists more tools. First, like regular art, there is a learning curve. Second, AI-generated art has a distinct appearance. This may be fine for applications in which it does not matter, but the examples of art produced with Stable Diffusion look weird: pastel-like, or deformed.

As someone quite familiar with the space, it's less of a tool and more the tool.

Progress will most likely continue at its current breakneck pace; there's no real architectural reason as of now why it shouldn't. And I suspect 2-3 years from now, image models will completely invalidate the need for Photoshop (nor will there be a need for Procreate, Lightroom, etc). You'll Just Add Language™ and achieve functionality several orders of magnitude more effective than the old point-and-click approach.

I anticipate a future where every seed for every concept in every imaginable latent space will be pre-generated, and humans will simply browse through the results until they find something they enjoy. Does that sound artistic to you?

And to your point on AI generated imagery having a "distinct appearance": that may have been true a handful of months ago, but the most recently open sourced Stable Diffusion model (v1.5) no longer suffers from any real stylistic slant. Try a free generator here if you're still skeptical. AI is really getting to the "blink and you'll miss it" stage, and it's exciting to see.
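
For anyone who'd rather poke at this locally than use a web demo, the "Just Add Language" point is easy to see in practice. A minimal sketch using the Hugging Face diffusers library; the model ID, prompt, and filename here are just illustrative, and you'd need a CUDA GPU with a few GB of VRAM plus the torch, diffusers, and transformers packages:

    # Minimal local text-to-image sketch with the diffusers library.
    # Model ID, prompt, and output filename are illustrative only.
    import torch
    from diffusers import StableDiffusionPipeline

    # Load a Stable Diffusion v1.x checkpoint in half precision onto the GPU.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
    ).to("cuda")

    # The whole "interface" is a sentence.
    image = pipe("a lighthouse at dusk, oil painting, dramatic lighting").images[0]
    image.save("lighthouse.png")

The point being that the interface really is converging on "type what you want": no layers, no brushes, just a sentence and a seed.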

I recently asked an artist friend of mine what he thought of the tool after showing him the thing in all of its glory with img2img and GFPGAN and so on. His take was that it was impressive but still just a tool, that maybe with an advanced version of this you would only need an art director for a project instead of 5 people, but that the idea it would replace the artist was definitely out of bounds.

In the end, browsing for what is enjoyable in the litany of possibilities is what art is. And it's a skill.

that this will be bad for artists, that you're using their copyrighted work without their permission, that it's not fair to have a machine steal someone's personal style that they worked for years to develop, etc.

It is difficult to convey in human language how absurd this argument seems. Is there anyone willing to actually defend it? What do people think "an artist's style" is, and why do they believe that it is, in any meaningful sense, something that can be "owned", on which property rights in any form could be enforced? At the moment, my best guess is that people making such arguments are either so thoroughly confused that they have nothing of value to say, or so dishonest that good-faith conversation with them is flatly impossible.

It seems in line with how people treat trade dress. The underlying justification was to prevent customer confusion where someone buys a product they did not intend because the branding was so similar. That has been expanded into the look-and-feel standard in the digital age. It's not that far of a logical leap to apply that to a case of training an AI on a specific artist. Take a test case of a programmer with no art skills training an AI on a particular artist they like to produce an app or a game and profiting off the similarity and you've got a precedent for that sort of thing. I don't think that's the correct way to handle things but I can see something like that happening.

If one artist learns another artist's style, should they be prevented from selling work in that style? Under current law or any previous laws I'm aware of, absolutely not. So where does this idea of protecting artistic style come from? Certainly "trade dress" or "look and feel" have never been applicable before, so why should they be applicable now? And if they are applied, why are they not applied for human artists as well?

The ease of copying is the difference. Every artist that wants to copy artist X's style has to learn it themselves and it's "bodybuilding hard". It doesn't really scale. With art AIs, anyone can teach one to draw in any style they have enough samples of; you just need some gigaflopses, and then the weights can be duplicated infinitely.

Romm Art Creations Ltd. v. Simcha Int'l., Inc. seems applicable to me but I'm just some rando on the internet. Typical plaintiff work compared to a defendant work that was enjoined in that case.

I think the decision in question was simply wrong, and do not think judges actually deliver such decisions on anything approaching a regular basis.

compare this and this.

That's two different artists, with the former intentionally getting as close to the latter's art style as possible, and then using it for his own profit. No one pretends that this is even slightly objectionable.

It is, but that decision is crazy sauce and apparently wasn't pursued after the preliminary injunction.

People are generally confused about these kinds of rights and only have a vague idea of "intellectual property rights" (a term Richard Stallman considers deliberately misleading propaganda in itself), based on FUD spread by music, movie, book, software, etc. publishers.

The term "intellectual property" blurs the lines between and masks the purpose behind different kinds of laws like copyright law, trademark law, patent law, trade secrets, the banning of industrial espionage etc. People don't understand even the basics, like ideas can't be copyrighted, only concrete expression, etc.

In this context an artist's style seems like just another natural intellectual property.

It doesn't seem much more absurd than saying that Mickey Mouse is something that can be owned and be subject to property rights. Which is exactly what existing copyright law does.

Determining whether someone has copied an artist's style would be more difficult than determining whether someone copied the design of Mickey Mouse, but given that "in the style of X artist" prompts are extremely popular with SD users right now, and people can coherently discuss how accurate or not the AI was at reproducing the requested style, it doesn't seem like it's totally impossible.

It doesn't seem much more absurd than saying that Mickey Mouse is something that can be owned and be subject to property rights. Which is exactly what existing copyright law does.

"A representation of mickey mouse" and "the style of mickey mouse" might not sound too different; after all, most of the words are the same. In the same way, we might say that looking at the moon and travelling to the moon are similar, since most of the words are the same. The reality is somewhat different.

Mickey Mouse is a specific representation. "style" is the raw material representations are made of. Metallica can copyright "Enter Sandman." They can't copyright musical notes, or even angry-sounding musical notes played on an electric guitar, but the latter is what you are advocating. You're saying that someone should be able to copyright "grim detective stories set in the 1920s", or "stories about D-Day."

You are claiming that people have a right to exclusive ownership of specific colors, line widths and angles, textures, curves and shapes, elements of composition and rhythm, not as a description of a specific drawing or subject, but in general across all drawings. You aren't claiming that people can copyright mickey mouse, you're claiming they can copyright specific kinds of circles, the color orange, and vanishing-point perspective. That is what style is: heuristics for simplifying the infinite complexity of the real world into something more easily expressible. Artists can develop styles, or discover them. They cannot own them in any meaningful sense. No one can. Claiming otherwise is a naked assertion to ownership of someone else's brain.

As an aside, SD can generate pictures of Mickey Mouse doing novel things, same with any Marvel characters and so on. If I'm not allowed to release a new cartoon of Mickey Mouse (or Batman) acting out a new story, are the SD authors allowed to release this model?

(This is a distinct topic from style.)

The AI isn't itself a new cartoon of mickey mouse or batman. It can be made to create a new cartoon of mickey mouse and batman, in much the same way photoshop or a pencil and paper can. Why should the model be regulated differently from paper or drawing software?

Photoshop does not contain anything specific to mickey mouse, I have to know what he looks like if I want to create a picture of him. Meanwhile, SD does know what mickey looks like, I don't have to know. Even a blind man who has no idea what the mouse looks like can create images of him because SD contains the info of what the character looks like.

I'd agree with you if I had to type in a full, detailed description of what mickey mouse looks like, color, shape etc, and SD knew how to draw him only afterwards.

SD pretty much contains a representation of mickey mouse in the model weights. I'm not allowed to release a textured 3d mesh of mickey mouse, even though the user first has to choose a viewing angle and a light source position etc in order to render a pic of mickey from that 3d asset. Similarly, here with SD we don't have a 3d mesh, but we have something that can be controlled slightly differently and is still a representation. The format is neural weights instead of explicit 3d assets, but the situation is very similar. Else what do you say about neural encodings of distance fields from which the surface can be recovered? How about NeRFs?

Photoshop does not contain anything specific to mickey mouse, I have to know what he looks like if I want to create a picture of him. Meanwhile, SD does know what mickey looks like, I don't have to know. Even a blind man who has no idea what the mouse looks like can create images of him because SD contains the info of what the character looks like.

It contains the colors yellow, red, black and white. It contains curve tools that can represent lines of specific angles and specific thicknesses, and a raster grid on which they can be presented.

Neither photoshop nor the AI contain an actual image of Mickey Mouse. They both contain the tools necessary to depict mickey mouse. Photoshop lacks the idea of mickey mouse, and so needs a human who does have that idea. The AI simply contains the idea. Not a picture of mickey, the idea of mickey.

Even a blind man who has no idea what the mouse looks like can create images of him because SD contains the info of what the character looks like.

Even a blind man can create an image of mickey in photoshop; a custom UI would make it easier but is not actually necessary. square canvas, new layer, circle tool>center-of-canvas>150-pixel-radius, set line width to 3 pixels, circle tool> center-of-canvas-minus80px_x-minus80px_y>80-pixel-radius, etc,etc...
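
To make that concrete, here is a throwaway Python sketch of the same exercise using the Pillow imaging library. The coordinates are the hypothetical ones dictated above; the point is that the script holds no idea of Mickey at all, it just executes whatever geometry it is handed:

    # Drawing purely by dictated instructions: the script contains no image
    # or concept of any character, it only executes geometric commands.
    # Coordinates are the hypothetical ones from the comment above.
    from PIL import Image, ImageDraw

    canvas = Image.new("RGB", (512, 512), "white")
    draw = ImageDraw.Draw(canvas)

    # "circle tool > center of canvas > 150-pixel radius, line width 3"
    draw.ellipse((256 - 150, 256 - 150, 256 + 150, 256 + 150), outline="black", width=3)

    # "circle tool > center minus 80px x, minus 80px y > 80-pixel radius"
    cx, cy = 256 - 80, 256 - 80
    draw.ellipse((cx - 80, cy - 80, cx + 80, cy + 80), outline="black", width=3)

    canvas.save("dictated_drawing.png")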

SD pretty much contains a representation of mickey mouse in the model weights.

In much the same way that my brain contains a representation of mickey mouse, yes. In other senses, very much no. There is no picture, there is no mesh. There is no actualized output contained in the model. There is the idea, just as there is in my own mind. The AI is a rudimentary mind, not a collection of pictures. I'm pretty sure this can be proved mathematically, just based on the size of the final model versus the size of the training set, versus theoretical limits of data compression. The original pictures are not in there in any meaningful sense.
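
For what it's worth, the back-of-the-envelope version of that argument, using rough, commonly cited figures (a v1 checkpoint of roughly 4 GB and a training set on the order of a couple of billion LAION images; both numbers are approximations, not exact counts):

    # Back-of-the-envelope: how many bytes of the checkpoint could each
    # training image possibly account for? Both figures are rough,
    # commonly cited approximations.
    checkpoint_bytes = 4e9   # Stable Diffusion v1 checkpoint, ~4 GB
    training_images = 2e9    # LAION training subset, order of a couple billion images

    print(checkpoint_bytes / training_images)  # => 2.0, i.e. ~2 bytes per image

Roughly two bytes per training image is orders of magnitude below even a ruthlessly compressed thumbnail, which is the sense in which the original pictures cannot all be "in there".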

Else what do you say about neural encodings of distance fields from which the surface can be recovered? How about NeRFs?

I have no idea what this means. Elaborate?

In much the same way that my brain contains a representation of mickey mouse, yes.

Yes, but you can't release your brain. It's not an artefact or a tool. Humans and their minds have a very different standing under the law than inanimate objects and information-carrying media.

I have no idea what this means. Elaborate?

There are new ways of representing 3D scenes or 3D geometry using neural networks. They encode the properties of the 3D scene in neural network weights, and they can be used to create new images. But the representation has no notion of images, pixels, vertices, textures etc, it's all a bunch of "opaque" neural weights.

Here's one variant described: https://youtube.com/watch?v=T29O-MhYALw

The point is, law usually cares about intended use and how one interacts with the thing, not the implementation details. And nobody really knows how courts will treat these new methods. Laws were not made with the knowledge of such things, so interpretations of the wider goals will have to guide the court's work.

I'm aware that existing copyright law doesn't cover style, and I'm not saying it should. I'm just saying it's not as absurd and incoherent as you made it out to be, that's all.

In what way is it not absurd or incoherent? The original quote claims that artists have a right to the styles they use. What would such a right actually look like, in detail? The above is an attempt to actually describe what it would look like, and why such an idea is crazy. If you think it's not crazy, I'd be interested in hearing an argument as to why.

The left sees the artistic class as fundamentally theirs, and AI disenfranchising them is a loss of social capital. That's all there is to it, and the appropriate response is to not care.

Did you actually watch the video?

I don’t see how you can walk away from it thinking that Vaush doesn’t deeply care about this issue on a personal level. And I went in skeptical, assuming that he didn’t care about it on a personal level.

From how he has behaved himself over the last few years I find it difficult to believe that Vaush cares about anything but his own fame and power.

I mean, maybe he does this time, but it would require something much more extraordinary than his usual shtick to move my priors.

Yeah but fuck Vaush and whatever he cares about. I fucking knew "bread tube" would be all over the "anti-AI" side.

A major goal of this forum is to avoid the kind of incendiary language used here. Speak plainly. Don't attempt to build consensus about whether Vaush sucks, give an argument instead.

What led you to predict that "bread tube" would be anti-AI? Do you find their position hypocritical? Anti-progress?

He is a grifter without core principles. Whatever he has to say about anything is worthless. He is Unironically Evil.

Regardless of Vaush's principles, I think he is getting at something core to leftists here. Like, even if he doesn't believe AI art is fundamentally unnerving, the grift of this video would be him representing what he thinks his fellow travelers and audience believe.

Remember when NFTs were at their peak and pretty much everyone who wasn't buying or selling them thought they were idiotic? Their general dismissal was just that they were dumb and looked terrible. But the criticism from leftists was much more severe and in some regards deranged. They wheeled out the climate arguments, started talking about stuff they didn't understand and massively overclaimed the argument. It was approaching levels of the Keffals/KF debacle where the facts didn't matter, only achieving their result and I think would have gotten worse if the public hadn't rejected the Apes.

I am not sure where this core distaste is coming from but I am sure it exists. My best theory at the moment would be Minotaur's, that they believe they "own" art and cringey libertarians with doofus monkeys and robots can't be allowed to have it.

I am not sure where this core distaste is coming from but I am sure it exists. My best theory at the moment would be Minotaur's, that they believe they "own" art and cringey libertarians with doofus monkeys and robots can't be allowed to have it.

Probably this. As creative types in entrenched industries lean left, it creates the illusion that Art and its expression are left-aligned. It's just a temper tantrum and this "controversy" will go the same way as Photography. The more concerning development in this space is that they are modifying the algos to introduce Black people and assorted terms into the images when the prompt isn't specific enough.

As for the Vaush issue, I'm sure this issue is better and more honestly expressed elsewhere. I would prefer to give views to, and hear the arguments of, principled individuals trying to define the issue, rather than make the muckrakers' platforms bigger, especially that of someone who has expressed a distaste for debating channels smaller than his own just because he would be expanding their audiences.

EDIT (Regarding Art & Leftism): This is probably why there was so much blowback when Terese Nielsen liked tweets that weren't kosher with the party line, despite being a married lesbian living in Utah, and why there was such vitriol when she defended herself by comparing that situation to when she was still in the closet. The mob couldn't conceive that they were even superficially similar to those Conservative Bigots™.

Feeding the very soul of humanity into the endlessly hungering maw of sociopathic capitalism to own the libs art kids.

Wow! Just like my Canadian industrial music!

I have to wonder if we'll either shut it down or learn to live with it and maybe simply redefine what it means to be human or have a soul.

can you summarize the 1h40m video? Or ... explain this?

If I said "mitch mcconnell is a grifter with no principles. whatever he has to say is worthless. he is unironically evil". There's probably some truth to that - but, it's sort of true for many politicians, some R some not R (as well as those in other countries) - it's not useful, it doesn't explain why, how he is - and it doesn't even attack the arguments he's making.

I got about halfway through. It's a ton of examples of Vaush lying, shamelessly, about pretty much anything and everything. He appears to be the sort of person for whom language exists exclusively to manipulate others.

Vaush is absolutely a disingenuous grifter.

He backflipped from 'rape and sexual assault of women is such an important, underappreciated issue that society tragically ignores' to 'bullshit, she's lying, Muslims would never rape white women in Australia' in real time. This isn't just standard politician inconsistency but completely refusing to believe evidence after it disfavours his cause - in a matter of seconds.

https://youtube.com/watch?v=mhZ0JqQOsDA

The short and sweet of it is that, like any breadtuber out there, he uses a bunch of tactics to argue in bad faith in any debate he has with ideological opponents. He constantly tries to dehumanize them when all is said and done and he is "alone" in his stream, and his only apparent core principle is winning, expressed in the quote "You call it selling out your principles? I call it Fucking winning. And that is my principle."

If you've seen enough of him, it's hard to believe he's the sincere type about anything.

I'm sure he does care about it for many reasons, much like how there are many things I care about for many good reasons. Yet those reasons all boil down to justification thrown over deeply personally-felt losses of status and power.

People don't like being blunt or crass. They want to decorate their selfishness with good-sounding justifications. Ultimately, why Vaush cares is that AI weakens his allies. He needs a more palatable reason, both for others and for his own lofty self-image. It's very human.

It's also very unimportant.

Do you think there are any psychological motivations that don’t ultimately reduce to personal power?

Absolutely. Sometimes we're motivated by hunger, sometimes we're horny, etc. If you mean do I believe in any motivations that don't reduce to primal drives, no. It's all the accumulated layers of deniability, obfuscation, and justification we heap on very primitive and simplistic desires.

Which isn't the same as being insincere. I trust that Vaush truly does feel strongly about this. I also trust he wouldn't if the AI was obsoleting his enemies instead.

I think the issue here is - even if people are driven to do algebraic topology "because being curious and smart got ancestors food and other ancestors iron weapons" - algebraic topology has complexities, effects that aren't really related to "status" or "hunger". Same for Vaush and art. What does it even mean to 'reduce' a desire or action to a primal drive? What does it tell us about that action? If a society can electrify itself solely because of "status drives" and "hunger" and "horny" ... maybe it doesn't tell us much about an action or desire if it's built from "primal drives".

Honestly, I'd like to understand the philosophy informing your schtick: is this some devil's advocacy, Williams syndrome, or do you just like making people explain themselves?

What does it even mean to 'reduce' a desire or action to a primal drive? What does it tell us about that action?

Reduction implies the absence of inherent interest in the action. At a minimum it tells us that some other inputs satisfying that primal drive just as well could be accepted as full compensation in the case of the action becoming non-viable, not just in the sense of being a good trade offer, but literally satisfying the specific motivation to pursue that action.

I don't know much about Vaush and see no point to demonizing him, so let's discuss a hypothetical Traush. Traush says the same things as Vaush here, but is a fanatical leftist from a reactionary's nightmare, who values human-made art and despises AI art for one reason only: most people with marketable artistic skill are leftists, therefore their effective monopoly on art (i.e. means of indoctrination) advances his political agenda and AI jeopardizes that. Put another way, his motivation to protest AI art genuinely reduces to will to political power. Assuming that Traush is a rational actor, he'd agree to some credible offer that advances the leftist agenda through no new effort on his part and to an extent that offsets (or more than offsets) the political loss induced by the democratization of AI-based art-making tools.

For example: after working with expert committees, the state proposes two possible solutions. A) human art is subsidized on every level from special free courses to giant exhibitions, people who want to dedicate their lives to artistic pursuits are getting a premium UBI package, commercialization of AI art is prohibited. B) starting at the age of 2 years, all children go to boarding schools with robo-nannies and heavy ideological propaganda the slant of which is determined by Traush's party, but for the next 20+ years nobody subsidizes art nor regulates AI tools for art. There's a referendum to vote for one or the other option.

Under the assumptions stated above, I believe Traush would pick option B in a heartbeat.


Edit: in rationalist parlance, all that is to say that reducible goals are trivially instrumental, whereas non-reducible ones are either terminal, perceived as such, or it's hard to see how they could be substituted without compromising some terminal goal.

Honestly, I'd like to understand the philosophy informing your schtick: is this some devil's advocacy, Williams syndrome, or do you just like making people explain themselves?

Somewhere back on Reddit, I think someone explained that CSCA is a neoreactionary. At risk of mod intervention: I almost have to wonder if they went so far down the NRX wormhole that they teleported back to neoliberalism, because the vibe I got was "everything-is-actually-fine/nothing-ever-happens centrism."

It doesn't tell you much, but what it tells you is very important: it tells you how much you should care based on who that person is and what drive they're pursuing. You can spend time and effort trying to parse the particulars of someone's ideology and motives, and figure out all the myriad sensible reasons they despise or oppose you, or you can recognize at the beginning of that process this is just a tribal antagonist pursuing a simple hunger, and you don't need to know the details beyond that.

There is no glimmering insight to be found, there is no revelation to be uncovered when one understands the essence of Vaush. In the end, Vaush is just making arguments that back up his team. The details are masturbatory.

If the AI is going to obsolete anyone, then it will quite literally be my enemies. I am very much opposed to the twitterati ruling class. I make a living by writing code. And yet I am in complete agreement with Vaush's views on AI art. How do you explain that?

You are probably weird, in the way so many of us here are, in that you have a modestly coherent worldview arrived at in significant part through reason. I have no personal knowledge of the man, but by hearsay, Vaush is some sort of Wish.com Marxist Twitter/Youtube intellectual, right?

My impression of Vaush was that, while he may not be a lot better at intellectual rigor than your typical progressive, he does oppose idpol to some degree. He collaborated with Sh0eonhead at least once, I think, and both of them caught some shit for that.

Hating progressive excess doesn't make you not on their side. Are you against leftist hegemony and what it has produced, and would be comfortable in a world with actual right-wingers in charge, or do you just not like the fringe of your own people?

You're really grasping at straws here.

I don't think /u/Minotaur was saying that Vaush says we shouldn't care. They were saying that they themselves didn't care.

Incidentally, /u/anything links to a reddit username. @Minotaur, meanwhile, produces a proper ping/link here.

There was a problem with that too, but I guess Zorba fixed it recently.

Darn it, I've used /u/ in other posts, too.

This is correct, and if that was the root of the misunderstanding thank you for clarifying it for him. Vaush cares; unless you're one of Vaush's fellow travelers, you shouldn't. And if you are, I'm sure you already do.

Well there are conservatives who would like to claim the artistic class as their own, upset that it was ever ceded to the left.

This shows the limits of orienting one's political position around opposition to tribal enemies (if they're into it, I'm out of it). It does an injustice to political evolution in ideation.

Those conservatives are beating a dead horse. Whether the right once owned the artistic class is immaterial; currently, it doesn't. The left has mainstream cultural dominance. It exerts this overwhelming pressure on all artistic endeavors. It is not omnipotent, but it is there, and it is keenly felt. AI will weaken that.

counter-trans ideology

I have noticed the analogy, which is part of why I’m slightly surprised that this forum is so pro-AI. I mean, given the LessWrong origins of this forum, it makes sense they’d be pro-AI. But this is a decidedly reactionary slice of that original LW readership. How can the same group of people be so reactionary on so many issues while also supporting the prospect of AI-induced complete social disruption? Yes yes, it doesn’t have to be the same individuals making both types of posts, but still.

I don't think the forum is anti-trans in principle; it's just that almost all trans are diametrically opposed to more core values that this forum holds. Plus the transhumanist roots of this forum are in line with atom-for-atom transness... not surgical imitation. We want the real McCoy!

I don't think the forum is anti-trans in principle; it's just that almost all trans are diametrically opposed to more core values that this forum holds.

I don't think this is quite right, actually. People in this forum being "anti-trans" is really only true to the extent that they are against the demands of self-proclaimed pro-trans activists. In terms of the literal meanings of the terms "anti" and "trans," this forum is pretty full of people who aren't anti-trans. Rather, it has to do with opinions specifically about the demands of self-proclaimed pro-trans activists. Obviously this is an easy equivocation to make by accident just because of the literal words involved; my belief is that this type of equivocation is encouraged - and likely even believed in - by the self-proclaimed pro-trans activists; more people believing in the unsupported notion that these activists are speaking on behalf of the actual trans population lends them greater credibility.

Did you miss "don't"? It seems like we agree completely...

No, I think you were incorrect when you wrote "it's just that almost all trans are diametrically opposed to more core values that this forum holds." I don't think it's the case that almost all trans are diametrically opposed to this forum's core values. I think it's the case that almost all trans activists are diametrically opposed to those, and also that trans activists try to give the (unsupported) impression that trans people in general have some meaningful level of agreement with trans activists.

Ah. That might be the case, but I think there's a lot of overlap between trans and trans-activist.

I have or have had significant personal experiences with a number of transpeople. By and large, the activist trait is the most important one when it has come to personally figuring out if someone's values will align with mine or not (and, separately but related, this forum's core values). All my personal experiences with trans activists have seen them be left-wing and loudly against my values. Also, loudly against the values of this forum, as it's not generally believed outside this space that hateful people should be given a platform and that it's worthwhile to rigorously explore others' ideas.

Is this actually representative of the trans community? Couldn't say. But I can definitely say that every trans activist I've seen has been a censorious progressive, and every non-activist transperson I've dealt with is just... a normal person who happens to be trans and doesn't want to talk about it and is keenly more right-sympathetic, if not an outright right-winger themselves. Often, this left/right, activist/non-activist split also lines up with a doesn't pass/passes one, which I suspect has a lot to do with certain attitudes.

If the values of this forum are about free speech and rational constructive discussion, then why is it dominated by conservatives and reactionaries who don't let left-wing comments get high ratings? Does this mean only right-wingers value rational political debate, or does it mean that left-wingers do not like the way they are treated when they comment on here?

To elaborate on @07mk's parallel post, none of the issues I have with the trans rights movement seem to apply to AI art. Nobody is forcing me to affirm in word or deed that AI (or, for that matter, non-AI) art is in fact legitimate, or passing laws forcing me to fill a real-art quota among the acquisitions of my hypothetical company. I can call Jackson Pollock or Rembrandt uninspired crap all I want without any fear of losing my job. If the trans rights movement were really just about the right of people to transition and unilaterally call themselves their chosen gender, or of other people to agree with them about that, I would have zero issue with it (and in fact be opposed to its opposition); conversely, if AI art proponents did all the aforementioned things, I would fight them, and if they had any degree of success, I would not even mind salting the earth where AI art grew to disincentivise any future thought-police ambitionaries.

I'm just not convinced it'll be that disruptive or destructive in the end. Human art will still have a place in the world of AI artists in the same way that organic or natural alternatives find a spot in many other markets. Printing presses didn't destroy artists, and neither will this. People didn't suddenly become uninterested in owning original artworks because they could buy a cheap print instead. Buyable asset packs and the existence of crap asset flips like Gone Home have not destroyed proper ground-up game design.

I predict that AI art will be briefly high status for a while as a curio, and then once everyone can do it cheaply and easily it'll be low status. People who don't really care about art will hang some cool-looking stuff on their wall because it's relatively cheap, everyone else will go on as before.

I have noticed the analogy, which is part of why I’m slightly surprised that this forum is so pro-AI.

The analogy doesn't quite fit for where the rubber meets the road, does it? When it comes to deeming something created by Midjourney "real art," what does that actually involve for the individuals involved? Nothing, really; a particular arrangement of pixels being "real art" or not is mainly a metaphysical question that doesn't interact with our physical reality to much of an extent. At the end of the day, the arrangement of pixels is the arrangement of pixels, and people will continue to use that arrangement of pixels for things that arrangement of pixels are good at doing, regardless of whether we consider it "real art" or not.

When it comes to deeming a transwoman a "real woman," what does that actually involve for the individuals involved? It means, among other things, having some sort of enforcement regime by which people talking about the transwoman are limited in the terms they can use. It's not just a metaphysical question that people can make an invisible mental categorization as they wish and go about their day; it's a physical question with physical consequences that differ greatly depending on the categorization.

At the end of the day, the arrangement of pixels is the arrangement of pixels, and people will continue to use that arrangement of pixels for things that arrangement of pixels are good at doing, regardless of whether we consider it "real art" or not.

Yes: "is this image art, or is it merely beautiful?" All I can say is that's a nice problem to have.

Art historically has a long tradition of pushing the boundaries of what can be considered "art". Duchamp's Fountain is probably the most notable example here.

Of course, technology enabling new types of art is also nothing new: at times we've had crises about whether recorded music would displace musicians, or synthesizers would displace specific instrumentalists, or photography, painters. In most cases the answer is "somewhat", and I expect AI art will probably be disruptive, but at the same time the authenticity of a human creator will probably remain the pinnacle of status in most cases.

It's part of a long-standing trend you see in Western conservatism, where precedent-destroying economic activity is celebrated while traditional gender roles and such are simultaneously upheld.

I'm pro-AI and anti-trans ideology. Can you elaborate on why you think these should be somehow related in my mind? I don't understand why you're confused.

How can the same group of people be so reactionary on so many issues while also supporting the prospect of AI-induced complete social disruption

Charles III's old speech where he mentions Guénon has been making the rounds for obvious reasons, and it explains this. Traditionalists aren't the simple minded people that the progressives like to paint them as, who would be allergic to any form of change. What they have is an aversion to the modern that stems from a connection to the sacred that they have and that moderns lack.

The short of it is that they recognize that some things have a natural essence that can not be changed or gone against without dire consequence, not that all things are such.

Consider the world of Dune, one where technology has "disrupted" mankind to such extreme degrees that it is hard to parse as mankind at times. Yet who would argue that such a world is further from what traditionalists want than the globalist planner dreams they so often rail against?

Perhaps I'm parsing this incorrectly, and if so, my bad, but that didn't clarify anything for me. It's still unclear to me why the two would or should have any relation whatsoever.

A longing for simpler times, when men were real men, women were real women who followed the feminine principle, when people knew their place in society, when a firm handshake got you a job, when people had jobs like shoemaker or blacksmith, not marketing manager, when you got your tomatoes and eggs from the local farmer, not a multinational supermarket, and he didn't use GMO or antibiotics. When churches were beautiful not brutalist, the music was beautiful and not loud and noisy, musicians played instruments, not laptops like Skrillex, artists knew their craft and didn't just paint digitally, when people hand wrote their letters and had to pay close attention as there was no backspace on a sheet of paper. When we said hi to the cashier and didn't just use a self checkout etc.

This is the coherent aesthetic of longing for the old stable social order, the "natural" ways of doing things. Both gender bending and AI disrupt that type of good old way of things.

Oh, I see. I don't much understand longing for times past; they led to where we are now, which I'm thoroughly displeased with, so I really have no desire whatsoever to rewind and repeat. My beliefs are all oriented around the construction of something new. I dislike trans ideology not because it is disruptive with any old social systems but because it is viscerally unpleasant to the new ones I'd like to see flourish.

Sadly, Vaush seems to be repeating a lot of arguments I've seen around Tumblr and Twitter about AI art.

He brings up the tired talking point of there being some sort of labor rights issue with feeding a bunch of artists' works into an AI and "stealing" their art in the process. No such labor rights issue exists. If someone is saying this, they fundamentally do not understand what the AI algorithms are doing. Don't get me wrong, there could be other issues with AI art, and we could decide as a society that putting human-generated content into an AI is corrosive to society for other reasons and pass laws limiting that if we wanted to - that's certainly a conversation we could have as a society, but I don't know why people are starting out with a wrong-headed argument right off the bat.

He is also in the "art is a form of communication" camp, which I think tends to be the biggest divide I see in a lot of these debates. Unfortunately, the intellectual groundwork has already been laid for "death of the author" analysis, where the question asked is not "what was the author trying to communicate?", but "what meaning can I as a reader/listener/viewer of an art piece craft from it?"

Borges wrote the short story "Pierre Menard, Author of the Quixote" in 1939, which played with the idea of someone authoring a word-for-word identical rendition of Don Quixote today. In some of the most amusing passages, the exact same paragraph is quoted but given a different analysis based on whether Pierre Menard or Miguel de Cervantes was the author.

I've long been enchanted by the idea of taking a bunch of random books, pretending that they were all written by the same author and then trying to figure out what we can guess about the life of the author based on their literary output. What kind of author would write Winnie the Pooh, Starship Troopers, Call of Cthulhu, Foucault's Pendulum and the Acts of the Apostles? This is an endlessly fun literary exercise that will probably remain fun even after most of the content on our feed is AI generated.

(We've already seen joking stabs at this idea, with people claiming that Hatsune Miku wrote Harry Potter or programmed Minecraft, because they take issue with the original creator.)

I do like art, and I agree it often has communicative value. But "communication" might not even be that far off. AI text generation is also advancing at a considerable rate, even if it might be a while before we see a successor to GPT 3 that can write a whole novel from scratch. Maybe modern AI art is a babbling mishmash of parroted human communication, but in the future we might be able to make pieces that have genuine intentionality behind them even without full AGI. (This also ignores the current arguments about human prompt-makers and curators adding an element of intentionality to AI art.)

the leftist impulse, which has always been focused on the ceaseless improvement and elevation of man in his ascent towards godhood

We really need fewer sweeping generalizations. Leftism is not transhumanism; that's just a theistic reactionary's cudgel to attack the Neo-Babel of social progress and attempts to «immanentize the eschaton». Marxism certainly isn't transhumanism, despite some vaguely congruent mutterings of Trotsky and weird blood transfusion experiments, which I believe have more to do with the ideology having taken root in Russia and consuming Russian Cosmism, with its inherent pull to be used later as an extra carrot in the space race.

What do you think "fully automated luxury gay space communism" is supposed to mean?

A joke not 10 years old.

My turn: what problem do you think «socially necessary labour time» is supposed to solve in the Marxist framework? It's supposed to protect the rights of laborers (a perennial focus of leftist movements) against relative devaluing by automation, by denying legitimacy to the notion of market value, and by forcing society onto a path where things that take little human labor to produce have little cost for humans either, and things that take no human labor at all are value-less. While we're at it, what is the alienation that Vaush has mentioned? It's the concept that's meant to prevent human laborers from being reduced to tools – and eventually deemed obsolete.

Vaush is, contrary to your impression, staying true to the Orthodoxy, which was always meant to handle this failure mode.


A somewhat related comment from January, on the matter of /r/antiwork:

This seems to me to basically be an admission that the antiwork ideology is a failure. They have to rely axiomatically on some future conception of technology where humans don't have to do anything because AI or machines can already do all the dirty work for us and we can just spend our time on art and philosophy and literature and whatever we please. This system just does not work in the idealised sense (advertised in these communities) in a non-technologically advanced form. It's all especially ironic because the technology that supposedly rescues their ideology was the product of the industrial revolution and capitalism.

That's not only not a failure but the truest part of Marxism, which is of course not just a descriptive but primarily a prescriptive theory, conceived of to build a society that can survive alongside a superhumanly productive economy (i.e. so productive that human labor cannot pay for itself). Moreover, Orthodox Marxists were always acutely aware that necessary advances will be forged by the engine of capitalism.

I'd go so far as to say that they rely on a near-inevitability, on a truism like «humans die» or «you get more of what you incentivize», whereas their opponents rely on blatantly dissimilar cases like industrialization, wishful thinking, discredited tabula rasa assumptions and inapplicable arguments like comparative advantage (that does not account for countless things, like the common resource market and human inefficiency at utilizing resources). It is increasingly clear that market forces in technology make the labor market largely, if not wholly, obsolete; that not only can we make robots generally intelligent, cheap and nimble enough to automate most/all jobs currently manned by humans below the ~95th percentile by IQ (and not in the business of selling their human body specifically), but that humans are not anywhere near flexible enough to learn qualitatively different tricks.

And that there won't be a compensating explosion in conveniently simple bullshit openings like «robo nanny consultant» or «pattern connoisseur» or whatever either, because there's no need for so many midwitted PMC parasites in a world of endlessly scalable knowledge.

It is the inevitable consequence of capitalism that humans increasingly need not apply (and that supply can easily outstrip demand limited by purchasing power of humans who need not apply). Antiwork is just a rejection of Landian/NRx accelerationism which resolves this conundrum with a simple, parsimonious and historically proven «let them freeloaders starve then», which, of course, is the unspoken instinct of every good Protestant, and especially a high-IQ one that works in STEM or finance and does not expect to starve anytime soon.

Leftism is not transhumanism

Certainly they're not identical, no. But, this book was published pretty recently:

In Fully Automated Luxury Communism, Aaron Bastani conjures a vision of extraordinary hope, showing how we move to energy abundance, feed a world of 9 billion, overcome work, transcend the limits of biology, and establish meaningful freedom for everyone. Rather than a final destination, such a society merely heralds the real beginning of history.

Sounds like transhumanism to me. Marx speaks positively of the outcomes of increased automation in The Fragment on Machines from the Grundrisse, saying that it will lead to

the general reduction of the necessary labour of society to a minimum, which corresponds to the artistic, scientific etc. development of the individuals in the time set free, and with the means created, for all of them.

Kolakowski's Main Currents of Marxism offers an interesting perspective on this, tracing the intellectual heritage of Marxism from ancient esoteric traditions that taught of the inherent divinity and perfectibility of mankind and the necessity for man to aspire to godhood, down through Hegel's Phenomenology of Spirit, and ultimately to Marx himself and his faith in humanity's power to radically reshape the "natural" order. I don't know how you can deny that a belief in progress and a belief in the capacity of man's reason to reshape the world and overcome social problems are central to leftism, and I don't know how you can deny the affinity between those same principles and transhumanism.

As I know you are already aware, there are people who are opposed to the whole idea of humans transcending their biological limits - forget whether it's possible, they don't even view it as desirable! To such people, the difference between orthodox Marxism and your preferred brand of transhumanism looks like merely an internal squabble over implementation details, and perhaps also over the scope of the project.

what problem do you think «socially necessary labour time» is supposed to solve in the Marxist framework?

It's a sufficiently generic concept that anything I said about Marx's motivations for developing the concept would just be speculation, absent a more explicit source that discusses the matter.

what is alienation that Vaush has mentioned?

Alienation for Marx is a result of capitalist social relations, not automation qua automation.

Moreover, Orthodox Marxists were always acutely aware that necessary advances will be forged by the engine of capitalism.

Yes, absolutely. Capitalism was always understood to be a necessary stage of development, and that it would furnish the tools of its own destruction, at which point those tools would be appropriated for allegedly more pro-social ends.

Every capitalist in the world would be more than happy to embrace socialism in a post-scarcity world. There's no practical difference between prince and pauper in a world like that. I'm aggressively right-wing and if we were actually in Paradise I would not care.

Ascending past all restraint and limitation isn't left or right-wing. Whether you imagine it as an angelic idle life in Heaven, or uploading yourself to the Great Machine, or being cared for by robots in your eternal nursing home, everyone yearns to be free of the human condition. It's one of the few dreams I'd call universal.

It's one of the few dreams I'd call universal.

Well...

Certainly I acknowledge that the vast majority of people, of any political persuasion, if asked if they would like to live in Paradise (whatever we ultimately mean by that term), would answer "yes". The main historical debate has been over whether such a condition was possible, and that debate has been quite vociferous. The most forceful exposition of the view that mankind is inherently flawed and incapable of transcending his limitations is of course found in Christianity. Christians too dream of utopia, but of course since we know that the Kingdom of God is fiction, the Christian position is tantamount to the claim that utopia is impossible and not worth striving for in actuality.

Even still, it cannot be called a universal dream. Orwell's Can Socialists Be Happy? provides some hints in this direction:

A book like Brave New World is an expression of the actual fear that modern man feels of the rationalised hedonistic society which it is within his power to create. A Catholic writer said recently that Utopias are now technically feasible and that in consequence how to avoid Utopia had become a serious problem. We cannot write this off as merely a silly remark. For one of the sources of the Fascist movement is the desire to avoid a too-rational and too-comfortable world.

I don't think Nietzsche would have wanted to live in Paradise either. Although, in his typical style, he approaches the issue only obliquely; it's more of an ethos that has to be absorbed from reading his entire corpus, rather than an issue he tackles directly in any one place.

Christians too dream of utopia, but of course since we know that the Kingdom of God is fiction, the Christian position is tantamount to the claim that utopia is impossible and not worth striving for in actuality.

This technically violates the rule against consensus-building - "we know" is too strong. More subtly, I know "Christians" (in the sense that they identify with Christianity while doubting the metaphysics of it) who see the Kingdom of God as unattainable but worth striving for as an ideal, so you need to be careful about making assertions regarding what "we" know, as well as what the "Christian position" is.

I thought the AI art world had already had a few culture war skirmishes when the art AI, like almost every AI before it, started outputting badthink? I remember this specifically being pulled, and I've been hearing for the last week or so that a few of these AI artists have been programmed to stealth-add "black" or "female" into prompts where the race and sex aren't specifically noted otherwise because the results were insufficiently diversified.

a few of these AI artists have been programmed to stealth-add "black" or "female" into prompts where the race and sex aren't specifically noted otherwise because the results were insufficiently diversified.

See: https://labs.openai.com/s/4jmy13AM7qO6cy58aACiytnL as a test demonstrating this. (Edit: from https://old.reddit.com/r/dalle2/comments/w1mflp/gender_bias_gone/igmevpj/ ) More discussion, at least, here: https://old.reddit.com/r/dalle2/comments/w0r284/anyone_noticed_a_significant_algorithm_change_in/
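To make the mechanism concrete: the sign-text trick in that first link (asking for an image of someone holding a sign, and getting the word "black" or "female" printed on the sign) suggests the terms are being appended to the prompt string itself before it reaches the model. Here's a minimal, purely hypothetical sketch of what such a rewriter could look like - every keyword list and function name below is invented for illustration, and none of it reflects OpenAI's actual code:

```python
import random

# Hypothetical illustration only - not OpenAI's implementation.
# The keyword lists and injection strategy are invented for the example.
PERSON_WORDS = {"person", "doctor", "nurse", "ceo", "worker", "teacher"}
DEMOGRAPHIC_WORDS = {"man", "woman", "male", "female",
                     "black", "white", "asian", "hispanic"}
INJECTED_TERMS = ["black", "female"]

def rewrite_prompt(prompt: str) -> str:
    """If the prompt mentions a person but specifies no race or sex,
    silently append a randomly chosen demographic term."""
    words = {w.strip(".,!?").lower() for w in prompt.split()}
    mentions_person = bool(words & PERSON_WORDS)
    already_specified = bool(words & DEMOGRAPHIC_WORDS)
    if mentions_person and not already_specified:
        return f"{prompt}, {random.choice(INJECTED_TERMS)}"
    return prompt

print(rewrite_prompt("a CEO holding a sign that says"))
# e.g. "a CEO holding a sign that says, female" - which would explain
# why the generated signs sometimes spell out the injected word.
```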

The main concern here is that we're headed for a future where all media and all human interaction is generated by AI simulations, which would be a hellish dystopia. We don't want things to just feel good - we want to know that there's another conscious entity on the other end of the line.

This is already indistinguishable from the current reality landscape. Cloistered writers' rooms from an alien culture, overseen by DEI consultants, are generating countless hours of "art" that is as sterile and incomprehensible to me as if it had been written by an AI that had no concept of the human condition. Dialog makes no sense, characterization is utterly broken, plots leap nonsensically from forced set piece to forced set piece.

If anyone is afraid of AI taking their job in media or the "arts", good riddance to them.

Cloistered writers' rooms from an alien culture, overseen by DEI consultants, are generating countless hours of "art" that is as sterile and incomprehensible to me as if it had been written by an AI that had no concept of the human condition.

This is basically how I feel. If your only conception of the UK was from our recent TV output, you'd assume we were about 30% black and 40% of us were in mixed race relationships. Both of those numbers are an order of magnitude too high, but nobody's ever going to fight for the proper representation of white people. There's no political interest.

If AI boots whoever makes these decisions off the map, or if AI can allow me to tailor a piece of media to my wishes by re-rendering the entire thing with the deepfaked casting I choose, so much the better.

We've already seen mostly passable race swaps with AI in stills. It wouldn't be too much of a leap to see a custom-tailored AI that can replace a character in every frame of a video in a few hours.

Yeah. I was a bit more ambitious in hoping it would allow for entirely custom casting. Make your own ideal version of the film.

I've been going over Chesterton and Lewis lately and I can think of something from both of them that seems relevant to this matter:

Chesterton, Heretics Ch. 17 ("On the Wit of Whistler"):

He was not a great personality, because he thought so much about himself. And the case is stronger even than that. He was sometimes not even a great artist, because he thought so much about art. Any man with a vital knowledge of the human psychology ought to have the most profound suspicion of anybody who claims to be an artist, and talks a great deal about art. Art is a right and human thing, like walking or saying one's prayers; but the moment it begins to be talked about very solemnly, a man may be fairly certain that the thing has come into a congestion and a kind of difficulty.

The artistic temperament is a disease that afflicts amateurs. It is a disease which arises from men not having sufficient power of expression to utter and get rid of the element of art in their being. It is healthful to every sane man to utter the art within him; it is essential to every sane man to get rid of the art within him at all costs. Artists of a large and wholesome vitality get rid of their art easily, as they breathe easily, or perspire easily. But in artists of less force, the thing becomes a pressure, and produces a definite pain, which is called the artistic temperament. Thus, very great artists are able to be ordinary men—men like Shakespeare or Browning. There are many real tragedies of the artistic temperament, tragedies of vanity or violence or fear. But the great tragedy of the artistic temperament is that it cannot produce any art.

Lewis, The Great Divorce (here we have a conversation between a heavenly Spirit and a Ghost coming from hell, both of whom were artists in life):

"How soon do you think I could begin painting?" it asked.

The Spirit broke into laughter. "Don't you see you'll never paint at all if that's what you're thinking about?" he said.

"What do you mean?" asked the Ghost.

"Why, if you are interested in the country only for the sake of painting it, you'll never learn to see the country."

"But that's just how a real artist is interested in the country."

"No. You're forgetting," said the Spirit. "That was not how you began. Light itself was your first love: you loved paint only as a means of telling about light."

"Oh, that's ages ago," said the Ghost. "One grows out of that. Of course, you haven't seen my later works. One becomes more and more interested in paint for its own sake."

"One does, indeed. I also have had to recover from that. It was all a snare. Ink and catgut and paint were necessary down there, but they are also dangerous stimulants. Every poet and musician and artist, but for Grace, is drawn away from love of the thing he tells, to love of the telling till, down in Deep Hell, they cannot be interested in God at all but only in what they say about Him. For it doesn't stop at being interested in paint, you know. They sink lower-become interested in their own personalities and then in nothing but their own reputations."

"I don't think I'm much troubled in that way," said the Ghost stiffly.

"That's excellent," said the Spirit. "Not many of us had quite got over it when we first arrived. But if there is any of that inflammation left it will be cured when you come to the fountain."

"What fountain's that?"

"It is up there in the mountains," said the Spirit. "Very cold and clear, between two green hills. A little like Lethe. When you have drunk of it you forget forever all proprietorship in your own works. You enjoy them just as if they were someone else's: without pride and without modesty."

Now. Personally, I have already felt the sting of feeding one of my own drawings - one that I had thought was one of my best - into Stable Diffusion's "img2img" and, via the magic incantation "trending on ArtStation," seeing the results come out in some ways better than what I had put in. Not in every way, not yet, and of course there are mangled faces and hands and all sorts of details where, if you asked yourself "so what is that, exactly," you'd find yourself disturbingly unable to answer, but - it could still do much better rendering and textures than I had.
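(For anyone who wants to try the same sort of experiment, here is roughly what the img2img step looks like using the open-source diffusers library - a minimal sketch rather than my exact setup; the checkpoint, file names and prompt below are placeholders, and depending on your diffusers version the image argument may still be called init_image:)

```python
# Minimal img2img sketch with Hugging Face diffusers - placeholder paths
# and prompt; assumes the Stable Diffusion v1.5 checkpoint and a CUDA GPU.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

init_image = Image.open("my_drawing.png").convert("RGB").resize((512, 512))

result = pipe(
    prompt="your subject here, highly detailed, trending on artstation",
    image=init_image,      # may be init_image= in older diffusers versions
    strength=0.6,          # how far the model may depart from the original
    guidance_scale=7.5,    # how strongly it follows the prompt
).images[0]

result.save("my_drawing_sd.png")
```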

I said that stung, and it did, but how did I deal with it? I had to remind myself why I made art in the first place. Did I do so to make money? Now, if I had, I would have had a material complaint - but ha ha, no, I was never remotely good enough for that to be on the table. The threshold of commercial viability was always out of reach for me - and for most everybody - anyway.

But more dangerously, was I making art to say something about myself? To give myself a self-image, to bind my self-worth to being able to do something others couldn't? To make myself a special person? Well, if so, I would have been doing a pretty poor job of it anyhow, but as both those authors say, that's a corrupting impulse in a person anyway.

No, the good reason I have for creating is because I have something to say. Because there's some idea or image in my head that I need to get out of it, at the very least because I don't want to forget it, as I would if I left it at the mercies of my own squishy memory. As such, I expect that even if I did drink from that Lethe-like fountain, I'd still have something to appreciate in my own works, because they're about things that I've been interested in anyway.

Well, perhaps that fountain is right before us all now. Perhaps pride in proprietorship is something that's about to be technologically taken from us. But perhaps this isn't such a bad thing; perhaps, while I lose the ability to pride myself on being a More Creative Person than others, I gain the ability to actually get ideas out of my head that I never would have managed before. See all the beauty and wonder that I've hitherto seen only "through a glass, darkly" in much greater detail. So have I lost or have I gained?

(Of course, this doesn't resolve the question of losing one's livelihood, so I note that this analysis is sharply limited!)

No, the good reason I have for creating is because I have something to say.

You can’t see the problem with AI art if you just focus on you, yourself, and your personal capacities and motivations for artistic production.

The problem lies in how AI art alters the nature of art and how we relate to it, at a societal level.

Vaush gestured towards this by attempting to locate the problem in communication - highlighting the relationships between people rather than focusing on individual people in isolation.

I hope to have more to say on these points in a future post.

So there's another good reason to focus on art as expression - you won't have a problem with AI art.

Jokes aside, I think you have been duped, or are duping. The critics of AI art are not concerned about society; they are concerned about themselves. We have had this argument before, and it invariably comes down to insecurity about the future. It is no different here. And we know it is no different because, depending on how you look at it, Vaush and other AI critics have had at least 5 years - if not a hundred - in which they knew this was coming and did nothing about it. It wasn't a problem until it threatened their livelihoods, or the livelihood of someone they love, because it is only a problem insofar as it threatens their livelihoods.

5 years ago I thought that everyone concerned about AI was crazy. I just didn't think the technology was there. I imagine others felt the same.

DALL-E 2 is the first thing that made me pay attention and acknowledge that there really was something there. Maybe machine language translation should have done that sooner, but it didn't, for whatever reason. DALL-E 2 was the first time where I was truly blown away by a new technology in at least the last 15 years.

Hm. Well. If I tried to speak outside my own experience, I know I'd get slapped for that, too.

We've already had technology interrupt "the nature of art." Is photography art? Just point-and-click, after all; there's no talent or Soul necessary for that, its critics would say. But photographers would strenuously disagree, and perhaps it's been long enough and they've built up enough of a power-base of their own that they're taken seriously.

It won't neatly map to a left/right divide, not least because there's no single such divide. So while I can empathize with feelings of "hey, you are a leftist/rightist, this isn't what you should think about this issue!", ultimately this is not very interesting, other than showing that a binary categorization is insufficient.

The split here is between pro-tech optimists - believers in quantification, in the idea that society's problems are mainly technical, and so on - and people who miss the "soul" of things.

Some leftist utopias are fully automated, large-scale, standardized production; others are about local communities in opposition to capitalist exploitation (of the environment and of communities). The left is supposed to like disruption and new ways of solving things, except when they come from capitalist exploitation. There's also a distinction between the classical left and woke capital, which is nowadays often confused with the left.

Some of the right is pro-business, pro-capitalist and pro large-scale production, but other parts are more religious and miss the soul of things - the traditions, the fruit of skilled, dignified, hard human labour - and prefer local things to multinational business output, out of patriotism and nationalism.

There are some otherwise unnatural pseudo-alliances around woke topics that may connect trads with libertarian transhumanists, but things like the AI issue may be a point of collision.

My own attitude is similar to my attitude towards, eg, furniture. Sure, a skilled carpenter can make a fabulous bed frame with soul and all, and it's beautiful hard work that puts bread on the table from the sweat of the brow, etc. But it's expensive, and so IKEA has its place too.

Most pictures, illustrations, clip art, stock images and filler crap don't need novel artistic expression. It's like lamenting the emergence of word processors and how they displace typographers and editors - fine artists in their own right - now that people can typeset their own documents. And I'm sure people said as much back in the 80s. It's the same thing, but for drawing.

It won't neatly map to a left/right divide

Pandemics and vaccines weren't supposed to be a left/right issue either, but we saw how that turned out.

For sure, and that's a good example. On the one hand, Trust The Science; on the other hand, science is a fake-objective, old/dead white cishet male-biased, colonial-legacy, Western Eurocentric project that needs to be dismantled in favor of other ways of knowing, like indigenous lived experience.

Similarly with big tech / big corporations: they are bad because of white libertarian tech bros, but also good in the sense that, eg, women should see it as their life goal to build a career in them.

It all depends on who feels like they are inside and who feels outside. Academic leftists will defend the status/prestige of academic knowledge production if it's controlled by them. Similarly, if big tech supports ideologically/politically motivated "fact checking", then big tech is good.

It's often not about aesthetics or principles - whether rational quantification, cold calculation and large-scale factories are good, or whether small-scale, holistic, emotionally nice, human-level handmade stuff is good - but about who feels in control, in terms of identity-politics groups.

There are plenty of conservatives and far-right people who don't want to "play God" with genetic engineering, AI, etc.

This really is an issue where you at least need the Red/Blue/Gray three-way distinction.