It does claim to be that which it is displacing, though. There generally isn't open acknowledgement that AI goods are made by AI, and many sellers actively claim otherwise.
In either case, I'd say there's currently an implicit assumption by many buyers that, when they're purchasing a book, say, they're buying something that an intelligent mind constructed using skill and artifice (with plot twists, character arcs, and so forth), and not something that reads beautifully on the first page but never builds up to anything or has anything to say. AI's utility in this regard is its ability both to impersonate more meaningfully crafted human products and to exploit the sort of assumptions that book customers have built up through former habits.
The result will be the death of those 'former habits': insofar as book customers inadvertently purchase AI products, they stop getting the pleasant experiences their purchasing habits used to deliver, and so the market will shrink and utility will be destroyed. If AI products were merely inferior, such customers could simply ignore and filter them. It is their ability to mimic that makes them destructive: they can inhabit certain aspects of the outer forms without providing the same deeper experiences.
Maybe this is less true for AI drawings, which are more of a what-you-see-is-what-you-get affair, in which case, no, it wouldn't count as a counterfeit to my mind. Unless the buyer was hoping for some sort of engagement with an actual human that they weren't actually getting, or thought they were buying a more complex work that could be studied intensively to extract meanings that weren't immediately obvious, only to eventually realize it's an AI gestalt of several other works that mimics their qualities only superficially, say.
Which seems like a probable enough outcome.
AI is killing off the lower-tier skill tree of this band of creators, and I'm not sure it's a bad thing.
It's going to kill off the livelihoods and acclaim for any artist who operates on any site susceptible to AI spam (which includes mainstream ones like Amazon), as the spam will make it impossible for new artists to attract notice and will even make it difficult for established artists to attract new fans. The AI spam doesn't work by creating equal or superior products, it works by simply existing in vast quantities and being hard to distinguish at a casual glance from legitimate products. It's a form of counterfeit goods when used this way.
Furthermore, it kills off the sort of broader cultural enthusiasm for art that accounts for much of our society's interest in it. Fed on the inferior, simplistic output of AI content mills, you might be left with satisfied degenerates who don't care about complexity or meaning and are wholly content with endless repetitive images of their favorite anime characters or whatever, but you won't have the kind of cultural underpinnings that sustain fandoms, forum discussions of media, critical appreciation, or anything else that makes art socially engaging.
This will in turn kill off the production of any sort of non-hyper-commodified art. Who wants to put effort into things if no one will A) notice or buy them, or B) even possess the cultural capacity to care?
Not being an art expert, I can’t judge those images too deeply. One thing that stands out to me, though, is how compositionally simple those examples are. They all seem to consist of one character in the foreground and some kind of dramatic stylistic background. My own experience with AI image generation is that it’s very difficult to get the prompt engine to orchestrate more than one or two characters, so this sort of simple approach is probably the best that current AI is capable of. To me, it doesn’t seem like a rich tool for self-expression.
I’m not a ‘math wizard’, but something about this seems off. Shakespeare didn’t write one hundred plays and then choose the best few dozen to publish. He developed his playwriting ability until he was at a skill level to consistently put out good content. If AI lacks the underlying skills to create good works, then should we expect even one in a trillion of its works to be as good as Macbeth, or should we regard the creation of such a thing as physically impossible unless underlying conditions are met? It seems like it’s less a matter of statistical probability than physical possibility.
Not true. Human works that find great success usually do so based on their merits as artistic products. AI works that find success usually do so as flukes. Put out millions of AI-created light novels and occasionally one of them will slip through some quality filtering service. Their success is predicated on the inability of these services to filter quality 100%, and they enjoy an advantage over the shittiest of human works in this regard based only on the scale of their output.
AI artistic successes are indicative of survivorship bias. The way their creators operate is by spamming vast amounts of works and seeing what sticks. Through quirks of fate, a few of them end up successful. This business model is probably short-lived, though, as the very spam it relies on degrades the platforms necessary for their proliferation, so user interest will eventually decline. Already we’re seeing sites like DeviantArt and Literotica killed off by AI spam. AI will kill off markets rather than improve them.
Not if the process itself is beyond the AI to recreate.
For instance, say that a great movie like A Clockwork Orange was made in part through the theoretical understanding of their craft that the main actors had developed over their lifetimes, which fed into their decisions about how to act and portray their characters.
Coming up with a similar quality of acting might be impossible through mere observation and mimicry of what works and what doesn't. The AI has an intuition for what sorts of things generally go together, but it doesn't use, among other things, underlying theoretical know-how to construct its outputs.
My current assessment is that there's a low ceiling on how far AI 'thinking' can take the quality of its output, particularly regarding the complexity of what it's attempting to do. Projects that require a layered approach of various theories and techniques seem fundamentally beyond AI. The more systems that need to come together to create a work, the exponentially more difficult it becomes for a pattern-finder to match its quality. The pattern-finder needs to become capable of wielding tools, systems, and theories in its thinking in order to up its game past a certain point.
I've heard people say before, in the context of AI art, that humans are essentially just 'pattern finders' too, and so are creatively indistinguishable from AI. But I think this is wrong: it ignores the external tools humans use to structure and create their work, such as theories and techniques, which cumulatively take the load off of having to conceive everything in a fit of genius. I think this is the primary reason AI, despite its 'brilliance' as a search engine or generalist explainer, is so lacking in certain other regards: its 'cognition' relies entirely on what, in a human, would be more like a single sub-process.