This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Notes: This was a response to @cjet79, but I decided I would make it a top comment, because it's my second favorite subject after sci-fi bullshit: literary snobbery with a side of AI.
First, I like AI. I mean, I like it as a tool. (And yes, I know that "AI" is still a misnomer; I understand that LLMs are just token predictors, and I think people who believe that any neural net is close to actually "thinking" or becoming self-aware - or who ask "really, what are we but pattern-matching echolaliac organisms?" - are drinking Kool-Aid.) I've used ChatGPT to build applications. I don't do "vibe coding," but I have found it increases my productivity, because with the right prompts it helps me use new applications and libraries faster than I could by going through tutorials and manuals. It cannot build a fully functional application (beyond the simplest) by itself, though. It often goes back and forth, recommending obsolete or unavailable libraries, or suggesting moving a line to the wrong place and then recommending I move it back in the next iteration. It's smart and often makes very good recommendations for improving and optimizing code, and it spots subtle bugs and typos very easily. It's also stupid and often makes terrible recommendations that will break your application.
On the hobby side, I've been making AI art, running Stable Diffusion locally on my PC and even training some LoRAs. The vast majority of AI art is, as they say, "slop" and very recognizable as AI, but that's mostly because the vast majority of AI art is "Type a few sentences into text box, copy picture that results." "A cat making a face." "A cute catgirl with an assault rifle giving a come-hither look to her incel AGP fetishist fans." You will get a picture that meets your requirements, but it will be very obviously plastic AI digital slop, like a Walmart t-shirt or a Subway sandwich. If you take the time to learn about inpainting and outpainting and ControlNet and upscaling and advanced prompt engineering and model selection and what all the parameters actually tweak, you'll get good pictures - pictures good enough to win Scott's various AI challenges.
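For anyone curious what that local workflow looks like in practice, here is a rough sketch using Hugging Face's `diffusers` library. The model ID, LoRA path, prompt, and parameter values are all illustrative assumptions, not a recipe from the post; actually calling `generate` requires the weights and a GPU.

```python
# Rough sketch of a local Stable Diffusion txt2img run with an optional LoRA.
# Model ID, LoRA path, and prompt below are illustrative assumptions.

def build_generation_kwargs(prompt: str, steps: int = 30, cfg: float = 7.0) -> dict:
    # The knobs the post alludes to: step count, classifier-free guidance
    # scale, and a negative prompt to suppress common failure modes.
    return {
        "prompt": prompt,
        "negative_prompt": "blurry, lowres, extra fingers",
        "num_inference_steps": steps,
        "guidance_scale": cfg,
    }

def generate(prompt: str) -> None:
    # Heavy imports kept local so the helper above stays importable anywhere.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    pipe.load_lora_weights("./my_style_lora")  # hypothetical trained LoRA
    image = pipe(**build_generation_kwargs(prompt)).images[0]
    image.save("out.png")

# generate("a cat making a face")  # uncomment to run (needs GPU + weights)
```

Inpainting, ControlNet, and upscaling each have their own pipelines in the same library; this only covers the base txt2img step.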
Are they good enough for an AI to become a renowned professional artist with a unique and recognizable style? Not yet. But artists are rage-coping hard when they insist AI isn't good enough to replace the vast majority of commercial artists who just need to draw hamburgers or cars or Corporate Memphis HR posters, or commissioned MCU fanservice. The sticking point now is no longer extra fingers or shadows going in the wrong direction (though most AIs will still make little mistakes that are tells for the observant - but these can be easily repaired!), but the fact that it's still painful to go back and forth to get exactly the pose, position, expression, color shade, background, accessories, or species of flower that you want. With real artists you can talk to the artist, and the artist can do rough sketches and ask clarifying questions. With AIs, you generate 100 images, let the GPU go brrrrr, and maybe you get one or two that are kinda close and still need extensive inpainting and photoshopping. Conversely, though, I have commissioned some artists in the past, and while I was generally satisfied with the results, even a human will never be able to really reproduce the picture that's in your head. Enough time with Stable Diffusion and some Photoshop ability will often come closer to the mark. AI art is getting better all the time, but IMO it is not close to replacing truly talented high-end artists, just as AI is not close to replacing actual rock star programmers and innovators.
It is close to replacing the print shops, the commercial graphic arts majors, the SEO optimizers, and the storefront webapp builders, though.
So, can it write?
Yes and no. I've tried out some of the NovelAI apps and gazed upon the sorry state of Kindle Unlimited, already flooded with thousands of subpar self-published romantasy-written-while-fingering-herself slop and power-fantasy-written-while-jerking-himself slop, and now that has been multiplied seven and sevenfold by AIs churning out the results of all those Udemy and YouTube courses promising you can now make a living on Amazon without actually writing anything. Throw a million books out there with pretty covers and even if you make pennies per title, it adds up. AI has been devastating the short story market for a while now.
If we get to the point where AI can generate good stories, then... I guess I'd be happy to read AI-generated stories? I think we are a long, long way from there, though. And I have experimented. LLMs can generate coherent stories at this point. They have a plot, and some degree of consistency, and I suppose they have all the traditional elements of a story. I am not sure if they are up to generating an entire novel with one prompt yet - I haven't tried, but I know there are tools to let you coach it along to get a whole novel out of it.
But everything I have seen so far is crap. In fairness, most of what's on RoyalRoad (and Wattpad and AO3 and Scribd and all the other open platforms) is crap, but you can still tell what's human-written crap and what's AI slop.
I may be in the minority here; it often seems readers just don't care much anymore and want to consoom entertainment units. But waving my snooty literary tastes here, I sometimes despair at the writing some people think is good just because it tickles their ~~fetishes~~ sweet spots. Some genres (progression fantasies, litrpg, a lot of romance) are written so, so badly that if they aren't AI generated, they may as well be. An AI has no prose style except very poor mimicry of other styles; it has no ability to truly craft words and turn a phrase in a way that makes you say "Ah, yes, that is totally how that author writes." It has no way to embed themes and metaphors that echo throughout a book; it has no thematic consistency (often not even tonal consistency). Character arcs, such as they exist, are flat and linear; LLMs cannot grasp "character development" or complexity or nuance in any real way. If you want a book that's mental bubblegum - a linear power fantasy about a guy getting ever more powerful and punching bigger villains in the face, or a hot chick being fought over by two smoking alphas, with nothing more to it and not even any clever writing to sweeten the experience, just "thing happens and then thing happens and then thing happens" and maybe some purple descriptive modifiers mimicking a high school creative writing exercise - I suppose AIs can do that now. But nothing that even approaches the most derivative pastiches of true classic novels.
And that's just to produce one book. How about a series, a multibook arc preserving plot threads and MacGuffins and character development from one book to the next? An AI cannot do that, and I doubt their ability to do that any time soon.
If you're not really a reader and consuming stories is like popping open a beer and you don't care how it tastes as long as it gives you a buzz, maybe AIs will fill that entertainment need. I sometimes put AI-generated soundtracks on as background music, and while the first few minutes can be okay, after a while it sounds very samey and droney and repetitive, even to my extremely unsophisticated ear (and my musical tastes are, in contrast to my literary tastes, utterly banal and horrible).
I don't doubt AI will continue to improve and eventually we'll have the first award-winning novel completely written by AI that even experts agree is actually... kinda good. But I am skeptical. I think it will take a while. I think even when we get to that point it will be a very particular kind of novel that uses some tricks (like being a surrealist or post-modern experimental novel or something else that avoids the usual conventions of narrative structure and story development).
I think it will be a long, long time before we have an AI Stephen King or Kazuo Ishiguro or Margaret Atwood. But I think we will have AI "authors" doing a "good-enough" job for the proles. Whether the slow-motion death of traditional publishing is a good thing or not I guess depends on how much you hate traditional publishing. I think gatekeeping is good, and that's what traditional publishing does. Publishers put out a lot of books I am not interested in and even think are very bad, but I can at least tell from the cover, the blurbs, and the author if it's likely to meet my minimal standards of readability. It's not like sifting through sewage for something sparkly. More like picking a few good apples out of a bin of mostly rotten ones.
I celebrate the flourishing of platforms that let anyone put their work out there, and a handful of indie authors are killing it on Amazon. But increasingly the pattern is no different from trad publishing: a few big winners, while most earn below minimum wage for their efforts. And now the many thousands who basically earn beer money, if that, are competing with LLMs that can scratch the same itch they do.
This feels like MOSTLY a solved problem with ChatGPT's o3 image generation capability.
You can feed it a few reference images for what you're trying to get, including poses and background, and with a sufficiently precise prompt you WILL get something very, VERY accurate to your intentions. It does NOT do a great job of making precise adjustments from there, and currently it doesn't do inpainting, but taking the image it produced and running it through Stable Diffusion, or just manual Photoshop, gets you to the finish line.
One thing it is actually very good at: feed it an image of a tattoo you're thinking of getting, plus an image of your bare skin in the area you want that tattoo, and it can produce an image showing you what that tattoo would look like. And THEN you can pay a human artist to hopefully execute on that vision well.
I have had annoying problems where it remembers something you asked for earlier and keeps including that in the image even after you tell it to move on or forget, but that's fixed by starting a new window with the most recent output.
I don't see how a human artist can outcompete this on cost or time. I CAN see how you might still pay a human to actually do the work of interacting with the AI and modifying outputs to get close to a particular vision.
Similarly, SONG PRODUCTION is now just about indistinguishable from fully human work. To me, a decently done full AI song will have almost zero tells unless the creator set out to make it obvious.
Betting against the AI capabilities approaching peak human is probably a losing proposition unless we ARE very, very close to the plateau of what can be achieved with the current paradigm.
AI is now better than the best chess players, and better than the best Go players, and while novel-writing is a different combination of skills and intellect than either of those, the AIs have already learned to write coherently, so I expect tacking on the additional capabilities will scale the machine into Stephen King territory pretty quickly.
My understanding is that o3 image gen is identical to the regular ChatGPT image gen (famed for the Ghiblification wave). Both call out to gpt-image-1, which, to be fair, is much better than DALL·E and Stable Diffusion and the like at following prompts.
https://platform.openai.com/docs/models/gpt-image-1
https://platform.openai.com/docs/models/o3
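For reference, gpt-image-1 is also exposed directly through OpenAI's Images API, not just the ChatGPT UI. A minimal sketch, assuming the official `openai` Python package and an API key in the environment; the prompt, size, and output filename are illustrative:

```python
# Minimal sketch of calling gpt-image-1 via OpenAI's Images API.
# Prompt, size, and filename are illustrative assumptions.
import base64

def build_request(prompt: str, size: str = "1024x1024") -> dict:
    # Parameters for images.generate; gpt-image-1 returns base64-encoded data.
    return {"model": "gpt-image-1", "prompt": prompt, "size": size, "n": 1}

def generate_image(prompt: str, out_path: str = "out.png") -> None:
    # Requires the `openai` package and OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()
    result = client.images.generate(**build_request(prompt))
    image_bytes = base64.b64decode(result.data[0].b64_json)
    with open(out_path, "wb") as f:
        f.write(image_bytes)

# generate_image("a cat making a face")  # uncomment to run (needs API key)
```

Reference images for the workflow described above go through the separate edits endpoint, which accepts input images alongside the prompt.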
This is actually a very defensible position.
It was a very defensible position every time, then some new advance blew past it.
I'll listen to the defense, but I'm not placing my bets on that side.