This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Notes - This was a response to @cjet79:
But I decided I would make it a top comment, because it's my second favorite subject after sci-fi bullshit: literary snobbery with a side of AI.
First, I like AI. I mean, I like it as a tool. (And yes, I know that "AI" is still a misnomer. I understand that LLMs are just token predictors, and I think people who believe that any neural net is close to actually "thinking" or becoming self-aware, or who ask "really, what are we but pattern-matching echolalic organisms?", are drinking Kool-Aid.) I've used ChatGPT to build applications. I don't do "vibe coding," but I have found it increases my productivity: with the right prompts it helps me pick up new applications and libraries faster than I could by going through tutorials and manuals. It cannot build a fully functional application (beyond the simplest) by itself, though. It often goes back and forth, recommending obsolete or unavailable libraries, or suggesting I move a line to the wrong place and then recommending I move it back in the next iteration. It's smart, and often makes very good recommendations for improving and optimizing code, and it spots subtle bugs and typos very easily. It's also stupid, and often makes terrible recommendations that will break your application.
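The "token predictor" framing above can be made concrete with a toy sketch: a bigram model that, like an LLM at a vastly smaller scale, only ever emits the statistically most likely next token given what came before. Everything here (the corpus, the function names) is invented for illustration; real LLMs condition on long contexts with neural nets, not on a single previous word.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each token, which tokens most often follow it."""
    following = defaultdict(Counter)
    tokens = corpus.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        following[prev][nxt] += 1
    return following

def generate(following, start, length):
    """Greedily emit the most likely next token, one step at a time."""
    out = [start]
    for _ in range(length):
        options = following.get(out[-1])
        if not options:
            break  # dead end: this token was never followed by anything
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

model = train_bigrams("the cat sat on the mat and the cat slept")
print(generate(model, "the", 3))  # prints "the cat sat on"
```

The point of the toy: there is no "idea" anywhere in it, just counts and a greedy argmax, yet the output is locally fluent. Scale that intuition up and you get text that reads coherently without any claim about the model "thinking."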
On the hobby side, I've been making AI art, running Stable Diffusion on my PC and even training some LoRAs. The vast majority of AI art is, as they say, "slop" and very recognizable as AI, but that's mostly because the vast majority of AI art is "type a few sentences into a text box, copy the picture that results." "A cat making a face." "A cute catgirl with an assault rifle giving a come-hither look to her incel AGP fetishist fans." You will get a picture that meets your requirements, but it will be very obviously plastic AI digital slop, like a Walmart t-shirt or a Subway sandwich. If you take the time to learn about inpainting and outpainting and ControlNet and upscaling and advanced prompt engineering and model selection and what all the parameters actually tweak, you'll get good pictures, pictures good enough to win Scott's various AI challenges.
Are they good enough for an AI to become a renowned professional artist with a unique and recognizable style? Not yet. But artists are rage-coping hard, insisting AI isn't good enough to replace the vast majority of commercial artists who just need to draw hamburgers or cars or Corporate Memphis HR posters, or commissioned MCU fanservice. The sticking point now is no longer extra fingers or shadows going in the wrong direction (most AIs will still make little mistakes that are tells for the observant, but these can be easily repaired). It's just that it's still painful to go back and forth to get exactly the pose, position, expression, color shade, background, accessories, species of flower, that you want. With real artists you can talk to the artist, and the artist can do rough sketches and ask clarifying questions. With AIs, you generate 100 images, let the GPU go brrrrr, and maybe you get one or two that are kinda close and still need extensive inpainting and photoshopping. Conversely, though, I have commissioned some artists in the past, and while I was generally satisfied with the results, even a human will never really be able to reproduce the picture that's in your head. Enough time with Stable Diffusion and some Photoshop ability will often come closer to the mark. AI art is getting better all the time, but IMO, it is not close to replacing truly talented high-end artists, just as AI is not close to replacing actual rock star programmers and innovators.
It is close to replacing the print shoppers, the commercial graphic arts majors, the SEO optimizers and storefront webapp builders, though.
So, can it write?
Yes and no. I've tried out some of the NovelAI apps and gazed upon the sorry state of Kindle Unlimited, already flooded with thousands of subpar self-published romantasy-written-while-fingering-herself slop and power-fantasy-written-while-jerking-himself slop, and now that has been multiplied seven and sevenfold by AIs churning out the results of all those Udemy and YouTube courses promising you can now make a living on Amazon without actually writing anything. Throw a million books out there with pretty covers and even if you make pennies per title, it adds up. AI has been devastating the short story market for a while now.
If we get to the point where AI can generate good stories, then... I guess I'd be happy to read AI-generated stories? I think we are a long, long way from there, though. And I have experimented. LLMs can generate coherent stories at this point. They have a plot, and some degree of consistency, and I suppose they have all the traditional elements of a story. I am not sure if they are up to generating an entire novel with one prompt yet - I haven't tried, but I know there are tools to let you coach it along to get a whole novel out of it.
But everything I have seen so far is crap. In fairness, most of what's on RoyalRoad (and Wattpad and AO3 and Scribd and all the other open platforms) is crap, but you can still tell what's human-written crap and what's AI slop.
I may be in the minority here; it often seems readers just don't care much anymore and want to consoom entertainment units. But waving my snooty literary tastes around, I sometimes despair at the writing some people think is good just because it tickles their ~~fetishes~~ sweet spots. Some genres (progression fantasies, litrpg, a lot of romance) are written so, so badly that if they aren't AI-generated, they may as well be. An AI has no prose style except very poor mimicry of other styles; it has no ability to truly craft words and turn a phrase in a way that makes you say, "Ah, yes, that is totally how that author writes." It has no way to embed themes and metaphors that echo throughout a book; it has no thematic consistency (often not even tonal consistency). Character arcs, such as they exist, are flat and linear; LLMs cannot grasp "character development" or complexity or nuance in any real way.

If you want a book that's mental bubblegum, a linear power fantasy about a guy getting ever more powerful and punching bigger villains in the face, or a hot chick being fought over by two smoking alphas, with nothing more to it and not even any clever writing to sweeten the experience, just "thing happens and then thing happens and then thing happens" and maybe some purple descriptive modifiers mimicking a high school creative writing exercise, I suppose AIs can do that now. But nothing that even approaches the most derivative pastiches of true classic novels.
And that's just to produce one book. How about a series, a multibook arc preserving plot threads and MacGuffins and character development from one book to the next? An AI cannot do that, and I doubt it will be able to any time soon.
If you're not really a reader and consuming stories is like popping open a beer and you don't care how it tastes as long as it gives you a buzz, maybe AIs will fill that entertainment need. I sometimes put AI-generated soundtracks on as background music, and while the first few minutes can be okay, after a while it sounds very samey and droney and repetitive, even to my extremely unsophisticated ear (and my musical tastes are, in contrast to my literary tastes, utterly banal and horrible).
I don't doubt AI will continue to improve and eventually we'll have the first award-winning novel completely written by AI that even experts agree is actually... kinda good. But I am skeptical. I think it will take a while. I think even when we get to that point it will be a very particular kind of novel that uses some tricks (like being a surrealist or post-modern experimental novel or something else that avoids the usual conventions of narrative structure and story development).
I think it will be a long, long time before we have an AI Stephen King or Kazuo Ishiguro or Margaret Atwood. But I think we will have AI "authors" doing a "good-enough" job for the proles. Whether the slow-motion death of traditional publishing is a good thing or not I guess depends on how much you hate traditional publishing. I think gatekeeping is good, and that's what traditional publishing does. Publishers put out a lot of books I am not interested in and even think are very bad, but I can at least tell from the cover, the blurbs, and the author if it's likely to meet my minimal standards of readability. It's not like sifting through sewage for something sparkly. More like picking a few good apples out of a bin of mostly rotten ones.
I celebrate the flourishing of platforms for anyone to put their work out there, and a handful of indie authors are killing it on Amazon. But increasingly they are no different from the handful of authors who make it big in trad publishing: there are a few big winners, most earn below minimum wage for their efforts, and the many thousands who basically earn beer money, if that, are now competing with LLMs that can scratch the same itch they do.
I am getting pissed off with the AI assistant crap being suggested to me at work (no, Adobe, I do not need the inbuilt AI assistant to "simplify this document for me" when I'm reconciling a blinkin' bank statement), and I think much of the enthusiasm over AI exists because it's all coming out of software engineering.
It seems to be useful (not perfect but useful) if you're writing code. Or if you're dumping your homework on it to write your essays and cheat your exams for you. But for use by ordinary people otherwise? Apart from the slop art and extruded fiction product you mention, I don't yet see it doing anything useful.
I don't need it to write a shopping list and order online for me, just in case someone wants to use that as an example. That's for people who only buy the same things over and over and have more money than time.
If anyone has an example of "this is how I use it for work/at home and it really saves me time and mental energy", I'd be glad to hear.
I don't, but I have a related observation. Because there are differently capable LLMs available, we have not a "one screen, two movies" situation but the reverse: "nominally the same movie on two screens." One screen is 4K Ultra HD; the other is a camcorded VHS tape. In this thread and other forums, savvy people truly test this shit, constantly trying out whatever is the current state of the art, and enthusiastically adopt it, and report amazing results. I almost believe it is that good. Meanwhile, at work, my coworkers are not savvy at all, yet enthusiastically adopt the default-tier ChatGPT. Which is shit. I call it ShitGPT.
I have watched a "senior" engineer, who used to perform perhaps a bit above his skill level but was quite okay, start doing quite stupid stuff, like copy-pasting ChatGPT output into a live code review call without looking at it, including the time ShitGPT decided it wanted to write the answer in C# instead of C++. Another engineer caused a week of mayhem because he uncritically trusted a ChatGPT "summarization" of library documentation, except halfway through, ShitGPT had stopped summarizing the documentation and switched to hallucinating, causing the most curious bugs.