
Culture War Roundup for the week of October 3, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I think over the last few months we've established that AI issues are on topic for the culture war thread, at least when they intersect with explicitly cultural domains like art. So I hope it's ok that I write this here. Feel free to delete if not.

NovelAI's anime model was released today, and it's pretty god damned impressive. If you haven't seen what it can do yet, feel free to check out the /hdg/ threads on /h/ for some NSFW examples.

Not everyone is happy, though; AI art has attracted the attention of at least one member of Congress, among several other public and private entities:

WASHINGTON, D.C. – Today, U.S. Rep. Anna G. Eshoo (D-CA) urged the National Security Advisor (NSA) and the Office of Science and Technology Policy (OSTP) to address the release of unsafe AI models that do not moderate content made on their platforms, specifically the Stable Diffusion model released by Stability AI on August 22, 2022. Stable Diffusion allows users to generate unfiltered imagery of a violent or sexual nature, depicting real people. It has already been used to create photos of violently beaten Asian women and pornography depicting real people.

I don't really bet on there being any serious legal liability for Stability AI or anyone else, but you never know.

I've tried several times to articulate here why I find AI art to be so upsetting. I get the feeling that many people here haven't been very receptive to my views. Partially that's my fault for being a bad rhetorician, but partially I think it's because I'm arguing from the standpoint of a certain set of terminal values which are not widely shared. I'd like to try laying out my case one more time, using some hopefully more down-to-earth considerations which will be easier to appreciate. If you already disagree with me, I certainly don't expect you to be moved by my views - I just hope that you'll find them coherent, that they seem like the sort of thing a reasonable person could believe.

Essentially the crux of the matter is, to borrow a phrase from crypto, "proof of work". There are many activities and products that are valuable, partially or in whole, due to the amount of time and effort that goes into them. I don't think it's hard to generate examples. Consider weight lifting competitions - certainly there's nothing useful about repeatedly lifting a pile of metal bricks, nor does the activity itself have any real aesthetic or social value. The value that participants and spectators derive from the activity is purely a function of the amount of human effort and exertion that goes into the activity. Having a machine lift the weights instead would be quite beside the point, and it would impress no one.
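
For anyone who hasn't run into the crypto term before: a hash-based proof of work is a token that is expensive to produce but trivial for anyone else to verify. A minimal sketch in Python (illustrative only, not any particular blockchain's actual scheme):

    import hashlib
    import itertools

    def proof_of_work(message: str, difficulty: int = 5) -> int:
        """Find a nonce such that sha256(message + nonce) begins with
        `difficulty` zero hex digits. Costly to find, instant to check."""
        target = "0" * difficulty
        for nonce in itertools.count():
            digest = hashlib.sha256(f"{message}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce

    nonce = proof_of_work("hello")  # burns noticeable CPU time to find
    digest = hashlib.sha256(f"hello{nonce}".encode()).hexdigest()
    assert digest.startswith("00000")  # but anyone can verify it instantly

The asymmetry is the point: the token's entire value is the effort it certifies.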

For me personally, AI art has brought into sharp relief just how much I value the effort and exertion that goes into the production of art. Works of art are rather convenient (and beautiful) proof of work tokens. First someone had to learn how to draw, and then they had to take time out of their day and say, I'm going to draw this thing in particular, I'm going to dedicate my finite time and energy to this activity and this particular subject matter rather than anything else. I like that. I like when people dedicate themselves to something, even at significant personal cost. I like having my environment filled with little monuments to struggle and self-sacrifice, just like how people enjoy the fact that someone out there has climbed Mt. Everest, even though it serves no real purpose. Every work of art is like a miniature Mt. Everest.

Or at least it was. AI art changes the equation in a way that's impossible to ignore - it affects my perception of all works of art because now I am much less certain of the provenance of each work*. There is now a fast and convenient way of cheating the proof of work system. I look at a lot of anime art - a lot of it is admittedly very derivative and repetitive, and it tends to all blend together after a while. But in the pre-AI era, I could at least find value in each individual illustration in the fact that it represented the concrete results of someone's time and effort. There are of course edge cases - we have always had tracing, photobashing, and other ways of "cheating". But you could still assume that the average illustration you saw was the result of a concrete investment of time and effort. Now that is no longer the case. Any illustration I see could just as easily be one from the infinite sea of AI art - why should I spend any time looking at it, pondering it, wondering about the story behind it? I am now very uncertain as to whether it has any value at all.

It's a bit like discovering that every video game speedrun video you see has a 50% chance of being a deepfake. Would you be as likely to watch speedrunning videos? I wouldn't. They only have value if they're the result of an actual investment of time by a human player - otherwise, they're worthless. Or, to take another very timely example, the Carlsen-Niemann cheating scandal currently rocking the world of chess. Chess is an illustrative example to look at, because it's a domain where everyone is acutely aware of the dangers of a situation where you can't tell the difference between an unaided human and a human using AI assistance. Many people have remarked that chess is "dead" if they can't find a way to implement effective anti-cheating measures that will prevent people from consulting engines during a game. People want to see two humans play against each other, not two computers.

To be clear, I'm not saying that the effort that went into a work of art is the only thing that matters. I also place great value on the intrinsic and perceptual properties of a work of art. I see myself as having a holistic view where I value both the intrinsic properties of the work, and the extrinsic, context-dependent properties related to the work's provenance, production, intention, etc.

TL;DR - I used to be able to look at every work of art and go "damn someone made that, that's really cool", now I can't do that, which makes every interaction I have with art that much worse, and by extension it makes my life worse.

*(I'm speaking for convenience here as if AI had already supplanted human artists. As I write this post, it still has limitations, and there are still many illustrations that are unmistakably of human origin. But frankly, given how fast the new image models are advancing, I don't know how much longer that will be the case.)

EDIT: Unfortunately, this dropped the day after I wrote my post, so I didn't get a chance to comment on it originally. Based on continually accumulating evidence, I may have to retract my original prediction that opposition to AI art was going to be a more right-coded position. Perhaps there are not as many aesthetes in the dissident right as I thought.

I really don't find your post convincing in the least. And the constant whiny bitching and crying by artists about AI art has made me suspicious of the motivations of these so-called "artists", who claim to do it for their love of the art. @EfficientSyllabus said it a lot better than I could: you are not lamenting the loss of an art form, you are lamenting the loss of status.

Here are a few scattered thoughts as to why I am so deeply unsympathetic to those who endlessly moan about AI art.

  • I'm a programmer. I believe I love the art of programming. I also know some people who genuinely love the art of programming. When I saw what OpenAI Codex (an AI that can generate code; see the sketch after this list) was capable of doing, my jaw was on the floor. A program that could write more programs?? It was science fiction in front of my own eyes. Every other programmer I know who loves the game itself had the exact same reaction: amazement.

    So what if a machine can write code? Code is good! The world needs more code! Code makes machines more efficient, it does boring jobs that people would otherwise have to do, and it optimizes processes that literally put food into most people's mouths.

    In the same vein: is art not a good thing? Is the world not a richer place because there will be more art? Isn't it great that an independent blogger who couldn't afford commissions will now get to have art that makes his blogging richer? Is it not great that a mom-and-pop shop can now produce artwork that will make their corner store more lively? Won't the world get a little bit more aesthetically pleasing?

    Why are the majority of programmers so enthusiastic about machines that can code, but not artists?

    Maybe because the greatest trick the devil ever pulled was convincing the world that "artists" are in it for the love of the art while us uncool dirty nerds are in it for the money and status?

  • There is an art to almost every process, right?

    Farming can also be an art, right? Getting the soil tilled just right, making sure the seeds are placed just the appropriate distance apart, etc.

    However, if someone lamented the loss of farming as an art form because combine harvesters were invented... my, and hopefully any rational person's, response would be:

    " You motherfucker. Do you not realize that millions of hungry mouths will be fed because of this thing? Is your artsy fartsy shit more important that people not being hungry?"

  • The world is a place where things need to get done.

    I love the art of programming and spend countless hours cleaning up my programs, but ultimately it's all of no value if no one can use them. Chefs can put their heart and soul into their food, but it would be of no value if no one ate it.

    The value is in the PRODUCT, not the PROCESS.

    If my favorite bakery found a way to mass-produce their cheesecakes such that the pastry chef was no longer required and it was all done by machines, then good. More people can enjoy great food for cheap. And to be honest, my tongue doesn't care; if it did, its priorities would not be in order.

    Boo hoo for the pastry chef; if they love making cakes so much, they can make a cake and throw it in the trash. In my world, cakes are for eating. Is it not wisdom that when you cook for your friends and family, you do it for them, not for you? The sanctity is in the fact that their stomachs are full, not that your knife skills are perfected.

    Same for the artists: they can draw their art and throw it in the trash; it's the process that matters, right?
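
Since I mentioned Codex above, here is roughly what poking at it looked like, using the openai Python client as it existed around the Codex beta (the exact model name and fields are from memory, so treat them as illustrative):

    import openai  # the 2022-era client; newer versions changed this interface

    openai.api_key = "sk-..."  # your own API key

    completion = openai.Completion.create(
        model="code-davinci-002",  # the Codex model available in the beta
        prompt='def fibonacci(n):\n    """Return the nth Fibonacci number."""\n',
        max_tokens=64,
        temperature=0,
    )
    print(completion.choices[0].text)  # the model's suggested function body

You hand it the start of a function and it hands you back a plausible implementation. That is the thing that put my jaw on the floor.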

I am someone who believes that more things are good. Products are good. Anyone lamenting a process that brings more good things into the world is my enemy. You are actively lamenting that the world is becoming a richer place, in both the economic and the metaphorical sense of the word 'richer'. The pie is getting bigger; you are just lamenting that you won't be having a relatively larger share of it.

Why are the majority of programmers so enthusiastic about machines that can code, but not artists?

Because they aren't. They're collectively deluding themselves into believing in the «soul» and that programming will never be automated by AI. Just like certain artists are.

I am a programmer. OpenAI scares me. I'm putting every effort I've got into the Grind, because I think the industry's due for a phenomenal crash that'll leave the majority in the dumps. You are free to disagree.

There is no problem humans face that cannot be reframed as a programming or automation problem. Need food? Build a robot to grow it for you, and another to deliver it to your house. Need to build a robot? Make a factory that automates robot fabrication. Need to solve X medical issue? Write a program that figures out, using simulations or whatever, how to synthesize a chemical or machine that fixes it. Given this, the question of "what happens to programmers when computers can write code for arbitrary domains just as well as programmers can" answers itself.

I expect that fully automating coding will be the last job anybody ever does, either because we're all dead or we have realized Fully Automated Luxury Space Communism.

Is there something misleading with the way I phrased my comment? I don't understand why multiple people have succeeded in reading "programmers will be completely replaced by AI" into my words.

And this isn't a nitpicking thing. It is an extremely important distinction; I see this in the same way as the Pareto Principle. The AI labs are going to quickly churn out models good enough to cover 95% of the work the average software engineer does, and the programming community will reach a depressive state where everyone's viciously competing for that last 5% until true AGI arrives.

Your first paragraph misses how hard it is for human programmers to achieve those things, if it is even possible under current circumstances (find me a program that can acquire farmland & construct robots for it & harvest everything & prepare meals from raw materials). Even hiring an army of programmers (AI or no) would not satisfy the preconditions necessary for getting your own food supply, namely having an actual physical presence. You need to step beyond distributed human-level abilities into superhuman AI turf for that to happen.

There is a sense in which the job of coding has already been automated away several times. For instance, high-level languages enable a single programmer to accomplish work that would be out of the grasp of even a dozen assembly-language programmers. (This did, in fact, trash the job market for assembly-language programmers.)
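
To make that gap concrete, here's a toy contrast in Python rather than real assembly (the actual distance between assembly and a modern high-level language is far larger):

    data = [3, 1, 4, 1, 5, 9]

    # High-level version: iteration, bounds checks, and memory management
    # are all implicit in a single expression.
    total = sum(x * x for x in data)

    # Roughly the same computation spelled out one level down; an
    # assembly-language programmer would additionally be managing
    # registers, addresses, and calling conventions by hand.
    total = 0
    i = 0
    while i < len(data):
        total = total + data[i] * data[i]
        i = i + 1

Each new layer of abstraction lets one programmer express what used to take many, which is the sense in which the job keeps getting "automated away".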

The reason this hasn't resulted in an actual decline in programmer jobs over time is that each time a major tool is invented that makes programming easier (or eliminates the necessity for it in particular domains), people immediately set their sights on more difficult tasks that were considered impractical or impossible in the previous paradigm.

I don't really see the mechanism by which AI-assisted programming is different in this respect. Sure, it means a subset of programming problems will no longer be done by humans. That just means humans will be freed to work on programming and engineering problems that AI can't do, or at least can't do yet; and they'll have the assistance of the AI programmers that automated away their previous jobs.

And if there are no more engineering or programming problems like that, then you now have Automated Luxury Space Communism.

Roughly speaking, I see your point and agree that it's possible we're just climbing a step further up on an infinite ladder of "things to do with computers".

But I disagree that it's the most likely outcome, because:

  1. I think the continued expansion of the domain space for individual programmers can be partially attributed to Moore's Law. More Is Different; a JavaScript equivalent could easily have been developed in the '80s but simply wasn't, because there wasn't enough computational slack at the time for a sandboxed, garbage-collected, asynchronous scripting language running complex enterprise graphical applications. Without the regular growth in computational power, I expect innovations to slow.

  2. Cognitive limits. Say a full-stack developer gets to finish their work in 10% of the time. Okay, now what? Are they going to spin up a completely different project? Make a fuzzer, a GAN, a SAT solver, all for fun? The future ability of AI tools to spin up entire codebases on demand does not help with the human learning process of figuring out what actually needs to be done. And if someone makes a language model to fix that problem, then domain knowledge becomes irrelevant and everyone (and thus no one) becomes a programmer.

  3. I think, regardless of AI, that the industry is oversaturated and due for mass layoffs. There are currently weak trends pointing in this direction, but I wouldn't blame anyone for continuing to bet on its growth.

For (1), what you're saying is certainly true; the better abstractions and better tooling have been accompanied by growth in hardware fundamentals that cannot reasonably be expected to continue.

(2) is where I'm a lot more skeptical. A sufficient (though certainly not necessary) condition for a valuable software project is identifying a thing that requires human labor that a computer could, potentially, be doing instead.

The reason I called out robotics specifically is that, yeah, if you think about "software" as just meaning "stuff that runs on a desktop computer", well, there are lots of spheres of human activity that occur away from a computer. But the field of robotics represents the set of things that computers can be made to do in the real world.

That being so, if non-robotics software becomes trivial to write, I expect we are in one of four possible worlds:

World one: General-purpose robotics - for example, building robots that plant and harvest crops - is possible for (AI-assisted) human programmers to do, but it's intrinsically really hard even with AI support, so human programmers/engineers still have to be employed to do it. This seems like a plausible world that we could exist in, and seems basically similar to our current world except that the programmer gold rush is in robotics instead of web apps.

World two: General-purpose robotics is really easy for non-programmers if you just make an AI do the robot programming. That means "programming" stops being especially lucrative as a profession, since programming has been automated away. It also means that every other job has been (or will very soon be) automated away. This is Fully-Automated Luxury Space Communism world, and also seems broadly plausible.

World three: General-purpose robotics is impossible at human or AI levels of cognition, but non-robotics AI-assisted programming is otherwise trivial. I acknowledge this is a world where mass layoffs of programmers would occur and that this would be a problem for us. I also do not think this is a very likely scenario; general-purpose robotics is very hard but I have no specific reason to believe it's impossible, especially if AI software development has advanced to the point where almost all other programming is trivial.

World four: World two, except somebody screwed up the programming on one of their robot-programming AIs such that it murders everyone instead of performing useful labor. This strikes me as another plausible outcome.

Are there possibilities I'm missing that seem to you reasonably likely?

For your point (3), I have no particular expectations or insight one way or another.

Hi, I just want to leave a stub response: you seem right, and I failed to type a decent response after reading this two days ago.