
Culture War Roundup for the week of October 3, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I think over the last few months we've established that AI issues are on topic for the culture war thread, at least when they intersect with explicitly cultural domains like art. So I hope it's ok that I write this here. Feel free to delete if not.

NovelAI's anime model was released today, and it's pretty god damned impressive. If you haven't seen what it can do yet, feel free to check out the /hdg/ threads on /h/ for some NSFW examples.

Not everyone is happy though; AI art has attracted the attention of at least one member of Congress, among several other public and private entities:

WASHINGTON, D.C. – Today, U.S. Rep. Anna G. Eshoo (D-CA) urged the National Security Advisor (NSA) and the Office of Science and Technology Policy (OSTP) to address the release of unsafe AI models that do not moderate content made on their platforms, specifically the Stable Diffusion model released by Stability AI on August 22, 2022. Stable Diffusion allows users to generate unfiltered imagery of a violent or sexual nature, depicting real people. It has already been used to create photos of violently beaten Asian women and pornography depicting real people.

I don't really bet on there being any serious legal liability for Stability.AI or anyone else, but you never know.

I've tried several times to articulate here why I find AI art to be so upsetting. I get the feeling that many people here haven't been very receptive to my views. Partially that's my fault for being a bad rhetorician, but partially I think it's because I'm arguing from the standpoint of a certain set of terminal values which are not widely shared. I'd like to try laying out my case one more time, using some hopefully more down-to-earth considerations which will be easier to appreciate. If you already disagree with me, I certainly don't expect you to be moved by my views - I just hope that you'll find them to be coherent, that it seems like the sort of thing that a reasonable person could believe.

Essentially the crux of the matter is, to borrow a phrase from crypto, "proof of work". There are many activities and products that are valuable, partially or in whole, due to the amount of time and effort that goes into them. I don't think it's hard to generate examples. Consider weight lifting competitions - certainly there's nothing useful about repeatedly lifting a pile of metal bricks, nor does the activity itself have any real aesthetic or social value. The value that participants and spectators derive from the activity is purely a function of the amount of human effort and exertion that goes into the activity. Having a machine lift the weights instead would be quite beside the point, and it would impress no one.

For me personally, AI art has brought into sharp relief just how much I value the effort and exertion that goes into the production of art. Works of art are rather convenient (and beautiful) proof of work tokens. First someone had to learn how to draw, and then they had to take time out of their day and say, I'm going to draw this thing in particular, I'm going to dedicate my finite time and energy to this activity and this particular subject matter rather than anything else. I like that. I like when people dedicate themselves to something, even at significant personal cost. I like having my environment filled with little monuments to struggle and self-sacrifice, just like how people enjoy the fact that someone out there has climbed Mt. Everest, even though it serves no real purpose. Every work of art is like a miniature Mt. Everest.

Or at least it was. AI art changes the equation in a way that's impossible to ignore - it affects my perception of all works of art because now I am much less certain of the provenance of each work*. There is now a fast and convenient way of cheating the proof of work system. I look at a lot of anime art - a lot of it is admittedly very derivative and repetitive, and it tends to all blend together after a while. But in the pre-AI era, I could at least find value in each individual illustration in the fact that it represented the concrete results of someone's time and effort. There are of course edge cases - we have always had tracing, photobashing, and other ways of "cheating". But you could still assume that the average illustration you saw was the result of a concrete investment of time and effort. Now that is no longer the case. Any illustration I see could just as easily be one from the infinite sea of AI art - why should I spend any time looking at it, pondering it, wondering about the story behind it? I am now very uncertain as to whether it has any value at all.

It's a bit like discovering that every video game speedrun video you see has a 50% chance of being a deepfake. Would you be as likely to watch speedrunning videos? I wouldn't. They only have value if they're the result of an actual investment of time by a human player - otherwise, they're worthless. Or, to take another very timely example, the Carlsen-Niemann cheating scandal currently rocking the world of chess. Chess is an illustrative example to look at, because it's a domain where everyone is acutely aware of the dangers of a situation where you can't tell the difference between an unaided human and a human using AI assistance. Many people have remarked that chess is "dead" if they can't find a way to implement effective anti-cheating measures that will prevent people from consulting engines during a game. People want to see two humans play against each other, not two computers.

To be clear, I'm not saying that the effort that went into a work of art is the only thing that matters. I also place great value on the intrinsic and perceptual properties of a work of art. I see myself as having a holistic view where I value both the intrinsic properties of the work, and the extrinsic, context-dependent properties related to the work's provenance, production, intention, etc.

TL;DR - I used to be able to look at every work of art and go "damn someone made that, that's really cool", now I can't do that, which makes every interaction I have with art that much worse, and by extension it makes my life worse.

*(I'm speaking for convenience here as if AI had already supplanted human artists. As I write this post, it still has limitations, and there are still many illustrations that are unmistakably of human origin. But frankly, given how fast the new image models are advancing, I don't know how much longer that will be the case.)

EDIT: Unfortunately, this dropped the day after I wrote my post, so I didn't get a chance to comment on it originally. Based on continually accumulating evidence, I may have to retract my original prediction that opposition to AI art was going to be a more right-coded position. Perhaps there are not as many aesthetes in the dissident right as I thought.

I really don't find your post convincing in the least. And the constant whiny bitching and crying by artists about AI art has made me suspicious of the motivations of these so-called "artists" who claim to do it for their love of the art. @EfficientSyllabus said it a lot better than me: you are not lamenting the loss of an art form, you are lamenting the loss of status.

Here are a few scattered thoughts as to why I am so deeply unsympathetic to those who endlessly moan about AI art.

  • I'm a programmer. I believe I love the art of programming. I also know some people who genuinely love the art of programming. When I saw what OpenAI Codex (an AI that can generate code) was capable of doing, my jaw was on the floor. A program that could write more programs?? It was science fiction in front of my own eyes. Every other programmer I know who loves the game itself had the exact same reaction: amazement.

    So what if a machine can write code? Code is good! The world needs more code! Code makes machines more efficient, it does boring jobs that people would otherwise have to do, and it optimizes processes that literally put food into most people's mouths.

    In the same vein, is art not a good thing? Is the world not a richer place because there will be more art? Isn't it great that an independent blogger who couldn't afford commissions will now get to have art that makes his blogging richer? Is it not great that a mom-and-pop shop can now produce artwork that will make their corner store more lively? Won't the world get a little bit more aesthetically pleasing?

    Why are the majority of programmers so enthusiastic about machines that can code, but not artists?

    Maybe because the greatest trick the devil ever pulled was convincing everyone that "artists" are in it for the love of the art while we uncool dirty nerds are in it for the money and status?

  • There is an art to almost every process, right?

    Farming can also be an art, right? Getting the soil tilled just right, making sure the seeds are placed just the appropriate distance apart, etc.

    However, if someone lamented the loss of farming as an art form because combine harvesters were invented... my, and hopefully any rational person's, response would be:

    " You motherfucker. Do you not realize that millions of hungry mouths will be fed because of this thing? Is your artsy fartsy shit more important that people not being hungry?"

  • The world is a place where things need to get done.

    I love the art of programming and spend countless hours cleaning up my programs, but ultimately it's of no value if no one can use my programs. Chefs can put their heart and soul into their food, but it would be of no value if no one ate it.

    The value is in the PRODUCT, not the PROCESS.

    If my favorite bakery found a way to mass produce their cheesecakes but the pastry chef was not required anymore and it would be all done by machines, then good. More people can enjoy great food for cheap. And to be honest my tongue doesn't care, if it did, its priorities are not in order.

    Boohoo for the pastry chef; if they love making cakes so much, they can make the cake and throw it in the trash. In my world cakes are for eating. Is it not wisdom that you cook for your friends and family for their sake, not for yours? The sanctity is in the fact that their stomachs are full, not that your knife skills are perfected.

    Same for the artists: they can draw their art and throw it in the trash, it's the process that matters, right?

I am someone who believes that more things are good. Products are good. Anyone lamenting a process that brings more good things into the world is my enemy. You are actively lamenting that the world is becoming a richer place, in both the economic and the metaphorical sense of the word 'richer'. The pie is getting bigger; you are just lamenting that you won't have a relatively larger share of it.

I think there is a certain line of thinking that can plausibly be raised as a defense:

  • The process of freely creating art is valuable as a form of human expression, either per se or because it enriches the human experience in some way. (This position is apparently one which you do not hold, but let's assume for the moment that a large portion of the population does sincerely hold it.)

  • So far, the process of creating art has been subsidized by its products which can be sold: corporate art, commissions, etc. However, in the future, these products are poised to be far more efficiently generated by AI.

  • Without revenue from these products, many of today's artists will be forced to move into other fields, and perhaps curtail their personal output due to no longer having enough time, supplies, or practice. This is bad, since it decreases the quality and quantity of valuable art creation.

  • Similarly, once the creation of art becomes no longer profitable, the second-order effects start to occur: the entire industry of art education gradually falls apart, and many people become unable to learn the skills to express themselves through art in the way they would prefer.

That is, the process of art-as-human-expression will be impacted negatively by the AI-driven devaluing of art-as-a-commercial-product.

I recall a discussion on LW or SSC (that I am now unable to find) about how many people try to find economic justifications for avoiding animal stress, looking for evidence that less-stressed cows (for instance) produce better meat, since that kind of justification is the only form our society will accept: if no such justification can be found, then animal welfare will inevitably get tossed out the window. I interpret @Primaprimaprima's perspective in a similar light; if there is no more value in humans creating art as a product, then there will be nothing left to prop up the tradition of art as an expression, and the world will be worse off for it.

(Whether this assumption of art-as-expression depending on the existence of art-as-a-product holds up in reality is a different question. But it certainly seems like a plausible enough risk to worry about, assuming one values art-as-expression.)

If art as expression is valuable enough, then people will spend money to do it for its own sake. If it is not, then they were being subsidized the whole time and it was less valuable than we previously thought. I think art will survive, but most artists will need to get day jobs, just like all the people who used to subsidize their art without personally getting to experience the expression element. Because that's really the hidden cost that isn't being brought up: the tiny tax on everyone else in society so that artists could be, for lack of a better word, "unproductive" (undoubtedly they were productive in a pre-AI-art world, but they are no longer productive in a post-AI-art world).

Indeed. I suppose that the next step of the defense would be that society persistently undervalues art-as-expression: if the general public were aware of its full value, they would pay for art-as-expression, but structural factors and a lack of quantifiable benefits make awareness implausible in the near future. (Compare this to the animal-welfare activist who fights against factory farmers' greed and consumers' apathy: they believe that if the public were aware of the full value of animal welfare, then animal-protection laws would be passed in a heartbeat.)

In this scenario, the best outcome, short of formal subsidies for artists, would perhaps be a large-scale donation model, much like for many orchestras and museums today. But this is still much less accessible to artists than the pre-AI status quo, where art-as-expression maintains a safe existence as a byproduct of art-as-a-product. So it would still make sense for those who value art-as-expression to lament this change beyond the effects on their own lifestyles, given that this particular Pandora's Box isn't getting closed any time soon.

I think there are two distinct groups here, with different reasons to be worried or not worried about AI art.

(1) The 'fine artists', producing the likes of the Basquiat painting in that story further down about the Guggenheim. The very top ones won't be affected; hell, they may even get into using AI to produce art, because it's all about the concept and not the actual work. Damien Hirst did not cast works like this himself; he does the design and then hands it off to a foundry to make it. So AI art is not a threat to art which is about concept, notoriety, ethnicity, who is the latest hot property taken up by the galleries and rich collectors, etc.

(2) The commercial artists, who very well may find themselves out of a job if AI can churn out made-to-measure works for posters (like that awful German Green Party one), magazine and online article illustrations in this style called Alegria or Corporate Memphis, advertising and product art, and the rest of it. Some of them can adopt it as one more tool, like the software they already use, but if a big corporation can create its own in-house art by purchasing an AI program to do it, then that cuts out freelancers and those who rely on commissions. Amateur artists are a subset of this: all the artists doing fan art for commission may be priced out if you can instead use an AI that will do exactly what you want, the way you want it.

So there is definitely a panic about "the AI is taking our jobs!", and that isn't entirely something to mock, because there will be people who can no longer make a living doing commercial art. How that shakes out remains to be seen, and we really won't know until AI art is widely used. Maybe people will go back to having a Real Human Drew This piece of work, to stand out from all the mass-produced AI art, especially for things like fashion magazines that want to sell themselves as being creative and different and unique.

And there are real concerns about art as art, from people who enjoy creating art and don't like the implication that this is just one more human activity that can be mechanised and turned into extruded product. You say that the value is in the product, not the process, but for most of us our experience of mass-market mechanised production of, for instance, food products has not been "oh wow, this cheesecake is so delicious and gorgeous, just like a pastry chef made it!", it has been "replace ingredients with cheapest substitute, lots of artificial flavouring and colouring, and a process that is economically convenient for the manufacturer" ending up in bland, processed, 'not as good as the real thing' goods (see the furore over how Cadbury chocolate has changed since Mondelez bought it).

If the experience of mechanisation was "wow, gorgeous!" instead of "yeah, now it's gonna be cheap, bad-tasting gunk", then people would be less alarmed about AI art (as distinct from the financial element). You say "More people can enjoy great food for cheap. And to be honest my tongue doesn't care, if it did, its priorities are not in order" but would you really not care if it tasted different? Why is that bakery your favourite bakery, if not for the very reason that it pleases your tongue? "Okay, now the cheesecake tastes like chalk and mouse-droppings and gives me diarrhoea after I eat it, but shut up tongue! The process is more efficient and cheaper and productive, who cares about the quality of the end product?"

So artists are losing the people without taste who were being overcharged for what they were experiencing, while people with taste will stick to real artists. What's the problem here? I can still get, and do still get, fancy hand-crafted artisanal food; it costs as much as it always has. But now the poor can get at least an approximation, cheap enough for them to afford. I imagine high-profile publications and AAA game titles will still have humans doing their art for them, but suddenly indie publications and indie games can afford as many art assets as they can productively use. This is a pure win for expression.

And the constant whiny bitching and crying by artists about AI art has made me suspicious of the motivations of these so called "artists"

Write like you want to include everyone in the conversation, please. This is unnecessarily heated.

But, critically, that isn't the actual argument being made by artists, probably because it's a losing argument. Milton Friedman's classic story:

While traveling by car during one of his many overseas travels, Professor Milton Friedman spotted scores of road builders moving earth with shovels instead of modern machinery. When he asked why powerful equipment wasn’t used instead of so many laborers, his host told him it was to keep employment high in the construction industry. If they used tractors or modern road building equipment, fewer people would have jobs was his host’s logic.

“Then instead of shovels, why don’t you give them spoons and create even more jobs?” Friedman inquired.

This is probably the key point. It’s impossible for a non-programmer to do anything useful with Copilot. Non-artists can already do useful things with NovelAI right now though; they’re ready to start cutting artists out of the loop now, today. It’s not a comparable situation.

Why are the majority of programmers so enthusiastic about machines that can code, but not artists?

Because they aren't. They're collectively deluding themselves into believing in the «soul» and that programming will never be automated by AI. Just like certain artists are.

I am a programmer. OpenAI scares me. I'm putting every effort I've got into the Grind, because I think the industry's due for a phenomenal crash that'll leave the majority in the dumps. You are free to disagree.

There is no problem humans face that cannot be reframed as a programming or automation problem. Need food? Build a robot to grow it for you, and another to deliver it to your house. Need to build a robot? Make a factory that automates robot fabrication. Need to solve X medical issue? Write a program that figures out using simulations or whatever how to synthesize a chemical or machine that fixes it. Given this, the question of "what happens to programmers when computers can write code for arbitrary domains just as well as programmers can" answers itself.

I expect that fully automating coding will be the last job anybody ever does, either because we're all dead or we have realized Fully Automated Luxury Space Communism.

Is there something misleading with the way I phrased my comment? I don't understand why multiple people have succeeded in reading "programmers will be completely replaced by AI" into my words.

And this isn't a nitpicking thing. It is an extremely important distinction; I see this in the same way as the Pareto Principle. The AI labs are going to quickly churn out models good enough to cover 95% of the work the average software engineer does, and the programming community will reach a depressive state where everyone's viciously competing for that last 5% until true AGI arrives.

Your first paragraph misses how hard it is for human programmers to achieve those things, if it is even possible under current circumstances (find me a program that can acquire farmland & construct robots for it & harvest everything & prepare meals from raw materials). Even hiring an army of programmers (AI or no) would not satisfy the preconditions necessary for getting your own food supply, namely having an actual physical presence. You need to step beyond distributed human-level abilities into superhuman AI turf for that to happen.

There is a sense in which the job of coding has already been automated away several times. For instance, high-level languages enable a single programmer to accomplish work that would be out of the grasp of even a dozen assembly-language programmers. (This did, in fact, trash the job market for assembly-language programmers.)

The reason this hasn't resulted in an actual decline in programmer jobs over time is because each time a major tool is invented that makes programming easier (or eliminates the necessity for it in particular domains), people immediately set their sights on more-difficult tasks that were considered impractical or impossible in the previous paradigm.

I don't really see the mechanism by which AI-assisted programming is different in this way. Sure, it means a subset of programming problems will no longer be done by humans. That just means humans will be freed to work on programming and engineering problems that AI can't do, or at least can't do yet; and they'll have the assistance of the AI programmers that automated away their previous jobs.

And if there are no more engineering or programming problems like that, then you now have Automated Luxury Space Communism.

Roughly speaking, I see your point and agree that it's possible we're just climbing a step further up on an infinite ladder of "things to do with computers".

But I disagree that it's the most likely outcome, because:

  1. I think the continued expansion of the domain space for individual programmers can be partially attributed to Moore's Law. More Is Different; a JavaScript equivalent could've easily been developed in the 80s but simply wasn't, because there wasn't enough computational slack at the time for a sandboxed, garbage-collected, asynchronous scripting language to run complex enterprise graphical applications. Without the regular growth in computational power, I expect innovation to slow.

  2. Cognitive limits. Say a full stack developer gets to finish their work in 10% of the time. Okay, now what? Are they going to spin up a completely different project? Make a fuzzer, a GAN, an SAT solver, all for fun? The future ability of AI tools to spin up entire codebases on demand does not help in the human learning process of figuring out what actually needs to be done. And if someone makes a language model to fix that problem, then domain knowledge becomes irrelevant and everyone (and thus no one) becomes a programmer.

  3. I think, regardless of AI, that the industry is oversaturated and due for mass layoffs. There are currently weak trends pointing in this direction, but I wouldn't blame anyone for continuing to bet on its growth.

For (1), what you're saying is certainly true; the better abstractions and better tooling have been accompanied by growth in hardware fundamentals that cannot reasonably be expected to continue.

(2) is where I'm a lot more skeptical. A sufficient (though certainly not necessary) condition for a valuable software project is identifying a thing that requires human labor that a computer could, potentially, be doing instead.

The reason I called out robotics specifically is because, yeah, if you think about "software" as just meaning "stuff that runs on a desktop computer", well, there's lots of spheres of human activity that occur away from a computer. But the field of robotics represents the set of things that computers can be made to do in the real world.

That being so, if non-robotics software becomes trivial to write I expect we are in one of four possible worlds:

World one: General-purpose robotics-- for example, building robots that plant and harvest crops-- is possible for (AI-assisted) human programmers to do, but it's intrinsically really hard even with AI support, so human programmers/engineers still have to be employed to do it. This seems like a plausible world that we could exist in, and seems basically similar to our current world except that the programmer-gold-rush is in robotics instead of web apps.

World two: General-purpose robotics is really easy for non-programmers if you just make an AI do the robot programming. That means "programming" stops being especially lucrative as a profession, since programming has been automated away. It also means that every other job has been (or will very soon be) automated away. This is Fully-Automated Luxury Space Communism world, and also seems broadly plausible.

World three: General-purpose robotics is impossible at human or AI levels of cognition, but non-robotics AI-assisted programming is otherwise trivial. I acknowledge this is a world where mass layoffs of programmers would occur and that this would be a problem for us. I also do not think this is a very likely scenario; general-purpose robotics is very hard but I have no specific reason to believe it's impossible, especially if AI software development has advanced to the point where almost all other programming is trivial.

World four: World two, except somebody screwed up the programming on one of their robot-programming AIs such that it murders everyone instead of performing useful labor. This strikes me as another plausible outcome.

Are there possibilities I'm missing that seem to you reasonably likely?

For your point (3), I have no particular expectations or insight one way or another.

Hi, I just want to leave a stub response: you seem right, and I failed to type a decent response after reading this two days ago.

We've been trying to innovate ourselves out of a job since the very beginning. I work with high powered business people that frequently can't even manage the most basic computer tasks, let alone automate them. We'll have a niche so long as people continue to work. What a glorious day it will be when all that intelligence and ingenuity is put to tasks other than making ads serve 0.2% faster.

You really think AI is going to replace programmers? If it does then it will be smart enough to self-modify, and then career concerns are the least of our worries.

If it does then it will be smart enough to self-modify,

This does not work out the way you think it will. A p99-human tier parallelised unaligned coding AI will be able to do the work of any programmer, will be able to take down most online infrastructure by merit of security expertise, but won't be sufficient for a Skynet Uprising, because that AI still needs to solve for the "getting out of the digital box and building a robot army" part.

If the programming AI was a generalised intelligence, then of course we'd be all fucked immediately. But that's not how this works. What we have are massive language models that are pretty good at tackling any kind of request that involves text generation. Solve for forgetfulness in transformer models and you'll only need one dude to maintain that full stack app instead of 50.

What I'm saying is that AIs are made of code. If they can write code then they can improve themselves. An AI able to code better than people can also code a better AI than people can. Maybe you don't think that will lead to recursive self-modification--I think there's at least a good chance that there are diminishing returns there--but just consider the advances we've made in AI in the last year, and you're supposing a future where not only have we gotten farther but then there's another entity capable of going farther still. At a bare minimum I think an AI capable of doing that is capable of replacing most other careers too.

I too am a programmer, but fortunately the German software industry is so far behind the times and slow to evolve and German employment laws in general are so strict in their regulations in favor of employees that I think I can safely coast halfway to retirement before I feel any market pressure.

That depends on how much of a difference AI will make, doesn't it? If advanced AI enables big American corps to churn out absurdly efficient code or highly advanced machine designs in minimal time, what will sclerotic German companies do?

I used to work at Siemens and half the people employed as programmers there thought that automating things in Excel was black magic, let alone doing basic things in Python with libraries like pandas. The difference in productivity compared to its rivals is small enough that coasting on momentum of past strengths might be sufficient to stay relevant in the present, but strong AI could plausibly make a lot of crusty German institutions obsolete in a way that our lawmakers won't be able to compensate for.

Stop scaring me. If the statists are going to tax me anyways then the least I expect to receive in exchange is the illusion of security.

Why are the majority of programmers so enthusiastic about machines that can code

Because they have no foresight.

I write code for a living. Of course I don’t want machines to learn how to code. That would put me out of a job! Why would I want someone to build something that would put me out of a job? That makes no sense.

I assume that by the time machines have truly automated coding (that means they can debug and fix your 40 year old proprietary system too according to natural language requirements, not just generate new code) then we’ll have AGI. I’ve become even more pessimistic about AGI lately and I think it’s likely that it will lead us straight into a dystopia, because there’s no reason for the rich to give us UBI. Once they no longer need us, they’ll more than likely just let us starve. So I hope to God that machines don’t automate coding in my lifetime.

I find this pessimism ridiculous. What are the rich going to do with all that corn? Everything I've seen or heard about them suggests that they want to be admired and loved by the people, and I at least live in a democracy where it's not totally up to the rich to decide whether to let us all starve.

I'm of the same opinion. If the future is fully automated, then one doesn't need the AI itself to go bad in order to have a humanitarian catastrophe - it will be sufficient for those who command the AIs to determine that keeping the unproductive classes as pets is no longer fashionable.

This is my belief about the dangers of AI - it's not the AI itself (because I don't think we will get a self-aware AI that has goals, wants to meet them, and is agentic like a person) but the people who use it, control it, and think they can depend on it to solve problems like "best economic policy" or 'let's create the technocratic utopia'.

Let's say Boston Dynamics creates a genuinely humanoid robot that can do all the things humans do, and Amazon buys a ton of them to replace the human workers in its fulfilment centres (particularly since they have concerns about labour shortages). Great, this is way more efficient, Amazon can now sell us "same hour delivery", costs go down and productivity goes way up, things are better all round for everyone!

Except the laid-off warehouse workers, because where are they going to find jobs? How are they going to live? Even 'learn to code' isn't enough anymore, because we have AI to do that, too. And Amazon is not going to pay them wages or the equivalent in UBI out of the goodness of its heart, why would it? It's a business, not a charity. The government may have to do this, and it will raise money by taxing Amazon, and Amazon will use the usual legal loopholes and run-arounds to avoid paying more tax than the minimum necessary (tax avoidance, not tax evasion, is the distinction I have been told), so where will the money come from? If we think the pensions shortfall is going to be a massive shock, wait until you can't even work until you're 80 as a Walmart greeter to supplement your pension or whatever money you have to live on, because there's a BD robot doing that now.

tax avoidance

Tax avoidance works because we have literal laws on the books that let them do it, usually for quite good reasons. If the tax revenue is insufficient we can simply eliminate these ways to reduce tax burden or raise taxes.

If my favorite bakery found a way to mass produce their cheesecakes but the pastry chef was not required anymore and it would be all done by machines, then good. More people can enjoy great food for cheap. And to be honest my tongue doesn't care, if it did, its priorities are not in order.

I think you go too far here. If your tongue cares, it's because the machines are not producing the same cheesecakes you enjoyed from the pastry chef, they are producing almost the same. This is great for everyone who wants a good cheesecake for a reasonable price, but it is also good for the pastry chef - now he gets to make cheesecakes for people who want cheesecakes that are better than the machines can make. Plus he gets the fame and prestige of being the model upon which the machines are based. Nobody is putting "Made by machines!" on the label of their cheesecakes - they want a cheesecake made by the pastry chef for a reasonable price. Thanks to the machines they can get it, and if that's not good enough then they can pay for the chef's actual cheesecake.

They are equal in my analogy.

What a strange thing to say. Never mind that mass production has been around for a century and has never created products identical to handmade ones; what other priorities might your tongue have, then? Have you considered that maybe your tongue doesn't actually care, but, due to a minor addendum you made to one of its arguments - helping it, no less - it is just pretending to care because it is a child?

They are equal in my analogy

But in reality? That is the difference between people's experience of increasing automation and industrialisation, and the rosy forecasts of "by the 1980s, people will have so much leisure time it will be hard to fill it all, because the work week will be hours not days, thanks to machines!"

Unless we get Star Trek-style replicators, the machine-made cheesecake will never be equal to 'the real thing' (and even in Trek, people still go out to restaurants where humans do the cooking). There's even an entire YouTube channel with different levels of chefs making different dishes - here's one for cheesecakes. This is why people pay different prices for different levels of cooking - you don't expect premium prices for fast food burgers, and you expect a higher level of quality if ordering a steak in a fancy restaurant.

I believe there is an American expression, used pejoratively, about "whitebread" or "Wonder bread", deriving from commercially produced sliced white bread loaves filled with flour improvers and preservatives to enable them to remain soft and long-life. Now this is decried as spongy, tasteless and inferior. These were created thanks to the [Chorleywood process](https://en.wikipedia.org/wiki/Chorleywood_bread_process), which gave rise to the expression "the best thing since sliced bread", since the innovation was new and remarkable and consumers loved the product.

But I think anyone will agree that the commercial sliced pan loaf is not as tasty as the bakery loaf, even if it lasts longer and is ready-sliced. I use both, though I prefer batch bread. The constant 'improvements' turned out not to be improvements, but they were certainly an economically superior process. The end product suffered. That is the fear around AI art.

Wonder Bread is a real brand and product that exists and is popular, not a pejorative. I prefer it for some types of sandwiches, like a PB&J, and not for others. The world is richer for the existence of Wonder Bread.

Same for the artists: they can draw their art and throw it in the trash, it's the process that matters, right?

I think there have been instances of artists destroying their work or making art that essentially self-destructs (e.g. KLF, that one MMO art game where the game would shut down if people killed each other enough times or something, I think there was an installation piece that would beat itself apart), so really, they're kind of ahead of you on that.

My point was that if the process is what matters, there is nothing to be afraid of. No one can steal that from you.

I too am a programmer, and am horrified by Copilot and friends. I write code to solve problems and release it under copyleft so that people can modify it for their own ends and share alike. I don't release it for it to be bundled up into some training set for a system that will accelerate the generation of non-free software.

Whatever an artist's goal in developing a skill, I think it's fair for him to be utterly crushed at the thought of his artistic career and personal style being reduced to a "by artist X" prompt to an image generator.

Sure, if they were honest and just said "I'm not pleased about losing my job", everyone would be sympathetic to them.

Instead they piss and shit all over the place about how their jobs are divine edicts from God and simulacra are demonic.

But part of it really is that making art is a very human thing to do, from the earliest records we have of humans, and mechanising it away with AI feels like chopping out part of the human experience. It isn't like "a better way to make cheesecake", where the AI is churning out industrial-recipe amounts in an industrial process. It's reducing creativity and imagination to a set of standard tropes for lowest common denominator appeal, like the production line of Marvel movies which, I think, people are beginning to get tired of because it's all too much and too much the same: just slot in a new comic book character and sprinkle in explosions and fight scenes. A formula that gets over-used no longer works, because it's tedious. You've seen the same thing sixteen times before, why go see this particular one?

A lot of the complaining is taking themselves too seriously, but it's not merely about losing a job. It makes people feel replaceable, and in something that was considered to be uniquely human. Maybe a robot could replace you as a worker on an automobile assembly line, but as an artist? How would you feel to be totally replaced as a programmer, and whatever you might produce would be regarded as amateur hobbyist stuff, "that's nice dear", but everyone knows real coding is done by AI. Your occupation would be gone, and if this is something you do because you love this stuff, and not just as "well I gotta do something to make a living", wouldn't you feel lost and valueless?

It isn't like "a better way to make cheesecake", where the AI is churning out industrial-recipe amounts in an industrial process. It's reducing creativity and imagination to a set of standard tropes for lowest common denominator appeal, like the production line of Marvel movies which, I think, people are beginning to get tired of because it's all too much and too the same

(Emphasis added). I'm not sure where the bolded part came from. What reason is there to believe that AIs would reduce creativity or imagination to a set of standard tropes for lowest common denominator appeal? Nothing about the actual process of the creation of art by AI would imply that. If we look at the usage of AI in other fields like, say, go or chess, AI has been known to display creativity far beyond what the best humans have come up with.