
Culture War Roundup for the week of May 1, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Once again it is unclear to me how this is very different from a human who reads a bunch of scripts/novels/poems and then produces something similar to what he studied.

There’s a lot of different ways you could look at it, but I think I might just say that the principle of “if you use someone else’s work to build a machine that replaces their job, then you have a responsibility to compensate that person” just seems axiomatic to me. To say that the original writers/artists/etc are owed nothing, even though the AI literally could not exist without them, is just blatantly unfair.

Is it not different from the early factory laborers building the machines that would replace them? Or maybe, more aptly, the carriage companies that supplied the Ford factories? They were paid fairly enough for that production. That was the narrow agreement - not that no one else could be inspired by it or build an inspiration machine. To be replaced by people who were inspired by your works is the fate of every artist in history, at least those who didn't wallow in obscurity.

Is it not different from the early factory laborers building the machines that would replace them?

They consented and were paid. It's not analogous at all.

They produced the media, which is being consumed and paid for under the current payment model. They are being compensated for it regularly and under all the fair and proper agreements. The AI is trained on the general inspiration in the air, which is also where the artists pulled their own works from. It's a common resource emitted by everyone in the culture to some degree. The Disney corporation did not invent its stories from whole cloth; it took latent myths and old traditional tales from us, packaged them for us, and the ideas return to us. Now we're going to have a tool that can also tap this common vein, and more equitably? This is democratization. This is equality. This is progress.

Last week I not so popularly defended copyright, and I still believe it's the best compromise available to us. But it doesn't exist because of a fundamental right, it exists because it solves a problem we have with incentivizing the upfront cost of creating media. If these costs can be removed from the equation then the balance shifts.

Last week I not so popularly defended copyright, and I still believe it's the best compromise available to us. But it doesn't exist because of a fundamental right

How do you feel about software license agreements? Plenty of software/code is publicly visible on the internet and can be downloaded for free, but it's accompanied by complex licensing terms that state what you can and can't do with it, and you agree to those terms just by downloading the software. Do you think that license agreements are just nonsense? That once something is out on the internet, no one can tell you what you can and can't do with it?

If you think that once a sequence of bits is out there, it's out there, and anyone can do anything with it, then it would follow that you wouldn't see anything wrong with AI training as well.

In the case of images used to train image-generation AI, I don't think they were published on the internet with license agreements stipulating limitations on what people could do with the images after downloading them. IANAL, but I think if you publish an image online and don't gate it in some way, like behind a password login and license agreement, you get nothing beyond the default basic legal protections for how people use it, i.e. not publishing a copy or derivative work and whatnot.

I don't think they were published on the internet with license agreements stipulating limitations on what people could do with the images after downloading them.

Because no artist knew what ML training even was until early-ish 2022.

If you think it's possible for an artist to have any rights at all over whether their work is used for AI training or not, then clearly they should be able to seek redress for images that have already been used without their permission. Trying to get off on a technicality of "aww shucks, maybe if you had published your work back in 2015 along with an explicit notice that said 'you do not have permission to use this image to train any AI models' then you would have a case, but, you didn't, so too bad..." is just silly.

I'm sure there's a Latin legal phrase that translates to "Bruh. Come on." and I would cite that principle as my justification.

But obviously if you don't think that artists have any right even going forward to control the use of their work for training then this point will be irrelevant for you.

If you think it's possible for an artist to have any rights at all over whether their work is used for AI training or not, then clearly they should be able to seek redress for images that have already been used without their permission. Trying to get off on a technicality of "aww shucks, maybe if you had published your work back in 2015 along with an explicit notice that said 'you do not have permission to use this image to train any AI models' then you would have a case, but, you didn't, so too bad..." is just silly.

I think it's possible for an artist to have such rights, but IANAL, and that's ultimately a question for the courts to decide. But in any case, no artist would have had to predict the advent of machine learning if they didn't want their images to become part of, to quote aqouta, the "general inspiration in the air"; they could have just gated their images behind license agreements that included standard boilerplate about how anyone who downloads the images is granted only the rights to do XYZ, all other rights reserved. Again, IANAL, and I don't know whether the courts would see their lack of such gating and license agreements as giving viewers implicit permission to use the images to train ML software, the way it gives viewers other implicit permissions. Honestly, all this is so new that I'm not sure even a lawyer could make a meaningful prediction on how things will go.