
Culture War Roundup for the week of December 19, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Really enjoyed these posts, two comments I'd add.

On the copyright side, I think it makes sense that the output of AI art generators can't be copyrighted. At least, as long as the use of copyrighted art to train an AI model isn't copyright infringement (I think it would pretty clearly be fair use currently). Otherwise you could do something like:

1. Find an artist whose style you like

2. Train an AI art generator on that artist's works

3. Produce new works in that same style, whose copyright you own but the original author doesn't

That seems problematic to me. Especially since if you had spent time learning to produce art in that same artist's style without the AI, it could be a copyright violation. Laundering copyright violations through an AI seems like a problem to me.
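
For concreteness, here is roughly what steps 2-3 reduce to in practice once somebody has fine-tuned a diffusion model on one artist's catalogue. This is only a minimal sketch using the Hugging Face diffusers library; the checkpoint directory and the style token in the prompt are made-up placeholders, and the fine-tune itself would be done separately (e.g. with the DreamBooth or LoRA example scripts that ship with diffusers).

```python
# Minimal sketch of generating "new works in that artist's style" from a
# hypothetical fine-tuned checkpoint. Nothing here is a real model path.
import torch
from diffusers import StableDiffusionPipeline

# Assumption: "./finetuned-artist-style" is the output directory of a prior
# fine-tuning run on the target artist's works.
pipe = StableDiffusionPipeline.from_pretrained(
    "./finetuned-artist-style",
    torch_dtype=torch.float16,
).to("cuda")

# "<artist-token>" stands in for whatever token the fine-tune bound the style to.
# Whether anyone holds copyright in the resulting image is the open question above.
image = pipe("a lighthouse at dusk, in the style of <artist-token>").images[0]
image.save("new_work_in_borrowed_style.png")
```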

On the legal front, I can't believe anyone is surprised by this arising as an issue. I remember when AI Dungeon was new and it got used so often for the production of NSFW content that the model started to produce it in response to ordinary queries, eventually leading the developers to make some changes to the model. The legal questions also seem complex. If I generate photo-realistic CP of a child who does not actually exist, is that a crime? Does it generate liability? Just for the individual actually producing or possessing the image, or also for the model developers? What if I create nudes of a celebrity? Would that be a tort? Maybe related to the use of likeness or image? What about various states' revenge porn laws?

These questions do not have obvious answers to me and I understand why no one wants to be the first to find out!

  1. Find an artist whose style you like
  2. Train an AI art generator on that artist's works
  3. Produce new works in that same style, whose copyright you own but the original author doesn't

That seems problematic to me. Especially since if you had spent time learning to produce art in that same artist's style without the AI, it could be a copyright violation.

I don't think it could, since styles can't be copyrighted. At most, it could be a trademark violation, if the artist in question has their style trademarked in some way. But in that case, the AI can't help you launder that, because it doesn't matter how you copied someone else's trademark, just that you did it. If you put a certain shade of red on the bottom of your shoes and sell them, that's a trademark violation whether you did it intentionally with malicious intent, or whether you did it by randomly placing shoe parts and paint into a duffel bag, shaking them up, and having shoes with that particular shade of red on the bottom pop out by pure chance.

IANAL though, so someone please correct me if I'm wrong.

Especially since if you had spent time learning to produce art in that same artist's style without the AI, it could be a copyright violation.

I don't understand this; it is my understanding that this absolutely would NOT be a copyright violation. A style cannot be copyrighted, AFAIK, and the styles of influential artists have been copied forever (indeed, that is what it means to be an "influential artist"). Can you elaborate?

If I generate photo-realistic CP of a child who does not actually exist, is that a crime?

Not in the US but possibly elsewhere. Unless of course the image is obscene or somehow is unprotected and illegal speech in another way, which is unlikely.

If I generate photo-realistic CP of a child who does not actually exist, is that a crime?

I'm going to bite that bullet and say 'yes'. The defence there is "but it's not a real child". However, the impetus is "I want to fuck a real child, but since I can't do that without being thrown in jail, this is the next-best thing". Or else "I don't want to fuck kids, but I'm happy to produce art for the sickos who do and take their money".

Since the consumer of child porn most likely would fuck a child if they could manage it, that is indicative of a desire to commit a crime (as for all the MAPs who say "but I don't want to do anything to a real child, I'm just romantically/sexually attracted without that being my will", if you're consuming child porn, yeah, that argument doesn't hold too much water). Getting child porn of real children being raped and abused is not a victimless crime. Moving it one step up, 'this is photo-realistic so it looks like a real child but is computer-generated' is only a fig leaf. Since you can't fuck a kid without getting into trouble, and you can't have porn of real kids being really fucked without getting in trouble, you're settling for the next best thing.

Since you can't fuck a kid without getting into trouble, and you can't have porn of real kids being really fucked without getting in trouble, you're settling for the next best thing.

You can't tie up a non-consenting woman and have sex with her, so doing the same to a consenting woman who is pretending to be non-consenting is settling for the next best thing (and must be illegal or wrong). Spot the flaw?

I'm going to bite that bullet and say 'yes'. The defence there is "but it's not a real child". However, the impetus is "I want to fuck a real child, but since I can't do that without being thrown in jail, this is the next-best thing". Or else "I don't want to fuck kids, but I'm happy to produce art for the sickos who do and take their money".

Since the consumer of child porn most likely would fuck a child if they could manage it, that is indicative of a desire to commit a crime (as for all the MAPs who say "but I don't want to do anything to a real child, I'm just romantically/sexually attracted without that being my will", if you're consuming child porn, yeah, that argument doesn't hold too much water).

Presuming all this, does it also follow that photorealistic first person shooter games, if they eventually become possible, ought to be illegal? Or rather, a video playthrough of a photorealistic first person shooter game where the player murders innocent bystanders. Of course, there are plenty of reasons to want to watch a video of a photorealistic first person shooter game other than wanting to live out the fantasy of what's depicted in the video game but lacking the legal ability to do so, but those reasons can apply to fictional photorealistic CP as well.

In the US, it is not illegal to merely desire to commit a crime.

Punishment for a status is particularly obnoxious, and in many instances can reasonably be called cruel and unusual, because it involves punishment for a mere propensity, a desire to commit an offense; the mental element is not simply one part of the crime but may constitute all of it. This is a situation universally sought to be avoided in our criminal law; the fundamental requirement that some action be proved is solidly established even for offenses most heavily based on propensity, such as attempt, conspiracy, and recidivist crimes.[4] In fact, one eminent authority has found only one isolated instance, in all of Anglo-American jurisprudence, in which criminal responsibility was imposed in the absence of any act at all.

Powell v. Texas, 392 US 514, 543 (Black, J, concurring).

Besides, the only reason that non-obscene child porn is not protected by the First Amendment is that its production harms the child involved. New York v. Ferber, 458 U.S. 747 (1982). Hence, non-obscene depictions of children having sex that does not involve a real child are protected speech.

What is that one isolated instance?

That argument would also apply to someone writing the following erotic fanfiction: "two people have sex very erotically but without meeting legally relevant definitions of obscenity. One of them was only 17 years, 364 days, 23 hours, and 59 minutes old, you sick fuck" (check out my AO3 account and fanbox for more sexy action featuring minors).

Can that be made illegal too, since it betrays a desire to bang minors? In fact, that parenthetical comment could be prosecuted under current caselaw about "pandering" if we use your definition!

Can that be made illegal too, since it betrays a desire to bang minors?

For a start, having actual sex with a 17 year old is likely to be legal in many places. See https://en.wikipedia.org/wiki/Age_of_consent#/media/File:Age_of_Consent_-_Global.svg

But filming it, drawing it, or talking about it can still be illegal in many of the same places!

(Edit: or getting married before having sex, in those blue states that recently raised the marriage age to 18, but left the age of consent at 16 or lower!)

Unless of course the image is obscene

How could such an image possibly not be obscene?

Because in the United States, a work is obscene only if 1) the average person applying contemporary community standards would find the work, taken as a whole, appeals to the prurient interest; AND 2) the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; AND 3) the work, taken as a whole, lacks serious literary, artistic, political or scientific value.

In contrast, a work can be child pornography even if it is not obscene. So, a work which has substantial literary value, when taken as a whole, is not obscene, but might be child porn. Similarly, a work which does not depict sexual conduct (or excretion, as some courts have said) cannot be obscene, but it can nevertheless be child pornography, because "the legal definition of sexually explicit conduct [in the federal child porn statute] does not require that an image depict a child engaging in sexual activity. A picture of a naked child may constitute illegal child pornography if it is sufficiently sexually suggestive." See here. And see US v. Knox, 977 F.2d 815 (3d Cir. 1992) [Child porn conviction upheld where "[t]he tapes contained numerous vignettes of teenage and preteen females, between the ages of ten and seventeen, striking provocative poses for the camera. The children were obviously being directed by someone off-camera. All of the children wore bikini bathing suits, leotards, underwear or other abbreviated attire while they were being filmed. The government conceded that no child in the films was nude, and that the genitalia and pubic areas of the young girls were always concealed by an abbreviated article of clothing. The photographer would zoom in on the children's pubic and genital area and display a close-up view for an extended period of time . . . with the obvious intent to produce an image sexually arousing to pedophiles."].

Hence, many works can be child porn, yet not obscene.
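
To make the shape of the two tests explicit, here is a toy sketch (emphatically not legal advice): the Miller obscenity test is a conjunction of all three prongs, while the child pornography definition is an independent test that never asks about obscenity, which is exactly how a work ends up being child porn but not obscene. The boolean fields are hypothetical stand-ins for findings a fact-finder would make.

```python
# Toy illustration of the conjunctive Miller test vs. the separate
# child-pornography test. All field values are hypothetical judgments.
from dataclasses import dataclass

@dataclass
class Work:
    appeals_to_prurient_interest: bool             # Miller prong 1
    patently_offensive_sexual_conduct: bool        # Miller prong 2
    lacks_serious_value: bool                      # Miller prong 3
    sexually_suggestive_depiction_of_minor: bool   # Ferber/child-porn question

def is_obscene(w: Work) -> bool:
    # All three Miller prongs must be satisfied.
    return (w.appeals_to_prurient_interest
            and w.patently_offensive_sexual_conduct
            and w.lacks_serious_value)

def is_child_pornography(w: Work) -> bool:
    # Does not depend on obscenity at all.
    return w.sexually_suggestive_depiction_of_minor

# Example: serious value defeats obscenity (prong 3 fails),
# yet the work can still be child pornography.
knox_like = Work(True, False, False, True)
print(is_obscene(knox_like), is_child_pornography(knox_like))  # False True
```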

On the copyright side, I think it makes sense that the output of AI art generators can't be copyrighted. At least, as long as the use of copyrighted art to train an AI model isn't copyright infringement (I think it would pretty clearly be fair use currently)... 3. Produce new works in that same style, whose copyright you own but the original author doesn't

To an extent (and even more so where steps 1/2 are replaced by 'use img2img'), though the state of copyright for 'traditional' cloning of media kinda makes this a weird or awkward question. The United States doesn't have any case quite as close to the line as the UK's infamous Red Bus case, but the Korean War stamp is pretty close: there certainly are ways in which even traditionally-created 'art' can be so derivative that it is infringement, even if the processes used to make the piece would otherwise allow copyright.

But these standards are incredibly tight. I like to use Rafman and similar 'found/outsider' art in furry contexts, simply because their 'transformative' nature is often limited to filing off signatures, but for a mainstream one, this Warhol v. Goldsmith case may fall one way or the other... and you'd have to overtrain the everliving hell out of a diffuser to get something that narrowly replicative. Indeed, the same complaints (in addition to the juvenile nature of the 'joke') would apply to any diffuser that produced the same result as the art in Leibovitz v. Paramount (cw: artistic nudity, mpreg), which is a clearly settled case. Or for a more boring example, see Cariou v. Prince. There's a pretty wide variety of contexts where lifting and directly copying an original work, even without commenting on that original work, is still considered transformative use, and while AI art can fall short of that, it's not a unique tool in doing so (compare: literally any color filter), and even the most moderately useful models will not favor doing so normally.

((And this entire thing is statutory interpretation: Congress could theoretically change the whole approach overnight for better or wo- ha, sorry, can't keep a straight face; any changes would be a clusterfuck even if best-intended, and more likely it'd get written by Disney.))

I do think there are novel technical and social problems, though: img2img or overtuned models can launder art theft in ways that current perceptual-hash-and-search methods do not detect but that clearly would not meet even the low standards of Cariou; we might want to consider AI-gen stuff more inherently economic than traditional 'inspiration'; and ML spam is a novel danger to artist communication and coordination.
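
For anyone wondering what 'perceptual-hash-and-search' means concretely, here is a minimal sketch using the third-party imagehash package (the file names are hypothetical). Matching pipelines typically flag two images as copies when the Hamming distance between their hashes is small; an img2img derivative can land well outside that threshold while still reading as a near-copy to a human.

```python
# Sketch of a perceptual-hash comparison (pip install imagehash pillow).
# The two file names below are hypothetical stand-ins.
import imagehash
from PIL import Image

original = imagehash.phash(Image.open("original_artwork.png"))
derived = imagehash.phash(Image.open("img2img_output.png"))

# Subtracting two ImageHash objects gives the Hamming distance between the
# 64-bit hashes; small distances are conventionally treated as "same image".
distance = original - derived
print(f"Hamming distance: {distance}")

THRESHOLD = 10  # assumption: a fairly permissive match cutoff
if distance <= THRESHOLD:
    print("Flagged as a likely copy by hash search.")
else:
    print("Not flagged, even if a human would call it a copy.")
```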

These questions do not have obvious answers to me and I understand why no one wants to be the first to find out!

Yeah, it's perfectly reasonable that a company wanting to make AI art tools doesn't want to stick their foot into that bear trap; it'll be a huge resource drain away from their core mission, even in the optimistic case that they'd win every matter.

The trouble is that they don't really have the option of avoiding the question; they've just decided to let someone else answer it for them.

  1. Find an artist whose style you like

  2. Train an ~~AI art generator~~ low paid third world immigrant on that artist's works

  3. Produce new works in that same style, whose copyright you own but the original author doesn't

It's just labor saving. The fundamental question to me is pretty much the old pirate question. It's not like we figured out how to reliably monetize infinitely copyable art in the past and suddenly AI interrupted a stable equilibrium. We're on the eve of the collapse of the streaming model, which was reforged from the cable bundling model, which was reforged from the... We're not going to find an answer to this question that doesn't have dozens of problems at least as bad as this, and I'm still going to go on some torrent site and download whatever I want anyways.

If I generate photo-realistic CP of a child who does not actually exist, is that a crime?

In Canada, yes, and they don't even have to be photo-realistic. They don't even have to be photos. Text would also be illegal.

Wait, so if someone loaded this page in Canada they may be in legal trouble?

An underaged human copulates with an old adult human

163.1 (1) In this section, child pornography means

...

(c) any written material whose dominant characteristic is the description, for a sexual purpose, of sexual activity with a person under the age of eighteen years that would be an offence under this Act;

...

Making child pornography

(2) Every person who makes, prints, publishes or possesses for the purpose of publication any child pornography is guilty of an indictable offence and liable to imprisonment for a term of not more than 14 years and to a minimum punishment of imprisonment for a term of one year.

Distribution, etc. of child pornography

(3) Every person who transmits, makes available, distributes, sells, advertises, imports, exports or possesses for the purpose of transmission, making available, distribution, sale, advertising or exportation any child pornography is guilty of an indictable offence and liable to imprisonment for a term of not more than 14 years and to a minimum punishment of imprisonment for a term of one year.

Possession of child pornography

(4) Every person who possesses any child pornography is guilty of

(a) an indictable offence and is liable to imprisonment for a term of not more than 10 years and to a minimum punishment of imprisonment for a term of one year; or

(b) an offence punishable on summary conviction and is liable to imprisonment for a term of not more than two years less a day and to a minimum punishment of imprisonment for a term of six months.

Accessing child pornography

(4.1) Every person who accesses any child pornography is guilty of

(a) an indictable offence and is liable to imprisonment for a term of not more than 10 years and to a minimum punishment of imprisonment for a term of one year; or

(b) an offence punishable on summary conviction and is liable to imprisonment for a term of not more than two years less a day and to a minimum punishment of imprisonment for a term of six months.

Interpretation

(4.2) For the purposes of subsection (4.1), a person accesses child pornography who knowingly causes child pornography to be viewed by, or transmitted to, himself or herself.

https://laws-lois.justice.gc.ca/eng/acts/c-46/section-163.1.html

It looks like it's possible. I'm not a lawyer though.

The offence would be under subsection (4.1), but I think subsection (4.2) means that someone who did not know that line was there would not be guilty of an offence under subsection (4.1). But maybe he would be upon loading the page a second time. Maybe you could argue it isn't the dominant characteristic. It would seem absurd for someone to be so convicted, so I would be surprised if there isn't some reason this wouldn't be considered an offence.