
Culture War Roundup for the week of January 30, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Has this group had any discussion regarding AI use in pornography, specifically 'deepfakes'? It's come out recently that a major up-and-coming Twitch streamer, 'Atrioc' (who is recently married and ostensibly very pro-feminist; funny, but technically irrelevant to the matter at hand), had been viewing porn 'deepfakes' of multiple major female Twitch streamers, including possibly his best friend's girlfriend (last part unconfirmed but highly possible). He's come out with an apology and it's a whole thing, but I'm sure this community is more interested in the moral/ethical questions therein than in internet-celebrity drama, so I won't bore you with it.

The following are my perspectives on a few of the potential questions regarding 'deepfake' porn, and AI porn in general. I'd love to hear what other people think about these perspectives, because my thoughts are currently very incomplete on the issue.

First and foremost, I have a strong intuitive feeling that it is deeply wrong, perhaps tantamount to some form of sexual harassment/assault (though of course the non-violent sort), to make 'deepfake' pornography of other, non-consenting people. For example, taking a picture from a celebrity's Instagram and using AI to transform it into a high-fidelity (but technically fake) nude picture of them seems functionally the same as, e.g., spying on them in the shower or while they're changing, actions I think we can all agree would be some form of wrong or illegal sexual violation (or perhaps we can't? you tell me). The way I think about this is by considering that a strong enough AI program would be theoretically capable of using a clothed picture of someone to reconstruct the exact way they look naked, which would be quite literally equivalent to the aforementioned situation, or to looking at them with X-ray glasses, etc., which again (I think) we and most people agree would be wrong. And so less-powerful AI capable of doing something similar seems to be at least on that gradient of wrong, if not exactly as bad.

Furthermore, AI that actually transplants people's faces onto video depictions of sexual intercourse (which is ostensibly what 'Atrioc' was caught doing) seems worse, or maybe just bad in a different way. I don't have a similar thought experiment to justify why I feel that way, but the wrongness of it is my strong intuition nonetheless.

However, I can also sort of see the argument, at least abstractly, that it's a victimless crime. On the other extreme of the spectrum, fantasizing in one's own imagination about the way people look when they're naked, or how it might feel to have sex with them, is not only generally recognized as a very benign behavior, but is also known as something almost everyone does, men and women both. Sometimes people do this even completely unconsciously, e.g. in their dreams. And what's the difference between looking at a very (or fully) realistic recreation of the way someone might look with their clothes off, and using one's own imagination to do so? What if your imagination were very vivid, and you had seen many naked people before, thus making your training data very good, so that you could reasonably expect to make a relatively accurate recreation of the way someone looked while naked, only in your own mind's eye?

The thing is, acknowledging these potential similarities between an action I find morally acceptable and one I find morally wrong still doesn't make my intuition about the wrongness of 'deepfakes' any weaker. I feel like there must be something about it that I haven't considered yet, which is where I'm hoping you guys might have insight. The only distinction I've found somewhat convincing so far is that maybe the mass-distribution via the internet is what makes it wrong. In other words, I find it less wrong (but still somewhat wrong) to make a highly/fully realistic nude of someone and keep it entirely on one's own computer than to make such an image and then distribute it online. This is especially weird because the former is an even more apt comparison to, e.g., peeping on someone in the locker room, which is obviously (?) wrong. So why does it seem more okay to me? Help!

I have a few potential explanations that I'm considering as candidates for the source of my cognitive dissonance here:

  1. Perhaps in reality none of the aforementioned actions are wrong. It's not wrong to spy on someone in the locker room, and so it's not wrong to use 'x-ray glasses' to see through their clothes, or use an AI to edit a picture to do functionally the same thing.

  2. Perhaps instead, in reality it actually is wrong to imagine or fantasize about what other people look like while naked. The reason this is so commonly accepted as benign is that it's so unenforceable to prevent. But if sexual mores are so arbitrary/constructed that something that would otherwise be wrong can just be agreed upon as acceptable because it's unenforceable, how wrong can any ('victimless') violation of sexual mores really be said to be? And thus how wrong is the other situation, where one uses AI?

This kind of segues into 3., which is: perhaps in reality the ultimate cause of this dissonance is that modern-day sexual mores are completely stupid, so deeply incoherent that acceptance of any one of them will necessarily lead to cognitive dissonance when contrasted against some other. Is the solution to the 'deepfake' issue then to try and change our society's sexual morals/ethics into something more internally coherent?

None of these really address why I feel differently about 'turning a clothed photo into a nude' and 'transplanting, in a realistic way, a non-consenting individual's face onto an actor in a depiction of sexual intercourse.' I have no concrete ideas as to why the latter feels overall worse, but also in some other (minor) ways not as bad. And the latter situation is what the whole controversy with the streamer is all about, AFAIK. Very confused about all this.

What's right here, and why? What should even be done? Should 'deepfakes' be illegal because of these potential moral/ethical concerns? Should the act of making a deepfake be illegal, or just distributing it? (I think if we wanted to, we could make both of these things illegal. We might not be able to enforce preventing anyone from making them, considering the AI-cat is out of the bag, but it still might be worthwhile to have its illegality on the books if it really is wrong. In other circles I'm seeing claims that a ban would be unenforceable (motivated thinking?), but it seems trivially easy to functionally ban at least the distribution of 'deepfake' porn in a way that would almost certainly reduce the dissemination of such porn if not completely eliminate it, just as with e.g. child sexual abuse imagery or zoophilia porn.)

I also see a lot of people in other circles being prompted by this discussion to argue about the ethics of AI image generation in general. I generally think this is basically stupid. The arguments which claim that AI image generation is tantamount to plagiarism (of the dataset images, I suppose) are all basically worthless as far as I can tell. But people who have bought into this line of thinking are now going as far as to say that, e.g., photorealistic porn generated with AI (even porn depicting completely synthetic likenesses) is a sexual violation (of all the nude or semi-nude women in pictures in the dataset, I guess?). Either way, I am wholly unconvinced by these arguments and think they basically all stem from a bad understanding of how these AIs work, so I don't think I'm super interested in discussing this axis of the debate. But I mention it because this community sometimes surprises me, so if anyone here has a really strong argument as to why this might make sense that they think I haven't seen before, feel free to mention it.

Should 'deepfakes' be illegal because of these potential moral/ethical concerns? Should the act of making a deepfake be illegal, or just distributing it?

The fact that this is even a debate kind of sickens me. Only in this fallen modern era would it even be a question whether or not the discomfort of primarily women, along with their investment in the "sanctity" of their image (obviously highly debatable given how many of them choose to present themselves), takes precedence over the basic rights to freedom of expression, freedom to fantasize, freedom to create, etc. That discomfort has been elevated in the current era not according to any reason but only according to an emotional hysteria that has amplified itself almost into a religious fervor, driven primarily by the same women who benefit from it (or at least feel compelled to use it to pathologically seek power they can't even healthily enjoy, as an expression of sadistic bitterness over the modern widespread weakness and hence unattractiveness of the opposite sex) and by their male followers ("simps"), who have often become so emasculated that they no longer even try to conceive of much of a path to earning the favor of such women other than slavishly obeying them for temporary headpats.

The answer is obviously no. There is nothing about deepfaking that is categorically different from writing stories about someone, drawing/painting them (and even before the recent AI explosion, some people could make remarkably photorealistic artwork), Photoshopping their head onto someone else's body (which has been possible for decades, and it's worth noting that modern deepfakes are essentially just as primitive, only in motion: the most used method at the moment just involves finding a video of someone with a body that is hopefully a somewhat plausible match for the desired face and then inserting that face onto it), etc., and hopefully there is still a general agreement (not saying this to build consensus, just expressing what I have always understood the general consensus in the modern era to be) that anyone who wants to use the state monopoly on force against people who do these things, because the subjects of them might be made *uncomfortable*, is a totalitarian lunatic. (I remember that before JK Rowling was a villain for opposing alternative sex lifestyle roleplaying (not a sneer, just my attempt to more accurately describe the phenomenon of "transsexualism"), she was a villain to some for vociferously opposing Harry Potter fanfiction, with the (widely mocked) argument that Harry Potter sex stories, for example, would violate the rights of its movies' young actors by imposing their images on scenarios they didn't consent to.)

The whole "deepfaking" controversy is just using slightly new technology to launder into the public discourse the same old big brother bullshit that's been rightfully rejected many times before, except they may yet succeed this time (with "yet" being relative, as it's actually already illegal in a few states) because the rational faculties of their targets have been so broadly degraded and their discourse so thoroughly poisoned with mindless, kneejerk reactionary (which I've, ironically enough, almost always found those who are the most anti-reactionary in the political sense to be the most in the general sense) feminine emotionalism, safetyism, and exaggerated negative utilitarianism (so long as it's in favor of protecting the right demographics, the most sacred demographics, of course, as obviously this issue would not be one at all were men the primary subjects of discomfort here).

It is also quite ironic that it is mostly the side of people pretending to be highly opposed to (or seeking a severe contraction of) the carceral state pushing this. This is just more evidence to me of what has seemed obvious from the beginning: that these people are not against "crime and punishment" and "law and order" crackdowns harder than any they've ever bemoaned, just against the punishment of particular crimes they associate with their favored client demographics (particularly/only when committed by members of those demographics who are also in good ideological standing, but they can't quite say that so explicitly yet), and the opposite for their disfavored ones. In their ideal world, Kevin gets 20 years of hard time for putting Pokimane's face on Viper Vixxen or whoever (especially if he seems like a "chud", maybe less if he has a history of serving the regime loyally, in which case he may get to lessen his penalty via subjecting himself to a routine of humiliation and self-criticism), but Tyrone gets therapy and cookies for stabbing him to death. (And does anyone want to bet how much they'd push for women to get punished for deepfaking Bieber or the BTS boys? Of course most men are still not invested enough in their egos to be incapable of separating fiction from reality, so they're unlikely to care anyway.)

If they were really against the worst excesses of modern surveillance authoritarianism as they claim, the last thing they would do is try to invent a fifth horseman of the infocalypse to give glowies and spooks yet another reason to treat any bit flowing through any digital system as a possible if not likely piece of illegal contraband on the run, "justifying" even more invasions of technological/digital freedom and privacy. But this is again because they're not actually against hammers as they claim, only against them being used on certain nails. This is after all the side that invented the "No bad tactics, only bad targets" mantra.

I think part of this is because so many have forgotten what rights are, or at least what they should be in practice, that is, how they should function. They have fallen into the trap of thinking that, because "rights = good", rights can only protect behavior that is itself 100% good, squeaky clean, Reddit-certified Wholesome™ Mr. Rogers behavior (or at least what they see as that through their ideological lens), or at least not what they see as its opposite. That is how nonsense like "hate speech isn't free speech" spreads, even though such a statement is blatantly contradictory on its most basic semantic level. In actuality, as a loose heuristic, rights are more appropriately understood as restrictions on power (as they are formulated in the US Bill of Rights, for example).

Rights are rules where giving authorities the power to violate them would likely make those authorities shittier and more prone to causing problems/hurting more people than whatever problems they could solve by violating them. Rights are when giving authorities the power to search anyone's asshole at any time is worse than whatever people are smuggling in their assholes, thus we say "no searching assholes except in very strict, limited, and justified circumstances", thus "people have a general right not to have their assholes probed for contraband." This isn't based on any determination that most of what anyone is smuggling inside their asshole is any good; indeed most people who have to smuggle things in their asshole probably aren't smuggling much that is nice (depending on your stance on drug use anyway, though I'd say at least fentanyl, which is probably a very common asshole passenger nowadays, is close to objectively evil).

So to tie it back to deepfaking, the choice comes down to preventing women from occasionally feeling uncomfortable about fiction about themselves vs. trying to protect what's left of the chastity of all of our digital, informational, expressional, and private assholes. Again, I think only in modern femworld would this even be considered a choice worth pondering for more than a second.

Women's feelings are not god. They don't even warrant being taken that seriously in many cases (to be fair, the same is also true of men, though not as often, I don't think). That's really all that needs to be said about it. Sorry you're uncomfortable, ladies, but that doesn't mean that the entire boot stamping on the human face, for what seems increasingly like it might be forever, needs to be at your beck and call. (Of course me or anyone else saying this will accomplish nothing, at least in the short term, but the decay of society cannot be reversed until these ideas are fully absorbed by modern men.)

People find you attractive, including those whose attraction you might not reciprocate, which you know because even if your content is "SFW" you've built your whole career on it (and you've never objected to it when they were giving you money, which is also part of the issue here, as this style of deepfakes has been around for years but now many of these creators have Fanslys etc. and are making money off of them), among other things (like at least 70% of the reasonably attractive ones not infrequently walking around in public half naked nowadays). Get over it.

(I remember that before JK Rowling was a villain for opposing alternative sex lifestyle roleplaying (not a sneer, just my attempt to more accurately describe the phenomenon of "transsexualism"), she was a villain to some for vociferously opposing Harry Potter fanfiction, with the (widely mocked) argument that Harry Potter sex stories, for example, would violate the rights of its movies' young actors by imposing their images on scenarios they didn't consent to.)

You remember incorrectly. Rowling never opposed Harry Potter fan fiction, and in fact is one of the more pro-fan fiction authors out there. She did object to porn fics being available on sites predominantly visited by young fans, but afaik never took any kind of legal action.

As for your rant about deepfakes, I don't think they should be illegal, per se (I think specifically using them to perpetrate fraud or damage someone's reputation is, at the very least, cause for a civil action), but I also think people are entitled to demand sites remove deepfakes of themselves. Like, if you want to create your own personal wank material with Emma Watson, or your neighbor's daughter, keep it to yourself. What's in your head (or on your hard drive) is nobody else's business. Putting it in public is like telling your neighbor's daughter that you jack off thinking about her. If you make it public, you make it her business (and her father's, to put it in terms that you consider relevant).

but I also think people are entitled to demand sites remove deepfakes of themselves.

So you think it should be illegal if those sites don't?

Like, if you want to create your own personal wank material with Emma Watson, or your neighbor's daughter, keep it to yourself.

Nah. The right to share the products of one's fantasies, expressions, creativity, etc. is inherent in all of the associated rights.

What's in your head (or on your hard drive) is nobody else's business

Sure but it's my choice if I choose to make it their business in a particular context.

Putting it in public is like telling your neighbor's daughter that you jack off thinking about her. If you make it public, you make it her business (and her father's, to put it in terms that you consider relevant).

Maybe it becomes their business but that doesn't imply any obligation for the state to do anything on their behalf.

Anyway, I swear it was Rowling, but maybe it was Meyer or some other author in a similar position, or maybe it was just erotic fanfiction they were opposed to. (Actually, I think maybe the controversy was that Rowling disapproved and tried to take down fics with even small amounts of sexually suggestive content. I don't know. All I know is at least some fics were targeted by someone. In any case, the analogy stands even if the details aren't correct.)

Edit: I think I'm right about Rowling. Maybe she changed her mind over time but there's definitely a history of her targeting fan content:

https://old.reddit.com/r/harrypotter/comments/8nphgj/jk_rowling_vs_the_internet_a_history_of_harry/

J.K. Rowling comes out publicly in support of Harry Potter fanfiction online, but only on “her terms”.

This is not much different from the people we're discussing, who are fully supportive of AI-based image techniques, but only on their terms. So I think she's a good analogy here, especially since I again do recall some of the discourse being about how it violates the actors' image rights, since everyone inevitably associates their appearances with the characters now.

Maybe it becomes their business but that doesn't imply any obligation for the state to do anything on their behalf

Yes, it does, because the state has made it their business to prevent them from doing anything to protect themselves.

The morally correct response to someone telling your sixteen year old daughter that he enjoys thinking about her while jerking off is ‘if you ever speak to her again I will kill you’. The state has decided to ban this option, and so it is incumbent on the state to imprison (or otherwise deal with) people who justify that recourse. The debate is about where to draw the line, not about whether the state should be involved.

The morally correct response to someone telling your sixteen year old daughter that he enjoys thinking about her while jerking off is ‘if you ever speak to her again I will kill you’.

Maybe if you're a violent psycho who is a ticking timebomb waiting to go off, though in that case I'd rather the state move on you.

(By the way, if you feel this strongly about people not jacking off to your (hypothetical?) daughter, then I sure hope you're equally committed to keeping her completely modest in garb and demeanor. The moment you so much as let her walk around in front of other males in tight leggings (assuming she's attractive), all bets are off, whether they communicate that to you or not, if you want to appeal to some more traditional code of behavior. Many such daughters being jacked off to with their fathers unable to do anything about it other than seethe.)

The state has decided to ban this option, and so it is incumbent on the state to imprison (or otherwise deal with) people who justify that recourse.

Yeah, no. By this logic, it is incumbent on the state to imprison or otherwise deal with people chewing loudly because it has prevented me from simply murdering them. (You might say that chewing loudly could never possibly justify murder, but perhaps if you had dinner with some of my family members you might disagree.) That is nothing more than naked totalitarianism. (I don't actually support murdering or imprisoning people for chewing loudly of course. I am just pointing out that your argument is contingent on the notion that a particular behavior deserves a particular degree of punishment in the first place, which is obviously highly debatable. You're trying to launder in this premise as automatic.)

The debate is about where to draw the line, not about whether the state should be involved.

Yes, this applies so long as anything at all is illegal (like murder, which I'm pretty sure has been prohibited in some form in every society). It's also a meaningless statement.

Yes, I am aware that men think about women while masturbating, and that teenaged girls are attractive to the opposite sex.

Informing a woman or girl you’ve masturbated while thinking about her is creepy behavior* that will foreseeably be received as a threat, and there’s no possible reason to engage in it. Behaving in a sexually threatening manner towards women and girls justifies lethal violence from the men responsible for them. It’s been that way since time immemorial, and the only exception has been if they’re just whores who forfeited their right to male protection (which was not the topic up for discussion). Things which are threatening are not the same as things which are merely annoying. Women have a right not to hear implied rape threats and their husbands and fathers have a right to police the things said to them.

*unless you’re in a relationship where she’s into that, I suppose, but I’m not talking about Reddit sex positive weirdos here.

that will foreseeably be received as a threat

That depends a decent amount on the context.

Behaving in a sexually threatening manner towards women and girls justifies lethal violence from the men responsible for them.

Maybe, if being "responsible" for them also means they have complete and absolute just and proper property rights and masculine dominion over them (which is also how it's been "since time immemorial"). Otherwise they are merely simping to some degree. The natural price of masculinity taking responsibility for the feminine is the feminine's complete and absolute obedience in return. So if you are not advocating for this then you are simply advancing cuckoldry under the guise of chivalry (which I suspect because you're framing the issue here as an injury to the female as opposed to her owner).

if they’re just whores who forfeited their right to male protection (which was not the topic up for discussion)

That's like at least 97% of modern women/girls over the age of 13 or so though, so I kind of think it's implicitly up for discussion. The actual society we live in is not the one you're describing.

Women have a right not to hear implied rape threats

If we're talking ideal ideal world (obviously my opinion is influenced by my ideological presumptions here, though I think it's a lot more traditional), men have a right to not hear implied threats against their exclusive use of their property, and women have very few to no rights. Again, the injury is to the man (hence why "rape" evolved as a synonym for "steal", because it's stealing another man's property). But even then, I think in most cases going to the absolute extreme over someone saying they find your property attractive is a little much. If somebody said they liked my car, I wouldn't automatically, in all circumstances, threaten them like they were threatening to steal it.

unless you’re in a relationship where she’s into that, I suppose, but I’m not talking about Reddit sex positive weirdos here.

The fact that you think relationships where the girl finds her partner sexually attractive enough to enjoy the idea of him wanting to masturbate to her is the domain of "Reddit sex positive weirdos" says a lot here.

My entire point is that we are not living in the kind of society you’re imagining, we’re living in a society where the state takes on the function of protecting women from sexual violence and predation. And the state, if it’s going to take on that function, has the responsibility to actually do that. Which in turn means that it needs to protect the privacy of the nude bodies of non-sex workers (and no, wearing a bikini does not make you a sex worker, and I say that as someone who does not approve of bikinis).


So you think it should be illegal if those sites don't?

Yes. Just like if you publish a libelous article about me I can demand the site take it down. They can refuse, of course, and then I can sue, but the end result is, theoretically, men with guns enforcing the law, yes.

Maybe it becomes their business but that doesn't imply any obligation for the state to do anything on their behalf.

Okay. If you tell someone that you enjoy jerking off while thinking about his underage daughter, no, he can't have you arrested for it.

People who expect no consequences for anything they say or do as long as it isn't actually illegal tend not to like the consequences and then suddenly become very interested in legal protections.

Edit: I think I'm right about Rowling.

You're not. You said she was opposed to fan fiction. She's not. She was opposed to minors being exposed to porn. You might disagree with her wanting to impose conditions on writing fan fiction, but that doesn't make her opposed to fan fiction (and in practice, she's never done much to enforce her terms except on large commercial sites).

Technically, fan fiction is still at best quasi-legal, and authors who are actually anti-fan fiction can and do force sites to remove fan fiction of their works entirely. Rowling could, if she wanted to, go after the many sites that do host sexually explicit HP fan fiction, but she hasn't.

This is not much different from the people we're discussing, who are fully supportive of AI-based image techniques, but only on their terms.

It's quite different.

So I think she's a good analogy here

It's not.

Just like if you publish a libelous article about me I can demand the site take it down.

It's not libelous if it's not presented as true. If I write a fictional story about you raping an eight year old that is explicitly presented as fictional, then as much as that may disturb you, you can't do shit.

People who expect no consequences for anything they say or do as long as it isn't actually illegal tend not to like the consequences

This is random pseudo-macho posturing that's irrelevant to the argument. But yeah in any case I will definitely take my chances with your average weak modern Reddit heckin' dad versus the state. (I mean even if I did deepfake someone's daughter, which I wouldn't at all especially now since I think current deepfakes are primitive and cringe, I'm not exactly going to go telling them about it, since, yes, legal or not that's pretty shitty or at least dumb etiquette, and if I shared it online I'd do so anonymously, but still. I've masturbated to a lot of people's daughters and I'm pretty sure none of them know anything about it except maybe the dads of the girls I've openly dated, though even that's not many because I have a weakness for fatherless girls.)

and then suddenly become very interested in legal protections.

...Is it supposed to be some sort of flaw in my argument or "gotcha" that I am very interested in legal protections... for that which I think should be legal? That's kind of the point, yes.

You said she was opposed to fan fiction. She's not.

I mean, as the link shows you, many actual HP fanfiction authors disagree. Many people also disagree that Dianne Feinstein is opposed to guns (after all, you can disagree with her wanting to impose conditions on them, but that doesn't make her opposed to them, right?). Unless you can explain exactly what's "quite different" here and thus wrong about my analogy, I think the whole debate is a pointless back-and-forth of semantic vagueness.

Again, I'm talking explicitly about the "You can't write a sex story about Hermione, because when people think of Hermione they think of Emma Watson's image, and Emma Watson didn't consent to have her image in your sex fantasies." argument I've seen about erotic HP fanfiction. (I'm not saying Rowling made this exact argument directly herself. She was just a convenient segue.)

It's not libelous if it's not presented as true. If I write a fictional story about you raping an eight year old that is explicitly presented as fictional, then as much as that may disturb you, you can't do shit.

This does not appear to be correct:

"For example, in 2009, in the “Red Hat Club” case, the plaintiff was awarded $100,000 in damages by a Georgia court for a fictional portrayal modeled on her. The “original” claimed that her fictional counterpart, falsely depicted in the bestselling novel as a sexually promiscuous alcoholic who drank on the job, defamed her. From a libel defense perspective, this drawn-from-life portrayal failed, in part, because the author included personal characteristics that made the plaintiff recognizable, and mixed them with other traits that were false and defamatory, but, still believable."

If you can successfully sue because you were portrayed as a slutty drunk in a work of fiction based on you, I suspect you may be able to sue for being portrayed as a child rapist.

It's not a slam dunk, and such suits often fail, depending on how recognizable the fictional version is as the original, but it does appear that you can indeed "do shit."

Fair. I'd be interested in seeing what some sort of disclaimer specifically targeting this achieves though, something like "X is quite obviously not a child rapist. There is no evidence that X is actually a child rapist nor is it believable based on all known information about them that they could be one." etc.

I mean, that's one case. There's also that meme about Glenn Beck, and the guy who registered a domain for it won his case.

Plus, I don't think this can apply to deepfakes. If I write a fictional story about you doing X, then perhaps that can come with some implication that it's some veiled satire suggesting you might actually do it. But if I make a deepfake, I mean it's in the name. It's fake. It is very clearly not you doing it.

Sure, it is entirely unclear how it would work with deepfakes, if at all. But the deepfake is presumably recognizable as the subject (as that is the entire point), so you could perhaps get away with deepfaking them having sex, as this is something they likely do. If you deepfaked them onto child pornography, that might trigger something similar. That is highly speculative, though. I suspect different courts and jurisdictions will go different ways.

Link to quoted article

A different article says of the same lawsuit:

Thursday’s legal victory for Stewart [the "victim"] was tempered, however, by a relatively small $100,000 award for damages, with the jury refusing to award her attorneys fees. The cost of litigating the case over the last five years likely exceeded $100,000.

Indeed, hence why it is not a slam dunk, but it is something.

It's not libelous if it's not presented as true. If I write a fictional story about you raping an eight year old that is explicitly presented as fictional, then as much as that may disturb you, you can't do shit.

I could probably sue on the basis that it causes me reputational harm, though my understanding of the law is that I'd have a hard time establishing actual damages.

People are coming up with all kinds of other scenarios, about Photoshopping a dick into someone's mouth or creating a deepfake of someone being raped and tortured, and not all of those things are illegal. I don't necessarily agree that none of them should be.

I'm in favor of enabling subjects of deepfakes to issue takedown demands, though enforcement will be very difficult in practice.

This is random pseudo-macho posturing that's irrelevant to the argument. But yeah in any case I will definitely take my chances with your average weak modern Reddit heckin' dad versus the state.

It's not macho posturing - I said you should keep your sexual fantasies about your neighbor's underage daughter to yourself and not share them with her or her father, and your response is "Nuh uh, it's not illegal!" I mean, sure, not everything that is wrong and unethical is illegal, and you certainly can go around telling everyone about your sexual fantasies. Reverting to "It's not illegal" when discussing ethics is a dodge.

I mean, as the link shows you, many actual HP fanfiction authors disagree.

No, they think she was trying to impose conditions they didn't like. Also, that link is four years old and references events going back much further, in the early days of online fan fiction.

The vast amount of HP fan fiction that Rowling has tacitly (and in some cases, explicitly) approved disproves your entire argument.

Many people also disagree that Dianne Feinstein is opposed to guns (after all, you can disagree with her wanting to impose conditions on them, but that doesn't make her opposed to them, right?).

I actually doubt that many people disagree with that. But if you said "Dianne Feinstein wants all guns to be illegal for everyone, period," that's a claim that may or may not be true (who knows what she really believes?) but it's not supported by any actual words or policies from her. If you said "She wants heavy restrictions on guns," that's obviously true. But "JK Rowling opposes certain kinds of fan fiction in certain contexts" is not the same as "JK Rowling is anti-fan fiction." It's not even a good parallel with your Feinstein analogy.

Again, I'm talking explicitly about the "You can't write a sex story about Hermione, because when people think of Hermione they think of Emma Watson's image, and Emma Watson didn't consent to have her image in your sex fantasies." argument I've seen about erotic HP fanfiction.

That's more an ethical argument than a legal one. Emma Watson would not have legal standing to demand that erotic fiction about Hermione (or about Emma Watson - RPF exists) be removed. But Rowling could demand that the former (though not the latter) be taken down, on the basis that fan fiction is, as I said, at least currently considered an IP violation, though this hasn't really been tested in court.

I could probably sue on the basis that it causes me reputational harm

How would it cause you reputational harm? If anything you'd be the victim of malicious fiction, and victimhood is a reputational benefit nowadays.

I don't necessarily agree that none of them should be.

Well I disagree. Fiction is fiction and thus automatically possesses a rightful presumption of being implicitly harmless (as it is, quite literally, unlike most important harms, not tangibly real) absent a more pressing justification than someone's discomfort over their depiction, whether it's because it's extra realistic looking (but again, still not actually tangibly real) or not.

I mean this is of course speaking in terms of abstract ideal legal policy. Strategically speaking, if you want to make deepfakes illegal in any sense and force those who want to see fake Pokimane or fake Emma Watson getting railed into the depths of the darknet where stuff like child porn also circulates, thus strengthening the entire enterprise of private and anonymous content contribution opposing the unjust power of the modern digital hegemony, then that's probably a win for people like me.

I said you should keep your sexual fantasies about your neighbor's underage daughter to yourself and not share them with her or her father, and your response is "Nuh uh, it's not illegal!"

Uhh no. My response actually was:

Maybe it becomes their business but that doesn't imply any obligation for the state to do anything on their behalf.

"Maybe it becomes their business" doesn't in any sense imply some overall objection to the principle of generally keeping such things to yourself on etiquette/behavioral grounds, and simply advancing the viewpoint that a certain behavior is not a concern of any formal power is not some blanket approval of it as ideal behavior in all contexts (many such cases of people unfortunately believing the opposite nowadays though). "That doesn't imply any obligation for the state to do anything on their behalf" is in fact some of the weakest commentary on a behavior you can give, other than again through the flawed modern lens that so frequently crops up where if you're not advocating calling the SWAT team in on something then you must be its biggest cheerleader or at least trying to excuse it.

Reverting to "It's not illegal" when discussing ethics is a dodge.

But we're not just discussing only ethics, unless the only compulsion you're advocating for being behind those takedown requests is that it'd be the right thing to do.

No, they think she was trying to impose conditions they didn't like.

And the Reddit admins were only trying to impose conditions on your subreddit that you didn't like.

Also, that link is four years old and references events going back much further, in the early days of online fan fiction.

And? My whole original point is that JKR was criticized for opposing fanfiction primarily in the past.

(Anyway I'm just going to ignore the rest of the stuff about whether it's reasonable to say that JKR opposes or ever opposed fanfiction or not since it's completely tangential and I'm not in an autistic enough mood today (which is not to say I never am) to dive into this conversation spiraling into dozens of tendrils of barely relevant side disputes (and I'm not saying you were the only one engaging in this up until this point by any means).)

That's more an ethical argument than a legal one.

Yes. And it's a bad one in my view. And it's similarly a bad one for Emma Watson/Hermione deepfakes too.

But Rowling could demand that the former (though not the latter) be taken down, on the basis that fan fiction is, as I said, at least currently considered an IP violation, though this hasn't really been tested in court.

And I also oppose this, though debating the validity of IP law is again mostly a whole other subject.

Again, to me, the central dispute is whether openly, explicitly fictional content (again, if it's lying about being fictional, then that gets into the realm of fraud which is a whole other matter) should be prohibited because it makes its subjects uncomfortable or feel "violated" or however it's formulated (as I'm not seeing any other real justifications being advanced). I say no.

What's in your head (or on your hard drive) is nobody else's business. Putting it in public is like telling your neighbor's daughter that you jack off thinking about her. If you make it public, you make it her business.

I'm amazed at how succinctly this delineates ethical boundaries that appear basically airtight to my intuition (my intuition being where the problem lay in the first place). I'd go as far as to say that this essentially resolves the topic for me.

Yes, I think when it's distributed / public there are interesting ethical questions, but inside people's heads it seems entirely their business, and no moral issues whatsoever. (The only hints of moral issues are that it increases the chances of it making it outside of their heads).

takes precedence over the basic rights to freedom of expression, freedom to fantasize, freedom to create, etc.

This has a wider application than just porn. Someone makes a deepfake of you doing something illegal, are you going to be so gung-ho about the freedom to fantasise and create?

Most systems, including the US, draw a distinction between libel/slander and protected free speech. If someone makes libelous videos of me, and shares them around, then obviously I would have objections and would have legal recourse against that person.

I think at least some instances of deepfake porn would fall under libel laws, but not all of them.

Sure, as long as they're not trying to represent it as real (and if they don't the crime there is various forms of fraud, misrepresentation, false reporting, etc., not anything to do with fantasizing or creating, as it'd be essentially the same crime as if they just claimed I had done those things without the added fake evidence, again supporting my claim that not much if anything you can do with deepfakes is categorically different than anything you could before them). They can fantasize all they want about me being a bank robber or serial killer. I'll be flattered that they think me specifically being the criminal in their fantasies makes the scenario cooler.

Women's feelings are not god.

I can maybe accept most of what you're saying specifically in regards to how it answers the question: 'should these images be illegal?' Perhaps it is a violation of the principle of free speech to make things like this illegal, or a slippery privacy slope, or a needless concession to emotional people, etc. That being said, whether or not it should be made illegal, I expect that it will become legally riskier if not fully illegal to do this kind of thing in the near future. But I digress.

Many others in the thread are also focusing specifically on the legality dimension, which I regret and for which I bear responsibility. I was generally more interested in talking about what is moral/ethical, and less so in talking about what is and should be legal, even though I indeed asked as much in the body of my post. Even if these things are not illegal, the morality of them is still hugely important, as it determines who gets 'cancelled' etc.

And to that end, in figuring out what is ethical or moral, I think feelings do matter. For example, I think it would be immoral in many situations to do an action that I knew would make a person "feel bad" just because doing that action gave me sexual gratification, legality of the action notwithstanding. If I was trying to design the fairest US government, I might not make deepfake porn illegal. But if I was trying to be as morally/ethically upstanding of a person as I could be, there are plenty things I shouldn't do that are still legal.

I'm of the relatively firm belief that it isn't immoral to fantasize about having sex with someone, even if they haven't consented to you having such a fantasy. I'm not sure what I think when it comes to making highly realistic porn of them. If you were Superman and had X-ray vision, would it be unethical or immoral to look into the women's locker room? If not, why does everyone seem to think it would be? If so, what's the difference between that and having a vivid, realistic imagination and using it for sexual purposes in the same way?

Another commenter prompted me to believe that a significant amount of how unethical it is lies in whether or not the deepfaked person knows about the porn being made of them, because knowing that it exists is what inflicts psychological harm on them. I think I agree about this. However, the women in the shower into which you're peeping might not ever know that you've peeped at them, so is it not wrong to be a peeping tom (as long as you never get caught)? Teen coming-of-age movies from the 80s didn't seem to think so (the shift in attitudes between then and now might be pertinent to the discussion). Regardless, I currently do think that, e.g., spying on the women's locker room would be unethical, and I think most people today would agree that it's probably wrong to do such a thing. This is the angle that I'm really trying to disentangle here, the moral and ethical angle, and less so the legal one.

And to that end, in figuring out what is ethical or moral, I think feelings do matter

OP didn't say "feelings don't matter". They said "women's feelings aren't God" i.e. are not the sole, overriding consideration in ethical disputes.

I'm of the relatively firm belief that it isn't immoral to fantasize about having sex with someone, even if they haven't consented to you having such a fantasy

Case in point: some women apparently dislike being "objectified". I don't really care tbh. What goes on in my skull is my business.

If not, why does everyone seem to think it would be?

Because it is a violation of actual privacy: the actual woman is in that room, with a reasonable expectation of privacy, and you are peeking in. Even if it weren't sexual, there are all sorts of other concerns with such snooping (e.g., can they steal your stuff now that they've seen your locker code?).

With deepfakes I guess it depends on how much verisimilitude something can have before you think it violates your "actual" privacy. If I have a deepfake of Angelina Jolie that, for whatever reason, has serious flaws and inaccuracies have I violated her privacy in the same way? That isn't the real Jolie, it's a virtual image that isn't even perfectly accurate.

What if it was trained on topless images of Angelina and perfectly matched her in her physical prime? I think an argument could be made that she removed privacy here herself, in a way she can't expect to get back (we can't unsee her body either way)

I don't think we have an easy rule. I also don't know that this can/should be grounded in privacy. Maybe defamation concerns would be more viable?

However, the women in the shower into which you're peeping might not ever know that you've peeped at them, so is it not wrong to be a peeping tom (as long as you never get caught)?

Besides the reason already given above? It's more reasonable to imagine you will never be caught for private files on your computer vs. peeking into someone's bedroom. Simply not being physically there reduces the risk of detection and thus harm to the person.

With deepfakes I guess it depends on how much verisimilitude something can have before you think it violates your "actual" privacy. If I have a deepfake of Angelina Jolie that, for whatever reason, has serious flaws and inaccuracies have I violated her privacy in the same way?

This is the main thing I am trying to get at with the locker room/fantasizing examples. The current AI can inpaint nudity onto clothed pictures of people without necessarily having serious flaws or inaccuracies. (Not to say it always succeeds at doing this, just that it can reasonably often.) And training the AI on the actual person's breasts isn't required for the result to be highly similar to what they actually look like topless, at least for some women, considering at least some people's breasts are visually similar to other people's breasts. Thus a person who has not already consented to having topless photos of themselves present anywhere on the internet can have topless images of them created to what is indeed a very high degree of verisimilitude to their actual naked form, using e.g. porn stars' breasts as training data.

Technically, I suppose, the person operating the AI algorithm can't know whether the subject has, e.g., a mole on the chest. So maybe, because some technical uncertainty remains (without a real topless image of the subject to compare against, one can't verify that the AI-inpainted nudity is actually highly similar), there is still some sense of privacy maintained? Because even if the inpainted nudity actually is extremely similar to their topless form, this isn't known to the person creating or viewing the deepfake?

Regardless, overall, the pertinent fact is that the technology is now at a level where it is possible, at least somewhat often, to get outputs that the depicted person themselves could or would mistake for real nude photos of themselves. This seems to me functionally very similar, if not identical, to looking at someone changing/naked without their consent or knowledge. You're right in the sense that it doesn't imply other security concerns in the same way an intruder present in a changing room would, but I'm not sure that's what's actually wrong/disliked about peeping toms; I feel like a significant amount of the dislike of the idea of someone seeing you changing is the simple fact that they know what you look like naked (and maybe also the knowledge or likelihood that they are fantasizing about you sexually). E.g., most people would be about as opposed to a person using X-ray glasses, or more realistically a hole in the wall, to look inside their locker room while they changed as they would be to someone hanging from the rafters. I can't know for certain, though, at least personally, because to my knowledge I've never been the victim of any such situation.

I don't think we have an easy rule. I also don't know that this can/should be grounded in privacy. Maybe defamation concerns would be more viable?

Well, as far as legality goes, it seems like copyright is the main way people take down unwanted deepfake porn of themselves. Regardless, though, I'm less interested in the legality and more in what should or shouldn't be generally considered acceptable ethically or morally speaking, for which privacy, or violations thereof, and perhaps other things do seem like a relevant concern.

And training the AI on the actual person's breasts isn't required for the result to be highly similar to what they actually look like topless, at least for some women, considering at least some people's breasts are visually similar to other people's breasts. Thus a person who has not already consented to having topless photos of themselves present anywhere on the internet can have topless images of them created to what is indeed a very high degree of verisimilitude to their actual naked form, using e.g. porn stars' breasts as training data.

Porn stars self-select not only for their agility in smoothly changing positions in front of cameras--incidentally, a skill shared with politicians--but also for how good they look naked. If an AI image generator is trained on the naked bodies of porn stars, its AI-completed naked version of me will look amazingly better than I actually do.

Women's breasts, in particular, come in a variety of shapes, and they are frequently not symmetric. Older women's breasts tend to be flat--think more like those pictures in the old National Geographic depicting women in some far-away hunter-gatherer tribe. The nipples and areolae come in various shapes and sizes, and change with temperature. Some have inverted nipples. Practically all of this variability is hidden by the kinds of clothes women wear, especially if they are into padded bras.

The distribution of body fat also varies significantly for overweight women, and this is also mostly hidden or distorted by clothes.

Women's breasts, in particular, come in a variety of shapes, and they are frequently not symmetric. Older women's breasts tend to be flat--think more like those pictures in the old National Geographic depicting women in some far-away hunter-gatherer tribe. The nipples and areolae come in various shapes and sizes, and change with temperature. Some have inverted nipples. Practically all of this variability is hidden by the kinds of clothes women wear, especially if they are into padded bras.

I'm aware of this. The point is that not everyone with good-looking (pornstar-like, if you will) breasts decides to become a pornstar. Thus, these types of people are vulnerable to having very realistic versions of their breasts recreated with pornstar data, despite never themselves putting images of their actual breasts out onto the internet. Additionally, there's plenty of data of non-pornstar-like breasts out there to train on. The point is not that AI will always generate topless versions of people that are very much like what their breasts actually look like, it's that it can with at least some relative degree of frequency.

Making deepfake porn of someone for noncommercial purposes should be fair use. It's clearly transformative, and it doesn't have any effect on the potential market for the work unless you think the copyright owner will sell their own picture for use in porn and this makes it harder to do so.

Maybe true, but I guarantee you that the vast majority of people paying money to host websites that distribute deepfakes are doing so for commercial purposes. For example, the streamer in question had accessed a website which required him to pay $15 to use it.

If so, what's the difference between that and having a vivid, realistic imagination and using it for sexual purposes in the same way?

One extracts real, factual information; the other does not. Your actual naked body is your information. How I imagine your naked body to look, or how I conceive of it by using glorified Photoshop+ to put your head on another naked person, is my information.

What if the AI is so good at being Photoshop+ that, using a picture of what you look like clothed, it is able to create a nude composed of the exact same pixels/information that would be present in an actual photograph you took of yourself in the same pose, except naked? In that case I actually am accessing the information that you call 'your' information, which is to say, that information which you agree is wrong for me to access.

To pre-empt any disputes you have about how possible this is, although I'm sure it is actually possible, let's scale this AI's capability back to a level of detail at least good enough to trick even the depicted subject into thinking it was a real nude of themselves. (This is where the technology actually is right now. Boobs aren't exactly the most difficult thing to draw, especially at 512x512 resolution.) In this case, even if it's not the exact same information, it seems to me to be functionally the same information, for all intents and purposes. So is it okay for me to use an AI to access what is, for all intents and purposes, the same as information which would otherwise be immoral for me to access?

To pre-empt any disputes you have about how possible this is, although I'm sure it is actually possible

Actually, I'm pretty sure it's completely impossible, given how many bits of entropy there likely are in the variation of how naked bodies can look: things like moles, random scars, and so on, plus variation in genital appearance. It's also possible to acquire new marks over time (again, scars), which means that even if there really were some perfect correlation between one's facial appearance and the exact configuration of moles, birthmarks, etc. on one's body, that still wouldn't account for one's lived experiences (to think I've found an appropriate occasion for that phrase). There's also weight gain and loss, which happens constantly and is a huge factor too. (Of course, this all depends on how much of a person's body they've shown off in information accessible to the AI, and thus how much is left for it to guess.)
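To put very rough numbers on that intuition, here's a back-of-the-envelope sketch. Every per-feature count below is invented purely for illustration, not measured from anything:

```python
import math

# Back-of-envelope entropy estimate for bit-exact reconstruction of a body.
# All counts below are made-up assumptions for illustration only.
features = {
    "mole/birthmark configurations": 2**40,   # positions, sizes, counts
    "scar history": 2**20,                    # acquired over a lifetime
    "current weight/fat distribution": 2**15, # changes constantly
    "genital details": 2**25,
    "skin tone/texture variation": 2**15,
}

total_bits = sum(math.log2(n) for n in features.values())
print(f"~{total_bits:.0f} bits of variation to guess")
# ~115 bits -- in the range of a cryptographic key. A generator that only
# has to look plausible is doing nothing like an exact reconstruction.
```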

let's scale this AI's capability back to a level of detail at least good enough to trick even the depicted subject into thinking it was a real nude of themselves.

Even this seems unlikely, again given the amount of (changing) variation involved. The moment one mark isn't there, or the genitals are off in any way (and there are a lot of details in both penises and vulvas), or the pubic hair isn't how the person keeps it (or how they plausibly kept it in the timeframe of the photo), etc., the whole thing is done. Just because it's AI doesn't mean it can do magic. It can't brute-force cryptographic keys any better than any dumb algorithm, and again, I believe the informational entropy involved in both formulations of your scenario is likely in that range.

In any case, I would still say that even if you did hit on a perfect prediction/match (which as I will explain, is still not actually perfect in practice), it is still not a matter of you accessing my information. Like let's say we both have the password "hunter2" on this site. Are we then accessing each other's private information (since, after all, what's more private than a password) every time we log in? No, because the context is different. In one context, "hunter2" is my information. In another context, "hunter2" is yours. The only way you could be accessing my information is if you tried to use the "hunter2" information in my context, that is, tried to use it to log into my account.
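Incidentally, this is roughly how real login systems treat it too: two accounts with the same password don't even store the same value, because each is salted into its own context. A minimal sketch (the account names and parameters here are just placeholders):

```python
import hashlib
import os

def store_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with a fresh per-account random salt."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

# Two accounts, identical password, different contexts:
alice = store_password("hunter2")
bob = store_password("hunter2")
print(alice[1] == bob[1])  # False: same string, but stored as separate "theirs"
```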

Along those lines, I'd say the only context in which the information that you've generated (hence your information) of a perfect prediction of the appearance of my naked body can become equivalent to the actual information of the actual appearance of my naked body is if you can see me naked and can confirm that it's a perfect prediction. (After all, information and uncertainty are inherently interlinked. It's not actually a perfect prediction, at least from the perspective of any observer or process we care about (I'm presuming omniscient oracles are not in play in our social relations here), until you can confirm that it is. To go back to the password analogy: if you're brute-forcing a password, it's only meaningfully "right" at the moment you confirm it and successfully log in with it, not at the moment the equivalent string is first generated and enters your computer's memory.)

Except, in that case... you've already necessarily seen the actual, 100% real thing, so why do I care about the perfect prediction of it anymore? (If you've only ever seen it in person but keep a record of the prediction, that may change things, and I could address it, but this is all getting into silly technicalities built on an implausible scenario, so I'll end it at this extension.) If I type my password in front of you and don't do a good enough job of hiding it, then I obviously don't have to worry about you brute-forcing it anymore (nor, from a privacy perspective, do I care, beyond caring that you saw the password, that you may have brute-forced it earlier), because the worst end result of that has already necessarily happened in the process of confirmation.
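To make the brute-forcing point concrete, here's a toy sketch (the salt and target are made up): the candidate string is just noise in memory until the check succeeds, and it's the successful confirmation that turns it into information about the account.

```python
import hashlib
import itertools
import string

# A toy target: the "account" only exposes a verification check.
SALT = b"example-salt"
TARGET = hashlib.sha256(SALT + b"hi").hexdigest()  # pretend stored hash

def verify(guess: str) -> bool:
    return hashlib.sha256(SALT + guess.encode()).hexdigest() == TARGET

# Until verify() returns True, every candidate is just a string we generated;
# only the successful check makes it *the* password.
for candidate in itertools.product(string.ascii_lowercase, repeat=2):
    guess = "".join(candidate)
    if verify(guess):
        print("confirmed:", guess)
        break
```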

Along those lines, I'd say the only context in which the information that you've generated (hence your information) of a perfect prediction of the appearance of my naked body can become equivalent to the actual information of the actual appearance of my naked body is if you can see me naked and can confirm that it's a perfect prediction.

I suppose you're right about this.

What if you're a very accomplished painter and you're able to tell from someone's clothed body exactly how to paint them such that it matches their body when nude?

Maybe that should have the same moral or ethical implications.

In that case I actually am accessing the information that you call 'your' information, which is to say, that information which you agree is wrong for me to access.

No, I think OP (and I) are treating tangible possession as a key factor. The information may be functionally identical, but it is not actually the same.

In the version of the hypothetical where the AI actually can exactly recreate the way a person would look naked in a certain pose, using only a clothed photo of them in that pose as reference, we can agree that the information is 'actually' the same, though, right? One pixel at location (x, y) with color #f0d190 is 'actually the same' as another pixel at the same location (x, y) with color #f0d190, regardless of whether that pixel is there because it was reverse-engineered by an AI or normally engineered to be there by being captured in a digital photo.
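At the level of the raw data, this much is uncontroversial: equality of pixel values carries no record of where the values came from. A trivial illustration (the two arrays just stand in for a camera capture and a model output):

```python
import numpy as np

# One "pixel" with color #f0d190, regardless of provenance.
from_camera = np.array([[[0xF0, 0xD1, 0x90]]], dtype=np.uint8)  # captured
from_model  = np.array([[[0xF0, 0xD1, 0x90]]], dtype=np.uint8)  # generated

print(np.array_equal(from_camera, from_model))  # True: the bytes are the bytes
```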

Even granting that, they are two separate copies of that data. The ability to possess one does not make it ethically or legally justifiable to steal the other.

No. In that hypothetical we lack the information that the picture corresponds to something that exists in reality. I can ask a random number generator for 512*512 pixels, and the result can be equivalent to a picture of you, but unless I somehow find out that it is equivalent, I am lacking information that I would have if I took a photo of you.
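The odds of that are worth spelling out (this is straightforward arithmetic, not an assumption): a 512x512 RGB image has a fixed number of possible bit patterns, and a random draw lands on any particular one with vanishing probability.

```python
width, height, channels, bits_per_channel = 512, 512, 3, 8
total_bits = width * height * channels * bits_per_channel

print(total_bits)              # 6291456 bits per image
print(f"1 in 2^{total_bits}")  # chance a random draw matches one given photo
# Even a generator that only has to land in the (much smaller) set of
# plausible photos of one person is guessing, not extracting, information.
```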

I suppose you're right.