Culture War Roundup for the week of January 30, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Has this group had any discussion regarding AI use in pornography, specifically 'deepfakes'? It's come out recently that a major up-and-coming Twitch streamer, 'Atrioc' (who is recently married and ostensibly very pro-feminist, though while funny, these facts are technically irrelevant to the matter at hand), had been viewing porn 'deepfakes' of multiple major female Twitch streamers, including possibly his best friend's girlfriend (that last part is unconfirmed but highly possible). He's come out with an apology and it's a whole thing, but I'm sure this community is more interested in the moral/ethical questions therein than in internet-celebrity drama, so I won't bore you with it.

The following are my perspectives on a few of the potential questions regarding 'deepfake' porn, and AI porn in general. I'd love to hear what other people think about these perspectives, because my thoughts are currently very incomplete on the issue.

First and foremost, I have a strong intuitive feeling that it is deeply wrong, perhaps tantamount to some form of sexual harassment/assault (though of course the non-violent sort), to make 'deepfake' pornography of other, non-consenting people. For example, taking a picture from a celebrity's Instagram and using AI to transform it into a high-fidelity (but technically fake) nude picture of them seems functionally the same as, e.g., spying on them in the shower or while they're changing, actions I think we can all agree would be some form of wrong or illegal sexual violation (or perhaps we can't? You tell me). The way I think about this is by considering that a strong enough AI program would theoretically be capable of using a clothed picture of someone to reconstruct the exact way they look naked, which would be quite literally equivalent to the aforementioned situation, or to looking at them with x-ray glasses, etc., which again (I think) we and most people agree would be wrong. And so, less powerful AI capable of doing something similar seems to be at least on that gradient of wrong, if not exactly as bad.

Furthermore, AI that actually transplants people's faces onto video depictions of sexual intercourse (which is ostensibly what 'Atrioc' was caught viewing) seems worse, or maybe just bad in a different way. I don't have a similar thought experiment to justify why I feel that way, but the wrongness of it is my strong intuition nonetheless.

However, I can also sort of see the argument, at least abstractly, that it's a victimless crime. On the other extreme of the spectrum, fantasizing in one's own imagination about the way people look when they're naked, or how it might feel to have sex with them, is not only generally recognized as a very benign behavior but is also known to be something almost everyone does, men and women both. Sometimes people do this even completely unconsciously, e.g. in their dreams. And what's the difference between looking at a very (or fully) realistic recreation of the way someone might look with their clothes off and using one's own imagination to do so? What if your imagination were very vivid, and you had seen many naked people before (thus making your 'training data' very good), so that you could reasonably expect to make a relatively accurate recreation of the way someone looked while naked, only in your own mind's eye?

The thing is, acknowledging these potential similarities between an action I find morally acceptable and one I find morally wrong still doesn't make my intuition about the wrongness of 'deepfakes' any weaker. I feel like there must be something I haven't considered about it yet, which is where I'm hoping you guys might have insight. The only distinction I've found somewhat convincing so far is that maybe the mass distribution via the internet is what makes it wrong. In other words, I find it less wrong (but still somewhat wrong) to make a highly or fully realistic nude of someone and keep it entirely on one's own computer than to make such an image and then distribute it online. This is especially weird because the former is an even more apt comparison to, e.g., peeping on someone in the locker room, which is obviously (?) wrong. So why does it seem more okay to me? Help!

I have a few potential explanations that I'm considering as candidates for the source of my cognitive dissonance here:

  1. Perhaps in reality none of the aforementioned actions are wrong. It's not wrong to spy on someone in the locker room, and so it's not wrong to use 'x-ray glasses' to see through their clothes, or use an AI to edit a picture to do functionally the same thing.

  2. Perhaps instead, in reality, it actually is wrong to imagine or fantasize about what other people look like naked. The reason this is so commonly accepted as benign is that it's impossible to enforce against. But if sexual mores are so arbitrary/constructed that something otherwise wrong can simply be agreed upon as acceptable because it's unenforceable, how wrong can any ('victimless') violation of sexual mores really be said to be? And thus how wrong is the other situation, where one uses AI?

  3. This kind of segues into my third candidate: perhaps the ultimate cause of this dissonance is that modern-day sexual mores are completely stupid, so deeply incoherent that accepting any one of them will necessarily produce cognitive dissonance when it is contrasted against some other. Is the solution to the 'deepfake' issue, then, to try to change our society's sexual morals/ethics into something more internally coherent?

None of these really addresses why I feel differently about 'turning a clothed photo into a nude' and 'transplanting, in a realistic way, a non-consenting individual's face onto an actor in a depiction of sexual intercourse.' I have no concrete ideas as to why the latter feels worse overall, and yet in some other (minor) ways not as bad. And the latter situation is what the whole controversy with the streamer is about, AFAIK. Very confused about all this.

What's right here, and why? What should even be done? Should 'deepfakes' be illegal because of these potential moral/ethical concerns? Should the act of making a deepfake be illegal, or just distributing it? (I think if we wanted to, we could make both of these things illegal. We might not be able to prevent anyone from making them, considering the AI cat is out of the bag, but it still might be worthwhile to have their illegality on the books if it really is wrong. In other circles I'm seeing claims that a ban would be unenforceable (motivated thinking?), but it seems trivially easy to functionally ban at least the distribution of 'deepfake' porn in a way that would almost certainly reduce its dissemination, if not completely eliminate it, just as with, e.g., child sexual abuse imagery or zoophilia porn.)

I also see a lot of people in other circles being prompted by this discussion to argue about the ethics of AI image generation in general. I generally think this is basically stupid. The arguments claiming that AI image generation is tantamount to plagiarism (of the dataset images, I suppose) are all basically worthless as far as I can tell. But people who have bought into this line of thinking are now going as far as to say that, e.g., photorealistic AI-generated porn (even porn depicting completely synthetic likenesses) is a sexual violation (of all the nude or semi-nude women in pictures in the dataset, I guess?). Either way, I am wholly unconvinced by these arguments and think they basically all stem from a bad understanding of how these AI models work, so I don't think I'm super interested in discussing this axis of the debate. But I mention it because this community sometimes surprises me, so if anyone here has a really strong argument for why this position might make sense that they think I haven't seen before, feel free to mention it.

For the celebrity women depicted in the AI porn, the fact that they are celebrities amplifies their distress. They are used to seeing images and videos of themselves publicized and remembering the context in which they were taken, and then they see a highly accurate AI version of themselves in a sex scene. This can cause psychological harm in a person; humans are not designed to see something like that, and I don't think it computes properly in the brain. An AI scene of being sexually victimized (in essence, arguably) is fundamentally different from a photoshop of a person's face, due to the sheer realism.

It's also probable that there is harm when someone faps to the AI video as opposed to an Instagram image, because it's as if a part of the mind believes the AI video really happened. That's how realistic AI porn is, or can be in the future. So knowing that when someone you know thinks about you, part of his mind remembers that scene, is truly disgusting.

With that said, I'm not even sure that Atrioc (the streamer who was caught watching the video) actually did anything immoral here, as opposed to something massively unfortunate. We know he paid for an AI porn membership, but not that he specifically paid for the creation of AI porn of his colleagues. You can imagine that a man on a porn site will see 'woman I have a crush on, in AI porn' and essentially be compelled to click on the link. The ease with which men can watch porn and click on links reduces his moral culpability, because men simply have not evolved to exercise that kind of willpower. If you put most men in his situation, would they click? When they are in a hyper-aroused, hypnotic state, when all day every day they click links on a whim, and when they know that no one would find out they looked at the streamer's AI video? I pretty much think every young man who is aroused would do the same, so this negates the immorality of his action.

Yeah, I'm pretty willing to forgive the streamer guy specifically, especially considering your points, as well as the fact that I have little to no horse in this race. As to your other points:

This can cause psychological harm in a person; humans are not designed to see something like that, and I don't think it computes properly in the brain. An AI scene of being sexually victimized (in essence, arguably) is fundamentally different from a photoshop of a person's face, due to the sheer realism.

By this do you mean to say that the main reason these videos might be unethical is that knowledge of their existence causes psychological distress in the people they depict, not necessarily because, e.g., the depicted people's consent has been violated?

So knowing that when someone you know thinks about you, part of his mind remembers that scene, is truly disgusting.

This example prompted me to think, though on a tangent only somewhat related to what you're getting at. I'm not sure the 'part of the mind thinking the AI video really happened' thing is what sets it apart. But I do think that knowing whether or not someone thought about you in that way is part of what matters. Whether or not someone made AI porn of you isn't relevant to you unless you know about it -- this fits with my intuition, because even though I and most people consider completely imagined sexual fantasies benign, the calculation changes if person A, who masturbated to an imaginary fantasy of having sex with person B, then went and told person B that they had done as much. Suddenly that becomes immoral/unethical to me in a way almost similar to the AI nude situation. So I think this might be getting at the distinction for me: what really matters most is whether people know that this stuff is being made about them. And in the case of these popular female streamers, the fact that the pics/vids are being distributed basically means they are being forced to know that such content is being made of them. It would be like 10,000 weirdos constantly whispering in their ear that they jerked off to the thought of them naked, which is different from those 10,000 weirdos jerking it but not telling anyone.

And in the case of these popular female streamers, the fact that the pics/vids are being distributed basically means they are being forced to know that such content is being made of them. It would be like 10,000 weirdos constantly whispering in their ear that they jerked off to the thought of them naked, which is different from those 10,000 weirdos jerking it but not telling anyone.

It's a minor but non-trivial point that many of the female streamers flirt with openly encouraging guys to watch them based on their sexualized appearance. This dynamic is going to happen anyway -- 'attractive news anchor' has been a thing since TV started -- but the streamers very often take it to another level: pink cutesy hair, accentuated cleavage, tight pants and "accidental" butt shots, etc. To put it crudely, if their target audience is thirsty simps willing to pay for their streams, I think that should factor into whether they subsequently have a right to be creeped out when those simps imagine the streamers naked, beat off, whatever. 10,000 weirdos telling Martha Stewart that they jerk off to her is very different from 10,000 weirdos telling Pokimane the same thing. Pokimane is actively, if stealthily, cultivating that response in her viewers; Martha Stewart is not.

You're right that many female streamers cultivate an audience in this way, but some female streamers do not and yet still have deepfake porn of them made. So to avoid getting caught up in this, we can restrict the discussion solely to what is right or wrong regarding the porn made of the latter group.

I agree with what you're saying here, in general. And I think that even if the thirst streamers didn't exist, the ordinary streamers who are just streaming-while-female would still end up with subscribers just there to fantasize about dating them. Anytime a female does something on the internet, some guy will try to "send bobs and vagene" her. There's a hilarious example out there of a guy posting Botticelli's Birth of Venus on Twitter and getting marriage proposals. With that in mind, deepfakes are inevitable. There are even deepfakes of Martha Stewart, after all.

At the same time, the rise of monetized streams and sites like OnlyFans (spit) has really weaponized this tendency. That's bad for the guys whose wallets are getting drained, obviously, but it's also bad for the normie women who just want to share their hobbies. The thirst streamers are definitely part of the problem, and they're making everything worse for everyone. Because of that, I have no sympathy for deepfakes of thirst streamers.

You're right that many female streamers cultivate an audience in this way, but some female streamers do not and yet still have deepfake porn of them made.

Is this actually true? I'm curious to learn of some examples for, uh, research purposes.