
Culture War Roundup for the week of January 30, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Has this group had any discussion regarding AI use in pornography, specifically 'deepfakes'? It's come out recently that a major up-and-coming Twitch streamer, 'Atrioc' (who is recently married and ostensibly very pro-feminist; funny, but technically irrelevant to the matter at hand), had been viewing deepfake porn of multiple major female Twitch streamers, including possibly his best friend's girlfriend (last part unconfirmed but highly possible). He's come out with an apology and it's a whole thing, but I'm sure this community is more interested in the moral/ethical questions therein than in internet-celebrity drama, so I won't bore you with it.

The following are my perspectives on a few of the potential questions regarding 'deepfake' porn, and AI porn in general. I'd love to hear what other people think about these perspectives, because my thoughts are currently very incomplete on the issue.

First and foremost, I have a strong intuitive feeling that it is deeply wrong, perhaps tantamount to some form of sexual harassment/assault (though of course the non-violent sort), to make 'deepfake' pornography of other, non-consenting people. For example, taking a picture from a celebrity's Instagram and using AI to transform it into a high-fidelity (but technically fake) nude picture of them seems functionally the same as, e.g., spying on them in the shower or while they're changing, actions I think we can all agree would be some form of wrong or illegal sexual violation (or perhaps we can't? you tell me). The way I think about this is by considering that a strong enough AI program would be theoretically capable of using a clothed picture of someone to reconstruct the exact way they look naked, which would be quite literally equivalent to the aforementioned situation, or to looking at them with x-ray glasses, etc., which again (I think) we and most people agree would be wrong. And so, less powerful AIs capable of doing something similar seem to be at least on that gradient of wrong, if not exactly as bad.

Furthermore, AI that actually transplants people's faces onto video depictions of sexual intercourse (which is ostensibly what 'Atrioc' was caught doing) seems worse, or maybe just bad in a different way. I don't have a similar thought experiment to justify why I feel that way, but the wrongness of it is my strong intuition nonetheless.

However, I can also sort of see the argument, at least abstractly, that it's a victimless crime. On the other extreme of the spectrum, fantasizing in one's own imagination about the way people look when they're naked, or how it might feel to have sex with them, is not only generally recognized as a very benign behavior but is also known to be something almost everyone does, men and women both. Sometimes people do this even completely unconsciously, e.g. in their dreams. And what's the difference between looking at a very (or fully) realistic recreation of the way someone might look with their clothes off and using one's own imagination to do so? What if your imagination were very vivid, and you had seen many naked people before, making your 'training data' very good, so that you too could reasonably expect to make a relatively accurate recreation of the way someone looked while naked, only in your own mind's eye?

The thing is, acknowledging these potential similarities between an action I find morally acceptable and one I find morally wrong still doesn't make my intuition about the wrongness of 'deepfakes' any weaker. I feel like there must be something about it I haven't considered yet, which is where I'm hoping you guys might have insight. The only distinction I've found somewhat convincing so far is that maybe the mass distribution via the internet is what makes it wrong? In other words, I find it less wrong (but still somewhat wrong) to make a highly or fully realistic nude of someone and keep it entirely on one's own computer than to make such an image and then distribute it online. This is especially weird because the former is an even more apt comparison to, e.g., peeping on someone in the locker room, which is obviously (?) wrong. So why does it seem more okay to me? Help!

I have a few potential explanations that I'm considering as candidates for the source of my cognitive dissonance here:

  1. Perhaps in reality none of the aforementioned actions are wrong. It's not wrong to spy on someone in the locker room, and so it's not wrong to use 'x-ray glasses' to see through their clothes, or use an AI to edit a picture to do functionally the same thing.

  2. Perhaps instead, in reality it actually is wrong to imagine or fantasize about what other people look like while naked. The reason this is so commonly accepted as benign is that it's so unenforceable to prevent. But if sexual mores are so arbitrary/constructed that something otherwise wrong can just be agreed upon as acceptable because it's unenforceable, how wrong can any ('victimless') violation of sexual mores really be said to be? And thus how wrong is the other situation, where one uses AI?

This kind of segues into 3., which is: perhaps in reality the ultimate cause of this dissonance is that modern-day sexual mores are completely stupid, so deeply incoherent that acceptance of any one of them will necessarily lead to cognitive dissonance when contrasted against some other. Is the solution to the 'deepfake' issue, then, to try to change our society's sexual morals/ethics into something more internally coherent?

None of these really address why I feel different about 'turning a clothed photo into a nude' and 'transplanting, in a realistic way, a non-consenting individual's face onto an actor in a depiction of sexual intercourse.' I have no concrete ideas as to why the latter feels overall worse, but also in some other (minor) ways not as bad. And the latter situation is what the whole controversy with the streamer is all about AFAIK. Very confused about all this.

What's right here, and why? What should even be done? Should 'deepfakes' be illegal because of these potential moral/ethical concerns? Should the act of making a deepfake be illegal, or just distributing it? (I think if we wanted to, we could make both of these things illegal. We might not be able to enforce preventing anyone from making them, considering the AI cat is out of the bag, but it still might be worthwhile to have its illegality on the books if it really is wrong.) In other circles I'm seeing claims that a ban would be unenforceable (motivated thinking?), but it seems trivially easy to functionally ban at least the distribution of 'deepfake' porn in a way that would almost certainly reduce its dissemination, if not completely eliminate it, just as with e.g. child sexual abuse imagery or zoophilia porn.

I also see a lot of people in other circles being prompted by this discussion to argue about the ethics of AI image generation in general. I generally think this is basically stupid. The arguments claiming that AI image generation is tantamount to plagiarism (of the dataset images, I suppose) are all basically worthless as far as I can tell. But people who have bought into this line of thinking are now going as far as to say that, e.g., photorealistic porn generated with AI (even porn depicting completely synthetic likenesses) is a sexual violation (of all the nude or semi-nude women in pictures in the dataset, I guess?). Either way, I am wholly unconvinced by these arguments and think they basically all stem from a bad understanding of how the AI works, so I don't think I'm super interested in discussing this axis of the debate. But I mention it because this community sometimes surprises me, so if anyone here has a really strong argument for why this might make sense that they think I haven't seen before, feel free to mention it.

Perhaps in reality the ultimate cause of this dissonance is that modern-day sexual mores are completely stupid, so deeply incoherent that acceptance of any one of them will necessarily lead to cognitive dissonance when contrasted against some other

That observation is a very useful starting place. When I find myself in a similar confusion, I try to switch my perspective to a more traditional view by imagining it involving my kin. Like: "What would I want to do to the guy who did this to my 18-year-old daughter?"

If a guy uploaded to pornhub a realistic sleazy deep-fake porn with my daughter's image and distributed the link within my community, I'd be contemplating the percussion sound of a baseball bat striking his kneecap.

Now that I have an anchor to my reaction, I can explore its possible reasons.

The modern US culture is (broadly) a culture of dignity, where "sticks and stones may break my bones but words will never hurt me" is an aspirational ideal. If I aspire to this ideal for myself and my hypothetical 18-year-old daughter, then the sleazy deep-fake porn is "words" that I and my daughter ought not allow to hurt us. We would then treat the incident as we would one where someone created a fake LinkedIn account for my daughter, or a controversial blog post written in my daughter's name, or one where someone hacked my daughter's Twitter account and posted some unsavory tweets in her name.

In a culture of dignity, I would assume that my daughter's dignity cannot truly be compromised by something she didn't do (in this case: make a sleazy porn video). I would understand the need to correct the record--have pornhub take down the video, issue a clarification within our community--and I would regard that task as an annoyance.

However, underneath that culture-of-dignity veneer lurk centuries of cultures of honor. It doesn't take much for me to get into that mindset. By creating the deepfake porn and distributing it among my community, the guy compromised my daughter's honor--altered for the worse her reputation among my community--and by extension he compromised my honor. Swift baseball-to-the-kneecap plus spreading the word about the retribution is pure restorative justice.

(But what if the guy didn't distribute the deepfake? Like, what if I found it by browsing his laptop? The threat of distribution is there. So my gut response is to get immediately angry and see that he erases the video and promises never to do that again. Presumably, if I am browsing the guy's laptop, the guy is part of my community and I will have social levers to ensure compliance.)

The question is then: what culture does my community have?

If it's Blue Tribe PMC: my daughter's reputation will rise by spreading word about (a) her stoic response to someone's attempt at reducing her dignity, (b) our levelheaded pursuit of legal means of redress, and even (c) our high-brow discussions on why our culture regards sex as shameful in the first place.

If it's Red Tribe Appalachia: out comes the baseball bat.

I'm not sure why you're bothering to make yourself one degree removed by making this about your 18-year-old daughter.

Imagine: I get access to pictures of your face, and I'm annoyed about your opinions about deepfake porn, so I decide to get some completely legal revenge. I create a deepfake porn video of you being tied up crying and screaming, before someone shoves a ring gag in your mouth to make it easier to throatfuck you, which several faceless men proceed to do, followed by someone sticking a barbed dildo into your ass and leaving you there, fade to black.

I smack a giant "this is a deepfake porn from Deepfake Porn Productions" watermark across the bottom of the screen, making sure not to obscure the semen dripping artistically from your slightly bleeding mouth. I send this video to your coworkers, friends, and family (all above age of consent, of course, I wouldn't send porn to minors). I even carefully label the video so they know exactly what's in it before clicking, I don't want to trigger them. I also upload it to pornhub, why not.

Why involve your barely legal 18-year-old daughter in this scenario?

Let's say you feel the specific problem is that it's sexual harassment for me to upload it anywhere. It's fine as long as I don't send it to your acquaintances, or it's fine as long as I don't upload it to reddit, or whatever. Is it okay for me to let only you know that I have an entire library of such videos? I promise no one but me will get to see them; I just like the deepfake of your voice begging me to stop.

Is the idea that it's fine as long as I keep it secret and never take my laptop full of videos of you to a repair shop where a technician gets to see you taking dick like a seasoned pro (who knew your asshole could gape that wide, wow -- just to be clear, this is a deepfake, this is not real, it's just your face attached to that body type I carefully matched to be as close to yours as possible)? We're past the revenge porn scenario here, I'm keeping this all to myself, it's just that I find it really satisfying to watch someone use a barbed sound on your urethra while a face that looks like yours is crying about it.

Anyway, like I said: your daughter isn't necessary in this scenario. We can keep the discussion entirely to the ethics of me doing this to you. Concerns of anonymity on the Motte aside, how do you feel about sharing a photo of yourself with me after this comment?

Two things about this scenario -

  1. It seems like this video will definitely not remain secret. In your scenario, there is no way for it to remain secret: even if you never ever show it to anyone, a computer technician will see it, and even if that didn't happen, you still need to tell me you have the videos and that you need to see me begging you to stop. Because the videos by themselves are meaningless; the act of sharing them is a necessary component.

  2. I am not friends with people who want to violently sexually violate me. I get the impression that we are actually enemies; maybe we are work acquaintances, or rivals for a lady's affections? Either way, I am truly, deeply sorry for dominating you so totally and completely that you have built an entire library dedicated to fantasising about my submission. Also, though, thanks for telling me about it, because now I truly own you. Could you strip the deepfake warnings off it, however, and pretend it's real? There is zero challenge in destroying someone by showing others their violent yet impotent seething.

I think it says a lot about The Motte that this comment--which is obviously leaning super hard into deliberately poking emotional buttons--was met with multiple dispassionate responses that take the position offered seriously. And I think they were correct to do so!

This comment showed up on my "volunteer mod" list, and I seriously considered both the "warning" and "AAQC" options. Went with "neutral."

This kind of comment, with deliberate shock value in an obvious attempt to cause an extreme emotional response and create a sort of "ad absurdum" proof (the implicit message being "according to your stated values, this EXTREME and OBSCENE thing happening to YOU would be allowed; your stated values don't look so good now, do they?"), happens every once in a while in this forum, and though I've often found it amusing, I've also started to find it frustrating. Because when they inevitably get the types of responses that this one got (i.e. "Yes, that'd be fully allowed. And?"), there never seems to be any follow-up to continue the conversation. And that's a shame, because I feel like there's potentially an interesting conversation here. It's legit fascinating to me that some pixels arranged to look like a photograph of oneself doing XYZ could be offensive based on how offensive XYZ is, and not only that, but that it's so obviously offensive that it's used as an "ad absurdum" endpoint, a "gotcha" against someone's values.

I'm not sure why you're bothering to make yourself one degree removed by making this about your 18-year-old daughter.

I am old, married, and no longer give a fuck. But I would care if it were my daughter.

I appreciate you taking the time to vividly describe the hypothetical experience. I know that your intent was to make me feel disturbed or disgusted, but that's rather the point of this discussion: it's about exploring our intuitions on the subject.

Well, if you made deepfake gay rape porn featuring me (and distributed it), I would consider that a legitimate grievance, but not one justifying extralegal violence. If we were friends and you made deepfake porn of me for personal use only, I would probably stop being your friend, but not otherwise hold much of a grudge. If you made deepfake porn of any description about a female relative, I would consider it to justify extralegal violence regardless of intent to distribute.

I wager myself much closer to the median American than you are on this issue.

Anyway, like I said: your daughter isn't necessary in this scenario. We can keep the discussion entirely to the ethics of me doing this to you. Concerns of anonymity on the Motte aside, how do you feel about sharing a photo of yourself with me after this comment?

Can't speak for them directly, but personally the daughter would be relevant because I would care significantly more if it were my daughter than me.

To answer further questions, if you sent it directly to my friends + family I would be very unhappy (though that's rather the whole point of the anonymity concerns).

If posted online with my name (so it would show up on Google etc, though once again rather the point of anonymity concerns) I'd be moderately unhappy since that means there's a decent chance friends, family or potential employers would stumble upon it.

Posted without my identifying info, I'd be a bit wigged out if people I knew personally happened to stumble upon it, but its existence on the net, to be used by strangers, would not bother me much.

If it were kept on your hard drive for you and maybe a horny PC repair guy to find, I wouldn't mind at all, assuming no personally identifying info is attached so the horny PC repair guy can't do scenarios 1 or 2.

If it were my child (thinking on it I would mind quite a bit if it were my son too), I would be distressed to a greater degree about all the above scenarios.

Hope that helps clear it up: the degree of separation is being used because it is perceived as worse.

I find it interesting that Americans in general tend to often fall back to interrogating themselves with "what would I wish for if it happened to me?" when resolving questions of crime and punishment and ethical dilemmas. In terms of my own cultural programming, this seems wrong and immoral, and somewhere in a class with determining ethical conduct in retail by asking "what would I do in this store if I were absolutely sure that nobody could punish me for it?", which I guess you could simply call sociopathy. (In fact, to me, to proactively give up some of what you would and could claim for yourself seems like the essence of prosocial behaviour.) I can't pinpoint at what point and how it was conveyed, but if this is a European-American difference, it may explain why American prison terms and conditions are so notoriously draconian in comparison to ours.

I imagine you'd protest the comparison between shoplifting/abusing the staff and visiting punishment upon those who wronged you, but then I'd wonder what the salient difference is. If it's that your victimhood in the latter case gives you moral license to take more of the pie, well, you've now justified victimhood olympics (another very American phenomenon); if it's the detail that the case you are imagining involves your daughter, and rules against selfishness do not apply if you are acting to defend someone else, you've justified a whole array of /r/talesfromretail stories involving motherly Karens.

Thanks for the interesting response.