
Culture War Roundup for the week of January 30, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Has this group had any discussion regarding AI use in pornography, specifically 'deepfakes'? It's come out recently that a major up-and-coming Twitch streamer, 'Atrioc' (who was recently married and ostensibly very pro-feminist, though while funny, these facts are technically irrelevant to the matter at hand), had been viewing porn 'deepfakes' of multiple major female Twitch streamers, including possibly his best friend's girlfriend (last part unconfirmed but highly possible). He's come out with an apology and it's a whole thing, but I'm sure this community is more interested in the moral/ethical questions therein than in internet-celebrity drama, so I won't bore you with it.

The following are my perspectives on a few of the potential questions regarding 'deepfake' porn, and AI porn in general. I'd love to hear what other people think about these perspectives, because my thoughts are currently very incomplete on the issue.

First and foremost, I have a strong intuitive feeling that it is deeply wrong, perhaps tantamount to some form of sexual harassment/assault (though of course the non-violent sort), to make 'deepfake' pornography of other, non-consenting people. For example, taking a picture from a celebrity's Instagram and using AI to transform it into a high-fidelity (but technically fake) nude picture of them seems functionally the same as, e.g., spying on them in the shower or while they're changing, actions I think we can all agree would be some form of wrong or illegal sexual violation (or perhaps we can't? You tell me). The way I think about this is by considering that a strong enough AI program would theoretically be capable of using a clothed picture of someone to reconstruct the exact way they look naked, which would be quite literally equivalent to the aforementioned situation, or to looking at them with x-ray glasses, etc., which again (I think) we and most people agree would be wrong. And so less powerful AI capable of doing something similar seems to be at least on that gradient of wrong, if not exactly as bad.

Furthermore, AI that actually transplants people's faces onto video depictions of sexual intercourse (which is ostensibly what 'Atrioc' was caught doing) seems worse, or maybe just bad in a different way. I don't have a similar thought experiment to justify why I feel that way, but the wrongness of it is my strong intuition nonetheless.

However, I can also sort of see the argument, at least abstractly, that it's a victimless crime. At the other extreme of the spectrum, fantasizing in one's own imagination about the way people look when they're naked, or how it might feel to have sex with them, is not only generally recognized as a very benign behavior but is also known to be something almost everyone does, men and women both. Sometimes people do this even completely unconsciously, e.g. in their dreams. And what's the difference between looking at a very (or fully) realistic recreation of the way someone might look with their clothes off, and using one's own imagination to do so? What if your imagination were very vivid, and you had seen many naked people before, making your 'training data' very good, so that you too could reasonably expect to make a relatively accurate recreation of the way someone looked while naked, only in your own mind's eye?

The thing is, acknowledging these potential similarities between an action I find morally acceptable and one I find morally wrong still doesn't make my intuition about the wrongness of 'deepfakes' any weaker. I feel like there must be something about it that I haven't considered yet, which is where I'm hoping you guys might have insight. The only distinction I've found somewhat convincing so far is that maybe the mass distribution via the internet is what makes it wrong? In other words, I find it less wrong (but still somewhat wrong) to make a highly or fully realistic nude of someone and keep it entirely on one's own computer than to make such an image and then distribute it online. This is especially weird because the former is an even apter comparison to, e.g., peeping on someone in the locker room, which is obviously (?) wrong. So why does it seem more okay to me? Help!

I have a few potential explanations that I'm considering as candidates for the source of my cognitive dissonance here:

  1. Perhaps in reality none of the aforementioned actions are wrong. It's not wrong to spy on someone in the locker room, and so it's not wrong to use 'x-ray glasses' to see through their clothes, or use an AI to edit a picture to do functionally the same thing.

  2. Perhaps instead, in reality it actually is wrong to imagine or fantasize about what other people look like while naked. The reason this is so commonly accepted as benign is that it's so unenforceable to prevent. But if sexual mores are so arbitrary/constructed that something that would otherwise be wrong can just be agreed upon as acceptable because it's unenforceable, how wrong can any ('victimless') violation of sexual mores really be said to be? And thus how wrong is the other situation, where one uses AI?

This kind of segues into 3, which is: perhaps in reality the ultimate cause of this dissonance is that modern-day sexual mores are completely stupid, so deeply incoherent that accepting any one of them will necessarily lead to cognitive dissonance when it is contrasted against some other. Is the solution to the 'deepfake' issue then to try to change our society's sexual morals/ethics into something more internally coherent?

None of these really addresses why I feel different about 'turning a clothed photo into a nude' and 'transplanting, in a realistic way, a non-consenting individual's face onto an actor in a depiction of sexual intercourse.' I have no concrete ideas as to why the latter feels worse overall, but also in some other (minor) ways not as bad. And the latter situation is what the whole controversy with the streamer is about, AFAIK. Very confused about all this.

What's right here, and why? What should even be done? Should 'deepfakes' be illegal because of these potential moral/ethical concerns? Should the act of making a deepfake be illegal, or just distributing it? (I think if we wanted to, we could make both of these things illegal. We might not be able to prevent anyone from making them, considering the AI cat is out of the bag, but it still might be worthwhile to have its illegality on the books if it really is wrong. In other circles I'm seeing claims that a ban would be unenforceable (motivated thinking?), but it seems trivially easy to functionally ban at least the distribution of 'deepfake' porn in a way that would almost certainly reduce its dissemination, if not completely eliminate it, just as with, e.g., child sexual abuse imagery or zoophilia porn.)

I also see a lot of people in other circles being prompted by this discussion to argue about the ethics of AI image generation in general. I generally think this is basically stupid. The arguments claiming that AI image generation is tantamount to plagiarism (of the dataset images, I suppose) are all basically worthless as far as I can tell. But people who have bought into this line of thinking are now going as far as to say that, e.g., photorealistic porn generated with AI (even porn depicting completely synthetic likenesses) is a sexual violation (of all the nude or semi-nude women pictured in the dataset, I guess?). Either way, I am wholly unconvinced by these arguments and think they basically all stem from a bad understanding of how these AI models work, so I'm not super interested in discussing this axis of the debate. But I mention it because this community sometimes surprises me, so if anyone here has a really strong argument for why this might make sense that they think I haven't seen before, feel free to mention it.

Perhaps instead, in reality it actually is wrong to imagine or fantasize about what other people look like while naked. The reason this is so commonly accepted as benign is that it's so unenforceable to prevent.

IMO this is the correct answer. It is in fact wrong to fantasize about what someone looks like naked, or having sex with them, or what have you. It's very common, yes. But it's still wrong.

Let's say, for example, that you regularly fantasized about some female friend being naked. Furthermore, let's say you never told a soul but did write it in a diary which you kept safe and never let anyone see. Some might say you did nothing wrong. But even so, if your friend decided to snoop in your diary and found that out, she would be profoundly creeped out, and the friendship would be seriously damaged. I think the same would happen with a male friend too, of course; this isn't a gender thing.

But if sexual mores are so arbitrary/constructed that something that would otherwise be wrong can just be agreed upon as acceptable because it's unenforceable, how wrong can any ('victimless') violation of sexual mores really be said to be?

I don't think this is a good argument. First of all, we don't agree that it's acceptable. We simply realize that it's impossible to tell, so we can't do anything about it. Those aren't the same.

Second, I don't think that whether a norm is enforceable has any bearing on whether the activity is actually wrong. Even if we can't catch a murderer and bring them to justice, we don't say "well I guess it wasn't that wrong to murder that person". The immorality of an act, and our ability to punish that immorality, are unrelated to each other.

Let's say, for example, that you regularly fantasized about some female friend being naked. Furthermore, let's say you never told a soul but did write it in a diary which you kept safe and never let anyone see. Some might say you did nothing wrong. But even so, if your friend decided to snoop in your diary and found that out, she would be profoundly creeped out, and the friendship would be seriously damaged. I think the same would happen with a male friend too, of course; this isn't a gender thing.

This is appealing to consequences that only result if you leave records of your fantasies. I don't think it can extend to thoughts without completely changing the underlying situation.

Imagine applying the same standards to almost any other fantasy or imagined scenario you can have with another person.

Is it wrong to imagine yourself in an action movie with another person? How about if you do it frequently, and write it down?

Is it wrong to imagine the reaction another person will have to a gift you plan to give them?

Is it wrong to imagine conversations with other people?

Is it wrong to imagine punching another person?

I just don't see what line sexually fantasizing about another person is supposed to be crossing that these other things don't. I think policing thoughts is harmful and unproductive, and it is better to just accept that people all around you are imagining and doing things with their remembered images of you in their brains all the time. If people remember me at all when I'm not around, I'm flattered more than anything, even if they are remembering me in a negative light, or projecting me into a scenario that is harmful or embarrassing to my imagined doppelganger.

I think at best you could get a norm that amounts to, "If you fantasize about someone you know in real life, don't leave a paper trail."

I would add to this the very common self-help advice to visualize the success you want to have. As in imagining yourself winning the race, award, promotion, etc. And one of those et ceteras is "get the girl." Is it morally wrong to imagine oneself asking out a potential partner? Getting a yes? Having a great conversation over dinner? The first kiss? These don't strike me as remotely creepy. Why is "we have a great time together" creepy when you add "getting it on?"

I just don't see what line sexually fantasizing about another person is supposed to be crossing that these other things don't.

The problem with your argument is that you assume those other things don't cross lines. But fantasizing about hitting someone does cross a line, for example. It's bad to do that too. If I had to try to generalize a principle out of this (which I'm not sure I have the chops to do), it would be something like "don't fantasize about doing something with/to someone that they wouldn't want you to actually do with/to them". Fantasizing isn't bad in and of itself, it's the fact that you're fantasizing about something they would not be ok with that upsets people. Thus, fantasizing about having a conversation is fine because having a conversation is fine. Fantasizing about punching someone in the face is bad because punching them in the face is bad.

I also think you're really missing the mark if your takeaway is "just don't get caught and it's ok". I mentioned the diary because it's the only real way for someone to find out, but it isn't the record that would bother someone. It's the fact that you are doing it at all. "It's ok as long as I don't get caught" is literally the moral code of a child, but as an adult one should realize "no it's wrong even if nobody will ever know".

I don't think it's worth spending a lot of time on, but this sounds bat-shit crazy: neurotic, unhealthy, self-flagellating.

Or do you just have something against imagination and fiction entirely?

As a different perspective, avoiding fantasizing about things that would be bad to do in real life sounds like an aspect of virtue ethics. It is neurotic and unhealthy to focus on something that will never happen. Epicureans would focus on obtainable pleasures. Buddhists would say that these desires cause suffering. And so forth.

I think @SubstantialFrivolity is arguing that there is a very real moral and psychological injury being done to the people engaged in making and consuming these AI Generated images. I don't know if they would extrapolate to porn in general, but I would.

I mean, I mostly agree that it's not productive, and often not healthy, to spend a lot of time thinking about things that won't happen.

I think bringing in a moral judgement onto it makes no sense though.

For me, morality and health are intertwined. Any time someone says "should" they are making a moral judgement. Any time someone says, "I shouldn't do this, it's not healthy" they are making a moral judgement. "I shouldn't eat dessert, it's not healthy," is a moral decision that increases the virtue/habit of prudence and fortitude.

Fantasizing about sex with (uninterested female friend) isn't just about 'having sex with them immorally', it could also be a part of motivation to see if they are interested / pursue them, or even in a conservative moral framework attempt to court them for marriage and then have sex. "If something is obtainable" is not something one can know in many cases.

Do you think there is actual benefit to fantasizing about having sex with someone, in the eventuality that you actually get to have sex with that person at some point? I am not very certain that imagining having sex with a woman, picturing her liking this, enjoying that, actually helps when you encounter the flesh and blood woman, who likely acts and enjoys completely different things. In fact, I think it probably hinders a fruitful, mutually pleasing sexual encounter.

My argument is fantasizing about sex is ... part of or deeply related to desiring sex, in (same analogy as before) the same sense that 'imagining tasty food' is part of wanting that tasty food. This may be described as 'wanting it so badly you imagine it', but I don't actually think they're separate, or that 'imagination' is a discrete thing separate from normal thought. If you, just as a casual action, plan to reach for a cup, do you "imagine" reaching before you do? Not really, but ... sort of, partially, vacuously?

So 'imagining sex with someone' is just a normal thing. It's possible to spend too much time imagining it and not enough time in pursuit, and that could 'make the sex worse', but I don't think it's made worse in the normal case of imagining it.


Would you extend this standard to all forms of fiction, be they novels, movies, or video games, in which the protagonist harms or kills others? Or only if such harm is justified in context or the morals of the story are considered appropriate or applicable to the real world?

I don't extend it to novels where a protagonist harms another, has sex, or does any specific immoral action. I would extend it to a form of fiction whose sole point was to dwell on/glorify violence, sex, or a specific immoral action. Most forms of fiction provide some sort of philosophical evaluation of right/wrong and utilize immoral actions to demonstrate it. Or they provide a psychological snapshot of someone else's viewpoint, which broadens the mind of the reader. Or they provide a glimpse into another way of life.

Something like Agony in Pink, on the other hand, takes a little something away from everyone who reads it, be it time or a tiny amount of psychological well-being.

Right now I am fantasising about having sex with either you or @vorpa-glavo. I am respectfully appreciating the other of you. Who have I injured? No one, because things that happen exclusively inside my head have no effect on either of you. What injured your friend with the diary was learning about your fantasy, not the fantasy itself - before they invaded your privacy they were unaware and as a result unbothered.

What if it was a dream? What if I dreamed I was sexing either you or vorpa while a duck with the face of my father sang Uptown Funk backwards into a cucumber? Knowledge of this might affect our relationship negatively, and if you read about it in my dream journal one of you might be upset, but am I really to blame for the random firing of synapses in my head? No, and if it had remained a dream, no injury could be claimed. It's not 'it's ok as long as I don't get caught'; it's 'it's ok as long as it doesn't affect reality'.

You might then argue that fantasies often provoke real world actions and I would agree, and say that uncontrolled impulses are much more morally fraught, but that doesn't implicate the fantasies themselves.

I don't think "nobody was injured because nobody knows" is a reasonable defense. I don't think that there needs to be an injured party for something to be wrong.

Also dreams are an entirely different thing than actively fantasizing. The latter is a choice you make, the former is firing of random synapses in your brain. Intrusive thoughts that you don't dwell on are similarly not wrong. I've had dreams where I cheat on my wife, and I feel scuzzy in the morning. But once I get out of the post-dream haze, I realize I didn't actually do anything wrong. However, if I were dwelling on a fantasy about cheating on my wife I would be doing something wrong because that is a choice and is under my control.

I don't think that there needs to be an injured party for something to be wrong.

Okay. I accept this as a description of your values. I entirely disagree and do not empathize.

No harm no wrong is my view. And idle horny thoughts are the very peak of inconsequential harmlessness.

Fantasizing isn't bad in and of itself, it's the fact that you're fantasizing about something they would not be ok with that upsets people.

I think there's a legitimate concern that this kind of rule or principle would be too broad in its application, and lead people to unhelpful policing of their own mind.

People who tend towards internalizing disorders like anxiety and depression are already predisposed to get very inside of their heads, and second guess themselves. Telling people like that that they need to worry about whether they're thinking about other people in a respectful way seems like it's just giving these people more tools to torture themselves over their inappropriate thoughts.

I'm fairly emotionally stable and don't tend towards internalizing disorders, but I have a few friends that do, and the insides of their minds sound like horrible places. They're constantly being unkind to themselves, and even when their lives are going well they feel guilty and can't allow themselves a moment of happiness. Telling people like that that they have to feel bad about sexual thoughts about other people, or fantasies, is just not going to be good for them.

It's going to, at minimum, create people with OCD who constantly have intrusive thoughts about fantasizing about the people around them, and who then scrupulously beat themselves up for their failure to live up to the highest human ideals.

I'd rather have rules that don't stigmatize normal parts of human cognition, and don't have the risk of being taken way too far for a portion of the population.

I think the norm of, "It's perfectly normal to sexually fantasize about people you know, but don't let your fantasies affect how you treat them", is a much more actionable norm with fewer downsides, compared to, "Sexually fantasizing about people you know is morally wrong, and you should probably feel bad for doing it."

I think that's a fair point, but to combat that we should emphasize the difference between thoughts that just pop into your head and thoughts you actively entertain. It's only the latter that poses a moral problem, not the former.

I do sympathize with the plight of people who struggle with mental difficulties. I am one of those people. But I also don't think that the solution is to say "well it's ok" out of concern for their well being. There has to be a middle ground, it seems to me.