
Culture War Roundup for the week of January 30, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Women's feelings are not god.

I can maybe accept most of what you're saying, specifically as it answers the question: 'should these images be illegal?' Perhaps it is a violation of the principle of free speech to make things like this illegal, or a slippery slope for privacy, or a needless cession to emotional people, etc. That said, whether or not it should be made illegal, I expect that doing this kind of thing will become legally harder, if not fully illegal, in the near future. But I digress.

Many others in the thread are also focusing specifically on the legality dimension, which I regret and for which I bear responsibility. I was generally more interested in talking about what is moral/ethical, and less in what is or should be legal, even though I did indeed ask about legality in the body of my post. Even if these things are not illegal, their morality is still hugely important, as it determines who gets 'cancelled', etc.

And to that end, in figuring out what is ethical or moral, I think feelings do matter. For example, I think it would be immoral in many situations to do something I knew would make a person "feel bad" just because doing it gave me sexual gratification, legality notwithstanding. If I were trying to design the fairest US government, I might not make deepfake porn illegal. But if I were trying to be as morally/ethically upstanding a person as I could be, there are plenty of things I shouldn't do that are still legal.

I'm of the relatively firm belief that it isn't immoral to fantasize about having sex with someone, even if they haven't consented to you having such a fantasy. I'm not sure what I think when it comes to making highly realistic porn of them. If you were Superman and had X-ray vision, would it be unethical or immoral to look into the women's locker room? If not, why does everyone seem to think it would be? If so, what's the difference between that and having a vivid, realistic imagination and using it for sexual purposes in the same way?

Another commenter prompted me to believe that a significant amount of how unethical it is lies in whether or not the deepfaked person knows about the porn being made of them, because knowing that it exists is what inflicts psychological harm on them. I think I agree with this. However, the women in the shower into which you're peeping might not ever know that you've peeped at them, so is it not wrong to be a peeping tom (as long as you never get caught)? Teen coming-of-age movies from the 80s didn't seem to think so (the shift in attitudes between then and now might be pertinent to the discussion). Regardless, I currently do think that, e.g., spying on the women's locker room would be unethical, and I think most people today would agree that it's probably wrong to do such a thing. This is the angle I'm really trying to disentangle here: the moral and ethical one, and less so the legal one.

And to that end, in figuring out what is ethical or moral, I think feelings do matter

OP didn't say "feelings don't matter". They said "women's feelings aren't God", i.e. they are not the sole, overriding consideration in ethical disputes.

I'm of the relatively firm belief that it isn't immoral to fantasize about having sex with someone, even if they haven't consented to you having such a fantasy

Case in point: some women apparently dislike being "objectified". I don't really care tbh. What goes on in my skull is my business.

If not, why does everyone seem to think it would be?

Because it is a violation of actual privacy: the actual woman is in that room, with a reasonable expectation of privacy, and you are peeking in. Even if it weren't sexual, there are all sorts of other concerns with such snooping (e.g., can they steal your stuff now that they've seen your locker code?).

With deepfakes, I guess it depends on how much verisimilitude something can have before you think it violates your "actual" privacy. If I have a deepfake of Angelina Jolie that, for whatever reason, has serious flaws and inaccuracies, have I violated her privacy in the same way? That isn't the real Jolie; it's a virtual image that isn't even perfectly accurate.

What if it was trained on topless images of Angelina and perfectly matched her in her physical prime? I think an argument could be made that she removed her own privacy here, in a way she can't expect to get back (we can't unsee her body either way).

I don't think we have an easy rule. I also don't know that this can/should be grounded in privacy. Maybe defamation concerns would be more viable?

However, the women in the shower into which you're peeping might not ever know that you've peeped at them, so is it not wrong to be a peeping tom (as long as you never get caught)?

Besides the reason already given above? It's more reasonable to imagine you will never be caught for private files on your computer than for peeking into someone's bedroom. Simply not being physically there reduces the risk of detection, and thus harm to the person.

With deepfakes, I guess it depends on how much verisimilitude something can have before you think it violates your "actual" privacy. If I have a deepfake of Angelina Jolie that, for whatever reason, has serious flaws and inaccuracies, have I violated her privacy in the same way?

This is the main thing I am trying to get at with the locker room/fantasizing examples. Current AI can inpaint nudity onto clothed pictures of people without necessarily producing serious flaws or inaccuracies. (Not to say it always succeeds at this, just that it does reasonably often.) And training the AI on the actual person's breasts isn't required for the result to be highly similar to what they actually look like topless, at least for some women, since some people's breasts are visually similar to other people's. Thus a person who has never consented to having topless photos of themselves anywhere on the internet can have topless images of themselves created to a very high degree of verisimilitude to their actual naked form, using e.g. porn stars' breasts as training data.

Technically, I suppose, the person operating the AI can't know whether the subject has, e.g., a mole on their chest. So maybe some sense of privacy is maintained because uncertainty technically remains: without actually being able to look at a real topless image of the subject, there's no way to verify that the AI-inpainted nudity is highly similar. Even if the inpainted nudity really is extremely similar to their topless form, this isn't known to the person creating or viewing the deepfake.

Regardless, the pertinent fact overall is that the current technology is at a level where it is possible, at least somewhat often, to get outputs that the depicted person themselves could or would mistake for real nude photos. This seems to me to be functionally very similar, if not identical, to looking at someone changing or naked without their consent or knowledge. You're right that it doesn't imply other security concerns in the way an intruder present in a changing room would, but I'm not sure that's what's actually wrong with, or disliked about, peeping toms; I feel like a significant amount of the dislike of someone seeing you changing is the plain fact that they know what you look like naked (and maybe also the knowledge or likelihood that they are fantasizing about you sexually). That is, most people would be about as opposed to a person using X-ray glasses, or more realistically a hole in the wall, to look inside their locker room while they changed as they would be to someone hanging from the rafters. I can't know for certain, though, since to my knowledge I've never been the victim of any such situation.

I don't think we have an easy rule. I also don't know that this can/should be grounded in privacy. Maybe defamation concerns would be more viable?

Well, as far as legality goes, it seems like copyright is the main way people currently take down unwanted deepfake porn of themselves. Regardless, though, I'm less interested in the legality and more in what should or shouldn't be generally considered acceptable ethically or morally, for which privacy, violations thereof, and perhaps other things do seem like relevant concerns.

And training the AI on the actual person's breasts isn't required for the result to be highly similar to what they actually look like topless, at least for some women, since some people's breasts are visually similar to other people's. Thus a person who has never consented to having topless photos of themselves anywhere on the internet can have topless images of themselves created to a very high degree of verisimilitude to their actual naked form, using e.g. porn stars' breasts as training data.

Porn stars not only self-select based on their agility in smoothly changing positions in front of cameras--incidentally, a skill shared with politicians--but also for how good they look naked. If an AI image generator is trained on naked bodies of porn stars, its AI-completed naked version of me will look amazingly better than I actually do.

Women's breasts, in particular, come in a variety of shapes, and they are frequently not symmetric. Older women's breasts tend to be flat--think more like those pictures in the old National Geographic depicting women in some far-away hunter-gatherer tribe. The nipples and areolae come in various shapes and sizes, and change with temperature. Some have inverted nipples. Practically all of this variability is hidden by the kinds of clothes women wear, especially if they are into padded bras.

The distribution of body fat also varies significantly for overweight women, and this is also mostly hidden or distorted by clothes.

Women's breasts, in particular, come in a variety of shapes, and they are frequently not symmetric. Older women's breasts tend to be flat--think more like those pictures in the old National Geographic depicting women in some far-away hunter-gatherer tribe. The nipples and areolae come in various shapes and sizes, and change with temperature. Some have inverted nipples. Practically all of this variability is hidden by the kinds of clothes women wear, especially if they are into padded bras.

I'm aware of this. The point is that not everyone with good-looking (porn-star-like, if you will) breasts decides to become a porn star. These people are thus vulnerable to having very realistic versions of their breasts recreated from porn-star data, despite never putting images of their actual breasts on the internet. Additionally, there's plenty of data on non-porn-star-like breasts out there to train on. The point is not that AI will always generate topless versions of people that closely match what their breasts actually look like; it's that it can with at least some relative frequency.

Making deepfake porn of someone for noncommercial purposes should be fair use. It's clearly transformative, and it doesn't have any effect on the potential market for the work, unless you think the copyright owner will sell their own picture for use in porn and this makes it harder to do so.

Maybe true, but I guarantee you that the vast majority of people paying money to host websites that distribute deepfakes are doing so for commercial purposes. E.g., the streamer in question had accessed a website that required him to pay 15 dollars to use it.

If so, what's the difference between that and having a vivid, realistic imagination and using it for sexual purposes in the same way?

One extracts real, factual information; the other does not. Your actual naked body is your information. How I imagine your naked body to look, or how I conceive of it by using glorified Photoshop+ to put your head on another naked person, is my information.

What if the AI is so good at being Photoshop+ that, using a picture of what you look like clothed, it is able to create a nude composed of the exact same pixels/information that would be present in an actual photograph you took of yourself in the same pose, except naked? In that case I actually am accessing the information that you call 'your' information, which is to say, that information which you agree is wrong for me to access.

To pre-empt any disputes about how possible this is, although I'm sure it is actually possible, let's scale this AI's capability back to a level of detail just good enough to trick even the depicted subject into thinking it was a real nude of themselves. (This is where the technology actually is right now. Boobs aren't exactly the most difficult thing to draw, especially at 512x512 resolution.) In this case, even if it's not the exact same information, it seems to me to be functionally the same information, for all intents and purposes. So is it okay for me to use an AI to access what is, for all intents and purposes, the same as information that is otherwise immoral for me to access?

To pre-empt any disputes about how possible this is, although I'm sure it is actually possible

Actually, I'm pretty sure it's completely impossible, given how many bits of entropy there likely are in the variations of how naked bodies can look: moles, random scars, variation in genitalia appearance, and so on. It's even possible to get new marks on your body over time (again, scars), which means that even if there really were some perfect correlation between one's facial appearance and the exact configuration of moles, birthmarks, etc. on one's body, that still wouldn't account for one's lived experiences (to think I've found an appropriate occasion for that phrase). There's also weight gain and loss, which happens constantly and is a huge factor too. (Of course, this also depends on how much of their body a person has shown off in information accessible to the AI, and thus how much is left for it to guess.)

let's scale this AI's capability back to a level of detail just good enough to trick even the depicted subject into thinking it was a real nude of themselves.

Even this seems unlikely, again given the amount of (changing) variation involved. The moment one mark isn't there, or the genitals are off in any way (and there's a lot of detail in both penises and vulvas), or the pubic hair isn't how the person keeps it (or how they plausibly kept it in the timeframe of the photo), etc., the whole thing is done. Just because it's AI doesn't mean it can do magic. It can't brute-force cryptographic keys any better than any dumb algorithm, and again, I believe the informational entropy involved in both formulations of your scenario is likely in that range.
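To make that entropy claim concrete, here's a toy back-of-the-envelope estimate. The feature and state counts below are invented for illustration, not measured; treat this as a sketch of the order of magnitude, nothing more. If a body has n independent visible details (moles, scars, pigmentation quirks, and so on), each with m roughly equally likely configurations, the total entropy is

    H = \sum_{i=1}^{n} \log_2 m_i \approx n \log_2 m

so even modest assumed values like n = 40 and m = 16 give H ≈ 40 × 4 = 160 bits, i.e. cryptographic-key-sized, which is exactly the brute-force regime described above.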

In any case, I would still say that even if you did hit on a perfect prediction/match (which, as I will explain, is still not actually perfect in practice), it is still not a matter of you accessing my information. Let's say we both have the password "hunter2" on this site. Are we then accessing each other's private information (since, after all, what's more private than a password?) every time we log in? No, because the context is different. In one context, "hunter2" is my information. In another context, "hunter2" is yours. The only way you could be accessing my information is if you tried to use the "hunter2" information in my context, that is, tried to use it to log into my account.

Along those lines, I'd say the only context in which the information that you've generated (hence your information) of a perfect prediction of the appearance of my naked body can become equivalent to the actual information of the actual appearance of my naked body is if you can see me naked and can confirm that it's a perfect prediction. (After all, information and uncertainty are inherently interlinked. It's not actually a perfect prediction, at least from the perspective of any observer or process we care about (that is, I'm presuming omniscient oracles are not in play in our social relations here), until you can confirm that it is. Going off the password analogy: if you're brute-forcing a password, it's only meaningfully "right" at the moment you confirm it and successfully log in with it, not at the moment the equivalent string is first generated and enters your computer's memory.)

Except, in that case... you've already necessarily seen the actual, 100% real thing, so why do I care about the perfect prediction of it anymore? (If you've only ever actually seen it in person but have a record of the prediction, then that may change things, and I could address that, but this is all getting into silly technicalities based on an implausible scenario anyway, so I'll end it at this extension.) If I type my password in front of you and don't do a good enough job of hiding it, then I obviously don't have to worry about you brute-forcing it anymore (nor, privacy-wise, do I care any extra, beyond how much I care about you seeing the password, that you may have brute-forced it earlier), because the worst end result has already necessarily happened in the process of confirmation.
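As a minimal illustration of that confirmation point, here's a Python sketch (the accounts, salts, and the shared "hunter2" string are all invented for the example): a guessed string only becomes knowledge about my account at the moment it's verified against my record, and the same string stored under two accounts is two separate pieces of information.

    # Minimal sketch of the "same string, different context" point.
    # Accounts and the shared password are invented for illustration.
    import hashlib
    import hmac
    import os

    def store(password: str) -> tuple[bytes, bytes]:
        # Sites keep a salted hash of the password, not the password itself.
        salt = os.urandom(16)
        return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    def verify(guess: str, salt: bytes, stored: bytes) -> bool:
        # A guess only "becomes" the password at this confirmation step.
        candidate = hashlib.pbkdf2_hmac("sha256", guess.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, stored)

    # Two accounts happen to share the string "hunter2"...
    alice_record = store("hunter2")
    bob_record = store("hunter2")

    # ...but each record is distinct information (different salts), and a
    # brute-forced "hunter2" tells you nothing about Alice's account until
    # you actually confirm it against her record:
    print(verify("hunter2", *alice_record))  # True: confirmed in Alice's context
    print(verify("hunter2", *bob_record))    # True: a separate fact about Bob's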

Along those lines, I'd say the only context in which the information that you've generated (hence your information) of a perfect prediction of the appearance of my naked body can become equivalent to the actual information of the actual appearance of my naked body is if you can see me naked and can confirm that it's a perfect prediction.

I suppose you're right about this.

What if you're a very accomplished painter and you're able to tell from someone's clothed body exactly how to paint them such that it matches their body when nude?

Maybe that should have the same moral or ethical implications.

In that case I actually am accessing the information that you call 'your' information, which is to say, that information which you agree is wrong for me to access.

No, I think OP (and I) are considering tangible possession as a key factor. The information may be functionally identical, but it is not actually the same.

In the version of the hypothetical where the AI actually can exactly recreate the way a person would look naked in a certain pose, using only a clothed photo of them in that pose as reference, we can agree that the information is 'actually' the same, right? One pixel at location (x, y) with color #f0d190 is 'actually the same' as another pixel at the same location (x, y) with color #f0d190, regardless of whether that pixel exists because it was reverse-engineered by an AI or because it was captured in a digital photo.
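A tiny Python sketch of that claim (the 2x2 image and its single hex color are invented): byte-identical data carries no record of whether a camera or a model produced it.

    # Toy illustration: identical bytes are identical, whatever produced them.
    import hashlib

    # One buffer "captured by a camera", one "reconstructed by an AI";
    # both happen to hold the same 2x2 grid of #f0d190 pixels.
    captured = bytes.fromhex("f0d190" * 4)
    reconstructed = bytes.fromhex("f0d190" * 4)

    # Nothing in the data itself records its provenance:
    assert captured == reconstructed
    print(hashlib.sha256(captured).hexdigest()
          == hashlib.sha256(reconstructed).hexdigest())  # True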

Even granting that, they are two separate copies of that data. The ability to possess one does not make it ethically or legally justifiable to steal the other.

No. In that hypothetical we lack the information that the picture corresponds to something that exists in reality. I can ask a random number generator for 512x512 pixels, and the result can be equivalent to a picture of you, but unless I somehow find out that it is equivalent, I am lacking information that I would have if I took a photo of you.

I suppose you're right.