
Culture War Roundup for the week of January 30, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


What if the AI is so good at being photoshop+ that, using a picture of what you look like clothed, it can create a nude composed of the exact same pixels/information that would be present in an actual photograph you took of yourself in the same pose, except naked? In that case I actually am accessing the information that you call 'your' information, which is to say, that information which you agree is wrong for me to access.

To pre-empt any disputes you have about how possible this is, although I'm sure it is actually possible, let's scale this AI's capability back to a level of detail just good enough to trick even the depicted subject into thinking it was a real nude of themselves. (This is where the technology actually is right now. Boobs aren't exactly the most difficult thing to draw, especially at 512x512 resolution.) In this case, even if it's not the exact same information, it seems to me to be functionally the same information for all intents and purposes. So is it okay for me to use an AI to access what is, for all intents and purposes, the same as information which is otherwise immoral for me to access?

To pre-empt any disputes you have about how possible this is, although I'm sure it is actually possible

Actually, I'm pretty sure it's completely impossible, given how many bits of entropy there likely are in the ways naked bodies can vary: moles, random scars, and so on. You can also pick up new marks over time (again, scars), so even if there really were some perfect correlation between one's facial appearance and the exact configuration of moles, birthmarks, etc. on one's body, that still wouldn't account for one's lived experiences (to think I've found an appropriate occasion for that phrase). There's also variation in genitalia appearance, and weight gain and loss, which happen constantly, are a huge factor too. (Of course, this also depends on how much of a person's body they've shown off in information accessible to the AI, and thus how much is left for it to guess.)
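To put rough numbers on it, here's a back-of-the-envelope sketch in Python. Every count below is invented purely for illustration; the point is just that independent features each contribute their own bits, and bits add up fast:

```python
import math

# Invented, illustrative counts of distinguishable configurations for a few
# roughly independent bodily features; none of these numbers is measured.
features = {
    "mole/birthmark layout": 2**40,
    "scars, incl. newly acquired": 2**20,
    "genital appearance": 2**15,
    "current weight/shape": 2**10,
    "pubic hair state": 2**5,
}

# Entropies of independent features add: H_total = sum(log2(n_i)).
total_bits = sum(math.log2(n) for n in features.values())
print(f"total: {total_bits:.0f} bits")  # 90 bits under these guesses

# Enumerating 2**90 candidates at a billion guesses per second would take
# roughly 4e10 years -- about three times the age of the universe.
```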

let's scale this AI's capability back to a level of detail just good enough to trick even the depicted subject into thinking it was a real nude of themselves.

Even this seems unlikely, again given the amount of (changing) variation involved. The moment one mark isn't there, or the genitals are off in any way (and there are a lot of details in both penises and vulvas), or the pubic hair isn't how the person keeps it (or how they were keeping it within a plausible timeframe for the photo), etc., the whole thing is done. Just because it's AI doesn't mean it can do magic. It can't brute-force cryptographic keys any better than any dumb algorithm can, and again I believe the informational entropy involved in both formulations of your scenario is in a similar range.

In any case, I would still say that even if you did hit on a perfect prediction/match (which, as I will explain, is still not actually perfect in practice), it is still not a matter of you accessing my information. Let's say we both have the password "hunter2" on this site. Are we then accessing each other's private information (since, after all, what's more private than a password) every time we log in? No, because the context is different. In one context, "hunter2" is my information. In another context, "hunter2" is yours. The only way you could be accessing my information is if you tried to use the "hunter2" information in my context, that is, tried to use it to log into my account.
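Real login systems actually bake this context-dependence in: the same password string is stored differently for every account, because each copy is bound to its own random salt. A minimal sketch (illustrative code, not any particular site's implementation):

```python
import hashlib
import os

def store_password(password: str):
    """Derive a per-account stored record from a password using a random salt."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

# Two accounts pick the identical password...
alice = store_password("hunter2")
bob = store_password("hunter2")

# ...but the stored records differ, because each binds "hunter2" to its own
# context. Same string, different information.
print(alice == bob)  # False (with overwhelming probability)
```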

Along those lines, I'd say the only context in which the information you've generated (hence your information), a perfect prediction of the appearance of my naked body, can become equivalent to the actual information of the actual appearance of my naked body is if you can see me naked and confirm that it's a perfect prediction. After all, information and uncertainty are inherently interlinked. It's not actually a perfect prediction, at least from the perspective of any observer or process we care about (I'm presuming omniscient oracles are not in play in our social relations here), until you can confirm that it is. Going back to the password analogy: if you're brute-forcing a password, it's only meaningfully "right" at the moment you confirm it is and successfully log in with it, not at the moment the equivalent string is first generated and enters your computer's memory.
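As a toy illustration of that last point (everything here is made up for the example): the winning string passes through memory long before it means anything, and only the confirming check turns it into knowledge.

```python
import itertools
import string

def brute_force(check):
    """Enumerate candidates; only the check() call confers any knowledge."""
    alphabet = string.ascii_lowercase + string.digits
    for length in itertools.count(1):
        for chars in itertools.product(alphabet, repeat=length):
            candidate = "".join(chars)
            # The true password will sit in memory here at some point
            # regardless; it only becomes information *about the account*
            # at the moment check() confirms it.
            if check(candidate):
                return candidate

# Short stand-in so this terminates quickly; "hunter2" itself would take
# on the order of 36**7 tries.
print(brute_force(lambda s: s == "ab1"))
```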

Except, in that case... you've already necessarily seen the actual, 100% real thing, so why do I care about the perfect prediction of it anymore? (If you've only ever seen it in person but keep a record of the prediction, that may change things, and I could address that, but this is all getting into silly technicalities built on an implausible scenario anyway, so I'll end it at this extension.) If I type my password in front of you and don't do a good enough job of hiding it, then I obviously don't have to worry about you brute-forcing it anymore (nor, from a privacy perspective, do I care any extra, beyond caring that you saw the password, that you may have brute-forced it earlier), because the worst end result has already necessarily happened in the process of confirmation.

Along those lines, I'd say the only context in which the information you've generated (hence your information), a perfect prediction of the appearance of my naked body, can become equivalent to the actual information of the actual appearance of my naked body is if you can see me naked and confirm that it's a perfect prediction.

I suppose you're right about this.

What if you're a very accomplished painter and you're able to tell from someone's clothed body exactly how to paint them such that it matches their body when nude?

Maybe that should have the same moral or ethical implications.

In that case I actually am accessing the information that you call 'your' information, which is to say, that information which you agree is wrong for me to access.

No, I think OP (and I) are treating tangible possession as a key factor. The information may be functionally identical, but it is not actually the same.

In the version of the hypothetical where the AI actually can exactly recreate the way a person would look naked in a certain pose, using only a clothed photo of them in that pose as reference, we can agree that the information is 'actually' the same, though, right? One pixel at location (x, y) with color #f0d190 is 'actually the same' as another pixel at the same location (x, y) with color #f0d190, regardless of whether that pixel exists there because it was reverse-engineered by an AI or normally-engineered to be there by being captured in a digital photo.
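At the level of the data itself that much is trivially true: equal bytes are equal, and no function of the data alone can recover where the data came from. A trivial sketch (the hex color is just the one from the example above):

```python
# Two pixel buffers built from the same bytes: one standing in for a camera
# capture, one for an AI render. The data cannot remember its provenance.
captured = bytes.fromhex("f0d190") * (512 * 512)
generated = bytes.fromhex("f0d190") * (512 * 512)
print(captured == generated)  # True -- provenance lives outside the data
```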

Even granting that, they are two separate copies of that data. The ability to possess one does not make it ethically or legally justifiable to steal the other.

No. In that hypothetical we lack the information that the picture corresponds to something that exists in reality. I can ask a random number generator for 512*512 pixels, and the result can be equivalent to a picture of you, but unless I somehow find out that it is equivalent, I am lacking information that I would have if I took a photo of you.
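For scale (simple arithmetic, assuming 24-bit color at the 512x512 resolution discussed above):

```python
# Degrees of freedom in a random 512x512 image at 24 bits per pixel:
bits = 512 * 512 * 24
print(bits)  # 6291456
# The chance of the generator matching one specific photo is 2**-6291456,
# roughly 1 in 10**1893917 -- and even a hit carries no information about
# you until someone compares it against the real thing.
```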

I suppose you're right.