This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

How about a different kind of AI culture war? I speak of course of non-consensual pornography generation. The most outrageous article I read about this recently was probably this AP article: Boys at her school shared AI-generated, nude images of her. After a fight, she was the one expelled. The girl in question is 13 and she started a fight on a school bus with one of the boys later charged with a crime for sharing the images.
It turns out that finding apps that advertise this kind of functionality is not hard. In fact, part of the reason I bring this up is that this capability seems to be integrated into one of the largest AIs: Grok. There's been some controversy on X over the last couple of days after Grok allegedly generated pornographic images of a couple of minor girls. Additionally, the bot's "media" tab was disabled, allegedly due to the discovery that lots of people were using the bot to make pornographic edits of other people's pictures. Though the media tab is gone, I did not find it very hard to get Grok to link me to its own posts with these kinds of edits.
There is, I think understandably, a lot of controversy going around about this. It's not that it was previously impossible to make this kind of content, but the fidelity and availability were much more limited, and it certainly required more technical skill. Being something you can do without even leaving your favorite social media app seems like something of a game changer.
Frankly I am unsure where to go with this as a policy matter. Should someone be liable for this? Criminal or civil? Who? Just the generating user? The tool that does the generating? As a general matter I have some intuitions about AI conduct being tortious but difficulty locating who should be liable.
Which is it? Either it's an image of her, or it's an AI-generated image.
It seems like, as a society, we're going to have to learn how to distinguish between photos of actual people and AI-generated images that are amalgamations of many different people. Just as literature has always had thinly-veiled fictional accounts of recognizable people, we're going to have synthetic images that resemble real people.
I don’t know about you, but I’m happy to call this an image of Henry VIII, even though it’s an artist-generated image and not a photograph.
If someone did this to my daughter I'd slap the shit out of them. I'd also be proud of my daughter for taking the matter into her own hands and beating the shit out of them.
Conversely, if my son did this I would hope he'd get his ass kicked by the girl and if her father came looking for him I would say I understand bro have at him just don't send him to the hospital.
I think I'm expressing majority feelings about this. Surely we can form rules around this accordingly.
Are you hardcore no-fap? Not that I think the dissemination and the other social dynamics around this are great, but obtaining nudes of cute girls in your high school would have been considered a high accomplishment for high school boys for as long as high schools and photos have existed.
Like with all porn, the AI nudes are a little queer. It's much higher status and much better behavior to seduce her in real life, but some edits to an already existing thirst-trap image are hardly something you should let your son be beaten over. I'd suggest the more appropriate action is that the girl is removed from any ability to be online and the boy has to agree to supervised courtship if the girl's father so demands.
Uh huh. It's possible you really are badass enough to slap some teenage boy, but in recent years, men who are likely wealthier and more powerful than you have suffered larger offenses against their daughters and done fuck all.
One example would be a GS executive who had a daughter (Mayli Wang) deep tonguing ass and getting Facial Abused for but three figures a pop. He spent his last few years buying rights to her content to try and disappear it all, but then he died. One might argue that by dying early, he got away easy compared to someone like Laurence Fishburne.
Yeah, it's a pretty agreeable resolution to use short, sharp corporal punishment as a deterrent for antisocial behavior.
Good luck getting anyone to agree about what to do with your daughter if she were to start sending actual nudes around the school to a bunch of guys, though.
Bravado tends to fail, there.
The Western agreement would be to track down and punish the guys if they did anything other than preemptively deleting the nudes before they hit their phones.
Western men are generally unable or unwilling to thot-patrol their daughters, but sometimes they’ll try to get the state to ruin the lives of random guys who neglected to help maintain their daughters’ Wonderfulness.
This comic will always be with us. It amuses me that in the comment section it's talking about Net Neutrality. Remember that massive culture-war issue and how it completely disappeared?
This is one of those losing issues that only weirdos think should not be illegal. That's the cross we libertarians bear. Though, unlike price gouging and insider trading, which I think are good things, this does fall under 'immoral' and should come with severe social ramifications. As with many things, if it is going to be illegal, the law should pertain not to the tool but to the person who misuses the tool. But hey, I guess we just make it illegal to get some things done while your car warms up.
In hindsight, the many and various traffic patterns on the Internet make strictly-defined "network neutrality" difficult from a technical perspective, at best: I want my VOIP traffic to get minimal latency, but I want real-time fixed bandwidth for video streaming (or is that variable bitrate these days?), and sometimes I'm downloading, um, Debian ISOs and just want them eventually. Add in LTE bandwidth and I really want my 911 call to take precedence over dozens of zoomers on TikTok. Over-provisioning to avoid real QoS questions is darn expensive.
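To make that tension concrete, here's a toy sketch in Python. The traffic classes and priority numbers are made up for illustration (real routers use things like DSCP marking and weighted fair queueing rather than a single heap); the point is just that keeping VOIP latency low means deliberately not treating every packet identically:

```python
# Toy strict-priority packet scheduler. Illustrative only: class names
# and priority values are invented, not any real ISP's policy.
import heapq
import itertools

PRIORITY = {"voip": 0, "video": 1, "bulk": 2}  # lower number = sent first
_arrival = itertools.count()  # FIFO tie-breaker within a class

queue = []

def enqueue(packet: bytes, traffic_class: str) -> None:
    heapq.heappush(queue, (PRIORITY[traffic_class], next(_arrival), packet))

def dequeue() -> bytes:
    # Drains all queued VOIP before video, and video before bulk.
    # A strictly "neutral" scheduler would pop in pure arrival order,
    # letting a burst of ISO downloads delay a latency-critical call.
    _, _, packet = heapq.heappop(queue)
    return packet

enqueue(b"debian-iso-chunk", "bulk")
enqueue(b"voip-frame", "voip")
assert dequeue() == b"voip-frame"  # the call jumps ahead of the download
```

(The sketch also shows the known downside: strict priority can starve the bulk class entirely, which is one reason production gear weights classes instead.)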
At the time, the concern was that (landline!) ISPs (often cable companies) would start charging for data usage, pricing out then-new competitor streaming services, or worse, start charging for access to sites and services. Despite the official repeal of the neutrality policy, no landline services seem to have started charging for specific site access, and the obvious sites to charge extra for now (Netflix, Amazon, Disney+) have more money and lawyers than the ISPs ever did. Metered landline bandwidth was never accepted by consumers, although I'm not sure if it's actually prohibited specifically, and it's a bit outside the scope of "neutrality". Honestly, the worst offence against the former policy I recall seeing was a mobile provider teaming with a streaming service to not count against your bandwidth limit.
IMO, the culture war fight was maybe a bit overblown, but if it did anything, it provided enough cover for Big Streaming to take off. Is that the win the kids wanted? It is what they were asking for, I guess. Go ask them how they feel about streaming slop now, and maybe opinions would be different.
But it is an interesting observation that active CW fronts just sometimes up and fade away.
It's in the school's incentive to deny, deflect, and cover up anything bad that might be happening in the school. On the other hand, it's in the police's incentive to hunt down people and put them behind bars, whether justified or not. We know that schools often cover up straight-up rapes and beatings and pretend nothing is wrong.
As to whether or not AI nudes should be illegal, I'm not against classifying deepfakes of real underage people as illegal.
Of course, as an AI company owner I would want to avoid generating this kind of content, but the definition seems quite vague. An image of a kid playing can be normal, but can it also be CSAM if the photographer was thinking sexual thoughts? IMO, CSAM should only cover content where actual abuse happened, or actually pornographic content.
From what I'm aware of the Grok story, Grok put them in underwear and swimwear, without including any directly explicit content.
I tried manipulating an image of my butt crack earlier today, to imagine someone climbing down it. In any case, Grok wouldn't comply. It's not some wild west of content generation (though it doesn't restrict manipulation of celebs, as long as it's SFW).
People need to get more mature about images, I think.
Imagine if, at the dawn of the internet, there was a big shock at all the dodgy information sources, conspiracies, cults and so on that emerged. 'We invented freedom of speech in an era of printing presses, not high-capacity assault routers!' someone might say. And it's true, there's a difference due to the speed and nature of the connection. We are bombarded with information, it can be quite overwhelming and mindbreak the weak-willed. Ziz cultists, Extinction Rebellion, retarded tiktok trends... The internet seems to have catalyzed many bad things in ways that aren't easy to counteract without squashing the whole thing.
But the answer isn't to shut down the internet, the answer is to strengthen our mental integrity, raise our willpower stat.
What is the alternate answer here? Restrict Grok from putting people in a bikini, with an ahegao face, with milk sprayed on them? Restrict Grok and they'll just go back to civitai, where this stuff has been going on for years. Men clearly desire lewd images of women.
How are you supposed to restrict this? If it's libel, then what about the time-honoured tradition of spreading false rumours about people, is that banned too? Do we all line up and go to the nanny state about how we were wrongly smeared as whiny, dumb, small-penised, ugly bitches who did something unspeakable at a party? Do we all line up in front of some ruinously slow legal system and give lawyers money to defend our reputations (they can't defend your reputation even if you win in court)? Do we have AIs surveilling every private groupchat to defend the honour of maidens? A gigantic Chinese-style state surveillance apparatus to uphold the wholesomeness of the entire internet?
The best solution is for men and women to act in a more dignified and honourable way and not do any of this in the first place. That clearly isn't going to happen after decades and decades of subverting and violating just about all of the old taboos. What are taboos and censorship for if not enforcing a standard of behaviour?
Men still have the responsibility of dying in a trench for their country (now with their drone-killers filming their deaths for war propaganda), women will need to accept some downsides in a technological environment that's freed them from a lot of their unpleasant work. Picking and choosing to preserve just the taboos that overwhelmingly benefit women over men isn't a sustainable pattern in the long term.
Modern U.S. society will head straight for its grave rather than break from that pattern.
A given chick certainly makes it easy for Grok, when chances are she has already uploaded or sent lots of skimpy photos of herself that a guy can use in the generative process to make bikini pics or porn of her. Especially when Grok's training data already contains a vast sea of other chicks who have uploaded nudes, bikini pics, porn of themselves. Minimal edits needed to render a vaguely-plausible photo or even a quite realistic one.
The usual answer: We can expand the protections afforded females by limiting the freedom and protections afforded males. We can easily find some gerrymandered reason why Grok should be neutered, why things like "revenge porn" and making AI porn should be illegal (with the de facto burden of proof being on the defendant), while things like smearing someone as a "whiny, dumb, small-penised, ugly bitch[]" falls under free speech. If a man objects, it would only serve as further proof that he is, in fact, a whiny, dumb, small-penised, ugly bitch.
A woman who calls a man a rapist in an "Are We Dating the Same Man?" social media group is speaking her Emotional Truth and Lived Experience as they at least once had sex after she had a few drinks. A man who uploads a nude of her to "Are We Dating the Same Woman?" may face prison time and/or a hefty monetary judgment, if such a group hadn't already been yeeted.
There's just no way around this. I have an AI image-gen model on my computer right now; anyone with a current-gen MacBook could inpaint any image into pornography. It's not the kind of thing you can realistically ban. As a society we're just going to have to find a way to deal with this, the same way we deal with the fact that anyone at any time could have drawn these same images if they wanted to badly enough. The genie is thoroughly out of the bottle and no amount of outrage will ever put it back.
Well, if GPU and RAM prices are any indication, we might get some de facto restrictions, in that very few people can afford a rig powerful enough to actually produce the images.
I was generating porn locally with Stable Diffusion XL running on an $800 gaming laptop with an RTX 4050 and 6 GB of VRAM two years ago. Most of what I made was hentai, but it would have been trivial to train a LoRA on a couple dozen SFW photos of a particular girl, then make porn of her on demand.
Why would that be the case when a seven-year-old laptop is already powerful enough to do it? You don't need fancy new hardware when the existing, far-from-top-of-the-line hardware will do fine.
This is true, but unless the intention is to keep salvaging old hardware as the various components die, we're still ending up in the same place.
(My actual guess is that capacity WILL expand to meet demand, so this is probably a shortish term crunch)
I'm not talking about old high-end GPUs but the middle / low-middle end that's now eclipsed even by integrated GPUs. When you equalize for processing power, GPUs are still way cheaper than when the hardware capable of image generation first became common (which was several years before the software was invented). You really don't need a 32 GB 5090 just to do some basic NSFW generation / inpainting.
You could make it pretty broadly inaccessible: ban all open-weight models; require any image generation to have strict safeguards and reporting of attempts to authorities; enforce severe criminal penalties. Your existing model would be pretty much untouchable, but it couldn't easily be shared, and a decade from now most copies of it would have been lost to end users. You could even require manufacturers to include firmware on new hardware that bails on unapproved workloads, but that seems like it'd be overkill.
Not saying that this is what I'd like, but it seems doable.
It's not even slightly theoretically doable. The theoretical knowledge of how these models work is broadly available. Further, not only are adversarial countries going to completely ignore your desire for model control, they are also currently the ones who produce most of our hardware, including FPGAs and GPUs. Also, you can't include firmware in new hardware that can survive contact with the consumer: flash chips are easily desoldered, dumped, and re-programmed, and firmware mods and flashing tools are easily accessible.
Torrenting continues to exist. You just can't realistically prevent the distribution of a few gigs of data. Even if you eradicated all the currently existing models, it's not particularly hard to train the safeguards out of new models, unless we're just never going to let professionals render images locally.
I mean, at the end of the day it is difficult to reliably tell teenage girls from adult women, and so I suspect most major AIs will simply not allow you to turn pictures pornographic. There will almost certainly be laws creating strict liability for AIs, and even nonconsensual nude images of an adult woman aren't going to go over well.
Now don't get me wrong, I don't think AI erotica is going away. But just as YouTube doesn't want hardcore porn on it, ChatGPT and Grok will eventually get much stricter about erotica, and there will likely be separate, much less PG-13, AIs.
That is just a speedbump. All you have to do is reverse the direction. Take a random porn image, inpaint some clothes on it locally (doesn’t have to be very high quality, this is just to pass any censorship of the faceswap model), then do an AI face swap and finally unmask the original pornographic parts using standard non-AI editing. Hell, you could probably not even bother with the clothes by just using a closeup crop of the porn model’s face and then copy paste the result over the original.
The current trend is actually going towards less strict. slopgpt in the past blocked NSFW text content, but Sam Altman was talking about adding it. When it comes to text, the actual legal liability seems quite low, and if everyone does it, then the reputational risk is low as well. I would predict the opposite and say that likely all the major AI players will allow NSFW text in some form or another within a few years. Of course, images are still going to be a danger area that most will steer away from.
Well, duh? The article itself says they didn't have any hard evidence. Is she expecting the school to take action against this boy because he's who she and her friends "suspect" created the images, without any hard evidence?
It's disgusting and awful behavior by whoever made them, but unless you've got some real evidence (and I'd even take one of the friends of the accused boy saying that he did it as stronger evidence than what the article presents), I don't see what you could expect the school to do here. And even though it's nasty and disgusting behavior, starting a physical fight over it is going to get the person fighting in trouble until schools get rid of their retarded zero-tolerance policies. I spent plenty of time in detention for fighting back; it's retarded policy, but at least schools still seem to be consistent in their retardation.
Based on other things I've seen elsewhere, probably.
The First Amendment protects free speech from infringement by the Federal Government, the Fifth guarantees due process in the courts, and so on. The principal is not a Fed and their office is not a court, so obviously the constitution does not apply. One little rhetorical slip, and suddenly the entire idea of due process is no longer a valid counterargument to your preferred methods of meting out punishments (anywhere short of a genuine Court of Law, at least).
This is so remarkably and verifiably wrong. The principal of a public school is a State employee. The Bill of Rights is incorporated against the states by the 14A. The courts have said for decades that students don't categorically lose those rights in school. This is all stuff you can just look up.
“Your tears say more than real evidence ever could.”
More seriously, to me, the part that struck the most discordant note was this:
Which almost makes it sound like she wanted to keep going to school with the boy. If I were in the school district’s position, the last thing I’d do is ship them both off to the same location.
The impression I get from the rest of the article is that the boy avoided school discipline not just immediately but even after having been charged with a crime for his actions. Maybe that's wrong; the boy goes unidentified and the school claims it can't provide any information. Surely if there's enough evidence for the police to charge a crime, there's enough evidence for a school to act.
It should be noted that police are much, much better at acquiring evidence than schools, and it's likely they didn't hand all their evidence over to the school.
The article says they charged two of the boys who were sharing the images; it's not clear to me at all that either of those two boys was the one who had originally created the images.
Edit: the article also seems to be saying that the two boys who were actually charged went to a different school than the girl entirely, but it really didn't make any of that clear.
From a legal standpoint, what is the theory for the 'harm' caused in this instance? And to whom?
Liability of any kind usually rests on the idea that someone's interests were injured and in need of redress.
We are able to statutorily 'create' interests (the Americans with Disabilities Act did this, for instance) but I think we'd smack into 1A issues trying to make it completely illegal to post manipulated images of people who... are already posting images of themselves online.
Most obvious angle is copyright/IP, but they're still sorting that out for AI works.
I'd kinda love for them to go at it from the obscenity angle. Because that would also suggest that the women posting thirst traps are doing something wrong too.
I don't know about legal harm, but my ideal solution to this would be "The girl who punched the asshole boy gets a finger-wagging, and the boy gets told he had it coming and stop being a little shit."
Obviously, that's too old school and common sense for a school to do.
Photoshopping a woman's face onto a pornographic image has long been understood to be a shitty thing to do and possibly actionable. But AI-generated real-person-porn is probably just something society is going to have to get used to.
In a world where social shame was still effective, it'd be pretty damning to do it, and would probably result in ostracization. Not clear what one has to do to 'compensate' for the situation, though.
Similar to being a peeping tom, or a subway groper, or anything else that intrudes on people's strongest-held social boundaries, even when the harm inflicted is de minimis.
But the problem is that shame would also kick in for stuff like a young girl hyping up her debut on Onlyfans once she turns 18 (link is mostly SFW but you'll see some thirst trapping). The puritanical ethics required here would condemn both the voyeuristic act and the exhibitionist act.
It's rather schizophrenic that there's basically unlimited tolerance for (adult) women to produce pornographic content of themselves, but shame is still heaped upon the consumers, as if these weren't both inseparably linked and necessary components of the "empowerment" equation here.
Like I said before, worst of all worlds.
I've already said that I am pro-slut shaming.
That said, there's a difference between someone willingly posting their nudes and someone not doing that. I think the OnlyFans girl would have a harder case to make about being harmed by someone generating AI porn of her, versus a girl whom you think should just accept that all women are being punished for the OnlyFans girls.
There's a difference, but I get confused about the secular reasons why it's meaningful.
Sex and nudity are supposedly no big deal, especially if you're attending a pride parade, but it absolutely IS a big deal when it's someone's nudes hitting the internet, evidently. Shame, embarrassment, I dunno, it seems just taken as a given that it demeans the subject to be exposed in such a way. But if they publish those exact same images themselves, it is not demeaning?
There was a minor hullabaloo when I was in college involving 'Slutwalks' making it acceptable for women to wear skimpy clothes in public. And the "Free the Nipple" movement which, among other things, tried to make it acceptable for female nipples to appear on, e.g., instagram.
But then what I noticed is that almost no women (well, no attractive women) used this newfound power to actually go around in public topless or scantily clad, or post topless shots to IG. THEN came OF where they could monetize it and things REALLY got locked down.
So culturally we're told sex and nudity aren't a big deal, don't be prudes. But ECONOMICALLY, people (mostly males) spend billions upon billions of dollars to acquire sex and view nude women. So the only distinction I can really grasp is "am I getting paid for this or not." Which applies to many things, granted.
But where does that leave us?
According to whom? The leftists you hate so much? Yeah, some will make that argument, but even they won't say it's no big deal when it's non-consensual.
I don't think the pride parade demographic is particularly representative even of leftists.
I think pretty clearly it's morally objectionable to generate AI porn of someone who is neither a sex worker nor someone who wants AI porn generated of them. What the law says, I am less sure, but I expect the "revenge porn" laws will probably be expanded to include "Generating AI videos of your ex fucking a horse."
Well, the amorphous cultural norms brought on by the sexual revolution, more to the point.
Abortion? On demand. Contraception? Everywhere. Marriage? Optional.
As the biological consequences for having sex with whomever you wanted were abolished, so too were the social consequences.
I do not hate them, but I do not want to live amongst them.
And if people were better about choosing to live around people who genuinely shared their preferences and norms, many of the problems we're discussing in this particular case would evaporate.
I mean, the Muslims have solved it their way. Keep women covered up whenever they're in public. This "works" but, (as I'm sure you'd agree) this requires unacceptable restriction on female autonomy. If they only live among other Muslims, this tends to work "OKAY" (women stoned to death unavailable for comment).
We westerners have clearly NOT solved it in a way that is satisfactory, and we seem to make up the rules on the spot based on the relative status of the involved parties. I'm reminded of this every time I see a gym influencer post a video of some guy allegedly gawking at her body without her consent, and posting said video so that all the anonymous onlookers can... gawk at her body.
We CANNOT sustain a system where people are allowed to wear whatever they want without regard to its reception by onlookers, and the onlookers are only 'allowed' to enjoy the view if they're approved as acceptable by the wearer.
I don't want to live amongst people with such irreconcilable standards.
What about making unflattering caricature art of them? Or depicting them in (non-graphic) torture scenes? Or just changing their skin tone to a different color on purpose? This all seems like we're treating nudity (or even just softcore titillation) as special pleading.
At least, with kids, I think there's a sustainable moral argument for why we don't want people sexualizing them, and cast a suspicious eye at those who do.
And a question I've asked before: what if you find someone who is a very close lookalike and have them pose for nude shots and post them, but never actually imply that was your goal? It was a common enough practice among pornographers back in the day.
It seems like this is basically suggesting norms of "look at whatever content I choose to post, but do not ever interact in any way I might find unpleasant."
Which runs extremely counter to how internet culture as a whole works.
That's why I specified morally objectionable. I don't have to think it should be illegal to think someone who does that is a shitty person. I can legally draw a picture of your wife fucking a horse and post it online. Even if you couldn't press charges, you'd probably want to punch me, and most people would think I deserve it.
I mean - in Eugene, Oregon, topless women are not terribly uncommon.
Yes. Consent and agency are necessary considerations in plenty of moral decisions/outcomes, sexuality included. It seems intuitive to me that the proactive decision to publish sexual content is a vastly different experience than having someone do it under your nose. Money need not apply.
A few (admittedly imperfect) analogies involving consent to illustrate my point:
In all these cases, the former option is fine when done of one's own volition, but becomes a problem when another actor steps in. There are almost certainly philosophical papers that provide the premise-by-premise reasoning for this sort of argument, but hopefully you get the picture.
In a way, the body, particularly the sexualized body, is something of a possession. It can be given and taken away, shown and hidden. In some sense, it is a commodity that we have "ownership" of and many consider it the sacred domain of the individual. Sexual acts are high stakes, which is why it is so terrible when they are done against one's will and why it is considered a statement when someone takes bold public action with their body, for better or worse. You could argue that it is demeaning to publish sexual content under some sort of moralist (i.e. public sexuality is inherently debasing) or consequentialist (i.e. public sexuality leads to negative behavioral outcomes), but these arguments are complementary rather than overriding to ideas of agency and consent, in my opinion.
Well I'm gonna have to drill down deeper as to your logic here, which I can accept as facially valid.
What is actually 'removed' when the image is published?
Similarly with the secret: a breach of trust is a breach of trust, but unless you signed an NDA that expressly laid out how to calculate damages, your remedy is "never trust that person again."
Vs. losing a kidney or having your money taken, where you can absolutely point to the thing that you lost and demand recompense for.
I would not be arguing this if we were talking about actual physical rape of a person, which is clearly a violation of a concept of 'bodily autonomy.' I think taking a photograph of someone/something is inherently less of a violation.
Publishing a photo is a step beyond, I can absolutely grant, but as I alluded to before, the only actual dividing line I see between whether it's a demeaning violation or not isn't in how the viewers receive and react to the image, but whether the original subject will get any money from its publication, not whether they have lost something that was in their possession.
Like, consider a situation where a woman takes a nude photo, then fat-fingers it and accidentally sends it to the wrong dude. Then, mortified, she demands that he delete it and excoriates him if he comments on it approvingly. Or comments on it at all.
Is HE in the wrong if he views and enjoys this image that wasn't intended for his consumption? Or is SHE in the wrong for sending unsolicited pornography to an unwitting recipient? Is he obligated to delete it? What's the difference? Once it has been sent, how is she harmed by it arriving to the wrong person?
Because I think if we take your express logic to any extreme, it also becomes objectionable to imagine someone naked, especially if you derive pleasure from it.
Which is why the difference is meaningful in the first place. The sharing of intimate images is something women in general want to be paid for, so obviously doing that without permission is bad; meanwhile, the other sex stuff doesn't have anything to do with that, so it is OK, and it works as a "haha, ur a prude" trap for people who can't or won't understand that the former dynamic is all that matters.
It's not any more complicated than that.
Yeah, but she also wants to completely dodge the reputation that comes with trading her sexuality for money.
And of course, there's still often a guy in the picture actually arranging for her to sell this stuff. In this case, OF clearly profits far beyond what all but its top-performing producers do. And it's owned by a dude.
I'm not even denying that there's a fundamental transactional nature to all this stuff, even if you're in it for marriage and kids; it's just that it's now literally reduced to a commodity that gets haggled over, and people who 'have' to pay for it are viewed as losers, whilst anyone who is successful at getting attractive women to give it up without explicitly paying is either extremely crafty or inherently high status.
Let me repeat that: sex is a commodity which can be purchased at various price points depending on the quality, so it's not hard to acquire in the abstract, but being able to acquire it without spending money somehow makes you a God amongst men. Our old-school ape-wiring seems somewhat at odds with our later 'homo economicus' upgrades.
Where's the harm in teenage boys faking nudes of a 13-year-old girl without her knowledge or consent, indeed very much against her consent?
Well gosh gee whiz, why on earth are women such picky, fussy, hypergamous trollops who don't want to marry just plain ordinary guys? No wonder we need to force these women into marrying normal men who think nudes of 13-year-old girls are just fine! Why is anyone getting het-up about this? Men like nubile women, young means fertile, and if she's old enough to bleed she's old enough to breed, right?
The boys are just doing what boys do! Boys are gonna be interested in girls of their own age! Boys will be looking at porn, and porn is fine and normal and in fact is good for society since it reduces rape and sex-offending crimes (citation needed, of course!)
I know, I know: Amadan is going to hammer me for using sarcasm. But how else am I to react to "how is the girl harmed by this?" unless I get really angry and abusive, in which case I'm still going to get the mod hammer.
There is a lot of daylight between "it's not okay to make nude deep fakes of 13 year old girls" and "an individual was harmed when that deep fake was made," and even more from that to "someone should be sent to prison about it." None of these things actually imply the others.
Sigh. Yes. Come on, you're just trying to be provocative and you knew you were going to get reported.
It's not the sarcasm that's the problem. It's that you (and I mean you specifically) can make an intelligent and cogent argument for why this is bad behavior that should be discouraged. And you can even be (a little) snarky about it. But when you layer on the "gosh gee whiz"s and straw men obviously directed at the person you are responding to, of course you're dialing up the antagonism in a way that doesn't actually lead to productive engagement. You're just trying to say "You're a shitty human being unworthy of a respectful reply" without using those words.
The "big yikes not a good look chief" millennial slop has ruined online discourse because it feels so good to get off those snide, snarky little burns but it absolutely destroys any hope of good faith dialog. Do you want to talk to people and maybe enlist their sympathy and get them to see your side of things, or do you just want to score little zingers?
If I (and I mean me specifically) am unable to make intelligent and cogent arguments, will that be seen as a mitigating circumstance when moderating my comments?
Your logic suggests that you'd have no objection if a 13 year old girl published nudes with knowledge and consent.
Is that true?
Is consent the defining factor here?
I just want something on record.
I would mostly agree with this. It's just another example in the long list of examples of why the consent standard, when applied to sexuality and sexual interactions, is more or less useless.
I don't think it is useless, but man, people do not seem to really know what they mean when they say "consent." Worse still, they don't really know what they mean when they say they "consent" to some activity.
With sex in particular, the emotional valence of the moment, and its intensity, can shift by the minute. Then, reassessed after the act, someone may decide that some particular part of it they 'agreed' to in the moment was actually a violation.
That is one of the main problems with consent as a standard. It does not hold up under any of the hard cases.
And with sex IN PARTICULAR, there is no reasonable way to go back and assess whether it was validly given or not or whether the lines were crossed. I noticed this issue in law school. "Wait, how the f@&k do you establish evidence for lack of consent when it all happens behind closed doors?"
Unless you film the whole interaction and that opens up the whole can of worms that we're discussing.
So, back in high school, someone made a fake photo of me and posted it in a classroom. It wasn't a nude, but it was political, depicting me as Stalin, as I was an outspoken socialist. I was outraged ("the photoshop is not even accurate! I'm a TROTSKYIST!"), and it definitely hurt my feelings and hurt me socially. Pretty clear case of bullying, but, in retrospect, it was pretty hilarious and a useful learning experience. Should that kid have been punished?
I don't think so, and I suspect you don't either (though I'm curious if my suspicion is right). Which shifts the question to, what is the difference between a nonsexual representation and a sexual one? I think, to many people who don't see harm, harm categorically isn't something that can be done with an image or words--sticks and stones can break my bones etc. If people start physically attacking someone, or destroying their property, in response, there is harm, but the harm originates from the physical act, not the instigating image. The introduction of a sexual element doesn't change this. (I'm speaking here in terms of conceptual framework, not legal definitions.)
That doesn't mean that the school shouldn't do anything about the boys--schools can and should regulate behavior above and beyond the minimal standard of harm. But the idea that actual physical violence should be punished less than images and words is weird to me, especially when school administrators had no actual evidence of the images.
The difference is that a video of you as Stalin is obviously a joke. Do you think you would have felt the same if instead they showed a video of you masturbating on a television in the classroom?
I would have felt very differently: I would have cared much less, quite honestly. "Oh, someone's a weirdo, anyone whose opinion of me changes because of it isn't worth caring about." And I'm not sure that making the AI-generated nude clearly a fake joke (giving her purple skin or whatever) would change anyone's opinions. I think the crux of the matter is that it's a sexual image, and we cordon off sexuality as requiring unique, almost spiritual protections around it.
What is the legal harm here, is the question that @faceh asked. Mind you, I disagree with faceh; I think the harm here is pretty obvious even from a legal point of view, since defamation per se usually covers allegations of sexual misconduct as well (but as I said elsewhere, I'm not a legal expert here and could be completely wrong). Please try to respond to the argument faceh is actually making instead of devolving into mockery and sarcasm.
I think there's a plausible false light (and defamation per se) claim, given that the images in this situation were being shared and would be themselves illegal for her to produce. Even for deepfakes-of-adults, false claims of sexual promiscuity would fall into these categories. There's some theoretical examples where a Falwell v. Hustler-style defense would be relevant in the case of a public figure where the deepfakes were clear parody, but that's pretty far from the typical case. But from a traditional law perspective you don't have to pull a Gorsuch to find a civil tradition against this sorta stuff.
Useless, though, since the kid who did it is judgment-proof. In theory, the state law would allow six months' imprisonment per act, but in practice that's really not how the juvenile court systems work, and even an adult doing this to another adult is more likely to just end up with a fine. And while both the boy generating the deepfakes and those passing them around (or even receiving them) could probably be charged with federal CSAM stuff, that's such a nuclear option it's extremely unlikely anyone would seriously even threaten it here.
Which is part of why the whole thing is such a mess.
Given that teenagers have been charged with the production, possession, and distribution of CSAM for sending nudes of themselves, CSAM charges in this case don’t strike me as anything close to nuclear, assuming the police can recover the images from Snapchat. The only thing I’m not certain of is whether they actually broke any CSAM laws. Is it actually illegal to draw a photorealistic, but fake, image of a nude minor? Perhaps more to the point, given how AIs usually seem to handle NSFW requests, is it illegal to put the face of a minor on top of the body of a definitely-grown adult? I have no idea, and it’s not something I’m eager to google.
I'm mostly using "nuclear" in the sense of "the biggest available weapon, and its resulting proportionality concerns". Those style of prosecutions happen, but they're pretty uncommon, even though there's good evidence to think the chargeable conduct happens more often than anyone wants to think about.
In the US, it's a federal felony under the PROTECT Act, unless the content also has serious literary, artistic, political, or scientific value. Enforcement is pretty rare, though, since the feds don't want the law to get another challenge like Ashcroft.
Thought experiment: what if one of the boys shows it to his creepy uncle and then snapchats it to him? Then the creepy uncle is in possession of CSAM; even worse if he then darkwebs it to all of his other chomo friends.
Obviously, probably not going to charge the minor male with accessory, right? But this is the "break containment" mega problem for cases exactly like this. Call it "deepfake laundering"; kids unwittingly doing stuff that, in the realm of adults, would be 100% illegal.
I had some thoughts about this back when that DignifAI thing was doing the rounds:
I thought DignifAI had legs, but of course the gooners demand the opposite service.
Reputational harm is covered under libel/slander/defamation laws, but does require other people to actually believe the falsehoods that someone published.
So exactly how much modification would it take before it's no longer 'believable'? What if they stylize it in a different artstyle?
I keep pointing out that with AI, whatever guidelines you put in place, the AI can tiptoe RIGHT up to them and stick a single pixel over the line.
Ultimately this is also why I don't see how Onlyfans continues to exist as a business model for flesh-and-blood women after this year.
That one is easy.
Ultimately, it'll become an even more premium service. Gooners will pay extra for some sort of cryptographically verifiable proof of realness. Setting up a digital chain of custody from a digital camera to actual final presentation on OnlyFans or another site wouldn't be too arduous and (markets in everything) someone will create a service to do just that.
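Something like this already exists in embryo as the C2PA/Content Credentials provenance effort. As a minimal sketch of the core primitive, assuming a camera with an Ed25519 key in a secure element (all names here are illustrative, using the pyca/cryptography library):

```python
# Sketch of a "proof of realness" chain of custody. Illustrative only:
# in practice the private key would live in the camera's secure element
# and the manufacturer would publish/attest the matching public key.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

camera_key = Ed25519PrivateKey.generate()  # stand-in for a hardware key
camera_pub = camera_key.public_key()

def sign_capture(image_bytes: bytes) -> bytes:
    """Camera side: sign a digest of the sensor output at capture time."""
    return camera_key.sign(hashlib.sha256(image_bytes).digest())

def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    """Viewer side: check the file against the camera's public key."""
    try:
        camera_pub.verify(signature, hashlib.sha256(image_bytes).digest())
        return True
    except InvalidSignature:
        return False

image = b"...raw sensor data..."
sig = sign_capture(image)
assert verify_capture(image, sig)              # untouched capture verifies
assert not verify_capture(image + b"!", sig)   # any edit breaks the chain
```

The catch, and why the real provenance standards sign an edit manifest rather than just raw bytes, is that even innocent processing (cropping, color grading, re-encoding) invalidates a naive signature like this one.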
Saying that AI porn will destroy "regular" porn fails because it relies on the same logic as "porn will destroy the market for webcam girls / strip clubs / prostitutes." It isn't all about the simple visual arousal, but about a parasocial/pararomantic/paraintimate relationship.
This is where my model of the goonbrain really fails.
If you're going to such absolute lengths to be 'certain' that the girl in question actually exists, surely it's better to pay for in-person companionship?
It doesn't parse, for me, how you can require a physical person actually exist for you to get off to the content, but NOT also inherently prefer that person be physically present. Which obviates the fear of it being faked, entirely.
Although I guess I can imagine a guy who is deathly afraid of getting arrested for soliciting a prostitute which drives him to avoid paying for sex.
Streamers/OnlyFan girls of various stripes tend to be much cheaper than prostitutes/escorts.
...I think. I'm sure someone can trot out an example of a rather ludicrous OnlyFans where the pricing is absolutely stratospheric, but the last time my morbid curiosity led me down that particular rabbit hole, that was my general take on the matter.
I'm gonna say it's comparable to any other gig-economy/gamified app. The basics are pretty cheap, but they rack up all kinds of extra charges wherever possible, and milking the whales is the real profit center.
Basically, buying an OF subscription marks you as a possible sucker just by putting the basic money down, and the ecosystem is going to do its damnedest to drain your wallet.
I haven't availed myself of prostitutes; the standard strip-club experience annoys me enough with the constant upsell, even knowing that they can't actually promise the outcome you're hoping for.
Think of a goonbrained version of "never meet your heroes." If a goon actually pays for a prostitute or companion of some sort, he has to fully interact with a person for some length of time. That could go wrong. It could make him feel bad. The distance created by a screen creates a perception of control and the ability to "rage quit" if things go south. But they still want the knowledge that there's someone real over there.
Don't get me wrong, a portion of the "real people" online sex market will be taken by AI. Perhaps a pretty large portion of it. But there are going to remain these other niches. If online porn and COVID didn't kill strip clubs (and they didn't), I don't see how they ever really go away.
Map of worldwide prostitution laws
While I get that, you can still find strip clubs if you're willing to leave the house (this might be the true motivation, the desire to never go out in public).
That's technically not a requirement for defamation per se, but I don't know enough about defamation law to say whether or not this counts as per se defamation. I do believe that legislatures could define it as defamation per se by statute if they wanted to though.
Defamation per se still requires that there be an "accusation," which means in this case you'd have to prove that the kids were claiming these AI-generated images were real pictures, not fakes. It's no different than if they had drawn lewd pictures of her.
I imagine that unless it was an obvious parody or fake, or that the boys were explicitly telling their friends that they were AI generated and not real etc., most courts would presume that the images were being presented as real.
And once again, I'm not sure if it'd survive 1A scrutiny.
Revenge porn laws create some, don’t they?
Yes, but usually a component of those is that someone sent you those images, and DID NOT publish them themselves, so you're violating their 'privacy' interest by publishing these salacious images sans their consent.
Modifying an image they themselves published seems categorically different.
I suppose I imagine it's of a piece with something like false light, which operates along lines similar to defamation. The harmed party would obviously be the individual whose photo was edited, especially if there was an intention to pass off the photo as genuine. I don't think you could reach every edited picture with this doctrine, but I think you could likely get non-consensual NSFW edits. In the underage case, my understanding is that digital edits of minors can already be considered CSAM, so I don't know why this would be different.
AI-generated nudes of minors meet the legal definition of child pornography.
If the Supreme Court ruled that sharing AI-generated deepfakes of actually existing 14-year-olds is protected speech, there would be a constitutional amendment. The general public will not stand for this, regardless of what weird tech-libertarians think.
Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002)
They basically just re-passed the same law struck down here, and have been avoiding scrutiny by only charging people who were already incentivized to plea bargain. Apparently the idea is that if the law is around for a long time before it gets seriously challenged again (in a virtual child porn case not involving anything else), the courts will forget all about Ashcroft and convict. Might work.
Tbf, the PROTECT Act stapled on a Miller test. They're still trying to bypass the 'prevailing community standards' bit, but compared to the pre-Ashcroft version that just pretended the Miller test didn't matter, it's a much wider retreat than, for example, US v. Lopez.