This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

How about a different kind of AI culture war? I speak, of course, of non-consensual pornography generation. The most outrageous article I read about this recently was probably this AP article: Boys at her school shared AI-generated, nude images of her. After a fight, she was the one expelled. The girl in question is 13, and she started a fight on a school bus with one of the boys who was later charged with a crime for sharing the images.
It turns out that finding apps that advertise this kind of functionality is not hard. In fact, part of the reason I bring this up is that this capability seems to be integrated into one of the largest AIs: Grok. There's been some controversy on X over the last couple of days after Grok allegedly generated pornographic images of a couple of minor girls. Additionally, the bot's "media" tab was disabled, allegedly after the discovery that lots of people were using the bot to make pornographic edits of other people's pictures. Though the media tab is gone, I did not find it very hard to get Grok to link me to its own posts with these kinds of edits.
There is, I think understandably, a lot of controversy going around about this. It's not that it was previously impossible to make this kind of content, but the fidelity and availability were much more limited, and producing it certainly required more technical skill. Being something you can do without even leaving your favorite social media app seems like something of a game changer.
Frankly, I am unsure where to go with this as a policy matter. Should someone be liable for this? Criminally or civilly? Who? Just the generating user? The tool that does the generating? As a general matter I have some intuitions about AI conduct being tortious, but I have difficulty locating who should be liable.
From a legal standpoint, what is the theory of the 'harm' caused in this instance? And to whom?
Liability of any kind usually rests on the idea that someone's interests were injured and in need of redress.
We are able to statutorily 'create' interests (the Americans with Disabilities Act did this, for instance), but I think we'd smack into 1A issues trying to make it completely illegal to post manipulated images of people who... are already posting images of themselves online.
The most obvious angle is copyright/IP, but they're still sorting that out for AI works.
I'd kinda love for them to go at it from the obscenity angle, because that would also suggest that the women posting thirst traps are doing something wrong too.
I don't know about legal harm, but my ideal solution to this would be "The girl who punched the asshole boy gets a finger-wagging, and the boy gets told he had it coming and stop being a little shit."
Obviously, that's too old school and common sense for a school to do.
Photoshopping a woman's face onto a pornographic image has long been understood to be a shitty thing to do and possibly actionable. But AI-generated real-person-porn is probably just something society is going to have to get used to.
In a world where social shame was still effective, it'd be pretty damning to do it and would probably result in ostracization. It's not clear what one has to do to 'compensate' for the situation, though.
It's similar to being a peeping tom, or a subway groper, or anything else that intrudes on people's strongest-held social boundaries, even when the harm inflicted is de minimis.
But the problem is that shame would also kick in for stuff like a young girl hyping up her debut on OnlyFans once she turns 18 (link is mostly SFW, but you'll see some thirst trapping). The puritanical ethics required here would condemn both the voyeuristic act and the exhibitionist act.
It's rather schizophrenic that there's basically unlimited tolerance for (adult) women to produce pornographic content of themselves, but shame is still heaped upon the consumers, as if these weren't both inseparably linked and necessary components of the "empowerment" equation here.
Like I said before, worst of all worlds.
Where's the harm in teenage boys faking nudes of a 13-year-old girl without her knowledge or consent, indeed very much against her consent?
Well gosh gee whiz, why on earth are women such picky, fussy, hypergamous trollops who don't want to marry just plain ordinary guys? No wonder we need to force these women into marrying normal men who think nudes of 13 year old girls are just fine! Why is anyone getting het-up about this? Men like nubile women, young means fertile, and if she's old enough to bleed she's old enough to breed, right?
The boys are just doing what boys do! Boys are gonna be interested in girls of their own age! Boys will be looking at porn, and porn is fine and normal and in fact is good for society since it reduces rape and sex offending crimes (citation needed, of course!)
I know, I know: Amadan is going to hammer me for using sarcasm. But how else am I to react to "how is the girl harmed by this?" unless I get really angry and abusive, in which case I'm still going to get the mod hammer.
Sigh. Yes. Come on, you're just trying to be provocative and you knew you were going to get reported.
It's not the sarcasm that's the problem. It's that you (and I mean you specifically) can make an intelligent and cogent argument for why this is bad behavior that should be discouraged. And you can even be (a little) snarky about it. But when you layer on the "gosh gee whiz"s and straw men obviously directed at the person you are responding to, of course you're dialing up the antagonism in a way that doesn't actually lead to productive engagement. You're just trying to say "You're a shitty human being unworthy of a respectful reply" without using those words.
The "big yikes not a good look chief" millennial slop has ruined online discourse because it feels so good to get off those snide, snarky little burns but it absolutely destroys any hope of good faith dialog. Do you want to talk to people and maybe enlist their sympathy and get them to see your side of things, or do you just want to score little zingers?
Your logic suggests that you'd have no objection if a 13-year-old girl published nudes of herself with knowledge and consent.
Is that true?
Is consent the defining factor here?
I just want something on record.
So, back in high school, someone made a fake photo of me and posted it in a classroom. It wasn't a nude, but it was political, depicting me as Stalin, as I was an outspoken socialist. I was outraged ("the photoshop is not even accurate! I'm a TROTSKYIST!"), and it definitely hurt my feelings and hurt me socially. Pretty clear case of bullying, but, in retrospect, it was pretty hilarious and a useful learning experience. Should that kid have been punished?
I don't think so, and I suspect you don't either (though I'm curious if my suspicion is right). Which shifts the question to, what is the difference between a nonsexual representation and a sexual one? I think, to many people who don't see harm, harm categorically isn't something that can be done with an image or words--sticks and stones can break my bones etc. If people start physically attacking someone, or destroying their property, in response, there is harm, but the harm originates from the physical act, not the instigating image. The introduction of a sexual element doesn't change this. (I'm speaking here in terms of conceptual framework, not legal definitions.)
That doesn't mean that the school shouldn't do anything about the boys--schools can and should regulate behavior above and beyond the minimal standard of harm. But the idea that actual physical violence should be punished less than images and words is weird to me, especially when school administrators had no actual evidence of the images.
The difference is that an image of you as Stalin is obviously a joke. Do you think you would have felt the same if they had instead shown a video of you masturbating on a television in the classroom?
I would have felt very differently: I would have cared much less, quite honestly. "Oh, someone's a weirdo, anyone whose opinion of me changes because of it isn't worth caring about." And I'm not sure that making the AI-generated nude clearly a fake joke (giving her purple skin or whatever) would change anyone's opinions. I think the crux of the matter is that it's a sexual image, and we cordon off sexuality as requiring unique, almost spiritual protections around it.
What is the legal harm here? That is the question @faceh asked. Mind you, I disagree with faceh; I think the harm here is pretty obvious even from a legal point of view, since defamation per se usually covers allegations of sexual misconduct as well (but as I said elsewhere, I'm not a legal expert and could be completely wrong). Please try to respond to the argument faceh is actually making instead of devolving into mockery and sarcasm.
I think there's a plausible false light (and defamation per se) claim, given that the images in this situation were being shared and would themselves be illegal for her to produce. Even for deepfakes of adults, false claims of sexual promiscuity would fall into these categories. There are some theoretical examples where a Hustler v. Falwell-style defense would be relevant, in the case of a public figure where the deepfakes were clear parody, but that's pretty far from the typical case. But from a traditional law perspective, you don't have to pull a Gorsuch to find a civil tradition against this sorta stuff.
Useless, though, since the kid who did it is judgment-proof. In theory, the state law would allow six months of imprisonment per act, but in practice that's really not how the juvenile court systems work, and even an adult doing this to another adult is more likely to just end up with a fine. And while both the boy generating the deepfakes and those passing them around (or even receiving them) could probably be charged with federal CSAM stuff, that's such a nuclear option that it's extremely unlikely anyone would seriously even threaten it here.
Which is part of why the whole thing is such a mess.
Given that teenagers have been charged with the production, possession, and distribution of CSAM for sending nudes of themselves, CSAM charges in this case don’t strike me as anything close to nuclear, assuming the police can recover the images from Snapchat. The only thing I’m not certain of is whether they actually broke any CSAM laws. Is it actually illegal to draw a photorealistic, but fake, image of a nude minor? Perhaps more to the point, given how AIs usually seem to handle NSFW requests, is it illegal to put the face of a minor on top of the body of a definitely-grown adult? I have no idea, and it’s not something I’m eager to google.
Thought experiment: what if one of the boys snapchats it to his creepy uncle? Then the creepy uncle is in possession of CSAM - even worse if he then darkwebs it to all of his other chomo friends.
Obviously, we're probably not going to charge the minor male as an accessory, right? But this is the "break containment" mega-problem for cases exactly like this. Call it "deepfake laundering": kids unwittingly doing stuff that, in the realm of adults, would be 100% illegal.
I had some thoughts about this back when that DignifAI thing was doing the rounds:
I thought DignifAI had legs, but of course the gooners demand the opposite service.
Reputational harm is covered under libel/slander/defamation laws, but does require other people to actually believe the falsehoods that someone published.
So exactly how much modification would it take before it's no longer 'believable'? What if they stylize it in a different art style?
I keep pointing out that with AI, whatever guidelines you put in place, the AI can tiptoe RIGHT up to them and stick a single pixel over the line.
Ultimately this is also why I don't see how OnlyFans continues to exist as a business model for flesh-and-blood women after this year.
That one is easy.
Ultimately, it'll become an even more premium service. Gooners will pay extra for some sort of cryptographically verifiable proof of realness. Setting up a digital chain of custody from a digital camera to the final presentation on OnlyFans or another site wouldn't be too arduous, and (markets in everything) someone will create a service to do just that.
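A minimal sketch of what that chain of custody might look like, assuming a camera with a per-device signing key; the function names are hypothetical illustrations using Python's `cryptography` package, not any existing provenance standard like C2PA:

```python
# Hypothetical sketch: the camera signs image bytes at capture time,
# and a hosting site verifies the signature before showing a
# "verified real" badge. Assumes the camera's public key is
# distributed out of band by the attestation service.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Inside the camera's secure element: a per-device keypair.
camera_key = Ed25519PrivateKey.generate()
camera_public_key = camera_key.public_key()

def sign_capture(image_bytes: bytes) -> bytes:
    """Camera-side: sign the raw sensor output at capture time."""
    return camera_key.sign(image_bytes)

def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    """Site-side: any post-capture edit (AI or otherwise) breaks this."""
    try:
        camera_public_key.verify(signature, image_bytes)
        return True
    except InvalidSignature:
        return False

photo = b"...raw sensor bytes..."
sig = sign_capture(photo)
assert verify_capture(photo, sig)               # untouched upload verifies
assert not verify_capture(photo + b"x", sig)    # any edit fails
```

The signature itself is the easy part; the hard part is convincing buyers that the private key never left the camera, which is why this looks more like a paid attestation service than a library.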
Saying that AI porn will destroy "regular" porn fails because it relies on the same logic as "porn will destroy the market for webcam girls / strip clubs / prostitutes." It isn't all about the simple visual arousal, but about a parasocial/pararomantic/paraintimate relationship.
This is where my model of the goonbrain really fails.
If you're going to such absolute lengths to be 'certain' that the girl in question actually exists, surely it's better to pay for in-person companionship?
It doesn't parse, for me, how you can require that a physical person actually exist for you to get off to the content, but NOT also inherently prefer that the person be physically present, which would obviate the fear of it being faked entirely.
Although I guess I can imagine a guy who is deathly afraid of getting arrested for soliciting a prostitute, which drives him to avoid paying for sex.
Map of worldwide prostitution laws
While I get that, you can still find strip clubs if you're willing to leave the house (this might be the true motivation, the desire to never go out in public).
That's technically not a requirement for defamation per se, but I don't know enough about defamation law to say whether or not this counts as per se defamation. I do believe that legislatures could define it as defamation per se by statute if they wanted to though.
Defamation per se still requires that there be an "accusation," which means that in this case you'd have to prove the kids were claiming these AI-generated images were real pictures, not fakes. It's no different than if they had drawn lewd pictures of her.
I imagine that unless it was an obvious parody or fake, or that the boys were explicitly telling their friends that they were AI generated and not real etc., most courts would presume that the images were being presented as real.
And once again, I'm not sure if it'd survive 1A scrutiny.
Revenge porn laws create some, don’t they?
Yes, but usually a component of those is that someone sent you those images, and DID NOT publish them themselves, so you're violating their 'privacy' interest by publishing these salacious images sans their consent.
Modifying an image they themselves published seems categorically different.
I suppose I imagine it's of a piece with something like false light, which operates along similar lines to defamation. The harmed party would obviously be the individual whose photo was edited, especially if there was an intention to pass the photo off as genuine. I don't think you could reach every edited picture with this doctrine, but I think you could likely get non-consensual NSFW edits. In the underage case, my understanding is that digital edits of minors can already be considered CSAM, so I don't know why this would be different.
AI-generated nudes of minors meet the legal definition of child pornography.
If the Supreme Court ruled that sharing AI-generated deepfakes of actually existing 14-year-olds is protected speech, there would be a constitutional amendment. The general public will not stand for this, regardless of what weird tech-libertarians think.
Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002)
They basically just re-passed the same law struck down here, and have been avoiding scrutiny by only charging people who were already incentivized to plea bargain. Apparently the idea is that if the law is around for a long time before it gets seriously challenged again (in a virtual child porn case not involving anything else), the courts will forget all about Ashcroft and convict. Might work.
Tbf, the PROTECT Act stapled on a Miller test. They're still trying to bypass the 'prevailing community standards' bit, but compared to the pre-Ashcroft version that just pretended the Miller test didn't matter, it's a much wider retreat than, for example, US v. Lopez.