
Culture War Roundup for the week of December 29, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


How about a different kind of AI culture war? I speak of course of non-consensual pornography generation. The most outrageous article I read about this recently was probably this AP article: Boys at her school shared AI-generated, nude images of her. After a fight, she was the one expelled. The girl in question is 13 and she started a fight on a school bus with one of the boys later charged with a crime for sharing the images.

The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.

Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.

“That’s when I got angry,” the eighth grader recalled at her discipline hearing.

Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

When the sheriff’s department looked into the case, they took the opposite actions. They charged two of the boys who’d been accused of sharing explicit images — and not the girl.

It turns out that finding apps that advertise this kind of functionality is not hard. In fact, part of the reason I bring this up is that this capability seems to be integrated into one of the largest AIs: Grok. There's been some controversy on X over the last couple of days after Grok allegedly generated pornographic images of a couple of minor girls. Additionally, the bot's "media" tab was disabled, allegedly after the discovery that lots of people were using the bot to make pornographic edits of other people's pictures. Though the media tab is gone, I did not find it very hard to get Grok to link me its own posts with these kinds of edits.

There is, I think understandably, a lot of controversy going around about this. It's not that it was previously impossible to make this kind of content, but the fidelity and availability were much more limited, and it certainly required more technical skill. Being able to do it without even leaving your favorite social media app seems like something of a game changer.

Frankly I am unsure where to go with this as a policy matter. Should someone be liable for this? Criminal or civil? Who? Just the generating user? The tool that does the generating? As a general matter I have some intuitions about AI conduct being tortious but difficulty locating who should be liable.

From a legal standpoint, what is the theory for the 'harm' caused in this instance? And to whom?

Liability of any kind usually rests on the idea that someone's interests were injured and in need of redress.

We are able to statutorily 'create' interests (the Americans with Disabilities Act did this, for instance) but I think we'd smack into 1A issues trying to make it completely illegal to post manipulated images of people who... are already posting images of themselves online.

Most obvious angle is copyright/IP, but they're still sorting that out for AI works.

I'd kinda love for them to go at it from the obscenity angle. Because that would also suggest that the women posting thirst traps are doing something wrong too.

I don't know about legal harm, but my ideal solution to this would be "The girl who punched the asshole boy gets a finger-wagging, and the boy gets told he had it coming and to stop being a little shit."

Obviously, that's too old school and common sense for a school to do.

Photoshopping a woman's face onto a pornographic image has long been understood to be a shitty thing to do and possibly actionable. But AI-generated real-person-porn is probably just something society is going to have to get used to.

In a world where social shame was still effective, doing it would be pretty damning and would probably result in ostracization. It's not clear what one has to do to 'compensate' for the situation, though.

Similar to being a peeping tom, or a subway groper, or anything else that intrudes on people's strongest-held social boundaries, even when the harm inflicted is de minimis.

But the problem is that shame would also kick in for stuff like a young girl hyping up her debut on Onlyfans once she turns 18 (link is mostly SFW but you'll see some thirst trapping). The puritanical ethics required here would condemn both the voyeuristic act and the exhibitionist act.

It's rather schizophrenic that there's basically unlimited tolerance for (adult) women producing pornographic content of themselves, but shame is still heaped upon the consumers, as if these weren't both inseparably linked and necessary components of the "empowerment" equation here.

Like I said before, worst of all worlds.

This young lady is doing something bad, and most people will agree. If her nudes get disseminated for free nobody will have sympathy for her.

And we will do what, precisely to rein in the behavior?

Call her a slut, like you’re already doing.

Look, I may not approve of sex work but I totally acknowledge you can't get rid of it. Women being harlots had few legal consequences in 1900 either.

Well you specifically said it there. "Legal" consequences.

What other sorts of consequences might have regulated this particular aspect of the economy?

There's presumably some middle ground between "try to stamp it out entirely via force of law" and "suggesting that prostitution is a bad career choice for almost all women is a bannable offense."

But the problem is that shame would also kick in for stuff like a young girl hyping up her debut on Onlyfans once she turns 18 (link is mostly SFW but you'll see some thirst trapping). The puritanical ethics required here would condemn both the voyeuristic act and the exhibitionist act.

I've already said that I am pro-slut shaming.

That said, there's a difference between someone willingly posting their nudes and someone not doing that. I think the OnlyFans girl would have a harder case to make about being harmed by someone generating AI porn of her, versus a girl whom you think should just accept that all women are being punished for the OnlyFans girls.

There's a difference, but I get confused about the secular reasons for why it's meaningful.

Sex and nudity is supposedly no big deal, especially if you're attending a pride parade, but it absolutely IS a big deal when it's someone's nudes hitting the internet, evidently. Shame, embarrassment, I dunno, it seems just taken as a given that it demeans the subject to be exposed in such a way. But if they publish those exact same images themselves, it is not demeaning?

There was a minor hullabaloo when I was in college involving 'Slutwalks' making it acceptable for women to wear skimpy clothes in public. And the "Free the Nipple" movement which, among other things, tried to make it acceptable for female nipples to appear on, e.g., instagram.

But then what I noticed is that almost no women (well, no attractive women) used this newfound power to actually go around in public topless or scantily clad, or post topless shots to IG. THEN came OF where they could monetize it and things REALLY got locked down.

So culturally we're told sex and nudity aren't a big deal, don't be prudes. But ECONOMICALLY, people (mostly males) spend billions upon billions of dollars to acquire sex and view nude women. So the only distinction I can really grasp is "am I getting paid for this or not." Which applies to many things, granted.

But where does that leave us?

Sex and nudity is supposedly no big deal

According to whom? The leftists you hate so much? Yeah, some will make that argument, but even they won't say it's no big deal when it's non-consensual.

I don't think the pride parade demographic is particularly representative even of leftists.

But where does that leave us?

I think pretty clearly it's morally objectionable to generate AI porn of someone who is neither a sex worker nor someone who wants AI porn generated of them. What the law says, I am less sure, but I expect the "revenge porn" laws will probably be expanded to include "Generating AI videos of your ex fucking a horse."

According to whom?

Well, the amorphous cultural norms brought on by the sexual revolution, more to the point.

Abortion? On demand. Contraception? Everywhere. Marriage? Optional.

As the biological consequences for having sex with whomever you wanted were abolished, so too were the social consequences.

The leftists you hate so much?

I do not hate them, but I do not want to live amongst them.

And if people were better about choosing to live around people who genuinely shared their preferences and norms, much of the problems we're discussing in this particular case would evaporate.

I mean, the Muslims have solved it their way. Keep women covered up whenever they're in public. This "works" but, (as I'm sure you'd agree) this requires unacceptable restriction on female autonomy. If they only live among other Muslims, this tends to work "OKAY" (women stoned to death unavailable for comment).

We westerners have clearly NOT solved it in a way that is satisfactory, and we seem to make up the rules on the spot based on the relative status of the involved parties. I'm reminded of this every time I see a gym influencer post a video of some guy allegedly gawking at her body without her consent, and posting said video so that all the anonymous onlookers can... gawk at her body.

We CANNOT sustain a system where people are allowed to wear whatever they want without regard to its reception by onlookers, and the onlookers are only 'allowed' to enjoy the view if they're approved as acceptable by the wearer.

I don't want to live amongst people with such irreconcilable standards.

I think pretty clearly it's morally objectionable to generate AI porn of someone who is neither a sex worker nor someone who wants AI porn generated of them

What about making unflattering caricature art of them? Or depicting them in (non-graphic) torture scenes? Or just changing their skin tone to a different color on purpose? This all seems like we're treating nudity (or even just softcore titillation) as special pleading.

At least, with kids, I think there's a sustainable moral argument for why we don't want people sexualizing them, and cast a suspicious eye at those who do.

And a question I've asked before: what if you find someone who is a very close lookalike, have them pose for nude shots, and post them, but never actually say that was your goal? It was a common enough practice among pornographers back in the day.

It seems like this is basically suggesting norms of "look at whatever content I choose to post, but do not ever interact in any way I might find unpleasant."

Which runs extremely counter to how internet culture as a whole works.

That's why I specified morally objectionable. I don't have to think it should be illegal to think someone who does that is a shitty person. I can legally draw a picture of your wife fucking a horse and post it online. Even if you couldn't press charges, you'd probably want to punch me, and most people would think I deserve it.


I mean - in Eugene, Oregon, topless women are not terribly uncommon.

But if they publish those exact same images themselves, it is not demeaning?

Yes. Consent and agency are necessary considerations in plenty of moral decisions/outcomes, sexuality included. It seems intuitive to me that the proactive decision to publish sexual content is a vastly different experience than having someone do it under your nose. Money need not apply.

A few (admittedly imperfect) analogies involving consent to illustrate my point:

  • A billionaire choosing to donate his fortune to a developing country vs. his funds being seized by a government and donated against his will.
  • You choose to donate a kidney vs. the ambulance coming to your house and taking it from you.
  • You choose to tell a secret to your friends vs. a loose-lipped confidant broadcasting it to the masses despite your wishes.

In all these cases, the former option is fine when done of one's own volition, but becomes a problem when another actor steps in. There are almost certainly philosophical papers that provide the premise-by-premise reasoning for this sort of argument, but hopefully you get the picture.

In a way, the body, particularly the sexualized body, is something of a possession. It can be given and taken away, shown and hidden. In some sense, it is a commodity that we have "ownership" of, and many consider it the sacred domain of the individual. Sexual acts are high stakes, which is why it is so terrible when they are done against one's will and why it is considered a statement when someone takes bold public action with their body, for better or worse. You could argue that it is demeaning to publish sexual content on moralist grounds (i.e. public sexuality is inherently debasing) or consequentialist ones (i.e. public sexuality leads to negative behavioral outcomes), but these arguments are complementary to, rather than overriding, ideas of agency and consent, in my opinion.

Well I'm gonna have to drill down deeper as to your logic here, which I can accept as facially valid.

What is actually 'removed' when the image is published?

Similar with the secret, a breach of trust is a breach of trust, but unless you signed an NDA that expressly laid out how to calculate damages, then your remedy is "never trust that person again."

Vs. losing a kidney or having your money taken, where you can absolutely point to the thing that you lost and demand recompense for.

I would not be arguing this if we were talking about actual physical rape of a person, which is clearly a violation of a concept of 'bodily autonomy,' I think taking a photograph of someone/something is inherently less of a violation.

Publishing a photo is a step beyond, I can absolutely grant, but as I alluded to before, the only actual dividing line I see between a demeaning violation and an acceptable act isn't how the viewers receive and react to the image, but whether the original subject will get any money from its publication, not that they have lost something that was in their possession.

Like, consider a situation where a woman takes a nude photo, then fat fingers it and accidentally sends it to the wrong dude. Then, mortified, she demands that he delete it and excoriates him if he comments on it approvingly. Or comments on it at all.

Is HE in the wrong if he views and enjoys this image that wasn't intended for his consumption? Or is SHE in the wrong for sending unsolicited pornography to an unwitting recipient? Is he obligated to delete it? What's the difference? Once it has been sent, how is she harmed by it arriving to the wrong person?

Because I think if we take your express logic to any extreme, it also becomes objectionable to imagine someone naked, especially if you derive pleasure from it.

It's a false statement about you that harms your reputation, often in measurable dollars and cents. One can easily imagine someone losing a job or marriage over fake nudes, and it's not like damages for emotional distress is some foreign concept lacking decades of court precedent.

Your public reputation is a valuable thing that can absolutely be damaged by someone distributing fake nudes of you, in the same way it can be damaged by someone spreading lies about your untrustworthiness in business or your lack of professional qualifications.

Most women don't want to be porn actresses. Making them into ones without their consent is obviously wrong in a way that them choosing to become porn actresses of their own free will isn't.

Is he obligated to delete it?

Yes.

What's the difference?

Intent. If you mean to pay your electricity bill but by mistake send $200 to Mr. Random, do you expect Mr. Random to send you back the money or not?

Once it has been sent, how is she harmed by it arriving to the wrong person?

Because if she's not selling images of her nudity for money, it was an intimate shot meant to be shared only with the person she is in a romantic relationship with, and who knows what Mr. Random is going to do with it? Maybe he'll show it around to his friends. Maybe he'll post it online. Maybe he'll try and blackmail her with it.

I don't think people should be sharing nude photos, boyfriends or not, but that stable door has swung off its hinges. So the next best thing we can do is maintain control over our property, which includes photos of our bodies and faces.


The examples I provided are not 1:1 analogues to sexuality but rather illustrations of consent in practice. I don't believe there needs to be specific recompense in these situations for the interference of an outside actor to affect consent. In the case of a secret, you're correct that the victim has little to do other than not trust the person again. I think that's tangential, though: the secret-spreader has still committed a violation of some sort. The release of sexually explicit photos is similar. All that can be done is have them taken down, but it would be hard to argue that some principle of consent/agency wasn't violated in spite of this lack of direct recourse.

Taking a photo of someone is less of a violation than rape, sure, but a lesser violation is still a violation. Petty theft is less of a violation than grand larceny, and they're both prosecuted.

I'm not sure I totally follow your point about money. If I'm restating you correctly, you're saying that modern ideas about the acceptability of these things hinge on whether or not the woman gets paid, not on the reaction of the viewer. My response to that would be: who cares what the viewer thinks? Money is a useful moral fiat that people bend their preferences for all the time: they're employed. If someone forced me to work, that would be loathsome, but I do it for money. It is "reasonable" that many women bend their sexuality in this way, even if I find it socially problematic. I don't think it's somehow hypocritical or irrational for money to play a role in moderating people's moral preferences. There's decades of social psychology research to support that idea. I'm not sure what your ideal outcome in that scenario would be.

The situation you paint is a bit too specific for me to argue in detail but overall I would say: if the woman sends an image accidentally and requests it be removed, doing so is basic common courtesy and respects her right to privacy. Of course, there is no mechanism whereby the recipient is obligated to do so, but it seems straightforward to me that he should do it. Perhaps in an appeal to the social contract, perhaps in respect for her autonomy - I can't argue it in great depth right now but I think you understand my point. He shouldn't be shamed if he finds it attractive - that's arguably involuntary - but doing anything to further exacerbate the uncomfortable situation is clearly morally dubious.

Yes, it is objectionable in my view to imagine someone naked without their consent. It's not a tremendous violation because it has minimal social consequences and effectively doesn't exist unless it's talked about, so I would never consider legislating it or even shaming anyone for doing it on occasion. We are human and we fantasize. That said, if I heard that someone was imagining the women passing them on the street as naked all day, I'd think less of them - a mental gooner is still a gooner. It's a matter of degree.

So the only distinction I can really grasp is "am I getting paid for this or not."

Which is why the difference is meaningful in the first place. The sharing of intimate images is something women in general want to be paid for, so obviously doing that without permission is bad; meanwhile the other sex stuff doesn't have anything to do with that, so it is OK, and it works as a "haha, ur a prude" trap for people who can't or won't understand the former dynamic is all that matters.

It's not any more complicated than that.

Yeah, but she also wants to completely dodge the reputation that comes with trading her sexuality for money.

And of course, there's still often a guy in the picture actually arranging for her to sell this stuff. In this case, OF clearly profits far beyond what all but its top-performing producers do. And it's owned by a dude.

I'm not even denying that there's a fundamental transactional nature to all this stuff, even if you're in it for marriage and kids... it's just that it's now literally reduced to a commodity that gets haggled over, and people who 'have' to pay for it are viewed as losers, whilst anyone who is successful at getting attractive women to give it up without explicitly paying is either extremely crafty or inherently high status.

Let me repeat that: sex is a commodity which can be purchased at various price points depending on the quality, so it's not hard to acquire in the abstract, but being able to acquire it without spending money somehow makes you a God amongst men. Our old-school ape wiring seems somewhat at odds with our later 'homo economicus' upgrades.

Where's the harm in teenage boys faking nudes of a 13 year old girl without her knowledge or consent, indeed very much against her consent?

Well gosh gee whiz, why on earth are women such picky, fussy, hypergamous trollops who don't want to marry just plain ordinary guys? No wonder we need to force these women into marrying normal men who think nudes of 13 year old girls are just fine! Why is anyone getting het-up about this? Men like nubile women, young means fertile, and if she's old enough to bleed she's old enough to breed, right?

The boys are just doing what boys do! Boys are gonna be interested in girls of their own age! Boys will be looking at porn, and porn is fine and normal and in fact is good for society since it reduces rape and sex offending crimes (citation needed, of course!)

I know, I know: Amadan is going to hammer me for using sarcasm. But how else am I to react to "how is the girl harmed by this?" unless I get really angry and abusive, in which case I'm still going to get the mod hammer.


There is a lot of daylight between "it's not okay to make nude deep fakes of 13 year old girls" and "an individual was harmed when that deep fake was made," and even more from that to "someone should be sent to prison about it." None of these things actually imply the others.

I know, I know: Amadan is going to hammer me for using sarcasm.

Sigh. Yes. Come on, you're just trying to be provocative and you knew you were going to get reported.

It's not the sarcasm that's the problem. It's that you (and I mean you specifically) can make an intelligent and cogent argument for why this is bad behavior that should be discouraged. And you can even be (a little) snarky about it. But when you layer on the "gosh gee whiz"s and straw men obviously directed at the person you are responding to, of course you're dialing up the antagonism in a way that doesn't actually lead to productive engagement. You're just trying to say "You're a shitty human being unworthy of a respectful reply" without using those words.

The "big yikes not a good look chief" millennial slop has ruined online discourse because it feels so good to get off those snide, snarky little burns but it absolutely destroys any hope of good faith dialog. Do you want to talk to people and maybe enlist their sympathy and get them to see your side of things, or do you just want to score little zingers?

The sarcasm is to prevent me going nuclear with abusive language. "So, okay, just shut up, you don't need to comment on this".

Yeah, that would work - if we didn't have guys posting about 13 year old girls in this fashion. I used to be a 13 year old girl. I can't be coolly objective and removed when it comes to stuff like this, because God Almighty if we can't even preserve some few rags of compassion towards children, what the hell are we doing even trying to keep this society going? Let it burn down. Let the TFR crater. Nothing is going to be missed.

Then be nuclear with abusive language and eat the ban. At least you will have made your point clearly for once.

Remember that the next time Amadan bans me, Southkraut. If I go down, I'm taking you with me! 🤣

I don't want to ban you.

But if the Jews on the Motte can cope with all the Joo-posters saying "The Holocaust didn't happen and it's good that it did," you can cope with some misogyny without losing it every time.

Honestly I'd prefer if you just adopted some more sang-froid, didn't treat this place as a twitter/discord/reddit meme-and-karma pit, and consequently don't get banned.

I mean, I too have my buttons and step out of line when they get pushed. Nobody here is an automaton (I hope). But this whole place works to any good only when people try their damndest not to chimp out, do get slapped when they do, and the largest part of our business gets done with some actual charity. Skirting the rules to fly as close to shit-flinging monkeydom as possible but just short of getting banned is just plain not productive.

You can make reasonable posts. You sometimes do. But eyeballing it, at least half of your output is antagonistic. And that part is worse than useless. Would it kill you to give us more of the other part? The one where you bring in views that are rare on the Motte, and don't make it a snarking contest?

You may as well post animal porn and go out in a blaze of glory, because your complete inability to refrain from embarrassing emotional outbursts every time a post rubs you the wrong way means you're doomed regardless.

I liked the guy calling me "middle-aged". Oh no, I've lived past early adulthood, how embarrassing! We all know women should die the minute they're not hot by male standards, which seems to be "over 13 because fake porn of 13 year old girls is hot and normal".


if we can't even preserve some few rags of compassion towards children

You mean girl children, right? Because compassion towards boy children is notably absent, and contempt is its substitute.

No, I mean everyone. I think those boys would be the better for a few slaps on the legs from their parents, but that is for their own good, not because they had the temerity to offend against a female.

I don't think it's good for boy children that we seemingly are to expect that at the age of 13 they will already develop a porn habit, which is right, normal and good.

In this case the boy children in question did something very bad.

But in the general case it holds up. I cannot imagine this level of outrage if a boy were being bullied (or suffering equivalent psychic damage) by girls. Ultimately girls get compassion and boys get told to man up and/or shut up. I am honestly surprised they expelled the girl in this case, and I wonder if it has anything to do with

inviting others to join her

i.e., whether she organised a group beating. Which, on reading, she did:

She hit him a second time. Then, the principal said, the girl asked aloud: “Why am I the only one doing this?” Two classmates hit the boy, the principal said, before the 13-year-old climbed over a seat and punched and stomped on him.

If the genders were reversed, the boy doing the beating would get a lot worse than a temporary expulsion.

Well duh. A norm of 'you can just harass our daughters who did nothing wrong and followed the respectability rules in place in our society' resembles no historical society that has ever existed, including very patriarchal ones; what do you think the Taliban would have done to these boys?

There is a reason that, despite being skeptical of things like the 19th Amendment, women's financial independence, women in pants, etc., I do not align myself with the faction on the Motte which spits out spicy takes on women and then retreats to 'but but, double standard! Boys just have to take it!' Yeah, they do. Aside from the politically correct but very obvious fact that the statement 'men and boys are stronger than women and girls' does not need qualification (it applies psychologically as well as physically), you, uh, know what women do when they spend their girlhood being bullied and harassed by whichever boys want to do so? It generally looks like radical feminism; the breakdown in family formation in Latin America precedes the hyper-woke feminist wave in the region (and also the area's drop in TFR; these women appear to have retained conservative family values longer than their menfolk). Getting outraged about girls being abused and harassed with no recourse, far more outraged than in the case of boys, is a normal and healthy thing to do.

Yes, if a group of older girls doctored innocuous photos of a thirteen-year-old boy to be NSFW and started sharing them, this would be a bad thing and deserve to be punished. I am confident that you will not be able to find an example of this happening (you will, of course, be able to find examples of teenaged boys voluntarily sending NSFW content of themselves to young ladies who don't particularly want to receive any). The asymmetry of the sexes goes both ways. You can call me patronizing, but I'm not wrong.


It's that you (and I mean you specifically) can make an intelligent and cogent argument for why this is bad behavior that should be discouraged.

If I (and I mean me specifically) am unable to make intelligent and cogent arguments, will that be seen as a mitigating circumstance when moderating my comments?

Your logic suggests that you'd have no objection if a 13 year old girl published nudes with knowledge and consent.

Is that true?

Is consent the defining factor here?

I just want something on record.

I think 13 year olds should not be sending nudes to anyone. However, as a society, we seem to have decided sex is for everyone and we should all start, the younger the better. Romeo and Juliet laws, because aw, statutory rape is such a condemnatory charge! It ruins your life (if you're the guy fucking the younger girl)!

So if we're going to say "okay, yeah, 13 year olds can have boyfriends/girlfriends" (instead of "What? No, you are too damn young! Wait a minimum of another three years before even thinking about dating!") and if we're going to say "hey, sex is a beautiful, natural, instinct that everyone has a right to engage in, and once you hit puberty you're old enough to make up your own mind" (even setting aside the "well ackshully in Classical times 13 year old girls were routinely married to 30 year old men, this is why I am not an ephebophile/why women should be married off as young as possible by their fathers picking a husband for them" set), then at least let it be in the context of a relationship, where she knows and consents to giving her boyfriend nude photos.

I still think it's a very bad idea because it's highly likely the first thing the boy will do, regardless of how he swears he'll only keep them to himself, is share them around with his friends. But at least then it's a decision, even if it's a bad decision, made by the girl. The consequences there are the punishment: yes, you can't trust men when they swear fidelity, yes men are only after one thing, yes you now have a reputation for being a slut, yes more people than you ever wanted to know now know what you look like naked (and maybe even trying to strike sexy poses), yes this is your fault as well. Learn the hard lesson and don't do anything this stupid the next time.

Fake photos of her that the creator pretends are real, that she sent him, that he's fucking her, that she's easy, hey guys have a look, you could get a piece of this whore - yeah, she is perfectly entitled to punch him in the face for that.

And so it sounds like we're worried about something other than a child's consent being present or not.

And of course, you're seemingly expecting that the female side of the equation isn't going to be mature and wise enough to make good decisions here and thus is not blameworthy.

But you get young guys, who are similarly immature and unwise, and you expect them to behave with maximum propriety, and if they do not, then they should expect immediate and swift reprisal. I don't see why leniency due to inexperience and immaturity absolves one but not the other.

If we think kids engaging in uninhibited sexual activity is bad, and, in that vein, that sharing nudes is bad, I simply suggest that we are concerned for reasons orthogonal to 'consent' and should thus apply rules that restrict all the parties' behaviors, possibly for their own good, regardless of who did or did not agree.

I think 13 year olds are not mature enough to make good decisions. If a 13 year old gets convinced to provide nude photos, they've been taken advantage of. It doesn't matter if they're a girl or a boy.

Where it's pardoned, as seems to be the case here ("haw, haw; of course 13 year old boys are horndogs, of course they want naked photos of girls, what harm did they do?"), that's equally bad. I'm not saying "excuse the girl", I'm saying "the girl in this case did not do anything wrong, yet she is being punished for it".

Yes, and I'm saying "we accept that most 13 year olds can and will make horrendous decisions, and we try to correct those decisions without making it out as a double standard where only SOME teens are culpable whereas others are not."

And it's probably not good to overreact and treat teen guys as evil rapists for an action that, objectively speaking, involved no physical force or coercion.

I would mostly agree with this. It's just another example in the long list of examples of why the consent standard, when applied to sexuality and sexual interactions, is more or less useless.

I don't think it is useless, but man, people do not seem to really know what they mean when they say "consent." Worse still, they don't really know what they mean when they say they "consent" to some activity.

With sex in particular, the emotional valence and intensity of the moment can shift by the minute. Then, reassessing after the act, someone may decide that some particular part of it they 'agreed' to in the moment was actually a violation.

That is one of the main problems with consent as a standard. It does not hold up under any of the hard cases.

And with sex IN PARTICULAR, there is no reasonable way to go back and assess whether it was validly given or not or whether the lines were crossed. I noticed this issue in law school. "Wait, how the f@&k do you establish evidence for lack of consent when it all happens behind closed doors?"

Unless you film the whole interaction and that opens up the whole can of worms that we're discussing.

there is no reasonable way to go back and assess whether it was validly given or not or whether the lines were crossed

Yes, which is why intelligently-designed laws around sex sidestep the issue.

Let's look at the sex laws described in Deuteronomy and the social dynamics they encourage downstream, remembering that this was a time when asking people to deny their human instinct for immediate revenge was far more novel (and far more difficult, given the lack of state capacity) than it is today. We can therefore say these are laws/accommodations/compromises made because human biological instinct ultimately runs more along these lines than any modern view.

The relevant TL;DR here is:

  1. If she actively cried rape, but in a place nobody could reasonably hear, she is assumed to have been raped by default and the judgment (of death) falls solely on the man. This protects the woman- if she actually values the life of the man, she won't run off somewhere this can't be detected (and because the penalty for rape and murder are otherwise the same, there's no other incentive for the man not to just kill her)[1].

  2. If she didn't actively cry rape in a place someone could hear, then judgment falls on both (forced to marry if woman was unmarried, and death for both if married or engaged). This ensures consent [to having participated in the violation, also [0]] cannot be revoked after the fact, which protects men, and also ensures that- if they both did want this- the woman is bound to/invested in total secrecy for the same reason the man is. It also protects men from cuckoldry in an age where sex always results in babies[2], but this is more a "make sure our warriors don't burn down society by refusing to fight and instead throwing open the gates" thing.


Now, because modern society is objectively stacked in favor of women (and this observation is a point towards this interpretation), we observe that we still have the strictures of the first (that protect women) but without the benefits of the second (that protect men). That is why we push further and further into "consent can be revoked at any time"- that's simply what we should expect from female gender politics and the associated ignorance (intentional or otherwise) of the inherent moral hazard that being able to retroactively cry rape enables.

In a society that's objectively stacked in favor of men, by contrast, we should observe that the second case dominates. It looks a little different when this happens; there tend to be a lot of mistresses, divorced wives left with nothing, fathers disowning their children, old women intentionally locked out of self-sufficiency, casual ass-slapping, and all the other things Boomer women (and their [progressive] daughters) complain about the 50s and 60s for featuring.


[0] Feminists get angry about "has to marry the rapist", but ignore that all sex [outside marriage] can trivially be called rape for the same reasons it's so easy today (and the community at that time would agree; there's no reason a woman would ever have sex for pleasure- something traditionalists and progressives agree on, as it's dishonorable). So the law here is "virginity is part of a woman's inherent value [normal men and women agree on this point completely provided contraceptives don't exist]; you break it, you buy it, and the woman is in charge of saying you broke it retroactively and at any time" (which is what progressives want to be able to do).

[1] If I recall correctly- and you'll have to correct me on this point- the US still has a version of this law (where it's some separate sex crime charge if either participant leaves their state of residence- in fact I'm pretty sure the mere suggestion of such is a crime).

[2] Traditionalists in particular will harp on and on about this because their instincts are incompatible with the technology that makes this so, but what that argument actually means is that the father of the family shouldn't be expected to lay down his life and toil to advance children that aren't his. This is an argument that doesn't directly apply to women, so women naturally assert this dynamic doesn't exist... until the young men refuse to fight an enemy that promises better terms for young men. (And yes, the lack of enemies in the West means women will further not see a need to co-operate until it's too late.)


"Wait, how do you establish evidence for lack of consent when it all happens behind closed doors?"

But see, that's what consent is: it's simply a codification of who is assumed to have automatically cried rape. This is how "children can't consent" can even be a coherent sentence, because on a dictionary reading it's nonsense. This case is also the steelman for having this law- traditionalists and progressives are actively harmed by the existence of sex in general so it's understandable- but naturally, most of the fight in this case is over who gets to be "a child" -> "who automatically cried rape".

Naturally, because young women (13+; both trads and progs call this "child" for property rights and anti-sexual-competition reasons respectively, but biology doesn't agree with their assessment) are sexual competition for women but not men, women will push as hard as they can to make sure the definition of "children"/"automatically-considered-raped-for-sympathetic-reasons" is as wide as possible. And I'd say "anyone who claims to be, at any time, ever" is pretty fucking wide already- the fact they semi-seriously want to set the age to 25 is proof they won't ever be satisfied with that.

Of course, setting it to "infinite" is just Deuteronomic law through the back door, and that's not going to happen as long as there are enough liberals (or liberal-sympathetic) who want to fuck more than one young woman. Hence the stalemate.

So, back in high school, someone made a fake photo of me and posted it in a classroom. It wasn't a nude, but it was political, depicting me as Stalin, as I was an outspoken socialist. I was outraged ("the photoshop is not even accurate! I'm a TROTSKYIST!"), and it definitely hurt my feelings and hurt me socially. Pretty clear case of bullying, but, in retrospect, it was pretty hilarious and a useful learning experience. Should that kid have been punished?

I don't think so, and I suspect you don't either (though I'm curious if my suspicion is right). Which shifts the question to: what is the difference between a nonsexual representation and a sexual one? I think, to many of the people who don't see harm here, harm categorically isn't something that can be done with an image or words: sticks and stones can break my bones, etc. If people start physically attacking someone, or destroying their property, in response, there is harm, but the harm originates from the physical act, not the instigating image. The introduction of a sexual element doesn't change this. (I'm speaking here in terms of conceptual framework, not legal definitions.)

That doesn't mean that the school shouldn't do anything about the boys--schools can and should regulate behavior above and beyond the minimal standard of harm. But the idea that actual physical violence should be punished less than images and words is weird to me, especially when school administrators had no actual evidence of the images.

You're a guy. Twenty years later, someone is not going to drag up that fake nude of you and use it as evidence that you are a lying, cheating, cock-carousel riding whore who wants a share of an alpha, will settle for a beta to support you and then cuck him with that alpha, will try to pawn another man's children off on that beta as his own, and are a prime example of why women should not be allowed to vote, attend higher education, or be permitted out of the house by their fathers until the day you marry the man he picked out for you, after which you will spend all your days in that house under the rule of your husband.

(Those are some of the highlights from the commentary around 'how to solve the TFR problem/why aren't women dating men/what can we do about the lonely, sexless men who can't get a wife' on here, so far as I've gathered them).

The difference is that a video of you as Stalin is obviously a joke. Do you think you would have felt the same if instead they showed a video of you masturbating on a television in the classroom?

I would have felt very differently: I would have cared much less, quite honestly. "Oh, someone's a weirdo, anyone whose opinion of me changes because of it isn't worth caring about." And I'm not sure that making the AI-generated nude clearly a fake joke (giving her purple skin or whatever) would change anyone's opinions. I think the crux of the matter is that it's a sexual image, and we cordon off sexuality as requiring unique, almost spiritual protections around it.

What is the legal harm here, is the question that @faceh asked. Mind you I disagree with faceh, I think the harm here is pretty obvious even from a legal point of view since defamation per se usually covers allegations of sexual misconduct as well (but as I said elsewhere I'm not a legal expert here and could be completely wrong). Please try to respond to the argument faceh is actually making instead of devolving into mocking and sarcasm.

Okay, so I burned down faceh's house and broke faceh's legs and stole all faceh's money in their bank account, but what is the legal harm here? Was any harm even done if there's no legal harm?


I mean, I can run you through the entire philosophical underpinnings of the Anglo Legal Tradition that explains the "harms" that, e.g. physically damaging a person's body, or removing funds they 'earned' without their permission, or demolishing objects that belong to them entail.

I did go to school for that after all.

But somehow I think you'd be nonplussed.

I'm not nonplussed. I can recognise "heh heh my chance to be an edgelord" when I see it.

Look, I'll make this clear, as that is the spirit of this whole forum:

I do not care about being an edgelord, I do not care if you are offended, or if you are flattered. I care about being correct.

Edgelords end up being correct in some cases because they can speak on matters that polite people will carefully ignore.

Being correct is not a necessary condition for being an edgelord, however.

If my being (mostly) correct makes you think I'm an edgelord, whatever. I hold a ton of other opinions that are 'edgy' to some crowds but very normal in others. I'm not trying to shock or offend anyone.

If anything, it's "heh heh my chance to be a high decoupler." But even that isn't really accurate. Being a decoupler is also adaptive in the legal field, though.


If you want me to shut up about this topic; or ideally, you want me to come around to your side and agree that you're correct, there's plenty you can do.

  • Find data that contradicts mine, and show me my data is flawed.

I personally keep trying to find data that disagrees with me. The unfortunate truth is the more I look I keep stumbling across more data that suggests the other data is accurate and my position is correct.

  • Point out where my analysis is flawed. I'm taking good data and misinterpreting it, or missing a lot of context, or am engaging in clearly biased/motivated reasoning.

  • Point out where my premises are flawed.

  • Show me a superior theory that is still supported by good data.

If it's NOT the case that women have gotten progressively less appealing to men, less mentally stable, more antisocial, more unpleasant and unhappy, and that this seems uncorrelated to male behavior...

Well, what is the competing interpretation?

  • Point out a solution that I'm missing, or explain how the problem isn't a problem, or the problem will go away on its own.

  • Point out actual counterexamples that show my position is not a sufficient explanation of the observed data/phenomenon.


Basically, I'm sitting here with data I think is mostly solid, all pointing in the same direction, with a general theory/interpretation of it that completely explains the data I'm seeing, and suggests particular solutions to the nature of the problem.

And EVERY TIME I ENGAGE WITH SOMEONE WHO THINKS I'M WRONG, they utterly fail to undermine or effectively attack any pillar on which I've formed this belief. It's all argument from some position of personal belief that doesn't appear to be informed by true facts in the world, and often is based on experience that is utterly outdated, from a time before the problem manifested in earnest (read: older than 10 years, before dating apps arose).

I remain open to being persuaded. But I will simply no-sell the social pressure and shaming attempts. I do not care if you think it's 'cringe'; I do not hold your (or anyone else's) opinion of me in high enough regard for that to matter.

If you find this personally discomfiting I'm 'sorry' but this is one of the few places on the internet where social pressure is not a defining variable in how our discourse proceeds. I want you to come back with an earth-shattering insight that proves this part of my worldview incorrect. I will not respond to attempts to prove my worldview 'icky,' or 'unpopular' or 'impolite' or 'just c'mon dude really how can you say that?'

You're convinced you're right. I'm convinced I'm right. I don't think we can go much further on from here, apart from butting heads and drawing the ire of the mods.

Hot button topics are hot, and get people hot under the collar. And when I'm hot, I react with heat.


Not to mansplain your own violent revenge fantasy back to you—but you do know that as a middle aged woman, an attempt at breaking @faceh's legs would almost certainly result in a reverse uno card getting quickly played against you? Or for that matter, such an attempt against any man in this thread who you wish to "[go] nuclear" on for being insufficiently reverential toward girls and women, but especially against one with an MMA background.

Granted, perhaps the hidden premise is that he's already cooked like Anakin by the housefire but with legs still intact before the knee-breaking. Either way, what a curious toughgal LARP.

Didn't we have a distinction between rhetorical leg-breaking and actual leg-breaking on here when it came to comments? Anyway, no of course I can't physically break his legs. But it's a hypothetical: we all recognise the harm done by physical violence, nobody would attempt to defend it by "but is it legal offence, is legal harm done?"

Same with shaming and humiliating a minor child by producing lewd and fake images of her. If you really need "but is it legal harm, otherwise it doesn't count", then I submit you are not part of civilised human society.

Legally, you committed several clear and well established crimes (arson, battery, theft) that caused clear and well established harm, both in the legal and moral sense (loss of the house, medical bills for the legs, loss of the money in the account).

Faceh never argued that there was never any moral harm, and I doubt he believes that there wasn't any. But what was done in the article doesn't clearly and neatly fit under any existing legal framework like revenge porn laws and defamation laws. I (and others) think it likely falls under defamation, but other legal precedents like Falwell v. Hustler make that unclear, so we'd likely need some court cases or new statutes to establish a clear precedent.

So instead of being snarky and sarcastic to faceh, you could make an argument like "I think this behavior falls under [existing legal framework] because X" or "I don't think it fits under existing legal frameworks but legislatures could make it illegal without running afoul of [the first amendment/existing precedent/whatever] because X." It really isn't hard, you're just choosing to react with snark and sarcasm instead of an actual argument.

I think there's a plausible false light (and defamation per se) claim, given that the images in this situation were being shared and would be themselves illegal for her to produce. Even for deepfakes-of-adults, false claims of sexual promiscuity would fall into these categories. There's some theoretical examples where a Falwell v. Hustler-style defense would be relevant in the case of a public figure where the deepfakes were clear parody, but that's pretty far from the typical case. But from a traditional law perspective you don't have to pull a Gorsuch to find a civil tradition against this sorta stuff.

Useless, though, since the kid who did it is judgment-proof. In theory, state law would allow six months' imprisonment per act, but in practice that's really not how the juvenile court systems work, and even an adult doing this to another adult is more likely to just end up with a fine. And while both the boy generating the deepfakes and those passing them around (or even receiving them) could probably be charged with federal CSAM stuff, that's such a nuclear option that it's extremely unlikely anyone would seriously even threaten it here.

Which is part of why the whole thing is such a mess.

Given that teenagers have been charged with the production, possession, and distribution of CSAM for sending nudes of themselves, CSAM charges in this case don’t strike me as anything close to nuclear, assuming the police can recover the images from Snapchat. The only thing I’m not certain of is whether they actually broke any CSAM laws. Is it actually illegal to draw a photorealistic, but fake, image of a nude minor? Perhaps more to the point, given how AIs usually seem to handle NSFW requests, is it illegal to put the face of a minor on top of the body of a definitely-grown adult? I have no idea, and it’s not something I’m eager to google.

Given that teenagers have been charged with the production, possession, and distribution of CSAM for sending nudes of themselves, CSAM charges in this case don’t strike me as anything close to nuclear, assuming the police can recover the images from Snapchat

I'm mostly using "nuclear" in the sense of "the biggest available weapon, and its resulting proportionality concerns". Those style of prosecutions happen, but they're pretty uncommon, even though there's good evidence to think the chargeable conduct happens more often than anyone wants to think about.

The only thing I’m not certain of is whether they actually broke any CSAM laws. Is it actually illegal to draw a photorealistic, but fake, image of a nude minor?

In the US, it's a federal felony under the PROTECT Act, unless the content also has serious literary, artistic, political, or scientific value. Enforcement is pretty rare, though, since the feds don't want the law to get another challenge like Ashcroft.

And while both the boy generating the deepfakes and those passing them around (or even receiving them) could probably be charged with federal CSAM stuff, that's such a nuclear option that it's extremely unlikely anyone would seriously even threaten it here.

Thought experiment: what if one of the boys snapchats it to his creepy uncle? Then the creepy uncle is in possession of CSAM; even worse if he then darkwebs it to all of his other chomo friends.

Obviously, probably not going to charge the minor male with accessory, right? But this is the "break containment" mega problem for cases exactly like this. Call it "deepfake laundering"; kids unwittingly doing stuff that, in the realm of adults, would be 100% illegal.

I had some thoughts about this back when that DignifAI thing was doing the rounds:

Unlike editing a photo of Trump so it looks like he's riding a skateboard or whatever, I don't think it's hard to understand why editing Alice/Bob's publicly posted photo to make it look more sexualised than the original is crossing a line: there's a significant possibility that people might mistake the edited photo for the genuine article, and Alice/Bob will take a reputational hit, as people will assume that they are the kind of person who shares thirst trap photos for public consumption. Even if the photos are obviously fake (as in the recent Taylor Swift "deepfakes", which look more like the kind of stylised fetish fanart which has been around for years before LLMs were a thing), I think it's still demeaning to reduce a real person to the status of a sex object without their consent.

I thought DignifAI had legs, but of course the gooners demand the opposite service.

I think it's still demeaning to reduce a real person to the status of a sex object without their consent.

Reputational harm is covered under libel/slander/defamation laws, but does require other people to actually believe the falsehoods that someone published.

So exactly how much modification would it take before it's no longer 'believable'? If they stylize it in a different art style?

I keep pointing out that with AI, whatever guidelines you put in place, the AI can tiptoe RIGHT up to them and stick a single pixel over the line.

Ultimately this is also why I don't see how Onlyfans continues to exist as a business model for flesh-and-blood women after this year.

Ultimately this is also why I don't see how Onlyfans continues to exist as a business model for flesh-and-blood women after this year.

I found a couple of fully-AI generated insta accounts, some with moderately convincing short videos.

Unfortunately for them, there are still tells. For example, in one (I can find the link) the model's birthmark kept moving around her body. Or lighting ends up a bit off.

Ultimately this is also why I don't see how Onlyfans continues to exist as a business model for flesh-and-blood women after this year.

That one is easy.

Ultimately, it'll become an even more premium service. Gooners will pay extra for some sort of cryptographically verifiable proof of realness. Setting up a digital chain of custody from a digital camera to actual final presentation on OnlyFans or another site wouldn't be too arduous and (markets in everything) someone will create a service to do just that.
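The chain-of-custody idea is simple enough to sketch. Here's a toy Python version using a shared secret for illustration; a real provenance scheme (something C2PA-like) would instead use an asymmetric key in the camera's secure hardware, and `DEVICE_KEY`, `sign_capture`, and `verify_capture` are all hypothetical names, not any real API:

```python
import hashlib
import hmac

# Hypothetical device key. In practice this would be an asymmetric key
# provisioned into the camera's secure element, not a shared secret.
DEVICE_KEY = b"example-device-key"

def sign_capture(image_bytes: bytes) -> str:
    """Camera side: tag the image hash at capture time."""
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify_capture(image_bytes: bytes, tag: str) -> bool:
    """Platform side: confirm the image is byte-identical to the capture."""
    digest = hashlib.sha256(image_bytes).digest()
    expected = hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

photo = b"...raw sensor output..."
tag = sign_capture(photo)
assert verify_capture(photo, tag)              # untouched image passes
assert not verify_capture(photo + b"x", tag)   # any edit breaks the chain
```

The hard part isn't the cryptography, it's trusting the endpoint: the scheme only proves the bytes haven't changed since signing, not that the camera was pointed at something real.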

Saying that AI porn will destroy "regular" porn fails because it relies on the same logic as "porn will destroy the market for webcam girls / strip clubs / prostitutes." It isn't all about the simple visual arousal, but about a parasocial/pararomantic/paraintimate relationship.

This is where my model of the goonbrain really fails.

If you're going to such absolute lengths to be 'certain' that the girl in question actually exists, surely it's better to pay for in-person companionship?

It doesn't parse, for me, how you can require a physical person actually exist for you to get off to the content, but NOT also inherently prefer that person be physically present. Which obviates the fear of it being faked, entirely.

Although I guess I can imagine a guy who is deathly afraid of getting arrested for soliciting a prostitute which drives him to avoid paying for sex.

Streamers/OnlyFan girls of various stripes tend to be much cheaper than prostitutes/escorts.

...I think. I'm sure someone can trot out an example of a rather ludicrous OnlyFans where the pricing is absolutely stratospheric, but the last time my morbid curiosity led me down that particular rabbit hole, that was my general take on the matter.

I'm gonna say it's comparable to any other gig-economy/gamified app. The basics are pretty cheap, but they rack up all kinds of extra charges wherever possible, and milking the whales is the real profit center.

Basically, buying an OF is marking yourself as a possible sucker just by putting the basic money down, and the ecosystem is going to do its damnedest to drain your wallet.

I haven't availed myself of prostitutes; the standard strip club experience annoys me enough with the constant upsell, even knowing that they can't actually promise the outcome you're hoping for.

Think of a goonbrained version of "never meet your heroes." If a goon actually pays for a prostitute or companion of some sort, he has to fully interact with a person for some length of time. That could go wrong. It could make him feel bad. The distance created by a screen creates a perception of control and the ability to "rage-quit" if things go south. But they still want the knowledge that there's someone real over there.

Don't get me wrong, a portion of the "real people" online sex market will be taken by AI. Perhaps a pretty large portion of it. But there are going to remain these other niches. If online porn and COVID didn't kill strip clubs (and they didn't), I don't see how they ever really go away.

But online porn did kill the dirty movie theater. AI porn seems like it'll kill actually filmed porn but not strip clubs.

Good point.

Does it kill webcam girls and the parasocially heavy OnlyFans accounts?

But online porn did kill the dirty movie theater.

VHS tapes / DVDs killed the dirty movie theater. Online porn killed DVDs.

While I get that, you can still find strip clubs if you're willing to leave the house (this might be the true motivation, the desire to never go out in public).

Reputational harm is covered under libel/slander/defamation laws, but does require other people to actually believe the falsehoods that someone published.

That's technically not a requirement for defamation per se, but I don't know enough about defamation law to say whether or not this counts as per se defamation. I do believe that legislatures could define it as defamation per se by statute if they wanted to though.

Defamation per se still requires that there be an "accusation," which means in this case you'd have to prove that the kids were claiming these AI-generated images were real pictures, not fakes. It's no different than if they had drawn lewd pictures of her.

I imagine that unless it was an obvious parody or fake, or that the boys were explicitly telling their friends that they were AI generated and not real etc., most courts would presume that the images were being presented as real.

And once again, I'm not sure if it'd survive 1A scrutiny.

Revenge porn laws create some, don’t they?

Yes, but usually a component of those is that someone sent you those images, and DID NOT publish them themselves, so you're violating their 'privacy' interest by publishing these salacious images sans their consent.

Modifying an image they themselves published seems categorically different.

I suppose I imagine it's of a piece with something like false light, which operates along similar lines to defamation. The harmed party would obviously be the individual whose photo was edited, especially if there was an intention to pass off the photo as genuine. I don't think you could reach any edited picture with this doctrine, but I think you could likely get non-consensual NSFW edits. In the underage case, my understanding is that digital edits of minors can already be considered CSAM, so I don't know why this would be different.

AI-generated nudes of minors meet the legal definition of child pornography.

“child pornography” means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where— [...] such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

If the Supreme Court ruled that sharing AI-generated deepfakes of actually existing 14-year-olds is protected speech, there would be a constitutional amendment. The general public will not stand for this, regardless of what weird tech-libertarians think.

Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002)

They basically just re-passed the same law struck down here, and have been avoiding scrutiny by only charging people who were already incentivized to plea bargain. Apparently the idea is that if the law is around for a long time before it gets seriously challenged again (in a virtual child porn case not involving anything else), the courts will forget all about Ashcroft and convict. Might work.

Tbf, the PROTECT Act stapled on a Miller test. They're still trying to bypass the 'prevailing community standards' bit, but compared to the pre-Ashcroft version that just pretended the Miller test didn't matter, it's a much wider retreat than, for example, US v. Lopez.