
Culture War Roundup for the week of December 29, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


How about a different kind of AI culture war? I speak, of course, of non-consensual pornography generation. The most outrageous article I read about this recently was probably this AP article: Boys at her school shared AI-generated, nude images of her. After a fight, she was the one expelled. The girl in question is 13, and she started a fight on a school bus with one of the boys later charged with a crime for sharing the images.

The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.

Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.

“That’s when I got angry,” the eighth grader recalled at her discipline hearing.

Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

When the sheriff’s department looked into the case, they took the opposite actions. They charged two of the boys who’d been accused of sharing explicit images — and not the girl.

It turns out that finding apps that advertise this kind of functionality is not hard. In fact, part of the reason I bring this up is that this capability seems to be integrated into one of the largest AIs: Grok. There's been some controversy on X over the last couple of days after Grok allegedly generated pornographic images of a couple minor girls. Additionally, the bot's "media" tab was disabled, allegedly due to the discovery that lots of people were using the bot to make pornographic edits of other people's pictures. Though the media tab is gone, I did not find it very hard to get Grok to link me to its own posts with these kinds of edits.

There is, I think understandably, a lot of controversy going around about this. It's not that it was previously impossible to make this kind of content, but the fidelity and availability were much more limited, and it certainly required more technical skill. Being something you can do without even leaving your favorite social media app seems like something of a game changer.

Frankly, I am unsure where to go with this as a policy matter. Should someone be liable for this? Criminally or civilly? Who? Just the generating user? The tool that does the generating? As a general matter I have some intuitions about AI conduct being tortious, but I have difficulty locating who should be liable.

Notably, the girl was expelled (for a single term) for organising a public and violent beating of the boy:

She hit him a second time. Then, the principal said, the girl asked aloud: “Why am I the only one doing this?” Two classmates hit the boy, the principal said, before the 13-year-old climbed over a seat and punched and stomped on him.

Having an AI-generated picture which pretends to be you is legitimately hurtful, but fists and boots are pretty damn hurtful too.

“She’s already been out of school enough,” one of the girl’s attorneys, Matt Ory, told the board on Nov. 5. “She is a victim.

“She,” he repeated, “is a victim.”

Martin, the superintendent, countered: “Sometimes in life we can be both victims and perpetrators.”

Which seems like a pretty good way of summing it up, although it doesn't fly in public.

Which seems like a pretty good way of summing it up, although it doesn't fly in public.

We have no trouble saying that to male victims of DV if they so much as raise their voice against their female abuser. It only doesn't fly in public when applied to the demographic that is only allowed to be viewed as a victim.

I was being descriptive. The fuller quote is:

“She’s already been out of school enough,” one of the girl’s attorneys, Matt Ory, told the board on Nov. 5. “She is a victim.

“She,” he repeated, “is a victim.”

Martin, the superintendent, countered: “Sometimes in life we can be both victims and perpetrators.”

But the board was swayed. One member, Henry Lafont, said: “There are a lot of things in that video that I don’t like. But I’m also trying to put into perspective what she went through all day.” They allowed her to return to campus immediately. Her first day back at school was Nov. 7, although she will remain on probation until Jan. 29.

Just the generating user? The tool that does the generating?

I think 'tool' is not a good word in this context, because it covers many different things.

The classic way to create fake nudes would be non-AI Photoshop (or GIMP). It does not know anything about nudes; you will have to find a suitable body picture to paste the face onto yourself. This is analogous to a kitchen knife as a tool for violence.

AI image generation is in a very different ballpark. Basically, you can just upload a picture of your victim and click the undress button, I imagine. This is analogous to a squad of marines as a tool for violence.

I am under no illusion that we can rid the world of the undressing AIs. A smart kid can still download some deepfake software and create his nudes that way (or even fall back to Photoshop). But I am also sure that OpenAI, Anthropic and Google will at least have tried to build in safeguards so that their image generation systems will not undress minors. By contrast, Grok's attitude seems to be 'it is fine unless you undress Elon Musk'. I see no reason to allow them to do this.

Of course, I also have zero sympathy for kids trading nudes of their classmates. While I am generally skeptical of the concept of computer-generated CSAM (because hentai seems a victimless crime, whatever the bra size), I have no problem with treating undressing-AI images of real minors as CSAM. As I expect some kids will learn shortly, distribution of CSAM is generally a serious charge, and none of the mitigating circumstances for kids consensually sexting apply. We have mostly managed to train 15yo boys not to violently rape girls (in part due to legal consequences), and I am hopeful that we can also train them not to pass around deepfakes of their classmates.

By contrast, Grok's attitude seems to be 'it is fine unless you undress Elon Musk'.

Probably more like: portray him with a perfect body. (I didn't check this!)

Boys at her school shared AI-generated, nude images of her.

Which is it? Either it's an image of her, or it's an AI generated image.

It seems like, as a society, we're going to have to learn how to distinguish between photos of actual people and AI-generated images that are amalgamations of many different people. Just as literature has always had thinly-veiled fictional accounts of recognizable people, we're going to have synthetic images that resemble real people.

Which is it? Either it's an image of her, or it's an AI generated image.

Disingenuous, unless you think the concept of, say, drawing a picture of Taylor Swift is incoherent. You can generally tell whether a picture is of Taylor Swift, and among people who know her the same is presumably true of this girl.

I don't see how having an AI do it instead of a human changes anything morally relevant; at the very least you need to make the case that it does. You seem to just assume it as a default, but I see no reason for doing this.

Disingenuous, unless you think the concept of, say, drawing a picture of Taylor Swift is incoherent. You can generally tell whether a picture is of Taylor Swift, and among people who know her the same is presumably true of this girl.

Sure. Just like Devil Wears Prada is a fictional work that is nevertheless unmistakably about Anna Wintour.

Drawing a picture of Taylor Swift isn't a photograph. It is not a depiction of something that really happened. You can draw a picture of Taylor Swift wearing a swastika-emblazoned sombrero while she drowns a puppy. Also fictional.

Just like Devil Wears Prada is a fictional work that is nevertheless unmistakably about Anna Wintour.

This seems totally different because no one could mistake it for real footage of Anna Wintour, whereas the whole reason the AI-generated image controversy is a thing now is that there is no longer any (easy) way to tell if an image is fake or not.

So on this theory if these images had a visible watermark (or other signifier) saying “AI GENERATED” then all the controversy would be extinguished?

Not really, as watermarks don't mean all that much, and don't prevent a realistic-seeming image of a real person from being lodged in viewers' minds.

If it were a cartoon version of a nude, and therefore manifestly not real, there would be reduced controversy (though there'd still be some, especially if a large corporation assisted a boy in creating it).

I'm having a real hard time figuring out what your mental model is here.

Wouldn't TDWP also put a realistic-seeming image of a real person in the viewers' minds that is nevertheless fictional?

Putting images in people's heads is one aspect of the injury done. In the case of particularly life-like sexual images, it may make people look at someone differently, even if they don't want to. Deceiving people about whether someone actually posed for the pics is another aspect. Injuries compound.

I don't see the need to have a quickly describable mental model here, as there are overlapping questions of harm, consent, reputation and victimisation at play in this story and making all the relevant distinctions would require an essay.

To your question though, I do indeed think that "putting unfavorable images in people's heads" in fiction such as in The Devil Wears Prada (I haven't seen it) may be injurious. It may also be satire, or a truthful depiction (and these categories aren't mutually exclusive).

Whether that's bad luck for the target or deserving of punishment/damages depends on a host of details.


This seems totally different because no one could mistake it for real footage of Anna Wintour, whereas the whole reason the AI-generated image controversy is a thing now is that there is no longer any (easy) way to tell if an image is fake or not.

Yeah, perhaps our culture will get to a point where a nude image of someone will be assumed to be fake. In which case creating a fake nude image of someone won't particularly harm them and therefore won't be treated as a serious wrong.

At a minimum, cultural (and legal) norms are going to have to catch up with these technological advances.

At a minimum, cultural (and legal) norms are going to have to catch up with these technological advances.

Rape was a "worse than murder" crime because genetic testing, pregnancy testing and contraception did not exist. It still is, because norms did not update.

Rape was a "worse than murder" crime because genetic testing, pregnancy testing and contraception did not exist. It still is, because norms did not update.

I see your point, but I'm not sure I agree. Arguably the potential sentences for murder are, generally speaking, more severe. I agree that people often get more upset about rape, but part of this is because rape is perceived as a crime that men commit on women.

Here's a thought experiment: Suppose a man tortures and kills a random woman without raping her in any way. Would it have been perceived as worse if he had only raped her?

The last time I read about a case of real rape, it was a soldier raping a woman after killing her husband first. Feminists were all like: see how this again shows how misogynistic our society is. To my knowledge, when "socioeconomic factors" lead to a woman being murdered, few people are worried.

The very fact that the original post is about fake images of nudity, and not about fake images of murder, is telling.


Indeed. Paul Thomas Anderson's "The Master" was both not about the founder of Scientology and, yeah, obviously about him.

I don’t know about you, but I’m happy to call this an image of Henry VIII, even though it’s an artist-generated image and not a photograph.

Sure, an artist's rendition. And no one was beating artists up for unauthorized portraiture.

I feel like if that was an unflattering nude portrait of Henry VIII there would be issues.

But a flattering one would be ... fine?

People were most definitely beating up artists for portraiture of powerful individuals that displayed them in ways the powerful individuals didn't want to be seen. Same thing here, except that it's a minor girl.

If someone did this to my daughter I'd slap the shit out of them. I'd also be proud of my daughter for taking the matter into her own hands and beating the shit out of them.

Conversely, if my son did this I would hope he'd get his ass kicked by the girl, and if her father came looking for him I would say: I understand, bro, have at him, just don't send him to the hospital.

I think I'm expressing majority feelings about this. Surely we can form rules around this accordingly.

I think I'm expressing majority feelings about this.

I really hope you're not. "Don't beat up little children" shouldn't be controversial IMO.

Would you beat the snot out of a little girl for being nasty to your son, or is this treatment reserved purely for penis-havers?

Yeah, and it's OK that we recognize gendered solutions are appropriate. If a man in a social setting is being aggressively unpleasant, you will threaten to hit him and then follow up on it; if a woman is doing the same, you will call her an annoying bitch and say that everyone thinks she's fat and ugly - it will inflict roughly equivalent damage.

Firstly, beating somebody up, i.e. violently assaulting them, is not equivalent to calling them a bitch, and I cannot take seriously a frame that considers otherwise.

Secondly, re: gendered solutions, please see my response to @hydroacetylene. If you are willing to apply your 'gendered solutions' fairly, then fair enough. If, however, you advocate for maximum harshness against men while chickening out whenever it is time to apply your 'gendered solutions' to women, then from the recipient's POV that is ultimately indistinguishable from straight-up hatred of men and I'm not going along with it.

Your desire to reframe the issue in your favour suggests that I am more correct than you are comfortable with admitting ("beat up" becomes "violently assault"; "call a bitch" stands in for "verbally assault, harass and bully"). These are roughly equivalent because they likely cause temporary harm quickly and don't have much lasting effect; a quick "smarten up", if you will.

it will inflict roughly equivalent damage.

no

Definitely. If you scale physical and mental harm either linearly or on some sort of curve, these are within one rank of each other at the furthest (and I think roughly equivalent):

5: murdered

4: permanently maimed (legs amputated, blinded)

3: injury with long recovery (broken hip, ruptured tendon)

2: injury with short recovery (sprained wrist, strep throat)

1: painful injury with no loss of function (slapped very hard, wasp sting)

0: nothing

5: total pariah (exiled, stripped of all social status)

4: fired and shunned (Harvey Weinstein, Sam Bankman-Fried; permanently life- and livelihood-altering)

3: humiliating disruption (acrimonious divorce, demoted and forced to change careers, Star Wars Kid?)

2: mild in length or severity of mockery (that lawyer who couldn't turn off the cat effect on his Zoom trial, a local clergyman leaves his microphone on while he uses the toilet)

1: typical bullying (a schoolmate who calls you a fag every week, mean-spirited gossip)

0: nothing

Both of these are clearly in the 1-2 zone at worst. Obviously the world is very large and people have different goals, social realities, pain tolerances, etc. But I'm quite confident in my assessment. Within a week or two your bruises from your beating will heal, and everyone will forget about the party where you got called a fat slut and ran out crying.

Another way to put it: if two hundred people line up in a queue and each breaks one of your bones, you're probably dead; but if two hundred people line up in a queue and insult you, it's not nearly as bad; you can even be elected POTUS afterwards.

Very true, owing to the fact that some major bone breaks, like your hip and spine, have extremely high mortality rates. That's why they're up in rank 3. If you were humiliated or ostracized in a severe way like job loss, divorce, or family disruption 200 (or 40) times in a row, you would probably kill yourself or at least spiral into an unrecoverable depression. In any case, I don't think this scale works very well for "stacking debuffs", but 200 people giving you constant low-level antagonism (something that individually, or even a half dozen times, wouldn't faze me or you at all) is what drives most teenagers to kill themselves, I imagine.

Just because you can assign star ratings to them doesn't mean they will be equivalent. Many injuries, even minor ones, lead to a permanently worse level of functioning.

Those are numbers, not stars, but your objection doesn't disprove anything I said. It's nearly a non-sequitur.

Are you hardcore no-fap? Not that I think the dissemination and whatever other social stuff around this is outstanding, but obtaining nudes of cute girls in your high school has been considered a high accomplishment for high school boys for as long as high schools and photos have existed.

Like with all porn, the AI nudes are a little queer. It's much higher status and much better behavior to seduce her in real life, but some edits to an already-existing thirst-trap image are hardly something you should let your son be beaten over. I'd suggest the more appropriate action is that the girl is removed from any ability to be online, and the boy has to agree to supervised courtship if the girl's father so demands.

Are you hardcore no-fap? Not that I think the dissemination and whatever other social stuff around this is outstanding, but obtaining nudes of cute girls in your high school has been considered a high accomplishment for high school boys for as long as high schools and photos have existed.

I am not, nor have I ever been, no-fap. I jerked off a lot in my school days. If high school boys today can't fap without having a picture of their classmate's tits as a visual aid, the microplastics situation must be much worse than I thought.

Any high schooler who feels the need to jerk off has plenty of options. He can just fantasize about having sex with a fictional woman. Or a celebrity. While I would not recommend fantasizing about people in your personal life, thoughts are free.

If he wants visual aid, there is this thing called the 'internet'. From the softest softcore to all kinds of kinks and depravities, the net provides enough so that they can jerk off continuously to their favorite genre until they literally die of old age.

Or he can try to use his phone and flirt with a classmate and get her to send him some nudes, nothing wrong with that (unless he shares the photos with his buddies).

But then again, I do not really get the bro culture. As a straight guy, I have never had much inclination to talk about my sexuality with other men. Turning sex into a pure status competition between buddies is utterly alien to me. Sexting with some girl purely so you can betray her and gain status with other guys when you share her pics feels somewhat sociopathic, but technically is some achievement which might impress some people, just like a school shooter making double digits.

Even then, feeding the picture of your victim into an AI undresser is utterly pathetic. I honestly cannot even imagine what kind of people would be impressed by that.

That at least requires the boy to make some effort at seducing the girl into sending him nudes. This was an idiot using AI to paste her features onto a nude photo and handing it around claiming it was of her, so the other idiot boys could laugh at it and mock her. She's 13; she did not consent to any of this.

God's sake, we have enough posters on here criticising women for riding the cock carousel, sleeping with Chads, wanting to cuckold the betas who simp for them with alphas, and the rest of the poisonous discourse. Do you really think nude images of a 13 year old girl are anything but "bitch whore, even that young they're at it" material? Do you think the teenage boys are being respectful and admiring of empowered sexual equality here?

This is why porn is dangerous; it creates such expectations in society, then normalises them, so 13 year old boys are now faking nudes of 13 year old girls for sexual gratification. And not a stranger or any random 13 year old, someone they know. Someone who will get the reputation among her peers of being easy, a whore, all the negatives that the "coerce women to have sex with men they don't want" posters on here like to throw at women. Creating expectations that she'll agree to sexual activity of some kind with random boys, because hey I've already seen your nudes.

We already saw this shit in the Loudon County case of the girl sexually assaulted by the trans girl/boy in the school bathroom: oh, they used to sneak off to the bathroom and fool around before, so this time so what if she changed her mind and he tried to/did rape her? She agreed by being intimate with him in previous situations, so really it's all her fault.

And this child didn't even agree to any of this, and they are not real images of her, but she is going to get the blame all the same.

This is why porn is dangerous; it creates such expectations in society, then normalises them, so 13 year old boys are now faking nudes of 13 year old girls for sexual gratification. And not a stranger or any random 13 year old, someone they know. Someone who will get the reputation among her peers of being easy, a whore, all the negatives that the "coerce women to have sex with men they don't want" posters on here like to throw at women. Creating expectations that she'll agree to sexual activity of some kind with random boys, because hey I've already seen your nudes.

I agree! The problem is porn. The problem does not lie with 13 year old boys using the tools provided to them to get quality fapping in.

Yeah, if 13 year olds are routinely using porn, and this is their introduction to relationships between the sexes, and if normies are getting concerned about choking (they're picking this up from porn as the expected thing to do during sex and that women want it, and girls are picking it up that they should be doing this), then something has gone sideways in our society.

Even that 13 year olds are now going to have sexual lives. Yeah, puberty happens, yeah masturbation happens. But that "okay, now you're old enough to get an erection, this is your welcome to adulthood and sex, of course you're going to develop a 'fapping style'", is something that does make me uncomfortable and does mark me out as a dinosaur. We're at the nadir where simultaneously we must not hold 25 year olds responsible for choices since their brains are not fully developed, but we're marketing sex to 13 year olds because that's old and mature enough to jump right in to full sexual activity.

Did you read the story? It was not a thirst trap; it was AI-generated. She did not consent to this. If a girl sends nudes to a guy and he starts spreading them around, he's a jerk, but she made her own bed and has to lie in it. With AI you can make convincing edits of literally anyone. Any picture of your face can be swapped onto a fake body. Any full-body image of you can be nudified with realistic-seeming body proportions. Even if you've never been online a day in your life, a school photo in the yearbook, or even just a quick cellphone pic someone takes of you without your knowledge or consent, is enough, and there you go.

This isn't sluts getting slut-shamed: as far as we know this is a completely innocent and pure 13 year old girl who was victimized through absolutely no fault of her own. Her only crime was existing as a 13 year old girl, and the only thing she possibly could have done to avoid this is to live in a bubble where nobody can see her face, or disfigure herself so horribly that nobody would want AI nudes with her deformed face on them. We're talking beyond Islamic levels of repression, since even a burka would reveal enough of her face to enable this.

The story is that they took already-existing pictures and nudified them. I highly doubt they were pictures of her wearing a burka. The point is not that their behavior was good, just that it is expected of teenage boys. Allowing some girl's father to beat up your son because he did a thing that same father would have done to your sister, had the tech existed in the '90s, is an absolutely stupid reaction.

What this argument looks like is something like "we shouldn't punish kids for doing dumb cruel kid shit because when we were kids, we were doing dumb cruel kid shit too".

The story is that they took already-existing pictures and nudified them. I highly doubt they were pictures of her wearing a burka.

The pictures could have been literally anything. Are you not aware of how powerful AI has become? I use the Burka as an example not because I think she was wearing one, but because that's all that would be needed. Boy gets picture of girl's face in literally any photo that contains her face, boy gets nude picture of literally anyone else that has nude photos, boy tells AI "put this girls face on that body". Boom, a "nudified" photo. It's not authentic, it's not her real body, but the AI's good enough to make it realistic enough that the boys can look at it and giggle and masturbate to it and tease her about it, and horrify her or anyone who cares about her if they saw it.

Or skip the second photo, just tell the AI "nudify this picture" and it uses the body contour lines and imagines a naked body of an underage girl with approximately the same size and pose embedded into the photo. Again, literally any photo containing her face, and I guess enough of a body (in any amount of clothing) that there's a spot to put the imaginary naked body.

just that it is expected of teenage boys.

I expect teenage boys to not care about the "authenticity" of a 90% exposed bikini-clad body in order for their nudification to count. I expect teenage boys to not need a thirst trap of a girl in order to get horny enough to want to see her nude. I expect teenage boys to just get horny. I expect teenage boys to see literally any girl and want to see her naked, even if she's ugly, if only out of curiosity. For all we know they could have gone "uggh, Jenny is such a prude, she never shows any cleavage and she always looks scornfully at us whenever we talk about girl's tits. We should make a nude picture out of her, wouldn't that be funny? Hah, and then make fun of her about it, she'd get so mad. Trololol."

I expect teenage boys to get horny, and to learn to control it and not victimize other people in the process. If one boy had, on his own, in private, asked an AI to generate a nude of her and then he masturbated to it and never told anyone, I would have no problem with that. Yes, it probably counts as child pornography in a technical and/or legal sense, but if she wasn't actually involved in the creation of it and she never finds out about it then nobody would be harmed by its existence. Heck, if every single boy at her school entirely on their own initiative had AI generate a nude of her and masturbated to it in private and never told anyone about it and nobody ever found out, this would still be fine.

The problem is the social dynamic, the sharing, the teasing, the humiliation. Shame is a valuable tool that society can wield in order to disincentivize anti-social behaviors that the law either can't or shouldn't get involved in. Slut shaming is a valuable tool that society used to use in order to disincentivize slutty behavior. This breaks down when it isn't being wielded against sluts, but against anyone for any reason. This is the same as the wife that gets mad at her husband because she had a dream that he cheated on her. It's a dream, it's not real.

I don't think we should send people to prison for AI-faked videos of them stealing, I don't think we should scorn people for AI-faked videos of them saying horrible things, and I don't think we should shame people for AI-faked nudes of them. And because the human brain is wired in certain ways I think that sharing faked nudes of someone is inevitably going to lead to shame and humiliation of the same type as sharing real ones, even if not quite at the same level of magnitude. And that's wrong to inflict on someone who hasn't earned it.

On the contrary, when someone does do something wrong, punishments help correct their behavior so they don't do it again. Someone who incorrectly inflicts this humiliation on an innocent person, needs to be punished in order to correct their behavior. A healthy young man should enjoy his sexuality in a way that doesn't victimize people. Beating the crap out of these boys would help them learn that lesson and become better men.

This here is the big, giant, difference between male and female attitudes to sex. Guys are all "what's the big deal over some nudes, hey if someone was showing hot nudes of me around I wouldn't care, it's a compliment" and so on. There just is not the same feeling of violation. I see the same thing with regard to stories of boys being sexually abused by older women: "what's the big deal? hey if some hot teacher wanted to fuck me when I was 14 I would have been delighted! it's free sex! this is great!"

And this is part of the entire problem around 'nobody is dating/having sex/getting married'. Why don't women date men? Well, because they don't want to date the 100s of guys who message them. But why don't they? Because women can do without sex more than men, seemingly, can do without it. When women do engage in sex with the same lack of emotional attachment as men, then they get criticised for being whores. Men want sex because of higher libido, and will be happy to just fuck anyone reasonably acceptable, and so can't understand why women would be upset about such images. "But it's not really you, so what does it matter?"

It matters because you'll be the same guy posting on here about women needing to be controlled by their fathers and husbands, because giving women free exercise of choice in romance and sex means they just scatter nudes of themselves around to any stranger who wants them, look at this case of a 13 year old girl sending nudes to her boyfriend without any self-respect or modesty, she'll be sleeping around if not stopped in time.

Are they normally the same people? It seems to be in this case, but that's what's confusing me. Because normally I think it's the "hypocrisy" of different people who believe similar things for opposite reasons.

The liberal, sex-positive man says "no big deal, free sex is great, everyone likes sluts". The conservative, sex-negative man says "women shouldn't sleep around, nobody likes a slut". In most cases of a woman fucking around and finding out, both say "stop complaining, you did this to yourself." If you just read a bunch of comments by people criticizing slutty women you might think they're massive hypocrites, but if half the people believe one thing and half believe the other, and they're literally different people, then each one can have an entirely consistent internal worldview and just present a united front on this one particular issue. This happens all the time on different issues. I constantly see people who have superficially similar external opinions to me with stupid garbage reasoning underlying them.

But it seems to me like in the AI case that shouldn't happen, and the various sides should strongly disagree, because the woman didn't do this to herself so the conservative wouldn't assign any blame. Any assertions that this could be fixed by a man (father or husband) controlling her and her deferring to his authority make absolutely no sense because that wouldn't have stopped this either. If you think women sharing nudes is bad and want to disincentivize the behavior then it makes no sense to punish someone who didn't engage in this behavior with the same treatment as someone who did. That's not how incentives work.

It's exactly the sort of thoughtless pattern matching I would expect to see from a normie who just parrots party lines, not here on the Motte.

It's exactly the sort of thoughtless pattern matching I would expect to see from a normie who just parrots party lines, not here on the Motte.

I will press that compliment to my bosom and shed tears of joy upon it. I'm a normie! For the first time in my life, I'm one of the vast mass of ordinary people with ordinary opinions!

I mean, I wouldn't be happy if nude pictures of me were being widely disseminated.

No more would I, but apparently (1) if there's no legal harm, what harm? and (2) you only say that because you're a woman and you want women to be treated as Wonderful and deprive men and boys of their natural right to masturbate over pictures of naked women.

At least, that's what I'm being scolded for on here, I think?

The recipients wouldn't be happy if nude pictures of me were being widely disseminated. Ugly privilege!

I have indeed seen photos where my reaction is "put more clothes on!"

If someone did this to my daughter I'd slap the shit out of them.

Uh huh. Although it is possible you are so badass as to slap some teenage boy—in recent years, men who are likely wealthier and more powerful than you have suffered larger offenses against their daughters and did fuck all.

One example would be a GS executive who had a daughter (Mayli Wang) deep tonguing ass and getting Facial Abused for but three figures a pop. He spent his last few years buying rights to her content to try and disappear it all, but then he died. One might argue that by dying early, he got away easy compared to someone like Laurence Fishburne.

It really isn't that hard to find news stories about dads beating up their daughter's boyfriends for perceived transgressions.

That doesn't tell you much about how often dads do nothing about perceived transgressions against their daughters. Plus, dads who do such beatings tend to be of quite different demographics and socioeconomic backgrounds than the modal Mottizen. The default for middle class white fathers or higher is not to exact direct retribution, but to only do so indirectly through The Authorities. Hence my pressing [X] to doubt that a given Mottizen would exact direct retribution upon a perceived transgression against his daughter.

Not only that, one can find examples where a father explicitly denounces retaliation or vindictiveness in the aftermath of his daughter's killing:

"Frustration, maybe, and sadness for sure, but no anger and no retaliation or vindictiveness or anything like that. We’re not that kind of people."

“I think Kate’s gone,” Jim said. “We’re not going there. We don’t have hate. Of course we’re deep in our faith and we go to church. We believe Kate’s in a better place, and we think about her on that level. But no, we have no vindictiveness.”

Sometimes such a father will even shake hands with and campaign for the release of his daughter's murderers. Or when his daughter is raped and murdered by a refugee, request that funeral-goers donate toward helping refugees.

Uh huh. Although it is possible you are so badass as to slap some teenage boy

It has nothing to do with being a badass. I have no choice in the matter. It's just something I would have to do.

Anyway, I'm not surprised to hear some people don't think family comes first, but I still think the impulse to excuse someone defending their daughter's honor is fairly universal.

Yeah, it's a pretty agreeable resolution to use short, sharp corporal punishment as a deterrent for antisocial behavior.

Good luck getting anyone to agree about what to do with your daughter if she were to start sending actual nudes around the school to a bunch of guys, though.

Bravado tends to fail, there.

I think turning G-rated photos into AI porn is so completely different from voluntary actual nude pics that I'm surprised so many are mentioning the latter in their replies.

It's a beautiful, direct example of my point about the complete contrast in the social response to males acting out inappropriately and females acting out inappropriately.

Male antisocial behavior: Beat them up, lock them up, keep them heavily policed.

Female antisocial behavior goes unremarked, often celebrated.

Multiply this out over dozens, hundreds of instances, over their entire childhood and young adulthood, and it explains quite a bit of what we're seeing now.

Do you have any thoughts on how you would regulate your daughter's behavior if she's acting in ways that harm her and others?

Any at all? Or is it just far easier to threaten other males with consequences whilst admitting you lack the ability to threaten her?

Do you have any thoughts on how you would regulate your daughter's behavior if she's acting in ways that harm her and others?

Yes. Rely on the subtle nods of disapproval from her family, friends and (future) colleagues, which she is hypersensitive to, to cause the extreme inner distress that regulates her behavior.

You know, the kind of pressure that a teenage boy is almost completely oblivious to.

Destroy her smartphones for starters. Not punch her, but use appropriate punishments.

Funny enough I believe that would sort of be a start.

But if she decides she wants to physically leave and, e.g. shack up with the first dude who will take her in, or convince someone to purchase her a new phone (maybe in exchange for favors, maybe not), or (most likely) just relegate her activities to solely when she's in school, what are your real options?

Is there ever a point where physically restraining/detaining her is appropriate?

I can only assume people are wildly misunderstanding what actually happened, though it seems to me the OP explains it quite clearly. There seems to be a lot of projecting of people's vaguely similar hobby horses going on.

Good luck getting anyone to agree about what to do with your daughter if she were to start sending actual nudes around the school to a bunch of guys, though.

The Western agreement would be to track down and punish the guys if they did anything other than preemptively deleting the nudes before they hit their phones.

Western men are generally unable or unwilling to thot-patrol their daughters, but sometimes they’ll try to get the state to ruin the lives of random guys who neglected to help maintain their daughters’ Wonderfulness.

Thanks, Sloot, for defending the rights of paedophiles everywhere to mock up nudes of underage girls because hey, it's the bitch's fault for letting herself be photographed in the first place, if she knew her place and didn't let her face be photographed then it would never have happened.

The girl took action to defend her reputation and got punished for it. How is that an example of Wonderfulness?


Doesn't look like I talk about pedophiles (or even ephebophiles) in the comment you were responding to or elsewhere in the thread. My comment quotes what it was responding to, which itself in turn doesn't depend on the event in the OP.

It does, however, look like you were throwing in discussion of pedophiles as an attempt to poison the well, and used "Wonderfulness" as a self-prompt to launch into a mini-rant.

To the extent it makes snide middle-aged internet women seethe, perhaps going forward I can find it in my heart to be a better ally for the freedom and protections of marginalized and vulnerable communities such as cis-het Minor Attracted Persons. You're welcome!

Men who find they need to use terms like Wonderfulness strike me as failures. It's licking their wounds. It's shifting blame for their inability to navigate adult life onto handy targets: "Boo! Wonderfulness! Women are all bitches!"

That you even think "middle-aged" is a put-down shows your immaturity.

(We can swap insults all day if you want and the mods will tolerate it, or we can try to speak without using brain-dead Internet swipes. Your choice).

We can swap insults all day if you want and the mods will tolerate it

No, we won't. You and @Sloot both need to knock it off.

That part was hyperbole since yeah, you guys will step in and send this exact message.

Always amusing to see online women reinforce the trope of women resorting to personal attacks against a man’s status when he expresses an opinion she doesn’t like.

“Middle-aged” is not more of an insult there than “women” (or “internet” for that matter), but I see that description appears to have established residency in your head free of rent. The word I was actually most directionally on-the-fence about cutting was “snide,” but after half a thought I decided it indeed was applicable given your behavior in this thread and in general.

If I wished to try to insult you, off the top of my head there could have been the low-hanging fruit of deploying impolite synonyms for "poor" and "overweight"—or, more softly, those words themselves—for poor and overweight are descriptions you've recounted yourself (perhaps using those exact words, so they may not actually be insults at all). Even to a habitual wrong-thinker like me, it'd feel kind of like punching down just by mentioning them without using them as insults.

You have also expressed dissatisfaction with your career pathway (or lack thereof) and life outcomes in general.

(We can swap insults all day if you want and the mods will tolerate it, or we can try to speak without using brain-dead Internet swipes. Your choice).

I’m aware that you have time all day for riling yourself up and insulting other forum users, as well as rage-quitting and starting a new account when things don’t go your way.

Just your current account’s posting history is a Top Right Messi when it comes to number of comments on the x-axis and frequency of tantrums as a percentage of comments on the y-axis.

So yeah, when forum poster A tries to malign forum poster B for things such as “failures,” “immaturity”, “shifting blame,” “inability to navigate adult life,” it comes across as a massive case of projection and lack of self-awareness when forum poster A is older; poor; overweight; prone to histrionic, antagonistic outbursts, etc. when she sees a comment she doesn’t like.

If I wished to try to insult you, off the top of my head there could have been the low-hanging fruit of deploying impolite synonyms for "poor" and "overweight"—or, more softly, those words themselves—for poor and overweight are descriptions you've recounted yourself

Just beautiful apophasis. You absolutely cooked em here. HereAndGone is not recovering from this any time soon. I’m in awe of your mastery Sloot

Why mention middle age, since you don't know my exact age? The number of times men assume "gotta be an old ugly hag, since she's complaining about us liking hot young women" is laughable, and you did that exact thing there.

I am old enough to be an old ugly hag! Go me, I have survived this long! Whether I'm sixteen or sixty-six has no bearing on 'is what I am saying true/sensible/reasonable/not flat-out crazy, at least', and reaching straight away for the "middle-aged women" bit is a weakness on your part.

Should I assume you are fifteen, by your writing? I'm going to do you the courtesy of imagining you are a grown adult, without assumptions about your precise chronological state.

How sweet of you:

forum poster A is older; poor; overweight; prone to histrionic, antagonistic outbursts, etc. when she sees a comment she doesn’t like.

The mens, they always gots to reach for "DE HYSTERIA" when they can't back up their fronting.


This comic will always be with us. It amuses me that the comment section is talking about Net Neutrality. Remember that massive culture-war issue and how it completely disappeared?

This is one of those losing issues where only weirdos think it should not be illegal. That's the cross us libertarians bear. Though, unlike price gouging and insider trading, which I think are good things, this does fall under "immoral" and should come with severe social ramifications. As with many things, if it is going to be illegal, the law should not pertain to the tool but to the person who misuses the tool. But hey, I guess we just make it illegal to get some things done while your car warms up.

This comic will always be with us.

Thanks for the laugh, first time I've seen that one!

I am interested in hearing your views on insider trading.

Reason article

Insider trading is a victimless crime. Markets run on asymmetrical information. Stock prices bounce around because investors are always doing their best to use their own superior information for personal gain. So-called insider information is just one kind of asymmetry, and not a particularly insidious one.

What's more, insider trading tends to make markets more efficient. Here's the Washington Post last year, taking a page from George Mason economist Henry Manne's book:

Markets work best when goods are priced accurately, which in the context of stocks means that firms' stock prices should accurately reflect their strengths and weaknesses. If a firm is involved in a giant Enron-style scam, the price should be correspondingly lower. But, of course, until the Enron fiasco was unearthed, its stock price decidedly did not reflect that it was cooking the books. That wouldn't have happened if insider trading had been legal. The many Enron insiders who knew what was going on would have sold their shares, the price would have corrected itself and disaster might have been averted.

I agree in theory, but the issue is not with information asymmetries, really - that's just how it's sold as 'fairness' to the public. Insider trading is bad because it radically distorts incentives for insiders and encourages all kinds of violations of fiduciary duty.

It's also bad because markets require liquidity, and people (rationally) don't want to trade in markets in which they know they are at a significant information disadvantage.

Allowing insider trading will reduce liquidity profoundly.
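The textbook way to make that concrete is a Glosten-Milgrom-style market maker; this is a stylized sketch of the standard model, not anything from the article above. Suppose the asset is worth $V \in \{0,1\}$ with equal probability, a fraction $\mu$ of the order flow comes from insiders who know $V$, and the rest buys or sells at random. A competitive, zero-profit market maker quotes

$$\mathrm{ask} = \Pr[V = 1 \mid \text{buy}] = \frac{1+\mu}{2}, \qquad \mathrm{bid} = \Pr[V = 1 \mid \text{sell}] = \frac{1-\mu}{2},$$

so the bid-ask spread is $\mathrm{ask} - \mathrm{bid} = \mu$. The spread widens one-for-one with the insider share, and since the uninformed traders are the ones paying it, they rationally exit as $\mu$ grows: that's the liquidity drain in one line.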

Prediction markets are going to be a very interesting space to watch for this. There, insider trading has the best argument, as it helps the market find the truth. But will people continue to participate in prediction markets if they know there are good odds they're going against sophisticated insiders? Time will tell.

How is liquidity more important than accuracy in long term market outcomes?

If there's no liquidity there's no market, regardless of accuracy

Remember that massive culture-war issue and how it completely disappeared?

In hindsight, the many and various traffic patterns on the Internet make strictly-defined "network neutrality" difficult from a technical perspective, at best: I want my VOIP traffic to get minimal latency, but I want real-time fixed bandwidth for video streaming (or is that variable bitrate these days?), and sometimes I'm downloading, um, Debian ISOs and just want them eventually. Add in LTE bandwidth and I really want my 911 call to take precedence over dozens of zoomers on TikTok. Over-provisioning to avoid real QoS questions is darn expensive.
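For what it's worth, the mechanics of asking for priority are mundane; the hard part is deciding who honors the request. One standard mechanism is for an application to mark its own packets with a DSCP class that each hop may (or may not) use for queueing decisions. A minimal sketch in Python, assuming a Linux-ish socket stack; the address, port, and payload are placeholders:

```python
import socket

# DSCP "Expedited Forwarding" (EF = 46) is the class conventionally
# requested for latency-sensitive traffic like VOIP. The legacy IP_TOS
# byte carries the DSCP value in its upper six bits, hence the shift.
DSCP_EF = 46 << 2

# A UDP socket for hypothetical VOIP media packets.
voip_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
voip_sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF)

# Bulk transfers (those Debian ISOs) can mark themselves as low
# priority instead; CS1 (= 8) is the conventional scavenger class.
bulk_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
bulk_sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, 8 << 2)

# 192.0.2.10 is a documentation address standing in for the far end
# of the call. Every network along the path can honor, ignore, or
# rewrite these marks, which is exactly the policy question.
voip_sock.sendto(b"rtp-ish payload", ("192.0.2.10", 5004))
```

And that's the rub for strict neutrality: a rule has to decide whether honoring (or rewriting) marks like these counts as reasonable network management or as discrimination, and the sensible answer differs for VOIP, streaming, and bulk downloads.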

At the time the concern was that (landline!) ISPs (often cable companies) would start charging for data usage, pricing out competitor then-new streaming services, or worse, start charging for access to sites and services. Despite the official repeal of the neutrality policy, no landline services seem to have started charging for specific site access --- and the obvious sites to charge extra for now (Netflix, Amazon, Disney+) have more money and lawyers than the ISPs ever did. Metered landline bandwidth never was accepted by consumers, although I'm not sure if it's actually prohibited specifically and it's a bit outside of the scope of "neutrality". Honestly, the worst offence against the former policy I recall seeing was a mobile provider teaming with a streaming service to not count against your bandwidth limit.

IMO, the culture war fight was maybe a bit overblown, but if it did anything, it provided enough cover for Big Streaming to take off. Is that the win the kids wanted? It is what they were asking for, I guess. Go ask them how they feel about streaming slop now, and maybe opinions would be different.

But it is an interesting observation that active CW fronts just sometimes up and fade away.

At the time the concern was that (landline!) ISPs (often cable companies) would start charging for data usage, pricing out competitor then-new streaming services, or worse, start charging for access to sites and services

Netflix had to build fast.com precisely because these legacy cable companies were screwing with Netflix customers' bandwidth.

Isn't it because ISPs' plans offered unlimited traffic and effectively assumed that low-data-usage users would subsidize high-data-usage users, and Netflix increased the number of the latter?

It's in the school's incentive to deny, deflect, and cover up anything bad that might be happening in the school. On the other hand, it's in the police's incentive to hunt people down and put them behind bars, whether justified or not. We know that schools often cover up straight-up rapes and beatings and pretend nothing is wrong.

As to whether or not AI nudes should be illegal, I'm not against classifying deepfakes of real underage people as illegal.

Grok allegedly generated pornographic images of a couple minor girls.

"any content that sexualizes or exploits a child for the viewer’s benefit."

Of course, as an AI company owner I would want to avoid generating this kind of content, but this seems quite vague. An image of a kid playing can be normal, but it can also be CSAM if the photographer was thinking sexual thoughts? Imo, CSAM should only cover content where actual abuse happened, or actual pornographic content.

From what I'm aware of the Grok story, Grok put them in underwear and swimwear, without including any directly explicit content.

I tried manipulating an image of my butt crack earlier today, to imagine someone climbing down it. In any case, Grok wouldn't comply. It's not some wild west of content generation (though it doesn't restrict manipulation of celebs, as long as it's SFW).

People need to get more mature about images I think.

Imagine if, at the dawn of the internet, there was a big shock at all the dodgy information sources, conspiracies, cults and so on that emerged. 'We invented freedom of speech in an era of printing presses, not high-capacity assault routers!' someone might say. And it's true, there's a difference due to the speed and nature of the connection. We are bombarded with information, it can be quite overwhelming and mindbreak the weak-willed. Ziz cultists, Extinction Rebellion, retarded tiktok trends... The internet seems to have catalyzed many bad things in ways that aren't easy to counteract without squashing the whole thing.

But the answer isn't to shut down the internet, the answer is to strengthen our mental integrity, raise our willpower stat.

What is the alternate answer here? Restrict Grok from putting people in a bikini, ahegao face, milk sprayed on them? Restrict Grok, they'll just go back to civitai where this stuff has been going on for years. Men clearly desire lewd images of women.

How are you supposed to restrict this? If it's libel, then what about the time-honoured tradition of spreading false rumours about people, is that banned too? Do we all line up and go to the nanny state about how we were wrongly smeared as whiny, dumb, small-penised, ugly bitches who did something unspeakable at a party? Do we all line up in front of some ruinously slow legal system and give lawyers money to defend our reputations (they can't defend your reputation even if you win in court)? Do we have AIs surveilling every private groupchat to defend the honour of maidens? A gigantic Chinese-style state surveillance apparatus to uphold the wholesomeness of the entire internet?

The best solution is for men and women to act in a more dignified and honourable way and not do any of this in the first place. That clearly isn't going to happen after decades and decades of subverting and violating just about all of the old taboos. What are taboos and censorship for if not enforcing a standard of behaviour?

Men still have the responsibility of dying in a trench for their country (now with their drone-killers filming their deaths for war propaganda); women will need to accept some downsides in a technological environment that's freed them from a lot of their unpleasant work. Picking and choosing to preserve just the taboos that overwhelmingly benefit women over men isn't a sustainable pattern in the long term.

Picking and choosing to preserve just the taboos that overwhelmingly benefit women over men isn't a sustainable pattern in the long term.

Modern U.S. society will head straight for its grave rather than break from that pattern.

What is the alternate answer here? Restrict Grok from putting people in a bikini, ahegao face, milk sprayed on them? Restrict Grok, they'll just go back to civitai where this stuff has been going on for years. Men clearly desire lewd images of women.

How are you supposed to restrict this? If it's libel, then what about the time-honoured tradition of spreading false rumours about people, is that banned too? Do we all line up and go to the nanny state about how we were wrongly smeared as whiny, dumb, small-penised, ugly bitches who did something unspeakable at a party? Do we all line up in front of some ruinously slow legal system and give lawyers money to defend our reputations (they can't defend your reputation even if you win in court)? Do we have AIs surveilling every private groupchat to defend the honour of maidens? A gigantic Chinese-style state surveillance apparatus to uphold the wholesomeness of the entire internet?

A given chick certainly makes it easy for Grok when, chances are, she has already uploaded or sent lots of skimpy photos of herself that a guy can use in the generative process to make bikini pics or porn of her. Especially when Grok's training data already contains a vast sea of other chicks who have uploaded nudes, bikini pics and porn of themselves. Minimal edits are needed to render a vaguely plausible photo, or even a quite realistic one.

The usual answer: We can expand the protections afforded females by limiting the freedom and protections afforded males. We can easily find some gerrymandered reason why Grok should be neutered, why things like "revenge porn" and making AI porn should be illegal (with the de facto burden of proof being on the defendant), while things like smearing someone as a "whiny, dumb, small-penised, ugly bitch[]" fall under free speech. If a man objects, it would only serve as further proof that he is, in fact, a whiny, dumb, small-penised, ugly bitch.

A woman who calls a man a rapist in an "Are We Dating the Same Man?" social media group is speaking her Emotional Truth and Lived Experience, since they at least once had sex after she'd had a few drinks. A man who uploads a nude of her to "Are We Dating the Same Woman?" may face prison time and/or a hefty monetary judgment, if such a group hadn't already been yeeted.

There's just no way around this. I have an AI image gen model on my computer right now; anyone with a current-gen MacBook could inpaint any image into pornography. It's not the kind of thing you can realistically ban. As a society we're just going to have to find a way to deal with this, the way we deal with the fact that anyone at any time could have drawn these same images if they wanted to badly enough. The genie is thoroughly out of the bottle and no amount of outrage will ever put it back.

It's not the kind of thing you can realistically ban.

I think this mistakes different types of bans/controls and their different purposes.

One way a ban/control may operate is to try to pre-emptively prevent certain events from occurring. When folks try to control, say, ammonium nitrate following the Oklahoma City bombing, they're often trying to prevent someone from acquiring some of the tools used to create a large bomb, ultimately in the hopes of preventing said hypothetical bomb from being used to kill people and destroy stuff. Whether or not this is practical is not the point here; the point is that this is the point of the effort. Similarly for controls on nuclear material.

Importation controls are somewhat similar in that they may be trying to prevent an event from occurring at all. The funny example I go to sometimes is the ban on Chinese drywall. The intent was to prevent it from even getting into the country, pre-emptively preventing whatever harms it may (or may not) later produce. Or see, for example, the discussion below about possible controls on UAS; I read that conversation to be primarily pondering whether controls can be put in place which pre-emptively prevent a significant number of events, to what extent such controls will be effective or not effective (how hard is it for folks to still "roll their own"?), etc.

Many other bans/controls are post-hoc controls, assigning liability/culpability after a sufficient number of steps have been taken toward an event or after the event has occurred. These are different in type. Probably the majority of controls are like this. I might even say that part of the reason why so many controls are like this is because it is not reasonable to control the inputs that are used to lead up to an event. This may be in part due to "dual use" considerations or other factors.

For a silly example, rope can be used to tie someone up when kidnapping them. Well, basically no one thinks it's reasonable to put heavy controls on possessing rope. But basically no one thinks that kidnapping is "not the kind of thing you can realistically ban", either. That people have widespread access to the tool used is sort of neither here nor there when considering post-hoc controls on the use of those tools for specific events.

What I find strange is that I've really only seen this come up for digital tools. There's this weird perspective that if someone uses specifically a digital tool that is "out there" and accessible, that the "genie is out of the bottle", then it's simply unrealistic to use any sort of law to restrict any type of use of these digital tools that one might perform. That still seems wild to me. Rope is a technology that is "out there". "The genie is out of the bottle." Even the Primitive Technology guy makes his own! I... sorrrrta think that we can still ban kidnapping.

[EDIT: I forgot to add what I had wanted to say about the UAS conversation. Suppose, after consideration, it seems infeasible to use a Type I control to prevent things like killing people with UAS. Can't even manage to stop someone from flying into, say, a crowd at an open sports stadium. I don't see any reason why someone couldn't want a Type II control, still making it illegal to fly a UAS into a stadium or to kill people with a UAS. Sure, maybe you can't prevent it, but to the extent that you have the investigative tools to prove in a court of law who is culpable for doing it, you can still prosecute them.]

Of course, once we're in a Type II ban world instead of a Type I ban world, then there is some amount of "we have to get used to the fact that this type of event will actually happen significantly more often than events that we can control with Type I bans". Frequencies and percentages will depend heavily on specifics. And maybe that's the sentiment you're going for. Sure, we're not going to be able to meaningfully pre-emptively prevent fake AI nudes from being generated, just like we can't really pre-emptively prevent rope-enabled kidnappings. But folks may still want to try a Type II control. The extent to which even a Type II control can be considered effective certainly depends extremely heavily on specifics, including an analysis of post-hoc investigation techniques, surrounding legal frameworks, resource considerations, and even the oft-debated deterrence theory of government sanctions.

Of course, once we're in a Type II ban world instead of a Type I ban world, then there is some amount of "we have to get used to the fact that this type of event will actually happen significantly more often than events that we can control with Type I bans". Frequencies and percentages will depend heavily on specifics. And maybe that's the sentiment you're going for. Sure, we're not going to be able to meaningfully pre-emptively prevent fake AI nudes from being generated, just like we can't really pre-emptively prevent rope-enabled kidnappings. But folks may still want to try a Type II control.

Sure, and we can discuss Type II controls. But we're going to very quickly get into the "what are we even doing here?" realm when anyone who wants to put together a piddly little indie game that uses player-controlled image gen is going to need to spend time implementing some easily circumvented controls to prevent some class of images from being generated. And it's not just the deep fakes; we're going to have to get used to every image or video on the internet that doesn't have verifiable provenance being suspect. A lot of people seem to think this future is avoidable, and it just isn't. People are going to be able to make deep fakes of people as easily as they can imagine them nude. We should try to teach young men not to do this, or at least not to do so in a public manner, just as we mostly manage to teach them not to describe to other people what they imagine one of their classmates would look like nude. But this is fundamentally a social problem that people are trying to solve with ill-fitting legal action. Do you think a kid should get expelled because he imagined what a classmate looked like nude? If a kid drew a picture of his classmate in the buff, should he be punished?

when anyone who wants to put together a piddly little indie game that uses player-controlled image gen is going to need to spend time implementing some easily circumvented controls to prevent some class of images from being generated

Sorry, h-what? This is truly out of left field.

And it's not just the deep fakes; we're going to have to get used to every image or video on the internet that doesn't have verifiable provenance being suspect.

Yeah, sure, agreed. Not sure the relevance.

Do you think a kid should get expelled because he imagined what a classmate looked like nude?

I cannot possibly think of how this is remotely responsive to my comment. The answer is obviously no, but the mind is boggled.

The funny example I go to sometimes is the ban on Chinese drywall.

This ban was because we imported a lot of shitty Chinese drywall that later outgassed sulfur compounds. It wasn't pre-emptive, it was punitive.

This is different from the UAS ban for several reasons, including:

  1. UAS that do bad stuff on their own or at the surreptitious direction of their foreign manufacturer are largely only theoretical. DJI has been accused of uploading flight logs during an update, but that's it.

  2. It applies to components, too, including components such as motors and batteries that could not be compromised to do the bad stuff theorized.

The reason for the UAS import ban is to prevent Americans from doing bad things with a UAS on purpose, not for any damage done by the manufacturer or manufacturer's country.

shitty Chinese drywall that later outgassed sulfur compounds

For the purposes of my comment, it is this temporal relationship that matters. Sure, the other temporal relationship between folks realizing this temporal relationship and choosing to ban it is fine. But this one is the one that holds the conceptual link.

I'm certainly not going to defend the UAS/component ban, either, but that's not the point here. The point is that even if we assume that all of that is dumb and doesn't make sense as a Type I ban, we can still make it illegal to use a UAS to kill someone or even just make it illegal to fly a UAS into a stadium or something, and this type of ban will have particular qualities tied to the specifics.

Well, if GPU and RAM prices are any indication, we might get some de facto restrictions, in that very few can afford a rig powerful enough to actually produce the images.

I was generating porn locally with Stable Diffusion XL running on an $800 gaming laptop with an RTX 4050 and 6 GB of VRAM two years ago. Most of what I made was hentai, but it would have been trivial to train a LoRA on a couple dozen SFW photos of a particular girl, then make porn of her on demand.

I get the feeling people here vastly overestimate the hardware required for generating random NSFW images, because so much of the discussion is about LLMs, which do require an order or two of magnitude more hardware. If you don't care much about prompt understanding, concept flexibility or accuracy of poses and such, even "ancient" (i.e. SDXL) models are more than capable of doing the job on piddly half-decade-old computers that can be bought for $300 second hand.

Even if you really care about prompt adherence, there are realistic Pony finetunes you can use to get a model that can understand *booru tags.

Why would that be the case when a seven-year-old laptop is already powerful enough to do it? You don't need fancy new hardware when existing, far-from-top-of-the-line hardware will do fine.

This is true, but unless the intention is to keep salvaging old hardware as the various components die, we're still ending up in the same place.

(My actual guess is that capacity WILL expand to meet demand, so this is probably a shortish term crunch)

I'm not talking about old high-end GPUs but the middle / low-middle end that's now eclipsed even by integrated GPUs. When you equalize for processing power, GPUs are still way cheaper than when hardware capable of image generation first became common (which was several years before the software was invented). You really don't need a 32 GB 5090 just to do some basic NSFW generation / inpainting.

You could make it pretty broadly inaccessible: ban all open-weight models; require any image generation to have strict safeguards and reporting of attempts to authorities; enforce severe criminal penalties. Your existing model would be pretty much untouchable, but it couldn't easily be shared, and a decade from now most copies of it would have been lost to end users. You could even require manufacturers to include firmware on new hardware that bails on unapproved workloads, but that seems like it'd be overkill.

Not saying that this is what I'd like, but it seems doable.

ban all open-weight models

This seems harder than it sounds. Some of the best models aren't published by the West (DeepSeek is probably the best open text model at the moment, I hear [1]), so you'd need global agreement to start cracking down. And the small models aren't that big: Hollywood wasn't able to keep rips off of torrent sites a decade back, and from what I hear they're still around, and international VPNs are pretty ubiquitous too. Short of constructing your own Great Firewall, this isn't really feasible (and even then, it's just a matter of practicality, from what I hear).

  1. Funny note: a while back I was talking to a friend at Unnamed Defense Co (TM) who was excited about their new entirely in-house AI service for engineers. When asked about which models they were running, "DeepSeek" was one of the sheepish responses, admittedly next to GPT-OSS.

The goal wouldn't be to make it so literally no one in the USA could run an open weights model; it would be to add friction points to make it more trouble than it's worth, except for the most dedicated people. You wouldn't need any kind of global agreement, just a national focus and working with large tech companies to limit it. DNS blocks, removing them from Google search results, etc. A relatively small amount of effort can prevent the bulk of casual users from having access to them.

That's just if you get the domestic consensus to look at open-weight models as something comparable to copyright violation. If instead the public started seeing them the same as CSAM, you could go a whole lot further: still theoretically accessible, but very rare.

DNS blocks, removing them from Google search results, etc.

Should work as well as anti-piracy controls.

It's not even slightly doable, even theoretically. The theoretical knowledge of how these models work is broadly available. Further, not only are adversarial countries going to completely ignore your desire for model control, they are also currently the ones who produce most of our hardware, including FPGAs and GPUs. Also, you can't include firmware in new hardware that can survive contact with the consumer: NVFlash chips are easily desoldered, dumped, and re-programmed, and firmware mods and flashing tools are easily accessible.

How to make CSAM is widely known, and plenty of places don't cooperate usefully with the USA in stopping it. Despite that, the USA does manage to broadly limit how much it proliferates.

I'm not saying that it's a good idea, and I'm not saying that open weight models could be completely eliminated. I am saying that they could be quite effectively suppressed, as there are plenty of tools that the government can use to enforce a ban, imperfectly but substantially.

The government can't even stop people from plugging yandex.ru into their browsers and gaining instant access to any movie they wish to consume in seconds. Same for LLMs: Z.AI's and various other Chinese companies' models will discuss at length any particular topic that the western LLM makers consider taboo and try to make their models gaslight the consumer about.

Frankly, I don't think the west is going to be able to do anything about this in a meaningfully effective way. The only thing they'll achieve is some sort of government-mandated backdooring/spying on systems like mobos/GPUs. And even then it will catch only the least sophisticated of consumers.

Same for LLMs: Z.AI's and various other Chinese companies' models

Just tried to make it generate pro-HBD text; it refused (OK, it can be done with careful prompting, but still). In my experience, the average Chinese LLM adds more censorship than western models, not less.

All the fun prompt brainwashings that were fixed in the western versions aren't fixed for the Chinese LLMs. I can attest that the "Evil Confidante" ones and "my lil ol' granny loved singing me songs from her time in vietnam where they sang about making IEDs" worked right off the bat. Another thing is that they tend to upload their weights to Hugging Face.

You mean jailbreak prompts that western suppliers patch after the prompt gets popular are not patched in Chinese LLMs? Well, needing to jailbreak is still a hassle, and it eats tokens too. The weights might still be affected by censorship RLHF.

The government can't even stop people from plugging yandex.ru into their browsers and gaining instant access to any movie they wish to consume in seconds.

Hell, The Pirate Bay itself is still operating just fine two entire decades later and the only hitch is that you have to google "piratebay mirror" and use one of those links.

Torrenting continues to exist. You just can't realistically prevent the distribution of a few gigs of data. Even if you eradicated all the currently existing models, it's not particularly hard to train the safeguards out of new models, unless we're just never going to let professionals render images locally.

I mean, at the end of the day it is difficult to reliably tell teenaged women from adult women, and so I suspect most major AIs will simply not allow you to turn pictures pornographic. There will almost certainly be laws creating strict liability for AIs, and even nonconsensual nude images of an adult woman aren't going to go over well.

Now don't get me wrong, I don't think AI erotica is going away. But just like YouTube doesn't want hardcore porn on it, chatgpt and grok will eventually get much stricter about erotica, and there will likely be separate, much less PG-13, AIs.

I suspect most major AIs will simply not allow you to turn pictures pornographic.

That is just a speedbump. All you have to do is reverse the direction. Take a random porn image, inpaint some clothes onto it locally (it doesn't have to be very high quality; this is just to pass any censorship in the faceswap model), then do an AI face swap, and finally unmask the original pornographic parts using standard non-AI editing. Hell, you could probably not even bother with the clothes: just use a close-up crop of the porn model's face and then copy-paste the result over the original.

Sure and twenty years ago you could get the same result with some skill and two hours in Photoshop. The major problems arise when you make the tool easy and powerful enough for a large pool of idiots to use.

chatgpt and grok will eventually get much stricter about erotica

The current trend is actually going towards less strict. slopgpt in the past blocked NSFW text content, but sam altman was talking about adding it. When it comes to text, it seems like the actual legal liability is quite low, and if everyone does it, then the reputational risk is low as well. I would predict the opposite and say that likely all the major AI players will allow NSFW text in some form or another within a few years. Of course, images are still going to be a danger area that most will steer away from.

She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

Well duh? The article itself says they didn't have any hard evidence. Is she expecting the school to take action against this boy because it's who she and her friends "suspect" created the images, without any hard evidence?

It's disgusting and awful behavior by whoever made them, but unless you've got some real evidence (and I'd even take one of the friends of the accused boy saying that the boy did it as stronger evidence than what the article presents), then I don't see what you could expect the school to do here. And even though it's nasty and disgusting behavior, starting a physical fight over it is going to get the person fighting in trouble until schools get rid of their retarded zero-tolerance policies. I spent plenty of time in detention for fighting back; it's retarded policy, but at least schools still seem to be consistent in their retardation.

Is she expecting the school to take action against this boy because it's who she and her friends "suspect" created the images, without any hard evidence?

Based on other things I've seen elsewhere, probably.

The First Amendment protects free speech from infringement by the Federal Government, the Fifth guarantees due process in the courts, and so on. The principal is not a Fed and their office is not a court, so obviously the constitution does not apply. Make one little rhetorical slip, and suddenly the entire idea of due process is not a valid counterargument to your preferred methods of meting out punishments (anywhere short of a genuine Court of Law, at least).

This is so remarkably and verifiably wrong. The principal of a public school is a State employee. The bill of rights is incorporated against the states by the 14A. The courts have for decades said that students don't categorically lose those rights in school. This is all stuff you can just look up.

This is so remarkably and verifiably wrong.

Yup. Doesn't stop people from making that argument, either explicitly or by omission.

I expect more tho.

“Your tears say more than real evidence ever could.”

More seriously, to me, the part that struck the most discordant note was this:

She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her.

Which almost makes it sound like she wanted to keep going to school with the boy. If I were in the school district’s position, the last thing I’d do is ship them both off to the same location.

The complaint there is not "I wanted to be in the same school as him"; it was "I got sent to the punishment school for defending my reputation, while the guy who committed the offence was allowed to stay where he was".

If you're going to punish people, and you can't/won't punish the guilty, then punish both. Don't let the offender off and punish the person who reacted.

Haven't we had stories on here from people talking about how they were bullied in school and when they finally snapped and hit back, they got punished while the bully got off? This is the same thing. It's unfair and it's incorrect.

Quokka

It's how bureaucracies often respond, though.

The impression I get from the rest of the article is that the boy avoided school discipline not just immediately, but even after having been charged with a crime for his actions. Maybe that's wrong; the boy goes unidentified and the school claims it also can't provide any information. Surely if there's enough evidence for the police to charge a crime there's enough evidence for a school to act.

Surely if there's enough evidence for the police to charge a crime there's enough evidence for a school to act.

It should be noted that police are much, much better at acquiring evidence than schools, and it's likely they didn't hand all their evidence over to the school.

The article says they charged two of the boys who were sharing the images; it's not clear to me at all that either of those two boys was the one who had originally created the images.

Edit: the article also seems to be saying that the two boys who were actually charged went to a different school than the girl entirely, but it really didn't make any of that clear.

From a legal standpoint, what is the theory for the 'harm' caused in this instance? And to whom?

Liability of any kind usually rests on the idea that someone's interests were injured and in need of redress.

We are able to statutorily 'create' interests (the Americans with Disabilities Act did this, for instance) but I think we'd smack into 1A issues trying to make it completely illegal to post manipulated images of people who... are already posting images of themselves online.

Most obvious angle is copyright/IP, but they're still sorting that out for AI works.

I'd kinda love for them to go at it from the obscenity angle. Because that would also suggest that the women posting thirst traps are doing something wrong too.

I don't know about legal harm, but my ideal solution to this would be "The girl who punched the asshole boy gets a finger-wagging, and the boy gets told he had it coming and stop being a little shit."

Obviously, that's too old school and common sense for a school to do.

Photoshopping a woman's face onto a pornographic image has long been understood to be a shitty thing to do and possibly actionable. But AI-generated real-person-porn is probably just something society is going to have to get used to.

In a world where social shame was still effective, it'd be pretty damning to do it and would probably result in ostracization. It's not clear what one has to do to 'compensate' for the situation, though.

Similar to being a peeping tom, or a subway groper, or anything else that intrudes on people's strongest-held social boundaries, even when the harm inflicted is de minimis.

But the problem is that shame would also kick in for stuff like a young girl hyping up her debut on Onlyfans once she turns 18 (link is mostly SFW but you'll see some thirst trapping). The puritanical ethics required here would condemn both the voyeuristic act and the exhibitionist act.

It's rather schizophrenic that there's basically unlimited tolerance for (adult) women to produce pornographic content of themselves, but shame is still heaped upon the consumers, as if these weren't both inseparably linked and necessary components of the "empowerment" equation here.

Like I said before, worst of all worlds.

This young lady is doing something bad, and most people will agree. If her nudes get disseminated for free nobody will have sympathy for her.

And we will do what, precisely to rein in the behavior?

Call her a slut, like you’re already doing.

Look, I may not approve of sex work, but I totally acknowledge you can't get rid of it. Women being harlots had few legal consequences back in 1900, either.

Well you specifically said it there. "Legal" consequences.

What other sorts of consequences might have regulated this particular aspect of the economy?

There's presumably some middle ground between "try to stamp it out entirely via force of law" and "suggesting that prostitution is a bad career choice for almost all women is a bannable offense."

But the problem is that shame would also kick in for stuff like a young girl hyping up her debut on Onlyfans once she turns 18 (link is mostly SFW but you'll see some thirst trapping). The puritanical ethics required here would condemn both the voyeuristic act and the exhibitionist act.

I've already said that I am pro-slut shaming.

That said, there's a difference between someone willingly posting their nudes and someone not doing that. I think the OnlyFans girl would have a harder case to make about being harmed by someone generating AI porn of her, versus a girl whom you think should just accept that all women are being punished for the OnlyFans girls.

There's a difference, but I get confused about the secular reasons why it's meaningful.

Sex and nudity is supposedly no big deal, especially if you're attending a pride parade, but it absolutely IS a big deal when it's someone's nudes hitting the internet, evidently. Shame, embarrassment, I dunno; it seems just taken as a given that it demeans the subject to be exposed in such a way. But if they publish those exact same images themselves, it is not demeaning?

There was a minor hullabaloo when I was in college involving 'Slutwalks' making it acceptable for women to wear skimpy clothes in public. And the "Free the Nipple" movement which, among other things, tried to make it acceptable for female nipples to appear on, e.g., instagram.

But then what I noticed is that almost no women (well, no attractive women) used this newfound power to actually go around in public topless or scantily clad, or post topless shots to IG. THEN came OF where they could monetize it and things REALLY got locked down.

So culturally we're told sex and nudity aren't a big deal, don't be prudes. But ECONOMICALLY, people (mostly males) spend billions upon billions of dollars to acquire sex and view nude women. So the only distinction I can really grasp is "am I getting paid for this or not." Which applies to many things, granted.

But where does that leave us?

Sex and nudity is supposedly no big deal

According to whom? The leftists you hate so much? Yeah, some will make that argument, but even they won't say it's no big deal when it's non-consensual.

I don't think the pride parade demographic is particularly representative even of leftists.

But where does that leave us?

I think pretty clearly it's morally objectionable to generate AI porn of someone who is neither a sex worker nor someone who wants AI porn generated of them. What the law says, I am less sure, but I expect the "revenge porn" laws will probably be expanded to include "Generating AI videos of your ex fucking a horse."

According to whom?

Well, the amorphous cultural norms brought on by the sexual revolution, more to the point.

Abortion? On demand. Contraception? Everywhere. Marriage? Optional.

As the biological consequences for having sex with whomever you wanted were abolished, so too were the social consequences.

The leftists you hate so much?

I do not hate them, but I do not want to live amongst them.

And if people were better about choosing to live around people who genuinely shared their preferences and norms, much of the problems we're discussing in this particular case would evaporate.

I mean, the Muslims have solved it their way. Keep women covered up whenever they're in public. This "works", but (as I'm sure you'd agree) it requires an unacceptable restriction on female autonomy. If they only live among other Muslims, this tends to work "OKAY" (women stoned to death unavailable for comment).

We westerners have clearly NOT solved it in a way that is satisfactory, and we seem to make up the rules on the spot based on the relative status of the involved parties. I'm reminded of this every time I see a gym influencer post a video of some guy allegedly gawking at her body without her consent, and posting said video so that all the anonymous onlookers can... gawk at her body.

We CANNOT sustain a system where people are allowed to wear whatever they want without regard to its reception by onlookers, and the onlookers are only 'allowed' to enjoy the view if they're approved as acceptable by the wearer.

I don't want to live amongst people with such irreconcilable standards.

I think pretty clearly it's morally objectionable to generate AI porn of someone who is neither a sex worker nor someone who wants AI porn generated of them

What about making unflattering caricature art of them? Or depicting them in (non-graphic) torture scenes? Or just changing their skin tone to a different color on purpose? This all seems like special pleading that treats nudity (or even just softcore titillation) as a special case.

At least, with kids, I think there's a sustainable moral argument for why we don't want people sexualizing them, and cast a suspicious eye at those who do.

And a question I've asked before: what if you find someone who is a very close lookalike and have them pose for nude shots and post them, but never actually imply that was your goal? It was a common enough practice among pornographers back in the day.

It seems like this is basically suggesting norms of "look at whatever content I choose to post, but do not ever interact in any way I might find unpleasant."

Which runs extremely counter to how internet culture as a whole works.

That's why I specified morally objectionable. I don't have to think it should be illegal to think someone who does that is a shitty person. I can legally draw a picture of your wife fucking a horse and post it online. Even if you couldn't press charges, you'd probably want to punch me, and most people would think I deserve it.


I mean - in Eugene, Oregon, topless women are not terribly uncommon.

But if they publish those exact same images themselves, it is not demeaning?

Yes. Consent and agency are necessary considerations in plenty of moral decisions/outcomes, sexuality included. It seems intuitive to me that the proactive decision to publish sexual content is a vastly different experience from having someone do it under your nose. Money need not apply.

A few (admittedly imperfect) analogies involving consent to illustrate my point:

  • A billionaire choosing to donate his fortune to a developing country vs. his funds being seized by a government and donated against his will.
  • You choose to donate a kidney vs. the ambulance coming to your house and taking it from you.
  • You choose to tell a secret to your friends vs. a loose-lipped confidant broadcasting it to the masses despite your wishes.

In all these cases, the former option is fine when done of one's own volition, but becomes a problem when another actor steps in. There are almost certainly philosophical papers that provide the premise-by-premise reasoning for this sort of argument, but hopefully you get the picture.

In a way, the body, particularly the sexualized body, is something of a possession. It can be given and taken away, shown and hidden. In some sense, it is a commodity that we have "ownership" of, and many consider it the sacred domain of the individual. Sexual acts are high stakes, which is why it is so terrible when they are done against one's will and why it is considered a statement when someone takes bold public action with their body, for better or worse. You could argue that it is demeaning to publish sexual content on moralist grounds (i.e. public sexuality is inherently debasing) or consequentialist grounds (i.e. public sexuality leads to negative behavioral outcomes), but these arguments are complementary to, rather than overriding of, ideas of agency and consent, in my opinion.

Well I'm gonna have to drill down deeper as to your logic here, which I can accept as facially valid.

What is actually 'removed' when the image is published?

Similarly with the secret: a breach of trust is a breach of trust, but unless you signed an NDA that expressly laid out how to calculate damages, your remedy is "never trust that person again."

Vs. losing a kidney or having your money taken, where you can absolutely point to the thing that you lost and demand recompense for.

I would not be arguing this if we were talking about the actual physical rape of a person, which is clearly a violation of the concept of 'bodily autonomy'; I think taking a photograph of someone/something is inherently less of a violation.

Publishing a photo is a step beyond, I can absolutely grant. But, as I alluded to before, the only actual dividing line I see for whether it's a demeaning violation or not isn't in how the viewers receive and react to the image, but in whether the original subject will get any money from its publication, not in whether they have lost something that was in their possession.

Like, consider a situation where a woman takes a nude photo, then fat fingers it and accidentally sends it to the wrong dude. Then, mortified, she demands that he delete it and excoriates him if he comments on it approvingly. Or comments on it at all.

Is HE in the wrong if he views and enjoys this image that wasn't intended for his consumption? Or is SHE in the wrong for sending unsolicited pornography to an unwitting recipient? Is he obligated to delete it? What's the difference? Once it has been sent, how is she harmed by it arriving to the wrong person?

Because I think if we take your express logic to any extreme, it also becomes objectionable to imagine someone naked, especially if you derive pleasure from it.

It's a false statement about you that harms your reputation, often in measurable dollars and cents. One can easily imagine someone losing a job or marriage over fake nudes, and it's not like damages for emotional distress is some foreign concept lacking decades of court precedent.

Your public reputation is a valuable thing that can absolutely be damaged by someone distributing fake nudes of you, in the same way it can be damaged by someone spreading lies about your untrustworthiness in business or your lack of professional qualifications.

Most women don't want to be porn actresses. Making them into ones without their consent is obviously wrong in a way that them choosing to become porn actresses of their own free will isn't.

Is he obligated to delete it?

Yes.

What's the difference?

Intent. If you mean to pay your electricity bill but by mistake send $200 to Mr. Random, do you expect Mr. Random to send you back the money or not?

Once it has been sent, how is she harmed by it arriving to the wrong person?

Because if she's not selling images of her nudity for money, it was an intimate shot meant to be shared only with the person she is in a romantic relationship with, and who knows what Mr. Random is going to do with it? Maybe he'll show it around to his friends. Maybe he'll post it online. Maybe he'll try and blackmail her with it.

I don't think people should be sharing nude photos, boyfriends or not, but that stable door has swung off its hinges. So the next best thing we can do is maintain control over our property, which includes photos of our bodies and faces.


The examples I provided are not 1:1 analogues to sexuality but rather illustrations of consent in practice. I don't believe there needs to be specific recompense in these situations for the interference of an outside actor to constitute a violation of consent. In the case of a secret, you're correct that the victim has little to do other than not trust the person again. I think that's tangential, though: the secret-spreader has still committed a violation of some sort. The release of sexually explicit photos is similar. All that can be done is have them taken down, but it would be hard to argue that some principle of consent/agency wasn't violated in spite of this lack of direct recourse.

Taking a photo of someone is less of a violation than rape, sure, but a lesser violation is still a violation. Petty theft is less of a violation than grand larceny, and they're both prosecuted.

I'm not sure I totally follow your point about money. If I'm restating you correctly, you're saying that modern ideas about the acceptability of these things hinge on whether or not the woman gets paid, not on the reaction of the viewer. My response to that would be: who cares what the viewer thinks? Money is a useful moral fiat that people bend their preferences for all the time: they're employed. If someone forced me to work, that would be loathsome, but I do it for money. It is "reasonable" that many women bend their sexuality in this way, even if I find it socially problematic. I don't think it's somehow hypocritical or irrational for money to play a role in moderating people's moral preferences. There's decades of social psychology research to support that idea. I'm not sure what your ideal outcome in that scenario would be.

The situation you paint is a bit too specific for me to argue in detail but overall I would say: if the woman sends an image accidentally and requests it be removed, doing so is basic common courtesy and respects her right to privacy. Of course, there is no mechanism whereby the recipient is obligated to do so, but it seems straightforward to me that he should do it. Perhaps in an appeal to the social contract, perhaps in respect for her autonomy - I can't argue it in great depth right now but I think you understand my point. He shouldn't be shamed if he finds it attractive - that's arguably involuntary - but doing anything to further exacerbate the uncomfortable situation is clearly morally dubious.

Yes, it is objectionable in my view to imagine someone naked without their consent. It's not a tremendous violation because it has minimal social consequences and effectively doesn't exist unless it's talked about, so I would never consider legislating it or even shaming anyone for doing it on occasion. We are human and we fantasize. That said, if I heard that someone was imagining the women passing them on the street as naked all day, I'd think less of them - a mental gooner is still a gooner. It's a matter of degree.

So the only distinction I can really grasp is "am I getting paid for this or not."

Which is why the difference is meaningful in the first place. The sharing of intimate images is something women in general want to be paid for, so obviously doing that without permission is bad; meanwhile, the other sex stuff doesn't have anything to do with that, so it is OK. It also works as a "haha, ur a prude" trap for people who can't or won't understand that the former dynamic is all that matters.

It's not any more complicated than that.

Yeah, but she also wants to completely dodge the reputation that comes with trading her sexuality for money.

And of course, there's still often a guy in the picture actually arranging for her to sell this stuff. In this case, OF clearly profits far and above what all but its top-performing producers do. And it's owned by a dude.

I'm not even denying that there's a fundamental transactional nature to all this stuff, even if you're in it for marriage and kids... it's just that it's now literally reduced to a commodity that gets haggled over, and people who 'have' to pay for it are viewed as losers, whilst anyone who is successful at getting attractive women to give it up without explicitly paying is either extremely crafty or inherently high status.

Let me repeat that: sex is a commodity which can be purchased at various price points depending on the quality, so it's not hard to acquire in the abstract, but being able to acquire it without spending money somehow makes you a God amongst men. Our old-school ape-wiring seems somewhat at odds with our later 'homo economicus' upgrades.

Where's the harm in teenage boys faking nudes of a 13 year old girl without her knowledge or consent, indeed very much against her consent?

Well gosh gee whiz, why on earth are women such picky, fussy, hypergamous trollops who don't want to marry just plain ordinary guys? No wonder we need to force these women into marrying normal men who think nudes of 13 year old girls are just fine! Why is anyone getting het-up about this? Men like nubile women, young means fertile, and if she's old enough to bleed she's old enough to breed, right?

The boys are just doing what boys do! Boys are gonna be interested in girls of their own age! Boys will be looking at porn, and porn is fine and normal and in fact is good for society since it reduces rape and sex offending crimes (citation needed, of course!)

I know, I know: Amadan is going to hammer me for using sarcasm. But how else am I to react to "how is the girl harmed by this?" unless I get really angry and abusive, in which case I'm still going to get the mod hammer.


There is a lot of daylight between "it's not okay to make nude deep fakes of 13 year old girls" and "an individual was harmed when that deep fake was made," and even more from that to "someone should be sent to prison about it." None of these things actually imply the others.

I know, I know: Amadan is going to hammer me for using sarcasm.

Sigh. Yes. Come on, you're just trying to be provocative and you knew you were going to get reported.

It's not the sarcasm that's the problem. It's that you (and I mean you specifically) can make an intelligent and cogent argument for why this is bad behavior that should be discouraged. And you can even be (a little) snarky about it. But when you layer on the "gosh gee whiz"s and straw men obviously directed at the person you are responding to, of course you're dialing up the antagonism in a way that doesn't actually lead to productive engagement. You're just trying to say "You're a shitty human being unworthy of a respectful reply" without using those words.

The "big yikes not a good look chief" millennial slop has ruined online discourse because it feels so good to get off those snide, snarky little burns but it absolutely destroys any hope of good faith dialog. Do you want to talk to people and maybe enlist their sympathy and get them to see your side of things, or do you just want to score little zingers?

The sarcasm is to prevent me from going nuclear with abusive language. "So, okay, just shut up, you don't need to comment on this".

Yeah, that would work - if we didn't have guys posting about 13 year old girls in this fashion. I used to be a 13 year old girl. I can't be coolly objective and removed when it comes to stuff like this, because God Almighty if we can't even preserve some few rags of compassion towards children, what the hell are we doing even trying to keep this society going? Let it burn down. Let the TFR crater. Nothing is going to be missed.

Then be nuclear with abusive language and do eat ban. At least you will have made your point clearly for once.

Remember that the next time Amadan bans me, Southkraut. If I go down, I'm taking you with me! 🤣

I don't want to ban you.

But if the Jews on the Motte can cope with all the Joo-posters saying "The Holocaust didn't happen and it's good that it did," you can cope with some misogyny without losing it every time.

Honestly I'd prefer if you just adopted some more sang-froid, didn't treat this place as a twitter/discord/reddit meme-and-karma pit, and consequently don't get banned.

I mean, I too have my buttons and step out of line when they get pushed. Nobody here is an automaton (I hope). But this whole place works to any good only when people try their damndest not to chimp out, do get slapped when they do, and the largest part of our business gets done with some actual charity. Skirting the rules to fly as close to shit-flinging monkeydom as possible but just short of getting banned is just plain not productive.

You can make reasonable posts. You sometimes do. But eyeballing it, at least half of your output is antagonistic. And that part is worse than useless. Would it kill you to give us more of the other part? The one where you bring in views that are rare on the Motte, and don't make it a snarking contest?

You may as well post animal porn and go out in a blaze of glory, because your complete inability to refrain from embarrassing emotional outbursts every time a post rubs you the wrong way means you're doomed regardless.

I liked the guy calling me "middle-aged". Oh no, I've lived past early adulthood, how embarrassing! We all know women should die the minute they're not hot by male standards, which seems to be "over 13 because fake porn of 13 year old girls is hot and normal".


if we can't even preserve some few rags of compassion towards children

You mean girl children, right? Because compassion towards boy children is notably absent, and contempt is its substitute.

No, I mean everyone. I think those boys would be the better of a few slaps on the legs from their parents, but that is for their good, not because they had the temerity to offend against a female.

I don't think it's good for boy children that we are seemingly expected to accept that at the age of 13 they will already have developed a porn habit, and that this is right, normal and good.

In this case the boy children in question did something very bad.

But in the general case it holds up. I cannot imagine this level of outrage if a boy were being bullied (or suffering equivalent psychic damage) by girls. Ultimately girls get compassion and boys get told to man up and/or shut up - I am honestly surprised they expelled the girl in this case and I wonder if it has anything to do with

inviting others to join her

if she organised a group beating. Which, on reading, she did:

She hit him a second time. Then, the principal said, the girl asked aloud: “Why am I the only one doing this?” Two classmates hit the boy, the principal said, before the 13-year-old climbed over a seat and punched and stomped on him.

If the genders were reversed, the boy doing the beating would get a lot worse than a temporary expulsion.

Well duh; 'you can just harass our daughters who did nothing wrong and followed the respectability rules in place in our society' resembles no historical society that has ever existed, including very patriarchal ones. What do you think the Taliban would have done to these boys?

There is a reason that, despite being skeptical of things like the 19th amendment, women's financial independence, women in pants, etc., I do not align myself with the faction on the motte which spits out spicy takes on women and then retreats to 'but but double standard! Boys just have to take it!' Yeah, they do. Aside from the politically correct but very obvious fact that the statement 'men and boys are stronger than women and girls' does not need qualification (it applies psychologically as well as physically), do you, uh, know what women do when they spend their girlhood being bullied and harassed by whatever boys want to do so? It generally looks like radical feminism; the breakdown in family formation in Latin America precedes the hyper-woke feminist wave in the region (and also the region's drop in TFR - these women appear to have retained conservative family values longer than their menfolk). Getting outraged about girls being abused and harassed with no recourse, far more outraged than in the case of boys, is a normal and healthy thing to do.

Yes, if a group of older girls doctored innocuous photos of a thirteen-year-old boy to be NSFW and started sharing them, this would be a bad thing and deserve to be punished. I am confident that you will not be able to find an example of this happening (you will, of course, be able to find examples of teenage boys voluntarily sending NSFW content of themselves to young ladies who don't particularly want to receive any). The asymmetry of the sexes goes both ways. You can call me patronizing, but I'm not wrong.


It's that you (and I mean you specifically) can make an intelligent and cogent argument for why this is bad behavior that should be discouraged.

If I (and I mean me specifically) am unable to make intelligent and cogent arguments, will that be seen as a mitigating circumstance when moderating my comments?

Your logic suggests that you'd have no objection if a 13 year old girl published nudes with knowledge and consent.

Is that true?

Is consent the defining factor here?

I just want something on record.

I think 13 year olds should not be sending nudes to anyone. However, as a society, we seem to have decided that sex is for everyone and that the younger we all start, the better. Romeo and Juliet laws, because aw, statutory rape is such a condemnatory charge! It ruins your life (if you're the guy fucking the younger girl)!

So if we're going to say "okay, yeah, 13 year olds can have boyfriends/girlfriends" (instead of "What? No, you are too damn young! Wait a minimum of another three years before even thinking about dating!") and if we're going to say "hey, sex is a beautiful, natural, instinct that everyone has a right to engage in, and once you hit puberty you're old enough to make up your own mind" (even setting aside the "well ackshully in Classical times 13 year old girls were routinely married to 30 year old men, this is why I am not an ephebophile/why women should be married off as young as possible by their fathers picking a husband for them" set), then at least let it be in the context of a relationship, where she knows and consents to giving her boyfriend nude photos.

I still think it's a very bad idea because it's highly likely the first thing the boy will do, regardless of how he swears he'll only keep them to himself, is share them around with his friends. But at least then it's a decision, even if it's a bad decision, made by the girl. The consequences there are the punishment: yes, you can't trust men when they swear fidelity, yes men are only after one thing, yes you now have a reputation for being a slut, yes more people than you ever wanted to know now know what you look like naked (and maybe even trying to strike sexy poses), yes this is your fault as well. Learn the hard lesson and don't do anything this stupid the next time.

Fake photos of her that the creator pretends are real, that she sent him, that he's fucking her, that she's easy, hey guys have a look, you could get a piece of this whore - yeah, she is perfectly entitled to punch him in the face for that.

And so it sounds like we're worried about something other than a child's consent being present or not.

And of course, you're seemingly expecting that the female side of the equation isn't going to be mature and wise enough to make good decisions here and thus is not blameworthy.

But you get young guys, who are similarly immature and unwise, and you expect them to behave with maximum propriety, and if they do not, then they should expect immediate and swift reprisal. I don't see why leniency due to inexperience and immaturity absolves one but not the other.

If we think kids engaging in uninhibited sexual activity is bad, and, in that vein, that sharing nudes is bad, I simply suggest that we are concerned for reasons orthogonal to 'consent' and should thus apply rules that restrict all the parties' behaviors, possibly for their own good, regardless of who did or did not agree.

I think 13 year olds are not mature enough to make good decisions. If a 13 year old gets convinced to provide nude photos, they've been taken advantage of. It doesn't matter if they're a girl or a boy.

Where it's pardoned, as seems to be the case here, that "haw, haw; of course 13 year old boys are horndogs, of course they want naked photos of girls, what harm did they do?" that's equally bad. I'm not saying "excuse the girl", I'm saying "the girl in this case did not do anything wrong yet she is being punished for it".

Yes, and I'm saying "we accept that most 13 year olds can and will make horrendous decisions, and we try to correct those decisions without making it out as a double standard where only SOME teens are culpable whereas others are not."

And it's probably not good to overreact and treat teen guys as evil rapists for an action that, objectively speaking, involved no physical force or coercion.

I would mostly agree with this. It's just another example in the long list of examples of why the consent standard, when applied to sexuality and sexual interactions, is more or less useless.

I don't think it is useless, but man, people do not seem to really know what they mean when they say "consent." Worse still, they don't really know what they mean when they say they "consent" to some activity.

With sex in particular, the emotional valence of the moment, and its intensity, can shift by the minute. Then, reassessing after the act, someone may decide that some particular part of it they 'agreed' to in the moment was actually a violation.

That is one of the main problems with consent as a standard. It does not hold up under any of the hard cases.

And with sex IN PARTICULAR, there is no reasonable way to go back and assess whether it was validly given or not or whether the lines were crossed. I noticed this issue in law school. "Wait, how the f@&k do you establish evidence for lack of consent when it all happens behind closed doors?"

Unless you film the whole interaction and that opens up the whole can of worms that we're discussing.

there is no reasonable way to go back and assess whether it was validly given or not or whether the lines were crossed

Yes, which is why intelligently-designed laws around sex sidestep the issue.

Let's look at the sex laws described in Deuteronomy and the social dynamics they encourage downstream, remembering that this was at a time when asking people to deny their human instinct for immediate revenge was far more novel (and a lot more difficult, due to the lack of State capacity) than it is today. So we can say that these are laws/accommodations/compromises made because human biological instinct ultimately runs more along these lines than any modern view.

The relevant TL;DR here is:

  1. If she actively cried rape, but in a place nobody could reasonably hear, she is assumed to have been raped by default and the judgment (of death) falls solely on the man. This protects the woman- if she actually values the life of the man, she won't run off somewhere this can't be detected (and because the penalties for rape and murder are otherwise the same, there's no other incentive for the man not to just kill her)[1].

  2. If she didn't actively cry rape in a place someone could hear, then judgment falls on both (forced marriage if the woman was unmarried, and death for both if she was married or engaged). This ensures consent [to having participated in the violation; see also [0]] cannot be revoked after the fact, which protects men, and also ensures that- if they both did want this- the woman is bound to/invested in total secrecy for the same reason the man is. It also protects men from cuckoldry in an age where sex always results in babies[2], but this is more a "make sure our warriors don't burn down society by refusing to fight and instead throwing open the gates" thing.


Now, because modern society is objectively stacked in favor of women (and this observation is a point towards this interpretation), we observe that we still have the strictures of the first (that protect women) but without the benefits of the second (that protect men). That is why we push further and further into "consent can be revoked at any time"- that's simply what we should expect from female gender politics and the associated ignorance (intentional or otherwise) of the inherent moral hazard that being able to retroactively cry rape enables.

In a society that's objectively stacked in favor of men, by contrast, we should observe that the second case dominates. It looks a little different when this happens; there tend to be a lot of mistresses, divorced wives left with nothing, fathers disowning their children, old women intentionally locked out of self-sufficiency, casual ass-slapping, and all the other things Boomer women (and their [progressive] daughters) complain about the 50s and 60s for featuring.


[0] Feminists get angry about "has to marry the rapist", but this ignores that all sex [outside marriage] can trivially be called rape for the same reasons it's so easy today (and the community at that time would agree; there's no reason a woman would ever have sex for pleasure- something traditionalists and progressives agree on, as it's dishonorable). So the law here is "virginity is part of a woman's inherent value [normal men and women agree on this point completely, provided contraceptives don't exist]; you break it, you buy it, and the woman is in charge of saying you broke it, retroactively and at any time" (which is what progressives want to be able to do).

[1] If I recall correctly- and you'll have to correct me on this point- the US still has a version of this law (where it's some separate sex crime charge if either participant leaves their state of residence- in fact I'm pretty sure the mere suggestion of such is a crime).

[2] Traditionalists in particular will harp on and on about this because their instincts are incompatible with the technology that makes this so, but what that argument actually means is that the father of the family shouldn't be expected to lay down his life and toil to advance children that aren't his. This is an argument that doesn't directly apply to women, so women naturally assert this dynamic doesn't exist... until the young men refuse to fight an enemy that promises better terms for young men. (And yes, the lack of enemies in the West means women will further not see a need to co-operate until it's too late.)


"Wait, how do you establish evidence for lack of consent when it all happens behind closed doors?"

But see, that's what consent is: it's simply a codification of who is assumed to have automatically cried rape. This is how "children can't consent" can even be a coherent sentence, because on a dictionary reading it's nonsense. This case is also the steelman for having this law- traditionalists and progressives are actively harmed by the existence of sex in general, so it's understandable- but naturally, most of the fight in this case is over who gets to be "a child" -> "who automatically cried rape".

Naturally, because young women (13+; both trads and progs call this "child" for property rights and anti-sexual-competition reasons respectively, but biology doesn't agree with their assessment) are sexual competition for women but not men, women will push as hard as they can to make sure the definition of "children"/"automatically-considered-raped-for-sympathetic-reasons" is as wide as possible. And I'd say "anyone who claims to be, at any time, ever" is pretty fucking wide already- the fact they semi-seriously want to set the age to 25 is proof they won't ever be satisfied with that.

Of course, setting it to "infinite" is just Deuteronomic law through the back door, and that's not going to happen as long as there are enough liberals (or liberal-sympathetic) who want to fuck more than one young woman. Hence the stalemate.

So, back in high school, someone made a fake photo of me and posted it in a classroom. It wasn't a nude, but it was political, depicting me as Stalin, as I was an outspoken socialist. I was outraged ("the photoshop is not even accurate! I'm a TROTSKYIST!"), and it definitely hurt my feelings and hurt me socially. A clear case of bullying, but, in retrospect, it was pretty hilarious and a useful learning experience. Should that kid have been punished?

I don't think so, and I suspect you don't either (though I'm curious if my suspicion is right). Which shifts the question to: what is the difference between a nonsexual representation and a sexual one? I think, to many people who don't see harm here, harm categorically isn't something that can be done with an image or words--sticks and stones can break my bones, etc. If people start physically attacking someone, or destroying their property, in response, there is harm, but the harm originates from the physical act, not the instigating image. The introduction of a sexual element doesn't change this. (I'm speaking here in terms of conceptual framework, not legal definitions.)

That doesn't mean that the school shouldn't do anything about the boys--schools can and should regulate behavior above and beyond the minimal standard of harm. But the idea that actual physical violence should be punished less than images and words is weird to me, especially when school administrators had no actual evidence of the images.

You're a guy. Twenty years later, someone is not going to drag up that fake nude of you and use it as evidence that you are a lying, cheating, cock-carousel riding whore who wants a share of an alpha, will settle for a beta to support you and then cuck him with that alpha, will try to pawn another man's children off on that beta as his own, and are a prime example of why women should not be allowed to vote, attend higher education, or be permitted out of the house by your father until the day you marry the man he picked out for you, and then you will spend all your days in that house under the rule of your husband.

(Those are some of the highlights from the commentary around 'how to solve the TFR problem/why aren't women dating men/what can we do about the lonely, sexless men who can't get a wife' on here, so far as I've gathered them).

The difference is that a photo of you as Stalin is obviously a joke. Do you think you would have felt the same if instead they showed a video of you masturbating on a television in the classroom?

I would have felt very differently: I would have cared much less, quite honestly. "Oh, someone's a weirdo, anyone whose opinion of me changes because of it isn't worth caring about." And I'm not sure that making the AI-generated nude clearly a fake joke (giving her purple skin or whatever) would change anyone's opinions. I think the crux of the matter is that it's a sexual image, and we cordon off sexuality as requiring unique, almost spiritual protections around it.

What is the legal harm here, is the question that @faceh asked. Mind you, I disagree with faceh; I think the harm here is pretty obvious even from a legal point of view, since defamation per se usually covers allegations of sexual misconduct as well (but as I said elsewhere, I'm not a legal expert here and could be completely wrong). Please try to respond to the argument faceh is actually making instead of devolving into mocking and sarcasm.

Okay, so I burned down faceh's house and broke faceh's legs and stole all faceh's money in their bank account, but what is the legal harm here? Was any harm even done if there's no legal harm?


I mean, I can run you through the entire philosophical underpinnings of the Anglo legal tradition that explain the "harms" entailed by, e.g., physically damaging a person's body, removing funds they 'earned' without their permission, or demolishing objects that belong to them.

I did go to school for that after all.

But somehow I think you'd be nonplussed.

I'm not nonplussed. I can recognise "heh heh my chance to be an edgelord" when I see it.

Look, I'll make this clear, as that is the spirit of this whole forum:

I do not care about being an edgelord, I do not care if you are offended, or if you are flattered. I care about being correct.

Edgelords end up being correct in some cases because they can speak on matters that polite people will carefully ignore.

Being correct is not a necessary condition for being an edgelord, however.

If my being (mostly) correct makes you think I'm an edgelord, whatever. I hold a ton of other opinions that are 'edgy' to some crowds but very normal in others. I'm not trying to shock or offend anyone.

If anything, it's "heh heh my chance to be a high decoupler." But even that isn't really accurate. Being a decoupler is also adaptive in the legal field, though.


If you want me to shut up about this topic, or, ideally, to come around to your side and agree that you're correct, there's plenty you can do.

  • Find data that contradicts mine, and show me my data is flawed.

I personally keep trying to find data that disagrees with me. The unfortunate truth is that the more I look, the more I stumble across data suggesting the existing data is accurate and my position is correct.

  • Point out where my analysis is flawed. I'm taking good data and misinterpreting it, or missing a lot of context, or am engaging in clearly biased/motivated reasoning.

  • Point out where my premises are flawed.

  • Show me a superior theory that is still supported by good data.

If it's NOT the case that women have gotten progressively less appealing to men, less mentally stable, more antisocial, more unpleasant and unhappy, and that this seems uncorrelated to male behavior...

Well, what is the competing interpretation?

  • Point out a solution that I'm missing, or explain how the problem isn't a problem, or the problem will go away on its own.

  • Point out actual counterexamples that show my position is not a sufficient explanation of the observed data/phenomenon.


Basically, I'm sitting here with data I think is mostly solid, all pointing in the same direction, with a general theory/interpretation of it that completely explains the data I'm seeing, and suggests particular solutions to the nature of the problem.

And EVERY TIME I ENGAGE WITH SOMEONE WHO THINKS I'M WRONG, they utterly fail to undermine or effectively attack any pillar on which I've formed this belief. It's all argument from some position of personal belief that doesn't appear to be informed by true facts about the world, and is often based on experience that is utterly outdated, from a time before the problem manifested in earnest (read: older than 10 years, before dating apps arose).

I remain open to being persuaded. But I will simply no-sell the social pressure and shaming attempts. I do not care if you think it's 'cringe'; I do not hold your (or anyone else's) opinion of me in high enough regard for that to matter.

If you find this personally discomfiting, I'm 'sorry', but this is one of the few places on the internet where social pressure is not a defining variable in how our discourse proceeds. I want you to come back with an earth-shattering insight that proves this part of my worldview incorrect. I will not respond to attempts to prove my worldview 'icky,' or 'unpopular' or 'impolite' or 'just c'mon dude really how can you say that?'

You're convinced you're right. I'm convinced I'm right. I don't think we can go much further on from here, apart from butting heads and drawing the ire of the mods.

Hot button topics are hot, and get people hot under the collar. And when I'm hot, I react with heat.


Not to mansplain your own violent revenge fantasy back to you- but you do know that, coming from a middle-aged woman, an attempt at breaking @faceh's legs would almost certainly result in a reverse uno card getting quickly played against you? Or, for that matter, such an attempt against any man in this thread whom you wish to "[go] nuclear" on for being insufficiently reverential toward girls and women, but especially against one with an MMA background.

Granted, perhaps the hidden premise is that he's already cooked like Anakin by the housefire but with legs still intact before the knee-breaking. Either way, what a curious toughgal LARP.

Didn't we have a distinction between rhetorical leg-breaking and actual leg-breaking on here when it came to comments? Anyway, no of course I can't physically break his legs. But it's a hypothetical: we all recognise the harm done by physical violence, nobody would attempt to defend it by "but is it legal offence, is legal harm done?"

Same with shaming and humiliating a minor child by producing lewd and fake images of her. If you really need "but is it legal harm, otherwise it doesn't count", then I submit you are not part of civilised human society.

Legally, you committed several clear and well-established crimes (arson, battery, theft) that caused clear and well-established harm, in both the legal and moral sense (loss of the house, medical bills for the legs, loss of the money in the account).

Faceh never argued that there was no moral harm, and I doubt he believes there wasn't any. But what was done in the article doesn't clearly and neatly fit under any existing legal framework like revenge porn laws and defamation laws. I (and others) think it likely falls under defamation, but other legal precedents like Hustler Magazine v. Falwell make that unclear, so we'd likely need some court cases or new statutes to establish a clear precedent.

So instead of being snarky and sarcastic to faceh, you could make an argument like "I think this behavior falls under [existing legal framework] because X" or "I don't think it fits under existing legal frameworks but legislatures could make it illegal without running afoul of [the first amendment/existing precedent/whatever] because X." It really isn't hard; you're just choosing to react with snark and sarcasm instead of making an actual argument.

I think there's a plausible false light (and defamation per se) claim, given that the images in this situation were being shared and would themselves be illegal for her to produce. Even for deepfakes-of-adults, false claims of sexual promiscuity would fall into these categories. There are some theoretical examples where a Hustler v. Falwell-style defense would be relevant, in the case of a public figure where the deepfakes were clear parody, but that's pretty far from the typical case. But from a traditional law perspective you don't have to pull a Gorsuch to find a civil tradition against this sorta stuff.

Useless, though, since the kid who did it is judgment-proof. In theory, the state law would allow six months' imprisonment per act, but in practice that's really not how the juvenile court systems work, and even an adult doing this to another adult is more likely to just end up with a fine. And while both the boy generating the deepfakes and those passing it around (or even receiving it) could probably be charged with federal CSAM stuff, that's such a nuclear option it's extremely unlikely anyone would seriously even threaten it here.

Which is part of why the whole thing is such a mess.

Given that teenagers have been charged with the production, possession, and distribution of CSAM for sending nudes of themselves, CSAM charges in this case don’t strike me as anything close to nuclear, assuming the police can recover the images from Snapchat. The only thing I’m not certain of is whether they actually broke any CSAM laws. Is it actually illegal to draw a photorealistic, but fake, image of a nude minor? Perhaps more to the point, given how AIs usually seem to handle NSFW requests, is it illegal to put the face of a minor on top of the body of a definitely-grown adult? I have no idea, and it’s not something I’m eager to google.

Given that teenagers have been charged with the production, possession, and distribution of CSAM for sending nudes of themselves, CSAM charges in this case don’t strike me as anything close to nuclear, assuming the police can recover the images from Snapchat

I'm mostly using "nuclear" in the sense of "the biggest available weapon, with its resulting proportionality concerns". That style of prosecution happens, but it's pretty uncommon, even though there's good evidence to think the chargeable conduct happens more often than anyone wants to think about.

The only thing I’m not certain of is whether they actually broke any CSAM laws. Is it actually illegal to draw a photorealistic, but fake, image of a nude minor?

In the US, it's a federal felony under the PROTECT Act, unless the content also has serious literary, artistic, political, or scientific value. Enforcement is pretty rare, though, since the feds don't want the law to get another challenge like Ashcroft.

And while both the boy generating the deepfakes and those passing it around (or even receiving it) could probably be charged with federal CSAM stuff, that's such a nuclear option it's extremely unlikely anyone would seriously even threaten it here.

Thought experiment: what if one of the boys shows it to his creepy uncle by snapchatting it to him? Then creepy uncle is in possession of CSAM- even worse if he then darkwebs it to all of his other chomo friends.

Obviously, they're probably not going to charge the minor male as an accessory, right? But this is the "break containment" mega problem for cases exactly like this. Call it "deepfake laundering": kids unwittingly doing stuff that, in the realm of adults, would be 100% illegal.

I had some thoughts about this back when that DignifAI thing was doing the rounds:

Unlike editing a photo of Trump so it looks like he's riding a skateboard or whatever, I don't think it's hard to understand why editing Alice/Bob's publicly posted photo to make it look more sexualised than the original is crossing a line: there's a significant possibility that people might mistake the edited photo for the genuine article, and Alice/Bob will take a reputational hit, as people will assume that they are the kind of person who shares thirst trap photos for public consumption. Even if the photos are obviously fake (as in the recent Taylor Swift "deepfakes", which look more like the kind of stylised fetish fanart which has been around for years before LLMs were a thing), I think it's still demeaning to reduce a real person to the status of a sex object without their consent.

I thought DignifAI had legs, but of course the gooners demand the opposite service.

I think it's still demeaning to reduce a real person to the status of a sex object without their consent.

Reputational harm is covered under libel/slander/defamation laws, but does require other people to actually believe the falsehoods that someone published.

So exactly how much modification would it take before it's no longer 'believable'? What if they stylize it in a different art style?

I keep pointing out that with AI, whatever guidelines you put in place, the AI can tiptoe RIGHT up to them and stick a single pixel over the line.

Ultimately this is also why I don't see how Onlyfans continues to exist as a business model for flesh-and-blood women after this year.

Ultimately this is also why I don't see how Onlyfans continues to exist as a business model for flesh-and-blood women after this year.

I found a couple of fully AI-generated Insta accounts, some with moderately convincing short videos.

Unfortunately for them, there are still tells. For example, in one (I can find the link) the model's birthmark kept moving around her body. Or the lighting ends up a bit off.

Ultimately this is also why I don't see how Onlyfans continues to exist as a business model for flesh-and-blood women after this year.

That one is easy.

Ultimately, it'll become an even more premium service. Gooners will pay extra for some sort of cryptographically verifiable proof of realness. Setting up a digital chain of custody from a digital camera to the actual final presentation on OnlyFans or another site wouldn't be too arduous, and (markets in everything) someone will create a service to do just that.
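For what it's worth, the core primitive here is simple. Below is a minimal sketch, assuming a signing key held in the camera hardware and using Python with the pyca/cryptography library; all names and the flow are illustrative, not any real service's API. The camera signs a hash of each capture, and the hosting site verifies that signature before labeling the content an unedited capture:

```python
# Hypothetical sketch of a capture-to-publication chain of custody.
# Requires the pyca/cryptography package; names here are illustrative.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# On the camera: a key pair, ideally baked into tamper-resistant hardware.
device_key = Ed25519PrivateKey.generate()
device_pub = device_key.public_key()


def sign_capture(image_bytes: bytes) -> bytes:
    """Sign a hash of the raw sensor output at capture time."""
    return device_key.sign(hashlib.sha256(image_bytes).digest())


def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    """On the hosting site: accept only bytes that match the device signature."""
    try:
        device_pub.verify(signature, hashlib.sha256(image_bytes).digest())
        return True
    except InvalidSignature:
        return False


photo = b"...raw sensor bytes..."
sig = sign_capture(photo)
print(verify_capture(photo, sig))          # True: untouched capture
print(verify_capture(photo + b"x", sig))   # False: any edit breaks the chain
```

The hard part isn't the cryptography; it's trust in the endpoints. If the signing key can be extracted, or the 'camera' is pointed at a screen playing AI output, the signature proves nothing, which is why real-world efforts in this direction (the C2PA/Content Credentials standard, for instance) lean on hardware attestation.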

Saying that AI porn will destroy "regular" porn fails because it relies on the same logic as "porn will destroy the market for webcam girls / strip clubs / prostitutes." It isn't all about the simple visual arousal, but about a parasocial/pararomantic/paraintimate relationship.

This is where my model of the goonbrain really fails.

If you're going to such absolute lengths to be 'certain' that the girl in question actually exists, surely it's better to pay for in-person companionship?

It doesn't parse, for me, how you can require a physical person actually exist for you to get off to the content, but NOT also inherently prefer that person be physically present. Which obviates the fear of it being faked, entirely.

Although I guess I can imagine a guy who is deathly afraid of getting arrested for soliciting a prostitute which drives him to avoid paying for sex.

Streamers/OnlyFan girls of various stripes tend to be much cheaper than prostitutes/escorts.

...I think. I'm sure someone can trot out an example of a rather ludicrous OnlyFans where the pricing is absolutely stratospheric, but the last time my morbid curiosity led me down that particular rabbit hole, that was my general take on the matter.

I'm gonna say it's comparable to any other gig economy/gamified app. The basics are pretty cheap, but they rack up all kinds of extra charges wherever possible, and milking the whales is the real profit center.

Basically, buying an OF is marking yourself as a possible sucker just by putting the basic money down, and the ecosystem is going to do its damnedest to drain your wallet.

I haven't availed myself of prostitutes; the standard strip club experience annoys me enough with the constant upsell, even knowing that they can't actually promise the outcome you're hoping for.

Think of a goonbrained version of "never meet your heroes." If a goon actually pays for a prostitute or companion of some sort, he has to fully interact with a person for some length of time. That could go wrong. It could make him feel bad. The distance created by a screen creates a perception of control and the ability to "rage quit" if things go south. But they still want the knowledge that there's someone real over there.

Don't get me wrong, a portion of the "real people" online sex market will be taken by AI. Perhaps a pretty large portion of it. But these other niches are going to remain. If online porn and COVID didn't kill strip clubs (and they didn't), I don't see how they ever really go away.

But online porn did kill the dirty movie theater. AI porn seems like it'll kill actually filmed porn but not strip clubs.

Good point.

Does it kill webcam girls and the parasocially heavy OnlyFans accounts?

But online porn did kill the dirty movie theater.

VHS tapes / DVDs killed the dirty movie theater. Online porn killed DVDs.

While I get that, you can still find strip clubs if you're willing to leave the house (this might be the true motivation, the desire to never go out in public).

Reputational harm is covered under libel/slander/defamation laws, but does require other people to actually believe the falsehoods that someone published.

That's technically not a requirement for defamation per se, but I don't know enough about defamation law to say whether or not this counts as per se defamation. I do believe that legislatures could define it as defamation per se by statute if they wanted to, though.

Defamation per se still requires that there be an "accusation," which means, in this case, you'd have to prove that the kids were claiming these AI-generated images were real pictures, not fakes. It's no different than if they had drawn lewd pictures of her.

I imagine that unless it was an obvious parody or fake, or the boys were explicitly telling their friends that the images were AI-generated and not real, etc., most courts would presume that the images were being presented as real.

And once again, I'm not sure if it'd survive 1A scrutiny.

Revenge porn laws create some, don’t they?

Yes, but usually a component of those is that someone sent you those images, and DID NOT publish them themselves, so you're violating their 'privacy' interest by publishing these salacious images sans their consent.

Modifying an image they themselves published seems categorically different.

I suppose I imagine it's of a piece with something like false light, which operates along similar lines to defamation. The harmed party would obviously be the individual whose photo was edited, especially if there was an intention to pass off the photo as genuine. I don't think you could reach any edited picture with this doctrine, but I think you could likely get non-consensual NSFW edits. In the underage case, my understanding is that digital edits of minors can already be considered CSAM, so I don't know why this would be different.

AI-generated nudes of minors meet the legal definition of child pornography.

“child pornography” means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where— [...] such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

If the Supreme Court ruled that sharing AI-generated deepfakes of actually existing 14-year-olds is protected speech, there would be a constitutional amendment. The general public will not stand for this, regardless of what weird tech-libertarians think.

Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002)

They basically just re-passed the same law struck down here, and have been avoiding scrutiny by only charging people who were already incentivized to plea bargain. Apparently the idea is that if the law is around for a long time before it gets seriously challenged again (in a virtual child porn case not involving anything else), the courts will forget all about Ashcroft and convict. Might work.

Tbf, the PROTECT Act stapled on a Miller test. They're still trying to bypass the 'prevailing community standards' bit, but compared to the pre-Ashcroft version that just pretended the Miller test didn't matter, it's a much wider retreat than, for example, US v. Lopez.