Culture War Roundup for the week of June 3, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


New York Times’ The Daily podcast ran an episode Real Teenagers, Fake Nudes: The Rise of Deepfakes in American Schools. The premise is contained in the title — AI image generation apps can remove the clothing from photos, and teenage boys are using these en masse to make faked nude images of their classmates.

The overt message that the NYT hits repeatedly in the piece is that the girls in question are victims and that the boys have committed a crime. This is asserted again and again, implicitly and explicitly, without any justification. At one point a police officer opines that the images are CSAM (child sexual abuse material). (By the way, never trust a police officer to tell you the law; it's not their area of expertise.)

No, no, just all of it no. There's no crime here. There are no victims. There's no CSAM, because the images are not of children (notably, the AI models are trained on nude adults), nor did any sexual abuse occur in their production.

This is the moral equivalent of weirdos 40 years ago who would cut the heads off photos and paste them on pornographic images. Creepy? Yes! Deserving of social shunning? You betcha! But not a crime. Everyone in these girls’ lives who is catastrophizing this is doing them psychological harm.

If anything, this technology stands to decrease victimization in two ways:

  1. If AI nudes become widespread and indistinguishable from real ones (and they're close), the danger, penalty, and blackmail threat of having real nudes leaked drops to basically zero. (In my experience, many if not most of the women who would be the targets of deepfake nudes love sending out actual nudes cavalierly and are stopped from doing so only by concerns about exposure. If they could think one step past inventing a new form of victimhood to decry in the NYT, I believe they'd be tithing 10% to this technology's developers.)

  2. AI "CSAM" (reverse the first two letters and you have my opinion about this modern newspeak term and its relation to the perfectly fine term CP, which didn't need any replacement) holds the potential to completely destroy any markets in, or sharing of, actual CP, again if it's indistinguishable from the real thing. If it's indistinguishable, then even people who specifically want only the real thing will have to give up, because even they won't be able to tell the difference. It'd be like flooding a drug market with a version that's 100x cheaper to produce yet indistinguishable from the original. You would put the dealers of the original stuff straight out of business, even if there were still demand for their product on authenticity grounds, because that demand for authenticity can't be satisfied if nobody can determine authenticity.

But this just further reveals the character of the modern woke system of American "law and order" (and those are definitely scare quotes). It's not about actually improving the world, protecting anyone, or anyone's safety; it's about punishing people for being morally impure as considered by the privileged classes.

Pursuant to my second point, with a modest government investment in AI models built specifically for the purpose, and agentic AIs to spread the output around the usual chains of CP distribution, the US government could probably end, or at least curtail by 97-98% or so (a casual estimate), the genuinely criminal distribution of actual CP by drowning it in mostly if not entirely indistinguishable AI forgeries. No living, breathing, sentient child (or, again, at least 97-98% fewer) would ever have to be sex trafficked or exposed by the production of such material again. Those who already have been would, much sooner than would occur naturally, have the memories of it buried under hundreds of pounds of digital disinformation. (It is worth noting that every time somebody is caught with CP featuring a person known to the US government, that person has an opportunity to get a payout from the confiscated assets etc. of the convicted, with the most famous "CP stars" sometimes making hundreds of thousands of dollars a year off of this. So, perversely enough, they may not like this change. Presumably most would be relieved, however.)

They can't/won't do this, though. Why not? Because the people who are inclined to like CP might still like the new, AI-generated stuff, or might even think some of it is real, and still masturbate to it. Their filthy little perverted minds would still be free to get off with impunity (if not better than before, with a state-of-the-art government AI pumping out content catering to their every fantasy), and that's the real crime here: their corrupt pleasure and satisfaction, even if it harms no one, not what happens to real flesh-and-blood children or anybody else.

And it's the same with kids generating nudes of their schoolmates. There's no actual analysis or consideration of the boundaries of freedom of expression, private fantasizing, the balance of rights between people, etc. involved. They're dirty little "incel chuds" or some equivalent, as proven by the fact that they've done something to offend a female, and that's it. (And of course the likely general unattractiveness of the nerdy guys who have adopted AI nudification technology this early is a major factor. If it had been only attractive guys found with this technology, there would be no NYT article. As usual, it's not about principles; it's about the fact that, as the famous graph shows, many if not most women are statistically illiterate (at least in this particular area) and thus consider 80% of men to be below average and therefore unworthy of the baseline of respect and consideration. Thus the fact that these men have sexual urges at all is an abomination to women, something to be policed as forcibly as necessary (unless money can be made from them on OF).)

CSAM is one of the least rational areas of politics.

In the dark ages before the sexual revolution, there were all kinds of sexual deviants against whom upstanding, proper citizens could unite. Gays, interracial couples, unmarried women having sex, kinky people, people using birth control.

Today, most of these targets have been swept away by a big wave of sexual tolerance. Saying "it is wrong to have sex before marriage" makes you sound like a cringy old person.

However, we have also established that adults having sex with kids is bad because it causes severe psychological issues for the kids.

So pederasts and pedophiles become the lightning rod for most of these innate drives to police the sexual relations of their neighbors -- which did not magically disappear.

This is obviously a very emotional topic, and such topics often allow you to score big political wins. Under an evidence-based system, the focus would be on preventing the actual sexual abuse of children, both by exclusive pedophiles and by other men who act opportunistically. This would entail de-stigmatizing pedophiles who have not committed any sexual offenses against kids (which in turn would make them more willing to risk therapy, which would reduce the odds of them becoming child abusers) and trying to get the shared social environment of both perpetrators and victims to speak out if they suspect sexual abuse is going on.

CSAM would be treated like snuff videos. Commissioning a snuff video is commissioning a murder and should be punished as such; paying for them should be a felony to discourage their production, and if you want you can also criminalize distribution and possession. But if half of your homicide department works on snuff-video possession cases, then I would argue that you have your priorities wrong -- most murders do not happen for the creation of snuff videos, nor does their consumption precede most murders. Fake snuff videos lack the thing that makes the real ones immoral in the first place: a victim. Even if you want to regulate horror movies, it would be a good idea not to simply classify them as snuff.