
Culture War Roundup for the week of January 8, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


This thought occurred to me after Christmas this year, during a few activities where family members wanted to play a game and pulled up a YouTube video to demonstrate how it's done- and what we found was incredibly gross.

99% of modern kids will never have the ability to be forgotten- parents post their pictures online before they're able to give consent, including embarrassing and compromising photos. This includes YouTube videos of moms putting their daughters in compromising positions and posting them to the site.

Such videos are easy to find- the mom often speaks, and her prepubescent girls do a seemingly-innocuous activity. Those girls will always have those videos on a stranger's hard drive at best, or at worst, end up as data used for AI generation.

I'll note that I don't have a proposed solution to this. The laws on child porn already exist, but this content skirts the edge of acceptability. The girls are usually 10-13 and doing an innocuous activity- like playing pattycake or ring-around-the-rosie, usually in mostly-acceptable clothing.

When you stumble on one such video, you'll know what I'm talking about. It's the camera angles.

For this reason, I come to TheMotte- have you seen the videos I'm talking about? What do you think about them, and how would you evaluate whether or not such content is okay to post online?

If you have kids, do you worry that some random is perusing Instagram for their photos, or is willing to train AI on them?

After seeing these things, I can't get them out of my head, nor can I come up with a reasonable solution.

have you seen the videos I'm talking about?

No.

What do you think about them

Nothing at all until you wrote about them; now I think... social media does all kinds of crazy stuff to people, and people do all kinds of crazy stuff with it, as always.

how would you evaluate whether or not such content is okay to post online?

Okay? I wouldn't find it okay for my wife to post such things with my daughter. But for others? Really quite their own problem. Once the information is out there, it's too late; the damage, if any, is done. You can't put that back in the bottle. I suppose you could punish parents for it as a deterrent, but that's state overreach in my opinion. It sucks for the children, but the world can't save children from their parents.

Also, in general, your post is very oblique and could stand to be more direct and explicit.

If you have kids, do you worry that some random is perusing Instagram for their photos, or is willing to train AI on them?

I have a child. I don't care if someone uses photos for sexual purposes; the same person could masturbate to them from memory after seeing them in public. If you're female, people will masturbate to you countless times without your knowledge, and they have never needed AI to do it. People are weird; that's life.

That still leaves the question of how 'normalized' you'd want it to become.

Presumably you wouldn't want a guy to just drop trou and start jackin' it right in sight of your kid.

And gazing at him/her creepily from the bushes while he does it is not a major improvement?

So I think parents end up conflating "jacking off at home to a sneakily taken photo" with "hiding in the bushes and jacking off," even though the former implies far less danger.

So let's just say that photos kept in the privacy of one's domicile are not harming anyone. How close in time and space should the masturbator be able to get to the object of his desire before we get too uncomfortable and want to shut down the behavior?

I haven't seen these, but the first thing that pops into my head by way of analogy is the way that close masculine friendship is now frequently coded as gay. Arab men holding hands? Gay. Two bros hugging it out? Gay. Telling your decades-long friend that you love him? Gay. Slap on the butt or other physical encouragement in sports? Gay. Just hanging out together? Believe it or not, still gay. Discouraging good, healthy forms of masculine love is a terrible consequence of everything being interpreted through the lens of believing any two guys could be gay.

Likewise, viewing any adolescent play that isn't the platonic visual ideal of utter asexuality as an invitation to pedophiles concedes far too much ground for fear of pathology. A couple preteens playing pattycake just isn't sexual, even if weirdos are capable of interpreting it as such.

I haven't seen these, but the first thing that pops into my head by way of analogy is the way that close masculine friendship is now frequently coded as gay. Arab men holding hands? Gay. Two bros hugging it out? Gay. Telling your decades-long friend that you love him? Gay. Slap on the butt or other physical encouragement in sports? Gay. Just hanging out together? Believe it or not, still gay.

John Woo films? Total sausage fests.

The focus on male friendships in Woo's films has been interpreted as homoerotic. Woo has responded to these statements, stating: "People will bring their own preconceptions to a movie .... If they see something in The Killer that they consider to be homoerotic then that is their privilege. It's certainly not intentional."

That was 1989.

I haven't seen anything like what your post mentions. But it reminds me of hearing about a few test lawsuits in which children sued their parents for invading their privacy by sharing essentially their entire childhoods online.

I wonder if in the future we'll have legal codification restricting parental sharing rights, or if it will segment along class lines like so many other things.

I also wonder what kids from streamer families like Chris Ballenger's will think of this 20-30 years from now.

I think there's a decent amount of precedent that would entitle the child to claim at least a portion of any money earned through the use of their image.

So the parent would be expected to keep a restricted account into which the earnings go, to be turned over to the child when they turn 18 (reasonable amounts could be deducted for the child's benefit in the meantime).

For 'mere' invasion of privacy I'm not sure what the damages would be.

Father here. Part of my job is to set her up for good partnerships. This requires her to be attentive and attractive, and to have a good social filter.

There will always be a fraction of people who are okay with jerking it to kids. I have no interest in adopting the Fast Bear Rule: making my daughter uglier/less available so pervs find a different girl. She will develop a filter and practice it like any other skill.

If I've done my job right, she won't have to worry about pedos. She just won't see them as legitimate options.

You cannot stop people from thinking gross thoughts. Moreover, in the near future they will be using AI to generate all they could ever want- many already do- and even if your children weren't in the dataset, it is diverse enough that something close enough to your children could be generated.

Focus on preventing real harm.

Such videos are easy to find- the mom often speaks, and her prepubescent girls do a seemingly-innocuous activity. Those girls will always have those videos on a stranger's hard drive at best, or at worst, end up as data used for AI generation.

I've not seen these videos, though I recall that maybe five-ish years ago there was a minor hubbub when some YouTuber brought this kind of thing to attention, and I watched a few videos that are probably similar to what you describe in this post.

In any case, this quoted paragraph touches on a common sentiment I've seen all over the place, which I still don't fully understand: the notion that these images being on some pervert's HDD is somehow harmful to the subject of the original image. At best, I could see the argument that if the subject were acquainted with the pervert, then the pervert's perception of the subject would be corrupted in an unfair way, but even that seems like a stretch. For strangers online, I'm not sure how to rationalize this; how does an arrangement of pixels that looks like me harm me if it's sitting on an HDD somewhere I don't know about, viewed by someone I've never heard of, who is most likely using it just to get his rocks off?

With AI generation, I think there's a more generic fear that we don't know how much insight AI will be able to gather from this data in the future, so there's a bit of an unlimited downside risk. But in terms of, say, modern diffusion models, I'm not sure there's anything that harms the subject either. Without intentional training, the model won't be able to recreate the subject's appearance based on their name, so there's just about no risk in terms of privacy. And if training on video featuring the subject's face causes the model to, by chance, recreate the face, then it's just going to be one of umpteen anonymous faces the model generates. And if perverts generate grids of pixels using AI that look similar to the subject (by chance with modern AI, but conscious intent wouldn't change anything) for the purpose of getting their rocks off while viewing the image, I don't see how the subject would be harmed.

I have little regard for the pedo-panic in the first place, and this is even more weaksauce compared to that.

If you're at the point where children doing entirely innocuous things, in modest clothing, is somehow a bad thing because you're worried some pedo will use it as jerkoff material, then we're at about the point where audio-visual recording of just about anyone and anything is off the table, and maybe even a ban on thinking about the children, you creep.

It is also entirely pointless to forbid it now, even for the ever-elusive concerns about it ending up in AI training data. You don't think there's enough out there already for people to make photorealistic artificial CSAM, in both photographic and video form? The cat is out of the bag, and while I'm sure there are some pedophiles who have a fetish for jailbait/"real" children, barring about 99.99% of parents from recording their kids and sharing it is so grossly overkill it's demented. You take a video of your daughter jumping on a trampoline and upload it to Insta? Well, about 6 frames can be construed as an "upskirt" shot, enjoy your ban. Discord is already banning people without recourse, including entire Discord servers, if a single still image (that bald dude munching popcorn, used as a reaction image) is shared, because their heuristics recognize it as a frame from a flagged CSAM video.

The reasonable solution, as far as I'm concerned, is to not care, or at least find something more concrete to worry about.

Small correction: that bald-guy reaction image was used to cover up the actual image, which is also just someone eating popcorn, but the second one does trigger an instant account ban.

Thanks for clarifying! I'll take your word for it, since I am modestly attached to my Discord account and don't desire to find out the hard way haha.

The responsibility falls on the parents. Just as they're responsible for feeding and clothing their kids and deciding what schools/clubs/activities to send them to, they're just as responsible for what they choose to put online. Any personal videos you put out that you don't want strangers to see should be private, with access heavily restricted.

People can and will sexualize anything and everything- see rule 34. This is a touchy subject, but I'm not sure we can or should do more than we already do, which is banning any explicit sexual content of children and socially ostracizing those with that kind of desire. You can't stop or control what people think, and I draw the uncomfortable line at turning those desires into actions. If someone is just engaging in masturbation in the privacy of their own home, then, while disgusting, it isn't doing any actual harm. But if they start compiling videos to make it easier for others to see, or invite others into their fetishes, or start reaching out to the parents/child, then they are engaging in actions that actually have an impact on the person, and punishment should take place there.

Those girls will always have those videos on a stranger's hard drive at best, or at worst, end up as data used for AI generation.

Did you mean the other way around? I think I would be mortified if they were directly captured on some internet stranger's hard drive; AI generation not so much, since the output is not the same as the input. Maybe you meant deepfakes specifically- that is something I haven't come to terms with myself, since I haven't given it much thought yet.

My opinions are mostly the same as yours, but there is a potential issue in that it's difficult to exist on the internet with literally no presence.

That is, pervs who never interact with children directly are still clicking on and watching certain videos that fulfill their criteria for desirable content, which boosts the metrics on those videos according to the website's algorithms, be it YouTube or Instagram or something else. This both makes said content more visible to other people and provides positive feedback that this is the sort of content that becomes popular. People who care about being a "successful influencer" pay attention to popular content from other people, and also to the popularity of their own content, and are more likely to replicate things that were successful. In a certain respect it's sort of like AI training, but in other people's brains.

Therefore, an army of pervs attempting to be stealthy but still being caught by the algorithm ends up incentivizing young girls to produce less appropriate content, even if the girls themselves don't realize why said content is popular.

I haven't seen these videos myself, but I've heard people complaining about them, not unlike yourself. And like you said, nothing about the videos is explicitly sexual. It doesn't go full pedobait like Cuties. But it's still uncomfortably sexualizing. From some other complaints I've seen, the dead giveaway is the comments, usually full of gross remarks by pedophiles ranging from plausibly deniable to 4chan party van. Also, foot fetishists? So many foot fetishists. I guess they can't all work at Nickelodeon.

The other tell is the recommendations on those videos. There is a rut worn a mile deep in the algorithm that assumes if you like that video, you must love other gross shit that sexualizes minors.

Personally, we keep all photos of our daughter off the internet. No social media, no approval for school or businesses to post them, just absolute zero tolerance. Partially because we both have family that is unwell, unsafe, obsessive and might not leave us or our daughter alone. Also for all the usual tech-paranoid reasons about AI and corporations creating and owning a simulacrum of your soul.

I'm kinda with you, although it's not a blanket ban for me. The issue beyond sexuality is that the internet is forever, and kids cannot actually consent to having their image and activities uploaded. What's cute at ten might be a problem at twenty or thirty when they're trying to build their lives. Businesses can and do look at your Facebook. A lot of women will look at social media before going on a date. And whatever images and posts they find will form an impression. Every kid, myself included, has an edgelord phase, where they think they're super smart and cool because they dress weird and hold a lot of weird opinions (often quite loudly). For anyone born after 1990, that phase is online, ready to bite them should the wrong person find it.

I've not seen these videos. I'm not totally sure how I'd feel about them; I suspect having a kid is going to change some of these feelings, but I appear to lack the ick factor about having my likeness used to train AI. As for the softcore CP stuff? I'm overwhelmingly disgusted by pedophiles, but I don't really think there is a way to prevent them from ever even seeing young children. I may still not like the idea of there being a lot of content on the open web about my future children, but it's not really because pedos might find it.