This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

For a while I assumed that the big AI companies would permit porn generation eventually. They might want to act high and mighty now, but a time would come when they needed to show revenue, public perception be damned. Unfortunately, I was naive enough that my conception of pornography did not extend to the kind that could be legally problematic. Back in January, a bipartisan group of 35 attorneys general published a letter to Elon Musk asking for assurances that xAI was taking steps to protect against NCII and CSAM, though it's unclear if he ever responded. Last week, a class action suit was filed in the Northern District of California, alleging that xAI is responsible for producing nonconsensual nude images of three underage named plaintiffs. Yesterday, the Baltimore city attorney filed a lawsuit alleging violations of various city consumer protection ordinances.
It appears to me that these issues are probably solvable. Disallowing generative editing of user-uploaded images seems like a no-brainer. The CSAM issue is a tougher nut to crack, but the NCII issue seems to be what was getting everyone's attention, so if that goes away, I doubt the existing safeguards against CSAM would be found deficient. But the cat's out of the bag at this point; Elon fucked up, and now he's under the scrutiny of people who have the power to make life miserable for him. I imagine the class action suit will settle, but it will take years, and Elon is hard-headed enough that he might decide to make a statement out of it. The plaintiffs' attorney seems to have selected the worst possible place to file: I can't imagine you'll find a more tech-industry-friendly jury pool anywhere than in Northern California. The Baltimore case is on less solid ground, and the potential exposure is likely lower (I can't imagine it being more than a few thousand dollars per proven victim), so it may make more sense to fight that one, although all that would accomplish is proving that he didn't violate a specific Baltimore consumer protection ordinance.
Grok Imagine has indeed implemented something like this: uploaded images of real humans are extremely difficult to edit without triggering a censor, and videos may be right out. Unfortunately, even this gimped censor is severely limiting, and a full-on ban on generative editing or animation of user-uploaded images would take away something like 90% of the use cases for image/video generative AI. So much of using gen AI for images and video is trial and error and iteration (manual edits, then AI generation building on them, then manual edits of the AI output, and so on), often spanning multiple non-interoperable tools (e.g. generate the original image in Midjourney, edit it locally with Krita and Stable Diffusion, then upload it to Grok to animate). Without the ability to take arbitrary image input, a service can only serve as the origin point of a workflow, which doesn't amount to much, or as a simple time-wasting slot machine.
Personally, I do not like the notion that it could be illegal to arrange pixels in a certain way absent some other independently illegal action as a causal factor, and I hope that attempts to make it so fail horribly. Unfortunately, I'm not optimistic, as support for free speech and free thought doesn't seem very high in the USA right now. This is also why I'm still holding out hope that video generation on local hardware becomes "good enough" that private servers owned by companies like OpenAI, xAI, Google, etc. don't become effective gatekeepers of this sort of creative endeavor for the layman (or at least the lay enthusiast).
The reason it gets fuzzy around the edges is that you're dealing with knock-on effects and potential second- and third-party consequences.
In several countries, for example, you can be executed for manufacturing and selling narcotics but not for consuming them. In this case, if you're someone who's producing that kind of content, I can see the rationale for why you'd be in the crosshairs. People who consume the content... I can see why it's 'somewhat' different.
Is it? Even if you only disallowed it for uploaded images of people, that would cripple one of the most popular use categories for generative editing. My kids mostly aren't very interested in AI, but they were thrilled when Gemini reversed course and started allowing us to turn pictures of them (and their cats) into anime-cartoon style, bobblehead-doll style, kaiju-battle style, and so on. And if you disallowed it for all uploaded images, you'd be ruining one of the easiest effective ways to control the output of image generation in general.
I haven't seen that abbreviation before, so I'll explain it for lurkers: nonconsensual intimate images, a category that originally covered just revenge pornography (the public posting of privately shared explicit content) but has since expanded to include explicit and suggestive edits of publicly posted non-suggestive content, and sometimes even the mere spotlighting of unedited, publicly posted suggestive content.
Fascinating how "consent" came to be a universal moral solvent, and how, by extension, a lack of consent can reach much further than any sane person might think.
That's because "consent" actually means "waives Female Privilege to profit from sex after the fact", not "accedes to".
Women cannot legally consent to sex (or any sex-adjacent activity, actually; 'revenge porn' is yet more salami-slicing away of that ability) today in any Western nation (the US is, perhaps ironically, the least far down that path, but it is still criminal). South Park made fun of this with its consent forms, but the fact that those wouldn't hold up in court is actually the main issue here.
Sex with them is thus as potentially legally dangerous as it would be with a 7-year-old, the group "consent" was initially made up to protect. We can see this in how laws tend to get changed so that a man can't protect himself by demonstrating in court that the woman intended to discharge this and lied after the fact (i.e. the Jian Ghomeshi case). It's also why Western/feminist anti-prostitution laws criminalize only buying sex, not selling it.
In other words, invoking "consent" is a one-word fig leaf covering the fact that women are blatantly abusing privileges meant for the people they claim are the most vulnerable, and a way to claim that if you oppose this abuse, it's because you want 7-year-olds to be raped. It's quite effective, as you can see.
About that... Dave Chappelle would like a word.
Well, for some people anyway. Those of us born with the wrong parts are typically exempt from needing to provide consent, though for some reason we're still expected to obtain it.
Wow, that phenomenon is really reminiscent of how "sexual assault" became a catchall term used when the speaker wants to create the connotation of forcible rape in contexts where the reality is some sort of harassment of a sexual nature. Likewise how "sex trafficking" became a catchall used when the speaker wants to create the connotation of kidnapping women into sexual slavery in contexts where any level of prostitution took place. I've come to really dislike these overt attempts to engineer language that hide covert attempts to manipulate people into believing whatever one finds it useful for them to believe, so I personally just call these things "child porn" and "revenge porn."