This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

I tried using Sora for about a month at the end of last year, but I had to stop due to getting banned. Grok Imagine wouldn't ban me, so I've been using that instead. My wild guess is that a social media platform based entirely around AI generated videos like Sora can only exist in a sustainable way if it's explicitly for erotic/pornographic material - there's simply not enough demand for creating or viewing AI generated videos that aren't in that category to get enough users and views to pay for the generations.
I have mixed feelings, since Sora was clearly much better and more flexible than Grok Imagine, and I would've loved to see it develop further. But at the same time, the lower censorship in Grok, and xAI's general attitude towards censorship versus OpenAI's, makes me think improvements in Grok are more likely to bear fruit. Of course, without Sora around, xAI has less reason to improve Grok... And Grok Imagine is also still censored, which isn't great, but it's the least worst, at least. In the long run, I'd hope that local video generation will be "good enough," but that'll probably require a world where dual 5090s with 64 GB of VRAM are considered a quaint little living-room computer for sending emails and running old games at a tolerable 25 fps, which I'm guessing is within two decades.
The Grok app no longer allows me to generate videos. It went from 50 a day (!) to zero. I have to pay $30 a month for Super Grok.
Thing is, on desktop I can still generate videos (though it's unclear if they're still so generous as to give me 50) with my $13/month Premium X membership. Go figure. It's interrupted my mostly-mobile "flow," but at least I'm not cut off completely.
Lol. What were you doing on the platform?
That's why consumer AI will almost certainly never have broad-based appeal. I mean, let's just be honest about what the real use cases are… 4chan is similar to what most people would be using it for.
If you ask it questions related to mental health, it'll refuse to answer and tell you to seek professional help. If you ask it for advice about interpersonal relationships, it'll say it won't aid you in manipulating other people (???). If you ask it questions about fixing a car, it'll give you straight-up misinformation.
I’ve noticed it so many times. If you try testing the models by asking anything that involves an edge case, it’s either the case that the consequences are too large where it concerns returning a firm answer so it’ll refuse to do it, or it refuses to participate in supplying any information that has the potential to be abused. So what does that really leave you with? A cool plugin for some things that also has neat alternative functionalities as a toy that can generate images. Trillion dollar industry? Call me skeptical.
I think a good litmus test for these systems is to administer online IQ tests to them. If a model can't get a perfect score with low response times, consider it baloney. A cheap metric, but a viable place to start, and that's still an 'extremely' far cry from getting some AI system to actually innovate or do something creative.
None of this is true? I've literally used it for all 3 of those things recently.
If you have any prompts for me, I will run them through any model of ChatGPT you like (I have pro).
Depends on how you ask it, I suppose. I've run into all 3 on multiple occasions. I'm also not using ChatGPT.
Evidently, trying to produce videos that OpenAI disapproved of. More precisely, IIRC, I was inspired by some videos I saw on Sora where, apparently through some clever prompting and/or iterations, the user had managed to generate and share a video of a woman doing yoga, shown from suggestive angles. I was experimenting with doing the same when I got the ban email. Grok Imagine has lines as well, as I alluded to before, but it places its lines very very far from where Sora did.
For a while I assumed that the big AI companies would permit porn generation eventually. They might want to act high and mighty now, but a time would come when they needed to show revenue, public perception be damned. Unfortunately, I'm naive enough that my conception of pornography did not extend to the type that could be legally problematic. Back in January, a bipartisan group of 35 attorneys general published a letter to Elon Musk asking for assurances that the company was taking steps to protect against NCII and CSAM, though it's unclear if he ever responded. Last week, a class action suit was filed in the Northern District of California, alleging that xAI is responsible for producing nonconsensual nude images of three underage named plaintiffs. Yesterday, the Baltimore city attorney filed a lawsuit alleging violation of various city ordinances involving consumer protection.
It appears to me that these issues can probably be solved. Disallowing generative editing of user-uploaded images seems like a no-brainer. The CSAM issue is a tougher nut to crack, but it seems like the NCII issue is what was getting everyone's attention, so if that goes away then I doubt the existing safeguards against CSAM would be found deficient. But the cat's out of the bag at this point; Elon fucked up and now he's under the scrutiny of people who have the power to make life miserable for him. I imagine the class action suit will settle, but it will take years, and Elon is hard-headed enough that he might decide to make a statement out of it. The plaintiffs' attorney seems to have selected the worst possible place to file, as I don't imagine you're going to find a more tech-industry-friendly jury pool anywhere outside of Northern California. The Baltimore case is on less solid ground, and the potential exposure is likely lower (I can't imagine it being more than a few thousand dollars per proven victim), so it may make more sense to fight that one, although all that will accomplish is proving that he didn't violate a specific Baltimore consumer-protection ordinance.
Grok Imagine has indeed implemented something like this, where uploaded images of real humans become extremely difficult to edit without triggering a censor, and I think videos might be right out. Unfortunately, even this gimped censor is severely limiting, and a full-on prevention of generative editing or animation of user-uploaded images would take away something like 90% of the use cases for image/video generative AI. So much of using gen AI to produce images and videos is about trial and error and iteration (manual edits -> AI generation building on them -> manual edits of the AI generations -> further AI generation -> etc.), often spanning multiple non-interoperable tools (e.g. generate the original image in Midjourney, edit it locally using Krita and Stable Diffusion, then upload it to Grok to animate). Without the ability to take arbitrary image input, these services could serve only as the origin point of the workflow, which doesn't amount to much, or as simple time-waster slot-machine generators.
Personally, I do not like the notion that it's possible to arrange pixels in an illegal way absent some other independently illegal action as a causal factor, and I hope that attempts to make it so fail horribly. Unfortunately, I'm not hopeful, as it seems to me that support for free speech and free thought isn't very high right now in the USA. This is also why I'm still holding out hope that video gen on local hardware will become "good enough" that private servers owned by companies like OpenAI, xAI, Google, etc. don't become effective gates for this sort of creative endeavor for the layman (or at least the lay enthusiast).
The reason it gets fuzzy around the edges is because you're dealing with the knock-on effects and potential second- and third-party consequences.
In several countries for example you can get executed for manufacturing and selling narcotics but not for consuming them. In this case if you’re someone who’s producing that kind of content, I can see a rationale for why you’d be in the crosshairs. People who consume the content… I can see why it’s ‘somewhat’ different.
Is it? Even if you only disallowed it for uploaded images of people, that would cripple one of the most popular use categories for generative editing. My kids mostly aren't very interested in AI, but they were thrilled when Gemini stopped disallowing and started allowing us to turn pictures of them (and their cats) into anime-cartoon-style and bobblehead-doll-style and kaiju-battle-style images and so on and so on. If you also disallowed it for all uploaded images, then you'd be ruining one of the easiest good ways to control the output of image generations in general.
I haven't seen that abbreviation before, so I'll explain it for lurkers: nonconsensual intimate images—a category that originally was just revenge pornography (public posting of privately-shared explicit content), but now has been expanded to include explicit and suggestive edits of publicly-posted nonsuggestive content, and sometimes even mere spotlighting of unedited publicly-posted suggestive content.
Fascinating how "consent" came to be a universal moral solvent, and by extension, a lack of consent can extend much further than any sane person might think.
That's because "consent" actually means "waives Female Privilege to profit from sex after the fact", not "accedes to".
Women cannot legally consent to sex (or any sex-adjacent activity, actually; 'revenge porn' is yet more salami-slicing away of that ability) today in any Western nation (the US is, perhaps ironically, the least far down that path, but it is still criminal). South Park made fun of this with the consent forms, but the fact that those wouldn't hold up in court is actually the main issue here.
Sex with them is thus as potentially legally dangerous as it would be with a 7 year old, the group "consent" was initially made up to protect. We can see this by how laws tend to get changed so the man can't protect himself by demonstrating in court that the woman intended to discharge this and lied after the fact (i.e. the Jian Ghomeshi case). It's also why Western/feminist anti-prostitution laws only criminalize buying sex, not selling it.
In other words, invoking "consent" is the one-word fig leaf to cover up the fact women are blatantly abusing privileges meant for the people they claim are the most vulnerable, and to claim that if you're opposed to this abuse it's because you want 7 year olds to be raped. It's quite effective, as you can see.
Okay, let me put this question to everybody here.
Suppose women lose all sense of shame. They've sent intimate photos and videos to their boyfriend because that's how modern relationships work. Then they break up. Maybe it was a bad breakup. Former boyfriend is now pissed-off and is threatening them that unless they get back together, all their intimate photos and videos will be shared with everyone. Or maybe former boyfriend skips the threats and goes straight to uploading this on porn websites etc.
And the woman goes "Go right ahead, I don't care. Sure, send that full-frontal all-angles nothing concealed nude photo of me to my employer and my work colleagues. That video you wanted of me fucking myself with a vibrator? Yeah, send it to my granny. Hey, if you make any money off all that, remember to split it with me!"
That takes revenge porn off the table, because how can there be revenge if the blackmail element is removed? If women behave like men and are "I don't care if he's using my nudes to AI deepfake videos of me fucking dogs"? EDIT: I'm asking that in the context of the comments on here about "but what harm is really done if photos of women and children are used to create fake porn? why is this a concern? why are people worried about their images being used as masturbation material, if a guy wants to jerk off imagining a particular woman he knows, he can do that in his imagination so you can't stop it, and if you don't know why would you feel hurt about it?"
But would you like women to be like that? Or would it just be more "women are sluts who need their sexual autonomy removed and to be controlled by fathers and husbands" fuel for the fire?
EDIT: Oh, and gentlemen, if you find success with the ladies eludes you, could it be because you are neglecting your intimate hygiene? Luckily Lysol will solve that for you! Regular douching with something that makes you smell like coal tar down there will surely be irresistible!
And this is different than the general blackmail case... how, exactly (especially in the AI context)?
We already have laws to deal with this case (and in the cases where we've chosen not to have them/are prohibited from doing so, we've already made the tradeoff). You don't need another law like that, or at least, you wouldn't if this was actually about protecting people from harm and not just a case of
which is perhaps why you did exactly that in the first edit.
Given my assertion is "that's exactly what women themselves are agitating for here"? Of course, it's not really "controlled"; it's always legal for women to have sex (for reasons that have a lot to do with an echo of '70s sexual liberalism); it's just permanently illegal for men to participate in any way.
First it was just sex itself, then it was sex-adjacent activities, now it's pictures (real or otherwise) of it. Salami-slicing.
Do you honestly not see why using pictures of real people to create sexual images might be offensive, even if the woman in question was sexually active? Would you not care if someone used a photo of you to create something like that and distributed it? You can call it a figleaf, but unhappily there are real guys out there who would indeed use a photo of the neighbour's four year old to create images of that child naked and sucking cock, and pass such images around.
Perhaps you don't object because if you got an AI-generated image of some hot chick you know, or a famous woman, or that bitch who refused to go out with you when you were seventeen, and she is used for porn material you can jerk off to, you'd be quite happy to use it that way. Perhaps you wouldn't care if such images were created and disseminated of you, because what harm is done? You never really got fucked by a stallion in real life, who cares if the kinksters are using your beach photo to show you taking horse cock? Maybe you think the guy who persuaded a 12 year old into sending him nudes, then tried to blackmail her with those and she eventually committed suicide, did nothing wrong (some people did comment along those lines before). After all, she freely gave him those images, so it served her right if he showed the world what a horny little bitch she really was, yes?
Some people do care, though. Personally, I think any woman who provides nudes or the like for a boyfriend is extremely stupid, but the betrayal there is that these were supposed to be intimate images for one specific person in the context of a relationship, not to be shared around or used to do reputational harm. That is what feels the most hurtful.
I'm cynical. Of course I think "don't trust men, they only think with their dicks and are vicious when not getting what they consider their right to get laid" but some women don't feel that way - until they get slapped in the face with it.
I absolutely think it is offensive. I don't think society should protect women from being so offended because society refuses to protect men from it. Nobody gave a rat's ass when girls were distributing pictures of 5-year-old me naked in high school, using them to openly sexually harass me. They wouldn't even confiscate the pictures. Women demand gender equality, which means they can either grow a fucking thick skin and ignore it, like we expect men to, or kill themselves in shame, like the men who can't.
About that... Dave Chappelle would like a word.
Well, for some people anyway. Those of us born with the wrong parts are typically exempt from needing to provide consent, though for some reason we're still expected to obtain it.
Wow, that phenomenon seems really reminiscent of how "sexual assault" became a catchall term to be used when the speaker wants to create the connotation of forcible rape in contexts where the reality is some sort of harassment of a sexual nature. As well as how "sex trafficking" became a catchall term to be used when the speaker wants to create the connotation of kidnapping women into sexual slavery in contexts where any level of prostitution took place. I've come to really dislike these overt attempts to engineer language for the purpose of hiding covert attempts to manipulate others into believing things that one finds useful for others to believe, and I personally just call them "child porn" and "revenge porn."
Those kinds of things were always actionable under common law tort theories, and most of the discourse around them doesn't even go that far. The problem is that regardless of legality, they're scummy things to do, and they've only seen a lot of media attention in an era where it's easier to do them. It's not like you could do this stuff in the 50s without consequence; it's just that actually being able to do it wasn't really an option.