This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

From a legal standpoint, what is the theory for the 'harm' caused in this instance? And to whom?
Liability of any kind usually rests on the idea that someone's interests were injured and in need of redress.
We are able to statutorily 'create' interests (the Americans with Disabilities Act did this, for instance) but I think we'd smack into 1A issues trying to make it completely illegal to post manipulated images of people who... are already posting images of themselves online.
Most obvious angle is copyright/IP, but they're still sorting that out for AI works.
I'd kinda love for them to go at it from the obscenity angle. Because that would also suggest that the women posting thirst traps are doing something wrong too.
I don't know about legal harm, but my ideal solution to this would be "The girl who punched the asshole boy gets a finger-wagging, and the boy gets told he had it coming and stop being a little shit."
Obviously, that's too old school and common sense for a school to do.
Photoshopping a woman's face onto a pornographic image has long been understood to be a shitty thing to do and possibly actionable. But AI-generated real-person-porn is probably just something society is going to have to get used to.
In a world where social shame was still effective, doing it would be pretty damning and would probably result in ostracization. It's not clear what one has to do to 'compensate' for the situation, though.
Similar to being a peeping tom, a subway groper, or anything else that intrudes on people's strongest-held social boundaries, even when the harm inflicted is de minimis.
But the problem is that shame would also kick in for stuff like a young girl hyping up her debut on OnlyFans once she turns 18 (link is mostly SFW but you'll see some thirst trapping). The puritanical ethics required here would condemn both the voyeuristic act and the exhibitionist act.
It's rather schizophrenic that there's basically unlimited tolerance for (adult) women to produce pornographic content of themselves, but shame is still heaped upon the consumers, as if these weren't both inseparably linked and necessary components of the "empowerment" equation here.
Like I said before, worst of all worlds.
I've already said that I am pro-slut shaming.
That said, there's a difference between someone willingly posting their nudes and someone not doing that. I think the OnlyFans girl would have a harder case to make about being harmed by someone generating AI porn of her, versus a girl who you think should just accept that all women are being punished for the OnlyFans girls.
There's a difference, but I get confused about the secular reasons why it's meaningful.
Sex and nudity are supposedly no big deal, especially if you're attending a pride parade, but it absolutely IS a big deal when it's someone's nudes hitting the internet, evidently. Shame, embarrassment, I dunno; it seems just taken as a given that it demeans the subject to be exposed in such a way. But if they publish those exact same images themselves, it is not demeaning?
There was a minor hullabaloo when I was in college involving 'Slutwalks' making it acceptable for women to wear skimpy clothes in public. And the "Free the Nipple" movement, which, among other things, tried to make it acceptable for female nipples to appear on, e.g., Instagram.
But then what I noticed is that almost no women (well, no attractive women) used this newfound power to actually go around in public topless or scantily clad, or post topless shots to IG. THEN came OF, where they could monetize it, and things REALLY got locked down.
So culturally we're told sex and nudity aren't a big deal, don't be prudes. But ECONOMICALLY, people (mostly males) spend billions upon billions of dollars to acquire sex and view nude women. So the only distinction I can really grasp is "am I getting paid for this or not." Which applies to many things, granted.
But where does that leave us?
According to whom? The leftists you hate so much? Yeah, some will make that argument, but even they won't say it's no big deal when it's non-consensual.
I don't think the pride parade demographic is particularly representative even of leftists.
I think pretty clearly it's morally objectionable to generate AI porn of someone who is neither a sex worker nor someone who wants AI porn generated of them. What the law says, I am less sure, but I expect the "revenge porn" laws will probably be expanded to include "Generating AI videos of your ex fucking a horse."
Well, the amorphous cultural norms brought on by the sexual revolution, more to the point.
Abortion? On demand. Contraception? Everywhere. Marriage? Optional.
As the biological consequences for having sex with whomever you wanted were abolished, so too were the social consequences.
I do not hate them, but I do not want to live amongst them.
And if people were better about choosing to live around people who genuinely shared their preferences and norms, many of the problems we're discussing in this particular case would evaporate.
I mean, the Muslims have solved it their way. Keep women covered up whenever they're in public. This "works" but, (as I'm sure you'd agree) this requires unacceptable restriction on female autonomy. If they only live among other Muslims, this tends to work "OKAY" (women stoned to death unavailable for comment).
We westerners have clearly NOT solved it in a way that is satisfactory, and we seem to make up the rules on the spot based on the relative status of the involved parties. I'm reminded of this every time I see a gym influencer post a video of some guy allegedly gawking at her body without her consent, and posting said video so that all the anonymous onlookers can... gawk at her body.
We CANNOT sustain a system where people are allowed to wear whatever they want without regard to its reception by onlookers, and the onlookers are only 'allowed' to enjoy the view if they're approved as acceptable by the wearer.
I don't want to live amongst people with such irreconcilable standards.
What about making unflattering caricature art of them? Or depicting them in (non-graphic) torture scenes? Or just changing their skin tone to a different color on purpose? This all seems like we're treating nudity (or even just softcore titillation) as special pleading.
At least with kids, I think there's a sustainable moral argument for why we don't want people sexualizing them, and for casting a suspicious eye at those who do.
And a question I've asked before: what if you find someone who is a very close lookalike, have them pose for nude shots, and post them, but never actually imply that was your goal? It was a common enough practice among pornographers back in the day.
It seems like this is basically suggesting norms of "look at whatever content I choose to post, but do not ever interact in any way I might find unpleasant."
Which runs extremely counter to how internet culture as a whole works.
That's why I specified morally objectionable. I don't have to think it should be illegal to think someone who does that is a shitty person. I can legally draw a picture of your wife fucking a horse and post it online. Even if you couldn't press charges, you'd probably want to punch me, and most people would think I deserve it.
I mean, I personally might challenge you to a friendly kickboxing match.
But if you also had a couple thousand supporters who would donate to your gofundme to support your trolling efforts, would you really be dissuaded?
I mean - in Eugene, Oregon, topless women are not terribly uncommon.
Yes. Consent and agency are necessary considerations in plenty of moral decisions/outcomes, sexuality included. It seems intuitive to me that the proactive decision to publish sexual content is a vastly different experience than having someone do it under your nose. Money need not apply.
A few (admittedly imperfect) analogies involving consent to illustrate my point:
- Donating a kidney vs. having one taken from you.
- Giving money away vs. having it stolen.
- Confiding a secret to someone vs. having that person broadcast it.
In all these cases, the former option is fine when done of one's own volition, but becomes a problem when another actor steps in. There are almost certainly philosophical papers that provide the premise-by-premise reasoning for this sort of argument, but hopefully you get the picture.
In a way, the body, particularly the sexualized body, is something of a possession. It can be given and taken away, shown and hidden. In some sense, it is a commodity that we have "ownership" of and many consider it the sacred domain of the individual. Sexual acts are high stakes, which is why it is so terrible when they are done against one's will and why it is considered a statement when someone takes bold public action with their body, for better or worse. You could argue that it is demeaning to publish sexual content under some sort of moralist (i.e. public sexuality is inherently debasing) or consequentialist (i.e. public sexuality leads to negative behavioral outcomes), but these arguments are complementary rather than overriding to ideas of agency and consent, in my opinion.
Well, I'm gonna have to drill down deeper into your logic here, which I can accept as facially valid.
What is actually 'removed' when the image is published?
Similarly with the secret: a breach of trust is a breach of trust, but unless you signed an NDA that expressly laid out how to calculate damages, your remedy is "never trust that person again."
Vs. losing a kidney or having your money taken, where you can absolutely point to the thing you lost and demand recompense for it.
I would not be arguing this if we were talking about actual physical rape of a person, which is clearly a violation of the concept of 'bodily autonomy.' I think taking a photograph of someone/something is inherently less of a violation.
Publishing a photo is a step beyond, I can absolutely grant. But as I alluded to before, the only actual dividing line I see for whether it's a demeaning violation or not isn't how the viewers receive and react to the image, but whether the original subject will get any money from its publication; it's not that they have lost something that was in their possession.
Like, consider a situation where a woman takes a nude photo, then fat-fingers it and accidentally sends it to the wrong dude. Then, mortified, she demands that he delete it and excoriates him if he comments on it approvingly. Or comments on it at all.
Is HE in the wrong if he views and enjoys this image that wasn't intended for his consumption? Or is SHE in the wrong for sending unsolicited pornography to an unwitting recipient? Is he obligated to delete it? What's the difference? Once it has been sent, how is she harmed by it arriving to the wrong person?
Because I think if we take your express logic to any extreme, it also becomes objectionable to imagine someone naked, especially if you derive pleasure from it.
Which is why the difference is meaningful in the first place. The sharing of intimate images is something women in general want to be paid for, so obviously doing that without permission is bad; meanwhile the other sex stuff doesn't have anything to do with that, so it is OK, and it works as a "haha, ur a prude" trap for people who can't or won't understand that the former dynamic is all that matters.
It's not any more complicated than that.
Yeah, but she also wants to completely dodge the reputation that comes with trading her sexuality for money.
And of course, there's still often a guy in the picture actually arranging for her to sell this stuff. In this case, OF clearly profits far and above what all but their top-performing producers do. And it's owned by a dude.
I'm not even denying that there's a fundamental transactional nature to all this stuff, even if you're in it for marriage and kids... it's just that it's now literally reduced to a commodity that gets haggled over, and people who 'have' to pay for it are viewed as losers, whilst anyone who is successful at getting attractive women to give it up without explicitly paying is either extremely crafty or is inherently high status.
Let me repeat that: sex is a commodity which can be purchased at various price points depending on the quality, so it's not hard to acquire in the abstract, but being able to acquire it without spending money somehow makes you a God amongst men. Our old-school ape-wiring seems somewhat at odds with our later 'homo economicus' upgrades.
Where's the harm in teenage boys faking nudes of a 13 year old girl without her knowledge or consent, indeed very much against her consent?
Well gosh gee whiz, why on earth are women such picky, fussy, hypergamous trollops who don't want to marry just plain ordinary guys? No wonder we need to force these women into marrying normal men who think nudes of 13 year old girls are just fine! Why is anyone getting het-up about this? Men like nubile women, young means fertile, and if she's old enough to bleed she's old enough to breed, right?
The boys are just doing what boys do! Boys are gonna be interested in girls of their own age! Boys will be looking at porn, and porn is fine and normal and in fact is good for society since it reduces rape and sex offending crimes (citation needed, of course!)
I know, I know: Amadan is going to hammer me for using sarcasm. But how else am I to react to "how is the girl harmed by this?" unless I get really angry and abusive, in which case I'm still going to get the mod hammer.
There is a lot of daylight between "it's not okay to make nude deep fakes of 13 year old girls" and "an individual was harmed when that deep fake was made," and even more from that to "someone should be sent to prison about it." None of these things actually imply the others.
Sigh. Yes. Come on, you're just trying to be provocative and you knew you were going to get reported.
It's not the sarcasm that's the problem. It's that you (and I mean you specifically) can make an intelligent and cogent argument for why this is bad behavior that should be discouraged. And you can even be (a little) snarky about it. But when you layer on the "gosh gee whiz"s and straw men obviously directed at the person you are responding to, of course you're dialing up the antagonism in a way that doesn't actually lead to productive engagement. You're just trying to say "You're a shitty human being unworthy of a respectful reply" without using those words.
The "big yikes not a good look chief" millennial slop has ruined online discourse because it feels so good to get off those snide, snarky little burns but it absolutely destroys any hope of good faith dialog. Do you want to talk to people and maybe enlist their sympathy and get them to see your side of things, or do you just want to score little zingers?
If I (and I mean me specifically) am unable to make intelligent and cogent arguments, will that be seen as a mitigating circumstance when moderating my comments?
Your logic suggests that you'd have no objection if a 13 year old girl published nudes with knowledge and consent.
Is that true?
Is consent the defining factor here?
I just want something on record.
I would mostly agree with this. It's just another example in the long list of examples of why the consent standard, when applied to sexuality and sexual interactions, is more or less useless.
I don't think it is useless, but man, people do not seem to really know what they mean when they say "consent." Worse still, they don't really know what they mean when they say they "consent" to some activity.
With sex in particular, the emotional valence of the moment, and its intensity, can shift by the minute. Then, reassessed after the act, someone may decide that some particular part of it they 'agreed' to in the moment was actually a violation.
That is one of the main problems with consent as a standard. It does not hold up under any of the hard cases.
And with sex IN PARTICULAR, there is no reasonable way to go back and assess whether consent was validly given or whether lines were crossed. I noticed this issue in law school. "Wait, how the f@&k do you establish evidence for lack of consent when it all happens behind closed doors?"
Unless you film the whole interaction, and that opens up the whole can of worms that we're discussing.
So, back in high school, someone made a fake photo of me and posted it in a classroom. It wasn't a nude, but it was political, depicting me as Stalin, as I was an outspoken socialist. I was outraged ("the photoshop is not even accurate! I'm a TROTSKYIST!"), and it definitely hurt my feelings and hurt me socially. Pretty clear case of bullying, but, in retrospect, it was pretty hilarious and a useful learning experience. Should that kid have been punished?
I don't think so, and I suspect you don't either (though I'm curious if my suspicion is right). Which shifts the question to, what is the difference between a nonsexual representation and a sexual one? I think, to many people who don't see harm, harm categorically isn't something that can be done with an image or words--sticks and stones can break my bones etc. If people start physically attacking someone, or destroying their property, in response, there is harm, but the harm originates from the physical act, not the instigating image. The introduction of a sexual element doesn't change this. (I'm speaking here in terms of conceptual framework, not legal definitions.)
That doesn't mean that the school shouldn't do anything about the boys--schools can and should regulate behavior above and beyond the minimal standard of harm. But the idea that actual physical violence should be punished less than images and words is weird to me, especially when school administrators had no actual evidence of the images.
The difference is that an image of you as Stalin is obviously a joke. Do you think you would have felt the same if instead they showed a video of you masturbating on a television in the classroom?
I would have felt very differently: I would have cared much less, quite honestly. "Oh, someone's a weirdo, anyone whose opinion of me changes because of it isn't worth caring about." And I'm not sure that making the AI-generated nude clearly a fake joke (giving her purple skin or whatever) would change anyone's opinions. I think the crux of the matter is that it's a sexual image, and we cordon off sexuality as requiring unique, almost spiritual protections around it.
What is the legal harm here, is the question that @faceh asked. Mind you, I disagree with faceh; I think the harm here is pretty obvious even from a legal point of view, since defamation per se usually covers allegations of sexual misconduct as well (but as I said elsewhere, I'm not a legal expert here and could be completely wrong). Please try to respond to the argument faceh is actually making instead of devolving into mocking and sarcasm.
I think there's a plausible false light (and defamation per se) claim, given that the images in this situation were being shared and would themselves be illegal for her to produce. Even for deepfakes of adults, false claims of sexual promiscuity would fall into these categories. There are some theoretical examples where a Hustler v. Falwell-style defense would be relevant, in the case of a public figure where the deepfakes were clear parody, but that's pretty far from the typical case. But from a traditional law perspective, you don't have to pull a Gorsuch to find a civil tradition against this sorta stuff.
Useless, though, since the kid who did it is judgment-proof. In theory, the state law would allow six months' imprisonment per act, but in practice that's really not how the juvenile court systems work, and even an adult doing this to another adult is more likely to just end up with a fine. And while both the boy generating the deepfakes and those passing them around (or even receiving them) could probably be charged with federal CSAM stuff, that's such a nuclear option it's extremely unlikely anyone would seriously even threaten it here.
Which is part of why the whole thing is such a mess.
Given that teenagers have been charged with the production, possession, and distribution of CSAM for sending nudes of themselves, CSAM charges in this case don’t strike me as anything close to nuclear, assuming the police can recover the images from Snapchat. The only thing I’m not certain of is whether they actually broke any CSAM laws. Is it actually illegal to draw a photorealistic, but fake, image of a nude minor? Perhaps more to the point, given how AIs usually seem to handle NSFW requests, is it illegal to put the face of a minor on top of the body of a definitely-grown adult? I have no idea, and it’s not something I’m eager to google.
I'm mostly using "nuclear" in the sense of "the biggest available weapon, and its resulting proportionality concerns". Those style of prosecutions happen, but they're pretty uncommon, even though there's good evidence to think the chargeable conduct happens more often than anyone wants to think about.
In the US, it's a federal felony under the PROTECT Act, unless the content also has serious literary, artistic, political, or scientific value. Enforcement is pretty rare, though, since the feds don't want the law to get another challenge like Ashcroft.
Thought experiment: what if one of the boys shows it to his creepy uncle and then snapchats it to him? Then creepy uncle is in possession of CSAM; even worse if he then darkwebs it to all of his other chomo friends.
Obviously, we're probably not going to charge the minor male as an accessory, right? But this is the "break containment" mega-problem for cases exactly like this. Call it "deepfake laundering": kids unwittingly doing stuff that, in the realm of adults, would be 100% illegal.
I had some thoughts about this back when that DignifAI thing was doing the rounds:
I thought DignifAI had legs, but of course the gooners demand the opposite service.
Reputational harm is covered under libel/slander/defamation laws, but does require other people to actually believe the falsehoods that someone published.
So exactly how much modification would it take before it's no longer 'believable'? What if they stylize it in a different art style?
I keep pointing out that with AI, whatever guidelines you put in place, the AI can tiptoe RIGHT up to them and stick a single pixel over the line.
Ultimately this is also why I don't see how Onlyfans continues to exist as a business model for flesh-and-blood women after this year.
That one is easy.
Ultimately, it'll become an even more premium service. Gooners will pay extra for some sort of cryptographically verifiable proof of realness. Setting up a digital chain of custody from a digital camera to actual final presentation on OnlyFans or another site wouldn't be too arduous and (markets in everything) someone will create a service to do just that.
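To make that concrete, here's a minimal sketch of what such a chain of custody might look like, assuming a capture device with an embedded signing key (roughly the C2PA "content credentials" idea). Everything here is illustrative, not any real platform's API:

```python
# Minimal sketch: sign images at capture time, verify at publication.
# Hypothetical flow; assumes the Python 'cryptography' package.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In a real scheme the private key would live in the camera's secure
# element, with the public key certified by the manufacturer.
camera_key = Ed25519PrivateKey.generate()
camera_pub = camera_key.public_key()

def sign_capture(image_bytes: bytes) -> bytes:
    """Sign a hash of the image at capture time."""
    return camera_key.sign(hashlib.sha256(image_bytes).digest())

def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    """Check that the published file is byte-identical to what was signed."""
    try:
        camera_pub.verify(signature, hashlib.sha256(image_bytes).digest())
        return True
    except InvalidSignature:
        return False

photo = b"...raw sensor bytes..."
sig = sign_capture(photo)
assert verify_capture(photo, sig)              # untouched file verifies
assert not verify_capture(photo + b"x", sig)   # any edit breaks the chain
```

The obvious weakness is the analog hole: point a certified camera at a screen showing AI output and the signature checks out anyway. So a scheme like this raises the cost of faking rather than eliminating it.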
Saying that AI porn will destroy "regular" porn fails because it relies on the same logic as "porn will destroy the market for webcam girls / strip clubs / prostitutes." It isn't all about the simple visual arousal, but about a parasocial/pararomantic/paraintimate relationship.
This is where my model of the goonbrain really fails.
If you're going to such absolute lengths to be 'certain' that the girl in question actually exists, surely it's better to pay for in-person companionship?
It doesn't parse, for me, how you can require that a physical person actually exist for you to get off to the content, but NOT also inherently prefer that person be physically present. Which obviates the fear of it being faked entirely.
Although I guess I can imagine a guy who is deathly afraid of getting arrested for soliciting a prostitute which drives him to avoid paying for sex.
Streamers/OnlyFans girls of various stripes tend to be much cheaper than prostitutes/escorts.
...I think. I'm sure someone can trot out an example of a rather ludicrous OnlyFans where the pricing is absolutely stratospheric, but the last time my morbid curiosity led me down that particular rabbit hole, that was my general take on the matter.
I'm gonna say it's comparable to any other gig economy/gamified app. The basics are pretty cheap, but they rack up all kinds of extra charges wherever possible, and milking the whales is the real profit center.
Basically, buying an OF is marking yourself as a possible sucker just by putting the basic money down, and the ecosystem is going to do its damnedest to drain your wallet.
I haven't availed myself of prostitutes; the standard strip club experience annoys me enough with the constant upsell, even knowing that they can't actually promise the outcome you're hoping for.
Think of a goodbrained version of "never meet your heroes." If a goon actually pays for a prostitute or companion of some sort, he has to fully interact with a person for some length of time. That could go wrong. It could make him feel bad. The distance created by a screen creates a perception of control and the ability to "Rage quit" if things go south. But they still want the knowledge that there's someone real over there.
Don't get me wrong, a portion of the "real people" online sex market will be taken by AI. Perhaps a pretty large portion of it. But these other niches are going to remain. If online porn and COVID didn't kill strip clubs (and they didn't), I don't see how they ever really go away.
Map of worldwide prostitution laws
While I get that, you can still find strip clubs if you're willing to leave the house (this might be the true motivation, the desire to never go out in public).
That's technically not a requirement for defamation per se, but I don't know enough about defamation law to say whether or not this counts as per se defamation. I do believe that legislatures could define it as defamation per se by statute if they wanted to, though.
Defamation per se still requires that there be an "accusation," which means in this case you'd have to prove that the kids were claiming these AI-generated images were real pictures, not fakes. It's no different than if they had drawn lewd pictures of her.
I imagine that unless it was an obvious parody or fake, or that the boys were explicitly telling their friends that they were AI generated and not real etc., most courts would presume that the images were being presented as real.
And once again, I'm not sure if it'd survive 1A scrutiny.
Revenge porn laws create some, don’t they?
Yes, but usually a component of those is that someone sent you those images, and DID NOT publish them themselves, so you're violating their 'privacy' interest by publishing these salacious images sans their consent.
Modifying an image they themselves published seems categorically different.
I suppose I imagine it's of a piece with something like false light, which operates along similar lines to defamation. The harmed party would obviously be the individual whose photo was edited, especially if there was an intention to pass off the photo as genuine. I don't think you could reach any edited picture with this doctrine, but I think you could likely get non-consensual NSFW edits. In the underage case, my understanding is that digital edits of minors can already be considered CSAM, so I don't know why this would be different.
AI-generated nudes of minors meet the legal definition of child pornography.
If the Supreme Court ruled that sharing AI-generated deepfakes of actually existing 14-year-olds is protected speech, there would be a constitutional amendment. The general public will not stand for this, regardless of what weird tech-libertarians think.
Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002)
They basically just re-passed the same law struck down here, and have been avoiding scrutiny by only charging people who were already incentivized to plea bargain. Apparently the idea is that if the law is around for a long time before it gets seriously challenged again (in a virtual child porn case not involving anything else), the courts will forget all about Ashcroft and convict. Might work.
Tbf, the PROTECT Act stapled on a Miller test. They're still trying to bypass the 'prevailing community standards' bit, but compared to the pre-Ashcroft version that just pretended the Miller test didn't matter, it's a much wider retreat than, for example, US v. Lopez.