questionasker (User ID: 2119), 0 followers, follows 0 users, joined 2023 January 21 15:54:29 UTC


Maybe that should have the same moral or ethical implications.

In the version of the hypothetical where the AI actually can exactly recreate the way a person would look naked in a certain pose, using only a clothed photo of them in that pose as reference, we can agree that the information is 'actually' the same, though, right? One pixel at location x,y, with color #f0d190 is 'actually the same' as another pixel at the same location x,y, with color #f0d190, regardless of whether or not that pixel exists there because it was reverse-engineered by AI, or normally-engineered to be there as a result of being captured via digital photo.

This matches my intuition. Someone generating deepfakes they just keep to themselves? I've got no problem with that. But someone distributing those deepfakes, possibly (but not necessarily) passing them off as real, has the potential for harm.

I'm starting to think along similar lines. It seems like it's the actual distribution of the deepfakes that sets them apart in my intuition, and not even necessarily the distribution in and of itself. Distributing such images necessarily means they will be publicly available, and if they are publicly available, the depicted persons might learn that people are doing this (creating realistic porn of them, 'viewing' them naked in a realistic way), which is what typically seems to cause the depicted persons psychological harm. Since it's wrong to cause people psychological harm, this is what makes it immoral. I'm starting to think a similar distinction would lie between, e.g., masturbating while fantasizing about someone sexually (and keeping the fact that you did so entirely to yourself), and masturbating while fantasizing about someone and then telling that person you did so.

With deepfakes I guess it depends on how much verisimilitude something can have before you think it violates your "actual" privacy. If I have a deepfake of Angelina Jolie that, for whatever reason, has serious flaws and inaccuracies, have I violated her privacy in the same way?

This is the main thing I am trying to get at with the locker room/fantasizing examples. Current AI can inpaint nudity onto clothed pictures of people without necessarily producing serious flaws or inaccuracies. (Not to say it always succeeds at doing this, just that it can, reasonably often.) And training the AI on the actual person's breasts isn't required for the result to be highly similar to what they actually look like topless, at least for some women, considering that at least some people's breasts are visually similar to other people's breasts. Thus a person who has never consented to having topless photos of themselves anywhere on the internet can have topless images of them created to what is indeed a very high degree of verisimilitude to their actual naked form, using, e.g., pornstars' breasts as training data.

Technically, I suppose, the person operating the AI algorithm can't know whether the subject has, e.g., a mole on their chest. So maybe, because some technical uncertainty remains (without actually being able to look at a real topless image of the subject, one can't verify that the AI-inpainted nudity is highly similar), there is still some sense of privacy maintained? Because even if the inpainted nudity actually is extremely similar to their topless form, this isn't known to the person creating or viewing the deepfake?

Regardless, the pertinent fact is that the current level of technology makes it possible to get outputs, at least somewhat often, that the depicted persons themselves could or would mistake for real nude photos of themselves. This seems to me functionally very similar to, if not the same as, looking at someone changing/naked without their consent or knowledge. You're right in the sense that it doesn't imply other security concerns the way an intruder present in a changing room would, but I'm not sure that's what's actually wrong with/disliked about peeping toms. I feel like a significant amount of the dislike of the idea of someone seeing you changing is the fact that they then know what you look like naked (and maybe also the knowledge or likelihood that they are fantasizing about you sexually). E.g., most people would be mostly as opposed to a person using X-ray glasses, or more realistically a hole in the wall, to look inside their locker room while they changed, as they would be to someone, say, hanging from the rafters. I can't know for certain, though, at least personally, because to my knowledge I've never been the victim of any such situation.

I don't think we have an easy rule. I also don't know that this can/should be grounded in privacy. Maybe defamation concerns would be more viable?

Well, as far as legality goes, it seems like copyright is the main way people take down unwanted deepfake porn of themselves. Regardless, though, I'm less interested in the legality and more in what should or shouldn't generally be considered acceptable ethically or morally speaking, for which privacy, or violations thereof, and perhaps other things do seem like relevant concerns.

What if the AI is so good at being Photoshop+ that, using a picture of what you look like clothed, it is able to create a nude composed of the exact same pixels/information that would be present in an actual photograph you took of yourself in the same pose except naked? In that case I actually am accessing the information that you call 'your' information, which is to say, the information you agree it is wrong for me to access.

To pre-empt any disputes about how possible this is (although I'm sure it actually is possible), let's scale the capability of this AI back to a level of detail at least good enough to trick even the depicted subject into thinking it was a real nude of themselves. (This is where the technology actually is, right now. Boobs aren't exactly the most difficult thing to draw, especially at 512x512 resolution.) In this case, even if it's not the exact same information, it seems to me to be functionally the same information, for all intents and purposes. So is it okay for me to use an AI to access what is, for all intents and purposes, the same as information which is otherwise immoral for me to access?

So discussion of deepfakes should account for the fact that they aren't particularly novel.

Maybe. Couldn't it be true that these types of images were unethical to create/consume for as long as they've existed, and there just wasn't proper recognition of the problem?

Another commenter brought up that a significant reason why at least distributing deepfake porn is immoral could be that, by making the deepfakes openly accessible on the internet in order to distribute them, one makes it likely that whomever they depict will find out such porn is being made of them, and it is specifically the psychological harm inflicted by knowing porn like this exists of oneself that accounts for (or accounts for the majority of) why such porn is unethical to make or consume. This would also explain why previous iterations of 'fake nudes' weren't as hotly debated: they weren't as commonly distributed until now (perhaps because they weren't as realistic, and thus not as popular).

Thanks for the interesting response.

Women's feelings are not god.

I can maybe accept most of what you're saying specifically as regards the question 'should these images be illegal?' Perhaps making things like this illegal is a violation of the principle of free speech, or a slippery privacy slope, or a needless concession to emotional people, etc. That said, whether or not it should be made illegal, I expect that it will be legally harder if not fully illegal to do this kind of thing in the near future. But I digress.

Many others in the thread are also focusing specifically on the legality dimension, which I regret and for which I bear responsibility. I was generally more interested in talking about what is moral/ethical, and less so in talking about what is and should be legal, even though I indeed asked as much in the body of my post. Even if these things are not illegal, the morality of them is still hugely important, as it determines who gets 'cancelled' etc.

And to that end, in figuring out what is ethical or moral, I think feelings do matter. For example, I think it would be immoral in many situations to do something I knew would make a person "feel bad" just because doing it gave me sexual gratification, legality notwithstanding. If I were trying to design the fairest US government, I might not make deepfake porn illegal. But if I were trying to be as morally/ethically upstanding a person as I could be, there are plenty of things I shouldn't do that are still legal.

I'm of the relatively firm belief that it isn't immoral to fantasize about having sex with someone, even if they haven't consented to you having such a fantasy. I'm not sure what I think when it comes to making highly realistic porn of them. If you were Superman and had X-ray vision, would it be unethical or immoral to look into the women's locker room? If not, why does everyone seem to think it would be? If so, what's the difference between that and having a vivid, realistic imagination and using it for sexual purposes in the same way?

Another commenter prompted me to believe that a significant amount of how unethical it is lies in whether or not the deepfaked person knows about the porn being made of them, because knowing that it exists is what inflicts psychological harm on them. I think I agree with this. However, the women in the shower you're peeping into might not ever know that you've peeped at them, so is it not wrong to be a peeping tom (as long as you never get caught)? Teen coming-of-age movies from the 80s didn't seem to think so (the shift in attitudes between then and now might be pertinent to the discussion). Regardless, currently I do indeed think that, e.g., spying on the women's locker room would be unethical, and I think most people today would agree that it's probably wrong to do such a thing. This is the angle I'm really trying to disentangle here, the moral and ethical one, and less so the legal one.

Yeah I'm pretty willing to forgive the streamer guy specifically, especially considering your points as well as that I have little-to-no horse in the race. As to your other points:

This can cause psychological harm in a person — humans are not designed to see something like that, I don’t think it computes properly in the brain. An AI scene of being sexually victimized (in essence, arguably) is fundamentally different than making a photoshop with a person’s face due to the sheer realism.

By this do you mean to say that the main reason these videos might be unethical is that knowledge of the existence of the videos causes psychological distress in the people they depict, not necessarily because, e.g., the depicted people's consent has been violated?

So knowing that someone you know, that part of his mind remembers the scene when he thinks about you, is truly disgusting.

This example prompted me to think, though on a tangent only somewhat related to what you're getting at. I'm not sure the 'part of the mind thinking the AI video really happened' thing is what sets it apart. But I do think the knowledge of whether or not someone thought about you in that way is part of what matters. Whether or not someone made AI porn of you isn't relevant to you unless you know about it -- this fits with my intuition, because, re: the completely imagined sexual fantasies point, even though I and most people consider those benign, the calculation changes if person A, who masturbated to an imaginary fantasy of having sex with person B, then went and told person B that they had done so. Suddenly that becomes immoral/unethical to me in a way almost similar to the AI nude situation. So I think this might be getting at the distinction for me: what really matters most is whether people know this stuff is being made about them. And in the case of these popular female streamers, the fact that the pics/vids are being distributed basically means they are being forced to know that such content is being made of them. It would be like 10,000 weirdos constantly whispering in their ears that they jerked off to the thought of them naked, which is different from those 10,000 weirdos jerking it but not telling anyone.

The law, what is legal and what isn't, I suppose isn't what I'm interested in so much as what is moral/ethical. Plenty of countries have already at least de jure banned deepfake porn, and the US probably will too eventually. So my bad for including that question in the body of my post.

Yeah he apparently was looking at the porn websites on the same computer he uses to stream, despite being more than rich enough to afford a second computer/tablet etc. He left the tab open after viewing, went to stream the next morning, and then when he alt-tabbed between his game and some other application, the thumbnail of the website was visible for like 1/4 of a second in the alt-tab menu. It didn't even come out until 4 days after the stream took place because it took that long for someone to notice it by scouring the stream VOD, which they must have only happened to do by chance. Either way yeah, idiotic on his end to be looking at any kind of porn on his streaming computer, let alone unethical porn.

So is imagining what someone looks like naked/fantasizing about having sex with them a similarly non-coercive crime, then? Either way probably 'victimless' is the wrong word to use, but I'm not sure how much effect that has on my problem.

There doesn't seem to be a principled distinction between good impressions/drawings/etc. and deepfakes.

Maybe really good drawings of a non-consenting person's likeness having sex/naked are wrong to make as well, and should be illegal.

Has this group had any discussion regarding AI use in pornography, specifically 'deepfakes'? It's come out recently that a major up-and-coming Twitch streamer, 'Atrioc' (who was recently married and ostensibly very pro-feminist, though while funny these facts are technically irrelevant to the matter at hand), had been viewing porn 'deepfakes' of multiple major female Twitch streamers, including possibly his best friend's girlfriend (last part unconfirmed but highly possible). He's come out with an apology and it's a whole thing, but I'm sure this community is more interested in the moral/ethical questions therein than in internet-celebrity drama, so I won't bore you with it.

The following are my perspectives on a few of the potential questions regarding 'deepfake' porn, and AI porn in general. I'd love to hear what other people think about these perspectives, because my thoughts are currently very incomplete on the issue.

First and foremost, I have a strong intuitive feeling that it is deeply wrong, perhaps tantamount to some form of sexual harassment/assault (but of course the non-violent sort), to make 'deepfake' pornography of other, non-consenting people. For example, taking a picture from a celebrity's Instagram and using AI to transform it into a high-fidelity (but technically fake) nude picture of them seems functionally the same as, e.g., spying on them in the shower or when they're changing, which are actions I think we can all agree would be some form of wrong or illegal sexual violation (or perhaps we can't? you tell me). The way I think about this is by considering that a strong enough AI program would be theoretically capable of using a clothed picture of someone to actually reconstruct the exact way they look naked, which would be quite literally equivalent to the aforementioned situation, or to looking at them with X-ray glasses, etc., which again (I think) we and most people agree would be wrong. And so less-powerful AIs capable of doing something similar seem to be at least on that gradient of wrong, if not exactly as bad.

Furthermore, AI that actually transplants people's faces onto video depictions of sexual intercourse (which is ostensibly what 'Atrioc' was caught viewing) seems worse, or maybe just bad in a different way. I don't have a similar thought experiment to justify why I feel that way, but the wrongness of it is my strong intuition nonetheless.

However, I can also sort of see the argument, at least abstractly, that it's a victimless crime. On the other extreme of the spectrum, fantasizing in one's own imagination about the way people look when they're naked, or how it might feel to have sex with them, is not only generally recognized as a very benign behavior, but is also known as something almost everyone does, men and women both. Sometimes people do this even completely unconsciously, e.g. in their dreams. And what's the difference between looking at a very (or fully) realistic recreation of the way someone might look with their clothes off, and using one's own imagination to do so? What if one's own imagination were very vivid, and you had seen many naked people before, making your training data very good, so that you could reasonably expect to make a relatively accurate recreation of the way someone looked while naked, only in your own mind's eye?

The thing is, acknowledging these potential similarities between an action I find morally acceptable and one I find morally wrong still doesn't make my intuition about the wrongness of 'deepfakes' any weaker. I feel like there must be something I haven't considered yet, which is where I'm hoping you guys might have insight. The only distinction I've found somewhat convincing so far is that maybe the mass distribution via the internet is what makes it wrong? In other words, I find it less wrong (but still somewhat wrong) to make a highly/fully realistic nude of someone and keep it entirely on one's own computer than to make such an image and then distribute it online. This is especially weird because the former is an even more apt comparison to, e.g., peeping on someone in the locker room, which is obviously (?) wrong. So why does it seem more okay to me? Help!

I have a few potential explanations that I'm considering as candidates for the source of my cognitive dissonance here:

  1. Perhaps in reality none of the aforementioned actions are wrong. It's not wrong to spy on someone in the locker room, and so it's not wrong to use 'x-ray glasses' to see through their clothes, or use an AI to edit a picture to do functionally the same thing.

  2. Perhaps instead, in reality it actually is wrong to imagine or fantasize about what other people look like while naked. The reason this is so commonly accepted as benign is that it's so unenforceable to prevent. But if sexual mores are so arbitrary/constructed that something otherwise wrong can just be arbitrarily agreed upon as acceptable merely because it's unenforceable, how wrong can any ('victimless') violation of sexual mores really be said to be? And thus how wrong is the other situation, where one uses AI?

This kind of segues into 3., which is: perhaps in reality the ultimate cause of this dissonance is that modern-day sexual mores are completely stupid, so deeply incoherent that acceptance of any one of them will necessarily lead to cognitive dissonance when contrasted against some other. Is the solution to the 'deepfake' issue then to try to change our society's sexual morals/ethics into something more internally coherent?

None of these really address why I feel differently about 'turning a clothed photo into a nude' and 'transplanting, in a realistic way, a non-consenting individual's face onto an actor in a depiction of sexual intercourse.' I have no concrete ideas as to why the latter feels overall worse, yet also in some other (minor) ways not as bad. And the latter situation is what the whole controversy with the streamer is about, AFAIK. Very confused about all this.

What's right here, and why? What should even be done? Should 'deepfakes' be illegal because of these potential moral/ethical concerns? Should the act of making a deepfake be illegal, or just distributing it? (I think if we wanted to, we could make both of these things illegal. We might not be able to prevent anyone from making them, considering the AI cat is out of the bag, but it still might be worthwhile to have the illegality on the books if it really is wrong. In other circles I'm seeing claims that a ban would be unenforceable (motivated thinking?), but it seems trivially easy to functionally ban at least the distribution of 'deepfake' porn in a way that would almost certainly reduce the dissemination of such porn if not completely eliminate it, just as with, e.g., child sexual abuse imagery or zoophilia porn.)

I also see a lot of people in other circles being prompted by this discussion to argue about the ethics of AI image generation in general. I generally think this is basically stupid. The arguments claiming that AI image generation is tantamount to plagiarism (of the dataset images, I suppose) are all basically worthless as far as I can tell. But people who have bought into this line of thinking are now going as far as to say that, e.g., photorealistic porn (even that depicting completely synthetic likenesses) generated with AI is a sexual violation (of all the nude or semi-nude women in pictures in the dataset, I guess?). Either way, I am wholly unconvinced by these arguments and think they basically all stem from a bad understanding of how these AIs work, so I don't think I'm very interested in discussing this axis of the debate. But I mention it because this community sometimes surprises me, so if anyone here has a really strong argument for why this might make sense that they think I haven't seen before, feel free to mention it.

My read of the quoted comment is different than yours.

To your point that 'community' is overused to the point of meaninglessness in liberal circles: well, I think that's what's actually going on here. My perspective is that the quoted comment isn't really arguing for criminal justice reform because it will help mend communities in any real way. They're just pro-criminal-justice-reform, community effects notwithstanding, with a focus on rehabilitation and reconciliation between the criminal and victim (goals one could have for criminal justice reform without caring about 'mending the community'), and then at the very end of their comment they merely happen to name-drop 'community' because, as you said, it's become such a meaningless term in liberal discourse that they use it almost without intentional meaning.

However much this might be the case, you're making a point more about how misperceived their actions are than about how poorly received (the common perception of) those actions are in absolute terms, compared to each other.

Maybe both of them are misunderstood generally. The truth is, though, that even some of the worst interpretations of Rowling's 'misdeeds' are not considered as heinous as some of the best interpretations of Louis CK's. If we're asking 'Why is Rowling not as cancelled as Louis CK?', my point is only that this difference in perception is indeed a factor.

Interesting. I hope the trend continues.

Without meaning to move the goalposts: that a single one of the main trio of characters (and the perhaps overly socially conscious/goody-goody one at that) cares about slavery, thereby clarifying that in-universe it's something the characters could conceivably care about but for the most part just don't, is almost funnier.

Animals can't talk and most aren't regarded as sapient. On the other hand, house elves have basically human-equivalent minds in small bodies, and are also non-consensual and generally unhappy servants of humans. As another commenter pointed out, this injustice actually is addressed in the books (though practically in passing) when a socially conscious/activist main character starts an organization opposed to house elf slavery, clarifying that in-universe it is conceivable to view it as, e.g., worse than eating meat. It's just that most characters don't care. This is silly to get into, but again, I regard it more as odd/funny than anything.

Yeah, I mean, obviously he has some ideological allies. What I meant is that right now the CW split on trans issues is almost 50/50 at best, depending on region. Trans people/issues are not popular in many societies. Whereas Kanye's beliefs are touted by only a tiny proportion of the discourse, one widely regarded as crazy extremists. Yes, I'm sure Kanye isn't cancelled among adherents of the Nation of Islam or other antisemites. That isn't exactly saying much, though.

societies that are supposed to mimick medieval ones

Harry Potter is set in basically the modern day (the late 90s to early 2000s, to be exact), and the main setting is meant to evoke the experience of students at a mid-to-late-20th-century British boarding school, one perhaps a few decades 'behind the times' of the actual year during which the story takes place. I agree that fiction which mimics historical societies in its setting should try not to transplant modern morality onto that setting, but that is not the situation of Harry Potter. The existence of slavery, and the idea that a person who is basically a young millennial going through high school is so unbothered by the widespread slavery that exists in his world, actually is almost funnily bad.

Regardless of whether or not transwomen should coinhabit women's prisons, whether they actually do seems like a problem of minuscule ultimate importance.

Then why do trans activists push for it so hard? Just concede it then.

I should clarify that what I mean is that it seems like a problem of minuscule ultimate importance to a person who claims to care about women's issues generally. It's clear why this would be a significant issue for trans activists, but not clear to me why it should be a problem of similar magnitude to women's rights activists in general, as Rowling claims to be.

To put it another way, trans activists care about issues that trans people face. They believe that one of the main issues that trans people face is the fact that elements of society do not recognize them as their chosen gender. They believe that this lack of recognition is expressed in many ways, for example in the prison system, via being compelled to inhabit the prison of their biological sex rather than their chosen gender. They might also believe that i.e. trans women who are made to inhabit men's prisons suffer greatly at an individual level, and care specifically about alleviating the suffering of members of their tribe. Thus it seems clear to me how this issue slots into the greater project of trans activists of having society recognize them as their chosen gender rather than assigned at birth gender.

However, JK Rowling claims to be interested first and foremost in women's rights in general. If she perceived the most important problem facing society to be the potential advancement of trans rights, and thus stated that her main mission was the frustration of that advancement, in just the same way that trans activists have the advancement of trans rights as their central mission, it would make sense for her to care about, e.g., 'should they be assigned to the prison of their chosen gender or not' just as much as trans activists do, but in an equal and opposite sense. But JK Rowling doesn't claim to be an anti-trans-rights activist, or proclaim that the potential increase in trans acceptance is of significant importance in general. She even claims to be for trans rights in some sense. What she most specifically claims to be is a feminist, whose main mission is women's rights in general. Yet she makes an almost disproportionate amount of her online presence and activism about combating these specific areas, like trans people being admitted to women's prisons, etc.

A rational person who cared most about women's rights but did not specifically support some areas of trans-rights would still not spend as much time caring or thinking about these specific trans issues as Rowling does: there are bigger fish to fry facing women even in her home country, but especially around the world.

Rowling is above any type of that retaliation.

I understand that the criticisms between them are distinct in a large way but neither of them did anything illegal.

I'm not sure how useful 'neither did anything illegal' is as a way to assert that what they did was or wasn't of similar magnitude. The difference I am claiming exists between their misdeeds, and thus their levels of cancellation, still holds between these two examples: Louis CK's misdeeds have no 'supporters' in the culture war. There is no one out there who thinks people should be going around randomly starting to masturbate in front of women who haven't consented to such a thing. Furthermore, the question of whether or not it's wrong to sexually assault people is a question with much greater cultural penetration (especially penetration as a percentage of the group of people who would otherwise be buying Louis CK's product). I would venture that greater than 95 percent of people who would otherwise be interested in buying a ticket to one of his comedy shows would be turned off by the idea that the guy doing the performance was a sex weirdo who had non-consensually masturbated in front of multiple women.

On the other hand, again as stated, aside from Rowling having plenty of ideological allies, her issue just doesn't penetrate that much. People into Harry Potter are for the most part young children, who are too young to care about the discourse, and whose purchasing decisions are made by their parents, who are either too old to care, are on JK Rowling's side, or care more about satisfying their kid's interest in Harry Potter than about their own interest in not supporting the Rowling estate, etc. Of the Twitter-millennial-Harry-Potter-fan demographic, who actually are the most likely to care: not all of them do; some care but support the anti-trans position in the culture war; some care and support the pro-trans side, but can pretend it's Warner Bros. that's getting all the profits and not Rowling, and that level of cognitive dissonance is enough for them; etc.

I think the careful wording of her tactics really does affect the level of retaliation she receives. The fact that she repeatedly maintains she has no hate for trans people etc. is important. That much seems obvious to me. If she started explicitly saying she actively hates trans people, I think it's obvious that she would grow to a level of radioactivity at least somewhat more like the other listed examples, Kanye/Louis CK etc. Obviously you're right that she wields a certain type of power that insulates her somewhat, but I think overall you're understating the way in which the level of her cancellation actually correlates with real differences in her tactics, flavor of rhetoric, the specific CW issue she's chosen, etc.

Eh, Kanye was a billionaire and it worked on him.

Yeah, I mean, I think it's ridiculous to say that cancel culture can't affect billionaires, or that it hasn't affected Rowling. In the sense that, once a person is already rich, only their spending/giving away their own money can stop them from being rich, cancel culture obviously runs into a limit of potential reach. But in basically every domain except money, cancel culture clearly has power even over the wealthy. Rowling's influence over the groups of people she would like to have influence over is clearly less than it would be if she had not taken an anti-trans stance. Perhaps she has more influence over some smaller, different groups of people. But I bet she would like to express her beliefs and not be 'cancelled,' too.

JK Rowling has ideological allies because the stance she has taken is one of the main and most divisive culture war issues of our time. She has fallen back to these ideological allies. People who care about the culture war and aren't on her side have cancelled her among themselves -- everyone else, AKA people who either care about the culture war and take her side, or, the significant majority, those who do not care about the culture war at all, have not cancelled her among themselves. Kanye has no ideological allies, because he is taking a culture war stance from two centuries ago, of which one side has already emerged victorious. How much someone will or won't be cancelled, and to what extent that cancellation will affect them, really isn't anything more complicated a function than 'how deeply does the culture war issue I am deciding to participate in penetrate the public consciousness, and how much of an ideological share of the public who cares is occupied by the side I am taking?'

except in the very narrow sense of being able to, for example, coinhabit women's prisons.

Regardless of whether or not transwomen should coinhabit women's prisons, whether they actually do seems like a problem of minuscule ultimate importance. Do you really think Rowling would dedicate as much effort and energy to her activism if she thought problems of this magnitude were the main issues of the trans movement?

Are there really a lot of ways in which the trans movement harms cis women? Do you mean the women's prison issues and things like transgender women's participation in sports? Or are you thinking of a different set of problems/potential problems the trans movement might pose to women?