This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Hot on the heels of failing out of art school and declaring himself the robofuhrer, Grok now has an update that makes him even smarter but less fascist.
And... xAI releases AI companions native to the Grok App.
And holy... SHIT. It has an NSFW mode. (NSFW, but nothing obscene either.) Jiggle Physics Confirmed.
EDIT: Watch this demo then TELL ME this thing isn't going to absolutely mindkill some lonely nerds. Not only can it fake interest in literally any topic you find cool, they nailed the voice tones too.
I'm actually now suspicious that the "Mecha-Hitler" events were a very intentional marketing gambit to ensure that Grok was all over news (and their competitors were not) when they dropped THIS on the unsuspecting public.
This... feels like it will be an inflection point. AI girlfriends (and boyfriends) have already one-shotted some of the more mentally vulnerable of the population. But now we've got one backed by some of the biggest companies in the world, marketed to a mainstream audience.
And designed like a fucking superstimulus.
I've talked about how I feel there are way too many superstimuli around for your average, immature teens and young adults to navigate safely. This... THIS is like introducing a full grown Bengal tiger into the Quokka island.
Forget finding a stack of playboys in the forest or under your dad's bed. Forget stumbling onto PornHub for the first time, if THIS is a teen boy's first encounter with their own sexuality and how it interacts with the female form, how the hell will he ever form a normal relationship with a flesh-and-blood woman? Why would he WANT to?
And what happens when this becomes yet another avenue for serving up ads and draining money from the poor addicted suckers?
This is NOT something parents can be expected to foresee and guide their kids through.
Like I said earlier:
I've felt the looming, ever-growing concern for AI's impact on society, jobs, human relationships, and the risk of it killing us for a couple years now... but I can at least wrap those prickly thoughts in the soft gauze of the uncertain future. THIS thing sent an immediate shiver up my spine and set off blaring red alarms immediately. Even if THIS is where AI stops improving, we just created a massive filter, an evolutionary bottleneck that basically only the Amish are likely to pass through. Slight hyperbole, but only slight.
Right now the primary obstacle is that it costs $300 a month to run.
But once again, wait until they start serving ads through it as a means of letting the more destitute types get access.
And yes, Elon is already promising to make them real.
It's like we've transcended the movie Her and gone straight to Weird Science.
Can't help but think of this classic tweet.
"At long last, we have created the Digital Superstimulus Relationship Simulator from the Classic Scifi Novel 'For the Love of All That is Holy Never Create a Digital Superstimulus Relationship Simulator.'"
I think I would be sucked in by this if I hadn't developed an actual aversion to Anime-style women (especially the current gen with the massive eyes) over the years. And they're probably going to cook up something that works for me, too.
Talking to an AI feels like trying to tickle yourself. I don’t get it at all.
When I was a kid I used to be somewhat surprised that there were older people who had never played a video game, had no interest in ever trying a video game, they were perfectly fine with never playing one, etc. And I was like, how can that be? How can you not even be curious? I suppose video games just got popular at a point in their lives when their brains were no longer plastic enough or something. And I suppose I’ve hit that point with new technology now as well.
I can’t enjoy talking to an AI when I know that I’m in control and it’s trying to “please” me. Even if I told it, “oh by the way, try and add some variance, maybe get moody sometimes and don’t do what I ask”, the knowledge that at the end of the day I’m still the one in control ruins it. I suppose if we imagine a scenario where the AI is so realistic that I never get suspicious, and you’re able to trick me into thinking I’m talking to a real human, then sure, ex hypothesi there’s nothing to distinguish it from a human at that point and I would enjoy it. But short of that? Not for me.
There was a Serling-era episode of the Twilight Zone where a bank robber died and went to Heaven. An angel tells him that he's made it, he can have anything he wants for all eternity. So the dude lives out all sorts of wish-fulfillment scenarios: winning big at gambling, beautiful women, some bank heists, etc. But he gets bored fast, says something is missing. There's no danger to any of it, no bite, he wins every time. The angel says, "Well, you can set whatever parameters you want. We can make it so there's a 50% chance of your next robbery failing." Guy says, "No no, it's still not the same. Look, I don't think I'm cut out for Heaven. I'm a scumbag. I want to go to the other place." The angel says, "I think you've been confused. This IS the other place."
That’s what AI “relationships” feel like to me.
It reminds me of a friend of mine who went to a strip club to see some adult film star he liked, despite the fact that it was a weeknight and he had to get up early for work the next day. He got hammered and made sure he got more individual attention from her than anyone else in the place, and when he realized it was 11 and his hangover was already going to be bad enough, he informed her he had to be leaving. She kept protesting; he explained his work situation, and she kept telling him YOLO, you can survive one bad day at work, you just need to sober up a little and you'll be fine, etc. Then he uttered the magic words: "I'm out of money." That pretty much ended the conversation right there and he was free to go.
So yeah, this kind of relationship is ultimately pretty hollow, and I don't see the appeal personally, but some guys spend big money on hookers, strippers, and other empty stuff. The business model won't be built around this being a substitute for human interaction generally, but around various whales who get addicted to it.
Well, that's the interesting thing.
AI gets hyped up, as e.g., an infinitely patient and knowledgeable tutor, that can teach you any subject, or a therapist, or a personal assistant, or editor.
All these roles we generally welcome the AI if it can fill them sufficiently. Tirelessly carrying out tasks that improve our lives in various ways.
So what is the principled objection to having the AI fill in the role of personal companion, even romantic companion, tireless and patient and willing to provide whatever type of feedback you most need?
I can think of a few but they all revolve around the assumption that you can get married and have kids for real and/or have some requirements that can only be met by a flesh-and-blood, genetically accurate human. And maybe some religious ones.
Otherwise, what is 'wrong' with letting the AI fill in that particular gap?
You'll always feel inferior to men who were able to build a relationship with a real woman. It'll gnaw at you.
Presuming those relationships last.
Which is a sizeable "if" in the current era. That's why I think the AI companion is a possible death blow. Without actual, real life women being willing to settle down, this becomes the 'best alternative'/substitute good.
This thought only just now occurs to me, but suppose we took two otherwise similar guys: one who married a woman, and another who went all in on an AI companion, bought VR goggles, tactile feedback, the requisite 'toys' to make it feel real, and so on.
Say 5 years down the road the married guy gets divorced, maybe has a kid, and suddenly finds himself alone, and these two guys meet up to compare their overall situations.
The other guy is still 'with' his AI companion, shallow as it is. Would he feel better or worse off than the guy who had a wife but couldn't keep her?
But that's not true. There are lots of women who are settling down with lots of men as we speak.
You're trying to rationalize how the AI could be "just as good" or "not as dangerous" as the real thing, because you know that the AI is obviously worse.
No, simply pointing out a failure mode that human relationships have that an AI really does not. The AI has other failure modes that are more dystopic, of course.
The human relationship failure mode is one that I've now personally observed multiple times, unfortunately, happening to people who do not deserve it.
I do not think the AI is inherently better, I simply think it has an appeal to men who don't feel they've got a shot at the real thing.
And that is VERY VERY bad for society.
Objectively fewer than in years past. That's the point: this is simply adding to an existing trend.
And we can extrapolate that trend and wonder if we'll get as bad off as, say, South Korea. We know it can get worse because worse currently exists.
I'm not here trying to JUSTIFY men choosing the digital option. Quite the opposite. I'm just saying I don't see a reason why, in practical terms, they'd reject it.
And does it have appeal to you?