
With the release of the recent Barbie movie, the old gender debates on the internet have been reignited. (Admittedly, I haven't watched it yet, might pen down my thoughts once I do.)

I recently encountered another article by a heterosexual, middle-class woman discussing how we can assist young men in discovering their masculinity. The piece, confidently titled a 'map out of the wilderness,' retreads the narrative tropes that countless similar works of journalism lean on.

Does it argue that men are disoriented because women are no longer subservient? Indeed. Does it accuse men of falling for 'destructive' ideologues like Jordan Peterson and Bronze Age Pervert, simply because the author doesn't favor their politics? Yes. Does it claim men are discontent because women wish for them to behave more femininely? Absolutely. Does it state there's a lack of 'positive masculinity?' Oh, for sure.

To credit the writer, Christine Emba, she does highlight some of the more sinister issues that venture slightly beyond the bounds of conventional discourse. She openly criticizes feminists and women in general for refusing to assist men, citing an instance where Obama was chastised for attempting to help boys, and thousands of women denounced him in protest.

What prompted me to respond to this article was a moment of blatant self-awareness by the author, who admits when reproached by a man that she doesn't want to be intimate with men who heed her advice (emphasis mine):

Where I think this conversation has come off the tracks is where being a man is essentially trying to ignore all masculinity and act more like a woman. And even some women who say that — they don’t want to have sex with those guys. They may believe they’re right, and think it’s a good narrative, but they don’t want to partner with them.

I, a heterosexual woman, cringed in recognition.

Yes, dear writer, you recoiled in acknowledgment. If you, a talking head opining on this topic, felt this way, consider the reaction of those numerous women with lesser self-awareness when they encounter these feeble, effeminate men.

However, all the discussions around gender roles, sexual relations, power dynamics, and 'incels' are missing the real issue. They're distractions, veils obscuring the core problem.



At the risk of cliché, I'll reference Nietzsche's most well-known passage:

God remains dead. And we have killed him. How shall we, murderers of all murderers, console ourselves? That which was the holiest and mightiest of all that the world has yet possessed has bled to death under our knives. Who will wipe this blood off us? With what water could we purify ourselves? What festivals of atonement, what sacred games shall we need to invent? Is not the greatness of this deed too great for us? Must we not ourselves become gods simply to be worthy of it? There has never been a greater deed; and whosoever shall be born after us - for the sake of this deed he shall be part of a higher history than all history hitherto.

Why has this single paragraph echoed throughout recent centuries as one of the deepest and most frequently reiterated explanations of modernity's moral crisis? Obviously, Nietzsche, a self-proclaimed atheist, doesn't imply we've executed deicide in the literal sense. What we've done is obliterated any transcendent reason for existence. There is no apparent reason why young men should exhibit concern for their neighbors, work towards self-improvement, curtail their desires, or even make an effort to contribute to society.

For a young man in a contemporary world that is entirely individual-centric, what is the appeal of any altruistic act?

Regardless of the religion you choose, these systems provided us with a motive beyond primal, materialistic pleasures to care. They provided us with an aim to pursue. Most importantly, they offered us a social framework within which we could strive collectively with others and receive commendation for our benevolent deeds.

Nietzsche's suggested solution is that the New Men must 'become deities' to be worthy of God's murder. Regrettably, as we've found out, not everyone can ascend to godhood. Certainly some of the highest status and highest agency men can create their own values, but what about the rest of us?

How is a young man in his twenties, armed with a useless college degree and forced to work at a supermarket to get by, supposed to find purpose in what he's doing? How can he feel accomplished, or masculine, or empowered? He definitely can't rely on God or religion for that feeling. If he tries, he'll be overwhelmed by relentless mockery and cynicism from his society.



Returning to Ms. Emba's proposed solution, she states that men need to experience masculinity:

by providing for their families and broader society, by protecting their tribe and others, and by successfully procreating.

This, she asserts, is 'Constructive Masculinity.' Let's look past the glaring issue that it's a woman attempting to define what masculinity should be - the question remains: why?

Without some larger mission, most men aren't going to be motivated whatsoever. Men need a reason to exist. And not a poor, weak reason like 'following your dreams' or 'getting money' or 'being a good person.' Men need something to strive for, something worth dying for, something that they can use to shield themselves from the terror of the void.

Of course this problem applies to far more people than just young Western males. This lack of meaning, lack of purpose, is at the core of modernity's societal problems. It waits like a tiger in the shadows, seizing us in our moments of weakness and pulling us into a black pit of despair, nihilism. Emptiness.

When you're on your deathbed, where will you look for comfort? What force or being or god will let you face your own death without flinching? What water will purify you?

How will you cleanse your hands of blood?

There are plenty of posts in the CW thread lamenting the takeover of modern TV and movies by 'wokeness,' I figured it might be interesting to look at another area, namely sci-fi novels.

The Hugo Award is probably the most well-known science fiction writing award, having existed since 1953 and helped launch the careers of many famous authors such as Robert Heinlein, Philip K. Dick, Frank Herbert, and many more. Unfortunately, the quality of this award, among others, seems to have gone sharply downhill recently. Specifically, the awards are becoming overtly political and focusing primarily on female and POC authors.

This phenomenon started back in 2014-2015, and has received massive backlash since the genre of speculative fiction (science fiction + fantasy) is overwhelmingly male, and seems to select for high systematizers. There have even been organized voting campaigns against the political skew of the Hugos, predictably shut down hard by the social justice camp.

I was recently looking for a new sci-fi series, and stumbled upon Ancillary Justice, a sci-fi novel that won the first so-called 'Triple Crown' of Sci-fi, the Hugo, Nebula, and Arthur C. Clarke awards. Despite never having heard of the other two besides the Hugo, I figured that should be a good enough endorsement of the series. I was wrong.

The flaws with this first novel, as I only read about a fifth of it before quitting, are numerous. The basic premise is that the main character used to be an Artificial Intelligence who ran a starship, and communicated/perceived primarily through captured human bodies, called Ancillaries. She (the AI) was betrayed, and now is stuck in a single human body, plotting revenge. Why a super powerful AI needs to take over human bodies is never explained, but we'll chalk it up to suspension of disbelief.

This former-AI-being, despite having lived for over 2,000(!) years, is laughably incompetent and emotional while still managing to come off as a flat character. Starting on a backwater planet called Nilt, where she has been living for almost twenty years, she consistently manages to piss off the locals by mis-gendering them. This is because, as the author takes pains to remind us, the Radch Empire she came from has one singular gender (or doesn't care about gender, it isn't clear) and the default pronoun is 'she.' This odd convention leads to such beautiful passages as (emphasis mine):

"She out-bulked me, but I was taller, and I was also considerably stronger than I looked. She didn’t realize what she was playing with. She was probably male, to judge from the angular mazelike patterns quilting her shirt."

This inconsistent gendering is constant throughout the novel, to the point where it's difficult to trust the gender of any character. You literally have characters introduced using female pronouns, only to find out two chapters later that it was actually a male character, the former-AI-turned-SJW just failed to correctly gender them!

Not only is this beyond frustrating from a reader's perspective when trying to visualize the characters, it makes literally no sense given the world-building. You're telling me that a millennia-old AI, who has explicitly spent centuries studying human expressions, culture, and communication, is so incompetent she can't correctly gender humans in a society she's been living in for twenty years?? Keep in mind this mis-gendering literally threatens the main character's life at multiple points. The amount of mental gymnastics required to suspend my disbelief at this point was far too much.

And yet, despite this inane premise (and the fact that according to many other reviewers, the book never gets better, there's barely any plot, and the AI's scheme for revenge is utterly flawed) this book received massive amounts of praise. Not just from the sci-fi establishment, but from more mainstream institutions like NPR and various celebrities as well. They somehow try to turn this confusing writing style into a good thing because it encapsulates a 'poignant personal journey':

It won't be easy. The universe of Ancillary Justice is complex, murky and difficult to navigate — no bad thing, as Leckie's deft sketches hint at worlds beyond, none of them neat. Most obvious are the linguistic disconnects: Breq's home tongue uses only "she," reinforcing her otherness as she constantly guesses at genders in other languages.

Now you may ask - why does this matter? Unfortunately, as many here know, awards are a zero-sum game. Speculative fiction, especially fantasy, is entering the mainstream with hits like Game of Thrones and House of the Dragon. We already have issues of adaptations being too focused on social justice narratives, even though many of the underlying works were popular due to their gritty, realistic, and often misogynistic worlds.

Writing fiction is a brutal career. Amateur authors often spend literally decades building a name for themselves, so short story magazines, awards, and other ways of gaining notoriety and funds are extremely important. If aspiring writers of science fiction and fantasy can't make it without catering to woke sensibilities, then unfortunately the quality of the genre will drop drastically. Writers who can't write woke fiction simply won't be able to support themselves.

When it comes to modern entertainment, science fiction novels especially have been one of the last bastions of male-centric, systematized, shape-rotator style writing. Where the genre goes could be an important bellwether for the future of the culture war in entertainment.

Some of you may have read Scott Alexander’s recent post, Book Review: The Geography of Madness. The couple-of-paragraph summary is:

A culture-bound mental illness is one that only affects people who know about it, and especially people who believe in it. Often it doesn’t make sense from a scientific point of view (there’s no such thing as witches, and the penis can’t retract into the body). It sometimes spreads contagiously: someone gets a first case, the rest of the village panics, and now everyone knows about it / believes in it / is thinking about it, and so many other people get it too.

Different cultures have their own set of culture-bound illnesses. Sometimes there are commonalities - many cultures have something related to the penis or witches - but the details vary, and a victim almost always gets a case that matches the way their own culture understands it.

THESE PEOPLE ARE NOT MAKING IT UP. I cannot stress this enough. There are plenty of examples of people driving metal objects through their penis to pull it out of their body or prevent the witches from getting it or something like that. There is no amount of commitment to the bit that will make people drive metal objects through their penis. People have died from these conditions - not the illness itself, which is fake, but from wasting away worrying about it, or taking dangerous sham treatments, or getting into fights with people they think caused it. If you think of it as “their unconscious mind must be doing something like making it up, but their conscious mind believes it 100%,” you will be closer to the truth, though there are various reasons I don’t like that framing.



The thrust of Scott’s argument is that humans have an amazing propensity to change their subjective experience based on their beliefs. Here, I'm not talking about rationally held or carefully reasoned beliefs, but deep-seated beliefs that aren’t easy to change, even if you know for a fact they're irrational. Typically, these beliefs seem to be formed through social or cultural channels, and once formed, they can be very difficult to change unless your cultural narrative also changes.

This idea ties into other work on the placebo effect and the ways it shaped our culture, for instance, John Vervaeke’s take on shamanism. The basic idea being that shamanism was highly advantageous from an evolutionary perspective because it allowed groups of humans to harness the placebo effect to overcome illness and manage social problems.

In short, despite the rational pretensions our culture has, our irrational beliefs have extremely strong effects on our perception of pain and other subjective experiences. However, an important nuance is that no cultural disorder is 100% ‘in your head;’ on the contrary, these disorders are very real and can have strong physical effects.

Some of the big examples that Scott gives, and some I think might be (mostly) culturally mediated, are:

  • Anorexia

  • Post-traumatic stress disorder

  • Anxiety

  • Depression

  • Gender dysphoria

  • Chronic pain

  • TikTok Tourettes

  • Long Covid

Now, based on the bent of this forum, many people might be tempted to jump on the gender dysphoria issue. While it’s certainly a loud and vibrant battle in the culture war, I’d ask that we instead focus on other problems. In my opinion, if this thesis holds true, then gender dysphoria is a red herring.

The evidence clearly suggests that we are inflicting massive amounts of pain and suffering on ourselves through our cultural beliefs and practices. The fact that so many of our cultural problems - from overdose deaths and suicides to chronic pain and crippling anxiety - are unforced errors is truly shocking.

Think about it - one fourth of the adult U.S. population experiencing chronic pain? That's a staggering number, and it seems largely due to the fact that we have been conditioned to believe that our pain must have an acute physical cause. We've been taught to view pain as something that must be cured with medication or surgery, when in fact many cases of chronic pain can be alleviated by simply changing our beliefs about it.

The same goes for the crippling anxiety driving young people to retreat from society, the overdose deaths, and the suicides: these are clear unforced errors. We are inflicting this pain on ourselves.

If this theory is true it may very well be one of the most important and impactful frameworks with which to view the issues of post modernity. We wouldn’t need endless medications or miraculous scientific breakthroughs - we could already have the power to end massive amounts of truly pointless suffering.

ETA: is another perfect example of this type of illness.



From a personal perspective, I can attest that this theory confirms my priors. I’ve dealt with chronic pain for a decade and have long suspected that it was mostly psychosomatic. Even with this realization, it is a difficult battle to fight. Ironically, support groups where people confirm and commiserate seem to make the issue worse. In fact, many modern studies on pain recommend not even using the word "pain" and replacing it with something else to trick your mind into understanding that your pain doesn’t have an acute physical cause.

So many of us in the rationalist community focus on object-level reasons as to why our society may be stagnating or why we have so many cultural problems. At the end of the day, it turns out that our beliefs themselves may be throwing us into a twisted, absurd, and horrific self-fulfilling prophecy.

It may be time to stop assuming that the causes of our problems originate directly from the outside world and update to a view that many more major problems could be solved if we simply change our cultural beliefs.

This excellent piece on age segregation has got me thinking about how serious and pervasive this problem is. As the author states:

Young adults are afraid to have children, because they can’t possibly imagine adding some to the life they currently have. New parents are isolated from most of their previous friends, as their paths suddenly never cross again unless they too have kids of their own. Children compete within their age group at schools, never having a chance to either mentor someone or have an older mentor themselves. Teenagers have no idea what to do with their lives, because they don’t know anyone who isn’t a teacher or their parent. And everyone is afraid of growing old because they think that the moment they stop going to the office they’ll simply disappear.

As discussed in @2rafa's post downthread, a major issue of the fertility crisis is a lack of time. Another issue it seems is a lack of even interacting with children unless you have some yourself, or have some in your family. I wonder if the lack of time among young adults in the West is causative of this age segregation?

Regardless, it likely has its roots in the K-12 education system. It's profoundly unnatural from a cultural standpoint to only be in the same peer group as people right around your age. I'm convinced it's unhealthy, and it predisposes us in a massive way to only socialize with people close to our age.

Do you think age segregation is an issue as well? If not, why not?

I gotta say the lab leak revisionism kills me. I remember arguing for it during the high points of covid with friends, and even though I was vaccinated, masked, etc., arguing for the lab leak was enough for some people to call me a 'covid denier.' Now when I bring up the studies to those same people, they just shrug and go 'what does it matter, covid is over now.'

Good lord, I am reasonably more accepting of the covid response than the median on this site, but the reaction to the lab leak really just baffles me. I can't believe more people don't see it as by far the biggest controversy in the last decade - China very plausibly released a virus, intentionally or unintentionally, that devastated most of the world economically if not biologically. And yet people just don't think it's a big deal or doesn't matter one way or another. Unreal.

My question is why wasn’t this leaked earlier?

A bit of a tangent, but I'd like to point out this is an excellent example of a real conspiracy that wasn't leaked. So many people on here act like conspiracies are impossible because someone will leak it - clearly not!

Looks like the Supreme Court is finally getting around to challenging affirmative action. Of course we don't know what the ruling will be, but with the decisions so far I'm hopeful they strike down AA, or at least put a dent in it.

I'm surprised this isn't a bigger deal as I haven't heard much buzz about it from my liberal friends. According to the article, 74% of Americans don't believe in using race as a factor in college admissions (although that question and whether or not Affirmative Action should be struck down likely have far different approval rates.) It may be a Roe situation where they really don't care until one of the sacred cows is gored because they believe in their own invincibility. I'm curious if AA does get struck down, will we have the same reaction as Roe?

I'm sure some people will be upset, but do you think liberal states will start changing their constitutions to allow race filtering for college admissions? Or is the political will for AA just gone on both sides of the aisle?

Just adding your weekly reminder that the Motte remains the brightest and best hope for open discourse on the internet. Be proud and relieved you are a Mottizen - we have made it to the shining City on the Hill, the one place online where truth and free speech are protected.

I'm doing all this grandstanding because I'm flabbergasted that right now Tildes, one of the other 'reddit-alternatives' that claims to stand for open and intellectual discourse, is actively and unapologetically censoring anything to do with the UAP hearings.

Many of the users there are rightly pointing out that it's insane that the moderators would block discussion about a literal Congressional hearing... but this is the doublethink that we Mottizens are up against:

Hypothesizing that aliens might well exist based on the vastness of our universe is not a harmful thing.

Believing in conspiracy theories based on zero credible evidence is incredibly harmful. This hearing centers around a man with absolutely zero evidence claiming that there is an arm (or arms) of the government operating above congressional and presidential oversight. The claim is that we are dealing with a massive, top-level coverup. Does that sound familiar to anyone else? Anyone want to quess which Qrowd is qoing to eat this shit up?

Giving these conspiracy theories a platform on a forum where discussion is purported to be high quality and based on intellectual curiosity is harmful. Have you ever noticed how people who believe in one conspiracy theory are more likely to believe in multiple conspiracy theories? Worse, they are even likely to believe all conspiracy theories, even those which are completely contradictory to each other. This is called conspiracist ideation, or conspiracy theory monism.

Anyone who hasn't been under a rock for the last 8-ish years knows how harmful this can be. They helped elect a president who built our current supreme court. Check out /r/qanoncasualties if you forgot how it harms us on a more individual level. You could also go back further than Q anon and read accounts of witch trials, rapes, genocides, wars, and all manner of horrible things caused by false beliefs in conspiracy. Not to mention the countless deaths caused by vaccine avoidance - something that started long ago, but with which we are still dealing today.

Politicians having a hearing about this conspiracy theory is news, unfortunately. However, the conspiracy theory itself is not the news. Congresswoman Lauren Boebert openly discussed Replacement Theory, should we give that its own thread and discuss its merits?

The only posts we should have regarding this topic, in my semi-humble opinion, are well-written articles absolutely tearing our politicians apart for even humoring this hearing. Perhaps a good debunking article explaining that this whole testimony is based on hearsay and rumor, and explaining that someone's position in government doesn't mean they're immune to conspiratorial thinking.


Again, luckily there is some actual pushback on the site itself. But please, my fellow Mottizens, let this open display of intolerance remind you to keep your guard strong. Keep your eyes focused directly on the goal, and remember that if we let ourselves be distracted by our petty differences, the Motte may well become the same censored cesspool as the rest of the internet.

Be strong my brothers and sisters, and never forget the incredible and unique nature of this Forum that we have built. Don't take the Motte for granted, and be swift and sure when defending it.

Veritas omnia vincit.

One worrying trend I've been seeing in the modern world is the social outlawing of almost any contact between older people and children. With modern Western society's obsession with children's sexuality, the default assumption whenever an adult interacts with a child who isn't related to them by blood is suddenly that the adult is a sexual predator.

Especially on the motte there have been a lot of recent concerns about 'grooming,' which as a thread below mentions is an extremely muddy and useless term. In my opinion it should be tabooed from these discussions.

This issue becomes especially salient when you look at the rise of internet addiction issues, and the mental health/suicide problems that come along with it. Many kids go to internet forums like this because they don't have role models or guides they respect in life. They end up forming parasocial relationships with internet celebrities that are probably more likely to be predatory and harmful to the child, as if the habit of going on the internet all day isn't bad enough.

We as a society are losing vast amounts of illegible knowledge every day as older people die, exit the workforce, or suffer cognitive decline. There are many areas where 'book smarts' can't teach you everything, especially when it comes to emotional or social issues. The rise of inceldom, trans identification, and other social movements centered on the social problems of young people is a prime example.

My question is: how are adults, especially adult men, supposed to offer guidance to children in the modern world? There are numerous stories of a child's own father having the police called on him because bystanders assumed he was a sexual predator. In this environment, why would any man take on the reputational and legal risk of mentoring a kid?

Is it worth losing any realistic relationship between the young and old because of vague fears of sexual predation? Does the current hysteria even reduce sexual predation, or does preventing children from having good role models make them more insecure and vulnerable to bad actors?

A psychologist himself, Adam Mastroianni proclaims: I'm sorry for psychology's loss, whatever it is.

I found this post on the slatestarcodex subreddit. The main article discusses how the replication crisis really isn't as bad as most people think, because:

Gino's work has been cited over 33,000 times, and Ariely's work has been cited over 66,000 times. They both got tenured professorships at elite universities. They wrote books, some of which became bestsellers. They gave big TED talks and lots of people watched them. By every conventional metric of success, these folks were killing it.

Now let's imagine every allegation of fraud is true, and everything Ariely and Gino ever did gets removed from the scientific record, It's a Wonderful Life-style. (We are, I can't stress this enough, imagining this. Buzz buzz, I’m bees.) What would change?

Not much.

Basically this idea can be boiled down to 'well, most modern psychologists don't do anything that's even remotely important, so why do we care if these studies don't replicate?' I'm very wary of buying this type of argument. One reason is that over $2 billion went into psychology research, in the US alone, way back in 2016. I'm sure it has increased since then.

On top of that, as psychologists themselves have acknowledged, many public policies get based on psychological research. In the light of the replication crisis, this is perhaps the largest and most under-discussed mistake of the 21st century. The majority of our politicians are basing their decisions, and public justifications, on a field of science that has been proven to be mostly fake. To me, that's not something we can just throw up our hands at and say is trivial.


Another interesting point, which I won't go too far into, is that many of the replicable studies in psychology are just completely ignored. Here's a highly-upvoted comment on the SSC subreddit:

Psychology has nothing interesting left because all of the rock-solid empirical results with tremendous real-world consequences were buried due to being politically awkward.

Psychometrics, heredity of various personality traits, innate gender differences, etc.

So you're naturally left with irrelevancies (monkey prostitutes) and lies (growth mindset, power posing, priming, multiple intelligences).

It's almost enough to make me empathize with Gino and Ariely. The modern discipline is all about garbing feel-good falsehoods with vestments of science. Their only crime was taking the more direct path to that end, rather than undertaking the standard rituals of plausibly innocent methodological infirmities (p-hacking etc.)

I'll leave it to the reader to decide whether or not Psychology deserves an equal place among the rest of the sciences.

I'm sure others have seen this, but AutoGPT is here, a framework that lets instances of GPT call other instances of GPT to create complex task chains with no human input. In other words, it lets GPT instances prompt other instances to complete projects. Only about a week after being released, the examples are staggering.

This is an example of BabyAGI automating a sales prospecting pipeline, something I can say from experience normally takes a typical sales rep at least half a day to do. We can already automate it, and pretty well. This type of thing wasn't possible a week ago.
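To make the idea concrete, here is a toy sketch of the plan-then-execute loop these frameworks use. It's only an illustration of the general pattern, not AutoGPT's or BabyAGI's actual code, and `call_llm` is a hypothetical stand-in for a real model API call:

```python
# Toy sketch of the self-prompting pattern behind AutoGPT/BabyAGI-style agents.
# call_llm() is a stand-in for a real model call (e.g. an API request); it is
# stubbed here so the structure is clear without any external dependencies.

def call_llm(prompt: str) -> str:
    """Placeholder for a chat-completion request to a language model."""
    return f"[model response to: {prompt[:60]}...]"

def run_agent(goal: str, max_steps: int = 5) -> list[str]:
    # One instance "plans": decompose the goal into concrete tasks.
    plan = call_llm(f"Break this goal into numbered tasks: {goal}")
    tasks = [line for line in plan.splitlines() if line.strip()]

    results = []
    for task in tasks[:max_steps]:
        # Another instance "executes" each task, seeing prior results as context.
        result = call_llm(
            f"Goal: {goal}\nPrior results: {results}\nDo this task: {task}"
        )
        results.append(result)
    return results

if __name__ == "__main__":
    for step in run_agent("Build a list of 20 sales prospects in the HVAC industry"):
        print(step)
```

The point is just that no human sits between the planning call and the execution calls; the model's own output becomes the next prompt.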

There are all sorts of other examples, and it's clear that massive automation is happening. I'm willing to bet we'll reach 30% unemployment in five years. If not sooner. The question becomes - what do we do about it?

The standard liberal answer is Universal Basic Income, and many on the left seem to think it will just magically appear once the government realizes the economic power of AGI. The problem is that even if we get buy-in from the political class, implementing UBI is not a simple undertaking! The funding, distribution, and potential impact on inflation alone are going to cause monstrous headaches and take years to work through. Plus, even if we do get UBI, the potential for widening income inequality is insane, as those who own and control AGI technology stand to reap substantial profits, further concentrating extreme amounts of wealth in their hands.

Another solution, favored by some conservatives, is to focus on retraining and upskilling the workforce. While I get the general direction here, I highly doubt a retraining program could possibly be enough to counter the rapid pace of automation. Furthermore, not everyone will have the aptitude or desire to transition into highly technical or specialized fields, which may leave a significant portion of the population without viable employment options. "Learn to code" just doesn't hit the same when software devs are going to be replaced as well.

Even if we get lucky enough to have both UBI and massive retraining, it may not be enough!

Why not get the government to throw some cash at massive infrastructure and public works projects? We could take a page out of the 1930s New Deal playbook and create a boatload of jobs in all sorts of industries. I've rarely seen anyone discuss this, but it may be necessary as it was during the Great Depression. Plus, it'd boost the economy, help repair our public infrastructure, and maybe even help tackle climate change if we invest in green tech. We could even turn this impetus towards space...

Last but not least we've got the potential impact of automation on mental health and societal well-being. We're already in the middle of a Meaning Crisis. As we increasingly rely on artificial intelligence to perform jobs and soon everyday tasks, we've got to ensure that people are still able to find purpose and meaning in their lives. This probably won't be what we've traditionally looked to, such as the arts or writing, since AI is already making that irrelevant.

Perhaps we will finally realize the importance of community in our lives and to our happiness, and start adding economic numbers and frameworks to those who create social goods. Have the government fund people to run local meetup groups, or help their neighbors with tasks, volunteer at old folks' homes, etc. It's a bit of a bludgeon solution right now, but we could refine things over time.

At the end of the day we all know the rise of AGI is going to be a shitshow for a number of reasons. I've outlined some potential solutions or stopgap measures to prevent the breakdown of society, but how does the Motte think we can navigate this change?

I'll add another shoutout to Community, great show if anyone hasn't seen it. It has some pretty hilarious takes on early wokeism.

What I love about the show is that it has complex characters, which really isn't a high bar but seems to be one that many cultural products no longer meet. Jeff, Britta and Annie all have well-developed flaws, which is expected as the attractive white stars. But the shocking thing is even the minority characters have flaws! Troy, Abed, and Shirley are all well-rounded people who have good sides and bad, and don't blame all their problems on vague instances of racism. It's truly refreshing in this day and age.

Eh, I don't find this argument persuasive. I highly doubt the vast majority of supporters of the MeToo movement would be caught dead agreeing with any sort of 'sex negativity.' It's really about women wanting to have their cake and eat it too.

The way these sexual assault and rape proceedings are going, we are hurtling towards a world where young women get to become intoxicated at parties and fuck around as much as they want. But then if a man they slept with (or presumably could've slept with) ever does something they don't like, they can bring the full force of the law against him. Even 20 years later.

Yes, conservative courting norms and laws were created to prevent exactly this, but I'm not sure most mainstream progressives are able to think of anything labeled 'conservative' in a positive light. It's quite strange, but the modern media landscape really has made a world where, once people have seen a group labeled 'enemy' enough times, they literally cannot fathom that group having anything beneficial going on.

Historically science flourishes best when you have motivated scientists that can devote multiple decades to learning everything about a problem.

I forget the name, but there's a book about one of the guys who ran Xerox PARC who talks about how this was his strategy: just find scientists who seem brilliant, guarantee them 20 years to devote to a project, then sit back and let them do it.

The current scientific establishment is almost the opposite of this - in order to compete you have to publish quick and publish something important. And I don’t blame the scientists, most of them will lose their livelihood if they don’t get grants, and they typically don’t have skills to fall back on. Or much of a backup plan in general.

I’m surprised the comments here are so supportive of bullying, and frankly I wonder if it’s because it confirms many of our anti-trans biases.

When I first read this I noticed myself disagreeing that the bullying was important, but after reading some of the arguments in the comments I realized I’m not convinced for any logical reason, just because I am frustrated that trans ideology is spreading in middle schools. I think it should be reserved for adults.

If this child were being harassed for something like believing in God, or an immutable characteristic like a big nose or their race, I would find this situation terrible.

For all the people saying kids need to toughen up or whatever - I firmly disagree. Humans can learn to operate in high trust, net positive ways, and that’s the society I want to build. If we keep creating cycles of kids being fucked up and aggressive in their early lives, adults will continue to act that way too. You can say hierarchical psychological violence is necessary to the human condition or whatever, but if that’s truly the case I say we strive towards something better.

So, it looks like Anthropic has finally gotten the backing to give OpenAI a serious run for their money when it comes to foundational models, via a Series C round of $450 million. There are a few core differences between Anthropic's approach and OpenAI's, and intelligent AI pundits seem to think these could have dramatic impacts down the road on who gets to control the future of AI. My personal thoughts are still muddied - I'm not sure how much the technical side of things will matter in the age of hype and massive digital marketing. ChatGPT's first-mover advantage may have already cemented OpenAI the crown of AI for the foreseeable future - unless of course they make a massive blunder.

What's most amusing to me in this whole situation is the way the landscape played out. Even though there were many firms hoping to advance the capabilities of AI, the two that are currently at the top originated as AI Safety outfits. Just like OpenAI, Anthropic sniped a lot of the top engineering talent by pretending to care about safety, then shamelessly pivoted to capabilities enhancement once they became "increasingly convinced that rapid AI progress will continue rather than stall or plateau."

While there are many condemnations and arguments over the AI Safety movement as a whole in the corner of the internet that cares, I think most people who discuss the future of AI don't take these signals seriously enough. The fact that the two largest, arguably most influential 'startup' orgs focused on AI Safety have already flipped to the other side, with OpenAI having led the charge, should give anyone who believes in the Eliezer Yudkowsky doomer movement pause. If most people with power in that camp genuinely believed that doom was a given without a long period of AI restriction and alignment research, we should've seen massive departures and drama from OpenAI and Anthropic. The fact that their employees, stakeholders, and most of the AI ecosystem seem content to swallow their facile window-dressing as to why they've switched sides is proof to me that there's a massive lack of conviction in the AI Safety space.

Whatever happens, we're certainly in for an interesting few years. Whether AI continues to advance rapidly or stalls out, the world has already been changed at a level equivalent at least to the birth of the Internet. Now we're just waiting for the dominos to fall and the dust to settle.

I'm going to shamelessly steal @Scimitar's post from the Friday Fun thread because I think we need to talk about LLMs in a CW context:


A few months ago OpenAI dropped their API price, from $0.06/1000 tokens for their best model, to $0.02/1000 tokens. This week, the company released their ChatGPT API which uses their "gpt-3.5-turbo" model, apparently the best one yet, for the price of $0.002/1000 tokens. Yes, an order of magnitude cheaper. I don't quite understand the pricing, and OpenAI themselves say: "Because gpt-3.5-turbo performs at a similar capability to text-davinci-003 but at 10% the price per token, we recommend gpt-3.5-turbo for most use cases." In less than a year, the OpenAI models have not only improved, but become 30 times cheaper. What does this mean?

A human thinks at roughly 800 words per minute. We could debate this all day, but it won't really affect the math. A word is about 1.33 tokens. This means that a human, working diligently 40 hour weeks for a year, fully engaged, could produce about: 52 * 40 * 60 * 800 * 1.33 = 132 million tokens per year of thought. This would cost $264 out of ChatGPT.

https://old.reddit.com/r/singularity/comments/11fn0td/the_implications_of_chatgpts_api_cost/

...or about $0.13 per hour. Yes technically it overlooks the fact that OpenAI charge for both input and output tokens, but this is still cheap and the line is trending downwards.

Full time minimum wage is ~$20k/year. GPT-3.5-turbo is 100x cheaper and vastly outperforms the average minimum wage worker at certain tasks. I dunno, this just feels crazy. And no, I wont apologize for AI posting. It is simply the most interesting thing happening right now.
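For anyone who wants to sanity-check the quoted figures, here's the same back-of-the-envelope arithmetic as a short script. Every input is an assumption taken from the quote above, not a measurement:

```python
# Back-of-the-envelope check of the quoted cost arithmetic (all inputs are the
# post's own assumptions: 800 words/min of "thought", 1.33 tokens/word,
# $0.002 per 1K tokens for gpt-3.5-turbo, ~$20k/yr full-time minimum wage).
words_per_minute = 800
tokens_per_word = 1.33
price_per_1k_tokens = 0.002            # dollars
work_minutes_per_year = 52 * 40 * 60   # 52 weeks of 40-hour weeks

tokens_per_year = work_minutes_per_year * words_per_minute * tokens_per_word
annual_cost = tokens_per_year / 1000 * price_per_1k_tokens
hourly_cost = annual_cost / (52 * 40)

print(f"{tokens_per_year/1e6:.0f}M tokens/year -> ${annual_cost:.0f}/year, ${hourly_cost:.2f}/hour")
# ~133M tokens/year -> roughly $265/year, about $0.13/hour, versus ~$20,000/year
# for a full-time minimum-wage worker (ignoring input-token charges, as noted above).
```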



I strongly agree with @Scimitar: this is the most interesting thing happening right now. If you haven't been following AI/LLM progress over the last month, it has been blazingly fast. I've spent a lot of time in AI doomer circles, so I have a layer of cynicism around people talking about the Singularity, but I'll be damned if I'm not starting to feel a bit uncomfortable that they may have been right.

The CW implications seem endless - low skill jobs will be automated, but which tribe first? Will HR admins who spend all day writing two emails be the first to go? Fast food cashiers who are already on their way out through self ordering consoles?

Which jobs will be the last to go? The last-mile problem seems pretty bad for legal and medical professionals (i.e. if an LLM makes up an answer it could be very bad) but theoretically we could use them to generate copy or ideas then go through a final check by a professional.

Outside of employment, what will this do to human relations? I've already seen some (admittedly highly autistic) people online saying that talking to ChatGPT is more satisfying than talking to humans. Will the NEET apocalypse turn into overdrive? Will the next generation even interact with other humans, or will people become individualized entirely and surround themselves with digital avatars?

Perhaps I'm being a bit too optimistic on the acceleration, but I can't help but feel that we are truly on the cusp of a massive realignment of technology and society. What are your thoughts on AI?

Yeah, it's pretty bad. I'm fairly involved with EA, and while I knew SBF was a big donor, I had no idea how bad the hero worship had gotten - both within EA and among big financial institutions. To my eyes this reflects even more poorly on VC funders and high finance/trading in general: they were supposed to have done due diligence on FTX (which presumably they did, with the use of $10b in customer funds coming later), but they didn't see this coming either.

For instance look at this archived profile on SBF from Sequoia, a VC fund that made FTX happen and memoryholed this article after the disaster. The hero worship in there is cringey, and in retrospect it's horrifying:

In devoting every waking moment of his life to work, SBF doesn’t feel he’s doing anything unusual or extraordinary. He’s doing what he feels every right-minded person should do—if they were big-hearted and clear-headed. He’s attempting to maximize the amount of good in the world.... SBF, on the other hand, seems qualitatively different: He seems utterly driven, as if by a lash.

and

It’s hard to see SBF in a clear light. The glitter of the self-made billions are blinding. His intellect is as awesome as it is intimidating.

and

“It’s not some pathological thing,” he continues. “For some bizarre reason—and I don’t get it—they want to help.” We’re talking about EAs in general, but also about SBF in particular. “They want to make a difference, and they want to devote their lives to doing that,” says Lerner, who is definitely not an EA. “But that still doesn’t clarify why, or if, that somehow diminishes the pleasure drive.” SBF is a mystery even to his own therapist.

it goes on

Yet again, I’m reminded of a certain novel. In Fitzgerald’s day, there was the real-life character of John Pierpont Morgan, who steered the nation as it made the transition from the 19th to the 20th centuries—the transition from an agricultural economy, powered by horses, to an industrial economy, run on rails. Who is going to do the same for us, in this new century?

Of course the answer to that last question is: SBF. The blatant pedestalizing of the man here is inherently disgusting to me, and the fact that it comes from a well-respected VC firm really lowers my faith in that entire class of folks - especially after the WeWork/Adam Neumann debacle and all the other startup-founder disasters.

Either way, I've been trying to beat the drum in EA spaces for a long time that EAs put far too much focus on credentials. It's ironic that so many folks in the movement will tell you to your face they don't care about credentials, only impact, and yet the entire leadership is nothing but blue-blooded Ivy League grads and other well-connected elites. It's a shame because I think most people in EA have their hearts in the right place, they just can't take off the credential blinders and see that most of the academic/elite class is specialized in bullshitting and not much else.

Looks like the war against advertising is continuing to fail, predictably. Google Chrome is now restricting ad blockers starting as early as next year. (1) I am not convinced this model - create a free, ad-free service to get users --> slowly pull in ads for $$$ --> eventually become an ad-riddled hell - is the best one. I often balk at paying for services up front, but if a service as essential as Google is now bowing to the pressure, when will it end?

Advertising definitely has some uses in connecting buyers to sellers and informing consumers about the market, but I'm convinced it's a bit of a 'tamed demon.' If we don't want to devolve into a horrid anarcho-capitalist future, we need to get serious about restricting what advertisers can do, and where they can advertise. I predict advertising will become far more ubiquitous with the rise of Dall-E and similar image-producing AIs. The cost of creating extremely compelling, beautiful ads will plummet, and more and more of our daily visual space will become filled with nonstop advertising.

On top of this, we have Meta and other tech oligarchs attempting to push us all into the Metaverse. I am no detractor of AR/VR; in fact I think, utilized correctly, it could solve many of our current problems. However, if the Powers That Be take over the metaverse, we will soon have ads that engage all of our senses - not just vision and hearing.

Given how powerful advertising already is, can we really afford to let it run rampant in an age where we have such powerful technologies?

1 - https://developer.chrome.com/blog/mv2-transition/

It's getting unavoidable - the quality of news and novel information obtained from time spent here is crashing. I used to hear things here first - now I usually don't hear them here at all.

It's getting unavoidable that we're having far more people complain and add low-effort negative comments than actually take time to flesh out top level posts that are high quality. As @Amadan and others have consistently said, if you don't like the status quo why don't you contribute yourself, or try to organize something else to change it?

High quality, intellectual writing doesn't just drop out of thin air because you complain about it. It comes from intelligent people who are driven to write, and who want to sharpen their minds in an environment that tests their opinions.

Frankly, I'm concerned that the "quality of news and novel information" has gone up here much more than I'd like. In terms of a vision for this site, I'm far more in the camp of having great writers like @ymeskhout and @DaseindustriesLtd write long effortposts about serious issues they've spent a lot of time thinking about, rather than helping people like you get their latest CW fix.

There are a million places on the internet you can go to keep up with the spectacle of the twenty-four-hour news cycle. There aren't as many places where you can find in-depth analyses of Straussian themes, or a discussion that weaves together modern internet drama with the age-old ideas of sacrifice, meaning, and suffering, or how the relationship between the rich and the poor has changed dramatically in the modern era.

These sorts of well thought out, insightful and useful write-ups are rare and take time to formulate. Expecting brilliant insights on every latest piece of CW gossip is ridiculous. Sure it might drive more engagement, but it would also likely lead us in a race to the bottom.

Not sure if folks here keep up with crypto much, but over the weekend FTX had a liquidity crisis and agreed to sell to Binance. This is pretty huge news - FTX was one of the bigger crypto exchanges known for buying out other flailing firms that had crises. This may lead to a larger spiral within the crypto economy. @aqouta curious for your take here.

Also, as some folks here may know, Sam Bankman-Fried of FTX is one of the three major funders of the Effective Altruism movement. Given the circumstances of this bailout, it's likely that FTX was sold for an incredibly small amount - if Binance hadn't helped them with the liquidity crisis, they almost certainly would've fallen to $0 value. Unfortunately, this means the money EA has been pledged/receiving from SBF is going to dry up. I'm curious to see if the EA movement can weather this storm, as they have been growing rather aggressively and it looks like they've been betting on this funding being in place for a long time.

Time to add some wild speculation: Changpeng Zhao, the CEO of Binance, is Chinese. Now that Binance owns FTX, they are clearly the dominant player in the crypto space, or at least positioned well to become the dominant exchange. I wonder if this shift will cause China to reconsider their decision to make crypto illegal? Or is it too much of a risk to state power?

Update: This definitely seems like a coordinated attack. Apparently Coinbase released an article slamming FTX's native token, then Binance pulled out their entire stake. Without those two events, I'm not sure this would've happened.

Then I come here for a dose of sanity, and I have to dig DEEP into the replies before I find anyone positing the plainly obvious: that if you say your political opponents are child rapist election stealing perverts, some section of the population will actually believe the literal words you are saying and "take action".

This is a fair point, but I find the rest of your post pretty uncharitable. I see the recent right-wing explosion in crazy theories as a pretty understandable, if not reasonable, response to being totally destroyed culturally and shut out of important institutions by the left. With all of the recent censorship on social media sites etc., how can we as leftists expect right-wingers not to get radicalized into conspiracy theories?

I mean, if you look at the recent social media/alphabet agency collusion, the actual deep state was literally cooperating with social media companies to censor the public. That is what we know, of course the low-IQ crazies are going to believe even worse things.

It's a shame that a couple decades ago, this type of deep state critique would be firmly in the far-left camp. Leftists have been using this conspiratorial type of critique against the Western order for the better part of a century or more to radicalize their base, albeit usually with some more qualifiers and generally sane positions. Unfortunately, the right wing flipped the script on us, and weren't able to control the crazy side of the conspiratorial minded base.

If we have to blame someone, I blame leftists for allowing one of the best social movements in human history to be captured by milquetoast wokists and clear grifters. I'm not saying the right wing is innocent, but your post reads like blaming a dying bear for lashing out at its killers.

Has anyone else gone through a period in adult life where you realize you've kind of forgotten how to actually have fun?

I'm not talking about just zoning out to a video game, but joyous laugh-out-loud relaxing fun. For me I feel I've gotten so bogged down with job issues, health issues, and planning for the future that even when I carve out 'free time' I never fully relax and just have fun.

Anyone relate? Or have stories on how they got out of such a mode?

The future of AI is likely decided this week with Sam Altman's Congressional testimony. What do you expect?

I expect nothing to happen for another few years, by which time it's too late. As @2rafa mentioned below, I'm convinced AI research and development is already far ahead of where it needs to be for AGI in the next couple of years. Given the US's embarrassing track record of trying to regulate social media companies, I highly doubt they'll pass an effective regulation regime.

What I would expect, if something gets rushed through, is for Altman and other big AI players to use this panic the doomers have generated as a way to create an artificially regulated competitive moat. Basically the big players are the ones who rushed in early, broke all the rules, then kicked the ladder down behind them. This is a highly unfortunate, but also highly likely future in my estimation.

It's ironic that we've entered into this age of large networks and systems, yet with the rise of AGI we may truly go back to the course of humanity being determined by the whims of a handful of leaders. I'm not sure I buy the FOOM-superintelligence arguments, but even GPT-4 optimized with plug-ins and website access will be a tsunami of change over the way we approach work. If there are more technical advancements in the next few years, who knows where we will end up.

What annoys me most is that this doomer rhetoric lets politicians act like they're doing something - stopping the AI companies from growing - when in reality they need to face the economic situation. Whether it's UBI, massive unemployment benefits, socialized housing, or whatever, our political class must face the massive economic change coming. At this rate it seems neither side of the aisle is willing to take seriously the idea that AI will disrupt the workforce; instead they prefer to argue about the latest social issue du jour. This avoidance of the economic shocks coming in the next five years or less is deeply troubling in my view.

Great comment, reported for quality contribution.

The problem isn't guns, the problem is that there are millions of disaffected people living in a country founded on the idea of individual human rights. That works when the people are hyper-invested in their families and the future that they'll be living in; that doesn't work when everybody is depressed and hates each other.

This is an underrated point. So many times when societal ills come up in rationalist discourse, people hand wave away, are ignorant of, or flat out ignore the fact that at least 1/5 (probably more in actuality) of the US population is depressed or has some mental disorder. Even given that our modern psychiatric framing is largely faulty, this mass wave of disaffectedness means that traditional solutions, things that worked for our forefathers, need rethinking.

Even though past societies had plenty of times of upheaval, they had different ways of fixing things. Revolutions, massive aligned religions, cultural processes and holidays like the Roman Saturnalia which acted as a pressure release valve for hierarchical resentment. We've been increasingly preventing a release of the pressure, and it will only get worse as we continue to do nothing.