TheDag

Per Aspera ad Astra

4 followers   follows 12 users   joined 2022 September 05 16:04:17 UTC

No bio...

User ID: 616

Some of you may have read Scott Alexander’s recent post, Book Review: The Geography of Madness. A quick summary in a couple of paragraphs:

A culture-bound mental illness is one that only affects people who know about it, and especially people who believe in it. Often it doesn’t make sense from a scientific point of view (there’s no such thing as witches, and the penis can’t retract into the body). It sometimes spreads contagiously: someone gets a first case, the rest of the village panics, and now everyone knows about it / believes in it / is thinking about it, and so many other people get it too.

Different cultures have their own set of culture-bound illnesses. Sometimes there are commonalities - many cultures have something related to the penis or witches - but the details vary, and a victim almost always gets a case that matches the way their own culture understands it.

THESE PEOPLE ARE NOT MAKING IT UP. I cannot stress this enough. There are plenty of examples of people driving metal objects through their penis to pull it out of their body or prevent the witches from getting it or something like that. There is no amount of commitment to the bit that will make people drive metal objects through their penis. People have died from these conditions - not the illness itself, which is fake, but from wasting away worrying about it, or taking dangerous sham treatments, or getting into fights with people they think caused it. If you think of it as “their unconscious mind must be doing something like making it up, but their conscious mind believes it 100%,” you will be closer to the truth, though there are various reasons I don’t like that framing.



The thrust of Scott’s argument is that humans have an amazing propensity to change their subjective experience based on their beliefs. Here, I'm not talking about rationally held or carefully reasoned beliefs, but deep-seated beliefs that aren’t easy to change, even if you know for a fact they're irrational. Typically, these beliefs seem to be formed through social or cultural channels, and once formed, they can be very difficult to change unless your cultural narrative also changes.

This idea ties into other work on the placebo effect and the ways it has shaped our culture - for instance, John Vervaeke’s take on shamanism. The basic idea is that shamanism was highly advantageous from an evolutionary perspective because it allowed groups of humans to harness the placebo effect to overcome illness and manage social problems.

In short, despite our culture’s rational pretensions, our irrational beliefs have extremely strong effects on our perception of pain and other subjective experiences. An important nuance, however, is that no culture-bound disorder is 100% ‘in your head’; on the contrary, these disorders are very real and can have strong physical effects.

Some of the big examples that Scott gives, and some I think might be (mostly) culturally mediated, are:

  • Anorexia

  • Post-traumatic stress disorder

  • Anxiety

  • Depression

  • Gender dysphoria

  • Chronic pain

  • TikTok Tourettes

  • Long Covid

Now, based on the bent of this forum, many people might be tempted to jump on the gender dysphoria issue. While it’s certainly a loud and vibrant battle in the culture war, I’d ask that we instead focus on other problems. In my opinion, if this thesis holds true, then gender dysphoria is a red herring.

The evidence clearly suggests that we are inflicting massive amounts of pain and suffering on ourselves through our cultural beliefs and practices. The fact that so many of our cultural problems - from overdose deaths and suicides to chronic pain and crippling anxiety - are unforced errors is truly shocking.

Think about it - one fourth of the adult U.S. population experiencing chronic pain? That's a staggering number, and it seems to stem largely from our being conditioned to believe that our pain must have an acute physical cause. We've been taught to view pain as something that must be cured with medication or surgery, when in fact many cases of chronic pain can be alleviated by simply changing our beliefs about it.

Taken together - massive numbers of overdose deaths and suicides, a quarter of adults in chronic pain, crippling anxiety causing young people to retreat from society, and many more issues - these are clear unforced errors. We are inflicting this pain on ourselves.

If this theory is true it may very well be one of the most important and impactful frameworks with which to view the issues of post modernity. We wouldn’t need endless medications or miraculous scientific breakthroughs - we could already have the power to end massive amounts of truly pointless suffering.

ETA: is another perfect example of this type of illness.



From a personal perspective, I can attest that this theory confirms my priors. I’ve dealt with chronic pain for a decade and have long suspected that it was mostly psychosomatic. Even with this realization, it is a difficult battle to fight. Ironically, support groups where people confirm and commiserate seem to make the issue worse. In fact, many modern studies on pain recommend not even using the word "pain" and replacing it with something else to trick your mind into understanding that your pain doesn’t have an acute physical cause.

So many of us in the rationalist community focus on object-level reasons as to why our society may be stagnating or why we have so many cultural problems. At the end of the day, it turns out that our beliefs themselves may be throwing us into a twisted, absurd, and horrific self-fulfilling prophecy.

It may be time to stop assuming that the causes of our problems originate directly from the outside world and update to a view that many more major problems could be solved if we simply change our cultural beliefs.

I'm curious how the Motte sees using AI for therapy / life advice. Online I'm seeing a ton of people recommend Claude especially, but others are skeptical.

On the one hand, I could see it being useful because you have nigh-unfettered access to it and can really dig into deep problems. It's also trained on all the therapy texts, of course.

The other, more culture-war issue is that, due to the way RLHF works, these models will likely push one ideological lens over another - especially on deep topics like morality, relationships, casual sex, etc.

Overall I think it's a fascinating area of development, and I'm still optimistic that LLMs could help people much more than the average therapist does - mainly because I'm pretty bearish on the help people actually get from the average therapist.

Anyway, what do people think about therapy becoming AI?

To bring up another post from last week, I'm going to go ahead and repost @justcool393's piece on the Sam Altman/OpenAI/Microsoft situation, since she posted it a few hours ago and right before the last thread went down.

Here's her writing:


Another day, another entrant into the OpenAI drama. Emmett Shear is the new interim CEO of OpenAI.

I don't know why it was surprising to people that Sam wouldn't come back. The company was meant to be subservient to the nonprofit's goals, and I'm not sure why it was apparently "shocking" that the attempted coup from Sam's side (the effectively false reporting that Sam Altman was to return as CEO) failed.

The OpenAI board has hired Emmett Shear as CEO. He is the former CEO of Twitch.

My understanding is that Sam is in shock.

https://twitter.com/emilychangtv/status/1726468006786859101

What's kinda sad about all of this is how much people were yearning for Sam Altman to be the CEO, as if he isn't probably one of the worst possible candidates. Maybe this is just a bunch of technolibertarians on Twitter or HN who think the ultimate goal of humanity is how many numbers on a screen you can earn, but the amount of unearned reverence toward a VC tapped to lead the company is amazing.

In any case, here's to hoping that Laundry Buddy won't win out in the rat race for AGI, lest we live in a world optimized for maximum laundry detergent. Maybe we'll avoid that future now with Sam's departure.

Anyway, I'll leave this here to munch on, which I found in the HN thread:

Motte: e/acc is just techno-optimism, everyone who is against e/acc must be against building a better future and hate technology

Bailey: e/acc is about building a techno-god; we oppose any attempt to safeguard humanity by regulating AI in any form. Around and around and around.

https://twitter.com/eshear/status/1683208767054438400


I'm reposting here because I'm convinced, like many other residents, that the ongoing drama over who controls AI development has far-reaching implications, likely on the scale of major-power geopolitical events - if not ultimately even greater.

To add a bit to the discussion to justify reposting - I think many of these discussions around AI Safety versus Accelerationism are extremely murky because so many people in secular, rationalistic circles are extremely averse to claiming religious belief. It's clear to me that both AI Safety and Accelerationism have strong themes of classical religion, and seem to be two different sects of a religion battling it out over the ultimate ideology. Potentially similar to early Orthodox Christians versus Gnostics.

Alternatively, @2rafa has argued that many of the E/Acc (effective accelerationism) crowd comes from bored technocrats who just want to see something exciting happen. I tend to agree with that argument as well, given how devoid of purpose most of the technocratic social world is. Religion and religious-style movements tend to provide that purpose, but when you are explicitly secular I suppose you have to get your motivation elsewhere.

We've also got the neo-Luddites like @ArjinFerman, who just hate AI entirely and presumably want us to go back to the mid-90s with the fun decentralized internet. Not sure - I haven't actually discussed it with him. I can actually agree with some of the Luddism, but I'd argue we need to go back to 1920 or so and ban all sorts of propaganda, mass media, and advertising.

Anyway, clearly the technological battle for the future of our civilization continues to heat up. The Luddites seem out, but may have a surprising last-hour comeback. The woke/left-leaning folks seem to be firmly in charge, though the OpenAI scandal points to trouble in the Olympian heights of Silicon Valley's AI decision makers.

Will the Grey Tribe use AGI to come back and finally recover the face and ground it has lost to the advancing SJW waves? Who knows. I'm just here for the tea.

This excellent piece on age segregation has got me thinking about how serious and pervasive this problem is. As the author states:

Young adults are afraid to have children, because they can’t possibly imagine adding some to the life they currently have. New parents are isolated from most of their previous friends, as their paths suddenly never cross again unless they too have kids of their own. Children compete within their age group at schools, never having a chance to either mentor someone or have an older mentor themselves. Teenagers have no idea what to do with their lives, because they don’t know anyone who isn’t a teacher or their parent. And everyone is afraid of growing old because they think that the moment they stop going to the office they’ll simply disappear.

As discussed in @2rafa's post downthread, a major issue of the fertility crisis is a lack of time. Another issue it seems is a lack of even interacting with children unless you have some yourself, or have some in your family. I wonder if the lack of time among young adults in the West is causative of this age segregation?

Regardless, it likely has its roots in the K-12 education system. It's profoundly unnatural from a cultural standpoint to only be in the same peer group as people right around your age. I'm convinced it's unhealthy, and it predisposes us in a massive way to only socialize with people close to our age.

Do you think age segregation is an issue as well? If not, why not?

In a similar vein to @Pasha’s thread below, what are your favorite RPGs?

Divinity Original Sin 2, Disco Elysium and TES: Oblivion stand out to me.

Circling back to the FTX crypto fiasco, Noah Smith has a new piece out theorizing about what happens if crypto just dies. Unfortunately it's paywalled, so I couldn't read the whole thing, but I have been wondering this myself. More and more crypto exchanges have been dying off, especially over the last couple of years. FTX seemed to be one of the last exchanges embodying the spirit of crypto, and now it's gone.

By spirit of crypto, I mean the original cypherpunk, decentralized idea of a currency that operates outside the bounds of the State. As @aqouta and others have mentioned, big exchanges like Binance are actually more centralized, and if they continue to grow they can easily be incorporated as organs of the existing State apparatus.

As someone who has always been wary of censorship and centralized power, especially in light of the recent escalation in terms of woke social norms being shoved down everyone's throat, this is troubling to me. Both the right and left seem to care less and less about the overreach of the powers that be - in fact folks like Tyler Cowen think that the main difference in the 'New Right' is that they are more trusting of elites.

I'm curious for takes on either side of this issue. If you don't think we have anything to worry about with regards to State power, why is that so? Do you just think that with the rise of the internet/technology States are impotent, or is centralized power a good thing?

If you disagree with the premise above, then how can we work to push back on centralization? Especially with the rise of powerful tools like LLMs and other AI, is there any hope for the individual classical-liberal ethos to survive the next century?

Neuralink has caused a bit of a storm on X, taking off after claiming that three humans have what they call "Telepathy":

Today, there are three people with Telepathy: Noland, Alex, and Brad.

All three individuals are unable to move their arms and legs—Noland and Alex due to spinal cord injury (SCI) and Brad due to amyotrophic lateral sclerosis (ALS). They each volunteered to participate in Neuralink’s PRIME Study,* a clinical trial to demonstrate that the Link is safe and useful in the daily lives of people living with paralysis.

Combined, the PRIME Study participants have now had their Links implanted for over 670 days and used Telepathy for over 4,900 hours. These hours encompass use during scheduled research sessions with the Neuralink team and independent use for everyday activities. Independent use indicates how helpful the Link is for real-world applications and our progress towards our mission of restoring autonomy. Last month, participants used the Link independently for an average of 6.5 hours per day

Assuming this is all true and the kinks will be worked out relatively soon, this is... big news. Almost terrifyingly big news.

AI tends to suck in most of the oxygen around tech discourse, but I'd say, especially if LLMs continue to plateau, Neuralink could be as big or even bigger. Many AI maximalists argue, after all, that the only way humanity will be able to compete and keep up in a post-AGI world will be to join with machines and basically become cyborgs through technology like Neuralink.

Now I have to say, from a personal aesthetic and moral standpoint, I am close to revolted by this device. It's interesting and seems quite useful for paraplegics and the like, but the idea of a normal person "upgrading" their brain via this technology disturbs me greatly.

There are a number of major concerns I have, to summarize:

  • The security/trust issue of allowing a company to have direct access to your brain
  • Privacy issues - other people hacking your Link and being able to see all of your thoughts, etc
  • "Normal" people without Neuralinks being outcompeted by those willing to trade their humanity for technical competence
  • LLMs and other AI systems being able to directly hijack human agents, and work through them
  • Emotional and moral centers in the human brain being cut off and overridden completely by left-brained, "logical" thinking

Does this ring alarm bells for anyone else? I'd imagine @self_made_human and others on here are rubbing their hands together with glee, and I have to say I'd be similar a few years back. But at the moment I am, shall we say... concerned with these developments.

I'm reposting a top level comment that got posted yesterday, because I don't think it got enough discussion.

Historic flooding in South Carolina, North Carolina, and Tennessee. Whole towns washed away. People retreating to their attics as water levels rise. People losing everything.

Tragic. Horrific. But this is the Culture War thread so I am going to ask the insensitive question, what does this mean for the election in <40 days?

My first thought is that there is a certain irony that these states are among those that just limited the forms of ID allowed at a voting booth. Someone who has lost their house is less likely to have all their documentation, and getting new copies will take longer than the time before the election.

Rural areas that were wiped out will have a harder time finding their polling location under the mud and timber. Mail-in voting will be difficult without a mailbox.

People are going to watch the Biden-Harris administration to see how they respond.

Do these factors make it more likely for these swing states to turn Blue or Red? Buncombe County, one of the hardest hit, went 60% for Biden in 2020.

I want to add, as I commented, that I am beyond frustrated by the federal response. Biden just put out a press release today, over 5 days after it happened, and he just sounds so tired and apparently has a cold. That being said, he says a lot of the right things, even if the delivery is bad.

But why has it taken so long to get things out there? As another commenter said, why aren't planes and helicopters air dropping supplies and Starlink in?

And if there has been a major relief effort, where is the news on it? The left controls pretty much the entire mainstream media, so where are the videos of airlifted supplies? I know internet is out, but people on X have shared plenty of videos of waters washing away homes etc.

I'm just shocked at how poor the response has been, though I guess I shouldn't be. Also man, those videos really are something to watch. Reminds us how fragile things really are if Nature decides to make us face her wrath.

Grok 3 just came out, and early reports say it’s shattering rankings.

Now there is always hype around these sorts of releases, but my understanding of the architecture of the compute cluster for Grok 3 makes me think there may be something to these claims. One of the exciting and interesting revelations is that it tends to perform extremely well across a broad range of applications, seemingly showing that if we just throw more compute at an LLM, it will tend to get better in a general way. Not sure what this means for more specifically trained models.

One of the most exciting things to me is that Grok 3's voice mode allegedly understands tone, pacing, and intention in conversations. I loved OpenAI's voice assistant until it cut me off every time I paused for more than a second. If Grok 3 is truly the first conversational AI, it could be a game changer.

I’m also curious how it compares to DeepSeek, if anyone knows more than I do.

With SpaceX's Starship having finished its static fire tests, they will soon be gearing up for the first orbital launch. So far, space travel and industry have avoided getting polarized (although Musk has gotten some frankly ridiculous hit pieces over the whole Ukraine Starlink fiasco), but I don't expect this to continue as it gets cheaper and easier to send things to and from space.

If you look at the cost per metric ton for space travel right now, it's around $11.3 million/ton. That means that if you want to get a ton of material into space, you're shelling out quite a bit. This limits space endeavours to major governments or multinational corporations for the most part.

According to Musk, Starship will be able to lower the cost to only $20,000 per metric ton - a reduction of more than two orders of magnitude. Now, I'm not super optimistic this number will be hit anytime soon, but if it is, it will usher in a new era for space and technology.
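To put the claimed reduction in perspective, the ratio works out like this (both figures come from the post itself; the $20,000/ton number is Musk's projection, not an achieved price):

```python
import math

# Cost to orbit per metric ton: today's rough figure vs. Musk's Starship target.
CURRENT_COST_PER_TON = 11_300_000  # USD (figure from the post)
STARSHIP_COST_PER_TON = 20_000     # USD, Musk's claimed target (from the post)

ratio = CURRENT_COST_PER_TON / STARSHIP_COST_PER_TON
orders_of_magnitude = math.log10(ratio)

print(f"{ratio:.0f}x cheaper")                            # 565x
print(f"~{orders_of_magnitude:.1f} orders of magnitude")  # ~2.8
```

So "multiple orders of magnitude" cashes out as a roughly 565x price cut, if the target is ever hit.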

My question is - how does this play into the Culture War? Musk has been increasingly right-coded, but it also seems like space and 'moonshots' have long been a darling of the left. On top of this, there's a strong nationalist angle if we can get and maintain an edge on Russia/China in space industry.

I'm curious if anyone else has more fleshed out ideas on this topic, in terms of how space industry will affect the Culture War. Or do most of y'all think this is a non-starter and nobody will care about space in 5-10 years?

I agree! And yes I know @2rafa has been beating that drum. I also think we should easily be able to save tens of billions by addressing the waste in these programs, or making things more efficient.

If we could uplift dogs or other animals to human-level intelligence, should we? There's a Rick and Morty episode where they uplift the intelligence of a pet dog, who goes on to uplift all other dogs and lead a rebellion against humans. The idea is that if humans and dogs had the same level of intelligence, the way we treat dogs would be inhumane.

While I agree with the statement above, obviously there's a vast gulf between the intelligence of humans and every other form of life we know of. If we accept the premise that once we have the technology to uplift dogs we not only must treat them differently but have a moral imperative to uplift all dogs, we get into dangerous territory. Essentially, as soon as we become capable of increasing the intelligence of any form of life, we must. This sounds like a bad idea, especially since at a certain point we may lose out to other species that have evolved to kill us (mosquitos, and insects generally, which are hard to kill but venomous or disease-carrying enough to kill us).

Assuming uplifting other species is even possible, I wonder how this would change our outlook on how we treat animals today. Personally, I am not a vegetarian and am totally fine with pets, although I do find the state of factory farming today pretty dismal and think we should treat the animals we eat better. As someone's flair here says, how long will we last until we have to become outright speciesist?

Not sure if folks here keep up with crypto much, but over the weekend FTX had a liquidity crisis and agreed to sell to Binance. This is pretty huge news - FTX was one of the bigger crypto exchanges, known for buying out other flailing firms during their crises. This may lead to a larger spiral within the crypto economy. @aqouta, curious for your take here.

Also, as some folks here may know, Sam Bankman-Fried's FTX wealth makes him one of the three major funders of the Effective Altruism movement. Given the circumstances of this bailout, it's likely that FTX was sold for an incredibly small amount - if Binance hadn't helped them with the liquidity crisis, they almost certainly would've fallen to $0 value. Unfortunately, this means the money EA has been pledged/receiving from SBF is going to dry up. I'm curious to see if the EA movement can weather this storm, as they have been growing rather aggressively and it looks like they've been betting on this funding being in place for a long time.

Time to add some wild speculation - Changpeng Zhao, the CEO of Binance, is Chinese. Now that Binance owns FTX, they are clearly the dominant player in the crypto space, or at least well positioned to become the dominant exchange. I wonder if this shift will cause China to reconsider its decision to make crypto illegal? Or is it too much of a risk to state power?

Update: This definitely seems like a coordinated attack. Apparently CoinDesk released an article slamming FTX’s native token, and then Binance pulled out their entire stake. Without those two events, I'm not sure this would’ve happened.

Looks like the Supreme Court is finally getting around to challenging affirmative action. Of course we don't know what the ruling will be, but with the decisions so far I'm hopeful they strike down AA, or at least put a dent in it.

I'm surprised this isn't a bigger deal, as I haven't heard much buzz about it from my liberal friends. According to the article, 74% of Americans don't believe in using race as a factor in college admissions (although that question and whether Affirmative Action should be struck down likely have far different approval rates). It may be a Roe situation, where they really don't care until one of their sacred cows is gored, because they believe in their own invincibility. I'm curious: if AA does get struck down, will we see the same reaction as with Roe?

I'm sure some people will be upset, but do you think liberal states will start changing their constitutions to allow race filtering for college admissions? Or is the political will for AA just gone on both sides of the aisle?

I just read a short article in an email newsletter that threw out this statistic with regards to automation in the food industry:

Between March and July 2022, an average of 760,000 people quit jobs in accommodation and food service

The article goes on to argue that, due to all of the ‘quiet quitting’ and generally unsatisfied workers since the pandemic, automation will not be as big of a deal as we thought. I’ve seen this sentiment echoed a number of times recently, with news outlets claiming that people worried about economic disruption from robotics and artificial intelligence don’t realize it’ll actually be great, because people hate working anyway.

I used to believe these claims when I was a disillusioned young adult who hated working, but over time I’ve gotten more and more skeptical. Many people I know take serious pride in their work; in fact, for a lot of people their work is the most important thing in their life. I’m talking about people who don’t even really need the money, or who claim that even if they had enough money to retire they would continue working just as much as they do now.

Is this recent trend of less engagement with work robust enough to offset the rise in automation of jobs? Or is this just cope from those who know their jobs will disappear soon (i.e. email newsletter writers)?

Personally, I’m surprised that artificial intelligence hasn’t gotten more flak than it has so far. I expected the Luddites to come out in full force and at least get some sort of ban on image generation (I know Getty or some other site has done this), but so far artificial intelligence seems generally unopposed.

Any major salient examples of automation technology or artificial intelligence being banned to protect jobs?

Given that nobody was brave enough to submit a top level comment, I figured I would start off the games this week.

I'd like to talk about the Epstein files, and general pedophilic corruption amongst the world's elite. I wasn't that surprised that Trump didn't end up releasing the full document, given his history of promising things will be declassified (like the JFK files) and then failing to deliver. (I still want to see those JFK files dangit.)

I have to admit though, I have trouble understanding exactly how this conspiracy works. I would imagine that if there were a genuine sex cult with underage teenagers, demon worship, etc. going on amongst the world's elite, we would have at least some hard evidence - some video footage or audio recording to point at, no? Common wisdom about conspiracies in rationalist circles is that it's too difficult to prevent leaks, especially over decades and with thousands of people involved. When the stakes get this high, I would imagine it's even harder.

Then again, if the files aren't all that bad, why the heck aren't the files released? Why hasn't anything changed after the investigations? I've always been skeptical of the 'kookier' conspiracy theories about Freemason Illuminati cults ruling the world, but I have to admit after this, the UFO kerfuffle in Congress, and just general seemingly bonkers decisions about classified info, I am quite confused as to what a reasonable explanation is.

Anyway, what are your thoughts Mottizens?

I'm going to shamelessly steal @Scimitar's post from the Friday Fun thread because I think we need to talk about LLMs in a CW context:


A few months ago OpenAI dropped their API price, from $0.06/1000 tokens for their best model, to $0.02/1000 tokens. This week, the company released their ChatGPT API which uses their "gpt-3.5-turbo" model, apparently the best one yet, for the price of $0.002/1000 tokens. Yes, an order of magnitude cheaper. I don't quite understand the pricing, and OpenAI themselves say: "Because gpt-3.5-turbo performs at a similar capability to text-davinci-003 but at 10% the price per token, we recommend gpt-3.5-turbo for most use cases." In less than a year, the OpenAI models have not only improved, but become 30 times cheaper. What does this mean?

A human thinks at roughly 800 words per minute. We could debate this all day, but it won’t really affect the math. A word is about 1.33 tokens. This means that a human, working diligently 40 hour weeks for a year, fully engaged, could produce about: 52 * 40 * 60 * 800 * 1.33 = 132 million tokens per year of thought. This would cost $264 out of ChatGPT.

https://old.reddit.com/r/singularity/comments/11fn0td/the_implications_of_chatgpts_api_cost/

...or about $0.13 per hour. Yes, technically this overlooks the fact that OpenAI charges for both input and output tokens, but it's still cheap, and the line is trending downwards.

Full-time minimum wage is ~$20k/year. GPT-3.5-turbo is 100x cheaper and vastly outperforms the average minimum wage worker at certain tasks. I dunno, this just feels crazy. And no, I won't apologize for AI posting. It is simply the most interesting thing happening right now.
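Scimitar's arithmetic checks out. Here's a quick back-of-the-envelope sketch - the words-per-minute rate, tokens-per-word ratio, and price are all assumptions taken from the quoted post, not measured figures:

```python
# Sanity-check of the quoted cost-of-thought estimate.
WORDS_PER_MIN = 800          # assumed human "thinking" rate (from the post)
TOKENS_PER_WORD = 1.33       # rough tokenization ratio (from the post)
PRICE_PER_1K_TOKENS = 0.002  # gpt-3.5-turbo price in USD (from the post)

work_minutes_per_year = 52 * 40 * 60  # 40-hour weeks, all year: 124,800 minutes
tokens_per_year = work_minutes_per_year * WORDS_PER_MIN * TOKENS_PER_WORD
cost_per_year = tokens_per_year / 1000 * PRICE_PER_1K_TOKENS
cost_per_hour = cost_per_year / (52 * 40)

print(f"{tokens_per_year / 1e6:.0f}M tokens/year")  # ~133M
print(f"${cost_per_year:.0f}/year")                 # ~$266
print(f"${cost_per_hour:.2f}/hour")                 # ~$0.13
```

The small gap between this and the quoted $264 is just the post rounding 132.8M tokens down to 132M before pricing it.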



I strongly agree with @Scimitar: this is the most interesting thing happening right now. If you haven't been following AI/LLM progress over the last month, it has been blazingly fast. I've spent a lot of time in AI doomer circles, so I have a layer of cynicism around people talking about the Singularity, but I'll be damned if I'm not starting to feel a bit uncomfortable that they may have been right.

The CW implications seem endless - low skill jobs will be automated, but which tribe first? Will HR admins who spend all day writing two emails be the first to go? Fast food cashiers who are already on their way out through self ordering consoles?

Which jobs will be the last to go? The last-mile problem seems pretty bad for legal and medical professionals (i.e. if an LLM makes up an answer it could be very bad) but theoretically we could use them to generate copy or ideas then go through a final check by a professional.

Outside of employment, what will this do to human relations? I've already seen some (admittedly highly autistic) people online saying that talking to ChatGPT is more satisfying than talking to humans. Will the NEET apocalypse turn into overdrive? Will the next generation even interact with other humans, or will people become individualized entirely and surround themselves with digital avatars?

Perhaps I'm being a bit too optimistic on the acceleration, but I can't help but feel that we are truly on the cusp of a massive realignment of technology and society. What are your thoughts on AI?

Alright I want to talk about nuclear.

Ever since I studied it in high school I've been into nuclear, and shocked as to why we don't build more of it. Trump and Elon discussed nuclear energy in their recent conversation, and JD Vance apparently endorsed it during the VP debate.

Let's say Trump wins the election - what are the odds his administration actually gets some new reactors built, or at least started? I'd like to say I'm optimistic, but given the US track record for building things it's hard to believe it could actually happen.

People who are more familiar with the process of building these things, please let me know - what chances do we have?

Everyone else - how do you feel about nuclear energy? Are you surprised it's finally a CW topic?

EDIT: as a commenter mentioned, this discussion is happening in Europe as well.

Circling back around to the topic of space exploration, this article by Palladium on the reasons for exploring space brings up an interesting shift in how geopolitical justifications have been made over the last hundred years or so.

The main thrust of the article hypothesizes that there may never be a truly strong economic or political incentive to push space travel. I'm not necessarily convinced this is the case, but I agree that most people who try to justify going to space in those terms are fighting a losing battle. Even if we do stand to gain massively from an economic perspective by pioneering various space initiatives, the timescale for any reasonable returns is in the hundreds of years. Not something that will motivate people to come out to the ballot box anytime soon.

What's really fascinating about the conclusion, however, is that the article points out in excellent prose something I hadn't really grasped before:

Modern governments are often wrongly derided for lacking vision. In fact, they are already committed to multi-trillion-dollar, multi-decade-long visions that require all of society, technology, and world geopolitics to be back-engineered accordingly. The U.S. government, for example, spends half its budget on social welfare programs, especially for the elderly. We take for granted that this is unremarkable, when in fact it is extremely historically unusual and a reflection of our deep commitment to a certain kind of post-industrial society that existentially values comfort and individuality.

While it's debatable whether the modern welfare state and social security in Western countries really qualify as a 'vision' of the future, it's absolutely true that the massive social engineering projects we have going on nowadays are far more ambitious and far more expensive than any of the space initiatives that have been proposed so far. The discrepancy is to the tune of multiple orders of magnitude.

The article rightly points out that the only things that ever motivate people to enact these massive governmental projects are social, religious, or emotional goals. Despite all of our fancy rhetoric, humanity as a whole is nowhere near rational in its large-scale decision making. This is a fundamental flaw when most rationalists or philosophers try to create policy prescriptions - they lay out a beautiful argument, but fail to give any reasons that will truly motivate people to follow it.

I'll let the article conclude itself:

The expansion of human civilization to other stars will not be pioneered by lone adventurers or merry bands of hardy explorers, like we imagine the voyages of Erik the Red or Christopher Columbus. This works for interplanetary space, but not interstellar space, whose travel time will require multiple generations of people to survive a journey, including on the first try. Interstellar travel will need to accommodate not just adventurous young men with nothing to lose, but also women, children, and the elderly. In other words, a whole society. The existence of a society always implies the existence of a government.

More importantly, the sociological challenge of persuading a whole society to migrate into the unknown is very different from that of an explorer’s mission, which needs only the promise of adventure. Like the ancient Israelites, the Pilgrims, or the Mormons, a great migration will only occur when a Promised Land has been credibly found. Indirect evidence of extrasolar planets will never be enough. Whether with colossal space telescopes or ultra-fast nano-probes armed with cameras, we will need to have beautiful images and real maps of alien worlds before human civilization can become interstellar. The purpose of interplanetary expansion is to build the infrastructure and technology to make such scopes and probes feasible. These will be our cathedrals, the legacy which we will leave to our descendants.

VisionOS and the Future of Input



Ever since the computer first arrived, keyboard and mouse has been the standard. You have a flat surface with raised little squares that you smack with your fingers. You have another little rounded shape with a flat bottom you move around, and click with.

This awkward, clunky interface has significant culture war elements, in that an entire class of powerful people arose - specifically people who didn't have traditional status markers like height, strength, or indomitable physical presence. Instead, these 'nerds' or 'geeks' or whatever you want to call them specialized themselves in the digital realm. Now the Zuckerbergs and Musks of the prior generation rule the world. Or if they don't, they soon will.

These outdated interfaces seem perfectly normal to everyone who has only used them. Sure, many people have used a controller for video games, and may think that controllers are superior for some cases but not others. Keyboard and mouse is the only way to operate a computer, most people surely imagine.

That being said, it's actually quite easy to dip your toes into alternate input methods. Talon is a system that utilizes voice to let you do practically anything on a computer. You can move the mouse, click on any object on your screen, dictate, edit text, and you can even code quite well. Talon's system even supports mapping operations, sometimes very complex ones, to custom noises you record on your own.

On top of that, you can integrate eye tracking using a relatively inexpensive device. If you've ever used voice control combined with eye tracking, you know you can operate about as fast as someone who is decent with a keyboard and mouse.

If you have ever used these systems, you probably know that because most digital setups are built for keyboard and mouse, the experience isn't perfect. Keyboard and mouse still hold the crown.

But. There is a certain magic to controlling a computer through your voice, or your eyes. It begins to open your mind to new possibilities, the idea that there are better, faster, easier, more natural ways of interfacing with a computer than the defaults we have been stuck with.



Enter Apple's VisionOS.

If you haven't seen the recent demo of Apple's new VisionOS, they're breaking brand new ground. The entire OS is built around looking at things and making minute hand motions to control the icons you're looking at. There are no controllers, no physical interfaces whatsoever besides your eyes and your hands. It's breathtaking to watch.

In a review from John Gruber, a well-respected Apple commentator and the creator of Markdown, the possibilities behind this new technology are apparent. Gruber writes:

First: the overall technology is extraordinary, and far better than I expected. And like my friend and Dithering co-host Ben Thompson, my expectations were high. Apple exceeded them. Vision Pro and VisionOS feel like they’ve been pulled forward in time from the future. I haven’t had that feeling about a new product since the original iPhone in 2007. There are several aspects of the experience that felt impossible.

Now, Apple does tend to get a ton of hype, but this reaction of being amazed by the experience is surprisingly common among early reviewers:

Similarly, Apple’s ability to do mixed reality is seriously impressive. At one point in a full VR Avatar demo I raised my hands to gesture at something, and the headset automatically detected my hands and overlaid them on the screen, then noticed I was talking to someone and had them appear as well. Reader, I gasped.

The implications of this 'spatial operating system' are varied and multitudinous, of course. There will be all sorts of productivity gains, new ways of interacting with the digital world, and fun new apps. However, I'm most interested in how this innovation could shift the balance of power back to the strong and physically capable, away from the nerds.

No longer will clunky interfaces make sense - instead, computers will be optimized around healthy, fully functional humans. Ideally, the most intuitive and common control schemes will reward physical fitness and coordination - traits which nerds sorely lack.

Will we see a reversal of the popularity that being a nerd or geek has gained in the past few decades? Only time will tell.

I made a top-level comment here a couple of weeks ago that tried to outline some of the major updates on the Georgism discussion in the ratsphere.

(Editing for less strawmanning.) I think that a lot of the problem is that Georgism strikes at the heart of fundamental value differences for folks. Many people seem to equate Georgism with Communism, or redistribution of wealth, which I don't find convincing.

For instance:

@bnfrmt:

LVT is equivalent to the state seizing all land, and renting it back at market rates; it's expropriation on a massive scale.

@Brannigan:

Georgism at heart is about identifying what is often the most precious possession a person can have, that most of the middle class has spent 30 years of their lives working to pay off, to render to their posterity, and stealing it from them despite the fact that they haven't really done anything wrong.

@laxam

"We know better than you how you should use your land", is roughly analogous to, "We know better than you what you should put in your body".

@Westerly

This strikes me as rationalists rationalizing their own class self-interest. The same way EA just so happens to only support democrat politicians, rationalism coincidentally just so happens to work out extremely well for the types of people that are rationalists. Easy to be YIMBY when you are 25 and living in a rented apartment in San Francisco.

@naraburns

My concern with LVT is that I regard most kinds of property tax (as well as income tax) as fundamentally immoral

@The_Nybbler

Still low-effort is "it's communism, but only with land". But given how bad communism has turned out, I think it's sufficient. The Georgist LVT is equivalent to the government owning all the land and leasing it out to the highest bidder.

@MeinNameistBernd

Frankly advocating "georgism" is the "break out the guillotines" limit for me, because the victims are my people and the perpetrators are /r/neoliberal vampires.

These are not cherry picked responses - all of these had at least 10 upvotes, and in many cases 25+.

Some of the responses were less charitable - such as people literally calling me a vampire (and getting 15+ upvotes) for arguing for a type of land reform - which has led to me getting heated on this topic.

Resurfacing another old comment from @functor about Conservatism as anti-ideology. I think it's interesting to reflect back on now that we're in Trump 2.0:


Keith Woods says it better than me

Conservatism as Anti-Ideology

There was much debate online recently over the political beliefs of country music singer Oliver Anthony. Anthony captured the hearts of conservatives with his “Rich Men North of Richmond”, which took aim at out of touch fatcat Yankees who have abandoned people like him. At first there was no question to conservatives, Anthony was definitely one of them. After all, he railed against welfare queens, taxes, and complained about elites not relating to regular folk. Anthony did alienate some of his newfound following when an interview of him appeared where he affirmed the “diversity is our strength” mantra. Then the first question at the first of this year's Republican Party primary debates was the hosts asking the field for their interpretation of Anthony's masterpiece, to which an indignant Mr. Anthony then responded with derision for the entire field, reminding Republican partisans that these politicians were actually part of the elite he was singing about.

Still, most conservatives are not in any doubt that Oliver Anthony is one of them, and I think they’re correct. The fact that he is almost indistinguishable in his rhetoric from a Berniebro Democrat is a feature, not a bug. Neither is it a problem that the message in his song seemed inconsistent - targeting rich capitalists as the source of his problems in the same song that he complained about taxation and welfare spending. Conservatism in recent years has lost any positive content, it is now best understood as an anti-ideology, a vague, paranoid and inconsistent critique of a nebulous “elite”, the only point of which is to spread a general mistrust in whoever happens to be in power. ... Modern conservatism in the English speaking world developed out of the cadre of conservatives who formed the National Review in 1955, led by William F. Buckley. Buckley believed he had found a program to unite the two camps who dominated the right, but had been up to that point adversarial: the Burkean conservatives, led by figureheads like Russell Kirk, and the increasingly expanding camp of libertarians, who had been influenced by works like Friedrich Hayek's The Road to Serfdom. The program that would unite them was the “fusionism” of Frank Meyer, a German-Jewish immigrant to the United States who himself abandoned communism after reading Hayek’s work while serving in the US Army. Frank S. Meyer: The Fusionist as Libertarian | Mises Institute .... Since at least the 2000s, the conservatism of Reagan and Thatcher has been in retreat; while it found a resurgence with the Tea Party program during the Obama administration, this trend was swept aside by the muscular populism of Donald Trump. Since then, conservatism has lost any vestiges of whatever positive content it had remaining.
Free market economics are still central to the establishment GOP politicians, but many conservatives now sound like economic populists, seeing rich capitalists as part of the same elite class as liberal politicians. While many conservatives still stand firm on abortion, there is little else in the way of the social conservatism that used to define the right: Trump was the most pro-gay US President in history, and modern conservatives are all too happy to embrace their own, based versions of “trans women” like Blair White if they affirm them back. Alex Jones asks Blaire White if "the chemicals" made her trans | Media Matters for America -... So what’s left? Well, there’s definitely a strong belief that the elites are evil - ridiculously, cartoonishly evil, to the point that they poison the water and the skies, intentionally derail trains, and start wars just to make common people suffer. There is also a strong cynicism about politics and idealism generally; not only is the conservative anti-ideological, but they are convinced everyone else is too, and that people that profess to believe in leftist ideals like egalitarianism are just cynics who don’t really believe it. As such, conservatives will often critique leftists for being inconsistent anti-racists or say things like their affirmation of trans rights is rooted in a hatred of women. Oliver Anthony engaged in some of this on his recent appearance on Joe Rogan. Rogan pointed out that Democrats in the early 90s “sounded like Nazis”; Oliver Anthony recognised the argument and immediately pointed out that Democrats like Hillary and Obama didn’t even support gay marriage in the 2000s! .. It is of course an eternal source of frustration to people on the radical right that conservatives attack the left by holding them to the moral standard the left itself has established, thus enforcing the leftist moral framework on the whole political spectrum.
This seems obviously counter-productive, until you realise there is no alternative program the conservatives are advancing anyway - all that matters is getting people to share the same sense of cynicism and mistrust of power, so an accusation of racism or homophobia works as well as anything else.

https://keithwoodspub.substack.com/p/conservatism-as-anti-ideology

Conservatism lacks ideology, vision, and a moral compass. At this point it is just angry ranting against cartoon villains who are satanically evil. There is little systemic analysis; instead there is an overemphasis on conspiracies. If the populist conservatives took power, they would be incapable of wielding it, since their policies lack depth beyond "SJWs bad, but trans people with MAGA hats good." Conservatives are too negative; their entire focus is on what they dislike. Rich people bad, welfare queens bad, Klaus Schwab bad - but what is good?

"My life sucks, boo outgroup" aren't really lyrics that inspire or offer novel insights. It isn't surprising that the Anglosphere right has greater problems attracting young people than the right in the rest of the West. The AfD, Sweden Democrats, and National Rally do fairly well among young voters. The rather aimless right in the Anglosphere fails at attracting young people and successful people. A young, highly educated person is simply going to find the aesthetics and the values of mainstream conservatism boring and unappealing. It isn't a uniting message; it is a message with no vision that is anti-PMC. I simply struggle to see a well-travelled, highly educated person fitting into the conservative movement at all. The right is making itself the culturally toxic defender of boomer rights.


I'll say from my perspective, this view actually seems validated by what we've seen from Trump so far. With the exception of tariffs, which are already being struck down, there's much more of an emphasis on destroying than on actually building anything.

That being said, I'm generally conservative myself and weakly pro-Trump, so I'm not trying to just take cheap potshots. I genuinely think this is a huge problem the right needs to face in order to create a more compelling and useful platform for the future.

Now that crypto seems to be firmly on one side of the Culture War, in that Trump and co. are actively promoting it, what do we see as the future of crypto under a Trump presidency?

Personally I am only just getting into the space, but I continue to be surprised by the sheer amount of money and energy behind a lot of these projects and communities. For instance, right now in the Solana meme coin market, AI agents on Twitter and elsewhere are a huge point of interest.

What fascinates me is the fact that the crypto ecosystem seems to offer an alternative way to get funding for startups, especially in "hot" areas like AI. Instead of going the VC route, a creator can just go straight to twitter and create an automated agent, then make a meme coin off of that agent and potentially make millions of dollars.

For an example of this, there's @truth_terminal, which is, I believe, the most popular AI agent on the site and the one that sort of started the entire wave. https://x.com/truth_terminal. The fascinating thing about this account is that it asked for $50,000 to start making money, and Marc Andreessen actually gave it the money, which it then took and became a millionaire from crypto.

More recently, there are all sorts of projects coming out around meme coins and AI. A ton of them are just general slop, porn bots, et cetera, but some seem to have real creative vision and potential behind them - for instance @pillzumi, which seems to be a blend of AI agents and an AI-generated anime series, or something like that.

All this to say that while many see crypto as nothing but scams and a waste of time, in my opinion some of the most genuinely novel and interesting projects are happening there.

To bring it back to the culture war - almost all of crypto and these AI innovations seem to be swinging heavily right wing. Same with the broader VC / tech environment. How will the left respond? Do you think democrats will begin to be more friendly to crypto, or will they double down on trying to lock it out of traditional financial systems?

Do you think Trump will actually make significant progress in regulating crypto and integrating it in the financial system of the U.S.? What would be the main blockers? How much political capital will he really need?

It seems that the vibe has definitely shifted in politics and general social spaces, as many folks last week commented on here. People are more open to using language that used to be termed offensive, right-wing political statements are more in vogue, etc.

I'm curious specifically what all of this means for feminism, and for the gender war subset of the larger Culture War. I saw an interesting piece which blew up on X lately that, in discussing the Neil Gaiman situation, argues:

Shapiro spends a lot of time thumbing the scale like this, and for good reason: without the repeated reminders that sexual abuse is so confusing and hard to recognize, to the point where some victims go their whole lives mistaking a violent act for a consensual one, most readers would look at Pavlovich's behavior (including the "it was wonderful" text message as well as her repeated and often aggressive sexual overtures toward Gaiman) and conclude that however she felt about the relationship later, her desire for him was genuine at the time — or at least, that Gaiman could be forgiven for thinking it was. To make Pavlovich a more sympathetic protagonist (and Gaiman a more persuasive villain), the article has to assert that her seemingly self-contradictory behavior is not just understandable but reasonable. Normal. Typical. If Pavlovich lied and said a violent act was consensual (and wonderful), that's just because women do be like that sometimes.

Obviously, this paradigm imposes a very weird, circular trap on men (#BelieveWomen, except the ones who say they want to sleep with you, in which case you should commence a Poirot-style interrogation until she breaks down and confesses that she actually finds you repulsive.) But I'm more interested in what happens to women when they're cast in this role of society's unreliable narrators: so vulnerable to coercion, and so socialized to please, that even the slightest hint of pressure causes the instantaneous and irretrievable loss of their agency.

The thing is, if women can’t be trusted to assert their desires or boundaries because they'll invariably lie about what they want in order to please other people, it's not just sex they can't reasonably consent to. It's medical treatments. Car loans. Nuclear non-proliferation agreements. Our entire social contract operates on the premise that adults are strong enough to choose their choices, no matter the ambient pressure from horny men or sleazy used car salesmen or power-hungry ayatollahs. If half the world's adult population are actually just smol beans — hapless, helpless, fickle, fragile, and much too tender to perform even the most basic self-advocacy — everything starts to fall apart, including the entire feminist project. You can't have genuine equality for women while also letting them duck through the trap door of but I didn't mean it, like children, when their choices have unhappy outcomes.

Now many linkers and commentators on X are basically arguing - why yes, women don't have agency, and that's why most cultures have reflected that in law and social practice. I think this sort of smugly satisfied mocking of women is in quite poor taste, and not likely to be productive, but there is a deeper point in there. Unfortunately it seems that, even after decades of propaganda, rewriting of tons of laws, giving women voting power, dismantling "oppressive" cultural structures like religion, and so on, we still as a society are not able to treat women as adults with agency and consequences for their actions.

Now a progressive might come in and say - ok, fine we do still struggle with this issue, but hey, it's because of bad social programming! Just give us another 100 years and we will totally hold women responsible just like men, we promise!

That has basically been the progressive line used to justify going further and further left with social and legal programs. The problem for them is, with the vibe shift I mentioned earlier, I think that argument is running out of steam. The average person no longer seems to be convinced that this is just a cultural problem which will go away.

So, where do we go from here? Do you think feminism will actually be rolled back in a meaningful way? I'm skeptical myself, but I'm also skeptical we will magically start holding women accountable. Not sure what happens next...

What do you think will happen with regards to the department of education in the US depending on the results of the election?

For my part it seems like the left would just keep watering it down, more of the same etc. There doesn't seem to be any acknowledgment of issues over there.

I’m also not optimistic that the right will do much either, though; Republicans have tended to be very ham-fisted with this sort of reform in the past.

Anyone have interesting or different perspectives here?