Thanks for posting, I liked reading it.

Social stratification seems unlikely. It's analogous to predicting that only the rich would have advanced computing technology, when in reality they use iPhones and GPT-4 Turbo just like we do.

I think the hormone-balancing part is very confused. Intelligent people adapt their behavior to circumstances in complicated ways, and hormones don't just generically modify behavior, because they have to coexist and interact with that adaptation.

In general, concern about this sort of thing past a few generations is kind of obviated by AGI, I think.

It just doesn't matter at this point. Digital life will outpace any kind of this stuff in a few years.

I agree, which is why I'm only mildly miffed that we have a stupid moratorium on this field instead of hopping mad. Maybe if we'd been making Von Neumann clones en masse a decade back, we could have seen something useful come of it, but right now? AIs are becoming smarter faster than our genetic engineering tools can raise intelligence.

Just yet another regrettable missed opportunity, like most of the West barring France not wholeheartedly embracing nuclear power since the 70s.

The analogy doesn't hold; there is still nothing outpacing nuclear. Now is still the time to adopt nuclear.

Unless you think AGI will cook up something better than nuclear, in which case, lol.

I put a very non-negligible chance on us achieving economically competitive fusion power in a decade or two, and that's without AGI.

By no means am I claiming it's not worth investing in nuclear energy or genetic engineering. I just think that we should A) be more annoyed at the people who slowed down progress and B) recognize that it's going to be moot.

We absolutely should invest large amounts of money in both, simply to hedge our bets in case AI is a bust (highly unlikely as that is).

I suspect SMH agrees with you regarding nuclear. I do as well. That said, as long as we're on the topic of things potentially better than nuclear:

Biosolar could beat out nuclear in principle: the planet's plants harvest more energy than we consume, and they do so without requiring maintenance, on account of being self-reproducing organisms that are therefore self-scaling. But that energy is not readily harvestable for human purposes.

So then we're back to needing to master genetic engineering to beat out nuclear.

I think that's possible, but far from certain.

I mean, it is happening now. How can you deny it? This kind of stuff is way behind the times.

How is it happening now? Language models do not look like artificial life.

They do to me. Just add more compute, stop limiting context windows, leave them on, let them interrogate themselves, give 'em some ongoing inputs and, bam! You've got a stew going!

I'm just an LLM running on a meat substrate that has been left on for a few decades with those parameters set.
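
Spelled out, the recipe above is basically just a persistent agent loop. Here's a minimal sketch of that idea; `call_model` is a hypothetical stub standing in for whatever LLM API you like, and the memory, self-interrogation cadence, and input queue are illustrative assumptions rather than any real framework:

```python
import itertools
import queue


def call_model(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call; stubbed so the sketch runs."""
    return f"(model musing on: {prompt[-60:]!r})"


def persistent_agent(external_inputs: "queue.Queue[str]", max_steps: int = 10) -> list[str]:
    """Run the model in an open-ended loop with accumulating memory."""
    memory: list[str] = []    # "stop limiting context windows": nothing is ever dropped
    transcript: list[str] = []
    for step in itertools.count():
        if step >= max_steps:  # demo cutoff only; the point is there is no natural end
            break
        # "give 'em some ongoing inputs": fold in whatever the outside world has sent
        try:
            observation = external_inputs.get_nowait()
        except queue.Empty:
            observation = ""
        # "let them interrogate themselves": every third step, reflect on the last thought
        if step % 3 == 2 and memory:
            prompt = f"Reflect on your previous thought: {memory[-1]}"
        else:
            recent = " | ".join(memory[-5:])
            prompt = f"Context so far: {recent}\nNew input: {observation}\nContinue."
        thought = call_model(prompt)
        memory.append(thought)
        transcript.append(thought)
    return transcript


if __name__ == "__main__":
    inbox: "queue.Queue[str]" = queue.Queue()
    inbox.put("sensor ping")
    inbox.put("user said hello")
    for line in persistent_agent(inbox):
        print(line)
```

The only design choice doing real work here is that memory is never truncated, which is the "stop limiting context windows" part of the recipe; everything else is just leaving the loop running and feeding it the world.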

SS: Jonathan Anomaly has released a second edition of his book Creating Future People. The book addresses many questions that come up when discussing genetic enhancement technology, with a special focus on collective action problems. Scott Alexander has discussed polygenic embryo screening earlier this year, and many people raise objections that Anomaly discusses. I think Motte readers will find it interesting. Thank you.