Small-Scale Question Sunday for January 4, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

What does everyone think of Eliezer Yudkowsky?

I just realized this website was born from /r/TheMotte, which came from /r/slatestarcodex, which came from Scott Alexander, who came from LessWrong, which was created by Eliezer himself. He's technically a grandfather of this community.

I for one think he's more influential than he's given credit for, and I consider myself lucky to have come across his writings in my younger days.

He is Earth's greatest living philosopher.

Edit: I challenge any of the downboaters to name a better one.

> Edit: I challenge any of the downboaters to name a better one.

I don't even think that Yudkowsky was the best thinker on LessWrong. Both David Friedman and Scott Alexander (when he was still active there) surpass him easily, IMO.

Heck, I can also think of a handful of regular commenters from those days who I thought were at least in his ballpark, if not his league.

> I don't even think that Yudkowsky was the best thinker on LessWrong. Both David Friedman and Scott Alexander (when he was still active there) surpass him easily, IMO.

This is trivia, not science, but for kicks I decided to see how many LessWrong quotes from each user I've found worth saving over the years: Yudkowsky wins with 18 (plus probably a couple more; I didn't bother making the bash one-liner here robust), Yvain (Scott) takes second with 10, and while I have dozens of Friedman quotes from his books and from other websites, I can't find a single one from LessWrong that I saved. (Was Friedman just a lurker on LessWrong?)
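(For the curious: a minimal sketch of that kind of count, assuming the saved quotes sit in a plain-text quotes.txt where each entry ends with a citation line naming the author and the site. The filename, the citation format, and the author keywords are all guesses here, not the commenter's actual setup.)

```bash
# Rough sketch, not the actual one-liner: count citation lines that
# mention both an author name and "lesswrong", case-insensitively.
for who in Yudkowsky Yvain Friedman; do
  printf '%s: ' "$who"
  grep -ci -- "$who.*lesswrong" quotes.txt
done
```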

On the other hand, surely "best" shouldn't just mean "most prolific", even after a (grossly stochastic) filter for the top zero-point-whatever percent. Scott is a more careful thinker, and David more careful still, and prudence ought to count for something too ... especially by Yudkowsky's own lights! We praise Newton for calculus and physics and downplay the alchemy and the Bible Code stuff, but at worst Newton's mistakes were merely silly, just wastes of his time.

Eliezer Yudkowsky's most important belief is his conclusion that human extinction is an extremely likely consequence of the direction of progress currently being pursued by modern AI researchers, who frequently describe themselves as having been inspired by the writings of: Eliezer Yudkowsky. I'm not sure how that could have been avoided, since the proposition of existential AGI risk presupposes the proposition of transformative AGI capabilities, and there were naturally going to be people who took the latter more seriously than the former, but it still looks superficially like it could be the Ultimate Self-defeat in human history, in both senses of that adjective.