
Small-Scale Question Sunday for July 30, 2023

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


I was reading a comment about how people who learn rationality often appear unhappy. It caused me to reflect and notice a larger pattern.

I think rationality can cause people to go through something like the five stages of grief: denial, anger, bargaining, depression, and acceptance. So while rationality concepts can make some people unhappy, that unhappiness is often just a temporary state.

Has studying rationality concepts caused you to go through a cycle of emotional states?

Less about rationality concepts themselves and more about my perception of the community. It felt like watching my intellectual heroes not just stumble, but faceplant. First came a sense of enthusiasm and a sort of pride that there were people (dare I say, "my people"?) looking to transcend their flaws and take on the hardest, most important problem in history: how to align a superintelligence. HPMOR is one of the most engaging works I've ever read; despite EY's often odd prose and the weirdness of the characters, it rewards close reading and sets out both a vision and a warning. And with the sequences (not just EY's, but other writers' as well), you get a pretty inspiring offer: learn all this stuff, it will teach you how to win, and then you can deploy that to solve the most important problem in history.

Then came dismay and disappointment as I learned that even these hardened epistemic defenses were no match for Berkeley, that rationalists ended up more interested in polyamorous group houses than in solving the most important problem in history, and were only slightly less vulnerable to the woke mind virus than the average normie. @zackmdavis' writing on the trans question takes a long time to get to the point, but it's an important one: there is a reality, and even the most ingroup members of what's meant to be the most reality-connected community threw out all of their epistemic standards just to let their friends claim an alternate sex.

This seems to me to mean that even if we succeed at AI-don't-kill-everyone, any AGIs/ASIs we do get will be unacceptably decoupled from reality on at least the woke and trans questions, and on anything connected to those. Since if you once tell a lie, the truth is ever after your enemy, solving the "AI-don't-kill-everyone" problem becomes harder if you don't allow yourself to see reality while you're solving it.

I'd live in a cult compound group house if I could. It's the most reliable way to avoid dying if WWIII happens, and if you think there's a decent chance of WWIII before AI doom, you'll want to make sure you survive it.

Of course, you need supplies and guns as well as the cult to be a proper cult compound.

It does seem pretty fun, but you have to tolerate some real weird shit, man. Real weird.