Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
I was reading a comment about how people who learn rationality often appear unhappy. It caused me to reflect and notice a larger pattern.
I think rationality can cause people to go through something like the 5 stages of grief: denial, anger, bargaining, depression, and acceptance. So, while rationality concepts can cause some people to become unhappy, it is often just a temporary state.
Has studying rationality concepts caused you to go through a cycle of emotional states?
Less about rationality concepts themselves and more about my perception of the community. A feeling like watching my intellectual heroes not just stumble, but faceplant.

First, a sense of enthusiasm and a sort of pride that there were people (dare I say, "my people"?) looking to transcend their flaws and start looking seriously at the hardest, most important problem in history: how to align a superintelligence. HPMOR is one of the most engaging works I've ever read; despite EY's often odd prose and the weirdness of the characters, it rewards close reading and sets out both a vision and a warning. And with the Sequences (not just EY's, but other writers' as well), you get a pretty inspiring offer: learn all this stuff, it will teach you how to win, and then deploy that to win at the most important problem in history.

Then dismay and disappointment as I learned that even these hardened epistemic defenses were no match for Berkeley, that rationalists ended up more interested in polyamorous group houses than in solving the most important problem in history, and only slightly less vulnerable to the woke mind virus than the average normie. @zackmdavis's writing on the trans question takes a long time to get to the point, but it's an important one: there is a reality, and even the most ingroup members of what's meant to be the most reality-connected community threw out all of their epistemic standards just to let their friends claim an alternate sex.

This seems to me to mean that even if we succeed at AI-don't-kill-everyone, any AGIs/ASIs we do get will be unacceptably decoupled from reality on at least the woke and trans questions, and anything connected to those. And since, if you once tell a lie, the truth is ever after your enemy, solving the "AI-don't-kill-everyone" problem becomes harder if you don't even allow yourself to see reality while you're solving it.
I'd live in a ~~cult compound~~ group house if I could. It's the most reliable way to avoid dying if WWIII happens, and if you think there's a decent chance of WWIII before AI doom, then you'll want to make sure you survive WWIII. Of course, you need supplies and guns, as well as the cult, to be a proper cult compound.
It does seem pretty fun, but you have to tolerate some real weird shit, man. Real weird.
What sorts of rationality concepts do you have in mind?
Source: https://www.lesswrong.com/posts/PHnMDhfiadQt6Gj23/the-art-of-grieving-well
So, really, any concepts that relate to learning about why human nature makes the problems of the world intractable. Examples: Meditations on Moloch, cognitive biases that lead people to act irrationally.
If someone breaks up with you, or a loved one dies, it is almost impossible to deny the painful truth of reality. With rationality, if you don't go looking for certain things, you can remain ignorant of parts of reality. If studying rationality can cause emotional distress, then in some cases it may be better not to look under certain rocks.
Nope.
I don't think anyone actually "becomes" a rationalist. People have natural inclinations toward being high or low decouplers, and that is probably baked into their personality profiles.
To me, coming across rationality felt like coming across things I had already intuited, but formalized and documented.
I think most high decouplers already have a world model and internal epistemic model similar to that of rationalists; the Rationalists™ don't have a monopoly on... Rationalism.
Not really. Even before I was a Rationalist, I was still regularly dismayed by civilizational inadequacy and how fucking stupid the average person is.
I'm a smart person, and I'm certain that even if I hadn't discovered LW or Scott in my late teens, I'd have wandered into spaces where I'd have heard the same concepts eventually.
To me, rationality is both the art of clear thinking and, as Yudkowsky puts it, winning. Reasonably educated humans usually think quite clearly already when it comes to truly important decisions, so I don't really expect an introduction to formal Rationalist concepts to revolutionize someone's life, barring those who go into EA or AI alignment, or who sign up for cryonics.
You don't have to be a rat to worry about AI or x-risk, and while I've certainly felt existential terror at times, I cope quite well.
I don’t think it reasonable to call the average person stupid in the sense that they’re incapable of learning to think well. Most have never actually been taught to think in that manner, and as such they lack the skill set. Part of thinking well is the toolset, and part of it is being able (and choosing) to read widely enough to make good use of the tools in that skill set.
Education, at least in non-elite American schools, is not built to create thinkers. Nobody funding the schools or hiring the graduates cares whether they can think (and outside the cognitive-elite jobs in high-level STEM, thinking is a net negative, since thinkers are hard to control). As such, the system is set up for mostly rote learning, what the classical education model calls grammar: memorize and recall, perform mathematical operations. That’s all well and good, but it’s not going to create a thinker. There’s a next step, where kids learn to understand why something works, learn to apply what they’ve learned, and analyze texts, equations, and problems to understand what’s being done and why.
Absent that, most people have developed proxies that mostly work: finding a “priest” type whom you trust on a topic, trusting a given set of sources, using the canonical list of fallacies, or trusting the guy in the argument who sounds like Spock. Those sort of work most of the time, provided those you’re trusting are honest and knowledgeable. The trouble is that it’s not hard to learn the tricks most people use as rationality proxies and position yourself to appeal to those instincts.
Don't be so sure. When I was in college, these places DIDN'T EXIST. It took the internet to bring together the critical mass to make it happen.
You certainly weren't going to be exposed to rationalist ideas on a typical college campus, or by reading the New Yorker or the Economist or anything. The gulf between the writing of someone like Scott and the publications available to a layperson in the 1990s is vast.
Even today, let's say TheMotte/SSC/LessWrong didn't exist. Where would you get information that isn't hopelessly normie-biased?
Well, I’m one of the participants in that discussion, so I can confidently say yes: learning about rationality (specifically ‘rationalism’ online) has caused me quite a few strong emotions.