The Motte Glossary

Mottizens sometimes use terms with obscure origins that can be confusing to newcomers. This is an attempt to provide a brief explanation of what these terms mean and where they came from to help anyone new to the community.

50 Stalins: A style of commentary which pretends to criticize something while actually praising it, e.g. “critiquing” Stalinist Russia by suggesting that it is not Stalinist enough and it should have even more Stalins. The term was coined by Scott Alexander in Reactionary Philosophy in an Enormous, Planet-sized Nutshell (2013).

Chinese Robber Fallacy: A dishonest argument that uses a generic problem to attack a specific person or group, even when other groups have the problem just as much, e.g. complaining about the problem of Chinese robbers without providing evidence that Chinese people are more likely to be robbers than other groups. It was first described by blogger Alyssa Vance in 2015.

CultureWarRoundup / CWR: A splinter community of the Motte that lives at the subreddit /r/CultureWarRoundup, founded primarily by users who felt that the Motte's moderation policies were too strict. It has very little moderation.

Effective Altruism / EA: A philosophical and social movement that aims to use evidence and reason to do the most good possible. It has a lot of overlap with the rationalist community and experienced much of its early growth on LessWrong.

Human Biodiversity / HBD: A viewpoint that holds that there are socially relevant differences between groups of people that are genetic in origin. Most controversially, HBD advocates generally maintain that the observed differences between the average intelligence of people of different races originate in genetics.

Ideological Turing Test / ITT: An exercise in which you try to present an opposing ideology convincingly enough that outside observers can't reliably distinguish you from a true believer. It was first described by economist Bryan Caplan in 2011.

LessWrong: A discussion forum founded by AI theorist Eliezer Yudkowsky in 2009. The forum focuses on cognitive biases, rationality, artificial intelligence, and other topics. It is the primary nexus for the so-called “rationalist community.” LessWrong could be considered an ancestor forum to the Motte, since Scott Alexander blogged there before founding Slate Star Codex and this community originated in the subreddit for Slate Star Codex.

Lizardman's Constant: The share of the population (around 4%) who give absurd responses in opinion polls (such as saying that lizardmen run the world); perhaps a combination of trolls, people who don't understand the question, people who just want to agree with the pollster, and people who are completely apathetic to the poll. It was coined by Scott Alexander in Lizardman's Constant Is 4% (2013).

Motte and Bailey fallacy: A dishonest form of argument where one conflates two positions, one easy to defend but with limited implications (the motte) and another hard to defend but with far-reaching implications (the bailey). The fallacy was named after a kind of castle. It was coined by the philosopher Nicholas Shackel in 2005 and popularized by Scott Alexander via Social Justice and Words, Words, Words (2014) and All in All, Another Brick in the Motte (2014). The motte-and-bailey fallacy is the namesake of the Motte; in this community, we would like people to only hold positions that they can defend.

Neoreaction / NRx: A right-wing political philosophy whose signature viewpoint is that monarchy is a better form of government than democracy. Its most famous advocate is the blogger Curtis Yarvin, aka Mencius Moldbug. The neoreactionary movement first grew on LessWrong, although they were always a very small faction there.

Prior: A term from Bayesian statistics that refers to one's belief about something before taking new evidence into account. To say that one's prior is X is to say that one's belief, before considering the evidence at hand, is X. To say that one has "adjusted their priors" is to say that one has changed one's mind to some degree about the topic at hand based on the evidence presented; in theory this is done by applying Bayes' theorem.
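
As a quick illustration of that update step (the numbers here are hypothetical, chosen only for the arithmetic): Bayes' theorem says P(H | E) = P(E | H) × P(H) / P(E). Suppose your prior that it will rain today is P(rain) = 0.3. You then see dark clouds and judge P(clouds | rain) = 0.9 and P(clouds | no rain) = 0.3, so P(clouds) = 0.9 × 0.3 + 0.3 × 0.7 = 0.48 and your posterior is 0.9 × 0.3 / 0.48 ≈ 0.56. "Adjusting your priors" on seeing the clouds means moving from 30% to roughly 56%.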

Quokka: A kind of Australian macropod. They have no natural predators and are therefore not particularly fearful. Beginning with a 2020 Twitter thread by "Zero HP Lovecraft", some people who believe rationalists are too trusting or naive have compared rationalists to quokkas.

Rationalist: An online community of people originally formed around the blog Overcoming Bias, founded in 2006 by economist Robin Hanson and AI theorist Eliezer Yudkowsky; the discussion forum LessWrong, founded in 2009 by Yudkowsky; and the blog Slate Star Codex, founded in 2013 by Scott Alexander. The rationalist community is generally focused on cognitive bias and reason. Because the Motte originated in the subreddit for Slate Star Codex, it could be considered part of a rationalist “diaspora.”

rDrama: A humorous forum about Internet drama. Originally the subreddit /r/drama, the community now lives at rdrama.net. The Motte forked the code rDrama uses to run its site when we moved from Reddit to this website in 2022, because it was a proven example of a Reddit community successfully moving offsite and had many of the features we wanted.

Red Tribe / Blue Tribe / Gray Tribe: Terms used to describe different cultural groups in America. The terms were coined by Scott Alexander in I Can Tolerate Anything Except the Outgroup (2014). They are sometimes used interchangeably with the concepts of Republicans (Red Tribe) and Democrats (Blue Tribe), but in their original conception, Red Tribe (or Blue Tribe) meant something more precisely stated as “the sorts of people likely to be Republicans (or Democrats), regardless of their actual political views.” For example, a vegan Harvard graduate who lives in Manhattan and loves musical theater is part of the Blue Tribe even if he is actually politically conservative. The Gray Tribe is a sub-tribe of the Blue Tribe characterized by things like working in STEM fields and often having libertarian-ish politics.

Scissor Statement: A highly controversial statement that reliably provokes arguments. Coined by Scott Alexander in Sort By Controversial (2018), a work of fiction in which scissor statements are generated by a machine learning system trained on Reddit comments.

Slate Star Codex / SSC / Astral Codex Ten / Scott Alexander: Scott Alexander is a psychiatrist who lives in the Bay Area. He blogged from 2013 to 2020 at Slate Star Codex and has blogged since 2021 at Astral Codex Ten. The Motte was created as a subreddit in 2019 as the home for a weekly "culture war roundup" thread that had been hosted on the subreddit for SSC until then. The culture war roundup threads were removed from /r/slatestarcodex at Scott Alexander's request. Scott Alexander's writings are a major influence on the norms of this community. His blog and the community around it are generally considered part of the rationalist community. Scott Alexander was a popular writer at LessWrong before founding SSC, and the community around SSC has a lot of overlap with LessWrong.

SneerClub: A community of people critical of the rationalist community, including the Motte, that lives at the subreddit /r/SneerClub.

Steelman: The strongest possible form of an opposing argument; the opposite of a strawman.

TheSchism: A splinter community of the Motte founded in 2020 that lives at the subreddit /r/TheSchism.

Weakman: A weak argument that someone has actually made (so it’s not a strawman). A poor form of argument is to choose the weakest argument that someone has actually made in favor of a position and argue against that while ignoring stronger arguments for that position.

(I am a noob and have no idea whether this is Motte-specific, Reddit-specific, or something else, though I've not seen it in SSC/ACX comments.)

What does AAQC mean?

Actually A Quality Contribution. It was an option when reporting comments if you wanted to highlight a comment to the mods as particularly good.

Quokka: A kind of Australian macropod. They have no natural predators and are therefore not particularly fearful. Beginning with a 2020 Twitter thread by "Zero HP Lovecraft", some people who believe rationalists are too trusting or naive have compared rationalists to quokkas.

Weren't there people calling us quokkas before then? It's hard to remember the pre-covid times.

And rationalists, bless their hearts, are REALLY easy to lie to. It's not like taking candy from a baby; babies actually try to hang onto their candy. The rationalists just limply let go and mutter, "I notice I am confused".

Man, it's sad how accurate this is. Our rightful caliph got taken in like the biggest sucker ever. "Hey everyone, NY Times is interviewing me for a human interest story about rationalists next week. The timing is a little weird, what with the race riots and all. Anyway she was really sweet and engaged, we talked about my petition to save Steven Hsu from cancellation!"

"Hey everyone, NY Times is interviewing me for a human interest story about rationalists next week. The timing is a little weird, what with the race riots and all. Anyway she was really sweet and engaged, we talked about my petition to save Steven Hsu from cancellation!"

Who are you referring to?

Who are you referring to?

?

Scott Alexander, as I'm sure you'll know unless you're someone else wearing /u/DrManhattan16's nick as a skinsuit. The NYT doxxed him, forcing him to resign from his job, and eventually published an article highlighting his connections to HBD and to writing about the gender gap in math ability, and suggesting obliquely that things like his blog are too dangerous to be tolerated on tech platforms. There were also people ready to publish damaging private emails from Scott after he predictably wrote a self-defense, which I'm sure is purely a coincidence and was not coordinated in advance at all.

It is me, but I was confused because you said "she was interviewing me" and, unless I'm mistaken, the reporter (Cade Metz) is a man. Moreover, I don't recall the efforts to help Hsu being the topic, or even a topic, of discussion for them. Also, I don't see the connection with the May/June BLM riots; an interview with a blogger seems far enough away to be unconnected. Perhaps you're speaking rhetorically, but that leaves an incorrect summary of what happened with Scott.

the reporter (Cade Metz) is a man

I just realized that I always assumed this was a woman as well, since it's similar to Kate.

Personally, I recall the 0HPLC Twitter thread as the first time we became metaphorical quokkas.

Pushshift searching is proving unfruitful: the earliest mentions I can find of quokkas on /r/TheMotte or CWR are from the first week of July 2020, and I could've sworn there was discussion in both places right after the 0HPLC thread. Also, /r/QuokkaFightClub was made in October 2020, and I similarly thought it was made in July at the latest. Hm. Anyway, no real evidence for the term prior to 0HPLC at least.

Yep, looks like I'm just misremembering. Thanks for the investigation.

I thought the point of the Chinese Robber Fallacy was more that any sufficiently large population will have bad actors/bad examples within it. In practical terms, it's a call not to look at things in absolute numbers, but in per capita or relative terms. China has a lot of robbers, but this is useless information because it has a lot of people. How does that compare on a per capita basis, especially to other countries? If it is substantially higher or lower, then maybe you have a point in pointing to robbery in China; a rough worked comparison is below.

This overlaps somewhat with the definition given above, but is different enough for me to ask for clarification.
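
To make the per capita point concrete (the figures here are invented purely for illustration): a country of 1.4 billion people with 10,000 robbers has a robbery rate of roughly 0.0007%, while a country of 10 million people with 1,000 robbers has a rate of about 0.01%. The first country has ten times as many robbers in absolute terms but a rate more than ten times lower, which is why the absolute count on its own tells you very little.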

You could probably throw some singularity/longtermist definitions in there too.

This list will probably make it a lot easier for rDrama trolls (among others) to fit in, which is a bit unfortunate. At the same time, this glossary of terms is really accurate and well-cited, and I cannot help but admire it for that.

What's the problem if they fit in? That would be a suffering-from-success scenario.

I have enough trust in people here not to fall for bad arguments made with the right set of phrases and jargon. If they need good arguments to fit in, that's not exactly a problem; that's assimilation.

I'm on the fence about having a list like this, but I don't think being afraid of inviting the wrong people is a good reason not to have one. A community like this stands or falls on its discussion norms and the active moderation of them, not its gatekeeping or obscurity.

We don't have infinite moderators.

Mistake theory/Conflict theory: two opposing views on the nature of disagreement.

Mistake theory posits that people or groups disagree because one or both of them made a mistake in their reasoning, so it's possible to walk the chain of reasoning backwards to find that mistake, correct it, and harmonize the views. Think of that anecdote by Dawkins that ends with "My dear fellow, I wish to thank you. I have been wrong these fifteen years". Also see Quokka.

Conflict theory posits that people or groups disagree because one or both of them uses their reasoning as a weapon against the other person or group and isn't interested in truth-finding. Think of presidential debates and politics in general.

The definition of Quokka should probably note that "Quokka is often used to refer to people who naively believe all disagreements are attributable to mistake theory" or something along those lines.

This should probably mention that LW/Rationalist-sphere folks are obsessed with intelligence, AI, and the idea that a superintelligent AI poses an existential risk to humanity.

I'm strongly against this kind of artificial legibility. It can enable newcomers to perform acquaintance of the community far beyond where they're at.

Keep things hidden in plain sight lest you scrap the pipeline.

I really don't understand this point of view. Or, to put it more aptly, I don't understand why people who post here would hold this point of view. It's totally counter to "speak plainly".

The conversations on the Motte are exclusionary to a large number of people, and that has its benefits, but if you take the spirit of the rules (ethos) to heart, it's very obviously a bug, not a feature.

It can enable newcomers to perform acquaintance of the community far beyond where they're at.

This implies that acquaintance is artificially dampened.

Why wouldn't we want people to learn about our community as fast as possible as easily as possible? It's not a secret club is it?

I think something important is lost when someone picks up the signifiers but not the signified. Signals get muddled. I'm just advocating for high-context culture.

I have nothing against high-context cultures, but this should not be one of them. The point of this place is to (1) welcome all viewpoints, so long as (2) the viewpoint is articulated and defended clearly. High context cuts against both of these goals.

"Articulated and defended clearly" is much easier when working from a shared base of facts, OR when outsiderness is evident.

Keeping your shibboleths opaque is good practice if you're trying to identify true descendants of Jacob in ancient Canaan, but is it really a concern here?

As I see it, if people post in line with the local post-rationalist idiolect, and do it in a thoughtful and considered way, who cares if they were reading EY on LW in 2008 or if they first heard about SSC from the NYT? There isn't some Platonic ideal of a Mottian which one can possess or not. (Or maybe there is and we need to institute a series of membership trials? That would be fun, if somewhat impractical)

If somebody comes in and tries to dissemble by talking about their "priors" while simultaneously arguing that certain ideas are harmful and we ought to censor them, they're going to stick out like a sore thumb. Using the right lingo won't hide that.

There isn't some Platonic ideal of a Mottian which one can possess or not.

I think as a community we should refuse to welcome anyone who hasn't read 100,000 words on SSC, but that's just me. It's not about seniority, it's about (in venerable internet lingo) LURKING MOAR.

My opinion is that at some point it becomes kind of silly to keep using terms coined in blog posts that are almost a decade old now, and which have never gained widespread usage elsewhere, without ever explaining what they mean or where they came from. At this point you have to expect that many of the people here were not aware of Slate Star Codex or LessWrong circa 2013. Part of my hope is that providing explanations of and links to the LW/SSC origins of this community will encourage newcomers to read more of the "foundational texts" and assimilate better, rather than causing a culture shift over time away from the things that made this community great.

I agree that enabling trolls is a concern, but this is a tradeoff, and honestly I'm confident in our ability to identify someone namedropping shibboleths while trolling.

It didn't occur to me when I posted this, though, that it would likely be one of the most visible posts on the subreddit for a while. I'm open to taking it down temporarily (or permanently, if that seems to be the community consensus) or moving it to a less visible place.

I'm not worried about enabling trolls, I'm worried about enabling larpers. There are a number of them on the site already, but as of right now they're not too hard to pick out from the crowd.

As I understand it, 50 Stalins is actually an argument that we don't live in as free a country as we think. The analogy is that in Stalin's Soviet Union, you were free to state that we really need 50 Stalins, but not that Stalin was a big stupid jerk. Similarly, in many Western countries that supposedly have free speech, you are free to state any opinion that aligns with the Progressive movement, but any speech against it may have negative consequences.

"50 Stalins" is sometimes used in discussion of free speech but it's applicable in other contexts as well. For example "John pretends to be an opponent of effective altruism, but all his arguments are just 50 Stalins critiques. He actually seems to agree with effective altruism, he just doesn't think they go far enough."

I took it as a way to couch criticism in acceptable terms without making yourself into a heretic.