
I feel like you're strawmanning (or perhaps weakmanning) rationalists. Your first example with the poorly reasoning Dr. John reads like something straight off of LessWrong from 10 years ago; it is absolutely nothing new to the community. Your second example has the exact same issue. The rationalists already have pretty low opinions of mainstream media like Bloomberg. The term Gell-Mann amnesia has been floating around for years to describe the phenomenon of temporarily forgetting just how bad journalists really are; that's not exactly something you'd expect in a community that blindly trusts mainstream media. In the third example you spend a long time attacking Krugman, who has never been part of the rationalist movement. There are economists who could be considered well-known rationalists, like Robin Hanson and Tyler Cowen, but Krugman is not one of us.

Overall your post has a very /r/Iamverysmart vibe. You pat yourself on the back a lot for noticing things everyone else missed, but you don't seem to actually have a good grasp of what the rest of the community actually thinks.

If you want to explore the topic further, in a sort of corollary to Betteridge: if you say "x is too y" in the headline, then in the piece you have to say why "x is too y to achieve A" or that "x is too y in comparison to B." Otherwise the criticism "x is too y" just floats in space, unmoored from any standard by which to judge it, its impact unclear. A headline "Eagles' Quarterback Jalen Hurts doesn't spread the ball around to all his receivers enough" is probably just sports blog masturbatory analysis and nitpicking; "Jalen Hurts doesn't spread the ball around enough to defeat the more organized defenses the team will meet in the playoffs" is much better, showing the impact of the action on the team's goals and providing a clear standard to judge against; "Jalen Hurts doesn't spread the ball around as much as quarterbacks like Rodgers or Mahomes" gives us a direct comparison to show how other successful players have performed and where Hurts falls short if he wants to be an MVP candidate like them. It also gives us a standard by which to figure out if we agree with the author.

Right now all I'm getting is "Rationalists are easier to dupe than Nassim Taleb's imaginary friend Fat Tony.*" Consider some alternative theses, and whether you can support them from your evidence:

"Rationalists are more easily duped than other political groupings and actors."

This seems tough to argue for. Progressives fell for Smollett and Abrams and Beto, vast hordes of the Republican base fell for Q and Mike Lindell and JFK Jr. being alive, Fox News advertises nothing but gold-buying scams and reverse mortgages, while MSNBC and CNBC gave you Tom Brady and Larry David pitching FTX out the wazoo. Congress fell for WMDs, the EU fell for Putin, the US military-industrial-intelligence complex just keeps getting duped into handing local partners in Vietnam and Afghanistan and Syria billions of dollars that fail to deliver any results, US corporations get duped into doing DEI work or partnering with Rivian, US investment banks got duped into holding the bag for financing Elon's Twitter acquisition, when academics aren't getting duped into printing fake papers they're getting duped by non-replicable studies p-hacked by researchers to make it look like they meant something, billionaires get duped into buying fake old wines, the greatest venture capitalists and tech investors get duped into wasting billions on WeWork or Robinhood or Peloton. To paraphrase the quote in my profile, ain't nobody who ain't gullible, I looked.

"Rationalists are too easily duped to achieve their goals."

This probably moves the argument into a debate about what the "goals" of the rationalist movement are. If we're going with the Yudkowskian maximalist theory that people who have read The Sequences will one day bestride the earth and rule like intellectual colossi, then there's a pretty good argument there. For Rationalists to be Ubermensch Philosopher Kings they'd need to be basically impossible to fool; they are not impossible to fool; ergo they cannot be Ubermensch Philosopher Kings. If we're going with the EA theory that they should be in charge of directing 10% of my income to far-off charities, I'd also find that pretty accurate: I don't want to risk giving money to an org that might pass it to a fake org or to a doomed and ill-conceived political campaign. For the more modest role Rationalists play in my life, writing fun blog posts and faux-Reddit screeds that make me think, I don't think Rationalists are too gullible to handle it. SA might be too gullible to be "the guy," he might even be too gullible to be the-guy-behind-the-guy, but I don't think he's too gullible to be the-guy-behind-the-guy's favorite author's favorite author.

"Rationalists are more easily duped than ordinary citizens."

This might be true in the sense that Rationalists are more likely to try a nootropic supplement or counterintuitive fitness trend than the common run of human, but that's more hobby than anything else. The ordinary run of humans is so easily fooled that it beggars belief. Not only do time-share scams exist, there's a meta level of TV ads for scam companies that will, for a fee, help you get out of your scam time share.

*@Faceh I have known a lot of Fat Tonys, or at least guys who would have read that piece (if they were in the habit of reading books like that) and identified themselves with Fat Tony. Self-made greasy guys, realtors and salesmen and house flippers and developers and entrepreneurs. They're the kinds of guys to tell me that a slot machine is "hot," that the Sixers should stick with Shake Milton because he hit a few threes against the Lakers, that this particular craps table always hits 11s and to bet on them, that girls who wear Frye boots are always freaks in bed, to never trust Albanians. What Taleb trumpets as street-philosopher skepticism is often closer to over-active pattern recognition; in his thought experiment Tony calls bullshit after 99 tosses, in my experience of those kinds of guys they would have called bullshit after the first 3 landed heads and gone 5:1 odds on Heads for the fourth toss. Which, in Taleb's hypothetical, is great; but just as often guys like that end up having to sign themselves out of the casino, the house is very good at convincing people that their lucky streak will keep running. And they will all tell you how their ex-wife fooled them in the divorce. Great salesmen, in my experience, are often easily sold by other salesmen. Mamet and Miller agree, of course. The quality that enables them to sell so convincingly, their belief in the narrative of the superiority of their own insurance products, makes them vulnerable in turn to believing the narratives of boat salesmen and casino croupiers and loose women. Of course, there are times they recognize a danger that my sheltered education missed, there are also times their superstitious pattern recognition results in betting big on the Giants to win the division because they "have the will to win." There's a reason why the big successful hedge funds hire MIT math majors by the bale, not greasy Eyetalians who successfully guessed how many gumballs were in the jar.

**One could of course argue that maybe these guys seemed like Fat Tonys, but they weren't real Fat Tonys. Fat Tony isn't a useful category if he's impossible to tell from any other fat loser until after he's always right, he's less superpredictor and more Texas sharpshooter.

Right now all I'm getting is "Rationalists are easier to dupe than Nassim Taleb's imaginary friend Fat Tony.*

He's not even getting that. He's getting that imaginary rationalists are easier to dupe than Fat Tony. He doesn't give any actual example of a rationalist making this mistake.

Wait, Larry David was pushing FTX? As in the Curb Your Enthusiasm meme where he tries to do something, it fails spectacularly, and cut to the credits with that jaunty music?

…That’s hilarious.

Honestly, a fantastic ad. Sort of the opposite, which then worked out the other way.

Smartest celebrity in any crypto promo. He only agreed to it because he’d be written in as the non-believer.

Probably made 8 figures and no one can even accuse him of endorsing it.

Wow I had never seen the whole thing. That's a solid commercial, too.

This whole thing works like an extremely meta CYE joke in and of itself. Just layers upon layers of irony at work here.

The more flak Larry David catches for his role in it, the better the meta joke gets.

Your essay doesn’t even use rationalists as an example once in the cases you examine. You have two examples, Ligma Johnson and a Scott Adams statement. The victims of which are journalists and Paul Krugman respectively, neither of which would describe themselves as rationalists. Just a nonsensical essay

Ooooh, Nassim Taleb set up a rigged thought experiment, well that has convinced me!

Taleb couldn't convince me grass is green. I agree the rationalists are a little too credulous, but that's not so much rationalism per se as that the people espousing it round these parts tend to be very nice, well-meaning types who are somewhat idealistic and progressive and are all about the openness to experience, trust, being charitable, and giving the benefit of the doubt.

Now, as to the coin flipping: who is the guy doing the flipping? Is he trustworthy? If you want to tell me that Fat Tony is right, because Taleb is the guy doing the flipping and he's shady as fuck so this is a loaded coin, okay - but I don't think that's the conclusion you want me to draw about the trustworthiness and believability of Taleb.

You want me to trust Taleb, that he is right about Fat Tony. But if Taleb is right about Fat Tony, then Fat Tony is right that this is not a fair coin, and so Taleb is a liar, and so Taleb is untrustworthy, so why should I believe him about Fat Tony?

(1) The coin toss could be fair, and Fat Tony is wrong.

(2) The coin toss could be rigged, and Fat Tony is right.

Just from the details given, we don't know which option is correct. Now, in general and in real life, if someone is trying to get you to bet money or agree to something based on "if I flip this coin 99 times and it comes up heads 99 times, we'll bet on whether it comes up heads the 100th time" - yeah, be suspicious.

In a thought experiment? Where the guy has the incentive to make it that Fat Tony is right, not Dr. John? Yes, that's rigged - but not about the 50/50 chance of the coin coming up heads, but about Fat Tony never, ever being wrong.

Now, as to the coin flipping: who is the guy doing the flipping? Is he trustworthy?

With 99 heads in a row it doesn't really matter. There is literally no one in the entire world that you should trust enough that you would still believe the coin is fair. 2^-99 is a ridiculously small number.
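To put numbers on that intuition, here is a minimal Bayes'-rule sketch. The always-heads rigged coin and the one-in-a-billion prior on cheating are illustrative assumptions of mine, not anything from the thread:

```python
from fractions import Fraction

# Posterior probability that the coin is rigged after seeing 99 heads.
# Illustrative assumptions: a rigged coin always lands heads, and
# "rigged" vs "fair" are the only two hypotheses in play.
def posterior_rigged(prior_rigged: Fraction, n_heads: int = 99) -> Fraction:
    p_data_if_rigged = Fraction(1)              # heads every time
    p_data_if_fair = Fraction(1, 2) ** n_heads  # (1/2)^99, about 1.6e-30
    numerator = prior_rigged * p_data_if_rigged
    evidence = numerator + (1 - prior_rigged) * p_data_if_fair
    return numerator / evidence

# Even granting the flipper one-in-a-billion odds of dishonesty...
posterior = posterior_rigged(Fraction(1, 10**9))
print(float(posterior))  # prints 1.0 -- within ~2e-21 of certainty
```

Any remotely sane prior on human dishonesty gets steamrolled by a likelihood ratio of 2^99 in favor of "the coin is rigged."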

This feels all over the place. Your title doesn't seem to be related to most of what you wrote, and your conclusion comes out of left field. "Here's a bunch of examples about some jokes, this is why rationalists get scammed" seems more like nonsense than a coherent argument.

You also pattern-match badly here - Trump, Adams, Taleb and Musk are the kind of examples that intimate you aren't seriously thinking about this but simply want to dunk on the opposition.

Maybe take this one back to the writing board, and look for more salient examples that support the point you're trying to make.

Your title doesn't seem to be related to most of what you wrote, and your conclusion comes out of left field 'here's a bunch of examples about some jokes

They aren't jokes. It seems you don't want to see what actually happened, the pattern I pointed out, and the significance I very clearly explained.

Your assertion that Ligma Johnson was a genius 5D chess maneuver designed to undermine the authority of modern journalism as opposed to, y'know, a joke isn't compelling so much as it stretches the principle of charity to believe you believe it.

Your patterns aren't just non-obvious, they're non-existent and clearly contrived - badly contrived to support a single political side. You have a pattern of posting half-developed essays that meander for a long time and take a sharp left turn at the end into a conclusion completely unsupported by the argument.

Your thesis is "rationalists are too easily duped". Your supporting points are a hypothetical thought experiment from Nassim Taleb, a tweet including the eggplant emoji from Elon Musk, and Krugman taking a Scott Adams tweet seriously to make his own point (this isn't duping, and I have no idea why you think it is).

Notably, none of these events even involve scams, let alone rationalists. They involve 'deception' in the sense that Krugman doesn't really care if Adams votes for Trump or not, he was using Adams' tweet to make his own point. If you say 'I am a communist and I want higher taxes' in a discussion and I use that as a springboard to argue against higher taxes, you haven't fooled me if you're really a fascist and are just pretending to be a communist. I've said the piece I wanted to say, why do I care that you lied?

As for the end of your essay: if you're genuinely in possession of a secret 'black pill' of deception and persuasion, why are you personally not convincing?

The answer lies in some fairly traditional elements: pathos, ethos, and logos.

There's not much pathos to speak of here, so I won't.

Ethos matters. If your essay is 'wow, look at all these points from people who also hate the woke like I do', you lose a substantial amount of ethos from the get go. You've clearly picked a side and have an investment in it winning, or at least looking good. You seem untrustworthy - why would I believe you're genuine and honest about arguing this? This doesn't sink your essay, but it intrinsically loses you the trust you might need to stretch an argument further than it might ordinarily go.


I maintain that in our modern era rationalists are not nearly as skeptical as they should be. Even people who call themselves skeptics lack skepticism.

Why are none of your examples about real skeptics or rationalists failing to be skeptical? This is a complete failure of logos. You should bring a series of logical reasons I would believe in your stated argument, but you don't. You bring weirdly irrelevant culture-war bits - which ties back into ethos again. If your supporting evidence is both irrelevant and biased, your failure in logos simply increases the failure in ethos.

Hopefully that clarifies things.

As for naivety in general, I would say that we can all make wrong judgements when we step outside our sphere of expertise. We can be wrong even in our own areas (for example, was the FDA right to approve aducanumab or not?), but that tends to be corrected over time. But if I tried to judge when it is appropriate to raise or lower taxes depending on the current state of the economy, I would be wrong most of the time, even if I have some superficial understanding of macroeconomics.

Or an even more science-based example: someone complained that the main bridge in our city was unsafe and should be closed immediately, while a city official publicly announced that it was only a rumour and the bridge was safe. Whom to believe? I wouldn't know unless I had spent a really long time studying the dynamics of bridge safety. It was a real case that happened 5 years ago in Latvia, and the bridge is still standing and in use.

Taleb: “I am going to flip this fair coin 100 times, and after the 99th toss, I want each of you to tell me the probability of the 100th being heads. You should know that each toss is independent and that the coin is fair.”

Taleb flips the coin 99 times, and each of the 99 tosses results in heads.

Taleb: “Now, before I toss the coin for the 100th time, I want each of you to tell me the probability of heads on this next toss.”

Dr. John replies in a calm, studied, and restrained tone, “The probability of the next toss being heads is 1/2 (0.5). This is because you originally told me that the coin is fair and each toss is independent of the rest. So, despite the previous 99 heads, the probability of the next toss being heads is unchanged from the beginning (i.e. 0.5).”

Fat Tony replies with a visibly excited and somewhat agitated demeanor, “This is one big set up! The next toss HAS to be heads! I don’t trust you. You lied when you told us the initial rules.”

What is more likely? That a fair coin lands heads 99 times in a row, or that Taleb lied? Fat Tony doesn’t need to be very intelligent to know the answer: his experiences in life have given him the wisdom to distrust claims like this. Dr. John may have more knowledge about probability, but not about deception.

Fat Tony can smell bullshit and never lets his guard down.

Using a contrived example to warn us against trusting contrived examples. Either outcome is a gotcha. The only fair move is to not play, and the outrage Fat Tony has should apply to the OP as well.

It's an example of "out of the box" thinking. Normally a math puzzle has certain givens that we are supposed to accept, which frame the puzzle. There's always an implicit "if these things are true, then...". But the real world doesn't work like this. If you want to get stuff done, you have to doubt and step out of frames defined by others.

My favorite paradox is Curry's, which highlights how the mere entertainment of a statement's veracity can inject an arbitrary payload into your beliefs if you are naive and not prepared to exit frames and boxes imposed by others.
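For anyone who hasn't seen it, Curry's paradox can be sketched in a few natural-deduction steps; here $Y$ stands for the arbitrary payload (my notation, not the thread's):

```latex
% Curry's paradox: let C be the self-referential sentence C \equiv (C \to Y),
% for an arbitrary claim Y.
\begin{enumerate}
  \item Assume $C$.                                  % hypothesis
  \item Then $C \to Y$, since that is what $C$ says. % definition of C
  \item From (1) and (2), $Y$.                       % modus ponens
  \item Discharge the assumption: $C \to Y$.         % $\to$-introduction, 1--3
  \item But $C \to Y$ is exactly $C$, so $C$ holds.  % definition of C
  \item From (4) and (5), $Y$.                       % modus ponens
\end{enumerate}
```

Nothing about $Y$ was ever argued for; merely accepting the sentence's framing delivers it.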

Fat Tony is basically a contrived pastiche that is used for this purpose throughout Taleb. He's an annoying device for exactly this reason, and I like Taleb.

He comes across as less contrived when you've actually met a few Fat Tonys in real life.

Exactly, Dr. John might have rigged the coin throw but in a way that it doesn't land heads on the last toss.

Scott Adams has done enough lolcow-ish things throughout his life that I am not willing to believe he's an intelligent person "just duping everyone". Sure, you can easily "dupe" people and intentionally get them to engage with you by saying stupid shit on Twitter. That's not particularly intelligent or insightful. Ditto for Trump.

Even more: Krugman unknowingly raised awareness about the very fine people hoax by spreading Adams’ own Trump disavowal hoax. He did Adams’ bidding without realizing it.

Nothing in Krugman's tweets indicates this. He doesn't seem to have talked about the "very fine people" quote at all. Besides, the basic problem with Trump is that he says enough stupid shit that anyone who dislikes him can always find something to get pissed over. It's not worth arguing with them even if they concede that he never called neo-Nazis fine people.

I wanted to like this piece because of the segment on Elon Musk and the Ligma Johnson hoax but the rest of it was pretty meh.

What lolcow stuff has he done?

He keeps starting drama with people and saying dumb shit about mass shootings on Twitter. He has a KF thread; that says it all.

I mean, didn't Dilbert get yoinked from newspapers because Adams started injecting his politics into the strip? I'm amazed it took this long for it to happen, but nonetheless, it seems like a fairly strong repudiation of Adams' legacy.

I read this and it doesn't convince me. Surely it wasn't just politics, but still. It's like when some people say there's no way a white male could be overlooked for promotion in favor of some minority, yet it happens all the time even when there are no specific quotas. There was one person here who got a freedom of information request back from the Canadian government confirming that they only hire people with some minority status because they didn't want to sort through too many qualified candidates.

Okay, being cancelled from 77 newspapers at the same time is kinda suspicious, not gonna lie. But it's otherwise hard to tell the reason (though I grant that we would expect no reason to be given, to maintain plausible deniability).

Okay, being cancelled from 77 newspapers at the same time is kinda suspicious,

Not at all. Those aren't really 77 newspapers, they're all owned by the same company. Given the wafer thin local coverage in most of these papers these days, it's probably better to think of it as 77 local editions of the same paper. The articles in most of those papers will be 50-60% the same, 75-90% the same within the same state/region, with a handful of truly local articles sprinkled through by a small team of local reporters.

This wasn't a "coincidence" because nothing coincided. One guy at corporate decided he didn't like Dilbert, maybe because of politics, maybe because he thought the male counterpart to Cathy had outlived his usefulness.

Like I said, I'm amazed at how long it took, so this thesis is believable, but at the same time, the counter-current the comic represented probably made it an easier choice to axe.

Nice sub. But the author failed miserably with the previous article:

It wasn't that the US thought Russia was 50% weaker than it pretended to be. Russia was actually 50% weaker than the US thought it was.

Dictators tend to overestimate their power. The only reason the US does not push to overthrow them is that it becomes very ugly, like the war in Ukraine with many innocent people dying. But Putin had the choice to do nothing and keep the status quo. Now he has destroyed his own lie about a powerful Russia.

But who is to say that cynicism can't itself be weaponized as easily as naivete? After all, we see politically-liberal people caution against cynicism and doomerism, saying it's deployed by those who wish to preserve the status quo.

It certainly could, which is why I'm not advocating for cynicism; what I advocate for is skepticism. Many true skeptics end up being cynics, but not all cynics are good skeptics.

Religions have clergy and laity. It's the clergy's job to study theology and know what's True, and it's the laity's job to trust the clergy and follow their lead. This archetypal form is so baked into human social programming that you're gonna be hard pressed to supplant it. So secular replacements for religion are bound to follow it.

"Boooring, muh everything's a religion." Rationalism doesn't try to be a religion, you say? Well I'm sure the clergy understands that but tell that to the laity. No actually the laity believe that too and recite it. But at the end of the day they follow the archetypal form. They will call themselves Rationalists but they have jobs and hobbies and lives (as is right and proper for laity of any religion—this is not a dunk) and so they will trust the EYs and the Scotts and the other clergy to do the actual work. The actual thinking. The actual application of Bayesian reasoning. Etc. For their part they simply have faith that they have found the True epistemology and that they follow the lead of trustworthy workers of Truth.

I do not think rationalists are duped more easily than normies who have never heard of Less Wrong, but people who claim to be the smartest people in the world should be held to a higher standard, and the claim that studying their super special sequences will make anyone as smart as they are should be put under close scrutiny.

If you are boasting that you are an Airborne Navy Seal Ranger specially trained in gorilla warfare, and you regularly get your ass kicked by ordinary drunks in bar fights, people would be justified in doubting the value of the super secret martial arts training you offer.

Obligatory link:

Where are All the Successful Rationalists?

Where are all the former losers who read the sequences, pulled themselves by their bootstraps and became brilliant winners?

Point me to where in the sequences they make the claim that you will become unusually successful for having absorbed them. Or where they claim that they're useful to everyone. In The Matrix, the red pill cannot be used on just anyone. And finally, who elected these mainstream media figures who were criticized as rationalist representatives? I don't even really call myself a rationalist, but these are weak swings.

I guess it hasn't made me "successful", but I do think I live a happier and more fulfilling life because of the rationalist literature. Having the tools to make the world make sense is a value in itself.

Having the tools to make the world make sense is a value in itself

Exactly. "I notice I am confused" is a damn superpower.

Before, I used to observe stuff that didn't make sense, think "Huh. That was weird," and then go on about my day.

Now I actually interrogate the phenomenon until it makes sense.

If you see a coin that turns up heads 100 times in a row, your first step should really be "Let me see that fucking coin" and not "wow, what a crazy random happenstance." Hell, if you see a coin that shows heads 99 times then tails on the 100th, you should DEFINITELY demand to examine said coin.

Although this has ruined the entire concept of magic performances for me.

Where are All the Successful Rationalists?

I relate a lot. I have not read a lot of rationalist articles, but it seems to me that a lot of what they do is share ideas amongst themselves, and these ideas are not necessarily true or important, merely interesting. Few of these ideas have anything to do with the real world.

Nassim Taleb talks about putting skin in the game as a way to escape this intellectual circle-jerking, because confronting ideas with reality is the only way to know if there's any truth to them. This follows Karl Popper's falsification principle: if your idea cannot be falsified (in the real world), then it's worthless.

I think the reason there are no successful rationalists is that they don't want their precious ideas to actually be tested in the real world; they'd rather keep them unopened, like collectors do, and just admire them.

Where are all the former losers who read the sequences, pulled themselves by their bootstraps and became brilliant winners?

*✋️ raises hand ✋️*

Hi, I was in a rut of depression and uncertainty and fear of the future in my early twenties, then one summer I read HPMOR, then worked through the sequences, recognized the value, and spent the next several years doing the hard work to adjust my life onto the track that allowed me to actually become happy, healthy, and financially secure. I truly enjoy life now.

Also, I tempered that knowledge with some extra examination of neuroscience, statistics/risk (Fooled by Randomness is a REALLY good book), and the art of rhetoric since, it turns out, merely thinking rationally doesn't get you very far if you can't deal with other "irrational" people in normal conversation.

A few things I credit rationality/the sequences with:

Being aware of and buying Bitcoin very early, recognizing the potential upside.

Being aware to never go all-in on Bitcoin or crypto (we see many, many people never grok this and blow up) at any given time.

Avoiding every single collapsed exchange and rugpull, from Mt.Gox to FTX, and thus never losing my gains to some unexpected event.

Pulling the fuck out of crypto when it became clear it had gone full cheap-money-fueled casino.

It is likely that I would have been one of those poor rubes who got fucked by SBF if I had not gotten my epistemic foundation built on solid ground well in advance.

Now, the caveat is that I have defined "winning" in quite modest terms. So my success is not amazing when compared to what many others who aren't rationalists have achieved. But it has put me in a position where virtually no single event (not counting X-risks) can wipe me out. And that's the fucking dream.

So I will strongly maintain that the sequences are a force for good, even if they haven't caused humanity as a species to vault to a higher state of being in a single decade.

And I did all the same as you in crypto while never fully reading the sequences (I have read some, but find them too much of a waste of time to complete), and while reading HPMOR fully but considering it a joke as anything other than an amusing diversion. Meanwhile, there are also plenty of people who worship the sequences and HPMOR and "rationally" dismissed or significantly underestimated crypto (as complained about on LessWrong itself many times) in its free-money bonanza days, despite being fully aware of it.

It seems more likely to me that you simply have decent (at least in one proven realm) intellectual instincts and latched on to sequentialist rationalism as the means through which to express them. But it's the good instincts, not the book, without which you'd be nothing, same as plenty of people who had the book but not the instincts.

I think the "good instincts" amounted to being aware that I was behaving in irrational/suboptimal ways, to my own detriment and there were probably tools out there to improve on this, if only I could find them.

The question that kept recurring in my head was "there's plenty of people who can give me advice on various decisions I'll have to make... but how the fuck can I know which advice is good?" Blindly accepting the advice of people I considered "authority figures" had already failed me badly.

That was the "bootstrap" portion of it. Being able to assess information in a systematic way so as to identify and make use of good information and, generally, discard bad/useless information (and none too soon, given how the ratio of useful to useless information has decreased exponentially).

Or, as the sequences put it, to be more confused by lies/falsehoods than truth.

I lacked any reliable tools for doing this despite having, as stated, the intuitive sense that the tools ought to exist.

Which really speaks ill of my college education, I should add.

The main way this helped in Crypto was the very early realization that nobody on the crypto subreddits knew shit about finance, they were all self-interested, and mostly dishonest (or self-deluded). So I went and learned to understand finance and ignored 99% of what the community had to say.

Only regret I have is not jumping on Dogecoin early on. Had no reason to think it would have this kind of longevity, though I did predict that its community would fail to keep any of its early ethos intact.

I dunno, I think my life ends up very different if I never read the sequences. I would probably be one of those types who "fucking loves science" but really just uncritically accepts what experts say. And that would have caused me some problems when Covid hit.

Also, being plugged into the rationalist community (and, relatedly, /r/themotte) kept me like 3 months ahead of the curve on understanding the pandemic.

Over the years I've made better life decisions in a hundred little ways that would be hard to sufficiently articulate here, that I think the counterfactual version of me handles more poorly overall.

Also, being plugged into the rationalist community (and, relatedly, /r/themotte) kept me like 3 months ahead of the curve on understanding the pandemic.

Really? This place was overflowing with doomer takes about the pandemic as the "big one" (as opposed to the big scam) that aged terribly, and as far as I can tell there's still no widespread recognition here that people were overly hasty and insufficiently scrupulous about their vax shilling.

If I had listened to /r/themotte I'd probably have my furniture made of worthless (or at least mostly unnecessary) N95 masks by now.

That's the thing. Themotte was quicker to see that masks might be helpful (whilst the CDC was literally saying "stop buying masks!"), but also quicker to shift away from them as it became clear that this wasn't going to be the civilization-ending event it might have been.

The biggest insight I received from /r/themotte specifically was someone pointing out that viruses tend to mutate towards less lethal versions since that is optimal for long term spread.

Which is exactly. what. happened. Remember Omicron was more contagious and less deadly?

In absolute OCEANS of misinfo on the right and the left, and absolute collapse of expert guidance, themotte was basically the equivalent of a lighthouse in a storm.

Reddit at large was still in favor of mandatory masking FOR CHILDREN long after some posters here had already pointed out that this didn't actually help and might actually HURT young children's development. The latter being a point the CDC (I think) agreed with until it became politically unfavorable and they pulled that info from their site.

I'll go back and pull up the actual comments from the old sub if you don't believe it.

Of course, you do have to be able to sift useful information from non-useful to get the full benefit. But see my whole comment above about rationality teaching exactly that.

Fair, but I still resent the "rationalist" side of the Internet (well it wasn't all rationalists per se, but it was mostly fringey Internet commentators at least at the very beginning, not established media figures, Substacks and Mediums at best) for (and I'll admit contributing myself to some of the first point, to my regret, which is why I think it's worth pointing out):

  1. Essentially greenlighting the whole hysteria. Sure, established authorities weren't taking the threat as seriously as they should have at the beginning and maybe needed a little kick in the pants, and sure, many rationalists called BS on the alarmism once the novel virus became less novel and was revealed to be far less dangerous than initial concerns (which happened far before Omicron, by the way, and as early as the first global strain, so anybody only admitting it then was way behind the curve). But if rationalists really were all that rational, they should perhaps have seen two steps ahead instead of just one and realized that it would be very hard to take back the panic they helped drum up once it got rolling, especially since it was known that viruses have a tendency to moderate their own mortality as they spread, as you mentioned. Instead I think many people were desperately excited to finally get to go into "X-risk" mode and prove how Serious™ they are, and the resulting mindset of paranoid doomer absolutist safetyism was hijacked by established authorities for their totalitarian ends and became the dominant attitude of authority throughout the pandemic, until it was unceremoniously ended by Putin.

  2. Particularly on /r/themotte (though obviously this particular issue was far worse in the non-terminally online realm in general), again the vax was shilled far beyond available sensible justification (and I haven't seen any retractions), especially for people who had supposedly appropriately absorbed SSC's reflections on metascience/the replication crisis and the flimsiness of so much "research" and so many "studies" because they are too hasty, unexacting, and corrupted by perverse incentives (like how about being conducted by the same people trying to sell the object of study as one of the most profitable pharmaceutical products of all time?). (But I'm pretty sure Siskind got the jab too (or I assume his polyorbit or whatever would have screeched at him until he had) so maybe even he didn't absorb his own reflections. Hopefully he faked getting it.)

I certainly won't say there was no insight on the subject to ever be found on /r/themotte. Its early campaign in favor of variolation was a good idea and probably would have been far superior to the vaccination we got.

The reason that you had many rationalists skeptical of the hysteria but still supporting vaccines is that rationalists tend to be scientifically trained, so even if they don't listen to the scientists who are signal-boosted by the media, they can understand what vaccines do based solely on their own knowledge.

they can understand what vaccines do based solely on their own knowledge.

If that were true then they would have been far more skeptical of taking them, especially for the younger demographics of their own community.

But no, I don't believe your average /r/themotte poster was in any way particularly "scientifically trained" in mRNA vaccine platforms before the debut of the most recent ones.


I am happy that you've found success. May you be your best self in enjoying the good that you've earned.

I would not describe myself as a rationalist, but I recognize what you've discovered--there's value to be found here, and it's worth the time to seek out. I suspect that SBF found some valuable facts about the world, but not the much more valuable attending wisdom, and proceeded to apply his lessons much less well than you've done. Perhaps he'll learn something from the ruin his mistakes have caused, but even if so, it will have been purchased at great cost to many others.

I'm increasingly convinced that SBF was acting with some level of malice aforethought and he was using EA as a decent camouflage.

But whatever he did learn from the rationalists, he missed the lessons on how deontology is extremely useful for putting up behavioral guardrails so that your fallible human hardware doesn't end up causing you to commit moral atrocities.

I mean, half of us are Quokkas, and the other half are Quokka contrarians who developed cynicism as a defense strategy that only works somewhat worse than having the social intuitions of a non-autist.

Of course we are easily duped. How is that surprising?

What is a Quokka?

Pejorative nickname given to rationalists when the Scott Alexander/NYT stuff went down. See this twitter thread by reactionary sci-fi/horror writer "Zero HP Lovecraft".

One of the thoughts that I've been kicking around in my head in relation to my long-delayed (read: procrastinated) effort-post is how a lot of blue tribe progressive types seem to be unfamiliar with the concept of the permissive vs contested vs hostile environment. I see people complaining about getting banned from an internet forum or reported to the FBI, and my first reaction is the James Franco meme from Buster Scruggs: is this your first time? Similarly, back in the Clinton days (that is, the early 90s) I recall a lot of talk along the lines of "why are otherwise intelligent people buying this shit?" Nobody actually believes that the president didn't inhale or fuck Paula Jones, do they?

My working theory is that wealthy Yale and Stanford types don't really get a lot of exposure to predators and con-men at a young age and thus don't develop the mental antibodies against them before entering the business world. Meanwhile, the kid who grew up around used car salesmen probably understands "the nudge" better than those with a 4-year degree in marketing.

That's right. Rationalists claim it was rational to trust Sam Bankman-Fried, because if his pitch was part of an academic exam to see if this person was credible, trust would be the right answer.

But that's the thing: we are not in an academic exam, this is the real world, and people are going to try to exploit your blind spots.

I often wonder if these people play poker, video games, or any kind of board game where deception is part of the game.

I'm not sure this hypothesis is correct. IME, propensity to be conned doesn't really have much to do with community values but does have a lot to do with education, time preference, and intelligence (though those are certainly no proof against it).

Nobody actually believes that the president didn't inhale or fuck Paula Jones, do they?

Nobody believes it. The important thing there is that the president kowtowed to prevailing norms by disavowing his behavior, even if his excuses are obviously bullshit.

IME, propensity to be conned is correlated with exposure to cons, and has no relation to education, time preference, or intelligence. It's just about having the mental habit of double checking "Could this person be conning me?" and a willingness to accept when the indicators are yes.

I agree, but I think the word is skepticism. You don't need to be intelligent or educated to be a skeptic. It's just a mental muscle: the more you doubt claims, the easier it becomes to doubt claims.

I don't follow. Not growing up around con men could result in the kind of naivete that would make you believe Bill Clinton did not inhale, but like you said, I don't think anyone actually believed that.

George Orwell was probably more on target with the concept of doublethink.

No submission statement, not reading.

The headline talks about rationalists, but the article actually talks a lot about people who aren't rationalists at all. Like journalists. Or Krugman. Who are very easy to dupe, because they want to be duped. They actively go out and look for people who can be used as props to launder their agenda through them, and in some cases, if they fail, they manufacture them (somehow this is considered to be much worse behavior than cherry-picking props, while being essentially the same). This is an easy trap to fall into - and I am sure many people declaring themselves rationalist fell into it too, because they are human. If you build a trap skillfully and put tasty enough cheese inside (different cheese for different people), a lot of people will get caught. Some of them may be calling themselves "rationalists", some of them may even try to become less easy to get caught - but they are imperfect humans, so they'll get caught anyway. That is to be expected. Doubly so if they actually profit in one way or another from getting caught (like journalists or political activists - which are pretty much one and the same nowadays). For those, passing up a good "boo outgroup" story is almost inhumanly hard, so there are most of your examples.

To add to this, there's also the element of betting.

Humans, even rationalists, have to make decisions without the time to obtain perfect knowledge. It's only prudent to place bets if you think the upside might be big and the downside small. In other words, there were probably rationalists in the OP's sample who donated to/took money from SBF while thinking this was all likely going to blow up in their faces. This isn't a case of conflicting beliefs--it's playing the odds.

Plus, the characterization of "rationalists" seems to me a faulty generalization. There are probably very few people who make their lives revolve around rationalism. But rationalism isn't some monastic order that stamps out mentat-like Rationalists, so in the real world "rationalist" describes everyone from hyperlogical Bayesian wizards to folks who like a good argument and enjoy eating popcorn while watching the Culture War eat itself.

Humans, even rationalists, have to make decisions without the time to obtain perfect knowledge.

Yes, sometimes, but a lot of times they don't have to make a decision, and they do anyway. For example, if I enter a meeting I will want to sit down; I don't know that the chair isn't broken, but I sit down anyway. Is not checking the chair a mistake? No, I can make a decision without perfect knowledge. But what about a raffle? I also don't know that I'm going to lose, so it might make sense to buy a ticket, but I don't have to. You'll say that I made a decision anyway, but not necessarily: a lot of times the result is "I don't know", and that's not really a decision.

It's only prudent to place bets if you think the upside might be big and the downside small.

That depends on the odds. A small upside and big downside might make sense if the odds of losing are sufficiently small.
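To make that concrete: a bet's expected value is (chance of winning × upside) minus (chance of losing × downside). A minimal sketch in Python, with numbers made up purely for illustration:

```python
def expected_value(p_lose, upside, downside):
    # Win `upside` with probability (1 - p_lose); lose `downside` with probability p_lose.
    return (1 - p_lose) * upside - p_lose * downside

# Big upside, small downside -- the "prudent" bet described above:
print(expected_value(p_lose=0.5, upside=100, downside=5))      # clearly positive

# Small upside, big downside -- still worth taking if losing is rare enough:
print(expected_value(p_lose=0.001, upside=10, downside=1000))  # still positive
```

So whether a bet is prudent really is a function of the odds, not just of the sizes of the payoffs.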

In other words, there were probably rationalists in the OP's sample that donated/took money from SBF while thinking this is all likely going to blow up in their face.

But those are two different things. Taking money from a person is one decision, trusting that person is a completely different one. You can take money from a person without trusting them.

The difference between skeptics and normal people is not readily apparent. We both sit on a chair without checking if it's broken, but I as a rational skeptic do not assume it is unbroken. The end result looks the same, but the mental model is different: I do not have a belief.

but I as a rational skeptic do not assume it is unbroken.

And yet you assume you have access to other people's mental models.

No, I ask them what they believe, and they tell me.

Some of them may be calling themselves "rationalists", some of them may even try and become less easy to get caught - but they are imperfect humans, so they'll get caught anyway.

But the point is not that they get caught, all humans indeed have the potential to get caught at some point in their life, the point is why. Why do people get burnt touching a pan?

So, what's your answer for the why that is special to rationalists? I say my answer is a common human one: they thought the pan was not hot, or maybe they wanted what was in the pan too much to reasonably evaluate the chances that it'd be too hot. People do that. I'm not too proud to admit it's happened to me.

You are forgetting the most common reason: they have never encountered a hot pan in their life (e.g. kids). They get burnt because they didn't think they could get burnt. This also happens to adults who should know better after a while of not dealing with hot pans.

People who have never been scammed are the easiest to scam, precisely because they don't think it could possibly happen to them. Hubris and overconfidence are known to make intelligent people commit obvious mistakes they otherwise would not.

It's actually looking like the least common reason. It can only happen to you once in your life. If you've ever been burned by a hot pan more than once (I have, and I assume most other people have too; pretty much every adult has had this experience, and yet adults regularly get burned by hot pans), then that's not the most common reason for you. It's hard for something to be the most common reason if it can only happen once in your whole life, and you have plenty of warning before it.

OTOH, I'm pretty sure a lot of people tried to scam rationalists - because a lot of people try to scam everybody; look into your mailbox under "Spam" and you'll probably see a dozen scam attempts every day. Surely, they hadn't been scammed this particular way before, but nobody had been scammed this particular way before, so there's nothing special about rat circles. BTW, a lot of much more weathered people - journalists, politicians, Hollywood types, etc. - accepted SBF with open arms. It's not like everybody but rationalists rejected him and only those doofuses got caught. Nobody within the Well Respected People circles rejected him. He had investment from the best and most respected venture funds. Financial regulators planned to use him as the example of a "good crypto investor". He had a CFTC license. Those people have not only seen every scam there is, they are supposed to be the supreme authority of the land for determining which is a scam and which is not. They failed. Surely there were many reasons for that. Not having seen a scam before in their lives isn't one of them.

This is all part of a ploy to get me viewing another mediocre substack.

Not today!

All that glitters is not gold.

I believe the appropriate retort would be: "Fare you well; your suit is cold."

...Well, not quite. This is the second article of yours I've read; both have been "haha, everyone's such a moron because they don't know this thing I know" gloating while getting basic facts wrong about the main subject of the post ("what is Z4?" and "who is and isn't a Rationalist?" respectively), and on top of that, both times you've gone into the comments section trying (quite fruitlessly, if your reaction counts are any barometer) to out-rhetoric your critics.

I'll be blunt: if, the next time you make a thread linking your blog and I see it (i.e. not the one you've already made that I haven't read yet; a new one), the article is this poorly-researched and/or you're playing these kinds of games in the comments, I'll stop following your links. I'm not asking that you even write something good - that'd be unfair, since nobody can write good articles all the time and some people can't write good articles any of the time - just a) don't act all superior without getting the basic premise of your article right, b) either don't engage with the criticism, or do it in a constructive fashion rather than trying to score points.

If it's any consolation, I like your Substack and don't think it's too mediocre.

Thanks. It's a work in progress to try to question the fundamentals of belief, and the discussions it has generated show it's surprisingly difficult to get intelligent people to question their own cherished beliefs, which in the case of rationalists in theory should not be the case.

it's surprising difficult to get intelligent people to question their own cherished beliefs

Only if your prior was that intelligent people should be easy to get to question their cherished beliefs. The reverse seems to be the case: it is dumb people who know they are dumb who change beliefs easily. Smart people do not, by and large, change their beliefs, no matter the evidence.

But contrary to the prediction of SCT, such polarization did not abate among subjects highest in Numeracy; instead, it increased. This outcome supported ICT, which predicted that more Numerate subjects would use their quantitative-reasoning capacity selectively to conform their interpretation of the data to the result most consistent with their political outlooks.

Raw intelligence, g, or IQ is an impediment to wisdom. It allows us to bully others with complex arguments, Euler math, and factoids, which reinforces our intellectual arrogance. Being smart moves you further from Truth, not closer. It is a handicap to be struggled with, not a superpower.

Only if your prior was that intelligent people should be easy to get to question their cherished beliefs.

Not really. It's people who claim to follow logic, reasoning, and scientific thinking who tend to be intelligent, but not all intelligent people make that claim. These people (the scienticians) should in theory understand that they should conform their beliefs to the data, not the other way around. Science is supposed to be set up to avoid confirmation bias, and that's why the falsification principle that Karl Popper formulated was supposed to be so powerful.

But yeah, they disregard all that when their beliefs are sufficiently cherished.

Smart people do not, by and large, change their beliefs, no matter the evidence.

That has been my experience.

Very interesting. But not at all surprising to me.

Raw intelligence, g, or IQ is an impediment to wisdom.

Weird, I started the article writing precisely about the difference between intelligence and wisdom, but it diverged so much that I changed the topic. I'll finish the article about wisdom later.

It allows us to bully others with complex arguments, Euler math and factiods, which reinforces our intellectual arrogance. Being smart moves you further from Truth, not closer. It is a handicap to be struggled with, not a superpower.

I think this is the case, but it shouldn't be. Smart people have the capacity to move closer to the truth, but only by using the right heuristics, and scientific thinking clearly doesn't seem to be sufficient. Intellectual humility is necessary, as is accepting the possibility that they could be wrong, which many don't.

This is a very Motteish, meta contrarian hipster thing to say. This seems absurd, contrary to reason and personal experience. Intelligent people are not right about everything, but I would find it hard to believe they are wrong more often than stupid people.

The upper classes are not entirely devoid of superstition and conspiracy theory, but talking to an average lower class person for even a few minutes generally exposes truly wild reptilian-level beliefs in a senseless mishmash. You are romanticizing retards.

but I would find it hard to believe they are wrong more often than stupid people.

That's not what I or the research said. They are less likely to change their beliefs. Whether they are wrong more often than less intelligent people depends a great deal on the intellectual fashions of their social class and the subject of debate. If it's something boring, technical, and non-culture-war, they're probably wrong less often than dumb people; if it's classifying the sexes of the human species in 2022, they're probably wrong more often.

"Rationalists" are just as inclined to use "rationalism" to reinforce the priors they came in with as opposed to challenging them. That's not the only reason, but I consider the whole "movement" silly.

I'm not very familiar with the movement, but after a few interactions with them I feel like they are even more inclined to reject evidence against their beliefs than the average person. I debated Scott Alexander on Reddit, and after I pointed out fallacies he committed, he straight up rationalized that committing fallacies wasn't a problem, and that me pointing them out was too basic and "uninteresting".

He said by pointing out fallacies taught in philosophy 101 I was not responding to his argument, but isn't the whole point of fallacies being taught in philosophy 101 to avoid making them in arguments? A fallacious argument is invalid, so "this is a fallacy" is all the response needed.

I don't see how he could possibly think he is beyond the realm of fallacies.

I'd love to see a link to this if you wouldn't mind.

That was a lot of words to say very little that the quokka tweet didn't say already.