felipec

unbelief

1 follower   follows 0 users   joined 2022 November 04 19:55:17 UTC

No bio...

User ID: 1796

Verified Email

It's hard for something to be the most common reason for something if you can do it only once in your whole life, and you have plenty of warning before it.

No it's not, it's basic statistics. You can only donate your heart once by dying, and guess what's the most common reason for heart donation: death.

OTOH, I'm pretty sure a lot of people tried to scam rationalists

False equivalence fallacy.

Surely, they haven't been scammed this particular way before, but nobody has been scammed this particular way before, so there's nothing special for rat circles.

Yes they have. Financial fraud is not new.

BTW, a lot of much more weathered people - like journalists, politicians, Hollywood types, etc. - had accepted SBF with open arms.

This has nothing to do with my argument, you are attacking a straw man of it. Of course there are dumb journalists who fell for the scam, but the intelligent ones with solid epistemology likely did not, because they have epistemic humility.

Only if your prior was that intelligent people should be easy to get to question their cherished beliefs.

Not really. It only applies to people who claim to follow logic, reasoning, and scientific thinking. Those people tend to be intelligent, but not all intelligent people make that claim. These people (the scienticians) should in theory understand that they should conform their beliefs to the data, not the other way around. Science is supposed to be set up to avoid confirmation bias, which is why Karl Popper's falsification principle was supposed to be so powerful.

But yeah, they disregard all that when their beliefs are sufficiently cherished.

Smart people do not, by and large, change their beliefs, no matter the evidence.

That has been my experience.

http://culturalcognition.squarespace.com/browse-papers/motivated-numeracy-and-enlightened-self-government.html

Very interesting. But not at all surprising to me.

Raw intelligence, g, or IQ is an impediment to wisdom.

Weird, I started writing the article precisely about the difference between intelligence and wisdom, but it diverged so much that I changed the topic. I'll finish the article about wisdom later.

It allows us to bully others with complex arguments, Euler math and factoids, which reinforces our intellectual arrogance. Being smart moves you further from Truth, not closer. It is a handicap to be struggled with, not a superpower.

I think this is the case, but it shouldn't be the case. Smart people have the capacity to move closer to the truth, but only by using the right heuristics, and scientific thinking clearly doesn't seem to be sufficient. Intellectual humility is necessary, as is accepting the possibility that perhaps they could be wrong, which many don't do.

I'm not very familiar with the movement, but after a few interactions with them I feel like they are even more inclined to reject evidence against their beliefs than the average person. I debated Scott Alexander on reddit, and after I pointed out fallacies he committed, he straight up rationalized that committing fallacies wasn't a problem, and that me pointing them out was too basic and "uninteresting".

He said by pointing out fallacies taught in philosophy 101 I was not responding to his argument, but isn't the whole point of fallacies being taught in philosophy 101 to avoid making them in arguments? A fallacious argument is invalid, so "this is a fallacy" is all the response needed.

I don't see how he could possibly think he is beyond the realm of fallacies.

Your title doesn't seem to be related to most of what you wrote, and your conclusion comes out of left field: 'here's a bunch of examples about some jokes'

They aren't jokes. It seems you don't want to see what actually happened, the pattern I pointed out, and the significance I very clearly explained.

No, I ask them what they believe, and they tell me.

You are forgetting the most common reason: they have never encountered a hot pan in their life (e.g. kids). They get burnt because they didn't think they could get burnt. This also happens to adults who should know better after a while of not dealing with hot pans.

People who have never been scammed are the easiest to scam, precisely because they don't think it could possibly happen to them. Hubris and overconfidence are known to make intelligent people commit obvious mistakes they otherwise would not.

Thanks. It's a work in progress to try to question the fundamentals of belief, and the discussions it has generated show it's surprisingly difficult to get intelligent people to question their own cherished beliefs, which in theory should not be the case for rationalists.

It certainly could, which is why I'm not advocating for cynicism; what I advocate for is skepticism. Many true skeptics end up being cynics, but not all cynics are good skeptics.

What is a Quokka?

Humans, even rationalists, have to make decisions without the time to obtain perfect knowledge.

Yes, sometimes, but a lot of times they don't have to make a decision, and they do anyway. For example, if I enter a meeting I will want to sit down. I don't know that the chair isn't broken, but I sit down anyway. Is not checking the chair a mistake? No, I can make a decision without perfect knowledge. But what about a raffle? I also don't know that I'm going to lose, so it might make sense to buy a ticket, but I don't have to. You'll say that I made a decision anyway, but not necessarily: a lot of times the result is "I don't know", and that's not really a decision.

It's only prudent to place bets if you think the upside might be big and the downside small.

That depends on the odds. A small upside and big downside might make sense if the odds of losing are sufficiently small.
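A rough back-of-the-envelope sketch of that point (the numbers below are made up purely for illustration, not taken from anything above): what matters is the sign of the expected value, not the raw sizes of the upside and downside.

```python
# Hypothetical numbers, chosen only to illustrate the point: a small upside
# and a large downside can still be a favorable bet if the odds of losing
# are small enough.
p_lose = 1e-9            # probability of hitting the downside
upside = 1.0             # small gain if nothing goes wrong
downside = 1_000_000.0   # large loss if something does

expected_value = (1 - p_lose) * upside - p_lose * downside
print(expected_value)    # ~0.999: positive, so the bet is favorable on average
```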

In other words, there were probably rationalists in the OP's sample that donated/took money from SBF while thinking this is all likely going to blow up in their face.

But those are two different things. Taking money from a person is one decision, trusting that person is a completely different one. You can take money from a person without trusting them.

The difference between skeptics and normal people is not readily apparent. We both sit on a chair without checking if it's broken, but I as a rational skeptic do not assume it is unbroken. The end result looks the same, but the mental model is different: I do not have a belief.

I agree, but I think the word is skepticism. You don't need to be intelligent or educated to be a skeptic. It's just a mental muscle: the more you doubt claims, the easier it becomes to doubt claims.

Where are All the Successful Rationalists?

I relate a lot. I have not read a lot of rationalist articles, but it seems to me that a lot of what they do is share ideas amongst themselves, but these ideas are not necessarily true or important, merely interesting. Few of these ideas have anything to do with the real world.

Nassim Taleb talks about putting skin in the game as a way to escape this intellectual circle jerking, because confronting ideas with reality is the only way to know if there's any truth to them. This follows Karl Popper's falsification principle: if your idea cannot be falsified (in the real world), then it's worthless.

I think the reason why there are no successful rationalists is that they don't want their precious ideas to actually be tested in the real world; they'd rather keep them unopened like collectors do, and just admire them.

That's right. Rationalists claim it was rational to trust Sam Bankman-Fried, because if his pitch was part of an academic exam to see if this person was credible, trust would be the right answer.

But that's the thing: we are not in an academic exam, this is the real world, and people are going to try to exploit your blind spots.

I often wonder if these people play poker, video games, or any kind of board game where deception is part of the game.

"You" are less than 0.0001% of the population, so virtually nobody.

No, it remains to convince you that X is false.

If only there was a person willing to engage in open debate who I had a chance to convince, but sadly there's none. There is no point in debate if one side is completely closed off.

Some of them may be calling themselves "rationalists", some of them may even try and become less easy to get caught - but they are imperfect humans, so they'll get caught anyway.

But the point is not that they get caught; all humans indeed have the potential to get caught at some point in their life. The point is why. Why do people get burnt touching a pan?

Now you're making an unsupported assumption about my character instead of an argument. Retract it and apologize.

You just accepted below that your mind cannot possibly be changed.

Do you accept the possibility that X may be true? Yes or no.

No.

That's the end of the road then.

But 0 is what we think, because 0 is 4.

Nobody thinks that 0 is 4.

I did not claim and did not need to claim anything about all instances of building Hello World in assembly; the idea that I was trying to is an assumption that you made.

This is obviously not the case because this was not an aside, but an analogy to another point that you were making. You were clearly saying that a) "coding Hello World in assembly" is never b) "coding Hello World in assembly", and always c) "coding Hello World in assembly", and there was no other possible way to interpret that.

You used that to substantiate your claim that Bertrand Russell didn't actually want to prove 1+1=2, but wanted to do something else using the proof 1+1=2 as a tool.

But in both cases you made assumptions: what you claimed is not necessarily true.

You think you get them right. So that's a "no, I don't question my fundamental beliefs".

All that glitters is not gold.

Yes. But the whole point of my post is to get people to reconsider what basic notions like 2+2 are.

And if I understand correctly, in ℤ/4ℤ there is no 2 in the underlying set; the element is the coset {...,-6,-2,2,6,...}, so it's actually {...,-6,-2,2,6,...}+{...,-6,-2,2,6,...}={...,-8,-4,0,4,8,...}, or something like that. 2 is just a shorthand for the coset.
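As a quick sanity check of that coset arithmetic, here is a small sketch of my own (the real cosets are infinite, so they are truncated to a finite range just so they can be printed):

```python
# Illustration only: the cosets of 4Z are infinite sets, truncated here to a
# finite range so they can be displayed.
def coset(r, n=4, limit=12):
    """The coset r + nZ, restricted to [-limit, limit] for display."""
    return {k for k in range(-limit, limit + 1) if k % n == r % n}

def coset_sum(a, b, n=4, limit=12):
    """Element-wise sums of two truncated cosets; every sum lands in (a+b) + nZ."""
    return {x + y for x in coset(a, n, limit) for y in coset(b, n, limit)}

print(sorted(coset(2)))         # [-10, -6, -2, 2, 6, 10], i.e. "2"
print(sorted(coset_sum(2, 2)))  # [-20, -16, ..., 16, 20], i.e. the coset of 0
```

So 2+2=0 in ℤ/4ℤ is really a statement about those sets, and the single digits are just the conventional representatives.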

How does one distinguish between someone making an assumption, and someone only appearing to be making an assumption?

By checking whether or not the person considers the possibility that the claim is not necessarily true. And if not, whether or not the claim is substantiated by evidence or reason.

Or the other way: if the person considers the claim to be 100% certain to be true without any evidence or reason to substantiate it (it just is).

I'm loving this #TwitterDown saga getting woke progressives melting down: 'Twitter Is Dead,' 300 Million People Post On Twitter.

Listening to your case and engaging with your argument will make me change my mind

No it won't.

No, it's still possible for me to be convinced of true things.

Obvious circular reasoning. You believe X is false, and you say it would be possible for you to be convinced that X is true if X were true, but X is false, because you believe X is false. It could not be more obvious.

Do you accept the possibility that X may be true? Yes or no.