felipec
unbelief

That's the same thing. If it fails to show they are guilty, then they are innocent.
No, they're not. That's the whole point of the article. The legal system considers a person who has been acquitted to not be found guilty, which is why the jury renders the verdict quite literally "not guilty".
Ukraine as a country isn't particularly important and the population is likely to be hostile to Russia, meaning that to integrate it into Russia proper will be difficult if not impossible.
This is an oversimplification; there's no such thing as the "Ukraine population": different people have different beliefs. This is like saying the "USA population" believes X. Sure, some do, but not all.
You can say the majority of the population is likely to be hostile to Russia (I have my doubts about that), but some will not.
Definition of smug: "highly self-satisfied". If he said the tone was smug, it's because he believed the person writing such prose was "highly self-satisfied".
There is no "gray prison", there's only prison. In the real world at some point decisions must be made.
Recordings can be faked, even if it's unlikely.
Which is why evidence should not be considered proof. People often confuse the two, for example in the aphorism "absence of evidence is not evidence of absence" which is incorrect: it is evidence of absence, it's just not proof.
Why should I just assume I'm talking to a human?
You shouldn't. Why do you need to know that I'm a human?
Effectively, you just did.
No, I did not. You are committing a converse error fallacy.
I guess you will remain at the idea that only fools can be ever deceived
You have no idea what I believe, because you refuse to listen when I tell you directly: "I did not say that". I know what I said, and only I know what I meant, and I know that you don't know what I meant, and when I tell you directly, you completely ignore me and keep believing I said what I most definitely did not.
I ask a woman out for a date, do I trust that she is going to say yes?
Of course not.
Then why did I ask her out?
- p: I trust a woman will say yes
- q: I ask her out

p ⇒ q

If I did trust a woman would say yes, then I would ask her out; but if I ask her out, you do see how it would be a fallacy to conclude that I trusted she was going to say yes (converse error fallacy), yes?
Therefore you cannot assume I did trust some outcome based on some action. By the exact same token if I go to the store you cannot assume that I trusted any outcome, including me not being murdered. I am rolling the die.
If p (I trust I'll roll a 7), then q (I roll the die); but q (me rolling the die) doesn't imply p (I trust I'll roll a 7). There are obviously other reasons why I would roll the die, including me hoping--not trusting--that I'll get a 7.
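The converse error can also be checked mechanically. A minimal Python sketch (the helper names are mine) that searches the truth table for a case where q holds and p ⇒ q holds, yet p is false:

```python
from itertools import product

def implies(a, b):
    # Material implication: p => q is false only when p is true and q is false.
    return (not a) or b

# Search the truth table for a counterexample to the converse:
# a case where q holds and (p => q) holds, yet p is false.
counterexamples = [
    (p, q) for p, q in product([False, True], repeat=2)
    if q and implies(p, q) and not p
]
print(counterexamples)  # [(False, True)]: q can hold without p
```

So observing q (the action) is consistent with p being false (no trust), which is exactly the converse error.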
You assume that me engaging in some activity necessarily implies trust; you are 100% wrong.
I was not smug, you believed I was smug. Big difference.
I don't require 100% certainty to call anything true, but even if I did, I don't need to call absolutely anything true.
The single value is just the point estimate of your belief.
There is no "point estimate" of my belief because I don't believe anything.
You are trying to pinpoint my belief on a continuum, or determine it with a probability function, but you can't, because I don't have any belief.
You seem to believe Bayesians only care about the point estimate and not the whole probability distribution.
Do you have any source for that? Do you have any source that explains the difference between a coin flip with 0/0 priors vs 50/50?
I agree. Climate change is one of the areas I'm most skeptical about. I believe that if true it's one of the most important issues of our time, and I've seen evidence that climate change is indeed happening, and it's indeed caused by human activity, but evidence isn't proof.
I have also seen enough evidence to be skeptical about the amount of damage human activity is actually causing--as opposed to random fluctuations. And also to be skeptical about the irreversible damage, for which there's evidence that it's actually reversing.
So my conclusion so far is the default position: I don't know.
But one absolutely can not function in a society without trusting somebody with something.
You are patently wrong. I don't trust anybody.
You go to a store and you trust the owner not to murder you, feed your body to the pigs and take your money.
No, I don't.
You put your money in the bank and you trust the bank not to refuse to give it back, or the society to be on your side if they do.
No, I don't.
You get employed and you trust your employer to pay you and not to sell your data to the identity thieves and ghost you, etc.
No. I don't.
Sidenote: before you say "I actually never trust anybody, I grow my own food on the top of remote mountain and never speak to another human being unless I see them through the sights of my rifle, and only to procure ammunition for the said rifle, and I demand it upfront" - good for you
I don't say that.
We trust somebody many times a day if we live in a society, and in the most of these cases the trust is reciprocated with cooperation.
I don't think you have the slightest idea of what trust means and how society actually works.
Answer this: I ask a woman out for a date, do I trust that she is going to say yes?
Sorry about the delay.
Here's the subthread: Rationalists are too easily duped.
This is the second article of yours I've read; both have been "haha, everyone's such a moron because they don't know this thing I know" gloating but getting basic facts wrong about the main subject of your post ("what is Z4?" and "who is and isn't a Rationalist?" respectively)
Two converse error fallacies don't make one right.
Your conclusion is still unjustified.
You talk as if your intellect is superior to mine, but I seriously wonder if you even know what a converse error is, and if you can provide an example without looking it up.
I wasn't talking about the rationalist movement, I was talking about people who are generally considered very smart / rational / scientific / humanist, or whatever term you want to call them.
That being said, people in the rationalist movement do suffer from precisely the same deficiency, and proof of that is that many were duped by Sam Bankman-Fried.
I think you are missing the forest for the trees. The examples in the article are used to exemplify a single problem, do you understand what that problem is?
I disagree. The burden of proof is an inherent property of the claim, not the person.
If a deficient person makes claim X, and you find it's your moral duty to create a steel man argument designed to maximize the defense of claim X, claim X still has the burden of proof. This means I as a rational person should not believe X until such time as anyone--whether it's you, the deficient person, or somebody else--substantiates the claim. And I don't have to disprove X; X is not considered true by default.
Thus it seems very reasonable to conclude that we are in a simulation and we are thus ruled by a deity.
I see people make this probabilistic fallacy very often. You can say X is very likely, so it's reasonable to conclude it's true, but winning Russian roulette is likely; do you think it's reasonable to conclude you will win? This doesn't change with higher values of X.
If you change the statement to "it's reasonable to conclude that we are likely in a simulation", then I would agree.
I don't believe rand() < 0.99 is true, because it could be false.
But there is still zero evidence that such a teapot doesn't exist. Even if I were to grant you that your rationale is solid, that's not evidence.
I feel people have a hard time understanding that unlikelihood is not evidence. If someone tells me it's unlikely for me to lose in Russian roulette, that's not evidence that I'm going to win. Unlikely events happen all the time, and people don't seem to learn that.
What are the chances that the entire housing market is overpriced and it's about to collapse? Someone might have said "almost impossible" right before the financial crash of 2008, and in fact many did.
What are the chances that Bernie Madoff is running a Ponzi scheme given that his company has already passed an SEC exam? Again, "almost impossible" is what people said.
Black swans were considered impossible a long time ago, and yet they existed, which is precisely why the term is used nowadays to describe things we have no evidence for, but which could yet happen.
You can be considered right in thinking that black swans don't exist, that Bernie Madoff is legit, and that the housing market is not about to collapse (innocent), right until the moment the unlikely event happens and you are proven wrong. It turns out a cheeky Russian astronaut threw out a teapot in the 1970s and it has been floating since.
Why insist on believing the unlikely is not going to happen, only to be proven wrong again and again, when we can just be skeptical?
Should not guilty be treated in social situations as innocent?
I don't think so. If Jake is accused of sexually assaulting Rachel and you consider Jake as innocent you would have a tendency to dismiss evidence that Rachel is telling the truth (since people have a tendency to not like to be wrong). Also, people would justly ask you for evidence that Jake is innocent, since you do actually have a burden of proof in this case. And then if incontrovertible evidence comes out that Rachel was telling the truth, you would have been proven wrong.
If instead of considering him innocent you say "the jury is still out", then you are open to evidence of guilt, you don't have a burden of proof, and if Jake turns out to be guilty you would not have been proven wrong.
It's OK to say "I don't know".
I do not think it is novel, I specifically said a lot of people don't have any problem seeing this distinction, and I would expect most rationalists to see this distinction.
But it's a fact that a lot of people do not see this distinction, not dull people, but a lot of intelligent people. I've debated many of them, and it's a chore to explain again and again how the burden of proof actually works and why not-guilty ≠ innocent. Next time I'm in a debate with such people I can simply link to this article, and presumably so can other people in similar debates.
Moreover, you seem to be overlooking the fact that what is obvious to you (and many rationalists) may not be so obvious to everyone. This bias is called the curse of knowledge. People have a tendency to fake humility, because people don't like arrogance, they like humility; but the fact is, assuming everyone is as intelligent and/or knowledgeable as you is not always productive.
In fact, the whole motivation behind the article is that someone I know refused to accept there's a genuine difference. He is not unintelligent.
In economics terms what you do is take your Bayesian beliefs and multiply each probability by the utility gained or lost by each state.
I know how expected value works. But this confirms what I said: a single percentage cannot tell me what I should believe.
Also, this still doesn't answer my scenario. Is the next toss of a coin going to land heads given that in previous instances there have been 50 heads / 50 tails? How about 0 heads / 0 tails?
I know there's a difference, but Bayesians assume they are the same.
For your coin example, your prior belief the coin is fair is most likely not 50%.
No, the probability that the next toss of a coin is going to land heads is 50%, regardless of whether the results have been 0/0, 50/50, or 500000/500000.
As for whether or not you divorce your wife, well that’s not Bayes Theorem that’s just how you choose to apply the beliefs you have.
My beliefs are binary. Either I believe in something or I don't. I believe everyone's beliefs are like that. But people who follow Bayesian thinking confuse certainty with belief.
A Bayesian would say that beliefs have continuous degrees, expressible on a scale from 0% to 100%.
I'm not overly familiar with the Bayesian way of thinking. I have seen it expressed very often in The Motte and similar circles, but I don't see why anyone would conclude that this is a valid way of reasoning, especially when it comes to beliefs. I do understand Bayes' theorem, and I understand the concept of updating a probability; what I don't understand is why anyone would jump to conclusions based on that probability.
Let's say through a process of Bayesian updating I arrive at an 83% probability of success, should I jump the gun? That to me is not nearly enough information.
Now let's say that if I "win" I get $100, and if I "lose" I pay $100. Well now I have a bit more information and I would say this bet is in my favor. But if we calculate the odds and adjust the numbers so that if I lose I pay $500, now it turns out that I don't gain anything by participating in this bet, the math doesn't add up: ((5 / 6) * 100) / ((1 / 6) * 500) = 1.
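The arithmetic above can be sketched as a quick expected-value check in Python (the function and constant names are mine):

```python
# Expected value of the bet at p = 5/6 (~83%) chance of winning.
P_WIN = 5 / 6

def expected_value(gain, loss):
    # Net expectation: win `gain` with probability P_WIN, pay `loss` otherwise.
    return P_WIN * gain - (1 - P_WIN) * loss

print(expected_value(100, 100))  # ~66.67: the bet is clearly in my favor
print(expected_value(100, 500))  # ~0: the same 83% now gains me nothing
```

Same 83%, opposite decisions: the probability alone doesn't settle anything until the stakes are attached.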
Even worse: let's say that if I win I get $100, but if I lose I get a bullet in my brain. I'm literally playing Russian roulette.
83% tells me absolutely nothing.
Real actions in real life are not percentages, they are: do you do it or not? and: how much are you willing to risk?
You can't say I'm 60% certain my wife is faithful, so I'm going to 40% divorce her. Either you believe something, or you don't. Period.
Even worse is the concept of the default position in Bayesian thinking, which as far as I understand it's 50%.
Edit: I mean the probability that the next coin toss is going to land heads is 50%.
So starting off, if I don't know whether a coin is fair or not, I would assume it is. If I throw the coin 100 times and 50 of those it lands heads, the final percentage is 50%. If I throw the coin 1,000,000 times and 500,000 of those times it lands heads, it's still 50%, so I have gained zero information. This does not map to the reality I live in at all.
My pants require at least two numbers to be measured properly; surely I can manage two numbers for a belief. So let's say before I have any evidence I believe a coin is fair 50% ± 50 (no idea); after throwing it a million times I would guess it's about 50% ± 0.01 (I'm pretty sure it's fair).
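For what it's worth, the standard way to carry those two numbers is a posterior distribution over the coin's bias. A minimal Python sketch, assuming a uniform Beta(1, 1) prior (the prior choice and function name are mine), shows the point estimate staying at 50% while the spread collapses:

```python
import math

def beta_posterior(heads, tails, prior_a=1.0, prior_b=1.0):
    # Posterior over the coin's bias, assuming a uniform Beta(1, 1) prior.
    a, b = prior_a + heads, prior_b + tails
    mean = a / (a + b)
    std = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, std

print(beta_posterior(0, 0))              # (0.5, ~0.29): same point, huge spread
print(beta_posterior(500_000, 500_000))  # (0.5, ~0.0005): same point, tiny spread
```

The 0/0 and 500,000/500,000 cases share the same point estimate but are worlds apart in spread, which is exactly the distinction the single percentage hides.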
So no, I'm not sold on this Bayesian idea of a continuous belief, I can't divorce my wife 40%, or blow my brains 17%. In the real world I have to decide if I roll the dice or not.
That's a pretty cheap trick.
Most people when faced with something they have not imagined complain about that.
But when you have multiple ways to do something, then it's different - it's hard for the way that you can do only once to be the most common.
No it's not. Do I really have to explain it with statistics?
Say everyone will experience event X once in their lifetime, which is 80 years on average; that means in a population of 1000, in any given year around 12.5 people on average will experience it for that reason. Now let's say there's another way they can experience X that also happens to everyone in their lifetime, so again it's 12.5. In this case the percentage of people who experience X for the first time in any given year due to the first cause is 50%, so it's not the most common cause.
But what if the other way doesn't happen to 100% of the people, because they learn their lesson and it only happens to 50% of them? In that case it's only 6.25 people, and the percentage of people who experience X for the first time in any given year due to the first cause is 67%, therefore it is the most common cause.
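The figures above can be reproduced with a few lines of Python (the variable names are mine):

```python
population = 1000
lifespan = 80  # average lifetime in years

# First cause: everyone experiences X once, so per year:
cause_a = population / lifespan      # 12.5 people
# Second cause, if it also happens to everyone:
cause_b_all = population / lifespan  # 12.5 people
# Second cause, if only half the people ever experience it:
cause_b_half = cause_b_all / 2       # 6.25 people

share_both_universal = cause_a / (cause_a + cause_b_all)
share_b_halved = cause_a / (cause_a + cause_b_half)
print(share_both_universal)      # 0.5: not the most common cause
print(round(share_b_halved, 2))  # 0.67: now it is the most common cause
```

A once-per-lifetime cause can still be the most common one as soon as the competing causes don't hit everybody.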
Your failure of imagination is not an argument.
Fraud in general is not new. This one in particular is.
No. All fraud relies on people trusting without good reason, or more specifically: not distrusting enough. This is no exception.
It remains to be proven that no intelligent ones with solid epistemology in fact did, and only dumb ones did.
Indeed, but it doesn't have to be proven because the hallmark of having a solid epistemology is not believing things without evidence, and in order to fall for the fraud you have to believe things without evidence. So if anyone with a solid epistemology fell for the fraud, they would have to be almost by definition a very rare exception.
Yes. Although I don't like to use the term "agnostic" because it relates to knowledge, and here we are dealing with belief. I prefer "skeptical".
The default position is uncertain, so maybe there's a teapot, maybe not. That means we are questioning its existence, therefore we are skeptics. But this also means we don't believe in its existence (not-guilty), which is different than believing it doesn't exist (innocent).
You are wrong. The word trust means to "rely on", I don't rely on her not calling the police, that's something you assume, but you don't know my mental state, only I know that, and I'm telling you emphatically that I do not rely on that. You think you can read my mind, but you can't.