
felipec

unbelief

1 follower   follows 0 users  
joined 2022 November 04 19:55:17 UTC
Verified Email

User ID: 1796


Me: They do learn. They use many numbers to compute.

They don't. The probability that the next coin flip is going to land heads is the same whether the record is 0/0, 50/50, or 5000/5000: it is 0.5. It does not get updated.

Me: They do, the probability is the uncertainty of the event.

No. It's not. p=0.5 is not the uncertainty.

You: 50%±20% is analogous to saying "blue", whereas saying 50% is analogous to saying "sky blue".

I didn't say that.

Me: Not if probability means uncertainty.

Which is not the case.

Me: It depends on the question.

There is no other question. I am asking a specific question, and the answer is p=0.5, there's no "it depends".

p=0.5 is the probability the next coin flip is going to land heads. Period.

I'm going to attempt to calculate the values for various numbers of heads and tails at 95% confidence, so there's no confusion about "the question":

  • 0/0: 0.5±∞

  • 5/5: 0.5±0.310

  • 50/50: 0.5±0.098

  • 5000/5000: 0.5±0.010

It should be clear now that there's no confusion about "the question". The answer for Bayesians is p=0.5, and they don't encode uncertainty at all.
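The interval calculation above can be sketched in code. The original method of computation isn't stated, so this is a sketch assuming the standard normal-approximation (Wald) 95% interval; with a different interval choice the exact margins would differ, but the point stands: the estimate never moves while the margin shrinks.

```python
import math

def wald_interval(heads: int, tails: int, z: float = 1.96):
    """95% normal-approximation (Wald) margin for the heads probability.

    Returns (estimate, margin); the margin is infinite when there is no data.
    """
    n = heads + tails
    if n == 0:
        return 0.5, math.inf  # no flips yet: maximally uncertain
    p = heads / n
    return p, z * math.sqrt(p * (1 - p) / n)

for h, t in [(0, 0), (5, 5), (50, 50), (5000, 5000)]:
    p, m = wald_interval(h, t)
    print(f"{h}/{t}: {p}±{m:.3f}")
```

In every case the point estimate stays at 0.5; only the margin of error changes as data accumulates.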

In your view, is "believing" something equivalent to supposing it with 100% certainty (or near-100% certainty)?

No.

I have a strong suspicion that your epistemic terminology is very different from most other people's, and they aren't going to learn anything from your claims if you use your terminology without explaining it upfront.

How so? I believe it's the Bayesian notion that you can believe something 60% that is not shared by most people. Most people either believe something or they don't.

For instance, people may have been far more receptive to your "2 + 2" post if you'd explained what you mean by an "assumption", since most people here were under the impression that by an "assumption" you meant a "strong supposition".

There's a difference between most people and most people "here". My understanding of "assume" is in accordance with many dictionaries, for example: to take as granted or true.

So it's hard to tell what you mean by "people who follow Bayesian thinking confuse certainty with belief" if we misunderstand what you mean by "certainty" or "belief".

  • certainty: a quality of being proven to be true

  • belief: something considered to be true

Something can be 60% proven to be true; it can't be 60% considered true.

It's a badly posed question.

No, it's not. You are refusing to answer because the answer destroys your belief.

Are you denying that mathematical expressions exist?

If you prove Alice isn't racist, you haven't proven anything relevant. You're just nitpicking.

In your opinion, which isn't infallible.

I'm listening to the supporting case and engaging with your arguments.

This is not enough. Open debate requires an open mind: you must accept the possibility that you might be wrong.

If you don't even accept the possibility that you might be wrong about anything, then there's no point in debating, not about Alice, not about Bob, not about anything. All you are doing is wasting the time of your interlocutor.

This in my view is arguing in bad faith. If there's absolutely no way you can be convinced otherwise, then what am I even doing?

You're mainly arguing he's as racist as Alice and I happen to know she isn't.

Therefore it's impossible for you to be convinced of anything (about Alice and even less of Bob), and there's no point in me even trying.

I agree. There's a difference between education and schooling. You don't need to go to school to educate yourself, and most of what a school concerns itself with is not education.

In particular, in the area of information technology we don't bother remembering anything; we develop technologies like wikipedia.org and stackoverflow.com so that relevant information is easily available and retrievable by anyone. No education needed.

If any skills are necessary to learn, they are those of logic, reasoning, and skepticism; otherwise everything else one learns might not be learned properly. Unfortunately I don't see anyone interested in learning these skills: they all believe they already know what they need to know, and no evidence to the contrary would convince them otherwise.

Which is why I don't think you will manage to persuade many people. Either they already understand why modern education is not working, or they don't.

The definition is circular: it doesn't lead to your interpretation of "to assume" as "to believe true with absolutely zero possible doubt".

Many definitions in all dictionaries are circular. Language is not an easy thing, which is why AI still has not been able to master it.

That leads to the usage of the term by many here, where to make an assumption about something is to make a strong judgment about its nature, while still possibly holding some amount of doubt.

No, that's not what the definition is saying. "[[[judge true] or deem to be true] as true or real] or without proof". There is no possibility of doubt. It's judged/deemed/considered to be true.

But if common usage recognized your boundaries, then the dictionaries would be flat-out wrong to say that, e.g., to believe something is to assume it, suppose it, or hold it as an opinion (where an opinion is explicitly a belief less strong than positive knowledge).

I believe they are. dictionary.com says "believe" is "assume", but Merriam-Webster does not. One of them has to be wrong.

That's the whole reason dictionaries exist: people disagree.

That's why I suspect that your understanding of the terms is not aligned with common usage, since the dictionaries trample all over your boundaries.

One dictionary does, not all.

BTW, I used ChatGPT and asked it if it saw any difference between "assume" and "suppose", and what it said matched my understanding 100%.


Also, I think that "certainty" in a Bayesian context is best treated as a term of art, equivalent to "degree of belief": a measure of one's belief in the likelihood of an event.

There's a big difference between saying "I'm 75% certain X is true" and "I'm certain X is 75%". If I believe it's likely that Ukraine launched a missile and not Russia, I'm saying I'm 75% certain that's true; I don't think there's an event which is 75% likely. I believe most people think this way, and it's more rational.

It's a badly posed question because it's not fully specified, namely, you're not stating where (2+2=4) lives.

Really? Wasn't your entire argument relying on the fact that if the arithmetic wasn't explicitly specified, then a certain arithmetic was always assumed?

Your question is ambiguously stated.

Which was my entire point.

Normally it wouldn't be

So you are accepting it: normally 2+2 is not 0, but I didn't ask if normally that was the case, I asked if it was always the case.

For the record, when I ask ChatGPT if it's always necessarily the case, it answers "no". It says that's not the case in other arithmetics. Weird that it interprets math like me, not like you.

Define whether (2+2=4) in your question is integer arithmetics or (mod 4) (or something else) and I'll answer your question.

It's not any modular arithmetic, it's standard arithmetic (the one you claimed should always be assumed).

Listening to your case and engaging with your argument will make me change my mind

No it won't.

No, it's still possible for me to be convinced of true things.

Obvious circular reasoning. You believe X is false, and you say it's possible for you to be convinced that X is true if X were true, but X is false, because you believe X is false. Could not be more obvious.

Do you accept the possibility that X may be true? Yes or no.

Sure, my point is just that your meaning can't be supported by that definition alone.

I did not claim my meaning was supported by that definition alone.

That particular dictionary says the exact opposite of what you're saying.

That's not what I'm saying, that's what your dictionary is saying. You are proving that the dictionaries disagree, which is what I'm saying.

That's the whole thesis of "The Categories Were Made for Man, Not Man for the Categories": nearly all our categories are fuzzy and ill-defined, but they're still useful enough that we talk about them anyway.

That is what I'm saying. In one context the word "theory" means something for most people, in another context it means something else.

You can't say the word "assume" means X and only X in all contexts and here's a dictionary that says so, because that's not how language works, not all dictionaries agree, and dictionaries are not perfect.

You can't say that my categorization system is an error, and you can't say only your categorization system should be considered by default, especially when it's not clear that everyone is following it.

I asked ChatGPT the question, and the interpretation it produced is certainly far less strong than your standard of "zero possible doubt" regarding an assumption

To me it said: «to "assume" something is to accept it as true without proof or evidence». That to me doesn't include doubt, because it's true a priori: it's just true.

He replied that the difference is that you assume something before it occurs, but you suppose it while it's occurring or after it occurs.

That aligns with my notion of a priori: you don't need evidence for an assumption, it's just true.

He replied that when you assume something, you're not entirely sure whether or not it's true, but when you suppose something, you have some kind of predetermined knowledge that it's true.

He is wrong: it's the other way around.

After several seconds of thought, she replied that she had no clue, then her friend chimed in to say she also had no clue.

That's what I claim many rational people should do in most circumstances.

ChatGPT, the dictionaries I've checked, and the ordinary people I've asked all give different definitions of "assume" and "suppose", none of which include your standard of zero possible doubt in order to assume something.

That isn't true, that's what you are assuming. It could be that you misinterpreted, and I contend that you did.

You contend that it means "strong supposition", yet the first person you asked said nothing like that, and the second person talked about "predetermined knowledge".

Therefore, I have strong evidence

I guess your definition of "strong evidence" and mine are very different.

What evidence do you have that common usage recognizes your hard boundary, so hard that to cross it is to be unambiguously incorrect?

I never claimed such a thing.


You are the one who claimed most people here equate "assume" with "strong supposition", and that that's what common people believe as well. But you have provided nothing to substantiate that. Even the testimonies you provided said nothing about "strong supposition"; they talked about "before it occurs" and "predetermined knowledge", which very strongly suggests: without proof or evidence.

Either way, I don't have to provide evidence, because I did not make that claim. You made the claim that my understanding of "assume" is at odds with what most people understand by that word, but the evidence you yourself provided shows otherwise. You are trying to shift the burden of proof. The person who said to assume is to consider something true before it occurs is completely aligned with my notion of considering something true without evidence.

I don't have to show that my notion is shared by everyone, because I did not claim that, all I need to show is that your notion of "strong supposition" is not shared by everyone, and you yourself proved that.

It's not an authority for anything.

That's a straw man fallacy. Nobody said it was an authority.

But just for the record, the answer is no then.

Finally, it only took you 5 comments to answer my very simple question.

It merely makes 2+2=0 another representation of the same statement.

Do you believe that (2+2=4) and (2+2=0 (mod 4)) is "the same statement"?

No

Therefore you are contradicting your previous claim: (2+2=4) is not another representation of (2+2=0 (mod 4)): they are different statements. (2+2=4 (mod 4)) might be the same statement as (2+2=0 (mod 4)), but not (2+2=4).

I claimed that virtually nobody understands that (2+2=4 (mod 4)) exists, which is not the same as (2+2=4), and you finally accept that they are two different things.
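The distinction between the two statements can be made concrete in code (a minimal sketch; Python's `%` operator is my choice of stand-in for reduction modulo 4):

```python
# The same expression, evaluated in two different arithmetics.
standard = 2 + 2      # ordinary integer arithmetic: 4
mod4 = (2 + 2) % 4    # the result reduced modulo 4: 0
print(standard, mod4)
```

The expressions share the symbols "2+2", but they denote different statements with different values, which is the point being made above.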

Some of them may be calling themselves "rationalists", some of them may even try and become less easy to get caught - but they are imperfect humans, so they'll get caught anyway.

But the point is not that they get caught; indeed, all humans have the potential to get caught at some point in their life. The point is why. Why do people get burnt touching a pan?

Now you're making an unsupported assumption about my character instead of an argument. Retract it and apologize.

You just accepted below that your mind cannot possibly be changed.

Do you accept the possibility that X may be true? Yes or no.

No.

That's the end of the road then.

But FTX is not crypto. FTX was a mixture of the old and the new, which is precisely why it failed.

The whole point of crypto is to be completely detached from old systems, so there's zero attack surface for government. If you use pure crypto (no exchange), then you are immune to these kinds of failures.

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question.

There is no "arithmetic", there's multiple arithmetics. You are assuming that the "laws" of one particular arithmetic apply to all arithmetics, which is not true.

Yes. Although I don't like to use the term "agnostic" because it relates to knowledge, and here we are dealing with belief. I prefer "skeptical".

The default position is uncertainty, so maybe there's a teapot, maybe not. That means we are questioning its existence, therefore we are skeptics. But it also means we don't believe in its existence (not guilty), which is different from believing it doesn't exist (innocent).

If it was three hours, then I wouldn't answer 1:00 like you're suggesting either. I'd say "01:00 the next day" because time isn't truly modular

But your clock would read 01:00.

We use this concept in programming all the time. If the week ends on Sunday, we don't say the day after that is Monday of the next week; it's just Monday (and this doesn't change if the week ends on Saturday). In fact, many people consider Friday 23:00+02:00 to still be Friday night.
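A minimal sketch of that programming convention (the day numbering and helper name are my own choices for illustration): advancing past the last day of the week wraps around with a modulo, and the result is simply a weekday, with no "next week" qualifier attached.

```python
DAYS = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"]

def day_after(day_index: int, n: int = 1) -> int:
    """Index of the day n days after day_index, wrapping around modulo 7."""
    return (day_index + n) % 7

# The day after Sunday (index 6) wraps back to Monday (index 0).
print(DAYS[day_after(DAYS.index("Sunday"))])
```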

I'm not sure about more esoteric ones, but in spherical and hyperbolic geometries pairs of lines with constant distance simply don't exist.

Yes, I meant "two straight lines, indefinitely extended in a two-dimensional plane, that are both perpendicular to a third line", like in this picture, which are kind of parallel. The point is that the standard concept of "parallel" more or less only exists in Euclidean geometry.

So you're now saying that 2+2=4 without further context is not the same statement as 2+2=4 (mod 4)?

No, I said (2+2=4 (mod 4)) might not be the same as (2+2=4). I very clearly never said what you claim I'm supposedly "now saying": I said "might not be", never said "is not".


This is a smoke screen though. I'm talking about what YOU said, and you are very conveniently trying to distract from that.

YOU claimed (2+2=4) is just another representation of (2+2=0 (mod 4))... that is 100% false, as you yourself now admitted. They are different statements.

And you also avoided commenting on the obvious conclusion from your misrepresentation, and instead chose to distract from what YOU said.

So would you say that ChatGPT disagrees with your notion of "assuming" in my example?

No.

If not, then how could Alice change her mind from the indirect evidence, if she had zero doubt that there was only a dog in the box?

First: I think you misinterpreted what ChatGPT said. Second: ChatGPT can seem to disagree in one interaction and agree in another; it depends on how the question was posed.

I bombarded ChatGPT with questions about the matter, and everything aligned with my notion. For example, to "If Alice believes claim X is true with zero doubt, can she change her mind?" it answered "Yes", which is obvious to me. Alice believes claim X with zero doubt in one moment, but then receives evidence contradicting that belief (which was assumed in the first place); why wouldn't she change her mind?

But to be crystal clear I asked this killer question:

If Alice believes claim X is true with zero doubt, but then finds out indirect evidence that claim X may not be true and changes her mind, can it be said that Alice no longer assumes claim X is true?

If Alice believes claim X is true with zero doubt, but then finds out indirect evidence that claim X may not be true and changes her mind, it can be said that Alice no longer assumes claim X is true. An assumption is something taken for granted without being verified, whereas a belief is something considered to be true based on evidence and reasoning. When new evidence comes up, it's important to evaluate it and adjust one's beliefs and assumptions accordingly. In this case, Alice's belief was based on a zero doubt, but when new evidence came up, she changed her mind, which indicates that she no longer takes claim X as a given, and it is no longer an assumption. It's important to note that changing one's mind or updating beliefs and assumptions is a natural and healthy process, and it's a sign of a rational and open-minded individual.

How does this not align precisely with my notion? I didn't even use the term "assume" throughout the question; I used it only to verify the outcome.

You're calling people (like the dictionary author, or the second person I questioned) "wrong" when they say that you can "assume" something while still doubting it to some extent.

No, I said: if a dictionary says that to believe something is to assume it, then I believe it's wrong. I did not say the dictionary is wrong, I said that I believe it is wrong.

This is completely different from linking "assume" to doubt.

Ukraine as a country isn't particularly important and the population is likely to be hostile to Russia, meaning that to integrate it into Russia proper will be difficult if not impossible.

This is an oversimplification: there's no such thing as "the Ukraine population"; different people have different beliefs. This is like saying the "USA population" believes X. Sure, some do, but not all.

You can say the majority of the population is likely to be hostile to Russia (I have my doubts about that), but some will not.

You are forgetting the most common reason: they have never encountered a hot pan in their life (e.g. kids). They get burnt because they didn't think they could get burnt. This also happens to adults who should know better after a while of not dealing with hot pans.

People who have never been scammed are the easiest to scam, precisely because they don't think it could possibly happen to them. Hubris and overconfidence are known to make intelligent people commit obvious mistakes they otherwise would not.

No, it remains to convince you that X is false.

If only there were a person willing to engage in open debate whom I had a chance to convince; sadly, there's none. There is no point in debate if one side is completely closed off.

All that glitters is not gold.

I'm using "the laws of arithmetic" as a general term to refer to the rules of all systems of arithmetic in common usage, where a "system of arithmetic" refers to the symbolic statements derived from any given set of consistent axioms and well-defined notations.

There are no axioms that apply to all arithmetics. There are no such "laws".

Go ahead and try to come up with one "law". I'm fairly certain I can point out an arithmetic where it doesn't apply.

There's a reason these fall under the umbrella of abstract algebra.
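To illustrate the kind of counterexample being invoked here (my own example, not one given in the thread): associativity of addition is a "law" of integer arithmetic that fails in IEEE-754 floating-point arithmetic, where rounding makes the grouping of operands matter.

```python
# In double precision these two groupings round differently,
# so a + (b + c) == (a + b) + c does not hold.
left = (0.1 + 0.2) + 0.3   # 0.6000000000000001
right = 0.1 + (0.2 + 0.3)  # 0.6
print(left == right)
```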

Also, you seem to be conflating "integer arithmetic" with normal arithmetic. 2.5 + 2.1 is not integer arithmetic, and yet it follows the traditional arithmetic everyone knows. I'm not even sure normal arithmetic has a standard name; I just call it "normal arithmetic" to distinguish it from all the other arithmetics. Integer arithmetic is just a subset.

I was not smug, you believed I was smug. Big difference.

No. In programming it's literally impossible to include information that wasn't meant to be included. If you have an int to store the weekday, that's all the information stored in that int.

Not having all the information is a huge problem in programming, and historically it has been a big headache to deal with dates and time.

But if a program doesn't need any information other than the weekday, it may use that and nothing more.