felipec

unbelief

1 follower   follows 0 users  
joined 2022 November 04 19:55:17 UTC
Verified Email
User ID: 1796

No bio...

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question.

There is no "arithmetic", there's multiple arithmetics. You are assuming that the "laws" of one particular arithmetic apply to all arithmetics, which is not true.

Yeah? What is your most fundamental belief that you have questioned this year?

That's kind of an accurate summary. But doesn't that apply everywhere in modern discourse? People assume that Kanye West said X, but "X" doesn't necessarily mean X.

Words like "censorship", "racism", "war", "vaccine" are used in different ways all the time, and when people with agendas use them, they feel 100% certain there is only one meaning.

So "censorship" doesn't always mean censorship.

I'm using "the laws of arithmetic" as a general term to refer to the rules of all systems of arithmetic in common usage, where a "system of arithmetic" refers to the symbolic statements derived from any given set of consistent axioms and well-defined notations.

There are no axioms that apply to all arithmetics. There are no such "laws".

Go ahead and try to come up with one "law". I'm fairly certain I can point out an arithmetic where it doesn't apply.

There's a reason these fall under the umbrella of abstract algebra.
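As an illustrative sketch (my own example, not from the thread): even associativity of addition, a natural candidate "law", fails in a system of arithmetic in real use, namely saturating 8-bit arithmetic, where results clamp to the representable range:

```python
def sat_add8(a: int, b: int) -> int:
    """Signed 8-bit saturating addition: results clamp to [-128, 127]."""
    return max(-128, min(127, a + b))

# Associativity fails: the grouping changes the result.
left = sat_add8(sat_add8(100, 100), -100)   # 100 + 100 clamps to 127, then 127 - 100 = 27
right = sat_add8(100, sat_add8(100, -100))  # 100 + (100 - 100) = 100
assert left != right
```

Saturating arithmetic is common in digital signal processing, so this isn't an exotic counterexample.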

Also, you seem to be conflating "integer arithmetic" with normal arithmetic. 2.5 + 2.1 is not integer arithmetic, yet it follows the traditional arithmetic everyone knows. I'm not even sure normal arithmetic has a standard name; I just call it "normal arithmetic" to distinguish it from all the other arithmetics. Integer arithmetic is just a subset.

This is your claim:

This is usually covered in basic math courses or textbooks.

What is "this" in this context?

Also, you claimed that "this" is taught in basic math textbooks, but you didn't provide an example of such a textbook; you provided one of logic.

ugh, this guy really wants to show me how smart he thinks he is

Yes, but I don't care about their reaction or their opinion of me.

I've never seen anybody seriously question any of their core beliefs in real time. But these notions plant a seed which eventually they can't avoid. Sleep is important in mulling these over.

In fact, I remember mentioning to somebody the claim "all models are wrong, but some are useful", which was immediately dismissed (since in an argument nobody wants to be wrong), but some time later the same person made the exact same claim to me. He forgot I was the one who mentioned it, but more importantly: he forgot that initially he immediately dismissed it.

I bet many people will forget this article and dismiss it as pedantry, but the next time someone says "this is as true as 2+2=4" they might think twice. These two things are not mutually exclusive.

But 2+2 can be 0.

The statement "in normal arithmetic 2+2=4" is true, but "2+2 is always 4" is false.
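A minimal sketch of both statements, using arithmetic modulo 4 as the assumed alternative system (my choice of example; the thread doesn't name one):

```python
# In normal arithmetic, 2 + 2 is 4.
assert 2 + 2 == 4

# In arithmetic modulo 4 (the integers mod 4), the same symbols
# denote a different addition, and there 2 + 2 is 0.
assert (2 + 2) % 4 == 0
```

Both assertions hold at once, which is the point: the truth of "2+2=4" depends on which arithmetic the notation is taken to refer to.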

You can dismiss semantics all you want, but the meaning of the statements we make matter, and the certainty we have about the meaning of the statements other people make do matter.

Just this week I debated a person who was 100% certain he knew what anti-Semitism was (he didn't), what a dictionary was (he didn't), and what all the words in the definitions I presented to him meant (he didn't).

In my view 100% certainty is a problem.

I believe questioning the meaning of 2+2 might help some people question other unquestionably true beliefs.

Are you 100% certain it's impossible for this to happen?

It’s not a fallacy because I assume the content of his argument is not the content of your argument.

But you are assuming his argument is valid merely on the basis of his credentials.

And you are assuming my argument is invalid merely on the basis of my credentials.

That's a fallacy.

A company can be composed of 99% geniuses and 1% idiots at the top, and still fail.

All it takes is 1 idiot.

I was in Nokia's Skunk Works at the height of Nokia's success, with the most elite team of open source programmers and hardware engineers I've ever seen. Our software was way better than Android and had features many phones didn't get for more than a decade, and some they still don't have. The future was bright.

It didn't matter: one person at the top ruined everything.

And the failure to communicate can be entirely on the listening side by assuming a meaning that was never there.

The fact that today people don't understand each other is a huge problem, and worse: people don't want to understand what the other side is actually saying.

That's the field where you would doubt 1+1=2, not because you actually doubt it, but because you expect insight from dispelling that doubt.

It doesn't matter if Bertrand Russell personally doubted it or not, he acted as if it was rational to not believe with 100% certainty something which had not been proven yet, and it was.

The reason he attempted to dispel that doubt is that, absent that proof, it was reasonable to doubt.

It's the same level of abstraction as wondering whether you're actually a brain in a vat.

Which is a valid doubt in philosophy.

In politics or engineering, you can't do that.

You have to doubt in engineering, for the same reason you have to doubt in every field. Bridges have fallen because engineers did not doubt enough.

No. In programming it's literally impossible to include information that wasn't meant to be included. If you have an int to store the weekday, that's all the information stored in that int.

Not having all the information is a huge problem in programming, and historically it has been a big headache to deal with dates and time.

But if a program doesn't need any information other than the weekday, it may use that and nothing more.
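A hypothetical sketch of the point about the weekday int (the variable and names are my own illustration, not from the thread): the value encodes exactly one piece of information, with no date, year, or timezone hiding inside it.

```python
# 0 = Sunday ... 6 = Saturday. The int carries only the weekday;
# no other information was, or could have been, included.
WEEKDAY_NAMES = ["Sunday", "Monday", "Tuesday", "Wednesday",
                 "Thursday", "Friday", "Saturday"]

def weekday_name(weekday: int) -> str:
    """Map a weekday int to its name; this is all the int can tell us."""
    return WEEKDAY_NAMES[weekday]

print(weekday_name(3))  # Wednesday
```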

You didn't answer my question.

We managed to finish one product against all odds, even with many people jumping ship right before the launch: Nokia N9.

You made this claim:

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question.

The "laws of arithmetic" that are relevant depend 100% on what arithmetic we are talking about, therefore it's imperative to know which arithmetic we are talking about. People assume it's the normal arithmetic and cannot possibly be any other one. There is zero doubt in their minds, and that's the problem I'm pointing out.

did you know that Earth has a four corner simultaneous 4-day time cube?

Well, I never assumed it didn't. Mainly because I don't know what that means.

I suppose your larger point is true, but not particularly meaningful.

Are you 100% certain of that?

So a statement that seems easy and clear to interpret can actually be misleading when your interlocutor is deliberately trying to confuse and deceive you by omitting key information?

This is loaded language, a rhetorical trick. You are intentionally adding the word "misleading" to prompt an emotional response.

Consider this exchange:

  1. If you don't denounce Russia's illegal war of aggression, that makes you a Putin apologist, that's as unequivocally true as 2+2=4

  2. Actually, 2+2=4 is not unequivocally true

My claim (2) is not "misleading", and I'm not "deliberately trying to confuse and deceive" anyone; it's the other person who made a false claim (1). My sole objective in bringing up this abstract algebra notion is to increase doubt about the original claim regarding Russia. The factoid 2+2=4 is not used by me as an end; it's used by somebody else as a means to an end. 2+2=4 is often used as a tool to demonstrate 100% certainty, and it can be dismantled.

Your loaded language claim doesn't apply in this example. We can get rid of the loaded language and make a much fairer, more generous and neutral claim:

"A statement that seems easy, clear to interpret, and is obviously 100% certain to be true can actually be not necessarily true when an unrealized assumption is present."

How is this more generous claim not correct?


“This” is that we assume the common interpretation if one exists. The second quoted paragraph explains it.

Which is?

Logic is a part of math.

No. Logic and mathematics have a complicated relationship.

The book is from an undergrad discrete math course I took once, so I pulled the book from the shelf to quote for you.

So it wasn't a "basic math" course, and you don't have an example of a "basic math" textbook covering "this".

No. Apples are not oranges. Abstract algebra is a much less known concept than numeral systems. Virtually nobody thinks of that when considering 2+2.

I can guarantee you that Russell used 1+1=2 when calculating his daily expenses even before he formally proved it.

I literally said "it doesn't matter if Bertrand Russell personally doubted it or not".

If I'm not 100% certain a particular chair is not broken, but I sit on it anyway, and you conclude that therefore I believe with 100% certainty that it wasn't broken, you are committing a converse error fallacy.

You cannot read minds, you cannot know why I did sit on that chair, and assuming that you do know is an error in logic.

Even worse is to assume you do know why I checked the chair before sitting on it, and assuming it had nothing to do with my potential doubt.

Doubt is essential in all fields. 100% certainty is extremely dangerous. And I don't see you addressing this at all.

Information is always limited. Humans and all rational agents always operate with limited information. There is no omission.

You are assuming I'm the one who brought up the 2+2=4 factoid.

Of course, with skillful redefinition of what '2' and '+' and '2' and '=' and '4' mean, you can make it mean anything you like.

I did not invent abstract algebra; it's an important field of mathematics.

Then please stop assuming that my uncountable usage of "the concept of arithmetic in general" in that sentence is secretly referring to your countable idea of "a single arithmetic".

Where did I "assume" that in my last comment?

I've clarified my meaning twice now, I'd appreciate it if you actually responded to my argument instead of repeatedly hammering on that initial miscommunication.

I don't know what argument you are talking about. If you are referring to this:

  • almost no one uses the notation associated with real-number arithmetic in a way contrary to real-number arithmetic

  • ∴ I refuse to entertain the notion that someone is actually referring to some system of arithmetic incompatible with real-number arithmetic when they use the notation associated with real-number arithmetic, unless they first clarify this

That's not an argument, you are just stating your personal position. You are free to do whatever you want, if you don't want to doubt a particular "unequivocal" claim, then don't. Your personal position doesn't contradict my claim in any way.

Why should there be any doubt in their minds?

Because that's what skepticism demands. I assert that 100% certainty on anything is problematic, which is the reason why skepticism exists in the first place.