2+2 = not what you think

felipec.substack.com

Changing someone's mind is very difficult; that's why I like puzzles most people get wrong: I use them to try to open people's minds. Challenging the claim that 2+2 is unequivocally 4 is one of my favorites for getting people to reconsider what they think is true with 100% certainty.


I'm using "the laws of arithmetic" as a general term to refer to the rules of all systems of arithmetic in common usage, where a "system of arithmetic" refers to the symbolic statements derived from any given set of consistent axioms and well-defined notations.

There are no axioms that apply to all arithmetics. There are no such "laws".

Go ahead and try to come up with one "law". I'm fairly certain I can point out an arithmetic where it doesn't apply.
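For concreteness, here is a minimal sketch (Python; the candidate "laws", the systems where they fail, and the sat_add helper are my own illustrative picks, not something from this thread):

    # Candidate law 1: addition is associative, (a + b) + c == a + (b + c).
    # It fails in IEEE-754 floating-point arithmetic:
    a, b, c = 0.1, 0.2, 0.3
    print((a + b) + c)                 # 0.6000000000000001
    print(a + (b + c))                 # 0.6
    print((a + b) + c == a + (b + c))  # False

    # Candidate law 2: x + 1 > x for every x.
    # It fails in 8-bit saturating arithmetic (common in DSP and graphics);
    # sat_add is a hypothetical helper simulating that system:
    def sat_add(x, y, top=255):
        return min(x + y, top)

    print(sat_add(255, 1))             # 255, i.e. here x + 1 == x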

There's a reason these fall under the umbrella of abstract algebra.

Also, you seem to be conflating "integer arithmetic" with normal arithmetic. 2.5 + 2.1 is not integer arithmetic, and yet it follows the traditional arithmetic everyone knows. I'm not even sure if normal arithmetic has a standard name; I just call it "normal arithmetic" to distinguish it from all the other arithmetics. Integer arithmetic is just a subset.

There are no axioms that apply to all arithmetics. There are no such "laws".

Are you getting hung up on my use of the term "laws of arithmetic"? I'm not trying to say that there's a single set of rules that applies to all systems of arithmetic. I'm using "laws of arithmetic" as a general term for the class containing each individual system of arithmetic's set of rules. You'd probably call it the "laws of each arithmetic". The "laws of one arithmetic" (by your definition) can share common features with the "laws of another arithmetic" (by your definition), so it makes sense to talk about "laws of all the different arithmetics" as a class. I've just personally shortened this to the "laws of arithmetic" because I don't recognize your usage of "arithmetic" as a countable noun.

Also, you seem to be conflating "integer arithmetic" with normal arithmetic. 2.5 + 2.1 is not integer arithmetic, and yet it follows the traditional arithmetic everyone knows. I'm not even sure if normal arithmetic has a standard name; I just call it "normal arithmetic" to distinguish it from all the other arithmetics. Integer arithmetic is just a subset.

I was focusing on integer arithmetic since that was sufficient to cover your original statement. The natural generalization is group or field arithmetic to define the operations, and real-number arithmetic (a specialization of field arithmetic) to define the field elements. The notation associated with integer arithmetic is the same as the notation associated with real-number arithmetic, since the integers under addition form a subgroup of the real numbers under addition, and integer multiplication is likewise real multiplication restricted to the integers.
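As a small sketch of that embedding (Python; using fractions.Fraction as a stand-in for exact real-number arithmetic is my choice, not something from the thread):

    from fractions import Fraction

    # Integer arithmetic as a restriction of exact field arithmetic:
    # the same "+" notation yields the same element either way.
    print(2 + 2)                           # 4
    print(Fraction(2) + Fraction(2))       # 4
    print(Fraction(2) + Fraction(2) == 4)  # True

    # The non-integer example from earlier (2.5 + 2.1), done exactly:
    print(Fraction('2.5') + Fraction('2.1') == Fraction('4.6'))  # True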


To repeat my actual argument, I assert that, without prior clarification, almost no one uses the notation associated with real-number arithmetic in a way contrary to real-number arithmetic, which implies that almost no one uses it in a way contrary to integer arithmetic. Therefore, I refuse to entertain the notion that someone is actually referring to some system of arithmetic incompatible with real-number arithmetic when they use the notation associated with real-number arithmetic, unless they first clarify this.

You made this claim:

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question.

The "laws of arithmetic" that are relevant depend 100% on what arithmetic we are talking about, therefore it's imperative to know which arithmetic we are talking about. People assume it's the normal arithmetic and cannot possibly be any other one. There is zero doubt in their minds, and that's the problem I'm pointing out.

The "laws of arithmetic" that are relevant depend 100% on what arithmetic we are talking about, therefore it's imperative to know which arithmetic we are talking about.

Then please stop assuming that my uncountable usage of "the concept of arithmetic in general" in that sentence is secretly referring to your countable idea of "a single arithmetic". I've clarified my meaning twice now; I'd appreciate it if you actually responded to my argument instead of repeatedly hammering on that initial miscommunication.

People assume it's the normal arithmetic and that it cannot possibly be any other one. There is zero doubt in their minds, and that's the problem I'm pointing out.

Why should there be any doubt in their minds, if other systems of arithmetic are never denoted with that notation without prior clarification?

Then please stop assuming that my uncountable usage of "the concept of arithmetic in general" in that sentence is secretly referring to your countable idea of "a single arithmetic".

Where did I "assume" that in my last comment?

I've clarified my meaning twice now; I'd appreciate it if you actually responded to my argument instead of repeatedly hammering on that initial miscommunication.

I don't know what argument you are talking about. If you are referring to this:

  • almost no one uses the notation associated with real-number arithmetic in a way contrary to real-number arithmetic

  • ∴ I refuse to entertain the notion that someone is actually referring to some system of arithmetic incompatible with real-number arithmetic when they use the notation associated with real-number arithmetic, unless they first clarify this

That's not an argument; you are just stating your personal position. You are free to do whatever you want: if you don't want to doubt a particular "unequivocal" claim, then don't. Your personal position doesn't contradict my claim in any way.

Why should there be any doubt in their minds

Because that's what skepticism demands. I assert that 100% certainty on anything is problematic, which is the reason why skepticism exists in the first place.

Where did I "assume" that in my last comment?

You said, "The 'laws of arithmetic' that are relevant depend 100% on what arithmetic we are talking about," which is only meaningful under your usage of "laws of arithmetic" and does not apply to the term as I meant it in my original comment.

That's not an argument; you are just stating your personal position. You are free to do whatever you want: if you don't want to doubt a particular "unequivocal" claim, then don't. Your personal position doesn't contradict my claim in any way.

To quote myself:

there is no choice but to make assumptions of terms ordinarily having their plain meanings, to avoid an infinite regress of definitions used to clarify definitions.

To rephrase that, communication relies on at least some terms being commonly understood, since otherwise you'd reach an infinite regress. As a consequence, there must exist terms that have an unambiguous "default meaning" in the absence of clarification. But how do we decide which terms are unambiguous? Empirically, I can decide that a widespread term has an unambiguous default meaning if I have never heard anyone use the term contrary to that meaning in a general context, and if I have no particular evidence that other people are actively using an alternative meaning in a general context. I believe it reasonable to set the bar here, since any weaker criterion would result in the infinite-regress issue.

Because that's what skepticism demands. I assert that 100% certainty on anything is problematic, which is the reason why skepticism exists in the first place.

Sure, if someone writes "2 + 2 = 4", it isn't 100% certain that they're actually making a statement about the integers: perhaps they're completely innumerate and just copied the symbols out of a book because they look cool. I mean to say that it's so unlikely that they're referring to something other than integer arithmetic that it wouldn't be worth my time to entertain the thought, without any special evidence that they are (such as it being advertised as a "puzzle").

If you were to provide real evidence that people are using this notation to refer to something other than integer arithmetic in a general context, then I would be far more receptive to your point here.


Indeed, how do you know that your interlocutors are "100% certain" that they know what you mean by "2 + 2"? Perhaps they're "100% certain" that "2 + 2 = 4" by the rules of integer arithmetic, but they're independently 75% certain that you're messing with them, or setting up a joke.

You said, "The 'laws of arithmetic' that are relevant depend 100% on what arithmetic we are talking about," which is only meaningful under your usage of "laws of arithmetic" and does not apply to the term as I meant it in my original comment.

No, it doesn't.

The "laws of arithmetic" after your explanation mean the "laws of all the different arithmetics" which you asked me to not consider as uncountable, which I didn't. You yourself said that the "laws of all the different arithmetics" is not a single set of rules that apply to all arithmetics, therefore a subset of the "laws of all the different arithmetics" may apply to a specific arithmetic, but not necessarily to another different arithmetic.

Therefore my phrase "The 'laws of arithmetic' ('laws of all the different arithmetics') that are relevant depend 100% on what arithmetic we are talking about" is 100% consistent with your usage of the term.


To rephrase that, communication relies on at least some terms being commonly understood, since otherwise you'd reach an infinite regress.

This is what you said:

This is because I have no evidence that any reasonable person would use the notation associated with integer arithmetic in such a way, and without such evidence, there is no choice but to make assumptions of terms ordinarily having their plain meanings, to avoid an infinite regress of definitions used to clarify definitions.

Having no evidence is no excuse. Having no evidence of black swans doesn't imply that black swans cannot exist, nor is it a valid reason to assume that all swans are white.

You do have a choice: don't make assumptions.

Symbols do not have a single meaning. If I say "run a marathon" you may think about participating in a marathon, but it could mean managing one. Nobody sees the word "run" and assumes a single meaning; the meaning always depends on the context. Intelligent beings must consider different meanings, and this is precisely the reason computers are not considered very intelligent: they can't consider multiple meanings the way a human does. If language were as simple as you paint it, computers would have had no problem solving it decades ago.

It's not that linear and simple; you do have the choice to consider multiple meanings of the word "run".


Indeed, how do you know that your interlocutors are "100% certain" that they know what you mean by "2 + 2"?

Because they use it as a clear example of something unequivocally true.

You do have a choice: don't make assumptions.

I suspect that this choice is impossible to consistently make. So that I can better understand what you're asking for, could you give me an example of a conversation in which one participant doesn't make any assumptions about the meaning of another?

This one. I'm the participant not making any assumptions about what you mean.

I suppose (not assume) that your question was rhetorical, and that you actually believe I cannot truthfully answer it, because you believe that in every conversation all participants have to make assumptions all the time. But this is tentative: I do not actually know that, and therefore I do not assume that's the case.

And this is a fallacy I have pointed out already. The fact that somebody appears to be making an assumption doesn't necessarily mean that he is. All that glitters is not gold. You are likely going to comb through my statement and try to find a point where I made an assumption, but all you are going to find is the appearance of an assumption; without reading my mind you can't actually tell.

Once again: I do not know what you mean; I'm guessing, and that's all rational agents can do when communicating.
