2+2 = not what you think

felipec.substack.com

Changing someone's mind is very difficult, which is why I like puzzles most people get wrong: they're a way to try to open minds. Challenging the claim that 2+2 is unequivocally 4 is one of my favorites for getting people to reconsider what they think is true with 100% certainty.


I'd concur that this is more of an annoying semantic trick than anything else. It is never denied that 2 + 2 = 4 within the group of integers under addition (or a group containing it as a subgroup), a statement that the vast majority of people would know perfectly well. Instead, you just change the commonly understood meaning of one or more of the symbols "2", "4", "+", or "=", without giving any indication of this. Most people consider the notation of integer arithmetic to be unambiguous in a general context, so for this to make any sense, you'd have to establish that the alternative meaning is so widespread as to require the notation to always be disambiguated.

(There's also the epistemic idea that we can't know that 2 + 2 = 4 within the integers with complete certainty, since we could all just be getting fooled every time we read a supposedly correct argument. But this isn't really helpful without any evidence, since the absence of a universal conspiracy about a statement so trivial should be taken as the null hypothesis. It also isn't relevant to the statement being untrue in your sense, since it's no less certain than any other knowledge about the external world.)

Most people consider the notation of integer arithmetic to be unambiguous in a general context

But that is the point: most people make assumptions. In this particular case, people who understand modular arithmetic can easily see which assumption is being made, but that excludes the vast majority of people, who don't.
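
For instance, here is a minimal sketch in Python (mod 3 is an arbitrary choice of modulus for illustration; any modulus makes the same point):

    # Under the usual integer reading of the symbols, 2 + 2 is 4.
    print(2 + 2)        # 4

    # Under a modular ("clock") reading, the same symbols wrap around:
    # modulo 3, 2 + 2 comes out to 1.
    print((2 + 2) % 3)  # 1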

The whole point of the article is to raise doubt about more complicated subjects that are not so easy to prove mathematically.

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question. The only lesson to be learned is that your interlocutor's terminology has to be aligned with yours in order to meaningfully discuss the subject. This has nothing to do with how complicated the subject is, only with how ambiguous its terminology is in common usage; terminology is an arbitrary social construct. And my point is that this isn't even a very good example, since roughly no one uses standard integer notation to mean something else without first clarifying the context. Far better examples can be found, e.g., in the paper where Shackel coined the "Motte and Bailey Doctrine", which focuses on a field well-known for ascribing esoteric or technical meanings to commonplace terms.

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question.

There is no "arithmetic"; there are multiple arithmetics. You are assuming that the "laws" of one particular arithmetic apply to all arithmetics, which is not true.

Here, I'm using "the laws of arithmetic" as a general term to refer to the rules of all systems of arithmetic in common usage, where a "system of arithmetic" refers to the symbolic statements derived from any given set of consistent axioms and well-defined notations. I am not assuming that the rules of integer arithmetic will apply to systems of arithmetic that are incompatible with integer arithmetic but use the exact same notation. I am assuming that no one reasonable will use the notation associated with integer arithmetic to denote something incompatible with integer arithmetic, without first clarifying that an alternative system of arithmetic is in use.
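
(To make the "well-defined notations" point concrete, here is a hypothetical Python sketch; the Mod3 class is invented for this example. It shows one notation, "+", bound to two different sets of rules, which is exactly the ambiguity under discussion.)

    # A toy "system of arithmetic": integers modulo 3, written with the
    # usual "+" and "==" notation via operator overloading. (Mod3 is a
    # hypothetical class made up for this sketch.)
    class Mod3:
        def __init__(self, value):
            self.value = value % 3

        def __add__(self, other):
            return Mod3(self.value + other.value)

        def __eq__(self, other):
            return self.value == other.value

        def __repr__(self):
            return f"{self.value} (mod 3)"

    print(2 + 2)                         # 4 -- the integer rules
    print(Mod3(2) + Mod3(2))             # 1 (mod 3) -- same "+", different rules
    print(Mod3(2) + Mod3(2) == Mod3(4))  # True -- "4" also wraps to 1 here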

Furthermore, I assert that it is unreasonable to suppose that the notation associated with integer arithmetic might refer to something other than the rules of integer arithmetic in the absence of such a clarification. This is because I have no evidence that any reasonable person would use the notation associated with integer arithmetic in such a way, and without such evidence, there is no choice but to assume that terms carry their plain meanings, on pain of an infinite regress of definitions used to clarify definitions.

I'm using "the laws of arithmetic" as a general term to refer to the rules of all systems of arithmetic in common usage, where a "system of arithmetic" refers to the symbolic statements derived from any given set of consistent axioms and well-defined notations.

There are no axioms that apply to all arithmetics. There are no such "laws".

Go ahead and try to come up with one "law". I'm fairly certain I can point out an arithmetic where it doesn't apply.
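
To be concrete, here is one such counterexample as a Python sketch (associativity of addition is just one convenient choice of "law", and IEEE-754 floating-point just one convenient choice of arithmetic where it fails):

    # Candidate "law": associativity of addition. It holds in integer
    # arithmetic...
    assert (1 + 2) + 3 == 1 + (2 + 3)

    # ...but fails in IEEE-754 floating-point arithmetic, where rounding
    # makes the grouping of operations observable.
    a, b, c = 0.1, 0.2, 0.3
    print((a + b) + c)  # 0.6000000000000001
    print(a + (b + c))  # 0.6
    assert (a + b) + c != a + (b + c)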

There's a reason these fall under the umbrella of abstract algebra.

Also, you seem to be conflating "integer arithmetic" with normal arithmetic. 2.5 + 2.1 is not integer arithmetic, and yet it follows the traditional arithmetic everyone knows. I'm not even sure normal arithmetic has a standard name; I just call it "normal arithmetic" to distinguish it from all the other arithmetics. Integer arithmetic is just a subset.

There are no axioms that apply to all arithmetics. There are no such "laws".

Are you getting hung up on my use of the term "laws of arithmetic"? I'm not trying to say that there's a single set of rules that applies to all systems of arithmetic. I'm using "laws of arithmetic" as a general term for the class containing each individual system of arithmetic's set of rules. You'd probably call it the "laws of each arithmetic". The "laws of one arithmetic" (by your definition) can share common features with the "laws of another arithmetic" (by your definition), so it makes sense to talk about "laws of all the different arithmetics" as a class. I've just personally shortened this to the "laws of arithmetic" because I don't recognize your usage of "arithmetic" as a countable noun.

Also, you seem to be conflating "integer arithmetic" with normal arithmetic. 2.5 + 2.1 is not integer arithmetic, and yet it follows the traditional arithmetic everyone knows. I'm not even sure normal arithmetic has a standard name; I just call it "normal arithmetic" to distinguish it from all the other arithmetics. Integer arithmetic is just a subset.

I was focusing on integer arithmetic since that was sufficient to cover your original statement. The natural generalization is group or field arithmetic to define the operations, and real-number arithmetic (a specialization of field arithmetic) to define the field elements. The notation associated with integer arithmetic is the same as the notation associated with real-number arithmetic, since the integers under addition form a subgroup of the real numbers under addition, and the integer operations are just the restrictions of the real ones.
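
(A small sketch of that compatibility claim, using Python's exact Fraction type as a stand-in for real-number arithmetic: embedding integers and then operating gives the same result as operating first and then embedding, which is why the shared notation never disagrees between the two systems.)

    from fractions import Fraction

    # Fraction plays the role of exact real-number arithmetic in this
    # sketch. Integer arithmetic is the restriction of the larger system,
    # so embedding commutes with the operations.
    for a in range(-5, 6):
        for b in range(-5, 6):
            assert Fraction(a) + Fraction(b) == Fraction(a + b)
            assert Fraction(a) * Fraction(b) == Fraction(a * b)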

To repeat my actual argument, I assert that, without prior clarification, almost no one uses the notation associated with real-number arithmetic in a way contrary to real-number arithmetic, which implies that almost no one uses it in a way contrary to integer arithmetic. Therefore, I refuse to entertain the notion that someone is actually referring to some system of arithmetic incompatible with real-number arithmetic when they use the notation associated with real-number arithmetic, unless they first clarify this.

You made this claim:

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question.

The "laws of arithmetic" that are relevant depend 100% on what arithmetic we are talking about, therefore it's imperative to know which arithmetic we are talking about. People assume it's the normal arithmetic and cannot possibly be any other one. There is zero doubt in their minds, and that's the problem I'm pointing out.

The "laws of arithmetic" that are relevant depend 100% on what arithmetic we are talking about, therefore it's imperative to know which arithmetic we are talking about.

Then please stop assuming that my uncountable usage of "the concept of arithmetic in general" in that sentence is secretly referring to your countable idea of "a single arithmetic". I've clarified my meaning twice now; I'd appreciate it if you actually responded to my argument instead of repeatedly hammering on that initial miscommunication.

People assume it's the normal arithmetic and that it cannot possibly be any other one. There is zero doubt in their minds, and that's the problem I'm pointing out.

Why should there be any doubt in their minds, if other systems of arithmetic are never denoted with that notation without prior clarification?
