2+2 = not what you think

felipec.substack.com

Changing someone's mind is very difficult; that's why I like puzzles most people get wrong: to try to open their minds. Challenging the claim that 2+2 is unequivocally 4 is one of my favorite ways to get people to reconsider what they think is true with 100% certainty.

You can do equally annoying semantic tricks with pretty much anything; it's just harder to get away with it when it isn't math:

E.g., "The Sun is smaller than a pebble" ("Pebble" is an alternate name I made up for the Milky Way).

"Grass isn't green" (I've defined "green" to be 00FF00 in hexadecimal, and this grass here is 28CF0E, which I have named "moss", so the colors aren't equal).

etc.

When you say things without rigorously defining every word ahead of time, there is an implicit promise that your words mean approximately what they usually mean in that language. Most words and concepts have reasonably well understood meanings, or such can be inferred via context. And this is almost always a good thing, because it enables people to have conversations without carrying dictionaries around; it's not some close-minded thing that needs to be challenged and abused with pedantic tricks and deception.

Except arithmetic isn't a semantic trick, and modern algebra is an important field of mathematics, not something I invented.

And in plain arithmetic, the context in which more than 99% of all uses of those symbols occur, 2 + 2 = 4 is a true statement.

Perhaps a better analogy would be if you throw a random French word that sounds identical to a different English word into an otherwise English sentence, and trick people with the double meaning. It's a language that exists, and it does mean that thing to some people in some contexts, but you've deliberately placed it out of context in order to deceive the audience. This can be great as the setup to a pun/joke, but not so much for making educational points.

The statement "in normal arithmetic 2+2=4" is true, but "2+2 is always 4" is false.

You can dismiss semantics all you want, but the meaning of the statements we make matters, and the certainty we have about the meaning of the statements other people make matters too.

Just this week I debated a person who was 100% certain he knew what anti-Semitism was (he didn't), what a dictionary was (he didn't), and what all the words in the definitions I presented to him meant (he didn't).

In my view 100% certainty is a problem.

I believe questioning the meaning of 2+2 might help some people question other unquestionably true beliefs.

Are you 100% certain it's impossible for this to happen?

The map is not the territory.

If you hold constant the referents, then 2+2 is always 4. That is, the number 2 in the integers/real-numbers, added to the number 2 in the integers/real-numbers, deterministically always yields the number 4 in the integers/real-numbers.

The symbol "2" does not always refer to integers/real-numbers, and "+" does not always refer to addition in the integers/real-numbers, and "=" does not always refer to equality in the integers/real-numbers, so the string of symbols "2+2=4" does not always refer to a true statement, but that's only if it refers to an unusual statement other than the standard referent of the string.

So I would say that "2+2=4 is always true" is true, because when I say 2+2=4 without additional context I implicitly mean the integers/real-numbers. I will concede that ""2+2=4" always refers to a true statement" is false, but consider this vacuous, because literally any string can be redefined to refer to any true or false statement. So when somebody says "2+2=4", I am not 100% certain that the statement they intend with their words is true, but I am 100% certain that the statement in my mind induced by those words is true, and I am 99.99% sure that the true statement in my mind would be the same statement created in the minds of the majority of people who have at least 1 month of mathematics education using Arabic numerals, so I am not at all worried about considering this to be the default interpretation unless otherwise specified.
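
To make the vacuousness of that concession concrete, here is a toy sketch (a deliberately artificial structure of my own; nobody ordinarily means this by "+"): rebinding "+" to max, as in the max-plus ("tropical") semiring, makes the same string of symbols name a false statement, while changing nothing about the standard statement.

    # Standard referents: "+" is integer addition.
    print(2 + 2 == 4)  # True

    # Non-standard referents: a toy number type whose "+" is max,
    # as in the max-plus ("tropical") semiring.
    class Tropical:
        def __init__(self, v):
            self.v = v
        def __add__(self, other):
            return Tropical(max(self.v, other.v))  # "+" now means max
        def __eq__(self, other):
            return self.v == other.v

    two, four = Tropical(2), Tropical(4)
    print(two + two == four)  # False: here "2+2" names 2, not 4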

You didn't answer my question.

I am not 100% certain it's impossible for someone (including myself) to be mistaken about the definitions or meanings of commonly used words or mathematical symbols. It's technically possible with nonzero but very very small probability, especially for very commonly used stuff like 2 and +. But that's true of literally every fact, and there is not enough time in the human lifespan to thoroughly interrogate all of them, so 2+2 is not a wise choice to focus on. I think that the assumption of common knowledge of words is incredibly useful when used appropriately, so sowing doubt and being pedantic about it is likely to cause more harm than good if done unstrategically. Your goal is potentially useful, but pedantry is not the way to accomplish it, you'd do better targeting more ambiguous words that don't have the force of mathematical logic and precision behind them.

I am not 100% certain it's impossible for someone (including myself) to be mistaken about the definitions or meanings of commonly used words or mathematical symbols.

That was not my claim. Please read my claim and then answer my question.

Using non-standard definitions without denoting them beforehand is a semantic trick.

And if you want to do math, you absolutely need to rigorously define things.

This is usually covered in basic math courses or textbooks. For example, freely translated from the Open University of Israel's quick introduction to logic:

We mentioned that 3*4 > 10 is a true statement. This statement is false if the numbers are actually written in hexadecimal, where "10" represents the decimal number 16.

So that we don't require the assistance of a lawyer every time we determine a statement to be true or false, we agree that in every case where concepts have a common interpretation or context, we assume (without mentioning) that we speak in that common context, [...]
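
The book's example is easy to check mechanically; here is a quick sketch (mine, not the book's) using Python's int(), which lets you fix the base a numeral is read in:

    # "3*4 > 10" with the numerals read in decimal: 12 > 10 is True.
    print(int("3", 10) * int("4", 10) > int("10", 10))  # True

    # The same symbols read in hexadecimal: "10" names sixteen, so 12 > 16 is False.
    print(int("3", 16) * int("4", 16) > int("10", 16))  # False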

My post has absolutely nothing to do with bases. Did you read it?

2+2 is unequivocally 4 unless the numbers are redefined, such as by changing the base you're working in.

Except my post proves that's not the case. Again: I did not change the base in my post.

You’re missing the point. It’s not that you literally changed the base, but you did effectively the same thing.

No. Apples are not oranges. Abstract algebra is a much less known concept than numeral systems. Virtually nobody thinks of that when considering 2+2.

My comment has nothing to do with bases either. It has everything to do with assuming common context.

This is like you replying “My post has nothing to do with 3*4”, you’ve missed the point entirely and got hung up on the example.

This is your claim:

This is usually covered in basic math courses or textbooks.

What is "this" in this context?

Also, you claimed that "this" is taught in basic math textbooks, but you didn't provide an example of such a textbook; you provided one about logic.

“This” is that we assume the common interpretation if one exists. The second quoted paragraph explains it.

Logic is a part of math. The book is from an undergrad discrete math course I took once, so I pulled the book from the shelf to quote for you.

“This” is that we assume the common interpretation if one exists. The second quoted paragraph explains it.

Which is?

Logic is a part of math.

No. Logic and mathematics have a complicated relationship.

The book is from an undergrad discrete math course I took once, so I pulled the book from the shelf to quote for you.

So it wasn't a "basic math" course, and you don't have an example of a "basic math" textbook covering "this".

I’m not sure what your angle here is. What are you trying to say? Do you actually not understand my point, or are you just being obtuse?

Discrete math is as basic as it gets, it’s first semester CS/Electrical/Math/Physics. It’s literally the base. Saying logic isn’t part of math but has “a complicated relationship” with math… again, I don’t see what you’re getting at. Seems like an objection for objection’s sake.

Again, the point is that it is convention to assume the common interpretation/ context of a statement when we assess its truth value, unless otherwise specified.

Discrete math is as basic as it gets, it’s first semester CS/Electrical/Math/Physics.

Of university. You were taught math before that, weren't you?

It's not "basic math".

Saying logic isn’t part of math but has “a complicated relationship” with math… again, I don’t see what you’re getting at.

That your statement is not quite correct.

Again, the point is that it is convention to assume the common interpretation/ context of a statement when we assess its truth value

"Convention" literally means usually done, not always.

That's not how modular arithmetic works: 2+2=4 is still true, it's just that 4=0 mod 4, so 2+2=0 is also true.

Even if your example were true, that would just be notation confusion: The statement commonly meant by 2+2=4 is always true. So if I say 2+2=4 is always true, I'm correct, and if you say 2+2=? and the answer isn't 4 you're just communicating badly by omitting relevant information about the problem statement. In honest conversation this doesn't change anything.
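
A quick sketch of that first point, using Python's % operator as a stand-in for reducing to residues modulo 4 (the discussion below is really about congruence classes, but the arithmetic agrees):

    # Modulo 4, the numerals 4 and 0 name the same residue class,
    # so both "2+2=4" and "2+2=0" hold.
    print((2 + 2) % 4 == 4 % 4)  # True: 2+2 = 4 (mod 4)
    print((2 + 2) % 4 == 0 % 4)  # True: 2+2 = 0 (mod 4), since 4 ≡ 0 (mod 4)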

That's not how modular arithmetic works: 2+2=4 is still true

There is no 4 in modulo 4; you are confusing the modulo operation with modular arithmetic. They are two different concepts that lead to the same result.

I'm not, neither of us was talking about the modulo operation (I was using mod 4 to denote I'm operating in the congruence class ring).

And the article about modular arithmetic agrees with me. Choice quote:

Each residue class modulo n may be represented by any one of its members

It may be represented that way, but they are not the same thing.

You yourself accepted here that 4 (sa) is not the same statement as 4 (mod 4).

So your claim that 4 (sa) = 0 (mod 4) is just plainly false.

It may be represented that way, but they are not the same thing.

4 and 0 are equivalent as representatives of the residue class. If you can write down 2+2 where 2 refers to a residue class, the answer can be written down as 4.

your claim that 4 (sa) = 0 (mod 4)

How many times do I have to ask you to stop misquoting me?

If I ask you what's the result of 22+2, you are most likely going to answer 24, but if I ask you what's 22:00+2:00, you are likely not going to answer 24:00 (which isn't a thing).
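
The intended reading here is presumably clock arithmetic, i.e. hours of the day modulo 24; a minimal sketch of that interpretation (mine):

    # Time-of-day reading: hours on a 24-hour clock wrap modulo 24.
    def add_hours(h1, h2):
        return (h1 + h2) % 24

    print(add_hours(22, 2))  # 0, i.e. 22:00 + 2:00 wraps around to 0:00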

It's quite common in Japan to see a bar or restaurant with posted hours of operation from 17:00 to 27:00 (i.e. from 5:00 PM to 3:00 AM). 27:00 is valid in the same sense that 27/24 or 400° is valid; we might call it an "improper time."

You also don't specify whether 22:00 is time of day or duration. If it's a duration, then 24:00 is clearly the correct answer.

But 22+2 can be 0.

there is a difference between “you should have lingering doubt in the face of certainty that you know what exactly is trying to be communicated” and “you should have doubt about things you absolutely know are true”.

“I purposefully did not mention I was thinking about modulo math and let you assume I meant the common notion of + “ doesn’t really convince me of anything except that people disagree about what it means to be dishonest.

I don't need to be thinking about modular arithmetic to doubt 2+2=4, I could do it without having a good reason to doubt.

And as I explained in the article, Bertrand Russell doubted something much more fundamental, 1+1=2, wrote extensively about it, and it's considered serious and important work on the foundations of mathematics.

Do you think Bertrand Russell was "dishonest" for asking people to suspend their belief?

Do you think Bertrand Russell was "dishonest" for asking people to suspend their belief?

No, merely exceedingly rigorous. He set out to prove 1+1=2, and after a lot of tedious work he indeed proved that 1+1=2 is in fact true, settling the debate by confirming what everyone already knew. He didn't actually doubt it; he merely wanted to put it on a formal foundation, and he did.

He wasn't an engineer who was worried bridges would fall if everyone computed 1+1 incorrectly, he wasn't a politician who got challenged on his fiscal plan and needed to double-check his assumptions. He was a nerd who wanted clarity for its own sake, operating at the intersection between pure math and philosophy. That's the field where you would doubt 1+1=2, not because you actually doubt it, but because you expect insight from dispelling that doubt. It's the same level of abstraction as wondering whether you're actually a brain in a vat. In politics or engineering, you can't do that.

That's the field where you would doubt 1+1=2, not because you actually doubt it, but because you expect insight from dispelling that doubt.

It doesn't matter if Bertrand Russell personally doubted it or not, he acted as if it was rational to not believe with 100% certainty something which had not been proven yet, and it was.

The reason he attempted to dispel that doubt is that, absent that proof, it was reasonable to doubt.

It's the same level of abstraction as wondering whether you're actually a brain in a vat.

Which is a valid doubt in philosophy.

In politics or engineering, you can't do that.

You have to doubt in engineering, for the same reason you have to doubt in every field. Bridges have fallen because engineers did not doubt enough.

Not really. I can guarantee you that Russell used 1+1=2 when calculating his daily expenses even before he formally proved it. Had he failed at his attempt to prove it, he would have gone on believing and using it. I can guarantee you he didn't scold any colleagues for using 1+1=2 without proof.

He wanted a formal proof for its own sake, not because one was needed.

I can guarantee you that Russell used 1+1=2 when calculating his daily expenses even before he formally proved it.

I literally said "it doesn't matter if Bertrand Russell personally doubted it or not".

If I'm not 100% certain a particular chair is not broken, but I sit on it anyway, and you conclude that therefore I believe with 100% certainty that it wasn't broken, you are committing a converse error fallacy.

You cannot read minds, you cannot know why I did sit on that chair, and assuming that you do know is an error in logic.

Even worse is to assume you do know why I checked the chair before sitting on it, and assuming it had nothing to do with my potential doubt.

Doubt is essential in all fields. 100% certainty is extremely dangerous. And I don't see you addressing this at all.

I literally said "it doesn't matter if Bertrand Russell personally doubted it or not".

No one doubted it, because it wasn't actually reasonable to doubt it. Russell wanted to formalize a foundation; he wanted to prove that arithmetic derived from logic, not that arithmetic was true.

Doubt is essential in all fields.

Not doubt about math or fundamental logic. That is only reasonable in philosophy. An engineer who doubts 1+1=2 will never build any bridges, and no bridge will collapse because an engineer assumed 1+1=2.

If you doubt the fundamentals, you're doing philosophy. If you want to get anything done, you need to stop doing philosophy. You need to choose some axioms, build a knowledge base and then get to work on questions that are actually in doubt.

100% certainty is extremely dangerous. And I don't see you addressing this at all.

Because right now a fallacious argument is being made for too little certainty, not too much. I'm addressing the bad arguments that are actually on the table.

Not doubt about math or fundamental logic.

No? So nobody in mathematics doubts the Zermelo–Fraenkel set theory axiomatic system?

An engineer who doubts 1+1=2 will never build any bridges

Who said an engineer should doubt 1+1=2?

No? So nobody in mathematics doubts the Zermelo–Fraenkel set theory axiomatic system?

Doubt about axioms is basically mathematical philosophy.

Who said an engineer should doubt 1+1=2?

So you agree doubt about everything is not reasonable in every field?

Based on his reputation and without reading what he wrote, no I don’t think he was being dishonest. I assume he was doing some weird philosophy and never at any moment entertained the possibility that meat and potatoes real life counting was hanging in jeopardy. I don’t think he was going around and saying “you should worry about reality and trusting your lying eyes because of some fancy math that you probably don’t need and doesn’t apply to counting physical objects”

Based on his reputation and without reading what he wrote, no I don’t think he was being dishonest.

That's literally an argument from authority fallacy.

Plenty of philosophers have doubted even the most fundamental concepts of everything, including reality itself. Solipsism is a serious philosophical concept, which includes doubting that 1+1 is necessarily 2, and Bertrand Russell entertained that possibility.

It’s not a fallacy because I assume the content of his argument is not the content of your argument. I’m unable to comment on what he wrote. I have told you what I take issue with in your argument and it isn’t the part where you say Bertrand Russel said it too.

It’s not a fallacy because I assume the content of his argument is not the content of your argument.

But you are assuming his argument is valid merely on the basis of his credentials.

And you are assuming my argument is invalid merely on the basis of my credentials.

That's a fallacy.

No, this is a bizarre reading of the situation. Goodbye

I think you have a fundamental misunderstanding of what Bertrand Russell was doing when he proved 1+1=2. From an earlier work of his which effectively became a preface to the Principia Mathematica:

The present work has two main objects. One of these, the proof that all pure mathematics deals exclusively with concepts definable in terms of a very small number of fundamental concepts, and that all its propositions are deducible from a very small number of fundamental logical principles, is undertaken in Parts II–VII of this work, and will be established by strict symbolic reasoning in Volume II.

The proof was not to dispel doubt about the statement 1+1=2, but to dispel doubt about the system of formal logic and axioms that he was using while constructing that proof. "1+1=2" was not a conundrum or a question to be answered, but a medal or trophy to hang on the mantle of mathematical logicism; much in the same way that the point of "coding Hello World in assembly" is not "Hello World" but "coding in assembly."

Russell was showing that you could lower the "basement" of mathematics and consider it as starting from another foundation deeper down, from which you could construct all mathematical knowledge, and to do that he had to build towards mathematics where it already stood.

(Then Kurt Gödel came along and said "Nice logical system you've built there, seems very complete, shame if someone were to build a paradox in it...")

I think you have a fundamental misunderstanding of what Bertrand Russell was doing when he proved 1+1=2

No, I don't. In mathematics the word "proof" has a very precise meaning, and anything without a "proof" is held as tentative (i.e. not necessarily true), for example a conjecture.

This entirely depends on the set of axioms you choose as a foundation, and you certainly could choose 1+1=2 as one of those axioms, in which case it's an assumption that doesn't need to be substantiated. But if you get rid of that axiom, then 1+1=2 is held as tentative and thus lacking proof.

much in the same way that the point of "coding Hello World in assembly" is not "Hello World" but "coding in assembly."

You are making a very obvious assumption there.

Russell was showing that you could lower the "basement" of mathematics and consider it as starting from another foundation deeper down, from which you could construct all mathematical knowledge, and to do that he had to build towards mathematics where it already stood.

I know.

Another way to think about it is that he tried to refactor the 1+1=2 axiom into more fundamental axioms. But this work necessitates the possibility that an axiomatic system that doesn't have 1+1=2 as an axiom is tenable. If such a system exists (which I think Bertrand Russell pretty much proved), that means that 1+1=2 does not need to be assumed to be true, it can be inferred.

I call "not assume" "doubt", but it doesn't matter what you call it, the fact is that to write Principia Mathematica Bertrand Russell had to not assume 1+1=2.

I call "not assume" "doubt", but it doesn't matter what you call it, the fact is that to write Principia Mathematica Bertrand Russell had to not assume 1+1=2.

It does matter what you call it, especially if you haven't explicitly defined what you mean when you use the term you're calling it by, because people will generally interpret you as using the most common meaning of that term. And we can see the communication issues that causes right here, because there are two relevant meanings of the word "assume" in this conversation and the word "doubt" is only a good antonym for one of them, so it looks like you're conflating those meanings, unintentionally or otherwise.

To assume(everyday) something means approximately to act as if that something were true, without feeling the need to personally verify it for oneself.

To assume(logic) something means to accept it as an axiom of your system (although potentially a provisional one) such that it can be used to construct further statements and the idea of "verifying" it doesn't make much sense.

Doubt is a reasonable word for "not assume(everyday)," though it's usually used in a stronger sense, but it's a much poorer fit for "not assume(logic)." The technique of proof by contradiction is entirely based on assuming(logic) something that one is showing to be false, i.e. that one does not assume(everyday).

Russell himself is a good example of the inequivalence going the other direction. What would he have done if he had managed to prove 1+1=3 with his logical system? I can't be completely certain, but I don't think he'd have published it as a revolution in mathematical philosophy. More likely, he'd have gone over the proof looking for errors, and if he couldn't find any he'd start tinkering with the axioms themselves or the way in which they were identified with arithmetical statements to get them to a form which proved 1+1=2 instead, and if that failed he'd give them up as a foundation for mathematics, either with a grumbling "well I bet there's some other way it's possible even if I wasn't able to show it myself" or in an outright admission that primitive logic doesn't make a good model for math.

In other words, even though he didn't assume(logic) that 1+1=2, his assumption(everyday) that 1+1=2 would be so strong as to reverse all the logical implication he had been working on; a "proof" that 1+1 != 2 would instead be taken as a proof that the method he used to reach that conclusion was flawed. This is not a state of mind I would refer to as "doubt."

much in the same way that the point of "coding Hello World in assembly" is not "Hello World" but "coding in assembly."

You are making a very obvious assumption there.

Yes. I assumed that you have enough in common with me culturally to know what "Hello World" and "assembly" are in the context of coding, why "Hello World" is a nearly useless program in the vast majority of contexts, and that people frequently practice new programming languages by writing programs in them with little regard for the practical use of those programs; that you are intelligent enough to comprehend those kinds of comparative asides and familiar enough with conversational English to understand that loading them with caveats would draw too much focus away from the point they are supporting; and that you are here to have a constructive conversation instead of deliberately wasting people's time. If I'm wrong about any of those I will be happy to be corrected.

It does matter what you call it

I did not say it doesn't matter what I call it, I said it doesn't matter what you call it.

And it seems pretty clear to me you are being intentionally obtuse. The purpose of me communicating to you is that you understand what I mean; it doesn't matter how. For any given idea I have there's a set of words I could use to transmit that idea, and any word I use has multiple meanings, but as long as you pick the meaning that correctly matches the idea I want to transmit, we are communicating effectively. The "most common meaning" is completely irrelevant. The most common meaning of the word "get" is "to gain possession of", but if I say "do you get what I mean", I'm not using the most common meaning, and I don't have to.

I used multiple words, terms, and an explanation for you to understand what I meant, and if you understand it, I don't personally care what word you use to name that idea.

To assume(everyday) something means approximately to act as if that something were true, without feeling the need to personally verify it for oneself.

To assume(logic) something means to accept it as an axiom of your system (although potentially a provisional one) such that it can be used to construct further statements and the idea of "verifying" it doesn't make much sense.

I don't see any difference. If you "assume X" it means you hold X as true without any justification, evidence, verification, or inference.

In other words, even though he didn't assume(logic) that 1+1=2, his assumption(everyday) that 1+1=2 would be so strong as to reverse all the logical implication he had been working on

I disagree. Every day he saw evidence that 1+1=2, so it would be reasonable to believe (not assume) that this was always true. Additionally he saw no way it could not be true, but he was rational enough to know this was not a good reason to assume it was true, as this would have been an argument from incredulity fallacy.

Maybe he did assume that 1+1=2 in everyday life, but you cannot know that unless you could read his mind.

This is a converse error fallacy. If I assume a chair is sound, I would sit down without checking it, but if I sit down without checking it doesn't necessarily mean I assumed it was sound.

In general rationalists try to not assume anything.


If I'm wrong about any of those I will be happy to be corrected.

I know Hello World is a nearly useless program in the vast majority of contexts, but not all, and I know that people frequently practice new programming languages by writing programs in them with little regard for the practical use of those programs, but not all people who write Hello World programs are practicing new programming languages.

You are assuming the general case. I can easily imagine somebody in the 1970s developing for a new hardware architecture for which there are no other compilers available, trying to test that any software runs at all; in fact, I can even imagine somebody today doing that for a new hardware architecture like RISC-V.

And once again, the point of these examples is not to "deliberately waste people's time"; it's to show people they are making assumptions even when they can't possibly see how they could be making one.

Every time I tell somebody that they are making an assumption they disagree, and every time I point to them the assumption they were making they come with a rationalization, like "you tricked me", or "that's very unlikely", or "everyone would have assumed the same". It's never "actually you are right, I didn't think about that".

I don't see any difference. If you "assume X" it means you hold X as true without any justification, evidence, verification, or inference.

As I've seen the term used outside of logic, it only requires a lack of effort towards verification. You can have justification, evidence, or inference, as long as they are simple enough and easily-enough available. For example, I would find nothing unusual in a drive-by reply to this line consisting of the following sentence: I assume you didn't read the post very thoroughly, then, because the paragraph immediately below where your quote ends contains a distinguishing case.


You are assuming the general case.

Ah! I see the false assumption was "that you are intelligent enough to comprehend those kinds of comparative asides and familiar enough with conversational English to understand that loading them with caveats would draw too much focus away from the point they are supporting." Asides of that type are implicitly restricted to the general case, because they are intended to quickly illustrate a point by way of rough analogy, rather than present a rigorous isomorphism.

I assume you didn't read the post very thoroughly, then, because the paragraph immediately below where your quote ends contains a distinguishing case.

This is an equivocation fallacy. You are using a different definition of "assume", in particular using it exactly as "suppose". In my view assuming and supposing are two different things, even in the colloquial sense.

I see the false assumption was "that you are intelligent enough to comprehend those kinds of comparative asides and familiar enough with conversational English to understand that loading them with caveats would draw too much focus away from the point they are supporting."

Wrong. I can comprehend the notion without accepting it. This is a converse error fallacy.

Asides of that type are implicitly restricted to the general case, because they are intended to quickly illustrate a point by way of rough analogy, rather than present a rigorous isomorphism.

This is obviously a cop-out. If you were aware that your claim applied only to the general case, but you merely did not make it explicit, then the moment I mentioned there was an assumption you would immediately know what assumption I was talking about, because you were fully aware.

But you didn't know what assumption I was talking about, because you were not aware of the restriction. Now you want to pretend you always knew you were making an assumption; you merely didn't say so when I pointed it out, for some reason.

This is precisely what everyone does. Before, they say they aren't making an assumption; after I point it out, they always knew. You did exactly what I said people do, and you went for one of the options I listed: "everyone would have assumed the same".

It seems like you don't, actually, understand what that comparative aside was doing, so let me restate it at more length, in different words, with the reasoning behind the various parts made more explicit.

I described a situation where a person generated object A by means of process B, but due to their circumstances the important part of their activity was process B, and object A was important mostly insofar as it allowed the engagement of process B. Since I judged this sort of process-driven dynamic may seem counterintuitive, I also decided to give an example that is clearly caused by similar considerations. Writing Hello World in a new language is a nearly prototypical instance of trivial output being used to verify that a process is being applied successfully. The choice of assembly further increased the relevance of "moderately experienced programmer checking that their build pipeline works and their understanding of fundamentals is correct".

In this context, the existence of the general case - and the fact that it is the typical example brought to mind by the description, as indicated by the name you selected - suffices to serve the purpose of the aside. I did not claim and did not need to claim anything about all instances of building Hello World in assembly; the idea that I was trying to is an assumption that you made.

Looks like an exercise in sophistry. Of course, with skillful redefinition of what '2' and '+' and '2' and '=' and '4' mean, you can make it mean anything you like. It's not a very interesting thing, just as you could invent your own private language where the word "water" means "sun" and the word "dry" means "yellow" and then claim "water is dry!" and feel very smart about it. There's nothing to reconsider - it's not about something being true or not true, it's about confusing oneself on purpose.

Of course, with skillful redefinition of what '2' and '+' and '2' and '=' and '4' mean, you can make it mean anything you like.

I did not invent abstract algebra; it's an important field of mathematics.

I didn't claim you invented abstract algebra (if you did, I'd probably just worship you quietly and wouldn't dare to contradict you) - I claim that the ability of symbols to attach to any meaning, and the fact that the same symbols can be used to designate multiple entities with different qualities and properties, is not some kind of deep revelation that leads people to profound insights (at least not by now - maybe when this fact was first discovered, it would have). It's more like a rhetorical gotcha trick that is meant to confuse people and give the trickster a false aura of profundity - "oh, he knows that 2+2 can mean so many different things, he must be so smart!", or even "aha! He said "I like giving gifts to children", but in German "gift" means poison, so he actually is a child murderer and I won the internet by this clever trick!". It's not really that clever, that's what I am saying.

It's not really that clever, that's what I am saying.

Who says it has to be clever?

I think it's time to move on to a horse that has not shuffled off its mortal coil and did not join the choir invisible. This is an ex-horse.

This is a smoke screen. You still haven't answered the question.

The most important ideas in civilization are not "clever". So what?

The idea stated in this post is neither interesting nor important.

This seems more like a "gotcha" than a genuine prompt to get people to reconsider the nature of truth. Without knowing how you're construing 2 + 2 != 4, I can fairly confidently guess it's one of the following:

  1. An arbitrary redefinition of "2" or "4", e.g. "2 AM" or something silly like that.

  2. Highly advanced mathematics, possibly something non-Euclidean.

If the person you're asking is polite, they'll likely go along with it, but their internal monologue is probably something less like "this makes me rethink everything!" and more like "ugh, this guy really wants to show me how smart he thinks he is".

ugh, this guy really wants to show me how smart he thinks he is

Yes, but I don't care about their reaction or their opinion of me.

I've never seen anybody seriously question any of their core beliefs in real time. But these notions plant a seed which eventually they can't avoid. Sleep is important in mulling these over.

In fact, I remember mentioning to somebody the claim "all models are wrong, but some are useful", which was immediately dismissed (since in an argument nobody wants to be wrong), but some time later the same person made the exact same claim to me. He forgot I was the one who mentioned it, but more importantly: he forgot that initially he immediately dismissed it.

I bet many people will forget this article and dismiss it as pedantry, but the next time someone says "this is as true as 2+2=4" they might think twice. These two things are not mutually exclusive.

I would actually double down and assert that 2+2=4 is a fact deeper than arithmetic. If 2+2=4 are elements in a modular ring, it holds true. If they are vectors, it still holds true. If they are abstract discrete topological spaces, it holds true. I have not encountered a situation in (non-joke) mathematics where the symbols '2', '+' and '4' are overloaded so as to not make this equation true.

There is an underlying concept of "twoness", "addition" and "fourness" that holds this property even as you generalize it to systems beyond integer arithmetic, almost like a fundamental structure of mathematics. This is not even about notational trickery. Even if you decide to use different symbols, it does not change the underlying mathematical relationships. You would just be expressing the same undeniable fact differently.
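
One rough way to see this carrier-independence (a sketch of mine, with Fraction and complex merely as stand-ins for "other structures"): the same relationship holds across Python's standard numeric types without any special handling.

    from fractions import Fraction

    print(2 + 2 == 4)                      # integers
    print(2.0 + 2.0 == 4.0)                # floats
    print(Fraction(2) + Fraction(2) == 4)  # rationals
    print(complex(2) + complex(2) == 4)    # complex numbers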

If 2+2=4 are elements in a modular ring, it holds true.

The integers modulo 4 (𝐙/4𝐙) form a modular ring which does not contain the number 4.

It contains the congruence class 4𝐙 (= {..., -8, -4, 0, 4, 8, ...}), of which the number, or rather the symbol, 4 is a valid representative.

The statement remains true.
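
A toy model of 𝐙/4𝐙 (my own construction, just to make the representative point concrete): each class is tracked by a canonical representative, and the classes named by 4 and by 0 come out identical.

    # Toy model of Z/4Z: store a canonical representative of each class.
    class Mod4:
        def __init__(self, n):
            self.r = n % 4
        def __add__(self, other):
            return Mod4(self.r + other.r)
        def __eq__(self, other):
            return self.r == other.r

    print(Mod4(2) + Mod4(2) == Mod4(4))  # True: 2+2 = 4 in Z/4Z
    print(Mod4(4) == Mod4(0))            # True: 4 and 0 name the same class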

So does the statement 2+2=0.

Which no one has doubted.

You accepted here that (4 (sa)) is not the same statement as (4 (mod 4)).

Thus conceding my point.

You’re getting dogpiled in the comments here, which I hate to join in on, but in your comments you just seem to be repeatedly missing the point people are making.

Your post does nothing to contest the validity of the common meaning of 2+2, it just points out that by using far, far less common definitions of the symbols (either different meanings of ‘2’ or ‘+’) you can arrive at a different result.

Everyone is pointing out that this is trivially true, but very silly to use as an example. Because in reality, if someone wanted you to interpret the symbols in a nonstandard way, it would be incumbent upon them to make that clear to you.

I suppose your larger point is true, but not particularly meaningful. So a statement that seems easy and clear to interpret can actually be misleading when your interlocutor is deliberately trying to confuse and deceive you by omitting key information? Ok, but that's not exactly a surprising or interesting conclusion.

I suppose your larger point is true, but not particularly meaningful.

Are you 100% certain of that?

So a statement that seems easy and clear to interpret can actually be misleading when your interlocutor is deliberately trying to confuse and deceive you by omitting key information?

This is a loaded-language claim, a rhetorical trick. You are intentionally adding the word "misleading" to prompt an emotional response.

Consider this exchange:

  1. If you don't denounce Russia's illegal war of aggression, that makes you a Putin apologist, that's as unequivocally true as 2+2=4

  2. Actually, 2+2=4 is not unequivocally true

My claim (2) is not "misleading", and I'm not "deliberately trying to confuse and deceive" anyone; it's the other person who made a false claim (1). The sole objective of me bringing up this abstract algebra notion is to increase doubt about the original claim regarding Russia. The factoid 2+2=4 is not used by me as an end; it's used by somebody else as a means to an end. 2+2=4 is often used as a tool to demonstrate 100% certainty, and it can be dismantled.

Your loaded-language claim doesn't apply in this example. We can get rid of the loaded language and make a much fairer, more generous, and neutral claim:

"A statement that seems easy, clear to interpret, and is obviously 100% certain to be true can actually be not necessarily true when an unrealized assumption is present."

How is this more generous claim not correct?

Statement 1 is debatable but not because 2 + 2 ≠ 4, so it's pointless to argue that point. A few of the deficiencies:

  1. “Illegal war of aggression” is begging the question.

  2. “That makes you a Putin apologist” is a non sequitur: a refusal to denounce someone's actions does not equal an endorsement of the perpetrator. Has your Muslim neighbor denounced Islamic terrorism recently? Does that make him an ISIS apologist?

  3. Finally, and most importantly, law in general and international law in particular is much less clearly defined and broadly agreed upon than simple arithmetic over the natural numbers. Even if you believe that 2 + 2 = 4 isn't objectively true, it's undeniably better established than jus ad bellum.

The point is that the fact that statement 1 is false doesn't make statement 2 any more (or less) true.

To give a different example, if I say “Waffles are better than pancakes, that's as clear as the sky is blue”, would you start arguing that the sky isn't always blue? Or would you agree that the two clauses here have no logical relation to each other, and to disagree with the first doesn't require you to argue against the second?

And yes, you could argue that sometimes the sky is black or red or that the color blue is ill-defined etc., but if I put a gun to your head and asked you “What color is the sky?” I'm sure you know exactly what word you need to utter to save your life. But if I asked you about waffles vs pancakes instead, the correct answer would be a lot less obvious, proving that the truth of these statements isn't equally clear.

Finally, and most importantly, law in general and international law in particular is much less clearly defined and broadly agreed upon than simple arithmetic over the natural numbers.

This supports my argument. If I demonstrate that a rational agent should doubt something very "clearly defined" such as 2+2=4, then it logically follows that something much less clearly defined should be doubted as well.

if I say “Waffles are better than pancakes, that's as clear as the sky is blue”, would you start arguing that the sky isn't always blue?

Yes. I start with the claims that are easier to dismantle, because I know that people virtually never doubt their beliefs in real time. It would be very hard for me to convince that person that waffles are not necessarily better than pancakes, but it would be easy to dismantle the auxiliary claim.

This person may attempt to find another more unequivocally true auxiliary claim, but I would easily dismantle that too. And sooner or later this person would be forced to realize that it's not easy to find an unequivocally true claim. And if it's not easy to find an unequivocally true claim, perhaps the unequivocally true claim that waffles are better than pancakes is not so unequivocally true.

If a person says "Bob is as racist as Alice", and I show that Alice is not racist, they say, "OK, Bob is as racist as Mary"; I show Mary is not racist; "OK, Bob is as racist as Linda"; Linda isn't racist either. Wouldn't it make sense to doubt whether or not Bob is actually racist?

Using metaphors to tackle deep philosophical problems isn't even fringe. The notion of a black swan is nowadays common in order to explain that the fact that something has never happened before is not a valid reason to think it will never happen in the future. It tackles the deep philosophical problem of induction.

Instead of saying "as clear as the sky is blue", people in the past used to say "as impossible as a black swan". To say "actually, the fact that we haven't seen a black swan doesn't necessarily mean black swans don't exist" is not pedantry, it's in fact valid reasoning, a deep philosophical notion (problem of induction), and something that should have made people doubt their 100% certainty on "impossible" events.

If a person says "Bob is as racist as Alice", and I show that Alice is not racist, they say, "OK, Bob is as racist as Mary"; I show Mary is not racist; "OK, Bob is as racist as Linda"; Linda isn't racist either. Wouldn't it make sense to doubt whether or not Bob is actually racist?

Okay, but if someone says "Bob is as racist as a KKK grand wizard", it would still make sense to doubt it. Conversely, if they say "Bob is as racist as Alice, because he's the author of the bobracial supremacy manifesto", pointing out Alice isn't racist just distracts from the point at hand. Yes, it's a bad metaphor, but the point stands.

Compare this discussion. I have refuted your argument that 2+2=4 is not unequivocally true, but I'm still willing to discuss the point you were trying to make without forcing you to come up with a new example.

Conversely, if they say "Bob is as racist as Alice, because he's the author of the bobracial supremacy manifesto", pointing out Alice isn't racist just distracts from the point at hand. Yes, it's a bad metaphor, but the point stands.

Yes, but the premise of this line of thought is precisely the opposite: it's not easy to prove Bob isn't racist; on the other hand, it's extremely easy to prove Alice isn't racist.

I have refuted your argument that 2+2=4 is not unequivocally true, but I'm still willing to discuss the point you were trying to make without forcing you to come up with a new example.

But discussing is not accepting. You are arguing that Bob is a racist, but you are nowhere near accepting the possibility that he might not be.

You are not willing to accept that Alice might not be a racist, and Bob even less. Which proves my point.

Yes, but the premise of this line of thought is precisely the opposite: it's not easy to prove Bob isn't racist; on the other hand, it's extremely easy to prove Alice isn't racist.

That's my exact point. If you prove Alice isn't racist, you haven't proven anything relevant. You're just nitpicking. The actual relevant question of whether Bob is racist is unaddressed.

But discussing is not accepting. You are arguing that Bob is a racist, but you are nowhere near accepting the possibility that he might not be.

I'm accepting the possibility Bob might be racist to the degree I'm required to: I'm listening to the supporting case and engaging with your arguments.

Your arguments that Bob is racist just aren't convincing. You're mainly arguing he's as racist as Alice and I happen to know she isn't. And instead of leaving it at that until you make a better argument, which I could, I'm trying to work out why you think Alice is racist and how it applies to Bob, and arguing against that.

You are not willing to accept […]. Which proves my point.

No, I'm not accepting your point because it's false. You don't get to twist opposition to your argument into support for your point.

If you prove Alice isn't racist, you haven't proven anything relevant. You're just nitpicking.

In your opinion, which isn't infallible.

I'm listening to the supporting case and engaging with your arguments.

This is not enough. Open debate requires an open mind: you must accept the possibility that you might be wrong.

If you don't even accept the possibility that you might be wrong about anything, then there's no point in debating, not about Alice, not about Bob, not about anything. All you are doing is wasting the time of your interlocutor.

This in my view is arguing in bad faith. If there's absolutely no way you can be convinced otherwise, then what am I even doing?

You're mainly arguing he's as racist as Alice and I happen to know she isn't.

Therefore it's impossible for you to be convinced of anything (about Alice and even less of Bob), and there's no point in me even trying.

In your opinion, which isn't infallible.

Is that supposed to be a counterargument?

This is not enough.

Yes it is. Listening to your case and engaging with your argument will make me change my mind if your case is convincing enough.

Therefore it's impossible for you to be convinced of anything (about Alice and even less of Bob), and there's no point in me even trying.

No, it's still possible for me to be convinced of true things.

You're right that there's no point trying to convince me of a false statement about math. Instead you should let yourself be convinced by me.

Yes, it is useful to challenge your basic assumptions about our reality. For example, did you know that Earth has a four corner simultaneous 4-day time cube?

did you know that Earth has a four corner simultaneous 4-day time cube?

Well, I never assumed it didn't. Mainly because I don't know what that means.

It's an old webpage that became a meme at the time.

https://en.wikipedia.org/wiki/Time_Cube

This is without a doubt the least interesting thing I’ve read this year.

Yeah? What is your most fundamental belief that you have questioned this year?

Some rather personal shit I’d rather not air even pseudonymously.

What about your second most fundamental belief that you have questioned?

My brother, you don’t have to question your fundamental beliefs if you get them right the first time.

You think you get them right. So that's a "no, I don't question my fundamental beliefs".

I'd concur that this is more of an annoying semantic trick than anything else. It is never denied that 2 + 2 = 4 within the group of integers under addition (or a group containing it as a subgroup), a statement that the vast majority of people would know perfectly well. Instead, you just change the commonly understood meaning of one or more of the symbols "2", "4", "+", or "=", without giving any indication of this. Most people consider the notation of integer arithmetic to be unambiguous in a general context, so for this to make any sense, you'd have to establish that the alternative meaning is so widespread as to require the notation to always be disambiguated.

(There's also the epistemic idea that we can't know that 2 + 2 = 4 within the integers with complete certainty, since we could all just be getting fooled every time we read a supposedly correct argument. But this isn't really helpful without any evidence, since the absence of a universal conspiracy about a statement so trivial should be taken as the null hypothesis. It also isn't relevant to the statement being untrue in your sense, since it's no less certain than any other knowledge about the external world.)

Most people consider the notation of integer arithmetic to be unambiguous in a general context

But that is the point: most people make assumptions. In this particular case, people who do understand modular arithmetic can easily see what assumption is being made, but that excludes the vast majority of people, who don't.

The whole point of the article is to raise doubt about more complicated subjects which are not so easy to mathematically prove.

most people make assumptions

Assumptions are a basic building block of cognition, necessary in order to make even the very simplest possible of conclusions. Literally everyone makes assumptions whenever they have literally any thought or take literally any action. To take a single step forward is generally to assume your eyes don't deceive you, the world isn't about to explode, you won't suddenly quantum phase to the sun when you take the step, you still remember how to take steps, etc.

There are certainly more and less justified assumptions, and perhaps you mean to call attention to less justified assumptions, but don't do so using an example of an extremely justified assumption.

Literally everyone makes assumptions whenever they have literally any thought or take literally any action.

Yes, but not everyone realizes they are making an assumption. Just like virtually nobody realizes they are making an assumption when answering the 2+2 question.

But that is the point: most people make assumptions.

Assumptions about the meaning of symbols, namely that symbols carry their conventional meaning unless denoted otherwise.

This is a necessary prerequisite of communication, and messing with it is merely a failure to communicate.

And the failure to communicate can be entirely on the listening side by assuming a meaning that was never there.

The fact that today people don't understand each other is a huge problem, and worse: people don't want to understand what the other side is actually saying.

In general? Yes. In this example? Absolutely the speaker's fault. If you're using non-standard symbols, you need to denote that.

You are assuming I'm the one who brought up the 2+2=4 factoid.

If the speaker who brought up 2+2=4 is using standard symbols, he's unambiguously correct, so that can't be what we're talking about.

If the speaker claimed that 2+2=4 is unequivocally true, he/she is wrong.

Absolutely not. The speaker knows what the statement means, what the symbols mean, in what structure we're operating. The rest is just basic arithmetic over the natural numbers.

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question. The only lesson to be learned is that your interlocutor's terminology has to be aligned with yours in order to meaningfully discuss the subject. This has nothing to do with how complicated the subject is, only with how ambiguous its terminology is in common usage; terminology is an arbitrary social construct. And my point is that this isn't even a very good example, since roughly no one uses standard integer notation to mean something else without first clarifying the context. Far better examples can be found, e.g., in the paper where Shackel coined the "Motte and Bailey Doctrine", which focuses on a field well-known for ascribing esoteric or technical meanings to commonplace terms.

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question.

There is no single "arithmetic"; there are multiple arithmetics. You are assuming that the "laws" of one particular arithmetic apply to all arithmetics, which is not true.

Here, I'm using "the laws of arithmetic" as a general term to refer to the rules of all systems of arithmetic in common usage, where a "system of arithmetic" refers to the symbolic statements derived from any given set of consistent axioms and well-defined notations. I am not assuming that the rules of integer arithmetic will apply to systems of arithmetic that are incompatible with integer arithmetic but use the exact same notation. I am assuming that no one reasonable will use the notation associated with integer arithmetic to denote something incompatible with integer arithmetic, without first clarifying that an alternative system of arithmetic is in use.

Furthermore, I assert that it is unreasonable to suppose that the notation associated with integer arithmetic might refer to something other than the rules of integer arithmetic in the absence of such a clarification. This is because I have no evidence that any reasonable person would use the notation associated with integer arithmetic in such a way, and without such evidence, there is no choice but to make assumptions of terms ordinarily having their plain meanings, to avoid an infinite regress of definitions used to clarify definitions.

I'm using "the laws of arithmetic" as a general term to refer to the rules of all systems of arithmetic in common usage, where a "system of arithmetic" refers to the symbolic statements derived from any given set of consistent axioms and well-defined notations.

There are no axioms that apply to all arithmetics. There are no such "laws".

Go ahead and try to come up with one "law". I'm fairly certain I can point out an arithmetic where it doesn't apply.

There's a reason these fall under the umbrella of abstract algebra.

Also, you seem to be conflating "integer arithmetic" with normal arithmetic. 2.5 + 2.1 is not integer arithmetic, and yet follows the traditional arithmetic everyone knows. I'm not even sure if normal arithmetic has a standard name, I just call it "normal arithmetic" to distinguish it from all the other arithmetics. Integer arithmetic is just a subset.

There are no axioms that apply to all arithmetics. There are no such "laws".

Are you getting hung up on my use of the term "laws of arithmetic"? I'm not trying to say that there's a single set of rules that applies to all systems of arithmetic. I'm using "laws of arithmetic" as a general term for the class containing each individual system of arithmetic's set of rules. You'd probably call it the "laws of each arithmetic". The "laws of one arithmetic" (by your definition) can share common features with the "laws of another arithmetic" (by your definition), so it makes sense to talk about "laws of all the different arithmetics" as a class. I've just personally shortened this to the "laws of arithmetic" because I don't recognize your usage of "arithmetic" as a countable noun.

Also, you seem to be conflating "integer arithmetic" with normal arithmetic. 2.5 + 2.1 is not integer arithmetic, and yet follows the traditional arithmetic everyone knows. I'm not even sure if normal arithmetic has a standard name, I just call it "normal arithmetic" to distinguish it from all the other arithmetics. Integer arithmetic is just a subset.

I was focusing on integer arithmetic since that was sufficient to cover your original statement. The natural generalization is group or field arithmetic to define the operations, and real-number arithmetic (a specialization of field arithmetic) to define the field elements. The notation associated with integer arithmetic is the same as the notation associated with real-number arithmetic, since the integers are a subring of the real numbers.


To repeat my actual argument, I assert that, without prior clarification, almost no one uses the notation associated with real-number arithmetic in a way contrary to real-number arithmetic, which implies that almost no one uses it in a way contrary to integer arithmetic. Therefore, I refuse to entertain the notion that someone is actually referring to some system of arithmetic incompatible with real-number arithmetic when they use the notation associated with real-number arithmetic, unless they first clarify this.

You made this claim:

It's an assumption about the meaning of the question, not an assumption about the actual laws of arithmetic, which are not in question.

The "laws of arithmetic" that are relevant depend 100% on what arithmetic we are talking about, therefore it's imperative to know which arithmetic we are talking about. People assume it's the normal arithmetic and cannot possibly be any other one. There is zero doubt in their minds, and that's the problem I'm pointing out.

The "laws of arithmetic" that are relevant depend 100% on what arithmetic we are talking about, therefore it's imperative to know which arithmetic we are talking about.

Then please stop assuming that my uncountable usage of "the concept of arithmetic in general" in that sentence is secretly referring to your countable idea of "a single arithmetic". I've clarified my meaning twice now, I'd appreciate it if you actually responded to my argument instead of repeatedly hammering on that initial miscommunication.

People assume it's the normal arithmetic and cannot possibly be any other one. There is zero doubt in their minds, and that's the problem I'm pointing out.

Why should there be any doubt in their minds, if other systems of arithmetic are never denoted with that notation without prior clarification?


Or conversely, "I am Very Smart and like showing off to the sheeple".

Sorry, I'm not feeling particularly charitable this morning; I've had a pig of a headache for the past two days that painkillers are not touching, so my patience for "I like to parade my superiority" is running very low.

I haven't heard anything about maths or modularism or whatever the hell this is about. What I see is someone chortling about how smart they are.

Someone going around challenging people that "hey, are you sure 2+2=4?" is not opening minds, they're being an attention-seeking prat. There may be ways of shaking up ways of thinking, but boasting that you like puzzles most people get wrong (but I don't, I'm So Smart) as a great way to change minds (because I know the Right Way To Think, unlike these poor less smart people) is not that way.

It's being "this so deep", and we would do a lot more good to "jumpstart him further along the road" not by patting him on the head for "Did you think this up all by yourself, Junior?" but by pointing out that nobody likes a smartarse, and he needs to decide if he is more interested in getting people to challenge their assumptions, or just showing off My Big Brain.

For the author himself it's an interesting discovery.

It's not about what I think. From what I've seen, very few people know about abstract algebra, many don't know what modulo is, and the vast majority of those who do consider it an operation, not a completely new kind of arithmetic (as mathematicians do).

If this was general knowledge people wouldn't keep saying 2+2=4 as if it were an unequivocal fact, and at least someone would say "well, only under normal arithmetic". I've never heard somebody say that.

Can you find an article or someone seriously saying that 2+2 isn't necessarily 4? (other than woke activists decrying Western mathematics)

In the real world people do misinterpret, and they rarely (if ever) follow Grice's razor. They argue about what Trump said, rather than what Trump meant.

Semantics is in my opinion a huge problem in modern discourse. Russia claims what they did in Ukraine was a "special military operation", but other people claim it's a "war". Which is it? Meaning does matter.

Even in deep philosophical debates meaning is everything. A debate about "free will" entirely depends on what the opposing sides mean by "free will", and there are at least three different definitions.

You say the meaning of meaning is "not extremely deep", but does it have to be? People fail extremely basic problems of logic (90% fail the Wason selection task) and basic problems of probability (like the Monty Hall problem). I've also set up my own problems of probability-of-probability, and guess what? Most people get them wrong.
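
For instance, here's a quick Monty Hall simulation (a sketch; the function and variable names are mine) showing what most people refuse to believe: switching wins about two thirds of the time.

```python
import random

def monty_hall(trials=100_000):
    """Estimate win rates for staying vs. switching doors."""
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # The host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        # Switching means taking the one remaining closed door.
        switched = next(d for d in range(3) if d != pick and d != opened)
        stay_wins += (pick == car)
        switch_wins += (switched == car)
    return stay_wins / trials, switch_wins / trials

print(monty_hall())  # roughly (0.333, 0.667)
```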

Maybe some ideas are too simple for you, but what about other people who are perhaps not so intellectually gifted? My objective is to arrive at a concept that even people with an IQ of 80 would be able to understand, and I'm not sure they would understand what modular arithmetic even means (not the modulo operator), so perhaps even though it's "not extremely deep" for you, it's a challenge for them.

"2+2 = 4" is still actually true in Z4. The elements of Z4 aren't strictly integers; they're sets of integers. "0" in Z4 is defined as {...-12, -8, -4, 0, 4, 8, 12...} in Z and "2" is defined as {...-14, -10, -6, -2, 2, 6, 10, 14...} in Z. Which element from the first set (or second set) you use to denote the full set is pretty arbitrary; the relevant point is that adding two elements of the latter set will always produce an element of the former set - including, notably, 2 + 2 = 4.

"2+2 = 4" is still actually true in Z4.

But not in 𝐙/4𝐙 (integers modulo 4).

Um, that's what I meant by "Z4" (I couldn't remember and didn't bother with the exact definitional name). The element of Z/4Z that is usually denoted "0" is that set I noted above and can also be correctly denoted "4".

Do you have any source for that? All the sources I've found say the elements of the underlying set of integers modulo 4 are integers.

https://math.stackexchange.com/questions/1556009/quotient-ring-mathbbz-4-mathbbz

Somebody was confused when defining Z/4Z and not getting integers; every response notes that Z/4Z is strictly not a set of integers, but a set of sets of integers.

I could go look for a (presumably pirate) online textbook if you really want (I learned this from lectures in uni, not from a textbook), but it'd be a pain.

(The elements of the underlying rings of the quotient ring - Z and 4Z - are of course integers, but the elements of Z/4Z aren't.)

OK. But the answers claim that this defines a new way to say which elements are equal to which others, so 3=7. Therefore 4=0, and 2+2=0.

Yes. It is true that 2 + 2 = 0 in Z4; I've not disputed that. It's just also true that 2 + 2 = 4.

Yes. But the whole point of my post is to get people to reconsider what basic notions like 2+2 are.

And if I understand correctly, in ℤ/4ℤ there is no 2 in the main set, it's {...,-6,-2,2,6,...}, so it's actually {...,-6,-2,2,6,...}+{...,-6,-2,2,6,...}={...,-8,-4,0,4,8,...}, or something like that. 2 is just a simplification of the coset.

Second part is right, yes.
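
A quick check of that coset addition (same sketch-style truncation of the infinite sets as above):

```python
def coset(n, lo=-20, hi=20):
    # Members of n + 4Z inside a finite window, standing in for the infinite coset.
    return frozenset(m for m in range(lo, hi + 1) if (m - n) % 4 == 0)

def coset_add(a, b):
    # All pairwise sums land in a single coset, identified by its residue.
    residues = {(x + y) % 4 for x in a for y in b}
    assert len(residues) == 1  # addition of cosets is well-defined
    return coset(residues.pop())

two = coset(2)
print(coset_add(two, two) == coset(0))  # True: the sum is 0's coset
print(coset_add(two, two) == coset(4))  # also True: 0 and 4 name the same coset
```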


Z4 (i.e. ℤ₄, man I love Unicode) is just another name for ℤ/4ℤ. (ℤ₄ is a disfavored notation now, I think, because of the ambiguity with the p-adic integers for prime p, but that's how I learned it.)

The "/" symbol itself in the notation you're using is specifically referencing its construction as a quotient group, a set-of-cosets with the natural operation, as described above.

OK. I'm not a mathematician, I'm a programmer, but from what I can see the set {0,1,2,3} is isomorphic to ℤ/4ℤ, which means one can be mapped to the other and vice versa. The first element of ℤ/4ℤ is isomorphic to 0, but it's not 0, it's a coset. But the multiplicative group of integers modulo 4, (ℤ/4ℤ)*, is this isomorphic set, so it is {0,1,2,3} with integers being the members of the set. Correct?

Either way 2+2=0 can be true.

Either way 2+2=0 can be true.

Only because 4=0. So 2+2=4 is true, and the central claim of your substack post is wrong.

Correct?

No. Some basic mistakes:

  • Isomorphism requires preservation of structure, in our case the structure of the respective additions. That fails here: addition in {0,1,2,3} (with plain integer addition) works differently than addition in ℤ/4ℤ.

  • We don't say an element in a structure is isomorphic to one in another.

  • (ℤ/4ℤ)* is an entirely different structure. For starters, it contains only the two units, 1 and 3. (The * signifies we keep only the elements with multiplicative inverses mod 4, which excludes both 0 and 2.)
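
A quick check of that last point (a sketch):

```python
# The unit group (Z/4Z)* consists of the residues invertible mod 4,
# i.e. those coprime to 4.
from math import gcd

units = [r for r in range(4) if gcd(r, 4) == 1]
print(units)  # [1, 3] -- two elements; 2 has no multiplicative inverse mod 4
```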

Only because 4=0.

So 2+2=4=0="not what you think". Therefore the claim of my post is true.

But 0 is what we think, because 0 is 4. You're just changing the representation. It's like saying "You think 2+2 is '4', but it's actually 'four'".

Also, the claim in your post was

So there you have it: 2+2 is not necessarily 4.

which is wrong whether or not 2+2=0 can be true.

But 0 is what we think, because 0 is 4.

Nobody thinks that 0 is 4.

Nevertheless it is the case. We think 4, and 4 is 0, therefore "0 = not what you think" isn't true.


but if I ask you what’s 22:00+2:00, you are likely not going to answer 24:00 (which isn’t a thing)

Yes it is... that's a perfectly valid way of expressing time.

https://en.wikipedia.org/wiki/24-hour_clock#Midnight_00:00_and_24:00

2+2=4 is always true, but "2+2=4" does not always mean 2+2=4.

That's kind of an accurate summary. But doesn't that apply everywhere in modern discourse? People assume that Kanye West said X, but "X" doesn't necessarily mean X.

Words like "censorship", "racism", "war", "vaccine" are used in different ways all the time, and when people with agendas use them, they feel 100% certain there is only one meaning.

So "censorship" doesn't always mean censorship.

if I ask you what’s 22:00+2:00, you are likely not going to answer 24:00

I read that as "Twenty-two minutes plus two minutes", which is obviously a duration of 24 minutes. Even if I had read it as hours, there's nothing wrong with 24 hours, or even 30, 48, or 100. (A thousand hours might be pushing it, though.) Also, I will refer to midnight as 24:00:00. If it was three hours, then I wouldn't answer 1:00 like you're suggesting either. I'd say "01:00 the next day" because time isn't truly modular, it's just mixed base, and convention says that we omit the most-significant parts when possible.
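
To make the "mixed base" point concrete, a minimal sketch (plain Python; hours only, ignoring minutes and time zones):

```python
# Sketch: time-of-day as mixed-base arithmetic with an explicit day carry,
# rather than silently wrapping modulo 24.
def add_hours(hour, delta):
    days_later, new_hour = divmod(hour + delta, 24)
    return days_later, new_hour

print(add_hours(22, 2))  # (1, 0): 00:00 one day later, a.k.a. "24:00"
print(add_hours(22, 3))  # (1, 1): 01:00 the next day
```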

Another claim in the same vein is that lines with constant distance (i.e. parallel lines) never intersect; this again makes an assumption: Euclidean geometry.

I'm not sure about more esoteric ones, but in spherical and hyperbolic geometries pairs of lines with constant distance simply don't exist. Lines converge or diverge (for spherical/hyperbolic, respectively), and the set of points that are a constant distance from a given line is a curve in both cases: you can either have a line or you can have constant distance.

In fact, proving that lines with constant distance never intersect is utterly trivial if you assume they exist:

  • intersection is when the distance is zero

  • the constant distance is nonzero

  • therefore, they don't intersect.

If it was three hours, then I wouldn't answer 1:00 like you're suggesting either. I'd say "01:00 the next day" because time isn't truly modular

But your clock would read 01:00.

We use this concept in programming all the time. If the week ends on Sunday we don't say that the day after that is Monday of the next week, it's just Monday (this doesn't change if the week ends on Saturday). In fact, many people consider Friday 23:00+02:00 to still be Friday night.
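
e.g. the usual weekday idiom (a sketch; the Monday-first numbering is just my assumption):

```python
# Weekdays as residues mod 7: only the day is stored, the week is not.
DAYS = ["Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"]

def day_after(day_index, offset=1):
    return (day_index + offset) % 7

print(DAYS[day_after(6)])  # Sunday + 1 -> "Monday"
```

The int stores only the weekday; which week it is was never part of the information.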

I'm not sure about more esoteric ones, but in spherical and hyperbolic geometries pairs of lines with constant distance simply don't exist.

Yes, I meant "two straight lines extended indefinitely in a two-dimensional plane that are both perpendicular to a third line", like in this picture, which are kind of parallel. The point is that the standard concept of "parallel" more or less only exists in Euclidean geometry.

But your clock would read 01:00.

We use this concept in programming all the time. If the week ends on Sunday we don't say that the day after that is Monday of the next week, it's just Monday

That's merely convention, omitting information that can be derived from context for brevity. If you want to make a formal argument, you need to include that information again. Everyone is aware Monday is next week; that's why you don't spell it out if it isn't relevant, but if you're e.g. scheduling business on a weekly basis, you might have to say "Tomorrow is Monday, which is in the next calendar week".

No. In programming it's literally impossible to include information that wasn't meant to be included. If you have an int to store the weekday, that's all the information stored in that int.

Not having all the information is a huge problem in programming, and historically it has been a big headache to deal with dates and time.

But if a program doesn't need any information other than the weekday, it may use that and nothing more.

If you're omitting the information of which week it is because it's not relevant, you're omitting information, and that means you can't use the result to support your argument, because it's missing information.

Information is always limited. Humans and all rational agents always operate with limited information. There is no omission.

In our case, information isn't just limited, but artificially limited, i.e. omitted. The information is indeed still available; it just has to be derived from context. We both know Monday after Sunday is next week.

You're making an argument based on information you know is incomplete, and the missing information invalidates it. Don't do that.

In our case, information isn't just limited, but artificially limited, i.e. omitted.

Wrong. Information by its very nature is limited. Nobody is "artificially" limiting the information that can fit in one bit; one bit can only fit one bit of information. Period.

This is the foundation of information theory.

The information is indeed still available; it just has to be derived from context.

There is no context attached to information. One bit is one bit. You can try to do some clever tricks with two bits, or four bits, but at the end of the day the information is the information.

We both know Monday after Sunday is next week.

No, we don't. You are assuming where the week starts.

You're making an argument based on information you know is incomplete, and the missing information invalidates it.

All information is incomplete.

One bit fits one bit of data. This can be less than one bit of information (e.g. if I encode each 0 or 1 in an equal-probability channel as 000000 or 111111 I get 1/6 of a bit of information each, or if a non-uniform data source with probability p=1/3 of the next bit being 1 gives me an expected 0 then I get 0.58 bits out of that) or it can be more than one bit (if I see an unexpected 1 instead then that's 1.58 bits of information).

If it's a noisy channel then "000000" might not be very "artificial", that might be the safest way to communicate. If it's not a noisy channel then I've just wasted 5 bits; the limitation to 6 bits might have been natural, but the further limitation to 1 was not. Natural limitations and artificial limitations are not opposites; you can have both at once.
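
Those numbers are just surprisal, −log₂(p); a minimal sketch:

```python
from math import log2

def surprisal(p):
    """Information, in bits, of observing an event of probability p."""
    return -log2(p)

print(surprisal(2/3))  # ~0.585 bits: the expected 0 from a p(1)=1/3 source
print(surprisal(1/3))  # ~1.585 bits: the unexpected 1
```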



Wrong. Information by its very nature is limited. Nobody is "artificially" limiting the information that can fit in one bit; one bit can only fit one bit of information. Period.

That's a red herring. We're not talking about bits. We're talking about the information we have about your example, which was given in English.

No, we don't. You are assuming where the week starts.

Liar. The end of the week being Sunday was included in your description of the example.

All information is incomplete.

Not all information is incomplete in the sense that reasoning from it leads to false conclusions. Stop defending your fallacious argument.
