
felipec

unbelief

1 follower   follows 0 users  
joined 2022 November 04 19:55:17 UTC
Verified Email

User ID: 1796


If 2+2=4 is a statement about elements of a modular ring, it holds true.

The integers modulo 4 (𝐙/4𝐙) form a modular ring which does not contain the number 4.
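A minimal sketch of this point (my own illustration, not part of the original exchange): the elements of 𝐙/4𝐙 are the residues 0 through 3, addition wraps around, and 4 is simply not one of the elements.

```python
# Elements of Z/4Z are the residues {0, 1, 2, 3}; addition wraps modulo 4.
MOD = 4
elements = set(range(MOD))

def add(a: int, b: int) -> int:
    """Addition in the ring of integers modulo 4."""
    return (a + b) % MOD

print(add(2, 2))        # 0, not 4
print(4 in elements)    # False: 4 is not an element of Z/4Z
```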

I think you have a fundamental misunderstanding of what Bertrand Russell was doing when he proved 1+1=2.

No, I don't. In mathematics the word "proof" has a very precise meaning, and anything without a "proof" is held as tentative (i.e. not necessarily true), for example a conjecture.

This entirely depends on the set of axioms you choose as a foundation, and you certainly could choose 1+1=2 as one of those axioms, in which case it's an assumption that doesn't need to be substantiated. But if you get rid of that axiom, then 1+1=2 is held as tentative and thus lacking proof.

much in the same way that the point of "coding Hello World in assembly" is not "coding Hello World" but "in assembly."

You are making a very obvious assumption there.

Russell was showing that you could lower the "basement" of mathematics and consider it as starting from another foundation deeper down, from which you could construct all mathematical knowledge, and to do that he had to build towards mathematics where it already stood.

I know.

Another way to think about it is that he tried to refactor the 1+1=2 axiom into more fundamental axioms. But this work presupposes that an axiomatic system that doesn't have 1+1=2 as an axiom is tenable. If such a system exists (which I think Bertrand Russell pretty much proved), that means 1+1=2 does not need to be assumed to be true; it can be inferred.

I call "not assuming" "doubt", but it doesn't matter what you call it; the fact is that to write Principia Mathematica, Bertrand Russell had to not assume 1+1=2.

Finally, and most importantly, law in general and international law in particular is much less clearly defined and broadly agreed upon than simple arithmetic over the natural numbers.

This supports my argument. If I demonstrate that a rational agent should doubt something very "clearly defined" such as 2+2=4, then it logically follows that something much less clearly defined should be doubted as well.

if I say “Waffles are better than pancakes, that's as clear as the sky is blue”, would you start arguing that the sky isn't always blue?

Yes. I start with the claims that are easier to dismantle, because I know that people virtually never doubt their beliefs in real time. It would be very hard for me to convince that person that waffles are not necessarily better than pancakes, but it would be easy to dismantle the auxiliary claim.

This person may attempt to find another more unequivocally true auxiliary claim, but I would easily dismantle that too. And sooner or later this person would be forced to realize that it's not easy to find an unequivocally true claim. And if it's not easy to find an unequivocally true claim, perhaps the unequivocally true claim that waffles are better than pancakes is not so unequivocally true.

If a person says "Bob is as racist as Alice", and I show that Alice is not racist, and they say "OK, Bob is as racist as Mary", and I show Mary is not racist, and then "OK, Bob is as racist as Linda", and Linda isn't racist either. Wouldn't it make sense to doubt whether Bob is actually racist?

Using metaphors to tackle deep philosophical problems isn't even fringe. The notion of a black swan is nowadays commonly used to explain that the fact that something has never happened before is not a valid reason to think it will never happen in the future. It tackles the deep philosophical problem of induction.

Instead of saying "as clear as the sky is blue", people in the past used to say "as impossible as a black swan". To say "actually, the fact that we haven't seen a black swan doesn't necessarily mean black swans don't exist" is not pedantry, it's in fact valid reasoning, a deep philosophical notion (problem of induction), and something that should have made people doubt their 100% certainty on "impossible" events.

Discrete math is as basic as it gets, it’s first semester CS/Electrical/Math/Physics.

Of university. You were taught math before that, weren't you?

It's not "basic math".

Saying logic isn’t part of math but has “a complicated relationship” with math… again, I don’t see what you’re getting at.

That your statement is not quite correct.

Again, the point is that it is convention to assume the common interpretation/context of a statement when we assess its truth value.

"Convention" literally means what is usually done, not what is always done.

Can we consider the possibility that all of this was vaporware?

  1. Most people don't know what FTX is

  2. Most people have no idea who SBF is

  3. Most people have never heard of EA

Scott Alexander seems to be devastated by something most people didn't even know was a thing, much less an important thing.

But FTX is not crypto. FTX was a mixture of the old and the new, which is precisely why it failed.

The whole point of crypto is to be completely detached from old systems, so there's zero attack surface for governments. If you use pure crypto (no exchange), then you are immune to these kinds of failures.

"2+2 = 4" is still actually true in Z4.

But not in 𝐙/4𝐙 (integers modulo 4).

I think you kinda underestimate how easy it is to manipulate a grown adult outside of their area of expertise.

Completely agree. I've devoted my previous posts to trying to get people to doubt things they assume are 100% certain, and the end result is that no one wants to do that.

It seems pretty clear to me that even the most rational and intelligent people on the planet will believe whatever they want to believe as long as it feels good.

If the speaker claimed that 2+2=4 is unequivocally true, he/she is wrong.

In our case, information isn't just limited, but artificially limited, i.e. omitted.

Wrong. Information by its very nature is limited. Nobody is "artificially" limiting the information that can fit in one bit, one bit can only fit one bit of information. Period.

This is the foundation of information theory.
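To make the capacity limit concrete (a minimal sketch of my own, using standard information theory rather than anything from the original thread): a message drawn from N equally likely alternatives needs at least log2(N) bits, so one bit can distinguish exactly two messages and nothing more.

```python
import math

def bits_needed(n_alternatives: int) -> int:
    """Minimum whole number of bits to distinguish n equally likely alternatives."""
    return math.ceil(math.log2(n_alternatives))

print(bits_needed(2))  # one bit distinguishes exactly two messages
print(bits_needed(7))  # identifying one of seven weekdays needs three bits, not one
```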

The information is indeed still available, just by deriving it from context.

There is no context attached to information. One bit is one bit. You can try to do some clever tricks with two bits, or four bits, but at the end of the day the information is the information.

We both know monday after sunday is next week.

No, we don't. You are assuming where the week starts.

You're making an argument based on information you know is incomplete, and the missing information invalidates it.

All information is incomplete.

Not doubt about math or fundamental logic.

No? So nobody in mathematics doubts the Zermelo–Fraenkel set theory axiomatic system?

An engineer who doubts 1+1=2 will never build any bridges

Who said an engineer should doubt 1+1=2?

Do you have any source for that? All the sources I've found say the elements of the underlying set of integers modulo 4 are integers.

It does matter what you call it

I did not say it doesn't matter what I call it, I said it doesn't matter what you call it.

And it seems pretty clear to me you are being intentionally obtuse. The purpose of me communicating with you is that you understand what I mean; it doesn't matter how. For any given idea I have there's a set of words I could use to transmit that idea, and any word I use has multiple meanings, but as long as you pick the meaning that correctly matches the idea I want to transmit, we are communicating effectively. The "most common meaning" is completely irrelevant. The most common meaning of the word "get" is "to gain possession of", but if I say "do you get what I mean", I'm not using the most common meaning, and I don't have to.

I used multiple words, terms, and an explanation for you to understand what I meant, and if you understand it, I don't personally care what word you use to name that idea.

To assume(everyday) something means approximately to act as if that something were true, without feeling the need to personally verify it for oneself.

To assume(logic) something means to accept it as an axiom of your system (although potentially a provisional one) such that it can be used to construct further statements and the idea of "verifying" it doesn't make much sense.

I don't see any difference. If you "assume X" it means you hold X as true without any justification, evidence, verification, or inference.

In other words, even though he didn't assume(logic) that 1+1=2, his assumption(everyday) that 1+1=2 would be so strong as to reverse all the logical implications he had been working on.

I disagree. Every day he saw evidence that 1+1=2, so it would be reasonable to believe (not assume) that this was always true. Additionally he saw no way it could not be true, but he was rational enough to know this was not a good reason to assume it was true, as this would have been an argument from incredulity fallacy.

Maybe he did assume that 1+1=2 in everyday life, but you cannot know that unless you could read his mind.

This is a converse error fallacy. If I assume a chair is sound, I would sit down without checking it, but sitting down without checking doesn't necessarily mean I assumed it was sound.

In general rationalists try to not assume anything.


If I'm wrong about any of those I will be happy to be corrected.

I know helloworld is a nearly useless program in the vast majority of contexts, but not all, and I know that people frequently practice new programming languages by writing programs in them with little regard for the practical use of those programs, but not all people who write helloworld programs are practicing new programming languages.

You are assuming the general case. I can easily imagine somebody in the 1970s developing for a new hardware architecture for which there are no other compilers available trying to test that any software runs at all. In fact, I can even imagine somebody doing that today for a new hardware architecture like RISC-V.

And once again, the point of these examples is not to "deliberately waste people's time", it's to show they are making assumptions even when they can't possibly see how they could be making one.

Every time I tell somebody that they are making an assumption they disagree, and every time I point out the assumption they were making they come up with a rationalization, like "you tricked me", or "that's very unlikely", or "everyone would have assumed the same". It's never "actually you are right, I didn't think about that".

This can be less than one bit

Yes, but it cannot be more. The point here is that information is not being "omitted", there's always limits to how much information can be transmitted, stored, processed, etc.

And in general programmers try not to use more information than necessary. This is not unique to this field; it's just easier to see here because the information can be precisely measured.
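To make "less than one bit" concrete (a minimal sketch of my own, assuming Shannon entropy as the measure): a biased yes/no answer carries less than one bit on average, and the amount can be computed exactly.

```python
import math

def entropy(p: float) -> float:
    """Shannon entropy in bits of a binary source with P(yes) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(entropy(0.5))  # fair coin: exactly 1 bit per answer
print(entropy(0.9))  # biased source: about 0.469 bits per answer
```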

We're talking about the information we have about your example, which was given in english.

The information in English is limited too. Information is always limited.

"Tomorrow is Monday" has limited information.

Liar. The end of the week being Sunday was included in your description of the example.

This was my example:

If the week ends on Sunday we don't say that the day after that is Monday of the next week; it's just Monday (and this doesn't change if the week ends on Saturday)

The case where the week ends on Saturday is included. If today is Sunday we say:

  1. Tomorrow is Monday (if the week ends on Sunday)

  2. Tomorrow is Monday (if the week ends on Saturday)

My example was crystal clear in explaining that the day the week ends on does not matter in describing what day comes after Sunday. This information is not available from the phrase "tomorrow is Monday".

You claim the information is available because "we both know" when the week ends. No, we don't, because I don't. If you want to claim you know when the week ends from the phrase "tomorrow is Monday", go ahead; I do not.

And it doesn't seem to me you are engaging with my argument.

I am not 100% certain it's impossible for someone (including myself) to be mistaken about the definitions or meanings of commonly used words or mathematical symbols.

That was not my claim. Please read my claim and then answer my question.

Conversely, if they say "Bob is as racist as Alice, because he's the author of the bobracial supremacy manifesto", pointing out Alice isn't racist just distracts from the point at hand. Yes, it's a bad metaphor, but the point stands.

Yes, but the premise of this line of thought is precisely the opposite: it's not easy to prove Bob isn't racist; on the other hand, it's extremely easy to prove Alice isn't racist.

I have refuted your argument that 2+2=4 is not unequivocally true, but I'm still willing to discuss the point you were trying to make without forcing you to come up with a new example.

But discussing is not accepting. You are arguing that Bob is a racist, but you are nowhere near accepting the possibility that he might not be.

You are not willing to accept that Alice might not be a racist, and Bob even less. Which proves my point.

Are you 100% certain it was a banality?

It's not really that clever, that's what I am saying.

Who says it has to be clever?

You said, "The 'laws of arithmetic' that are relevant depend 100% on what arithmetic we are talking about," which is only meaningful under your usage of "laws of arithmetic" and does not apply to the term as I meant it in my original comment.

No it doesn't.

The "laws of arithmetic" after your explanation mean the "laws of all the different arithmetics" which you asked me to not consider as uncountable, which I didn't. You yourself said that the "laws of all the different arithmetics" is not a single set of rules that apply to all arithmetics, therefore a subset of the "laws of all the different arithmetics" may apply to a specific arithmetic, but not necessarily to another different arithmetic.

Therefore my phrase "The 'laws of arithmetic' ('laws of all the different arithmetics') that are relevant depend 100% on what arithmetic we are talking about" is 100% consistent with your usage of the term.


To rephrase that, communication relies on at least some terms being commonly understood, since otherwise you'd reach an infinite regress.

This is what you said:

This is because I have no evidence that any reasonable person would use the notation associated with integer arithmetic in such a way, and without such evidence, there is no choice but to make assumptions of terms ordinarily having their plain meanings, to avoid an infinite regress of definitions used to clarify definitions.

Having no evidence is no excuse. Having no evidence of black swans doesn't imply that black swans cannot exist, nor is it a valid reason to assume that all swans are white.

You do have a choice: don't make assumptions.

Symbols do not have a single meaning. If I say "run a marathon" you may think about participating in a marathon, but it could mean managing one. Nobody sees the word "run" and assumes a single meaning; the meaning always depends on the context. Intelligent beings must consider different meanings, and this is precisely why computers are not considered very intelligent: they can't consider multiple meanings the way a human does. If language were as simple as you paint it, computers would have solved it decades ago.

It's not that linear and simple, you do have the choice to consider multiple meanings of the word "run".


Indeed, how do you know that your interlocutors are "100% certain" that they know what you mean by "2 + 2"?

Because they use it as a clear example of something unequivocally true.

I cannot imagine how crypto could work without exchanges.

The exchanges don't need to work like they do now. The exchange could transfer the bitcoin to an external wallet; it doesn't need to be a wallet controlled by the exchange.

It's perfectly doable to buy bitcoins with Binance, and then transfer those to an external wallet you have 100% control of. That way if Binance disappears you still have your bitcoins.

Yes, but this is perfectly consistent with FTX being a Ponzi scheme all along. It was never an important thing for humanity, some people were just duped into believing it was.

Yes, but this is what happens when you are conned. You feel betrayed for trusting someone or something only to realize that your bullshit detector isn't as good as you thought it was. The cognitive dissonance when you are forced to change paradigms is a personal struggle, but not something that changes the world in any way, only your perception of the world.

The goodness in the world isn't going to diminish because effective altruism turned out to be bullshit, only Scott's belief in the goodness in the world.

So does the statement 2+2=0.

Doubt about axioms is basically mathematical philosophy.

So it's essential.

So you agree doubt about everything is not reasonable in every field?

Depends on what you mean by "doubt". If you mean <100% certainty, then no. If you mean 50% certainty, then yes.