Recursive thinking, Newcomb's problem, and free will

felipec.substack.com

Newcomb's problem splits people 50/50 into two camps, but the interesting thing is that both sides think the answer is obvious, and both sides think the other side is being silly. When I made a video criticizing Veritasium's video "This Paradox Splits Smart People 50/50," I received a ton of feedback, particularly from the two-box camp, and I simply could not convince anyone of why they were wrong.

That led me to believe there must be some cognitive trap at play: someone must not be seeing something clearly. After a ton of debates, reading the literature, considering similar problems, discussing with LLMs, and just thinking deeply, I believe the core of the problem is recursive thinking.

Some people are fluent in recursivity, and for them certain kinds of problems are obvious, but not everyone thinks the same way.

My essay touches on Newcomb's problem, but the real focus is why some people are predisposed to a certain choice. I contend that free will, determinism, and the sense of self all affect Newcomb's problem, that fluency in recursivity predisposes certain views, and in particular that a proper understanding of embedded agency must predispose a particular (correct) choice.

I do not see how any of this is not obvious, but that's part of the problem: it's likely because my prior commitments are not the same as those of people who pick two boxes. Still, I would like to hear whether any two-boxer can point out a flaw in my reasoning.

No, it doesn't, because my decision doesn't retroactively change what kind of person I am. The causality goes in the other direction.

Basically, depending on what kind of person I am, Omega offers me a different game.

Your choice reveals what kind of person you are, which Omega already knew. If you didn't know what you were going to choose ahead of time, that's a mark of your ignorance, not Omega's.

I.e., my choice doesn't change anything; it just "reveals" information already known to the relevant player, Omega.

What I know or don't know ahead of time doesn't matter, because I'm not making a decision ahead of time.

You can also just model it as Omega knowing whether you're smart or lucky enough to come up with the right answer to get the $1M. If you pick the right answer, you get the $1M; if you don't, then you don't. It's a bit of a brain twister, but it works out.
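That model can be sketched as a tiny payoff function. This is a minimal sketch, assuming the standard idealized setup where Omega's prediction always matches the agent's actual choice; the function and label names are mine, not from the original discussion.

```python
# Newcomb's problem with a perfect predictor (Omega) -- a sketch.
# Box A always holds $1,000; box B holds $1,000,000 only if Omega
# predicted the agent would take box B alone ("one-box").

def payoff(choice: str, prediction: str) -> int:
    """Return the payout for a given choice and Omega's prediction."""
    box_a = 1_000
    box_b = 1_000_000 if prediction == "one-box" else 0
    if choice == "one-box":
        return box_b
    return box_a + box_b  # "two-box": take the contents of both boxes

# With a perfect predictor, prediction == choice, so:
print(payoff("one-box", "one-box"))  # one-boxers get 1000000
print(payoff("two-box", "two-box"))  # two-boxers get 1000
```

The "brain twister" is visible in the mismatched cases: payoff("two-box", "one-box") would be $1,001,000, but under the perfect-predictor assumption that case never occurs, which is exactly why the two camps talk past each other.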