Recursive thinking, Newcomb's problem, and free will

felipec.substack.com

Newcomb's problem splits people 50/50 into two camps, but the interesting thing is that both sides think the answer is obvious, and each side thinks the other is being silly. When I made a video criticizing Veritasium's video This Paradox Splits Smart People 50/50, I received a ton of feedback, particularly from the two-box camp, and I simply could not convince anyone of why they were wrong.

That led me to believe there must be some cognitive trap at play: someone must not be seeing something clearly. After a ton of debates, reading the literature, considering similar problems, discussing with LLMs, and just thinking deeply, I believe the core of the problem is recursive thinking.

Some people are fluent in recursivity, and for them certain kinds of problems are obvious, but not everyone thinks the same way.

My essay touches on Newcomb's problem, but the real focus is on why some people are predisposed to a certain choice. I contend that free will, determinism, and the sense of self all affect Newcomb's problem, that fluency in recursivity predisposes certain views, and in particular that a proper understanding of embedded agency must predispose a particular (correct) choice.

I do not see how any of this is not obvious, but that's part of the problem: it's likely due to my prior commitments not being the same as those of people who pick two boxes. But I would like to hear if any two-boxer can point out a flaw in my reasoning.

Two-boxers fundamentally disbelieve the premise - they refuse to engage the actual hypothetical. The strict domination idea, that 'once you're in the room, the money is already there', discards the very thing the problem is working with - the predictor predicts you. If you are thinking along two-box lines, then the predictor will leave one box empty. Once you have entered the room, the game is already over. The prediction has already been made, and if you're a two-boxer, you've already lost.

You have to realize that the only way to win is for the predictor to think you are going to pick one box, and for the predictor to think you are going to one-box, since it is extraordinarily accurate, you have to be a one-boxer. You can't solemnly resolve to be a one-boxer while secretly planning to be a two-boxer, because the predictor will pick up on that. You have to actually have the thought patterns of a one-boxer. You have to believe one-boxing is the superior strategy.

It's not 'irrational' - it's playing the game. In this specific case, because of the predictor's stipulated accuracy, one-boxing is the strategy that wins. It doesn't matter how the accuracy comes about - a lack of free will, time travel, hand-waving woo - it's there. The experiment depends upon it, and discarding it is foolish.
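The stipulated accuracy makes the payoff comparison concrete. A minimal sketch, assuming the standard $1,000,000 / $1,000 amounts and an illustrative 99% accuracy (the problem usually says "almost certainly correct"; the exact figure here is my assumption):

```python
# Expected payoffs in Newcomb's problem, assuming the standard amounts
# ($1,000,000 in the opaque box, $1,000 in the transparent one) and an
# illustrative predictor accuracy of 0.99 (this number is an assumption).
accuracy = 0.99

# If you one-box, the predictor most likely predicted one-boxing,
# so the opaque box most likely contains the million.
ev_one_box = accuracy * 1_000_000

# If you two-box, the predictor most likely predicted two-boxing,
# leaving the opaque box empty; you still get the guaranteed $1,000.
ev_two_box = 1_000 + (1 - accuracy) * 1_000_000

print(f"one-box: ${ev_one_box:,.0f}")  # one-box: $990,000
print(f"two-box: ${ev_two_box:,.0f}")  # two-box: $11,000
```

Any accuracy above about 50.05% makes one-boxing come out ahead on expectation, which is why the argument does not hinge on the predictor being perfect.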

Two-boxers fundamentally disbelieve the premise - refuse to engage the actual hypothetical.

Of course, that was my conclusion as well. But the question is why.

I've discussed the problem with many two-boxers and all of them dismiss the high accuracy of the predictor on the basis that your choice doesn't affect the prediction. But they never bother to explain why that's relevant.

Even Robert Nozick in the original paper says that if the decision doesn't affect the final state, then one should ignore the accuracy of the predictor. Why? Because that's what he was "led to believe".

There is no reason to discount the correlation just because two-boxers don't see a direct causal link. That's why I devised the sunscreen problem: to show why it's irrational to discount a correlation on the basis of no apparent direct causal link.

In my experience most 1-boxers are 1-boxers because they implicitly believe in backwards causation. 2-boxers are people who realize that backwards causation isn't possible. But once you realize pre-commitment is an option, you should be back at 1-box. If a 2-boxer has heard the argument for pre-commitment and remains a 2-boxer, then I don't understand that, but I think most people just don't even get to that point because they're still stuck on the backwards causation part.

To be honest I don't really see what recursion has to do with it. It just comes down to "be the kind of person the oracle predicts would pick 1-box; the oracle is smart and accurate enough that you shouldn't try to trick it".

In my experience most 1-boxers are 1-boxers because they implicitly believe in backwards causation.

That is simply not true. Two-boxers make that claim with zero evidence. I've been told I must believe in backwards causation because I'm a one-boxer.

Let me be clear: I do not believe in backwards causation, and I'm a one-boxer.

To be honest I don't really see what recursion has to do with it.

Then why do you insist on backwards causation? If a and b are correlated, that's all you need to know to make an informed decision; no backwards causation is needed.
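The point that correlation alone suffices can be sketched with a small simulation. The predictor below is merely correlated with the agent's strategy; nothing causes anything backwards, yet fixed one-boxers still walk away with more on average. The 90% accuracy and standard payoffs are illustrative assumptions:

```python
import random

random.seed(0)

def play(strategy, accuracy=0.9, trials=100_000):
    """Average winnings for a fixed strategy against a predictor that is
    simply correlated with the choice - no backwards causation anywhere."""
    total = 0
    for _ in range(trials):
        # The predictor guesses the agent's strategy with the given accuracy.
        if strategy == "one":
            predicted_one_box = random.random() < accuracy
        else:
            predicted_one_box = random.random() >= accuracy
        # The opaque box is filled based only on the prediction.
        opaque = 1_000_000 if predicted_one_box else 0
        # One-boxers take the opaque box; two-boxers also take the $1,000.
        total += opaque if strategy == "one" else opaque + 1_000
    return total / trials

print(play("one"))  # ≈ 900,000
print(play("two"))  # ≈ 101,000
```

The simulation treats the prediction as settled before the choice is scored, so there is no causal channel from choice to box contents; the correlation alone is what makes one-boxing pay.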

I think you misunderstood my comment. I'm also a 1-boxer and I don't think you need to believe in backwards-causation to be a 1-boxer. I just think a lot of 1-boxers do.

I'm just trying to explain why I think 2-boxers are 2-boxers. They think "backwards causation is wrong, so 1-boxing is wrong". Backwards causation is indeed wrong, but that doesn't mean 1-boxing is wrong.

I just think a lot of 1-boxers do.

I don't think so. I haven't seen a single one-boxer make that claim.

I'm just trying to explain why I think 2-boxers are 2-boxers. They think "backwards causation is wrong so 1-boxing is wrong".

Yes, that is certainly one of the rationales of two-boxers. But that doesn't mean many one-boxers actually believe it.

But then when you realize pre-commitment is an option you should be back at 1-box.

But is pre-commitment an option? The problem as usually stated stipulates that by the time Omega has finished explaining the rules to you, the contents of the box are already determined. It's too late to pre-commit.

P.S.: Should I be worried for myself because I know what your name refers to?

Even if it's too late for pre-commitment, if you're the kind of person who decides it's not too late and only opens the 1 box, it'll have the money.

By pre-commit I just mean that, for the general class of Newcomb-like problems, you decide you will pick the 1-box option. So as long as you are aware there are problems like this, you can do it.

And maybe :)