
Friday Fun Thread for May 1, 2026

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), and it is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.


I can no longer look at Nick Bostrom the same way after finding out he’s a halfer in the Sleeping Beauty Problem.

In fairness, it trips up a lot of people, probably including you. Last time we discussed it, you didn't come back to explain how your position worked, but my best interpretation of it was this:

Alice is smart enough to distinguish between "the probability that Bob observes an outcome" and "the probability of the coin flip, itself"... but too stupid to distinguish between "the probability that I, Alice, observe an outcome" and "the probability of the coin flip, itself"?

(This is for Variant 1:) The number Alice should put into the computer is not her credence that, on her side of the experiment right now, the coin came up tails. Alice should put p, the weight of the coin, into the computer. All of the weird anthropic probability shifting occurs only because Alice doesn't know what day it is. Because of the way the computer works, it will only transmit the message to Bob if it is Monday. That is why the probability Alice should put into the computer is the naive weight of the coin, not the anthropically shifted probability at which she should bet for herself.
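A quick Monte Carlo sketch might make the distinction concrete. Since the variants aren't spelled out in-thread, I'm assuming the standard Sleeping Beauty setup here (tails: Alice is woken Monday and Tuesday; heads: Monday only), plus the Monday-only computer described above. The three quantities come apart cleanly:

```python
# Sketch under assumed setup: coin of weight p lands tails with probability p.
# Tails -> Alice woken Monday and Tuesday; heads -> Monday only.
# The computer forwards exactly one message to Bob per run (Monday only).
import random

def simulate(p=0.5, runs=100_000, seed=0):
    rng = random.Random(seed)
    tails_flips = 0
    awakenings = tails_awakenings = 0
    messages = tails_messages = 0
    for _ in range(runs):
        tails = rng.random() < p
        tails_flips += tails
        days = 2 if tails else 1           # Alice's awakenings this run
        awakenings += days
        tails_awakenings += days if tails else 0
        messages += 1                      # Bob gets exactly one message per run
        tails_messages += tails
    return (tails_flips / runs,            # P(tails): the coin flip itself
            tails_awakenings / awakenings, # P(tails | Alice is awake right now)
            tails_messages / messages)     # P(tails | Bob receives a message)

coin, alice, bob = simulate()
print(coin, alice, bob)   # ≈ 0.5, ≈ 2/3, ≈ 0.5
```

For a fair coin, Alice's per-awakening credence drifts to roughly 2/3, while both the raw flip frequency and Bob's per-message frequency sit at roughly 1/2, which is why the naive weight is the right number to feed the computer.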

How about in Variant 2? Should Alice do some weird anthropic probability shifting for what she puts into the computer for Bob? Should she do two different weird anthropic probability shifting things, one for herself and a different one for Bob?

...wouldn't it be sooooo much simpler to just say, "Alice is capable of distinguishing between the probability of the coin flip itself, the probability that she observes an outcome, and the probability that Bob observes an outcome," rather than some conceptually messy story about her simultaneously anthropically shifting probability for Bob in the direction opposite her own? Like, what do you even mean by "anthropically probability shifting" now? I thought it was supposed to be something about updating a belief on the coin flip itself, but it seems like you've already admitted that that is not happening. She still has "the naive weight of the coin". She still knows that probability as a distinct probability.
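To put the "she holds both numbers at once" point in closed form (again assuming the standard setup, where tails yields two awakenings and heads one): among all awakenings, the tails fraction is 2p / (2p + (1 - p)) = 2p / (1 + p), and nothing stops Alice from tracking this alongside the raw weight p.

```python
# Closed-form sketch, assuming the standard two-awakening setup.
def coin_credence(p):
    # "the naive weight of the coin" -- the flip itself
    return p

def awakening_credence(p):
    # tails produces 2 awakenings, heads 1, so the tails share of
    # awakenings is 2p / (2p + (1 - p)) = 2p / (1 + p)
    return 2 * p / (1 + p)

print(coin_credence(0.5))       # prints 0.5
print(awakening_credence(0.5))  # prints 0.6666666666666666 (= 2/3)
```

Two distinct, simultaneously held probabilities: no contradiction, and no need for any opposite-direction shifting on Bob's behalf.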