Why couldn't you just nest them? If I have a lottery ticket that pays off in other lottery tickets which finally pay money, then I have a likelihood over likelihoods. You could of course collapse this into a single average likelihood, but sometimes the full distribution is useful information. As another example, if I have a game-theory situation where one player has beliefs over the beliefs of the other player, I have probabilities over probabilities.
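
To illustrate, here is a minimal sketch in Python (the beliefs and numbers are made up for illustration), modelling a "probability over probabilities" as a belief over possible coin biases: two beliefs can share the same average probability of heads while differing completely in spread, so reporting only the average loses information.

```python
# Minimal sketch: a "probability over probabilities" as a discrete belief
# over possible coin biases. Collapsing it to its mean discards the spread,
# which is exactly the nested information.

narrow = {0.5: 1.0}            # certain the coin is fair
wide   = {0.1: 0.5, 0.9: 0.5}  # could be heavily biased either way

def mean_bias(belief):
    """Average probability of heads implied by a belief over biases."""
    return sum(p * w for p, w in belief.items())

print(mean_bias(narrow))  # 0.5
print(mean_bias(wide))    # same mean as `narrow`, despite very different spread
```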

While your idea is somewhat valid, it either misses the point of the question that a Bayesian probability answers, or it overlooks that this is already a standard part of Bayesian reasoning. In other words, a good Bayesian would say that your idea is trivial and irrelevant unless there is further information acquisition. It is not a "valid and important question to ask" except in some contexts.

In your example, if you can only take the bet once, optimally choosing whether to take it involves calculating the expected gain using the correct Bayesian probability; any other information is irrelevant. In another simple example, you can formulate this as a problem with an option to continue. In that case, there is an instantaneous (also called flow) payoff and a continuation value (the value of being able to take the bet again). The continuation value depends on the posterior probability which, as you correctly mention, depends on more than the point estimate. However, this continuation value only matters for the decision if it is affected by the decision. If the shady guy will toss the coin regardless of what you do, then how the posterior probability will change is irrelevant to your choice.
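
A minimal sketch of the flow-plus-continuation decomposition, under assumed payoffs (a bet that pays +1 on heads and -1 on tails, with the toss observed whether or not you bet): the one-shot expected gain depends only on the mean belief, while the continuation value of betting again after seeing the toss depends on the whole belief. Since the toss here happens regardless of the first-period bet, the continuation term is the same either way and does not change that decision, which is exactly the point above.

```python
def expected_gain(belief):
    """One-shot expected gain of the bet: sum over biases of w * (2p - 1),
    so only the mean of the belief over biases matters."""
    return sum(w * (2 * p - 1) for p, w in belief.items())

def posterior(belief, heads):
    """Bayes update of the belief over biases after observing one toss."""
    like = {p: (p if heads else 1 - p) for p in belief}
    z = sum(belief[p] * like[p] for p in belief)
    return {p: belief[p] * like[p] / z for p in belief}

def value_with_option_to_continue(belief):
    """Flow payoff of betting now (only if favourable) plus the continuation
    value of being able to bet again after seeing the toss."""
    flow = max(expected_gain(belief), 0.0)
    p_heads = sum(w * p for p, w in belief.items())  # predictive prob. of heads
    cont = (p_heads * max(expected_gain(posterior(belief, True)), 0.0)
            + (1 - p_heads) * max(expected_gain(posterior(belief, False)), 0.0))
    return flow + cont

fair = {0.5: 1.0}              # certain the coin is fair
wide = {0.1: 0.5, 0.9: 0.5}    # same mean bias, much more spread

print(expected_gain(fair), expected_gain(wide))   # both 0.0: only the mean matters
print(value_with_option_to_continue(fair),
      value_with_option_to_continue(wide))        # 0.0 vs. about 0.32: spread matters here
```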

More generally, dynamic problems with new information are not a problem for Bayesians. Specifying the informational context of a problem requires a proper prior, which is a joint distribution over all the variables. These variables can be decision-relevant ones (the particulars of the coin) or informational ones (the history of coin tosses, or extra information on how the coin was obtained). Bayes' theorem has us update this prior in the usual way. While there are some examples where this extra information can be neatly summarized in a simple sufficient statistic (e.g., the number of tosses and the number of heads, for coins with a given probability of landing heads and outcomes that are independently distributed given the coin), those examples are the exception.
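
A minimal sketch of that conjugate special case, assuming a Beta prior over the coin's bias and tosses that are independent given the coin: the posterior depends on the history only through the counts, so two different toss orders with the same counts yield the same posterior.

```python
def update_one_toss(a, b, heads):
    """One Bayes update of a Beta(a, b) belief over the coin's bias
    after observing a single toss."""
    return (a + 1, b) if heads else (a, b + 1)

def run(history, a=1, b=1):
    """Update a Beta(1, 1) (uniform) prior along a toss history."""
    for h in history:
        a, b = update_one_toss(a, b, h)
    return a, b

history_1 = [True, True, True, False, False]   # H H H T T
history_2 = [False, True, False, True, True]   # T H T H H

print(run(history_1))  # (4, 3)
print(run(history_2))  # (4, 3) -- same posterior: only the counts matter
```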

To recap, Bayesians are not "making incorrect technical arguments insinuating that the estimated probability alone should be enough for everyone." They are making correct arguments that fail only in a very small subset of problems: those with information acquisition that is affected by the decisions. In this sense, it is not "a valid and important question to ask." Furthermore, it is not clear that "Bayesians ought to get a habit of volunteering this information unprompted," because this information, besides being irrelevant to most decisions, is not easy to communicate succinctly.