If you all happen to believe there’s an omnipotent power that rewards good behavior after you die, and you’re in a close-knit community where betrayals have huge social costs, then you can all trust each other.

I suspect the second factor is doing a lot more work than the first. (Suppose the diamond marketplace were 50/50 Jews and Jains; would we expect each trader to do exactly as much business with their own group as with the other?)

Maybe right and wrong don’t “really” exist. But it’s tough out there and you don’t want people to hurt you. So you loudly advertise that if anyone defects against you, you’ll go out of your way to punish them—even “irrationally” hurting yourself if that’s necessary to get revenge. To make this threat maximally credible, you adopt as core beliefs that right and wrong do exist and that defectors are wrong.
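
(The strategy being described here is, roughly, a grim-trigger retaliator from the iterated prisoner's dilemma. A minimal sketch of that reading follows; the payoff numbers and strategy names are mine, purely for illustration, not anything from the article.)

```python
# Toy iterated prisoner's dilemma. The quoted strategy ("loudly precommit to
# punishing defectors, even at a cost to yourself") is modeled as grim trigger.
# All payoff numbers and strategy names are illustrative, not from the article.

PAYOFFS = {  # (my move, their move) -> my payoff
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, they betray me
    ("D", "C"): 5,  # I betray a cooperator
    ("D", "D"): 1,  # mutual defection
}

def grim_trigger(my_history, their_history):
    """Cooperate until the opponent defects once, then punish forever,
    even though the punishment also costs me payoff every round."""
    return "D" if "D" in their_history else "C"

def always_defect(my_history, their_history):
    return "D"

def tit_for_tat(my_history, their_history):
    return their_history[-1] if their_history else "C"

def play(strategy_a, strategy_b, rounds=100):
    """Run the iterated game and return each side's total payoff."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    for name, opponent in [("always_defect", always_defect),
                           ("tit_for_tat", tit_for_tat)]:
        mine, theirs = play(grim_trigger, opponent)
        print(f"grim_trigger vs {name}: {mine} to {theirs}")
```

(Axelrod-style tournaments pit many such strategies against each other; this just shows the precommitment in its simplest form.)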

I'm not sure I'd count adopting a certain game-theoretic strategy as "believing an untrue thing" - a strategy can be effective or ineffective, but it can't be "false". The article seems to treat this as an example of a false meta-ethical belief, but that's a controversial assertion, and we can't take for granted that everyone understands "right" and "wrong" in the same way. (Maybe I've just defined "right" as whatever strategies are effective?) Which leads to the next point:

At least in recent history, people on both sides of wars seem to believe they are fighting for the side of good. Obviously, that can’t be right, and in a sense, two such parties fighting should be cause for them to sit down and work through Aumann’s agreement dynamics.

A disagreement over values isn't like a disagreement over facts - we wouldn't expect Aumann's agreement theorem to resolve it even if both sides were perfectly rational, since the theorem constrains beliefs about events, not preferences.
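
(For reference, a rough statement of the theorem; the notation below is standard but mine, not quoted from the article:)

```latex
% Aumann (1976), roughly: two agents share a common prior P and have
% information partitions I_1, I_2. If, at state omega, it is common knowledge
% that agent 1's posterior for an event E is q_1 and agent 2's is q_2, then:
\[
  \text{common knowledge of } \;
  P(E \mid \mathcal{I}_1(\omega)) = q_1
  \;\text{ and }\;
  P(E \mid \mathcal{I}_2(\omega)) = q_2
  \;\Longrightarrow\; q_1 = q_2
\]
% Everything here lives in the agents' probability assignments over events;
% nothing constrains their utility functions, which is where a disagreement
% over values would sit.
```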