
not-guilty is not the same as innocent

felipec.substack.com

In many discussions I'm pulled back to the distinction between not-guilty and innocent as a way to demonstrate how the burden of proof works and what the true default position should be in any given argument. Most people have no trouble seeing the distinction, yet many otherwise intelligent people somehow don't see it.

In this article I explain why the distinction exists and why it matters, in particular why it matters in real-life scenarios, especially when people try to shift the burden of proof.

Essentially, in my view the universe we are talking about is {uncertain, guilty, innocent}, so not-guilty is the complement of guilty (guilty′), which is {uncertain, innocent}. It follows that innocent ⇒ not-guilty, but not-guilty ⇏ innocent.
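To make that set logic concrete, here is a minimal Python sketch of the verdict space described above; the enum and function names are mine, purely for illustration:

    # A toy model of the three-valued verdict space {uncertain, guilty, innocent}.
    from enum import Enum

    class Verdict(Enum):
        UNCERTAIN = "uncertain"
        GUILTY = "guilty"
        INNOCENT = "innocent"

    def not_guilty(v: Verdict) -> bool:
        # not-guilty is the complement of guilty: {uncertain, innocent}
        return v != Verdict.GUILTY

    def innocent(v: Verdict) -> bool:
        return v == Verdict.INNOCENT

    # innocent implies not-guilty...
    assert all(not_guilty(v) for v in Verdict if innocent(v))
    # ...but not-guilty does not imply innocent:
    assert not_guilty(Verdict.UNCERTAIN) and not innocent(Verdict.UNCERTAIN)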

When O. J. Simpson was acquitted, that doesn't mean he was found innocent; it means the prosecution could not prove his guilt beyond a reasonable doubt. He was found not-guilty, which is not the same as innocent. It very well could be that the jury found the truth of the matter uncertain.

This notion has implications in many real-life scenarios where people try to shift the burden of proof when you reject an unsubstantiated claim. They wrongly assume you are claiming their claim is false (equivalent to innocent), when in truth all you are doing is staying in the default position (uncertain).

Rejecting the claim that a god exists is not the same as claiming a god doesn't exist: it doesn't require a burden of proof because it's the default position. Agnosticism is the default position. The burden of proof is on the people making the claim.


Eh, why should it be trivial to simulate a younger civilization?

Say our descendants want to simulate Earth alone in real time, with the stars, the microwave background, and gravitational waves just a projection on a sheet. How many bits of information are in that bubble? Almost by definition, they couldn't fit on an Earth-sized computer.

The problem can be handwaved away if our simulators operate under different laws of physics, but we're assuming that our own laws reflect theirs. Memory can also be traded off for time if our perceptions can be stepped frame-by-frame. That tradeoff opens up the possibility that the higher reality simply hasn't had long enough to run a simulation of our observed fidelity. I don't think that's a knockout, just another possible resolution to the paradox.

He discusses this in the paper:

"We noted that a rough approximation of the computational power of a planetary-mass computer is 10^42 operations per second, and that assumes only already known nanotechnological designs, which are probably far from optimal. A single such a computer could simulate the entire mental history of humankind (call this an ancestor-simulation) by using less than one millionth of its processing power for one second."
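For what it's worth, the total that quote implies is easy to spot-check. A quick sketch, with variable names of my own choosing:

    # Back-of-the-envelope check of the figures in the quote above.
    ops_per_second = 10**42   # planetary-mass computer (Bostrom's estimate)
    fraction_used = 1e-6      # "less than one millionth of its processing power"
    duration_s = 1            # "for one second"

    total_ops = ops_per_second * fraction_used * duration_s
    print(f"Implied cost of an ancestor-simulation: ~{total_ops:.0e} operations")
    # => ~1e+36 operations for the entire mental history of humankind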

You just take shortcuts. You don't need to simulate atomic-level phenomena unless they're actually being used, for example when someone points an electron microscope at them, or when the atoms are made into a semiconductor. The interior of the Earth can be simplified hugely too, along with much of the deep oceans.
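That "resolve detail only when observed" shortcut is essentially lazy evaluation. A toy sketch under that reading; the Region class and its methods are hypothetical, invented here for illustration:

    # Toy sketch of on-demand level of detail: cheap coarse state is kept
    # for every region, and expensive fine-grained state is computed lazily,
    # only when something actually observes it. All names are hypothetical.
    from functools import cached_property

    class Region:
        def __init__(self, name: str):
            self.name = name
            self.coarse_state = f"bulk approximation of {name}"  # always cheap

        @cached_property
        def atomic_state(self) -> str:
            # Expensive computation, deferred until first observation,
            # then cached so repeated observations cost nothing extra.
            print(f"(expensively resolving atoms in {self.name})")
            return f"full atomic detail of {self.name}"

        def observe(self, with_microscope: bool = False) -> str:
            return self.atomic_state if with_microscope else self.coarse_state

    mantle = Region("Earth's mantle")
    print(mantle.observe())                      # cheap bulk physics
    print(mantle.observe(with_microscope=True))  # detail resolved on demand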

And if it takes a million or a billion times more processing power than expected for whatever reason, they could scale up their computing to match. The galaxy is not short of stars or planets to be converted.