## All bets are off, to disprove the strong physical Church-Turing Thesis (foundations of probability, digital physics and Laplacian determinism 3)

(continued from previous post)

Let H0 be the hypothesis: `the real world is non-computable’ (popularly speaking, see the previous post), and let H1 be PCTT+ (also see the previous post).

For comparison we introduce the hypothesis H2: `the real world produces only rational (real) numbers’.

H2 is assumed to have been the original world view of the ancient Greek mathematicians (the Pythagoreans), before their discovery that $\sqrt{2}$ is irrational (which is `rumoured’ to have caused a shock, but I cannot find a reliable historical reference for this).

The rational numbers in $[0,1]$ have Lebesgue measure $0$, so we can start constructing $(T_{40,m})_{m\in \mathbb{N}}$ such that $T_{40}=\bigcup_{m\in \mathbb{N}} T_{40,m}$ has Lebesgue measure less than $2^{-40}$, and such that $T_{40}$ contains all rational numbers in $[0,1]$.
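For concreteness, here is a minimal Python sketch of such a covering (all names are my own, and any standard enumeration of the rationals would do): give the $m$-th rational an open interval of length $2^{-42-m}$, so the total length is at most $\sum_{m\geq 0} 2^{-42-m} = 2^{-41} < 2^{-40}$.

```python
from fractions import Fraction

def rationals_01():
    """Enumerate the rationals in [0, 1], ordered by denominator."""
    yield Fraction(0)
    yield Fraction(1)
    q = 2
    while True:
        for p in range(1, q):
            r = Fraction(p, q)
            if r.denominator == q:  # skip duplicates such as 2/4 = 1/2
                yield r
        q += 1

def T_40(m):
    """An open interval T_{40,m} of length 2^(-42-m) around the m-th rational.
    Summing the lengths: sum_{m>=0} 2^(-42-m) = 2^(-41) < 2^(-40)."""
    gen = rationals_01()
    for _ in range(m):
        next(gen)
    c = next(gen)
    half = Fraction(1, 2 ** (43 + m))
    return (c - half, c + half)
```

Since the intervals are computed with exact rational arithmetic, membership of any given rational in a given $T_{40,m}$ is decidable by direct comparison.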

If we then take our coin-flip-randomly produced $x\in [0,1]$, I personally don’t think that we will encounter an $m\in\mathbb{N}$ for which we see that $T_{40,m}$ contains $x$.

This opinion is supported by the fact that we can easily construct a non-rational number…at least in theory. Take for instance $e$, the base of the natural logarithm, which equals $\sum_{n\in\mathbb{N}}\frac{1}{n!}$. We can in fact construct $T_{40}$ in such a way that $T_{40}$ does not contain $e$, and we assume this to be the case here.

On the one hand, this does not depend on infinity, since we can simply look at approximations of $e$. We construct $T_{40}$ such that for any $m\in\mathbb{N}$ the $(2m+2)^{\rm th}$ binary approximation to $e$ is positively apart from $T_{40,m}$. On the other hand, any finite approximation to $e$ is still rational…and so we can only construct $e$ as an irrational number in the sense described above.
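A small Python sketch of this point (names my own): every partial sum of $\sum \frac{1}{n!}$ is rational, yet the tail beyond the $n$-th term is below $\frac{2}{(n+1)!}$, so the rational partial sums pin $e$ down to any desired binary precision.

```python
from fractions import Fraction
from math import factorial

def e_partial(n):
    """The rational partial sum sum_{k=0}^{n} 1/k!, approximating e.
    The tail sum_{k>n} 1/k! is bounded by 2/(n+1)!."""
    return sum(Fraction(1, factorial(k)) for k in range(n + 1))

# Each approximation is a genuine (non-integer) rational number...
assert e_partial(15).denominator > 1

# ...yet already the tail beyond term 15 is below 2/16!, which is far
# under 2^(-40): later partial sums cannot move the value by more.
```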

With regard to the existence of non-computable reals, the situation in my humble opinion is very different. We cannot construct a non-computable real, as a result of the Church-Turing Thesis (which I have no reason to doubt). Any construction of a real which we recognize as such will consist of a finite set of construction instructions…in other words, a Turing machine.

So to make a reasonable case for the existence of non-computable reals, we are forced to turn to Nature. In the previous post, we flipped our coin to produce a random $x$ in $[0,1]$. We argued that finding an $m\in\mathbb{N}$ for which $S_{40,m}$ contains $x$ would force us to reject the hypothesis H0 (`the real world is non-computable’).

So what result in this coin-tossing experiment could force us to reject H1, the strong physical Church-Turing Thesis (PCTT+, `the universe is a computer’)?

To be able to reject H1 in the scientific hypothesis-testing way, we should first assume H1. [This might pose a fundamental problem, because if we really assume H1, then our perception of probability might change, and we might have to revise the standard scientific hypothesis-testing way which seems to be silently based on H0. But we will for the moment assume that the scientific method itself needs no amendment under H1.]

Under H1, $x$ has to fall in some $S_{40,m}$. Failure to do so, even if we let $m\in\mathbb{N}$ grow very large, might indicate that H1 is false. For scientific proof we should avail ourselves of some number $M\in\mathbb{N}$ such that (under H1) the probability that $x$ is not in $\bigcup_{m\in \mathbb{N},\, m\leq M} S_{40,m}$ is less than $2^{-40}$.

This reverse probability has had me puzzled for some time, and sent me on a quest for a probability distribution on the natural numbers. In the thread `drawing a natural number at random’ I argued that some indication could be taken from Benford’s law, and for discrete cases from Zipf’s law. Anyway, very tentatively, the result of that thread was to consider relative chances only. If for $1\leq n,m \in \mathbb{N}$ we denote the relative Benford chance of drawing $n$ vs. drawing $m$ by $\frac{P_B(n)}{P_B(m)}$, then we find that $\frac{P_B(n)}{P_B(m)} = \frac{\log{\frac{n+1}{n}}}{\log{\frac{m+1}{m}}}$. The relative Zipf chance of drawing $n$ vs. drawing $m$ is given by $\frac{P_Z(n)}{P_Z(m)} = \frac{m}{n}$.
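These relative chances are easy to tabulate; a quick Python check (function names my own) confirms how heavily both laws favour the small numbers:

```python
from math import log

def benford_ratio(n, m):
    """Relative Benford chance P_B(n)/P_B(m) = log((n+1)/n) / log((m+1)/m)."""
    return log((n + 1) / n) / log((m + 1) / m)

def zipf_ratio(n, m):
    """Relative Zipf chance P_Z(n)/P_Z(m) = m/n."""
    return m / n

# Drawing 1 is favoured over drawing 1,000,000 by a factor of roughly
# 7 * 10^5 under Benford, and exactly 10^6 under Zipf.
```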

In both cases, the relevant density function is $f(x)=\frac{1}{x}$. The important feature of this distribution is twofold:

1) The smaller natural numbers are heavily favoured over the larger. (`Low entropy’).

2) There is no $M\in\mathbb{N}$ such that even the relative probability of drawing an $m\in\mathbb{N}$ larger than $M$ (versus drawing $m\leq M$) becomes less than $2^{-40}$. (Because $\int_1^x \frac{1}{t}\,dt = \log x$ tends to infinity.)
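A numerical sketch of point 2, under the Zipf weights (all names my own): however large we take $M$, the total weight of the numbers beyond $M$, relative to the weight of $\{1,\dots,M\}$, can be pushed past any bound by extending the range, since the harmonic sums grow like $\log$.

```python
def tail_to_head(M, N):
    """Relative Zipf weight of drawing m in (M, N] versus m in [1, M]."""
    head = sum(1.0 / m for m in range(1, M + 1))
    tail = sum(1.0 / m for m in range(M + 1, N + 1))
    return tail / head

# head ~ log M is fixed, while tail ~ log(N/M) grows without bound in N,
# so no cut-off M ever makes the tail's relative weight negligible.
```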

Fools rush in where angels fear to tread. I know, so let me fall squarely into the first category. Yet this train of thought might provoke some smarter people to come up with better answers, so I will just continue. I do not believe these relative chances can simply be applied here: there are too many unknowns and assumptions. But it cannot do harm to try and get some feel for the reverse probability needed to disprove H1.

For this tentative argument then, disregarding some technical coding issues, we consider (under H1) our coin-flip random $x$ to equal some computable $x_s$, computed by a Turing machine with random number $s\in\mathbb{N}$ drawn from some extremely large urn with low entropy (favouring the smaller natural numbers).

Even with this favouring of the smaller natural numbers, we still cannot begin to indicate an $M\in\mathbb{N}$ such that (under H1) the probability that $x$ is not in $\bigcup_{m\in \mathbb{N},\, m\leq M} S_{40,m}$ is less than $2^{-40}$. Perhaps if we knew the size of the urn (which in this case would seem to be the universe itself) we could say something more definite about $M$. But all things considered, it seems to me that $M$ could easily be astronomically large, far larger than our limited computational resources can ever handle.
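To get a rough feel for the size of $M$ (this is my own back-of-the-envelope assumption, not an established estimate): if the urn is $\{1,\dots,U\}$ with Zipf weights, then by the approximation $H_n \approx \log n$ the weight beyond $M$ relative to the whole urn is roughly $1 - \frac{\log M}{\log U}$, so pushing it below $\varepsilon = 2^{-40}$ forces $M > U^{1-\varepsilon}$, i.e. $M$ is within a whisker of the urn size $U$ itself.

```python
def required_M(U, eps):
    """Rough lower bound on M with Zipf tail weight past M below eps,
    for an urn {1, ..., U}, using the approximation H_n ~ log n:
    (log U - log M)/log U < eps  implies  M > U**(1 - eps)."""
    return U ** (1.0 - eps)

# Even for an urn of size 10^80 (order of the number of particles in
# the observable universe), eps = 2^(-40) forces M to be almost 10^80.
```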

In other words: all bets are off, to disprove H1.

And so, if H1 is true, it could very well take our coin-flip experiment an astronomically long time to find this out.

But I still think the experiment worthwhile to perform.