Foundations of probability, digital physics and Laplacian determinism

In this thread of posts from 2012, the possibility of drawing a natural number at random was discussed. In the previous post I revisited an entropy-related solution giving relative chances; this solution also explains Benford’s law.

In the 2012 thread, I was working on two fundamental questions, the first of which

QUESTION 1   Is our physical world finite or infinite?

was treated to some degree of satisfaction. But its relation to the second question still needs exposition. So let me try to continue the thread here by returning to:

QUESTION 2   What is the role of information in probability theory?

In my freshman (math) course on probability theory, this question was not raised. Foundations of probability were in fact ignored even in my specialization area: foundations of mathematics. Understandable from a mathematical point of view perhaps… but not from a broader foundational viewpoint which includes physics. I simply have to repeat what I wrote in an earlier post:

(Easy to illustrate the basic problem here, not so easy perhaps to demonstrate why it has such relevance.) Suppose we draw a marble from a vase filled with an equal number of blue and white marbles. What is the chance that we draw a blue marble?

In any high-school exam, I would advise you to answer: 50%. In 98% of university exams I would advise the same answer. Put together that makes … just kidding. The problem here is that any additional information can drastically alter our perception of the probability/chance of drawing a blue marble. In the most dramatic case, imagine that the person drawing the marble can actually feel the difference between the two types of marbles, and therefore already knows which colour marble she has drawn. For her, the chance of drawing a blue marble is either 100% or 0%. For us, who knows? Perhaps some of us can tell just by the way she frowns what type of marble she has drawn…?
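To make the role of information concrete, here is a minimal simulation sketch (in Python; the 10/10 composition of the vase and the code are my own illustration, not part of the original argument):

```python
import random

# A vase with 10 blue and 10 white marbles (illustrative numbers).
vase = ["blue"] * 10 + ["white"] * 10
trials = 100_000

# Outside observer: knows only the composition of the vase,
# so the long-run frequency of blue is about 0.5.
blue_count = sum(random.choice(vase) == "blue" for _ in range(trials))
print("observer's estimate:", blue_count / trials)   # ~0.5

# The drawer who can feel the difference: for her, on any single draw,
# the 'chance' of blue has already collapsed to 0 or 1.
marble = random.choice(vase)
p_blue_for_drawer = 1.0 if marble == "blue" else 0.0
print("drawer's probability this draw:", p_blue_for_drawer)
```

The point is not the numbers, but that the same physical draw carries different probabilities for observers with different information.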

It boils down to the question: what do we mean by the word ‘chance’? I quote from Wikipedia:

The first person known to have seen the need for a clear definition of probability was Laplace.[citation needed] As late as 1814 he stated:

The theory of chance consists in reducing all the events of the same kind to a certain number of cases equally possible, that is to say, to such as we may be equally undecided about in regard to their existence, and in determining the number of cases favorable to the event whose probability is sought. The ratio of this number to that of all the cases possible is the measure of this probability, which is thus simply a fraction whose numerator is the number of favorable cases and whose denominator is the number of all the cases possible.

— Pierre-Simon Laplace, A Philosophical Essay on Probabilities[4]

This description is what would ultimately provide the classical definition of probability.
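Taken at face value, the classical definition is just a counting ratio. Applied to the vase above (again with my own illustrative 10/10 split):

```python
from fractions import Fraction

# Classical (Laplacean) probability: favourable cases / equally possible cases.
blue, white = 10, 10            # illustrative composition of the vase
favourable = blue               # cases in which a blue marble is drawn
possible = blue + white         # all equally possible cases
print(Fraction(favourable, possible))   # 1/2
```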

One easily sees, however, that this ‘definition’ avoids the main issue (which cases count as ‘equally possible’ is itself a matter of the information we have). Laplace did not always avoid this issue, though:

Laplace ([1776a]; OC, VIII, 145):

Before going further, it is important to pin down the sense of the words chance and probability. We look upon a thing as the effect of chance when we see nothing regular in it, nothing that manifests design, and when furthermore we are ignorant of the causes that brought it about. Thus, chance has no reality in itself. It is nothing but a term for expressing our ignorance of the way in which the various aspects of a phenomenon are interconnected and related to the rest of nature.

and:

We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.

—Pierre Simon Laplace, A Philosophical Essay on Probabilities[37]

In the meantime I came across some work by Albert Tarantola, and this work is really heartening! In a seminal paper, Inverse problems = quest for information (written together with Bernard Valette), Tarantola already states that we should consider any probability distribution as an information state (a subjective one, even) regarding the phenomenon under study, and vice versa: every information state can be described by a probability distribution function on some appropriate model space.

Now we’re talking!

To my further surprise: Tarantola describes the quandary that measure density functions like f(x) = \frac{1}{x} cause (the integral over (0,\infty) diverges) and offers exactly the same solution: look at relative probabilities of events, instead of absolute probabilities. To top it off, Tarantola emphasizes that the measure density function f(x) = \frac{1}{x} plays a very important role in inverse problems…
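To make the relative-probability point concrete, a minimal sketch (my own illustration, not taken from Tarantola’s paper): under f(x) = \frac{1}{x} the weight of an interval [a, b] is \log b - \log a, so ratios of weights are finite even though the total weight diverges, and these ratios reproduce Benford’s law for leading digits.

```python
import math

# Weight of [a, b] under the density f(x) = 1/x: log(b) - log(a).
def weight(a, b):
    return math.log(b) - math.log(a)

# Absolute probabilities don't exist (the total weight diverges),
# but relative weights are perfectly well defined:
print(weight(1, 2) / weight(1, 10))          # ~0.3010

# The same relative weights give Benford's law, P(d) = log10(1 + 1/d):
for d in range(1, 10):
    relative = weight(d, d + 1) / weight(1, 10)
    print(d, round(relative, 4), round(math.log10(1 + 1 / d), 4))
```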

So now I need to study all of this, in order also to join these ideas to the perspective of digital physics and Laplacian determinism.

(to be continued)
