To illustrate what I mean with entropy (I’m not sure how well this corresponds with usual interpretations), let’s consider the throwing of a traditional die (marked with 1 through 6). It is seemingly borne out by experience that the odds of throwing each of the numbers 1 through 6 usually approximate 1/6…but as we discussed earlier, this also depends on our interpretation of `chance' and `odds' and `probability'. We momentarily put these interpretational issues aside.

In this situation, the relative probability of drawing a 1 vs. drawing a 6, say, is equal to 1. I would call this a situation of maximal entropy; in other words, here the urn of natural numbers 1 through 6 from which we draw is extremely well-mixed. In reality no die is perfectly symmetric, but we have taken pains to design dice in such a way that no face is favoured over another, as far as we can tell.

Suppose our world is truly finite, say it can be characterized by its finite cardinality N, and extremely well-mixed (with maximum entropy, say). Then the chance of drawing a random natural number would seem to be 1/N, irrespective of which number is drawn; in other words: a uniform distribution. However, if the entropy is low, then the distribution will (I believe) be skewed in favour of the lower natural numbers.
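
This maximum-entropy case can be sketched in a few lines of Python (the choice N = 6, echoing the die, is mine, purely for illustration):

```python
import random
from collections import Counter

N = 6  # a toy `finite world'; the value is an assumption for illustration
draws = [random.randint(1, N) for _ in range(100_000)]
freq = Counter(draws)

# In a maximally mixed (uniform) world, each number should appear
# with relative frequency close to 1/N.
for n in range(1, N + 1):
    print(n, freq[n] / len(draws))
```

Every printed frequency hovers around 1/6 ≈ 0.167, which is what a maximally well-mixed urn of the numbers 1 through N looks like.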

So now we arrive at the highly speculative but interesting idea that Nature, on average, in order to create anything of cardinality 2, needs to invest precisely twice as much `trouble' as to create the same anything of cardinality 1. And similarly (cutting corners), to create something of cardinality n costs n times the effort of creating the same something of cardinality 1. If we continue cutting corners, then this obviously leads to the idea that the `density' of n compared to the density of 1 on average is 1/n.
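
The speculative `density' idea above, spelled out as a tiny sketch (the range 1 through 6 is again just an illustrative choice):

```python
# Creating a number of cardinality n is assumed to cost n times the
# effort of creating 1, so the density of n relative to 1 is 1/n.
# (Normalisation is ignored here; this is purely the relative picture.)
densities = {n: 1 / n for n in range(1, 7)}
for n, d in densities.items():
    print(n, d)
```

So under this idea a 2 is half as `dense' as a 1, a 3 a third as dense, and so on.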

As an example, consider trees. It takes twice as long for a tree to grow to 2 metres as for that same tree to grow to 1 metre. It takes on average twice as much `effort' to grow 2 trees as to grow 1 tree. So for instance in isolated spots you will find solitary trees…because there is simply no room for a second tree. Of course the idea that this, for given n, on average should lead to a `density' of 1/n is highly speculative, in more than one way. I will try to list all the caveats that come to mind later on in this series.

But first, I believe one can link `trouble' to some aspect of `entropy'. In the case of a die, we have beforehand (!) invested energy, trouble, whatever, to ensure that accessing the number 6 is just as easy as accessing the number 1. In general, however, if we bear with our highly speculative idea, it would be six times harder to access 6 than to access 1. You can compare this to drawing marbles from a large urn which is filled layer by layer. The marbles on top are labelled 1, and each successive deeper layer is labelled 2, 3, etc. Now you can imagine that it is in some sense 6 times harder to draw a 6 than to draw a 1. The urn is not well-mixed; it has low entropy.
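
A minimal simulation of this low-entropy urn, under the assumption (mine, for illustration) that a marble n layers deep is exactly n times harder to draw, i.e. gets weight 1/n:

```python
import random
from collections import Counter

# Toy model of the layered urn: marbles labelled n sit n layers deep,
# and we assume drawing one is n times harder, i.e. weight 1/n.
labels = list(range(1, 7))
weights = [1 / n for n in labels]

draws = random.choices(labels, weights=weights, k=100_000)
freq = Counter(draws)

# A 1 should come up roughly six times as often as a 6.
print(freq[1] / freq[6])
```

The printed ratio comes out close to 6, matching the intuition that a 6 is six times harder to reach in the badly mixed urn.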

You will notice that with the `density' function 1/n, for given n, we do not immediately arrive at the solution given in post 4, although of course the difference is not very big. I wish to modify the interpretation of this `density' function a little. This is partly opportunistic (because I wish to achieve conformance to Benford’s law) and partly not, because Benford’s law is an experimental truth (with accuracy problems, of course…) which points out that there is another mechanism at work when `drawing random natural numbers'.

Cutting more corners, I propose to switch to a continuous density function f, namely f(x) = 1/x for x ≥ 1. This corresponds to the image of the growing tree. It doesn’t achieve a height of x metres in discrete steps; rather it is a continuous process (although we have to bear in mind all the caveats already given on this account, namely that spacetime might well be finite, or at least discrete…making the concept of the real numbers even shakier than the concept of the natural numbers).

Now, still not letting ourselves be hampered by caveats, the idea is that drawing a natural number n corresponds to the area below the graph of the density function f on the interval [n, n+1]. This is perhaps the most opportunistic argument, since this choice makes for a perfect correspondence with Benford’s law. But there is also something non-opportunistic to be said for choosing the interval [n, n+1] over its rival candidate [n-1, n].
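
The claimed correspondence is easy to check numerically: the area under 1/x on [d, d+1] is ln(d+1) − ln(d) = ln(1 + 1/d), and dividing by ln 10 to normalise over the digits 1 through 9 gives exactly Benford’s leading-digit law P(d) = log10(1 + 1/d). A sketch of that check:

```python
import math

# Area under f(x) = 1/x on [d, d+1], normalised by ln 10, versus
# Benford's law P(d) = log10(1 + 1/d) for the leading digits 1..9.
for d in range(1, 10):
    area = math.log(d + 1) - math.log(d)   # integral of 1/x over [d, d+1]
    benford = math.log10(1 + 1 / d)
    print(d, area / math.log(10), benford)
```

The two columns agree to machine precision, and the nine Benford probabilities sum to 1, since the intervals [1,2], …, [9,10] tile [1,10] and the total area is ln 10.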

This also has to do with anthropic issues (which are omnipresent, but I have largely ignored them so far). When do we perceive something as a tree? For this to happen, a certain threshold has to be crossed. To achieve a natural number n, we similarly have to cross a threshold. It seems more consistent to put this threshold at n itself than to presume (prematurely, overly optimistically) at point n-1 that our process will arrive at n just as easily.

The above gives, in a nutshell, the bare essence of the formula in post 4. The next posts will discuss why this formula works in either of the scenarios `finite' or `potentially infinite', and can even be made plausible in the `actually infinite' case.

(to be continued)