3.6 Randomness

Randomness is at the very heart of quantum physics. When a physical system in a superposition of states is measured, it collapses into one of its possible states in a completely unpredictable way – we can only evaluate the probability of obtaining the various possible outcomes. An extreme view is to claim with Peres [110] that
in a strict sense quantum theory is a set of rules allowing the computation of probabilities for the outcomes of tests which follow specific preparations.
According to Milburn [102], p. 1, a quantum principle is that
physical reality is irreducibly random.
We are talking about “true” randomness, not the “randomness” which, at times, nature appears to exhibit and for which classical physics blames our ignorance: meteorologists cannot accurately predict the path of a hurricane, for example. The explanation is not difficult to find: the equations governing the motion of the atmosphere are nonlinear, and tiny errors in the initial conditions are immensely amplified. This behaviour is known as “deterministic chaos”.
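The amplification of tiny errors is easy to demonstrate numerically. The sketch below is only an illustration; the map $x \mapsto 4x(1-x)$ and the initial points are our own choices, picked because this map is a standard example of deterministic chaos.

```python
# Two trajectories of the chaotic map x -> 4x(1-x), started a tiny
# distance apart, diverge until they are effectively uncorrelated.
def trajectory(x, steps):
    out = [x]
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        out.append(x)
    return out

ta = trajectory(0.2, 60)            # reference trajectory
tb = trajectory(0.2 + 1e-12, 60)    # perturbed by one part in 10^12
max_gap = max(abs(p - q) for p, q in zip(ta, tb))
print(max_gap)                      # the 1e-12 error has grown to order 1
```

Because the error roughly doubles at each step, fifty or so iterations suffice to erase all resemblance between the two trajectories.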

A mathematical definition of randomness is provided by algorithmic information theory, see Chaitin [43], Calude [27]. The idea is to define (algorithmic) randomness as incompressibility. The length of the smallest program (say, for a universal Turing machine) generating a binary string is the program-size complexity of the string. This idea can be extended in an appropriate way to infinite sequences. A random string/sequence is incompressible, as the smallest program generating it is the string/sequence itself! Strings/sequences that can be generated by small programs are deemed to be less random than those requiring longer programs. For example, the digits of $\pi$ can be computed one by one; nonetheless, if examined locally, without awareness of their provenance, they appear “random”. People have calculated $\pi$ out to a billion or more digits. A reason for doing this is the question of whether each digit occurs the same number of times, a symptom of randomness. It seems, but remains unproven, that the digits 0 through 9 each occur 10% of the time in the decimal expansion of $\pi$. If this turns out to be true, then $\pi$ would be a so-called simply normal real number. But although $\pi$ may be random in so far as it is “normal”, it is far from (algorithmic) random, because its infinity of digits can be compressed into a concise program for calculating them. The numbers generated by the so-called logistic map,

$$x_{n+1} = a\,x_n(1 - x_n), \qquad (9)$$

where $a$ is an arbitrary constant and the process starts at some state $x_0$, may appear “random” for some values of $a$; however, they are not, because of the succinctness of the rule (9) describing them. In Figure 6 we show the evolution of 130 iterates of the logistic map, all starting from the same initial state $x_0$.
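Rule (9) can be iterated directly. In the sketch below, the parameter value $a = 4.0$ and the starting point $x_0 = 0.2$ are illustrative choices of ours, not the values used for the figure:

```python
# Iterate the logistic map x_{n+1} = a * x_n * (1 - x_n), rule (9).
def logistic_iterates(a, x0, n):
    xs, x = [], x0
    for _ in range(n):
        x = a * x * (1.0 - x)
        xs.append(x)
    return xs

iterates = logistic_iterates(a=4.0, x0=0.2, n=130)
print(len(iterates), min(iterates), max(iterates))
```

Despite the erratic look of the output, the whole sequence is described by the few lines above, so its program-size complexity is tiny.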

In general, a long string of pseudo-random bits produced by a program may pass all practical statistical tests for randomness, but it is not (algorithmic) random: its program-size complexity is bounded by the size of the generating program plus a few extra bits which specify the random number seed. Later on we will discuss an amplification technique based on the use of the logistic map.
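The compressibility of pseudo-random output can be made concrete: the short program below regenerates a million bytes from a small seed, so the pair (generator, seed) is a complete description far shorter than the output itself. The generator and seed are illustrative choices.

```python
import random

# A few bytes of "program" (generator + seed) describe a megabyte of
# output, so the output's program-size complexity is tiny.
def pseudo_random_bytes(seed, n):
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(n))

out1 = pseudo_random_bytes(seed=42, n=1_000_000)
out2 = pseudo_random_bytes(seed=42, n=1_000_000)
print(out1 == out2)   # True: the whole string is recoverable from the seed
```

The million output bytes may pass statistical tests, yet their program-size complexity is bounded by the length of the generating code plus the few bits of the seed.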

Similarly, a long string of binary bits produced by any classical physical system, of which a Turing machine or a Java program is just an instance, is not (algorithmic) random. The program-size complexity of such a string is bounded by the size of the program generating it, that is, the physical law which governs its evolution, plus the size of the initial conditions on which the law acts. Any classical computer can only feign randomness; thinking otherwise is not only wrong, but as von Neumann said,
Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.
Note that human beings do not do a better job of generating “random” bits, as Shannon [121] has argued: the biases observed in people’s preferences for popular lottery numbers are manifest.

Is there any physical system that can generate arbitrarily long (algorithmic) random strings?

It is not difficult to destroy randomness. For example, start with a random sequence $x_1 x_2 x_3 \ldots$ over the alphabet $\{0, 1\}$ and define a new sequence $y_1 y_2 y_3 \ldots$, over the same alphabet $\{0, 1\}$, by doubling every bit:

$$y_{2i-1} = y_{2i} = x_i, \qquad i \geq 1.$$

Then, the new sequence $y_1 y_2 y_3 \ldots = x_1 x_1 x_2 x_2 x_3 x_3 \ldots$
is not random. The motivation is simple: the strings 010 and 101 (and, infinitely many more others) never appear, so the sequence has clear regularities (which can, actually, be detected by simple statistical randomness tests).
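One concrete regularity-introducing transformation, used here purely as an illustration, is to double every bit of a sequence; the doubled sequence provably never contains the substrings 010 or 101, whatever the input was:

```python
import random

# Doubling every bit of a sequence creates obvious regularities: a
# doubled sequence b1 b1 b2 b2 ... can never contain 010 or 101,
# because those patterns would need two adjacent copies to differ.
bits = [random.randrange(2) for _ in range(10_000)]
doubled = "".join(str(b) * 2 for b in bits)

print("010" in doubled, "101" in doubled)   # False False, for any input
```

A simple statistical test that counts three-bit patterns would immediately flag the two missing patterns.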

It is much more demanding to “generate” a truly random long string starting from an initial state with a simple description. Note that the condition of simplicity of the initial state is crucial: starting from a random string one can generate, in a purely algorithmic way, many other random strings. For example, if $x_1 x_2 x_3 \ldots$ is a random binary string, then break the string into pairs $x_1 x_2, x_3 x_4, x_5 x_6, \ldots$ and then code 00, 01, 10, 11 by 0, 1, 2, 3, respectively: the result is again a random sequence (this time over the alphabet $\{0, 1, 2, 3\}$). So, the problem is to start from an initial state which can be precisely controlled and has low program-size complexity, and to produce measurements of unbounded program-size complexity out of its natural dynamical evolution.
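The recoding step is a simple, invertible translation; a sketch follows, with the quaternary alphabet written as the digits 0–3:

```python
# Recode a binary string two bits at a time: 00->0, 01->1, 10->2, 11->3.
# The map is a bijection on pairs, so it neither adds nor removes
# information - randomness is preserved.
def pairs_to_quaternary(bits):
    assert len(bits) % 2 == 0, "need an even number of bits"
    return "".join(str(int(bits[i:i + 2], 2)) for i in range(0, len(bits), 2))

print(pairs_to_quaternary("00011011"))  # -> "0123"
```

Because the pair coding is reversible, a short program turns the quaternary sequence back into the original binary one, so the two sequences have (essentially) the same program-size complexity.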

Quantum mechanics seems capable of producing, with probability one, truly random strings. Here is a way to do it. Consider the operator

$$R = \begin{pmatrix} \cos\frac{\pi}{4} & -\sin\frac{\pi}{4} \\ \sin\frac{\pi}{4} & \cos\frac{\pi}{4} \end{pmatrix}$$

and recall that it rotates a qubit through an angle $\pi/4$. In particular, $R$ transforms the state $|0\rangle$ into an equally weighted superposition of $|0\rangle$ and $|1\rangle$:

$$R\,|0\rangle = \frac{1}{\sqrt{2}}\left(|0\rangle + |1\rangle\right). \qquad (10)$$
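Taking the operator to be the rotation through $\pi/4$ (an assumption consistent with the surrounding text), a quick numerical check confirms the equal weighting of the two outcomes:

```python
import math

# Rotation through pi/4 (assumed form of the operator discussed above),
# applied to the basis state |0> = (1, 0).
c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
R = [[c, -s],
     [s,  c]]
ket0 = [1.0, 0.0]

out = [R[0][0] * ket0[0] + R[0][1] * ket0[1],
       R[1][0] * ket0[0] + R[1][1] * ket0[1]]
probs = [amp ** 2 for amp in out]   # Born rule: |amplitude|^2
print(out, probs)                   # amplitudes ~0.707 each, probabilities 0.5 each
```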
So, to make a quantum device produce random bits one needs to prepare a 2-state quantum system in the state $|0\rangle$, apply the rotation operator to rotate the state into the superposition (10), and then observe the superposition. The act of observation produces the collapse into either $|0\rangle$ or $|1\rangle$, with equal chances. Consequently, one can use quantum superposition and indeterminism to simulate, with probability one, a “fair” coin toss. Random digits produced with quantum random generators of the type described above are, with probability one, free of the subtle correlations that haunt classical pseudo-random number generators. Of course, the problem of producing (algorithmic) random strings is still open. Indeed, let us assume that we have a classical silicon computer that simulates, using a high-quality pseudo-random generator, the quantum dynamics and quantum measurement of a 2-state quantum system. The simulated world will be statistically almost identical (up to some degree) with the “real” quantum system. However, all simulated bits will be, in the long run, highly compressible. How can we be sure that the “real” quantum system is not just a superpowerful pseudo-random generator?
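Exactly such a classical simulation of the measurement is sketched below; as the discussion above stresses, it can only feign the quantum behaviour, since its output is regenerable from a short seed. Seed and sample size are illustrative choices.

```python
import random

# Simulate measuring the superposition (|0> + |1>)/sqrt(2): each
# outcome has probability |1/sqrt(2)|^2 = 1/2. Being classical and
# seeded, the simulation's bits are highly compressible.
def measure_fair_qubit(rng):
    return 0 if rng.random() < 0.5 else 1

rng = random.Random(2024)          # fixed seed, for reproducibility
sample = [measure_fair_qubit(rng) for _ in range(100_000)]
print(sum(sample) / len(sample))   # close to 0.5
```

Statistically the sample looks like fair coin tosses, yet the entire run is described by the seed 2024 plus the few lines of code, which is precisely the compressibility argument made above.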




















Figure 6: Degrees of “randomness” in the logistic map


For technical reasons, we use self-delimiting Turing machines, that is, machines with a “prefix-free” domain: no proper extension of a halting program is itself a halting program.
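A minimal sketch of the prefix-free idea (one illustrative encoding, not the specific machine model used in the literature): announce the program's length in unary before the program itself, so that no codeword is a proper prefix of another.

```python
# Encode a binary string s as 1^len(s) 0 s. Reading left to right, the
# first 0 marks where the payload starts, so no valid codeword can be
# a proper prefix of a different valid codeword.
def encode(s):
    return "1" * len(s) + "0" + s

words = [encode(format(i, "b")) for i in range(1, 16)]
prefix_free = all(
    not w.startswith(v)
    for w in words for v in words if v != w
)
print(prefix_free)   # True
```

A machine restricted to such self-delimiting programs knows, from the input alone, where the program ends; this is the technical property that makes program-size complexity behave well.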