April 2, 2012
One of the basic ideas of modern science is that the laws of the material universe can be meaningfully understood only through quantified measurement. Numerical terms are needed, not just words and stories. The belief was that ordinary sentences must give way to mathematical equations.
The values of the measurements at a given starting time are called the initial conditions for that system. The Newtonian, deterministic claim is that for any given system, the same initial conditions will always produce an identical outcome. Life is like a film that can be run forwards or backwards in time.
One thing we have learned is that no real measurement is infinitely precise: every measurement carries some degree of uncertainty. That uncertainty arises because all measuring devices record with only finite precision. To reach infinite precision, an instrument would have to display outputs with an infinite number of digits.
By using very accurate devices, the level of uncertainty can often be made acceptable for a particular purpose, but it can never be eliminated completely. It is important to note that the uncertainty in the outcome does not arise from randomness in the equations, but from the lack of infinite accuracy in the initial conditions.
It used to be assumed that nearly perfect predictions were theoretically possible given more precise information. Better instruments would shrink the uncertainty in the initial conditions and, with it, the imprecision in predictions. The lack of infinite precision was thought to be a minor problem. Well, our belief systems are still mostly based on the idea that very small uncertainties don't matter.
Possibly the first clear account of a very different kind of behavior was given in the late nineteenth century by the French mathematician Henri Poincaré, a founder of modern dynamical systems theory. His claim was that some systems follow different laws: the tiniest imprecision in the initial conditions can grow over time. Two nearly indistinguishable sets of initial conditions for the same system can then produce two developments that differ massively from one another. This is how seemingly random behavior can emerge from a deterministic system with no external source of randomness.
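Poincaré's point can be demonstrated in a few lines. The sketch below uses the logistic map, a standard textbook example of a deterministic rule with sensitive dependence (the starting values and the size of the perturbation are illustrative choices, not anything from Poincaré):

```python
# Logistic map x_{n+1} = r * x * (1 - x): a fully deterministic rule,
# no randomness anywhere in the equations.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a = 0.4            # one initial condition
b = 0.4 + 1e-10    # a nearly indistinguishable one

max_gap = 0.0
for n in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# The difference of one part in ten billion grows by many orders of
# magnitude, until the two trajectories bear no resemblance to each other.
print(max_gap)
```

The same deterministic rule, applied to two initial conditions that no realistic instrument could tell apart, yields histories that differ on the scale of the system itself.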
Poincaré was way ahead of his time. His insight gained concrete support in 1963, when Edward Lorenz found, by accident, that even simple computer models of the weather show this sensitive dependence on initial conditions.
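Lorenz's 1963 model boils down to three coupled equations. A minimal sketch, using his classic parameters (sigma = 10, rho = 28, beta = 8/3) but with a crude forward-Euler integrator and illustrative starting points of my own choosing:

```python
# Lorenz's 1963 convection model, integrated with a simple Euler step.
# Parameters are Lorenz's classic choices; step size and initial
# states are illustrative assumptions.
def lorenz_step(state, dt=0.001):
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

s1 = (1.0, 1.0, 1.0)
s2 = (1.0 + 1e-8, 1.0, 1.0)   # differs only in the eighth decimal of x

max_sep = 0.0
for _ in range(50_000):        # 50 time units
    s1, s2 = lorenz_step(s1), lorenz_step(s2)
    sep = sum((u - v) ** 2 for u, v in zip(s1, s2)) ** 0.5
    max_sep = max(max_sep, sep)

# A perturbation of 1e-8 ends up separating the two "weather forecasts"
# by a distance comparable to the size of the whole attractor.
print(max_sep)
```

This is the situation Lorenz stumbled on: he restarted a simulation from printed values rounded to a few decimals, and the rerun soon diverged completely from the original.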
Numbers fool us and quantified measurements are very rarely the whole picture. Stories matter more than we think.