Few great scientific principles have been more abused than that of the famous German physicist Werner Heisenberg. Known as the Heisenberg Uncertainty Principle, it states that certain pairs of physical quantities, such as position and momentum, cannot both be known at the same time to better than a certain very small limit of precision. It was one of the great discoveries in quantum mechanics in the 1920s, a decade studded with such discoveries, which resulted in the development of modern quantum theory. Among other things, the modern theory states in mathematical terms the implications of the wave nature of both matter and energy. That mathematics can be used to derive Heisenberg’s famous principle.
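For the record, the standard formulation (not quoted from any of the works discussed here) bounds the product of the uncertainties in position and momentum by half the reduced Planck constant, which is the "very small number" in question:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar \approx 1.055 \times 10^{-34}\ \mathrm{J\,s}
```

Nothing in what follows depends on the details, but it is worth keeping in mind that the principle is a precise quantitative statement, not a vague remark about the limits of knowledge.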
Unfortunately, because many things in life are uncertain, the principle has been abundantly misapplied to a whole range of uncertainties to which it has no relevance whatsoever, just as the Theory of Relativity has been misapplied to a whole host of things that happen to be relative to each other in one way or another. Some of the misapplications and misconceptions are more subtle than others. I recently ran across an interesting one in a book entitled Swarm Intelligence, by James Kennedy and Russell Eberhart, the former a social psychologist and the latter an expert in evolutionary computation. The book argues that “intelligent human cognition derives from the interactions of individuals in a social world and that the sociocognitive view can be effectively applied to computationally intelligent systems.” I actually bought it to try programming a few of the computational examples contained therein, but found that it was as much a statement of ideology as of computational theory, larded with all the usual allusions to all the usual suspects among the philosophers who are currently fashionable in works of that genre. Among other things there is a discussion on page 11 of whether such a thing as “true randomness” exists, or whether, on the contrary, in the words of the authors, “The basis of observed randomness is our incomplete knowledge of the world. A seemingly random set of events may have a perfectly good explanation; that is, it may be perfectly compressible.”
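The computational technique Kennedy and Eberhart are best known for is particle swarm optimization, and the book's examples revolve around it. A minimal sketch of the canonical algorithm follows; the parameter values and the function being minimized are my own illustrative choices, not taken from the book:

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimize f over [-5, 5]^dim with a basic particle swarm.

    w is the inertia weight; c1 and c2 weight the pull toward each
    particle's personal best and the swarm's global best, respectively.
    """
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best-seen position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best so far

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Note, incidentally, that the "randomness" driving the swarm here is a deterministic pseudorandom generator, which is perhaps a fitting irony given the authors' views on the subject.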
As you may have gathered, all this eventually relates to the question of free will, and whether the universe is truly random or “stochastic” at some level, or, on the contrary, purely deterministic. I will not presume to answer that fascinating question here. However, the authors appear to be of the opinion that the latter is the case. What caught my eye was one of the arguments they used to support that point of view. Allow me to quote them at length:
For most of the 20th century it was thought that “true” randomness existed at the subatomic level. Results from double-slit experiments and numerous thought experiments had convinced quantum physicists that subatomic entities such as photons should be conceptualized both as particles and waves. In their wave form, such objects were thought to occupy a state that was truly stochastic, a probability distribution, and their position and momentum were not fixed until they were observed. In one of the classic scientific debates of 20th-century physics, Niels Bohr argued that a particle’s state was truly, unknowably random, while Einstein argued vigorously that this must be impossible: “God does not play dice.” Until very recently, Bohr was considered the winner of the dispute, and quantum events were considered to be perhaps the only example of true stochasticity in the universe. But in 1998, physicists Duerr, Nonn, and Rempe disproved Bohr’s theorizing, which had been based on Heisenberg’s uncertainty principle. The real source of quantum “randomness” is now believed to be the interactions or “entanglements” of particles, whose behavior is in fact deterministic.
In fact, the paper referred to, entitled “Origin of quantum-mechanical complementarity probed by a ‘which-way’ experiment in an atom interferometer,” is an elegant piece of work, but in no way, shape, or form does it demonstrate that “the real source of quantum ‘randomness’ is the interactions or ‘entanglements’ of particles, whose behavior is in fact deterministic.” It can be found in its entirety here. The math and scientific notation are a little dense, but if you simply trust the authors of the paper on those matters, and just look at the discussion and conclusions at the end, you should be able to see without too much difficulty that it in no way has the significance that Kennedy and Eberhart assign to it. They seem to think that it somehow represents a refutation of the Heisenberg principle. In fact, the authors of the paper explicitly state the contrary.
Their paper is one of the many that have sought to shed light on the famous experiment in which interference patterns are formed by particles passing through a double slit, even when single particles are passed through one at a time, defeating any attempt to explain the phenomenon based on classical (non-quantum) physics. The question they attempt to answer is not whether the Heisenberg principle is itself valid or not, but merely whether the principle must be invoked to explain the fact that “measuring” which one of the slits each particle passes through causes the loss of the interference pattern, or whether, on the contrary, some other mechanism can enforce the change. It turns out that, in fact, the Heisenberg principle is not necessary. Which of the “slits” (in this case the experiment is done with standing light waves rather than physical slits) each particle passes through can be measured by much more subtle means that have orders of magnitude less effect on particle momentum than would be necessary to justify invoking it. In other words, what the authors are really saying is not that the Heisenberg principle is wrong, or has been superseded by some new “deterministic” theory, but merely that it is not true that it must be invoked to explain “complementarity,” the ability of quantum mechanical entities to behave as either particles or waves.
All this is very intriguing. One wonders to what extent this meme that the experiment in question “proves” that we live in a deterministic universe is making the rounds among people who don’t actually understand its implications one way or the other. It would hardly be the first time that authors have been cited as authorities for ideas that never appeared in their work. To what extent do the authors of the paper realize they’ve become “famous” in this way?
And what of the great questions of free will, and whether we live in a deterministic or stochastic universe? The world is full of people who are cocksure they know the answer. They just don’t agree on what it is. Alas, I fear we are not at the point at which we can really say one way or the other. Before that can happen, it will be necessary for us to figure out the fundamental nature of all this stuff around us, and why it all exists to begin with. We are yet far from having that knowledge, a fact that makes life that much more exciting. There are still great new worlds for us to discover out there.