Before the quantum revolution, the scientific depiction of the natural world was a deterministic one, i.e., once all the initial parameters of a physical system were known, the evolution of the system could be predicted with exact precision. It was this ability to make exact predictions derived from empirical knowledge that made up the backbone of science, with the field of physics painting this deterministic picture of the world on the fundamental level. From describing the motions of the stars, to the behavior of the atoms which made up our bodies and the materials around us, physics had an advantage over the other sciences, such as biology and chemistry, in that its precision was unmatched, e.g., the speed with which an object would hit the ground could be calculated exactly, while how the human body would respond to a certain chemical couldn't always be precisely predicted. Even in statistical physics–thermodynamics–where the components of a system were too innumerable to treat individually, a deterministic view was suggested. Though an ensemble of particles may approach the innumerable, nothing in the nature of thermodynamic theory suggested that the trajectories of these particles were fundamentally unknowable; it was simply a practical matter to treat the system statistically rather than to treat each molecule individually, though, in principle, each molecule could be isolated and its properties measured. It was this line of reasoning that would inspire the realist position following the quantum revolution.
But once it was shown that Niels Bohr's model of the atom was incorrect, as was Schrödinger's picture of the electron as a continuous stream of charge distributed around the atom, physical models began to lose precedence in physics.1 Mathematical formalism took the stage in the atomic realm, because no physical model seemed able to describe what experiment was measuring when it came to sub-atomic particles. Electrons, when treated probabilistically, were shown to obey wave equations, and their characteristics, within certain limits, could be measured. This treatment introduced limits that were at odds with previously established principles in physics, and much debate has gone into what the wave equation of a particle actually represents physically. What quantum theory suggested was that the location of a particle could not be predicted beyond the realm of probability (as a matter of principle, not just practicality), and that measurable quantities, such as position and momentum, could not be simultaneously measured with arbitrary precision, i.e., exact knowledge of one forbade exact knowledge of the other. This concept was mathematically formulated in Heisenberg's uncertainty principle, originally published in the German physics journal Zeitschrift für Physik in 1927–and it has been a thorn in the philosopher's side ever since.
To the classical physicist (and the aforementioned theory of determinism in philosophy), these ideas were anathema. It was one thing to say that it is impractical to measure a certain property of a particle, e.g., to measure the trajectory of a specific air molecule, but it is another thing to say that, in principle, a particle’s property couldn’t be measured–that nature was imposing limits as to what it revealed about itself on the fundamental level. If particles’ trajectories were fundamentally random, and the uncertainty principle was a fundamental law of nature, a deterministic view of the universe was now an anachronism. In response to this new, stochastic view of the universe, Einstein made his famous “God does not play dice” statement,2 illustrating his view that the true trajectory of a particle was not a matter of uncertainty, but that it depended on the initial conditions of the system, and if those conditions were known, its trajectory could be predicted and described precisely.
Yet, despite these classical and philosophical oppositions, quantum theory has remained supreme in its respective realm. Its predictions about the fundamental indeterminism of our universe on the atomic scale have been experimentally verified, and though we may not like it, it seems probability governs our world–not simple, linear cause and effect as was previously thought. Even the phenomenon of quantum entanglement, an apparent paradox in the mathematical formalism of quantum mechanics which Einstein brought to light, has been physically demonstrated.3 Today, most physicists have capitulated to the inherent, counter-intuitive realities of nature that the Copenhagen and other non-deterministic interpretations of quantum mechanics suggest. It is widely accepted that knowledge of a quantum mechanical system affects the system, "true" particle trajectories do not exist, matter and light particles are also waves, and an electron can be in two places at once. These phenomena are both what we observe and what the math tells us, and therefore, the physics community must roll with it.
But this hasn't stopped a small minority of physicists from clinging to a deterministic universe; this position is known as the realist interpretation of quantum mechanics. This view is not a denial of the results of quantum theory, which have been confirmed by numerous experiments–this can't be emphasized enough. It is instead an insistence that the picture of the quantum realm is not complete, on the grounds that quantum mechanics, though useful and consistent, has yet to provide a physical model for the universe–or at least, one that makes even a bit of sense.
Quantum mechanics is a theory without a clear publication date or founder. The formation of the theory consists of the aggregated works of many twentieth-century physicists, such as Niels Bohr, Enrico Fermi, Erwin Schrödinger, Wolfgang Pauli, and Richard Feynman.4 Even Albert Einstein, already noted as a later opponent of the theory, couldn't help but contribute to its formation. His work on the photoelectric effect, for which he received the Nobel Prize in Physics in 1921, helped establish the modern understanding of discrete electron energy levels in the atom–the bread and butter of quantum mechanics–and the relationship between the energy and frequency of light.5
Another one of the early physicists who helped construct quantum theory was Louis de Broglie. His initial work was on the theoretical development of matter waves, presented in his 1924 PhD thesis. In this brave and groundbreaking doctoral defense, de Broglie predicted that all matter had an associated wavelength, this wavelength becoming more salient as the scale of matter involved decreased, i.e., it wouldn't be obvious for cars and baseballs, but it would be for sub-atomic particles. This prediction was confirmed by the Davisson-Germer electron diffraction experiments at Bell Laboratories–a serendipitous discovery–and de Broglie was awarded the Nobel Prize in 1929 for his insight into the wave-particle duality exhibited, not only by light, but by matter as well.
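De Broglie's relation, wavelength = h/p, makes his scale argument easy to check numerically. Below is a minimal sketch with rounded constants; the chosen masses and speeds are illustrative assumptions, not values from any particular experiment:

```python
# De Broglie wavelength: lambda = h / (m * v)
H = 6.626e-34  # Planck's constant, J*s (rounded CODATA value)

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Wavelength associated with a moving mass, lambda = h / p."""
    return H / (mass_kg * speed_m_s)

# A pitched baseball (~0.145 kg at 40 m/s): wavelength far below any measurable scale.
baseball = de_broglie_wavelength(0.145, 40.0)

# An electron (9.109e-31 kg) at roughly 1% of light speed: wavelength
# comparable to the atomic spacing in a crystal, which is why diffraction shows up.
electron = de_broglie_wavelength(9.109e-31, 3.0e6)

print(f"baseball: {baseball:.2e} m")  # ~1e-34 m, utterly negligible
print(f"electron: {electron:.2e} m")  # ~2e-10 m, the scale of crystal lattices
```

The ten-billion-fold gap between the two results is de Broglie's point: the wave nature of matter is real for everything, but only salient for the very small.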
If de Broglie’s ideas about the wave-particle duality of all matter were true, they posed a challenge not just for physics, but for the philosophy of science as well. If an electron has a wavelength, then where is the electron, or better, where is the wave? The answer isn’t clear because waves are spread out over a range of space. In order to define a wavelength, one loses the ability to define a position and vice versa. Yet, an electron still can have a defined position as demonstrated by experiments which reveal its particle-like nature; particles aren’t spread out in space. It was from these considerations that Werner Heisenberg developed the famous and already mentioned uncertainty principle. To define a position, an experimentalist must forfeit information about its wavelength (momentum) or vice versa. It was the development of this principle that marked the downfall of determinism in science.
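Heisenberg's trade-off can likewise be put to numbers. A minimal sketch of the bound Δx·Δp ≥ ħ/2, with rounded constants and an atom-sized confinement chosen purely for illustration:

```python
HBAR = 1.055e-34        # reduced Planck constant, J*s (rounded)
ELECTRON_MASS = 9.109e-31  # kg (rounded)

def min_momentum_spread(delta_x):
    """Lower bound on momentum uncertainty from delta_x * delta_p >= hbar / 2."""
    return HBAR / (2.0 * delta_x)

# Pin an electron's position down to an atom-sized region (~1e-10 m)...
dp = min_momentum_spread(1e-10)

# ...and the principle forces an enormous spread in its velocity:
dv = dp / ELECTRON_MASS

print(f"delta_p >= {dp:.2e} kg*m/s")  # ~5e-25 kg*m/s
print(f"delta_v >= {dv:.2e} m/s")     # ~6e5 m/s
```

The tighter the position is defined, the wilder the momentum becomes; this is the forfeiture of information the paragraph above describes, expressed as arithmetic.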
Yet, de Broglie did not originally believe that the probabilistic wave treatment of matter warranted an indeterministic interpretation of the universe. In 1927, the same year his matter-wave theory was experimentally confirmed, he proposed the pilot-wave theory, a suggestion that the wave equation in quantum mechanics could be interpreted deterministically. Though he eventually abandoned this interpretation to follow a more mainstream one, the theoretical physicist David Bohm would later continue his work, and the pilot-wave theory would also become known as the de Broglie-Bohm theory.6
In 1952, Bohm published a paper, written during his time at Princeton, which espoused his realist interpretation of quantum mechanics. In the paper, he suggested the idea that quantum theory was incomplete and that "hidden variables" were not taken into account in its formulation. These hidden variables would explain why the theory was so far probabilistic, and if they were taken into account, the predictive capabilities of the theory would become exact. That is, he believed there were more parameters to consider in the wave equation, and quantum theory had so far failed to predict exact results because not all of the pertinent variables were accounted for. (This is analogous to trying to measure the total kinetic energy of the Earth while only considering its linear kinetic energy and not its rotational energy. You won't get a precise answer until you account for both.)
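The Earth kinetic-energy analogy above can be made concrete. A minimal sketch with rounded physical constants, treating the Earth as a uniform sphere (an approximation; the real Earth is denser toward its core):

```python
import math

# Translational kinetic energy of the Earth: (1/2) m v^2
M = 5.97e24   # mass, kg
V = 2.98e4    # orbital speed around the Sun, m/s
ke_linear = 0.5 * M * V**2

# The "hidden" term: rotational KE = (1/2) I w^2,
# with I = (2/5) M R^2 for a uniform sphere.
R = 6.37e6                 # radius, m
w = 2 * math.pi / 86164    # angular speed, rad/s (one sidereal day)
I = 0.4 * M * R**2
ke_rot = 0.5 * I * w**2

total = ke_linear + ke_rot
print(f"linear only  : {ke_linear:.3e} J")
print(f"with rotation: {total:.3e} J")  # larger: the omitted variable mattered
```

Leave out the rotational term and the answer is systematically wrong, however precisely you measure the orbital speed; Bohm's claim was that quantum theory's probabilities could reflect an omission of the same kind.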
Bohm suggested introducing a “quantum mechanical potential energy” to begin a new mathematical treatment of the theory. The double-slit experiment, in which a single electron passes through a single slit exhibiting particle-like properties, while passing through a double-slit exhibiting wave-like properties, could be explained by postulating that the quantum mechanical potential energy of the system was changed when the second slit was opened or closed. The realist’s goal was to then discover the hidden variables and physical phenomena that would induce this change in the said potential energy of the system. In particular, Bohm pointed out that an expansion of the theory in this direction might be needed to solve the problems of quantum mechanics on the nuclear scale, where its laws broke down, and that developments in the direction of a more complete formulation of the theory would expand its domain.7
Bohm also expressed his view that though quantum mechanics was useful, consistent, and elegant, it did not justify its denouncement of determinism–the philosophy behind every field of science, not just physics. To Bohm, nothing in the theory suggested that the Copenhagen–mainstream–interpretation revealed the physical reality of nature, but rather that the theory was still developing, and that, in addition to all the theoretical complications, the instruments used in the experimental verification of the theory were bound to interfere with the precision of the measurements. After all, this was the first time in history that objects of such small size were being precisely measured for their exact location and properties. Renouncing a deterministic world view that the rest of science reinforced didn’t seem justified simply because a practical theory which suggested otherwise had been developed. Bohm, like Einstein, was sure a more complete and physically-sensible theory would one day supplant it.
In fact, Einstein didn't wait for the future; even after having already developed his groundbreaking theory of relativity and winning the Nobel Prize for the photoelectric effect–though it's widely and mistakenly thought he won it for relativity–Einstein continued his work in theoretical physics, his eyes set on bringing absolute determinacy back into science. In 1935, Einstein, along with his colleagues Boris Podolsky and Nathan Rosen, published a paper demonstrating the incompleteness of the quantum mechanical description of reality by the wave function.8 In the mathematics of quantum mechanics, Einstein and his colleagues found a paradox, one that predicted the phenomenon of two or more particles becoming "entangled." This meant that two or more particles would sometimes need to be described by a single quantum state, even when the respective particles were separated by distances far larger than those usually dealt with on the quantum scale–so large that a signal traveling at the speed of light could not communicate information about the shared state between the two particles in time for them to respond accordingly, since the transmission of information is limited by the finite speed of light.
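The "single quantum state" shared by two entangled particles can be sketched numerically. What follows uses the standard textbook Bell state rather than anything taken from the EPR paper itself; it is a minimal illustration of what "described by one state" means:

```python
import math

# Two particles, each with a two-valued property (a qubit). The joint state
# is a vector of amplitudes over the four outcomes |00>, |01>, |10>, |11>.
amp = 1 / math.sqrt(2)
bell = [amp, 0.0, 0.0, amp]  # the Bell state (|00> + |11>) / sqrt(2)

# Measurement probabilities are squared amplitudes:
probs = [round(a * a, 10) for a in bell]
print(probs)  # [0.5, 0.0, 0.0, 0.5]

# Half the time both particles read 0, half the time both read 1 --
# never a mismatch -- no matter how far apart the particles are.
# Neither particle has a definite state of its own; only the pair does.
assert abs(sum(a * a for a in bell) - 1.0) < 1e-12
```

Note that the vector cannot be split into one independent state per particle; that inseparability is exactly what troubled Einstein, Podolsky, and Rosen.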
This meant that, for entanglement to occur, action at a distance was required, a concept regarded as untenable in most fields of physics–and one that bothered the ancient Greek philosophers as well. It suggested that the physical system in question was non-local, and that for action at a distance to occur, the principle of locality must be violated. The importance of this principle rests on the assumption that in order for information to be transmitted between two objects, something must do the transmitting. Be it a particle, a field, or a wave, the information must be physically communicated somehow.
In 1964, the physicist John Stewart Bell proposed a theorem concerning the predictions of quantum entanglement. Bell showed that no local hidden-variable theory could reproduce all the predictions of quantum mechanics: if hidden variables exist, they must be non-local.9 Going into the technical details of Bell's Theorem is beyond the scope of this article, but its predictions concerning the non-locality of the quantum world were experimentally borne out–though proving the nonexistence of hidden variables outright would amount to proving a negative, something beyond the capabilities of science, at least in its current philosophical form. Quantum entanglement was experimentally verified, proving Einstein and his colleagues wrong and making their predicted physical paradox a reality.10
Today, there are an appreciable number of physicists who subscribe to the realist interpretation, an esoteric view in the already esoteric discipline of quantum physics. Dr. Emilio Santos of the University of Cantabria in Spain is one of the leading physicists to subscribe to this view. Yet to be convinced that Bell's Theorem refutes the possibility of Bohm's hidden variables, Dr. Santos posits that the apparent stochasticity of the quantum universe is due to the interference of measuring apparatuses with their systems in quantum mechanical experiments, as well as the presence of vacuum fluctuations in space-time.1
His conception of the uncertainty principle stems from the unavoidable reality that, in a quantum system, the researcher must examine a microscopic object–which obeys quantum-mechanical laws–while using a macroscopic measuring tool–which obeys Newtonian laws.3 So far, no known theory can link the two realms together. To try and work around this difficulty, Niels Bohr, one of the first developers of quantum mechanics, proposed the correspondence principle, suggesting that quantum laws transition into classical ones as we pass from the quantum world to the macroscopic one–formally, in the limit as Planck's constant becomes negligible.1,3 However, it is philosophically contradictory to claim that some aspects of our universe are deterministic and others are not, as determinism implies that all components of a system have predictable, causal trajectories. It seems odd to claim that predictable systems are based on unpredictable foundations. Though he does not state this explicitly in his papers, it's apparent that Dr. Santos doesn't subscribe to Bohr's correspondence principle, and he believes the radically different natures of the experimental system and the measuring apparatus are more to blame.
In addition to the apparatus problem, Dr. Santos argues that the ostensible indeterminacy of quantum mechanics more likely arises from vacuum fluctuations.1 He ascribes the apparent probabilistic nature of quantum theory to the inherent difficulty of measuring particles on such small scales, where the space they inhabit itself affects the system. Even vacuums participate in quantum mechanical activity, and because there are no discontinuities in ordinary space, no system can be truly isolated or claim to be local. To Dr. Santos, while non-locality must be accepted, this does not preclude a realist interpretation of quantum theory, as it does not prove inherent, natural limits to the knowledge we may possess of any physical system; it simply suggests that the systems we study are full of too much background "noise" to precisely measure any individual particle–in the same way there's too much noise in a crowded room to precisely record any one particular conversation. Dr. Santos suggests that, until a physical model is proposed or an advancement in the mathematical formalism of the theory suggests a realist interpretation, quantum mechanics is incomplete. He says, "I do not propose any modification of that core, but claim that the rest of the quantum formalism is dispensable."1
It would be interesting to note the technological implications the realist interpretation would have for the modern field of quantum computing. Ordinary computers make use of binary, reducing all stored data to a collection of ones and zeroes arranged in a particular order for each datum. Relatively speaking, computers are limited in that all their processes are collections of yes-or-no, on-or-off, binary statements, which the computer must step through in order to perform any command.
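The binary encoding described above can be sketched in a few lines; Python is used here purely for illustration:

```python
# Every classical datum reduces to a pattern of bits. E.g., the character 'Q':
code = ord('Q')             # its numeric code, 81
bits = format(code, '08b')  # the same number as eight binary digits
print(bits)                 # 01010001

# A classical register holds exactly ONE such pattern at a time;
# reading it back recovers the single stored value, nothing more.
assert chr(int(bits, 2)) == 'Q'
```

It is this one-pattern-at-a-time constraint that the next paragraph contrasts with quantum hardware.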
Quantum computing would overhaul this limitation of binary by taking advantage of the wide range of quantum phenomena available to us. Instead of a computer stepping through a collection of yes-or-no statements one at a time, the processors could take advantage of superposition and quantum entanglement to perform a number of different computational processes simultaneously–something impossible in binary. The fundamental indeterminacy of quantum mechanics makes these wild processes possible. Instead of electric transistors converting circuit data into binary–current flowing here or there–quantum computer chips make use of the fact that, though counterintuitive, an electron can be in several places at once, meaning it can run down several different circuit pathways at once, and therefore perform several different computations at once. While no quantum computer of practical application has yet been developed, such devices do exist, and it may be only a matter of time until their capabilities supplant those of digital computers. Already, quantum computing data is stored in something called a "qubit," the quantum mechanical datum meant to replace the binary computing "bit." So far, quantum computers can only handle a measly 16 qubits, but most developers in the field are confident an expansion of quantum computing capabilities is on the horizon.
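The difference between bits and qubits can be sketched with plain state vectors. A minimal illustration under the standard amplitude formalism (the 16-qubit figure comes from the paragraph above):

```python
import math

# n classical bits hold ONE of 2**n patterns; n qubits are described by
# a vector of 2**n amplitudes -- all patterns carried at once.
n = 16
print(2 ** n)  # 65536 amplitudes needed to describe a 16-qubit register

# A uniform superposition of 3 qubits: every 3-bit pattern equally weighted.
n = 3
amp = 1 / math.sqrt(2 ** n)
state = [amp] * (2 ** n)

# One "global" operation on this register transforms all 8 amplitudes in a
# single step -- the simultaneity the paragraph above alludes to. Probabilities
# (squared amplitudes) must always sum to 1:
probs = [a * a for a in state]
assert abs(sum(probs) - 1.0) < 1e-12
```

The catch, and the reason measurement is so central to the debate in this essay, is that reading the register out yields only one of those patterns, with the stated probabilities.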
It is somewhat unclear what a deterministic revolution in quantum theory would mean for quantum computing. This all would depend on what exactly the hidden variables and their physical reality would represent. Would the discovery of the hidden variables reveal that, in actuality, an electron cannot be in two places at once? This is unlikely, as experiment has revealed such a phenomenon to occur, but then again, what if the hidden variables revealed that our measurements did indeed influence the measured physical systems beyond the limits of forgivable scientific error, and that our measurements brought about the so far paradoxical results of electrons–and the rest of matter–having almost phantom-like properties? Alas, being that the realist interpretation of quantum mechanics is not the focus of many researchers in quantum theory, its implications with respect to quantum computing have not been fully considered. It could either expand or kill the field. Maybe the reason quantum computers cannot deal with more than 16 qubits is because we are asking nature to do something that is fundamentally against its mechanics, despite the fact that our tentative mathematical theories suggest it is possible.
But technological considerations aren’t the only ones to be had concerning the realist interpretation of quantum mechanics. The philosophical and religious implications of the realist interpretation versus the more mainstream, Copenhagen interpretation are quite profound, and the debate between determinism and uncertainty in quantum mechanics has inspired many philosophers to consider what each interpretation means for the limits of human free will. If the laws of nature are completely deterministic, and every event in the history of the universe can be traced back through particle trajectories to the big bang, then it follows, through inductive reasoning, that all events, even human thoughts, wants, and actions, are simply the reactions of atoms and molecules to physical laws, leaving no room for unnatural agents to participate in the system. In this view of the universe, one doesn’t make choice A instead of B because one is a free agent in a universe of otherwise natural laws, one makes that choice because the information about those two choices induced a certain chemical reaction in the mind of the chooser (the mind is made of atoms as well), and in the same way a rock falls under the influence of gravity, the matter that composes the human mind reacts under the influence of causal particle mechanics.
But if the universe is indeterministic, as suggested by the mainstream interpretations of quantum mechanics, it means human choices aren't predetermined, and this indeterminacy ostensibly leaves room for human influence. Yet, it remains to be shown how this position can be maintained. Even if all human decisions and actions were not determined at the moment of the big bang, and all the events in the universe could be reduced to the unpredictable nature of stochastic particles, this leaves nowhere for a non-natural influence–free will–to come into the picture. Human choices are still the result of particle trajectories, whether or not those trajectories can be determined, and whether or not the trajectories of those particles are linear or non-linear. Until some unnatural agent is introduced into the complex but natural configuration of the human mind–unnatural in that it would be outside the laws of nature–the position that humans have free will cannot be maintained without appealing to notions of the supernatural. And nature does not approach the supernatural as its systems approach complexity, even the complexity of the human mind. To claim otherwise is to claim that the molecules which make up the brain follow different physical laws than the rest of the molecules in the universe. And if you disagree, I can't blame you; it's not like you had a choice in the matter anyway.
But philosophical debate aside, as one of the most successful and useful theories in all of theoretical physics, quantum mechanics does seem to suggest the indeterministic realities of nature. We get our understanding of semiconductors, the devices that power your smartphone, through quantum mechanics, and we can't discard the probabilistic elements without discarding our understanding of the theory altogether. In physics, where experiment is king, and in science, where nature is under no obligation to make sense to us, it seems stubborn to ignore the continuing theoretical and experimental verification of the probabilistic nature of the universe. Yet, the idea that this limit is one of practicality, not principle, is a hard one to overcome. Human science has reduced every other aspect of the universe down to the simple but fascinating level of causal mechanics; it is tempting to say that quantum mechanics will one day reach this point as well.
1 E. Santos, Foundations of Science, 20, 357-386 (2015) or arXiv:1203.5688 [quant-ph].
2 W. Hermanns, Einstein and the Poet: In Search of the Cosmic Man, Branden Books; 1st Ed. (2013), p. 58.
3 V. Singh, Materialism and Immaterialism in India and the West: Varying Vistas (New Delhi, 2010), p. 833-851 or arXiv:0805.1779 [quant-ph].
4 J. Mehra, H. Rechenberg, The Historical Development of Quantum Theory (New York, 1982).
5 "Albert Einstein – Facts". Nobelprize.org. N.p., 2017. Web. 24 Feb. 2017.
6 F. David Peat, Infinite Potential: The Life and Times of David Bohm (1997), p. 125-133.
7 D. Bohm, Phys. Rev. 85, 166 (1952).
8 A. Einstein, B. Podolsky, and N. Rosen, Phys. Rev. 47, 777 (1935).
9 H. P. Stapp, Nuovo Cim B 29, 270 (1975).
10 A. Witze, "75 Years of Entanglement". Science News. N.p., 2017. Web. 24 Feb. 2017.
Essay by David Kyle Johnson.