Copenhagen

More from the archives. This time an undergrad essay I wrote for Huw Price’s class Philosophy of Physics II, on the Copenhagen Interpretation of Quantum Mechanics, EPR Paradox, Bell’s Inequality and Alain Aspect’s experimental resolution of all three.

Causality, determinism and the classical view of the Universe

Newton, in the late seventeenth century, was the first to formulate a thorough mathematical formalism to accurately describe physical interactions observed both astronomically (planetary motion) and on the everyday scale (projectile motion). The philosophical implications of his theories were that the universe, and all the particles of which it was made, evolved in a completely deterministic and clearly defined manner. Given a complete description of the state of a given system (including the forces, velocities and positions of the elements that make it up), one could then predict its behaviour for the rest of time. This ability to predict the future, in principle, was only limited by the extent of one’s knowledge of the present, which in turn was only limited by the accuracy of one’s measuring instruments. Thus, it was believed that knowledge about any given system was only restricted by one’s technological ability to make measurements.

Although major inadequacies in Newton’s theories were identified and corrected by Albert Einstein’s special theory of relativity, those corrections were mainly concerned with the nature of space-time; the basic belief in causality and determinism in the universe was maintained. This view of the world remained largely unchallenged until the advent of quantum mechanics and, in particular, what has come to be known as the Copenhagen interpretation.

The Copenhagen interpretation of Quantum Mechanics

Prior to the 1920s and the development of quantum mechanics it was believed that atomic entities such as protons and electrons possessed strictly particle-like properties. However, electron diffraction experiments first performed in 1927 showed this view to be incomplete. The clearest illustration is the double-slit experiment, in which electrons are projected at a screen with two closely spaced holes in it and the resulting pattern is viewed on a second screen behind it. If electrons existed only as particle-like entities, the resulting pattern would consist of two intensity maxima, one directly opposite each hole. This is not the case: an interference pattern is seen, typical of that produced in similar experiments with wave-like light. Thus electrons can display both particle-like and wave-like properties. This duality of character is known as the wave–particle dilemma.

Although Einstein’s 1905 paper on the photoelectric effect (the conclusions of which were confirmed experimentally in 1923 by Arthur Compton) had shown a similar wave–particle duality with respect to light, a further effect was observed in the two-slit experiment which was incomprehensible from a classical view of the world. The existence of the interference pattern requires each electron to pass through both holes of the first screen, thereby denying its particle-like character. However, if one uses a detecting instrument to determine which hole each electron goes through, without hindering its path, the interference pattern disappears, leaving two single maxima on the viewing screen. What is more, if the detecting instrument is left in the setup but turned off, the pattern reappears. What seems to happen, in effect, is that electrons only allow themselves to be “seen” with either particle-like properties or wave-like properties, but not both.

Bohr, one of the founders of modern quantum mechanics, explained these results by what he called the principle of complementarity. This principle states that both theoretical pictures of fundamental particles (such as electrons and the like), as particles and as waves, are equally valid, complementary descriptions of the same reality. Neither description is complete in itself, but each is correct in appropriate circumstances: an electron should be considered as a wave in the case of the double-slit interference pattern, and as a particle in the case where it is being detected by a particle detector. From this, Bohr concluded that the results of any measurement are inherently related to the apparatus used to make the measurement. Moreover, he believed that, in a sense, the apparatus actually gives the property being measured to the particle: experiments designed to detect particles always detect particles; experiments designed to detect waves always detect waves.

Bohr’s principle of complementarity introduces an element of uncertainty into any measurement and, in effect, requires it. In a more specific experimental sense, Heisenberg derived a set of relations from the equations of quantum mechanics which show that any two non-commuting dynamical properties of a system (properties whose mathematical operators A and B do not obey the commutativity relation AB = BA) cannot both be measured to arbitrary accuracy. An example of what is known as the Heisenberg uncertainty principle involves the position, x, and the momentum, p, of an elementary particle. The principle says that the uncertainties in any simultaneous measurement of position and momentum, Δx and Δp respectively, are restricted by the relation Δx⋅Δp ≥ ℏ/2 (where ℏ is Planck’s constant divided by 2π). Thus the momentum of a particle may be measured to arbitrary accuracy, but only at a sacrifice to the accuracy of any measurement of its position, and vice versa. A similar relation, ΔE⋅Δt ≥ ℏ/2, applies to measurements of energy, ΔE, and time, Δt.
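The trade-off can be illustrated numerically. For a Gaussian (minimum-uncertainty) wave packet the bound is saturated, so fixing the position spread determines the smallest possible momentum spread. The sketch below assumes an arbitrary, hypothetical localisation of one ångström:

```python
HBAR = 1.054571817e-34  # reduced Planck constant h/(2*pi), in J·s

def min_momentum_spread(sigma_x: float) -> float:
    """Smallest momentum uncertainty allowed for a given position spread,
    attained by a Gaussian wave packet: sigma_p = hbar / (2 * sigma_x)."""
    return HBAR / (2 * sigma_x)

sigma_x = 1e-10                      # assumed localisation to ~1 ångström
sigma_p = min_momentum_spread(sigma_x)

# The product sits at the Heisenberg bound hbar/2: halving sigma_x
# doubles the unavoidable sigma_p, and vice versa.
print(sigma_x * sigma_p)  # ≈ 5.27e-35 J·s, i.e. hbar/2
```

Any other wave packet only does worse: its uncertainty product exceeds ℏ/2 rather than matching it.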

The Heisenberg uncertainty principle can be considered a special case of Bohr’s more general complementarity principle, and together they provide the conceptual framework for what has come to be known as the Copenhagen interpretation of quantum mechanics.

Completeness

Although the discussion so far has concentrated on the restrictions placed on experimental measurements by quantum mechanics, the implications of the Copenhagen interpretation go much further, challenging the previous (classical) conceptions of the nature of reality itself. It was the belief of Bohr and others of the Copenhagen school that uncertainty is not merely a mathematical artifact of quantum mechanics but a reflection of an inherent ambiguity in the physical reality of the universe.

A crucial result of this interpretation is that nothing exists in objective reality which is not contained in the mathematical formalism of quantum mechanics. In other words, a quantum system can only be said to possess a dynamical property if it is describable by a quantum state for which that property is assigned a probability of one. Properties such as the spin, charge and rest mass of a system (all of which have definite quantum numbers in the formalism) can be said to have a concrete reality, and can, in principle, be measured to any accuracy regardless of the situation. In contrast, essentially classical properties such as position and momentum cannot be said to exist at the quantum level in the same way as they do at the macroscopic level. This leads to the view that a measured property has no reality until the measurement is made, and it was this denial of objective reality, along with the fact that quantum mechanics only allows statistical predictions about the universe, which Einstein disputed. It led him to argue that quantum mechanics cannot reasonably be accepted as a complete description of the universe.

A useful analogy in considering the completeness of quantum mechanics is the comparison with classical thermodynamics. Classical thermodynamics is successful in predicting the equilibrium properties of macroscopic systems, but is unable to describe such phenomena as thermal fluctuations, Brownian motion, and the like. A similar situation exists for quantum mechanics with respect to processes such as nuclear and subatomic particle decay. Quantum mechanics can predict the average decay time for a large number of nuclei or subatomic particles, but is unable to predict the time of any single decay or to provide an explanation for fluctuations about the mean. Hence, it is conceivable that quantum mechanics is incomplete in the same way that classical thermodynamics is incomplete.
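The decay statistics can be sketched in a few lines: drawing individual decay times from the exponential decay law reproduces the mean lifetime ever more precisely as the sample grows, while any single draw remains unpredictable (a toy model with an arbitrary unit lifetime, not a real isotope):

```python
import random

random.seed(0)
mean_lifetime = 1.0  # hypothetical mean lifetime, arbitrary units

# Exponential decay law: each individual decay time is a random draw,
# but the average over many decays converges to the mean lifetime.
times = [random.expovariate(1 / mean_lifetime) for _ in range(100_000)]
sample_mean = sum(times) / len(times)

print(f"one decay: {times[0]:.3f}  sample mean: {sample_mean:.3f}")
```

The theory pins down the distribution and its mean, and nothing in it selects the moment of any one decay; whether something deeper does is exactly the hidden-variable question.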

Acceptance of quantum mechanics requires that processes such as nuclear decay be considered inherently acausal and indeterminate. Although this stance has been qualified (prompted by a 1964 paper by Vladimir Fock in response to Bohr’s interpretation of quantum mechanics) in as much as a `simple causality’, reflected in the well-defined natural laws that guide statistical outcomes, must exist, causality with respect to the display of macroscopic effects by a locally isolated quantum system does not. It is this that led to Einstein’s objection, “God does not play dice with the Universe!”

It was Einstein’s belief that some form of local reality and causality must exist, independent of spatially extended effects, and therefore that quantum mechanics is an incomplete description of the universe. He believed that although quantum mechanics places a restriction on what can be measured directly, this does not necessarily imply a restriction on the actual physical reality of the dynamical properties of a system. Furthermore, he believed it was possible to circumvent the uncertainty principle, and he set out to show so with a number of thought experiments. Using several such experiments, and logical arguments based on `reasonable’ starting assumptions, Einstein attempted to show that quantum mechanics was indeed incomplete.

The EPR paradox

One of the first thought experiments proposed by Einstein to show a violation of uncertainty involved a radiation-filled box with a tiny hole and an aperture controlled by a highly accurate atomic clock. It was argued that, by allowing a single photon to leave the box at a time prescribed by the clock and finding the energy of the photon by weighing the setup both before and after, the uncertainty relation ΔE⋅Δt ≥ ℏ/2 could be violated. It was shown by Bohr, however, that the classical assumptions made by Einstein could not necessarily be said to apply when the system is considered at the quantum level. For example, to weigh the system before and after would require suspending it by a spring, or other such means, in a gravitational field. As the photon escapes, thereby changing the mass of the system, the spring contracts and the box changes its position in the field. From Einstein’s own theory of relativity, this change in gravitational field changes the clock’s time frame and thus introduces an error into the time measurement. It therefore turns out that energy and time cannot both be measured to arbitrary accuracy, leaving the uncertainty principle intact.

The above example illustrates an important restriction placed by quantum mechanics on the way intuition may be used to predict the outcome of a hypothetical situation. Since both physicists and philosophers live in the macroscopic world, where a classical view is adequate to describe things, the basis and assumptions of such intuition must be closely scrutinised, and some sort of working rules must be applied to any thought experiment. For the majority of Einstein’s early work against quantum mechanics as a complete theory, it was inconsistencies in the initial assumptions which led to the rejection of his arguments, and not the logic of the arguments themselves.

Einstein eventually accepted criticisms of his early thought experiments which attempted to disprove the uncertainty relations by the direct measurement of the properties of a system. Instead, he pursued a different line of argument (and a different thought experiment) to show inconsistencies in quantum mechanics through its denial of objective reality (with respect to the dynamical properties of a system). It was this line of argument that formed the basis of the 1935 paper by Einstein, with Boris Podolsky and Nathan Rosen, on the incompleteness of quantum mechanics, which has come to be known as the EPR paradox.

The principal argument of the EPR paper was based on the claim that,

“if, without in any way disturbing a system, we can predict with certainty (i.e. with probability equal to unity) the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity”.

Accepting this claim, it was then shown that such predictions could be made for certain correlated systems, thereby proving the existence of some sort of objective reality, and contradicting quantum mechanics.

The basic thought experiment used to show this involves two particles, travelling in opposite directions, originating from a single event (such as the radioactive decay of a nucleus) in such a way that their properties must be correlated. Quantum theory allows both the distance between the two particles and their total momentum to be known precisely, and so, by measuring the position or momentum of one particle, one can predict with certainty the position or momentum of the other. Since the distance between the two particles can be made arbitrarily large, it was argued that a measurement of one particle could not simultaneously have any effect on the other (by special relativity). Thus a measurement carried out on one particle cannot create or in any way influence any property of the second particle, and so any property inferred from such a measurement for the second particle must have an objective reality separate from the first, and must have existed before any measurement at all was made. This creates a paradox for quantum theory, which states that such properties do not exist until measured. It was therefore concluded that quantum mechanics must be considered an incomplete description of the universe.
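The EPR inference can be sketched in a toy model (arbitrary units, not the actual quantum formalism): a pair created at rest has equal and opposite momenta, so a measurement on particle 1 fixes particle 2’s momentum with certainty, without any interaction with particle 2:

```python
import random

random.seed(1)

# Toy decay at rest: total momentum is zero, so the pair is perfectly
# anti-correlated (hypothetical arbitrary units, classical bookkeeping only).
p1 = random.gauss(0.0, 1.0)  # momentum measured on particle 1
p2_inferred = -p1            # predicted with certainty for particle 2

print(p1 + p2_inferred)  # 0.0 — conservation fixes the distant value
```

EPR’s point is that the correlation alone forces a definite value on the distant particle, which, on their reality criterion, must therefore have existed before the measurement.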

Hidden variable theories

Physical theories which set out to prove, or otherwise require, the existence of objective reality with respect to the dynamical properties (or variables) of a system can be placed in the general category of hidden variable theories. Such theories subscribe to a belief in an underlying clockwork controlling the interactions which occur in our universe, a clockwork which only gives the appearance of uncertainty and unpredictability at the quantum level (through statistical variations). An example of this was Einstein’s belief in the existence of some deeper, yet unknown, mechanism governing the radioactive decay of an individual, isolated nucleus.

Such theories obey strict causality and determinism, with particles having precise values of properties such as momentum and position irrespective of whether or not they are being observed or measured. As a result, they are in direct conflict with quantum mechanics. However, the EPR thought experiment in itself provided no concrete way of physically resolving this conflict, only hypotheses. This remained the case until a 1964 paper by John Bell, which showed that a resolution of the EPR paradox was possible through a set of experiments whose predicted outcomes under hidden variable theories differ from those of quantum mechanics.

Bell’s theorem

The effectively classical, `local realistic’ view of the universe which underlies most hidden variable theories is generally based on three fundamental assumptions: first, that there are real things that exist regardless of whether we look at them or not; second, that it is legitimate to draw conclusions from consistent observations or experiments; and third, that no influence can propagate faster than the speed of light. In his 1964 paper Bell carried out his argument starting from this stance.

Although Bell’s experiment differed from the original EPR experiment in that it involved consideration of different components of polarisation of two photons rather than momentum and position, the fundamental question about the objective reality of non-commuting properties still remained the central issue. Bell’s idea was to consider an atom with zero angular momentum that decays into two identical photons propagating in opposite directions. Because the initial angular momentum of the system is zero, the two photons must be polarised in the same direction, the physics of which is unequivocally accepted by both classical and quantum theory. The photons are then allowed to move apart and eventually travel through separate polarisers mounted in front of photodetectors. The angle of the polarisers can be set to either vertical or 60° to either side of the vertical.

The experiment proceeds, for each decay, by setting the angle of each of the two polarisers randomly to one of the three positions and recording detection or non-detection at each photodetector. If this procedure is carried out for a large number of decays, Bell showed, the resulting statistics predicted by local realistic (hidden variable) theories differ from those predicted by quantum mechanics.

Considering a number of decays, quantum mechanics predicts that the two detectors will register the same outcome (either double detection or double non-detection) 100% of the time when the polarisers are in the same position (both at the same angle), and 25% of the time when their settings are 60° or 120° apart. Thus, averaging over the random selection of the nine possible polariser settings, the quantum mechanical probability of identical outcomes is

P_QM = (3×1.0 + 6×0.25)/9 = 1/2

Hidden variable theories, however, give quite different statistical probabilities; the argument proceeds as follows. Each of the photons is considered to possess its own instruction set (or hidden variables). Effectively these might take the form of three parameters which tell the photon whether or not it should be detected for each orientation of a polariser. An example of such a set is DNN, which tells the photon to register detection (D) if the polariser is in the first position (e.g. at 60° to one side of the vertical) and non-detection (N) if it is in either of the other two. Now, since a hidden variable theory must also reproduce the fact that the same outcome occurs whenever both polarisers are in the same position, both photons must carry exactly the same instruction set. For the instruction sets DDD and NNN, the two detectors register the same outcome regardless of their relative orientation; for each of the other six possible instruction sets, five of the nine polariser settings give the same outcome. Therefore, even if no DDD or NNN instruction sets are ever emitted, we have a lower bound of 5/9 on the probability of identical responses, that is,

P_EPR ≥ 5/9

This leads to the relation P_EPR > P_QM, which is known as Bell’s theorem. Thus, by this theorem, the contradiction between the predictions of quantum mechanics and those of EPR hidden variable/local reality theories is made both obvious and, what is more, experimentally testable. It is also important to note that the condition for P_EPR above is not specific to the EPR argument but is a general result: the instruction sets introduced are a requirement of all local hidden variable theories.
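Both probabilities can be checked by direct enumeration. The sketch below assumes the three polariser settings described above (vertical and 60° to either side) and uses the standard quantum rule that the same-outcome probability for the correlated pair is cos² of the angle between the two settings:

```python
import math
from itertools import product

settings = [0, 60, -60]  # polariser angles in degrees, as in the text

# Quantum mechanics: probability of identical outcomes for a pair of
# settings is cos^2 of the angle between them; average over all 9 pairs.
p_qm = sum(
    math.cos(math.radians(a - b)) ** 2 for a in settings for b in settings
) / 9
print(f"P_QM = {p_qm:.4f}")  # 0.5000

# Local hidden variables: each photon pair carries one of the 8 possible
# instruction sets ('D'etect or 'N'on-detect for each of the 3 settings);
# both photons carry the same set, guaranteeing agreement at equal settings.
worst = min(
    sum(s[i] == s[j] for i in range(3) for j in range(3)) / 9
    for s in product("DN", repeat=3)
    if len(set(s)) > 1  # exclude DDD and NNN, which always agree (9/9)
)
print(f"lower bound on P_EPR = {worst:.4f}")  # 0.5556, i.e. 5/9
```

Whatever mixture of instruction sets a local theory emits, its identical-response probability is a weighted average of values no smaller than 5/9, and so can never reach the quantum value of 1/2.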

Aspect’s experiment and its implications

In 1982, Alain Aspect and co-workers at Orsay, France, conducted what is arguably the most conclusive set of experiments so far to resolve the conflict between quantum mechanics and hidden variable theories. The results show not only that the EPR inequality is violated but that the predictions of quantum mechanics are confirmed.

The implication of these and the majority of similar results favouring quantum mechanics is that local hidden variable theories are incorrect, and that the assumptions of the EPR paper about the nature of the universe are flawed. In particular, the belief that quantum influences between spatially separated particles (influences which might enforce the uncertainty relations) cannot propagate faster than the speed of light, and that strict causality must exist for all interactions, must be rejected. The results confirm that dynamical properties such as photon polarisation (and, in other similar experiments, the direction of electron spin) do not have a concrete local reality with respect to isolated, undisturbed particles, and they imply some sort of superluminal interaction between correlated particles which ensures that the Heisenberg uncertainty principle is not violated. Thus, it can be argued that it was the fundamental assumptions of EPR hidden variable theories, such as those Bell started from, that have led, through plausible arguments, to incorrect conclusions.

One can say that it is the classical belief in local objective reality, and thus in the laws of determinism and causality, that is incorrect and incomplete. As for quantum mechanics, although it would be logically incorrect to say that Aspect’s results prove it to be a complete theory of the universe (it is quantum theory that, through its predictions, logically implies the outcomes, and not vice versa), it is possible to say with modest confidence that any truly complete theory should contain the critical aspects of quantum theory: those to which Einstein and others objected. This leaves the door open for broader interpretations of the physical implications of quantum theory, such as the many-worlds interpretation and non-local hidden variable theories.

But by any standard, the Copenhagen interpretation of quantum mechanics still stands as the most complete formalism for describing the universe to date.

Bibliography

  1. Mermin, N.D. `Is the Moon There When Nobody Looks?’ The Philosophy of Science. (MIT Press).
  2. Shimony, A. `Metaphysical Problems in the Foundations of Quantum Mechanics.’ The Philosophy of Science. (MIT Press).
  3. Putnam, H. `A Philosopher Looks at Quantum Mechanics’, Mathematics, Matter and Method — Philosophical Papers 1 (2nd edition). (Cambridge University Press).
  4. Gribbin, J. In Search of Schrödinger’s Cat. (Wildwood House Limited, 1984).
  5. Selleri, F. Quantum Paradoxes and Physical Reality. (Kluwer Academic Publishers, 1990).
  6. Healey, R. The Philosophy of Quantum Mechanics. (Cambridge University Press, 1989).
  7. Robinson, P. Physics IV — Advanced Quantum Mechanics course notes. (University of Sydney, 1992).
Filed under Physics.