Quantum Mechanics: A Not-Too-Technical Introduction

Do House Rules Allow Doubling-Down on the Universe?

                                                                                     Marty Thurston

We often refer to Albert Einstein and Niels Bohr as the respective opponent and proponent of quantum theory, but when we make this comparison we are really examining the dichotomy between two camps of physicists: a venerable all-star line-up at the turn of the last century. This essay is a non-technical floor plan for the game of chance that has been described as our universe. I will begin by reviewing some of the philosophical implications of quantum theory, and then attempt to lay a foundation for what processes had been considered “standard” by the Copenhagen interpretation. Following that, I will discuss some of the more controversial features of quantum theory, including uncertainty and entanglement, as well as the measurement problem and proposed solutions to it. I will conclude the essay by briefly describing some of the technological innovations that quantum processing could provide. These leaps forward will illustrate, in the form of a reply to the measurement problem, that not all hope is lost to causal physics. Understanding is a relative term, and by harnessing this technology we are that much closer to obtaining the holy grail of science: a unified field theory that includes quantum mechanics.

“If an atom was in a state of higher energy, it was possible to calculate the probability that it would emit a photon at any specific moment. But it was not possible to determine the momentum of emission precisely. Nor was it possible to determine the direction. No matter how much information you had. It was all a matter of chance, like the roll of dice.” – Albert Einstein (Isaacson, 323)

Where Einstein would tell Bohr that God cannot play dice with the universe, Bohr would amiably reply, “Einstein, [maybe you should] stop telling God what to do.” (Isaacson, 326) These discussions of uncertainty would leave much of the classical scientific community feeling, in a sense, deceived by quantum games of chance. To some scientists and philosophers, it was a very cruel practical joke that classical “truths” essential to the discovery of quantum processes were now insufficient. In this respect, I think Einstein was always very nervous about quantum processes because they were further demonstrations of “weird” sciences that required either an advanced degree in physics or blind faith to subscribe to completely. Even in general relativity, which “preserved rigid cause-and-effect rules,” just imagining a picture of four-dimensional spacetime is about as easy as fitting a square peg into a triangular slot for most if not all people. (Isaacson, 323) With the controversies that arose from the general theory of relativity, Einstein was all too familiar with how the scientific community would respond to eerie quantum behaviour; he did not want to be part of the scientific generation that would turn a large portion of Euclidean physics directly on its head.

For Einstein, the seemingly free-will emission and absorption of quanta, or pieces of light, were not merely scientific inconsistencies, but rather amounted to an existential nightmare, mocking man’s futile attempt to understand. In one sense, Einstein believed that prior to the quantum debates he had been a physicist, but he had now been reduced to an “employee of a gaming house.” (Isaacson, 324) In this time period there were many opponents to these theories, anti-relativists who believed that Einstein’s theory of relativity was the beginning of the end for the scientific community and that the world could no longer be considered measurable. Einstein, however (especially in his younger years), believed that this extended description of reality would open up endless doors in the fields of science, discovery, and technology. Einstein saw uncertainties arising in general relativity, but was optimistic that through further study they could be accurately measured and described. It should be noted that although apprehensive about quantum theory, Einstein was an advocate of the truth; he would play devil’s advocate as often as he was forced into the role.

Quantum theory, on the other hand, subscribes to the notion that we should accept uncertainties as a description of reality. The spooky part of this interpretation is that we might come to a truer description of immediate reality by accepting the fact that there are undeniably unclear rules. Einstein could never fully accept these theories. He likened his opinion to a thought experiment in which he “imagined two boxes, one of which we know contains a ball. As we prepare to look in one of the boxes, there is a 50% chance of the ball being there. After we look there is either a 100% or 0% chance it is in there. But all along, in reality, the ball was in one of the boxes.” (Isaacson, 454) Yes, probability will help us make our decision but in reality there are no probabilities, the ball was really in one box all along.

The following is an example of a conversation between Einstein and his friend Philipp Frank that illustrates Einstein’s personal feelings towards the emerging quantum theories.

Einstein: “A new fashion has arisen in physics, which declares that certain things cannot be observed and therefore should not be ascribed reality.”

Frank: “But the fashion you speak of was invented by you in 1905!”

Einstein: “A good joke should not be repeated too often.”

(Isaacson, 332)

The Copenhagen Consensus

At this point it will be useful to describe what has come to be known as the Copenhagen interpretation. This is a non-local interpretation of quantum mechanics that claims all particles are described by their wavefunction, which dictates the probability of finding them in any location following an observation. When a measurement is made, it causes a change to the particle and results in an apparent collapse of the wavefunction. The Copenhagen Consensus can be thought of as a collection of theories from multiple authors that together depict the foundations of quantum mechanics. The following are considered the pillars of quantum theory:

  1. Schrödinger’s wavefunction would represent the complete state of the system being measured.
  2. Max Born would provide the mathematical interpretation relating probability to the square of the amplitude of the wavefunction.
  3. The uncertainty principle would acknowledge that we cannot know all of a system’s properties with arbitrary precision at the same time.
  4. The principle of complementarity would hold that all matter exhibits wave-particle duality, but that the two aspects can never be observed simultaneously.
  5. The correspondence principle would keep things from becoming too “spooky”: roughly, quantum predictions should merge with the familiar behaviour of the macro-realm in the classical limit.

The uncertainty principle was credited to Heisenberg in 1927. He argued that we cannot know all the values of a system’s variables at the same time; the things we do not know can only be described in terms of probabilities. Planck’s constant sets the scale of this trade-off: the more precisely one variable (position or momentum) is measured, the less precisely its partner can be known. Isaacson explains that the uncertainty principle is a little more complex than this: “An electron does not have a definite position or path until we observe it.” (331) Niels Bohr believed that the uncertainty principle was a feature of nature; his opinion was that science was the pursuit of describing what is taking place in our world, not necessarily how it works. In correspondence with Max Born, Einstein would write, “I am very, very reluctant to give up complete causality.” (Einstein to Max Born, Jan. 27, 1920)
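
In symbols (a standard statement of the principle, not drawn from the sources cited here), the trade-off between position and momentum is bounded by Planck’s constant:

    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}

The smaller the spread Δx in position, the larger the unavoidable spread Δp in momentum, and vice versa.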

The complementarity principle was conceived by Bohr, who was a proponent of positivism, and it would mature further shortly after the EPR paper struck its powerful blows against non-locality in 1935. Bohr had earlier shown that an electron could in fact drop from a higher-energy orbit to a lower one, resulting in the creation of a photon, a light quantum comprised of a discrete amount of energy. Complementarity holds that matter shows features of both wave and particle, but that these effects can never be observed simultaneously. It also implies that two separated particles can in fact be a representation of one singular phenomenon, which is the basis for the concept of quantum entanglement. In rebuttal, Einstein would cling to the fact that QM violates the fundamental principle of separability, which holds that “two systems that are spatially separated have an independent existence.” (Isaacson, 453)

The Quantum Process

The following is an elementary introduction to a few aspects of linear quantum mechanics. I am definitely not qualified, but here are my two cents.

The particles we study will be subject to motion along the x-axis only. With any given particle we choose to study, we can expect to find a matter wave that we refer to as the wavefunction. It should be noted that our definition of a wavefunction is an “expression for the amplitude of the particle wave.” (wave function, Encyclopedia Britannica, 2008) Since a wavefunction has no direct physical reality, it is represented mathematically as a normalized vector in a Hilbert space: a complex-valued function that behaves very differently from the familiar quantities of classical physics. Normalization means simply that the particle must in fact be found somewhere along the x-axis. The two lessons for understanding basic linear quantum mechanics are to understand how a wavefunction can be calculated and what kinds of information can be extracted from it.
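
Written out (a standard textbook condition, not specific to the sources above), normalization is the requirement that the probabilities of finding the particle at every possible position add up to one:

    \int_{-\infty}^{\infty} |\psi(x,t)|^2 \, dx = 1

If this integral were anything other than 1, the wavefunction could not be read as a complete catalogue of where the particle might be found.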

Max Born was a German theoretical physicist who was credited in 1926 with a probability interpretation for the wavefunction that solves the Schrödinger equation. This interpretation recognizes that we are unable to specify the exact location of the particle along the x-axis. Instead, Born’s interpretation attaches probabilities to the wavefunction, describing where we might expect to find the particle. Graphically, the probability density looks like a curve, and the probability of finding the particle between two points A and B is the area under that curve. One of the biggest conceptual shifts of quantum physics is that even when we are given the wavefunction at some time t, we cannot say exactly where the particle is at that time; we can only give the odds.
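
A minimal numerical sketch of Born’s rule (the Gaussian wavefunction, its width, and the interval below are invented purely for illustration, not taken from the sources cited here):

    import numpy as np

    # An illustrative Gaussian wavefunction psi(x), chosen so that the
    # total area under |psi|^2 is 1 (i.e. the state is normalized).
    sigma = 1.0
    x = np.linspace(-10, 10, 10001)
    dx = x[1] - x[0]
    psi = (np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (2 * sigma**2))

    density = np.abs(psi) ** 2        # Born rule: |psi|^2 is the probability density
    total = np.sum(density) * dx      # ~1.0, confirming normalization

    a, b = -1.0, 1.0                  # the interval [A, B]
    mask = (x >= a) & (x <= b)
    prob_ab = np.sum(density[mask]) * dx   # probability of finding the particle in [A, B]

    print(f"total probability = {total:.4f}")
    print(f"P(A <= x <= B)    = {prob_ab:.4f}")

The number printed for the interval is only a probability; Born’s point is that no amount of extra information about the wavefunction pins the particle to a single position before a measurement is made.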

As mentioned above, the wavefunction contains all the information required to describe a given particle, but once the particle is actually measured, the wave is subject to outside influences acting on it and collapses into a single state. This represents the dynamic nature of quantum mechanics: the study of an evolving system. “In Newton’s mechanics, position and velocity that are relative to a given time are calculated from Newton’s second law; in quantum mechanics, a wavefunction relative to position and time must be calculated from a different law – Schrödinger’s equation.” (Modern Physics, 194)
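
For a single particle moving along the x-axis, the law referred to in that quotation takes the following standard form (a textbook statement, not quoted from the sources above):

    i\hbar \frac{\partial \psi(x,t)}{\partial t} \;=\; -\frac{\hbar^2}{2m} \frac{\partial^2 \psi(x,t)}{\partial x^2} + V(x)\,\psi(x,t)

Here m is the particle’s mass and V(x) is the potential it moves in; given the wavefunction at one moment, the equation determines it deterministically at every later moment, so long as no measurement intervenes.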

Schrödinger’s equation, relativistic quantum theory, and quantum gravity are all attempts to explain the evolution of quantum systems when they are not being measured or observed. These are all time-dependent theories that are deterministic by nature. Schrödinger’s equation can be solved for free particles in stationary states known as plane waves. A free particle localized to some spread Δx is instead described by a wave packet, which represents a superposition of many plane waves. “The momentum of such a particle is not known precisely, but only to some accuracy Δp that is related to Δx through the uncertainty principle.” (Modern Physics, 225)
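
In symbols (standard textbook forms, assumed here rather than taken from the essay’s sources), a single plane wave and a wave packet built from many of them look like:

    \psi_k(x,t) = A\, e^{i(kx - \omega t)}, \qquad
    \psi(x,t) = \int a(k)\, e^{i(kx - \omega(k)\, t)}\, dk, \qquad
    \omega(k) = \frac{\hbar k^2}{2m}

A single plane wave has a perfectly definite momentum p = ħk but is spread over all of space; adding together many plane waves with different k values localizes the packet in x at the cost of a spread in momentum, which is the uncertainty trade-off quoted above.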

The square of the magnitude of the wave-packet solution is the probability density, and the area under it between two points gives the probability of finding the particle there. These wave packets are built out of what is referred to as a superposition, which will be relevant in the coming discussion of superselection. In the quantum realm, a nucleus exists in a superposition, meaning “it exists simultaneously as being decayed and un-decayed until it is observed, at which point its wavefunction collapses and it becomes one or the other.” (Isaacson, 456) This would become a point of contention for Einstein and his colleagues, and a rebuttal would come in the form of a thought experiment eventually named Schrödinger’s Cat. “According to the Copenhagen interpretation developed by Bohr and his fellow pioneers of quantum mechanics, until such an observation is made, the reality of the particle’s position or state consists only of these probabilities.” (Isaacson, 457) Schrödinger wanted to find out when a superposition stops representing two or more probabilities, collapses, and snaps into one state. To probe this, he related the micro-realm to the macro-world: a cat is exposed to a radioactive substance that may or may not decay. If the cat is hidden within a box, then the Copenhagen interpretation would have us believe the cat’s immediate reality consists of it being both alive and dead within its superposition. This was too large a pill for Einstein to swallow, and by 1935 the attacks on QM had shifted to include these kinds of thought experiments. From a historical perspective, one can imagine the calculated and serious Bohr pacing his room, trying not to smile while contemplating the idea of a half-dead cat.
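
A minimal simulation sketch of the “decayed / un-decayed” superposition above (the amplitudes and the number of trials are invented for illustration; real collapse, if it happens at all, is of course not a random-number call):

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative superposition  a|un-decayed> + b|decayed>,  with |a|^2 + |b|^2 = 1.
    a, b = np.sqrt(0.7), np.sqrt(0.3)

    def measure() -> str:
        """'Collapse' the superposition: return one definite outcome,
        chosen with the Born-rule probabilities |a|^2 and |b|^2."""
        return "un-decayed" if rng.random() < abs(a) ** 2 else "decayed"

    outcomes = [measure() for _ in range(10_000)]
    print(outcomes.count("un-decayed") / len(outcomes))   # roughly 0.7, i.e. |a|^2

Before the call to measure(), the model keeps both amplitudes; only the act of measurement forces a single answer, and locating exactly where that step happens is what the cat experiment asks us to do.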

Interpretations and Problems

Quantum mechanics has a fundamental problem known as the measurement problem. When measurements are made in the quantum realm, the act of measurement itself can change the outcome being measured. This is basically the problem of explaining how measurement turns a superposition of possibilities into a single definite state. The following is an argument for the Everett / Many Worlds interpretation, as in my opinion it might have been the most acceptable to Einstein.

The Everett / Many Worlds (EMW) description was first discussed by Hugh Everett in 1957 and fundamentally deals with the process of observation (or measurement) within wave mechanics. In this interpretation, the wavefunction is treated as physically real. This can be considered “as relativistic as it gets,” because after one of the particles is observed, the wavefunction is not said to collapse; rather, it is said to carry on deterministically. Because all of the outcomes occur, each possible outcome is realized alongside the others. These outcomes are given relativistically; that is to say, they are worlds that are as unobservable to us as our world is to them. I say this description is as “relativistic as it gets” because we can no longer take the measurements of only one variable to study. In QM, in order to study anything it is essential to consider the complete system before we can make sense of any variance within it.

The animal-rights activist in Einstein would probably have found comfort in the fact that the EMW interpretation seems to have eased the fate of Schrödinger’s Cat. There is no “ta-da!” moment in the EMW interpretation; instead, there is a dominant outcome that, for all intents and purposes, is the only outcome that matters. This comforting notion of reality is what I think Einstein was digging his nails into with the EPR paper.

Arguably, Einstein might have supported this interpretation because of how it ranks against the alternative solutions. First of all, the Copenhagen Consensus could be viewed as being damned if the wavefunction is real and damned if it is not. If the interpretation is based on non-locality, then it will never move beyond spooky action at a distance; likewise, if the wavefunction is not real, then we are almost advocating solipsism, because there is no objective reality. I will not digress further into Bohm’s interpretation of hidden variables, but I will say that it is probably one of the more competitive theories available. The main difference between Bohm’s approach and Everett’s is that once the wavefunction splits into separate realities, Bohm claims it ceases to exist as a reality in all unobservable worlds. Since the wavefunction contains all the knowable information, the observed outcome would be the only reality that has an existence. The EMW interpretation, however, sees no need to make such a distinction.

There are also undeveloped techniques of quantum logic. For the most part these are trial-and-error proposals, and there has been virtually no collaborative agreement on any of them. The fundamentals of logic are tedious enough, and highly complex logic requires highly elaborate reasoning, something for which there is no common or practical method. In the same realm is Mückenheim’s concept of extended probabilities, which, for similar reasons, is deeply contested. Further, there are a number of half-hearted theories, such as the Many Minds interpretation and other non-linear proposals, but virtually none of these has received any considerable academic or experimental support.

Our understanding of the interpretations of QM is very much limited to our understanding of these theories in relation to classical physics. First, there is the abstract nature of describing quantum processes; for example, the Hilbert spaces used in the Schrödinger equation are abstract spaces of complex-valued functions that behave very differently from familiar classical quantities. There are also problems with the process of measurement, as illustrated by the different interpretations described above.

One of the most interesting philosophical implications of quantum theory is that there is no accurate way of ever describing all the properties of a system at the same time. Basically, this is saying that we cannot define a reality, even though we are living through something that feels an awful lot like reality. Furthermore, QM supports the existence of non-deterministic, irreversible, and entangled processes, all of which are more than arm’s length away from regular Newtonian capacities.

Solution to the Measurement Problem?

In the measurement problem we consider a spinning particle that is in a superposition of two eigenstates of spin in a given direction, a state governed by Schrödinger’s equation. When this state is written out with a statistical operator, two interference terms appear. Superpositions will not usually give the same statistical results as mixtures do; therefore, if their expectation values are to be equal, the interference terms must be essentially zero. This is known as a wash-out solution, and this particular example is based on superselection rules. In superselection approaches, these restrictions are a result of the quantum formalism itself and are not brought in from some outside theory, which gives the approach a particularly natural feeling.
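
Written out for a two-state spin (standard notation, used here purely for illustration), the contrast between a superposition and a mixture is:

    |\psi\rangle = a\,|{\uparrow}\rangle + b\,|{\downarrow}\rangle

    \rho_{\text{sup}} = |\psi\rangle\langle\psi|
      = |a|^2\,|{\uparrow}\rangle\langle{\uparrow}| + |b|^2\,|{\downarrow}\rangle\langle{\downarrow}|
      + ab^{*}\,|{\uparrow}\rangle\langle{\downarrow}| + a^{*}b\,|{\downarrow}\rangle\langle{\uparrow}|

    \rho_{\text{mix}} = |a|^2\,|{\uparrow}\rangle\langle{\uparrow}| + |b|^2\,|{\downarrow}\rangle\langle{\downarrow}|

The last two terms of the superposition’s statistical operator are the interference terms; a wash-out solution is one in which they vanish (or become unobservable), so that the superposition becomes statistically indistinguishable from the mixture.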

What superselection rules set out to do is express certain restrictions on which kinds of operators represent genuine observables. Robinson explains that the simplest way to explain this solution is not to think in terms of restrictions on the operator-observable link but in terms of restrictions on the formation of superpositions of states of the measuring apparatus. “If superpositions of states of the apparatus could be formed without any interference terms arising, then we would have, arguably, a wash-out solution to the measurement problem.” (Robinson, 81)

It should be noted, however, that these scenarios are a source of controversy. Maximilian Schlosshauer of the University of Washington has said that before we can take theories of superselection or induced decoherence seriously, we need to “clarify key features of the decoherence program, including its more recent results, and to investigate their application and consequences in the context of the main interpretive approaches to quantum mechanics.” (Schlosshauer, 2005) The main caveat for superselection is that even if we can remove these terms in practice, this does not mean they can equally be removed in principle until there are reliable methods of doing so that are not ad hoc.

Where to From Here?

Before quantum mechanics, we knew very little about the individual atom. We did, however, have incredible statistical techniques for estimating how large collections of atoms behave. “We now are in a very different situation, for we have much the same statistical techniques, but a highly confirmed and sophisticated theory of the individual atoms and their constituents.” (Robinson, 84-85) This type of physical description had simply never been available prior to quantum mechanics.

Quantum technology is a developing field that attempts to build on the knowledge of quantum processes and tries to mimic these systems in technological development. “We can create entangled matter and energy that are not likely to exist anywhere else in the universe. These new man-made quantum states have novel properties of sensitivity and non-local correlation that have wide applications to the development of computers, communication systems, sensors, and compact metrological devices.” (Dowling and Milburn, 1656) In this respect the technological possibilities for quantum processors are so boundless, they could make the invention of the internet seem about as important as the Y2K bug.
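
As a concrete illustration of the kind of entangled state being described (a standard textbook example, not one drawn from Dowling and Milburn), two qubits can be prepared in the Bell state

    |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\left( |0\rangle|0\rangle + |1\rangle|1\rangle \right)

Measuring either qubit gives 0 or 1 with equal probability, yet the two results always agree, no matter how far apart the qubits are; this non-local correlation is the kind of engineered resource the quoted authors have in mind.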

Much of quantum information technology emerges out of the second-order effects of these processes, effects that were first seriously discussed in the Einstein, Podolsky, and Rosen paper. These are “not just manifestations of the usual wave-particle duality, but rather they are a new type of higher-level quantum effect that manifests itself only in precisely engineered, man-made quantum architectures.” (Dowling and Milburn, 1658) While I will not go much further into quantum technology, Dowling and Milburn list a number of specific topics that may be of further interest to the reader: protocols for communication, including distributed quantum computing and quantum wavepacket switching, as well as physical systems suggested for constructing a quantum computer, including “NMR, ion traps, cavity QED, quantum dots, superconducting circuits, and optical systems.” (1658) With such quantum computing there are interesting philosophical questions to ask as well. For instance, in the Everett / Many Worlds interpretation we know that we do not directly interact with parallel Everett worlds; however, this is not to say we do not interfere with them. In the language of Feynman diagrams, an interaction would be an event occurring at the vertices, while interference could be any event lying between two distinct world lines. Therefore, if we had some kind of reversible quantum computer, we might have a different way of studying the measurement problem.

Quantum theory has come a long way since the 1900s. More than one hundred years after its initial discussions, we still lack a unified field theory. For Einstein there are still a lot of questions that have not been answered and may never be. I think he would have been an advocate of Everett’s approach to the measurement problem; at the very least, the EMW interpretation of quantum mechanics is a good sequel to the Schrödinger’s Cat experiment. For the time being, I think it suffices to say that Einstein would have been relieved of some of the doubts that were cast on quantum mechanics, especially in light of more recent developments on the measurement problem. Einstein was by no means a card dealer, and as uncomfortable as it made him, it seems we are still forced to play by the “house” rules.

Marty Thurston


References

Dowling, J. P., and Milburn, G. J. (2003). Quantum Technology: The Second Quantum Revolution. Philosophical Transactions: Mathematical, Physical and Engineering Sciences, 361, 1655-1674.

Isaacson, Walter. (2007). Einstein: His Life and Universe. Simon and Schuster, NY.

Robinson, Don. (1994). Can Superselection Rules Solve the Measurement Problem? The British Journal for the Philosophy of Science, 45, 79-93.

Schlosshauer, Maximilian. (2005). Decoherence, the Measurement Problem, and Interpretations of Quantum Mechanics. Department of Physics, University of Washington, Seattle.

Serway, R. A., Moses, C. J., and Moyer, C. A. (2005). Modern Physics, 3rd ed.

Wave Function. (2008). In Encyclopædia Britannica. Retrieved March 8, 2008, from Encyclopædia Britannica Online: http://www.britannica.com/eb/article-9382433

Bohm, David J. (1952). A Suggested Interpretation of the Quantum Theory in Terms of “Hidden” Variables. Physical Review, 85, 166-193.

Mückenheim, W. (1986). A Review of Extended Probabilities. Physics Reports, 133, 339-.

Youssef, Saul. Quantum Mechanics as Complex Probability Theory.
