A brief essay on a controversial concept in quantum physics
There is a spectre haunting quantum physics, and it is called the “collapse of the wave function”. It has attracted so much discussion that I prefer, to get the Weltbild right, to go back to a concrete experimental scheme. One needs special equipment to perform an experiment where the “collapse” really comes into play. In “standard experiments” like the double slit with a very low flux of particles (Feynman: “the first mystery of quantum mechanics”), there is no collapse, just dots that appear one by one on the screen. Quantum physics is silent about the position of each individual dot, and of the next one to come – it only gives their probability distribution once we specify the entire experimental setup: the properties of the source, which slits are open, and where the particles are detected. Conversely, by accumulating the fringe pattern from this collection of dots on the screen, the experimenter accesses a probability distribution by the standard procedure of measuring “frequencies” (Häufigkeiten) and extrapolating to an infinite number of detections. It does not make sense to say “the particle collapses onto a dot on the screen”, because no further experiment is done after that.
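This accumulation of dots into frequencies can be sketched in a few lines. The two-slit intensity below is an idealised, made-up pattern (cos² fringes under a single-slit envelope), not the prediction for any real apparatus; the point is only that sampling dots one by one and histogramming them recovers the probability distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Screen coordinates (arbitrary units) and an idealised two-slit
# intensity pattern: cos^2 fringes under a single-slit envelope.
# All numbers here are illustrative, not from any real experiment.
x = np.linspace(-10, 10, 401)
envelope = np.sinc(x / 4.0) ** 2          # single-slit diffraction envelope
fringes = np.cos(np.pi * x / 2.0) ** 2    # double-slit interference term
p = envelope * fringes
p /= p.sum()                              # discrete probability distribution

# Each "run" produces one dot; quantum mechanics only predicts p(x).
dots = rng.choice(x, size=50_000, p=p)

# Accumulating the dots and measuring frequencies recovers p(x).
counts, _ = np.histogram(dots, bins=x.size,
                         range=(x.min() - 0.025, x.max() + 0.025))
frequencies = counts / counts.sum()
print(np.abs(frequencies - p).max())      # small for many detections
```

No single dot “shows” the fringes; only the extrapolation from measured frequencies does.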
Let us take the viewpoint that perhaps the true sense of the collapse is this: once a first detection has been performed, we are able to perform further experiments. After we know the result of the first detection, the predictions of further outcomes change because “the state of the particle has changed”. But look at it carefully: this works in a statistical sense, by selecting events after the first detection.
There is a famous historic anecdote here from the birth years of quantum physics: W. Heisenberg contemplating the “paths” of electrons or other particles in a cloud chamber (see the account of it in J. Ferrari’s book).
Heisenberg had just discovered the uncertainty relations and learned that “there is no sharp particle position”. Yet the droplets in the chamber align nicely into the orbits of charged particles. The orbits are typically circular or spiral-like because a magnetic field is applied to measure the charge and velocity of the particle. Heisenberg may have contemplated this spectacle for a long time, eventually sighing: “Now I understand.” (Well, I am conjecturing; a quote like that has been attributed to N. Bohr in a similar situation.) Heisenberg realised that the appearance of the first droplet is like a detection of the particle’s position. Within some wide window of uncertainty, however: the droplet is manifestly a large, macroscopic object whose size is far above the particle’s de Broglie wavelength. This position measurement is therefore still compatible with a fairly well-defined momentum of the particle. The scattering of the high-energy particle off the vapor molecules in the chamber has probably not changed its momentum much. By extrapolating from this first position information, one can infer the probability density of the following droplet – and quantum mechanics tells us that this density traces out the classical path, precisely because the first measurement was relatively imprecise: there is no big change in momentum, so the most probable next droplet simply lies ahead. (Here one can also use the fact that the particles come from a beam with a relatively well-defined direction.) The calculation is related to that of a correlation function: the “collapse” corresponds to the way the first observation constrains the wave function (making it more like a wave packet than a plane wave, for example).
Personally, although the quantum mechanics of the cloud chamber is certainly an interesting topic, I would consider other experiments more spectacular. The reason is probably that in the chamber, all “detections” (droplets) are random events. In the example that follows, the experimenter is free to choose her setup after the first measurement.
I want to talk about one of S. Haroche’s experiments: an atom flies through a cavity where it interacts with photons, and one may make a measurement of the cavity field. Let us call this first cavity A (say, the one on the left).
The atom is still available for further experiments (say, in cavity B on the right) – that is a key difference. If the photon number in A has increased, for example, we may infer that the atom has deposited an energy quantum, i.e. has made a transition to a lower level. With this “increase of knowledge”, quantum mechanics updates its predictions for the outcomes of experiment B. The “collapse” is again very much a correlation between two events: “the photon number has changed in cavity A” and “the atom behaves like this or that in experiment B”. Note that this viewpoint is very close to what is called the “epistemic interpretation” of the wave function: it encodes what the experimenter knows about the system (from the way it has been prepared). Here, we adjoin the information gained in the first measurement. (And indeed, many experimental preparation schemes are based on measurements and on selecting those events that conform to the specifications of the source.)
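The correlation reading of the collapse is just conditional probability. Here is a toy sketch with made-up numbers: I assume the atom entered cavity A in a 50/50 superposition, that an emitted photon is detected as a “click”, and (as an extra, invented ingredient) that the detector occasionally misses the photon:

```python
# Toy joint distribution over (atom state after cavity A, record in A).
# All probabilities are invented for illustration: 50/50 initial
# superposition, plus a 10% chance that an emitted photon is missed.
joint = {
    ("ground", "click"): 0.45,     # atom emitted, photon detected
    ("ground", "no click"): 0.05,  # atom emitted, photon missed
    ("excited", "no click"): 0.50, # atom kept its quantum
}

def conditioned_on(record):
    """Predictions for experiment B, given the outcome recorded in A."""
    total = sum(p for (atom, rec), p in joint.items() if rec == record)
    return {atom: p / total for (atom, rec), p in joint.items() if rec == record}

# "Collapse" as a conditional probability: after a click in A, the atom
# is predicted with certainty to be in the lower level for experiment B.
print(conditioned_on("click"))     # {'ground': 1.0}
print(conditioned_on("no click"))  # 'excited' now more likely than 50%
```

Nothing mechanical happens in this update; only the experimenter’s information changes, which is exactly the epistemic viewpoint mentioned above.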
The “weird” aspects of quantum physics become apparent in two variations on this scheme:
1.
Imagine that one did not detect an increase in the photon number. Still, something has been learned about the state of the atom. Such an event is compatible with the “retrodiction” that the atom was in its ground state, where photon emission is impossible. (In the simulation procedure of “Monte Carlo wave functions”, this “non-detection” leads to a continuous change in the atom’s wave function in which the probability of the atom being in the ground state increases. But one may argue that this simulation is too beautiful to provide a true Bild of the “real state of affairs”.) Of course, this information is only valuable in the right context. One has to be sure that an atom has been launched and that it has effectively crossed the cavity at a place where it could have emitted a photon, had it been in an excited state. How can one be sure about this in an operational way? Just repeat the experiment and observe the frequency of photon-emission events. Many weird schemes related to null measurements become quite trivial when this specific “post-selection” is re-instated.
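The “no-jump” branch of the Monte Carlo wave function procedure can be sketched for a two-level atom. The decay rate, time step, and initial 50/50 amplitudes below are arbitrary textbook choices; the point is that conditioning on “no photon seen” continuously shifts weight toward the ground state:

```python
import numpy as np

# Monte Carlo wave function "no-jump" evolution for a two-level atom.
# gamma, dt and the initial amplitudes are arbitrary illustrative values.
gamma, dt, steps = 1.0, 0.01, 300
cg, ce = np.sqrt(0.5), np.sqrt(0.5)   # ground / excited amplitudes

p_ground = [abs(cg) ** 2]
for _ in range(steps):
    # Non-Hermitian effective evolution: only the excited amplitude
    # decays; observing *no* photon makes "excited" less likely.
    ce *= np.exp(-0.5 * gamma * dt)
    norm = np.sqrt(abs(cg) ** 2 + abs(ce) ** 2)
    cg, ce = cg / norm, ce / norm     # renormalise the conditional state
    p_ground.append(abs(cg) ** 2)

# Ground-state probability grows continuously from 0.5, without any jump.
print(p_ground[0], p_ground[-1])
```

In a full simulation, this smooth branch is interrupted by random “jumps” when a photon is emitted; here only the null-measurement branch is shown.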
2.
An even weirder scheme is to “erase” the knowledge that has been gained in the first measurement. This is not easy, because one has to intervene before the measurement result has been “amplified” to a detectable signal. But it can be done in the case of the cavity, whose state can be manipulated right after the passage of the atom. One can do this in such a way that the cavity state no longer discloses any information about the prior state of the atom. This has been popularised as the “quantum eraser”. [No, I won’t offer you an image here; the web is full of strange claims on that keyword.] One observes that typical quantum interference features are restored – features that depend on the atom being in a superposition state, a state that would be precisely destroyed by the “collapse” after the first measurement.
But now quantum physics kicks in at its best: this kind of interference is demonstrated after accumulating many measurements, while an interference phase is scanned by adjusting a macroscopic element in the setup. After collecting the data, the plot of frequencies vs phase tells us: “interference has happened.” Not in any individual run, but only in the overall picture.
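A last sketch makes this concrete. The fringe visibility and the number of runs per phase setting are invented numbers; each run yields only a binary “click / no click”, and the fringe exists only in the tallied frequencies as the phase is scanned:

```python
import numpy as np

rng = np.random.default_rng(1)

# Scan a macroscopic phase; at each setting, accumulate many binary
# detection events. Visibility and run count are made-up numbers.
phases = np.linspace(0.0, 2 * np.pi, 25)
runs_per_phase = 2000
visibility = 0.9

p_click = 0.5 * (1 + visibility * np.cos(phases))  # predicted probability
clicks = rng.binomial(runs_per_phase, p_click)     # one tally per phase
frequencies = clicks / runs_per_phase

# No single run "interferes"; the fringe appears only in the plot of
# frequencies vs phase, accumulated over many runs.
print(np.abs(frequencies - p_click).max())         # small statistical scatter
```

Each individual run is as featureless as a coin toss; the cosine only emerges in the overall picture, exactly as described above.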