Harmonizing theory and experiment brings out the best of both worlds

Researchers Robert Spekkens and Kevin Resch are interrogating the nature of causality in quantum mechanics – and the fusion of theory and experiment is helping solve some long-standing puzzles at the edge of known physics.


Robert Spekkens and Kevin Resch have known each other since their graduate school days some 25 years ago, when Spekkens trained in theoretical physics and Resch became an experimentalist. But their distinct areas of expertise haven’t kept them from collaborating. In fact, their combined talents make them a uniquely productive team.

Spekkens is a theoretical physicist at Perimeter Institute, engaged in quantum foundations – a discipline that seeks to refine and reformulate quantum theory to better understand its inner workings and bring some clarity to its counterintuitive features.

Resch, meanwhile, is an expert in quantum optics, devising and carrying out experiments with entangled photons in his lab at the University of Waterloo’s Institute for Quantum Computing and Department of Physics and Astronomy. Together with a group of talented graduate students and postdocs, they are asking new questions of nature to identify conceptual innovations in quantum theory and examine the ways in which it differs from the classical theories that preceded it.

“There’s a natural collaboration between people who are doing the sort of experimental quantum optics that happens in Kevin’s lab and people who are interested in testing the boundary between quantum and classical theories, such as myself,” Spekkens explained, describing the origins of their partnership, which began in 2014. “Kevin is the natural person locally to implement these kinds of tests.”

Upon realizing their research goals were complementary, Spekkens and Resch quickly established a working relationship, and they have now coauthored seven papers together. Each paper combines a theoretical idea with an experimental test, and with each successive paper they seek to untangle a different knot in the hazy edges of quantum physics.

Their latest project: Rescuing causality in quantum theory

Their most recent result – with graduate student Patrick Daley – is a new take on an experiment that is central to the foundations of quantum theory: a “Bell experiment.”

Such experiments are named after the Northern Irish physicist John Bell, who considered a pair of particles that are initially prepared together, then sent to distant locations and subjected to different measurements.

In 1964, Bell showed, in the theorem that now bears his name, that certain kinds of correlations between measurement outcomes cannot be explained in terms of the existence of a common cause (in other words, there doesn’t seem to be any sort of shared information that determines the state of both particles before their separation) – at least, not according to classical probability theory.

And yet quantum theory predicts, and experimental tests confirm, that such correlations do arise in Bell’s setup. This has led some researchers to claim that it is inappropriate to seek a causal explanation for these correlations. Others, however, are unwilling to compromise on causal explanations and have therefore looked for ways around Bell’s no-go result.
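To make Bell’s result concrete, it helps to see its most commonly used quantitative form, the CHSH inequality (stated here as standard background; the article does not specify which variant the experiment used). Each wing chooses between two measurement settings, the outcomes are labelled +1 or −1, and $E(a,b)$ denotes the average of the product of the two outcomes for settings $a$ and $b$. Any classical common-cause explanation must satisfy

\[
\lvert E(a,b) + E(a,b') + E(a',b) - E(a',b') \rvert \;\le\; 2,
\]

whereas quantum theory predicts that entangled particles can push this quantity as high as $2\sqrt{2} \approx 2.83$ – and experiments observe exactly such violations.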

The strangeness of quantum theory is like a lump in a rug. If you eliminate the lump in one spot, it moves somewhere else. The different ideas about how to provide a causal explanation of quantum correlations in a Bell experiment are like different places where the lump appears.

Some have suggested that it is Bell’s assumption about the causal structure of the experiment that needs to be abandoned. They imagine, for instance, that the causal connection between the two wings of the experiment might not be a common-cause connection but a cause-effect connection, so that the choice of measurement on one wing influences the outcome on the other wing. This would require influences to move faster than the speed of light – a kind of “spooky action at a distance,” to borrow Einstein’s phrase.

Another idea in this category is known as superdeterminism, wherein it is presumed that variables that causally influence the measurement outcomes also causally influence the choices of what measurements are performed, in contrast with the assumption that these can be freely chosen. This too can be understood as rejecting Bell’s assumption about the causal structure.

Others have suggested that the common-cause structure is correct – that is, there are no spooky actions – but that one needs to replace classical probability theory with a more exotic theory for making inferences in the presence of incomplete information, which changes the kinds of correlations that a common cause can support. Causal models built on this kind of exotic probability theory are called quantum common-cause models.
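In a bit more detail (this is the standard formulation from the quantum causal models literature, offered as background rather than a detail drawn from the article), a quantum common-cause model keeps the same causal diagram but lets the shared cause be a quantum state $\rho$ rather than a classical variable, with the joint outcome probabilities given by the Born rule:

\[
P(a, b \mid x, y) \;=\; \mathrm{Tr}\!\left[\rho\,\bigl(M_{a|x} \otimes N_{b|y}\bigr)\right],
\]

where $x$ and $y$ label the measurement settings on the two wings and $M_{a|x}$, $N_{b|y}$ are the corresponding measurement operators. Nothing about the arrows of cause and effect changes; what changes is the mathematical character of the common cause and the rule used to compute the correlations it can support.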

The first sort of approach is conservative about the concepts of causation and inference, but it is radical about the causal structure. The second sort of approach is conservative about the causal structure (it’s a common cause, after all) and is radical about the concepts of causation and inference.

Spekkens, Resch, and Daley were interested in seeing if they could find a version of a Bell-type experiment and a way of analyzing the data that might adjudicate between these two approaches. They used a pair of entangled photons (particles of light). Photons are a perfect tool for this kind of experiment because they don’t interact with much, giving them a long coherence time: in other words, photons stay in the quantum entangled state for a relatively long time without being disturbed by interactions with the wider world around them.

Passing laser light through a specific nonlinear optical material called ppKTP (periodically poled potassium titanyl phosphate) generates entangled photon pairs. Subsequently, each photon of the entangled pair is transmitted through a fibre-optic cable and then has its polarization measured in one of several ways. By preparing a very large number of such pairs for different choices of the polarization measurements, and by registering the relative frequencies of the outcomes of these polarization measurements, such an experiment can characterize aspects of the quantum correlations.
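To illustrate how those relative frequencies encode the quantum correlations, here is a small numerical sketch in Python. It is an idealized toy calculation, not the group’s analysis code: it assumes a perfect $|\Phi^+\rangle$ polarization-entangled pair (a common idealization for this kind of source) and computes the joint outcome probabilities, and hence the correlation, for a given pair of polarizer angles.

```python
# Toy calculation for an idealized polarization-entangled photon pair.
# Illustrative sketch only; not the analysis used in the actual experiment.
import numpy as np

# Ideal |Phi+> = (|HH> + |VV>)/sqrt(2), written in the H/V basis.
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

def projector(theta):
    """Projector onto linear polarization at angle theta (radians)."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

def joint_probability(a, b, outcome_a, outcome_b):
    """Probability that photons 1 and 2 pass (+1) or fail (-1) polarizers at angles a, b."""
    identity = np.eye(2)
    Pa = projector(a) if outcome_a == +1 else identity - projector(a)
    Pb = projector(b) if outcome_b == +1 else identity - projector(b)
    M = np.kron(Pa, Pb)
    return float(phi_plus @ M @ phi_plus)

def correlation(a, b):
    """Expectation value E(a, b) of the product of the two +/-1 outcomes."""
    return sum(sa * sb * joint_probability(a, b, sa, sb)
               for sa in (+1, -1) for sb in (+1, -1))

a, b = 0.0, np.pi / 8  # example measurement angles
print("E(a, b)      =", round(correlation(a, b), 4))   # ideally cos(2(a - b))
print("cos(2(a-b))  =", round(np.cos(2 * (a - b)), 4))
```

In the ideal case the correlation is $\cos(2(a-b))$, and it is the pattern of such correlations across many choices of settings that the Bell analysis interrogates.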

Analyzing the experimental data: The model might yield a good fit, but is it overfitting?

The most novel part of what they did was in the data analysis.

The challenge they faced is that both model types – the classical causal models allowing “spooky action at a distance” or superdeterminism, and the quantum common-cause models – can fit the data produced by the experiment, so how do you determine which is correct?

“You still have a tool,” says Spekkens, “which is to see not just whether these models can fit the data – can they make the data that you’ve observed likely – but do they overfit the data?

“Overfitting is when your model mistakes statistical fluctuations for real features. And if you give your model too much freedom – too many parameters, for example – it can use that extra freedom and get a fit that does worse when you give it new data it hasn’t seen before.”

As Resch puts it: “The rope that hangs some models is they fit to every little random feature in the data, features that are not likely to be repeated.”

To see whether the models would overfit, the team first fit each model to a portion of the experimental data – the training data. Then, as a second step, they checked how accurately the fitted models could predict a separate set of data they had not seen before – the test data.
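For readers who want the logic of that train/test check spelled out, here is a minimal, self-contained sketch in Python. It uses made-up numbers and a generic pair of models rather than the causal models actually compared in the paper: a “flexible” model with one free parameter per outcome always matches the training counts at least as well as a more constrained model, but on average it predicts held-out data worse.

```python
# Minimal, hypothetical sketch of a train/test overfitting check
# (generic illustration, not the authors' actual model-selection procedure).
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "true" distribution over 8 joint outcomes; it is symmetric,
# so the constrained model below is not missing any real feature.
p_true = np.array([0.30, 0.05, 0.05, 0.10, 0.10, 0.05, 0.05, 0.30])
n_samples, n_runs = 100, 2000

def log_likelihood(counts, p):
    return float(counts @ np.log(np.clip(p, 1e-12, None)))

gap_train, gap_test = [], []
for _ in range(n_runs):
    train = rng.multinomial(n_samples, p_true)
    test = rng.multinomial(n_samples, p_true)

    # Flexible model: one free parameter per outcome; its maximum-likelihood
    # fit is simply the empirical training frequencies.
    p_flex = train / n_samples
    # Constrained model: imposes the symmetry, so it has half the parameters.
    p_constr = (p_flex + p_flex[::-1]) / 2

    gap_train.append(log_likelihood(train, p_flex) - log_likelihood(train, p_constr))
    gap_test.append(log_likelihood(test, p_flex) - log_likelihood(test, p_constr))

# The flexible model always wins on training data but, on average, loses on test data.
print("mean train-LL advantage of flexible model:", round(np.mean(gap_train), 2))
print("mean test-LL advantage of flexible model: ", round(np.mean(gap_test), 2))
```

The same diagnostic, applied to causal models of the Bell data, is what separates models that merely fit from models that genuinely predict.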

“What we found essentially is that the models that allowed for superluminal causation [spooky action] or superdeterminism tended to overfit the data,” said Spekkens. “They end up doing worse predicting the test data.” The quantum common-cause models did a much better job at avoiding overfitting.

What does this mean? First, it reinforces the idea that our understanding of quantumness is incomplete. The lesson of Bell experiments is not that there is spooky action at a distance or superdeterminism, but that we must accommodate a novel type of causal explanation, one in which the concepts of causation and inference are modified relative to classical theories.

The jury is still out, but the result corroborates an idea that has come out of recent theoretical work in the field of quantum causal inference, in which, as Spekkens describes, “maybe the causal structure is just fine. It’s still just a common cause. But the way we define a causal model, the parameters, the way we compute the correlation, changes, and so you go radical on the side of the parameters, and you stay conservative on the side of the causal structure.”

The implications are exciting, hinting at a potential way forward through one of the long-standing impasses in modern physics.

Harmonizing theory and experiment

The compelling results from this experiment could not have been achieved without the long-standing collaborative process Resch and Spekkens have built. Indeed, the very idea of testing for overfitting grew out of their previous work.

“There are themes that sort of persist through our projects,” says Spekkens. “We’ve been talking here about overfitting, and that’s a critical element of this project. But if you ask ‘Where did that idea come from?’, the answer is that in the prior project we were already thinking about overfitting. It was a natural progression from that to our more recent experiment. When you have this kind of long-standing collaboration and you have tools that you’ve developed and ways of thinking, there are a lot of natural next questions that present themselves.”

The scientific method is often thought of as a repeating cycle in which theorists invent a model, experimentalists test it, and the results inform the next theory.

“Science doesn’t always evolve so neatly. Working with real systems is messier than that, and more interesting,” says Resch. “I think the most productive discussions that we’ve had, and maybe the best results that we’ve produced, arose from thinking about ways to overcome some of that messiness.”

As an example, theorists’ models often make idealized assumptions about a scenario. In the real world, however, experiments have noise and random error and imperfections. Resch and Spekkens found that dealing with this reality pushed them both to do better science: they had to learn how to make the theory robust, and on the experimental side, they had to find new ways of obtaining more precise characterizations of their devices using novel data analysis techniques.

From left to right: Jean-Philippe MacLean, Kevin Resch, Katja Ried, and Robert Spekkens.
Image credit: Institute for Quantum Computing, University of Waterloo

The real-time dialogue between the theoretical and experimental aspects of their projects has been a vital part of their success.

“One thing that’s unique about our collaboration,” says Spekkens, “is that we tend to put the theory and experiment together in one paper.”

Resch concurs: “In some cases, that’s been essential because there’s a lot of back and forth. The theory might start with assumptions that can’t be realized in an experiment…so it’s important to be able to think about a practical implementation and make sure the theory can deal with it.”

The students working with Resch and Spekkens similarly benefit from the unique nature of the collaboration. Resch’s students can interact with theorists and grapple with some of the leading ideas on the theory side, while Spekkens’ students gain valuable experience engaging with experimentalists.

It is a joint effort, un-siloed and interactive. “Rather than me having a meeting about the theory with just the postdocs here [at Perimeter],” says Spekkens, “we’ll just get everyone together to talk about some open problem on the theory side. Similarly for the question of how to design the experiment or what experimental issues have come up. That’s been our mode of operation. Just always get everybody in the room to talk about whatever issue we’re having.”

In doing so, both theorists and experimentalists gain a deeper understanding of the challenges on both sides of the coin, and they become better physicists because of it.

Applications and next steps: The story isn’t over

Spekkens and Resch are planning their next collaboration. The research described here was a starting point, carried out with a minimal number of parameters, but they want to go further with it.

“The number of fit parameters in these models can grow rapidly with the number of states and measurements,” says Resch, “and now we’re upping the complexity with a better experiment and more efficient fitting algorithm that can optimize over all these parameters.”

This work deepens our conceptual understanding of causality in quantum mechanics, but it may have direct technological impact as well.

“When Bell’s inequalities were discovered in the 60s…there were no applications for it at the time except pure fundamental understanding,” says Resch. “But now with the emergence of quantum information technologies, these very fundamental aspects of quantum systems have practical applications. For example, Bell’s inequalities now play a role in communication security. If a quantum communication system violates Bell’s inequalities, you can be sure that there are no eavesdroppers eavesdropping on the system.”

Likewise, Resch continues, “contextuality [a feature of quantum systems where measurement results cannot be understood as arising from a set of pre-existing properties] has been connected to quantum enhancement in computational power. Our research can be viewed as identifying new quantum resources, which may lead to applications that we haven’t thought of yet.”

Once these quantum resources have been identified, they can be exploited for technologies such as quantum computing, communication, and cryptography.

In the long run, collaboration seems to be the engine that drives innovation. Theory and experiment each have something important to offer, but when brought together, they can elevate one another.

As Resch notes, you just can’t overstate “how important it is to have so many talented theorists and experimentalists co-located like we have in Waterloo, because we can do so much more together than we can do on our own. The Quantum Valley ecosystem is greater than the sum of its parts.… This [collaboration] is a good example of how that works.”


Further resources:

The research described above was published in Physical Review A this spring.

Resch’s quantum optics research group at the Institute for Quantum Computing maintains a website where you can learn about their current activities.

Spekkens’ Quantum Causal Inference Initiative at Perimeter also has a webpage where you can learn about his group’s work in the area.

Spekkens and Resch don’t do their research alone, and it is important to recognize the graduate students and postdoctoral fellows who have contributed to their collaborations over the years. Alongside Patrick J. Daley, mentioned above, these include Megan Agnew, Andrew Cameron, Michael Grabowecky, Ravi Kunjwal, Jean-Philippe W. MacLean, Michael D. Mazurek, Christopher Pollack, Matthew F. Pusey, Katja Ried, and Lydia Vermeyden.
