APS Physics

Nobel Prize: Quantum Entanglement Unveiled

7 October 2022: We have replaced our initial one-paragraph announcement with a full-length Focus story.

The Nobel Prize in Physics this year recognizes efforts to take quantum weirdness out of philosophy discussions and to place it on experimental display for all to see. The award is shared by Alain Aspect, John Clauser, and Anton Zeilinger, all of whom showed a mastery of entanglement—a quantum relationship between two particles that can exist over long distances. Using entangled photons, Clauser and Aspect performed some of the first “Bell tests,” which confirmed quantum mechanics predictions while putting to bed certain alternative theories based on classical physics. Zeilinger used some of those Bell-test techniques to demonstrate entanglement control methods that can be applied to quantum computing, quantum cryptography, and other quantum information technologies.

Since its inception, quantum mechanics has been wildly successful at predicting the outcomes of experiments. But the theory assumes that some properties of a particle are inherently uncertain—a fact that bothered many physicists, including Albert Einstein. He and his colleagues expressed their concern in a paradox they described in 1935 [ 1 ]: Imagine creating two quantum mechanically entangled particles and distributing them between two separated researchers, characters later named Alice and Bob. If Alice measures her particle, then she learns something about Bob’s particle—as if her measurement instantaneously changed the uncertainty about the state of his particle. To avoid such “spooky action at a distance,” Einstein proposed that lying underneath the quantum framework is a set of classical “hidden variables” that determine precisely how a particle will behave, rather than providing only probabilities.

The hidden variables were unmeasurable—by definition—so most physicists deemed their existence to be a philosophical issue, not an experimental one. That changed in 1964 when John Bell of the University of Wisconsin-Madison proposed a thought experiment that could directly test the hidden variable hypothesis [ 2 ]. As in Einstein’s paradox, Alice and Bob are each sent one particle of an entangled pair. This time, however, the two researchers measure their respective particles in different ways and compare their results. Bell showed that if hidden variables exist, the experimental results would obey a mathematical inequality. However, if quantum mechanics were correct, the inequality would be violated.
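
To see concretely what such a test compares, here is a minimal numerical sketch (not the analysis of any particular experiment) of the CHSH form of Bell’s inequality used in the photon experiments described below: local hidden-variable models bound a combination S of polarization correlations by 2, while quantum mechanics predicts values up to 2√2 for suitably chosen analyzer angles.

```python
# Minimal sketch of a CHSH-type Bell quantity for polarization-entangled photons.
# Assumption: ideal entangled pairs, for which the correlation between analyzers
# set at angles a and b is E(a, b) = cos(2*(a - b)).
import numpy as np

def E(a, b):
    """Quantum polarization correlation for analyzer angles a, b (in radians)."""
    return np.cos(2 * (a - b))

# Textbook angle choices that maximize the quantum prediction
a1, a2, b1, b2 = np.deg2rad([0.0, 45.0, 22.5, 67.5])

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"Quantum prediction: S = {S:.3f}; any local hidden-variable model gives |S| <= 2")
# -> S = 2.828 (i.e., 2*sqrt(2)), exceeding the classical bound
```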

Bell’s work showed how to settle the debate between quantum and classical views, but his proposed experiment assumed detector capabilities that weren’t feasible. A revised version using photons and polarizers was proposed in 1969 by Clauser, then at Columbia University, along with his colleagues [ 3 ]. Three years later, Clauser and Stuart Freedman (both at the University of California, Berkeley) succeeded in performing that experiment [ 4 ].

The Freedman-Clauser experiment used entangled photons obtained by exciting calcium atoms. When a calcium atom de-excites, it can emit two photons whose polarizations are aligned. The researchers installed two detectors (Alice and Bob) on opposite sides of the calcium source and measured the rate of coincidences—two photons hitting the detectors simultaneously. Each detector was equipped with a polarizer that could be rotated to an arbitrary orientation.

Freedman and Clauser showed theoretically that quantum mechanics predictions diverge strongly from hidden variable predictions when Alice and Bob’s polarizers are offset from each other by 22.5° or 67.5°. The researchers collected 200 hours of data and found that the coincidence rates violated a revamped Bell’s inequality, proving that quantum mechanics is right.
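
As a rough illustration of why those particular offsets matter, the quantity tested by Freedman and Clauser can be computed directly. The sketch below assumes ideal polarizers and perfectly correlated photon pairs, so the number it prints is the idealized prediction rather than the value expected for the 1972 apparatus.

```python
# Freedman's form of Bell's inequality, used in the 1972 experiment:
#   delta = |R(22.5°) - R(67.5°)| / R0 - 1/4 <= 0   for local hidden variables,
# where R(phi) is the coincidence rate with the polarizers offset by phi and
# R0 the rate with both polarizers removed.
import numpy as np

def R_over_R0(phi_deg):
    """Idealized quantum coincidence rate at relative angle phi, normalized to R0."""
    return 0.25 * (1 + np.cos(np.deg2rad(2 * phi_deg)))

delta = abs(R_over_R0(22.5) - R_over_R0(67.5)) - 0.25
print(f"Idealized quantum prediction: delta = {delta:.3f} (any value > 0 violates the bound)")
# -> delta ≈ 0.104; finite polarizer and detector efficiencies reduce the predicted
#    value in a real setup, but it stays positive, which is what the 200 hours of
#    data confirmed.
```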

The results of the first Bell test were a blow to hidden variables, but there were “loopholes” that hidden-variable supporters could claim to rescue their theory. One of the most significant loopholes was based on the idea that the setting of Alice’s polarizer could have some influence on Bob’s polarizer or on the photons that are created at the source. Such effects could allow the elements of a hidden-variable system to “conspire” together to produce measurement outcomes that mimic quantum mechanics.

To close this so-called locality loophole, Aspect and his colleagues at the Institute of Optics Graduate School in France performed an updated Bell test in 1982, using an innovative method for randomly changing the polarizer orientations [ 5 ]. The system worked like a railroad switch, rapidly diverting photons between two separate “tracks,” each with a different polarizer. The changes were made as the photons were traveling from the source to the detectors, so there was not enough time for coordination between supposed hidden variables.
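
A back-of-the-envelope check shows why the switching speed is the crucial parameter. The numbers below are illustrative assumptions of the same order as those reported for the 1982 setup (roughly 12 m between the two switches, with settings changed about every 10 ns); they are not taken from the text above.

```python
# Sketch of the locality condition in a timing-based Bell test: the analyzer
# settings must change faster than light can cross the apparatus.
c = 3.0e8             # speed of light (m/s)
L = 12.0              # assumed separation between the two switches (m); illustrative
switch_period = 1e-8  # assumed time between setting changes (s), ~10 ns; illustrative

crossing_time = L / c  # time any influence limited by c would need to cross the setup
print(f"light-travel time across the setup: {crossing_time * 1e9:.0f} ns")
print(f"settings change every {switch_period * 1e9:.0f} ns, "
      f"i.e. about {crossing_time / switch_period:.0f} times per crossing")
# With these numbers the settings change several times while any signal is still
# in flight, so neither side can "learn" the other polarizer's orientation in time.
```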

Zeilinger, who is now at the University of Vienna, has also worked on removing loopholes from Bell tests (see Viewpoint: Closing the Door on Einstein and Bohr’s Quantum Debate , written by Aspect). In 2017, for example, he and his collaborators devised a way to use light from distant stars as a random input for setting polarizer orientations (see Synopsis: Cosmic Test of Quantum Mechanics ).

Zeilinger also used the techniques of entanglement control to explore practical applications, such as quantum teleportation and entanglement swapping. For the latter, he and his team showed in 1998 that they could create entanglement between two photons that were never in contact [ 6 ]. In this experiment, two sets of entangled photon pairs are generated at two separate locations. One from each pair is sent to Alice and Bob, while the other two photons are sent to a third person, Cecilia. Cecilia performs a Bell-like test on her two photons, and when she records a particular result, Alice’s photon winds up being entangled with Bob's. This swapping could be used to send entanglement over longer distances than is currently possible with optical fibers (see Research News: The Key Device Needed for a Quantum Internet ).
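
The following is a minimal state-vector sketch of the swapping idea, an idealized illustration rather than the 1998 optical setup: two independent Bell pairs are prepared, one photon from each pair is projected onto a Bell state (Cecilia’s measurement), and the two photons that never met are left entangled.

```python
# Idealized entanglement swapping with state vectors (photons modeled as qubits).
import numpy as np

bell = np.array([[1, 0], [0, 1]], dtype=complex) / np.sqrt(2)   # |Phi+> = (|00>+|11>)/sqrt(2)

# Four-photon state: pair (1,2) x pair (3,4); tensor indices ordered (q1,q2,q3,q4)
psi = np.einsum('ab,cd->abcd', bell, bell)

# Cecilia projects photons 2 and 3 onto |Phi+>; photons 1 and 4 remain
left = np.einsum('bc,abcd->ad', bell.conj(), psi)

prob = np.sum(np.abs(left) ** 2)        # probability of this particular Bell outcome
post = left / np.sqrt(prob)             # normalized post-measurement state of (1,4)

fidelity = abs(np.einsum('ad,ad->', bell.conj(), post)) ** 2
print(f"P(Phi+ outcome) = {prob:.2f}, fidelity of photons 1 & 4 with Phi+ = {fidelity:.2f}")
# -> P = 0.25 and fidelity = 1.00: conditioned on Cecilia's result, Alice's and
#    Bob's photons are maximally entangled even though they never interacted.
```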

“Quantum entanglement is not questioned anymore,” says quantum physicist Jean Dalibard from the College of France. “It has become a tool, in particular in the emerging field of quantum information processing, and the three nominated scientists can be considered as the godfathers of this new domain.”

Quantum information specialist Jian-Wei Pan of the University of Science and Technology of China in Hefei says the winners are fully deserving of the prize. He has worked with Zeilinger on several projects, including a quantum-based satellite link (see Focus: Intercontinental, Quantum-Encrypted Messaging and Video ). “Now, in China, we are putting a lot of effort into actually turning these dreams into reality, hoping to make the quantum technologies practically useful for our society.”

–Michael Schirber

Michael Schirber is a Corresponding Editor for  Physics Magazine based in Lyon, France.

  • A. Einstein et al., “Can quantum-mechanical description of physical reality be considered complete?” Phys. Rev. 47, 777 (1935).
  • J. S. Bell, “On the Einstein Podolsky Rosen paradox,” Physics 1, 195 (1964).
  • J. F. Clauser et al., “Proposed experiment to test local hidden-variable theories,” Phys. Rev. Lett. 23, 880 (1969).
  • S. J. Freedman and J. F. Clauser, “Experimental test of local hidden-variable theories,” Phys. Rev. Lett. 28, 938 (1972).
  • A. Aspect et al., “Experimental test of Bell’s inequalities using time-varying analyzers,” Phys. Rev. Lett. 49, 1804 (1982).
  • J. W. Pan et al., “Experimental entanglement swapping: Entangling photons that never interacted,” Phys. Rev. Lett. 80, 3891 (1998).

More Information

  • Research News: Hiding Secrets Using Quantum Entanglement
  • Research News: Diagramming Quantum Weirdness
  • APS press release
  • The Nobel Prize in Physics 2022 (Nobel Foundation)

Alain Aspect’s experiments on Bell’s theorem: a turning point in the history of the research on the foundations of quantum mechanics

  • Published: 21 December 2022
  • Volume 76, article number 248 (2022)

  • Olival Freire Junior (ORCID: orcid.org/0000-0003-3401-8885)

Alain Aspect’s three experiments on Bell’s theorem, published in the early 1980s, were a turning point in the history of the research on the foundations of quantum mechanics, not only because they corroborated entanglement as the distinctive quantum signature but also because these experiments brought wider recognition to this field of research and to Aspect himself. These experiments may be considered the most direct precursors of the research on quantum information, which would blossom a decade later.

Graphical abstract

John Bell (1928–1990)—on the board, drawing of Aspect’s 1982 experiment with two-channel polarizers. Courtesy: Nature

1 Introduction

For a good part of the twentieth century, research on the foundations of quantum mechanics was held in low regard among physicists. This was particularly true concerning the possibility of completing quantum mechanics with additional variables. Evidence of this Zeitgeist abounds. David Bohm’s alternative interpretation of quantum mechanics was considered to belong “to the philosophy of science, rather than to the domain of physical science proper” ([ 31 ], p. 48). Hugh Everett’s interpretation faced such harsh opposition that he abandoned physics ([ 22 ], pp. 107–115, 129). On December 2, 1966, when John Bell had just published his theorem contrasting quantum mechanics and local hidden variables, Léon Rosenfeld wrote to him: “I need not tell you that I regard your hunting hidden parameters as a waste of your talent, I don’t know, either, whether you should be glad or sorry for that.” Disregard for Bell’s theorem continued even when experiments to test it were being performed. John Clauser, who conducted the first experiment contrasting local hidden-variable predictions with quantum mechanics, faced obstacles to obtaining a permanent position because American physicists doubted whether what Clauser was doing was “real physics” ([ 22 ], p. 271).

Rosenfeld’s stand reflected the shared wisdom at the time that there was no physics to be done about the possibility of completing quantum mechanics with additional variables. This wisdom was grounded in a mathematical proof formulated by John von Neumann, purporting to show that any hidden variables would be incompatible with quantum mechanics. It was also rooted in the widely shared view that Niels Bohr had put an end to this issue in the debate with Albert Einstein on the EPR experiment and the completeness of this physical theory. In fact, to counter this wisdom, Bell had criticized von Neumann’s proof and maintained that the last word on the matter of completeness had yet to be said. Footnote 1 However, while well regarded for his work on high-energy physics and accelerators, Bell was not yet the authoritative voice on the foundations of quantum mechanics that he would later become. In fact, traces of this previously shared wisdom survived the initial reception of Bell’s theorem. Thus, Abraham Pais, in his biography of Einstein, assessed that the EPR paper had no bearing on physics and did not cite Bell’s theorem as a development of this issue ([ 33 ], Chapter 25c).

A half-century later, the scene had completely changed. In 2010, the Wolf Prize in physics was awarded to John Clauser, Alain Aspect, and Anton Zeilinger “for their fundamental conceptual and experimental contributions to the foundations of quantum physics, specifically an increasingly sophisticated series of tests of Bell’s inequalities or extensions thereof using entangled quantum states.” Footnote 2 A little later, in 2013, Alain Aspect went to Copenhagen to receive the Bohr Medal, awarded by UNESCO for his contribution to understanding the non-locality of quantum mechanics, that is, entanglement. His talk at the ceremony was an exposition on entanglement. Footnote 3 Aspect explained that it is both a physical phenomenon that challenges our classical intuition and the physical ground for the current research on quantum information and cryptography. These prizes expressed both the high status acquired by work on entanglement among physicists and the role played by Aspect’s work in this story.

Aspect’s seminal experiments, the results of which were published in 1981–1982, and his doctoral thesis, presented in Paris in 1983, lie midway in this story. They were major breakthroughs in our understanding of the conflict between quantum mechanics and local realism and in the establishment of entanglement as part of the conceptual and practical toolkit of physics. Furthermore, they were also a turning point in the recognition among physicists of how good the physics being done on the foundations of quantum mechanics was. These experiments were responsible for the early prestige Aspect earned as an outstanding experimental physicist. Finally, in hindsight, Aspect’s work on Bell’s theorem is a milestone on the road leading to more sophisticated experiments on entanglement and to the blossoming of research on quantum information.

The second section of this paper presents Bell’s theorem and its first experiments, dealing mainly with those which used pairs of optical photons before Aspect announced his plan to perform a new investigation. Aspect’s works are analyzed in the third section. The fourth section deals with the impact of his work on the physics community. The epilogue is a summary of the theoretical and experimental developments concerning Bell’s theorem and the physical phenomenon of entanglement as they have evolved since Aspect’s early works. Footnote 4

2 The early history of Bell’s theorem

Since the very inception of quantum mechanics, around 1925–1927, when probabilistic descriptions of quantum states were introduced by Max Born, some physicists longed for a deeper theory able to overcome this weird feature. They appealed to an analogy with classical mechanics and statistical mechanics. Thus, if one considers quantum mechanics analogous to statistical mechanics, one should look for the quantum counterpart of classical mechanics, which would demand more variables than those already used in quantum mechanics. Indeed, countering this appeal was von Neumann’s motivation for his proof against the possibility of additional variables compatible with quantum mechanics. The situation became more acute in 1935 when Albert Einstein challenged the completeness of quantum mechanics through the EPR thought experiment. However, Bohr reacted to the challenge by arguing that no inconsistency appears if one considers the physical phenomenon as the whole formed by the system under investigation and the measurement devices required for such an investigation.

In the early 1950s, David Bohm challenged both von Neumann’s mathematical proof and the usual interpretation of the theory maintained by Bohr, Pauli, Heisenberg, and most physicists. Bohm materialized his challenge with a model of particles with well-defined paths guided by a new potential, the quantum potential, which was able to reproduce results from non-relativistic quantum mechanics. In hindsight, Bohm’s feat can be explained by the fact that his quantum potential introduced non-local features, thus making the model compatible with standard quantum mechanics, but it was not understood this way at the time. In addition, while Bohm was aware his model was a counterexample to von Neumann’s proof ([ 14 ], p. 187), he did not identify flaws in this proof. Footnote 5 Thus, as late as the early 1960s, the foundations of quantum mechanics, as far as the possibility of completing this physical theory through additional variables was concerned, were in an untidy state. Nobody could say what the problem was with the coexistence of von Neumann’s proof and Bohm’s counterexample. In this context, EPR did not gain much attention among physicists. They thought Bohr’s answer was right even though they did not analyze in detail what was at stake with such an experiment. Footnote 6 The reception of the EPR paper led Sidney Redner [ 36 ] to label it a “sleeping beauty”: it lay dormant, attracting little attention and few citations, until the paper suddenly began to receive attention from the scientific community.

John Bell’s mid-1960s papers were, in the history of physics, the equivalent of the prince’s kiss in the fairy tale. According to his recollections, Bell became fascinated by the subject of physical models challenging von Neumann’s proof in the early 1950s. “In 1952 I saw the impossible done” and “Bohm’s [ 14 ] papers on quantum mechanics were for me a revelation” were Bell’s statements (1982, 1987). However, his attention was diverted to other subjects, high-energy physics and accelerators, and only in the early 1960s did he come back to von Neumann’s proof. Bell’s works focused on the critical analysis of the assumptions behind von Neumann’s proof and its reformulations, and later, on the assumptions behind the Einstein–Podolsky–Rosen Gedankenexperiment . After understanding why Bohm’s model had survived such a proof (it was as non-local as quantum theory), Bell asked himself what kind of conflict might remain between hidden variables and quantum theory.

Then, Bell went back to the EPR experiment and took the next logical step: to isolate the reasonable assumptions behind Einstein’s argument and check the compatibility between these assumptions and quantum mechanics. Einstein’s realism implied that physical systems have well-defined properties independent of being observed. In addition, Bell [ 8 ] noted that the “vital assumption” when dealing with a two-particle system is that what is being measured on one of them does not affect the other. He recalled Einstein’s dictum, according to which, “on one supposition we should, in my opinion, absolutely hold fast: the real factual situation of the system S2 is independent of what is done with the system S1, which is spatially separated from the former.” As Bell knew that Bohm’s hidden-variable theory did not satisfy this dictum, he built a simple model of a hidden-variable theory obeying Einstein’s assumptions and showed that its results conflict with quantum mechanical predictions in very special cases. This is Bell’s theorem: no local hidden-variable theory can recover all quantum mechanical predictions. The result follows from the violation, by quantum mechanical predictions, of an inequality derived from the assumption of such hidden variables. Since then, many other analogous inequalities have been obtained, adopting somewhat different premises. As a result, it is usual nowadays to speak of Bell’s inequalities as the quantitative expression of Bell’s theorem.
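
For concreteness, a standard textbook rendering of the argument (a sketch in modern notation, not Bell’s original one) reads as follows. If \(E(\mathbf{a},\mathbf{b})\) denotes the expectation value of the product of the two outcomes for analyzer directions \(\mathbf{a}\) and \(\mathbf{b}\), any local hidden-variable model satisfies

\[ 1 + E(\mathbf{b},\mathbf{c}) \;\ge\; \bigl|E(\mathbf{a},\mathbf{b}) - E(\mathbf{a},\mathbf{c})\bigr| , \]

whereas quantum mechanics predicts \(E(\mathbf{a},\mathbf{b}) = -\mathbf{a}\cdot\mathbf{b}\) for the spin-singlet state. For coplanar directions at 0°, 60°, and 120°, the right-hand side equals 1 while the left-hand side equals 1/2, so the quantum prediction violates the bound; these are the “very special cases” referred to above.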

2.1 The first experiments

Meaningful reactions to Bell’s theorem were not immediate. At the end of the 1960s, however, a few American physicists acknowledged the novelty implied by this theorem. After working along different paths, they joined efforts and produced the CHSH paper [ 16 ]. In this paper, John Clauser, Abner Shimony, Michael Horne, who was a doctoral student under Shimony at Boston University, and Richard Holt, a doctoral student of Francis Pipkin at Harvard, translated Bell’s theorem into viable experiments and noted that no available experimental results could be used either to corroborate quantum mechanics or to support hidden variables based on local realistic assumptions.

In the early 1970s, the first results of tests of Bell’s theorem using pairs of polarized photons became available, and they conflicted with each other. The conflict grabbed the attention of physicists not initially involved with the subject, such as John Archibald Wheeler, and led to a rush for new experiments. Footnote 7 At Berkeley, Clauser and Stuart Freedman obtained results confirming quantum predictions and violating Bell’s inequalities [ 21 ], and, at Harvard, Holt got results against quantum mechanics [ 27 ]. Clauser tried repeating Holt and Pipkin’s experiment, and unlike the Harvard physicists, he obtained results confirming quantum predictions. Edward Fry and Randall Thompson attempted to improve the experimental techniques using a tunable laser to excite the atomic sample. With this improvement, Fry was able to get results in 80 min, while Clauser’s first experiment had required around 200 h of data collection. Fry’s and Clauser’s results confirmed quantum mechanics predictions [ 15 , 23 ]. However, so far no experiment was immune to the loophole of a hypothetical unknown interaction between the two polarizers, which were far away from each other. Such a loophole has since been known as the locality loophole. The year before, in 1975, Alain Aspect had entered the game, suggesting a different experiment to test Bell’s theorem [ 1 ], inspired by an earlier suggestion of John Bell to close that loophole; Aspect detailed the proposal one year later [ 2 ].

3 Aspect’s experiments and doctoral dissertation

Aspect’s [ 1 ] main proposal was to use “versatile” polarizers “whose orientations are changed rapidly and repeatedly in a stochastic manner” in such a way that the principle of separability holds: “the response of a polarizer is independent of the state of another device that is separated from the former by a space-like interval, according to special relativity.” Thus, the time required to change the polarization orientation should be less than the time light requires to cross the distance between the two polarizers. It was not a new idea. Indeed, Aspect was following a proposal from Bell, who had suggested it at the end of the paper presenting his theorem [ 8 ], recalling that Bohm and Yakir Aharonov had formulated such a suggestion in 1957. Clauser had thought about this experiment but did not try to perform it.

At the time Aspect made such a proposal, he had returned from his civil service (as a French coopérant ) in Africa and was looking for a subject in optics for his French doctorat d’état . France then had two types of doctoral degrees, the doctorat de troisième cycle and the doctorat d’état . Aspect already had the former, obtained with work on holography. The latter was a significant research project, longer than current doctoral degrees, and without the need for supervision. These two kinds of degrees would be unified in 1984 into the current French Ph.D.

In 1974, Aspect had gone to the Institut d’Optique at Orsay looking for a subject for his research and met Christian Imbert there, who handed him a bunch of Bell’s papers. Aspect became fascinated by them and thought about performing the experiment Bell had suggested: to rotate the polarizers while the photons were in flight. It was a daring but wise plan. Had Aspect chosen to repeat the kind of experiments being run in the USA, his proposal would have become totally unattractive the following year. Indeed, as we have just seen, in 1976, Clauser’s and Fry’s results settled the initial tie between Freedman and Clauser’s and Holt’s conflicting results. Aspect’s proposal was more ambitious but harder to win due to the intrinsic technical difficulty of changing the polarizers while the photons were in flight. Aspect’s strategy was to use acoustic standing waves interacting with a beam of light, thus obtaining two channels with transmitted and diffracted beams.

Aspect spent around five years planning and performing the experiments. In the meantime, he realized that the very techniques he would use could produce better results for the kind of experiments that had already been performed by S. Freedman, J. Clauser, R. Holt, E. Fry, and R. Thompson in the USA. This led him to plan the realization of three different experiments.

Eventually, the experimental results were published between 1981 and 1982. The first experiment was a replication of the experimental tests of Bell’s inequalities earlier conducted by Clauser, Holt, and Fry. However, Aspect used two tunable lasers to excite the sample, which provided him with a source of higher efficiency. The experimental run lasted 100 s. Furthermore, Aspect took the opportunity to check Furry’s conjecture, which suggested that the quantum non-locality would vanish after the photons traveled a distance of the order of the coherence length of their associated wave packets, which meant 1.5 m in this experiment. In mathematical terms, Furry’s conjecture says that a pure state would evolve toward a mixture of factorized states after such a distance. In Aspect’s experiment, the source was separated from the polarizers by 6.5 m. This first experiment was conducted with the collaboration of Aspect’s undergraduate student, Philippe Grangier, and the research engineer Gérard Roger. In the second experiment, still with Grangier and Roger as collaborators, Aspect used two-channel polarizers to have a straightforward transposition of the EPR Gedankenexperiment . In the previous experiments held in the USA, when one of the detectors was not triggered, it was impossible to know whether this resulted from the low efficiency of the detectors or whether the polarizer had blocked the photon, which would be a real measurement. For this reason, auxiliary experiments without the polarizers were needed to circumvent this intrinsic deficiency of the setup.
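
As an illustration of what Furry’s conjecture amounts to for the cascade photons (a sketch in standard notation, not taken from Aspect’s papers), the conjectured evolution is from the pure entangled state to a mixture of factorized states,

\[ |\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|x,x\rangle + |y,y\rangle\bigr) \;\longrightarrow\; \rho_{F} = \tfrac{1}{2}\bigl(|x,x\rangle\langle x,x| + |y,y\rangle\langle y,y|\bigr). \]

For polarizers at angles \(\alpha\) and \(\beta\), the pure state gives a joint transmission probability \(\tfrac{1}{2}\cos^{2}(\alpha-\beta)\), which depends only on the relative angle and can violate Bell’s inequalities, whereas the mixture gives \(\tfrac{1}{4}(1+\cos 2\alpha\cos 2\beta)\), which, being separable, cannot. With 6.5 m between the source and the polarizers, far beyond the 1.5 m coherence length, Aspect’s results, in agreement with quantum mechanics, showed no sign of such a transition.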

The third experiment had the widest impact. Now with Jean Dalibard and Gérard Roger as co-workers, Aspect produced the first-ever test of Bell’s inequalities with time-varying analyzers. Dalibard was then a young student at the École Normale Supérieure who volunteered to work with Aspect on this experiment because he wanted “to turn the knobs of an experiment that will remain in the books,” a premonitory view. Footnote 8 Aspect’s ingenuity was to use a switch to redirect the incident photons to two different polarizers. This device worked through the planned acousto-optical interaction with an ultrasonic standing wave in water. Aspect was aware that such a device would operate in a periodic manner and not in a genuinely stochastic way. They sought to attenuate this limitation, since it could not be completely overcome.

In all these experiments, Aspect obtained results in clear-cut violation of Bell’s inequalities and strong confirmation of quantum mechanics predictions. The first experimental result was \({\delta}_{exp} = 5.72 \times 10^{-2} \pm 0.43 \times 10^{-2}\), while the relevant Bell inequality was \(\delta \le 0\) and the quantum mechanical prediction was \({\delta}_{QM} = 5.8 \times 10^{-2} \pm 0.2 \times 10^{-2}\). Thus, Bell’s inequality was violated by more than 13 standard deviations [ 5 ]. The second experimental result was \({S}_{exp} = 2.697 \pm 0.015\). In this case, the Bell inequality at stake was \(-2\le S\le 2\) and \({S}_{QM} = 2.70 \pm 0.05\), which was to that date the strongest violation of Bell’s inequalities ever reported. In each of these experiments, the runs lasted 100 s [ 7 ].

The last experimental result, from the experiment using time-varying analyzers, was telling precisely because of its novelty, and possibly because of this it was the result that most resonated in the physics community, despite its accuracy being lower than that of the other experiments. In this case, the Bell inequality being checked was \(S\le 0\) and the value predicted by quantum mechanics was \({S}_{QM} = 0.112\). The runs lasted 200 min and the experimental result was \({S}_{exp} = 0.101 \pm 0.020\), violating Bell’s inequality by five standard deviations [ 6 ].
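
A quick arithmetic check (a sketch using only the numbers quoted above and the corresponding Bell bounds of 0, 2, and 0) reproduces the statistical significance of the three violations:

```python
# Number of standard deviations by which each quoted result exceeds its Bell bound.
results = {
    "1981 single-channel (delta <= 0)": (5.72e-2, 0.43e-2, 0.0),
    "1982 two-channel (S <= 2)":        (2.697,   0.015,   2.0),
    "1982 time-varying (S <= 0)":       (0.101,   0.020,   0.0),
}
for label, (value, sigma, bound) in results.items():
    print(f"{label}: {(value - bound) / sigma:.1f} standard deviations")
# -> about 13, 46, and 5 standard deviations, respectively
```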

Jumping ahead in time, let us note that the three experiments were included in Aspect’s doctoral dissertation, which was assessed by the panel of examiners at the Université de Paris-Sud at Orsay on February 1, 1983. The dissertation is broader than the ensemble of results already published in the papers. It includes a clear introduction to Bell’s theorem and its first experiments, an invaluable source if used as a textbook on this subject. In addition, each technical, experimental, or conceptual choice, either mentioned or briefly justified in the papers, is explicitly considered in detail there. The dissertation’s conclusion deserves comment. Aspect’s main conclusion is that Bell’s inequalities are violated and that his experimental results are in excellent agreement with quantum mechanics predictions. However, he also presents their main limitations, namely the weak sensitivity of the photodetectors, which led to the additional hypothesis of taking the detected photon pairs as a representative sample of the emitted pairs, and, in the case of the third experiment, the not strictly random nature of the changing polarizers. Aspect was not optimistic about possible improvements to these features in the near term, a topic we come back to in the following section.

Aspect’s conclusion is cautious but strong. From a strictly logical point of view, the dispute between quantum mechanics and local hidden variables is not yet closed. However, he puts the emphasis of his conclusion on another side of the issue: the results are in precise agreement with quantum mechanics predictions for these actual experiments. On this issue, he ends his dissertation by recalling what Bell had said about Aspect’s suggested experiment. Indeed, in 1976, when Aspect announced the experiment with the changing polarizers, Bell (1976, p. 442) declared: “It is therefore of the highest interest that an atomic cascade experiment is now under way, presented here by Aspect, in which the polarization analyzers are in effect reset while the photons are in flight.” Aspect ( 1983 , p. 346) further recalled what Bell had said at that time: “if this experiment gives the expected result, this will be the confirmation of what is, from my point of view, considering the discussions on locality, one of the most telling predictions of quantum theory.”

The panel evaluating Aspect’s doctoral dissertation reflects the professional network he had been able to build around himself and around the subject of Bell’s theorem. It included André Maréchal and Christian Imbert, who had welcomed him at the Institut d’Optique at Orsay for his French doctorat d’état : Maréchal was the director of the institute and a major figure in French optics, and Imbert was the one who had handed him a collection of Bell’s papers. John Bell and Bernard d’Espagnat were experts in the foundations of quantum mechanics who discussed with him and supported him from the start of his doctoral research. Franck Laloë, who had co-authored with C. Cohen-Tannoudji and B. Diu an influential quantum physics textbook [ 18 ], shared an early interest in Bell’s theorem, having attended the 1976 Erice conference organized by Bell, one of the first conferences dedicated to this subject. Footnote 9 More recently, Laloë [ 29 ] published an authoritative book on the subject, provocatively titled “Do We Really Understand Quantum Mechanics?” Finally, Claude Cohen-Tannoudji was a leader in French quantum optics who would share the 1997 Physics Nobel Prize; Aspect built his relationship with Cohen-Tannoudji during his doctoral research, leading to a collaboration that began before the dissertation was concluded and continued afterward.

4 The following day

Aspect’s experiments marked a turning point in the history of the research on the foundations of quantum mechanics, at least as far as the issue of hidden variables and quantum mechanics is concerned. On the one hand, Aspect’s experiments brought stronger evidence than previous experiments in favor of quantum mechanics and against local hidden-variable theories. On the other hand, and more importantly, these experiments brought recognition to research on these issues. Evidence of this appeared very quickly.

In 1982, just after the announcement of these results and before the dissertation’s approval, Aspect’s reputation skyrocketed. He was one of the invited speakers at the Eighth International Conference on Atomic Physics, held in Sweden, to report on his experiments on Bell’s inequalities. The American physicist Arthur Schawlow, Physics Nobel Prize winner in 1981, was asked to deliver the concluding report of the conference. He chose Bell’s theorem and its experiments as the main topic of his speech [ 38 ]:

Physical metaphors, such as the dual concepts of particles and waves in dealing with the light and atoms, are more than just conveniences, but rather are practical necessities. [. . .] But the experiments on Bell’s inequalities are making it difficult for us to continue using some of our familiar physical metaphors in the old ways. We are used to thinking that light waves are produced at an atom with definite polarizations and are subsequently detected by remote detectors. However, the experiments show that if anything is propagated, it seems to convey more polarization information than a transverse wave. [. . .] As an experimentalist, I like to think that there is something there that we call an atom, and that we can make good measurements on it if we are careful not to disturb it too much. But the experiments on polarization of correlated photons don’t bear out these expectations.

Two years later, Feynman, who once refused to talk about hidden variables with John Clauser while the first experiment on Bell’s theorem by Freedman and Clauser was being carried out (Freire 2015, p.272), attended a seminar given by Aspect at Caltech on the tests of Bell’s theorem. At this seminar, Aspect finished his talk by quoting a certain paper whose author derived results like Bell’s inequalities and went on to discuss whether it was “a real problem.” According to Aspect, this author had provided an answer that was so unclear that he “had found it amusing to quote it as a kind of joke to conclude this presentation.” Only at this point did Aspect reveal the name of the author, Richard Feynman. According to Aspect, nobody in the audience laughed, until Feynman himself laughed. Feynman later checked the quotation and wrote to Aspect, conceding he was right and saying, “once again let me say, your talk was excellent.” Footnote 10

Schawlow’s and Feynman’s positive reactions may be framed in the context of the number of citations that Aspect’s papers would obtain for years to come. By any standard, these papers have had an impressive number of citations. So far, these three papers have received 4,895 citations. If we consider only the few years following the publication of these papers, the figures are revealing, as they give us a sense of the immediate recognition of Aspect’s work. The following table gives the number of citations for each paper in the years 1982, 1983, 1984, and 1985, as well as the total number of citations so far. Considering that these papers began to be cited from roughly 1983 on, we may calculate the average number of citations, per year and per paper, from 1983 to 1985. We thus arrive at the figure of 27 citations per paper per year, impressive even by current standards, 40 years later, with a much larger community of researchers.

Paper / citations     1982    1983    1984    1985    1982–2022
1st paper               10      16      24      21        1,185
2nd paper                3      18      25      33        1,525
3rd paper                –      18      34      41        2,185

Source: Web of Science, accessed on May 10, 2022. The 1st paper is [ 5 ], the 2nd paper is [ 7 ], and the 3rd paper is [ 6 ]; no 1982 figure is given for the 3rd paper, which appeared only in late December 1982.

In a study of the history of the research on the foundations of quantum mechanics between 1950 and 1990, I dubbed the physicists who approached such subjects “quantum dissidents” [ 22 ], their dissent lying in their conviction that there was good physics to be done on the foundations of quantum mechanics. This meant they challenged the view, shared by most physicists at the time, that foundational issues had already been solved by the founding fathers of the discipline. The quantum dissidents included physicists such as Bohm, Everett, Bell, Eugene Wigner, Shimony, Clauser, d’Espagnat, Aspect, H.-D. Zeh, Franco Selleri, and Tony Leggett. In this ensemble, however, Aspect played a singular role as a transitional protagonist. He was aware of the prevailing prejudices against the kind of research he intended to conduct. John Bell might have been the first to warn him of this when Aspect sought him out to discuss his planned experiments. Bell asked him: “Do you have a permanent job?” Luckily Aspect did, and this made a huge difference. Initially he suffered from the indifference toward his subject, but he overcame it. His pedagogical skill in explaining to his colleagues why testing Bell’s theorem was relevant to the development of physics helped him. However, the major transition came after the publication of the three papers. As we have seen, he was immediately acknowledged by physicists as somebody who had done first-class physics. Some of the dissidents, however, paid a high professional price for their dissidence. Footnote 11

As we have seen, Aspect was not optimistic about the possibility of improving his experimental technique to replicate experiments related to Bell’s theorem. Indeed, his experiments had been so convincing that in subsequent years nobody else bothered to replicate them. The reasons for this were related to the perceived unfeasibility of new experimental breakthroughs, as Aspect himself remarked: “I do not see how further meaningful progress can be made in the domain of Bell’s inequalities, at least with our apparatus. We have exploited its maximal possibilities. Sure, an additional decimal could be obtained, but would it be worth it?” Footnote 12 Aspect himself moved on to other rewarding topics of research. Invited by Cohen-Tannoudji, he began to work on the use of lasers to cool down atoms and later became a leader in the field of atomic optics.

However, the story of experimental tests of Bell’s theorem did not end with Aspect’s work. The revival began slowly five years later with the discovery of a better way to produce pairs of photons with entangled polarizations. The new sources did not use atomic cascades; instead, the pairs of entangled photons were created in the interaction between a laser beam and nonlinear optical crystals, a process named parametric down-conversion (PDC). While this process had been known since the early days of nonlinear optics, only in the late 1980s did Yanhua Shih and Carroll Alley at the University of Maryland in College Park and Zhe-Yu Ou and Leonard Mandel at the University of Rochester have the idea of using this source to redo experiments on Bell’s theorem [ 32 , 40 ]. The first experiments with the new source did not produce better violations of Bell’s inequalities than those previously obtained by Alain Aspect. However, as time went by, the use of this source was improved, in both conceptual and experimental respects, and the results surpassed Aspect’s. In addition to this better source of pairs of entangled photons, there remained loopholes other than the locality loophole, which Aspect had addressed. Physicists had been aware, since the first experiments on Bell’s theorem, that such loopholes allowed local realism to survive, even if these escape routes were hardly plausible.

The loopholes were related to the additional assumptions that experimental physicists need to make in the transition from Bell’s original inequality to the CHSH inequality, which can be applied to an actual experiment, and then to real laboratory setups. The locality loophole, as we have seen, was approached by Aspect’s third experiment, but the quasi-random nature of the changing polarizers was a limitation on considering it closed. Another loophole, called the “detector-efficiency loophole” or “fair-sampling loophole,” derived from the fact that, since not all pairs of photons could be detected, the sample used in the statistical comparison of experimental results with Bell’s inequalities could, in principle at least, be biased. Indeed, with the detectors used in the early Bell’s theorem experiments, it was possible to mimic the experimental results starting from local-realist assumptions. The third loophole has been called the freedom-of-choice loophole. While the locality loophole concerns the transmission of information among parts of the experimental device, the freedom-of-choice loophole concerns the possibility of a hypothetical common cause influencing both the choice of measurement settings and the correlations among the entangled systems. These loopholes have been tackled by different teams, with different strategies, and in different places. Footnote 13

A review of the wide series of Bell’s theorem experiments dealing with these loopholes falls beyond the scope of this paper. A few milestones in this series were the following. In 1998, Anton Zeilinger and his team improved on Aspect’s 1982 experiment with time-varying analyzers, reinforcing the locality condition by changing the analyzers in a more strongly random manner and with the detectors separated by 400 m. They obtained full agreement with quantum mechanics predictions and violations of Bell’s inequality by over 30 standard deviations [ 43 ]. In 2015, three different experiments simultaneously closed the locality and the fair-sampling loopholes. They were carried out in Austria, the USA, and the Netherlands, led by Zeilinger at the University of Vienna, Lynden Shalm at NIST in Boulder, Colorado, and Ronald Hanson at Delft University of Technology. Zeilinger’s and Shalm’s teams took advantage of newly available photon detectors with efficiencies above 90%, in addition to using pairs of photons entangled through parametric down-conversion and a scheme with a new type of random number generator to change the polarizers’ alignments. Hanson’s team developed a different strategy to prevent the fair-sampling loophole. Their scheme, an “event-ready” scheme in Bell’s terminology, allowed them to measure spin components of a kind of artificial atom, two nitrogen-vacancy (NV) centers, separated by 1.3 km. This scheme was further developed by H. Weinfurter and colleagues in Garching, Germany. The impact of the three 2015 experiments was assessed by Aspect [ 4 ] in a “Viewpoint” paper meaningfully titled “Closing the Door on Einstein and Bohr’s Quantum Debate.” Footnote 14

Aspect’s title was right if we consider that the third loophole was not on the agenda of the debate between Einstein and Bohr. However, as a logical possibility, initially noted by Shimony, Horne, and Clauser in 1976, it still required addressing. Footnote 15 Jason Gallicchio, Andrew Friedman, and David Kaiser (2014) suggested using signals from cosmic sources to change the polarizers’ alignments and, through this strategy, to push far back in time any such hypothetical common cause. This proposal led to a joint effort among physicists in Austria and in the USA, led by Zeilinger. Eventually, they were able to use signals coming from two different quasars to adjust their settings, light that took 7.78 billion years to arrive at one of the detectors and 12.21 billion years to arrive at the other. Thus, as concluded by Kaiser ( 2022 , 361), this experiment “excluded such local-realist, freedom-of-choice scenarios from 96.0% of the space–time volume of the past light cone of the experiment, extending from the Big Bang to the present time,” thus corroborating the current view that quantum entanglement is a genuine feature of nature. Footnote 16

Parallel to the new experiments with entanglement, there was yet another surprise in store for Aspect. From the late 1980s on, physicists began to conjecture about the use of entanglement, the physical effect brought to light by research on Bell’s theorem, for quantum computing and cryptography. At a certain juncture, in the mid-1990s, new fields in physics were born: quantum information and quantum cryptography. The core physical effects in these fields are entanglement and decoherence. Thus, the 2010 Wolf Prize in physics, awarded to Clauser, Aspect, and Zeilinger, recognized the role played by the leaders of three distinct generations of physicists who had worked on Bell’s theorem and paved the road to the current blossoming of quantum information.

Data Availability Statement

This manuscript has no associated data or the data will not be deposited. [Authors' comment: The datasets generated during and/or analyzed during the current study, such as letters, interviews, and citations, are available from the corresponding author on reasonable request.]

Bell exemplified the “view that the possibility of hidden variables has little interest,” citing works by Rosenfeld, Pauli, Heisenberg, and N. R. Hanson ([ 9 ], p. 451).

Wolf Fund Prize Announcement 2010, online: http://www.wolffund.org.il/index.php?dir=site&page=winners&cs=283&language=eng .

The conference full title was “An Open World—Science, Technology and Society in the Light of Niels Bohr’s Thoughts.” See http://bohr-conference2013.ku.dk/ .

In Sects.  3 , 4 and the epilogue, I draw from my book [ 22 ] “The Quantum Dissidents—Rebuilding the Foundations of Quantum Mechanics (1950–1990)”, pp. 274–279 and 290–301.

Bell would criticize Bohm’s “lack of clarity, or else accuracy” on this point ([ 9 ], p. 451).

Noting this attitude, John Bell and Michael Nauenberg remarked: “We emphasize not only that our view [that quantum mechanics is, at the best, incomplete] is that of a minority but also that current interest in such questions is small. The typical physicist feels that they have long been answered, and that he will fully understand just how if ever he can spare 20 min to think about it” [ 13 ].

Optical photons were not the sole choice for Bell’s theorem experiments. For a wider review, see Clauser & Shimony [ 17 ]. For an assessment of these experiments, see Paty [ 34 ]. On the Wu & Shaknov experiment, see Silva [ 41 ].

Alain Aspect, interviewed by O. Freire & I. Silva, 16 Dec 2010 & 19 Jan 2011, American Institute of Physics, College Park, MD.

For the proceedings of this conference, see Progress in Scientific Culture—The Interdisciplinary Journal of the Ettore Majorana Centre, 1/4, 439–460, 1976.

Richard Feynman to Alain Aspect, 28 Sep 1984. Richard P. Feynman Papers, Box 22, Folder 15, California Institute of Technology Archives. On the episode, see Freire (2015, p. 278).

See, particularly, the cases of Everett, Clauser, Tausk, and Zeh, in Freire (2015).

Aspect, interviewed in Deligeorges ( 1985 , p. 137).

For a conceptual presentation of these three loopholes, as well as for a survey of recent Bell’s experiments, see Kaiser [ 28 ]. On the history of techniques related to Bell’s theorem experiments, see [ 42 ].

Vienna’s experiment is [ 25 ], NIST’s experiment is [ 39 ], Delft’s experiment is [ 26 ], and Garching’s experiment is [ 37 ].

For the history of the freedom-of-choice loophole, and attempts to close it, see Kaiser ( 2022 , from p. 356 on).

The cosmic experiment’s results are in [ 35 ]. See also ([ 28 ], p. 361) for the experiment conducted in Shanghai by Jian-Wei Pan and his team (Li et al., 2018), where they closed the three loopholes in the same investigation while taking signals from nearby bright stars.

A. Aspect, Proposed experiment to test separable hidden-variable theories. Phys. Lett. 54A (2), 117–118 (1975)

A. Aspect, Proposed experiment to test the nonseparability of quantum mechanics. Phys. Rev. D 14 (8), 1944–1951 (1976)

A. Aspect, Trois tests expérimentaux des inégalités de Bell par corrélation de polarisation de photons. Doctoral dissertation, Université Paris-Sud (Paris XI), 1983. Accessed at https://tel.archives-ouvertes.fr/tel-00011844 on May 9, 2022

A. Aspect, Closing the door on Einstein and Bohr’s quantum debate. Physics 8 , 123 (2015)

A. Aspect, P. Grangier, G. Roger, Experimental tests of realistic local theories via Bell’s theorem. Phys. Rev. Lett. 47 (7), 460–463 (1981)

A. Aspect, J. Dalibard, G. Roger, Experimental test of Bell inequalities using time-varying analyzers. Phys. Rev. Lett. 49 (25), 1804–1807 (1982)

A. Aspect, P. Grangier, G. Roger, Experimental realization of Einstein-Podolsky-Rosen-Bohm Gedankenexperiment—a new violation of Bell inequalities. Phys. Rev. Lett. 49 (2), 91–94 (1982)

J.S. Bell, On the Einstein Podolsky Rosen paradox. Physics 1 , 195–200 (1964)

J.S. Bell, On the problem of hidden variables in quantum mechanics. Rev. Mod. Phys. 38 (3), 447–452 (1966)

J.S. Bell, On the impossible pilot wave. Found. Phys. 12 (10), 989–999 (1982)

J.S. Bell, Beables for quantum field theory, in Quantum Implications: Essays in Honour of David Bohm . ed. by B.J. Hiley, F.D. Peat (Routledge & Kegan, London, 1987), pp.227–234

J.S. Bell, Speakable and Unspeakable in Quantum Mechanics: Collected Papers on Quantum Philosophy. With an Introduction by Alain Aspect (Cambridge University Press, Cambridge, 2004)

J.S. Bell, M. Nauenberg, The moral aspect of quantum mechanics, in Preludes in Theoretical Physics . ed. by A. De Shalit, H. Feshbach, L. Van Hove (North Holland, Amsterdam, 1966), pp.279–286

D. Bohm, A suggested interpretation of the quantum theory in terms of “hidden” variables. I. Phys. Rev. 85 (2), 166–179 (1952)

J.F. Clauser, Experimental investigation of a polarization correlation anomaly. Phys. Rev. Lett. 36 (21), 1223–1226 (1976)

J.F. Clauser, M.A. Horne, A. Shimony, R.A. Holt, Proposed experiment to test local hidden variable theories. Phys. Rev. Lett. 23 (15), 880–884 (1969)

J.F. Clauser, A. Shimony, Bell’s theorem—experimental tests and implications. Rep. Prog. Phys. 41 (12), 1881–1927 (1978)

C. Cohen-Tannoudji, B. Diu, F. Laloë, Mécanique Quantique (Hermann, Paris, 1973)

S. Deligeorges, Le Monde Quantique (Éditions du Seuil, Paris, 1985)

A. Einstein, B. Podolsky, N. Rosen, Can quantum-mechanical description of physical reality be considered complete? Phys. Rev. 47 , 777–780 (1935)

S.J. Freedman, J.F. Clauser, Experimental test of local hidden-variable theories. Phys. Rev. Lett. 28 (14), 938–941 (1972)

O. Freire Junior, The Quantum Dissidents: Rebuilding the Foundations of Quantum Mechanics (1950–1990) (Springer, Berlin, 2015)

E.S. Fry, R.C. Thompson, Experimental test of local hidden-variable theories. Phys. Rev. Lett. 37 (8), 465–468 (1976)

J. Gallicchio, A.S. Friedman, D.I. Kaiser, Testing Bell’s inequality with cosmic photons: closing the setting-independence loophole. Phys. Rev. Lett. 112 (11), 110405 (2014)

M. Giustina et al., Significant-Loophole-Free Test of Bell’s Theorem with Entangled Photons. Phys. Rev. Lett. 115 , 250401 (2015)

B. Hensen et al., Loophole-free Bell inequality violation using electron spins separated by 1.3 kilometres. Nature 526 , 682–686 (2015). https://doi.org/10.1038/nature15759

R.A. Holt, Atomic cascade experiments. Ph.D. dissertation, Harvard University (1973)

D. Kaiser, Tackling loopholes in experimental tests of Bell’s inequality, in The Oxford Handbook of the History of Quantum Interpretations . ed. by J.O. Freire et al. (Oxford University Press, Oxford, 2022), pp.339–370

F. Laloë, Do We Really Understand Quantum Mechanics? (Cambridge University Press, New York, 2012)

M.H. Li et al., Test of local realism into the past without detection and locality loopholes. Phys. Rev. Lett. 121 (8), 080404 (2018)

A. Messiah, Quantum Mechanics (North Holland, Amsterdam, 1961)

Z.Y. Ou, L. Mandel, Violation of Bells-inequality and classical probability in a 2-photon correlation experiment. Phys. Rev. Lett. 61 (1), 50–53 (1988)

A. Pais, “Subtle Is the Lord”: The Science and the Life of Albert Einstein (Oxford University Press, New York, 1982)

M. Paty, The recent attempts to verify quantum mechanics, in Quantum Mechanics, A Half Century Later . ed. by J.L. Lopes, M. Paty (Reidel, Dordrecht, 1977), pp.261–289

D. Rauch et al., Cosmic Bell test using random measurement settings from high-redshift quasars. Phys. Rev. Lett. 121 (8), 080403 (2018)

S. Redner, Citation statistics from 110 years of physical review. Phys. Today 58 (6), 49–54 (2005)

W. Rosenfeld et al., Event-ready Bell test using entangled atoms simultaneously closing detection and locality loopholes. Phys. Rev. Lett. 119 , 010402 (2017)

A. Schawlow, Concluding remarks, in Atomic Physics , vol. 8, ed. by I. Lindgren, A. Rosen, S. Svanbeg (Plenum, New York, 1983), pp.565–569

L.K. Shalm et al., Strong Loophole-free test of local realism. Phys. Rev. Lett. 115 , 250402 (2015)

Y.H. Shih, C.O. Alley, New type of Einstein-Podolsky-Rosen-Bohm experiment using pairs of light quanta produced by optical parametric down conversion. Phys. Rev. Lett. 61 (26), 2921–2924 (1988)

I. Silva, Chien-Shiung Wu’s contributions to experimental philosophy, in The Oxford Handbook of the History of Quantum Interpretations . ed. by J.O. Freire et al. (Oxford University Press, Oxford, 2022), pp.735–754

C.P. Silva Neto, Instrumentation and the foundations of quantum mechanics, in The Oxford Handbook of the History of Quantum Interpretations . ed. by O. Freire Junior et al. (Oxford University Press, Oxford, 2022), pp.587–613

G. Weihs et al., Violation of Bell’s inequality under strict Einstein locality conditions. Phys. Rev. Lett. 81 (23), 5039–5043 (1998)

The Project is funded by CNPq, Grant No. 305772/2013-9.

Author information

Authors and affiliations.

Instituto de Física, Universidade Federal da Bahia, Salvador, Brazil

Olival Freire Junior

Corresponding author

Correspondence to Olival Freire Junior .

Freire Junior, O. Alain Aspect’s experiments on Bell’s theorem: a turning point in the history of the research on the foundations of quantum mechanics. Eur. Phys. J. D 76 , 248 (2022). https://doi.org/10.1140/epjd/s10053-022-00542-z

Nobel Prize in Physics Is Awarded to 3 Scientists for Work Exploring Quantum Weirdness

Alain Aspect, John F. Clauser and Anton Zeilinger were recognized for their experiments in an area that has broad implications for secure information transfer and quantum computing.

By Isabella Kwai, Cora Engelbrecht and Dennis Overbye

Three physicists whose works each showed that nature is even weirder than Einstein had dared to imagine have been named winners of the 2022 Nobel Prize in Physics.

John Clauser, of J.F. Clauser and Associates in Walnut Creek, Calif.; Alain Aspect of the Institut d’Optique in Palaiseau, France; and Anton Zeilinger of the University of Vienna in Austria, will split a prize of 10 million Swedish kronor.

Their independent work explored the foundations of quantum mechanics, the paradoxical rules that govern behavior in the subatomic world. In experiments conducted over the last 50 years, they confirmed the reality of an effect that Albert Einstein had disdained as “spooky action at a distance.” Measuring one of a widely separated pair of particles could instantaneously change the results of measuring the other particle, even if it was light-years away. Today, physicists call this strange effect quantum entanglement, and it is the basis of the burgeoning field of quantum information. When the award winners were announced on Tuesday, Eva Olsson, a member of the Nobel Committee for Physics, noted that quantum information science had broad implications in areas like cryptography and quantum computing.

Quantum information science is a “vibrant and rapidly developing field,” she said. “Its predictions have opened doors to another world, and it has also shaken the very foundation of how we interpret measurements.”

As Daniel Kabat, a physics professor at Lehman College in New York, explained recently, “We’re used to thinking that information about an object — say that a glass is half full — is somehow contained within the object.” Instead, he says, entanglement means objects “only exist in relation to other objects, and moreover these relationships are encoded in a wave function that stands outside the tangible physical universe.”

In a conversation with the Nobel committee Tuesday morning, Dr. Aspect said he had been looking for a limit on quantum mechanics but had not found it.


Encyclopedia Britannica


Alain Aspect

Alain Aspect (born June 15, 1947, Agen, France) is a French physicist who was awarded the 2022 Nobel Prize for Physics for his experiments with quantum entanglement. He shared the prize with American physicist John F. Clauser and Austrian physicist Anton Zeilinger. What happens to one particle in an entangled pair determines what happens to the other, even if they really are too far apart to affect each other. The laureates’ development of experimental tools has laid the foundation for a new era of quantum technology.

Aspect received a bachelor’s degree from the École Normale Supérieure de Cachan (ENS Cachan), passing his civil service examination (agrégation) in physics in 1969. He attended the Université d’Orsay for both of his graduate degrees in physics, receiving a master’s degree in 1971 and a Ph.D. in 1983. Aspect performed his national service as a teacher in Cameroon from 1971 to 1974, before starting his Ph.D. and taking a lecturer position at the École Normale Supérieure de Cachan in 1974. For his graduate research, Aspect developed experiments to test Bell’s inequalities with respect to entangled photons. In 1985 he accepted a position at the Collège de France in Paris as a scientist in the atomic physics department, before becoming a senior scientist at the Laboratoire Charles Fabry de l’Institut d’Optique, in Palaiseau, near Paris, and later serving as a professor and head of the atomic optics group and as a professor at the École Polytechnique.

In quantum entanglement, two particles are in a single entangled state such that measuring a property of one particle instantly determines that same property in the other particle. For example, two particles are in a state where one is spin-up and the other is spin-down. Since the second particle must have the opposite value of the first particle, measuring the first particle results in a definite state for the second particle, notwithstanding the fact that the two particles may be millions of kilometres apart and are not interacting with each other at the time. In 1935, when Albert Einstein, Boris Podolsky, and Nathan Rosen devised this paradox, they thought that this conclusion was so obviously false that the quantum mechanical theory on which it was based must be incomplete. They concluded that a correct theory would contain some hidden variable feature that would restore the determinism of classical physics; that is, the particles must be in some definite spin state even before they are measured.
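
The canonical example behind this description (a standard textbook state, not something spelled out in the entry above) is the two-particle spin singlet,

    \[
      \lvert \psi \rangle \;=\; \tfrac{1}{\sqrt{2}}
      \bigl( \lvert \uparrow \rangle_{1} \lvert \downarrow \rangle_{2}
           - \lvert \downarrow \rangle_{1} \lvert \uparrow \rangle_{2} \bigr),
    \]

for which finding particle 1 spin-up along a given axis leaves particle 2 spin-down along that axis, and vice versa, regardless of the separation.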

In 1964 the Irish-born physicist John Stewart Bell devised mathematical relationships, Bell’s inequalities, that would be satisfied by a hidden variable theory in which measurement of one particle would not instantly determine the properties of the other particle. Clauser and American physicist Stuart Freedman made the first experimental tests of the Bell inequalities in 1972 and showed results in accordance with quantum mechanics and not hidden variable theory.
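
A commonly quoted form of Bell’s inequalities (written here as a general illustration; the original papers use somewhat different notation) is the CHSH inequality. For correlation coefficients E measured with detector settings a, a′ on one side and b, b′ on the other, any local hidden variable theory requires

    \[
      \bigl| E(a,b) - E(a,b') + E(a',b) + E(a',b') \bigr| \;\le\; 2,
    \]

whereas quantum mechanics predicts values as large as 2√2 ≈ 2.83 for suitably chosen settings.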

However, the Clauser-Freedman experiment did not test an assumption Bell made, which is that the measurement of a particle by one observer would not somehow affect the measurement of the other. For example, the Clauser-Freedman experiment used polarizers that were preset. What if the experimental setup somehow had selected only photons that did not behave in accordance with hidden variable theory?

Aspect and his collaborators in the early 1980s performed a series of experiments designed to answer such questions. The most spectacular experiment, conducted with Jean Dalibard and Gérard Roger, was a modified version of the Clauser-Freedman experiment. A pair of photons with opposite polarizations was emitted from a heated calcium source. Each of the photons traveled toward a polarizer. The polarizers were 12 metres (40 feet) apart. The time for light and thus any conceivable signal to travel between the two polarizers was 40 nanoseconds. The experiment had switches that would send the photons between two pairs of polarizers every 10 nanoseconds. Thus, each polarizer was independent of the other because no signal could travel between the two. Aspect, Dalibard, and Roger measured a quantity S that—had Bell’s inequalities held—would have been between −1 and 0. They measured S = 0.101 ± 0.02, which was not in accordance with hidden variable theory but was in accordance with the quantum mechanical value of 0.112.
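
Two of the numbers quoted above are easy to check. The sketch below is only an illustration (the S that Aspect, Dalibard, and Roger actually reported was a normalized coincidence-rate quantity bounded by −1 and 0, not the CHSH combination evaluated here): it verifies the 40-nanosecond light travel time and computes the ideal quantum value of the CHSH expression shown earlier.

    import numpy as np

    # Light travel time between polarizers 12 metres apart (the figure quoted above).
    c = 2.998e8  # speed of light in m/s
    print(f"12 m / c = {12.0 / c * 1e9:.0f} ns")  # about 40 ns, longer than the 10 ns switching time

    # Ideal quantum correlation for polarization-entangled photons: E(a, b) = cos 2(a - b).
    def E(a, b):
        return np.cos(2 * (a - b))

    # CHSH combination at the standard angle choices (radians) that maximize the quantum value.
    a1, a2 = 0.0, np.pi / 4
    b1, b2 = np.pi / 8, 3 * np.pi / 8
    S_chsh = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(f"Quantum CHSH value = {S_chsh:.3f}")  # 2.828, i.e. 2*sqrt(2), above the classical bound of 2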

Aspect is a member of several national science academies, including those of Austria, Belgium, France, the United Kingdom, and the United States. His numerous awards include the French National Centre for Scientific Research (CNRS) gold medal (2005), the Wolf Prize (2010, shared with Zeilinger and Clauser), the Niels Bohr International Gold Medal and the UNESCO Niels Bohr Gold Medal (both in 2013), and the Balzan Prize for Quantum Information Processing and Communication (2013). He wrote Introduction to Quantum Optics: From the Semi-classical Approach to Quantized Light (2010; with Gilbert Grynberg and Claude Fabre) and Einstein et les révolutions quantiques (2019).


Title: Alain Aspect’s experiments on Bell’s theorem: a turning point in the history of the research on the foundations of quantum mechanics

Abstract: Alain Aspect's three experiments on Bell's theorem, published in the early 1980s, were a turning point in the history of the research on the foundations of quantum mechanics not only because they corroborated entanglement as the distinctive quantum signature but also because these experiments brought wider recognition to this field of research and Aspect himself. These experiments may be considered the most direct precursors of the research on quantum information, which would blossom a decade later.
Comments: 22 pages, Paper to be published in the European Physical Journal D topical issue titled Quantum Optics of Light and Matter: Honouring Alain Aspect
Subjects: History and Philosophy of Physics (physics.hist-ph); Quantum Physics (quant-ph)



Alain Aspect, John Clauser and Anton Zeilinger win the 2022 Nobel Prize for Physics

Alain Aspect, John Clauser and Anton Zeilinger have won the 2022 Nobel Prize for Physics. The trio won “for their experiments with entangled photons, establishing the violation of Bell’s inequalities and pioneering quantum information science”.

The prize will be presented in Stockholm in December and is worth 10 million kronor ($900,000). It will be shared equally between the winners.

Working independently, the three laureates did key experiments that established the quantum property of entanglement. This is a curious effect whereby two or more particles display much stronger correlations than are possible in classical physics. Entanglement plays an important role in quantum computers, which in principle could outperform conventional computers at some tasks.

Bell’s inequality

All three of the experiments measured violations of Bell’s inequality, which places a limit on the correlations that can be observed in a classical system. Such violations are an important prediction of quantum theory.

The first experiment was done in 1972 at the University of California at Berkeley by Clauser, who measured the correlations between the polarizations of pairs of photons that were created in an atomic transition. He showed that Bell’s inequality was violated – which meant that the photon pairs were entangled.

However, there were several shortcomings or “loopholes” in this experiment, making it inconclusive. It is possible, for example, that the photons detected were not a fair sample of all photons emitted by the source – which is the detection loophole.  It is also possible that some aspects of the experiment that are thought to be independent were somehow causally connected – which is the locality loophole.

Ten years later, in 1982, Aspect and colleagues at the Université Paris-Sud in Orsay, France, improved on Clauser’s experiment by using a two-channel detection scheme. This avoided making assumptions about the photons that were detected. They also varied the orientation of the polarizing filters during their measurements. Again, they found that Bell’s inequality was violated.

Closing the locality loophole

The locality loophole was closed in 1998 by Zeilinger and colleagues at the University of Innsbruck in Austria. They used two fully independent quantum random-number generators to set the directions of the photon measurements. As a result, the direction along which the polarization of each photon was measured was decided at the last instant, such that no signal travelling slower than the speed of light would be able to transfer information to the other side before that photon was registered.

As well as confirming a fundamental prediction of quantum mechanics, the three experiments laid the groundwork for the development of modern quantum technologies.

Speaking at the press conference when the prize was announced, Zeilinger said he was “very surprised” to receive a call from the Nobel committee. “This prize is an encouragement to young people and the prize would not be possible without more than 100 young people who have worked with me over the years. I alone could not have achieved this.”

Zeilinger also said he hoped the prize would encourage young researchers.

“My advice to young people would be do what you find interesting and don’t care too much about possible applications. On the other hand, this recognition is very important for the future development of possible applications. I am curious what we will see in the next 10–20 years.”

A profound impact

Sheila Rowan, president of the Institute of Physics, which publishes Physics World, congratulated the trio on their “well-deserved” recognition. “This is an area of physics with ongoing, profound impact, at a fundamental level to help understand the world around us and being explored for use in highly novel technologies for sensing and communication today,” she added.

Quantum physicist Artur Ekert from the University of Oxford says that while he is “happy” to see the field and the trio being recognized with this year’s Nobel, it is a “pity” that John Bell, who formulated the inequalities, missed out, given that he died in 1990 and Nobel prizes are not awarded posthumously.

Ekert adds that the advent of quantum cryptography has provided an additional motivation to push the Bell inequality experiments to their limits. “From the foundations of science perspective, I think the Bell inequality experiments simply had to be done — they refute a certain world view and so they are important,” adds Ekert. “Fixing all the loopholes in such experiments is another story. This is probably more important for the quantum cryptography perspective as if we want to use Bell inequalities to detect eavesdropping we have to close the loopholes.”

Indeed, congratulations also came from those who are trying to use the work of Aspect, Clauser and Zeilinger for practical applications. In a joint statement, Ilyas Khan and Tony Uttley, chief executive and president, respectively, of the quantum technology firm Quantinuum, noted they were “thrilled” by the announcement.

“This recognition of the power of quantum information systems is timely on many counts, but above all is a wonderful acknowledgement of the fact that experimental advances underpin the quantum technologies revolution that we are embarking upon.”

A life in science

Aspect was born in Agen, France, on 15 June 1947. He passed the “agrégation” – the national French exam – in physics in 1969 and received his Master’s degree from the Université d’Orsay two years later. He then embarked on a PhD at Orsay, working on experimental tests of Bell’s inequalities, which he completed in 1983.

Following a lectureship at the École Normale Supérieure de Cachan, which Aspect held while doing his PhD, in 1985 he moved to the Collège de France in Paris. In 1992 he then moved to the Laboratoire Charles Fabry de l’Institut d’Optique at the Université Paris-Saclay.

Clauser was born in Pasadena, California, on 1 December 1942. He received his Bachelor’s degree in physics from California Institute of Technology in 1964 and a Master’s in physics two years later. In 1969 he received a PhD in physics from Columbia University.

From 1969 to 1975 Clauser was a researcher at Lawrence Berkeley National Laboratory and from 1975 to 1986 worked at the Lawrence Livermore National Laboratory. Following a stint as a senior scientist at the US firm Science Applications International Corporation, in 1990 he moved to the University of California, Berkeley, where he stayed until 1997 before focusing on his research and consultancy firm J. F. Clauser & Associates.


Anton Zeilinger: a quantum pioneer

Zeilinger was born in Ried im Innkreis, Austria, on 20 May 1945. In 1963 he began studying physics and mathematics at the University of Vienna and in 1971 completed his PhD in atomic physics. He then worked at the Atomic Institute in Vienna until 1983 before heading to the Vienna University of Technology.

In 1990 Zeilinger moved to the University of Innsbruck and in 1999 to the University of Vienna, where he also served as director of the Vienna-based Institute for Quantum Optics and Quantum Information from 2004 to 2013. In 2013 he became president of the Austrian Academy of Sciences, a position he held until this year.

  • Anton Zeilinger has written several articles for Physics World: “A quantum renaissance” by Markus Aspelmeyer and Anton Zeilinger; “Probing the limits of the quantum world” by Markus Arndt, Anton Zeilinger and Klaus Hornberger; and “Fundamentals of quantum information” by Anton Zeilinger.



SciTechDaily


First Experimental Proof That Quantum Entanglement Is Real

By California Institute of Technology, October 9, 2022

Quantum Entanglement Illustration

Scientists, including Albert Einstein and Erwin Schrödinger, first discovered the phenomenon of entanglement in the 1930s. In 1972, John Clauser and Stuart Freedman were the first to prove experimentally that two widely separated particles can be entangled.

A Q&A with Caltech alumnus John Clauser on his first experimental proof of quantum entanglement.

When scientists, including Albert Einstein and Erwin Schrödinger, first discovered the phenomenon of entanglement in the 1930s, they were perplexed. Disturbingly, entanglement required two separated particles to remain connected without being in direct contact. In fact, Einstein famously called entanglement “spooky action at a distance,” because the particles seemed to be communicating faster than the speed of light.

Born on December 1, 1942, John Francis Clauser is an American theoretical and experimental physicist known for contributions to the foundations of quantum mechanics, in particular the Clauser–Horne–Shimony–Holt inequality. Clauser was awarded the 2022 Nobel Prize in Physics, jointly with Alain Aspect and Anton Zeilinger, “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.”

To explain the bizarre implications of entanglement, Einstein, along with Boris Podolsky and Nathan Rosen (EPR), argued that “hidden variables” should be added to quantum mechanics. These could be used to explain entanglement, and to restore “locality” and “causality” to the behavior of the particles. Locality states that objects are only influenced by their immediate surroundings. Causality states that an effect cannot occur before its cause, and that causal signaling cannot propagate faster than light speed. Niels Bohr famously disputed EPR’s argument, while Schrödinger and Wendell Furry, in response to EPR, independently hypothesized that entanglement vanishes with wide particle separation.

Unfortunately, at the time, no experimental evidence for or against quantum entanglement of widely separated particles was available. Experiments have since proven that entanglement is very real and fundamental to nature. Furthermore, quantum mechanics has now been proven to work, not only at very short distances but also at very great distances. Indeed, China’s quantum-encrypted communications satellite, Micius (part of the Quantum Experiments at Space Scale (QUESS) research project), relies on quantum entanglement between photons that are separated by thousands of kilometers.

John Clauser Second Quantum Entanglement Experiment

John Clauser standing with his second quantum entanglement experiment at UC Berkeley in 1976. Credit: University of California Graphic Arts / Lawrence Berkeley Laboratory

The very first of these experiments was proposed and executed by Caltech alumnus John Clauser (BS ’64) in 1969 and 1972, respectively. His findings are based on Bell’s theorem, devised by CERN theorist John Bell. In 1964, Bell ironically proved that EPR’s argument actually led to the opposite conclusion from what EPR had originally intended to show. Bell demonstrated that quantum entanglement is, in fact, incompatible with EPR’s notion of locality and causality.

In 1969, while still a graduate student at Columbia University, Clauser, along with Michael Horne, Abner Shimony, and Richard Holt, transformed Bell’s 1964 mathematical theorem into a very specific experimental prediction via what is now called the Clauser–Horne–Shimony–Holt (CHSH) inequality. (Their paper has been cited more than 8,500 times on Google Scholar.) In 1972, when he was a postdoctoral researcher at the University of California, Berkeley, and Lawrence Berkeley National Laboratory, Clauser and graduate student Stuart Freedman were the first to prove experimentally that two widely separated particles (about 10 feet apart) can be entangled.

Clauser went on to perform three more experiments testing the foundations of quantum mechanics and entanglement, with each new experiment confirming and extending his results. The Freedman–Clauser experiment was the first test of the CHSH inequality. It has now been tested experimentally hundreds of times at laboratories around the world to confirm that quantum entanglement is real.

Clauser’s work earned him the 2010 Wolf Prize in physics. He shared it with Alain Aspect of the Institut d’Optique and École Polytechnique and Anton Zeilinger of the University of Vienna and the Austrian Academy of Sciences “for an increasingly sophisticated series of tests of Bell’s inequalities, or extensions thereof, using entangled quantum states,” according to the award citation.

John Clauser Yacht Club

John Clauser at a yacht club. Clauser enjoys sailboat racing in his spare time. Credit: John Dukat

Here, John Clauser answers questions about his historic experiments.

We hear that your idea of testing the principles of entanglement was unappealing to other physicists. Can you tell us more about that?

In the 1960s and 70s, experimental testing of quantum mechanics was unpopular at Caltech, Columbia, UC Berkeley, and elsewhere. My faculty at Columbia told me that testing quantum physics was going to destroy my career. While I was performing the 1972 Freedman–Clauser experiment at UC Berkeley, Caltech’s Richard Feynman was highly offended by my impertinent effort and told me that it was tantamount to professing a disbelief in quantum physics. He arrogantly insisted that quantum mechanics is obviously correct and needs no further testing! My reception at UC Berkeley was lukewarm at best and was only possible through the kindness and tolerance of Professors Charlie Townes [PhD ’39, Nobel Laureate ’64] and Howard Shugart [BS ’53], who allowed me to continue my experiments there.

In my correspondence with John Bell, he expressed exactly the opposite sentiment and strongly encouraged me to do an experiment. John Bell’s 1964 seminal work on Bell’s theorem was originally published in the terminal issue of an obscure journal, Physics, and in an underground physics newspaper, Epistemological Letters. It was not until after the 1969 CHSH paper and the 1972 Freedman–Clauser results were published in Physical Review Letters that John Bell finally openly discussed his work. He was aware of the taboo on questioning quantum mechanics’ foundations and had never discussed it with his CERN co-workers.

What made you want to carry through with the experiments anyway?

Part of the reason that I wanted to test the ideas was because I was still trying to understand them. I found the predictions for entanglement to be sufficiently bizarre that I could not accept them without seeing experimental proof. I also recognized the fundamental importance of the experiments and simply ignored the career advice of my faculty. Moreover, I was having a lot of fun doing some very challenging experimental physics with apparatuses that I built mostly using leftover physics department scrap. Before Stu Freedman and I did the first experiment, I also personally thought that Einstein’s hidden-variable physics might actually be right, and if it is, then I wanted to discover it. I found Einstein’s ideas to be very clear. I found Bohr’s rather muddy and difficult to understand.

What did you expect to find when you did the experiments?

In truth, I really didn’t know what to expect except that I would finally determine who was right—Bohr or Einstein. I admittedly was betting in favor of Einstein but did not actually know who was going to win. It’s like going to the racetrack. You might hope that a certain horse will win, but you don’t really know until the results are in. In this case, it turned out that Einstein was wrong. In the tradition of Caltech’s Richard Feynman and Kip Thorne [BS ’62], who would place scientific bets, I had a bet with quantum physicist Yakir Aharonov on the outcome of the Freedman–Clauser experiment. Curiously, he put up only one dollar to my two. I lost the bet and enclosed a two-dollar bill and congratulations when I mailed him a preprint with our results.

I was very sad to see that my own experiment had proven Einstein wrong. But the experiment gave a 6.3-sigma result against him [a five-sigma result or higher is considered the gold standard for significance in physics]. But then Dick Holt and Frank Pipkin’s competing experiment at Harvard (never published) got the opposite result. I wondered if perhaps I had overlooked some important detail. I went on alone at UC Berkeley to perform three more experimental tests of quantum mechanics. All yielded the same conclusions. Bohr was right, and Einstein was wrong. The Harvard result did not repeat and was faulty. When I reconnected with my Columbia faculty, they all said, “We told you so! Now stop wasting money and go do some real physics.” At that point in my career, the only value in my work was that it demonstrated that I was a reasonably talented experimental physicist. That fact alone got me a job at Lawrence Livermore National Lab doing controlled-fusion plasma physics research.

Can you help us understand exactly what your experiments showed?

In order to clarify what the experiments showed, Mike Horne and I formulated what is now known as Clauser–Horne Local Realism [1974]. Additional contributions to it were subsequently offered by John Bell and Abner Shimony, so perhaps it is more properly called Bell–Clauser–Horne–Shimony Local Realism. Local Realism was very short-lived as a viable theory. Indeed, it was experimentally refuted even before it was fully formulated. Nonetheless, Local Realism is heuristically important because it shows in detail what quantum mechanics is not.

Local Realism assumes that nature consists of stuff, of objectively real objects, i.e., stuff you can put inside a box. (A box here is an imaginary closed surface defining separated inside and outside volumes.) It further assumes that objects exist whether or not we observe them. Similarly, definite experimental results are assumed to obtain, whether or not we look at them. We may not know what the stuff is, but we assume that it exists and that it is distributed throughout space. Stuff may evolve either deterministically or stochastically. Local Realism assumes that the stuff within a box has intrinsic properties, and that when someone performs an experiment within the box, the probability of any result that obtains is somehow influenced by the properties of the stuff within that box. If one performs, say, a different experiment with different experimental parameters, then presumably a different result obtains. Now suppose one has two widely separated boxes, each containing stuff. Local Realism further assumes that the experimental parameter choice made in one box cannot affect the experimental outcome in the distant box. Local Realism thereby prohibits spooky action-at-a-distance. It enforces Einstein’s causality that prohibits any such nonlocal cause and effect. Surprisingly, those simple and very reasonable assumptions are sufficient on their own to allow derivation of a second important experimental prediction limiting the correlation between experimental results obtained in the separated boxes. That prediction is the 1974 Clauser–Horne (CH) inequality.
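
In terms of detection probabilities, the inequality Clauser refers to is usually written as follows (a standard rendering given here for orientation; notation varies between papers). With p(a,b) the probability of a joint detection at settings a and b, and p(a′), p(b) the corresponding single-detector probabilities, local realism requires

    \[
      -1 \;\le\; p(a,b) - p(a,b') + p(a',b) + p(a',b') - p(a') - p(b) \;\le\; 0,
    \]

and quantum mechanics predicts violations of the upper bound for suitable entangled states and analyzer settings.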

The 1969 CHSH inequality’s derivation had required several minor supplementary assumptions, sometimes called “loopholes.” The CH inequality’s derivation eliminates those supplementary assumptions and is thus more general. Quantum entangled systems exist that disagree with the CH prediction, whereby Local Realism is amenable to experimental disproof. The CHSH and CH inequalities are both violated, not only by the first 1972 Freedman–Clauser experiment and my second 1976 experiment but now by literally hundreds of confirming independent experiments. Various labs have now entangled and violated the CHSH inequality with photon pairs, beryllium ion pairs, ytterbium ion pairs, rubidium atom pairs, whole rubidium-atom cloud pairs, nitrogen vacancies in diamonds, and Josephson phase qubits.

Testing Local Realism and the CH inequality was considered by many researchers to be important to eliminate the CHSH loopholes. Considerable effort was thus marshaled, as quantum optics technology improved and permitted. Testing the CH inequality had become a holy grail challenge for experimentalists. Violation of the CH inequality was finally achieved first in 2013 and again in 2015 at two competing laboratories: Anton Zeilinger’s group at the University of Vienna, and Paul Kwiat’s group at the University of Illinois at Urbana–Champaign. The 2015 experiments involved 56 researchers! Local Realism is now soundly refuted! The agreement between the experiments and quantum mechanics now firmly proves that nonlocal quantum entanglement is real.

What are some of the important technological applications of your work?

One application of my work is to the simplest possible object defined by Local Realism—a single bit of information. Local Realism shows that a single quantum mechanical bit of information, a “qubit,” cannot always be localized in a space-time box. This fact provides the fundamental basis of quantum information theory and quantum cryptography. Caltech’s quantum science and technology program, the 2019 $1.28-billion U.S. National Quantum Initiative, and the 2019 $400 million Israeli National Quantum Initiative all rely on the reality of entanglement. The Chinese Micius quantum-encrypted communications satellite system’s configuration is almost identical to that of the Freedman–Clauser experiment. It uses the CHSH inequality to verify entanglement’s persistence through outer space.

Can you tell us more about your family’s strong connection with Caltech?

My dad, Francis H. Clauser [BS ’34, MS ’35, PhD ’37, Distinguished Alumni Award ’66], and his brother Milton U. Clauser [BS ’34, MS ’35, PhD ’37] were PhD students at Caltech under Theodore von Kármán. Francis Clauser was Clark Blanchard Millikan Professor of Engineering at Caltech (Distinguished Faculty Award ’80) and chair of Caltech’s Division of Engineering and Applied Science. Milton U. Clauser’s son, Milton J. Clauser [PhD ’66], and grandson, Karl Clauser [BS ’86], both went to Caltech. My mom, Catharine McMillan Clauser, was Caltech’s humanities librarian, where she met my dad. Her brother, Edwin McMillan [BS ’28, MS ’29], is a Caltech alum and ’51 Nobel Laureate. The family now maintains Caltech’s “Milton and Francis Doctoral Prize” awarded at Caltech commencements.


October 6, 2022

The Universe Is Not Locally Real, and the Physics Nobel Prize Winners Proved It

Elegant experiments with entangled light have laid bare a profound mystery at the heart of reality

By Daniel Garisto

An apple half red and half green.

Athul Satheesh/500px/Getty Images

One of the more unsettling discoveries in the past half a century is that the universe is not locally real. In this context, “real” means that objects have definite properties independent of observation—an apple can be red even when no one is looking. “Local” means that objects can be influenced only by their surroundings and that any influence cannot travel faster than light. Investigations at the frontiers of quantum physics have found that these things cannot both be true. Instead the evidence shows that objects are not influenced solely by their surroundings, and they may also lack definite properties prior to measurement.

This is, of course, deeply contrary to our everyday experiences. As Albert Einstein once bemoaned to a friend, “Do you really believe the moon is not there when you are not looking at it?” To adapt a phrase from author Douglas Adams, the demise of local realism has made a lot of people very angry and has been widely regarded as a bad move.

Blame for this achievement has been laid squarely on the shoulders of three physicists: John Clauser, Alain Aspect and Anton Zeilinger. They equally split the 2022 Nobel Prize in Physics “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science.” (“Bell inequalities” refers to the trailblazing work of physicist John Stewart Bell of Northern Ireland, who laid the foundations for the 2022 Physics Nobel in the early 1960s.) Colleagues agreed that the trio had it coming, deserving this reckoning for overthrowing reality as we know it. “It was long overdue,” says Sandu Popescu, a quantum physicist at the University of Bristol in England. “Without any doubt, the prize is well deserved.”


“The experiments beginning with the earliest one of Clauser and continuing along show that this stuff isn’t just philosophical, it’s real—and like other real things, potentially useful,” says Charles Bennett, an eminent quantum researcher at IBM. “Each year I thought, ‘Oh, maybe this is the year,’” says David Kaiser, a physicist and historian at the Massachusetts Institute of Technology. “[In 2022] it really was. It was very emotional—and very thrilling.”

The journey from fringe to favor was a long one. From about 1940 until as late as 1990, studies of so-called quantum foundations were often treated as philosophy at best and crackpottery at worst. Many scientific journals refused to publish papers on the topic, and academic positions indulging such investigations were nearly impossible to come by. In 1985 Popescu’s adviser warned him against a Ph.D. in the subject. “He said, ‘Look, if you do that, you will have fun for five years, and then you will be jobless,’” Popescu says.

Today quantum information science is among the most vibrant subfields in all of physics. It links Einstein’s general theory of relativity with quantum mechanics via the still mysterious behavior of black holes. It dictates the design and function of quantum sensors, which are increasingly being used to study everything from earthquakes to dark matter. And it clarifies the often confusing nature of quantum entanglement, a phenomenon that is pivotal to modern materials science and that lies at the heart of quantum computing. “What even makes a quantum computer ‘quantum?’” Nicole Yunger Halpern, a physicist at the National Institute of Standards and Technology, asks rhetorically. “One of the most popular answers is entanglement, and the main reason why we understand entanglement is the grand work participated in by Bell and these Nobel Prize winners. Without that understanding of entanglement, we probably wouldn’t be able to realize quantum computers.”

Image of a man standing in front of a chalkboard with mathematical equations

Work by John Stewart Bell in the 1960s sparked a quiet revolution in quantum physics.

Peter Menzel/Science Source

FOR WHOM THE BELL TOLLS

The trouble with quantum mechanics was never that it made the wrong predictions—in fact, the theory described the microscopic world splendidly right from the start when physicists devised it in the opening decades of the 20th century. What Einstein, Boris Podolsky and Nathan Rosen took issue with, as they explained in their iconic 1935 paper, was the theory’s uncomfortable implications for reality. Their analysis, known by their initials EPR, centered on a thought experiment meant to illustrate the absurdity of quantum mechanics. The goal was to show how under certain conditions the theory can break—or at least deliver nonsensical results that conflict with our deepest assumptions about reality.

A simplified and modernized version of EPR goes something like this: Pairs of particles are sent off in different directions from a common source, targeted for two observers, Alice and Bob, each stationed at opposite ends of the solar system. Quantum mechanics dictates that it is impossible to know the spin, a quantum property of individual particles, prior to measurement. Once Alice measures one of her particles, she finds its spin to be either “up” or “down.” Her results are random, and yet when she measures up, she instantly knows that Bob’s corresponding particle—which had a random, indefinite spin—must now be down. At first glance, this is not so odd. Maybe the particles are like a pair of socks—if Alice gets the right sock, Bob must have the left.

But under quantum mechanics, particles are not like socks, and only when measured do they settle on a spin of up or down. This is EPR’s key conundrum: If Alice’s particles lack a spin until measurement, then how (as they whiz past Neptune) do they know what Bob’s particles will do as they fly out of the solar system in the other direction? Each time Alice measures, she quizzes her particle on what Bob will get if he flips a coin: up or down? The odds of correctly predicting this even 200 times in a row are one in 10^60—a number greater than all the atoms in the solar system. Yet despite the billions of kilometers that separate the particle pairs, quantum mechanics says Alice’s particles can keep correctly predicting, as though they were telepathically connected to Bob’s particles.
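
For the arithmetic behind that figure: 200 independent fifty-fifty guesses all come out right with probability

    \[
      2^{-200} \;\approx\; 6 \times 10^{-61} \;\approx\; \frac{1}{1.6 \times 10^{60}}.
    \]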

Designed to reveal the incompleteness of quantum mechanics, EPR eventually led to experimental results that instead reinforce the theory’s most mind-boggling tenets. Under quantum mechanics, nature is not locally real: particles may lack properties such as spin up or spin down prior to measurement, and they seem to talk to one another no matter the distance. (Because the outcomes of measurements are random, these correlations cannot be used for faster-than-light communication.)

Physicists skeptical of quantum mechanics proposed that this puzzle could be explained by hidden variables, factors that existed in some imperceptible level of reality, under the subatomic realm, that contained information about a particle’s future state. They hoped that in hidden variable theories, nature could recover the local realism denied it by quantum mechanics. “One would have thought that the arguments of Einstein, Podolsky and Rosen would produce a revolution at that moment, and everybody would have started working on hidden variables,” Popescu says.

Einstein’s “attack” on quantum mechanics, however, did not catch on among physicists, who by and large accepted quantum mechanics as is. This was less a thoughtful embrace of nonlocal reality than a desire not to think too hard—a head-in-the-sand sentiment later summarized by American physicist N. David Mermin as a demand to “shut up and calculate.” The lack of interest was driven in part because John von Neumann, a highly regarded scientist, had in 1932 published a mathematical proof ruling out hidden variable theories. Von Neumann’s proof, it must be said, was refuted just three years later by a young female mathematician, Grete Hermann, but at the time no one seemed to notice.

The problem of nonlocal realism would languish for another three decades before being shattered by Bell. From the start of his career, Bell was bothered by quantum orthodoxy and sympathetic toward hidden variable theories. Inspiration struck him in 1952, when he learned that American physicist David Bohm had formulated a viable nonlocal hidden variable interpretation of quantum mechanics—something von Neumann had claimed was impossible.

Bell mulled the ideas for years, as a side project to his job working as a particle physicist at CERN near Geneva. In 1964 he rediscovered the same flaws in von Neumann’s argument that Hermann had. And then, in a triumph of rigorous thinking, Bell concocted a theorem that dragged the question of local hidden variables from its metaphysical quagmire onto the concrete ground of experiment.

Typically local hidden variable theories and quantum mechanics predict indistinguishable experimental outcomes. What Bell realized is that under precise circumstances, an empirical discrepancy between the two can emerge. In the eponymous Bell test (an evolution of the EPR thought experiment), Alice and Bob receive the same paired particles, but now they each have two different detector settings—A and a, B and b. These detector settings are an additional trick to throw off Alice and Bob’s apparent telepathy. In local hidden variable theories, one particle cannot know which question the other is asked. Their correlation is secretly set ahead of time and is not sensitive to updated detector settings. But according to quantum mechanics, when Alice and Bob use the same settings (both uppercase or both lowercase), each particle is aware of the question the other is posed, and the two will correlate perfectly—in sync in a way no local theory can account for. They are, in a word, entangled.

Measuring the correlation multiple times for many particle pairs, therefore, could prove which theory was correct. If the correlation remained below a limit derived from Bell’s theorem, this would suggest hidden variables were real; if it exceeded Bell’s limit, then the mind-boggling tenets of quantum mechanics would reign supreme. And yet, in spite of its potential to help determine the nature of reality, Bell’s theorem languished unnoticed in a relatively obscure journal for years.
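For readers who want to see the numbers, here is a minimal sketch of the standard CHSH form of Bell's inequality (the general textbook calculation, not the specific settings of any experiment described here): for a spin-singlet pair, quantum mechanics predicts a correlation of minus the cosine of the angle between the two detector settings, and a particular combination of the four setting pairs reaches 2√2, above the limit of 2 that any local hidden variable theory can produce.

```python
import numpy as np

def E(angle_alice, angle_bob):
    """Quantum prediction for the correlation of a spin-singlet pair
    measured along two detector directions (textbook result)."""
    return -np.cos(angle_alice - angle_bob)

# Detector settings (radians): Alice chooses A or a, Bob chooses B or b.
A, a = 0.0, np.pi / 2
B, b = np.pi / 4, 3 * np.pi / 4

S = abs(E(A, B) - E(A, b) + E(a, B) + E(a, b))
print(f"CHSH value S = {S:.3f}")  # about 2.828; local hidden variables cap S at 2
```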

THE BELL TOLLS FOR THEE

In 1967 a graduate student at Columbia University named John Clauser stumbled across a library copy of Bell’s paper and became enthralled by the possibility of proving hidden variable theories correct. When Clauser wrote to Bell two years later, asking if anyone had performed the test, it was among the first feedback Bell had received.

Three years after that, with Bell’s encouragement, Clauser and his graduate student Stuart Freedman performed the first Bell test. Clauser had secured permission from his supervisors but little in the way of funds, so he became, as he said in a later interview, adept at “dumpster diving” to obtain equipment—some of which he and Freedman then duct-taped together. In Clauser’s setup—a kayak-size apparatus requiring careful tuning by hand—pairs of photons were sent in opposite directions toward detectors that could measure their state, or polarization.

Unfortunately for Clauser and his infatuation with hidden variables, once he and Freedman completed their analysis, they had to conclude that they had found strong evidence against them. Still, the result was hardly conclusive because of various “loopholes” in the experiment that conceivably could allow the influence of hidden variables to slip through undetected. The most concerning of these was the locality loophole: if either the photon source or the detectors could have somehow shared information (which was plausible within an object the size of a kayak), the resulting measured correlations could still emerge from hidden variables. As M.I.T.’s Kaiser explained, if Alice tweets at Bob to tell him her detector setting, that interference makes ruling out hidden variables impossible.

Closing the locality loophole is easier said than done. The detector setting must be quickly changed while photons are on the fly—“quickly” meaning in a matter of mere nanoseconds. In 1976 a young French expert in optics, Alain Aspect, proposed a way to carry out this ultraspeedy switch. His group’s experimental results, published in 1982, only bolstered Clauser’s results: local hidden variables looked extremely unlikely. “Perhaps Nature is not so queer as quantum mechanics,” Bell wrote in response to Aspect’s test. “But the experimental situation is not very encouraging from this point of view.”

Other loopholes remained, however, and Bell died in 1990 without witnessing their closure. Even Aspect’s experiment had not fully ruled out local effects, because it took place over too small a distance. Similarly, as Clauser and others had realized, if Alice and Bob detected an unrepresentative sample of particles—like a survey that contacted only right-handed people—their experiments could reach the wrong conclusions.

No one pounced to close these loopholes with more gusto than Anton Zeilinger, an ambitious, gregarious Austrian physicist. In 1997 he and his team improved on Aspect’s earlier work by conducting a Bell test over a then unprecedented distance of nearly half a kilometer. The era of divining reality’s nonlocality from kayak-size experiments had drawn to a close. Finally, in 2013, Zeilinger’s group took the next logical step, tackling multiple loopholes at the same time.

“Before quantum mechanics, I actually was interested in engineering. I like building things with my hands,” says Marissa Giustina, a quantum researcher at Google who worked with Zeilinger. “In retrospect, a loophole-free Bell experiment is a giant systems-engineering project.” One requirement for creating an experiment closing multiple loopholes was finding a perfectly straight, unoccupied 60-meter tunnel with access to fiber-optic cables. As it turned out, the dungeon of Vienna’s Hofburg palace was an almost ideal setting—aside from being caked with a century’s worth of dust. Their results, published in 2015, coincided with similar tests from two other groups that also found quantum mechanics as flawless as ever.

BELL’S TEST REACHES THE STARS

One great final loophole remained to be closed—or at least narrowed. Any prior physical connection between components, no matter how distant in the past, has the potential to interfere with the validity of a Bell test’s results. If Alice shakes Bob’s hand prior to departing on a spaceship, they share a past. It seems implausible that a local hidden variable theory would exploit these kinds of loopholes, but it was still possible.


In 2016 a team that included Kaiser and Zeilinger performed a cosmic Bell test. Using telescopes in the Canary Islands, the researchers sourced random decisions for detector settings from stars sufficiently far apart in the sky that light from one would not reach the other for hundreds of years, ensuring a centuries-spanning gap in their shared cosmic past. Yet even then, quantum mechanics again proved triumphant.

One of the principal difficulties in explaining the importance of Bell tests to the public—as well as to skeptical physicists—is the perception that the veracity of quantum mechanics was a foregone conclusion. After all, researchers have measured many key aspects of quantum mechanics to a precision of greater than 10 parts in a billion. “I actually didn’t want to work on it,” Giustina says. “I thought, like, ‘Come on, this is old physics. We all know what’s going to happen.’” But the accuracy of quantum mechanics could not rule out the possibility of local hidden variables; only Bell tests could do that.

“What drew each of these Nobel recipients to the topic, and what drew John Bell himself to the topic, was indeed [the question], ‘Can the world work that way?’” Kaiser says. “And how do we really know with confidence?” What Bell tests allow physicists to do is remove the bias of anthropocentric aesthetic judgments from the equation. They purge from their work the parts of human cognition that recoil at the possibility of eerily inexplicable entanglement or that scoff at hidden variable theories as just more debates over how many angels may dance on the head of a pin.

The 2022 award honors Clauser, Aspect and Zeilinger, but it is testament to all the researchers who were unsatisfied with superficial explanations about quantum mechanics and who asked their questions even when doing so was unpopular. “Bell tests,” Giustina concludes, “are a very useful way of looking at reality.”

Daniel Garisto is a freelance science journalist covering advances in physics and other natural sciences. He is based in New York.

Scientific American Magazine Vol 328 Issue 1


Physicists Observe ‘Unobservable’ Quantum Phase Transition

September 11, 2023

Measurement is the nemesis of entanglement. As entanglement spreads through a grid of quantum particles — as shown in this simulation — what if you measured some of the particles here and there? Which phenomenon would triumph?

Paul Chaikin/ Quanta Magazine

Introduction

In 1935, Albert Einstein and Erwin Schrödinger, two of the most prominent physicists of the day, got into a dispute over the nature of reality.

Einstein had done the math and knew that the universe must be local, meaning that no event in one location could instantly affect a distant location. But Schrödinger had done his own math, and he knew that at the heart of quantum mechanics lay a strange connection he dubbed “entanglement,” which appeared to strike at Einstein’s commonsense assumption of locality.

When two particles become entangled, which can happen when they collide, their fates become linked. Measure the orientation of one particle, for instance, and you may learn that its entangled partner (if and when it is measured) points in the opposite direction, no matter its location. Thus, a measurement in Beijing could appear to instantly affect an experiment in Brooklyn, apparently violating Einstein’s edict that no influence can travel faster than light.

Einstein disliked the reach of entanglement (which he would later refer to as “spooky”) and criticized the then-nascent theory of quantum mechanics as necessarily incomplete. Schrödinger in turn defended the theory, which he had helped pioneer. But he sympathized with Einstein’s distaste for entanglement. He conceded that the way it seemingly allowed one experimenter to “steer” an otherwise inaccessible experiment was “rather discomforting.”

Physicists have since largely shed that discomfort. They now understand what Einstein, and perhaps Schrödinger himself, had overlooked — that entanglement has no remote influence. It has no power to bring about a specific outcome at a distance; it can distribute only the knowledge of that outcome. Entanglement experiments, such as those that won the 2022 Nobel Prize, have now grown routine.

Over the last few years, a flurry of theoretical and experimental research has uncovered a strange new face of the phenomenon — one that shows itself not in pairs, but in constellations of particles. Entanglement naturally spreads through a group of particles, establishing an intricate web of contingencies. But if you measure the particles frequently enough, destroying entanglement in the process, you can stop the web from forming. In 2018, three groups of theorists showed that these two states — web or no web — are reminiscent of familiar states of matter such as liquid and solid. But instead of marking a transition between different structures of matter, the shift between web and no web indicates a change in the structure of information.

“This is a phase transition in information,” said Brian Skinner of Ohio State University, one of the physicists who first identified the phenomenon. “It’s where the properties in information — how information is shared between things — undergo a very abrupt change.”


Brian Skinner of Ohio State University and his colleagues showed that entanglement can survive the destructive effects of repeated measurements and spread throughout a system.

More recently, a separate trio of teams tried to observe that phase transition in action. They performed a series of meta-experiments to measure how measurements themselves affect the flow of information. In these experiments, they used quantum computers to confirm that a delicate balance between the competing effects of entanglement and measurement can be reached. The transition’s discovery has launched a wave of research into what might be possible when entanglement and measurement collide.

Entanglement “can have lots of different properties well beyond what we ever imagined,” said Jedediah Pixley, a condensed matter theorist at Rutgers University who has studied variations of the transition.

An Entangled Dessert

One of the collaborations that stumbled upon the entanglement transition was born over sticky toffee pudding at a restaurant in Oxford, England. In April 2018, Skinner was visiting his friend Adam Nahum, a physicist now at the École Normale Supérieure in Paris. Over the course of a sprawling conversation, they found themselves debating a fundamental question regarding entanglement and information.

First, a bit of a rewind. To understand what entanglement has to do with information, imagine a pair of particles, A and B, each with a spin that can be measured as pointing up or down. Each particle begins in a quantum superposition of up and down, meaning that a measurement produces a random outcome — either up or down. If the particles are not entangled, measuring them is like flipping two coins: Getting heads with one tells you nothing about what will happen with the other.

But if the particles are entangled, the two outcomes will be related. If you find B pointing up, for instance, a measurement of A will find it pointing down. The pair shares an “oppositeness” that resides not in either member but between them — a whiff of the nonlocality that unnerved Einstein and Schrödinger. One consequence of this oppositeness is that by measuring just one particle you learn about the other. “Measuring B first gave me some information about A,” Skinner said. “That reduces my ignorance about the state of A.”

How much a measurement of B reduces your ignorance about A is called the entanglement entropy, and like any type of information, it is counted in bits. Entanglement entropy is the main way physicists quantify the entanglement between two objects, or, equivalently, how much information about one is stored nonlocally in the other. No entanglement entropy means no entanglement; measuring B reveals nothing about A. High entanglement entropy means a lot of entanglement; measuring B teaches you a lot about A.
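As a concrete illustration of the quantity being described, here is a minimal numerical sketch (an illustration, not from the article) that computes the entanglement entropy of a two-qubit state by diagonalizing the reduced density matrix of particle A: a product state gives zero bits, while a maximally entangled singlet gives one full bit.

```python
import numpy as np

def entanglement_entropy(psi):
    """Entanglement entropy (in bits) of qubit A for a pure two-qubit state psi."""
    m = psi.reshape(2, 2)                  # amplitudes indexed by (state of A, state of B)
    rho_a = m @ m.conj().T                 # reduced density matrix of A
    evals = np.linalg.eigvalsh(rho_a)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

product = np.array([1, 0, 0, 0], dtype=complex)                # |up, up>: unentangled
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)  # maximally entangled

print(entanglement_entropy(product))  # 0.0 bits: measuring B says nothing about A
print(entanglement_entropy(singlet))  # 1.0 bit: measuring B fully determines A
```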

Over dessert, Skinner and Nahum took this thinking two steps further. They first extended the pair of particles into a chain as long as one cared to imagine. They knew that according to Schrödinger’s eponymous equation, the quantum mechanics analog of F = ma, entanglement would jump from one particle to the next like the flu. They also knew that they could calculate how fully entanglement had taken hold in the same way: Label one half of the chain A and the other half B; if entanglement entropy is high, then the two halves are highly entangled. Measuring half the spins will give you a good idea of what to expect when you measure the other half.

Next, they moved measurement from the end of the process — when the particle chain had already reached a particular quantum state — to the middle of the action, while entanglement was spreading. Doing so created a conflict because measurement is the mortal enemy of entanglement. Left untouched, the quantum state of a group of particles reflects all the possible combinations of ups and downs you might get when you measure those particles. But measurement collapses a quantum state and destroys any entanglement it contains. You get what you get, and any alternate possibilities vanish.

Nahum asked Skinner the following question: What if, while entanglement was in the process of spreading, you measured some of the spins here and there? Constantly measuring them all would snuff out all the entanglement in a boring way. But if you sporadically measured just a few spins, which phenomenon would emerge victorious? Entanglement or measurement?


Skinner, ad-libbing, reasoned that measurement would crush entanglement. Entanglement spreads lethargically from neighbor to neighbor, so it grows by at most a few particles at a time. But one round of measurement could hit many particles throughout the lengthy chain simultaneously, snuffing out entanglement at a multitude of sites. Had they considered the strange scenario, many physicists would likely have agreed that entanglement would be no match for measurement.

“There was some kind of folklore,” said Ehud Altman, a condensed matter physicist at the University of California, Berkeley, that “states that are very entangled are very fragile.”

But Nahum, who had been mulling over this question since the previous year, believed otherwise. He envisioned the chain extending into the future, moment by moment, to form a sort of chain-link fence. The nodes were the particles, and the connections between them represented links across which entanglement might form. Measurements clipped links in random locations. Snip enough links, and the fence falls apart. Entanglement can’t spread. But until that point, Nahum argued, even a somewhat tattered fence should allow entanglement to spread far and wide.

Nahum had managed to turn a problem about an ephemeral quantum occurrence into a concrete question about a chain-link fence. That happened to be a well-studied problem in certain circles — the “vandalized resistor grid” — and one that Skinner had studied in his first undergraduate physics class, when his professor had introduced it during a digression.

“That’s when I got really excited,” Skinner said. “There’s no way to make a physicist happier than to show that a problem that looks hard is actually equivalent to a problem you already know how to solve.”
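The fence picture maps naturally onto classical bond percolation. The sketch below is a toy version of that mapping (with an arbitrary grid size and trial count, not the actual model from the papers): snip each link of a square grid with some probability and check whether a connected path still crosses the grid, standing in for whether entanglement can still spread.

```python
import numpy as np

rng = np.random.default_rng(1)

def fence_spans(width, height, p_snip):
    """Snip each link of a width-by-height grid with probability p_snip, then
    report whether a connected path still runs from the bottom row to the top."""
    parent = list(range(width * height))

    def find(x):                              # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for y in range(height):
        for x in range(width):
            node = y * width + x
            if x + 1 < width and rng.random() > p_snip:   # horizontal link survives
                parent[find(node)] = find(node + 1)
            if y + 1 < height and rng.random() > p_snip:  # vertical link survives
                parent[find(node)] = find(node + width)

    bottom = {find(x) for x in range(width)}
    top = {find((height - 1) * width + x) for x in range(width)}
    return bool(bottom & top)

for p_snip in (0.2, 0.5, 0.8):
    crossings = sum(fence_spans(30, 30, p_snip) for _ in range(200))
    print(f"snip probability {p_snip}: grid still spans in {crossings}/200 trials")
```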

Tracking Entanglement

But their dessert banter was just that: banter. To rigorously test and develop these ideas, Skinner and Nahum joined forces with a third collaborator, Jonathan Ruhman at Bar-Ilan University in Israel. The team digitally simulated the effects of snipping links at different speeds in chain-link fences. They then compared these simulations of classical nets with more accurate but more challenging simulations of real quantum particles to make sure the analogy held. They were making leisurely but steady progress.

Then, in the summer of 2018, they learned that they weren’t the only group thinking about measurements and entanglement.

Matthew Fisher, a prominent condensed matter physicist at the University of California, Santa Barbara, had been wondering whether entanglement between molecules in the brain might play a role in how we think. In the model he and his collaborators were developing, certain molecules occasionally bind together in a way that acts like a measurement and kills entanglement. Then the bound molecules change shape in a way that could create entanglement. Fisher needed to know whether entanglement could thrive under the pressure of intermittent measurements — the same question Nahum had been considering.

“It was new,” Fisher said. “Nobody had looked at this before 2018.”

In a display of academic cooperation, the two groups coordinated their research publications with each other and with a third team studying the same problem, led by Graeme Smith of the University of Colorado, Boulder.

“It kicked into a feverish pitch of us all working in parallel to post our papers at the same time,” Skinner said.

That August, all three groups unveiled their results. Smith’s team was initially at odds with the other two, which both supported Nahum’s fence-inspired reasoning: At first, entanglement outpaced modest rates of measurement to spread across a chain of particles, yielding high entanglement entropy. Then, as the researchers cranked up measurements beyond a “critical” rate, entanglement would halt — entanglement entropy would drop.
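A toy version of this kind of numerical experiment can be run on a laptop. The sketch below (an illustration with an arbitrarily small chain and step count, not the groups' actual simulations) evolves a few qubits with random two-qubit gates, interrupts the evolution with projective measurements at a chosen rate, and reports the half-chain entanglement entropy; rare measurements should typically leave the halves far more entangled than frequent ones.

```python
import numpy as np

rng = np.random.default_rng(0)
N, STEPS = 8, 20          # small chain so the 2**N statevector stays tiny

def haar_unitary(dim):
    """Random unitary drawn from the Haar measure via QR decomposition."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def apply_two_qubit(psi, gate, i):
    """Apply a 4x4 gate to neighboring qubits (i, i+1) of the statevector."""
    t = np.moveaxis(psi.reshape((2,) * N), (i, i + 1), (0, 1)).reshape(4, -1)
    t = gate @ t
    return np.moveaxis(t.reshape((2, 2) + (2,) * (N - 2)), (0, 1), (i, i + 1)).ravel()

def measure_qubit(psi, i):
    """Projective Z measurement of qubit i: collapse and renormalize the state."""
    t = np.moveaxis(psi.reshape((2,) * N), i, 0)
    p0 = np.sum(np.abs(t[0]) ** 2)
    outcome = 0 if rng.random() < p0 else 1
    t[1 - outcome] = 0.0
    t /= np.sqrt(p0 if outcome == 0 else 1 - p0)
    return np.moveaxis(t, 0, i).ravel()

def half_chain_entropy(psi):
    """Entanglement entropy (bits) between the two halves of the chain."""
    s = np.linalg.svd(psi.reshape(2 ** (N // 2), -1), compute_uv=False)
    probs = s[s > 1e-12] ** 2
    return float(-np.sum(probs * np.log2(probs)))

for p_measure in (0.05, 0.5):                    # rare vs frequent measurements
    psi = np.zeros(2 ** N, complex); psi[0] = 1.0
    for step in range(STEPS):
        for i in range(step % 2, N - 1, 2):      # brickwork pattern of random gates
            psi = apply_two_qubit(psi, haar_unitary(4), i)
        for i in range(N):
            if rng.random() < p_measure:
                psi = measure_qubit(psi, i)
    print(f"p = {p_measure}: half-chain entropy = {half_chain_entropy(psi):.2f} bits")
```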

The transition appeared to exist, but it wasn’t entirely clear to everyone where the intuitive argument — that the neighbor-to-neighbor creep of entanglement should get wiped out by the widespread lightning strikes of measurement — had gone wrong.

In the following months, Altman and his collaborators at Berkeley found a subtle flaw in the reasoning. “It doesn’t take into account [the spread of] information,” Altman said.

Altman’s group pointed out that not all measurements are highly informative, and therefore highly effective at destroying entanglement. This is because random interactions between the chain’s particles do more than just entangle. They also immensely complicate the state of the chain as time goes on, effectively spreading out its information “like a cloud,” Altman said. Eventually, each particle knows about the whole chain, but the amount of information it has is minuscule. And so, he said, “the amount of entanglement you can destroy [with each measurement] is ridiculously small.”

In March 2019, Altman’s group posted a preprint detailing how the chain effectively hid information from measurements and allowed much of the chain’s entanglement to escape devastation. Around the same time, Smith’s group updated their findings, bringing all four groups into agreement.


Ehud Altman, a physicist at the University of California, Berkeley, used an argument based on quantum information theory to help clarify why entanglement can overcome measurements.

Noga Altman

The answer to Nahum’s question was clear. A “measurement-induced phase transition” was, theoretically, possible. But unlike a tangible phase transition, such as water hardening into ice, this was a transition between information phases — one where information remains safely spread out among the particles and one in which it is destroyed through repeated measurements.

That’s kind of what you dream of doing in condensed matter, Skinner said — finding a transition between different states. “Now you’re left pondering,” he continued, “how do you see it?”

Over the next four years, three groups of experimenters would detect signs of the distinct flow of information.

Three Ways to See the Invisible

Even the simplest experiment that could pick up on the intangible transition is extremely tough. “At the practical level, it seems impossible,” Altman said.

The goal is to set a certain measurement rate (think rare, medium or frequent), let those measurements duke it out with entanglement for a bit, and see how much entanglement entropy you end up with in the final state. Then rinse and repeat with other measurement rates and see how the amount of entanglement changes. It’s a bit like raising the temperature to see how the structure of an ice cube changes.

But the punishing math of exponentially proliferating possibilities makes this experiment almost unthinkably difficult to pull off.

Entanglement entropy is not, strictly speaking, something you can observe. It’s a number you infer through repetition, the way you might eventually map out the weighting of a loaded die. Rolling a single 3 tells you nothing. But after tossing the die hundreds of times, you can learn the likelihood of getting each number. Similarly, finding one particle to point up and another to point down doesn’t mean they’re entangled. You’d have to get the opposite outcome many times to be sure.

Deducing the entanglement entropy for a chain of particles being measured is much, much harder. The final state of the chain depends on its experimental history — whether each intermediate measurement came out spin up or spin down. Those are twists of fate beyond the experimenter’s control, so to amass multiple copies of the same state, the experimenter needs to repeat the experiment over and over until they get the same sequence of intermediate measurements — like flipping a coin repeatedly until you get a bunch of heads in a row. Each additional measurement makes the effort twice as hard. If you make 10 measurements while preparing a string of particles, for instance, you’ll need to run another 2^10, or 1,024, experiments to get the same final state a second time (and you might need 1,000 more copies of that state to nail down its entanglement entropy). Then you’ll have to change the measurement rate and start again.
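The exponential cost is easy to tabulate (a back-of-the-envelope sketch, assuming each intermediate outcome is an unbiased coin flip and taking the rough figure above of 1,000 copies per entropy estimate):

```python
copies_per_estimate = 1000   # rough number of identical copies needed per entropy estimate
for m in (10, 20, 30):       # number of intermediate measurements per run
    runs_per_copy = 2 ** m   # expected runs to reproduce one specific outcome sequence
    print(f"{m} measurements: ~{runs_per_copy:,} runs per copy, "
          f"~{copies_per_estimate * runs_per_copy:,} runs per data point")
```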

The extreme difficulty of sensing the phase transition caused some physicists to wonder whether it was, in any meaningful sense, real.

“You’re relying on something that’s exponentially unlikely in order to see it,” said Crystal Noel, a physicist at Duke University. “So it raises the question of what does it mean physically?”

Noel spent almost two years thinking about measurement-induced phases. She was part of a team working on a new trapped-ion quantum computer at the University of Maryland. The processor contained qubits, quantum objects that act like particles. They could be programmed to create entanglement via random interactions. And the device could measure its qubits.


Crystal Noel of Duke University was part of the first team that used a quantum computer to realize a version of the information phases.

Alex Mousan for Pratt Communications/Duke University

In 2019, Noel and her colleagues began collaborating with two theorists who had come up with an easier way of doing the experiment. They had worked out a way of setting aside one qubit that, like a canary in a coal mine, could serve as a bellwether for the state of the entire chain.

The group also used a second trick to reduce the number of repetitions — a technical procedure that amounted to digitally simulating the experiment in parallel with actually doing it. That way, they knew what to expect. It was like being told in advance how the loaded die was weighted, and it reduced the number of experimental runs needed to work out the invisible entanglement structure.

With those two tricks, they could detect the entanglement transition in chains that were 13 qubits long, and they posted their results in the summer of 2021.

“We were amazed,” Nahum said. “Certainly I didn’t think it would have happened so soon.”

Unbeknownst to Nahum or Noel, a full execution of the original, exponentially more difficult version of the experiment — with no tricks or caveats — was already underway.

Around that time, IBM had just upgraded its quantum computers, giving them the ability to make relatively quick and reliable measurements of qubits on the fly. And Jin Ming Koh, an undergraduate student at the time at the California Institute of Technology, had given an internal presentation to IBM researchers and convinced them to assist with a project that would push the new feature to its limits. Under the supervision of Austin Minnich, an applied physicist at Caltech, the team set out to directly detect the phase transition in an effort Skinner calls “heroic.”

After consulting Noel’s team for advice, the group simply rolled the metaphorical dice enough times to determine the entanglement structure of every possible measurement history for chains of up to 14 qubits. They found that when measurements were rare, entanglement entropy doubled when they doubled the number of qubits — a clear signature of entanglement filling the chain. The longest chains (which involved more measurements) required more than 1.5 million runs on IBM’s devices, and altogether, the company’s processors ran for seven months. It was one of the most computationally intensive tasks ever completed using quantum computers.

Minnich’s group posted their realization of the two phases in March 2022, and it quieted any lingering doubts that the phenomenon was measurable.

“They really just brute-force did this thing,” Noel said, and proved that “for small system sizes, it’s doable.”

Recently, a team of physicists collaborated with Google to go even bigger, studying the equivalent of a chain almost twice as long as the previous two. Vedika Khemani of Stanford University and Matteo Ippoliti, now at the University of Texas, Austin, had already used Google’s quantum processor in 2021 to create a time crystal, which, like phases of spreading entanglement, is an exotic phase existing in a changing system.

Working with a large team of researchers, the pair took the two tricks developed by Noel’s group and added one new ingredient: time. The Schrödinger equation links a particle’s past with its future, but measurement severs that connection. Or, as Khemani put it, “once you put in measurements into a system, this arrow of time is completely destroyed.”

Without a clear arrow of time, the group was able to re-orient Nahum’s chain-link fence to let them access different qubits at different moments, which they used in advantageous ways. Among other results, they found a phase transition in a system equivalent to a chain of about 24 qubits, which they described in a March preprint.

Measurement Power

Skinner and Nahum’s debate over pudding, along with the work of Fisher and Smith, has spawned a new subfield among physicists interested in measurement, information and entanglement. At the heart of the various lines of investigation is a growing appreciation that measurements do more than just gather information. They are physical events that can generate genuinely new phenomena.

“Measurements are not something condensed matter physicists have thought about historically,” Fisher said. We make measurements to gather information at the end of an experiment, he continued, but not to actually manipulate a system.

In particular, measurements can produce unusual outcomes because they can have the same sort of everywhere-all-at-once flavor that once troubled Einstein. At the instant of measurement, the alternative possibilities contained in the quantum state vanish, never to be realized, including those that involve far-off spots in the universe. While the nonlocality of quantum mechanics doesn’t allow for faster-than-light transmissions in the way Einstein feared, it does enable other surprising feats.

“People are intrigued by what kind of new collective phenomena can be induced by these nonlocal effects of measurements,” Altman said.

Entangling a collection of many particles, for instance, has long been thought to require at least as many steps as the number of particles you hope to entangle. But last winter theorists detailed a way to pull it off in far fewer steps by using judicious measurements. Earlier this year, the same group put the idea into practice and fashioned a tapestry of entanglement hosting fabled particles that remember their pasts. Other teams are looking into other ways measurement could be used to supercharge entangled states of quantum matter.

The explosion of interest has come as a complete surprise to Skinner, who recently traveled to Beijing to receive an award for his work in the Great Hall of the People in Tiananmen Square. (Fisher’s team was also honored.) Skinner had initially believed that Nahum’s question was purely a mental exercise, but these days he isn’t so sure where it’s all heading.

“I thought it was just a fun game we were playing,” he said, “but I’d no longer be willing to put money on the idea that it’s not useful.”

Editor’s note: Jedediah Pixley receives funding from the Simons Foundation, which also funds this editorially independent magazine. Simons Foundation funding decisions have no influence on our coverage. More details are available here.


Quantum Entanglement in Neurons May Actually Explain Consciousness


A silent symphony is playing inside your brain right now as neurological pathways synchronize in an electromagnetic chorus that's thought to give rise to consciousness.

Yet how various circuits throughout the brain align their firing is an enduring mystery, one some theorists suggest might have a solution that involves quantum entanglement.

The proposal is a bold one, not least because quantum effects tend to blur into irrelevance on scales larger than atoms and molecules. Several recent findings are forcing researchers to put their doubts on hold and reconsider whether quantum chemistry might be at work inside our minds after all.

In their newly published paper, Shanghai University physicists Zefei Liu and Yong-Cong Chen and biomedical engineer Ping Ao from Sichuan University in China explain how entangled photons emitted by carbon-hydrogen bonds in nerve cell insulation could synchronize activity within the brain.

Their findings come just months after another quantum phenomenon known as superradiance was identified in cellular frameworks, drawing attention to a highly speculative theory of consciousness called the Penrose-Hameroff 'orchestrated objective reduction' model.

Proposed by the highly respected physicist Roger Penrose and the American anesthesiologist Stuart Hameroff, the model suggests networks of cytoskeleton tubules that lend structure to cells – in this case, our neurons – act as a kind of quantum computer that somehow shapes our thinking.

It's easy to see why there's an appeal in looking to quantum physics to explain consciousness. For one thing, both have a kind of 'weirdness' to them – a mix of predictability and randomness that's hard to pin down.

Then there is the perennial problem of what constitutes the pivotal observation that transforms quantum uncertainty into a classical absolute measurement. Could a quantum phenomenon in the brain be related to the collapse of a wave of probability?

On the other hand, weird plus weird doesn't equal scientific truth, no matter how incomprehensible each concept seems. Brains might not work like classical computers, but sprinkling them with quantum magic is unlikely to lead to a comprehensive theory.

Scientists have had a whole other reason to staple their skeptic hat on tight when it comes to quantum theories of consciousness – the sloppy tides of biology have long been considered too chaotic, too noisy, and too 'big' for quantum mechanics to emerge in any significant way.

That part we might need to reconsider, especially if experiments can verify Liu, Chen, and Ao's prediction.

The trio notes that the fatty coating called myelin around the nerve cell's axon 'tail' could conceivably serve as a suitable cylindrical cavity for amplifying infrared photons generated elsewhere in the cell, causing carbon-hydrogen bonds to occasionally spit out pairs of photons with a high degree of correlation between their properties.

Graphic depicting how myelin sheaths could act as a cylindrical cavity

Movements of these entangled photons through the ionic tides of the brain's biochemistry just might drive correlations between processes that play a central role in the organ's ability to synchronize.

The word 'might' is doing some tremendous heavy lifting here, of course. While there are plenty of empirical discoveries to support details of the hypothesis, evidence of entangled photons affecting large-scale biological processes is currently limited to photosynthesis.

That doesn't mean there are zero precedents for quantum biology in animals. Mounting evidence suggests the fuzzy superposition states of electron spins in proteins called cryptochromes can be influenced by magnetic fields in a way that helps explain long-distance navigation in some animals.

We're a fair way from demonstrating anything but classical chemistry is at work inside our heads, let alone confidently proclaiming the symphonies of our brain are united by a quantum composer.

But it might be time to press pause on reservations over quantum phenomena exerting an influence over at least some of our brain's basic functions.

This research has been published in Physical Review E.


Nerve fibres in the brain could generate quantum entanglement

Calculations show that nerve fibres in the brain could emit pairs of entangled particles, and this quantum phenomenon might explain how different parts of the brain work together

By Karmela Padavic-Callaghan

31 July 2024


Do quantum interactions help brain cells stay in sync?

Andriy Onufriyenko/Getty Images

Nerve fibres in the brain could produce pairs of particles linked by quantum entanglement. If backed by experimental observations, this phenomenon could explain how millions of cells in the brain synchronise their activity to make it function.

“When a brain is active, millions of neurons fire simultaneously,” says Yong-Cong Chen at Shanghai University in China. Doing so requires even distant cells to coordinate their timing, which has led some researchers to wonder if this coordination could be due to what Einstein called “spooky action at a distance”…



July 31, 2024


Quantum information theorists shed light on entanglement, one of the spooky mysteries of quantum mechanics

by William Mark Stuckey, The Conversation


The year 2025 marks the 100th anniversary of the birth of quantum mechanics. In the century since the field's inception, scientists and engineers have used quantum mechanics to create technologies such as lasers, MRI scanners and computer chips.

Today, researchers are looking toward building quantum computers and ways to securely transfer information using an entirely new sister field called quantum information science.

But despite creating all these breakthrough technologies, physicists and philosophers who study quantum mechanics still haven't come up with the answers to some big questions raised by the field's founders. Given recent developments in quantum information science, researchers like me are using quantum information theory to explore new ways of thinking about these unanswered foundational questions. And one direction we're looking into relates Albert Einstein's relativity principle to the qubit.

Quantum computers

Quantum information science focuses on building quantum computers based on the quantum "bit" of information, or qubit. The qubit is historically grounded in the discoveries of physicists Max Planck and Einstein. They instigated the development of quantum mechanics in 1900 and 1905, respectively, when they discovered that light exists in discrete, or "quantum," bundles of energy.

These quanta of energy also come in small forms of matter, such as atoms and electrons, which make up everything in the universe. It is the odd properties of these tiny packets of matter and energy that are responsible for the computational advantages of the qubit.

A computer based on a quantum bit rather than a classical bit could have a significant computing advantage. And that's because a classical bit produces a binary response—either a 1 or a 0—to only one query.

In contrast, the qubit produces a binary response to infinitely many queries using the property of quantum superposition. This property allows researchers to connect multiple qubits in what's called a quantum entangled state. Here, the entangled qubits act collectively in a way that arrays of classical bits cannot.
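To make the contrast concrete, here is a minimal sketch (an illustration, not the author's example) that samples measurement outcomes from an ideal two-qubit entangled state using the Born rule: each qubit's result is individually random, yet the two always agree, which no pair of independent classical bits can reproduce.

```python
import numpy as np

rng = np.random.default_rng(7)

# Ideal Bell state (|00> + |11>) / sqrt(2): neither qubit has a definite value
# on its own, but measurements of the two always agree.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)

probs = np.abs(bell) ** 2                   # Born rule over |00>, |01>, |10>, |11>
for outcome in rng.choice(4, size=8, p=probs):
    alice, bob = divmod(outcome, 2)
    print(f"Alice: {alice}   Bob: {bob}")   # individually random, always equal
```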

That means a quantum computer can do some calculations much faster than an ordinary computer. For example, one device reportedly used 76 entangled qubits to solve a sampling problem 100 trillion times faster than a classical computer.

But the exact force or principle of nature responsible for this quantum entangled state that underlies quantum computing is a big unanswered question. A solution that my colleagues and I in quantum information theory have proposed has to do with Einstein's relativity principle.

Quantum information theory

The relativity principle says that the laws of physics are the same for all observers, regardless of where they are in space, how they're oriented or how they're moving relative to each other. My team showed how to use the relativity principle in conjunction with the principles of quantum information theory to account for quantum entangled particles.

Quantum information theorists like me think about quantum mechanics as a theory of information principles rather than a theory of forces. That's very different than the typical approach to quantum physics, in which force and energy are important concepts for doing the calculations. In contrast, quantum information theorists don't need to know what sort of physical force might be causing the mysterious behavior of entangled quantum particles.

That gives us an advantage for explaining quantum entanglement because, as physicist John Bell proved in 1964, any explanation for quantum entanglement in terms of forces requires what Einstein called "spooky actions at a distance."

That's because the measurement outcomes of the two entangled quantum particles are correlated—even if those measurements are done at the same time and the particles are physically separated by a vast distance. So, if a force is causing quantum entanglement, it would have to act faster than the speed of light. And a faster-than-light force violates Einstein's theory of special relativity.

Many researchers are trying to find an explanation for quantum entanglement that doesn't require spooky actions at a distance, like my team's proposed solution.

Classical and quantum entanglement

In entanglement, you can know something about two particles collectively—call them particle 1 and particle 2—so that when you measure particle 1, you immediately know something about particle 2.

Imagine you're mailing two friends, whom physicists typically call Alice and Bob, each one glove from the same pair of gloves. When Alice opens her box and sees a left-hand glove, she'll know immediately that when Bob opens the other box he will see the right-hand glove. Each box and glove combination produces one of two outcomes, either a right-hand glove or a left-hand glove. There's only one possible measurement—opening the box—so Alice and Bob have entangled classical bits of information.

But in quantum entanglement the situation involves entangled qubits, which behave very differently than classical bits.

Qubit behavior

Consider a property of electrons called spin. When you measure an electron's spin using magnets that are oriented vertically, you always get a spin that's up or down, nothing in between. That's a binary measurement outcome, so this is a bit of information.

If you turn the magnets on their sides to measure an electron's spin horizontally, you always get a spin that's left or right, nothing in between. The vertical and horizontal orientations of the magnets constitute two different measurements of this same bit. So, electron spin is a qubit—it produces a binary response to multiple measurements.

Quantum superposition

Now suppose you first measure an electron's spin vertically and find it is up, and then you measure its spin horizontally. By analogy: when you stand straight up, you don't move to your right or your left at all. So, if I measure how much you move side to side as you stand straight up, I'll get zero.

That's exactly what you might expect for the vertical spin up electrons. Since they have vertically oriented spin up, analogous to standing straight up, they should not have any spin left or right horizontally, analogous to moving side to side.

Surprisingly, physicists have found that half of them are horizontally right and half are horizontally left. Now it doesn't seem to make sense that a vertical spin up electron has left spin (-1) and right spin (+1) outcomes when measured horizontally, just as we expect no side-to-side movement when standing straight up.

But when you add up all the left (-1) and right (+1) spin outcomes you do get zero, as we expected in the horizontal direction when our spin state is vertical spin up. So, on average, it's like having no side-to-side or horizontal movement when we stand straight up.

This 50–50 ratio over the binary (+1 and -1) outcomes is what physicists are talking about when they say that a vertical spin up electron is in a quantum superposition of horizontal spins left and right.
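Here is a minimal Born-rule check of those 50-50 statistics (an illustrative sketch, not part of the original article): a vertically "up" spin state, expanded in the horizontal basis, gives equal probabilities for left and right, and the +1 and -1 outcomes average to zero.

```python
import numpy as np

up_z = np.array([1.0, 0.0])                   # vertical "spin up" state
right_x = np.array([1.0, 1.0]) / np.sqrt(2)   # horizontal "spin right" (+1)
left_x = np.array([1.0, -1.0]) / np.sqrt(2)   # horizontal "spin left" (-1)

p_right = abs(right_x @ up_z) ** 2
p_left = abs(left_x @ up_z) ** 2
average = (+1) * p_right + (-1) * p_left

print(f"P(right) = {p_right:.2f}, P(left) = {p_left:.2f}")  # 0.50 and 0.50
print(f"average horizontal spin = {average:.2f}")           # 0.00
```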

Entanglement from the relativity principle

According to quantum information theory, all of quantum mechanics, including its quantum entangled states, is based on the qubit with its quantum superposition.

What my colleagues and I proposed is that this quantum superposition results from the relativity principle, which (again) states the laws of physics are the same for all observers with different orientations in space.

If the electron with a vertical spin in the up direction were to pass straight through the horizontal magnets as you might expect, it would have no spin horizontally. This would violate the relativity principle, which says the particle should have a spin regardless of whether it's being measured in the horizontal or vertical direction.

Because an electron with a vertical spin in the up direction does have a spin when measured horizontally, quantum information theorists can say that the relativity principle is (ultimately) responsible for quantum entanglement.

And since there is no force used in this principle explanation, there are none of the "spooky actions at a distance" that Einstein derided.

With quantum entanglement's technological implications for quantum computing firmly established, it's nice to know that one big question about its origin may be answered with a highly regarded physics principle.

Provided by The Conversation

  • Open access
  • Published: 09 August 2024

A hybrid quantum-classical classification model based on branching multi-scale entanglement renormalization ansatz

Yan-Yan Hou, Jian Li, Tao Xu & Xin-Yu Liu

Scientific Reports, volume 14, Article number: 18521 (2024)


  • Computer science
  • Quantum information

Tensor networks are emerging architectures for implementing quantum classification models. The branching multi-scale entanglement renormalization ansatz (BMERA) is a tensor network known for its enhanced entanglement properties. This paper introduces a hybrid quantum-classical classification model based on BMERA and explores the correlation between circuit layout, expressiveness, and classification accuracy. Additionally, we present an autodifferentiation method for computing the cost function gradient, which serves as a viable option for other hybrid quantum-classical models. Through numerical experiments, we demonstrate the accuracy and robustness of our classification model in tasks such as image recognition and cluster excitation discrimination, offering a novel approach for designing quantum classification models.


Introduction

Machine learning has made significant strides in diverse scientific and technological domains, such as image recognition and natural language processing. The rapid growth of big data and artificial intelligence has led to increased demands for improved machine learning performance. Quantum superposition and entanglement render quantum computation a superior solution for processing large-scale data. Researchers have integrated quantum computation with machine learning, resulting in a series of quantum machine learning algorithms [1–7] applicable to discriminative [8, 9] and generative learning [10, 11] tasks. Tensor networks (TNs) extract features by contracting tensors, serving as crucial numerical tools for analyzing quantum multi-body systems.
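To make the idea of extracting features by contracting tensors concrete, here is a minimal sketch of a tree-shaped tensor network acting on four feature vectors (random tensors and dimensions chosen purely for illustration; this is not the BMERA circuit studied in this paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Four 2-dimensional input feature vectors, contracted through a two-layer
# binary tree of rank-3 tensors down to a single 2-dimensional output.
inputs = [rng.normal(size=2) for _ in range(4)]
layer1 = [rng.normal(size=(2, 2, 2)) for _ in range(2)]  # each merges two inputs
top = rng.normal(size=(2, 2, 2))                         # merges the two intermediate legs

mid0 = np.einsum('i,j,ijk->k', inputs[0], inputs[1], layer1[0])
mid1 = np.einsum('i,j,ijk->k', inputs[2], inputs[3], layer1[1])
output = np.einsum('i,j,ijk->k', mid0, mid1, top)

print(output)   # a 2-component feature vector, e.g. unnormalized class scores
```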

Recently, researchers have begun applying tensor networks (TNs) in quantum classification models. Tree tensor networks (TTNs) 12 , whose depth grows logarithmically with the number of input qubits, were the first to be employed in quantum classification models. Another tensor network used in quantum classification models is the multi-scale entanglement renormalization ansatz (MERA). MERA shares a similar structure with TTN but incorporates isometry operations to capture entanglement across a wider scale. Research suggests that quantum machine learning models utilizing MERA outperform those utilizing TTNs in both classification and generative tasks 13 .

TNs can be regarded as quantum neural networks with specific structures, characterized by logarithmic circuit depth, which mitigates the barren plateau problem during parameter optimization 14 . Quantum classification models utilizing TTN or MERA leverage entanglement to capture the correlation among data features, where the entanglement entropy must satisfy the area law 15 . However, in some complex classification tasks, the correlation among data features may necessitate the violation of the area law by entanglement entropy. Are there TNs better suited for implementing quantum classification models? Branching multi-scale entanglement renormalization ansatz (BMERA) serves as a generalization of MERA, wherein its entanglement entropy can break the area law to support the volume law. BMERA’s favorable entanglement characteristics render it a suitable choice for quantum classification models.

Unlike general neural networks, BMERA relies on tensor contraction to process data, where the input data of local tensors cannot be sent to multiple outputs 16 . Consequently, a quantum classification model based on BMERA alone is not scalable. Inspired by the entanglement advantage of BMERA and the scalability of classical neural networks, we design a hybrid quantum-classical classification model. In this model, BMERA and classical neural networks cooperate to construct the cost function of the classification model. Subsequently, a classical optimizer is employed to optimize the model parameters. This classification model transfers some tasks that are challenging for quantum devices to classical neural networks, so the quantum circuit has a shallow depth and is suitable for implementation on noisy intermediate-scale quantum (NISQ) devices 17 , 18 .

Our work contributes in three main ways. Firstly, we propose a novel parameterized quantum circuit based on BMERA, which utilizes a shallower circuit for data feature extraction. Secondly, we analyze the relationship between circuit layout, expressibility, and classification accuracy, providing valuable insights for enhancing classification accuracy. Lastly, we propose an autodifferentiation method for computing the cost function gradient, which serves as a viable option for other hybrid quantum-classical models. Simulation results demonstrate that the proposed model surpasses quantum classification models based on TTN and MERA in both MNIST (handwriting recognition with binary images) and quantum cluster state excitation discrimination tasks. Additionally, we illustrate that the proposed model exhibits robustness against depolarization noise.

This paper is organized as follows: the “Method” section presents the hybrid quantum-classical classification model based on BMERA, and the “Numerical simulations and discussions” section verifies the accuracy and robustness of this model in classical and quantum classification tasks. Finally, we present our conclusions and discuss future research directions.

Method

In this section, we introduce the hybrid quantum-classical classification model based on BMERA. If the input data originates from a quantum process, it is already in a quantum state and can be used directly as input to the model. For classical input data, the primary task is to encode it into quantum states. Encoding methods include amplitude encoding, qubit encoding, and hybrid encoding. Among these, qubit encoding maps each element to the rotation angle of a single-qubit gate, resulting in a shallower circuit depth and easier implementation on NISQ devices. The core of the hybrid quantum-classical classification model is the collaboration between parameterized quantum circuits and classical neural networks. First, parameterized quantum circuits act on the encoded quantum states to extract data features. Typically, the number of output qubits exceeds the number of qubits needed for predicting labels, so the output state is measured to obtain expectation values, which are then forwarded to classical neural networks to produce the predicted labels.

Both the parameterized quantum circuit and the classical neural networks involve a large number of parameters. As the initial parameters are random, the initial classification model is unlikely to be optimal. Similar to classification models implemented by classical neural networks, we introduce a classical optimizer to optimize the parameterized quantum circuit and the neural networks, obtaining the optimal parameters through iterative optimization. Notably, the parameterized quantum circuit and the classical neural networks are updated alternately during optimization: in each iteration, only the parameters that were not updated in the previous iteration are updated, while the others remain unchanged. Similar to the layer-wise learning used in quantum neural networks 19 , 20 , alternating parameter updates help address the barren plateau problem during parameter optimization. Figure  1 shows the framework of the hybrid quantum-classical classification model based on BMERA.

figure 1

The framework of the hybrid quantum-classical classification model based on BMERA. Input is classical data, whose features are extracted and mapped into quantum states (represented by circles). Then, quantum states are sent to BMERA. Each layer contains one or two sub-layers, where yellow and white rectangles represent unitary modules of the first and second sub-layers, respectively. Data dimension is reduced by partial trace operation, represented by hash marks. The expectation of the Pauli operator for the output is obtained through multiple measurement operations. Classical neural networks act on the expectation values and output predicted labels. In the optimization process, the parameters of BMERA and classical neural networks are iteratively optimized by a classical optimizer.
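
As a concrete illustration of this alternating schedule, the following Python sketch updates only one parameter set per iteration; the gradient routine `grad_fn`, the parameter arrays, and the batch iterator are placeholders rather than the paper's actual implementation.

```python
# Sketch of the alternating update schedule: only one parameter set is updated
# per iteration, and the updated set alternates between the quantum circuit
# parameters theta and the classical-network parameters beta.
# `grad_fn` is a placeholder returning (dC/dtheta, dC/dbeta) for one batch,
# e.g. via the parameter-shift rule for theta and backpropagation for beta.
import numpy as np

def train(theta, beta, batches, grad_fn, lr=0.01):
    for step, (x_batch, y_batch) in enumerate(batches):
        g_theta, g_beta = grad_fn(theta, beta, x_batch, y_batch)
        if step % 2 == 0:
            theta = theta - lr * g_theta   # update circuit parameters only
        else:
            beta = beta - lr * g_beta      # update network parameters only
    return theta, beta
```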

Map data into quantum states

Let \({\mathcal {D}}=\{(x_1,y_1),...,(x_m,y_m)\}\) denote the training data set, where \(x_i=(x_i^1, x_i^2, ..., x_i^n)\) is an n-dimensional data vector and \(y_i\in \{0,...,l-1\}\) is the corresponding label. For classification tasks, the key task is to learn the function \(f:x_i\rightarrow y_i\) mapping each data vector \(x_i\) to its label \(y_i\) . For qubit encoding, each element \(x_i^j\) is first scaled to \([-1,1]\) and then mapped into the state

\(x_i\) corresponds to the tensor product state

where \(|\phi (x_i)\rangle\) lies in the \(2^n\)-dimensional Hilbert space. Qubit encoding can be implemented by the fixed quantum circuit \(P(x_i)=\otimes _{j=1}^{n}P_i^j\) acting on the initial state \(|0\rangle ^{\otimes n}\) , where
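
The single-qubit encoding map itself is not reproduced above, so the following Cirq sketch assumes a common qubit-encoding convention in which each rescaled element x in [-1, 1] sets the angle of an R_Y rotation, giving cos(πx/2)|0⟩ + sin(πx/2)|1⟩; treat the angle convention as an assumption rather than the paper's exact definition.

```python
import numpy as np
import cirq

def encode(x):
    """Prepare the product state |phi(x_i)> from |0...0> with one R_y per element.

    Assumed convention: element x in [-1, 1] -> R_y(pi * x), i.e.
    cos(pi*x/2)|0> + sin(pi*x/2)|1> on that qubit.
    """
    qubits = cirq.LineQubit.range(len(x))
    return cirq.Circuit(cirq.ry(np.pi * xj)(q) for xj, q in zip(x, qubits))

x = np.array([0.3, -0.7, 1.0, 0.0])                        # already scaled to [-1, 1]
state = cirq.Simulator().simulate(encode(x)).final_state_vector
```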

Parameterized quantum circuit based on BMERA

A parameterized quantum circuit, also called an ansatz, plays an essential role in the hybrid quantum-classical classification model. The challenging task in implementing the model is to construct an ansatz that can represent the solution space of the problem being solved 21 . The entanglement entropy of BMERA can break the area law and support the volume law. In classification tasks with strong correlations among data features, BMERA can therefore effectively represent the solution space and better approximate the classification model.

Similar to ansatzes based on general tensor networks, BMERA consists of 2-qubit unitary modules arranged in a hierarchical layout. For an n-qubit input, BMERA needs \(L=O(\log (n))\) layers. In the i-th layer, adjacent \(2^i\) qubits are entangled to build local correlations. Let

represent the n-qubit input, where \(\rho ^i_0\) represents the state of the i-th qubit. In the first layer, the unitary \(U_1(\theta )\) acts on \(\rho _0\) and produces the entangled state

where \(\rho ^i_1\) represents the entangled state of the i-th adjacent qubit pair. In the t-th layer, \(U_t(\theta )\) acts on the output state \(\rho _{t-1}\) of the \((t-1)\)-th layer and produces the entangled state

where \(\rho ^i_t\) represents the state entangled by \(2^t\) adjacent qubits and \(t\in \{1,...,L\}\) . As the number of circuit layers increases, BMERA gradually builds entanglement on a larger scale. In the last layer, the unitary operation \(U_L(\theta )\) acts on the state \(\rho _{L-1}\) and obtains the final output state \(\rho _L\) , which entangles all qubits.

Figure  2 shows a BMERA with 16-qubit input, comprising four layers. Starting from the second layer, each layer contains two sub-layers to establish a larger range of entanglement. Each sub-layer includes multiple two-qubit unitary modules \(V_i^j(\theta )\) . Unitary modules \(V_i^j(\theta )\) are implemented by single-qubit and two-qubit unitary operations, where the single-qubit operations are rotation gates parameterized by trainable angles and the two-qubit operations are controlled gates acting on adjacent qubits.

figure 2

The structure of BMERA with 16-qubit input. \(\rho _0\) represents the input including \(a_0\sim a_{15}\) . The unitary operation in the first layer acts on \(\rho _0\) and outputs the entangled state \(\rho _1\) , where neighboring pairs of 2 qubits are entangled. The unitary operation in the second layer forms the entangled state \(\rho _2\) , where neighboring 4 qubits are entangled. The unitary operation in the third layer forms the entangled state \(\rho _3\) , where neighboring 8 qubits are entangled. All 16 qubits are entangled through four layers of unitary operations, and a highly entangled state \(\rho _4\) is built. Rectangles represent unitary modules \(V_i^j(\theta _{i,j})\) .
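
For illustration, the sketch below builds one such layer in Cirq using the R_Y-plus-CNOT module of circuit (5) in Fig. 4 as V_i^j; the symbol names and the choice of module are ours, not taken from the authors' code.

```python
import sympy
import cirq

def unitary_module(q0, q1, a, b):
    """Two-qubit module V(theta): a parameterized R_y on each qubit, then a CNOT
    (the layout of circuit (5) in Fig. 4)."""
    yield cirq.ry(a)(q0)
    yield cirq.ry(b)(q1)
    yield cirq.CNOT(q0, q1)

def bmera_layer(qubits, layer_idx):
    """One sub-layer: entangle adjacent qubit pairs with parameterized modules.
    A full BMERA stacks such sub-layers so that deeper layers entangle
    progressively larger blocks of qubits."""
    params = sympy.symbols(f"t{layer_idx}_0:{len(qubits)}")
    for k in range(0, len(qubits) - 1, 2):
        yield from unitary_module(qubits[k], qubits[k + 1], params[k], params[k + 1])

qubits = cirq.LineQubit.range(16)
first_layer = cirq.Circuit(bmera_layer(qubits, 1))
```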

Classification is equivalent to mapping the input to its predicted label vector. Typically, the dimension of the label vector is smaller than that of the input, so the number of output qubits should be less than the number of input qubits. BMERA establishes entanglement over a larger scale, but its input and output contain the same number of qubits, so BMERA alone cannot implement the shrinking map from the input to the predicted label. The quantum convolutional neural network (QCNN) is an important quantum neural network model: in each layer, only some qubits are passed to the subsequent layer, and partial trace operations are used to reduce the number of output qubits. The output of BMERA is obtained through multiple layers of unitary and partial trace operations, and its expectation value corresponds to the extracted feature.

The causal cone is an important property of MERA, consisting of the gates and connections that affect the output qubits. BMERA, an extension of MERA, also has the causal cone property, and its output is affected only by the unitary operations inside its causal cone. Inspired by the contractility of QCNN and the causal cone property of BMERA, we design a reduced BMERA that uses only a subset of contiguous qubits as outputs; the quantum circuit retains only the unitary modules located in the causal cone of those output qubits.

In the reduced BMERA, \(\upsilon _{i-1}\) qubits of the \((i-1)\)-th layer remain as the input of the i-th layer, and the remaining qubits are removed by a partial trace operation. The output of the i-th layer can be written as \(\rho _i^{'}=\mathrm{tr}_{{\widetilde{o}}_i}\big (U_i(\theta )\rho ^{'}_{i-1}U^{\dag }_i(\theta )\big )\) , where \(\mathrm{tr}_{{\widetilde{o}}_i}\) denotes the partial trace over the qubits other than the output qubits of the i-th layer, and \(\rho _{i-1}^{'}\) and \(\rho _i^{'}\) represent the input and output of the i-th layer, respectively.

Figure  3 shows a reduced BMERA with 16-qubit input. Structurally, the reduced BMERA is a binary tree. As the number of layers increases, the size of the subtrees gradually increases while the number of output qubits decreases. In the final layer, the output qubits \(a_7\) , \(a_0\) , \(a_{15}\) , and \(a_{8}\) are measured, and their expectation values are transmitted to the classical neural networks. The reduced BMERA effectively extracts the input features while retaining the entanglement of the output qubits. A classical convolutional neural network (CNN) has translation invariance, meaning that the filters within a convolutional layer share their weights; CNNs achieve higher accuracy when classifying data with spatial correlations 22 . Inspired by the translation invariance of CNNs, the reduced BMERA adopts the same parameters for the unitary modules within the same sub-layer.

figure 3

The structure of a reduced BMERA with 16-qubit input. The input \(\rho _0^{'}\) includes qubits \(a_0\sim a_{15}\) . Each layer has multiple sub-trees, and each dotted box represents a subtree. \(\rho _0^{'}\) , \(\rho _{1}^{'}\) , \(\rho _2^{'}\) and \(\rho _{3}^{'}\) represent the inputs of the \(1\sim 4\) th layers, respectively. Partial trace operations act on some qubits to reduce the number of output qubits. In each layer, a quantum swap operation is adopted to establish larger-scale entanglement. The final output is obtained by measuring the expectation values of the Pauli operator \(\sigma _z\) for the qubits \(a_7\) , \(a_0\) , \(a_{15}\) , and \(a_{8}\) .

Cost function and optimization

The critical task of the hybrid quantum-classical classification model lies in formulating a cost function relevant to the problem at hand. Cost functions can be constructed using, for example, the mean squared error (MSE) or the cross-entropy. The cross-entropy is more suitable for classification tasks, and its definition is

where \(y_i\) and \(l^{x_i}\) represent the correct and predicted labels, respectively, and n is the number of training samples. For the hybrid quantum-classical classification model, the initial step involves extracting data features using a parameterized quantum circuit.

Let \(U(\theta )\) denote the parameterized quantum circuit acting on the input \(|\phi (x_i)\rangle\) , then the output is

where \(\theta\) represents the trainable parameters. Measuring the expectation of the Pauli operator \(\sigma _z\) on the j-th qubit gives the expectation value

where \(\sigma _z^j\) means the operator acting on the j th qubit.

Subsequently, the expectation values of all output qubits are used to construct the feature vector \(E(\theta )=\{E^1(\theta ), E^2(\theta ),\ldots , E^{n_o}(\theta )\}\) , where \(n_o\) denotes the number of output qubits. Finally, \(E(\theta )\) is transmitted to the classical neural network \(f_{nn}(E(\theta ),\beta )\) to obtain the predicted label \(l^{x_i}\) , where \(\beta\) denotes the trainable parameters of the classical neural network. Based on the cross-entropy, the cost function of the hybrid quantum-classical classification model can be written as

where \(f_{tn}(|\phi (x_i)\rangle ,\theta )\) represents the mapping from the input \(|\phi (x_i)\rangle\) to the expectation value \(E(\theta )\) , and \(f_c(l^{x_i},y_i)\) means the cross-entropy function that maps the predicted label \(l^{x_i}\) and the correct label \(y_i\) to the cost function \(C(\theta ,\beta )\) .
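
A minimal numerical sketch of this pipeline for a single sample is given below; the single-layer sigmoid network standing in for f_nn and all variable names are placeholders, since the paper does not fix the classical network here.

```python
import numpy as np

def predict(E, W, b):
    """Map the expectation-value feature vector E(theta) to a predicted label l^{x_i}
    with a single-layer sigmoid network (stand-in for f_nn)."""
    z = E @ W + b
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(y, l):
    """Binary cross-entropy f_c between correct label y and prediction l."""
    return -(y * np.log(l) + (1 - y) * np.log(1 - l))

E = np.array([0.12, -0.53, 0.88, 0.05])   # <sigma_z> of the four output qubits
W, b = np.random.randn(4), 0.0            # trainable beta of the classical net
cost = cross_entropy(y=1, l=predict(E, W, b))
```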

The subsequent step involves computing the optimal parameters \(({\theta }^*,{\beta }^*)\) by minimizing the cost function \(C(\theta ,\beta )\) . Gradient descent is a common optimization method in machine learning; its core idea is to update the parameters along the direction of the negative gradient of the cost function. Autodifferentiation is an effective approach for computing the gradient of a composite cost function. In this approach, the cost function is decomposed into several subfunctions, and the gradient is computed by applying the chain rule of backpropagation to the partial derivatives of these subfunctions. Consequently, gradient descent reduces to computing a series of partial derivatives of the subfunctions 23 .

Let \(\theta _j\) represent the j th parameter of \(\theta\) . By the derivative chain rule of backpropagation, the partial derivative of the cost function \(C(\theta ,\beta )\) with respect to \(E^k(\theta )\) is

where \(\frac{\partial (f_c(l^{x_i}))}{\partial {l^{x_i}}}=\frac{1-y_i}{1-l^{x_i}}-\frac{y_i}{l^{x_i}}\) and \(\frac{\partial (f_{nn}(E(\theta ),\beta ))}{\partial {E^k(\theta )}}\) are computed on classical computers. Let \(g=\{g_1,g_2,...,g_{n_o}\}\) represent the vector of partial derivatives \(g_k\) . Then, the gradient of \(C(\theta ,\beta )\) with respect to the j-th parameter \(\theta _j\) is

A critical aspect of minimizing the cost function involves solving the partial derivative of the expectation value \(E^k(\theta )\) with respect to \(\theta _j\) , which can be computed using the parameter shift rule 24

where \({\Delta }_j\) denotes a unit shift applied only to the j-th parameter, and \({\widetilde{\theta }}_j\) denotes all parameters of \(\theta\) except \({\theta }_j\) . \(\theta _j+\frac{\pi }{2}{\Delta }_j\) and \(\theta _j-\frac{\pi }{2}{\Delta }_j\) are the shifted parameters used to evaluate the gradient. Unlike finite-difference methods, the parameter shift rule enables exact gradient computation without discretization errors, and its circuits are easily implementable on near-term quantum devices.
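
The rule is straightforward to apply once the circuit expectation can be evaluated; the sketch below assumes a placeholder `expectation(theta)` function returning the measured expectation value for a given parameter vector.

```python
import numpy as np

def parameter_shift_grad(expectation, theta, j):
    """dE/d(theta_j) = [E(theta + (pi/2) e_j) - E(theta - (pi/2) e_j)] / 2,
    valid when theta_j enters through a gate generated by a Pauli operator."""
    shift = np.zeros_like(theta)
    shift[j] = np.pi / 2.0
    return 0.5 * (expectation(theta + shift) - expectation(theta - shift))
```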

Expressibility

Expressibility is a crucial metric for assessing parameterized quantum circuits. It indicates the capability of states generated by parameterized quantum circuits to span the entire Hilbert space. States generated by Haar random unitaries uniformly cover the Hilbert space, thus exhibiting the highest expressibility. The smaller the distance between the state distribution resulting from uniform sampling of a parameterized quantum circuit and the Haar random unitary distribution, the more expressive the circuit becomes. Since the Haar random unitary follows a uniform distribution, greater uniformity in the states generated by randomly sampling parameterized quantum circuits results in higher expressibility. Thus, expressibility can be quantified as the deviation of the probability distribution of states generated by a parameterized quantum circuit from that of the Haar random unitary.

Let \(Q(\alpha )\) denote a parameterized quantum circuit with n-qubit input, and let \(|\psi (\alpha _1)\rangle\) and \(|\psi (\alpha _2)\rangle\) represent two states generated by randomly sampling the parameters of \(Q(\alpha )\) . \(F_b=|\langle \psi (\alpha _1)|\psi (\alpha _2)\rangle |^t\) denotes the t-moment fidelity between \(|\psi (\alpha _1)\rangle\) and \(|\psi (\alpha _2)\rangle\) . Assuming the 1-moment fidelity is used to describe the expressibility of \(Q(\alpha )\) , \(F_b\) can be rewritten as \(|\langle \psi (\alpha _1)|\psi (\alpha _2)\rangle |^2\) , abbreviated as the fidelity. By randomly sampling states we obtain the fidelity distribution function \(P_{b}(F_b;\alpha )\) . Similarly, \(F_h\) represents the fidelity of the states produced by a Haar-random unitary with n-qubit input, and its distribution function is \(P_{h}(F_h)=(N-1)(1-F_h)^{N-2}\) , where \(N=2^n\) is the dimension of the Hilbert space.

Let \(KL(P_b(F_b;\alpha )\,||\,P_h(F_h))\) denote the KL divergence between the fidelity distribution functions \(P_{b}(F_b;\alpha )\) and \(P_{h}(F_h)\) . The smaller the KL divergence, the closer the fidelity distribution of \(Q(\alpha )\) is to that of the Haar-random unitary. As the Haar-random unitary has the highest expressibility, the closer the distribution of \(Q(\alpha )\) is to the Haar-random distribution, the stronger the expressibility of \(Q(\alpha )\) 25 . When \(KL(P_b(F_b;\alpha )\,||\,P_h(F_h))=0\) , \(P_b(F_b;\alpha )\) is equal to \(P_h(F_h)\) , and \(Q(\alpha )\) has the highest expressibility. We define the expressibility of \(Q(\alpha )\) as

The larger \(R(Q(\alpha ))\) , the stronger the expressibility of \(Q(\alpha )\) . In the hybrid quantum-classical classification model, expressibility corresponds to the capacity to represent the target problem effectively. Greater expressibility brings the output state of the parameterized quantum circuit closer to the correct solution.
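
This estimate can be reproduced numerically by sampling fidelities and comparing their histogram with the Haar distribution; in the sketch below, `sample_state` is a placeholder returning the output state of Q(α) for randomly drawn parameters, and the paper's R(Q(α)) is then a decreasing function of the resulting KL divergence (its exact form is given in the equation above).

```python
import numpy as np

def kl_to_haar(sample_state, n_qubits, n_pairs=10000, bins=75, seed=0):
    """Estimate KL(P_b || P_h) from sampled pair fidelities of a parameterized circuit.
    `sample_state(rng)` must return the normalized output state for random parameters."""
    rng = np.random.default_rng(seed)
    N = 2 ** n_qubits
    fids = np.empty(n_pairs)
    for i in range(n_pairs):
        psi1, psi2 = sample_state(rng), sample_state(rng)
        fids[i] = np.abs(np.vdot(psi1, psi2)) ** 2
    p, edges = np.histogram(fids, bins=bins, range=(0.0, 1.0))
    p = p / p.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    q = (N - 1) * (1.0 - centers) ** (N - 2)   # discretized Haar fidelity distribution
    q = q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
```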

The layout of a parameterized quantum circuit can be changed by varying the qubit connections and gates. The expressibility of parameterized quantum circuits with different layouts is analyzed as follows. Figure  4 shows 8 circuit layouts for the 2-qubit unitary module \(U_i^j(\theta _{i,j})\) , most of which are built based on previous studies 26 . Circuit (1) consists of one-qubit rotation gates \(R_X\) and \(R_Z\) and the two-qubit controlled gate CNOT. Each CNOT gate acts on one neighboring qubit pair to construct entanglement. Circuits (2) and (3) have a similar layout to circuit (1), except that the CNOT gate is replaced with controlled- \(R_Z\) and controlled- \(R_X\) gates, respectively. Circuit (4) is implemented by H , CNOT, and \(R_X\) gates, and it has the lowest expressibility. Circuit (5), first presented in Ref. 27 , comprises \(R_Y\) and CNOT gates. Circuits (6) and (7) adopt a similar layout to circuit (5), except that the CNOT gate is replaced with controlled- \(R_Z\) and controlled- \(R_X\) gates, respectively. Circuit (8), denoting an arbitrary SU (4) gate 20 , has the highest expressibility.

figure 4

The circuit layouts of the unitary module \(U_i^j(\theta _{i,j})\) . Each panel represents a circuit layout. \(R_X\) , \(R_Y\) , and \(R_Z\) denote rotation gates around the X -axis, Y -axis, and Z -axis, respectively. \(M(\kappa ,\mu ,\nu )=R_Z(\kappa )R_X(-{\pi }/2)R_Z(\mu )R_X({\pi }/2)R_Z(\nu )\) is the single-qubit building block of the arbitrary SU (4) gate.

Table  1 shows the expressibilities of the unitary module \(U_i^j(\theta _{i,j})\) and of the BMERA with 8-qubit input built from \(U_i^j(\theta _{i,j})\) . Each column corresponds to one circuit layout in Fig.  4 . The first row shows the expressibility of the unitary module \(U_i^j(\theta _{i,j})\) , and the second row shows the expressibility of the BMERA built from it. The fidelity distribution is obtained from 10,000 samples. We find that the higher the expressibility of \(U_i^j(\theta _{i,j})\) , the higher the expressibility of the corresponding BMERA. Circuit (3) has higher expressibility than circuit (2), and circuit (7) has higher expressibility than circuit (6). This is consistent with the fact that controlled- \(R_X\) has higher expressibility than controlled- \(R_Z\) 26 . Circuit (8) has the highest expressibility among all circuit layouts. Figure  5 shows the histograms of the fidelity distributions of circuit (8) and of the BMERA built from it. The Hilbert space dimension of the BMERA with 8-qubit input is 256; this larger dimension concentrates \(P_{h}(F_h)\) and \(P_{b}(F_b;\theta )\) near 0, so the X -axis of Fig.  5 only shows the range [0, 0.1]. The simulation results show that when the parameterized quantum circuit has high expressibility, its fidelity distribution is close to that of the Haar-random unitary.

figure 5

The histograms of the fidelity distributions. Panels ( a – b ) show the histograms of the fidelity distributions of the circuit (8) and the corresponding BMERA. The orange line represents the fidelity distribution of the Haar-random unitary. The fidelity distributions of the circuit (8) and BMERA are close to the fidelity distributions of the corresponding Haar-random unitary, respectively.

Numerical simulations and discussions

In this section, we evaluate the performance of the hybrid quantum-classical classification model using the TensorFlow Quantum (TFQ) framework. First, we demonstrate the accuracy of the proposed classification model on classical classification tasks. Next, we verify the accuracy of these classification tasks in noisy environments. Finally, we demonstrate the accuracy of the proposed classification model on cluster state excitation discrimination tasks.

Classical data classification

The MNIST dataset is widely used in machine learning and consists of 60,000 training samples and 10,000 test samples. The samples are \(28\times 28\) grayscale images, each representing a handwritten digit from 0 to 9. Our simulations primarily focus on binary classification tasks, where we select two categories of handwritten digits for the training set. Due to the limited number of qubits and shallow circuits of NISQ devices, the samples are reduced to 8-dimensional vectors using principal component analysis (PCA).
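
A data-preparation sketch along these lines is shown below; the choice of digit classes and the rescaling to [-1, 1] for subsequent qubit encoding are illustrative assumptions, while the PCA reduction to 8 components follows the text.

```python
import numpy as np
from sklearn.decomposition import PCA
from tensorflow.keras.datasets import mnist

(x_train, y_train), _ = mnist.load_data()
mask = np.isin(y_train, (3, 6))                     # any two digit classes
x = x_train[mask].reshape(-1, 28 * 28) / 255.0
y = (y_train[mask] == 6).astype(int)

x8 = PCA(n_components=8).fit_transform(x)           # 8 principal components per image
x8 = 2 * (x8 - x8.min(0)) / (x8.max(0) - x8.min(0)) - 1   # rescale to [-1, 1]
```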

We first analyze the accuracy of the binary classification task. Besides the 8 circuit layouts shown in Fig.  4 , we introduce an alternative circuit layout, wherein the unitary modules of the first sub-layer are implemented by circuit (4) and the unitary modules of the second sub-layer are implemented by circuit (5). We adopt the adaptive moment estimation (Adam) method 28 to train the hybrid quantum-classical classification model, with a training batch size of 32 and a learning rate of 0.01. Table  2 shows the accuracies and standard deviations of various binary classification tasks, and Fig.  6 shows them as a bar chart.

figure 6

The accuracies and standard deviations (%) for classification models based on different circuit layouts in bar chart.
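
For reference, the following sketch shows how such a hybrid model is typically assembled and trained in TensorFlow Quantum with the optimizer settings quoted above; `bmera_circuit`, `readout_ops`, and `encode` are placeholders for the symbolic BMERA ansatz, the Pauli-Z readouts, and the encoding circuit, and the single dense layer is only a stand-in for the classical network.

```python
import tensorflow as tf
import tensorflow_quantum as tfq

def build_model(bmera_circuit, readout_ops):
    """PQC layer evaluates the readout expectations on the BMERA ansatz; a dense
    layer plays the role of the classical network f_nn."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(), dtype=tf.string),   # serialized data circuits
        tfq.layers.PQC(bmera_circuit, readout_ops),          # expectation values E(theta)
        tf.keras.layers.Dense(1, activation="sigmoid"),      # predicted label
    ])

# Training setup matching the text: Adam optimizer, batch size 32, learning rate 0.01.
# x_circuits = tfq.convert_to_tensor([encode(x) for x in x8])
# model = build_model(bmera_circuit, readout_ops)
# model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
#               loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(x_circuits, labels, batch_size=32)
```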

The accuracy of the classifier based on the alternate circuit layout falls between those of the classifiers based on circuits (4) and (5); in certain cases it approaches or even exceeds the higher of the two. Digits 4 and 9 share similar local features, and many detailed features are lost during dimensionality reduction, consequently yielding lower accuracy for this classification task. Circuit (4) has the poorest expressibility, so the classification model based on it has the lowest accuracy among all models. Except for the model based on circuit (4) and the task of classifying digits 4 and 9, all accuracies are no less than 94%.

Comparing Tables  1 and  2 , we observe a correlation between accuracy and the expressibility of BMERA: in general, higher expressibility correlates with higher accuracy 26 . In most binary classification tasks, there is little variation in accuracy among classification models based on circuits (6), (7), and (8). Although circuit (8) has the highest expressibility, its accuracy is lower than that of models based on circuits (6) and (7) in certain classification tasks. This discrepancy is primarily attributed to circuit (8) requiring excessive parameters, leading to overfitting. Generally, BMERA models with higher expressibility require more parameters and more complex circuit connections. Circuit layouts should therefore be chosen based on circuit scale and accuracy requirements in practical tasks. If there is minimal difference in accuracy among different circuit layouts, we select the BMERA model with fewer parameters, as it is easier to implement on NISQ devices.

In the hybrid quantum-classical classification model, the number of parameters in BMERA grows polynomially with the number of qubits. To reduce the number of parameters, the unitary modules in the same sub-layer adopt the same parameters, drawing inspiration from the shared filter weights within a CNN layer that give translation invariance. Table  3 displays the accuracies and standard deviations for classification models incorporating translation invariance, and Fig.  7 shows them as a bar chart. Comparing Tables  2 and  3 , classification models without translation invariance exhibit higher accuracy but require more parameters than those with translation invariance. For classification models based on circuits (6), (7), and (8), the differences in accuracy between the variants with and without translation invariance are minor.

figure 7

The accuracies and standard deviations (%) for classification models based on different circuit layouts (translation invariance).

Table  4 shows the accuracies and standard deviations of the hybrid quantum-classical classification model based on BMERA (abbreviated as HBMERA), together with those of quantum classification models based on TTN 13 , MERA 13 , and BMERA. Figure  8 shows the accuracies and standard deviations as a bar chart.

figure 8

The accuracies and standard deviations (%) of the TTN, MERA, BMERA, and HBMERA classification models.

Figure  9 shows the accuracies of the above four classification models based on circuit (5). The HBMERA classification model exhibits the highest accuracy across all classification tasks. Moreover, the accuracy of the BMERA classification model is lower than that of the HBMERA classification model but higher than those of the TTN and MERA classification models.

figure 9

The accuracies of classification models based on TNs. Panels ( a – f ) show the accuracies for 6 classification tasks. The blue, orange, green, and red lines represent the accuracies of the TTN, MERA, BMERA, and HBMERA classification models, respectively.

At present, several classification models have been implemented using tensor networks, such as the Unitary Tree Tensor Network (UTTN) 29 , the Residual Matrix Product State (RMPS) 30 , and Projected Entangled Pair States (PEPS) 31 . The core idea of these models is to view tensor networks as a special type of network, map data to tensor states, and achieve recognition and classification through tensor contraction. Essentially, these models are still implemented with classical computational methods. To further evaluate the performance of the BMERA and HBMERA classifiers, we conducted classification experiments on the MNIST and Fashion-MNIST datasets. Due to the limitations of currently available quantum circuit scales, we only implemented binary classification tasks. For the MNIST dataset, we selected classes 5 and 6, and for the Fashion-MNIST dataset, we selected classes 1 and 2. Tables  5 and  6 present the accuracies of various classification models applied to the MNIST and Fashion-MNIST datasets. These models include a 1-layer Neural Network (1-layer NN), a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), a Fully Connected Network (FCN), UTTN, RMPS, and PEPS, along with the proposed BMERA and HBMERA models.

From the simulation results, we observe that the accuracies of the BMERA and HBMERA models are lower than those of commonly used neural networks and TN models. This is primarily due to the scale limitations of existing quantum circuits, which make it challenging to process high-dimensional data. Dimensionality reduction is required before classical data can be encoded into quantum states, which impacts the overall accuracy of the algorithm.

Classical data classification under noise environment

In this subsection, we simulate depolarization noise acting on an 8-qubit quantum system and compare classification accuracies in noisy and noiseless environments. Under depolarization noise, the state \(\rho\) is replaced by the mixed state \({\widetilde{\rho }}=sI/2+(1-s)\rho\) , i.e., with probability s the state becomes the maximally mixed state I /2 and with probability \(1-s\) it remains \(\rho\) . s is restricted to the small interval [0, 0.05] so that the state \(\rho\) undergoes only minimal alteration. Tables  7 and  8 show the accuracies and standard deviations of the HBMERA classification model based on circuit (5), with depolarization noise acting on the initial state and on the whole quantum system, respectively. Figures  10 and 11 show the corresponding accuracies and standard deviations as bar charts. Figure  12 shows the accuracies of the binary classification task in noisy and noiseless environments.
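
The noise model is easy to reproduce directly on a density matrix; the sketch below applies the mixture written above, generalized to an n-qubit maximally mixed state I/2^n.

```python
import numpy as np

def depolarize(rho, s):
    """rho_tilde = s * I/d + (1 - s) * rho for a d-dimensional density matrix
    (d = 2^n for n qubits; the text writes I/2 for the single-qubit case)."""
    d = rho.shape[0]
    return s * np.eye(d) / d + (1.0 - s) * rho

rho = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)   # |0><0| as a toy input
rho_noisy = depolarize(rho, s=0.05)                        # s restricted to [0, 0.05]
```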

figure 10

The accuracies and standard deviations (%) of the HBMERA classification model with depolarization noise acting on input states.

figure 11

The accuracies and standard deviations (%) of the HBMERA classification model with depolarization noise acting on the whole quantum system.

The simulation results indicate that, for the same binary classification task, the accuracy discrepancy between noisy and noiseless environments ranges from 0 to 0.05. The maximum discrepancy, 0.0338, is observed in the task of distinguishing digits 4 and 9; notably, this task also exhibits the lowest accuracy in a noiseless environment. Interestingly, in certain instances the mean accuracies in noisy environments surpass those in noiseless environments. The results suggest that depolarization noise minimally affects classification accuracy when s is below the threshold of 0.05. Classification accuracy changes little compared to the noiseless environment, regardless of whether the noise affects the input state or the whole system. Thus, the HBMERA classification model demonstrates good robustness under depolarization noise.

figure 12

The accuracies of the HBMERA classification model under noise and noiseless environments. Panels ( a – b ) show the accuracies of classifying digits 4 and 5 with depolarization noise acting on input states and the whole quantum system, respectively. Solid lines represent the accuracies under noiseless environments, and dotted lines represent the accuracies under depolarization noise environments.

Quantum state discrimination

In this subsection, we evaluate the performance of the HBMERA classification model in discriminating cluster state excitations. Cluster states are highly entangled states that serve as common initial states for measurement-based quantum computation. However, due to their high dimensionality, cluster states require exponentially increasing resources for data processing as the number of qubits grows. Consequently, discriminating cluster state excitations using classical computers is challenging 16 . We conduct a discrimination experiment on cluster state excitations. The preparation process for cluster states is as follows:

Initialize the state \(|0\rangle ^n\) , where n represents the number of qubits. For an 8-qubit cluster state, the initial state is \(|0\rangle ^8\) .

Apply Hadamard gates to each qubit to create a superposition state \(|+\rangle ^{8}\) , where \(|+\rangle =(|0\rangle +|1\rangle )/\sqrt{2}\) .

Apply Controlled-Z (CZ) gates between adjacent qubits. A CZ gate applies a phase flip (Z operation) to the target qubit depending on the state of the control qubit. This operation can be written as

For qubits i and \(i+1\) , CZ operation is described as

where \(I_{i+1}\) represents identity operation applied to the \((i+1)\) th qubit and \(Z_{i+1}\) represents Z operation applied to the \((i+1)\) th qubit.

Repeat applying CZ gates sequentially between adjacent qubits until all desired pairs have been operated upon.

In the excitation state preparation process, the RX gate acts on each qubit with a random rotation angle within the range of \([-\pi ,\pi ]\) . Subsequently, labels are assigned to cluster states based on the rotation angle: if the angle is between \(-\pi /2\) and \(\pi /2\) , the label is assigned as 1, indicating excitation; otherwise, it is assigned as -1, indicating no excitation.
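
A generation sketch for this quantum dataset is given below; whether the paper draws one shared rotation angle per sample or an independent angle per qubit is not fully specified, so the single shared angle used here is an assumption.

```python
import numpy as np
import cirq

def excitation_sample(n_qubits, rng):
    """Cluster state (H on all qubits, CZ between neighbours) followed by R_x
    with a random angle in [-pi, pi]; label +1 if the angle lies in (-pi/2, pi/2),
    else -1. A single shared angle per sample is assumed here."""
    qubits = cirq.LineQubit.range(n_qubits)
    angle = rng.uniform(-np.pi, np.pi)
    circuit = cirq.Circuit(
        [cirq.H(q) for q in qubits],
        [cirq.CZ(qubits[i], qubits[i + 1]) for i in range(n_qubits - 1)],
        [cirq.rx(angle)(q) for q in qubits],
    )
    label = 1 if -np.pi / 2 < angle < np.pi / 2 else -1
    return circuit, label

rng = np.random.default_rng(0)
train_set = [excitation_sample(8, rng) for _ in range(800)]
test_set = [excitation_sample(8, rng) for _ in range(200)]
```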

In this simulation, the training set includes 800 cluster states and the test set includes 200. The training process consists of 25 epochs with a batch size of 32. Table  9 shows the accuracies and standard deviations for discriminating 8-qubit and 16-qubit cluster state excitations using the TTN 13 , MERA 13 , BMERA, and HBMERA classification models, with each datum obtained from 5 simulations. Figure  13 shows the corresponding accuracies and standard deviations as a bar chart, and Fig.  14 shows the accuracies of discriminating 8-qubit and 16-qubit cluster state excitations. The simulation results reveal that the HBMERA classification model achieves mean accuracies of 99.06% and 97.92% in discriminating 8-qubit and 16-qubit cluster state excitations, respectively, surpassing the TTN, MERA, and BMERA classification models. The HBMERA classification model thus demonstrates superior accuracy in discriminating cluster state excitations.

figure 13

The accuracies and standard deviations (%) for discriminating cluster state excitation.

figure 14

The accuracies of discriminating whether a cluster state is excited. Panels ( a , b ) show the accuracies for discriminating 8-qubit and 16-qubit cluster state excitations. The solid lines represent the accuracies of the HBMERA classification model. The blue, orange, and green dotted lines represent the accuracies of the TTN, MERA, and BMERA classification models, respectively.

Conclusions and future work

Hybrid quantum-classical algorithms offer a promising avenue for integrating NISQ devices into machine learning tasks. Leveraging the entanglement benefits of tensor networks, we propose a hybrid quantum-classical classification model based on the branching multi-scale entanglement renormalization ansatz (BMERA). This model enhances its ability to approximate nonlinear functions through qubit encoding and an optimized circuit structure. Additionally, the proposed HBMERA model achieves a shallower circuit depth by shifting some computational complexity from quantum devices to classical neural networks, making it more compatible with NISQ devices. Simulation results demonstrate that the proposed classification model achieves higher accuracy than TTN- and MERA-based quantum classification models on classical and quantum data classification tasks and exhibits robustness to depolarization noise. However, its accuracy is still lower than that of classical neural networks and classical TN models. This is mainly because the architectures and optimization methods of classical classification models have matured over years of development, whereas quantum computers face limitations in qubit number, circuit depth, and error correction. These limitations necessitate preprocessing operations, such as dimensionality reduction of classical data, before execution, which constrains the potential for further improving classification accuracy.

With advancements in quantum computers regarding scale and fault tolerance, these limitations will be greatly alleviated, potentially giving quantum models a competitive edge. For highly complex tasks, quantum models can more effectively capture global features and intricate relationships within the data, representing complex functions that are challenging for classical neural networks. This capability stems from the utilization of quantum superposition and entangled states, which may enable quantum neural networks to achieve higher accuracy in certain tasks. Currently, research on quantum classification models based on tensor network structures is in its early stages. The framework and architecture of quantum models require further optimization, which is an important direction for our future research. As the field progresses, we anticipate that improvements in quantum model design will lead to enhanced performance and new capabilities in machine learning.


Data availability

Data is provided within the manuscript or supplementary information files. Codes will be made available on request.

References

Rebentrost, P., Mohseni, M. & Lloyd, S. Quantum support vector machine for big data classification. Phys. Rev. Lett. 113, 130503 (2014).

Schuld, M. & Killoran, N. Quantum machine learning in feature Hilbert spaces. Phys. Rev. Lett. 122, 040504 (2019).

Wang, Y., Lin, K.-Y., Cheng, S. & Li, L. Variational quantum extreme learning machine. Neurocomputing 512 , 83–99 (2022).

Wang, Y., Wang, Y., Chen, C., Jiang, R. & Huang, W. Development of variational quantum deep neural networks for image recognition. Neurocomputing 501 , 566–582 (2022).

Chen, Y., Wang, C., Guo, H., Gao, X. & Wu, J. Accelerating spiking neural networks using quantum algorithm with high success probability and high calculation accuracy. Neurocomputing 493 , 435–444 (2022).

Martín-Guerrero, J. D. & Lamata, L. Quantum machine learning: A tutorial. Neurocomputing 470 , 457–461 (2022).

Huang, R., Tan, X. & Xu, Q. Variational quantum tensor networks classifiers. Neurocomputing 452 , 89–98 (2021).

Cohen, N., Sharir, O. & Shashua, A. On the expressive power of deep learning: A tensor analysis. In Conference on learning theory , 698–728 (PMLR, 2016).

Stoudenmire, E. & Schwab, D. J. Supervised learning with tensor networks. Adv. Neural Inf. Process. Syst. 29 , 4799 (2016).

Wall, M. L., Abernathy, M. R. & Quiroz, G. Generative machine learning with tensor networks: Benchmarks on near-term quantum computers. Phys. Rev. Res. 3 , 023010 (2021).

Cheng, S., Wang, L., Xiang, T. & Zhang, P. Tree tensor networks for generative modeling. Phys. Rev. B 99 , 155131 (2019).

Benedetti, M. et al. A generative modeling approach for benchmarking and training shallow quantum circuits. npj Quantum Inf. 5 , 1–9 (2019).

Grant, E. et al. Hierarchical quantum classifiers. npj Quantum Inf. 4 , 1–8 (2018).

Pesah, A. et al. Absence of barren plateaus in quantum convolutional neural networks. Phys. Rev. X 11 , 041011 (2021).

Vidal, G. Class of quantum many-body states that can be efficiently simulated. Phys. Rev. Lett. 101 , 110501 (2008).

Broughton, M. et al. Tensorflow quantum: A software framework for quantum machine learning. arXiv preprint arXiv:2003.02989 (2020).

Verdon, G., Pye, J. & Broughton, M. A universal training algorithm for quantum deep learning. arXiv preprint arXiv:1806.09729 (2018).

Romero, J. & Aspuru-Guzik, A. Variational quantum generators: Generative adversarial quantum machine learning for continuous distributions. Adv. Quantum Technol. 4 , 2000003 (2021).

Skolik, A., McClean, J. R., Mohseni, M., van der Smagt, P. & Leib, M. Layerwise learning for quantum neural networks. Quantum Mach. Intell. 3 , 1–11 (2021).

MacCormack, I., Delaney, C., Galda, A. & Narang, P. Branching quantum convolutional neural networks: A variational ansatz with mid-circuit measurements. Bull. Am. Phys. Soc. 4 (1), 013117 (2021).

Li, W. & Deng, D.-L. Recent advances for quantum classifiers. Sci. China Phys. Mech. Astron. 65, 220301 (2022).

Cong, I., Choi, S. & Lukin, M. D. Quantum convolutional neural networks. Nat. Phys. 15 , 1273–1278 (2019).

Harrow, A. W. & Napp, J. C. Low-depth gradient measurements can improve convergence in variational hybrid quantum-classical algorithms. Phys. Rev. Lett. 126 , 140502 (2021).

Schuld, M., Bergholm, V., Gogolin, C., Izaac, J. & Killoran, N. Evaluating analytic gradients on quantum hardware. Phys. Rev. A 99 , 032331 (2019).

Hubregtsen, T., Pichlmeier, J., Stecher, P. & Bertels, K. Evaluation of parameterized quantum circuits: On the relation between classification accuracy, expressibility, and entangling capability. Quantum Mach. Intell. 3 , 1–19 (2021).

Sim, S., Johnson, P. D. & Aspuru-Guzik, A. Expressibility and entangling capability of parameterized quantum circuits for hybrid quantum-classical algorithms. Adv. Quantum Technol. 2 , 1900070 (2019).

Peruzzo, A. et al. A variational eigenvalue solver on a photonic quantum processor. Nat. Commun. 5 , 1–7 (2014).

Kingma, D. P. & Ba, J. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014).

Liu, D. et al. Machine learning by unitary tensor network of hierarchical tree structure. New J. Phys. 21 , 073059 (2019).

Meng, Y.-M., Zhang, J., Zhang, P., Gao, C. & Ran, S.-J. Residual matrix product state for machine learning. SciPost Phys. 14 , 142 (2023).

Cheng, S., Wang, L. & Zhang, P. Supervised learning with projected entangled pair states. Phys. Rev. B 103 , 125117 (2021).

Acknowledgements

This work was supported by the Open Fund of Advanced Cryptography and System Security Key Laboratory of Sichuan Province (Grant No. SKLACSS-202108), Scientific Research Fund of Zaozhuang University (No.102061901), Shandong Province College Student Innovation and Entrepreneurship Training Program Project (S202310904040).

Author information

Authors and affiliations.

College of Information Science and Engineering, ZaoZhuang University, Zaozhuang, 277160, China

Yan-Yan Hou & Xin-Yu Liu

School of Cyberspace Security, Beijing University of Posts and Telecommunications, Beijing, 100876, China

Jian Li

Network Center, ZaoZhuang University, Zaozhuang, 277160, China

Tao Xu

Contributions

Y.-Y.H. wrote the manuscript and conducted the experiment(s), L.J. conceived the experiment(s), X.T. analysed the results, X.-Y.L. reviewed the manuscript.

Corresponding author

Correspondence to Tao Xu .

Ethics declarations

Competing interests.

The authors declare no competing interests.

About this article

Cite this article.

Hou, YY., Li, J., Xu, T. et al. A hybrid quantum-classical classification model based on branching multi-scale entanglement renormalization ansatz. Sci Rep 14 , 18521 (2024). https://doi.org/10.1038/s41598-024-69384-6

Received : 13 March 2024

Accepted : 05 August 2024

Published : 09 August 2024

DOI : https://doi.org/10.1038/s41598-024-69384-6

Keywords

  • Tensor networks
  • Hybrid quantum-classical classification model
  • Branching multi-scale entanglement renormalization ansatz (BMERA)
  • Quantum machine learning
