The three Laureates of the 2022 Nobel Prize in Physics, awarded for their experiments with entangled particles that established Bell's inequality violations and pioneered quantum information science. From left to right: Alain Aspect, John Clauser, and Anton Zeilinger. Credit: The Nobel Prize in Physics, 2022
We are thrilled about this year's physics Nobel Prize announcement, because its topic is highly relevant in the context of our Unified Physics Theory.
Quantum entanglement, famously dismissed by Albert Einstein as "spooky action at a distance," has its origin in a thought experiment by Albert Einstein, Boris Podolsky and Nathan Rosen, based on a discussion about the apparent inability of quantum mechanics to provide a complete description of reality. That apparent incompleteness was tied to the Heisenberg uncertainty principle, which states that the position and momentum of a quantum particle cannot both be measured simultaneously with arbitrary precision: increasing the certainty in one variable decreases the certainty in the other accordingly, because in quantum mechanics there is a limit below which these variables are no longer independent of each other.
In this thought experiment, known as the EPR paradox, two particles A and B are prepared in a hypothetical entangled (correlated) state. Although it is impossible to measure both the momentum and the position of particle B exactly, one can measure the exact position of particle A and deduce, by calculation, the exact position of particle B. Likewise, measuring the exact momentum of particle A yields the exact momentum of particle B. Exact values of position or momentum can therefore be known for particle B without perturbing or interacting with it. And since information cannot travel faster than the speed of light, the possibility that both particles communicate in order to preserve the correlation would violate Einstein's theory of relativity.
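To put a number on the trade-off the EPR argument hinges on, here is a minimal sketch of the Heisenberg bound Δx·Δp ≥ ħ/2. The function name and the example width are ours, chosen for illustration:

```python
import math

# Reduced Planck constant, in J·s (CODATA value, rounded)
HBAR = 1.054571817e-34

def min_momentum_uncertainty(delta_x: float) -> float:
    """Smallest momentum spread (kg·m/s) compatible with
    the Heisenberg relation Δx · Δp ≥ ħ / 2."""
    return HBAR / (2.0 * delta_x)

# Example: localizing a particle to 1 ångström (1e-10 m)
dp = min_momentum_uncertainty(1e-10)
# dp is on the order of 5e-25 kg·m/s: sharpening position blurs momentum
```

Halving `delta_x` doubles the minimum `dp`, which is exactly the trade-off the EPR pair seems to sidestep: each quantity of particle B is inferred without measuring B at all.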
Furthermore, Einstein's main argument was independent of the choice of measurement (position or momentum) made on particle A. It focused instead on the fact that the phenomenon violates locality: he believed that the elements of reality are local, i.e., attached to a definite point in spacetime that can only be influenced by events within the past light cone of that point. Hence the state of B could not depend on a measurement of A. If we assume locality, quantum states cannot be in one-to-one correspondence with the real states; therefore quantum theory is not complete, and local hidden variables must be at play. The idea that quantum mechanics was incomplete and driven by local hidden variables did not please Niels Bohr, one of the main founders of quantum mechanics, and the two debated the matter intensely.
Bohr argued that the states of quantum particles are not fixed a priori by some hidden variable that we are unable to measure, but that they truly are in a superposition of possible states until a measurement collapses the wave function into one of the possibilities. Einstein insisted on the hidden-variable theory, under which quantum weirdness would simply reflect our ignorance of those hidden variables; in the absence of a better model, however, physicists had to stick to the probabilistic, wave-function-collapse interpretation of quantum mechanics.
For more information about the history of entanglement, we recommend this video below.
The focus then turned to studying the nature of nonlocality in quantum mechanics. In 1951 David Bohm reformulated the paradox as the EPR-Bohm thought experiment using electron-positron spin pairs, and his hidden-variable theory predicted prominently nonlocal behavior. In 1964, John S. Bell explored whether it was indeed possible to solve the nonlocality problem with hidden variables. His research showed that the correlations found in the earlier versions of the paradox (EPR and EPR-Bohm) could indeed be explained in a local way with hidden variables. However, the correlations arising in his own version of the paradox, which established the so-called Bell inequalities, could not be explained by any local hidden-variable theory. This second result is known as Bell's theorem. Any local hidden-variable theory requires the measured correlations between the particles to stay below the threshold set by Bell's inequality; quantum mechanics predicts correlations above that threshold, so it would have to violate the inequality in order to preserve completeness (at least with respect to local hidden variables).
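The gap between the two predictions can be checked with a few lines of arithmetic. The sketch below uses the CHSH form of Bell's inequality and the standard quantum prediction E(a, b) = −cos(a − b) for spin measurements on a singlet pair; the measurement angles are the textbook optimal choice:

```python
import math

def E(a: float, b: float) -> float:
    """Quantum correlation of spin measurements on a singlet pair
    along directions a and b (angles in radians)."""
    return -math.cos(a - b)

def chsh(a: float, a2: float, b: float, b2: float) -> float:
    """CHSH combination of correlations; any local hidden-variable
    model must satisfy |S| <= 2."""
    return E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

# Standard optimal settings: a = 0, a' = π/2, b = π/4, b' = -π/4
S = chsh(0.0, math.pi / 2, math.pi / 4, -math.pi / 4)
# |S| = 2√2 ≈ 2.828 > 2: the quantum prediction exceeds the local bound
```

This |S| = 2√2 value (the Tsirelson bound) is what the Freedman-Clauser and Aspect experiments probed: measured correlations landed above 2, where no local hidden-variable assignment can reach.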
Versions of Bell's inequalities are experimentally testable. John Clauser and the late Stuart Freedman were the first to prove experimentally, in 1972, that entanglement was not just a thought experiment. Their findings violated Bell's inequality and ruled out local hidden variables, just like every experiment performed since then. The "spookiness" was real (although not the kind Einstein had proposed, based on hidden variables), because nonlocality was being proven; it was a fundamental aspect of nature. Alain Aspect and his team performed more accurate experiments in the early 1980s, confirming Clauser's findings in more robust and rigorous experimental setups and conditions.
Nevertheless, a critical point remains: ensuring complete randomness in the choice of measurement settings, so as to have reliable baselines against which to compare the correlations from entanglement. Only measurements that are truly independent of each other (with no possible communication between them whatsoever) allow one to rule out local hidden variables. This is where the work of Anton Zeilinger completed the picture.
Therefore, non-locality was proved beyond any doubt. Whether this non-locality resides in quantum mechanics itself or in some different kind of hidden-variable theory underlying it (non-local hidden variables) is still a matter of debate.
Yet a more fundamental question remains unsolved. Quantum mechanics predicted entanglement, but, as the title of the Science article in the image below claims, the mechanism explaining such behavior remains unexplained in current physics. It is now formally established that entangled particles are described by quantum states that cannot be factored into states of the individual particles: they become a single object. They seem to follow a kind of second-order uncertainty principle in which they are no longer independent of each other. But how does this happen?
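The non-factorability statement can be made concrete with a standard linear-algebra check: a two-qubit state factors into independent particles exactly when its 2×2 coefficient matrix has Schmidt rank 1 (one nonzero singular value). This is a generic sketch, not code from any of the cited works:

```python
import numpy as np

def schmidt_rank(state: np.ndarray) -> int:
    """Number of nonzero Schmidt coefficients of a two-qubit state
    given as a 4-component amplitude vector. Rank 1 means the state
    factors into independent particles; rank > 1 means entangled."""
    coeffs = state.reshape(2, 2)          # amplitudes c_ij of |i>|j>
    singular_values = np.linalg.svd(coeffs, compute_uv=False)
    return int(np.sum(singular_values > 1e-12))

# Product state |0>|0>: describable particle by particle
product = np.array([1.0, 0.0, 0.0, 0.0])

# Bell state (|00> + |11>)/sqrt(2): a single, non-factorable object
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
```

`schmidt_rank(product)` gives 1 while `schmidt_rank(bell)` gives 2: no assignment of separate states to the two particles reproduces the Bell state, which is precisely the "single object" behavior described above.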
Even though quantum mechanics' violation of Bell's inequalities seems to prove that it is a complete theory (at least with respect to local hidden variables), something is evidently lacking, since the theory is unable to provide a complete understanding of the mechanism from which entanglement originates. It predicts entanglement, but it does not explain how it happens. Einstein was right: quantum mechanics is not telling the whole story of the quantum world; it is not a complete theory.
And clearly something must be missing, since current mainstream physics has not achieved quantum gravity. It is logical to think that non-local variables and mechanisms are at play. This is where the generalized holographic model developed by Nassim Haramein really closes the topic. His work shows that there is a connection and an information flow between systems at very different scales, all obeying the Schwarzschild solution to Einstein's field equations for non-rotating, uncharged spherical black holes. Among these systems we find protons, stars, cosmological black holes, and the universe itself.
The same values obtained from the Schwarzschild solution for these systems are also obtained with the holographic mass solution derived by Haramein: a quantized equation in terms of units of volume, termed Planck Spherical Units (PSU), that discretizes space at the very fine Planck scale, with the possibility that each PSU functions as a wormhole termination. From his calculations, the surface of the proton holds on the order of 10⁴⁰ PSU or wormhole terminations, such that the volume information is not only the result of the information/entropy surface bound of the local environment, but may also be non-local, due to wormhole interactions like those proposed by the ER = EPR conjecture, in which black hole interiors are connected to each other through micro wormholes. This means that systems obeying the Schwarzschild condition (such as subatomic particles) are connected by wormhole terminations.
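The 10⁴⁰ figure can be reproduced as a back-of-the-envelope order-of-magnitude estimate. The sketch below assumes the proton's spherical surface is tiled with the equatorial discs of Planck-diameter spheres (disc radius l_p/2); the exact counting used in the holographic mass papers may differ, and the constants are rounded:

```python
import math

# Approximate SI constants, rounded; treat as order-of-magnitude inputs
PLANCK_LENGTH = 1.616e-35   # m
PROTON_RADIUS = 8.41e-16    # m (proton charge radius)

# Hypothetical counting: one equatorial disc of a Planck-sized sphere
# (radius l_p / 2) per surface unit
surface_area = 4.0 * math.pi * PROTON_RADIUS ** 2
disc_area = math.pi * (PLANCK_LENGTH / 2.0) ** 2
n_surface_units = surface_area / disc_area   # comes out near 10**40
```

With these inputs the count lands at a few times 10⁴⁰, matching the order of magnitude quoted above; any comparable surface-tiling convention changes only the O(1) prefactor.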
The fabric of spacetime is like a network, and the particles are the hubs of the network, the servers of the universal net. They are all connected and that connects all the scales.
- Nassim Haramein
Since Haramein shows that this discretization of space produces the events we call mass, energy, forces, and fields (see the section below for more information), through a mechanism that exhibits an inertia, or screening, of the information flow across scales, a network spanning scales is evidenced. This nexus of information exchange, which seems instantaneous at our scale, is responsible for the event we call entanglement. The mechanism is not spooky at all; on the contrary, its implications are truly mind-expanding.
It was about time for entanglement to receive the notoriety it deserved! Together with the 2020 Nobel Prize, awarded to Roger Penrose, Reinhard Genzel and Andrea Ghez for the discovery and detection of black holes, the whole picture starts to unveil before our eyes.
Could there be a connection between entanglement and black holes? Our RSF article entitled Galactic Engines addresses this question in detail and gives a robust affirmative answer. Not only does entanglement reveal a fundamental aspect of nature; it pierces directly into the nature of reality itself.
We now have experimental evidence of entanglement in macroscopic samples and at room temperature, and quantum phenomena have been observed in biological samples under physiological conditions. Quantum entanglement is no longer a curious extravagance or an exceptional behavior found only in extreme conditions; it may very well be at the heart of matter organization across scales …