Simplicity is the key to the interpretation of physics. Nothing is simpler, in this analysis, than supposing the existence of some "hidden" parameter, invisible and not measurable, which is an integral part of a pair of photons and which dictates, at the moment of their creation: "you are oriented to the east" or "you are oriented to the west." This analysis requires us to introduce "hidden variables", a move that is debatable in physics but allows everything to be explained, very elegantly, in realistic terms. The pair of photons has its own objective reality that describes it completely. Part of this reality is unknowable, but never mind: the problem is only human; nature is safe.
We have two options: 1) quantum mechanics is inherently probabilistic; 2) quantum mechanics is not inherently probabilistic, but deterministic. The first position is that of the so-called "Copenhagen interpretation", still widely accepted by physicists, while the second was that of Einstein, Podolsky and Rosen (EPR) and of the "hidden variables". Subsequently, Bell showed that local hidden variables cannot reproduce all the predictions of quantum mechanics. In 1964 John Bell pointed the way to an experimental test of the existence of hidden variables, and the subsequent experiments, above all those of the French group of Alain Aspect, demonstrated the full validity of quantum mechanics.
The second theoretical position, then, would no longer seem sustainable. Yet it is, if we consider the fact that "ontological materiality" turns out to be broader than "physical materiality". There are no additional variables that can enter into the physical calculation, but there are material factors that physics fails to consider and that have an impact on theorizing. These factors determine the overall behavior of matter, which therefore appears inherently probabilistic. It can be said that Einstein was right: the hidden variables exist, only they lurk outside of physics, in ontology.
Many physicists (Einstein above all) have always refused to accept that indeterminacy is an inherent feature of physical reality. Consequently, they preferred to assume that the description provided by quantum mechanics was simply incomplete. Their reasoning, in practice, runs as follows: even at the microscopic level physical reality continues to be deterministic; it is only that we cannot know the exact values of the state variables, and so we are forced into an indeterministic description. To explain this failure, many proponents of determinism (starting with Einstein himself) introduced the so-called "hidden variables": at the microscopic level there would be some factor, not yet known, that prevents us from giving a deterministic description. The moment we knew these factors, we could provide a completely deterministic description.
For many years the debate between the advocates of hidden variables and the proponents of intrinsic indeterminism remained on a purely metaphysical level. In 1964, however, the physicist J.S. Bell derived a famous inequality (Bell's theorem) that made it possible to carry onto experimental ground what until then had been a metaphysical discussion. This inequality, in practice, leads us to expect different experimental results depending on whether the hypothesis of hidden variables (at least restricted to the so-called "local" theories) is true or not.
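To make the content of the inequality concrete, the form most often tested in the laboratory is the CHSH variant (a standard reformulation, not Bell's original 1964 expression). Here a, a′ and b, b′ are two pairs of analyzer settings and E is the measured correlation between the outcomes on the two sides:

\[
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 \quad \text{for any local hidden-variable theory},
\]
\[
|S|_{\mathrm{QM}} \le 2\sqrt{2} \approx 2.83 \quad \text{for entangled pairs, according to quantum mechanics}.
\]

An experiment that finds S significantly above 2 therefore decides the question against the local hidden-variable theories.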
Now, the Heisenberg principle does not merely establish our inability to know at the same time the values of the position and momentum of a particle. Those values are not established before a measurement is made: they are absolutely and inherently indeterminate.
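For reference, the quantitative form of the principle bounds the product of the standard deviations of position and momentum:

\[
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}.
\]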
Einstein's objections to quantum mechanics made sense, because he was perfectly aware that quantum mechanics is incompatible with determinism. However, his obstinately deterministic views and his attempts to defend them (the hidden variables) have not stood the test of the facts.
Microscopic reality is inherently indeterminate. What is surprising, however, is that macroscopic reality is instead largely deterministic. Explaining this apparent contradiction is a fascinating challenge of theoretical physics. An interesting attempt at a solution appears to be the one provided by the three Italian physicists G. Ghirardi, A. Rimini and T. Weber (Physical Review D 34, 470, 1986).
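The flavor of their proposal (the GRW spontaneous-localization model) can be conveyed by a rough estimate using the parameter values usually quoted for it; the figures below are indicative and are not taken from the article itself. Each particle undergoes a spontaneous localization at an extremely low rate, but the rates add up over the particles of a macroscopic body:

\[
\lambda \approx 10^{-16}\ \mathrm{s^{-1}} \ \text{per particle}, \qquad
N \approx 10^{23} \ \Rightarrow\ N\lambda \approx 10^{7}\ \mathrm{s^{-1}},
\]

so an isolated particle can remain in a superposition for hundreds of millions of years, while a macroscopic object is localized within a tiny fraction of a second: microscopic indeterminacy and apparent macroscopic determinism coexist in a single dynamics.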
So, in this context it seemed obvious that the description of the states of a physical system offered by quantum mechanics was incomplete, and that such incompleteness was responsible for the indeterministic character of the theory. In other words, it was assumed that quantum mechanics is indeterministic only because our level of knowledge does not put us in a position to "see" some additional variable capable of "completing" the description of the physical system provided by quantum mechanics. According to this conjecture, if we were able to identify these new variables, currently "hidden", we would recover a level of description deeper than the quantum level, and at that level determinism could be restored.
In fact, the enigma of the "hidden variables" was not solved by a logical-deductive approach, as Popper might have wished, or it was only partially so.
As already said, "in 1964 the issue reached a crucial turning point: J. Bell showed that for a large family of hidden-variable theories, the so-called local theories, it is impossible to reproduce, by averaging over the hidden variables, all the predictions of quantum mechanics." "Bell's result had the great merit of bringing onto experimental ground the theme of possible deterministic completions of quantum mechanics, and it aroused great interest in the realization of experiments sensitive to the discrepancies between the predictions of quantum mechanics and those of the local hidden-variable theories." (Enrico Beltrametti)
In 1981, Alain Aspect was able to carry out the first of a series of high-quality experiments. In practice, the experiments showed that Einstein had been wrong in suggesting the idea of hidden variables.
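Purely as an illustrative sketch (not a reconstruction of Aspect's apparatus), the short simulation below compares a naive local hidden-variable model for photon pairs, in which each pair carries a predetermined polarization angle, with the quantum-mechanical prediction for the polarization correlation. The response function and the settings are textbook choices assumed here for illustration; the hidden-variable model stays at the CHSH bound of 2 (up to sampling noise), while quantum mechanics reaches about 2.83.

```python
import numpy as np

rng = np.random.default_rng(0)

def lhv_correlation(a, b, n=200_000):
    """Correlation E(a, b) in a naive local hidden-variable model: each
    photon pair carries a shared, predetermined polarization angle lam,
    and each side answers +1 if its analyzer lies within 45 degrees of
    lam (mod 180 degrees), otherwise -1."""
    lam = rng.uniform(0.0, np.pi, n)          # hidden variable, one per pair
    def outcome(setting):
        # angular distance between lam and the analyzer, folded into [0, pi/2]
        delta = np.abs(((lam - setting) + np.pi / 2) % np.pi - np.pi / 2)
        return np.where(delta < np.pi / 4, 1, -1)
    return float(np.mean(outcome(a) * outcome(b)))

def qm_correlation(a, b):
    """Quantum-mechanical prediction for polarization-entangled photon pairs."""
    return np.cos(2 * (a - b))

def chsh(E):
    """CHSH combination S at the standard settings that maximize the quantum value."""
    a, a2, b, b2 = 0.0, np.pi / 4, np.pi / 8, 3 * np.pi / 8
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print("S, local hidden variables:", round(chsh(lhv_correlation), 3))  # ~ 2.0 (the local bound)
print("S, quantum mechanics:     ", round(chsh(qm_correlation), 3))   # ~ 2.828
```

Whatever deterministic response function is assigned to the hidden variable, Bell's theorem guarantees that the first number cannot be pushed significantly beyond 2; an experimental value close to 2.8 therefore rules out this whole class of models.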
As for Popper, we could say that he lost one game: the one over quantum logic (LQ).
The criticism leveled at Popper was wrong from a logical point of view, but in many ways it had some basis. Popper did not want to admit the weakness of classical logic that LQ makes explicit. For Popper, logic was to remain an ‘a priori’ science, having as its main feature absolute independence from any content. He therefore refused to consider the possibility of choosing logics different from classical logic, more suitable than it to the empirical character of particular situations.
Already in the Logic of Scientific Discovery, completed in 1934 and thus prior to the paper of Birkhoff and von Neumann, Popper had anticipated: "... replacing the word 'true' with the word 'likely' and the word 'false' with the word 'unlikely', nothing is gained."
However, Popper earned another, no less important, point. The revolutionary discovery of Bell and Aspect did not come from pure inductivism, but from experiments carried out in the light of a theory already formulated ‘a priori’, hence from a hypothesis subjected to strict scrutiny by identifying the elements and data that could refute it. At least on this ground, Popper took an important rematch.
At the time of the article on Einstein's death, the controversy was still strong and "philosophical" issues carried great weight, so much so that an American physicist fell victim to McCarthyism and lost his job for supporting a deterministic model with hidden variables. Today we tend to minimize the importance of our imperfect knowledge of the subject; theories are used as they are, reaping their fruits without worrying about a coherent understanding of the underlying laws. Most physicists no longer interpret the principle of indeterminacy in a metaphysical way. It is regarded as a simple impossibility of knowing at the same time the position and momentum of the particles in a system that is nonetheless felt to be completely deterministic. After all, beyond the supposed wave-particle duality, in the macroscopic world too there is a kind of uncertainty: for example, I cannot measure my speed with an accuracy better than my reaction time in pressing the stopwatch button allows.
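To put an indicative number on that everyday analogy (the figures are invented purely for illustration): timing a 100 m run by hand, with a reaction time of about 0.2 s on a total of roughly 12 s, gives

\[
v = \frac{100\ \mathrm{m}}{12\ \mathrm{s}} \approx 8.3\ \mathrm{m/s},
\qquad
\frac{\Delta v}{v} \approx \frac{\Delta t}{t} = \frac{0.2\ \mathrm{s}}{12\ \mathrm{s}} \approx 1.7\%,
\]

an uncertainty of classical, practical origin, entirely different in nature from the quantum indeterminacy discussed above.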