Quantum mechanics involves phenomena very distant from our own experience, and therefore difficult to frame in our language. To deal with them, there exists a mathematical formalism, which gives the results observed in experiment. After you have familiarised yourself with it, you may obtain some degree of intuition, and find that you can, in some way, visualise, and hence in some sense ``understand'' the behaviour of quantum particles.
Finally, just a question: do you understand, in the same immediate way you would like to understand quantum mechanics, the behaviour of a heavy symmetrical top? The correct answer is probably that you do not, at least not until you have thoroughly reviewed the computations. After having done that, you may indeed derive an intuitive understanding of what a heavy top does, in terms of angular momentum and the like, but not before. It is no different in quantum mechanics.
Incidentally, Mermin, in a series of papers with ``quantum mysteries'' in the title, has very nicely shown why stochastic explanations of quantum mechanics will probably not work. Reading them should be a first step towards answering your first question.
The weirdness in QM isn't at all in its lack of intuitive insight (Unanschaulichkeit), since there is a mathematical formalism, as in any other theory, and such a formalism allows no intuitive insight either. At worst, it is logically inconsistent (the unitary and the projection postulates are incompatible); at best, what the mathematics describes can't be defined (what is a measurement and what is not? What behaves classically and what behaves quantally? When is there collapse and when not?). Even decoherence provides no explanation, since it rests on plain QM.
The state of affairs is not that it seemed weird before because it was new, and now is well understood. Rather, it seemed acceptable although new, and on closer examination the mathematics proved to be problematic, especially after the Einstein-Podolsky-Rosen paper, where they spotted the measurement problem while the other physicists, like Bohr, hadn't seen it. See the discussions at the Solvay congress.
Now, most physicists today prefer that QM be stainless, so they can publish, shut up and calculate. That's why, despite more and more publications, there is less and less progress. And many physicists monopolize the newsgroups, the forums and other public spaces in order to impose their convenient point of view. That's why we still see so many ingenuous people believing that QM isn't weird, while the contrary gets more and more evidence, like the violation of Bell's inequalities.
@Claude Pierre Massé: Why should decoherence not provide a reasonable answer? As you say, it rests on plain QM. This is its great merit. The search for mechanisms outside QM that ``explain'' the projection postulate has by and large failed. In particular, the violation of Bell's inequalities, showing that whatever such mechanisms might be, they must in some way be nonlocal, has greatly reduced the attractiveness of such explanations.
Decoherence, on the other hand, allows one to state that QM may, for all we know, be universally valid, while the projection postulate will unavoidably come very close to whatever we can observe using macroscopic objects, precisely because of decoherence.
I am not aware of anybody saying that QM is not weird. I believe most physicists would agree that the violation of Bell's inequalities is strange. But it is undoubtedly there, as experiments have shown.
Finally, it may be worthwhile to point out that the weirdness in the violation of Bell's inequalities may not so much be on the quantum side as on the classical one. The truly unacceptable feature of such experiments, as far as I can tell, is the fact that there are macroscopic objects, say detectors, clicking away with correlations that are utterly unexplainable at any kind of intuitive level. The behaviour of the quantum state, in comparison, has nothing unexpected about it.
Decoherence explains nonlocality through nonlocality, i.e., "plain" QM. It uses the projection postulate as well. In substance, it is a thermodynamical average, but without its justifications. In particular, the average across large distances, as in Bell-like experiments, doesn't converge. The fact that it gives classical probabilities doesn't mean that the physical process becomes classical, since it remains truly random. Only the way of calculating the probabilities changes, through a mathematical transformation.
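To make the averaging point concrete, here is a minimal numerical sketch (a toy illustration, not anyone's specific model): a qubit in an equal superposition, with the density matrix averaged over random relative phases standing in for environment-induced dephasing. The off-diagonal terms vanish in the average, leaving classical-looking probabilities, while each individual run remains random.

    import numpy as np

    rng = np.random.default_rng(0)
    psi = np.array([1.0, 1.0]) / np.sqrt(2)  # (|0> + |1>)/sqrt(2)

    def dephased_rho(n_runs):
        # Average the density matrix over random relative phases,
        # a toy stand-in for environment-induced dephasing.
        rho = np.zeros((2, 2), dtype=complex)
        for _ in range(n_runs):
            phase = np.exp(1j * rng.uniform(0, 2 * np.pi))
            psi_k = np.array([psi[0], psi[1] * phase])
            rho += np.outer(psi_k, psi_k.conj())
        return rho / n_runs

    for n in (1, 10, 10000):
        print(n, abs(dephased_rho(n)[0, 1]))  # off-diagonal -> 0

The diagonal (the classical probabilities 1/2, 1/2) is untouched; only the interference terms are averaged away, which is the sense in which the way of computing probabilities changes without single events becoming deterministic.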
Using the fact that every search has failed in order to justify a theory is like the Shadoks' principle: "if there is no solution, then there is no problem." But we can also use this other one: "the more it fails, the more likely it is to work."
Be it on the classical or the quantum side, it is the fact that there is a measurement. Of course, if there is no measurement, there is no weirdness, but there is no physics either. We know there is a quantum state only by measuring it; otherwise, perhaps there is no quantum object at all.
When we throw a die and get an ace, say, if we throw it again there is only a probability of 1/6 of getting an ace again. That's the difference with QM: for the die there is no collapse. In addition, in QM, if we throw the same die outside the light cone of the first throw, we still get the same ace.
Quantum mechanics isn't ``just'' stochastic mechanics, because it provides a particular way of describing the space of states, namely through linear superpositions. However, how it does so is well understood; in particular, many of these states do not possess a classical limit. This doesn't mean that they aren't relevant for physical applications or for the consistency of the mathematical description.
Classical explanations always fail when we consider correlations, or entanglement. There is no way to correlate the outcomes of two different dice. We throw one die and get an ace, but if we throw another, identical die, there is still only a probability of 1/6 of getting an ace, at variance with a Bell-like experiment. Correlations also apply to a single particle: if, in the two-slit experiment, the outcome of the die is whether the particle passes through a slit or not, there is no interference pattern, whereas there is in fact a correlation between the two slits.
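A small numerical sketch of this contrast (assuming the standard singlet prediction P(same outcome) = sin^2(theta/2) for measurement axes separated by angle theta; the dice part is just illustrative):

    import numpy as np

    rng = np.random.default_rng(1)
    N = 200000

    # Two independent dice: getting an ace on one says nothing
    # about the other.
    a = rng.integers(1, 7, N) == 1
    b = rng.integers(1, 7, N) == 1
    print((a & b).mean(), "vs 1/36 =", 1 / 36)

    # Singlet pair: sample outcome pairs using the quantum rule
    # P(same) = sin^2(theta/2), hence E(theta) = -cos(theta).
    def singlet_E(theta):
        same = rng.random(N) < np.sin(theta / 2) ** 2
        return np.where(same, 1, -1).mean()

    for theta in (0.0, np.pi / 4, np.pi / 2):
        print(theta, singlet_E(theta), "QM:", -np.cos(theta))

At theta = 0 every single pair is perfectly anticorrelated, which no pair of independent dice can reproduce.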
Thank you for your answers, which show your enthusiasm for this topic. I believe a few decades ago somebody (E. Nelson?) wrote a paper and actually derived the Schrödinger equation from stochastic mechanics. I was unable to find further references. The reason that quantum mechanics might be stochastic mechanics is mainly the following: fluctuations and stochastic behaviour become more and more dominant as particles get smaller. We all know that quantum mechanics has some serious holes. I think this is also important to nano-tech. I was surprised that so many researchers ignore the stochastic nature of nano-particles, rods, and fibers.
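The paper meant here is presumably E. Nelson, ``Derivation of the Schrödinger Equation from Newtonian Mechanics'', Phys. Rev. 150, 1079 (1966). In rough outline (a sketch from memory, not a substitute for the paper): one assumes the particle undergoes a conservative diffusion with diffusion coefficient $\hbar/2m$,
\[ dX_t = b(X_t,t)\,dt + dW_t, \qquad \mathbb{E}\big[(dW_t)^2\big] = \frac{\hbar}{m}\,dt, \]
and defines current and osmotic velocities from the forward and backward drifts. Writing $\rho$ for the density and $v = \nabla S/m$, Nelson's stochastic analogue of Newton's law yields
\[ \partial_t \rho + \nabla\!\cdot(\rho v) = 0, \qquad \partial_t S + \frac{(\nabla S)^2}{2m} + V - \frac{\hbar^2}{2m}\,\frac{\nabla^2\sqrt{\rho}}{\sqrt{\rho}} = 0, \]
which are the Madelung equations and combine into the Schrödinger equation for $\psi = \sqrt{\rho}\,e^{iS/\hbar}$.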
Thank you Charles. To be honest, we all know that quantum mechanics is mostly witchcraft, and it's high time to go back to the original crossroad. I was able to find more references on this topic based on the articles you referred to. For example,
Inequivalence between the Schrödinger equation and the Madelung hydrodynamic equations, PRA 49, 1613 (1994)
Statistical origin of quantum mechanics, Physica A 307, 172 (2002)
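Regarding the first reference, if I recall its argument correctly: the Madelung hydrodynamic equations govern the pair $(\rho, S)$, while a wavefunction $\psi = \sqrt{\rho}\,e^{iS/\hbar}$ must in addition be single-valued, which forces the circulation quantization
\[ \oint \nabla S \cdot d\boldsymbol{\ell} = n h, \qquad n \in \mathbb{Z}. \]
The hydrodynamic equations by themselves do not impose this condition, and that gap is where the claimed inequivalence lies.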
The first one to have thought of a stochastic mechanics was de Broglie himself. Then there was Bohm (Phys. Rev. 85, 166 and 180 (1952)). But in Bohm's theory, as in any subsequent one, the "witches" of quantum mechanics still have their broom, concretely a nonlocal potential.
Fushan, You mentioned: “We all know that quantum mechanics has some serious holes. I think this is also important to nano-tech. I was surprised so many researchers ignore the stochastic nature of nano-particle, rods, fibers.”
Nonetheless, QM is a required theory because it is the only theory that can currently account for: entanglement, double slit phenomenon, particles occurring in two states at the same time, why bulk matter doesn’t collapse under the electric force alone, wave characteristics of particles, the violation of the Bell inequality equation, etc.
As you suggested, however, there are issues with QM. These issues appear to exist because QM was empirically derived from observations and measurements of particles and their interactions, not from the makeup and the underlying dynamic structures and mechanisms of particles and their interactions that give rise to such observations and measurements. This suggests that understanding the makeup and dynamic structures of particles is necessary before the issues regarding QM can be resolved.
For example, our work shows that the electron and the photon have axes along which their dynamic elements translate and/or oscillate. Thus, measurements of and/or effects on the properties of either the electron or the photon would vary in accordance with the cosine of the angle between the axis of measurement and the axis of either the electron or the photon. Thus, Bell's work would have shown that the correlation/angle relationship would not be linear, but would instead be related to the cosine function, as reproduced in experiments, and would not need to be attributed to QM. In all cases, the Bell inequality would be violated.
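For concreteness, here is the standard arithmetic behind that comparison (textbook CHSH, not specific to the work described above): with correlation $E(a,b) = -\cos(a-b)$ and settings $a = 0$, $a' = \pi/2$, $b = \pi/4$, $b' = 3\pi/4$,
\[ S = E(a,b) - E(a,b') + E(a',b) + E(a',b') = -\tfrac{\sqrt{2}}{2} - \tfrac{\sqrt{2}}{2} - \tfrac{\sqrt{2}}{2} - \tfrac{\sqrt{2}}{2} = -2\sqrt{2}, \]
so $|S| = 2\sqrt{2} > 2$, violating the CHSH bound, whereas the linear (saw-tooth) relation $E(\theta) = -1 + 2\theta/\pi$ gives exactly $|S| = 2$ at the same settings.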
Bell's phenomenon and all the other phenomena mentioned earlier can be understood, without having to invoke QM, if the makeup and dynamic structures of particles are understood, as the example above shows. Due to the randomness of particles (e.g., the nano-particles that you mentioned), a probabilistic mathematical platform will continue to be required in some cases, but we will understand why and how we get the measurements and observations that we do.
Christian, of course there was the experiment of Aspect in 1982. But also theoretically, every detailed contention of "unintuitive but not weird" has been examined and rejected, even if nobody wants to listen. No solution, either with another formulation or another interpretation, has been found, making it more and more plausible that there is none. Analyses with information theory have shown some anomalies, like negative information. Finally, all other developments based on QM, save for merely assigning groups and representations, have failed, namely supersymmetry, strings, and quantum gravity, suggesting there is something wrong from the very beginning.
The weirdness is not in the probabilistic formulation, it is in the nonlocality. The problem with quantum mechanics is that it is all fallacious. We see issues where there are none, and we think we have cured an issue when it is about something else. That's why many people think it is not weird: they have not even considered the real problem and are content with fallacious solutions to where there is nothing to solve. That is quite apparent in the first controversy between Bohr and Einstein: Einstein pointing out what would become the EPR paradox, and Bohr missing the point and answering wide of the mark with the complementarity principle, while Ehrenfest was blunt and insolent. The legend says Bohr won the day; still another fallacy, Einstein was half right. So most of the discussions about QM go in circles. Let's face reality and take the bull by the horns ("prenons le taureau par les cornes").
Thank you so much Claude, for the English translation of that paper.
It sounded harsh when I said QM is mostly witchcraft, considering that so many great scientists gave an unbelievable concerted effort to develop the theory as we see it today. However, think about all the weird ideas that possibly let pseudo-science creep in, like the notion that our minds can interfere with experiments. If this were so, then quantum mechanics would be psychology and we should consult psychologists first.
Also, there is an interesting observation that most of the breakthroughs (I mean those that really get results, not just on paper) are made by those who really don't care about QM. Think about lasers, holography, semiconductors, which are not necessarily tied to QM. Take P. G. de Gennes as an example. What if he hadn't switched from superconductors to liquid crystals? Why did he switch?
I want to particularly mention one recently and vigorously revived branch of mechanics -- generalized continua with microstructures, first developed more than a century ago by the Cosserat brothers. The bottom line is that in this kind of medium the individual particles cannot simply be treated like ideal Newtonian point particles, because they have internal structures. It can be shown that the internal structures of these particles can essentially be taken care of by adding one more degree of freedom to the system -- the "intrinsic spin" (angular momentum). A lot of materials (particularly artificial materials, like granular systems, bones, composite materials, etc.) should be treated this way. Does this sound familiar? Is it possible to develop a new theory, which might be called "stochastic mechanics of generalized continua with microstructures", as a replacement for quantum mechanics?
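For readers unfamiliar with micropolar (Cosserat) media, the balance laws look roughly like this (one common linear form, quoted from memory): in addition to the displacement field $u_i$ there is an independent microrotation field $\varphi_i$, and the balance of angular momentum acquires its own equation,
\[ \sigma_{ji,j} + f_i = \rho\,\ddot{u}_i, \qquad m_{ji,j} + \epsilon_{ijk}\,\sigma_{jk} + l_i = \rho\, j\,\ddot{\varphi}_i, \]
where $\sigma_{ji}$ is the (generally non-symmetric) stress, $m_{ji}$ are couple stresses, $f_i$ and $l_i$ are body force and couple densities, and $j$ is the microinertia. The second equation is the extra "intrinsic spin" degree of freedom mentioned above.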
The way I see it, one can possibly develop several theories that equally explain all existing facts, just as one can find many functions to fit a set of data. Currently semiconductors and lasers are at the core of quantum mechanics, but that doesn't mean that QM is the only possibility. There is a chance that QM is the wrong one.
Dear Fushan, you are flat wrong: you just don't know what an "energy gap" is! Exactly this "strange object"(?) is common to semiconductors and lasers. In other words, you don't understand "laser" either.
Fushan, suppose we have two nontrivially different theories explaining all known facts, as you put it. What is the procedure to tell the "right" one from the "wrong" one?
I only know one: find out where they make different predictions and strive to make an experiment that can decide between the two (i.e., create a new fact) or prove them both wrong.
If the theories do not make different predictions, then there is some consensus to opt for the one with fewer postulates, if such a distinction can be made. When it comes down to choosing between different sets of postulates, this might become a matter of taste?!
This question has been discussed since the introduction of quantum mechanics (QM). A simple, good and rigorous review of contemporary understanding of QM's nature can be found in: Andrei Khrennikov, Lectures given at the Institute of Quantum Optics and Quantum Information, arXiv:1410.5773v1 [quant-ph], 19 Oct 2014.
Yes (maybe the 2nd lecture would have more QM-related content?), but it's kind of interesting to see what you're asking for when asking whether QM is "just" statistical mechanics, and to see that it is not necessarily a simple concept in classical physics either. I was always more or less content with "stability of frequency" in the large-number limit and, in the context of thermodynamics, the idea of ergodicity, attributing to each quantum state the same phase-space volume...
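As a trivial illustration of that "stability of frequency" (just the law of large numbers, with an arbitrary made-up bias):

    import numpy as np

    rng = np.random.default_rng(2)
    p = 0.3  # arbitrary illustrative bias
    for n in (10, 1000, 100000):
        freq = (rng.random(n) < p).mean()
        print(n, freq)  # the relative frequency settles near p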
Reply to Dr. C. P. Massé: probability theory forms a foundation of quantum mechanics. One can safely state that QM is a probabilistic theoretical approach. In particular, understanding QM experiments is impossible without understanding at least the foundations of probability theory.
(Continuation of the previous message to Dr. Massé.) From a mathematical point of view, QM is a theory of linear operators in Hilbert spaces. Physics comes into QM via the probability theory that is necessary to interpret experimental measurements. Thus, on the physical side, QM is a probabilistic theory. If one reads Lecture 1 (see my previous reference), one finds a generalization of probability measures (in Kolmogorov's sense) to signed and complex probability measures that are of use in quantum field theory (Feynman). In quantum statistical mechanics (QSM, where one deals with more than two quantum particles) one needs probability theory to develop a self-consistent theoretical approach, that is, directly (so to speak), not only for interpreting immediate experimental results. Almost every solid textbook on QSM contains a detailed discussion of the probability-theoretic foundations of QSM.
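To state the point in one formula (standard textbook material, nothing beyond plain QM): for a self-adjoint observable $A$ with spectral measure $E_A$, a state $\psi$ defines a genuine Kolmogorov probability measure over outcome sets $\Delta$,
\[ \mu_\psi(\Delta) = \langle \psi,\; E_A(\Delta)\,\psi \rangle, \]
one measure per observable. What has no classical counterpart is that noncommuting observables admit no joint measure of this kind, which is one precise sense in which QM extends ordinary probability theory.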