I find it irritating that some people lose any sense of politeness and respect when they contribute to discussions on social media like this one. If people who do not quite understand what they are writing about are impolite, it makes me even more unhappy. - In any event, the insight that, in quantum mechanics, detectable events exhibited by isolated but open systems are always accompanied by "loss of information" and by entanglement with the states of degrees of freedom no longer accessible to observation after the occurrence of events, well, this insight is basic! It is as basic for quantum mechanics as, for example, the insight that the speed of light measured by a freely moving observer is independent of the state of motion of that observer is for the special theory of relativity. - I learn from these online discussions that nothing can replace real seminar lectures and person-to-person discussions. Online discussions appear to create more confusion than they eliminate!
I guess we do not yet have sufficient knowledge about the geometry of space-time at the quantum scale, unlike Riemannian geometry, which is well understood and applicable at the classical scale, giving rise to the formulation of general relativity.
The oversimplification of the flat spacetime geometry of the universe is responsible for the non-unification of the two theories. We developed a solution to the Einstein field equations that reveals a hyperbolic-spacetime universe and weds GR to QT for the first time.
Article The hyperbolic geometry of the universe and the wedding of g...
I think the expectation is simply too high. Popular science writers just make too much fuss about the possible integration.
There are points of contact.
The gravitational potential acts the same as any other potential for quantum
purposes. Why do people expect more than this?
Mixing the very small to very large distances is just about impossible.
General Relativity is a classical (non-quantum) theory.
This is due to the different scales and explanatory models involved. The same has been resolved in the attached publications.
Article Journey of the Universe from Birth to Rebirth with Insight i...
Article A Spiral Structure for Elementary Particles
Article Quantum Spiral Theory
Article Spiral Hashed Information Vessel
Article Mass-Energy Equivalence in Spiral Structure for Elementary P...
@ Suraj,
I have just gone through an article of yours and found multiple errors in it. I politely ask you, instead of cooking up pseudo-scientific stories, to submit your work to decent peer-reviewed journals; you will learn the bitter truth!
Juan,
I think that it would 'merely be necessary' to change the paradigm that space-time is an empty 'continuum' (yet one capable of carrying 'curvature', as now confirmed by LIGO) while 'c' is an absolute constant and time dilates. ('c' is now fixed absolutely to a frequency, until the panel determines a better value.)
If it were recognised that the active vacuum of QM is a 'ponderable' medium for GRT, then we could accept a variable 'c', explain gravitation as an energy-density field, do away with ballistic gauge particles, and enable rapid progress.
Only a different point of view is required, but I despair that a paradigm shift will occur in my lifetime. Guy
Is it the connection between the two theories, mathematical or physical, that is failing?
The maths describe the paradigms, but if both are out of phase then we have two divergent models with which to hunt for the true theory, designing experiments to meet Popper's test of falsification. Moreover, we have multiple models of SRT and GRT, and several quantum theories, to satisfy. Ptolemy added epicycles; have we come far from that technique?
Epicycles can only be used when there are data to fit. As pointed out by Ales, so far there is not one solitary piece of experimental evidence which such a theory could fit, let alone explain.
The fact that there is no mathematically obvious way to combine GRT and Quantum Field theory is only the smaller problem: the main one is that there is no experimental fact that depends on the way these could be united. Everything happens at a spatio-temporal scale which is completely inaccessible so far. So any solution conceivably discovered would probably remain pure hypothesis for at least a century.
@Vikas,
Please review the papers and point out the mistakes and probable solutions to them.
Well, you can only expect an (educated) opinion with this question, so that is what I can offer you.
In my opinion, the primary reason is that quantum theory has been inadequately understood. Both quantum mechanics and quantum field theory still suffer from serious conceptual problems that, as I view it, need to be (at least partially) resolved before any attempt is made at the next step.
I respectfully disagree with the statement that a lack of experimental evidence is at the heart of the problem. Following Kuhn, the evidence a scientific community considers important is dictated by the paradigm it follows. That's not necessarily unscientific; rather, it should tell us that, when a new theory is presented, the available evidence and possible counterevidence are given a different weight in an attempt to determine which theory is better. Note that I am using the word theory in its colloquial sense, not in the Popperian one.
Maik,
Your response impressed me, so I read your paper arXiv 1509 and I quote your own words from p2 : "Kuhn argues that, as a field of science develops, a paradigm is eventually formed through which all empirical data is interpreted. As, however, the empirical evidence becomes increasingly incompatible with the paradigm, it is modified in an ad hoc manner in order to allow for progress in the field. Ultimately, this creates a crisis in the field as attempts to account for the evidence become unmanageably elaborate and highly contradictory. Unless a new paradigm is presented..."
Maybe Popper should overrule Kuhn if we must change the paradigm?
Until the paradigm shifts, there will only be attempts to force/enforce the existing theories by ignoring the available evidence, especially when whole careers and reputations are at stake.
I suggest that the nuclear and atomic aggregation states of matter should be accompanied by a vacuum aggregation state of matter, and that we recognise different models for each to more fully understand their interactions.
Guy
@ Maik Reddiger: I do not, of course, disagree with the claim that quantum theory has some conceptual problems. I do not believe that these need be resolved, however, to make progress on the possible unification of quantum mechanics with gravity. The point is here that people who resolve the conceptual problems of quantum mechanics in extremely different ways, eventually all agree on the experimental predictions, those who do not being proved wrong. There has so far not, to my knowledge, been any experiment inspired by the conceptual issues of QM and leading to a conflict with it.
On the other hand, it is merely a matter of units to determine at what scale one expects quantum gravity effects to become important. Unless something unexpected happens, and the effects turn out to be much larger than expected, quantum gravity effects will not be observed in my lifetime. And so far they certainly have not been observed.
So neither in the foundations of QM, nor in the link to gravitation do I see any hint of ``As however the evidence becomes increasingly incompatible with the paradigm, it is modified in an ad hoc manner...''. In the foundations of QM there is no incompatible evidence (there is plenty that is *weird* but everything follows the quantum rules of calculation) and in the link to gravity, there is no evidence, period.
I think that the evidence, impossible to see in our time, can perhaps be measured in the future, when we have measuring instruments with a higher resolution. The possible connection with gravity may be so insignificant that we are not yet technologically capable of measuring it.
I think that contemporary mathematics can easily cope with the problem of unifying general relativity and quantum mechanics. Look, for example, at the paper "Applications of the local algebras of vector fields to the modelling of physical phenomena", where the local algebra of quantum mechanics induces a Riemannian metric.
Article Applications of the local algebras of vector fields to the m...
That's an instance where "shut up and calculate" leads nowhere and is a huge waste of time and energy. We simply can't unify two theories when one of them, or both, is not understood. The first task is to solve the measurement problem in quantum mechanics, instead of ignoring it.
The argument of Kuhn is specially tailored to the emergence of quantum mechanics, but not every scientific revolution follows this pattern. We face a conceptual crisis, not an experimental one. It is more akin to the Newtonian revolution, where the useful mathematics did not yet exist. The general problem is that quantum mechanics failed to completely replace classical physics, even as an approximation, and that falls exactly in the blind spot of most physicists. General relativity is such a classical theory.
Besides general relativity there is also quantum cosmology. The Wheeler-DeWitt equation, for example, treats the universe as a wave function that depends on the evolution of the scale factor a(t). Could this be a starting point?
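To make that concrete, here is a sketch of the Wheeler-DeWitt equation in its simplest minisuperspace reduction (a closed FRW universe with cosmological constant, in Planck units, with one particular factor ordering; the numerical prefactors vary between conventions, so this is indicative only):

```latex
% Hamiltonian constraint of canonical quantum gravity:
% the "wave function of the universe" carries no external time.
\hat{\mathcal{H}}\,\Psi = 0
% Minisuperspace sketch: all degrees of freedom frozen except the
% scale factor a; closed FRW universe, cosmological constant
% \Lambda, Planck units, one choice of factor ordering:
\left[\frac{d^{2}}{da^{2}} - U(a)\right]\psi(a) = 0,
\qquad U(a) \propto a^{2}\left(1 - \frac{\Lambda a^{2}}{3}\right)
```

Writing the potential with a proportionality keeps the sketch agnostic about convention-dependent factors; the qualitative point is that the scale factor behaves like a particle in a potential, with no time variable appearing at all.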
James Maxwell tried throughout his life to unify electrodynamics, optics and the theory of magnetism, and did not attain his goal. He only constructed one of the first hybrid theoretical schemes, by introducing the displacement current. Yet it marked only the beginning of true unification, of the interpenetration of the paradigms of pre-Maxwellian physics. Likewise, Stephen Hawking began in 1974 the process of interpenetration of general relativity and quantum field theory, the process of "grinding" their theoretical ontologies. Hence one of the main philosophy-of-science problems is: "what is real unification?"
The problem with anti-crackpots is: there are no omniscient people. Only an omniscient judge can decide what or who a crackpot is.
Second: physics was guided by humans and contains pieces that contradict other pieces.
The scientific method is a contradiction in itself. Not everything can be experimentally verified.
The crackpot criterion is often used as a defense by those who are themselves weak in mathematics. Otherwise they could just disprove the statements under discussion.
Also mathematics still contains holes and inaccuracies.
Good arguments exist that physical reality employs structures that we also know as mathematical structures.
All experiments need a supporting model that helps interpret the results of the experiment. Who verifies that model?
Most things that were taught at my university were sound, but some of the material appeared incorrect. The lecturers did not lie; they teach what was taught to them.
I just wanted to add that the author of the crackpot index is a famed mathematical physicist, who is certainly not weak in that matter.
``Why the quantum mechanics, and the general relativity, cannot be unified?''
Maybe this question is inappropriate, and should be changed into ``Is it useful to try to unify quantum mechanics and general relativity?''
The first question starts from the idea that both theories are true, independently of a domain of application to which a theory may be restricted. The domains of the two theories seem very different to me: for quantum mechanics, let's say, microscopic physics; for general relativity, mainly cosmology. Putting arguments of an aesthetic character aside, I do not see any reason to quantize curved space-time (which is a useful alternative description of the larger-scale influences of the gravitational field). The microscopic interaction between elementary particles (including Higgs particles) is quite a different physical domain, as little requiring the use of cosmological concepts as atomic physics requires the macroscopic concept of `rigid body' for describing atoms.
Firm believers in the feasibility of a ``theory of everything'' are advised to read Lee Smolin's book entitled ``The trouble with physics: The Rise of String Theory, the Fall of a Science, and What Comes Next''.
Well, the right question may be: "How could one go about unifying quantum mechanics and general relativity theory in a grander theory that, in some limiting regimes, reduces to the known theories?" - In other words: "What would be a promising strategy allowing us to make progress towards unifying quantum theory and relativity theory?" I believe that some kind of grand-unified theory containing quantum theory and a theory of space, time and gravitation will be found eventually. With Charles Francis I believe that some or most of the fundamental problems on the way towards such a theory are more of a conceptual nature than technical. I also think that space(-time) is an "emergent structure" that encodes relations between different "events". Incidentally, this is the view Leibniz advocated! If one believes that it is the correct view one might think that one should start from an abstract quantum theory, which, at the bottom, does not talk about space(-time). Space(-time) and gravitation are phenomena that should come out of such a theory in a limiting regime - just like fluid dynamics in the form of the Navier-Stokes equations comes out of atomic and molecular physics in a limiting regime. - I tend to think that there are various promising ideas around that might help us to get started. But the pressure of various fashions may prevent the strong young people from pursuing them.
Philadelphia, PA
Dear Fröhlich,
You make a very interesting, a very positive comment here. You wrote:
With Charles Francis I believe that some or most of the fundamental problems on the way towards such a theory are more of a conceptual nature than technical. I also think that space(-time) is an "emergent structure" that encodes relations between different "events". Incidentally, this is the view Leibniz advocated! If one believes that it is the correct view one might think that one should start from an abstract quantum theory, which, at the bottom, does not talk about space(-time).
---End quotation
I wonder if you would be willing to point us in the direction of approaches you regard as more plausible. I was just reading a passage in Carlo Rovelli's little book, What is Time? What is Space? (2004, Italian; 2006, English). He says there, p. 49:
So, I now return to our own days, to quantum gravity and to the significance of the assertion "time does not exist" that I stated earlier. The meaning of this assertion is simply that the Newtonian scheme no longer functions, when we are dealing with much smaller things. It was a good scheme, but only for macroscopic phenomena.
---End quotation.
I recall when I first met Rovelli, at a meeting of the Italian Society for Philosophy of Science, in Milan, there was a lot of talk in my session, which Rovelli attended, concerning time, and what does it mean that time does not exist? We didn't get a direct answer at the meeting.
But in any case, given what he says in the passage quoted just above, it seems that it is not merely the Newtonian scheme that "no longer functions." It must be that the Einsteinian scheme also "no longer functions" --or functions only at the scale of "macroscopic phenomena"? What is your view of this strange matter of the "disappearance of time" --presumably along with Einstein's spacetime?
This topic seems strongly connected to our earlier exchanges on RG. Among technical problems, however, would seem to be the problem of experimental access to extreme high-energy phenomena and the domain of the Planck length.
H.G. Callaway
@Charles,
The characterisation "crackpot" is often used in situations that are not so straightforward. If something seems to be in conflict with mainstream physics, then it is often classified as crap without further discussion. On many occasions that is not justified.
Mathematics is not complete and not always trustworthy. This is especially so in multidimensional differentiation and integration. For example, the Maxwell equations are incomplete. The generalized Stokes theorem is formulated in an incomprehensible way.
John Baez has shown that it took half a century before mathematicians found a hard proof of the relation between number systems and Hilbert spaces that had already been suggested soon after the introduction of the separable Hilbert space. It took several decades to find a proper definition of a non-separable Hilbert space, and the relation between the two kinds is still unclear to most scientists. Still, Hilbert spaces are extensively applied by quantum physicists. Are these physicists crackpots?
Baez also showed the importance of the relation between the orthomodular lattice and Hilbert spaces. That lattice is suitable as a foundation of physics because it does not yet contain numbers; it is no more than a relational structure. The set of closed subspaces of a separable Hilbert space has exactly that structure. With the Hilbert spaces, the number systems emerge. However, at the same time the Hilbert spaces put a strict restriction on the tolerable number systems: they accept only real numbers, complex numbers and quaternions. Octonions and bi-quaternions do not fit.
Notions of space and time emerge when the model that is based on the orthomodular lattice is upgraded to the level where the Hilbert spaces are introduced. The restriction that the Hilbert spaces pose on number systems also holds for the notions of space and time. Hilbert spaces are storage media that can only store members of division algebras and functions that have target values in those algebras.
I don't see any reasonable argument that says that "time does not exist". When stated in this generality, this claim is nonsensical. Something like the proper time of intelligent observers of Nature clearly exists. What we don't know is what the exact structure of space-time on very short distance scales is. But that's another story!
@Charles,
You are right about the fact that reality appears to use finite sets, but these sets can be embedded in a structure that in principle has an infinite extension. For example, every infinite-dimensional separable Hilbert space owns a Gelfand triple, which is a non-separable Hilbert space. The intimate relation between separable and non-separable Hilbert spaces can only be investigated via this bridge. Personally, I use the reverse bra-ket method for this purpose; I derived that method from Dirac's bra-ket notation. Continua, such as fields, can only be stored in the non-separable Hilbert space. Both the separable and the non-separable Hilbert space are required in a model that simulates physical reality. I use this model in the investigation of the generalized Stokes theorem. With the help of the reverse bra-ket method I can apply the theorem to a quaternionic function space. The result is an interesting model of an evolving universe.
You will be balancing at the edge of being classified as a crackpot when you try to interpret the significance of the orthomodular lattice. Garrett Birkhoff and John von Neumann called their discovery "quantum logic". They did this because the relational structure of this lattice is quite similar to the structure of the lattice that defines the relational structure of classical logic. However, the orthomodular lattice is not a logic system; its elements are not logical propositions. The duo indicated this themselves by showing, in their introductory paper, that the set of closed subspaces of a separable Hilbert space shows the relational structure of an orthomodular lattice.
Instead, it is more relevant to interpret the orthomodular lattice as part of a recipe for modular construction. This would mean that every module and every modular system is represented by a closed subspace of a separable Hilbert space. The converse is not true: not every closed subspace represents a module or a modular system. Elementary modules are not constituted of other modules and form the atoms of the orthomodular lattice. They are represented by one-dimensional subspaces, each spanned by a single Hilbert vector. It is clear that the notion of a module must be detailed further than a classification as a closed subspace of a separable Hilbert space.
Exploring this further appears to be very fruitful. It is only risky if you fear being called a crackpot. I am not so easily scared. I pursued the exploration and got much deeper insight into the foundation of physical reality.
http://vixra.org/abs/1603.0021
Why the quantum mechanics, and the general relativity, cannot be unified?
GR cannot be unified with QM because QM and classical mechanics strictly follow Noether's theorems, while GR does not. The equivalence principle forbids the localization of gravitational energy, and this has drawbacks at high energies.
to Willem Marinus de Muynck,
You may find that Lee Smolin has changed the storyline.
I recommend that you check out his later book 'Time Reborn' (2013), in particular (or at least) pp. 175-186 and of course the preface, if standing in a bookshop.
Hi Ales,
By the way have you heard from Kaare recently?
No, unfortunately I have not heard anything from him. I was a bit surprised not to see him anymore. Do you know anything?
Dirac was the one who introduced mathematical mysticism into physics. Even before the positron was observed, he withdrew his explanation that the symmetric particle in his equation was the proton, and made the correct prediction of the anti-electron, and even of the anti-proton. At the same time, he made the prediction of the magnetic monopole, which proved a desperate flop. According to him, every mathematical structure must be identified with a physical reality if it isn't forbidden by the known laws. Now, the magnetic monopole is forbidden by nothing on earth. Things are the way they are; we can't say more. The same story played out again with supersymmetry. Following this philosophy, and a fortiori with crackpot mathematics, led to a desperate flop too, for half a century. Dirac was brilliant (or lucky), period. He didn't have a secret recipe. The problem with that idea is that a structure that is not forbidden in the current knowledge may become so as science progresses. Put otherwise: don't read tomorrow's weather from the bray of an ass.
Now we have found the missing link between unification and crackpottery. Nothing will be found by beholding cabalistic signs on paper. It takes a very deep conceptual analysis, and probably abandoning grandiose pet principles too.
Anyway, Paul Dirac desperately outlined the True Way for us: "Shut up and calculate!" Amen.
Ans:
Following are a few of the reasons:
-GR is based on local invariance of speed of light, not global invariance.
-It uses weak field approximation which is not suitable for strong fields.
-It uses linear time moving in one direction but time is periodic like period of the wave.
-It violates conservation of energy on a global scale.
-It is a geometrical theory and not an energy-based theory.
-It is based on Lorentz invariance which is less accurate than energy momentum invariant.
-QM is a theory which operates at a different scale than GR.
-QM looks for a fundamental building block of the universe, which is a flawed concept.
-QM seeks to unify fundamental forces using Lorentz invariance and ignoring conservation of energy.
-Both theories ignore the presence of consciousness in the universe and its relation to energy.
Following article addresses these issues in little more detail.
Article Periodic quantum gravity and cosmology
Charles, so you have surrendered; I think you can do better. It is grandiosely stated that quantum mechanics is complete; that's not true. Quantum mechanics isn't complete, in the sense that it builds on empirical input that is not explained. One such input is the discretization of the number of particles. The discretization of the number of bosons is spurious, since a boson is basically a field, and the quantum phenomena can alternatively be explained by the discretization of the number of fermions, which are basically particles. There is no persuasive explanation of the discretization of the number of fermions, since it makes use of numbers with strange properties that are actually members of a larger algebra (the Grassmann algebra is an exterior algebra) which is not used. The anticommutators are but handy abstract devices that aren't justified the way the commutators are, as quantum equivalents of the Poisson brackets. On deeper conceptual analysis, it is clear that the difficulty of the unification comes from there. We must rely on a patch, the projection postulate, without any insight into what it means; we only have an ad hoc mathematical description of it.
Charles, the conclusions of EPR have been disproven by experiment, so their hypothesis of incompleteness does not hold.
Postulating a fundamental discreteness can't work. We are much in the situation of the old quantum mechanics and the Bohr model. We use an ad hoc quantization condition for the electric charge. But while the other fundamental constants are interpreted as mere conversion coefficients between different physical dimensions, the electron charge is not and there is no explanation of the value of the fine structure constant. It was exactly the same for the Planck constant in old quantum mechanics.
Ironically, it was what Dirac was after when he thought out the magnetic monopole. But it came with a graceless string that went away only after the development of fiber bundles and connection-form theory. In his considerations, he forgot that it wasn't until after many developments (the Schrödinger equation, then the Klein-Gordon equation, which is ridden with negative-energy states too, together with the empirical observation that there is not much electron-proton annihilation) that he could come up with his equation and its prediction. While the basic idea may have some value, research follows a much longer and more winding road; it isn't as simple as identifying new mathematical structures. It is an involved process combining both experiment and theory.
So the discretization of the electric charge is an empirical finding that is not yet accounted for. The challenge for your approach is to derive the fine structure constant.
It is a little painful to see that the contributions ResearchGate advertises as popular answers do not even address the question that was posed by Echaurren. That's not good! Once again, there is too much polemic in this discussion.
For the time being, we have a problem unifying relativistic quantum field theory (RQFT) with general relativity, because RQFT presupposes the conformal geometry of space-time to be fixed in order to formulate its assumption of locality (Einstein causality). But the state of the degrees of freedom of an RQFT must have an effect on space-time geometry! If one studies the field equations of general relativity then, on the left side, one encounters a classical, geometrical quantity, the Einstein tensor, while, on the right side of the equation, there is the energy-momentum tensor of matter, which is a quantum field, i.e., an operator-valued distribution. Obviously, this means that, at a fundamental level (i.e., beyond semi-classical approximations), the present formulation of the theory of gravitation is not compatible with quantum theory. - But one can imagine other formulations that might avoid these problems. In such formulations, space (or, at least, the causal structure of space-time) emerges from a fundamental quantum theory. It appears quite clear that the fundamental theoretical paradigm is quantum theory, whereas general relativity is an effective theory only valid at macroscopic distance scales. I am an optimist and think that there will be much progress in this endeavor before very long.
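To spell out that mismatch: the classical field equations equate two c-number tensors, and the usual semiclassical workaround replaces the operator-valued energy-momentum tensor by its expectation value in the matter state (standard notation; the state label ω is just a placeholder):

```latex
% Classical general relativity: geometry (left) = matter (right)
G_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
% In RQFT the source becomes an operator-valued distribution
% \hat{T}_{\mu\nu}; the semiclassical approximation equates the
% classical Einstein tensor with an expectation value taken in the
% quantum state \omega of the matter fields:
G_{\mu\nu} = \frac{8\pi G}{c^{4}}\,
\big\langle \hat{T}_{\mu\nu} \big\rangle_{\omega}
```

Beyond this approximation there is no generally accepted equation, which is exactly the incompatibility the post points to.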
Let me add that, in my view, the one incorrect element in GR is the so far merely local, but not global, validity of c.
This has recently been repaired in the equivalence principle and can, therefore, be repaired in GR as well.
This latter task is of the greatest importance.
To me, it appears quite clear that at the fundamental level, quantum mechanics must also be formulated as a geometrical theory. General Relativity works without any mystery or paradox, we have a problem with quantum mechanics only. What happens today is sort of a group therapy for overcoming the trauma of the emergence of quantum mechanics through denying reality.
General Relativity works without any mystery or paradox, we have a problem with quantum mechanics only.
I do not agree at all. The Noether symmetries and conservation laws are complied with in quantum physics, while GR, with the artifice of curved space, had to give up such a fundamental theorem. GR is a better model for geodesic motion than Newtonian gravitation; at the same time, though, the EP (which is at the base of curved space) forbids the localization of gravitational energy, globally reinstated with the sleight of hand of the pseudotensors. At the present time gravitation, if represented by GR, cannot even merge with electromagnetism.
Charles, if I am not mistaken, what you are trying to say is that the fine structure constant is a sort of free parameter that, for example, God used to tune His universe so that interesting stuff occurs in it. There has often been a temptation for men to dictate to God how He should have performed His work. I find it the height of nonsense; ask Job.
The fuel of Einstein, and the reason why he made so many great contributions, is that he wanted to purge physics of all its contingencies, like precisely an arbitrary fine structure constant. That's the deep meaning of "God doesn't play dice." That principle led him to formulate the EPR paradox in order to introduce a non-random completion of quantum mechanics. Rationally, we can say nothing more than that, until now, this has failed. But that doesn't mean such an attempt is bound to fail forever.
The general working way to solve a perplexing problem is to see the obstacle, then to bypass it. But most physicists are not ready to see the obstacle. As long as there is a number they can calculate with, they are satisfied. The truth is, until now the fine structure constant, however fast it runs, is an empirical value, not a basic constant of the physical theories.
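For reference, the constant under discussion is the dimensionless combination (standard definition, SI units):

```latex
% Fine structure constant: dimensionless, so it cannot be absorbed
% into a choice of units the way G, hbar and c can.
\alpha = \frac{e^{2}}{4\pi\varepsilon_{0}\hbar c} \approx \frac{1}{137.036}
% In QED the effective coupling "runs" with energy scale, but the
% low-energy value above is measured, not derived from any
% accepted theory.
```

Its dimensionlessness is what makes it resistant to the "mere conversion coefficient" reading mentioned earlier in the thread.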
About the impossibility of the QM and GR to merge and the equivalence principle:
Gravity and the Quantum: Are they Reconcilable?
http://arxiv.org/pdf/gr-qc/0509051v1.pdf
Stefano, take a particle in a plane-wave state and make a local gauge transformation. You'll see that the energy isn't localized either. But that has no empirical consequence, since the wave function can't be measured. One can only get the full energy of the particle as a whole. That's a feature of all gauge theories, including General Relativity. We can't measure space-time either; that's what the equivalence principle says, which is similar to the gauge principle.
"One can only get the full energy of the particle as a whole. That's a feature of all gauge theories, including General Relativity. We can't measure space-time either; that's what the equivalence principle says, which is similar to the gauge principle."
Do you realize what you are talking about? The pillars of physics rest on the Noether symmetries and conservation laws; if we respect these, fine, otherwise you have to show that they are violated (suck a net energy flux from the vacuum). The non-localization of the wave function is another thing, which in any case does not infringe the conservation laws and respects the least-action principle.
Charles, there is no observation of corpuscles, while the quantization of charge has been shown by the famous experiments of Millikan. Speaking of philosophical principles, your idea starts from one that, unfortunately, is the fundamental philosophical error. If matter is made of independent corpuscles, there is no way to force them to be identical. Indeed, from Democritus up to Descartes, the diversity of the atoms (along with the diversity of their arrangement) was thought to be the origin of the diversity of Nature. The observation is that particles are indistinguishable, not independent, and thus are manifestations of one and the same fundamental stuff, or undivided whole in the words of Bohm. That is in full agreement with Leibniz's philosophical principle of the identity of indiscernibles, formulated centuries ago. By the same principle, two patches of empty space can't be discerned, and then should be the same. That is contrary to the concept of absolute space, leaving only spatial relationships between objects. In other words, the idea of emergent space already existed in old philosophical principles.
No, Millikan observed drops of oil. The corpuscle is a hypothesis that is contradicted by the observation of quantum phenomena.
Corpuscles are like plums, each one is allowed to get one.
Particles are like parts of a pudding (I don't use a cake, so as not to give the impression that it is a piece of cake); each one is allowed to get a portion of the size of hbar, however it is cut. It can be a thin slice in the x direction, and then y is indeterminate, or in the y direction.
The two concepts are different: particles must be identical, corpuscles must be distinct. Quantum mechanics deals with particles, while classical mechanics deals with corpuscles.
Now what about a plum-pudding?
We have to keep the two terms in order to express the difference. A quantum particle has both a corpuscular and a wave nature; the word is much used in this context. Corpuscle = small body, particle = small part. That's standard terminology. The quantum particle is not the classical corpuscle; the two concepts kept diverging as quantum mechanics developed: indistinguishability, second-order interferometry, particle decay, etc. In extreme views, or grand unification, there is only one type of particle with several possible states. The indivisibility is a mere occurrence of the discretization of the lepton number and other similar numbers; angular momentum too can't be divided. Note that it is true only in a measurement. I expressed that with the word 'get'; the two-slit experiment shows that an elementary particle can indeed be divided. Your idea uses corpuscles, not particles, in the sense I have made explicit. Basically it says that the lines in the Feynman diagrams aren't particles but corpuscles.
Wave nature refers to the interference phenomena, which are empirical, not any abstract structure. We can speak of an indivisible particle only in a measurement; thus indivisibility is an appearance.
But I'm talking about the concepts that are used in physical theories; trick or appearance doesn't enter. Concepts are conceptual, experiment is empirical. In practical terms, we deal with drops of oil and an operational definition of the electric charge; everything else is assumptions and mathematical calculus, there is no fundamental structure.
Whenever I hear something like ``we have a problem of measurement in quantum mechanics'' I feel surprised: where is the problem? I do agree that we have a difficulty interpreting the *measurement* results in a Bell type experiment (say Aspect's) in classical local terms. I cannot see, however, that it is a problem with quantum mechanics: its predictions altogether correspond to what is observed. On the other hand, the macroscopic behaviour of the measuring instruments---the specific way in which the photomultipliers click---is not, so far, understood in classical terms: it appears as though some kind of nonlocal communication between the detectors were necessary in order to explain the results in ``realist'' terms. But it does not seem as though this could be solved by fiddling around with quantum mechanics. It could happen that QM is actually wrong---in the style of Penrose's state reduction, or the Ghirardi-Rimini-Weber approach---or else we should somehow find a way of getting used to the strangeness of the real macroscopic world as given to us by quantum experiments.
It could happen that QM is actually wrong---in the style of Penrose's state reduction, or the Ghirardi-Rimini-Weber approach.
Maybe a different interpretation is needed in order to have a better explanation of the macro world, but it is quite unlikely that QM is wrong.
``Maybe a different interpretation is needed in order to have a better explanation of the macro world, but it is quite unlikely that QM is wrong.''
What could possibly be meant by ``a different interpretation''? A physical theory consists of a mathematical formalism, which is altogether unambiguous, and rules of correspondence that allow one to say what experimental facts correspond to what predictions of the theory. The latter part is sometimes referred to as an ``interpretation''. But it is quite a clear concept, even though it goes beyond mathematics. In QM, I would say the Born rule, stating the frequencies at which different values of a given observable will be observed in a given state, as well as some way of preparing this state, are essential features of almost any ``interpretation''.
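To make the Born rule concrete, here is a minimal sketch (my own illustration; the state and the measurement basis are arbitrary choices, not anything specific to this discussion):

```python
import numpy as np

# Born rule: for a normalized state |psi> and an orthonormal measurement
# basis {|a_k>}, the probability of outcome k is |<a_k|psi>|^2.

psi = np.array([1.0, 1.0j]) / np.sqrt(2)              # qubit state (|0> + i|1>)/sqrt(2)
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]  # computational basis

# np.vdot conjugates its first argument, giving the inner product <a_k|psi>.
probs = [abs(np.vdot(a, psi))**2 for a in basis]

print(probs)       # each outcome has probability 0.5
print(sum(probs))  # the probabilities sum to 1
```

The point is only that the rule is an unambiguous correspondence between the formalism (the state vector) and observed frequencies.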
It does not seem to me likely at all that one will be able to find, in that sense, something that can be significantly changed in the interpretation. Any new interpretation is overwhelmingly likely to coincide, in all experimental predictions, with the usual one, and thus not to bring any solution to the perceived ``problems''.
I do not believe that QM is wrong. But if it turns out that it is---and it still might be---then a real change would be at hand, as opposed to the eternal rephrasing of the same physical truths in different garb, characteristic of the more sterile interpretational debates.
Charles, I don't buy the imaginary unit as a constituent of the Universe. Complex numbers are but handy tools for all wave phenomena on earth, and even for every Hamiltonian system, classical or quantal, and that is a mere consequence of the existence of motion and invariants. The complex numbers possess unique algebraic properties, but that is mathematics, not physics. There are many discussions on this in RG. Early experiments have shown that the elementary particles have a wave nature, and until now, it has never been described by anything other than wave functions.
I think there is enough material in the literature about the measurement problem not to repeat the same thing each time, or even to write one's own account to refer to. There are handy tags here for finding the many discussions about it.
Every integrable Hamiltonian system can be written in action-angle variables through a canonical transformation. The action variable is the invariant, and the angle variable represents the motion; this motion is described by a complex number, for the simplest way to have both an invariant and motion is circular motion. That's why everything that is integrable is described by wave functions. Even nonlinear integrable partial differential equations can be represented by linear wave equations. Besides, it has been known for 90 years that everything is described by the Maxwell, the Schrödinger, or the Dirac equation. Even the general relativistic field equation has wave solutions.
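A minimal sketch of this point for the harmonic oscillator (my own illustration, under the standard convention H = (p² + ω²q²)/2): the complex variable z = ωq + ip evolves by pure phase rotation, so its modulus carries the invariant (the action) and its phase carries the motion (the angle).

```python
import numpy as np

# Harmonic oscillator H = (p^2 + omega^2 q^2)/2.  With z = omega*q + i*p,
# Hamilton's equations give dz/dt = -i*omega*z: circular motion in the
# complex plane.  J = |z|^2 / (2*omega) is the action (the invariant);
# arg(z) is the angle (the motion).

omega = 2.0
q0, p0 = 1.0, 0.5
z0 = omega * q0 + 1j * p0

t = np.linspace(0.0, 5.0, 200)
z = z0 * np.exp(-1j * omega * t)   # exact solution: pure phase rotation

J = np.abs(z)**2 / (2 * omega)     # action along the trajectory
print(np.allclose(J, J[0]))        # True: the action is conserved
```

This is the "invariant plus circular motion" structure the post alludes to, in the simplest possible case.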
The problem can be stated as follows. Nature is necessarily nonlinear. In nonlinear dynamics, there are the integrable and the non-integrable sectors. The integrable sector gives the waves and the particles. The non-integrable sector, or chaos, gives the apparent randomness. The fundamental problem of physics is to make these two sectors work together, and Quantum Mechanics was a good, but failed, attempt. In General Relativity, they already work together. Therefore the way to unify Quantum Mechanics and General Relativity is to first unify Quantum Mechanics with itself.
Please read : “Division algebras and quantum theory” by John Baez. http://arxiv.org/abs/1101.5690 and http://www.ams.org/journals/bull/1995-32-02/S0273-0979-1995-00593-8/ and http://arxiv.org/abs/quant-ph/0510095
If you read these papers, they will give you much more insight. If you can comprehend their consequences, then you can generate more consistent models.
All of physics is unified by two simple principles: conservation and locality. They are embodied in Hamiltonian and Lagrangian mechanics. That's actually where the Newtonian revolution truly was: the invention of calculus. Quantum Mechanics is based on both, as shown by the wave equation and the gauge principle. But one of its features doesn't fit at all; it is, of course, non-locality. So it rests on two inconsistent axioms, unitary evolution and the projection postulate; it isn't unified with itself. It's the so-called measurement problem, and that's not because physicists aren't familiar with it. That signals a future shift in the mathematical description: from analysis (or analytical geometry) to algebra. David Bohm made a similar diagnosis in his Implicate Order.
John Baez has produced more relevant papers.
http://arxiv.org/abs/1101.5690
"So it rests on two inconsistent axioms: unitary evolution and projection postulate, it isn't unified with itself."
Claude, that is an interesting point. Yes, QM evidently rests on these two axioms. But I would not call them inconsistent but rather complementary. The first one concerns the undisturbed evolution of a quantum system in time, the second one describes the measurement process. Problems arise when we carelessly intermix these axioms. First example: Von Neumann described the measurement process consistently as a projection. When we try to describe the measurement process by a unitary transformation, we run into the "measurement problem". Second example: Quantum electrodynamics is usually derived from the idea that there is a unitary transformation from incoming states (t = -infinity) through an interaction region to outgoing states (t = +infinity). Haag showed that (in the interaction picture) such a unitary transformation does not exist (Haag's Theorem) and that is only one of the problems with QED.
When we are in the domain of General Relativity, we apply General Relativity, no problem. When we are in the domain of Quantum Mechanics, we apply Quantum Mechanics, "no problem." Does that mean they are compatible? The domain where both must be applied is very narrow, for instance the first moments of the Big Bang, and it is out of reach of experiment. Why then try and unify them?
Exactly the same holds true for the two inconsistent axioms of Quantum Mechanics. Either we use one, or we use the other, each in its domain, and we boldly claim that it is the most successful theory ever. But the big problem is that we don't even know how those domains are defined. Yet most people keep convincing themselves that it is alright, that everything is under control. I ask exactly the same question: why then not try and unify it?
If only the two axioms of QM were inconsistent! Then, since experimental facts cannot be contradictory, these would necessarily tell us how the contradiction should be resolved.
But that, of course, is far from being the case. In fact, as several experiments tell us (various versions of the double slit, quantum eraser, etc.), unitary quantum mechanics is really unrestrictedly valid until the system you are interested in happens to interact with degrees of freedom you cannot control. Once you trace over these degrees of freedom, as you must, you go from a pure state to a mixed state, and the essential ``problems'' of measurement arise. Of course, it is not possible to avoid such uncontrolled interaction whenever you make a measurement.
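The tracing-out step can be sketched concretely (my own toy example, using a two-qubit Bell state, with the second qubit standing in for the uncontrolled degrees of freedom): the global state is pure, but the reduced state of the system alone is maximally mixed.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): pure for the pair, mixed for each part.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho = np.outer(bell, bell.conj())        # density matrix of the pure pair state

# Reshape to indices (sys, env, sys', env') and trace over the environment.
rho4 = rho.reshape(2, 2, 2, 2)
rho_sys = np.einsum('ikjk->ij', rho4)    # partial trace: sum over env index k

purity_pair = np.trace(rho @ rho).real       # 1.0: the global state is pure
purity_sys = np.trace(rho_sys @ rho_sys).real  # 0.5: maximally mixed qubit
print(rho_sys)                               # identity / 2
```

Nothing non-unitary happened; the mixedness of `rho_sys` simply expresses that the environment index was summed out.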
As to nonlocality, I do not see it appearing in quantum mechanics as we see it. The wave function always evolves locally, and such effects as involve ``instantaneous propagation'' in nonrelativistic QM are always extremely small. The nonlocality is an issue of interpretation: if we wish to interpret the variability of quantum measurement results as the result of averaging over unknown degrees of freedom, then the behaviour of these unknown degrees of freedom must be nonlocal. But that could mainly be viewed as encouragement to give up thinking along such lines.
There are many authors who have shown that decoherence doesn't solve the measurement problem, since it rests on problematic measurements itself.
``There are many authors who have shown that decoherence doesn't solve the measurement problem, since it rests on problematic measurements itself.''
I am sure there are (any references, by the way?). But before we claim that ``the measurement problem'' is not solved, we should know what it is. We probably cannot do without some kind of prescription, external to QM, telling us what we are likely to observe when we are looking at a given state in a given way.
However, I do think decoherence gives us a good way of understanding one fundamental, and to my opinion difficult, issue about measurement: how does a pure state evolve into a mixture? Decoherence states that unitary evolution can generate, from a simple pure state, a pure state so complicated that it is not feasible, with any reasonable instruments, for us to distinguish it from a mixture.
Now the appearance of mixtures from pure states always appeared to me rather ad hoc and miraculous. We may see the ``problem'' of measurement in different places. Those who see it in that part of the process can look to decoherence as, at least, a partial solution.
Charles, F. Leyvraz doesn't name it, but gives a description worth a definition.
F. Leyvraz, how does a molecular configuration of a gas evolve into a statistical ensemble? When it comes to claiming that Quantum Mechanics is a banal theory, people lose any rationality. There is a mathematical transformation of the formulation, but afterwards it doesn't describe the same thing, because there is a summation of the state of the actual environment with all the other states it could have had, but has not. In this case, the very operation of addition has a dubious physical interpretation. It is possible to develop a correct thermodynamical theory from a false underlying dynamics; in no case can the former explain the latter.
Decoherence is intended to explain classical mechanics from quantum mechanics; it can't explain the "underlying dynamics," and anyway, quantum correlation is observed, and there is no classical equivalent. The measurement problem has an empirical manifestation: it is non-locality. And that was my point: it doesn't fit into our conservation-locality paradigm. The collapse isn't described by partial differential equations like all the rest of physics; it's not just Unanschaulichkeit (lack of visualizability) or habit.
I think that the evidence, impossible to see in our time, can perhaps be measured in the future, when we have measuring instruments with higher resolution. The possible connection with gravity may be so small that we are not yet technologically capable of measuring it. Besides, Heisenberg's uncertainty principle prevents us from measuring at the quantum level.
The lower levels of physical reality are inaccessible for observation. That does not only hold for human observers. It holds for all discrete objects that take part in the considered system.
Gravitation finds its origin at these very deep levels. All massive elementary objects contribute to the deformation of the embedding field, and the gravitational potential is nothing more and nothing less than a smoothed view of this embedding field. The embedding field represents our living space. It can be represented by a function, and that function has a flat parameter space. Many people make the mistake of considering the flat parameter space as our living space. It is better to consider our living space as a kind of compressible fluid that obeys a set of special fluid dynamics equations. Another mistake is that people try to identify the embedding continuum with the electromagnetic field. The EM field exists as well, and it is in some way coupled to the embedding field, but the fields differ in a fundamental way.
Photons do not travel in the EM field. It is far more likely that they travel in the embedding field. It is also wrong to consider photons as waves. Certainly they possess a frequency, but it makes more sense to consider photons as strings of one-dimensional shape-keeping fronts. These fronts also keep their amplitude. That is why photons can travel billions of light years and, after that trip, still be detected by relatively small detectors.
@ Claude Massé: answering the post starting with ``Decoherence is intended...''
First I should say that I do not believe QM to be a banal theory. I state that considerable and real advances have been made in what is called the ``measurement problem'', to such an extent that it becomes hard to see where the problem is.
``Quantum correlation is observed, and there is no classical equivalent'': perfectly true. Quantum mechanics is the right theory, its predictions are observed.
Non-locality: as far as I can understand, the word is dubious. What is undoubtedly correct is that, if we attempt to explain certain correlations typical of quantum mechanics by the fact of our ignorance of certain degrees of freedom, then these degrees of freedom are non-local.
But neither quantum mechanics in its usual form, nor the more modern concepts of decoherence, give any reason to believe in the existence of such classical degrees of freedom.
``The collapse isn't described by partial differential equations like all the rest of physics, it's not just unanschaulichkeit or habit.''
There I would actually disagree. It is true that ``collapse'' as postulated by von Neumann has a somewhat arbitrary nature. At some stage, we just project and assume to have measured. It is discontinuous and to some extent arbitrary.
But this is very much where decoherence has clarified matters, by introducing a precise *quantitative* analysis of the *continuous* transition between a pure state and a purely incoherent superposition of alternatives. We need not nowadays simply say ``the state has collapsed, because I just looked at it''. Rather, one says: the system got entangled with degrees of freedom which I cannot observe, the state is thus, for all practical intents, equivalent to a partial trace, and evolves via a generalisation of the unitary evolution, irreversibly, and possibly, if we are in fact dealing with a measurement, the system will eventually find itself in a state which corresponds merely to classical ignorance: a density matrix. A far too quick summary, but the details can be found elsewhere.
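The quantitative, continuous character of this transition can be sketched in a toy model (my own illustration; the exponential decay of the environment-state overlap is an assumed model, not derived here): when the system qubit entangles with environment states |E0⟩ and |E1⟩, the reduced coherence is multiplied by ⟨E1|E0⟩, which shrinks continuously as the environment states become distinguishable.

```python
import numpy as np

# Toy decoherence model: a system qubit a|0> + b|1> couples to an environment,
#   |0>|E> -> |0>|E0(t)>,   |1>|E> -> |1>|E1(t)>.
# Tracing out the environment gives the reduced density matrix
#   rho = [[|a|^2, a b* <E1|E0>], [a* b <E0|E1>, |b|^2]],
# so the off-diagonal (coherence) is damped by the overlap <E1|E0>.
# Assumed decay law: <E1(t)|E0(t)> = exp(-t/tau).

a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)
tau = 1.0

def rho_system(t):
    overlap = np.exp(-t / tau)   # environment-state overlap at time t
    return np.array([[abs(a)**2,                    a * np.conj(b) * overlap],
                     [np.conj(a) * b * np.conj(overlap), abs(b)**2]])

early, late = rho_system(0.0), rho_system(10.0)
print(abs(early[0, 1]))  # 0.5: full coherence, still effectively pure
print(abs(late[0, 1]))   # ~0: indistinguishable from a classical mixture
```

The diagonal never changes; only the coherences decay, continuously, at a rate set by the system-environment coupling.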
This is not a metaphysical ``solution of the measurement problem'' with heaps of obscure verbiage. It is a physical solution, with mathematical techniques, which takes into account the realities of a system, the development of which we cannot fully control. I do not believe that my appreciation of these advances puts in doubt my rational faculties.
Yours,
Francois
What have all these contributions to do with the question posed by Echaurren? Very little, I am afraid. I propose that some of you start a new discussion line on quantum mechanics, the measurement problem, decoherence, etc. And it might not be a bad idea to first clarify one's thoughts before contributing remarks.
F. Leyvraz, non-locality is precisely defined: it is correlation across space-like intervals. The local hidden variable was postulated by EPR, but it has been rejected by experiment. There are speculations about non-local hidden variables, but nothing has resulted as yet. Quantum Mechanics, and especially the measurement process, can't be formulated as something happening as a function of what happens at neighboring points in space-time (expressed as differential equations), as in all other theories. There has been absolutely no progress in that direction. We can't, from thermodynamics, describe the physical process of the collision of two molecules. On the contrary, we get at macroscopic equations by neglecting the details; that's what decoherence does. There is no diagonal density matrix in Nature; it just describes our ignorance, not physics.
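The correlations in question can be made quantitative (a standard textbook computation, added here only as an illustration): for the singlet state, the correlation between spin measurements along angles a and b is E(a, b) = -cos(a - b), and the CHSH combination of four such correlations reaches 2√2, beyond the bound of 2 obeyed by any local hidden-variable model.

```python
import numpy as np

# Singlet-state correlation for measurement angles a and b:
#   E(a, b) = -cos(a - b).
# Any local hidden-variable model satisfies the CHSH inequality |S| <= 2;
# quantum mechanics reaches 2*sqrt(2) at the angles below.

def E(a, b):
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2          # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2*sqrt(2), about 2.828, violating the classical bound of 2
```

This is the empirical content behind "correlation across space-like intervals": the numbers themselves exceed what any local classical account allows.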
Decoherence can't do anything against non-locality in Quantum Mechanics. That's exactly the point which Einstein objected to from the very beginning, and neither Bohr, nor decoherence, nor anyone else answered it, or even addressed it. Most of the time there is a skirting maneuver aimed at a strawman. And now we ask and whine about why we can't unify General Relativity and Quantum Mechanics. My answer is just that: first unify the projection postulate.