For entangled particles, measuring a physical property related to the entanglement causes the wave function to collapse to a single state instead of the two states it held before. This seems to violate causality and locality. Non-locality can be admitted, but breaking causality goes against most of the laws of physics. Why don't we assume that each particle has its own state from the moment of creation until measurement? Why don't we adopt a realistic quantum mechanics, if there are new theories that can succeed in interpreting quantum phenomena while respecting locality and causality? What do you think?
There are a lot of answers already that are adamant that collapse of the wave function is "real" and undisputed. The reality is that the need for "collapse of the wave function", or some of the other measurement axioms of quantum mechanics, is still very much an open question. The subject is still debated by those active in foundations-of-quantum-mechanics research.
In terms of your question, which seems to me to be leading toward a realist view of the world, the challenge is this: to an excellent approximation the projection hypothesis reproduces the results of experiments. While conceptually tricky, it is undeniably a very good description of the way nature behaves in many "measurement" scenarios. If it is wrong, then we either need to show that the remaining parts of the model can reproduce this behaviour or explain it in some way, or we need another axiom in its place that can reproduce the effect but does not sit in such deep conflict with the rest of the formalism (e.g. it may preserve unitarity).
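For concreteness, the projection hypothesis mentioned above can be sketched numerically. This is a minimal toy illustration, not anyone's actual experiment: the qubit amplitudes and projector below are made-up values. A measurement outcome picks a projector, the Born probability comes from the overlap, and the "collapse" is the renormalized projected state.

```python
import numpy as np

# A toy qubit state |psi> = a|0> + b|1>, normalized (made-up amplitudes).
a, b = 0.6, 0.8
psi = np.array([a, b])

# Projector onto the |0> outcome.
P0 = np.array([[1.0, 0.0], [0.0, 0.0]])

# Born probability of obtaining the |0> outcome.
p0 = psi.conj() @ P0 @ psi  # = |a|^2 = 0.36

# Projection postulate: the post-measurement state is the projected
# state, renormalized -- this step is the "collapse".
psi_after = (P0 @ psi) / np.sqrt(p0)

print(p0)         # 0.36
print(psi_after)  # [1. 0.]
```

Note that the renormalization step is exactly the non-unitary ingredient the answer above is pointing at: no unitary evolution maps `psi` to `psi_after`.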
I am afraid that the answer is still unknown and the path to a solution is not clear-cut. For example, I have attached links to a paper that I wrote with Tim Spiller and Bill Munro, in which:
"We study a dissipative quantum mechanical model of the projective measurement of a qubit. We demonstrate how a correspondence limit, damped quantum oscillator can realise chaotic-like or periodic trajectories that emerge in sympathy with the projection of the qubit state, providing a model of the measurement process."
It would be possible (but I am not saying that you have to!) to interpret the model and results of this paper in a realist mindset and claim that it removes the need for "collapse of the wave function", Born's rule and a few other axioms. However, one could also argue that all we have done is hide the measurement in an environment and invoke a von Neumann chain argument. In either case, even the trajectories model that we use, which preserves the norm of the state vector, has some non-obvious non-unitary component. So even if a realist were to take up this argument, they would need to explain where this non-unitary dynamics came from (perhaps an answer might lie in Penrose's musings on gravity and QM, or Milburn's intrinsic decoherence, but that is nothing more than speculation). There are very interesting experiments by Anton Zeilinger and others probing the ontological nature of the wavefunction. Whether or not all these results could be forced to be consistent with the Copenhagen view is an interesting and open question.
While I have been unable to answer your question I hope what I have said will be useful to you.
http://www.sciencedirect.com/science/article/pii/S0375960110005700
http://arxiv.org/abs/0905.1867
I think it is resolved when one considers the wavefunction of the measurement device (i.e. us and the rest of the world) as a true many-body wavefunction. One naturally gets a partition into a many-worlds-like bifurcation of results in this case.
Dear Sadeem:
First of all, whatever the “collapse” of the wavefunction is, it is definitely real, because it does collapse, so I wouldn’t call it science fiction. That’s a little bit harsh.
On the other hand, I believe the word “collapse” is a catchall term to refer to the presently unknown process that takes the wavefunction from the mathematical probability domain (a model) to the actuality domain (reality). There is nothing wrong with this model, as long as it predicts the behavior of reality.
Having said that, let’s look into some of the details in your question.
“This against causality and locality.” Entanglement seems to be a real property of nature, so I agree with you that non-locality needs to be accepted; it is very difficult to imagine entanglement, at any distance, without non-locality. It is part of the definition. Now, violation of causality is a different story, because it involves understanding the nature of the concept of time, which is a property of reality so deeply rooted in our consciousness that it blurs into it. Proof is, no one seems to know what either concept is, and millions of extremely-intelligent-person-hours have been spent trying to understand them. So, let's go on.
“why we don't assume each particle has its own state from the moment of creation till measurement.” A particle does not have to be permanent, it could be transient. In other words, it could be manifested (constructed) by some process and then cease to exist (destructed). This is where the word “collapse” comes in for “constructed”. I don’t think the word for the process “cease to exist” has been coined yet, the closest being “annihilation”, which is not quite the same thing. As you can see, QM begs for a non-local, transient reality.
Finally, are there any “new theories that can succeed in interpreting the quantum phenomena?” I don’t know, perhaps this is another question for RG.
Regards,
Bernardo.
The question of conservation laws comes up for the "collapse" picture. If one insists on conservation laws for each event, then one needs either a back-reaction on the device or a many-worlds splitting of the space. I favor the latter, and gave an argument for why here.
Article Two Experimental Tests to Distinguish Decoherence from the S...
Dear Prof. Mohamed El Naschie
I agree with you that set theory and fractal spacetime can be good candidates for solving the problem of non-locality and causality. The mathematics is very convincing, but we need to pay more attention to the physics behind it.
Dear Dr. Clifford Chafin
I know modern quantum mechanics gives a strong role to measurement, but don't you think it is much exaggerated? I can agree with you that the wave function collapses mathematically, but the physics of such a description is really mysterious and disagrees with dominant principles like causality, even if we admit non-locality.
Dear Bernardo Sotomayor Valdivia
"whatever the “collapse” of the wavefunction is, it is definitely real, because it does collapse"
I'm convinced it is real, but only if the physics is well described with respect for causality and locality. The problem is that the physics is simply neglected and we are taking the mathematics as fact, just because its final results agree well with experiment; but this is not enough.
Hmm, I have thought about this one for a long time but have a hard time describing it. I visualize individual quanta colliding. Does one go left? Does it scatter right, or backwards? These are all possible results, and they depend on how squeezed the state is within the eigenspace. Every unit dt results in left and right scattering; generally this process is not completely symmetric, and when considering a soup of hot quanta interacting very quickly, whether a particle goes left or right does not matter: their superposition structures overlap, and thus coherence grows from chaos.
Measurement is generally considered... well, it is that whole process. Measurement is the interaction of a quantum with the superposition structure it encounters. Quanta may themselves be in superpositions, so measurement involves the collision of superpositions with superpositions. This does not seem like it could ever cohere, but it generally does, and a good demonstration of how is in the paper linked below.
I am a big fan of the many-worlds hypothesis; it is simply the easiest way to track the growing superposition of quanta. The part many don't realize is that the expectation value of an observable can reduce in effective superposition if the other contributions are small; the more Gaussian the wavefunction, the more coherent, and thus near-classical, it is. A "real world" can be not just one superposition contributing to the world structure, but ALL of the superpositions. Holography is just a method for reducing infinite dimensions down to a few; they are still infinite, but they are closely packed in some regions? I am not really sure how to describe it adequately.
This is an informatic perspective, though. If we want to describe nonlocality, we have to describe the interaction of quanta with other quanta by relating superpositions to spatial locations; this is the AdS/CFT correspondence. Generally it is pictured that the more entangled two sets of quanta are (i.e. the lower their relative entanglement entropy), the closer they are within the holographic/fractal anti-de Sitter space. Collapse in this picture is simply wave evolution, and involves no long-range interaction, because the wavefunction of the universe simply moves that way; a collapse is a further evolution, and occurs smoothly.
Hope this helps Sadeem.
http://arxiv.org/pdf/cond-mat/0203017.pdf
Dear everybody!
We don't know what the wave-function is. There is one thing that we can suppose: since it bends under the influence of fields, it should be MATERIAL. What type of material the wave-function is built from, we don't know yet. But, for sure, the wave-function is something real, not just a formula on paper.
Now, if the wave-function is some form of matter which travels through our apparatuses, it cannot simply disappear from some regions of space, as the collapse postulate implies.
But, as long as we don't know more about the nature of the wave-function, we aren't able to say more. What people should do is suggest new experiments for elucidating this issue. The presently known experiments have been examined from many points of view. There is a need to "attack" the wave-function with new experiments.
I agree with you, Sofia. The problem is in the way we are designing our experiments to study the physics of entanglement: we just keep increasing the distance between entangled particles, and if there is a correlation we say "yes, this is strong evidence against realistic quantum mechanics." Indeed, non-locality alone is not proof enough against realistic quantum mechanics; it can instead be an indication that the spacetime in which all these phenomena occur is not as simple as we thought.
First, any realistic as well as any causal theory has to be "nonlocal" or, more accurately, has to violate Einstein causality. This is a consequence of Bell's theorem and, given that the violation of Bell's inequality has been observed, is independent of interpretations of quantum theory, and even of the correctness of quantum theory.
But the breaking of Einstein causality does not mean that causality in general is broken. All that is necessary is to return to classical causality, which corresponds to some hidden preferred frame.
There are realistic and causal interpretations of quantum theory, like the de Broglie-Bohm theory.
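The Bell's-theorem point above can be illustrated with a short numerical check. This sketch uses the standard quantum prediction for the singlet-state spin correlation, E(a, b) = -cos(a - b), and the conventional CHSH measurement angles (nothing here comes from a particular experiment); it shows the quantum value 2√2 exceeding the local-realist bound of 2:

```python
import numpy as np

# Singlet-state correlation for spin measurements along angles a and b
# (standard QM prediction).
def E(a, b):
    return -np.cos(a - b)

# Conventional CHSH angle settings: a = 0, a' = pi/2, b = pi/4, b' = 3*pi/4.
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4

# CHSH combination; any local hidden-variable theory gives |S| <= 2.
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # 2*sqrt(2) ~ 2.828, above the local-realist bound of 2
```

Note that the bound |S| ≤ 2 is what holds for any theory where each particle carries its own predetermined outcomes and Einstein causality holds, which is exactly why realist theories must give it up somewhere (as de Broglie-Bohm does, via a preferred frame).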
Sofia,
We may not understand what exactly the wavefunction is, but did we understand what apples truly were when Newtonian mechanics came to be? Mostly the suggestion of a reality is held back by Descartes' most popular line of thinking; to sum it up in a sentence: are we a brain in a vat?
This probably seems like a silly attack on the Copenhagen interpretation, but I think it is the best perspective to hold when it comes to quantum mechanics. Observables are states which act as individual quanta; when particle physics comes in, it becomes clear that each quantum has not just two "positions" (states) but a number of states with respect to charge, flavor, color, even mass, as we have recently discovered.
To go further, the Dirac sea is a great starting point to visualize how quanta wavefunctions can be excited from the vacuum, with pair production being not a sudden manifestation, but rather a breaking of the symmetry of empty space.
You guys are largely debating nonlocality, though, and an important thing to remember is that nonlocality only presents itself upon classical intersection. That is to say, we can only be sure of the correlation when we actually compare the results, by causing the superposed wavefunction futures (here is where the many-worlds interpretation comes in) to overlap. There is no violation of Einstein locality; the breakdown of two regions of spacetime is always symmetric when you see the two worlds come together. The wavefunction "collapses" not instantaneously but over the course of the measurement process. This is so often forgotten that it gives rise to all of this nonlocality talk. We don't SEE the wavefunction collapse; we KNOW it has collapsed due to the correlation of entangled quanta.
Anyway, a bit of a rant, but there it is. I have come to this conclusion about quantum mechanics after exploring quantum information theory; it is an abstraction of the basic "physical" quantum mechanics, but it does follow the rules of the wavefunction, whatever they may be.
Dear Ilja
The non-locality is only from our point of view, but the entangled particles should be local relative to each other; this does not contradict Bell's inequality, yet at the same time it conserves both locality and causality. Accordingly, there is no need to assume collapse of the wave function, and every particle should have its state from the beginning. The problem with traditional quantum mechanics is that it assumes the probabilities of getting the final results, through all possible combinations of wave functions, apply even before the measurement takes place. This really oversimplified idea was only adopted because at the beginning there was no acceptable physical
Dear All,
In fact, the wave-function is real, the collapse of the wave-function is real, and Heisenberg uncertainty is real. This was discussed in detail in this RG thread: https://www.researchgate.net/post/Is_there_a_solid_proof_of_non_existence_of_the_absolute_reference_frame_or_no_one_has_found_the_proof_yet
Please review the last posts. The double-slit experiment is solved now!
This is one of my posts there:
Dear...
Suppose again a train is moving from pylon A to pylon B. If the front of the moving train arrives at pylon B at time t for the observer on the ground, then for that observer the train has passed the distance x between the two pylons in time t, according to his ground clock. So to the observer on the ground the moving train appears to move with continuity, right? Because the distance passed, for the observer on the ground, is a continuous path in which there is no cut.
Now what about the observer on the moving train, or the train itself? According to my theory, when the observer on the ground sees the front of the moving train at pylon B at time t according to his ground clock, at that moment, for the observer on the moving train, the front of the moving train is not at pylon B according to his clock on the moving train. The front of the moving train is at a distance x' = R⁻¹x between pylons A and B, at time t' = R⁻¹t. Here there is no continuity, because the location of the front of the moving train is at different points in space and time for the observer on the ground and for the observer on the moving train. Because of that, if the observer on the ground stops the moving train at pylon B, the moving train, for the observer on the moving train, will transform from the point x' = R⁻¹x to the point x' = x at zero time separation, and this is what happens when the observer on the ground makes a measurement at a certain point in space on the ground: at this moment the wave-function collapses. Review the double-slit experiment.
(Here the two pictures of the moving train, its location on the ground for the observer on the ground at (x, t) and its location for the observer on the moving train at (x', t'), are entangled according to the conservation of the energy-momentum four-vector, and they form the wave-function according to wave-particle duality... this is exactly the entanglement.)
Yes, the moving train moved in a continuous path for the observer on the ground, but for the observer on the moving train it did not! So continuity is destroyed in this case. Because of that, for the observer on the ground, the Sagnac effect and the Hafele-Keating experiment appear to be solved according to the Galilean transformation.
In order to keep continuity you must keep objectivity, and that means the two observers, on the ground and on the moving train, must agree on the location of the moving train at the same point in space on the ground at the same time, and that is impossible without t' = t, as in the Galilean transformation.
So where does Heisenberg uncertainty come from?
It comes from the fact that the two observers, on the ground and on the moving train, cannot agree on the location of the moving train at the same point in space on the ground at the same time, because of the time dilation and length contraction of the Lorentz transformations (during motion at constant speed v). In this case the Lorentz transformation must express the wave-particle duality, and the motion of the moving train must be governed by the Schroedinger equation. The quantization of gravity is then very simple, if you understand how the change of velocity is controlled by the quantization of energy, in order to understand the equivalence principle.
What you must understand now is this: if I see the plane arrive in Paris now, having passed the distance between London and Paris continuously for me, that does not mean that, during the motion of the plane at constant speed v, the plane has also arrived in Paris for the pilot of the plane. The plane arrives in Paris for me on the ground now, not for me and for the pilot at the same time. Because of that, objectivity dies, and continuity as in classical physics dies with it.
This effect does not appear at low velocities, because at low velocities time dilation and length contraction are negligible. Because of that, Einstein and Newton thought objectivity was absolute in the macro world, and thus Einstein tried to explain the Lorentz transformation according to objectivity, where y' = y and z' = z, and thus he proposed the reciprocity principle. Now you must understand why faster-than-light motion is impossible according to SRT, according to x/t: it is because of objectivity, and thus continuity. Because of that, spooky action at a distance (entanglement) could not be interpreted by Einstein!!
Whenever you use a material body with some spatial extension in Relativity, beware that the definition of the body itself is observer-dependent. A material body can be defined as a set of material points whose world lines do not cross and which, at some location in spacetime, belong to the same hypersurface. Then one can define a proper time for the whole body, but two observers do not see the same material points, and thus not the same body. So you cannot deduce anything from this kind of example.
The collapse of the wavefunction does exist, but it really results from a misconception of the nature of a fundamental particle.
A fundamental particle like the electron is not itself the fundamental particle.
Let's remind ourselves that an electron behaves like a cloud when in orbit. What you see is what you get.
It is a cloud of more fundamental particles, 1.23 × 10²⁰ fundamental quanta, and when you understand this you can understand all the outré features of quantum physics.
Please see the attached link for a more detailed mathematical explanation.
Article The formulation of harmonic quintessence and a fundamental e...
It’s very likely that the world is five-dimensional rather than 4D, and that the fifth dimension is the metrical scale of existence. I have shown that Bohm’s QM may be derived from GR if the metrical scale of the line element of GR oscillates at the Compton frequency. The wave function then becomes a modulation of the fifth dimension, i.e. the scale, and influences via the scale may be instantaneous, so that entanglement may occur via the scale. The collapse may therefore merely be an attempt to describe QM in the 4D world of GR; it is not needed if the world is 5D.
To Ilja Schmelzer,
Bohm's interpretation is indeed very appealing. It's a real temptation to admit it. However, this interpretation works with a preferred frame of coordinates. I only want to point out a weakness of this, as explained by Prof. Antoine Suarez: many of our experiments are with photons, and for electromagnetic waves there is no preferred frame.
This argument is a hard blow to the idea of a preferred frame, and it doesn't matter that experiments for finding preferred frames are difficult to do and their results not so conclusive, or all sorts of other discussions and digressions. A preferred frame is ruled out by the theory, and also by the Michelson experiment.
About a preferred frame:
A preferred cosmological reference frame may exist because of the CMB dipole. It would also be implied if the cosmological expansion is in the metrical scale of spacetime. This expansion mode agrees with all our observations and also explains the dark energy and the dark matter. Special Relativity rules this out, but suffers from the Twin Paradox, which in effect means that inertial frames are in different 4D spacetime manifolds in a 5D world.
Dear jean claude Dutailly,
In my theory there is no spacetime continuum; there is only time. The Lorentz transformation is a transformation from point to point. There is no objectivity, and no reciprocity principle.
http://dx.doi.org/10.14299/ijser.2014.10.002
http://dx.doi.org/10.14299/ijser.2014.04.001
A thoughtful and deep discussion of this and related matters that concern foundations of quantum mechanics can be found in the paper "An argument for ψ-ontology in terms of protective measurements" by S. Gao, http://arxiv.org/pdf/1508.07684.pdf and references therein.
Aren't we always trying to control waves to capture their energy? I have been able to do that in stock markets and the banking sector; please refer to my coauthored published papers on my RG page. I have also developed a computational physics for such waves (fluctuations, in economic-sciences terminology, which does not require them to be harmonically propelled) in my string-theoretic stock-market research papers on my RG page. This preserves the original Schroedinger concept of a thought experiment which should ultimately be observable. SKM QC
"reality or science fiction"
Neither, actually. It is nonrelativistic phenomenology. It is also strictly limited to the Schroedinger picture. In the Heisenberg picture, the quantum state does not evolve, so no collapse can happen in principle.
To Soumitra,
Sorry to use this thread to send a message, but the messaging function does not work.
On the same line that you explore (meaning the transposition of QM axioms to Economics) see my paper "Common structures in scientific theories" and "Estimation of the probability of transition between phases".
Best
With the exception of the well thought out comments of Mark Everitt, many appear to pursue avenues that display varying degrees of misconception of what the current state of understanding is. It is a matter of fact that any standard textbook on quantum physics will solve the Schrodinger equation for a number of simple systems to obtain the wavefunction, a mathematical object. This mathematical object does not describe collapse. If your position is that wavefunctions collapse, after 9 decades we are still waiting for a mathematical treatment of this.
I am sorry if, just in case, you are referring to my comment, Prof. Garofalo. Random fluctuations in economic sciences, which are distinguished from noise in the system by requiring that their means usually be non-zero, are propagated through a number of channels, which is a subject of macroeconomics research; but since the mainstream believes that they are equilibrium realisations of variables, they are propagated by the same market forces which dynamically propel the mean (equilibrium) GNP trend. SKM QC
@David G,
I would like your opinion on this idea. It is not built on stochastic anything but on some many body features of wavefunctions and the kinds of correlations in matter we observe on the classical scale.
Cliff
Article Quantum Measurement and Classical States
Dear Soumitra, Sorry but I was not referring to you in any way.
Dear Clifford,
It’s difficult to get a good sense of the ideas from your descriptions. For example, when you say that “It has already been long debated how general a measurement can be made from an arbitrary self adjoint operator…” you seem to be mixing the physical nature of the measurement process (which is unresolved) with the perfectly understood mathematical operation of finding real eigenvalues. You then attribute a special character to measurements of space and time, suggesting that others can be obtained from them. What does that exactly mean? Since energy, charge, mass etc. cannot be obtained from space and time simply on dimensional grounds, it’s not quite clear what you mean. Ultimately, your idea should boil down to the Schrodinger equation and something else, or something else entirely that reduces to the Schrodinger equation or can be built up from the Schrodinger equation. Can you produce that mathematical framework?
@David, there was a period when measurements were simply viewed abstractly as LCSA operators (as in Cohen-Tannoudji) and it was suspected that any such operator could correspond to a measurement. Now I don't think people take that last part seriously; it was a holdover from the formalist work of von Neumann and others.
You bring up an interesting point about mass and charge. I would say that for any quantitative measurement we are always looking at some sort of displacement of a dial, or some solid-state analog, like polarization on a plate that is detected digitally. This is what I mean. For example, in a mass spectrometer we select out various q/m ratios based on position measurements of where particles strike a set of detectors. For energy, we extract it from velocity measurements or calorimeter data. The first is found by a sequence of spatiotemporal measurements, and the second from temperature, which, I would argue, also comes down to a position measurement, though that requires a longer argument.
The slicing procedure I outlined follows from the Schrodinger equation assuming the classical world is described by the initial data as in the paper.
Information theory gives, I think, a transparent treatment of this issue. From this standpoint, a kind of "collapse" occurs classically. While the die is still rolling, your information is that each face has an equal chance of coming up. But when it stops, your information that a particular face is up replaces, or collapses, if you will, the old information. Viewed information-theoretically, a wave function is your prior information about the possible outcomes of some measurement, so its collapse upon measurement should be no surprise. Whether you call this reality or not depends on your philosophy. All of this has a more rigorous treatment in subjective probability; cf. Richard Jeffrey, Cambridge (2004).
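The rolling-die analogy above can be written out as an explicit Bayesian update. This is a toy sketch of the classical, information-theoretic "collapse" being described (the die and the observed face are invented for illustration): conditioning on the evidence zeroes out the ruled-out faces and renormalizes what remains.

```python
from fractions import Fraction

# Prior: while the die is still rolling, each face is equally likely.
prior = {face: Fraction(1, 6) for face in range(1, 7)}

def condition_on(dist, observed):
    """Bayesian conditioning: zero out faces inconsistent with the
    evidence, then renormalize -- the classical 'collapse' of the prior."""
    post = {f: (p if f == observed else Fraction(0)) for f, p in dist.items()}
    total = sum(post.values())
    return {f: p / total for f, p in post.items()}

# The die stops showing face 4; the old information is replaced.
posterior = condition_on(prior, 4)
print(posterior[4])  # 1
print(posterior[3])  # 0
```

The structural parallel to the quantum case is that both replace a distribution over possibilities with one concentrated on the observed outcome; whether the quantum case is *only* this kind of information update is, of course, exactly what the thread is debating.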
Dear all,
So, I understand from you that collapse is only a philosophical assumption and not a physical one, introduced just to interpret, or to connect, the mathematics of the wave function with the physical results of measurements, while in reality every particle has its own properties from the moment of its foundation. Am I right?
Sadeem,
"in reality every particle has its own properties from the moment of its foundation"
With this remark you finger a deep issue. There are two, very different, ways to understand Physics.
In the first, the traditional way, we have concepts such as fields and particles, and each of these objects is defined by properties which are related to measures. So a particle, by definition, has a definite charge, mass, location, trajectory, ... The problem is then to find a consistent and adequate representation (in the mathematical formalism) of these properties and their interactions with other objects. A model, representing a system, is then just the list of the objects and each of their properties, represented by variables, and one can conceive of the state of the system as the collection of maps (the variables) representing the objects over some region of spacetime. One can prove that the axioms of QM are properties of the mathematical models (and not of the objects).
In the second way, which is closer to a pure QM approach, the key object is the measure (meaning an operation which is precisely defined, like a "yes-no" answer to some question). These measures have properties of their own, but most of them can be related to a physical phenomenon, so this collection of potential measures can be seen as the properties of a physical object. A model is then essentially the values of the measures that could be taken at different steps, corresponding to the states of the system, and a prediction is a probability of transition between different states. The axioms of QM are then related to the rules which can be established in a formal representation of a set of measures.
Both points of view can be made consistent. There are philosophical (and deep) motivations to support one or the other.
I personally support the first one, mostly (there are also philosophical reasons) because it is intuitive and provides a clear framework in which one can build a narrative. It is a fact that our human minds crave simplicity and the hope of getting a picture of reality. The unending discussions about QM show that, whatever its merits, the second point of view is too abstract to gain an indisputable acceptance. But the worst position is to mix both points of view, and to strive to interpret the bizarre rules of calculation of QM as properties of a real world, as is done in all "physical interpretations" of QM (from Copenhagen to the multiverse).
No, Sadeem, you don't understand well.
You mix things as well. Which properties do you think a particle has from its foundation? Mass, charge, spin, lepton/baryon/etc. number? Yes, it has these properties.
But the collapse postulate deals with another concept: the quantum state of the particle, e.g. spin up or spin down, presence in one region of space or in another, linear momentum p or p', etc. When the description of a particle comprises an ambiguity about some property, e.g. it is not clear whether the particle has linear momentum p or p', we write this situation as a wave-function ~ α|p> + β|p'>.
Well, when measuring the linear momentum we get only one of these values, i.e. either p or p'. NOBODY knows precisely how nature works in picking a certain value. But John von Neumann thought that, despite this difficulty, we have to continue to work with QM. So he introduced the "collapse" postulate, which says that at a measurement of the linear momentum the wave-function "collapses" onto either |p>, or |p'>.
Still, how nature works, and, if we get the result p, what happens with the wave-packet |p'>, we DON'T KNOW at present. Does the wave-packet |p'> disappear?
Despite the fact that we don't know much about the wave-function (we only know how to write formulas), we are not pleased with the idea that the wave-packet |p'> disappears into nothing. For the moment we don't know enough; we wait for new experiments.
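The collapse postulate described above can be sketched numerically. The following is a minimal illustration, not anyone's theory here, of the standard Born-rule bookkeeping for a state ~ α|p> + β|p'>: each single trial yields p or p' at random with probabilities |α|² and |β|², and the post-measurement state is the corresponding eigenvector. The amplitude values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Superposed state ~ alpha|p> + beta|p'>, as in the discussion above.
alpha, beta = 0.6, 0.8                # illustrative amplitudes, |alpha|^2 + |beta|^2 = 1
probs = np.array([abs(alpha)**2, abs(beta)**2])

# Collapse postulate as a sampling rule: each single measurement of the
# momentum yields p or p' with Born probabilities, and the wave-function
# "collapses" onto the corresponding eigenvector.
outcomes = rng.choice(["p", "p'"], size=100_000, p=probs)
print((outcomes == "p").mean())       # close to 0.36 = |alpha|^2
```

Over many trials the observed frequencies approach |α|² and |β|², which is all the postulate predicts; it says nothing about the single-trial mechanism, which is exactly the open question discussed in this thread.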
Sadeem,
You have above an example of a mixture which is lamented as impossible to understand because it is not consistent.
In the first point of view, the properties of a particle do not depend on the observer, but their measure depends on the observer. This is just like momentum: it involves the spatial speed, which is relative, but one can claim that there is an intrinsic property (say the mass m) which does not depend on the observer. In a relativistic picture a particle has a velocity u, which is a vector that does not depend on any "frame" (observer), so we have a clear definition of a relativistic momentum p = mu.
This is quite simple when there is no rotation involved. Rotations in a 4-dimensional universe are quite different from the usual space rotations. If we assume that there is an intrinsic relativistic momentum for a particle, encompassing the full motion (translation and rotation), then we must opt for a more complicated representation, by spinors. And we get particles / antiparticles, spin up / down, and spin.
We have a similar representation for the charges, in gauge theories, where there is a group which represents the action of the force fields. It works also with gravitation.
The quantization of these properties is then done through the maps which represent the different variables. One can define a wave function, which has no physical meaning, but there is no "collapse of a wave function". Actually one can represent the state of a particle by a vector which easily replaces the usual psi in all computations, including the Schrödinger equation. So we have a "classic" formulation, which follows all the prescriptions of the axioms of QM, that is quite clear but requires a bit of mathematical work (to understand spinors and fiber bundles, which are the usual components of the Standard Model), and most of all requires understanding well what the representation of physical phenomena means.
What is reality? We do not know, but we can build efficient representations of this reality. And one can require that this representation give a clear picture, one that we can grasp. Unfortunately, for decades teachers have been telling us that this is impossible; no surprise that they cannot understand it themselves.
Indeed, all we need is a map. But what do we do with this map, and what is its content?
"A secondary observable is a linear map $\Phi \in \mathcal{L}(V;V)$ valued in a finite dimensional subspace of $V$ such that $\hat\Phi = Y \circ \Phi \circ Y^{-1}$ is a normal operator: $\hat\Phi \circ \hat\Phi^* = \hat\Phi^* \circ \hat\Phi$, with the adjoint $\hat\Phi^*$." (Mathematics in Physics (v.2), Jean Claude Dutailly)
Indeed, the wave-function is problematic: for one and the same process one can write more than one wave-function (I encountered this situation in nuclear physics). But more mathematics is not helpful unless it is preceded by an explanation of the phenomenological mechanisms and of the ideas that the mathematics describes.
Physics is not mathematics - see Feynman's video-lectures. Feynman explains everything physically. Physics is phenomenology. Of course mathematics is very helpful, nobody says otherwise. But without clear explanations of what it describes, it makes no sense.
If there is a claim that we can evade the collapse - an escape to be saluted - this claim should be clearly explained. Formulas in themselves are not a way; somebody has to explain what the formulas try to say.
To measure any property of a particle, we need to disturb it. So when we measure the property, we measure the changed property. Clearly the change of state is due to the disturbance, so a disturbed system cannot give the original property of the system. All we can do is limit the disturbance, so a perfect measurement is not possible; there is always some perturbation.
Think about the theoretical framework of a particle in a 1D box. We have assumed that at the boundary the wave function vanishes. Reality says this does not occur. A vanishing wave function means there is no particle! Yet an infinitesimally small distance from the boundary the wave function has some non-zero value. How could this be possible?
Surely a complete theory is needed.
@Mohamed: You are welcome to your philosophy and it seems you agree that it doesn't have a lot to do with physics.
James,
If you don't understand what people say, you can rather ask instead of mocking.
Sofia,
At least you find the time in your busy schedule to read some pages of my paper.
Of course there is no physics in the definition of an observable, because the axioms of QM are not about a physical representation; they are about mathematical representations. As is clear from reading the usual axioms, you will not find any hint of a physical phenomenon.
The most prominent idea (that I've seen) out there as to what quanta are, what is "happening" at the quantum level, is consciousness. This is probably just speculation though (and still is, right now) around the idea of a "measurement" being performed.
Even if the question "what is consciousness?" isn't really answered by "quantum mechanics", it still leaves something to be desired. We still have so little research into the construction of consciousness that it would just be shoving the "what is quantum mechanics?" question even deeper.
The wave function is a mathematical tool used to calculate probabilities in quantum mechanics; that is, it makes mathematical statements about ensembles of particles, not individual ones. "Wave function collapse" or "reduction of state" is a concept resulting from a misinterpretation of the mathematical formalism of quantum mechanics, a historical artifact from the beginnings of modern quantum theory. It became necessary for those trying to interpret the wave function as a physically real "field" (Copenhagen interpretation), as otherwise they could not explain the fact that one detects particles as opposed to waves, and that the same experiment can have many outcomes.
For further information on this, see section 4.2 and 9.3 of:
Leslie E. Ballentine, "Quantum Mechanics: A Modern Development", World Scientific, Singapore, 1998.
Exactly what I was trying to get at, Artur. The fact is, quantum physics, like all other physics, is simply a mechanism (like Newtonian mechanics) we use to interpret (not necessarily "understand") the way the world works. There's nothing inherently special about it except that it does a really good job of predicting interesting physical phenomena - like how thermodynamics led to the combustion engine. Chemistry is a construct of quantum mechanics which has been simplified to a science we teach often in high school, and in some cases elementary school.
Quantum mechanics today is difficult because it involves differential equations. Diffeq involves calculus. Calculus involves geometry, multiplication and addition. Algebraic multiplication and addition involves arithmetic. Going from last to first, some children will go all the way, whereas others may struggle with algebra in their senior year of university!
Like you said, getting rid of the intimidation factor will require a huge education overhaul. Until then, many will be stuck trying to build wave mechanics using nothing but arithmetic and music, and interpreting the probabilistic nature with nothing but gambling...
Dear All,
To understand well what the collapse of the wave-function is, the Heisenberg uncertainty principle, and everything about quantum theory and gravity, review my new paper. It will be published soon in a peer-reviewed journal. The unified theory becomes very simple!!!
http://vixra.org/abs/1509.0059
In the quantum world, a system such as an electron or an exciton can exist in several locations at the same time. There is the view that the electron wave function, which spans several positions, automatically collapses into a single spot when measured by a conscious external observer. It thus appears that the electron "knows" that it is being measured; furthermore, it seems to also know that the observer is not blind, if the observer happens to be a human being who is able to record its location. Quantum weirdness indeed in full operation here. Then what is the state of the electron prior to measurement? Does the electron expect some form of external intervention before collapsing? How does it know that someone out there is watching it?
Zurek's Quantum Darwinism model is a nice attempt to explain this bizarre state of affairs. Zurek's model employs the decoherence idea to show that the electron interacts with its environment, which acts as a constant observer. As is well known, in a classical world it is impossible to decouple the environment from objects.
Zurek's quantum Darwinism uses the idea of natural selection, where the fittest states are selected on the basis that they survive and propagate best under the existing environmental conditions. Other possible states are ignored. The states that eventually survive are located indirectly by examining select regions of the environment. Hence "objective" classical states which are not affected by measurements appear to arise within a quantum domain. It appears that the classical world experience is born from a quantum background. This seems rather counterintuitive. But at least, if we argue using Zurek's Darwinism model, the collapse of the wavefunction cannot be a reality.
@Mohamed -- I apologize and wish also to point out this isn't what I meant. It was intended to be a statement about what constitutes physics.. More directly physics is intended to be an examination of the real world with intentions to perhaps partially explain what's happening. An axiom of theory or model building is called Occam's razor: the theory should carry no more assumptions than necessary. Case in point: Newton defended his model of gravity against the true criticism that it was non-causal by saying this was an unnecessary hypothesis: his intention was only to describe nature.
Let me just point out that indeed, reduction of the wave packet can follow from unitary evolution. Consider, as in Nevill Mott, "The Wave Mechanics of α-Ray Tracks", Proceedings of the Royal Society (1929) A126, pp. 79-84, an alpha particle emitted as a spherical wave in a medium, the atoms of which can be ionised by the alpha particle. Mott shows that the spherical psi wave will, in an excellent approximation, only ionise atoms on a line. That is, the final wave function is a coherent superposition of atoms ionised along all possible straight lines, correlated with the alpha particle in an approximate momentum eigenstate in the same direction as the ionised ray. A reduction to a purely probabilistic uncertainty then arises from the fact that it is impossible to measure coherence between two rays of ionised particles.
This is just to say that there are straightforward ways to analyse the measurement process, which go back quite a while. Saying that everything in measurement is mysterious and unsolved is misleading.
The truly non-obvious issues are those involving measurement of entangled particles at a distance, which seem to involve nonlocality. About this I prefer to remain noncommittal.
Levraz, thanks for your coherent explanation of how coherence arises! This is exactly what I was attempting to get to in my post earlier.
At this point in time, the issue with quantum mechanics is not figuring out how classical mechanics arises, but rather how general relativity does. That's a bit of a simplification, but it is the focus of theoretical physics today. There just isn't an obvious, easy way to approach the problem; theories such as string theory, spin foams, and the AdS/CFT correspondence all attempt this, but 4-dimensional spacetime ends up being an emergent phenomenon, not an inherent, obvious facet of the space. Preserving locality means changing our perspective on what the geometry of spacetime is in the first place.
Like Levraz concludes however, there isn't a coherent picture here yet. It's best to hold back on choosing a particular description of entanglement collapse and staying with it; it'll only lead to confusion along the road.
Henry Tregillus said:
“Preserving locality means changing our perspective on what the geometry of spacetime is in the first place.
Like Levraz concludes however, there isn't a coherent picture here yet. It's best to hold back on choosing a particular description of entanglement collapse and staying with it; it'll only lead to confusion along the road.”
I agree with this comment, and believe that we have not yet realized that there is another dynamic dimension to the world beyond the four dimensions of space and time.
In all current physics very little attention is paid to the progression of time, and to the question of what is causing it. How can we ever understand and model the world without understanding what is causing the progression of time?
The progression of time may take place by changing the scale of 4D existence, i.e. the cosmos may expand in time as well as in scale by making the second longer. This implies the possibility of perpetual existence with all cosmological energy generated by the ever slowing progression of time. I have shown that this Scale Expanding Cosmos (SEC) model agrees with all astronomical observations and resolves the Dark Energy and Dark Matter issues.
Furthermore, I have also shown that Bohm’s QM may be derived from an oscillating scale for the Minkowskian line-element of GR. This may explain non-locality in our current 4D world as nothing but a misunderstanding; it would not exist in the 5D world where influences may exist via the fifth scale dimension.
The “collapse of the wave function” may therefore be caused by trying to explain the world by our current 4D physics, which is impossible!
Everyone who would like simple answers may begin by reading the already existing literature on the foundations of quantum mechanics, including the paper I referred to in my previous message, and many other papers. Many bright minds have thought about and worked on the mathematical and physical foundations of quantum mechanics. One cannot expect to be successful in quantum mechanical research if one does not know or understand the rigorous results that have already been proved. It's not a matter of voting for or against some proposition; it is a matter of education and careful mathematical analysis of physical reality.
Suppose a measurement gives as a result that the system is in a given eigenstate. Then a second measurement that immediately follows the first, before a certain relaxation time passes, must give the same answer. That is why it is said that the wave function has collapsed in the first measurement... in that sense it is real. OK?
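The repeatability argument above is exactly what the projection rule encodes: after the first measurement the state is an eigenvector, so an immediate second measurement reproduces the result with probability 1. A minimal numerical sketch (the two-outcome observable and the initial state are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# A two-outcome observable with orthonormal eigenvectors as columns.
e = np.eye(2)
psi = np.array([0.6, 0.8])           # normalized initial state

# First measurement: Born-rule sampling, then projection (the "collapse").
p = np.abs(e.T @ psi) ** 2           # outcome probabilities [0.36, 0.64]
k = rng.choice(2, p=p)
psi_after = e[:, k]                  # state is now the k-th eigenvector

# Immediate second measurement (before any relaxation / evolution):
# the same outcome k occurs with probability 1.
p2 = np.abs(e.T @ psi_after) ** 2
print(p2[k])   # 1.0
```

This only shows the internal consistency of the projection rule; whether the projection itself is physical is the question under debate in this thread.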
The response by Juan Weisz is thought provoking.
Lindblad (1975) in his entropy inequality theorem showed that the quantum relative entropy diminishes after a measurement. Even if the change is very small, one can still employ the relative entropy tool to distinguish two sequential measured states. So the only way you are going to get the same measurements is if you impose some external operation that reverses the effect of the first measurement. Of course, depending on the precision of the measurement process and the time lapse between two measurements, the decrease of relative entropy may be so small that you hardly need these recovery operations to record same readings. In theory, the readings can't be the same.
Nevertheless it would be interesting to seek a link between the collapse mechanism and changes in relative entropy. How can this link be quantified?
Dear Juan
The collapse of the wave function under successive measurements at short times can also be understood as supporting evidence for the reality of quantum mechanics, because here you have a problem with an initial boundary condition that is already known, and the measurement does not change it. In a measurement that takes a longer time the matter is different, because the interaction of the elementary particles with the potential will have sufficient time to happen. The problem is that this interaction happens in a complex spacetime that is not describable by the traditional spacetime, so the results here will be only expectation values and not certain ones, as in the former case.
Following the way QED is constructed via Fock space methods (applied to the Dirac field and the e.m. field), one recognizes that in such a set-up of QED, which in a limit contains the (non-relativistic) QM of charged objects (electrons) interacting with the e.m. field, the postulate of the 'collapse of the state/wavefunction' does not occur at all. Then, whenever 'probabilities' are calculated from 'probability amplitudes' and from expressions like |&lt;phi|psi&gt;|^2, one can ask what else might be the physical nature of the wave function, and whether there are some other reasons, for example some 'intrinsic' random, quasi-stochastic or highly chaotic unknown properties of the systems (particles) under consideration, which might explain why |&lt;phi|psi&gt;|^2 leads to reasonable results for the statistics observed in experiments.
Usually in axiomatic QM the collapse credo is used to 'prove' that the state vector shortly after a measurement must be an eigenvector of the operator corresponding to the observable measured, and therefore eigenvalues and eigenvectors are considered to be of great importance in QM. However, the relevance of eigenvalues and eigenvectors of s.a. operators with respect to the discrete energy levels of excited atoms or molecules and the discrete energy spectrum of emitted photons can be founded mathematically by considering properties of the square-variance &lt;v|A^2|v&gt; - &lt;v|A|v&gt;^2 of s.a. operators A and vectors v in a Hilbert space. A more general axiom of QM would be to postulate a minimum principle: the system tries to get into a state where, during and/or after the interaction, the square-variance of the energy (represented by an s.a. operator in a Hilbert space) and of some other (commuting/anticommuting) s.a. operators is minimized (thereby even opening a way to handle constraints in numerical calculations for obtaining approximations to eigenvalues and eigenvectors).
My understanding of entanglement is that what we call the "separate but entangled particles" are in fact described by one state vector (wavefunction) that cannot be decomposed into the product of two distinct wavefunctions (which I interpret as meaning the entangled particles are not "parts" of a unified system or "segments" along a unified path), and in some sense are, quantum mechanically, only one "entity." I think this upsets our notion of space as individuating objects, but it restores locality to the example where the "particles" interact with something (which doesn't have to be an observer or measurement device) and "collapse" into some eigenvalue. Entanglement in this sense can be better pictured in QFT, where the fields are fundamental and the particles are excitations of the fields. Or entanglement can even be interpreted as evidence for an additional dimension (where we see the separate "parts" of an entity as entangled particles), but this gets to be a little fringe.
What entanglement really upsets, I think, is the notion that entities in isolation (not interacting) have specific valued properties or eigenvalues (what writers on Bell's theorem unfortunately call "realism"). Entangled particles give an excellent experimental example of states as superpositions of eigenvalues. My understanding of the results of testing Bell's inequalities is that we need to abandon either locality or this sense of realism, and most have chosen to abandon realism to keep consistency between QM and SR.
In terms of "collapse" I think Zeh's decoherence picture makes the most sense, as it generalizes the old "observer" based terminology to interactions with other systems.
Karl,
Do you really expect your answer to be understood?
"relevance of eigenvalues and eigenvectors of s.a. operators with respect to the discrete energy levels of excited atoms or molecules and the discrete energy spectrum of emitted photons can be founded mathematically by considering properties of the square-variance &lt;v|A^2|v&gt; - &lt;v|A|v&gt;^2 of s.a. operators A and vectors v in a Hilbert space."
You could have continued this sentence a couple of lines more. But, if you want to explain something, make your explanation readable. As to your last sentence (also of many lines):
"A more general axiom of QM would be to postulate a minimum principle: The square-variance of the energy . . . and some other self-adjoint operators is minimized . . . "
You didn't say over which population of values to minimize. You forgot that the collapse is supposed to occur in every single trial of the experiment (e.g. with a single pair of photons). Which population of values do you have in a single trial for calculating the square variance, let alone for minimizing it?
Sadeem,
you ask :
why we don't adopt realistic quantum mechanics if there are new theories that can succeed in interpreting the quantum phenomena and respect locality and causality?
There is no theory that respects locality and causality and also correctly reproduces the results of quantum mechanics. I mean, there are many theories that try to do so, but none succeeds. The quantum world is non-local; this non-locality was proved and explained, and it is no longer a mystery to us.
As to causality, this is a big problem. The future should not influence the past; this is a law of our universe. Now, the collapse seems to violate this law, but nothing is sure. We don't understand the mechanism of the collapse; there are many arguments, many opinions, but, bottom line, we don't know enough.
@Sofia D. Wechsler:
From linear algebra it is known that in a finite-dimensional vector space, for a Hermitian op. A:
- &lt;v|A^2|v&gt; - &lt;v|A|v&gt;^2 = 0 if and only if the (normalized) vector v is an eigenvector of A;
- if v is not an eigenvector of A, then &lt;v|A^2|v&gt; - &lt;v|A|v&gt;^2 &gt; 0.
I think that this has also been proved for selfadjoint operators in a Hilbert space.
This means that &lt;v|A^2|v&gt; - &lt;v|A|v&gt;^2 can be considered as a kind of 'measure' of whether a given vector is a good or a bad approximation to one of the (unknown) eigenvectors of A.
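The square-variance criterion above is easy to check numerically. A small sketch with a 2x2 Hermitian matrix standing in for a selfadjoint operator (the matrix itself is an illustrative assumption): the square-variance vanishes on an eigenvector and is strictly positive elsewhere.

```python
import numpy as np

# A small Hermitian matrix standing in for a selfadjoint operator.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
evals, evecs = np.linalg.eigh(A)     # eigenvalues 1 and 3

def square_variance(A, v):
    """<v|A^2|v> - <v|A|v>^2 for a normalized vector v."""
    v = v / np.linalg.norm(v)
    return v @ (A @ A) @ v - (v @ A @ v) ** 2

print(square_variance(A, evecs[:, 0]))           # ~0 (eigenvector, up to rounding)
print(square_variance(A, np.array([1.0, 0.0])))  # 1.0 (> 0: not an eigenvector)
```

The second value is positive because (1, 0) is a superposition of the two eigenvectors; minimizing the square-variance drives a vector toward an eigenvector, which is the content of the proposed minimum principle.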
The proposed "minimum principle" as a substitute for the collapse axiom seems reasonable because it refers definitely to the energy operator (and commuting operators) of the total interacting system, and it would allow one to handle constraints on the total set of vectors in Hilbert space which might result from neglecting some part of the total system. (As the "total system" consider, for example, two colliding particles, or a particle and a 'measuring device'.)
If one considers the collapse axiom as too stringent and is looking for a substitute, then this "minimum principle" might be helpful, maybe at least in collision processes of particles where energy transfer occurs from one subsystem to another.
@Karl,
it seems that we don't speak of the same thing. Do you know how a measurement of an operator Ȃ is done? I suppose you do, but I'll repeat it below.
We first of all decompose the wave-function ψ into eigenvectors of Ȃ. The measurement apparatus is sensitive to the eigenvalues of Ȃ. The collapse takes the wave-function ψ onto one of these eigenfunctions.
Along a series of trials one can see that the eigenvector of Ȃ with the maximum absolute square of the amplitude in ψ (let's name it u) appears most frequently. But if in each and every trial the collapse process were to calculate the most probable eigenvector of Ȃ in ψ, then ψ would collapse, in all the trials, only onto u.
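The contrast above between the two rules can be made concrete. The sketch below (the amplitudes are illustrative assumptions) compares Born-rule sampling, which reproduces the observed frequencies |c_m|², with a hypothetical "always pick the most probable eigenvector" rule, under which every trial would give u:

```python
import numpy as np

rng = np.random.default_rng(2)

# Amplitudes of psi in the eigenbasis of the measured operator (illustrative).
amps = np.array([0.8, 0.6])
probs = np.abs(amps) ** 2                 # [0.64, 0.36]

# Born-rule collapse: each trial picks an eigenvector at random.
born = rng.choice(2, size=50_000, p=probs)

# Hypothetical deterministic rule: every trial picks the most probable one (u).
argmax = np.full(50_000, np.argmax(probs))

print(np.bincount(born, minlength=2) / born.size)      # close to [0.64, 0.36]
print(np.bincount(argmax, minlength=2) / argmax.size)  # exactly [1.0, 0.0]
```

The deterministic rule contradicts the observed statistics, which is precisely the objection raised above to any "most probable eigenvector" mechanism for single trials.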
@ Sofia,
Do you think that a 'measurement' of any 'operator' can be done without any change and interchange of some physical 'quantities' of some 'physical objects'?
The mysticism of axiomatic QM has been well known to me for about 40 years.
Even today it is worthwhile to have a look at v. Neumann's "Mathematische Grundlagen der Quantenmechanik" (1st ed. 1932), especially his very general arguments against "hidden variables" and against "causality" (ch. IV, especially pp. 171-173). v. Neumann's axiomatic approach to QM seems to be one of the origins of the never-ending controversies concerning the extremely general way the 'measurement' process is supposed to be treated in QM.
Whenever a physical object is 'measured' by 'someone' with a 'measuring device', at least two systems come into contact and INTERACT, interchanging some energy, momentum, angular momentum, spin, or charge; by repeating such measuring processes one obtains a bulk of statistical data characterizing the ensemble of processes, which QM tries to handle, to see how these statistical data are related to supposed 'properties' of the individual objects.
But the physical ideas concerning interactions, the nature of particles and the nature of "empty" space have changed in the decades since the birth of axiomatic QM. And if "empty" space turns out to be science fiction, then every particle would interact with something, to an amount which is noticeable or not noticeable by means of our ('good' or 'bad') 'measuring' devices.
Even in the case of the e.m. field, when it is quantized according to Fock space methods without any axiomatic QM stuff, it turns out there are vacuum expectation values for the square of the electric field strength which do not vanish. This, and the Casimir effect, give rise to the idea of vacuum fluctuations filling "empty" space, and to efforts to describe them in physical theories. What the (highly chaotic? stochastic?) dynamics of the vacuum fluctuations might be, and what else might lie beyond vacuum fluctuations, can only be guessed (with some ingenious intuition) in the sense of some ansatz, hoping that it leads to results recognizable in the macroscopic world of measuring devices.
If vacuum fluctuations are part of reality then there is no "empty" space; interactions are always and everywhere going on, and an exchange of some extremely tiny amounts of energy and momentum happens, even in any 'measuring device'. Therefore, the physical object to be measured, the 'measuring device' and the 'environment' should not be considered absolutely free from a tiny chaotic or stochastic nature.
And this also means that the very general ideas concerning the vaguely stated 'measuring' process in axiomatic QM have to be revised completely, from the very beginning.
Much more has to be done than Bell and his followers have done. Nevertheless, whether or not my proposal of a "minimum principle" is useful as a substitute for the collapse axiom, or for abandoning the collapse axiom, or for abandoning axiomatic QM completely, is not part of my primary interests in theoretical physics.
@Karl,
what shall I take from your answer? Your proposal of a "minimum principle", or your saying that you are not interested?
I am replying here to your proposal, and to the fact that you are not interested I can't reply.
So, you say
"Whenever a physical object is 'measured' by 'someone' and by a 'measuring device' at least two systems come in contact and INTERACT and interchange some energy, momentum, ang.momentum, spin, charge, and in repeating such measuring processes one obtains a bulk of statistical data . . ."
I again remind you that we speak of collapse in single trials of the measurement. The fact that we repeat the trial is not relevant.
You conclude:
"Therefore, the physical object to be measured, the 'measuring device' and the 'environment' should not be considered absolutely free from a tiny chaotic or stochastic nature."
In short, you propose hidden variables. Well, local hidden variables were ruled out, and non-local hidden variables are as enigmatic as the collapse.
@Sofia,
A comprehensive survey of the various attempts to resolve interpretational problems of QM can be found in "Do We Really Understand Quantum Mechanics?" by F. Laloe (Cambridge,2012).
In ch. 10.1.2 and appendix G the so-called "correlation interpretation" is presented briefly; it allows for eliminating the postulate of state vector reduction, provided that a postulate concerning the eigenvalues and eigenvectors of linear operators as being the possible values of physical quantities in measurements is maintained.
It is exactly this postulate which might be replaced by the proposed "minimum principle" for interaction processes in particle collisions. If you do not like this proposal then just forget about it.
http://www.cambridge.org/us/academic/subjects/physics/quantum-physics-quantum-information-and-quantum-computation/do-we-really-understand-quantum-mechanics
Dear @Karl,
Do you know how many people say "read this article", or "read this chapter", etc.? I did so in the past, out of politeness, and usually I found that I had wasted my time.
People are busy. Personally, I have so many things on my head that I don't know when is night and when is day.
Nobody would read a material just because you, or someone else, indicates it. If you appreciate that material and recommend it - very kind on your part - please say in a few lines what the essence of that material is. If the essence seems convincing, then people will read the material to get the details. Let me tell you that over many, many years I read tons of material, ideas, proposals. NONE succeeded in beating the collapse.
You see, the collapse postulate is outside quantum mechanics (QM). The only serious proposals known "on the market" for getting rid of the collapse are Bohm's mechanics and the spontaneous localization theory of Ghirardi. There is also a certain "flash theory" of Roderich Tumulka. But all these proposals suffer from big problems. I also read a proposal that the collapse is caused by "vacuum fluctuations". Unfortunately, it doesn't work: vacuum fluctuations are as good as local hidden variables.
The explanations you gave until now also look like hidden variables. If Laloe puts forward an idea which is not hidden variables, please tell me in your own words what is that idea.
Kind regards from me,
Sofia
The "correlation interpretation" (CI) is a way to "avoid" the collapse axiom, and CI has nothing to do with "hidden variables".
Following Laloe's book, what is done in CI is to examine very carefully the mathematical ingredients of the collapse axiom, thereby showing that the collapse axiom is mathematically equivalent to a sequence of subsequent measuring processes at times t_1, t_2, t_3, ..., together with the assumption that the probability in each step is given by the Born probability amplitude.
The math. ingredients are:
(1) the time evolution of the state vector |psi(t_1)> = U(t_1,t_0) |psi(t_0)> according to the unitary op. U(t,t_0);
(2) the projection ops. in the Heisenberg picture, P_m(t) := U*(t,t_0) |phi_m><phi_m| U(t,t_0), where the |phi_m> are the eigenvectors characterizing the possible results of a measuring process of a selfadj. operator M (corresponding to an observable quantity of the system);
(3) a sequence of measuring processes of obs. M, N, ..., Z at times t_1, t_2, t_3, ...;
(4) the 'conditional' probabilities for the outcomes of these successive measurements.
Furthermore, in CI it is shown that this result (i.e. the equivalence proof) can also be obtained by using so-called "generalized Born probability amplitudes", which include the measuring devices, as well as the system to be measured, in the description of the measuring processes at times t_1, t_2, t_3, ...
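The claimed equivalence can be checked numerically for a small system. Below is a minimal sketch of my own (not taken from Laloe's book; the initial state, unitaries, and observables are arbitrary choices): the joint probability of two successive outcomes computed with Heisenberg-picture projectors applied to the initial state agrees with the step-by-step collapse-plus-Born-rule calculation.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(dim):
    # QR decomposition of a random complex matrix yields a unitary matrix
    m = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(m)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # fix column phases

# Qubit initial state and the evolution/measurement ingredients
psi0 = rng.normal(size=2) + 1j * rng.normal(size=2)
psi0 /= np.linalg.norm(psi0)
U1 = random_unitary(2)       # evolution from t_0 to t_1
U2 = random_unitary(2)       # evolution from t_1 to t_2
basis_A = random_unitary(2)  # columns = eigenvectors |a_m> of observable A
basis_B = random_unitary(2)  # columns = eigenvectors |b_n> of observable B

i_a, i_b = 0, 1              # pick one outcome for each measurement
a = basis_A[:, i_a]
b = basis_B[:, i_b]

# (i) Step-by-step collapse: evolve, apply the Born rule, renormalize, repeat.
psi1 = U1 @ psi0
p_a = np.abs(a.conj() @ psi1) ** 2              # Born probability of outcome a
psi_after = a                                    # collapsed (renormalized) state
p_b_given_a = np.abs(b.conj() @ (U2 @ psi_after)) ** 2
p_joint_collapse = p_a * p_b_given_a

# (ii) Heisenberg-picture projectors applied to the *initial* state,
# with no renormalization in between (the "correlation" form).
P_a_t1 = U1.conj().T @ np.outer(a, a.conj()) @ U1
U20 = U2 @ U1                                    # evolution from t_0 to t_2
P_b_t2 = U20.conj().T @ np.outer(b, b.conj()) @ U20
chain = P_b_t2 @ P_a_t1 @ psi0
p_joint_projectors = np.linalg.norm(chain) ** 2

print(p_joint_collapse, p_joint_projectors)      # the two numbers agree
```

The agreement follows from unitarity: the chained vector is U1* U2* |b> times the amplitudes <b|U2|a> and <a|U1|psi0>, so its squared norm is exactly the product of the two Born probabilities.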
Concerning CI, Laloe's book cites a publication of E. P. Wigner:
E.P. Wigner "The Problem of Measurement", Am.J.Phys.31,6-15,(1963).
For details see Laloe's book.
For those who do not want to bother with the axiomatics of QM and who do not have any reservations against "new" stochastic approaches to QM and QFT it might be worthwhile to have a look at
"Nelson mechanics":
E.Nelson, "Derivation of the Schroedinger equation from Newtonian mechanics", Phys.Rev.150,1079-1085(1966)
"THE MYSTERY OF STOCHASTIC MECHANICS", https://web.math.princeton.edu/~nelson/papers/talk.pdf
and
"Path Integral Quantization and Stochastic Quantization" by M.Masujima, Springer (2000,2009) , https://archive.org/details/springer_10.1007-3-540-48162-1
Thank you, @Karl,
I will try to have a look at Laloe's book. I have also had the thought that the collapse is a whole process. In my mind, the collapse involves more and more particles until the bulk of material involved becomes macroscopic. But there are difficulties with this line of thinking.
I have no reservation against any idea whatsoever, the only requirement is that the idea be rigorous, i.e. it should not clash with other laws of the physics.
I have though a question. You mention
"a sequence of subsequent measuring processes at times t1, t2, t3,... and the assumption that the probability in each step is given by the Born probability amplitude."
I assume that you know that we can measure a quantum system only once: after a measurement, the system is disturbed. So, at times t1, t2, t3, the measurements cannot be ones that involve collapse. The question is how one can do this. It is not simple.
Kind regards from me,
Sofia
@Sofia,
The idea of "CI" is to measure op. A at time t_1, the resulting state being |a>, then to measure op. B at time t_2 > t_1, the resulting state being |b>, then to measure op. C at time t_3 > t_2, ..., and finally an op. Z, the resulting state being |z>. Then the 'conditional' probabilities in each step are considered, as are the projection ops. onto all these states in the Heisenberg picture. Indeed, it is a little bit tricky. Some (but not all) of the details can also be found here: http://arxiv.org/pdf/quant-ph/0209123v2.pdf , page 65/66.
Dear @Karl,
As I said already, I am terribly busy. I thank you a lot for the references, but I have no chance to get into them.
Now, we speak of the collapse, don't we? Well, let's begin with a wave-function |ψ> which, expanded into a superposition of the eigenstates |ai> of the operator Ȃ, becomes
|ψ> = Σ_i A_i |a_i>
where the A_i are constants. If at a time t1 you measure the operator Ȃ, the wave-function |ψ> COLLAPSES onto one of the eigenstates |a_i>, for instance |a_5>. This eigenfunction preserves no memory of the fact that initially we had the function |ψ>. We could have measured another wave-function than |ψ> and still obtained |a_5>.
A subsequent measurement of an operator B̂, at the time t2, is done on the function |a_5>, not on |ψ>. It implies the same procedure,
|a_5> = Σ_i B_i |b_i>
where the B_i are constants. The new measurement COLLAPSES the eigenfunction |a_5> onto one of the eigenfunctions |b_i>, e.g. onto |b_17>. The eigenfunction |b_17> is not the result of a unitary evolution of the function |ψ> from t0 to t2, because in between we had a collapse due to the measurement at t1. If we further expand
|b_17> = Σ_i C_i |c_i>
and measure the operator Ĉ at t3, we cannot say that between t0 and t3 we had a unitary evolution of |ψ>, because we had two collapses in between, one of |ψ> and one of |a_5>.
The fact that we write the unitary evolutions of the eigenvectors doesn't help, because the wave-function |ψ> doesn't evolve unitarily.
Bottom line, one cannot explain the collapse by a series of collapses. Logically, one doesn't explain a concept using that same concept.
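The "no memory" point can be illustrated with a short numerical check. This is a sketch of my own, with arbitrarily chosen states: two different initial states collapse onto the same eigenstate with different probabilities, but once collapsed, every subsequent measurement has identical statistics.

```python
import numpy as np

# Two *different* initial states of a qubit
psi = np.array([1.0, 1.0]) / np.sqrt(2)
phi = np.array([np.sqrt(0.9), np.sqrt(0.1)])

# Measure A in the computational basis; suppose both runs yield outcome |a> = |0>
a = np.array([1.0, 0.0])
p_a_from_psi = np.abs(a @ psi) ** 2   # Born probability starting from psi
p_a_from_phi = np.abs(a @ phi) ** 2   # Born probability starting from phi

# After the collapse both systems are in the SAME state |a>, so any later
# measurement (here: projection onto an eigenvector b of a second observable)
# gives identical statistics -- no memory of whether we started from psi or phi.
b = np.array([1.0, 1j]) / np.sqrt(2)
p_b_after_psi = np.abs(b.conj() @ a) ** 2
p_b_after_phi = np.abs(b.conj() @ a) ** 2

print(p_a_from_psi, p_a_from_phi)      # different collapse probabilities
print(p_b_after_psi == p_b_after_phi)  # identical post-collapse statistics
```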
Again kind regards from me,
Sofia
@Sofia,
Note that the state vector |psi(t)> between the times of the measurements, i.e. for t_1 < t < t_2, t_2 < t < t_3, ..., evolves according to the time evolution operator U(t,t'). Note also that in this 'slicing' procedure the norm of any intermediate state vector is not restored, as opposed to what one usually does with the state vector reduction. As I said above, it is a little bit tricky. If you can disprove Laloe's statements concerning CI, then please feel free to contact the author of http://arxiv.org/pdf/quant-ph/0209123v2.pdf .
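The remark that the norm is not restored can be made concrete. A small sketch of my own (a qutrit with arbitrarily chosen states and a simple rotation as the intermediate unitary): the squared norm of the unnormalized chained vector directly equals the joint probability obtained from the usual collapse-and-renormalize recipe.

```python
import numpy as np

# Measure A at t_1 (outcome |a>), then B at t_2 (outcome |b>),
# with a unitary evolution U between the measurements.
psi0 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

Pa = np.outer(a, a)  # projector |a><a|
Pb = np.outer(b, b)  # projector |b><b|

# Slicing WITHOUT restoring the norm: the intermediate vector stays
# unnormalized, and the squared norm of the final chained vector is
# the JOINT probability of the two outcomes.
chain = Pb @ U @ Pa @ psi0
p_joint_unnormalized = np.linalg.norm(chain) ** 2

# Usual collapse recipe WITH renormalization after the first measurement:
p_a = np.linalg.norm(Pa @ psi0) ** 2
state_after_a = (Pa @ psi0) / np.linalg.norm(Pa @ psi0)
p_b_given_a = np.linalg.norm(Pb @ U @ state_after_a) ** 2
p_joint_renormalized = p_a * p_b_given_a

print(p_joint_unnormalized, p_joint_renormalized)  # equal
```

The renormalization after each step and the division it undoes cancel in the product, which is why the two bookkeepings agree.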
@Karl
I don't need to contact Laloe. Between t1 and t2 the state vector evolves unitarily. However, at t2 it undergoes collapse, because you measure it. After that, i.e. between t2 and t3, it also evolves unitarily. But at t3 there is a collapse because you measure, and what you measure is no longer the initial wave-function, but the state onto which it collapsed at t2.
There is no need to contact Laloe. It is as elementary an issue as 1+1=2.
Let me tell you that among the tons of material that I read in my life, a big part was non-rigorous. Out of collegiality I won't mention the names of people who wrote books full of nonsense. As for the discussion between you and me, it is on the most elementary issue in QM. I suggest you leave Laloe and read other authors. Collapse is a NON-UNITARY transformation, that's TRIVIAL.
I only want to say that whatever Laloe proposed, the collapse is a process that begins with the measured system and involves more and more particles in the measuring apparatus, until all this bulk of involved material becomes classical. Still, the collapse poses further problems.
But I am terribly busy now, please believe me.
Best regards,
Sofia