"Quantum decoherence" is a concept used to modify or amend the traditional "Copenhagen interpretation" of QM, de-emphasizing the role of the observer, but explicating the quantum-classical boundary. How are we best to understand this view of quantum mechanics? What does it explain? Does it represent a clear improvement over prior views?
The following quotation may help the question along:
Decoherence is of use within the framework of either of the two interpretations: It can supply a definition of the branches in Everett’s Many Worlds Interpretation, but it can also delineate the border that is so central to Bohr’s point of view. And if there is one lesson to be learned from what we already know about such matters, it is that information and its transfer play a key role in the quantum universe. The natural sciences were built on a tacit assumption: Information about the universe can be acquired without changing its state. The ideal of “hard science” was to be objective and provide a description of reality. Information was regarded as unphysical, ethereal, a mere record of the tangible, material universe, an inconsequential reflection, existing beyond and essentially decoupled from the domain governed by the laws of physics. This view is no longer tenable (Landauer 1991). Quantum theory has put an end to this Laplacean dream about a mechanical universe. Observers of quantum phenomena can no longer be just passive spectators. Quantum laws make it impossible to gain information without changing the state of the measured object. The dividing line between what is and what is known to be has been blurred forever. While abolishing this boundary, quantum theory has simultaneously deprived the “conscious observer” of a monopoly on acquiring and storing information: Any correlation is a registration, any quantum state is a record of some other quantum state. When correlations are robust enough, or the record is sufficiently indelible, familiar classical “objective reality” emerges from the quantum substrate. Moreover, even a minute interaction with the environment, practically inevitable for any macroscopic object, will establish such a correlation: The environment will, in effect, measure the state of the object, and this suffices to destroy quantum coherence. The resulting decoherence plays, therefore, a vital role in facilitating the transition from quantum to classical.
From Wojciech H. Zurek (2002) “Decoherence and the Transition from Quantum to Classical—Revisited,” Los Alamos Science, Number 27, 2002; http://arxiv.org/ftp/quant-ph/papers/0306/0306072.pdf
Philadelphia, PA
Dear Quark,
I see that you have posted three extracts from the Wikipedia article on quantum decoherence in three different postings.
http://en.wikipedia.org/wiki/Quantum_decoherence
Would you like to comment on these passages, add to them or offer criticisms?
For instance, in your first posting, you have it that:
Quantum decoherence gives the appearance of wave function collapse, which is the reduction of the physical possibilities into a single possibility as seen by an observer.
---end quotation
To say that decoherence "gives the appearance of wave function collapse" suggests that it only gives the appearance and is not the same as wave-function collapse. What, then, is the difference? Posting this statement, you appear to take responsibility for it, but if so, then it would seem that you are under some obligation to explain the difference suggested here. It hardly seems very helpful to simply borrow text from the Wikipedia and leave it at that.
In some authors it seems that the idea of the "collapse of the wave-function" is something to be avoided at all costs. In others, it seems to be simply taken for granted as an aspect of quantum mechanics. I suspect that some varied combination of differing criticisms or critical perspectives may be involved which are not always clearly stated. This creates a sort of "black cloud" over the notion, without really explaining what (or which) criticism may be involved--the source of the critical perspective. On the contrary, it may be supposed, from some readings, that quantum decoherence just is what is otherwise termed the "collapse of the wave-function," though understood more generally as a result of accidental interactions and not merely under conditions of observation or measurement?
Later on in your first posting you have it that:
Decoherence occurs when a system interacts with its environment in a thermodynamically irreversible way. This prevents different elements in the quantum superposition of the total system's wave-function from interfering with each other.
---end quotation
It would certainly be a welcome result to me to find some definite and well-argued relationship between decoherence and thermodynamics. However, when I first read the Wikipedia article, some time back, this connection, as made in this passage, struck me as somewhat problematic. That is to say that I have often encountered physicists resisting explanation of "thermodynamic irreversibility" in quantum mechanical terms. The idea seems to be that the increase of entropy required by the second law of thermodynamics is to be explained by reference to the process in which the uniform evolution of the QM wave-function is "reduced" in any particular measurement or QM interaction. While before measurement, a variety of particular results had their determinate and calculable probabilities, on measurement, we find just some one particular result, and the other possibilities seem to be gone, or "reduced." It is exactly this phenomenon which seems to have evoked Einstein's famous remark that "God does not play dice with the universe" --or the idea that QM must be "incomplete" so long as we get no explanation of why there is just this one particular measured result.
Do you, and the Wikipedia have some definite view of this problem or apparent problem? Does the defender of decoherence need to take a stand on the relationship between QM and the second law of thermodynamics? You leave a mystery here.
In your third posting on the present question you have it that:
A total superposition of the global or universal wave-function still exists (and remains coherent at the global level), but its ultimate fate remains an interpretational issue. Specifically, decoherence does not attempt to explain the measurement problem. Rather, decoherence provides an explanation for the transition of the system to a mixture of states that seem to correspond to those states observers perceive.
--end quotation
I have long been puzzled by the notion of "the global or universal wave-function." This would seem to be the wave-function of the universe as a whole--in relation to which there is no possible observer. So, I wonder what physical meaning this can possibly have. Nonetheless, this idea is frequently encountered. Here you seem to make yourself responsible for its introduction into the present thread. Though it is also said that "its ultimate fate remains an interpretational issue." Does that mean that we can safely set aside the doubt I just mentioned? If so, why is that? Again, you have it that "decoherence does not attempt to explain the measurement problem." But how are we to understand this claim? What exactly is "the measurement problem" mentioned here? Is it the same thing as the felt need to explain the "collapse of the wave-function" in measurement? If not, what is the difference? Does it imply that we lack a suitably physical conception of "measurement"? If so, what is the problem? The phrase is often repeated, but rarely explained in a clear and concise manner.
Answers to my various questions seem to be involved in the opening question: "What is quantum decoherence?" Would you care to elaborate on your opening notes from the Wikipedia?
H.G. Callaway
Dear HG,
I think the weakness in Zurek's argument is revealed by the claim:
The natural sciences were built on a tacit assumption: Information about the universe can be acquired without changing its state. The ideal of “hard science” was to be objective and provide a description of reality. Information was regarded as unphysical, ethereal, a mere record of the tangible, material universe, an inconsequential reflection, existing beyond and essentially decoupled from the domain governed by the laws of physics. This view is no longer tenable (Landauer 1991). Quantum theory has put an end to this Laplacean dream about a mechanical universe.
This is wrong. The natural sciences were built by people like Galileo, Descartes, Newton and Leibniz, none of whom, I suspect, held this assumption. They were all aware that physics was about describing the rules of dynamic relation of the world, not an 'objective description of reality'. In my judgment Leibniz shows in the Monadology why the fundamental level of physics must be like quantum theory.
Zurek belongs to a long line of 'interpreters' who retain an intuitive or naive realist approach to physics at heart and want to be able to describe the interstices of fundamental events. The irony is that fundamental events are by definition those with no describable interstices. Von Neumann's processes 1 and 2 were the ultimate mistake. There is no 1 and 2 within an indivisible dynamic relation. If the dynamic relation is not indivisible then QM was on the wrong track anyway.
I agree with you that the universal wave function is a nonsense, since a wave function is a description of the probabilities of certain observations, and it is completely absurd to apply that to the universe. People who think it is something more than that have, I think, fallen into the same trap as Zurek and his decoherence. None of this stuff is of any use to practical science, and in fact it gets in the way by throwing up false assumptions that people then apply wrongly and run up blind alleys - particularly in the area of philosophy of mind.
Philadelphia, PA
Dear Edwards,
Thanks for your thoughts on this question. I have to say, though, that I believe you seriously underestimate Zurek and the concept of decoherence. (I understand that there has recently been a physics conference devoted to his work.)
I believe this judgment is quite independent of our estimates of the value of the notion of the wave-function of the universe. His point in relation to it might be viewed as simply a matter of arguing that the value of the concept of decoherence is not undercut even there. No viable criticism arises from that quarter.
You wrote:
Zurek belongs to a long line of 'interpreters' who retain an intuitive or naive realist approach to physics at heart and want to be able to describe the interstices of fundamental events.
---end quotation
I suspect you are simply wrong on this point. This is a matter of how we understand "decoherence" in contrast to "collapse of the wave function." It seems to me implausible that the preference for theory conducted in terms of "decoherence" is a matter of endorsing even the kind of realism evident in Einstein's criticisms of QM--which, by the way, strikes me as a quite sophisticated version or valuation of physical realism.
I think more could be said on your posting, but let me stop here, with these few comments and wait to see if there are other replies. It seems to me important for the present question to get at some of the points you have brought up.
H.G. Callaway
Dear HG,
It seems to me very plain that Zurek is a naive realist. He talks about 'the state of the object'. Such an idea would make Leibniz very amused. Zurek's metaphysics would have seemed childlike even to Galileo and Descartes. It is wrapped up in quantum jargon, but as old as the intuitive view that Heraclitus and Parmenides became famous for trying to disabuse people of - thereby inventing 'philosophy'. Unfortunately almost all science is now popular science - tuned to the highest common factor in audiences (often wrongly called the lowest common denominator), which tends to be pretty low. I prefer to take note of people who have achieved something profound, like devising the calculus and the law of conservation of energy, rather than smooth talkers.
Philadelphia, PA
Dear Edwards,
Your critical perspective is welcome here; still, I'm doubtful of your idea of Zurek as a "naive realist." I hardly think that his talk of "the state of the object" proves anything such as you ascribe, in any case. When he speaks of the "state of the object," as in the claim that "Quantum laws make it impossible to gain information without changing the state of the measured object," this seems a matter of the "quantum state."
Take another look at the following passage from the quotation above:
The natural sciences were built on a tacit assumption: Information about the universe can be acquired without changing its state. The ideal of “hard science” was to be objective and provide a description of reality. Information was regarded as unphysical, ethereal, a mere record of the tangible, material universe, an inconsequential reflection, existing beyond and essentially decoupled from the domain governed by the laws of physics. This view is no longer tenable (Landauer 1991). Quantum theory has put an end to this Laplacean dream about a mechanical universe. Observers of quantum phenomena can no longer be just passive spectators.
--end quotation
A plausible reading of this passage is that it is a rejection of a version of realism, since "Observers of quantum phenomena can no longer be just passive spectators." If anything, the passage might be more suspected of idealism--though I think that idea won't hold up either. I suspect that Leibniz might have liked it.
There is a lot of popular science around in recent times, but I think this is neither here nor there. It proves nothing about the viability of the concept of quantum decoherence.
H.G. Callaway
Philadelphia, PA
Dear all,
Here's a bit concerning the relationship between "quantum decoherence" and "the measurement problem." I found the following brief account of the measurement problem in Julian Barbour's 1999 book, The End of Time: The Next Revolution in our Understanding of the Universe, p. 224:
The measurement problem of quantum mechanics is this: how does the entangled state of many possibilities collapse down to just one, and when does it happen? Is it when the pointer strikes the emulsion, or when the human observer sees a mark on the emulsion?
---end quotation
It struck me that this statement has the considerable virtue of conciseness, very useful for understanding what is meant by "the measurement problem." This is not a problem in QM, as I understand it, but more a problem in the "interpretation" of QM. Of course, when one mentions "the interpretation of QM" then this may seem to invite discussion of every historical question or position which has ever arisen in reaction to QM--say, Einstein's skepticism about entanglement or randomness, Bohm's hidden variable theory, etc. Such complications seem to keep the little "black cloud" hanging over phrases such as "the measurement problem" and "collapse of the wave-function."
But if we take the primary question to be how the entangled (or superposed) state of many possibilities "collapses down to just one" on measurement, then I think we can see the relationship of quantum decoherence (as an interpretation of QM) to the more traditional talk (in Bohr, etc.) of the "collapse of the wave-function." The chief difference with "quantum decoherence" is the idea that this is happening all the time, and need have nothing to do with any human observer, while in the traditional "Copenhagen interpretation" the talk of the collapse of the wave-function gave undue emphasis to interactions explicitly mentioning observers and measurement. Regarding the fundamentally random character of the results of interaction (governed by the deterministic calculation of probabilities by means of the Schrödinger equation, say), "decoherence" seems not to differ from the traditional conception of the collapse of the wave-function. It is a more sophisticated conception linked to the idea of a quantum system interacting with a larger environment.
So, it looks to me as though those seeking a mechanism of the "collapse of the wave-function," explaining why one result arises and not another, will seek in vain in the decoherence approach to the interpretation of QM. Regarding Barbour's second question, however--when does it happen?--decoherence does suggest an answer: by treating "measurement" as a special case of interaction with an environment.
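One standard schematic from the decoherence literature (my rendering of it, not a quotation) may make the generalization vivid. A system in superposition, interacting with any environment E, becomes correlated with it:

(\alpha|0\rangle + \beta|1\rangle)\,|E\rangle \;\longrightarrow\; \alpha|0\rangle|E_0\rangle + \beta|1\rangle|E_1\rangle .

Once the environment states |E_0\rangle and |E_1\rangle are effectively distinguishable, so that \langle E_0|E_1\rangle \approx 0, interference between the two branches is no longer observable at the level of the system alone--whether or not anything we would call a measuring device is present.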
H.G. Callaway
Dear HG,
This is the sort of naive realism I was talking about. It is not billiard ball realism, I admit, but it is still close to von Neumann processes 1 and 2. The physicists I know simply don't think it means anything. Why should we think there is a 'state of many possibilities that collapses down to one'? If we consider a quantum system as an indivisible dynamic connection this is all just guff. Until measured we just have a type description. When measured we have evidence of a particular token of that type. Since the measurement is part of the entire mode or connection (a crucial point made clear by the Aspect results) the token does not even exist until it is measured. It has not 'already started on its way' since there are no trajectories in QM.
The issue of whether the connection is defined by the measurement or the observer has been pretty well sorted for a long time - Feynman certainly went for the measurement. I think the decoherence issue is actually something much more technical and detailed, relating to a situation where there are no specific measurements. But as far as I know there are no new postulates about what 'really happens'; the same maths is just jigged around to make predictions in a measurement-free context. The trouble is, as Wikipedia says, you cannot actually test any new prediction. If you test, you do a measurement, and so you can use the math for measurement contexts, as far as I can see.
Philadelphia, PA
Dear Edwards,
Your "this" in your opening statement, "This is the sort of naive realism I was talking about," seems to lack any very clear reference. Interesting that the physicists you know "simply don't think it means anything." But who are these physicists and what did they actually say? Since you provide no text or citation, you are simply asking readers to take your word on the matter.
You leave the reader guessing, and the effect of this seems just a bit too polemical. If you want to dispute Barbour's statement of the measurement problem, that is one thing. If you want to dispute my analysis of it, then that is something else again. If you want to dispute the concept of the wave-function, that would be interesting.
Let me say a few words, however, about measurement and interaction, since you neglect to mention the contrast. The idea is, I take it, that the physical interaction involved in measurement, using some particular measuring device, can be described in general terms. But interactions sufficiently similar to those involved in measurement are known to take place in nature quite generally. Thus it seems that what is true of the one is true of the other as well. The result of measurement is, I take it, a physical effect of the system being measured upon the measuring device, and unless we suppose that the measuring device has some miraculous character, it stands to reason that a sufficiently similar system would have the same kind of effect upon things which, as it happens, are not measuring devices. That seems to be the kind of generalization involved in talk of "decoherence."
Moreover, questions about decoherence seem to be taken very seriously among people devoted to the theory and engineering of quantum computers. Devices for quantum computing have to be physically isolated, at considerable expense and effort, so that suitable superpositions are maintained--against the real possibility of accidental decoherence. Perhaps you have some alternative account of these matters?
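To indicate the engineering stakes, here is a back-of-the-envelope sketch; the numbers are merely assumed for illustration and are not drawn from any particular device:

import math

# Coherence decays roughly as exp(-t/T2), so the number of logic
# operations that fit inside a coherence time is about T2 / gate_time.
T2 = 100e-6         # assumed dephasing time: 100 microseconds
gate_time = 50e-9   # assumed gate duration: 50 nanoseconds

print(f"roughly {T2 / gate_time:.0f} gates fit inside one coherence time")
for t_us in (1, 10, 100, 300):
    t = t_us * 1e-6
    print(f"after {t_us} microseconds, coherence factor = {math.exp(-t / T2):.3f}")

The point is only that any residual coupling to the environment sets a clock running; the engineering consists in slowing that clock down.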
Berkeley, or the Berkeleyan idealist, wonders whether, when a tree falls in the forest and no one is there, it nonetheless makes a sound. I suppose that those of us who say that it does are not to be thought "naive realists" on that ground alone. Agreed?
I'm just looking for some limits on your use of the term, basically.
Philosophy is sometimes a very slow process, I'm afraid; but my current impression is that you are in quite a hurry with your conclusions.
H.G. Callaway
Philadelphia, PA
Dear readers,
Looking around for a further authority on the topic of decoherence, I came across a passage in an article by Bryce DeWitt--in a book I have on hand. (See the brief quotation below.) DeWitt's related accomplishments are mentioned in the following obituary for DeWitt, published by the University of Texas at Austin:
http://www.utexas.edu/faculty/council/2006-2007/memorials/dewitt/dewitt.html
It is noted in the obituary that his last book (2003) The Global Approach to Quantum Field Theory, included,
A careful statement of the many worlds interpretation of quantum mechanics in the context of both measurement theory and the localization-decoherence of macroscopic systems, which leads to the emergence of the classical world.
---end quotation
The following passage comes from his article, “Quantum field theory and space-time—formalism and reality,” which appeared in Tian Yu Cao, ed. Conceptual Foundations of Quantum Field Theory. Cambridge U.P. 1999, pp. 176-186.
Fortunately, in the last few years great progress has been made on this problem. The key word here is decoherence, the development of random phase relationships between parts of the wave function. We now know that there are two major kinds of decoherence that play dominant roles in nature: the decoherence of the state of a light body (typically sub-atomic) brought about by careful coupling to a massive body (known as an apparatus) and the decoherence of the state of a massive body that results from collisions that it undergoes with light bodies in the ambient medium around it. The first is known as quantum measurement … . The second, … is ubiquitous, and is what gives rise to the classical behavior in the world around us …
---end quotation
Clearly, DeWitt took the concept of decoherence quite seriously. This is only worth remarking, I suppose, because of doubts expressed by others on this thread.
It may be, however, that there is no great desire on RG to conduct further discussion of the concept. If so, so be it. However, it seems very interesting to yours truly to know what decoherence will look like absent "many worlds." That's part of the interest of the topic.
H.G. Callaway
Schrödinger was not insane. His example with a cat reminds us that at every moment the cat is dead or alive--and the disjunction is exclusive. I don't do any decoherence when I open the door. I just open the door to call my cat. Quantum non-determinism and macroscopic determinism are two views which cannot be unified--a little like polytheist vs. monotheist religions. I am happy with a determinist world, which is predicted by determinist equations (even at the quantum level, as Zurek also says), and with good reasons for not being able to determine too much at the quantum level--as given by Heisenberg's relations. Those are of a purely mathematical nature, as can be seen for example here:
http://ac.els-cdn.com/S1063520396902079/1-s2.0-S1063520396902079-main.pdf?_tid=4829e434-c71d-11e4-822b-00000aacb35e&acdnat=1425989110_ba1151de824c17504dbfa2f67b1ae7ff
Guy Battle, “Heisenberg Inequalities for Wavelet States,” Applied and Computational Harmonic Analysis 4, 119–146 (1997), Article No. HA960207.
The simple fact that we don't know something does not allow different evolutions to co-exist as freaky coherent shadows of reality. We only don't know the true evolution. I don't know what happened in London on 10th of March 1400, but this does not mean that different possibilities coexisted this particular day in London. Lord X and Lady Y met that day or didn't, and the disjunction is again exclusive.
So, instead of decoherence, I guess that both our theoretical knowledge and our experimental possibilities concerning quantum phenomena are limited--and that a kind of indeterminist thinking helps us in making predictions, for example in terms of probability. But we should not exaggerate with a generalization of this nondeterminism. Or, otherwise said, decoherence takes place all the time (!)....
Mainz, Germany
Dear Prunescu,
Thanks for your posting on this question.
The puzzle posed by Schrödinger regarding the cat is exactly that: a puzzle. He did not think that the cat was actually in a superposition in the kind of situation he described. The question posed was, given QM and the possibility of micro-superpositions, why it is that we do not observe macro-superpositions.
Many answers to the puzzle have been given, but what remains unclear is which of these, if any, is more acceptable. I would mention Roger Penrose and his theory of a gravitational collapse of the wave-function, and in some tension with this, we find the idea of decoherence --as a more generalized phenomenon. According to Penrose's theory, any object in superposition will tend to more quickly collapse, as a function of its mass, and the idea is that this is an effect of self-gravitation. There is a brief description of Penrose's approach in one of my papers, available from RG. I suppose that the general idea of decoherence might also be conceived as sometimes involving gravity, though that is not typically emphasized in the discussions.
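As a rough statement of the scaling Penrose proposes (my gloss, not a quotation from him), the expected lifetime of a superposition is

\tau \;\sim\; \frac{\hbar}{E_G},

where E_G is the gravitational self-energy of the difference between the mass distributions of the two superposed states. The more massive the object, the larger E_G, and the sooner the reduction.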
Solutions to the puzzle of Schrödinger's cat generally seek to retain the indeterminacy of the quantum world --as this has been empirically validated-- while explaining why macro-superpositions are not observed. You, in contrast, seem to want to be rid of quantum indeterminacy, except as a consequence of limits on our knowledge of a system. The most common related approaches, as I understand the matter, are called "hidden variable theories," and they are not presently finding much favor. Einstein, too, of course, was skeptical of the idea of a basic randomness in nature. Hidden variable theories do have their defenders, I believe; but much of that may be beside the present point of simply understanding the contemporary concept of decoherence.
H.G. Callaway
Dear Callaway,
- Heisenberg's indeterminacy comes simply from the fact that we are speaking about waves. Waves cannot be approximated by points, as was tried in classical mechanics, and as Zurek also correctly argues. If one throws a stone into a lake, it is already difficult to say where the waves are. If in a first stage the waves are concentric circles increasing in all directions, they start to be reflected by the irregular shore, to interfere with themselves, and to degenerate into a general background "radiation." After reflection and interference it becomes very difficult to guess where the original event took place. I guess that most of our quantum difficulties are no more than such things.
- Things like wave functions and fields of probability can be constructed for every kind of undetermined event. Take this example of Lord X and Lady Y, who are known to have met for the first time in London around 1400, maybe in spring. This information comes from the evaluation of their letters and letters from other people who knew them. One can construct a field of probability over the space-time around London and the year 1400. A first meeting in Oxford in summer 1401 has a smaller probability than a first meeting in the City in March 1400, but both are possible given our state of knowledge. More information can lead to a sharper probability function, but even a page of Lady Y's diary does not give probability 1 to the day of March 10 in the City, because she too could have been wrong.
- I am not familiar with hidden variables... I hope that one can model the situation of Lord X and Lady Y in other formalisms as well, like hidden variables, but I don't know.
- The situation of the cat is only apparently related to a mechanism detecting one quantum particle and killing the cat. In fact cats are mortal, and every time I come home, my cat could have died or not. One day, it will happen... But I do not kill my cat by opening the door and becoming aware of its state, although my cat is at all times a superposition of a dead cat and a living cat (these states have different probabilities, I admit).
- However, experiments such as those done at CERN prove that in very focussed artificial situations (which, by the way, cost a lot of energy...) one can get enough information out of the experiment to describe such an uncommon object as the Higgs boson.
I believe that a synthesis of all these things (and similar ones) leads to the position expressed in my first posting. I don't say that decoherence does not exist. It exists in the world of quantum wave functions and superpositions. I just say that it is quite unnatural for us to speak as if we were living in that quantum world, and I say that on the basis of the fact that the two ways of describing reality are really impossible to unify. That is why I preferred to put the whole matter in the domain of the knowable (measurable) vs. the impossible to know exactly.
I will look for your paper about the cat. Can you give us the link? Thank you!
Mainz, Germany
Dear Prunescu
My paper is not specifically about the puzzle of Schrödinger's cat, but I do give a brief account of Penrose's approach to similar (or more general) questions --why do we observe no macro-superpositions? Thanks for your interest.
You will find the paper at the following address:
https://www.researchgate.net/publication/256410914_Abduction_Competing_Models_and_the_Virtues_of_Hypotheses
As you will see, the paper is generally about the comparative evaluation of hypotheses yet untested. I thought quantum gravity especially interesting, in this connection, because there is so little evidence available. The alternative developments are almost purely theoretical and speculative--though much related work has been done in mathematical physics.
It would not be hard for you to find some preliminary information on hidden variable theories. If you start with Wikipedia, I would suggest paying particular attention to the references to the piece and the on-line papers or titles associated with it.
H.G. Callaway
Callaway: " According to Penrose's theory, any object in superposition will tend to more quickly collapse, as a function of its mass, and the idea is that this is an effect of self-gravitation. "
These ideas are really revolutionary. It is no surprise that some physicists asked about them evade an answer, if this attitude was correctly reported in another posting.
If I have understood Penrose's position correctly, the different possibilities of a macroscopic superposition collapse onto the one which has the biggest probability, due to gravitation. All the others disappear by decoherence.
This reminds me of an older essay by Schrödinger, called "What is Life?". There he discussed life from the thermodynamical point of view, and he concluded that the secret of life was its ability to locally generate negentropy. This is its ability to let entropy decrease locally, contrary to its natural tendency of continuous increase.
Keeping the parallel, life would mean not only producing negentropy, but also antigravitation--because in order to preserve life, one does not always choose the most probable possibility in a superposition. Further, it would mean that intelligent life generates even more antigravitation, because it tends to influence a more distant future.
As to my position, which I still haven't given up, it is natural for someone coming from the endless debate around Gödel and the notion of undecidable theory. Moreover, I worked for a long time on a project applying wavelet analysis to EEGs (electroencephalography), and I could see Heisenberg's indeterminacy operating in hundreds of cases. For any kind of stimulus/reaction there were wavelets better for determining the moment of reaction and other wavelets better for determining the frequency of reaction, but not both--because the product of the errors in time and frequency is always bigger than a constant. So the interpretation of indeterminacy as a barrier to knowledge is an important factor and will remain decisive.
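In one standard form, the inequality I have in mind is

\sigma_t \,\sigma_\omega \;\ge\; \tfrac{1}{2},

where \sigma_t and \sigma_\omega are the standard deviations of a unit-energy signal in time and in angular frequency. No choice of wavelet can beat this bound; one can only trade precision in time against precision in frequency.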
Mainz, Germany
Dear Prunescu,
It is perhaps just an autobiographical detail, of no great general interest; however, I would say that what convinced me to change my mind in favor of Heisenberg and a basic randomness in nature was consideration of Bohm's efforts to the contrary (hidden variable theory) in the light of Bell's inequalities and related work. This was many years back.
Thanks for your comments on my paper and on Penrose. As a general matter, I find Penrose a very impressive thinker. I do not believe anyone will go wrong by reading his work in detail. Whether one will thereafter quite agree with him is of lesser consequence. He opens up entire fields of contemporary science to the non-specialist reader.
H.G. Callaway
Dear Callaway,
Yesterday I saw the film "Coherence"; see here for more about the movie:
http://www.hollywoodreporter.com/review/coherence-film-review-711238
The action is as follows: a comet comes near the Earth and has the effect that macroscopic decoherence is perturbed. A party of young artists and intellectuals is strongly disrupted, because there are a lot of parallel parties where things happen slightly differently, and people slide from one reality into another. The next day they observe that some things that happened during the night have already had consequences: now there are realities with two or more copies of some persons, and other worlds where they have mysteriously disappeared...
And more information here:
http://www.vulture.com/2014/06/movie-review-coherence.html
Philadelphia, PA
Dear Prunescu,
The themes of the movie, "Coherence," seem to project the idea of the many-worlds interpretation of the quantum formalism onto the human world. I do not see, at this point, that quantum decoherence requires the many-worlds interpretation of the quantum formalism. It seems, instead, that they are linked by the current popularity of the many-worlds interpretation. I'm skeptical of that. The formalism seems to me instead a matter of uniform development of (conflicting) potentialities of measurements or results; and the "many worlds" strike me as rather metaphysical. Interpreting possibilities or potentialities as something actual in different "worlds" has always struck me as quite a stretch.
I'd suggest looking at the following paper:
http://xxx.lanl.gov/pdf/quant-ph/9908008v1.pdf
It appears to go over the basics in an intelligent way, keeping to the topic at hand.
Though this particular question has not found much interest, I think it would be a rather desperate measure to discuss the movie. If you want to do so, I'd suggest starting a different thread.
H.G. Callaway
Mainz, Germany
May 8, 2015
Dear All,
Here is the first paragraph of the Wikipedia piece for which Pereira (in another RG thread) provided the link:
In quantum mechanics, quantum decoherence is the loss of coherence or ordering of the phase angles between the components of a system in a quantum superposition. One consequence of this dephasing is classical or probabilistically additive behavior. Quantum decoherence gives the appearance of wave function collapse, which is the reduction of the physical possibilities into a single possibility as seen by an observer. It justifies the framework and intuition of classical physics as an acceptable approximation: decoherence is the mechanism by which the classical limit emerges from a quantum starting point and it determines the location of the quantum-classical boundary. Decoherence occurs when a system interacts with its environment in a thermodynamically irreversible way. This prevents different elements in the quantum superposition of the total system's wavefunction from interfering with each other. Decoherence was first introduced in 1970 by the German physicist H. Dieter Zeh and has been a subject of active research since the 1980s.
---end quotation
I thought this interesting and considered providing this link myself when I instead linked to the information philosophy page--which I think is pretty good.
I think we have to keep in mind that the word "decoherence" is quite widely used, and there is some reason to expect that not all physicists give the term quite the same meaning --or emphasize quite the same points by use of the term. Especially since the term is highly theoretical, that sort of situation is bound to present semantic puzzles. It is not as though the meaning of the term could possibly be independent of the viewpoint represented in its various usages by various theorists. So, I think that sort of expectation --of a single universally recognized meaning common to the debate--should be put aside, and we should look for something more like a common core meaning or crucial coincidences in the implicit meaning of the term.
So, consider, again, the opening sentence from the above quotation:
In quantum mechanics, quantum decoherence is the loss of coherence or ordering of the phase angles between the components of a system in a quantum superposition. One consequence of this dephasing is classical or probabilistically additive behavior.
---end quotation
This seems to me the core meaning in the usage of the advocates of the theory of decoherence. What is decoherence? It is "the loss of coherence or ordering of the phase angles between the components of a system in a quantum superposition." And notice that it is immediately added that the consequence of dephasing is "classical or probabilistically additive behavior." This seems to me a re-description or re-conceptualization of the more familiar concept of the "collapse of the wave function." Still, it is not offered as a solution of the "measurement problem" --which often seems to be a mare's nest of polemics and hardened debates and positions. The chief or most important difference is that decoherence is taken to result from various and continuous interactions of a quantum system with its environment, instead of merely focusing on the interaction with a measuring device. How exactly does a system in superposition reduce to particle-like behavior? Well, we know no more than Heisenberg, Bohr and Born did, but the over-emphasis on observation in the Copenhagen interpretation has been put aside.
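A toy numerical illustration (my own construction, not anyone's published model) may help fix the idea of dephasing:

# A qubit starts in the superposition (|0> + |1>)/sqrt(2), so the
# off-diagonal ("coherence") element of its density matrix is 0.5.
# Suppose each of N environment particles ends in state e0 if the
# system is |0> and in e1 if it is |1>. Tracing out the environment
# multiplies the off-diagonal element by <e0|e1> once per particle,
# so coherence decays exponentially with N -- no collapse postulate.

overlap = 0.9  # assumed value of |<e0|e1>| per environment particle
for N in (1, 10, 100, 500):
    coherence = 0.5 * overlap ** N
    print(f"N = {N:3d} environment particles: off-diagonal = {coherence:.2e}")

The diagonal entries--the probabilities of the two outcomes--are untouched; only the interference terms are suppressed. That, as I understand it, is the "loss of coherence of the phase angles" in the definition quoted above.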
Pereira comments:
but what really matters is:
the loss of interference patterns.
Why do these patterns disappear when the system interacts with the environment (or with the measuring apparatus)?
---end quotation
I think this is correct. What really matters is the loss of interference patterns. I suspect that the answer to the question, though, is the same as what we find in Heisenberg, Bohr and Born: the patterns just do disappear, and to ask why is to bark up the wrong tree; it's a misconceived question --given the indeterminacy/uncertainty principle.
I return to the quoted passage:
Quantum decoherence gives the appearance of wave function collapse, which is the reduction of the physical possibilities into a single possibility as seen by an observer. It justifies the framework and intuition of classical physics as an acceptable approximation: decoherence is the mechanism by which the classical limit emerges from a quantum starting point and it determines the location of the quantum-classical boundary.
---end quote
It only seems to me misleading to say that "decoherence is the mechanism by which the classical limit emerges from a quantum starting point "-- if this creates the expectation of finding a "mechanism" of a deterministic sort, going beyond the Born rule.
This seems to me the first step in understanding what is in question in the discussions of "decoherence," the central meaning of the term. All the reflections on information seem secondary to this.
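For reference, the Born rule mentioned above, in its simplest form: if a system is in the superposition \sum_i c_i |i\rangle, the probability of finding outcome i on measurement is

P(i) \;=\; |c_i|^2.

Decoherence, as I read its advocates, helps to fix which basis the probabilities attach to; it does not say which outcome occurs.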
May the physicists find my layman's errors.
H.G. Callaway
In the article:
https://www.researchgate.net/publication/275970727_There_are_Quantum_Jumps
the author also addresses the decoherence problem in the context of photon emission by an atom. It seems to be a good example, both nontrivial and easy to formalize.
Mainz, Germany
Dear Prunescu,
Among Brändas' references, readers may find (part of) the Bell 1987 at the following address:
http://www.informationphilosopher.com/solutions/scientists/bell/Bell_Quantum_Jumps.pdf
I've also found Part II of Schrödinger's paper on-line.
H.G. Callaway
Dear HG,
I am puzzled by this quote from Pereira:
Pereira comments:
but what really matters is:
the loss of interference patterns.
Why do these patterns disappear when the system interacts with the environment (or with the measuring apparatus)?
---end quotation
Interference patterns are generated by ensembles, not individual quantum systems, and they appear when there is measurement. They do not exist in any known sense before that. Before measurement we assume that some dynamic process occurs within each quantum system that involves 'self-interference', but that is something other than interference patterns. There is also the problem that a wave function is a descriptor of a dynamic unit type, not a token. It is not an individual thing. The individual dynamic unit requires additional description, which might be hidden variables but is more likely to be a variable that only exists in the context of a completed (measured or decohered) system. Pereira's account is also odd in that he talks of components of a quantum system, but we have no reason to think there are any components - one of the defining features of a quantum system is that it has no parts, or is indivisible.
If this is the way people think of decoherence - and nobody has rewritten Wikipedia to indicate a more sophisticated version - then it seems pretty naive, if not just plain wrong.
Mainz, Germany
Dear Edwards,
Thanks for your thoughts on the brief quotation, which, as it happens, came out of a rather lengthy discussion. Let's see what others may think about your arguments. No doubt, decoherence is a complicated and difficult topic. There has been considerable resistance to taking it up, and it seems to be strewn with conflicts and hardened positions (AKA "the measurement problem"), old and new. I believe these conditions of discussion require a quite sympathetic ear for subtle differences and developing trends of thought. It's going to take some time.
Note, however, the last sentence from the introductory quotation at the start of this thread:
The environment will, in effect, measure the state of the object, and this suffices to destroy quantum coherence. The resulting decoherence plays, therefore, a vital role in facilitating the transition from quantum to classical.
---end quotation
This seems to be the key idea. Or perhaps you'd dispute this analysis of what decoherence claims?
H.G. Callaway
Dear HG,
I think your own comment about 'mechanism' is important here. 'A vital role in facilitating the transition' sounds awfully like a hypothetical mechanism, yet the beauty of QM is that it tells us that it is about that level of dynamics where no further 'mechanism' applies. I.e. there are no further combinations of events within a quantum system. It is just one event. My other thought would be to ask what is meant by transition from quantum to classical. It sounds as if these are seen as metaphysically distinct, yet this seems redundant.
To my mind the difference between quantum and classical is just that there are brute logical reasons why you have to have different rules for describing the evolution of individual dynamic elements and for describing the impact those elements have on subsequent events. Leibniz shows that the former rules must have a 'telic' structure whereas the latter cannot. Nothing collapses from a wave to a particle. Wave and particle are merely crude metaphors for the types of rule we have to use - for a priori reasons.
Mainz, Germany
Dear Edwards,
Forgive my slight delay in replying. I've been off to a conference for a week or so.
It strikes me that your claim that "nothing collapses from a wave to a particle" oversteps the boundaries of philosophy to intrude on quantum physics. No doubt, various people see various problems about wave-particle duality and about "collapse of the wave function," but the question of whether or not these are appropriate descriptions is a scientific question and not primarily a philosophical question. Least of all does talk of "a priori reasons" seem appropriate in this context. I do not know what your "brute logical reasons" may be, but your appeal to Leibniz seems to me doubtful. Methodological and even "logical" developments plausibly arise in science itself, so that philosophy can't plausibly constitute itself as a higher "a priori" court of appeals. This is an empiricist stance, and one widely accepted both in philosophy and in the sciences. How else would we make sense, say, of Einstein's rejection of the idea of absolute simultaneity, or of absolute, Euclidean, Newtonian space?
The quotation from Zurek is, of course, presented above for discussion. However, I see no suggestion of a (contra-Heisenberg) "mechanism" in what he says about "a vital role" for decoherence "in facilitating the transition from quantum to classical." It is more that talk of decoherence updates talk of any exclusive role of observation.
H.G. Callaway
Dear HG,
I don't think I overstepped any boundaries. A 'wavefunction' is not a wave. It is well recognised to be a descriptor of a type or ensemble of dynamic unit(s). Its structure is incompatible with a true wave anyway but it has never been legitimately seen as describing wavelike token units. It is just that it describes the way probabilities for measurements vary across space and time for a type or ensemble - and if you try and draw it (with fudging because it cannot be drawn) on a piece of paper it has the serpentine mathematical structure of a classical wave (by coincidence). Moreover, the particle aspect of QM is not a particle. A particle is a very small piece of matter with a top and a bottom and a front and back and left and right hand side, even if very close together. Nothing in QM has these things in this sense.
I think the a priori suggestion is reasonable. There is another thread going on in relation to whether QM destroys the principle of sufficient reason. Steve Faulkner posted a nice paper that seems to put my own argument in a more rigorous mathematical form - showing a priori that there has to be randomness at the QM level because the math entails it.
Mainz, Germany
Dear Edwards,
I think you suggest some interesting questions, but I don't see that you actually reply to my prior remarks. You say "a wave function is not a wave," and apparently this is your reason for saying that "nothing collapses from a wave to a particle." But your suggested interpretations of both "wave function" and "particle" seem to me to read in things that no one seriously considers to be part of the relevant theory. So, you first construct a straw man, it seems, and then dispute it, without replying to my remarks--so far as I can see.
You say, for instance:
A particle is a very small piece of matter with a top and a bottom and a front and back and left and right hand side, even if very close together. Nothing in QM has these things in this sense.
---end quotation
If as you say, "Nothing in QM has these things in this sense," then why do you bring it up?
In quantum systems there are phenomena that are wave-like and others particle-like. These claims must be understood in the sense relevant to theory. One speaks of Schrödinger's wave equation, because it is based on mathematics similar to that used in describing waves in other parts of physics. The Schrödinger equation is one thing, and the wave function for a particular system is something else again, which, as you say, "describes the way probabilities for measurements vary across space and time for a type or ensemble." So what is the rub about types and tokens here? The wave-like character of quantum systems is evident in interference patterns, and the wave function of a particular system is a kind of theoretical hypothesis functioning to cover the observed phenomena.
The particle-like character of QM phenomena draws on Einstein's postulation of the photon--and the local character of light impinging on a detector. The idea is generalized, often in terms of wave-particle duality. Though photons in a double-slit type experiment can be sent through one by one, each impinging on the detector at a particular location, particle-like, still, over time, an interference pattern builds up at the detector screen--demonstrating wave-like character.
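A toy simulation (my own sketch, with illustrative numbers assumed throughout) makes the build-up vivid: each photon lands at a single point, yet the histogram of many landings shows the fringes.

import numpy as np

wavelength = 500e-9   # assumed: 500 nm light
slit_sep = 50e-6      # assumed: 50 micron slit separation
screen_dist = 1.0     # assumed: 1 m from slits to screen

x = np.linspace(-0.05, 0.05, 2000)  # positions along the screen (m)
# Two-slit interference intensity (single-slit envelope ignored):
intensity = np.cos(np.pi * slit_sep * x / (wavelength * screen_dist)) ** 2
prob = intensity / intensity.sum()  # Born-rule weights over screen positions

rng = np.random.default_rng(0)
for n in (10, 100, 100000):
    hits = rng.choice(x, size=n, p=prob)  # one definite point per photon
    counts, _ = np.histogram(hits, bins=20)
    print(n, "photons:", counts)
# With 10 hits the counts look like noise; with 100000 the alternation of
# crowded and empty bins -- the interference pattern -- is unmistakable.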
I don't suppose that you will be inclined to dispute much of this, so I don't see the point of the detour through your discussion of alternative conceptions of wave and particle.
I am glad to know that you are not (now?) insisting upon Leibniz as against QM. The point has not been exactly clear in the past. However, the idea that this could be shown "a priori" seems rather implausible, to say the least. The postulated randomness or probabilistic character of quantum phenomena is based on the observed facts, and the subsequent choice of particular mathematical forms to describe the phenomena in general terms does not change the empirical character of the theory. If other facts were found, then a different mathematical system would be selected to describe the situations. That particular facts can be fit into a particular mathematical formalism does not make of the resulting theory something a priori.
H.G. Callaway
Dear HG,
I understand your concerns but I do not think they hold – there is no straw man here.
The problem is that quantum physicists and commentators talk of wave and particle properties without being specific enough about what they mean. In the textbooks they seem to mean the things you suggest. But in discussing ontology there seems to be an assumption that whatever can be wave-like cannot be particle-like, so a ‘collapse’ or ‘decoherence’ event is required. I think there is stuff to unpack here.
The wave particle dichotomy was of course inherited from a 200+ year debate involving Newton, Young and Huygens. Note that Leibniz, and I have not changed my view about his prescience here, would not have taken part in this debate because he had deduced that fundamental dynamic units (monads) could not be particles. My definitions of particle and wave would have been pretty well accepted at the time they were invoked in QM. So where is the problem?
It seems that ‘particle-like’ in QM simply means ‘causally connects at one point in space and time’. Wave-like means ‘has cyclicity and interacts with a distributed domain in spacetime such that the output is co-dependent on the situation in the whole of that domain’. The interesting thing is that people seem to assume these must be incompatible – otherwise why would they need a ‘collapse’ or ‘decoherence’ event to convert one to the other?
However, there does not seem to be any need for incompatibility. It is more or less by definition impossible to find a classical physical metaphor for a quantum system, as Leibniz understood, but I will try a fishing net. A fishing net has cyclicity in terms of the mesh of holes. Let us say that our dynamic units are elongated diamond shaped pieces of fishing net. Such units then have cyclicity and can interact over an extended domain but always connect one point to another – the ends of the diamond. Let us now say that these nets are not used for fishing but for some sort of roof construction, maybe underneath banana leaves to form a mesh. The user spreads a diamond across an area traversed by beams with nails in. He can knit the net on to the beams wherever a nail will catch on a net hole. Having started at one place he will always arrive at a single final place, having interacted with an extended domain but there will be places where he is likely to end up and others less likely because of the relationship between hole size (wavelength) and nail separation.
Note that the fishing net is not very like a wave or a particle and there is no need for any collapse or decoherence because the net does the whole job.
With one net the user will arrive at one place. There will be no pattern of net loops on the final beam. A pattern of net loops (an ‘interference pattern’) will appear when he has attached an ensemble of nets. This pattern is likely to have points where lots of loops end up and points with few – and the rule for the pattern will depend on the net hole size (wavelength), because reaching the final beam will depend on individual steps constrained by how the loop size fits the nails along the way. It is essential to remember that individual quantum systems do not show ‘interference patterns’, only ensembles. This is a point made by Feynman and many other textbooks.
So there is no need to propose a collapse or a decoherence. We have at least a crude model that has no such need.
But then there is the token/type issue, which is not a red herring. It is emphasised by many QM texts. All a wave equation does is describe a type of fishing net. It says nothing about where the user starts. Moreover it says nothing about the precise way the net is used on any one occasion. A wave equation, as you say, is not ‘a thing’; it is a descriptor of a type or ensemble – an equation. It does need to be distinguished from something else – the token dynamic unit on this occasion, or quantum system. I think the use of the term 'wavefunction' to imply the dynamic unit is very misleading, because it confuses the mathematical form for a type with a real token unit, which is not totally described by the math.
The problem is that so many people seem to assume that because the type descriptor is cyclical, the actual quantum system must ‘cycle’. But the fishing net does not cycle. In a sense the wavefunction simply tells you how many holes you will get if you cut your net diamond four feet long or three and a half feet long. Nothing needs to ‘cycle’ in a quantum system, so there is no need for it to ‘stop cycling and become a particle’.
I also stick to the a priori bit. Quantum theory is not entirely built on the basis of experiment. If you read QM textbooks you will find that the evolution of wave equations was based on an understanding of a need for symmetry that required a descriptor with no preferred positions. That is to say, a sine wave was no good because sometimes it is up and sometimes down. A complex harmonic wave, in contrast, has all positions equivalent. This building of symmetry into the theory draws on general experience but is a much more abstract a priori conjecture about theory structure. What seems to me fairly obvious is that if you have a law that is symmetrical in three spatial dimensions and specify that all dynamic processes are built out of discrete quantum systems then you have to have randomness in order to ‘share out the results fairly in all directions’. The alternative to randomness is hidden variables, but then you generate an infinite regress as to where you keep your variables hidden – how you share them out amongst the systems that might keep them under the floorboards. It gets circular. The paper Steve flagged up showed neatly how, in logical independence terms, you can predict the same thing in a completely general way.
Mainz, Germany
Dear Edwards,
You wrote,
The wave particle dichotomy was of course inherited from a 200+ year debate involving Newton, Young and Huygens. Note that Leibniz, and I have not changed my view about his prescience here, would not have taken part in this debate because he had deduced that fundamental dynamic units (monads) could not be particles. My definitions of particle and wave would have been pretty well accepted at the time they were invoked in QM. So where is the problem?
---end quotation
I think you are basically going in circles. I ask about the relevancy of your imposition of non-QM conceptions of wave and particle, and in your reply above you say that they "would have been pretty well accepted at the time they were invoked in QM." But accepted by whom? What, moreover, is the relevancy of any of this to contemporary discussions?
That there are abstract elements in a mathematical formalism by no means shows that a theory involving that formalism is "a priori." You merely reiterate the view I already refuted --adding details of doubtful import, but not responding to the criticism.
I don't see that you are helping the question along.
H.G. Callaway
I think you have missed my points HG, so maybe we should let it lie.
Mainz, Germany
Dear Edwards,
Agreed. Many thanks for your good efforts.
H.G. Callaway
Mainz, Germany
Dear El Naschie,
Thanks for the note concerning your paper --and for the abstract. I don't see that you have made the paper itself available, though; and you may want to more specifically relate your themes from the paper to the topic of quantum decoherence. I sense some level of agreement with quantum decoherence, though your focus seems to be elsewhere. Your talk of the "golden mean" here is left without any appropriate explanation in relation to the topic of quantum decoherence. I can imagine, say, that a theory of "fractal spacetime" might evoke quantum decoherence, but this in itself is not an argument, for or against, "quantum decoherence" --lacking a full explanation and defense of the specific theory of spacetime.
So, while your paper may be of interest for the present topic and thread, it strikes me that the connection and the paper are somewhat speculative. It is not, of course, that a theory of "fractal spacetime" could be of no interest to physics. Quite the contrary, but the question is whether it is specifically of interest here and in this thread. Could you speak to this concern? I take it that proper arguments for or against the concept of quantum decoherence are typically much more modest in their scope. Again, it is not that anyone thinks that theories of conscious awareness could have no interest to physics and to quantum physics, but again, the connection seems rather speculative. It is best, I believe, to address the question of quantum decoherence on the least speculative grounds. Otherwise, one invites a great mass of argument and discussion of a more doubtful character which will, in effect, tend to cloud the specifics of the present question.
H.G. Callaway
Decoherence is the untangling of wave functions that connect objects and events together. Everything is connected to everything else, but not as firmly as it once was.
About 10 years ago Ulla Mattfolk and I proved that free will exists from arguments that decoherence limits the cosmic potential for destiny.
Essentially a volume of space has finite capacity to contain answers, but there is no limit to the number of questions that can be asked of it.
Some things are destined, but not everything is.
Philadelphia, PA
Dear Decker,
A new reply to an old question here, again. It seems you have a crew of followers. Good for you.
Did you assume that I was somehow in a competition with you?
H.G. Callaway
What evidence is there for quantum computation in the brain?
You might be interested to look at an actual computation of quantum decoherence times in the brain -- the punchline of this calculation is that the timescale for quantum decoherence in the brain is in the ballpark of 10^-20 to 10^-13 s, which is extremely short compared to the dynamical timescale of 0.001 to 0.1 s. This means that computations in the brain are classical, not quantum mechanical.
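The arithmetic is stark. Even taking the estimate least favorable to this conclusion--the slowest decoherence time against the fastest dynamical timescale--

\frac{\tau_{\mathrm{dyn}}}{\tau_{\mathrm{dec}}} \;\gtrsim\; \frac{10^{-3}\ \mathrm{s}}{10^{-13}\ \mathrm{s}} \;=\; 10^{10},

so any neural superposition would decohere some ten billion times faster than the brain's dynamics could exploit it.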