Many EPR experiments use pairs of entangled photons. If entanglement is so fragile, how can entanglement survive transmission through the glass lenses so frequently used in EPR experiments?
My concern? The standard explanation for the cause of refraction in glass is the delay caused by absorption/re-emission of the photon by many, many electrons. Absorption destroys the photon; emission creates a new photon. In EPR experiments these destructive, jumpy interactions within a lens evidently do not constitute a 'measurement' that disrupts entanglement, but this implies that entanglement can be temporarily stored by the wavefunction of an electron (or the atom, or the glass) and then restored to a brand-new photon.
Is an electron's wavefunction complex enough to store entanglement? Is this process clearly explained in any literature?
Good question, Dean. I think I have an answer.
Yes, refraction is indeed caused by the phase delay incurred by light when it interacts with the electrons in matter, which makes it appear as if it is traveling slower than in free space. But I have some reservations about your use of the words "jumpy" and "destructive" with respect to the interaction between the light and the lens.
First of all, it is wrong to think of the photon as a point particle that interacts with one electron at a time. It is a quantum of the light field, which induces the electrons to oscillate. Remember the Lorentz model we use to describe the origin of the refractive index? The electron is assumed to oscillate in response to the driving electric field of the light. We could not get away with this model if coherence were not maintained in the interaction of light with the electrons of a transparent material. The absorption and re-emission that you speak of is not the way I would like to think of it; absorption and re-emission occur in spontaneous emission, or in stimulated emission and absorption (as in lasers). If you think of the whole thing as electrons vibrating in consonance with the incident light field and transferring their coherence to the transmitted field, the question of entanglement being stored by an electron and then transferred to a brand new photon becomes irrelevant. Of course, as we know from QED, any process of interaction between electrons and photons can be thought of as the absorption and re-emission of photons (brand new ones!), but that is not a big issue. One could go on to ask similar questions about whether a new photon is created, after destroying the old one, when a light beam goes through a beam-splitter. But that hardly makes a big difference.
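The Lorentz-oscillator picture above can be sketched in a few lines of Python. This is a toy calculation: the plasma frequency, resonance frequency and damping below are made-up illustrative values, not parameters for any real glass.

```python
import numpy as np

# Lorentz-oscillator model of the refractive index: a bound electron driven
# by the light field gives epsilon(w) = 1 + wp^2 / (w0^2 - w^2 - i*gamma*w).
# All parameter values are made up for illustration only.
wp = 2.0e16     # effective plasma frequency (rad/s)
w0 = 1.5e16     # resonance frequency (rad/s), in the UV as for typical glass
gamma = 1.0e13  # damping rate (rad/s)

def refractive_index(w):
    eps = 1 + wp**2 / (w0**2 - w**2 - 1j * gamma * w)
    return np.sqrt(eps)

# Green light (~500 nm) lies far below the UV resonance, so the index is
# essentially real (a transparent, coherent response) and greater than 1.
w_vis = 2 * np.pi * 3e8 / 500e-9
n = refractive_index(w_vis)
print(n.real, n.imag)  # real part > 1; imaginary part (absorption) is tiny
```

The point of the sketch is that far from resonance the electrons respond coherently: the imaginary (absorptive) part of the index is negligible, which is the regime in which a lens is transparent.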
Answering your question about whether the electron wavefunction is complex enough to store entanglement, the answer would be a resounding "yes". You don't need a very complex wavefunction to store entanglement. If you look at the original paper by EPR and Bohr's subsequent reply, "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?", the EPR states are weird (in terms of being perfectly localized), but they are definitely not complicated.
That being said, yes, entanglement is fragile, and the greater the loss through the lens, the greater the chance of losing entanglement. That is why these lenses are usually anti-reflection (AR) coated at the wavelength of operation, which makes them very close to 100% transmitting. Only a certain maximum loss can be tolerated by the entangled state.
Lastly, I would like to say that the most important thing is that even if you think of the photons as being absorbed and re-emitted, the emitted photon would be strongly correlated with the absorbed photon through the interaction with the electrons. In passing through the lens, it is not that the electrons consume the photons and then re-emit when they please; that is what happens when light is incident on an opaque object, which emits a blackbody spectrum after absorbing the incident radiation. If the light-matter interaction were jumpy and destructive, then the beam transmitted through the lens would be very different from the incident beam in terms of coherence, wavelength, linewidth and so on.
@Avik - Thank you for your lucid explanation. I had been unable to reconcile the 'simple' jumpy explanation for refraction--too frequently used in texts--with my intuitive grasp of QED explanations of light transmission and EPR experiments.
In researching explanations I tended to either get textbook-jumpiness, "shut up and calculate," or assumptions that no explanation was needed!
Avik Dutt's answer is very good, and so I have only a little to add to it. The first is that this question is highly related to one from Robert Taylor - "A photon enters a transparent medium and exits the other side - is it the same photon?" which touches on some of these questions (although some of the answers in the thread are not so useful in my opinion).
The only issue I would stress is that the key (as Avik Dutt points out) is measurement: just having an interaction is not sufficient to act as a measurement, provided the interaction does not carry away information about the states of the entangled particles. The clearest example of this effect is in extraordinary transmission through metal grids. In this experiment, light is directed at a metal grid with holes smaller than the wavelength, and the transmission is detected. Classically, one would expect that the fraction of the grid that is metal should block the light, whilst the fraction that is not metal (i.e. the holes) should transmit it: this is not what is observed. In fact it is possible to see far more light transmitted than one would expect on these arguments. The resolution is that there are photon/plasmon interactions, so the information is essentially copied from photon to plasmon (a collective electron excitation) and then back again. This is relevant because the same experiments were repeated with entangled sources, and the entanglement was preserved even through this more extreme interaction. The moral of the story is that as long as no information is extracted by the environment, entanglement will be preserved. If the information leaks out or is spread through other degrees of freedom, entanglement will be lost.
I felt tempted to vote up the question thrice or more.
Does anyone know of a reference where the quantum optics of photons moving through lenses is treated as clearly and completely as the classical case is in Born & Wolf?
One theme that keeps coming back in my journals is that it is getting more and more difficult to continue to justify the use of the word "particle" to describe the information parcels for many subatomic, atomic and even molecular entities. I am not really suggesting that the use of the word particle will drop from use, but that it is a highly prejudicial term that inhibits intuitive understanding of quantum phenomena.
I have a growing suspicion that, in situations like what Andrew describes, and possibly even the lenses that prompted my question, Nature may be acting more holistically within materials than is comfortably accepted if we persist in forcing quasi-Newtonian particle-based descriptions onto the phenomena. Avik's description was more palatable, but the very fact that it is difficult to find literature that doesn't lean on old-school jumpiness is an indication that Physics is still struggling to come up with intuitive and accurate language to describe quantum phenomena. The calculations are amazing, the experiments are phenomenal, the results are astonishing ... the language is atrocious.
Anton Zeilinger's group produces extremely tight work. Their papers are meticulous and the experiments impressive, and yet they still often use "which way" information in the explanatory descriptions of the experiments. (I can't remember the German terminology they use in the papers meaning "which way".) It seems clear that "which way" is at best a clumsy interpretation of any wave-function path, just as quantum jumps being the cause of photons slowing in a medium is clumsy. I also don't accept that functionally *more* accurate descriptions are not possible--a kind of Copenhagen-extremist viewpoint I don't hear much these days, but which is apparent in the bias in the writing of some great 20th-century scientists, and which still infects modern published papers.
I'm fairly new to the community, but I'd say building a clear and accurate mental framework on which to grow my understanding of Nature is one of my primary passions, and quantum level, biological level and cosmic level phenomena seem to be the most challenging to describe clearly and accurately. We suffer from a combination of astounding advances in empirical knowledge hindered by a need for a cross-disciplinary leap in the evolution of our descriptive language.
The greatest scientists were also poets who wove new myths and stories to help us grasp new and strange phenomena: Quarks, Black Holes, Wormholes, Quantum Jumps, Tunneling, Entanglement ... these are all poetic descriptions of previously unknown phenomena, that also conveyed much, but not all of what might be going on in a fairly accurate manner. Photons, electrons, lattices, glasses, seem to still have "muddy" but evolving language. Phonons and Solitons are relatively young words I had never encountered before the past few years, so there is hope! :-)
Andrew,
On a second read of this entire thread your response prompted another thought. One of the reasons glasses fascinate me in this context is that they are not uniform crystal lattices. Their structure is not uniform, so it would seem there would be a greater tendency to disrupt something as fragile as entanglement, but in optical glasses this is clearly not the case. Especially in popular literature there seems to be a tendency to treat a glass as a collection of isolated atoms, with the photon interacting in sequential order with individual entities. Rarely is it discussed that the glass in a lens develops wave-like behavior. I have to go back and reread the definition of plasmon, because I'm not sure it applies to non-lattice structures. My further question would be: do "plasmons" apply to glasses, and if not, what is going on? Something is, because the percentage of reflection from the surface of (uncoated) glass is directly related to the *thickness* of the glass, which implies there is some kind of resonant behavior over the entire thickness. I may be beating a dead horse, but occasionally apparently dead horses still have some life in them!
Hi Dean,
Thanks for the followup.
The critical issue pertaining to glasses is the structure on the length scales that are important to the light. In the case of standard glasses and visible light, the interatomic spacing is of order angstroms to nanometres (one goes to larger spacings only when interested in particular resonance effects, but for the moment such details are unimportant). Conversely, visible light has wavelengths of order hundreds of nanometres (green is around 500 nm), and so the light sees a uniform, effective material when travelling through the bulk.
Now with this in mind, of course the photon must be travelling as a wave, and indeed the wave-picture is the canonical (and correct) way to understand propagation of light through a material such as a lens.
I'm not sure if I understand what you mean by the glass developing a wave-like nature. You might be referring to graded-index materials, where the refractive index varies, usually statically (by design) or dynamically in the case of acousto-optic modulators. Or you could be referring to the wavelike nature of the electrons in the glass. This latter is an important effect, but the wavelengths of the electrons are typically much smaller than those of the light, although the energies associated with the electronic levels can be commensurate with those of the light.
But as long as one is not near resonance, so that the possibility of absorption of the light by the matter is negligible (absorption can complicate issues and tends to reduce coherence, but note that this is quantifiable; it is not an 'all or nothing' type of interaction in the decoherence approach we are talking about here), quantum coherence will not be disrupted by the material, and neither will entanglement.
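The "quantifiable, not all-or-nothing" point can be made concrete with a toy calculation: mix a Bell state with white noise (a standard stand-in for information leaking to the environment) and compute the Wootters concurrence, a standard entanglement measure. The mixing fractions below are arbitrary examples.

```python
import numpy as np

# Entanglement degrades gradually, not all-or-nothing: Werner-state toy model.
def concurrence(rho):
    # Wootters concurrence for a two-qubit density matrix.
    sy2 = np.kron([[0, -1j], [1j, 0]], [[0, -1j], [1j, 0]])
    R = rho @ sy2 @ rho.conj() @ sy2
    lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(R))))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|HH> + |VV>)/sqrt(2)
rho_bell = np.outer(bell, bell)

for p in (1.0, 0.5, 0.3):                    # surviving coherent fraction
    rho = p * rho_bell + (1 - p) * np.eye(4) / 4
    print(p, round(concurrence(rho), 3))
# Analytically C = max(0, (3p-1)/2): the entanglement shrinks continuously
# with the noise fraction and vanishes entirely once p drops below 1/3.
```

So a small, quantifiable amount of decoherence leaves a correspondingly reduced but nonzero amount of entanglement; only past a threshold is it destroyed outright.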
The reason (almost paradoxically) why crystalline materials can give 'worse' performance is that the symmetry of the atoms can give rise to effects with clear directions over length scales important to the photons (think birefringence; in that context, of course, 'worse' is perhaps not the right word, as birefringence is more often used as a feature rather than a bug).
Now the plasmon case is slightly more interesting. The plasmon is a collective electron excitation (plasma wave), and is caused by a resonant (or near-resonant) interaction in a metal. Typically plasmons are associated with significant optical loss, and are the reason why metals are poor transmitters of light. Plasmons definitely hold for crystalline and amorphous conductors, again for the effective material arguments discussed above (in fact one does not usually think of the electrons as being crystallised in conventional metals, although this phenomenon can also occur).
Why then does the extraordinary transmission case work? It is because the plasmon is excited in such a way that it deexcites to regenerate the photon that has impinged on it and _has taken away no information about the photon propagation._ This latter point is critical. If the plasmon had given rise to a measurable (and by this I mean measurable at the quantum scale, not necessarily at the human scale) change in the properties of the material that housed it (e.g. a rise in temperature associated with absorbing some of the photon energy), then decoherence would certainly set in. However if the material is thin enough, and the plasmon excitations coherent with respect to the exciting photon, then there may be no loss of quantum coherence, and no information about the photon is kept by the material, and hence no decoherence.
Hope this helps
Andy
Andy,
That is by far the most coherent and cohesive explanation I've read anywhere! Thank you so much. I am finding more and more often that clearly understanding the scale of the various entities that are involved is critical, and the scales are often hard to keep track of.
I think I was so thoroughly confused by the standard electron-absorption-emission explanation that I could no longer see other explanations. I try very hard to understand phenomena as *explained*, and in this case just got confused and irritated as a result. I'll have to reread your comment, but I wanted to let you know that conversations like this are why I love ResearchGate. I know my ignorance is stupendous, but ResearchGate allows people to ask fumbling questions that may lie outside their area of expertise, but are pertinent to something they really wish to understand.
Glass and wave-like behavior: I believe I was alluding to the last case, the wave-like nature of electrons in glass, though I'm not sure I understood what I was suggesting. What are the "energies associated with the electronic levels"? I smell a trip to Wikipedia!
What I was alluding to is how an incoming photon "knows" how thick a plate of glass is, such that when it reaches the first surface, say, 4% is reflected and the rest transmitted. Feynman's little book QED goes through this in detail, and mentions that as thickness increases, the amount reflected oscillates from, say, 0% to 16%. The odd thing is that this has apparently been tested with glass plates up to 50 feet thick. It seems that one of the aspects you discussed above might be responsible for a kind of resonant communication between the front and back surfaces of the plate which "prepares" the front surface in such a way as to reflect 4% or 7%, etc. I wondered: if you suddenly vaporized a half-wavelength of material from the back surface, how long would it take for the percentage of photon reflection to change? In a fifty-foot-thick block of glass, how long would it take for the information to travel from front to back? Would there be a restabilization time?
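Feynman's two-surface picture is easy to reproduce numerically. A sketch in the two-beam approximation (amplitudes from the front and back surfaces only; n = 1.5 and 500 nm are just representative values):

```python
import numpy as np

# Two-beam approximation to reflection from a glass slab: the amplitudes
# reflected from the front and back surfaces add with a relative phase set
# by the round trip through the glass, so the reflectance oscillates with
# thickness between ~0 and 4*r^2 (16% for n = 1.5), as in Feynman's QED.
n, lam = 1.5, 500e-9
r = (n - 1) / (n + 1)                 # single-surface amplitude (~0.2, i.e. 4%)

def reflectance(d):
    delta = 4 * np.pi * n * d / lam   # round-trip phase for thickness d
    return np.abs(-r + r * np.exp(1j * delta)) ** 2

d = np.linspace(0, lam / n, 2001)     # scan one optical wavelength of thickness
R = reflectance(d)
print(R.min(), R.max())               # oscillates between ~0 and 4 r^2 = 0.16
```

Nothing here requires the photon to "know" the thickness in advance; the interference of the two reflected amplitudes produces the oscillation automatically.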
I realize this is getting into the weeds a bit, but I'm interested in having *very* accurate mental models for quantum behavior, and with light transmission I feel you have helped nudge me in the right direction. It is going to take me some time to digest.
I also just remembered a similar situation to the metal lattices you discussed, where the experimenter found that barriers with holes in them actually blocked more sound than barriers without holes!
Thanks again, to all in this thread, for helping me to get back on track. I'll probably be twice as confused tomorrow, but that happens a lot around here! :-) I'd like to loop this conversation back to which part of your description is responsible for the "slowing down" of the photons in a glass. I'm feeling a bit thick at not getting this clearly, but I'm trying!
I'm going to go visit Wikipedia to see if I can sort this out myself, but you may beat me to the punch!
Dean
"Think Crazy, Prove Yourself Wrong!"
Hi Dean,
Actually the question of when the photon "knows" about the reflection is really interesting, and has been studied in a number of different contexts. Whilst the 'vapourisation' experiment is not so easy in practice ( ;) ), it is possible to make a Fabry-Perot cavity and move one of the mirrors on timescales fast relative to the storage time of the cavity, and if you change its properties fast relative to the speed of light then you get really weird effects. The other related topic is that of delayed-choice experiments, which are functionally similar to the case you are interested in (although the physics manifests differently).
Anyway, I thought I'd just give you this result quickly between meetings for me, so that your trip to Wikipedia (and related places, I recommend you look up the arXiv also) might have some additional foci.
All the best
Andy
I spend a lot of time on arXiv.org, but find I have to ping-pong back and forth to Wikipedia to look up various terminology, then double-check other sources to see if everything lines up. I often say to my kids, "Aaargh! I wish I could just stuff all this math into my head!"
I've read quite a bit about the delayed choice experiments, but hadn't heard about the Fabry Perot cavity experiments. I'm excited to look that up!
Many phenomena behave well when parameters are in the middle of a range, and experiments tend to be forced toward "middle" behavior because it is easier to create stable mathematical models where there is no chance of turbulence or anomalous behavior. That seems to be much of the early impetus for The Copenhagen Interpretation's "shut up and calculate" dictum. Once experimental method and mathematical modeling advances, the most interesting papers are often where some researcher has found a way to poke around at the edges!
Thanks again for all your guidance.
Dean
Dear Dean,
I read this entire discussion with great interest. A lot of these off-the-cuff discussions on the Gate hit the problem on the head, while the official side of science is continually sweeping things under the carpet. I absolutely agree with Dean that the terminology we use is defective and the term 'particle' misleading. However, I was compelled to join the discussion because the term 'wave' is equally misleading. Both are concepts of an extreme reductionism that was so successful in the last 400 or so years in describing nature, but has its natural limits.
The discussion should start with the redefinition.
What the hell is a photon?
It is not a particle in the extreme sense (dimensionless and pointlike, with well-defined kinematic descriptors). It is certainly not a wave (as an ideal wave should be infinite and periodic). It is something that cannot even oscillate, so it cannot even be a partial wave (limited in extent). The EM wave in vacuo is created by a motion of charge and the subsequent regeneration of the electric and magnetic components from each other. (See my comments on dualities in a different stream.) In order to obtain a periodic change one would have to move an electron several times up and down between the orbits to get multiple periods. Since a quantum transition is a single act, something (whatever we call an electron) undergoes a single transition; classically it accelerates and decelerates and creates a BLIP: an electric component goes up and down (and then the magnetic one duplicates it and propagates).
We mostly created the photon-as-a-wave by use of the Fourier transform. Any function becomes a wave when subjected to the Fourier transform. So math won big time in distorting the reality. I struggled with that concept for a long time until I encountered a marvelous explanation in off-the-mainstream sources.
How Long Is a Photon?
I. V. Drozdov A. A. Stahlhofen
http://arxiv.org/abs/0803.2596v1
The consequences of this type of description are simple. In the classical picture it is a blip (wavelet, half-wave, soliton, etc.). However, it cannot be classical, because classically, as soon as it started moving it would radiate its energy away and decay to nothingness; so it must obey a "quantum" view as a portion of energy that cannot be easily disposed of, and it becomes like a particle - a grain of energy traveling and regenerating by transmuting itself from purely electric to purely magnetic on its way through space. It needs to encounter a similar quantum object in order to react with it. So, as you see, it is not easy to destroy or change the state of such a photon. But in a state of transfer (as a blip) it is a wave, so it interacts with other blips; hence diffraction and coherence and all this stuff.
Now if you imagine that entanglement is nothing else but a coordination in properties between two adjacent quantum wells (see my explanation of this concept in other stream about Pauli principle) you easily understand that whatever property it is: polarization or encoded by it spin must be entangled as long as something comparable does not interact with it on a quantum level (a level on which the quantum cells would require a change).
Absorption is obviously such an act (because it interacts with an electron in a coupled quantum cell), but any transformation into a polariton or a phonon or any form of "quantum" particle obviously does not do the job. So the quantum property survives. For me, much more compelling were the experiments with slowing and stopping light done in Boston several years back. I finally got my head around it when I realized that, from the point of view of quantum cells, if the BEC of many hundreds of Na, Ru or I atoms is in a single quantum state, they form a "super atom" - another quantum cell which interacts with a single photon as a single electron does. So this "superatom" can absorb it and re-emit it. Because of the large physical dimensions (almost millimetre size), the momentum variable that is its dual must be affected.
So in essence, glass never belongs to the quantum space of a single photon unless it has some kind of impurity that would be commensurate in position or momentum space with a single photon.
However, some notions of refraction should be straightened out. Dean, refraction is the result of a tremendous but almost continuous change in electromagnetic propagation conditions in solids (not re-absorption or jumping). There is a whole field of experimental solid state physics that tries to numerically explain the rotational and optical density effects of materials from the elementary properties of the underlying molecules. It is a difficult field, but a simple lesson is worth taking from it as an intuition. Because molecules mostly interact with each other through orbitals, the modifications of the EM field in crystals are comparable to visible light frequencies and wavelengths. We do not have X-ray lenses because the existing molecular systems do not produce a sufficiently strong EM field inside to influence 1000-fold more energetic photons. In order to do that we would have to use nuclear matter. So if we ever master nuclear fusion and the handling of nuclear plasma, we will probably have X-ray lenses at home (just joking).
As a final remark, for more "math-oriented" people: seeing the world through math is creating new realities (as math is an invented artificial/formal language), so only the coordination of abstract math formulas with the underlying descriptors can expose the limitations of the math, as well as of our perception of the world. Only this coordinated effort can lead to reaching down to "truth", if something like that exists at all, and not just to our impressions of it.
Dear Boguslaw,
Whilst some of your points above are quite helpful and correct, there are other ones that I fear are misleading and not grounded in the latest research.
Your statements about the nature of a photon, for example, are not so helpful in my opinion.
Certainly it is true that the concepts 'particle' and 'wave' are idealisations that are not completely valid in any setting. Hence the use of terms like 'particle-like' and 'wave-like' to describe the aspects of quantum mechanical properties that correspond to our classical understandings. For example, a particle-like property is quantisation, and another particle-like property is anti-bunching. Conversely wave-like properties include interference and superposition.
Your comment regarding 'absorption/emission' cycles for measurement is mainly correct, although one does need to be careful to stress the fact that it is the carrying away of information that is critical here (and that that information can also be erased which can restore or even create entanglement under appropriate conditions).
It is simply not true to say that waves must be infinite in extent. The wavepacket approach is well-defined for both classical and quantum wave phenomena. It is certainly correct that we use a Fourier decomposition to explain properties such as the frequency bandwidth of a finite-width optical pulse, and that the component sine waves are infinite in extent: but they simply represent a basis for explaining the phenomena and are not the phenomena themselves. With this in mind I don't understand what you are trying to imply by the statement "We mostly created a photon as a wave by use of Fourier transform".
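The wavepacket point can be shown directly: build a finite Gaussian pulse, Fourier transform it, and observe a spectrum peaked at the carrier with a finite bandwidth. All numbers below are illustrative (a ~20 fs pulse with a ~600 THz carrier, roughly green light).

```python
import numpy as np

# A finite wavepacket is perfectly well-defined: a Gaussian envelope on an
# oscillating carrier. Its Fourier components are infinite sine waves, but
# they are only a basis; the physical object is the localized packet.
dt = 1e-16                             # time step (s)
t = np.arange(-2e-13, 2e-13, dt)       # 400 fs window
f0, tau = 6e14, 2e-14                  # carrier frequency (Hz), envelope width (s)
pulse = np.exp(-t**2 / (2 * tau**2)) * np.cos(2 * np.pi * f0 * t)

spec = np.abs(np.fft.rfft(pulse))      # magnitude spectrum of the packet
freqs = np.fft.rfftfreq(len(t), dt)
peak = freqs[np.argmax(spec)]
print(peak)  # spectrum peaks at the carrier, near 6e14 Hz, with finite width
```

The packet is strictly limited in time, yet its spectrum is a well-defined peak of width ~1/(2*pi*tau): finite extent and wave behavior coexist with no contradiction.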
You mention the fact that the photon does not oscillate: I think I'm a little troubled by this statement and am not sure what you are getting at. Certainly the electromagnetic field does exhibit oscillations, and these have been imaged directly using attosecond techniques. I'm not aware of whether this has been done at the single-photon level, but I do not see this as an important distinction, as the properties of a wavepacket imaged in this way are the same for each component photon in that packet. See http://www.attoworld.de/Home/newsAndPress/Gallery/index.html for an image of the first such imaging; a more recent one can be found at http://www2.physics.ox.ac.uk/research/ultrafast-quantum-optics-and-optical-metrology/latest-news-0.
I'm also not convinced that your suggestion, intuitive though it may be, that the electron must oscillate multiple times to generate a photon is either helpful or correct. Consider the example of a single two-level atom and the emission of a single photon from it. We apply an excitation pulse to the system and promote the electron to the excited state (formally we actually require a three-state system to do this properly: excite to a high-lying state, allow it to decay to the upper state of the transition, and then the argument proceeds). There will be a finite lifetime for emission, the spontaneous emission lifetime, and in this time the emission will generate one and only one photon. The extent of the photon so emitted is governed by the lifetime, and its single-photon nature is due to the fact that the system undergoes only a single emission cycle and is a single emitter. This argument does not require absorption and re-emission by the emitter.
You mention the slow light work in BECs: this is something I know quite a lot about. The work you describe is beautiful and was certainly ground breaking, but it is important to realise that the BEC nature of the sample was not in the least bit significant for the demonstration of the phenomenology. These effects have even been shown in hot vapour cells and solid-state systems, and at the EIT/group-velocity-modification level there is no modification due to the BEC nature of the sample. Of course, BECs have their own advantages, and I'm not saying there are no places where you would want to use a BEC for extensions to this work, but in the case you mention the 'super-atom' concept is not relevant to this discussion.
I'm also not sure I understand your comment regarding X-ray lenses. There are several lensing technologies that work for X-rays, and whilst I agree that they are not as convenient as visible-light optical elements, it is wrong to suggest that they don't exist (or that they require nuclear matter).
I hope that this helps
Andy
Dear Andy,
Call me Bog. Short posts do not allow for extensive discussions. My English would have to be sharper than a top scholar's to even define the language of the conversation. If you want to continue this discussion, let's do it offline. I felt compelled to add my language to the conversation; this is why I did it. Your response is the best proof that it was needed. I am afraid that you did not understand my remarks, and I assure you that I do not claim that an electron needs to move many times between the energy levels to create a photon. The core of the argument is the language, not the phenomena. And this convinces me that you do not have a good imagery of a single photon, and as a classically trained person buy into the nonsensical image of a packet. (Please do not discuss why I think it is nonsensical; think about it as a quantum with all the historical baggage.) By the way, I am aware of all the experiments you mention, but this is not the time and space to discuss them. (Additionally, I strongly advise reading the material that I cited before answering.)
Let me only reiterate one point. An EM wave can only be created by the movement of a charge in a particular inertial system. You can create a magnetic field by simply changing the inertial system, owing to the duality of the EM fields. In order to classify something as a wave we need the attributes of a wave, and therefore multiple "crests" and "valleys", which is equivalent to a wavelength and frequency, customarily recognized as a wave (the most classic definition of sin or cos). Hence, the charge has to move at least once back and forth. In the absence of a vibrating medium (vacuum is not a medium), a photon is a "half wave", which you can recognize as a prototypical wave, but in reality that would be an unjustified extension of the word. By that standard any function would become a "wave". The prosthesis of a wave packet does not rescue these simple facts.
The oscillation that you refer to, and would like to recognize as a wave, is the progressive motion (as a photon does not have a rest mass) reproducing the EM field in propagation, which is not equivalent to the initial pattern. This is basically equivalent to the fact that a single photon cannot possibly have a single frequency (its Fourier transform will have dispersion and multiple components). The experiments that you are referring to are for "trains of photons" and cannot possibly describe a single object. A single object is by definition a quantum object, and as such it is difficult to wrap one's head around it. Its quantum properties mean that it becomes fuzzy: as you try to define the frequency, you lose other attributes. However, from a definition (however fuzzy) of its spatial and time dimensions (have a peek at other posts concerning other quantum issues to follow this particular language), it is somewhat limited in space and time (or pseudolocalized). This pseudolocalization is the cause of all the problems. So let's settle this amicably: it is not a particle and it is not a wave (and have done with it, and be on our search for better words). Even if you do not like my attempts, you have to admit we are in dire need of a better language. (Have a peek at the stream discussing what space really is and the breakdown of the meaning of spacetime in quantum phenomena.)
So, I really do not think that classifying my imagery as helpful or unhelpful is itself very helpful. Let people have their own opinions: if it is unhelpful it will be rejected; if it is helpful it will be accepted. I meant to be brief, but...
Andrew and Bog. I really enjoy both of your perspectives, whether they meet in the middle or not. This thread has already got me rebuilding my intuition, and I will comment again after I get some sleep. When I research, I find myself pushing intuition as far as it will go until it bumps into something mathematical, which I must then push into to learn whether my intuition has any value, or whether I can do my favorite thing, "prove myself wrong," at which point I can start to understand which conceptions I had misunderstood.
At this point in time there is a great deal of philosophy and dogma jousting with the utterly amazing results pouring forth from stupendous experiments. For a generalist like myself, this is a dream come true: verifications and falsifications flooding out onto easily accessible web sites on a daily basis.
Hi Dean !
Entanglement can exist in various degrees of freedom. Most frequently for photons one prepares an EPR pair entangled in polarization, so let's assume we are talking about that. Now, the polarization of light does not suffer any change when passing through a lens. Of course, this is an idealization, but it holds to high precision.
What happened to the interaction with the lattice of atoms (and their electrons; note that there are no free electrons in the glass, or our poor photon would have a hard time surviving)? You have two ways to visualize it: waves and photons. When a wave spreads through glass it feels a force that is a sum of all the fields, so there is nothing that is transferred to a specific electron, much less the entanglement. Entanglement is a very specific property, and you need a specific action to affect it. Note that the reason photons are so big (roughly 5000 times the size of an atom) is that their passage through matter can only work because they address a highly collective motion and avoid absorption by any single atom. If it makes you feel better, at the level of photons you may think of a photon passing through matter as exchanging virtual energy with it, not real energy. Virtual energy is energy that you exchange with the system but get back exactly (uncertainty relations); the net effect may be a change of momentum (the k vector), not of energy. After all, a single photon can pass through a whole lens without any transfer of energy (change of color). Some photons, of course, will get stuck in the lens (absorbed), and that is unfortunate for entanglement too, but if the glass is good this is a very small effect.
Note that the entanglement of two particles is a property of a SINGLE entity, which consists of two particles and which possesses a property that can only be described as a single-particle property. This is an exclusive peculiarity of QM. Effectively, the wavefunction of an entangled pair is a 2-photon state that cannot be expressed in any way as a mathematical combination of two single-photon wavefunctions. So why do we believe that there are two particles in it? Because when we break it, for example by measuring polarization at two distinct places, we get TWO photons; but note that after that the entanglement is gone forever.
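Mario's point that the entangled pair cannot be factored into two single-photon states can be checked numerically: a product state has Schmidt rank 1 (one nonzero singular value of its coefficient matrix), while a Bell state has rank 2. A minimal sketch in Python/NumPy, assuming the usual H/V polarization basis:

```python
import numpy as np

# Two-photon polarization Bell state |Phi+> = (|HH> + |VV>)/sqrt(2),
# written as a 2x2 coefficient matrix C[i, j] over (photon A, photon B).
bell = np.array([[1.0, 0.0],
                 [0.0, 1.0]]) / np.sqrt(2)

# A product state |a> x |b> has a rank-1 coefficient matrix: here |H>
# on photon A and a diagonal polarization on photon B.
product = np.outer([1.0, 0.0], [1 / np.sqrt(2), 1 / np.sqrt(2)])

def schmidt_rank(c, tol=1e-12):
    # Number of nonzero singular values = number of Schmidt terms
    return int(np.sum(np.linalg.svd(c, compute_uv=False) > tol))

print(schmidt_rank(product))  # 1 -> factorizable into two 1-photon states
print(schmidt_rank(bell))     # 2 -> cannot be factored: genuinely entangled
```

The Schmidt rank is exactly the criterion behind "cannot be expressed as a combination of two single-photon wavefunctions": rank 1 means separable, rank greater than 1 means entangled.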
Electrons can also be entangled, as can atoms, etc. With electrons you would most naturally go for spin entanglement. Note that it is not the complexity of the wavefunction but its SIMPLICITY that is required if you want your entanglement to survive temptations from the surroundings (also known as decoherence).
One more thing: entanglement is not a mechanical property; it stems from the quantity of specific information contained in the system. A pair of polarization-entangled photons contains one and only one qubit of polarization information. That is what Einstein, Podolsky and Rosen did not understand, and what makes so many people today wonder how two "distant" particles can be entangled. Simply put, mechanical properties such as energy, momentum, distance and time are NOT part of the description of entanglement and therefore have no connection whatsoever to it. There is nothing "distant", "spooky" or "forcy" about entanglement; it is purely a question of information put into the system when it was prepared. The two poor photons, when interrogated, would gladly give quite different and unrelated polarization results, but the system contains only ONE qubit of information, so the polarization measurements must be completely (anti)correlated. Why would that be spooky, and what does distance have to do with it?
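The perfect (anti)correlation carried by that single qubit can be made concrete. A small sketch, assuming the polarization singlet state and ideal polarizers, reproduces the textbook correlation E(a, b) = -cos 2(a - b):

```python
import numpy as np

def pol_vec(theta):
    # Linear polarization state at angle theta (H = 0, V = pi/2)
    return np.array([np.cos(theta), np.sin(theta)])

# Polarization singlet |psi-> = (|HV> - |VH>)/sqrt(2)
psi = (np.kron([1, 0], [0, 1]) - np.kron([0, 1], [1, 0])) / np.sqrt(2)

def correlation(a, b):
    # E(a, b): average of the product of +/-1 outcomes at polarizer
    # angles a (photon A) and b (photon B), summed over all 4 outcomes.
    E = 0.0
    for sa, ta in [(+1, a), (-1, a + np.pi / 2)]:
        for sb, tb in [(+1, b), (-1, b + np.pi / 2)]:
            amp = np.kron(pol_vec(ta), pol_vec(tb)) @ psi
            E += sa * sb * amp**2
    return E

# Same polarizer settings: perfectly anti-correlated
print(round(correlation(0.3, 0.3), 6))   # -1.0
# General settings follow E = -cos(2(a - b))
print(np.isclose(correlation(0.2, 0.5), -np.cos(2 * (0.2 - 0.5))))  # True
```

The point of the exercise: each photon alone gives completely random results, yet the one shared qubit of information forces the product of outcomes to follow a fixed correlation law.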
@mario
Thank you for your interesting contribution.
>>reason why photons are so big (5000 x atoms)
I had not before had a picture of the "size" of a photon passing through a lens. As noted in many traditional explanations, the apparent slowing/bending of light has been described as the delay caused by absorption/delay/re-emission from an electron. Pretty much all of the descriptions in this thread say that is not only a convenient oversimplification, it is probably just plain wrong.
When you speak of SIMPLICITY of a wavefunction in order for entanglement to survive, this touches on many thoughts I have had about current entanglement experiments largely being centered on the *purification* of entanglement. We are teasing out single threads of entanglement from the otherwise complex weave of the wavefunction of larger systems when we wish to study or exploit entanglement. One of Roger Penrose's contributions to my understanding is that *much* of the wavefunction of larger systems is concerned with matters of entanglement, but through forms that are largely hidden from the view of our experiments and not easily exploited for profit by our technologies, thus they get little attention!
I am a bit slow in learning some aspects of quantum behavior, which is probably not surprising! The reason entanglement could still be considered spooky is that most experimental descriptions limit their discussion to just the simplicity you speak of. The anti-correlation you mention is maintained at what could be argued are great distances, but without considering *both* photons as parts of a *unified* wavefunction, this looks like spooky action at a distance. Even though the wavefunction of an entire experiment (emitter, lenses, beamsplitters, birefringent crystals, etc.) is quite complex, it is difficult and unnecessary to convey all that in a 3-10 page paper! Or is it?
>> it is purely a question of information put in the system when it was prepared.
There is a subtle but important distinction. The mechanical equations of a pair of classical objects prepared at a single location and then separated cease to be united when they separate. The equations, the wavefunction, of a pair of polarization-entangled photons separate in space but *share* a super-wavefunction unless interfered with by their environment.
Bell basically says, "Yes, the wavefunction of the pair is a result of how the system was prepared, but the preparation is not enough to determine the outcome of later measurements if the wavefunctions of the two entangled parties share no connection." We must continue to consider *both* photons as parts of a single system, wavefunction, or a stable portion of a wavefunction of a much larger system.
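Bell's point can be put in numbers. Using the singlet correlation E(a, b) = -cos 2(a - b) at the standard CHSH angles yields S = 2*sqrt(2), while any model in which the preparation fixes each photon's answers independently is bounded by S = 2. A minimal sketch:

```python
import numpy as np

# CHSH combination for the polarization singlet. Any local-realist model,
# where preparation alone determines each photon's outcomes, obeys S <= 2.
def E(a, b):
    # Quantum singlet correlation for polarizer angles a, b (radians)
    return -np.cos(2 * (a - b))

# Standard angle choices that maximize the quantum value
a, a2 = 0.0, np.pi / 4
b, b2 = np.pi / 8, 3 * np.pi / 8

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(round(S, 4))   # 2.8284, i.e. 2*sqrt(2) > 2
```

The violation is exactly the sense in which "preparation is not enough": no assignment of pre-existing local answers reproduces S above 2, so the two photons must be treated as one system.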
I argued in a rather clumsy essay on FQXi.org that it might be helpful to consider this anti-correlation as a non-spatial *structural* element of the *combined* wavefunction of entangled particles. Much of the discussion of wavefunctions is about motion and local interaction. Combined notation *implies* connection, but rarely discusses the nature of the binding mechanism.
I'm not convinced I'm right, but the key proposition is that the "binding" between prepared-entangled-entities is largely ignorant of the entangled parties' physical locations, their place in time ... to anthropomorphise a bit ... the part of the wavefunction dealing with entanglement is largely ignorant of, and blind to, space and time. It does not *communicate* between entangled entities, it just connects them, like a stiff brass rod spinning, in tandem, at two different locations in space. It is *shared* information. If the entanglement is disrupted by the environment, and one believes in information not being destroyed, then whatever "bit" of information was conveyed by the entangled pair is not lost, it is passed on to some other more complex wavefunction.
The challenge is understanding that the universe seems to support these connections *without-reference-to* spatial separation, hence my (naive) suggestion that the ignorant-structural link between entangled parties exists, but as a simple, blind connection largely not interested in space or time ... as you say a SIMPLICITY of connection, this structural link's only job is to carry the anti-correlation ... unless that correlation is disrupted ... passed on to some other entity in the environment, the larger, eventually universal wavefunction.
The question then becomes, "Is there any reason to *not* believe that non-spatial, non-traditional connections not only exist, but may be ubiquitous in the universe?"
Are we over-counting information? Does nature *constantly* share information between identical entities? All electrons are supposed to be identical (maybe with a few spin characteristics varying) and identical entities are somehow supposed to be entangled in the wavefunction. Does this mean that Nature has a "template" in a non-spatial realm where the basic attributes of an electron are non-locally enforced?
(I apologize for the rant, but it has been a long week, and I'm about to go into the bowels of a convention center for a week, so I thought I'd leave a few thoughts for when I return!)
Dean
P.S. I am a slow learner, but I adjust rapidly and will give up absurd positions when presented with good references, compellingly more coherent explanations, or am pointed to a vast storehouse of new knowledge which can help dispel the hopeful, but incoherent mythology of prior (or current) generations of thought! I'd much rather learn than be right. ;-)
When it comes to the interaction of electrons or other particles, I believe
that issue will involve time correction, which is part of my research. Neither Newton nor Einstein asserted their beliefs about time and space as explained with respect to energy or momentum. Where time is perceived or measured may be a factor. When time approaches an instantaneous change, or 0, with reference to a vector, an interaction or time correction that may already exist will occur again. However, if time approaches 0, then is matter necessarily destroyed? According to thermodynamics, matter is neither created nor destroyed. There may be a lot of future research involved.
Dear Manoj, sooner or later you will have to learn to form clear thoughts and to express them clearly. Now is a good time to start.
John Paul,
One of the reasons I posed this question was indeed that light propagation in lenses seemed a "classical bug" forced onto some descriptions of quantum systems. I do find the use of "information" in descriptions of quantum systems to be compelling, though I am also trying to be very careful not to accept quantum information as the be-all-and-end-all description without gaining a better grasp on the situation.
As a somewhat naive observer I find it pretty easy to overestimate my ability to fully grasp the implications of entanglement. I'm guessing I'm not alone!
Hi,
I would like to comment on "the cause of refraction in glass is the delay caused by absorption/re-emission of the photon by many, many electrons".
The quantum interpretation of the index of refraction of a transparent dielectric is based on virtual transitions, not real ones. The energy of the photons is not enough to promote electrons into the conduction band. We are rather dealing with a typical quantum global effect, described by sums of transition integrals involving the promotion of an electron and its quasi-instantaneous annihilation, and the reverse in time! Experiments show that this does not affect entanglement the way real light-matter interactions do.
I once made the "classical thinking" calculation of the time delay induced by this quantum effect for diamond, knowing the index of refraction, the density and the propagation length at a given visible wavelength, and found a quantity of less than an attosecond.
Charles,
Thank you for emphasizing the concept of a virtual transition in the conduction band. I keep forgetting about the possibility of virtual components in certain situations. I sometimes wonder if nature "bothers with" performing all of these virtual transactions, even if I understand that they are significant in their contribution to the final behavior of quantum entities. It is more that I wonder whether it is possible that many, if not all, virtual events are summed out, cancelled or otherwise balanced out in a more holistic view of the entire systems involved. I guess I am asking if nature is better at summing Feynman diagrams and finding ways to eliminate unnecessary transactions than physicists are!
Also, I am a bit confused, but intrigued by, your classical thinking calculation. Are you saying the summing of all those attosecond events does or does not add up to a reasonable approximation of the refraction delay?
Hello Dean,
All of these non-causal virtual-transition calculations are only a way of calculating things; the interpretation is the existence of an index of refraction. If you want to think of the index of refraction classically (which can be done, as the light wave function, the electromagnetic field, is a real function, not a complex one like that of a particle), the total time delay experienced by a photon crossing a transparent medium is the sum of the delays it incurs at every atom/molecule of the medium as it starts the vibrational movement of the electrons. This is what I call my "classical thinking": dividing the total delay of the light by the number of atoms the light has to cross. It is also the mean value of a couple of virtual transitions on each atom.
I chose diamond because it is the only mono-atomic transparent medium I know. The mono-atomic character greatly simplifies thinking about these virtual transitions.
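For what it's worth, Charles's estimate is easy to reproduce. A rough sketch using approximate textbook values for diamond (the index and layer spacing below are my own assumed inputs, not Charles's exact figures):

```python
# "Classical thinking" estimate: the total extra delay (n - 1) * L / c is
# spread over the L / d atomic layers the light crosses, so each layer
# contributes (n - 1) * d / c, independent of the path length L.
n = 2.42          # approximate refractive index of diamond (visible light)
c = 3.0e8         # speed of light in vacuum, m/s
d = 1.78e-10      # assumed spacing between atomic layers in diamond, m

delay_per_atom = (n - 1) * d / c
print(f"{delay_per_atom:.2e} s")   # on the order of 1e-18 s: sub-attosecond
```

With these numbers the per-atom delay comes out just under an attosecond, consistent with the figure quoted above.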
Hope it helps.
Charles,
That helped very much. I find that having many, many different descriptions of the same basic phenomenon is necessary for me to understand anything on physics well, and virtual transactions in the conduction band was not a perspective I had considered.
Well, this whole thread shows how clear the "entanglement" thing is. It is preserved, not destroyed, and so forth... Now we have a principle of conservation of the "entanglement" whatever the "entanglement" is.
@Oscar,
Whilst it would be tempting to agree with you that there is some conservation of entanglement - unfortunately this does not appear to be true at all, at least not in the usual sense of a conserved quantity.
In most cases, entanglement is not really an observable in the same way that polarisation or position is an observable. Also, we do not have a rigorous measure of entanglement once we go beyond simple cases, and the candidate measures can wildly disagree.
So whilst we can talk about the ideal case of an EPR-type experiment (as here) as preserving (two-qubit) entanglement, it is not so obvious that we can easily generalise.
I'm a bit cagey about this, because of course we would love to be able to generalise these arguments, and it has been a major goal of quantum information processing (which has, incidentally, been very fruitful), but conservation of entanglement is not at all an obvious concept.
This is an old question that already has several very good answers, so I only wanted to add a simple but handy rule: If the photon leaves the atoms undisturbed, that is, without any trace of its passage, then it will remain quantum and entangled.
The interaction of an entangled photon with a lens produces NO change in the state of the lens.
This is enough to preserve entanglement.
I do not agree with "entanglement is not really observable". There exist operators called "Bell operators" with eigenvectors (the Bell states in a 4-dimensional space) which are measurable. This is also related to the concept of "stabilizer codes" used in quantum information, introduced by Gottesman.
Hi Zeno,
I assume that you are responding to my comment, so given your comment I need to clarify.
You have misquoted me: I didn't say "entanglement is not really observable", I said "entanglement is not really an observable". In this context I mean an observable in the quantum mechanical sense of the word, i.e. an operator that can be applied to get a result that would be the amount of entanglement.
In saying this, I accept that under restricted settings entanglement can be measured and quantified, for example via a full Bell-state measurement. This is slightly different from a single observable, however. The stabilizer codes are closer, but keep in mind that although they project onto entangled states, and are observables, the results are typically binary outcomes, i.e. the eigenvalues of the stabilizers. These are correlated with entangled states, but do not, by themselves, give the entanglement of an arbitrary multi-particle state (especially when decoherence is taken into account).
Entanglement witnesses are certainly observables, but again, they simply tell you whether a state has a particular kind of entanglement.
All of this is why entanglement is not really an observable, even though it most certainly is observable/measurable.
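To make the distinction concrete: an entanglement witness is an honest observable, but its expectation value only flags a particular kind of entanglement rather than quantifying it. A minimal sketch with the standard textbook witness W = I/2 - |Phi+><Phi+| (my illustrative choice, not something from this thread):

```python
import numpy as np

# Witness W = I/2 - |Phi+><Phi+|: Tr(W rho) < 0 certifies entanglement
# for states near |Phi+>; a non-negative value is simply inconclusive.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
W = np.eye(4) / 2 - np.outer(phi_plus, phi_plus)

rho_entangled = np.outer(phi_plus, phi_plus)   # the Bell state itself
rho_mixed = np.eye(4) / 4                      # maximally mixed, separable

print(np.trace(W @ rho_entangled))  # -0.5  -> entanglement detected
print(np.trace(W @ rho_mixed))      # 0.25  -> witness says nothing
```

Note that neither number is "the amount of entanglement": the witness answers a yes/inconclusive question, which is exactly why it falls short of being an observable *for* entanglement.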
I hope that this clarifies the point.
Andy
@Klaus-Dieter Yes, that is fun! I have a much more "coherent" picture of the processes involved from the many helpful answers to my question and your answer added a new, very quantum, twist for me to consider.
This is really a question of language. Many simple and sometimes even classical phenomena can be easily made mysterious and weird by abusing the quantum language.
A circularly polarized photon can be regarded as the result of "entanglement" of states of linearly polarized photons. In turn, a linearly polarized photon is an "entangled" state of left and right circular polarization states. Not much substance, except the trivial fact that x = (1/2)[(x+iy)+(x-iy)].
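The identity x = (1/2)[(x+iy)+(x-iy)] is easy to verify with Jones vectors; a one-line sketch of the superposition being described:

```python
import numpy as np

# Jones vectors: linear x-polarization is an equal superposition of the
# two circular polarizations, which is all the "entanglement" amounts to.
x = np.array([1, 0], dtype=complex)    # linear, horizontal
y = np.array([0, 1], dtype=complex)    # linear, vertical
left = (x + 1j * y) / np.sqrt(2)       # left circular
right = (x - 1j * y) / np.sqrt(2)      # right circular

recombined = (left + right) / np.sqrt(2)
print(np.allclose(recombined, x))      # True
```

This is superposition within a single photon's polarization, not two-particle entanglement, which is precisely the language abuse being pointed out.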
There is no mystery in transmission of photons (or, simply speaking; light) through a lens. The mystery comes from the misleading picture that our mind generates when we hear that two photons, or dead and live Schroedinger cats are "entangled".
The physical cats are not entangled in any way. The quantum mechanical description by a wavefunction that is a superposition of states tells us how to calculate the probabilities. That's all there is.
I agree with the last comment.
Having started with quantum mechanics (at the age of 17), collected all the degrees in physics up to the PhD, and since worked a lot in the fields of optoelectronics and telecommunications, I find that these debates are perhaps not pointing at the pertinent aspects of the phenomena.
And my feeling is that many aspects of optics (mostly interferometry) are in some way being "rediscovered" by the quantum information community. But when the question comes up whether there can be a parallel between the two fields, the answer is always that one is classical (optics) and the other is quantum, and that in one case one deals with "many" photons and in the other with single photons.
I think the distinction should rather be whether we consider coherent measurements, as in interferometry (which is also the case for all radio technology using antennas since N. Tesla and G. Marconi, and more recently in fiber-optics technology), or direct detection, where we detect photons and lose phase information (coherence). If we are in the coherent domain then we "work" with waves, and in that case there is "essentially" no difference between "quantum" photons and "classical" photons...