The loss of energy results in an increase in wavelength for radiation. If this process continues for an infinitely long time, could all radiation (gamma, X-ray, UV, etc.) end up at a longer wavelength?
It depends on the average energy (temperature) of the particles interacting with the radiation. For example, interstellar dust lanes in our Milky Way galaxy consist of hydrogen molecules and other small particles at an average temperature between 2 and 4 Kelvin, just above the average background temperature of the universe. Photons of radiation (gamma rays, X-rays, UV, visible, and infrared) hitting these dust lanes, if they are absorbed or scattered by the dust, will heat the dust particles slightly, and on average the re-emissions will tend toward longer wavelengths, eventually ending up at microwave and radio frequencies, which are the characteristic frequencies of the black-body spectrum of molecules at 2 to 4 Kelvin.
As another example, consider the particles in the outer regions of our sun: these are mostly hydrogen ions at about 5800 Kelvin. When gamma rays, X-rays, or UV rays hit these hydrogen ions, they again tend to lose energy to the ions, and are re-emitted mostly as visible-light photons (the frequencies typical of a "black body" source at 5800 Kelvin). Note that when I use the physics term "black body" I do not mean an object that looks "black" in color--I am referring to the Max Planck era (~1900) measurements and theory of infrared photon emissions from dark objects (such as a lump of carbon). A theoretical physics "black body" does not actually look black if it is hot enough (for example, the sun) to emit visible light. If you were to send an intense beam of microwave or infrared photons into the surface of the sun, in general these photons would actually scatter and be UPshifted to higher energy (shorter wavelength), because the average hydrogen ion in the sun has MORE energy than the microwave or infrared photons hitting it.
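To make these black-body numbers concrete, here is a minimal sketch using Wien's displacement law (lambda_peak = b/T). The temperatures are the ones quoted above; the constant is the standard value.

WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_m(temperature_k):
    # Wavelength (in meters) at which a black body at this temperature peaks.
    return WIEN_B / temperature_k

print(peak_wavelength_m(3.0))     # ~9.7e-4 m: millimeter/microwave, cold dust lane
print(peak_wavelength_m(5800.0))  # ~5.0e-7 m: ~500 nm, visible light, solar surface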
So it is not always true that when a photon of radiation hits a particle, it scatters (or is absorbed and then re-emitted) at lower energy and longer wavelength. It is true that a particle will generally emit radiation typical of its own temperature. Since gamma rays and X-rays are more energetic than the average thermal energy of any normal object (except perhaps the core of a star during a hypernova), gamma rays and X-rays will almost always lose energy and scatter to longer wavelengths when they interact with particles of normal matter, as you suggest.
Over time, this would suggest that the entire universe should get darker and darker, with all gamma rays and X-rays scattering, or being absorbed and re-emitted, as lower-energy photons: visible, then infrared, then eventually microwave, as objects like stars eventually burn out and cool off. But of course there is tremendous nuclear, gravitational, and kinetic energy stored in the matter of the universe: whenever two hydrogen ions fuse in a star, whenever a large object falls into the event horizon of a black hole, or whenever a large atomic nucleus left over from a supernova undergoes radioactive decay, these energetic events release gamma rays and X-rays. Since many stars will keep fusing hydrogen and producing gamma rays, X-rays, and visible light for tens or hundreds of billions of years to come, the universe will not go dark any time soon.
I know gamma rays and X-rays are created too, and possibly matter can be created (atoms created by crossing magnetic fields), but I am asking about radiation that has already been emitted. Is it possible that it always finishes with a longer wavelength?
"Almost always," but not "always" a longer wavelength.
At SLAC (the Stanford Linear Accelerator Center) an experiment in the 1990s directed a high-intensity beam of infrared laser photons into a highly focused electron beam at about 50 GeV (50 billion electron volts of beam energy). When the infrared photons scattered from the 50 GeV electrons, they actually gained energy from the electron beam, becoming higher-energy (shorter-wavelength) gamma-ray photons. So it is indeed possible for radiation to "upscatter" to higher energy and shorter wavelength by hitting a very energetic charged particle. But this is rare in nature for gamma rays and X-rays, since very few particles in nature have MeV or GeV energies. I'm sure it happens a lot in the cores of exploding hypernova stars (the hottest objects in the universe), and it must occasionally happen that a beta-decay electron from a nuclear radioisotope decay hits a low-energy photon and upscatters it to a gamma ray, but for the most part (more than 99.99999%, certainly) gamma rays and X-rays are much more energetic than the particles they are likely to encounter in nature, and so *almost always* an X-ray or gamma ray will scatter to lower energy, not higher.
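For anyone who wants to check the numbers, here is a rough sketch of the maximum energy of an inverse-Compton backscattered photon, using the standard head-on formula E'_max = 4*gamma^2*E / (1 + 4*gamma*E/(m_e*c^2)). The 1.17 eV infrared photon energy is an assumed example value, not necessarily the exact SLAC laser parameter.

M_E_C2_EV = 0.511e6  # electron rest energy, eV

def max_backscattered_ev(electron_ev, photon_ev):
    # Head-on inverse Compton scattering off a relativistic electron.
    gamma = electron_ev / M_E_C2_EV
    x = 4.0 * gamma * photon_ev / M_E_C2_EV
    return 4.0 * gamma ** 2 * photon_ev / (1.0 + x)

print(max_backscattered_ev(50e9, 1.17) / 1e9)  # ~23.6 GeV from a ~1 eV photon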
I think there may be a question of interpretation, or of the meaning of words, here. If a low-energy infrared photon (less than 1 electron-volt of energy) scatters from a high-energy electron (50 GeV, or 50 billion electron-volts), and the resulting scattered photon has an energy greater than 10 MeV, is it still the same photon?
This is not one of those processes where one photon is "absorbed" by an atom, exciting an electron in the atom, and then a different photon is emitted some time later. This is the scattering process where the photon comes in, "bounces" (or reflects) off the electron, and with negligible delay (effectively instantaneously) the photon goes back out at higher energy. So most physicists would say it is the same photon; it has merely gained energy from the electron. Energy is conserved because the electron slows down and loses energy in the collision with the photon. But some physicists would draw the interaction as a Feynman diagram, which makes it look on paper as if one photon goes in and a different photon comes out. Photons are created and destroyed so easily (unlike particles with mass, like electrons) that there is a philosophical question as to whether a photon has an "identity" as the same photon, or whether its identity changes when its energy (wavelength) changes. There is also the question of whether "virtual" photons should be counted when counting the "number" of photons. (Virtual photons are never observed because they never travel anywhere, but they locally affect the energy of particle interactions indirectly: a photon is created near some interaction point and annihilated almost immediately near the same point.)
But to make my answer as philosophically clear as possible, let me say: In the 50 GeV electron beam scattering experiment at SLAC, the scattering data from the experiment showed that "one low energy photon goes in, one higher energy photon comes out." This does not violate conservation of energy as I mentioned, because the electron slowed down in giving up some of its energy to the photon.
Maybe eventually all radiation will be infrared, if black holes swallow all the matter, light, and electromagnetic radiation, and what the black hole "jets" emit is just infrared radiation.
The case of photons gaining energy is seen in the hot coronae around massive black holes (MBHs), in the collimated jets emitted from active galactic nuclei, and in the Sunyaev-Zel'dovich effect, where photons of the microwave background gain energy from the hot gas in galaxy clusters.
Excellent discussion, especially Kenneth's very thorough dissertation.
As Nathan points out, the initial release of propagating photons in the now-dispersed universe is, as I understand it, thought to have been emitted in the infrared spectrum, but it has now been redshifted into the microwave spectrum. I also think the initial propagating light was likely emitted over a broader range of spectra, but the omnidirectional microwave background radiation is the most definitively identifiable representation of that initial propagation of light.
However, as I understand it, this cosmological redshift is attributed to the physical extension of light's wavelength caused by the metric expansion of spacetime. While, as Kenneth explained so well in direct response to your question, intergalactic light that interacts with matter (most commonly galactic gas clouds, but also the intracluster media of galaxy clusters and less easily identifiable intergalactic media) is generally redshifted, the redshift of intergalactic light is most generally attributed to spacetime expansion.
If most intergalactic light does not interact with matter, including any terminal detection or absorption by a sufficiently massive object, it should continue to be redshifted by spacetime expansion as long as spacetime continues to expand. Even ignoring for the moment the evidence that seems to support an accelerating universe, it had previously been expected that spacetime expansion would perpetually diminish, if not halt or reverse. Given some perpetual propagation of light in a perpetually expanding universe, it seems that the wavelength of light would eventually become flat--redshifted well beyond infrared, microwave, and any other frequency spectra...
That's a very good question--one that I can't definitively answer. It does seem that a light wave with 'infinite' wavelength would have nearly no energy, but the speed of light is considered to be constant regardless of wavelength.
As to whether it is still a photon, or has disappeared, I have to wonder: in a universe that was no longer expanding, would time still progress? Would light waves of nearly infinite length and almost no energy be detectable anywhere along their length? I think I can't evaluate!
The case of a photon with "infinitely low" frequency is a limiting case, of little practical relevance here. Yes, as the frequency tends toward zero, the photon energy tends toward zero and the photon tends toward not existing. But currently the cosmic microwave background is peaked at around 200 GHz (200 x 10^9 Hertz), in the microwave or "millimeter wave" band as it is sometimes called. How much further does space have to expand before the average background photon is red-shifted down to our own AM radio band, 200 kilohertz to 1.6 megahertz? The answer is that space would have to expand by another factor of 100,000 to 1,000,000. At currently supposed rates of expansion this is not going to happen in the next 100 billion years (or ever?)... the stars and galaxies as we know them will run out of hydrogen for fusion and go dark long before the cosmic microwave background is redshifted into our AM radio band. So it is a hypothetical situation that does not concern us.
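That factor of 100,000 to 1,000,000 comes out of simple frequency ratios, since cosmological redshift scales frequency as 1/a (the scale factor); a quick sketch:

f_now = 200e9  # Hz, approximate CMB peak today
for f_target in (1.6e6, 200e3):  # AM band edges, Hz
    # frequency scales as 1/a, so the required expansion factor is f_now / f_target
    print(f_target, f_now / f_target)  # prints 125000.0 and 1000000.0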
As to the question of whether you can DETECT a very long-wavelength photon, there is a simple answer: calculate the equivalent temperature of your detector. If your detector is warmer than the equivalent temperature of the photon you want to detect, the signal you are looking for will be swamped in background noise. So you cannot detect a single 200 GHz photon unless you cool your detector to well below 2 Kelvin (which is possible but expensive). You can still detect the "noise" of a large number of 200 GHz photons--this is what Penzias & Wilson did to win the 1978 Nobel prize, by detecting the "excess noise" of the cosmic microwave radiation peaked near 200 GHz--and you can detect a large signal of a coherent wave consisting of trillions of longwave photons all in phase together (obviously, this is what an AM radio does). But detecting a single photon is difficult, especially when you are looking for single photons at long wavelengths with less energy than infrared.
So when you ask, "would light waves of nearly infinite length and almost no energy be detectable?", the answer is: certainly NOT as individual photons. But if you created a very powerful low-frequency coherent electromagnetic wave (consisting of quintillions of low-frequency photons all in phase together), it might be detectable by radio equipment with an antenna of comparable wavelength. If you made a very powerful electromagnetic wave of such low frequency (about 25 hertz) that its wavelength was longer than the Earth, you would need an antenna of comparable length to detect it from far away (an antenna thousands of kilometers long). So very long wavelength photons are of limited practical interest.
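The wavelength figure is easy to check (a one-line sketch; the Earth-diameter comparison is approximate):

C = 2.998e8  # speed of light, m/s
print(C / 25.0)  # ~1.2e7 m, roughly the Earth's diameter (~1.27e7 m)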
PS: Finally, it is worth noting that at CLOSE range (so-called "near field" range) it is relatively easy to detect the presence of quintillions of very low-energy "longwave photons" when they are coherently in phase--whenever you hear the electric hum of a large 60-hertz or 50-hertz transformer, you are hearing acoustic vibrations in the air generated by vibrations in the transformer coils: the interaction of quintillions of longwave photons (10^15 up to 10^25 per second, depending on the electric current, I suppose) as they are emitted and then captured by the neighboring coils of low-frequency, high-power AC (alternating current) electric transformers such as those which power our houses and businesses. But these are not "propagating" photons in the sense of moving through free space; they are more like the virtual photons I mentioned above, which are created and then annihilated within less than the space of one wavelength. Does hearing that humming sound count as "photon detection"? I'll let you decide.
To be more in touch with reality: the vibrations in power transformers are mechanical in nature and are generated mainly in their soft magnetic cores, not in their windings. This is caused by magnetostriction, i.e., a change of dimensions in response to the applied magnetic field. The magnetic field, in turn, is generated by the currents in the coils, that is, by the movement of charge carriers. The picture with photons therefore seems highly artificial in this case.
Thanks for your comment. I agree with your point of view. On the one occasion when I was asked as an engineer to calculate the magnitude of acoustic vibrations of a transformer coil and iron core, I did NOT use any knowledge or mathematics relevant to photons! I simply used the Maxwell Equations of electromagnetism. I brought up this "highly artificial" case because one of our colleagues here inquired about whether very long wavelength, low-energy photons would be "detectable" and the answer depends to some extent on your definition of "photons" and of "detection"-- in the usually assumed definitions, NO, you cannot detect a very low-energy photon without cooling your detector far below the temperature corresponding to that photon; but in the case of large numbers of COHERENT low energy photons, well... that is just a highly artificial ("quantum") way of describing ordinary electromagnetism, which is better described using Maxwell's classical (non-quantum) field equations.
Single photons are indeed tricky. You are absolutely right that detection is only possible when the signal is above the (thermal) noise. But there is another catch: the size of the detector. Naively, the photon should "fit" inside the detector. Additionally, it should somehow interact with the detector material. Naively again, the photon's size should be of the order of its wavelength. But look: in Electron Spin Resonance (ESR) experiments one can easily detect, at room temperature, the absorption of radiation with wavelength ~3 cm (~9 GHz), corresponding to a temperature of ~0.5 K, in samples smaller than 1 mm. Even less energetic photons (~100-300 kHz) are absorbed in Nuclear Magnetic Resonance (NMR) experiments.
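As a quick check of the equivalent-temperature convention T = h*f/k_B used here (a minimal sketch; the constants are standard values):

H = 6.626e-34    # Planck constant, J*s
K_B = 1.381e-23  # Boltzmann constant, J/K

def photon_temperature_k(f_hz):
    # Equivalent temperature T = h*f / k_B of a photon of frequency f
    return H * f_hz / K_B

print(photon_temperature_k(9e9))    # ~0.43 K, the ~0.5 K ESR figure above
print(photon_temperature_k(200e9))  # ~9.6 K by this convention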
On the other hand, quite energetic X-ray radiation produces very useful diffraction patterns, perfectly described as electromagnetic waves ...
It depends upon the energy of the photon (radiation) and the nature of the particle (medium) it is colliding with. High-energy (UV, optical, X-ray) radiation colliding with cold matter whose excitation/ionization energy is less than the photon energy will get absorbed, and photons with lower energy will be emitted. Think of the central region of our Galaxy, with its abundant dust and hydrogen: it absorbs shorter-wavelength radiation and emits in the IR/sub-mm. On the other hand, our atmosphere absorbs UV, X-ray, and gamma-ray radiation, as the medium in the upper atmosphere has atoms/molecules which can absorb higher-energy photons. In inverse Compton scattering, low-energy (IR, optical, X-ray) photons are upscattered to high energies by relativistic electrons. These processes keep going on, and the emission and absorption continue. There is nothing like all photons getting absorbed and leaving the universe completely dark--not in the near future.
All bodies at a temperature above absolute zero (-273°C) radiate, and bodies at ordinary temperatures radiate for the most part in the infrared (IR) region (from 780 nm to 1 mm) of the electromagnetic (EM) spectrum. Approximately half of the electromagnetic energy from the Sun is infrared.
Nice question, and really clever answers. I also agree that the final energy depends on the temperature (atomic density and composition) of the medium. Just to give you another perspective, my field involves studying meteors (hot-gas, 5,000-10,000 K "near-plasma" physics) in the expanding meteor columns of the upper atmosphere (mesosphere-lower thermosphere). We try to infer the bulk chemistry of the ablated particles from emission spectroscopy. A significant part of the energy comes from ionized atoms and molecules re-emitting on short time scales, while the rest propagates in the upper atmosphere. There are sinks of energy in the photons heating tiny dust and recondensation smoke. Then, in the remaining dust trails, some IR emission remains, but most has already been emitted or scattered. To get some idea of what we are doing in meteor spectroscopy, I invite you to check our paper: http://www.spmn.uji.es/ESP/articulo/spectr03.pdf
Quantum theory and black-body heat transfer show that radiation can lose energy through collisions with other particles... it is still energy; energy in one form transforms into another and can be neither created nor destroyed.
What you are thinking of is called the Compton effect. But there also exists the inverse Compton effect: sometimes a photon actually gains energy after a collision with (charged) matter. Essentially the same phenomenon is known in condensed-matter physics as the electron-phonon interaction. The probabilities of the direct and inverse effects are governed by temperature (thus it doesn't make much sense to speak about a single photon in this context). Yet the very existence of the inverse effect makes it certain that the final temperature of our universe will never equal zero, as energy is always conserved. On the other hand, we cannot achieve true zero energy even in a small, isolated system, because of the zero-point energy/vibrations of a harmonic oscillator. Perhaps neutron stars are close to this regime?
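For reference, the direct Compton effect shifts the photon's wavelength by delta_lambda = (h/(m_e*c))*(1 - cos theta), which is never negative, so the scattered photon always loses energy in the electron's rest frame; a minimal sketch:

import math

LAMBDA_C = 2.426e-12  # Compton wavelength h/(m_e*c), meters

def compton_shift_m(theta_rad):
    # Wavelength increase of the scattered photon; zero only at theta = 0
    return LAMBDA_C * (1.0 - math.cos(theta_rad))

print(compton_shift_m(math.pi))  # maximum shift, ~4.85e-12 m (180-degree backscatter)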
Volodymyr, let's get in touch with reality. Photons never stop, unless you consider a photon's absorption (thus its disappearance) as "stopping". Photons in glass are absorbed and then re-emitted (not necessarily in the original direction) and are also scattered by phonons. Those two mechanisms are completely different, but both conspire to make the imaginary part of the refractive index nonzero. If your view were correct, then your internet connection, most likely based on optical fibers a few kilometers long, would never work. The light at the other end of such a connection has lower intensity but the same "color" as originally.
Your question depends on what "infinitely long" means. How and when we introduce infinity changes the results of a calculation. The result is a paradox if infinity is not properly introduced. A popular concept of entropy is that the entropy of a system increases; a closed, insulated system has constant entropy. When is it proper to introduce infinity in a closed system? Is the universe a closed system?
Suppose we begin with the most intense gamma-ray burst attainable. The gamma rays will lose energy with each interaction with matter or other photons (except for the exceptions noted in other answers). Nevertheless, if this gamma-ray burst occurred at the beginning of the universe, at the current age of the universe we still have a probability that some of the initial gammas have not interacted. Is this a case of improper use of infinity, even if the probability is ridiculously small?
All the photons from the initial burst will have lost energy until some final gasp, when that energy is absorbed in some process. We can say that the energy of the photons goes asymptotically to zero. So the answer to your question is that the photons degrade until they are quietly absorbed. That is the effect of carrying the equations to infinity.
If the universe is a closed system and the entropy does not increase, do we require continuous bursts of intense gamma rays? Will the universe finally degrade to the average, with a small distribution in energy? I do not believe there is an equation that will answer these questions, but there is an equation that says a burst of gamma rays will asymptotically degrade to zero.
Is it possible to know whether the Compton effect is more common than the inverse Compton effect?
Some time ago I asked: if the Universe is expanding and everything is moving apart, is the density (of matter and energy) of the universe decreasing? I was told that it is not known--that is, it cannot be determined--which is not very satisfying.
But in either case, closed universe or not, can it be known whether radiation has a greater tendency to lose energy than to gain it?