I am looking for the general definition, one that does not depend upon whether or not the system is a quantum one. However, please provide context and explanation.
Information expressed in bits is dimensionless, but entropy has Boltzmann's constant associated with it; that is why thermodynamic entropy and Shannon entropy are dimensionally related to each other by the constant kB, the latter (Shannon entropy) being dimensionless.
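To make the dimensional bookkeeping explicit, here is the relation as I understand it (a sketch in my own notation, with H the Shannon entropy in bits and W the number of equally likely microstates):

```latex
% k_B carries all of the physical dimensions; the base of the logarithm only changes the unit of H.
\[
  S_{\text{thermo}} = k_B \ln W = (k_B \ln 2)\,\log_2 W = (k_B \ln 2)\,H_{\text{bits}},
  \qquad k_B \ln 2 \approx 9.57\times 10^{-24}\ \mathrm{J/K}\ \text{per bit}.
\]
```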
Unlike thermodynamic entropy, which dimensionally is energy divided by temperature in kelvin, action is energy multiplied by time.
Thank you for your answers, and please excuse the delay in getting back to you. I do not have internet at home, am pre-computer generation, and have, I think, accidentally erased my reply three times.
I am also a mathematical logician and a specialist in set-theoretic, axiomatic probability theory. I know that information, as well as entropy, as it occurs in pure mathematics is dimensionless, as is the probability in terms of which it is defined, when the events that possess probability, information, and entropy are pure sets. However, when the events are physical and a frequency interpretation of probability is being employed, the probability, the information given by an event happening, and the entropy are alike physical, and Landauer's limit says the amount of energy it takes to erase one bit of information is greater than or equal to Boltzmann's constant times the temperature in kelvin times the natural logarithm of 2. It is possible to replace the natural logarithm of two with the logarithm to the base two of two, so long as we change how we measure temperature by the appropriate factor, about 0.7. According to an article published in Nature, experimentalists have now used Landauer's theorem to generate energy from information. However, as I understand matters, there is some question whether they have actually shown energy to be a form of information, which quantum physicists now agree is a constant. Let me send this much of my question before this spontaneously disappears on me again.
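For concreteness, the Landauer bound can be worked out numerically; here is a minimal sketch, assuming an illustrative room temperature of 300 K (the temperature value is my own choice, not from the Nature article):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact under the 2019 SI definition)
T = 300.0           # assumed room temperature in kelvin, for illustration only

# Landauer's limit: minimum energy dissipated to erase one bit of information
E_min = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {E_min:.3e} J per bit")  # about 2.87e-21 J

# Replacing ln 2 by log2(2) = 1 amounts to rescaling the temperature by ln 2,
# which is the factor of "about 0.7" mentioned above.
T_rescaled = T * math.log(2)
print(f"Rescaled temperature: {T_rescaled:.1f} (= 0.693... * T)")
```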
You’re asking an interesting question. The problem with the physical dimensions of entropy is that they are the result of several twists in the history of temperature measurement. Because of the historical development of thermometry, temperature was measured in artificial units called “degrees,” for example by dividing the interval between the freezing and steam points of water into 100 “degrees.” Measuring temperature in degrees made it necessary to artificially invent Boltzmann’s constant k as a conversion factor to convert temperature “degrees” into energy. In turn, temperature measured in “degrees” also forced traditional physical entropy S to artificially take on dimensions of “energy per degree.”
Actually, temperature T should be measured in the physical dimensions of energy, as kT. In statistical mechanics, everywhere temperature T occurs, it occurs combined with k in the form kT. Examples are the Boltzmann factor exp(-E/kT) and the often-used inverse temperature beta = 1/kT. If we measure temperature in units of energy kT, then physical entropy automatically becomes dimensionless. For example, in the first law of thermodynamics dU = TdS - pdV + ..., the term TdS becomes kT d(S/k), where the new temperature kT = T’ has dimensions of energy and the new entropy S/k = S’ is dimensionless.
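Written out in symbols, the substitution looks like this (just a restatement of the point above):

```latex
% First law in traditional units, then with temperature measured in units of energy:
\[
  dU = T\,dS - p\,dV + \dots
     = (kT)\,d\!\left(\frac{S}{k}\right) - p\,dV + \dots
     = T'\,dS' - p\,dV + \dots,
\]
% where T' = kT has dimensions of energy and S' = S/k is dimensionless.
```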
So if we undo the historical accidents in measuring the physical dimensions of temperature, physical entropy becomes dimensionless. Dimensionless physical entropy then is identified with dimensionless information entropy, and statistical mechanics and thermodynamics can be derived from Shannon information, for example as carried out in Jaynes’ maximum entropy formalism.
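As a pointer to how the Jaynes derivation goes, here is the standard maximum-entropy sketch (a generic outline, not specific to any particular paper): maximizing the Shannon entropy subject to normalization and a fixed mean energy yields the Boltzmann distribution.

```latex
% Maximize H = -\sum_i p_i \ln p_i subject to \sum_i p_i = 1 and \sum_i p_i E_i = U.
% Lagrange multipliers give the Boltzmann distribution, and S = k H at the maximum:
\[
  p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}, \qquad \beta = \frac{1}{kT}.
\]
```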
As for the frequency interpretation of probability in physics, I think it’s fair to say that the last several decades have seen many physicists moving toward the Bayesian view.
In a closed, energy-conserving system (whether in the physical world or in stochastic models), processes such as diffusion tend to increase the entropy of the system over time. Temperature as a macro statistic is conserved, but this only tells you about the steady-state "information potential" and not the dynamics of information at a micro level. Useful "information", in the sense of re-organisation or creation of structure, is not bound to "use energy", since reversible computations are not subject to the lower limit on the energy required to create or destroy a bit. Continuous distributions, standalone, have no absolute measure of entropy: information entropy there is only a relative concept. Quantum-level interactions complicate matters further, as information density depends on whether the state has collapsed. So I think the statement on temperature/energy/information-entropy is a tautology given a steady-state probabilistic kinetic model of matter, and breaks down or is hard to extend into other situations.
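To illustrate the point that continuous distributions have no absolute entropy, here is a small example of my own (the differential entropy of a Gaussian shifts by the logarithm of any unit-conversion factor, so only differences are meaningful):

```python
import math

def gaussian_differential_entropy_bits(sigma):
    """Differential entropy, in bits, of a normal distribution with standard deviation sigma."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma**2)

# The same physical spread expressed in metres and then in millimetres:
h_metres = gaussian_differential_entropy_bits(sigma=0.001)  # sigma = 1 mm, expressed in metres
h_millis = gaussian_differential_entropy_bits(sigma=1.0)    # the same sigma, expressed in millimetres

print(h_metres)              # about -7.92 bits (negative!)
print(h_millis)              # about  2.05 bits
print(h_millis - h_metres)   # exactly log2(1000), about 9.97 bits
```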
Please excuse my earlier misspelling of your last name. My mother died of that disease, and I am afraid I may be showing symptoms of it, so it is on my mind. No insult was intended. I save my insults for such as Postmodernist philosophers, e.g., Feyerabend and his followers, who claim to have disproved the scientific method, mathematics, and logic itself. (Feyerabend says we should be like a worm on a log and never experiment or theorize about anything, because anything anyone could ever say is, according to him, necessarily false; I, on the other hand, am a philosopher who maintains that mathematical logic is a legitimate branch of mathematics and the rest of philosophy is pseudoscience, which tends not to endear me to philosophy departments.)
Regards, Dan Whiting.
Jack,
Your explanation is very clear, so much clearer than what is in the draft of what is intended to be my magnum opus, that I would like to quote you there, if that is all right with you. I knew the formula that the derivative of physical entropy equals the derivative of thermal energy divided by temperature (dS = dQ/T), but was having difficulty fully understanding how that is reconciled with set-theoretic entropy. I am still not positive that entropy as it appears in Mittag and Evans' fluctuation theorem has gotten rid of the historical baggage by using Shannon entropy rather than (thermal energy + a constant)/temperature. I can try rereading the papers on the fluctuation theorem, but perhaps you or some other reader of this can enlighten me.
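As far as I now follow it, the reconciliation runs roughly as follows (a standard canonical-ensemble sketch, with the energy levels E_i held fixed; please correct me if I have this wrong):

```latex
% Canonical ensemble with fixed energy levels E_i; heat flow changes only the probabilities p_i.
\[
  p_i = \frac{e^{-E_i/kT}}{Z}, \qquad
  S = -k\sum_i p_i \ln p_i = \frac{U + kT\ln Z}{T}, \qquad
  dS = \frac{\sum_i E_i\,dp_i}{T} = \frac{\delta Q}{T}.
\]
```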
Regards,
Dan Whiting
James Prichard:
Thank you for your answer. I get more out of it each time I read it. Since I am concerned with entropy (scrambled information) as it is created according to the fluctuation theorem, which deals with non-reversible computations, it seems that energy must be used.
Regards,
Dan Whiting
"... particle physicists have adopted a system of units [such that] the values of the velocity of light and of Planck's constant divided by 2 pi ... and their products are all dimensionless and equal to unity [Elementary Particle Physics, 1987, Kenyon, I. E., London and New York: Routledge & Kegan Paul]" So action, energy multiplied by time, is dimensionless. Can it be identified with the Shannon information of a physical event happening as John L. Haller [[email protected]] suggests in "Information Mechanics"? I have considerably more explanation of my question, but will attempt to send now.
If you want to do a quote, I think it's fine in this case. About your mentioning the "fluctuation theorem," I'm assuming you mean the fluctuation theorem work of Evans and Searles (for example, as discussed in Adv. Phys. 51 (7), 1529–1585, 2002). They do use the traditional (i.e., historical) units for temperature and entropy, as does most of the work on fluctuation-dissipation theorems over the past century. However, the results are the same whether one uses the traditional physical units for temperature and entropy (T, S) or the modified, constant-rescaled temperature and dimensionless entropy (kT, S/k). Therefore the results will not be affected by the consistent use of either the historical units or the modified units.
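For reference, the fluctuation theorem itself is naturally stated for a dimensionless entropy production, which is one way to see why the choice of units drops out (a schematic paraphrase of the Evans-Searles form, not a quotation):

```latex
% \Sigma_t is the dimensionless entropy production (dissipation) integrated over a
% trajectory of duration t; in traditional units it is \Delta S_t / k_B.
\[
  \frac{P(\Sigma_t = A)}{P(\Sigma_t = -A)} = e^{A}.
\]
```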
This entry explains the large-system metric as compared to finite systems, and the relation between temperature/energy, physical entropy, and a pure information measure: http://en.m.wikipedia.org/wiki/Ruppeiner_geometry