
In the introduction to his text A Student's Guide to Entropy, Don Lemons quotes the quip "No one really knows what entropy is, so in a debate you will always have the advantage" and writes that entropy quantifies "the irreversibility of a thermodynamic process." Bimalendu Roy, in Fundamentals of Classical and Statistical Thermodynamics (2002), writes "The concept of entropy is, so to say, abstract and rather philosophical" (p. 29). In Feynman's lectures (ch. 44-6): "Actually, S is the letter usually used for entropy, and it is numerically equal to the heat which we have called Q_S delivered to a 1°-reservoir (entropy is not itself a heat, it is heat divided by a temperature, hence it is measured in joules per degree)."

In thermodynamics there is the Clausius definition, the ratio of a quantity of heat Q to the absolute temperature T at which it is transferred, Q/T, and the Boltzmann approach, S = k log W, where W counts the microstates. Shannon defined information entropy by analogy; taking 2 as the base of the logarithm gives information content in bits.

Eddington, in The Nature of the Physical World (p. 80), wrote: "So far as physics is concerned time's arrow is a property of entropy alone." Thomas Gold, physicist and cosmologist, suggested that entropy manifests in, or relates to, the expansion of the universe. There are reasons to suspect that entropy and the concept of degrees of freedom are closely related. How best can we understand entropy?
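Since each of these definitions is a simple formula, a short numeric sketch may help make them concrete. The Python snippet below is my own illustration, not drawn from any of the cited texts, and the example values are arbitrary; it computes a Clausius entropy change Q/T, a Boltzmann entropy k ln W, and a Shannon entropy in bits:

```python
import math

# Boltzmann constant in J/K (CODATA value).
K_B = 1.380649e-23

def clausius_entropy(q_joules: float, t_kelvin: float) -> float:
    """Clausius: entropy change = heat transferred / absolute temperature, in J/K."""
    return q_joules / t_kelvin

def boltzmann_entropy(microstates: int) -> float:
    """Boltzmann: S = k ln W, where W counts equally likely microstates, in J/K."""
    return K_B * math.log(microstates)

def shannon_entropy_bits(probabilities: list[float]) -> float:
    """Shannon: H = -sum(p_i * log2 p_i), in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Reversibly adding 300 J of heat at 300 K raises entropy by 1 J/K.
print(clausius_entropy(300.0, 300.0))    # 1.0

# A system with 2**10 equally likely microstates.
print(boltzmann_entropy(2**10))          # ~9.57e-23 J/K

# A fair coin: two equally likely outcomes carry exactly 1 bit.
print(shannon_entropy_bits([0.5, 0.5]))  # 1.0
```

The fair-coin case shows why base 2 yields bits: two equally likely outcomes carry exactly one bit of information.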
