The definition of entropy is available in many textbooks, where it is described as a function of a quantity of heat that shows the possibility of converting that heat into work. How can this definition be expressed and discussed in a simpler way?
Historically, the entropy concept was developed by Clausius in the middle of the nineteenth century. Clausius recognized that in a perfect heat engine, one without friction and without heat loss by thermal conduction, the ratio of the heat that enters the engine at high temperature (Qh) to that high temperature (Th) equals the ratio of the heat rejected at low temperature (Ql) to that low temperature (Tl). In other words: Qh/Th = Ql/Tl. In a perfect heat engine something is therefore conserved, and he called this entity the entropy (from the Greek 'tropē', transformation), symbol S.
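As a concrete illustration (my numbers, not part of the historical account): for an ideal engine operating between Th = 600 K and Tl = 300 K with Qh = 1000 J,

```latex
Q_l = Q_h \frac{T_l}{T_h} = 1000\,\mathrm{J} \times \frac{300}{600} = 500\,\mathrm{J},
\qquad
W = Q_h - Q_l = 500\,\mathrm{J},
\qquad
\eta = 1 - \frac{T_l}{T_h} = 50\%
```

so exactly Qh/Th = Ql/Tl = 1.67 J/K enters and leaves the engine: this is the conserved entity S.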
As we are living in a non-perfect world (just watch the news!) with non-perfect heat engines, some work is always converted into heat in a heat engine, and some heat is always conducted from high to low temperature. In both cases the entropy increases. This observation was generalized into a general principle, the Second Law of Thermodynamics.
This Second Law is consistent with the observation that no perpetuum mobile has ever been constructed. Engineers have lived happily ever after, but some theoretical physicists have found it necessary to try to prove the Second Law, first from classical mechanics, and later from quantum mechanics. Since, according to Feynman, no one understands quantum mechanics, such a proof, if found, could hardly be convincing.
During these attempts to prove the Second Law, many theoretical physicists seem to have lost sight of the connection of entropy to heat engines. They may even consider it something distasteful. I, on the other hand, consider their superior attitude towards an entity derived from experience, from the real world, from practical application, to be boring and distasteful. This superior attitude has resulted in a huge waste of effort.
As a result, in science, entropy has turned into a mystical entity, of the kind met in religion. Such mystical concepts are taboo in daily life; they have to do with power and politics rather than with science. So don't talk about entropy. It is bad for your career. As my late mother used to say: 'The truth cannot be told.'
If you want to understand more, you may want to read about probability theory or statistical mechanics. Broadly speaking, entropy is a measure of the disorder in a system.
In thermodynamics, energy can be defined as the ability to do work. Entropy, in essence, is associated with the unavailability of energy to do work, owing to the fact that not all heat can be converted into work.
Unlike energy, entropy is not conserved but increases for any system undergoing an irreversible process. When entropy increases, a certain amount of energy becomes permanently unavailable to do work. The energy is not lost, but its character is changed by irreversibilities, so that some of it can never again be converted into work.
Entropy is often defined as a monotonically increasing measure of disorder, but disorder is a somewhat vague term. A sharper definition is probably that entropy is a monotonically increasing measure of delocalization. The delocalization is in phase space (position plus momentum space). When a gas expands to occupy a larger volume, the delocalization is in the position part of phase space. When work is dissipated to heat via friction, the delocalization is in the momentum part of phase space: the temperature rises, so the width at half maximum of the Maxwellian velocity/momentum distribution increases. When iron rusts, atmospheric oxygen is localized in the rust, in the position part of phase space, but rusting is sufficiently exothermic that the accompanying delocalization in the momentum part of phase space more than compensates. When water evaporates into an unsaturated atmosphere (less than 100% relative humidity), the evaporative cooling results in localization in momentum space, but this is more than compensated by the delocalization of water molecules in position space as they change from liquid to gas. The Second Law of Thermodynamics can then be stated as: net localization is impossible. Localization is possible, but it must always be paid for by greater delocalization elsewhere. Localization = negentropy = fuel.
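A minimal worked example of positional delocalization (my illustration, with standard textbook numbers): for n moles of an ideal gas expanding isothermally from volume V1 to V2,

```latex
\Delta S = nR \ln\frac{V_2}{V_1}
% doubling the volume of one mole:
% \Delta S = (1\,\mathrm{mol})(8.314\,\mathrm{J\,mol^{-1}\,K^{-1}}) \ln 2 \approx 5.76\,\mathrm{J/K}
```

The energy of the gas is unchanged; the entropy increase reflects purely the greater delocalization in position space.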
Sarah Ali, decreasing the temperature makes molecules more ordered, and therefore the entropy drops. Recall that the Third Law states that "the absolute entropy is zero for all perfect crystalline substances at absolute zero temperature (0 K)".
Note that in this case the increase of entropy for the surroundings is higher than the decrease of entropy for the water. Therefore, the overall entropy (system plus surroundings) will increase.
On explaining entropy by disorder: it is an example of an explanation 'obscurum per obscurius, ignotum per ignotius' (explaining the obscure by the more obscure, the unknown by the more unknown).
See also the publications of Jaynes, Lambert, and the Denbighs, and the Journal of Chemical Education.
I find older physical chemistry textbooks (before, say, 1950) the clearest. More modern books have condensed the arguments and the examples so much that the basic idea has become unrecognizable. Engineering textbooks tend to be too simple.
The journal Entropy shows that the concept of entropy has evolved extensively in physics, mathematics and other disciplines.
But there is more than one kind of entropy, so the approach and the system definition we need to use depend on the particular kind you are talking about.
For example, there is Thermodynamic Entropy, Quantum Entropy, Statistical Entropy, Black Hole Entropy, Gibbs Entropy, etc.
So different entropies apply to different systems and have different definitions, and not all of them are compatible with all approaches and at all scales.
It ought to be true that there is a more fundamental definition of entropy from which all the others emerge. But the interpretation of this fundamental entropy is still an open field of research.
Some point to Shannon's Entropy (the entropy of information theory), while others think that the so-called Gibbs Entropy may be more fundamental; in any case, it is still an open debate.
The physical origin of the entropy concept is very well explained by Anthonie Wilhelmus Muller. The story has a continuation: Ludwig Boltzmann built a theory of entropy on the statistical physics of gases.
Boltzmann considered all possible microstates of a physical system and the probability distributions of their physical values (energies, velocities, etc.). From these distributions the entropy can be evaluated easily, without knowing every detail of the system, which gives the equation

S = −k Σᵢ pᵢ ln(pᵢ)

where k is Boltzmann's constant and pᵢ is the probability of microstate i. Its special case for W equally probable microstates, S = k log W, is even engraved on Boltzmann's gravestone. Using this formula we are able to make very useful predictions about statistical systems in physics.
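A minimal sketch of this computation (my own illustration; the helper name and the example distributions are made up):

```python
import math

def gibbs_entropy(probs, k=1.0):
    """Return S = -k * sum(p * ln p) for a discrete probability distribution.

    Terms with p == 0 contribute nothing (the limit p*ln p -> 0 as p -> 0).
    With k = 1 this is the dimensionless, natural-log (Shannon-style) entropy;
    use k = 1.380649e-23 J/K for the thermodynamic value.
    """
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over W = 4 microstates recovers S = k ln W:
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))  # ln 4 ~ 1.386
# Any non-uniform distribution over the same states has lower entropy:
print(gibbs_entropy([0.7, 0.1, 0.1, 0.1]))      # ~ 0.94
```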
Later, the very same concept was used in the encryption and decryption of coded messages during the Second World War. This led to the term information entropy, which is used to estimate the strength of an encoding.
Recently, the concept of entropy has also started to be used in the description of biosignals that are otherwise mathematically impenetrable, because we do not understand the complex systems, bodies, that generate them.
It is a very fascinating field of research that is paving the path for the future development of predictive, personalized medicine. Complex-systems approaches are increasingly and successfully used in the modeling of biological systems, including in medicine.
We are still at the very beginning, but even now some results are exceeding all expectations and are much better than anything that mathematics was capable of providing before. :-)
It is already possible to assess and sometimes even predict:
Epileptic seizures
Anesthesia depth
Gait disruption
Arrhythmias (my research field)
Wakefulness
and a lot more, using concepts of entropy, often combined with AI and machine learning techniques; a sketch of one such entropy measure follows below.
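Here is a minimal sketch of sample entropy, one of the regularity measures commonly used on biosignals (a simplified variant for illustration only; the function name, defaults, and test signals are my own, not from the original post):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D signal (simplified variant).

    Counts pairs of length-m templates matching within tolerance r
    (Chebyshev distance, self-matches excluded), does the same for
    length m+1, and returns -ln(A/B). Higher values = less regularity.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()          # a common default tolerance

    def count_matches(mm):
        # All overlapping templates of length mm
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

# A regular signal gives low SampEn; a noisy signal gives higher SampEn:
t = np.arange(1000)
print(sample_entropy(np.sin(0.1 * t)))                              # small
print(sample_entropy(np.random.default_rng(0).normal(size=1000)))   # larger
```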
If thermodynamics itself cannot explain what entropy is, any other explanation will be 'obscurum per obscurius, ignotum per ignotius', as Anthonie mentioned: 'an explanation that is less clear than what it tries to explain'.
On this view, thermodynamics does not seem to be a perfect theory in the usual sense.
Why do you say that thermodynamics cannot explain entropy?
I believe that thermodynamics can explain the thermodynamic entropy well (not the other entropies). Since the concept of thermodynamic entropy was conceived within the framework of thermodynamics, it can be explained with its own definition: dS = δQ/T. Analyses such as energy balances, entropy balances and exergy balances on systems can help us to understand the thermodynamic entropy of a system well. Also, the Second Law and the definition of entropy help to explain each other.
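A simple worked illustration of the definition (my numbers, not the poster's): heating 1 kg of water reversibly from 300 K to 350 K at constant pressure, with c_p ≈ 4186 J/(kg·K),

```latex
\Delta S = \int_{T_1}^{T_2} \frac{\delta Q}{T}
         = \int_{T_1}^{T_2} \frac{m\,c_p\,dT}{T}
         = m\,c_p \ln\frac{T_2}{T_1}
% = (1\,\mathrm{kg})(4186\,\mathrm{J\,kg^{-1}\,K^{-1}})\ln(350/300) \approx 645\,\mathrm{J/K}
```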
Speaking of other kinds of entropy (Shannon entropy, statistical entropy, etc.), I agree that classical thermodynamics is not enough.
But nowadays there are statistical thermodynamics and quantum statistical mechanics; you can now associate a specific entropy even with a single particle, or with a system that is or is not in equilibrium with a thermal bath.
1) Classical thermodynamics itself cannot explain the physical meaning of entropy; this is a fact, and you cannot find an answer in any textbook.
2) In classical thermodynamics, δQ/T is comprehensible: “Q is known as an extensive quantity, and can be transferred only from higher temperature to lower temperature; this property shows that δQ relates to the intensity distribution at temperature T. For the same δQ, if its distribution in temperature is different, the intensive property of δQ will also be different. Thus, we can use the function δQ/T to measure δQ and its distribution property in temperature, and then δQ/T may be named the entropy of the heat δQ.” [1], pp. 3-4.
3) However, 2) is not an explanation of the system state, in that Q denotes heat exchange and is not a state function; δQ/T is only the entropy of δQ.
During the process of deriving the so-called entropy, ΔQ/T in fact cannot be turned into dQ/T. That is, the so-called 'entropy' does not exist at all.
The so-called entropy is a concept that was derived by mistake in history.
It is well known that calculus has a definition, and any theory should follow the same principles of calculus; thermodynamics, of course, is no exception, for there is no other calculus at all. This is common sense.
Based on the definition of calculus, we know that for the definite integral ∫_T f(T) dQ, only when Q = F(T) is ∫_T f(T) dQ = ∫_T f(T) dF(T) meaningful.
As long as Q is not a single-valued function of T, namely Q = F(T, X, …), then ∫_T f(T) dQ = ∫_T f(T) dF(T, X, …) is meaningless.
1) Now, on the one hand, we all know that Q is not a single-valued function of T; this alone is enough to determine that the definite integral ∫_T f(T) dQ = ∫_T (1/T) dQ is meaningless.
2) On the other hand, in fact Q = f(P, V, T), so ∫_T (1/T) dQ = ∫_T (1/T) df(T, V, P) = ∫_T dF(T, V, P) is certainly meaningless (the subscript T denotes integration over temperature).
We know that dQ/T is used in the definite integral ∫_T (1/T) dQ; since ∫_T (1/T) dQ is meaningless, ΔQ/T cannot be turned into dQ/T at all.
That is, the so-called 'entropy' does not exist at all.
1. Logic of the Second Law of Thermodynamics: subjectivism, logical jumps, interdisciplinary argumentation.
2. The new thermodynamics pursues universality, with two theoretical cornerstones:
2.1 The Boltzmann formula: ρ = A·exp(−Mgh/RT). Isotope centrifugal-separation experiments show that it is suitable for gases and liquids.
2.2 Hydrostatic equilibrium: applicable to gases and liquids.
3. The second and third acoustic virial coefficients of R143a derived from the new thermodynamics are in agreement with the experimental results.
3.1 The derived third acoustic virial coefficient is in agreement with the experimental data, which shows that the theory remains correct even at the critical density.
Although the question is very common, the answer is not. I can define entropy from a statistical point of view: entropy is nothing but a measure of the thermodynamic probability of a particular system at a particular temperature.
In existing thermodynamics, as I. Prigogine indicated, "entropy is a very strange concept without hoping to achieve a complete description". Now we need to find where the issues are.
From Clausius's approach, we can only know that there is a state function named entropy, symbolized by S; in a reversible heat exchange process we have dS = δQ/T, in an irreversible process we have dS > δQ/T, and for an isolated system dS ≥ 0. But Clausius did not give an explicit definition of the function S.
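For reference, these relations in their standard compact form (the usual textbook statement, added here for clarity):

```latex
\oint \frac{\delta Q}{T} \le 0
\quad\text{(Clausius inequality; equality holds for reversible cycles)}
```

from which it follows that dS ≥ δQ/T for any process, with dS ≥ 0 for an isolated system (where δQ = 0).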
It is readily apparent that dS = δQ/T is applicable only to a reversible heat exchange process (it is only the entropy of δQ); this equation is inapplicable to an irreversible process, and inapplicable to an isolated system, because in such cases dS ≠ δQ/T. Then, in the general case, what is dS equal to? And what is the definition of entropy?
There is no explicit definition that indicates the physical content of the entropy S; thus, in existing thermodynamics, the physical meaning of the entropy is unexplainable, and one can only guess.
Entropy is a function of heat which shows the possibility of converting that heat into work. Entropy is a property of a system. Absolute entropy cannot be calculated, but the change of entropy can be evaluated.
The difference between the entropy of two states is a power function of the entropy between the two states; therefore, the entropy difference between any two states is the energy power.
Absolute entropy CAN be calculated: S = k ln Ω, where Ω is the number of accessible microstates. In principle Ω can always be COUNTED. For a perfect crystal in a nondegenerate ground state at absolute zero (0 K), Ω = 1, so the absolute entropy is S = 0. If the ground state is n-fold degenerate, then the absolute entropy at 0 K is S = k ln n. At any higher temperature T, the absolute entropy exceeds that at 0 K by the integral of C/T dT from 0 K to the given temperature, where C is the heat capacity under the given conditions (e.g., constant pressure, constant volume, etc.). If there are phase transitions, then at each phase transition an extra amount of entropy, equal to the latent heat of the transition divided by the temperature at which it occurs, is added to the absolute entropy.
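Written out as a single formula (the standard textbook form summarizing the paragraph above):

```latex
S(T) = S(0) + \int_{0}^{T} \frac{C(T')}{T'}\, dT'
       + \sum_{\text{transitions}} \frac{\Delta H_{\text{trs}}}{T_{\text{trs}}},
\qquad S(0) = k \ln n
% (n = 1, hence S(0) = 0, for a perfect crystal in a nondegenerate ground state)
```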
Entropy is a property, and the change of entropy is the ratio of heat to temperature. One cannot evaluate the absolute entropy at a particular state, but the change of entropy can be evaluated. It can also be evaluated from a temperature-entropy diagram.
Entropy change is the ratio of the change in heat to temperature. The temperature-entropy diagram gives the heat supplied to or rejected from the system. Entropy at a particular point, that is, the absolute entropy, cannot be determined, but the change of entropy can be evaluated.
Entropy shows the possibility of conversion of heat into work. The change in entropy is given as the change in heat divided by temperature. It is a property of a system: it depends only on the end states and is independent of the path travelled.
It seems that to many the expression Q/T, as a measure for the loss of free energy when a heat Q is generated by friction in an environment at temperature T, is mysterious. But the free energy loss due to friction must depend on the temperature: the heat Q can still be used as the input of a heat engine when it is generated in a reservoir at high temperature (say the boiler of a steam engine, the core of the Sun, or even the interior of a hydrogen bomb!). Heat generation alone is not sufficient to describe the free energy loss: the temperature of the reservoir is important as well. Hence the loss is proportional to Q/f(T). But what is f(T)? Suppose that f(T) is unequal to T. In that case, by suitable choices of Q and T, and letting Q enter a heat engine at temperature T*, contradictions will, I guess, be derivable. Only by formulating entropy as Q/T can these contradictions be avoided, and this legitimizes the existence of entropy as Q/T: a notion that permits the treatment of dissipative processes in nature in a consistent way.
Note that this argument is made using concepts from the macroscopic world. Note also that no notions from information theory or from quantum mechanics are used.
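A standard quantitative complement to this argument (textbook exergy reasoning, not part of the original post): heat Q available in a reservoir at temperature T, with surroundings at T0, can at best yield the Carnot work; the remainder is the minimum unavoidable loss, proportional to the entropy Q/T:

```latex
W_{\max} = Q\left(1 - \frac{T_0}{T}\right),
\qquad
W_{\text{lost}} = Q - W_{\max} = T_0 \,\frac{Q}{T} = T_0\, \Delta S
% the hotter the reservoir (larger T), the smaller the entropy Q/T
% and the smaller the loss, exactly as the argument above requires.
```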