Hello dear colleagues,
It seems to me this could be an interesting thread for discussion.
I would like to center the discussion on the concept of entropy, specifically on the explanation-description-exemplification side of the concept.
That is: what do you think is a good, helpful explanation of the concept of entropy (at a technical level, of course)?
In other words, a way (or ways) of explaining it that settles the concept as clearly as possible: perhaps first in a general setting, and then, if required, in a more specific one.
Kind regards!
First of all, we need to define thermodynamic entropy in an explicit way: we need an explicit function. However, δQ/T is only the entropy of δQ, not a definition of the system state, and in the fundamental equation of thermodynamics the entropy is not an explicit function. Thus thermodynamics itself cannot explain the physical meaning of the entropy, and the other explanations are all 'obscurum per obscurius, ignotum per ignotius'.
So classical thermodynamics cannot explain thermodynamic entropy (δQ/T)?
What about stochastic thermodynamics? Or quantum thermodynamics?
The so-called "entropy" does not exist at all.
In the process of deriving the so-called entropy, ΔQ/T cannot, in fact, be turned into dQ/T. That is, the so-called "entropy" does not exist at all.
The so-called entropy is a concept that was derived by mistake in history.
It is well known that calculus has a definition,
and any theory should follow the same principles of calculus; thermodynamics, of course, is no exception, for there is no other calculus at all. This is common sense.
Based on the definition of calculus, we know:
for the definite integral ∫_T f(T)dQ, only when Q = F(T) is ∫_T f(T)dQ = ∫_T f(T)dF(T) meaningful.
As long as Q is not a single-valued function of T, namely Q = F(T, X, …), then
∫_T f(T)dQ = ∫_T f(T)dF(T, X, …) is meaningless.
1) On the one hand, we all know that Q is not a single-valued function of T; this alone is enough to determine that the definite integral ∫_T f(T)dQ = ∫_T (1/T)dQ is meaningless.
2) On the other hand, in fact Q = f(P, V, T), and then
∫_T (1/T)dQ = ∫_T (1/T)df(T, V, P) = ∫_T dF(T, V, P) is certainly meaningless. (Here the subscript T marks the variable of integration.)
We know that dQ/T is used for the definite integral ∫_T (1/T)dQ, and since ∫_T (1/T)dQ is meaningless, ΔQ/T cannot be turned into dQ/T at all.
That is, the so-called "entropy" does not exist at all.
Why did the wrong "entropy" appear?
In summary, this was due to the following two reasons:
1) Physically, people did not know that Q = f(P, V, T).
2) Mathematically, people did not know that AΔB cannot become AdB directly.
If people had known either of these, the mistake of entropy would not have happened in history.
Please read my paper, and the answers to the questions related to it, in my Projects.
https://www.researchgate.net/publication/230554936_Entropy_A_concept_that_is_not_a_physical_quantity
Dear Franklin Uriel Parás Hernández,
1) δQ/T is explainable [1], pp. 3-4: it is only the entropy of δQ, not a definition of the system state. A state function can be defined only by state variables, and then the state meaning it represents is explainable; since Q is not a state variable, δQ/T cannot explain the state meaning of (the change of) the entropy. It cannot be proved to be an exact differential in mathematics; it can only be presented with the aid of an imaginary reversible cycle of a heat engine.
2) Gibbs entropy is correct as an equation, and it implies the definition of the entropy. The question is that the equation involves a difference of functions:
dS = dU/T + (pdV)/T − Σ(μdN)/T.
As we know from the first law, Σ(μdN) is a part of the internal energy dU, so what the difference of functions (dU − Σ(μdN)) represents must be made clear. The above equation is actually
dS = (dU − Σ(μdN))/T + (pdV)/T,
and the definition of a state function should not be given by a difference of functions.
What is (dU − Σ(μdN))? The answer is: the heat energy. [1]
3) If the physical meaning of thermodynamic entropy remains unclear, the ‘other thermodynamics’ are less reliable.
[1] https://www.researchgate.net/publication/331457885_Heat_Energy_Entropy_and_The_Second_Law_A_New_Perspective_in_Thermodynamics
I sincerely advise you to stop studying "entropy". It is a waste of time to study a theory of something that does not exist at all!
Dear Franklin Uriel Parás Hernández,
Thermodynamics should be a self-consistent theory; 'self-consistent' means that all of its concepts and principles are comprehensible, including an explanation of the physical content of the entropy by thermodynamics itself.
And remember that the definition of pressure is force acting per unit area; this definition has nothing to do with the entropy. An equation involving a partial derivative is not a definition.
Best regards,
Tang Suye
Comparison of New and Old Thermodynamics
1. Logic of the Second Law of Thermodynamics: Subjectivism, Logical Jump, Interdisciplinary Argumentation.
2. New thermodynamics pursues universality, two theoretical cornerstones:
2.1 Boltzmann formula: ρ = A·exp(−Mgh/RT). Isotope centrifugal separation experiments show that it is suitable for gases and liquids (see the sketch after this list).
2.2. Hydrostatic equilibrium: applicable to gases and liquids.
3. The second and third acoustic virial coefficients of R143a derived from the new thermodynamics are in agreement with the experimental results.
3.1. The derived third acoustic virial coefficient agrees with the experimental data, which shows that the theory remains correct when the critical density is reached.
4. See Appendix Pictures and Documents for details.
Preprint In the gravitational field, the relation between the interna...
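A minimal numeric sketch of the barometric formula quoted in 2.1 (the values for air and the reading of A as the sea-level density are my own illustrative assumptions, not taken from the post):

```python
import math

# Barometric density profile rho(h) = A * exp(-M*g*h / (R*T)).
# Assumed illustrative values; A is read here as the sea-level density.
M = 0.029      # molar mass of air, kg/mol (assumption)
g = 9.81       # gravitational acceleration, m/s^2
R = 8.314      # gas constant, J/(mol*K)
T = 288.0      # temperature, K (isothermal assumption)
A = 1.225      # sea-level density, kg/m^3

def density(h):
    """Density at height h (m) under the isothermal barometric formula."""
    return A * math.exp(-M * g * h / (R * T))

for h in (0, 1000, 5000, 10000):
    print(f"h = {h:6d} m -> rho = {density(h):.4f} kg/m^3")
```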
Probably the simplest explanation of entropy is Boltzmann's: S = k ln Ω, where Ω is the number of accessible states. Thus the mathematics of entropy ultimately reduces to the simple counting of the number of accessible states Ω. A system is specified by various parameters, say volume, temperature, etc., which describe its properties. The smaller Ω is, the more localized the system is in parameter space; the larger Ω is, the more delocalized it is. The Second Law of Thermodynamics can be stated as: net localization is impossible. Localization is possible but must be paid for by greater delocalization elsewhere. Localization = negentropy = fuel. If all microstates are equally probable, localization is measured by the smallness of ln Ω. If they are not, it is measured by the smallness of −Σ Pj ln Pj. In either case, the maximum possible localization is when one Pj is unity and all the rest are zero, i.e., Ω = 1.
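A small sketch of this counting picture (my own toy probabilities; k is set to 1 so entropy is in units of k):

```python
import math

k = 1.0  # Boltzmann constant, set to 1 here so entropy is in units of k

def boltzmann_entropy(omega):
    """S = k ln(Omega) for Omega equally probable microstates."""
    return k * math.log(omega)

def gibbs_entropy(probs):
    """S = -k * sum(p ln p) for a general microstate distribution."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Maximum localization: one microstate certain -> S is zero.
print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))
# Maximum delocalization over 4 states -> S = ln 4, same as k ln Omega:
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.386
print(boltzmann_entropy(4))                     # ~1.386
```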
I must point out that "thermodynamics gives an additional relation between pressure, energy, entropy and volume" is wrong; it is that simple a mistake.
Many have thought that "the temperature, pressure, … are partial derivatives of the function U(S, V, N1..Nr)" (textbook):
T = (∂U/∂S)_(V, N1..Nr)
p = −(∂U/∂V)_(S, N1..Nr)
The two equations only express that when V, N1..Nr are constant, T = ∂U/∂S, and that when S, N1..Nr are constant, p = −∂U/∂V. These are particular situations, not general relations.
Why do you not consider that when S, V, N1..Nr are not constant, the temperature and pressure still exist?
The temperature and pressure are not partial derivatives of the function of S, V, N1..Nr, …; they are only, in some cases, equal to them.
Dear Jack Denur,
Statistical mechanics is established on the equal a priori probability postulate; thermodynamics does not need to consider this postulate. In fact, the equal a priori probability postulate does not always hold: it holds only when thermal motion is stronger than the interactions.
So thermodynamic entropy > Boltzmann entropy.
If the physical content of thermodynamic entropy remains unclear, the relations between thermodynamic entropy and Boltzmann entropy are unclear.
Best
Dear Mr. Hernández,
In response to your *entropy* question.
Super-briefly: entropy is the measure of the disorder of some (any) system.
A little bit more.
As an entity, *entropy* is a unique and most intriguing concept in modern philosophy and the natural sciences. The name derives from the ancient Greek ἐντροπία (turnabout, conversion, transformation). The point is that any system existing in a single macroscopic state may occupy many different microscopic states; entropy is the one concept that establishes a direct tie between these states, i.e., it connects the micro- and macro-descriptions of the system.
It is a unique concept in modern physics in that it shows the very trend of process development: the greater the entropy, the lower the probability of further spontaneous, self-sustaining evolution of an isolated system toward its final stationary equilibrium state. Because entropy is a state function (not a transition function), it is independent of the transition route, depending only on the initial state and leading to a single final state.
Gnoseologically, entropy is the measure of energy devaluation (just so: not the price of energy, but rather energy devaluation). It reflects the capability of the system to produce work. Two systems of the same energy can produce different amounts of work: the higher the entropy, the less accessible work. And vice versa.
As a philosophical entity, entropy has many senses.
In the mathematical sense, entropy is the logarithm of the number of accessible microscopic states of the system (the numerical value depends on the base of the logarithm). This feature provides additivity of the entropies of two independent systems. Accordingly, in mathematical statistics, entropy is a dimensionless measure of the uncertainty of a probability distribution (i.e., of its dispersion).
In thermodynamics, it characterizes the measure of irreversibility of a spontaneous process (dissipation). In this approach, entropy is a dimensional function (energy/temperature, i.e., J/K).
In statistical physics, entropy reflects the existence probability of a microscopic state and should be read as the measure of the disarray of the system. This in turn implies that the system can possess different quantities and qualities of information; in other words, the same (in the macroscopic sense) system has different information capacities. That is why the founder of information theory (Shannon) equated the concepts *entropy* and *information*. In this approach, entropy is a dimensionless parameter. It should be seen as the measure of distortion of the transmitted information (due to any reason: improper presentation of the original symbols, spontaneous appearance of strange symbols, etc.).
In quantum physics, entropy is the number of accessible microscopic states, the macroscopic non-equilibrium data being the same.
In control theory, it is the measure of the uncertainty of system behavior, all initial conditions being given.
In the theory of dynamical systems, it is the measure of the chaotic state, and it predetermines the variety of the dynamic trajectories of the system's behavior. Etc.
Summary: entropy as a concept has many faces (a Janus, not two-faced but rather multi-faced).
It is my hope that the preceding helps you to see what entropy is.
Regards,
V. Dimitrov.
The format of this page does not allow me to present a plot, but if you write to my personal e-mail I will send you the plot that shows the connection of entropy S with the other thermodynamic functions (H, enthalpy; U, internal energy; etc.). My contact details are below.
Vasili DIMITROV, Ph.& Ch.D.
Emeritus Professor of Chemical Physics.
25 Fisherville Rd, #703, North York,
ON, M2R 3B7, Canada.
Tel. home (1)-416-546-8442, cell (1)-647-632-6355.
E-mail: [email protected]
Skype vasilidim
Proving the Existence of a Perpetual Motion Machine with COMSOL
Miao Bo
ABSTRACT: In this paper we study charged conductors. The charge density at a tip is high, and the charge concentration potential there is high. In a depression, the charge density tends to zero, and the concentration potential tends to negative infinity. Using the difference in concentration potential between the two, a diffusion channel is established between the tip and the depression. During the diffusion process, the thermal energy of the charge is converted into electric energy. The charge enters the depression and returns to the tip by conduction.
Key words: second law of thermodynamics, electrostatic equilibrium, COMSOL, long-range interaction
PACC: 4460, 5130
Entropy is a simple concept in classical thermodynamics. It is made complicated in statistical mechanics, and even more so in quantum mechanics. Therefore, to appreciate its simplicity and understand it in its elements, we need to study classical thermodynamics first. The younger generation of scientists, who learn thermodynamics through statistical mechanics, get a conceptual understanding of entropy different from that obtained through classical thermodynamics.
In classical thermodynamics, entropy is defined through Clausius' relation:
∆S ≥ Q/T, and for a cyclic process ∮δQ/T ≤ 0. Here Q is the heat absorbed by the system at absolute temperature T; the equality sign applies to reversible processes and the inequality sign to irreversible processes.
Thus, we see, the concept of entropy is closely associated with the concept of reversibility.
Planck gives the simplest characterization: a process which can in no way be completely reversed, even with the assistance of all agents in nature to restore everywhere the exact initial conditions once the process has taken place, is termed 'irreversible'; all other processes are reversible. We note that heat interactions are not explicitly involved in this statement.
Caratheodory formulates the second law using adiabatic processes where heat interactions find no place.
When no heat interactions are involved, Clausius' result gives entropy change to be zero, as is to be expected of a mechanical process.
We thus see that, Clausius, Planck and Caratheodory give the same concept of entropy in different forms.
There is but one important point that we didn't consider above, and that is the concept of the ubiquitous 'ideal gas'. It is so intimately intertwined with thermodynamics that students tend to think thermodynamics doesn't exist without the ideal gas!
Through the experiments of Joule and Thomson (Kelvin), the ideal gas sneaks into thermodynamics. It assumes further importance through the works of Maxwell, and gigantic proportions with the works of Boltzmann and Gibbs.
From then on so many paradoxes appear.
The concept of entropy loses its simplicity and becomes so complicated that it is difficult to understand easily. The situation worsens with the entry of quantum mechanics and becomes unwieldy with the introduction of information theory.
I don't mean to belittle any of the great sciences and their contribution to their fields of origin - but as far as thermodynamics is concerned they contributed only to more confusion.
I hope my friends who specialize in those branches understand my exposition in the right spirit.
The second law of thermodynamics was chaotic at first; see my post "Comparison of New and Old Thermodynamics" above.
Indeed, there are some, as noted above: *From then on so many paradoxes appear.*
There are some well-known paradoxes in the history of science (Maxwell's demon, the "black hole" information paradox, Schrödinger's cat, etc.), Gibbs' paradox (hereafter GP) being one of the most intriguing and poorly understood. GP allows the entropy of closed systems to change, thus quasi-violating the Second Law of thermodynamics. A related paradox is the so-called "mixing paradox": if one takes the perspective that the definition of *…entropy must be changed so as to ignore particle permutation…*, the paradox is averted.
Excellent founding fathers (Poincaré, Lorentz, Van der Waals, Nernst, Planck, Fermi, Einstein, Schrödinger, and Tamm, among others, together with 9 Nobel Prize winners) dealt with GP, a conceptually new approach being put forward by I. Prigogine. Contrary to the above-listed paradoxes, GP has two additional specific features.
1) First, the very existence of GP is questionable.
Note that Gibbs himself did not believe in the existence of GP: "…merging of two equal amounts of the same gas doesn't increase the mixture entropy…" (Gibbs J.W., The Collected Works, Thermodynamics, v. 1, N.Y.-London-Toronto, 1931). Nor does it exist in the statistical-mechanics approach. At the same time, GP certainly exists in the quantum-statistical approach.
2) Second, GP is an interdisciplinary Problem (a Problem with a capital letter); therefore, both the statement and the very solution of GP are of different natures, e.g., thermodynamic, classical-statistical, quantum-statistical, operational, informational, etc. That is why entropy as a concept has many faces (a Janus, not two-faced but rather multi-faced): it can be considered simultaneously as a parameter and/or a function, which has/hasn't a dimension.
Entropy is indeed a parameter (not a function) in the classical-statistical analysis, yet it is a function (not a parameter) in the quantum-statistical approach. Moreover, it is a smooth (i.e., differentiable) function in the informational approach, while it is a discontinuous function in the quantum-statistical approach. Entropy indeed has no dimension (by definition) in the informational approach (the reason is evident: Shannon defines it as the ratio of the lost information to the whole information, i.e., as a ratio of like quantities), while it is certainly a dimensional quantity [energy/temperature, J/K] from the thermodynamic point of view, etc.
Regards, V. Dimitrov
@Padyala
you wrote:
"∆S>= Q/T, and for a cyclic process: ∆S>=0. Q is the heat absorbed by the system in a reversible process at absolute temperature T. The inequality sign applies to irreversible processes and the equality sign for reversible processes."
"applies to irreversible processes" ----> question: what is T for irreversible processes ?
Best regards !
Connect the initial and final states by a reversible process. In this process we will have no problem with the values of Q or with the corresponding values of T. Get ∆S for this process.
For irreversible processes we don't need to know the values of T. Even without knowing the values of T for the irreversible process (the process that you may have in mind) connecting the same end points, we can make the assertive statement that ∆S is the same, since S is a state function, while ∫δQ/T evaluated along the irreversible process is smaller than ∆S.
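As a concrete sketch of this recipe (assuming an ideal gas with constant heat capacity, a textbook case of my choosing, not anyone's specific system): connect the end states by a reversible path and integrate δQ/T.

```python
import math

# Entropy change of n moles of ideal gas between states (T1, V1) and (T2, V2),
# computed along a reversible path: dS = n*Cv*dT/T + n*R*dV/V, so
# dS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1).
R = 8.314  # gas constant, J/(mol*K)

def delta_S_ideal_gas(n, Cv, T1, V1, T2, V2):
    """Entropy change via a reversible path (S is a state function)."""
    return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

# Example: 1 mol of monatomic gas (Cv = 3R/2) doubling its volume at 300 K.
dS = delta_S_ideal_gas(n=1.0, Cv=1.5 * R, T1=300.0, V1=1.0, T2=300.0, V2=2.0)
print(f"dS = {dS:.3f} J/K")  # R*ln 2 ~ 5.763 J/K; the same value holds even
# if the actual process was an irreversible free expansion with Q = 0.
```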
Dear Bo Miao,
Since I found your earlier post difficult to understand, I didn't reply.
Now you have made a simple statement that is easy to understand: Carnot efficiency is wrong, so why discuss entropy?
You are right. It has been fashionable to discuss entropy for a long time, since it is more complicated.
We can discuss the issue starting with the problem of the definition of Carnot efficiency.
These are two different methods of addressing the same issue.
Dear Prof. Tang Suye,
Thank you.
I am happy you found my posts to be correct.
As you rightly pointed out, we must start at the beginning to discuss/resolve the issues here. These issues are connected with the second law.
I am sure that everyone who contributed posts to the topic here came across these issues in different forms in different contexts.
It is not simply an issue of understanding the concept of entropy, or the concept of irreversibility, or the correctness of the Carnot efficiency; it is the second law of thermodynamics itself.
As you correctly stated, we need to go back and start from the beginning.
In existing thermodynamics, as I. Prigogine indicated, 'entropy is a very strange concept, without hope of achieving a complete description'. Now we need to find where the issues are.
By the Clausius approach, we can only know that there is a state function named entropy, symbolized by S: in a reversible heat exchange process we have dS = δQ/T, in an irreversible process we have dS > δQ/T, and for an isolated system dS >= 0. But Clausius did not give an explicit definition of the function S.
It is readily apparent that dS = δQ/T is applicable only to a reversible heat exchange process; it is only the entropy of δQ. The equation is inapplicable to an irreversible process, and inapplicable to an isolated system, because in such cases dS ≠ δQ/T. Then, in the general case, what is dS equal to? And what is the definition of entropy?
The first question can be answered by the fundamental equation of thermodynamics; the second question has no answer (because the fundamental equation is not an explicit definition).
There is no explicit definition indicating the physical content of the entropy S. Thus, in existing thermodynamics, the physical meaning of the entropy is unexplainable (one can only guess). This is the first issue.
If you think you have an answer to your first question, please write it out.
Let me reiterate: what you write must contain only an equals sign (=), because your question asks what dS is equal to.
Consider δQ/T to be the entropy of the heat flux δQ.
For a given system, we need to consider δQ being transformed into the internal energy of the system, or the internal energy of the system being transformed into the heat flux δQ: how do we describe the state changes of the system?
From non-equilibrium thermodynamics we have
dS=deS+diS>=δQ/T.
where deS=δQ/T is the entropy flux, and diS is the entropy production.
We consider only a reversible transformation process, with diS = 0, such that
dS=deS=δQ/T.
For an ideal gas (as an example), by the first law,
δQ=dU+pdV.
Thus, we get
dS=deS=δQ/T=dU/T+pdV/T.
where dS is the state change of the ideal gas, which can be directly proved to be an exact differential in mathematics.
In general cases, we still have (diS>=0)
dS=dU/T+pdV/T>=δQ/T.
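A quick symbolic check of that exactness claim (a sketch using sympy; I assume a monatomic ideal gas with U = (3/2)NkT and p = NkT/V, so dS = (3/2)Nk dT/T + Nk dV/V):

```python
import sympy as sp

T, V = sp.symbols('T V', positive=True)
N, k = sp.symbols('N k', positive=True)

# dS = M(T,V)*dT + L(T,V)*dV for the monatomic ideal gas:
M = sp.Rational(3, 2) * N * k / T   # coefficient of dT
L = N * k / V                        # coefficient of dV

# dS is exact iff dM/dV == dL/dT (equality of mixed partials).
print(sp.diff(M, V) == sp.diff(L, T))  # True: both are 0

# By contrast, δQ = dU + p*dV itself is NOT exact:
Mq = sp.Rational(3, 2) * N * k       # coefficient of dT in δQ
Lq = N * k * T / V                   # coefficient of dV in δQ
print(sp.diff(Mq, V) == sp.diff(Lq, T))  # False: 0 vs N*k/V
```

This illustrates the standard point that 1/T acts as an integrating factor turning the inexact δQ into the exact dS.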
Do you need irreversible thermodynamics to answer your first question?
Try, if you can, to answer without using irreversible thermodynamics.
To you all
Before continuing the discussion, please read the attached paper.
Best regards
W.M.
Why is there no explicit definition indicating the physical content of the entropy S? It is a bit difficult.
There is a deep issue involving the first law of thermodynamics that has not been addressed.
We both know the internal energy, which is the sum of the different types of energy within a given system. The question is: does all of the internal energy contribute to the entropy, or only some types of it?
In other words, assume that the internal energy is the sum of three types of energy within a given system:
U = UT + Ux + UN.
If, in U, only UT contributes to the entropy, i.e., S is a function of UT but not of Ux and UN, then UT needs to be distinguished first, and only then can we define the entropy by UT. However, in existing thermodynamics, UT has not been distinguished; therefore, no one can give an explicit definition indicating the physical content of the entropy S.
Dear Tang Suye,
'Irreversible thermodynamics is not an essential requirement to answer the first question.'
Then why use it to answer the first question? Let us try to keep the discussion at the simplest level possible; that makes it easier for others to follow.
'Why there is no an explicit definition which indicates the physical contents of the entropy S?'
Why do we not ask a similar question about U?
Thermodynamics doesn't define either U or S; it defines only ∆U and ∆S. ∆U is defined by the first law and ∆S by the second law.
'does all of the internal energy make contribution to entropy, or only some types of the internal energy make to?'
What does this question mean?
What is the meaning of saying, or asking, whether the internal energy contributes to the entropy?
Finally, you seem to imply that if there is a definition for U, then you can offer a definition of S. You can perhaps explain how you would define S, given a definition of U.
1) For an ideal gas, U=(i/2)NkT.
2) By Kelvin's statement, heat cannot be converted into work without compensation. This shows that heat Q and work W are two different types of energy in transfer, and the entropies of the two are also different. Thus there is a similar question: does a similar difference exist within the internal energy? If yes, that difference needs to be distinguished. It would mean that there are two different types of internal energy and that the entropy is associated with only one type, just as (δQ/T)rev is the entropy of δQ while (δW/T)rev makes no sense.
3) What I mean is that there may be two different types of internal energy which need to be distinguished. The internal energy itself has already been well defined; its physical content is the sum of the different types of energy within a given system. We only need to make a classification of the different types.
4) Yes, I can perhaps explain how I would redefine S; please see the paper linked below. Thanks for reading my work.
https://www.researchgate.net/publication/331457885_Heat_Energy_Entropy_and_The_Second_Law_A_New_Perspective_in_Thermodynamics
You write the equation for an ideal gas, U = (i/2)NkT.
But why choose the ideal gas as the system?
By restricting the system to an ideal gas, we rob thermodynamics of its universal validity and project a poor picture of thermodynamics to the general reader or the serious student.
For an arbitrarily chosen system, U is not defined. Let me reiterate: thermodynamics doesn't define U, it defines only ∆U (through the statement of the first law).
You seem to be trying to change first law in order to be able to explain/understand the concept of entropy/second law. This may not be a sound strategy.
I find you using ideas such as 'energy produced', which violate the very well established law of conservation of energy and are therefore invalid. (Note: energy cannot be produced!)
Again, free energy is not a form of internal energy; let us not mix up concepts from the second law and the first law, creating confusion.
The concept of heat needs to be modified. The definition of efficiency of an ideal heat engine needs to be modified - so that thermodynamics is in harmony with mechanics, by ridding itself of the so many paradoxes it suffers from (for example Maxwell's demon).
So we may see if some others also feel that there is a need for modification of the concept of heat, of efficiency of an ideal heat engine, etc., so as to take this discussion on the concept of entropy forward.
'.....and the first law of thermodynamics may be regarded as defining the internal energy as the sum of heat added and work done on the system by its surroundings.'
This statement from your ref. [1] is incorrect; it does not state the first law correctly.
Maybe many others also follow and use it - I can only say that they are misguided.
In the process of transformation of heat into work, or in its reverse process, energy is transformed from one form to another, but is not created/generated in either of the forms.
My only caution is that, our statements should not lead to misunderstanding of what we are trying to convey.
I appreciate your efforts to convince others about the concept of internal energy. Discourses such as the one you sent are the ones that make students believe, in their formative years, that they understand what internal energy is. Such discourses don't distinguish the different kinds of energy within the internal energy.
Let me try an alternate method.
Thermodynamics doesn't bother about what is within the boundaries of the system. For example, it need not be a gas at all, let alone an ideal gas!
Did we not learn, and do we not teach, that thermodynamics deals with macroscopic systems? Why then go on lecturing about molecules, atoms, electrons, bonds between atoms and so on? This is the influence of statistical mechanics, which destroyed the most powerful, simple and beautiful subject of thermodynamics.
Let us not fall prey to this trap.
Let us discuss entropy from the point of view of classical thermodynamics. We will have lots of opportunities to develop new and fundamental concepts that lead to breakthroughs and to beautiful, simple new sciences.
Let me now address the original issue raised by OP.
When Newton's laws are taught, we are advised not to ask certain questions, e.g. 'What (is force)?', but rather 'How (does it change motion)?'. That inquiry gives a better understanding of what force is.
Taking a cue from that, we may suggest that it is better to ask 'How (does entropy, or for that matter internal energy, help us understand nature or natural processes better)?' rather than 'What (is entropy)?'. The answers to such an inquiry lead to a better understanding of what entropy, or internal energy, is.
Let me briefly answer that question for the present and elaborate only if and when some interest is evinced.
Entropy S is a thermodynamic state function of a system. When a system changes from one equilibrium state A to another equilibrium state B, the value of S changes, or in some cases remains the same. Simultaneously, the entropy of the surroundings also changes. The sum of these two changes together (called the entropy change of the universe) is always either greater than zero or equal to zero.
Case I. ∆S(universe) > 0.
Then we can declare with 100% certainty that the process that takes the system from A to B is spontaneous, that is, it occurs on its own with no external influence.
Case II. ∆S(universe) = 0.
Then we say the process is reversible. This requires further elaboration, and we will not digress into that now.
Case III. ∆S(universe) < 0.
Then we can declare, without fear of contradiction, that the process is impossible.
This information is of the greatest importance: it enables us to predict whether a given process is spontaneous or not, without doing the experiment (this also needs elaboration, which we shall not go into here). If it is spontaneous, we can invest time, money and effort into making that process occur at a desired rate (which is a different study altogether, called kinetics).
If, on the other hand, ∆S(universe) < 0, we are saved from wasting time, money, etc. on such a process.
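A minimal numeric illustration of these three cases (a toy example of my own, not from the thread: heat Q passing between two reservoirs, where ∆S(universe) = Q/T_sink − Q/T_source):

```python
def entropy_change_universe(Q, T_source, T_sink):
    """dS(universe) when heat Q leaves a reservoir at T_source
    and enters a reservoir at T_sink (temperatures in kelvin)."""
    return -Q / T_source + Q / T_sink

def classify(dS):
    if dS > 0:
        return "spontaneous"
    if dS == 0:
        return "reversible (limiting case)"
    return "impossible"

# Heat flowing from hot (500 K) to cold (300 K): dS > 0, spontaneous.
print(classify(entropy_change_universe(1000.0, 500.0, 300.0)))
# The reverse direction, cold to hot: dS < 0, impossible without external work.
print(classify(entropy_change_universe(1000.0, 300.0, 500.0)))
```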
Predictions of thermodynamics have unparalleled validity and are most useful. Hence the importance, greatness and interest in thermodynamics!
A word of caution!
In spite of its great power, entropy is not of much practical use, because predictions require a knowledge of the entropy of the surroundings as well, and the change in the entropy of the surroundings is not an easily obtainable quantity.
It is this difficulty that led to further searches for criteria to predict spontaneity. It is those searches that led to the concepts of the different types of free energies, which proved to be more useful.
Radhakrishnamurty Padyala
What do you think about stochastic thermodynamics, which includes processes of negative entropy production, a concept that is missing in your contribution?
Once we cross the boundaries of classical thermodynamics, we cannot be 100% sure of the predictions made or the conclusions drawn from the analyses.
So stochastic thermodynamics, and many other such theories, don't offer us the reliability that classical thermodynamics provides.
Because of certain inherent problems present in thermodynamics, which scientists refuse to accept, and because of the blind faith they repose in thermodynamics, so much confusion is arising. That confusion, however, goes under the guise of new developments and discoveries such as information thermodynamics, stochastic thermodynamics, black hole thermodynamics and so on.
They haven't reached the dead end yet. Only when they do will new, original discoveries come.
We need to formulate a new theory of heat to overcome the deficiencies of thermodynamics. We need to redefine the efficiency of an ideal heat engine. The concept of entropy, which is the origin of the 'arrow of time', will then disappear.
Nature has no one-sidedness; it exhibits the beauty of symmetry in every aspect of its phenomena. Nature tries to teach us the beauty and simplicity of symmetry.
We ignore it and postulate 'supersymmetries': modern science is relentlessly after complexity! It tries to see complexity in Nature where it is not present.
Stochastic thermodynamics may have processes that lead to production of negative entropy, but classical thermodynamics will have nothing to say about it.
Dear Prof. Bo Miao,
Is the system you are considering, a closed adiabatic system?
If yes, could you show whether the final state could be reached from the initial state? In light of Caratheodory principle, certain states are not reachable by an adiabatic process from a given initial state.
In other words, our aim would be to show that the entropy in the final state is >= the entropy in the initial state.
When the partition moves, the volume of the system (container) doesn't change, does it? It is a closed system. Therefore volume, temperature and mass remain constant for every position of the partition. What change could the system possibly undergo?
If the system is in contact with a heat reservoir, should there not be a term representing the change in the entropy suffered by the heat reservoir, on RHS of the first equation, dS = ....?
In my opinion, entropy is a useful property and a valuable tool in the analysis of the second law of thermodynamics in engineering devices, but this does not mean that we know and understand entropy well. In fact, we cannot give an adequate answer to the question: what is entropy? However, the impossibility of describing entropy in its entirety has nothing to do with its usefulness. It is not possible to define energy either, but this does not interfere with our understanding of energy transformations and the conservation principle. Admittedly, entropy is not as common a word as energy, but with continued use a deeper understanding and appreciation is reached.
This analysis attempts to show the physical meaning of entropy, considering the microscopic nature of matter. Entropy can be seen as a measure of molecular disorder, or molecular randomness. When a system becomes more disordered, the positions of the molecules are less predictable and the entropy increases, so it is not surprising that the entropy of a substance is lower in the solid phase and higher in the gas phase. In a solid, the molecules continuously oscillate about their equilibrium positions but cannot move with respect to each other, so their positions can be predicted with certainty at any time. In a gas, however, the molecules move randomly, collide with each other and change direction, making it extremely difficult to predict accurately the microscopic state of the system at any instant. Associated with this molecular chaos is a high entropy value.
From a microscopic point of view (the perspective of statistical thermodynamics), an isolated system that seems to be in equilibrium can exhibit a high level of activity due to the incessant movement of the molecules. To each macroscopic equilibrium state corresponds a large number of possible microscopic states or molecular configurations; the entropy of a system is related to the total number of these possible states, called the thermodynamic probability (p), and is expressed by the Boltzmann relation as
S = k ln(p),
where k is the Boltzmann constant. Therefore, from a microscopic point of view, the entropy of a system increases whenever the randomness or molecular uncertainty (that is, the molecular probability) of a system increases. Thus entropy is a measure of molecular disorder, and the molecular disorder of an isolated system increases whenever it undergoes a process.
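A toy illustration of this counting view (my own sketch, not from the post above: N gas molecules distributed between the two halves of a box, with the thermodynamic probability p of a macrostate taken as its microstate count):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def microstates(N, n_left):
    """Number of microstates with n_left of N molecules in the left half."""
    return math.comb(N, n_left)

def boltzmann_S(p):
    """S = k ln(p), with p the thermodynamic probability (microstate count)."""
    return k * math.log(p)

N = 100
# All molecules crammed into one half: exactly 1 microstate -> S = 0.
print(f"ordered   : S = {boltzmann_S(microstates(N, 0)):.3e} J/K")
# Even split: the overwhelmingly most probable macrostate -> maximal S.
print(f"disordered: S = {boltzmann_S(microstates(N, N // 2)):.3e} J/K")
```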
Dear Debrayan,
You mentioned 'entropy is a measure of molecular disorder', but you ignored the limitations of statistical mechanics. The equal a priori probability postulate is applicable only when thermal motion is stronger than the interactions, and the statistical explanation applies only to thermal motion, not to all states or processes. In statistical mechanics we see a priori probability, fluctuations, thermal motion, stochastic processes and the tendency toward equilibrium, but we can hardly discuss structure, order, creative processes, self-organization, etc.; yet the latter are also research objects of thermodynamics.
A pretty girl marries a handsome boy, and then they have a baby: this is an irreversible process, and it can be described by the second law of thermodynamics (dissipative structures). How would you describe this story by the a priori probability postulate? Obviously it is not a probability distribution.
The a priori probability postulate is not a fundamental principle.
The a priori probability postulate is not applicable to describing long-range order.
Compare different systems:
1. A gas: the a priori probability postulate is applicable.
2. A liquid (short-range order): the postulate is applicable for some systems, such as a single-component system or a multicomponent system whose components are completely miscible.
3. A multicomponent system whose components are only partly miscible, such as water and oil: it will be a multiphase system, and the postulate is not fully applicable.
4. Solids and structures (long-range order): the postulate is not applicable.
The plain fact is that not all molecular systems are probability-distributed systems; in many cases the a priori probability postulate is not applicable.
Boltzmann's 'entropy is a measure of molecular disorder'? It works only in cases 1 and 2.
Until today, no one really knows what entropy is.
Please consider the following experiment.
Put a big water droplet on a glass plate, then sprinkle some small water droplets near it, and cover the plate with a glass shade (sealed airtight). Wait a few days (maintaining an appropriate temperature). What will happen? The small water droplets will migrate into the big droplet and disappear; at the same time, the big water droplet becomes bigger. In the equilibrium state there is only one, bigger, water droplet.
From physical chemistry we know that the vapor pressure of a small droplet rises as its curvature radius shrinks: the smaller the curvature radius, the bigger the vapor pressure. That is why the small water droplets migrate into the big droplet and then disappear.
This is a spontaneous process different from diffusion, dS > 0 (dG < 0).
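A sketch of that vapor-pressure effect using the Kelvin equation, p(r) = p0·exp(2γVm/(rRT)); the choice of this formula and the illustrative values for water at 298 K are my own assumptions, since the post gives only the qualitative statement:

```python
import math

# Kelvin equation: p(r)/p0 = exp(2*gamma*Vm / (r*R*T)).
# Illustrative values for water at 298 K (assumptions for this sketch).
gamma = 0.072      # surface tension, N/m
Vm = 1.8e-5        # molar volume of liquid water, m^3/mol
R = 8.314          # gas constant, J/(mol*K)
T = 298.0          # temperature, K

def vapor_pressure_ratio(r):
    """p(r)/p_flat for a droplet of radius r (m)."""
    return math.exp(2 * gamma * Vm / (r * R * T))

# Smaller droplets have higher vapor pressure, so they evaporate and the
# vapor condenses on the big droplet (Ostwald ripening).
for r in (1e-6, 1e-7, 1e-8):
    print(f"r = {r:.0e} m -> p/p0 = {vapor_pressure_ratio(r):.4f}")
```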
The answer to your excellent question is very complicated and there is no universal agreement on this subject, but if I had only twenty lines to answer it, I wouldn't talk about metaphors such as the spreading of energy, and certainly not of an increase in disorder. Instead, I would define entropy as a measure of the probability of a thermodynamic system. In turn, the probability is defined by the number of accessible quantum microstates, W. W is a temperature-dependent quantity, and SB (Boltzmann entropy) = k ln W. This expression is universal: it doesn't just apply to gases, but we must be able to define the thermodynamic system and so model W if we want to calculate SB; that's easy for a bottle of gas at a constant temperature but virtually impossible for complex systems. However, in principle, we should be able to define SB for open or isolated systems and (importantly) for systems that are not at equilibrium as well as those that are. In other words, SB is allowed to vary with time. At equilibrium, SB equals the Clausius entropy (which does not vary with time). Boltzmann entropy therefore subsumes Clausius entropy (often simply called thermodynamic entropy) and is the closest we have to a universal entropy for equilibrium and non-equilibrium states. So, for example, we can't talk about the Clausius entropy of a plant, since the plant is not at equilibrium, but we could, assuming that the thermodynamic system can be defined in principle, talk about the Boltzmann entropy of a plant. But note that the entropy of open systems is not constrained by the Second Law: while irreversible processes always generate entropy, matter or energy entering or leaving the open system can cause the entropy of the plant to go up, stay in a steady state, or go down with time.