Everybody believes that thermodynamics is atemporal, but I think it is not. Actually, I am trying to find an equation that defines time as a function of entropy variation.
You can write a balance equation for entropy, for example in 1D:
∂s/∂t + u·∂s/∂x = s_production
But I am not sure this function s = s(x, t) can be inverted to define time as a function of s...
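To make the invertibility worry concrete, here is a minimal numerical sketch (the production rate is a made-up, purely illustrative function): along a particle path the cumulative entropy s(t) is non-decreasing, but it is flat wherever the production vanishes, and there the inverse map t(s) is not single-valued.

```python
# Sketch: integrate the entropy balance along a particle path, ds/dt = sigma(t),
# and test whether t can be recovered from s. All numbers are illustrative.
import numpy as np

def sigma(t):
    # hypothetical non-negative production rate that vanishes on (2, 3)
    return np.where((t > 2.0) & (t < 3.0), 0.0, np.exp(-t))

t = np.linspace(0.0, 5.0, 501)
dt = t[1] - t[0]
# trapezoidal cumulative integral of sigma, starting from s(0) = 0
s = np.concatenate(([0.0], np.cumsum(0.5 * (sigma(t[1:]) + sigma(t[:-1])) * dt)))

# s(t) is non-decreasing, but flat wherever sigma == 0,
# so the inverse map t(s) is not single-valued there.
flat = np.isclose(np.diff(s), 0.0)
print("fraction of the path where t(s) is undefined:", flat.mean())
```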
I'm not the best expert on this, but the second law of thermodynamics is the only fundamental law that gives time a direction.
If entropy in a closed system can only increase and never decrease, then the time direction in which this happens can be defined as positive time (i.e. physical) and the one in which entropy decreases is the negative (i.e. unphysical) time direction.
None of this is new though.
I am not sure that can work... in a closed system the entropy cannot decrease, but it could be that the production of entropy is zero, so would you then conclude that time is constant and does not evolve?
As a science of exchange of mass and energy between macroscopic systems, thermodynamics is far from being atemporal.
It is thermodynamic equilibrium which - by definition - involves no time-dependent quantity. Its basic feature - maximization of entropy in closed and isolated (microcanonical) systems, constrained maximization of entropy under the constraint of total energy conservation in closed, non-isolated (canonical) systems - follows straightforwardly from Liouville's theorem of mechanics and - indeed - from the requirement that nothing depends on time.
Since Liouville's theorem follows from Hamilton's equations, which in turn rule both classical and quantum mechanics (even if with different variables), the properties of thermodynamic equilibrium are truly universal.
Far from equilibrium, who knows?
For example, nobody knows if such universal properties affect, e.g., steady non-equilibrium states. There are many particular cases of interest, e.g. Kirchhoff's principle for Ohmic conductors and the Korteweg-Helmholtz principle for Stokes fluids.
Above all, if all phenomenological laws are linear, no phenomenological coefficient depends on time, and none is odd in the magnetic field, then the symmetry of microscopic physics under time reversal leads to Onsager's symmetry, which in turn leads to Onsager and Machlup's maximization of entropy production at given thermodynamic forces (= entropy gradients), or minimization of entropy production at given thermodynamic forces and fluxes (= the dual quantities of the thermodynamic forces).
Beyond the domain of validity of Onsager symmetry, there is just the wild, wild west.
Across the prairies of this Far West the true believers' caravans of MaxEnt (maximum entropy production, at the very basis of Gaia's hypothesis) are attacked by the desperadoes of Stochastic Thermodynamics, heedless of the cavalry of Extended Thermodynamics and Information Thermodynamics, while the wise shamans of Rational Thermodynamics and Topological Thermodynamics keep to themselves, well sheltered in the teepees of their own formalism...
Thermodynamics is a macroscopic theory. Statistical mechanics is a microscopic theory, so reversibility may be evoked from the equations of motion. But not for thermodynamics, which needs an arrow of time - this is at the core of the second postulate (S is an increasing function and maximal at equilibrium). So, from a thermodynamic viewpoint, with a change in the constraints of the system it will in the end reach an equilibrium state, but it has to follow a path - and this is not atemporal!
Thermodynamics is a phenomenological theory of time dependent systems.
Reversible "processes" and Equilibrium are included. Please, read the attached file.
I would suggest you have a look at the stochastic thermodynamics formalism, where the first and second laws can be 'derived' directly from microscopic equations of motion. Furthermore, in this formalism entropy production emerges naturally as a consequence of the lack of time-reversal symmetry of a dynamical action (for stochastic processes; in the case of Hamiltonian systems the procedure is different, but many of the main conclusions are the same). I believe that you can answer some of your questions by thinking in this direction.
To get a feeling for the role of the time-reversal symmetry of microscopic dynamics in setting up thermodynamics, I recommend you read the paper below. It is a little bit old (from the beginnings of stochastic thermodynamics) but I think it is very enlightening.
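To get a hands-on feeling for this formalism, below is a minimal sketch (arbitrary parameters, units with k_B·T = 1) of one such exact relation: the integral fluctuation (Jarzynski) identity ⟨exp(−W)⟩ = exp(−ΔF), here with ΔF = 0, for an overdamped Brownian particle in a harmonic trap dragged at constant speed. This is only an illustration of the general idea, not a calculation from the cited article.

```python
# Sketch of an integral fluctuation theorem, <exp(-W)> = 1 (k_B*T = 1,
# DeltaF = 0), for a Brownian particle in a dragged harmonic trap.
import numpy as np

rng = np.random.default_rng(0)
k, v, dt, n_steps, n_traj = 1.0, 1.0, 1e-3, 2000, 20000

x = rng.normal(0.0, np.sqrt(1.0 / k), size=n_traj)  # start in equilibrium
work = np.zeros(n_traj)
for i in range(n_steps):
    t = i * dt
    # work increment from moving the trap: dW = dH/dt * dt = -k*v*(x - v*t)*dt
    work += -k * v * (x - v * t) * dt
    # overdamped Langevin (Euler-Maruyama) step, mobility = 1, k_B*T = 1
    x += -k * (x - v * t) * dt + np.sqrt(2.0 * dt) * rng.normal(size=n_traj)

print("<W>       =", work.mean())           # > 0: dissipation on average
print("<exp(-W)> =", np.exp(-work).mean())  # ~ 1: the fluctuation theorem
```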
Best wishes,
Reinaldo.
Article Fluctuation theorem for stochastic dynamics
Time is ultimately a dynamical property, not a thermodynamical one. The time rate of entropy production in the Universe is unrelated to time itself. Consider the operation of a wristwatch some 10^15 years from now, if the Universe expands forever and does not recollapse. If there is no replenishment of hydrogen (as in the old steady-state model), by 10^15 years from now all stars will have burned out and the time rate of entropy production in the Universe will be minuscule compared to the current value. Yet there is no reason to suspect that a wristwatch will not still operate normally 10^15 years from now. Entropy increases because the Universe was created from nothing, and the entropy of nothing is zero. Equal amounts of positive mass-energy and negative gravitational energy created at the big bang and via inflation represent a large increase in entropy over zero, but still very much less than the maximum possible value --- the maximum possible value itself increasing as the Universe expands.
Thermodynamics describes the evolution of systems in time. Entropy as a property cannot be destroyed, only generated in irreversible processes. An isolated system will have constant (if in equilibrium) or increasing entropy (if irreversible processes occur), but the entropy of a non-isolated system can increase or decrease, depending on the transfer of entropy across the system surface (entropy transfer = heat flux over temperature, in many cases). So, if entropy can go up or down over time, you can't invert the function; that is, time as a function of entropy is generally not possible. You could try in isolated systems, but there you have the problem that as equilibrium is approached, the change of entropy vanishes (it reaches a maximum), that is, you have a singularity of some kind.
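A minimal numerical sketch of that balance, dS/dt = Q̇/T + S_gen with S_gen ≥ 0 (the heat flux below is a made-up oscillating function, purely illustrative), shows S(t) going up and down, so no inverse t(S) exists:

```python
# Entropy balance of a non-isolated system: dS/dt = Qdot(t)/T + S_gen(t).
# With an oscillating (illustrative) heat flux, S(t) is not monotone.
import numpy as np

t = np.linspace(0.0, 10.0, 1001)
T = 300.0                          # boundary temperature in K (assumed)
Qdot = 50.0 * np.cos(2.0 * t)      # heat flux in W (illustrative)
S_gen = 0.02 * np.ones_like(t)     # small constant generation in W/K

dSdt = Qdot / T + S_gen
S = np.concatenate(([0.0],
                    np.cumsum(0.5 * (dSdt[1:] + dSdt[:-1]) * np.diff(t))))
print("S decreases on", 100.0 * (np.diff(S) < 0).mean(), "% of the record,")
print("so t(S) is multivalued for this system.")
```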
Apart from stating that you try to express time through entropy, you could help the discussion by stating WHY you would try this, that is, what is the goal?
Professor Struchtrup, the only way we have to realize that time goes by is by the variation in entropy. Equilibrium systems and isolated systems do not in fact exist, only hypothetically. Actually, to keep the properties of a system constant we would have to provide it with some energy, so that entropy would increase, not vanish.
I am trying to express time through entropy because it seems to me that our perception of time is based on it. It is as if entropy were a more fundamental quantity than time.
Professor Lima, I am afraid I do not grasp the meaning of your statement "the only way we have to realize that time goes by is by the variation in entropy". Time is what we say we measure with clocks, pendulums and the like. The less friction these devices undergo, the better. Galilei's ideal pendulum is a perfectly dissipationless system, with zero 'variation in entropy'. Do you mean Galilei did not measure time?
One could probably add that the passing of time is nicely measured by a pendulum, but the pendulum does not give a direction of time. The ideal pendulum is reversible, i.e., you can't decide whether you see a movie of the pendulum forward or backward.
The direction of time is linked to the growth of entropy in an isolated system, which is what the idea seems to refer to. The non-ideal pendulum loses amplitude in our perception of time, hence you know whether you watch that movie forward or backward.
The growth rate of entropy in such isolated system depends on the momentary state of non-equilibrium, where one would expect the growth rate--which equals the entropy generation rate, of course--to be larger, for a strong non-equilibrium state, and to be quite small for a system close to equilibrium (e.g., a small amplitude pendulum has lower speed, hence lower friction in air, hence less entropy generation).
So even for an isolated system, it seems rather difficult to link entropy (generation) and time in a meaningful way. Lima's last answer even indicates that the system should not be closed?
I still find this too abstract: to do something, anything, along these lines you need to choose a well defined system/process.
I read many discussions in the APS forum about this issue in the past. Long, intriguing and often heated replies... I don't think that a dissipationless (zero entropy production) device like a pendulum is philosophically unable to measure time. We just do not add entropy to the system by using that device. On the other hand, it can measure time in both open and closed systems. In principle, we can think of a pendulum in a system with an exiting entropy flux that diminishes, over a certain time interval, the integral of the entropy of the system. So, does that mean the pendulum measures an inverse arrow of time?
And what about time - does it still exist if we simply stop measuring it?
Really, I feel unable to understand. Entropy is a macroscopic quantity: its never-ending growth follows from the fact that we limit ourselves to a macroscopic description of many-body systems. Such 'unavoidable' growth simply ceases to exist when considering mesoscopic systems like Brownian ratchets - not to mention atoms and quarks. Time is definitely not a macroscopic quantity - or else, all the stuff concerning relativistic particle physics is meaningless. Seriously: what are we speaking about?
Professor Di Vita, I'm sure Galilei measured time, just as I am sure that the "ideal pendulum" (a perfectly dissipationless system) does not exist. Like the ideal gas or the ideal solution, the ideal pendulum is just an idea to ease calculations and interpretations. In real systems, which are dissipative, the entropy variation is not zero. The total entropy always increases in any transformation. I think your conclusions are right, but I don't follow your assumptions. What I am trying to say is that time might be connected to entropy variation by the amount of energy involved in the process.
Professor Struchtrup, I see your point, but I guess the pendulum in fact gives the direction of time. Considering that we must provide some energy to the pendulum in order to keep it moving (perfectly dissipationless systems do not exist), I would say that expending energy corresponds to time moving forward. The opposite (time moving backward), which is obviously "impossible", would correspond to the pendulum voluntarily stopping and an amount of energy being created elsewhere.
Let's imagine, for instance, a capsule full of gas and protected from any contact with radiation or matter. That would be an isolated system. If we took a look inside the capsule we would see molecules moving and colliding with each other and with the capsule wall in a perfectly elastic way. So we can say this system is in equilibrium, but is it reasonable to accept all the assumptions I've made as right? I guess it is not. We need to put some energy into the system to keep its properties constant.
@Benedicto Lima
The problem I see is in the fact that in an isolated system we know from thermodynamics that dS/dt ≥ 0. So, according to your reasoning, if the production of entropy were zero, time would not exist in the system.
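For what it is worth, here is a minimal sketch of the limiting case under debate (illustrative numbers only): a system with zero entropy production whose state nonetheless keeps evolving and can be read as a clock.

```python
# An idealized frictionless pendulum: dS/dt = 0 throughout, yet its state
# evolves and measures time perfectly well. Numbers are illustrative.
import numpy as np

g_over_L = 9.81                                # g/L for a 1 m pendulum, 1/s^2
t = np.linspace(0.0, 10.0, 1001)
theta = 0.1 * np.cos(np.sqrt(g_over_L) * t)    # small-angle swing, rad
period = 2.0 * np.pi / np.sqrt(g_over_L)
print("amplitude:", theta.max(), "rad")
print("period =", round(period, 3), "s; swings completed:", int(t[-1] / period))
```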
Dear Professor Lima, the SI definition of the second is 'the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom'. Do such levels undergo dissipation? If the answer is negative, then there is no connection between time and entropy. You may answer that this definition refers to a caesium atom at rest at a temperature of 0 K, and that nothing actually achieves 0 K. All the same, to the best of our technology (~100 picokelvins), no definition of the 'second' implies entropy-raising systems.
Dear all,
It is a truly interesting and definitely stimulating discussion, which I would greatly appreciate joining right now (hopefully I am not joining much too late, for, to my sincere regret, I had no time to deal with this attentively earlier)...
Prof. Dr. Lima would like to find an equation that defines time as a function of entropy variation. Prof. Dr. Struchtrup wonders what the actual goal of such a project might be. Prof. Lima's response does not seem to satisfy the audience. Indeed, Prof. Dr. Struchtrup suggests that such a project is much too abstract and recommends looking for some realistic systems. On the other hand, Dr. Ing. Leick does not feel any novelty in such a project...
As a result, the discussion is converging to a "stable limit cycle attractor", to describe the story at hand in terms of dynamical systems theory: it keeps rotating around the old conceptual error by Rudolf Clausius, as well as the brilliant finding by Ludwig Boltzmann to shift the disputable notion of entropy into the realm of probability theory...
Sure, of course, if the "Evil Entropy" is stubbornly driving our Universe to a sheer inevitable catastrophe, then let us try to protect ourselves by seeking sanctuary in the powerful, old, good probability theory. Boltzmann's true Genius consists in fetching the TRUE relationship between Probability and Entropy... Still, to our sincere regret, he had no more time to clarify the Probability of WHAT is connected logarithmically to the Entropy Notion...
In the meantime, in parallel to Boltzmann and independently of him, Josiah Willard Gibbs had come even much nearer to clarifying the physical sense of both entropy and probability. Still, all of us, human beings do have finite time upon our Earth to waste... Gibbs was looking for the rigorous maths to describe how the dynamics of the apparently fuzzy system of many Micros might lead to fully crispy Macroscopic results. So, in accordance with this, he started studying Statistically Independent Systems consisting of lots of Microparticles - and managed to publish his seminal results, just before his untimely departure... By treating Statistically Independent Systems, Gibbs had excluded the ubiquitous Correlations due to the Physical Interactions... Of course, Gibbs' treatments might well approximate realistic situations, and we know this nowadays for sure, but they themselves do not provide us with the unique clue as to what ought to be the actual physical sense of the Entropy Notion...
It is just this circumstance that had led the leading theorists of that time, like, e.g., Prof. Dr. Max Planck, to prefer Boltzmann's train of thought to that of Gibbs... Planck had been right to 100%. It is the S = k * ln(W) that might duly describe the universal temperature dependence of entropy... That had been the result by Dr. Georg(e) Augustus Linhart...
We know very well that these were the powerful efforts by Max Planck and his numerous successors that had resulted in the nowadays old and good Quantum Mechanics, plus all the related scientific branches...
And nowadays everybody from play-school kids till dotards know for sure: The square of the absolute value of the wave-function is PROBABILITY!!! The Quantum Mechanics ought to be a self-consistent theory successfully applied in many branches of physics/chemistry/biology... One point is but still remaining largely unresolved: THE PROBABILITY OF WHAT EXACTLY is the square of the absolute value of the wave-function???
To my mind, the above poser is slowly developing to the degree of a rhetorical one, together with the poser of WHAT IS THE ENTROPY AT ALL...
As the leading progressive theorists of all the times and peoples have been working hard on "KILLING THE TIME", the result has come in the form of the brilliant idea by Prof. Dr. I. N. Prigogine to introduce the handy notion of "thermodynamic time", which is nothing more than a mathematical rearrangement of the formula S = k*ln(W) + some audacious philosophy.
Just to revert to the main topic of the discussion I would like to attach here a copy of the unpublished review by Dr. Georg Augustus Linhart concerning the actual interrelationship between the time and entropy...
Entropy is not a physical quantity.
What is the critical error of entropy?
That is: in fact, Q = f(T, V, P)!
(and likewise W = f(T, V, P), E = f(T, V, P)),
so ΔQ/T cannot become dQ/T!
Q = f(T, V, P) is a process quantity which varies with the path: between the same initial and final states it has innumerable forms, and a unique form only for a fixed reversible process path. When the given path is fixed, Q = f(T, V, P) behaves as a system state variable. P, V and T are all variables (two of T, V and P are generally independent) for any reversible process.
Hence, writing
(1/T) dQ = (1/T) df(T, V, P) = dF(T, V, P)
treats P and V as constants, whereas they should be variables!
So ∫_T (1/T) dQ = ∫_T dF(T, V, P) is meaningless, and dQ/T = df(T, V, P)/T is meaningless in itself.
So ΔQ/T cannot become dQ/T, and the so-called entropy does not exist.
So, time has nothing to do with the so-called entropy.
Time might be considered an intensive variable, and entropy might be considered a function of time (George Augustus Linhart).
Dear Evgeny,
I hope you will find time for this. I am a physician who has observed two convenient electrochemical systems for more than two decades, hoping to understand how seizures spread (or not).
Looking at retinas with two and three interacting spiral waves we could see an almost cyclic behavior, but it was looking at this figure (first published in Dahlem, M. and Müller, S. C. (1997) Self-induced splitting of spiral-shaped spreading depression waves in chicken retina. Exp. Brain Res., 115:319-324) that I had an insight: an important part of what we call global coupling agents had to be time, and time was the link between irreversible thermodynamics and excitable media. I tried to argue in this direction in a recent publication in which we used this figure (DOI: 10.4236/ojbiphy.2016.64011).
What the figure shows is the temporal evolution (projected in 2D space) of a particular system state (the tip or turning point of a spiral wave); every patch of tissue undergoes a cycle from quiescent to excited to refractory, in that order (simplifying, of course). We have 4 clusters of evolution from the central to the peripheral retina, and the arrow of time runs from the cluster below to the uppermost. I like your approach of looking at the historical evolution of concepts. Would you care to comment on this?
Dear Vera Maura,
many sincere thanks for your very interesting question!
First and foremost, I would greatly appreciate attentively reading the report you cite, to be capable of expressing some sound opinion. This would take a couple of days... In the meantime I wish you, your family and your colleagues a Happy, Healthy and Prosperous New Year!
Respectfully yours,
Evgeni B. Starikov
Dear Vera Maura,
First and foremost, I would greatly appreciate taking this opportunity to wish you, your family and your colleagues a Happy, Healthy and Prosperous New Year to come!
I have started reading your and your colleagues' very interesting publications. What you are working on is, to my regret, not my direct research field, but perhaps I could still share my thoughts, which might hopefully be useful at least to some extent...
As far as I could see, you and your colleagues are experimentally investigating propagation of electric excitations through the media of biological interest.
I am a theoretician in the field of bio-macromolecular physical chemistry/chemical physics. For about a decade now I have been interested in clarifying the foundations of thermodynamics (I first became interested in this theme as a university student, and have returned to it in the final period of my professional life).
The membrane systems you are dealing with are complicated indeed, not only due to the 'mixture' of lipids and proteins, but also owing to the sheer specificity of the membrane proteins themselves (they are well known to be remarkably unlike the whole entirety of the proteins available upon Earth).
Further, water, which is in your cases the so-called 'water-of-hydration'-part of the membrane complexes under study is still a kind of a 'mysterious liquid', which is not just immediately amenable to the conventional physical-chemical understanding of what is physically-chemically happening in it and in view of its participation...
To sum up, any skillful microscopic model of what happens in your systems in effect is not immediately approachable, to my mind...
Nevertheless, macroscopic (thermodynamic?) models might well be formulated.
My feeling: This ought to be a different thermodynamics, not the conventional "reversible/irreversible/equilibrium/non-equilibrium" cud you might fetch in the conventional handbooks...
With this in mind, I would suggest the following train of thoughts.
You are acting on the excitable media under study with some physical stimulus - for example, a photon of some energy (retina) or some ion (brain, heart muscle) carrying kinetic energy.
This kinetic energy ought to enliven some reaction of the media under study - for example, some electromagnetic wave propagating through the media. The propagation of the latter ought then to be described in accordance with the structural modalities of the media under study.
There will definitely be factors PROMOTING and IMPAIRING this propagation, so here we immediately arrive at a model based upon the Different Thermodynamics.
Indeed, there will be energetic (enthalpic) factors standing for the PROMOTERS and entropic factors standing for the IMPAIRING agents. Mathematically this corresponds to the theory of antagonistic games. Time is naturally included in the story if you formulate your IMPAIRING factors as a Chronodynamic Entropy according to G. A. Linhart. Then it would be throughout possible to formulate a relevant game-theoretical model and try fitting it to your experiments - from that point on, it should be just a technical problem, while the main point would be to perform the correct physical-chemical assignment of the factors (PROMOTING/IMPAIRING agents) mentioned above. Skillful experiment planning ought to suggest the proper combination of experimental methods. Thereafter it will be a matter of skillfully processing the experimental data set (using statistical approaches like factor analysis/multidimensional scaling)...
These are just suggestions in general, whereas I am not sure I can go into the specific details, for this is not my actual research field.
Respectfully yours,
Evgeni B. Starikov (Jewgeni B. Starikow)
Comparison of New and Old Thermodynamics
1. Logic of the Second Law of Thermodynamics: Subjectivism, Logical Jump, Interdisciplinary Argumentation.
2. New thermodynamics pursues universality, two theoretical cornerstones:
2.1 Boltzmann formula: ρ = A·exp(−Mgh/RT) - isotope centrifugal-separation experiments show that it is suitable for gases and liquids (see the numerical sketch after this list).
2.2. Hydrostatic equilibrium: applicable to gases and liquids.
3. The second and third acoustic virial coefficients of R143a derived from the new thermodynamics are in agreement with the experimental results.
3.1. The derived third acoustic virial coefficient agrees with the experimental data, which shows that the theory is still correct when the critical density is reached.
4. See Appendix Pictures and Documents for details.
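As announced in point 2.1 above, a small numerical sketch of that density law applied to two isotopologues (all values below are illustrative; in a centrifuge the gravitational potential g·h is replaced by the centrifugal potential):

```python
# Barometric/Boltzmann density law rho = A*exp(-M*g*h/(R*T)) for two
# isotopologue molar masses; all numbers are illustrative.
import numpy as np

R, g, T = 8.314, 9.81, 300.0            # J/(mol K), m/s^2, K
h = np.linspace(0.0, 10000.0, 5)        # altitude in m

for M in (0.028, 0.029):                # molar masses in kg/mol (assumed)
    ratio = np.exp(-M * g * h / (R * T))
    print(f"M = {M} kg/mol:", np.round(ratio, 3))
# The heavier species thins out faster with height; the same exponential
# law drives isotope separation in a centrifuge.
```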
1. The second law of thermodynamics is incorrect. See the figure below for details.
2. The system is isothermal and exchanges heat with a large heat source to keep the temperature constant. Only volume changes are discussed.
3. The problem here is that the actual system is balanced, while the second law of thermodynamics judges it to be unbalanced.
The ratio of a tetrahedron's edge to the radius of its inscribed sphere is, at any size, always 2√6 ≈ 4.90, which represents a "geometric" thermodynamic equilibrium. As referred to in Einstein's lecture of 1921: "Geometry is a natural science to be regarded as the most ancient branch of physics. As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality."
Interconnection between time and entropy variation, but what is entropy and what is its variation?
The problem with thermodynamics is that 1) descriptive variables like T, S, P, V, but also µ, the chemical potential, and m, the mass, are not physically defined; 2) the thermodynamic system, organized in sub-systems, is not defined at all; and 3) we currently base all our reasoning on the two laws that we try to put into equations, instead of considering these laws as mere consequences of correct definitions of the descriptive variables of natural processes.
With a correct definition of the descriptive variables, at each scale of organization including molecules and atoms, not only are these two laws respected, but we can go further and find, for example, that the celerities of molecules are the same in a fluid phase of the system at equilibrium, that the internal and external pressures of a molecule are equal, etc. Using this systemic point of view, introducing the reality of nested variables of volumes of energy, one can physically and exactly define all the variables mentioned in 1). We showed this in a recent article:
Article Systemic Modelling of Soil Water Thermodynamics under Natura...
So for us there is no direct connection between time and entropy variation.
Herewith you are posing important methodological questions. How should we define the physically descriptive variables to properly understand what we are observing, when these variables do measurably change? ... What is the proper standpoint?
Here I would like to place just a couple of citations - and thoughts they activate.
....(The passages quoted below originally mixed several languages - they are given here in English - for the story has long since become expressly international)...
"A careful study of the thermodynamics of electrical networks has given considerable insight into these problems and also produced a very interesting result: the non-existence of a unique entropy value in a state, which is obtained during an irreversible process .... I would say, I have done away with entropy. The next step might be to let us also do away with temperature",
in: Edward B. Stuart, Benjamin Gal-Or and Alan J. Brainard (Editors): A Critical Review of Thermodynamics, 1970;
This is a quotation from Prof. Dr. Josef Meixner (1908-1994), who studied mathematics and physics at the University of Munich from 1926 to 1931, where in April 1931 he passed the state examination for teaching these subjects at the higher level, and in June 1931 received his doctorate in theoretical physics under Arnold Sommerfeld with a thesis on wave mechanics...
This shows very well how the adepts of wave mechanics have treated thermodynamics... Their concern was to turn the metaphysical theory into the physical one at any price... They did succeed, as we know... But in the process thermodynamics was made the housemaid of this wave mechanics...
Howbeit, that was not a purely German story; it was percolating internationally, and the story has allegedly been put onto a firm philosophical basis:
"The law that entropy always increases—the Second Law of Thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations—then so much the worse for Maxwell's equations. If it is found to be contradicted by observation— well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation".
The above is just a citation from Sir Arthur Stanley Eddington, 'The Nature of the Physical World', Chapter IV, Coincidences (p. 74), 1928...
Thermodynamics had apparently no chance, for its basic law of Energy Conservation and Transformation, which clearly demonstrates the two basic sides of the Energy Notion, Qualitative and Quantitative respectively, has been conceptually bisected, so that we supposedly had - and still have - to do with two basic laws. Practically, such a scheme is like trying to bisect coins with the sole aim of separating their heads from their tails, which does not look like a reasonable activity, at least IMHO...
The result is but the brave report by Josef Meixner about "having done away with entropy" - and his enthusiastic intent to do away with the temperature as well. This is a consequence of Einstein's doing away with the Space and Time by dubbing them RELATIVE - sure, this is a skillful theoretical approach to cope with difficult conceptual obstacles. Using the fully true and seminal formula by Boltzmann-Planck 'S = k* ln (W)' we embody Meixner's dream as well...
...But using such a methodology throughout the scientific research does kill the latter by throwing away its integral notions...
It is of utmost importance to define all the measurable/not measurable variables but EXPLICITLY.
One of such variables being not easily/directly measurable ought to be ENTROPY.
Howbeit, the conventional 'Equilibrium Thermodynamics' coupled to the Statistical Mechanics do not provide us with a clear picture of the latter. Instead, the Boltzmann-Planck's formula is offering the IMPLICIT picture. This renders the 'Equilibrium Thermodynamics' and Statistical Mechanics the slaveys of Quantum Mechanics, which are not standalone but providing the true physical basement to the latter largely metaphysical construct...
At this point, I would suggest to come back to the immortal legacy of N. L. S. Carnot, the sheer misunderstanding of which could lead to 'Equilibrium Thermodynamics' coupled to the Statistical Mechanics...
This is not to blame the latter both, for owing to ingenious international efforts of a number of outstanding colleagues we have gotten a nice, valuable and efficient physical theory of Quantum Mechanics, which is in effect just a handy chapter of the Probability Theory - skillfully covering up nothing more and nothing less than just a voluntarist metaphysics...
What is but truly negative ought to be absolutizing Quantum Methodology, which deprives us of understanding physical sense of important measurable variables, likewise Space, Time, ... and so on, and so forth...
N. L. S. Carnot's standpoint ought to be Energetics - the consistent employment of the Energy notion... N. L. S. Carnot had no time to think over the Entropy Notion - but he could provide us with enough conceptual tools to work EXPLICITLY with it.
Meanwhile, as the Entropy is physically measured by Energy, the actual story is as follows:
1. Any realistic action is physically characterized by a Driving Force (in the ancient times they were dubbing this Vis Viva, i.e., the Livening Force).
2. The physical source of the Vis Viva is known as the Kinetic Energy.
3. The Energy as it is might be characterized Quantitatively (The Total Energy of the Universe is Constant) and Qualitatively (there is a wealth of Energy Forms or Types in the Universe, which might be transformed into each other). To sum up, we arrive at the Basic Law (The First Basic Law of Thermodynamics): The Law of Energy Conservation and Transformation.
4. In the meantime, after reading textbooks, we come across a big number of Thermodynamics Laws (up to four), which is an exaggeration. The First Law mentioned above is in effect also the Last one. Instead, in the textbooks you will find at least the First Basic Law (Energy Conservation Law) and the Second Basic Law (Energy Transformation Law, or Entropy Law).
5. The latter statement is like trying to bisect coins with the sole aim of separating their heads from their tails, which does not look like a reasonable activity, at least IMHO.
6. But let us come back to Physics instead: In the real life any Action has some aim. Achieving the latter is practically known as Performing a Useful Work.
7. In Energy terms we spend Kinetic Energy to perform a Useful Work. In other words, Kinetic Energy is Transformed into Work. As the Total Energy is Constant, the Kinetic Energy, i.e. the Useful Energy may be transformed into the Useful Work - and thus become the Useless Energy. From N. L. S. Carnot we know that Heat is the proper representation of the latter.
8. Now, what is the Kinetic Energy being spent for? It is spent to compensate/equilibrate ubiquitous obstacles/hindrances/resistances. From Isaac Newton we know well that, basically, all the latter 'worries' (i.e. whichever Counteractions) do increase with increasing the intensity of the Action we apply... Howbeit, the really good news does come from Clausius and Lord Kelvin: the Counteractions' growth may not be infinite; they do arrive at their maximum value. Therefore, if we get enough Kinetic Energy to overcome the maximum of the entire Counteractions, or, in other words, to COMPENSATE/EQUILIBRATE the latter, we shall successfully achieve the aim of our Action and thus perform the Useful Work.
9. To sum up, Entropy is the maximum sum of all the ubiquitous obstacles/hindrances/resistances to be overcome during any realistic process.
10. Where we might get the Kinetic Energy from? From the Potential Energy.
11. What is Potential Energy? It is just the Energy of coupling/interaction among the subsystems of a system under study, whatever the nature of the latter and the very rule of how to take it into parts (if, e.g., we adopt the conventional atomistic standpoint, we may define the Potential Energy as the inter-atomic/-molecular coupling energy).
12. Philosophically we might judge the Entropy as a kind of Basic Constraint. It is a Static viewpoint. On the other hand, we might treat Entropy dynamically as a trend to achieve some Equilibrium. This philosophic entirety ought to be the basic physical/chemical/biological sense of this truly key notion.
Dear Prof. Rastovic,
HAPPY NEW YEAR!
It is not about Probability Theory and/or Mathematics of Stochastic Processes - It is about Physics/Chemistry/Biology. Mathematics are only tools, the story here is about the proper conceptual frameworks, instead... Sure, some specific maths are required here as well, this ought to be rather 'Differential Games Theory', instead. There might be place for the controlled stochastic processes - still, somewhere at the initial stages of the research...
Respectfully yours,
Evgeni Starikov
Dear Dr. Evgeni B. Starikov,
What do you think - can my answer be connected with the Maximum Entropy Principle, the Second Law of Thermodynamics and Cumulative Entropy? Sincerely, Danilo Rastovic.
Dear Prof. Rastovic,
your answer is definitely within the conceptual framework of Maximum Entropy Principle, Second Law of Thermodynamics and Cumulative Entropy, as far as I can judge.
Let us speak a bit about these conceptual frameworks.
1. The Second Law of Thermodynamics: Meanwhile, there is only one Basic Law of Thermodynamics.
A. In the meantime, after reading textbooks, we do come across a big number of Thermodynamics Laws (up to four), which is a sheer exaggeration, for the Law of Energy Conservation and Transformation is in fact the Unique one. Instead, in the textbooks you will find at least the First Basic Law (Energy Conservation Law) and the Second Basic Law (Energy Transformation Law, or Entropy Law).
B. The latter statement is like trying to bisect coins with the sole aim of separating their heads from their tails, which does not look like a reasonable activity, at least IMHO.
C. I.e. it is not a constructive stance, to conceptually separate the Second Basic Law from the First Basic Law. This conceptually emasculates the Energy notion - and mystifies the Entropy notion. (Please cf. my answer to the above note by Prof. Braudeau for details).
2. The Maximum Entropy Principle is a 100% correct representation of the Entropy's physical sense: the Entropy (Counteraction) is zero if the Driving Force is zero; the former grows with the latter - and ought to reach its ultimate Maximum. Thus, the purely numerical-mathematical approach based upon this principle is valid, useful and powerful. As a result, the latter fact has to do with the physical sense of the Entropy notion in its EXPLICIT form.
3. Cumulative Entropy is also a valid, useful and powerful, but purely numerical-mathematical approach. Howbeit, it is exploiting the IMPLICIT representation of the Entropy notion in the form of the well-known seminal formula by Boltzmann-Planck 'S = k* ln (W)'. Here W does stand for some Magic Probability of the Lord Almighty knows what scilicet. The latter fact does not destroy the basic validity of the formula in question, for W is in fact a simple and handy algebraic function of the absolute temperature. Hence, it is a truly interesting poser to tackle: What is the physical sense of this W, as it becomes Ψ, i.e., the Wavefunction of Quantum Mechanics?...
Respectfully yours,
Evgeni Starikov
Dear Prof. Starikov
I totally agree with you and all the points you mentioned in the long answer you gave yesterday. Nevertheless, starting from point 7, there is something totally "ignored" by physicists and mathematicians in thermodynamics until now: there are two kinds of kinetic energy: inside molecules (E_int = ½ML²/T²) and outside molecules (E_ext = ½Mv²), where v is the celerity and M the molecular mass. These two kinds of energy lead to two kinds of pressure (E_int/V_int and E_ext/V_ext) and of chemical potential (E_int/M and E_ext/M), which are descriptive variables at the basis of the concept of thermodynamic equilibrium (cf. my recent article mentioned above). Note that Newton's 2nd law plays at this molecular level, where the chemical potential of the fluid phase is µ = ½v² (cf. a preprint article: "hydrostructural pedology ..." on my RG site). The last points, including point 12, can find their correct answer taking account of this new distinction.
Best consideration
Erik Braudeau
Dear Prof. Braudeau,
many sincere thanks for pinpointing this important aspect!
The answer to your queries is known in part, but, otherwise, it is widely unknown.
In fact, the point 12 of my 'long response' ought to be throughout philosophic, and J. W. Gibbs has just started considering the STATIC standpoint, by introducing the notion of the Chemical Potential. But, to our regret, his lifetime was not enough to go along with the details of this...
Meanwhile, ca. 100 years ago in Sweden there were two school teachers; one's name was Nils Engelbrektsson (a true descendant of Engelbrekt - the actual founder of the modern Swedish state), while the other was Karl Franzén. Until very recently both colleagues were widely unknown, whereas I could pick up their seminal work, which is now digitized and obtainable through the Royal Swedish National Library. Its theoretical part was published in Swedish, and its experimental part is in English.
Nils could rigorously mathematically infer the Truly Universal Thermodynamic Equation of State (and described this in Swedish), while Karl could verify Nils' inferences to 100% using all the relevant experimental data available at that time (that was just ca. 100 years ago). Their finding clearly outperforms what is well known and became conventional as the van der Waals Equation of State.
Of immediate relevance to your poser is Nils' inference combining the principles of classification by Carl von Linné and the topology by Gaspard Monge.
Starting from the suggestion that Entropy ought to be a Universal Boundary, Nils is constructing the topological equation for the latter and the subsequent analysis reveals all the proper state variables including the Gibbs Chemical Potential, when we deal with the mixtures...
Nils' results do obey the clear and natural physical-chemical classification...
I have translated Nils' part of the work into English, and submitted the full result, including my own analysis of why we had to wait 100 years to come across such a seminal result...
I have submitted two reviews to the Austrian Monatshefte für Chemie (another one is about N. L. S. Carnot's Energetics, and as to how it could be useful for the modern research activity).
I shall send you the PDF copies of everything mentioned here via your E-mail you indicate on your MS.
Respectfully yours,
Evgeni Starikov
To: Evgeni B. Starikov
You should read the attached paper
Best regards
W,M.
Dear Evgeni B. Starikov I have done a sort of similar thing, but only as an argumentum ad absurdum. I do not think that is a proper definition. Please see section 4 of that paper of mine (the research item linked here has since been deleted).
Dear Alireza,
I am really sorry to disappoint you, but your argument is indeed ad absurdum. The Boltzmann-Planck formula S = k * ln(W) you are trying to deny is nonetheless 100% correct.
Yes, your University lecturers were absolutely right in telling you that neither Boltzmann nor Planck ever inferred this formula analytically. Although this mathematical construct is seminal (it does constitute the mathematical basis for the conventional Statistical and Quantum Mechanics), it is truly difficult to recognize the sheer rationality of this formula without any additional information.
Nonetheless, there is such information. This formula has been thoroughly analyzed in detail: it is 100% formally inferable using Bayesian statistics, leading to the strict conclusion that the 'Magic Probability' W ought to be just a handy algebraic function of the Absolute Temperature (whereas the Mass notion has absolutely nothing to do with this). Dr. Georg(e) Augustus Linhart obtained this important result about 100 years ago and published it in the Journal of the American Chemical Society several times (there are other publications by him on this theme in the ACS journals, but, in general, he was 'suffocated', so that only PREPRINTS by him are available. I have republished them. You might consult them in my 2019 monograph, the MS Word copy of which is available for free at my RESEARCHGATE account, under MY PROJECT. Please consult Chapter IV as to the story and publications by Linhart, although other stories might seem instructive as well).
The resulting formula for the Entropy is S = k·ln(1 + (T/T_ref)^K), where T stands for the Absolute Temperature, k and K are constants having no explicit dependence on the Mass, and T_ref stands for some reference Absolute Temperature, which might also be considered constant.
As to the notion of Time, Linhart did perform the necessary analysis to introduce the notion of time-dependent entropy; cf. my monograph, as well as my paper: Article George Augustus Linhart - As a "Widely Unknown" Thermodynamicist
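A quick numerical look at the quoted formula (the values of k, K and T_ref below are arbitrary placeholders, not Linhart's fitted constants):

```python
# Linhart's entropy S = k*ln(1 + (T/Tref)^K): limiting-behaviour check.
import numpy as np

k, K, Tref = 1.0, 2.0, 100.0            # arbitrary placeholder constants
T = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])
S = k * np.log(1.0 + (T / Tref) ** K)
print(S)
# S -> 0 as T -> 0 (third-law-like), and S ~ k*K*ln(T/Tref) for T >> Tref.
```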
To sum up, by trying to look for some interconnection between Time and Statistics, which is absent in practice but might be established using skillful mathematical prestidigitation, you follow the way of Ilya Prigogine, who was, in fact, a follower of Albert Einstein.
Einstein was trying to find at least some plausible conceptual basis for the purely metaphysical construct of Quantum Mechanics. The actual difficulty was the role of some peculiar atomic trajectories to be considered within the then-popular Kinetic Theory of Gases.
Trajectories ought to be the mathematical functions of the particle's Cartesian coordinates versus time.
The actual huge achievement of Einstein was to declare the 'BAD' trajectories to be RELATIVE so that one might REFUSE to take them into account theoretically if they do not conform with the remainder of the theory at hand (i.e. they are BAD). As a result, Einstein's approach allows us to treat ALL THE TRAJECTORIES THAT ARE AVAILABLE BUT TO TREAT SOME OF THEM IMPLICITLY, just by DECLARING SOME PART OF THEM TO BE RELATIVE, i.e., NOT ACCESSIBLE IN SOME CASES... Such a train of thought ought to be throughout PLAUSIBLE - especially, if you are trying to voluntarily promote some metaphysical insights, likewise, e.g., declaring the ZERO of the Absolute Temperature - which is UNREACHABLE - to be a unique ENERGY RESERVOIR... (This is the actual basement of Quantum Mechanics)...
Albert Einstein has in fact never created any Relativity Theory: instead, Hermann Minkowski had worked such a theory out; he was an outstanding mathematician and theoretical physicist - but died untimely, whereas Einstein could find a truly unexpected (but nonetheless quite seminal) application for such a theory...
A hugely similar story has occurred with the Entropy notion:
Nobody all over the world could properly hear what N. L. S. Carnot would like to convey so that there was no clarity as to the Entropy notion introduced by R. Clausius, who could properly analyze its properties, but had never revealed its actual physical sense.
This is why Boltzmann's idea to TREAT ENTROPY STATISTICALLY was just one of the possible solutions to the problem. Boltzmann's formula could have been tried by Planck, and he could find the formula to be throughout useful. Practically, the sheer usefulness of this formula consists in that we might employ the important Entropy notion but IMPLICITLY, i.e. without ever trying to clarify its actual physical sense...
Remarkably, there were lots of colleagues all around the world who were going a quite different way: CLARIFYING THE ACTUAL PHYSICAL SENSE of the Entropy notion... Interestingly, J. W. Gibbs was among such colleagues, while Linhart was the actual follower of Gibbs' line of thought...
...To sum up, I would suggest: Before using advanced mathematical tools, please look around... Maths are instruments only, which must be used thoughtfully, i.e. first we have to think over, What is the Problem. As soon as the Problem is clear, we might think about how to solve it, i.e., what are the proper maths in this case... A very nice preliminary step would be just to look for the literature describing all the other efforts in the direction chosen...
Respectfully yours,
Evgeni
Dear Evgeni B. Starikov I don't have the slightest idea of the relevance of what you are talking about to the problem and my proposal. You are just making no sense. By the way, I don't understand this particular pretentious tone of yours, trying to be the wise guy who is giving advice to a naive student who has just read some `university lectures'. These are utterly irrelevant; if you cannot refute an argument, appealing to ad hominem is not a logical approach. I cannot see why you are just trying to show off the knowledge with which you seem to be so pleased. Someone who has written something on the foundations of thermodynamics knows all those things.
So to `sum up', I would just like to remind you that ResearchGate is supposed to be a place for true scientific discussions between scientists. It is not one of your classes in which you try to show off your knowledge and advise the students with your eternal academic wisdom! Should the time come when we can talk logically like two scientists, not a professor and a student, I would be ready to consider any serious objection.
Dear Alireza,
I see your point and feel TRULY ASHAMED WITH MY OWN APPARENT POSITION (I AM SERIOUS TO 100%).
Some 20 years ago I felt absolutely as you do - and acted like you.
Thus, this is not the question of my knowledge or even wisdom (??? some living experience maybe ?), this is a question of YOUR ERUDITION.
You are very enthusiastically approaching fundamental problems (here I feel sympathy with you) without having enough information about what you are working with (and here I am disappointed, which explains - at least to some extent - but definitely does not excuse - my harsh tonality in approaching you).
After you have obtained enough information about what you are trying to deal with - and THIS CONSISTS NOT ONLY OF my humble writings, because I am referencing a lot of works - I would be ready to enter fruitful scientific discussions with you.
Respectfully yours,
Evgeni
Benedito,
I have been thinking about time and got to this: time is the emergent perceptual correlate of any chain of sequential events, of which biochemistry has plenty.
see here:
DOI: 10.4236/ojbiphy.2021.112003
below, the abstract
The Spreading Depression Propagation: How Electrochemical Patterns Distort or Create Perception
Vera Maura Fernandes de Lima (1), Alfredo Pereira Junior (2), Guilherme Lima de Oliveira (3)
(1) Centro de Biotecnologia CNEN/IPEN-SP, São Paulo, Brazil. (2) Department of Human and Nutritional Sciences, Biosciences Inst., UNESP, Botucatu, São Paulo, Brazil. (3) São Paulo, São Paulo, Brazil.
Abstract
At the transition from quiescence to propagating waves recorded in isolated retinas, a circular electric current closes in the extracellular matrix; this circular current creates a magnetic torus flow that, when entering quiescent tissue in front of the wave, recruits elements and, when leaving behind, helps to build the absolute refractory state. The waving magnetic torus is the consequence of the vortex effect and explains the energy boost that drives propagation. Methods: We interpret experimental results from intrinsic and extrinsic fluorescence dyes (voltage-, calcium- and pH-sensitive), optical signals from isolated retinas, and time-series recordings using ion exchange resins (Ca, K, pH, Na, Cl) recorded extracellularly at retinas, cerebellums and cortices coupled to spreading depression waves. Finally, we checked the ECoG activity, also a time series, at the transition from after-discharges to spreading depression in rat hippocampus. Results: The integrated assessment of the diversified measurements led to the realization that the magnetic flow at the wavefront is a major contributor to the wave propagation mechanisms. This flow couples mass and charge flows as a swirling torus from excited to quiescent tissue. Conclusions: An alternative model of the brain is possible, apart from the classical HH and molecular biology model. The physical chemistry of charged gels and their flows explains the results. The conceptual framework uses far-from-equilibrium thermodynamics.
Dear Benedicto Lima,
Greetings! The Second Law of Thermodynamics - or Entropy as the Arrow of Time!
Regards, Saeed
The second law of thermodynamics contradicts itself and competes with the first law of thermodynamics (the calculation method of Carnot's efficiency). See below for details.
1. General philosophy principle: internal cause (thermophysical property of working fluid) determines external performance (Carnot efficiency).
2. The second law of thermodynamics holds that Carnot efficiency has nothing to do with the thermophysical properties of working medium.
3. The second law of thermodynamics violates this general principle. And it is self-contradictory.
4. Please see the picture and link:
https://www.researchgate.net/publication/352708795_The_contradiction_of_the_second_law_of_thermodynamics
For the last couple of weeks I have been following the discussion in another RG discussion group called "Tackling a Century Mystery: Entropy", with the corresponding theme:
"Why are we still unable to explain the difficulties caused by a physical concept even after more than 150 years of hard work?"
As many will know, the second law of thermodynamics is to a large extent grounded in the Clausius entropy and the Clausius theorem. Another grounding of the second law of thermodynamics is Carnot's theorem.
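For reference, compact textbook statements of those two groundings (added here for the reader's convenience, not quoted from that discussion):

```latex
% Clausius theorem: for any cyclic process exchanging heat \delta Q with
% surroundings at temperature T,
\oint \frac{\delta Q}{T} \le 0,
% with equality for a reversible cycle; this is what allows the Clausius
% entropy to be defined through dS = \delta Q_{\mathrm{rev}} / T.
% Carnot's theorem: no engine between reservoirs at T_h > T_c exceeds
\eta_{\max} = 1 - \frac{T_c}{T_h}.
```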
Hereby I inform the followers here that interesting alternative views on Clausius entropy, with some quotes, have been posted in that other discussion group over the past two weeks.
Discussion group: "Tackling a Century Mystery: Entropy"
Source: https://www.researchgate.net/post/Tackling_a_Century_Mystery_Entropy