In the introduction to his text, A Student's Guide to Entropy, Don Lemons quotes the quip "No one really knows what entropy is, so in a debate you will always have the advantage" and writes that entropy quantifies "the irreversibility of a thermodynamic process." Bimalendu Roy, in his text Fundamentals of Classical and Statistical Mechanics (2002), writes "The concept of entropy is, so to say, abstract and rather philosophical" (p. 29). In Feynman's lectures (ch. 44-6): "Actually, S is the letter usually used for entropy, and it is numerically equal to the heat which we have called Q_S delivered to a 1∘-reservoir (entropy is not itself a heat, it is heat divided by a temperature, hence it is measured in joules per degree)." In thermodynamics there is the Clausius definition, the ratio of a quantity of heat Q to a temperature in kelvin, Q/T, and the Boltzmann approach, k log(n). Shannon analogized information content to entropy; taking 2 as the base of the logarithm gives information content in bits. Eddington, in The Nature of the Physical World (p. 80), wrote: "So far as physics is concerned time's arrow is a property of entropy alone." Thomas Gold, physicist and cosmologist, suggested that entropy manifests or relates to the expansion of the universe. There are reasons to suspect that entropy and the concept of degrees of freedom are closely related. How best do we understand entropy?
Hello Robert Shour
This is a very relevant question.
First, you need to specify in which "domain" you speak of entropy.
I included a specific explanation in my following article:
https://www.researchgate.net/publication/327233806_Entropic_description_of_gravity_by_thermodynamics_relativistic_fluids_and_the_information_theory
I quote, pages 3 to 5:
"There are various forms equational of entropy, we will see now. The first is the entropy used below, the Boltzmann entropy [6], which is written:
S = k_B ln(Ω)
This equation defines the microcanonical entropy of a physical system at macroscopic equilibrium, left free to evolve on the microscopic scale among Ω different microstates (Ω is also called the number of complexions, or the number of system configurations). The unit is the joule per kelvin (J/K).
Entropy is the key point of the second law of thermodynamics, which states: "Any transformation of a thermodynamic system is performed with an increase of the overall entropy, including the entropy of the system and of the external environment. We then say that there is creation of entropy."; "The entropy of an isolated system can only increase or remain constant."
There is also the Shannon formula [7]. The Shannon entropy, due to Claude Shannon, is a mathematical function that corresponds to the amount of information contained in or delivered by a source of information. The more redundant the source, the less information it contains. Entropy is maximal for a source whose symbols are all equally likely. The Shannon entropy can be seen as measuring the amount of uncertainty of a random event, or more precisely of its distribution. Generally, the logarithm is taken in base 2 (binary). Its formula is:
H = -Σ_i p_i log2(p_i)
However, one can also define an entropy in quantum theory [9], particularly used in quantum cryptography (with the properties of entanglement), called the von Neumann entropy, denoted:
S = -Tr(ρ ln ρ)
with the density matrix written in an orthonormal basis:
ρ = Σ_i p_i |φ_i⟩⟨φ_i|
The von Neumann entropy is identical to that of Shannon, except that it uses the variable ρ, a density matrix. As Serge Haroche has written, this equation can be used to calculate the degree of entanglement of two particles: if two particles are not entangled, the entropy is zero; conversely, if the entanglement between two particles is maximal, the entropy is maximal, given that we do not have access to the subsystem. In classical mechanics, zero entropy means that events are certain (only one possibility), while in quantum mechanics it means that the density matrix is a pure state |φ⟩. But in quantum physics measurements are generally unpredictable, because the probability distribution depends on the wave function and on the observable.
This is also reflected in the Heisenberg uncertainty principle: if, for example, we have more information (so less entropy) about the momentum of a particle, there is less information about its position (more entropy). This implies that quantum physics is always immersed in entropy, even when the entropy is low.
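To illustrate the entanglement statement numerically, here is a minimal Python sketch (my own illustration, not from the quoted paper; it assumes NumPy and log base 2, so the result is in qubits): it computes the von Neumann entropy of a single-qubit reduced density matrix, which is 0 for a product state and 1 for a maximally entangled (Bell) pair.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]      # convention: 0 * log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

# Reduced state of one qubit from a product state |0>|0>: pure, S = 0
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(rho_pure))    # -> 0.0

# Reduced state of one qubit from a Bell state: maximally mixed, S = 1
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))   # -> 1.0
```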
Now that we know the Boltzmann entropy and the Shannon entropy, we can merge the two, giving the Boltzmann-Shannon entropy, or statistical entropy [8]. If we consider a thermodynamic system that can be in several microscopic states i with probabilities p_i, the statistical entropy is then:
S = -k_B Σ_i p_i ln(p_i)
Or, in the Boltzmann-von Neumann form, equivalent to the above equation:
S = -k_B Tr(ρ ln ρ)
This function is paramount, and it will be used constantly in our theory of gravitational entropy. Its units are the bit and the joule per kelvin. Let us note some properties of this function. We know that the entropy is maximal when the numbers of molecules in each compartment are equal. Entropy is minimal if all the molecules are in one compartment; it is then 0, since the number of microscopic states is 1.
From the perspective of information theory, the thermodynamic system behaves like a source that does not send any message. Thus entropy measures "the missing information" at the receiver (or the uncertainty about the complete information).
If the entropy is maximal (the numbers of molecules in each compartment are equal), the missing information is maximal. If the entropy is minimal (all the molecules are in the same compartment), the missing information is zero.
In the end, the Shannon entropy and the Boltzmann entropy are the same concept."
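To make the two-compartment example in the quote concrete, here is a minimal Python sketch (my own illustration, not from the quoted paper): it computes the Shannon entropy, in bits, of the distribution of molecules over two compartments, showing the maximum at an equal split and zero when all molecules share one compartment.

```python
from math import log2

def shannon_entropy_bits(counts):
    """H = -sum p_i log2 p_i over compartment occupation fractions."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]   # convention: 0 log 0 = 0
    return -sum(p * log2(p) for p in probs)

print(shannon_entropy_bits([50, 50]))   # equal split: 1.0 bit (maximal)
print(shannon_entropy_bits([90, 10]))   # uneven split: ~0.47 bits
print(shannon_entropy_bits([100, 0]))   # one compartment: 0.0 bits (minimal)
```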
In conclusion, entropy is a measure of uncertainty:
- in information theory -> uncertainty in bits
- in quantum physics (von Neumann) -> uncertainty in qubits
- in thermodynamics -> uncertainty about the contents of a thermodynamic system
- in statistical physics -> bit uncertainty about the contents of a thermodynamic system
There is another form of entropy, the entropy of plane curves, proposed by Michel Mendès France. But I will let you look into that one ;)
A very good question. Entropy can be understood as a measure of a system, which can be extremely complicated, that describes the level of disorganization of the system without actually measuring all of its parts.
We can say that entropy gives us a fingerprint of the actual state of the system. The problem is that many different states of the system give the same fingerprint. Still, in many cases the entropy measure is sufficient to describe complicated systems.
Complex systems, especially those observed in biological systems and medicine, are perfect candidates for employing entropy measures of their states.
Why am I telling you this? Biological signals observed in medicine can be studied using entropy. Entropy opens completely new areas to quantification and hence to mathematical description. I use entropy in my own research, and it is just incredible how much can be accomplished with it. :-)
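As a minimal illustration of that idea (my own sketch, not the poster's method), one can estimate the Shannon entropy of a recorded signal by discretizing its amplitudes into bins; more irregular signals give higher entropy:

```python
import math
import random
from collections import Counter

def signal_entropy_bits(samples, n_bins=16):
    """Histogram-based Shannon entropy estimate of a 1-D signal."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins or 1.0          # guard against a flat signal
    bins = Counter(min(int((x - lo) / width), n_bins - 1) for x in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

regular = [math.sin(0.1 * t) for t in range(1000)]    # smooth, structured
noisy = [random.uniform(-1, 1) for _ in range(1000)]  # irregular
print(signal_entropy_bits(regular))  # lower entropy
print(signal_entropy_bits(noisy))    # higher entropy, near log2(16) = 4
```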
Dr. Yushan Jiang,
Thank you for your reply. May I ask, how are physical entropy and mathematical entropy different?
Thank you.
In my opinion entropy is a statistical variable that can be related to the physical quantities measured on a physical system. It is not a physical quantity, as it has no unit of measure (if we do not consider the constant k); it can be related to physical quantities, as every variable can be related to other variables once the dependence is verified.
The great advantage of this variable, coming from the genius of Boltzmann, is that it is really simple to obtain its probability distribution.
Other statistical variables might exist through which to establish other time behaviours of physical systems... we need another Boltzmann.
This algorithm can justify a part of what I'm saying. Let's try it... it is only HTML, and it consists of a system of 64 oscillators that exchange energy in a random way; we start the calculation with different initial energies and determine the maximum of entropy. We are going to determine the relation between entropy and temperature. (A Python reconstruction is sketched below the form.)
[Interactive HTML form, translated from Italian: "enter the temperature (0..4)"; outputs the number of exchanges, the entropy, the number of states, and the occupation numbers n(0) through n(20); a "Boltzmann distribution" section reports the maximum entropy, the maximum number of states, and the corresponding occupation numbers n(0) through n(20).]
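For readers who cannot run the original HTML page, here is a minimal Python sketch of the experiment described above (my own reconstruction under stated assumptions: unit energy quanta, random pairwise exchanges, and the entropy taken as the Shannon entropy of the occupation histogram):

```python
import math
import random

def simulate(n_osc=64, quanta_per_osc=1, n_swaps=100_000):
    """64 oscillators exchange unit energy quanta at random; return the
    occupation histogram n(E) and the entropy of that distribution."""
    energies = [quanta_per_osc] * n_osc          # equal initial energies
    for _ in range(n_swaps):
        i, j = random.randrange(n_osc), random.randrange(n_osc)
        if energies[i] > 0:                      # move one quantum i -> j
            energies[i] -= 1
            energies[j] += 1
    hist = {}                                    # n(E): oscillators with energy E
    for e in energies:
        hist[e] = hist.get(e, 0) + 1
    # Shannon entropy (in nats) of the occupation distribution; at
    # equilibrium n(E) approaches the exponential Boltzmann distribution
    S = -sum((c / n_osc) * math.log(c / n_osc) for c in hist.values())
    return hist, S

hist, S = simulate()
print("n(E):", dict(sorted(hist.items())))
print("entropy:", round(S, 3))
```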
I've found that the best way to unravel the mess that we now have is to go back and follow the historical record. Initially, it was difficult to understand the relationships that existed between the work done and the input of fuel, at the beginning of the industrial revolution. At first it was just thermodynamics. With the recognition of atomism, and through his own developments, Boltzmann made more formalized statements directed toward the statistics of large ensembles of atoms. Gibbs also kept things in order, owing to his influence in chemistry. Disorder and its relationship to entropy came later; it wasn't there from the beginning. Once the concept of disorder was accepted alongside entropy, it made sense that if an ordered ensemble were to be slightly disordered, entropy would be a good way to describe its loss of order. From there we entered the information age, and the concept of entropy expanded to accommodate it. In a more general form entropy became very useful for dealing with computer errors and corruption of messages, while never becoming separated from its roots in thermodynamics. It is also very useful in many-body mechanics and quantum field theory.
Dann Passoja
Classical thermodynamics defines entropy as a system property. It manifests itself when the system communicates with its surroundings in terms of heat and work. The most notable impact of entropy in this context is what sets heat and work apart as different forms of energy: heat is riddled with entropy while work is entropy-free!
Hi Khalid
work is not entropy-free if (and this is our case) it changes the internal energy of the system. We can define a statistical entropy that connects work with the probability of an atom occupying a place in space, but symmetry reasons tell us that this entropy is not connected with the atom's energy. Is this what you mean by the phrase "work is entropy free"?
Hello I. Borsini
From the classical thermodynamics point of view, it is possible to make a distinction between heat and work as macroscopic forms of transient energy. Heat is referred to as a degraded form of energy, while work is a superior form of energy. Entropy is the one macroscopic system property that provides the necessary condition for this distinction. At the microscopic level, the distinction between heat and work becomes blurred.
As a physicist I suspect that this aspect is influenced by our point of view, and in every case a macroscopic work is related to a change of entropy of the system.
Yes, Borsini, you are right: it is point of view, or perspective, that lets entropy change color like a chameleon. There might be a system that performs a process whose sole effects are an increase in system entropy while some work is done on the surroundings. From the classical (macroscopic) thermodynamics point of view, the work that the system does on the surroundings is the amount of energy that has had its entropy stripped away and left behind in the system (work is entropy-free); the system will then "see" its energy content continue to lose capacity to do work (entropy buildup) and ultimately cease to do work. Whether it is possible to contrive such a system performing this sort of process, and if so how much the work output will be, is thoroughly dictated by the second law of thermodynamics, which at its core encapsulates the mystery mankind has come to know as entropy.
Hello, Roman, the following excerpt from your comment needs some further detail please.
" In thermodynamics -> Uncertainty of the contents of a thermodynamic system"
In particular, "the contents of a thermodynamic system" is somewhat unclear, I reckon.
The so-called "entropy " doesn't exist at all.
During the process of deriving the so-called entropy, in fact, ΔQ/T can not be turned into dQ/T. That is, the so-called "entropy " doesn't exist at all.
The so-called entropy was such a concept that was derived by mistake in history.
It is well known that calculus has a definition;
any theory should follow the same principles of calculus. Thermodynamics, of course, is no exception, for there is no other calculus; this is common sense.
Based on the definition of calculus, we know:
for the definite integral ∫_T f(T) dQ, only when Q = F(T) is ∫_T f(T) dQ = ∫_T f(T) dF(T) meaningful.
As long as Q is not a single-valued function of T, namely Q = F(T, X, ...), then
∫_T f(T) dQ = ∫_T f(T) dF(T, X, ...) is meaningless.
1) On the one hand, we all know that Q is not a single-valued function of T; this alone is enough to determine that the definite integral ∫_T f(T) dQ = ∫_T (1/T) dQ is meaningless.
2) On the other hand, in fact Q = f(P, V, T), so
∫_T (1/T) dQ = ∫_T (1/T) df(T, V, P) = ∫_T dF(T, V, P) is certainly meaningless (here T is the subscript of the integral sign).
We know that dQ/T is used for the definite integral ∫_T (1/T) dQ; since ∫_T (1/T) dQ is meaningless, ΔQ/T cannot be turned into dQ/T at all.
That is, the so-called "entropy" doesn't exist at all.
In fact, the so-called "entropy" was "deduced" by mistake in history.
There is NO such physical quantity as "entropy" at all.
Why did the wrong "entropy" appear?
In summary, this was due to the following two reasons:
1) Physically, people didn't know that Q = f(P, V, T).
2) Mathematically, people didn't know that AΔB couldn't become AdB directly.
If people had known either of them, the mistake of entropy would not have happened.
All the theories and formulas that MUST use "thermodynamic entropy" or "statistical physics entropy" are WRONG.
Please read my paper and the answers to the questions related to my paper in my Projects.
Here are some new ideas that people didn't know before:
Heat Q (and work W) is both a system process variable and a state variable in the system process. This is a NEW class of variables; we can call this class of process-related state variables Process State Variables (or Special State Variables). The loop integral of a Process State Variable is not 0.
In contrast, we can call the familiar state variables, such as internal energy, Ordinary State Variables; the loop integral of an Ordinary State Variable is always zero.
When the system goes from initial state 1 through a reversible loop to final state 2, the initial state 1 and the final state 2 coincide; for the system and the Ordinary State Variables (e.g., internal energy), the initial state 1 and the final state 2 are one and the same state.
However, for the Process State Variables (such as heat Q and work W), the initial state 1 and the final state 2 are not the same state but two different states, and the change in the value of a Process State Variable is in general obtained by piecewise integration.
Hi Shufeng Zhang
I do not like common sense, because it contributes to our alienation. Can I use the integral of dE/T to define entropy? I did it, and I sent you an algorithm that, in Bose-Einstein statistics, correlates statistical entropy with the system's energy change. Is that enough to prove that entropy exists?
Dear Robert Shour
Great question! I have been wrestling with it for years, and will probably continue until the end.
Here is a good article on explaining the concept of entropy in chemical systems to students (Article Give Them Money: The Boltzmann Game, a Classroom or Laborato...
). The way I currently think about entropy in a system of matter and energy is as follows:
Let's look at a simple example. Our system consists of two six-sided dice. We define the system's macrostate as the sum of the two numbers showing after the dice are rolled. Each macrostate is realized by one or more microstates, the ordered pairs of individual die faces: the sum 2 has only one microstate, (1,1), while the sum 7 has six, so 7 is the most probable macrostate and the one of highest entropy.
Given this explanation of entropy, we can state the Second Law as follows:
Spontaneous change will always move an isolated system from a less likely (macro)state to a more likely (macro)state.
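A minimal Python sketch of this dice example (my own illustration, following the same counting idea): it enumerates the microstates behind each macrostate and prints the corresponding Boltzmann entropy in units of k_B.

```python
from itertools import product
from collections import Counter
from math import log

# Macrostate = sum of the two dice; microstates = ordered outcomes.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
for total, omega in sorted(counts.items()):
    # Boltzmann entropy in units of k_B: S = ln(Omega)
    print(f"sum={total:2d}  Omega={omega}  S/k_B={log(omega):.3f}")
```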
All the best
Johan
Dear Robert Shour,
From my understanding, entropy aims to justify the "natural direction" taken by energy flows during (irreversible) processes. In particular, it explains why heat is transferred from a hot body to a cold one and not the opposite. It also explains why we cannot entirely convert thermal energy into useful work. The notion of exergy is also interesting to help better understand and explain the concept of entropy and to analyze energy systems.
Exergy is a thermodynamic state function which refers to the maximum theoretical portion of energy that can be transformed in a reversible manner into useful work, with respect to a reference environment. In other words, it assesses both energy quantity and quality at the same time by comparing energy quantities on a common scale (i.e., useful work), and it can be written as the product of an energy quantity times an appropriate quality factor. For instance, electrical work is entirely useful and its quality factor is one. For thermal energy, the Carnot efficiency is the quality factor. Therefore, converting work into heat (e.g., in electric baseboards) degrades the high quality of electricity into low-quality energy, which is characterized by high entropy creation.
That being said, specific entropy is a state variable. Thus it characterizes the state of a given fluid, for instance, and does not depend on the process. As an example, the specific entropy of a given fluid is lower when its temperature is low and increases when its temperature becomes higher. At lower temperature, fluid molecules are more organized and thus more able to perform useful work: exergy is higher and entropy is lower. This can be related to Boltzmann's entropy formula: if molecules are more organized, they cannot easily move from one state to another, which limits the number of possible microstates, and thus entropy is lower. It can also be related to the Clausius inequality, which indicates the direction of heat transfer and in which entropy variation is expressed as a function of heat Q and temperature T. For an adiabatic heat exchanger, heat can only be transferred from the hot to the cold side in order to satisfy the inequality; the higher the temperature difference, the higher the amount of transferred energy, but also the higher the degradation of energy quality and the higher the entropy creation. In order to reduce entropy creation, the temperature difference should be reduced, which means a larger heat exchanger surface area, and then money comes into the equation...
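A minimal Python sketch of the quality-factor idea described above (my own illustration; the reference temperature T0 = 293.15 K is an assumption): the exergy of a quantity of heat is the heat times its Carnot quality factor.

```python
def heat_exergy(Q, T, T0=293.15):
    """Maximum work [J] extractable from heat Q [J] available at
    temperature T [K], relative to an environment at T0 [K]:
    exergy = Q * (1 - T0/T), the Carnot quality factor."""
    return Q * (1.0 - T0 / T)

# 1 kJ of heat at 600 K carries far more exergy than 1 kJ at 320 K.
print(heat_exergy(1000.0, 600.0))  # ~511 J of potential useful work
print(heat_exergy(1000.0, 320.0))  # ~84 J: low-grade heat, little exergy
```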
Concepts of entropy and exergy are well explained in the book "Thermodynamics: An Engineering Approach" (Cengel & Boles).
Hope this helps,
Best regards,
Etienne Saloux
Good evening everyone,
I will try to give the clearest answer I can conceive of; this is addressed at length in my work.
Source "Entropic Gravity and the Space-Time Fluid" (slide 30 to 38)
Research Gate : Preprint Entropic Gravity and the Space-Time Fluid
My website : http://entropyfluid.e-monsite.com/pages/my-scientific-articles-mes-articles-scientifiques/entropic-gravity-and-the-space-time-fluid.html
I. WHAT ARE THE DIFFERENT FORMS OF ENTROPY?
There are 5 equations of entropy which are:
Shannon's entropy: used in information theory. Entropy is maximal when the events are equiprobable (p = 1/n). Useful for knowing the uncertainty in the transmission of information over a channel; useful in classical cryptography, for example.
INFO: Metric entropy (or Kolmogorov entropy) is an entropy inspired by Shannon's equation. It can be used to show that two dynamical systems are not conjugate. It is a fundamental invariant of measured dynamical systems. It allows a qualitative definition of chaos: a chaotic transformation can be seen as a transformation with non-zero entropy.
Clausius entropy: measures the degree of disorder in a thermodynamic system as a function of the amount of heat and the temperature. Useful for noting whether a thermodynamic transformation is reversible or irreversible. Widely used in industry, in machines that use heat as the main source of energy.
Boltzmann's entropy: same as Clausius, except that it uses statistical mathematics. More precisely, it measures the number of possible microstates (Ω) as a function of temperature. Besides telling whether a transformation is reversible or irreversible, it tells us that an isolated system reaches equilibrium when its entropy becomes maximal.
Von Neumann entropy: same utility as Shannon's, but used in quantum physics. It is a function of the probability p or, formulated differently, of the quantum density matrix. It measures uncertainty. Useful for determining the degree of correlation between two entangled particles, or for knowing the uncertainty in receiving information from polarized photons. Widely used in quantum cryptography and in the development of quantum computers.
Statistical entropy (or Gibbs entropy): a blend of Boltzmann's and Shannon's entropies. Boltzmann's entropy is a measure of disorder; Shannon's entropy a measure of uncertainty. From here we can "unify" the Boltzmann equation and Shannon's equation into one equation. To put it simply, it is enough to multiply Shannon's entropy by Boltzmann's constant to get the statistical entropy. When the entropy is maximal, we recover the Boltzmann equation (all states equally possible, p = 1/Ω; see the one-line check below). Very much used in statistical physics.
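For completeness, the reduction mentioned in the last item (equiprobable states, p_i = 1/Ω, recover the Boltzmann form from the Gibbs form) is a one-line check:

```latex
S \;=\; -k_B \sum_{i=1}^{\Omega} p_i \ln p_i
  \;=\; -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln \frac{1}{\Omega}
  \;=\; k_B \ln \Omega .
```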
II). UNDERSTANDING AND INTERPRETING ENTROPY
We can distinguish two forms of information. The first is "frequentist information": this is the well-known Shannon theory.
According to it, the more choices there are, the greater the information. To be more precise, I quote: "The more messages there are available between which to choose, the greater the uncertainty for the recipient, and the more information is gained from the message. There is more information in a highly improbable message than in a message whose content is entirely predictable."
In Shannon's interpretation, entropy, which refers to uncertainty (hence to small probabilities), is synonymous with information.
The second is "epistemological information", which is a measure of the degree of knowledge.
I quote the article again, in my translation: "The association of information and entropy in Brillouin rests on an epistemic interpretation of entropy, understood as the expression of a lack of knowledge about the structure of the system. According to this interpretation, the fact that the entropy of a system is high expresses the fact that knowledge of its real, microscopic state presents important gaps, since the number of possible microscopic states compatible with the observed macroscopic state is high. A high entropy of a system thus measures the lack of information about it; this lack of information implies the possibility of a wide variety of distinct microscopic structures which are, in practice, indistinguishable from each other."
Epistemological information is born from Boltzmann's work on entropy, which uses the logarithm in its calculation but admits an epistemological interpretation of entropy. Brillouin is one of those who think that entropy is the opposite of information, because entropy is an uncertainty (a lack of knowledge), while information is a degree of knowledge.
Here is the first paragraph of the conclusion [21], which I translate here: "It therefore appears at the end of this development that the use of probabilities in the mathematical definition of information gives rise to two opposite readings of what information can mean in reality. The amount of information can be interpreted as a measure of the degree of order (a reading that will be retained in algorithmic theories, which deepen the relationship between information and complexity) or as a measure of the degree of knowledge. This duality between the informed and the informative feeds on the opposition between the epistemic and frequentist interpretations of the probabilities mobilized by the definition of the quantity of information. This duality appears acutely in the question of the physical status of information, related to the thought experiment of Maxwell's demon."
To "unify" the theory of information to physical theories, it is necessary to decide what is the correct interpretation of information and entropy, at the risk of finding "differences" between thermodynamics or physics in general and the theory of information.
So the good question to ask yourself is this : Is it frequentist or epistemological interpretation that is right ?
III). FINALLY: THE CORRECT INTERPRETATION OF ENTROPY
The interpretation of the statistical entropy (or Gibbs entropy) will allow us to decide which interpretation is correct. Indeed, the statistical entropy is an equation that contains both the entropy of Boltzmann and the entropy of Shannon (and obviously the entropy of von Neumann, since we have seen previously that it is equivalent to that of Shannon).
Imagine two compartments (or rooms, as you prefer) next to each other in which particles are present. Let's call the first compartment (or room) A, and the other B.
If all the particles are in A or all the particles are in B, mathematically we note that the entropy is minimal and the uncertainty about the information is zero.
If, on the other hand, the particles are distributed equally between A and B (as many particles in A as in B), the entropy is maximal (great disorder), so the uncertainty about the information is maximal.
With this interpretation, we note that the more dispersed the particles (probabilities less than 1), the higher the entropy. This is the case for our particles separated equally between A and B (p = 1/2, S = 2 × (1/2) × log2 2 = 1 bit). In the opposite case (all the particles in A or in B), the information is certain: the probability is one, so the entropy is zero.
So the epistemological interpretation is the correct one.
Hello Robert and everybody,
In my opinion, the best way to interpret the notion of entropy is to follow E. T. Jaynes who, in 1965, wrote: "we see that entropy is an anthropomorphic concept, not only in the well-known statistical sense that it measures the extent of human ignorance as to the microstate. Even at the purely phenomenological level, entropy is an anthropomorphic concept." (in "Gibbs vs Boltzmann Entropies", E. T. Jaynes, American Journal of Physics 33, 391 (1965); doi: 10.1119/1.1971557)
Clearly, this means that entropy is not a property of the system but a property of the model of perception used to analyze it.
Hoping to have contributed to the discussion,
Best regards,
Marc
A logical error of the second law of thermodynamics:
1. There are many kinds of perpetual motion machines of the second kind. Behind each type is a natural phenomenon.
Machine A: against the irreversibility of thermodynamics (diffusion, heat conduction, friction, etc.) - dynamics;
Machine B: utilizing the difference in Carnot efficiency (reversible thermodynamics) - thermodynamics.
2. A and B belong to different disciplines. There is a parallel relationship between them, and no logical mutual inevitability.
The logic of the second law: empirical induction, denial of machine A ==> machine B cannot be manufactured ==> Carnot efficiency 1 - T1/T2 for all materials.
3. The logic of the second law of thermodynamics violates the physical logic that A and B cannot be inferred from each other.
4. The second law of thermodynamics elevates "irreversibility", but irreversibility is only a kinetic experience.
5. See the annex for details.
Good morning Bo Miao,
I would like to know your opinion about the perpetual motion of a Brownian particle freely floating in an isolated system.
Thank you for your answer.
A charge undergoes Brownian motion in the container. The electric field in space varies, so there is an induced current in a nearby conductor. The temperature of the conductor increases, while the temperature of container A decreases. This is inconsistent with the second law of thermodynamics. Is the second law of thermodynamics correct? Judge for yourself.
See pictures for details.
Entropy is not a state variable, and there is a physical difference between it and internal energy.
In textbooks, the ideal gas entropy is: S = n·Cv·ln(T) + n·R·ln(V).
There are two problems:
1. The natural logarithm ln(x) swallows the units of volume and temperature, [m³] and [K].
2. The unit of temperature may be [K] or [mK]; the unit of volume may be [m³] or [cm³]. The results differ, and the calculation lacks stability.
Entropy is only a process quantity: ΔS = n·Cv·ln(T2/T1) + n·R·ln(V2/V1); this has physical significance (see the sketch at the end of this post).
Internal energy is a state quantity, and entropy is only a process quantity. There are physical differences between them.
Internal energy U is conserved and can have the meaning of a total differential: dU = U_T·dT + U_V·dV - correct.
Entropy is only a process quantity, not a state quantity; its total differential is meaningless: dS = S_T·dT + S_V·dV - incorrect.
Entropy is not conserved; even in reversible cycles it can increase or decrease.
Thermodynamics has some problems in describing ideal gases.
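A minimal Python sketch of the ratio form quoted above (my own illustration; Cv = 20.8 J/(mol·K), roughly 5R/2 for a diatomic gas, is an assumed value): because only the ratios T2/T1 and V2/V1 enter, the computed ΔS is the same in any consistent choice of units, which is the stability the poster asks for.

```python
from math import log

R = 8.314  # gas constant, J/(mol K)

def delta_S_ideal_gas(n, Cv, T1, T2, V1, V2):
    """Entropy change [J/K] of n moles of ideal gas between two states.
    Only the ratios T2/T1 and V2/V1 appear, so any consistent units work."""
    return n * Cv * log(T2 / T1) + n * R * log(V2 / V1)

# The same physical change in K and m^3, then in mK and cm^3: same result.
print(delta_S_ideal_gas(1.0, 20.8, 300.0, 600.0, 1.0, 2.0))
print(delta_S_ideal_gas(1.0, 20.8, 300e3, 600e3, 1.0e6, 2.0e6))
```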
Hi Bo Miao,
A charge undergoes Brownian motion inside the (electrically insulating) container. The electric field in space varies, so there is an induced current in a nearby conductor. The temperature of the conductor increases, while the temperature of container A decreases. (Since the whole system is isolated, the conductor and the container still exchange energy in the form of thermal radiation until the system reaches thermal equilibrium.)... and the Brownian charge will be destined to stop on the surface of the container because of the electrical induction on the conductor.
The second law of thermodynamics is far out of line, with a deviation of 10 to 80 times.
1. Dr. Gert van der Zwan asked me to build a perpetual motion machine, so that I won't be fooled.
In 1905, Einstein put forward the theory of relativity; the atomic bomb was produced in 1945. Yet after 1905, scientists quickly recognized the theory of relativity. There was no need to wait 40 years. Human beings have brains and rationality.
2. Reality has a perpetual motion machine.
**********************************************
3. When the second law of thermodynamics is applied to capillary phenomena, there is Kelvin's formula.
3.1. Notation:
P1 - vapor pressure at the capillary liquid surface
P0 - vapor pressure at the horizontal (flat) surface
ρ - vapor density
h - height of the liquid column.
3.2. Theoretical result of the second law of thermodynamics: P1 = P0 - ρ·g·h
Experiments: P1 = P0 - (10 to 80)·ρ·g·h
4. The second law of thermodynamics is far out of line, with a deviation of 10 to 80 times.
5. The capillary experiment negates the Kelvin formula and the second law (that an isolated system must have an equilibrium state).
6. It is easy to imagine that the gas evaporates from the horizontal liquid surface, enters the capillary from outside (gas phase), condenses, returns to the horizontal level in the liquid phase, and starts over again and again.
7. See the attached drawings for details.
The law p = p0 - ρgh is the well-known law of Stevin, not a prediction of the second law of thermodynamics.
Simon Stevin, also known as Simon of Bruges or, Latinized, Simone Stevino, was a Flemish engineer, physicist and mathematician, born in Bruges in 1548. He was an illegitimate son, so he was raised by his mother, Cathelijne van der Poort; his father's name was Antheunis Stevin. He died in February 1620 in The Hague, Netherlands.
Stevin's law can also be derived from the second Euler law (fluid dynamics), which relates to the effect of the mean force field acting on the fluid... and Euler's laws are a simplified form of the Navier-Stokes equations. In the end it is only fluid dynamics.
The quantitative predictions of the second law of thermodynamics are inaccurate. Scientists are riding a tiger and can only rely on fudging to keep going.
The second law of thermodynamics is about "irreversibility", "dissipation" and "statistics". In fact, we should be concerned about whether the quantitative calculations of the second law of thermodynamics are correct. Here is the Clapeyron formula:
(dP/dT)·T = λ/(V1 - V2) - the same as in figure 7-2-2
1. This is the Clapeyron formula, which involves the vapor pressure, the gas and liquid densities, and the heat of vaporization.
2. Vapor pressure and gas-liquid densities are easy to measure accurately, but the heat of vaporization is difficult to measure.
According to the Clapeyron formula, the heat of vaporization can be calculated from the vapor pressure and the gas-liquid densities; this gives the theoretical value.
3. Later, with the development of instruments and the increase in accurate measurement data for the heat of vaporization, it was found that the heat of vaporization calculated earlier from the second law was not accurate. See the following figure - "Properties of Gases and Liquids".
4. The second law of thermodynamics disappoints everyone.
5. However, the vapor pressure, gas-liquid densities and heat of vaporization on the enthalpy-entropy chart satisfy the Clapeyron formula. This is done by massaging the data. To put it plainly, it is fraud. Scientists find themselves riding a tiger; they can only rely on fraud to keep going.
1. The logic of the second law of thermodynamics: subjectivism, logical jumps, interdisciplinary argumentation.
2. The new thermodynamics pursues universality, on two theoretical cornerstones:
2.1. The Boltzmann formula: ρ = A·exp(-Mgh/RT) - isotope centrifugal separation experiments show that it is suitable for gases and liquids.
2.2. Hydrostatic equilibrium: applicable to gases and liquids.
3. The second and third acoustic virial coefficients of R143a derived from the new thermodynamics are in agreement with the experimental results.
3.1. The derived third acoustic virial coefficient is in agreement with the experimental data, which shows that the theory is still correct when the critical density is reached.
4. See the appendix pictures and documents for details.
I agree with the point of view: "No one really knows what entropy is."
δQ/T is only the entropy of δQ.
In thermodynamics we have only the equation for entropy; there is no explicit definition indicating the physical meaning of entropy. That is why thermodynamics itself cannot explain what entropy is, and why I. Prigogine remarked that "Now entropy is a very strange concept without hoping to achieve a complete description."
Thank you very much for all of your writings about entropy, from which I have learned a great deal. For a long time I have been thinking about this notion, and I admit it is a very difficult one to grasp. It is like a diamond with many facets: only in a very concrete context can we perceive one or a few of its properties. It is something like the particle and wave properties of matter in quantum mechanics, but with many more aspects. The discovery of this notion is a great achievement of science. Compared with other basic scientific notions, entropy has been revealing more and more specific characteristics as well as applications as science develops, and I think it may be time for the scientific community to carry out more rigorous research on it.
I am now looking for a way to describe the entropy of a system so as to estimate the level of its structure, in contrast to chaos or uncertainty. Entropy has two important properties, related to energy and to the direction of a process. The first entropy formula, that of Clausius, related entropy to energy, and the equation of Boltzmann implied the structure of a system. The inequalities derived later reveal the direction of a process through entropy. In my opinion, entropy can reveal the energy expenditure (through the process) of building up (or degrading to) the present structure, and on the other hand it may predict the future of a system with respect to a specific environment. Because of that, in the present state of a system we can (hopefully) find the "imprint" of the past (its overall history) through the entropy of the current system, something like the cosmic background radiation of the universe from the moment of the Big Bang. I am working further on the mathematics to derive (from the current equations of entropy) an equation system that can cover both entropy and structure. I would be grateful to receive comments on my idea and on a mathematical formulation of entropy from all of you.
I think it may be time for a workshop or symposium organized around this great notion.
Best regards,
Ngo Dang Nghia
Nha Trang University, Vietnam
In capillary phenomena, the experimental value is 10 to 80 times that predicted by the second law of thermodynamics. See the picture for details.
Why do you still believe in the second law of thermodynamics?
This experiment negates the statement about the equilibrium state in the second law of thermodynamics.
The isolated system tends to thermal equilibrium (maximum entropy).
As for the imbalance, it should be obvious...
Physical law is the pursuit of universality, yet here there is a deviation of 10 to 80 times. Is there any deviation in the other inferences of the second law of thermodynamics?
I have added an article on RG in which degrees of freedom play a prominent role in explaining Clausius's definition of entropy.
Preprint Why not just Q 1 = Q 2 ?
As a follow-up to the article Why not just Q1=Q2, I have added an article
Preprint Entropy 1865, 2019 update
. The ideas still need work. Eventually, entropy should be describable in one line. The fundamental principle needed is, I think, dimensional capacity. The conceptual set-up of Entropy 1865 is that both Carnot's physical assumptions about an ideal heat engine and Clausius's ratio characterization of entropy are consequences of maximizing dimensional capacity and hence imply each other --- are equivalent.
Here is the gist of Preprint Entropy 1865, 2019 update
: The principle of dimensional capacity implies that the heat engine is optimally efficient when degrees of freedom are maximal.
The ideal heat engine assumptions (the Assumption Model) are that the piston's chamber is friction-less and perfectly insulated.
Clausius's ratio definition of entropy is maximal when the denominator is a mean temperature (the Clausius model) relative to the numerator.
Since both the Clausius model and the Assumption Model have maximal degrees of freedom, for the physical set-up of the ideal heat engine they imply each other, in consequence of the principle of dimensional capacity.
The second law of thermodynamics is best known for the irreversibility of dynamics, such as diffusion and heat conduction. The logic of the second law of thermodynamics is to elevate and enlarge irreversibility. But irreversibility is just a kinetic experience; it is wrong to apply it to thermodynamics. Please refer to the figure above.
Let's take a look at the following discussion ……………………
Does entropy reduction need external work? Containers 1 and 2 contain alcohol, water and saturated steam. The system is connected to a heat source and the temperature is constant. The different piston areas satisfy P1·S1 = P2·S2.
1. Calculate the change of entropy as the piston moves, with W = 0. To simplify the calculation, the vapor density is small, so the vapor is treated as an ideal gas, and the volume change of the liquid is ignored.
dn1 = -dn2
dn1 - the number of moles of alcohol evaporated (mol)
dn2 - the number of moles of water evaporated (mol)
Entropy change of the system: dS = Q/T = dn1·(L1 - L2)/T, where L1 and L2 are heats of vaporization.
dn1 can be positive or negative, so dS can be positive or negative. When W = 0, entropy increase or decrease is not limited by the second law of thermodynamics. The second law of thermodynamics is not a universal physical law, but only an experience.
Dear Robert Shour
In teaching statistical physics for 10 years, I have found that, in order to explain what entropy is, it helps to use the concept of Ω, the number of states accessible to a system, as elaborated in Prof. Reif's books (a toy example follows the references below).
1. Berkeley Physics Course: Statistical physics, by F. Reif, chapter 7, postulates of statistical thermodynamics.
2. Fundamentals of Statistical and Thermal Physics, by F. Reif, chapter 3, statistical thermodynamics.
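A minimal Python sketch in the spirit of Reif's postulates (my own toy example, not from the books): for N two-state spins, counting the accessible microstates Ω for a given number of up-spins and taking S = k_B ln Ω shows that entropy is largest for the least constrained macrostate.

```python
from math import comb, log

def entropy_over_kB(N, n_up):
    """S/k_B = ln(Omega), with Omega = C(N, n_up) the number of
    microstates of N two-state spins having n_up spins up."""
    return log(comb(N, n_up))

N = 100
for n_up in (0, 10, 25, 50):
    print(f"n_up={n_up:3d}  S/k_B={entropy_over_kB(N, n_up):.2f}")
# S/k_B is 0 for the unique state n_up=0 and maximal near n_up = N/2.
```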
Dear Bo Miao,
Please check the book by Acad. L. Landau and E. Lifshitz, Vol. 5, Statistical Physics, Pergamon 1980, Part I, chapter XV (Surfaces), in particular §156 on surface pressure: page 524, equation 156.7.
The equation for pressure with surface tension (as your PNG capture shows) has an additional surface energy to account for; hence the derivation of the surface pressure in capillaries gives a different equation for ΔP.
I would guess the data can be fitted with equation 156.7. Also check, for an easier treatment:
https://www.amazon.com/-/es/L-D-Landau/dp/0080091067
chapter on surface phenomena.
How does all of that apply to corporate management? For instance, in construction engineering, it is said that the more specialties a consulting engineering firm has (i.e., transportation, energy, mining, high-rise, etc.), the less profitable the firm is. Put differently, the more it spreads its resources over activities that cannot be properly combined (building a bridge and a nuclear reactor, for example), the more useless energy it spends. Here, complexity seems to have a role to play. When a firm has a wider field of activities, the uncertainty of successfully accomplishing a complex mega-project rises to the point where the firm may fail to meet budget and schedule goals. A different way to look at it might be to say that the firm reaches the Peter Principle when the entropy value nears its maximum.
Dear Robert Shour, I have proposed (in a research item that has since been deleted)
to dispose of entropy as a statistical entity and define it as a fundamental field, just like gravity or electromagnetism.
Dear Roman Baudrimont, according to my recent contemplations, which led to the proposal of entropy as a field (in that same deleted research item), I respectfully disagree with you. I would be happy to receive your views regarding my proposal.
1. You'll get narrower and narrower on the road of the second law of thermodynamics.
2. Physics stresses winner-takes-all.
3. People who believe in the second law of thermodynamics are always confident. When they encounter problems that cannot be solved (such as fluctuations), they like to explain them away. Real physics doesn't do that.
4. Is it accurate to deduce the specific heat or the speed of sound from the actual equation of state, in its accurate region, by means of the second law of thermodynamics? I have asked the professors of thermophysics in China, who do these things every day, and they dare not answer "yes" to support the truth as they see it.
You can ask American, Japanese or German professors, or look at the literature. The situation is similar.
5. If you don't understand these facts and you construct theories like Kelvin or Clausius did, the road will become narrower and narrower.
Dear Robert Shour,
Greetings. The best way to explain entropy may be the use of energy-component approaches to the processes performed. You may want to see the following papers:
1. A General Solution to the Different Formulations of the Second Law of Thermodynamics DOI:
2. A Study of the Entropy Production in Physical Processes from a New Perspective of the Energy Structure DOI:
Regards, Saeed
1. A general philosophical principle: internal causes (the thermophysical properties of the working fluid) determine external performance (Carnot efficiency).
2. The second law of thermodynamics holds that Carnot efficiency has nothing to do with the thermophysical properties of the working medium.
3. The second law of thermodynamics therefore violates this general principle. And it is self-contradictory.
4. Please see the picture and link:
https://www.researchgate.net/publication/352708795_The_contradiction_of_the_second_law_of_thermodynamics
For the last couple of weeks I have been following the discussion in another RG discussion group called "Tackling a Century Mystery: Entropy", with the corresponding theme:
"Why are we still unable to explain the difficulties caused by a physical concept even after more than 150 years of hard work?"
As many will know, the second law of thermodynamics is to a large extent grounded in the Clausius entropy via the Clausius theorem. Another grounding of the second law of thermodynamics is the Carnot theorem.
Some quotes of alternative views of Clausius entropy from this discussion group follow.
Without suggesting that anyone leave the discussion group Robert Shour started, I inform the followers here that interesting alternative views on Clausius entropy have been posted in that other discussion group over the past two weeks.
Discussion group: "Tackling a Century Mystery: Entropy"
Source: https://www.researchgate.net/post/Tackling_a_Century_Mystery_Entropy
Is there anyone who has managed to deduce the equilibrium state of a perfect gas without using the concept of statistical entropy?
The salary of academic scientists
Thermodynamic academic scientists have been paid by university executives who don't know what entropy is. This started when Rudolf Clausius published his work in 1854, introducing the Clausius entropy equation without a generalized analytical phenomenological explanation. Academic scientists have passed this relic on from generation to generation. To this day they haven't found a generalized analytical phenomenological explanation of Clausius entropy, despite some 10,000 scientific research works. On the contrary, more and more evidence has become available that Clausius entropy is a flaw, and that it is the "Greatest Blunder Ever in the History of Science", in accordance with the scientific work of Professor Arieh Ben-Naim, whose summary is quoted:
This article is about the profound misuses, misunderstanding, misinterpretations and misapplications of entropy, the Second Law of Thermodynamics and Information Theory. It is the story of the “Greatest Blunder Ever in the History of Science”. It is not about a single blunder admitted by a single person (e.g., Albert Einstein allegedly said in connection with the cosmological constant, that this was his greatest blunder), but rather a blunder of gargantuan proportions whose claws have permeated all branches of science; from thermodynamics, cosmology, biology, psychology, sociology and much more.
.
.
Tackling a Century Mystery: Entropy
This is one part of my comment in the discussion group "Tackling a Century Mystery: Entropy". There has been a lively debate about the correctness and corruptness of Clausius entropy for several weeks now. In the past days the discussion has reached a new high.
Source: https://www.researchgate.net/post/Tackling_a_Century_Mystery_Entropy
Casper A. Helder added a reply August 26, 2021 remarking:
".... Rudolf Clausius published his work in 1854 introducing the Clausius entropy equation without a generalized analytical phenomenological explanation."
In this connection, here is a quote from Clausius:
"For brevity, we will introduce a simpler symbol for the last function, or rather for its reciprocal, inasmuch as the latter will afterwards be shown to be the more convenient of the two."
As Dr. Helder notes, no phenomenological explanation, as discussed in:
Preprint Why not just Q 1 = Q 2 ?
Dear Robert Shour,
I quickly reviewed your article. This seems an in-depth study trying to unravel Clausius entropy, the pillar of the second law of thermodynamics. Better said, Clausius entropy is the main pillar of the thermodynamic knowledge base. As we all know, the thermodynamic knowledge base is not grounded and shows various contradictions. The present thermodynamic knowledge base has not been able to give engineers the tools they need in the battle against climate change: a generalized analytical phenomenological explanation of the ideal heat exchanger and a generalized analytical phenomenological explanation of the ideal heat engine/pump. This is the result of 200 years of hard work after Sadi Carnot left us his book in 1824.
So these two statements in your article in particular should be embraced by all thermodynamic scientists, especially academic university scientists:
Reasons to investigate the nature of entropy include:
.
Reference
.
In my opinion Clausius entropy stands in the way of dismantling the second law of thermodynamics. Basically, Clausius entropy stands in the way of dismantling, or at least unravelling, the complete thermodynamic knowledge base.
.
Separate Clausius theorem
In my opinion the Clausius theorem should be separated into two applications, which are both useless once we truly unravel these Clausius entropy concepts. Two different applications:
Clausius theorem:
.
Wikipedia Occam's razor: https://en.wikipedia.org/wiki/Occam%27s_razor
.
In your article you wrote:
How does dividing by temperature connect physically to a cycling heat engine?
In my comment of August 7th I wrote the following (section):
Rename Carnot formula
The Carnot theorem is completely flawed too, even the well-known Carnot efficiency formula (heat-to-work ratio). Among other things, we found the basic grounding of the well-known Carnot formula. This formula is not uniquely related to the Carnot cycle: it is the general governing equation of any randomly operated ideal thermodynamic cycle between two so-called isothermal heat sources. The Carnot formula is the logical consequence of a thermodynamic cycle with a thermodynamic medium (gas, liquid or solid) with arbitrary thermomechanical single-valued and not necessarily monotonic properties and one singular point at zero kelvin.
The following articles support this view:
.
In this comment it can be seen why the amount of heat [J] divided by temperature is not specifically related to the Carnot cycle.
.
Quote
In the next part of this comment I quoted (without italic characters) my comment of August 6th from the discussion group "Tackling a Century Mystery: Entropy" :
Source: https://www.researchgate.net/post/Tackling_a_Century_Mystery_Entropy
.
Summary
ΔS=ΔW/T2. Clausius entropy is a flaw.
The zeroth, the second and the third laws of thermodynamics are flawed. Thermodynamic science should be transformed into basic physics. The physical law of conservation of energy replaces the first law of thermodynamics. In addition, physics can probably do what it hasn't done for the past 200 years: unify as many observed phenomena as possible in as few generalized phenomenological analytical explanations as possible.
As long as the IPCC and industry are using flawed analytical thermodynamic tools from the zeroth, second and third laws of thermodynamics, the outcome of human effort will certainly be "catastrophic", similar to what is currently predicted and feared by many experts in the recent article "World Scientists' Warning of a Climate Emergency 2021".
This all is stated by a startup company named Tezzit.
.
Introduction
This comment is part of a larger explanation based upon our research within the startup company Tezzit. Our initial goal was to discover perfect heat transfer and heat exchange, not for scientific reasons but for technological and commercial purposes. Our project centered around heat exchanger technology. The discoveries within our 10-year research and development project are critical in the battle against climate change. In our research we have clearly found that heat exchanger science has been failing badly since 1824, and this is almost certainly the case for thermodynamic science as well. There is no doubt that heat exchanger science and thermodynamic science are responsible for the dramatically poor progress in the battle against climate change. The poor analytical thermodynamic tools they provide to industry and the IPCC are incorrect.
The full explanation will be posted in the ResearchGate discussion “Can the Second law of thermodynamics be abandoned?”.
Reference:
ResearchGate discussion: “Can the Second law of thermodynamics be abandoned?”
https://www.researchgate.net/post/Can_the_Second_law_of_thermodynamics_be_abandoned
.
Sadi Carnot, engineer
While you read this, more than 1 million people are being pushed through the air at a speed of 900 km/h in some 10,000 aircraft, and 60,000 power plants generate 90% of the power required for all electrical devices within the global energy system. These impressive technological achievements, on which human society is heavily dependent, are attributable to engineers and not to thermodynamic scientists. In the past 200 years, engineers have been able to develop all kinds of energy technologies for the global energy system thanks to Sadi Carnot's ideas about thermodynamic heat-work cycles, which he explained in his book dated 1824.
Recently, an extreme hybrid scientific and technological paradigm shift has been unleashed that everyone should be aware of.
.
ΔS=ΔW/T2
We found that the entropy difference can be expressed analytically, mathematically and phenomenologically as ΔS = ΔW/T2 (equivalently, ΔW = T2·ΔS; a numerical sketch follows the definitions below).
The most extensive expression which is also applicable is ΔS=ΔW/T2=Q2/T2-Q1/T1.
What this expression quantifies is the difference between two thermodynamic cycles. Both cycles operate their thermodynamic media between two isothermal heat sources. So a real cycle is compared with an ideal cycle, both operated between the two isothermal heat sources T1 and T2. In both cycles the heat source T1 supplies the same amount of thermal energy Q1 [J].
For simplicity, Q is stated in joules instead of as the time integral of the time-variant heat flow Q(t) [W] over a complete thermodynamic cycle. This is basically incorrect, or at least too simplified. Using Q(t) [W] as an independent variable within the analysis of thermodynamic heat-work cycles is incorrect.
Q1 [J]: the net amount of thermal energy transferred from the heat source T1 to the thermodynamic medium within one completed thermodynamic cycle between the two isothermal heat sources. In addition to the net amount of thermal energy, the net amount of radiatively transferred energy must also be included.
Q2 [J]: this property pertains only to the real cycle. Q2 is the net amount of thermal energy transferred from the thermodynamic medium to the heat sink (T2/Tlow) within one completed (real) thermodynamic cycle. The real cycle operates the thermodynamic medium between the two isothermal heat sources. In addition to the net amount of thermal energy, the net amount of radiatively transferred energy must also be taken into account. A real thermodynamic cycle means that in one or all of the successive thermodynamic operations there is a thermodynamically irreversible process; without further explanation, this process is always a convective transfer of thermal energy within the thermodynamic medium with an associated temperature difference greater than zero kelvin.
∆W [J]: the calculated difference between the net amount of work generated by an ideal thermodynamic cycle and by a real thermodynamic cycle between the two isothermal heat sources. ∆W is thus calculated from casually known properties. Both thermodynamic cycles are operated by supplying the same amount of thermal energy.
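A minimal Python sketch of the poster's expression (my own illustration with made-up numbers): comparing a real cycle with the ideal cycle between the same two isothermal sources, ΔS = Q2/T2 - Q1/T1, and the lost work comes out as ΔW = T2·ΔS.

```python
def entropy_difference(Q1, T1, Q2, T2):
    """Poster's expression dS = Q2/T2 - Q1/T1 [J/K] for a real cycle
    between isothermal sources at T1 (hot) and T2 (cold)."""
    return Q2 / T2 - Q1 / T1

Q1, T1 = 1000.0, 600.0        # heat drawn from the hot source [J], [K]
T2 = 300.0                    # cold source temperature [K]
Q2_ideal = Q1 * T2 / T1       # reversible cycle rejects Q2 with Q2/T2 = Q1/T1
Q2_real = 700.0               # an irreversible cycle rejects more heat

dS = entropy_difference(Q1, T1, Q2_real, T2)
print(dS)        # ~0.667 J/K entropy difference
print(T2 * dS)   # 200.0 J = lost work dW = T2*dS (ideal 500 J vs real 300 J)
```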
.
Carnot’s Loss Of Motive Power
Without explaining it in detail, ∆W is equal to another invention of Sadi Carnot, explained in his book of 1824, called the "loss of motive power".
.
Clausius entropy
Without explaining it in detail in this comment, we have found supporting scientific works for the statement that Clausius entropy is a complete scientific flaw.
.
Second law of thermodynamics
Without explaining it in detail in this comment, we have found supporting scientific works for the statement that the second law of thermodynamics is a scientific flaw.
.
Zeroth law of thermodynamics
Without explaining it in detail in this comment, we have found supporting scientific works for the statement that the zeroth law of thermodynamics is a scientific flaw.
.
First law of thermodynamics
Without explaining it in detail in this comment, we have found supporting scientific works for the statement that the first law of thermodynamics can be abandoned when the properties work and temperature are transformed into mechanical properties: kinetic energy within a certain field of the fundamental interactions.
.
Third law of thermodynamics
Without explaining it in detail in this comment, we have found supporting scientific works for the statement that the third law of thermodynamics is a complete scientific flaw.
.
Thermodynamic science
Thermodynamic science should be transformed into basic physical science. The physical law of conservation of energy replaces the first law of thermodynamics. Beyond that, physical science can then attempt what it has not done for the past 200 years: unify as many observed phenomena as possible into as few generalized phenomenological analytical explanations as possible.
Reference
Wikipedia, Unification (physics): "Unification of the observable fundamental phenomena of nature is one of the primary goals of physics."
Retrieved from: https://en.m.wikipedia.org/wiki/Unification_(physics)
.
Battle against climate change
Apart from the fact that all institutions currently deciding the route and tools for battling climate change have interests other than those of humanity, thermodynamic science is the second flaw. "Institutions" here means any company, organization, government, university or NGO. Without explaining in detail in this comment: as long as the IPCC and industry keep using flawed analytical thermodynamic tools derived from the zeroth, second and third laws of thermodynamics, the outcome of the human effort will certainly be "catastrophic", similar to what is currently predicted and feared by many experts in the recent article "World Scientists' Warning of a Climate Emergency 2021".
Reference
Article: World Scientists’ Warning of a Climate Emergency 2021
Retrieved from: https://academic.oup.com/bioscience/advance-article/doi/10.1093/biosci/biab079/6325731#
Thermodynamic science has one pillar: the second law of thermodynamics. This law is held up by Clausius entropy. Various researchers are actively debating the foundations of Clausius entropy in different ResearchGate discussion groups.
Over the last couple of weeks I have been discussing Clausius entropy in the discussion group "Tackling a Century Mystery: Entropy". The proponents and opponents are diametrically opposed. The proponents of Clausius entropy still cannot explain entropy or substantiate it beyond "entropy must always increase in the universe, and thus . . .". Moreover, none could name a single spontaneous process in the world, or even in the universe, that grounds Clausius entropy.
This is shocking: no progress in 170 years in understanding this simple formula, published by Rudolf Clausius in 1850 without an explanation. The opponents are certain: Clausius entropy is the "greatest blunder ever in the history of science". Opponents have issued various exceptionally strong proofs and corresponding warnings: Clausius entropy must be dismantled in order to make scientific and technological progress. Thermodynamics has to go back 200 years and start over from the work of Sadi Carnot (1824).
Source: https://www.researchgate.net/post/Tackling_a_Century_Mystery_Entropy
.
Fruitful discussions on 200-year-old thermodynamic pillars
Various scientists, researchers and experts have shared their ideas, which has led to fruitful discussions. In order to make these fruitful discussions in the related discussion groups accessible to those who do not have a ResearchGate account, I have composed a document that contains various interesting comments from one of these scientists, researchers and experts.
.
Title: Fruitful discussions on 200-year-old thermodynamic pillars
Source: deleted research item (the item has since been removed from ResearchGate)
The second law of thermodynamics violates the epistemology of "internal causes determine external manifestations".
From the picture (book content), the second law of thermodynamics deduces: "the efficiency of the Carnot heat engine has nothing to do with the thermophysical properties of the working medium". This is absurd, like saying "human looks have nothing to do with genes".
"Carnot heat-engine efficiency" and "human appearance" belong to external appearance, while "working-medium thermophysical properties" and "genes" are internal causes. Internal causes determine external appearance.
Thermodynamicists over-rely on "anti-perpetual-motion machine" arguments and "irreversibility". The second law of thermodynamics violates the rigor of science.
The second law of thermodynamics cannot calculate the thermodynamic entropy in the process of gas diffusion into a vacuum (see the sketch below):
1. The gas diffuses into a vacuum, so dQ = 0 and dS = dQ/T = 0; hence the entropy during the diffusion process, S(t1), cannot be calculated.
2. If S(t1) has no physical meaning, then S(t0) and S(t2) have no physical meaning.
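To make the point concrete, here is a minimal sketch (Python; the gas amount and volumes are assumptions) contrasting the direct evaluation of dS = dQ/T along the actual path, where dQ = 0, with the textbook value obtained from a reversible isothermal replacement path, ΔS = nR ln(V2/V1):

import math

n = 1.0            # assumed amount of ideal gas [mol]
R = 8.314          # molar gas constant [J/(mol*K)]
V1, V2 = 1.0, 2.0  # assumed initial and final volumes [m^3]

# Along the actual irreversible path, dQ = 0 everywhere, so the
# integral of dQ/T over the diffusion process is zero.
dS_direct = 0.0

# Textbook evaluation along a reversible isothermal replacement path.
dS_textbook = n * R * math.log(V2 / V1)

print(dS_direct, dS_textbook)  # 0.0 versus about 5.76 J/K

The discrepancy between the two numbers is exactly the point at issue in the comment above.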
Article: The second law of thermodynamics is a physical disaster
Article: The second law of thermodynamics: a mathematical error
1) Mathematical treatment of the Carnot heat engine: the properties of the working medium are the physical equations, and the temperature of the heat source is the mathematical boundary of the equations.
2) Carnot's law is then expressed as: the result (efficiency) of a thermophysical equation (heat engine) is related only to the boundary of the equation (heat-source temperature) and has nothing to do with the properties of the working medium (the equation itself); the sketch below states this formula explicitly.
3) Carnot's law thus becomes a low-level mathematical error, and the thermodynamic theory built on it is naturally wrong. The next section is an experimental case.
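The disputed statement can be made explicit in code. A minimal sketch (Python; the temperatures are assumed values): the standard Carnot-efficiency formula takes only the two heat-source temperatures as inputs, so no working-medium property can influence its result:

def carnot_efficiency(T_hot: float, T_cold: float) -> float:
    # The formula depends only on the boundary temperatures;
    # no working-medium property appears anywhere.
    return 1.0 - T_cold / T_hot

# Assumed boundary temperatures [K]; the result is the same
# whatever the working medium is taken to be.
print(carnot_efficiency(500.0, 300.0))  # 0.4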
In my opinion, entropy is a philosophical concept used as a transitional status to relatively measure or detect an event in a chaotic system.
It is not an obsolete concept to be limited or dismissed; rather, it is a good starting point of view for generally grasping random systems.
The second law of thermodynamics is not strictly closed and definite
The expression of the second law of thermodynamics is not rigorous; it is full of everyday-life language. Legal workers who do not understand physics can draw strange conclusions from it purely through the logic of language.
Kelvin stated that it is impossible to take heat from a single heat source and convert it completely into work without other effects.
This sentence can be understood as follows:
A1: A heat engine with a single heat source cannot be manufactured, but a heat engine with two heat sources can be manufactured.
A2: A heat engine with a single heat source cannot be manufactured, and a heat engine with two heat sources cannot be manufactured either.
A3: Heat engines with a single heat source or with two heat sources cannot be manufactured, but a heat engine with three heat sources can be manufactured.
A4: What do "other effects" mean? One can guess freely: the hen cannot lay eggs, or the sun rises from the south?
Clausius stated that it is impossible to transfer heat from a low-temperature object to a high-temperature object without causing other changes.
This sentence can be understood as follows:
B1: Heat cannot be spontaneously transferred from low temperature to high temperature, but heat can be spontaneously transferred from high temperature to low temperature.
B2: Heat cannot be spontaneously transferred from low temperature to high temperature, and heat cannot be spontaneously transferred from high temperature to low temperature either.
B3: What do "other changes" mean? One can guess freely: did the tortoise lay a golden egg, or did the sun rise from the west?
The laws of physics should be expressed in quantitative mathematics, as with Newton's second law, F = ma. The second law of thermodynamics is expressed entirely in everyday language; it is not strictly closed and definite.