"Entropy of Substance" MUST be zero at absolute zero, because Entropy is the notion to characterize ubiquitous Reactions/Counteractions in response to not less ubiquitous Actions. This is known since the time of Isaac Newton and expressible as his Third Basic Law of Mechanics.
It is because of the above circumstance that the absolute zero temperature is never reachable through any realistic process, as we know. Why?
Indeed, any Action MUST entail a Counteraction of equal absolute value, so that Zero Action would result in Zero Counteraction.
As Entropy (the energetic expression of the counteractions) is equal to zero at zero absolute temperature (cf. the following for detailed information about this result: https://arxiv.org/pdf/1110.6352.pdf), the latter temperature must forever remain unreachable.
If the entropy of a system is zero, this means that the system is in perfect order and that all of its particles are in a stationary state. This only happens at absolute zero, i.e. at a temperature of 0 Kelvin, where the system must be in the state of minimum possible energy. Entropy is related to the number of accessible microstates, and there is typically one unique state of minimum energy; in such a case, the entropy at absolute zero will be exactly zero. The entropy of a system at absolute zero is typically zero, and in all cases is determined only by the number of different ground states it has. Specifically, the entropy of a pure crystalline substance at absolute zero temperature is zero: since entropy is a measure of randomness and the constituent particles are motionless, entropy is taken as zero at absolute zero. Only a perfectly ordered, crystalline substance at absolute zero would exhibit no molecular motion and have zero entropy. (As for reactions: if the total entropy change is greater than zero, the reaction is product-favored; if it is less than zero, the reaction is reactant-favored.)
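To put numbers on the microstate-counting statement above, here is a minimal Python sketch (the degeneracy values are purely illustrative assumptions, not data from this discussion) evaluating the Boltzmann formula S = k_B·ln(W):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(w: float) -> float:
    """Entropy S = k_B * ln(W) for W accessible microstates."""
    return K_B * math.log(w)

# A perfect crystal with a unique ground state: W = 1, so S = 0 exactly.
print(boltzmann_entropy(1))      # 0.0

# Even a (hypothetical) million-fold degenerate ground state gives an entropy
# of order 1e-22 J/K -- macroscopically indistinguishable from zero.
print(boltzmann_entropy(1e6))    # ~1.9e-22 J/K
```

The point of the second line is that any ground-state degeneracy that does not grow with the size of the system leaves the entropy immeasurably small, which is why the third-law statement is phrased in terms of a perfect crystal with a single lowest-energy configuration.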
If the entropy of a system is zero, the only conclusion about such a system is that it has not received any internal or external stimulus/drive, that is, it is dead - and, therefore, it is not interesting for study and discussion.
If you do not know what entropy is, it is not a big problem, for you might still successfully use this notion IMPLICITLY, by employing conventional physical-chemical methods/approaches, depending solely on the nature of the problem you have to solve.
In a reversible adiabatic process there is no heat exchange, and therefore the entropy change of the system is zero. Likewise, if there is no difference between the final and initial entropies, the entropy change is zero: steady-flow devices such as nozzles and turbines have zero entropy change during steady operation, and so do reversible adiabatic processes. If the entropy of each element in some crystalline state is taken as zero at the absolute zero of temperature, then every substance has a finite positive entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances. Since entropy is a measure of randomness and the constituent particles are motionless, entropy is taken as zero at absolute zero, where the translational kinetic energy of the system also vanishes. The entropy of a perfect crystal at absolute zero is exactly equal to zero: at absolute zero, the system must be in a state with the minimum possible energy, and this statement of the third law holds true provided that the perfect crystal has only one minimum-energy state. The third law of thermodynamics thus establishes the zero for entropy as that of a perfect, pure crystalline solid at 0 K; with only one possible microstate, the entropy is zero. We may then compute the standard entropy change for a process by using standard entropy values for the reactants and products involved.
"Entropy is a measure of the disorder of a system".
In effect, Entropy is a measure of Reaction/Counteraction in terms of Energy, as soon as the relevant Action is non-zero.
This is Isaac Newton's Third Basic Law of Mechanics enriched by the ideas of Carnot, Clausius, Rankine and Lord Kelvin (last but not least!).
The frequency with which 'disorder' is mentioned in discussions of the entropy notion ought to be a measure of sheer ignorance as to the actual sense of the latter.
I agree with Rana Hamza Shakil that, by the 3rd law, entropy is zero at zero K. If we think about what temperature is, it is a measure of the internal, randomized energy in a material, and absolute zero is the point at which thermal motions stop entirely. The entropy change of a system is zero if the state of the system does not change during the process; accordingly, the entropy change of steady-flow devices such as nozzles, compressors, turbines, pumps and heat exchangers is zero during steady operation. The entropy of a pure, perfect crystalline substance at 0 K is zero: since entropy is a measure of randomness and the constituent particles are motionless, it is taken as zero at absolute zero, where the translational kinetic energy of the system is also zero.
"I agree with Rana Hamza Shakil that by the 3rd law, entropy is zero, at zero K."
There is no separate '3rd law', for a very simple straightforward reason.
Mathematically, entropy is equal to zero at zero absolute temperature. This is derivable using the famous Boltzmann-Planck formula expressing entropy S via the probability W: S = k*ln(W).
Meanwhile, W is a handy algebraic function of the absolute temperature, T:
W = 1 + T/Tscale, where Tscale stands for the pertinent temperature scale of your problem. Consequently, if T = 0, then S = 0. This has been known for about 100 years already.
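Purely as an illustration of the relation just stated, the following Python snippet evaluates S = k*ln(W) with W = 1 + T/Tscale; note that this expression for W and the particular Tscale chosen below are assumptions of this post, not a standard textbook formula:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_as_claimed(t_kelvin: float, t_scale: float) -> float:
    """Evaluate S = k * ln(1 + T/Tscale), the relation proposed in the post above."""
    return K_B * math.log(1.0 + t_kelvin / t_scale)

T_SCALE = 300.0  # arbitrary illustrative choice; the post leaves Tscale unspecified

for t in (0.0, 1.0, 77.0, 300.0):
    print(f"T = {t:6.1f} K  ->  S = {entropy_as_claimed(t, T_SCALE):.3e} J/K")
# At T = 0 the argument of the logarithm is 1, so S = 0, which is the point being made.
```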
Physically/chemically/biologically/etc., S = 0, i.e., the reaction/counteraction is equal to zero, if and only if the system under study receives no internal or external stimuli, i.e., the driving/livening force (the relevant Action) of the process under study is zero.
This is just why T = 0 is FOREVER UNREACHABLE in the course of any realistic process, for the latter does exert a NON-ZERO driving/livening force, that is, a relevant Action, which ought to cause a NON-ZERO reaction/counteraction expressible via some NON-ZERO S = k*ln(W).
To sum up, the UNREACHABILITY of T = 0 does represent the mathematical consequence of the Energy Transformability, that is, a QUALITATIVE aspect of the Basic Fundamental Energy Conservation and Transformation Law.
Stubbornly isolating it conceptually by dubbing it 'the 3rd law' has served as a justification of the metaphysical voluntarism that successfully terminated in the form of Quantum Mechanics, but it has no methodological value apart from the latter story:
Basic Fundamental Natural Laws are never mathematically derivable from elsewhere.
Theoretically entropy can be zero; practically, however, one cannot achieve this, because zero entropy requires reaching a temperature of 0 kelvin, which cannot be done. Note that for an ideal solution the entropy of mixing (ΔmixS) is not zero but positive, because a solvent in a solution has more molecular disorder; similarly, there is a multitude of possible ground states for an ideal gas, which means that its entropy would be nonzero even at zero temperature. Following the increase of entropy, the dissipation of matter and energy goes on until our universe becomes so infinitely disordered that entropy can no longer increase and events come to an end; this is called the heat death of the universe (some say that, because things cannot get any worse, nothing happens at all). The third law of thermodynamics was formulated by Nernst. It states that "the entropy of a perfectly crystalline substance at absolute zero, or zero kelvin, is taken as zero". The third law thus establishes the zero for entropy as that of a perfect, pure crystalline solid at 0 K: with only one possible microstate, the entropy is zero, and we may compute the standard entropy change for a process by using standard entropy values for the reactants and products involved. In the quantum-mechanical description, matter (a solid) at absolute zero is in its ground state, the point of lowest internal energy.
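Since the standard remark about the ideal entropy of mixing is easy to miss in the paragraph above, a short sketch may help; it evaluates ΔS_mix = -n_total·R·Σ x_i·ln x_i for a binary mixture, with mole numbers chosen purely for illustration:

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def ideal_mixing_entropy(moles) -> float:
    """Ideal entropy of mixing: dS_mix = -n_total * R * sum(x_i * ln x_i)."""
    n_total = sum(moles)
    return -n_total * R * sum((n / n_total) * math.log(n / n_total) for n in moles)

# Equimolar binary mixture: dS_mix = 2*R*ln(2) for 2 mol, i.e. R*ln(2) per mole of mixture.
print(ideal_mixing_entropy([1.0, 1.0]))   # ~11.5 J/K
# Any composition other than a pure component gives a strictly positive value.
print(ideal_mixing_entropy([0.9, 0.1]))   # ~2.7 J/K
```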
"The entropy of a perfectly crystalline substance at absolute zero or zero kelvin is taken as zero".
1. There is nothing 'perfect' in the Universe.
2. The 'entropy of substance' is a meaningless combination of words. Entropy is a characteristic of some realistic process, of whatever nature, in which some driving/livening force (Action) must encounter pertinent Reactions/Counteractions. That is: Zero Action, Zero Reaction, and that is all.
3. The author(s) of the so-called "First, Second, Third, Fourth" laws had absolutely no idea what entropy is. He/they had to invent "laws", fake in fact, to bridge the gap between the natural sciences and metaphysical voluntarism... Howbeit, the "fake laws" plus probability theory and a couple of chapters of mathematical statistics could be duly combined into a true, valid and seminal physical mathematics, which has nothing to do with mathematical physics.
4. But if you know what entropy is, you know that the Energy notion obeys ONE and ONLY ONE BASIC, FUNDAMENTAL, CONCEPTUALLY INSEPARABLE LAW:
Energy Conservation and Transformation Law
5. It is entirely possible to treat the entropy notion both IMPLICITLY (wave or quantum mechanics) and EXPLICITLY; which treatment to use depends on the problem you have to solve. Owing to ubiquitous human relationships, the APPROACHES STEMMING from the EXPLICIT entropy are underdeveloped nowadays, but young, active and proactive colleagues will close this gap for sure, no doubt.
Evgeni B. Starikov: '"Entropy of Substance" MUST be zero at absolute zero, because Entropy is the notion that characterizes ubiquitous Reactions/Counteractions in response to no less ubiquitous Actions'
Not true. The entropy need not go to zero at absolute zero for quantum systems with degenerate ground state. (The degeneracy must be sufficiently high, i.e. its logarithm must be O(N).) Such systems have a quantum phase transition at T=0.
Moreover, the entropy of other systems need not go to zero if the system is in a non-equilibrium state. Glasses are examples. A glass is not at thermal equilibrium and is disordered enough to have a higher entropy than the corresponding crystal. This does not go away as the temperature is decreased to zero, as the glass is caught in its metastable state and will not get out of it. (Well, maybe after a gazillion years it could tunnel into the crystalline state...)
Anyway, keep in mind that the third law of thermodynamics holds for equilibrium systems only.
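To make the degeneracy-scaling condition in the reply above concrete: only a ground-state degeneracy whose logarithm grows in proportion to N leaves a nonzero entropy per particle at T = 0; carbon monoxide, with its frozen-in orientational disorder of roughly 2^N arrangements and a residual molar entropy near R·ln 2, is the textbook example. A rough Python sketch (numbers chosen only for illustration):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol
R = K_B * N_A        # gas constant, J/(mol*K)

def residual_entropy_per_particle(log_degeneracy: float, n_particles: float) -> float:
    """S/N = k_B * ln(g) / N; ln(g) is passed directly to avoid overflow for huge g."""
    return K_B * log_degeneracy / n_particles

N = N_A  # one mole of particles

# Fixed, N-independent degeneracy (say g = 100): the entropy per particle is
# negligible in the thermodynamic limit -- effectively the third-law result.
print(residual_entropy_per_particle(math.log(100), N))    # ~1.1e-46 J/K per particle

# Extensive degeneracy g = 2^N (so ln g = N*ln 2): a finite k_B*ln(2) per particle,
# i.e. R*ln(2) per mole -- the CO-type residual entropy.
print(residual_entropy_per_particle(N * math.log(2), N))  # ~9.6e-24 J/K per particle
print(R * math.log(2))                                    # ~5.76 J/(mol*K)
```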
I agree with K. Kassner that entropy is the measure of randomness; since the constituent particles are motionless, the entropy is taken as zero at absolute zero. The entropy of a system at absolute zero is typically zero and is in all cases determined only by the number of different ground states it has. Specifically, the entropy of a pure, perfect crystalline substance at absolute zero is exactly zero. At absolute zero the system must be in the state of minimum possible energy, and this statement of the third law holds true provided the perfect crystal has only one minimum-energy state. The third law of thermodynamics was formulated by Nernst; it states that "the entropy of a perfectly crystalline substance at absolute zero, or zero kelvin, is taken as zero". Furthermore, the entropy change of a system is zero if the state of the system does not change during the process, which is why the entropy change of steady-flow devices such as nozzles, compressors, turbines, pumps and heat exchangers is zero during steady operation.
Rk Naresh: The assumption that there is zero entropy at absolute zero is based on the Third Law of Thermodynamics and the idea that a substance is in its lowest energy state, with no molecular motion, at that temperature. Exactly zero entropy for a system at any non-zero temperature would imply an extremely ordered and improbable state, which is unlikely to occur in reality.
Many heartfelt thanks for your truly first-rate contribution!
This eases my non-trivial task of conveying something important to the colleagues concerned and to other interested readers.
With that, let us return to English, so as not to narrow the circle of those interested in this discussion:
Dear Prof. Dr. Kassner,
My statement: "Entropy of Substance" MUST be zero at absolute zero, because Entropy is the notion to characterize ubiquitous Reactions/Counteractions in response to not less ubiquitous Actions"
Your response: "Not true. The entropy need not go to zero at absolute zero for quantum systems with degenerate ground state. (The degeneracy must be sufficiently high, i.e. its logarithm must be O(N).) Such systems have a quantum phase transition at T=0".
In case we do not know WHAT THIS DAMNED ENTROPY IN EFFECT IS, I do agree that "The entropy need not go to zero at absolute zero", for there are exactly two possibilities for entropy: to go or not to go to zero at absolute zero.
Quantum systems, ground states, degeneracies of the latter are, in effect, nothing more than convenient misnomers as soon as we discuss physical principles.
Why?
Quantum Physics stems from metaphysical voluntarism. The actual origin of the latter is ignorance in regard to the proper answer to the poser I have just mentioned, namely: WHAT THIS DAMNED ENTROPY IS IN EFFECT? Most of the relevant colleagues at the time in question (the end of the XIXth and the start of the XXth century) were regrettably ignorant of the proper answer to that poser, while still duly grasping the immense importance of the Energy Transformation modalities representable by the Entropy notion.
This is why the collegial majority took the plausible but, in effect, very uneasy decision to strive for a due but IMPLICIT treatment of Entropy, and the (by now notorious) formula S = k*ln(W) opened the door.
As W stands here for the Probability of the Lord Almighty knows what, scilicet, the next logical step was to formulate a Statistical Mechanics of the Rigid Balls we dub Atoms and then, after realizing that matter is actually Non-Rigid and Soft, the W in S = k*ln(W) started to look like Ψ and to bear the identifier "Wave Function". In addition, the ingenious findings of Nicolas Carnot degraded into a strange field dubbed 'Equilibrium Thermodynamics', known to any skilful student all around the World for its hardly logical conceptual framework.
Meanwhile, nobody could have grasped WHAT THIS DAMNED ENTROPY IN EFFECT IS, but a combination of S = k*ln(W) with a metaphysical suggestion to 'cut Energy into discrete thin slices, like a sausage' turned out to be useful in explaining the ubiquitous black-body radiation modalities. This is just how Quantum Physics started its marvellous route...
Howbeit, Simon Ratnowsky could prove that the formulae derivable using this metaphysical suggestion are also derivable from J. W. Gibbs' mathematical results WITHOUT any metaphysical suggestions at all... Moreover, Simon proved that this 'Energy Quantum' suggestion stems from the reference to absolute zero, which is physically unreachable, as we know...
The results are known to any University student of Physics: Quantum Mechanics has finally grown into a true, valid and seminal physical theory.
What is widely unknown is that, for his results, Simon Ratnowsky was mobbed to death...
Moreover, apart from Simon there have been colleagues who not only knew well WHAT THIS DAMNED ENTROPY IN EFFECT IS, but also knew HOW TO DEAL WITH THIS DAMNED ENTROPY in a physically proper, correct and fruitful way.
The conclusion: Scientific Research is INEXHAUSTIBLE.
Howbeit, professional readership should know the actual value of terminology.
Your response, Part II: "Moreover, the entropy of other systems need not go to zero if the system is in a non-equilibrium state. Glasses are examples. A glass is not at thermal equilibrium and is disordered enough to have a higher entropy than the corresponding crystal. This does not go away as the temperature is decreased to zero, as the glass is caught in its metastable state and will not get out of it. (Well, maybe after a gazillion years it could tunnel into the crystalline state...)"
There is no physical/chemical/biological/etc. sense in looking for the "entropy of SYSTEMS"; the entropy notion makes sense for realistic PROCESSES only.
Your conclusion: "Anyway, keep in mind that the third law of thermodynamics holds for equilibrium systems only".
In turn, please keep in mind that 'equilibrium systems' ought to be DEAD and therefore not interesting for researchers, because in such systems all of the driving/livening forces (i.e., Actions) have been compensated/equilibrated by the pertinent Reactions/Counteractions.
It is this eternal/ubiquitous Action-Reaction Interplay that IS DRIVING TIME.
It's merely a matter of convention and calculation. The math would be easier if entropy and temperature were proportional rather than merely linearly related. The physical interpretation is questionable and is an indication of a lack of understanding, which is part of the mystery of entropy. All bodies are in motion relative to the rest of the universe.
"There is no physical/chemical/biological/etc. sense to look for "entropy of SYSTEMS", the entropy notion has sense for realistic PROCESSES only."
No. That is wrong.
I have taught Thermodynamics and Statistical Mechanics for 25 years. Not every year, but I think I went through the full course five or six times. When I finally wrote up my course into a script (not to become a book but as a convenience for the students), I became more accurate and painstakingly meticulous in order not to write up errors "for eternity". Anyway, I did discover some errors in textbooks that one author had reproduced from another by not thinking carefully and precisely. This usually did not make the outcome incorrect; these were just errors in the argument (that could be rectified and were, in my script).
I have very clear ideas about what entropy is. It is not a notion for processes as you say. It is a notion referring to thermodynamic states. Therefore, it shares with other state variables (such as free energy, pressure, temperature) the property of not depending on the path by which the final state is reached from the initial one. Other quantities, such as work and heat are process dependent and therefore cannot describe a state.
In thermodynamics, entropy is a state variable, the change of which is given by the heat added (or subtracted) reversibly from the system, divided by the temperature. This definition is applicable to systems having a homogeneous temperature. If that is not the case, the system must be divided into (small but macroscopic) pieces that have homogeneous temperature and the total entropy change is the sum of the entropy changes of these pieces. Entropy is an extensive quantity, i.e. proportional to the amount of material of the system. Clearly, it is a property of the material, not of the process applied to get the material into its state. (For irreversible processes, its change must be computed via a reversible surrogate process.)
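A small worked example of this state-function property (a sketch assuming a constant heat capacity; the numerical value of C is merely illustrative, roughly that of liquid water): for reversible heating, ΔS = ∫ dQ_rev/T = C·ln(T_final/T_initial), which depends only on the end states and not on how many steps the heating takes.

```python
import math

def entropy_change_heating(heat_capacity: float, t_initial: float, t_final: float) -> float:
    """dS = integral of dQ_rev / T = C * ln(T_final / T_initial), with C assumed constant."""
    return heat_capacity * math.log(t_final / t_initial)

C = 75.3  # J/(mol*K), roughly the molar heat capacity of liquid water (illustrative)

# Heating one mole from 300 K to 350 K in one step or in two steps gives the
# same entropy change -- entropy is a state variable, independent of the path.
print(entropy_change_heating(C, 300.0, 350.0))    # ~11.6 J/K
print(entropy_change_heating(C, 300.0, 325.0)
      + entropy_change_heating(C, 325.0, 350.0))  # same value
```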
In statistical mechanics, entropy is a property of the ensemble describing a macrostate. A macrostate is characterized by having certain macroscopic properties and the ensemble consists of all microstates realizing that macrostate. Many properties of entropy can be discussed without quantum mechanics, so there is no particular problem of understanding entropy from that side. Macroscopic systems are typically self-averaging, i.e. their macroscopic properties can be described as averages over the corresponding properties of the microscopic ensemble. This way the entropy becomes again a property of the system, one of the quantities describing its state. It is defined and calculable regardless of the process that brought about the state. (You may use another process leading to the same final state to make the calculation of the entropy change between the initial and final states simpler.) So, in statistical mechanics, entropy also is a system property, not a process property. And, in fact, one can prove that the entropy defined in thermodynamics is the same quantity as the entropy defined in statistical mechanics.
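As a complement to the ensemble picture, the Gibbs entropy S = -k_B·Σ p_i·ln p_i of a two-level system in canonical equilibrium depends only on the temperature, i.e. on the state, and it interpolates between 0 at T = 0 and k_B·ln 2 at high temperature. A minimal sketch (the level spacing is an arbitrary assumption):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def two_level_entropy(delta_e: float, temperature: float) -> float:
    """Gibbs entropy S = -k_B * sum(p_i * ln p_i) for two levels at energies 0 and delta_e."""
    if temperature == 0.0:
        return 0.0  # only the ground state is occupied: a single microstate, so S = 0
    x = delta_e / (K_B * temperature)
    p_excited = 1.0 / (1.0 + math.exp(x))  # normalized Boltzmann weight of the upper level
    p_ground = 1.0 - p_excited
    return -K_B * sum(p * math.log(p) for p in (p_ground, p_excited))

DELTA_E = 1.0e-21  # J, arbitrary illustrative level spacing (about 72 K in temperature units)

for t in (0.0, 10.0, 100.0, 1.0e4):
    print(f"T = {t:8.1f} K  ->  S = {two_level_entropy(DELTA_E, t):.3e} J/K")
# S rises from 0 toward k_B*ln(2) ~ 9.6e-24 J/K as the two states become equally probable.
```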
'Your conclusion: "Anyway, keep in mind that the third law of thermodynamics holds for equilibrium systems only".
In turn, please keep in mind that 'equilibrium systems' ought to be DEAD and therefore not interesting for researchers, because in such systems all of the driving/livening forces (i.e., Actions) have been compensated/equilibrated by the pertinent Reactions/Counteractions.'
I could not disagree more. To understand nonequilibrium systems, you first have to understand equilibrium. The mathematical description of nonequilibrium systems is normally constructed from equilibrium states of its subsystems, so you will not make any progress in nonequilibrium statistics, if you are not capable or willing to deal with equilibrium ones.
Understanding the approach to equilibrium is one of the most important problems of statistical mechanics. Since a system that is left to its own devices, i.e. is not externally driven, will develop towards a well-defined equilibrium state (as we have learned from statistical mechanics), it is important to know what that state will be in order to assess the nonequilibrium deviations of the actual state from the expected final one, and to get an idea of the dynamical behaviour of the nonequilibrium system.
Moreover, note that the only nonequilibrium systems, for which there is a general theory comparable to statistical mechanics of equilibrium systems, are systems close to equilibrium. For these, we still have principles such as minimum entropy production. No generally valid principles are known for far-from-equilibrium systems. So equilibrium and close-to-equilibrium physics are far from being dead. Whenever we wish to understand a completely new system, we can only start from these, because they are the disciplines, in which we have general principles, expected to be valid also in a completely new area.
Many sincere thanks for your detailed and exhaustive response!
You are perfectly summarizing what you have found after having "taught Thermodynamics and Statistical Mechanics for 25 years..." and having gone "through the full course five or six times".
But there you have not found a number of important colleagues' names. This is definitely not any kind of rebuke to you, for your never having found those names is neither your fault nor your intent but, instead, a product of the ubiquitous human-relationship 'Entropy' counteracting the no less human striving for permanent cognisance.
What you learn from bestseller thermodynamics/statistical mechanics books is that Nicky Carnot was wrong, whereas Rudolf Clausius basically and duly corrected Nicky's conceptual errors by shedding light upon two basic natural laws:
1. Energy Conservation Law
2. Energy Transformation Law
It is just the above statement that is physically/chemically/biologically/etc. WRONG, because the actual basic fundamental law is THE SINGLE, UNIQUE and CONCEPTUALLY INSEPARABLE
Energy Conservation and Transformability Law
Rudolf Clausius did duly analyze Nicky's purely logical inferences, and even formalized them, not only by introducing the Entropy notion but also by revealing the essential mathematical feature of the latter, namely its ever tending to arrive at its maximum value... Howbeit, his separating the inseparable, that is, producing TWO FAKE LAWS instead of A SOLE, UNIQUE, FUNDAMENTAL and BASIC ONE, has produced a methodological bias.
Indeed, it became thoroughly clear that Entropy ought to be a basic and fundamental notion, but the question of its actual sense remained just a bullshit poser.
In France and Great Britain there were detractors of Clausius' stance, but let us stay in Germany.
The most consistent criticism came from Karl Friedrich Mohr, also spelled Carl Friedrich Mohr and known as Friedrich Mohr (born 4 November 1806 in Koblenz, died 28 September 1879 in Bonn), inter alia one of Clausius' colleagues at the University of Bonn.
To our sincere regret, Clausius no longer had time to correctly perceive Mohr's criticism, but one of Clausius' students, August Friedrich Horstmann (born 20 November 1842 in Mannheim, died 8 October 1929 in Heidelberg), not only bridged Clausius' conceptual gap but could also convey his findings to J. W. Gibbs, who was, inter alia, his student while in Heidelberg in 1869...
...The actual story is long and interesting, and there are many more names you will never find in bestseller thermodynamics/statistical mechanics books. But this is not the problem, for MANUSCRIPTS ARE NOT DEFLAGRABLE: 'Рукописи не горят' (manuscripts do not burn).
It is due to the latter basic, fundamental natural law that the Entropy Notion is fruitfully and seminally treatable not only IMPLICITLY (which is just what you are conveying, in fact) but EXPLICITLY as well (which is just what I am conveying).
And the true conclusion ought to be not to poison each other's lives, or at least not to shout "YOU ARE WRONG!" at each other, but to let our young, active and proactive colleagues find the proper ways to pursue research work with BOTH possibilities in mind...
The entropy of a substance is typically assumed to be zero at absolute zero temperature, as per the Third Law of Thermodynamics. This law states that a substance's entropy approaches zero as its temperature approaches zero. At absolute zero, atoms or molecules are in their lowest energy state, with minimal vibrational or positional disorder; as temperature increases, particles gain energy and move more randomly, increasing entropy. However, exactly zero entropy is unlikely for most substances, as quantum-mechanical effects prevent the elimination of all possible particle arrangements. In rare cases, such as Bose-Einstein condensates, the Third Law may not apply.
"The entropy of a substance is typically assumed to be zero at absolute zero temperature, as per the Third Law of Thermodynamics."
The entropy as such must MATHEMATICALLY be equal to ZERO at absolute zero temperature. This ought to be well known, for it was formally proven, and published, some 100 years ago. Hence, there is no room for any kind of 'assumptions'.
Physically/chemically/biologically/etc., the entropy of any realistic PROCESS is equal to zero IF AND ONLY IF the driving/livening force of the latter is equal to zero.
This is but not a separate law. This is a logical-mathematical consequence of two basic fundamental natural laws:
1. Energy Conservation and Transformability Law.
2. Hon. Isaac Newton's Third Basic Law of Mechanics.
The entropy notion is therefore applicable solely to REALISTIC PROCESSES, wherein the Energy Transformations take place according to the modalities dictated by Hon. Newton's Third Law.
It is also throughout possible to successfully employ the entropy notion IMPLICITLY. This has been done in QUANTUM/WAVE MECHANICS, which is a physical mathematics having metaphysical roots.
If the entropy of a substance is exactly zero, it would mean that the particles of the substance are perfectly ordered and there is no randomness or disorder in the system. This is only possible at absolute zero temperature. In practice, it is very difficult to achieve absolute zero temperature, and therefore, it is difficult to find a substance with zero entropy. However, if a substance could have exactly zero entropy at some temperature, it would mean that the system is in a state of maximum order, and no spontaneous changes can occur. This is known as a state of thermodynamic equilibrium.
Entropy is the measure of randomness; since the constituent particles are motionless, entropy is taken as zero at absolute zero. At absolute zero of temperature there is a completely orderly molecular arrangement in a crystalline substance, so there is no randomness at 0 K and the entropy is taken to be zero. The entropy of any pure crystal at absolute zero is zero, and the entropy of any other substance is greater than zero. By the 3rd law, entropy is zero at zero K: if we think about what temperature is, it is a measure of the internal, randomized energy in a material, and absolute zero is the point at which thermal motions stop entirely.
1. "At absolute zero of temperature, there is complete orderly molecular arrangement in the crystalline substance."
Nobody will ever learn what happens at absolute zero of temperature, for the latter is PRACTICALLY UNREACHABLE. This is NOT a BASIC LAW, but a logical-mathematical consequence of the ONE, UNIQUE BASIC LAW:
ENERGY CONSERVATION and TRANSFORMABILITY LAW
This is the ONLY General Fundamental Basic Natural Law the Energy Notion obeys.
But it is just the stubborn attempts to multiply the actual number of General Fundamental Basic Natural Laws in regard to the Energy notion that ought to be the actual measure of randomness, a randomness that is not 'magic natural randomness' but solely randomness in our brains due to sheer ignorance.
There are two efficient ways to lift the above-mentioned ignorance; the first ought to be using probability-theoretical and mathematical-statistical tools.
This is just where Quantum Mechanics results from: starting from pure metaphysical voluntarism, revolutionary colleagues created a valid and seminal physical-mathematical framework, whereas multiplying the actual number of General Fundamental Basic Natural Laws in relation to the Energy notion, as well as treating the Entropy notion IMPLICITLY, represents all the methodological essentials of such an approach.
2. The second way to efficiently lift the above-mentioned ignorance would be to treat the Energy and Entropy notions EXPLICITLY, using mathematical physics rather than physical mathematics as in quantum/wave mechanics.