Hello all:

Whenever we talk about entropy, we can open a few interesting and intriguing discussions ...

We know there are several entropies in nature, i.e. not just different ways of defining the concept of entropy, but genuinely different concepts, depending on the area of physics you are studying. For example:

There is the thermodynamic entropy,

the statistical entropy,

the Shannon entropy,

the von Neumann entropy,

the black-hole entropy, etc.

Some entropies belong to the same realm of physics (and now we need to include information theory as well). For example, the statistical entropy S = k_B ln(W) is the Boltzmann entropy, which in turn can be obtained as a special case of the Gibbs entropy S = -k_B Σ_i p_i ln(p_i) (please correct me if I got this last sentence wrong).
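To make that reduction concrete, here is a minimal numerical sketch (my own illustration, not part of the original question): for a uniform distribution over W microstates, p_i = 1/W, the Gibbs formula collapses to the Boltzmann one.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p_i * ln p_i) over microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_entropy(W):
    """Boltzmann entropy S = k_B * ln W for W equally likely microstates."""
    return K_B * math.log(W)

W = 1000
uniform = [1.0 / W] * W  # equal a priori probabilities, p_i = 1/W

# For a uniform distribution the two formulas agree
print(gibbs_entropy(uniform))
print(boltzmann_entropy(W))
```

The agreement is exact in the equal-probability case; for non-uniform distributions the Gibbs entropy is the more general expression.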

But other entropies are found in fields of physics that seem totally separate from each other (such as the ones in the list above).
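That said, some of these "separate" entropies overlap more than one might expect: the von Neumann entropy -Tr(ρ log ρ) equals the Shannon entropy of the eigenvalues of the density matrix ρ, so for a classical (diagonal) mixture the two coincide. A minimal sketch in Python (the specific probabilities are just an illustrative example):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2 p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A diagonal density matrix rho = diag(p) describes a classical mixture.
# Its von Neumann entropy -Tr(rho log2 rho) is the Shannon entropy of its
# eigenvalues, which for a diagonal matrix are just the diagonal entries.
eigenvalues = [0.5, 0.25, 0.25]
print(shannon_entropy(eigenvalues))  # 1.5 bits
```

For a non-diagonal ρ one would first diagonalize and then apply the same formula to the eigenvalues, which is exactly why the von Neumann entropy is often described as the quantum generalization of the Shannon entropy.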

So my question is:

Do you think there could be one fundamental entropy from which all the others emerge?

A single general definition of entropy from which all the others can be derived?

Or, in the end, would it be impossible to explain all the entropies (from the different branches of physics) with one general formula?

Kind regards!
