Dynamical Systems, Markov Chains, Information Theory, and Statistical Mechanics
Energy and entropy seem like very profound concepts, applicable to many different kinds of situations. In the book Nonlinear Dynamics and Chaos by Strogatz I saw that in continuous dynamical systems, stable fixed points can be seen as local minima of a potential well through a simple transformation (sketched below). Leonard Susskind's statistical mechanics class on YouTube starts very nicely, showing a sort of Markov chain whose transitions all have probability 1 to illustrate the concept of entropy (a toy version of this is sketched below as well).
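To make the first point concrete, here is the transformation as I understand it (Strogatz covers it in the section on potentials, if I remember correctly): for a one-dimensional system $\dot{x} = f(x)$, define a potential $V$ by

$$\dot{x} = -\frac{dV}{dx}, \qquad V(x) = -\int^{x} f(s)\,ds.$$

Along any trajectory,

$$\frac{dV}{dt} = \frac{dV}{dx}\frac{dx}{dt} = -\left(\frac{dV}{dx}\right)^{2} \le 0,$$

so $V$ never increases, and the stable fixed points of the flow sit at local minima of $V$, like a particle settling to the bottom of a well.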
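For the second point, here is a minimal sketch in Python of what I mean (the 4-state cycle and the initial distribution are my own toy choices, not Susskind's exact example): when every transition has probability 1 and the law is reversible (a permutation of the states), the Shannon entropy of a distribution over the states is conserved step after step.

```python
import numpy as np

def shannon_entropy(p):
    """S = -sum_i p_i * log2(p_i), skipping zero-probability states."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

n = 4
# Deterministic "Markov chain": state i jumps to state succ[i] with probability 1.
succ = np.array([1, 2, 3, 0])          # a 4-cycle, i.e. a reversible (bijective) law
P = np.zeros((n, n))
P[succ, np.arange(n)] = 1.0            # column-stochastic permutation matrix

p = np.array([0.5, 0.25, 0.125, 0.125])  # arbitrary initial distribution over states
for step in range(5):
    print(step, shannon_entropy(p))      # prints 1.75 bits at every step
    p = P @ p                            # one deterministic transition

# By contrast, a many-to-one law (say succ = [0, 0, 3, 3]) merges distinct
# states, so the dynamics cannot be run backwards: that is the kind of
# "information-destroying" law the lectures rule out.
```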
All these topics seem to be profoundly connected, but historically they have been developed by different scientific communities. Are there introductory texts that try to consolidate them?