Entropy represents the randomness of the molecules and is a defining characteristic of the system. Ideal or hypothetical systems can be imagined without friction, i.e., with zero irreversibility. On the contrary, real systems always have friction, which leads to irreversibility and generates entropy in the system. Hence, for any system, a second-law analysis (entropy analysis or exergy analysis) provides a better judgement of the system's performance than a first-law analysis.
In fact, in macroscopic thermodynamics, i.e. when matter is treated as a continuum, entropy does not have a trivial meaning. Dimensionally, it is a heat divided by a temperature. It is an extensive quantity whose time variation indicates the presence of dissipative (irreversible) phenomena: dS = \delta Q / T + dS_{irr}, and the second law imposes dS_{irr} \ge 0.
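To make the inequality concrete, here is a minimal sketch in Python (with illustrative reservoir temperatures and heat quantity that are not in the answer above) of the entropy generated when heat flows irreversibly from a hot to a cold reservoir:

# Entropy generated when heat Q flows from a hot reservoir at T_hot
# to a cold reservoir at T_cold (temperatures in kelvin, Q in joules).
# The numerical values are assumptions chosen for illustration.
Q = 1000.0        # J of heat transferred
T_hot = 500.0     # K, hot reservoir
T_cold = 300.0    # K, cold reservoir

dS_hot = -Q / T_hot          # hot reservoir loses entropy
dS_cold = Q / T_cold         # cold reservoir gains entropy
S_irr = dS_hot + dS_cold     # net entropy generation of the two reservoirs

print(f"S_irr = {S_irr:.3f} J/K")   # positive, as the second law requires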
If you really want to understand what it means, you must turn to microscopic thermodynamics and the fundamental principles of statistical physics. There, at equilibrium, the entropy is conveniently defined: it is the quantity that is maximized under the constraint of the available (statistical) information. In other, more intelligible words, at equilibrium all the possible states i, with probabilities p_i, of the elements composing the discrete distribution of matter in a given volume are equiprobable. The entropy is maximized in this condition, and its definition appears directly when you consider that it "measures" the quantity of information I_i = -\log(p_i): S = -\sum_i p_i \log(p_i). At equilibrium p_i = p (equiprobability), and S is maximal.
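A small Python sketch (with a made-up four-state system, purely for illustration) shows how S = -\sum_i p_i \log(p_i) behaves and why the uniform distribution gives the maximum:

import math

def entropy(p):
    """Shannon/Gibbs entropy S = -sum_i p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Hypothetical 4-state system: a skewed distribution versus equiprobability.
skewed = [0.7, 0.1, 0.1, 0.1]
uniform = [0.25, 0.25, 0.25, 0.25]

print(entropy(skewed))   # ~0.94 nats
print(entropy(uniform))  # log(4) ~ 1.39 nats, the maximum for 4 states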
Relating macro and micro thermodynamics now, the meaning is the same: at equilibrium there are no dissipative phenomena, dS_{irr} = 0, and S reaches its maximal value.
In other words, the more dissipative phenomena there are, the farther you are from the state of equilibrium (defined by maximal microscopic disorder) for which the entropy is maximal.
Its interest lies in the fact that, in continuum mechanics, a budget equation for the entropy density can be written for an elementary volume. The source term of this budget equation contains the rates of all the dissipative phenomena (flow viscosity, irreversible chemistry, conductive and radiative heat transfer, electromagnetic interaction, and so on).
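One common form of such a budget equation (written here as an illustration under standard continuum-mechanics assumptions, with density \rho, specific entropy s, velocity \mathbf{u}, heat flux \mathbf{q} and temperature T; the answer above does not give it explicitly) is

\frac{\partial(\rho s)}{\partial t} + \nabla \cdot \left( \rho s \, \mathbf{u} + \frac{\mathbf{q}}{T} \right) = \sigma_s, \qquad \sigma_s \ge 0,

where the source term \sigma_s collects the viscous, chemical, thermal, radiative and other dissipative contributions.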
Thermodynamics: An Engineering Approach by Yunus A. Cengel and Michael A. Boles is a good book to refer to. It gives an in-depth understanding of entropy with simple practical examples.
Entropy (or thermal charge, as it is sometimes called) is an extensive (as opposed to intensive) system property expressing the system's microscopic randomness, or our inability to determine exactly the microscopic state of the system at any one time when its macroscopic state is given.
For an isolated system in equilibrium, when its macrostate (or macroscopic state) is constant, its microstate will nonetheless be continuously changing between its possible quantum states. The entropy of a system that can exist in only one state is zero, because its state is then fully determined. But if the system is equally likely to be found in any one of its possible states, its state at any one time cannot be determined and its entropy is at a maximum.
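This can be stated compactly with Boltzmann's relation (added here as a standard result, not quoted in the answer above): for \Omega equally probable microstates,

S = k_B \ln \Omega,

so a system with a single accessible state (\Omega = 1) has S = 0, and S grows as the number of equiprobable states grows.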
According to the second law of thermodynamics, any spontaneous change is accompanied by an entropy increase and a decrease in available energy. If the energy of the world is constant, its available energy will therefore continue to decrease. This will eventually result, as some believe, in the heat death, when all change ceases and the entropy is at a maximum, or as Rudolf Clausius put it:
Die Energie der Welt ist constant,
Die Entropie strebt einem Maximum zu.
("The energy of the world is constant; the entropy strives toward a maximum.")
This continuous increase of entropy might be related to the continuous flow of time. Entropy decrease and time travel to the past are both impossible for an isolated system. When a glass is broken and its milk content is spilt, entropy increases. For the reverse process to occur we would have either to go back in time or to decrease the entropy, which is possible, if at all, only through the expenditure of available energy.
A boiled egg is a simple example for entropy: the outer shell has the minimum temperature, while inside the shell, toward the yellow yolk, the temperature is at its maximum.
Entropy is defined as the quantitative measure of disorder or randomness in a system. The concept comes out of thermodynamics, which deals with the transfer of heat energy within a system. Entropy measures the system's thermal energy per unit temperature that is unavailable for doing useful work. Because work is obtained from ordered molecular motion, entropy is also a measure of the molecular disorder, or randomness, of a system: the higher the entropy, the greater the disorder at the atomic, ionic, or molecular level. Several factors affect the amount of entropy in a system. (1) If you increase temperature, you increase entropy: more energy put into a system excites the molecules and increases the amount of random activity. (2) As a gas expands in a system, entropy increases.
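As a rough numerical illustration of point (2) (not from the answer above, and assuming ideal-gas behaviour), the entropy change of n moles of ideal gas expanding isothermally from V1 to V2 is \Delta S = n R \ln(V_2/V_1):

import math

R = 8.314  # J/(mol*K), universal gas constant

def delta_S_isothermal_expansion(n, V1, V2):
    """Entropy change of n moles of ideal gas expanding isothermally from V1 to V2 (m^3)."""
    return n * R * math.log(V2 / V1)

# One mole doubling its volume gains about 5.76 J/K of entropy.
print(delta_S_isothermal_expansion(n=1.0, V1=0.010, V2=0.020))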
Entropy tells us about the flow and distribution of energy. If the entropy decreases, then that process cannot occur spontaneously; some work or energy has to be pumped in for the process to occur. It is important to understand what entropy is and why it always increases.
Changes in temperature lead to changes in entropy. The higher the temperature, the more thermal energy the system has; the more thermal energy the system has, the more ways there are to distribute that energy; and the more ways there are to distribute that energy, the higher the entropy (this is a general concept).
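A short Python sketch of this temperature effect (the heat capacity and temperatures are assumptions for illustration, not values from the answer above), using \Delta S = \int C \, dT/T = C \ln(T_2/T_1) for a constant heat capacity:

import math

C = 4184.0              # J/K, heat capacity of roughly 1 kg of water (assumed constant)
T1, T2 = 300.0, 350.0   # K, initial and final temperatures

delta_S = C * math.log(T2 / T1)
print(f"delta_S = {delta_S:.1f} J/K")  # positive: heating raises the entropy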
The main concern is entropy as a measure of the unavailability of energy to do work.