The second law leads to the definition of a new property called entropy. The Clausius inequality states that the cyclic integral of δQ/T is always less than or equal to zero, ∮ δQ/T ≤ 0, and it is valid for all cycles, reversible or irreversible. This inequality forms the basis for the definition of entropy in thermodynamics.

In information theory, entropy plays an analogous role as a measure of disorder. For a binary variable, entropy reaches its maximum of 1 bit when the probability is 1/2; in the general case of c classes, the maximum value of entropy is log₂ c, attained when every class is equally likely, i.e., each has probability 1/c. For two classes, entropy is therefore measured between 0 and 1. Depending on the number of classes in your dataset, entropy can be greater than 1, but it means the same thing: a very high level of disorder.
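To make this concrete, here is a minimal sketch in Python (the function name and the use of NumPy are illustrative choices, not from the original text) that evaluates the binary entropy H(p) = −p log₂ p − (1 − p) log₂(1 − p) and shows that it peaks at p = 1/2:

```python
import numpy as np

def binary_entropy(p):
    """Shannon entropy of a Bernoulli(p) variable, in bits."""
    # By convention 0 * log2(0) = 0, so handle the endpoints directly.
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Entropy is 0 at p = 0 and p = 1 (no uncertainty) and
# reaches its maximum of 1 bit at p = 1/2.
for p in [0.0, 0.1, 0.5, 0.9, 1.0]:
    print(f"H({p}) = {binary_entropy(p):.3f} bits")
```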
As an impurity measure for a two-class problem, entropy can only take values from 0 to 1 as the probability ranges from 0 to 1. More formally, suppose X is a random variable with range {a₁, a₂, …, aₙ}; i.e., it can take on n different values. Then the maximum possible value of H(X) is log₂ n. Notice that this maximum entropy of log₂ n occurs when we have the uniform distribution, i.e., p_X(aᵢ) = 1/n for all i.
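As a brief illustration (the helper function below is hypothetical, not from the original text), the general formula H(X) = −Σᵢ p_X(aᵢ) log₂ p_X(aᵢ) can be evaluated for any distribution and compared against the log₂ n bound:

```python
import numpy as np

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    probs = np.asarray(probs, dtype=float)
    nonzero = probs[probs > 0]          # 0 * log2(0) contributes 0
    return float(-np.sum(nonzero * np.log2(nonzero)))

n = 4
uniform = np.full(n, 1 / n)             # p_X(a_i) = 1/n for all i
skewed  = np.array([0.7, 0.1, 0.1, 0.1])

print(entropy(uniform), np.log2(n))     # both 2.0: the bound is attained
print(entropy(skewed))                  # ≈ 1.357 < log2(4) = 2
```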