If you mean thermodynamic entropy (there are other forms and definitions as well, e.g. optical entropy), you can say that entropy is a measure of the number of microscopic thermodynamic states of a system that are accessible within its phase space.
Or, as a sloppier definition: it can be seen as the degree of disorder in the system.
Entropy is an important thermodynamic state function used to quantify the degree of disorder, and it is what limits the amount of thermal energy that can be converted into mechanical work. Recall the first law of thermodynamics, conservation of energy: it accounts for the conversion of thermal energy into mechanical energy, but it says nothing about how much can be converted, i.e. the limit. Entropy provides the answer to that question. For a system to be in its most stable (equilibrium) state, its entropy must be at a maximum. For a reversible process, dS = δQ/T.
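To make that limit concrete, here is a standard entropy-balance sketch (an illustration added here, not part of the original answer): a cyclic engine draws heat Q_h from a reservoir at temperature T_h, rejects heat Q_c to a reservoir at T_c, and delivers work W = Q_h - Q_c by the first law. Over one cycle the engine's own entropy is unchanged, so the second law requires the reservoirs' total entropy change to be non-negative:

$$\Delta S_\text{total} = -\frac{Q_h}{T_h} + \frac{Q_c}{T_c} \;\ge\; 0 \quad\Longrightarrow\quad \eta = \frac{W}{Q_h} = 1 - \frac{Q_c}{Q_h} \;\le\; 1 - \frac{T_c}{T_h}.$$

This is the Carnot bound: the fraction of heat convertible to work is capped by the reservoir temperatures, which is exactly the limit the first law alone cannot supply.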
The entropy is a measure of the number of microscopic states that can give rise to an observable macroscopic state. The microstate might consist of a geometric arrangement of molecules in three dimensions together with each of the molecules' velocities. The macrostate could be the system's pressure and temperature, given its volume. The big breakthrough in 19th century statistical physics was (1) figuring out how to count the microstates, and (2) relating this number to the macroscopic entropy that had been inferred from thermodynamic measurements on macroscopic systems.
A key observation that made this breakthrough possible is that every microstate corresponds to one specific macrostate, but many distinct microstates can correspond to the same macrostate. Boltzmann realized that we are most likely to see the macrostate that corresponds to the largest number of microstates, given the other fixed parameters in the system. This is true for the same reason that rolling a pair of six-sided dice is more likely to yield 7 than 2.
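The dice analogy can be checked in a few lines of Python (an illustrative sketch added here, not part of the original answer): each ordered pair of faces is a microstate, the sum is the macrostate, and we simply count microstates per macrostate.

```python
from itertools import product
from collections import Counter

# Microstates: ordered outcomes (d1, d2) of two six-sided dice.
# Macrostate: the sum d1 + d2. Count microstates per macrostate.
counts = Counter(d1 + d2 for d1, d2 in product(range(1, 7), repeat=2))

for total in sorted(counts):
    print(f"sum {total:2d}: {counts[total]} microstate(s)")

# Sum 7 is realized by 6 microstates, sum 2 by only 1,
# so 7 is six times as likely as 2.
assert counts[7] == 6 and counts[2] == 1
```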
The entropy S of a macrostate is defined to be proportional to the logarithm of the number N of possible contributing microstates: S = k_B log N. Why a logarithm? Why not just use the number of microstates? This is because we want the entropy to be an extensive quantity. When we bring two systems into thermal contact, we want the entropy to be the sum of the component systems' entropies, not the product.
To see this, consider two systems, A and B, with N_A and N_B microstates respectively. Once these are brought into contact, the number of microstates in the combined system is N = N_A N_B, which is a product. The logarithm of this product has the property we want: log N = log N_A + log N_B.
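A quick numerical check of that additivity, using made-up microstate counts (a sketch, not part of the original answer):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_microstates: float) -> float:
    """Boltzmann entropy S = k_B * log(N) for N microstates."""
    return k_B * math.log(n_microstates)

N_A, N_B = 1e20, 3e24            # hypothetical microstate counts
S_combined = entropy(N_A * N_B)  # combined system has N = N_A * N_B states
S_sum = entropy(N_A) + entropy(N_B)

# log(N_A * N_B) = log(N_A) + log(N_B): the product of counts
# becomes a sum of entropies, so entropy is extensive.
assert math.isclose(S_combined, S_sum)
```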
The most probable macrostate is the one with the largest entropy, by construction.
It's interesting to note that the most probable macrostate need not be the most disordered. An ergodic "gas" of monodisperse hard spheres crystallizes when the spheres fill 50 percent of space, because the crystalline configuration has higher entropy than any disordered configuration at the same volume fraction. This is quite remarkable because the spheres only become close-packed when their volume fraction reaches 74 percent. Entropically driven crystallization of hard spheres is well documented in experiments on colloidal particles.
As I understand it, entropy is the disorder of a system, or a measure of the energy a system loses as unavailable for useful work. As disorder increases, that lost quantity increases, and the entropy increases with it.