One of Murphy's famous laws states that “left to themselves, things will always go from bad to worse.” This humorous prediction is, in a way, echoed in the second law of thermodynamics. That law deals with the concept of entropy. Stated simply, entropy is a measure of the disorder of a system; it quantifies the amount of randomness or uncertainty in the system's state. There is not one but a whole family of entropies, and some of them play key roles in different areas of quantum information theory.
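To make the idea of quantifying randomness concrete, the sketch below (an illustrative aside, not part of the text's formal development) computes two members of this family: the Shannon entropy of a classical probability distribution and the von Neumann entropy of a quantum density matrix, both with base-2 logarithms so the result is in bits.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # discard numerically zero eigenvalues
    return float(-np.sum(eigvals * np.log2(eigvals)))

# A fair coin is maximally uncertain: H = 1 bit.
print(shannon_entropy([0.5, 0.5]))                                    # -> 1.0
# A pure quantum state carries no randomness: S = 0.
print(von_neumann_entropy(np.array([[1, 0], [0, 0]], dtype=float)))   # -> 0.0
# The maximally mixed qubit state is completely random: S = 1 bit.
print(von_neumann_entropy(np.eye(2) / 2))                             # -> 1.0
```

The three test cases reflect the intuition in the paragraph above: the more disordered (mixed, random) the state, the larger its entropy.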