In information theory, the entropy of a random variable quantifies the amount of information the variable carries. One way to understand the amount of information is to tie it to how difficult or easy it is to guess the variable's value: the easier it is to guess, the less “surprise” the variable holds and so the less information it contains.

The Rényi entropy of order q is defined for q ≥ 0, q ≠ 1, by the equation

S_q = (1/(1 − q)) · log(Σ_i p_i^q)

http://www.tina-vision.net/docs/memos/2004-004.pdf
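Below is a minimal sketch (my own illustration, not part of the question) of how this formula can be evaluated numerically; the function name renyi_entropy and the choice of base-2 logarithms are assumptions made here for clarity. The q = 1 case is handled as the Shannon-entropy limit.

```python
import numpy as np

def renyi_entropy(p, q, base=2):
    """Rényi entropy of order q for a probability vector p (bits by default)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # ignore zero-probability outcomes
    if np.isclose(q, 1.0):            # q = 1 is the Shannon-entropy limit
        return -np.sum(p * np.log(p)) / np.log(base)
    # S_q = (1/(1-q)) * log(sum_i p_i^q)
    return np.log(np.sum(p ** q)) / ((1.0 - q) * np.log(base))
```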

As the order q increases, the entropy is non-increasing: higher orders weight the most probable outcomes more heavily, so the value decreases (or stays the same) and, in the limit q → ∞, approaches the min-entropy −log(max_i p_i).
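A quick check of this behaviour, using the hypothetical renyi_entropy sketch above on an example distribution p = [0.5, 0.25, 0.25]:

```python
p = [0.5, 0.25, 0.25]
for q in [0.5, 1.0, 2.0, 10.0]:
    print(q, round(renyi_entropy(p, q), 3))
# q = 0.5 -> ~1.543 bits, q = 1 -> 1.5 bits (Shannon), q = 2 -> ~1.415 bits,
# and for large q the value approaches -log2(0.5) = 1 bit (min-entropy).
```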

Why are we concerned with higher orders? What is the physical significance of the order when calculating the entropy?
