09 September 2015

Mathematically, a system’s information content can be quantified by the so-called information entropy H, introduced by Claude Shannon in 1948. The larger the information entropy, the greater the information content.1 Consider the simplest possible information-storage device: a system with two distinct states—for example, up and down, left and right, or magnetized and unmagnetized. If the system is known with certainty to be in a particular state, then no new information can be gained by probing the system, and the information entropy is zero. An interesting question, then, is whether the thermodynamic consequences of the second law extend to information. Is it possible to extract useful mechanical work from a system just by observing its state? If so, how much? And at a more fundamental level, are the thermodynamic and information entropies related?
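As a rough illustration (not part of the quoted article), the two-state example can be made concrete with a short Python sketch. It evaluates Shannon's entropy H = -Σ p_i log2(p_i) for a system known with certainty to be in one state (H = 0) and for a maximally uncertain two-state system (H = 1 bit), and, for context, prints the Landauer/Szilard energy scale k_B T ln 2 usually quoted when work extraction from one bit is discussed. The probabilities and the temperature are illustrative assumptions.

    import math

    def shannon_entropy(probabilities, base=2):
        """Shannon information entropy H = -sum_i p_i * log(p_i); in bits when base=2."""
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    # Two-state system (e.g., spin up/down) known with certainty to be "up":
    print(shannon_entropy([1.0, 0.0]))   # 0.0 bits -> probing yields no new information

    # Maximally uncertain two-state system:
    print(shannon_entropy([0.5, 0.5]))   # 1.0 bit -> one full bit of information

    # Thermodynamic link (Szilard/Landauer picture): acquiring or erasing one bit
    # is associated with k_B * T * ln(2) of energy.
    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed room temperature, K
    print(k_B * T * math.log(2))  # ~2.9e-21 J per bit

The numbers make the article's point explicit: the entropy vanishes exactly when the state is certain, and the maximum of one bit is what a Szilard-type engine could in principle convert into at most k_B T ln 2 of work.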

Are there any information particles? Are there any information bosons that carry the information? Is information fundamental?

http://scitation.aip.org/content/aip/magazine/physicstoday/article/68/9/10.1063/PT.3.2912
