1872 – Ludwig Boltzmann presents his H-theorem, and with it the formula Σ pᵢ log pᵢ for the entropy of a single gas particle.
1878 – J. Willard Gibbs defines the Gibbs entropy: the probabilities in the entropy formula are now taken as probabilities of the state of the whole system.
1924 – Harry Nyquist discusses quantifying "intelligence" and the speed at which it can be transmitted by a communication system.
1927 – John von Neumann defines the von Neumann entropy, extending the Gibbs entropy to quantum mechanics.
1928 – Ralph Hartley introduces Hartley information as the logarithm of the number of possible messages, with information being communicated when the receiver can distinguish one sequence of symbols from any other (regardless of any associated meaning).
1929 – Leó Szilárd analyses Maxwell's Demon, showing how a Szilard engine can sometimes transform information into the extraction of useful work.
1940 – Alan Turing introduces the deciban as a measure of information inferred about the German Enigma machine cypher settings by the Banburismus process.
1944 – Claude Shannon's theory of information is substantially complete.
1947 – Richard W. Hamming invents Hamming codes for error detection and correction (a Hamming(7,4) sketch follows the timeline). For patent reasons, the result is not published until 1950.
1948 – Claude E. Shannon publishes A Mathematical Theory of Communication.
1949 – Claude E. Shannon publishes Communication in the Presence of Noise – Nyquist–Shannon sampling theorem and Shannon–Hartley law.
1949 – Claude E. Shannon's Communication Theory of Secrecy Systems is declassified.
1949 – Robert M. Fano publishes Transmission of Information (M.I.T. Press, Cambridge, Mass.) – Shannon–Fano coding.
1949 – Leon G. Kraft discovers Kraft's inequality, which shows the limits of prefix codes (a worked check follows the timeline).
1949 – Marcel J. E. Golay introduces Golay codes for forward error correction.
1951 – Solomon Kullback and Richard Leibler introduce the Kullback–Leibler divergence.
1951 – David A. Huffman invents Huffman encoding, a method of finding optimal prefix codes for lossless data compression (a sketch follows the timeline).
1953 – August Albert Sardinas and George W. Patterson devise the Sardinas–Patterson algorithm, a procedure for deciding whether a given variable-length code is uniquely decodable.
1954 – Irving S. Reed and David E. Muller propose Reed–Muller codes.
1955 – Peter Elias introduces convolutional codes.
1957 – Eugene Prange first discusses cyclic codes.
1959 – Alexis Hocquenghem, and independently the next year Raj Chandra Bose and Dwijendra Kumar Ray-Chaudhuri, discover BCH codes.
1960 – Irving S. Reed and Gustave Solomon propose Reed–Solomon codes.
1962 – Robert G. Gallager proposes low-density parity-check (LDPC) codes; they go unused for 30 years due to technical limitations.
1965 – Dave Forney discusses concatenated codes.
1967 – Andrew Viterbi reveals the Viterbi algorithm, making decoding of convolutional codes practicable.
1968 – Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year.
1968 – Chris Wallace and David M. Boulton publish the first of many papers on minimum message length (MML) statistical and inductive inference.
1972 – J. Justesen proposes Justesen codes, an improvement on Reed–Solomon codes.
1973 – David Slepian and Jack Wolf discover and prove the Slepian–Wolf coding limits for distributed source coding.[1]
1974 – George H. Walther and Harold F. O'Neil, Jr., conduct the first empirical study of satisfaction factors in the user–computer interface.[2]
1976 – Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to a raising of analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s.
1976 – R. Pasco and Jorma J. Rissanen develop effective arithmetic coding techniques.
1977 – Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77); a toy parser follows the timeline.
1989 – Phil Katz publishes the .zip format including DEFLATE (LZ77 + Huffman coding); it later becomes the most widely used archive container, and DEFLATE the most widely used lossless compression algorithm.
1993 – Claude Berrou, Alain Glavieux and Punya Thitimajshima introduce turbo codes.
1994 – Michael Burrows and David Wheeler publish the Burrows–Wheeler transform, later to find use in bzip2 (a sketch follows the timeline).
1995 – Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem.
2001 – Sam Kwong and Yu Fan Ho propose statistical Lempel–Ziv coding.
2008 – Erdal Arıkan introduces polar codes, the first practical construction of codes that achieve capacity for a wide array of channels.
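A few of the techniques above are simple enough to sketch in code. First, the idea behind Hamming's 1947 construction, in its classic Hamming(7,4) form: three parity bits at the power-of-two positions protect four data bits, and the recomputed parities (the syndrome) spell out the position of any single-bit error. A minimal sketch, not Hamming's original presentation; the function names are illustrative.

```python
# Hamming(7,4): 4 data bits -> 7 code bits, correcting any single-bit error.
# Positions are 1-indexed so the parity bits sit at positions 1, 2 and 4.

def hamming74_encode(data):
    """data: list of 4 bits -> list of 7 code bits (positions 1..7)."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4              # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4              # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4              # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(code):
    """Recompute the parities; the syndrome is the 1-indexed error position (0 = clean)."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1       # flip the bit the syndrome points at
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
sent = hamming74_encode(word)
sent[5] ^= 1                        # corrupt one bit in transit
assert hamming74_decode(sent) == word
```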
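Kraft's inequality (1949) makes the "limits of prefix codes" precise: a binary prefix code with codeword lengths ℓ₁, …, ℓₙ exists if and only if Σᵢ 2^(−ℓᵢ) ≤ 1. A worked check of both directions, with illustrative length sets:

```python
from fractions import Fraction

def kraft_sum(lengths, r=2):
    """Sum of r**(-l) over the codeword lengths; a prefix code with these
    lengths exists over an r-symbol alphabet iff the sum is at most 1."""
    return sum(Fraction(1, r ** l) for l in lengths)

# The binary prefix code {0, 10, 110, 111} has lengths 1, 2, 3, 3:
assert kraft_sum([1, 2, 3, 3]) == 1
# No binary prefix code can have two length-1 and two length-2 codewords:
assert kraft_sum([1, 1, 2, 2]) > 1
```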
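Huffman's procedure builds an optimal prefix code by repeatedly merging the two lowest-weight subtrees. A compact sketch using Python's heapq; the tie counter exists only to keep heap comparisons well defined, and the example weights are illustrative:

```python
import heapq

def huffman_code(freqs):
    """freqs: dict mapping symbol -> weight. Returns dict symbol -> codeword.
    Repeatedly merging the two lightest subtrees yields an optimal prefix code."""
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}     # left subtree gets prefix 0
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

# Illustrative example weights:
print(huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
# e.g. {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}
```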
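LZ77 replaces repeated substrings with back-references into a sliding window of recent text. A toy greedy parser emitting (offset, length, next-char) triples; real implementations bound and index the window far more cleverly, so this is a sketch of the idea only:

```python
def lz77_compress(text, window=255):
    """Greedy LZ77 parse: emit (offset, length, next_char) triples.
    The window size 255 is an illustrative choice."""
    out, i = [], 0
    while i < len(text):
        best_off, best_len = 0, 0
        start = max(0, i - window)
        for off in range(1, i - start + 1):
            length = 0
            # Matches may overrun the cursor, giving run-length-style copies;
            # always leave one character to emit as the literal.
            while i + length < len(text) - 1 and text[i + length - off] == text[i + length]:
                length += 1
            if length > best_len:
                best_off, best_len = off, length
        out.append((best_off, best_len, text[i + best_len]))
        i += best_len + 1
    return out

def lz77_decompress(triples):
    buf = []
    for off, length, ch in triples:
        for _ in range(length):
            buf.append(buf[-off])   # copy from `off` positions back
        buf.append(ch)
    return "".join(buf)

msg = "abracadabra abracadabra"
assert lz77_decompress(lz77_compress(msg)) == msg
```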
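And the Burrows–Wheeler transform itself: sort all rotations of the input and keep the last column, which clusters equal characters and makes the text easier for a downstream compressor (as in bzip2) to squeeze. A naive quadratic sketch; the '\0' sentinel is an assumption here, chosen to make the inverse unambiguous:

```python
def bwt(s, sentinel="\0"):
    """Last column of the sorted rotations of s + sentinel.
    The sentinel is assumed not to occur in s."""
    s += sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def ibwt(last, sentinel="\0"):
    """Invert by repeatedly prepending the last column and re-sorting."""
    table = [""] * len(last)
    for _ in range(len(last)):
        table = sorted(c + row for c, row in zip(last, table))
    return next(row for row in table if row.endswith(sentinel))[:-1]

assert ibwt(bwt("banana")) == "banana"
```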