Information theory, the statistical theory of signal transmission, communication theory, and Shannon theory are synonymous names for the mathematical theory first published by Claude Shannon in 1948. The concept of entropy in this theory can be viewed as a measure of the randomness of a sequence of symbols. Shannon entropy and its variants are widely used in molecular biology and bioinformatics as statistical tools of choice for sequence and structure analyses.
Keywords: Shannon communication theory; information; probability; coding; sequence analysis; cryptanalysis; uniform distribution
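To illustrate entropy as a measure of randomness, the following minimal Python sketch (the function name `shannon_entropy` and the example sequences are our own, not from the article) computes the Shannon entropy H = -Σ pᵢ log₂ pᵢ, in bits per symbol, from the empirical symbol frequencies of a sequence:

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence: str) -> float:
    """Shannon entropy in bits per symbol, from empirical frequencies."""
    counts = Counter(sequence)
    total = len(sequence)
    # H = -sum(p_i * log2(p_i)) over the observed symbols
    return sum(-(c / total) * log2(c / total) for c in counts.values())

# A uniform four-letter alphabet (e.g. DNA bases) attains the maximum
# entropy log2(4) = 2 bits; a constant sequence has zero entropy.
print(shannon_entropy("ACGT"))  # → 2.0
print(shannon_entropy("AAAA"))  # → 0.0
```

A uniform distribution over the alphabet maximizes entropy, which is why deviations from it are informative in sequence analysis and cryptanalysis alike.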





