Claude Shannon (1916–2001) was an American mathematician and the father of information theory. In his 1948 paper A Mathematical Theory of Communication he described how information can be encoded as a series of 1s and 0s (binary digits, or bits). This insight set the stage for the development of the digital computer and the modern digital communication revolution. Shannon used the concept of information entropy, which he showed to be a measure of the average information content of a message. He also showed that signals could be sent over a noisy channel without distortion: if the message is encoded so that it is self-correcting, it will be received as accurately as if there were no interference on the line. Shannon's work eventually found applications not only in computer design but in virtually every subject in which language is important, such as linguistics, psychology, cryptography, and phonetics.
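Shannon's entropy measure can be illustrated with a short sketch. The function below (a hypothetical helper, not from any source named here) computes the standard formula H = −Σ p·log₂(p), the average number of bits of information per symbol from a source with the given symbol probabilities:

```python
import math

def shannon_entropy(probs):
    # Average information content, in bits per symbol, of a source
    # emitting symbols with the given probabilities. Terms with
    # probability 0 contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less, since its outcome is more predictable:
print(shannon_entropy([0.9, 0.1]))   # ~0.47
```

The biased-coin result hints at why entropy matters for encoding: a predictable source can be compressed into fewer bits than a uniformly random one.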