Claude Elwood Shannon, a mathematician born in Gaylord, Michigan (U.S.) in 1916, is credited with two foundational contributions to information technology: applying Boolean algebra to electronic switching, which laid the groundwork for the digital computer, and founding the field of *information theory*. It is difficult to overstate Shannon's impact on the 20th century and the way we live and work in it, yet he remains practically unknown to the general public. Shannon spent the bulk of his career, a span of over 30 years from 1941 to 1972, at Bell Labs, where he worked as a research mathematician.

While a graduate student at MIT in the late 1930s, Shannon worked for Vannevar Bush, who was at that time building a mechanical analog computer, the Differential Analyzer. Shannon had the insight to apply two-valued Boolean logic to electrical circuits, which can be in either of two states: on or off. This synthesis of two hitherto distinct fields earned Shannon his master's degree in 1937 and his doctorate in 1940.
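Shannon's insight can be sketched in a few lines of Python (an illustrative model, not Shannon's own notation): two switches wired in series pass current only when both are closed, which behaves as Boolean AND, while two switches in parallel pass current when either is closed, which behaves as Boolean OR.

```python
def series(a: bool, b: bool) -> bool:
    """Current flows through two switches in series only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Current flows through two switches in parallel if either is closed (OR)."""
    return a or b

# A hypothetical circuit: a lamp lights when switch x is closed and
# at least one of y or z is closed, i.e. x AND (y OR z).
def lamp(x: bool, y: bool, z: bool) -> bool:
    return series(x, parallel(y, z))
```

Any truth table built this way corresponds directly to a physical arrangement of relays, which is what made circuit design a matter of algebra rather than trial and error.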

Not content with laying the logical foundations of both the modern telephone switch and the digital computer, Shannon went on to invent the discipline of information theory and revolutionize the field of communications. He developed the concept of entropy in communication systems: the idea that information is a measure of uncertainty. The more uncertain (unpredictable) a message is, the more information it carries; a perfectly predictable message carries none. Shannon also used mathematics to define the capacity of any communications channel in terms of its bandwidth and signal-to-noise ratio, and showed that error-free communication is possible at any rate below that capacity. These results underpin modern telecommunications, the Internet, and satellite systems.
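The two central formulas are worth stating in their standard modern form (the notation below is the conventional one, not necessarily Shannon's original). The entropy of a source whose symbols occur with probabilities $p_i$, and the capacity of a noisy channel of bandwidth $B$ with signal power $S$ and noise power $N$ (the Shannon–Hartley theorem), are:

$$H(X) = -\sum_i p_i \log_2 p_i \quad \text{(bits per symbol)}$$

$$C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{(bits per second)}$$

For example, a fair coin toss has entropy $-2 \cdot \tfrac{1}{2}\log_2\tfrac{1}{2} = 1$ bit, while a two-headed coin has entropy $0$: the outcome tells you nothing you did not already know.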

*A Mathematical Theory of Communication*, published in the Bell System Technical Journal in 1948, outlines the principles of his information theory. Information theory also has important ramifications for the field of cryptography, as explained in his 1949 paper *Communication Theory of Secrecy Systems*: in a nutshell, the more entropy (unpredictability) a cryptographic key has, the harder the resulting encryption is to break.
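The entropy measure at the heart of both papers is easy to compute. A short Python sketch (illustrative, not taken from Shannon's papers):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.

    Terms with zero probability contribute nothing, by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = entropy_bits([0.5, 0.5])          # 1 bit per toss
biased_coin = entropy_bits([0.9, 0.1])        # less than 1 bit: partly predictable
random_byte = entropy_bits([1 / 256] * 256)   # 8 bits: a uniformly random key byte
```

In cryptographic terms, a key drawn uniformly from $2^n$ possibilities has $n$ bits of entropy, and every bit of predictability an attacker can exploit lowers that figure.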

Shannon's varied retirement interests included building unicycles, motorized pogo sticks, and chess-playing machines, as well as juggling, for which he developed an equation relating the motion of the balls to the action of the hands. Claude Shannon died on February 24, 2001.
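The juggling result is usually quoted as Shannon's juggling theorem, which for uniform throws is commonly stated as:

$$(F + D)\,H = (V + D)\,N$$

where $F$ is the time a ball spends in flight, $D$ the time a ball dwells in a hand, $V$ the time a hand sits vacant, $N$ the number of balls, and $H$ the number of hands. It is a bookkeeping identity: both sides count the same total time from the ball's perspective and the hand's perspective.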
