Claude Shannon: The juggling father of the information age
who coined the term 'bit'
Who was Claude Shannon?
From building two-seater unicycles and juggling robots to
creating chess-playing machines, Claude Elwood Shannon was not just an
information theorist. The gifted mathematician also used his skills to analyse
the stock market with a system of his own design, though his methods remained
unpublished.
The American electrical engineer and
cryptographer was the
grandson of an inventor and a distant cousin of Thomas Edison, and earned
money repairing radios as a schoolboy. He went on to study
electrical engineering and mathematics at the University of Michigan,
graduating in 1936, and obtained his PhD in mathematics at the Massachusetts
Institute of Technology (MIT) in 1940.
During the Second World War, he worked on fire-control
systems for anti-aircraft guns used against V-1 flying bombs, and on Allied
cryptography at Bell Labs, where he is believed to have met the British
codebreaker Alan Turing during Turing's visit in 1943.
He became a visiting professor at MIT in 1956, and a
permanent member of the faculty in 1958.
Earlier in his career, Shannon had worked on early mechanical
computers under Vannevar Bush, whose 1945 essay As We May Think anticipated
hypertext decades before the World Wide Web was invented.
Shannon was intensely focused on his work, though that is
not to say he was antisocial. His days at work began with a game of chess with
the director of the Mathematics Centre, and he would then work alone until late
in the evening. Revered in the Soviet Union, Shannon did not seek praise from
his contemporaries; in fact, he spent long periods away from the field to which
he had contributed so much.
Such was his fame that when he attempted to attend the 1985
Information Theory Symposium in Brighton in disguise, rumours of his presence
spread and he was soon found out. Once discovered, he was greeted with
prolonged applause before giving a short speech, by which point it was clear
he was suffering from the early effects of Alzheimer's disease. He died in
February 2001.
What was the impact of his theory?
One of the things Shannon is most remembered for is how we
quantify information today in “bits” and “bytes.” To express information in a
“bit,” one uses a binary digit, either a “1” or a “0.” These binary digits can
describe everything from words to pictures to the most sophisticated gaming
software.
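As an illustration of how binary digits can describe words, here is a minimal Python sketch (the helper name is my own, not from Shannon's work) that spells out a short string as bits, eight to a byte and one byte per character here:

```python
def to_bits(text: str) -> str:
    """Encode a string as 0s and 1s: UTF-8 bytes, eight bits per byte."""
    return " ".join(f"{byte:08b}" for byte in text.encode("utf-8"))

# The word "bit" itself, reduced to three bytes of binary digits:
print(to_bits("bit"))  # -> 01100010 01101001 01110100
```

The same idea scales up: images and software are just much longer runs of the same two symbols.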
At MIT he worked on the "differential analyser",
one of the most advanced computing machines of its day. It was an analogue
device driven by motors, gears and shafts, and Shannon's experience with this
slow machine led to his conviction that computers should be built not from
moving parts but from electrical circuits.
It was Shannon's master's thesis, A Symbolic Analysis of
Relay and Switching Circuits, that showed problems could be solved
simply by manipulating two symbols - 1 and 0 - in an automatic electric
circuit.
It drew on Boolean algebra, in which 1 equals true and 0
equals false; applied to circuits, 1 meant a switch was turned on and 0 meant
it was turned off.
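The thesis's core equivalence - relay switches wired in series behave like Boolean AND, switches wired in parallel like Boolean OR - can be sketched in a few lines of Python (the function names are mine, not Shannon's notation):

```python
def series(a: int, b: int) -> int:
    """Two switches in series: current flows only if both are closed (AND)."""
    return a & b

def parallel(a: int, b: int) -> int:
    """Two switches in parallel: current flows if either is closed (OR)."""
    return a | b

# The circuit behaviour reproduces the Boolean truth tables exactly:
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  series={series(a, b)}  parallel={parallel(a, b)}")
```

From combinations of these two arrangements, any logic function - and hence any digital circuit - can be built.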
Claude Shannon is often described as the father of the
information age and is credited with establishing the term "bit", short for
"binary digit" (a word he attributed to his colleague John W. Tukey).
Had Shannon been alive on April 30, 2016, he would have turned 100 (or 1100100
if you prefer his age in binary).
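That binary conversion is easy to verify in Python:

```python
age = 100
print(bin(age))    # built-in conversion -> '0b1100100'
print(f"{age:b}")  # the same digits without the '0b' prefix -> 1100100
```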
His most famous work is A Mathematical Theory of
Communication (1948), in which he introduced information theory, the branch of
mathematics concerned with quantifying and transmitting information.
It was in this masterpiece that the term
"bit" made its debut as the fundamental unit of
information, representing a single binary choice: true or false, on or off,
yes or no.
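The paper's central quantity, now called Shannon entropy, measures information in exactly these bits: H = -Σ p·log₂(p) over the probabilities of a message's possible outcomes. A minimal Python sketch (the function name is mine):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p) over outcome probabilities, in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of information:
print(entropy([0.5, 0.5]))  # -> 1.0
# A certain outcome carries no information at all:
print(entropy([1.0]))       # -> 0.0
```

Intuitively, the less predictable a message is, the more bits are needed to convey it, which is why a fair coin yields one full bit per toss.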
Now, to celebrate what would have been his 100th birthday,
Shannon's life and work are honoured with a Google Doodle in which a cartoon
Shannon juggles - a reference to the juggling machines he built and to his
habit of juggling while riding a unicycle through the lab halls.