Shannon and the Birth of Information
Claude Shannon turned communication into mathematics and gave AI the substrate it would need.
Lesson map
The main moves in order:
1. A Mathematical Theory of Communication
2. Information theory
3. Entropy
4. Shannon
Section 1
A Mathematical Theory of Communication
In 1948, a Bell Labs engineer named Claude Shannon published a paper that quietly reshaped the century. He showed that information, like energy, could be measured, encoded, and transmitted with provable limits.
Shannon defined the bit as the fundamental unit of information. He introduced entropy as a measure of uncertainty, and showed how any message could be compressed, corrected, and communicated across a noisy channel.
Shannon's core insights
1. Information is the reduction of uncertainty, measurable in bits.
2. Every noisy channel has a capacity: below it, reliable communication is achievable; beyond it, errors are unavoidable.
3. Redundancy in a message allows errors to be detected and corrected.
4. Compression is possible down to the entropy of the source, and no further.
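The entropy bound in the last point is simple to compute. A minimal sketch in Python (the function name `entropy_bits` is ours, not Shannon's notation), estimating entropy from a message's symbol frequencies:

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy of the message's symbol distribution, in bits per symbol.

    This is the lower bound on the average code length needed to
    compress symbols drawn from this distribution.
    """
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Four equally likely symbols need exactly 2 bits each:
print(entropy_bits("abcd"))  # → 2.0

# A skewed source is more predictable, so it compresses further:
print(entropy_bits("aaab"))  # < 2.0
```

Uniform distributions maximize entropy; any skew toward some symbols lowers it, which is exactly the slack a compressor exploits.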
Shannon also built playful machines. He made a mechanical mouse named Theseus that could solve a maze using relays, widely considered one of the first learning machines. He juggled, rode a unicycle, and proved that his intellectual range matched his rigor.
“Information is the resolution of uncertainty.”
The big idea: AI is applied information theory. Every loss curve, every tokenizer, every compression trick traces back to Shannon's 1948 paper.
Related lessons
- Uncertainty Quantification in LLMs (Creators · 45 min): A model that says "I am 95 percent sure" yet is wrong 40 percent of the time is miscalibrated; measuring that gap is uncertainty quantification.
- The Three Ingredients: Data, Compute, Algorithms (Capstone) (Creators · 55 min): Every AI breakthrough of the past decade rests on three interacting ingredients; synthesize everything you have learned into one working model.
- Backpropagation Rediscovered, 1986 (Creators · 30 min): Rumelhart, Hinton, and Williams published the algorithm that would eventually power everything.
