Probability for Beginners
AI is fundamentally probabilistic. A little probability literacy goes a long way.
Lesson map

What this lesson covers, in order:

1. The Language of Uncertainty
2. Probability
3. Random variables
4. Distributions
Section 1
The Language of Uncertainty
Every LLM is, at heart, a probability distribution over the next token. To read AI papers you need basic probability literacy. The good news: the core ideas fit on a napkin.
Three ideas to know

1. An event's probability is a number between 0 (impossible) and 1 (certain).
2. The probabilities of all possible outcomes sum to 1.
3. Expectation: the average outcome you would get over many repetitions.
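The three ideas above can be checked directly with a fair six-sided die. This is a minimal sketch using only Python's standard library; the die setup is illustrative, not part of the lesson text:

```python
import random
from fractions import Fraction

# A fair six-sided die: each outcome has probability 1/6 (a number between 0 and 1).
outcomes = [1, 2, 3, 4, 5, 6]
probs = {face: Fraction(1, 6) for face in outcomes}

# Idea 2: probabilities over all outcomes sum to exactly 1.
assert sum(probs.values()) == 1

# Idea 3: expectation = sum of (outcome x probability) = 3.5 for a fair die.
expectation = sum(face * p for face, p in probs.items())
print(expectation)  # 7/2

# The empirical average over many rolls converges toward the expectation.
rolls = [random.choice(outcomes) for _ in range(100_000)]
print(sum(rolls) / len(rolls))  # close to 3.5
```

Using `Fraction` keeps the sum-to-1 check exact rather than subject to floating-point rounding.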
Common distributions you will see
| Distribution | Shape | Shows up in |
|---|---|---|
| Uniform | Flat — all outcomes equal | Random sampling |
| Bernoulli | Just success/failure | Binary classification |
| Gaussian (normal) | Bell curve | Measurement noise, weight initialization |
| Categorical / Softmax | Probability per class | Next-token prediction |
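Each row of the table corresponds to one line of a sketch like the following, written with Python's standard library. The logits and class names are made-up illustrations, not values from any real model:

```python
import math
import random

rng = random.Random(0)  # seeded so the sketch is reproducible

# Uniform: every value in the range is equally likely.
u = rng.uniform(0.0, 1.0)

# Bernoulli(p): success with probability p, failure otherwise.
p = 0.3
success = rng.random() < p

# Gaussian: bell curve around a mean, e.g. small random weight initialization.
w = rng.gauss(mu=0.0, sigma=0.02)

# Categorical via softmax: turn raw scores (logits) into one probability per
# class, the way a language model scores candidate next tokens.
logits = [2.0, 1.0, 0.1]
exps = [math.exp(z) for z in logits]
softmax = [e / sum(exps) for e in exps]
print([round(s, 3) for s in softmax])  # probabilities, summing to 1

# Sampling a "next token" in proportion to those probabilities:
token = rng.choices(["cat", "dog", "bird"], weights=softmax)[0]
```

Note how the softmax step guarantees the second idea from above: whatever the logits are, the resulting probabilities sum to 1.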
Independent vs dependent
Two events are independent if one tells you nothing about the other. Coin tosses are independent. The weather today and tomorrow are dependent. Independence hugely changes how probabilities combine.
- Independent AND: multiply probabilities, P(A and B) = P(A) × P(B)
- OR in general: P(A or B) = P(A) + P(B) − P(A and B). When both probabilities are small, the overlap term is tiny, so simply adding them is a good approximation
- Dependent events: use conditional probability (next lesson)
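These combination rules can be verified by simulation. A minimal sketch with two independent fair-coin events (the events and sample size are illustrative):

```python
import random

rng = random.Random(42)
N = 200_000

# Two independent events A and B, each with probability 0.5.
trials = [(rng.random() < 0.5, rng.random() < 0.5) for _ in range(N)]

p_a = sum(a for a, _ in trials) / N
p_b = sum(b for _, b in trials) / N
p_and = sum(a and b for a, b in trials) / N
p_or = sum(a or b for a, b in trials) / N

# Independent AND: P(A and B) should match P(A) x P(B).
print(round(p_and, 3), round(p_a * p_b, 3))  # both near 0.25

# OR: P(A or B) = P(A) + P(B) - P(A and B). Here the overlap term (0.25)
# is large, so "just add" would badly overcount; it only works when
# both probabilities are small.
print(round(p_or, 3), round(p_a + p_b - p_and, 3))  # both near 0.75
```

Rerunning with dependent events (e.g. B defined as "A happened") would break the multiplication rule, which is exactly why conditional probability is needed.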
“Probability is the mathematics of common sense.”
The big idea: probability is a language for reasoning under uncertainty. Once you speak it, half of AI becomes less mysterious.
Related lessons
Keep going
Builders · 30 min
The Supervised Learning Loop
Most modern AI is trained on a loop of guess, check, and adjust. Understand the loop and you understand the heart of machine learning.
Builders · 30 min
Tokens and Embeddings: How AI Reads Words
AI does not read letters. It reads tokens, which live as vectors in a space of meaning. Learn how text becomes numbers you can do math on.
Builders · 35 min
Neural Networks, Actually Explained
You have heard the term a thousand times. Now let's actually look inside: neurons, weights, activations, and what happens in a single pass.
