Lesson 295 of 2116
Log-Scale Thinking: When Linear Lies
Some things grow multiplicatively, not additively. Log scales reveal patterns that linear scales hide, especially for anything related to scale or growth.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. Linear Scales Break Down Quickly
2. Logarithm
3. Log scale
4. Exponential growth
Section 1
Linear Scales Break Down Quickly
Plot the GDP of every country on a linear scale and the US, China, and Japan dominate; everyone else looks like zero. Plot it on a log scale and mid-sized economies like Norway's and the Dominican Republic's become visible and comparable instead of flattening against the axis. Log scales make multiplicative differences visible.
When log scale helps
- Exponential growth (virus spread, compound interest)
- Power-law distributions (wealth, view counts)
- Measurements spanning many orders of magnitude (star masses, earthquake energy)
- Human perception (loudness is perceived roughly logarithmically; decibels are a log scale)
- Model training loss curves (always plot on log-y)
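The decibel entry in the list above can be made concrete: expressing a power ratio on a log scale turns each 10x multiplication into a fixed additive step. A minimal sketch (the `decibels` helper and its reference value are illustrative, not a standard API):

```python
import math

def decibels(power: float, reference: float = 1.0) -> float:
    """Express a power ratio on the decibel (log) scale."""
    return 10 * math.log10(power / reference)

# Each 10x increase in power adds roughly 10 dB:
# multiplicative differences become additive steps.
print(decibels(10))     # ~10.0
print(decibels(100))    # ~20.0
print(decibels(1_000))  # ~30.0
```

This is the same compression that makes a rock concert and a whisper fit on one chart.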
Model scaling: a perfect log-scale story
Scaling laws for neural networks are almost always shown on log-log plots. The relationship between compute and loss looks like a straight line on log-log axes but would be a sharply bending curve on a linear scale. This is why nearly every paper on training curves uses logarithmic axes.
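The reason a power law plots as a straight line: if loss = a · compute^(-b), then log(loss) = log(a) − b · log(compute), a line with slope −b. A minimal sketch with made-up coefficients (not real scaling-law numbers) that recovers the exponent as a slope:

```python
import numpy as np

# Hypothetical power law: loss = a * compute**(-b)
a, b = 10.0, 0.05
compute = np.logspace(18, 24, num=7)   # FLOPs spanning 6 orders of magnitude
loss = a * compute ** (-b)

# On log-log axes this is a line; a linear fit in log space
# reads the power-law exponent off as the (negated) slope.
slope, intercept = np.polyfit(np.log10(compute), np.log10(loss), deg=1)
print(slope)  # close to -0.05
```

Fitting in log space like this is the standard way scaling-law exponents are estimated from training runs.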
Log scale reveals training dynamics
import matplotlib.pyplot as plt
import numpy as np

# Loss measured at steps spanning three orders of magnitude
training_steps = np.array([100, 1000, 10000, 100000])
loss = np.array([4.8, 3.2, 2.1, 1.4])

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Linear x-axis: the first three points crowd against the left edge
ax1.plot(training_steps, loss, marker='o')
ax1.set_title('Linear scale (hides the pattern)')
ax1.set_xlabel('Steps')
ax1.set_ylabel('Loss')

# Log x-axis: each decade of training gets equal width
ax2.plot(training_steps, loss, marker='o')
ax2.set_xscale('log')
ax2.set_title('Log scale (reveals the pattern)')
ax2.set_xlabel('Steps (log)')

plt.tight_layout()
plt.show()

Two practical tricks
- Use log(1 + x) instead of log(x) when data contains zeros
- For percentages or ratios, use log-odds: log(p / (1-p))
- In pandas: df['log_followers'] = np.log1p(df['followers'])
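Both tricks above can be sketched in a few lines of numpy (the follower counts and probabilities here are made-up illustrative data):

```python
import numpy as np

followers = np.array([0, 3, 40, 5_000, 2_000_000])

# log(x) blows up at zero; log1p(x) = log(1 + x) keeps zeros finite
# and barely changes large values.
log_followers = np.log1p(followers)
print(log_followers[0])  # 0.0 -- zero maps to zero, no -inf

# For probabilities, log-odds (the logit) spreads values near 0 and 1
# so small differences at the extremes stay visible.
p = np.array([0.01, 0.5, 0.99])
log_odds = np.log(p / (1 - p))
print(log_odds)  # symmetric around p = 0.5, which maps to 0.0
```

Note the symmetry: 0.01 and 0.99 land at equal distances on either side of zero, which is exactly what makes log-odds so useful for skewed percentages.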
The big idea: the world runs on multiplication more than addition. Log scales are how you see growth, scale, and skew clearly. Get comfortable reading them, and data starts making much more sense.
