Lesson 810 of 1570
AI model families: DeepSeek and the China AI scene
Understand DeepSeek and why China's AI models surprised the world.
Lesson map
What this lesson covers
Learning path
The main moves in order
- 1. The big idea
- 2. Why DeepSeek R1 Made the Whole Industry Freak Out
- 3. The big idea
- 4. Chinese AI Models: DeepSeek, Qwen, Kimi, GLM
Section 1
The big idea
DeepSeek is a Chinese AI lab that released models rivaling OpenAI's at a fraction of the cost, a result that shocked the industry. Its R1 reasoning model is open-weight and very strong at math and code.
Some examples
- Try DeepSeek's chat for free
- Use DeepSeek R1 for hard math
- Notice the 'thinking' tokens DeepSeek shows
- Compare DeepSeek to GPT-5 on logic
Try it!
Try DeepSeek's free chat. Give it a hard math problem and watch the reasoning trace. Notice how it 'thinks out loud' before answering.
Section 2
Why DeepSeek R1 Made the Whole Industry Freak Out
Section 3
The big idea
In early 2025, China's DeepSeek released R1 — a reasoning model that performed nearly as well as OpenAI's o1 but cost a small fraction to train and run, AND the weights were open. Stocks dropped, Twitter melted down, and every lab had to rethink pricing. R1 proved frontier capability isn't only for $100B labs.
Some examples
- Cost to run R1 is ~30x cheaper than o1 — anyone can afford reasoning-model API calls now.
- Other labs (Qwen, Kimi) followed with their own cheap reasoning models.
- Western labs slashed prices in response — your ChatGPT subscription got better because of R1.
- R1 is downloadable; you can run a smaller version on a beefy laptop.
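One reason cheap reasoning-model API calls became accessible: DeepSeek exposes an OpenAI-compatible endpoint, so the standard `openai` Python client works with just a different base URL. The sketch below assumes DeepSeek's published endpoint (`https://api.deepseek.com`) and model name (`deepseek-reasoner`); verify both against DeepSeek's current API docs before relying on them, and note the live call only runs if you set an API key.

```python
# Minimal sketch of calling DeepSeek R1 via its OpenAI-compatible API.
# Endpoint and model name follow DeepSeek's published docs; verify before use.
import os

request = {
    "model": "deepseek-reasoner",  # DeepSeek's R1-series reasoning model
    "messages": [
        {"role": "user",
         "content": "If 3 workers build a wall in 6 hours, how long do 2 workers take?"}
    ],
}

if os.environ.get("DEEPSEEK_API_KEY"):
    from openai import OpenAI  # pip install openai
    client = OpenAI(api_key=os.environ["DEEPSEEK_API_KEY"],
                    base_url="https://api.deepseek.com")
    response = client.chat.completions.create(**request)
    msg = response.choices[0].message
    # R1 surfaces its chain of thought separately from the final answer.
    print("reasoning:", getattr(msg, "reasoning_content", None))
    print("answer:", msg.content)
else:
    print("Set DEEPSEEK_API_KEY to run the live call; request payload is ready.")
```

The reasoning trace you see in DeepSeek's chat UI is the same "thinking out loud" content the API returns alongside the final answer.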
Try it!
Try DeepSeek R1 (free) at chat.deepseek.com. Ask it a hard reasoning question. Compare to ChatGPT.
Section 4
Chinese AI Models: DeepSeek, Qwen, Kimi, GLM
Section 5
The big idea
China has a thriving AI scene. DeepSeek shocked the world in 2025 with cheap-to-train reasoning. Qwen leads many open-weight benchmarks. Kimi has long-context expertise. GLM-4 is a strong all-rounder. They're often free to try and have generous API limits.
Some examples
- DeepSeek-R1: open-weight reasoning model that competed with o1 at a fraction of the cost.
- Qwen3-Coder: a leading open-weight coding model, free to download.
- Kimi K2: Moonshot AI's long-context model, suited to analyzing very long documents.
- GLM-4.5: strong general-purpose Chinese-and-English model.
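A quick way to see why long context matters in practice: estimate whether a document even fits in a model's window before sending it. The sketch below uses the common rule of thumb of roughly 4 characters per token for English text; real tokenizers vary, so treat the numbers as estimates, and the window sizes shown are illustrative placeholders.

```python
# Rough check of whether a document fits in a model's context window.
# Uses the ~4 characters-per-token rule of thumb; real tokenizers vary.

def estimate_tokens(text: str) -> int:
    """Crude token estimate for English-like text."""
    return max(1, len(text) // 4)

def fits(text: str, context_window: int, reserve_for_output: int = 4_000) -> bool:
    """True if the prompt plus room reserved for the reply fits in the window."""
    return estimate_tokens(text) + reserve_for_output <= context_window

doc = "x" * 2_000_000         # a ~2 MB document, roughly 500k tokens
print(fits(doc, 128_000))     # False: too big for a typical 128k window
print(fits(doc, 1_000_000))   # True: fits in a long-context model
```

This is why "long context" is a feature worth comparing across models: a document that needs chunking on one model can go in whole on another.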
Try it!
Try DeepSeek's chat (chat.deepseek.com) or Qwen's chat (chat.qwen.ai) for free. Compare to ChatGPT.
Section 6
DeepSeek: Strong Reasoning at a Fraction of the Cost
Section 7
The big idea
Price per token matters when you make many calls: at high volume, a cheaper model can cut your API bill by an order of magnitude.
Some examples
- Reasoning benchmarks near the top tier
- API pricing way under OpenAI
- Open weights you can self-host
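To make the price-per-token point concrete, here is a back-of-the-envelope cost comparison. The per-million-token prices below are hypothetical placeholders chosen for illustration, not quotes from any provider; check each provider's current pricing page for real numbers.

```python
# Back-of-the-envelope comparison: why price per token matters at volume.
# Prices below are HYPOTHETICAL placeholders -- check real pricing pages.

def monthly_cost(calls, in_tokens, out_tokens, price_in, price_out):
    """Total monthly cost (USD) at per-million-token prices."""
    total_in = calls * in_tokens / 1_000_000    # input tokens, in millions
    total_out = calls * out_tokens / 1_000_000  # output tokens, in millions
    return total_in * price_in + total_out * price_out

CALLS = 100_000                   # requests per month
IN_TOK, OUT_TOK = 1_000, 2_000    # tokens per request (reasoning output is long)

# Hypothetical price points: a premium reasoning API vs. a budget one.
premium = monthly_cost(CALLS, IN_TOK, OUT_TOK, price_in=15.0, price_out=60.0)
budget = monthly_cost(CALLS, IN_TOK, OUT_TOK, price_in=0.55, price_out=2.19)

print(f"premium: ${premium:,.2f}/mo")  # $13,500.00
print(f"budget:  ${budget:,.2f}/mo")   # $493.00
print(f"ratio:   {premium / budget:.0f}x")
```

Run the numbers for your own call volume: a workload that is pocket change at ten calls a day becomes a real budget line at a hundred thousand calls a month, and that is where a cheap model with near-top-tier benchmarks pays off.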
Try it!
Open your favorite AI tool and try one of the examples above. Pick the one that matches what you are actually working on this week. Spend 10 minutes, no more. Notice what worked and what did not — that's the real lesson.
Section 8
DeepSeek: shockingly cheap, surprisingly good
Section 9
The big idea
DeepSeek-V3 and R1 are open-weight Chinese models that kicked off an industry-wide AI pricing war.
Some examples
- Use DeepSeek for high-volume work where cost matters.
- Note that data may be processed in China — read the policy.
- R1 is a strong reasoning model at low cost.
Try it!
Try DeepSeek on a reasoning problem you've also tried on Claude. Compare.
Understanding "DeepSeek: shockingly cheap, surprisingly good" in practice: DeepSeek's models cost a fraction of OpenAI's and Anthropic's and rival them on many tasks, and knowing when to reach for them is a concrete advantage in how you work.
- Apply the concepts from this section directly
- Identify where a cheap reasoning model fits into your current workflow
- Measure the before/after difference when you apply this
- Iterate and refine — first attempts rarely nail it
- 1. Try DeepSeek in a live project this week
- 2. Write a short summary of what you'd do differently after learning this
- 3. Share one insight with a colleague
End-of-lesson quiz
Check what stuck
15 questions.
Related lessons
Keep going
Builders · 40 min
AI model families: Meta's Llama (open source)
Understand why Llama matters as a free, open AI model anyone can run.
Builders · 40 min
Context Windows: How Much AI Can 'Remember'
Each AI has a 'context window' — how much it can hold in memory. Knowing this matters for big tasks.
Builders · 40 min
AI and Claude Haiku: The Tiny Speed Demon
Haiku is Anthropic's smallest, fastest, cheapest model — perfect for short tasks and chatbots.
