AI Gateway vs. Direct Provider APIs: When to Insert the Hop
Vercel AI Gateway, OpenRouter, LiteLLM, and Portkey — what gateways add and what they cost.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The premise
2. AI Gateway
3. OpenRouter
4. LiteLLM
Section 1
The premise
An AI gateway buys you provider portability, retries, and cost dashboards — at the cost of an extra hop and a new vendor relationship.
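To make the hop concrete, here is a minimal sketch of the same chat request sent directly to a provider and routed through an OpenAI-compatible gateway, assuming the openai Python SDK (v1+). OpenRouter's endpoint is shown as the gateway; the model ids and environment variable names are illustrative, and Vercel AI Gateway or a LiteLLM proxy would slot in the same way with their own base URLs.

```python
import os

from openai import OpenAI

# Direct call: one provider, one SDK, no extra hop.
direct = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# Gateway call: the same OpenAI-compatible request shape, different base URL.
# OpenRouter is shown; Vercel AI Gateway or a LiteLLM proxy follows the same
# "swap the base URL and the key" pattern.
gateway = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

messages = [{"role": "user", "content": "In one sentence, what does an AI gateway add?"}]

# Identical request body; the only differences are where it is sent and how
# the model is named (gateways usually prefix the model with the provider).
direct_reply = direct.chat.completions.create(model="gpt-4o-mini", messages=messages)
gateway_reply = gateway.chat.completions.create(model="openai/gpt-4o-mini", messages=messages)

print(direct_reply.choices[0].message.content)
print(gateway_reply.choices[0].message.content)
```

Portability comes from the request shape staying identical: the only things that change are the base URL, the key, and a provider-prefixed model id.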
What a gateway does well here
- Failover across providers when one is down (see the sketch after this list)
- Cache identical prompts to cut cost
- Centralize per-team budgets and rate limits
- Provide consistent logging across providers
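Failover is the clearest of these wins. The sketch below is the client-side loop a gateway lets you delete: it tries an ordered list of OpenAI-compatible endpoints until one succeeds. The provider URLs, environment variables, and model names are placeholders, not any vendor's real values.

```python
import os

from openai import OpenAI

# Ordered failover table. URLs, env var names, and model ids are placeholders;
# a gateway keeps the equivalent table (plus health checks) server-side so
# application code never contains this loop.
PROVIDERS = [
    {"base_url": "https://api.primary-provider.example/v1",
     "api_key": os.environ["PRIMARY_API_KEY"],
     "model": "primary-model"},
    {"base_url": "https://api.backup-provider.example/v1",
     "api_key": os.environ["BACKUP_API_KEY"],
     "model": "backup-model"},
]


def complete_with_failover(messages):
    """Try each provider in order and return the first successful response."""
    last_error = None
    for provider in PROVIDERS:
        client = OpenAI(base_url=provider["base_url"], api_key=provider["api_key"])
        try:
            return client.chat.completions.create(model=provider["model"], messages=messages)
        except Exception as err:  # real code would catch timeouts/429/5xx specifically
            last_error = err
    raise RuntimeError("all providers failed") from last_error


reply = complete_with_failover([{"role": "user", "content": "ping"}])
print(reply.choices[0].message.content)
```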
What a gateway cannot do
- Hide provider-specific quirks (tool-call schema, vision support)
- Add zero latency; there is always a hop cost (see the timing sketch after this list)
- Replace the need to read provider release notes
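The hop cost is easy to measure for your own region and payload. This timing sketch sends the same tiny prompt directly and through a gateway and reports the median round trip; the endpoints, keys, and model ids are placeholders to swap for your real values.

```python
import os
import statistics
import time

from openai import OpenAI

# Placeholder endpoints and model ids; substitute your real direct-provider
# and gateway values to see the hop cost in your own environment.
CLIENTS = {
    "direct": (OpenAI(base_url="https://api.provider.example/v1",
                      api_key=os.environ["PROVIDER_API_KEY"]), "small-model"),
    "gateway": (OpenAI(base_url="https://gateway.example/v1",
                       api_key=os.environ["GATEWAY_API_KEY"]), "provider/small-model"),
}
PROMPT = [{"role": "user", "content": "Reply with the single word: ok"}]

for name, (client, model) in CLIENTS.items():
    samples = []
    for _ in range(10):
        start = time.perf_counter()
        client.chat.completions.create(model=model, messages=PROMPT, max_tokens=2)
        samples.append(time.perf_counter() - start)
    print(f"{name}: median round trip {statistics.median(samples) * 1000:.0f} ms")
```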
