Lesson 202 of 2116
Calling the OpenAI API
The Responses API is OpenAI's modern surface. One call, text and tools. Learn the shape you'll use most.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. Two APIs, One Client
2. Responses API
3. Chat Completions
4. Streaming
Concept cluster
Terms to connect while reading
Section 1
Two APIs, One Client
OpenAI ships chat.completions (classic) and responses (modern). New code should prefer responses — it unifies text, tools, and structured output.
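To see the difference at a glance, here is a sketch of the two request shapes as plain dicts, with no network call made. Parameter names follow the openai-python SDK; the dict variable names are my own.

```python
# chat.completions (classic) takes a `messages` list.
legacy_request = {
    "model": "gpt-5",
    "messages": [{"role": "user", "content": "Hi"}],
}

# responses (modern) takes the same role/content items under `input`.
modern_request = {
    "model": "gpt-5",
    "input": [{"role": "user", "content": "Hi"}],
}
```

The practical rename to remember: `messages` becomes `input`, and instead of digging through `choices[0].message.content` you read a flat `output_text` on the response.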
output_text is a convenience accessor that concatenates all text in the response.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    try:
        r = client.responses.create(
            model="gpt-5",
            input=[
                {"role": "system", "content": "Be concise."},
                {"role": "user", "content": prompt},
            ],
        )
        return r.output_text
    except Exception as e:
        print(f"OpenAI call failed: {e}")
        raise

print(ask("Explain recursion in one sentence."))

Streaming
Context manager ensures the stream closes. Event types are strings — filter for the text delta.
def ask_stream(prompt: str) -> None:
    with client.responses.stream(
        model="gpt-5",
        input=[{"role": "user", "content": prompt}],
    ) as stream:
        for event in stream:
            if event.type == "response.output_text.delta":
                print(event.delta, end="", flush=True)
        stream.until_done()
    print()

Understanding "Calling the OpenAI API" in practice: AI-assisted coding shifts work from syntax recall to design thinking. Models handle the boilerplate so you can focus on architecture, and the Responses API is the call shape you'll reach for most often; knowing it well is a concrete advantage.
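The delta-filtering pattern above can be exercised without a network call. This is a minimal sketch with stubbed events: the event type string matches the lesson, but the `SimpleNamespace` stubs and `collect_text` helper are my own stand-ins for what the SDK yields.

```python
from types import SimpleNamespace

def collect_text(events) -> str:
    """Accumulate text deltas the same way the streaming loop prints them."""
    parts = []
    for event in events:
        if event.type == "response.output_text.delta":
            parts.append(event.delta)
    return "".join(parts)

# Stub events standing in for a stream's event sequence.
fake_stream = [
    SimpleNamespace(type="response.created", delta=None),
    SimpleNamespace(type="response.output_text.delta", delta="Hel"),
    SimpleNamespace(type="response.output_text.delta", delta="lo"),
    SimpleNamespace(type="response.completed", delta=None),
]

print(collect_text(fake_stream))  # prints "Hello"
```

Filtering on the type string is the whole trick: other event types (lifecycle, tool calls) pass through the same iterator, so your UI code only reacts to the ones it cares about.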
- Call the Responses API with responses.create and read output_text
- Recognize legacy chat.completions code and understand why new code prefers responses
- Stream tokens by filtering for response.output_text.delta events
- Centralize model ids so model or provider swaps stay painless
1. Use AI to generate unit tests for an existing function
2. Ask AI to refactor a messy function and explain the changes
3. Have AI suggest a code review for a recent pull request
Key terms in this lesson
The big idea: responses.create for the modern path, stream for UIs, and centralize model ids so provider swaps are painless.
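"Centralize model ids" can be as small as one module. A sketch under stated assumptions: the `MODELS` mapping, tier names, and the "gpt-5-mini" id are illustrative choices of mine, not an official list.

```python
# models.py - the one place to edit when swapping models or providers.
MODELS = {
    "fast": "gpt-5-mini",   # assumption: illustrative tier/id pairing
    "smart": "gpt-5",
}

def model_for(tier: str) -> str:
    """Resolve a tier name to a concrete model id."""
    return MODELS[tier]

print(model_for("smart"))  # prints "gpt-5"
```

Call sites then say `model=model_for("fast")` instead of hard-coding a string, so a provider swap touches one file instead of every request.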
End-of-lesson quiz
Check what stuck
15 questions · Score saves to your progress.
Related lessons
Keep going
Creators · 45 min
React Server Components
RSCs render on the server and stream HTML to the client. Zero-JS components, free data fetching. Learn the boundary rules.
Creators · 45 min
Calling the Claude API With Streaming
Anthropic's SDK in 20 lines. Learn messages, streaming tokens, and basic error handling.
Creators · 50 min
Deploying an AI App to Vercel
Streaming AI chat to production takes one framework and three env vars. Learn the deploy path that actually ships.
