AI can pick up unfair ideas from the writing it learned from.
5 min · Reviewed 2026
The big idea
AI learns from writing made by humans. And humans aren't always fair. So AI can sometimes act unfairly without knowing it, like always picking boy names for doctors and girl names for nurses. This is called bias.
Some examples
Early AI image makers often drew mostly white people when asked to draw a 'CEO'.
AI might assume all chefs are men or all nannies are women.
Engineers work hard to fix bias when they find it.
If something an AI makes feels unfair, it's okay to push back.
Try it!
Ask an AI to describe 'a hero' or 'a scientist'. Did it make any assumptions about who they are? Try asking again with different words and see if you get a different answer.
End-of-lesson check
15 questions · take it digitally for instant feedback at tendril.neural-forge.io/learn/quiz/end-explorers-foundations-AI-and-fairness
Why might an AI image maker show mostly men when you ask it to draw a CEO?
A. The AI wants to upset people who see the pictures
B. The AI learned from pictures that mostly showed men as CEOs
C. The computer part called the GPU only understands male pictures
D. The AI randomly picks men every time

What is the word that describes when AI copies unfair ideas from the world around it?
A. Coding
B. Learning
C. Honesty
D. Bias

An AI keeps using girl names for nurses and boy names for doctors. Where do you think it learned this?
A. From counting how many letters are in each job name
B. From a special computer game about hospitals
C. From books and articles written by humans over many years
D. From flipping a coin every time it needs a name

If an AI image maker shows you something that feels unfair or wrong, what should you do?
A. Share it with all your friends without thinking
B. Ask the AI to show you more unfair images
C. Push back and tell someone it's not right
D. Believe the AI because it's always correct

Why do engineers (the people who build AI) work to fix bias when they find it?
A. They want to make the AI more confusing
B. They want AI to treat everyone equally and fairly
C. They want the AI to run faster on computers
D. They want to add more biased ideas on purpose

What is 'training data' for an AI system?
A. The special tests that engineers give AI to see if it works
B. The instructions that tell a computer when to turn on and off
C. The examples and information an AI studies to learn how to answer questions
D. The pictures and videos that people upload to share online

An AI text generator always describes chefs as men and nannies as women. This is an example of:
A. The AI trying to be funny
B. Bias from learning old ideas about jobs
C. A mistake where the AI forgot all words
D. A computer virus affecting the AI

Can AI be unfair without meaning to be?
A. No, AI can only do exactly what programmers tell it
B. No, computers never make mistakes
C. Yes, because it learns from a world that already has unfair ideas
D. Yes, but only if the computer has a broken part

The lesson says 'AI can be unfair because the world it learned from was unfair.' What does this mean?
A. AI only learns from perfect, completely fair sources
B. AI purposely learns unfair ideas to cause problems
C. AI copies the unfair ideas that already exist in human writing and pictures
D. AI reads newspapers that are 100 years old

You ask an AI to describe 'a hero' and it always describes a strong man. What could you try next?
A. Give up and never use AI again
B. Ask the same question ten more times exactly the same way
C. Ask again using different words like 'a heroic nurse' or 'a heroic teacher'
D. Tell the AI it is a bad computer

What job did the lesson say AI might incorrectly show as always being done by women?
A. Nanny or childcare worker
B. Police officer
C. CEO or business leader
D. Pilot or airplane driver

Which of these is the BEST reason to question an AI that shows something unfair?
A. Because AI learns from humans and humans can have unfair ideas
B. Because AI computers are dangerous and should be turned off
C. Because AI is always right and you want to test it
D. Because you want the AI to show more unfair things

If AI learned from books written only a long time ago, what unfair ideas might it pick up?
A. Ideas about how to fix computer bugs
B. Ideas about which foods taste best
C. Ideas about how to build faster computers
D. Ideas that were common long ago but are now considered unfair

Why can't an AI just 'decide' to be fair on its own without help from humans?
A. AI has feelings and doesn't care about fairness
B. AI only knows what it has learned from human-created data
C. AI already knows everything about fairness
D. AI is smarter than all humans and doesn't need help
The lesson says engineers work hard to fix bias when they find it. What does this tell us about bias?