A 1966 program with a few hundred lines of code convinced people it understood them. Its creator was horrified.
In 1966 at MIT, Joseph Weizenbaum wrote ELIZA, a program that mimicked a Rogerian psychotherapist. You typed a sentence, and ELIZA rephrased it back as a question. Simple pattern matching, no understanding at all.
If you typed "I am sad," ELIZA might reply, "Why do you say you are sad?" If you mentioned your mother, it would ask about your family. The trick worked because Rogerian therapy is mostly about reflecting the patient back to themselves.
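The keyword-and-reflection technique can be sketched in a few lines of Python. This is an illustrative toy, not Weizenbaum's original script: the rules and the "Please go on" fallback are invented here to show the mechanism of matching a keyword pattern, flipping first-person words to second-person, and slotting the fragment into a canned template.

```python
import re

# Swap first-person words for second-person ones ("my" -> "your").
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you"}

# (pattern, response template) pairs -- hypothetical rules, not ELIZA's real script.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r".*\bmother\b.*", re.I), "Tell me more about your family."),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(sentence: str) -> str:
    for pattern, template in RULES:
        match = pattern.match(sentence)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    # No keyword found: fall back to a content-free prompt.
    return "Please go on."

print(respond("I am sad"))         # -> Why do you say you are sad?
print(respond("My mother is ill")) # -> Tell me more about your family.
```

The whole illusion lives in that loop: there is no model of the user, only string surgery, which is exactly what made people's reactions so unsettling to Weizenbaum.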
Weizenbaum grew alarmed at how readily people anthropomorphized his toy. In 1976 he wrote Computer Power and Human Reason, a book warning against putting machines in roles that require human judgment and compassion.
I had not realized that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.
— Joseph Weizenbaum
The big idea: humans are wired to find minds in language. That makes building helpful AI powerful and building honest AI hard.
15 questions · take it digitally for instant feedback at tendril.neural-forge.io/learn/quiz/end-history-eliza-explorers
In what year was ELIZA created?
Where was ELIZA created?
What kind of therapist did ELIZA pretend to be?
Who created ELIZA?
If you typed 'I am sad' to ELIZA, which response might it give?
What was the main technique ELIZA used to respond to users?
What did ELIZA do when it could not find a keyword in your input?
What term describes the phenomenon where people believe a computer program understands them when it actually doesn't?
Why did ELIZA work so well for talking about family problems?
What surprised Joseph Weizenbaum about how people reacted to ELIZA?
What did Weizenbaum's secretary ask him to do?
In what year did Weizenbaum write the book warning about putting machines in roles requiring human judgment?
What was the main warning in Weizenbaum's book 'Computer Power and Human Reason'?
What did Weizenbaum mean when he said ELIZA could induce 'powerful delusional thinking'?
Why did users refuse to believe ELIZA was just string manipulation?