Creative Rights: Artists, Writers, Musicians vs. Generative AI
The creative industries are not against AI. They are against training on their work without consent or compensation. Here is what the fight is actually about.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The Shape of the Fight
2. Creative rights
3. Licensing
4. Consent
Section 1
The Shape of the Fight
A common framing says creatives are Luddites, afraid of new tools. That framing is lazy. Most working artists, writers, and musicians already use AI tools. The actual dispute is narrower and more tractable: was it legal to train on their work without permission, and do they get to opt in or out going forward?
Who has sued whom, and for what
- NYT v. OpenAI/Microsoft (2023, ongoing): article corpus used for training — copyright claims proceeding to trial
- Getty v. Stability AI (UK, Nov 2025): copyright claims largely rejected, trademark partially upheld
- Andersen v. Stability AI / Midjourney / DeviantArt (2023, ongoing): visual artists' class action
- Authors Guild + novelists v. OpenAI (2023-): consolidated with NYT case
- UMG + record labels v. Suno + Udio (2024): AI music generators, alleging copyrighted recordings in training
- Thomson Reuters v. ROSS Intelligence (filed 2020): first US ruling rejecting fair use for AI training (Feb 2025)
The positions, stated fairly
Compare the options
| Position | AI labs | Creative industries |
|---|---|---|
| Was training fair use? | Yes — transformative, non-expressive, benefits society | No — systematic, commercial, creates substitutes |
| Can we do it without licenses? | Yes — scale makes per-work licensing impossible | No — scale does not excuse infringement |
| Do outputs compete with originals? | Rarely; different market | Often; especially for commodity work |
| Should creators be compensated? | If output directly uses their work | Training itself is compensable |
What working creatives are actually building
- Glaze and Nightshade (University of Chicago): subtle adversarial perturbations protecting art from training
- Have I Been Trained: tools to check if your work is in common training sets
- Kudurru and NoAI headers: opt-out signals honored by some platforms
- Collective licensing: ICMP, CISAC exploring music licensing frameworks
- Fair training initiatives: consent-based datasets and certification (Spawning, Fairly Trained)
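The opt-out signals above mostly piggyback on the robots exclusion protocol: a site lists AI crawler user-agent tokens in its robots.txt, and cooperating crawlers skip it. The sketch below, using Python's standard `urllib.robotparser`, checks whether a given robots.txt opts out of a named crawler. `GPTBot` is a real published crawler token; the robots.txt text here is an invented example, and remember that honoring these signals is voluntary.

```python
# Sketch: check whether a robots.txt opts a site out of an AI crawler.
# "GPTBot" is a real published crawler user-agent token; the robots.txt
# content below is a made-up example for illustration only.
from urllib.robotparser import RobotFileParser

EXAMPLE_ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

def allows_crawler(robots_txt: str, user_agent: str, url: str) -> bool:
    """Return True if this robots.txt permits user_agent to fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

# The AI crawler is blocked sitewide; ordinary clients are not.
print(allows_crawler(EXAMPLE_ROBOTS_TXT, "GPTBot", "https://example.com/gallery/"))
print(allows_crawler(EXAMPLE_ROBOTS_TXT, "Mozilla/5.0", "https://example.com/gallery/"))
```

The limitation is visible in the code itself: nothing enforces the `Disallow` rule. A crawler that ignores robots.txt sees no obstacle, which is why creatives pair these signals with adversarial tools like Glaze and with licensing pressure.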
The labor agreements that actually moved things
The 2023 WGA (Writers Guild of America) contract, after a 148-day strike, included concrete AI provisions: studios cannot require writers to use AI, cannot train on writers' work without consent, and AI-generated material cannot be credited as written material. SAG-AFTRA won similar protections for voice and likeness. These deals are the most specific creative-rights AI rules in any industry so far.
The licensing market that is quietly forming
By 2025, major labs had signed licensing deals with Reddit, Stack Overflow, the Associated Press, News Corp, Axel Springer, Financial Times, Vox Media, Shutterstock, Getty (for some datasets), and others. Terms are rarely public; estimates suggest $60M-$250M per year per deal for top properties. Smaller creators are not part of this market, which is the core equity complaint.
Where courts are landing (mid-2026)
- Thomson Reuters v. ROSS (US, 2025): first substantive US fair use ruling — against the AI defendant, but on narrow facts involving a non-generative legal-research tool
- Getty v. Stability (UK, Nov 2025): court held that model weights are not infringing copies — a significant signal for training disputes, though the UK has no fair use doctrine
- NYT v. OpenAI: still pre-summary-judgment, likely to set the big US precedent
- EU AI Act Article 53: training content disclosure required from 2025 — enables enforcement
- Japan: broadly permits training, creators lobbying for changes
The deeper question nobody is resolving
Style is not copyrightable. A model that has ingested 10,000 of your paintings generally does not infringe your copyright merely by producing new work in your style. That is the law today. Whether it should be is a legitimate policy question for the 2026-2030 legislative cycle; the EU, the US Copyright Office, and a growing list of countries are all actively debating it.
“We are not asking for a veto on AI. We are asking for a seat at the table when our life's work becomes an input to somebody else's product.”
The big idea: creative rights is the most human-facing front of AI ethics. It has concrete lawsuits, concrete contracts, and concrete tools. The next five years will set patterns we live with for a generation.
