Fair Use or Free Ride? The Case for an AI Blanket Licence
What if AI companies had to pay for the content they train on? Welcome to the next frontier in copyright law — where inspiration meets ingestion.
When AI companies train their models — whether for music, image generation, writing or video — they don’t do it in a vacuum. They train on us. Or more precisely: on our songs, our blogs, our art, our tweets, our books, our interviews.
They harvest it at scale, often scraped from the open web, with or without permission — and certainly without compensation.
This has prompted an increasingly vocal question from creators and content owners:
Shouldn’t we get paid when machines learn from our work?
The proposed answer from some corners: a blanket licensing regime.
What’s a Blanket Licence?
Nothing to do with bedding – a blanket licence is a pre-agreed system for legal reuse. It doesn’t ask for permission each time. Instead, it says:
You can use a defined pool of material for a defined purpose — if you pay.
We already see this in:
- Music royalties (e.g. APRA, ASCAP, BMI)
- Broadcast and public performance rights
- Compulsory licensing of cover songs in the US
Could the same apply to AI?
What the Law Says (or Doesn’t)
AI companies argue that training their models on public material is “fair use” (US) or doesn’t involve reproduction of a “substantial part” (Australia), since no exact copy of the work appears in the output.
However, copies are made during scraping, and substantial parts are almost certainly reproduced during the training process or embedded in derivative outputs — either of which could pose problems under both US and Australian copyright law.
But courts are still catching up.
Pending or recent litigation:
- The New York Times v OpenAI: scraping articles to train GPT
- Sarah Silverman v Meta: use of copyrighted books
- Getty Images v Stability AI: image training and watermark copying
None of these cases have yet resolved the underlying issue:
Is training AI on copyrighted works a use that requires permission — or payment?
What a Blanket Licence Would Do
Under a blanket licence system:
- Training (and copying or development of derivatives for that purpose) would be lawful, as long as the AI provider paid into a fund
- Creators and rights holders would receive royalty payments, either directly or via a collecting society
- A legal baseline would be established, reducing lawsuits and uncertainty
This would mirror systems used in broadcasting and streaming, where revenue is pooled and distributed based on usage data.
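To make that mechanism concrete, here is a minimal sketch of a pro-rata pool distribution in Python. Everything in it (the pool size, the rights holders, and the idea of measuring usage in attributable units) is an invented assumption for illustration, not a description of any existing or proposed scheme.

```python
# Minimal sketch of pro-rata distribution from a blanket-licence pool.
# All names and figures are hypothetical; real collecting societies apply
# far more elaborate allocation rules and usage measurements.

def distribute_pool(pool, usage):
    """Split a licence pool across rights holders in proportion to measured usage."""
    total = sum(usage.values())
    if total == 0:
        return {holder: 0.0 for holder in usage}
    return {holder: pool * share / total for holder, share in usage.items()}


if __name__ == "__main__":
    # "Usage" could be any attributable unit: works ingested, tokens, plays, etc.
    usage_shares = {
        "Rights holder A": 1_200_000,
        "Rights holder B": 500_000,
        "Rights holder C": 300_000,
    }
    payouts = distribute_pool(pool=1_000_000.00, usage=usage_shares)
    for holder, amount in payouts.items():
        print(f"{holder}: ${amount:,.2f}")
```

The hard part, as the next section notes, is not the arithmetic but knowing whose work is in the corpus, and in what proportion.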
Challenges Ahead
1. Who Gets Paid?
Not all data is traceable or attributed. Unlike Spotify, which tracks each song streamed, AI models ingest billions of unlabelled tokens.
How do you determine who owns what — and which parts — of material that has been abstracted, fragmented, and dissolved into model weights rather than stored as retrievable copies?
2. How Much?
Rates would need to reflect:
- The extent of use
- The importance of the material to the training corpus
- The impact on the original market for the work
This is tricky when a model is trained once and then used forever.
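Purely as an illustration of how those three factors might interact, the sketch below combines them into a single per-work rate. The factor names, weights, and base rate are assumptions made up for this example; nothing like this formula has been proposed by any legislature or collecting society.

```python
# Hypothetical sketch only: combining the three factors above into a per-work rate.
# The weights and base rate are invented; a real tariff would be set by
# negotiation, legislation, or a tribunal, not by a formula like this.

def illustrative_rate(extent_of_use, corpus_importance, market_impact, base_rate=0.01):
    """Scale a base per-work rate by three factors, each expressed in the range 0 to 1."""
    weights = {"extent": 0.4, "importance": 0.3, "impact": 0.3}  # assumed weighting
    score = (weights["extent"] * extent_of_use
             + weights["importance"] * corpus_importance
             + weights["impact"] * market_impact)
    return base_rate * score


# A heavily used, market-sensitive work versus an incidental one:
print(illustrative_rate(0.9, 0.7, 0.8))    # higher per-work rate
print(illustrative_rate(0.1, 0.05, 0.0))   # near-negligible rate
```

Even this toy version exposes the "trained once, used forever" problem: should the rate be a one-off payment at training time, or a recurring charge tied to the model's ongoing commercial use?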
3. Which Countries?
Copyright laws vary. A licence in Australia might mean nothing in the US.
A global licence would require multilateral cooperation — and likely WIPO involvement.
Legal Precedent: Australia’s Safe Harbour and Statutory Licensing Models
Australia’s own statutory licensing schemes (e.g. educational copying under the former Part VB of the Copyright Act) show that:
- Lawmakers can mandate payment for certain uses,
- Even if individual rights holders never negotiated the terms,
- Provided it’s reasonable, transparent, and compensatory.
But those systems also brought:
- Bureaucratic collection processes
- Contentious allocation models
- Endless legal wrangling over definitions (What is a “reasonable portion”? What qualifies as an “educational purpose”?)
Expect the same for AI.
Creators and Innovation: A Balancing Act
For creators:
- A blanket licence offers recognition and payment
- It helps avoid the current “scrape now, settle later” model
- It could fund new creative work rather than hollowing out industries
For innovators:
- It provides legal certainty
- It encourages investment in AI tools
- It reduces the risk of devastating retroactive litigation
But if set up poorly, it could:
- Be exclusionary (if licensing fees are too high for small players)
- Be ineffective (if rights aren’t properly enforced or distributed)
- Or be too slow to match AI’s pace
What’s Next?
Australia’s Copyright Act doesn’t currently recognise training as a specific form of use. But policy reviews are under way in multiple countries, including by:
- The UK IPO
- The European Commission
- The US Copyright Office
- And here in Australia, the Attorney-General’s Department, which is conducting consultations through 2024–25 on how copyright law should respond to AI
Creators, platforms, and governments are all watching the courts. But if consensus forms around the need for structured compensation, a statutory blanket licence might just be the solution.
Bottom Line
We’ve built AI on the backs of human creativity. The question isn’t whether to stop AI — it’s how to make it fair.
A blanket licence won’t solve every problem. But it could be the start of a system where creators aren’t left behind — and where AI learns with permission, not just ambition.