
IPMojo


Copyright

August 1, 2025 by Scott Coulthart

Copy Paste App? The Pleasures and Pitfalls of Screenshot-to-Code Tools

Imagine this: you take a screenshot of your favourite SaaS dashboard, upload it to a no-code AI tool, and minutes later you have a functioning version of the same interface — layout, buttons, styling, maybe even a working backend prototype. Magic? Almost.

Welcome to the world of screenshot-to-code generators — tools that use AI and no-code logic to replicate functional software from images. These platforms (like Galileo AI, Builder.io, and Uizard) promise rapid prototyping, faster MVP launches, and a lower barrier to entry for founders, designers, and product teams alike.

But while the tech is impressive, the legal waters are murkier. Here’s the pleasure and the pitfall.


🚀 The Pleasure: Design to Prototype at Lightspeed

The promise is seductive:

  • Rapid prototyping: What used to take weeks of front-end dev can now take hours — sometimes minutes.

  • Visual to functional: AI converts static designs (or even screenshots of existing apps) into working interfaces with mock data or basic logic.

  • Lower costs: Startups or solo devs can build more for less — less code, less labour, and less time.

Tools like Galileo AI and Uizard are being used to generate mock admin panels, mobile UI concepts, and even pitch-ready MVPs. They’re ideal for internal dashboards, client demos, or iterating fast before investing in full-stack builds.

But many users go further — taking screengrabs from existing platforms (think Notion, Salesforce, Figma, Xero) and asking the AI to “make me one of these.”

And that’s where the problems begin.


⚠️ The Pitfall: Copyright, Clones, and Clean Hands

Just because a tool can replicate an interface doesn’t mean you should — especially if your starting point is a screenshot of someone else’s software.

Here are the big legal traps to watch out for:

1. Copyright in the Interface

While copyright doesn’t protect ideas, it does protect expressions — including graphic design, layout, icons, fonts, and even the “look and feel” of certain interfaces. If your cloned UI copies the visual design of another product too closely, you may be infringing copyright (or at least inviting a legal headache).

Australia’s Desktop Marketing Systems v Telstra [2002] FCAFC 112 reminds us that copyright can exist in compilations of data or structure — not just in pretty pictures.

2. Trade Dress and Reputation

Even if your app doesn’t copy the code, a lookalike interface could expose you to a passing off claim, or to a claim for misleading or deceptive conduct under the Australian Consumer Law, if it creates confusion with an established brand. That risk increases if you’re operating in a similar space or targeting the same user base.

The global tech giants have deep pockets — and they’ve sued for less.

3. Terms of Use Breaches

Many platforms prohibit copying or reverse engineering their interfaces. Uploading screenshots of their product to an AI builder might violate their terms of service — even if your clone is only for internal use.

This isn’t just theory: platforms like OpenAI and Figma already use automated tools to detect and act on terms breaches — especially those that risk commercial leakage or brand dilution.

4. No Excuse Just Because the Tool Did It

You can’t hide behind the AI. If your clone infringes IP rights, you’re liable — not the platform that helped you build it. The tool is just that: a tool.

In legal terms, there’s no “my AI made me do it” defence.


🤔 So What Can You Do?

  • ✅ Use these tools for original designs: Sketch your own wireframes, then let the AI flesh them out.

  • ✅ Take inspiration, not duplication: You can draw ideas from good UI — but avoid replicating them pixel-for-pixel.

  • ✅ Use public design systems: Many platforms release UI kits and components under open licences (e.g., Material UI, Bootstrap). Start there (see the sketch after this list).

  • ✅ Keep it internal: If you must replicate an existing interface to test functionality, don’t deploy it publicly — and definitely don’t commercialise it.

  • ✅ Get advice: If you’re close to the line (or don’t know where the line is), speak to an IP lawyer early. Clones are cheaper than court.
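
For example, here’s a minimal sketch of what “starting from an open design system” can look like in practice: a simple dashboard card assembled from Material UI’s MIT-licensed components. The component name, figures and labels are purely illustrative, not taken from any real product.

```tsx
import * as React from "react";
import Button from "@mui/material/Button";
import Card from "@mui/material/Card";
import CardContent from "@mui/material/CardContent";
import Typography from "@mui/material/Typography";

// A small dashboard card built up from openly licensed Material UI components,
// rather than cloned pixel-for-pixel from someone else's product.
// All names and figures here are illustrative placeholders.
export function RevenueCard() {
  return (
    <Card sx={{ maxWidth: 320 }}>
      <CardContent>
        <Typography variant="overline">Monthly revenue</Typography>
        <Typography variant="h4">$12,400</Typography>
        <Button variant="contained" sx={{ mt: 2 }}>
          View report
        </Button>
      </CardContent>
    </Card>
  );
}
```

Composing your interface from components like these keeps the layout and styling your own, which is a far safer starting point than a screenshot of a competitor.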


🧠 Final Thought: Just Because You Can…

…doesn’t mean you should.

AI is rapidly transforming the way software is built — but it’s also tempting users to cut corners on IP. Using these tools responsibly means treating screenshots not just as pixels, but as possibly protected property.

Build fast — but build clean.

Filed Under: AI, Copyright, Digital Law, IP, Technology Tagged With: AI, Copyright, Digital Law, IP, Technology

July 3, 2025 by Scott Coulthart

Extinct Rights and Existential Threats: Copyright Claims that Shouldn’t Bite, But Still Do

In the world of entertainment, nothing stops a production faster than a copyright claim — even when the claimant doesn’t actually have a leg (or claw) to stand on.

Take, for example, a recent kerfuffle involving a dinosaur skull. Yes, a dinosaur skull. A marketing team used a stock image of a fossil for a film poster. Shortly after, they received a cease and desist from someone asserting copyright over the image — or more nebulously, over “the representation” of the skull itself. A fossilised jawbone that hasn’t had an original idea in 66 million years? Not the most obvious candidate for copyright protection.

And yet, the production company found itself in a bind. Time was short, the distributor was nervous, and the platform had already flagged the content under its automated systems. Legally, the claim didn’t stack up. Commercially, it still had teeth.

Welcome to the prehistoric jungle of copyright brinksmanship.

When Copyright Doesn’t Exist (But the Threat Still Does)

Under Australian law — and most copyright regimes — not everything is capable of being protected:

  • Natural formations (like fossils, mountains, or sea shells)? No copyright.

  • Basic factual photos taken with no creativity (e.g. museum catalogue snaps)? Likely not protected, or may already be in the public domain.

  • Ancient artefacts or artworks? Copyright has almost certainly expired — assuming it existed in the first place.

Even modern reproductions of old things don’t necessarily create new rights. The High Court in IceTV Pty Ltd v Nine Network Australia Pty Ltd [2009] HCA 14 was crystal clear: originality matters. Sweat of the brow won’t cut it.

But that doesn’t stop people — or bots — from claiming rights anyway. Particularly when money, notoriety, or mistaken beliefs are at play.

The Problem with Platform Panic

Most modern disputes don’t start in court — they start with takedown notices:

  • A DMCA claim to YouTube,

  • A flagged post on Meta or TikTok,

  • A licensing hold-up at the distributor level.

These processes are fast, opaque, and slanted toward rights holders (real or imagined). Fighting them requires time and evidence. Meanwhile, your release window slips away and your investor starts asking questions.

What Should You Do? Five Jurassic Principles

1. Know What You’re Looking At
Ask: Is the underlying subject matter capable of copyright protection? A taxidermied tiger, a Greco-Roman bust, or a piece of driftwood might not pass the threshold.

2. Trace the Chain
Even if the subject is unprotectable, the photograph or render might be. But who owns it? Was it created under licence? Is it truly original?

3. Don’t Assume the Claimant Understands Copyright Law
Many people think “I took a photo, so I own the image of the thing in the photo.” That’s not how it works. Be prepared to explain gently (or not so gently).

4. Be Ready to Push Back
If you’re legally in the clear, a well-structured rebuttal often gets the job done. Cite relevant case law, identify flaws in the claim, and invite the claimant to back down — preferably in writing.

5. But Be Prepared to Deal
Sometimes, you do a deal not because the claim is strong, but because the alternative is too costly. A nuisance settlement, a rights clarification, or a rapid poster redesign might save your release. That’s not weakness — that’s triage.

Final Thought: A Question of Extinction

When copyright claims are fossilised nonsense, you don’t need to panic — but you do need to be strategic. Entertainment projects move fast, and delays can cost more than a licence ever would. Know your rights, document your sources, and don’t be afraid to call a bluff.

But also? Don’t let a prehistoric claim derail your production. Even in the Jurassic jungle of IP, survival is about knowing when to roar and when to run.

Filed Under: Copyright, Entertainment, IP Tagged With: Copyright, Entertainment, IP

June 26, 2025 by Scott Coulthart

DMCA Abuse, Deleted Evidence and Damaged Credibility

In C21 Pty Ltd (Trustee) v Hou (No 6) [2025] FedCFamC2G 927, Judge Manousaridis handed down a strongly worded decision marking the latest chapter in a copyright enforcement saga — and it’s not one Mr Hou will be pleased with.

The case delivers a clear warning about the misuse of takedown procedures, destruction of evidence, and strategic dishonesty in IP disputes.

The Story So Far

C21, a real estate agency and video producer, had previously succeeded in proving that Mr Hou had infringed copyright in a range of marketing videos and photographs. The current decision dealt with the consequences: what additional orders should flow from the infringements — and how Mr Hou’s subsequent conduct should influence those outcomes.

DMCA Misuse

One of the key issues was Mr Hou’s deliberate use of DMCA takedown notices to get C21’s legitimately owned videos removed from YouTube. Despite prior court findings that C21 owned the copyright in those materials, Mr Hou sent notices claiming infringement — knowingly and falsely asserting he was the rights holder.

Judge Manousaridis held that these takedown notices were issued:

  • With no lawful basis;

  • In a continuing effort to damage C21’s business;

  • And in knowing contradiction to the findings in earlier proceedings.

Deleted Evidence

Equally concerning was Mr Hou’s deletion of thousands of emails, including emails that may have contained information relevant to the proceedings. The Court accepted that this was done:

  • After the proceedings had commenced;

  • With knowledge of the likely relevance of those materials;

  • And without any acceptable explanation for their destruction.

This led to adverse inferences being drawn about the deleted material.

Additional Damages and Costs

Given the flagrancy of the infringement and the subsequent conduct:

  • The Court awarded additional damages under s 115(4) of the Copyright Act;

  • Compensatory damages were set at $4,200;

  • Additional damages were calculated at $17,000, taking into account Mr Hou’s conduct and the need for deterrence;

  • Full costs were awarded to C21 on a standard basis.

Main Takeaways

The decision is a powerful reminder that:

  • Copyright enforcement tools like DMCA notices must not be weaponised — false claims are not just unethical, they’re legally risky;

  • Deleting relevant evidence mid-litigation can be just as damaging to your case as the infringement itself;

  • Courts take reputational harm and procedural abuse seriously, and will respond with enhanced penalties.

Filed Under: Copyright, IP Tagged With: Copyright, IP

June 24, 2025 by Scott Coulthart

Fair Use or Free Ride? The Case for an AI Blanket Licence

What if AI companies had to pay for the content they train on? Welcome to the next frontier in copyright law — where inspiration meets ingestion.

When AI companies train their models — whether for music, image generation, writing or video — they don’t do it in a vacuum. They train on us. Or more precisely: on our songs, our blogs, our art, our tweets, our books, our interviews.

They harvest it at scale, often scraped from the open web, with or without permission — and certainly without compensation.

This has prompted an increasingly vocal question from creators and content owners:

Shouldn’t we get paid when machines learn from our work?

The proposed answer from some corners: a blanket licensing regime.

What’s a Blanket Licence?

Nothing to do with bedding – a blanket licence is a pre-agreed system for legal reuse. It doesn’t ask for permission each time. Instead, it says:

You can use a defined pool of material for a defined purpose — if you pay.

We already see this in:

  • Music royalties (e.g. APRA, ASCAP, BMI)

  • Broadcast and public performance rights

  • Compulsory licensing of cover songs in the US

Could the same apply to AI?

What the Law Says (or Doesn’t)

AI companies argue that training their models on public material is “fair use” (US) or doesn’t involve “substantial reproduction” (Australia), since no exact copy of the work appears in the output.

However, copies are made during scraping, and substantial parts are almost certainly reproduced during the training process or embedded in derivative outputs — either of which could pose problems under both US and Australian copyright law.

But courts are still catching up.

Pending or recent litigation:

  • The New York Times v OpenAI: scraping articles to train GPT

  • Sarah Silverman v Meta: use of copyrighted books

  • Getty Images v Stability AI: image training and watermark copying

None of these cases have yet resolved the underlying issue:

Is training AI on copyrighted works a use that requires permission — or payment?

What a Blanket Licence Would Do

Under a blanket licence system:

  • Training (and copying or development of derivatives for that purpose) would be lawful, as long as the AI provider paid into a fund

  • Creators and rights holders would receive royalty payments, either directly or via a collecting society

  • A legal baseline would be established, reducing lawsuits and uncertainty

This would mirror systems used in broadcasting and streaming, where revenue is pooled and distributed based on usage data.
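
To make the mechanics concrete, here’s a minimal sketch (in TypeScript, with entirely hypothetical figures, rights holders, and a simple tokens-ingested measure of use) of how a pooled licence fee might be split pro rata on usage data:

```ts
// Hypothetical sketch only: a pooled AI training licence fee distributed
// pro rata to rights holders, based on how much of their material was ingested.
// The weighting approach, figures and names are illustrative assumptions,
// not a description of any real or proposed scheme.

type Usage = { rightsHolder: string; tokensIngested: number };

function distributePool(poolAud: number, usage: Usage[]): Map<string, number> {
  const totalTokens = usage.reduce((sum, u) => sum + u.tokensIngested, 0);
  const payouts = new Map<string, number>();
  for (const u of usage) {
    // Each rights holder's share is proportional to their share of the corpus.
    payouts.set(u.rightsHolder, poolAud * (u.tokensIngested / totalTokens));
  }
  return payouts;
}

// Example: a $1m annual pool split across three hypothetical rights holders.
const payouts = distributePool(1_000_000, [
  { rightsHolder: "News publisher", tokensIngested: 600_000_000 },
  { rightsHolder: "Music catalogue", tokensIngested: 250_000_000 },
  { rightsHolder: "Photo library", tokensIngested: 150_000_000 },
]);
console.log(payouts); // 600,000 / 250,000 / 150,000 respectively
```

Real schemes would be far messier: measuring use is hard, weightings are contested, and the size of the pool itself has to be negotiated. Which is exactly where the challenges begin.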

Challenges Ahead

1. Who Gets Paid?

Not all data is traceable or attributed. Unlike Spotify, which tracks each song streamed, AI models ingest billions of unlabelled tokens.

How do you determine who owns what — and which parts — of material abstracted, fragmented, and stored somewhere in the cloud?

2. How Much?

Rates would need to reflect:

  • The extent of use

  • The importance of the material to the training corpus

  • The impact on the original market for the work

This is tricky when a model is trained once and then used forever.

3. Which Countries?

Copyright laws vary. A licence in Australia might mean nothing in the US.

A global licence would require multilateral cooperation — and likely WIPO involvement.

Legal Precedent: Australia’s Safe Harbour and Statutory Licensing Models

Australia’s own statutory licensing schemes (e.g. educational copying under Part VB of the Copyright Act) show that:

  • Lawmakers can mandate payment for certain uses,

  • Even if individual rights holders never negotiated the terms,

  • Provided it’s reasonable, transparent, and compensatory.

But those systems also brought:

  • Bureaucratic collection processes

  • Contentious allocation models

  • Endless legal wrangling over definitions (What is “reasonable portion”? What qualifies as “educational purpose”?)

Expect the same for AI.

Creators and Innovation: A Balancing Act

For creators:

  • A blanket licence offers recognition and payment

  • It helps avoid the current “scrape now, settle later” model

  • It could fund new creative work rather than hollowing out industries

For innovators:

  • It provides legal certainty

  • Encourages investment in AI tools

  • Reduces the risk of devastating retroactive litigation

But if set up poorly, it could:

  • Be exclusionary (if licensing fees are too high for small players)

  • Be ineffective (if rights aren’t properly enforced or distributed)

  • Or be too slow to match AI’s pace

What’s Next?

Australia’s Copyright Act doesn’t currently recognise training as a specific form of use. But policy reviews are under way in multiple countries, including by:

  • The UK IPO

  • The European Commission

  • The US Copyright Office

  • And here in Australia, the Attorney-General’s Department is conducting consultations through 2024–25 on how copyright law should respond to AI

Creators, platforms, and governments are all watching the courts. But if consensus forms around the need for structured compensation, a statutory blanket licence might just be the solution.


Bottom Line

We’ve built AI on the backs of human creativity. The question isn’t whether to stop AI — it’s how to make it fair.

A blanket licence won’t solve every problem. But it could be the start of a system where creators aren’t left behind — and where AI learns with permission, not just ambition.

Filed Under: AI, Copyright, Digital Law, IP, Technology Tagged With: AI, Copyright, Digital Law, IP, Technology

June 2, 2025 by Scott Coulthart

Whose Work Is It Anyway? The Remix War, AI, Coffee Plungers and Swimsuits

From Elton John to anonymous meme-makers, a battle is raging over what it means to be “creative” — and whether it starts with permission.

Two stories made waves in copyright circles last week:

  • In the UK, Sir Elton John, Sir Paul McCartney and other musical heavyweights called for stronger rules to stop AI from “scraping” their songs without a licence.

  • In India, news agency ANI drew criticism for aggressively issuing YouTube copyright claims — even for sub-10 second clips — triggering takedown threats against creators.

At first glance, these might seem worlds apart. But they highlight the same question:

At what point does using someone else’s work become exploitation, not inspiration?

And who decides?

Creators vs Reusers: Two Sides of the Copyright Culture Clash

On one side: Creators — musicians, writers, filmmakers, photographers — frustrated by tech platforms and algorithms ingesting their work without permission. Whether it’s AI training data or news footage embedded in political commentary, their message is the same:
“You’re building on our backs. Pay up.”

On the other side: Remixers, meme-makers, educators, and critics argue that strict copyright regimes chill creativity. “How can we critique culture,” they ask, “if we’re not allowed to reference it?”

This isn’t new — hip hop, collage art, satire, and even pop music are full of samples and nods. But AI has industrialised the scale of reuse. It doesn’t borrow one beat or a single shot. It eats the entire catalogue — then spits out something “new.”

So what counts as originality anymore?

Australian Lens: Seafolly, Bodum, and the Meaning of “Original”

Seafolly v Madden [2012] FCA 1346

In this high-profile swimwear spat, designer Leah Madden accused Seafolly of copying her designs, posting comparison images on social media that implied Seafolly had engaged in plagiarism. Seafolly sued for misleading and deceptive conduct under ss 52 and 53 of the Trade Practices Act 1974 (the predecessors to s 18 and related provisions of the Australian Consumer Law, which had commenced by the time of the proceedings but not when the relevant conduct occurred).

The Federal Court found that Madden’s claims were unsubstantiated and therefore misleading, because the design similarities were not the result of actual copying. The case reinforced that:

  • Independent creation is a valid defence, even if the resulting works are similar

  • Superficial resemblance isn’t enough — there must be a causal connection

It’s a reminder that derivation must be substantial and material, not speculative or assumed.

Bodum v DKSH [2011] FCAFC 98

This case involved Bodum’s iconic French press coffee plunger — the Chambord — and whether a rival product sold by DKSH under the “Euroline” brand misled consumers or passed off Bodum’s get-up as its own.

Bodum alleged misleading or deceptive conduct and passing off, based not on name or logo, but on the visual appearance of the product: a clear glass beaker, metal band, and distinctive handles, which had come to be strongly associated with Bodum.

At trial, the Federal Court rejected Bodum’s claims. But on appeal, the Full Federal Court reversed that decision, holding that:

  • Bodum had a substantial reputation in the get-up alone;

  • The Euroline plunger was highly similar in appearance; and

  • DKSH’s failure to adequately differentiate its product through branding or design gave rise to a misleading impression.

Both passing off and misleading/deceptive conduct (also under the old s52) were found. The Court emphasised that reputation in shape and design can be enough — and differentiation must be meaningful, not tokenistic.

The AI Angle: Who Trains Whom?

AI tools like ChatGPT, Midjourney, and Suno don’t just copy works. They learn patterns from thousands of inputs. But in doing so, they arguably absorb creative expression — chord progressions, phrasing, brushstroke styles — and then make new outputs in that same vein.

AI developers claim this is fair use or transformative. Artists argue it’s a form of invisible appropriation — no different from copying and tweaking a painting, but with zero attribution or compensation.

It’s the Seafolly and Bodum problem, scaled up: if AI’s “original” work was trained on 10,000 human ones, is it really original? Or just a remix with plausible deniability?

The Bottom Line

Copyright law is meant to balance:

  • Encouraging creativity

  • Rewarding labour

  • Allowing critique and cultural dialogue

But that balance is breaking under the weight of machine learning models and automated copyright bots. As Seafolly and Bodum show, the law still values intention, process, and context — not just resemblance.

Yet in a world of remix and AI, intention is opaque, and process is synthetic.

So where do we draw the line?

Filed Under: AI, Copyright, Entertainment, IP Tagged With: AI, Copyright, Entertainment, IP

May 27, 2025 by Scott Coulthart

Who Owns the Music? Taylor Swift and the Master Rights Nobody Talks About

She might be Swift, but she wasn’t quick enough to catch the Scooter back in the day. But now, all that has changed …

It’s the music industry story that refuses to fade: Taylor Swift may finally have the chance to buy back her original masters — the recordings that launched her global superstardom. If the deal happens, it would close a saga that began in 2019, when her former label, Big Machine, was sold to Scooter Braun’s Ithaca Holdings, which later on-sold those recordings to private equity giant Shamrock Capital.

For Swift fans, it’s a long-awaited victory. But for lawyers — and especially those in IP — the story is a masterclass in what most people don’t understand about music rights.

Let’s break it down.

It’s Her Song, But Not Her Recording

When people say “Taylor Swift owns her music,” they’re often talking about copyright in the song/composition itself — comprising the lyrics, melodies and chord structure. And yes, she owns or co-owns the copyright in many of her compositions, particularly the later albums.

But that’s not the same as owning the recordings. The actual sound recordings of her early music — the studio masters — were owned by her former label, Big Machine Records. That’s standard in the music industry. Unless you’re a major independent artist or had rare contract leverage, your label usually controls the master rights from day one.

So even though the voice on those original albums is Taylor’s, and even though the songs are her words and melodies, the master recordings were never hers to begin with.

Why Master Ownership Matters

Owning the masters means controlling how the recordings are used, licensed, sold, or synced in media. If someone wants to use the originally recorded “Love Story” in a film, the master rights holder — not Taylor — says yes or no and collects the licensing fee.

It also means revenue. Master owners collect royalties from streaming, downloads, radio play, and physical sales. For a catalogue like Swift’s, we’re talking tens of millions of dollars per year.

In fairness, Taylor also collects royalties as the songwriter – but not as much as she would if she owned the masters too.

When Swift lost control of her masters, she didn’t just lose licensing rights — she lost influence over how those recordings were represented commercially, something she’s made clear she cares deeply about.

The Re-Recording Strategy — and What This Offer Means

Swift’s response was bold: she began re-recording her albums (as “Taylor’s Versions”) to reclaim both control and commercial value. Because copyright law allows the same songwriter to create a new recording of their own work, she’s been able to rebuild her catalogue under her own terms.

But this new offer — to buy back the original recordings — is different. It’s about reconciling emotional legacy and legal control. For Swift, it could mean regaining ownership of the original audio associated with her rise to fame … and far more royalties.  For Shamrock Capital, it could mean cashing out at a high watermark while retaining goodwill.

The Legal Lesson

Here’s the IP truth every artist — and every lawyer advising creators — should remember:

  • Songs and recordings are separate IP assets with separate ownership structures.

  • A performer can own either, neither, or both.

  • Contract terms set at the start of a career can shape or strangle an artist’s control for decades.

For artists, the Swift story is a cautionary tale — but also a blueprint. For lawyers, it’s a reminder to explain the difference between composition rights, performance rights, and master rights clearly — preferably before the artist becomes a household name.

And for Swifties? It’s one more reason to stream the hell out of 1989 (Taylor’s Version).

Filed Under: Copyright, Entertainment, IP Tagged With: Copyright, Entertainment, IP

© Scott Coulthart 2025