

Technology

September 29, 2025 by Scott Coulthart

Deepfakes on Trial: First Civil Penalties Under the Online Safety Act

The Federal Court has handed down its first civil penalty judgment under the Online Safety Act 2021 (Cth), in eSafety Commissioner v Rotondo (No 4) [2025] FCA 1191.

Justice Longbottom ordered Anthony (aka Antonio) Rotondo to pay $343,500 in penalties for posting a series of non-consensual deepfake intimate images of six individuals, and for failing to comply with removal notices and remedial directions issued by the eSafety Commissioner.


Key Points

1. First penalties under the Online Safety Act

This is the first time civil penalties have been imposed under the Act, making it a landmark enforcement case.

The Commissioner sought both declarations and penalties, with the Court emphasising deterrence as its guiding principle.

2. Deepfakes squarely captured

The Court confirmed that non-consensual deepfake intimate images fall within the Act’s prohibition on posting “intimate images” without consent.

Importantly, it rejected Rotondo’s submission that only defamatory or “social media” posts should be captured.

3. Regulatory teeth and enforcement

Rotondo received notices under the Act but responded defiantly (“Get an arrest warrant if you think you are right”) before later being arrested by Queensland Police on related matters.

His lack of remorse and framing of deepfakes as “fun” aggravated the penalty.

4. Platform anonymity

Although the Commissioner did not object, the Court chose to anonymise the name of the website hosting the deepfakes — reflecting a policy judgment not to amplify harmful platforms.

That said, several of the newspapers reporting on this story revealed the website's address, while noting that the site has since been taken down.

IP Mojo is choosing not to reveal that website.

5. Civil vs criminal overlap

Alongside the civil penalties, the Court noted criminal charges under Queensland’s Criminal Code.

This illustrates how civil, regulatory and criminal enforcement can run in parallel.


Why It Matters

  • For regulators: This case confirms the Act has teeth. Regulators can secure significant financial penalties even where offenders are self-represented.

  • For platforms: The Court’s approach signals that services hosting deepfakes are firmly in scope, even if located offshore.

  • For the public: The judgment highlights the law’s adaptability to AI-driven harms — and sends a clear deterrence message.

  • For practitioners: Expect more proceedings of this kind, particularly as the prevalence of AI-generated abuse grows.

Filed Under: AI, Digital Law, Privacy, Regulation, Technology

September 17, 2025 by Scott Coulthart

Aristocrat’s Jackpot: Full Court Revives Gaming Machine Patents

When does a slot machine cross the line from an abstract idea to a patentable invention?

After years of litigation, remittals, and even a 3–3 deadlock in the High Court, the Full Federal Court has finally tipped the balance in Aristocrat’s favour.

🎰 The Long Spin

Aristocrat has been fighting since 2018 to keep its patents over electronic gaming machines (EGMs) with “configurable symbols” — feature games that change play dynamics and prize allocation. The Commissioner argued these were just abstract rules of a game dressed up in software. Aristocrat said they were genuine machines of a particular construction that yielded a new and useful result.

The case bounced through:

  • Delegate (2018): patents revoked.

  • Burley J (2020): Aristocrat wins.

  • Full Court (2021): Aristocrat loses (majority invents “advance in computer technology” test).

  • High Court (2022): split 3–3, affirming the Full Court’s result by default under Judiciary Act s 23(2)(a).

  • Remittal (2024): Burley J reluctantly applies Full Court reasoning against Aristocrat.

Cue the latest appeal.

⚖️ The Precedent Puzzle

The Full Court (Beach, Rofe & Jackman JJ) confronted a thorny problem: should it stick to its own 2021 reasoning when the High Court had unanimously rejected that reasoning, even though no majority emerged?

The answer: No.

  • Only majority or unanimous High Court views are binding.

  • But the High Court’s unanimous criticism provided a “compelling reason” to abandon the earlier Full Court approach.

  • The Court found “constructive error” — not blaming Burley J, but recognising the law had to move on.

🖥️ Rethinking “Manner of Manufacture”

The Court reframed the test for computer-implemented inventions:

  • Not patentable: an abstract idea manipulated on a computer.

  • Patentable: an abstract idea implemented on a computer in a way that creates an artificial state of affairs and useful result.

Applying this, Aristocrat’s claim 1 was patentable — and by extension, so were the dependent claims across its four patents. The EGMs weren’t just abstract gaming rules. They were machines, purpose-built to operate in a particular way.

💡 Why It Matters

  • For patentees: This revives hope for computer-implemented inventions beyond “pure software” where technical implementation creates a new device or process.

  • For examiners: IP Australia may need to recalibrate examination practice on software-related patents — the “advance in computer technology” yardstick is gone.

  • For practitioners: This is a case study in how precedent, process, and patents collide. The High Court’s split didn’t end the story — it forced the Full Court to resolve it.

🚀 Takeaway

The Full Court has effectively reset the slot reels. Aristocrat’s EGMs are back in play, and the scope of patentable computer-implemented inventions in Australia looks a little brighter.

Sometimes the house doesn’t win.

Filed Under: Digital Law, Gaming Law, IP, Patents, Technology

September 16, 2025 by Scott Coulthart

Epic Won the Battle. Now Developers Want Their Refunds.

When Epic Games went head-to-head with Apple, the Federal Court found that Apple misused its market power by locking iOS developers into the App Store and its payment system. That was big. But the Anthony v Apple class action takes it a step further: what if Apple has been overcharging Australian developers and consumers for years?

From liability to dollars

In Epic v Apple [2025] FCA 900, Justice Beach held that Apple’s restrictions substantially lessened competition in two markets:

  1. iOS app distribution; and

  2. iOS in-app payment solutions.

That case was about liability — whether Apple broke the law.

In Anthony v Apple Inc [2025] FCA 902, the Court applied those findings in the context of a class action by developers and users. This time, the question wasn’t just “did Apple misuse its power?” but “what should Apple have charged if competition had been allowed?”

The counterfactual commission

Apple famously takes up to 30% of in-app revenue. The class action alleges that this cut was inflated by Apple’s anti-competitive restrictions.

Justice Beach accepted that the key issue was whether commissions exceeded the “counterfactual” level — i.e. the rate that would have prevailed in a competitive market.

That’s not just a legal puzzle. It’s an economic modelling exercise: estimating what rival app stores and payment processors would have charged, and how Apple’s fees distorted prices across the app ecosystem.
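The core arithmetic behind such a claim is simple, even if the economics of proving the counterfactual rate is not. As a purely illustrative sketch (the rates, revenue figures and time period below are invented assumptions, not figures from the judgment or the pleadings), the alleged overcharge is the gap between the commission actually paid and the commission a competitive market would have supported:

```python
# Illustrative overcharge model: damages = revenue * (actual rate - counterfactual rate).
# All figures are hypothetical; the real exercise turns on contested expert economic evidence.

def overcharge(revenue: float, actual_rate: float, counterfactual_rate: float) -> float:
    """Return the alleged overcharge on a given stream of in-app revenue."""
    return revenue * (actual_rate - counterfactual_rate)

# Hypothetical developer: $1m of in-app revenue per year for 5 years,
# paying the 30% headline rate against an assumed competitive rate of 15%.
yearly_revenue = 1_000_000
claim = sum(overcharge(yearly_revenue, 0.30, 0.15) for _ in range(5))
print(claim)  # 750000.0
```

Multiply a model like this across thousands of group members and years of trading, and the scale of the potential damages pool becomes obvious.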

Why this matters

  • Developers: If successful, they may recover damages for inflated commissions they’ve paid over years. That could mean real money back into the hands of Australian app makers.

  • Consumers: If commissions were inflated, those costs were often passed on through higher app and in-app purchase prices. Compensation claims could extend to end users.

  • Apple (and Google): The damages bill could be eye-watering. Liability findings are one thing; being ordered to pay back billions is another.

A coordinated strategy

Justice Beach emphasised that his reasons in Anthony v Apple should be read together with Epic v Apple and Epic v Google. These aren't three unrelated cases: they form a coordinated litigation front against the app store model.

First, establish liability (Epic).
Then, pursue compensation (Anthony).
Finally, broaden the net (Epic v Google).

The bigger picture

Globally, regulators and courts are converging on the same theme: Apple and Google can’t use security or convenience as a shield for overcharging.

Australia’s twist? Class actions have a way of turning abstract competition law into concrete refunds.

⚖️ The takeaway

Epic v Apple broke open the wall. Anthony v Apple asks whether Apple should hand back the gold it’s been collecting inside.

This isn’t just another round in the same fight — it’s the damages phase of the app store wars. And it could hit closer to home for Australian developers and users than anything Epic ever fought for.

Filed Under: Competition Law, Digital Law, Remedies, Technology

September 12, 2025 by Scott Coulthart

Epic Down Under: How Australia Took a Bite Out of Apple’s Walled Garden

When Epic Games took on Apple in the US and Europe, the headlines practically wrote themselves – it was billed as a David-and-Goliath showdown between the Fortnite maker and the Cupertino colossus. Now, the same fight has reached Australian shores — and the Federal Court has bitten into Apple’s walled garden.

When Epic Games v Apple erupted in the US in 2020, Fortnite, the global gaming juggernaut, had been punted from the App Store after Epic tried to sneak in its own cheaper payment system. Cue a legal battle royale over whether Apple’s “walled garden” was innovation, exploitation, or both.

Fast forward to August 2025, and the fight has gone local. In Epic Games, Inc v Apple Inc [2025] FCA 900, Justice Beach of the Federal Court handed down a sprawling 6,347-paragraph judgment — and while not everything went Epic’s way, the headline is clear: Apple misused its market power under s 46 of the Competition and Consumer Act 2010 (Cth).

Australia has officially joined the global chorus questioning how far Big Tech’s gatekeeping power can go.

The legal frame: two markets, one gatekeeper

Epic’s case hinged on market definition — the first battlefield of any competition law fight. Epic said there were two relevant markets:

  1. The iOS app distribution market – how apps get onto iPhones and iPads.

  2. The iOS in-app payment solutions market – how digital content is paid for inside apps.

Apple argued for something broader: a market for “app transactions”, with plenty of alternatives. Justice Beach wasn’t buying it. He sided with Epic’s narrower framing, recognising that Apple was the sole gatekeeper in both distribution and in-app digital payments.

From there, the logic snowballed: Apple’s rules preventing sideloading (direct downloads) and banning alternative payment systems substantially lessened competition. That, in turn, triggered contraventions of s 46 (misuse of market power) and s 47 (exclusive dealing).

Epic didn’t get everything it wanted — some claims under s 21 of the ACL (unconscionable conduct) fell flat, and Apple’s ban on rival app stores inside the App Store was upheld. But the central wins are seismic.

Apple’s defence: “But security!”

Apple leaned heavily on security as a justification. Its argument: a centralised, curated App Store keeps users safe from malware, fraud, and scams.

Justice Beach accepted there were genuine security benefits — Apple’s model really does provide higher baseline quality and safety compared to the Wild West of sideloaded apps.

However, crucially, he ruled that security doesn’t trump competition law. A legitimate purpose (protecting users) doesn’t erase the anti-competitive effects (locking out rivals) – or as the Court put it: “The existence of a security purpose says little about the effect or likely effect of Apple’s restrictive conduct in terms of competition questions.”

What this means for…

Developers

Epic signalled it would jump into iOS distribution if given the chance.

Others will follow. Think Spotify, payment processors like Stripe, or even local players offering niche app stores.

The judgment cracks open a door that’s been locked since 2008.

Consumers

If remedies flow, users could see lower prices and more choice.

Developers paying Apple’s 30% cut have long argued they’re forced to inflate in-app purchase prices.

Alternatives could push those costs down — though don’t expect Apple to give up without a fight.

Regulators

The decision aligns with the ACCC’s Digital Platform Services Inquiry, which has repeatedly flagged Apple’s and Google’s control over app ecosystems.

Australia may now move closer to Europe’s Digital Markets Act, which mandates interoperability and alternative app stores.

Apple

Even if remedies are still pending, the finding of liability alone is a reputational hit.

Apple’s “we know best” stance has always traded on consumer trust. Now it must reckon with courts telling it that choice matters too.

The global context

This ruling doesn’t exist in a vacuum. It follows:

  • The US: where Epic’s case against Apple produced a mixed bag — Apple largely won at trial, but Epic clawed back some ground on appeal.

  • The EU: where the Digital Markets Act forced Apple to allow rival app stores and alternative payment methods in early 2024.

  • South Korea and Japan: already experimenting with app store regulation.

Australia is now firmly in the mix. What started as a Fortnite scuffle is becoming a global test of whether digital gatekeepers can keep locking the gates.

Where to next?

Justice Beach has reserved questions of relief for another hearing.

That means the really juicy part — what remedies Apple will face in Australia — is still to come. Options range from structural orders (sideloading must be allowed) to behavioural remedies (Apple must permit rival payment providers).

Whatever the outcome, one thing is clear: the walled garden isn’t as impregnable as Apple thought – and Epic may have just turned its legal battle royale into a global trend of regulatory respawn.

⚖️ The takeaway

For digital lawyers, regulators, and anyone building on someone else’s platform: this case is a reminder that market power is never just a tech problem — it’s a legal one.

When innovation, competition, and consumer choice collide, courts are willing to get out the pickaxe.

Filed Under: Competition Law, Digital Law, Technology

August 7, 2025 by Scott Coulthart

Your Data, My Model? Why AI Ambitions Demand a Contract Check-Up

As AI capabilities become standard fare in SaaS platforms, software providers are racing to retrofit intelligence into their offerings. But if your platform dreams of becoming the next ChatXYZ, you may need to look not to your engineering team, but to your legal one.

The Problem with “Your Data”

Most software providers already have mountains of processed, transformed and inferred data—data shaped by customer inputs and platform logic. That data could supercharge AI development, from powering smarter dashboards to training predictive algorithms.

But here’s the rub: just because the data isn’t raw customer input doesn’t mean you can freely use it.

You may assume your standard software licence or SaaS agreement gives you all the rights you need. It probably doesn’t.

What Does the Contract Say?

Take a typical clause like this:

“The Customer grants the Provider a non-exclusive, irrevocable licence to use Customer Data to the extent reasonably required to provide the Services and for use in the Provider’s business generally.”

Even a broad “use in our business generally” clause won’t necessarily cover:

  • Using processed or aggregated data from multiple customers

  • Training an AI model whose outputs are shared with others

  • Commercialising new AI-powered features not contemplated in the original deal

And if the data is derived from inputs that were themselves confidential or personal, you've got even more legal landmines: privacy law, confidentiality obligations, and IP ownership issues if the customer contributed meaningful structure to the dataset.

Is Deidentification Enough (or even Allowed)?

A common fallback is: “We’ll just deidentify the data.” But that’s not a bulletproof strategy.

Under most privacy regimes, data is only considered deidentified if re-identification is not reasonably possible—a high bar, especially in small or specialised datasets. Even deidentified data may still be contractually protected if it originates from information the customer expects to be confidential.
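The "small datasets" point can be made concrete. One common (though not legally definitive) yardstick is k-anonymity: if some combination of remaining attributes, say postcode plus industry, is shared by only one record, that record is realistically re-identifiable even with names stripped. A minimal sketch, using invented example data:

```python
from collections import Counter

def smallest_group(records, quasi_identifiers):
    """Return the size of the smallest group sharing the same quasi-identifier values.

    A small minimum group (k = 1 or 2) suggests re-identification is
    reasonably possible, so the data may not count as deidentified.
    """
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical "deidentified" customer records: names removed, but one
# postcode/industry combination still singles a customer out.
records = [
    {"postcode": "4000", "industry": "mining"},
    {"postcode": "4000", "industry": "mining"},
    {"postcode": "2000", "industry": "retail"},
    {"postcode": "2000", "industry": "retail"},
    {"postcode": "0870", "industry": "aviation"},  # unique: k = 1
]
print(smallest_group(records, ["postcode", "industry"]))  # 1
```

A check like this illustrates why the bar is high for small or specialised datasets; it is not a substitute for a proper privacy assessment.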

More fundamentally, your contract might not give you the right to deidentify the data at all, unless required to do so by law.

Most software licences and SaaS agreements treat customer data as confidential information. Unless the contract expressly permits you to transform, aggregate or deidentify that data for secondary use (like AI training), doing so could itself amount to a breach. Moreover, if the data includes personal information, you’ll need to navigate privacy laws that impose their own limits—regardless of your contractual rights.

So before you start feeding your LLM, make sure you’re not breaching your SLA.

What to Look For (or Add)

If you’re a provider:

  • Check whether your agreement expressly allows you to create, collate, and use aggregated and deidentified customer data for AI training and product development.

  • Ensure the licence to use data extends beyond service delivery and includes improvements, analytics, and R&D.

  • Include language around data governance, privacy compliance, and ownership of AI outputs.

If you’re a customer:

  • Scrutinise clauses that allow use of data for “business purposes” or “analytics”—these may reach further than you think.

  • Consider negotiating limits, notice obligations, or opt-out rights when your data could be used to build broadly deployed AI systems—unless, of course, that can be turned to your advantage.

In the Age of AI, Contracts Are Training Data Too

Training AI on customer data can unlock immense value—but only if your agreements keep up. Your model is only as smart as your data. And your data rights are only as strong as your contract.

Filed Under: AI, Commercial Law, Contracts, Digital Law, Technology

August 1, 2025 by Scott Coulthart

Copy Paste App? The Pleasures and Pitfalls of Screenshot-to-Code Tools

Imagine this: you take a screenshot of your favourite SaaS dashboard, upload it to a no-code AI tool, and minutes later you have a functioning version of the same interface — layout, buttons, styling, maybe even a working backend prototype. Magic? Almost.

Welcome to the world of screenshot-to-code generators — tools that use AI and no-code logic to replicate functional software from images. These platforms (like Galileo AI, Builder.io, and Uizard) promise rapid prototyping, faster MVP launches, and a lower barrier to entry for founders, designers, and product teams alike.

But while the tech is impressive, the legal waters are murkier. Here’s the pleasure and the pitfall.


🚀 The Pleasure: Design to Prototype at Lightspeed

The promise is seductive:

  • Rapid prototyping: What used to take weeks of front-end dev can now take hours — sometimes minutes.

  • Visual to functional: AI converts static designs (or even screenshots of existing apps) into working interfaces with mock data or basic logic.

  • Lower costs: Startups or solo devs can build more for less — less code, less labour, and less time.

Tools like Galileo AI and Uizard are being used to generate mock admin panels, mobile UI concepts, and even pitch-ready MVPs. They’re ideal for internal dashboards, client demos, or iterating fast before investing in full-stack builds.

But many users go further — taking screengrabs from existing platforms (think Notion, Salesforce, Figma, Xero) and asking the AI to “make me one of these.”

And that’s where the problems begin.


⚠️ The Pitfall: Copyright, Clones, and Clean Hands

Just because a tool can replicate an interface doesn’t mean you should — especially if your starting point is a screenshot of someone else’s software.

Here are the big legal traps to watch out for:

1. Copyright in the Interface

While copyright doesn’t protect ideas, it does protect expressions — including graphic design, layout, icons, fonts, and even the “look and feel” of certain interfaces. If your cloned UI copies the visual design of another product too closely, you may be infringing copyright (or at least inviting a legal headache).

Australia’s Desktop Marketing Systems v Telstra [2002] FCAFC 112 reminds us that copyright can exist in compilations of data or structure — not just in pretty pictures.

2. Trade Dress and Reputation

Even if your app doesn’t copy the code, a lookalike interface could fall foul of passing off or misleading conduct laws under the Australian Consumer Law if it creates confusion with an established brand. That risk increases if you’re operating in a similar space or targeting the same user base.

The global tech giants have deep pockets — and they’ve sued for less.

3. Terms of Use Breaches

Many platforms prohibit copying or reverse engineering their interfaces. Uploading screenshots of their product to an AI builder might violate their terms of service — even if your clone is only for internal use.

This isn’t just theory: platforms like OpenAI and Figma already use automated tools to detect and act on terms breaches — especially those that risk commercial leakage or brand dilution.

4. No Excuse Just Because the Tool Did It

You can’t hide behind the AI. If your clone infringes IP rights, you’re liable — not the platform that helped you build it. The tool is just that: a tool.

In legal terms, there’s no “my AI made me do it” defence.


🤔 So What Can You Do?

  • ✅ Use these tools for original designs: Sketch your own wireframes, then let the AI flesh them out.

  • ✅ Take inspiration, not duplication: You can draw ideas from good UI — but avoid replicating them pixel-for-pixel.

  • ✅ Use public design systems: Many platforms release UI kits and components under open licences (e.g., Material UI, Bootstrap). Start there.

  • ✅ Keep it internal: If you must replicate an existing interface to test functionality, don’t deploy it publicly — and definitely don’t commercialise it.

  • ✅ Get advice: If you’re close to the line (or don’t know where the line is), speak to an IP lawyer early. Clones are cheaper than court.


🧠 Final Thought: Just Because You Can…

…doesn’t mean you should.

AI is rapidly transforming the way software is built — but it’s also tempting users to cut corners on IP. Using these tools responsibly means treating screenshots not just as pixels, but as possibly protected property.

Build fast — but build clean.

Filed Under: AI, Copyright, Digital Law, IP, Technology




© Scott Coulthart 2025