October 9, 2025 by Scott Coulthart

Privacy’s First Big Hit: Australian Clinical Labs Fined $5.8 Million for Data Breach Failures

When 86 gigabytes of patient data — including health, financial and identity information — hit the dark web after a ransomware attack, the fallout was always going to be brutal.

Now, in Australian Information Commissioner v Australian Clinical Labs Limited (No 2) [2025] FCA 1224, the Federal Court has handed down a $5.8 million penalty — marking the first civil penalty judgment under the Privacy Act.

And it’s a warning shot for every business holding personal information in Australia.


⚖️ The Case in a Nutshell

Australian Clinical Labs (ACL) — one of the country’s largest private pathology providers — bought Medlab Pathology in late 2021.

What it didn’t buy (or even check properly) were Medlab’s crumbling IT systems: unsupported Windows servers, weak authentication, no encryption, and logs that deleted themselves every hour.

In February 2022, the inevitable happened — a ransomware group calling itself “Quantum” infiltrated Medlab’s servers, exfiltrated 86GB of data, and dumped it online.

ACL’s response was painfully slow. Despite early signs of exfiltration, it:

  • Relied almost entirely on an external consultant’s limited review;

  • Concluded (wrongly) that no data had been stolen;

  • Ignored early warnings from the Australian Cyber Security Centre; and

  • Waited over three months before notifying the OAIC.


🧩 The Breaches

Justice Halley found ACL had seriously interfered with the privacy of 223,000 individuals through three major contraventions of the Privacy Act 1988 (Cth):

  1. Breach of APP 11.1 — Failure to take reasonable steps to protect personal information from unauthorised access or disclosure.

    • The Medlab systems were riddled with vulnerabilities.

    • ACL failed to identify or patch them after acquisition.

    • Overreliance on third-party providers compounded the problem.

  2. Breach of s 26WH(2) — Failure to carry out a reasonable and expeditious assessment of whether the incident was an eligible data breach.

    • ACL’s “assessment” was based on incomplete data and unsupported assumptions.

    • The Court called it unreasonable and inadequate.

  3. Breach of s 26WK(2) — Failure to notify the Commissioner as soon as practicable after forming the belief that an eligible data breach had occurred.

    • ACL delayed nearly a month after confirmation that personal and financial information was on the dark web.

Each breach amounted to a “serious interference with privacy” under s 13G, attracting civil penalties.


💰 The Penalty Breakdown

ACL agreed to pay a total of $5.8 million:

| Contravention | Section | Penalty |
| --- | --- | --- |
| Breach of APP 11.1 (223,000 contraventions, treated as one course of conduct) | s 13G(a) | $4.2 million |
| Failure to assess breach | s 26WH(2) | $800,000 |
| Failure to notify OAIC | s 26WK(2) | $800,000 |
| Total | | $5.8 million |

ACL also agreed to pay $400,000 in costs.

While the theoretical maximum exceeded $495 billion, the Court accepted the agreed penalty as being within the permissible range — particularly given ACL’s cooperation, remorse, and post-breach reforms.
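The arithmetic behind that headline figure is worth seeing. As a sketch only — the judgment's exact inputs aren't quoted here, but the then-applicable corporate maximum of $2.22 million per contravention, multiplied across 223,000 affected individuals, lands on the reported number:

```python
# Hypothetical reconstruction of the theoretical maximum penalty.
# Assumes the pre-December-2024 corporate maximum of $2.22 million
# per contravention under the old s 13G (an assumption, not a quote
# from the judgment).
per_contravention_max = 2_220_000   # AUD, old corporate maximum
contraventions = 223_000            # one per affected individual

theoretical_max = per_contravention_max * contraventions
print(f"${theoretical_max:,}")      # $495,060,000,000 — i.e. over $495 billion
```

Which is why courts treat mass data breaches as a single course of conduct rather than mechanically multiplying the statutory maximum.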


⚙️ “Reasonable Steps” — The New Legal Standard

This judgment finally gives judicial colour to APP 11.1’s “reasonable steps” requirement.
Justice Halley said reasonableness must be assessed objectively, considering:

  • the sensitivity of the information;

  • the potential harm from unauthorised disclosure;

  • the size and sophistication of the entity;

  • the cyber risk landscape; and

  • any prior threats or attacks.

Critically, “reasonable steps” cannot be outsourced — delegation to an IT vendor does not discharge responsibility. ACL’s overreliance on StickmanCyber was no defence.


🚨 Why It Matters

This decision rewrites the playbook for privacy compliance in Australia:

  • Civil penalties are real — the OAIC now has judicial precedent for enforcement.

  • Each affected individual counts — the Court held that each person’s privacy breach is a separate contravention.

  • “Serious” breaches will be taken seriously — health and financial data, inadequate security, and systemic failures will all tip the scales.

  • M&A due diligence must cover cybersecurity — buying a business means inheriting its data liabilities.

  • Notification delays will cost you — the OAIC expects “as soon as practicable,” not weeks or months.


💡 IP Mojo Take

Privacy can no longer be treated as a mere paperwork exercise — it’s a governance test you can fail in the Federal Court.

This case cements privacy law as a compliance discipline with teeth.

The OAIC now has a roadmap for future actions — and the Court has made clear that “reasonable steps” means measurable, auditable, and proactive security governance.

For corporate Australia, this is ASIC v RI Advice for the health sector — but under the Privacy Act instead of the Corporations Act.

Expect to see:

  • Increased OAIC enforcement in healthcare, finance, and tech sectors;

  • Board-level scrutiny of data protection measures; and

  • Class actions waiting in the wings, armed with a judicial finding of “serious interference with privacy.”

The privacy bar has just been raised — permanently.

Filed Under: Digital Law, Privacy, Regulation Tagged With: Digital Law, Privacy, Regulation

September 29, 2025 by Scott Coulthart

Deepfakes on Trial: First Civil Penalties Under the Online Safety Act

The Federal Court has handed down its first civil penalty judgment under the Online Safety Act 2021 (Cth), in eSafety Commissioner v Rotondo (No 4) [2025] FCA 1191.

Justice Longbottom ordered Anthony (aka Antonio) Rotondo to pay $343,500 in penalties for posting a series of non-consensual deepfake intimate images of six individuals, and for failing to comply with removal notices and remedial directions issued by the eSafety Commissioner.


Key Points

1. First penalties under the Online Safety Act

This is the first time civil penalties have been imposed under the Act, making it a landmark enforcement case.

The Commissioner sought both declarations and penalties, with the Court emphasising deterrence as its guiding principle.

2. Deepfakes squarely captured

The Court confirmed that non-consensual deepfake intimate images fall within the Act’s prohibition on posting “intimate images” without consent.

Importantly, it rejected Rotondo’s submission that only defamatory or “social media” posts should be captured.

3. Regulatory teeth and enforcement

Rotondo received notices under the Act but responded defiantly (“Get an arrest warrant if you think you are right”) before later being arrested by Queensland Police on related matters.

His lack of remorse and framing of deepfakes as “fun” aggravated the penalty.

4. Platform anonymity

Although the Commissioner did not object, the Court chose to anonymise the name of the website hosting the deepfakes — reflecting a policy judgment not to amplify harmful platforms.

That said, several newspapers reporting on the story revealed the website’s address, while noting that it has since been taken down.

IP Mojo is choosing not to reveal that website.

5. Civil vs criminal overlap

Alongside the civil penalties, the Court noted criminal charges under Queensland’s Criminal Code.

This illustrates how civil, regulatory and criminal enforcement can run in parallel.


Why It Matters

  • For regulators: This case confirms the Act has teeth. Regulators can secure significant financial penalties even where offenders are self-represented.

  • For platforms: The Court’s approach signals that services hosting deepfakes are firmly in scope, even if located offshore.

  • For the public: The judgment highlights the law’s adaptability to AI-driven harms — and sends a clear deterrence message.

  • For practitioners: Expect more proceedings of this kind, particularly as the prevalence of AI-generated abuse grows.

Filed Under: AI, Digital Law, Privacy, Regulation, Technology Tagged With: AI, Digital Law, Privacy, Regulation, Technology

June 25, 2025 by Scott Coulthart

Ready, Set, Comply: Queensland’s IPOLA Reforms Launch 1 July 2025

This July marks a pivotal moment for Queensland public sector entities, agencies, and their contractors. The Information Privacy and Other Legislation Amendment (IPOLA) Act 2023 comes into full effect from 1 July 2025, ushering in sweeping updates to Queensland’s Information Privacy Act 2009, Right to Information Act 2009, and the rules governing data-breach notifications.

Let’s break it down.

1. Unified Access Rights & RTI Overhaul

What’s Changing:

  • As of 1 July, Queensland merges personal and non-personal document access into a single, unified right under the RTI Act.

  • Expect streamlined procedural rules: revised timeframes, adjusted decision-maker roles, and consolidated fees.

  • New requirements for disclosure logs and proactive release of information also come into force.

Why It Matters:

  • RTI applicants apply once—and agencies can’t dodge questions by splitting personal and non-personal requests.

  • Agencies must refresh policies, train staff, and implement systems that can handle integrated workflows.

  • Transparency expectations heighten. Agencies will be judged not just on compliance, but also disclosure culture.

2. Queensland Privacy Principles (QPPs) & Binding Codes

What’s Changing:

  • A fresh suite of 12 Queensland Privacy Principles takes effect—covering collection, disclosure, accuracy, retention, security, and more.

  • Binding QPP Codes can be issued by the Information Commissioner.

  • Importantly: contractual obligations with service providers (e.g., cloud, IT, data analytics) must now include binding QPP compliance clauses.

Why It Matters:

  • IT contracts across private and public sectors need rewriting to mandate QPP compliance.

  • Outsourced services—especially those involving personal data—must adhere to QPP requirements in practice, not just in documentation.

3. Mandatory Notification of Data Breach (MNDB) Scheme

Note: While the broader IPOLA reforms kick in July 2025, the MNDB requirement for local governments is delayed until July 2026.

What’s Happening Now:

  • State government agencies adopt MNDB notifications from July 2025.

  • Local governments have an additional year to prepare.

Why It Matters:

  • MNDB templates, policies, and flowcharts from OIC are now live and ready.

  • All entities need clear internal breach response tech and training—or risk non-compliance.

  • Local councils have a 12-month window to align with the Scheme before 2026 rollout.

4. Training & Resources at the OIC

The Office of the Information Commissioner (OIC) has curated an extensive IPOLA onboarding program:

  • Stage 1 Awareness sessions (Aug–Sep 2024), attended by 1,000+ staff across 19 venues.

  • Stage 2 Build‑Knowledge workshops (Oct 2024–Mar 2025), reaching 3,000+ participants over modules covering MNDB, QPPs, and RTI.

  • Stage 3 Topic‑based training commenced in May 2025—delving into MNDB and RTI templates, including a Local‑Government‑specific workshop on 11 June 2025.

Why It Matters:

  • Multi‑themed, modular, and scenario‑driven sessions (including Q&A panels) are freely available and compressed into SCORM packages — but note: the SCORM kit is only available until 30 June 2025.

  • Agencies that haven’t already done so should download the content before then and integrate it into their internal LMS — no extensions.

5. Practical Tools & Templates

To smooth your compliance journey, the OIC offers the following (at its website, oic.qld.gov.au):

  • Checklists: “Prepare for IPOLA” workbook, Access & Amendment Application checklist.

  • Policy templates: breach policy, eligible data‑breach registers, response plans.

  • Privacy Impact Assessment (PIA) tools: threshold forms, risk registers.

  • Contractor & collection‑notice guides: for binding providers and updating public info notices.

🚨 What You Should Do Before 1 July 2025

For Agencies & Departments:

  1. Download & embed SCORM training content by 30 June 2025.

  2. Deploy team training using Stage 2/3 modules or in-house adaptations.

  3. Revise internal systems for unified access rights, disclosure logs, and fee handling.

  4. Update contracts with QPP compliance clauses for all service providers.

  5. Implement MNDB policies and breach-response tech for July rollout.

For Contractors & Vendors:

  1. Review contracts—you’ll likely be legally required to comply with QPPs by July.

  2. Audit your data systems: implement encryption, retention, and access protocols matching QPPs.

  3. Train staff on breach detection, logging, and your obligations to notify.

For Local Government Entities:

  • Use 2025–26 as a setup year for MNDB readiness. Download checklists, test templates, and tap into OIC’s LG-specific training.

Final Word: Compliance Is Non-Negotiable

Come 1 July 2025, Queensland’s public-facing privacy and information regime operates as a single, integrated scheme:

  • Single RTI access request = one-stop for all documents.

  • QPPs apply across the lifecycle of personal data—including handling by contracted parties.

  • MNDB enforcement begins for state bodies (councils get a 12‑month grace period).

  • Training content won’t be available post 30 June.

The concrete tools, training, and structure are all out now—so aim to have your systems fully aligned before end of June. Delay is not an option.

Filed Under: Government Law, Privacy, Regulation Tagged With: Government Law, Privacy, Regulation

June 24, 2025 by Scott Coulthart

What Didn’t Happen (Yet): The Privacy Reforms Still Waiting in the Wings

You could be forgiven for thinking Australia’s privacy law just had its big moment — and it did. But don’t get too comfortable. What we’ve seen so far from the December 2024 amendments to the Privacy Act 1988 (Cth) is just Round 1.

Welcome to the final instalment of our 9-part Privacy 2.0 series.

There’s a long queue of proposed changes that didn’t make it into the latest legislation, many of them quietly simmering in government inboxes, consultation drafts and “agreed in principle” footnotes.

Some of these postponed reforms could reshape the privacy landscape even more profoundly than the current crop. If you’re trying to future-proof your compliance or understand where the law is going next, here’s what to watch.

1. The Small Business Exemption — Still Alive (for Now)

Right now, businesses with an annual turnover under $3 million are generally exempt from the Privacy Act. That’s tens of thousands of data-handling entities with zero formal privacy obligations. The reform process flagged this as outdated — and it’s clear the exemption will eventually go. When it does, thousands of SMEs will be pulled into the privacy net for the first time. It’s not a question of if. It’s when.

2. Controllers vs Processors — Coming Soon to a Framework Near You

Unlike the GDPR, Australia’s privacy law still doesn’t distinguish between data “controllers” (who decide the purpose and means of processing) and “processors” (who process data on someone else’s behalf). That distinction brings clarity and proportionality in many overseas regimes. Expect pressure to harmonise with global norms — especially from businesses operating across borders who are tired of legal whiplash.

3. The Right to Object, Delete, Port — Not Yet, But On Deck

Australia still lacks a formal, standalone right to object to certain uses of data, to demand deletion (the famed “right to be forgotten”), or to port your data from one provider to another. These rights — core pillars of the GDPR — have been agreed to in principle, are popular with the public, and would bring Australia closer to international standards (and make life very interesting for adtech, fintech, and platform businesses).

4. De-Identified Data? Still A Grey Zone

The reform process acknowledged that re-identification of supposedly anonymous data is a real risk — and that de-identified information still needs regulation. But the law hasn’t caught up yet. Watch for future reforms to APPs 8 and 11 that would bring de-identified data into scope and make re-identification attempts a regulatory red flag.

5. Privacy by Design & Mandatory PIAs — Still Optional (for Now)

There was also discussion of codifying “privacy by design” and making Privacy Impact Assessments mandatory for high-risk activities. The idea? Embed privacy into planning, not just cleanup. It didn’t land this time, but expect it to return — particularly as AI, biometric tech and behavioural profiling go mainstream.


Bottom line? This is just the intermission. The Privacy Act is evolving — slowly, but deliberately — toward a framework that looks more like the GDPR and less like its 1980s self. Businesses that treat the current reforms as the finish line are missing the point. The smart ones are already adapting to what’s next.

That’s a wrap on our Privacy 2.0 reform series. If you’ve made it this far, congratulations — you now know more about privacy law than most of Parliament.

Now, go fix your privacy policy — and maybe tell your AI to behave while you’re at it.

Filed Under: Privacy, Privacy 2.0, Regulation Tagged With: Privacy, Privacy 2.0, Privacy 2.0 Part 9, Regulation

June 23, 2025 by Scott Coulthart

Black Box, Meet Sunlight: Australia’s New Rules for Automated Decision-Making

Automated decision-making is everywhere now — in the background of your credit check, your insurance quote, your job application, even the price you see for a pair of shoes. For a while, this opaque machine logic operated in a legal blind spot: useful, profitable, and often inscrutable. But no longer.

Welcome to part 8 of our 9-part Privacy 2.0 series.

Australia’s latest privacy reforms are dragging automated decisions into the daylight. Starting 10 December 2026, organisations will be legally required to disclose in their privacy policies whether and how they use automated decision-making that significantly affects the rights of individuals. It’s the first real attempt under Australian law to impose some transparency obligations on algorithmic systems — not just AI, but any automation that crunches personal data and outputs a decision with real-world consequences.

So what do these changes demand? Two key things:

  1. Your privacy policy must (from 10 December 2026) clearly describe:

    • the types of personal information used in any substantially automated decision-making process, and

    • the kinds of decisions made using that information.

  2. It will apply wherever those decisions significantly affect an individual’s rights or interests — eligibility for credit, pricing, recruitment shortlists, fraud flags, algorithmic exclusions from essential services like housing or employment, and more. It’s not limited to full automation either. Even “mostly automated” systems — where human review is token or rubber-stamp — are caught.

The goal here is transparency, not prohibition. The law doesn’t say you can’t automate — but it does say you will have to own it, explain it, and flag it. That means no more hiding behind UX, generic privacy blurbs, or vague disclaimers. And if your systems are complex, decentralised, or involve third-party algorithms? No excuses — you’ll need to understand them anyway, and track them over time so your policy stays accurate.

In short, if your business relies on automated decisions in any meaningful way, you’ll need to:

  • Map those processes now (don’t wait until 2026),

  • Build a system for tracking how and when they change, and

  • Craft plain-language disclosures that are specific, truthful, and meaningful.

This isn’t just a ‘legal’ problem anymore — customers, regulators, and journalists are watching. No one wants to be the next brand caught auto-rejecting job applicants for having a gap year or charging loyal customers more than first-timers.

Tomorrow: we wrap our Privacy 2.0 series with what didn’t make it into the legislation (yet) — and where the next battle lines in Australian privacy reform are likely to be drawn.

Filed Under: Privacy, Privacy 2.0, Regulation Tagged With: Privacy, Privacy 2.0, Privacy 2.0 Part 8, Regulation

June 18, 2025 by Scott Coulthart

For many years, privacy enforcement in Australia was a bit… polite. The OAIC could nudge, issue determinations, and make a bit of noise, but it often lacked the real teeth needed to drive compliance in the boardroom. That era is over.

11 December 2024 saw the commencement of amendments to the Privacy Act 1988 (Cth) which overhaul Australia’s enforcement toolkit — with bigger fines, broader court powers, faster penalties, and forensic-level investigative authority. It’s not quite the GDPR, but it’s getting close enough to make a lot of GCs uncomfortable.

In this 7th part of our Privacy 2.0 series, let’s start with the money. The maximum fine for a serious or repeated privacy breach by a company is now $50 million, or three times the benefit obtained, or 30% of adjusted turnover — whichever is greater. That’s serious deterrent territory, not just a regulatory slap. Even mid-tier breaches now carry $3.3 million maximums for corporates. Individuals? You’re looking at up to $2.5 million if you seriously mess it up. There’s a new hierarchy of penalties too — with lower thresholds and infringement notices for technical breaches like bad privacy policies or sloppy notifications.
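The tiered maximum can be stated as a simple formula — the greatest of the three limbs. A minimal sketch, with purely hypothetical figures (the function name and inputs are illustrative, not drawn from the Act):

```python
# Sketch of the amended corporate maximum for a serious or repeated
# privacy breach: the greater of $50 million, three times the benefit
# obtained, or 30% of adjusted turnover.
def max_corporate_penalty(benefit_obtained: float, adjusted_turnover: float) -> float:
    """Return the statutory maximum for a serious interference with privacy."""
    return max(50_000_000, 3 * benefit_obtained, 0.30 * adjusted_turnover)

# Hypothetical figures for illustration only: for a company that gained
# $10M from the breach on $400M adjusted turnover, the turnover limb wins.
print(max_corporate_penalty(benefit_obtained=10_000_000,
                            adjusted_turnover=400_000_000))
# 120000000.0 — 30% of $400M exceeds both $50M and 3 x $10M
```

For most large enterprises, the 30%-of-turnover limb will dominate — which is precisely what moves privacy risk from the legal team's ledger to the board's.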

But it’s not just about fines. The OAIC can now issue infringement notices, bypassing court for certain minor but clear-cut breaches. Think of it like a privacy speeding ticket — faster, cheaper, but still stings. And yes, you can fight it in court if you want. Just hope your documentation holds up.

Then there are the new powers of investigation and monitoring. The OAIC is now plugged into the Regulatory Powers (Standard Provisions) Act 2014 (Cth), meaning it can get warrants, enter premises, seize devices, and even apply reasonable force — all while preserving privilege. This puts the Privacy Commissioner on more equal footing with ASIC and the ACCC, especially when it comes to serious or systemic non-compliance. If your data handling is shady, half-baked or undocumented — now’s the time to clean it up.

And finally, court powers have been expanded. The Federal Court and the Federal Circuit and Family Court can now order not just fines, but anything else appropriate — including remediation, compensation, and public declarations. This opens the door for privacy class actions to get seriously strategic – not just possible, but powerful.

Here’s the bottom line: privacy compliance can no longer sit in the “legal” corner or be outsourced to the IT team. It’s now a cross-functional risk category — and it’s time businesses treated it that way. If you’re not audit-ready, breach-ready, or regulator-ready… you’re not ready.

Next week in our Privacy 2.0 series: how the law tackles automated decision-making — and why your pricing algorithm, hiring bot, or fraud engine might need to show its work.

Filed Under: Privacy, Privacy 2.0, Regulation Tagged With: Privacy, Privacy 2.0, Privacy 2.0 Part 7, Regulation

© Scott Coulthart 2025