Back in our Privacy 2.0 series, we unpacked the upcoming Children’s Online Privacy Code — a new legal framework aimed at improving how kids’ personal information is handled online. Now, we’re hearing more from the people it’s meant to protect.
Children themselves have made it clear: they don’t want to be tracked, profiled, or buried in unreadable consent forms. And for companies whose business depends on that data, the reforms aren’t just a policy shift — they’re a threat to the model.
What Children Say
Our earlier blog post focused on the what: a binding industry code to strengthen children’s data protections under the Privacy Act 1988 (Cth).
Now we’re seeing more of the why — and it’s coming straight from the kids.
According to findings from consultations with children conducted by Reset Tech Australia, the message is loud and clear: children aren’t just passive subjects of data collection. They have opinions — strong ones.
Among the most consistent themes:
- Nearly 90% of children surveyed want default privacy settings set to high, and geolocation turned off by default.
- Many want simpler, age-appropriate explanations of how their data is used.
- Crucially, they want the ability to delete their data — a right currently absent from Australia’s privacy framework.
As Privacy Commissioner Carly Kind put it in recent comments to The Australian:
“Kids aren’t going to read 50 pages of terms and conditions when they sign up to an app… How do we give them actual choices, and not just the ability to click ‘I consent’ when they haven’t even read something, and it’s not a genuine form of consent?”
That question goes to the heart of what the Children’s Privacy Code — and broader privacy reform — is trying to fix.
Not Just Social Media — The Code’s Expanding Reach
Commissioner Kind also confirmed that the Children’s Privacy Code will work in parallel with the upcoming ban on under-16s using social media — but its scope is much broader. The Code will apply to:
- Websites and online services accessed by children,
- Wearable devices and fitness trackers, and
- Education technology and apps, including those used in schools.
In other words, the Code is not just about excluding children from certain online spaces. It’s about protecting them wherever they are — especially in digital environments they’re required to engage with for school or social connection.
This dual approach — platform bans on one side, enforceable data safeguards on the other — reflects a recognition that meaningful participation in digital life shouldn’t come at the expense of privacy.
Who’s Worried — and Why
Stronger children’s privacy rules are good policy — but they’re also bad news for some very profitable business models. Behind the push for transparency and consent reform lies a quieter question: who stands to lose when kids gain more control over their data?
Let’s follow the data trail.
Adtech platforms are the obvious players at risk. Targeting, profiling, and retargeting of under-18s — even if indirect — fuel everything from engagement strategies to dynamic pricing models. If default privacy settings go “high” by law, or profiling becomes opt-in (or outright banned), that revenue stream starts to dry up.
Social platforms, even those ostensibly closed to under-16s, have powerful incentives to retain youth users — both for ad revenue and for maintaining long-term brand stickiness. The prospect that kids might have a right to delete their data, opt out of tracking, or receive age-appropriate disclosures forces a rethink of their legal risk model.
Then there’s edtech — the often-overlooked battleground. Many school-deployed tools gather extensive user-level data but offer limited controls to students (or even schools). Vendors that haven’t built privacy-by-design tooling may soon be scrambling to comply.
And finally, consumer IoT and smart toys — products that rely on voice input, biometric sensors, or location tracking — may find their compliance and legal risk profile radically changed if the Code’s protections become enforceable.
Most won’t publicly oppose child protection. But you can expect to see:
- Lobbying for “flexible” implementation timelines,
- Calls for “self-regulation”,
- Quiet legal arguments around the scope of “reasonable access by children”, and
- Industry pushback on making data deletion or privacy impact assessments mandatory.
What This Means for Industry
If your platform, app, device, or service is used by children, or may reasonably be accessed by children, you’ll need to start preparing for compliance now. That means:
- Reviewing your default settings for privacy, location, and profiling (a rough sketch follows this list);
- Translating privacy policies into plain, age-appropriate language;
- Building functionality for data deletion — even if not (yet) mandatory (also sketched below); and
- Moving beyond consent as your only compliance crutch — especially if that consent comes from a user too young to legally or meaningfully provide it.
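To make the first and third of those items concrete, here is a minimal TypeScript sketch of what privacy-by-default and hard deletion could look like in practice. Everything in it (the PrivacySettings shape, the in-memory store, the function names) is illustrative, not drawn from the Code or any particular platform.

```typescript
// Illustrative only: a hypothetical service's user store, not any real API.
interface PrivacySettings {
  profileVisibility: "private" | "friends" | "public";
  geolocationEnabled: boolean;
  behaviouralProfiling: boolean; // opt-in, never on by default
  thirdPartyAdTracking: boolean;
}

// Defaults mirror what children asked for in the consultations:
// high-privacy settings, with geolocation off unless explicitly enabled.
const CHILD_SAFE_DEFAULTS: PrivacySettings = {
  profileVisibility: "private",
  geolocationEnabled: false,
  behaviouralProfiling: false,
  thirdPartyAdTracking: false,
};

interface UserRecord {
  id: string;
  settings: PrivacySettings;
  data: Map<string, unknown>; // stand-in for whatever the service stores
}

const users = new Map<string, UserRecord>(); // in-memory store for the sketch

function createChildAccount(id: string): UserRecord {
  const record: UserRecord = {
    id,
    settings: { ...CHILD_SAFE_DEFAULTS }, // copied, so later edits are per-user
    data: new Map(),
  };
  users.set(id, record);
  return record;
}

// Deletion on request: remove the record entirely rather than flagging it,
// so nothing is left behind to profile. A real service would also need to
// propagate deletion to backups, logs, and third-party processors.
function deleteUserData(id: string): boolean {
  return users.delete(id);
}

// Usage: a new child account starts locked down, and deletion leaves no trace.
const account = createChildAccount("u-123");
console.log(account.settings.geolocationEnabled); // false by default
deleteUserData("u-123");
console.log(users.has("u-123")); // false: nothing retained
```

The design point worth noting is the deletion semantics: removing the record outright, rather than soft-deleting or anonymising it, is the approach that most clearly matches what the children surveyed asked for.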
The consultations have made it clear: children want transparency, choice, and respect for their privacy — and regulators are listening.