YouTube’s Free Pass May Be Up: eSafety Pushes Back on Social Media Carve-Out
The Albanese Government’s plan to restrict under-16s from holding social media accounts is already proving contentious — and now, its one glaring exception has been officially called out. The eSafety Commissioner, Julie Inman Grant, has advised Communications Minister Anika Wells to scrap the carve-out that would exempt YouTube from the new age-gating regime set to kick in this December.
The proposal, which mandates that platforms like TikTok, Instagram, Snapchat, Reddit and X take “reasonable steps” to block account creation by under-16s, currently spares YouTube on the basis that it has a broader educational and health utility. But the Commissioner’s position is clear: if it walks like TikTok and Shorts like TikTok, it’s probably TikTok — and deserves to be regulated accordingly.
YouTube: Too Big to Ban?
Back in November, then-Minister Rowland argued YouTube played a “significant role in enabling young people to access education and health support”, and thus deserved its special treatment. But the eSafety Commissioner’s new advice — now in the hands of Minister Wells — says the data tells a different story.
YouTube isn’t just a fringe player. A recent eSafety survey found it’s used by 76% of 10- to 15-year-olds, making it the dominant platform for that age group. Among kids who encountered harmful content online, 37% said the worst of it happened on YouTube.
In other words, if the aim is to protect children from the harms of social media, YouTube is not just part of the problem — it’s the biggest piece of it.
Functional Similarity, Regulatory Inconsistency
The core of the Commissioner’s argument is that functionality, not branding, should drive regulation. YouTube Shorts mimics the addictive swipe-based short-form video experience of TikTok and Instagram Reels. Carving it out sends mixed messages about the purpose of the law — and creates loopholes large enough for a Shorts binge.
The advice also calls for more adaptable, risk-based rules that focus on a platform’s actual features and threat profile, not how it labels itself. Technology evolves too fast for static category-based exemptions.
But What’s the Threat, Really?
There may be many examples of nanny-state regulation these days – but this isn’t one of them.
In this author’s opinion, YouTube is an excellent platform, at once genuinely useful and entertaining, and those benefits apply to adults and under-16s alike.
However, there are also significant dangers for under-16s that can’t be ignored.
In plain terms:
1. Exposure to Inappropriate Content
Even with YouTube Kids and Restricted Mode, children can still be exposed to:
- Pornographic or sexually suggestive content (sometimes slipped past filters).
- Violent or graphic videos (including real-life fights, injuries, or distressing footage).
- Content promoting self-harm, eating disorders, or suicide (often through seemingly innocuous videos or “coded” messaging).
- Misinformation or conspiracy theories (e.g., QAnon, anti-vax rhetoric).
These exposures are linked to real psychological harms, especially among younger teens still forming their identity and critical reasoning skills.
2. Contact Risks (Predators & Harassment)
YouTube allows comments, live chat during livestreams, and even community posts — all of which create:
- Opportunities for unsolicited contact from adults (including grooming behaviour).
- Exposure to cyberbullying or peer harassment, often via comments.
- Unfiltered interactions during livestreams — which are harder to moderate in real time.
The eSafety Commissioner sees this as part of a broader “contact harm” risk — it’s not just what kids see, but who can reach them and how they’re targeted.
3. Addictive Design (Shorts, Recommendations)
YouTube’s algorithmic design encourages:
- Binge-watching and excessive screen time through autoplay and recommendations.
- Engagement loops in YouTube Shorts (TikTok-style scrollable video snippets).
- Exposure to more extreme or sensational content the longer a child watches (known as algorithmic “radicalisation”; see the sketch below).
This design can disrupt sleep, concentration, and mental wellbeing — particularly in adolescents.
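To make that feedback loop concrete, here is a deliberately simplified Python sketch. It is not YouTube’s actual recommender, and every name and number in it is invented for illustration: it ranks a hypothetical catalogue purely on predicted watch time and assumes engagement peaks just beyond a viewer’s current comfort zone. That single assumption is enough to drag recommendations toward more extreme content over time.

```python
# Toy engagement-loop simulator. Everything here is invented for
# illustration; it is NOT YouTube's recommender, just the smallest
# system that exhibits the drift described above.

# Hypothetical catalogue: 100 videos, each with an "extremity" score in [0, 1].
CATALOGUE = [{"id": i, "extremity": i / 99} for i in range(100)]

def predicted_watch_time(video: dict, tolerance: float) -> float:
    """Assumed engagement model: watch time peaks just *above* the
    viewer's current comfort zone (novelty holds attention), then
    falls off on either side."""
    sweet_spot = min(1.0, tolerance + 0.08)
    return 1.0 - abs(video["extremity"] - sweet_spot)

def recommend(tolerance: float) -> dict:
    # Rank the entire catalogue by predicted engagement; serve the top item.
    return max(CATALOGUE, key=lambda v: predicted_watch_time(v, tolerance))

tolerance = 0.10  # the viewer starts with mild tastes
for step in range(1, 11):
    video = recommend(tolerance)
    # Watching slightly-more-extreme content nudges tolerance upward,
    # so tomorrow's feed starts from today's new normal.
    tolerance = min(1.0, (tolerance + video["extremity"]) / 2)
    print(f"step {step:2d}: served extremity {video['extremity']:.2f}, "
          f"viewer tolerance now {tolerance:.2f}")
```

Notice that nothing in the loop ever asks for extreme content; the drift falls out of optimising a single engagement number. That is precisely why the Commissioner’s advice targets features and feedback loops rather than how a platform labels itself.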
4. Data Privacy & Profiling
YouTube collects vast amounts of user data — even from minors — to personalise recommendations and ads. While Google claims to limit this for users under 18:
- The eSafety Commissioner is concerned that data-driven profiling may still occur covertly or imperfectly.
- Kids may also be inadvertently tracked across platforms when logged into a YouTube or Google account.
5. False Sense of Safety
YouTube’s exemption from the new social media rules may give parents the impression it is “safe” or “educational” by default — when, in fact, it often contains the same risks as TikTok or Instagram.
The Commissioner specifically noted there isn’t sufficient evidence that YouTube “predominantly provides beneficial experiences” for under-16s, which means the carve-out undermines the very purpose of the rules.
In summary, the concern isn’t just about under-16s accessing YouTube, but about the total environment of:
- Risky content,
- Risky contact,
- Addictive design, and
- Inadequate protective controls.
Risk-Based Reform on the Horizon
The YouTube advice comes as the eSafety Commissioner readies a suite of industry-specific codes targeting harmful online content, including pornography and violent material. New obligations are expected for search engines, hosting services, and telcos — with five more codes in the pipeline. If voluntary industry codes fall short, the Commissioner has flagged she’ll impose mandatory standards before July’s end.
Penalties for breach of these codes — like the new social media rules — could reach $50 million for systemic non-compliance.
What’s Next?
The final decision on YouTube’s exemption sits with Minister Wells, who must table the rules in Parliament for scrutiny. But with pressure now coming from the very regulator tasked with enforcement, and mounting community concern over YouTube’s influence, the carve-out may not survive the next sitting.
The bigger question is whether Australia can strike the right balance between platform accountability, digital literacy, and youth agency — without blunting the tools that help kids learn and connect. In a digital world that resists easy categorisation, risk-based regulation may be the only way forward.