Ban the Business Model, Not the Child

Australia’s new under-16 social media rules are now live. In plain terms, they function as a delay on having accounts: young people (and parents) aren’t penalized, but platforms must take “reasonable steps” to prevent under-16s from holding accounts, with civil penalties that can reach A$49.5 million.

The debate feels decisive. That’s the problem.

Because when youth policy is built around a clean story – kids are harmed, platforms are the cause, remove access, solved – we risk trading meaningful safety for the illusion of control (Piotrowski, 2025).

The word “addiction” is doing too much work

“Social media is addictive” is a tempting line: simple, sticky, politically convenient. But “addiction” isn’t just a strong metaphor – it’s a clinical claim. Treating it as settled science can steer policy towards blunt tools: prohibition, surveillance, and moral panic, instead of targeted, evidence-based supports.

A recent paper in Scientific Reports makes the core point clearly: many Instagram users believe they are addicted, yet only a small subset show symptom patterns consistent with addiction risk; and experimentally framing heavy use as addiction can actually reduce perceived control and increase self-blame – conditions that make change harder, not easier (Anderson & Wood, 2025). In other words, what feels like “addiction” often looks more like habit and control challenges, and the label itself can backfire.

Yes, some young people struggle online. Excessive use has been linked with sleep loss, distraction, conflict, anxiety, and stress. Those harms are real and deserve immediate attention. But the response has to fit the mechanisms – and bans do not.

Bans relocate risk

Age-based bans promise certainty: remove access and the harm goes away. In practice, restrictions displace behavior rather than eliminate it, especially for adolescents, where autonomy-seeking and boundary-testing are developmentally typical.

When mainstream platforms are blocked, we see many teens route around restrictions (shared accounts, false birthdays, VPNs, migration to smaller platforms) or simply use the platform without an account (in other words, without any youth guardrails). That displacement can reduce adult visibility/support and push young people toward less regulated spaces – leaving the underlying drivers of harm untouched (Piotrowski, 2025). Early coverage of Australia’s rollout already highlights predictable “cracks”: underage account creation still occurring on some services while enforcement mechanisms try to catch up, and an ongoing debate about where teens will go next. 

This is the “kicking the can” dynamic: we fail to build skills and safer environments now, then act surprised when young people still end up online – just in harder-to-reach and less safe places.

Rights aren’t an inconvenience – they’re the point

When platforms fail children, the answer shouldn’t be to remove children’s access to digital participation by default. Children have rights to protection, but also rights to provision and participation in digital environments. A ban-first approach risks becoming a policy shortcut that sidesteps the harder work: demanding better design, regulating harmful business incentives, investing in education, and supporting families realistically (Piotrowski, 2025).

If not bans, then what?

The alternative to bans is not “do nothing.” It’s shared responsibility – safer design, better skills, proportionate safeguards, and real accountability.

  1. Require agency-first design defaults (not willpower).
    Youth safety should not depend on self-control alone. Practical expectations for child and teen accounts should include, for example, removal of infinite scroll/autoplay, plain-language “why recommended” explanations, meaningful alternative pathways (“show me something different”), session timers with save-and-stop, quiet-hour modes, and increased friction (Kucirkova & Piotrowski, 2025; Piotrowski, 2025).
  2. Teach skills now.
    If we don’t teach digital competence during adolescence, we simply postpone learning until the environment is even more complex and our ability to reach young people is diminished (Kucirkova & Piotrowski, 2025).
  3. Use age assurance selectively, and design it to protect rights.
    Age assurance may have a role, but it is not a cure-all. It should be targeted, privacy-preserving, transparent, appealable, and evaluated for equity impacts (Kucirkova & Piotrowski, 2025).
  4. Measure what matters, and enable independent evaluation.
    Bans thrive because they offer a simple metric: did the child access the app? But access is a crude proxy. We should measure real outcomes: well-being, safety, learning, opportunity, and agency – supported by independent audits and vetted researcher access to evidence (Kucirkova & Piotrowski, 2025).

The Bottom Line

Australia’s under-16 rules show why bans are politically attractive: they’re clear and visible. But clarity isn’t the same as effectiveness. The addiction frame is also convenient, yet evidence suggests it’s vastly overapplied, and can even worsen perceived control (Anderson & Wood, 2025). Meanwhile, bans relocate risk, increase pressure for broad age assurance, undermine children’s rights, and crowd out the deeper work of building environments worthy of young people – especially as childhood becomes increasingly AI-mediated (Kucirkova & Piotrowski, 2025; Piotrowski, 2025).

Protecting children in the digital age shouldn’t mean keeping them out of digital life. It should mean enabling them to participate safely, critically, and with agency – supported by better design, better skills, and real accountability.

A safer internet isn’t one children are locked out of – it’s one they can grow up in.

References

Anderson, I. A., & Wood, W. (2025). Overestimates of social media addiction are common but costly. Scientific Reports, 15, 39388. https://doi.org/10.1038/s41598-025-27053-2

Kucirkova, N. I., & Piotrowski, J. T. (2025, November). An Agenda for Student Agency in the AI Era: OECD guiding discussion paper (8th Global Forum on the Future of Education and Skills, Bratislava, 24–26 November 2025). Organisation for Economic Co-operation and Development.

Piotrowski, J. T. (2025). Illusion or Impact? Shaping Social Media Policy for Youth [Forthcoming Article].