Table of Contents
- Introduction: Meta Enforces Social Media Ban
- Scope of the Ban and Affected Platforms
- How Meta is Managing Account Deactivations
- Impact on Children and Families
- Criticism and Concerns About the Ban
- Global Implications and Monitoring

Introduction: Meta Enforces Social Media Ban
Meta has started deactivating accounts of Australian children under 16 on Instagram, Facebook, and Threads, just days before the official social media ban takes effect on 10 December. The move affects an estimated 150,000 Facebook accounts and 350,000 Instagram accounts. Threads, which requires an Instagram account, will also be impacted.
Scope of the Ban and Affected Platforms
Australia’s pioneering social media law is the first in the world to restrict access for minors under 16. Companies face fines up to A$49.5 million (US$33 million) if they fail to implement “reasonable steps” to prevent children from accessing social media platforms. Alongside Meta, platforms such as YouTube, X, TikTok, Snapchat, Reddit, Kick, and Twitch are also subject to the ban.
How Meta is Managing Account Deactivations
Meta has begun notifying users aged 13-15 that their accounts will soon be shut down. Affected users can download and save their posts, videos, and messages prior to deactivation. Teenagers who believe they have been misclassified can request a review by submitting a “video selfie” or official government ID.
Meta has argued that a standardized, privacy-focused approach requiring app stores to verify age and parental consent would be more effective than having teens prove their age across multiple apps.
Impact on Children and Families
Communications Minister Anika Wells emphasized the law’s goal of protecting Gen Alpha from the addictive nature of social media, describing algorithms as “behavioural cocaine” that delivers a constant dopamine drip to children. The ban is intended to reduce exposure to harmful content and online abuse.
Experts and critics warn that the ban may isolate some children, pushing them toward lesser-known and less-regulated platforms. Meta’s system scans only public posts, leaving abusive direct messages unchecked, which has sparked concerns about the policy’s effectiveness in fully protecting children online.

Criticism and Concerns About the Ban
Platforms like YouTube, originally exempt but later included in the ban, have criticized the legislation as “rushed,” suggesting it may paradoxically make platforms less safe by removing parental control tools. Other apps, like Lemon8 and Yope, have undertaken self-assessments to determine whether they fall under the ban.
Despite these concerns, the Australian government remains firm on enforcement, acknowledging that early challenges are expected but stressing that the long-term aim is to protect children’s safety and mental wellbeing in the digital landscape.
Global Implications and Monitoring
Australia’s social media ban is being closely watched worldwide as a precedent for protecting minors online. A government-commissioned study found that 96% of children aged 10-15 use social media, with seven in 10 exposed to harmful content and one in seven experiencing grooming attempts. More than half reported cyberbullying experiences.
As platforms comply, regulators are monitoring migration patterns to less-regulated apps to ensure the ban’s objectives are met. The law represents a global milestone in digital child protection, potentially influencing policies in other countries.
By The Morning News Informer — Updated 4 December 2025

