
London, November 20, 2025
Meta will begin closing Instagram, Facebook, and Threads accounts of users under 16 years old in Australia starting December 4, 2025, ahead of a legal ban taking effect on December 10. The move aims to comply with Australia’s new law prohibiting under-16s from holding these social media accounts, necessitating the removal of hundreds of thousands of profiles.
Account Closure Details
Meta is required by Australian law to deactivate at least 350,000 Instagram and 150,000 Facebook accounts belonging to users under 16. The mandate excludes Messenger, which remains accessible to younger users. To enforce the ban, Meta is using artificial intelligence tools to identify underage accounts, although the company acknowledges these tools may misclassify about one in seven 16-year-olds as underage, incorrectly restricting their access.
Affected users are being notified via email, SMS, and in-app messages, with instructions to download or delete their data before the shutdown. Meta is also encouraging younger users to provide contact information so their accounts can be reinstated once they turn 16.
Australia’s Groundbreaking Legislation
The enforcement responds to Australia’s pioneering legislation, the world’s first national law explicitly banning social media accounts for children under 16. The framework seeks to improve online safety for minors by limiting their exposure to potentially harmful content on major platforms. Meta’s shutdown, beginning days before the official ban date, underscores the company’s effort to comply with this unprecedented regulatory requirement.
Implications for Global Social Media Governance
Australia’s move marks a significant milestone in social media regulation, signaling increasing governmental intervention to protect children in the digital domain. Meta’s early compliance highlights the challenges and responsibilities faced by global platforms as they navigate heterogeneous legal landscapes. The bulk removal of underage accounts also raises important questions regarding accuracy in age verification technologies and the balance between digital inclusion and child safety.
As jurisdictions worldwide consider similar protective measures, the Australian case will be closely watched as a benchmark for enforcing age restrictions on social media usage. Meta’s experience may influence how platforms develop future compliance strategies to meet evolving regulatory expectations aimed at safeguarding young users online.

