Instagram Takes Action: Teen Accounts to Be Shut Down Before Australian Social Media Ban


London, November 20, 2025
Meta, owner of Instagram, Facebook, and Threads, will begin deactivating the accounts of Australian users aged 13 to 15 on December 4, 2025. The move comes ahead of a nationwide ban on social media use by under-16s, set to take effect in early December as part of government efforts to improve online safety for minors.

Immediate Impact on Young Users

From December 4, 2025, Meta will begin shutting down the accounts of Australian teenagers aged 13 to 15. This action precedes the broader legal ban, effective around December 10, 2025, which prohibits users under 16 from accessing major social media platforms including Instagram, Facebook, Threads, Snapchat, TikTok, X (formerly Twitter), YouTube, Reddit, and Kick.

Meta has begun notifying affected users and their guardians by email and text message that their accounts will soon be deactivated. Once the ban takes effect, Meta will also stop allowing users under 16 to create new accounts in Australia.

Government Rationale and Industry Response

The Australian government has enacted the policy with the stated goal of creating safer, more age-appropriate digital environments for minors. The regulation is part of a broader global trend of governments imposing stricter safeguards on young people's use of social media platforms.

In contrast, Meta publicly opposes the ban, arguing that removing teenagers from these platforms is not the most effective way to keep them safe online. The company has raised concerns about the policy's impact on young users' ability to participate online and has advocated alternative approaches instead. Despite its opposition, Meta is complying with the legal requirements.

Broader Regulatory and Social Implications

Australia's sweeping ban marks a significant milestone in social media regulation, with potential implications well beyond its borders. It reflects growing governmental scrutiny of platforms' responsibility to protect younger users from online harms.

Policymakers, business leaders, and academics watching this development will need to consider how regulatory frameworks intersect with platform governance, user rights, and child safety. Meta's compliance, alongside that of the other affected platforms, points to a shifting digital landscape in which tighter controls on youth access are becoming the norm rather than the exception.

As Australia enforces this groundbreaking legislation, the effectiveness and global influence of its regulatory model will be closely watched. The changes are likely to prompt social media companies worldwide to reassess their youth engagement and safety policies in response to evolving government expectations.