TLDRs:
- Meta shares rose 3.80% on Thursday after removing under-16 Australian accounts.
- New regulations block all account creation for under-16 users starting December 4.
- Threads usage affected as Instagram age limits restrict access.
- Compliance fines push platforms to adopt robust age verification methods.
Meta (META) shares rose 3.80% on Thursday after the company began deactivating accounts belonging to Australian users under the age of 16 on Facebook and Instagram.
This initiative comes in preparation for a new law set to take effect on December 10, 2025, which mandates stricter online safety measures for minors. As a result, users under this age threshold will no longer be able to maintain active accounts on the two social media giants.
The move also affects Threads, Meta’s conversational app, since users need an Instagram account to access the platform. According to estimates from Australia’s eSafety commissioner, approximately 500,000 accounts are impacted by the removal.
Meta reportedly informed affected users last month and is allowing them to appeal if they believe their age was incorrectly flagged.
Account creation blocked for minors
In addition to removing existing under-16 accounts, Meta has implemented a new policy that prevents the creation of accounts for minors in Australia starting December 4, 2025. This means new users under 16 will not be able to sign up for Facebook, Instagram, or Threads.
This proactive approach comes as part of Australia’s wider effort to safeguard younger users online. While the roughly 500,000 affected accounts represent only a fraction of Meta’s reach (about 2.8% of Facebook’s 17.7 million ad-reachable users in Australia and 3.3% of Instagram’s 15.2 million), the removal is an important step toward compliance with stricter digital safety legislation.
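The percentages cited above follow directly from the article’s figures, as a quick sanity check shows:

```python
# Verifying the reach percentages from the cited Australian figures.
under16_accounts = 500_000
facebook_ad_reach = 17_700_000   # Facebook ad-reachable users in Australia
instagram_ad_reach = 15_200_000  # Instagram ad-reachable users in Australia

fb_share = under16_accounts / facebook_ad_reach * 100
ig_share = under16_accounts / instagram_ad_reach * 100

print(f"Facebook: {fb_share:.1f}%")   # ~2.8%
print(f"Instagram: {ig_share:.1f}%")  # ~3.3%
```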
Analysts note that Meta’s platforms should remain largely unaffected in terms of advertising reach, as Facebook continues to reach 65.3% of the population and Instagram 56%.
Compliance pressures drive verification solutions
The new legislation also brings hefty penalties for non-compliance, with fines reaching up to A$49.5 million, prompting platforms like Meta to invest in more robust age verification methods. Companies offering solutions such as biometric age estimation, document verification, and layered verification processes are seeing increased demand.
Experts recommend combining multiple approaches to ensure accuracy and fairness, including behavioral signals, AI-driven age estimation, and identity document checks. These measures not only meet Australia’s Privacy Act requirements but also help platforms stay ahead of enforcement actions. TikTok, Snapchat, and YouTube, which fall under the same regulations, are adopting layered verification strategies as well.
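The layered approach experts describe can be sketched in code. The sketch below is purely illustrative: the signal names, thresholds, and fallback logic are hypothetical assumptions, not Meta’s actual implementation or any vendor’s API. The core idea is that a verified document outranks an AI estimate, and low-confidence cases are escalated rather than auto-approved.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeSignals:
    estimated_age: float                # e.g. from AI-driven age estimation (hypothetical)
    estimation_confidence: float        # 0.0 - 1.0 confidence in that estimate
    document_verified_age: Optional[int]  # age from an ID document check, if one was done

def is_over_16(signals: AgeSignals, min_confidence: float = 0.9) -> bool:
    """Layered check: trust a verified document first, then a confident estimate."""
    # Layer 1: an identity document check is the strongest signal when available.
    if signals.document_verified_age is not None:
        return signals.document_verified_age >= 16
    # Layer 2: without a document, accept only a high-confidence AI estimate.
    if signals.estimation_confidence >= min_confidence:
        return signals.estimated_age >= 16
    # Ambiguous cases fail closed; a real system would escalate (e.g. request a document).
    return False

# Example: a confident estimate of 18 passes; a shaky estimate of 20 does not.
print(is_over_16(AgeSignals(18.2, 0.95, None)))  # True
print(is_over_16(AgeSignals(20.0, 0.50, None)))  # False
```

The fail-closed default reflects the regulatory incentive described above: with fines up to A$49.5 million, platforms are likelier to demand more evidence than to wave through uncertain cases.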
Implications for users and the market
The removal of underage accounts underscores a growing global trend toward online safety for minors. For affected users, it means the loss of access to social media platforms widely used for social interaction, entertainment, and even education.
For Meta, while the compliance workload is manageable, the company must balance safety initiatives with maintaining user engagement and advertising revenues.
The eSafety commissioner’s guidance provides platforms with a framework to monitor and improve age-checking processes, offering an opportunity to refine digital safeguards before enforcement begins. Industry observers suggest that these measures could pave the way for similar policies in other regions, signaling a shift toward stricter online safety standards worldwide.


