Australia has intensified enforcement of its December law that bars people under 16 from holding accounts on popular social media platforms, raising the stakes for large technology companies as international attention focuses on Canberra’s policy experiment.
Two months after the ban took effect, and following a mid-January compliance report presented as evidence of successful platform cooperation, the federal government moved to investigate potential breaches by several major apps. The heightened scrutiny comes as at least eight other countries signal interest in similar restrictions, and after U.S. courts handed down rulings finding major tech firms negligent toward young users.
Compliance data and the government response
In mid-January the Australian government reported that social media companies had deactivated 4.7 million suspected underage accounts, a figure that led industry participants to expect a grace period of up to a year before enforcement. That early report, however, has been undercut by a more comprehensive compliance review from the eSafety regulator.
The regulator’s first full compliance report found that nearly one-third of parents surveyed said their child under 16 still had at least one social media account. Among those parents, about two-thirds reported that the platform had not asked the child for their age, suggesting gaps in the age verification measures companies have deployed.
Separate reporting by the regulator highlighted additional operational problems: complaints related to cyberbullying and image-based abuse, two harms the government said the ban would help address, remained broadly unchanged; parents reported difficulty informing platforms that their underage children still had accounts; and minors who failed age checks were sometimes prompted to repeat the check until they passed.
Which platforms are under investigation
Officials said they are gathering evidence that could form the basis of legal action and named a set of major platforms under scrutiny: Meta’s Instagram and Facebook, TikTok, Alphabet’s YouTube, and Snapchat. The eSafety regulator had previously signalled it would focus enforcement action on cases of systemic noncompliance, and investigators are now assessing whether those criteria are met.
Companies' responses have varied. Meta and Snap said they were committed to complying with the law. TikTok declined to comment, and Alphabet did not respond to a request for comment on the announced investigations.
Domestic political framing and international ripple effects
Canberra’s centre-left government, which had touted cooperation with industry earlier in the ban’s rollout, appears to have sharpened enforcement rhetoric in part because of global interest in the policy. Officials welcomed the number of foreign jurisdictions exploring similar restrictions, while also responding to evidence that many under-16s still have access to social platforms.
Experts advising government bodies described the situation as an experiment under global scrutiny. One academic contributing to the government's two-year study of the ban said recent U.S. jury and judicial decisions had strengthened the public case for holding platforms accountable for harms to young people. A former government lawyer said officials were encouraged by other jurisdictions' interest but called the recent compliance findings disheartening, suggesting visible enforcement would be needed to sustain momentum abroad.
Legal context from recent U.S. decisions
Observers say recent court outcomes in the United States have likely helped embolden Australian regulators. A jury verdict in a U.S. trial ordered Meta to pay $375 million for safety lapses tied to child exploitation across Facebook, Instagram and WhatsApp. Other U.S. rulings found Meta and Google negligent for designing social media features that can harm young people. Those decisions, though handed down in a separate jurisdiction, have been cited by Australian commentators as influencing public opinion and regulatory posture.
Potential platform design and operational consequences
Researchers tracking regulation and platform design say the combined pressure of litigation and national regulation could prompt companies to alter product features and account-creation flows worldwide. One academic specialising in regulatory impacts suggested that redesigns intended to reduce liability in the United States may also curtail access for under-16s globally, since changes made to mitigate litigation risk in one market are often replicated across products.
This aligns with the government’s position that the barrier to compliance is not parents or children failing to follow rules, but rather weaknesses in how some large platforms have implemented measures to keep under-16s from having accounts.
Regulatory penalties and enforcement standard
The law requires platforms to take what it defines as "reasonable steps" to prevent under-16s from holding accounts; failure to meet that standard can trigger fines of up to A$49.5 million ($34.06 million), the government has warned. Officials are now collecting evidence to determine whether the earlier deactivations and other measures meet that threshold, or whether more substantive enforcement is warranted.
What remains uncertain
The most recent public reporting points to persistent implementation gaps, but the regulator’s findings do not, on their own, resolve whether systemic noncompliance has occurred across platforms. The government has signalled it will pursue legal remedies if the evidence supports such action; companies have signalled varying degrees of cooperation. Meanwhile, at least eight other countries have said they are exploring similar curbs, a dynamic officials say encourages continued enforcement to sustain international momentum.
($1 = 1.4531 Australian dollars)