Stock Markets April 22, 2026 06:59 AM

Australia Orders Major Gaming Platforms to Disclose Child Protection Measures

eSafety Commissioner issues enforceable notices to Roblox, Minecraft, Fortnite and Steam, warning of heavy daily fines for noncompliance

By Priya Menon

Australia's online safety regulator has served legally enforceable transparency notices on several leading gaming platforms, requesting detailed information about measures to prevent grooming, sexual extortion and radicalisation of children. Platforms named include Roblox, Microsoft’s Minecraft, Epic Games’ Fortnite and Valve’s Steam. Companies face potential fines of up to A$825,000 ($590,783) per day for failing to comply.

Key Points

  • Australia's eSafety Commissioner issued legally enforceable transparency notices to Roblox, Minecraft, Fortnite and Steam, seeking details on child protection systems and cybersecurity-aligned measures.
  • Companies face fines of up to A$825,000 ($590,783) per day for failing to comply; responses are generally expected within 30 days.
  • The action affects the online gaming and technology sectors, highlighting regulatory and legal pressures on platforms that provide real-time chat and social features for children.

Australia's internet safety regulator has ordered some of the world's most popular gaming services to provide detailed explanations of how they protect young users from sexual predators and extremist influences.

The eSafety Commissioner said on Wednesday it issued legally enforceable transparency notices to Roblox, Microsoft’s Minecraft, Epic Games’ Fortnite and Valve’s Steam. The notices request material on their safety systems, staffing and measures consistent with cybersecurity protocols.

Companies that do not comply with the transparency notices could face penalties of up to A$825,000 (about $590,783) a day. Regulators in Australia typically allow 30 days for responses to such compliance notices.

Julie Inman Grant, the eSafety Commissioner, highlighted how gaming services and related features can expose children to initial contact with offenders. "What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services," she said in a statement.

Inman Grant also pointed to the scale of children's participation in online gaming, saying that nine in 10 Australians aged 8 to 17 have played online games. "Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms," she said.

Microsoft said it was reviewing the regulator’s notice and reiterated that it takes children's online safety seriously. "We continue to evolve our approach to meet the evolving threat and regulatory landscape," a company spokesperson said by email. Roblox did not immediately respond to requests for comment.

The regulator's action comes as scrutiny grows over how gaming platforms detect and prevent threats to minors. The eSafety Commissioner flagged that real-time chats with unknown users on some gaming platforms can be more difficult for automated systems to monitor than conventional social media interactions.

Last week Roblox settled with the U.S. states of Alabama and West Virginia over claims the platform failed to protect young users. Under those settlements, Roblox agreed to pay more than $23 million and to implement changes to how children access its chat and gaming features. The company is also facing more than 140 lawsuits in U.S. federal courts accusing it of knowingly facilitating child sexual exploitation.

Amid the legal and regulatory pressures, Roblox said it will roll out tailored account types for younger users from June. Children aged 5 to 8 will be placed on "Roblox Kids" accounts, while users aged 9 to 15 will be assigned to "Roblox Select."

The enforcement notices focus on gaming services and features, including encrypted messaging and other communications tools, that the eSafety Commissioner has identified as potential first points of contact between children and offenders engaged in grooming, sexual extortion and radicalisation.

By seeking documentation on safety systems, staffing and cybersecurity-aligned measures, the regulator aims to assess how platforms identify, prevent and respond to the range of harms it has highlighted. The notices create a legal obligation for the named companies to deliver that information within the usual compliance timeframe or face the stated daily penalties.


Exchange rate: $1 = A$1.3965.

Risks

  • Noncompliance with the eSafety Commissioner's notices could expose companies to substantial daily fines, posing financial and operational risks to gaming and technology firms.
  • Real-time chat features and encrypted messaging on gaming platforms can be harder to monitor with automated systems, increasing the risk of undetected grooming, sexual extortion and radicalisation and prompting regulatory scrutiny of moderation practices.
  • Ongoing litigation and recent settlements, such as Roblox's agreements with Alabama and West Virginia and the more than 140 lawsuits in U.S. federal courts, create legal and financial uncertainties for affected companies and could influence investor and partner decisions in the sector.
