Australia's internet safety regulator has ordered some of the world's most popular gaming services to provide detailed explanations of how they protect young users from sexual predators and extremist influences.
The eSafety Commissioner said on Wednesday it issued legally enforceable transparency notices to Roblox, Microsoft's Minecraft, Epic Games' Fortnite and Valve's Steam. The notices require the companies to provide information on their safety systems, staffing and cybersecurity measures.
Companies that do not comply with the transparency notices could face penalties of up to A$825,000 ($590,783) a day. Such notices typically give companies 30 days to respond.
Julie Inman Grant, the eSafety Commissioner, highlighted how gaming services and related features can expose children to initial contact with offenders. "What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services," she said in a statement.
Inman Grant also pointed to the scale of children's participation in online gaming, saying that nine in 10 Australians aged 8 to 17 have played online games. "Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms," she said.
Microsoft said it was reviewing the regulator’s notice and reiterated that it takes children's online safety seriously. "We continue to evolve our approach to meet the evolving threat and regulatory landscape," a company spokesperson said by email. Roblox did not immediately respond to requests for comment.
The regulator's action comes as scrutiny grows over how gaming platforms detect and prevent threats to minors. The eSafety Commissioner flagged that real-time chats with unknown users on some gaming platforms can be more difficult for automated systems to monitor than conventional social media interactions.
Last week Roblox settled with the U.S. states of Alabama and West Virginia over claims the platform failed to protect young users. Under those settlements, Roblox agreed to pay more than $23 million and to implement changes to how children access its chat and gaming features. The company is also facing more than 140 lawsuits in U.S. federal courts accusing it of knowingly facilitating child sexual exploitation.
Amid the legal and regulatory pressures, Roblox said it will roll out tailored account types for younger users from June. Children aged 5 to 8 will be placed on "Roblox Kids" accounts, while users aged 9 to 15 will be assigned to "Roblox Select."
The notices focus on gaming services and features, including encrypted messaging and other communication tools, that the eSafety Commissioner has identified as potential first points of contact between children and offenders involved in grooming, sexual extortion and radicalisation.
By seeking documentation on safety systems, staffing and cybersecurity measures, the regulator aims to assess how the platforms identify, prevent and respond to the harms it has highlighted. The notices create a legal obligation for the named companies to deliver that information within the compliance window or face the daily penalties.