Australia Orders Roblox, Minecraft To Explain Child Safety Measures

Australia’s online safety regulator has asked major gaming platforms, including Roblox and Microsoft’s Minecraft, to detail how they are protecting children from grooming, exploitation and exposure to extremist content.

The eSafety Commissioner said on April 22 that legally enforceable transparency notices had been issued to several companies, including Epic Games, maker of Fortnite, and Valve, operator of Steam, requiring them to disclose information about safety systems, staffing structures and cybersecurity protocols.

Failure to comply with the notices could expose the companies to financial penalties and possible civil action.

Australia’s eSafety Commissioner Julie Inman Grant said gaming platforms and related services, including encrypted messaging tools, are often the first point of contact between children and potential offenders.

She warned that after initial contact is made in gaming environments, offenders frequently move conversations to private messaging platforms, making detection more difficult.

In a statement, she added that online games are not only entertainment spaces but also social environments, noting that around nine in ten Australian children aged eight to 17 have played online games.

Authorities expressed concern that predators may use these platforms to groom children, engage in sexual exploitation, or introduce extremist narratives disguised within gameplay interactions.

The regulator stressed that these risks can lead to harm beyond gaming platforms, including real-world contact offences and radicalisation.

The move comes amid increased global scrutiny of child safety standards in the gaming industry, where real-time communication features can be difficult to monitor effectively compared to traditional social media platforms.

Roblox has recently faced legal pressure in the United States, including settlements with state authorities over child safety concerns, and is currently dealing with multiple lawsuits alleging failure to prevent exploitation on its platform.

In response, the company has announced plans to introduce age-specific accounts aimed at improving safety for younger users, with new categories set to roll out in the coming months.

Roblox and Microsoft have not yet publicly responded to the latest Australian regulatory notices.
