Roblox, the online gaming platform with more than 144 million daily users worldwide, has announced a sweeping overhaul of its safety systems for minors, introducing age-specific account types and tightening verification checks — while acknowledging that some parents have actively helped their children circumvent existing protections. The announcement follows sustained regulatory pressure, a high-profile grooming case in the United Kingdom, and an earlier Guardian Australia investigation that documented a week of virtual sexual harassment experienced by a user whose profile was set up as an eight-year-old.
The new measures, rolling out in Australia from May and globally from June, centre on two new account categories: Roblox Kids for children aged five to eight, and Roblox Select for those aged nine to fifteen. Each account type will carry its own background colour so parents can identify which tier their child is using, and both will limit users to content rated "minimal" or "mild" until age checks are completed. Chat will be switched off by default in Australia for both account types. Users will be assigned to the appropriate account through facial age estimation technology or via parental controls, and will transition to a standard account once they turn sixteen.
The push comes after Roblox's chief of safety, Matt Kaufman, confirmed that the company's monitoring systems had caught children handing their phones to their parents during facial age checks, effectively allowing minors to be registered as adults. "You could see the kid in the background who handed the phone to their parent," Kaufman said. The facial estimation technology carries an error margin of roughly 1.4 years for users under eighteen, and age verification is treated as an ongoing process rather than a one-time gate: accounts whose behaviour does not match their registered age are flagged for rechecks, which Kaufman said occur daily. As of April, fifty percent of global users and sixty percent of users in Australia, New Zealand and the Netherlands, where the system launched in December, have completed age assurance checks.
Beyond account types, Roblox is overhauling how games reach younger users. New developer verification requirements mean content creators must either complete formal ID checks or maintain a verified link to a parent account. A real-time moderation system will assess games not just for content but for how players actually interact with them, and titles will need to be tested by users over sixteen before becoming available to younger audiences. Roblox also plans to replace its internal content labels with country-specific ratings — in Australia, this means adopting the national classification system later this year.
Despite broad recognition that the changes represent genuine progress, experts caution that significant gaps remain. Research suggests that age estimation technologies fail to restrict access for as many as seven in ten children under sixteen, because of technical error rates and deliberate circumvention. Analysts also note that the new framework places a heavy burden on parents to actively manage settings, including deciding whether to enable chat and vetting games sourced from regions with different rating systems. The scale of the circumvention problem is already evident: more than two-thirds of Australian teenagers under sixteen were still using platforms subject to the country's social media ban as of March. Broader child safety legislation, including a "digital duty of care" framework that would hold platforms legally accountable for foreseeable harms, was proposed in Australia in 2024 but later paused; the communications minister has pledged to bring revised legislation before parliament this year.