Roblox adds age checks with facial age estimation technology in latest bid to keep kids safe
Roblox is introducing voluntary facial age-estimation checks to restrict cross-age communication and strengthen child safety across its platform.
The company confirmed the feature has begun rolling out globally to its online sandbox game after a voluntary trial period, with enforcement beginning in Australia, New Zealand and the Netherlands in early December.
The checks will then become mandatory worldwide when the feature expands in January 2026.
The system uses Persona’s facial age-estimation tools directly within the Roblox app, allowing the platform to estimate a user’s age without storing any photos or video; all media is deleted “immediately after processing,” Roblox said.
Once verified, players are sorted into age-based chat groups – under 9, 9–12, 13–15, 16–17, 18–20 and 21+ – and can only communicate within or near their own age bracket.
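Roblox has not detailed how the bracket matching works, but the grouping described above can be illustrated with a short Python sketch. The bracket boundaries below come from the article; the rule that users may also reach the immediately neighbouring bracket is an assumption standing in for “within or near their own age bracket,” and all names are hypothetical.

# Illustrative sketch only: Roblox has not published its matching logic.
# Bracket boundaries follow the groups described above; the "neighbouring
# bracket" rule and all names here are assumptions.

BRACKETS = ["under 9", "9-12", "13-15", "16-17", "18-20", "21+"]

def bracket_index(age: int) -> int:
    """Map an estimated age to one of the reported chat brackets."""
    if age < 9:
        return 0
    if age <= 12:
        return 1
    if age <= 15:
        return 2
    if age <= 17:
        return 3
    if age <= 20:
        return 4
    return 5

def can_chat(age_a: int, age_b: int) -> bool:
    """Allow chat only within the same or an adjacent bracket (assumed rule)."""
    return abs(bracket_index(age_a) - bracket_index(age_b)) <= 1

print(BRACKETS[bracket_index(10)])  # "9-12"
print(can_chat(10, 11))             # True  - both in the 9-12 bracket
print(can_chat(10, 14))             # True  - adjacent brackets (9-12 and 13-15)
print(can_chat(10, 19))             # False - brackets too far apart

Under that assumed rule, a 10-year-old could chat with a 14-year-old but not with a 19-year-old; the actual boundaries Roblox enforces between groups may differ.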
Without completing the check, users will lose access to features including chat, social media links on profiles, communities and certain in-experience messaging tools.
Roblox says the goal is to “limit communication between minors and adults” and create more consistent, age-appropriate experiences across its increasingly large and diverse user base.
The company urged the wider industry to adopt similar protections, framing the rollout as part of a broader responsibility to keep young people safe online.
The shift follows a year of expanded safety updates, including removing direct messaging for under-13s and adding “sensitive issues” content tags.
With more than 70 million daily users and online platforms under rising scrutiny, Roblox’s new age-checking system signals growing pressure on major tech firms to proactively verify and protect younger audiences.







