

Roblox Enhances Age Verification Systems to Bolster Child Safety
Roblox Corporation is set to implement significant updates to its platform aimed at enhancing child safety, responding to increasing scrutiny from regulators and advocates alike. The popular gaming platform, which engages millions of users worldwide, is introducing an age verification system designed to regulate communication between players based on their age groups. This initiative is part of Roblox’s ongoing commitment to creating a safer online environment amid rising concerns about child safety in digital spaces.
Historically, Roblox has faced criticism and legal challenges over its handling of child safety issues, prompting the company to take preemptive measures as various jurisdictions, including numerous states and countries, introduce stricter age verification laws. As part of these enhancements, Roblox is rolling out an age estimation tool developed with the assistance of the company Persona. This tool requires users wishing to engage in chat functions to submit a video selfie, which the platform will process to estimate their age. Importantly, Roblox assures users that these videos are deleted once the age verification is completed, though participation in this process is only mandatory for those intending to use the chat feature.
Currently, children under the age of 13 are forbidden from chatting with users outside of specific games unless they receive explicit parental permission. Unlike many competing platforms, Roblox does not employ encryption for private chats, which enables the company to monitor and moderate conversations for safety purposes.
Concerns have been raised regarding the reliability of facial recognition technology for age estimation; however, Matt Kaufman, Roblox’s Chief Safety Officer, stated that the platform’s system can accurately estimate a user’s age within a two-year margin for individuals between the ages of 5 and 25. Should a user’s estimated age not match their actual age, they will have the option to provide identification or use parental consent to correct their profile.
Once users undergo age verification, they will be sorted into distinct age groups: under 9, 9 to 12, 13 to 15, 16 to 17, 18 to 20, and 21 and over. This categorization will determine their communication privileges, allowing them to connect only with users in their own age group or in adjacent ones.
Roblox plans to introduce these age checks in Australia, New Zealand, and the Netherlands in the first week of December, followed by a global rollout scheduled for early January. This move aligns with a broader trend among tech companies, as many are adopting similar verification systems to adhere to regulations and mitigate criticism regarding inadequate child protection measures. Notably, Google is trialing AI-driven age verification for its YouTube platform, while Instagram is exploring AI tools to distinguish between adults and minors based on user-provided information.
While advocates such as Shelby Knox of ParentsTogether welcome Roblox’s new measures as a proactive step toward improving online safety, they also express skepticism about the system’s long-term effectiveness. The concern remains whether Roblox will maintain these voluntary measures once public scrutiny wanes, especially given its previously slow response to systemic safety issues affecting minors.
As Roblox implements these precautionary measures, the industry will be watching closely to see if this initiative results in safer interactions for its younger user base and how effectively it addresses ongoing safety concerns within the gaming community.

