
London, November 21, 2025
Roblox has announced a global rollout of stringent age verification and chat restriction measures aimed at preventing children from communicating with adult strangers, starting immediately in select countries and expanding worldwide by January 2026. The update uses AI-powered facial scans and ID verification to enforce new safety standards on the platform.
Policy Changes and Enforcement
The newly introduced policy segments users into strict age brackets, barring children under 13 from messaging or chatting privately with adults unless explicit parental consent is granted. The measure is intended to eliminate unsolicited contact between minors and adult strangers and so reduce the risk of exploitation.
The system verifies users' ages with AI-based facial age estimation and government-issued ID checks, powered by the identity verification company Persona, which together form the backbone of the updated security framework. These tools allow Roblox to enforce chat restrictions in real time according to each user's verified age.
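To make the rule concrete, the following minimal sketch shows how an age-bracketed chat check of this kind could look in code. It is an illustration only, not Roblox's implementation: the field names, the under-13 and adult thresholds, the parental-consent flag, and the choice to block private chat for unverified users are all assumptions drawn from the policy described above.

```python
# Hypothetical sketch of the age-bracketed chat rule described above.
# Field names, thresholds, and defaults are illustrative assumptions,
# not Roblox's actual implementation.
from dataclasses import dataclass
from typing import Optional

ADULT_AGE = 18

@dataclass
class User:
    user_id: str
    verified_age: Optional[int]      # None if age has not been verified
    parental_consent: bool = False   # explicit consent for an under-13 user

def can_private_chat(a: User, b: User) -> bool:
    """Return True if a private chat between the two users would be allowed."""
    # Assumption: unverified users default to the most restrictive setting.
    if a.verified_age is None or b.verified_age is None:
        return False
    for user, other in ((a, b), (b, a)):
        # Under-13 users may not chat privately with adults
        # unless explicit parental consent has been granted.
        if user.verified_age < 13 and other.verified_age >= ADULT_AGE and not user.parental_consent:
            return False
    return True

# A verified 11-year-old without parental consent cannot chat with a verified adult,
# but can still chat with a verified peer in the same age bracket.
child = User("child-1", verified_age=11)
assert can_private_chat(child, User("adult-1", verified_age=34)) is False
assert can_private_chat(child, User("peer-1", verified_age=12)) is True
```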
Background and Legal Context
Roblox’s move follows intense regulatory scrutiny and several lawsuits in the United States accusing the platform of inadequate protections against grooming and predatory behavior targeting children. With more than 150 million users globally, roughly one third of them under 13, Roblox has faced sustained pressure from safety advocates and lawmakers to overhaul its child protection mechanisms.
These legal and public pressures have accelerated Roblox’s commitment to safer communications within its expansive virtual ecosystem, which remains a popular digital space for children and families worldwide.
Stakeholder Responses and Safety Implications
Roblox has framed the updates as foundational steps toward creating a safer environment for its predominantly young user base. Child safety organizations have welcomed the age-focused chat restrictions but caution that continuous oversight will be essential as online behaviors and artificial intelligence tools evolve.
Business leaders, policymakers, and child advocacy experts view the changes as a significant regulatory and technological milestone in the ongoing effort to secure digital platforms used by children. The introduction of biometric and identity verification also signals a broader industry shift toward AI-driven age and identity checks as part of platform safety systems.
Forward Outlook
Roblox plans a phased global expansion of these features, aiming for comprehensive implementation by January 2026. Continued monitoring and adaptive policy will be critical as the platform evolves amid growing concerns over children’s exposure to online risks.
For those overseeing digital platform governance and child protection policies, Roblox’s approach underscores the increasing intersection of technology, regulation, and user safety. The effectiveness and ethical considerations of AI-driven identity verification will remain central topics as the digital landscape adapts to safeguard young users worldwide.

