
New York, November 21, 2025
Roblox has introduced a global safety policy requiring facial age verification for users to access chat features, beginning with a voluntary phase on November 21, 2025, and moving to mandatory enforcement in December in key markets and worldwide in January. The measure aims to prevent communication between children and adult strangers, addressing safety concerns on the platform.
Age Verification System and Chat Restrictions
Roblox’s new safety protocol integrates facial age estimation technology provided by the specialist company Persona. Following verification, users are classified into six age cohorts ranging from under 9 to 21 and above, and this segmentation determines which other users they may communicate with. Children under 9 have chat disabled by default unless a parent explicitly consents after verification, and private messaging remains restricted for users under 13 unless approved through parental controls. The system also supports “Trusted Connections”: verified family members or friends who remain permitted contacts despite age differences.
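To make the cohort-based rules above concrete, the following is a minimal illustrative sketch of how such chat-eligibility logic could be modeled. The intermediate cohort boundaries, the adjacency rule, and all names in the code are assumptions for illustration only; the article confirms only that there are six cohorts from under 9 to 21 and above, that under-9 chat requires parental consent, and that Trusted Connections are exempt from age-based limits. This is not Roblox’s actual implementation.

```python
# Illustrative sketch only: models the cohort-based chat rules described above.
# Cohort boundaries and the adjacency rule are assumptions, not Roblox's system.
from dataclasses import dataclass, field

# Six age cohorts, from "under 9" to "21 and above" (intermediate cuts assumed).
COHORTS = ["under_9", "9_12", "13_15", "16_17", "18_20", "21_plus"]

@dataclass
class User:
    name: str
    cohort: str                                   # set after age verification
    parental_chat_consent: bool = False           # explicit opt-in for under-9 chat
    trusted_connections: set = field(default_factory=set)  # verified family/friends

def can_chat(a: User, b: User) -> bool:
    """Return True if two users may chat under these sketched rules."""
    # Trusted Connections remain permitted contacts regardless of age difference.
    if b.name in a.trusted_connections and a.name in b.trusted_connections:
        return True
    # Under-9 users have chat disabled unless a parent has explicitly consented.
    for u in (a, b):
        if u.cohort == "under_9" and not u.parental_chat_consent:
            return False
    # Assumed rule: only adjacent cohorts may chat, keeping children and adult
    # strangers apart (the article does not specify the exact pairing rule).
    return abs(COHORTS.index(a.cohort) - COHORTS.index(b.cohort)) <= 1

# Example: a 10-year-old cannot chat with an unknown adult, but can with a
# parent added as a Trusted Connection.
child = User("kid", "9_12")
adult = User("stranger", "21_plus")
parent = User("parent", "21_plus", trusted_connections={"kid"})
child.trusted_connections.add("parent")
print(can_chat(child, adult))   # False
print(can_chat(child, parent))  # True
```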
Privacy safeguards are built into the process: facial images and videos used for age analysis are deleted immediately after processing. Roblox emphasizes that the design is intended to uphold user privacy while operating at scale.
Industry Context and Safety Enhancements
This initiative arrives amid significant legal pressure, following lawsuits by several U.S. states accusing Roblox of insufficient protections against predatory adult contact with minors. CEO Dave Baszucki acknowledged vulnerabilities, especially among users aged 13 to 17, and underscored the company’s zero-tolerance stance on harmful content and its substantial investments in AI-powered safety systems.
The policy has drawn support from child safety and privacy experts. Jules Polonetsky, CEO of the Future of Privacy Forum, commended the privacy-preserving implementation, and Stephen Balkam of the Family Online Safety Institute highlighted the critical role of age estimation tools in creating safer online environments for youth.
Parental Controls and Recommendations
After verification, parents retain management of linked child accounts, including control over birthdate information and messaging permissions. Experts recommend ongoing parental engagement: using anonymous gamertags, declining friend requests from unknown users, and making full use of parental controls and content-filtering settings.
Setting a New Standard for Online Safety
Roblox positions this system as a “safety gold standard” for the gaming industry, aiming to raise protections for millions of daily users worldwide. By balancing robust age assurance with stringent privacy safeguards, the approach seeks to drive broader industry adoption and foster safer digital spaces for children and teenagers.
The unfolding enforcement timeline and the integration of AI-enhanced safety technologies mark a significant evolution in how interactive platforms address the risk of harmful contact online, and the policy serves as a benchmark that may influence regulatory frameworks and industry practices going forward.

