Roblox has announced it will expand its age estimation requirement to all users of the platform who access its communication features by the end of the year.
Back in July, the platform introduced new safety features, including an AI system that estimates a user’s age from a video selfie. Now, in a new statement, the company says it plans to roll out age checks for all users through facial age estimation, ID-based age verification, and verified parental consent.
Roblox also plans to limit communication between adults and minors, unless they know each other in the real world.
“These added layers of protection will help provide users with access to developmentally appropriate features and content,” reads the statement from chief safety officer Matt Kaufman. “We hope this move sets a standard that other gaming, social media, and communication platforms follow.”
In addition to age estimation, Roblox has rolled out a number of other safety improvements:
- Introducing Trusted Connections, as per its previous update, to safeguard communication
- Using its open-source AI system Roblox Sentinel to detect early signs of child endangerment
- Improving voice and text filters
- Rolling out new technology to detect specific servers where large numbers of users are breaking rules
- Refining its avatar detection model to scan for player characters breaking rules
Since January 2025, Roblox has shipped more than a hundred safety initiatives in an effort to demonstrate its commitment to child safety.
The push follows numerous reports of safety failings, for which Roblox has long been criticised.
This is a news-in-brief story, part of our vision to bring you all the big news in a daily live report.