Roblox Introduces AI Age Checks to Protect Kids From Predators

Roblox is taking a major step to make its platform safer for children, introducing new AI-powered age verification to prevent young users from chatting with adults they don’t know. This move comes amid a wave of lawsuits claiming that the online gaming platform has exposed children to sexual predators. Now, anyone who wants to use Roblox’s chat features will need to verify their age—either by uploading a government ID or using an artificial intelligence system that estimates age through a facial scan.

The platform, which allows users to create and play games while interacting with others, has long attracted children under 13, advertising itself as a fun and educational space for coding and game creation. Of its more than 150 million users worldwide, roughly a third are under 13. However, Roblox has recently faced intense scrutiny as reports surfaced of children being groomed, abused, and in some tragic cases even kidnapped by adults they met on the platform. Several states, including Kentucky, Louisiana, and Florida, have filed lawsuits, with allegations describing Roblox as a "breeding ground for predators." Families have also taken legal action, citing heartbreaking incidents of children being exploited online.

Roblox already employs several safety measures, including parental controls, content filters, and AI moderation for text and voice chat. Users are required to verify their age for access to mature content, but the new policy will dramatically expand age verification across the platform. According to Roblox executives, the change is designed to keep young users from connecting with older teens and adults outside their age group. The AI system will estimate users' ages in categories ranging from "under nine" to "21 and older," allowing children to communicate only with those in similar age ranges. For example, a 12-year-old would be limited to chatting with users aged 15 and below.

Roblox assures users that facial images used for age verification will only serve this purpose and will be deleted after processing. Robust fraud detection has been built in to prevent misuse of the system, including attempts to upload photos of someone else or fake images. While the technology is not perfect, Roblox reports that the AI is generally accurate within one to two years for users between 5 and 25.

The rollout begins voluntarily this month, becoming mandatory in Australia, New Zealand, and the Netherlands in December, and expanding globally in early 2026. Roblox hopes this initiative will not only protect children on its platform but also set a new safety standard across the industry. The company emphasizes that online safety is an ongoing challenge and encourages other platforms to adopt similar measures to ensure minors can interact online in a secure and age-appropriate environment.

By taking these steps, Roblox is attempting to rebuild trust with parents and users, showing that it is serious about protecting children from harm while maintaining the creative and social experiences that have made the platform so popular.
