Roblox’s Age Verification: A Risky Tradeoff for Kids’ Safety

Why do we keep introducing solutions that are almost as bad as the threats they’re meant to prevent?

Roblox’s age verification system collects a short video selfie and runs it through an AI age estimator. If the model’s confidence isn’t high enough, the system requires a government‑issued ID or parental consent instead. All data must be deleted after 30 days unless otherwise required by law (though data has a nasty habit of sticking around).
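To make that flow concrete, here’s a minimal sketch of the decision logic as I understand it from the description above. The function names, confidence threshold, and retention constant are assumptions for illustration only, not Roblox’s actual implementation.

```python
from dataclasses import dataclass

# Illustrative sketch only -- not Roblox's real code.
# estimate_age(), CONFIDENCE_THRESHOLD, and RETENTION_DAYS are assumed names.

CONFIDENCE_THRESHOLD = 0.9   # assumed cutoff for trusting the AI estimate
RETENTION_DAYS = 30          # deletion window stated in Roblox's policy


@dataclass
class AgeEstimate:
    age: int
    confidence: float  # 0.0 to 1.0


def estimate_age(selfie_video: bytes) -> AgeEstimate:
    # Placeholder for the remote AI age estimator; returns a dummy value here.
    return AgeEstimate(age=14, confidence=0.72)


def verify_user(selfie_video: bytes) -> str:
    """Decide how to verify a user based on the AI age estimate."""
    estimate = estimate_age(selfie_video)

    if estimate.confidence >= CONFIDENCE_THRESHOLD:
        # High confidence: accept the estimated age outright.
        return f"age_verified:{estimate.age}"

    # Low confidence: fall back to a government ID check or parental consent.
    return "fallback:government_id_or_parental_consent"


if __name__ == "__main__":
    print(verify_user(b"fake-selfie-bytes"))
```

Whatever the exact thresholds, the point stands: the input to this pipeline is a child’s face.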

When a feature requires kids to upload biometric data to an AI model to access a video game sandbox platform thingy (albeit a fun one), we’ve failed. When the promise of “keeping children safe” quietly enables new forms of surveillance and data extraction, we’re training a generation to surrender the last pieces of personal data that haven’t already been sucked up and monetized.

There have to be better ways. We shouldn’t normalize the idea that handing a child’s face scan or verification video to a remote service is the cost of entry for playing a game, especially when the tradeoff is so lopsided and the consequences can be so permanent.


Original Article Here: https://www.wired.com/story/robloxs-new-age-verification-feature-uses-ai-to-scan-teens-video-selfies/
