Roblox's current safety issues, such as inappropriate content, predatory behavior, and unsupervised chat, don't stem solely from self-reported ages. A determined user can bypass the age verification system with borrowed credentials or other workarounds, and age checks by themselves can't filter behavior, moderate conversations, or prevent grooming. I've tried the process myself: it first asks for a face scan, and if that fails, it asks for government-issued identification. Many people, including minors, may be hesitant to submit this information given the steady stream of data breaches. Just recently, the vendor that handles age verification for Discord suffered a breach, and about 70,000 users' IDs were leaked. Even if Roblox encrypts the data, storing government-issued identification poses a serious risk, and many users may simply refuse to comply. Facial-recognition AI is also error-prone; it may misjudge a person's age, and its accuracy can vary across ethnicities. Many users will also see the requirement as risky and unnecessary just to use chat features, and even when it works, it can still degrade the user experience. Roblox should instead focus on better content moderation, preventing false bans, and smarter behavioral detection.