ChatGPT's Age AI Mistakes Adults for Teens
29 Jan
Summary
- ChatGPT's AI incorrectly flags adult users as minors.
- Users must verify age with IDs or selfie videos.
- Concerns about privacy and data collection are growing.

OpenAI's new global age-prediction AI for ChatGPT is off to a rocky start, frequently misclassifying adult users as minors. When that happens, 'teen mode' content filters kick in, restricting mature topics for legitimate adult subscribers. The system estimates age from behavioral signals, account history, and language analysis, and is designed to err on the side of caution.
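OpenAI has not published how its classifier works, but the "err on the side of caution" behavior described above can be illustrated with a toy example. The sketch below is purely hypothetical: the signals, weights, and threshold are invented for illustration, not drawn from OpenAI's system. It shows how a high confidence bar for "adult" turns cautious defaults into false positives.

```python
# Hypothetical sketch -- NOT OpenAI's actual system. Illustrates how a
# conservative age classifier might blend weak signals and default to
# teen-mode filtering whenever confidence in "adult" is low.
from dataclasses import dataclass


@dataclass
class AccountSignals:
    account_age_days: int         # how long the account has existed
    payment_method_on_file: bool  # a stored payment method hints at an adult
    language_adult_score: float   # 0..1, from a (hypothetical) language model


def estimate_adult_probability(s: AccountSignals) -> float:
    """Blend weak behavioral signals into a rough adult-probability score."""
    score = 0.0
    score += 0.3 if s.account_age_days > 365 else 0.0
    score += 0.3 if s.payment_method_on_file else 0.0
    score += 0.4 * s.language_adult_score
    return score


def apply_teen_mode(s: AccountSignals, threshold: float = 0.8) -> bool:
    """Return True if teen-mode filters should apply.

    The high threshold is the 'err on the side of caution' choice: any
    user not confidently identified as an adult gets filtered, which is
    exactly why adults end up flagged as teens.
    """
    return estimate_adult_probability(s) < threshold


if __name__ == "__main__":
    clear_adult = AccountSignals(account_age_days=900,
                                 payment_method_on_file=True,
                                 language_adult_score=0.9)
    ambiguous = AccountSignals(account_age_days=400,
                               payment_method_on_file=False,
                               language_adult_score=0.5)
    print(apply_teen_mode(clear_adult))  # False: clearly adult, no filters
    print(apply_teen_mode(ambiguous))    # True: real adult still filtered
```

In this toy setup, an adult with few strong signals scores 0.5 and falls below the 0.8 bar, so teen mode applies anyway; that asymmetry is the trade-off users are now running into.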
Users who are incorrectly flagged must verify their age through a third-party tool, Persona, which may request an official ID or a selfie video. The process has sparked widespread concern about privacy and data collection, with some users fearing it is a precursor to stricter verification policies. Despite OpenAI's assurances that it never sees the verification data and that the data is deleted after verification, unease persists.
This situation highlights the ongoing tension between personalized AI and robust safety mechanisms. Similar age-estimation tools on platforms like YouTube and Instagram have drawn comparable criticism. As ChatGPT becomes woven into daily life, from education to personal use, these misclassifications and the privacy trade-offs that follow feel especially personal and frustrating to users.
