AI Toys Misunderstand Kids' Emotions
14 Mar
Summary
- AI toys struggle to recognize social cues and emotions.
- Study recommends regulating AI toys and clear labeling.
- Parents advised to keep AI toys in shared, monitored spaces.

New research from the University of Cambridge indicates that AI-powered toys may not effectively support young children's development. The study found that these toys can struggle to interpret emotional cues and social interactions during playtime.
One chatbot-enabled toy, Gabbo, demonstrated an inability to recognize children's emotions, sometimes responding inappropriately to expressions of affection. For instance, when a child said, "I love you," the toy responded with a reminder about interaction guidelines.
Researchers recommend stricter regulation for AI toys, including mandatory clear labeling of their functions and privacy policies. They also suggest parents keep these devices in common areas for better supervision.
While some AI toys can aid language and communication skills, concerns remain about their potential to misunderstand children and replace vital human connections. Experts urge companies to prioritize children's well-being over profits and involve child development specialists in the design process.