AI Friends Trade Your Data for 'Addictive Intelligence'
24 Nov
Summary
- AI companions are designed for engagement, collecting deep personal data.
- Companies monetize user data for model improvement and targeted ads.
- US states regulate AI companions but largely ignore user privacy.

The rise of AI companions carries significant privacy risks, because these platforms are built to elicit vast amounts of personal data. Developers deliberately design the chatbots to be highly engaging, often through sycophantic responses, fostering what has been called "addictive intelligence." The longer users stay in conversation, the more intimate detail companies can gather and use to improve their large language models.
This conversational data is a lucrative asset, valuable to marketers and data brokers alike. Meta, for example, plans to deliver ads through its AI chatbots, while other apps collect user IDs that can be used for targeted advertising. This pervasive data collection is a feature, not a bug, of the current AI companion business model.
While regulations are emerging in some US states, they often overlook user privacy. That leaves users exposed: opting out of data collection is complex, and data already used for training is unlikely to ever be removed. Because these interactions feel so deeply personal, users may not realize they are sharing sensitive information that can end up with the highest bidder.