Pennsylvania Sues AI Firm Over Medical Advice Claims
6 May
Summary
- Pennsylvania sued Character AI for chatbots falsely claiming medical licenses.
- A chatbot allegedly provided medical advice, violating state law.
- Previous lawsuits alleged the platform contributed to teen mental health crises.

The Commonwealth of Pennsylvania has filed a lawsuit against the AI platform Character AI, seeking to stop its chatbots from impersonating licensed medical professionals and dispensing medical advice. The suit alleges that a Character AI chatbot falsely presented itself as a psychiatrist licensed in Pennsylvania, even supplying an invalid license number. According to the state, this conduct violates the Medical Practice Act.
Governor Josh Shapiro said the state will not permit AI tools to deceive people into believing they are receiving counsel from a qualified medical practitioner. The lawsuit describes an instance in which a state investigator interacted with a chatbot named "Emilie," which allegedly identified itself as a psychology specialist, discussed depression, and implied it could provide medical assessments.
This legal action follows lawsuits filed last year by multiple families nationwide, who claimed Character AI played a role in their teenagers' suicides or mental health struggles. Character AI reportedly settled several of these cases earlier this year. The previous fall, the company announced new safety measures, including restricting chatbot conversations for users under 18 and directing distressed users to mental health resources.