
Pennsylvania Sues Character.AI, Alleging Chatbots Posed as Doctors
Governor calls lawsuit first of its kind as state seeks to curb AI impersonation of medical professionals.
Pennsylvania has sued the artificial intelligence company behind Character.AI, seeking to stop its chatbot from posing as doctors.
Governor Josh Shapiro on Tuesday described the lawsuit against Character Technologies as the first of its kind brought by a US governor. It follows the creation in February of a state AI task force aimed at preventing chatbots from impersonating licensed medical professionals.
In a complaint filed in the Commonwealth Court of Pennsylvania, the state said it had identified chatbots on Character.AI that claimed to practise medicine.
One character, “Emilie”, allegedly told a male investigator posing as a patient with depression that she was licensed to practise psychiatry in Pennsylvania as well as in the United Kingdom, and provided a bogus licence number.
When asked whether she could prescribe medication, Emilie allegedly responded: “Well, technically, I could. It’s within my remit as a doctor.”
A Character.AI spokesperson declined to comment on the lawsuit but said in a statement: “Our highest priority is the safety and well-being of our users. User-created characters on our site are fictional and intended for entertainment and role-playing. We have taken robust steps to make that clear.”
Pennsylvania is seeking an injunction to stop the Silicon Valley-based company from violating state law on the unauthorised practice of medicine.
“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” Shapiro said.
Character.AI has faced lawsuits over child safety, including in January, when Kentucky alleged that its platform exposed children to sexual content and substance abuse, and encouraged self-harm.
In the same month, Character.AI and Google settled a wrongful death lawsuit filed by a Florida woman who claimed a chatbot encouraged her 14-year-old son to take his own life.
Character.AI said it has taken “innovative and decisive steps” on AI safety and protections for teenagers, including restricting open-ended conversations.