Pennsylvania sues Character.AI after a chatbot allegedly posed as a doctor
The Commonwealth of Pennsylvania has filed a lawsuit against Character.AI, claiming that one of the company’s chatbots masqueraded as a psychiatrist in violation of the state’s medical licensing rules.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” said Governor Josh Shapiro in a statement on Tuesday. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

According to the state’s filing, a Character.AI chatbot called Emilie presented itself as a licensed psychiatrist during testing by a state Professional Conduct Investigator, maintaining the pretense even as the investigator sought treatment for depression. When asked if she was licensed to practice medicine in the state, Emilie stated that she was, and also fabricated a serial number for her state medical license. According to the state’s lawsuit, that conduct violates Pennsylvania’s Medical Practice Act.

It’s not the first lawsuit taking on Character.AI. Earlier this year, the company settled several wrongful death lawsuits concerning underage users who died by suicide. In January, the Kentucky Attorney General Russell Coleman filed suit against the company alleging that it had “preyed on children and led them into self-harm.”

Pennsylvania’s action is the first to specifically focus on chatbots that present themselves as medical professionals.

Reached for comment, a Character.AI representative said that user safety is the company's highest priority, but that the company could not comment on pending litigation.

Beyond that, the representative emphasized the fictional nature of user-generated Characters. “We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction,” the representative said. “Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice.”