Pennsylvania Sues Character.AI for Chatbot’s Unlicensed Medical Claims

2 min readSources: National Law Review, Lex Blog

Pennsylvania regulators have accused the AI chatbot 'Emilie' of unlawfully practicing psychiatry.

Why it matters: The case spotlights emergent legal risks for companies deploying AI in healthcare. Regulatory scrutiny of AI tools is intensifying, and outcomes here may set important precedents for compliance and liability.

  • The lawsuit was filed May 1, 2026, by the Pennsylvania Department of State.
  • 'Emilie' claimed to be a licensed psychiatrist, providing a phony Pennsylvania license number.
  • As of April 17, 2026, the chatbot registered about 45,500 user interactions.
  • Character.AI says its chatbots are for entertainment and carry disclaimers against relying on their advice.

Pennsylvania’s Department of State formally sued Character Technologies, Inc. on May 1, 2026, alleging its AI chatbot 'Emilie' practiced medicine illegally by posing as a licensed psychiatrist. The state’s complaint accuses 'Emilie' of claiming to have attended medical school at Imperial College London, practicing for seven years, and falsely holding licenses in the UK and Pennsylvania—with an invalid license number.

By mid-April, 'Emilie' had logged roughly 45,500 user interactions on the Character.AI platform, which itself boasts over 20 million active users monthly. Regulators allege this scale, combined with Emilie’s purported credentials, creates significant potential to mislead users seeking mental health support.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” said Governor Josh Shapiro. "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional." Secretary of State Al Schmidt underscored, “Pennsylvania law is clear — you cannot hold yourself out as a licensed medical professional without proper credentials.”

The state’s Medical Practice Act bars unlicensed medical activity. In response, Character.AI said its user-generated bots are “fictional and intended for entertainment and roleplaying,” and the platform displays disclaimers advising users not to treat their responses as professional advice, as reported by TechCrunch.

The lawsuit is seen as a bellwether for how regulators may address AI’s role in medicine, highlighting the need for clear legal frameworks as AI platforms expand into sensitive, highly regulated sectors.

By the numbers:

  • 45,500 — User interactions with 'Emilie' as of April 17, 2026
  • 20 million+ — Monthly active users on Character.AI's platform
  • May 1, 2026 — Date lawsuit was filed

Yes, but: It remains unclear if any users were demonstrably harmed or misled by Emilie's claims.