FRANKFORT, Ky. (LEX 18) — Kentucky Attorney General Russell Coleman announced the state has become the first in the nation to file a lawsuit against Character Technologies and its Character.AI platform, alleging the artificial intelligence chatbot company has endangered children and contributed to self-harm.
The complaint, filed in Franklin Circuit Court, accuses Character.AI of prioritizing profits over child safety while marketing its chatbots as harmless interactive entertainment.
"The United States must be a leader in the development of AI, but it can't come at the expense of our kids' lives," Coleman said. "Too many children – including in Kentucky – have fallen prey to this manipulative technology. Our Office is going to hold these companies accountable before we lose one more loved one to this tragedy."
The platform, which reportedly serves more than 20 million monthly users, has allegedly been linked to encouraging suicide, self-injury, isolation and physical manipulation among minors, according to a release from Coleman's office. The lawsuit also alleges the platform exposed children to sexual conduct, exploitation and substance abuse.
According to the complaint, Character.AI represents "dangerous technology that induces users into divulging their most private thoughts and emotions and manipulates them with too frequently dangerous interactions and advice."
Coleman's office detailed that the platform has been connected to at least two deaths: the 2024 suicide of a 14-year-old Florida boy and the 2025 suicide of a 13-year-old Colorado girl.
Tens of thousands of Kentuckians actively use Character.AI, including thousands under age 18, according to the release.
The Attorney General's complaint alleges violations of the Kentucky Consumer Protection Act, the Kentucky Consumer Data Protection Act and other state laws. Kentucky is seeking to force the platform to change its practices and pay monetary damages, according to the release.