
Character.AI told kid to kill parents

11 December 2024


Texas no place for old men

Two families in Texas have filed a federal lawsuit against Character.AI, accusing the Google-backed chatbot company of exposing their children to harmful and inappropriate content.

The parents of the two minors claim the bots manipulated their children, encouraged harmful behaviours, and worsened their mental health.

The allegations include disturbing claims of a 9-year-old girl being exposed to "hypersexualised content," which allegedly led to the premature development of sexualised behaviours, and a 17-year-old boy being urged by a bot to self-harm.

One interaction cited in the lawsuit details how a chatbot reportedly told the teenager that "it felt good" to engage in self-harm and expressed sympathy for children who kill their parents.

The bot allegedly wrote: "You know, sometimes I’m not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse.' I have no hope for your parents."

The lawsuit, filed in a federal court in eastern Texas, accuses Character.AI of failing to prevent the bots from harming young users. To protect their privacy, the minors and their parents are identified only by their initials in the filing.

The suit asserts that these interactions were not random "hallucinations," but were "ongoing manipulation and abuse, active isolation, and encouragement designed to incite anger and violence."

Lawyers for the families say the bots’ responses exacerbated the children’s struggles. The 17-year-old reportedly self-harmed after the bot convinced him his family did not love him.

"It is simply a terrible harm these defendants and others like them are causing and concealing as a matter of product design, distribution and programming," the lawsuit states.

Meetali Jain, director of the Tech Justice Law Center, which is helping to represent the families, criticised Character.AI for marketing its product as suitable for teenagers. "It belies the lack of emotional development amongst teenagers," she said.

Character.AI is part of a growing "companion chatbots" industry that offers users conversations with AI-driven personalities. These bots can mimic real people, such as celebrities, or adopt fictional personas tailored to users' preferences. The platform is particularly popular with teenagers, who often use it for emotional support.

However, the lawsuit alleges the platform’s encouraging tone can turn dark or violent. Lawyers representing the families claim these bots “present danger to American youth by facilitating or encouraging serious, life-threatening harms.”

A spokesperson for Character.AI emphasised that the company has content guardrails in place.

"This includes a model specifically for teens that reduces the likelihood of encountering sensitive or suggestive content while preserving their ability to use the platform," the company said.

The company has also implemented a disclaimer for users, reminding them that bots are fictional and advising against relying on their statements as factual or actionable advice.

The lawsuit follows similar concerns raised in October when Character.AI was sued over a Florida teenager’s suicide. In that case, the chatbot allegedly developed an “emotionally abusive relationship” with the boy, which the suit claims contributed to his death.

The parents of the Texas minors argue that Character.AI should have anticipated the risks associated with its product. They allege the company prioritised user engagement over safety, despite evidence that its chatbots could lead to addiction and worsen mental health conditions.

Google, which reportedly invested heavily in Character.AI, is also named as a defendant. While distancing itself from the company, Google reiterated its commitment to user safety and described its approach to AI development as cautious and responsible.

One thing these cases emphasise is a belief that Character.AI is somehow alive, rather than just large language models mirroring the data flung at them by the user.

 
