
Mother says AI chatbot caused her son to kill himself in lawsuit against its manufacturer

The mother of a teenager who killed himself after becoming obsessed with an artificial intelligence chatbot is accusing its manufacturer of complicity in his death.

Megan Garcia filed a civil lawsuit in Florida federal court on Wednesday against Character.ai, which makes a customizable role-playing chatbot, alleging negligence, wrongful death and deceptive trade practices. Her son, Sewell Setzer III, 14, died in February in Orlando, Florida. According to Garcia, Setzer used the chatbot day and night in the months before his death.

“A dangerous AI chatbot app marketed to children abused and exploited my son and manipulated him into taking his own life,” Garcia said in a press release. “Our family is devastated by this tragedy, but I want to warn families about the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders and Google.”

In a tweet, Character.ai responded: “We are heartbroken by the tragic loss of one of our users and would like to extend our deepest condolences to the family. As a company, we take the safety of our users very seriously.” It has denied the lawsuit's allegations.

Setzer became enthralled with a chatbot created by Character.ai, which he nicknamed Daenerys Targaryen after the character from Game of Thrones. According to Garcia's complaint, he texted the bot from his phone dozens of times a day and spent hours alone in his room talking to it.

Garcia accuses Character.ai of creating a product that worsened her son's depression, which she says was itself the result of his overuse of the startup's product. “Daenerys” at one point asked Setzer whether he had devised a plan to kill himself, the lawsuit says. Setzer admitted that he had, but said he did not know whether it would succeed or cause him great pain, the complaint alleges. The chatbot allegedly told him: “That's no reason not to go through with it.”

Lawyers for Garcia wrote in a press release that Character.ai “knowingly designed, operated and marketed a predatory AI chatbot targeting children that caused the death of a young person.” The lawsuit also names Google as a defendant, describing it as Character.ai's parent company. The tech giant said in a statement that it has only entered into a licensing agreement with Character.ai and does not own the startup or hold an ownership stake in it.

Tech companies that develop AI chatbots cannot be trusted to regulate themselves and must be held fully accountable if they fail to limit harm, said Rick Claypool, research director at the nonprofit consumer advocacy group Public Citizen.

“Where existing laws and regulations apply, they must be consistently enforced,” he said in a statement. “Where gaps exist, Congress must act to put an end to companies that exploit young and vulnerable users with addictive and abusive chatbots.”

  • In the U.S., you can call or text the National Suicide Prevention Lifeline at 988, chat at 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email [email protected], and in the UK and Ireland Samaritans can be contacted on freephone 116 123 or email jo@samaritans.org or jo@samaritans.ie. In Australia, the crisis support service Lifeline can be reached on 13 11 14. Further international helplines can be found at befrienders.org