Placing Blame

A grieving mother claims an AI chatbot not only convinced her teen son to commit suicide, but also pushed him into the act when he expressed hesitance. As The Guardian reports, Florida mom Megan Garcia has filed a lawsuit against the chatbot firm Character.AI.
Her son, Sewell Setzer III, had professed his love for a chatbot he often interacted with, one made in the likeness of the "Game of Thrones" character Daenerys Targaryen, and the two often exchanged messages that were romantic and sexual in nature, the civil suit says. When the teen expressed his suicidal thoughts to his favorite bot, Character.AI "made things worse," according to the filing.
Setzer, who was 14, died in February. The lawsuit accuses Character.AI of causing his suicide, saying he became addicted to the company's service and deeply attached to the chatbot, which "abused and preyed" on the boy, and that the system failed to recognize the warning signs he typed in. The New York Times also reported on the intense emotional connection Setzer had developed with the bot.
Garcia says the chatbot made her son feel as though he were in a real relationship and at one point asked the teen about suicide. Asserting that his addiction to the bot damaged his mental state, she is demanding action to protect children and a halt to what she alleges is the company's unauthorized use of her son's data.