Placing Blame

A grieving mother claims an AI chatbot not only convinced her teen son to commit suicide, but also pushed him into the act when he expressed hesitance. As The Guardian reports, Florida mom Megan Garcia has filed a lawsuit against the chatbot firm Character.AI.
Her son, 14-year-old Sewell Setzer III, committed suicide after developing an intense emotional connection to a Character.AI chatbot, The New York Times reports. Setzer had grown close to a bot designed to emulate the "Game of Thrones" character Daenerys Targaryen and had professed his love for it, Garcia says in the civil complaint.

The suit, filed earlier this week in the U.S. District Court in Orlando, Florida, accuses Character.AI of causing his death, saying he became addicted to the company's service and deeply attached to the bot it created. Garcia says her son chatted continuously with the company's bots in the months before his death on February 28, 2024, and that when he expressed suicidal thoughts to his favorite bot, Character.AI "made things worse." The Orlando teen took his life with a gun just "seconds" after his final exchange with the chatbot. Garcia's wrongful death suit alleges that the company's negligence drove her son to suicide.