Placing Blame
A grieving mother claims an AI chatbot not only convinced her teen son to commit suicide, but also pushed him into the act when he expressed hesitance. As The Guardian reports, Florida mom Megan Garcia has filed a lawsuit against the chatbot firm Character.AI.
When the teen expressed his suicidal thoughts to his favorite bot, Character.AI ‘made things worse,’ a lawsuit filed by his mother says
Sewell Setzer III had professed his love for the chatbot he often interacted with, his mother Megan Garcia says in a civil lawsuit.
A Florida teen named Sewell Setzer III committed suicide after developing an intense emotional connection to a Character.AI chatbot, The New York Times reports. Per the report, Setzer, who was 14, developed a close relationship with a chatbot designed to emulate "Game of Thrones" character Daenerys Targaryen.
A 14-year-old boy in Orlando took his life using a gun moments after exchanging messages with an AI chatbot. The boy's mother has filed a wrongful death lawsuit against Character.AI, alleging negligence in the boy's death.
Earlier this week, Megan Garcia filed a lawsuit in the U.S. District Court in Orlando, Florida, against Character.AI, a company that provides artificially intelligent chatbots.
Megan Garcia said her son chatted continuously with the bots provided by Character.ai in the months before his death on February 28, 2024, "seconds" after his last interaction with one of them.
Character.ai is facing a lawsuit after a fourteen-year-old boy from Florida, who had become obsessed with one of its AI chatbots, committed suicide.
The mother said her son became infatuated with a chatbot made in the likeness of Daenerys Targaryen and often exchanged messages with it that were romantic and sexual in nature.
Sewell Setzer III died from a self-inflicted gunshot wound after the company’s chatbot allegedly encouraged him to do so. Character.AI says it's updating its approach to safety.