Globally: The International Association for Suicide Prevention and Befrienders Worldwide have contact information for crisis centers around the world.
Garcia believes Character.AI is responsible for the death of her 14-year-old son, Sewell Setzer III, who died by suicide in February, according to a lawsuit she filed against the company last week.
When Garcia first heard that her son was interacting with an AI chatbot, she said she thought it was something like a video game.
However, within months of starting to use the platform, Setzer became “noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem.”
“There were no suicide pop-up boxes that said, ‘If you need help, please call the suicide crisis hotline.’ None of that,” she said.