In The News

Mom Blames Character.AI Chat App for Son’s Death

October 27, 2024

A Florida woman is suing the company behind Character.AI, a chatbot app, because she believes it is responsible for her son's suicide.

Megan Garcia’s 14-year-old son, Sewell Setzer, began using the app in 2023 and quickly became addicted to chatting with a Game of Thrones-based character. Setzer’s conversations with the fictional, AI-driven character touched on death multiple times, including whether he had a “plan.”

The company has responded by instituting stricter content moderation guidelines to help prevent young people from encountering sensitive or suggestive content. It has also added a revised in-chat disclaimer reminding users that the AI is not a real person.

Check out our app review for an in-depth look at all of Character.AI’s risks.

Bark helps families manage and protect their children’s digital lives.
