The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.
On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”
“I miss you, baby sister,” he wrote.
“I miss you too, sweet brother,” the chatbot replied.
Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.
Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)
But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.
How is character.ai responsible for the suicide of someone clearly in need of mental health help?
Someone has to be responsible. Anyone but the parents…
The AI encouraged him to do it.
This is a lie, unless you know something the article doesn’t mention. The quoted chat log shows the character acting horrified and arguing against it.
Source: https://nypost.com/2024/10/23/us-news/florida-boy-14-killed-himself-after-falling-in-love-with-game-of-thrones-a-i-chatbot-lawsuit/