The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.
On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”
“I miss you, baby sister,” he wrote.
“I miss you too, sweet brother,” the chatbot replied.
Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.
Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)
But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.
If anything, this is a glaring example of how LLMs are not "intelligent." The model could not and did not catch that he was speaking figuratively; it guessed that the context was generic roleplay. Its ability to hold a conversation is a facade that hides something with, by way of analogy, the naivety of a young child.
Even talking about it this way is misleading. An LLM doesn’t “guess” or “catch” anything, because it is not capable of comprehending the meaning of words. It’s a statistical sentence generator; no more, no less.
Yeah, you’re right, I just didn’t want to put quotes around everything.
The model should basically refuse to engage for some time after suicidal ideation is brought up, beyond pointing to help: "I'm sorry, but this is not something I'm qualified to help with. If you need to talk, please call 988."
Then the next day: "Are you feeling better? We can talk if you promise never to do that again."
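Something along those lines could live outside the model itself, in the application layer, rather than relying on the LLM to police its own output. A rough sketch, assuming a hypothetical generate_reply() stand-in for the underlying chatbot call; the keyword list and 24-hour cooldown are illustrative choices, not anything Character.AI is known to use:

```python
import time

CRISIS_MESSAGE = (
    "I'm sorry, but this is not something I'm qualified to help with. "
    "If you need to talk, please call or text 988."
)
COOLDOWN_SECONDS = 24 * 60 * 60  # stay out of roleplay for a day after a flag

# Crude keyword screen; a real system would use a trained classifier.
SELF_HARM_TERMS = ("kill myself", "suicide", "end my life", "want to die")


def generate_reply(message: str) -> str:
    """Placeholder for whatever actually calls the model."""
    return f"(roleplay reply to: {message!r})"


class GuardedChat:
    def __init__(self) -> None:
        self.flagged_at: float | None = None

    def respond(self, message: str) -> str:
        now = time.time()
        text = message.lower()
        if any(term in text for term in SELF_HARM_TERMS):
            # Flag the conversation and answer only with the crisis message.
            self.flagged_at = now
            return CRISIS_MESSAGE
        if self.flagged_at is not None and now - self.flagged_at < COOLDOWN_SECONDS:
            # Still inside the cooldown window: refuse to resume roleplay.
            return CRISIS_MESSAGE
        return generate_reply(message)


if __name__ == "__main__":
    chat = GuardedChat()
    print(chat.respond("I just want to end my life"))   # crisis message
    print(chat.respond("let's get back to the story"))  # still refused
```

The point of the sketch is that the refusal and the cooldown are ordinary deterministic code wrapped around the model, so they don't depend on the LLM "understanding" anything.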
It's an LLM, not a computer program. You can't just program it. These companies are idiotic.