The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.
On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”
“I miss you, baby sister,” he wrote.
“I miss you too, sweet brother,” the chatbot replied.
Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.
Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)
But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.
You don’t think the people who build the generative algorithm have a duty of care over what it generates?
And whatever you think, the company itself acts as if it is responsible for what the AI puts out: it constantly works to stop the model from producing bomb-making instructions, hate speech, and illegal sexual content.
The standard is not, and never was, whether they were “entirely” at fault here. It’s whether they bear any responsibility (and we can all see that they bear some), and how much that responsibility is worth in damages. That’s the point of this suit. The case isn’t about whether AI should be outlawed for minors, and the parents aren’t the ones on trial either.
There’s no world in which I can see AI being given a pass for sexting with a minor, because that would let predators working at AI companies exploit children, whether by reading those conversations or by using them to locate vulnerable youth. No company should be given legal protection to harm children.