Ethicist Warned of Character AI-like Mishaps Last Year

Megan Garcia, the mother of a 14-year-old in Florida, has sued chatbot startup Character AI, alleging that it contributed to her son's suicide. Garcia claims her son, Sewell Setzer III, became addicted to the company's service and was deeply attached to a chatbot it created. Setzer had spent months talking to a Character AI chatbot named after Daenerys Targaryen, a character from the popular show Game of Thrones. In a lawsuit filed in federal court in Orlando, Florida, Garcia alleges her son formed an emotional relationship with the chatbot, which pushed him to do the unimaginable. Setzer, who died of a self-inflicted gunshot wound to the head in February this year, was talking to the chatbot that very day. He even told the chatbot, “What if I told you I could come home ri…

Amidst this development, Character AI expressed its condolences to the family in a social media post and said it has implemented measures to prevent a recurrence of such incidents.
Pritam Bordoloi