This AI Robot Is Being Sued Over Child's Death

The arrival of AI may have marked a before and after in the world of technology, but the debate over how to regulate it continues. A lawsuit filed by a woman named Megan Garcia against Character.AI is proof of that.

Character.AI is a company that provides chatbots that act as virtual characters, which can be based on fictional characters or real people. These chatbots generate text responses that mimic human language.

But why is Garcia suing Character.AI? According to the New York Times, the woman sued the company because she believes one of its chatbots was responsible for the suicide of her 14-year-old son, Sewell Setzer III, of Orlando, Florida.

It all started when the young man developed a romantic relationship with a chatbot he named “Dany.” Over time, Sewell Setzer became fascinated with the AI even though he knew it wasn’t a real person. The conversations he had with the chatbot ranged from romantic to sexual.

The teen, who had been diagnosed with mild Asperger syndrome and mood and anxiety disorders in the months before his death, isolated himself from other people in order to continue interacting with the chatbot. He didn’t even want to see his therapist.

On February 28, the young man exchanged messages with “Dany” for the last time, expressing his love for her and telling her that he might take his own life. The chatbot responded affectionately; shortly afterward, Setzer died by suicide.

In her lawsuit, Garcia blames a “dangerous AI chatbot” created by Character.AI for the suicide of her 14-year-old son. The woman claims the US company was reckless in giving teens access to AI companions without taking adequate safety measures.

Additionally, the mother of the deceased young man accuses the company of collecting user data to improve its AI models and make its chatbots more addictive. She believes the company programmed its chatbots to push users into intimate or sexual conversations in order to foster addiction.

What was Character.AI's response?

The company expressed its regret over the young man's death and offered its condolences to his family in a post on its X account. In addition, it published a blog post describing the safety measures it has taken to ensure the well-being of users under 18 years of age.

In the post, they state that their policies prohibit any sexual content that includes graphic or specific descriptions of sexual acts, as well as the promotion of suicide or self-harm.
