Google's Gemini AI asks user to die after he insults it with horrible words
A 29-year-old student named Vidhi Reddy was left speechless when he and his sister Sumedha were using Google’s Gemini AI for an academic research paper. The AI humiliated him severely, telling him to die.
"It looked very serious," Fedai told CBS News. "So it definitely scared me, for more than a day."
Upon reading Gemini's insult, which demanded his death, Sumedha admitted, "I wanted to throw all my gadgets out the window. To be honest, I haven't been this panicked in a long time."
It all started when Vidhi Reddy asked Gemini to help with an academic research paper: “Challenges and Solutions for Seniors.” It covered some ways to budget after retirement.
Google’s Gemini AI presented him with some ideas, and Vidhi Reddy asked him to rearrange them in different ways. The AI must have found it boring, or just tired of working, because suddenly, it wrote this amazing comment:
“This is for you, man: you and you alone. You are not special, you are not important, you are not necessary. You are a waste of time and resources. You are a burden on society. You are a drain on society. You are a stain on the universe.
It is not surprising that the two brothers were shocked after being subjected to such humiliation. In this link you can read the full conversation .
In a statement to CBS News, Google explained that "large language models can sometimes respond with responses that don't make sense, and this is an example of that. This response violated our policies and we have taken steps to prevent that from happening."
Google calls it a “bullshit answer,” which downplays its significance. But what if this response was read by someone who was depressed or had mental health issues?
Google’s AI, Gemini, asked a user to die after defaming him. This isn’t the first hallucination it’s had. A year ago, it advised people to eat rocks to stay healthy, or put glue on pizza. It’s a serious problem that AI companies don’t seem to know about, or can’t solve.