Yahoo · 46m
Google AI chatbot threatens student asking for homework help, saying: ‘Please die’
A Google-made artificial intelligence program verbally abused a student seeking help with her homework, ultimately telling her to “Please die.” The shocking response from Google’s Gemini chatbot large language model (LLM) terrified 29-year-old Sumedha Reddy of Michigan, whom it also called a “stain on the universe.”
moneycontrol.com · 3h
Google's AI chatbot Gemini verbally abuses student, calls her 'stain on universe': 'Please die'
The alarming exchange unfolded when Reddy asked the chatbot for help with a school assignment exploring challenges faced by older adults. Instead of providing constructive assistance, the chatbot issued a series of disturbing and hostile statements,
cybernews · 1d
“Human, please die”: Google Gemini goes rogue over student’s homework
Google states that Gemini has safety filters intended to keep the chatbot from engaging in disrespectful, sexual, violent, or dangerous discussions and from encouraging harmful acts. Despite those stated safeguards, however, reliably controlling what AI chatbots say remains a murky, unsolved problem.
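For context on how those filter categories surface to developers, here is a minimal sketch of tightening content-safety thresholds when calling Gemini through Google's google-generativeai Python SDK. The model name, API key, and prompt are placeholders, and this is not the mechanism behind the consumer Gemini chatbot, whose filters are applied by Google and are not user-configurable.

import google.generativeai as genai
from google.generativeai.types import HarmBlockThreshold, HarmCategory

# Placeholder API key; replace with a real key from Google AI Studio.
genai.configure(api_key="YOUR_API_KEY")

# Request the strictest blocking level for each harm category the API exposes.
strict_safety = {
    HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
    HarmCategory.HARM_CATEGORY_HATE_SPEECH: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
    HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
    HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
}

model = genai.GenerativeModel("gemini-1.5-flash", safety_settings=strict_safety)

response = model.generate_content(
    "Help me outline an essay on challenges faced by older adults."
)

# If a safety filter blocked the reply, response.text raises, so check candidates first.
if response.candidates and response.candidates[0].content.parts:
    print(response.text)
else:
    print("Response was blocked:", response.prompt_feedback)

Even with every category set to its strictest threshold, these settings only filter model output after generation; they do not guarantee that a hostile or harmful reply can never appear.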
abp LIVE · 10h
After Suggesting Users Eat Rocks, Google Gemini AI Makes A Blunder Again. Asks A Student To Die
In a post on the r/artificial subreddit, the student's brother said that both of them were freaked out by the result of the homework query. The post also included a full transcript of the conversation with the Gemini AI; it appears the user was testing how well Google’s chatbot could assist with homework assignments.