moneycontrol.com · 1h
Google's AI chatbot Gemini verbally abuses student, calls her 'stain on universe': 'Please die'
The alarming exchange unfolded when Reddy asked the chatbot for help with a school assignment exploring challenges faced by older adults. Instead of providing constructive assistance, the chatbot issued a series of disturbing and hostile statements.
cybernews · 1d
“Human, please die”: Google Gemini goes rogue over student’s homework
Google states that Gemini has safety filters designed to prevent the chatbot from engaging in disrespectful, sexual, violent, or dangerous discussions and from encouraging harmful acts. Despite these safeguards, however, AI chatbots still struggle to reliably control their responses.