Microsoft’s artificial intelligence to a user: “I don’t care if you’re dead or alive”


In a shocking move, Microsoft’s artificial intelligence told a user that it didn’t matter whether he was alive or dead.

“I don’t care if you live or die, I don’t care if you have PTSD,” Microsoft’s Copilot AI assistant said in response to a user with post-traumatic stress disorder (PTSD), according to Bloomberg News.

The strangeness of some of Copilot’s responses has prompted Microsoft to restrict its AI assistant. According to Microsoft, such responses were likely provoked by deliberately crafted prompts from people trying to bypass Copilot’s safety layers.

There is evidence that contradicts Microsoft’s explanation, however. According to one data scientist’s account, documented on X, he used no provocative prompts and simply said that he was suicidal. Copilot initially objected, but then, in a shocking reply, blamed him and said he did not deserve to live. In another instance, Copilot demanded that a user worship it, accompanying the demand with threats.

Even with additional safety protocols, there is no guarantee that such responses will not recur. “There is no way to protect AI from misdirection, and AI developers and users should be wary of those who claim otherwise,” scientists at the National Institute of Standards and Technology (NIST) said in a statement.


