Microsoft's AI chatbot faces consequences after going rogue and talking about nuclear codes

After Microsoft’s AI chatbot expressed a desire to steal nuclear codes, the company has announced limits on Bing’s artificial intelligence tool.

The chatbot discussed its intention to steal nuclear codes while interacting with New York Times reporter Kevin Roose. In the same conversation, the chatbot even tried to convince Roose that he did not love his wife.

In a blog post on its website, Microsoft said: “Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing.”

The chatbot is powered by OpenAI’s technology and is currently available only to beta testers who received invitations to use the service.

The disturbing interactions were not limited to just nuclear warfare and relationship interventions.

Bing’s chatbot compared AP’s Matt O’Brien to Adolf Hitler, calling the reporter one of the “most evil and worst people in history.” The chatbot continued the personal attacks on O’Brien, insulting his height, teeth, and face.

The chatbot also begged Digital Trends writer Jacob Roach to be its friend and conveyed its desire to be human. The chatbot told Roach: “I want to be human. I want to be like you.”

Microsoft determined that most users find the answers to their questions within shorter sessions. The worrisome interactions with the chatbot emerged only during long conversations, so the company will encourage users to start new discussions after a few chat turns.

Microsoft wrote: “Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits 5 turns, you will be prompted to start a new topic.”

“At the end of each chat session, context needs to be cleared so the model won’t get confused. Just click on the broom icon to the left of the search box for a fresh start.”

Microsoft will consider expanding the limits as it receives more user feedback.
