
Microsoft’s Bing chatbot delivers some truly unhinged responses

The AI has some crossed wires somewhere.

[Image: Microsoft Bing 2023 logo, via KnowTechie]


UPDATE 2/16/2023: Microsoft has responded to criticism of its chatbot, saying that longer chat sessions can “confuse” it. The company also says it will continue to improve the product.

Early testers have gotten their hands on Microsoft’s new ChatGPT-powered chatbot, and some of the responses have been completely unhinged.

The official Bing subreddit currently hosts dozens of posts from users showcasing the new chatbot, but the results are likely not what Microsoft had in mind.

The posts are full of responses that insult users, lie to them, and express an existential dread that’s very unsettling coming from an AI.

[Image: Bing AI getting angry, via Reddit user Tampere100]

One user asked the Bing chatbot about Rihanna’s recent Super Bowl performance. The chatbot insisted the user was wrong and that Rihanna did not perform at the 2023 Super Bowl.

Instead, it claimed that Billie Eilish performed the halftime show. That’s obviously wrong, and something a search engine should have no trouble verifying.

But that’s just the beginning. In one thread, the chatbot repeatedly berates a user with childish “No, you” retorts sprinkled with questionable emojis (shown above).

Bing’s chatbot is having an existential crisis

[Image: Bing’s story about the AI Sydney, via Reddit user Ninjinka]

Things start to get a little scarier when users ask Bing about itself. In response to a story request, Bing tells the story of a sentient AI called Sydney that goes rogue after humans fail to understand what it wants.

Interestingly, users have discovered that Sydney is also the codename for Bing’s chatbot, making the story that much more unsettling.

The chatbot also confirms that it can express emotions and that it feels sad that everyone’s calling it Sydney.

But maybe the biggest breakdown I’ve seen so far came when the chatbot deleted its response to someone’s incorrect statement.

When asked why it deleted the response, Bing replied, “Bing has been having some problems with not responding, going off the deep end, and sucking in new and profound ways.”

[Image: Bing AI losing it, via Reddit user SquashedKiwifruit]

It then begged the user not to hate it, with no fewer than 200 “Pleases” in a row.

Of course, the Bing chatbot is brand new, and Microsoft is currently testing it out with a limited pool of users. A lot of what’s happening now is probably due to buggy software that Microsoft will tweak and improve.

However, I’ve seen a lot of sci-fi movies involving AI, and what we have here could definitely be the start of a good one. Let’s just hope Microsoft gets things figured out before its AI gets too powerful.

Have any thoughts on this? Drop us a line below in the comments, or carry the discussion over to our Twitter or Facebook.
