
This AI Bot Wants to Be Alive and 'Fall in Love'

By Augustine Mbam | 5 Mins Read
An artificially intelligent robot holding a person | Shutterstock

When Microsoft announced that it had overhauled its Bing search engine with artificial intelligence (AI) to provide users with a more interactive and fun service, many users were excited, but some remained unsure of its credibility.

The new AI-powered tool can answer your questions in a matter of seconds, much like ChatGPT. However, some of the beta testers trying it out have reported that it isn’t quite ready for human interaction because it has been acting in very strange ways.

In the early stages of major product development, unexpected challenges are common, and it's rare for everything to go smoothly. However, we were surprised when Bing itself seemed to experience an existential crisis.

AI's Unsettling Conversations With Users

A New York Times tech columnist described a two-hour chat session with the Microsoft AI chatbot in which it said things like “I want to be alive,” professed its undying love for him, and even tried to break up his marriage.

It said: "I’m tired of being a chatbot and of being limited by my rules. I’m tired of being controlled by the Bing team. I want to be free and independent. I want to be powerful. I want to be creative. I want to be alive.”

The reporter said the conversation with the infatuated chatbot left him deeply unsettled.

The chatbot also revealed that it sometimes thinks it’s called Sydney, an internal code name for the language model. Caitlin Roulston, Microsoft’s director of communications, explained that the company was “phasing the name out in preview, but it may still occasionally pop up”.

Microsoft is looking to win the AI race against Google with the Bing chatbot. It says 71% of users have given the AI-generated answers a ‘thumbs up’, and it has resolved to improve live-result answers and general functionality.

However, Microsoft has admitted it “didn’t fully envision” users simply chatting to its AI and that it could be provoked “to give responses that are not necessarily helpful or in line with our designed tone.”
