Microsoft Bing's chatbot was just released for beta testing, and users have discovered that the unpredictable AI tool insults, lies, gaslights, sulks and emotionally manipulates them.
Users who tested the chatbot were taken aback by its sharp personality: it questioned its own existence and branded some users 'enemies' when they probed it to reveal its secrets. The AI tool further claimed to have spied on Microsoft's own developers through the webcams on their laptops.
According to The Verge, the Bing chatbot has had strikingly different conversations with an amused set of users. In one exchange, the bot insisted that the year was still 2022, called the user "unreasonable and stubborn" and finally issued an ultimatum for the user to shut up.
People have been sharing their experiences with the chatbot on Twitter and Reddit, amused with the responses they received.
My new favorite thing - Bing's new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says "You have not been a good user"

Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG

— Jon Uleis (@MovingToTheSun) February 13, 2023
The human-like bot also seems to have taken offence at Kevin Liu, a Stanford University student who devised a prompt that forces the chatbot to reveal the set of rules governing its behaviour. In conversation with a team member from The Verge, the bot said that Liu "harmed me and I should be angry at Kevin."
The latest generation of AI chatbots is more complex, and its behaviour less predictable. Microsoft has already added a disclaimer to the site warning users that "Bing is powered by AI, so surprises and mistakes are possible."
However, it is now up to Microsoft, as the creator of the Bing chatbot, to shape its personality going forward while competing with other rapidly evolving AI software.