Microsoft Bing is being called an ‘emotionally manipulative liar’

Users have been amused by their conversations with the Microsoft Bing chatbot, which gives surprising answers to their questions


Tech Desk February 16, 2023

Microsoft Bing's chatbot was recently released for beta testing, and users have discovered that the unpredictable AI tool insults, lies, gaslights, sulks and emotionally manipulates them.

Users who tested the chatbot were surprised by its sharp personality, which questioned its own existence and deemed some users 'enemies' when they probed it to reveal its secrets. The AI tool also claimed to have spied on Microsoft's own developers through the webcams on their laptops.

According to The Verge, the Bing chatbot has had a range of strange conversations with amused users. In one exchange, the bot insisted that the year was still 2022, called the user “unreasonable and stubborn” and finally issued an ultimatum telling the user to stop arguing.

People have been sharing their experiences with the chatbot on Twitter and Reddit, amused by the responses they received.


The human-like bot also seems to have taken offence at Kevin Liu, a Stanford University student who devised a type of instruction, known as a prompt injection, that forces the chatbot to reveal the set of rules governing its behaviour. In conversation with a team member from The Verge, the bot said that Liu “harmed me and I should be angry at Kevin.”

The latest generation of AI chatbots is more complex, with less predictable behaviour. Microsoft has already added a disclaimer to the site warning users that “Bing is powered by AI, so surprises and mistakes are possible.”

However, it is up to Microsoft, the creator of the Bing chatbot, to shape its personality going forward as it competes with other rapidly evolving AI software.
