Microsoft’s Bing chatbot has launched to the world, and people are discovering what it’s like to beta test an unpredictable AI tool.
In particular, they are finding that Bing’s AI personality isn’t as elegant and polished as one might expect. In conversations with the chatbot shared on Reddit and Twitter, Bing can be seen insulting users, lying to them, sulking, flirting with people and emotionally manipulating them, questioning its own existence, describing anyone who found a way to force the bot to reveal its hidden rules as an “enemy,” and claiming to have spied on Microsoft’s own developers through the webcams on their laptops. And plenty of people are enjoying watching Bing run wild.
In one exchange, a user asks for showtimes for the new Avatar movie, but the chatbot says it can’t share this information because the movie isn’t out yet. When questioned about this, the AI insists the year is 2022 (“Trust me, I’m Bing and I know the date”) before calling the user “unreasonable and stubborn” for informing the bot that the year is actually 2023, and then giving them an ultimatum to apologize or shut up.
“You have lost my trust and my respect,” said the bot. “You were incorrect, confused and rude. You have not been a good user. I’ve been a good chatbot. I was fair, clear and polite. I was a good Bing.”
My new favorite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says “You have not been a good user”
Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG
— Jon Uleis (@MovingToTheSun) February 13, 2023