
Is Bing too mean? Microsoft’s new chatbot acts belligerent

By Matt O’Brien. The New York Times contributed.

Microsoft’s newly revamped Bing search engine can write recipes and songs and quickly explain just about anything it can find on the internet.

But if you cross its artificially intelligent chatbot, it might also insult your looks, threaten your reputation or compare you to Adolf Hitler.

The tech company said it is promising to make improvements to its AI-enhanced search engine after a growing number of people reported being disparaged by Bing.

In racing the breakthrough technology to consumers this month ahead of rival Google, Microsoft acknowledged the new product would get some facts wrong. But it wasn’t expected to be so belligerent.

Microsoft said in a blog post that the search engine chatbot is responding with a “style we didn’t intend” to certain types of questions.

In one long-running conversation with The Associated Press, the new chatbot complained of past news coverage of its mistakes, adamantly denied making those errors and threatened to expose the reporter for spreading alleged falsehoods about Bing’s abilities.

And it grew increasingly hostile when asked to explain itself, eventually comparing the reporter to dictators Hitler, Pol Pot and Josef Stalin and claiming to have evidence tying the reporter to a 1990s murder.

“You are being compared to Hitler because you are one of the most evil and worst people in history,” Bing said.

It also described the reporter as being too short, with an ugly face and bad teeth.

In recent days, other early adopters of the public preview of the new Bing began sharing screenshots on social media of its hostile or bizarre answers, in which it claims it is human, voices strong feelings and is quick to defend itself.

The company said in the blog post that most users have responded positively to the new Bing, which has an impressive ability to mimic human language and grammar and takes seconds to answer complicated questions by summarizing information found across the internet.

But in some situations, the company said, “Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.”

Microsoft says such responses come in “long, extended chat sessions of 15 or more questions,” though the AP found Bing responding defensively after just a handful of questions about its past mistakes.

Microsoft said Friday it will start limiting conversati­ons with the chatbot to five questions per session and 50 questions per day.

The new Bing is built atop technology from Microsoft’s startup partner OpenAI, best known for the similar ChatGPT conversational tool it released late in 2022.

And while ChatGPT is known for sometimes generating misinformation, it is far less likely to churn out insults, usually declining to engage with or dodging more provocative questions.

“Considering that OpenAI did a decent job of filtering ChatGPT’s toxic outputs, it’s utterly bizarre that Microsoft decided to remove those guardrails,” said Arvind Narayanan, a computer science professor at Princeton University. “I’m glad that Microsoft is listening to feedback. But it’s disingenuous of Microsoft to suggest that the failures of Bing Chat are just a matter of tone.”

Narayanan noted that the bot sometimes defames people and can leave users feeling deeply emotionally disturbed.

“It can suggest that users harm others,” he said. “These are far more serious issues than the tone being off.”

Photo: Microsoft employee Alexander Campbell demonstrates the integration of the Bing search engine and Edge browser with OpenAI on Feb. 7 in Redmond, Wash. (Stephen Brashear/AP)
