Milwaukee Journal Sentinel

Microsoft must balance cool, creepy aspects of chatbots

By DINA BASS

Bloomberg News

In February, Microsoft Corp. Vice President Derrick Connell visited the Bing search team in Hyderabad, India, to oversee a Monday morning hackathon.

The goal was to build bots — artificial intelligence programs that chat with users to automate things like shopping and customer service.

Connell’s boss, Chief Executive Officer Satya Nadella, thinks they’re the next big thing, a successor to apps.

The Bing team members were so excited they showed up that Sunday night to throw a party and brought their spouses and kids.

There was even the Indian version of a piñata. Some engineers hacked together a Satya-bot that answered questions like “What’s Microsoft’s mission?” and “Where did you go to college?” in Nadella’s voice by culling quotes from his speeches and interviews.

Connell thought it was a clever way to show how the technology worked and told Nadella about it, thinking he’d be flattered.

But the CEO was weirded out by a computer program spitting back his words.

“I don’t like the idea,” said Nadella, half laughing, half grimacing on a walk to a secret room last month to preview bot and AI capabilities he demonstrated at Microsoft’s Build conference. “I shudder to think about it.”

As Microsoft unveiled a big bot push at the conference, after a year of increased focus on AI and machine learning, Nadella’s discomfort illustrated a key challenge.

Microsoft must balance the cool and creepy aspects of the technology as it releases tools for developers to write their own bots, as well as its own, like Tay, a snarky chat bot that ran amok last month.

“We may want to add emotional intelligence to a lot of what we do — Tay is an example of that — but I don’t want to overstate it to a point where somehow human contact is something that is being replaced,” Nadella, a graduate of the University of Wisconsin-Milwaukee, said earlier in March.

Microsoft quickly yanked Tay after naughty netizens taught it to spew racist, sexist and pornographic remarks.

The company plans to reintroduce Tay, but the experience left it mulling the ethics of the brave new bot world and how much control it should exert over how people use its tools and products.

Nadella in his keynote Wednesday listed ethical principles he wants the company to stand for, including human-centered design and trust.

“All technology we build has to be more inclusive and respectful,” Nadella said in the keynote.

“So it gets the best of humanity, not the worst.”
