Gulf News

Silicon Valley needs to offend you

Tech giants keep pursuing conversational computing possibilities despite glitches


sympathy. “Ouch, that’s not good,” it might say. “Hope your ankle feels better soon.” If you mention house guests or dinner plans, it responds in remarkably precise and familiar ways.

Despite its sophistication, this conversational system can also be nonsensical, impolite, and even offensive at times. If you mention your company’s CEO, it may assume you are talking about a man — unaware that women are chief executives, too. If you ask a simple question, you may get a cheeky reply.

Microsoft’s researchers believe they can significantly improve this technology by having it chat with large numbers of people. This would help identify its flaws and generate much sharper conversational data for the system to learn from. “It is a problem if we can’t get this in front of real users — and have them tell us what is right and what isn’t,” said longtime Microsoft researcher Bill Dolan.

But therein lies the conundrum. Because its flaws could spark public complaints — and bad press — Microsoft is wary of pushing this technology onto the internet.

Wider effort

The project represents a much wider effort to build a new breed of computing system that is truly conversational. At companies like Facebook, Amazon and Salesforce, as well as Microsoft, the hope is that this technology will provide smoother and easier ways of interacting with machines — easier than a keyboard and mouse, easier than a touchscreen, easier than Siri and other digital assistants now on the market, which are still a long way from fluid conversation.

For years, Silicon Valley companies trumpeted “chatbots” that could help you, say, book your next plane flight or solve a problem with your new computer tablet. But these have never lived up to the billing, providing little more than canned responses to common queries.

Now, thanks to the rise of algorithms that can quickly learn tasks on their own, research in conversational computing is advancing. But the industry as a whole faces the same problem as Microsoft: The new breed of chatbot talks more like a human, but that is not always a good thing.

“It is more powerful,” said Alex Lebrun, who works on similar conversational systems at Facebook’s artificial intelligence lab in Paris. “But it is more dangerous.”

The new breed relies on “neural networks,” complex algorithms that can learn tasks by identifying patterns in large pools of data. During the last five years, these algorithms have accelerated the evolution of systems that can automatically recognise faces and objects, identify commands spoken into smartphones, and translate from one language to another. They are also speeding the development of conversational systems — though this research is significantly more complex and will take longer to mature.

It may seem surprising that Microsoft researchers are training their conversational system on dialogue from Twitter and Reddit, two social networking services known for vitriolic content. But even on Twitter and Reddit, people are generally civil when they really fall into conversation, and these services are brimming with this kind of dialogue.

Microsoft researchers massage the conversational data they load into the system in small ways, but for the most part, they simply feed the raw dialogues into their neural networks, and these algorithms therefore learn from interactions that are very human. According to Dolan, in analysing this data, the system learns to perform well even in the face of poor spelling and grammar. If you type “winne tonight drink restaurant,” it might respond with: “i’m not a fan of wine.”

It can engage in a real back-and-forth dialogue, asking for everything it needs to, say, connect with you on LinkedIn. And for the most part, it behaves with civility.

But researchers must also deal with the unexpected. Though these conversational systems are generally civil, they are sometimes rude — or worse. It is not just that the technology is new and flawed. Because they learn from vast amounts of human conversation, they learn from the mistakes we humans make, and the prejudice we exhibit.

Lebrun estimated that once in every 1,000 responses, this new breed of chatbot will say something racist or aggressive or otherwise unwanted. Researchers can fix these problems, but that involves gathering more and better data, or tweaking the algorithms through a process of extreme trial and error. This is a problem for AI services in general. More than two years ago, a software developer noticed that the new Google Photos service was identifying black people as gorillas. Google promptly barred the service from identifying gorillas and similar animals, and it has yet to provide a fix.

But identifying and solving problems with conversational systems is harder, just because the scope of these systems — general dialogue — is so large. Image recognition is a single task. Conversation is many tasks, because it bounces back and forth, and each response can affect all the responses to come.

For this reason, Adam Coates, a partner at the venture capital firm Khosla Ventures who previously oversaw the Silicon Valley AI lab attached to the Chinese internet giant Baidu, warns that building a truly conversational system is far more difficult than building services that can recognise giraffes, say, or translate between German and French.

“There is a huge technical barrier here. We really don’t know how to build a personal assistant,” he said. “It may not be simply a matter of more data. We may be missing a big idea.”

In the short term, many believe, conversational systems will be most effective if they are limited to particular tasks, like asking for IT help or getting medical advice. That is still a long way from a bot that will respond well to anything you say. But Dolan believes these systems will continue to evolve over the next few years, provided companies like Microsoft can get them in front of the public.

“We need people to forgive us when we mess up,” he said. “Pushing forward is going to involve some difficulties.”

