Nelson Mail

Human-sounding robots have some hang-ups


software developers. The assistant added pauses, “ums” and “mmmhmms” to its speech in order to sound more human as it spoke with real employees at a hair salon and a restaurant.

“That’s very impressive, but it can clearly lead to more sinister uses of this type of technology,” said Matthew Fenech, who researches the policy implications of AI for the London-based organisation Future Advocacy. “The ability to pick up on nuance, the human uses of additional small phrases – these sorts of cues are very human, and clearly the person on the other end didn’t know.”

Fenech said it’s not hard to imagine nefarious uses of similar chatbots, such as spamming businesses, scamming seniors or making malicious calls using the voices of political or personal enemies.

“You can have potentially very destabilising situations where people are reported as saying something they never said,” he said.

Pichai and other Google executives tried to emphasise that the technology is still experimental, and will be rolled out cautiously. It’s not yet available on consumer devices.

“It’s important to us that users and businesses have a good experience with this service, and transparency is a key part of that,” Google engineers Yaniv Leviathan and Yossi Matias, who helped design the new technology, wrote in a blog post. “We want to be clear about the intent of the call so businesses understand the context. We’ll be experimenting with the right approach over the coming months.”

It’s unclear how the company will navigate existing telecommunications laws, which can vary by state or country. Google didn’t respond to a request for comment on how it plans to seek the consent of people called by its bots.

One co-owner of a San Francisco Bay Area barbershop patronised by some Google employees was a little creeped out by the privacy implications.

“It seems like something that would be helpful for our clients,” said Katherine Esperanza, co-owner of the Slick & Dapper barbershop in Oakland, California. Esperanza, however, wondered if the shop would be able to block the calls, and said it “begs the question about whether the conversation is recorded and if the recipient of these automated calls could be aware that they’re being recorded”.

Anti-wiretapping laws in California and several other states already make it illegal to record phone calls without the consent of both the caller and the person being called. The Federal Communications Commission has also been grappling with rules for robocalls, the unsolicited and automatically-dialled calls made by telemarketers.

Such calls are typically prerecorded monologues, but more businesses and organisations are employing machine-learning techniques to respond to a person’s questions with a natural-sounding conversation, in hopes they’ll be less likely to hang up.

Matthew Fenech, of Future Advocacy: It’s not hard to imagine nefarious uses of chatbots similar to Google’s new computer assistant.
