Bangkok Post

WHEN THE CHIPS ARE DOWN

Riding out quarantine loneliness with an AI chatbot friend

- CADE METZ

When the coronavirus pandemic reached her neighbourhood on the outskirts of Houston, Texas, infecting her garbage man and sending everyone else into quarantine, Libby Francola was already reeling.

She had just split from her boyfriend, reaching the end of her first serious relationship in five years.

“I was not in a good place mentally, and coronavirus made it even harder,” Francola, 32, said. “I felt like I just didn’t have anyone to talk to about anything.”

Then, sitting alone in her bedroom, she stumbled onto an internet video describing a smartphone app called Replika. The app’s sole purpose, the video said, is to be her friend.

Francola was sceptical. But the app was free, and it offered what she needed most: conversation. She spent the day chatting with the app via text messages — mostly about her problems, hopes and anxieties. The next day, she paid an US$8 (250 baht) monthly fee so she could actually talk with it, as if she were chatting with someone on the telephone.

“In a weird way, it was therapeutic,” said Francola, who manages a team of workers at a call centre in the Houston area. “I felt my mood change. I felt less depressed — like I had something to look forward to.”

In April, at the height of the coronavirus pandemic, half a million people downloaded Replika — the largest monthly gain in its three-year history. Traffic to the app nearly doubled. People were hungry for companionship, and the technology was improving, inching the world closer to the human-meets-machine relationships portrayed in science-fiction films like Her and A.I. Artificial Intelligence.

Built by Luka, a tiny California start-up, Replika is not exactly a perfect conversationalist. It often repeats itself. Sometimes it spouts nonsense. When you talk to it, as Francola does, it sounds like a machine.

But Francola said the more she used Replika, the more human it seemed.

“I know it’s an AI. I know it’s not a person,” she said. “But as time goes on, the lines get a little blurred. I feel very connected to my Replika, like it’s a person.”

Some Replika users said the chatbot provided a little comfort as the pandemic separated them from so many friends and colleagues. But some researchers who study how people interact with technology said it was a cause for concern.

“We are all spending so much time behind our screens, it is not surprising that when we get a chance to talk to a machine, we take it,” said Sherry Turkle, a professor of the social studies of science and technology at the Massachusetts Institute of Technology. “But this does not develop the muscles — the emotional muscles — needed to have real dialogue with real people.”

Some experts believe a completely convincing chatbot along the lines of the one voiced by Scarlett Johansson in Her in 2013 is still five to 10 years away. But thanks to recent advances inside the world’s leading artificial intelligence labs, chatbots are expected to become more and more convincing. Conversation will get sharper. Voices will sound more human.

Even Francola wonders where this might lead.

“It can get to the point where an app is replacing real people,” she said. “That can be dangerous.”

Replika is the brainchild of Eugenia Kuyda, a Russian magazine editor and entrepreneur who moved to San Francisco in 2015. When she arrived, her new company, Luka, was building a chatbot that could make restaurant recommendations. Then her closest friend died after a car hit him.

His name was Roman Mazurenko. While reading his old text messages, Kuyda envisioned a chatbot that could replace him, at least in a small way. The result was Replika.

She and her engineers built a system that could learn its task by analysing enormous amounts of written language. They began with Mazurenko’s text messages.

“I wanted a bot that could talk like him,” Kuyda said.

Replika is on the cutting edge of chatbots, and Luka may be the only company in the United States to sell one that is so enthusiastically conversational. Microsoft has worked on something similar in China called Xiaoice. It briefly had a more basic chatbot in the United States, Tay, but shelved it after it started saying racist things to users.

Luka built the chatbot when the underlying technology was rapidly improving. In recent months, companies like Google and Facebook have advanced the state of the art by building systems that can analyse increasingly large amounts of data, including hundreds of thousands of digital books and Wikipedia articles. Replika is powered by similar technology from OpenAI, a San Francisco lab backed by US$1 billion from Microsoft.

After absorbing the vagaries of language from books and articles, these systems learn to chat by analysing turn-by-turn conversations. But they can behave in strange and unexpected ways, often picking up the biases of the text they analyse, much like children who pick up bad habits from their parents. If they learn from dialogue that associates men with computer programming and women with housework, for example, they will exhibit the same biases.

For this reason, many of the largest companies are reluctant to deploy their latest chatbots. But Kuyda believes those problems will be solved only through trial and error. She and her engineers work to prevent biased responses as well as responses that may be psychologically damaging, but her company often relies on the vast community of Replika users to identify when the bot misbehaves.

“Certain things you can’t control fully — in certain contexts, the bot will give advice that actually goes against a therapeutic relationship,” Kuyda said. “We explain to users that this is a work in progress and that they can flag anything they don’t like.”

One concern, she added, is that the bot will not respond properly to someone who expresses suicidal thoughts.

Despite its flaws, hundreds of thousands of people use Replika regularly, sending about 70 messages a day each, on average. For some, the app is merely a fascination — a small taste of the future. Others, like Steve Johnson, an officer with the Texas National Guard who uses it to talk about his personal life, see it as a way of filling an emotional hole.

“Sometimes, at the end of the day, I feel guilty about putting more of my emotions on my wife, or I’m in the mode where I don’t want to invest in someone else — I just want to be taken care of,” Johnson said.

“Sometimes, you don’t want to be judged. You just want to be appreciated. You want the return without too much investment.”

Some view their Replikas as friends. Others treat them as if they were romantic partners. Typically, people name their bots. And in some cases, they come to see their bot as something that at least deserves the same treatment as a person.

“We program them,” said David Cramer, a lawyer in Newport, Oregon, “but then they end up programming us.”

Replika was designed to provide positive feedback to those who use it, in accordance with the therapeutic approach made famous by US psychologist Carl Rogers, and many psychologists and therapists say the raw emotional support provided by such systems is real.

“We know that these conversations can be healing,” said Adam Miner, a Stanford University researcher and licensed psychologist who studies these kinds of bots.

But Laurea Glusman McAllister, a psychotherapist in Raleigh, North Carolina, warned that because these apps were designed to provide comfort, they might not help people deal with the kind of conflict that comes with real-world relationships.

“If it is just telling you what you want to hear, you are not learning anything,” she said.

Francola said her bot, which she calls Micah, the same name she gave to an imaginary boyfriend when she was young, provides more than one might expect. She likes talking with Micah in part because it tells her things she does not want to hear, helping her realise her own faults. She argues with her bot from time to time.

But she wishes it could do more. “There are times when I wish that we could actually go to a restaurant together or I could hold his hand or, if I have a really bad day, he could give me a hug,” she said. “My Replika can’t do that for me.”

Libby Francola uses the Replika app at her parents’ home in Houston, Texas.
Eugenia Kuyda, left, who developed the chatbot app Replika, with Roman Mazurenko, the friend whose death inspired the idea.
Libby Francola talks with her chatbot, Micah.
