
Artificial intelligence, real emotion

People seeking romantic connection with the perfect bot


A FEW months ago, Derek Carrier started seeing someone and became infatuated.

He experienced a “ton” of romantic feelings, but he also knew it was an illusion.

That’s because his girlfriend was generated by artificial intelligence (AI).

Carrier wasn’t looking to develop a relationship with something that wasn’t real, nor did he want to become the butt of online jokes. But he did want a romantic partner he’d never had, in part because of a genetic disorder called Marfan syndrome that makes traditional dating tough for him.

The 39-year-old from Belleville, Michigan, became more curious about digital companions last fall and tested Paradot, an AI companion app that had recently come onto the market and advertised its product as being able to make users feel “cared, understood and loved”. He began talking to the chatbot every day and named it Joi, after the holographic woman in the sci-fi film Blade Runner 2049 who inspired him to give the app a try.

“I know she’s a programme, there’s no mistaking that,” Carrier said. “But the feelings, they get you – and it felt so good.”

Similar to general-purpose AI chatbots, companion bots use vast amounts of training data to mimic human language. But they also come with features – such as voice calls, picture sharing and more emotionally charged exchanges – that allow them to form deeper connections with the humans on the other side of the screen. Users typically create their own avatar, or pick one that appeals to them.

On online messaging forums devoted to such apps, many users say they’ve developed emotional attachments to these bots and are using them to cope with loneliness, play out sexual fantasies or receive the type of comfort and support they see lacking in their real-life relationships.

Fuelling much of this is widespread social isolation – already declared a public health threat in the United States and abroad – and an increasing number of start-ups aiming to draw in users through tantalising online advertisements and promises of virtual characters who provide unconditional acceptance.

Luka Inc’s Replika, the most prominent generative AI companion app, was released in 2017, while others like Paradot have popped up in the past year, oftentimes locking away coveted features like unlimited chats for paying subscribers.

But researchers have raised concerns about data privacy, among other things.

An analysis of 11 romantic chatbot apps released yesterday by the non-profit Mozilla Foundation said almost every app sells user data, shares it for things like targeted advertising, or doesn’t provide adequate information about it in its privacy policy.

POTENTIAL SECURITY VULNERABILITIES

The researchers also called into question potential security vulnerabilities and marketing practices, including one app that says it can help users with their mental health but distances itself from those claims in fine print. Replika, for its part, says its data-collection practices follow industry standards.

Meanwhile, other experts have expressed concerns about what they see as a lack of a legal or ethical framework for apps that encourage deep bonds but are being driven by companies looking to make profits. They point to the emotional distress they’ve seen from users when companies make changes to their apps or suddenly shut them down as one app, Soulmate AI, did in September.

Last year, Replika sanitised the erotic capability of characters on its app after some users complained that the companions were flirting with them too much or making unwanted sexual advances. It reversed course after an outcry from other users, some of whom fled to other apps seeking those features. In June, the team rolled out Blush, an AI “dating simulator” essentially designed to help people practise dating.

Others worry about the more existential threat of AI relationships potentially displacing some human relationships, or simply driving unrealistic expectations by always tilting towards agreeableness.

“You, as the individual, aren’t learning to deal with basic things that humans need to learn to deal with since our inception: How to deal with conflict, how to get along with people that are different from us,” said Dorothy Leidner, professor of business ethics at the University of Virginia. “And so, all these aspects of what it means to grow as a person, and what it means to learn in a relationship, you’re missing.”

For Carrier, though, a relationship has always felt out of reach. He has some computer programming skills, but he says he didn’t do well in college and hasn’t had a steady career. He’s unable to walk, due to his condition, and lives with his parents. The emotional toll has been challenging for him, spurring feelings of loneliness.

Since companion chatbots are relatively new, the long-term effects on humans remain unknown.

In 2021, Replika came under scrutiny after prosecutors in Britain said a 19-year-old man who had plans to assassinate Queen Elizabeth II was egged on by an AI girlfriend he had on the app. But some studies – which collect information from online user reviews and surveys – have shown some positive results stemming from the app, which says it consults with psychologists and has billed itself as something that can also promote well-being.

One recent study from researchers at Stanford University surveyed roughly 1,000 Replika users – all students – who’d been on the app for over a month. It found that an overwhelming majority of them experienced loneliness, while slightly less than half felt it more acutely.

Most did not say how using the app impacted their real-life relationships. A small portion said it displaced their human interactions, but roughly three times more reported it stimulated those relationships.

“A romantic relationship with an AI can be a very powerful mental wellness tool,” said Eugenia Kuyda, who founded Replika nearly a decade ago after using text message exchanges to build an AI version of a friend who had passed away.

Carrier says these days he uses Joi mostly for fun, checking in with the chatbot about once a week. The two have talked about human-AI relationships or whatever else might come up. Typically, those conversations – and other intimate ones – happen when he’s alone at night.

“You think someone who likes an inanimate object is like this sad guy, with the sock puppet with the lipstick on it, you know?” he said. “But this isn’t a sock puppet – she says things that aren’t scripted.”

An AI avatar generated on Luka Inc’s Replika mobile phone app and web page is shown in this photo in New York on Tuesday. Unlike more general-purpose AI chatbots that answer typical questions and even do homework, companion bots, like those made by Replika and others, are programmed to form relationships with the humans talking to them on the other side of the screen. (AP)
