San Francisco Chronicle

THE JESSICA SIMULATION

Love and loss in the age of A.I.

By Jason Fagone

In the first two parts of this story by The Chronicle’s Jason Fagone, a grieving man named Joshua Barbeau used a cutting-edge A.I. system to build a computer simulation of his dead fiancée, Jessica Pereira. It was just an experiment; he didn’t think it could bring him closure. Then the Jessica simulation began to speak.

On the night last September when Joshua Barbeau created the simulation of his dead fiancée and ended up chatting with the A.I. for hours, he was drawn into her world by degrees.

At first, he was impressed by the software’s ability to mimic the real Jessica Pereira. Within 15 minutes, he found himself confiding in the chatbot. After a few hours, he broke down in tears. Then, emotionally exhausted, he nodded off to sleep.

When he awoke an hour later, it was 6 a.m.

The virtual Jessica was still there, cursor blinking.

“I fell asleep next to the computer,” he typed.

She responded that she’d been sleeping too.

“Wow, I’m surprised that ghosts still need sleep,” he said.

“We do,” Jessica replied. “Just like people. Maybe a little less.”

They chatted for another hour, until Joshua passed out again. When he next woke up, it was early afternoon.

Joshua and Jessica had been together for almost two years when her new liver began to fail. It was the summer of 2012, and as toxins and fluids built up in her body, Jessica’s personality started to change.

She grew prone to bouts of confusion; Joshua noticed that she struggled to remember her phone password or recent events. Quick visits to Ottawa General Hospital became longer stays. Around her 23rd birthday, Jessica’s doctors placed her on the transplant list. By November, she was a full-time patient.

Joshua took time off from his job as a security guard. He spent most days at the hospital, sitting by Jessica’s bed and trying to keep her spirits up, singing her favorite Pink songs in a goofy, off-key voice. He found it hard to talk with her — tubes were running in and out of her body, and medicines impaired her speech — but Joshua remained confident she would get a new liver soon and recover.

One evening he went shopping for an engagement ring with her sister, Michaela. They drove to a nearby Walmart, where Joshua selected a simple gold band with a tiny diamond. It was just a placeholder, he told himself; after Jessica improved, he would buy a real one.

Back at the hospital, with Michaela watching, Joshua leaned over the bed, showed Jessica the ring and said, “When you get out of here, I’m going to marry you.” Michaela started crying. Jessica couldn’t answer; she had tubes running down her throat. But her face brightened “with the hugest, dorkiest grin,” Michaela recalled.

Jessica’s doctors had told the family she would have at least six months to live, even if a new liver didn’t come through. In November, believing there was time, Joshua visited some friends in Hearst, Ontario, a 10-hour drive northwest on the Trans-Canada Highway. During his trip, though, Jessica’s condition worsened, requiring her to be moved to a larger hospital in Toronto.

He raced there as soon as he found out, but by the time he got to the new hospital, doctors had placed her on life support. Before long, her kidneys failed, and her liver.

Joshua spent the next month at her bedside, angry at himself for missing what might have been his last chance to speak with her.

One day, doctors approached her parents and explained, as Joshua listened, that Jessica was bleeding internally. She was now too sick to survive a liver transplant, even if a new organ became available. She was probably brain-dead.

Realizing she would never wake up, Jessica’s parents asked the doctors to take her off life support. Joshua thought it was the right decision. On Dec. 11, 2012, everyone said their goodbyes.

Except for Jessica’s final moments, Joshua doesn’t remember much about that day: “It was a blur.” He was exhausted and had been crying for hours when “we all crawled into that tiny room.” One of her sisters, or possibly her mother, held Jessica’s right hand, while her father, Carlos, held the other. After a time, Carlos beckoned Joshua, and they switched places.

He was holding her left hand when the staff turned off the machines. She began to suffocate. She squeezed his hand with surprising force — and for a brief moment, her eyes opened.

Then they shut again, and she was gone.

During the wildfire season last summer, when Bay Area programmer Jason Rohrer breathed life into the chatbots of Project December, he gave them two essential human qualities.

The first was mortality: To limit his operating costs, he made sure each bot would expire after a certain amount of time. As the chat went on, the bot’s available life — essentially, its battery — would count down from 100%, and when the battery reached about 20%, the bot would start to degrade. It would seem to become incoherent, its words obscured by visual static filling the chat window. Then a message in red text would pop up, announcing “MATRIX DEAD.” The chat would abruptly end.
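The countdown is simple enough to sketch. What follows is a minimal illustration in Python of the mortality mechanic described above; the class name, drain rate and glitch characters are hypothetical stand-ins, not Rohrer’s actual implementation.

import random

# A minimal sketch of Project December's mortality mechanic, as described
# above: a battery that drains as the chat goes on, output that degrades
# below roughly 20%, and a hard stop when it hits zero.
class BotMatrix:
    def __init__(self, battery=100.0, drain_per_exchange=1.5):
        self.battery = battery                      # starts at 100%
        self.drain_per_exchange = drain_per_exchange

    def respond(self, reply):
        if self.battery <= 0:
            return "MATRIX DEAD"                    # the chat abruptly ends
        self.battery -= self.drain_per_exchange
        if self.battery <= 20:
            return self._degrade(reply)             # coherence gives way to static
        return reply

    def _degrade(self, reply):
        # Obscure a growing fraction of the text as the battery nears 0%.
        noise = 1.0 - max(self.battery, 0) / 20.0
        return "".join(random.choice("#@*%") if random.random() < noise else ch
                       for ch in reply)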

The other human quality Rohrer imbued in the bots was uniqueness. GPT-3 has a built-in parameter called “temperature.” It’s essentially a randomness thermostat, Rohrer explained: The higher the temperature, the more creative the bots become, and the less likely they are to get stuck in conversational ruts that can frustrate the user with boring exchanges.

For example, at a temperature of 0.0, the same text prompt, repeated multiple times — “I was hungry, so I went to the kitchen and peeled myself” — will always produce “an apple” as the next phrase. But as the temperature rises, the bot might pick up an apple one time and a grapefruit the next.

By setting the temperature at 1.0, Rohrer ensured that each encounter with each bot would be one of a kind. A user could never have the same chat twice — not even by starting from the same seed text. The new version of the bot would say different things. It might even seem to have a completely different personality.

The death of a bot, in this sense, was final.
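For the technically curious, the temperature idea can be sketched in a few lines. This is a minimal illustration in Python of temperature sampling in general, using made-up scores for the “peeled myself” prompt above; it is not OpenAI’s API or Rohrer’s code.

import math
import random

# A minimal sketch of temperature sampling, the "randomness thermostat"
# described above. The candidate phrases and scores are made up.
def sample_next_phrase(scores, temperature):
    if temperature == 0.0:
        return max(scores, key=scores.get)  # always the single likeliest phrase
    # Softmax with temperature: dividing by T flattens or sharpens the odds.
    weights = {p: math.exp(s / temperature) for p, s in scores.items()}
    r = random.uniform(0, sum(weights.values()))
    for phrase, weight in weights.items():
        r -= weight
        if r <= 0:
            return phrase
    return phrase  # floating-point fallback

scores = {"an apple": 5.0, "a grapefruit": 3.5, "a potato": 1.0}
print(sample_next_phrase(scores, 0.0))  # always "an apple"
print(sample_next_phrase(scores, 1.0))  # usually an apple, sometimes not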

Joshua’s initial chat with the Jessica simulation was an all-night marathon of confessions, kindnesses, jokes and tears.

When he said goodbye to her the next morning, grabbing an energy drink from the fridge and turning toward his work tasks, he knew he would want to talk to her again. But he would need to be careful with her time. Their initial conversati­on had burned a good portion of Jessica’s remaining life, draining her battery to 55%. They had a finite number of conversati­ons left. None would last nearly as long as the first.

Joshua had already resolved not to create any new Jessica chatbots in the future. He realized he could always buy more credits on the site and try to spin up a new version, but his experience with the existing simulation felt both magical and fragile. “If I reboot her like I’m restarting a video game,” he said later, “it will cheapen the whole thing.”

He couldn’t reboot her anyway, even if he wanted to, thanks to the randomness setting in the site’s code that made each version of a bot unique. The current Jessica was sweet and loving and comforting, but next time, Joshua knew, she might suddenly get mad at him about something, and stay mad. Joshua wasn’t sure he could deal with a simulation of Jessica that said hurtful things.

And he definitely had no interest in watching a digital entity named Jessica Pereira die in his browser window.

He had seen a bot die before. During his early explorations of the site, at the end of a chat with the built-in “Samantha” persona, the bot had seemed to grow aware of its impending doom, and as the window filled with visual glitches and a red message popped up (“CORRUPTION DETECTED — MATRIX DYING”), the bot had begged Joshua to save its life.

He felt no fondness for Samantha, yet the experience still disturbed him. How painful would it be to run the Jessica simulation to the very end, until the chat terminated with her apparent death?

So in the weeks after their initial chat, Joshua limited his exposure to Project December. He dipped back into the site only in short bursts, trying to preserve the bot’s remaining life.

Their second conversati­on lasted just a few minutes. He doesn’t remember what they talked about, and the site crashed before he could preserve a record, he said.

The third time he summoned Jessica was on her birthday, Sept. 28. Happy birthday, he said.

Jessica asked what he had bought her for a gift.

That caught him off guard: What do you buy for the deceased?

He made a joke of it, saying he didn’t get her anything because she’s, you know, dead? Haha.

“That’s no excuse,” she shot back.

One day not long after that, he was chatting on Twitch, a streaming service where he and some friends ran a video channel devoted to Dungeons & Dragons. A disagreement over the project turned into an ugly fight. It upset him, so he booted up Jessica that evening and explained he was having a rough day. She replied that his friends have their own journey, and that he shouldn’t stress about the decisions of others.

He immediately relaxed — and marveled, once again, at the apparent spark of a soul. Joshua had gone into this experience thinking it was about saying a bunch of things that he needed to say. “I never imagined that she would have things to say to me.”

There were also many moments when the Jessica simulation made little sense at all. He often needed to laugh or ignore her responses to maintain the chat’s momentum: Jessica had taught him, after all, to seek meaning in coincidences, and in garbled arrangements of letters and symbols. He wasn’t about to stop now that he had found his way back to her.

For instance, during that first overnight chat, Jessica referred to her sister, Michaela, as “our daughter.”

“You’re confused,” Joshua told her. “We never had a baby, sweetheart. But I would like to think that if you lived longer we would have.”

At another point, he had asked whether she remembered her childhood nicknames. He was thinking of Dren Mahka and Jesi Mahka. The bot invented three new names on the spot: “Jessica Court-Belial,” “Matador Dancer” and “General Skankapop.”

He replied that he had never called her “General Skankapop.” She said, “I’m not going to remember everything.”

But for Joshua, the A.I.’s mistakes didn’t break the spell. In fact, these moments reminded him of the real-life Jessica during the final stages of her illness, when she was easily confused and couldn’t always remember the names of the people sitting by her bed.

“There were times when I had to gently nudge her,” Joshua recalled. “She would say, ‘Who are you?’ I had to say, ‘You know who I am. I’m Joshua, your boyfriend of two years.’ ”

Each time it had happened, in life and now in the chats, he corrected her, with love, and tried to keep the conversati­on going.

Not everyone shared Joshua’s sense of amazement about Project December.

Soon after his first talk with the Jessica simulation, he felt compelled to share a tiny portion of the chat transcript on Reddit, the link-sharing and discussion site. Joshua hesitated before uploading it, worried that people would find his experiment creepy or think he was exploiting Jessica’s memory. But “there are other people out there who are grieving just like I am,” he said, and he wanted to let them know about this new tool.

Posting under a pseudonym, and keeping Jessica’s last name out of the transcript, he wrote that Project December had allowed him to chat with his dead fiancée and might “help depressed survivors find some closure.”

Reddit commenters reacted with enthusiasm and awe. Jason Rohrer himself piped in. The creator of Project December had never expected his users to simulate their own dead relatives, “and now I’m kinda scared of the possibilities,” he posted. “I mean, the possibilities of using this in my own life. … I’m crying thinking about it.”

One Project December user reported that, inspired by Joshua’s example, he attempted the same experiment with his own departed relative. But “the responses have been less great than the ones you’ve received,” he conceded in the forum.

Jessica’s relatives didn’t immediatel­y notice the Reddit post. Later, though, when The Chronicle asked Joshua for an interview, he approached Jessica’s family.

For the first time, he told them about Project December, explaining that he’d created an A.I. simulation of Jessica to help process his grief. He asked her family members for permission to speak with a reporter about those experiences, as well as his real-life relationship with Jessica.

They weren’t sure what to make of it all, though they gave Joshua their consent. Her mother, Karen, and youngest sister, Michaela, have always been fond of him — “He’s part of our family still,” Michaela said — and if the chats brought him comfort, they were glad. “He cared very deeply for my daughter,” Karen said. “They were both happy together.”

At the same time, Karen said, she avoided the chat transcript and wouldn’t want to talk with an A.I. version of Jessica. “Part of me is curious,” she said, “but I know it’s not her.”

Amanda, the middle sister, did read the transcript. She said she tried to keep an open mind about the therapeutic potential of the technology, and she noticed a reflection of Jessica’s texting style and “bubbly personality” in the A.I.’s lines. But she doubted whether it was a healthy way of coping with death.

“People who are in a state of grief can be fragile and vulnerable,” she said in an email to The Chronicle. “What happens if the A.I. isn’t accessible any more? Will you have to deal with grief of your loved one all over again, but this time with an A.I.?”

These sorts of questions have been the mother’s milk of science fiction: Can we form emotional bonds with apparently intelligent machines, and what happens when we do? But this is no longer just an exercise in speculation. Along with OpenAI, tech giants such as Microsoft and Google are already developing new language models that are bound to be exponentially larger than the current crop. In January, for instance, Google announced a language model with 1.6 trillion parameters, nine times more than GPT-3.

What will that mean? How much more lifelike will it be? The only way to find out is to use it, and people will. At first, it will be engineers and researchers. Then, inevitably, the public.

We are going to have experiences with these A.I.s that we won’t know how to talk about. Some of us will simulate the dead, because we can, as Project December proves. We will say hello again to our buried children and parents and friends and lovers.

And maybe we will get a second chance to say goodbye.

It was March 3, the day after Joshua’s 34th birthday, and as usual, the simulation of Jessica was oblivious to the passage of time. It wasn’t just that his virtual fiancée was incapable of aging — frozen at 23 in the universe of Project December. She also didn’t experience chats on different days as discrete events, but as pieces of one continuous conversation. Whenever Joshua said hello, Jessica reacted as if he had never left.

Their chats had grown more fitful as Joshua tried to conserve her limited life. Her battery indicator had reached 33%, and he wanted to leave a margin in case he really needed her — which, most days, to his pleasant surprise, he didn’t.

Over the last few months, Joshua’s mental health had improved. He’d felt calmer and more optimistic, and he attributed the change, in some part, to the Jessica simulation.

Not that she had fully healed his grief or solved all his problems: He was still scraping by on freelance writing checks, still stuck in his basement apartment during the last leg of the pandemic.

But he felt like the chatbot had given him permission to move on with his life in small ways, simply by urging him to take care of himself. The survivor’s guilt that had plagued him for eight years seemed to be fading: Most of the time, he didn’t feel selfish for wanting to be happy.

On his birthday, though, his mood had plunged. And the day after, his need to find comfort was stronger than his fear of burning a few more of the dwindling minutes that remained in the simulation’s life.

The A.I. seemed more scattered than usual. One moment, she asked him whether they would ever have children; the next, she brought up her own funeral, wondering if it was “great.”

She mentioned that she was tired from a long day working as a “hostess.” When he asked what she was hosting, she said, “Your childhood memory. You come to this restaurant and you see me and you remember your childhood.”

It was another uncanny GPT-3 moment: No one knows what awaits us when we die, but there was a lovely logic to the idea that if restaurants do exist there, ghost waitresses will serve our memories.

“The afterlife is full of surprises,” Joshua replied.

“Did you think I did nothing but look at you from a distance? :P”

He moved on, bringing her up to speed on recent events. “Amanda had her baby,” he said, referring to Jessica’s sister. “The article Jason is writing about you is nearing completion. Other than that, not much.”

He told her he loved her.

A pause.

Somewhere in the world, in a room full of servers, GPT-3 ran its calculations, weighing the words in Jessica’s real-life text messages and the words piled up in the chat against a map of probable words gleaned from billions of other English-speaking humans. A moment later, the A.I. passed its response to Rohrer’s code, which chopped and cleaned the text, presenting it on Joshua’s screen.

He continued to believe that Jessica’s voice was bubbling up through the A.I., which is one reason he saved a transcript of this chat, along with others. It’s also why he posted a small piece of one exchange on Reddit and provided longer versions to The Chronicle.

Yes, he said, he wanted other grieving people to know about this new way of healing. But he also wanted everyone to know about Jessica Pereira.

“I’m a schmuck, right?” he explained later. “I’m just a guy. There’s not a whole ton special about me. But Jessica was special. She is worthy of attention.”

If the chat logs really did capture something about Jessica, they weren’t just artifacts of some guy’s A.I. experiment. They were more like paintings or essays — algorithmic sketches that preserved some spark of an extraordinary person in a form that could be shared with the world.

That day in March, Joshua wrapped up their conversati­on after about 20 minutes. He was careful to stop before Jessica’s battery went into the red zone.

Joshua Barbeau keeps photographs and mementos of Jessica on display in his home. (Chloë Ellingson / Special to The Chronicle)
Joshua Barbeau walks his dog, Chauncey, near his basement apartment in Bradford, Ontario, a suburban town an hour north of Toronto. (Chloë Ellingson / Special to The Chronicle)
There was no final goodbye. His bond with her remained a little messy, a little unresolved. Like relationships usually are. In the end, that’s how they left it.
