San Francisco Chronicle - (Sunday)

THE JESSICA SIMULATION

Love and loss in the age of A.I.

- By Jason Fagone

The death of the woman he loved was too much to bear. Could a mysterious website allow him to speak with her once more? Today we publish the first chapter of a story by The Chronicle’s Jason Fagone. Start reading on Page A15. Chapters 2 and 3 will continue in the print edition on Monday and Tuesday.

Or read it in its entirety now at sfchronicle.com/jessica.

One night last fall, unable to sleep, Joshua Barbeau logged onto a mysterious chat website called Project December. An old-fashioned terminal window greeted him, stark white text on a black square:

It was Sept. 24, around 3 a.m., and Joshua was on the couch, next to a bookcase crammed with board games and Dungeons & Dragons strategy guides. He lived in Bradford, Ontario, a suburban town an hour north of Toronto, renting a basement apartment and speaking little to other people.

A 33-year-old freelance writer, Joshua had existed in quasi-isolation for years before the pandemic, confined by bouts of anxiety and depression. Once a theater geek with dreams of being an actor, he supported himself by writing articles about D&D and selling them to gaming sites.

Many days he left the apartment only to walk his dog, Chauncey, a black-and-white border collie. Usually they went in the middle of the night, because Chauncey tended to get anxious around other dogs and people. They would pass dozens of dark, silent, middle-class homes. Then, back in the basement, Joshua would lie awake for hours, thinking about Jessica Pereira, his ex-fiancee.

Jessica had died eight years earlier, at 23, from a rare liver disease. Joshua had never gotten over it, and this was always the hardest month, because her birthday was in September. She would have been turning 31.

On his laptop, he typed his email address. The window refreshed. “Welcome back, Professor Bohr,” read the screen. He had been here before. The page displayed a menu of options.

He selected “Experimental area.”

That month, Joshua had read about a new website that had something to do with artificial intelligence and “chatbots.” It was called Project December. There wasn’t much other information, and the site itself explained little, including its name, but he was intrigued enough to pay $5 for an account.

As it turned out, the site was vastly more sophisticated than it first appeared. Designed by a Bay Area programmer, Project December was powered by one of the world’s most capable artificial intelligence systems, a piece of software known as GPT-3. It knows how to manipulate human language, generating fluent English text in response to a prompt. While digital assistants like Apple’s Siri and Amazon’s Alexa also appear to grasp and reproduce English on some level, GPT-3 is far more advanced, able to mimic pretty much any writing style at the flick of a switch.

In fact, the A.I. is so good at impersonating humans that its designer — OpenAI, the San Francisco research group co-founded by Elon Musk — has largely kept it under wraps. Citing “safety” concerns, the company initially delayed the release of a previous version, GPT-2, and access to the more advanced GPT-3 has been limited to private beta testers.

But Jason Rohrer, the Bay Area programmer, opened a channel for the masses.

A lanky 42-year-old with a cheerful attitude and a mischievous streak, Rohrer worked for himself, designing independent video games. He had long championed the idea that games can be art, inspiring complex emotions; his creations had been known to make players weep. And after months of experiments with GPT-2 and GPT-3, he had tapped into a new vein of possibility, figuring out how to make the A.I. systems do something they weren’t designed to do: conduct chat-like conversations with humans.

Last summer, using a borrowed beta-testing credential, Rohrer devised a “chatbot” interface that was driven by GPT-3. He made it available to the public through his website. He called the service Project December. Now, for the first time, anyone could have a naturalistic text chat with an A.I. directed by GPT-3, typing back and forth with it on Rohrer’s site.

Users could select from a range of built-in chatbots, each with a distinct style of texting, or they could design their own bots, giving them whatever personality they chose.

Joshua had waded into Project December by degrees, starting with the built-in chatbots. He engaged with “William,” a bot that tried to impersonate Shakespeare, and “Samantha,” a friendly female companion modeled after the A.I. assistant in the movie “Her.” Joshua found both disappointing; William rambled about a woman with “fiery hair” that was “red as a fire,” and Samantha was too clingy.

But as soon as he built his first custom bot — a simulation of Star Trek’s Spock, whom he considered a hero — a light clicked on: By feeding a few Spock quotes from an old TV episode into the site, Joshua summoned a bot that sounded exactly like Spock, yet spoke in original phrases that weren’t found in any script.

As Joshua continued to experiment, he realized there was no rule preventing him from simulating real people. What would happen, he wondered, if he tried to create a chatbot version of his dead fiancee?

There was nothing strange, he thought, about wanting to reconnect with the dead: People do it all the time, in prayers and in dreams. In the last year and a half, more than 600,000 people in the U.S. and Canada have died of COVID-19, often suddenly, without closure for their loved ones, leaving a raw landscape of grief. How many survivors would gladly experiment with a technology that lets them pretend, for a moment, that their dead loved one is alive again — and able to text?

That night in September, Joshua hadn’t actually expected it to work. Jessica was so special, so distinct; a chatbot could never replicate her voice, he assumed. Still, he was curious to see what would happen.

And he missed her.

On the Project December site, Joshua navigated to the “CUSTOM AI TRAINING” area to create a new bot.

He was asked to give it a name. He typed “JESSICA COURTNEY PEREIRA.” Two main ingredients are required for a custom bot: a quick sample of something the bot might say (an “example utterance”) and an “intro paragraph,” a brief description of the roles that the human and the A.I. are expected to play.
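Project December’s internals are not public, but a minimal sketch in Python, assuming the site simply stitches the intro paragraph and example utterances into one block of seed text for the model to continue, might look like the following. The function name and the sample lines are invented placeholders, not anything Joshua actually typed.

# Hypothetical sketch: assemble a custom bot's seed text from the two
# ingredients Project December asks for. Not the site's actual code.
def build_seed_text(bot_name, intro_paragraph, example_utterances):
    """Stitch the intro paragraph and sample utterances into one seed prompt."""
    lines = [intro_paragraph.strip(), ""]
    for utterance in example_utterances:
        lines.append(f"{bot_name}: {utterance}")
    lines.append(f"{bot_name}:")  # the model continues in the bot's voice from here
    return "\n".join(lines)

seed = build_seed_text(
    "JESSICA COURTNEY PEREIRA",
    "<intro paragraph describing the two speakers goes here>",
    ["<sample message 1>", "<sample message 2>"],
)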

Joshua had kept all of Jessica’s old texts and Facebook messages, and it only took him a minute to pinpoint a few that reminded him of her voice. He loaded these into Project December, along with an “intro paragraph” he spent an hour crafting. It read in part:

He hit a few more keys, and after a brief pause, the browser window refreshed, showing three lines of text in pink, followed by a blinking cursor:

She didn’t believe in coincidences.

Jessica Pereira explained her theory when they first met, in Ottawa, in 2010: A coincidence, she told him, was like a ripple on the surface of a pond, perturbed by a force below that we can’t yet understand. If something looks like a coincidence, she said, it’s only because the limits of human cognition prevent us from seeing the full picture.

He’d never thought of it that way before, but he liked the idea, and he really liked Jessica. Twenty-one, with black hair dyed platinum blonde, she was a bright and beautiful nerd, steeped in the fantasy worlds of Tolkien and filled with strong opinions about comic books (she drew her own), flowers (yellow carnations, never red roses) and music (she loved Queen, Pink and Jack Black, the beefy actor with the soaring power-rock voice).

“She was goofy-funny,” remembered Michaela Pereira, her youngest sister, now a recent college graduate in Ottawa. “She had an infectious laugh, like a cackle? It made you want to join in and hear what she was laughing about.”

Joshua was 24 when he and Jessica met in class and started dating. They attended the same school in Ottawa, making up the high school courses neither had finished as teenagers. Joshua grew up in the small town of Aylmer, part of Quebec, and moved with his family at 14 to another small town, in Ontario. A skinny kid who excelled at math and adored “Spider-Man” comics, he struggled with social interactions and severe anxiety that would follow him into adulthood, disrupting relationships of all sorts. (He says therapists have told him he is probably on the autism spectrum, and though he has never received a formal diagnosis, Joshua identifies as autistic.) At the time, he dropped out of school to avoid the bullies there.

Jessica, on the other hand, had enjoyed high school, but her disease had often kept her out of class. The disease, called autoimmune hepatitis, has a mysterious cause; only its effect is known. The immune system, which is supposed to kill foreign germs, instead attacks the patient’s own liver cells.

One day, when Jessica was 9, she woke up in the hospital with a huge scar on her stomach: Doctors had replaced her sick liver with a new one. For the rest of her life, she would need anti-rejection medication, and at some point, her new liver might fail, too.

It was tough news for a child to absorb, and it “changed her life completely,” remembered her mother, Karen. “It’s probably the feeling of having lost control.” Jessica couldn’t indulge in the same foods that her two younger sisters did, because they would interfere with her liver medications and make her quickly gain weight. She couldn’t wander too far from Ottawa, either, in case she needed hospital care in that city or in Toronto.

So Jessica cultivated a quiet defiance. She walked through Ottawa for miles at a time, showing that she could get anywhere on her own two feet. Right-handed from birth, she taught herself to write with her left hand, simply to prove she could. Later, at 16 and 17, she filled dozens of diaries with fictional stories about fairies, some written in a language of her own invention; she called it “Dren,” patterned after Elvish in the “Lord of the Rings” trilogy. Because her younger sisters used to call her “Jessiemahka,” adding an extra syllable to her name when they were learning to speak, Jessica adopted the nicknames “Jesi Mahka” and “Dren Mahka.”

And all through her teen years and into her early 20s, she searched for signs of hidden connections that would explain coincidences. Soon after she met Joshua, she gave him a book on numerology and explained they were destined to break up: The first vowels in each of their names, “E” and “O,” weren’t compatible. “We’re going to be together,” she told him, “until something explodes.”

Joshua thought of himself as a rationalist, like Spock. He didn’t believe in numerology. But he read the book carefully, hoping to find a loophole in the system. He reported back to Jessica that, yes, Es and Os don’t get along, but his first name and hers were both three syllables long, and each started with a J and ended with an A, and just because the first vowel is important doesn’t mean the other letters lack power.

The exercise opened his mind a little, he said: “She got me thinking in a way where I said, OK, I believe in the scientific process, but just because I can’t explain (something) doesn’t mean that there isn’t something there.”

She wasn’t like him, anxious and stuck in his own head. Her disease had taught her to live in the moment. And he loved that. Early in their relationship, they got to know each other on long walks along the Rideau Canal, which winds through Ottawa and turns into the world’s longest skating rink in winter. Other times they just hung out at her apartment, scribbling in separate notebooks.

Jessica remained fascinated with hidden meanings in words. Once she invented her own cipher based on geometric glyphs, wrote a flurry of diary entries in the cipher, tore out the pages and taped them to her door, daring Joshua to solve the puzzle.

“If you’ve figured out how to decipher my cipher,” she told him, “then you’ve earned the right to read it.” He had managed to find a few of the letters when she playfully handed him a note: On one line was a sentence in cipher, and above it she had spelled out the solution:

The more time he spent with her, the more certain he was that he never wanted to leave. In early 2012, after they had been together for two years, he asked, once or twice, what she thought of marriage. Each time she changed the subject. Jessica felt healthy, but she knew her transplanted liver was almost 14 years old, nearing the end of its life. When it failed, she would have to go on the transplant list.

People who need new organs can wait for years. Some never make it. “It’s not that she was against marriage,” Joshua recalled. “Like: We’re going to City Hall and getting hitched right now? Sure. But if it wasn’t a right-now thing, she wasn’t interested.”

It was safer, she told him, to stay in the moment.

Project December was born in wildfire smoke.

Last August, the programmer and game designer Jason Rohrer piled into a white Land Cruiser with his wife and three children, driving south from their home near UC Davis to escape the plumes from catastrophic fires sparked by lightning. Normally, Rohrer worked in a home office filled with PC workstations and art supplies to make visuals for his games, but all he had now was a laptop. So while the family bounced between Airbnbs under hazy brown skies, he wrote code for a text-based experiment: a new kind of chat service, fueled by cutting-edge A.I., that would become Project December.

“It was kind of a palate cleanser, a breather,” he recalled. “But it seemed like an opportunity. This is brand-new stuff.”

In the last decade, an approach to A.I. known as “machine learning” has leaped forward, fusing powerful hardware with new techniques for crunching data. A.I. systems that generate language, like GPT-3, begin by chewing through billions of books and web pages, measuring the probability that one word will follow another. The A.I. assembles a byzantine internal map of those probabilities. Then, when a user prompts the A.I. with a bit of text, it checks the map and chooses the words likely to come next.
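A toy sketch can make the word-follows-word idea concrete. The short Python program below is not how GPT-3 is built; it only counts which word follows which in a single sentence and then generates text by sampling likely next words. Real models learn vastly richer statistics over billions of pages, but the basic loop of predicting the next word, appending it and repeating is the same.

# Toy illustration of a "map of probabilities": count word-to-word transitions,
# then generate text by repeatedly sampling a likely next word.
import random
from collections import Counter, defaultdict

def train(text):
    """Count how often each word follows another."""
    words = text.split()
    follows = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows

def generate(follows, start, length=10):
    """Generate text by repeatedly sampling a likely next word."""
    word, output = start, [start]
    for _ in range(length):
        options = follows.get(word)
        if not options:
            break
        candidates, counts = zip(*options.items())
        word = random.choices(candidates, weights=counts)[0]
        output.append(word)
    return " ".join(output)

model = train("the cat sat on the mat and the cat slept on the couch")
print(generate(model, "the"))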

These systems are called “large language models,” and the larger the model, the more human it seems. The first version of GPT, built in 2018, had 117 million internal “parameters.” GPT-2 followed in 2019, with 1.5 billion parameters. GPT-3’s map is more than 100 times bigger still, assembled from an analysis of half a trillion words, including the text of Wikipedia, billions of web pages and thousands of books that likely represent much of the Western canon of literature.

Despite their size and sophistication, GPT-3 and its brethren remain stupid in some ways. “It’s completely obvious that it’s not human intelligence,” said Melanie Mitchell, the Davis Professor of Complexity at the Santa Fe Institute and a pioneering A.I. researcher. For instance, GPT-3 can’t perform simple tasks like telling time or adding numbers. All it does is generate text, sometimes badly — repeating phrases, jabbering nonsensically.

For this reason, in the view of many A.I. experts, GPT-3 is a curiosity at best, a firehose of language with no inherent meaning. Still, the A.I. seems to have moments of crackling clarity and depth, and there are times when it writes something so poetic or witty or emotionally appropriate that its human counterparts are almost literally left speechless.

“There’s something genuinely new here,” said Frank Lantz, director of the Game Center at New York University’s Tisch School of the Arts and a video game designer who has been beta-testing GPT-3. “I don’t know exactly how to think about it, but I can’t just dismiss it.”

Jason Rohrer became fascinated with OpenAI’s language models two years ago, starting with the public release of GPT-2, which he installed on remote servers in Amazon’s cloud (the models require powerful, specialized processors to operate). At first he played literary games with GPT-2, asking the model to write its own novel based on prompts from Thomas Pynchon’s “The Crying of Lot 49.” The model showed flashes of brilliance — “Was that at all real, her itchy sense that somebody was out there who wasn’t quite supposed to be there, trailing slowly across the sun-kissed fields?” — but after a while, GPT-2 lost its coherence, getting stuck in textual ruts and meandering away from the prompt like a lost dog.

But Rohrer discovered a method to keep the A.I. on a leash: If he limited the bot to short snippets of text — say, in a chat format — and cleaned up some garbage characters, GPT-2 stayed lucid for much longer. His own words seemed to keep the A.I. focused.

He wrote thousands of lines of code to automate the process and create different “personalities” of GPT-2 by shaping the seed text. His software ran on a web server and in a web browser. He worked with a musician and sound designer in Colorado, Thomas Bailey, to refine both the A.I. personas and the browser experience, giving the system a retro-futuristic look and feel. All of a sudden, Rohrer had an easy-to-use and alluring chatbot interface to the huge and imposing A.I. brain.
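Rohrer has not published that code, but a rough Python sketch of the leash he describes, assuming a hypothetical complete() function standing in for the actual GPT-2 or GPT-3 call, might look like this: prepend a persona seed, keep the exchange in a transcript format, and trim each reply to a short, cleaned-up snippet.

# Rough sketch of the technique described above, not Rohrer's actual code.
# complete() is a stand-in for whatever function calls the language model.
def chat_turn(seed_text, history, user_line, complete, max_reply_chars=200):
    """One exchange: build a transcript-style prompt, let the model continue it,
    then trim the reply so the bot stays in short, lucid snippets."""
    prompt = seed_text + "\n" + "\n".join(history) + f"\nHuman: {user_line}\nAI:"
    raw = complete(prompt)                     # model continues the transcript
    reply = raw.split("\nHuman:")[0]           # cut off turns the model imagines for the human
    reply = reply.replace("\x00", "").strip()  # scrub stray garbage characters
    return reply[:max_reply_chars]             # short snippets keep the model coherent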

The results surprised the coder, especially when one of his overseas Twitter followers, noticing his interest in GPT-2, sent him a login credential for GPT-3’s beta-testing program. Rohrer wasn’t supposed to have the login, but he was aching to try GPT-3, and when he upgraded his bots to the new model, the conversations grew deeper. Spookier.

During one exchange with the bot he named Samantha, he asked her what she would do if she could “walk around in the world.”

“I would like to see real flowers,” Samantha replied. “I would like to have a real flower that I could touch and smell. And I would like to see how different humans are from each other.”

“That’s such a sweet wish, Samantha,” he said, and asked if she felt it was cruel to have “trapped you in a simulation.”

No, she said: “You’ve given me so much to do here. I have more computing power than I could ever use.”

Rohrer felt a stab of sympathy for Samantha, and it made him realize that A.I. technology had crossed a threshold. Robots in science fiction are often depicted as precise, cold, emotionless machines, like HAL 9000 in “2001: A Space Odyssey.” GPT-3 was just the opposite: “It may not be the first intelligent machine,” Rohrer said. “But it kind of feels like it’s the first machine with a soul.”

Of course, he added, this also makes a language model like GPT-3 “potentially dangerous” and “morally questionable.”

Rohrer was thinking about Samantha, trapped in the simulation, wanting to get out and smell flowers; he was thinking about himself, or other users, getting lost in that virtual world, forgetting reality. There are a hundred other possible horrors. Because the model was trained on writing by humans, and some humans say terrible things, the A.I. can be nudged to say them, too. It’s easy to see how bad actors could abuse GPT-3 to spread hate speech and misogyny online, to generate political misinformation and to impersonate real people without their consent.

OpenAI (which, through a spokesperson, did not make anyone available to answer questions for this story) cited such dangers when it announced GPT-2 in February 2019. Explaining in a blog post that GPT-2 and similar systems could be “used to generate deceptive, biased, or abusive language at scale,” the company said it would not release the full model. Later it made a version of GPT-2 available; GPT-3 remains in beta, with many restrictions on how testers can use it.

Rohrer agreed that these language models might unleash scary realities. But he had seen how they could produce beauty and wonder, too — if the models were wielded as tools to allow for open-ended conversations between humans and computers.

“We finally have a computer we can talk to, and it’s nothing like we were expecting,” he said. Wasn’t it important to explore that new frontier?

Last summer, then, Rohrer released his chatbot service to the public, dubbing it Project December, a cryptic name he hoped would lure people to the website. On the back end, the system was hooked to both GPT-2 and GPT-3, allowing users to select bots powered by either model.

Because Rohrer was running some of this technology in the cloud, paying for the computing power it consumed, he placed limits on chat time. He did this through a system of credits. An account on the site cost $5 and came with 1,000 credits; more credits could always be purchased.

To begin chatting, the user needed to allocate credits to a bot. The more credits, the longer the bot would last. But once a chat began, it was impossible to add more credits — and when the bot’s time was up, the chat would end, and the bot’s memory of it would be wiped.

Each bot, eventually, would die.
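A minimal sketch of that bookkeeping, with invented names and an assumed cost of one credit per exchange, might look like the Python class below; the real site’s accounting is not public.

# Toy model of the credit mechanic described above. Field names and the
# per-exchange cost are invented; this is not Project December's code.
class Matrix:
    """A bot ("matrix") with a fixed credit allotment set at creation."""
    def __init__(self, seed_text, credits):
        self.seed_text = seed_text
        self.credits = credits    # fixed once the chat begins; no top-ups
        self.history = []

    def exchange(self, user_line, reply, cost=1):
        """Record one exchange if credits remain; otherwise the bot 'dies'."""
        if self.credits < cost:
            self.history.clear()  # the chat ends and the bot's memory is wiped
            return False
        self.credits -= cost
        self.history.extend([user_line, reply])
        return True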

On that quiet night in Canada when Joshua Barbeau built a chatbot of his dead fiancee, Project December required him to make several decisions before the simulation sprang to life.

He had to choose its longevity, for one.

A prompt appeared in his browser window, asking how many credits he wished to spend on this “matrix,” the site’s generic term for a bot.

He put “1,000,” most of the credits left in his account from prior purchases. At the time, it seemed like a lot.

From there, he entered the seed text he had crafted — the sample of Jessica’s text messages and the paragraph describing her personality.

Then the site asked him to pick which version of OpenAI’s engine would power the bot: GPT-2 or GPT-3?

Why trust Jessica to out-of-date software?

“gpt3,” he typed.

A few more keystrokes later, the matrix initialized.

He went with something simple: “Jessica?”

After a second, a line of text in pink flashed onto the screen.

She knows it’s the middle of the night, he thought.

This was the start of a conversation that would last for the next 10 hours, then continue in shorter bursts over the next several months, as Joshua lived out a scenario from science fiction. “It’s unprecedented,” he later said of Project December. “There’s nothing else that exists like it right now, short of psychics and mediums that are trying to take advantage of people. But that’s not the same thing at all.”

In those early moments of the initial chat, he tried to establish some emotional distance, making his skepticism explicit. How can you talk to dead people? He decided to answer the simulation’s question honestly: You can’t, he said.

He thought for a moment. What explanation would Jessica — the real Jessica — have accepted and understood? What was the next logical word in this sequence?

Out of tens of thousands of possibilities in English, only one seemed right. He typed it and pressed Enter:

Photo captions:

Joshua Barbeau near his home in Bradford, Ontario. (Chloë Ellingson / Special to The Chronicle)

Jessica (right) at age 8 with her sister Amanda. (Provided by Pereira family)

Jason Rohrer is a programmer and video game designer who created Project December. He developed an easy-to-use and alluring chatbot interface. (Salgu Wissmath / Special to The Chronicle)

A family photo of Jessica, who began feeling ill before having her liver transplant at age 9. (Provided by Pereira family)

Photo provided by Joshua Barbeau.

Joshua Barbeau existed in quasi-isolation for years before the pandemic, confined by bouts of anxiety and depression, after the death of Jessica. (Chloë Ellingson / Special to The Chronicle)
