This week's Sunday Long Read comes to us from Jason Fagone at the San Francisco Chronicle, and it asks the question "Is it okay to build an AI chatbot of your dead fiancee?" If that doesn't immediately creep you the hell out, the rest of this story will.
One night last fall, unable to sleep, Joshua Barbeau logged onto a mysterious chat website called Project December. An old-fashioned terminal window greeted him, stark white text on a black square:
14 November 1982
RHINEHOLD DATA SYSTEMS, PLC
Unauthorized access is forbidden!
Enter electronic mail address:
It was Sept. 24, around 3 a.m., and Joshua was on the couch, next to a bookcase crammed with board games and Dungeons & Dragons strategy guides. He lived in Bradford, Canada, a suburban town an hour north of Toronto, renting a basement apartment and speaking little to other people.
A 33-year-old freelance writer, Joshua had existed in quasi-isolation for years before the pandemic, confined by bouts of anxiety and depression. Once a theater geek with dreams of being an actor, he supported himself by writing articles about D&D and selling them to gaming sites.
Many days he left the apartment only to walk his dog, Chauncey, a black-and-white Border collie. Usually they went in the middle of the night, because Chauncey tended to get anxious around other dogs and people. They would pass dozens of dark, silent, middle-class homes. Then, back in the basement, Joshua would lie awake for hours, thinking about Jessica Pereira, his ex-fiancee.
Jessica had died eight years earlier, at 23, from a rare liver disease. Joshua had never gotten over it, and this was always the hardest month, because her birthday was in September. She would have been turning 31.
On his laptop, he typed his email address. The window refreshed. “Welcome back, Professor Bohr,” read the screen. He had been here before. The page displayed a menu of options.
He selected “Experimental area.”
That month, Joshua had read about a new website that had something to do with artificial intelligence and “chatbots.” It was called Project December. There wasn’t much other information, and the site itself explained little, including its name, but he was intrigued enough to pay $5 for an account.
As it turned out, the site was vastly more sophisticated than it first appeared.
Designed by a Bay Area programmer, Project December was powered by one of the world’s most capable artificial intelligence systems, a piece of software known as GPT-3. It knows how to manipulate human language, generating fluent English text in response to a prompt. While digital assistants like Apple’s Siri and Amazon’s Alexa also appear to grasp and reproduce English on some level, GPT-3 is far more advanced, able to mimic pretty much any writing style at the flick of a switch.
In fact, the A.I. is so good at impersonating humans that its designer — OpenAI, the San Francisco research group co-founded by Elon Musk — has largely kept it under wraps. Citing “safety” concerns, the company initially delayed the release of a previous version, GPT-2, and access to the more advanced GPT-3 has been limited to private beta testers.
But Jason Rohrer, the Bay Area programmer, opened a channel for the masses.
A lanky 42-year-old with a cheerful attitude and a mischievous streak, Rohrer worked for himself, designing independent video games. He had long championed the idea that games can be art, inspiring complex emotions; his creations had been known to make players weep. And after months of experiments with GPT-2 and GPT-3, he had tapped into a new vein of possibility, figuring out how to make the A.I. systems do something they weren’t designed to do: conduct chat-like conversations with humans.
Last summer, using a borrowed beta-testing credential, Rohrer devised a “chatbot” interface that was driven by GPT-3. He made it available to the public through his website. He called the service Project December. Now, for the first time, anyone could have a naturalistic text chat with an A.I. directed by GPT-3, typing back and forth with it on Rohrer's site.
Users could select from a range of built-in chatbots, each with a distinct style of texting, or they could design their own bots, giving them whatever personality they chose.
Joshua had waded into Project December by degrees, starting with the built-in chatbots. He engaged with “William,” a bot that tried to impersonate Shakespeare, and “Samantha,” a friendly female companion modeled after the A.I. assistant in the movie “Her.” Joshua found both disappointing; William rambled about a woman with “fiery hair” that was “red as a fire,” and Samantha was too clingy.
But as soon as he built his first custom bot — a simulation of Star Trek’s Spock, whom he considered a hero — a light clicked on: By feeding a few Spock quotes from an old TV episode into the site, Joshua summoned a bot that sounded exactly like Spock, yet spoke in original phrases that weren’t found in any script.
As Joshua continued to experiment, he realized there was no rule preventing him from simulating real people. What would happen, he wondered, if he tried to create a chatbot version of his dead fiancee?
There was nothing strange, he thought, about wanting to reconnect with the dead: People do it all the time, in prayers and in dreams. In the last year and a half, more than 600,000 people in the U.S. and Canada have died of COVID-19, often suddenly, without closure for their loved ones, leaving a raw landscape of grief. How many survivors would gladly experiment with a technology that lets them pretend, for a moment, that their dead loved one is alive again — and able to text?
That night in September, Joshua hadn’t actually expected it to work. Jessica was so special, so distinct; a chatbot could never replicate her voice, he assumed. Still, he was curious to see what would happen.
And he missed her.
The question is: is this at all healthy, or is it fetishizing grief? There have been a number of movies and books over the years exploring the relationship between men and digital women, and they're all about pretty broken men.
But frankly, I see more things like this happening in the years ahead. What kind of chatbot would you get by feeding one all of my blog and Twitter posts? Would it be me, even with 12 years of near-daily material to work with?
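If I ever wanted to find out, the experiment would look roughly like the sketch below: prime GPT-3 with a persona description and samples of my own writing, then trade turns with it, the same basic trick the article describes Joshua using with his Spock quotes. This is just a guess at the shape of it, written against the pre-1.0 openai Python library; the file name, persona label, engine choice and sampling settings are all placeholders, not how Rohrer actually built Project December.

```python
# A rough sketch of a "me" chatbot primed on old blog posts.
# Assumes the pre-1.0 openai Python library; the file, persona label,
# engine and sampling settings are placeholders, not Project December's.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Seed the bot with a persona line plus a chunk of old posts.
with open("my_blog_posts.txt") as f:   # hypothetical dump of my posts
    samples = f.read()[:4000]          # base GPT-3 has a small context window

preamble = (
    "The following is a chat with a blogger who has posted nearly every "
    "day for 12 years. He answers in his own voice.\n\n"
    "Samples of his writing:\n" + samples + "\n\nChat transcript:\n"
)

history = ""
while True:
    user = input("You: ")
    history += f"You: {user}\nBlogger:"
    resp = openai.Completion.create(
        engine="davinci",        # the base GPT-3 model circa 2021
        prompt=preamble + history,
        max_tokens=150,
        temperature=0.9,         # higher = more surprising replies
        stop=["You:"],           # don't let it write my side of the chat
    )
    reply = resp.choices[0].text.strip()
    history += f" {reply}\n"
    print("Blogger:", reply)
```

The whole illusion lives in that preamble: GPT-3 just keeps extending the transcript, and the stop sequence is the only thing keeping it from writing my lines for me too.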
Something worth thinking about.