Boston Sunday Globe

Machines that make life too easy

In 1909, long before Hollywood made killer robots seem all too plausible, E.M. Forster foresaw the real threat from AI.

- BY TOM JOUDREY

Our species will get backstabbed by artificial intelligence. That’s the gist of a fear about AI that’s been hammered into our collective psyche for decades. Hollywood struck gold with the premise that machines will evolve into something devious and diabolical and flip their mission from assistive to treacherous. Et tu, Brute? Then fall humanity.

Stanley Kubrick’s “2001: A Space Odyssey” conveyed arguably the most cerebral version of the AI betrayal narrative. When HAL, a genial artificial intelligence on board a Jupiter-bound spacecraft, shows signs of minor malfunctions, it deflects blame for the errors and then, when threatened with termination, connives to exterminate the crew. Later, Arnold Schwarzenegger’s iconic turn as the Terminator gave us a cyborg assassin who hunts down the would-be savior of mankind to prevent him from sabotaging the Skynet AI in a post-apocalyptic future. “The Matrix” took the betrayal plot to a whole new level, recasting reality as a sophisticated subterfuge designed by AI to keep humanity in docile bondage as machines harvest energy from their inert bodies. It doesn’t get much more predatory than that.

The actual debate over AI seems to have taken cues from Hollywood. Industry leaders posit two warnings. First, a deluge of disinformation that leaves us unable to collect reliable evidence or have confidence in what’s true. Second, an extinction-level threat from a disloyal AI that imperils our species’ survival. Both fears are as Hollywood conjured them: techno-backstabbers feeding us a counterfeit reality.

I’m not dismissing these concerns — in fact, it’s clear we need to grapple with them — but what if this betrayal narrative, for all its dramatic seductiveness, underestimates the most plausible threat posed by AI? And what if doing so is preventing us from wrestling with the ethics of transformative technologies?

A crisis of comfort

More than a century ago, the British novelist E.M. Forster foretold a radically different kind of threat from automation, one that is less epic but far more chilling. Forster, a gay writer struggling with the paradoxes of modernity, was inclined to think humanity would become more proximate but somehow less intimate, more industrious but with ever greater obscurity of purpose. Already by the early 1900s, he worried that the motorized age had severed local roots of belonging. A form of this warning came in Forster’s epigraph to his novel “Howards End,” where he implored readers to “only connect!” rather than allow our society to become atomized.

Forster delivered his most pointed riposte to the cheerleaders of progress in a 1909 novella called “The Machine Stops.” It envisions a future in which humanity has become comfortably ensconced underground and connected by holographic instant messaging systems that forestall virtually any reason for people to leave their honeycomb cells. All their needs — food, music, clothes, literature — are met by a sophisticated Machine that was designed and built by their ancestors. With everything mediated through the Machine, humans have acquired a terror of direct experience and in fact no longer suffer the impoliteness of touching one another. Propagation centers efficiently handle reproduction — and parental duties cease at the instant of birth.

There would be no real narrative here, as nothing much happens in this curated world of pleasure and ease, except that a woman named Vashti gets a call from her son, Kuno, who wants to visit the crust of the planet. She recoils at the idea of a pointless trek through the empty dirt and muck above. But when she’s persuaded to board an airship and cross hemispheres to see him in person, Vashti learns that Kuno has already climbed a ventilation shaft and seen the surface for himself. The experience shook Kuno to the core. His senses were bombarded with stimuli, his lungs stung with bitter air. Blood gushed from his nose and ears. Ferns and sunbeams danced across his vision. But instead of recoiling from the real world above, he savors the epiphany it unleashed — namely, that the Machine has reduced him to a mummified state insensible to the world. The Machine, he now raves to his mother, “has robbed us of the sense of space and of the sense of touch, it has blurred every human relation and narrowed down love to a carnal act, it has paralyzed our bodies and our wills, and now it compels us to worship it.”

Kuno’s mother marvels at what she can only interpret as his atavistic reversion to something savage — she finds even the basic use of the senses barbaric — but as she tries to return to her normal life, she can’t escape a creeping crisis of exhaustion. To stanch further curiosity for exploration, the civilization’s rules committee revokes access to respirators that enable travel to the surface, further marooning human society in torpid decay. As time wears on, humans’ remaining capacity for inquiry gets supplanted by a new religion that worships the Machine as an omnipotent deity and its instruction booklet as a holy text. They offer prayers and supplications to avoid the burden of responsibility. Eventually, the committee responsible for addressing malfunctions in the Machine confesses that the automated mending apparatus itself is in need of repair, but no one has retained knowledge of how to go about repairing it. Fruit goes moldy, water stinks, lights dull, the air is befouled, until the whole dazzling achievement of human ingenuity implodes.

Reading Forster reverses our most basic assumptions about the threat posed by technology. Hollywood and AI industry leaders alike warn us that we are courting comeuppance for our hubris, generating an artificial being so sophisticated and powerful that it will slip loose of its directives and crush us like bugs. But Forster saw how the industrial age’s increasing reliance on technology — and its narrow focus on maximizing efficiencies — could altogether derail grand ambitions. Making life easier becomes the only goal, leaving humanity lethargic, sedentary, slouching into paralysis. Drained of wonder and reduced finally to a solipsistic hedonism, humanity falls into an infantile state, helpless and pathetic, whimpering for a magical caretaker to soothe its terrors and feed its pleasures.

The danger, Forster wants us to see, isn’t that technology evolves into something radically predatory but that it gives us too much of what we think we want.

Machines for pleasure

Long before the advent of digital computers, Forster prefigured some of the fundamental ethical warnings of putative progress gone awry. Two stand out to me.

The first relates to Forster’s idea that technology, in securing ease and satisfying pleasure, has the potential to radically subvert a purpose-driven life.

The philosopher Robert Nozick developed a similar insight in his 1974 book “Anarchy, State, and Utopia.” He asks us to imagine being offered the chance to have our brains plugged into a pleasure-delivery machine that simulates experience. The pleasure derived from real, lived experience would not be as robust or as certain as the pleasure supplied by the machine. Yet Nozick was willing to bet that humans would not opt to plug their nervous system into the machine — that they would refuse to sacrifice experiential and embodied life for the sake of maximizing pleasure alone. Nozick’s point was to show that humans have motives other than pleasure, but his work also revealed the prescience of Forster’s observation that the same technology that drives progress and spreads happiness can also ensnare societies in regression and listlessness.

The second of Forster’s warnings that struck me has to do with an oddity about how knowledge is produced and lost.

Later in the 20th century, French sociologist Bruno Latour would use the term “blackboxing” to refer to the way that scientific theories ossify into uninvestigated facts, obscuring the intricate systems of knowledge that produced the truth in question. Forster’s Machine is the endgame of this paradox, when the black-box Machine — a technological repository of knowledge — not only locks away instrumental knowledge of how things work but blots out the purpose of and motives for creating the Machine to begin with. While the Machine hummed along, human cognition atrophied, inducing staggering ignorance.

Four decades after he wrote “The Machine Stops,” Forster said his intention was to offer a rebuttal to the future envisioned by another famous English writer, H.G. Wells. In “The Time Machine,” Wells had posited that socioeconomic inequality would produce class-stratified offshoots of the human species — the childlike Eloi inhabiting paradise and the laboring Morlocks consigned to subterranean tunnels — creating an entrenched symbiosis of exploitation. Only by staving off class warfare could this impending crisis be ameliorated.

Forster’s fictional reply was that Wells had it exactly backwards: Society needs friction and competition and ambitions or else the machines, so valorized by Wells, will make helpless Eloi of us all.

Fictional prognostications are of course speculative, but more than a century after he wrote “The Machine Stops,” rigorous social science research is bearing out Forster’s warnings.

An avalanche of data accumulated over recent decades has tracked a correlation between interfacing with technology and social fragmentation, feelings of isolation and despair, an epidemic of loneliness, and declining capacity to absorb and retain information. Put bluntly, last century’s cautionary tale has become this century’s lived reality.

Forster’s warning should remind us that the vengeful AI enemy we’ve been conjuring over the last half century is a red herring. We’re haunted by the terror that something might “go wrong,” but the real peril lies in the nihilism of an existence blissfully free of adversity. AI may wreak its harm by simply doing exactly what we have asked of it.

MARI FOUZ FOR THE GLOBE

KING’S COLLEGE, CAMBRIDGE/WIKIMEDIA COMMONS — Forster, circa 1917. He worried that automation had severed local roots of belonging. Hence the epigraph to his novel “Howards End,” where he implored readers to “only connect!”
