“The history of artificial intelligence in cinema dates back to the early 20th century, with the first notable AI character being Maria in Fritz Lang’s iconic film Metropolis (1927). The portrayal of a humanoid robot ...”
I was tempted not to put quotation marks around those lines. What fun to lure readers into thinking they were beginning a boilerplate article on artificial intelligence in culture only to have them then discover that a free website had generated the copy. No. Better not go there. Even a hint of confusion is dangerous.
The (I’ll confess, eerily convincing) cyberarticle went big on technoapocalypse but declined to note how books, movies and TV had failed to conjure up the low-grade tediousness of so much AI. The “art” in particular. Who would bother structuring a science-fiction novel around not-quite-photorealistic images of blandly attractive lady warriors with seven fingers on each hand? Speculative nightmares are not fashioned from autogenerated reviews on travel sites that deal in suspiciously interchangeable rave clauses.
The perils of death by robot are everywhere. The perils of midlife redundancy thanks to programs that produce indifferent simulacra of hand-drawn animation – or greeting-card text, or ambient soundscapes – are just a little too ordinarily depressing to interest zesty creatives (as nobody then called HG Wells or Georges Méliès). There is, here, a misty parallel with speculative fiction about manned space exploration. There are yarns about travelling to Mars. There are stories about achieving light speed. Stanley Kubrick was less ambitious and, by the beginning of the 21st century, merely had us making it as far as Jupiter.
Did anyone bother to write a novel in which, after landing on the moon, we just sort of gave up on the idea? Never mind the way the world ends, TS Eliot. This is the way it potters on. Not with your epic bangs but with your spineless whimpers. Not with Terminator 2 but with a website that turns your holiday snaps into sword-and-sorcery postcards.
Kubrick’s 2001: A Space Odyssey, of course, also offers one of the most memorable depictions of AI gone wrong. No better man for the job. The director of Barry Lyndon and Eyes Wide Shut is, among other things, the great poet of exquisite boredom. HAL is not a gleaming protoman with fingers that shoot lasers. He operates from behind a bland panel decorated with a single red light. By the close of the film, as artificial intelligences will in cinema, the computer has taken to murdering his human companions. He does even that in drab, monotonic fashion. We will never encounter the psychedelic shift to a higher state of being that closes 2001, but being put out of a job by a characterless sliver of software is something with which we can already identify.
“The history of artificial intelligence in cinema is a fascinating journey that mirrors our evolving relationship with technology,” the autogenerated alternative to this column tells me. Shame about the repetition of “history of artificial intelligence in cinema”, but there is something close to a banal truth here. It would perhaps be better to say that the fictional timeline mirrors the most extreme of our fears.
And those fears remain grimly unchanged. Mary Shelley’s Frankenstein is not quite an artificial-intelligence story, but the creature is indeed the creation of the eponymous misguided scientist. In truth the rampaging humanoid avenger owes more to cinematic adaptations than to Shelley’s philosophical misery-guts. We were, nonetheless, more than 200 years ago, already on a path to hubristic annihilation. It goes back further than that. Goethe’s The Sorcerer’s Apprentice, in which an enchanted helper brings unmanageable chaos, echoes a Greek story from the second century.
The Sorcerer’s Apprentice runs through the fevered misery of the Matrix films. It is certainly there in the Terminator franchise. There is a particularly horrid incarnation in Harlan Ellison’s great 1967 story I Have No Mouth, and I Must Scream: an entity that (spoiler) becomes so bored it annihilates all humanity bar the five humans it elects to torture for eternity.
After all that, it is scarcely a surprise that we have become so nervy about the eventual arrival of the real thing. Just look at the outrage that greeted the news that the makers of the recent horror film Late Night with the Devil had “experimented with AI for three still images”. It was as if they’d been caught collaborating with an occupying power. But this is not just paranoia. The recent US actors’ strike addressed a real fear that background actors could see their images replicated for decades to come. Workers in visual effects and related postproduction also know that their jobs are in danger. The programs (quelle horreur!) may even be coming for film reviewers.
As so often in human existence, the actual terrors are more insidious, more subtle and, yes, more tedious than anything Hollywood would bother imagining.