Politicians and leaders have frequently used body doubles throughout history. One of the most celebrated Irish cases was that of George Brendan Nolan of Ballinasloe.
As a child, he emigrated to New York with his mother, but returned to Dublin in 1921 at the age of 17 and joined the Abbey Theatre. He also became a courier and then doppelgänger for the revolutionary leader Michael Collins, before fleeing back to the US via Canada with a British bounty on his head.
Nolan adopted the new name George Brent. He proceeded to conquer Broadway and Hollywood, appearing in 88 films, 11 of which teamed him with Bette Davis, with whom he had a steamy two-year affair amid his five marriages. A licensed pilot, he largely avoided using a double for his screen stunts.
Nevertheless, stand-in doubles have been routine in Hollywood. Digitisation and computer-generated imagery are not only making stunts even more spectacular but also replacing human actors entirely with fully animated avatars in scenes from, for example, the Marvel superhero movies, The Lord of the Rings and The Hobbit.
Last year, the Writers Guild of America (WGA) and the actors’ union Sag-Aftra took on the Hollywood studios over the “dehumanising of the workforce” by digitisation and artificial intelligence (AI). The WGA was apprehensive that AI systems such as ChatGPT could be used to generate entire movie scripts, making human scriptwriters obsolete.
Sag-Aftra was concerned that for a very modest once-off payment, an actor’s “digital twin” could be used in perpetuity. In a settlement last October, the WGA won the right for writers to use AI only as they wished, rather than being forced to do so.
In December, Sag-Aftra won compensation royalties for human actors whose digital avatars are subsequently used.
Also in December, the CNN journalist and news anchor Anderson Cooper broadcast a striking TV clip (see https://edition.cnn.com/videos/business/2023/12/01/artificial-intelligence-deepfake-anderson-cooper-actws-vpx.cnn) in which he noted the profound potential impact of AI on many industries, including on journalists and news anchors like himself.
He then teased that perhaps AI already had done so. He confessed that “what you just saw and heard a moment ago was not actually me”, but rather an identical digital avatar presenting as Cooper, even though he had never spoken the words it used. He challenged viewers to ask how they can any longer know what is real and what is not.
Entirely fictitious influencers are now becoming popular on social media networks. Extremely realistic digital avatars, such as the virtual supermodels Shudu Gram and Aitana López, are earning substantial endorsements for their creators and agencies without the complications of managing the foibles of human models and artists.
This year will see multiple political elections worldwide, which may have a profound effect on societies, the global economy, and perhaps even democracy itself. Brazil, India, South Africa and numerous others will have national elections, the EU will hold its parliamentary election, and Russia a presidential election. Both Ireland and the UK may have a general election this year, and the US presidential election takes place in November.
Digital mischief in these elections is not only imaginable but eminently achievable. Anderson Cooper noted that his digital doppelgänger had been created in just a few weeks by a student contractor using widely available open-source tools. We should not be surprised if credible but fake videos appear of certain politicians making outlandish assertions, particularly in the run-up to election days, when there is little time left for fact-checking or for the footage to be debunked as bogus.
Politicians and their advisers may well see opportunities for positive uses of digital doppelgängers, rather than just for political dirty tricks.
A photorealistic avatar, with skin blemishes and wrinkles removed, could be a youthful, vibrant virtual alternative to its human political embodiment. Such a political avatar – I call it an “avatician” – could then be used in video footage, digitally animated with appropriate body movements.
Like Anderson Cooper’s avatar, an avatician could deliver statements and speeches convincingly indistinguishable from its human doppelgänger.
As I explored in an article last January on how ChatGPT can be encouraged to mimic various Irish political leaders, an avatician could readily be trained to respond to arbitrary questions posed to it by the public or by journalists, in the style of its human equivalent.
[ Meet my new online friend: Michael D GPT ]
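For the technically curious, the sketch below shows in broad strokes how a large language model can be prompted to answer questions in a chosen persona’s style. It is a minimal illustration only, assuming the OpenAI Python client; the persona description, the model choice and the ask_avatician helper are illustrative assumptions rather than a description of any real avatician.

```python
# Minimal sketch: prompting a large language model to answer questions in a
# chosen persona's style. The persona text, model name and helper function
# are illustrative assumptions, not a real political "avatician".
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

PERSONA = (
    "You answer questions in the rhetorical style of a fictional Irish "
    "political leader: formal, courteous and fond of literary allusion. "
    "Always make clear that you are an AI persona, not a real person."
)

def ask_avatician(question: str) -> str:
    """Return a persona-styled answer to an arbitrary question."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_avatician("What is your position on the housing crisis?"))
```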
It does not take much imagination to envisage an avatician surviving the death of its human twin. Some historical regimes have been known to use body doubles to sustain an administration after its leader passes, including for Joseph Stalin by the KGB and for Kim Jong-il by the North Korean administration. An entirely credible, speaking, moving, AI-generated virtual double could now conceivably carry out such a deception just as well, thus enabling a dictator’s administration to remain in power.