Video and audio are the most visceral media of human communication. From the movies that reduce us to tears to the music that lifts our spirits, what we see and hear has enormous power to shape our beliefs and guide our behaviour.
We all know when we watch Star Trek or immerse ourselves in EDM that we are suspending reality in order to feel a thrill of escapism. But what if reality were suspended permanently?
The rise of “deepfake” technology has the power to fracture society’s ability to tell fact from fiction. The term ‘deepfake’ refers to video or audio that has been altered with the aid of deep learning technology, typically to show a person doing something they never did, or saying something they never said.
Though media has been artificially manipulated for decades, faster computers and easy-to-use, publicly available technology make convincing deepfakes increasingly easy to produce and proliferate online.
The most famous example is film director Jordan Peele’s 2018 deepfake of President Obama (below), created to sound the alarm about the potential abuse of the technology. As a film director, Peele is well placed to speak of the power of video and audio to manipulate emotions and persuade us to see events the way the creator wants us to.
Many experts have recently raised their heads above the parapet and begun publicly expressing concern. “There are a few phenomena that come together that make deepfakes particularly troubling when they’re provocative and destructive,” said Danielle Citron, a law professor at the University of Maryland. “We know that as human beings, video and audio is so visceral, we tend to believe what our eyes and ears are telling us.” Citron was talking about deepfakes in the context of politics, and how a foreign government might release fake videos to sow chaos in democracies and make citizens believe things that never happened.
But technology expert Jamie Bartlett has recently expressed the opposite concern: that the most damaging effect of the rise of deepfakes may not be that we are all duped into believing fakes, but that we will become so cynical that we will believe nothing at all.
“If everything is potentially a fake, then everything can be dismissed as a lie.” If a future Trump is caught saying “grab ’em by the pussy”, he will simply proclaim: “It’s a deepfake!”
What Can We Do To Protect Democracy?
A recent hearing of the U.S. House Intelligence Committee sought expert advice on the best means for governments to respond to deepfakes. Professor Citron contrasted two recent viral examples. The first was a video of Speaker Nancy Pelosi in which her voice was doctored to make her sound drunk while delivering a speech. The second was a parody video of Mark Zuckerberg by the artist Bill Posters, in which Zuckerberg is synthetically made to say that he controls the world’s data and therefore controls the world.
Citron suggested it was right for the Pelosi video to be removed while the Zuckerberg video was allowed to stay online:
“For something like a video where it’s clearly a doctoring and impersonation, not satire, not parody, it should be removed… [but] there are wonderful uses for deepfakes that are art, historical, sort of rejuvenating for people to create them about themselves…”
The moral and legal principle Citron seemed to be suggesting is that deepfakes should be permitted in instances where a reasonable person would be able to recognise the content as fake, equivalent to a piece of satire or fictional art, but prohibited in instances where the primary purpose of the video is to deceive and injure.
David Doermann, a former project manager at DARPA (the Defense Advanced Research Projects Agency), echoed this sentiment and added that he believes another layer of verification will be needed online. He advocated a new law requiring social media companies to delay the publishing of videos until some initial verification can be done, akin to Facebook’s ads verification process.
“There’s no reason why these things have to be instantaneous… we’ve done it for child pornography, we’ve done it for human trafficking.”
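To make the idea concrete, the delayed-publication model Doermann describes can be sketched as a simple moderation queue: uploads sit in a pending state and nothing goes live until a verification check has run. This is a minimal illustrative sketch, not any platform’s actual system; the `verifier` stub is hypothetical and stands in for whatever forensic deepfake detection a real platform would plug in.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Status(Enum):
    PENDING = auto()
    PUBLISHED = auto()
    REJECTED = auto()


@dataclass
class Video:
    video_id: str
    uploader: str
    status: Status = Status.PENDING


class ModerationQueue:
    """Hold every upload in a pending state until a verification check runs."""

    def __init__(self, verifier):
        # verifier: a callable Video -> bool; in a real system this would be
        # a forensic deepfake-detection step (hypothetical here).
        self.verifier = verifier
        self.pending: list[Video] = []

    def upload(self, video: Video) -> Status:
        # No upload is published instantly; it waits for review.
        self.pending.append(video)
        return video.status

    def review_all(self) -> None:
        # Run the verification check on everything waiting in the queue.
        for video in self.pending:
            video.status = Status.PUBLISHED if self.verifier(video) else Status.REJECTED
        self.pending.clear()


# Demo with a stub verifier that approves everything.
queue = ModerationQueue(verifier=lambda v: True)
clip = Video("clip-001", "alice")
queue.upload(clip)
print(clip.status)   # still PENDING: not yet live
queue.review_all()
print(clip.status)   # PUBLISHED only after the check has run
```

The design point is simply that publication becomes a two-step state transition rather than an instantaneous one, which is all the proposed delay requires.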
Public debate continues to rage as to what legal measures should be implemented to protect our democracies from falsification and confusion. But there is at least strong consensus emerging that there is a need to act and to act fast.
As the political theorist Hannah Arendt wrote in the 1950s, the ideal condition for authoritarianism to thrive is a society where “the distinction between fact and fiction, true and false, no longer exists.”
The health of democracies all over the world will depend on finding ways to re-establish the truth and authenticity of video and audio content. I’ll leave you with this final quote from Jamie Bartlett on what we can expect if regulation is not urgently implemented:
“In the face of constant and endless deepfakes and deep denials, the only rational response from the citizen will be extreme cynicism and apathy about the very idea of truth itself. They will conclude that nothing is to be trusted except their own gut instinct and existing political loyalties…”