Fakebook: Why Deep Fakes Mean Deep Trouble

Video and audio are the most visceral media of human communication. From the movies that reduce us to tears to the music that lifts our spirits, what we see and hear has huge power to shape our beliefs and guide our behaviour.

We all know when we watch Star Trek or immerse ourselves in EDM that we are suspending reality in order to feel a thrill of escapism. But what if reality were suspended permanently?

The rise of “deepfake” technology has the power to fracture society’s ability to tell fact from fiction. The term ‘deepfake’ refers to video or audio that has been altered with the aid of deep learning technology, usually to show a person doing something they never did or saying something they never said.

Though media has been artificially manipulated for decades, faster computers and easy-to-use, publicly available tools make convincing deepfakes increasingly easy to produce and spread online.

The most famous example is film director Jordan Peele’s 2018 deepfake of President Obama (below), created to sound the alarm about the potential abuse of the technology. As a film director, Peele is well placed to speak about the power of video and audio to manipulate emotions and persuade us to see events the way the creator wants us to.

Many experts have recently raised their heads above the parapet and begun publicly expressing concern. “There are a few phenomena that come together that make deepfakes particularly troubling when they’re provocative and destructive,” said Danielle Citron, a law professor at the University of Maryland. “We know that as human beings, video and audio is so visceral, we tend to believe what our eyes and ears are telling us.” Citron was talking about deepfakes in the context of politics, and how a foreign government might release fake videos to sow chaos in democracies and make citizens believe things that never happened.

But technology expert Jamie Bartlett has recently expressed the opposite concern: that the most damaging effect of the rise of deepfakes may not be that we are all duped into believing fakes, but that we will become so cynical that we will believe nothing at all.

“If everything is potentially a fake then everything can be dismissed as a lie.” If a future Trump is caught saying “grab ’em by the pussy”, he will simply proclaim: “It’s a deepfake!”

What Can We Do To Protect Democracy?

A recent hearing of the U.S. House Intelligence Committee sought expert advice on how governments can best respond to deepfakes. Professor Citron contrasted two recent viral examples. The first was a video of Speaker Nancy Pelosi doctored to make her sound drunk while delivering a speech. The second was a parody video of Mark Zuckerberg by the artist Bill Posters, in which Zuckerberg is synthetically made to say that he controls the world’s data and therefore controls the world.

Citron suggested it was right for the Pelosi video to be removed while the Zuckerberg video was allowed to stay online:

“For something like a video where it’s clearly doctored and an impersonation, not satire, not parody, it should be removed… [but] there are wonderful uses for deepfakes that are art, historical, sort of rejuvenating for people to create them about themselves…”

The moral and legal principle Citron seemed to be suggesting is that deepfakes should be permitted where a reasonable person would recognise them as fakes, equivalent to a piece of satire or fictional art, but prohibited where the primary purpose of the video is to deceive and injure.

David Doermann, a former program manager at DARPA (the Defense Advanced Research Projects Agency), echoed this sentiment and added that he believes another layer of verification will be needed online. He advocated a new law requiring social media companies to delay the publishing of videos until some initial verification can be done, akin to Facebook’s verification of ads.

“There’s no reason why these things have to be instantaneous… we’ve done it for child pornography, we’ve done it for human trafficking.”

Public debate continues to rage over what legal measures should be implemented to protect our democracies from falsification and confusion. But a strong consensus is at least emerging on the need to act, and to act fast.

As the political theorist Hannah Arendt wrote in the 1950s, the ideal conditions for authoritarianism to thrive are found in a society where “the distinction between fact and fiction, true and false, no longer exists.”

The health of democracies all over the world will depend on finding ways to re-establish the truth and authenticity of video and audio content. I’ll leave you with a final quote from Jamie Bartlett on what we can expect if regulation is not urgently implemented:

“In the face of constant and endless deepfakes and deep denials, the only rational response from the citizen will be extreme cynicism and apathy about the very idea of truth itself. They will conclude that nothing is to be trusted except [their] own gut instinct and existing political loyalties…”


Why Technology Changes Who We Trust

Trust is the foundation of all human connections. From brief encounters to intimate relationships, it governs almost every interaction we have with each other. I trust my housemates not to go into my room without asking, I trust the bank to keep my money safe, and I trust the pilot of my plane to fly us safely to our destination.

Rachel Botsman describes trust as “a confident relationship with the unknown”: the bridge that allows us to cross from a position of certainty to one of uncertainty and move forward in our lives.

Throughout history, trust has been the glue that allowed people to live together and flourish in cooperative societies. An absence, loss or betrayal of trust could spark violent and deadly consequences.

In recent decades the world has witnessed a radical shift in trust. We may be losing faith in global institutions and political leaders, but simultaneously millions of people are renting their homes to complete strangers on Airbnb, exchanging digital currencies like Bitcoin, or finding themselves trusting bots for help online. Botsman describes this shift as a new age of ‘distributed trust.’

Instead of a vertical relationship where trust flows upwards from individuals to hierarchical institutions, experts, authorities and regulators, today trust increasingly flows horizontally from individuals to networks, peers, friends, colleagues and fellow users.

If we are to benefit from this radical shift and not see a collapse of our institutions, we must understand the mechanics of how trust is built, managed, lost, and repaired in the digital age. To explain this new world, Botsman provides a detailed map of this uncharted landscape and explores what’s next for humanity.

Watch below:

And for a more detailed account listen here: https://play.acast.com/s/intelligencesquared/rachelbotsmanandhelenlewisontechnologyandtrust