Fakebook: Why Deep Fakes Mean Deep Trouble

Video and audio are the most visceral media of human communication. From the movies that reduce us to tears to the music that lifts our spirits, what we see and hear has huge power to shape our beliefs and guide our behaviour.

We all know when we watch Star Trek or immerse ourselves in EDM that we are suspending reality in order to feel a thrill of escapism. But what if reality were suspended permanently?

The rise of “deepfake” technology has the power to fracture society’s ability to tell fact from fiction. The term “deepfake” refers to video or audio that has been altered with the aid of deep learning technology, usually to show a person doing something they never did or saying something they never said.

Though media has been artificially manipulated for decades, faster computers and easy-to-use, publicly available tools make convincing deepfakes increasingly easy to produce and spread online.

The most famous example is film director Jordan Peele’s 2018 deepfake of President Obama (below), created to sound the alarm about the potential abuse of the technology. As a film director, Peele is well placed to speak of the power of video and audio to manipulate emotions and persuade us to see events the way the creator wants us to.

Many experts have recently raised their heads above the parapet and begun publicly expressing concern. “There are a few phenomena that come together that make deepfakes particularly troubling when they’re provocative and destructive,” said Danielle Citron, a law professor at the University of Maryland. “We know that as human beings, video and audio is so visceral, we tend to believe what our eyes and ears are telling us.” Citron was talking about deepfakes in the context of politics, and how a foreign government might release fake videos to sow chaos in democracies and make citizens believe things that never happened.

But technology expert Jamie Bartlett has recently expressed the opposite concern: that the most damaging effect of the rise of deepfakes may not be that we are all duped into believing fakes, but that we will become so cynical that we will believe nothing at all.

“If everything is potentially a fake then everything can be dismissed as a lie.” If a future Trump is caught saying “grab ’em by the pussy”, he will simply proclaim: “It’s a deepfake!”

What Can We Do To Protect Democracy?

A recent hearing of the U.S. House Intelligence Committee sought expert advice on the best means for governments to respond to deepfakes. Professor Citron contrasted two recent viral examples. The first was a video of Speaker Nancy Pelosi in which her voice was doctored to make her sound drunk while delivering a speech. The second was a parody video of Mark Zuckerberg by the artist Bill Posters, in which Zuckerberg is synthetically made to say he controls the world’s data and therefore controls the world.

Citron suggested it was right for the Pelosi video to be removed while the Zuckerberg video was allowed to stay online:

“For something like a video where it’s clearly doctored and an impersonation, not satire, not parody, it should be removed… [but] there are wonderful uses for deepfakes that are art, historical, sort of rejuvenating for people to create them about themselves…”

The moral and legal principle Citron seemed to be suggesting is that deepfakes should be permitted where a reasonable person could recognise them as fake, equivalent to a piece of satire or fictional art, but prohibited where the primary purpose of the video is to deceive and injure.

David Doermann, former program manager at DARPA (the Defense Advanced Research Projects Agency), echoed this sentiment and added that he believes another layer of verification will be needed online. He advocated a new law requiring social media companies to delay the publishing of videos until some initial verification can be done, akin to Facebook’s ads verification process.

“There’s no reason why these things have to be instantaneous… we’ve done it for child pornography, we’ve done it for human trafficking.”

Public debate continues to rage over what legal measures should be implemented to protect our democracies from falsification and confusion. But a strong consensus is at least emerging that there is a need to act, and to act fast.

As the political theorist Hannah Arendt wrote in the 1950s, the ideal condition for authoritarianism to thrive is a society where “the distinction between fact and fiction, true and false, no longer exists.”

The health of democracies all over the world will depend on finding ways to re-establish the truth and authenticity of video and audio content. I’ll leave you with this final quote from Jamie Bartlett on what we can expect if regulation is not urgently implemented:

“In the face of constant and endless deep fakes and deep denials, the only rational response from the citizen will be extreme cynicism and apathy about the very idea of truth itself. They will conclude that nothing is to be trusted except her own gut instinct and existing political loyalties…”

 

 


Your Phone is Designed to Control You And Your Life

An alarming new report from The Economist exposes the extent to which tech companies are exploiting our psychological impulses to keep us hooked to our smartphones.

 

The influence that tech products exert over our behaviour often goes over our heads. Tristan Harris, a former Google employee and a leading advocate of ethical design in tech, explains:

 “Companies say, we’re just getting better at giving people what they want. But the average person checks their phone 150 times a day. Is each one a conscious choice? No. Companies are getting better at getting people to make the choices they want them to make.”

Behaviour Design 

How have companies mastered this? It all stems from the expert study of “Persuasive Technology Design”, an illustrious programme spearheaded by Professor BJ Fogg of Stanford University which has produced everyone from the creators of Instagram to the people at the top of tech at Apple and Google.

Be it the emails that induce you to buy right away, the apps and games that rivet your attention, or the online forms that nudge you towards one decision over another: all are designed to hack the human brain and capitalise on its instincts, quirks and flaws. The techniques they use are often crude and blatantly manipulative, but they are getting steadily more refined, and, as they do so, less noticeable.

And it’s not just tech companies adopting these tactics. Even banking and insurance companies have started modelling their customer interface design along the lines of Candy Crush.

“It’s about looping people into these flows of incentive and reward. Your coffee at Starbucks, your education software, your credit card, the meds you need for your diabetes. Every consumer interface is becoming like a slot machine.”

It’s a startling phenomenon of the digital age and something we should all be conscious of. We wouldn’t let our family or friends become addicted to gambling, so why don’t we care about addiction to social media, which to the brain is much the same thing?

The exciting explosion of smartphone technology has overshadowed the questioning of its potentially more pernicious effects, and we have nonchalantly accepted the terms and conditions without reading the small print.

Watch Tristan Harris explain how it works in more detail below:

 

Read the Economist article in full here:

https://www.1843magazine.com/features/the-scientists-who-make-apps-addictive

Why I’d Vote For Corbyn

Professor Noam Chomsky speaks to BBC Newsnight to discuss the anger that has raged across the middle and working classes of Western democracies since the economic collapse of 2008.

Discussing the roots of that anger, the rise of far-right nationalism, and the optimistic signs of youth galvanisation around progressive policies on climate change and income inequality, Chomsky explains why he would vote for Jeremy Corbyn in the UK general election in the context of Brexit.

This is a riveting interview with one of the world’s best-known progressive public intellectuals, offering interesting insights into the global order and the future of Western democracy.

Wikipedia Proves Fake News Hysteria is Bullsh*t

Katherine Maher, executive director of the Wikimedia Foundation, discusses how Wikipedia went from a site loaded with errors and false information to the world’s trusted open encyclopedia.

Through a process of constant self-improvement and a dedication to ensuring accurate information, Wikipedia shows that sorting fact from fiction is a much easier job than public figures such as Hillary Clinton and Donald Trump have made out.

Maher suggests that the way news is consumed and information is spread is more of a problem than fake news itself. It is the profiteering, commercial model of clickbait and truth-stretching, as companies and individuals fight for our screen time, that must be the focal point of the fake news debate.

She argues that the product design is flawed and that the major providers need to take a stand on how information is presented to consumers, and allow the quick removal of what is fake, just as Wikipedia has done:

“When I’m looking at a Facebook feed I don’t know why information is being presented to me. Is it because it’s timely? Is it because it’s relevant? Is it because it’s trending, popular, important?
All of that is stripped out of context so it’s hard for me to assess: is it good information that I should make decisions on? Is it bad information that I should ignore? And then you think about the fact that all of the other sort of heuristics that people use to interpret information, where does it come from? Who wrote it? When was it published? All of that is obscured in the product design as well.”

So is fake news really the problem, or is this an obfuscation of what is really causing the spread of misinformation?

How to Stop Wasting Your Days on Facebook

It’s an addiction. A stimulation we crave. Yet it can really inhibit the quality of our lives and our ability to focus on hard tasks. Most of us would admit we spend far too much time aimlessly drifting through newsfeeds, but how do we beat it?

Author Charles Duhigg believes we must treat it like any other ingrained habit: accept that we have a dependency and slowly try to wean ourselves off.

This can be done by scheduling timeslots in the day for social media use, removing automatic notification alerts we don’t need, and slowly building a new habit of focus.

How Social Media is Shaping Our Thought Patterns

In this extraordinary clip, Dr. Dan Siegel, clinical professor of psychiatry at the UCLA School of Medicine, reveals how social media is physically rewiring our brains.

The addictive nature of social media has become starkly apparent, as anyone who takes public transport will be aware. Yet its capacity to manipulate and reshape our brains is not often discussed, and is something parents should be particularly aware of when exposing their children to smartphones.

 

Why Millennials Can’t be Happy 

Simon Sinek explores the reasons why Millennials are getting a bad reputation: from growing up with a toxic addiction to social media to the sense of entitlement cultivated in today’s youth through instant gratification and pampering.

This talk is a riveting insight into the potential damage of overuse of social media at a young age and poses some stark questions about how young people today will cope with the harshness of the working world.