How deepfakes undermine truth and threaten democracy

  • 0:00 - 0:04
    [This talk contains mature content]
  • 0:06 - 0:09
    Rana Ayyub is a journalist in India
  • 0:09 - 0:12
    whose work has exposed
    government corruption
  • 0:12 - 0:15
    and human rights violations.
  • 0:15 - 0:16
    And over the years,
  • 0:16 - 0:19
    she's gotten used to vitriol
    and controversy around her work.
  • 0:20 - 0:25
    But none of it could have prepared her
    for what she faced in April 2018.
  • 0:26 - 0:29
    She was sitting in a cafe with her friend
    when she first saw it.
  • 0:30 - 0:35
    A two-minute-twenty-second video
    of her engaged in a sex act.
  • 0:35 - 0:38
    And she couldn't believe her eyes.
  • 0:38 - 0:40
    She had never made a sex video.
  • 0:41 - 0:44
    But unfortunately, thousands
    upon thousands of people
  • 0:44 - 0:46
    would believe it was her.
  • 0:47 - 0:50
    I interviewed Miss Ayyub
    about three months ago
  • 0:50 - 0:52
    in connection with my book
    on sexual privacy.
  • 0:53 - 0:56
    I'm a law professor, lawyer
    and civil rights advocate.
  • 0:56 - 1:01
    So it's incredibly frustrating
    knowing that right now
  • 1:01 - 1:03
    law could do very little to help her.
  • 1:03 - 1:05
    And as we talked,
  • 1:05 - 1:09
    she explained that she should have seen
    the fake sex video coming.
  • 1:10 - 1:16
    She said, "After all, sex is so often used
    to demean and to shame women,"
  • 1:16 - 1:18
    especially minority women
  • 1:18 - 1:22
    and especially minority women
    who dare to challenge powerful men
  • 1:22 - 1:24
    as she had in her work.
  • 1:25 - 1:29
    The fake sex video went viral in 48 hours.
  • 1:30 - 1:35
    All of her online accounts were flooded
    with screenshots of the video,
  • 1:35 - 1:38
    with graphic rape and death threats,
  • 1:38 - 1:41
    and with slurs about her Muslim faith.
  • 1:41 - 1:46
    Online posts suggested that
    she was "available" for sex.
  • 1:46 - 1:48
    And she was doxed,
  • 1:48 - 1:50
    which means that her home address
    and her cell phone number
  • 1:50 - 1:52
    were spread across the internet.
  • 1:53 - 1:57
    The video was shared
    more than 40,000 times.
  • 1:58 - 2:02
    Now, when someone is targeted
    with this kind of cyber-mob attack
  • 2:02 - 2:04
    the harm is profound.
  • 2:04 - 2:08
    Rana Ayyub's life was turned upside-down.
  • 2:08 - 2:12
    For weeks, she could hardly eat or speak.
  • 2:12 - 2:16
    She stopped writing and closed
    all of her social media accounts,
  • 2:16 - 2:19
    which is, you know, a tough thing to do
    when you're a journalist.
  • 2:19 - 2:23
    And she was afraid to go outside
    her family's home.
  • 2:23 - 2:26
    What if the posters
    made good on their threats?
  • 2:26 - 2:30
    The UN Council on Human Rights
    confirmed that she wasn't being crazy.
  • 2:31 - 2:32
    It issued a public statement,
  • 2:32 - 2:35
    saying that they were worried
    about her safety.
  • 2:37 - 2:40
    What Rana Ayyub faced was a deepfake.
  • 2:41 - 2:44
    Machine-learning technology
  • 2:44 - 2:48
    that manipulates or fabricates
    audio and video recordings
  • 2:48 - 2:50
    to show people doing and saying things
  • 2:50 - 2:52
    that they never did or said.
  • 2:53 - 2:55
    Deepfakes appear authentic and realistic,
  • 2:55 - 2:58
    but they're not, they're total falsehoods.
  • 2:59 - 3:03
    Although the technology
    is still developing in its sophistication,
  • 3:03 - 3:05
    it is widely available.
  • 3:05 - 3:08
    Now the most recent attention
    to deepfakes arose,
  • 3:08 - 3:12
    as so many things do online,
    with pornography.
  • 3:12 - 3:15
    In early 2018,
  • 3:15 - 3:17
    someone posted a tool on Reddit
  • 3:17 - 3:21
    to allow users to insert faces
    into porn videos.
  • 3:22 - 3:25
    And what followed was a cascade
    of fake porn videos,
  • 3:25 - 3:28
    featuring people's favorite
    female celebrities.
  • 3:29 - 3:32
    And today, you can go on YouTube
    and pull up countless tutorials
  • 3:32 - 3:34
    with step-by-step instructions
  • 3:35 - 3:38
    on how to make a deepfake
    on your desktop application.
  • 3:38 - 3:42
    And soon we may be even able
    to make them on our cell phones.
  • 3:43 - 3:48
    Now, it's the interaction
    of some of our basic human frailties
  • 3:48 - 3:50
    and network tools
  • 3:50 - 3:53
    that can turn deepfakes into weapons.
  • 3:53 - 3:54
    So let me explain.
  • 3:55 - 3:56
    As human beings,
  • 3:56 - 4:00
    we have a visceral reaction
    to audio and video.
  • 4:00 - 4:02
    We believe they're true
    on the notion that,
  • 4:02 - 4:06
    of course you can believe
    what your eyes and ears are telling you.
  • 4:06 - 4:08
    And it's that mechanism
  • 4:08 - 4:12
    that might undermine our shared
    sense of reality.
  • 4:12 - 4:16
    Although we believe deepfakes
    to be true, they're not.
  • 4:16 - 4:20
    And we're attracted
    to the salacious, the provocative.
  • 4:20 - 4:23
    We tend to believe
    and to share information
  • 4:23 - 4:25
    that's negative and novel.
  • 4:26 - 4:29
    And researchers have found
    that online hoaxes
  • 4:29 - 4:32
    spread 10 times faster
    than accurate stories.
  • 4:34 - 4:38
    Now, we're also drawn to information
  • 4:38 - 4:41
    that aligns with our viewpoints.
  • 4:41 - 4:44
    Psychologists call that tendency
    confirmation bias.
  • 4:45 - 4:50
    And social media platforms
    supercharge that tendency,
  • 4:50 - 4:54
    by allowing us to instantly and widely
    share information
  • 4:54 - 4:56
    that accords with our viewpoints.
  • 4:57 - 4:59
    Now, deepfakes have the potential
  • 4:59 - 5:02
    to cause grave individual
    and societal harm.
  • 5:03 - 5:05
    So imagine a deepfake
  • 5:05 - 5:09
    that shows American soldiers
    in Afghanistan burning a Koran.
  • 5:11 - 5:14
    You can imagine that that deepfake
    would provoke violence
  • 5:14 - 5:15
    against those soldiers.
  • 5:16 - 5:19
    And what if the very next day
  • 5:19 - 5:21
    there's another deepfake that drops,
  • 5:21 - 5:24
    that shows a well-known Imam
    based in London
  • 5:24 - 5:27
    praising the attack on those soldiers?
  • 5:28 - 5:31
    We might see violence and civil unrest,
  • 5:31 - 5:34
    not only in Afghanistan
    and the United Kingdom,
  • 5:34 - 5:36
    but across the globe.
  • 5:36 - 5:37
    And you might say to me,
  • 5:37 - 5:40
    "Come on, Danielle, that's far-fetched."
  • 5:40 - 5:41
    But it's not.
  • 5:41 - 5:43
    We've seen falsehoods spread
  • 5:44 - 5:46
    on WhatsApp and other online
    message services
  • 5:46 - 5:49
    lead to violence
    against ethnic minorities.
  • 5:49 - 5:51
    And that was just text.
  • 5:51 - 5:53
    Imagine if it were video.
  • 5:55 - 6:00
    Now, deepfakes have the potential
    to corrode the trust that we have
  • 6:00 - 6:02
    in democratic institutions.
  • 6:03 - 6:06
    So imagine the night before an election.
  • 6:06 - 6:09
    There's a deepfake showing
    one of the major party candidates
  • 6:09 - 6:10
    gravely sick.
  • 6:11 - 6:14
    The deepfake could tip the election
  • 6:14 - 6:17
    and shake our sense
    that elections are legitimate.
  • 6:19 - 6:22
    Imagine if the night before
    an initial public offering
  • 6:22 - 6:24
    of a major global bank
  • 6:24 - 6:27
    there was a deepfake
    showing the bank's CEO
  • 6:27 - 6:30
    drunkenly spouting conspiracy theories.
  • 6:31 - 6:34
    The deepfake could tank the IPO,
  • 6:34 - 6:38
    and worse, shake our sense
    that financial markets are stable.
  • 6:39 - 6:43
    So deepfakes can exploit and magnify
  • 6:43 - 6:46
    the deep distrust that we already have
  • 6:46 - 6:51
    in politicians, business leaders
    and other influential leaders.
  • 6:51 - 6:54
    They find an audience
    primed to believe them.
  • 6:55 - 6:59
    And the pursuit of truth
    is on the line as well.
  • 6:59 - 7:03
    Technologists expect
    that with advances in AI
  • 7:03 - 7:06
    soon it may be difficult,
    if not impossible,
  • 7:06 - 7:10
    to tell a difference
    between a real video and a fake one.
  • 7:11 - 7:13
    So how can the truth emerge
  • 7:13 - 7:16
    in a deepfake-ridden
    marketplace of ideas?
  • 7:17 - 7:20
    Will we just proceed along
    the path of least resistance
  • 7:20 - 7:23
    and believe what we want to believe,
  • 7:23 - 7:24
    truth be damned?
  • 7:25 - 7:28
    And not only might we believe the fakery,
  • 7:28 - 7:31
    we might start disbelieving the truth.
  • 7:32 - 7:36
    We've already seen people invoke
    the phenomenon of deepfakes
  • 7:36 - 7:40
    to cast doubt on real evidence
    of their wrongdoing.
  • 7:40 - 7:46
    We've heard politicians say of audio
    of their disturbing comments,
  • 7:46 - 7:48
    "Come on, that's fake news.
  • 7:48 - 7:52
    You can't believe what your eyes
    and ears are telling you."
  • 7:52 - 7:54
    And it's that risk
  • 7:54 - 7:59
    that Professor Robert Chesney and I
    call the liar's dividend.
  • 8:00 - 8:03
    The risk that liars will invoke deepfakes
  • 8:03 - 8:06
    to escape accountability
    for their wrongdoing.
  • 8:07 - 8:10
    So we've got our work cut out for us,
    there's no doubt about it.
  • 8:11 - 8:14
    And we're going to need
    a proactive solution
  • 8:14 - 8:17
    from tech companies, from law makers,
  • 8:17 - 8:19
    law enforcers and the media.
  • 8:20 - 8:24
    And we're going to need
    a healthy dose of societal resilience.
  • 8:26 - 8:29
    So right now, we're engaged
    in a very public conversation
  • 8:29 - 8:32
    about the responsibility
    of tech companies.
  • 8:33 - 8:36
    And my advice to social media platforms
  • 8:36 - 8:40
    has been to change their terms of service
    and community guidelines
  • 8:40 - 8:42
    to ban deepfakes that cause harm.
  • 8:43 - 8:44
    Now, that determination,
  • 8:44 - 8:47
    that's going to require human judgment,
  • 8:47 - 8:48
    and it's expensive.
  • 8:49 - 8:51
    But we need human beings
  • 8:51 - 8:55
    to look at the content
    and context of a deepfake
  • 8:55 - 8:59
    to figure out if it is
    a harmful impersonation,
  • 8:59 - 9:03
    or instead, if it's valuable
    satire, art or education.
  • 9:04 - 9:06
    So now, what about the law?
  • 9:07 - 9:09
    Law is our educator.
  • 9:10 - 9:14
    It teaches us about
    what's harmful and what's wrong.
  • 9:14 - 9:18
    And it shapes behavior; it deters
    by punishing perpetrators
  • 9:18 - 9:20
    and securing remedies for victims.
  • 9:21 - 9:22
    Now right now,
  • 9:22 - 9:25
    law is not up to the
    challenge of deepfakes.
  • 9:26 - 9:28
    Across the globe,
  • 9:28 - 9:30
    we lack well-tailored laws
  • 9:30 - 9:34
    that would be designed to tackle
    digital impersonations
  • 9:34 - 9:36
    that invade sexual privacy,
  • 9:36 - 9:39
    that damage reputations
    and that cause emotional distress.
  • 9:40 - 9:44
    What happened to Rana Ayyub
    is increasingly commonplace.
  • 9:44 - 9:46
    Yet, when she went
    to law enforcement in Delhi,
  • 9:46 - 9:48
    she was told nothing could be done.
  • 9:49 - 9:51
    And the sad truth
  • 9:51 - 9:55
    is that the same would be true
    in the United States and in Europe.
  • 9:55 - 10:00
    So we have a legal vacuum
    that needs to be filled.
  • 10:00 - 10:02
    My colleague, Dr. Mary Anne Franks and I
  • 10:02 - 10:06
    are working with US law makers
    to devise legislation
  • 10:06 - 10:09
    that would ban harmful
    digital impersonations
  • 10:09 - 10:12
    that are tantamount to identity theft.
  • 10:12 - 10:14
    And we've seen similar moves
  • 10:14 - 10:18
    in Iceland, the UK and Australia.
  • 10:18 - 10:21
    But of course, that's just a small piece
    of the regulatory puzzle.
  • 10:23 - 10:26
    Now, I know law is not a cure-all, right.
  • 10:26 - 10:28
    It's a blunt instrument.
  • 10:28 - 10:30
    And we've got to use it wisely.
  • 10:30 - 10:33
    It also has some practical impediments.
  • 10:34 - 10:39
    You can't leverage law against people
    you can't identify and find.
  • 10:39 - 10:43
    And if a perpetrator lives
    outside the country
  • 10:43 - 10:45
    where a victim lives,
  • 10:45 - 10:46
    then you may not be able to insist
  • 10:46 - 10:49
    that the perpetrator
    come into local courts
  • 10:49 - 10:50
    to face justice.
  • 10:50 - 10:54
    And so we're going to need
    a coordinated international response.
  • 10:56 - 10:59
    Education has to be part
    of our response as well.
  • 11:00 - 11:05
    Law enforcers are not going to
    enforce laws they don't know about,
  • 11:05 - 11:08
    or address problems
    they don't understand.
  • 11:08 - 11:11
    In my research on cyber stalking,
  • 11:11 - 11:14
    I found that law enforcement
    lacked the training
  • 11:14 - 11:17
    to understand the laws available to them
  • 11:17 - 11:19
    and the problem of online abuse.
  • 11:19 - 11:22
    And so often they told victims,
  • 11:22 - 11:26
    "Just turn your computer off,
    ignore it, it will go away."
  • 11:26 - 11:29
    And we saw that in Rana Ayyub's case.
  • 11:29 - 11:33
    She was told, "Come on,
    you're making such a big deal about this.
  • 11:33 - 11:34
    It's boys being boys."
  • 11:35 - 11:40
    And so we need to pair new legislation
    with efforts at training.
  • 11:42 - 11:45
    And education has to be aimed
    at the media as well.
  • 11:46 - 11:50
    Journalists need educating
    about the phenomenon of deepfakes
  • 11:50 - 11:54
    so they don't amplify and spread them.
  • 11:55 - 11:57
    And this is a part
    where we're all involved.
  • 11:57 - 12:01
    Each and every one of us needs educating.
  • 12:01 - 12:05
    We click, we share, we like,
    and we don't even think about it.
  • 12:06 - 12:07
    We need to do better.
  • 12:08 - 12:11
    We need far better radar for fakery.
  • 12:14 - 12:18
    So, as we're working
    through these solutions,
  • 12:18 - 12:20
    there's going to be
    a lot of suffering to go around.
  • 12:21 - 12:24
    Rana Ayyub is still wrestling
    with the fallout.
  • 12:25 - 12:29
    She still doesn't feel free
    to express herself online and offline.
  • 12:30 - 12:31
    And as she told me,
  • 12:31 - 12:36
    she still feels like there are thousands
    of eyes on her naked body.
  • 12:36 - 12:40
    Even though, intellectually,
    she knows it wasn't her body.
  • 12:40 - 12:43
    And she has frequent panic attacks.
  • 12:43 - 12:47
    Especially when someone she doesn't know
    tries to take her picture.
  • 12:47 - 12:50
    "What if they're going to make
    another deepfake," she thinks to herself.
  • 12:51 - 12:55
    And so for the sake of individuals
    like Rana Ayyub,
  • 12:55 - 12:57
    and the sake of our democracy,
  • 12:57 - 13:00
    we need to do something right now.
  • 13:00 - 13:01
    Thank you.
  • 13:01 - 13:03
    (Applause)
Title:
How deepfakes undermine truth and threaten democracy
Speaker:
Danielle Citron
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
13:16
