-
[This talk contains mature content]
-
Rana Ayyub is a journalist in India
-
whose work has exposed
government corruption
-
and human rights violations.
-
And over the years,
-
she's gotten used to vitriol
and controversy around her work.
-
But none of it could have prepared her
for what she faced in April 2018.
-
She was sitting in a cafe with her friend
when she first saw it.
-
A two-minute-twenty-second video
of her engaged in a sex act.
-
And she couldn't believe her eyes.
-
She had never made a sex video.
-
But unfortunately, thousands
upon thousands of people
-
would believe it was her.
-
I interviewed Ms. Ayyub
about three months ago
-
in connection with my book
on sexual privacy.
-
I'm a law professor, lawyer
and civil rights advocate.
-
So it's incredibly frustrating
knowing that right now
-
law could do very little to help her.
-
And as we talked,
-
she explained that she should have seen
the fake sex video coming.
-
She said, "After all, sex is so often used
to demean and to shame women,"
-
especially minority women
-
and especially minority women
who dare to challenge powerful men
-
as she had in her work.
-
The fake sex video went viral in 48 hours.
-
All of her online accounts were flooded
with screenshots of the video,
-
with graphic rape and death threats,
-
and with slurs about her Muslim faith.
-
Online posts suggested that
she was "available" for sex.
-
And she was doxed,
-
which means that her home address
and her cell phone number
-
were spread across the internet.
-
The video was shared
more than 40,000 times.
-
Now, when someone is targeted
with this kind of cyber-mob attack,
-
the harm is profound.
-
Rana Ayyub's life was turned upside-down.
-
For weeks, she could hardly eat or speak.
-
She stopped writing and closed
all of her social media accounts,
-
which is, you know, a tough thing to do
when you're a journalist.
-
And she was afraid to go outside
her family's home.
-
What if the posters
made good on their threats?
-
The UN Human Rights Council
confirmed that she wasn't being crazy.
-
It issued a public statement,
-
saying that it was worried
about her safety.
-
What Rana Ayyub faced was a deepfake.
-
Machine-learning technology
-
that manipulates or fabricates
audio and video recordings
-
to show people doing and saying things
-
that they never did or said.
-
Deepfakes appear authentic and realistic,
-
but they're not; they're total falsehoods.
-
Although the technology
is still developing in sophistication,
-
it is widely available.
-
Now the most recent attention
to deepfakes arose,
-
as so many things do online,
with pornography.
-
In early 2018,
-
someone posted a tool on Reddit
-
to allow users to insert faces
into porn videos.
-
And what followed was a cascade
of fake porn videos,
-
featuring people's favorite
female celebrities.
-
And today, you can go on YouTube
and pull up countless tutorials
-
with step-by-step instructions
-
on how to make a deepfake
using a desktop application.
-
And soon we may even be able
to make them on our cell phones.
-
Now, it's the interaction
of some of our basic human frailties
-
and network tools
-
that can turn deepfakes into weapons.
-
So let me explain.
-
As human beings,
-
we have a visceral reaction
to audio and video.
-
We believe they're true
on the notion that,
-
of course, you can believe
what your eyes and ears are telling you.
-
And it's that mechanism
-
that might undermine our shared
sense of reality.
-
We believe deepfakes to be true
even though they're not.
-
And we're attracted
to the salacious, the provocative.
-
We tend to believe
and to share information
-
that's negative and novel.
-
And researchers have found
that online hoaxes
-
spread 10 times faster
than accurate stories.
-
Now, we're also drawn to information
-
that aligns with our viewpoints.
-
Psychologists call that tendency
confirmation bias.
-
And social media platforms
supercharge that tendency,
-
by allowing us to instantly and widely
share information
-
that accords with our viewpoints.
-
Now, deepfakes have the potential
-
to cause grave individual
and societal harm.
-
So imagine a deepfake
-
that shows American soldiers
in Afghanistan burning a Koran.
-
You can imagine that that deepfake
would provoke violence
-
against those soldiers.
-
And what if the very next day
-
there's another deepfake that drops,
-
that shows a well-known imam
based in London
-
praising the attack on those soldiers?
-
We might see violence and civil unrest,
-
not only in Afghanistan
and the United Kingdom,
-
but across the globe.
-
And you might say to me,
-
"Come on, Danielle, that's far-fetched."
-
But it's not.
-
We've seen falsehoods spread
-
on WhatsApp and other online
messaging services
-
lead to violence
against ethnic minorities.
-
And that was just text.
-
Imagine if it were video.
-
Now, deepfakes have the potential
to corrode the trust that we have
-
in democratic institutions.
-
So imagine the night before an election.
-
There's a deepfake showing
one of the major party candidates
-
gravely sick.
-
The deepfake could tip the election
-
and shake our sense
that elections are legitimate.
-
Imagine if the night before
an initial public offering
-
of a major global bank
-
there was a deepfake
showing the bank's CEO
-
drunkenly spouting conspiracy theories.
-
The deepfake could tank the IPO,
-
and worse, shake our sense
that financial markets are stable.
-
So deepfakes can exploit and magnify
-
the deep distrust that we already have
-
in politicians, business leaders
and other influential figures.
-
They find an audience
primed to believe them.
-
And the pursuit of truth
is on the line as well.
-
Technologists expect
that with advances in AI
-
soon it may be difficult,
if not impossible,
-
to tell the difference
between a real video and a fake one.
-
So how can the truth emerge
-
in a deepfake-ridden
marketplace of ideas?
-
Will we just proceed along
the path of least resistance
-
and believe what we want to believe,
-
truth be damned?
-
And not only might we believe the fakery,
-
we might start disbelieving the truth.
-
We've already seen people invoke
the phenomenon of deepfakes
-
to cast doubt on real evidence
of their wrongdoing.
-
We've heard politicians say of audio
of their disturbing comments,
-
"Come on, that's fake news.
-
You can't believe what your eyes
and ears are telling you."
-
And it's that risk
-
that Professor Robert Chesney and I
call the liar's dividend:
-
the risk that liars will invoke deepfakes
-
to escape accountability
for their wrongdoing.
-
So we've got our work cut out for us,
there's no doubt about it.
-
And we're going to need
a proactive solution
-
from tech companies, from lawmakers,
-
law enforcers and the media.
-
And we're going to need
a healthy dose of societal resilience.
-
Right now, we're engaged
in a very public conversation
-
about the responsibility
of tech companies.
-
And my advice to social media platforms
-
has been to change their terms of service
and community guidelines
-
to ban deepfakes that cause harm.
-
Now, that determination,
-
that's going to require human judgment,
-
and it's expensive.
-
But we need human beings
-
to look at the content
and context of a deepfake
-
to figure out if it is
a harmful impersonation,
-
or instead, if it's valuable
satire, art or education.
-
So now, what about the law?
-
Law is our educator.
-
It teaches us about
what's harmful and what's wrong.
-
And it shapes behavior: it deters
by punishing perpetrators
-
and securing remedies for victims.
-
Right now,
-
law is not up to the
challenge of deepfakes.
-
Across the globe,
-
we lack well-tailored laws
-
that would be designed to tackle
digital impersonations
-
that invade sexual privacy,
-
that damage reputations
and that cause emotional distress.
-
What happened to Rana Ayyub
is increasingly commonplace.
-
Yet, when she went
to law enforcement in Delhi,
-
she was told nothing could be done.
-
And the sad truth
-
is that the same would be true
in the United States and in Europe.
-
So we have a legal vacuum
that needs to be filled.
-
My colleague, Dr. Mary Anne Franks, and I
-
are working with US lawmakers
to devise legislation
-
that would ban harmful
digital impersonations
-
that are tantamount to identity theft.
-
And we've seen similar moves
-
in Iceland, the UK and Australia.
-
But of course, that's just a small piece
of the regulatory puzzle.
-
Now, I know law is not a cure-all, right?
-
It's a blunt instrument.
-
And we've got to use it wisely.
-
It also has some practical impediments.
-
You can't leverage law against people
you can't identify and find.
-
And if a perpetrator lives
outside the country
-
where a victim lives,
-
then you may not be able to insist
-
that the perpetrator
come into local courts
-
to face justice.
-
And so we're going to need
a coordinated international response.
-
Education has to be part
of our response as well.
-
Law enforcers are not going to
enforce laws they don't know about,
-
or address problems
they don't understand.
-
In my research on cyber stalking,
-
I found that law enforcement
lacked the training
-
to understand the laws available to them
-
and the problem of online abuse.
-
And so often they told victims,
-
"Just turn your computer off,
ignore it, it will go away."
-
And we saw that in Rana Ayyub's case.
-
She was told, "Come on,
you're making such a big deal about this.
-
It's boys being boys."
-
And so we need to pair new legislation
with efforts at training.
-
And education has to be aimed
at the media as well.
-
Journalists need educating
about the phenomenon of deepfakes
-
so they don't amplify and spread them.
-
And this is the part
where we're all involved.
-
Each and every one of us needs educating.
-
We click, we share, we like,
and we don't even think about it.
-
We need to do better.
-
We need far better radar for fakery.
-
So, as we're working
through these solutions,
-
there's going to be
a lot of suffering to go around.
-
Rana Ayyub is still wrestling
with the fallout.
-
She still doesn't feel free
to express herself online and offline.
-
And as she told me,
-
she still feels like there are thousands
of eyes on her naked body.
-
Even though, intellectually,
she knows it wasn't her body.
-
And she has frequent panic attacks,
-
especially when someone she doesn't know
tries to take her picture.
-
"What if they're going to make
another deepfake," she thinks to herself.
-
And so for the sake of individuals
like Rana Ayyub,
-
and for the sake of our democracy,
-
we need to do something right now.
-
Thank you.
-
(Applause)