ADL International Leadership Award Presented to Sacha Baron Cohen at Never Is Now 2019
-
0:02 - 0:04Thank you very much.
Thanks to everyone who stood up. -
0:05 - 0:07Unnecessary but very flattering.
-
0:07 - 0:10Thank you, Jonathan,
for your very kind words. -
0:10 - 0:12Thank you, the Anti-Defamation League,
-
0:12 - 0:15for this recognition
-
0:15 - 0:18and your work in fighting racism,
hate and bigotry. -
0:19 - 0:20And to be clear,
-
0:20 - 0:22when I say racism, hate, and bigotry
-
0:23 - 0:26I'm not referring to the names
of Stephen Miller's labradoodles. -
0:27 - 0:28[audience laughs]
-
0:30 - 0:34Now, I realize that
some of you may be thinking: -
0:34 - 0:38"What the hell is a comedian doing
speaking at a conference like this?" -
0:39 - 0:40I certainly am.
-
0:40 - 0:41[audience laughs]
-
0:41 - 0:45I've spent most of the past two decades
in character. -
0:45 - 0:50In fact, this is the first ever time
that I've stood up and given a speech -
0:50 - 0:53as my least popular character:
Sacha Baron Cohen. -
0:53 - 0:54[audience laughs]
-
0:54 - 0:56[audience cheers]
-
0:57 - 1:01And, I have to confess, it is terrifying.
-
1:03 - 1:07I realize that my presence here
may also be unexpected for another reason: -
1:08 - 1:11At times, some critics
have said my comedy -
1:11 - 1:14risks reinforcing old stereotypes.
-
1:14 - 1:19The truth is, I've been passionate about
challenging bigotry and intolerance -
1:19 - 1:20throughout my life.
-
1:21 - 1:22As a teenager in England,
-
1:22 - 1:25I marched against
the Fascist National Front -
1:25 - 1:27and to abolish apartheid.
-
1:27 - 1:30As an undergraduate,
I traveled around America -
1:30 - 1:33and wrote my thesis
about the Civil Rights Movement -
1:33 - 1:35with the help of the archives of the ADL;
-
1:36 - 1:39and as a comedian,
I've tried to use my characters -
1:39 - 1:42to get people to let down their guard
-
1:42 - 1:44and reveal what they actually believe,
-
1:44 - 1:46including their own prejudice.
-
1:46 - 1:49Now, I'm not gonna claim
everything I've done -
1:49 - 1:50has been for a higher purpose.
-
1:51 - 1:53Yes, some of my comedy –
-
1:53 - 1:54okay, probably half my comedy –
-
1:54 - 1:56has been absolutely juvenile.
-
1:56 - 1:57[audience laughs]
-
1:57 - 2:00And the other half completely puerile.
-
2:00 - 2:02But...
[audience laughs] -
2:02 - 2:05I admit there was nothing
particularly enlightening about me -
2:05 - 2:07as Borat from Kazakhstan,
-
2:07 - 2:09the first 'fake news' journalist,
-
2:09 - 2:10[audience laughs]
-
2:10 - 2:13running through a conference
of mortgage brokers -
2:13 - 2:14while I was completely naked.
-
2:14 - 2:15[audience laughs]
-
2:16 - 2:20But when Borat was able to get
an entire bar in Arizona to sing -
2:20 - 2:22♪ throw the Jew down the well ♪
-
2:22 - 2:23[audience laughs]
-
2:23 - 2:27it did reveal people's indifference
to anti-Semitism. -
2:27 - 2:31When, as Bruno, the gay
fashion reporter from Austria, -
2:32 - 2:35I started kissing a man
in a cage fight in Arkansas, -
2:35 - 2:37nearly starting a riot,
-
2:37 - 2:40it showed the violent potential
of homophobia. -
2:41 - 2:44And, when disguised
as an ultra-woke developer, -
2:44 - 2:48I proposed building a mosque
in one rural community, -
2:48 - 2:51prompting a resident to proudly admit:
-
2:51 - 2:54"I am racist against Muslims,"
-
2:54 - 2:57it showed the growing acceptance
of Islamophobia. -
2:58 - 3:02That's why I really appreciate
the opportunity to be here with you. -
3:02 - 3:04Today, around the world,
-
3:04 - 3:07demagogues appeal
to our worst instincts. -
3:07 - 3:11Conspiracy theories,
once confined to the fringe, -
3:11 - 3:12are going mainstream.
-
3:13 - 3:18It's as if the age of reason,
the era of evidential argument -
3:18 - 3:19is ending,
-
3:19 - 3:22and now knowledge
is increasingly delegitimized -
3:22 - 3:25and scientific consensus is dismissed.
-
3:26 - 3:29Democracy,
which depends on shared truths, -
3:30 - 3:31is in retreat,
-
3:31 - 3:34and autocracy,
which depends on shared lies, -
3:34 - 3:36is on the march.
-
3:36 - 3:38Hate crimes are surging,
-
3:38 - 3:41as are murderous attacks
on religious and ethnic minorities. -
3:42 - 3:45Now, what do all these
dangerous trends have in common? -
3:46 - 3:47I'm just a comedian and an actor,
-
3:47 - 3:51I'm not a scholar,
but one thing is pretty clear to me: -
3:52 - 3:55All this hate and violence
is being facilitated -
3:56 - 3:58by a handful of Internet companies
-
3:58 - 4:02that amount to the greatest
propaganda machine in history. -
4:02 - 4:04[audience applauds]
-
4:07 - 4:09The greatest
propaganda machine in history. -
4:09 - 4:12Let's think about it.
Facebook, YouTube... -
4:12 - 4:14...Google, Twitter and others –
-
4:14 - 4:16they reach billions of people.
-
4:16 - 4:18The algorithms
these platforms depend on -
4:19 - 4:20deliberately amplify
-
4:21 - 4:23the type of content
that keeps users engaged; -
4:24 - 4:27stories that appeal
to our baser instincts, -
4:27 - 4:29and that trigger outrage and fear.
-
4:30 - 4:35It's why YouTube recommended videos
by the conspiracist Alex Jones -
4:35 - 4:37billions of times.
-
4:37 - 4:40It's why fake news outperforms real news,
-
4:41 - 4:44because studies show that
lies spread faster than truth. -
4:45 - 4:48And it's no surprise that the greatest
propaganda machine in history -
4:49 - 4:52has spread the oldest
conspiracy theory in history: -
4:52 - 4:55the lie that Jews are somehow dangerous.
-
4:56 - 4:58As one headline put it:
-
4:58 - 5:01"Just think what Goebbles
could have done with Facebook." -
5:01 - 5:03[audience groans]
-
5:03 - 5:04On the internet,
-
5:04 - 5:07everything can appear equally legitimate.
-
5:07 - 5:10Breitbart resembles the BBC,
-
5:10 - 5:13the fictitious Protocols
of the Elders of Zion -
5:13 - 5:16look as valid as an ADL report,
-
5:16 - 5:19and the rantings of a lunatic
seem as credible -
5:19 - 5:21as the findings of a Nobel Prize winner.
-
5:22 - 5:26We have lost, it seems,
a shared sense of basic facts -
5:26 - 5:29upon which democracy depends.
-
5:29 - 5:32When I, as the wannabe gangsta Ali G,
-
5:32 - 5:35asked the astronaut Buzz Aldrin:
-
5:35 - 5:37"what woz it like to walk on de sun??"
-
5:38 - 5:40[audience laughs]
-
5:42 - 5:47the joke worked because,
we, the audience, shared the same facts. -
5:47 - 5:50If you believe the moon landing was a hoax
-
5:50 - 5:52the joke doesn't work.
-
5:52 - 5:56When Borat got that bar
in Arizona to agree that, -
5:56 - 6:01♪ Jews control everybody's money
and they never give it back ♪ -
6:02 - 6:05the joke worked because
the audience shared the fact -
6:05 - 6:08that the depiction of Jews as miserly
-
6:09 - 6:13is a conspiracy theory
originating in the Middle Ages. -
6:13 - 6:17But when, thanks to social media,
conspiracies take hold, -
6:17 - 6:20it is easier for hate groups to recruit,
-
6:20 - 6:23easier for foreign intelligence agencies
-
6:23 - 6:25to interfere in our elections,
-
6:25 - 6:32and easier for a country like Myanmar
to commit genocide against the Rohingya. -
6:33 - 6:36[audience applauds]
-
6:39 - 6:41Now, it's actually quite shocking
-
6:41 - 6:45how easy it is to turn
conspiracy thinking into violence. -
6:45 - 6:47In my last show, "Who Is America?"
-
6:47 - 6:51I found an educated, normal guy
who'd held down a good job, -
6:51 - 6:53but who, on social media,
-
6:53 - 6:56repeated many of the conspiracy theories
-
6:56 - 6:58that President Trump, using Twitter,
-
6:59 - 7:05has spread more than 1700 times
to his 67 million Twitter followers. -
7:05 - 7:10The president even tweeted that
he was considering designating Antifa -
7:11 - 7:13who are anti-fascists
who march against the far-right, -
7:14 - 7:16as a terror organization.
-
7:16 - 7:21So, disguised as an Israeli
anti-terrorism expert, -
7:21 - 7:23Colonel Erran Morad,
-
7:23 - 7:24[audience laughs]
-
7:25 - 7:27[in a foreign accent]
"Yalla. Let go" -
7:27 - 7:28[audience laughs]
-
7:28 - 7:30Disguised as him,
I told my interviewee -
7:30 - 7:32that at the Women's March in San Francisco
-
7:33 - 7:38Antifa were plotting to put hormones
into babies' diapers in order to -
7:38 - 7:41[in a foreign accent]
"make them transgender," -
7:41 - 7:42[audience laughs]
-
7:42 - 7:43And this man believed it.
-
7:44 - 7:49I instructed him to plant small devices
on three innocent people at the march, -
7:49 - 7:51and explained that,
when he pushed a button, -
7:51 - 7:53he'd trigger an explosion
that would kill them all. -
7:54 - 7:57They weren't real explosives,
of course, but he thought they were. -
7:57 - 8:00I wanted to see, would he actually do it?
-
8:00 - 8:02The answer was yes.
-
8:03 - 8:07He pushed the button and thought
he had actually killed three human beings. -
8:07 - 8:10Voltaire was right when he said,
-
8:10 - 8:14"Those who can make you
believe absurdities -
8:14 - 8:17"can make you commit atrocities."
-
8:18 - 8:24And social media lets authoritarians
push absurdities to billions of people. -
8:25 - 8:26In their defense,
-
8:26 - 8:30these social media companies
have taken some steps to reduce hate -
8:30 - 8:32and conspiracies on their platforms,
-
8:32 - 8:35but these steps
have been mainly superficial. -
8:35 - 8:37And I'm talking about this today
-
8:37 - 8:40because I believe that
our pluralistic democracies -
8:41 - 8:44are on a precipice
and that the next 12 months, -
8:44 - 8:47and the role of social media,
could be determinant. -
8:48 - 8:50British voters will go to the polls
-
8:50 - 8:52while online conspiracists
-
8:52 - 8:56promote the despicable theory
of the "great replacement," -
8:56 - 8:59that white Christians
are being deliberately replaced -
8:59 - 9:01by Muslim immigrants.
-
9:02 - 9:04Americans will vote for president
-
9:04 - 9:11while trolls and bots perpetuate
the disgusting lie of a Hispanic invasion. -
9:12 - 9:16And after years of YouTube videos
calling climate change a hoax, -
9:16 - 9:19the United States is on track,
a year from now, -
9:20 - 9:23to formally withdraw
from the Paris Accords. -
9:23 - 9:27A sewer of bigotry
and vile conspiracy theories -
9:27 - 9:31that threaten our democracy,
and to some degree, our planet. -
9:32 - 9:36This can't possibly be what
the creators of the internet had in mind. -
9:36 - 9:41I believe that it's time for
a fundamental rethink of social media -
9:41 - 9:44and how it spreads
hate, conspiracies and lies. -
9:45 - 9:48(Audience cheers; applauds)
-
9:53 - 9:58Last month, however,
Mark Zuckerberg of Facebook -
9:58 - 10:00delivered a major speech
-
10:00 - 10:04that, not surprisingly,
warned against new laws and regulations -
10:04 - 10:06on companies like his.
-
10:06 - 10:10Well, some of these arguments
are simply... -
10:10 - 10:12...pardon my French, bullshit.
-
10:13 - 10:14Let's count the ways.
-
10:14 - 10:18First, Zuckerberg
tried to portray this whole issue as -
10:18 - 10:21"choices around free expression."
-
10:21 - 10:23That is ludicrous.
-
10:23 - 10:27This is not about
limiting anyone's free speech. -
10:27 - 10:28This is about giving people –
-
10:28 - 10:32including some of the most
reprehensible people on earth – -
10:32 - 10:36the biggest platform in history
to reach a third of the planet. -
10:36 - 10:40Freedom of speech
is not freedom of reach. -
10:40 - 10:42Sadly, there will always be racists,
-
10:42 - 10:45misogynists,
anti-Semites and child abusers. -
10:45 - 10:47But I think we can all agree
-
10:48 - 10:50that we shouldn't be
giving bigots and pedophiles -
10:51 - 10:55a free platform to amplify their views
and target their victims. -
10:55 - 10:58[audience applauds]
-
11:01 - 11:04Second, Mark Zuckerberg
claimed that new limits -
11:04 - 11:06on what's posted on social media
-
11:06 - 11:09would be
"to pull back on free expression." -
11:10 - 11:11This is utter nonsense.
-
11:12 - 11:15The First Amendment
says that, and I quote: -
11:15 - 11:18"Congress shall make no law...
abridging freedom of speech." -
11:18 - 11:23However, this does not apply
to private businesses like Facebook. -
11:23 - 11:27We're not asking these companies
to determine the boundaries of free speech -
11:27 - 11:28across society,
-
11:29 - 11:32we just want them
to be responsible on their platforms. -
11:33 - 11:36If a neo-Nazi comes
goose-stepping into a restaurant -
11:36 - 11:39and starts threatening other customers
and says he wants to kill Jews, -
11:40 - 11:42would the owner of the restaurant,
a private business, -
11:43 - 11:47be required to serve him
an elegant 8-course meal? -
11:47 - 11:49Of course not!
-
11:49 - 11:52The restaurant owner
has every legal right – -
11:52 - 11:55and indeed, I would argue,
a moral obligation – -
11:55 - 11:56to kick that Nazi out.
-
11:57 - 11:59And so do these internet companies.
-
11:59 - 12:01[audience applauds]
-
12:05 - 12:06Now third.
-
12:06 - 12:10Mark Zuckerberg seemed to equate
regulation of companies like his -
12:10 - 12:14to the actions of
"the most repressive societies." -
12:15 - 12:16Incredible.
-
12:16 - 12:21This from one of the six people
who decide what information -
12:21 - 12:23so much of the world sees:
-
12:24 - 12:26Zuckerberg at Facebook,
-
12:26 - 12:28Sundar Pichai at Google,
-
12:28 - 12:32at its parent company Alphabet,
Larry Page and Sergey Brin, -
12:32 - 12:36Brin’s ex-sister-in-law,
Susan Wojcicki at YouTube, -
12:36 - 12:38and Jack Dorsey at Twitter.
-
12:38 - 12:40The Silicon Six.
-
12:41 - 12:43All billionaires, all Americans,
-
12:43 - 12:46who care more
about boosting their share price -
12:47 - 12:49than about protecting democracy.
-
12:49 - 12:51[audience applauds]
-
12:54 - 12:57This is ideological imperialism.
-
12:58 - 13:01Six unelected individuals
in Silicon Valley -
13:01 - 13:04imposing their vision
on the rest of the world, -
13:04 - 13:07unaccountable to any government
-
13:07 - 13:10and acting like
they're above the reach of law. -
13:10 - 13:12It's like we're living
in the Roman Empire, -
13:12 - 13:14and Mark Zuckerberg is Caesar.
-
13:15 - 13:17At least that would explain his haircut.
-
13:17 - 13:19[audience laughs]
-
13:19 - 13:21Now here's an idea.
-
13:22 - 13:26Instead of letting the Silicon Six
decide the fate of the world, -
13:27 - 13:29let our elected representatives,
-
13:30 - 13:34voted for by the people
of every democracy in the world, -
13:34 - 13:36have at least some say.
-
13:37 - 13:39Fourth, Zuckerberg
speaks of welcoming -
13:39 - 13:42"a diversity of ideas,"
-
13:42 - 13:44and last year he gave us an example.
-
13:44 - 13:48He said that he found
posts denying the Holocaust -
13:48 - 13:49"deeply offensive,"
-
13:50 - 13:52but he didn't think
Facebook should take them down -
13:52 - 13:57"because I think there are things
that different people get wrong." -
13:57 - 14:02At this very moment, there are still
Holocaust deniers on Facebook, -
14:02 - 14:04and Google still takes you
-
14:04 - 14:08to the most repulsive
Holocaust denial sites -
14:08 - 14:09with a simple click.
-
14:10 - 14:15One of the heads of Google, in fact,
told me that these sites just show -
14:15 - 14:17"both sides" of the issue.
-
14:17 - 14:18[audience groans]
-
14:18 - 14:19This is madness.
-
14:19 - 14:21To quote Edward R. Murrow:
-
14:21 - 14:25One "cannot accept
that there are, on every story, -
14:25 - 14:28"two equal and logical sides
to an argument." -
14:28 - 14:32We have, unfortunately,
millions of pieces of evidence -
14:32 - 14:33for the Holocaust.
-
14:33 - 14:36It is an historical fact.
-
14:36 - 14:39And denying it is not
some random opinion. -
14:39 - 14:43Those who deny the Holocaust
aim to encourage another one. -
14:44 - 14:46[audience applauds]
-
14:51 - 14:55Still, Zuckerberg says that
"people should decide what is credible, -
14:55 - 14:57"not tech companies."
-
14:57 - 15:00But at a time
when two-thirds of Millennials
15:00 - 15:03say they haven't even heard of Auschwitz,
-
15:03 - 15:05how are they supposed to know
what's credible? -
15:06 - 15:11How are they supposed to know
that the lie is a lie? -
15:11 - 15:14There is such a thing as objective truth.
-
15:14 - 15:16Facts do exist.
-
15:17 - 15:20And if these internet companies
really want to make a difference -
15:20 - 15:24they should hire enough monitors
to actually monitor, -
15:24 - 15:28work closely with groups like the ADL
and the NAACP, -
15:29 - 15:35insist on facts, and purge these lies
and conspiracies from their platforms. -
15:35 - 15:37[audience applauds]
-
15:40 - 15:45Now, fifth, when discussing
the difficulty of removing content, -
15:45 - 15:46Zuckerberg...
-
15:47 - 15:50Mark Zuckerberg asked,
"where do you draw the line?" -
15:51 - 15:55Yes, drawing the line can be difficult,
-
15:55 - 15:57but here's what he's really saying:
-
15:57 - 16:00removing more
of these lies and conspiracies -
16:00 - 16:02is just too expensive.
-
16:02 - 16:05These are
the richest companies in the world, -
16:05 - 16:07and they have
the best engineers in the world. -
16:08 - 16:11They could fix these problems
if they wanted to. -
16:11 - 16:14Twitter could deploy an algorithm
-
16:14 - 16:17to remove more
white supremacist hate speech, -
16:17 - 16:19but they reportedly haven't
-
16:20 - 16:23because it would eject
some very prominent politicians -
16:23 - 16:24from their platform.
-
16:24 - 16:26[audience groans]
-
16:26 - 16:28Maybe that wouldn't be such a bad thing.
-
16:28 - 16:30[audience applauds, cheers]
-
16:36 - 16:37The truth is
-
16:38 - 16:40these companies
won't fundamentally change -
16:40 - 16:42because their entire business model
-
16:43 - 16:46relies on generating more engagement,
-
16:46 - 16:48and nothing generates more engagement
-
16:48 - 16:51than lies, fear and outrage.
-
16:51 - 16:53So it's time
to finally call these companies -
16:53 - 16:57what they really are:
the largest publishers in history. -
16:57 - 16:59So here's an idea for them:
-
16:59 - 17:02abide by basic standards and practices
-
17:02 - 17:07just like newspapers,
magazines and TV news do everyday. -
17:07 - 17:11We have standards and practices
in television and the movies. -
17:11 - 17:14There are certain things
we cannot say or do. -
17:14 - 17:16In England, I was told that Ali G
-
17:16 - 17:19couldn't curse
when he appeared before 9 p.m. -
17:20 - 17:24Here in the US,
the Motion Picture Association of America -
17:24 - 17:26regulates and rates what we see.
-
17:27 - 17:31I've had scenes in my movies cut
or reduced to abide by those standards. -
17:32 - 17:36If there are standards and practices
for what cinemas and TV channels can show -
17:36 - 17:40then, surely
companies that publish material -
17:40 - 17:41to billions of people
-
17:41 - 17:44should have to abide by
basic standards and practices too. -
17:44 - 17:45[audience applauds]
-
17:50 - 17:53Take the issue of political ads,
-
17:54 - 17:56on which Facebook have been resolute.
-
17:56 - 17:59Fortunately, Twitter finally banned them,
-
17:59 - 18:02and Google, today I read,
is making changes too. -
18:03 - 18:04But, if you pay them,
-
18:05 - 18:09Facebook will run
any political ad you want, -
18:09 - 18:10even if it's a lie.
-
18:11 - 18:15And they'll even help you
micro-target those ads to their users -
18:15 - 18:17for maximum effect.
-
18:17 - 18:20Under this twisted logic,
-
18:20 - 18:23if Facebook were around in the 1930s
-
18:23 - 18:26it would have allowed Hitler
to post 30-second ads -
18:26 - 18:29on his solution to the "Jewish problem."
-
18:29 - 18:32So here's a good standard and practice:
-
18:32 - 18:39Facebook, start fact-checking
political ads before you run them, -
18:39 - 18:43stop micro-targeted lies immediately,
-
18:43 - 18:44and when the ads are false,
-
18:44 - 18:47give back the money
and don't publish them. -
18:48 - 18:50[audience applauds]
-
18:55 - 18:57Here's another good practice:
-
18:57 - 18:59slow down.
-
19:00 - 19:04Every single post doesn't need
to be published immediately. -
19:04 - 19:07Oscar Wilde once said:
"We live in an age -
19:07 - 19:11"when unnecessary things
are our only necessity." -
19:11 - 19:12But...
-
19:13 - 19:16But, let me ask you,
is having every thought -
19:16 - 19:19or video posted instantly online,
-
19:19 - 19:21even if it's racist
or criminal or murderous, -
19:22 - 19:24really a necessity?
-
19:24 - 19:25Of course not.
-
19:25 - 19:29The shooter who
massacred Muslims in New Zealand -
19:29 - 19:32live streamed his atrocity on Facebook
-
19:32 - 19:34where it then spread across the internet
-
19:34 - 19:37and was viewed likely millions of times.
-
19:37 - 19:42It was a snuff film,
brought to you by social media. -
19:42 - 19:44Why can't we have more of a delay
-
19:45 - 19:47so that this trauma-inducing filth
-
19:47 - 19:52can be caught and stopped
before it's posted in the first place? -
19:52 - 19:55[audience applauds]
-
19:57 - 20:01Finally, Zuckerberg said that
social media companies should -
20:01 - 20:04"live up to their responsibilities,"
-
20:04 - 20:09but he's totally silent about
what should happen when they don't. -
20:09 - 20:12By now, it's pretty clear
-
20:12 - 20:15they can't be trusted
to regulate themselves. -
20:16 - 20:19As with the Industrial Revolution,
it's time for regulation -
20:19 - 20:24and legislation to curb the greed
of these high-tech robber barons. -
20:25 - 20:27[audience applauds]
-
20:30 - 20:32In every other industry,
-
20:32 - 20:36a company can be held liable
when their product is defective. -
20:36 - 20:40When engines explode
or seat belts malfunction, -
20:40 - 20:44car companies recall
tens of thousands of vehicles -
20:44 - 20:46at a cost of billions of dollars.
-
20:47 - 20:50It only seems fair
to say to Facebook, YouTube and Twitter: -
20:51 - 20:53Your product is defective,
-
20:53 - 20:55you are obliged to fix it,
-
20:55 - 20:57no matter how much it costs
-
20:57 - 21:01and no matter how many moderators
you need to employ. -
21:02 - 21:05[audience applauds, cheers]
-
21:08 - 21:11In every under-- sorry
-
21:11 - 21:15In every other industry
you can be sued for the harm you cause. -
21:15 - 21:20Publishers can be sued for libel,
people can be sued for defamation. -
21:20 - 21:23I've been sued many times!
-
21:23 - 21:24[audience laughs]
-
21:24 - 21:26I'm being sued right now
-
21:27 - 21:31by someone whose name I won't mention
because he might sue me again! -
21:31 - 21:32[audience laughs]
-
21:32 - 21:37But social media companies
are largely protected from liability -
21:37 - 21:39for the content their users post –
-
21:39 - 21:41no matter how indecent it is –
-
21:41 - 21:44by Section 230 of, get ready for it,
-
21:44 - 21:47the Communications Decency Act.
-
21:47 - 21:48It's absurd!
-
21:49 - 21:54Fortunately, internet companies can now
be held responsible for pedophiles -
21:54 - 21:57who use their site to target children.
-
21:57 - 21:58So I say,
-
21:58 - 22:00let's also hold these companies
-
22:00 - 22:03responsible for those who use their sites
-
22:03 - 22:06to advocate
for the mass murder of children -
22:06 - 22:08because of their race or religion.
-
22:09 - 22:10[audience applauds]
-
22:15 - 22:17And maybe fines are not enough.
-
22:18 - 22:24Maybe it's time to tell Mark Zuckerberg
and the CEOs of these companies -
22:24 - 22:28you already allowed one foreign power
-
22:28 - 22:29to interfere in our elections.
-
22:30 - 22:34You already facilitated
one genocide in Myanmar. -
22:34 - 22:37Do it again and you go to jail.
-
22:37 - 22:38[audience applauds]
-
22:43 - 22:48In the end, it all comes down to
what kind of world we want. -
22:49 - 22:53In his speech, Zuckerberg
said that one of his main goals -
22:53 - 22:58is to "uphold as wide a definition
of freedom of expression as possible." -
22:59 - 23:00It sounds good.
-
23:00 - 23:04Yet our freedoms
are not only an end in themselves. -
23:04 - 23:07They're also the means to another end –
-
23:07 - 23:09as you say here in the US:
-
23:09 - 23:13the right to life, liberty
and the pursuit of happiness. -
23:13 - 23:19But today these rights are threatened
by hate, conspiracies and lies. -
23:19 - 23:24So allow me to leave you with a suggestion
for a different aim for society: -
23:25 - 23:27The ultimate aim of society
-
23:27 - 23:31should be to make sure
that people are not targeted, -
23:31 - 23:34not harassed and not murdered
-
23:34 - 23:37because of who they are,
where they come from, -
23:37 - 23:39who they love or how they pray.
-
23:40 - 23:42[audience applauds, cheers]
-
23:55 - 23:58If we make that our aim –
-
23:58 - 24:00if we prioritize truth over lies,
-
24:00 - 24:02tolerance over prejudice,
-
24:03 - 24:05empathy over indifference,
-
24:05 - 24:08and experts over ignoramuses –
-
24:08 - 24:09[audience laughs]
-
24:09 - 24:11then maybe, just maybe,
-
24:12 - 24:16we can stop the greatest
propaganda machine in history. -
24:16 - 24:18We can save democracy,
-
24:18 - 24:22we can still have a place
for free speech and free expression, -
24:23 - 24:27and most importantly,
my jokes will still work. -
24:27 - 24:28Thank you very much.
-
24:28 - 24:31[cheers and applause]
- Title:
- ADL International Leadership Award Presented to Sacha Baron Cohen at Never Is Now 2019
- Description:
-
Sacha Baron Cohen is the well-deserved recipient of ADL’s International Leadership Award, which goes to exceptional individuals who combine professional success with a profound personal commitment to community involvement and to crossing borders and barriers with a message of diversity and equal opportunity.
Over 100 years ago Supreme Court Justice Louis Brandeis wrote: “Sunlight is said to be the best disinfectant.” Through his alter egos, many of whom represent anti-Semites, racists and neo-Nazis, Baron Cohen shines a piercing light on people’s ignorance and biases.
https://www.adl.org/news/article/sacha-baron-cohens-keynote-address-at-adls-2019-never-is-now-summit-on-anti-semitism
copyright © 2019 ADL
- Video Language:
- English
- Duration:
- 24:44