
ADL International Leadership Award Presented to Sacha Baron Cohen at Never Is Now 2019

  • 0:02 - 0:04
    Thank you very much.
    Thanks to everyone who stood up.
  • 0:05 - 0:07
    Unnecessary but very flattering.
  • 0:07 - 0:10
    Thank you, Jonathan,
    for your very kind words.
  • 0:10 - 0:12
    Thank you, the Anti-Defamation League,
  • 0:12 - 0:15
    for this recognition
  • 0:15 - 0:18
    and your work in fighting racism,
    hate and bigotry.
  • 0:19 - 0:20
    And to be clear,
  • 0:20 - 0:22
    when I say racism, hate, and bigotry
  • 0:23 - 0:26
    I'm not referring to the names
    of Stephen Miller's labradoodles.
  • 0:27 - 0:28
    (Audience Laughs)
  • 0:30 - 0:34
    Now, I realize that some of you
    may be thinking:
  • 0:34 - 0:38
    "What the hell is a comedian doing
    speaking at a conference like this?"
  • 0:39 - 0:40
    I certainly am.
  • 0:40 - 0:41
    (Audience Laughs)
  • 0:41 - 0:45
    I've spent most of the past two decades
    in character.
  • 0:45 - 0:50
    In fact, this is the first ever time
    that I've stood up and given a speech
  • 0:50 - 0:53
    as my least popular character:
    Sacha Baron Cohen.
  • 0:53 - 0:54
    (Audience Laughs)
  • 0:54 - 0:55
    (Audience Cheers)
  • 0:57 - 1:01
    And, I have to confess, it is terrifying.
  • 1:03 - 1:07
    I realized that my presence here
    may also be unexpected for another reason:
  • 1:07 - 1:11
    At times some critics
    have said my comedy
  • 1:11 - 1:14
    risks reinforcing old stereotypes.
  • 1:14 - 1:19
    The truth is I've been passionate about
    challenging bigotry and intolerance
  • 1:19 - 1:20
    throughout my life.
  • 1:21 - 1:22
    As a teenager in England,
  • 1:22 - 1:25
    I marched against
    the Fascist National Front
  • 1:25 - 1:27
    and to abolish apartheid.
  • 1:27 - 1:30
    As an undergraduate,
    I traveled around America
  • 1:30 - 1:33
    and wrote my thesis
    about the Civil Rights Movement
  • 1:33 - 1:35
    with the help of the archives of the ADL;
  • 1:36 - 1:39
    and as a comedian,
    I've tried to use my characters
  • 1:39 - 1:42
    to get people to let down their guard
  • 1:42 - 1:44
    and reveal what they actually believe -
  • 1:44 - 1:46
    including their own prejudice.
  • 1:46 - 1:49
    I'm not gonna claim
    that everything I've done
  • 1:49 - 1:50
    has been for a higher purpose.
  • 1:51 - 1:53
    Yes, some of my comedy--
  • 1:53 - 1:54
    Okay, probably half my comedy,
  • 1:54 - 1:56
    has been absolutely juvenile.
  • 1:56 - 1:57
    (Audience Laughs)
  • 1:57 - 2:00
    And the other half completely puerile.
  • 2:00 - 2:02
    But...
    (Audience Laughs)
  • 2:02 - 2:04
    I admit there was nothing
    particularly enlightening
  • 2:04 - 2:07
    about me as Borat from Kazakhstan,
  • 2:07 - 2:08
    the first fake
  • 2:08 - 2:11
    news journalist, running through a
  • 2:11 - 2:13
    conference of mortgage brokers while I
  • 2:13 - 2:15
    was completely naked.
  • 2:15 - 2:18
    But when Borat was able to get an entire
  • 2:18 - 2:20
    bar in Arizona to sing ♪ Throw the Jew
  • 2:20 - 2:24
    down the well ♪ it did reveal
  • 2:24 - 2:27
    people's indifference to anti-Semitism.
  • 2:27 - 2:30
    When, as Bruno, the gay fashion reporter
  • 2:30 - 2:31
    from Austria,
  • 2:31 - 2:34
    I started kissing a man in a cage fight
  • 2:34 - 2:37
    in Arkansas, nearly starting a riot, it
  • 2:37 - 2:39
    showed the violent potential of
  • 2:39 - 2:42
    homophobia. And when disguised as an
  • 2:42 - 2:46
    ultra woke developer I proposed building
  • 2:46 - 2:48
    a mosque in one rural community
  • 2:48 - 2:51
    prompting a resident to proudly admit "I
  • 2:51 - 2:55
    am racist against Muslims" it showed the
  • 2:55 - 2:57
    growing acceptance of islamophobia.
  • 2:57 - 3:00
    That's why I really appreciate the
  • 3:00 - 3:02
    opportunity to be here with you. Today,
  • 3:02 - 3:06
    around the world, demagogues appeal to
  • 3:06 - 3:09
    our worst instincts. Conspiracy theories
  • 3:09 - 3:12
    once confined to the fringe are going
  • 3:12 - 3:15
    mainstream. It's as if the age of reason,
  • 3:15 - 3:19
    the era of evidential argument, is ending
  • 3:19 - 3:21
    and now knowledge is increasingly
  • 3:21 - 3:24
    delegitimized and scientific consensus is
  • 3:24 - 3:28
    dismissed. Democracy, which depends on
  • 3:28 - 3:31
    shared truths, is in retreat. And
  • 3:31 - 3:34
    autocracy, which depends on shared lies,
  • 3:34 - 3:38
    is on the march. Hate crimes are surging
  • 3:38 - 3:40
    as are murderous attacks on religious and
  • 3:40 - 3:43
    ethnic minorities. Now what do all these
  • 3:43 - 3:45
    dangerous trends have in common?
  • 3:45 - 3:47
    I'm just a comedian and an actor. I'm not a
  • 3:47 - 3:51
    scholar, but one thing is pretty clear to
  • 3:51 - 3:54
    me: all this hate and violence is being
  • 3:54 - 3:57
    facilitated by a handful of Internet
  • 3:57 - 4:00
    companies that amount to the greatest
  • 4:00 - 4:04
    propaganda machine in history.
  • 4:04 - 4:05
    The greatest propaganda machine
  • 4:06 - 4:09
    in history.
  • 4:09 - 4:10
    Let's think about it: Facebook,
  • 4:10 - 4:13
    YouTube, Google, Twitter and others,
  • 4:13 - 4:16
    they reach billions of people. The
  • 4:16 - 4:18
    algorithms these platforms depend on
  • 4:18 - 4:22
    deliberately amplify the type of content
  • 4:22 - 4:25
    that keeps users engaged: stories that
  • 4:25 - 4:27
    appeal to our baser instincts and that
  • 4:27 - 4:30
    trigger outrage and fear. It's why
  • 4:30 - 4:33
    YouTube recommended videos by the
  • 4:33 - 4:36
    conspiracist Alex Jones billions of
  • 4:36 - 4:39
    times. It's why fake news outperforms
  • 4:39 - 4:42
    real news, because studies show that lies
  • 4:42 - 4:46
    spread faster than truth. And it's no
  • 4:46 - 4:47
    surprise that the greatest propaganda
  • 4:47 - 4:48
    machine in history
  • 4:48 - 4:51
    has spread the oldest conspiracy theory
  • 4:51 - 4:54
    in history: the lie that Jews are somehow
  • 4:54 - 4:58
    dangerous. As one headline put it,
  • 4:58 - 5:00
    "Just think what Goebbels could have done
  • 5:00 - 5:04
    with Facebook." On the internet, everything
  • 5:04 - 5:08
    can appear equally legitimate. Breitbart
  • 5:08 - 5:11
    resembles the BBC; the fictitious
  • 5:11 - 5:13
    Protocols of the Elders of Zion look as
  • 5:13 - 5:17
    valid as an ADL report; and the rantings
  • 5:17 - 5:19
    of a lunatic seem as credible as the
  • 5:19 - 5:22
    findings of a Nobel Prize winner. We have
  • 5:22 - 5:25
    lost, it seems, a shared sense of basic
  • 5:25 - 5:29
    facts upon which democracy depends. When
  • 5:29 - 5:32
    I, as the wannabe gangsta Ali G, asked the
  • 5:32 - 5:36
    astronaut Buzz Aldrin what it was
  • 5:36 - 5:39
    like to walk on the Sun,
  • 5:41 - 5:45
    the joke worked because we, the audience,
  • 5:45 - 5:48
    shared the same facts. If you believe the
  • 5:48 - 5:51
    moon landing was a hoax, the joke doesn't
  • 5:51 - 5:55
    work. When Borat got that bar in Arizona
  • 5:55 - 5:58
    to agree that Jews control everybody's
  • 5:58 - 6:02
    money and they never give it back, the
  • 6:02 - 6:05
    joke worked because the audience shared
  • 6:05 - 6:07
    the fact that the depiction of Jews as
  • 6:07 - 6:10
    miserly is a conspiracy theory
  • 6:10 - 6:13
    originating in the Middle Ages. But when,
  • 6:13 - 6:16
    thanks to social media, conspiracies take
  • 6:16 - 6:19
    hold, it is easier for hate groups to
  • 6:19 - 6:22
    recruit, easier for foreign intelligence
  • 6:22 - 6:25
    agencies to interfere in our elections,
  • 6:25 - 6:29
    and easier for a country like Myanmar to
  • 6:29 - 6:34
    commit genocide against the Rohingya.
  • 6:38 - 6:41
    Now, it's actually quite shocking how
  • 6:41 - 6:44
    easy it is to turn conspiracy thinking
  • 6:44 - 6:46
    into violence. In my last show, Who Is
  • 6:46 - 6:47
    America?,
  • 6:47 - 6:50
    I found an educated, normal guy who had
  • 6:50 - 6:53
    held down a good job, but who on social
  • 6:53 - 6:55
    media repeated many of the conspiracy
  • 6:55 - 6:58
    theories that President Trump, using
  • 6:58 - 7:01
    Twitter, has spread more than 1,700 times
  • 7:01 - 7:05
    to his 67 million Twitter followers. The
  • 7:05 - 7:07
    president even tweeted that he was
  • 7:07 - 7:11
    considering designating Antifa, who are
  • 7:11 - 7:13
    anti-fascists who march against the
  • 7:13 - 7:17
    far right, as a terror organization. So,
  • 7:17 - 7:19
    disguised as an Israeli
  • 7:19 - 7:25
    anti-terrorism expert, Colonel Erran Morad
  • 7:25 - 7:28
    ("Yalla, let's go!"),
  • 7:28 - 7:30
    disguised as him, I told my interviewee
  • 7:30 - 7:32
    that, at the Women's March in San
  • 7:32 - 7:34
    Francisco, Antifa were plotting to put
  • 7:34 - 7:38
    hormones into babies' diapers in order to
  • 7:38 - 7:42
    "make them transgender," and this man
  • 7:42 - 7:45
    believed it. I instructed him to plant
  • 7:45 - 7:47
    small devices on three innocent people
  • 7:47 - 7:49
    at the march and explained that when he
  • 7:49 - 7:50
    pushed a button,
  • 7:50 - 7:52
    he'd trigger an explosion that would
  • 7:52 - 7:54
    kill them all. They weren't real
  • 7:54 - 7:56
    explosives, of course, but he thought they
  • 7:56 - 7:59
    were. I wanted to see: would he actually
  • 7:59 - 8:03
    do it? The answer was yes. He pushed the
  • 8:03 - 8:05
    button and thought he had actually
  • 8:05 - 8:08
    killed three human beings. Voltaire was
  • 8:08 - 8:12
    right when he said, "Those who can make
  • 8:12 - 8:15
    you believe absurdities can make you
  • 8:15 - 8:20
    commit atrocities," and social media lets
  • 8:20 - 8:25
    authoritarians push absurdities to billions of people. Now, in their defense,
  • 8:25 - 8:28
    these social media companies have taken
  • 8:28 - 8:30
    some steps to reduce hate and
  • 8:30 - 8:33
    conspiracies on their platforms, but
  • 8:33 - 8:35
    these steps have been mainly superficial.
  • 8:35 - 8:37
    I'm talking about this today because
  • 8:37 - 8:39
    I believe that our pluralistic
  • 8:39 - 8:42
    democracies are on a precipice, and in
  • 8:42 - 8:44
    the next 12 months the role of
  • 8:44 - 8:48
    social media could be determinative. Now,
  • 8:48 - 8:50
    British voters will go to the polls
  • 8:50 - 8:51
    while online
  • 8:51 - 8:54
    conspiracists promote the despicable theory
  • 8:54 - 8:57
    of "the great replacement": that white
  • 8:57 - 8:59
    Christians are being deliberately
  • 8:59 - 9:02
    replaced by Muslim immigrants. Americans
  • 9:02 - 9:05
    will vote for president while trolls and
  • 9:05 - 9:08
    bots perpetuate the disgusting
  • 9:08 - 9:12
    lie of a "Hispanic invasion," and after
  • 9:12 - 9:14
    years of YouTube videos calling climate
  • 9:14 - 9:18
    change a hoax, the United States is on
  • 9:18 - 9:21
    track, a year from now, to formally
  • 9:21 - 9:24
    withdraw from the Paris Accords. A sewer
  • 9:24 - 9:27
    of bigotry and vile conspiracy theories
  • 9:27 - 9:29
    now threatens our democracy and, to some
  • 9:29 - 9:33
    degree, our planet. This can't possibly be
  • 9:33 - 9:35
    what the creators of the internet had in
  • 9:35 - 9:38
    mind. I believe that it's time for a
  • 9:38 - 9:40
    fundamental rethink of social media and
  • 9:40 - 9:44
    how it spreads hate, conspiracies and
  • 9:44 - 9:46
    lies.
  • 9:52 - 9:57
    Now, last month, however, Mark Zuckerberg
  • 9:57 - 9:59
    of Facebook delivered a major speech
  • 9:59 - 10:03
    that, not surprisingly, warned against new
  • 10:03 - 10:05
    laws and regulations on companies like
  • 10:05 - 10:08
    his. Well, some of these arguments are,
  • 10:08 - 10:13
    pardon my French, simply absurd. Let's
  • 10:13 - 10:14
    count the ways.
  • 10:14 - 10:17
    First, Zuckerberg tried to portray this
  • 10:17 - 10:20
    whole issue as choices "around free
  • 10:20 - 10:24
    expression." That is ludicrous. This is not
  • 10:24 - 10:27
    about limiting anyone's free speech. This
  • 10:27 - 10:29
    is about giving people, including some of
  • 10:29 - 10:32
    the most reprehensible people on earth,
  • 10:32 - 10:34
    the biggest platform in history to reach
  • 10:34 - 10:37
    a third of the planet. Freedom of speech
  • 10:37 - 10:41
    is not freedom of reach. Sadly, there will
  • 10:41 - 10:44
    always be racists, misogynists, anti-Semites
  • 10:44 - 10:47
    and child abusers, but I think we can all
  • 10:47 - 10:49
    agree that we should not be giving
  • 10:49 - 10:52
    bigots and pedophiles a free platform to
  • 10:52 - 10:54
    amplify their views and target their
  • 10:54 - 10:57
    victims.
  • 11:00 - 11:04
    Second, Mark Zuckerberg claimed that new
  • 11:04 - 11:06
    limits on what's posted on social media
  • 11:06 - 11:09
    would be to "pull back on free expression."
  • 11:09 - 11:12
    This is utter nonsense. The First
  • 11:12 - 11:15
    Amendment says, and I quote, "Congress
  • 11:15 - 11:18
    shall make no law abridging freedom of
  • 11:18 - 11:20
    speech." However, this does not apply to
  • 11:20 - 11:22
    private businesses like Facebook.
  • 11:22 - 11:25
    We're not asking these companies to
  • 11:25 - 11:26
    determine the boundaries of free speech
  • 11:26 - 11:30
    across society. We just want them to be
  • 11:30 - 11:33
    responsible on their platforms. Now, if a
  • 11:33 - 11:35
    neo-Nazi comes goose-stepping into a
  • 11:35 - 11:37
    restaurant and starts threatening other
  • 11:37 - 11:39
    customers, saying he wants to kill Jews,
  • 11:39 - 11:42
    would the owner of the restaurant, a
  • 11:42 - 11:44
    private business, be required to serve
  • 11:44 - 11:47
    him an elegant eight-course meal? Of
  • 11:47 - 11:48
    course not!
  • 11:48 - 11:51
    The restaurant owner has every legal
  • 11:51 - 11:54
    right, and indeed, I would argue, a moral
  • 11:54 - 11:57
    obligation, to kick that Nazi out, and so
  • 11:57 - 12:00
    do these internet companies.
  • 12:04 - 12:07
    Now, third, Mark Zuckerberg seemed to
  • 12:07 - 12:10
    equate regulation of companies like his
  • 12:10 - 12:13
    to the actions of the most repressive
  • 12:13 - 12:17
    societies. Incredible. This from one of
  • 12:17 - 12:20
    the six people who decide what
  • 12:20 - 12:23
    information so much of the world sees:
  • 12:23 - 12:26
    Zuckerberg at Facebook, Sundar
  • 12:26 - 12:29
    Pichai at Google, at its parent company
  • 12:29 - 12:31
    Alphabet, Larry Page and Sergey Brin,
  • 12:31 - 12:34
    Brin's ex-sister-in-law
  • 12:34 - 12:36
    Susan Wojcicki at YouTube, and Jack
  • 12:36 - 12:41
    Dorsey at Twitter. The Silicon Six: all
  • 12:41 - 12:44
    billionaires, all Americans, who care more
  • 12:44 - 12:47
    about boosting their share price than
  • 12:47 - 12:55
    about protecting democracy. This is
  • 12:55 - 12:59
    ideological imperialism: six unelected
  • 12:59 - 13:02
    individuals in Silicon Valley imposing
  • 13:02 - 13:04
    their vision on the rest of the world,
  • 13:04 - 13:07
    unaccountable to any government and
  • 13:07 - 13:09
    acting like they're above the reach
  • 13:09 - 13:11
    of law. It's like we're living in the
  • 13:11 - 13:13
    Roman Empire and Mark Zuckerberg is
  • 13:13 - 13:16
    Caesar. At least that would explain his
  • 13:16 - 13:18
    haircut.
  • 13:19 - 13:23
    Now, here's an idea: instead of letting
  • 13:23 - 13:26
    the Silicon Six decide the fate of the
  • 13:26 - 13:29
    world, let our elected representatives,
  • 13:29 - 13:32
    voted for by the people of every
  • 13:32 - 13:35
    democracy in the world, have at least
  • 13:35 - 13:39
    some say. Fourth, Zuckerberg speaks of
  • 13:39 - 13:42
    welcoming a "diversity of ideas," and last
  • 13:42 - 13:45
    year he gave us an example. He said that
  • 13:45 - 13:48
    he found posts denying the Holocaust
  • 13:48 - 13:50
    "deeply offensive," but he didn't think
  • 13:50 - 13:53
    Facebook should take them down "because I
  • 13:53 - 13:55
    think there are things that different
  • 13:55 - 13:59
    people get wrong." At this very moment,
  • 13:59 - 14:01
    there are still Holocaust deniers on
  • 14:01 - 14:04
    Facebook, and Google still takes you to
  • 14:04 - 14:06
    the most repulsive Holocaust denial
  • 14:06 - 14:10
    sites with a simple click. One of
  • 14:10 - 14:13
    the heads of Google, in fact, told me that
  • 14:13 - 14:16
    these sites just show "both sides" of the
  • 14:16 - 14:19
    issue. This is madness.
  • 14:19 - 14:22
    To quote Edward R. Murrow, one "cannot
  • 14:22 - 14:25
    accept that there are, on every story, two
  • 14:25 - 14:28
    equal and logical sides to an argument."
  • 14:28 - 14:31
    We have, unfortunately, millions of pieces
  • 14:31 - 14:34
    of evidence for the Holocaust. It is an
  • 14:34 - 14:37
    historical fact, and denying it is not
  • 14:37 - 14:40
    some random opinion. Those who deny the
  • 14:40 - 14:45
    Holocaust aim to encourage another one.
  • 14:50 - 14:54
    Still, Zuckerberg says that people should
  • 14:54 - 14:56
    decide what is credible, not tech
  • 14:56 - 14:59
    companies. But at a time when two-thirds
  • 14:59 - 15:01
    of millennials say that they haven't
  • 15:01 - 15:03
    even heard of Auschwitz, how are they
  • 15:03 - 15:06
    supposed to know what's credible? How are
  • 15:06 - 15:10
    they supposed to know that the lie is a
  • 15:10 - 15:13
    lie? There is such a thing as objective
  • 15:13 - 15:17
    truth. Facts do exist. And if these
  • 15:17 - 15:19
    internet companies really want to make a
  • 15:19 - 15:21
    difference, they should hire enough
  • 15:21 - 15:24
    monitors to actually monitor, work
  • 15:24 - 15:27
    closely with groups like the ADL and the
  • 15:27 - 15:28
    NAACP,
  • 15:28 - 15:32
    insist on facts, and purge these lies and
  • 15:32 - 15:36
    conspiracies from their platforms.
  • 15:40 - 15:44
    Now, fifth, when discussing the difficulty of
  • 15:44 - 15:47
    removing content, Mark
  • 15:47 - 15:47
    Zuckerberg
  • 15:47 - 15:51
    asked, "Where do you draw the line?" Yes,
  • 15:51 - 15:55
    drawing the line can be difficult, but
  • 15:55 - 15:58
    here's what he's really saying: removing
  • 15:58 - 16:00
    more of these lies and conspiracies is
  • 16:00 - 16:03
    just too expensive. These are the richest
  • 16:03 - 16:06
    companies in the world, and they have the
  • 16:06 - 16:08
    best engineers in the world. They could
  • 16:08 - 16:10
    fix these problems if they wanted to.
  • 16:10 - 16:14
    Twitter could deploy an algorithm to
  • 16:14 - 16:16
    remove more white supremacist hate
  • 16:16 - 16:19
    speech, but they reportedly haven't,
  • 16:19 - 16:21
    because it would eject some very
  • 16:21 - 16:24
    prominent politicians from their
  • 16:24 - 16:27
    platform. Maybe that wouldn't be such a
  • 16:27 - 16:29
    bad thing.
  • 16:35 - 16:39
    The truth is, these companies won't
  • 16:39 - 16:40
    fundamentally change, because their
  • 16:40 - 16:43
    entire business model relies on
  • 16:43 - 16:46
    generating more engagement, and nothing
  • 16:46 - 16:49
    generates more engagement than lies, fear
  • 16:49 - 16:51
    and outrage.
  • 16:51 - 16:53
    So it's time to finally call these
  • 16:53 - 16:55
    companies what they really are: the
  • 16:55 - 16:58
    largest publishers in history. So here's
  • 16:58 - 17:01
    an idea for them: abide by basic
  • 17:01 - 17:03
    standards and practices, just like
  • 17:03 - 17:06
    newspapers, magazines and TV news do
  • 17:06 - 17:09
    every day. We have standards and practices
  • 17:09 - 17:11
    in television and the movies; there are
  • 17:11 - 17:14
    certain things we cannot say or do. In
  • 17:14 - 17:17
    England, I was told that Ali G could
  • 17:17 - 17:19
    not curse when he appeared before 9 p.m.
  • 17:19 - 17:22
    Here in the US, the Motion Picture
  • 17:22 - 17:25
    Association of America regulates and
  • 17:25 - 17:28
    rates what we see. I've had scenes in my
  • 17:28 - 17:30
    movies cut or reduced to abide by those
  • 17:30 - 17:32
    standards. Now, if there are standards and
  • 17:32 - 17:35
    practices for what cinemas and
  • 17:35 - 17:37
    television channels can show, then surely
  • 17:37 - 17:40
    companies that publish material to
  • 17:40 - 17:42
    billions of people should have to abide by
  • 17:42 - 17:45
    basic standards and practices too.
  • 17:49 - 17:53
    Now, take the issue of political ads,
  • 17:53 - 17:56
    on which Facebook has been resolute.
  • 17:56 - 17:59
    Fortunately, Twitter finally banned them,
  • 17:59 - 18:01
    and Google, today I read, is making
  • 18:01 - 18:05
    changes too. But if you pay them, Facebook
  • 18:05 - 18:09
    will run any political ad you want, even
  • 18:09 - 18:12
    if it's a lie, and they'll even help you
  • 18:12 - 18:15
    micro-target those lies to their users
  • 18:15 - 18:19
    for maximum effect. Under this twisted
  • 18:19 - 18:21
    logic, if Facebook were around in the
  • 18:21 - 18:25
    1930s, it would have allowed Hitler to
  • 18:25 - 18:28
    post 30-second ads on his "solution" to
  • 18:28 - 18:30
    the "Jewish problem." So here's a good
  • 18:30 - 18:35
    standard and practice: Facebook, start
  • 18:35 - 18:38
    fact-checking political ads before you
  • 18:38 - 18:41
    run them, stop micro-targeted lies
  • 18:41 - 18:44
    immediately, and when the ads are false,
  • 18:44 - 18:47
    give back the money and don't publish
  • 18:47 - 18:49
    them.
  • 18:55 - 18:59
    Here's another good practice: slow down.
  • 18:59 - 19:02
    Every single post does not need to be
  • 19:02 - 19:05
    published immediately. Oscar Wilde once
  • 19:05 - 19:08
    said, "We live in an age when unnecessary
  • 19:08 - 19:13
    things are our only necessity." But
  • 19:13 - 19:15
    let me ask you: is having every thought
  • 19:15 - 19:19
    or video posted instantly online, even if
  • 19:19 - 19:22
    it's racist or criminal or murderous, really
  • 19:22 - 19:26
    a necessity? Of course not! The shooter
  • 19:26 - 19:28
    who massacred Muslims in New Zealand
  • 19:28 - 19:32
    live-streamed his atrocity on Facebook,
  • 19:32 - 19:34
    where it then spread across the internet
  • 19:34 - 19:36
    and was viewed likely millions of times.
  • 19:36 - 19:39
    It was a snuff film,
  • 19:39 - 19:43
    brought to you by social media. Why can't
  • 19:43 - 19:45
    we have more of a delay, so that this
  • 19:45 - 19:48
    trauma-inducing filth can be caught and
  • 19:48 - 19:51
    stopped before it's posted in the first
  • 19:51 - 19:53
    place?
  • 19:56 - 19:59
    Finally, Zuckerberg said that social
  • 19:59 - 20:03
    media companies should "live up to their
  • 20:03 - 20:06
    responsibilities," but he's totally silent
  • 20:06 - 20:09
    about what should happen when they don't.
  • 20:09 - 20:13
    By now it's pretty clear they cannot be
  • 20:13 - 20:13
    trusted
  • 20:13 - 20:16
    to regulate themselves. As with the
  • 20:16 - 20:18
    Industrial Revolution, it's time for
  • 20:18 - 20:21
    regulation and legislation to curb the
  • 20:21 - 20:26
    greed of these high-tech robber barons.
  • 20:29 - 20:33
    In every other industry, a company can
  • 20:33 - 20:35
    be held liable when their product is
  • 20:35 - 20:36
    defective.
  • 20:36 - 20:39
    When engines explode or seat belts
  • 20:39 - 20:42
    malfunction, car companies recall tens
  • 20:42 - 20:45
    of thousands of vehicles, at a cost of
  • 20:45 - 20:48
    billions of dollars. It only seems fair
  • 20:48 - 20:50
    to say to Facebook, YouTube and Twitter:
  • 20:50 - 20:53
    your product is defective, you are
  • 20:53 - 20:57
    obliged to fix it, no matter how much it
  • 20:57 - 21:00
    costs and no matter how many moderators
  • 21:00 - 21:10
    you need to employ.
  • 21:10 - 21:13
    In every other industry, you can be
  • 21:13 - 21:16
    sued for the harm you cause. Publishers
  • 21:16 - 21:19
    can be sued for libel; people can be sued
  • 21:19 - 21:22
    for defamation. I've been sued many times!
  • 21:22 - 21:27
    I'm being sued right now by someone
  • 21:27 - 21:29
    whose name I won't mention because he
  • 21:29 - 21:33
    might sue me again. But social media
  • 21:33 - 21:36
    companies are largely protected from
  • 21:36 - 21:38
    liability for the content their users
  • 21:38 - 21:41
    post, no matter how indecent it is, by
  • 21:41 - 21:44
    Section 230 of, get ready for it, the
  • 21:44 - 21:49
    Communications Decency Act. It's absurd!
  • 21:49 - 21:51
    Fortunately, internet companies can now
  • 21:51 - 21:54
    be held responsible for pedophiles who
  • 21:54 - 21:57
    use their sites to target children. So I
  • 21:57 - 21:59
    say, let's also hold these companies
  • 21:59 - 22:02
    responsible for those who use their
  • 22:02 - 22:05
    sites to advocate for the mass murder of
  • 22:05 - 22:07
    children because of their race or
  • 22:07 - 22:10
    religion.
  • 22:12 - 22:18
    And maybe fines are not enough. Maybe
  • 22:18 - 22:22
    it's time to tell Mark Zuckerberg and
  • 22:22 - 22:25
    the CEOs of these companies: you already
  • 22:25 - 22:28
    allowed one foreign power to interfere
  • 22:28 - 22:29
    in our elections,
  • 22:29 - 22:33
    you already facilitated one genocide in
  • 22:33 - 22:38
    Myanmar. Do it again, and you go to jail.
  • 22:43 - 22:46
    In the end, it all comes down to what
  • 22:46 - 22:49
    kind of world we want. In his speech,
  • 22:49 - 22:52
    Zuckerberg said that one of his main
  • 22:52 - 22:56
    goals is "to uphold as wide a definition
  • 22:56 - 22:59
    of freedom of expression as possible." It
  • 22:59 - 22:59
    sounds good.
  • 22:59 - 23:03
    Yet our freedoms are not only an end in
  • 23:03 - 23:06
    themselves, they're also the means to
  • 23:06 - 23:08
    another end: as you say here in the US,
  • 23:08 - 23:11
    the right to life, liberty and the
  • 23:11 - 23:14
    pursuit of happiness. But today, these
  • 23:14 - 23:17
    rights are threatened by hate,
  • 23:17 - 23:20
    conspiracies and lies. So allow me to
  • 23:20 - 23:22
    leave you with a suggestion for a
  • 23:22 - 23:26
    different aim for society: the ultimate
  • 23:26 - 23:28
    aim of society should be to make sure
  • 23:28 - 23:31
    that people are not targeted, not
  • 23:31 - 23:35
    harassed and not murdered because of who
  • 23:35 - 23:37
    they are, where they come from, who they
  • 23:37 - 23:41
    love or how they pray.
  • 23:54 - 23:59
    If we make that our aim, if we prioritize
  • 23:59 - 24:02
    truth over lies, tolerance over prejudice,
  • 24:02 - 24:06
    empathy over indifference and experts
  • 24:06 - 24:10
    over ignoramuses, then maybe, just
  • 24:10 - 24:14
    maybe, we can stop the greatest
  • 24:14 - 24:16
    propaganda machine in history. We can
  • 24:16 - 24:19
    save democracy, we can still have a place
  • 24:19 - 24:22
    for free speech and free expression, and,
  • 24:22 - 24:26
    most importantly, my jokes will still
  • 24:26 - 24:28
    work. Thank you very much.
  • 24:28 - 24:44
    [Applause]
Title:
ADL International Leadership Award Presented to Sacha Baron Cohen at Never Is Now 2019
Description:


Sacha Baron Cohen is the well-deserved recipient of ADL’s International Leadership Award, which goes to exceptional individuals who combine professional success with a profound personal commitment to community involvement and to crossing borders and barriers with a message of diversity and equal opportunity.

Over 100 years ago Supreme Court Justice Louis Brandeis wrote: “Sunlight is said to be the best disinfectant.” Through his alter egos, many of whom represent anti-Semites, racists and neo-Nazis, Baron Cohen shines a piercing light on people’s ignorance and biases.

https://www.adl.org/news/article/sacha-baron-cohens-keynote-address-at-adls-2019-never-is-now-summit-on-anti-semitism

copyright © 2019 ADL

Video Language:
English
Team:
Captions Requested
Duration:
24:44
