Thank you very much. Thanks to everyone who stood up. Unnecessary, but very flattering. Thank you, Jonathan, for your very kind words. Thank you to the Anti-Defamation League for this recognition and your work in fighting racism, hate and bigotry. And to be clear, when I say racism, hate and bigotry, I'm not referring to the names of Stephen Miller's labradoodles.

[audience laughs]

Now, I realize that some of you may be thinking: "What the hell is a comedian doing speaking at a conference like this?" I certainly am.

[audience laughs]

I've spent most of the past two decades in character. In fact, this is the first ever time that I've stood up and given a speech as my least popular character: Sacha Baron Cohen.

[audience laughs and cheers]

And, I have to confess, it is terrifying.

I realize that my presence here may also be unexpected for another reason: at times, some critics have said my comedy risks reinforcing old stereotypes. The truth is, I've been passionate about challenging bigotry and intolerance throughout my life. As a teenager in England, I marched against the fascist National Front and to abolish apartheid. As an undergraduate, I traveled around America and wrote my thesis about the civil rights movement with the help of the archives of the ADL; and as a comedian, I've tried to use my characters to get people to let down their guard and reveal what they actually believe, including their own prejudice.

Now, I'm not gonna claim everything I've done has been for a higher purpose. Yes, some of my comedy, okay, probably half my comedy, has been absolutely juvenile.

[audience laughs]

And the other half completely puerile.

[audience laughs]

But I admit there was nothing particularly enlightening about me, as Borat from Kazakhstan, the first "fake news" journalist, running through a conference of mortgage brokers while I was completely naked.

[audience laughs]

But when Borat was able to get an entire bar in Arizona to sing ♪ Throw the Jew down the well ♪

[audience laughs]

it did reveal people's indifference to anti-Semitism.
When, as Bruno, the gay fashion reporter from Austria, I started kissing a man in a cage fight in Arkansas, nearly starting a riot, it showed the violent potential of homophobia. And when, disguised as an ultra-woke developer, I proposed building a mosque in one rural community, prompting a resident to proudly admit, "I am racist, against Muslims," it showed the growing acceptance of Islamophobia.

That's why I really appreciate the opportunity to be here with you. Today, around the world, demagogues appeal to our worst instincts. Conspiracy theories, once confined to the fringe, are going mainstream. It's as if the age of reason, the era of evidential argument, is ending, and now knowledge is increasingly delegitimized and scientific consensus is dismissed. Democracy, which depends on shared truths, is in retreat, and autocracy, which depends on shared lies, is on the march. Hate crimes are surging, as are murderous attacks on religious and ethnic minorities.

Now, what do all these dangerous trends have in common? I'm just a comedian and an actor, I'm not a scholar, but one thing is pretty clear to me: all this hate and violence is being facilitated by a handful of internet companies that amount to the greatest propaganda machine in history.

[audience applauds]

The greatest propaganda machine in history. Let's think about it. Facebook, YouTube, Google, Twitter and others: they reach billions of people. The algorithms these platforms depend on deliberately amplify the type of content that keeps users engaged: stories that appeal to our baser instincts and that trigger outrage and fear. It's why YouTube recommended videos by the conspiracist Alex Jones billions of times. It's why fake news outperforms real news, because studies show that lies spread faster than truth. And it's no surprise that the greatest propaganda machine in history has spread the oldest conspiracy theory in history: the lie that Jews are somehow dangerous. As one headline put it, "Just think what Goebbels could have done with Facebook."

[audience groans]

On the internet, everything can appear equally legitimate.
Breitbart resembles the BBC, the fictitious Protocols of the Elders of Zion look as valid as an ADL report, and the rantings of a lunatic seem as credible as the findings of a Nobel Prize winner. We have lost, it seems, a shared sense of basic facts upon which democracy depends.

When I, as the wannabe gangsta Ali G, asked the astronaut Buzz Aldrin, "What woz it like to walk on de sun?"

[audience laughs]

the joke worked because we, the audience, shared the same facts. If you believe the moon landing was a hoax, the joke doesn't work. When Borat got that bar in Arizona to agree that ♪ Jews control everybody's money and they never give it back ♪ the joke worked because the audience shared the fact that the depiction of Jews as miserly is a conspiracy theory originating in the Middle Ages.

But when, thanks to social media, conspiracies take hold, it is easier for hate groups to recruit, easier for foreign intelligence agencies to interfere in our elections, and easier for a country like Myanmar to commit genocide against the Rohingya.

[audience applauds]

Now, it's actually quite shocking how easy it is to turn conspiracy thinking into violence. In my last show, "Who Is America?", I found an educated, normal guy who'd held down a good job, but who, on social media, repeated many of the conspiracy theories that President Trump, using Twitter, has spread more than 1,700 times to his 67 million Twitter followers. The president even tweeted that he was considering designating Antifa, who are anti-fascists who march against the far right, as a terror organization.

So, disguised as an Israeli anti-terrorism expert, Colonel Erran Morad,

[audience laughs]

[in a foreign accent] "Yalla. Let's go."

[audience laughs]

Disguised as him, I told my interviewee that at the Women's March in San Francisco, Antifa were plotting to put hormones into babies' diapers in order to [in a foreign accent] "make them transgender."

[audience laughs]

And this man believed it. I instructed him to plant small devices on three innocent people at the march, and explained that, when he pushed a button, he'd trigger an explosion that would kill them all. They weren't real explosives, of course, but he thought they were.
I wanted to see: would he actually do it? The answer was yes. He pushed the button and thought he had actually killed three human beings. Voltaire was right when he said, "Those who can make you believe absurdities can make you commit atrocities." And social media lets authoritarians push absurdities to billions of people.

In their defense, these social media companies have taken some steps to reduce hate and conspiracies on their platforms, but these steps have been mainly superficial. And I'm talking about this today because I believe that our pluralistic democracies are on a precipice, and that the next twelve months, and the role of social media, could be determinative. British voters will go to the polls while online conspiracists promote the despicable theory of the "great replacement": that white Christians are being deliberately replaced by Muslim immigrants. Americans will vote for president while trolls and bots perpetuate the disgusting lie of a Hispanic invasion. And after years of YouTube videos calling climate change a hoax, the United States is on track, a year from now, to formally withdraw from the Paris Accords. A sewer of bigotry and vile conspiracy theories that threatens our democracy and, to some degree, our planet.

This can't possibly be what the creators of the internet had in mind. I believe that it's time for a fundamental rethink of social media and how it spreads hate, conspiracies and lies.

[audience cheers and applauds]

Last month, however, Mark Zuckerberg of Facebook delivered a major speech that, not surprisingly, warned against new laws and regulations on companies like his. Well, some of these arguments are simply, pardon my French, bullshit. Let's count the ways.

First, Zuckerberg tried to portray this whole issue as "choices around free expression." That is ludicrous. This is not about limiting anyone's free speech. This is about giving people, including some of the most reprehensible people on earth, the biggest platform in history to reach a third of the planet. Freedom of speech is not freedom of reach.
Sadly, there will always be racists, misogynists, anti-Semites and child abusers, but I think we can all agree that we shouldn't be giving bigots and pedophiles a free platform to amplify their views and target their victims.

[audience applauds]

Second, Mark Zuckerberg claimed that new limits on what's posted on social media would be "to pull back on free expression." This is utter nonsense. The First Amendment says that, and I quote, "Congress shall make no law... abridging freedom of speech." However, this does not apply to private businesses like Facebook. We're not asking these companies to determine the boundaries of free speech across society; we just want them to be responsible on their platforms. If a neo-Nazi comes goose-stepping into a restaurant and starts threatening other customers and says he wants to kill Jews, would the owner of the restaurant, a private business, be required to serve him an elegant eight-course meal? Of course not! The restaurant owner has every legal right, and indeed, I would argue, a moral obligation, to kick that Nazi out. And so do these internet companies.

[audience applauds]

Now, third. Mark Zuckerberg seemed to equate regulation of companies like his to the actions of "the most repressive societies." Incredible. This from one of the six people who decide what information so much of the world sees: Zuckerberg at Facebook; Sundar Pichai at Google; at its parent company Alphabet, Larry Page and Sergey Brin; Brin's ex-sister-in-law, Susan Wojcicki, at YouTube; and Jack Dorsey at Twitter. The Silicon Six. All billionaires, all Americans, who care more about boosting their share price than about protecting democracy.

[audience applauds]

This is ideological imperialism. Six unelected individuals in Silicon Valley imposing their vision on the rest of the world, unaccountable to any government and acting like they're above the reach of law. It's like we're living in the Roman Empire, and Mark Zuckerberg is Caesar. At least that would explain his haircut.

[audience laughs]

Now, here's an idea.
Instead of letting the Silicon Six decide the fate of the world, let our elected representatives, voted for by the people of every democracy in the world, have at least some say.

Fourth, Zuckerberg speaks of welcoming "a diversity of ideas," and last year he gave us an example. He said that he found posts denying the Holocaust "deeply offensive," but he didn't think Facebook should take them down, "because I think there are things that different people get wrong." At this very moment, there are still Holocaust deniers on Facebook, and Google still takes you to the most repulsive Holocaust-denial sites with a simple click. One of the heads of Google, in fact, told me that these sites just show "both sides" of the issue.

[audience groans]

This is madness. To quote Edward R. Murrow, one "cannot accept that there are, on every story, two equal and logical sides to an argument." We have, unfortunately, millions of pieces of evidence for the Holocaust. It is an historical fact. And denying it is not some random opinion. Those who deny the Holocaust aim to encourage another one.

[audience applauds]

Still, Zuckerberg says that "people should decide what is credible, not tech companies." But at a time when two-thirds of millennials say they haven't even heard of Auschwitz, how are they supposed to know what's credible? How are they supposed to know that the lie is a lie? There is such a thing as objective truth. Facts do exist. And if these internet companies really want to make a difference, they should hire enough monitors to actually monitor, work closely with groups like the ADL and the NAACP, insist on facts, and purge these lies and conspiracies from their platforms.

[audience applauds]

Now, fifth, when discussing the difficulty of removing content, Zuckerberg... Mark Zuckerberg asked, "Where do you draw the line?" Yes, drawing the line can be difficult, but here's what he's really saying: removing more of these lies and conspiracies is just too expensive. These are the richest companies in the world, and they have the best engineers in the world. They could fix these problems if they wanted to.
Twitter could deploy an algorithm to remove more white-supremacist hate speech, but they reportedly haven't, because it would eject some very prominent politicians from their platform.

[audience groans]

Maybe that wouldn't be such a bad thing.

[audience applauds and cheers]

The truth is, these companies won't fundamentally change, because their entire business model relies on generating more engagement, and nothing generates more engagement than lies, fear and outrage. So it's time to finally call these companies what they really are: the largest publishers in history. So here's an idea for them: abide by basic standards and practices, just like newspapers, magazines and TV news do every day. We have standards and practices in television and the movies. There are certain things we cannot say or do. In England, I was told that Ali G couldn't curse when he appeared before 9 p.m. Here in the US, the Motion Picture Association of America regulates and rates what we see. I've had scenes in my movies cut or reduced to abide by those standards. If there are standards and practices for what cinemas and TV channels can show, then surely companies that publish material to billions of people should have to abide by basic standards and practices too.

[audience applauds]

Take the issue of political ads, on which Facebook have been resolute. Fortunately, Twitter finally banned them, and Google, today I read, is making changes too. But if you pay them, Facebook will run any political ad you want, even if it's a lie. And they'll even help you micro-target those ads to their users for maximum effect. Under this twisted logic, if Facebook were around in the 1930s, it would have allowed Hitler to post 30-second ads on his solution to the "Jewish problem." So here's a good standard and practice: Facebook, start fact-checking political ads before you run them, stop micro-targeted lies immediately, and when the ads are false, give back the money and don't publish them.

[audience applauds]

Here's another good practice: slow down. Every single post doesn't need to be published immediately.
Oscar Wilde once said, "We live in an age when unnecessary things are our only necessity." But let me ask you: is having every thought or video posted instantly online, even if it's racist or criminal or murderous, really a necessity? Of course not. The shooter who massacred Muslims in New Zealand live-streamed his atrocity on Facebook, where it then spread across the internet and was viewed likely millions of times. It was a snuff film, brought to you by social media. Why can't we have more of a delay, so that this trauma-inducing filth can be caught and stopped before it's posted in the first place?

[audience applauds]

Finally, Zuckerberg said that social media companies should "live up to their responsibilities," but he's totally silent about what should happen when they don't. By now, it's pretty clear they can't be trusted to regulate themselves. As with the Industrial Revolution, it's time for regulation and legislation to curb the greed of these high-tech robber barons.

[audience applauds]

In every other industry, a company can be held liable when their product is defective. When engines explode or seat belts malfunction, car companies recall tens of thousands of vehicles at a cost of billions of dollars. It only seems fair to say to Facebook, YouTube and Twitter: your product is defective, you are obliged to fix it, no matter how much it costs and no matter how many moderators you need to employ.

[audience applauds and cheers]

In every other industry, you can be sued for the harm you cause. Publishers can be sued for libel; people can be sued for defamation. I've been sued many times!

[audience laughs]

I'm being sued right now by someone whose name I won't mention because he might sue me again!

[audience laughs]

But social media companies are largely protected from liability for the content their users post, no matter how indecent it is, by Section 230 of, get ready for it, the Communications Decency Act. It's absurd! Fortunately, internet companies can now be held responsible for pedophiles who use their sites to target children.
So I say, let's also hold these companies responsible for those who use their sites to advocate for the mass murder of children because of their race or religion.

[audience applauds]

And maybe fines are not enough. Maybe it's time to tell Mark Zuckerberg and the CEOs of these companies: you already allowed one foreign power to interfere in our elections. You already facilitated one genocide in Myanmar. Do it again, and you go to jail.

[audience applauds]

In the end, it all comes down to what kind of world we want. In his speech, Zuckerberg said that one of his main goals is to "uphold as wide a definition of freedom of expression as possible." It sounds good. Yet our freedoms are not only an end in themselves. They're also the means to another end: as you say here in the US, the right to life, liberty and the pursuit of happiness. But today these rights are threatened by hate, conspiracies and lies. So allow me to leave you with a suggestion for a different aim for society: the ultimate aim of society should be to make sure that people are not targeted, not harassed and not murdered because of who they are, where they come from, who they love or how they pray.

[audience applauds and cheers]

If we make that our aim, if we prioritize truth over lies, tolerance over prejudice, empathy over indifference, and experts over ignoramuses,

[audience laughs]

then maybe, just maybe, we can stop the greatest propaganda machine in history. We can save democracy, we can still have a place for free speech and free expression, and, most importantly, my jokes will still work. Thank you very much.

[cheers and applause]