When we think about prejudice and bias, we tend to think about stupid and evil people doing stupid and evil things. And this idea is nicely summarized by the British critic William Hazlitt, who wrote, "Prejudice is the child of ignorance." I want to try to convince you here that this is mistaken. I want to try to convince you that prejudice and bias are natural, they're often rational, and they're often even moral, and I think that once we understand this, we're in a better position to make sense of them when they go wrong, when they have horrible consequences, and we're in a better position to know what to do when this happens.

So start with stereotypes. You look at me, you know my name, you know certain facts about me, and you could make certain judgments. You could make guesses about my ethnicity, my political affiliation, my religious beliefs. And the thing is, these judgments tend to be accurate. We're very good at this sort of thing. And we're very good at this sort of thing because our ability to stereotype people is not some sort of arbitrary quirk of the mind, but rather it's a specific instance of a more general process, which is that we have experience with things and people in the world that fall into categories, and we can use our experience to make generalizations about novel instances of these categories. So everybody here has a lot of experience with chairs and apples and dogs, and based on this, you could see unfamiliar examples and you could guess: you could sit on the chair, you could eat the apple, the dog will bark. Now we might be wrong. The chair could collapse if you sit on it, the apple might be poison, the dog might not bark, and in fact, this is my dog Tessie, who doesn't bark. But for the most part, we're good at this.
For the most part, we make good guesses, both in the social domain and the non-social domain, and if we weren't able to do so, if we weren't able to make guesses about new instances that we encounter, we wouldn't survive. And in fact, Hazlitt, later on in his wonderful essay, concedes this. He writes, "Without the aid of prejudice and custom, I should not be able to find my way across the room; nor know how to conduct myself in any circumstances, nor what to feel in any relation of life."

Or take bias. Now sometimes, we break the world up into us versus them, into in-group versus out-group, and sometimes when we do this, we know we're doing something wrong, and we're kind of ashamed of it. But other times we're proud of it. We openly acknowledge it. And my favorite example of this is a question that came from the audience in a Republican debate prior to the last election.

(Video) Anderson Cooper: Gets to your question, the question in the hall, on foreign aid? Yes, ma'am.

Woman: The American people are suffering in our country right now. Why do we continue to send foreign aid to other countries when we need all the help we can get for ourselves?

AC: Governor Perry, what about that?

(Applause)

Rick Perry: Absolutely, I think it's —

Paul Bloom: Each of the people onstage agreed with the premise of her question, which is that as Americans, we should care more about Americans than about other people. And in fact, in general, people are often swayed by feelings of solidarity, loyalty, pride, patriotism, towards their country or towards their ethnic group. Regardless of your politics, many people feel proud to be American, and they favor Americans over people from other countries. Residents of other countries feel the same about their nation, and we feel the same about our ethnicities. Now some of you may reject this. Some of you may be so cosmopolitan that you think that ethnicity and nationality should have no moral sway.
But even you sophisticates accept that there should be some pull towards the in-group in the domain of friends and family, of people you're close to, and so even you make a distinction between us versus them. Now this distinction is natural enough, and often moral enough, but it can go awry, and this was part of the research of the great social psychologist Henri Tajfel.

Tajfel was born in Poland in 1919. He left to go to university in France, because as a Jew, he couldn't go to university in Poland, and then he enlisted in the French military in World War II. He was captured and ended up in a prisoner of war camp, and it was a terrifying time for him, because if it was discovered that he was a Jew, he could have been moved to a concentration camp, where he most likely would not have survived. And in fact, when the war ended and he was released, most of his friends and family were dead. He got involved in different pursuits. He helped out the war orphans. But he had a long-lasting interest in the science of prejudice, and so when a prestigious British scholarship on stereotypes opened up, he applied for it, and he won it, and then he began this amazing career. And what started his career is an insight that the way most people were thinking about the Holocaust was wrong. Many people, most people at the time, viewed the Holocaust as sort of representing some tragic flaw on the part of the Germans, some genetic taint, some authoritarian personality. And Tajfel rejected this. Tajfel said what we see in the Holocaust is just an exaggeration of normal psychological processes that exist in every one of us. And to explore this, he did a series of classic studies with British adolescents.
And in one of his studies, what he did was he asked the British adolescents all sorts of questions, and then based on their answers, he said, "I've looked at your answers, and based on the answers, I have determined that you are either" — he told half of them — "a Kandinsky lover, you love the work of Kandinsky, or a Klee lover, you love the work of Klee." It was entirely bogus. Their answers had nothing to do with Kandinsky or Klee. They probably hadn't heard of the artists. He just arbitrarily divided them up. But what he found was, these categories mattered, so when he later gave the subjects money, they would prefer to give the money to members of their own group rather than to members of the other group. Worse, they were actually most interested in establishing a difference between their group and other groups, so they would give up money for their own group if by doing so they could give the other group even less.

This bias seems to show up very early. So my colleague and wife, Karen Wynn, at Yale has done a series of studies with babies where she exposes babies to puppets, and the puppets have certain food preferences. So one of the puppets might like green beans. The other puppet might like graham crackers. They test the babies' own food preferences, and babies typically prefer the graham crackers. But the question is, does this matter to babies in how they treat the puppets? And it matters a lot. They tend to prefer the puppet who has the same food tastes that they have, and worse, they actually prefer puppets who punish the puppet with the different food taste.

(Laughter)

We see this sort of in-group/out-group psychology all the time. We see it in political clashes within groups with different ideologies. We see it in its extreme in cases of war, where the out-group isn't merely given less, but dehumanized, as in the Nazi perspective of Jews as vermin or lice, or the American perspective of the Japanese as rats.
Stereotypes can also go awry. So often they're rational and useful, but sometimes they're irrational, they give the wrong answers, and other times they lead to plainly immoral consequences. And the case that's been most studied is the case of race. There was a fascinating study prior to the 2008 election where social psychologists looked at the extent to which the candidates were associated with America, as in an unconscious association with the American flag. And in one of their studies they compared Obama and McCain, and they found that McCain was thought of as more American than Obama, and to some extent, people aren't that surprised by hearing that. McCain is a celebrated war hero, and many people would explicitly say he has more of an American story than Obama. But they also compared Obama to British Prime Minister Tony Blair, and they found that Blair was also thought of as more American than Obama, even though subjects explicitly understood that he's not American at all. But they were responding, of course, to the color of his skin.

These stereotypes and biases have real-world consequences, both subtle and very important. In one recent study, researchers put ads on eBay for the sale of baseball cards. Some of them were held by white hands, others by black hands. They were the same baseball cards. The ones held by black hands got substantially smaller bids than the ones held by white hands. In research done at Stanford, psychologists explored the case of people sentenced for the murder of a white person.
It turns out, holding everything else constant, you are considerably more likely to be executed if you look like the man on the right than the man on the left, and this is in large part because the man on the right looks more prototypically black, more prototypically African-American, and this apparently influences people's decisions over what to do about him. So now that we know about this, how do we combat it? And there are different avenues. One avenue is to appeal to people's emotional responses, to appeal to people's empathy, and we often do that through stories. So if you are a liberal parent and you want to encourage your children to believe in the merits of non-traditional families, you might give them a book like this. If you are conservative and have a different attitude, you might give them a book like this.

(Laughter)

But in general, stories can turn anonymous strangers into people who matter, and the idea that we care about people when we focus on them as individuals is an idea which has shown up across history. So Stalin apocryphally said, "A single death is a tragedy, a million deaths is a statistic," and Mother Teresa said, "If I look at the mass, I will never act. If I look at the one, I will." Psychologists have explored this. For instance, in one study, people were given a list of facts about a crisis, and it was measured how much they would donate to solve this crisis, and another group was given no facts at all, but they were told of an individual and given a name and given a face, and it turns out that they gave far more. None of this, I think, is a secret to the people who are engaged in charity work. They don't tend to deluge people with facts and statistics. Rather, they show them faces, they show them people.
It's possible that by extending our sympathies to an individual, they can spread to the group the individual belongs to. This is Harriet Beecher Stowe. The story, perhaps apocryphal, is that President Lincoln invited her to the White House in the middle of the Civil War and said to her, "So you're the little lady who started this great war." And he was talking about "Uncle Tom's Cabin." "Uncle Tom's Cabin" is not a great book of philosophy or of theology or perhaps not even literature, but it does a great job of getting people to put themselves in the shoes of people they wouldn't otherwise be in the shoes of, put themselves in the shoes of slaves. And that could well have been a catalyst for great social change. More recently, looking at America in the last several decades, some reasonably say that shows like "The Cosby Show" radically changed American attitudes towards African-Americans, while shows like "Will & Grace" and "Modern Family" changed American attitudes towards gay men and women. I don't think it's an exaggeration to say that the major catalyst in America for moral change has been a situation comedy.

But it's not all emotions, and I want to end by appealing to the power of reason. At some point in his wonderful book "The Better Angels of Our Nature," Steven Pinker says, you know, the Old Testament says love thy neighbor, and the New Testament says love thy enemy, but I don't love either one of them, not really, but I don't want to kill them. I know I have obligations to them, but my moral feelings to them, my moral beliefs about how I should behave towards them, aren't grounded in love.
What they're grounded in is the understanding of human rights, a belief that their life is as valuable to them as my life is to me, and to support this, he tells a story from the great philosopher Adam Smith, and I want to tell this story too, though I'm going to modify it a little bit for modern times. So Adam Smith starts by asking you to imagine the death of thousands of people, and imagine that the thousands of people are in a country you are not familiar with. It could be China or India or a country in Africa. And Smith says, how would you respond? And you would say, well, that's too bad, and you'd go on with the rest of your life. If you were to open up The New York Times online or something and discover this, and in fact this happens to us all the time, we go about our lives. But imagine instead, Smith says, you were to learn that tomorrow you were to have your little finger chopped off. Smith says, that would matter a lot. You would not sleep that night wondering about that. So this raises the question: would you sacrifice thousands of lives to save your little finger? Now answer this in the privacy of your own head, but Smith says, absolutely not, what a horrid thought. And so this raises the question, as Smith puts it: "When our passive feelings are almost always so sordid and so selfish, how comes it that our active principles should often be so generous and so noble?" And Smith's answer is, "It is reason, principle, conscience. [This] calls to us, with a voice capable of astonishing the most presumptuous of our passions, that we are but one of the multitude, in no respect better than any other in it." And this last part is what is often described as the principle of impartiality.
And this principle of impartiality manifests itself in all of the world's religions, in all of the different versions of the golden rule, and in all of the world's moral philosophies, which differ in many ways but share the presupposition that we should judge morality from sort of an impartial point of view. The best articulation of this view, for me, is actually not from a theologian or from a philosopher, but from Humphrey Bogart at the end of "Casablanca." So, spoiler alert, he's telling his lover that they have to separate for the more general good, and he says to her, and I won't do the accent, but he says to her, "It doesn't take much to see that the problems of three little people don't amount to a hill of beans in this crazy world."

Our reason could cause us to override our passions. Our reason could motivate us to extend our empathy, could motivate us to write a book like "Uncle Tom's Cabin," or read a book like "Uncle Tom's Cabin," and our reason can motivate us to create customs and taboos and laws that will constrain us from acting upon our impulses when, as rational beings, we feel we should be constrained. This is what a constitution is. A constitution is something which was set up in the past that applies now in the present, and what it says is, no matter how much we might want to reelect a popular president for a third term, no matter how much white Americans might choose to feel that they want to reinstate the institution of slavery, we can't. We have bound ourselves.

And we bind ourselves in other ways as well. We know that when it comes to choosing somebody for a job, for an award, we are strongly biased by their race, we are biased by their gender, we are biased by how attractive they are, and sometimes we might say, "Well, fine, that's the way it should be." But other times we might say, "This is wrong."
And so to combat this, we don't just try harder, but rather what we do is we set up situations where these other sources of information can't bias us, which is why many orchestras audition musicians behind screens, so the only information they have is the information they believe should matter. I think prejudice and bias illustrate a fundamental duality of human nature. We have gut feelings, instincts, emotions, and they affect our judgments and our actions for good and for evil, but we are also capable of rational deliberation and intelligent planning, and we can use these, in some cases, to accelerate and nourish our emotions, and in other cases to staunch them. And it's in this way that reason helps us create a better world. Thank you.

(Applause)