So I'm starting us out today with a historical mystery.

In 1957, there were two young women, both in their 20s, both living in the same city, both members of the same political group. That year, both decided to commit violent attacks. One girl took a gun and approached a soldier at a checkpoint. The other girl took a bomb and went to a crowded café.

But here's the thing: one of those girls followed through with the attack, but the other turned back.

So what made the difference?

I'm a behavioral historian, and I study aggression, moral cognition and decision-making in social movements. That's a mouthful. (Laughs) So, the translation of that is: I study the moment an individual decides to pull the trigger, the day-to-day decisions that led up to that moment, and the stories that they tell themselves about why that behavior is justified.

Now, this topic -- it's not just scholarly for me. It's actually a bit personal. I grew up in Kootenai County, Idaho, and this is very important. This is not the part of Idaho with potatoes. We have no potatoes.
And if you ask me about potatoes, I will find you. (Laughter)

This part of Idaho is known for mountain lakes, horseback riding, skiing. Unfortunately, starting in the 1980s, it also became known as the worldwide headquarters for the Aryan Nations. Every year, members of the local neo-Nazi compound would turn out and march through our town, and every year, members of our town would turn out and protest them.

Now, in 2001, I graduated from high school, and I went to college in New York City. I arrived in August 2001. As many of you probably are aware, three weeks later, the Twin Towers went down. Now, I was shocked. I was incredibly angry. I wanted to do something, but the only thing that I could think of doing at that time was to study Arabic.

I will admit, I was that girl in class that wanted to know why "they" hate "us." I started studying Arabic for very wrong reasons. But something unexpected happened. I got a scholarship to go study in Israel. So the Idaho girl went to the Middle East.
And while I was there, I met Palestinian Muslims, Palestinian Christians, Israeli settlers, Israeli peace activists. And what I learned is that every act has an ecology. It has a context.

Now, since then, I have gone around the world, I have studied violent movements, I have worked with NGOs and ex-combatants in Iraq, Syria, Vietnam, the Balkans, Cuba. I earned my PhD in History, and now what I do is I go to different archives and I dig through documents, looking for police confessions, court cases, diaries and manifestos of individuals involved in violent attacks.

Now, you gather all these documents -- what do they tell you? Our brains love causal mysteries, it turns out. So any time we see an attack on the news, we tend to ask one question: Why? Why did that happen? Well, I can tell you I've read thousands of manifestos, and what you find out is that they are actually imitative. They imitate the political movement that they're drawing from.
So they actually don't tell us a lot about decision-making in that particular case. So we have to teach ourselves to ask a totally different question. Instead of "Why?" we have to ask "How?" How did individuals produce these attacks, and how did their decision-making ecology contribute to violent behavior?

There's a couple things I've learned from asking this kind of question. The most important thing is that political violence is not culturally endemic. We create it. And whether we realize it or not, our day-to-day habits contribute to the creation of violence in our environment. So here's a couple of habits that I've learned contribute to violence.

One of the first things that attackers did when preparing themselves for a violent event was they enclosed themselves in an information bubble. We've heard of fake news, yeah? Well, this shocked me: every group that I studied had some kind of a fake news slogan. French communists called it the "putrid press." French ultranationalists called it the "sellout press" and the "treasonous press." Islamists in Egypt called it the "depraved news." And Egyptian communists called it ...
"fake news."

So why do groups spend all this time trying to make these information bubbles? The answer is actually really simple. We make decisions based on the information we trust, yeah? So if we trust bad information, we're going to make bad decisions.

Another interesting habit that individuals used when they wanted to produce a violent attack was that they looked at their victim not as an individual but just as a member of an opposing team.

Now this gets really weird. There's some fun brain science behind why that kind of thinking is effective. Say I divide all of you guys into two teams: blue team, red team. And then I ask you to compete in a game against each other. Well, the funny thing is, within milliseconds, you will actually start experiencing pleasure -- pleasure -- when something bad happens to members of the other team.

The funny thing about that is if I ask one of you blue team members to go and join the red team, your brain recalibrates, and within milliseconds, you will now start experiencing pleasure when bad things happen to members of your old team.
This is a really good example of why us-them thinking is so dangerous in our political environment.

Another habit that attackers used to kind of rev themselves up for an attack was they focused on differences. In other words, they looked at their victims, and they thought, "I share nothing in common with that person. They are totally different than me."

Again, this might sound like a really simple concept, but there's some fascinating science behind why this works. Say I show you guys videos of different-colored hands and sharp pins being driven into these different-colored hands, OK? If you're white, the chances are you will experience the most sympathetic activation, or the most pain, when you see a pin going into the white hand. If you are Latin American, Arab, Black, you will probably experience the most sympathetic activation watching a pin going into the hand that looks most like yours.

The good news is, that's not biologically fixed. That is learned behavior. Which means the more we spend time with other ethnic communities, and the more we see them as similar to us and part of our team, the more we feel their pain.
The last habit that I'm going to talk about is when attackers prepared themselves to go out and do one of these events, they focused on certain emotional cues. For months, they geared themselves up by focusing on anger cues, for instance. I bring this up because it's really popular right now. If you read blogs or the news, you see talk of two concepts from laboratory science: amygdala hijacking and emotional hijacking.

Now, amygdala hijacking: it's the concept that I show you a cue -- say, a gun -- and your brain reacts with an automatic threat response to that cue. Emotional hijacking -- it's a very similar concept. It's the idea that I show you an anger cue, for instance, and your brain will react with an automatic anger response to that cue. I think women usually get this more than men. (Laughs) (Laughter)

That kind of a hijacking narrative grabs our attention. Just the word "hijacking" grabs our attention. The thing is, most of the time, that's not really how cues work in real life. If you study history, what you find is that we are bombarded with hundreds of thousands of cues every day.
And so what we do is we learn to filter. We ignore some cues, we pay attention to other cues. For political violence, this becomes really important, because what it meant is that attackers usually didn't just see an anger cue and suddenly snap. Instead, politicians, social activists spent weeks, months, years flooding the environment with anger cues, for instance, and attackers, they paid attention to those cues, they trusted those cues, they focused on them, they even memorized those cues.

All of this just really goes to show how important it is to study history. It's one thing to see how cues operate in a laboratory setting. And those laboratory experiments are incredibly important. They give us a lot of new data about how our bodies work. But it's also very important to see how those cues operate in real life.

So what does all this tell us about political violence? Political violence is not culturally endemic. It is not an automatic, predetermined response to environmental stimuli. We produce it. Our everyday habits produce it.

Let's go back, actually, to those two women that I mentioned at the start.
The first woman had been paying attention to those outrage campaigns, so she took a gun and approached a soldier at a checkpoint. But in that moment, something really interesting happened. She looked at that soldier, and she thought to herself, "He's the same age as me. He looks like me." And she put down the gun, and she walked away. Just from that little bit of similarity.

The second girl had a totally different outcome. She also listened to the outrage campaigns, but she surrounded herself with individuals who were supportive of violence, with peers who supported her violence. She enclosed herself in an information bubble. She focused on certain emotional cues for months. She taught herself to bypass certain cultural inhibitions against violence. She practiced her plan, she taught herself new habits, and when the time came, she took her bomb to the café, and she followed through with that attack.

This was not impulse. This was learning.

Polarization in our society is not impulse, it's learning.
Every day we are teaching ourselves: the news we click on, the emotions that we focus on, the thoughts that we entertain about the red team or the blue team. All of this contributes to learning, whether we realize it or not.

The good news is that while the individuals I study already made their decisions, we can still change our trajectory. We might never make the decisions that they made, but we can stop contributing to violent ecologies. We can get out of whatever news bubble we're in, we can be more mindful about the emotional cues that we focus on, the outrage bait that we click on. But most importantly, we can stop seeing each other as just members of the red team or the blue team. Because whether we are Christian, Muslim, Jewish, atheist, Democrat or Republican, we're human. We're human beings. And we often share really similar habits.

We have differences. Those differences are beautiful, and those differences are very important. But our future depends on us being able to find common ground with the other side.
And that's why it is so, so important for us to retrain our brains and stop contributing to violent ecologies.

Thank you.

(Applause)