0:00:00.887,0:00:04.345 So I'm starting us out today[br]with a historical mystery. 0:00:05.037,0:00:08.068 In 1957, there were two young women, 0:00:08.068,0:00:09.512 both in their 20s, 0:00:09.512,0:00:10.935 both living in the same city, 0:00:10.935,0:00:13.771 both members of the same political group. 0:00:14.815,0:00:18.587 That year both decided[br]to commit violent attacks. 0:00:19.578,0:00:23.361 One girl took a gun and approached[br]a soldier at a checkpoint. 0:00:24.404,0:00:29.070 The other girl took a bomb[br]and went to a crowded café. 0:00:30.383,0:00:32.082 But here's the thing. 0:00:32.532,0:00:37.028 One of those girls followed[br]through with the attack, 0:00:37.028,0:00:40.048 but the other turned back. 0:00:41.335,0:00:43.353 So what made the difference? 0:00:44.237,0:00:45.786 I'm a behavioral historian 0:00:45.786,0:00:47.569 and I study aggression, 0:00:47.569,0:00:49.554 moral cognition 0:00:49.554,0:00:52.220 and decision-making in social movements. 0:00:52.720,0:00:53.970 That's a mouthful. 0:00:53.970,0:00:54.972 (Laughing) 0:00:54.972,0:00:57.156 So the translation of that 0:00:57.156,0:01:02.246 is I study the moment an individual[br]decides to pull the trigger, 0:01:02.246,0:01:06.921 the day-to-day decisions[br]that led up to that moment 0:01:06.921,0:01:12.896 and the stories that they tell themselves[br]about why that behavior is justified. 0:01:13.651,0:01:14.852 Now this topic -- 0:01:14.852,0:01:16.964 it's not just scholarly for me. 0:01:17.442,0:01:19.226 It's actually a bit personal. 0:01:19.530,0:01:22.782 I grew up in Kootenai County, Idaho, 0:01:22.782,0:01:24.785 and this is very important. 0:01:25.488,0:01:29.385 This is not the part[br]of Idaho with potatoes. 0:01:29.733,0:01:31.796 We have no potatoes. 0:01:32.247,0:01:34.132 And if you ask me about potatoes, 0:01:34.132,0:01:35.341 I will find you. 0:01:35.341,0:01:36.407 (Laughter) 0:01:36.901,0:01:40.222 This part of Idaho is known[br]for mountain lakes, 0:01:40.222,0:01:42.163 horseback riding -- 0:01:42.163,0:01:43.411 skiing. 0:01:43.804,0:01:45.165 Unfortunately, 0:01:45.165,0:01:47.021 starting in the 1980s, 0:01:47.021,0:01:50.970 it also became known[br]as the worldwide headquarters 0:01:50.970,0:01:53.048 for the Aryan Nations. 0:01:53.468,0:01:57.054 Every year, members of the local[br]neo-Nazi compound 0:01:57.054,0:01:59.920 would turn out and march through our town, 0:01:59.920,0:02:01.308 and every year, 0:02:01.308,0:02:04.435 members of our town[br]would turn out and protest them. 0:02:04.672,0:02:08.914 Now, in 2001, I graduated from high school 0:02:08.914,0:02:12.470 and I went to college in New York City. 0:02:12.768,0:02:16.426 I arrived in August 2001. 0:02:17.482,0:02:19.790 As many of you probably are aware, 0:02:19.790,0:02:21.736 three weeks later, 0:02:21.736,0:02:23.726 the Twin Towers went down. 0:02:24.244,0:02:27.289 Now, I was shocked. 0:02:28.567,0:02:30.942 I was incredibly angry. 0:02:32.725,0:02:34.403 I wanted to do something 0:02:34.403,0:02:38.420 but the only thing that I could think[br]of doing at that time 0:02:38.420,0:02:40.623 was to study Arabic. 0:02:41.955,0:02:44.931 I will admit I was that girl in class 0:02:44.931,0:02:49.237 that wanted to know why "they hate us." 0:02:49.650,0:02:52.763 I started studying Arabic[br]for very wrong reasons. 0:02:53.381,0:02:55.245 But something unexpected happened. 0:02:55.245,0:02:58.215 I got a scholarship to go study in Israel. 0:02:58.877,0:03:01.486 So the Idaho girl went to the Middle East. 
0:03:01.735,0:03:06.204 And while I was there I met[br]Palestinian Muslims, 0:03:06.205,0:03:08.194 Palestinian Christians, 0:03:08.194,0:03:09.270 Israeli settlers, 0:03:09.270,0:03:11.501 Israeli peace activists, 0:03:11.501,0:03:16.035 and what I learned is that[br]every act has an ecology. 0:03:16.736,0:03:18.165 It has a context. 0:03:20.233,0:03:23.809 Now since then I have gone[br]around the world, 0:03:23.809,0:03:27.649 I have studied violent movements, 0:03:27.649,0:03:30.081 I have worked with NGOs 0:03:30.081,0:03:32.693 and ex-combatants in Iraq, 0:03:32.693,0:03:34.037 Syria, 0:03:34.037,0:03:35.243 Vietnam, 0:03:35.243,0:03:36.362 the Balkans -- 0:03:36.362,0:03:37.418 Cuba. 0:03:38.510,0:03:40.964 I earned my PhD in History, 0:03:40.964,0:03:43.501 and now what I do is I go[br]to different archives 0:03:43.501,0:03:45.682 and I dig through documents, 0:03:45.682,0:03:49.305 looking for police confessions, 0:03:49.305,0:03:51.822 court cases ... 0:03:51.822,0:03:52.862 diaries, 0:03:52.862,0:03:56.766 manifestos of individuals[br]involved in violent attacks. 0:03:57.332,0:04:00.384 Now you gather all these documents -- 0:04:00.384,0:04:01.670 what do they tell you? 0:04:02.011,0:04:05.889 Our brains love causal[br]mysteries, it turns out. 0:04:06.237,0:04:09.203 So any time we see an attack on the news, 0:04:09.203,0:04:11.842 we tend to ask one question: 0:04:11.842,0:04:12.955 why? 0:04:12.955,0:04:14.523 Why did that happen? 0:04:14.523,0:04:17.563 Well, I can tell you I've read[br]thousands of manifestos 0:04:17.563,0:04:21.930 and what you find out is[br]that they are actually imitative. 0:04:21.930,0:04:25.390 They imitate the political movement[br]that they're drawing from. 0:04:25.598,0:04:29.567 So they actually don't tell us[br]a lot about decision-making 0:04:29.567,0:04:31.561 in that particular case. 0:04:32.091,0:04:36.367 So we have to teach ourselves[br]to ask a totally different question. 0:04:36.646,0:04:40.324 Instead of "Why?" we have to ask "How?" 0:04:40.324,0:04:43.372 How did individuals produce these attacks 0:04:43.372,0:04:48.440 and how did their decision-making ecology[br]contribute to violent behavior? 0:04:48.861,0:04:53.203 There's a couple things I've learned[br]from asking this kind of question. 0:04:53.871,0:04:58.576 The most important thing is that political[br]violence is not culturally endemic. 0:04:59.123,0:05:00.313 We create it. 0:05:00.614,0:05:03.099 And whether we realize it or not, 0:05:03.099,0:05:08.303 our day-to-day habits contribute[br]to the creation of violence 0:05:08.303,0:05:09.687 in our environment. 0:05:09.949,0:05:14.631 So here's a couple of habits[br]that I've learned contribute to violence. 0:05:16.322,0:05:19.644 One of the first things that attackers did 0:05:19.644,0:05:23.544 when preparing themselves[br]for a violent event 0:05:23.544,0:05:26.914 was they enclosed themselves[br]in an information bubble. 0:05:27.239,0:05:29.526 We've heard of fake news, yeah? 0:05:29.820,0:05:32.229 Well, this shocked me: 0:05:32.229,0:05:35.444 every group that I studied[br]had some kind of a fake news slogan. 0:05:35.807,0:05:39.169 French communists [br]called it the "putrid press." 0:05:39.182,0:05:43.302 French Nationalists called it[br]the "sellout press" 0:05:43.302,0:05:45.240 and the "treasonous press." 0:05:45.437,0:05:48.721 Islamists in Egypt called it[br]the "depraved news." 0:05:48.971,0:05:52.059 And Egyptian communists called it ... 0:05:52.059,0:05:53.331 "fake news." 
0:05:53.434,0:05:57.599 So why do groups spend all this time[br]trying to make these information bubbles? 0:05:57.908,0:06:00.397 The answer is actually really simple. 0:06:00.725,0:06:04.849 We make decisions based on[br]the information we trust, yeah? 0:06:05.380,0:06:09.246 So if we trust bad information, 0:06:09.246,0:06:12.110 we're going to make bad decisions. 0:06:12.110,0:06:15.200 Another interesting habit[br]that individuals used 0:06:15.200,0:06:18.268 when they wanted[br]to produce a violent attack 0:06:18.268,0:06:21.645 was that they looked at their victim[br]not as an individual 0:06:21.645,0:06:24.356 but just as a member of an opposing team. 0:06:25.173,0:06:26.904 Now this gets really weird. 0:06:27.554,0:06:29.289 There's some fun brain science 0:06:29.289,0:06:32.025 behind why that kind[br]of thinking is effective. 0:06:32.025,0:06:35.508 Say I divide all of you guys[br]into two teams: 0:06:35.508,0:06:36.567 blue team, 0:06:36.567,0:06:37.640 red team. 0:06:38.136,0:06:41.282 And then I ask you to compete[br]in a game against each other. 0:06:41.635,0:06:45.587 Well, the funny thing is[br]within milliseconds, 0:06:45.587,0:06:49.578 you will actually start[br]experiencing pleasure ... 0:06:49.578,0:06:54.752 pleasure when something bad happens[br]to members of the other team. 0:06:55.814,0:06:59.974 The funny thing about that is[br]if I ask one of you blue team members 0:06:59.974,0:07:02.836 to go and join the red team, 0:07:02.836,0:07:04.916 your brain recalibrates, 0:07:04.916,0:07:06.275 and within milliseconds, 0:07:06.275,0:07:08.836 you will now start experiencing pleasure 0:07:08.836,0:07:12.294 when bad things happen[br]to members of your old team. 0:07:14.144,0:07:16.932 This is a really good example 0:07:16.932,0:07:20.741 of why us/them thinking is so dangerous 0:07:20.741,0:07:22.379 in our political environment. 0:07:22.694,0:07:27.058 Another habit that attackers used[br]to kind of rev themselves up for an attack 0:07:27.058,0:07:29.098 was they focused on differences. 0:07:29.244,0:07:30.238 In other words, 0:07:30.238,0:07:31.571 they looked at their victims 0:07:31.571,0:07:35.304 and they thought "I share[br]nothing in common with that person. 0:07:35.304,0:07:37.374 They are totally different than me." 0:07:38.953,0:07:42.097 Again, this might sound[br]like a really simple concept, 0:07:42.097,0:07:46.173 but there's some fascinating science[br]behind why this works. 0:07:47.377,0:07:52.462 Say I show you guys videos[br]of different-colored hands 0:07:52.462,0:07:56.261 and sharp pins being driven[br]into these different-colored hands, 0:07:56.261,0:07:57.259 OK? 0:07:58.464,0:08:00.413 If you're white, 0:08:00.413,0:08:06.147 the chances are you will experience[br]the most sympathetic activation, 0:08:06.147,0:08:07.565 or the most pain, 0:08:07.565,0:08:10.496 when you see a pin[br]going into the white hand. 0:08:12.209,0:08:15.471 If you are Latin American, Arab, Black, 0:08:15.471,0:08:19.110 you will probably experience[br]the most sympathetic activation 0:08:19.110,0:08:23.688 watching a pin going into the hand[br]that looks most like yours. 0:08:26.921,0:08:30.335 The good news is that's not[br]biologically fixed. 0:08:30.878,0:08:32.673 That is learned behavior. 0:08:33.395,0:08:38.050 Which means the more we spend time[br]with other ethnic communities, 0:08:38.050,0:08:41.216 and the more we see them as similar to us 0:08:41.216,0:08:44.647 and part of our team, 0:08:44.647,0:08:46.675 the more we feel their pain. 
0:08:47.016,0:08:49.570 The last habit[br]that I'm going to talk about 0:08:49.570,0:08:54.699 is when attackers prepared themselves[br]to go out and do one of these events, 0:08:54.699,0:08:57.146 they focused on certain emotional cues. 0:08:57.372,0:09:02.961 For months they geared themselves up[br]by focusing on anger cues, for instance. 0:09:03.277,0:09:05.863 I bring this up because[br]it's really popular right now. 0:09:05.863,0:09:09.571 If you read blogs or the news, 0:09:09.571,0:09:13.574 you see talk of two concepts[br]from laboratory science: 0:09:13.574,0:09:16.675 amygdala hijacking[br]and emotional hijacking. 0:09:16.931,0:09:19.493 Now, amygdala hijacking: 0:09:19.493,0:09:22.093 it's the concept that I show you a cue -- 0:09:22.093,0:09:23.625 say, a gun -- 0:09:23.625,0:09:28.413 and your brain reacts with an automatic[br]threat response to that cue. 0:09:28.693,0:09:29.838 Emotional hijacking -- 0:09:29.838,0:09:31.276 it's a very similar concept. 0:09:31.276,0:09:34.840 It's the idea that I show[br]you an anger cue, 0:09:34.840,0:09:36.389 for instance, 0:09:36.389,0:09:41.419 and your brain will react[br]with an automatic anger response 0:09:41.419,0:09:42.568 to that cue. 0:09:42.801,0:09:45.495 I think women usually[br]get this more than men. 0:09:45.495,0:09:46.498 (Laughing) 0:09:46.834,0:09:47.827 (Laughter) 0:09:47.827,0:09:51.376 That kind of a hijacking narrative[br]grabs our attention -- 0:09:51.376,0:09:54.049 just the word "hijacking"[br]grabs our attention. 0:09:54.707,0:09:55.756 The thing is, 0:09:55.756,0:10:00.378 most of the time that's not[br]really how cues work in real life. 0:10:01.084,0:10:02.144 If you study history, 0:10:02.144,0:10:07.430 what you find is that we are bombarded[br]with hundreds of thousands of cues 0:10:07.430,0:10:08.661 every day. 0:10:08.988,0:10:11.239 And so what we do is we learn to filter. 0:10:11.239,0:10:12.907 We ignore some cues; 0:10:12.907,0:10:14.787 we pay attention to other cues. 0:10:15.071,0:10:18.655 For political violence[br]this becomes really important, 0:10:18.655,0:10:24.590 because what it meant is that attackers[br]usually didn't just see an anger cue 0:10:24.590,0:10:26.470 and suddenly snap. 0:10:27.002,0:10:33.197 Instead, politicians,[br]social activists spent weeks, 0:10:33.197,0:10:34.204 months, 0:10:34.204,0:10:36.987 years flooding the environment 0:10:36.987,0:10:39.260 with anger cues, for instance. 0:10:40.056,0:10:41.929 And attackers -- 0:10:41.929,0:10:44.388 they paid attention to those cues, 0:10:44.388,0:10:47.116 they trusted those cues, 0:10:47.116,0:10:48.592 they focused on them, 0:10:48.592,0:10:50.877 they even memorized those cues. 0:10:51.335,0:10:57.489 All of this just really goes to show[br]how important it is to study history. 0:10:57.677,0:11:01.179 It's one thing to see how cues[br]operate in a laboratory setting. 0:11:01.766,0:11:05.218 And those laboratory experiments[br]are incredibly important. 0:11:05.403,0:11:09.492 They give us a lot of new data[br]about how our bodies work. 0:11:10.269,0:11:15.056 But it's also very important to see[br]how those cues operate in real life. 0:11:18.647,0:11:22.478 So what does all this tell us[br]about political violence? 0:11:24.075,0:11:27.663 Political violence is not[br]culturally endemic. 0:11:28.081,0:11:33.099 It is not an automatic, predetermined[br]response to environmental stimuli. 0:11:33.682,0:11:34.865 We produce it. 0:11:35.436,0:11:37.509 Our everyday habits produce it. 
0:11:38.914,0:11:42.875 Let's go back actually to those two[br]women that I mentioned at the start. 0:11:44.084,0:11:49.622 The first woman had been paying[br]attention to those outrage campaigns, 0:11:49.622,0:11:53.361 so she took a gun and approached[br]a soldier at a checkpoint. 0:11:55.477,0:11:56.658 But in that moment, 0:11:56.658,0:11:59.010 something really interesting happened. 0:11:59.131,0:12:01.933 She looked at that soldier 0:12:01.933,0:12:06.247 and she thought to herself, 0:12:06.247,0:12:08.951 "He's the same age as me. 0:12:09.507,0:12:11.251 He looks like me." 0:12:12.876,0:12:14.056 And she put down the gun 0:12:14.056,0:12:16.346 and she walked away 0:12:16.346,0:12:19.086 just from that little bit of similarity. 0:12:20.199,0:12:23.634 The second girl had a totally[br]different outcome. 0:12:25.676,0:12:28.583 She also listened[br]to the outrage campaigns, 0:12:28.583,0:12:31.458 but she surrounded herself[br]with individuals 0:12:31.458,0:12:33.269 who were supportive of violence, 0:12:33.269,0:12:35.775 with peers who supported her violence. 0:12:36.819,0:12:39.835 She enclosed herself[br]in an information bubble. 0:12:40.875,0:12:44.206 She focused on certain[br]emotional cues for months. 0:12:44.484,0:12:49.783 She taught herself to bypass certain[br]cultural inhibitions against violence. 0:12:50.148,0:12:51.918 She practiced her plan, 0:12:51.918,0:12:54.079 she taught herself new habits 0:12:54.079,0:12:55.331 and when the time came, 0:12:55.331,0:12:58.368 she took her bomb to the café 0:12:58.368,0:13:00.831 and she followed through with that attack. 0:13:03.656,0:13:06.166 This was not impulse. 0:13:07.102,0:13:08.656 This was learning. 0:13:10.463,0:13:14.369 Polarization in our society[br]is not impulse, 0:13:14.369,0:13:15.715 it's learning. 0:13:16.391,0:13:19.504 Every day we are teaching ourselves: 0:13:19.504,0:13:21.456 the news we click on, 0:13:21.456,0:13:23.523 the emotions that we focus on, 0:13:23.523,0:13:27.883 the thoughts that we entertain[br]about the red team or the blue team. 0:13:28.284,0:13:30.658 All of this contributes to learning, 0:13:30.658,0:13:32.497 whether we realize it or not. 0:13:32.847,0:13:35.570 The good news 0:13:35.570,0:13:41.549 is that while the individuals[br]I study already made their decisions, 0:13:41.549,0:13:43.969 we can still change our trajectory. 0:13:45.323,0:13:49.006 We might never make[br]the decisions that they made, 0:13:49.006,0:13:52.918 but we can stop contributing[br]to violent ecologies. 0:13:53.664,0:13:58.249 We can get out of whatever[br]news bubble we're in, 0:13:58.249,0:14:02.130 we can be more mindful[br]about the emotional cues 0:14:02.130,0:14:03.383 that we focus on, 0:14:03.383,0:14:05.782 the outrage bait that we click on. 0:14:06.422,0:14:07.634 But most importantly, 0:14:07.634,0:14:09.801 we can stop seeing each other 0:14:09.801,0:14:13.545 as just members of the red team[br]or the blue team. 0:14:14.079,0:14:17.017 Because whether we are Christian, 0:14:17.017,0:14:18.018 Muslim, 0:14:18.018,0:14:19.022 Jewish, 0:14:19.022,0:14:20.080 Atheist, 0:14:20.080,0:14:22.388 Democrat or Republican, 0:14:22.388,0:14:23.568 we're human. 0:14:23.568,0:14:24.843 We're human beings. 0:14:26.128,0:14:29.312 And we often share really similar habits. 0:14:30.328,0:14:31.661 We have differences. 0:14:32.233,0:14:34.400 Those differences are beautiful 0:14:34.400,0:14:36.562 and those differences are very important. 
0:14:36.857,0:14:40.422 But our future depends on us 0:14:40.422,0:14:44.628 being able to find common ground[br]with the other side. 0:14:45.785,0:14:49.255 And that's why it is so, so important 0:14:49.255,0:14:51.747 for us to retrain our brains 0:14:51.747,0:14:55.168 and stop contributing[br]to violent ecologies. 0:14:56.272,0:14:57.524 Thank you. 0:14:57.524,0:14:59.374 (Applause)