WEBVTT 00:00:00.887 --> 00:00:04.415 So I'm starting us out today with a historical mystery. 00:00:05.037 --> 00:00:08.006 In 1957, there were two young women, 00:00:08.030 --> 00:00:09.385 both in their 20s, 00:00:09.409 --> 00:00:10.893 both living in the same city, 00:00:10.917 --> 00:00:13.753 both members of the same political group. 00:00:14.751 --> 00:00:18.591 That year, both decided to commit violent attacks. 00:00:19.284 --> 00:00:23.413 One girl took a gun and approached a soldier at a checkpoint. 00:00:24.300 --> 00:00:29.078 The other girl took a bomb and went to a crowded café. 00:00:30.287 --> 00:00:31.986 But here's the thing: 00:00:32.429 --> 00:00:36.138 one of those girls followed through with the attack, 00:00:37.028 --> 00:00:40.048 but the other turned back. 00:00:41.303 --> 00:00:42.997 So what made the difference? NOTE Paragraph 00:00:44.157 --> 00:00:47.414 I'm a behavioral historian, and I study aggression, 00:00:47.438 --> 00:00:49.359 moral cognition 00:00:49.383 --> 00:00:52.592 and decision-making in social movements. 00:00:52.616 --> 00:00:54.613 That's a mouthful. (Laughs) 00:00:54.637 --> 00:00:57.287 So, the translation of that is: 00:00:57.311 --> 00:01:02.089 I study the moment an individual decides to pull the trigger, 00:01:02.113 --> 00:01:06.749 the day-to-day decisions that led up to that moment 00:01:06.773 --> 00:01:12.896 and the stories that they tell themselves about why that behavior is justified. NOTE Paragraph 00:01:13.468 --> 00:01:14.724 Now, this topic -- 00:01:14.748 --> 00:01:17.418 it's not just scholarly for me. 00:01:17.442 --> 00:01:19.227 It's actually a bit personal. 00:01:19.251 --> 00:01:22.612 I grew up in Kootenai County, Idaho, 00:01:22.636 --> 00:01:24.724 and this is very important. 00:01:25.147 --> 00:01:29.669 This is not the part of Idaho with potatoes. 00:01:29.693 --> 00:01:32.111 We have no potatoes. 
00:01:32.135 --> 00:01:34.108 And if you ask me about potatoes, 00:01:34.132 --> 00:01:35.483 I will find you. NOTE Paragraph 00:01:35.507 --> 00:01:36.726 (Laughter) NOTE Paragraph 00:01:36.750 --> 00:01:40.198 This part of Idaho is known for mountain lakes, 00:01:40.222 --> 00:01:42.047 horseback riding, 00:01:42.071 --> 00:01:43.319 skiing. 00:01:43.684 --> 00:01:46.938 Unfortunately, starting in the 1980s, 00:01:46.962 --> 00:01:50.969 it also became known as the worldwide headquarters 00:01:50.993 --> 00:01:52.984 for the Aryan Nations. 00:01:53.348 --> 00:01:56.911 Every year, members of the local neo-Nazi compound 00:01:56.935 --> 00:01:59.924 would turn out and march through our town, 00:01:59.948 --> 00:02:01.158 and every year, 00:02:01.182 --> 00:02:04.523 members of our town would turn out and protest them. NOTE Paragraph 00:02:04.547 --> 00:02:08.395 Now, in 2001, I graduated from high school, 00:02:08.419 --> 00:02:12.577 and I went to college in New York City. 00:02:12.601 --> 00:02:16.553 I arrived in August 2001. 00:02:17.482 --> 00:02:19.766 As many of you probably are aware, 00:02:19.790 --> 00:02:21.681 three weeks later, 00:02:21.705 --> 00:02:23.500 the Twin Towers went down. NOTE Paragraph 00:02:24.243 --> 00:02:27.997 Now, I was shocked. 00:02:28.503 --> 00:02:30.878 I was incredibly angry. 00:02:32.613 --> 00:02:34.222 I wanted to do something, 00:02:34.246 --> 00:02:38.396 but the only thing that I could think of doing at that time 00:02:38.420 --> 00:02:40.623 was to study Arabic. NOTE Paragraph 00:02:41.955 --> 00:02:43.112 I will admit, 00:02:43.136 --> 00:02:49.223 I was that girl in class that wanted to know why "they" hate "us." 00:02:49.247 --> 00:02:52.638 I started studying Arabic for very wrong reasons. 00:02:53.265 --> 00:02:55.200 But something unexpected happened. 00:02:55.224 --> 00:02:58.194 I got a scholarship to go study in Israel. 00:02:58.718 --> 00:03:01.535 So the Idaho girl went to the Middle East. 
00:03:01.559 --> 00:03:06.045 And while I was there, I met Palestinian Muslims, 00:03:06.069 --> 00:03:08.101 Palestinian Christians, 00:03:08.125 --> 00:03:09.282 Israeli settlers, 00:03:09.306 --> 00:03:10.975 Israeli peace activists. 00:03:11.501 --> 00:03:16.035 And what I learned is that every act has an ecology. 00:03:16.696 --> 00:03:18.125 It has a context. NOTE Paragraph 00:03:20.074 --> 00:03:23.627 Now, since then, I have gone around the world, 00:03:23.651 --> 00:03:27.565 I have studied violent movements, 00:03:27.589 --> 00:03:32.554 I have worked with NGOs and ex-combatants in Iraq, 00:03:32.578 --> 00:03:33.776 Syria, 00:03:33.800 --> 00:03:35.219 Vietnam, 00:03:35.243 --> 00:03:36.425 the Balkans, 00:03:36.449 --> 00:03:37.730 Cuba. 00:03:38.422 --> 00:03:40.817 I earned my PhD in History, 00:03:40.841 --> 00:03:43.355 and now what I do is I go to different archives 00:03:43.379 --> 00:03:45.600 and I dig through documents, 00:03:45.624 --> 00:03:49.257 looking for police confessions, 00:03:49.281 --> 00:03:51.846 court cases, 00:03:51.870 --> 00:03:57.086 diaries and manifestos of individuals involved in violent attacks. 00:03:57.110 --> 00:04:00.296 Now, you gather all these documents -- 00:04:00.320 --> 00:04:01.730 what do they tell you? NOTE Paragraph 00:04:01.754 --> 00:04:04.921 Our brains love causal mysteries, 00:04:04.945 --> 00:04:06.133 it turns out. 00:04:06.157 --> 00:04:09.107 So any time we see an attack on the news, 00:04:09.131 --> 00:04:11.563 we tend to ask one question: 00:04:11.587 --> 00:04:12.889 Why? 00:04:12.913 --> 00:04:14.404 Why did that happen? 00:04:14.428 --> 00:04:17.328 Well, I can tell you I've read thousands of manifestos, 00:04:17.352 --> 00:04:21.913 and what you find out is that they are actually imitative. 00:04:21.937 --> 00:04:25.506 They imitate the political movement that they're drawing from. 
00:04:25.530 --> 00:04:29.543 So they actually don't tell us a lot about decision-making 00:04:29.567 --> 00:04:31.561 in that particular case. 00:04:31.924 --> 00:04:36.542 So we have to teach ourselves to ask a totally different question. 00:04:36.566 --> 00:04:40.144 Instead of "Why?" we have to ask "How?" 00:04:40.168 --> 00:04:43.079 How did individuals produce these attacks, 00:04:43.103 --> 00:04:48.381 and how did their decision-making ecology contribute to violent behavior? NOTE Paragraph 00:04:48.781 --> 00:04:53.799 There's a couple things I've learned from asking this kind of question. 00:04:53.823 --> 00:04:55.895 The most important thing is that 00:04:55.919 --> 00:04:58.955 political violence is not culturally endemic. 00:04:58.979 --> 00:05:00.359 We create it. 00:05:00.383 --> 00:05:02.673 And whether we realize it or not, 00:05:02.697 --> 00:05:08.160 our day-to-day habits contribute to the creation of violence 00:05:08.184 --> 00:05:09.738 in our environment. NOTE Paragraph 00:05:09.762 --> 00:05:14.978 So here's a couple of habits that I've learned contribute to violence. 00:05:16.322 --> 00:05:19.562 One of the first things that attackers did 00:05:19.586 --> 00:05:23.392 when preparing themselves for a violent event 00:05:23.416 --> 00:05:27.111 was they enclosed themselves in an information bubble. 00:05:27.135 --> 00:05:29.693 We've heard of fake news, yeah? 00:05:29.717 --> 00:05:32.094 Well, this shocked me: 00:05:32.118 --> 00:05:35.616 every group that I studied had some kind of a fake news slogan. 00:05:35.640 --> 00:05:39.028 French communists called it the "putrid press." 00:05:39.052 --> 00:05:43.149 French ultranationalists called it the "sellout press" 00:05:43.173 --> 00:05:45.365 and the "treasonous press." 00:05:45.389 --> 00:05:48.900 Islamists in Egypt called it the "depraved news." 00:05:48.924 --> 00:05:51.989 And Egyptian communists called it ... 00:05:52.013 --> 00:05:53.266 "fake news." 
00:05:53.290 --> 00:05:57.796 So why do groups spend all this time trying to make these information bubbles? 00:05:57.820 --> 00:06:00.701 The answer is actually really simple. 00:06:00.725 --> 00:06:05.220 We make decisions based on the information we trust, yeah? 00:06:05.244 --> 00:06:09.087 So if we trust bad information, 00:06:09.111 --> 00:06:12.086 we're going to make bad decisions. NOTE Paragraph 00:06:12.110 --> 00:06:15.100 Another interesting habit that individuals used 00:06:15.124 --> 00:06:18.067 when they wanted to produce a violent attack 00:06:18.091 --> 00:06:21.621 was that they looked at their victim not as an individual 00:06:21.645 --> 00:06:24.356 but just as a member of an opposing team. 00:06:25.006 --> 00:06:26.737 Now this gets really weird. 00:06:27.554 --> 00:06:31.889 There's some fun brain science behind why that kind of thinking is effective. 00:06:31.913 --> 00:06:35.441 Say I divide all of you guys into two teams: 00:06:35.465 --> 00:06:36.617 blue team, 00:06:36.641 --> 00:06:37.945 red team. 00:06:37.969 --> 00:06:41.437 And then I ask you to compete in a game against each other. 00:06:41.461 --> 00:06:45.419 Well, the funny thing is, within milliseconds, 00:06:45.443 --> 00:06:50.292 you will actually start experiencing pleasure -- pleasure -- 00:06:50.316 --> 00:06:55.129 when something bad happens to members of the other team. 00:06:55.814 --> 00:06:59.898 The funny thing about that is if I ask one of you blue team members 00:06:59.922 --> 00:07:01.837 to go and join the red team, 00:07:02.805 --> 00:07:04.797 your brain recalibrates, 00:07:04.821 --> 00:07:06.195 and within milliseconds, 00:07:06.219 --> 00:07:08.812 you will now start experiencing pleasure 00:07:08.836 --> 00:07:12.294 when bad things happen to members of your old team. 00:07:14.064 --> 00:07:20.669 This is a really good example of why us-them thinking is so dangerous 00:07:20.693 --> 00:07:22.463 in our political environment. 
NOTE Paragraph 00:07:22.487 --> 00:07:26.828 Another habit that attackers used to kind of rev themselves up for an attack 00:07:26.852 --> 00:07:29.220 was they focused on differences. 00:07:29.244 --> 00:07:32.201 In other words, they looked at their victims, and they thought, 00:07:33.075 --> 00:07:35.221 "I share nothing in common with that person. 00:07:35.245 --> 00:07:37.374 They are totally different than me." 00:07:38.829 --> 00:07:41.995 Again, this might sound like a really simple concept, 00:07:42.019 --> 00:07:46.676 but there's some fascinating science behind why this works. 00:07:47.209 --> 00:07:52.438 Say I show you guys videos of different-colored hands 00:07:52.462 --> 00:07:56.237 and sharp pins being driven into these different-colored hands, 00:07:56.261 --> 00:07:57.411 OK? 00:07:58.360 --> 00:08:00.219 If you're white, 00:08:00.243 --> 00:08:05.954 the chances are you will experience the most sympathetic activation, 00:08:05.978 --> 00:08:07.541 or the most pain, 00:08:07.565 --> 00:08:10.496 when you see a pin going into the white hand. 00:08:12.053 --> 00:08:15.447 If you are Latin American, Arab, Black, 00:08:15.471 --> 00:08:19.038 you will probably experience the most sympathetic activation 00:08:19.062 --> 00:08:23.926 watching a pin going into the hand that looks most like yours. 00:08:26.846 --> 00:08:30.718 The good news is, that's not biologically fixed. 00:08:30.742 --> 00:08:32.537 That is learned behavior. 00:08:33.252 --> 00:08:37.730 Which means the more we spend time with other ethnic communities 00:08:37.754 --> 00:08:44.623 and the more we see them as similar to us and part of our team, 00:08:44.647 --> 00:08:46.875 the more we feel their pain. NOTE Paragraph 00:08:46.899 --> 00:08:49.456 The last habit that I'm going to talk about 00:08:49.480 --> 00:08:54.537 is when attackers prepared themselves to go out and do one of these events, 00:08:54.561 --> 00:08:57.227 they focused on certain emotional cues. 
00:08:57.251 --> 00:09:03.117 For months, they geared themselves up by focusing on anger cues, for instance. 00:09:03.141 --> 00:09:05.839 I bring this up because it's really popular right now. 00:09:05.863 --> 00:09:09.547 If you read blogs or the news, 00:09:09.571 --> 00:09:13.550 you see talk of two concepts from laboratory science: 00:09:13.574 --> 00:09:16.779 amygdala hijacking and emotional hijacking. 00:09:16.803 --> 00:09:19.376 Now, amygdala hijacking: 00:09:19.400 --> 00:09:23.489 it's the concept that I show you a cue -- say, a gun -- 00:09:23.513 --> 00:09:27.406 and your brain reacts with an automatic threat response 00:09:27.430 --> 00:09:28.599 to that cue. 00:09:28.623 --> 00:09:31.124 Emotional hijacking -- it's a very similar concept. 00:09:31.148 --> 00:09:36.179 It's the idea that I show you an anger cue, for instance, 00:09:36.203 --> 00:09:41.292 and your brain will react with an automatic anger response 00:09:41.316 --> 00:09:42.643 to that cue. 00:09:42.667 --> 00:09:46.646 I think women usually get this more than men. (Laughs) NOTE Paragraph 00:09:46.670 --> 00:09:47.677 (Laughter) NOTE Paragraph 00:09:47.701 --> 00:09:51.227 That kind of a hijacking narrative grabs our attention. 00:09:51.251 --> 00:09:54.016 Just the word "hijacking" grabs our attention. 00:09:54.526 --> 00:09:55.677 The thing is, 00:09:55.701 --> 00:10:00.472 most of the time, that's not really how cues work in real life. 00:10:00.970 --> 00:10:02.120 If you study history, 00:10:02.144 --> 00:10:07.406 what you find is that we are bombarded with hundreds of thousands of cues 00:10:07.430 --> 00:10:08.837 every day. 00:10:08.861 --> 00:10:10.908 And so what we do is we learn to filter. 00:10:10.932 --> 00:10:12.776 We ignore some cues, 00:10:12.800 --> 00:10:14.979 we pay attention to other cues. 
NOTE Paragraph 00:10:15.003 --> 00:10:18.631 For political violence, this becomes really important, 00:10:18.655 --> 00:10:24.566 because what it meant is that attackers usually didn't just see an anger cue 00:10:24.590 --> 00:10:26.470 and suddenly snap. 00:10:26.825 --> 00:10:28.288 Instead, 00:10:28.312 --> 00:10:34.831 politicians, social activists spent weeks, months, years 00:10:34.855 --> 00:10:39.878 flooding the environment with anger cues, for instance, 00:10:39.902 --> 00:10:41.905 and attackers, 00:10:41.929 --> 00:10:44.364 they paid attention to those cues, 00:10:44.388 --> 00:10:46.914 they trusted those cues, 00:10:46.938 --> 00:10:48.486 they focused on them, 00:10:48.510 --> 00:10:51.112 they even memorized those cues. NOTE Paragraph 00:10:51.136 --> 00:10:57.613 All of this just really goes to show how important it is to study history. 00:10:57.637 --> 00:11:01.607 It's one thing to see how cues operate in a laboratory setting. 00:11:01.631 --> 00:11:05.245 And those laboratory experiments are incredibly important. 00:11:05.269 --> 00:11:09.491 They give us a lot of new data about how our bodies work. 00:11:10.269 --> 00:11:15.488 But it's also very important to see how those cues operate in real life. NOTE Paragraph 00:11:18.535 --> 00:11:22.700 So what does all this tell us about political violence? 00:11:23.908 --> 00:11:27.496 Political violence is not culturally endemic. 00:11:27.985 --> 00:11:33.208 It is not an automatic, predetermined response to environmental stimuli. 00:11:33.523 --> 00:11:34.706 We produce it. 00:11:35.356 --> 00:11:37.429 Our everyday habits produce it. NOTE Paragraph 00:11:38.945 --> 00:11:42.906 Let's go back, actually, to those two women that I mentioned at the start. 00:11:43.940 --> 00:11:49.518 The first woman had been paying attention to those outrage campaigns, 00:11:49.542 --> 00:11:50.965 so she took a gun 00:11:50.989 --> 00:11:53.258 and approached a soldier at a checkpoint. 
00:11:55.302 --> 00:11:58.905 But in that moment, something really interesting happened. 00:11:58.929 --> 00:12:01.798 She looked at that soldier, 00:12:01.822 --> 00:12:03.775 and she thought to herself, 00:12:06.180 --> 00:12:08.762 "He's the same age as me. 00:12:09.435 --> 00:12:10.953 He looks like me." 00:12:12.724 --> 00:12:15.262 And she put down the gun, and she walked away. 00:12:16.179 --> 00:12:18.602 Just from that little bit of similarity. NOTE Paragraph 00:12:20.128 --> 00:12:23.702 The second girl had a totally different outcome. 00:12:25.533 --> 00:12:28.382 She also listened to the outrage campaigns, 00:12:28.406 --> 00:12:31.398 but she surrounded herself with individuals 00:12:31.422 --> 00:12:33.246 who were supportive of violence, 00:12:33.270 --> 00:12:35.776 with peers who supported her violence. 00:12:36.707 --> 00:12:39.983 She enclosed herself in an information bubble. 00:12:40.747 --> 00:12:44.397 She focused on certain emotional cues for months. 00:12:44.421 --> 00:12:50.013 She taught herself to bypass certain cultural inhibitions against violence. 00:12:50.037 --> 00:12:51.784 She practiced her plan, 00:12:51.808 --> 00:12:54.055 she taught herself new habits, 00:12:54.079 --> 00:12:58.201 and when the time came, she took her bomb to the café, 00:12:58.225 --> 00:13:00.526 and she followed through with that attack. NOTE Paragraph 00:13:03.592 --> 00:13:06.403 This was not impulse. 00:13:06.903 --> 00:13:08.576 This was learning. 00:13:10.463 --> 00:13:14.345 Polarization in our society is not impulse, 00:13:14.369 --> 00:13:15.715 it's learning. 00:13:16.391 --> 00:13:19.318 Every day we are teaching ourselves: 00:13:19.342 --> 00:13:21.304 the news we click on, 00:13:21.328 --> 00:13:23.372 the emotions that we focus on, 00:13:23.396 --> 00:13:27.756 the thoughts that we entertain about the red team or the blue team. 
00:13:28.284 --> 00:13:30.549 All of this contributes to learning, 00:13:30.573 --> 00:13:32.296 whether we realize it or not. NOTE Paragraph 00:13:32.696 --> 00:13:34.324 The good news 00:13:35.570 --> 00:13:41.441 is that while the individuals I study already made their decisions, 00:13:41.465 --> 00:13:43.885 we can still change our trajectory. 00:13:45.164 --> 00:13:48.824 We might never make the decisions that they made, 00:13:48.848 --> 00:13:52.997 but we can stop contributing to violent ecologies. 00:13:53.552 --> 00:13:58.038 We can get out of whatever news bubble we're in, 00:13:58.062 --> 00:14:02.106 we can be more mindful about the emotional cues 00:14:02.130 --> 00:14:03.359 that we focus on, 00:14:03.383 --> 00:14:05.782 the outrage bait that we click on. 00:14:06.422 --> 00:14:07.602 But most importantly, 00:14:07.626 --> 00:14:12.127 we can stop seeing each other as just members of the red team 00:14:12.151 --> 00:14:13.545 or the blue team. 00:14:13.959 --> 00:14:19.856 Because whether we are Christian, Muslim, Jewish, atheist, 00:14:19.880 --> 00:14:22.364 Democrat or Republican, 00:14:22.388 --> 00:14:23.544 we're human. 00:14:23.568 --> 00:14:24.843 We're human beings. 00:14:25.992 --> 00:14:29.494 And we often share really similar habits. NOTE Paragraph 00:14:30.224 --> 00:14:32.057 We have differences. 00:14:32.081 --> 00:14:34.296 Those differences are beautiful, 00:14:34.320 --> 00:14:36.761 and those differences are very important. 00:14:36.785 --> 00:14:43.254 But our future depends on us being able to find common ground 00:14:43.278 --> 00:14:44.628 with the other side. 00:14:45.785 --> 00:14:49.120 And that's why it is so, so important 00:14:49.144 --> 00:14:51.723 for us to retrain our brains 00:14:51.747 --> 00:14:55.547 and stop contributing to violent ecologies. NOTE Paragraph 00:14:56.272 --> 00:14:57.444 Thank you. NOTE Paragraph 00:14:57.468 --> 00:14:59.374 (Applause)