I want to talk to you today a little bit about predictable irrationality. And my interest in irrational behavior started many years ago in the hospital. I was burned very badly. And if you spend a lot of time in hospital, you'll see a lot of types of irrationalities. And the one that particularly bothered me in the burn department was the process by which the nurses took the bandages off me.

Now, you must have all taken a Band-Aid off at some point, and you must have wondered what the right approach is. Do you rip it off quickly -- short duration but high intensity -- or do you take your Band-Aid off slowly -- you take a long time, but each second is not as painful? Which one of those is the right approach? The nurses in my department thought that the right approach was the ripping one, so they would grab hold and they would rip, and they would grab hold and they would rip. And because I had 70 percent of my body burned, it would take about an hour. And as you can imagine, I hated that moment of ripping with incredible intensity. And I would try to reason with them and say, "Why don't we try something else? Why don't we take it a little longer -- maybe two hours instead of an hour -- and have less of this intensity?" And the nurses told me two things. They told me that they had the right model of the patient -- that they knew what was the right thing to do to minimize my pain -- and they also told me that the word patient doesn't mean to make suggestions or to interfere or ... This is not just in Hebrew, by the way. It's in every language I've had experience with so far. And, you know, there wasn't much I could do, and they kept on doing what they were doing.

And about three years later, when I left the hospital, I started studying at the university. And one of the most interesting lessons I learned was that there is an experimental method: if you have a question, you can create a replica of this question in some abstract way, try to examine it, and maybe learn something about the world. So that's what I did.
I was still interested in this question of how you take bandages off burn patients. So originally I didn't have much money, so I went to a hardware store and I bought a carpenter's vise. And I would bring people to the lab and I would put their finger in it, and I would crunch it a little bit.

(Laughter)

And I would crunch it for long periods and short periods, and pain that went up and pain that went down, and with breaks and without breaks -- all kinds of versions of pain. And when I finished hurting people a little bit, I would ask them: So, how painful was this? Or, how painful was this? Or, if you had to choose between the last two, which one would you choose?

(Laughter)

I kept on doing this for a while.

(Laughter)

And then, like all good academic projects, I got more funding. I moved to sounds, electrical shocks -- I even had a pain suit with which I could get people to feel much more pain. But at the end of this process, what I learned was that the nurses were wrong. Here were wonderful people with good intentions and plenty of experience, and nevertheless they were getting things wrong predictably all the time. It turns out that because we don't encode duration the way we encode intensity, I would have had less pain if the duration had been longer and the intensity lower. It turns out it would have been better to start with my face, which was much more painful, and move toward my legs, giving me a trend of improvement over time -- that would also have been less painful. And it also turns out that it would have been good to give me breaks in the middle to kind of recuperate from the pain. All of these would have been great things to do, and my nurses had no idea.

And from that point on I started thinking: are the nurses the only people in the world who get things wrong in this particular decision, or is it a more general case? And it turns out it's a more general case -- there are a lot of mistakes we make.
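To make that duration-versus-intensity point concrete, here is a toy comparison. The "remembered pain" rule below is a peak-end style summary (the average of the worst moment and the last moment); that rule and all of the numbers are assumptions for illustration, not data from these experiments.

```python
# Toy illustration of why a longer, gentler bandage removal can be remembered
# as less painful than a shorter, harsher one. The peak-end style scoring rule
# and all numbers here are assumptions for illustration, not experimental data.

def remembered_pain(profile):
    """Retrospective rating dominated by the worst moment and the final moment."""
    return (max(profile) + profile[-1]) / 2

def total_pain(profile):
    """What a duration-weighted accounting would track: intensity summed over time."""
    return sum(profile)

fast_and_harsh = [9, 9, 9, 9]                           # shorter, high intensity
slow_and_gentle = [4, 4, 4, 4, 4, 4, 4, 4, 3, 3, 2, 2]  # longer, lower, improving at the end

for name, profile in [("fast and harsh", fast_and_harsh),
                      ("slow and gentle", slow_and_gentle)]:
    print(f"{name:15s}  total={total_pain(profile):2d}  remembered={remembered_pain(profile):.1f}")
```

Even though the gentler schedule delivers more total pain units, it never hits a severe peak and it ends on an improving trend, so it is remembered as much less painful -- which matches the pattern described above.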
And I want to give you one example of one of these irrationalities, and I want to talk to you about cheating. And the reason I picked cheating is because it's interesting, but also because it tells us something, I think, about the stock market situation we're in. So, my interest in cheating started when Enron came on the scene and exploded all of a sudden, and I started thinking about what was happening here. Is it the case that there were a few bad apples who were capable of doing these things, or are we talking about a more endemic situation, where many people are actually capable of behaving this way?

So, like we usually do, I decided to do a simple experiment. And here's how it went. If you were in the experiment, I would pass you a sheet of paper with 20 simple math problems that everybody could solve, but I wouldn't give you enough time. When the five minutes were over, I would say, "Pass me the sheets of paper, and I'll pay you a dollar per question." People did this. I would pay people four dollars for their task -- on average people would solve four problems. Other people I would tempt to cheat. I would pass them their sheet of paper. When the five minutes were over, I would say, "Please shred the piece of paper. Put the little pieces in your pocket or in your backpack, and tell me how many questions you got correct." People now solved seven questions on average. Now, it wasn't as if there were a few bad apples -- a few people who cheated a lot. Instead, what we saw was a lot of people who each cheated a little bit.

Now, in economic theory, cheating is a very simple cost-benefit analysis. You say: What's the probability of being caught? How much do I stand to gain from cheating? And how much punishment would I get if I get caught? And you weigh these options out -- you do the simple cost-benefit analysis, and you decide whether it's worthwhile to commit the crime or not.

So, we tried to test this. For some people, we varied how much money they could get away with -- how much money they could steal. We paid them 10 cents per correct question, 50 cents, a dollar, five dollars, 10 dollars per correct question.
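The cost-benefit account above is, in effect, an expected-value calculation. Here is a minimal sketch of that rational model: the pay levels echo the ones used in the study, but the catch probabilities, the penalty, and the number of over-claimed questions are invented numbers for illustration.

```python
# A minimal sketch of the rational cost-benefit view of cheating.
# The pay levels mirror the ones above; the catch probabilities and the
# penalty are invented for illustration, not taken from the experiments.

def expected_gain_from_cheating(pay_per_question, questions_overclaimed,
                                p_caught, penalty_if_caught):
    """Expected net gain from over-reporting `questions_overclaimed` answers."""
    gain = pay_per_question * questions_overclaimed
    return (1 - p_caught) * gain - p_caught * penalty_if_caught

penalty = 20.0  # assumed cost of being caught (forfeited pay, embarrassment)
for pay in [0.10, 0.50, 1.00, 5.00, 10.00]:   # dollars per "correct" question
    for p_caught in [0.5, 0.1, 0.0]:          # more or less evidence left behind
        ev = expected_gain_from_cheating(pay, 3, p_caught, penalty)
        verdict = "cheat" if ev > 0 else "don't cheat"
        print(f"pay=${pay:5.2f}  p_caught={p_caught:.1f}  EV={ev:+7.2f}  -> {verdict}")
```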
You would expect that as the amount of money on the table increases, people would cheat more, but in fact it wasn't the case. We got a lot of people cheating by stealing a little bit. What about the probability of being caught? Some people shredded half the sheet of paper, so there was some evidence left. Some people shredded the whole sheet of paper. Some people shredded everything, went out of the room, and paid themselves from the bowl of money that had over 100 dollars in it. You would expect that as the probability of being caught goes down, people would cheat more, but again, this was not the case. Again, a lot of people cheated by just a little bit, and they were insensitive to these economic incentives.

So we said, "If people are not sensitive to the explanations of rational economic theory -- to these forces -- what could be going on?" And we thought maybe what is happening is that there are two forces. On the one hand, we all want to be able to look at ourselves in the mirror and feel good about ourselves, so we don't want to cheat. On the other hand, we can cheat a little bit and still feel good about ourselves. So, maybe what is happening is that there's a level of cheating we can't go over, but we can still benefit from cheating at a low degree, as long as it doesn't change our impression of ourselves. We call this a personal fudge factor.
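One toy way to write that fudge-factor idea down -- my own illustrative formalization, not a model presented in the talk -- is a reporting rule that over-claims by a small, fixed amount no matter what the stakes or the odds of detection are.

```python
# A toy "personal fudge factor" reporting rule -- an illustrative assumption,
# not a model fitted to the experiments.

def reported_score(true_score, max_score, fudge_questions=3):
    """How many solved problems someone claims under the fudge-factor idea."""
    # Over-report by a small fixed amount that still lets you feel honest.
    # Neither the pay per question nor the chance of being caught enters the
    # rule, matching the insensitivity to incentives described above.
    return min(true_score + fudge_questions, max_score)

# Someone who truly solved 4 of the 20 problems reports 7 -- a little cheating,
# whether the stakes are 10 cents or 10 dollars per question.
print(reported_score(4, 20))  # -> 7
```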
Now, how would you test the personal fudge factor? Initially we said, what can we do to shrink the fudge factor? So, we got people to the lab, and we said, "We have two tasks for you today." First, we asked half the people to recall either 10 books they read in high school, or to recall the Ten Commandments, and then we tempted them with cheating. It turns out the people who tried to recall the Ten Commandments -- and in our sample nobody could recall all ten -- given the opportunity to cheat, did not cheat at all. It wasn't that the more religious people -- the people who remembered more of the Commandments -- cheated less, and the less religious people -- the people who could hardly remember any of the Commandments -- cheated more. The moment people thought about trying to recall the Ten Commandments, they stopped cheating. In fact, even when we gave self-declared atheists the task of swearing on the Bible and then gave them a chance to cheat, they didn't cheat at all.

Now, the Ten Commandments are hard to bring into the education system, so we said, "Why don't we get people to sign an honor code?" So, we got people to sign, "I understand that this short survey falls under the MIT Honor Code." Then they shredded it. No cheating whatsoever. And this is particularly interesting, because MIT doesn't have an honor code.

(Laughter)

So, all this was about decreasing the fudge factor. What about increasing the fudge factor? The first experiment -- I walked around MIT and I distributed six-packs of Coke in the refrigerators -- these were common refrigerators for the undergrads. And I came back to measure what we technically call the half-lifetime of Coke -- how long does it last in the refrigerators? As you can expect, it doesn't last very long; people take it. In contrast, I took plates with six one-dollar bills each, and I left those plates in the same refrigerators. No bill ever disappeared.

Now, this is not a good social science experiment, so to do it better I ran the same experiment I described to you before. A third of the people we passed the sheet to gave it back to us. A third of the people we passed it to shredded it, came to us and said, "Mr. Experimenter, I solved X problems. Give me X dollars." A third of the people, when they finished shredding the piece of paper, came to us and said, "Mr. Experimenter, I solved X problems. Give me X tokens." We did not pay them with dollars; we paid them with something else. And then they took that something else, walked 12 feet to the side, and exchanged it for dollars.
Think about the following intuition. How bad would you feel about taking a pencil home from work, compared to how bad you would feel about taking 10 cents from a petty cash box? These things feel very different. Would being a step removed from cash for a few seconds, by being paid in tokens, make a difference? Our subjects doubled their cheating. I'll tell you what I think about this and the stock market in a minute.

But this did not yet solve the big problem I had with Enron, because at Enron there's also a social element. People see each other behaving. In fact, every day when we open the news we see examples of people cheating. What does this do to us?

So, we did another experiment. We got a big group of students to be in the experiment, and we prepaid them. So everybody got an envelope with all the money for the experiment, and we told them that at the end, we would ask them to pay back the money they didn't make. OK? The same thing happens. When we give people the opportunity to cheat, they cheat. They cheat just by a little bit, all the same. But in this experiment we also hired an acting student. This acting student stood up after 30 seconds and said, "I solved everything. What do I do now?" And the experimenter said, "If you've finished everything, go home. That's it. The task is finished." So, now we had a student -- an acting student -- who was a part of the group. Nobody knew he was an actor. And he clearly cheated in a very, very serious way.

What would happen to the other people in the group? Would they cheat more, or would they cheat less? Here is what happened. It turns out it depends on what kind of sweatshirt the actor was wearing. Here is the thing. We ran this at Carnegie Mellon, in Pittsburgh. And in Pittsburgh there are two big universities: Carnegie Mellon and the University of Pittsburgh. All of the subjects sitting in the experiment were Carnegie Mellon students.
When the actor who got up was a Carnegie Mellon student -- he really was a Carnegie Mellon student, and so a part of their group -- cheating went up. But when he wore a University of Pittsburgh sweatshirt, cheating went down.

(Laughter)

Now, this is important, because remember, the moment the student stood up, it made it clear to everybody that they could get away with cheating, because the experimenter said, "You've finished everything. Go home," and he left with the money. So it wasn't so much about the probability of being caught again. It was about the norms for cheating. If somebody from our in-group cheats and we see them cheating, we feel it's more appropriate, as a group, to behave this way. But if it's somebody from another group -- these terrible people -- I mean, not terrible in this sense -- but somebody we don't want to associate ourselves with, from another university, another group -- all of a sudden people's awareness of honesty goes up, a little bit like in the Ten Commandments experiment, and people cheat even less.

So, what have we learned from this about cheating? We've learned that a lot of people can cheat. They cheat just by a little bit. When we remind people about their morality, they cheat less. When we get a bigger distance from cheating -- from the object of money, for example -- people cheat more. And when we see cheating around us, particularly if it's on the part of our in-group, cheating goes up.

Now, if we think about this in terms of the stock market, think about what happens. What happens in a situation where you create something in which you pay people a lot of money to see reality in a slightly distorted way? Would they not be able to see it this way? Of course they would. What happens when you do other things, like removing things a step from money? You call them stock, or stock options, derivatives, mortgage-backed securities.
Could it be that with those more distant things -- it's not a token for one second, it's something that is many steps removed from money for a much longer time -- could it be that people will cheat even more? And what happens to the social environment when people see other people behave around them? I think all of those forces worked in a very bad way in the stock market.

More generally, I want to tell you something about behavioral economics. We have many intuitions in our life, and the point is that many of these intuitions are wrong. The question is, are we going to test those intuitions? We can think about how we're going to test them in our private life, in our business life, and most particularly when it comes to policy -- when we think about things like No Child Left Behind, when we create new stock markets, when we create other policies: taxation, health care and so on.

And the difficulty of testing our intuition was the big lesson I learned when I went back to the nurses to talk to them. So I went back to talk to them and tell them what I had found out about removing bandages. And I learned two interesting things. One was that my favorite nurse, Ettie, told me that I did not take her pain into consideration. She said, "Of course, you know, it was very painful for you. But think about me as a nurse, removing the bandages of somebody I liked, and having to do it repeatedly over a long period of time. Creating so much torture was not something that was good for me, either." And she said maybe part of the reason was that it was difficult for her. But it was actually more interesting than that, because she said, "I did not think that your intuition was right. I felt my intuition was correct." So, if you think about all of your intuitions, it's very hard to believe that your intuition is wrong. And she said that, given the fact that she thought her intuition was right -- she really thought her intuition was right -- it was very difficult for her to accept doing a difficult experiment to try and check whether she was wrong.
But in fact, this is the situation we're all in, all the time. We have very strong intuitions about all kinds of things -- our own ability, how the economy works, how we should pay schoolteachers. But unless we start testing those intuitions, we're not going to do better. And just think about how much better my life would have been if those nurses had been willing to check their intuition, and how much better everything would have been if we just started doing more systematic experimentation on our intuitions.

Thank you very much.