
Our buggy moral code

  • 0:01 - 0:03
    I want to talk to you today a little bit
  • 0:03 - 0:06
    about predictable irrationality.
  • 0:06 - 0:10
    And my interest in irrational behavior
  • 0:10 - 0:13
    started many years ago in the hospital.
  • 0:13 - 0:17
    I was burned very badly.
  • 0:17 - 0:20
    And if you spend a lot of time in hospital,
  • 0:20 - 0:23
    you'll see a lot of types of irrationalities.
  • 0:23 - 0:28
    And the one that particularly bothered me in the burn department
  • 0:28 - 0:32
    was the process by which the nurses took the bandage off me.
  • 0:33 - 0:35
    Now, you must have all taken a Band-Aid off at some point,
  • 0:35 - 0:38
    and you must have wondered what's the right approach.
  • 0:38 - 0:42
    Do you rip it off quickly -- short duration but high intensity --
  • 0:42 - 0:44
    or do you take your Band-Aid off slowly --
  • 0:44 - 0:48
    you take a long time, but each second is not as painful --
  • 0:48 - 0:51
    which one of those is the right approach?
  • 0:51 - 0:55
    The nurses in my department thought that the right approach
  • 0:55 - 0:58
    was the ripping one, so they would grab hold and they would rip,
  • 0:58 - 1:00
    and they would grab hold and they would rip.
  • 1:00 - 1:04
    And because I had 70 percent of my body burned, it would take about an hour.
  • 1:04 - 1:07
    And as you can imagine,
  • 1:07 - 1:11
    I hated that moment of ripping with incredible intensity.
  • 1:11 - 1:13
    And I would try to reason with them and say,
  • 1:13 - 1:14
    "Why don't we try something else?
  • 1:14 - 1:16
    Why don't we take it a little longer --
  • 1:16 - 1:21
    maybe two hours instead of an hour -- and have less of this intensity?"
  • 1:21 - 1:23
    And the nurses told me two things.
  • 1:23 - 1:27
    They told me that they had the right model of the patient --
  • 1:27 - 1:30
    that they knew what was the right thing to do to minimize my pain --
  • 1:30 - 1:33
    and they also told me that the word patient doesn't mean
  • 1:33 - 1:35
    to make suggestions or to interfere or ...
  • 1:35 - 1:38
    This is not just in Hebrew, by the way.
  • 1:38 - 1:41
    It's in every language I've had experience with so far.
  • 1:41 - 1:45
    And, you know, there's not much -- there wasn't much I could do,
  • 1:45 - 1:48
    and they kept on doing what they were doing.
  • 1:48 - 1:50
    And about three years later, when I left the hospital,
  • 1:50 - 1:53
    I started studying at the university.
  • 1:53 - 1:56
    And one of the most interesting lessons I learned
  • 1:56 - 1:58
    was that there is an experimental method
  • 1:58 - 2:02
    whereby, if you have a question, you can create a replica of this question
  • 2:02 - 2:06
    in some abstract way, and you can try to examine this question,
  • 2:06 - 2:08
    maybe learn something about the world.
  • 2:08 - 2:10
    So that's what I did.
  • 2:10 - 2:11
    I was still interested
  • 2:11 - 2:13
    in this question of how do you take bandages off burn patients.
  • 2:13 - 2:16
    So originally I didn't have much money,
  • 2:16 - 2:20
    so I went to a hardware store and I bought a carpenter's vice.
  • 2:20 - 2:24
    And I would bring people to the lab and I would put their finger in it,
  • 2:24 - 2:26
    and I would crunch it a little bit.
  • 2:26 - 2:28
    (Laughter)
  • 2:28 - 2:31
    And I would crunch it for long periods and short periods,
  • 2:31 - 2:33
    and pain that went up and pain that went down,
  • 2:33 - 2:37
    and with breaks and without breaks -- all kinds of versions of pain.
  • 2:37 - 2:39
    And when I finished hurting people a little bit, I would ask them,
  • 2:39 - 2:41
    so, how painful was this? Or, how painful was this?
  • 2:41 - 2:43
    Or, if you had to choose between the last two,
  • 2:43 - 2:45
    which one would you choose?
  • 2:45 - 2:48
    (Laughter)
  • 2:48 - 2:51
    I kept on doing this for a while.
  • 2:51 - 2:53
    (Laughter)
  • 2:53 - 2:57
    And then, like all good academic projects, I got more funding.
  • 2:57 - 2:59
    I moved to sounds, electrical shocks --
  • 2:59 - 3:04
    I even had a pain suit that let me get people to feel much more pain.
  • 3:04 - 3:08
    But at the end of this process,
  • 3:08 - 3:11
    what I learned was that the nurses were wrong.
  • 3:11 - 3:14
    Here were wonderful people with good intentions
  • 3:14 - 3:16
    and plenty of experience, and nevertheless
  • 3:16 - 3:20
    they were getting things wrong predictably all the time.
  • 3:20 - 3:23
    It turns out that because we don't encode duration
  • 3:23 - 3:25
    in the way that we encode intensity,
  • 3:25 - 3:29
    I would have had less pain if the duration had been longer
  • 3:29 - 3:31
    and the intensity was lower.
  • 3:31 - 3:34
    It turns out it would have been better to start with my face,
  • 3:34 - 3:36
    which was much more painful, and move toward my legs,
  • 3:36 - 3:39
    giving me a trend of improvement over time --
  • 3:39 - 3:40
    that would have been also less painful.
  • 3:40 - 3:42
    And it also turns out that it would have been good
  • 3:42 - 3:44
    to give me breaks in the middle to kind of recuperate from the pain.
  • 3:44 - 3:46
    All of these would have been great things to do,
  • 3:46 - 3:49
    and my nurses had no idea.
  • 3:49 - 3:50
    And from that point on I started thinking,
  • 3:50 - 3:53
    are the nurses the only people in the world who get things wrong
  • 3:53 - 3:56
    in this particular decision, or is it a more general case?
  • 3:56 - 3:58
    And it turns out it's a more general case --
  • 3:58 - 4:01
    there are a lot of mistakes we make.
  • 4:01 - 4:06
    And I want to give you one example of one of these irrationalities,
  • 4:06 - 4:09
    and I want to talk to you about cheating.
  • 4:09 - 4:11
    And the reason I picked cheating is because it's interesting,
  • 4:11 - 4:13
    but also it tells us something, I think,
  • 4:13 - 4:16
    about the stock market situation we're in.
  • 4:16 - 4:19
    So, my interest in cheating started
  • 4:19 - 4:21
    when Enron came on the scene, exploded all of a sudden,
  • 4:21 - 4:24
    and I started thinking about what is happening here.
  • 4:24 - 4:25
    Is it the case that there was kind of
  • 4:25 - 4:28
    a few bad apples who were capable of doing these things,
  • 4:28 - 4:30
    or are we talking a more endemic situation,
  • 4:30 - 4:34
    that many people are actually capable of behaving this way?
  • 4:34 - 4:38
    So, like we usually do, I decided to do a simple experiment.
  • 4:38 - 4:39
    And here's how it went.
  • 4:39 - 4:42
    If you were in the experiment, I would pass you a sheet of paper
  • 4:42 - 4:46
    with 20 simple math problems that everybody could solve,
  • 4:46 - 4:48
    but I wouldn't give you enough time.
  • 4:48 - 4:50
    When the five minutes were over, I would say,
  • 4:50 - 4:53
    "Pass me the sheets of paper, and I'll pay you a dollar per question."
  • 4:53 - 4:57
    People did this. I would pay people four dollars for their task --
  • 4:57 - 4:59
    on average people would solve four problems.
  • 4:59 - 5:02
    Other people I would tempt to cheat.
  • 5:02 - 5:03
    I would pass their sheet of paper.
  • 5:03 - 5:05
    When the five minutes were over, I would say,
  • 5:05 - 5:06
    "Please shred the piece of paper.
  • 5:06 - 5:09
    Put the little pieces in your pocket or in your backpack,
  • 5:09 - 5:12
    and tell me how many questions you got correctly."
  • 5:12 - 5:15
    People now solved seven questions on average.
  • 5:15 - 5:20
    Now, it wasn't as if there were a few bad apples --
  • 5:20 - 5:23
    a few people cheated a lot.
  • 5:23 - 5:26
    Instead, what we saw is a lot of people who cheat a little bit.
  • 5:26 - 5:29
    Now, in economic theory,
  • 5:29 - 5:32
    cheating is a very simple cost-benefit analysis.
  • 5:32 - 5:34
    You say, what's the probability of being caught?
  • 5:34 - 5:37
    How much do I stand to gain from cheating?
  • 5:37 - 5:39
    And how much punishment would I get if I get caught?
  • 5:39 - 5:41
    And you weigh these options out --
  • 5:41 - 5:43
    you do the simple cost-benefit analysis,
  • 5:43 - 5:46
    and you decide whether it's worthwhile to commit the crime or not.
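The rational cost-benefit calculation described here can be sketched as a toy expected-value comparison. This is only an illustration of the standard economic model the talk refers to; the function name and all the numbers are made up for the example, not data from the experiments:

```python
def expected_gain_from_cheating(gain, p_caught, penalty):
    """Toy rational-crime model: a purely rational agent cheats
    only if the expected gain exceeds the expected punishment."""
    return (1 - p_caught) * gain - p_caught * penalty

# Illustrative numbers (assumptions, not from the talk):
# $10 to gain, 5% chance of being caught, $50 fine if caught.
ev = expected_gain_from_cheating(gain=10, p_caught=0.05, penalty=50)
print(ev)       # 7.0 -> positive, so the model predicts cheating
print(ev > 0)   # True
```

On this model, raising the stakes or lowering the chance of being caught should change behavior; the point of the experiments that follow is that, empirically, it didn't.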
  • 5:46 - 5:48
    So, we try to test this.
  • 5:48 - 5:52
    For some people, we varied how much money they could get away with --
  • 5:52 - 5:53
    how much money they could steal.
  • 5:53 - 5:56
    We paid them 10 cents per correct question, 50 cents,
  • 5:56 - 5:59
    a dollar, five dollars, 10 dollars per correct question.
  • 5:59 - 6:03
    You would expect that as the amount of money on the table increases,
  • 6:03 - 6:06
    people would cheat more, but in fact it wasn't the case.
  • 6:06 - 6:09
    We got a lot of people cheating by stealing just a little bit.
  • 6:09 - 6:12
    What about the probability of being caught?
  • 6:12 - 6:14
    Some people shredded half the sheet of paper,
  • 6:14 - 6:15
    so there was some evidence left.
  • 6:15 - 6:17
    Some people shredded the whole sheet of paper.
  • 6:17 - 6:20
    Some people shredded everything, went out of the room,
  • 6:20 - 6:23
    and paid themselves from the bowl of money that had over 100 dollars.
  • 6:23 - 6:26
    You would expect that as the probability of being caught goes down,
  • 6:26 - 6:29
    people would cheat more, but again, this was not the case.
  • 6:29 - 6:32
    Again, a lot of people cheated by just a little bit,
  • 6:32 - 6:35
    and they were insensitive to these economic incentives.
  • 6:35 - 6:36
    So we said, "If people are not sensitive
  • 6:36 - 6:41
    to the rational economic explanations, to these forces,
  • 6:41 - 6:44
    what could be going on?"
  • 6:44 - 6:47
    And we thought maybe what is happening is that there are two forces.
  • 6:47 - 6:49
    On the one hand, we all want to look at ourselves in the mirror
  • 6:49 - 6:52
    and feel good about ourselves, so we don't want to cheat.
  • 6:52 - 6:54
    On the other hand, we can cheat a little bit,
  • 6:54 - 6:56
    and still feel good about ourselves.
  • 6:56 - 6:57
    So, maybe what is happening is that
  • 6:57 - 6:59
    there's a level of cheating we can't go over,
  • 6:59 - 7:03
    but we can still benefit from cheating at a low degree,
  • 7:03 - 7:06
    as long as it doesn't change our impressions about ourselves.
  • 7:06 - 7:09
    We call this the personal fudge factor.
  • 7:10 - 7:14
    Now, how would you test a personal fudge factor?
  • 7:14 - 7:18
    Initially we said, what can we do to shrink the fudge factor?
  • 7:18 - 7:20
    So, we got people to the lab, and we said,
  • 7:20 - 7:22
    "We have two tasks for you today."
  • 7:22 - 7:23
    First, we asked half the people
  • 7:23 - 7:25
    to recall either 10 books they read in high school,
  • 7:25 - 7:28
    or to recall The Ten Commandments,
  • 7:28 - 7:30
    and then we tempted them with cheating.
  • 7:30 - 7:33
    Turns out the people who tried to recall The Ten Commandments --
  • 7:33 - 7:35
    and in our sample nobody could recall all of The Ten Commandments --
  • 7:36 - 7:40
    those people who tried to recall The Ten Commandments,
  • 7:40 - 7:43
    given the opportunity to cheat, did not cheat at all.
  • 7:43 - 7:45
    It wasn't that the more religious people --
  • 7:45 - 7:46
    the people who remembered more of the Commandments -- cheated less,
  • 7:46 - 7:48
    and the less religious people --
  • 7:48 - 7:49
    the people who couldn't remember almost any Commandments --
  • 7:49 - 7:51
    cheated more.
  • 7:51 - 7:55
    The moment people thought about trying to recall The Ten Commandments,
  • 7:55 - 7:56
    they stopped cheating.
  • 7:56 - 7:58
    In fact, even when we gave self-declared atheists
  • 7:58 - 8:02
    the task of swearing on the Bible and gave them a chance to cheat,
  • 8:02 - 8:04
    they didn't cheat at all.
  • 8:06 - 8:08
    Now, the Ten Commandments are something that is hard
  • 8:08 - 8:10
    to bring into the education system, so we said,
  • 8:10 - 8:12
    "Why don't we get people to sign the honor code?"
  • 8:12 - 8:14
    So, we got people to sign,
  • 8:14 - 8:18
    "I understand that this short survey falls under the MIT Honor Code."
  • 8:18 - 8:21
    Then they shredded it. No cheating whatsoever.
  • 8:21 - 8:22
    And this is particularly interesting,
  • 8:22 - 8:24
    because MIT doesn't have an honor code.
  • 8:24 - 8:29
    (Laughter)
  • 8:29 - 8:33
    So, all this was about decreasing the fudge factor.
  • 8:33 - 8:36
    What about increasing the fudge factor?
  • 8:36 - 8:38
    The first experiment -- I walked around MIT
  • 8:38 - 8:41
    and I distributed six-packs of Cokes in the refrigerators --
  • 8:41 - 8:43
    these were common refrigerators for the undergrads.
  • 8:43 - 8:46
    And I came back to measure what we technically call
  • 8:46 - 8:50
    the half-lifetime of Coke -- how long does it last in the refrigerators?
  • 8:50 - 8:53
    As you can expect, it doesn't last very long; people take it.
  • 8:53 - 8:57
    In contrast, I took a plate with six one-dollar bills,
  • 8:57 - 9:00
    and I left those plates in the same refrigerators.
  • 9:00 - 9:01
    No bill ever disappeared.
  • 9:01 - 9:04
    Now, this is not a good social science experiment,
  • 9:04 - 9:07
    so to do it better I did the same experiment
  • 9:07 - 9:09
    as I described to you before.
  • 9:09 - 9:12
    A third of the people we passed the sheet to gave it back to us.
  • 9:12 - 9:15
    A third of the people we passed it to shredded it,
  • 9:15 - 9:16
    they came to us and said,
  • 9:16 - 9:19
    "Mr. Experimenter, I solved X problems. Give me X dollars."
  • 9:19 - 9:22
    A third of the people, when they finished shredding the piece of paper,
  • 9:22 - 9:24
    they came to us and said,
  • 9:24 - 9:30
    "Mr. Experimenter, I solved X problems. Give me X tokens."
  • 9:30 - 9:33
    We did not pay them with dollars; we paid them with something else.
  • 9:33 - 9:36
    And then they took the something else, they walked 12 feet to the side,
  • 9:36 - 9:38
    and exchanged it for dollars.
  • 9:38 - 9:40
    Think about the following intuition.
  • 9:40 - 9:43
    How bad would you feel about taking a pencil from work home,
  • 9:43 - 9:45
    compared to how bad would you feel
  • 9:45 - 9:47
    about taking 10 cents from a petty cash box?
  • 9:47 - 9:50
    These things feel very different.
  • 9:50 - 9:53
    Would being a step removed from cash for a few seconds
  • 9:53 - 9:56
    by being paid in tokens make a difference?
  • 9:56 - 9:58
    Our subjects doubled their cheating.
  • 9:58 - 10:00
    I'll tell you what I think
  • 10:00 - 10:02
    about this and the stock market in a minute.
  • 10:03 - 10:07
    But this did not solve the big problem I had with Enron yet,
  • 10:07 - 10:10
    because in Enron, there's also a social element.
  • 10:10 - 10:11
    People see each other behaving.
  • 10:11 - 10:13
    In fact, every day when we open the news
  • 10:13 - 10:15
    we see examples of people cheating.
  • 10:15 - 10:18
    What does this do to us?
  • 10:18 - 10:19
    So, we did another experiment.
  • 10:19 - 10:22
    We got a big group of students to be in the experiment,
  • 10:22 - 10:23
    and we prepaid them.
  • 10:23 - 10:26
    So everybody got an envelope with all the money for the experiment,
  • 10:26 - 10:28
    and we told them that at the end, we would ask them
  • 10:28 - 10:32
    to pay us back the money they didn't earn. OK?
  • 10:32 - 10:33
    The same thing happens.
  • 10:33 - 10:35
    When we give people the opportunity to cheat, they cheat.
  • 10:35 - 10:38
    They cheat just by a little bit, all the same.
  • 10:38 - 10:41
    But in this experiment we also hired an acting student.
  • 10:41 - 10:45
    This acting student stood up after 30 seconds, and said,
  • 10:45 - 10:48
    "I solved everything. What do I do now?"
  • 10:48 - 10:52
    And the experimenter said, "If you've finished everything, go home.
  • 10:52 - 10:53
    That's it. The task is finished."
  • 10:53 - 10:57
    So, now we had a student -- an acting student --
  • 10:57 - 10:59
    that was a part of the group.
  • 10:59 - 11:01
    Nobody knew it was an actor.
  • 11:01 - 11:05
    And he clearly cheated in a very, very serious way.
  • 11:05 - 11:08
    What would happen to the other people in the group?
  • 11:08 - 11:11
    Will they cheat more, or will they cheat less?
  • 11:11 - 11:13
    Here is what happens.
  • 11:13 - 11:17
    It turns out it depends on what kind of sweatshirt they're wearing.
  • 11:17 - 11:19
    Here is the thing.
  • 11:19 - 11:22
    We ran this at Carnegie Mellon, in Pittsburgh.
  • 11:22 - 11:24
    And at Pittsburgh there are two big universities,
  • 11:24 - 11:27
    Carnegie Mellon and University of Pittsburgh.
  • 11:27 - 11:29
    All of the subjects sitting in the experiment
  • 11:29 - 11:31
    were Carnegie Mellon students.
  • 11:31 - 11:35
    When the actor who was getting up was a Carnegie Mellon student --
  • 11:35 - 11:37
    he was actually a Carnegie Mellon student --
  • 11:37 - 11:41
    and he was a part of their group -- cheating went up.
  • 11:41 - 11:45
    But when he was wearing a University of Pittsburgh sweatshirt,
  • 11:45 - 11:47
    cheating went down.
  • 11:47 - 11:50
    (Laughter)
  • 11:50 - 11:53
    Now, this is important, because remember,
  • 11:53 - 11:55
    the moment the student stood up,
  • 11:55 - 11:58
    it made it clear to everybody that they could get away with cheating,
  • 11:58 - 12:00
    because the experimenter said,
  • 12:00 - 12:02
    "You've finished everything. Go home," and they went with the money.
  • 12:02 - 12:05
    So it wasn't so much about the probability of being caught again.
  • 12:05 - 12:08
    It was about the norms for cheating.
  • 12:08 - 12:11
    If somebody from our in-group cheats and we see them cheating,
  • 12:11 - 12:15
    we feel it's more appropriate, as a group, to behave this way.
  • 12:15 - 12:17
    But if it's somebody from another group, these terrible people --
  • 12:17 - 12:19
    I mean, not terrible in this sense --
  • 12:19 - 12:21
    but somebody we don't want to associate ourselves with,
  • 12:21 - 12:23
    from another university, another group,
  • 12:23 - 12:26
    all of a sudden people's awareness of honesty goes up --
  • 12:26 - 12:28
    a little bit like The Ten Commandments experiment --
  • 12:28 - 12:32
    and people cheat even less.
  • 12:32 - 12:36
    So, what have we learned from this about cheating?
  • 12:36 - 12:39
    We've learned that a lot of people can cheat.
  • 12:39 - 12:42
    They cheat just by a little bit.
  • 12:42 - 12:46
    When we remind people about their morality, they cheat less.
  • 12:46 - 12:49
    When we get a bigger distance from cheating,
  • 12:49 - 12:53
    from the object of money, for example, people cheat more.
  • 12:53 - 12:55
    And when we see cheating around us,
  • 12:55 - 12:59
    particularly if it's a part of our in-group, cheating goes up.
  • 12:59 - 13:02
    Now, if we think about this in terms of the stock market,
  • 13:02 - 13:03
    think about what happens.
  • 13:03 - 13:06
    What happens in a situation when you create something
  • 13:06 - 13:08
    where you pay people a lot of money
  • 13:08 - 13:11
    to see reality in a slightly distorted way?
  • 13:11 - 13:14
    Would they not be able to see it this way?
  • 13:14 - 13:15
    Of course they would.
  • 13:15 - 13:16
    What happens when you do other things,
  • 13:16 - 13:18
    like removing things a step from money?
  • 13:18 - 13:21
    You call them stock, or stock options, derivatives,
  • 13:21 - 13:22
    mortgage-backed securities.
  • 13:22 - 13:25
    Could it be that with those more distant things,
  • 13:25 - 13:27
    it's not a token for one second,
  • 13:27 - 13:29
    it's something that is many steps removed from money
  • 13:29 - 13:33
    for a much longer time -- could it be that people will cheat even more?
  • 13:33 - 13:35
    And what happens to the social environment
  • 13:35 - 13:38
    when people see other people behave around them?
  • 13:38 - 13:42
    I think all of those forces worked in a very bad way
  • 13:42 - 13:44
    in the stock market.
  • 13:44 - 13:47
    More generally, I want to tell you something
  • 13:47 - 13:50
    about behavioral economics.
  • 13:50 - 13:54
    We have many intuitions in our life,
  • 13:54 - 13:57
    and the point is that many of these intuitions are wrong.
  • 13:57 - 14:00
    The question is, are we going to test those intuitions?
  • 14:00 - 14:02
    We can think about how we're going to test this intuition
  • 14:02 - 14:04
    in our private life, in our business life,
  • 14:04 - 14:07
    and most particularly when it comes to policy,
  • 14:07 - 14:10
    when we think about things like No Child Left Behind,
  • 14:10 - 14:13
    when you create new stock markets, when you create other policies --
  • 14:13 - 14:16
    taxation, health care and so on.
  • 14:16 - 14:18
    And the difficulty of testing our intuition
  • 14:18 - 14:20
    was the big lesson I learned
  • 14:20 - 14:22
    when I went back to the nurses to talk to them.
  • 14:22 - 14:24
    So I went back to talk to them
  • 14:24 - 14:27
    and tell them what I found out about removing bandages.
  • 14:27 - 14:29
    And I learned two interesting things.
  • 14:29 - 14:31
    One was that my favorite nurse, Ettie,
  • 14:31 - 14:35
    told me that I did not take her pain into consideration.
  • 14:35 - 14:37
    She said, "Of course, you know, it was very painful for you.
  • 14:37 - 14:39
    But think about me as a nurse,
  • 14:39 - 14:41
    removing the bandages of somebody I liked,
  • 14:41 - 14:44
    and had to do it repeatedly over a long period of time.
  • 14:44 - 14:47
    Creating so much torture was not something that was good for me either."
  • 14:47 - 14:52
    And she said maybe part of the reason was it was difficult for her.
  • 14:52 - 14:55
    But it was actually more interesting than that, because she said,
  • 14:55 - 15:00
    "I did not think that your intuition was right.
  • 15:00 - 15:01
    I felt my intuition was correct."
  • 15:01 - 15:03
    So, if you think about all of your intuitions,
  • 15:03 - 15:07
    it's very hard to believe that your intuition is wrong.
  • 15:07 - 15:10
    And she said, "Given the fact that I thought my intuition was right ..." --
  • 15:10 - 15:12
    she thought her intuition was right --
  • 15:12 - 15:17
    it was very difficult for her to accept doing a difficult experiment
  • 15:17 - 15:19
    to try and check whether she was wrong.
  • 15:19 - 15:23
    But in fact, this is the situation we're all in all the time.
  • 15:23 - 15:26
    We have very strong intuitions about all kinds of things --
  • 15:26 - 15:29
    our own ability, how the economy works,
  • 15:29 - 15:31
    how we should pay school teachers.
  • 15:31 - 15:34
    But unless we start testing those intuitions,
  • 15:34 - 15:36
    we're not going to do better.
  • 15:36 - 15:38
    And just think about how much better my life would have been
  • 15:38 - 15:40
    if these nurses had been willing to check their intuitions,
  • 15:40 - 15:41
    and how everything would have been better
  • 15:41 - 15:46
    if we just started doing more systematic experimentation on our intuitions.
  • 15:46 - 15:48
    Thank you very much.
Title:
Our buggy moral code
Speaker:
Dan Ariely
Description:

Behavioral economist Dan Ariely studies the bugs in our moral code: the hidden reasons we think it's OK to cheat or steal (sometimes). Clever studies help make his point that we're predictably irrational -- and can be influenced in ways we can't grasp.

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
16:03