
Can prejudice ever be a good thing?

  • 0:01 - 0:03
    When we think about prejudice and bias,
  • 0:03 - 0:05
    we tend to think about stupid and evil people
  • 0:05 - 0:08
    doing stupid and evil things.
  • 0:08 - 0:10
    And this idea is nicely summarized
  • 0:10 - 0:12
    by the British critic William Hazlitt,
  • 0:12 - 0:16
    who wrote, "Prejudice is the child of ignorance."
  • 0:16 - 0:17
    I want to try to convince you here
  • 0:17 - 0:19
    that this is mistaken.
  • 0:19 - 0:21
    I want to try to convince you
  • 0:21 - 0:23
    that prejudice and bias
  • 0:23 - 0:26
    are natural, they're often rational,
  • 0:26 - 0:28
    and they're often even moral,
  • 0:28 - 0:30
    and I think that once we understand this,
  • 0:30 - 0:32
    we're in a better position to make sense of them
  • 0:32 - 0:34
    when they go wrong,
  • 0:34 - 0:35
    when they have horrible consequences,
  • 0:35 - 0:38
    and we're in a better position to know what to do
  • 0:38 - 0:39
    when this happens.
  • 0:39 - 0:42
    So start with stereotypes. You look at me,
  • 0:42 - 0:45
    you know my name, you
    know certain facts about me,
  • 0:45 - 0:46
    and you could make certain judgments.
  • 0:46 - 0:49
    You could make guesses about my ethnicity,
  • 0:49 - 0:53
    my political affiliation, my religious beliefs.
  • 0:53 - 0:55
    And the thing is, these
    judgments tend to be accurate.
  • 0:55 - 0:57
    We're very good at this sort of thing.
  • 0:57 - 0:58
    And we're very good at this sort of thing
  • 0:58 - 1:01
    because our ability to stereotype people
  • 1:01 - 1:04
    is not some sort of arbitrary quirk of the mind,
  • 1:04 - 1:07
    but rather it's a specific instance
  • 1:07 - 1:08
    of a more general process,
  • 1:08 - 1:10
    which is that we have experience
  • 1:10 - 1:11
    with things and people in the world
  • 1:11 - 1:13
    that fall into categories,
  • 1:13 - 1:15
    and we can use our experience
    to make generalizations
  • 1:15 - 1:17
    about novel instances of these categories.
  • 1:17 - 1:20
    So everybody here has a lot of experience
  • 1:20 - 1:22
    with chairs and apples and dogs,
  • 1:22 - 1:24
    and based on this, you could see
  • 1:24 - 1:26
    unfamiliar examples and you could guess,
  • 1:26 - 1:27
    you could sit on the chair,
  • 1:27 - 1:30
    you could eat the apple, the dog will bark.
  • 1:30 - 1:32
    Now we might be wrong.
  • 1:32 - 1:34
    The chair could collapse if you sit on it,
  • 1:34 - 1:36
    the apple might be poison, the dog might not bark,
  • 1:36 - 1:39
    and in fact, this is my dog Tessie, who doesn't bark.
  • 1:39 - 1:41
    But for the most part, we're good at this.
  • 1:41 - 1:43
    For the most part, we make good guesses
  • 1:43 - 1:45
    both in the social domain and the non-social domain,
  • 1:45 - 1:47
    and if we weren't able to do so,
  • 1:47 - 1:50
    if we weren't able to make guesses about
    new instances that we encounter,
  • 1:50 - 1:52
    we wouldn't survive.
  • 1:52 - 1:55
    And in fact, Hazlitt later on in his wonderful essay
  • 1:55 - 1:56
    concedes this.
  • 1:56 - 1:59
    He writes, "Without the aid of prejudice and custom,
  • 1:59 - 2:01
    I should not be able to find
    my way across the room;
  • 2:01 - 2:03
    nor know how to conduct
    myself in any circumstances,
  • 2:03 - 2:08
    nor what to feel in any relation of life."
  • 2:08 - 2:09
    Or take bias.
  • 2:09 - 2:11
    Now sometimes, we break the world up into
  • 2:11 - 2:14
    us versus them, into in group versus out group,
  • 2:14 - 2:15
    and sometimes when we do this,
  • 2:15 - 2:16
    we know we're doing something wrong,
  • 2:16 - 2:18
    and we're kind of ashamed of it.
  • 2:18 - 2:20
    But other times we're proud of it.
  • 2:20 - 2:22
    We openly acknowledge it.
  • 2:22 - 2:23
    And my favorite example of this
  • 2:23 - 2:25
    is a question that came from the audience
  • 2:25 - 2:28
    in a Republican debate prior to the last election.
  • 2:28 - 2:30
    (Video) Anderson Cooper: Gets to your question,
  • 2:30 - 2:35
    the question in the hall, on foreign aid? Yes, ma'am.
  • 2:35 - 2:37
    Woman: The American people are suffering
  • 2:37 - 2:39
    in our country right now.
  • 2:39 - 2:43
    Why do we continue to send foreign aid
  • 2:43 - 2:44
    to other countries
  • 2:44 - 2:48
    when we need all the help we can get for ourselves?
  • 2:48 - 2:50
    AC: Governor Perry, what about that?
  • 2:50 - 2:51
    (Applause)
  • 2:51 - 2:53
    Rick Perry: Absolutely, I think it's—
  • 2:53 - 2:55
    Paul Bloom: Each of the people onstage
  • 2:55 - 2:57
    agreed with the premise of her question,
  • 2:57 - 2:59
    which is as Americans, we should care more
  • 2:59 - 3:01
    about Americans than about other people.
  • 3:01 - 3:04
    And in fact, in general, people are often swayed
  • 3:04 - 3:08
    by feelings of solidarity, loyalty, pride, patriotism,
  • 3:08 - 3:10
    towards their country or towards their ethnic group.
  • 3:10 - 3:14
    Regardless of your politics, many
    people feel proud to be American,
  • 3:14 - 3:15
    and they favor Americans over other countries.
  • 3:15 - 3:18
    Residents of other countries
    feel the same about their nation,
  • 3:18 - 3:21
    and we feel the same about our ethnicities.
  • 3:21 - 3:23
    Now some of you may reject this.
  • 3:23 - 3:24
    Some of you may be so cosmopolitan
  • 3:24 - 3:27
    that you think that ethnicity and nationality
  • 3:27 - 3:29
    should have no moral sway.
  • 3:29 - 3:31
    But even you sophisticates accept
  • 3:31 - 3:33
    that there should be some pull
  • 3:33 - 3:36
    towards the in group in the
    domain of friends and family,
  • 3:36 - 3:38
    of people you're close to,
  • 3:38 - 3:40
    and so even you make a distinction
  • 3:40 - 3:41
    between us versus them.
  • 3:41 - 3:44
    Now this distinction is natural enough
  • 3:44 - 3:47
    and often moral enough, but it can go awry,
  • 3:47 - 3:48
    and this was part of the research
  • 3:48 - 3:51
    of the great social psychologist Henri Tajfel.
  • 3:51 - 3:54
    Tajfel was born in Poland in 1919.
  • 3:54 - 3:56
    He left to go to university in France,
  • 3:56 - 3:58
    because as a Jew, he couldn't
    go to university in Poland,
  • 3:58 - 4:01
    and then he enlisted in the French military
  • 4:01 - 4:02
    in World War II.
  • 4:02 - 4:04
    He was captured and ended up
  • 4:04 - 4:06
    in a prisoner of war camp,
  • 4:06 - 4:08
    and it was a terrifying time for him,
  • 4:08 - 4:09
    because if it was discovered he was a Jew,
  • 4:09 - 4:11
    he could have been moved to a concentration camp,
  • 4:11 - 4:13
    where he most likely would not have survived.
  • 4:13 - 4:16
    And in fact, when the war
    ended and he was released,
  • 4:16 - 4:19
    most of his friends and family were dead.
  • 4:19 - 4:20
    He got involved in different pursuits.
  • 4:20 - 4:22
    He helped out the war orphans.
  • 4:22 - 4:24
    But he had a long-lasting interest
  • 4:24 - 4:25
    in the science of prejudice,
  • 4:25 - 4:28
    and so when a prestigious British scholarship
  • 4:28 - 4:30
    on stereotypes opened up, he applied for it,
  • 4:30 - 4:31
    and he won it,
  • 4:31 - 4:33
    and then he began this amazing career.
  • 4:33 - 4:36
    And what started his career is an insight
  • 4:36 - 4:38
    that the way most people were thinking
  • 4:38 - 4:40
    about the Holocaust was wrong.
  • 4:40 - 4:42
    Many people, most people at the time,
  • 4:42 - 4:44
    viewed the Holocaust as sort of representing
  • 4:44 - 4:47
    some tragic flaw on the part of the Germans,
  • 4:47 - 4:51
    some genetic taint, some authoritarian personality.
  • 4:51 - 4:53
    And Tajfel rejected this.
  • 4:53 - 4:56
    Tajfel said what we see in the Holocaust
  • 4:56 - 4:58
    is just an exaggeration
  • 4:58 - 5:00
    of normal psychological processes
  • 5:00 - 5:02
    that exist in every one of us.
  • 5:02 - 5:04
    And to explore this, he did a series of classic studies
  • 5:04 - 5:06
    with British adolescents.
  • 5:06 - 5:08
    And in one of his studies, what he did was he asked
  • 5:08 - 5:10
    the British adolescents all sorts of questions,
  • 5:10 - 5:12
    and then based on their answers, he said,
  • 5:12 - 5:14
    "I've looked at your answers,
    and based on the answers,
  • 5:14 - 5:16
    I have determined that you are either"
  • 5:16 - 5:18
    — he told half of them —
  • 5:18 - 5:20
    "a Kandinsky lover, you love the work of Kandinsky,
  • 5:20 - 5:23
    or a Klee lover, you love the work of Klee."
  • 5:23 - 5:25
    It was entirely bogus.
  • 5:25 - 5:27
    Their answers had nothing
    to do with Kandinsky or Klee.
  • 5:27 - 5:30
    They probably hadn't heard of the artists.
  • 5:30 - 5:33
    He just arbitrarily divided them up.
  • 5:33 - 5:36
    But what he found was, these categories mattered,
  • 5:36 - 5:39
    so when he later gave the subjects money,
  • 5:39 - 5:40
    they would prefer to give the money
  • 5:40 - 5:42
    to members of their own group
  • 5:42 - 5:44
    than members of the other group.
  • 5:44 - 5:47
    Worse, they were actually most interested
  • 5:47 - 5:48
    in establishing a difference
  • 5:48 - 5:51
    between their group and other groups,
  • 5:51 - 5:53
    so they would give up money for their own group
  • 5:53 - 5:58
    if by doing so they could give
    the other group even less.
  • 5:58 - 6:00
    This bias seems to show up very early.
  • 6:00 - 6:03
    So my colleague and wife Karen Wynn at Yale
  • 6:03 - 6:04
    has done a series of studies with babies
  • 6:04 - 6:07
    where she exposes babies to puppets,
  • 6:07 - 6:09
    and the puppets have certain food preferences.
  • 6:09 - 6:11
    So one of the puppets might like green beans.
  • 6:11 - 6:14
    The other puppet might like graham crackers.
  • 6:14 - 6:16
    They test the babies' own food preferences,
  • 6:16 - 6:19
    and babies typically prefer the graham crackers.
  • 6:19 - 6:22
    But the question is, does this matter to babies
  • 6:22 - 6:25
    in how they treat the puppets? And it matters a lot.
  • 6:25 - 6:26
    They tend to prefer the puppet
  • 6:26 - 6:30
    who has the same food tastes that they have,
  • 6:30 - 6:32
    and worse, they actually prefer puppets
  • 6:32 - 6:35
    who punish the puppet with the different food taste.
  • 6:35 - 6:38
    (Laughter)
  • 6:38 - 6:41
    We see this sort of in-group, out-group
    psychology all the time.
  • 6:41 - 6:43
    We see it in political clashes
  • 6:43 - 6:45
    within groups with different ideologies.
  • 6:45 - 6:49
    We see it in its extreme in cases of war,
  • 6:49 - 6:52
    where the out group isn't merely given less,
  • 6:52 - 6:54
    but dehumanized,
  • 6:54 - 6:56
    as in the Nazi perspective of Jews
  • 6:56 - 6:58
    as vermin or lice,
  • 6:58 - 7:02
    or the American perspective of Japanese as rats.
  • 7:02 - 7:05
    Stereotypes can also go awry.
  • 7:05 - 7:07
    So often they're rational and useful,
  • 7:07 - 7:08
    but sometimes they're irrational,
  • 7:08 - 7:10
    they give the wrong answers,
  • 7:10 - 7:11
    and other times
  • 7:11 - 7:13
    they lead to plainly immoral consequences.
  • 7:13 - 7:16
    And the case that's been most studied
  • 7:16 - 7:18
    is the case of race.
  • 7:18 - 7:19
    There was a fascinating study
  • 7:19 - 7:21
    prior to the 2008 election
  • 7:21 - 7:24
    where social psychologists looked at the extent
  • 7:24 - 7:27
    to which the candidates were
    associated with America,
  • 7:27 - 7:31
    as in an unconscious association
    with the American flag.
  • 7:31 - 7:32
    And in one of their studies they compared
  • 7:32 - 7:34
    Obama and McCain, and they found McCain
  • 7:34 - 7:38
    was thought of as more American than Obama,
  • 7:38 - 7:40
    and to some extent, people aren't that surprised
  • 7:40 - 7:41
    by hearing that.
  • 7:41 - 7:43
    McCain is a celebrated war hero,
  • 7:43 - 7:44
    and many people would explicitly say
  • 7:44 - 7:47
    he has more of an American story than Obama.
  • 7:47 - 7:49
    But they also compared Obama
  • 7:49 - 7:51
    to British Prime Minister Tony Blair,
  • 7:51 - 7:53
    and they found that Blair was also thought of
  • 7:53 - 7:56
    as more American than Obama,
  • 7:56 - 7:58
    even though subjects explicitly understood
  • 7:58 - 8:01
    that he's not American at all.
  • 8:01 - 8:03
    But they were responding, of course,
  • 8:03 - 8:06
    to the color of his skin.
  • 8:06 - 8:08
    These stereotypes and biases
  • 8:08 - 8:09
    have real-world consequences,
  • 8:09 - 8:12
    both subtle and very important.
  • 8:12 - 8:15
    In one recent study, researchers
  • 8:15 - 8:18
    put ads on eBay for the sale of baseball cards.
  • 8:18 - 8:20
    Some of them were held by white hands,
  • 8:20 - 8:22
    others by black hands.
  • 8:22 - 8:23
    They were the same baseball cards.
  • 8:23 - 8:25
    The ones held by black hands
  • 8:25 - 8:27
    got substantially smaller bids
  • 8:27 - 8:29
    than the ones held by white hands.
  • 8:29 - 8:32
    In research done at Stanford,
  • 8:32 - 8:36
    psychologists explored the case of people
  • 8:36 - 8:39
    sentenced for the murder of a white person.
  • 8:39 - 8:42
    It turns out, holding everything else constant,
  • 8:42 - 8:45
    you are considerably more likely to be executed
  • 8:45 - 8:46
    if you look like the man on the right
  • 8:46 - 8:48
    than the man on the left,
  • 8:48 - 8:50
    and this is in large part because
  • 8:50 - 8:53
    the man on the right looks more prototypically black,
  • 8:53 - 8:55
    more prototypically African-American,
  • 8:55 - 8:57
    and this apparently influences people's decisions
  • 8:57 - 8:59
    over what to do about him.
  • 8:59 - 9:01
    So now that we know about this,
  • 9:01 - 9:03
    how do we combat it?
  • 9:03 - 9:04
    And there are different avenues.
  • 9:04 - 9:06
    One avenue is to appeal
  • 9:06 - 9:08
    to people's emotional responses,
  • 9:08 - 9:10
    to appeal to people's empathy,
  • 9:10 - 9:12
    and we often do that through stories.
  • 9:12 - 9:14
    So if you are a liberal parent
  • 9:14 - 9:16
    and you want to encourage your children
  • 9:16 - 9:18
    to believe in the merits of non-traditional families,
  • 9:18 - 9:20
    you might give them a book like this.
  • 9:20 - 9:22
    If you are conservative and have a different attitude,
  • 9:22 - 9:25
    you might give them a book like this.
  • 9:25 - 9:26
    (Laughter)
  • 9:26 - 9:30
    But in general, stories can turn
  • 9:30 - 9:32
    anonymous strangers into people who matter,
  • 9:32 - 9:34
    and the idea that we care about people
  • 9:34 - 9:36
    when we focus on them as individuals
  • 9:36 - 9:38
    is an idea which has shown up across history.
  • 9:38 - 9:41
    So Stalin apocryphally said,
  • 9:41 - 9:42
    "A single death is a tragedy,
  • 9:42 - 9:45
    a million deaths is a statistic,"
  • 9:45 - 9:46
    and Mother Teresa said,
  • 9:46 - 9:48
    "If I look at the mass, I will never act.
  • 9:48 - 9:50
    If I look at the one, I will."
  • 9:50 - 9:52
    Psychologists have explored this.
  • 9:52 - 9:53
    For instance, in one study,
  • 9:53 - 9:56
    people were given a list of facts about a crisis,
  • 9:56 - 10:00
    and it was seen how much they would donate
  • 10:00 - 10:02
    to solve this crisis,
  • 10:02 - 10:04
    and another group was given no facts at all
  • 10:04 - 10:06
    but they were told of an individual
  • 10:06 - 10:08
    and given a name and given a face,
  • 10:08 - 10:12
    and it turns out that they gave far more.
  • 10:12 - 10:13
    None of this I think is a secret
  • 10:13 - 10:16
    to the people who are engaged in charity work.
  • 10:16 - 10:18
    People don't tend to deluge people
  • 10:18 - 10:20
    with facts and statistics.
  • 10:20 - 10:21
    Rather, you show them faces,
  • 10:21 - 10:23
    you show them people.
  • 10:23 - 10:25
    It's possible that by extending our sympathies
  • 10:25 - 10:27
    to an individual, they can spread
  • 10:27 - 10:30
    to the group the individual belongs to.
  • 10:30 - 10:33
    This is Harriet Beecher Stowe.
  • 10:33 - 10:35
    The story, perhaps apocryphal,
  • 10:35 - 10:37
    is that President Lincoln invited her
  • 10:37 - 10:39
    to the White House in the middle of the Civil War
  • 10:39 - 10:41
    and said to her,
  • 10:41 - 10:44
    "So you're the little lady who started this great war."
  • 10:44 - 10:45
    And he was talking about "Uncle Tom's Cabin."
  • 10:45 - 10:48
    "Uncle Tom's Cabin" is not
    a great book of philosophy
  • 10:48 - 10:51
    or of theology or perhaps not even literature,
  • 10:51 - 10:53
    but it does a great job
  • 10:53 - 10:56
    of getting people to put themselves in the shoes
  • 10:56 - 10:58
    of people they wouldn't otherwise be in the shoes of,
  • 10:58 - 11:01
    put themselves in the shoes of slaves.
  • 11:01 - 11:03
    And that could well have been a catalyst
  • 11:03 - 11:04
    for great social change.
  • 11:04 - 11:06
    More recently, looking at America
  • 11:06 - 11:10
    in the last several decades,
  • 11:10 - 11:13
    there's some who reasonably say
    that shows like "The Cosby Show"
  • 11:13 - 11:16
    radically changed American attitudes
    towards African-Americans,
  • 11:16 - 11:18
    while shows like "Will & Grace" and "Modern Family"
  • 11:18 - 11:20
    changed American attitudes
  • 11:20 - 11:22
    towards gay men and women.
  • 11:22 - 11:23
    I don't think it's an exaggeration to say
  • 11:23 - 11:26
    that the major catalyst in America for moral change
  • 11:26 - 11:29
    has been a situation comedy.
  • 11:29 - 11:30
    But it's not all emotions,
  • 11:30 - 11:32
    and I want to end by appealing
  • 11:32 - 11:34
    to the power of reason.
  • 11:34 - 11:36
    At some point in his wonderful book
  • 11:36 - 11:37
    "The Better Angels of Our Nature,"
  • 11:37 - 11:39
    Steven Pinker says, you know,
  • 11:39 - 11:42
    the Old Testament says love thy neighbor,
  • 11:42 - 11:45
    and the New Testament says love thy enemy,
  • 11:45 - 11:47
    but I don't love either one of them, not really,
  • 11:47 - 11:49
    but I don't want to kill them.
  • 11:49 - 11:51
    I know I have obligations to them,
  • 11:51 - 11:54
    but my moral feelings to them, my moral beliefs
  • 11:54 - 11:56
    about how I should behave towards them,
  • 11:56 - 11:58
    aren't grounded in love.
  • 11:58 - 12:00
    What they're grounded in is the
    understanding of human rights,
  • 12:00 - 12:02
    a belief that their life is as valuable to them
  • 12:02 - 12:05
    as my life is to me,
  • 12:05 - 12:06
    and to support this, he tells a story
  • 12:06 - 12:09
    by the great philosopher Adam Smith,
  • 12:09 - 12:10
    and I want to tell this story too,
  • 12:10 - 12:12
    though I'm going to modify it a little bit
  • 12:12 - 12:13
    for modern times.
  • 12:13 - 12:15
    So Adam Smith starts by asking you to imagine
  • 12:15 - 12:17
    the death of thousands of people,
  • 12:17 - 12:19
    and imagine that the thousands of people
  • 12:19 - 12:21
    are in a country you are not familiar with.
  • 12:21 - 12:25
    It could be China or India or a country in Africa.
  • 12:25 - 12:27
    And Smith says, how would you respond?
  • 12:27 - 12:29
    And you would say, well that's too bad,
  • 12:29 - 12:31
    and you'd go on to the rest of your life.
  • 12:31 - 12:34
    If you were to open up The New
    York Times online or something,
  • 12:34 - 12:36
    and discover this, and in fact
    this happens to us all the time,
  • 12:36 - 12:38
    we go about our lives.
  • 12:38 - 12:40
    But imagine instead, Smith says,
  • 12:40 - 12:42
    you were to learn that tomorrow
  • 12:42 - 12:44
    you were to have your little finger chopped off.
  • 12:44 - 12:46
    Smith says, that would matter a lot.
  • 12:46 - 12:48
    You would not sleep that night
  • 12:48 - 12:49
    wondering about that.
  • 12:49 - 12:51
    So this raises the question:
  • 12:51 - 12:54
    would you sacrifice thousands of lives
  • 12:54 - 12:55
    to save your little finger?
  • 12:55 - 12:58
    Now answer this in the privacy of your own head,
  • 12:58 - 13:01
    but Smith says, absolutely not,
  • 13:01 - 13:02
    what a horrid thought.
  • 13:02 - 13:04
    And so this raises the question,
  • 13:04 - 13:06
    and so, as Smith puts it,
  • 13:06 - 13:08
    "When our passive feelings are almost always
  • 13:08 - 13:09
    so sordid and so selfish,
  • 13:09 - 13:11
    how comes it that our active principles
  • 13:11 - 13:14
    should often be so generous and so noble?"
  • 13:14 - 13:15
    And Smith's answer is, "It is reason,
  • 13:15 - 13:17
    principle, conscience.
  • 13:17 - 13:19
    [This] calls to us,
  • 13:19 - 13:22
    with a voice capable of astonishing
    the most presumptuous of our passions,
  • 13:22 - 13:24
    that we are but one of the multitude,
  • 13:24 - 13:27
    in no respect better than any other in it."
  • 13:27 - 13:28
    And this last part is what is often described
  • 13:28 - 13:32
    as the principle of impartiality.
  • 13:32 - 13:34
    And this principle of impartiality manifests itself
  • 13:34 - 13:36
    in all of the world's religions,
  • 13:36 - 13:38
    in all of the different versions of the golden rule,
  • 13:38 - 13:41
    and in all of the world's moral philosophies,
  • 13:41 - 13:42
    which differ in many ways
  • 13:42 - 13:45
    but share the presupposition
    that we should judge morality
  • 13:45 - 13:48
    from sort of an impartial point of view.
  • 13:48 - 13:50
    The best articulation of this view
  • 13:50 - 13:53
    is actually, for me, it's not from
    a theologian or from a philosopher,
  • 13:53 - 13:54
    but from Humphrey Bogart
  • 13:54 - 13:56
    at the end of "Casablanca."
  • 13:56 - 14:00
    So, spoiler alert, he's telling his lover
  • 14:00 - 14:01
    that they have to separate
  • 14:01 - 14:03
    for the more general good,
  • 14:03 - 14:04
    and he says to her, and I won't do the accent,
  • 14:04 - 14:06
    but he says to her, "It doesn't take much to see
  • 14:06 - 14:08
    that the problems of three little people
  • 14:08 - 14:11
    don't amount to a hill of beans in this crazy world."
  • 14:11 - 14:14
    Our reason could cause us to override our passions.
  • 14:14 - 14:15
    Our reason could motivate us
  • 14:15 - 14:17
    to extend our empathy,
  • 14:17 - 14:19
    could motivate us to write a
    book like "Uncle Tom's Cabin,"
  • 14:19 - 14:21
    or read a book like "Uncle Tom's Cabin,"
  • 14:21 - 14:23
    and our reason can motivate us to create
  • 14:23 - 14:25
    customs and taboos and laws
  • 14:25 - 14:27
    that will constrain us
  • 14:27 - 14:29
    from acting upon our impulses
  • 14:29 - 14:30
    when, as rational beings, we feel
  • 14:30 - 14:32
    we should be constrained.
  • 14:32 - 14:34
    This is what a constitution is.
  • 14:34 - 14:37
    A constitution is something
    which was set up in the past
  • 14:37 - 14:38
    that applies now in the present,
  • 14:38 - 14:40
    and what it says is,
  • 14:40 - 14:41
    no matter how much we might want to reelect
  • 14:41 - 14:43
    a popular president for a third term,
  • 14:43 - 14:46
    no matter how much white Americans might choose
  • 14:46 - 14:50
    to feel that they want to reinstate
    the institution of slavery, we can't.
  • 14:50 - 14:52
    We have bound ourselves.
  • 14:52 - 14:54
    And we bind ourselves in other ways as well.
  • 14:54 - 14:57
    We know that when it comes to choosing somebody
  • 14:57 - 15:00
    for a job, for an award,
  • 15:00 - 15:02
    we are strongly biased by their race,
  • 15:02 - 15:05
    we are biased by their gender,
  • 15:05 - 15:08
    we are biased by how attractive they are,
  • 15:08 - 15:10
    and sometimes we might say,
    "Well fine, that's the way it should be."
  • 15:10 - 15:12
    But other times we might say, "This is wrong."
  • 15:12 - 15:14
    And so to combat this,
  • 15:14 - 15:16
    we don't just try harder,
  • 15:16 - 15:20
    but rather what we do is we set up situations
  • 15:20 - 15:22
    where these other sources
    of information can't bias us,
  • 15:22 - 15:25
    which is why many orchestras
  • 15:25 - 15:26
    audition musicians behind screens,
  • 15:26 - 15:28
    so the only information they have
  • 15:28 - 15:31
    is the information they believe should matter.
  • 15:31 - 15:32
    I think prejudice and bias
  • 15:32 - 15:36
    illustrate a fundamental duality of human nature.
  • 15:36 - 15:40
    We have gut feelings, instincts, emotions,
  • 15:40 - 15:42
    and they affect our judgments and our actions
  • 15:42 - 15:44
    for good and for evil,
  • 15:44 - 15:48
    but we are also capable of rational deliberation
  • 15:48 - 15:50
    and intelligent planning,
  • 15:50 - 15:52
    and we can use these to in some cases
  • 15:52 - 15:54
    accelerate and nourish our emotions,
  • 15:54 - 15:57
    and in other cases staunch them.
  • 15:57 - 15:58
    And it's in this way
  • 15:58 - 16:01
    that reason helps us create a better world.
  • 16:01 - 16:03
    Thank you.
  • 16:03 - 16:07
    (Applause)
Title:
Can prejudice ever be a good thing?
Speaker:
Paul Bloom
Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
16:23
