
Moral Machines: How culture changes values

  • 0:12 - 0:18
    Who would you save, the pedestrian in the road or the drivers in the car?
  • 0:18 - 0:19
    It's not easy,
  • 0:19 - 0:24
    and yet that's the kind of decision which millions of autonomous cars would have to make in the near future.
  • 0:25 - 0:29
We program the machine, but who do we tell it to save?
  • 0:29 - 0:33
That is the set-up of the Moral Machine experiment.
  • 0:35 - 0:42
There are so many moral decisions that we make during the day without realising it.
  • 0:42 - 0:47
    In driverless cars, these decisions will have to be implemented ahead of time.
  • 0:47 - 0:51
    The goal was to open this discussion to the public.
  • 0:51 - 0:57
Some decisions might seem simple: should the car save a family of four or a cat?
  • 0:57 - 1:01
    But what about a homeless person and their dog instead of a businessman?
  • 1:01 - 1:06
    Or how about two athletes and an old woman instead of two schoolchildren?
  • 1:06 - 1:13
    The problem was that there were so many combinations, so many possible accidents,
  • 1:13 - 1:18
    that it seemed impossible to investigate them all using classic social science methods.
  • 1:18 - 1:24
    Not only that, but how do people's culture and background affect the decisions that they make?
  • 1:24 - 1:28
    The only option we had really was to turn it into a viral website.
  • 1:28 - 1:31
Of course, it's easier said than done, right?
  • 1:31 - 1:34
    But that is exactly what the team managed to do.
  • 1:34 - 1:43
    They turned these situations into an online task that people across the globe wanted to share and take part in.
  • 1:43 - 1:46
    They gathered almost 40 million moral decisions,
  • 1:46 - 1:51
    taken from millions of online participants across 233 countries
  • 1:51 - 1:54
and territories around the world.
  • 1:54 - 1:57
    The results are intriguing.
  • 1:57 - 2:03
    First, there are three fundamental principles which hold true across the world.
  • 2:03 - 2:09
    The main results of the paper, for me, are first, the big three in people's preferences
  • 2:09 - 2:13
which are: save humans, save the greater number, save the kids.
  • 2:13 - 2:17
    The second most interesting finding was the clusters,
  • 2:17 - 2:21
    the clusters of countries with different moral profiles.
  • 2:21 - 2:24
    The first cluster included many western countries,
  • 2:24 - 2:27
    the second cluster had many eastern countries
  • 2:27 - 2:31
    and the third cluster had countries from Latin America
  • 2:31 - 2:34
    and also from former French colonies.
  • 2:34 - 2:39
    The cultural differences we find are sometimes hard to describe because they're multidimensional,
  • 2:39 - 2:41
    but some of them are very striking,
  • 2:41 - 2:48
    like the fact that eastern countries do not have such a strong preference for saving young lives.
  • 2:48 - 2:53
    Eastern countries seem to be more respectful of older people,
  • 2:53 - 2:57
    which I thought was a very interesting finding.
  • 2:57 - 2:59
    And it wasn't just age.
  • 2:59 - 3:04
    One cluster showed an unexpectedly strong preference for saving women over men.
  • 3:04 - 3:07
I was also struck by the fact that France
  • 3:07 - 3:12
and the French subcluster were so interested in saving women.
  • 3:12 - 3:16
    That was, yeah, I'm still not quite sure what's going on here.
  • 3:16 - 3:20
    Another surprising finding concerned people's social status.
  • 3:20 - 3:24
    On one side we put male and female executives,
  • 3:24 - 3:26
    and on the other side we put a homeless person.
  • 3:26 - 3:29
    The higher the economic inequality in a country,
  • 3:29 - 3:36
    the more people were willing to spare the executives at the cost of the homeless people.
  • 3:37 - 3:42
    This work provides new insight into how morals change across cultures
  • 3:42 - 3:45
    and the team see particular relevance to the field of artificial intelligence
  • 3:45 - 3:47
    and autonomous vehicles.
  • 3:47 - 3:52
    In the grand scheme of it, I think these results are going to be very important
  • 3:52 - 3:59
to align artificial intelligence with human values. We sometimes change our minds.
  • 3:59 - 4:03
    Other people next to us don't think the same things we do.
  • 4:03 - 4:05
    Other countries don't think the same things we do.
  • 4:05 - 4:12
So aligning AI with human moral values is only possible if we understand these differences,
  • 4:12 - 4:18
and that's what we tried to do. I very much hope that we can converge, that we avoid a future
  • 4:18 - 4:25
    where you have to learn about the new ethical setting of your car every time you cross a border.
Video Language:
English
Duration:
04:35
