
Moral Machines: How culture changes values

  • 0:12 - 0:18
    Who would you save, the pedestrian in the
    road or the driver in the car?
  • 0:18 - 0:23
    It's not easy, and yet that's the kind of decision which millions of autonomous cars
  • 0:23 - 0:29
    will have to make in the near future. We programme the machine, but who do we tell it to save?
  • 0:29 - 0:35
    That is the set-up of the Moral Machine experiment.
  • 0:35 - 0:42
    There are so many moral decisions that we make during the day without realising it.
  • 0:42 - 0:47
    In driverless cars, these decisions will have to be implemented ahead of time.
  • 0:47 - 0:51
    The goal was to open this discussion to the public.
  • 0:51 - 0:57
    Some decisions might seem simple - should the car save a family of 4 or a cat?
  • 0:57 - 1:01
    But what about a homeless person and their dog instead of a businessman?
  • 1:01 - 1:06
    Or how about two athletes and an old woman instead of two schoolchildren?
  • 1:06 - 1:13
    The problem was that there were so many combinations, so many possible accidents,
  • 1:13 - 1:18
    that it seemed impossible to investigate them all using classic social science methods.
  • 1:18 - 1:24
    Not only that, but how do people's culture and background affect the decisions that they make?
  • 1:24 - 1:31
    The only option we had really was to turn it into a viral website. Of course, it's easier said than done, right?
  • 1:31 - 1:38
    But that is exactly what the team managed to do. They turned these situations into an online task
  • 1:38 - 1:43
    that people across the globe wanted to share and take part in.
  • 1:43 - 1:47
    They gathered almost 40 million moral decisions, taken from millions
  • 1:47 - 1:54
    of online participants across 233 countries and territories.
  • 1:54 - 2:03
    The results are intriguing. First, there are three fundamental principles which hold true across the world.
  • 2:03 - 2:09
    The main results of the paper, for me, are first, the big three in people's preferences
  • 2:09 - 2:13
    which are save humans, save the greater number, save the kids.
  • 2:13 - 2:21
    The second most interesting finding was the clusters, the clusters of countries with different moral profiles.
  • 2:21 - 2:27
    The first cluster included many western countries, the second cluster had many eastern countries
  • 2:27 - 2:34
    and the third cluster had countries from Latin America and also from former French colonies.
  • 2:34 - 2:39
    The cultural differences we find are sometimes hard to describe because they're multidimensional,
  • 2:39 - 2:43
    but some of them are very striking, like the fact that eastern countries
  • 2:43 - 2:48
    do not have such a strong preference for saving young lives.
  • 2:48 - 2:57
    Eastern countries seem to be more respectful of older people, which I thought was a very interesting finding.
  • 2:57 - 3:04
    And it wasn't just age. One cluster showed an unexpectedly strong preference for saving women over men.
  • 3:04 - 3:12
    I was also struck by the fact that France and the French subcluster were so interested in saving women.
  • 3:12 - 3:16
    That was, yeah, I'm still not quite sure what's going on here.
  • 3:16 - 3:24
    Another surprising finding concerned people's social status. On one side we put male and female executives,
  • 3:24 - 3:29
    and on the other side we put a homeless person. The higher the economic inequality in a country,
  • 3:29 - 3:37
    the more people were willing to spare the executives at the cost of the homeless people.
  • 3:37 - 3:42
    This work provides new insight into how morals change across cultures
  • 3:42 - 3:47
    and the team see particular relevance to the field of artificial intelligence and autonomous vehicles.
  • 3:47 - 3:52
    In the grand scheme of it, I think these results are going to be very important
  • 3:52 - 3:59
    to align artificial intelligence to human values. We sometimes change our minds.
  • 3:59 - 4:05
    Other people next to us don't think the same things we do. Other countries don't think the same things we do.
  • 4:05 - 4:12
    So aligning AI and human moral values is only possible if we understand these differences,
  • 4:12 - 4:18
    and that's what we tried to do. I so much hope that we can converge, that we avoid a future
  • 4:18 - 4:25
    where you have to learn about the new ethical setting of your car every time you cross a border.