[Music]

Hi, my name is Deborah Raji, and I'm a Mozilla Fellow. I work with the Algorithmic Justice League. The Algorithmic Justice League is a research organization that works very hard to make sure that AI is developed in a way that is inclusive and effective for everyone.

Right now a lot of our work has also involved doing audits ourselves of these deployed systems. We analyze situations, like I mentioned, anything from healthcare to hiring to facial recognition. What we do is we come into those situations and try to understand how the deployment of that system impacts different marginalized groups.

One of those is a project called Gender Shades, where we looked at facial recognition systems that were deployed in the real world and asked the question: is this a system that works for everyone? These systems, although they were operating at almost 100% accuracy for, for example, lighter-skinned male faces, were performing at less than 70% accuracy for darker-skinned women. This was a huge story, it kind of escalated in the press, and that project is a lot of what we're known for.

So you might have a company that builds a tool for doctors or for teachers, whereas the affected population in that situation would actually be the students or the patients. And those people very rarely have any kind of influence on the types of features that are emphasized in the development of the AI system, or the type of data that's collected. As a result, those experiencing the weight of the decisions that these tools make end up almost erased from the entire process of development unless actively sought out.

Yeah, so there are a lot of situations in which humans are making very important decisions, an example being hiring, or a judge making a decision in a criminal case, and there's certainly a lot of bias involved in that. A lot of the perspective of the person making that decision influences the nature of the outcome. In the same way, if you replace that human decision-maker with an algorithm, there's bound to be some level of bias involved. The other aspect of this is that we tend to trust algorithms and see them as neutral in a way that we don't with humans.

Yeah, so I got into this field almost accidentally. I studied robotics engineering in university, and I was playing a lot with AI as part of my experience with coding, with hackathons, and with building projects. I realized very quickly that a lot of the datasets, for example, do not include a lot of people that look like me. The datasets that we use to pretty much teach these algorithmic systems what a face looks like, what a hand looks like, what a human looks like, don't actually include a lot of people of color and other demographics. That was probably the biggest red flag that I saw in the industry immediately.

I think a lot of the time we think of AI systems as these sci-fi sentient robot overlords, but they're really just a bunch of decisions being made by actual humans. And our understanding of AI systems as this separate thing makes it really hard to hold anyone accountable when a bad decision is made.

[Music]