[Music]

Hi, my name is Deborah Raji and I'm a Mila fellow. I work with the Algorithmic Justice League. The Algorithmic Justice League is a research organization that works very hard to make sure that AI is developed in a way that is inclusive and effective for everyone.

Right now a lot of our work has also involved doing audits ourselves of these deployed systems. So we analyze situations, like I mentioned, anything from healthcare to hiring to facial recognition. What we do is we come into those situations and we try to understand how the deployment of that system impacts different marginalized groups.

One audit we're especially known for is a project called Gender Shades, where we looked at facial recognition systems that were deployed in the real world and asked the question: is this a system that works for everyone? These systems, although they were operating at almost 100% accuracy for, for example, lighter-skinned male faces, were performing at less than 70% accuracy for darker-skinned women. This was a huge story, and it escalated in the press, and that's a lot of what we're known for, that project.
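The gap Raji describes comes from evaluating the same model separately on each demographic subgroup rather than reporting a single overall accuracy number. Below is a minimal, hypothetical Python sketch of that kind of disaggregated evaluation; the record format, subgroup labels, and numbers are assumptions for illustration, not the actual Gender Shades audit code.

from collections import defaultdict

def accuracy_by_subgroup(records):
    # records: iterable of (subgroup, predicted_label, true_label) tuples
    correct = defaultdict(int)
    total = defaultdict(int)
    for subgroup, predicted, actual in records:
        total[subgroup] += 1
        if predicted == actual:
            correct[subgroup] += 1
    # A single overall accuracy can hide large gaps; per-subgroup
    # accuracy makes any disparity visible.
    return {group: correct[group] / total[group] for group in total}

# Hypothetical usage with made-up records:
sample = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "male", "female"),   # misclassified
    ("darker-skinned female", "female", "female"),
]
for group, acc in accuracy_by_subgroup(sample).items():
    print(f"{group}: {acc:.0%}")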
So you might have a company that builds a tool for doctors or for teachers, whereas the affected population in that situation would actually be the students or the patients. And those groups very rarely have any kind of influence on the types of features that are emphasized in the development of the AI system, or the type of data that's collected. As a result, those who are experiencing the weight of the decisions that these tools make end up almost erased from the entire process of development unless actively sought out.

Yeah, so there are a lot of situations in which humans are making very important decisions, an example being hiring, or a judge making a decision in a criminal case, and there's certainly a lot of bias involved in that. There's a lot of the perspective of the person making that decision that influences the nature of the outcome. In the same way, if you replace that human decision maker with an algorithm, there's bound to be some level of bias involved in that. The other aspect of this is that we tend to trust algorithms and see them as neutral in a way that we don't with humans.

Yeah, so I got into this field almost accidentally. I studied robotics engineering in university, and I was playing a lot with AI as part of my experience with coding, with hackathons, and with building projects, and I realized very quickly that a lot of the datasets, for example, do not include a lot of people that look like me. A lot of the datasets that we use to pretty much teach these algorithmic systems what a face looks like, what a hand looks like, what a human looks like, don't actually include a lot of people of color and other demographics. That was probably the biggest red flag that I saw in the industry immediately.

I think a lot of the time we think of AI systems as these sci-fi sentient robot overlords, but they're really just a bunch of decisions being made by actual humans. And our understanding of AI systems as this separate thing makes it really hard to hold anyone accountable when a bad decision is made.

[Music]