[Music]

Hi, my name is Deborah Raji and I'm a Mozilla Fellow. I work with the Algorithmic Justice League. The Algorithmic Justice League is a research organization that works very hard to make sure that AI is developed in a way that is inclusive and effective for everyone.

Right now a lot of our work has also involved doing audits ourselves of these deployed systems. We analyze situations, like I mentioned, anything from healthcare to hiring to facial recognition. What we do is come into those situations and try to understand how the deployment of that system impacts different marginalized groups.

One example is a project called Gender Shades, where we looked at facial recognition systems that were deployed in the real world and asked the question: is this a system that works for everyone? These systems, although they were operating at almost 100% accuracy for, for example, lighter-skinned male faces, were performing at less than 70% accuracy for darker-skinned women. This was a huge story, and it kind of escalated in the press, and that's a lot of what we're known for, is that project.
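The Gender Shades finding above comes from a disaggregated evaluation: scoring the same model separately for each demographic subgroup instead of reporting one overall accuracy that can hide large gaps. A minimal sketch of that idea in Python, assuming a hypothetical audit file results.csv with skin_type, gender, true_label, and predicted_label columns (these names are illustrative, not from the original study):

    # Disaggregated accuracy: one score per (skin type, gender) subgroup,
    # rather than a single aggregate number.
    import pandas as pd

    df = pd.read_csv("results.csv")  # hypothetical audit output

    subgroup_accuracy = (
        df.assign(correct=df["true_label"] == df["predicted_label"])
          .groupby(["skin_type", "gender"])["correct"]
          .mean()
          .sort_values()
    )
    print(subgroup_accuracy)

    # The headline finding is the gap between the best- and worst-served subgroup.
    print("accuracy gap:", subgroup_accuracy.max() - subgroup_accuracy.min())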
So you might have a company that builds a tool for doctors or for teachers, whereas the affected population in that situation would actually be the students or the patients, and they very rarely have any kind of influence on the types of features that are emphasized in the development of the AI system or the type of data that's collected. As a result, those who experience the weight of the decisions these tools make end up almost erased from the entire process of development unless actively sought out.

Yeah, so there are a lot of situations in which humans are making very important decisions, an example being hiring, or a judge making a decision in a criminal case, and there's certainly a lot of bias involved in that; a lot of the perspective of the person making that decision influences the nature of the outcome. In the same way, if you replace that human decision-maker with an algorithm, there's bound to be some level of bias involved in that. The other aspect of this is that we tend to trust algorithms and see them as neutral in a way that we don't with humans.

Yeah, so I got into this field almost accidentally. I studied robotics engineering in university, and I was playing a lot with AI as part of my experience in coding, in hackathons, and in building projects, and I realized very quickly that a lot of the data sets, for example, do not include a lot of people that look like me. A lot of the data sets that we use to pretty much teach these algorithmic systems what a face looks like, what a hand looks like, what a human looks like don't actually include a lot of people of color and other demographics. That was probably the biggest sort of red flag that I saw in the industry immediately.
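The underrepresentation described here can be checked directly by tallying a training set's demographic composition against target shares for the population the system is meant to serve. A minimal sketch, assuming a hypothetical per-image metadata file faces_metadata.csv with a demographic_group column (the file, column, and target shares are illustrative):

    # Compare a face dataset's demographic composition with target shares.
    import pandas as pd

    meta = pd.read_csv("faces_metadata.csv")  # hypothetical per-image metadata
    observed = meta["demographic_group"].value_counts(normalize=True)

    # Illustrative target shares; a real audit would define these explicitly.
    target = pd.Series({"group_a": 0.25, "group_b": 0.25,
                        "group_c": 0.25, "group_d": 0.25})

    report = pd.DataFrame({"observed": observed, "target": target}).fillna(0.0)
    report["shortfall"] = report["target"] - report["observed"]
    print(report.sort_values("shortfall", ascending=False))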
I think a lot of the time we think of AI systems as these sci-fi, sentient robot overlords, but they're really just a bunch of decisions being made by actual humans, and our understanding of AI systems as this separate thing makes it really hard to hold anyone accountable when a bad decision is made.

[Music]