
AI Auditing with Deborah Raji

  • 0:02 - 0:05
    [Music]
  • 0:08 - 0:10
    Hi, my name is Deborah Raji, and
  • 0:10 - 0:12
    I'm a Mozilla Fellow. I work with the
  • 0:12 - 0:14
    Algorithmic Justice League. So the
  • 0:14 - 0:17
    Algorithmic Justice League is a research
  • 0:17 - 0:20
    organization that works very hard to
  • 0:20 - 0:22
    make sure that AI is developed in a way
  • 0:22 - 0:26
    that is inclusive and effective for
  • 0:29 - 0:31
    everyone. Right now, a lot of our work has
  • 0:31 - 0:33
    also involved doing audits ourselves of
  • 0:33 - 0:36
    these deployed systems. So we analyze
  • 0:36 - 0:38
    situations, like I mentioned, anything
  • 0:38 - 0:40
    from healthcare to hiring to facial
  • 0:40 - 0:42
    recognition. What we do is we come into
  • 0:42 - 0:44
    those situations and we try to
  • 0:44 - 0:46
    understand how the deployment of that
  • 0:46 - 0:50
    system impacts different marginalized
  • 0:53 - 0:55
    groups. [One example] is a project called Gender Shades,
  • 0:55 - 0:58
    where we looked at facial recognition
  • 0:58 - 0:59
    systems that were deployed in the real
  • 0:59 - 1:02
    world and asked the question: is this
  • 1:02 - 1:04
    a system that works for everyone? These
  • 1:04 - 1:06
    systems, although they were operating at
  • 1:06 - 1:09
    almost 100% for, for example, lighter
  • 1:09 - 1:13
    skinned male faces, they were
  • 1:13 - 1:15
    performing at less than 70% accuracy for
  • 1:15 - 1:18
    darker skinned women. This was a huge
  • 1:18 - 1:21
    story and it kind of escalated in the
  • 1:21 - 1:22
    press, and that's a lot of what we're
  • 1:22 - 1:25
    known for, is that
  • 1:28 - 1:31
    project. [See the audit sketch after the transcript.] So you might have a company
  • 1:31 - 1:34
    that builds a tool for doctors or for
  • 1:34 - 1:36
    teachers, whereas the affected
  • 1:36 - 1:38
    population in that situation would
  • 1:38 - 1:41
    actually be the students or the patients
  • 1:41 - 1:43
    and those guys very rarely have any kind
  • 1:43 - 1:46
    of influence on the types of features
  • 1:46 - 1:48
    that are emphasized in the development
  • 1:48 - 1:51
    of the AI system, the type of data
  • 1:51 - 1:54
    that's collected, and as a result
  • 1:54 - 1:57
    those who are sort of experiencing the
  • 1:57 - 1:59
    weight of the decision-making that these
  • 1:59 - 2:03
    tools make end up almost erased
  • 2:03 - 2:05
    from the entire process of development
  • 2:05 - 2:08
    unless actively sought
  • 2:11 - 2:13
    out. Yeah, so there are a lot of situations
  • 2:13 - 2:15
    in which humans are making very
  • 2:15 - 2:18
    important decisions, an example being
  • 2:18 - 2:20
    hiring or a judge making a decision in a
  • 2:20 - 2:23
    criminal case, and there's certainly a
  • 2:23 - 2:24
    lot of bias involved in that. There's a
  • 2:24 - 2:27
    lot of the perspective of that person
  • 2:27 - 2:28
    making that decision that influences the
  • 2:28 - 2:31
    nature of that outcome. In the same way,
  • 2:31 - 2:33
    if you replace that human decision maker
  • 2:33 - 2:35
    with an algorithm, there's bound to be
  • 2:35 - 2:37
    some level of bias involved in that. The
  • 2:37 - 2:39
    other sort of aspect of this is that we
  • 2:39 - 2:41
    tend to trust algorithms and see them as
  • 2:41 - 2:45
    neutral in a way that we don't with
  • 2:49 - 2:51
    humans. Yeah, so I got into this field
  • 2:51 - 2:54
    almost accidentally. I studied
  • 2:54 - 2:57
    robotics engineering in university, and I
  • 2:57 - 3:01
    was sort of playing a lot with
  • 3:01 - 3:04
    AI as just part
  • 3:04 - 3:06
    of my experience in terms of coding and
  • 3:06 - 3:07
    my experience in hackathons and
  • 3:07 - 3:09
    building projects, and I realized very
  • 3:09 - 3:11
    quickly that a lot of the data sets, for
  • 3:11 - 3:14
    example, do not include a lot of
  • 3:14 - 3:15
    people that look like me. So a lot of the
  • 3:15 - 3:17
    data sets that we use to
  • 3:17 - 3:19
    pretty much teach these algorithmic
  • 3:19 - 3:21
    systems what a face looks like, what a
  • 3:21 - 3:24
    hand looks like, what a human looks like,
  • 3:24 - 3:26
    don't actually include a lot of
  • 3:26 - 3:28
    people of color and other different
  • 3:28 - 3:31
    demographics. So that was probably the
  • 3:31 - 3:34
    biggest sort of red flag that I saw
  • 3:34 - 3:36
    in the industry
  • 3:39 - 3:41
    immediately. I think a lot of the
  • 3:41 - 3:44
    time we think of AI systems as these
  • 3:44 - 3:47
    sci-fi sentient robot
  • 3:47 - 3:50
    overlords, but they're really just a
  • 3:50 - 3:52
    bunch of decisions being made by actual
  • 3:52 - 3:55
    humans, and our understanding of AI
  • 3:55 - 3:57
    systems as this separate thing makes it
  • 3:57 - 4:00
    really hard to hold anyone accountable
  • 4:00 - 4:04
    when a bad decision is made
  • 4:05 - 4:09
    [Music]
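
Audit sketch: below is a minimal, hypothetical illustration of the kind of disaggregated accuracy check described in the Gender Shades discussion above, namely evaluating a deployed system separately for each demographic subgroup rather than reporting one overall score. This is not the project's actual code; the example data, group labels, and the predict function are placeholders.

from collections import defaultdict

def disaggregated_accuracy(examples, predict):
    """Compute accuracy separately for each demographic subgroup.

    examples: iterable of (input, true_label, subgroup) triples.
    predict: the system under audit, called as predict(input).
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for x, y_true, group in examples:
        total[group] += 1
        if predict(x) == y_true:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

# Hypothetical usage: reporting per-group numbers makes gaps visible,
# e.g. near-perfect accuracy for one group versus well below 70% for
# another, which is the kind of disparity described in the interview.
if __name__ == "__main__":
    fake_examples = [
        ("img1", "match", "lighter-skinned men"),
        ("img2", "match", "darker-skinned women"),
        ("img3", "no match", "darker-skinned women"),
    ]
    fake_predict = lambda x: "match"  # stand-in for the audited model
    print(disaggregated_accuracy(fake_examples, fake_predict))
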
Video Language:
English
Team:
Code.org
Duration:
04:12