[MUSIC PLAYING]

I'm April Williams, and I am an assistant professor in the Department of Communication and Media. I'm also faculty in the Digital Studies Institute, and I am a Research and Community Impact (RCI) fellow with the NCID. The RCI fellowship is funding my project on online dating. The project is called Swipe Rights, and I am thinking through advocating for people and the way that we use apps and everyday technologies.

AI has a lot of bias, whether that's racial bias or gender bias. So my project is really trying to bring awareness to that bias, especially where dating apps are concerned. The dating apps are only giving us what they think we want, and that's the problem. All the social problems that exist in our society get amplified by algorithms; they're not necessarily created by them. And we have to really stop and think through: do I want to use a race filter on Hinge? Do I want to put something in my bio that is racially coded? Do I want to train the algorithm to look for things that might be racially sensitive?

Most of the responsibility for the bias that happens on dating platforms falls on those who are making the platforms. There's a huge power imbalance: the folks at the top have most of the power, and we individual users have less of it. I want to empower users to know what they're looking for, to ask questions about where their data is coming from and what happens to their data when they upload it to the apps, and to really call out app designers and platform designers, to get them to be more accountable to users, more trustworthy, and more transparent.

I have a forthcoming book on online dating and the algorithms that drive the online dating process. It's called Not My Type: Automating Sexual Racism in Online Dating.
I think about it both from the perspective of online daters and the things that we do, so our individual swiping behaviors, but then also the algorithms that program and drive our swipe-based experiences. I'm working on a guide to accompany the book so that users know exactly what language to use when they write to their representatives and say: hey, look, big tech needs to be regulated. This is something that I care about. This is something that I would vote for. Right? We also vote on policy with our dollars, whether that is an in-app purchase, downloading Tinder, downloading Bumble, or any of the other apps that you might use. Everything that we do is reinforcing demand. So if we want dating apps to stop and think about the racial bias that's in their algorithms, we have to push them by saying, no, this is unacceptable. And the way that we do that is by withholding our money from them.

So ultimately, the goal would be for dating platforms and dating companies to do an internal audit and really take accountability for the algorithms that they're currently using. Hopefully, if enough users said, "I'm concerned about this, I'm uncomfortable with this," that would signal to these programmers: OK, we need to stop and reevaluate the way that we collect data, the way that we use data, and the way that we draw inferences about our user base.

So with this grant, I'm pulling together several of the leading scholars in the field, and lots of other practitioners as well, to think through what it means for the individual user to say to a company: hey, I'm not comfortable with the way that you're using my data. Hey, we're not going to take this anymore.