[MUSIC PLAYING]

I'm April Williams, and I am an assistant professor in the Department of Communication and Media. I'm also faculty in the Digital Studies Institute, and I am a Research and Community Impact (RCI) fellow with the NCID.

The RCI fellowship is funding my project on online dating. The project is called Swipe Rights, and I am thinking through advocating for people and the way that we use apps and everyday technologies. AI has a lot of bias, whether that's racial bias or gender bias. So my project is really trying to bring awareness to that bias, especially where dating apps are concerned. The dating apps are only giving us what they think we want, and that's the problem. All the social problems that exist in our society get amplified by algorithms. They're not necessarily created by them. And we have to really stop and think through: do I want to use a race filter on Hinge? Do I want to put something in my bio that is racially coded? Do I want to train the algorithm to look for things that might be racially sensitive?

Most of the responsibility for bias that happens on dating platforms falls on those who are making the platforms. There's a huge power imbalance, where those folks at the top have most of the power and we individual users have less of the power. I want to empower users to know what they're looking for, to ask questions about where their data is coming from and what happens to their data when they upload it to the apps, and to really call out app designers and platform designers, to get them to be more accountable to users, more trustworthy, and more transparent.

I have a forthcoming book on online dating and the algorithms that drive the online dating process. It's called Not My Type: Automating Sexual Racism in Online Dating. I think about it both from the perspective of online daters and the things that we do, so our individual swiping behaviors, but then also the algorithms that program and drive our swipe-based experiences.
I'm working on a guide to accompany the book so that users can know exactly what language they need to use when they are writing to their representatives and saying, hey, look, actually, big tech needs to be regulated. This is something that I care about. This is something that I would vote for. Right? We vote on policy with our dollars, whether that is an in-app purchase, downloading Tinder, downloading Bumble, or any of the other apps that you might use. Everything that we do is really reinforcing demand. So if we want dating apps to stop and think about the racial bias that's in their algorithms, we have to push them by saying, no, this is unacceptable. And the way that we do that is by withholding our money from them.

So ultimately, the goal would be that dating platforms, dating companies, would do an internal audit and really take accountability for the algorithms that they're currently using. Hopefully, if there were enough users who said, I'm concerned about this, I'm uncomfortable with this, that would really indicate to these programmers to say, OK, we need to stop and reevaluate the way that we collect data, the way that we use data, the way that we draw inferences about our user base.

So with this grant, I'm really pulling together several of the leading scholars in the field and lots of other practitioners as well to think through what it means for the individual user to say to a company, hey, I'm not comfortable with the way that you're using my data. Hey, we're not going to take this anymore.