[MUSIC PLAYING]

I'm Apryl Williams, and I am an assistant professor in the Department of Communication and Media. I'm also faculty in the Digital Studies Institute, and I am a Research and Community Impact (RCI) fellow with the NCID.

The RCI fellowship is funding my project on online dating. The project is called Swipe Rights, and it thinks through how we advocate for people in the way that we use apps and everyday technologies. AI carries a lot of bias, whether that's racial bias or gender bias, so my project is really trying to bring awareness to that bias, especially where dating apps are concerned. Dating apps are only giving us what they think we want, and that's the problem. All the social problems that exist in our society get amplified by algorithms; they're not necessarily created by them. And we have to really stop and think through: do I want to use a race filter on Hinge? Do I want to put something in my bio that is racially coded? Do I want to train the algorithm to look for things that might be racially sensitive?

Most of the responsibility for the bias that happens on dating platforms falls on those who are making the platforms. There's a huge power imbalance: the folks at the top have most of the power, and we individual users have less of it. I want to empower users to know what they're looking for, to ask questions about where their data is coming from and what happens to their data when they upload it to the apps, and to really call on app designers and platform designers to be more accountable to users, more trustworthy, and more transparent.

I have a forthcoming book on online dating and the algorithms that drive the online dating process. It's called Not My Type: Automating Sexual Racism in Online Dating. I think about it both from the perspective of online daters and the things that we do, our individual swiping behaviors, but also from the perspective of the algorithms that program and drive our swipe-based experiences.
I'm working on a guide to accompany the book so that users can know exactly what language to use when they write to their representatives and say: hey, look, big tech actually needs to be regulated. This is something that I care about. This is something that I would vote for. Right? We vote on policy with our dollars, whether that's an in-app purchase, downloading Tinder, downloading Bumble, or any of the other apps that you might use. Everything that we do reinforces demand. So if we want dating apps to stop and think about the racial bias in their algorithms, we have to push them by saying: no, this is unacceptable. And the way that we do that is by withholding our money from them.

Ultimately, the goal would be for dating platforms and dating companies to do an internal audit and really take accountability for the algorithms they're currently using. Hopefully, if enough users said, "I'm concerned about this, I'm uncomfortable with this," that would signal to the programmers: OK, we need to stop and reevaluate the way we collect data, the way we use data, and the way we draw inferences about our user base.

So with this grant, I'm pulling together several of the leading scholars in the field, and lots of other practitioners as well, to think through what it means for the individual user to say to a company: hey, I'm not comfortable with the way that you're using my data. Hey, we're not going to take this anymore.