[MUSIC PLAYING]

I'm Apryl Williams, and I am an assistant professor in the Department of Communication and Media. I'm also faculty in the Digital Studies Institute, and I am a Research and Community Impact (RCI) fellow with the NCID.

The RCI fellowship is funding my project on online dating. The project is called Swipe Rights, and I am thinking through advocating for people and the way that we use apps and everyday technologies. AI has a lot of bias, whether that's racial bias or gender bias. So my project is really trying to bring awareness to that bias, especially where dating apps are concerned.

The dating apps are only giving us what they think we want, and that's the problem. All the social problems that exist in our society get amplified by algorithms. They're not necessarily created by them. And we have to really stop and think through: do I want to use a race filter on Hinge? Do I want to put something in my bio that is racially coded? Do I want to train the algorithm to look for things that might be racially sensitive?

Most of the responsibility for the bias that happens on dating platforms falls on those who are making the platforms. There's a huge power imbalance, where those folks at the top have most of the power and we individual users have less of it.
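The amplification point above ("amplified by algorithms... not necessarily created by them") can be made concrete with a small toy simulation. This is a sketch only: the group labels, like rates, and update rule are hypothetical and do not describe any real app's recommender. It shows how a system that weights future recommendations by past likes can turn a modest pre-existing skew in user behavior into a much larger skew in who gets shown at all.

# Toy feedback loop: exposure is weighted by the likes the system has already
# observed, so a modest real-world skew in liking compounds over time.
# All labels, rates, and the update rule here are hypothetical.
import random

groups = ["A", "B"]                      # two hypothetical candidate groups
true_like_rate = {"A": 0.55, "B": 0.45}  # modest underlying skew in user taste
observed_likes = {"A": 1, "B": 1}        # what the system has "learned" so far

def exposure_share(group):
    # Share of recommendations the system gives to `group`,
    # proportional to the likes it has recorded for that group.
    total = sum(observed_likes.values())
    return observed_likes[group] / total

for _ in range(5000):
    # Show a profile from whichever group the learned scores favor.
    shown = random.choices(groups, weights=[exposure_share(g) for g in groups])[0]
    # The user likes it at their fixed, true rate.
    if random.random() < true_like_rate[shown]:
        observed_likes[shown] += 1

print("true like rate for A:  ", true_like_rate["A"])
print("final exposure share A:", round(exposure_share("A"), 2))
# In a typical run the exposure share for A ends up noticeably above 0.55:
# the loop amplifies the initial skew rather than inventing it.

The only point of the sketch is the direction of the effect: the system did not create the 0.55 versus 0.45 gap, but by optimizing for what it thinks we want, it widens it.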
I want to empower users to know what they're looking for, to ask questions about where their data is coming from and what happens to their data when they upload it to the apps, and to really call out app designers and platform designers, to get them to be more accountable to users, to be more trustworthy, and to be more transparent.

I have a forthcoming book on online dating and the algorithms that drive the online dating process. It's called Not My Type: Automating Sexual Racism in Online Dating. And I think about it both from the perspective of online daters and the things that we do, so our individual swiping behaviors, but then also the algorithms that program and drive our swipe-based experiences.

I'm working on a guide to accompany the book so that users know exactly what language to use when they are writing to their representatives and saying, hey, look, big tech needs to be regulated. This is something that I care about. This is something that I would vote for. Right? We vote on policy with our dollars, whether that is an in-app purchase, downloading Tinder, downloading Bumble, or any of the other apps that you might use. Everything that we do is really reinforcing demand. So if we want dating apps to stop and think about the racial bias that's in their algorithms, we have to encourage them by saying, no, this is unacceptable. And the way that we do that is by withholding our money from them.
So ultimately, the goal would be that dating platforms, dating companies, would do an internal audit and really take accountability for the algorithms that they're currently using. Hopefully, if enough users said, I'm concerned about this, I'm uncomfortable with this, that would signal to these programmers: OK, we need to stop and reevaluate the way that we collect data, the way that we use data, the way that we draw inferences about our user base.

So with this grant, I'm really pulling together several of the leading scholars in the field, and lots of other practitioners as well, to think through what it means for the individual user to say to a company, hey, I'm not comfortable with the way that you're using my data. Hey, we're not going to take this anymore.