[MUSIC PLAYING]

I'm Apryl Williams, and I am an assistant professor in the Department of Communication and Media. I'm also faculty in the Digital Studies Institute, and I am a Research and Community Impact (RCI) fellow with the NCID. The RCI fellowship is funding my project on online dating.

The project is called Swipe Rights, and in it I am thinking through how we advocate for people in the way that we use apps and everyday technologies. AI has a lot of bias, whether that's racial bias or gender bias. So my project is really trying to bring awareness to that bias, especially where dating apps are concerned. The dating apps are only giving us what they think we want, and that's the problem.

All the social problems that exist in our society get amplified by algorithms. They're not necessarily created by them. And we have to really stop and think through: do I want to use a race filter on Hinge? Do I want to put something in my bio that is racially coded? Do I want to train the algorithm to look for things that might be racially sensitive?

Most of the responsibility for bias on dating platforms falls on those who are making the platforms. There's a huge power imbalance where those folks at the top have most of the power, and we individual users have less of it. I want to empower users to know what they're looking for, to ask questions about where their data is coming from and what happens to their data when they upload it to the apps, and to really call out app designers and platform designers, to get them to be more accountable to users, more trustworthy, and more transparent.

I have a forthcoming book on online dating and the algorithms that drive the online dating process. It's called Not My Type: Automating Sexual Racism in Online Dating. And I think about it both from the perspective of online daters and the things that we do, so our individual swiping behaviors, but also from the perspective of the algorithms that program and drive our swipe-based experiences.

I'm working on a guide to accompany the book so that users know exactly what language to use when they are writing to their representatives and saying, hey, look, big tech actually needs to be regulated. This is something that I care about. This is something that I would vote for. Right?

We vote on policy with our dollars, whether that is an in-app purchase, downloading Tinder, downloading Bumble, or any of the other apps that you might use. Everything that we do is really reinforcing demand. So if we want dating apps to stop and think about the racial bias that's in their algorithms, we have to push back by saying, no, this is unacceptable. And the way that we do that is by withholding our money from them.

So ultimately, the goal would be that dating platforms and dating companies would do an internal audit and really take accountability for the algorithms that they're currently using. Hopefully, if there were enough users who said, I'm concerned about this, I'm uncomfortable with this, that would signal to these programmers to say, OK, we need to stop and reevaluate the way that we collect data, the way that we use data, and the way that we draw inferences about our user base.

So with this grant, I'm really pulling together several of the leading scholars in the field, and lots of other practitioners as well, to think through what it means for the individual user to say to a company: hey, I'm not comfortable with the way that you're using my data. Hey, we're not going to take this anymore.