Harm: Discrimination

Definition: Targeting or excluding individuals in online spaces based on a protected or immutable characteristic.
Motivation: Ideological
Legal Status: Rarely criminalized
Platform ToS: Allowed by Policy
Victim Visibility: Unaware
Classification: Contextually Sensitive
TSPA Abuse Type: None

Online spaces provide two capabilities that together enable discrimination: users' ability to represent their identities to other users, and their ability to choose whether or not to interact with other users. Some prominent examples from the last decade include discrimination in short-term rentals (see: Airbnb), discrimination on dating apps (see: OkCupid), and discrimination in housing listings (see: Craigslist). In each instance, the discrimination resulted from the aggregate biases of the users on the platform, paired with the novel capacity those discriminatory users had (through the platform) to infer the identity characteristics of the person on the other side of the transaction.

Discrimination is a societal ill translated into an online space, but it can be made more or less of an issue through platform design.

For example, when platforms make the identity and profile of the participating users central to their evaluation of a transaction, discrimination becomes more likely. Conversely, when platforms center the exchange being offered (for Airbnb, the property rather than the host, for example), discrimination appears considerably less likely.

As another example, the predictive goal of an algorithm can easily embed systemic discrimination within it. If a dating site's algorithm for suggesting users to one another has the goal of maximizing the total number of matches across all users, it may optimize against recommending users who have niche tastes or uncommon characteristics. If instead it has the goal of maximizing the number of people who get at least one match, that pattern becomes less likely.
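The effect of that objective choice can be illustrated with a toy sketch. All names, candidates, and affinity scores below are invented for illustration; real recommender systems are far more complex. Under a "maximize total expected matches" objective, the most broadly popular candidate absorbs every recommendation, while a coverage-style objective spreads recommendations so less-common profiles still surface:

```python
# Toy sketch (all data invented): each seeker receives one recommendation.
# Candidate "x" is broadly popular; candidate "y" has a niche profile.
seekers = ["a", "b", "c"]
candidates = ["x", "y"]
affinity = {("a", "x"): 0.9, ("a", "y"): 0.4,
            ("b", "x"): 0.8, ("b", "y"): 0.6,
            ("c", "x"): 0.7, ("c", "y"): 0.5}

def recommend_max_total():
    """Objective: maximize total expected matches.

    Each seeker simply gets their highest-affinity candidate,
    so the popular candidate "x" absorbs every recommendation
    and "y" is never surfaced.
    """
    return {s: max(candidates, key=lambda c: affinity[(s, c)])
            for s in seekers}

def recommend_max_coverage():
    """Objective: maximize how many distinct people appear in a match.

    Greedily prefer candidates who have not yet been recommended,
    so niche candidates like "y" still get surfaced to someone.
    """
    recs, used = {}, set()
    for s in seekers:
        unused = [c for c in candidates if c not in used]
        pool = unused or candidates  # fall back once everyone is covered
        recs[s] = max(pool, key=lambda c: affinity[(s, c)])
        used.add(recs[s])
    return recs
```

Running the first policy recommends "x" to every seeker, leaving "y" invisible; the second surfaces "y" to at least one seeker. The point is not the greedy heuristic itself but that the objective, not any individual user's bias, determines whether uncommon profiles ever get shown.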

While it would be unreasonable to ask online platforms to solve problems of discrimination that have arisen across centuries and cultures, it is wholly reasonable to expect platforms to design their systems in ways that do not make discrimination easier or more prevalent.

What features facilitate Discrimination?

Identity
Individuals' ability to represent themselves in a digital space.

How can platform design prevent Discrimination?

When it doesn't undermine the purpose of a platform:
Deprioritize User Identity
In platforms where the identity of the participants isn't central, omit it.
Because it helps victims feel heard and empowered:
Reporting Mechanisms
Allow users to flag content or behavior that they find to be abusive.
Hide Interaction Counts
Foster authentic interaction by making numerical properties less central.