Online spaces provide users with a pair of capabilities that together enable discrimination: the ability to represent their identities to other users, and the ability to choose whether or not to interact with those users. Prominent examples from the last decade include discrimination in short-term rentals (see: Airbnb), discrimination on dating apps (see: OkCupid), and discrimination in housing (see: Craigslist). In each instance, the discrimination arose from the aggregate biases of the users on the platform, paired with the novel capacity those users had, through the platform, to infer the identity characteristics of the person on the other side of the transaction.
Discrimination is a societal ill translated into online spaces, but platform design can make it more or less of a problem.
For example, when platforms make the identity and profile of the participating users central to the evaluation of a transaction, discrimination becomes more likely. Conversely, when platforms center the exchange being offered (for Airbnb, the listing rather than the host, for example), discrimination appears considerably less likely.
As another example, the predictive goal of an algorithm can easily embed systemic discrimination within it. If a dating site's algorithm for suggesting users to one another aims to maximize the total number of matches across all users, it may learn to avoid recommending users with niche tastes or uncommon characteristics. If it instead aims to maximize the number of people who receive at least one match, that pattern becomes less likely, as the sketch below illustrates.
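To make the contrast concrete, here is a minimal, hypothetical sketch of the two objectives. The users, pairs, and match probabilities are invented for illustration and do not reflect any real platform's system; the point is only how the choice of objective changes which recommendations win.

```python
# Hypothetical sketch of two objectives a matching recommender might optimize.
# All users, pairs, and probabilities here are invented for illustration.
from itertools import combinations

# Each candidate recommendation is (user_a, user_b, estimated match probability).
candidate_pairs = [
    ("alice", "bob",   0.9),
    ("alice", "carol", 0.8),
    ("bob",   "carol", 0.7),
    ("dana",  "erin",  0.5),  # users with niche tastes: lower match probability
]

def total_expected_matches(pairs):
    """Objective 1: expected total number of matches.
    Concentrates recommendations on the most matchable users."""
    return sum(p for _, _, p in pairs)

def expected_users_with_a_match(pairs):
    """Objective 2: expected number of users who get at least one match.
    Rewards covering low-probability users rather than ignoring them."""
    users = {u for a, b, _ in pairs for u in (a, b)}
    total = 0.0
    for user in users:
        # P(this user gets no match) = product of (1 - p) over their pairs.
        p_none = 1.0
        for a, b, p in pairs:
            if user in (a, b):
                p_none *= 1.0 - p
        total += 1.0 - p_none
    return total

# Suppose the recommender may only surface two pairs.
slates = list(combinations(candidate_pairs, 2))
print(max(slates, key=total_expected_matches))       # alice-bob and alice-carol
print(max(slates, key=expected_users_with_a_match))  # alice-bob and dana-erin
```

Under the first objective, dana and erin are never recommended at all; under the second, covering them raises the score more than doubling down on users who are already well served. The same structural choice determines whether users with uncommon characteristics are systematically sidelined.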
While it would be unreasonable to ask online platforms to solve problems of discrimination that span centuries and cultures, it is wholly reasonable to demand that platforms design their systems in ways that do not make discrimination easier or more prevalent.