To minimize the impact of harms like Harassment / Cyberbullying, Spam, Coordinated Inauthentic Activity, and Online Shaming, change the design of Comments.
Intervention:

Temporal Comment Limits

Definition: Restricting the volume of comments a person can post encourages them to think twice about each one.
Kind of Intervention:
Limits
Reversible:
Easily Tested + Abandoned
Suitability:
General
Technical Difficulty:
Straightforward

A vanishingly small number of users generate the overwhelming majority of negative online comments. For that reason alone, platforms should place bounds on the volume of comments individual users can generate, and could do so along a variety of overlapping dimensions. These limits could be set very high and still have significant positive impact, because users at the 99th percentile of comment volume consume dramatically more than 1% of the communicative space. Some examples:

  • Limit the number of comments a user can post in a day to 100.
  • Limit the number of comments a user can post in an hour to 10.
  • Limit the number of comments a user can post on content posted by a user that doesn't follow them (a limit paired with an Affinity strategy).

These limits are just a few potential ways of framing this approach, and they aren't exclusive. The best approach here would be a set of overlapping restrictions, each restriction constraining different modalities of negative interactions, adding up to robust guardrails for expected user behavior over time.
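A set of overlapping restrictions like this can be sketched as a simple sliding-window rate limiter, where each limit is an independent (window, cap) pair and a comment is allowed only if it passes every one. This is an illustrative sketch, not a reference implementation; the class name, thresholds, and in-memory storage are assumptions for demonstration.

```python
import time
from collections import deque

# Illustrative overlapping limits: (window in seconds, max comments).
# Thresholds mirror the examples above and are deliberately generous.
DEFAULT_LIMITS = [
    (3600, 10),    # at most 10 comments per hour
    (86400, 100),  # at most 100 comments per day
]

class CommentLimiter:
    """Hypothetical per-user limiter checking several windows at once."""

    def __init__(self, limits=DEFAULT_LIMITS):
        self.limits = limits
        self.history = {}  # user_id -> deque of comment timestamps

    def allow(self, user_id, now=None):
        """Return True (and record the comment) if every limit passes."""
        now = time.time() if now is None else now
        log = self.history.setdefault(user_id, deque())
        # Discard timestamps older than the longest window we track.
        longest = max(window for window, _ in self.limits)
        while log and now - log[0] > longest:
            log.popleft()
        # Every overlapping limit must pass independently.
        for window, cap in self.limits:
            recent = sum(1 for t in log if now - t <= window)
            if recent >= cap:
                return False
        log.append(now)
        return True
```

In production these counters would live in shared storage (e.g. a key-value store with expiring keys) rather than process memory, but the structure is the same: each new limit is one more entry in the list, so restrictions can be layered without changing the enforcement logic.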

These kinds of approaches (i.e. Limits) work because limits are concrete expressions of a platform's norms. Users at the tail end of a distribution (like comments per day) are very likely behaving in ways that are antithetical to how the platform (and other users) expect them to behave.
