Definition: Giving users capability controls over a circumscribed perimeter.

As online spaces play larger and larger roles in the lives of their users, users are seeking more control and say in how those spaces reflect their preferences and needs. The intervention category of User Control captures platforms' efforts to put the power to curate and manage online experiences back into the hands of their users. Effective tools in this category define a circumscribed perimeter over which each user can shape their digital environment, akin to how one controls their personal physical space. By providing users with robust tools to filter, block, report, or limit interactions, platforms empower users to create safe and comfortable digital communities on their own terms. More sophisticated systems enable delegation of these powers and responsibilities to other trusted parties, such as moderated forums or friend-initiated comment takedowns. This expression of personal, individualized control not only enhances individual comfort but also fosters a sense of responsibility among users, as they actively participate in maintaining the standards and norms of their online circles.
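To make the idea of a user-owned perimeter concrete, the sketch below models it in Python. Every name here (Perimeter, blocked_users, delegate, the capability strings) is hypothetical and for illustration only; it is one plausible shape for such a system, not any platform's actual API.

    from dataclasses import dataclass, field

    # Hypothetical model: each user owns a "perimeter" of content they
    # control, and may delegate specific powers within it to trusted parties.
    @dataclass
    class Perimeter:
        owner: str
        blocked_users: set = field(default_factory=set)
        muted_terms: set = field(default_factory=set)
        # capability name -> set of users trusted to exercise it
        delegations: dict = field(default_factory=dict)

        def can(self, actor: str, capability: str) -> bool:
            """The owner holds every capability; others only what was delegated."""
            return actor == self.owner or actor in self.delegations.get(capability, set())

        def delegate(self, capability: str, trustee: str) -> None:
            self.delegations.setdefault(capability, set()).add(trustee)

        def allows(self, author: str, text: str) -> bool:
            """Filter applied to content entering the owner's perimeter."""
            if author in self.blocked_users:
                return False
            return not any(term in text.lower() for term in self.muted_terms)

    p = Perimeter(owner="alice")
    p.blocked_users.add("spammer42")
    p.muted_terms.add("crypto giveaway")
    p.delegate("remove_comment", "trusted_friend")  # a friend-comment-takedown grant

    assert p.can("trusted_friend", "remove_comment")
    assert not p.allows("spammer42", "hello")

Note how delegation is scoped to a named capability rather than granting blanket control; this mirrors how a forum moderator or trusted friend receives only the specific powers the owner chooses to share.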

Weaknesses

Though this approach can be applied with great effect to a wide variety of challenges, there are some it cannot address at all. In particular, it falls short in cases where:

  • Harm is largely done out of view of those harmed by it (like hate speech)
  • Users aren't able to rapidly determine the suitability of content (like malware or misinformation)
  • Motivated groups can organize spaces expressly for harm (like CSAM-sharing networks)

In any of these cases, giving users additional control cannot obviate the harm.

Strengths

The capacity of this model to scale is one of its most profound advantages. Traditional moderation systems, often centralized, face immense challenges in addressing the sheer volume of user-generated content, not to mention the complexities of cultural subjectivity, controversy, and unequal power dynamics inherent in top-down moderation. User-controlled moderation, by contrast, distributes this colossal task among many, effectively creating a vast network of user-moderators. Each user becomes a node of governance, making real-time decisions that collectively contribute to the health of the larger digital ecosystem. This system mitigates the risks of centralized control and bias, as content is not subjected to a one-size-fits-all assessment. Instead, it creates space for diversity in thought and standards, reflecting the diversity of its users.
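One way to picture this distribution of work: rather than computing a single global verdict on each post, the platform applies each viewer's own ruleset at read time, so two users can legitimately see different feeds. The sketch below is illustrative only; Post, Rule, and render_feed are invented names under that assumption.

    from typing import Callable, NamedTuple

    class Post(NamedTuple):
        author: str
        text: str

    # A rule is any predicate over a post: True means "keep it".
    Rule = Callable[[Post], bool]

    def render_feed(posts: list[Post], viewer_rules: list[Rule]) -> list[Post]:
        """Apply only this viewer's rules; other viewers may see a different feed."""
        return [p for p in posts if all(rule(p) for rule in viewer_rules)]

    posts = [Post("bob", "great article"), Post("troll", "flame bait")]

    # Two viewers, two standards -- no one-size-fits-all assessment.
    strict_feed = render_feed(posts, [lambda p: p.author != "troll"])
    lenient_feed = render_feed(posts, [])

Because the decision runs per viewer, the moderation burden scales with the number of readers rather than piling up on a central queue, which is the scaling property described above.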

Interventions using this Approach

  • Author Comment-Moderation: Give primary content creators control over the comments layered on their content.
  • Self-Imposed Time Limits: Allow users to set hard cutoffs on their platform use (see the sketch after this list).
  • Configurable Feed Optimization Goals: Allow users to choose what they want their feeds to optimize for.
  • Immediate Location Takedown: Enable any user to immediately remove a physical address from public content.
  • User-Submitted Reviews: Let users of platforms that enable repeat engagement learn from the past experiences of other users.
  • Reporting Mechanisms: Allow users to flag content or behavior they find abusive.
  • Bulk Location Takedown Tools: Allow a user to scrub their profile of location data while leaving other data intact.
  • Crowdsourced Annotations: Allow users to add context to the posts of others when it is widely seen as useful.
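As one concrete example from the list above, a self-imposed time limit can be enforced as a hard budget checked on every session. This is a minimal sketch assuming a hypothetical TimeLimit tracker; a real platform would persist usage server-side so restarting the client cannot reset the clock.

    import time

    class TimeLimit:
        """A user-chosen daily cutoff on platform use."""

        def __init__(self, daily_limit_seconds: int):
            self.daily_limit = daily_limit_seconds
            self.used = 0.0
            self.session_start = None

        def start_session(self) -> None:
            self.session_start = time.monotonic()

        def end_session(self) -> None:
            if self.session_start is not None:
                self.used += time.monotonic() - self.session_start
                self.session_start = None

        def allowed(self) -> bool:
            """Hard cutoff: once the daily budget is spent, access is denied."""
            live = time.monotonic() - self.session_start if self.session_start else 0.0
            return (self.used + live) < self.daily_limit

    limit = TimeLimit(daily_limit_seconds=2 * 60 * 60)  # the user picks 2 hours
    limit.start_session()
    if not limit.allowed():
        print("Daily limit reached; see you tomorrow.")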