To minimize the impact of harms like Spam, Harassment / Cyberbullying, and Hate Speech, change the design of Comments.
Intervention:

Affinity To Comment

Definition: Require a history of interaction between users before they're allowed to interact in comments.
Kind of Intervention:
Affinity
Reversible:
Easily Tested + Abandoned
Suitability:
General
Technical Difficulty:
Straightforward
Legislative Target:
Yes

Comments are the site of abuses including harassment, discrimination, and doxxing, where the victim is the author of the primary content, and the abusive user typically has no prior relationship to the victim.

This pattern can be avoided by creating measures of interpersonal affinity and using them to gate access to commenting: only a user with some level of affinity to the author of a post would be allowed to comment on it.

This could be made more effective by putting this power in users' hands, offering them the ability to control comments on their posts at a granular, per-post level. An example set of controls could allow commenters to be (a) anyone online, (b) logged-in users, (c) friends of friends, (d) only friends, or (e) only specific people. Though this example relies on "friend" mechanics as a metric of affinity, a wide array of affinity metrics can fulfill this role, including shared interests or duration of interaction.
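The per-post controls above can be sketched as a simple permission check. This is a minimal illustration, not an implementation from the text: the policy names, the `can_comment` function, and the friend-graph representation are all hypothetical, and a real platform would back them with its own social graph and affinity metrics.

```python
from enum import Enum

class CommentPolicy(Enum):
    # Hypothetical policy names mirroring controls (a) through (e) above.
    ANYONE = "anyone"
    LOGGED_IN = "logged_in"
    FRIENDS_OF_FRIENDS = "friends_of_friends"
    FRIENDS_ONLY = "friends_only"
    SPECIFIC_PEOPLE = "specific_people"

def can_comment(commenter, post_author, policy, friends, allow_list=()):
    """Decide whether `commenter` may comment on `post_author`'s post.

    `friends` maps each user to the set of users they are friends with;
    `allow_list` is the per-post list used by SPECIFIC_PEOPLE.
    A `commenter` of None represents a logged-out visitor.
    """
    if policy is CommentPolicy.ANYONE:
        return True
    if commenter is None:
        return False  # every remaining policy requires a logged-in user
    if policy is CommentPolicy.LOGGED_IN:
        return True
    author_friends = friends.get(post_author, set())
    if policy is CommentPolicy.FRIENDS_ONLY:
        return commenter in author_friends
    if policy is CommentPolicy.FRIENDS_OF_FRIENDS:
        # Direct friends qualify, as does anyone who is a friend of a friend.
        if commenter in author_friends:
            return True
        return any(commenter in friends.get(f, set()) for f in author_friends)
    if policy is CommentPolicy.SPECIFIC_PEOPLE:
        return commenter in allow_list
    return False
```

For example, with `friends = {"alice": {"bob"}, "bob": {"alice", "carol"}}`, carol cannot comment on alice's post under `FRIENDS_ONLY` but can under `FRIENDS_OF_FRIENDS`, since she is a friend of bob, who is alice's friend. Swapping the friend graph for another affinity signal (shared interests, interaction history) only changes how `author_friends` is computed, not the gating structure.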

Putting power in users' hands tends to be a winning strategy for harms where users are directly affected when the harm occurs, as victims of these forms of comment abuse are.
