Harm:

Grooming

Definition: The deliberate development of emotional connection with a minor as a means to manipulate and exploit them.
Motivation:
Sexual
Legal Status:
Can be illegal
Platform ToS:
Violates Policy
Victim Visibility:
Unaware
Classification:
Contextually Sensitive
TSPA Abuse Type:
Violent and Criminal Behavior: Child Abuse and Nudity

Grooming refers to the process whereby an individual builds an emotional connection with a minor to gain their trust for the purposes of sexual exploitation, abuse, or trafficking. In the online context, this activity poses a complex challenge both for platforms aimed at children and for platforms where adults and children can interact freely.

Grooming frequently occurs in private digital spaces such as direct messages or private chat rooms, which makes platform intervention difficult without infringing on user privacy. Given the sensitive nature of private conversations, platforms must tread carefully to protect every user's privacy while still ensuring safety. A further complication is that victims, often minors, may not recognize the predatory nature of grooming tactics, which makes the platform's reporting features less effective.

Platforms that cater to children need to be designed with safety as a priority, and grooming belongs near the top of the list of concerns. Parental controls and activity updates, requirements for an existing connection before messages can be initiated, and clear boundaries on communication between adults and children can all make this pattern less frequent.

One note: the efficacy of parental consent and oversight controls varies widely depending on the levers and visibility offered to parents, and requires deliberate design beyond the scope of this resource. Building these tools to serve the needs of both parents and children means calibrating the privacy afforded to children against the oversight offered to parents, and allowing that balance to evolve as the child grows up.

What features facilitate Grooming?

Messaging
Enable users to exchange text in real time.

How can platform design prevent Grooming?

Limit account volume
Reducing the volume of accounts a person can create restricts their capacity to cause harm at scale.
Identity Verification
Require users to register for an application with a state-issued identity document.
Adults Can't Message Kids
Foster a sense of safety for parents and kids by allowing only prior connections to initiate conversations with minors.
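
As a rough sketch, the "prior connections only" rule described above could be expressed as a check at message-initiation time. The `User` model, `can_initiate_message` function, and mutual-connection representation here are illustrative assumptions, not any real platform's API:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    is_minor: bool
    # user_ids of connections this user has already accepted
    connections: set = field(default_factory=set)

def can_initiate_message(sender: User, recipient: User) -> bool:
    """Block an adult from starting a conversation with a minor
    unless a mutual, previously accepted connection exists."""
    if not sender.is_minor and recipient.is_minor:
        # Require the connection in both directions (e.g., an
        # accepted friend request), not a one-sided follow.
        return (recipient.user_id in sender.connections
                and sender.user_id in recipient.connections)
    # Other pairings fall through to the platform's normal rules.
    return True
```

Requiring the connection to be mutual matters: a one-sided follow or friend request sent by the adult should not be enough to open a message channel to a minor.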