Child Sexual Abuse Material (CSAM), the term that has replaced "child pornography," is the creation and distribution of images, audio, or video that depict the sexual abuse of children. It is a uniquely pernicious form of online harm: the creation of the media always constitutes a serious criminal and moral violation, and its continued circulation traumatizes victims and normalizes abhorrent sexual acts.
The condemnation of CSAM is a rare point of broad consensus on an internet otherwise rife with disagreement. This unified view has yielded an unrivaled level of legislation, collaboration, and tooling that coordinates and streamlines the discovery of CSAM and translates it into law-enforcement action.
Because of this near-universal condemnation, CSAM spreads in corners of the internet that sit out of public view and are typically insular and technically sophisticated. Though public platforms are rarely used expressly to store and distribute CSAM, they have an important role to play: they are likely to receive CSAM uploads through backups or peer-to-peer file sharing, and thus have the potential to intervene in a harm that would otherwise continue out of the public eye.
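To make that intervention point concrete, the sketch below shows the simplest form of platform-side detection: hashing each upload and comparing it against a vetted list of known-CSAM hashes, such as those shared by organizations like NCMEC. This is a minimal illustration under stated assumptions, not a production design; real deployments typically rely on perceptual hashes such as PhotoDNA or PDQ, which survive resizing and re-encoding, and the file format, path, and function names here are assumptions.

```python
import hashlib

def load_hash_set(path: str) -> set[str]:
    """Load a newline-delimited file of hex digests into a set.

    The plain-text format is an assumption for illustration; real hash
    lists are distributed under strict access agreements and vary in format.
    """
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def matches_known_hash(file_bytes: bytes, known_hashes: set[str]) -> bool:
    """Hash an uploaded file and check it against the vetted set."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes
```

A match should route the upload to a human-review and reporting pipeline rather than trigger silent deletion, so that evidence is preserved for the reports platforms file with bodies such as NCMEC.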