Core Platform Team/Initiative/Hash Checking/Initiative Description
- Summary
The program scope is to prevent exposure to and exploitation of terrorism and child protection multimedia content by developing and deploying hash checking functionality that automates the detection and takedown of such content across the Wikimedia ecosystem.
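As a rough illustration of what hash checking at upload time could look like, the sketch below compares an uploaded file's digest against an in-memory corpus of known-bad hashes. It uses a plain cryptographic hash as a stand-in for a perceptual hash such as PhotoDNA (whose algorithm is proprietary), and every name in it (KNOWN_HASHES, file_hash, check_upload) is hypothetical rather than part of the initiative or of MediaWiki.

```python
# Hypothetical sketch only: a cryptographic hash lookup stands in for a
# perceptual-hash service such as PhotoDNA. KNOWN_HASHES, file_hash and
# check_upload are illustrative names, not part of the initiative.
import hashlib
from pathlib import Path

# The corpus of known-bad hashes would be sourced from the shared industry
# databases described below; here it is just an in-memory set.
KNOWN_HASHES: set[str] = set()


def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def check_upload(path: Path) -> bool:
    """True if the uploaded file's hash matches the known-bad corpus."""
    return file_hash(path) in KNOWN_HASHES
```

In a real deployment the lookup would run both at upload time and during background scans of existing content, and a perceptual hash would allow near-duplicate matches rather than only exact ones.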
- Significance and Motivation
In response to societal and regulatory pressure, large for-profit platforms have built two shared hash corpora to enable more effective platform-wide and cross-platform policing of the most problematic types of content: child protection and terrorism. These tools have become the industry baseline since Microsoft released PhotoDNA for child protection in 2014; PhotoDNA is now also used by Adobe, Facebook, Google, Twitter, and the National Center for Missing & Exploited Children. Building on that system, Facebook, Microsoft, Twitter, and YouTube expanded the shared industry hash corpus in late 2016 to tackle terrorism-related content.
- Outcomes
- automated capability to detect terrorism and child protection multimedia content at upload time and during scans of existing content
- notification to the Trust and Safety team when such content is detected
- automated takedown of child protection multimedia content (see the sketch after this list)
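The sketch below illustrates the asymmetry in the outcomes above: matches in either category trigger a notification to Trust and Safety, but only child protection matches are taken down automatically. All names here (Category, notify_trust_and_safety, take_down, handle_match) are hypothetical stand-ins, not existing Wikimedia or MediaWiki APIs.

```python
# Hypothetical sketch of the intended flow: every match is reported to Trust
# and Safety, and only child protection matches are taken down automatically.
from enum import Enum, auto


class Category(Enum):
    TERRORISM = auto()
    CHILD_PROTECTION = auto()


def notify_trust_and_safety(file_id: str, category: Category) -> None:
    # Placeholder for whatever alerting channel Trust and Safety would use.
    print(f"Trust and Safety notified: {file_id} matched {category.name}")


def take_down(file_id: str) -> None:
    # Placeholder for the actual takedown/deletion step.
    print(f"{file_id} taken down automatically")


def handle_match(file_id: str, category: Category) -> None:
    notify_trust_and_safety(file_id, category)
    if category is Category.CHILD_PROTECTION:
        take_down(file_id)
```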
- Baseline Metrics
detection and takedown of terrorism and child protection multimedia content is currently a purely manual process
- Target Metrics
automated detection and takedown of terrorism and child protection multimedia content
- Stakeholders
- members of the Wikimedia movement
- members of the Trust and Safety team
- Known Dependencies/Blockers
None given