The Moderator Tools team is a Wikimedia Foundation product team exploring content moderation tool needs. We are particularly focusing on the needs of medium-sized Wikimedia projects.
We want to understand which tools and processes are missing or hard to use in projects that are growing substantially, so that we can prioritise product investment. We believe these communities face particular stresses: a small number of administrators and content patrollers must review a growing volume of edits while also building processes and workflows that larger communities have already established.
The team's focus is on content moderation processes, including page protection, deletion, reporting, and recent changes patrolling, rather than user reporting and moderation, which fall more within the purview of the Anti-Harassment Tools and Trust and Safety Tools teams.
Latest Update: We completed our primary research and published a report, which you can read at Moderator Tools/Content Moderation in Medium-Sized Wikimedia Projects, and now want to learn more about Content moderation on mobile web.
July 2021 - March 2022: Research
In the 2021/22 Annual Plan (July-June), the team focused on design research for the first nine months (July 2021 - March 2022). We heard from a wide range of editors about the problems they face in keeping the content on their projects reliable and trustworthy.
We then published a research report, Content Moderation in Medium-Sized Wikimedia Projects. This report found content moderation on mobile web to be the highest priority issue.
April 2022 - current: Mobile web
We're now learning more about the needs and priorities for engaging with content moderation processes on mobile web.
To share your thoughts with us, please visit Content moderation on mobile web.
Senior Design Researcher
Staff Software Engineer
Susana Cardenas Molinar
Senior Software Engineer