This essay is written in my volunteer capacity as senior staff and operations manager for the Countervandalism Network.

Preamble

Presently, the foundation's investment in counter-vandalism and content quality assurance is almost entirely absent (depending on the type of content). This needs to change; we have to do better.

The only projects relevant to this subject that are backed by the foundation are:

  • prevention (AbuseFilter)
  • restriction (FlaggedRevs)

But those are hardly actively maintained, last I checked, and they aren't designed to be used for patrolling edits.

FlaggedRevs

FlaggedRevs allows one to control which version of an article is shown to the public. When enabled, later edits are only seen by the public after an authorized editor moves the "stable" marker forward in history. This process does not scale to the size of Wikipedia, for various reasons. Normally, post-publish review is distributed among readers because changes are visible to them; the workload is thus spread out and naturally prioritised by visibility. The act of mitigating vandalism that is visible is also a crucial element of the wiki mentality, and taking that away could have disastrous consequences for the size, health and sustainability of the community. Hiding changes by default (as FlaggedRevs does) would make the wiki less inviting to contribute to, and would significantly reduce the motivation to revert vandalism that never became visible in the first place.

Beyond the ecosystem concerns above, FlaggedRevs is also difficult to scale because it does not allow the user to choose which edits to review. Instead, the user has to review either the entire contents of a page (including all changes since the stable version) or a single edit, but in the latter case it has to be the first edit after the stable version. This is a consequence of FlaggedRevs being based on moving a "marker" through a linear history, as illustrated below.
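
To make the constraint concrete, here is a conceptual sketch (in Python, purely illustrative – not FlaggedRevs' actual code) of the only two review options a linear "stable" marker allows, compared with per-edit patrolling:

# Purely illustrative; not FlaggedRevs' actual implementation.
def flaggedrevs_review_options(revisions, stable):
    """revisions: revision ids, oldest to newest; stable: the current stable id."""
    pending = revisions[revisions.index(stable) + 1:]
    if not pending:
        return []
    return [
        ("review the combined diff of every pending edit", pending),
        ("review only the first edit after the marker", [pending[0]]),
    ]

def patrol_review_options(pending):
    # Per-edit patrolling: any pending edit can be reviewed independently,
    # in any order, by different patrollers.
    return [("review this edit on its own", [rev]) for rev in pending]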

"Patrolling", on the other hand, is performed piecemeal on a per-edit basis. Any individual change can be patrolled, improved or reverted as-needed, whilst allowing one to deal other changes at a later time.

This is particularly important given that patrolling can happen in batches. For example, if a specific user has just vandalised 10 different pages, the patroller can deal with all those edits at once, whilst leaving any other changes made to those same pages for other users to patrol. This plays into the scalability factor.
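
As a concrete sketch of what per-edit batch patrolling looks like against MediaWiki's action API (assuming Python with the 'requests' library and a session that is already logged in with the "patrol" user right; the wiki URL is only an example), a tool can fetch one user's unpatrolled edits and mark each of them patrolled individually:

import requests

API = "https://nl.wikipedia.org/w/api.php"   # example wiki with RCPatrol enabled
session = requests.Session()                 # assumed to already carry login cookies

def patrol_user_edits(username):
    # Fetch the user's recent edits that are still unpatrolled.
    changes = session.get(API, params={
        "action": "query", "list": "recentchanges", "format": "json",
        "rcuser": username, "rcshow": "!patrolled",
        "rcprop": "ids|title", "rclimit": "50",
    }).json()["query"]["recentchanges"]

    # A patrol token is required for action=patrol.
    token = session.get(API, params={
        "action": "query", "meta": "tokens", "type": "patrol", "format": "json",
    }).json()["query"]["tokens"]["patroltoken"]

    # Mark each edit as patrolled, leaving other edits to those pages untouched.
    for change in changes:
        session.post(API, data={
            "action": "patrol", "rcid": change["rcid"],
            "token": token, "format": "json",
        })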

Patrolling

The patrolling feature has existed in MediaWiki since 2005. While its feature set is relatively basic, it is still a massive improvement compared to not having a patrolling feature at all. This is significant because, despite the feature existing in the software, it was in fact disabled on the majority of wikis by community consensus on English Wikipedia in 2005 – for reasons that are no longer relevant today.

Current

Initiatives

We currently rely entirely on community-driven initiatives to fight vandalism. To name a few:

  • Cobi's ClueBot:
    A highly effective, self-learning bot that detects vandalistic patterns. This is a candidate that I'd like to see integrated into Wikimedia's production cluster as an independent service. Its data could be exposed in AbuseFilter to prevent the edit if it scores above a certain threshold. Currently this is worked around by running the system in Wikimedia Labs, where it reverts the "rejected" edits milliseconds after it receives a push notification of the save event – rather inefficient and not very user-friendly. Also, the service is currently community-run and only targets English Wikipedia; the other wikis would benefit significantly from this system. For English Wikipedia, this bot alone is responsible for catching the vast majority of vandalistic edits.
  • STiki, Huggle and others:
    These are standalone desktop applications that need to be installed on a user's computer and rely on irc.wikimedia.org. They are not accessible from the web, and they are not integrated into the wiki experience (which makes it harder to multi-task with related wiki actions such as interacting with user talk pages, deleting pages, requesting that an admin block a vandal, etc.). Because the patrol flag is disabled on most wikis in production, they are also forced to maintain their own databases to track which edits have already been patrolled. While part of this initiative is a usable web interface for monitoring and patrolling edits, there is no opposition in principle to the power-user interface these apps provide. If we enable the MediaWiki patrol backend in production, these tools can re-use that database and the associated RCFeed events. Right now each of these tools maintains its own database, causing lots of duplicate work because each tool is unaware of the other tools' progress.
  • RTRC, LiveRC and the like:
    Gadgets that implement an interface for the core patrolling feature. It is basically an enhanced version of "Special:RecentChanges" that features a live reloading interface, modal dialogs for diffs, and integrated workflows to approve/reject edits. These tools typically also have various filters to narrow down the scope of the shown feed (e.g. only unpatrolled edits by anonymous users in the Main namespace, or New page creations).
  • CVN: its database, CVNBot, and monitoring channels like Freenode #cvn-sw (SWMT):
    These IRC channels are similar to tools like Huggle and RTRC, except that the feed of "potentially interesting events" is output via IRC instead of through a web interface or standalone application. One interesting aspect is that the CVN has a public API to its database, which holds a shared record of known vandal users/IPs, popular vandalism targets (pages), and suspicious user name patterns. Most tracking of known vandals is maintained automatically by SWMTBot in response to a user being blocked on one wiki; the user's activity is then highlighted in the CVN channels of other wikis. To my knowledge this has been the most valuable system (and also the only system) to catch cross-wiki vandalism and repeat offenders. The blacklist duration is typically twice the duration of the block (see the sketch after this list). For example, a vandal is active on nl.wikipedia.org and gets blocked there. If, during that block, the vandal visits de.wikipedia.org or Commons[1], patrollers there will be aware of their record thanks to the CVN database. This is especially useful for catching more subtle vandalism that may not be obvious when inspecting individual edits – the kind that was uncovered on one wiki only after a while. It allows wiki sysops, patrollers and rollbackers to collaborate across project and language barriers.
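
As a sketch of the blacklist convention described in the last item (hypothetical Python; the field names are mine and not the actual CVNBot/SWMTBot schema):

from datetime import datetime, timedelta

def blacklist_entry(user, blocking_wiki, block_duration):
    # When a user is blocked on one wiki, list them in the shared database
    # for roughly twice the block duration, so patrollers on other wikis
    # see their edits highlighted.
    now = datetime.utcnow()
    return {
        "user": user,                            # user name or IP
        "reason": f"blocked on {blocking_wiki}",
        "added": now,
        "expires": now + 2 * block_duration,
    }

entry = blacklist_entry("ExampleVandal", "nl.wikipedia.org", timedelta(days=3))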

Problems

  • Patrol flag:
    • These tools have no way to track (and exclude) edits that have already been patrolled; some implement their own database. They should be using MediaWiki's patrol flag, which allows patrollers to use any tool they choose while still collaborating and working on the same stream of unpatrolled edits. Currently, users end up reviewing the same edit multiple times.
    • Using MediaWiki's patrol flag would also allow gathering analytics, and would provide interoperability with other extensions.
    • In addition to sharing the same queue between different interfaces, enabling the patrol flag will also allow patrollers to finally make use of the many forms of automatic patrolling MediaWiki provides, such as auto-patrolling of edits by established users and of edits that are reverted (with rollback). Currently, patrollers often end up reviewing edits that have already been reviewed or reverted.
  • Tool maintenance:
    • MediaWiki is a fast-moving target and it's hard to keep up. The Countervandalism Network has many users, but only a small number of members are involved in developing its software, and even fewer are involved with the networking and operations that ensure uptime and availability. In addition, many of the maintained software projects are written in different programming languages, and may be abandoned or otherwise hard to maintain.
  • Wikimedia backend support:
    • The backends the current toolset relies on (namely irc.wikimedia.org and stream.wikimedia.org) are not well-maintained by Wikimedia. Uptime and stability are not what they should be, and during downtime, critical events are missed with no feasible way to catch up or replay them. Perhaps a replay mechanism should be built for server-side downtime, or some other mechanism that ensures a client can listen to edits with better guarantees that downtime on the client or the server will not cause events to be missed – either based on the RCFeed queue, or by backfilling from the API (as sketched below).
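
A minimal sketch of the backfill idea mentioned in the last item (Python with the 'requests' library; the wiki is only an example): after a feed outage, re-read the missed window from the RecentChanges API so that no events are silently lost.

import requests

API = "https://www.wikidata.org/w/api.php"  # example wiki, illustrative only

def backfill(last_seen_timestamp):
    """Yield every change made since the last event we processed."""
    params = {
        "action": "query", "list": "recentchanges", "format": "json",
        "rcdir": "newer",                # oldest first, so order is preserved
        "rcstart": last_seen_timestamp,  # e.g. "2016-01-01T00:00:00Z"
        "rcprop": "ids|title|user|timestamp", "rclimit": "500",
    }
    while True:
        data = requests.get(API, params=params).json()
        yield from data["query"]["recentchanges"]
        if "continue" not in data:
            break
        params.update(data["continue"])  # follow API continuation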

Workflow

Here is a case study of the review workflow as established on the Dutch Wikipedia, Wikimedia Commons, and other wikis.

Before we continue, a quick definition of terms and subroutines:

  • Checklist: An overview that shows each day of the week divided into blocks of 2 to 6 hours. Users use this to track which chunks of edits have been reviewed.
  • Live patrol vs backlog: Patrollers work either live or from a checklist. Those patrolling live work from an interface that is continuously updating and sorted descending (latest edits on top). Those working from the checklist (nl:Wikipedia:Checklist countervandalism) open an interface with start-time/end-time parameters applied and clear the backlog by reviewing a fixed chunk until it no longer contains unpatrolled changes. They review the edits that were not patrolled by the live patrollers (sometimes nobody is watching, or there are too many edits, and certain edits may not seem in need of immediate review based on the appearance of their metadata).
  • Reviewing an edit: Before we get into the workflows, let's define what it means to review an edit (e.g. when viewing the difference page in a web browser).
    • Scan the change difference.
    • If the change is problematic, the patroller may either undo it, revert it, or make a fix-up edit to correct any mistake.
    • If the change is vandalism, the patroller may also leave a message on the user's talk page. (Depending on local wiki conventions.)
    • If the user should be blocked (e.g. for repeated vandalism), the patroller either places a block request on the administrators' noticeboard, or (if they are an administrator) blocks the user immediately.
    • If the page should be protected (e.g. subject of current events, or other popular vandal target), the patroller may request page protection, or (if the patroller is an administrator), they may protect it themselves.
    • Click "Mark as patrolled" button – indicating the edit has been reviewed and dealt with. Patrollers may actually prefer clicking that button as their first step, so that other reviewers may skip this edit while they perform the above steps.
  • Live patrol:
    • IRC:
      Users join IRC channels (e.g. #cvn-sw, #cvn-commons, or #cvn-wp-nl) where a CVNBot reports edits from the wikis (filtered to only show edits by new users, anonymous users, blacklisted users, edits to watched pages, and other patterns). Users in these channels have the "patrol" user right. They click the diff URLs in their IRC client, which open in their web browser, where they review the edit. Though this is a common way of patrolling, it has the downside that the log of the IRC channel is append-only: once a user reviews an edit, the channel can't be "updated" to remove it from the queue. This leads to situations where a link is either missed in the log, or multiple users attempt to review the same edit.
    • RTRC:
      Users enable the RTRC gadget, which creates a user-friendly version of Special:RecentChanges with configurable feed filter options and real-time updates. For example, users can choose to see the newest (or oldest) unpatrolled edits. The end result is similar to the IRC channel, except that it is integrated into the wiki environment and reflects the active queue in real time (e.g. User A reviews an edit, and a second later this edit disappears from the "unpatrolled only" queue on User B's screen).

Phase One

The foundation should start focussing some of its efforts on counter-vandalism and the reviewing of contributions, and should support the related backend infrastructure and services that these build on top of.

These are the minimum critical projects required to allow wikis to start patrolling edits in a sustainable way.

Enable RCPatrol

The configuration variable $wgUseRCPatrol should be enabled by default on all wikis. Right now lots of tools are forced to maintain their own proprietary database for tracking which edits have been patrolled already. MediaWiki already has a system for this (RCPatrol), and it is enabled on the dozen or so wikis that manually requested it. The system has been reviewed; it is stable, scalable and performant. The RecentChanges API allows filtering by this flag to retrieve only unpatrolled edits, and the RCFeed already supports the patrol flag, so real-time applications are notified of a patrol action right away and can update their feeds. All that is missing is to enable it.

  • Status:   To do

Historical context around RCPatrol

From wmf-config/InitialiseSettings.php:

'wgUseRCPatrol' => [
	'default' => false, # See [[Wikipedia:Village pump (news)]], 09:47, Jan 6, 2005

.. which was preserved at Wikipedia:Village_pump_(news)/Archive_A, and since moved to Wikipedia_talk:Checked_edits_brainstorming.

Machine-readable changes feed

Implement a modern feed that is compatible with current web browsers. The feed provides recent changes events in a machine-readable format (preferably JSON).

  • Status:   Done!

A proof-of-concept was developed and deployed in 2015 by Ori Livneh (WMF Performance Team). See also RFC/Push notification for recent changes and wikitech:RCStream.

The service has since been replaced by WMF Analytics' EventStreams service.
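
For example, the recentchange stream can be consumed with a few lines of Python (using the third-party sseclient package here as an assumption; any Server-Sent Events client works):

import json
from sseclient import SSEClient  # third-party SSE client, one of several options

STREAM = "https://stream.wikimedia.org/v2/stream/recentchange"

for event in SSEClient(STREAM):
    if not event.data:
        continue  # keep-alive messages carry no payload
    change = json.loads(event.data)
    # Filter to plain edits on a single wiki, similar to what a CVN bot
    # or the Activity monitor proposed below would do.
    if change.get("wiki") == "nlwiki" and change.get("type") == "edit":
        print(change["title"], change["user"], change["revision"]["new"])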

Activity monitor

A MediaWiki extension inspired by RTRC and CVN providing the following components:

  • The "Activity monitor" special page. This client opens a socket to the changes feed (fallback to API polling if the socket is unavailable, or unsupported by the browser, and for testing/third-parties without RCStream installed). It will start populating and updating a list or table of visualised change meta-data. The feed is configurable by the user based on various parameters (patrolled/unpatrolled, user type, change type, namespace, start-date/end-date). Similar to MediaWiki RecentChanges and the RTRC gadget. Features:
    • Configurable real-time change feed.
    • Modal or inline diff viewing.
    • Patrolling and rollback via AJAX.
    • Ability to monitor multiple wikis (SiteMatrix, CentralAuth wiki sets; RCStream supports this already).
  • The "Countervandalism" Database. This is a living data set shared across wikis in the same wiki farm. Similar to the CVN database currently maintained in Wikimedia Labs. This dataset would be manipulated via an API and Special page, restricted to users with appropriate user rights. Features:
    • Flag suspicious users (e.g. users previously or currently blocked on one or more wikis, users that triggered many AbuseFilters, users manually flagged by patrollers).
    • Shared watch list of page names and page name patterns.
  • Status:   To do
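
A hypothetical sketch of the two data sets the proposed database could hold (illustrative Python only; this is not an existing schema):

import re
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class FlaggedUser:
    name: str                    # user name or IP address
    reason: str                  # e.g. "blocked on nlwiki" or "tripped several AbuseFilters"
    flagged_by: str              # patroller or bot that added the entry
    expires: Optional[datetime]  # None means the flag does not expire

@dataclass
class WatchedPage:
    pattern: str                 # exact title or regular expression
    wiki: Optional[str]          # None means every wiki in the farm

    def matches(self, wiki, title):
        return self.wiki in (None, wiki) and re.fullmatch(self.pattern, title) is not None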

Phase Two

Global ClueBot

Currently ClueBot is only enabled for English Wikipedia. If running the service in production is not feasible, at least set up additional instances to cover more wikis than just English Wikipedia.

Status:   In progress. ORES provides a scalable infrastructure for creating vandalism-detection models for more wikis. There are no high-confidence revert bots based on ORES yet, though.
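
For example, a multi-wiki ClueBot-style service could build on the ORES "damaging" model. A minimal sketch using the 'requests' library (the v3 response layout is summarised from memory, so verify the field names against the live API):

import requests

def damaging_probability(wiki, rev_id):
    url = f"https://ores.wikimedia.org/v3/scores/{wiki}/"
    data = requests.get(url, params={"models": "damaging", "revids": rev_id}).json()
    score = data[wiki]["scores"][str(rev_id)]["damaging"]["score"]
    return score["probability"]["true"]

# Revert automatically only above a very conservative threshold.
if damaging_probability("enwiki", 123456789) > 0.95:
    print("candidate for automatic revert")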

Vandalism prevention

Allow ClueBot (or ORES) to effectively prevent edits before saving, rather than having to revert them immediately after they are saved. This would be done by running the learning system in Wikimedia production. We could then prevent edits either by using a simple MediaWiki extension, or by exposing the score as a property in AbuseFilter. This would, however, require the service to be performant enough to run as part of the save pipeline (under 150 ms at the 95th percentile).

Status:   To do

Patrol log

The MediaWiki logging system currently lacks a proper distinction between patrol actions (e.g. User A reviewing an edit by User B) and autopatrol actions (e.g. trusted User A making an edit which the system automatically considers patrolled). This results in the patrol log having a very low signal-to-noise ratio, as the vast majority of entries are not actual patrol actions.

Tracked as T27799.

Status:   Done in 1.27.0-wmf.19.
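
With that distinction in place, tools and analysts can query only the manual patrol actions. A minimal sketch using the 'requests' library (the leaction value "patrol/patrol" is my assumption of the post-1.27 naming):

import requests

API = "https://nl.wikipedia.org/w/api.php"  # example wiki

manual_patrols = requests.get(API, params={
    "action": "query", "list": "logevents", "format": "json",
    "letype": "patrol", "leaction": "patrol/patrol", "lelimit": "50",
}).json()["query"]["logevents"]

for entry in manual_patrols:
    print(entry["timestamp"], entry["user"], entry["title"])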

See also

References and notes

  1. According to Krinkle, the most common type of cross-wiki vandalism is between a local wiki and Commons, where the perpetrator often isn't even aware yet of their block on the local wiki, because after clicking a thumbnail they end up on Commons (unaware that it is a completely different wiki).