Edit Review Improvements


Edit Review Improvements is a project of the Collaboration Team that explores ways to reduce the negative effects that current edit-review processes can have on new wiki editors. Most edit-review and patrolling tools are designed to ensure content quality and prevent bad contributions, two vital missions. A number of studies show, however, that these processes, especially when they involve automated or semi-automated tools, can have an unintended discouraging effect and even drive away good-faith new editors.

To address this problem, the Collaboration Team is exploring ways to surface good-faith new users within the current edit-review workflows and, ultimately, to provide a supportive review process that helps new users become productive contributors.

Problems

  • Research shows that, especially for new wiki editors, having an edit reverted predicts both decreased activity and a reduced likelihood of remaining active as an editor.[1]
  • At the same time, the growing use of automated and semi-automated edit-review tools contributes to the rising number of rejections that good-faith new editors receive. The use of these tools significantly amplifies the negative effect that rejection has on the retention of desirable newcomers.[2]
  • Notwithstanding the above, edit-review tools are essential for vandal fighters and for others working to preserve the integrity and quality of the wikis. How can we help and retain new users while preserving the effectiveness of vandal fighters and other patrolling editors?

Goals

  • Ensure that good-faith new editors have a more constructive and less discouraging experience of editing and article review.
  • Provide richer data about recent changes, enabling patrollers and reviewers of all kinds to work more efficiently and to pursue their different interests (e.g., fighting vandalism, supporting new users) in a more effective and targeted way.

Ultimately, this project aims to have an impact on editor retention, a goal that aligns with the overall goals of the Wikimedia Foundation's 2016-17 Annual Plan, which was developed in close collaboration with the user community.

More specifically, the approach pursues the Annual Plan goals set out for the Product team, which promise, among other things, to "invest in new types of content ... help and collaboration tools."

Solutions

To begin to address the problems of struggling but good-faith newcomers, a good first step will be to ensure that reviewers can find them. To make this possible, we propose to analyze recent changes using data from a variety of sources, including and most notably the machine-learning program ORES (Objective Revision Evaluation Service). ORES’s good faith model, trained on human judgement, can find 95% of good-faith edits with 98% accuracy. ORES can also predict edits that will be reverted and those that are damaging to the wikis.
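
As an illustration only, and not something the project itself ships, the sketch below shows how a review tool might request these two predictions. It assumes Python with the requests library, the public ORES v3 scoring endpoint, and the "goodfaith" and "damaging" model names; the score_revisions helper is a hypothetical name introduced here, and endpoint details or model availability can differ per wiki.

    import requests

    # Public ORES scoring endpoint (v3). Treat the exact URL layout as an
    # assumption: deployments and available models differ per wiki.
    ORES_URL = "https://ores.wikimedia.org/v3/scores/{wiki}"

    def score_revisions(wiki, rev_ids):
        """Fetch "goodfaith" and "damaging" probabilities for some revisions."""
        params = {
            "models": "goodfaith|damaging",
            "revids": "|".join(str(r) for r in rev_ids),
        }
        response = requests.get(ORES_URL.format(wiki=wiki), params=params, timeout=10)
        response.raise_for_status()
        scores = response.json()[wiki]["scores"]

        results = {}
        for rev_id, models in scores.items():
            results[rev_id] = {
                # Probability that the edit was made in good faith.
                "goodfaith": models["goodfaith"]["score"]["probability"]["true"],
                # Probability that the edit damages the article.
                "damaging": models["damaging"]["score"]["probability"]["true"],
            }
        return results

    # Example call for two hypothetical revision IDs on English Wikipedia:
    # score_revisions("enwiki", [123456, 123457])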

While research shows that new editors are particularly vulnerable to rejection, there’s also evidence that edit-review and even rejection can be a powerful learning experience for newcomers.[3]

For reviewers interested in supporting new users, then, a stream of edits that are a) likely to be reverted but which were b) made in good faith will, we hope, represent a string of teachable moments.
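
As a minimal sketch of that idea, the snippet below reuses the hypothetical score_revisions helper from the previous example and keeps only edits that ORES rates as probably good-faith yet also probably damaging, using the damaging probability as a rough stand-in for "likely to be reverted". The thresholds are illustrative values chosen for this example, not settings used by the project.

    def teachable_moments(scored_edits, goodfaith_min=0.8, damaging_min=0.6):
        """Select edits that look well-intentioned but likely to be reverted.

        `scored_edits` maps revision IDs to {"goodfaith": p, "damaging": p}
        dictionaries, as returned by the score_revisions sketch above.
        The thresholds are illustrative, not the Collaboration Team's settings.
        """
        return [
            rev_id
            for rev_id, probs in scored_edits.items()
            if probs["goodfaith"] >= goodfaith_min and probs["damaging"] >= damaging_min
        ]

    # Edits in this queue are candidates for supportive review and mentoring
    # rather than a bare revert: a stream of potential "teachable moments".
    # queue = teachable_moments(score_revisions("enwiki", [123456, 123457]))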

The edit analysis described above will initially be made available to users in two ways[4]:

Current activity

  • To visualize possible product directions, the Collaboration Team is exploring design concepts while continuing to research the issues.
  • To better gauge the size of the problem and be able to track progress, we’re working to define and measure new-editor retention.
  • Design Research is organizing and conducting interviews with users touched by this issue in various ways, to better understand their motivations and workflows. Groups who will be interviewed in the near term include: anti-vandalism patrollers, recent changes patrollers, Teahouse hosts, Welcoming Committee members, and AfC reviewers.
  • The Research and Data team is working to make predictions better by refining the accuracy of prediction models.
  • There was a discussion of the project at Wikimania 2016, in June.

Improving filtering on the Recent Changes page

Additional information

 
Figure: A single entry point is proposed for filtering recent changes.

Figure: Scenarios defined for the filtering system to support: helping newcomers, fighting vandalism, and thanking newcomers.

In order to help reviewers easily find the contributions they are looking for, we plan to improve the way filtering works on the Special:RecentChanges page. The goal is to make the list of contributions easy to filter, to allow for more filter criteria (especially those relevant to helping newcomers), and to make it easier to combine multiple filters for different purposes.

This interactive prototype illustrates the filtering concept proposed. For additional context, you can check the supported scenarios.

Before we get there, the work will proceed in multiple steps inside a beta feature. More details below.

Initial steps

Initially, namespaces and tags won't be integrated into the filtering system. Filters related to ORES will be supported (a sketch of how such filters might be combined follows the list below). These filters include:

  • Review. Filters that allow reviewers to focus on contributions that have not been reviewed yet, or on those already processed by other reviewers.
  • Contribution quality. Filters that help identify contributions that are good or damaging.
  • User intent. Filters that help identify contributions that were made in good or bad faith.
  • User experience level. Filters that target edits depending on the experience level of their author.
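
Purely as an illustration of how filters from these groups could be composed, the sketch below models each filter as a simple predicate and combines the selected ones with AND semantics. The RecentChange fields, filter names, and thresholds are assumptions made for this example, not the actual data model or configuration of the Recent Changes page.

    from dataclasses import dataclass

    @dataclass
    class RecentChange:
        # Simplified stand-in for one entry in the recent-changes feed;
        # the real feed carries many more fields.
        rev_id: int
        reviewed: bool            # has a patroller already handled it?
        goodfaith_prob: float     # ORES "goodfaith" probability
        damaging_prob: float      # ORES "damaging" probability
        author_edit_count: int    # rough proxy for user experience level

    # One predicate per filter; thresholds are illustrative only.
    FILTERS = {
        "unreviewed": lambda c: not c.reviewed,
        "likely_damaging": lambda c: c.damaging_prob >= 0.6,
        "likely_good_faith": lambda c: c.goodfaith_prob >= 0.8,
        "newcomer": lambda c: c.author_edit_count < 100,
    }

    def apply_filters(changes, selected):
        """Keep only the changes that match every selected filter."""
        return [c for c in changes if all(FILTERS[name](c) for name in selected)]

    # Example: unreviewed newcomer edits made in good faith that still look
    # damaging, i.e. candidates for supportive review.
    # apply_filters(changes, ["unreviewed", "newcomer",
    #                         "likely_good_faith", "likely_damaging"])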

Future plans

Creating the streams/pages of “teachable moments” described above has the potential to establish edit-review as a new space for instructing and supporting new editors.

The mere existence of such a platform, however, won’t in itself ensure that this new practice will take root. To truly have an impact on newcomer retention, interventions may be required at multiple points in the editing and review cycles: before publication, to spot problems and enable authors to seek help; during review, to facilitate a constructive process; and even after review, to help new users overcome rejection and learn from their experiences.

In addition to exploring ideas for intervening at various points, we’re pursuing answers to questions such as these:

  • How can we bring reviewers to this new activity?
  • What would make reviewers most effective in the job of supporting newcomers during edit review?
  • How can we make the process rewarding for reviewers, so that they stay involved?

The counter-vandalism community also has an important role to play in this arena. Richer data about edits and editors should make patrollers of all types not only more discriminating about which edits might be in good faith, but also more efficient at their job of combating harm. It will be important to work closely with vandalism fighters and others to understand how their processes and tools might best be adapted to realize these potential gains.

Principles

As we pursue this project, the following principles will guide our planning.

  • Smart but human. Use technology to support rather than replace human interaction. Artificial intelligence can provide analysis, but humans should make decisions.
  • Cross-community. Find solutions that will work across language groups and projects, rather than building wiki-specific tools.
  • Platform not feature. Seek solutions that are extensible and reusable by current and future community-created and WMF tools.
  • Mobile. Although edit-review is not currently popular on mobile, consider mobile users carefully in our plans.
  • Adoption. In addition to creating new technology, focus on finding ways to encourage reviewers to adopt and continue to use the new tools.
  • Integration. In seeking new solutions, build on and integrate with existing practices whenever possible.
  • Incremental approach. As we move into this new area, proceed incrementally to each milestone and then evaluate where to go next.
  • Participatory design. Collaborate with editors and tool developers already working in this space.

References

  1. Halfaker, A., Kittur, A., & Riedl, J. (2011, October). Don't bite the newbies: how reverts affect the quantity and quality of Wikipedia work. In Proceedings of the 7th international symposium on wikis and open collaboration (pp. 163-172). ACM.
  2. “Several changes the Wikipedia community made to manage quality…have ironically crippled the very growth they were designed to manage. Specifically...the algorithmic tools used to reject contributions are implicated as key causes of decreased newcomer retention.” Halfaker, A., Geiger, R. S., Morgan, J. T., & Riedl, J. (2012). The rise and decline of an open collaboration system: How Wikipedia’s reaction to popularity is causing its decline. American Behavioral Scientist, 0002764212469365.
  3. “We found that newcomers are particularly likely to decrease their contributions after they are reverted. We also saw some evidence that they can learn the most from being reverted. Newcomers should be reached out to actively to help them become socialized into Wikipedia.” Halfaker, A., Kittur, A., & Riedl, J. Don't bite the newbies: how reverts affect the quantity and quality of Wikipedia work.
  4. Collaboration Team Goals - Second Quarter FY2016-17