Edit check


During the 2023–2024 fiscal year, the Editing team is working on a set of improvements to the visual editor to help new volunteers understand and follow some of the rules and recommendations needed to make constructive edits across Wikipedia projects.

Below, you will find information about the goals of this project, the history that has guided it, and why the Wikimedia Foundation's Product department is prioritizing this work.

See Editing team/Community Conversations for planned meetings about this project.

Several features are available within Edit check:

The Editing team is working on new Checks:

See also:

Goals

  1. Newcomers and Junior Contributors from Sub-Saharan Africa will feel safe and confident enough to publish edits they are proud of and that experienced people consider useful.
  2. Moderators on the English and French Wikipedias will notice improvements in the quality of the edits newcomers make and will be motivated to configure how Edit Check presents editing policies to them.

Current status

Updates for 2025 are listed below; they are drawn from this page. Watch 2025 updates to follow along.

Edit check updates:

Peacock check becomes Tone Check, model test to start soon

As the term "Peacock" was specific to the English language, internationalizing it was challenging. The Editing team decided to change the name of the feature to a more universal term: Tone check.

Tone check will use a model trained by users. This first test will be conducted for the English, Spanish, Portuguese, French, and Japanese languages. Users are invited to sign up at Edit check/Tone Check/model test before May 23rd.

Peacock check model test to start soon

Peacock check will use a model trained by users. We will soon start reaching out to a few communities (listed at T388471) to ask for volunteers to test the model.

We selected the wikis based on several criteria:

  • Technical reasons:
    • Wikis that use the “variants” feature (like Chinese) — because the model has to infer across different language varieties
    • Languages that don’t space-separate words (like Chinese, Japanese) — where the results will be very dependent on the tokenizer
    • Agglutinative languages (like Turkish, Indonesian) — where the model will be very dependent on the tokenizer
    • RTL languages (like Arabic, Hebrew) — because of potential user experience (UX) issues
  • projects that see relatively high volumes of newcomers, specifically in Sub-Saharan Africa, and
  • projects that have expressed a willingness to experiment with Peacock Check.

A/B Test: Multiple Reference Checks

An analysis of leading indicators for the Multiple Reference Check A/B test is complete and the results are encouraging:

  • New(er) volunteers are encountering Multiple Reference Checks in enough editing sessions to draw statistically significant conclusions from
  • People shown multiple Reference Checks within an edit session are proceeding to publish edits that include ≥1 reference at relatively high rates
  • Showing people multiple Reference Checks within an edit session is not leading to increases in revert rate or blocks

You can review these results in more detail below.

For context, leading indicator analyses of this sort are meant to uncover what – if any – adjustments we will consider prioritizing before evaluating the broader impact of the feature in question.

Peacock Check

 
Work-in-progress design of Peacock Check.

In collaboration with the Machine Learning team, the Editing team has started working on a new check: Peacock check (T368274). This check will detect the use of puffery terms and encourage the user to change them.

We are currently gathering on-wiki policies, templates used to tag non-neutral articles, and the terms (jargon) used in edit summaries for 20 wikis.

A/B Test: Multiple Reference Checks

Yesterday (25 March), an A/B test that removes the constraint on how many Reference Checks people can see within a single edit began at 12 Wikipedias.

Note: at present, a maximum of one Reference Check is shown per edit.

This experiment is an effort to learn: What – if any – changes in edit quality and completion do we observe when people have the potential to see multiple Reference Checks within a single edit?

The findings from this A/B test will be relevant for the near-term future where multiple Edit Checks of the same and/or different types (e.g. Peacock Check, Paste Check, etc.) have the potential to become activated within a single edit session.

 
Daily revert rate for new content edits where Reference Check is shown.

Curious to learn if and how the new desktop Edit Check experience might have impacted edit quality, we investigated how the number of reverted edits changed before and after the December 2024 release.

We learned:

  1. The revert rate of new content edits where Reference Check was presented decreased by 15.7% (20.4% pre- to 17.2% post-change).
  2. There was an 8% increase in the proportion of new content edits that included a reference following the change (34.8% pre-change to 37.9% post-change).

See the full report for more details.

In December, we released a new design for the Edit Check desktop experience.

Last week, we finished an analysis that compared how – if at all – several key metrics shifted before and after this change.

The purpose of this analysis: decide whether there were any changes to the user experience we ought to prioritize making before beginning an A/B test that will evaluate the impact of it being possible for multiple Reference Checks to be shown within a single edit.

You can find a summary of the results below and the full report here.


Strategy and approach

To equip newcomers and Junior Contributors from Sub-Saharan Africa with the know-how and tools to publish changes they are proud of and that experienced volunteers consider useful, the Editing Team will be introducing new functionality within the visual editor (desktop and mobile) that checks the changes people are attempting to make and presents them with actions they can take to improve those changes so they align with established Wikipedia policies and guidelines.

The first "check" the Editing Team will be introducing is one that will detect when people are attempting to add new content to an existing article without a corresponding reference and prompt them to do so. The functionality will be accompanied by a complimentary set of features that will enable moderators to configure the user experience newcomers and Junior Contributors will see to ensure the software is guiding them to take actions that align with project policies and conventions.

Challenges

The visual editor's growing popularity among people who are new to editing Wikipedia[1] leads us to think that the editing experience has been reasonably successful at helping inexperienced volunteers learn the technical skills necessary to publish changes to Wikipedia.

The trouble is that the visual editor and other editing interfaces do not make people aware of the Wikipedia policies and guidelines they are expected to follow.

As a result, the changes inexperienced volunteers publish often break established best practices and lead to undesirable outcomes for inexperienced volunteers, experienced volunteers, and Wikipedia projects as a whole:

  1. Inexperienced volunteers – become disappointed and frustrated when the good-faith change(s) they came to the wiki seeking to make are undone (read: reverted), deleted, and/or scrutinized in inequitable ways. These poor interactions are demotivating and drive these could-be volunteers and community members, and the knowledge they are uniquely positioned to offer, away.[2]
  2. Experienced volunteers/moderators – need to do more work reverting low-quality edits and posting messages on inexperienced volunteers' talk pages to make them aware of the policies and/or guidelines they are likely to have unknowingly broken. Continually needing to educate inexperienced volunteers and undo their changes can lead to experienced volunteers becoming skeptical of inexperienced volunteers and impatient with them.
  3. Wikipedia projects – struggle to grow and diversify their volunteer populations and shrink the knowledge gaps present within Wikimedia wikis.

This project seeks to address the challenges above by:

  1. Offering inexperienced volunteers relevant and actionable feedback about Wikipedia policies in the precious moments when they are in the midst of making a change using the visual editor.
  2. Equipping moderators with a new ability to specify the feedback inexperienced volunteers are presented with while they are editing.

Theory of change

This project is built on the belief that by surfacing relevant guidance in the precious moments when inexperienced volunteers are in the midst of making a change to Wikipedia and equipping them with the know-how and tools necessary to apply this guidance, they will make changes they are proud of and that experienced volunteers value.

In the longer term, the Editing Team thinks that people who are new, particularly people who have historically been excluded from and harmed by established power structures, will feel safe and motivated making changes to Wikipedia if they can accurately predict whether the changes they are attempting to make are aligned with existing Wikipedia policies, guidelines, and/or cultural conventions.

More broadly, the Editing Team thinks that to evolve towards a future where wikis' policies and cultural norms – and ultimately, content – reflect the diverse experiences of the people these projects are intended to serve, we first need to make the norms and standards that are currently in place legible and actionable to people while they are editing.[3] This way, volunteers can develop shared awareness of cases where these norms and standards are not having the impacts they were intended to have and decide what – if any – changes they think are worth making to them in response.

Target audience

The Editing team is focusing on the needs of people who are:

  1. Experience: Learning the basics of contributing to Wikipedia
    • In the context of this project, we are considering people who are still "learning the basics" to be people who have published <100 cumulative edits to a single, or multiple, Wikipedias. This includes people who are editing Wikipedia for the first time.
  2. Location: Living in Sub-Saharan Africa
  3. Projects: Contributing to the English and French Wikipedias
  4. Motivation: Seeking to fill gaps they notice within Wikipedia

The four priority criteria listed above stem from the following:

  • Newcomers are two times more likely to live in Africa or Asia.[4]
  • The movement struggles to retain editors who live outside Europe and North America.[4]
  • People from Sub-Saharan Africa are underrepresented within the movement: people from Sub-Saharan Africa represent only 1% of active unique editors, despite representing 15% of the global population and 7% of the global internet population.[5]
  • 80% of registered editors in Sub-Saharan Africa contribute to English or French Wikipedia.[6]


Design idea

Reference detection

To start, the Editing Team is pursuing an approach with Edit Check that minimizes the likelihood of false positives and is implemented in ways[7] that empower volunteers, on a per-project basis, to evolve the heuristic[8] to become more robust over time.

This strategy amounts to the initial reference Edit Check becoming activated if/when all of the following conditions are met:

  1. A minimum of one new paragraph of text is added to the article someone is editing
  2. The "new paragraph(s) of text" someone has added does NOT include a reference
  3. The changes described in "1." and "2." are happening on a page within the main namespace (NS:0)

The conditions above are implemented and maintained in code here: editcheck/init.js.
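
For illustration only, a simplified sketch of how these conditions combine is shown below. The authoritative logic lives in editcheck/init.js; the function and property names used here (shouldActivateReferenceCheck, edit.addedParagraphs, containsReference) are hypothetical, not the actual VisualEditor code.

  // Hypothetical sketch of the reference-check activation heuristic.
  // The real implementation is maintained in editcheck/init.js.
  function shouldActivateReferenceCheck( edit ) {
      // Condition 3: only pages in the main namespace (NS:0) are eligible.
      if ( edit.pageNamespace !== 0 ) {
          return false;
      }
      // Condition 1: at least one new paragraph of text was added.
      if ( edit.addedParagraphs.length < 1 ) {
          return false;
      }
      // Condition 2: none of the newly added paragraphs include a reference.
      return !edit.addedParagraphs.some( containsReference );
  }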

The Editing Team arrived at the decision to start with a relatively limited and straightforward set of rules in order to:

  1. Increase the likelihood that newcomers and Junior Contributors find the guidance Edit Check is presenting them with, and the editing experience more broadly, to be intuitive and straightforward so that they feel encouraged to return to edit again
  2. Decrease the likelihood that Edit Check is creating more work for experienced volunteers by prompting newcomers and Junior Contributors to add sources when they are not needed

You can learn more about the assumptions that informed the thinking above in phab:T329988#8654867.

Other applications


See also: Edit check/Ideas

Configurability

The Editing Team thinks it is crucial that moderators be empowered to configure when, and for whom, Edit Check becomes activated. This way, they can be confident the software is promoting behavior they deem to be productive and modify the software when it is not.

In line with the above, and drawing inspiration from how the Edit filter and Growth Team Community configuration systems afford volunteers the ability to audit and configure how they function on-wiki, Edit Check will enable volunteers, on a per-project basis, to:

  • Audit and edit the logic that determines when the reference Edit Check becomes activated and
  • Review the edits people who are shown Edit Check are making

Work to implement the above is ongoing in phab:T327959.
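
As a loose illustration of what such per-project configurability could look like, here is a hypothetical configuration sketch in the spirit of the Community configuration and Edit filter systems mentioned above. None of these keys are actual Edit Check settings; only the editcheck-reference-activated tag (see the Edit Check tags footnote) is an existing mechanism.

  // Hypothetical per-project configuration sketch; key names are invented
  // for illustration only.
  const exampleEditCheckConfig = {
      referenceCheck: {
          enabled: true,
          // Restrict the check to the main namespace (NS:0).
          namespaces: [ 0 ],
          // Minimum amount of newly added text before the check can fire.
          minimumAddedParagraphs: 1,
          // Apply the check only to newer accounts, e.g. under 100 edits.
          maxUserEditCount: 100
      },
      // Edits where the check fired are tagged so volunteers can review them,
      // e.g. by filtering Recent Changes on this tag.
      reviewTag: 'editcheck-reference-activated'
  };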

User Experience

Mobile

The first version of Edit Check will introduce a new step within the mobile visual editor's publishing workflow that people will see if/when they add new content without a reference.

Desktop

Design for the desktop user experience is still underway. See T329579.

Experiments

Multi-Check (References) A/B Test

Leading Indicators | T388731

  • New(er) volunteers are encountering Multi-Check
    • In the test group, multiple reference checks were shown within a single editing session in 19% of all published new content VE edits (549 edits) by unregistered users and users with 100 or fewer edits.
    • For edits shown multiple checks, the majority of edits (73%) were shown between 2 and 5 Reference Checks.
  • People shown multiple Reference Checks within an edit go on to publish at a relatively high rate
    • The edit completion rate for sessions that were shown multiple checks within a session was 76.1% compared to 75% for sessions shown only one check, indicating that multiple checks are not causing significant disruption or confusion to the editors.
  • Likelihood to include a reference
    • Sessions shown multiple checks are more likely to include at least one new reference in the final published edit compared to sessions shown just a single check.
    • In the test group, 52.5% of all published edits shown multiple Reference checks included at least one new reference compared to 39.7% of edits that were shown a single check.
  • Disruption (revert and block rates)
    • In the test group, the revert rate of new content edits shown multiple Reference Checks (17%) is currently lower than for sessions shown a single Reference Check (26%).
    • No significant changes in the proportion of users blocked after being shown multiple Reference Checks compared to a single Reference Check.

Multi-Check Phase 1 Impact Analysis

 
Multi-Check (Phase 1)

In December 2024, we released a new design for the Edit Check desktop experience. This change shifted Edit Checks from appearing within articles to appearing alongside them, in a new "siderail."

To decide how – if at all – this change impacted volunteer disruption and edit quality, we analyzed how several key metrics shifted before and after this change.

Findings

  • The revert rate of new content edits where Reference Check was activated decreased by 15.7%.
    • 20.4% (pre) →  17.2% (post).
  • The proportion of new content edits that included a reference following the change increased 8%.
    • 34.8% (pre) → 37.9% (post)
  • Excluding reverted edits, there was a 3.2% increase (2 percentage points) in edit completion rate.
    • 68% of edits where Reference Check was presented were successfully saved and not reverted following the change in the Edit Check UX.
  • The rate at which people declined to add a reference when Edit Check prompted them decreased 4.7%.

Conclusion(s)

The findings above suggest this design change has been net positive. As a result, we will continue depending on this new design paradigm as we introduce new types of Edit Checks and moments within the editing workflow when they are presented.

Reference Check A/B Test

To learn whether the Reference Edit Check is effective at causing newcomers to make edits they intended and experienced volunteers value, we conducted an A/B test with 15 Wikipedias.

Below you can read more about what this experiment demonstrated, what the Editing Team is planning in response, and more details about the test's design.

Conclusion and next step(s)

Reference Check caused an increase in the quality of edits newcomers publish and did not cause any significant disruption.

This combination is leading the Editing team to be confident that offering Reference Check as a default-on feature would have a net positive impact on all wikis and the people who contribute to them.

Findings

 
There was a 2x increase in the proportion of new content edits by newcomers, Junior Contributors, and unregistered users that included a reference when Reference Check was shown to eligible edits.

New content edits *with* a reference

People shown the Reference Check are 2.2 times more likely to publish a new content edit that includes a reference and is constructive (not reverted within 48 hours).

  • Increases were observed across all reviewed user types, wikis, and platforms.
  • The highest observed increase was on mobile where contributors are 4.2 times more likely to publish a constructive new content edit with a reference when Reference Check was shown to eligible edits.

Revert rate

  • New content edit revert rate decreased by 8.6% if Reference Check was available.
    • New content edits by contributors from Sub-Saharan Africa are 53% less likely to be reverted when Reference Check is shown to eligible edits.
 
We observed increases on both desktop and mobile. On mobile, users are 4.2 times more likely to include a reference with their new content when the Reference Check is shown to eligible edits.

While some non-constructive new content edits with a reference were introduced by this feature (5 percentage point increase), there was a higher proportion of constructive new content edits with a reference added (23.4 percentage point increase). As a result, we observed an overall increase in the quality of new content edits.

 
There was an 8.6% decrease in the revert rate of all new content edits, comparing edits where Reference Check was shown in the test group to edits that were eligible but not shown Reference Check in the control group.

Constructive Retention Rate

  • Contributors that are shown Reference Check and successfully save a non-reverted edit are 16 percent more likely to return to make a non-reverted edit in their second month (31-60 days after).
    • This increase was primarily observed for desktop edits. There was a non-statistically significant difference observed on mobile.

Guardrails

Edit Completion Rate

  • We observed no drastic decreases in edit completion rate from intent to save (where Reference Check is shown) to save success overall or by wiki.
  • Overall, there was a 10% decrease in edit completion rate for edits where Reference Check was shown.
    • There was a higher observed decrease in edit completion rate on mobile compared to desktop. On mobile, edit completion rate decreased by 24.3%, while on desktop it decreased by 3.1%.

Block Rate

  • There were decreases or no changes in the rate of users blocked after being shown Reference Check and publishing an edit compared to users in the control group.

False Negative Rate

  • There was a low false negative rate. Only 1.8% of all published new content edits in the test group did not include a new reference and were not shown Reference Check.

False Positive Rate

  • 6.6% of contributors dismissed adding a citation because they indicated the new content being added does not need a reference. This was the least selected decline option overall.

Test design

11 Wikipedias participated in the test. At each wiki, 50% of users were randomly assigned to a test group and 50% were assigned to a control group.

Users in the test group were shown the Reference Check notice prompting them to decide whether the new content they were adding needed a reference (if they had not already added one themselves).

Users in the control group were shown the default editing experience, even if they did not accompany the new content they were adding with a reference.

Timing

This analysis was completed on 16 April 2024 and analyzed engagement data at the 11 participating wikis from 18 February 2024 through 4 April 2024.

Evaluating impact

The viability of the features introduced as part of the Edit Check project depends on the impacts they cause and avert.[9]

This section describes the:

  1. Impacts the features introduced as part of the Edit Check project are intended to cause and avert
  2. Data we will use to help[10] determine the extent to which a feature has/has not caused a particular impact
  3. Evaluation methods we will use to gather the data necessary to determine the impact of a given feature

Desirable Outcomes[11]

  1. Increase the quality of edits newcomers and Junior Contributors editing from within Sub-Saharan Africa publish in the main namespace
    • Data: Decrease in the proportion of published edits that add new content and are reverted within 48 hours or have a high revision risk score; comments/reports from experienced volunteers about the quality of edits Edit Check is activated within[12]
    • Evaluation method(s): A/B test[13], qualitative feedback (e.g. talk page discussions, false positive reporting)
  2. Increase the likelihood that newcomers and Junior Contributors editing from within Sub-Saharan Africa will accompany the new content they are adding with a reference
    • Data: Increase in the percentage of published edits that add new content and include a reference; increase in the percent of newcomers or Junior Contributors from SSA that publish at least one new content edit that includes a reference; increase in the likelihood that someone includes a reference the next time they contribute new content
    • Evaluation method(s): A/B test[13]
  3. Newcomers and Junior Contributors editing from within Sub-Saharan Africa will report feeling safe and confident making changes to Wikipedia
    • Data: Newcomers and Junior Contributors find the feedback and calls to action Edit Check presents them with to be (1) helpful, (2) supportive, and (3) motivating
    • Evaluation method(s): Qualitative feedback via channels like Community Calls, talk pages, event organizers, etc.
  4. Experienced volunteers will independently audit and iterate upon Edit Check's default configurations to ensure Edit Check is causing newcomers and Junior Contributors to make productive edits.
  5. Newcomers and Junior Contributors will be more aware of the need to add a reference when contributing new content because the visual editor will prompt them to do so in cases where they have not done so themselves.
    • Data: Increase in the percent of newcomers or Junior Contributors from SSA that publish at least one new content edit that includes a reference
    • Evaluation method(s): A/B test[13]

Risks (Undesirable Outcomes)[14]

  1. Edit quality decreases
    • Data: Increase in the proportion of published edits that add new content and are reverted within 48 hours or have a high revision risk score; comments/reports from experienced volunteers about the quality of edits Edit Check is activated within[12]
    • Evaluation method(s): A/B test[13], qualitative review and feedback
  2. Edits become more difficult to patrol because unreliable citations are difficult to detect
    • Data: Significant increase in the percentage of new content edits new and developing volunteers make that include a reference; comments/reports from experienced volunteers about the quality of edits Edit Check is activated within[12]
    • Evaluation method(s): A/B test[13], qualitative review and feedback
  3. Edit completion rate drastically decreases
    • Data: Proportion of edits that are started (event.action = init) that are successfully published (event.action = saveSuccess); see the sketch below
    • Evaluation method(s): A/B test[13]
  4. Edit abandonment rate drastically increases
    • Data: Proportion of contributors that are presented Edit Check feedback and abandon their edits (indicated by event.action = abort and event.abort_type = abandon); see the sketch below
    • Evaluation method(s): A/B test[13]
  5. Blocks increase
    • Data: Proportion of contributors blocked after publishing an edit where Edit Check was shown is significantly higher than for edits in which Edit Check was not shown
    • Evaluation method(s): A/B test[13]
  6. High false positive or false negative rates
    • Data: Proportion of new content edits published without a reference and without being shown Edit Check (indicator of false negatives); proportion of contributors that dismiss adding a citation and select "I didn't add new information" or another indicator that the change they are making doesn't require a citation
    • Evaluation method(s): A/B test[13], qualitative feedback received from volunteers about the accuracy and usefulness of Edit Check's current configuration[15]
  7. Edit Check is too resource intensive to scale
    • Data: Efficiencies do not emerge over time, making each new Edit Check as "expensive" to implement as the first one
    • Evaluation method(s): Qualitative assessment by the Editing team
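
As a rough sketch of how the event-based guardrail metrics in the tables above could be computed: the event.action and event.abort_type values come from the tables, while the aggregation itself is illustrative rather than the team's actual analysis code.

  // Illustrative only: compute edit completion and abandonment rates from a
  // list of editing events using the event.action / event.abort_type values
  // cited in the tables above.
  function editFunnelMetrics( events ) {
      const started = events.filter( ( e ) => e.action === 'init' ).length;
      const saved = events.filter( ( e ) => e.action === 'saveSuccess' ).length;
      const abandoned = events.filter(
          ( e ) => e.action === 'abort' && e.abort_type === 'abandon'
      ).length;
      return {
          // Proportion of started edits that are successfully published.
          editCompletionRate: started ? saved / started : 0,
          // Simplified: measured here relative to started edits, whereas the
          // table defines it relative to contributors shown Edit Check feedback.
          editAbandonmentRate: started ? abandoned / started : 0
      };
  }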

Deployment process

Please see Deployment status#Deployment process.

Background

Volunteers across the movement have a long history of working to:

  • Proactively educate and guide newcomers to make changes they feel proud of and changes that improve Wikipedia
  • Prevent people from publishing destructive changes, and
  • React to, and moderate, the changes made to Wikipedia articles.

The Editing team and this project are inspired by the initiatives below. If there is a project or resource you think we should know about, please add it below!

Initiative Description Initiator(s)
Model: meta:Eno-Prompt A project to train a fine-tuned open-source LLM (large language model) that will detect disinformation based on linguistic analysis.
Paper: Counter-Misinformation Dynamics: The Case of Wikipedia Editing Communities during the 2024 US Presidential Elections Recommendations to make Wikipedia more resilient to misinformation. Includes recommendations relevant to Edit Check.
Wish: Make editnotices display as pop-ups Make people aware when they are at risk of changing an article volunteers consider to be about a contentious topic User:Theleekycauldron
Corrector ortográfico, Helferlein/Rechtschreibprüfung, ויקיפדיה:סקריפטים/בודק איות, Revisor_ortográfico A tool that checks pages loaded in the browser against a list of common spelling errors. Poco_a_poco, Benutzer:APPER, :משתמש:ערן, Elisardojm
Error Finder A tool to find common errors in Persian texts Reza1615
Wish: Warn when large amount of content has been copy-pasted A wish to warn people who are pasting text into Wikipedia and to annotate edits in which this occurs so that patrollers can consider this as they are reviewing edits/looking for edits to review. Matěj_Suchánek
fr:Projet:Articles sans sources (along with 3 other wikiprojects) A WikiProject intended to add sources to articles that need them
CopyPatrol Tool that allows you to see recent Wikipedia edits that are flagged as possible copyright violations Community Tech Team
paper: Automatically Neutralizing Subjective Bias in Text Method for automatically bringing inappropriately subjective text into a neutral point of view ("neutralizing" biased text). Reid Pryzant, Richard Diehl Martinez, Nathan Dass, Sadao Kurohashi, Dan Jurafsky, Diyi Yang
Wikipedia:Citation watchlist User script that adds visual indicators to watchlist and recent changes entries when unreliable sources are added to articles. Harej, Ocaasi
Internet Archive Reference Explorer Explore references included in Wikipedia articles via a range of criteria
WikiScore A tool created to validate edits and count scores of participants in wikicontests.
Earwig's Copyvio Detector This tool attempts to detect copyright violations in articles. The Earwig
CiteUnseen A user script that adds categorical icons to Wikipedia citations, providing readers and editors a quick initial evaluation of citations at a glance. SuperHamster
Credibility bot Monitors and collects data on source usage within Wikipedia articles Harej
Salebot (French Wikipedia) A counter-vandalism bot that uses regex to identify issues.
Edit intros (English Wikipedia) A message is shown automatically when editing a page categorized as either Category:Living people or Category:Possibly living people.
Make edit notices more visible in Visual Editor How might we make it so people who are in the midst of an edit are likely to see and "internalize" the information that is currently presented within Edit Notices? User:Stjn
Internet Archive Reference Explorer Automatically detect source quality Ocaasi
Wish: Reference requirement for new article creation Require new articles to include references User:Mega809
Edit Notices Enables individual volunteers and projects to display a custom notice above the edit form, depending on the page, namespace, or other circumstances.
Page notices
Maintenance templates
Extension:AbuseFilter Enables privileged users to set specific actions to be taken when actions by users, such as edits, match certain criteria.
Extension:Disambiguator Displays a notification in the 2006/2010 wikitext editor whenever one adds a link to a disambiguation page. Community Tech
ORES Halfak (WMF)
Suggested edits
CiteHighlighter Highlights 1,800 sources in green, yellow, or red according to their reliability. Novem Linguae
Checkwiki Helps clean up syntax and other errors in Wikipedia's source code Stefan Kühn, Bgwhite
Edit diff tags Presents all the different tags that can be determined automatically (generally via basic heuristics) for a given Wikipedia edit diff. Isaac (WMF)
CivilityCheck A project to assess civility in Wikipedia discussion comments in order to address the problem of abuse that drives editor decline within the Wiki community. Deus Nsenga, Baelul Haile, David Ihim, and Elan Houticolo-Retzler
BOTutor A bot that sends a message to people who attempt to publish an edit that triggers an existing set of rules ValeJappo
Gadget-autocomplete.js ערן
Text reactions A proposal that would let the editing interface react to what people type into the edit area SD0001
Editwizard A step-by-step process to guide newcomers toward sourcing the content they are attempting to add to Wikipedia articles Ankit18gupta, Enterprisey, Firefly, and SD0001
Headbomb/unreliable "The script breaks links to various sources down into different 'severities' of unreliability. In general, the script is kept in sync with [[w:fr:Wikipédia:Observatoire des sources WP:OBS]], {{Predatory open access source list}}, WP:NPPSG, WP:SPSLIST (not yet fully implemented) and WP:CITEWATCH, with some minor differences." Headbomb, SD0001
The Wikipedia Adventure Game based on the tech of Extension:GuidedTour that teaches basic wikitext markup and the rules about reliable sources and neutral point of view. Research into its effectiveness is described at m:Research:Impact of The Wikipedia Adventure on new editor retention. Ocaasi
w:Help:Introduction The main tutorial for new English Wikipedia editors, covering both policies and the technical procedures for the visual editor and wiki markup. Most recently overhauled in late 2020 and more actively maintained than TWA (The Wikipedia Adventure). Sdkb, Evolution and evolvability, and others
User:Phlsph7/HighlightUnreferencedPassages A user script that highlights passages lacking references with a red background. Its main purpose is to help users quickly identify unreferenced passages, paragraphs, and sections in mainspace articles and drafts. Phlsph7
Wish: Add notice to the visual editor that unsourced edits may be reverted A message in the visual editor's "Publish changes" dialog warning that unsourced edits will be reverted User:Lectrician1
Wish: Warn when adding a url reference that matches the SpamBlacklist Warn when the URL added as a reference is listed in the SpamBlacklist, and thus prevent the warning from appearing when the page is saved. User:DSan
Edit Filter #686 The AbuseFilter that is triggered when a new user possibly adds unreferenced content to a BLP User:Rich Farmbrough
WikiLearn A platform for training
DannyS712/copyvio-check.js Automatically checks the copyvio percentage of new pages in the background and displays that information, with a link to the report, in the "info" panel of the page curation toolbar. DannyS712
XLinkBot A bot that warns people who have added an external link that is inappropriate in some way. Versageek, Beetstra

See also

References

  1. Superset: Wikipedia edits by interface and experience level
  2. Growth Team: IP editing Research Report
  3. The Tyranny of Structurelessness
  4. Community Insights 2021 Report
  5. Regional Quarterly Learning Sessions (June 2022, google document)
  6. Superset
  7. T327959
  8. T324730
  9. Where "viability" in this context refers to a feature being fit for being scaled to all projects as determined by the extent to which it has been proven to have a net positive impact on wikis and the volunteers who build and maintain them.
  10. Emphasis on "help" seeing as how all decisions will depend on a variety of data, all of which need to be weighted and considered to make informed decisions.
  11. T325838 - Finish Edit Check measurement plan proposal
  12. At every project where Edit Check is available, volunteers will be able to use the editcheck-reference-activated tag to review edits where the reference check is shown to people in the process of publishing an edit. Learn more about Edit Check tags.
  13. $1
  14. T325851 - Conduct pre-mortem for Edit Check project
  15. In addition to existing feedback channels (Phabricator, talk pages, etc.) there will be a minimum of two additional ways for people to share feedback about Edit Check: A) reporting edits that you think Edit Check should not have been shown within and B) declining to add a reference mid-edit by indicating you think Edit Check was shown when it shouldn't have been.