This page describes work on "positive reinforcement", part of the Growth feature set, and collects the project's major assets, designs, open questions, and decisions.
Features to encourage newcomers to continue editing by showing them that their contributions matter.
Most incremental updates on progress will be posted on the general Growth team updates page, with some large or detailed updates posted here.
- 2021-03-01: project page created
- 2022-02-25: project kicked off with team discussions
- 2022-03-01: project page expanded
- 2022-05-11: community discussion
- 2022-08-12: user testing complete
- 2022-11-24: current designs and measurement and experiment plan added
- 2022-12-01: new impact module released to pilot wikis
- Next: further design iteration & engineering starts on leveling up and personalized praise
The Growth team has been focused on building a "cohesive newcomer experience" that gives newcomers access to the elements that help them join the Wikipedia community of practice. For instance, newcomer tasks give them access to opportunities to participate, and the mentorship module gives them access to mentorship. Suggested edits has succeeded in getting more newcomers to make their first edits. Building on that success, we want to encourage newcomers to continue making more edits. This draws our attention to an undeveloped element newcomers need access to: evaluating their performance. We're calling this project "positive reinforcement".
We want newcomers to understand there is progression and value to sustained contributions on Wikipedia, increasing retention for those users who took the first step in making an edit.
Our big question here is: How might we encourage newcomers who have visited our homepage and tried our features to keep editing and build on their momentum?
When the newcomer homepage was deployed in 2019, it contained a basic "impact module", which listed the number of pageviews for the pages the newcomer had edited. That is the only part of the Growth features that gives the newcomer any sense of their impact, and we have not improved on it since it was first deployed.
With this as a starting point, we have gathered some important learnings about positive reinforcement:
- We have heard good feedback from community members about the module, with experienced editors saying that it is interesting and valuable to them.
- Appreciation from other users has been shown to increase retention, for instance in studies of "thanks" and in an experiment on German Wikipedia. We believe that these reinforcements from real people would be more effective than automated ones coming from the system.
- Community members have explained that it is a high priority for newcomers to move on to more valuable tasks after starting with easy ones, as opposed to getting stuck just doing easy tasks.
- Other platforms, such as Google, Duolingo, and Github, all utilize numerous positive reinforcement mechanisms like badges and goals.
- Communities are wary of incentivizing unhealthy editing. We have seen that when editing contests offer cash prizes, or just when useful roles such as "extended confirmed" rely on edit counts, it can incentivize people to make many problematic edits.
There are many parts of the newcomer journey in which we could attempt to increase retention. We could focus on newcomers who have stopped editing after just one or a few edits, or we could focus farther down the journey on newcomers who have stopped editing after weeks of activity. For this project, we have decided to focus on those newcomers who have completed their first editing session, and who we want to return for a second session. The diagram illustrates these with a yellow star.
We want to focus on newcomers at this stage, as that's the next stage of the editor funnel in which we can help improve retention. It is also where we currently see a very significant attrition rate, so if we can help retain newcomers at this point, it should meaningfully increase editor growth over time.
Research and design
Research was conducted on the various mechanisms that have been employed to encourage people to contribute content, both on-wiki and in off-wiki products. The following are some of the key findings from the research:
- Motivations for Wikipedia editors are multifaceted, and shift over time and experience. New editors are often driven more by curiosity and social connection than ideology.
- Internal projects focus on intrinsic incentives, appeal to altruistic motivations, and are not systematically applied.
- Broadening the appeal beyond ideological motivations may improve diversity of retained editors on Wikipedia.
- Positive messages from experienced users and mentors have proven effective for short-term retention.
To see a summary of the current design ideas for Positive Reinforcement, see this Design Brief. Our designs will evolve further through community feedback and several rounds of user testing.
We have three main ideas for positive reinforcement. We may pursue multiple ideas as we work on this project.
- Impact: An overhaul of the Impact module, incorporating stats, graphs, and other contribution information. The revised impact module would provide new editors with more context about their impact and encourage them to continue contributing. Areas of exploration include:
- Suggested edits milestone, to nudge users to try suggested edits.
- Statistics on how much the user has edited over time (similar to what is in X Tools).
- “Thanks received” count, to highlight the ability to receive community recognition.
- Recent editing activity - including days in a row newcomers have edited (“streaks”) to encourage continued engagement or remind people to restart their contributions.
- View reading activity on articles newcomers have edited over time (similar to info on en:Wikipedia:Pageview_statistics).
Impact module design A - more emphasis on a user’s impact on others (readers and editors)
Impact module design B - emphasizes a user's recent editing activity
- Leveling up: It is important to communities that newcomers progress to more valuable tasks. For those who do many easy tasks, we want to nudge them toward trying more difficult tasks. This could happen after they complete a certain number of easy tasks, or by encouragement on their homepage. Areas of exploration include:
- The newcomer will see success messages post-editing that motivate them to do more edits of the same or different levels of difficulty.
- In the Suggested Edits module, provide opportunities to do more difficult edits, so that newcomers can become more skilled editors.
- In the Impact module, include a milestone counter or award area.
- On the Homepage, add a new module with set challenges to attain some reward (badge/certificate).
- Add notifications to prompt newcomers to try a more difficult task.
Design idea incorporating a daily edit goal
Newcomers could level up to more difficult tasks and receive recognition
An award module on the newcomer homepage
Newcomers making poor quality “fast edits” could receive guidance
Newcomers who complete a challenge could receive a sharable "Skilled newcomer" award
- Personalized praise: research shows that praise and encouragement from other users increase newcomer retention. We want to think about how to encourage experienced users to thank and award newcomers for good contributions. Perhaps mentors could be encouraged to do this on their mentor dashboards or through notifications. We can utilize existing communication mechanisms, which past studies have shown to have a degree of positive effect. Areas of exploration include:
- A personal message from the newcomer's mentor appearing on the homepage.
- An echo notification from the mentor or the Wikimedia Growth team.
- “Thanks” on a specific edit.
- A new milestone badge awarded by the mentor or the Wikimedia Growth team relating to a specific edit.
Display “Thanks” on Newcomer homepage
Display Wikilove on Newcomer homepage
We received direct feedback about the three main ideas, along with many other ideas for improving new editor retention.
Below is a summary of the main themes from the feedback, along with how we plan to iterate based on the feedback.
| We heard... | Plans to iterate based on feedback |
| --- | --- |
| 😊 Looks good! | This idea seems the least controversial and most supported. We will plan to start development on this first, and allow for more time to refine other ideas. |
| 😐 The impact module would be more effective if it scaled with editors as they gained experience. | We plan to focus on newcomers for now, but the new impact module will be built in an extensible way to accommodate improvements in the future. |
| We heard... | Plans to iterate based on feedback |
| --- | --- |
| 😊 Leveling up ensures newcomers don't get "stuck" on easy tasks | Once users have a certain number of unreverted edits of one type, we should suggest they try more difficult tasks. |
| 😊 Newcomers are often eager for awards | If we give awards, they will need to feel meaningful to newcomers, and ideally be sharable either on-wiki (on their user page) or off-wiki. |
| ❌ Goal-based incentives might be problematic, and may result in low-quality edits | Incentives that include a time-based element (similar to service awards) might be an effective approach, as they factor in not only the number of edits but also the length of time registered. Certain "quality gates" could help slow down and guide newcomers whose edits are getting reverted. We plan to reduce the scope of the award side of "Leveling up" for now, and focus more on encouraging users to try more difficult task types as they succeed with easier tasks. |
| ❌ Daily goals might be stressful and demotivating for some people | We will review this idea further and likely allow for goal customization if we pursue it. |
| We heard... | Plans to iterate based on feedback |
| --- | --- |
| 😊 Spreading praise and positivity might help increase newcomer retention. | We are still refining designs for how to encourage more Thanks and personalized praise of newcomers, but hope to have further design ideas to present soon. |
| 😐 Scaling personalized praise might be a challenge, as it takes more time for experienced editors. | Mentors are already busy, so we hope to find a way to surface "praise-worthy" mentees. We will also brainstorm other ideas that don't rely solely on mentors. |
| 😐 We should use existing systems (Thanks, WikiLove, etc.) | Plans aren't finalized, but we definitely plan to take advantage of existing systems. |
Community members suggested several other ideas for improving newcomer engagement and retention. We think these are all valuable ideas (some of which we are already exploring or want to work on in the future) but the following ideas won't fit within the scope of the current project:
- Send newcomers onboarding and welcome emails (the Growth team is actually currently exploring engagement emails in collaboration with the Marketing and the Fundraising teams).
- Expose newcomers to Wikiprojects that relate to their interests.
- Include a customizable widget on the newcomer homepage to allow wikis to promote certain newcomer tasks or events.
- Send notifications to users who welcome newcomers once the newcomer reaches certain editing milestones (to help prompt the user to offer Thanks or Wikilove).
Along with community discussion, we wanted to validate and add to our initial designs and hypotheses by testing designs with readers and editors from several countries. Our design research team therefore conducted Positive Reinforcement user testing aimed at better understanding the project's impact on newcomer contribution across several different languages.
We tested several static Positive Reinforcement designs with Wikipedia readers and editors in Arabic, Spanish, and English. Alongside the Positive Reinforcement designs, we introduced data visualizations from XTools to better understand how such visualizations are perceived by newcomers.
User testing results
- Make impact data actionable: Impact data was a compelling feature for participants with more experience editing, which several related to their interest in data—an unsurprising quality for a Wikipedian. For those new to editing, impact data, beyond views and basic editing activity, may be more compelling if linked to goal-setting and optimizing impact.
- Evaluate the ideal editing interval: Across features, daily intervals seemed likely to be overly ambitious for new and casual editors. Participants also reflected on ignoring similar mechanisms on other platforms when they were unrealistic. Consider consulting usage analytics to identify “natural” intervals for new and casual editors to make goals more attainable.
- Ensure credibility of assessments: Novice editor participants were interested in the assurance of their skills and progress offered by the quality score, article assessment, and badges. Some hoped that badges could lend credibility to their work when reviewed by more experienced editors. With that potential, it could be valuable to verify that the assessments are meaningful measures of skill, and to explore further how best to leverage them to garner community trust of newcomers.
- Reward quality and collaboration over quantity: Both editor and reader participants from esWiki were more interested in recognition of their knowledge or expertise (quality) than the number of edits they have made (quantity). Similarly, some Arabic and English editors are motivated by their professional interests and skill development to edit. Orienting goals and rewards to other indicators of skilled edits, such as adding references or topical contributions, and collaboration or community involvement may also help mitigate concerns about competition overtaking collaboration.
- Prioritize human recognition: While scores and badges via Growth tasks are potentially valued, recognition from other editors appears to be more motivational. Features which promote giving, receiving, and revisiting thanks seemed most compelling, and editors may benefit from being able to select the impact data which best demonstrates their engagement with readers or editors.
- Experiment with playfulness of designs: While some positive reinforcement features can be seen as the product of “gamification”, some participants (primarily from EsWiki) felt that simple, fun designs were overly childish or playful for the seriousness of Wikipedia. Consider experimenting with visual designs that vary in levels of playfulness to evaluate broader reactions to “fun” on Wikipedia.
Below are the current designs for Positive Reinforcement. We have refined the three main ideas outlined above, but the scope of plans and the actual designs have evolved based on feedback from community discussions and user testing.
The revised impact module provides new editors with more context about their impact. The new design includes far more personalized info and data visualizations than the previous design. This new design is fairly similar to the design we shared previously when discussing this feature with communities. You can view the current engineering progress at beta wiki, and we hope to release this feature to Growth pilot wikis soon.
The Leveling up features focus on encouraging newcomers to progress to more valuable tasks. Ideas also include some prompts for new editors to try suggested edits, since structured tasks have been shown to improve newcomer activation and retention.
- “Level up” post-edit dialog message: A new post-edit dialog message type is added to encourage newcomers to try a new task type. We hope this will encourage some users to learn new editing skills as they progress to different, more challenging tasks.
- Post-edit dialog for non-suggested edits: Introduce newcomers who complete 'normal' edits to suggested edits. We plan to experiment by showing newcomers a prompt after their 3rd and 7th edits. Desktop users who click through to try a suggested edit will also see their Impact module, which we hope helps engage newcomers and provides a small degree of automated positive reinforcement. We will carefully measure this experiment and ensure there aren't any unintended negative effects.
- New notifications: New Echo notifications to encourage newcomers to start or continue suggested edits. These act as a proxy for "win-back" emails for those who have an email address and have email notifications enabled.
“Level up” post-edit dialog message
Post-edit dialog for non-suggested edits
Personalized praise features are based on research results that show that encouragement and thanks from other users increases editor retention.
- Encouragement from Mentors: We will add a new module to the Mentor dashboard designed to encourage Mentors to send personalized messages to newcomers who meet certain criteria. We will allow Mentors to customize and control how and when "praise-worthy" mentees are surfaced.
- Increasing Thanks across the wiki: We plan to fulfill the community wishlist item to Enable Thanks Button by default in Watchlists and Recent Changes (T51541, T90404). We hope this will increase Thanks and positivity across the wikis, and hopefully newcomers will benefit from this directly or indirectly.
Design for a new Mentor dashboard module
Design of the settings view of the new dashboard module
Measurement and results
The Positive Reinforcement features aim to provide or improve the tools available to newcomers and mentors in three specific areas that will be described in more detail below. Our hypothesis is that once a newcomer has made a contribution (say by making a structured task edit), these features will help create a positive feedback cycle that increases newcomer motivation.
Below are the specific hypotheses that we seek to validate across the newcomer population. We will also have hypotheses for each of the three sets of features that the team plans to develop. These hypotheses drive the specifics for what data we will collect and how we will analyse that data.
- The Positive Reinforcement features increase our core metrics of retention and productivity.
- Since the Positive Reinforcement features do not feature a call to action that asks newcomers to make edits, we will see no difference in our activation core metric.
- Newcomers who get the Positive Reinforcement features are able to determine that making un-reverted edits is desirable, and we will see a decrease in the proportion of reverted edits.
- The positive feedback cycle created by the Positive Reinforcement features will lead to a significantly higher proportion of "highly active" newcomers.
- The Positive Reinforcement features increase the number of Daily Active Users of Suggested edits.
- The average number of edit sessions during the newcomer period (first 15 days) increases.
- "Personalized praise" will increase mentors' proactive communication with their mentees, which will lead to an increase in retention and productivity.
As we have done for previous Growth team projects, we want to test our hypotheses through controlled experiments (also called "A/B tests"). This allows us to establish a causal relationship (e.g. "The Leveling up features cause an increase in retention of xx%"), and to detect smaller effects than if we gave the features to everyone and analyzed the effects pre/post deployment.
In this controlled experiment, a randomly selected half of users will get access to Positive Reinforcement features (the "treatment" group), and the other randomly selected half will instead get the current (September 2022) Growth feature experience (the "control" group). In previous experiments, the control group has not gotten access to the Growth features. The team has decided to move away from that (T320876), which means that the current set of features is the new baseline for a control group.
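As a rough illustration of how a deterministic 50/50 split like this can be implemented, here is a minimal sketch. The function name and the hashing scheme are assumptions for illustration only, not the Growth team's actual assignment mechanism (which in practice would also need to handle experiment names, salts, and opt-outs):

```python
# Sketch: deterministic, reproducible 50/50 assignment of users into
# "treatment" and "control" groups. Hypothetical code, not the actual
# Growth team implementation.
import hashlib

def assign_group(user_id: int) -> str:
    """Hash the user ID so assignment is uniform but reproducible:
    the same user always lands in the same group."""
    digest = hashlib.sha256(str(user_id).encode("utf-8")).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"

groups = [assign_group(uid) for uid in range(1000)]
print(groups.count("treatment"), groups.count("control"))  # roughly 500/500
```

Hashing rather than random sampling means no assignment table has to be stored: group membership can be recomputed from the user ID at any time.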
The Personalized Praise feature is focused on mentors. Every wiki has a limited number of mentors, whereas the number of newcomers grows steadily as new users register. While we could run experiments with the mentors, we are likely to run into two key challenges. First, the limited number of mentors could mean that the experiments would need to run for a long time. Second, and more importantly, mentors are well integrated into the community and communicate with each other, meaning they are likely to figure out that some have access to features that others do not. We will therefore give the Personalized Praise features to all mentors and examine activity and effects on newcomers pre/post deployment in order to understand the feature's effectiveness.
In summary, this means we are looking to run two consecutive experiments with the Impact and Leveling up features, followed by a deployment of the Personalized Praise features to all mentors. These experiments will first run on the pilot wikis. We can extend this to additional wikis if we find a need to do that, but it would only happen after we have analyzed the leading indicators and found no concerns.
Each experiment will run for approximately one month, and for each experiment we will have an accompanying set of leading indicators that we will analyze two weeks after deployment. The list below shows what the planned experiments will be:
- Impact: treatment group gets the updated Impact module.
- Leveling up: treatment group gets both the updated Impact module and the Leveling up features.
- Personalized praise: all mentors get the Personalized praise features.
Leading indicators and plan of action
While we believe that the features we develop are not detrimental to the wiki communities, we want to be careful when experimenting with them. It is good practice to define a set of leading indicators, together with plans for what action to take if an indicator suggests something isn't going the way it should. We have done this for all our past experiments and do so again for the experiments we plan to run as part of this project.
| Indicator | Expected result | Plan of action | Results |
| --- | --- | --- | --- |
| Impact module interactions | No difference or increase | If Impact module interactions decrease, this suggests we might have performance or compatibility issues with the new Impact module. If the proportion of newcomers who interact with the new Impact module is significantly lower than with the old module, we will investigate the cause, reverting to the old module if necessary. | Significant decrease |
| Mentor module interactions | No difference | The new Impact module takes up more screen real estate than the old module, which might lead to newcomers not finding the Mentor module as easily as before. If the number of newcomers who interact with the Mentor module is significantly lower for those who get the new Impact module, we will investigate the need for design changes. | No significant difference |
| Mentor module questions | No difference | Similar concerns as for interactions with the Mentor module: if the number of questions asked to mentors is significantly lower for newcomers who get the new Impact module, we will investigate the need for design changes. | No significant difference |
| Edits and revert rate | No difference in both edits and reverts, or an increase in edits and a decrease in revert rate | If there is an increase in the revert rate, this may suggest that newcomers are making unconstructive edits in order to inflate their edit or streak count. If the revert rate of newcomers who get the new Impact module is significantly higher than with the old, we will investigate their edits and decide whether changes are needed. | No significant difference (once outliers are removed) |
Impact module interactions: We find that the proportion of newcomers who interact with the old module (6.1%) is significantly higher than for the new module (5.0%). This difference showed up early in the experiment, and we have examined the data more closely to understand what is happening. One issue we identified early on was that not all interaction events were instrumented, which we subsequently resolved. Examining further, we find that many of those who get the old module click on links to the articles or the pageviews. In the new module, a graph of the pageviews is available, removing some of the need to visit the pageview tool. As a result, we decided that no changes were needed.
Mentor module interactions: We find no significant difference in the proportion of newcomers who interact with the Mentor module: 2.4% for newcomers who get the old module and 2.2% for those who get the new module. A Chi-square test finds this difference not significant.
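The analysis used a chi-square test on the actual event counts. As an equivalent illustration of how such a proportion comparison works, here is a two-proportion z-test in pure Python; the sample sizes below are invented to match the reported 2.4% vs 2.2% rates, so the exact numbers are assumptions, not the real data:

```python
# Sketch: two-sided z-test for the difference between two proportions.
# Hypothetical counts chosen to match the reported 2.4% vs 2.2% rates.
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Return (z, p_value) for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 240/10000 = 2.4% (old module) vs 220/10000 = 2.2% (new module)
z, p = two_proportion_z_test(240, 10000, 220, 10000)
print(round(z, 2), round(p, 3))  # small z, p well above 0.05
```

With counts of this magnitude the difference falls well inside sampling noise, matching the "not significant" conclusion above.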
Mentor module questions: We do not see a substantial difference in the number of questions asked between the old module (269) and the new module (281). The proportion of newcomers who ask their mentor a question is also the same for both groups, at 1.5%.
Edits and revert rate: We do not see a substantial difference in the number of edits nor in the revert rate between the two groups measured on a per-user average basis. There are differences between the groups, but these are driven by some highly prolific editors, particularly on the mobile platform.
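The outlier sensitivity described above can be illustrated with a small sketch: a handful of highly prolific editors can dominate a plain per-user mean, while a trimmed mean largely removes their influence. The edit counts below are invented for illustration:

```python
# Sketch: why a few prolific editors can distort per-user averages.
# The per-user edit counts below are invented for illustration.
from statistics import mean

def trimmed_mean(values, trim_fraction=0.1):
    """Mean after dropping the top and bottom trim_fraction of values."""
    ordered = sorted(values)
    k = int(len(ordered) * trim_fraction)
    return mean(ordered[k:len(ordered) - k] if k else ordered)

control = [1, 2, 1, 3, 2, 1, 2, 1, 2, 500]    # one prolific outlier
treatment = [2, 2, 3, 2, 1, 2, 3, 2, 2, 3]

print(mean(control), mean(treatment))                  # 51.5 vs 2.2
print(trimmed_mean(control), trimmed_mean(treatment))  # 1.75 vs 2.25
```

The raw means suggest a huge group difference driven entirely by one user; after trimming, the groups look nearly identical, which is the kind of effect described above once outliers are removed.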