Re "26.7% more likely to make a first unreverted article edit", i'm interested how much time elapsed before the edit was considered to be unreverted? Also, at this stage were mentors assigned to the new users?
Thanks, ~~~~
Re "26.7% more likely to make a first unreverted article edit", i'm interested how much time elapsed before the edit was considered to be unreverted? Also, at this stage were mentors assigned to the new users?
Thanks, ~~~~
Hello Zindor, and thank you for your questions.
We use a 48-hour window: an edit is labelled “unreverted” if it has not been reverted within 48 hours of being made.
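If it helps, the labelling rule works roughly like the following sketch (the data model here is illustrative only, not our actual pipeline):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

UNREVERTED_WINDOW = timedelta(hours=48)

@dataclass
class Edit:
    timestamp: datetime  # when the newcomer's first article edit was saved

def label_edit(edit: Edit, revert_timestamps: list, now: datetime) -> str:
    """Return 'pending' while the 48-hour window is still open,
    otherwise 'reverted' or 'unreverted'."""
    if now - edit.timestamp < UNREVERTED_WINDOW:
        return "pending"  # too early to assign a label
    if any(t - edit.timestamp <= UNREVERTED_WINDOW for t in revert_timestamps):
        return "reverted"  # undone within 48 hours of being made
    return "unreverted"
```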
Mentors are assigned to new users immediately when they create an account. However, this is not yet the case at English Wikipedia, as we are still short of mentors to cover every account.
We assign a mentor to every account, as we don't know what each newcomer needs. Based on this research, some new users need confirmation of their right to edit before making their first edit, while others may want more information about editing before they start. As a consequence, we provide a mentor to each new account; it is then up to the newcomer to contact their mentor.
Hope this helps!
Thank you for your detailed reply, Trizek. As a mentor on en-wiki I'm enjoying using the software, and it appears well thought out. While a full roll-out would be great, even if it remains the case that only some new users get assigned a mentor, that's still a big positive.
Thank you Zindor. Our goal is to give a mentor to everyone, so that newcomers will have an equal opportunity to be helped.
@MMiller (WMF), the section on Retention states, quote, "we did not detect any changes".
The graph File:Graph_showing_retention_from_newcomer_tasks_experiment_2020-11-19.png is fictional. The claim of increased retention in the nutshell box and elsewhere fails Verification, not to mention failing to Neutrally reflect the source data.
Hi @Alsee -- thank you for checking out this analysis. I can tell you gave it a close read, and you keyed in on the part that is most challenging for us to explain. It’s been really important to us to communicate openly about the results we are and are not seeing, so that communities can make informed decisions about whether the Growth features are a good fit for their wikis.
You’re right that our analysis did not directly detect a change in retention. Retention is difficult to detect because so few users are retained across the board (which is the problem our team is trying to address, of course). The way we run the analysis asks, “Given the set of users who are activated, do the Growth features increase their retention?” In other words, we’re looking for whether the Growth features do something to increase retention beyond just increasing activation. The analysis finds that the answer is no: users who are activated by the Growth features are retained at about the same rate as other activated users. But since the Growth features cause more users to be activated, and about the same share of those are retained as usual, we believe that the features increase how many users are retained overall. In other words, the features likely increase retention because they increase activation.
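If it helps to see the arithmetic, here's a toy calculation. Only the 26.7% activation lift comes from the report; every other number below is made up purely to illustrate the inference:

```python
# Toy numbers to show the inference, not the experiment's measured values.
newcomers = 10_000
control_activation = 0.30              # share making a first article edit
treatment_activation = 0.30 * 1.267    # ~26.7% relative lift, per the report
retention_given_activation = 0.10      # unchanged by the features

control_retained = newcomers * control_activation * retention_given_activation
treatment_retained = newcomers * treatment_activation * retention_given_activation

print(round(control_retained))    # 300 retained newcomers
print(round(treatment_retained))  # 380: more retained overall, even though
                                  # retention-given-activation is identical
```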
That’s what we’re trying to show in the graph you pointed out, and in the section of the page you cited. We wanted to be clear that this part of the analysis is based on our inference, as opposed to the hard numbers, and so we wrote that we “estimate”, and “we can expect”, and “appear”. The graph is meant to illustrate the way we’re thinking about it. Does my explanation make sense? How do you think we should write it so that readers can understand it the same way we do?
It sounds like you're trying to say people are more likely to edit and that the retention rate is unchanged. It's hard to say what wording is appropriate without seeing the data and what it actually does support. What exactly is the data on retention?
On a related note, I'd like to point out the unanticipated results of an older project with similar goals. The Wikipedia Adventure was intended to help bring in more new editors, helping them make initial edits and giving them badges etc. as motivation / positive reinforcement. While initial results seemed to indicate an increase in editing, once the facilitated edits were discounted it turned out to have a very significant NEGATIVE impact. Users who were offered The Wikipedia Adventure were less likely to go on to make edits, and they made fewer edits. People tried it, they made a few worthless facilitated edits, and they quit.
What does the data show for this project if you filter out edits made by the newcomer-link-app & image-app, and the help/mentor edits made via the newcomer page? Not to mention that help/mentor edits are far more valuable and costly than newbie edits. Was there a genuine increase in new users becoming constructive editors, making useful edits themselves?
Hi @Alsee -- if I can take another crack at it, I think another way to explain our results is like this: we know that the Growth features increase how many people make their first edits (i.e. “activated”). We also know that the Growth features do not cause activated people to be retained any more or less than usual. So for the newcomers who have the Growth features, since more of them have activated, more of them end up being retained. In other words, some portion of those additional newcomers who were activated by the Growth features are retained at the usual rates, thereby increasing how many retained newcomers there are. Now, as you pointed out, we did not detect this directly; rather we believe, according to the best data we have, that this is probably what’s happening.
Thanks for bringing up The Wikipedia Adventure. I think our team learned a lot from the research around it as we designed our features. One of our takeaways was that rather than giving newcomers practice edits, we should try giving them real content edits to do (and learn from). In The Wikipedia Adventure, the edits are all made in the newcomer's User space, but with the Growth team’s features, the edits are all happening to articles. Through the Growth features, the newcomers are making copyedits, adding wikilinks, and sometimes adding content and references. To date, there have been about 80,000 article edits made through the Growth features (with revert rates about the same as for other edits that newcomers make). And the findings on edit volume in the report show that these are edits that newcomers would not have otherwise made on their own (i.e. 22% more edits are happening with the Growth features). That’s why we believe that the data show that the features do help new users become constructive editors. It’s also why we don’t filter out the edits made through the features, because they are real, constructive edits to articles. (We do, though, filter out help/mentor edits; we're only looking at edits to the article namespace).
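For what it's worth, that filtering step looks roughly like this sketch (the edit schema is hypothetical; this is not our actual analysis code):

```python
# Rough sketch of the namespace filter described above.
ARTICLE_NAMESPACE = 0  # MediaWiki's main (article) namespace

def article_edit_stats(edits):
    """Count article-namespace edits and their revert rate; help/mentor
    edits drop out because they live in other namespaces."""
    article = [e for e in edits if e["namespace"] == ARTICLE_NAMESPACE]
    if not article:
        return 0, 0.0
    reverted = sum(1 for e in article if e["was_reverted"])
    return len(article), reverted / len(article)
```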
One other thing I want to point out is that this analysis did not include the “add a link” and “add an image” features, because those aren’t released yet. It only included “classic” tasks, in which the newcomer goes to an article with a maintenance template and finds their own path to editing it. Our hope is that those future structured tasks will lead even more of the newcomers to success. If you have time, it would be great to get your thoughts on those projects on their respective talk pages.
How does this all sound? Does it make sense? Do you think there’s anything else we should be considering?
On December 4, we changed the estimated increase in edit volume from 85.6% to 22%. We made the change after improving the modeling methodology used to estimate that number; the original estimate was too high.