Is this thing on? Giving new Wikipedians feedback post-edit


Figure 1. One of the messages used in the test (confirmation).

We recently tested a simple change to the user interface for registered Wikipedia editors. We’re happy to report results from a trial of post-edit feedback that led to an increase in the productivity of newcomers to the project, while still maintaining quality.

The problem

The user experience problem was fairly straightforward: Wikipedia fails to tell its new contributors that once they edit an article, their change is live and can immediately be seen by every single reader. Simple, consistent feedback to new contributors makes good sense from a usability standpoint. There is also evidence from the scholarly literature that delivering feedback after successful contributions can help newcomers feel motivated to continue participating.

Our first test of a solution

In this test, we examined the effect of a simple confirmation message or a thank-you message on new English Wikipedia editors who registered between July 30 and August 6. We randomly assigned newcomers to one of these two conditions, or to a control group, and we consistently delivered the same feedback message (or none, for the control group) after every edit during their first week of activity after registration.
The results indicate that receiving feedback upon completing an edit has a positive effect on the volume of contributions by new editors, without any significant side effects on the quality of their work or on whether it was kept in the encyclopedia.
We focused our analysis on a sample of 8,571 new users with at least one edit during the test period, excluding, to the best of our knowledge, sockpuppets and other categories of spurious accounts. We measured the effect of feedback on the volume of contributions by analyzing the number of edits and the edit size per participant in each group; we measured the impact on quality by looking at the rate of reverts and blocks per participant in each group.
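As a rough illustration of how such per-participant metrics could be derived, here is a minimal sketch in Python/pandas. It is not our actual analysis pipeline, and the table layout and column names (user_id, condition, bytes_added, bytes_removed, was_reverted, user_blocked) are assumptions made purely for the example.

```python
# Minimal sketch (not the actual analysis pipeline): aggregate per-participant
# metrics from a hypothetical table with one row per edit. All column names
# below are assumptions for illustration only.
import pandas as pd

edits = pd.read_csv("edits.csv")  # hypothetical export of the test data

per_user = edits.groupby(["user_id", "condition"]).agg(
    edit_count=("user_id", "size"),          # number of edits per participant
    bytes_added=("bytes_added", "sum"),      # positive byte count
    bytes_removed=("bytes_removed", "sum"),  # negative byte count
    revert_rate=("was_reverted", "mean"),    # share of edits reverted
    blocked=("user_blocked", "max"),         # True if the user was ever blocked
).reset_index()

# Mean edit count per condition (confirmation, gratitude, control)
print(per_user.groupby("condition")["edit_count"].mean())
```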

Impact on edit volume

Figure 2. Log-scale box plots of edit counts of new users presented with the confirmation message (left), no message (control group, center) or the gratitude message (right) after saving an edit.

We compared the edit count of contributors by condition over their first two weeks of activity and found an increase of about 23.5% in mean edit count in the two experimental conditions compared to the control. The difference was marginally significant in the confirmation condition and very close to significance (p=0.052) in the gratitude condition.
We also analyzed the size of contributions by editors in each condition, measuring edit size as bytes added, bytes removed, or net bytes changed. The results indicate that both experimental conditions significantly outperformed the control in net bytes changed per edit. The confirmation condition also significantly outperformed the control in bytes added per edit, while the effect for gratitude was only marginally significant. No significant difference was observed in bytes removed per edit (content removal). Receiving feedback therefore also affected the size of new editors’ contributions, particularly the amount of content they added, compared to the control condition.
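For readers who want a concrete picture of the kind of comparison involved, here is an illustrative sketch that continues the hypothetical per_user table from the earlier example. The post does not say which statistical test was used; the Mann-Whitney U test below is simply a common choice for heavily skewed per-user edit counts, not necessarily the one we ran.

```python
# Illustrative only: compare edit counts in one treatment condition against the
# control, building on the hypothetical per_user table from the earlier sketch.
# The Mann-Whitney U test is an assumption, chosen because per-user edit counts
# are heavily skewed; it is not necessarily the test used in the study.
from scipy.stats import mannwhitneyu

confirmation = per_user.loc[per_user["condition"] == "confirmation", "edit_count"]
control = per_user.loc[per_user["condition"] == "control", "edit_count"]

stat, p_value = mannwhitneyu(confirmation, control, alternative="two-sided")
lift = confirmation.mean() / control.mean() - 1

print(f"mean edit count lift vs. control: {lift:.1%}, p = {p_value:.3f}")
```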

See our edit volume analysis for more details.

 

Impact on quality

Figure 3. Mean success rate for edits by new users in each condition: control group (left), confirmation message (center), gratitude message (right).

While feedback may increase the volume of newcomer edits, it might do so at the cost of decreased quality. This is a concern because increasing the number of edits that need to be reverted places a burden on current Wikipedians. To address this concern, we measured the proportion of newcomers who were eventually blocked from editing and the rate at which their contributions were rejected (reverted or deleted).
Analyzing the proportion of newcomers who were blocked after the beginning of the treatment, we found that the experimental treatment had no meaningful effect on the rate at which newcomers were blocked from editing: at about 7% for each group, the differences were not large enough to be statistically significant given the sample size.
We also examined the “success rate” for each user, measured as the proportion of their edits that were not reverted or deleted in the first week after registration. We calculated the mean success rate per newcomer for each experimental condition and found no significant difference between either of the experimental conditions and the control (figure 3).
These results suggest that the experimental treatment had no meaningful effect on the overall quality of newcomer contributions, and therefore no meaningful effect on the burden imposed on existing Wikipedians.
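To make the two quality metrics concrete, here is a sketch that again builds on the hypothetical edits and per_user tables from the earlier examples; the was_reverted, was_deleted and user_blocked flags are assumed columns, not our actual schema.

```python
# Sketch of the two quality metrics described above, using the hypothetical
# tables from the earlier examples. Column names are assumptions.

# Success rate: share of a user's first-week edits neither reverted nor deleted.
edits["successful"] = ~(edits["was_reverted"].astype(bool)
                        | edits["was_deleted"].astype(bool))
success_rate = (
    edits.groupby(["user_id", "condition"])["successful"]
    .mean()
    .rename("success_rate")
)

# Mean success rate per condition (as in figure 3) and share of users blocked.
print(success_rate.groupby("condition").mean())
print(per_user.groupby("condition")["blocked"].mean())
```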

See our newbie quality analysis for more details.

 

What’s next

The results of this first test were promising, and we’re now working to implement an edit confirmation message for new contributors in the current editing interface, as well as in the upcoming visual editor. However, confirmation and gratitude messages are just two of the many types of feedback that could motivate new contributors.
We’re currently testing the impact of letting people know when they reach milestones based on their cumulative edit count. Some Wikipedias already have community-created service awards based on edit count and tenure, so we’re extending these awards to a newer class of contributors by letting them know when they’ve completed their 1st, 5th, 10th, 25th, 50th, and 100th edits to the encyclopedia.
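For illustration, the logic behind such a milestone message could be as simple as the sketch below. The notify() helper is a placeholder for whatever delivery mechanism the feature ends up using, not an existing MediaWiki API.

```python
# Toy sketch of the milestone idea: congratulate a user when their cumulative
# edit count reaches one of the thresholds. notify() is a placeholder for the
# example, not a real MediaWiki API.
MILESTONES = {1, 5, 10, 25, 50, 100}

def notify(user_id: int, message: str) -> None:
    # Placeholder delivery mechanism for the example.
    print(f"[to user {user_id}] {message}")

def after_successful_edit(user_id: int, cumulative_edit_count: int) -> None:
    if cumulative_edit_count in MILESTONES:
        notify(user_id, f"Congratulations on completing "
                        f"{cumulative_edit_count} edit(s) to the encyclopedia!")
```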
If you’re interested in participating in the research and analysis process for tests like these, please chime in and give us your feedback. We’ll be publishing open-licensed data for these experiments, when possible, on our open data repository.

Steven Walling, Associate Product Manager
Dario Taraborelli, Senior Research Analyst
on behalf of the Editor Engagement Experiments team

Archive notice: This is an archived post from blog.wikimedia.org, which operated under different editorial and content guidelines than Diff.
