CSCW ’14 retrospective; the impact of SOPA on deletionism; like-minded editors clustered; Wikipedia stylistic norms as a model for academic writing
- CSCW ’14 retrospective
- Clustering Wikipedia editors by their biases
- Monthly research showcase launched
- Study of AfD debates: Did the SOPA protests mellow deletionists?
- Word frequency analysis identifies “four conceptualisations of femininity on Wikipedia”
- Wikipedia and the development of academic language
- Briefly
- References
CSCW ’14 retrospective
The 17th ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW ’14) took place this month in Baltimore, Maryland.[supp 1] The conference brought together more than 500 researchers and practitioners from industry and academia to present research on “the design and use of technologies that affect groups, organizations, communities, and networks.” Research on Wikipedia and wiki-based collaboration has long been a major focus of CSCW. This year, three papers on Wikipedia were presented:
[Image: slides from “Editing beyond articles”]
The rise of alt.projects in Wikipedia
Jonathan Morgan of the Wikimedia Foundation and collaborators from the University of Washington analyzed the nature of collaboration in alternative WikiProjects, i.e. projects that the authors identify as not following “the conventional pattern of coordinating a loosely defined range of article creation and curation-related activities within a well defined topic area”; examples include the Guild of Copy Editors and WikiProject Dispute Resolution. The authors analyze the editing activity of members of these projects, much of which is not focused on editing topic content. The paper also reports data on the number of contributors involved in WikiProjects over time: while the number of editors participating in conventional projects decreased by 51% between 2007 and 2012, participation in alternative projects declined by only 13% over the same period, and alternative projects saw an overall 57% increase in the raw number of contributions.
Categorizing barnstars via Mechanical Turk
Paul André and collaborators from Carnegie Mellon University presented a study of how to effectively crowdsource a complex categorization task by assigning it to workers with no prior knowledge or domain expertise. The authors selected a corpus of Wikipedia barnstars and showed how different task designs lead Mechanical Turk workers to produce judgments that accurately match expert categorization. The expert categorization was obtained by recruiting two Wikipedians with substantial editing activity as independent raters.
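A common baseline for aggregating redundant crowd judgments of this kind is simple majority voting across workers, scored against the expert labels. The sketch below is illustrative only and is not the aggregation scheme from the paper; the data layout (a flat list of per-worker judgments) is a hypothetical assumption.

```python
from collections import Counter, defaultdict

def majority_vote(labels):
    """Aggregate redundant crowd labels by majority vote.

    labels: iterable of (item_id, category) pairs, one per worker judgment.
    Returns {item_id: winning_category}.
    """
    votes = defaultdict(Counter)
    for item, category in labels:
        votes[item][category] += 1
    return {item: counts.most_common(1)[0][0] for item, counts in votes.items()}

def accuracy_against_experts(crowd, expert):
    """Fraction of items where the aggregated crowd label matches the expert label."""
    shared = set(crowd) & set(expert)
    return sum(crowd[i] == expert[i] for i in shared) / len(shared)
```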
Understanding donor behavior through email
A team of researchers from Yahoo! Research, the Qatar Computing Research Institute and UC Berkeley analyzed two months of anonymized email logs to understand the demographics, personal interests and donation behavior of individuals responding to different fundraising campaigns. The data includes donation email from the Wikimedia Foundation and indicates that, compared with other campaigns, email from the wikimedia.org domain had the highest ratio of messages tagged as spam to total messages read, which the authors attribute to spoofing. The paper also indicates that the Wikimedia fundraiser tends to attract slightly more male than female donors.
Clustering Wikipedia editors by their biases
Review by User:Maximilianklein
Building on earlier streams of research that rate editors by content persistence and algorithmically find cliques of editors, Nakamura, Suzuki and Ishikawa propose a sophisticated tweak to identify like-minded and disparate-minded editors, and test it on the Japanese Wikipedia. The method builds a weighted graph over all editors of an article, weights each edge by the agreement or disagreement between the two editors, and then finds cliques in that graph. To measure the agreement between two editors, they iterate through the full edit history and apply the content-persistence axioms: leaving another editor's text unchanged is interpreted as agreement, and deleting it as disagreement. To address the fact that leaving text unchanged is not always a strong indication of agreement, they normalize each action by its frequency for both the source editor and the target editor. That is, the method accounts for an editor's propensity to change text, and for an editor's propensity to have their text changed.
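A minimal sketch of this agreement-weighting idea might look as follows, assuming an edit history reduced to (reviser, author, action) interactions; the data layout and the exact normalization are illustrative assumptions, not the paper's formulas.

```python
from collections import defaultdict

def agreement_scores(interactions):
    """Compute a signed agreement score for each ordered editor pair.

    interactions: iterable of (reviser, author, action) tuples, where
    action is 'keep' (the reviser left the author's text unchanged) or
    'delete' (the reviser removed it).
    """
    interactions = list(interactions)
    by_reviser = defaultdict(int)   # how often a reviser performs each action
    on_author = defaultdict(int)    # how often an author's text receives it
    for reviser, author, action in interactions:
        by_reviser[(reviser, action)] += 1
        on_author[(author, action)] += 1

    scores = defaultdict(float)
    for reviser, author, action in interactions:
        sign = 1.0 if action == 'keep' else -1.0
        # Each action is discounted by how often the reviser performs it and
        # how often the author's text receives it, so that habitual 'keeps'
        # carry less weight as a signal of agreement.
        weight = 1.0 / (by_reviser[(reviser, action)] * on_author[(author, action)])
        scores[(reviser, author)] += sign * weight
    return scores
```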
To verify the method, the authors compare its results to a simplified weighting scheme, to random clustering, and to human-produced clusterings of the editors of 7 articles on the Japanese Wikipedia. On 6 of the 7 articles, the proposed technique beats the simplified weighting. As an example, they present their detection of pro- and anti-nuclear editors on the Nuclear Power Plant article. One implication of such detection would be a gadget that colors the text of an article depending on which editor group wrote it.
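Given pairwise scores like those above, groups of like-minded editors can be approximated by listing cliques in the graph of positive-agreement edges. The sketch below uses networkx maximal cliques as a stand-in for the paper's clustering procedure; folding the two directions of each pair into a symmetric score and thresholding at zero are simplifying assumptions.

```python
from collections import defaultdict
import networkx as nx

def like_minded_groups(scores, threshold=0.0):
    """Approximate clusters of like-minded editors from signed pair scores.

    scores: {(reviser, author): signed agreement score}, e.g. as returned
    by agreement_scores() above.
    """
    # Fold the two directions of each editor pair into one symmetric score.
    symmetric = defaultdict(float)
    for (a, b), w in scores.items():
        if a != b:
            symmetric[frozenset((a, b))] += w

    # Keep only positive-agreement edges and list maximal cliques.
    G = nx.Graph()
    for pair, w in symmetric.items():
        if w > threshold:
            a, b = tuple(pair)
            G.add_edge(a, b, weight=w)
    return [clique for clique in nx.find_cliques(G) if len(clique) > 1]
```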
Monthly research showcase launched
The Wikimedia Foundation’s Research & Data team announced its first public showcase, a monthly review of work conducted by researchers at the Foundation. Aaron Halfaker presented a study of trends in newcomer article creation across 10 language editions, with a focus on the English and German Wikipedias (slides). The study indicates that in wikis where anonymous users can create articles, their articles are less likely to be deleted than articles created by newly registered editors. Oliver Keyes presented an analysis of how readers access Wikipedia on mobile devices and reviewed methods for identifying the typical duration of a mobile browsing session (slides). The showcase is hosted at the Wikimedia Foundation on the third Wednesday of every month and live-streamed on YouTube.[supp 2]
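One common way to identify browsing sessions from request logs, and a plausible reading of the problem Keyes describes, is to split a reader's request timestamps wherever the gap between consecutive requests exceeds an inactivity threshold. The sketch below assumes that approach; the 30-minute cutoff is a conventional default, not a figure from the talk.

```python
from datetime import timedelta

def sessionize(timestamps, timeout=timedelta(minutes=30)):
    """Split one reader's request timestamps into sessions.

    timestamps: iterable of datetime objects for a single reader.
    A new session starts whenever the gap between consecutive
    requests exceeds `timeout`.
    """
    ordered = sorted(timestamps)
    sessions, current = [], []
    for t in ordered:
        if current and t - current[-1] > timeout:
            sessions.append(current)
            current = []
        current.append(t)
    if current:
        sessions.append(current)
    return sessions

def session_durations(sessions):
    """Duration of each session, measured from first request to last."""
    return [s[-1] - s[0] for s in sessions]
```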