Program Evaluation

"Schütte & Pöppe Fabrik hauswirtschaftlicher Maschinen Hannover-Linden Rechnung 1909-01-16 Rückseite Detail IIIII" by MCruz (WMF), under CC-BY-SA-3.0

Quantitative versus Qualitative: More friends than enemies

As Wikimedia program leaders and evaluators work together toward more systematic measurement and evaluation strategies, we face a common challenge: gathering both systematic quantitative data and qualitative information.

"Wikimania2014 GrantmakingLearningDay 11" by AWang (WMF), under CC-BY-SA-4.0

Global Metrics for Grants: one way of doing, reporting and learning better

The Wikimedia movement is known for its diversity on many levels: individuals, groups, and organizations, in different contexts, are all invested in achieving the same goal of free knowledge.

"Top page - Evaluation portal" by EGalvez (WMF), under CC-BY-SA-4.0

Evaluation Portal on Meta: A Redesigned Space for Learning

Just over one year ago, the Wikimedia Foundation started talking about evaluating programs like Wiki Loves Monuments and the Wikipedia Education Program. The goal was to grow support for program leaders in evaluating activities and outcomes, leading to learning and to improving the effectiveness of their programs.

"Wikimania 2014 Grantmaking Community Village stall CROP" by MCruz (WMF), under CC-BY-SA-4.0

Grants, Programs and Learning: This year at Wikimania London

Those present at this year’s Wikimania may have noticed a different presence from the Grantmaking team. The department, comprising the Grants, Learning & Evaluation, and Education teams, took part in the global conference that brings together Wikimedia programs, movement leaders, and volunteers to learn from and connect with one another over the five-day event.

"Wikimedians in Residence - Report May 2014" by MCruz (WMF), under CC-BY-SA-4.0

Wikimedians in Residence: a journey of discovery

In April of 2014 I found myself digging deep into analytics in search of possible improvements and insight into what we do as a chapter. What brought me there? One of our most renowned programs, Wikimedians in Residence. [...]
Wikimedia UK has been involved with WiR in the UK with varying degrees of support and supervision. Since the chapter’s creation, we have always felt that the program was worth running, seeing it as one of the key ways we can engage with external organizations. However, I never knew for sure if that was just a feeling. Toward the end of 2013 we decided to explore these notions. …

"DARPA Big Data" by Kayaker, under PD US DARPA

Digging for Data: How to Research Beyond Wikimetrics

For Learning & Evaluation, Wikimetrics is a powerful tool for pulling data about cohorts of wiki users, such as edit counts, pages created, and bytes added or removed. However, you may still have a variety of other questions, for instance: How many members of WikiProject Medicine have edited a medicine-related article in the past three months?
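
For questions like that, one option beyond Wikimetrics is to query the MediaWiki API directly. Below is a minimal sketch, not taken from the original post: the member list is a hypothetical placeholder, and the question is simplified to "did each cohort member edit any article in the past 90 days?" A full answer would also filter edits to medicine-related pages, for example via category lookups, which this sketch omits.

```python
# Sketch: count cohort members with at least one recent article edit,
# using the MediaWiki API's list=usercontribs query.
from datetime import datetime, timedelta, timezone

import requests

API = "https://en.wikipedia.org/w/api.php"
members = ["ExampleUserA", "ExampleUserB"]  # hypothetical cohort of usernames

cutoff = datetime.now(timezone.utc) - timedelta(days=90)
active = 0
for user in members:
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": user,
        "ucend": cutoff.strftime("%Y-%m-%dT%H:%M:%SZ"),  # 90-day boundary
        "ucnamespace": 0,  # article (main) namespace only
        "uclimit": 1,      # one edit is enough to count the user as active
        "format": "json",
    }
    contribs = requests.get(API, params=params).json()["query"]["usercontribs"]
    if contribs:  # non-empty means at least one article edit since the cutoff
        active += 1

print(f"{active} of {len(members)} cohort members edited in the past 90 days")
```

Because usercontribs returns edits newest-first and stops at the ucend timestamp, a single-result query is a cheap activity check; a real analysis would add error handling and batching for large cohorts.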

"RobertFuddBewusstsein17Jh" by Magnus Manske, under CC-PD-Mark

Asking the right questions: Resources for your survey strategies

Surveys are an excellent strategy for measuring a group’s interests, behaviors, and ways of learning, and for gathering other feedback. They are windows into the mind of the movement, giving us insight we otherwise would not have. These insights come from survey questionnaires, which makes asking questions the right way very important.

"WikimediaConference2014 13" by AWang (WMF), under CC-BY-SA-3.0

A Collaborative Definition of Impact: Building Metrics Together

As part of the Wikimedia Conference in Berlin, on Thursday, April 10, members of the WMF Grantmaking department’s Learning and Evaluation team held a brainstorming session on metrics with chapter representatives from around the world. The aim of the session was to start a conversation about what the evaluation metrics piloted in the (beta) Evaluation Reports do and do not tell us about the impact of our current programs.


Beginning to Understand What Works: Measuring the Impact of Wikimedia Programs

Radio Edit-a-thon in Argentina, April 5, 2014.

Across the globe, Wikimedia organizations and volunteers are engaging in online and offline activities to get more editors to contribute to Wikimedia projects. There are expansive efforts to attract new editors and to mobilize existing editors who can contribute diverse and high-quality…


Survey shows interest in evaluation in Wikimedia movement, with room to grow

The Wikimedia Foundation’s Program Evaluation & Design team recently completed a survey about the evaluation of organized activities within the Wikimedia community. Program evaluation allows the Wikimedia community to see whether the programs and projects they run, often to inspire and engage people to participate in the Wikimedia movement,…
