Program evaluation

<a href="https://commons.wikimedia.org/wiki/File:DARPA_Big_Data.jpg">"DARPA Big Data"</a> by <a href="https://commons.wikimedia.org/wiki/User:Kayaker">Kayaker</a>, under PD US DARPA

Digging for Data: How to Research Beyond Wikimetrics

For Learning & Evaluation, Wikimetrics is a powerful tool for pulling data about wiki project user cohorts, such as edit counts, pages created, and bytes added or removed. However, you may still have a variety of other questions, for instance: How many members of WikiProject Medicine have edited a medicine-related article in the past three months?

"RobertFuddBewusstsein17Jh" by Magnus Manske, under CC-PD-Mark

Asking the right questions: Resources for your survey strategies

Surveys are an excellent strategy for measuring a group’s interests, behaviors, and learning methods, and for gathering other feedback. They are windows into the mind of the movement, giving us insight we otherwise would not have. These insights are gathered through survey questionnaires, where asking questions the right way becomes very important.

"WikimediaConference2014 13" by AWang (WMF), under CC-BY-SA-3.0

A Collaborative Definition of Impact: Building Metrics Together

As part of the Wikimedia Conference in Berlin, on Thursday, April 10, members of the WMF Grantmaking department’s Learning and Evaluation team held a brainstorming session around metrics with chapter representatives from around the world. The aim of the session was to start a conversation around what the evaluation metrics piloted in the (beta) Evaluation Reports tell us about our current programs and what they do not tell us, in terms of program impact.


Beginning to Understand What Works: Measuring the Impact of Wikimedia Programs

Radio edit-a-thon in Argentina, April 5, 2014.

Across the globe, Wikimedia organizations and volunteers are engaging in online and offline activities to get more editors to contribute to Wikimedia projects. There are expansive efforts to attract new editors and to mobilize existing editors who can contribute diverse and high-quality…


Survey shows interest in evaluation in Wikimedia movement, with room to grow

The Wikimedia Foundation’s Program Evaluation & Design team recently completed a survey about the evaluation of organized activities within the Wikimedia community. Program evaluation allows the Wikimedia community to see if the programs and projects they are doing, often to inspire and engage people to participate in the Wikimedia movement,…


Drafting a strategy plan with the community

This post was authored by User:Barcelona from Amical Wikimedia.

Amical member classifying community proposals during Amical’s 2014–2018 strategy plan taskforce meeting. Four of the five members of the taskforce meeting.

At Amical Wikimedia we have started the process of thinking about, and writing down, our 2014–2018 strategic plan for the…


Improving program performance: first evaluation workshop in Budapest

Participants from 15 countries attended the first Program Evaluation & Design Workshop.

In the Wikimedia movement, there are many organized activities seeking to contribute to the Wikimedia vision and strategic goals. But how do you determine which of these programs work and which don’t? And how can you further improve…


Call for participants: Program Evaluation and Design workshop in Budapest

Over the next couple of years, the Wikimedia Foundation will be building capacity among program leaders around evaluation and program design. A better understanding of how to increase impact through better planning, execution and evaluation of programs and activities will help us to move a step closer to achieving our mission…


Let’s start talking about program evaluation

Most Wikipedians I know – myself included – care deeply about our mission. We are highly passionate about our vision of providing free knowledge to every single person on the planet, and many of us invest an incredible amount of our free time into making this vision come true. Even…
