Global Metrics for Grants: one way of doing, reporting and learning better


We want to better understand the work being done by Wikimedia communities all over the world.
“Wikimania2014 GrantmakingLearningDay 11” by AWang (WMF), under CC-BY-SA-4.0

The Wikimedia movement is known for its diversity, on many levels: individuals, groups and organizations, in different contexts, are invested in achieving the same goal of free knowledge. As community members seeking and executing grants have worked with grant committee members and the WMF Grantmaking team, we have reached a point of shared understanding: we need to do better at learning from each other and doing more to demonstrate our impact.
Starting this month, the Grantmaking team is putting into effect a set of Global Metrics that will help us all understand, appreciate and be accountable for some of the work being done by Wikimedia communities worldwide. In particular, we are seeking a shared, aggregate understanding of how successful we are at expanding participation and improving content on our projects. The metrics will take the form of a table template included in the reporting form for future grants, starting with Round 1 of 2014-2015.
These metrics are not meant to replace, but to complement, each grant and grantee’s individual metrics and measures of success, both qualitative and quantitative.

Why Global Metrics and how were they designed?

For the past two years, we have worked with community members to build a funding framework that supports a spectrum of needs, ideas and initiatives from across the movement, from individuals to established organizations. This framework was also supported by a self-evaluation strategy that allowed any community member to build their own metrics and report against their own goals.

A look back: the outcomes of the first batch of FDC grants
“Learning and Evaluation. FDC Impact 2012-14” by Jessie Wild Sneller, under CC-BY-SA-3.0

Over the past year, we have begun reviewing grant progress and impact reports[1], and among many insights, three stand out: people are still finding it difficult to measure their work in clear ways; the larger the grants, the less proportionate the impact seems to be (and one challenge may be reporting); and we are finding it difficult to assess the collective impact of the considerable work supported by these grants in any systematic fashion. In particular, as a movement, we are not yet skillful at offering both the stories and the numbers that describe how our offline work positively impacts our online successes.
After two years of observing the goals and measures of various grant projects, a few core metrics emerged as indicators commonly used by community members in different contexts. These measures, however, were not calculated consistently across projects, which made it difficult to convey outward what we are accomplishing as a movement. Global Metrics, in this sense, provide a shared set of indicators that can be used across projects to report on results. In addition, we did our best to design metrics that can currently be assessed with the support of tools built and used across the movement.
After research and consultation with some grantees and grants committee members, the new Global Metrics focus on participation, content and learning processes:

  • Number of active editors involved.
  • Number of new registered users.
  • Number of individuals involved.
  • Number of new images added to Wikimedia articles/pages.
  • Number of articles added or improved on Wikimedia projects.
  • Number of bytes added to or deleted from Wikimedia projects.
  • Learning question: Did your work increase the motivation of contributors and how do you know?

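To make the quantitative metrics above concrete, here is a minimal Python sketch of how one grantee might tally "bytes added" and "bytes deleted" from a list of edits. It assumes contribution records shaped like those returned by the MediaWiki Action API's `list=usercontribs` query with `ucprop=sizediff`; the helper function and sample data are illustrative, not an official reporting tool.

```python
# Sketch: aggregate per-edit size changes into the "bytes added to"
# and "bytes deleted from" totals that the Global Metrics ask for.
# Assumes each record carries an integer "sizediff" field, as in the
# MediaWiki Action API (action=query&list=usercontribs&ucprop=sizediff).

def tally_bytes(contribs):
    """Return (bytes_added, bytes_deleted) for a list of contributions.

    `contribs` is a list of dicts with a "sizediff" key: positive
    values are bytes added by an edit, negative values bytes removed.
    """
    added = sum(c["sizediff"] for c in contribs if c["sizediff"] > 0)
    deleted = sum(-c["sizediff"] for c in contribs if c["sizediff"] < 0)
    return added, deleted

# Example with made-up edits: +1200, -300 and +50 bytes.
sample = [{"sizediff": 1200}, {"sizediff": -300}, {"sizediff": 50}]
print(tally_bytes(sample))  # (1250, 300)
```

Keeping the tallying logic separate from any API call makes it easy to check the totals by hand before putting them in a report.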
The main challenge these Global Metrics aim to overcome is our limited ability, across Wikimedia projects and programs, to sum up inputs, outputs and outcomes in self-evaluation, and thereby to give us all a more cogent sense of the collective impact of our work. We hope that more cohesive reporting will help us celebrate our successes as a global movement, but also point out where we are not making an appreciable difference. We recognize, however, that numbers are not enough.

Numbers do not tell the full story

We are therefore counting on community members to offer both numbers and stories, since numbers only make sense in context. Second, and critically, Global Metrics are not the only measures of success we will learn from: each grantee will continue to define and assess themselves against measures of success that are critical to them. We don’t expect grant reports to focus only on these seven measures. In fact, some key insights that would significantly improve the effectiveness of our work may not be easily measurable, even though we know and understand their impact: volunteer motivation, for instance.

Presentation from 29 July 2014 on the 2013-14 impact reports of PEG grantees. Covers the outcomes of 36 grants that submitted reports during 2013-14, with key learnings.
“PEG Impact learning series – 2014 July” by Jwild (WMF), under CC-BY-SA-4.0

The Global Metrics are also limited in what they can currently measure. As they stand, they do not directly measure quality, retention, or readership. In addition, they may not offer the right metrics for all types of grants. For instance, an individual engagement grant for research on our wiki projects may not directly produce content or recruit new editors. In this case, the grantee might only be able to report the number of individuals and/or active editors involved.
As we implement these metrics, keeping in mind both the potential and the limitations of Global Metrics will help us learn what is useful and what we still need to improve.

Room to grow, work and be successful together

As we pilot this new set of metrics in the movement, the Grantmaking team will be available to provide consultation and support to grantees. We also encourage everyone involved in reporting to reach out to us to learn more about what each metric means and how to measure it. We have prepared a set of learning patterns, available on the Evaluation portal on Meta, that walk through each of the Global Metrics and explain how to gather the relevant data. We will work with community members during the next few months to further develop these information resources and to create new ones. Please check Grants:Evaluation/News and follow @WikiEval on Twitter for updates. We also encourage all community members to comment, share concerns and ask any questions related to Global Metrics. Do join the conversation on the talk page and reach out to the team at eval [at] wikimedia [dot] org: come talk to us, let’s do better together!
Anasuya Sengupta, Senior Director of Grantmaking, Wikimedia Foundation
María Cruz, Community Coordinator of Program Evaluation & Design, Wikimedia Foundation

Archive notice: This is an archived post from blog.wikimedia.org, which operated under different editorial and content guidelines than Diff.
