Wikimedia blog

News from the Wikimedia Foundation and about the Wikimedia movement

Posts Tagged ‘program evaluation’

Survey shows interest in evaluation in Wikimedia movement, with room to grow

The Wikimedia Foundation’s Program Evaluation & Design team recently completed a survey about the evaluation of organized activities within the Wikimedia community. Program evaluation lets the Wikimedia community see whether the programs and projects it runs, often to inspire and engage people to participate in the Wikimedia movement, actually work. It is important to find out whether the programs we spend hours of work and much energy on, and may invest money in, can be made more efficient, more effective, and more impactful. Program evaluation allows us to do that, and the Program Evaluation & Design team is here to support the community in doing exactly that.

The survey was completed in August, after being sent out to over 100 program leaders around the world. Its goal was to get a high-level view of how program leaders within the Wikimedia movement have been evaluating programs such as edit-a-thons, workshops, the Wikipedia Education Program, on-wiki contests, Wiki Loves Monuments, WikiExpeditions, other “Wiki Loves” events, and GLAM programs. We wanted to know what type of data was being gathered by those planning and executing such programs across the movement. The results show that people who run programs already track a variety of data points, which is encouraging. We know how busy volunteers and chapter/affiliate staff are, so it’s wonderful to see them fit evaluation into their often already overwhelming workflows. We’re excited to share some results with you and to explain our next steps.

Evaluation Capacity Survey

We had a great response rate: 69 of the 114 invited program leaders completed the survey! Respondents represented 32 Wikimedia chapters, three affiliated clubs and organizations, and eight individual community members. Thank you to everyone who responded! Some of the highlights from the survey include:

  • Half of the respondents reported having received grants from the Wikimedia Foundation.
  • Edit-a-thons and workshops, photo upload competitions, and the Wikipedia Education Program were the programs most frequently organized in the movement in the past year.


Drafting a strategy plan with the community

This post was authored by User:Barcelona from Amical Wikimedia.

Amical member classifying community proposals during Amical’s 2014-2018 strategy plan taskforce meeting

Four of the five task force members at the meeting

At Amical Wikimedia we have started the process of thinking about, and writing down, our strategic plan for the next five years (2014–2018), as well as our 2013–2014 annual activity plan (our activities follow the educational calendar rather than the calendar year). We are trying to apply both the principles and values of our association (shared with the global Wikimedia movement) and the lessons learned at the Program Evaluation workshop held in Budapest in June, which our GLAM ambassador Kippelboy attended. The key words defining this whole process could thus be collaboration, efficiency and self-reflection.

First step: including all the voices

We made a collaborative effort to include all the voices within the association and the community of editors in the main plan. First, we opened a public proposals page to collect ideas from any interested person about the long-term evolution of Amical Wikimedia. Second, we hosted an online chat meeting via IRC to ask our members for their thoughts on the plan’s goals, their questions about the priorities, and their suggestions on which social sectors our activities should address first. Finally, we held a few in-depth interviews with users who are especially committed to Amical.

Second step: task force

Then we created a special task force of five members with different profiles to begin the actual drafting of the plan. After several offline and online meetings, and after preparing some preliminary working documents, we held an all-day session, hosted by the University of Girona, where the task force members met and applied the methodology learned in Budapest: working as a community and implementing evaluation strategies in our programme activities. We shared and evaluated our thoughts and proposals before starting to write the final document.

Third step: internal evaluation

Our strategy plan is now ready and is being shared on our internal wiki, so all Amical members can add to, remove, change, discuss or challenge its assumptions. The final result, which will be published in September, will surely reflect the collaborative spirit of our community and all the reflection on what we should become.

Outcomes

This graphic shows the relationship between our key words and paths

Although the process has not finished yet, we can already present some partial results: Amical’s strategy schema and the main intended work tracks. You can see the intersection of the goals and actions of our plan in this graphic (still under construction, as is the whole wiki world). At the top are the five key words that summarize the association’s focus for the coming years:

  • Cohesion: We want to keep the community bonds and strong personal ties that make it possible to work together.
  • Discourse: We would like to spread the word about what we do and think, and to take part in the global debate.
  • Content: Obviously, the central point is to help add content to the Wikimedia projects; that should be the main goal for all chapters and associations.
  • Territory: We want to be active in all the Catalan-speaking countries.
  • Readers: We want to explain the world in Catalan and encourage people to read it.

Below are the three paths to attain these goals: Activities (the projects programmed for the coming years), Internal (the sometimes invisible but crucial work of keeping the community alive and growing) and External (how we relate to other users, Wikimedia groups and the non-wiki world).

As for the Activity Plan, several initiatives were suggested by the community: continue the work with museums, extend our presence at universities and schools, promote Wikimedia sister projects, prepare a multilingual contest on Wikipedia focused on Catalan culture and history, increase the number of libraries that we are already collaborating with, explore new possibilities to attract editors… We are really excited to see what comes next, because surely at Amical we have plenty of work to do!

User:Barcelona, Amical Wikimedia

Improving program performance: first evaluation workshop in Budapest

Participants from 15 countries attended the first Program Evaluation & Design Workshop

In the Wikimedia movement, there are many organized activities seeking to contribute to the Wikimedia vision and strategic goals. But how do you determine which of these programs work and which don’t? And how can you further improve the performance of programs? To tackle these difficult questions, 26 international participants came together in June 2013 for the first Program Evaluation & Design Workshop in Budapest, Hungary. The event was held by the Wikimedia Foundation, in partnership with Wikimedia Magyarország, the local chapter.

With record high temperatures in Budapest, participants kept cool in the heart of the city, engaging in an intensive, two-day workshop that presented the basics of Program Evaluation. The workshop focused on creating a shared understanding of what program evaluation is, why it is important, and providing attendees with some basic skills and a logic modeling tool for mapping out their programs in order for them to begin incorporating Program Evaluation into their program work.

The workshop brought together 21 Wikimedians from 15 countries. The participants, all with a track record of program work, represented five different program types.

Topics of the workshop

Day one opened with a welcome by Frank Schulenburg, Senior Director of Programs at the Wikimedia Foundation, and a long-time Wikipedian. He gave a brief background on why Wikimedia is investing in Program Evaluation and what it is. Schulenburg stressed three points about the current evaluation initiative:

  • self-evaluation: program leaders evaluate their own programs
  • collaborative: we’re all in this together and we will learn together
  • focused on capacity building: our goal is to equip program leaders in the movement with the necessary skills to use program evaluation and design practices

Dr. Jaime Anstee, Program Evaluation & Design Specialist for the Wikimedia Foundation, then led the group through the basics of Program Evaluation: the different types of evaluation and the roles of everyone involved in it. Anstee also stressed that the current evaluation initiative aims to be empowering and participatory while maintaining a focus on utilization. The morning ended with a visioning exercise on the positive and negative outcomes the movement could experience with Program Evaluation, and with lightning talks by the participants about the programs they have run.


Call for participants: Program Evaluation and Design workshop in Budapest

Over the next couple of years, the Wikimedia Foundation will be building capacity among program leaders around evaluation and program design. A better understanding of how to increase impact through better planning, execution and evaluation of programs and activities will help us move a step closer to achieving our mission of offering a free, high-quality encyclopedia to our readers around the world.

With this in mind, we are pleased to announce the first Program Evaluation and Design Workshop, on 22-23 June 2013 in Budapest, Hungary.

We have only 20 slots available for this workshop, and the application deadline is May 17. This two-day event will be followed by a pre-conference workshop at Wikimania 2013. Ideally, applicants would commit to attending both events.

The first Program Evaluation & Design workshop will be held in the shadow of Buda Castle in Budapest, Hungary

Our long-term goals for the workshop are:

  • Participants will gain a basic shared understanding of program evaluation
  • Participants will work collaboratively to map and prioritize measurable outcomes, beginning with a focus on the most common programs and activities
  • Participants will gain increased fluency in the common language of evaluation (e.g. goals versus objectives, inputs and outputs versus outcomes and impact)
  • Participants will learn and practice how to extract and report data using the UserMetrics API (a rough sketch of this kind of data extraction follows this list)
  • Participants will commit to working as a community of evaluation leaders who will implement evaluation strategies in their programs and activities and report back at the pre-conference workshop at Wikimania 2013
  • …and participants will have a lot of fun and enjoy networking with other program leaders!
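
To give a rough flavor of what “extracting and reporting data” can look like in practice, here is a minimal Python sketch. It does not use the UserMetrics API named above; instead it queries the general-purpose, public MediaWiki API (action=query, list=usercontribs) to count a user’s edits since a given date. The username and cutoff date are hypothetical placeholders.

```python
# Illustrative sketch only: count a user's edits since a given date via the
# public MediaWiki API. This is NOT the UserMetrics API mentioned above;
# the username and cutoff date are hypothetical placeholders.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def edits_since(username, iso_timestamp):
    """Return up to 500 contributions the user has made since the given timestamp."""
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": username,
        "ucend": iso_timestamp,  # results run newest-to-oldest, so ucend bounds the older side
        "uclimit": 500,
        "format": "json",
    }
    response = requests.get(API_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["query"]["usercontribs"]

if __name__ == "__main__":
    contribs = edits_since("ExampleUser", "2013-06-22T00:00:00Z")
    print(f"Edits since the workshop: {len(contribs)}")
```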

We will publish a detailed agenda for the event in Budapest soon on Meta-Wiki.

During the workshop in Budapest, we will only have a limited amount of time. Therefore, we will focus on some of the more common programs and activities:

  • Wikipedia editing workshops where participants learn how to edit or actively edit (e.g. edit-a-thons, wikiparties, hands-on Wikipedia workshops)
  • Content donations through partnerships with galleries, libraries, archives and museums (GLAMs) and related organizations
  • Wiki Takes/Expeditions, where volunteers participate in day-long or weekend events to photograph site-specific content
  • Wiki Loves Monuments, which takes place in September
  • Education program and classroom editing where volunteers support educators who have students editing Wikipedia in the classroom
  • Writing competitions, which generally take place online in the form of contests such as the WikiCup and other challenges, often engaging experienced editors to improve content.

Contributors in the Wikimedia community who play an active role in planning and executing programs and activities like those described above are highly encouraged to apply. Your experience and knowledge will make this workshop a success!

Hotels, flights and other transportation costs will be the responsibility of your chapter; the Wikimedia Foundation will provide the venue, handouts, breakfasts, light lunches, and a dinner for all participants on Saturday. If you’re not affiliated with a chapter and cannot afford to attend the event, please email me after you apply – we have a small amount of money set aside for those cases.

Remember, applications are open until May 17. You can apply via this Google Form.

Thanks for your interest, and I look forward to a great group of participants!

Sarah Stierch, Program Evaluation and Design Community Coordinator, Wikimedia Foundation

Let’s start talking about program evaluation

Most Wikipedians I know – myself included – care deeply about our mission. We are highly passionate about our vision to provide free knowledge to every single person on the planet. And many of us invest an incredible amount of our free time into making this vision come true. Even though I have been editing Wikipedia since 2005, I’m still amazed when I look at the daily stream of Wikipedia’s recent changes. And I have a deep respect for the volunteers who invest their precious time and energy into never letting this stream of small and big improvements run dry.

For many years now, Wikipedians have not only worked on increasing and improving the free content available on the web. We have also launched a wide variety of programmatic activities intended to strengthen free knowledge: by raising public awareness of Wikipedia through exhibitions and presentations, by recruiting new editors at Wikipedia workshops, and by starting programs like “Wiki Loves Monuments”, the popular Wikimedia Commons photo competition.

We have not, however, been very strong at measuring the impact of those programmatic activities. “Measuring impact” in this case refers to quantifying the long-term effects those programmatic activities have on Wikipedia and its sister projects. In practice, this means, for example, analyzing how many of the people who attended a specific Wikipedia workshop actually turned into Wikipedians. How many of them embarked on editing articles as a result of the workshop and how many of them continued doing so on a regular basis?
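
To make this concrete, here is a minimal Python sketch of the kind of retention calculation described above. It is not part of the original post; the attendee names, edit dates, and the 90-day threshold used here to stand in for “continuing on a regular basis” are hypothetical placeholders.

```python
# Minimal retention sketch with hypothetical data: for each workshop attendee,
# did they edit at all after the workshop, and were they still editing 90+ days later?
from datetime import date

WORKSHOP_DATE = date(2013, 6, 22)   # hypothetical workshop date
FOLLOW_UP_DAYS = 90                 # hypothetical threshold for "continued editing"

# Hypothetical edit dates per attendee (in practice, taken from contribution logs).
attendee_edits = {
    "ExampleUserA": [date(2013, 6, 22), date(2013, 7, 1), date(2013, 10, 5)],
    "ExampleUserB": [date(2013, 6, 22)],
    "ExampleUserC": [],
}

started = [user for user, edits in attendee_edits.items()
           if any(d > WORKSHOP_DATE for d in edits)]
retained = [user for user, edits in attendee_edits.items()
            if any((d - WORKSHOP_DATE).days >= FOLLOW_UP_DAYS for d in edits)]

print(f"Attendees: {len(attendee_edits)}")
print(f"Edited after the workshop: {len(started)}")
print(f"Still editing {FOLLOW_UP_DAYS}+ days later: {len(retained)}")
```

In a real evaluation, the edit dates would come from the projects’ contribution logs and the thresholds would be agreed on by the program leaders before the activity starts.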

Here’s where program evaluation comes into play.

  • If you’re supporting programmatic work as a volunteer, you most likely want to know whether your activities are worth the effort. Whether you help run photo competitions or sign up as a speaker at workshops for Wikipedia beginners, you want to know whether you’re making a real difference with what you’re doing. Program evaluation will help you answer this question. By measuring the outcomes of the programmatic activity you signed up for, you will know whether your photo competition was successful and whether the people who participated in your workshop really learned the skills they need to write great articles on Wikipedia. This knowledge will make your work as a volunteer more fulfilling.
  • If you’re helping to improve existing programs, you’re most likely eager to find out which changes will make your program more effective and efficient. Imagine you could achieve the same result with fewer volunteer hours being spent. And what if you could double the number of people who actually start contributing to Wikipedia after your workshop by making it more engaging and fun? Improving an existing program requires that you measure its effectiveness. Here’s where integrating evaluation into your program design can make a difference.
  • If you’re thinking about starting a new program, you will want to have some evidence that your new program is working. How else would you convince others to participate in your program? And how else would you convince a funder to provide you with a grant so you can eventually execute your program and grow it over time? Program evaluation will help you to make a strong case for your idea. And it will also prevent you from embarking on activities that have no impact.
  • If you’re serving on the board of a Wikimedia chapter or a thematic organization, you might want to know which kinds of programmatic activities produce the “biggest bang for the buck”. You might ask whether it makes more sense to put money and effort into in-person workshops or to spend resources on creating an online training. How many hours of volunteer or staff time will you need to produce a specific result? Are the in-person workshops going to be more effective than the online training? Which of the two options will be more efficient? And which one is going to have the bigger long-term impact? In this sense, program evaluation can be a decision-making tool that helps you determine which programmatic activities to embark on.

Finally, with the Funds Dissemination Committee (FDC) in place since last year, there is another reason why program evaluation will be more important than ever: after the first round of funding for 2012/2013, the FDC requested more information about program impact so that it has a better foundation for making recommendations on what to fund in the future. From now on, funding decisions will rely heavily on the ability of grantees to demonstrate the impact of their programmatic activities. Grantees who plan to apply for movement funds through the FDC process will therefore have to start thinking about program evaluation.

I’ve started a series of documents on Meta (“Program evaluation basics”) aimed at providing program leaders with an introduction to the main terms and concepts. Three documents are currently available.

I invite you to take a look at the documents and to share your thoughts with me. I will also be available for an IRC office hour on Thursday, March 21, at 17:00 UTC. Let’s start talking about program evaluation…

Frank Schulenburg
Senior Director of Programs, Wikimedia Foundation