This action research project seeks to support progressive social change organisations to evaluate the impacts and outcomes of their efforts. Compared with other sectors and types of organisation, NGOs working toward social and environmental justice commit fewer resources to evaluating their programs and campaigns. Research undertaken by Change Agency associates suggests that many progressive NGOs rarely apply evaluation frameworks. The reasons include lack of time, limited access to appropriate tools and frameworks, and the highly ambiguous and often irrational nature of the policymaking process. Recently, a small but growing number of organisations, especially philanthropic bodies that fund advocacy work, have begun to explore frameworks for evaluating advocacy.
Project update November 2007: evaluating advocacy campaigns – What is the link between our advocacy activities and policy outcomes? Is it possible to establish a causal relationship between a vote in Parliament and campaign tactics? In mid-2007, two of the Change Agency team initiated the advocacy evaluation project to:
- Collect and synthesise resources and literature on campaign evaluation, including case studies
- Identify evaluation frameworks that are simple, powerful and road-tested
- Develop a suite of resources for participatory workshops that support activists to evaluate their own campaigns
- Trial and revise resources
- Identify opportunities for ongoing action research
project update August 2008: evaluating advocacy campaigns – Our advocacy evaluation action research project has been powering along this month. In the longer term, we plan to develop, adapt and share a range of resources for use in community campaigns. Campaign evaluation can turbo-charge how, and how much, we learn. It’s a wise investment of time and energy… not just at the end of a campaign, but at the beginning and along the way.
During July, Justin completed a review of the literature on advocacy evaluation, which has been shared internationally and will soon be published as a Comm-org working paper. You can download it from our articles and papers page.
We also helped develop an evaluation framework for Climate Camp. The online questionnaire was completed by 90 activists who’d taken part in the week-long protest camp. The evaluation highlighted some important outcomes, including a strong match between hoped-for and actual outcomes; a significant increase in the likelihood that participants will take part in direct action as a result of their experiences in Newcastle; and very strong support for another national climate change convergence.
Over the coming months we’ll be trialling a range of evaluation tools with campaigning organisations. If your organisation is developing and applying evaluation tools in its campaigns, we’d love to hear from you.
online evaluation resources
Justin Whelan reviews the emerging literature in this field, noting the points of convergence and divergence and suggesting some limitations of the frameworks and opportunities for effective evaluation that meets the needs of interest groups.
Justin Whelan (2006). Extracts from ‘Work Justice’, a case study of the campaign by Uniting Justice to influence Industrial Relations legislation in Australia.
Much like the discipline of evaluation itself, the evaluation function in philanthropy with staff assigned to evaluation-related responsibilities is a fairly recent phenomenon…