Our profession's 20-year-and-counting infatuation with impact has now taken a welcome and clearly discernible turn towards performance, learning and improvement,[1] and nothing represents this trend better than the Learning for Social Impact Initiative of McKinsey & Company's Social Sector Office.
The project has produced an impressive set of outputs including a path-breaking report, Learning for Social Impact: What foundations can do, a website, a database of tools and resources for assessing impact (TRASI), and an interactive workbook that foundations can use to design an evaluation plan. All of this is freely available. In fact, in order to ensure that the database would be maintained well and seen as a community resource, McKinsey has given the database to the Foundation Center, which has supplemented it by creating the TRASI online community of impact practitioners – now some 260 strong.
We need to declare our interest here: Keystone tools are included in the TRASI database and our work is featured in the main McKinsey report as a leading exponent of the first of its five best practices: Hear the constituent voice. For that reason, this review will pass lightly over the report and concentrate on the most innovative aspect of the initiative, the interactive online workbook.
This online tool offers a comprehensive 'one-stop' solution to impact assessment design, amply supplemented with definitions of technical terms and, of course, linked to the TRASI annotated database of tools for those who might wish to delve more deeply. It emphasizes learning as the overall purpose, providing solutions for each stage of a foundation's strategy. At the planning phase of a project, the tool identifies learning questions. At the design phase, it identifies preferred assessment method options. At the implementation phase, it describes the main issues and options for implementing the assessment model selected.
It is commendably broad, reflecting what it calls ‘a spirit of inquiry, not of judgment’. It identifies what works and why, and considers unintended consequences (both positive and negative) and environmental influences that enhance or undermine a programme’s success. It asks ‘does the community support it?’ and ‘what do the intended beneficiaries think about it?’ It expects and draws value from failures as well as successes.
Our review of the tool is based on road-testing it on an innovative World Bank project to support citizen engagement in monitoring the delivery of public services. This test enabled us to see how the main features work and to compare them with the actual evaluation strategy that is emerging for that work.
The first step is to clarify your strategic goal, working from a set of 24 objectives defined by the combination of six types of social intervention and four stages of the solution.
We decided that the type of intervention was enabling system development – improving public services – and that the Bank was at the final 'scale and sustain' stage, since the effectiveness of citizen monitoring has been well demonstrated through hundreds of examples around the world.
Once this general objective is defined, the tool guides you through the different aspects of the impact assessment process: design, data collection methods, and data analysis and implementation methods. This is supported by examples that help you determine the feasibility and effectiveness of each method, and a rich set of questions that help you to consider what fits best with your resources and values. The workbook also allows you to divide an initiative into separate parts, so you can run the exercise separately for different aspects of the same social intervention. This keeps organizations from falling into the trap of a 'one size fits all' method.
Along the way the workbook asks you to refine your learning questions and elaborate a set of indicators that will tell you if you are making progress towards answering those questions. We felt that the specific advice about the indicators could be deepened, perhaps with links to the growing number of efforts to catalogue leading outcome indicators.[2]
Overall, the workbook and online resource provide a powerful new tool for the field. When combined with the database and online community of practice – which could over time become a natural place to share experiences with the tool and specific impact assessments – the McKinsey initiative represents a remarkably comprehensive contribution to one of the toughest challenges that foundations face. Well done, McKinsey!
1 For a short articulation of guidelines and principles for impact evaluation from this point of view, see Impact Evaluation for Development: Principles for Action at http://www.keystoneaccountability.org/node/431
2 Such as those identified in FSG Social Impact Advisors' 2009 report, Breakthroughs in Shared Measurement and Social Impact.
David Bonbright is chief executive of Keystone and Erika Lopez-Franco is an MA Development student and Keystone intern. Emails david@keystoneaccountability.org and erika@keystoneaccountability.org
For more information
http://lsi.mckinsey.com/en.aspx
http://trasi.foundationcenter.org
http://trasicommunity.ning.com