By Erica Bertolotto, Programme Manager, Inspiring Impact.
How much evaluation is enough?
What to measure, and what not to measure? How much money and time to spend on evaluation? How much data collection is enough? Most charities grapple with these questions, and this is one of the topics that came up at our recent learning exchange event for Inspiring Impact’s Impact Champions.
Impact Champions are organisations with a commitment to learning and improving the impact practice of their networks. The Champions are organisations of different sizes, working in different sectors, and at different places in their impact journeys. And yet, despite their differences, finding the right balance in data collection was something they were all interested in learning more about.
We know from our own experience helping charities with their impact practice that most collect too much data, and often don’t use it. But this can change.
To collect the right amount of data and design a suitable evaluation methodology, you first need to know:
- How much evidence is there already that your project works?
If you’re implementing an approach that has been extensively researched and is known to be effective, you don’t need to focus too much on measuring outcomes. For example, there’s lots of research to show that peer tutoring leads to higher grades. If you’re delivering such an intervention, you don’t need to prove that your activities will lead to the outcomes you want to achieve, because the causal link has already been established. Instead, focus on collecting feedback from your service users and staff and volunteers, as well as data about your users: how many people use your service, how often, and whether they are in the right target group.
On the other hand, if you’re taking a new or innovative approach that hasn’t been evaluated before, you need a rigorous methodology focused on measuring outcomes. Your evaluation needs to tell you if your activities contribute to your desired outcomes.
- What resources and capability can you allocate to evaluation?
Allocate a suitable proportion of your budget to evaluation (this will vary according to your intervention and chosen evaluation methodology) and make sure your staff have the right skills.
With limited resources, prioritise data that you will use to improve your services and to meet funders’ requirements:
- Only collect data that you will analyse and use. Seeing how data is used to make tangible improvements to services will keep staff motivated to engage in evaluation, and users will be willing to provide data if they can see it’s meaningfully considered and acted upon.
- Have an honest conversation with your funders about your evaluation plans, so that where possible you can collect the same data for different funders, and make sure you’re able to meet your reporting commitments.
Once you can answer these two questions, you’ll be on the right track to putting an effective measurement framework in place and better understanding the impact of your work. It can be really helpful to talk these things through with people in a similar position. If you’re an organisation committed to learning, improving your impact and promoting good impact practice throughout your networks, you might be a good fit for Inspiring Impact’s network of Impact Champions. The organisations in the network come from across the charity sector and draw on their diverse expertise to regularly discuss challenges such as those put forward in this blog.
Get in touch with Shona Curvers at email@example.com for more information on joining the network.