Case Study: Embedding standards of evidence into grant-making


Geoff Mulgan | 13 June 2019 at 8:00 am

How can we build the evidence for people helping people? The Centre for Social Action used the Nesta standards of evidence to assess every grantee and support them to build their evidence of impact further, writes Geoff Mulgan, as part of a series of case studies highlighting different approaches to capturing impact.

The Centre for Social Action Innovation Fund ran from April 2013 to March 2016. It was a £14 million (A$25.5 million) fund, delivered in partnership with the Cabinet Office, to support the growth of innovations that mobilise people’s energy and talents to help each other, working alongside public services.

In total, we received more than 1,400 expressions of interest. We backed a portfolio of 52 innovations, investing £11.5 million (A$21 million) in grants and a further £3 million (A$5.5 million) in non-financial advice and support, including rigorous evaluation of the outcomes for the people helped by these innovations.

The work focused on innovations in six priority areas where there was a plausible case for how social action could make a difference, and where we felt that the current solutions were underused.

The method

The Centre for Social Action Innovation Fund was the first time Nesta systematically integrated our Standards of Evidence into our grant-making, following their development in our impact investment work.

We wanted to work with projects to use evidence to increase their understanding about the impact of their work. We also believed that the innovations would need to share good quality evidence with their funders, commissioners, volunteers and beneficiaries if they were going to be successful in scaling their work.

It was important to articulate the purpose clearly from the outset – to help grantees get the evidence they needed to know what works and to scale it sustainably. The aim was not just to end up with a series of robust evaluation reports, but to create genuinely useful insights that could be used to improve each approach and, where possible, to demonstrate its effectiveness.


To support the development of evidence, every innovation was helped to develop a theory of change and to gather data throughout the lifetime of the grant to improve its evidence of impact. Each innovation was also supported by Nesta and our evidence partner, The Social Innovation Partnership (TSIP), to commission an independent evaluation of its work, ensuring an appropriate approach was designed and a good-quality brief developed for the evaluators.

Each grantee’s evaluation was co-designed with them, ensuring that it was tailored to their stage of development. On occasion this meant using control groups or similar, but it also meant more exploratory evaluations to help the grantees improve the design or delivery of their innovation, and/or putting systems and processes in place that better enabled grantees to monitor their impact over time.

Each innovation’s evaluation was independently verified at the end of the programme, to see whether the grantee had been able to increase the quality of its evidence of impact, and how confident we could be in that evidence.

The impact

Spotlight on The Access Project

The Access Project (TAP) works with bright students from disadvantaged backgrounds, providing in-school support and personalised tuition, to help them gain access to top universities. It was awarded £100,953 (A$184,101); £15,000 (A$27,355) of this was for the evaluation.

At the start of the funding period, TAP was validated at Level 2 on the Nesta Standards of Evidence – that is, it had data that could begin to show an effect but not causality. At that stage it was comparing the value-added scores of TAP pupils with pupils from the same school who did not take part in TAP. This provided an interesting benchmark, but there were likely to be systematic differences between the pupils who did and did not take part in TAP.

TAP commissioned the National Institute of Economic and Social Research (NIESR) to carry out the evaluation, with the aim of strengthening its own capacity to evaluate effectiveness in a more robust and rigorous way.

Working with NIESR and building on its work to date, TAP developed a number of tools and approaches for the evaluation work. To improve rigour, the team developed a matched comparison group design, using propensity score matching and national pupil data. This is a quasi-experimental approach used to estimate how much of the difference in outcomes between people who receive the innovation and those who do not is attributable to the approach itself. The data produced was assessed at Level 3 on the Nesta Standards of Evidence – that is, it demonstrated that the work caused impact, by showing better outcomes among pupils who received TAP support than among matched pupils who did not.
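
To make the design concrete, the sketch below shows one minimal way a matched comparison group can be built with propensity score matching. The column names (“treated”, “outcome”, the covariate list), the logistic-regression propensity model and the one-nearest-neighbour matching rule are all illustrative assumptions; NIESR’s actual specification, built on national pupil data, is not published in this article.

    # A minimal sketch of propensity score matching. All column names
    # are hypothetical illustrations, not TAP's or NIESR's actual data.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def psm_effect(df: pd.DataFrame, covariates: list) -> float:
        # 1. Model each pupil's probability of receiving the intervention
        #    (the propensity score) from observed characteristics.
        model = LogisticRegression(max_iter=1000)
        model.fit(df[covariates], df["treated"])
        df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

        treated = df[df["treated"] == 1]
        control = df[df["treated"] == 0]

        # 2. For each treated pupil, find the untreated pupil with the
        #    closest propensity score (one-nearest-neighbour matching).
        nn = NearestNeighbors(n_neighbors=1)
        nn.fit(control[["pscore"]])
        _, idx = nn.kneighbors(treated[["pscore"]])
        matched = control.iloc[idx.ravel()]

        # 3. The estimated effect is the mean outcome difference between
        #    treated pupils and their matched comparisons.
        return treated["outcome"].mean() - matched["outcome"].mean()

In practice an evaluator would also restrict the comparison to pupils whose propensity scores overlap with the treated group (common support) before comparing outcomes.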


Propensity score matching is widely considered a robust approach to creating a comparison group, provided that the factors on which participants are matched are sufficiently comprehensive and meaningful. TAP was unable to include “level of motivation” – a variable that could affect how effective the work appeared if only highly motivated students were receiving the support – as a matching factor. It was, however, able to provide evidence to make the case that this omission does not significantly weaken the findings.
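
A common way to probe that proviso is a covariate balance check: compute the standardised mean difference (SMD) for each matching factor before and after matching, with values near zero (often |SMD| below 0.1) taken to indicate balance. The function below is a generic diagnostic in the same hypothetical setting as the sketch above, not a step the TAP evaluation is documented to have run.

    import numpy as np
    import pandas as pd

    def smd(treated: pd.Series, control: pd.Series) -> float:
        # Standardised mean difference: the gap in group means divided
        # by the pooled standard deviation, so factors on different
        # scales are comparable. Recomputing it after matching shows
        # whether matching actually improved balance on that factor.
        pooled_sd = np.sqrt((treated.var() + control.var()) / 2)
        return (treated.mean() - control.mean()) / pooled_sd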

For university places, the team developed data showing the change in the number of pupils attending top universities from each school, from before TAP started working with them to after. This was assessed at Level 2 on the Nesta Standards of Evidence, showing impact but not causality.

While TAP had invested considerable resources internally on monitoring and evaluation before, this was the first time it looked externally for expertise. Working with NIESR and other external partners allowed TAP to develop a set of tools it could take on and integrate into its yearly operations, both for impact measurement and for programme development and learning. For example, the learning prompted TAP to change its staff structure to support delivery in schools in new ways.

This approach increased the team’s confidence in the effectiveness of the programme, introducing more rigorous methods that mitigate bias and can therefore confirm with more certainty and accuracy that the programme is having a positive impact. It was also cost-effective, with £15,000 (A$27,355) spent on external support for the evaluation. The evidence has since helped TAP make the case for its work, improve the programme, and continue to scale to support many more people.

At the beginning of the grant, TAP was working in London with 600 volunteers, supporting around 600 beneficiaries each academic year. By the end of the grant, it had expanded to include the West Midlands, and was working with almost 1,000 volunteers and 1,200 beneficiaries.

TAP has continued to grow its work beyond the lifetime of the fund, and is now working in the East Midlands, West Midlands and London, and experimenting with online tutoring as a way to reach even more young people through the Click Connect Learn programme.

The role of evidence continues to be critical to the delivery of TAP’s work, and it continues to use and build on the methods of evaluation it developed through the fund.

More information on the evaluation approach and the individual evaluations from the Centre for Social Action Innovation Fund can be found on the published evidence bank.

About the author: Geoff Mulgan has been chief executive of Nesta since 2011. Nesta is the UK’s innovation foundation and runs a wide range of activities in investment, practical innovation and research.

This article was first published by Nesta.

It is one of a series of eight examples provided by Nesta to highlight different approaches to capturing impact they have used in recent years.

Read Geoff Mulgan’s article here about why it’s essential to try to track what is being achieved.

See also:

Case study: Capturing the impact of a challenge prize

Case study: Using matched-control groups to evaluate a volunteering scheme

Case Study: Randomised control trial of SME business support vouchers

