Case Study: Using mixed methods to understand the value of developing a typology of innovation agencies
18 June 2019
This programme developed a typology of innovation agencies around the world, and then used follow-up qualitative and quantitative methods to understand and develop its value to agencies, writes Geoff Mulgan, as part of a series of case studies highlighting different approaches to capturing impact.
Innovation agencies – government-funded or managed institutions that provide financial and other support to catalyse or drive private sector innovation – have proliferated around the world. They are important players within national and international innovation ecosystems, and are often involved in the design and implementation of innovation policies. Yet there is little practical research that seeks to understand what innovation agencies do, how their missions and activities differ, how they can adapt in order to respond to rapidly changing policy environments, and what they can learn from each other.
The method
In 2015, Nesta set out to fill this gap. Over the course of a year, we spoke to more than 100 individuals in many innovation agencies around the world. We selected 10 case study agencies, representing a cross-section of geographies, approaches and levels of development, to study in more detail. Based on our analysis of their strategies and the types of support they offer to innovators and others, we proposed a broad “typology” of innovation agencies that distinguished between the different roles they might be designed to play.
The final report, How Innovation Agencies Work: International lessons to inspire and inform national strategies, was launched at Nesta’s inaugural Innovation Growth Lab Global Conference in May 2016. Since then, we have engaged with numerous individual innovation agencies, as well as large multilateral organisations like the World Bank, the Inter-American Development Bank (IDB) and the Taftie network of European innovation agencies, to share the findings and explore opportunities for further collaborative work that will support these agencies in designing more effective and inclusive innovation policies.
How we set out to understand the impact of our work
There are challenges involved in evidencing the impact of non-academic research and policy reports. Quantitative data (such as numbers of report downloads or citations) can provide some sense of reach and the types of people who engaged with the work, but do not usually show how the research was used and what contribution it made to the thinking of its intended audience. Qualitative data (such as stakeholder interviews) can give these deeper insights, but are not as objective or generalisable.
To get a picture of the value of our work on innovation agencies, we therefore sought a mix of both types of information. We started by collecting data on the report’s downloads, reach and citations, using Google Analytics. We also commissioned a rapid independent stakeholder review and reached out through our own networks to gather feedback from the innovation agencies we partner with regularly (including through our Global Innovation Policy Accelerator programme), as well as others who had proactively contacted us in response to the research.
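For readers interested in reproducing this kind of download tracking, the sketch below shows one way to pull unique download counts by country from the Google Analytics Reporting API (v4). It is a minimal illustration only, not a description of Nesta’s actual analytics setup: it assumes PDF downloads are recorded as GA events whose label contains the report’s URL slug, and the credentials file, view ID and event label are hypothetical placeholders.

```python
"""Minimal sketch: unique report downloads by country, via the
Google Analytics Reporting API (v4).

Hypothetical assumptions: downloads are tracked as GA events whose
label contains the report's URL slug, and 'service-account.json' is
a service-account key with read access to the view in VIEW_ID.
"""
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
KEY_FILE = "service-account.json"  # hypothetical credentials file
VIEW_ID = "123456789"              # hypothetical GA view ID

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(
    body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            # From the report's launch to the date the data are pulled.
            "dateRanges": [{"startDate": "2016-05-01", "endDate": "today"}],
            "metrics": [{"expression": "ga:uniqueEvents"}],
            "dimensions": [{"name": "ga:country"}],
            # Keep only download events for this particular report.
            "dimensionFilterClauses": [{
                "filters": [{
                    "dimensionName": "ga:eventLabel",
                    "operator": "PARTIAL",
                    "expressions": ["how-innovation-agencies-work"],
                }]
            }],
            "orderBys": [{"fieldName": "ga:uniqueEvents",
                          "sortOrder": "DESCENDING"}],
        }]
    }
).execute()

# Print unique downloads per country, highest first.
for row in response["reports"][0]["data"].get("rows", []):
    country = row["dimensions"][0]
    count = row["metrics"][0]["values"][0]
    print(f"{country}: {count}")
```

Grouping by country is what supports observations like the UK share of downloads reported below; the qualitative stakeholder feedback supplies the other half of the picture.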
The impact
As of 2018, the report had been downloaded by just over 2,300 unique users since its publication, making it one of the most downloaded Nesta reports at the time of writing. By far the highest proportion of downloads (just over 25 per cent) came from the UK.
Subsequent presentations of the findings have taken place for the IDB in Rio de Janeiro, the World Bank in Singapore, and the Taftie network in Lisbon, Luxembourg and Slovenia. Beyond these major umbrella organisations, endorsements and further queries have been received from agencies in countries and territories including Switzerland, Argentina, Norway, Thailand, Spain, Kenya, Hong Kong, Brazil, Sri Lanka, Costa Rica and Peru.
How Innovation Agencies Work has been cited in both academic literature (e.g. in the Oxford Review of Economic Policy) and “grey” literature, including publications by the World Bank, the OECD and the IDB. While other comparative analyses of innovation agencies exist, none categorise agencies’ strategic orientations. This appears to have been one of the most impactful results of the work: many innovation agencies (and bodies that work with them) now explicitly draw on this typology.
For example, the World Bank has conducted research to test whether the typology could apply to a range of further agencies in developing countries. Emerging findings, presented at the Bank’s Global Innovation Forum in 2017, suggest that it is suitable in these contexts, and that it provides individuals at the World Bank with an improved framework for approaching the task of boosting innovation capability in new client countries and regions.
Individual agencies have also used the typology to frame their own conversations about strategy. In 2018, the Chair of Kenya’s National Innovation Agency described Nesta’s report as “very useful in assisting us to think through the kind of innovation agency we would like to build”.
Ana Ponte, head of partnerships and cooperation at ANI Portugal, observed that the work has been: “A fundamental cornerstone for the National Innovation Agency, not only for our internal reflection…but also as the central knowledge basis on which to build a 17-innovation agency task force within Taftie for defining their roles, support services and competences for the future. It provides a much needed systematic approach to a rather informal and diverse landscape.”
The Science and Innovation Link Office in Spain used Nesta’s typology to structure a benchmarking exercise in its support for the Spanish innovation agency CDTI. This enabled CDTI to formulate its strategic orientation more clearly by determining which of the four types of agency was most suitable for the Spanish context, and therefore which kinds of support instruments it should focus on developing.
What we learnt
Even this limited data-gathering exercise supports the conclusion that our research has made a useful contribution to the debate on national innovation agencies. To our knowledge, it remains the most comprehensive study of its kind on this issue, and it has informed and inspired innovation agencies around the world. Positive feedback has focused particularly on the quality of the report’s content and format, reinforcing the importance of producing research outputs that are clearly written, accessible to non-specialists and not overly reliant on jargon.
Feedback also highlighted the importance of producing outputs that are useful for practitioners. Our review found that not all of the innovation agencies who engaged with the work were able to make direct use of its findings. This is unsurprising, given that the report was not intended as a “how to” guide for setting up or changing an innovation agency, but rather as a comparative overview to inspire fresh thinking. Nevertheless, it points to potential, unrealised impact the work could have had if there had been more internal budget and capacity for direct follow-on work with individual agencies.
Related to this, an important lesson for us is that research impact can take a long time to become clear. Immediate interest in this work died down after the publication of the paper, but has grown and strengthened over time as innovation agencies have started applying its ideas to their strategic review processes. More than two years after the launch of the research, we are still being asked to present on its findings, and we are building collaborations to develop the work in new directions.
What’s next
In addition to our ongoing work with a range of innovation agencies in different countries through the Global Innovation Policy Accelerator, Nesta has recently partnered with the Taftie network of European innovation agencies to design and deliver a taskforce on the “soft power” of innovation agencies (the terms of which were informed directly by Nesta’s previous research). The final report, published in April 2019, sets out ideas on the trends, challenges and opportunities facing innovation agencies, alongside the results of a comprehensive survey on the profiles of the Taftie agencies, the advisory and support services they provide, and the skills and capabilities of agency staff.
Key takeaways
- Use both quantitative and qualitative evidence to identify the value of research. Report downloads and citations are traditional metrics of impact, but cannot always show the influence our work has had on shaping ideas and ways of working. Stakeholder views are important sources of data too.
- Invest in maintaining stakeholder networks. Continued engagement with the audience for our work, even if we are not actively conducting follow-up research, is a useful way of tracking impact and ensuring continued relevance. It can also generate new collaborations further down the line.
- Regularly seek external reviews of our work. Even if there is no budget for full evaluations, small-scale assessments by independent experts can be a useful way of gaining an outside perspective of how our research and ideas are received.
About the author: Geoff Mulgan has been chief executive of Nesta since 2011. Nesta is the UK’s innovation foundation and runs a wide range of activities in investment, practical innovation and research.
This article was first published by Nesta.
It is one of a series of eight examples provided by Nesta to highlight different approaches to capturing impact they have used in recent years.
Read Geoff Mulgan’s article here about why it’s essential to try to track what is being achieved.
See also:
Case Study: Capturing the impact of a challenge prize
Case Study: Using matched-control groups to evaluate a volunteering scheme
Case Study: Randomised control trial of SME business support vouchers
Case Study: Embedding standards of evidence into grant-making