Nine trends in social impact measurement – and how they could impact your organisation
19 September 2019
Katya Andreyeva and Nancy Tran from SVA Consulting share their top nine social impact measurement trends, explain the buzzwords that come with them, and shine a light on the implications for not-for-profits.
Social impact measurement (SIM) is an ever-growing and changing field of practice.
For practitioners and those who interact with SIM (ie all of us in the social sector), it’s important to understand these emerging trends as they will have a significant impact on how and what we measure, what skills will be required and what it will cost to know if our work is making a difference.
We’ve compiled what we believe to be the top nine SIM trends, in three key areas most likely to change the way you measure impact, and what you need to do to ensure you’re riding the new wave instead of being bogged down by it. And, given how frequently they pop up in the sector, we’ve included some buzzword explainers: in reality, most of these terms are new names for familiar concepts, or simply fancy labels for things that are easy to understand.
(Feel like you’ve completely missed the boat on impact measurement? Read why measuring your impact is non-negotiable in today’s sector.)
Theme: Increased demand for social impact measurement
1. Shift towards outcomes-based commissioning
In the past, governments have measured what they do, and not necessarily what they achieve – this is changing. Governments increasingly see themselves as strategic funders, placing greater emphasis on outcomes and impact rather than on block funding and activity reporting. In this sense, you can think of government as buying outcomes rather than services or activities.
This shift is evidenced by the development of outcomes frameworks (at various levels) across all state governments, and wider adoption of outcomes-focused performance measurement to build the capacity for longer-term outcomes-based commissioning.
Buzzword alert:
Commissioning for outcomes: A strategic approach to designing, resourcing and delivering effective and efficient services to meet needs, focused clearly on how resources can achieve defined outcomes for individuals and communities.
Implications:
Service providers should start building the organisational culture, capacity and processes to measure and report on outcomes – both for their own decision-making and to satisfy the requirements of government. With that in mind, they should also familiarise themselves with government outcomes frameworks as they are developed, and align their own outcomes with them.
2. Rise of consumer-directed funding and social procurement
Thanks to developments such as the National Disability Insurance Scheme (NDIS) and the rise of social procurement, more consumers are making purchasing decisions based on evidence of social outcomes achieved.
Looking first at social procurement: governments and corporates increasingly want to spend their money in ways that also generate social value. This is particularly true in Victoria, where the introduction of the Victorian government’s Social Procurement Framework is set to see demand skyrocket for social enterprises that can demonstrate their social value.
Similarly, the NDIS ushered in a consumer-directed funding environment where individuals decide where to spend their support payments, meaning providers need to be able to articulate why their services will lead to the outcomes the individual is seeking. And while this funding model was introduced into the disability sector, giving consumers control of their own budgets is spreading to other areas as well.
Buzzword alert:
Consumer-directed funding: People have a budget set aside that they can spend on goods and services – giving them decision-making power as a consumer.
Social procurement: When organisations choose to purchase a social outcome when they buy a good or service.
Implications:
The implications are two-fold. First, because those who procure your services are now at the centre of what you do, it is critical that an efficient feedback loop exists between the activities you deliver and the outcomes achieved, so that you do not lose those customers.
And second, organisations now need to communicate their impact not only to government and funders but also to their customers. This is a change for organisations: they need to consider the right communications and channels to reach their target audience with a compelling impact message that attracts people to their services.
Theme: Evolution of practices and approaches
3. Emerging approaches to evaluating systems change
There is growing interest in taking a wider view when tackling complex problems, seen in “systems change” approaches and collective impact initiatives (see definitions below) that look beyond a single program to find solutions. For example, the eight Children and Youth Area Partnerships established across Victoria use a collective impact approach to unite government, services and community to improve outcomes for vulnerable children, young people and families.
Buzzword alert:
Systems change: An approach to creating impact that’s designed to shift the status quo by shifting the function or structure of an entire system, rather than just a part of it.
Collective impact: A collaborative framework that allows cross-sector groups who share a common interest to address a complex social issue in a particular community.
Implications:
Naturally, these systems-change initiatives need to be evaluated, which requires social impact measurement practitioners to think beyond measuring the impact of a single program, service or organisation, and to consider how to evaluate an entire system. There are some emerging approaches to taking this broader perspective in evaluation, based on an understanding of the different levels of systems change.
This trend also has implications for the skills that evaluators (internal or external) need to bring to the table, as traditional linear thinking must be replaced by critical thinking, complex problem-solving and strong facilitation skills.
4. Emphasis on learning and continuous improvement rather than “evaluation”
Traditional program evaluation casts the evaluator as an unbiased, external researcher who determines whether impact occurred once the project is complete. But it is increasingly regarded as a waste of resources to learn whether something works only after it has finished.
Instead, placing evaluators alongside project managers encourages double-loop learning as well as increased accountability to evaluation commissioners. Having the evaluator engaged at the outset to advise on how interventions are designed allows for improved evaluability, better risk management and a greater ability to achieve outcomes and overall impact. This is a trend to watch, as internal independent evaluation models are proving to be a value-add alongside the accountability check of external evaluation.
Buzzword alert:
Developmental evaluation: An evaluation approach designed to support the development and adaptation of a program or initiative, by providing real-time feedback to program staff to facilitate a continuous development loop.
Implications:
This trend towards evaluators as project partners rather than end-of-project assessors reveals a significant shift in the role and perception of evaluators. This longer-term engagement is seeing an increase in the requirement for internal evaluators across the sector as well as increased interest in, and funding for, developmental evaluation as a practice.
5. Use of mixed methods rather than the “gold standard”
While some still believe that randomised controlled trials are the “gold standard” of evidence, the mixed-methods approach is increasingly accepted as “good enough”.
There are now multiple ways to demonstrate impact, including the use of counterfactuals and multiple data types for context and triangulation. This growing acceptance of “good enough” reflects the reality of working in the social sector where resource constraints, complexity and the need for agility all loom large. As Voltaire is believed to have said, “perfect is the enemy of good”.
Buzzword alert:
Mixed-methods: A research approach that combines both qualitative and quantitative data.
Quasi-experimental research: A study used to estimate the impact of an intervention – similar to a randomised controlled trial, but without the random assignment or the same level of controls.
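To make the quasi-experimental idea concrete, here’s a minimal sketch of a difference-in-differences comparison – one common quasi-experimental technique for estimating a counterfactual. The figures are invented for illustration, not drawn from any real program.

```python
# Difference-in-differences sketch (invented figures, not real program data).
# A quasi-experimental design compares the change over time in a group that
# received a program against the change in a similar group that did not,
# without randomly assigning people to either group.

# Average outcome scores before and after the program (hypothetical)
treated_before, treated_after = 52.0, 61.0        # program participants
comparison_before, comparison_after = 50.0, 54.0  # similar non-participants

treated_change = treated_after - treated_before           # 9.0
comparison_change = comparison_after - comparison_before  # 4.0

# The comparison group's change stands in for the counterfactual:
# what would likely have happened to participants without the program.
estimated_impact = treated_change - comparison_change     # 5.0

print(f"Estimated program impact: {estimated_impact:.1f} points")
```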
Implications:
This lowers the barrier to entry for SIM, as academics and specialist evaluators are no longer gatekeepers for impact measurement. This more democratic trend means evaluations can cost less, facilitate faster learning and allow for critical thinking instead of black-and-white pronouncements.
6. Human-centred and participatory design
Human/user-centred design is not just a flashy term but central to ensuring that SIM practice is accurate, useful and flexible. It is an approach to SIM that prioritises human/stakeholder perspectives, putting people at the centre of data collection, evaluation design and reporting – and ensuring that power is held by the people whom the data concerns. This contrasts with traditional SIM methodology, which used predetermined or funder-imposed measurement frameworks organised around project objectives rather than people.
It’s easy to see how adopting a human-centred approach can improve SIM, particularly with marginalised populations. But it’s also useful for complex problems where the intervention and the outcomes aren’t well understood, or where the context is volatile or involves repressive situations.
Buzzword alert:
Data sovereignty: A movement which seeks to retain and return data ownership rights to the communities where the data is collected – a movement of particular significance for First Australian communities.
Implications:
Shifting from a traditional approach to participatory design requires a mindset change for all involved, as well as different skills for SIM practitioners – capabilities like empathy, listening, and the ability to set aside your own values, ideas and experiences in your measurement approach. It also requires a higher tolerance for complexity or “messiness”.
Theme: Tech-enabled advancements
7. Shared measurement and standardisation
In recent years there has been a global shift towards aligning the metrics, tools and methods used to measure and evaluate outcomes, supported by international networks for sharing data and learnings. This trend is visible in the rise of a new generation of open-source platforms for complex projects that permit entry and analysis of real-time data, along with data management and visualisation.
This move to standardisation has largely been driven by the impact investment community, which requires the same level of rigour in impact measurement as in financial reporting in order to increase investor confidence and thus the flow of capital.
We’ll be keeping an eye on this trend as we believe these new opportunities to integrate evaluation with monitoring will enhance data quality and hopefully break down evaluation silos.
Buzzword alert:
Shared measurement: Both the product and process of taking a shared approach to impact measurement where multiple organisations share responsibility for data collection and learning.
Implications:
Standardisation is likely to deliver cost savings, improve data quality, reduce the need for specialist evaluation expertise and lend greater credibility to organisations looking to measure their impact.
8. Open data and powerful analytics
Data is increasingly collected and shared between the government, social and private sectors, creating huge potential for cross-sector analytics and insight. Keeping pace with developments in technology, outcomes measurement systems now offer innovative, real-time data collection using devices such as tablets, mobile apps and near-field communication (NFC) cards.
The challenge for organisations looking to capitalise on these technological advancements is the need to think strategically and explicitly about the ethical considerations around the data being collected. The potential for this data to be misused, as well as the challenge of gaining informed consent, opens up a conversation about what information should be collected. Data collection should be grounded in a clear data governance framework, not led by the capabilities of the technology itself.
Buzzword alert:
Open data: Data that is freely available to everyone to use and republish.
Big data: A massive volume of data that is so large it’s difficult to process using traditional database techniques.
(Predictive) data mining: Applying analysis and machine learning to big data to find trends and insights – a form of artificial intelligence, which is the simulation of human intelligence by machines, especially computer systems.
Data labs: Opening up and mining government and social sector data to develop greater insight into, and improve, services.
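To give a flavour of what “predictive data mining” looks like in practice, here’s a deliberately simple sketch using the scikit-learn library. The features and outcome are entirely synthetic – invented for illustration, not drawn from any real service dataset.

```python
# Illustrative only: a simple predictive model of the kind "data mining" refers to,
# trained on synthetic data with made-up feature meanings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Two hypothetical features, e.g. sessions attended and weeks enrolled
X = rng.normal(size=(n, 2))
# A synthetic "positive outcome" label loosely related to the features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```

Even a toy model like this makes the governance point concrete: before anything is predicted, someone has decided what data to collect and what counts as a “positive outcome”.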
Implications:
The opportunity and potential impact here can be large, but organisations must carefully prepare for this change. This includes strengthening internal data governance systems and protocols, as well as thinking about the skills required to make the most of the available data. It’s important also to understand the limitations of these systems and platforms, particularly artificial intelligence.
9. Data visualisation
Collecting data is never enough, as data cannot speak for itself. There is always work to be done to ensure that any data collected can be understood. Even within the span of our careers we’ve seen vast improvement in how data is reported and visualised. Data visualisation software now allows users to interpret and communicate their data more easily and effectively.
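As a small illustration of how accessible this has become, here’s a minimal sketch using the matplotlib library to chart outcome scores. The domains and figures are invented for illustration.

```python
# Minimal sketch of visualising outcomes data (invented domains and scores).
import matplotlib.pyplot as plt

domains = ["Housing", "Employment", "Wellbeing"]
baseline = [42, 35, 50]   # hypothetical average scores at intake
follow_up = [58, 49, 63]  # hypothetical average scores at follow-up

x = range(len(domains))
width = 0.35
fig, ax = plt.subplots()
ax.bar([i - width / 2 for i in x], baseline, width, label="Baseline")
ax.bar([i + width / 2 for i in x], follow_up, width, label="Follow-up")
ax.set_xticks(list(x))
ax.set_xticklabels(domains)
ax.set_ylabel("Average outcome score")
ax.set_title("Client outcomes by domain (illustrative data)")
ax.legend()
plt.show()
```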
Buzzword alert:
Business intelligence (BI): BI refers to the approaches, tools, and mechanisms that organisations can use to keep a finger on the pulse of their operations – also referred to by unsexy terms, such as “reporting”.
Implications:
These new tools have shifted the emphasis from data collection to data interpretation. This shift also requires a change in organisational culture: data needs to be embraced as an opportunity to learn, not as a judgement of individual performance.
For tips on how to manage this culture shift, see our report on Harnessing the Power of Client Feedback.
About the authors: Katya Andreyeva and Nancy Tran are principals in Social Ventures Australia’s consulting team.