Our evaluation team has been busy with a range of work, from helping councils measure the impact of specific initiatives or investments, to analysing ongoing projects to ensure that the conditions of grant funding (hello, Warm Homes!) are met. This is a dynamic process: it is based not just on data, but on people and their experiences. It demands analytical skill, but also a thoughtful approach to capturing people's stories and helping communities see the benefits of participation, as Behavioural Economist Leanne Kelly explains.
Our evaluation projects over the last couple of years have led us to explore and analyse all kinds of interesting initiatives and places, and to meet some inspiring community organisers. Some of this work is ongoing and iterative, as we pose and help to answer questions across the project cycle. These are ‘realist evaluations’.
By this, we mean that the same intervention won’t work the same everywhere, for everyone, or even every time. We undertake in-depth and mixed methods research to understand how context interacts with specific place-based mechanisms to realise the intended outcomes. In simple terms, how do the unique features of a place and its people affect how certain actions work to get the desired results?
Here are just a few of the questions we’ve been answering, to illustrate how this thinking works in practice:
How well did X decarbonisation innovation work in Y local area?
What did it mean for different parts of the supply chain and local economy?
Which households or community groups benefitted?
What will drive or constrain its upscaling, wider uptake and value for money?
How well has the programme improved social connection and wellbeing in these neighbourhoods?
Who remains at most risk of social isolation – and why?
What does the wider system around the social connection and wellbeing intervention look like, and how healthy is it? (In essence, what factors help people connect and feel well, and are those systems working?)
What lessons and recommendations can inform the programme’s adaptation?
We have talked to leaders and users of community groups covering a wide range of activities, from men’s mental health meet-ups (with good food always served!) and self-expression art classes for new parents who felt isolated, to dance and drama classes for those new to the local area, including those in difficult housing situations.
Three areas have become increasingly vital in evaluation
Each of our evaluations can be categorised as a process, impact and/or value-for-money assessment. Each has had a bespoke Theory of Change and Evaluation Framework that maps intended outcomes and impacts to indicators, measures and methods, which we co-design with our clients, stakeholders and community groups.
But three areas of method and exploration have become increasingly valuable in our role as independent evaluators. They give an evaluation’s stakeholders and the wider community effective ways to share more insights, to participate actively and disseminate learning, and to take tangible lessons and examples into their own activities and delivery going forward.
1. Systems mapping
Firstly, the role of systems. The local authority and neighbourhood-level interventions we are looking at include housing innovations, place-based decarbonisation and projects addressing health inequalities. All of these can be described as systemic problems and solutions. Assessing their process (what is working and how) and their impact requires an understanding of the system in which these interventions operate and have influence.
Policies and interventions can be defined as part of a complex system when they show specific characteristics, for example those set out in the Policy Lab’s Healthy System Indicators: being interconnected (the problem they seek to solve is interlinked with other problems), non-linear (the problem is complex and won’t yield to simple cause-and-effect logic) and unpredictable (the outcome of efforts to address the problem cannot be reliably forecast).
We are looking at these interventions from different perspectives and at different levels, for example at the level of government, or along the supply chain from large companies to SMEs, community groups, volunteers and individual beneficiaries. We do this with:
different systemic factors, e.g. hierarchies, the quality and quantity of connections, information flow and behavioural barriers
varied outcomes and impacts, e.g. at individual, community, social, supply chain and governance levels.
System mapping, ongoing analysis, and assessing system health have become powerful tools in our evaluations, often explored through interactive stakeholder workshops. In our health, wellbeing and social connection work, this approach has helped uncover where trust is lacking, where communication barriers have formed and who the key local messengers could be.
It has also revealed shared purposes and responsibilities among stakeholders, highlighted where outcomes are co-designed with people with lived experience, and shown how better coordination can make information and signposting more consistent and accessible. Crucially, it underscores the vital role of VCSEs in strengthening local partnerships, fostering social connection, and supporting people to take charge of their own health and wellbeing.
2. Getting the timing right
Ensuring those with lived experience are included in the design and delivery of interventions, and in the evaluation itself, can be critical – a principle shared by many of our clients.
It’s important to capture the story and let individuals and organisations feel ownership of it. At the same time, we need to avoid overwhelming people, especially early on, when they are just starting new connections, managing their health or trying something new at home (while still understanding what their starting point looked like). To strike this balance, we consider:
Flexible participation methods over time: earlier input could take the form of snapshot polls, brief chats or diary entries, building up to more in-depth discussions, observations or workshops later on. We aren’t fixed on using the same methods ‘pre’ and ‘post’ when it comes to unique individual experiences in a realist evaluation.
Built-in moments of reflection. By taking an iterative approach to evaluation, we can return to individuals and groups at different times, allowing them to build on and update their own narratives. We look to align their participation with natural points of reflection: the start of a new year, the end of an activity term, or the moment an individual passes a key milestone. Participants often readily offer their own insights and suggestions for the programme and its stakeholders. In one of our recent evaluation waves, we were able to bring together valuable learning from funding grantees for other grantees, using stories of delivery and change.
Based on our work, we have found it helpful to:
Demonstrate when your local partnerships work well to encourage collaboration in the community. Word-of-mouth testimonies and referrals can be powerful.
Reflect diverse experiences and opinions. It’s important to have a range of staff and volunteers who reflect and represent the diversity of the community you are working with.
Grow people’s interest and initial momentum using a mix of approaches. Consider when in-person works best, use accessible language, send event and activity reminders through group communication methods, and offer easy and convenient trials of new activities.
Develop a sense of ownership and agency for beneficiaries. Consider where beneficiaries can make decisions and shape what the activity looks like; this may be as simple as choosing topics or meeting times for the next sessions.
Capture stories of change. Storytelling, in careful and transparent collaboration with beneficiaries, can be a useful exercise for the individuals involved, as well as a way to demonstrate your impact.
Be adaptable! Understand what (potential) beneficiaries really need and want.
3. Demonstrating benefits
We are committed to making sure our clients understand how beneficiaries, wider stakeholders and the community can benefit from these evaluations. We have developed summary reports so that stakeholders can see the evaluation’s insights and recommendations for each wave. We also run capacity-building and training sessions for VCSEs, SMEs and even GP practice teams, covering everything from the basic foundations of monitoring and evaluation through to co-designing their own monitoring plans and strategies to measure and report impact.
This capacity building helps these groups or organisations to be more funding-ready, helping them articulate the intended outcomes and demonstrate impact.
It can also benefit those who lead programmes and interventions, as they gain a base of local stakeholders who understand how objectives can be met. People can be reflective in sharing their learning and successes, and can offer constructive ideas for improving local programme design or delivery.
Inviting, capturing and utilising this knowledge has been a key part of our recent evaluations – putting people and communities at the heart of evaluating the change they want to see.
