Impact and outcomes reporting

Supporting intermediary organisations
Published in Reports on 3 Jul 2018
Corra Foundation

Iriss worked alongside Corra Foundation to deliver workshops exploring, with intermediary support organisations, their approaches to expressing the impact and outcomes of their work. The organisations taking part all received core funding through the Scottish Government's CYPFEIF and ALEC Fund (Children, Young People and Families Early Intervention Fund and Adult Learning and Empowering Communities Fund), which Corra Foundation administers.

The aim was to support participants to think about how to articulate the contribution their work makes to the people they ultimately aspire to care for and support, given that they do not work directly with these groups. We also hoped that they would then begin to share ideas and thoughts with each other. Most of the participant organisations didn't have a separate evaluation function, so those attending were team and programme managers and coordinators who were thinking about how they could fold these processes into their current roles and ways of recording impact.

What did participants want to explore?

Corra Foundation, alongside colleagues in the Scottish Government, sent out an invite to organisations to attend the first workshop at the end of 2016. They asked what people in these organisations wanted to explore. Some of the responses were:

  • Gathering ideas and sharing info with organisations who are similar/have similar challenges
  • Hints and tips to better evaluate work, development of existing logic model
  • We're good at evaluation, how to glean more from our members via evaluation
  • How do you prove that things have changed from delivery of services
  • How to demonstrate organisation is robust and well-suited to deliver outcomes
  • What can be put in place to measure impact of work/activities
  • How to measure impact on policy (beyond number of consultations)
  • How best to share outcomes delivery when reporting
  • How to measure longer term impact and outcomes

Contribution analysis and action learning

To respond to these, in an initial workshop Iriss presented an introduction to contribution analysis, the use of logic models/theories of change, and how Iriss has used this approach as an intermediary to look at our organisational impact. You can see the blog from some of our contribution analysis (CA) work here. We highlighted that we layered our CA work and examined the contribution of individual projects, programmes and the organisation as a whole. We emphasised how we built theories of change that connected up the resources we put in, what we did, and the outputs and outcomes that resulted. On the day, Children in Scotland also shared some of their own work on developing plans around their contribution, and some of the ways in which this was beginning to influence wider parts of their organisational strategy and work. We then used this to prompt people in the room to explore in groups how they currently measure their impact.

Participants gave feedback at the end, highlighting the value they took from learning more about contribution analysis, but also wanting to have some practical tools/help in starting to try some of this out. In response, we shared some of the basic tools and methods we used to develop our logic models and collect our evidence.


Participants also told us they wanted more time to discuss with each other and reflect on what this might mean for them in practice. The next two workshops were therefore set up using an action learning set model, so that people could bring along their own questions, challenges and situations, and help each other come up with thoughts and solutions.

Key messages

The following four discussion points are a result of learning from these sessions, as well as learning that Iriss has accumulated from talking with other organisations across social services around measuring their impact and outcomes. At Iriss, we often use appreciative enquiry and action learning set processes (as described in our recent leadership and supervision work) and these themes around impact and the importance of time for reflection are frequently discussed.

There are perceived differences between what funded organisations want to say and what funders want to hear

We consistently hear about there being both real and perceived differences between what practitioner organisations report on and what they think commissioners and funders want to hear. There can be a difference between what organisations themselves value (and use to develop their own work), and what they report to others. There seems to be a perception that the funder can't hear the 'story' of a service or intervention, that they can't hear and accept service user stories or practitioner accounts as evidence of change. However, in our discussions with both sides of this conversation there was a real commitment to change that perception. This is something that has also been discussed by Evaluation Support Scotland (ESS) in their work around going beyond numbers in reporting. See their blog with some additional links. They highlight the strengths that case studies, creative writing and word clouds can bring to enliven and strengthen the reporting of impact. Iriss has also published an Insight on storytelling and how this can contribute to change at personal and organisational levels.

We have found that one way to expose the differences between reporting and reality, if they exist, is simply to ask 'why' questions: 'why do you collect that evidence?' or, even more fundamentally, 'why do you do what you do?' Once you start to explore what really matters, you can identify the difference you want to make and focus your reporting around that. Sarah Morton at Outcomes Focus has posted about their learning on some of these themes. Those we have worked with have really valued using logic models/theories of change to draw the line between their values, their actions, and the outcomes they want to achieve. This has made it easier for them to argue and evidence what they do, and to do it with confidence.

These conversations tend to not start by sitting down and saying 'let's develop a logic model', but rather start with having open conversations about what matters. We have developed a number of tools that help people start these conversations, whether that be through having a community of enquiry, initiating conversations around outcomes, or bringing in more perspectives for people to work around the values of coproduction. People have told us that these tools have been particularly useful as they can help bring together funders, managers, practitioners and experts-by-experience to share what is important and create a true story of impact. For intermediary organisations, this story often blends evidence from a number of places.

We don't have the time…

This is probably the most common barrier that we hear, not only to engaging with thinking around evidence/impact/evaluation, but to engaging with any development, learning and support work. This is completely understandable in what are often fast-moving and high-pressured environments, where much of an organisation's resources go into winning grants and work, delivering that work, and reporting on it. Evaluation and outcomes thinking can often seem like a luxury, or an afterthought, especially within intermediary organisations with little or no dedicated evaluation function. This also leaves very little time for continuous reflection.

The organisations that have started to change how they work are the ones who see the value this can bring to their organisation as a whole, both in how they reflect on past work and in how that then helps them plan for the future and support others. Focussing more strongly on using outcomes to explain impact can seem daunting to begin with, as you stare at a text-heavy logic model from elsewhere and the volume of evidence you would need to collate your own. In our sessions we highlighted how this is the case at the beginning, but as you go further down this route a culture of learning and reflection becomes normal and is no longer additional work. It would be a lie, though, to claim that this is easy. Individuals and organisations need help to think these processes through. There is a resource published by Learning Link Scotland and Education Scotland that explores this, called 'Evidencing the Impact', as well as a range of helpful resources from Evaluation Support Scotland. We found the action learning set approach to be a great aid in helping organisations begin their explorations around this way of thinking. It enabled them to pose questions of each other, as well as share stories and practice of what they already do and were hoping to do in the future. This revealed that they were already collecting most of the evidence they needed; there were just some gaps (often around diversity of voices and evidence types) that they could work to fill.

Learning from Evaluation Support Scotland

Evaluation Support Scotland

Evaluation can seem daunting, especially when you are moving to an outcome-focused approach. The trick is to:

  • Map and use existing evidence and information collecting systems
  • Avoid having too many outcomes or indicators
  • Use methods that fit with your everyday practice
  • Make that evidence useful for clients and practitioners to reflect, make improvements or plan ahead

Where do we make a difference?

There was some anxiety around contribution versus attribution. Many of the intermediary organisations taking part, like many organisations across social services as a whole, wrestle with the fact that they are one amongst many influences on people's lives, and that teasing out their impact is complex, especially as the end outcomes are often around behaviour, feelings and emotions. This is key, as it highlights the importance of hearing from those whose lives you seek to influence, and having their voice lead in any outcome assessment. As Bond's guide to Impact Evaluation notes, in complex settings where there are multiple funders or other stakeholders 'identifying your contribution and recognising the contribution of others is more realistic than searching for evidence of sole attribution.' Again, this underlines the advantage of getting multiple perspectives when planning, delivering, reflecting on and evaluating any piece of work or programme. It means an intermediary has to work closely both with the primary delivery organisations they support and with the funders and commissioners they report to.

There is also a challenge here for funders around 'holding outcomes lightly', highlighted in this ESS report on asset-based approaches. Holding outcomes lightly recognises that outcomes often emerge from actually doing the work, that they may be unintended, and that they may change and evolve over time. This makes outcomes hard to predict at times (although this shouldn't stop you trying!). It requires being open to things changing, and that isn't always a comfortable place for a funder, or indeed a reporting organisation, to be. However, a high-level outcome will often be something like 'people having more positive control over their lives' or 'people having more independence in their social lives', and if this is what people value and want to strive towards from the start, you can shape your work with them to get towards that goal. Something like a logic model can therefore be aspirational and malleable from the start, and help shape what evidence you will collect and present to a funder. None of these things should be left to happen or emerge in isolation; a holistic plan will help to chart and justify any journey you take and difference you make.

Learning from Evaluation Support Scotland

  • We seldom achieve outcomes alone, lives and systems of support are complex and there may be many influencers
  • A logic model can help you to tease out your contribution to other people's agendas
  • Often you start with what is important to the person and those outcomes might evolve or change over time
  • You can link personal outcomes to broader outcomes in your model

Is my evidence good enough?

This point flows through the three others above and is much wider than just the subject of this piece around contribution and impact. There are multiple ways of cutting evidence - type, quality, availability, relevance, usability, rigour… the list can go on. Organisations who support others, when reporting on their work and the evidence they use and create themselves, can get lost in this world. Those we have worked with have talked about 'going numb' at this deluge of information, and it can cause them to retreat from engaging with their own use of evidence.

We have referred to the TREBL framework [1] before to help people navigate their thinking around evidence they do/can use. This outlines that good enough evidence should be:

  • Transparent - are the methods, strengths and limitations of evidence explored and acknowledged?
  • Relevant - does it help you answer your questions or get to the outcomes that you hope for?
  • Enough - do you have enough evidence to be confident that your impact is leading to what you claim?
  • Believable - are you confident in the impact you are claiming through the evidence you present?
  • Legitimate - have you included stakeholders in sense-checking your impact/contribution arguments?

There are some useful reflections on this evidence thinking by BIG Lottery Fund and ESS.

We found that the trickiest thing when talking to organisations about how they use evidence to argue impact is that there is no one perfect solution where you can simply say 'do it this way'. It always has to be tailored to the organisation or project, and it is almost always a blend of evidence types that makes the best argument. We recognise that practice wisdom and craft, research findings, lived experience, system data, and policy, political and contextual knowledge all have their place in the evidence landscape. We highlight that people need to know why they collect evidence and how this ultimately improves outcomes for people who access services. However, we also acknowledge that ethics, values and relationships play a key role in progressive social services practice.

In the work we have carried out, the evidence that intermediary organisations used in their reporting was often primary evidence (collected by them) and would regularly be numbers-based (number of workshops held, people involved, referrals reached etc). There is work to be done in helping organisations expand their thinking on what evidence types can be incorporated and how, but also on where they can go to access secondary or external evidence that would help them justify and make arguments. This may include where to go to collect evidence, but it may also involve improving individuals' skills around searching for, finding and using evidence from elsewhere. You can see our recent report on our event, run in partnership with the Alliance for Useful Evidence, which brought together a range of sectors to discuss evidence use in their worlds. Key themes that came out of the event were the importance of collaboration, as well as the challenge of using evidence to inform practice in fast-moving environments.

Learning from Evaluation Support Scotland


Those we have worked with are often overwhelmed by the 'is my evidence good enough?' question. The question should be connected to 'good enough for what?' and 'how will you use the information?' For example, will it be to:

  • Help inform personal support for a person
  • Reflect on practice
  • Report to funders
  • Inform policy

The type of evidence collected and used should be informed by the answer to these questions - good enough and appropriate is not the same in every situation.


Summary

On the back of this project, and the other work that we have been involved with, we have some general recommendations for organisations that relate to the discussions explored above:

  1. Spend time as early as possible with commissioners/funders, as well as service users and other stakeholders, to develop together what outcomes are being sought and agree together the evidence that will be collected to evaluate towards those outcomes.
  2. Look at what you already do to map and evidence your impact, from your resources and actions, through to your project and organisational goals and outcomes. Are there gaps, or are you collecting evidence that you don't need to?
  3. Consider how you reflect on and learn from what you collect - does this feed back into future work?
  4. Use theories of change/logic models/contribution analysis to become more confident in what you do and how you communicate that to others.
  5. Embrace using a blend of evidence types to make the best impact argument you can.

For further information on this work please contact: Stuart Muirhead, Project Manager, Iriss

Acknowledgements

Many thanks to Diane Kennedy from Evaluation Support Scotland who contributed to the piece with some key learning points from their work, and to Catriona Henderson from Corra Foundation for her work on the project and on feedback to this piece.

Notes

  1. This has been adapted by Evaluation Support Scotland from Levitt, Martin, Nutley and Solesbury's (2010) book 'Evidence for Accountability: Using Evidence in the Audit, Inspection and Scrutiny of UK Government'.