Data in the news and a trip down memory lane….
It’s January 2026, and I’m taking a bit of time for reflection at the start of the year to talk about ASP data and why the big statistics matter. This relates to the journey that I and others have been on in co-designing a National Minimum Dataset (NMDS) for ASP. I want to share reflections on why I think this data matters, why we built the NMDS, how it’s being used to support improvement and planning, and how it can be used to best effect. It ends with some practical tips and considerations for those who might want to build their own dataset! It should be noted that it has been several years in the making, to ensure we have data robust enough to be both meaningful and comparable across all 32 Adult Protection Committee (APC) areas.
It also comes on the back of the Scottish Government’s Adult Support and Protection (ASP) statistics, published on 9 September 2025, which draw on the NMDS. This report was a landmark occasion and a bit of a cause for celebration. Why? Because we were able to put a report out into the public domain and share it for the first time with some confidence in the data. The data provides national oversight on a range of key indicators. Did you know, for example, that in 2024-25 there were 63,144 ASP referrals and 47,314 ASP inquiries, of which just over 1 in 5 required Council Officers to use the investigatory powers afforded to them by the Adult Support and Protection (Scotland) Act 2007? There were also 49 Large Scale Investigations due to concerns that more than one adult in a given service was at risk, a decrease from the 76 reported in the previous year.
And in terms of what we know about adults at risk: the largest number of inquiries were for women aged over 65, and 96% of all inquiries were for people recorded as ‘white’ where ethnicity was recorded. At inquiry stage, ‘physical harm’ was the most common primary type of harm, followed by self-inflicted harm (‘self-neglect’ and ‘self-harm’ combined), with the individual’s own home, followed by care homes, the most common locations of harm. The data also shows that adults at risk of harm experience a wide range of underlying conditions, including mental health problems, substance misuse, learning disabilities, physical disabilities, and infirmity due to age. For inquiries using investigatory powers, the most common primary client group categories were ‘mental health (excluding dementia)’ (20%) and ‘infirmity/frailty due to age’ (17%).
In future years we also hope to report on other indicators collected, when we are satisfied with the quality of the data. This would include the number of Adult Support and Protection Plans in place and data on adults at risk with caring responsibilities for other adults or for children and young people.
Where does the data come from?
The published report is based on the ASP NMDS – quarterly data that has been reported to the Scottish Government since 2023-24 as a mandatory return by all 32 APCs. Different APCs have different Management Information Systems (MIS) and their own recording processes, so it is the shared definitions, glossary terms and guidance on how to count and categorise that provide the framework for meaningful national oversight and comparison, and that help inform conversations on national improvement agendas. At national level it is aggregate data; at local level there is greater scope for analysis and manipulation, which we will come to later.
Why we built it
At both national and local level its purpose is to inform planning and support improvement. It also has a secondary benefit in making visible the contribution that ASP makes in helping keep adults at risk of harm safe, supported and protected.
Prior to the NMDS we had annual data based on annual returns to the Scottish Government. But it wasn’t robust or comparable across different APC areas, with the phrase ‘it’s like comparing apples and pears’ commonly used. There was also some dissatisfaction that what was collected was too focused on the start of the ASP journey (referrals, inquiries and investigations etc.) and didn't tell us enough about the people in the system or the support in place. Plus, it was annual data, so out of date before anyone had read it.
In 2020-21 Iriss were commissioned by the ASP Team in the Scottish Government to help develop a quarterly Minimum Dataset, inspired by the Child Protection one that CELCIS supported colleagues with. Our remit was to co-design it with the sector, and later to ensure that our indicator definitions aligned with the revised Code of Practice that came into force in July 2022.
How we built it
In 2021 we started with a national audit of ASP indicators, including those collected annually by the Scottish Government and others commonly used by different APCs, to identify ones with real potential for inclusion in an NMDS. 23 APCs contributed to this. We also recruited five learning partners (representatives from five Adult Protection Committee areas) to co-design, then test and refine prototypes, made tangible in the form of data workbooks, glossary terms for indicators and categories, and guidance on how to ‘count’ and select. We also provided some scrutiny questions to support reflection, to help interpret the data and tease out implications for action.
In 2022-23 Scottish Government colleagues established a Data Reference Group – to support the development of the NMDS and optimise opportunities for collaboration and usage. The group was also minded to consider the NMDS in relation to intersecting datasets (like the regular SOLACE collections begun during COVID, or Care Inspectorate data). This was with a view to minimising duplication, whilst also ensuring that the NMDS informs and is informed by relevant cross-policy and cross-agency interests.
In 2023-24 we rolled out the first set of indicators across Scotland, with additional indicators added in 2024-25 as Phase 2! The first phase’s indicators were largely similar to the previous national annual return, but with a far more robust glossary underpinning them, to promote a consistent understanding of each indicator. We adopted a phased approach because sector colleagues told us that bringing in too much change too quickly would overwhelm their systems!
In 2022-23 we also began conversations with SOLACE on phasing out duplicative SOLACE returns that included indicators relating to adults at risk, once we were confident in the data quality of the NMDS for those ‘matched’ indicators. In 2023-24 the ASP-related indicators within the SOLACE returns ceased. 2022-23 was also the last year for the ASP Annual report that the ASP NMDS had been brought in to replace.
Plus, before, during and after roll out we provided (and continue to provide) national quarterly drop-ins to support implementation. In this space, ASP Leads, Data Analysts and MIS providers with an interest in the data are all welcome. These are chaired by, and at the invitation of, Scottish Government colleagues and supported by Iriss. At these online meetings we have shared progress updates and, from 2023-24, the data itself. We have also used these as an opportunity to answer questions and listen to colleagues on any implementation challenges experienced, taking these back to learning partners to see where guidance can be revised and improved. More recently, we have used this space to reflect on how APCs are using the data itself to drive improvement at a local level and to support shared learning.
To support quality assurance, we provide space in the workbooks for local areas to identify any gaps in their data submissions and to say why this is the case. In the first year, and later as needed, we have also allowed retrospective updates as systems and practices adapted and people got better at completing the workbooks.
For various indicators we also offered ‘other’ as a category with an associated free-text box, e.g. for harm types and client categories. This informed discussions about any ‘types’ that didn’t fit naturally into existing boxes within the dataset; it helped prioritise areas in which the dataset needed to be refined, and also informed quality assurance. The NMDS continues to be an evolving dataset, with data users very much driving the identification of any changes warranted that will ultimately improve the meaningfulness and robustness of the data.
Bringing us up-to-date, in 2025 we now have a total of 19 indicators, two years’ worth of data, and a national report in the public domain. And locally, each area's own data can tell them so much more about what is happening in their own patch.
What the ASP NMDS can tell us at a local level
The data can tell us about many things, as highlighted in the diagram below. This includes data about pathways into and through ASP, use of the legislation, as well as data about adults in the system, the support in place, workforce demands and the role of different partners. These roles also relate to key duties set out in the Adult Support and Protection (Scotland) Act 2007 (ASPA), including the duties to refer, co-operate and offer advocacy.
For our APC audiences, we also provide comparator data – supporting local areas to compare themselves with national levels and family groupings. Family groups are areas with similar profiles in terms of size and population, and are based on the Local Government Benchmarking Framework. This is not about saying one area is ‘better or worse’ than another, or that an increase or decrease is necessarily a good or a bad thing. Rather it is to spark professional curiosity, raise important questions to explore variance in practices, and support learning that will ultimately benefit adults at risk and assure areas they are doing ‘the right thing’.
We also know that data has to be used to be useful – so it requires an audience, and people to engage with it and spend time reflecting on it to identify implications for action. Statistical data, after all, is good at telling you about the what, but not the why.
Clearly, the data can also go on different journeys. Locally, the data will be reported to APCs – to inform reviews of practices, procedures and performance, or biennial reports to the Scottish Government on achievements, current issues and future plans. APCs also report to Chief Officer Groups (COGs) – accountable for multi-agency leadership, scrutiny and direction of Public Protection more generally – with ASP data appearing in some of their reports too. We have also heard from colleagues how the data is being used with practitioners – to support learning and improvement at practice level across agencies, sending the message that this is their data too!
Maximising its use
The NMDS indicators can be combined with any other local indicators that are collected, or with different types of data, to provide a fuller picture and support interpretation. This can include experiential data from experts by experience or practitioners, or environmental data to bring in the local context. For example, the building of a new housing estate or the loss of a major employer in the local area may have an impact. We therefore encourage people to see the ASP NMDS indicators as one part of a jigsaw.
The NMDS should also trigger deeper dives and further analysis locally. Local systems should allow this kind of manipulation of the data in ways that we can’t do nationally with the aggregate data received. Locally, you might, for example, want to zoom in on a particular client group to understand their referral pathways into ASP, or to see if different age bands experience different types of harm. You might want to run cross-tabulations to see what types of harm occur in what types of location, including those that happen online.
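To make the idea of a local deeper dive concrete, here is a minimal sketch, in Python with pandas, of the kind of cross-tabulation described above. The file name and column names (harm_type, location, age_band) are hypothetical placeholders rather than NMDS field names; what you can actually extract will depend on your own MIS.

import pandas as pd

# Load a record-level extract of ASP inquiries from the local MIS
# (hypothetical file and column names, for illustration only).
inquiries = pd.read_csv("asp_inquiries.csv")

# Cross-tabulate primary type of harm against location of harm,
# e.g. to see how often particular harms are recorded as happening online.
harm_by_location = pd.crosstab(inquiries["harm_type"], inquiries["location"])
print(harm_by_location)

# A second cut: do different age bands experience different types of harm?
# Normalising by row gives the proportion of each harm type within an age band.
harm_by_age = pd.crosstab(
    inquiries["age_band"], inquiries["harm_type"], normalize="index"
).round(2)
print(harm_by_age)

Either table can then be shared with the APC or with practitioners as a starting point for discussion, remembering that the numbers tell you the what, not the why.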
How it’s been used and been useful!
This, ultimately, is what matters. Here are some examples of its application, as told to us by ASP colleagues at a recent drop-in session…
Informing priorities
The ASP NMDS has provided a national picture of self-neglect for the first time, something that was missing from previous annual reports to the Scottish Government. This has helped to make self-neglect more visible and raised its profile, prompting greater focus on it nationally, with one APC also telling us that it had resulted in self-neglect becoming a key priority within its new strategic plan.
Monitoring the impact of initiatives…
The data has revealed more about the sources of ASP referrals, and who is playing a part. For example, one APC has seen an increase in ASP referrals from GPs and Primary Care – showing that the awareness-raising and training efforts of the NHS ASP team have paid off! The breakdown of different NHS categories as ASP referral sources has also been useful in targeting future activities of this nature.
In another local area, efforts have been made with care home staff to clarify what is or isn’t an appropriate ASP referral following incidents in the home, to ensure that ASP resources are best used and directed. Workshops and information-sharing activities have aimed to clarify when ASP visits, interviews or examinations (within local procedural timescales) would be needed, and where other pathways are more appropriate. The area has since seen a drop in ASP referrals from care homes for people with dementia or experiencing physical harm, which is attributed to this work. It shows how local triangulation and analysis of the data can be used to track and monitor the success of initiatives such as these.
Workforce matters
Some APCs have highlighted how the distinction in the data between inquiries with and without the use of investigatory powers has been helpful in assessing the learning, development and support needs of both council officers (afforded these powers) and second workers supporting inquiries. Furthermore, local analysis of which client groups and types of harm are more likely to require the use of investigatory powers can help APCs target training to teams serving those client groups.
Benchmarking for learning
APCs are very interested in whether the national data shows them to be ‘outliers’ compared with national levels or family groupings. To be clear, being an outlier isn’t necessarily a good or a bad thing, but exploring the reasons behind it prompts helpful self-evaluation: What’s the data telling us? Have we got our processes right? Can we provide assurance to ourselves and others that the reason we're doing A is because of X, Y and Z…?
The grey areas?
There is interest in cases where the adult meets the three-point criteria but the ASPA has not been used – was it a defensible decision not to use the legislation because alternative pathways were deemed better, or least restrictive, for the adult? Areas are keen to understand the reasons behind these decisions, to check they are doing the right thing, and to determine how the outcomes for adults compare if they are routed down a non-ASP pathway.
Some APCs are also trying to better understand what coercive control or undue pressure looks like, how to spot it, and how to apply trauma-informed principles in assessing someone’s ability or inability to safeguard themselves, as relevant to ASP and its three-point criteria for support. It’s a challenging ethical area, and use of the data can support local areas to better understand how their procedures and practices are identifying, supporting and protecting adults at risk.
Deeper dives
One area also reported that a deeper dive into the data showed that more men than women in a middle-age category were being referred to ASP – bucking the overall trend and providing a more nuanced picture! This reinforces the point that, locally, areas can do much more with their data than we can nationally.
New ways of using the data
A few areas, like Dumfries and Galloway, and Dundee, are looking at their ASP data alongside other Public Protection data, for example Alcohol and Drug Partnership data, and data on suicide prevention and gender-based violence. This is to help identify common themes or trends across different datasets and to move toward a more rounded, un-siloed understanding of risk and of the individuals at the centre. It’s also another opportunity to ask what happens to those who don’t meet the three-point criteria or are classified as ‘No Further Action’.
Answering these questions will depend not just on the data, but also on the audience making sense of it and of its implications for cross-cutting areas. Shared spaces, interest from senior leaders in holistic use of the data, and infrastructure that provides feedback loops between different practice areas can all help. Dumfries and Galloway, for example, has a joint Public Protection Committee (rather than separate ones for ASP or CP) and a shared data dashboard.
Further tips - but probably only for readers interested in creating a national minimum dataset themselves….
I offer a few practical reflections for your consideration, should you be interested in creating your own national minimum dataset!
Begin appreciatively - we began by carrying out a national audit to identify indicators that were frequently collected by APCs across Scotland.
Be aligned with up-to-date national guidance and policy, but don’t expect everyone’s timescales to match - When we began the project, we knew that ASP’s Code of Practice was being revised, and we needed to be aligned to that. It was published in July 2022, after our work was already underway, so be prepared for intersecting timescales to not always align with yours. We don’t live in an ideal world! Changes in a Code of Practice (COP) can also take time to work through. For example, the revised ASP COP posed challenges to ASP colleagues, in a move away from thinking of inquiries and investigations as two separate things, to focus on the S4 duty to inquire and clarity on the role of council officers. It provoked discussions on the use of S7-10 investigatory powers (afforded only to Council Officers), on where ‘visits’ or ‘interviews’ with an adult begin and end, as well as on what it is appropriate for a paraprofessional to do and what guidance, supervision and training is needed. The Inquiries, Investigations and Role of the Council Officer Subgroup was established to help work this through and has produced guidance to be ratified at the ASP Strategic Forum in March 2026.
More generally, we also looked for opportunities to ensure that our definitions matched up-to-date ones in intersecting policy areas and strategies, like palliative care, self-harm and exploitation. This may have implications for what is counted and seen in the data. Palliative care, for example, has moved away from an ‘end of life’ or ‘life expectancy’ approach, as it is not possible to predict with any validity when a person will die.
Be guided by the following basic principles - In plain and simple terms, three principles helped us decide which indicators to include and prioritise:
- what’s meaningful to collect – being clear on rationale/purpose.
- what’s feasible and achievable to collect and report on, and that is also comparable across all areas (comparable relates to shared definitions, categories and ways to count; it doesn’t mean that every area practises in the same way).
- what’s worth the return (is the effort required worth it in terms of the insights provided).
Ask how minimum or minimal you want your dataset to be - Initial sessions with learning partners considered findings from the national audit, but we also generated a much bigger wish list (of what might be included) that needed to be assessed and prioritised. Expect to feel a tension, or tug of war, both inside and outside the project, between those who want the dataset to do more and those who want to keep it small.
Maintain momentum and make decisions as you go - Learning partners in the first few years of the project met regularly, on average once a month online. Along the way, we also had to make hard decisions to drop certain indicators because they weren’t working, sometimes reluctantly. And of course, we didn’t all agree. It took time to unpack whether we wanted to include performance indicators like target timescales. In the end, there are no target timescales in the ASP NMDS, unlike the CP NMDS. Why? Our COPs are different. In ASP, target waiting times between key stages are set at the discretion of local areas; in CP they are set nationally. We also discussed the pros and cons of their inclusion, with some arguing that targets can drive unhelpful behaviours, contrary to person-centred approaches. It may, for example, be better to ‘miss’ a target timescale and delay a case conference meeting if it means the adult is better prepared or able to attend.
Provide space and support for national roll out and implementation, and continuously demonstrate the value of the data – don’t expect it all just to happen. As well as providing workbooks and guidance, let people know it’s coming. Be clear on why it’s useful and demonstrate its value by using it, discussing it and presenting it in ways people can engage with. Provide space and feedback loops at regular intervals so people can ask questions and share their challenges and tips, and you can respond accordingly.
Consider how much change people and systems can take at any one time - as mentioned in the ‘how we built it’ section above, we opted for a phased approach to roll out over two years. We also looked to remove duplication where we could to avoid overlapping data collections.
This is emotional labour – so be clear on why you are doing this and its value, to take people with you. Change doesn’t happen without taking people with you, working alongside them and fostering good relationships. I could provide you with the model-for-change diagram we used - it looks all neat and tidy. In reality, however, it’s much messier and more difficult. It was often two steps forward, one step back (or is that the other way around?). And just when you thought you had something cracked, you realised that you hadn’t! So tenacity, good humour, and supporting and appreciating your fellow travellers in what they bring are absolutely necessary, along with seeing everything as learning.
Coming into this work, I was worried it might be a little dry, given that datasets on the surface are about numerical inputs and outputs. However, this was most certainly not the case. I couldn’t have been more wrong! Behind every glossary definition lies deep exploration and unpacking of different practices and terms, and behind that, consideration of the best interests of adults at risk, and ensuring they are safe, supported and protected. For me, and for fellow travellers, this was our north star! So... never forget your north star.
With thanks
The ASP NMDS was very much co-designed, and is an example of how Iriss works alongside those in the sector. This approach draws on and appreciates everyone’s assets across the system, with a focus on nurturing good relationships and shared learning. Change doesn’t happen without people, so thanks to everyone for getting us this far!
- Our committed learning partners: Dumfries and Galloway, East Dunbartonshire, East and South Ayrshire and Renfrewshire Adult Protection Committee areas.
- Scottish Government colleagues in the ASP Team who have worked tirelessly to support and make it happen! Most notably Jamie Aarons (Professional Social Work Adviser) and Evelyn Shiel and Cate Neil from the Social Care Analytical Unit.
- Members of the multi-disciplinary ASP Data Reference Group.
- ASP colleagues, who have supported national implementation and roll out.
- Iriss colleagues on the project, notably Ian Phillip, who worked on the project until 2024, and Katie Feyerabend, who took over from there! Thanks again to both for their dedication, and for leading on user-friendly presentation of the data. Thanks also to Holly Smith for her ongoing administrative support with the Data Reference Group and ASP Leads Network.
- Alex McTier from CELCIS - who won’t know how much I appreciated him sharing his learning in creating the CP NMDS, calming my initial jitters and giving me the confidence to get started!