Health and Social Care Delivery Research (Jun 2023)

New and emerging technology for adult social care – the example of home sensors with artificial intelligence (AI) technology

  • Glasby Jon
  • Litchfield Ian
  • Parkinson Sarah
  • Hocking Lucy
  • Tanner Denise
  • Roe Bridget
  • Bousfield Jennifer

DOI
https://doi.org/10.3310/HRYW4281
Journal volume & issue
Vol. 11, no. 09

Abstract


Background Digital technology is a focus within the NHS and social care as a way to improve care and address pressures. Sensor-based technology with artificial intelligence capabilities is one type of technology that may be useful, although there are gaps in evidence that need to be addressed.

Objective This study evaluates how one example of a technology using home-based sensors with artificial intelligence capabilities (pseudonymised as ‘IndependencePlus’) was implemented in three case study sites across England. The focus of this study was on decision-making processes and implementation.

Design Stage 1 consisted of a rapid literature review, nine interviews and three project design groups. Stage 2 involved qualitative data collection from three social care sites (20 interviews), and three interviews with technology providers and regulators.

Results
  • It was expected that the technology would improve care planning and reduce costs for the social care system, aid prevention and responses to needs, support independent living and provide reassurance for those who draw on care and their carers.
  • The sensors were not able to collect the data needed to deliver the anticipated benefits. Several technological aspects of the system reduced its flexibility and were complex for staff to use.
  • There appeared to be no systematic decision-making process for deciding whether to adopt artificial intelligence. In its absence, a number of contextual factors influenced procurement decisions.
  • Incorporating artificial intelligence-based technology into existing models of social care provision requires alterations to existing funding models and care pathways, as well as workforce training.
  • Technology-enabled care solutions require robust digital infrastructure, which is lacking for many of those who draw on care and support.
  • Short-term service pressures and a sense of crisis management are not conducive to the culture that is needed to reap the potential longer-term benefits of artificial intelligence.

Limitations Significant recruitment challenges (especially regarding people who draw on care and carers) were faced, particularly in relation to pressures from COVID-19.

Conclusions This study confirmed a number of common implementation challenges, and adds insight into the specific decision-making processes for a technology that has been implemented in social care. We have also identified issues related to managing and analysing data, and to introducing a technology focused on prevention into an environment which is focused on dealing with crises. This has helped to fill gaps in the literature and share practical lessons with commissioners, social care providers, technology providers and policy-makers.

Future work We have highlighted the implications of our findings for future practice and shared these with case study sites. We have also developed a toolkit for others implementing new technology into adult social care based on our findings (https://www.birmingham.ac.uk/documents/college-social-sciences/social-policy/brace/ai-and-social-care-booklet-final-digital-accessible.pdf). As our findings mirror the previous literature on common implementation challenges and a tendency of some technology to ‘over-promise and under-deliver’, more work is needed to embed findings in policy and practice.

Study registration Ethical approval was granted by the University of Birmingham Research Ethics Committee (ERN_13-1085AP41, ERN_21-0541 and ERN_21-0541A).
Funding This project was funded by the National Institute for Health and Care Research (NIHR) Health Services and Delivery Research programme (HSDR 16/138/31 – Birmingham, RAND and Cambridge Evaluation Centre).

Plain language summary

Our aim Social care is facing pressures due to a lack of funding and staff and COVID-19. One way to ease pressures is by using digital technology. We looked at a technology that places sensors around people’s homes to monitor changes in daily activity, including how this technology was brought into social care and how it works.

What we did We reviewed evidence and spoke with experts (including people who draw on care and support) to finalise the study design. We then interviewed people from social care organisations, carers, technology developers and regulators.

What we found
  • Organisations expected the technology to do a lot, including preventing illness, assessing needs, supporting independent living, reassuring people drawing on care (and their carers) and saving money.
  • Some social care decision-makers may not have the skills and understanding needed to make decisions about the use of new technology, and lacked a strategic approach to decision-making.
  • It was difficult to collect the data needed to use the sensors correctly, which meant the technology did not meet expectations.
  • Care staff were trained on how to use the sensors, although many struggled to make sense of the data they collected.
  • Social care is often focused on dealing with a crisis, rather than preventing one. This means a culture change is needed to use the sensors properly.

What it means Our research confirmed challenges in using new technology in social care. We also found new problems, such as dealing with large amounts of health data, asking care staff to use this information without enough training, and introducing a technology focused on prevention into an environment which is focused on dealing with crises. Our findings have helped to fill gaps in knowledge and will let us share practical learning with those introducing new technology in social care.

Scientific summary

Background The social care system in England is under significant pressure, with funding gaps and workforce challenges that make it difficult to keep up with rising demands. Digital technology may be one way to improve care and address pressures in the social care system, and has been a major focus within the NHS. Sensor-based technology with artificial intelligence (AI) capabilities is one type of technology that may be useful in some contexts. There is evidence to suggest that this type of digital technology can potentially improve some aspects of care and care planning, although there are key gaps in evidence that need to be addressed. Interest from policy-makers, commissioners and care providers in developing and using new and emerging technology for social care makes it a current priority for evaluation. In 2019, an NIHR-funded national prioritisation exercise was held, during which organisations and individuals with knowledge of adult social care and social work identified the top innovations which would benefit from a rapid evaluation, several of which related to new and emerging digital technology.
Objectives This study builds on the prioritisation exercise by evaluating how one example of new and emerging digital technology (a system of sensors with AI, which we have anonymised using the name ‘IndependencePlus’) was implemented in sites across England. The study seeks to answer the following core research questions:
  1. How do commissioners and providers decide to adopt new and emerging technology for adult social care? (Decision-making)
  2. When stakeholders (local authorities and care providers, staff, people who draw on care and support, and carers) start to explore the potential of new and emerging technology, what do they hope it will achieve? (Expectations)
  3. What is the process for implementing technological innovation? (Implementation)
  4. How is new and emerging technology for adult social care experienced by people who draw on care and support, carers and care staff? (Early experiences)
  5. What are the broader barriers to and facilitators of the implementation of new and emerging technology in addressing adult social care challenges? (Barriers and facilitators)
  6. How has the COVID-19 pandemic influenced responses to the questions above? (Impact of COVID-19)
  7. How can the process of implementing new technology be improved? (Making improvements)

Methods and limitations This study was conducted across two stages. Stage 1 consisted of scoping work to better understand new and emerging digital technologies for social care, with a focus on home sensors with AI technology, and the challenges and lessons learnt from previous research and evaluation efforts. This consisted of a pragmatic, rapid scan of the literature, nine key informant interviews, three online project design groups (with 11 participants) and selection of potential case study sites. The scoping work in stage 1 informed the design of stage 2 by providing insight into which themes to consider in the study’s research questions and how best to collect data from key stakeholders.

Stage 2 was an evaluation of new and emerging digital technology, using the example of IndependencePlus, through qualitative data collection and analysis from three case study sites. Case study sites consisted of two local authorities and one care provider that had used IndependencePlus. The case studies involved interviewing 20 key stakeholders (decision-makers and operational leads, care staff and carers), reviewing documents from case study sites and, finally, analysing and synthesising findings across sites. This was supplemented by three interviews and focus groups with care technology providers/developers and national regulatory/other bodies.

Significant recruitment challenges were faced during stage 2 of the research, particularly related to the pressures imposed by COVID-19 on social care, as well as a range of other factors. This had significant implications for our research, particularly the recruitment of people who draw on care and support, carers and care staff. We were unable to recruit any individuals who draw on care and support, and in one site we were unable to interview any care staff or carers. Despite this, we still believe that the insights presented here are helpful to inform future practice.

Results The scoping stage of this study (stage 1) informed the approach to stage 2.
Available evidence in the published literature points to key issues in the uptake of digital technology in social care, including: a lack of information to support decision-making processes around technology; gaps in how front-line staff and people who draw on care and their carers are consulted in relation to digital technologies; and varied expectations around what digital technology is expected to accomplish within social care. Information from the interviews and project design workshops also fed into stage 2 in terms of which themes to explore (e.g. potential benefits of technology, ethical considerations, implications for front-line staff, the impact of COVID-19), how best to engage stakeholders, and which frameworks could help position evaluation findings [e.g. the NASSS (adoption/non-adoption, abandonment, scale-up, spread, sustainability) framework].

The fieldwork in stage 2 of this study formed the main portion of the evaluation. In relation to the expectations of those adopting AI in social care, there was, perhaps unsurprisingly for new and emerging technology, a lack of understanding of exactly what AI-based technology can provide, which led to a broad range of (potentially unrealistic) anticipated benefits. The expected benefits included: increasing preventative care; improving assessments and diagnoses; supporting independent living; providing reassurance for those who draw on care and support and their carers; and conserving resources and reducing costs. Overall, participants felt that the sensors were not sufficiently stable or effective to collect reliable data over the necessary period of time to deliver these anticipated benefits.

There appeared to be a lack of systematic decision-making processes in deciding whether to adopt AI and, in their absence, a number of contextual factors influenced procurement decisions, including perceived pressure from central bodies to invest in technology-enabled care solutions and a more general belief in the capabilities of technology. The identification and exploration of options and alternatives often appeared ad hoc in nature and frequently relied on word of mouth and/or a relatively superficial appraisal of the suitability of the digital technology, informed by the quality of the sales pitch and the aesthetic characteristics of the hardware. There may be scope for the decision-making process to be more structured and, where broad-ranging changes are envisaged, more strategic in nature.

None of the sites appeared to have a protocol describing the process of implementing and evaluating new digital technologies. Training was provided for staff, though when delivered directly by the sensor provider there were reports that it failed to appropriately engage care staff. Because of the perceived lack of consultation with care staff during procurement, issues arose in finding a cohort suitable to pilot the technology. In addition, careful consideration is needed in gathering informed consent from people who draw on care and support, especially where people have cognitive impairments (who, ironically, might be the very group intended to benefit most from digital technology).

In terms of early experiences of using IndependencePlus, the instability of the sensors (not producing data on a continuous basis for long enough to be consistently meaningful) meant there was no evidence of the expected cost benefits from a reduction in direct staff contact.
The fact that the sensors were single-use devices further limited any potential savings. Though the data gathered by the sensors were potentially useful, question marks remained over their reliability and precision (making it difficult to rely solely on the system for the safety of those who draw on care and support). Several technological aspects of the system reduced its flexibility: for example, it could only be installed by the provider and was perceived as difficult to maintain and update independently. The complexity of the user interface and the volume of data it produced were also felt to overwhelm staff.

Incorporating AI-based technology into existing models of social care provision requires alterations to existing funding models and care pathways, as well as concerted training to increase the digital literacy of the workforce. New and emerging technology-enabled care solutions require a robust digital infrastructure, which is lacking for many of those who draw on care and support. Short-term service pressures and a sense of crisis management are not conducive to the kind of culture and approach that might be needed to reap the potential longer-term benefits promised by AI-enabled technology.

The onset of COVID-19 part way through the implementation of IndependencePlus led to a shift in priorities, with some sites abandoning the pilot as the focus shifted to managing the effects of the pandemic. For those sites that persevered with the pilot, observing the regulations around social distancing delayed the roll-out of the technology, predominantly because installation of the sensors was interrupted. However, the pandemic did prompt some sites to reflect on the potential benefit of such digital and AI-based technologies employing remote monitoring in future pandemics.

Conclusions In analysing and interpreting our findings, we draw on two implementation frameworks: the ‘rational model’ of policy implementation and the NASSS framework. We use these frameworks to set out practical recommendations for implementing new and emerging technology in social care. We identified a number of gaps in the implementation of new and emerging digital technology in social care, relating to: identifying the problem; establishing/weighing decision criteria; generating/evaluating/choosing the best alternatives; implementing the decision; and evaluating the decision. We outline questions that might be helpful when exploring the potential for implementing new and emerging digital technology in relation to each of these aspects. While both the rational model and the NASSS framework are helpful for structuring and summarising the findings of this evaluation, their use should not be at the expense of room for sites to experiment with something where they might not know the likely outcome in advance, and to learn by doing.

Overall, the research confirmed a number of common implementation challenges, but also adds early insights into newer themes, such as the volume and complexity of data and the subsequent analytical burden on untrained staff, or the challenges of implementing AI-based technology which tries to establish a long-term picture of someone’s routine in a system where interventions are often short-term and crisis-focused. In future, further research might explore the implementation of other AI-based technologies and their introduction in a larger number of sites. This work should aim to include the experiences of people who draw on care and support, carers and front-line staff.
Future studies could also learn from successful implementation, so that we know more about the key success factors as well as the barriers. Crucially, there may be scope to provide additional implementation support so that pilots are not ‘set up to fail’ as a result of known implementation challenges. Practical support to build evaluation into such processes could also be valuable, not least in terms of helping to clarify and then measure desired outcomes.

Despite significant challenges in conducting research in such a difficult context (particularly in relation to the recruitment of people who draw on care and support, and carers), a series of very clear and significant themes emerged, the study remains highly topical, and the approach adopted has helped to produce a series of tangible and significant findings. This has helped to fill some of the key gaps in the literature and to share practical lessons learnt with commissioners, providers, technology providers and policy-makers – especially at a time when technology has been so prominent during the pandemic and in recent government policy. Given that this study confirms so many common implementation barriers, this focus on sharing and embedding lessons learnt in order to help future implementation feels crucial – otherwise we run the risk that future studies of new and emerging technology will simply report similar challenges once again.

Study registration Ethical approval was granted by the University of Birmingham Research Ethics Committee (ERN_13-1085AP41, ERN_21-0541 and ERN_21-0541A).

Funding This project was funded by the National Institute for Health and Care Research (NIHR) Health Services and Delivery Research programme (HSDR 16/138/31 – Birmingham, RAND and Cambridge Evaluation Centre).
