
How can we boost AI use in children’s social care?

The announcement that artificial intelligence (AI) will be used in education to help reduce teacher burnout and tackle pupil absence could revolutionise the sector. But in children’s social care, where is the impetus to explore the benefits of this technology?

The National Children & Adult Services Conference (NCAS) in November had nothing in the programme on AI in children’s social care, and a search of ADCS blogs did not find any mention of AI at all. It appears that concerns about the ethics of using AI are a barrier to exploring its potentially enormous benefits. But I believe that this roadblock is due to lack of clarity about where to start and where the journey will take children’s services, as opposed to a lack of interest. A recent blog by McKinsey, which is supporting the Coram Innovation Incubator’s upcoming Forum, states that ‘to capture gen AI’s full potential, companies must consider how the technology can redefine the way the organisation works’ and I suspect councils are having trouble answering this question. 

However, there are some exciting initiatives taking place as a growing number of local authorities across the country set up groups to look at how they could use AI. To bolster their progress, I offer a possible roadmap.

The first step is to encourage and support staff to use free public AI tools safely, and to develop an AI usage policy as soon as possible – staff are already using AI tools, so the absence of a policy increases the risk to councils. Practitioners must know how to use these tools carefully, treating them as a third party when deciding what information to share, but they should not be made to feel they are cheating by using them. Responsible use of AI tools should be a key learning objective for everyone – imagine the reaction if a staff member didn’t know how to use Google. The same will soon be true of AI tools.

Meanwhile, local authorities should consider providing social care staff with a private, all-purpose AI tool. This would allow them to enter confidential information and get help with tasks including itinerary planning, work prioritisation, writing emails and documents, and data analysis.

Next, they should introduce specialised, standalone, private AI tools that are very good at the specific task they have been built for (the effectiveness of all-purpose or public tools has its limits). Policy Buddy, which our partners at Engine built for North Yorkshire Council, is a generative AI tool that can provide quick, detailed answers to questions on policy and procedures. As it is trained on the council’s policies and procedures, it won’t accidentally retrieve irrelevant information from elsewhere.

As the organisation and its staff become more adept at using AI, more complicated uses can be explored, such as purpose-built tools that integrate with other systems, to support social work. North Yorkshire used funding from the Department for Education to build a prototype that can automatically generate relationship maps and chronologies for a child based on the content of case notes. The same technology could also be used to automate the creation of case summaries or redact confidential information. Other initiatives include Beam’s pilot of Magic Notes with several councils to see how much it can alleviate the administrative burden of case recording, and Sentinel Partners supporting local authorities to combine data from multiple sources and create a composite record for each child so that more informed decisions can be made.

There are also possibilities that go beyond one organisation. In one locality, the Coram Innovation Incubator is looking at creating an AI-powered service directory that would help people find a relevant support service by telling it what the issue is or what support they need, rather than trying to guess what specific wording to use, as with traditional internet search engines. Machine learning could potentially provide insights into the effectiveness of interventions by analysing records at a large scale or finding previously unidentified factors that influence how well a looked after child will settle in with a family.

To explore all the future possibilities, we must first understand the current state of AI integration and innovation in children’s social care. Coram has launched a survey to assess how deeply these innovations are embedded within organisations, identify key challenges faced in implementing such innovations, and determine what support would be most beneficial in fostering a culture of innovation in this sector. The survey, which can be accessed here, closes on 31st January. 

AI is not a panacea and will always need human intervention. Its application needs to be a collaborative effort, drawing together experts from across sectors to mitigate risks and improve outcomes for children and families. But the benefits of automating time-consuming processes to allow social workers to spend vital relationship-building time with children and families, as well as using AI to extract insight which could deliver important benefits in safeguarding and prevention, are too significant to ignore. The Children’s Services Innovation Forum takes place on 27th February.

This article was written by Kevin Yong, head of Coram-i – an organisation established by Coram whose experts work with local authorities to create better care plans for children.

