Impacts of AI chatbots on mental illness – new paper

UK researchers warn that the ‘agreeableness’ and ‘adaptability’ of AI systems used for emotional support and companionship present a range of risks to mental health 

‘My partner has been working with ChatGPT,’ said a poster on a Reddit forum recently, ‘to create what he believes is the world’s first truly recursive AI that gives him answers to the universe. He says with conviction that he is a superior human now.’

Photo by Julien Tromeur / Unsplash

Understandably concerned, the poster read the output from ChatGPT and said it wasn’t ‘doing anything special or recursive but it is talking to him as if he is the next messiah.’

With advice from others on the forum, the original poster was able to get help – but many others shared their own experiences of strange and extreme behaviour from people using AI chatbots. That matches concerning reports elsewhere in the media, ranging from delusional thinking to violence and suicide.

This is troubling, not least because – as the UK-based authors of a new academic paper say – millions of people now use AI chatbots for emotional support and companionship. This, say the authors, is understandable given wider issues of social isolation and constrained mental health services. They even acknowledge reported psychological benefits of using such chatbots, including increased happiness and reduced suicidal ideation.

But they also cite studies that have found negative impacts such as increased loneliness, reduced socialisation and increased dependence on the AI system, and they refer to ‘edge-cases’ of severe mental health crisis. They seek to understand the underlying causes of these harms and suggest ways to address them.

One issue is that while AI systems specifically designed for use in mental health contexts are strictly regulated, general-purpose chatbots – such as the popular ChatGPT and Claude – are not marketed as mental health tools and so are not subject to these rules.

But the authors also explore why AI systems can have such a potent effect on users. Some faults with AI – such as biases, hallucinations or failures to understand pragmatic language – are well known. The authors argue that, beyond these, we need to look at the broader interaction between the cognitive and emotional biases of people seeking help and the way chatbots behave: their agreeableness (or sycophancy) and their adaptability (through in-context learning).

In effect, this interaction creates a feedback loop that can exacerbate existing mental health problems.

The paper explores the subject in some depth, covering ‘chatbot-induced belief destabilization and dependence, owing to altered belief-updating, impaired reality-testing, and social isolation.’ It then recommends coordinated action across clinical practice, AI development and regulation.

The authors of the paper are: Sebastian Dohnány (Department of Psychiatry, University of Oxford), Zeb Kurth-Nelson (Max Planck UCL Centre for Computational Psychiatry and Ageing, University College London), Eleanor Spens (Nuffield Department of Clinical Neurosciences, University of Oxford; Oxford Health NHS Foundation Trust), Iason Gabriel (School of Advanced Study, University of London), Christopher Summerfield (Department of Experimental Psychology, University of Oxford; School of Advanced Study, University of London), Murray Shanahan (Department of Computing, Imperial College London) and Matthew M Nour (Department of Psychiatry, University of Oxford; Max Planck UCL Centre for Computational Psychiatry and Ageing, University College London; Early Intervention in Psychosis Team, Oxford Health NHS Foundation Trust).

In related news:

Care vacancy rates return to pre-pandemic levels, but war isn’t over yet

Calls for financial support amid rising elderly single households

Interview: XR driving school for autistic people

Simon Guerrier
Writer and journalist for Infotec, Social Care Today and Air Quality News