AI chatbots are increasingly being used to make decisions, but experts warn they are also becoming a go-to source for companionship and emotional support.
Almost half of adults in the UK report feeling lonely, with one in ten people experiencing chronic loneliness – feeling lonely “often or always”.
Given the high number of people battling loneliness, it is not surprising that some are seeking alternative sources of companionship and emotional support, the authors of a report published in the British Medical Journal (BMJ) said.
They added that there is a “worrying possibility we might be witnessing a generation learning to form emotional bonds with entities that lack capacities for human-like empathy, care, and relational attunement”.
The report, written by Dr Susan Shelmerdine at Great Ormond Street Hospital for Children and consultant psychiatrist Matthew Nour, highlighted the risks and benefits of using AI to tackle loneliness and called for studies to explore the risks of human-chatbot interactions.

Among younger people, one study found a third of teenagers use AI companions for social interaction, with one in ten reporting that the AI conversations are more satisfying than human conversations, and one in three reporting that they would choose AI companions over humans for serious conversations.
Another study, carried out by charity the Youth Endowment Fund (YEF), found a quarter of teenagers in the UK have turned to AI chatbots for mental health support in the last year.
The authors of the BMJ report proposed that clinicians ask patients about their chatbot use to find out whether it is problematic, particularly during holiday periods when vulnerable populations are most at risk, followed if necessary by questions assessing compulsive use patterns, dependency and emotional attachment.
However, the authors do acknowledge that AI chatbots may have benefits in improving accessibility and support for individuals experiencing loneliness.
Psychologists agree that while AI can be helpful for suggesting what to do if you are lonely, it should not replace human support.
“Replacing our social circle and real-life opportunities with a chatbot is a frightening prospect for a young person’s emotional wellbeing,” chartered psychologist Dr Audrey Tang told The Independent.
“Social media and other forms of technology can simulate company, but it is not the fundamental connection that many humans desire, or is healthy in the long term.”
The president of the British Psychological Society, Dr Roman Raczka, warned that “AI is not a silver bullet”, especially when it comes to cutting mental health waiting lists.
“AI offers many benefits to society, but it should not replace the human support essential to mental health care. Instead, tools like chatbots should be used to complement existing services which help those who require mental health support,” Dr Raczka said.
“AI cannot replicate genuine human empathy, and there is a risk it creates an illusion of connection rather than meaningful interaction. Concerns also remain about data privacy and the dangers of becoming overly dependent on technology. That said, when used appropriately, AI can offer an anonymous, judgment-free space that’s accessible 24/7. This could be a useful addition to existing in-person mental health services.”