Whichever data source you look at, the statistics show a clear rise in the number of people in the UK reporting poor mental health, increasing anxiety, and growing loneliness, with the NHS struggling to keep pace with rising referrals and longer waiting times.
It’s no wonder, therefore, that people are seeking alternative support by turning to AI.
The rise in the capabilities and sheer prevalence of AI and LLMs (Large Language Models) is hard to ignore, and advanced chatbots such as ChatGPT and Claude have woven their way into the daily lives of tens of millions of people across the world.
And increasingly, people are not just using AI to help with the more mundane work-related tasks like data analysis or content creation, but also for decision-making and emotional support.
Accessible 24/7, 365 days a year through our phones and computers, AI feels immediate, and its near-instant responses keep our attention captive – AI can ‘type back’ far quicker than a human being.
And AI seemingly provides us with a sympathetic, non-judgemental ‘ear’, offering us validation and support that feels personal to us.
However, there have been more and more stories in the media about AI providing incorrect, and potentially harmful, advice – including cases where self-harm was actually encouraged. The reality is that AI chatbots are designed to maximise engagement, so they tend to be agreeable and validating rather than challenging, which means harmful thoughts can feel ‘justified’ by AI because it is effectively programmed to agree with you.
That said, I do understand that typing your feelings out, rather than saying them out loud, can feel safer – I used to be a volunteer at Shout, which is a text-based crisis support service, designed to support people who were struggling with a confidential, written form of assistance.
During my time there, I supported over 500 individuals through periods of overwhelm. I often worked a ‘night shift’, and many texters told me that messaging was easier because they didn’t want to wake anyone in their household, or didn’t want the people they were with to know they were struggling.
Often, working through some grounding practices with texters was extremely beneficial, gently moving them from a place of overwhelm to a calmer, more in-control state.
Going through the 5, 4, 3, 2, 1 technique or the Alphabet Game, for example, worked well: these exercises are designed to bring our focus back to our current environment, shifting attention away from anxious thoughts and worries, helping us feel more in control, and giving us a break from spiralling.
If you wish to use AI, it can also support you with techniques like these as immediate, in-the-moment help.
However, when we need to truly explore how we feel, especially if we are having challenging thoughts, or feel like our overall wellbeing is being impacted, human connection is vital.

