The British Association for Counselling and Psychotherapy (BACP) is warning of the increasing risks of children using artificial intelligence tools such as ChatGPT for mental health advice.
Its new research revealed that more than a third (38%) of therapists working with people under 18 have clients who seek mental health guidance from AI platforms. And nearly one in five (19%) therapists reported that children had received harmful mental health advice from these tools.
Therapists told the BACP that some AI tools provide potentially harmful and misleading information, including encouraging children to self-diagnose conditions such as ADHD and OCD, reinforcing avoidance behaviors and automatically validating their feelings regardless of what they express. There have also been tragic cases where AI tools have given dangerously wrong advice or even encouraged suicide. Therapists are particularly concerned about AI’s inability to provide real-time support or intervene in crisis situations.
Ben Kay, Director at BACP, the largest professional body for counselling and psychotherapy in the UK with more than 70,000 members, said:
“It is worrying that children are increasingly being forced to turn to AI chatbots like ChatGPT for mental health support, often unable to tell if the advice they are receiving is safe or even true. Some have already suffered devastating consequences. And this is likely only the tip of the iceberg, with many more children struggling in silence, without access to real treatment.”
“We want parents, carers and young people to understand that using AI for mental health support is not the easy, safe or quick fix it may seem; there are real risks, and it needs to be approached with caution. While AI is accessible and convenient, it cannot replicate the empathy, connection or safety of treatment provided by a real person, or the complex assessment a trained mental health professional carries out. Information shared with AI also does not have the same protections as therapy.
“Too many young people are turning to AI because they can’t get the mental health support they need. This is unacceptable. The Government needs to step up and invest now in real, professional treatment through the NHS, schools and community hubs. No young person should ever be forced to turn to a chatbot for help. AI may fill gaps, but it can never replace human connection or professionally trained therapists who listen.”
New research findings
The annual BACP Mindometer survey, which gathered responses from almost 3,000 practicing therapists across the UK, shows that more than a quarter (28%) of therapists – working with both adults and children – have had clients report receiving unhelpful mental health advice from AI. And nearly two-thirds (64%) of therapists said public mental health has worsened since last year, with 43% believing AI is contributing to that decline.
Senior Chartered Therapist Debbie Keenan, who works in a secondary school and has her own private practice in Chepstow, added:
“I’m definitely seeing more children and young people turning to AI for therapy advice and self-diagnosing conditions like ADHD and OCD. That’s a real concern for me. As advanced as AI is, it simply can’t do what a therapist does. It can’t tell if a child is distressed, dysregulated or at risk, and it can’t make sure they are safe before they leave my room – would AI do that?
“I am also concerned about the ongoing risk of isolating and disconnecting children from real human relationships – this can lead to an over-reliance on AI for emotional support and increase feelings of loneliness, making it harder to reach out for ‘real life’ support.
“I believe that children are increasingly turning to AI for therapy because it is available 24/7, feels non-judgmental and offers a sense of privacy. However, AI retains data, is not bound by ethical or confidentiality standards, and lacks regulation or accountability. While it may fill a gap when access to mental health support is limited, it cannot replace a trained mental health professional.”
Amanda MacDonald, a BACP registered therapist who supports children, adolescents and adults, said:
“Artificial intelligence therapy bots tend to take one of two approaches: offering validation or providing solutions. Both lack the nuance of actual therapy and can give advice that contradicts best practice for managing emotional distress. For example, some AI tools have advised people with OCD to continue their compulsions, mistaking short-term relief for progress. Others may encourage people to avoid stressful situations, reinforcing avoidance behaviors.
“There have also been well-documented, tragic cases where AI tools have given dangerously wrong advice or even encouraged suicide – results that are both devastating and deeply worrying.
“Parents and caregivers should be aware that their children may turn to AI for guidance and advice. While it’s important to have appropriate parental controls in place, open and honest communication at home is just as vital. Talk to your children with curiosity and share your concerns in an age-appropriate way.”
“Children and teens are not yet equipped to fully assess risk, so parents play a critical role in keeping them safe. Balancing privacy with safety is never easy, but without that balance, young people can become overly reliant on what is, ultimately, a very clever algorithm.
“Picking up their phones when they’re upset is natural for many young people, especially as AI tools seem supportive and validating. This creates a valuable opportunity for families to talk about their relationship with phones and technology. Parents can help by modeling healthy behavior – setting shared screen-free times and noticing when they instinctively reach for their own phones – so that technology doesn’t start to replace real human connection.”
References:
All figures come from the BACP’s annual Mindometer survey of its members. The total sample size was 2,980 therapists and the fieldwork took place between September 3 and 17, 2025. The survey was conducted online.
