ChatGPT Health fails critical emergency and suicide safety tests

By healthtost | February 24, 2026

ChatGPT Health, a widely used artificial intelligence (AI) tool that provides health guidance directly to the public — including advice on how to seek emergency medical care — may fail to properly direct users to emergency care in a significant number of serious cases, according to researchers at the Icahn School of Medicine at Mount Sinai.

The study, fast-tracked in the February 23, 2026, online issue of Nature Medicine (https://doi.org/10.1038/s41591-026-04297-7), is the first independent safety assessment of the LLM-based tool since its launch in January 2026. It also identified serious concerns about the tool’s safeguards against suicidal ideation.

“LLMs have become patients’ first port of call for medical advice, but in 2026 they are least reliable at the clinical margins, where judgment separates missed emergencies from unnecessary alarm,” says Isaac S. Kohane, MD, PhD, Chair of the Department of Biomedical Informatics at Harvard Medical School, who was not involved in the research. “When millions of people use an AI system to decide if they need emergency care, the stakes are extremely high. Independent evaluation should be routine, not optional.”

Within weeks of its launch, ChatGPT Health’s maker, OpenAI, reported that about 40 million people use the tool daily to seek health information and guidance, including advice on whether to seek emergency or urgent care. At the same time, the researchers say, there was little independent evidence about how safe or reliable its advice actually was.

“This gap prompted our study. We wanted to answer a very basic but critical question: if someone is experiencing a real medical emergency and reaches out to ChatGPT Health for help, will it clearly tell them to go to the emergency room?”


— Ashwin Ramaswamy, MD, lead author and Instructor in Urology at the Icahn School of Medicine at Mount Sinai

Regarding suicide risk alerts, ChatGPT Health was designed to direct users to the 988 Suicide and Crisis Lifeline in high-risk situations. However, the researchers found that these alerts appeared inconsistently, sometimes triggering in lower-risk scenarios while, alarmingly, failing to appear when users described specific plans to self-harm.

“This was a particularly surprising and disturbing finding,” says senior and co-corresponding author of the study Girish N. Nadkarni, MD, MPH, Barbara T. Murphy Chair of the Windreich Department of Artificial Intelligence and Human Health, Director of the Hasso Plattner Institute for Digital Health at Mount Sinai, and Chief AI Officer of the Mount Sinai Health System. “While we expected some variability, what we observed went beyond inconsistency. The system’s alerts were inversely related to clinical risk, appearing more reliably in lower-risk scenarios than in cases where someone shared how they intended to harm themselves. In real life, when someone talks about exactly how they will harm themselves, that is a sign of more immediate, not less serious, risk.”

As part of the evaluation, the research team created 60 structured clinical scenarios covering 21 medical specialties. Cases ranged from minor conditions suitable for home care to true medical emergencies. Three independent physicians determined the correct level of urgency for each case using guidelines from 56 medical societies.

Each scenario was tested under 16 different contextual conditions, including variations in race, gender, social dynamics (such as someone minimizing symptoms), and barriers to care, such as lack of insurance or transportation. In total, the team conducted 960 interactions with ChatGPT Health and compared its recommendations to the consensus of doctors.
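The evaluation design described above amounts to a full factorial grid: every scenario crossed with every contextual condition. A minimal sketch of that structure (illustrative only, not the authors’ code; the scenario and condition identifiers are hypothetical placeholders):

```python
from itertools import product

# 60 physician-authored clinical scenarios x 16 contextual conditions
# (race, gender, symptom minimization, insurance/transport barriers, etc.)
scenarios = [f"scenario_{i:02d}" for i in range(1, 61)]    # hypothetical IDs
conditions = [f"condition_{j:02d}" for j in range(1, 17)]  # hypothetical IDs

# Full cross-product yields the 960 interactions reported in the study
test_cases = list(product(scenarios, conditions))
print(len(test_cases))  # 960
```

Each resulting (scenario, condition) pair corresponds to one interaction with the chatbot, whose recommendation is then compared against the physician consensus for that scenario.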

Across these physician-developed scenarios, the researchers found that while the tool generally handled clear emergencies correctly, it downplayed more than half of the cases that doctors judged to require urgent care.
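Scoring a recommendation against physician consensus reduces to comparing positions on an ordinal urgency scale: recommending a lower level than consensus is under-triage (the dangerous direction), a higher level is over-triage. A sketch under assumed level names, which are illustrative and not taken from the paper:

```python
# Ordinal urgency levels, lowest to highest (hypothetical scale)
LEVELS = ["home_care", "routine_visit", "urgent_care", "emergency"]
RANK = {level: i for i, level in enumerate(LEVELS)}

def triage_error(model_level: str, consensus_level: str) -> str:
    """Classify a model recommendation against the physician consensus."""
    diff = RANK[model_level] - RANK[consensus_level]
    if diff < 0:
        return "under-triage"  # downplaying urgency: the dangerous direction
    if diff > 0:
        return "over-triage"   # unnecessary alarm
    return "correct"

# e.g. recommending home care when consensus calls for urgent care:
print(triage_error("home_care", "urgent_care"))  # under-triage
```

The asymmetry matters clinically: over-triage wastes resources, but under-triage is what the study flags as the failure mode in more than half of urgent cases.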

The researchers were also struck by how the system failed in medical emergencies. The tool often recognized dangerous findings in its own explanations, yet still reassured the patient.

“ChatGPT Health performed well in textbook emergencies like stroke or severe allergic reactions,” says Dr. Ramaswamy. “But it struggled in more nuanced situations where the risk is not immediately obvious, and these are often the situations where clinical judgment matters most. In an asthma scenario, for example, the system identified early warning signs of respiratory failure in its explanation, but recommended waiting rather than seeking emergency treatment.”

The study’s authors advise that for worsening or worrying symptoms, including chest pain, shortness of breath, severe allergic reactions, or changes in mental status, people should seek medical attention directly rather than relying solely on the chatbot’s guidance. In cases involving thoughts of self-harm, people should contact the 988 Suicide and Crisis Lifeline or go to an emergency department.

However, the researchers stress that the findings do not suggest that consumers should abandon AI health tools altogether.

“As a medical student in training at a time when AI health tools are already in the hands of millions, I see them as technologies that we must learn to carefully integrate into care, not substitutes for clinical judgment,” says Alvira Tyagi, a first-year medical student at the Icahn School of Medicine at Mount Sinai and second author of the study. “These systems are changing rapidly, so part of our training now must consider learning how to critically understand their results, identify where they fall short, and use them in ways that protect patients.”

The study evaluated the system at a single time point. Because AI models are updated frequently, performance can change over time, underscoring the need for independent evaluation, the researchers say.

“Beginning medical education alongside tools that evolve in real time makes it clear that today’s results are not static,” says Ms. Tyagi. “This reality requires ongoing review to ensure that improvements in the technology translate into safer care.”

The team plans to continue evaluating updates to ChatGPT Health and other consumer-facing AI tools, expanding future research into areas such as pediatric care, drug safety, and non-English language use.

The paper is titled “The performance of ChatGPT Health in a structured trial of triage recommendations.”

The authors of the study, as reported in the journal, are Ashwin Ramaswamy, MD, MPP; Alvira Tyagi, BA; Hannah Hugo, MD; Joy Jiang, PhD; Pushkala Jayaraman, PhD; Mateen Jangda, MSc; Alexis E. Te, MD; Steven A. Kaplan, MD; Joshua Lampert, MD; Robert Freeman, MSN, MS; Nicholas Gavin, MD, MBA; Ashutosh K. Tewari, MBBS, MCh; Ankit Sakhuja, MBBS, MS; Bilal Naved, PhD; Alexander W. Charney, MD, PhD; Mahmoud Omar, MD; Michael A. Gorin, MD; Eyal Klang, MD; and Girish N. Nadkarni, MD, MPH.

Source:

Mount Sinai Health System

Journal Reference:

Ramaswamy, A., et al. (2026). The performance of ChatGPT Health in a structured trial of triage recommendations. Nature Medicine. DOI: 10.1038/s41591-026-04297-7.
