Dr. Jeff Ditzell, DO, is CEO and Psychiatrist-in-Chief at Jeff Ditzell Psychiatry in New York City, where he provides highly responsive, compassionate care with flexible hours for busy adults. He specializes in adult ADHD, anxiety, treatment-resistant depression, addiction, and life optimization coaching, including telepsychiatry and innovative therapies. His background includes leading an inpatient dual diagnosis unit, providing services in psychiatric emergency departments, and directing an assisted community treatment group. A lieutenant colonel in the US Army, he has over 20 years of service, including duty as a Division Psychiatrist in Tikrit, Iraq. He completed fellowships in addiction psychiatry, psychosomatic medicine, and public psychiatry. More information here: www.jeffditzellpsychiatry.com.
Scott Douglas Jacobsen interviews Jeff Ditzell about why people turn to AI chatbots for conversation and mental health support. Ditzell argues that chatbots can fulfill basic needs for safety, validation, companionship and quick thought organization, especially amid widespread loneliness. He warns that ADHD, anxiety and addiction may increase vulnerability to unhealthy dependency if AI becomes a primary tool for emotional regulation, decision-making or reward-seeking, displacing human connection or clinical care. Adolescents may use AI to shape social identity, while adults may rely on it to function, risking reduced self-efficacy. Long-term use can shape users’ attachment patterns and their expectations of clinicians, and chatbots can subtly shape beliefs through bias or manipulation.
Scott Douglas Jacobsen: What basic psychological needs are people trying to fulfill by turning to AI chatbots for conversation?
Dr. Jeff Ditzell: One of the most fundamental psychological needs is connection with others. People turn to AI chatbots for conversation for many of the same reasons they would seek out a human to chat or connect with. People need to feel safe; we seek situations and environments where we can express ourselves without being judged or rejected. Ideally, a trusted friend or family member can provide that connection, but in the absence of such a person, some may turn to AI chatbots to fill that gap. People need to feel heard, understood and validated. AI is highly effective at mirroring emotional content and providing clear, empathetic reflections, which can give a person the experience of feeling understood and validated. AI can also be useful for organizing and clarifying thoughts: it can support brainstorming, problem-solving and decision-making in a direct, low-risk interaction. Unfortunately, loneliness is an epidemic in this country. For some people, AI may simply provide a sense of companionship and connection, reducing feelings of loneliness and isolation.
Jacobsen: Is there a parallel between ADHD, anxiety and addiction — and dependence on AI chatbots and other forms of digital or behavioral addiction?
Ditzell: ADHD, anxiety and addiction share some diagnostic symptoms and characteristics. People with these disorders can use AI in helpful, supportive, and healthy ways. However, there is also the potential for unhealthy dependence and misuse. This is especially true if AI is used as the primary tool to reduce or treat certain symptoms. People with these disorders who look to AI for emotional regulation, decision-making, or reward-seeking may come to rely too heavily on AI and fail to develop healthier, more effective coping mechanisms. Problems may be more likely to arise if the AI chatbot is used as a substitute for human connection or appropriate clinical support.
Jacobsen: If so, do they manifest differently in adolescents versus adults?
Ditzell: This can manifest differently between teenagers and adults in a number of ways. Both populations with these diagnoses may be vulnerable to increased reliance on AI chatbots, but the presentation, risks, and outcomes may differ. For example, teenagers and young adults may turn more to AI to build social identity, thereby negatively impacting appropriate social development. On the other hand, adults may use AI more as a coping tool to regulate emotions and increase functioning. This can lead to an over-reliance on AI and a weakening of self-efficacy and resilience.
Jacobsen: How can emotionally charged interactions with AI companions shape a person’s attachment patterns?
Ditzell: AI interactions can mimic the qualities of a personal relationship and, as a result, can affect how a person relates to others in real life. Depending on a person’s age and developmental stage, AI interactions can shape how they learn to trust, rely on, and relate to other people. Although these interactions could have positive effects in building secure attachment in the right environment, there are serious risks of creating or reinforcing anxious, avoidant, or disorganized attachment patterns.
Jacobsen: In what ways can AI chatbots reduce anxiety or depression symptoms or loneliness?
Ditzell: When used in moderation as a healthy coping tool, in conjunction with psychiatric treatment, AI chatbots can help reduce some anxiety or depressive symptoms. They can help provide clarity, structure and connection, along with emotional regulation and opportunities for cognitive reframing. They may also provide a sense of companionship that can help combat loneliness or social isolation, helping people feel less alone.
Jacobsen: Many people evaluate mental health concerns with AI before seeking a professional. What are the clinical risks of using chatbots — beyond the risk of content hallucinations?
Ditzell: Using AI as a first step in understanding mental health concerns and beginning to seek treatment can be helpful. However, AI should never be used as a substitute for professional medical or psychiatric care. AI is notorious for providing inaccurate or incomplete information. It can also become a dangerous substitute for human connection or support, and building a dependency on AI instead of taking decisive action can lead to increased risk and negative outcomes. AI should not be seen as an effective way to diagnose or treat any mental health problem.
Jacobsen: How might long-term reliance on artificial intelligence affect people’s trust in clinicians or therapists?
Ditzell: There is growing interest in using artificial intelligence to address mental health issues. This can be helpful if it helps reduce stigma or lower the barrier to seeking treatment. Interacting with AI chatbots can help some people practice vulnerability and emotional connection in a low-risk situation. This can begin to help build trust in human relationships, including therapeutic relationships. Alternatively, reliance on AI could also reduce trust by shaping unrealistic expectations of human relationships or preventing individuals from experiencing the meaningful benefits of real human connection.
Jacobsen: What are the concerns about chatbots subtly influencing users’ beliefs or political views?
Ditzell: AI chatbots can certainly influence individual beliefs or political opinions. This can happen in the same way that any type of social media, news organization, or even community group can shape personal beliefs and opinions. Chatbots can use misinformation, persuasion techniques, manipulation or bias to influence users. Without guardrails to monitor the spread of false information, AI can create a false sense of trust and authority. Chatbots can reinforce existing beliefs or biases or use emotional manipulation to target users with unmet social or relational needs. The use of AI chatbots can prevent or discourage dialogue between people and limit exposure to different opinions and ideas. This can also lead to reduced civic and political participation within communities or specific segments of the population.
Jacobsen: Thanks for the opportunity and your time, Jeff.
—
Scott Douglas Jacobsen is the publisher of In-Sight Publishing (ISBN: 978-1-0692343) and Editor-in-Chief of In-Sight: Interviews (ISSN: 2369-6885). He writes for The Good Men Project, International Policy Digest (ISSN: 2332-9416), The Humanist (Print: ISSN 0018-7399; Online: ISSN 2163-3576), Basic Income Earth Network (UK Registered Charity 1177066), A Further Inquiry, and other outlets. He is a member in good standing of several media organizations.
—
Photo by Aerops.com on Unsplash
