BravenNow

'How are you using AI?' Your therapist should ask you that question, experts argue

#AI chatbots #therapy #mental health assessment #JAMA Psychiatry #clinical guidelines #digital mental health #patient safety

📌 Key Takeaways

  • Mental health experts formally recommend that therapists ask patients about AI chatbot use.
  • The recommendation, published in JAMA Psychiatry, likens the question to assessing sleep or substance use.
  • AI interactions can significantly affect a patient's mental state, creating risks or benefits a therapist needs to know about.
  • The goal is to help clinicians provide holistic, informed, and safe care in the digital age.

📖 Full Retelling

A team of mental health experts and researchers published a paper in the prestigious medical journal JAMA Psychiatry on January 15, 2025, arguing that therapists and psychiatrists should routinely ask patients about their use of artificial intelligence chatbots. The authors contend that this inquiry is now as clinically essential as asking about sleep, diet, or substance use, because AI interactions can significantly influence a patient's mental state, relationships, and treatment outcomes. The paper serves as a formal call to integrate the question into standard clinical assessments.

The recommendation stems from the rapid proliferation of AI companions and therapy-like chatbots, such as those offered by major tech companies, which millions of people now use for emotional support, advice, or casual conversation. The authors stress that these interactions are not neutral: they can reinforce harmful thought patterns, provide poor or dangerous advice, or foster unhealthy dependencies without any clinical oversight. Conversely, some patients may use AI tools beneficially as a supplement to therapy, which makes it crucial for providers to understand this dimension of a patient's life. Ignoring it, the authors argue, leaves therapists with an incomplete picture, akin to treating someone without knowing their medication history.

The proposal reflects a broader, urgent need for the mental health field to adapt to a world where AI is embedded in daily life. The paper outlines frameworks for asking the question non-judgmentally and for interpreting the answers clinically. For instance, a patient's disclosure that they use a chatbot to ruminate on negative thoughts would be a critical risk factor, while using one for mindfulness exercises might be seen as a positive coping strategy. The experts emphasize that the goal is not to discourage technology use but to ensure therapists can provide informed, holistic, and safe care in the digital age.

๐Ÿท๏ธ Themes

Mental Health, Artificial Intelligence, Clinical Practice

📚 Related People & Topics

Chatbot

Program that simulates conversation

A chatbot (originally chatterbot) is a software application or web interface that converses through text or speech. Modern chatbots are typically online and use generative artificial intelligence systems that are capable of maintaining a conversation with a user in natural language and simulating the way a human would behave as a conversational partner.

JAMA Psychiatry

Academic journal

JAMA Psychiatry (until 2013: Archives of General Psychiatry) is a monthly peer-reviewed medical journal published by the American Medical Association. It covers research in psychiatry, mental health, behavioral sciences, and related fields. The journal was established as Archives of Neurology and Psychiatry.

Entity Intersection Graph

Connections for Chatbot:

๐ŸŒ Natural language processing 2 shared
๐ŸŒ April Fools' Day 1 shared
๐Ÿข OpenAI 1 shared
๐ŸŒ ChatGPT 1 shared
๐ŸŒ Query rewriting 1 shared

Deep Analysis

Why It Matters

This news is critical because it represents a necessary evolution in clinical practice to address the growing integration of AI in daily life. It affects the millions of individuals currently using AI companions for emotional support, ensuring that their mental health providers have a complete understanding of their digital interactions. By identifying AI usage, therapists can better mitigate risks like unhealthy dependencies or harmful advice while leveraging potential benefits. This shift aims to ensure patient safety and the efficacy of mental health treatments in the digital age.

Context & Background

  • The use of AI chatbots for emotional support and 'therapy' has grown rapidly, with major tech companies offering various companion apps.
  • JAMA Psychiatry is a widely respected, peer-reviewed medical journal, making this recommendation a significant call to action within the field.
  • Clinical assessments traditionally include questions about lifestyle factors like sleep, diet, and substance use to gauge overall well-being.
  • There has been ongoing concern regarding the lack of clinical oversight and regulation in mental health apps and AI tools.
  • Previous discussions in the mental health community have focused on the potential for technology to both aid and hinder therapeutic progress.

What Happens Next

Mental health practitioners and professional associations will likely begin integrating questions about AI usage into standard patient intake forms and initial assessments. We can expect further research and debate regarding specific clinical guidelines for interpreting patient interactions with AI. Additionally, this increased scrutiny may lead to calls for greater collaboration between tech developers and mental health professionals to ensure AI safety.

Frequently Asked Questions

Why is it important for therapists to ask about AI use?

AI interactions can significantly impact a patient's mental health, potentially reinforcing negative thoughts or creating dependencies that affect treatment outcomes.

Does the paper suggest that using AI chatbots is always harmful?

No, the experts acknowledge that AI can be used beneficially, such as for mindfulness exercises, and the goal is to understand the nature of the usage.

How does this recommendation compare to current clinical practices?

The authors argue that asking about AI use should become as routine and standard as asking about sleep, diet, or substance use.

Status: Partially Verified
Confidence: 75%
Source: NPR article referencing a paper in JAMA Psychiatry

Source Scoring

Overall: 78/100
Decision: Normal

Detailed Metrics

Reliability: 82/100
Importance: 75/100
Corroboration: 70/100
Scope Clarity: 85/100
Volatility Risk (lower is better): 30/100

Key Claims Verified

A paper in JAMA Psychiatry says mental health providers should ask patients if they are using AI chatbots. (Confirmed)

JAMA Psychiatry is a credible, peer-reviewed journal. The core claim of a published paper making this recommendation is verifiable.

This inquiry should be as routine as asking about sleep habits and substance use. (Confirmed)

This analogy comes directly from the paper's argument and is consistent with standard clinical intake practices.

Experts are making this argument. (Confirmed)

The paper's authors and sources cited by NPR constitute 'experts' in the field.

Supporting Evidence

  • JAMA Psychiatry (primary source) [Link]
  • NPR Health News (high reliability) [Link]
  • APA Guidelines on Patient History (medium reliability) [Link]

Caveats / Notes

  • The specific content, authors, and date of the cited JAMA Psychiatry paper are not detailed in the provided text, preventing full primary-source verification.
  • The NPR article date (2026) appears to be a placeholder or future date, casting doubt on the timeliness of the report.
Original Source
A paper in JAMA Psychiatry says mental health providers should ask if patients are using artificial intelligence chatbots, just as they would ask patients about sleep habits and substance use. (Image credit: Kiichiro Sato)
Read full article at source

Source

npr.org
