AI Chatbots Show Promise in Delivering Motivational Interviewing for Health Behavior Change, FAU Study Finds
Artificial intelligence is being explored as a way to expand access to motivational interviewing (MI), a proven but underused counseling method that helps individuals build motivation to adopt healthier behaviors. Researchers from Florida Atlantic University’s Charles E. Schmidt College of Medicine conducted the first scoping review of AI-driven tools designed to deliver MI, aiming to assess their effectiveness, adherence to core MI principles, and impact on health behaviors.

MI has demonstrated success in helping people quit smoking, increase physical activity, and follow medical treatments, yet its adoption in clinical settings remains limited by time constraints, training requirements, and reimbursement challenges. AI-powered chatbots, virtual agents, and mobile apps offer a scalable alternative, providing continuous, empathetic, and nonjudgmental support that mimics the conversational style of MI.

The study, published in the Journal of Medical Internet Research, analyzed existing research on AI systems built on technologies ranging from rule-based scripts to advanced models such as GPT-3.5 and GPT-4. Most tools were chatbots focused on health behaviors such as smoking cessation, substance use reduction, stress management, and treatment adherence. While many systems incorporated key MI elements—such as open-ended questions, affirmations, and reflective listening—only a few demonstrated high fidelity to formal MI practices.

Evaluation methods varied widely. Some studies used expert reviews to assess MI quality, while others relied on study design or user feedback. Few, however, addressed critical safety concerns, such as the risk of AI generating misleading or inappropriate responses. Long-term behavioral outcomes were largely missing, with most studies focusing on short-term psychological indicators like readiness to change or perceived understanding.
Despite the tools’ potential, participants often noted a lack of emotional depth and genuine relational connection compared with human counselors. While users appreciated the convenience and consistency of AI tools, they missed the nuanced empathy and adaptability of face-to-face interactions.

Maria Carmenza Mejia, M.D., senior author and professor of population health, emphasized the need for more rigorous evaluation. “Many digital interventions include motivational elements but don’t clearly follow established MI frameworks,” she said. “We mapped the specific techniques used and how fidelity was measured—this detail is essential to know what these tools are actually doing.”

The findings highlight both promise and caution. AI-driven MI tools are feasible, well accepted, and capable of delivering core MI principles in a scalable way. However, more research is needed to confirm their ability to drive lasting behavior change and to ensure safety, transparency, and ethical use. As AI continues to evolve, integrating proven behavioral science with advanced technology could significantly expand access to effective health coaching—especially for those who avoid traditional care. Future studies must prioritize robust evaluation, long-term follow-up, and safeguards to ensure these tools are both effective and trustworthy.
