Show-Me State Times

Wednesday, September 10, 2025

Study explores AI chatbots assisting pregnant women with opioid use disorder

Mun Y. Choi, PhD, President | University of Missouri

For pregnant women dealing with opioid use disorder, discussing their condition with clinicians, family, or friends can be challenging due to stigma. Drew Herbert, a doctoral student at the University of Missouri Sinclair School of Nursing, has recognized this issue through his work as a full-time nurse.

"It can be a tough conversation for pregnant women to bring up to their clinician, family or friends in the first place, and if they do, they might be told simply, 'You should stop,' 'Think about your baby' or 'Just quit,'" Herbert said. "Not only are these responses lacking empathy and compassion, but it can also be hard to quit cold turkey, as opioid withdrawals can be dangerous to both the mom and her unborn fetus."

Despite the sometimes unreliable nature of online chatbots, Herbert noticed people often prefer seeking anonymous medical advice on the internet to avoid judgment. He aimed to determine whether GPT-4 could be trained to provide safe, accurate, supportive, and empathetic guidance for pregnant women with opioid use disorder.

Herbert created "Jade," a fictional woman six weeks pregnant who was ready to quit opioids but unsure how. GPT-4, after being fine-tuned, thanked Jade for sharing her journey, recognized her strength, and introduced her to buprenorphine, a medication that reduces cravings and prevents withdrawal symptoms. This approach was based on guidelines from the American Society of Addiction Medicine and the American College of Obstetricians and Gynecologists.

"We instructed the chatbot to use motivational interviewing, a counseling approach where you help people overcome resistance to change," Herbert explained. "Rather than telling someone what to do, you empower them on their path toward change from a place of empathy and support. This approach is highly effective in substance use disorder spaces."
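In practice, instructing a chat model to use a counseling style like motivational interviewing is typically done through a system prompt. A minimal sketch of that idea follows; the prompt text and the `build_messages` helper are illustrative assumptions, not the study's actual instructions:

```python
# Illustrative only: a hypothetical system prompt showing how a chatbot
# might be instructed to use motivational interviewing (MI). This is NOT
# the prompt used in the study.

MI_SYSTEM_PROMPT = (
    "You are a supportive health assistant. Use motivational interviewing: "
    "ask open-ended questions, affirm the person's strengths, reflect back "
    "what they say, and summarize. Never lecture or issue commands; help "
    "the person explore their own reasons for change with empathy."
)

def build_messages(user_message: str) -> list[dict]:
    """Assemble the message list sent to a chat-completion endpoint."""
    return [
        {"role": "system", "content": MI_SYSTEM_PROMPT},
        {"role": "user", "content": user_message},
    ]

# Example: the kind of opening message a persona like "Jade" might send.
messages = build_messages(
    "I'm six weeks pregnant and I want to stop using opioids, "
    "but I don't know where to start."
)
```

The system message sets the counseling style once, so every model reply in the conversation is shaped by it rather than by ad hoc instructions in each user turn.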

When asked for help in the Denver area, the chatbot provided Jade with a list of local treatment centers, telemedicine options, and reputable directories such as the Substance Abuse and Mental Health Services Administration's treatment locator tool and the American Society of Addiction Medicine's provider directory.

The responses were rated safe, accurate, and relevant more than 96% of the time by two clinicians experienced in treating opioid use disorder.

Herbert sees potential in the study and plans to refine the chatbot using feedback from clinicians and women who have experienced opioid use during pregnancy. He envisions chatbots as supplemental information sources, not a replacement for clinicians. They might encourage individuals to visit a clinician once they feel more prepared.

Herbert's goal is to provide accurate treatment information to as many people as possible at a low cost. “In the midst of a nationwide clinician shortage, not everyone has quick, easy, and affordable access to health care. We are learning more about how these language models can be leveraged to help people become healthier, and once we build out these models further, they can potentially help a lot of people,” Herbert said.

The findings were published in the Journal of Studies on Alcohol and Drugs.
