
A multi-year project looks at how AI can help with the operation of crisis helplines


The National Institute of Mental Health (NIMH) has awarded a $2.1 million grant to the tech company Lyssn, in partnership with Protocall, a national crisis call center, to see how artificial intelligence can help with the operation of crisis helplines.

Protocall answers calls on behalf of hundreds of organizations, including those made to the 988 helpline.

The trial will use AI to review recordings of staffers fielding phone calls, not to replace the workers who connect with callers.

“The person who's contacting the crisis call center won't even know that this technology is being used because it doesn't impact the call or the interaction at all with the call responder,” said Dr. Stephen O’Connor, chief of the National Institute of Mental Health Suicide Prevention Research Program.

O’Connor explains the AI would assess the calls people make to a helpline after the fact, not live, to determine how well each crisis call went.

“There are certain things that the crisis responders are supposed to do. They're supposed to ask questions about whether or not someone's having suicidal thoughts. And if there was any kind of history of suicidal behavior. And then they're supposed to be empathic, right? They're supposed to kind of keep someone engaged,” said O’Connor.

Jen Bollinger is a volunteer crisis counselor.

“I think there's a space for it. I think that there is, but I don't think that it can fully replace like internal review. I feel like in text-based conversations, it's more applicable and I think that it would be more useful because it's just evaluating a written conversation,” said Bollinger.

Some critics point to problems AI has already caused with behavioral health services.

The National Eating Disorders Association (NEDA) removed its chatbot, Tessa, after it says it learned the bot had been given generative AI functions without the organization's knowledge.

“These functions caused the chatbot to give dangerous advice to chat participants, and changed the way the program was running without NEDA's or the researchers' knowledge. Upon learning this, the program was taken down immediately. We later learned that the technology company that was hosting Tessa had changed the way the program was running, so that it was no longer solely utilizing the script developed by the researchers,” a spokesperson for NEDA said.

Taking AI out of the equation, it's evident more resources are needed to help crisis call centers keep up with demand. In its first year of operation, the 988 helpline received nearly five million calls, chats and texts.

If you or someone you know is struggling with their mental health, you can call or text 988. The helpline is available 24/7.