When military veteran and author Andrew Pedersen was only 26 years old, he attempted suicide with a firearm. He survived, but later attempted suicide again by jumping off the Golden Gate Bridge into San Francisco Bay. He survived that attempt as well, and now works as an advocate for suicide prevention awareness. Pedersen is one of the many veterans who stand to be helped by the work of artificial intelligence startups like BlackFynn Inc., which aims to help prevent veteran suicide by detecting and reducing risk factors with machine learning and AI technology.
Dan Miller's story is another such case. Suffering from post-traumatic stress disorder after serving tours in the Middle East and working as a Chicago police officer, Miller saw no reason to keep living. His wife and children had grown wary of his behavior at home.
“My whole world was falling apart,” he says of that dark night in 2014. “It left a hole I didn’t know how to fill.”
A brochure on the passenger seat of his car stopped him from pulling the trigger – and later led him to volunteer with other veterans who had faced similar crises.
On average, 17 U.S. veterans die by suicide every day, according to the Department of Veterans Affairs. In 2019, the most recent year for which records are available, 6,261 veterans took their own lives – and the suicide rate among veterans is 52% higher than among non-veterans, according to the agency.
With so many veterans at risk of suicide, the Veterans Health Administration (VHA) now uses artificial intelligence (AI) to identify the veterans who are at the highest risk and speak to them before they find themselves in a crisis.
However, when Dan Miller’s life started unraveling, that option wasn’t available.
In the years before his near-suicide, his wife pushed him to get help. “She said, ‘You’re not the same person you were when you left. The kids are scared of you. The pets are scared of you,’” he recalls.
While in the Marine Corps, Miller had become increasingly emotionally distant. His wife threatened divorce, but he resisted getting help. He felt he couldn’t tell anyone what he was going through, fearing he would lose his job and the respect of others.
He went to the VHA for an initial consultation in 2010 but didn’t find it useful; being told what to do didn’t sit well with him, so he stopped going. Instead, excessive exercise and heavy drinking became his way of dealing with his problems.
That day in 2014, Miller’s wife told him she was taking the kids to a playdate. Instead, shortly after she left, he was served with divorce papers. Forty-five minutes later, he was parked in his car with his gun, contemplating how to end his life.
Had his crisis come just a few years later, it might have unfolded very differently.
What is The Red Flag Project?
The Red Flag Project is a nonprofit initiative to combat veteran suicide, with a particular focus on first-time and younger veterans. It partners with organizations across the country to provide free mental health resources and connect veterans with services in their communities. The project was created by J.J. McCaffrey, a former Marine and the daughter of a Vietnam veteran, who wanted a way to reach veterans before they hit a crisis point.
Assessment of Suicidal Risks
In 2017, the Veterans Health Administration launched a pilot artificial intelligence program, called REACH VET, to help identify veterans in need of mental health care.
Every month, a computer analyzes the electronic health records of all VHA patients who have had a health care visit for any reason in the last 2 years. It looks at more than 140 variables and uses those factors to estimate each person’s suicide risk at that moment.
To create the risk algorithm, a computer analyzed the medical records of 6,360 veterans who died by suicide between 2009 and 2011. The VHA continually updates the list of variables with information from patients who have died by suicide since then.
Here are some variables you’d expect:
- A past suicide attempt
- A mental health diagnosis, such as depression
- A terminal diagnosis

Others are more surprising, such as a diagnosis of diabetes or arthritis.
The riskiest cases – the top 0.1% – are flagged by REACH VET for review by a mental health professional. The patient’s provider is notified about the flag, discusses any recommended treatment changes, and invites the patient in for a visit.
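The screening step described above – score everyone seen in the last 2 years, then flag the top 0.1% – can be sketched in code. This is a hypothetical illustration only: the VHA’s actual model, variables, and threshold logic are not public, and the names below are invented.

```python
# Hypothetical sketch of REACH VET-style monthly risk flagging.
# The real system scores patients with a model over ~140 record variables;
# here the score is simply assumed to already exist on each record.
from dataclasses import dataclass


@dataclass
class PatientRecord:
    patient_id: str
    risk_score: float  # model-estimated suicide risk, higher = riskier


def flag_top_risk(records, top_fraction=0.001):
    """Return the patients whose scores fall in the top fraction (0.1% here),
    so a mental health professional can review and reach out."""
    if not records:
        return []
    ranked = sorted(records, key=lambda r: r.risk_score, reverse=True)
    n_flagged = max(1, int(len(ranked) * top_fraction))
    return ranked[:n_flagged]
```

For a panel of 1,000 scored patients, `flag_top_risk` would surface the single highest-risk record for clinician review; in the real program this triggers the provider notification and outreach described above.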
“It’s an opportunity to talk about their risk factors, which is designed to lead to a conversation about safety planning,” says Matthew Miller, a clinical psychologist and national director of the U.S. Department of Veterans Affairs’ Suicide Prevention Program.
An Action Plan to Prevent Suicide
A safety plan is a document that outlines the steps a person can take to avoid suicide in a crisis.
An example plan might include:
- An individual’s triggers or warning signs
- What has helped them in the past
- Contact information for people or organizations who can help
- Removal of lethal means, such as guns, from their environment
- Their reasons for living
Research has shown that safety plans benefit people at risk for suicide: they reduce suicidal thoughts, depression, and hopelessness, and make veterans more receptive to mental health care. A plan can also help people manage the triggers that bring on suicidal thoughts.
Getting the Call
When Dan Miller was in crisis, what if REACH VET had been there to help?
“One of the biggest things on that day … was feeling completely alone and that I had no one to turn to.” – Dan Miller
“It absolutely, positively would have helped because one of the biggest things on that day when I got served was feeling completely alone and that I had no one to turn to,” Miller says. Now, he serves as a speaker for the Wounded Warrior Project, a nonprofit that serves veterans and active-duty service members.
Veterans’ reactions to the unexpected phone calls, psychologist Miller says, “run the gamut from ‘Thank you for contacting me. Let’s talk,’ to ‘What are you talking about? Leave me alone!’ ”
REACH VET can’t prevent every suicide, but it is making a difference. In a clinical trial, veterans contacted through REACH VET had more doctor visits, were more likely to have a written suicide prevention safety plan, and had fewer mental health hospital admissions, emergency room visits, and suicide attempts.
An Assist From Artificial Intelligence
The simplest outreach can make a huge difference, and research backs this up.
A study included 4,730 veterans who had recently been discharged from the VA’s psychiatric unit, a group considered especially at risk for suicide.
Half of the patients received 13 caring emails from hospital staff after they left the hospital. The emails mentioned personal things the patient had shared, like a love of hiking, and wished them well. The other half received only routine follow-up, without the emails.
The study, published in Contemporary Clinical Trials in 2014, found that the veterans who received the caring emails were less likely to die by suicide 2 years later.
Researchers have run similar studies many times – with handwritten notes, postcards from the ER, and so on. The results are consistent: brief caring contacts reduce suicide risk.
“If we could use AI to identify people to receive notes or phone calls, it would be a very effective and inexpensive way to guide follow-up care,” says Rebecca Bernert, PhD, an associate professor of psychiatry at the Stanford University School of Medicine in Palo Alto, CA.
Artificial intelligence does not replace clinical judgment.
An artificial intelligence algorithm is only as good as the data behind it. If that data isn’t diverse, the algorithm may miss important warning signs. And because the model was built on veterans’ records, its variables may not apply to civilians in the same way.
“When you’re able to put time and space between the suicidal thought and the access to the method to act on that thought, you save lives,” says Rebecca Bernert.
Interrupting Suicidal Thoughts
Google also uses AI to help prevent suicide. Its MUM (Multitask Unified Model) technology, which powers Google Search, can recognize the intent behind a search. It can tell the difference between someone searching for information about suicide to write a research paper and someone searching for where and how to attempt suicide.
If Google Search detects someone in the United States might be in crisis and risk for suicide, the first results it provides them are the number for the National Suicide Prevention Lifeline and other resources for people in crisis.
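The behavior described above – classify the intent of a query, and put crisis resources first when it signals risk – can be sketched as follows. This is purely illustrative: Google’s MUM model and its internals are not public, so a toy keyword check stands in for the real intent classifier, and all names here are invented.

```python
# Hypothetical sketch of crisis-aware search handling.
# A toy keyword matcher stands in for a real intent model like MUM.
CRISIS_RESOURCE = "National Suicide Prevention Lifeline: call, text, or chat 988"


def looks_like_crisis(query: str) -> bool:
    """Toy stand-in for an intent classifier: flag queries that suggest
    someone is seeking the means to harm themselves (not researching)."""
    signals = ["how to kill myself", "painless way to die", "bridge to jump"]
    q = query.lower()
    return any(s in q for s in signals)


def search_results(query: str, organic_results: list) -> list:
    """Put crisis resources at the top when the query signals risk;
    otherwise return the ordinary results unchanged."""
    if looks_like_crisis(query):
        return [CRISIS_RESOURCE] + organic_results
    return organic_results
```

In a real system the classifier would be a trained language model rather than a keyword list, but the design point is the same: the first thing a person in crisis sees is a lifeline, not the information they searched for.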
Google Home works similarly for suicide-related crises: when a user poses a question signaling suicidal intent, the device points them to a selection of suicide prevention resources.
Google’s goal is for MUM to provide hotlines and other crisis resources in 75 languages to people in many countries through Google Search.
“We want to find partners that are accessible to users in terms of hours of operation. We have a strong preference for finding partners that promise confidentiality and privacy to the extent that those are permitted [in that country],” says product manager Anne Merritt, MD, of Google Search.
Other companies are developing apps that use AI to predict suicide risk from signals such as changes in voice pitch and rhythm, which often accompany depression. That technology is still in development but looks promising. Keep in mind that these apps have no government approval, so talk to your health care provider if you try one.
Simply seeing a hotline number on a phone or computer screen can help, Dan Miller says. “If I happened to be online, searching maybe for a bridge to jump off of … and suddenly that pops up on the screen, it’s like it changes the channel.”
That interruption won’t work for everyone, but for some, that one search result can break a train of suicidal thought.
Research suggests that many suicide attempts progress from first thought to potentially lethal action in a very short amount of time – as it did for Dan Miller in 2014.
Choosing a Different Path
It was an interruption in Miller’s thinking that saved his life.
A brochure from the Wounded Warrior Project lay on the passenger seat, and Miller, holding the gun to his head, glanced at it. He saw a picture of a disabled veteran much like himself, a man who had lost both legs. Despite being in worse shape than Miller, the man had not given up.
Putting down his gun, Miller decided to call for help.
Recovering from a suicide attempt is a journey; it does not happen overnight. After working the speaker circuit for 8 years, Miller is taking a 2-week break to undergo outpatient counseling for post-traumatic stress disorder and traumatic brain injury.
“Telling my story to strangers – part of it is healing me in a way, but I’m learning that repeating the story over and over again is also keeping me from letting it go. And I’m still healing.”
To Help Prevent Suicide: Call, Text, or Chat 988
988 is the number for the National Suicide Prevention Lifeline. If you or someone you know is thinking about suicide, you can call, text, or chat 988.
The Lifeline can also be reached at its original number, 800-273-8255. It is available 24 hours a day, 7 days a week, in English and Spanish.