Mental health care is scarce. One in five adults in the US lives with a mental illness, yet as of March 2023, 160 million Americans were living in areas with a shortage of mental health professionals.
At the same time, the limited supply of psychiatrists and other healthcare professionals means that many providers are overwhelmed, and short-staffed teams are under pressure to diagnose and treat patients as quickly as possible.
Artificial intelligence (AI) could fundamentally change the mental health industry. Not only can it help identify symptoms and signals used to diagnose conditions such as Alzheimer's disease, depression, and schizophrenia; it can also be used to discover new treatments for those conditions.
How AI can help mental health professionals
In recent years, a number of researchers have been experimenting with AI, machine learning (ML), and deep learning to see if they can be used to diagnose and treat mental illness.
For example, in 2022, Stanford researchers used machine learning to predict opioid use in patients. As part of the study, the researchers processed anonymized data from 180,000 Medicaid recipients to look for key indicators of chronic opioid use.
This approach led to new insights into the treatment of opioid addiction. In particular, the study showed that a prescription for tramadol, a commonly prescribed opioid pain reliever, is a predictor of long-term opioid use.
The study also found that 29.9% of patients who were taking opioids for the first time, or who had been taking them for less than two months, were at risk of opioid dependence.
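The study's actual model isn't described here, but the general idea of scoring a patient's risk of chronic opioid use from claims-derived features can be sketched with a toy logistic model. Everything below, including the feature names, weights, and bias, is invented for illustration and is not taken from the Stanford study.

```python
import math

# Hypothetical feature weights (illustrative only, not from the study).
WEIGHTS = {
    "initial_rx_tramadol": 1.2,  # tramadol as the first opioid was a key predictor
    "days_supplied": 0.05,       # longer initial supply raises risk
    "refill_count": 0.4,
}
BIAS = -3.0

def chronic_use_risk(features: dict) -> float:
    """Return a logistic risk score in [0, 1] for long-term opioid use."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

patient = {"initial_rx_tramadol": 1, "days_supplied": 30, "refill_count": 2}
print(round(chronic_use_risk(patient), 3))  # → 0.622
```

In practice, a model like this would be fitted to the claims data rather than hand-weighted, and its output would be one signal among many for a clinician, not a diagnosis.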
In another study, researchers at Queen's University in Canada showed how AI and deep learning can be used to process transcripts of clinical interviews and automatically assess the severity of a patient's depression. In this case, AI helped standardize and speed up the assessment process.
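As a rough intuition for what automated severity assessment involves: the researchers used deep learning on full clinical transcripts, but the concept can be illustrated with a deliberately simple rule-based scorer. The cue lexicon and thresholds below are invented and are not a clinically validated instrument.

```python
# Toy lexicon — illustrative only, not a validated clinical vocabulary.
DEPRESSION_CUES = {"hopeless", "tired", "worthless", "empty", "sleepless"}

def severity_from_transcript(transcript: str) -> str:
    """Map the density of depression-related cue words to a coarse severity band."""
    words = transcript.lower().split()
    if not words:
        return "none"
    density = sum(w.strip(".,!?") in DEPRESSION_CUES for w in words) / len(words)
    if density >= 0.10:
        return "severe"
    if density >= 0.05:
        return "moderate"
    if density > 0:
        return "mild"
    return "none"

print(severity_from_transcript("I feel tired and hopeless, everything seems empty."))
# → severe
```

A deep-learning model replaces the hand-written lexicon and thresholds with representations learned from labeled interviews, but the input-to-output shape, transcript in, severity estimate out, is the same.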
It’s important to note that AI’s diagnostic capabilities are not intended to replace a mental health professional’s judgment, but to give them access to more insights to use in their decisions about treating and supporting their patients.
Is AI accurate enough to predict mental illness?
When using AI in healthcare, it is critical that the insights gleaned from a data set are as accurate as possible because human lives are at stake. A wrong decision or diagnosis can result in people at risk not getting the support they need.
However, one of the biggest challenges in using AI to predict mental illness is that the accuracy of its diagnoses or predictions depends on the quality and accuracy of the data it was trained on.
Clinical researcher Sarah Graham et al. examined 28 studies that used AI to predict, classify, or subgroup mental illnesses such as depression, schizophrenia, and other psychiatric disorders, and found that overall accuracy ranged from 63% to 92%.
While 63% is relatively low, 92% is more promising. It suggests that researchers who take the time to feed AI systems with the right data signals can dramatically increase the accuracy of their results.
For example, a study published earlier this year found that an AI tool (CRANK-MS) can predict Parkinson's disease with up to 96% accuracy.
This was possible because of the detailed data available about the test subjects. The study was conducted on 78 people from Spain who gave a blood sample between 1993 and 1996 and were followed for 15 years.
The key finding from this study is that AI systems can draw much more reliable conclusions when they are provided with rich signals from which to derive insights.
Of course, AI can play a useful role in supporting patients with mental illness, but it must be used carefully. In practice, this means that healthcare professionals should not rely on AI alone to diagnose mental illness, but should use it to improve their own diagnosis and understanding of a patient's condition.
For example, when a psychologist is examining a person for depression, they could use their own expertise to assess the severity of the condition, and then draw on additional AI-generated insights to increase confidence in their diagnosis or to review possible treatment options.
Importantly, organizations that want to use AI to process patient data will need to obtain consent or de-identify that data in order to protect patients' privacy rights.
Failure to do so could result in significant legal liability under data protection regulations such as the GDPR and HIPAA, as well as reputational risk.
The US Senate recently demanded a review of mental health apps like BetterHelp and Talkspace, over fears that the providers could collect private information and share it with third parties.
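To make the de-identification requirement concrete, here is a minimal sketch of stripping and hashing direct identifiers before a record is handed to an AI pipeline. The field names and salt are hypothetical, and real HIPAA de-identification (Safe Harbor or expert determination) covers many more identifier categories than shown here.

```python
import hashlib
import re

def deidentify(record: dict, salt: str = "example-salt") -> dict:
    """Replace direct identifiers with salted one-way hashes and drop names.

    Illustrative only — HIPAA Safe Harbor de-identification covers 18
    identifier categories, not just the fields handled below.
    """
    out = dict(record)
    for field in ("patient_id", "ssn"):
        if field in out:
            digest = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()
            out[field] = digest[:12]
    out.pop("name", None)
    # Redact anything that looks like a phone number in free-text notes.
    if "notes" in out:
        out["notes"] = re.sub(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b", "[PHONE]", out["notes"])
    return out

record = {"patient_id": "A123", "name": "Jane Doe",
          "notes": "Call 555-867-5309 to follow up."}
print(deidentify(record))
```

Salted hashing keeps records linkable across a dataset (the same patient always maps to the same token) without exposing the underlying identifier, which is why it is preferred over simply deleting the ID field.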
A complement to healthcare providers
AI is not a panacea for the mental health crisis, but it does offer physicians and clinical researchers the opportunity to expand their ability to diagnose and treat patients.
In general, the more high-quality data available to inform a diagnosis or treatment, the better.