AI not ready for prime time in mental health care, experts say

ABOUT THIS SERIES

The applications of artificial intelligence, or AI, are growing rapidly and will continue to expand as the technology advances.

Today, CNHI and the Daily News continue an ongoing series looking at AI and its potential benefits and concerns in various parts of everyday life. This latest part revisits AI and its use in health care.

Although several uses of artificial intelligence in mental health are showing some success, experts say the jury is still out on its capabilities for wider use.

Therapists are using AI to examine large amounts of patient data, including family histories, patient behaviors and responses to treatment, to help diagnose conditions and identify treatments, as well as to match individual patients with the therapists best suited to them, according to an article by the Switzerland-based World Economic Forum.

A study led by researchers from New York University showed that AI has been useful in identifying post-traumatic stress disorder in veterans.

Mental health professionals are using wearables, such as Fitbits, to monitor sleeping patterns, physical activity and variations in heart rate and rhythm, which are used to assess the user’s mood and cognitive state. The devices warn patients and health care providers when interventions may be needed and help users change behavior and seek assistance.

AI chat programs using natural language processing are being used to review therapists’ reports and notes, along with conversations during interactions with patients, to look for useful patterns. Researchers hope to help therapists develop better relationships with patients and identify warning signs in patients’ choice of subjects and words, the World Economic Forum reported.

Along with AI’s successes comes the potential for misuse. The forum has published comprehensive guidelines and potential AI implementation strategies.

“Global Governance Toolkit for Digital Mental Health: Building Trust in Disruptive Technology for Mental Health” recommends goals, standards, ethical considerations, governance structures and ways to encourage innovation.

The forum recognizes current shortcomings and challenges to expanding AI in the mental health field. Using AI chat in therapy, for example, raises the question of whether the technology is optimized for a consumer’s mental health outcomes or for the developer’s profitability, the toolkit authors said.

“Who is ensuring that a person’s mental health-related information is not being used unscrupulously by advertising, insurance or criminal justice systems?” the authors wrote. “Questions such as these are troubling in the light of current regulatory structure.”

A study by researchers at the University of California San Diego, La Jolla, warned that differences between traditional health care and mental health care create complications for AI systems.

“While AI technology is becoming more prevalent in medicine for physical health applications, the discipline of mental health has been slower to adopt AI,” the study, published in the medical journal Current Psychiatry Reports, said. “Mental health practitioners are more hands-on and patient-centered in their clinical practice than most non-psychiatric practitioners, relying more on ‘softer’ skills, including forming relationships with patients and directly observing patient behaviors and emotions. Mental health clinical data is often in the form of subjective and qualitative patient statements and written notes.”

While those researchers and others at the World Health Organization were optimistic that technology could address the current shortcomings, WHO’s report, “Artificial Intelligence for Mental Health and Mental Illnesses: An Overview,” concludes that it is too early to predict AI’s future in mental health care.

“We found that AI application use in mental health research is unbalanced and is mostly used to study depressive disorders, schizophrenia and other psychotic disorders. This indicates a significant gap in our understanding of how they can be used to study other mental health conditions,” Dr. Ledia Lazeri, regional advisor for mental health at WHO/Europe, wrote in the report.

The article “Is AI the Future of Mental Healthcare?”, published in May in the European scientific journal Topoi, concluded:

“It is not possible to answer the question about whether and to what extent AI should be adopted in mental healthcare. Too much information is missing about both its potential benefits and its potential drawbacks. However, it would make sense to use AI to support mental healthcare provision if and when there are good reasons to think AI outperforms or can significantly assist human therapists.”
