How AI Can Help Manage Mental Health In Times Of Crisis – Forbes

Much has been written in the past few weeks about the COVID-19 crisis and the ripple effects that will impact human society. Beyond the immediate effect of the virus on health and mortality, it is clear that we are also facing a massive global financial crisis that is likely to affect our lives for years to come. These changes, along with the expected prolonged social isolation, are bound to have a devastating effect on our mental health, collectively and individually, and, in turn, cause a dramatic deterioration in overall health and an increase in the prevalence of chronic illness.

From research conducted by the World Health Organization, we know that most people affected by emergency situations experience immediate psychological distress, hopelessness and sleep issues -- and that 22% of people are expected to develop depression, anxiety, post-traumatic stress disorder, bipolar disorder or schizophrenia. This escalation comes on top of a baseline of 19.1% of U.S. adults experiencing mental illness (47.6 million people in 2018, according to the Substance Abuse and Mental Health Services Administration). We further know that rising depression rates are associated with a variety of chronic health conditions, including obesity, coronary heart disease and diabetes, so the domino effect does not end with mental health.

This prediction may sound like an eschatological prophecy of dystopia, but there are good reasons to be optimistic too. At our disposal, we now have myriad clinical-grade digital tools and applications designed to treat and prevent anxiety and depression. All it takes is a Wi-Fi connection and a mobile phone to provide digital treatment that can reach everyone. Even more encouraging are the recent advances in the use of artificial intelligence in mental health -- more specifically augmented intelligence, the ability to embed the collective knowledge and care of humans into digital applications.

Such an approach attempts to make the best of both worlds -- the human connection along with the rich, often gamified digital experience that is driven by data science. For example, research scientists at the University of Utah founded Lyssn, whose product uses deep-learning algorithms to analyze recorded psychotherapy conversations for training and quality-assurance purposes -- a process that is otherwise manual and expensive, usually conducted by a panel of psychotherapists. Lyssn's product is trained on a broad range of therapists, so the advantages are not only cost and time, but also consistency and reduced bias toward any particular attitude or approach.
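To make the general idea concrete, the toy sketch below scores individual therapist utterances against a handful of skill labels using an off-the-shelf zero-shot classifier. It is emphatically not Lyssn's pipeline -- the labels, the model choice and the sample transcript are assumptions made purely for illustration -- but it shows the basic shape of replacing manual panel review with automated scoring.

```python
# Toy sketch: automated scoring of therapist utterances. This is NOT Lyssn's
# system; the skill labels and example transcript are made up for illustration.
from transformers import pipeline  # assumes the Hugging Face `transformers` package

# Zero-shot classification stands in for a model trained on clinician-rated sessions.
classifier = pipeline("zero-shot-classification")

SKILL_LABELS = ["reflective listening", "open question", "advice giving"]

def score_utterance(utterance: str) -> dict:
    """Return a score for each (hypothetical) skill label for one utterance."""
    result = classifier(utterance, candidate_labels=SKILL_LABELS)
    return dict(zip(result["labels"], result["scores"]))

transcript = [
    "It sounds like the isolation has been weighing on you this week.",
    "Have you tried just going to bed earlier?",
]
for line in transcript:
    print(line, "->", score_utterance(line))
```

In a production system the scoring model would be trained on sessions rated by clinicians rather than prompted zero-shot, which is where the consistency and bias-reduction benefits come from.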

Other companies offer a range of therapy chatbots: X2AI's bot, Sara, uses natural language processing to engage users in conversations on Facebook Messenger, helping them manage stress and anxiety. Another example is Lark Health, a chatbot aimed at managing diabetes and hypertension that gathers and analyzes sleep, weight and nutrition information from users in daily conversation.

Such applications distill collective human knowledge into a digital experience, providing users with 24/7 access to a therapist that represents a cohort of hundreds of clinicians trained in a variety of disciplines.
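As a rough illustration of the daily-conversation mechanics such products rely on -- not their actual NLP stacks -- a check-in bot can be sketched in a few lines. The questions, the keyword matching and the canned replies below are hypothetical stand-ins for real natural language processing and a real messaging channel.

```python
# A minimal, rule-based sketch of a daily check-in chatbot. Real products use
# NLP models; this keyword matcher only illustrates the conversation loop.
import re

CHECK_IN_QUESTIONS = {
    "sleep": "How many hours did you sleep last night?",
    "mood": "How would you describe your mood today, in a word or two?",
}

def extract_hours(reply: str):
    """Pull a number of hours out of a free-text reply, if one is present."""
    match = re.search(r"(\d+(\.\d+)?)", reply)
    return float(match.group(1)) if match else None

def daily_check_in(get_reply):
    """Run one check-in; `get_reply` maps a question to the user's answer."""
    log = {}
    for topic, question in CHECK_IN_QUESTIONS.items():
        reply = get_reply(question)
        log[topic] = extract_hours(reply) if topic == "sleep" else reply
    return log

# Canned replies stand in for a messaging channel such as Facebook Messenger.
print(daily_check_in(lambda q: "about 6 hours" if "sleep" in q else "a bit anxious"))
```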

The challenge ahead is to go beyond the mechanics of therapeutic conversation and to model the human alliance or bond that human therapists establish with clients. For this purpose, joint teams of data scientists, clinicians and writers (like those working on team Anna at my company) need to work on creating conversational experiences that have the capacity to express curiosity about users, develop a deeper understanding of their lives, and be emotionally sensitive and attuned.

Going beyond the mechanics of interaction and attempting to build a superhuman digital therapist requires:

Establishing A Single Transdisciplinary Team: Data scientists, clinicians and content editors speak different languages. To avoid creating a modern Tower of Babel, it is critical to help them establish a common language by working closely as a single, nimble and cohesive team.

Starting With A Clear Model Of A Therapist: Empathy, care and listening are the result of patterns of interaction that need to be explicitly modeled. Prior to any work on specific interventions, it is critical to specify these patterns in detail. For example, what are the personality traits of the digital therapist? What triggers in the conversation does it respond to? Which goals is it trying to accomplish? What language does it avoid using? What are the ways in which it shows interest? Curiosity? Support? How does it manage the trade-off between persisting with its own agenda and being flexible enough to let the user take the lead? (A rough sketch of such a model appears after this list.)

Defining A Few Simple Criteria For Success: Beyond the standard quantitative methods for assessing efficacy, define clearly how you assess the degree to which the AI was successful in establishing an alliance with the user. What does the ideal user feedback sound like? What would you want users to say when they describe the digital therapist?

Talking To Users On A Daily Basis: The experience of digital therapy is built by assembling multiple, often complex, algorithms and mechanisms. To make sure you are investing in the right places, always talk to users and ask them to recall specific parts of the interaction that made them feel a sense of alliance and bond. You may discover that the simplest patterns of interaction are the most important ones.
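Below is the rough sketch referred to earlier: one hedged illustration of what an explicitly modeled digital therapist might look like as a shared, reviewable artifact for a transdisciplinary team. Every field name and value is an assumption chosen for illustration, not a specification of any real product.

```python
# A hedged sketch of "a clear model of a therapist" written down as code that
# data scientists, clinicians and writers can all review. All values are illustrative.
from dataclasses import dataclass, field

@dataclass
class TherapistModel:
    personality_traits: list = field(default_factory=lambda: [
        "warm", "curious", "non-judgmental",
    ])
    # Conversation triggers the agent should respond to, mapped to its goals.
    triggers_to_goals: dict = field(default_factory=lambda: {
        "mentions poor sleep": "explore the sleep routine",
        "expresses hopelessness": "validate the feeling, assess risk, escalate to a human",
    })
    avoided_language: list = field(default_factory=lambda: ["you should", "calm down"])
    ways_of_showing_interest: list = field(default_factory=lambda: [
        "reflect the user's own words", "ask one open follow-up question",
    ])
    # 0.0 = always follow the user's lead, 1.0 = always persist with the agenda.
    agenda_persistence: float = 0.3

model = TherapistModel()
print(model.triggers_to_goals["expresses hopelessness"])
```

Writing the model down this explicitly gives the team one place to argue about the design, and it gives the success criteria above something concrete to measure against.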

These tools will not replace the couch, tissue box and innate professionalism of the therapist's office, but they may very well keep us healthier in times when we can't make it to that office. In times of crisis, like our current situation and those that will inevitably crop up in the future, it's important to know what our options are and work toward a healthier future in whatever ways we can.
