Artificial intelligence can complicate finding the right therapist – STAT

Posted: September 23, 2019 at 7:44 pm

Companies have learned the hard way that their artificial intelligence tools can have unforeseen outputs, like Amazon's (AMZN) favoring men's resumes over women's or Uber's disabling the user accounts of transgender drivers. When not astutely overseen by human intelligence, deployed AI can bend into an unseemly rainbow of discriminatory qualities like ageism, sexism, and racism. That's because biases unnoticed in the input data can become amplified in the outputs.

Another underappreciated hazard is the potential for AI to cater to our established preferences. You can see that in apps that manage everything from sources of journalism to new music and prospective romance. Once an algorithm gets a sense of what you like, it delivers the tried and true, making the world around you more homogeneous than it might otherwise be without embedded artificial intelligence. Having your preferences catered to can sometimes be great. But it can also be debilitating in insidious ways, like in the search to find the right therapist.
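That feedback loop can be made concrete with a minimal sketch. Everything below is hypothetical and illustrative only, not code from any real platform: a toy recommender scores items by overlap with tags the user has already liked, so unrelated items never surface and each round of recommendations grows more homogeneous.

```python
# Illustrative sketch (all data hypothetical): a naive recommender that
# reweights toward whatever the user previously liked.
from collections import Counter

def recommend(items, liked_tags, k=2):
    """Rank items by how many of their tags the user has already liked."""
    tag_counts = Counter(liked_tags)  # missing tags count as zero
    scored = sorted(items, key=lambda it: -sum(tag_counts[t] for t in it["tags"]))
    return scored[:k]

catalog = [
    {"name": "A", "tags": ["jazz"]},
    {"name": "B", "tags": ["jazz", "fusion"]},
    {"name": "C", "tags": ["folk"]},
    {"name": "D", "tags": ["classical"]},
]

# A user who has liked only "jazz" is shown only jazz; "folk" and
# "classical" never reach the top of the list.
picks = recommend(catalog, liked_tags=["jazz"])
```

The point of the sketch is the absence of any exploration step: nothing in the scoring function ever surfaces an item outside the user's established preferences.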

As a psychiatrist who works on improving the design of tech-enabled health care services, I foresee how misapplied AI could create pitfalls for the growing number of online platforms that aim to make it easier for individuals seeking therapy to find a therapist.

It is now possible to search through online listings that sort therapists by various factors. You can learn about the type of therapy a provider offers as well as his or her clinical specialization and educational background. You can sometimes see a photo of the clinician, or even watch a video interview to get a better sense of what she or he is all about.

Shared demographic features can be key factors in finding a therapist. In the personal experiences a few of my friends have shared with me, demographic matching was salient. One was willing to go only to a male person of color; another told me he would feel comfortable seeking therapy only with a woman or a gay man. Minority mental health care providers are underrepresented in the profession, which can result in the therapeutic environment feeling like a foreign, unwelcoming space to many prospective patients.

Making health care, and specifically mental health care, culturally inclusive is extremely important. To get a sense of how much of an impact representation can have, look no further than the inspirational force of psychiatrist Jessica Clemons, known by her Instagram handle @askDrJess, who has done preeminent work flipping the script on mental health in communities of color and engaging underrepresented groups with mental health care.

What often drives individuals' searches for providers with shared demographic features is the belief that such providers will be able to understand them better and help them get better because the therapist shares their lived experiences. But that presumption may unintentionally set up a therapeutic environment built on a shaky foundation.

Here's what the available evidence shows happens in practice. Among racial and ethnic minorities, there tends to be a preference for providers who share their identity, or at least share a common status as a racial/ethnic minority. Although a meta-analysis showed that such individuals perceive their identity-matched providers to be more effective, measurable outcomes didn't match the perception. Ultimately, the treatment outcomes were about the same regardless of the therapist's identity.

It's possible that studies like these systematically underreport hard-to-quantify factors that are part and parcel of psychotherapy, like helping individuals feel fundamentally understood and making their personal struggles comprehensible to them in light of their prior experiences. And some might argue that a patient's positive perception is, in fact, the best way to measure therapeutic success.

Yet a 2015 study demonstrated that individuals would knowingly seek less-effective therapy in exchange for sharing identity features with their therapist. At the very least, this shows that individuals are willing to make objective sacrifices to ensure that their identity is a central part of their therapy, even if that means, at odds with what an economist would call rational decision-making, that the therapy itself is less effective in addressing its defined purpose.

The overlooked danger in using artificial intelligence or other tools to prioritize identity features when selecting a mental health provider is that it could systematically amplify a potential confirmation bias for some individuals seeking therapy. In daily life, confirmation biases develop when individuals set up their environments, often subconsciously, to make new evidence confirm their preconceptions.

The challenges that a gay or black person faces, for example, often rightly justified through hard-earned if not traumatic life experiences, may make it difficult to navigate social or professional interactions with others whom they believe cannot empathize with or appreciate their perspectives or values. In turn, if that individual exclusively seeks a therapist with whom they readily identify, the experience can confirm and amplify the discrepancy they feel between their carefully pre-screened therapist and the rest of the outside world.

Offering the idea of establishing a relationship with a therapist who is outside a patient's natural comfort zone may seem insensitive to patient autonomy. As a white man with reasonable knowledge of mental health care systems and the means to afford care, I face hardly any of the barriers to care that many others do. As a clinician, however, I have witnessed firsthand the unique value of therapeutic relationships that transcend demographics. When visible identity differences melt into the background with time, the therapeutic relationship has the potential to gently challenge a patient's preconceptions regarding trust, compassion, and clinical competence. Thus, a successful patient/provider relationship depends more on a multi-dimensional interpersonal connection than on one or a few demographic features.

Artificial intelligence and sorting algorithms invisibly embedded in mental health care navigation can have unintended consequences that become greater as technological innovation slips further into the patient-provider matching process. Therapists, entrepreneurs, and health care service designers must be collectively attuned to the potential problems posed by artificial intelligence when it is treated as a magical black box. Not paying attention to input features and catering to customer demands without foresight could systematically segregate health care provision and erect greater barriers in health care as we cluster into ever more homogenized groups. And because of the underrepresentation of minorities as mental health care providers, any homogenization of care could have the potential to significantly worsen access to care for many minority patients.
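The access problem described above is, at bottom, simple arithmetic. The sketch below uses hypothetical numbers, not data from any real directory, to show how a hard identity-match filter layered on an underrepresented provider pool leaves minority patients choosing from a small fraction of available clinicians.

```python
# Illustrative arithmetic (hypothetical numbers): strict identity matching
# shrinks the candidate pool, and the shrinkage is sharpest for patients
# seeking providers from underrepresented groups.

def matching_pool(providers, required_identity):
    """Providers remaining after a hard identity-match filter."""
    return [p for p in providers if p["identity"] == required_identity]

# Toy directory echoing the underrepresentation the article describes:
# 90 majority-group providers, 10 minority-group providers.
providers = [{"identity": "majority"}] * 90 + [{"identity": "minority"}] * 10

open_pool = len(providers)                               # all 100 providers
strict_pool = len(matching_pool(providers, "minority"))  # only 10 remain
```

Under these assumed numbers, a patient applying the strict filter chooses from one-tenth of the directory before any other factor, such as specialization, availability, or insurance, is even considered.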

Addressing these concerns will require several steps. Solutions must be carried out on two levels: the individual provider and the health care system. Technology developers must exercise a great degree of caution in the use of automated identity matching between patient and mental health provider. For clinicians and clinical educators, it's important to note that most individuals value a therapist with extensive cultural training and experience even more highly than simply sharing a racial or ethnic identity with their provider. This makes patient-panel diversity and cultural competency training defining issues in the quality of clinical training and continuing professional education in the coming years.

Technology has the potential to readily improve individuals' opportunities to seamlessly connect with high-quality mental health care. However, treating artificial intelligence as if it were a black box that automatically produces an optimal result holds the potential to systematically undercut access to care when expert clinical judgment isn't closely tied to applying AI.

Scott Breitinger, M.D., is an instructor in psychiatry at the Mayo Clinic in Rochester, Minn.
