Technology That Helps Us Help Ourselves: Talk Therapy in Combination with Augmented Intelligence

The term “artificial intelligence” (AI) is increasingly touted as a potential solution for various aspects of physical health.  From earlier detection of disease to drug discovery and development to processing paperwork and data more efficiently, AI is no longer the future of physical health; it’s the present.

When it comes to mental health – an epidemic of global proportions – AI continues to mature, but it faces nuances that differentiate it from its utility in physical health.  Namely, AI for mental health requires the technology to be a notch above traditional AI; in particular, it must be able to understand humans intrinsically.  While this presents opportunities – the ability to intervene and even treat – it also presents ethical challenges that, ultimately, should not deter us but instead should be met head-on.

The Mental Health Challenge

According to the WHO, one in four people are affected by a mental disorder at some point during their lives.  At the same time, one in five adults with a mental illness reported that they were not able to receive the treatment they needed.

Further, mental illness is a comorbidity of nearly every chronic disease, and it often receives short shrift when considered “secondary.”  In fact, according to a meta-analysis of comorbidities of mental disorders and chronic physical diseases in developing and emerging countries, published in BMC Public Health, “…the increased risk for anxiety and/or depression in people with chronic physical diseases was 310%.”  As a result, the researchers conclude that “…awareness of mental health must be integrated into all aspects of health and social policy, into the planning of the healthcare system and into the provision of primary and secondary healthcare.”

We need a better approach to diagnosing and treating mental illness, whether it presents as a primary condition or a comorbidity.  AI, with its aspiration to be “human-like” through deep learning, is critical to moving beyond our current methods to ones that are more effective and scalable.

Artificial Intelligence Versus Augmented Intelligence

Let’s examine whether “artificial intelligence” is even the most accurate descriptor as applied to digital interventions for mental health.  According to MIT Technology Review, artificial intelligence is “…the quest to build machines that can reason, learn and act intelligently.”  The goal of augmented intelligence, however, is more akin to “…using technology to supplement and support human intelligence, with humans remaining at the center of the decision-making process.”

People tend to use the term AI as a catch-all, but there are nuances in the mental health space, specifically, that make augmented intelligence the more accurate descriptor.  While AI has matured to the point that it is more human-like than ever before, the role of humans remains critical in mental health treatment; as such, the technology should not seek to replace human therapy but, rather, to enhance it.

Conventional Talk Therapy in Concert with Augmented Intelligence

Talk therapy is a cornerstone of traditional mental health treatment.  It is highly dependent on a successful, meaningful and trustworthy relationship between the practitioner and the patient, most often referred to as the “therapeutic alliance.”  This alliance, established at the earliest stages of therapy, is what allows the patient to trust that the practitioner has their best interests at heart, even when therapy becomes difficult for the patient.  As a result, any digital intervention in mental health, particularly one that complements traditional talk therapy, needs to be not only intelligent but also human-like if it is intended to be therapeutic in nature.  In other words, digital interventions should be able to get to know the patient over time, as a therapist would, but should not go so far as to substitute for the therapist.

The mental health clinical establishment agrees that augmented intelligence will not soon be taking over their jobs. 

According to a recent analysis of psychiatrists’ attitudes toward artificial intelligence, co-led by researchers at Duke University School of Medicine and Harvard Medical School, only about 4% of psychiatrists felt it was likely that future technology would make their jobs obsolete, and only 17% felt that future artificial intelligence and/or machine learning was likely to replace a human clinician in providing empathetic care.

The question remains: how can augmented intelligence best be leveraged to help those with mental health issues?

Establishing an Appropriate Role for Augmented Intelligence in Mental Health

Sound and effective mental health augmented intelligence has the potential to “fill the gaps” where traditional talk therapy cannot, keeping in mind that the typical patient in therapy goes once per week, or perhaps every other week.  As such, the benefits afforded by any consumer-facing augmented intelligence intervention include its 24/7 availability; its ability to be used for micro sessions that enhance ongoing therapy; its accessibility on the digital devices people are already using; and its affordability and scalability, two attributes particularly appealing as workplace benefits.

Additionally, when implemented properly, AI can encode the “best of” the wisdom and experience of thousands of therapists into a single session, program or other offering, essentially minimizing subjectivity and enhancing flexibility, so that the digital experience adapts to a wide range of patient personalities and needs.  In other words, it’s “therapeutic crowd wisdom”: a panel of thousands of therapists behind the scenes and at the patient’s disposal.
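To make the “therapeutic crowd wisdom” idea concrete, here is a minimal sketch, in Python, of aggregating the preferences of many therapist-derived “policies” by simple majority vote, so that no single therapist’s subjectivity dominates the recommendation.  The policy library, concern names and exercise names are hypothetical illustrations, not a description of any particular product.

```python
from collections import Counter

# Hypothetical library: each entry stands in for one therapist's preferred
# exercise for a given presenting concern (all names are illustrative).
THERAPIST_POLICIES = [
    {"worry": "thought_record", "low_mood": "behavioral_activation"},
    {"worry": "grounding_breath", "low_mood": "behavioral_activation"},
    {"worry": "thought_record", "low_mood": "gratitude_journal"},
    # ...in practice, thousands of such policies would be encoded.
]

def crowd_recommendation(concern: str) -> str:
    """Return the exercise most therapists in the library would suggest.

    A simple majority vote stands in for "therapeutic crowd wisdom":
    no single therapist's preference determines the output on its own.
    """
    votes = Counter(
        policy[concern] for policy in THERAPIST_POLICIES if concern in policy
    )
    exercise, _count = votes.most_common(1)[0]
    return exercise

print(crowd_recommendation("worry"))     # -> thought_record
print(crowd_recommendation("low_mood"))  # -> behavioral_activation
```

A production system would weight and personalize these choices rather than take a flat vote, but the aggregation principle is the same.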

Equally important, mental health augmented intelligence can be used to intervene “in state,” meaning that it can be accessed at the precise moment of need, complementing the offline conversation a patient may have with their therapist later.

Therapists are already using augmented intelligence to improve talk therapy.  For instance, direct observation of psychotherapy for training and feedback is both labor-intensive and expensive.  One study has shown that using machine learning to analyze session transcripts for qualities such as therapist empathy and reflective listening was not only well received by the therapists being observed, but that most of them would incorporate the feedback into their practice.

Further, an evaluation of empathy among addiction counseling therapists revealed that machine evaluation, as compared to direct observation, “…provides useful information that can contribute to automatic quality assurance and therapist training.”
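As an illustration of the general approach behind these studies, here is a minimal sketch that classifies therapist utterances as reflective listening and reports a per-session reflection rate.  It assumes a tiny hand-labeled training set; the actual research used far larger human-coded corpora and richer features, including speech.  All utterances and labels below are hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: therapist utterances labeled as
# reflective listening (1) or not (0). Real studies rely on thousands
# of human-coded utterances from recorded sessions.
utterances = [
    "It sounds like you felt dismissed when that happened.",
    "So what I'm hearing is that the mornings are the hardest part.",
    "You're saying the worry shows up mostly before work.",
    "Did you take your medication this week?",
    "Let's move on to the next item on the agenda.",
    "How many drinks did you have on Saturday?",
]
labels = [1, 1, 1, 0, 0, 0]

# TF-IDF features plus logistic regression: a deliberately simple stand-in
# for the models used in the published work.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(utterances, labels)

def reflection_rate(session_utterances):
    """Fraction of a session's therapist turns classified as reflections,
    a rough proxy for the kind of feedback described in the studies above."""
    predictions = model.predict(session_utterances)
    return sum(predictions) / len(predictions)

session = [
    "It seems like the argument left you feeling pretty alone.",
    "Okay, next question on the intake form.",
]
print(f"Estimated reflection rate: {reflection_rate(session):.0%}")
```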

Challenges of Augmented Intelligence for Mental Health

Implementing augmented intelligence into the mental health therapeutic model introduces a completely new standard of care and, as such, presents unique challenges.  It therefore warrants its own set of guidelines and ethics, specific to AI.  Factors that need to be taken into consideration include:

·        Honesty.  It’s critical that any intervention make clear that it is human-like, not human.  While that may seem obvious to some, it has the potential to be confusing to others, especially those feeling vulnerable or in a fragile state of mental health.

·        Transparency on Decision-Making.  Mental health augmented intelligence needs to be transparent about how it makes its decisions.  What factors does it consider?  What guidelines does it use?  How is it programmed to make decisions?  And what data was used to train the models that make those decisions?  For example, a model trained on a population of young, urban individuals may produce output that is less applicable to older, rural users.

·        Non-Discrimination.  This may seem counter-intuitive, as one of the touted benefits of machine learning is that it offers a non-biased environment.  However, how it is programmed matters.  For instance, if a digital intervention tool knows that a user is female and aged 40+, will that tool assume she’s depressed because prevalence research supports that correlation?  Or does it have the capacity to make an independent judgment?

·        Crisis Situations.  We’re not at the point where we can entrust machines to make decisions about potential suicide and self-harm.  Because algorithms trained on data will always have some degree of error, we need humans to be accountable: to review records, analyze transcripts as warranted, and alert authorities when danger is imminent.  A minimal sketch of this kind of human-in-the-loop routing follows this list.
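As referenced in the crisis-situations point above, here is one way such human-in-the-loop routing could look in code.  It assumes a hypothetical risk-scoring model and a hypothetical clinician-notification hook; the threshold, field names and identifiers are illustrative only.

```python
from dataclasses import dataclass

# Illustrative threshold; in practice it would be set and audited by
# clinicians, and the risk-scoring model itself is assumed, not shown.
REVIEW_THRESHOLD = 0.3

@dataclass
class RiskAssessment:
    user_id: str
    risk_score: float   # output of an assumed self-harm risk model
    transcript_id: str  # pointer to the session record for human review

def route_assessment(assessment: RiskAssessment) -> str:
    """Decide what happens next. The system never contacts authorities on
    its own; it only queues cases for an accountable human clinician."""
    if assessment.risk_score >= REVIEW_THRESHOLD:
        # Flag for a clinician, who reviews the transcript and decides
        # whether to reach out to the user or alert emergency services.
        notify_on_call_clinician(assessment)
        return "escalated_to_human"
    return "continue_session"

def notify_on_call_clinician(assessment: RiskAssessment) -> None:
    # Placeholder for a paging/ticketing integration (hypothetical).
    print(f"Clinician review requested for user {assessment.user_id}, "
          f"transcript {assessment.transcript_id} "
          f"(score={assessment.risk_score:.2f})")

route_assessment(RiskAssessment(user_id="u123", risk_score=0.62, transcript_id="t456"))
```

The design choice that matters here is that the machine’s output is a flag for a person, never a final decision.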

The Path Forward for Augmented Intelligence for Mental Health

Proponents and critics alike must come to grips with the fact that, when it comes to traditional talk therapy and mental health augmented intelligence, it’s not an “either/or” situation.  At various points in history and across industries, there have been fears that human skillsets would become obsolete.  That hasn’t happened yet, though it is reasonable to argue that, over time, human skillsets adapt to the introduction of new technology.

Technology gives humans the opportunity to focus on what humans do best and to let machines do what they do best.  And, in many cases, the two complement each other quite well, enhancing each other’s skill sets to provide a better experience for the patient.

That is where we are now with mental health and augmented intelligence: finding the sweet spot where technology enhances therapeutic care but does not replace it.

Change is both exciting and scary, but it is inevitable.  And, in the case of mental health AI, it should be a welcome change to the current experience of far too many.

Ran Zilca is the Chief Data Science Officer at Happify Health and the author of Ride of Your Life—a Coast to Coast Guide to Finding Inner Peace, describing his 6,000-mile solo motorcycle ride of personal transformation.
