What the Digital Transformation of Mental Health Support Means for Employers — and Employees

AI-generated image of a God-like figure at a keyboard
Into the ether and into the minds (photo credit: Antonio Sadaric and Midjourney)

In the olden days, when at a loss for what to do for sick individuals, doctors and apothecaries ordered or dispensed “ut aliquid erit” medicine or salves — “so that something will occur.” Sometimes the placebo effect worked; often, it didn’t. Regardless, patients felt cared for.

During the Covid pandemic, companies rushed to do “something so that something will occur” to support employees whose mental health was increasingly strained. Providing access to mental health apps rapidly became part of this arsenal — a relatively cheap technological solution to a deep, protracted, very human problem.

Investors understood the promise, and money poured into the space. In January 2021, Talkspace went public at a $1.4 billion valuation. In October of that year, its competitors Headspace and Ginger.io agreed to a $3 billion merger. Telemedicine boomed into a quarter-trillion-dollar industry.

By April 2022, Fortune magazine reported that nearly 20% of workers said their employers offered mental health apps or telehealth appointments. While app makers could offer their products to consumers for free or charge employers low licensing fees, many made money by selling ads or user data, raising privacy and user-experience concerns.

Here we review the pros of a digital approach to providing mental health diagnoses and services, examine its vulnerabilities, and explore what this means for employers — and the role educated consumers (employees and employers) can play in improving services and products.

The road to digital mental health

Since the Greek physician Hippocrates first suggested, around 400 BC, that the mentally unwell were suffering from a physiological disease rather than a curse, medicine and psychotherapy have made progress in helping patients manage and recover.

In the U.S., mental health awareness has come a long way since 1843, when Dorothea Dix began lobbying the U.S. government to fund the building of 32 state psychiatric hospitals to provide better living conditions for the mentally ill.

It took another 165 years to improve mental health service coverage. The 2008 Mental Health Parity and Addiction Equity Act (MHPAEA) required insurers to offer the same coverage for mental health services as other physical health services. No longer could treatment limits (i.e., the number of visits to a health provider for care) or financial burdens (i.e., deductibles and out-of-pocket costs) for mental health services deviate significantly from other essential health services.

The 2010 Patient Protection and Affordable Care Act (ACA) extended the MHPAEA’s mental health services parity requirement to small-group insurers and Medicaid. Expanding Medicaid coverage was particularly important because Medicaid funded more mental health services than any other US insurer. Experts predicted that the ACA and MHPAEA would extend mental health coverage to over 60 million Americans.(1)

Yet nearly 2,500 years after Hippocrates, diagnosis and treatment, especially for mood disorders, still mostly rely on patients’ verbalized symptoms, and the practitioner-client relationship, the therapeutic alliance, has been considered irreplaceable as the cornerstone of any psychological treatment.

Digital tools are challenging this understanding.

With the advent of the Internet, healthcare providers began using telemedicine in four main ways: for hospital-to-hospital consultations among physicians, for asynchronous patient/clinician communication, for real-time patient-clinician interaction, and for patient education.

Even so, many physicians expressed skepticism of telehealth during the 1990s and 2000s, suggesting that treating patients without an in-person examination was unethical. Into the 2010s, it remained frowned upon in many countries.

Since 2010, digital software applications (apps) have emerged as a new format for teletherapy. Some offered online talk therapy via video conversations (e.g., Talkspace, BetterHelp); others focused on mindfulness and sleep (e.g., Headspace, Calm), anxiety (e.g., Worry Watch, MoodKit), coaching (e.g., Modern Health, BetterUp), or addiction support (e.g., I Am Sober); and still others focused on AI-informed chat therapy and insights (e.g., Woebot, Ginger.io).

The Covid-19 pandemic catapulted patients and providers online and caused a mental health crisis, which brought even more patients and providers into the digital fold.

As telehealth options expanded before and especially during the pandemic, mental health became a logical application, given how much the response to Covid challenged our ability to lead fulfilled, productive lives.

By the end of 2020, nearly 20% of American adults were experiencing a mental illness. As millions of Americans finally overcame the stigma of having a mental illness, the demand for treatment drastically increased. Yet access to care, already difficult and expensive, worsened.

At the time, more than 10,000 mental health apps were available in the U.S. By spring 2022, that number had more than doubled. With the rise of telemedicine, still a relatively new concept in mental health treatment, the market grew at blistering speed. Athletes like Michael Phelps endorsed the approach and became spokespeople.

Advertisement for Talkspace showing Olympic swimming champion Michael Phelps seated in a chair at the bottom of an empty pool — “therapy helped change my life and it can help you too”

Pros of digital access to mental health resources…

It helps expand capacity and reach by leveraging therapists. Mental health apps and remote care expanded access, especially in rural areas and underserved, lower-income populations. About one-third of Americans lived in areas without enough mental health providers. Before the pandemic, nearly 60% of US counties had no psychiatrists, meaning that primary care doctors were the sole mental health care providers for over 65% of rural patients. A pre-pandemic study estimated that by 2025 the demand for psychiatric care would exceed supply by as much as 25%. By 2022, burnout among therapists and skyrocketing demand for their services had worsened that math dramatically.

and, to some extent, replace therapists. Several companies had developed chatbots specifically for mental health contexts. Arizona-based Banner Health, for example, partnered with two chatbot companies — Buoy Health and LifeLink — to help patients navigate the health system while reducing unnecessary healthcare spending. Another company, Woebot, offered a chatbot that asked users questions to assess their mood, behaviors, and cognitive ability and directed them to other care resources as needed.

It can provide access to diagnoses and care at a lower cost. For many individuals, paying for mental health care is prohibitive and a significant barrier to treatment. The economic dislocation of the Covid pandemic exacerbated these problems as millions of Americans lost their jobs and, with them, often their insurance coverage. Because many insurers’ networks included too few mental health providers, patients often faced long wait times for an appointment and traveled long distances to see a provider. As a result, 32% of behavioral outpatient care was out-of-network, versus only 6% of other medical and surgical care.

Extensive advertising helps reduce stigma — or avoid it altogether by making it easier to access resources outside business hours and in the privacy of one’s own digital home. Stigma often kept individuals from acknowledging issues to themselves and others; many waited years before getting treated, even though practitioners thought early intervention promised better outcomes. In addition, while some employers offered mental health days or generous sick time, most employees found it hard to take time off regularly and confidentially to visit doctors and therapists. This was particularly challenging during the pandemic, when sick-time policies were less clear.

… and risks and limitations

Accountability for such digital therapeutics and other online offerings was sparse. Efforts to address this accountability gap were ongoing but placed much of the burden on patients and providers. To help, the American Psychiatric Association suggested 37 questions for providers and patients to use to determine which apps might be a good fit for them.

Among thousands of health-focused apps in the app stores, 64% claimed effectiveness, yet only 2.7% provided data or pilot study results about the app’s effectiveness. Most of these programs did not fall within the FDA’s regulations, so they had not run clinical studies to demonstrate their effectiveness. This left consumers unaware of which online offerings would best meet their needs. A 2016 study of almost 1,000 apps for depression found only 35% to be clinically relevant; 3% had clinical effectiveness data available.

Privacy had emerged as a significant issue. The problem was neither new nor restricted to mental health — weather apps, for example, shared users’ location data with advertisers — but it was particularly troubling in this context. One study of 36 apps intended to help with smoking cessation and depression found that 81% shared data with Google and Facebook, though fewer than half publicly disclosed doing so. An investigation by the online publication Jezebel found that BetterHelp shared some user data with third parties, including Facebook, Google, and Snapchat. While the content of therapy sessions was kept private, the data shared with third parties contained potentially sensitive information, such as how and when users used the app.

In some instances, sharing private customer health data did not contravene health data protection rules because the apps were developed by third parties. In addition, people using teletherapy at home or at work may face privacy issues if others can overhear their sessions.

Most users rapidly churned through the apps — reducing their impact… Engagement and retention are key: apps need to be engaging to keep users. One study of 93 apps “targeting anxiety, depression, or emotional wellbeing,” each with more than 10,000 installs, found a median open rate of 4.0% and a median retention rate, 30 days after download, of just 3.3%. Some studies indicated that teletherapy might be more effective for children and youths than in-person care due to the “novelty of the interaction.” However, as with dating apps, the odds are that once the novelty wears off, such tools will be gamed — answers will be bogus or even fraudulent, especially if users are pursuing unneeded disability status or special accommodations at work or at school. When that happens, it will be a bitter pill for those who genuinely are disabled.

or become too passive. Many mood-tracking apps deliver validated questionnaires that ask users to reflect on their symptoms over the past week or two. These will be only as accurate as such measures ever are, with all the limitations of self-reporting. Once people are used to the questions, they can blow through them, and there is a risk of “survey fatigue” as participants go through the motions. The questions are also unidirectional: patients cannot interrogate the machine.

and not benefit from one of the most important aspects of any therapy: confrontation. Buzzwords such as “too much information,” “boundaries,” and “codependence” all arose from efforts to reinforce the power of the therapist’s very personal relationship with the patient. In that space, confrontation is required, and it is confrontation that ensures growth. It is also a dangerous space, with opportunities for misunderstanding and for implied actions and promises. The working alliance of therapy resides in exploring the psychological contract made in the work. Confrontation can also occur outside therapy: where there is a history and a context of love and acceptance, it can be critical to growth and harm avoidance. A piece of software cannot do this.

The American Psychological Association is exploring the intersection of artificial intelligence and psychology as AI changes every aspect of psychology and mental health. Psychologists are trained in human interaction, and some of their skills are irreplaceable, but thoughtful and strategic implementation of AI cannot be overlooked.

A more general risk to patients and providers alike, including employers, is cybersecurity.

As the industry grew, so did its attractiveness to hackers. If mental health data fell into the wrong hands on a large scale, the trust and security of the whole industry could be put at risk, and people experiencing mental health issues might no longer seek help from telehealth applications. Groups at risk of persecution, such as undocumented migrants, should not have to worry that leaked information could lead to their deportation or jailing.

Some argued that these businesses needed to be held to higher standards before gaining the trust of patients battling mental illnesses.

Despite the massive risk, the U.S. Department of Health and Human Services announced during the pandemic that it would waive penalties for HIPAA violations by healthcare providers acting in “good faith,” effectively giving telehealth companies a grace period to comply with HIPAA guidelines. Other countries also gave their telehealth businesses time to adapt to the digital age of health.

However, in December 2022, the U.S. Health and Human Services Department issued a Bulletin on guidelines that HIPAA-covered entities should follow when using online tracking technology. The department also investigated Cerebral, a mental health platform offering online therapy and assessments, for sharing patient data with third-party advertisers via online tracking technologies.

For the Finnish company Vastaamo, poor data security led to its eventual demise. Its database was hacked, and over 25,000 patients received ransom demands in exchange for their confidential data not being leaked publicly. The leak embarrassed some patients; for others, it was incriminating and even life-threatening. In the UK, Babylon Health, a $2 billion startup, mistakenly sent videos of patients’ private consultations to doctors and other patients, putting at risk over 10 years of video and audio consultation data. The breach stemmed from a software error, yet it could have affected the lives of thousands of patients.

With the customer in mind, the National Cybersecurity Center of Excellence (NCCoE), a branch of the National Institute of Standards and Technology (NIST), focused on cybersecurity and risk mitigation in the telehealth industry and developed a strategic plan for healthcare delivery organizations (HDOs) to create a safer at-home treatment environment for patients.

The plan emphasizes securing the patient portal, whether provided by the HDO or by a third party, through virtual private networks (VPNs), a secure network firewall, and effective access control. Risk assessment and management for the platform must protect it from breaches and leaks, eliminating any possibility of blame falling on the patient. The platform should also establish a baseline of expected behavior so that unusual activity can be detected, monitored, and flagged to risk-management staff.
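To make the “baseline of expected behavior” idea concrete, here is a minimal, purely illustrative sketch — not part of the NCCoE plan, and with made-up numbers — of how a platform might flag unusual portal activity by comparing each day’s access volume against a baseline built from earlier, presumably normal, days:

```python
from statistics import mean, stdev

def flag_unusual_activity(daily_access_counts, threshold=3.0):
    """Return indices of days whose portal-access volume deviates sharply
    from the baseline established by the first two weeks of data.

    daily_access_counts: list of daily access totals, oldest first.
    threshold: deviation (in standard deviations) considered 'unusual'.
    """
    baseline = daily_access_counts[:14]          # first 14 days define "expected behavior"
    mu, sigma = mean(baseline), stdev(baseline)
    unusual = []
    for day, count in enumerate(daily_access_counts[14:], start=14):
        # Flag days far outside the baseline for risk-management review
        if sigma > 0 and abs(count - mu) / sigma > threshold:
            unusual.append(day)
    return unusual

# A stable two-week baseline around 100 accesses/day, then a spike on day 14
counts = [98, 102, 97, 101, 100, 99, 103, 100, 96, 104, 101, 99, 100, 102, 500]
print(flag_unusual_activity(counts))  # the spike on day 14 is flagged
```

A production system would of course track many signals (logins, record views, export volumes) per user and account for weekly seasonality, but the principle — learn a baseline, then alert on deviations — is the same.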

Moving beyond “just doing something” to making things better

Getting this right matters, and investors have started to tighten their criteria, focusing on security, approach, return on investment, and connection to business goals.

Through new and improved security techniques, telehealth can become a trustworthy option for those needing remote access to mental health platforms without worrying about the dangerous outcomes of data breaches or data leaks.

Employers must support apps and options that invest in privacy and security, choosing apps that provide full reassurance about their systems even when they are not bound by health data protection rules. Instead of picking trendy or cool-sounding apps, they can use the American Psychiatric Association’s evaluation model below.

Pyramid of levels of evaluation for mental health apps — available here https://www.psychiatry.org/psychiatrists/practice/mental-health-apps/the-app-evaluation-model
Source: American Psychiatric Association (Psychiatry.org — The App Evaluation Model)

Second, beyond consulting app evaluation databases, employers and their representatives can consult nonprofit and mental health groups for reviews and open up apps to a subset of employees before rolling them out at the firm level. This would allow employees to provide feedback on what type of app (self-guided vs. coaching, meditation/mindfulness, or cognitive behavioral therapy-based) would be most helpful to their colleagues.

Finally, as employers batten down the hatches and pull back on benefits, they can learn from more than two years of accumulated experience with apps and employee mental health benefits. A key element of mental balance is sleep quality, which has declined for years, and more precipitously during the pandemic. Helping employees defray the cost of fitness and sleep trackers, sleep support apps, and CPAP machines could yield great individual results and, if the data were securely collected, enterprise-level insights that could help teams and organizations adapt starting times and schedules.

Improving employee well-being and helping employees manage mental illness needs to evolve into managing for mental health: moving from the reactive and palliative (apps and time off, important as they are) to preventive measures and investments in creating better cultures — the focus of our collective work.

For example, poorly managed diversity can be a significant source of stress, which companies can reduce by better understanding the neuroscience behind discord.

Another need is for imagination, accountability, and humor in human capital sustainability. Examples include individualized stress kits or group stress questionnaires that help employers surface sources of stress and tailor unique approaches, instead of resorting to one app that fits all.

Furthermore, leaders and managers can remind employees — directly and by example — that they, too, play a role in creating toxic workplaces and have a responsibility to do better. To do so, companies need to move away from a leadership and management model that rewards narcissistic personality behavior, and leaders need to let employees and the world know that they, too, struggle at times and were impacted by the pandemic. There is no app for that.

Their employees’ well-being might depend on it, especially in a world where HR and employee support budgets are shrinking while the need for support is growing relentlessly.

This article was compiled by Susanna Harkonen, Carin-Isabel Knoop, Daven Morrisson, MD, and Antonio Sadaric, Ph.D. We welcome thoughts and inputs.

(1) Norah Mulvaney-Day, Brent J. Gibbons, Shums Alikhan, and Mustafa Karakus, “Mental Health Parity and Addiction Equity Act and the Use of Outpatient Behavioral Health Services in the United States, 2005–2016,” Am J Public Health, 2019 June; 109 (Suppl 3): S190-S196.


Carin-Isabel Knoop (on Humans in the Digital Era)

Pragmatic optimist devoted to helping those who care for others at work and beyond. Advocate for compassionate leadership and inclusive and honest environments.