Clinical and Research News
Published Online: 14 May 2019

Mental Health Apps Miss the Mark on Usability Standards, Study Shows

A lack of consensus on what it means to have a “positive user experience” may limit real-world uptake of apps.
A review of 40 studies that evaluated mental health apps found that they all reported positive user-engagement scores—an unusual finding given that health apps are known to have problems keeping users engaged. Underlying problems noted in the review, published March 27 in Psychiatric Services in Advance, were that each study used a different set of subjective and/or objective measures, and none used consistent benchmarks to define a “positive” user experience.
“As with a medication, we need to make sure mobile apps are tolerable before we recommend them to a patient,” said John Torous, M.D., director of the Digital Psychiatry Division at Harvard-affiliated Beth Israel Deaconess Medical Center and a co-author of the study. Digital “tolerability” refers to whether an app is easy to use and is engaging so that it is used repeatedly. These findings indicate that app developers have their own idea of what constitutes usability, he said.
As Torous and his colleagues wrote in the article, “This lack of consensus makes it difficult to compare results across studies, hinders understanding of what makes apps engaging for different users, and limits their real-world uptake.”

Most App Privacy Policies Omit Pertinent Data-Sharing Details

Usability is just one area where mobile apps might not be all they are advertised to be. Another recent analysis co-authored by John Torous, M.D., found that mental health apps often have inadequate and/or misleading privacy policies. Torous and colleagues assessed the privacy disclosures of 36 popular apps for treating depression or quitting smoking and examined what data (both encrypted and unencrypted) were transmitted following simulated use.
While 25 of the 36 apps (69%) incorporated a privacy policy, in many cases the policy was not comprehensive. For example, 22 of these 25 policies provided information about the primary uses of collected data (such as using user data to improve app performance), but only 16 described secondary uses (such as sharing data if needed with legal authorities). Further, only 13 of the policies described how to opt out of data sharing, only eight provided information about data-retention practices, and only three discussed what happens to a person’s data in the event the company is bought or dissolved.
Almost all of the studied apps (33 of 36) transmitted user data to third parties, with Google and Facebook analytic services being the dominant destinations. In several cases, the apps failed to disclose in their privacy policy that such third-party transmission would occur or stated that such sharing would not occur. The researchers did not observe any transmission of personally identifiable information, but data sent to third parties routinely included information that could be linked back to the device.
“Our data highlight that, without sustained and technical efforts to audit actual data transmissions, relying solely on either self-certification or policy audit may fail to detect important privacy risks,” Torous and colleagues wrote. “For example, consolidation of data processing into a few transnational companies underlines the risk that user data may be inadvertently moved into jurisdictions with fewer user protections or that this may be exploited by malicious actors.”
This study was published April 19 in JAMA Network Open and can be accessed here.
Of the 40 studies in the analysis, nine evaluated mobile apps for depression, four for bipolar disorder, seven for schizophrenia, seven for anxiety, and 13 for apps designed for multiple psychiatric disorders. The studies were selected because they all reported user-engagement indicators (UEI), a variety of measures describing the degree to which users find an app easy to use and engaging.
All of the studies reported that their app had a positive UEI rating. Of these, 15 studies used only subjective data (such as participant surveys or interviews), four used only objective data (such as verified number of login sessions), and 21 used a combination of measures.
“It is concerning that 15 of the 40 (38%) studies concluded that their app had positive UEIs without considering objective data,” Torous and colleagues wrote.
“Qualitative data are unquestionably valuable for creating a fuller, more nuanced picture of participants. ... However, there is also a need for objective measurements that can be reproduced to validate initial results and create a baseline for generalizing results of any single study.”
A problem with the studies that used objective data, however, was that most (20 of 25) did not set thresholds for a good score in advance; all analyses were retrospective.
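As a back-of-envelope check on the breakdown the review reports, the counts above can be tallied in a short sketch. The category labels below are illustrative shorthand, not the authors' terminology:

```python
# Tally of how the 40 reviewed studies measured user engagement,
# using the counts reported in the review.
study_counts = {
    "subjective_only": 15,  # e.g., participant surveys or interviews
    "objective_only": 4,    # e.g., verified number of login sessions
    "combined": 21,         # both subjective and objective measures
}

total = sum(study_counts.values())
assert total == 40  # matches the 40 studies in the analysis

# Share of studies that concluded a positive UEI without any
# objective data, as the authors highlight.
subjective_share = study_counts["subjective_only"] / total
print(f"{subjective_share:.0%} of studies relied on subjective data alone")
# → 38% of studies relied on subjective data alone
```

The 38% figure matches the proportion the authors flag as concerning (15 of 40 studies).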
Of the studies that included both subjective and objective measures, many set low thresholds for a positive UEI rating. For example, one study considered a user-satisfaction score of 60% to be sufficient, while another required app users to complete only one-third of their assigned tasks in a week.
In addition to low thresholds within individual studies, thresholds were inconsistent across studies. For example, frequency of usage was a common objective marker, but acceptable usage rates varied from once a day to just a few times a month.
Torous acknowledged that each of these 40 mental health apps was developed for a different purpose; therefore, some variation is expected. Still, he believes it is possible to develop some usability standards to make comparisons and evaluations easier and more reliable.
This study was funded by National Institutes of Health career development awards given to Torous and study co-author Mia Minen, M.D. ■
“User Engagement in Mental Health Apps: A Review of Measurement, Reporting, and Validity” can be accessed here.
