Mental Health and Prayer Apps Fail the Privacy Test

Over 90% of the mental wellness and prayer apps reviewed contain serious privacy issues, and most also raise cybersecurity concerns, according to a new analysis from Mozilla.

The non-profit behind the open-source Firefox web browser used the start of Mental Health Awareness Month to update its Privacy Not Included guide.

It found that 29 of the 32 apps appraised did not pass Mozilla’s privacy requirements, while 25 failed to meet its Minimum Security Standards, which cover areas such as encryption, security updates, strong passwords and vulnerability management.

After spending over 255 hours researching and writing the guide, the team reported that many apps routinely share sensitive data, allow weak passwords, target vulnerable users with personalized ads and feature poorly written privacy policies.

“The vast majority of mental health and prayer apps are exceptionally creepy. They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data,” argued Mozilla Privacy Not Included lead Jen Caltrider.

“Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”

The six worst offenders on the list featured “incredibly vague and messy privacy policies,” shared personal information with third parties and/or collected chat transcripts.

The researchers also noted that only one of the app developers they analyzed responded to their questions in a timely manner, despite each being sent requests for more information three times.

Mental health apps, in particular, harvest large amounts of data from their users and, in some cases, also extract information from other apps on the same device, such as Facebook, according to the report.

Further, at least eight apps allowed weak passwords ranging from “1” to “11111111,” Mozilla claimed. Only two out of 32 made it into the “best of” category: PTSD Coach, an app made by the US Department of Veterans Affairs, and AI chatbot Wysa.

“Hundreds of millions of dollars are being invested in these apps despite their flaws,” argued Mozilla researcher Misha Rykov. “In some cases, they operate like data-sucking machines with a mental health app veneer. In other words: a wolf in sheep’s clothing.”
