On Monday, Mozilla released the findings of a new study into these types of apps, which often deal with sensitive topics including depression, mental health, anxiety, domestic violence, and PTSD, alongside religion-themed services.
According to Mozilla’s latest *Privacy Not Included guide, despite the deeply personal information these apps manage, they “routinely share data, allow weak passwords, target vulnerable users with personalized ads, and feature vague and poorly written privacy policies.”
In a study of 32 applications geared toward mental health and religion, the organization found that 25 of them did not meet Mozilla’s Minimum Security Standards.
These standards act as a benchmark for the *Privacy Not Included reports. Mismanagement or unauthorized sharing and sale of user data, vague data management policies, a lack of encryption, weak password policies, no clear vulnerability management system, and other lax security practices can all downgrade a vendor's product in Mozilla's assessment.
If an app or service fails to meet these basic requirements, it is slapped with the “*Privacy Not Included” warning label.
The mental health and prayer-related apps have earned a distinction, though not one you’d covet. The company says:
“When it comes to protecting people’s privacy and security, mental health and prayer apps are worse than any other product category Mozilla researchers have reviewed over the past six years.”
The organization examined apps including Talkspace, Better Help, Calm, Glorify, 7 Cups, Wysa, Headspace, and Better Stop Suicide. Each app reviewed now has a dedicated page in the guide detailing its privacy and security rating.
For example, Better Stop Suicide, a suicide prevention app, failed Mozilla’s test.
“Holy vague and messy privacy policy Batman! Better Stop Suicide’s privacy policy is bad,” Mozilla says. “Like, get a failing grade from your high school English teacher bad.”
While the app gathers some personal information and says users can contact the developers with further queries, the company did not respond to Mozilla’s attempts at contact, and its policy does not identify the “trusted partners” with whom data is shared.
Only two applications on the list, PTSD Coach and the AI chatbot Wysa, seemed to take data management and user privacy seriously.
“The vast majority of mental health and prayer apps are exceptionally creepy,” commented Jen Caltrider, Mozilla’s *Privacy Not Included lead. “They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data. Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”
Update 19.43 GMT: Talkspace told ZDNet:
“Mozilla’s report lacks context from Talkspace and contains major inaccuracies which we are working with Mozilla to address. We have one of the most comprehensive privacy policies in the industry, and it is misleading to assert we collect user data or chat transcripts for anything other than the provision of treatment. We recently updated our privacy policy to be more transparent with customers and provide clarity on data-sharing policies. Talkspace’s Notice of Privacy Practices linked here also provides additional information on how data is used in collaboration with the updated privacy policy.”