As demand for mental health services continues to rise, the latest round of the Mozilla Foundation’s *Privacy Not Included research finds that, despite earlier warnings, many mental health app developers still need to shape up.
The foundation has slapped 59% of the mental health apps it studied with *Privacy Not Included warning labels because they fail to safeguard users’ privacy and protect their data.
“Our main goal is better protection for consumers, so we were encouraged to see that some apps made changes that amount to better privacy for the public,” said Jen Caltrider, privacy researcher, consumer privacy advocate, and Mozilla’s *Privacy Not Included team lead.
“And sometimes all that had to be done to make those positive changes was to ask the companies to do better. But the worst offenders are still letting consumers down in scary ways, tracking and sharing their most intimate information and leaving them incredibly vulnerable. The handful of apps that handle data responsibly and respectfully prove that it can be done right.”
The good and the bad
Some developers took heed of the previous Mozilla mental health app study, so there were some good – as well as some are-you-kidding-me – results. Almost a third of the apps, including Youper, Woebot, PTSD Coach, and the AI chatbot Wysa, made improvements over their 2022 performance. Those last two received a “Best Of” citation, which Mozilla uses to spotlight the apps doing privacy and security the right way.
One piece of bad news was that an astonishing 40% of the apps researched got worse in the last year. One that Mozilla researchers found troubling was Replika: My AI Friend, an app downloaded 10 million times on Google Play and “millions” more (according to the description) on the Apple App Store.
The analysts called Replika one of the worst apps they have ever reviewed because of its weak password requirements, its sharing of personal data with advertisers, and its recording of the personal photos, videos, and voice and text messages consumers shared with the app’s chatbot.
Another scary app was Cerebral. It set a new mark for the number of trackers: 799 within the first minute of download. Plus, the foundation charged that several others — Talkspace, Happify, and BetterHelp — couldn’t wait to get their hands on a user’s private information, reportedly pushing consumers into taking questionnaires up front without asking for consent.
"They claim collecting your information will help them deliver you a better service, but ... they aren’t using your personal information to help you feel better, they are using your personal information to make them money."
How to carefully choose a mental health app
Given that there are plenty of pitfalls embedded in the mental health apps Mozilla reviewed, the smart money is on finding out what those are before downloading one. ConsumerAffairs asked Caltrider and Lucas Hamrick, CEO of ORE Sys, for their best practices in this regard.
Read the “About this app” section in the app store. Both emphasized the importance of clicking the little arrow beside the “About this app” section of a listing before downloading the app.
"Then I go down to the bottom of that box and find the word ‘Permissions’ and click on the link under there. That tells me what app permissions the app wants to use,” Caltrider said.
“If an app that offers me tips for weight loss wants to know all my contacts, that seems weird. Or if an app that says it can help me recognize songs wants access to my microphone, okay, that makes sense. But if it asks for access to my camera, I’m like, ‘nah, you don’t need that.’”
Hamrick says that if an app implies it can do anything “health” related, the information the developer provides should also cover the therapeutic methods used within the app. “These methods should complement any recognized therapeutic practices in your mental health and wellness journey,” he commented.
Read the privacy policy to learn what information is collected. Privacy policies are daunting documents full of gobbledygook, but Caltrider suggests that app users look for the most telling information: what personal info is collected, how it’s used, and who it’s shared with or sold to. It doesn’t take long to see whether an app triggers their “creepy” sense, she said.
Check trusted resources. Her last suggestion is to see whether a trusted source like Mozilla’s *Privacy Not Included or Common Sense Media has reviewed the app. Common Sense is a good source for parents because it also reviews app concerns like sex, nudity, drinking, smoking, and violence.
“There are people out there doing work like this to help consumers. Use us, we’re here to help!” she said.