
How Mental Health and Prayer Apps Fail Spectacularly at Privacy

Your smartphone can instantly connect you to safe spaces via therapists, guided meditation, or scripture. But how safe—and how private—is the data that comes from these interactions?


In partnership with Mozilla Foundation

Turns out, researching the privacy and security of mental health apps is bad for your mental health. This is the first thing I learned while digging into the growing world of mental health apps in my role as creator and lead of Mozilla’s Privacy Not Included buyer’s guide.

May is Mental Health Awareness Month, and it sure seems the world’s mental health could use some help these days. Enter mental health apps, designed to help people connect with therapists online, meditate, track moods and symptoms, pray, and even play games designed to make us happier. Thanks to the current mental health crisis, these apps are a rapidly growing industry with a lot of growing pains. Chief among those growing pains are frightening concerns about privacy and security. We’re watching health privacy issues run smack into our current out-of-control data economy, and it’s scary.

For Privacy Not Included, we researched the privacy of 32 popular mental health and prayer apps, including BetterHelp, Talkspace, Calm, Headspace, Happify, Wysa, and Pray.com. We learned that too many of these apps share very personal information with advertisers for interest-based ad targeting, sell your information, gather even more data about you from sources such as data brokers, and don’t always have strong security practices to protect all the very personal information you share with them. Yikes! As I was researching these products, I came across a number of articles that helped me understand the ongoing problems with mental health apps and privacy. I want to share them here in hopes they’ll help contextualize our ratings and recommendations.

Jen Caltrider

During a rather unplanned stint working on her Master's degree in Artificial Intelligence, Jen quickly discovered she’s much better at telling stories than writing code. This epiphany led to an interesting career as a journalist covering technology for CNN. Continuing down her random life path, Jen moved from CNN to digital media activism, where she helped pioneer the creative use of digital storytelling to try and leave the world a little better than she found it. That eventually brought her to Mozilla, where she created and now leads Privacy Not Included, a guide to help consumers shop smart for consumer tech that won’t invade their privacy.

Jen spends her days as a consumer privacy advocate, helping people better understand issues around privacy, security, and artificial intelligence in their technology. Just exactly what she thought she’d be doing growing up in rural West Virginia. (Not really. Life is random…and wonderful.)