Apps for substance use disorders, other conditions, may not be as private as we think

About Karen Blum

Karen Blum is AHCJ’s core topic leader on health IT. An independent journalist in the Baltimore area, she has written health IT stories for publications such as Pharmacy Practice News, Clinical Oncology News, Gastroenterology & Endoscopy News, General Surgery News and Infectious Disease Special Edition.

Sara Morrison

About one in five Americans report using mobile health applications (apps), according to survey data published by Gallup in 2019. But users may not be aware that the personal information they enter in those apps is frequently shared with third-party vendors that provide some of those apps' features.

In a recent article for Vox’s Recode, tech reporter Sara Morrison took a deep dive into data privacy — or a potential lack thereof — among mobile apps for substance use disorders, with implications for all health apps. She also covered the outdated laws that allow developers to share users’ information, often without full disclosure.

I interviewed Sara recently about the nuances of this article and her advice to AHCJ members writing about apps. Know what you don’t know and find the right sources to explain it to you, she says. And be aware that any information you enter in a health app could be shared more widely. (Responses have been lightly edited and condensed.)

How did you get the idea for this story?

I’ve talked to Sean O’Brien, principal researcher at ExpressVPN’s Digital Security Lab, a couple of times, and written about app privacy and a HIPAA explainer. He came to me and said he had a report coming out on data collection practices among apps for opioid addiction and recovery — did I want to know more about it? I said yes. I knew about the laws, or lack thereof, regarding health information in general, so it intersected with two things that I thought I knew a reasonable amount about. I looked through the report and reached out to some of the companies named in it to get their side, and talked to the Opioid Policy Institute because they worked with the lab. I also talked to a couple of people about what the laws were regarding this information, because besides HIPAA there is another law regarding substance use disorder data, and I reached out to the Substance Abuse and Mental Health Services Administration (SAMHSA).

I was very conflicted about doing this story for a while because I didn’t want to write an alarmist article that discouraged people who might need or benefit from these apps from using them, especially if the apps were doing a good job of what they were supposed to do. I talked that through with the Opioid Policy Institute and with developers themselves. We don’t know that these apps have done anything wrong; we just know there’s the potential for that because of the way apps are built. We also don’t always know which data are being shared, and the companies all say they’re doing everything they’re legally required to do, and that they care about it.

I wanted the story to focus on an issue with health apps in general, not just those for substance use disorders. People assume that apps can protect their health privacy, but some apps, depending on what you use them for, don’t really have any requirements to protect it at all. People assume anything that deals with their health is protected by some kind of law, and it’s not. Even apps that are held to some of the strictest privacy standards are still sharing information and data, because apps have to in order to make certain features work.

I also wanted it to have a broader focus. Let’s assume the majority of our readers don’t have a substance use disorder, and won’t ever need to use these apps — why would this be relevant to them? Well, you probably have a Fitbit, or use one of those period-tracker apps, or you might have an Apple Watch that monitors heart rate. Those all are apps, and people may assume a level of privacy that isn’t there.

Your story mentions that mobile app developers commonly collect data and track users, and share this information with third parties. Why do they do this?

When a developer builds an app, there might be functions or features in it that they can’t develop themselves, or that are much easier to get from another company. Instead of building their own telehealth portal, they might use Zoom. Or, if they’re trying to measure usage or sell ads, they send data about what people on the app are doing to Facebook or Google, which use that information to target ads, and the developer makes money off the app. Most of the data shared is for functions the app needs, but because another company or third party is involved, there should be restrictions on what is shared. You can have HIPAA business associate or qualified service agreements with these companies and make sure those companies are doing what they’re supposed to do, too. But there’s potential for things to go wrong or for data to be exposed, much more so on apps that aren’t health and medical apps, including games you play. Anything with a Facebook login is sending data to Facebook, for example; it connects your Facebook account to your activity on that app. In exchange, it offers developers an easy way to let people log in rather than building that themselves.

The article discusses the use of software development kits (SDKs) — tools made by third parties that app developers can use to add additional functions. But to use them, they need to share data about their users. Are health apps as transparent as they should be about this?

I don’t think so. You have to rely on them telling you, through whatever policies they have, that they work with outside companies, and almost no app does that fully. They’ll say "we work with third parties," but may not spell out who those parties are, or why. They don’t really have to. The thing that has always annoyed me about SDKs is that you know you’re giving data to an app: you know the app knows your phone is using it, how much time you spend on it, and maybe what IP (internet protocol) address you’re connecting through if you’re on Wi-Fi. But you don’t realize that data is also possibly going to a whole bunch of other companies, some you’ve never even heard of, and they don’t really have to disclose it. People are surprised when they find out Facebook or Google are getting their information.
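To make the mechanism concrete, here is a minimal, entirely hypothetical sketch in Python of the kind of event payload an embedded analytics SDK might assemble before shipping it to a vendor’s servers. The field names, app identifier, and values are illustrative assumptions, not taken from any real SDK; the point is simply that even a bare "usage" event can reveal the app’s purpose and the user’s device details.

```python
import json


def build_analytics_event(app_id: str, device: dict, action: str) -> str:
    """Bundle app-usage data the way a third-party analytics SDK typically
    would before sending it off-device. The developer's own code often
    never sees where this payload ultimately goes."""
    event = {
        "app_id": app_id,                         # identifies which app sent the event
        "device_model": device.get("model"),      # hardware fingerprinting signal
        "ip_address": device.get("ip"),           # observable when on Wi-Fi
        "action": action,                         # reveals what the user did in the app
        "session_seconds": device.get("session_seconds"),  # time spent in the app
    }
    return json.dumps(event)


# Hypothetical example: the app's bundle ID alone signals that the
# user has a recovery app installed.
payload = build_analytics_event(
    "com.example.recovery-app",
    {"model": "Pixel 7", "ip": "203.0.113.9", "session_seconds": 412},
    "opened_support_chat",
)
print(payload)
```

Nothing in this sketch is malicious: each field plausibly serves a legitimate feature or metric. The privacy problem Morrison describes is that the user rarely knows this payload exists, or which companies receive it.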

Your article touches on privacy issues among apps. What should our members know about mobile health apps that they might not otherwise?

There seems to be a surprisingly large misconception about health data, especially from people who think that HIPAA somehow protects anybody from asking anything about any health issue. People just assume that because it’s a health thing, it’s protected, and it’s not necessarily. The things you do with your actual doctor through your doctor’s office almost certainly are. But if you look up lupus online and you get targeted ads for lupus products, that wasn’t protected. Any time you write about an app, it’s not a bad idea to call up people who know something about them.

You wrote that mobile health apps are in a gray area when it comes to following federal privacy laws. What would need to happen to better protect users?

Laws are very bad at keeping up with things that move quickly, like technology. I’d like to see much more transparency. A lot of the laws we do have use an opt-out model. People don’t go to 100 different apps and websites and click "opt out"; it’s not how humans work. So we need laws that put the burden on the companies themselves to be more transparent — to explicitly state what they’re collecting, why, and who it’s going to — and that allow agencies to audit them and make sure they’re doing the things they’re supposed to do. Right now, it’s really hard: it’s not obvious where your information is going or whether it’s being treated correctly. Instead, you should be given ways to opt in to data collection.

Your article also mentions it’s possible, but not easy, to build apps that truly are private. Why don’t companies do that?

A lot of it is expertise. If you allow payments within apps, you might use PayPal or Apple Pay to protect against fraud and keep people’s credit card information secure. Those companies have tons of money, and they have done a lot of work. It would make sense, and may even be better, to use companies like those instead of having two engineers try to figure out their own system. I have to imagine that building your own telehealth portal is very difficult; it takes a long time, and you still don’t have any guarantee that you’ll do it well, whereas an established service is right there. One company I talked to that didn’t use any third parties said that if there were features they wanted to add, there might come a time when they would need to use an outside company, and then they would have to consider what data they would share and how it would be treated. They would do a cost-benefit analysis: if it was better for their clients or patients to have the added features than not, then they would have to make a choice. There’s a trade-off.

What advice do you have for our members to consider when writing about apps and privacy?

I don’t think this is different from any other journalism, but use experts in the field. I had to know enough about the law to understand what I didn’t know, and I had to talk to some other people who knew the law better. I always say, "Know what you don’t know." Don’t assume you know everything because you read it on a website or on the Department of Health and Human Services’ site, because there are a lot of nuances. And figure out who can explain it to you.
