Who Really Owns All Your Health Data?

Sleep patterns. Heart rates. Menstrual cycles. Weight fluctuations. Medication schedules. The location of major world leaders. Every morning, millions of people strap on smartwatches, open period-tracking apps, and upload their most intimate details to the cloud. We’re told this data will optimize our health, help us live better lives. But there’s a darker question lurking beneath the surface: Who actually owns all this information, and where exactly is the line between optimization and surveillance?

First things first: What HIPAA actually protects

In discussing this topic with friends and family, I found that nearly everyone assumed their health data enjoys robust federal protections under HIPAA (the Health Insurance Portability and Accountability Act). Sadly, they’re wrong. HIPAA applies exclusively to “Covered Entities”: health plans, healthcare clearinghouses, and healthcare providers. The fitness tracker on your wrist? Not covered. The period-tracking app on your phone? Not covered. The sleep monitor beside your bed? You get the picture.

“When we think we are protected and we’re not—that’s when we run into danger,” says Ron Zayas, an online privacy expert and CEO of Ironwall by Incogni. “So when you let a company collect your health data, it is safe to assume two things: 1) you are not covered by HIPAA protections, and 2) the company is going to sell your data.” The reason is simple economics. Selling user information often generates more revenue than the product itself. Your health data is intensely personal, which makes it intensely valuable.

What happens when we don’t own our health data

I remember firsthand when my friends and I frantically deleted period-tracking apps after the Supreme Court overturned Roe v. Wade in 2022. What once felt like simple tools for monitoring my cycle suddenly looked a lot like potential evidence in criminal investigations. We were terrified our menstrual data could be subpoenaed to prove we’d had abortions, and this fear wasn’t paranoid. As Zayas explains, governments can purchase the same data anyone else can and cross-reference it with location information from mobile phones. “When you had—or skipped—your period can imply if you are pregnant or trying to get pregnant,” he says. “Governments can buy this information and tie it to your recent trips to decide if you had an abortion or miscarriage.”

At the same time, I love all kinds of health-related “optimization.” I love sharing my runs on Strava and checking my sleep score on my Garmin. Beyond my vanities, health gadgets can deliver life-changing benefits—monitoring blood sugar, tracking heart rate variability, detecting irregular sleep patterns. But what happens when that data shows you’re not exercising enough, or eating poorly, or sleeping irregularly? Could your insurance rates increase? Could you be denied coverage?

As with the period-tracking fears, the very real concern here is that the same data streams that help you feel in control of your health—that make your daily life more “optimized”—can be exploited for insurance profiling, targeted advertising, or even employment decisions, if data-sharing policies aren’t strictly controlled. Let’s take a look at the fine print to see where exactly your data is going, and what you can do to protect yourself.

The fine print nobody reads

Julia Zhen, a third-party information security risk manager at a major nonprofit, says, “If you want to know what information is being gathered and/or stored—which are two distinct acts—start with the privacy policy for the app itself.” On top of that, third parties like the Google app store have their own terms of service, creating multiple points of data collection to investigate.

Zhen recommends a shortcut: Search for keywords like “sell” or “share” within privacy policies to quickly understand what happens to your data. “Most of the time, companies are de-identifying individuals from their data because they want to aggregate information and speak to certain demographics,” she explains. That aggregation still might raise ethical concerns, but according to Zhen, it’s industry standard practice these days.
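Zhen’s keyword shortcut is easy to automate. Here’s a minimal sketch in Python: the keyword list and the sample policy text are illustrative assumptions, not an exhaustive audit, but it shows how to pull out every sentence that mentions a red-flag term so you can read those passages in context.

```python
import re

# Red-flag keywords to search for; this list is illustrative, not exhaustive.
KEYWORDS = ["sell", "share", "third party", "third-party", "retention"]

def scan_policy(text, keywords=KEYWORDS):
    """Map each keyword to the sentences in `text` that mention it."""
    # Rough sentence split on end punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    hits = {}
    for kw in keywords:
        matched = [s.strip() for s in sentences if kw.lower() in s.lower()]
        if matched:
            hits[kw] = matched
    return hits

# Demo on a made-up policy excerpt:
sample = (
    "We collect usage data to improve our services. "
    "We may share aggregated data with third party partners. "
    "We do not sell your personal information."
)
for kw, found in scan_policy(sample).items():
    print(f"{kw}: {len(found)} mention(s)")
```

Paste in the full text of a policy (and the app store’s terms, since Zhen notes those are separate points of collection) and read every flagged sentence—context matters, since “we do not sell” and “we may sell” both trip the same keyword.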

Using this strategy, Zhen says she has encountered privacy policies that brazenly admit to selling user data. And even when companies claim to anonymize information, the protection isn’t foolproof. Jacob Kalvo, a cybersecurity expert and CEO at Live Proxies, says risks of re-identification remain down the line. Even a giant like Apple can’t safeguard your data once you choose to share it beyond its ecosystem. Jake Peterson, Lifehacker’s senior technology editor, says, “Apple has some good privacy policies in place to keep your health data private, but if you choose to share it with outside sources, you’ll lose that control.” In other words, if you share medical data directly with a healthcare provider through the Health app and later delete it, Apple won’t retain it anymore, but you might not have control over the data your healthcare provider has collected.

How to protect yourself in the digital health age

Even if you trust a company’s privacy policy today, there’s another threat lurking: cybersecurity breaches. “The real risk that we accept on a daily basis is hackers and cyber attacks,” Zhen says.

Hackers are sophisticated, and you can count on them staying ahead of security defenses. Even if companies don’t intentionally sell your data, they can be careless. Most privacy policies acknowledge only that companies try to protect against attacks, and breaches are endemic in the tech industry. Your carefully guarded health information could be stolen and sold on the dark web regardless of a company’s good intentions. Once your data is leaked, it can be used outside your control with zero recourse.

When asked about period-tracking apps in the current political climate, Zhen says these service providers “may be at a higher rate of being targeted by cyber attacks because of restrictive reproductive laws.” This is important to keep in mind across platforms: What information are you willing to risk?

Still, this doesn’t have to mean abandoning health tech entirely. Experts agree on several protective measures:

Read the damn privacy policy. Zhen’s advice is to go straight to the privacy policy for every point of data collection and search for keywords like “sell” and “share.” Most policies include data retention information and a contact email where you can request details about what information they hold on you.

Understand what you’re giving up. Before downloading an app, understand exactly what data it collects and why. When in doubt, assume the worst in every privacy policy.

Practice good data hygiene. As a rule, avoid ever giving out your mobile number. Use alias email addresses you don’t use elsewhere. Enable a VPN to hide your identity and location. Turn on multi-factor authentication everywhere.

Don’t overshare. Don’t give out any more information than you need for your purposes. Does the company need to know your exact birthdate, or just a year? Do they need to know where you live? If not, don’t provide the information, or feel free to lie when you can.

Remember that privacy policies aren’t binding contracts. Companies typically reserve the right to modify their terms whenever they want.

The bottom line

The reality is most people accept all sorts of data-collection risks daily, because modern life demands it. My goal here isn’t to fear-monger, but to help you make informed choices in what ultimately is a calculated gamble.

If you are the kind of person who posts on social media, downloads apps to order takeout, and accepts risk as it comes with the convenience of modern tech norms, then “downloading a reputable health metrics app is usually going to be fine—so long as the privacy policy isn’t directly stating they’re selling your data,” Zhen says.

Then again, I’d argue your health data is more intimate, more permanent, and more potentially damaging than your food delivery history. If you ask me, we’re conducting a massive, uncontrolled experiment in health surveillance, and we’re all the test subjects. The technology offers genuine benefits—better health outcomes, earlier disease detection, personalized medicine. But we’re trading something precious and poorly understood for those benefits: privacy, autonomy, and control over our most intimate information.

The question isn’t whether to use health tech. For many people, the benefits are too significant to ignore. The question is whether we’re making that choice with full awareness of what we’re giving up—and whether the companies collecting our data can be held accountable, if and when a reckoning comes.
