Wearable devices are collecting valuable information about us. The question is, how can it be protected?
This story is part of the Pulse series “Bit by Bit: How Data is Shaping our Health”
The Apple Watch has been on wrists around the country for about a week, tracking steps, counting calories and monitoring heart rates with on-board accelerometers and optical heart rate sensors.
The Watch ushers in a new era in wearable technology, one that has privacy experts and lawmakers asking where all that sensitive health data will ultimately end up.
“Things that happen outside of a doctor’s office—your apps, your fitness trackers, your gym memberships—none of that stuff is protected by HIPAA,” said University of Pennsylvania privacy researcher Tim Libert.
HIPAA, or the Health Insurance Portability and Accountability Act, is the federal legislation that limits who has access to a patient’s clinical health records.
The law does not govern data gathered by the Apple Watch or Fitbit or more sophisticated wearables that measure blood oxygen levels, brain activity, and REM sleep. Nor does it govern the data that users enter themselves into apps that do everything from tracking menstrual cycles to sending out medication reminders.
“For the most part [that data] is completely unregulated at a federal level, so most of the regulations in that space are self-regulations,” Libert said. “As we’ve seen over the past 10 or 20 years, the industry does not police itself virtually at all.”
Could Apple Watch set the tone for health tech?
The Apple Watch is an exception to that rule. Apple recently implemented new privacy restrictions for apps that tap into health information in HealthKit and use the Apple Watch’s heart rate sensor.
“I think Apple really put a flag in the ground when they came out with their Apple Watch and a set of developer agreements that really defined how (developers) can use the data,” said Morgan Reed, executive director of The App Association, a Washington, D.C., based trade organization that represents software companies.
Reed said Apple leaves personal health data on the device, and lets the user choose which apps to share it with. The company also restricts data from being used for targeted behavioral advertising.
“What that means is that your information can’t be used to say, ‘Oh, you have this health condition, we’re going to show you ads specific to your personal health condition,'” Reed said.
Apple also forbids developers from selling health data to advertisers.
These rules only apply to Apple’s health apps, and it remains to be seen how well the company will enforce them. Still, even setting these rules is a step toward privacy much of the rest of the industry has not yet taken.
Reed has been analyzing privacy policies of top health and fitness apps, and while he said he doesn’t know of specific firms that sell information to advertisers or data aggregators, some privacy policies do reserve the right for the companies to do so.
Third parties already access health and fitness data
The Federal Trade Commission, federal lawmakers and the Attorney General of at least one state have all been investigating privacy and data security in consumer-generated health data in the last year.
An FTC study presented last year found that information gathered by 12 mobile health and fitness apps was transmitted to 76 different third parties. That information included exercise routines, diet, and symptom searches, and, according to the FTC, could be re-identified back to individual customers.
Selling data is by no means new. Targeted advertising is how Facebook and Google make their money, and data aggregators amass huge amounts of data to sell to marketers.
Industry representative Morgan Reed believes Apple’s new privacy policies will set the tone for the rest of the industry as wearables and apps gather more sensitive health data, but privacy researcher Libert is skeptical.
“If you’re giving away a free app, the money has to come from some place,” Libert said. “The likelihood that you’re going to say everything is encrypted, we can’t see [anything], and we can never look at it, well then how do you make money?”
Concerns for the future
Personal information is the currency of the digital age, and John Wilbanks, Chief Commons Officer at the non-profit biomedical research firm Sage Bionetworks, fears health data is beginning to be viewed just like any other consumer information.
“What we have in sort of consumer culture is really fine grain stratification of people so that we know that you’re more likely to buy a yellow sweater and I’m more likely to buy a blue sweater,” Wilbanks said.
Wilbanks, who worked on the participant consent policies for two of Apple’s secure research apps, is especially worried about what may happen if insurance companies have access to detailed health data that is leaked or sold from less-secure health and wellness apps.
“It becomes easy [for insurance companies] to say, ‘I’m not going to deny you insurance because of factors X, Y, and Z,'” Wilbanks said. “‘I simply used all of this data, and people like you tend to have a bad health record, so I’m not going to give you insurance at a good rate, because my data has put you into this risk pool.'”
“That risk pool would not have been knowable before wearables, and so that’s one of the big fears, and I think it’s a valid one,” Wilbanks said.
Wilbanks likens this to 20th century redlining, where banks wouldn’t lend to residents in neighborhoods heavily populated by minorities.
The Affordable Care Act does not allow insurance pricing to be based on health status in the individual market, but generally does allow basing costs on health status and claims history for employer-based coverage.
Companies are already incorporating the fitness tracker Fitbit into corporate wellness programs, and at least one life insurance company offers discounts to customers who wear one and share the data.
Fitbit has emphatically said it does not sell personal data and does not share it without consent, apart from a few specific exceptions.
Consumers seem largely unfazed
There has not been a highly visible privacy breach with consumer-generated health data, nor does there seem to be much consumer push-back as new sensors, like the heart rate monitor, enter the mainstream.
“I suppose I should be creeped out, but it’s kind of cool,” said Sunkwon Bush, an experience designer at the Center City Philadelphia office of the tech firm Think Brownstone.
He and a group of colleagues share two Apple Watches, rotating them around the office. Bush was especially interested in the health and fitness functions of the watch when he set it up for the first time last week.
“The most interesting thing is, when I started setting up the health part, I could actually see my heart rate,” Bush said. “It also asked my age, my weight, gender, my activity level and then it started giving me goals, so I’m curious to see how that works.”
Phil Charron, vice president of experience design at Think Brownstone and the first recipient of the office’s other watch, said he was looking forward to accessing all of his health information easily in one place.
“At this point I’m not worried about someone else seeing that data,” Charron said. “If I gained five pounds and Apple finds out, I don’t think Jony Ive [Apple’s head designer] cares.”
For now, which health information is collected and shared is largely up to the tech industry, regulated by current consumer protection laws. That means consumers must choose whether the help they get in tracking steps, calories or heart rate is worth any privacy they may sacrifice.
Privacy-conscious smartphone and wearable users can read privacy policies to look for clauses reserving the right to “share” or “sell” personal data. They can also avoid using health apps on public wifi networks, and remember the old maxim about getting things for free online: If you’re not paying for it, you are the product.
An earlier version of this story misspelled the name of Sunkwon Bush.