Are you tracking your health with a device? Here’s what could happen with the data
- The use of wearable technology and medical apps surged in the years following the COVID-19 pandemic, but research released by Mozilla indicates that current laws offer little protection for consumers who are often unaware just how much of their health data are being collected and shared by companies.
- The report suggests existing data protection laws be clarified to encompass all forms of bodily data.
- It also calls for expanding national health privacy laws to cover health-related information collected from health apps and fitness trackers, and making it easier for users to opt out of body-centric data collection.
Every day, millions of people share more intimate information with their accessories than they do with their spouses.
Wearable technology — smartwatches, smart rings, fitness trackers and the like — monitors body-centric data such as your heart rate, steps taken and calories burned, and may record where you go along the way. Like Santa Claus, it knows when you are sleeping (and how well), it knows when you’re awake, it knows when you’ve been idle or exercising, and it keeps track of all of it.
People are also sharing sensitive health information on health and wellness apps, including online mental health and counseling programs. Some women use period tracker apps to map out their monthly cycle.
These devices and services have excited consumers hoping for better insight into their health and lifestyle choices. But the lack of oversight into how body-centric data are used and shared with third parties has prompted concerns from privacy experts, who warn that the data could be sold or lost through data breaches, then used to raise insurance premiums, discriminate surreptitiously against applicants for jobs or housing, and even enable surveillance.
The use of wearable technology and medical apps surged in the years following the COVID-19 pandemic, but research released by Mozilla on Wednesday indicates that current laws offer little protection for consumers who are often unaware just how much of their health data are being collected and shared by companies.
“I’ve been studying the intersections of emerging technologies, data-driven technologies, AI and human rights and social justice for the past 15 years, and since the pandemic I’ve noticed the industry has become hyper-focused on our bodies,” said Mozilla Foundation technology fellow Júlia Keserű, who conducted the research. “That permeates into all kinds of areas of our lives and all kinds of domains within the tech industry.”
The report “From Skin to Screen: Bodily Integrity in the Digital Age” recommends that existing data protection laws be clarified to encompass all forms of bodily data. It also calls for expanding national health privacy laws to cover health-related information collected from health apps and fitness trackers and making it easier for users to opt out of body-centric data collections.
Researchers have been raising alarms about health data privacy for years. Data collected by companies are often sold to data brokers or groups that buy, sell and trade data from the internet to create detailed consumer profiles.
Body-centric data can include information such as the fingerprints used to unlock phones, face scans from facial recognition technology, and data from fitness and fertility trackers, mental health apps and digital medical records.
One of the key reasons health information has value to companies — even when the person’s name is not associated with it — is that advertisers can use the data to send targeted ads to groups of people based on certain details they share. The information contained in these consumer profiles is becoming so detailed, however, that when paired with other data sets that include location information, it could be possible to target specific individuals, Keserű said.
Location data can “expose sophisticated insights about people’s health status, through their visits to places like hospitals or abortion clinics,” Mozilla’s report said, adding that “companies like Google have been reported to keep such data even after promising to delete it.”
A 2023 report by Duke University revealed that data brokers were selling sensitive data on individuals’ mental health conditions on the open market. While many brokers deleted personal identifiers, some provided names and addresses of individuals seeking mental health assistance, according to the report.
In two public surveys conducted as part of the research, Keserű said, participants were outraged and felt exploited in scenarios where their health data were sold for a profit without their knowledge.
“We need a new approach to our digital interactions that recognizes the fundamental rights of individuals to safeguard their bodily data, an issue that speaks directly to human autonomy and dignity,” Keserű said. “As technology continues to advance, it is critical that our laws and practices evolve to meet the unique challenges of this era.”
Consumers often take part in these technologies without fully understanding the implications.
Last month, Elon Musk suggested on X that users submit X-rays, PET scans, MRIs and other medical images to Grok, the platform’s artificial intelligence chatbot, to seek diagnoses. The issue alarmed privacy experts, but many X users heeded Musk’s call and submitted health information to the chatbot.
While X’s privacy policy says that the company will not sell user data to third parties, it does share some information with certain business partners.
Gaps in existing laws have allowed the widespread sharing of biometric and other body-related data.
Health information provided to hospitals, doctor’s offices and medical insurance companies is protected from disclosure under the Health Insurance Portability and Accountability Act, known as HIPAA, which established federal standards protecting such information from release without the patient’s consent. But health data collected by many wearable devices and health and wellness apps don’t fall under HIPAA’s umbrella, said Suzanne Bernstein, counsel at the Electronic Privacy Information Center.
“In the U.S. because we don’t have a comprehensive federal privacy law ... it falls to the state level,” she said. But not every state has weighed in on the issue.
Washington, Nevada and Connecticut all recently passed laws to provide safeguards for consumer health data. Washington, D.C., in July introduced legislation that aimed to require tech companies to adhere to strengthened privacy provisions regarding the collection, sharing, use or sale of consumer health data.
In California, the California Privacy Rights Act regulates how businesses can use certain types of sensitive information, including biometric information, and requires them to offer consumers the ability to opt out of disclosure of sensitive personal information.
“This information being sold or shared with data brokers and other entities hypercharge the online profiling that we’re so used to at this point, and the more sensitive the data, the more sophisticated the profiling can be,” Bernstein said. “A lot of the sharing or selling with third parties is outside the scope of what a consumer would reasonably expect.”
Health information has become a prime target for hackers seeking to extort healthcare agencies and individuals after accessing sensitive patient data.
Health-related cybersecurity breaches and ransom attacks increased more than 4,000% between 2009 and 2023, targeting the booming market of body-centric data, which is expected to exceed $500 billion by 2030, according to the report.
“Nonconsensual data sharing is a big issue,” Keserű said. “Even if it’s biometric data or health data, a lot of the companies are just sharing that data without you knowing, and that is causing a lot of anxiety and questions.”