Does Deregulating HIPAA Benefit Big Data?


The US Department of Health and Human Services (HHS) has recently proposed amendments to the privacy regulations under the Health Insurance Portability and Accountability Act of 1996 (HIPAA). According to HHS, the modifications target standards that hinder the transition to value-based health care by discouraging “care coordination” and “case management” communications among individuals, hospitals, physicians, and other health care providers.

Electronic health records contain vast amounts of medical information about patients, which can be used to train machine learning models that predict health outcomes and help prevent future disease or disability. But at what cost to patient privacy?


Perils of deregulation

Electronic Frontier Foundation (EFF), a nonprofit that champions digital privacy, has filed its objections to the proposed policy changes.

Impact of modifications

  • Expanding previously undefined terms such as “health care operations” to include “case management” and “care coordination” would allow health data to be shared without permission — a warning sign.
  • The proposal lowers the standard of disclosure during emergencies.
  • Covered entities would be required to disclose personal health information (PHI) to mobile health applications not covered by HIPAA upon patient request.
  • Releasing PHI in these ways is a potential threat to patient health and privacy.

The EFF, in its filing, argued that these modifications would undermine trust and deter patients from disclosing sensitive and intimate medical information. According to the EFF, the United States spends more than $100 billion annually on untreated mental illness, and only 2.5 million of the 21.2 million affected patients currently seek treatment. HIPAA deregulation would further discourage such mental health patients from opting for treatment.


According to a 2014 Federal Trade Commission study, personal health apps and devices sold information to third parties, and some of this data could even be linked back to specific users. In addition, third parties received device-specific identifiers and other key health information. If the proposed HIPAA modifications are adopted, the EFF warned, third-party health app vendors will gain access to patients’ PHI. This is risky for patients, who are often unaware of the intricacies of privacy policies, terms of use, permissions, and their consequences.

Further, depending on where the PHI is stored, apps can access it through dubious permissions. Such permissions have serious consequences, as many apps harvest data from the device that is unrelated to what the app is supposed to do.

Improved health care data accessibility has advantages. But is it essential to deregulate HIPAA? Not necessarily. Putting real-world health data to good use comes with many practical challenges: in the context of big data, distributed data silos, the privacy concerns of centralised databases, and the resource constraints of integrating data from multiple sites are a few of them. To address this, researchers at IBM last year introduced a federated learning (FL) framework that can learn from distributed health data held locally at different sites.

The framework offered two levels of privacy protection:

  • Raw data is not shared across sites or with a centralized server during the model training process. 
  • It uses a differential privacy mechanism to further protect the model from potential privacy attacks. 
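The two levels above can be sketched in a toy federated round — this is an illustrative stand-in, not IBM’s actual framework; the 1-D linear model, learning rate, and Gaussian noise scale are all assumptions for demonstration:

```python
import random

random.seed(0)  # reproducible demo

def local_update(w, data, lr=0.05):
    # One gradient step for a 1-D linear model y ≈ w * x,
    # computed entirely on the site's private (x, y) records.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_round(w, sites, noise_scale=0.01):
    # Level 1: only model updates leave each site, never raw records.
    updates = [local_update(w, site) for site in sites]
    # Level 2: noise on the aggregate, standing in for a calibrated
    # differential-privacy mechanism that masks any one site's contribution.
    return sum(updates) / len(updates) + random.gauss(0, noise_scale)

# Three hospitals, each holding private (x, y) pairs drawn from y = 2x.
sites = [[(1, 2), (2, 4)], [(3, 6), (1, 2)], [(2, 4), (4, 8)]]
w = 0.0
for _ in range(200):
    w = federated_round(w, sites)
print(round(w, 1))  # converges near the true weight 2.0
```

The server only ever sees noisy averaged updates; the hospitals’ raw records never move, which is the essence of the two-level protection the framework describes.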

The researchers implemented this framework on healthcare applications using real-world electronic health data of 1 million patients. They observed that the federated learning framework offered an elevated level of privacy while maintaining the utility of the global model.

Despite best practices for limiting access, terms-of-use agreements, and other compliance measures, experts warn that de-identified data can still be re-identified through membership or inference attacks. Outside the US, where GDPR applies, only purely synthetic data is allowed. According to the experts, one solution is to release high-fidelity synthetic data, which can dramatically reduce the risk of disclosure, especially relative to traditional de-identification.
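A classic illustration of why de-identification alone is fragile is a linkage attack: joining a “de-identified” health table to a public record on shared quasi-identifiers. The records and names below are entirely fabricated for the sketch:

```python
# A "de-identified" hospital table: names removed, but quasi-identifiers kept.
deidentified = [
    {"zip": "02138", "birth": "1945-07-22", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth": "1982-03-14", "sex": "M", "diagnosis": "asthma"},
]

# A public dataset (e.g. a voter roll) with the same quasi-identifiers plus names.
public = [
    {"name": "A. Smith", "zip": "02138", "birth": "1945-07-22", "sex": "F"},
    {"name": "B. Jones", "zip": "02139", "birth": "1982-03-14", "sex": "M"},
]

def reidentify(deid_rows, public_rows, keys=("zip", "birth", "sex")):
    # Join the two tables on the quasi-identifiers; a unique match
    # re-attaches a name to a supposedly anonymous diagnosis.
    index = {tuple(r[k] for k in keys): r["name"] for r in public_rows}
    return [
        (index[tuple(r[k] for k in keys)], r["diagnosis"])
        for r in deid_rows
        if tuple(r[k] for k in keys) in index
    ]

print(reidentify(deidentified, public))
# → [('A. Smith', 'hypertension'), ('B. Jones', 'asthma')]
```

Synthetic data sidesteps this attack because the released rows correspond to no real individual, so there is nothing for the join to re-attach a name to.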

The digitisation of medical records ushered in a new era of big data in clinical science. The data can be used to draw insights beyond what an expert could previously extract. As the need to share individual-level medical data for precision medicine continues to grow, more big data tools will come into the picture. However, the EFF reminds us that a major lesson of this pandemic is that people’s trust in the healthcare system is critical to health care outcomes.

“We have seen this in the context of advice about testing, about wearing masks, about social distancing, and about vaccination. Our greatest disappointment is that the Department seems to be promoting more, and less accountable, disclosure of PHI without patient knowledge or consent. This will not promote patient trust, and the RFI will not improve health outcomes,” countered the EFF.


Copyright Analytics India Magazine Pvt Ltd
