What is "Protected Health Information" (PHI)?
- meganjungers
- Feb 9
- 6 min read
Within the United States, sensitive, personally identifiable health data is protected under the Health Insurance Portability and Accountability Act (commonly known as HIPAA), which applies only loosely to information connected to medical technology devices1. When an individual's health data is collected and stored in applications or cloud-based data systems, it becomes accessible through a number of channels that may not carry the same safeguards and protections as traditional electronic health records1.
The protections for identifiable health data originate in HIPAA, enacted in 1996 with the goal of protecting patients from insurance and payment discrimination based on their individual data, and are codified in the HIPAA Privacy Rule (45 C.F.R. Parts 160 and 164)2.
The Privacy Rule defines PHI as:
"Protected Health Information means individually identifiable health information ... that is...
i) transmitted by electronic media
ii) maintained in electronic media
iii) transmitted or maintained in any other form or medium"
"individually identifiable health information is... including demographic information...
1) is created or received by a health care provider, health plan, employer, or health care clearinghouse;
2) and relates to the past, present, or future physical or mental health or condition of an individual; or the past, present, or future payment for the provision of health care to an individual; and
i) that identifies the individual; or
ii) with respect to which there is a reasonable basis to believe the information can be used to identify the individual"
(45 C.F.R. § 160.103)
This explicit definition of what information is considered confidential and protected under law aims to limit the exposure of patient data, and it delineates who is responsible and liable for creating and receiving that data (health care providers, health plans, employers, and health care clearinghouses/insurance billing facilitators). However, the definition fails to encompass an entire industry in which identifiable health information is collected, generated, and retained: the technology industry.
Take, for example, wearable technology. A device worn throughout daily activities can track both health information (vital signs, pre-entered measurements, exercise and lifestyle habits, etc.) and demographic information (location, gender, age). If an individual uses the device regularly, predictive profiles of their current and future health status can be derived from the data the wearable gathers. However, because that data does not originate with, and is not received by, a covered entity, HIPAA's legal safeguards do not apply to it in the same manner.
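To make this concrete, here is a minimal Python sketch of what a single wearable-device record might contain and how a crude health inference could be drawn from it. The field names, sample values, and "risk" heuristic are hypothetical and not drawn from any actual device or vendor; the point is simply that identifying demographics and health measurements travel together, and support inferences, entirely outside HIPAA's covered-entity framework.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class WearableRecord:
    """One day's worth of data from a hypothetical fitness wearable."""
    user_id: str                        # stable identifier tied to an account
    age: int                            # demographic information
    home_zip: str                       # coarse location (demographic)
    resting_heart_rates: list[float]    # health measurements (bpm)
    daily_steps: int                    # exercise/lifestyle habit

def naive_risk_flag(records: list[WearableRecord]) -> bool:
    """Illustrative only: flag a user whose average resting heart rate
    sits above an arbitrary threshold while activity stays low. A real
    predictive profile would be far more sophisticated, but the point
    stands: health inferences are derived from data HIPAA does not
    treat as PHI because no covered entity created or received it."""
    avg_hr = mean(hr for r in records for hr in r.resting_heart_rates)
    avg_steps = mean(r.daily_steps for r in records)
    return avg_hr > 80 and avg_steps < 4000

week = [
    WearableRecord("user-123", 42, "98105", [78.0, 83.5, 85.0], 3200)
    for _ in range(7)
]
print(naive_risk_flag(week))  # True -> a derived "health status" inference
```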
The same goes for apps and even AI algorithms: because the information they handle originates outside HIPAA's scope, it is not held to the same privacy standards. This information is often acquired without the user's knowledge and typically without an informed consent process. Continued frustration with regulatory definitions that fail to cover “protected health data” beyond the information used by medical providers underscores how pressing the issues surrounding the security of patient data have become3. As the medical technology landscape continues to evolve, there remains a need to clearly classify protected health data in the context of cloud-based, app-based, and AI-driven data sharing.
Artificial Intelligence and PHI
The surge of artificial intelligence across numerous industries has demonstrated its potential to generate and support business success. Artificial intelligence, however, is only as good as the data it is trained on, and greater access to larger amounts of data can reduce the chances of machine learning bias. Businesses therefore aim to use these AI resources to promote products to different demographics, collect individual feedback, and predict the desires of current and potential customers.
While AI is a helpful tool in some contexts, there are a number of ethical risks in failing to regulate what data its everyday use collects, or how that data is acquired. In his recent Senate testimony (20:35) at a hearing on the need to protect privacy in the age of developing AI, Dr. Ryan Calo emphasized that “companies have an incentive to use what they know about individual and collective psychology, plus the power of design to extract as much money and attention as they can from everyone else.”4
It follows from Dr. Calo's testimony that protected health data could be valuable to many different companies, particularly those with healthcare-based products. And with limited regulation of what information in an app or cloud-based database may be accessed, the potential for harm increases for individuals who use such medical technology devices as forms of treatment.
The question then becomes: do we need to reevaluate our federal legal definition of protected health information?
The HIPAA Security Rule
As part of HIPAA's protections, PHI generally cannot be disclosed without a patient's authorization. These regulations are more easily enforced in a healthcare setting; in digital spaces, where enforcing reasonable protections is significantly harder, poorly protected PHI can be procured by hackers. HIPAA's Security Rule acts as a safeguard against cyber attacks by requiring standardized protections for electronic health records and regular risk assessments to meet compliance standards (45 C.F.R. Part 164, Subpart C). Again, this is a step in the right direction; however, the rule is targeted at health care entities and does not reach every setting in which sensitive health information can be gathered.
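As a rough illustration of the kind of technical safeguard the Security Rule contemplates, the sketch below encrypts a record before writing it to disk, using the third-party cryptography package (pip install cryptography). This is only a minimal example of encryption at rest under an assumed key-management setup; it is not a compliance implementation, and real systems also need access controls, audit logging, and documented risk assessments.

```python
import json
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key would come from a managed key store, never live in code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"patient": "Jane Doe", "diagnosis": "hypertension", "bp": "142/91"}

# Encrypt before persisting so the data at rest is unreadable without the key.
token = cipher.encrypt(json.dumps(record).encode("utf-8"))
with open("record.bin", "wb") as f:
    f.write(token)

# Authorized read path: decrypt and verify integrity (Fernet tokens are authenticated).
with open("record.bin", "rb") as f:
    restored = json.loads(cipher.decrypt(f.read()).decode("utf-8"))

assert restored == record
```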
HITECH and the 21st Century Cures Act
The HIPAA Privacy Rule was amended in 2009 by the HITECH (Health Information Technology for Economic and Clinical Health) Act, which sought to extend HIPAA's scope to electronic health records as they became standard across the medical field2. It also laid the building blocks for sharing health information across states through Health Information Exchanges (HIEs), in an effort to streamline and protect patient histories sent between providers virtually2. The 21st Century Cures Act, adopted in 2016, aimed to overcome the HIPAA barriers the HITECH Act faced in establishing HIEs with unique patient identification numbers, further de-identifying patients' electronic medical data2. These two amendments to health privacy policy illustrate both the rapid pace at which healthcare is becoming integrated with technology and the feasibility of evolving health privacy standards. While they do not directly strengthen privacy protections for data gathered outside covered entities, they help lay the groundwork for future health privacy policy.
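De-identification itself is a fairly mechanical idea. The sketch below shows a drastically simplified version of the Privacy Rule's Safe Harbor approach (45 C.F.R. § 164.514(b)(2)): remove or generalize direct identifiers before data is shared. The field names are hypothetical and the identifier list is deliberately incomplete; the actual Safe Harbor standard enumerates eighteen identifier categories and still requires that the remaining data carry no reasonable basis for re-identification.

```python
import copy

# A few of the identifier categories Safe Harbor requires removing; the full
# standard lists eighteen (names, dates, contact details, device IDs, etc.).
DIRECT_IDENTIFIERS = {"name", "email", "phone", "ssn", "medical_record_number"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers dropped and
    quasi-identifiers coarsened. Illustrative only, not a compliance tool."""
    out = copy.deepcopy(record)
    for field in DIRECT_IDENTIFIERS:
        out.pop(field, None)
    # Generalize quasi-identifiers rather than deleting them outright.
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "**"   # keep only the three-digit prefix
    if "age" in out and out["age"] > 89:
        out["age"] = "90+"                   # Safe Harbor aggregates ages over 89
    return out

patient = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "zip": "98105",
    "age": 93,
    "diagnosis": "type 2 diabetes",
}
print(deidentify(patient))
# {'zip': '981**', 'age': '90+', 'diagnosis': 'type 2 diabetes'}
```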
PHI Limitations and Conflicts
As discussed, PHI is limited by the language used in structuring HIPAA. However, some states have chosen to incorporate more stringent privacy laws of their own in an attempt to bridge necessary protections not covered under HIPAA.
For example, California's Consumer Privacy Act (CCPA) directly limits access to "biometric information," a category that encompasses many of the data types registered by apps and wearable technology devices (Title 1.81.5, California Consumer Privacy Act of 2018, Cal. Civ. Code § 1798.140, as amended by the CPRA)5. However, amendments to the legislation were needed for it to fully align with HIPAA's exceptions for clinical research6.
Further, PHI is only recognized as such in the United States. In the EU, for example, the General Data Protection Regulation (GDPR) distinguishes between types of Personally Identifiable Information (PII) and Personal Data so as to provide a clear process for meeting the respective regulatory criteria7. Challenges also arise when information needs to be shared across research groups and is compliant in one jurisdiction but not in the other.
Citations
1. Health IT Answers. n.d. "Medical Devices and HIPAA Compliance: What to Know." https://www.healthitanswers.net/medical-devices-and-hipaa-compliance-what-to-know/.
2. Pendo, Brietta R., Erin C. Fuse Brown, Robert Gatter, and Elizabeth Y. McCuskey. 2022. Clark, Fuse Brown, Gatter, McCuskey, and Pendo's Health Law: Cases, Materials and Problems. 9th ed., 175-199. St. Paul, MN: West Academic Publishing.
3. McIntyre, Mandy. 2023. "Many Americans Don't Realize Digital Health Apps Could Be Selling Their Personal Data." ClearDATA, July 13, 2023. https://www.cleardata.com/blog/digital-health-apps-selling-personal-data/.
4. United States Senate. 2024. The Need to Protect Americans' Privacy and the AI Accelerant. Testimony of Dr. Ryan Calo, Professor at the University of Washington, July 11, 2024.
5. California State Legislature. 2023. California Civil Code, Title 1.81.5. https://leginfo.legislature.ca.gov/faces/codes_displayText.xhtml?division=3.&part=4.&lawCode=CIV&title=1.81.5.
6. Gardner Law. 2021. "April 2021 Privacy Update for Drug and Device Makers." https://gardner.law/news/april-2021-privacy-update-for-drug-and-device-makers.
7. TechGDPR. 2023. "Difference Between PII and Personal Data." Accessed March 24, 2025. https://techgdpr.com/blog/difference-between-pii-and-personal-data/.