Developing privacy compliance strategies for new and innovative digital health tools can be challenging. Here are some key considerations for digital health technology developers.
Jennifer S. Geetter, JD and Lisa S. Mazur, JD
Historically, physicians and patients interacted within the four walls of a hospital, physician’s office, or other medical facility with minimal interaction between scheduled patient visits, except for medical emergencies. In recent years, new modalities for healthcare provider-patient interaction have significantly expanded to allow for provider and patient engagement outside of a medical facility and in the absence of an emergency. Technology-supported devices, resources, and solutions allow patients and consumers to effectively monitor, track, and manage their health conditions, share information with health care providers, and access health care and educational resources. When these tools are used appropriately, their promise to improve the health and wellness of users, and to reduce healthcare costs, seems endless.
To illustrate, consider the potential impact of digital health tools on the lives of the estimated 30 million Americans with diabetes. Using these tools, diabetes patients can set up medication alerts and reminders, track their food intake, and integrate this information with nutrition, weight, and cholesterol management recommendations. Data from a personal insulin meter can be downloaded to the patient’s smartphone, automatically shared with family members and caregivers, and sent in real time to medical professionals, who can then reach the patient via online chat, text message, or phone and make any necessary medication adjustments before a mild case of hypoglycemia turns into diabetic shock. In addition to improving patients’ lives, the appropriate use of these tools could help reduce the estimated $245 billion in medical care expenses and lost productivity attributable to diabetes each year by keeping patients out of costly emergency and inpatient care and reducing missed days of work.
The Challenge—Intersection of Innovation and Regulation
Although there is near-universal recognition among regulators, payers, providers, and patients that these tools have the potential to improve health and decrease healthcare costs, technology developers and healthcare providers are still faced with navigating a complex, sometimes overwhelming, array of state and federal laws that were often written for a different time and apply inelegantly to current modalities and relationships.
Privacy laws are one example of state and federal laws that need to be understood and navigated. These privacy laws are typically not new. Health industry regulators, however, are struggling with how to apply the existing privacy regulatory regime in this new world of healthcare innovation, and whether new or enhanced requirements and standards should be imposed given the patient safety, data integrity, and other risk considerations.
State and federal regulators are concerned about the privacy and security challenges faced by these digital health solutions for various reasons, including the (a) increased volume of data available for new analyses; (b) different sources receiving and transmitting patient information; (c) potential for remote eavesdropping; (d) potential for confusion caused by the blurring of the lines between when data is personal information and when it is protected health information (PHI)—essentially, when HIPAA applies and when it does not; (e) potential for sharing of data that patients and consumers may not “reasonably” have expected; (f) challenges and impracticability of providing notice and choice to consumers; and (g) enhanced security risks associated with the storage of patient information on digital health tools.
As a result, developing privacy compliance strategies for new and innovative digital health tools can be challenging.
There is no single, comprehensive federal law that regulates the collection or use of patient and personal information in the United States. Instead, there is a patchwork of information privacy and security laws, rules, and guidelines at both the state and federal levels that may apply. Some of these laws, regulations, and standards apply based on the specific entities involved (e.g., health plans, vendors, financial institutions, providers), the type of individual about or from whom the data is collected (e.g., adults, minors), the particular categories of information (e.g., personally identifiable information, financial data, personal health data, protected health information, information about specific types of conditions), the sources of the data (e.g., medical records, the patient, payment claims to a health insurer, indirectly through a health care provider or health plan), or the purpose for using or disclosing the data (e.g., research).
When PHI is collected and transmitted using digital health tools, state privacy standards and the federal requirements under HIPAA and other laws related to preserving the integrity and safeguarding the security of PHI are implicated. In addition, there are broad consumer protection laws designed to protect consumers from unfair or deceptive practices involving the collection, use, disclosure, and security of patient information; examples include the Federal Trade Commission Act of 1914, the Gramm-Leach-Bliley Act (also known as the Financial Services Modernization Act of 1999), the Children’s Online Privacy Protection Act of 1998, and the FTC’s Health Breach Notification Rule. The FTC has taken its consumer protection role seriously, as demonstrated by its enforcement activity against technology developers believed to have engaged in deceptive or unfair trade practices.
At the state level, Attorneys General also serve as additional watchdogs, as illustrated by the New York Attorney General’s office’s recent settlement with the developers of three mobile health applications for, among other things, allegedly misleading commercial claims.
Guideposts for Moving Ahead
The success of digital health companies rests largely on their ability to secure the trust of consumers, patients, and providers that data gathered by the tool is being used responsibly, transparently, and in ways that are consistent with their expectations and goals.
Complying with state and federal laws is the minimum. Developing legal and compliance strategies that go beyond the minimum legal requirements, in some respects, may better position digital health companies for success.
Developers of digital health technologies need to understand which state and federal laws and regulations governing privacy and security are applicable to them, as well as those applicable to the company’s customers, and develop and implement a responsive compliance program that enables the company to meet its legal obligations—and its customers to meet theirs.
Below are guideposts for digital health technology developers that are aimed at addressing the enhanced privacy and security risks posed by digital health tools based on guidance provided by the FTC and other regulatory agencies:
- Perform a careful, upfront analysis of the state and federal laws and regulations to avoid inadvertently violating the law, which can result in significant fines and penalties, reputational harm, and loss of customer trust. This assessment must take into account a variety of factors, including the volume, types, and sensitivity of the data gathered by the digital health tool, and the activities of the customer who is licensing or purchasing the tool.
- In particular, technology developers need to understand whether they are acting as a Business Associate (as defined in HIPAA) of their customer or as a downstream Business Associate of their customer’s client. If a developer is acting as a Business Associate, the developer has certain important responsibilities that require early consideration (i.e., during the negotiation of the terms of the arrangement) and documentation in a Business Associate Agreement. For example, what are the developer’s obligations for returning or destroying PHI? While returning or destroying information sounds simple, there are unique legal, practical, technological, and financial challenges that are frequently overlooked.
- In addition to HIPAA, state sensitive information laws are numerous and inconsistent with one another, and require special consideration, particularly if the digital health tool collects behavioral health-related information.
- As part of this upfront legal and regulatory analysis, technology developers should also identify the available compliance pathways under state and federal laws that are needed to support the use and disclosure of identifiable information for the creation and the subsequent use of the digital health tool, and document this information in policies and procedures.
- These policies, procedures, and practices should appropriately balance the need for future innovation with the amount of data collected and retained. Minimizing the amount of data collected and retained decreases the potential harms associated with a data breach. An effective data minimization strategy may involve (a) collecting only the fields of data necessary to the product or service being offered; (b) collecting less sensitive data; (c) de-identifying the data collected; and/or (d) obtaining consent for collection. This strategy may also require deleting data when it is no longer needed.
- In addition, if applicable, these privacy policies, procedures, and practices should limit access to information if it is not needed (e.g., a fitness-related mobile app should not need to access the operating system’s API for consumer information that is not needed for that functionality) and mandate privacy-protective default settings that offer protection for users new to the technology while still accommodating the needs of more experienced users (e.g., set the weight-loss mobile app’s default to “private” rather than “public” so as to protect users from unwittingly sharing their weight-loss results with others).
- Users of the digital health tool should be given a choice before consumer data is collected and used in ways inconsistent with consumers’ reasonable expectations. The methods employed for informing users about security and privacy options and features should be simple, clear, and direct, rather than relying on complicated legal jargon or hard-to-locate links.
- A regular assessment of the digital health company’s privacy and security compliance strategy is essential; the strategy should also be revisited whenever the tool’s functionality changes.
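The data minimization practice described above can be made concrete in code. The sketch below is a minimal, hypothetical illustration in Python: the field names, the allow-list, and the `minimize` helper are all invented for this example and do not reflect any particular product’s schema or a legally sufficient de-identification method.

```python
# Illustrative data-minimization filter: persist only the fields the
# feature actually needs, and never persist direct identifiers.
# All field names here are hypothetical examples.

# Only the fields necessary to the product or service being offered.
ALLOWED_FIELDS = {"glucose_mg_dl", "reading_timestamp", "device_model"}

# Direct identifiers that should be stripped before storage.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "ssn"}

def minimize(record: dict) -> dict:
    """Keep only necessary, non-identifying fields from a record."""
    return {
        k: v for k, v in record.items()
        if k in ALLOWED_FIELDS and k not in DIRECT_IDENTIFIERS
    }

raw = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "glucose_mg_dl": 94,
    "reading_timestamp": "2017-05-01T08:30:00Z",
    "device_model": "meter-x1",
}
stored = minimize(raw)
# 'name' and 'email' are dropped before the record is persisted.
```

Filtering at the point of collection, rather than after storage, means the more sensitive fields never enter long-term systems in the first place, which narrows the blast radius of any later breach.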
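The privacy-protective default settings mentioned above can likewise be sketched. The Python example below is hypothetical: the settings model and field names are invented to show the pattern of defaulting to the most protective option while still letting experienced users opt in to sharing.

```python
from dataclasses import dataclass

# Hypothetical user-settings model for a weight-loss or fitness app.
# New users get the most protective defaults without taking any action.
@dataclass
class SharingSettings:
    share_results: str = "private"       # default protects new users
    allow_caregiver_access: bool = False  # sharing requires explicit opt-in

new_user = SharingSettings()  # protective defaults apply automatically

# Experienced users can still opt in explicitly:
power_user = SharingSettings(share_results="public",
                             allow_caregiver_access=True)
```

Encoding the protective choice as the default value, rather than relying on a setup wizard, ensures users who never visit the settings screen are not unwittingly sharing their results.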
Disclaimer: This article does not constitute legal advice and does not establish an attorney-client relationship.