As noted this week in a blog posting from Hunton & Williams, a ruling issued in February by a federal district court in Ohio highlights some of the legal complexities of navigating both state and federal laws governing the disclosure of personal information in medical records. In the case, Turk v. Oiler, the plaintiff sued a Cleveland medical clinic for violating his privacy rights under Ohio law when it disclosed his medical records in response to a grand jury subpoena. While the facts of the case touched on a variety of issues, the salient detail from a privacy rights standpoint is that the plaintiff had been arrested for carrying a concealed firearm and for carrying a firearm under disability, which is illegal in Ohio. The subpoena of Turk’s medical records was intended to help investigators show that he did in fact have a disability (specifically, being drug dependent), and thereby provide evidence that he violated Ohio law by carrying a firearm.
The Cleveland clinic justified its action in furnishing Turk’s medical records to the grand jury under a provision of the HIPAA Privacy Rule (45 CFR §164.512(e)(ii)) that permits the disclosure of protected health information by a covered entity “in response to a subpoena, discovery request, or other lawful process.” While not cited in the case, the other federal rule most relevant to medical record data concerning drug and alcohol abuse, 42 CFR Part 2, provides a similar exception from disclosure constraints “if authorized by an appropriate order of a court of competent jurisdiction granted after application showing good cause therefor” (§290dd-3(b)(2)(C)), which even trumps a provision that no record disclosed under Part 2 “may be used to initiate or substantiate any criminal charges against a patient or to conduct any investigation of a patient.” Based on federal disclosure rules, the clinic seems to be on solid legal ground. Ohio law, however, has its own restrictions on disclosure of or testimony regarding doctor-patient communication (O.R.C. 2317.02, which the court construes to include medical records), and because the state law is more restrictive than HIPAA, federal law does not preempt it. Because of procedural motions and the dismissal of the carry-under-disability charge in previous court proceedings, the district court was the first to actually consider Turk’s privacy violation claims, leading it to refuse to dismiss the claim against the clinic. The Hunton & Williams post notes that Turk subsequently dismissed all his claims against the clinic, suggesting that some sort of settlement was reached after the district court’s ruling. The obvious message for any health care organization seeking to ensure compliance with health information disclosure laws is that considering federal requirements alone is insufficient, especially where state laws impose tighter restrictions than HIPAA or other federal privacy rules.
One of the provisions of the Health Information Technology for Economic and Clinical Health (HITECH) Act portion of the Recovery Act changed the requirements for HIPAA-covered entities to maintain an accounting of disclosures of health information. Under the HIPAA Privacy Rule (specifically, 45 CFR §164.528), such entities were already required to keep (and make available to individuals if requested) a record of disclosures going back six years, but they did not have to include disclosures for treatment, payment, or health care operations. HITECH changed the rule to remove these exempted purposes for disclosure where electronic health records are in use, and also changed the required timeframe to a three-year history, rather than six (§13405(c)(1)). This change will go into effect for existing EHR users on January 1, 2014, and three years earlier (January 1, 2011) for entities that acquire an EHR after January 1, 2009. (The fact that the new rules aren’t yet in effect is why the current Code of Federal Regulations still reflects the six-year period and the exceptions for treatment, payment, and health care operations.) The law also directs HHS to produce regulations describing exactly what information needs to be collected about each disclosure, balancing “the interests of the individuals in learning the circumstances under which their protected health information is being disclosed” against “the administrative burden of accounting for such disclosures” (§13405(c)(2)). As part of the process of developing these regulations, HHS this week published a notice in the Federal Register requesting information on the new accounting for disclosure requirements.
While the need to produce accountings of disclosures should be familiar ground to health care organizations, the exceptions to the rules for treatment, payment, and health care operations purposes have likely meant that many covered entities did not have to attend to these requirements, so there is understandable concern about imposing new administrative burdens on health care providers. To some extent, the effort to comply with the accounting of disclosure rules could be shifted to EHR vendors, if the ability to store and report comprehensive accountings of disclosures is made a requirement for EHR certification under meaningful use rules. Many of these systems already have this capability, so the level of effort to comply hinges on the details of the information that needs to be collected and whether these systems can capture that information with little or no modification. It is logical to assume that some additional work will fall to health care personnel to record disclosures, since attributes such as recipient information and the purpose of the disclosure might not be easily captured in an automated fashion from transaction logs. If HHS is going to consider the individual’s interest in knowing when and why their data has been disclosed, it should also consider defining the disclosure rules to include “use” or “access” rather than just exchange. Some of the most publicized breaches of personal health data privacy are really abuses of privilege by authorized EHR users like hospital staff; nothing in the current HIPAA rules or in the proposed language would cover this sort of access.
In applying to all HIPAA-covered entities, the scope of the accounting of disclosures requirements obviously extends well beyond the government, but HHS might do well to examine some of the approaches and technical mechanisms already used by government agencies. Under the Privacy Act, all federal agencies are required to maintain and make available to individuals accountings of disclosures of personal information held in any agency system of records. The Privacy Act makes an exception for access to records by government employees as part of performing their job duties, but in general requires that agencies keep a record of when, what, and why a disclosure of personal information is made, and the name and address of the person or organization to whom the disclosure is made (5 USC §552a(c)). The fact that just about every agency has processes and/or systems in place to maintain these disclosure records should give HHS (and even EHR vendors) a lot of information about ways to implement such an accounting of disclosures from EHR systems.
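The Privacy Act’s required fields give a concrete sense of what a minimal disclosure-accounting record might look like in an EHR or agency system. The sketch below is purely illustrative (the class and field names are our own, not drawn from any statute, regulation, or product); it captures the date, nature, and purpose of each disclosure plus the recipient’s name and address, as 5 USC §552a(c) requires, and supports the kind of time-bounded accounting request HITECH contemplates.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class DisclosureRecord:
    """One accounting-of-disclosures entry, modeled loosely on 5 USC §552a(c).

    Field names are hypothetical; the Privacy Act requires recording the
    date, nature, and purpose of each disclosure, plus the name and
    address of the person or organization receiving it.
    """
    disclosed_at: datetime      # when the disclosure was made
    nature: str                 # what information was disclosed
    purpose: str                # why it was disclosed
    recipient_name: str         # who received it
    recipient_address: str

@dataclass
class DisclosureLog:
    """Per-individual log that can answer an accounting request."""
    records: List[DisclosureRecord] = field(default_factory=list)

    def record(self, entry: DisclosureRecord) -> None:
        self.records.append(entry)

    def accounting(self, since: datetime) -> List[DisclosureRecord]:
        """Return all disclosures made on or after `since` -- e.g. the
        three-year window HITECH specifies for EHR users, or the six-year
        window in the current HIPAA Privacy Rule."""
        return [r for r in self.records if r.disclosed_at >= since]
```

The point of the sketch is simply that the data model is small and well understood; the open questions HHS faces are about scope (which disclosures, which purposes) rather than about any difficult storage or reporting problem.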
If, as private sector health care entities are likely to argue, there is significant new effort required to comply with stricter accounting of disclosure rules, HHS might also want to consider what incentives could be provided to help mitigate the compliance burden. Meaningful use is one way to approach this, insofar as the ability to maintain and produce the accounting of disclosures the law requires could be made a required functional capability of certified EHR modules or systems. There is another potential source of incentive, at least if some progress is made on developing the business case for health information exchange using EHRs and health IT. While the best reason to look at business models for data sharing is to encourage participation, a potential side benefit of a business model in which data holders were paid for sharing their data is that the transaction history providing the basis of billing records could also be used to satisfy accounting of disclosure requirements. Absent such a business model, there are policy oversight and legal enforcement interests in auditing data sharing transactions as well (basically, to see whether entities are living up to their legal or contractual obligations), and the information needed to satisfy such monitoring activities could likely be leveraged for accounting of disclosures too.
The Social Security Administration (SSA), citing an increased volume of claims and requests for its services, is evaluating ways to conduct more of its transactions online. Before making such a move, SSA first needs to establish capabilities to remotely verify the identities of individuals requesting SSA services, something it lacks the technical ability to do now. In addition to finding and implementing appropriate enabling technology, SSA and other government agencies are constrained to some extent by their own policies and determinations of the significance of the transactions in question and the sensitivity of the data involved in those transactions. Federal agencies are obligated to assess the authentication requirements of all online transactional systems that offer remote access, as instructed in a memorandum issued by OMB in 2004 under authority of the Government Paperwork Elimination Act (GPEA) and the E-Government Act of 2002. The relevant provisions in these laws actually address the use and legal equivalence of electronic signatures, but a key prerequisite for accepting electronic signatures is verifying the identity of the signer, so subsequent guidance to agencies focused on ways to initially prove the identity of, and then authenticate, remote users of government information systems. Agencies were further instructed to conduct an “e-authentication assessment” for their information systems, rating them according to a four-level scale introduced in the OMB memo and described in detail in NIST Special Publication 800-63, Electronic Authentication Guideline.
As you move up the e-authentication assurance levels from 1 (little or no confidence) to 4 (very high confidence) in the validity of the identity asserted by a remote user, the requirements increase for initial identity proofing and for the strength of the credentials the user presents for authentication. At e-authentication level 4 there is no provision for remote identity proofing, as level 4 requires that identity proofing be done in person; a transaction assessed at level 4 is therefore not feasible for fully online operation. This is precisely the situation SSA finds itself in now, as some of its most sensitive (and frequently requested) transactions, such as the replacement of a Social Security card, currently require the presentation of physical, hard-copy documentation to prove identity. To try to overcome some of these constraints, SSA is experimenting with video technology as an alternative to physical presence in a Social Security office (although remotely conducted services using video would still require the individual to appear at a government location, such as an administrative court), but the need to inspect hard copies of documents provided as proof of identity is likely to be harder to overcome. This situation could be improved to some degree by advances in technology associated with government-issued credentials, such as driver’s licenses and passports, now accepted as proof of identity, although many legitimate concerns about fraud, identity theft, and impersonation persist even with smart cards and other mechanisms used in some contexts to bind authentication credentials to identity.
In a widely anticipated step, a discussion draft of an Internet privacy bill in the House of Representatives was released today, giving observers the chance to see where Congress might be headed with such legislation. The bill, sponsored by Reps. Rick Boucher (Democrat of Virginia) and Cliff Stearns (Republican of Florida), has yet to be formally or informally named, but the relatively brief draft (27 pages, including over 6 pages of definitions) looks to be very narrowly focused on constraining data collection and use practices by commercial organizations outside of sales transactions or other routine operations. While most of the media attention on the pending legislation has been focused on Internet privacy practices, the scope of the discussion draft includes offline data collection (the bill uses the term “manual”) as well. The following provides some of the highlights and initial observations from reading the discussion draft.
Applicability
The coverage of the draft bill focuses on two aspects: the nature of the information, and the entity collecting or using it. It defines covered information to include standard contact information such as name, address, telephone number, and email address, as well as biometric data, social security number, credit card or other account numbers, consumer preferences used by the entity, and any unique persistent identifier, such as a customer number or IP address, if the identifier is used to “collect, store, or identify information about a specific individual” or about a computer or other device owned by, used by, or associated with a particular user. The inclusion of IP addresses among the covered information should not be construed as designating them personally identifiable information, especially because IP addresses would presumably only be included if they were static or permanently assigned to individual users, but the tacit implication is interesting, inasmuch as it runs counter to current judicial precedent in the U.S. The list seems fairly exhaustive, though there are a lot of exceptions to the rules about collecting and using this data.
There is also a separate list of personal information types categorized as “sensitive information” that demand stronger levels of consent. Sensitive information would include data in medical records, financial account records, precise geographic location, and personal characteristics such as race, religious preference, or sexual orientation. Basically, where sensitive information is involved, the bill would require explicit affirmative consent before disclosure.
The provisions of the bill would apply to what it calls “covered entities” — anyone engaged in interstate commerce collecting any covered information, except for those collecting covered information from fewer than 5,000 individuals annually. The bill would also not apply to government agencies; at first glance it might seem obvious that such agencies are already constrained in their data collection practices by the Privacy Act, but that law only applies to federal agencies (specifically to executive branch agencies, the military, and independent regulatory agencies), not to state or local government authorities. In an acknowledgment of the overlap between this bill as drafted and many federal laws and regulations that include privacy protections or limitations on use and disclosure of data without consent, the discussion draft makes clear the bill will have no impact on Gramm-Leach-Bliley, the Fair Credit Reporting Act, HIPAA, the Social Security Act, the Communications Act, the Children’s Online Privacy Protection Act, or CAN-SPAM. It does not mention (but perhaps should) some other laws with similar provisions, presumably because they apply primarily to organizations or entities that are not conventional commercial entities. For instance, FERPA was presumably left out because schools and educational institutions don’t typically fall under the covered entity definition in the bill, but many colleges and universities certainly engage in interstate commerce.
Proposed requirements
The basic stipulation in the bill says that before a covered entity can collect information from someone, it has to provide notice (with detailed contents and methods for providing notice specified in the bill) and get consent from the individual whose data will be collected. In online settings, the idea is that the entity would post such a notice conspicuously on its website, in much the same way privacy notices are typically posted today. For manual data collection, advance notice must be provided in writing. In addition, advance notice and affirmative consent must be obtained by an entity before it can effect a change in its privacy policy or data use practices that would affect data it has already collected. This seems a direct response to the by-now familiar behavior of Facebook and other social networking sites, which make changes that apply retroactively and in some cases override privacy preferences users have already configured. There is also a requirement for “express affirmative consent” (in our interpretation, another way of saying “opt-in”) before an entity can disclose personal data to unaffiliated parties, disclose sensitive information (particularly including a user’s geographic location), or collect or disclose a complete record of an individual’s online activity. This last item seems intended to address concerns about behavioral targeting and monitoring activities.
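The consent structure as we read the draft reduces to a fairly simple decision rule. The sketch below is our own hypothetical rendering of that logic, not language from the bill; the category labels and function name are illustrative only.

```python
# Hypothetical sketch of the draft bill's consent logic as we read it;
# the category names and labels below are ours, not the bill's.

# Categories the draft enumerates as "sensitive information"
SENSITIVE = {
    "medical record", "financial account", "precise location",
    "race", "religion", "sexual orientation",
}

def required_consent(info_type: str,
                     to_unaffiliated: bool = False,
                     full_activity_record: bool = False) -> str:
    """Return the consent model the draft would apply to a disclosure.

    'opt-in'  -> express affirmative consent required in advance
    'opt-out' -> a posted notice suffices unless the individual objects
    """
    # Express affirmative consent is triggered by sensitive information,
    # disclosure to unaffiliated parties, or collection/disclosure of a
    # complete record of an individual's online activity.
    if info_type in SENSITIVE or to_unaffiliated or full_activity_record:
        return "opt-in"
    # Everything else falls back to the default opt-out model.
    return "opt-out"
```

Framing it this way makes the critique in the next paragraph easier to see: the default branch (opt-out with a posted notice) covers the vast majority of ordinary collection, which is why the bill may not improve much on current practice.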
As well-intentioned as these rules may be, particularly in online interactions, it’s hard to see how the provisions in this bill would improve much on the publication of privacy policies and terms of use that many Internet sites already do. Even if these proposed regulations made posting of such notices universal, the bill stipulates that the consent model is opt-out, meaning individuals are considered to have consented as long as they do not explicitly deny consent, whether or not they have seen the notice posted or are even aware it exists. This puts the onus squarely on the user, rather than the entity, and would seem to offer entities a clear path to compliance with the terms in the bill regardless of any actual strengthening of privacy protections for consumers. There are a limited set of cases where affirmative consent is required, and in some industries that alone is a significant step forward, but much of the sensitive information enumerated in the bill is already subject to disclosure limitations and consent requirements under laws like HIPAA and GLBA.
Covered entities are also obligated under the bill to provide appropriate security safeguards, although the draft language leaves entirely open just what those might be, other than saying they are whatever the Federal Trade Commission determines to be necessary.
Exceptions
The intent of this bill seems to be to balance personal privacy and consumer preferences about the use of their personal information with business needs to, well, do business. To avoid unreasonably constraining businesses from gathering the information they need to conduct transactions with or otherwise serve their customers, the bill would exempt covered entities from the notice and consent requirements if the information to be collected is for a transactional or operational purpose. Both of these terms are defined in the text of the bill, but generally speaking, if the data is collected in order to provide a product or service to a customer, such as completing an online order, the rules don’t apply. The operational purposes exception also allows an entity to share data it collects with a parent company, subsidiary, or affiliate (affiliated meaning under common ownership or corporate control). There are, however, some business activities explicitly called out as not being part of “operational purposes,” such as marketing, advertising, or disclosure to an unaffiliated party. Also explicitly excluded from coverage is any information that has been “rendered anonymous” by removing or obscuring enough personal information that there “is no reasonable basis to believe” it could be used to identify the individual it relates to. The word reasonable is of course subjective, but given recent research showing the ease with which “de-identified” data can in fact be positively associated with an individual, it would be nice to see more explicit language on what is required for anonymity.
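The exemption structure described above can likewise be sketched as a simple check. Again, this is an illustrative reading of the draft, with purpose labels of our own invention rather than terms defined in the bill.

```python
# Illustrative reading of the draft's "transactional or operational
# purpose" exemption; the purpose labels are hypothetical, not the bill's.

# Purposes that would plausibly qualify as transactional or operational,
# e.g. completing an order or sharing with an affiliate under common
# ownership or corporate control.
OPERATIONAL = {"order fulfillment", "billing", "affiliate sharing"}

# Activities the draft explicitly carves out of "operational purposes".
NOT_OPERATIONAL = {"marketing", "advertising", "unaffiliated disclosure"}

def notice_and_consent_required(purpose: str, anonymized: bool = False) -> bool:
    """True if the draft's notice-and-consent rules would apply at all."""
    if anonymized:
        # Data "rendered anonymous" is excluded from coverage entirely.
        return False
    if purpose in NOT_OPERATIONAL:
        # Explicitly outside the exemption, so the rules apply.
        return True
    # Otherwise the rules apply only when the purpose is not operational.
    return purpose not in OPERATIONAL
```

Note how much weight the `anonymized` flag carries in this reading: given how easily “de-identified” data can be re-identified, that single branch is arguably the loosest part of the draft.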
Consequences
The primary enforcement mechanism for the provisions in this bill is through the Federal Trade Commission under the unfair and deceptive trade practice doctrine of the FTC Act. The bill allows for enforcement by civil action at the state attorney general level, but explicitly does not provide a private right of action. The bill is also intended to preempt any existing state or local regulation covering the collection, use, and disclosure of the personal information described in the bill.
Both of the federal health IT advisory committees working in support of the Office of the National Coordinator (ONC) have taken up discussion of patient privacy and consent management, working through the respective privacy and security workgroups that each committee maintains. For its part, the Privacy and Security Workgroup of the Health IT Standards Committee has started looking at available standards activities related to consent management, including OASIS and the International Security, Trust, and Privacy Alliance’s (ISTPA) Privacy Management Reference Model (PMRM) and IHE’s Basic Patient Privacy Consents (BPPC). The focus on the standards side appears to be primarily on BPPC, which has been around for a while but seems to be getting a new and closer look, in part due to its support for at least a dozen consent models, including the variations of opt-out and opt-in most often considered by healthcare organizations when choosing to enable consent management.
The Health IT Policy Committee has also been talking a lot about consent, after consistently receiving comments and testimony from patient privacy advocates about the need to include consent and support for consumer preferences, both in the context of meaningful use for EHR incentive funding and more broadly to encourage public confidence in electronic health records and health information exchange of personal data in those records. At the Policy Committee’s April meeting, the Privacy and Security Workgroup presented high-level details on its current work on consent, and suggested that formal recommendations may be forthcoming as soon as this month. Their emphasis on consent as a prerequisite to establishing the trust necessary to allow individuals to endorse health information sharing is one of several parallel activities centered on trust going on within the Policy Committee’s workgroups, within which the committee members seem to acknowledge that both policy directives and corresponding standards and technologies are needed.
What’s remarkable and more than a little disappointing about all the work these two privacy and security workgroups are doing on consent is how little they appear to be communicating with each other, much less coordinating their efforts. In response to an article in Federal Computer Week that was published online on April 23, Policy Committee member and Privacy and Security Workgroup chair Deven McGraw posted a comment on April 27 highlighting this lack of coordination:
I co-chair the Health IT Policy Committee’s privacy and security workgroup, and I have never seen this technical framework, nor has it been formally presented to the privacy and security workgroup members for their consideration.
In a somewhat similar vein, at the end of the Standards Committee’s April 28 presentation on “Standards for Consumer Engagement” from its Privacy and Security Workgroup chairs, among the questions listed for consideration is what the Standards Committee’s role should be with respect to Policy Committee efforts to address consent and related consumer engagement issues. John Moehrke, an engineer for GE Healthcare and a HITSP member who gave a presentation to the Standards Committee on BPPC, noted that while listening to a recording of the April 20 meeting of the Policy Committee’s Meaningful Use Workgroup on patient consent and consumer engagement, he heard a lot of passion about consent but very little attention to, or explanation of, the key elements that need to be simplified if consent management is to be achieved in an implementable way. Presumably, if the standards and technology available to help manage patient consent were better understood (along with the consent provisions in the relevant laws), we might see more progress on consent solutions rather than the current cycle of analysis paralysis.
To an outside observer, it may seem strange that two groups with such obviously overlapping interest areas (not to mention potential dependencies) would not coordinate their efforts on a regular basis, but unfortunately, this lack of interaction appears to be the rule rather than the exception, at least with respect to security and privacy matters, despite the fact that the Standards Committee routinely briefs the Policy Committee on its standards recommendations and related considerations. For example, at the March Policy Committee meeting John Halamka presented a summary of Standards Committee progress, including an item about “launching educational sessions on consent-related standards.” Maybe the Policy Committee Privacy and Security Workgroup can schedule one of these sessions before coming out with their recommendations on consent.
The Policy Committee and Standards Committee have a history of coming up with separate recommendations on similar topics, occasionally with conclusions that make it hard for anyone following the activities and recommendations of these advisory committees to know what they should do. For instance, last fall the Standards Committee’s Privacy and Security Workgroup presented on more than one occasion that the IHE Enterprise User Authentication (EUA) standard and the Kerberos authentication and authorization model would no longer be included as a recommended health IT standard beyond the 2011 timeframe. This recommendation, based on a questionable (in our opinion) interpretation of a draft revision of NIST Special Publication 800-63, is not part of the adopted security standards specified as certification criteria for EHR systems under meaningful use, setting up a situation in which an EHR vendor (or an eligible provider who buys that vendor’s certified EHR) relying on EUA could be certified for 2011, but may need to replace its authentication mechanism in order to remain certified for 2013 and beyond. It’s hard to imagine that either the Standards Committee or the Policy Committee really intends for certification to be such a fluid target; the Policy Committee had an opportunity to influence the criteria that ONC included in its rules, but apparently did not take advantage of it. The still-solidifying process and standards under which EHR products will be certified are just one example of an effort that would seem to be greatly facilitated by consistency and harmonization of recommendations between the Policy and Standards Committees.