HHS withdraws final health data breach notification rule for revision

The Department of Health and Human Services (HHS) announced last week that it has withdrawn the final version of its rule on Breach Notification for Unsecured Protected Health Information, which it had submitted to OMB for review in May. HHS gave no specific reason for wanting to reconsider the rule, other than to note the complexity of the issue. The Interim Final Rule for breach notifications that went into effect last September remains in force pending further action on the final rule.

HHS did note that it received over 100 comments during the interim final rule’s 60-day comment period last fall, and there is some speculation that the decision to revise the rule again before finalizing it is due in particular to concerns over the provision that would allow entities suffering breaches to make their own subjective determination of whether the breach would result in “harm” to those whose personal data was disclosed. If an entity determined that no harm was likely to result, then the entity need not provide notification of the breach, either to HHS or to the public. Shortly after the IFR was published, objections to the harm provision were raised not only by patient privacy advocates but also by members of Congress, and unless the now-withdrawn final rule had been amended to strike that provision, additional efforts at either the federal or state level might well have been undertaken to remove this notification exception.

There is an active debate over unauthorized data disclosures and potential or actual harm to the victims of such breaches beyond the health breach disclosure context. Lawsuits filed over breaches of personal information are routinely dismissed when the parties who bring the suits are unable to demonstrate that actual harm or injury has occurred, as opposed to merely showing that the potential for harm exists. The legal issue in these cases has little to do with privacy or, generally, with violations of breach notification laws; it is instead a matter of standards of civil procedure and tort liability requirements, which demand that plaintiffs be able to show actual harm in order to bring causes of action for negligence or for poor security or data handling practices. Having general or domain-specific breach notification laws on the books should in theory help overcome the negligence right-of-action issue, but at least in the case of federal health data breaches, that will only be true if organizations responsible for data breaches cannot exempt themselves from notification simply because they believe (or lack evidence to the contrary) that the subjects of the breaches suffered no actual harm.

Google Apps for Government receives federal authorization to operate from GSA

Google announced today that its public sector-focused cloud computing service, Google Apps for Government, successfully completed a security certification and accreditation (C&A) process and received an authorization to operate (ATO) from the General Services Administration. This achievement should help overcome one of the more significant barriers to federal agency adoption of third-party cloud computing solutions, and is a strong statement of Google’s commitment to the public sector market. It remains to be seen how willing other agencies will be to accept GSA’s authorization decision for Google’s apps; almost all federal agencies are self-accrediting, making their own decisions about what level of system security and what level of risk they are willing to accept in their own environments, so system authorizations are not often “portable” among agencies, even when the same underlying technology is involved. According to Google’s own statements, the company is paying close attention not only to complying with relevant security requirements, but also to aligning with emerging government-wide standards and definitions on cloud computing, cloud services, and the different architecture patterns associated with the term “cloud computing.” While the GSA authorization appears to be a good first step, presumably Google should try to enlist one or more federal agencies as a sponsor in seeking authorization for Google Apps under the Federal Risk and Authorization Management Program (FedRAMP). This government-wide initiative, sponsored by the federal CIO Council, is intended specifically to provide greater reuse of authorization efforts for outsourced systems and services provided for use by multiple agencies. While FedRAMP is not officially limited in scope, its initial focus is on cloud computing.

The Google announcement describes the C&A as a requirement of the Federal Information Security Management Act (FISMA), which for practical purposes is true, even if it’s technically not quite accurate. Under federal guidance contained in Appendix III of OMB Circular A-130, “Security of Federal Automated Information Resources,” agencies are required to “authorize processing” based on an assessment of the information security controls put in place to protect an information system from loss of confidentiality, integrity, or availability of the system itself or the information it processes. Circular A-130 was originally published in 1985 to provide agencies with guidance on implementing the provisions of the Paperwork Reduction Act of 1980, and has been significantly updated and re-released three times, with additional implementation guidance added to address provisions in the Clinger-Cohen Act, updates to the PRA, and numerous other laws and executive orders that affect the management of federal information resources; the most recent version (Transmittal Memorandum 4) was released in 2000. While the requirement to certify and accredit federal information systems (“accreditation” is essentially the same thing as “authorization”) predates FISMA, in recent years the C&A process has been described as a FISMA requirement, largely because the first version of NIST’s Special Publication 800-37, “Guide for the Security Certification and Accreditation of Federal Information Systems,” was not released in final form until May 2004, two years after FISMA was enacted. Circular A-130 actually references a NIST Federal Information Processing Standard (FIPS 102) that was superseded by SP 800-37 and formally withdrawn in early 2005. In February of this year NIST completed a significant revision to SP 800-37, which is now titled “Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach.” Despite this common interpretation of FISMA, the words certification, accreditation, and authorization don’t appear anywhere in the text of the FISMA legislation (which itself is Title III of the E-Government Act of 2002).

Health IT policy intensifies focus on consent

While there were several informative presentations and topics addressed at the monthly meeting of the Health IT Policy Committee today, the recommendations from the Privacy and Security Tiger Team on adoption of fair information practices and, especially, handling of consent generated a particularly active discussion. Managing consent and patient or consumer preferences about the use and disclosure of their personal information is garnering a lot of attention within the Office of the National Coordinator, as both its federal advisory committees have been considering the issue. Consent remains among the most significant issues in health information exchange and health IT, and in some ways represents a particularly difficult one to resolve, because it is impossible to satisfy the priorities of all the stakeholders involved. The focus of much of the debate has been on establishing consent as a key privacy protection which, if offered to patients, may help them feel more comfortable with the idea of their personal health data being stored in electronic health records and potentially shared with other entities through health information exchange.

The key policy question is when consent should be required before patient data is disclosed, shared, or transferred. In many cases (most notably for treatment) there is no legal requirement and arguably no policy interest in requiring consent, but if a given entity decides that it would prefer to solicit patient preferences and honor consent directives, it is free to do so (presumably except in cases where it is legally required to disclose information regardless of patient preferences). With respect to treatment, the Tiger Team members have to date suggested that current legal requirements mandating consent in advance of health data disclosure are sufficient, at least if they can be enforced. Their attention has therefore sensibly been focused on a set of foreseeable circumstances or situations outside of core or routine purposes for use (such as treatment, payment, and health care operations) under which health information might be exchanged and that should, as a matter of policy (and eventually, regulation), trigger the need for the health care entity to obtain patient consent before the data exchange takes place. Among the recommendations presented today was a representative list of factors that should trigger the need for health care entities to obtain consent from patients before sharing personal health data via health information exchange (a minimal sketch of how these triggers might be encoded follows the list):

  • Patient’s health information is no longer under control of either the patient or the patient’s provider
  • Patient’s health information is retained for future use by a third party/intermediary
  • Patient’s health information is exposed to persons or entities for reasons not related to ongoing treatment (or payment for care)
  • Patient’s information is aggregated outside of a provider’s record, or the record of an integrated delivery system/accountable care organization, with information about the patient from other, external medical records
  • The exchange is used to transmit information that is often perceived to be more sensitive than other types of information (e.g., behavioral health, substance abuse, and other areas defined by NCVHS)
  • Significant change in the circumstances supporting an original patient consent
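
To make the policy concrete, the sketch below encodes these triggers as a simple any-of rule check. It is a minimal illustration under our own assumptions: the ExchangeContext structure and its field names are hypothetical shorthand for the factors above, not part of any Tiger Team or ONC specification.

```python
from dataclasses import dataclass

@dataclass
class ExchangeContext:
    """Hypothetical summary of a proposed exchange scenario (illustrative only)."""
    leaves_patient_or_provider_control: bool
    retained_by_third_party: bool
    exposed_beyond_treatment_or_payment: bool
    aggregated_with_external_records: bool
    contains_sensitive_categories: bool  # e.g., behavioral health, substance abuse
    consent_circumstances_changed: bool

def consent_required(ctx: ExchangeContext) -> bool:
    # Per the recommendation, consent is triggered if ANY factor is present.
    return any((
        ctx.leaves_patient_or_provider_control,
        ctx.retained_by_third_party,
        ctx.exposed_beyond_treatment_or_payment,
        ctx.aggregated_with_external_records,
        ctx.contains_sensitive_categories,
        ctx.consent_circumstances_changed,
    ))
```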

The Tiger Team recommended to the Health IT Policy Committee that ONC adopt the position that “Choice should be required if any of the factors in the previous slide are present, and ONC should promote this policy through all of its policy levers.” The use of the term choice in this context refers to the ability of the health care consumer to assert preferences about data disclosure, including opting in or opting out of sharing data in different circumstances. One of the more energetic side debates during the meeting (apparently reflecting a similar lack of consensus among Tiger Team members) centered on the best choice model to recommend, with opt-in and opt-out being the two primary alternatives. Patient privacy advocates tend to favor opt-in, because it maximizes patient control over the use and disclosure of their data and because it requires the consent choice to be made in advance of any actual sharing of data. A subset of the group (it’s not entirely clear whether this is a minority or a majority of the members) advocates not only adopting an opt-in approach and recognizing a fundamental right of privacy (something which, it should be noted, does not exist in American law or jurisprudence, even in health care) but also expecting the architecture of systems or solutions involved in health information exchange to reflect protection of privacy as a core design principle.
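
In mechanical terms, the difference between the two models is smaller than the debate might suggest: it comes down to the default that applies before a patient has recorded any choice. The sketch below is a hypothetical illustration of that point; the function and parameter names are ours, not drawn from the meeting materials.

```python
from typing import Optional

def may_share(patient_directive: Optional[bool], model: str) -> bool:
    """Decide whether data may be shared under a given choice model.

    patient_directive is None until the patient records an explicit choice.
    """
    if patient_directive is not None:
        return patient_directive  # an explicit choice always governs
    # The models differ only in the default applied absent a choice:
    # opt-out shares by default; opt-in withholds until consent is given.
    return model == "opt-out"
```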

There are broader conflicts between some of the key outcomes sought through health IT adoption and strong consumer controls over data sharing, most notably that an opt-in by default model might severely limit the amount of data available for sharing, which would reduce the effectiveness of the programs and initiatives that depend on widely available health data. Still, offering patients the ability to consent is routinely cited as a prerequisite for engendering public trust in the use of EHRs and other health information technology, and despite the challenges of implementing consent management capabilities, focusing on privacy and consent is likely to pay greater dividends than emphasizing security controls.

If the current security and privacy controls used with health IT were sufficient to give people the level of confidence needed to obviate the concerns they have now about the protection of their personal data, then we might be at a point where data should be shared by default. But until we are at that point (and we’re not there now), people lack that level of confidence, so they must be offered the control (through opting in). This implicitly recognizes that not everyone has the same views, concerns, confidence, or perceptions of the trustworthiness of the system. With differing levels of trust, it’s unrealistic to impose a single standard approach that will satisfy everyone (a warning that the developers of the NHIN trust framework might do well to heed). Accepting the view that risk must be present for trust to come into play, this also means that if security and privacy measures could be made so effective as to eliminate the risk of misuse or unauthorized disclosure of information, there would be no need for individuals to trust the system. Any situation short of information surety will mean that some risk remains, and to encourage people to act (that is, agree to share their data) despite that risk, there must be mechanisms in place that either increase trust or compensate for its absence, and thereby facilitate decisions to act on whatever level of trust exists.

It is somewhat refreshing to see the explicit statement from the Tiger Team that the central focus of trust in health IT is the relationship between the patient and the provider, specifically, that “Providers ‘hold the trust’ and are ultimately responsible for maintaining the privacy and security of their patients’ records,” including making decisions about exchanging or disclosing patient data. This relationship illustrates the three-part instantiation of trust: the truster (patient), the trustee (provider), and the context (the doctor-patient relationship for health care). The characterization of trust in this context also fits the conception of trust as “encapsulated interest” where, in this case, the patient’s evaluation of the trustworthiness of the provider stems from the provider’s incorporation of the patient’s interests as his or her own. Having said that, and with no disrespect intended to the members or intentions of the Privacy and Security Tiger Team, there is a fundamental limitation to the validity of policy statements purporting to represent patient perspectives unless and until some effort is made (other than opening sessions for public comment) to solicit and reflect actual consumer opinions about these issues.

Significant work remains to produce standards and rules on accounting of disclosures for PHI

The Health Information Technology for Economic and Clinical Health (HITECH) Act, passed as part of the Recovery Act in February 2009, included a variety of revisions to requirements already in effect under the HIPAA Security and Privacy Rules. Among these requirements is the need for HIPAA-covered entities to maintain an accounting of disclosures of protected health information (PHI) so that they may produce, when requested, a record of such disclosures for the individuals whose PHI has been disclosed. As we’ve noted previously, under the HIPAA Privacy Rule the accounting of disclosures requirement covered a six-year history, but exempted disclosures for the purposes of treatment, payment, or health care operations (45 CFR §164.528). Once the changes mandated by HITECH take effect, there will no longer be an exemption for these three most common uses of health care information, and the time period for the accounting of disclosures is shortened from six years to three. The original legislation called for the new accounting of disclosures rules to take effect as soon as January 1, 2011 (for entities that newly acquire electronic health record (EHR) technology) and no later than January 1, 2014, although in a move that may prove to have shown great foresight by Congress, the law allows the HHS Secretary to delay the effective dates by two years if such a delay is deemed necessary.
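
For context, the existing Privacy Rule already spells out what each accounting entry must contain: the date of the disclosure, the name (and, if known, address) of the recipient, a brief description of the PHI disclosed, and the purpose of the disclosure. The sketch below shows one way such records might be structured and queried over the shortened three-year window; the field and function names are our own illustrative assumptions, not an adopted HHS standard.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DisclosureRecord:
    """One accounting entry, modeled on the elements 45 CFR §164.528 requires."""
    disclosure_date: date
    recipient_name: str     # entity or person who received the PHI
    recipient_address: str  # if known
    phi_description: str    # brief description of the information disclosed
    purpose: str            # e.g., "treatment", "payment", "operations"

def accounting_for(records: list, start: date, end: date) -> list:
    """Return the disclosures falling within the requested accounting window."""
    return [r for r in records if start <= r.disclosure_date <= end]
```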

The January 1, 2011 date looks increasingly unlikely, for two primary reasons. First, the language of the HITECH Act instructs HHS to first adopt standards for accounting of disclosures, and then promulgate regulations about what information health care entities (and presumably business associates, since HITECH also made business associates directly responsible for complying with HIPAA requirements) must record about each disclosure (§13405(c)(2)). No such standards have yet been proposed, much less adopted, and at present HHS is still in the process of reviewing comments it received in response to the request for information it published in May of this year. Second, the EHR certification criteria proposed by the Office of the National Coordinator (ONC) in an interim rule published last winter included accounting of disclosures, and so represented a key driver pushing health IT vendors to make sure their EHR systems offered the capability. However, in the revised certification criteria released last week in conjunction with the final version of the meaningful use rules, the accounting of disclosures functionality is now optional, so any sense of urgency vendors might have felt about providing that functionality has likely subsided. Taken together, the absence of standards and regulations on accounting of disclosures and the fact that such functionality is not required to certify EHR systems under Stage 1 of meaningful use suggest that the ability of health care providers to actually offer the sort of accounting called for in the HITECH Act may not be pervasive until Stage 2 takes effect in 2013.

In addition to the formal statements of standards and criteria in the “Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology,” ONC included information on the many comments it received and summary responses to those comments. With respect to accounting of disclosures, the decision to make the criterion optional clearly reflected the concerns of multiple commenters about the resource intensiveness of the criterion and the lack of clarity about how the requirement was intended to be satisfied. ONC acknowledges that “significant technical and policy challenges remain unresolved” and notes its expectation that accounting of disclosures rules and standards will likely be the subject of future rulemaking. Other than designating it optional and re-numbering the section where the criterion will be codified (to §170.302(w) rather than §170.302(v)), ONC left the text of the criterion unchanged from the interim rule to the final version: “Record disclosures made for treatment, payment, and health care operations in accordance with the standard specified.”

What did change in the final version is the wording of the audit standard (§170.210(b)), a change of just a couple of words that nevertheless may have a significant impact on the way accounting of disclosures is implemented. ONC’s interim final rule, published on January 13, 2010, included an adopted security and privacy standard to “Record actions related to electronic health information,” the text of which said that audit data must be recorded when electronic health information is “created, modified, deleted, or printed.” Based on comments described by ONC in the final rule, many urged the addition of the word “accessed” to the standard to include read-only actions within the scope of the audit requirement. This change was included in the final text of the standard (and the action “printed” was removed), with the implication that audit records should now provide more value to entities seeking to identify authorized EHR users who misuse or inappropriately access health records. Should this sort of logic be applied to the accounting of disclosures rules when they are written, considering read-only viewing of a record to be a “disclosure” would go a long way toward making the accounting of disclosures a comprehensive history of all uses of health records, and toward providing stronger access controls that enhance privacy protections for personal health information.
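
To illustrate the practical effect of those couple of words, here is a minimal sketch of an audit hook under the revised standard, with “accessed” in scope and “printed” out of it. The event structure and field names are our own assumptions for illustration; the standard specifies what must be recorded, not how.

```python
from datetime import datetime, timezone

# Actions within the scope of the revised audit standard (note that
# "accessed" is now included and "printed" is not).
AUDITABLE_ACTIONS = {"created", "modified", "accessed", "deleted"}

def record_audit_event(log: list, user_id: str, record_id: str, action: str) -> None:
    """Append an audit entry for an action taken on electronic health information."""
    if action not in AUDITABLE_ACTIONS:
        raise ValueError(f"action not in audit scope: {action}")
    log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user_id,
        "record": record_id,
        "action": action,  # read-only "accessed" events are now auditable
    })
```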

Wisconsin court ruling addresses a different aspect of privacy and personal e-mails

A Wisconsin Supreme Court ruling issued last week adds another dimension to the current debate over employee expectations of privacy in personal communications using employer-provided means. In this case, a group of teachers in the Wisconsin Rapids School District — who, as school district employees, are considered state government workers — sued to prevent the public disclosure of the contents of personal e-mails they had sent from their work computers. The request to disclose the e-mails was filed under the state’s open records law (less formally known as part of the state’s “Sunshine Laws”), which in general makes communications related to conducting government business subject to public review. The essence of the ruling in this case is that since personal e-mails are not information about the conduct of government business, their contents should not be open to the public under the Sunshine Laws. Specifically, the court decided that “Personal e-mails are therefore not always records within the meaning of Wis. Stat. §19.32(2) simply because they are sent and received on government e-mail and computer systems.” The focus on whether the e-mails fall under the definition of “record” is critical, because if they can be considered records then the Sunshine Laws would seem to apply by default; the two dissenting justices in the 5-2 ruling based their objection on a belief that the e-mails should be considered records.

The issue of what reasonable expectations of privacy employees may have with respect to personal communication using employer-provided resources has received a lot of attention in recent months, most notably in the context of the June U.S. Supreme Court decision in City of Ontario v. Quon, although ironically that decision left unresolved the question of reasonable expectations of privacy and focused instead on the legality of a government employer’s search of personal employee communications (in Quon, the communications were text messages sent via pager). While much of the current debate is focused on government agencies as employers and on public sector employee rights, the issue is quite relevant for private employers as well, despite the fact that private companies generally have much broader latitude in monitoring their employees’ behavior and use of company resources, as long as they comply with notification requirements and other terms of the Electronic Communications Privacy Act (ECPA). A New Jersey Supreme Court ruling handed down in March found that some narrow protections of employee privacy exist even when using private sector employer-owned computers and network resources (specifically, that attorney-client privilege was not waived when using such resources).

What’s different about the Wisconsin case is the fact that the alleged violation of employee privacy rights did not stem from the school district’s intent to read the teachers’ personal e-mails, but instead from a request to disclose the contents for public inspection. The Wisconsin Supreme Court decision, written by Chief Justice Shirley S. Abrahamson, begins by noting the state’s strong commitment to transparent government operations and its long history of using open records and open meetings laws to help ensure such transparency. She also points out that the Sunshine Laws in question were enacted at a time (the 1970s and ’80s) when e-mail technology wasn’t in common use, although current official guidance on compliance with the state Public Records Law makes it clear that electronic records are well within the scope of the law’s provisions, as it is the content, not the format, of the information that is important. This guidance, most recently updated in 2007, notes that “No Wisconsin precedent addresses whether personal e-mail received or sent on government equipment falls under the personal use exception to the definition of ‘record.'” The documentation notes, much as the Chief Justice did in her decision, that “Courts in other states, however, have concluded that personal e-mails sent to or from government accounts are not public records,” a set of precedents to which Wisconsin may now be added.