The Department of Health and Human Services (HHS) has released a long-anticipated Notice of Proposed Rulemaking that would implement the changes to accounting of disclosures requirements under the HIPAA Privacy Rule. HHS opened a 60-day comment period effective May 31, the date when the NPRM is scheduled to be published in the Federal Register. The changes, specified in the Health Information Technology for Economic and Clinical Health (HITECH) Act, would expand the types of transactions and uses of data that must be included in accountings of disclosures, reduce the time period for which organizations must maintain the disclosure information, and modify the set of information that must be recorded for each disclosure.
Under the current provisions of the HIPAA Privacy Rule, codified at 45 CFR §164.528, covered entities are required to maintain records on disclosures of protected health information for a period of six years, and to furnish that historical record of disclosures (the “accounting”) to individuals who request it. The Privacy Rule includes exemptions for disclosures for the purposes of treatment, payment, and health care operations, as well as a variety of other special circumstances, including disclosures to the individual of their own PHI. Collectively, the excepted purposes constitute the vast majority of activity involving disclosure. The current rules also cover all PHI, whether in paper or electronic form. HITECH shortens the accounting period to three years, but removes the exemptions for treatment, payment, and health care operations when the disclosure of information is from an electronic health record (EHR). HHS is also proposing to explicitly list the types of disclosures that are subject to the accounting of disclosures requirement, rather than following the prior approach of generally requiring inclusion but enumerating specific exceptions. When the HITECH Act passed, many covered entities expressed concerns about the increased administrative burden they would face by essentially having to track all disclosures rather than the more limited set currently required under the law. Some have also pointed out that many EHR systems currently on the market do not provide the built-in functionality to record the information about each disclosure that is required under the revised rule in HITECH.
As part of the rules promulgated under the “meaningful use” EHR incentive program, the HHS Office of the National Coordinator last year adopted a new standard and EHR certification criterion for recording accounting of disclosure information. When it published its final rule for standards and certification criteria, however, ONC chose to make the accounting of disclosure criterion optional, pending further analysis and discussion of the potential impact of the new requirements on covered entities and business associates. In parallel, HHS issued a request for information in May 2010 seeking input from the industry and other interested parties about the potential burden of complying with the new accounting of disclosure rules, the technical capabilities available in the market to facilitate or automate this process, and evidence about the relative interest among individuals in requesting accountings of disclosures. The new NPRM includes some summary data about the comments received in response to the RFI, perhaps most interestingly noting that a large number of respondents reported no or very few requests for accountings since the Privacy Rule went into effect in 2003.
HHS’ new proposed rule divides these individual rights in two, providing separate rules that give individuals the right to an accounting of disclosures and the right to an “access report” that, in contrast to the accounting, would provide details about who has electronically accessed the individual’s PHI. The access report provision covers accesses both by employees of covered entities and business associates and by those external to the organization. There is no comparable provision in the current law, but the NPRM notes that since the rule applies only to electronic access, covered entities should already be collecting the relevant information about accesses under practices required by the HIPAA Security Rule. It seems likely that at least part of the justification for this new right is the heightened attention focused on the need for such a record of even routine accesses following a series of well-publicized incidents where hospital employees apparently abused their authorized access by viewing the health records of celebrities or other public figures.
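To make the distinction between a disclosure accounting and an access report more concrete, the short sketch below illustrates the kind of access-log entry a covered entity’s systems might already capture under the Security Rule’s audit controls, and how such entries could be rolled up into a per-patient access report. It is purely illustrative; the field names and report layout are assumptions for the example, not requirements drawn from the NPRM or any particular EHR product.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

# Illustrative sketch only: field names and report layout are assumptions,
# not taken from the NPRM or from any particular EHR product.

@dataclass
class AccessLogEntry:
    timestamp: datetime   # when the PHI was accessed
    user_id: str          # workforce member (or external user) who accessed it
    organization: str     # covered entity or business associate the user works for
    patient_id: str       # the individual whose PHI was accessed
    action: str           # e.g., "view", "print", "export"

def build_access_report(audit_log: List[AccessLogEntry], patient_id: str) -> List[dict]:
    """Roll up existing audit-trail entries into a per-patient access report."""
    return [
        {
            "date": entry.timestamp.date().isoformat(),
            "accessed_by": entry.user_id,
            "organization": entry.organization,
            "action": entry.action,
        }
        for entry in audit_log
        if entry.patient_id == patient_id
    ]

# Hypothetical usage with a single in-memory audit entry
log = [AccessLogEntry(datetime(2011, 5, 27, 9, 30), "jsmith", "Example Hospital", "patient-001", "view")]
print(build_access_report(log, "patient-001"))
```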
Legislation introduced last week for consideration by the Senate Judiciary Committee would update some of the provisions in the Electronic Communications Privacy Act of 1986 (ECPA) to extend legal protections on information collected and maintained by electronic communications service providers to include geolocation information. The bill, introduced by Judiciary Committee chairman and Vermont Senator Patrick Leahy as the Electronic Communications Privacy Act Amendments Act of 2011 (S.2011), adds geolocation information (such as GPS coordinates and cell site location information) to the types of data that government authorities cannot obtain from service providers without first getting a warrant. The bill explicitly defines geolocation information as “any information concerning the location of an electronic communications device that is in whole or in part generated by or derived from the operation or use of the electronic communications device.”
Leahy, who is cited as the original author of the ECPA in the press release announcing the introduction of the new bill, has spearheaded a campaign through his committee to highlight the many ways in which modern technology has developed beyond what the law was envisioned to cover. In a series of hearings dating from before the 2010 mid-term elections, the Judiciary Committee has heard testimony from a variety of stakeholders, including government, academic, judicial, and industry representatives. More recently, committee hearings have focused on privacy issues associated with GPS coordinates and other geolocation information collected automatically by many popular mobile devices, with or without the knowledge of device users. These issues, coupled with a series of inconsistent federal court rulings that tried to interpret ECPA to apply its terms to technologies and data types that didn’t exist 15 years ago, have left a somewhat confusing picture regarding just what information is subject to privacy protections, under what circumstances, and with what level of legal and administrative constraints. If enacted as written, the proposed amendments to the ECPA would appear to resolve the ambiguity surrounding how geolocation data should be treated. The text of the bill would amend sections of Chapters 119 and 121 in Title 18 of the U.S. Code to prohibit the disclosure of such information by service providers and to prevent government authorities from accessing an electronic device for the purpose of retrieving geolocation information.
The focus on geolocation data in the proposed amendment is understandable given the attention generated by news that Apple’s popular iPhone devices store a cache of location information that some have interpreted as potentially useful for tracking an individual’s location over time. Of course, cellular service providers have long collected device location information as part of their routine business operations, leading to some legal debates over just who owns that information and, in particular, whether subscribers can assert privacy rights about that information. The proposed bill addresses this key issue and several related topics about information disclosure, warrant or subpoena requirements, and emergency exceptions. Still unaddressed are other provisions in ECPA, and in the Stored Communications Act language it contains, that cover the contents of electronic communications generally but were not written to address the wide variety of communications media, smartphones, tablets, and other sophisticated technologies that now use the services and infrastructure modern electronic service providers offer.
In the latest round of security recommendations for the Nationwide Health Information Network (NwHIN), the Privacy and Security Tiger Team (a workgroup of the federal Health IT Policy Committee that advises the National Coordinator for Health IT) offered its proposed approach for issuing digital certificates to NwHIN participants. In a brief presentation given at the Tiger Team’s May 23 meeting, the group recommended that all public key infrastructure certificates used in NwHIN exchanges must comply with Federal Bridge CA standards maintained by the Federal PKI Policy Authority and must be issued only by certificate authorities that have been cross-certified with the Federal PKI framework. The simple rationale for the recommendation offered by Tiger Team members is that any organization, commercial or otherwise, that participates in the NwHIN will presumably need or want to exchange data with federal agencies, and the Department of Veterans Affairs (VA) and other key health agencies have indicated that they will only accept certificates that conform to Federal Bridge CA standards.
Given the leading and central role played by the government in the NwHIN, the Tiger Team’s recommendation seems pretty intuitive. Separate from adhering to standards and expectations maintained by DoD, HHS, VA, and other government health entities, the recommendation — if adopted — would also serve to help realize the vision of getting ONC out of the certificate authority business, and also divest it of some of its governance authority over certificate issuers, who would need to apply directly to FPKIPA to receive cross-certification. Despite the award last summer of a contract to Stanley (now owned by CGI) for infrastructure and operations support that includes managing the digital certificate issuance process, ONC has long made it known that it does not want to operate any long-term infrastructure or own service delivery for the NwHIN. This approach would presumably still leave ONC with the governance responsibility of approving organizations for participation in the NwHIN, so as the NwHIN governance policies and procedures continue to evolve, it will be interesting to see what criteria or evaluation standards may be applied to applicant organizations to determine whether they should be allowed to participate at all. It is also important to remember that, regardless of who issues them, the digital certificates used in the NwHIN bind an organization — not an individual — to the certificate. This means that all employees or contractors of the participating organization who might have authorization to conduct data exchanges with other NwHIN participants are in essence sharing the same identification and authentication credential, putting the onus on the organization to ensure that only authorized individuals can access NwHIN-connected systems and initiate or conduct transactions.
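For readers curious what an organization-bound credential actually looks like, the brief sketch below reads an X.509 certificate and prints the subject’s Organization (O) attribute, which names the participating entity rather than any individual user. It assumes the Python cryptography package and a hypothetical certificate file name, and it is not an NwHIN-specific or Federal Bridge compliance check.

```python
# Rough illustration only; assumes the "cryptography" package and a
# hypothetical PEM file name. Not an NwHIN or Federal Bridge compliance check.
from cryptography import x509
from cryptography.x509.oid import NameOID

with open("nwhin_participant_cert.pem", "rb") as f:   # hypothetical file
    cert = x509.load_pem_x509_certificate(f.read())

# An organization-bound certificate names the participating entity, not an
# individual user, in the subject's Organization (O) attribute.
org = cert.subject.get_attributes_for_oid(NameOID.ORGANIZATION_NAME)
cn = cert.subject.get_attributes_for_oid(NameOID.COMMON_NAME)

print("Subject O  :", org[0].value if org else "(none)")
print("Subject CN :", cn[0].value if cn else "(none)")
print("Issuer     :", cert.issuer.rfc4514_string())
print("Not after  :", cert.not_valid_after)
```

An actual relying party would also need to validate the certificate chain back to a CA cross-certified with the Federal Bridge, which is beyond the scope of this sketch.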
As reported by the Globe and Mail earlier this week, a Canadian provincial court ruled that personal information stored by employees on employer-provided computers is protected by Canadian privacy laws, and the information cannot be given to law enforcement without first satisfying government search prerequisites such as obtaining a warrant. The Ontario Court of Appeal found that while an employer (in this case, a school) is permitted to inspect the contents of computers it owns – including personal data employees may have stored there – the employer cannot give law enforcement access to that data without a warrant. This legal interpretation of privacy regulations and protection from unreasonable search and seizure seems consistent with the 4th Amendment to the U.S. Constitution, which generally constrains the ability of government authorities to conduct searches but does not limit non-government employers from performing similar activities. However, the Ontario ruling answers the question of employee expectations of privacy regarding personal data more broadly than some similar judicial decisions in the United States, which tend to afford employees such protection in very narrowly defined circumstances.
The United States Supreme Court ruled yesterday that background checks conducted by government agencies (in this case, NASA) on government contractors, which included questions on prior drug use and treatment, were reasonable and did not violate the contractors’ right to privacy. The unanimous decision by the Court is not particularly surprising, but a concurring opinion (starting on p. 29) from Justice Antonin Scalia reveals some strong differences among the justices about the appropriate basis for the Court’s ruling, and about individual privacy rights under the Constitution. Writing for the Court, Justice Samuel Alito referred to precedents established in two previous cases, Whalen v. Roe and Nixon v. Administrator of General Services, to assert that the Court has held that a Constitutional right to privacy exists, at least with respect to personal information (the ruling uses the term “informational privacy”). Justice Scalia objects to this line of reasoning, finding no basis for such privacy rights in the Constitution. Specifically, Scalia wrote:
Like many other desirable things not included in the Constitution, “informational privacy” seems like a good idea – wherefore the People have enacted laws at the federal level and in the states restricting the government’s collection and use of information. But it is up to the People to enact those laws, to shape them, and, when they think it appropriate, to repeal them. A federal constitutional right to “informational privacy” does not exist.
This disagreement by necessity turns on interpretation, because the Constitution simply doesn’t address the matter (the word privacy does not appear anywhere within the text of the Constitution or its amendments). Some state governments and many privacy advocates consider this to be a serious deficiency, and aside from federal laws enacted to protect the privacy of personal information in a variety of contexts, 10 states have individual rights to privacy explicitly conferred in their state constitutions.
The reliance on Whalen seems somewhat insufficient, since in that case the court said that the duty of the government to avoid unauthorized disclosure of personal information that could be harmful or embarrassing to the individual “arguably has its roots in the Constitution.” In the current case, there is no dispute that NASA, as a federal government agency, is required under the Privacy Act (5 USC §552a) to limit the personal information it collects to only what is necessary to accomplish the intended purpose (in this case, to perform a background check), and is prohibited from disclosing the personal information it collects or using it in any way inconsistent with the purpose for which it was collected. Given these protections regarding personally identifiable information collected on contractors and other individuals subject to background investigations, the only real question is whether the information sought on prior drug use and treatment goes beyond what is reasonable or necessary for the government to ask in order to determine if the individuals are sufficiently trustworthy to perform the tasks their contract entails. It seems hard to argue that information about prior illegal activities, especially those involving potential addictions, would not be relevant to making a determination of trustworthiness. If so, then there is really no reason to entertain a debate about whether individual privacy rights are or are not afforded by the Constitution, although exposing the difference of opinion among Supreme Court justices offers some indication that the outcome of future cases that come before the Court on this issue might be hard to predict.