Facebook as a model for consent management

It’s not every day that you hear Facebook’s most recent changes to its privacy practices referred to in strongly positive terms (at least by people who don’t work for Facebook), but some leading advocates of more fine-grained control over privacy in the health information context point to Facebook as an example showing, in the words of World Privacy Forum founder Pam Dixon, “that we can in fact have granular control over sensitive data.” One of the key aspects of health information privacy that remains under-addressed to date is the capture of, and adherence to, consumer preferences about the use and disclosure of their personal health information. Beyond the debates about exactly what uses of the data should require proactive consent from individuals, there has been concern over the functional and practical aspects of managing many different “consents” corresponding to different uses, contexts, and scenarios. Deborah Peel, a doctor and founder of the non-profit Patient Privacy Rights, characterizes Facebook as a “kind of consent management system,” albeit one with controls that could stand improvement and that may not be fully suitable to handle the complexity (or robust identification and authentication requirements) involved in consent management for health information.
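To make the granularity problem concrete, here is a minimal sketch in Python of what a per-use consent record and a default-deny disclosure check might look like. The class, field, and function names are purely illustrative assumptions; they are not drawn from any actual consent management product, standard, or the systems discussed above.

    # Hypothetical illustration of "granular" consent: each directive is scoped
    # to a specific data category, purpose, and recipient, rather than one
    # blanket opt-in covering all uses of a person's health information.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass(frozen=True)
    class ConsentDirective:
        patient_id: str          # in practice this identity would need robust authentication
        data_category: str       # e.g. "medications", "mental_health_notes"
        purpose: str             # e.g. "treatment", "research", "marketing"
        recipient: str           # organization or class of organizations
        granted: bool
        expires: Optional[datetime] = None

    def is_disclosure_permitted(directives: list[ConsentDirective],
                                data_category: str, purpose: str,
                                recipient: str, now: datetime) -> bool:
        """Permit a disclosure only if a matching, unexpired directive grants it."""
        for d in directives:
            if (d.data_category == data_category and d.purpose == purpose
                    and d.recipient == recipient and d.granted
                    and (d.expires is None or d.expires > now)):
                return True
        return False  # default-deny when no explicit consent is on file

Even this toy model hints at the practical burden critics worry about: every new use, context, or recipient multiplies the number of directives an individual must understand and manage.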

New ONC Chief Privacy Officer starts with a full plate

When Joy Pritts, the newly appointed Chief Privacy Officer in the Office of the National Coordinator for Health IT, started her job on Monday, she already had a list of pending action items stemming from activities, initiatives, and recommendations dating back more than three and a half years. High up on that list should be formulating a single cohesive federal privacy policy to help eliminate some of the confusion, conflicts, and inconsistencies among various federal and state level rules and regulations on privacy. This is a task for which Pritts should be unusually well suited, given her extensive research into different state-level privacy practices and deep expertise with key federal health privacy legislation, especially HIPAA. Over 10 years ago she was the lead author on a study called The State of Health Privacy, sponsored by the Health Privacy Project at Georgetown (now part of the Center for Democracy and Technology). The study, first published in 1999 and revised several times since then, provides a comparison of state-level privacy statutes and identifies states that have augmented or strengthened federal HIPAA requirements. More recently she served as an advisor on the Health Information Security and Privacy Collaboration (HISPC), an ONC initiative begun in 2006 and now in its third phase that includes participation from 42 states.

To the extent that ONC director David Blumenthal will shape the Office’s privacy agenda, the effort to establish a comprehensive privacy policy (or privacy framework, as is often proposed) will draw on multiple sources of privacy principles and guidelines, as was the case with the privacy framework promulgated by the American Health Information Community (AHIC) a couple of years ago. At the urging of privacy advocates who have been advising the Health IT Policy Committee on privacy issues, both before and since the release of the meaningful use criteria for EHR technology adoption incentives, the current effort on privacy policy will apparently also include recommendations submitted over three years ago by the National Committee on Vital and Health Statistics (NCVHS) to address patient privacy rights associated with the Nationwide Health Information Network (NHIN). Tucked into those recommendations is a proposed definition for health information privacy: “an individual’s right to control the acquisition, uses, or disclosures of his or her identifiable health data.” This definition, if adopted, may have a better-than-average chance of successful application, as it is domain-specific and encompasses the three major contexts (according to a model proposed and fully described by Daniel Solove in Understanding Privacy) in which privacy ordinarily comes into play: information collection, information processing, and information dissemination. Given the past indications that ONC considers the privacy policy issue to be a prerequisite for making progress on other key initiatives, perhaps policy is the best place for Pritts to start.

More calls for government action on Internet security

During yesterday’s hearing of the Senate Committee on Commerce, Science, and Transportation, a panel of security experts urged the government to do more to push public and private sector action on critical infrastructure protection and Internet security, although those testifying differed on the exact role the federal government should play in encouraging that action. James Lewis of the Center for Strategic and International Studies repeated before the Committee his argument that federal regulation is needed to achieve the levels of participation sought among private sector organizations. His testimony included an analogy — familiar to anyone who has heard Lewis speak publicly in recent months — likening the need for the government to regulate cybersecurity to the historical regulatory action to promote safety in the automobile industry, and arguing that it is no longer feasible to rely on voluntary adoption of best practices and market forces. Michael McConnell, former director of national intelligence and currently with government contractor Booz Allen Hamilton, expressed similar views and concluded that private industry could no longer credibly advocate a hands-off role for government. Other panelists were more circumspect in their choice of words and recommendations, such as Oracle Chief Security Officer Mary Ann Davidson, whose prepared remarks largely expressed support for things that Congress is actively doing, such as increasing funding for education in information security skills and encouraging the design of software and technology products that are built to be more secure. Admiral James Arden Barnett, Jr., the Director of the Public Safety and Homeland Security Bureau of the Federal Communications Commission (FCC), also sees a role for government intervention, and focused his remarks on the possible role the FCC can play in critical infrastructure protection, including serving as the central point for information on network outages and related issues collected from broadband service carriers. Scott Borg, head of the U.S. Cyber Consequences Unit (a non-profit research institute), while noting the problem of market failures resulting in under-addressed aspects of cybersecurity, warned that the technical landscape changes so quickly that there is no practical way for the government to keep up if it tries to impose standards. None of the positions expressed was inconsistent with the emphasis on strong public-private partnerships to advance cybersecurity advocated by Committee Chairman Sen. Jay Rockefeller, who with co-sponsor Sen. Olympia Snowe drafted a piece of legislation titled the Cybersecurity Act of 2009 (S.773) that would codify and strengthen federal oversight roles on security, including elevating the federal cybersecurity czar to a Cabinet-level position. There were few voices at this hearing representing industry arguments that financial incentives are a better alternative to regulation, or the concerns raised by privacy advocates and free market proponents. The pervasive theme in yesterday’s hearing was the sense of urgency for the government to act, due to the ongoing threat environment and the potential for a serious attack attempt against U.S. critical infrastructure.

Not ready to comply with HITECH? That’s OK, HHS isn’t ready to enforce it yet either

Government observers are well aware that there is a big difference between passing a provision in a piece of legislation, crafting the rules that implement the provision, and then putting those rules into effect. Where new requirements or regulatory responsibilities are placed on organizations, it is also fairly common for the effective dates of new rules to be delayed if it’s clear the entities subject to the regulation aren’t ready to comply. Familiar examples of such delays and compliance deadline extensions include those for small businesses subject to Sarbanes-Oxley and, more recently, the multiple delays in the deadline for personal information protection requirements in Massachusetts’ 201 CMR 17. With these precedents, it is perhaps unsurprising that personnel from the HHS Office for Civil Rights (OCR) have indicated that OCR will not immediately begin enforcing the new security and privacy requirements in the HITECH Act that apply to business associates. With these rules — essentially a set of strengthened HIPAA privacy and security requirements that apply to a broader set of health industry participants and organizations — it seems the delay is warranted not just by the apparent lack of readiness of the organizations covered by the rules, but also by OCR’s uncertainty regarding the most reasonable and consistent approach to take on HIPAA enforcement. HITECH reset many of the standards and expectations for monitoring and auditing compliance, and for investigating violations.

On a tangential note, this situation highlights the difficulty with following implementation timelines dictated in legislation, often without any extensive consideration of the feasibility of meeting those timelines. So far, HHS has done a pretty good job of issuing regulations and promulgating standards (at least in draft form) on or ahead of the schedule contained in the HITECH Act. The timing of the announcement last week that Joy Pritts had joined ONC as its Chief Privacy Officer was also dictated by HITECH (the law says the appointment had to be made “not later than 12 months after the date of enactment”). It should be noted that the rules under HITECH are officially in effect, so the only delay is in their enforcement. To some this might seem a trivial distinction, but historically HIPAA enforcement has relied a great deal on voluntary compliance, so the fact that business associates shouldn’t expect an auditor visit right away shouldn’t divert those organizations’ attention from putting the appropriate processes, practices, and technologies in place to comply with the law.

If allegations are true, Pa. school district is on the wrong side of a lot of privacy rules

With a lawsuit filed in federal court last week, school officials in Lower Merion, Pennsylvania are on the defensive over the alleged illegal use of remotely activated webcams in laptop computers issued to students. It seems the MacBooks include software that allows administrators to turn on the webcam to help recover a laptop should it become lost or stolen; the security feature has been used several dozen times in such situations, apparently without raising any objections from students or their parents. In the case that prompted the lawsuit, however, a Harriton High School student was accused of engaging in “improper behavior” after school administrators recorded and viewed images of the student putting small objects in his mouth — the school said they were drugs; the student says they were candy. Despite using the photographic “evidence” to support its claim against the student, the school district maintains that it would never use the remote webcam activation for any purpose other than recovery of a lost or stolen laptop. The Lower Merion district superintendent went so far as to claim, “The district has not used the tracking feature or webcam for any other purpose or in any other manner whatsoever.” He did not address how a Harriton assistant principal came to be in possession of images from the accused student’s laptop webcam, since there was no suspicion that the laptop was missing. There doesn’t appear to be any claim of probable cause (not that a school official is legally justified in determining probable cause) with respect to the student’s alleged behavior; instead the claim is based on visual observations made using the webcam.

The most thorough (the term “thorough” doesn’t quite do justice to it) accounting of the technical tools involved and the actions and opinions of school network technicians comes from Intrepidus consultants Stryde Hax and Aaron Rhodes in a lengthy blog post.

With the attention now focused on the situation, it is becoming clear that, beyond the alleged practice of remotely monitoring students in their own homes violating a number of federal laws, the school district appears to have acted inappropriately from the outset by not informing students or parents that the webcams in the laptops could be activated remotely. Even if it had provided notification and obtained consent for the explicit purpose of remote activation to aid in recovery of lost or stolen computers, the apparent use of the webcam for routine monitoring would be illegal. Many state and federal laws covering monitoring of employee behavior in the workplace, such as the Electronic Communications Privacy Act, require notification and consent prior to monitoring, and the fact that this monitoring took place in private homes and that minors were surveilled adds a host of other legal and regulatory protections that the school district appears to have ignored. In addition to ECPA, the lawsuit claims violations of the Computer Fraud and Abuse Act, the Stored Communications Act, and a section of the Civil Rights Act, as well as the Pennsylvania Wiretapping and Electronic Surveillance Act, Pennsylvania common law, and the Fourth Amendment.