Lack of common security standards complicates health information exchange

A recurring challenge facing efforts to implement interoperable health information exchange solutions is agreeing on a common set of security standards that can be applied to both private and public sector participants in such exchanges. There are multiple alternatives from which to choose, notably the HIPAA Security Rule, ISO/IEC 27002, and the NIST SP 800-53 security controls used in association with FISMA, but none of these applies, whether by regulation or by choice, to all the different types of organizations sought as participants in health information exchange. The federal government, through the Office of the National Coordinator for Health IT (ONC) within the Department of Health and Human Services, has taken on the role of setting policy, providing program funding and financial incentives for health IT adoption, and establishing the criteria organizations must meet to qualify for those incentives. ONC has also formed advisory committees with representation from government, commercial, and non-profit organizations to determine the most appropriate overarching policies and standards for health information exchange.

For several years the government has been leading major initiatives intended to help realize the vision of a nationwide health information exchange infrastructure, and with the passage of the HITECH Act in February 2009 it also took on the role of arbiter of technical standards, including those for security. In a recent webcast sponsored by the 1105 Government Information Group, speakers from the government, contractor, and IT analyst communities presented security as both a key prerequisite for and an important enabler of health information exchange, and highlighted work being done today by the Department of Veterans Affairs (VA) that may serve as a model for recommended security standards for electronic health records. Even the limited experience to date with health information exchange between government agencies and private sector organizations demonstrates the enormous complexity involved in complying with all applicable security and privacy regulations. Nevertheless, getting security right is absolutely necessary to achieve widespread use of health IT and participation in health information exchanges.

For the VA’s part, Director of Health Care Security Gail Belles emphasized the need for a common set of security standards that can be applied to both public and private sector entities, but also highlighted the lack of consistent standards even among federal agencies for handling data exchanges with non-federal entities. During the webcast Belles summarized a Veterans Health Administration pilot project for sharing patient records with Kaiser Permanente in San Diego, using the specifications and standards of the Nationwide Health Information Network (NHIN). For the pilot, VHA and Kaiser Permanente signed a legal agreement laying out the terms, prerequisites, and obligations for data exchange between the two organizations, and then proceeded in accordance with the regulatory security requirements that apply to each of them. In Kaiser Permanente’s case, that includes HIPAA and HITECH as well as California laws such as SB 1386, which governs the privacy of personal information; the VA, in addition to HIPAA and HITECH, is subject to FISMA, the Privacy Act, and provisions under Title 38 of the U.S. Code covering the privacy and confidentiality of veterans’ medical records and claims data.

Even without diving into the specifics, it is clear the picture would be greatly simplified if a single comprehensive set of security and privacy standards were available. Because it has such a large presence in delivering and administering health care, the government alone is in a position to declare standards that federal and non-federal participants alike will adopt. The government is already working on a standard definition for the structure of an electronic health record, so it does not seem unreasonable for it to also take a shot at formalizing the standards required to secure those records when they are exchanged.

Debate over anonymous comment posting on school news site raises familiar First Amendment issues

In a story first reported by the Roanoke Times and picked up by the Washington Post in today’s edition, Virginia Tech’s student newspaper is at odds with the University Commission on Student Affairs over its practice of allowing anonymous comments to be posted on its website. The commission, an advisory body comprising students, faculty, and staff, has recommended that unless the paper changes its policy on online comments, the school’s administration should withdraw the roughly $70,000 in funding the Collegiate Times receives through its parent organization, the Educational Media Company at Virginia Tech (EMCVT). Because EMCVT is independent of Virginia Tech, it and the newspaper do not rely on school funding alone, but the commission has also suggested it might bar student organizations on campus from buying advertising in the paper, a loss of revenue that would threaten the paper’s survival. The disagreement has raised a variety of policy and legal issues, notably including constitutional claims under the First Amendment, which on balance seem to suggest that the paper is on defensible ground, but that the school can likely get its way.

While some have raised issues about the inconsistency of the online posting policy itself (the paper’s editors do not accept anonymous letters to the editor, for instance, but do allow anonymous comments on the website), given the educational setting of the case the core issues boil down to the school administration’s ability to control speech associated with the paper and the legal validity of Virginia Tech’s Principles of Community, which some anonymous comments posted in the past allegedly violate. Legal precedents established over the last 20 years or so generally side with educational administrators on the ability to censor some kinds of speech in any school-sponsored endeavor, not just publications, but courts have also found university speech codes and even anti-harassment policies to be unconstitutional.

Prior to 1988, the most relevant legal standard for First Amendment issues in educational settings was Tinker v. Des Moines Independent Community School District (393 U.S. 503 (1969)), in which the Supreme Court ruled that student expression was speech protected under the First Amendment; the decision was generally applied to mean that school administrators could not prevent student speech on the basis of its content. In 1988, however, the Court more or less reversed course in Hazelwood School District v. Kuhlmeier (484 U.S. 260 (1988)), ruling that school administrators could in fact censor a school-sponsored newspaper. The role of the school as sponsor or publisher is important in Hazelwood, because the Court drew a distinction between “activities that students, parents, and members of the public might reasonably perceive to bear the imprimatur of the school” and those that are independent of it. Despite the formal ownership and funding structure of the Collegiate Times, it seems hard for the paper to argue that it is so independent of Virginia Tech that content it produces would not be associated with the school. On this issue, Virginia Tech has the law on its side.

However, before declaring victory in this matter, the commission and the school might want to firm up the basis of their objections to the paper’s policy. By objecting to the use of anonymous posting to make comments that run counter to the Principles of Community, the commission puts the principles themselves at the heart of the dispute, and campus speech codes and other policies similar to Virginia Tech’s Principles of Community have repeatedly been found unconstitutional when challenged in court. As objectionable as the idea sounds, the administration might be on firmer ground if it followed through on its threat to prohibit student organizations from advertising in the paper.

Carrot or stick on cybersecurity?

Interesting post from GovInfoSecurity.com’s Eric Chabrow a couple of days ago, in which he borrows some conclusions from a Frontline documentary on the airline industry called “Flying Cheap” and applies them to the current debate about the best way to get critical infrastructure providers — especially those in the private sector — to implement and follow better security practices. Broadly speaking, there are two methods the government could use to effect changes in cybersecurity approaches: regulate or incentivize. A possible third option is closer collaboration between public and private sector organizations, but partnerships of that sort tend to fall into the “incentive” category, even if the incentives offered aren’t monetary.

The path of cybersecurity regulation has precedents in both the government (FISMA) and the private sector (HIPAA, GLBA, Sarbanes-Oxley), but the regulations now in force apply narrowly to particular industries and do not at present address most critical infrastructure providers, whether they run telecommunications networks, SCADA systems, or public works. Even with well-defined applicability, legislating security requirements often gets bogged down in the details, resulting in rules that say what you should do, but not how to do it effectively. This doesn’t mean the government isn’t working on new and revised security regulations; there are in fact multiple concurrent and sometimes overlapping legislative efforts pending in Congress, but if history is any guide, these will not be sufficiently explicit or detailed to raise the bar across the board. The alternative approach of providing incentives to companies to improve their security has more proponents in industry than in government, although the Cyberspace Policy Review commissioned by the Obama administration and released in May 2009 tends to favor incentives over mandates. No advocate of an incentive-based approach has been more visible or vocal than the Internet Security Alliance’s CEO Larry Clinton, who has been pushing this point since at least the 2008 presidential election.

The lesson from the airline industry and its legally mandated safety regulations is that complying with regulations, even when it is in the best interest of customers, costs money and affects the corporate bottom line. For organizations whose priorities are arranged more around business drivers than around the outcomes regulation seeks to achieve, some consideration ought to be given to positive compliance incentives, not just potential penalties for non-compliance. The administration has its own example to follow in the Recovery Act and the follow-on funding devoted to financial incentives for the adoption of health information technology; the motivation switches from incentive to penalty after 2015, but the emphasis in making the new technology pervasive is on positive incentives.

Will complying with the requirements of 201 CMR 17 offer any lessons to healthcare entities?

With the March 1 deadline for Massachusetts’ new personal data protection regulation (201 CMR 17) rapidly approaching, one of the many requirements facing covered organizations is the need to encrypt all records or files containing personal information when the data is in transit across public networks or sent wirelessly, as well as when it is stored on laptops or other portable devices. The requirements notably stop short of mandating encryption of all personal data at rest (for data on Internet-connected systems, the regulation requires up-to-date patches and firewall protection), although the definition of “breach of security” in the regulation applies only to the disclosure of unencrypted data, or of encrypted data along with the means to decrypt it.
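To make the portable-device requirement concrete, the sketch below shows one way an organization might encrypt a record containing personal information before it is written to a laptop or other portable device. It is only an illustration, not a compliance recipe: the regulation does not prescribe a particular library or algorithm, and the file name, record contents, and use of Python’s third-party cryptography package are assumptions made for the example.

```python
# Minimal sketch: symmetric encryption of a record containing personal
# information before it is stored on a portable device.
# Requires the third-party package:  pip install cryptography
from cryptography.fernet import Fernet

# In practice the key must be stored separately from the data; under
# 201 CMR 17's definition, losing encrypted data together with the means
# to decrypt it still counts as a breach of security.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"Jane Doe,1970-01-01,000-00-0000"   # hypothetical personal data
ciphertext = cipher.encrypt(record)

with open("record.enc", "wb") as f:           # only ciphertext touches disk
    f.write(ciphertext)

# Later, on an authorized system that holds the key:
with open("record.enc", "rb") as f:
    assert cipher.decrypt(f.read()) == record
```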

As with the exception to the data breach notification rules for personal health information that took effect last September, organizations that choose to use comprehensive encryption for personally identifiable information, whether stored or in transit, give themselves one less thing to worry about. It remains to be seen whether this provides sufficient incentive for encryption of data at rest to become more pervasive. To the extent that organizations publicize their experiences complying with the Massachusetts regulation, achieving compliance with 201 CMR 17 may provide a useful data point for organizations in the health arena. Healthcare entities face stronger data privacy and security requirements from the HITECH Act’s changes to existing HIPAA rules, and also have to plan ahead for the security requirements among the “meaningful use” criteria that will determine eligibility for federal health IT incentives for organizations adopting electronic health records and associated systems. Those criteria include the ability to encrypt and decrypt data both in storage and in transit, providing yet another reason for these entities to start encrypting their data. Under HIPAA and HITECH, organizations are not specifically required to encrypt personal data, so any risk-based decision to do so will likely center on the potential impact of a data breach on the organization. Given Health Net’s experience and the pending legal action against it, the decision by healthcare entities to continue to leave personal health data unencrypted makes less and less sense all the time.
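For the in-transit half of that meaningful use criterion, the common approach is not to encrypt each message by hand but to refuse to send personal data over anything other than an encrypted, certificate-verified channel such as TLS. The sketch below is a minimal illustration of that idea; the host name and request contents are placeholders, and nothing here is prescribed by HIPAA, HITECH, or the meaningful use rules.

```python
# Minimal sketch: send data only over a TLS connection that verifies the
# server's certificate, rather than over a plain socket.
import socket
import ssl

context = ssl.create_default_context()        # verifies the certificate chain
context.minimum_version = ssl.TLSVersion.TLSv1_2

with socket.create_connection(("example.org", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.org") as tls:
        # Placeholder request; real payloads would carry the protected data.
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.org\r\nConnection: close\r\n\r\n")
        print("negotiated:", tls.version())    # e.g. 'TLSv1.3'
```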

Facebook sued over December change in privacy practices

In a post earlier this week, we noted that generally speaking, anyone in the U.S. wanting to take legal action over the privacy practices of social networking sites like Facebook would have to do so within the boundaries of the Federal Trade Commission Act’s rules on unfair and deceptive trade practices. Two cases, both seeking class-action status, have now been filed in federal district court in California, alleging that Facebook’s recent changes are deceptive. It’s not at all clear that these charges have merit, especially in light of the explicit language Facebook has long posted in its terms of service, which basically reserves the right to make any changes it wants as long as it notifies users. Facebook users who don’t agree with that policy, or with the scope of the changes the company seems to like to make under cover of it, certainly have the option not to use Facebook. The point is that the legal avenues for going after Facebook are somewhat limited, so the approach apparently being followed in these lawsuits is logical from that standpoint. More troubling for the plaintiffs may be the charge that Facebook’s privacy settings are too detailed and spread out, making them confusing.

These formal legal actions come on the heels of prior complaints filed with the FTC by EPIC and other consumer and privacy advocates about Facebook and its new privacy settings. Based on initial responses to EPIC by FTC Bureau of Consumer Protection chief David Vladeck, the complaints are at least getting attention from the feds, even if no formal investigation has been initiated.