This page summarizes some of the more interesting security and privacy challenges we are currently focusing on. The intent is both to explain the nature of these problems and, where possible, to suggest ways to address them.
Online delivery of academic courses is increasingly popular, particularly among institutions offering part-time programs of study targeted at working professionals and other adult learners. There are numerous practical and logistical challenges – as well as pedagogical ones – associated with ensuring that online or distance learning models are as effective as conventional face-to-face instruction. These challenges are especially acute when course subject matter is highly technical or specialized, or when achieving the intended learning objectives involves lab exercises or hands-on experimentation. More than 10 years of experience teaching online courses to graduate information assurance students has made it readily apparent that technical mechanisms are available to facilitate effective learning outcomes in online learning environments.
One obvious and widely available technology to leverage in online courses is virtualization. Initial efforts began in 2011 within UMGC’s information assurance program to develop a virtual lab environment that enables students to work with specialized software and security tools without requiring significant prior experience with those technologies. These efforts were shared in a presentation entitled “Leveraging Virtualization to Facilitate Online Delivery of Technical Courses” at the 2012 New Learning Technologies conference in Orlando, Florida, sponsored by the Society for Applied Learning Technology (SALT). This presentation highlighted the use of virtual machine technology to facilitate delivery of online courses involving the hands-on use of security tools such as Snort. Aside from explaining the rationale for implementing virtualization technology in this context, the presentation provided references and instructions for implementing a virtual machine-based instance of Ubuntu Linux with course-specific security tools and supporting programs pre-installed and configured.
Virtualization technology provides an ever-maturing platform and delivery model for information security education, particularly for approximating real-world deployment and usage patterns emphasizing open-source security tools such as Snort running in conventional LAMP-stack virtual servers. The ability to quickly and easily replicate multiple virtual machine instances offers instructors a scalable, consistent online lab environment that is both more accessible to students with a broader range of technical experience and easier to support, maintain, and enhance in service of learning objectives.
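As an illustration of how easily lab instances can be replicated, per-student VM cloning can be scripted. The sketch below is only a minimal example, assuming a VirtualBox-based setup with a hypothetical pre-built course image named `ubuntu-snort-lab`; it uses VBoxManage's `clonevm` command, and the `dry_run` mode lets an instructor inspect the commands before running them.

```python
import shlex
import subprocess

# Hypothetical name for the pre-built course image (Ubuntu with Snort
# and supporting tools already installed and configured).
BASE_VM = "ubuntu-snort-lab"

def clone_command(student_id: str) -> list[str]:
    """Build the VBoxManage command that clones the base lab VM for one student."""
    return [
        "VBoxManage", "clonevm", BASE_VM,
        "--name", f"{BASE_VM}-{student_id}",
        "--register",
    ]

def provision(student_ids: list[str], dry_run: bool = True) -> list[str]:
    """Clone one lab VM per student; with dry_run=True, only report the commands."""
    commands = [clone_command(sid) for sid in student_ids]
    if not dry_run:
        for cmd in commands:
            subprocess.run(cmd, check=True)  # requires VirtualBox on the host
    return [shlex.join(cmd) for cmd in commands]
```

Because every student instance is cloned from the same base image, the environment each student sees is identical, which is what makes the lab consistent and supportable at scale.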
Trust has long been an important topic in organizational theory, where the emphasis is on the need to engender trust among key organizational stakeholders in order to operate businesses more effectively. In the information age, where data sharing, information exchange, and interoperability are among the primary goals, trust (and distrust) among organizations is a key factor. Our initial thinking on this topic is centered on the creation and maintenance of trust relationships, which, as nicely described in NIST’s Special Publication 800-39, can be established either authoritatively or through negotiation:
In the authoritative approach, an organization with appropriate authority establishes the essential conditions for trust. The authoritative organization initially: (i) identifies the goals and objectives for the provision of services/information or the participation in information sharing activities; (ii) determines the risk associated with the provision of such services/information or the information sharing activities; (iii) establishes the degree of trustworthiness of the information systems providing the services/information or supporting the information sharing operations; and (iv) determines how compliance to the trust requirements is demonstrated and measured. Once established, the trust relationship can continue as long as the information system trustworthiness remains unchanged and the organizational risk remains acceptable.
When a single authoritative organization does not exist over the organizations desiring to share information or to use services/information from external providers, or when such an organization might exist but is not willing or able to accept the risks to be incurred or to be accountable for risk management decisions, an alternative approach for developing trust relationships may be in order. The alternative, negotiated approach establishes trust through agreements among potential partners and relies on negotiating the provisions for the elements of trust among those partners. In developing negotiated trust relationships, there must be explicit agreement on all elements of trust including the identification of goals and objectives for the provision of services/information or information sharing, the associated risk in conducting those activities, the trustworthiness for information systems involved in the partnership, how trustworthiness is to be demonstrated and measured, and how the trust relationship is to be maintained over time. The objective is to achieve a sufficient understanding of the partner’s information security programs and information systems in order to establish and maintain an environment conducive to information sharing or to obtaining services/information.
Trust relationships depend on the specific actions taken by the participating/cooperating partners to provide appropriate security controls for the information systems supporting the partnerships and the evidence needed to demonstrate that the controls have been implemented as intended. This evidence can include, for example, security plans (including risk assessments), security assessment reports, plans of action and milestones, or any other information that the organization can produce to demonstrate the trustworthiness of its information systems (NIST SP 800-39, pp. 17-18).
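The elements of negotiated trust enumerated in the passage above map naturally onto a simple record. The sketch below is a hypothetical structure (the field names are ours, not NIST's) that a trust-framework tool might use to track whether every element of an agreement has actually been negotiated before the relationship is treated as established.

```python
from dataclasses import dataclass, field

@dataclass
class NegotiatedTrustAgreement:
    """Elements that SP 800-39 says partners must explicitly agree on."""
    partners: list[str]
    goals_and_objectives: str       # why services/information are shared
    assessed_risk: str              # risk of conducting the sharing activities
    system_trustworthiness: str     # trustworthiness of the systems involved
    measurement_approach: str       # how trustworthiness is demonstrated/measured
    maintenance_plan: str           # how the relationship is maintained over time
    evidence: list[str] = field(default_factory=list)  # e.g., security plans, assessment reports

    def is_complete(self) -> bool:
        """The agreement is actionable only when every element is non-empty
        and at least two partners are involved."""
        required = [self.goals_and_objectives, self.assessed_risk,
                    self.system_trustworthiness, self.measurement_approach,
                    self.maintenance_plan]
        return len(self.partners) >= 2 and all(required)
```

Making the agreement a first-class data object, rather than prose in a contract, is one small step toward the technically implementable trust frameworks discussed below.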
One area of particular interest for investigation is the potential for building trust frameworks, implementable through technical means, as an alternative to the predominantly legal approaches seen in contexts such as international data sharing (e.g., the EU-U.S. Privacy Shield) and health information technology.
Conventional treatments (and indeed most definitions) of data integrity in general, and message integrity in particular, consider only threats to integrity during transit – that is, they ask whether the message as received is the same as the message when sent. This problem space is tangentially related to a class of fault-tolerance techniques intended to address “Byzantine failures,” named for the Byzantine Generals Problem first described by Lamport, Shostak, and Pease in 1982. The Byzantine Generals Problem considers ways to ensure system (or decision-maker) reliability when neither message sources nor communication channels can be trusted. The model focuses on communication among multiple participating nodes, but it is also relevant to contemporary data integration patterns that bring data from multiple sources into a single point of aggregation. In a distributed system that performs calculations, reporting, or analysis on data aggregated from multiple sources, errors or inaccuracies in some of the data can invalidate the aggregate and any results produced with it. This translates into a need both for effective protective measures for stored data and for a means by which transmitting entities can determine the accuracy of the data they hold and assert its validity when it is transmitted to another entity. While many procedural and technical controls exist to protect the integrity of data at rest, the ability to first establish and then assert data validity remains a problem with no current solution, making it a good topic for further investigation.
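The "assertion of validity" half of this problem can at least be sketched concretely: a data holder computes a digest over its records and signs a claim about them that travels with the data, so a recipient can check both that the data survived transit and that it matches what the sender vouched for. The example below is only an illustration of the mechanics, not a solution – it uses an HMAC over a shared key (a real deployment would more likely use public-key signatures), and it can only demonstrate that an assertion was made, not establish that the underlying data is accurate, which is precisely the open part of the problem.

```python
import hashlib
import hmac
import json

def make_assertion(records: list[dict], key: bytes, source: str) -> dict:
    """Digest the records canonically and sign a validity claim over the digest."""
    payload = json.dumps(records, sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    claim = json.dumps({"source": source, "sha256": digest}, sort_keys=True).encode()
    tag = hmac.new(key, claim, hashlib.sha256).hexdigest()
    return {"source": source, "sha256": digest, "hmac": tag}

def verify_assertion(records: list[dict], assertion: dict, key: bytes) -> bool:
    """Recompute the digest and check that the sender's signed claim still holds."""
    payload = json.dumps(records, sort_keys=True).encode()
    if hashlib.sha256(payload).hexdigest() != assertion["sha256"]:
        return False  # data changed after the assertion was made
    claim = json.dumps({"source": assertion["source"], "sha256": assertion["sha256"]},
                       sort_keys=True).encode()
    expected = hmac.new(key, claim, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["hmac"])
```

An aggregator receiving data from multiple sources could require such an assertion with every transmission and reject any contribution whose assertion fails to verify, narrowing – though not eliminating – the Byzantine problem of untrusted sources feeding a single point of aggregation.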