Privacy Engineering


Definition: Privacy is generally defined as the ability of an individual to exercise control over the collection, use, and dissemination of his or her Personally Identifiable Information (PII). Privacy issues arise with the collection, use, maintenance, and disclosure of PII.

PII is defined as “information which can be used to distinguish or trace an individual’s identity, such as their name, social security number, biometric records, etc. alone, or when combined with other personal or identifying information which is linked or linkable to a specific individual, such as date and place of birth, mother’s maiden name, etc.” [1] Determining whether information is PII “demands a case-by-case assessment of the specific risk that an individual can be identified” [2].

Keywords: E-Government Act, Fair Information Practice Principles (FIPPs), personally identifiable information (PII), privacy, Privacy Act, privacy impact assessment (PIA), system of records

MITRE SE Roles & Expectations: MITRE systems engineers (SEs) are expected to recognize when privacy is within the scope of the system being developed. They should work with the sponsoring organization to translate the agency’s privacy principles, policies, and procedures into actionable engineering requirements applicable to the focus of the system.

Background

Privacy is based on the implementation of Fair Information Practice Principles (FIPPs), which were initially developed in 1973 by a federal advisory committee, commissioned because of concern over the harmful consequences that computerized data systems could have on the privacy of personal information. Different versions of the FIPPs have been defined. One version is found in the Guidelines on the Protection of Privacy and Transborder Flows of Personal Data [3] that were developed by the Organization for Economic Cooperation and Development (OECD) in 1980 and later replaced by the 2013 OECD Privacy Framework [4]. The principles in the OECD documents have been widely adopted and form the basis for many privacy laws worldwide (see Table 1).

Table 1. 2013 OECD Privacy Framework Privacy Principles


  • Collection limitation: There should be limits to the collection of personal data, and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.

  • Data quality: Personal data should be relevant to the purposes for which they are to be used and, to the extent necessary for those purposes, should be accurate, complete, and kept up to date.

  • Purpose specification: The purposes for which personal data are collected should be specified no later than at the time of data collection, and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.

  • Use limitation: Personal data should not be disclosed, made available, or otherwise used for purposes other than those specified in accordance with the Purpose Specification Principle except a) with the consent of the data subject or b) by the authority of law.

  • Security safeguards: Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification, or disclosure of data.

  • Openness: There should be a general policy of openness about developments, practices, and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.

  • Individual participation: Individuals should have the right:

    a) To obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to them.

    b) To have communicated to them, data relating to them

       i. Within a reasonable time.
       ii. At a charge, if any, that is not excessive.
       iii. In a reasonable manner.
       iv. In a form that is readily intelligible to them.

    c) To be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial.

    d) To challenge data relating to them and, if the challenge is successful, to have the data erased, rectified, completed, or amended.

  • Accountability: A data controller should be accountable for complying with measures that give effect to the principles stated above.

The centerpiece of the federal government's legal framework for privacy protection, the Privacy Act of 1974 [5], is based on the FIPPs and provides safeguards for information maintained by federal agencies. Specifically, the act places limitations on agencies' collection, disclosure, and use of personal information maintained in systems of records. The act describes a "record" as any item, collection, or grouping of information about an individual that is maintained by an agency and contains his or her name or another personal identifier. It also defines "system of records" as a group of records under the control of any agency from which information is retrieved by the name of the individual or by an individual identifier. The Privacy Act requires that when agencies establish or make changes to a system of records, they must notify the public through a notice in the Federal Register [6]. Agencies are allowed to claim exemptions from some of the provisions of the Privacy Act if the records are used for certain purposes, such as law enforcement. However, no system of records can be exempted from all of the Privacy Act's provisions. For example, agencies must publish the required notice in the Federal Register for all systems of records, even those that involve classified information. This ensures that the federal government does not maintain secret systems of records—a major goal of the act.

More recently, in 2002, Congress enacted the E-Government Act [7] to, among other things, enhance protection for personal information in government information systems or information collections by requiring that agencies conduct privacy impact assessments (PIAs). A PIA is an analysis of how personal information is collected, stored, shared, and managed in a federal system; it is used to identify privacy risks and mitigating controls to address those risks. Agencies must conduct PIAs (1) before developing or procuring information technology that collects, maintains, or disseminates information that is in identifiable form, or (2) before initiating any new data collections of information in an identifiable form that will be collected, maintained, or disseminated using information technology if the same questions are asked of 10 or more people. Individual PIAs may vary significantly depending on the specifics of department/agency guidance and the scope and sensitivity of PII collected, used, and disseminated.
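The two statutory triggers described above can be read as a simple decision rule. The following is a minimal, hypothetical sketch of that rule only; the parameter names are illustrative, and real department/agency guidance adds nuances this check does not capture.

```python
# Hypothetical sketch of the two E-Government Act PIA triggers summarized
# above. Parameter names are illustrative assumptions, not statutory terms.

def pia_required(new_it_with_pii: bool,
                 new_collection_via_it: bool,
                 respondents: int) -> bool:
    """Return True if either trigger for a privacy impact assessment applies."""
    # Trigger 1: developing or procuring IT that collects, maintains, or
    # disseminates information in identifiable form.
    trigger_1 = new_it_with_pii
    # Trigger 2: a new collection in identifiable form, via IT, asking the
    # same questions of 10 or more people.
    trigger_2 = new_collection_via_it and respondents >= 10
    return trigger_1 or trigger_2
```

For example, a new survey of 9 respondents would not trip the second trigger, while the same survey of 10 respondents would.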

Privacy Is Not Synonymous with Security

Privacy focuses on the individual's ability to control the collection, use, and dissemination of their PII, whereas security provides the mechanisms to ensure confidentiality and integrity of information, and the availability of information technology systems. The concepts of privacy and security, however, do intersect. Specifically, certain IT controls established to ensure confidentiality and integrity from a security perspective also support privacy objectives. For example, access controls ensure that only authorized individuals can read, alter, or delete PII. Such controls help achieve confidentiality and integrity from a security standpoint. In addition, when a system processes or stores PII, these IT controls ensure that users can access only the specific PII needed to perform their jobs; this helps ensure that use of PII is limited to authorized purposes (purpose specification) and protected from unauthorized access, destruction, and disclosure (security safeguards). Although establishing good security practices helps protect privacy, these practices are not, in and of themselves, sufficient to fully address the FIPPs.
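The overlap between security and privacy controls can be made concrete. Below is a minimal, hypothetical sketch of an access check that enforces both confidentiality (who may read) and purpose specification (for what use); the roles, purposes, and field names are illustrative assumptions, not drawn from any standard.

```python
# Hypothetical sketch: one control serving both security (confidentiality)
# and privacy (purpose specification). Roles, purposes, and PII fields are
# illustrative assumptions.

ALLOWED = {
    # (role, purpose) -> PII fields that role may read for that purpose
    ("benefits_clerk", "claim_processing"): {"name", "ssn", "dob"},
    ("help_desk", "identity_verification"): {"name", "dob"},
}

def readable_fields(role: str, purpose: str, requested: set) -> set:
    """Return only the PII fields this role may read for the stated purpose."""
    return requested & ALLOWED.get((role, purpose), set())

# A help-desk user asking for name and SSN receives only the name.
print(readable_fields("help_desk", "identity_verification", {"name", "ssn"}))
# → {'name'}
```

Note that the same table both limits access (a security objective) and ties that access to a declared purpose (a privacy objective).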

Many organizations rely on the following activities to address privacy risks:

  • Policy
  • Risk assessments (e.g., PIAs)
  • Notice
  • Records management
  • Accounting of disclosures
  • Data flow mapping
  • Data loss prevention
  • Metrics

Yet privacy risks remain and privacy breaches continue to rise. Why? Because these activities alone do not proactively address privacy risks at the appropriate level of specificity for a given system. To be effective, systems containing PII must be able to prevent or minimize the effects of human error and to appropriately constrain system actions.

To adequately address privacy risks, systems that manage PII must behave in a privacy-sensitive manner. Systems engineering processes are a largely untapped opportunity to embed privacy requirements into organizational activities in a way that provides major impact and will proactively address privacy risks. (See Figure 1.)

Figure 1. Addressing Privacy Risk in Systems Engineering Processes

Privacy Engineering Framework

Privacy engineering is a systematic, risk-driven process that operationalizes the Privacy by Design (PbD) philosophical framework within IT systems by:

  • Segmenting PbD into activities aligned with those of the systems engineering life cycle (SELC) and supported by particular methods that account for privacy’s distinctive characteristics.
  • Defining and implementing requirements for addressing privacy risks within the SELC using architectural, technical point, and policy controls. Privacy requirements must be defined in terms of implementable system functionality and properties, so that privacy risks, including those beyond compliance risks, are identified and adequately addressed.
  • Supporting deployed systems by aligning system usage and enhancement with a broader privacy program.
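As an illustration of defining privacy requirements in terms of implementable system functionality, consider a retention requirement such as "PII shall be retained no longer than 90 days after case closure." A minimal, hypothetical sketch of that requirement as a testable system property follows; the 90-day window and record shape are illustrative assumptions.

```python
# Hypothetical sketch: a policy-level retention requirement expressed as
# implementable, testable system behavior. The 90-day window and record
# fields are illustrative assumptions.

from datetime import date, timedelta

RETENTION = timedelta(days=90)

def records_to_purge(records, today):
    """Select closed records whose retention window has elapsed."""
    return [r for r in records
            if r["closed_on"] is not None
            and today - r["closed_on"] > RETENTION]

cases = [
    {"id": 1, "closed_on": date(2014, 1, 2)},
    {"id": 2, "closed_on": None},   # still open: never eligible for purge
]
print([r["id"] for r in records_to_purge(cases, date(2014, 8, 18))])
# → [1]
```

Because the requirement is stated as system functionality rather than only as policy text, it can be verified by an executable test during verification and validation.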

The goal is to integrate privacy into existing systems engineering processes; it is not to create a separate new process. Figure 2 illustrates how the core privacy engineering activities map to stages of the classic systems engineering life cycle. A mapping exists for every systems engineering life cycle, including agile development, because every life cycle includes the core activities in some form.

Figure 2. Privacy Engineering Framework

The primary privacy engineering activities and methods are listed in Table 2.

Table 2. Privacy Engineering Activities and Methods

Privacy Requirements Definition: Specification of system privacy properties in a way that supports system design and development.

  • Baseline & custom privacy system requirements: Granular technical privacy requirements derived from first principles and from risk analysis.
  • Privacy empirical theories & abstract concepts: Methodological constructs based on theories of privacy and socio-technical systems.

Privacy Design and Implementation: Representation and implementation of those elements of the system that support defined privacy requirements.

  • Fundamental privacy design concepts: Explicit or tacit consensus understandings of how privacy works in a system.
  • Privacy empirical theories & abstract concepts: Methodological constructs based on theories of privacy and socio-technical systems.
  • Privacy design tools: Specific techniques for achieving privacy.
  • Privacy heuristics: Experientially developed rules of thumb regarding privacy properties of artifacts.

Privacy Verification and Validation: Confirmation that defined privacy requirements have been correctly implemented and reflect stakeholder expectations.

  • Privacy testing & review: Executable tests and targeted document reviews associated with privacy requirements.
  • Operational synchronization: Analysis of privacy policies & procedures and system behaviors for inconsistencies.

For additional details on privacy engineering inputs, activities, and outputs for each life-cycle activity in Table 2, see MITRE's Privacy Engineering Framework. See the SEG's Privacy Requirements Definition and Testing article for more information on addressing privacy within requirements definition and testing activities.
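The privacy testing & review method in Table 2 can be as lightweight as executable checks tied to a specific requirement. A minimal, hypothetical sketch follows, assuming a requirement that application logs contain no Social Security numbers; the log format and the regular expression are illustrative assumptions.

```python
# Hypothetical sketch of an executable privacy test tied to a
# "no SSNs in application logs" requirement. The SSN pattern and log
# strings are illustrative assumptions.

import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def log_leaks_ssn(log_text: str) -> bool:
    """Return True if the log text contains an SSN-shaped token."""
    return bool(SSN_PATTERN.search(log_text))

# These assertions would run as part of privacy verification and validation.
assert log_leaks_ssn("user updated record 123-45-6789") is True
assert log_leaks_ssn("user updated record [REDACTED]") is False
```

Checks of this kind make a privacy requirement falsifiable during verification rather than leaving it as a policy statement alone.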

Best Practices and Lessons Learned

Privacy is not ensured by policy alone. Privacy by Design (PbD) advances the view that privacy cannot be assured solely by compliance with regulatory frameworks; rather, privacy assurance must become an organization’s default mode of operation. PbD applies to information technology, accountable business practices, and physical design. Adequate privacy requires thoughtful integration with every layer of an organization, including:

  • Organization policies and governance
  • Business processes
  • Standard operating procedures
  • System and network architectures
  • IT system design and development practices
  • Management of data sources

Consider the use of enterprise privacy-enhancing technologies (ePETs). ePETs are enterprise-oriented data stewardship tools that help organizations achieve their business goals while appropriately managing PII throughout the information life cycle. These technologies may or may not be privacy-specific. ePETs include tools that can desensitize static data in databases by applying a variety of transformations, including masking and obfuscation. Desensitized data can then be used for testing and other purposes without unnecessarily disclosing individuals' PII. Another example is enterprise digital rights management, which can impose limitations on the use of digital content and devices and can likewise enforce limitations on the use of PII.
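The static-data desensitization described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the behavior of any particular ePET; the field choices and masking rules are illustrative assumptions.

```python
# Hypothetical sketch of static-data desensitization (masking) before a
# dataset is handed to a test team. Field names and masking rules are
# illustrative assumptions, not drawn from a specific ePET product.

def desensitize(record: dict) -> dict:
    """Return a copy of the record with direct identifiers masked."""
    masked = dict(record)
    masked["name"] = "TEST_USER"
    # Keep the last four digits so downstream format validation still works.
    masked["ssn"] = "XXX-XX-" + record["ssn"][-4:]
    return masked

print(desensitize({"name": "Ada Lovelace", "ssn": "123-45-6789", "case": "A-17"}))
# → {'name': 'TEST_USER', 'ssn': 'XXX-XX-6789', 'case': 'A-17'}
```

Production ePETs apply far richer transformations (tokenization, format-preserving encryption, referentially consistent substitution), but the goal is the same: preserve utility for testing while removing the link to an identifiable individual.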

References & Resources

  1. Office of Management and Budget (OMB), May 7, 2007, Safeguarding Against and Responding to the Breach of Personally Identifiable Information, Memorandum M-07-16.
  2. Office of Management and Budget (OMB), June 25, 2010, Guidance for Agency Use of Third-Party Websites and Applications, Memorandum M-10-23.
  3. Organization for Economic Cooperation and Development (OECD), September 23, 1980, OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data.
  4. Organization for Economic Cooperation and Development (OECD), 2013, The OECD Privacy Framework.
  5. Privacy Act of 1974, as amended, 5 U.S.C. § 552a.
  6. Federal Register, National Archives and Records Administration, accessed August 18, 2014.
  7. E-Government Act of 2002, Public Law 107-347.

Additional References and Resources

Kendall, D., ed., 2013, U.S. Government Privacy: Essential Policies and Practices for Privacy Professionals, Portsmouth, N.H., International Association of Privacy Professionals.

Information and Privacy Commissioner, Ontario, Canada, Privacy by Design Centre of Excellence, About PbD, accessed on August 18, 2014.

The MITRE Corporation, August 2014, Privacy Engineering Framework.
