Systems Engineering Guide

Privacy Systems Engineering

Definition: Privacy is generally defined as the claim of individuals to determine for themselves when, how, and to what extent information about them is communicated to others [1].

Privacy issues arise with the collection, use, maintenance, and disclosure of personally identifiable information (PII).

The Office of Management and Budget (OMB) defines PII as any information about an individual maintained by an agency, including (1) any information that can be used to distinguish or trace an individual's identity, such as name, Social Security number, date and place of birth, mother's maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information [2].

Keywords: E-Government Act, Fair Information Practices (FIPs), personally identifiable information (PII), privacy, Privacy Act, privacy impact assessments (PIA), record, system of records

MITRE SE Roles & Expectations: MITRE systems engineers are expected to understand the basic concept of privacy and be able to identify PII and the situations in which privacy issues may arise. They should understand the legal requirements that apply to federal agencies' collection, use, maintenance, and disclosure of PII, and how these requirements relate to the systems engineering life cycle (SELC). Further, systems engineers are expected to develop, implement, and maintain technical controls in information technology (IT) systems that help ensure privacy requirements are met.

Background

Privacy is based on the implementation of Fair Information Practices (FIPs), which were initially developed in 1973 by a federal advisory committee, commissioned because of concern over the harmful consequences that computerized data systems could have on the privacy of personal information. A revised version of the FIPs, developed by the Organization for Economic Cooperation and Development in 1980, has been widely adopted and forms the basis for many privacy laws worldwide (see Table 1) [3].

Table 1. The Fair Information Practices

Collection limitation: The collection of personal information should be limited, should be obtained by lawful and fair means, and, where appropriate, with the knowledge or consent of the individual.

Data quality: Personal information should be relevant to the purpose for which it is collected, and should be accurate, complete, and current as needed for that purpose.

Purpose specification: The purposes for the collection of personal information should be disclosed before collection and upon any change to that purpose, and its use should be limited to those purposes and compatible purposes.

Use limitation: Personal information should not be disclosed or otherwise used for other than a specified purpose without consent of the individual or legal authority.

Security safeguards: Personal information should be protected with reasonable security safeguards against risks such as loss or unauthorized access, destruction, use, modification, or disclosure.

Openness: The public should be informed about privacy policies and practices, and individuals should have ready means of learning about the use of personal information.

Individual participation: Individuals should have the following rights: to know about the collection of personal information, to access that information, to request correction, and to challenge the denial of those rights.

Accountability: Individuals controlling the collection or use of personal information should be accountable for taking steps to ensure the implementation of these principles.

Source: Organization for Economic Cooperation and Development

The centerpiece of the federal government's legal framework for privacy protection, the Privacy Act of 1974, is based on the FIPs and provides safeguards for information maintained by federal agencies. Specifically, the act places limitations on agencies' collection, disclosure, and use of personal information maintained in systems of records. The act describes a "record" as any item, collection, or grouping of information about an individual that is maintained by an agency and contains his or her name or another personal identifier. It also defines "system of records" as a group of records under the control of any agency from which information is retrieved by the name of the individual or by an individual identifier. The Privacy Act requires agencies to notify the public through a notice in the Federal Register when they establish or make changes to a system of records. Agencies are allowed to claim exemptions from some of the provisions of the Privacy Act if the records are used for certain purposes, such as law enforcement. However, no system of records can be exempted from all of the Privacy Act's provisions. For example, agencies must publish the required notice in the Federal Register for all systems of records, even those that involve classified information. This ensures that the federal government does not maintain secret systems of records, a major goal of the act [4].

More recently, in 2002, Congress enacted the E-Government Act to, among other things, enhance protection for personal information in government information systems or information collections by requiring that agencies conduct privacy impact assessments (PIAs). A PIA is an analysis of how personal information is collected, stored, shared, and managed in a federal system; it is used to identify privacy risks and mitigating controls to address those risks. Agencies must conduct PIAs (1) before developing or procuring information technology that collects, maintains, or disseminates information that is in identifiable form, or (2) before initiating any new collection of information in an identifiable form that will be collected, maintained, or disseminated using information technology if the same questions are asked of 10 or more people [5]. Individual PIAs may vary significantly depending on the specifics of department/agency guidance and the scope and sensitivity of PII collected, used, and disseminated.

It should be noted that privacy is not synonymous with security. (See Figure 1.) While privacy focuses on the individual's ability to control the collection, use, and dissemination of their PII, security provides the mechanisms to ensure confidentiality and integrity of information, and the availability of information technology systems. The concepts of privacy and security, however, do intersect. Specifically, certain IT controls established to ensure confidentiality and integrity from a security perspective also support privacy objectives. For example, access controls ensure that only authorized individuals can read, alter, or delete data. Such controls help achieve confidentiality and integrity from a security standpoint. In addition, when a system processes or stores PII, these IT controls ensure that users can access only the specific PII needed to perform their jobs; this helps ensure that use of PII is limited to authorized purposes (purpose specification) and protected from unauthorized access, destruction, and disclosure (security safeguards). While establishing good security practices helps protect privacy, these practices are not, in and of themselves, sufficient to fully address the FIPs.

Figure 1. Privacy vs. Information Security
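
To make this overlap concrete, the sketch below shows one way field-level access controls can limit users to the specific PII their roles require. It is a minimal Python illustration; the roles, PII field names, and redaction function are hypothetical assumptions rather than a prescribed design, and a real system would derive its access policy from the PIA and the agency's own requirements.

```python
# Minimal sketch of role-based, field-level access to PII. The roles, field
# names, and policy table below are hypothetical examples.

# Map each role to the PII fields its duties actually require (use limitation).
ALLOWED_PII_FIELDS = {
    "benefits_examiner": {"name", "date_of_birth", "ssn"},
    "customer_support": {"name", "case_status"},
    "analyst": {"case_status"},  # no direct identifiers needed
}

def redact_record(record: dict, role: str) -> dict:
    """Return only the fields the caller's role is authorized to see."""
    allowed = ALLOWED_PII_FIELDS.get(role, set())
    return {field: value for field, value in record.items() if field in allowed}

record = {"name": "Jane Doe", "date_of_birth": "1970-01-01",
          "ssn": "123-45-6789", "case_status": "pending"}

print(redact_record(record, "customer_support"))
# {'name': 'Jane Doe', 'case_status': 'pending'}
```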

Privacy Best Practices

Privacy Must Be "Built Into" the Systems Engineering Life Cycle: Consideration of privacy, including requirements to conduct PIAs, should be built into the agency systems engineering life cycle. This helps ensure that privacy requirements are considered early in the development of IT systems and that the technology is leveraged to provide privacy protections. At a minimum, the SELC should include the requirement to conduct a PIA as early in the development process as practicable. In addition, the SELC should contain procedures that require the completion of the PIA before the system is authorized to operate. Considering privacy requirements early in the development process avoids difficult and expensive retrofitting to address them later.

All Appropriate Stakeholders Must Be Involved in Assessing Privacy Risks: In conducting the PIA, it is critical that the privacy office, the systems developer, and the business process owner all be involved. Having the appropriate stakeholders involved ensures that all privacy risks are identified and that alternative mitigating controls are considered.

Technology Can Be Leveraged to Protect Privacy: Although many of the privacy risks identified through the PIA will be mitigated by establishing administrative controls (such as providing additional public notice, establishing policies and procedures to allow individuals access to the information held about them, or providing privacy training to system users), certain risks can be mitigated by technical system controls. Table 2 provides examples of how a system should be designed to protect privacy.

Table 2. How Systems Engineers Can Implement FIPs

Collection limitation: Design the system to use only the minimum amount of PII necessary to accomplish the system's purpose. The key question to ask for each field of PII is: Can the purpose of the system be served without this particular field of PII?

Data quality: Develop the system to meet the data quality standards established by the agency.

Purpose specification: Develop systems that interact directly with the public such that the purpose for the collection of PII is made available.

Use limitation: Develop the system such that each field of PII is used only in ways that are required to accomplish the project's purpose. Each process associated with each field of PII should be reviewed to determine whether that use directly fulfills the project's purpose. If not, the function should not be developed.

Security safeguards: Implement information security measures for each field of PII to prevent loss, unauthorized access, or unintended use of the PII. Use encryption, strong authentication procedures, and other security controls to make information unusable by unauthorized individuals (see note 1 and the encryption sketch following this table).

Openness: Design the system to provide both a security and privacy statement at every entry point. Develop mechanisms to provide notice to the individual at the same time and through the same method that the PII is collected; for example, if PII is collected online, notice should also be provided online at the point of collection.

Individual participation: Design the system to allow identification of all PII associated with an individual to allow correction of all PII, including propagating the corrected information to third parties with whom the information was shared.

Accountability: Accountability can be encouraged, in part, by the use of audit logs that are capable of supporting a comprehensive audit of the collection and use of all fields of PII to ensure that actual collection and use is consistent with the notice provided (see the audit logging sketch below).

  • Audit logs should not contain the actual content of fields of PII, to limit unnecessary disclosure of this information.
  • The audit log should contain sufficient detail to identify (1) the source of each field of PII, (2) when each field of PII was accessed, (3) the uses of each field of PII, and when and by whom this information was used, (4) when each piece of PII was last updated and why, and (5) any suspicious transactions related to any field of PII and, if these occurred, the nature of the suspicion and the specific users involved.
  • If the use of Social Security numbers (SSNs) is authorized, systems engineers should create mechanisms to log access to SSNs and implement periodic reviews of the audit logs for compliance with the authorization.
  • The audit logs should also record sufficient information to support an audit of whether each field of PII was shared pursuant to a determination that the recipients needed the field of PII to successfully perform their duties, possessed the requisite security clearance, and provided assurance of appropriate safeguarding and protection of the PII.

Source: MITRE and Department of Homeland Security [6]
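
The encryption guidance under security safeguards can be illustrated with a short, hedged sketch. The example below uses the third-party Python cryptography package to encrypt an SSN field with AES-GCM before storage; the package choice, key handling, and field name are assumptions for illustration only, and whether a given deployment satisfies the certified-module requirement discussed in note 1 depends on the cryptographic module actually used.

```python
# Minimal sketch of field-level encryption for stored PII, assuming the
# third-party "cryptography" package. Key management, and whether the
# underlying module meets the NIST certification requirement in note 1,
# are deployment decisions outside this sketch.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # in practice, obtain from a key manager
aesgcm = AESGCM(key)

def encrypt_field(plaintext: str, context: bytes) -> tuple[bytes, bytes]:
    """Encrypt one PII field; the context (e.g., record ID) is authenticated but not hidden."""
    nonce = os.urandom(12)  # unique nonce per encryption
    return nonce, aesgcm.encrypt(nonce, plaintext.encode(), context)

def decrypt_field(nonce: bytes, ciphertext: bytes, context: bytes) -> str:
    return aesgcm.decrypt(nonce, ciphertext, context).decode()

nonce, ct = encrypt_field("123-45-6789", b"case-1042")
assert decrypt_field(nonce, ct, b"case-1042") == "123-45-6789"
```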

It should be noted that this list is not intended to be exhaustive. The specific risks and mitigating controls for an IT system or information collection should be identified by conducting a PIA.
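
As a concrete illustration of the accountability guidance above, the sketch below logs which fields of PII were accessed, by whom, and for what purpose, while keeping the PII values themselves out of the log and flagging SSN access for periodic review. The event structure, field names, and purpose strings are illustrative assumptions, not a prescribed audit format.

```python
# Illustrative sketch of a PII access audit log using hypothetical field
# names. The log records which fields were touched and why, never the values.
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_logger = logging.getLogger("pii_audit")

def log_pii_access(user_id: str, record_id: str, fields: list[str],
                   purpose: str, action: str = "read") -> None:
    """Write one audit event per PII access; field names are logged, values are not."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user_id,
        "record": record_id,
        "fields": fields,    # names of PII fields touched, never their contents
        "action": action,    # e.g., read, update, share
        "purpose": purpose,  # should map to a purpose stated in the published notice
    }
    audit_logger.info(json.dumps(event))
    if "ssn" in fields:
        # Flag SSN access separately to support periodic compliance review.
        audit_logger.warning(json.dumps({**event, "flag": "ssn_access"}))

log_pii_access("jsmith", "case-1042", ["name", "ssn"], purpose="benefits determination")
```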

Finally, systems engineers should consider the use of enterprise privacy-enhancing technologies (ePETs). ePETs are enterprise-oriented data stewardship tools that help organizations achieve their business goals while appropriately managing PII throughout the information life cycle. These technologies may or may not be privacy-specific. ePETs include tools that can desensitize static data in databases by applying a variety of transformations, including masking and obfuscation. Desensitized data can then be used for testing and other purposes without unnecessarily disclosing individuals' PII. Another example is enterprise digital rights management, which can be used to impose limitations on the usage of digital content and devices, and can also be used to enforce limitations on the use of PII [8].
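
As a simple illustration of the kind of desensitization such tools perform, the sketch below masks and pseudonymizes PII fields in a record before it is copied to a test environment. The field names and transformations are assumptions chosen for illustration; they do not describe any particular ePET product.

```python
# Minimal sketch of static data desensitization for test use, assuming
# hypothetical field names. Masking hides most of a value; pseudonymization
# replaces an identifier with a consistent, non-reversible surrogate so that
# records can still be joined in the test environment.
import hashlib

def mask(value: str, visible: int = 4) -> str:
    """Replace all but the last `visible` characters with asterisks."""
    return "*" * (len(value) - visible) + value[-visible:]

def pseudonymize(value: str, salt: str = "test-env-salt") -> str:
    """Derive a salted hash token from an identifier (illustration only)."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def desensitize(record: dict) -> dict:
    return {
        "id": pseudonymize(record["ssn"]),     # consistent surrogate key
        "name": "TEST USER",                   # obfuscation: replace outright
        "ssn": mask(record["ssn"]),            # masking: partial value only
        "case_status": record["case_status"],  # non-PII field passes through
    }

print(desensitize({"ssn": "123-45-6789", "name": "Jane Doe", "case_status": "pending"}))
```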

References & Resources

  1. Westin, Alan, 1967, Privacy and Freedom, New York: Athenaeum.
  2. Office of Management and Budget (OMB), July 12, 2006, Reporting Incidents Involving Personally Identifiable Information, M-06-19.
  3. Organization for Economic Cooperation and Development, September 23, 1980, OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data.
  4. The Privacy Act of 1974, as amended, 5 U.S.C. § 552a.
  5. The E-Government Act of 2002, Public Law 107-347.
  6. Privacy Office of the Department of Homeland Security, August 16, 2007, Privacy Technology Implementation Guide.
  7. OMB, May 7, 2007, Safeguarding Against and Responding to the Breach of Personally Identifiable Information, M-07-16.
  8. Shapiro, Stuart, February 27, 2009, A Gentle Introduction to Privacy Enhancing Technologies, PowerPoint Presentation.

MITRE has also developed several methodologies to support privacy engineering. For example, Privacy-Based Systems Analysis (PBSA) supports systematic strategic privacy analysis and planning at the program or mission level. It enables organizations to rigorously consider the alignment between privacy objectives and business and technical architectures. Privacy Risk Maps provide a view of enterprise privacy risk by documenting where PII resides and how it moves within the organization and across organizational boundaries. MITRE has also developed the Privacy RIsk Management Engine (PRIME), a Web-based PIA tool to support more effective privacy risk analysis and mitigation.

Additional References & Resources

  • Cannon, J.C., 2004, Privacy: What Developers and IT Professionals Should Know, Addison-Wesley.
  • McEwen, J., and S. Shapiro (eds.), 2009, U.S. Government Privacy: Essential Policies and Practices for Privacy Professionals, York, Maine: International Association of Privacy Professionals.

Note 1: OMB guidance directs that only cryptographic modules certified by the National Institute of Standards and Technology (NIST) are to be used. See NIST's website at http://csrc.nist.gov/cryptval/ for a discussion of the certified encryption products [7].


Not all references and resources are publicly available. Some require corporate or individual subscriptions. Others are not in the public domain.

Page last updated: April 26, 2011

