Acquiring and Incorporating Post-Fielding Operational Feedback into Future Developments


Definition: The Post-Implementation Review (PIR) evaluation tool compares the conditions prior to the implementation of a project (as identified in the business case) with the project's actual results.

Keywords: benefit analysis, evaluation, implementation review, investment management, performance management, post-technology use

MITRE SE Roles and Expectations: MITRE systems engineers (SEs) are expected to understand the purpose and role of a Post-Implementation Review (PIR) and the benefits and costs of employing one. They are expected to be able to recommend techniques for PIRs, assist the government in tailoring PIR procedures, lead PIRs, and/or perform individual PIR tasks as appropriate, e.g., post-implementation technical performance analyses. However, because PIRs should be conducted by individuals not directly involved in the earlier steps of the acquisition process, and because MITRE SEs often recommend technology or provide guidance during those earlier phases of the life cycle, they are frequently precluded from participating in the reviews themselves other than as subject matter experts.

Background

The PIR is used to evaluate the effectiveness of system development after the system has been in production for a period of time. The objective of the PIR is to determine whether the system does what it was designed to do: Does it effectively and efficiently support the user as required? The review is intended to assess how successful the system is in terms of functionality, performance, and cost versus benefits, as well as to assess the effectiveness of the life-cycle development activities that produced the system. The review results can be used to strengthen the system as well as system development procedures. However, although the systems engineering community generally agrees that the PIR is a laudable thing to do, in practice the review generally remains, at best, an ignored stepchild because departments and agencies content themselves with simply "running the numbers," i.e., doing statistical analyses of performance deltas.
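
To make the critique concrete, the minimal Python sketch below (metric names and values are hypothetical) shows the kind of baseline-versus-actual delta computation that "running the numbers" amounts to; a meaningful PIR must go on to analyze why the deltas occurred.

    # Minimal sketch (hypothetical metric names and values): the kind of
    # baseline-vs-actual "performance delta" computation that, by itself,
    # falls short of a meaningful PIR.

    # Business-case baselines vs. observed post-fielding results.
    baseline = {"availability_pct": 99.5, "avg_response_s": 2.0, "annual_cost_k": 1200}
    actual = {"availability_pct": 98.7, "avg_response_s": 3.4, "annual_cost_k": 1450}

    def performance_deltas(expected: dict, observed: dict) -> dict:
        """Return absolute and relative deltas for each shared metric."""
        deltas = {}
        for metric, base in expected.items():
            obs = observed[metric]
            deltas[metric] = {
                "delta": obs - base,
                "delta_pct": 100.0 * (obs - base) / base,
            }
        return deltas

    for metric, d in performance_deltas(baseline, actual).items():
        print(f"{metric}: {d['delta']:+.1f} ({d['delta_pct']:+.1f}%)")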

Government Interest and Use

The Office of Management and Budget (OMB) outlines the purpose of the PIR in Step IV.3, Post-Implementation Review (PIR), of the Capital Programming Guide, first published in 1997 as a supplement to Circular A-11, Part 3: Planning, Budgeting, and Acquisition of Capital Assets [1]. It is described as a diagnostic tool to evaluate the overall effectiveness of the agency's capital planning and acquisition process as a complement to operational analysis. This control mechanism is used during the operational life cycle of an asset to enable resource managers to optimize the performance of capital assets over the course of their life cycle and eventual disposition [1, 2]. With reference to information technology solution projects, the Information Systems Audit and Control Association (ISACA) notes that the PIR is a very tricky part of service management and that it is important to include the PIR effort in the project estimation process [3].

OMB stipulates that PIRs should be conducted by individuals not directly involved in the acquisition of the asset, but that they may include owners and users of the asset or other personnel and consultants. By contrast, in an Agile development environment, the Release Retrospective (the Agile analog of the PIR) is conducted as a highly interactive activity involving not only the core team but all impacted stakeholders, including end-user representatives, supporting functions, and senior management [4].

In response to the OMB mandate, agency system development life-cycle (SDLC) documentation now specifies the preparation of a PIR report as part of the final phase of the SDLC, Operations and Maintenance. However, in its IT Investment Management Framework [5, 6], the Government Accountability Office (GAO) observes that agencies will continue to have difficulty performing an effective PIR unless they have more comprehensively established policies and procedures to assess the benefits and performance of their investments. This lack of documented processes is typically one of GAO's primary criticisms of agency performance in this area; see, for example, GAO's 2007 report on DoD Business System Modernization (GAO-07-538), which criticized the limited nature of the Defense Acquisition System (DAS) PIR procedures [7]. In its updated Acquisition Directive 102-01, the Department of Homeland Security (DHS) requires a PIR of every program implemented in the agency [8]. DoD Instruction 5000.02 extends the PIR requirement to acquisition category (ACAT) II and below [9].

According to OMB, PIRs should be conducted three to twelve months after an asset becomes operational. In practice, agencies vary the length of that period, anywhere from three to eighteen months after implementation. GAO points out that the timing of a PIR can be problematic: a PIR conducted too soon after an investment has been implemented may fail to capture the full benefits of the new system, whereas one conducted too late may not be able to draw adequately on the institutional memory of the development/investment process. GAO indicates that PIRs should also be conducted for initiatives that were aborted before completion, to help identify potential management and process improvements.
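
The timing guidance lends itself to a simple scheduling check. In the Python sketch below, the window constants and dates are illustrative assumptions drawn from the guidance quoted above, not an official rule set.

    from datetime import date

    # Sketch: flag whether a planned PIR falls within OMB's recommended
    # 3- to 12-month window after the asset becomes operational.
    OMB_WINDOW_MONTHS = (3, 12)  # assumption: whole-month granularity

    def months_between(start: date, end: date) -> int:
        """Whole months elapsed between two dates (day-of-month ignored)."""
        return (end.year - start.year) * 12 + (end.month - start.month)

    def pir_timing_flag(operational: date, pir: date) -> str:
        months = months_between(operational, pir)
        low, high = OMB_WINDOW_MONTHS
        if months < low:
            return f"{months} months: too soon; full benefits may not yet be visible"
        if months > high:
            return f"{months} months: late; institutional memory may have faded"
        return f"{months} months: within OMB's recommended window"

    print(pir_timing_flag(date(2023, 1, 15), date(2023, 10, 1)))  # illustrative dates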

To ensure consistency of evaluations, OMB recommends using a documented methodology to conduct PIRs and requires that the chosen methodology align with the organization's planning process. The required level of specificity for PIR reports varies from agency to agency. According to OMB, PIRs should address the following areas [1] (a sketch of how they might be captured in a structured report follows the list):

Customer/User Satisfaction

  • Partnership/involvement
  • Business process support
  • Investment performance
  • Usage

Internal Business

  • Project performance
  • Infrastructure availability
  • Standards and compliance
  • Maintenance
  • Security issues and internal controls
  • Evaluations (accuracy, timeliness, adequacy of information)

Strategic Impact and Effectiveness

  • System impact and effectiveness
  • Alignment with mission goals
  • Portfolio analysis and management
  • Cost savings

Innovation

  • Workforce competency
  • Advanced technology use
  • Methodology expertise
  • Employee satisfaction/retention
  • Program quality
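
The Python sketch below shows one way these four assessment areas might be captured as a structured PIR record; the class and field names are illustrative assumptions paraphrasing OMB's list, not an official OMB schema (it requires Python 3.10+ for the union type syntax).

    from dataclasses import dataclass, field

    @dataclass
    class AssessmentArea:
        name: str
        topics: list[str]
        findings: str = ""          # free-form analysis, not just scores
        rating: int | None = None   # optional 1-5 summary rating

    @dataclass
    class PIRReport:
        system: str
        areas: list[AssessmentArea] = field(default_factory=list)

    report = PIRReport(
        system="Example system (hypothetical)",
        areas=[
            AssessmentArea("Customer/User Satisfaction",
                           ["Partnership/involvement", "Business process support",
                            "Investment performance", "Usage"]),
            AssessmentArea("Internal Business",
                           ["Project performance", "Infrastructure availability",
                            "Standards and compliance", "Maintenance",
                            "Security issues and internal controls",
                            "Evaluations (accuracy, timeliness, adequacy)"]),
            AssessmentArea("Strategic Impact and Effectiveness",
                           ["System impact and effectiveness",
                            "Alignment with mission goals",
                            "Portfolio analysis and management", "Cost savings"]),
            AssessmentArea("Innovation",
                           ["Workforce competency", "Advanced technology use",
                            "Methodology expertise",
                            "Employee satisfaction/retention", "Program quality"]),
        ],
    )

The free-form findings field is deliberate: as discussed under Best Practices and Lessons Learned below, a PIR that records only numeric ratings tends to degenerate into a checklist exercise.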

In helping government sponsors tailor the PIR to a particular project, MITRE SEs should cast the net as broadly as possible, ensuring that the PIR also examines potential weaknesses, risks, and the long-term maintainability of the solution.

Best Practices and Lessons Learned

In 2004, ISACA published a set of best practices specific to the PIR, the IS Auditing Guideline: Post-Implementation Review [10].

Focus on analysis. Properly leveraged, the PIR provides an important control mechanism and a tool for continuously improving the acquisition process. However, PIR templates are all too frequently limited to checklists that compare baseline expectations to results on a numerical scale or in summary bullets, rather than providing serious analyses of root causes and contributory factors. By contrast, the Systems Engineering: Post-Implementation Review (PIR) Document Template, published as a draft by the U.S. Army Information Technology Agency (ITA) in October 2008, devotes the majority of its review to the system itself rather than to the acquisition process, and it emphasizes that the review should result in a free-form report, thereby highlighting the analytical potential of the review [11].

Avoid a "fix the bugs later" strategy. That agencies should underestimate the value of PIRs is not surprising, given that managers frequently feel they are all too aware of the limitations of system releases as a result of compressed schedules and inevitable scope creep prior to deployment. The typical strategy is simply to "move on and fix the bugs" in the next release. "Why do we need a PIR if we already know what is wrong?" is a rationale based on a failure to recognize the potential benefit of the review when it is conducted as a true audit of technical and process performance. Significantly, the PIR is not tied to any life-cycle milestone, effectively robbing it of any determinative value.

Position is everything. The PIR may also fail to be fully leveraged as a process improvement tool for IT acquisitions because OMB and GAO place it in a capital planning and investment control (CPIC) context where it is seen more as a capital management/project management tool than as a tool to improve system development from a technical perspective.

If, from a CPIC perspective, the PIR is seen as the initial iteration of the operational assessments conducted over the life cycle of an operational system, from the SDLC perspective it is only loosely tied into the systems life cycle, relegated to the Operations and Maintenance (O&M) phase rather than treated as a close-out report for the systems implementation phase of the life cycle. From the SE perspective, the PIR is condemned to a technology limbo—too late for developers to care, too early for maintenance people to be interested.

The PIR is especially useful in modernization efforts. If the PIR presents an excellent opportunity for MITRE SEs to help their sponsors across all programs improve the way they conduct business, the opportunity is even more acute for programs whose major focus is providing systems engineering support to agencies engaged in business systems modernization (BSM) efforts. These efforts generally encompass the entire enterprise and frequently, as in the case of DHS, involve the insertion of technologies that revolutionize the way the agencies do business and pose challenges far exceeding those faced in past system implementations. The issues arising from BSM and major technology insertions are compounded as civilian agencies increasingly rely on contractors not just to deliver systems but to provide end-to-end services as well. The PIR can be particularly useful in such an environment, not only from a project and portfolio management perspective but also as an analysis of the operational impact of new technology, for example, the use of biometrically enabled passports to verify traveler identity or the use of digitized documentation in a traditionally paper-based environment.

Success depends on following through on difficult issues. The effectiveness of PIRs as a process improvement tool in general, and particularly in the case of BSM deployments that transform the operating environment, depends on a willingness to address issues of workforce competence, advanced technology use, and methodology expertise (e.g., how database-driven and service-oriented architectures are received in organizations previously responsible for the maintenance of long-outdated hardware and software). Although these areas of investigation are identified in OMB's PIR guidance, they are generally absent from agency PIR procedures. Yet it is frequently in these areas that implementations fail to achieve the expected results: the workforce is not ready to operate or maintain the new systems, those responsible for maintaining the new systems lack the expertise required to support next-generation technology delivered through the BSM program, or a mismatch exists between the advanced technology deployed and the requirements of the operational environment. If the PIR is not designed to explore the root causes of such issues, the resulting report will remain a paper exercise. Moreover, failure to confront difficult issues leaves agencies with limited tools in their dialog with oversight bodies when requesting additional resources or postponement of mandates.

References and Resources

  1. OMB, 2015, "Post Implementation Review and Post-Occupancy Evaluation," Section III.3.3, Capital Programming Guide, Ver. 3.0, Supplement to Office of Management and Budget Circular A-11: Planning, Budgeting, and Acquisition of Capital Assets, accessed August 22, 2017.
  2. OMB Circular No. A-130, November 28, 2000, Management of Federal Information Resources, revised, accessed August 22, 2017.
  3. ISACA, 2012, 10 Key Rules for Using the ITIL Framework Effectively, ISACA Journal Online, Vol. 4.
  4. Derby, E., and D. Larsen, August 5, 2006, Agile Retrospectives: Making Good Teams Great, Pragmatic Bookshelf.
  5. GAO, 2004, Information Technology Investment Management: A Framework for Assessing and Improving Process Maturity (GAO-04-394G), pp. 83–89.
  6. Internal Revenue Service, June 30, 2000, IRS Enterprise Life Cycle, Investment Decision Management, Post Implementation Review (PIR) Procedure.
  7. GAO, May 2007, Business Systems Modernization: DoD Needs to Fully Define Policies and Procedures for Institutionally Managing Investments (GAO-07-538), p. 31, accessed August 22, 2017.
  8. Department of Homeland Security, November 7, 2008, Acquisition Instruction/Guidebook 102-01-001, Interim Ver. 1.9, p. 11, accessed August 22, 2017.
  9. Department of Defense, January 7, 2015, Operation of the Defense Acquisition System, DoD Instruction Number 5000.02, accessed August 22, 2017.
  10. Information Systems Audit and Control Association (ISACA) Knowledge Center, "AI7.9 Post-Implementation Review," accessed August 22, 2017.
  11. U.S. Army Information Technology Agency, October 2008, ITA Systems Engineering: Post-Implementation Review (PIR) Document Template, Ver. 1.0.
