Planning and Managing Independent Assessments

Definition: An independent assessment is a tool that can be used at any point in a program life cycle to provide insight into progress and risks.

Keywords: accident investigation, audit, baseline assessment, independent expert review, independent technical assessment, red team, SCAMPI appraisal, Tiger Team

MITRE SE Roles and Expectations: MITRE systems engineers (SEs) are expected to be able to plan, lead, or be team members or subject matter experts of independent review teams.


Individual skills and experience are a solid foundation for participation in independent reviews, but completing the review on schedule with quality findings also depends on a disciplined process. This article describes the three phases of a formal review, the essential activities within each phase, why each activity is essential, the risk assumed when an activity is not performed, and lessons learned from actual appraisals and independent reviews.

An independent assessment is a team activity in which the team leader and team members are not members of the organization being assessed. This allows the team to more readily fulfill its charter objectively and without conflicts of interest. The methodology described in this article is based on established appraisal and assessment methodologies and is tailorable to most types of independent assessments.

An independent review can be planned and managed as a project with a three-phase life cycle:

  1. Planning and preparing for an independent review: Paraphrasing Yogi Berra, 90 percent of a review is in the planning; the other 50 percent is executing the plan and delivering the findings. In this phase, we emphasize the critical importance of working with the sponsor to identify and document all relevant details of the review from start to finish. The overarching product of this phase is a review plan, a contract signed by the sponsor and the team leader.
  2. Executing the plan: In this phase, the review team interacts more broadly with the organization under review and executes the review plan as approved by the sponsor.
  3. Preparing and delivering final findings: This section describes how the review team develops and presents defensible findings from evidence they have collected, or from their analyses.

Planning and Preparation

Give me six hours to chop down a tree, and I will spend the first four sharpening the axe. —Abraham Lincoln

The first phase of an independent review is the most important. The sponsor (whoever is paying for the review) and review team leader meet face to face and develop a review plan that includes a charter for the review team, a clear specification of objectives and issues to be resolved, a schedule and related dependencies, communications requirements, and resource requirements such as funding. After the initial meeting, the team leader prepares a draft plan for the sponsor's review and approval.

As the standard independent review life cycle in Figure 1 shows, reviews must be planned and managed as projects. That is, each should have a beginning and an end, sufficient resources, and a schedule that divides the review into phases with entry and exit criteria. During the review, the team leader is responsible for monitoring and controlling the review against the plan and for all communications with relevant stakeholders.


Figure 1. Standard Independent Review Life Cycle.
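Treating the review as a project with phased entry and exit criteria can be made concrete with a small tracking structure. The sketch below is illustrative only: the phase names follow Figure 1, but the criteria strings and the gate-check logic are assumptions, not part of any formal MITRE method.

```python
from dataclasses import dataclass

@dataclass
class ReviewPhase:
    """One phase of the independent review life cycle (names follow Figure 1)."""
    name: str
    entry_criteria: list  # conditions that must hold before the phase starts
    exit_criteria: list   # conditions that must hold before the phase ends

# Hypothetical criteria chosen for illustration.
phases = [
    ReviewPhase("Planning and preparation",
                entry_criteria=["Sponsor engaged"],
                exit_criteria=["Review plan signed", "Readiness assessment delivered"]),
    ReviewPhase("Execution",
                entry_criteria=["Review plan signed"],
                exit_criteria=["Preliminary findings validated"]),
    ReviewPhase("Final findings",
                entry_criteria=["Preliminary findings validated"],
                exit_criteria=["Final findings delivered", "Lessons learned recorded"]),
]

def may_enter(phase: ReviewPhase, satisfied: set) -> bool:
    """Gate check: a phase starts only when all of its entry criteria are met."""
    return all(c in satisfied for c in phase.entry_criteria)

print(may_enter(phases[1], {"Sponsor engaged"}))     # False: plan not yet signed
print(may_enter(phases[1], {"Review plan signed"}))  # True
```

The point of the gate check is the one the article makes in prose: execution should not begin until planning has produced a signed plan, and final findings should not be drafted until preliminary findings have been validated.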

The team leader is also responsible for:

  • Re-planning as needed
  • Team selection, team-building, and team performance
  • Resolution of conflicts and impasses (must be able to motivate, adjudicate, and cajole)
  • Reporting results or findings to the sponsor and other stakeholders as required

Planning must be thoroughly documented because every planning element is a potential element of risk. Elements that occasionally get overlooked include the names of review participants; the availability of personnel to participate; security and travel requirements; commitment to schedules, particularly during the execution phase; and the format/content of the final report (PowerPoint? Hard copy, Word file, or both?) so the sponsor knows what to expect.

Plan Content. When developing your review plan, include the following sections:

  1. Cover page (include date and current version number) and revision page
  2. Context information (Organization size and other data, how the review came about)
  3. Purpose and objectives (Why and what the sponsor expects to gain from the review)
  4. Review team charter (To establish the independent review and the authority of the review team, include a charter signed by the sponsor and the team leader that details preliminary requirements/scope, dependencies, and constraints of the review, and conveys purpose and scope of the review to the subjects of the review.)
  5. Key participants (Sponsor, review team, interviewees, evidence providers, presenters)
  6. Scope (Sponsor's needs and expectations; what the team will do and deliver; leave as little as possible to interpretation, and update the plan whenever scope changes.)
  7. Schedule (Top-level at first, then add details as the review unfolds; "extreme detail" for the execution phase)
  8. Reference standards or models (Standards, models, etc. used for analysis by the team)
  9. Dependencies (e.g., among participants, schedule, other activities)
  10. Constraints (Availability of personnel, time, staffing, funding, tools, facilities, security)
  11. Resource requirements (Funding, people, time, facilities, tools)
  12. Logistics (on-site support, escorts, working hours, Internet access, printers, copiers, phones)
  13. Risk management (What might go wrong, probability, consequences, mitigation steps)
  14. Roles/responsibilities (Who does what)
  15. Training (What may be required to perform the review)
  16. Security (Facility/information access, on-site escorting, visit requests, and clearances)
  17. Communications plan (Who talks to whom, when, why, what must be documented, etc.)
  18. Management reviews (Status report, issues, triggers for ad hoc meetings)
  19. Deliverables/format (Program or project-specific, tabletop, or formal presentation, level of detail)
  20. Ownership of the results (Usually the agency or organization that paid for the review)
  21. Signature page (Sponsor and review team)
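Before the plan goes to the sponsor for signature, it is worth verifying that none of the 21 sections above is still empty. The sketch below is a hypothetical helper, not part of any MITRE tooling; the section keys simply mirror the checklist.

```python
# Section keys mirror the 21-item plan-content checklist above (illustrative names).
REQUIRED_SECTIONS = [
    "cover_page", "context", "purpose_and_objectives", "charter",
    "key_participants", "scope", "schedule", "reference_standards",
    "dependencies", "constraints", "resources", "logistics",
    "risk_management", "roles_responsibilities", "training", "security",
    "communications_plan", "management_reviews", "deliverables",
    "ownership", "signature_page",
]

def missing_sections(plan: dict) -> list:
    """Return checklist sections that are absent or empty in a draft plan."""
    return [s for s in REQUIRED_SECTIONS if not plan.get(s)]

# A draft with only scope and schedule filled in still has 19 gaps.
draft = {"scope": "Assess program X cost/schedule risk", "schedule": "30 days"}
print(len(missing_sections(draft)))  # 19
```

A check like this is cheap insurance against the "every planning element is a potential element of risk" problem the article describes: an empty section is a risk the team has not yet thought through.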

Team Selection. After establishing the scope of the review, the team leader selects a team. Candidates must have an appropriate level of engineering and management experience (rookies are not recommended), and they may also need specific technical domain experience. Team members must know how work is done in the context of the project or program being appraised. Is it a Department of Defense (DoD) acquisition program? Homeland Security? DISA? Internal Revenue Service (IRS)? Find team members who have worked in those arenas.

Previous review experience is recommended; however, depending on the appraisal and team size, including one or two inexperienced team members should be considered as a means to grow the organization's independent assessment bench strength. It is strongly recommended that candidates be able to commit full time to the schedule. Part-time team members are a significant risk, so qualified alternates are recommended.

After the team is selected and before the execution phase begins, a full day should be reserved for team building where the team meets, reviews the plan (especially the schedule), develops analysis strategies, and allocates workload. The team may be asked to sign non-attribution statements and non-disclosure agreements, which guarantee that nothing observed, read, or heard during the review will be attributed to a person or project, and that all intellectual property rights of the organization(s) being appraised will be respected.

Initial Findings and Readiness Review. At this point the team should assess the availability of information or evidence needed and make a preliminary feasibility assessment (is there enough time/information/etc. to conduct the review as planned?), then deliver a readiness assessment to the sponsor. Preliminary recommendations can range from "it appears that enough information is available to complete the review—we recommend proceeding with the review as planned" to "we have questions" to "we recommend changing scope" to "we recommend delaying the review—you need more time to prepare."

Table 1. Risks Assumed When Planning Activities Are Not Performed

  Activity: Team leader and sponsor meeting to develop preliminary inputs
  Risks: Weak basis for further planning: scope poorly defined, schedule ambiguous, no authority to proceed

  Activity: Establish a written charter
  Risks: No formal contract between the sponsor and the review team; without authority from the sponsor, the review becomes a lower priority in the organization

  Activity: Obtain and review initial program information
  Risks: Too many assumptions; reduced objectivity

  Activity: Select and build a team
  Risks: Inappropriate knowledge and skill sets; inconsistent review/analysis methods among sub-teams; team member absenteeism

  Activity: Develop initial issues
  Risks: Time lost trying to focus the review at a later stage

  Activity: Develop the review team's methodology
  Risks: Inconsistent findings; challenges to findings

Best Practices and Lessons Learned:

  • Meet with the sponsor, not a delegate.
  • Don't start the review without a signed charter and a signed review plan.
  • Expect the review to be seen as an intrusion or new impediment to progress by the subjects of an independent review. They will, of course, want to be fully engaged in the day-to-day activities of their project. Ask the sponsor to send a copy of the charter to those who will be involved in the review. This will establish the team's authority and its level of access.
  • Keep the scope as narrow as possible in order to produce supportable and usable findings.
  • Activities in scope must be achievable.
  • Establish an understanding with the sponsor about the constraints that are placed on the review team and its activities.
  • Schedule interviews well in advance. Ask for early notification of cancellations. Be efficient with the time of those being interviewed. They may already be stressed or behind schedule.
  • Update the review plan whenever something changes, and publish revisions.
  • Review team composition and interpersonal skills of team members are key.
  • Team building really pays off.
  • Use mini-teams wherever and whenever possible.
  • Don't wait until the execution phase to begin planning how the team will locate or identify the evidence they need.
  • Start to finish for an assessment should be 30–60 days, depending on scope and team size.

Executing the Plan

During this phase, the appraisal team interacts with the organization and its personnel. The team leader briefs the appraisal plan and the organization presents contextual information about itself.

The team then collects and analyzes evidence by comparing work products, observations (e.g., demonstrations), and oral evidence against the standard or model agreed on for the review. The team leader monitors team progress, redistributes workload as needed to maintain schedule, and meets daily with the entire team to assess progress. Midway through this phase, the team should conduct a more detailed progress review.

After this internal review the team leader meets with the sponsor, describes issues that warrant attention, and presents recommendations, for example, to expand or reduce the appraisal scope and to continue or terminate the appraisal.

If the sponsor says to continue the appraisal, the team completes the preliminary findings using a consensus decision-making process, which requires any negative vote to be resolved before there is a finding. Preliminary findings should be presented to and validated by organization personnel involved in the appraisal. They alone are allowed to attend the presentation of preliminary findings because findings, if based on incomplete evidence or misinterpreted interview statements, could be incorrect. After the presentation, give the organization personnel a few hours to present new evidence that might change a preliminary finding. Evidence review is then closed, and the team develops its final findings report.
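The consensus rule described above (any negative vote must be resolved before a draft becomes a finding) can be expressed as a simple check. The function and vote values below are illustrative assumptions, not part of any formal appraisal method.

```python
def finding_accepted(votes: dict) -> bool:
    """A draft finding stands only when every team member concurs.

    `votes` maps team-member name to "concur" or "object"; any objection
    must be resolved (changed to "concur") before there is a finding.
    """
    return len(votes) > 0 and all(v == "concur" for v in votes.values())

votes = {"lead": "concur", "sme": "object", "member": "concur"}
print(finding_accepted(votes))  # False: the objection must be resolved first

votes["sme"] = "concur"         # objection resolved after discussing the evidence
print(finding_accepted(votes))  # True: the draft becomes a preliminary finding
```

The design point is that consensus here means unanimity, not majority: a single unresolved objection blocks the finding, which is what makes the final findings defensible.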

Table 2. Risks Assumed When Execution Phase Activities Are Not Performed

  Activity: Opening briefs
  Risks: Lost opportunity to present evidence; confusion about who is supposed to participate in events and about their timing and location

  Activity: Detailed schedule
  Risks: Wasted time; scramble to find rooms; diluted findings; customer confusion

  Activity: Validation of preliminary findings
  Risks: Lower quality of final findings; reduced customer satisfaction

Best Practices and Lessons Learned:

  • Have a detailed schedule and stick to it.
  • Maintain objectivity and fairness—avoid "witch hunts."
  • Find ground truth and evidence to support conclusions.
  • Report to the sponsor when independent assessment risks materialize, e.g., participants are failing to engage.

Final Findings: Preparation and Delivery

Final findings, the ultimate deliverable of the review, should address all issues and questions identified in the scope statement of the plan. They must be supportable, i.e., developed by the review team from evidence or the results of analyses using a consensus method. The team should review/polish the final findings before delivering them to the sponsor, and then present them as required by the review plan. After the presentation, a record of the appraisal is given to the sponsor and to others authorized by the sponsor. The sponsor alone owns the information presented in the briefing and controls its distribution.

Finally, the independent review team should conduct a wrap-up session to record lessons learned and to discuss possible next steps.

Table 3. Risks Assumed If Final Phase Activities Are Not Performed

  Activity: Team review of final findings
  Risks: Items not covered or not completely understood; presentation assignments not finalized

  Activity: Establish delivery method
  Risks: An uncoordinated presentation; "winging it" in front of a senior audience; the best person not presenting a finding

  Activity: Coordinate the final findings brief
  Risks: Obvious things not covered: time of presentation not advertised, poor availability of attendees, room reservation not made, no audio-visual setup

  Activity: Team wrap-up
  Risks: Lessons learned not tabulated; no coordination of potential next steps

Best Practices and Lessons Learned:

  • Stay focused until the review is over.
  • Tiny details that are missed can spoil several weeks of excellent work.
  • The team leader is not necessarily the best presenter for every element of scope.
  • Record lessons learned before team members return to their regular jobs.

References and Resources

Clapp, J. A., and P. G. Funch, March 5, 2003, A Guide to Conducting Independent Technical Assessments, MITRE Center for Air Force C2 Systems.

CMMI Institute, December 2014, Standard CMMI Appraisal Method for Process Improvement (SCAMPI): Method Definition Document for SCAMPI A, B, and C, Ver. 1.3b.

Project Management Institute, 2013, A Guide to the Project Management Body of Knowledge (PMBOK Guide), 5th Ed.

