RFP Preparation and Source Selection
Definition: Request for proposal (RFP) preparation and source selection are the actions necessary to prepare for a government solicitation and to select one or more contractors for delivery of a product or service.
Keywords: acquisitions, competitive procurement, non-competitive procurement, proposal, RFP, RFP development, source selection, strategy
MITRE SE Roles and Expectations: MITRE systems engineers (SEs) are expected to create technical and engineering portions of RFP documentation (requirements documents, statement of work, evaluation criteria) and to assist in the technical evaluation of bidders during source selection.
A program's acquisition strategy addresses the objectives for the acquisition, affordability, constraints, availability of resources and technologies, acquisition methods, types of contracts, terms and conditions of the contracts, management considerations, risk, and the logistics considerations for the resulting products or services. The acquisition strategy identifies the context for developing the RFP and determines the source selection method as either a "competitive" or "non-competitive" procurement. The requirements contained in the RFP and the selected contractor's proposal can often determine the success or failure of a system for the duration of a program. Using a credible process to develop the RFP and conduct a source selection can significantly improve the likelihood of success. Doing it right the first time is critical: rarely does a program have a chance to do it again.
In a competitive procurement, two or more contractors, acting independently, are solicited to respond to the government RFP. Their proposals are evaluated during source selection, and the contract is awarded to the contractor(s) who offers the most favorable terms to the government.
Competitive procurement activities include the preparation steps that lead to the development of the acquisition strategy, which, as noted earlier, provides the basis for developing the RFP and conducting the source selection. Activities leading up to the release of the RFP to industry include developing the solicitation documents, the evaluation criteria, and the source selection approach. Several techniques can be used to manage the pool of sources that will bring proposed solutions. Full and open competition allows any company to submit a proposal, and the government evaluates all offers to select a winner.
If the program office wants to reduce the number of contractors proposing to the top qualifiers, a variety of strategies can be considered. One example is to conduct a multi-step competitive procurement using competitive prototyping as a means to further evaluate the qualifications of competing contractors under more representative demonstration conditions. A second example is a phased competition, a risk reduction strategy that provides for the development of business approaches, systems development, etc. under contract with subsequent down-select competitions among contractors for further development or full performance within the same contract. Phased competition procedures may be appropriate when state-of-the-art solutions are sought and significant development work is required by industry.
The overall activities in a competitive procurement are illustrated in Figure 1. MITRE supports sponsors throughout the complete life-cycle process, from technical guidance and advice in identifying the need through technical support during the evaluation of proposals.
RFP Development Process
RFP development is part of the overall procurement process. The program’s acquisition team typically leads the RFP preparation process, and the SEs have specific roles that are based on their subject matter expertise. The expertise typically required includes software, hardware, and firmware engineering and information assurance (IA), at a minimum, along with specific domain expertise in areas that include radar, satellites, communications, aviation, healthcare, and cybersecurity.
The actions necessary for development and release of the RFP are shown in Figure 2.
The acquisition strategy provides the overall guidance for the development of the RFP, and the work breakdown structure (WBS) provides the definition of the program and guides the contractor in creating the contract WBS. The specifications or technical/system requirements document (TRD/SRD), the statement of objectives (SOO), statement of work (SOW) or performance work statement (PWS), and the contract data requirements list (CDRL) form the technical basis of the RFP. These are usually the focus of MITRE SEs on RFPs. Many templates and exemplars for these documents are available from the federal government, including the Office of Federal Procurement Policy and the Department of Defense. MITRE also has many exemplars and samples that can be effectively applied to customer requirements.
An RFP is divided into sections A–M. Sections A–J are primarily contract documents, except for section C, which is the SOO or SOW. Section J contains attachments like the TRD/SRD, Integrated Master Plan, and Government Furnished Data. Section L is the Instructions for Proposal Preparation, which defines the information the offerors provide to the government evaluation team. Section M constitutes the evaluation criteria. Evaluation criteria are another critical component of the RFP because they determine how the winner is selected. Technical criteria address areas of technical risk and complexity. MITRE is often asked to participate in the construction of sections L and M to ensure that the offerors have thoroughly addressed high-risk and technically complex items of concern in their proposals. It is critical that the RFP be well crafted. It must unambiguously address the government's interests and be straightforward for the government to evaluate and select a winner. If not, industry will have many questions that delay the source selection and that could result in a protest during the source selection process.
To efficiently prepare draft products and thus facilitate the RFP development process, the use of exemplar wording is recommended. Although every RFP must be tailored to address its specific set of requirements and capabilities, significant RFP content may be reused from project to project once it has been carefully crafted. This is not to say that content should be blindly incorporated.
In the source selection process of a competitive procurement, proposals are examined against the requirements, facts, recommendations, and government policy relevant to an award decision, and, in general, the best value proposal is selected. The actions shown in Figure 3 are those generally conducted during source selection. The focus of MITRE's participation in source selections is the evaluation of the technical proposal and the resulting risk assessment.
Emphasis needs to be placed on the process, making sure that both government and industry have a clear understanding of what is needed and what is offered. The Government Accountability Office (GAO) has reviewed many protests over the years, and two main reasons for protest sustainment are that the government did not follow its own process and that the government did not quantify the value of a higher-priced offer relative to the low-priced offer. The proposal technical evaluation is a critical aspect of this process, and it is driven by the need to understand the technical, cost, and schedule risks embodied in each proposal.
Although it is not the preferred federal procurement approach, on occasion, non-competitive procurement is necessary to meet government needs for certain critical procurements. This approach is more commonly used with a contractor who is already on contract with the government (but not necessarily with the same organization doing the procurement) providing a similar capability, or when it is clearly advantageous to use the non-competitive approach in subsequent contract changes or new solicitations for an existing program.
Non-competitive contracts use the terms "sole source contractor" or "single source contractor." Sole source includes initial awards to one contractor, and it also includes change proposals or major modifications (e.g., block updates) to an existing contract (most common) after a competitive contract is awarded. Most sole source awards depend primarily on the experience of the contractors or their proprietary rights to products. Sole source awards take less time than competitive RFP and source selection processes (exceptions are large definitizations of engineering change proposals). Predetermined evaluation criteria are not used; instead, a determination that the contractor is the only entity that can perform the work serves as the selection criterion. The government is usually in its weakest position for negotiations, given that it has no competition leverage to negotiate costs or requirements.
As with competitive procurement, the actions taken in a non-competitive procurement include the preparation steps that lead to the development of the acquisition strategy. Prior to developing the solicitation documents that constitute the RFP, the program office must submit justification and approval (J&A) documentation to the appropriate agency office to receive approval for the non-competitive procurement. Occasionally there is a technical reason for using a particular contractor, and MITRE is involved with generating the J&A. With this approval, the program office can develop the solicitation documents and enter into collaborative contract development with the contractor. On completion of the collaborative contract development, the program office evaluates, negotiates, and awards a contract with many of the steps indicated earlier. MITRE is most often used to evaluate the proposal for technical approach and resources/engineering hours.
Basis of Estimate (BOE) Evaluations
Basis of estimate (BOE) evaluations provide the government with a foundation for establishing a fair and reasonable contract or modification to a contract. MITRE conducts many BOE analyses, focusing on the biggest impact items and where to question the offeror’s estimate or estimation process. They provide an opportunity to verify the offeror’s understanding of the government’s technical requirements and are relied on for assessing the risk of meeting the performance parameters established by the government. The review of the offeror’s BOE should include the WBS basis for each contract line item number's (CLIN’s) effort estimates; a determination that the offeror’s methodology is acceptable and no mistakes have been made in the calculations; and an estimate of effort or cost for the government estimate of most probable cost (GEMPC) based on the cost team’s recommended level of confidence.
Each adjustment must have a logical and defensible justification that is associated with specific CLIN(s) and WBS element(s). Coordinating with the cost team, MITRE can quantify and provide rationale for adjustments to the offeror's estimate. Other BOE problems to look for include the inclusion of unnecessary tasks or purchases; subcontractor or inter-divisional efforts that are not well integrated; poor traceability to/from CWBS, SOW or IMP, IMS, and/or technical volumes; and planned staffing levels that appear to be unreasonable.
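The adjustment bookkeeping described above amounts to a simple roll-up: each documented adjustment is tied to a CLIN and WBS element, and the GEMPC hours are the offeror's estimate plus the net adjustments. The sketch below illustrates that bookkeeping only; the CLIN numbers, hour values, and rationale strings are hypothetical, not drawn from any actual BOE.

```python
from dataclasses import dataclass, field

@dataclass
class Adjustment:
    clin: str            # hypothetical contract line item the adjustment applies to
    wbs: str             # WBS element the adjustment is tied to
    delta_hours: float   # signed change to the offeror's hour estimate
    rationale: str       # logical, defensible justification for the change

@dataclass
class BoeRollup:
    offeror_hours: dict                      # CLIN -> hours proposed by the offeror
    adjustments: list = field(default_factory=list)

    def add(self, adj: Adjustment) -> None:
        # Every adjustment must carry a documented rationale to be defensible.
        if not adj.rationale:
            raise ValueError("every adjustment needs a documented rationale")
        self.adjustments.append(adj)

    def gempc_hours(self) -> dict:
        # Start from the offeror's estimate and apply net adjustments per CLIN.
        hours = dict(self.offeror_hours)
        for adj in self.adjustments:
            hours[adj.clin] = hours.get(adj.clin, 0.0) + adj.delta_hours
        return hours

rollup = BoeRollup({"0001": 12000.0})
rollup.add(Adjustment("0001", "1.2.3", -800.0, "duplicate integration task"))
```

After the adjustment, `rollup.gempc_hours()` returns `{"0001": 11200.0}`, and each delta remains traceable to its CLIN, WBS element, and rationale.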
Best Practices and Lessons Learned
Market research can play a critical role in developing requirements and planning for the acquisition. Although it may be time consuming, researching the state of the art and visiting with contractors and vendors will give you a good sense of what's achievable for program requirements. In competitive procurements, requests for information or technical abstracts are very helpful in determining the range of available developers/suppliers. Asking industry to submit papers and demonstrations prior to the release of an RFP helps refine the requirements. MITRE, as an FFRDC operator, may review this kind of proprietary information and use it as a basis for validating technology or assumptions about requirements. Such feedback from industry may also be useful for refining the evaluation criteria for the RFP.
The role of competitive prototyping. Competitive prototyping can be used to require competing developers to demonstrate applicable technology or services, along with engineering process and documentation (as examples), to enable you to better evaluate their overall abilities to deliver the full program. You can also use it as a technique to reduce risk in complex or unproven technical areas. (For more information on competitive prototyping, see the SEG's article Competitive Prototyping in the Contractor Evaluation topic.)
The right level of detail for a WBS. The WBS is often the foundation for determining contractor progress and earned value during the program development and deployment phases. As such, it needs to be structured to provide enough detail to judge sufficient progress during program execution. Therefore, it is critical that the WBS be a part of the source selection evaluation. WBS elements should be partitioned into efforts no larger than 60 days per unit. At least 90 percent of the WBS elements must be measurable in durations of 60 days or less. This allows you to track WBS completion quarterly and get a good indication of progress monthly. Each WBS item should have only three reporting states: zero percent complete (not started), 50 percent complete (started but not finished), and 100 percent complete (done). This allows you to track the WBS status without overestimating the percent complete.
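The 0/50/100 reporting rule above lends itself to a small earned value calculation. The following sketch (with hypothetical budget values) credits each WBS element at 0, 50, or 100 percent of its budgeted value, which is what keeps the rule from overestimating percent complete.

```python
# 0/50/100 rule: a WBS element earns 0% of its budgeted value until it
# starts, 50% once it is in progress, and 100% only when it is done.
CREDIT = {"not_started": 0.0, "in_progress": 0.5, "done": 1.0}

def earned_value(wbs_elements):
    """wbs_elements: list of (budgeted_value, state) tuples."""
    return sum(budget * CREDIT[state] for budget, state in wbs_elements)

def percent_complete(wbs_elements):
    """Overall percent complete, weighted by each element's budget."""
    total = sum(budget for budget, _ in wbs_elements)
    return 100.0 * earned_value(wbs_elements) / total if total else 0.0

# Hypothetical program status: one element done, one underway, one not started.
status = [(100.0, "done"), (100.0, "in_progress"), (200.0, "not_started")]
```

For this status, `earned_value(status)` is 150.0 and `percent_complete(status)` is 37.5, even if the in-progress element is subjectively "almost done"; the element gets no further credit until it is actually finished.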
What matters in the RFP. Depending on the acquisition strategy chosen, the completeness of the TRD/SRD is critical. For programs expected to evolve over time through "agile" or "evolutionary" acquisition strategies, you need to specify carefully chosen threshold requirements for the initial delivery of capability—ones that are achievable within the allotted schedule for that first delivery. Requirements to be satisfied in a later delivery of capability may be less stringent if they are apt to change before being contracted for development. In a more traditional acquisition strategy where all requirements are to be satisfied in one or two deliveries of capability, the TRD/SRD must be complete.
Test for performance. Another point to remember is that TRD/SRDs and SOO/SOW form the basis of testing—both sets of documents need to be written with a focus on performance and test. Waiting until test preparation is too late to discover that requirements were not stated in a manner that is quantifiable or testable. (For more on requirements and testing, see the System Design and Development and the Test and Evaluation topics in the SEG's SE Life-Cycle Building Blocks section.)
Picking the winner. Evaluation criteria need to be comprehensive and specific enough to allow clear differentiation between offerors, especially for requirements that are critically important to the success of the program. Consider trying a sample proposal against the criteria to see if they are in fact selective enough. There have been cases in which the criteria have not been expansive enough, and differentiating technical information found in the proposals to be relevant to selection could not be considered for evaluation. Beyond just the written criteria, consider requiring the offerors to provide and follow their risk management process or software development plan as part of an exercise or demonstration.
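One way to "try a sample proposal against the criteria" is to score each criterion on a shared scale and check whether the weighted totals actually separate the offerors. This is a hypothetical illustration of that dry run, not a prescribed source selection method; the criteria, weights, and scores are invented for the example.

```python
def weighted_score(weights, scores):
    """weights: criterion -> relative weight; scores: criterion -> rating (0-5)."""
    return sum(weights[c] * scores[c] for c in weights)

def differentiates(weights, proposals, min_spread=0.5):
    """True if the criteria spread the sample proposals by at least min_spread.

    A spread below the threshold suggests the criteria are not selective
    enough to distinguish offerors on what matters.
    """
    totals = [weighted_score(weights, p) for p in proposals]
    return max(totals) - min(totals) >= min_spread

# Hypothetical criteria weighted toward technical merit and risk.
weights = {"technical": 0.5, "risk": 0.3, "past_performance": 0.2}
sample_a = {"technical": 4, "risk": 3, "past_performance": 5}
sample_b = {"technical": 3, "risk": 3, "past_performance": 5}
```

Here `weighted_score(weights, sample_a)` is 3.9 versus 3.4 for `sample_b`, so the criteria do separate these two samples; if the totals had landed within the threshold, that would be a signal to sharpen the criteria before release rather than after proposals arrive.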
Sample tasking and challenges to demonstrate the solutions. The proposal from the vendor does not have to be paper-based. The government can ask for a sample task proposal or set criteria for the vendor to demonstrate their solution before award. This reflects current thinking from the White House Office of Science and Technology Policy, expressed through Challenge.gov and the America COMPETES Act.
Source selection—Be prepared. SEs responsible for evaluating technical proposals need to be well versed in applicable current technology for the program. If a proposal contains a new technical approach that is unfamiliar, perform due diligence to determine the viability. Do not assume the approach is low risk or commonplace; be sure to determine its feasibility, its risk, and the proposing offeror’s familiarity with it. Consult with experts in areas where program staff have limited depth of knowledge. Getting the right assistance in source selection is critical to choosing the right contractor. You don't get another chance!
The danger of "leveling." During source selection, offerors are often approached for clarifications; they may be asked to answer questions in writing, provide oral proposals, or participate in question/answer sessions. Several iterations of these information exchanges can result in, or appear to result in, "leveling," which occurs when the government team unintentionally brings an offeror's proposal up to the level of other proposals through successive rounds of discussion. An example is inadvertently pointing out an offeror's proposal weaknesses so that a final selection is based purely on cost. This is usually not the factual result but a perception created by iterative clarification calls made to ensure that all the offerors have provided adequate details of their approaches. As most engineers who have participated in multiple source selections will tell you, the offerors are not likely to be even in technical risk, past experience/expertise, or architectural approach. It is up to the engineering team to clarify the differences so that "leveling" does not occur. This means careful consideration of the evaluation criteria for differentiation, focusing on the critical areas needed for success on the program. If the offeror has not demonstrated a consistent approach throughout the proposal process, this in itself may be a legitimate weakness.
Incorporating continuous competition into the process. Competition is an extremely strong motivator; the forces of competition act as an "invisible hand" that regulates contractor performance. Contractors tend to keep each other in check, and the government greatly benefits from, and is protected by, the nature of competition. Extensive historic data on military programs has shown that, in a competitive environment, costs consistently decline and performance and reliability increase. Continuous competition strategies and methods can be applied from development through production in order to maintain multiple sources throughout the acquisition life cycle. These strategies can include dual sourcing in production, leader-follower contracts, low-level production quantities, and targeted technologies development with a second vendor.
Leverage in a sole source environment. It is a best practice to judge a proposed effort on a sole-source contract against similar past efforts that are already expensed and for which actual hours are known. Again, the ability to do this depends on whether the initial contract (specifically the WBS) was structured to capture progress at an appropriate level to accrue cost and schedule for independent efforts. Without a reasonable facsimile for the proposed effort, you will need either sufficient experience in the contractor's development methodology to estimate hours or research into other programs and developments to compare against (both inherently less helpful for negotiating).
References and Resources
Air Force Materiel Command (AFMC), April 2005, HQ AFMC Justification and Approval Preparation Guide and Template, accessed October 24, 2017.
Bloom, M., and J. Duquette, July 2006, System Engineering in RFP Prep & Source Selection Process, V3.0.
Defense Federal Acquisition Regulation Supplement and Procedures, Guidance, and Information, Contracting by Negotiation, DFARS Part 215, accessed October 24, 2017.
Department of Defense, October 27, 2015, MIL-STD-961E, DoD Standard Practice, Defense and Program-Unique Specifications Format and Content, accessed October 24, 2017.
Department of Defense, October 3, 2011, MIL-HDBK-881C, DoD Handbook, Work Breakdown Structures for Defense Materiel Items.
Federal Acquisition Regulation, Contracting by Negotiation, FAR Part 15, accessed October 24, 2017.
Government Accountability Office (GAO), January 2, 2014, Bid Protest Annual Report to Congress for Fiscal Year 2013, GAO 14-276SP, accessed October 24, 2017.
ISO/IEC/IEEE Std. 29148-2011, Systems and Software Engineering—Life Cycle Processes—Requirements Engineering, accessed October 24, 2017.
OMB's Office of Federal Procurement Policy, October 1998, A Guide to Best Practices for Performance-Based Service Contracting, accessed October 24, 2017.
White House Office of Science and Technology Policy, 2012, Implementation of Federal Prize Authority: Progress Report, accessed October 24, 2017.