FAA WILLIAM J. HUGHES TECHNICAL CENTER
TEST AND EVALUATION HANDBOOK
DOCUMENT # VVSPT-A2-PDD-013
VERSION # VERSION 4.0
VERSION DATE MAY 16, 2017
TABLE OF CONTENTS
1 INTRODUCTION ........................................................................................................................................... 1-1
1.1 VERIFICATION AND VALIDATION PRINCIPLES ...............................................................................................1-1
1.2 OBJECTIVES OF THE TEST AND EVALUATION HANDBOOK.............................................................................1-3
1.3 SCOPE ...........................................................................................................................................................1-3
1.4 ROLES AND RESPONSIBILITIES ......................................................................................................................1-4
1.5 FUNDAMENTAL PRACTICES FOR QUALITY T&E............................................................................................1-4
1.6 TAILORING OF T&E PROCESSES....................................................................................................................1-5
1.6.1 Tailoring Criteria and Processes ....................................................................................................1-5
1.6.2 Tailoring Guidelines ......................................................................................................................1-6
1.6.2.1 Types of Tailoring ..........................................................................................................1-6
1.6.2.2 Commonly Tailored Processes .......................................................................................1-6
1.7 T&E STANDARDS CONFORMANCE PROCESS .................................................................................................1-7
1.8 T&E QUALITY MANAGEMENT SYSTEM PROCESSES .....................................................................................1-7
1.9 PEER REVIEWS ..............................................................................................................................................1-8
2 RELATED DOCUMENTS AND REFERENCES ....................................................................................... 2-1
3 T&E SUPPORT ACROSS THE AMS LIFECYCLE .................................................................................. 3-1
4 T&E SUPPORT TO SERVICE ANALYSIS & STRATEGIC PLANNING AND CONCEPT &
REQUIREMENTS DEFINITION ................................................................................................................. 4-1
5 T&E SUPPORT TO INVESTMENT ANALYSIS ....................................................................................... 5-1
5.1 PROGRAM REQUIREMENTS DOCUMENT SUPPORT .........................................................................................5-1
5.2 TEST AND EVALUATION MASTER PLAN DEVELOPMENT ...............................................................................5-2
5.2.1 TEMP Objectives ...........................................................................................................................5-4
5.2.2 Test Management Using the TEMP ...............................................................................................5-5
5.2.3 Test Design Process .......................................................................................................................5-7
5.2.3.1 Identification and Assessment of Requirements .............................................................5-7
5.2.3.2 Decomposition of Critical Operational Issues ................................................................5-7
5.2.3.3 Critical Performance Requirements Evaluation Approach ...........................................5-10
5.2.3.4 Defining Test Activities ................................................................................................5-10
5.2.3.5 Formulating the TEMP Verification Requirements Traceability Matrix ......................5-12
5.2.3.6 Identifying Test Capability Requirements ....................................................................5-13
5.2.3.7 Defining Test Schedule .................................................................................................5-14
5.2.3.8 Assessing Updates and Acquisition Program Baseline Changes ..................................5-14
5.2.3.9 Documenting the Test Design in the TEMP .................................................................5-14
5.2.3.10 Interim Assessment Report Planning ..........................................................................5-14
5.3 IMPLEMENTATION STRATEGY AND PLANNING DOCUMENT SUPPORT .........................................................5-15
5.4 FAA SYSTEM SPECIFICATION SUPPORT ......................................................................................................5-15
5.5 SCREENING INFORMATION REQUEST (SIR) SUPPORT ..................................................................................5-17
5.5.1 Section L of the SIR: Instructions, Conditions, and Notices to Offerors ....................................5-17
5.5.2 Section C of the SIR: Description Of Work Requirements .........................................................5-17
5.5.2.1 Statement of Work (SOW)/Performance Work Statement (PWS) ...............................5-17
5.5.2.2 Statement of Objectives (SOO) ....................................................................................5-18
5.6 PROPOSAL EVALUATION SUPPORT ..............................................................................................................5-19
6 T&E SUPPORT TO SOLUTION IMPLEMENTATION - DT .................................................................. 6-1
6.1 DT OVERVIEW ..............................................................................................................................................6-1
LIST OF FIGURES
Figure 1-1. T&E Application To The V-Model........................................................................................................ 1-2
Figure 3-1. Typical T&E Approach throughout the AMS Lifecycle ........................................................................ 3-3
Figure 3-2. T&E Phases and Requirements .............................................................................................................. 3-4
Figure 5-1. Test Category Hierarchy and Documentation ........................................................................................ 5-3
Figure 5-2. TEMP Relational Diagram..................................................................................................................... 5-6
Figure 5-3. Generic TEMP Test Design Process ...................................................................................................... 5-8
Figure 5-4. Decomposition of COIs to Test Cases ................................................................................................... 5-9
Figure 6-1. Typical DT Activities............................................................................................................................. 6-3
Figure 6-2. FAA DT Process Flow ........................................................................................................................... 6-5
Figure 6-3. SAT Process Flow.................................................................................................................................. 6-6
Figure 7-1. FAA OT Process Flow ........................................................................................................................... 7-4
Figure 7-2. From Requirements to Operational Readiness Determination ............................................................. 7-15
Figure 9-1. Test Capability Accreditation Process ................................................................................................... 9-2
Figure D-1. PRD (T&E Section) Review & Approval Cycle .................................................................................. D-3
Figure D-2. ISPD (T&E Section) Review & Approval Cycle ................................................................................. D-4
Figure D-3. pTEMP, iTEMP & fTEMP Review & Approval Cycle ....................................................................... D-5
Figure D-4. T&E Input Review Cycle for FAA System Specification, SIR Proposal Requirements and Statements
of Work ........................................................................................................................................................... D-6
Figure D-5. CMTP Review & Approval Cycle ....................................................................................................... D-7
Figure D-6. DT Test Plan Review & Approval Cycle ............................................................................................. D-8
Figure D-7. DT Test Procedures Review & Approval Cycle .................................................................................. D-9
Figure D-8. DT Test Report Review & Approval Cycle ....................................................................................... D-10
Figure D-9. DT Accreditation Plan/Report Review & Approval Cycle ................................................................ D-11
Figure D-10. OT Test Plan Review & Approval Cycle ......................................................................................... D-12
Figure D-11. OT Test Procedures and Test Capability Procedures Review & Approval Cycle............................ D-13
Figure D-12. OT Interim Assessment Report Review & Approval Cycle............................................................. D-14
Figure D-13. OT Quicklook Test Report Review & Approval Cycle ................................................................... D-15
Figure D-14. OT Final Test Report Review & Approval Cycle ............................................. D-16
Figure D-15. Field Familiarization Support Plan Review & Approval Cycle ....................................................... D-17
Figure D-16. OT Accreditation Plan/Report Review & Approval Cycle .............................................................. D-18
LIST OF TABLES
TABLE D-1. T&E WORK PRODUCTS ......................................................................................................................... D-1
1 INTRODUCTION
This Handbook provides standard processes for conducting high quality and consistent Test and
Evaluation (T&E) that supports the mission of Verification and Validation (V&V).
1.1 VERIFICATION AND VALIDATION PRINCIPLES
As part of the FAA mission, the William J. Hughes Technical Center (WJHTC) is actively
engaged in applying effective V&V principles and practices to T&E efforts. The intent of this
initiative is to improve the quality of T&E products and services, promoting effective planning,
reducing risks, and decreasing costs.
This initiative addresses the standards for V&V process areas that are based on the Capability
Maturity Model Integration (CMMI®) 1 standards, published by the Software Engineering
Institute of Carnegie Mellon University.
In accordance with CMMI, the purpose of verification is to ensure that a system is built right,
while validation ensures that the right system is built. Verification and validation represent
complementary process areas that are distinguished below.
• Verification - Confirmation that selected work products meet their specified
requirements. This includes evaluation of the end product (system, service or operational
change) and intermediate work products against all applicable requirements. Verification
is inherently an incremental process since it occurs throughout the development lifecycle
of the work products, beginning with initial requirements, progressing through
subsequent changes, and culminating in verification of the completed end product.
• Validation - Confirmation that an end product or end product component will fulfill its
intended purpose when placed in its intended environment. The methods employed to
accomplish validation are applied to selected work products as well as to the end product
and end product components. The work products should be selected on the basis of
which are the best predictors of how well the end product and end product components
will satisfy the intended purpose and user needs. Validation can apply to all aspects of an
end product in any of its intended environments, such as operation, training,
manufacturing, maintenance or support services.
V&V is a disciplined approach to assessing a product throughout the product lifecycle. V&V
strives to ensure that quality is built into the product and that the product satisfies operational
requirements. A strong focus on validation, an industry best practice, also helps to ensure
customer satisfaction. The T&E standards defined in this Handbook support a significant portion
of a comprehensive V&V approach. Some CMMI V&V practices are executed outside of the
T&E function, such as those that apply to the systems engineering discipline, and are therefore
not addressed in this document. The relationship of how the T&E function applies to V&V is
depicted in Figure 1-1, T&E Application to the V-Model. The V-Model illustrates the
interactions between each phase of the Acquisition Management System (AMS) lifecycle and its
associated T&E phase (i.e., T&E Planning and Support, Development Test (DT), and
Operational Test (OT)). The arrows on the left side of the “V” represent T&E planning and
support to the development of concepts, requirements, and design. The arrows on the right side
of the “V” represent the DT and OT phases. The primary V&V role during these two phases is
the conduct of T&E itself.
1 CMMI is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
The arrows in the middle of the “V” are intended to show that each side of the “V” feeds the
other. For instance, the concept and requirements definition phase drives the test cases for OT,
while OT development and conduct can provide feedback for future concepts and requirements
definition.
1.2 OBJECTIVES OF THE TEST AND EVALUATION HANDBOOK
This Handbook describes the test and evaluation processes, methods, and standards. The
objective is to ensure that a test program is complete, well managed, and conducted in a
consistent manner by the program test team throughout a system’s lifecycle, and that it achieves
V&V objectives. The processes delineated in this Handbook may be tailored to meet the specific
needs of a given FAA program as described in Section 1.6. (The use of the word “system”
within this Handbook refers to either “system” or “service”).
This Handbook is based on the T&E Process Guidelines and the Acquisition Management
Policy. Preparation of the Handbook was guided by lessons learned from past test programs,
industry best practices, International Organization for Standardization (ISO), CMMI standards,
and future FAA T&E needs.
1.3 SCOPE
This Handbook addresses the T&E preparation, planning, conduct, and reporting activities in
support of FAA programs. It is intended for use by all test personnel. This Handbook is
organized into sections that first provide an overview of the T&E preparation and support
activities throughout the AMS lifecycle phases, and then define the specific T&E activities
required for each phase.
This Handbook describes the processes associated with the AMS T&E Work Breakdown
Structure (WBS) that support the objectives of the V&V process areas. The AMS defines the
T&E WBS as follows:
Test and Evaluation (AMS WBS, 3.5) - All Government test, analysis, and evaluation
activities to verify and validate that developed or modified products and product
components meet specification, satisfy program requirements, and are operationally
suitable and effective.
The T&E Handbook, including checklists, should be followed for T&E of Non-NAS systems,
NAS systems, and NAS Enterprise Level Capabilities. NAS Enterprise Level Capabilities are
operational improvements rooted in functions that are distributed across multiple systems in a
System of Systems (SoS) environment. DT test activities may need to be tailored for programs
employing “Agile Acquisition” (see 6.1.1). Wherever the provisions of this Handbook need to
be tailored, refer to Section 1.6, Tailoring of T&E Processes.
References are made throughout the Handbook to the “V&V Repository”. The V&V Repository
is an Internet-based information source hosted on servers under FAA control. The repository
provides users of this Handbook with T&E templates, samples, and tools as standard guidance to
accomplish specific V&V practices. The V&V Repository is located within the ANG-E T&E
Portal on the Knowledge Services Network (KSN) site.
Note: Refer to the V&V Repository for the Test Roles and Responsibilities Guide.
e) Accurately documented as-run test procedures and test logs for dry runs and
formal test runs.
f) Reports that provide historical test data, results, risks, deficiencies, and
recommendations with clear analysis of the performance and limitations against
planned objectives and requirements.
g) Integration and testing in an end-state environment.
(6) Prior to requirements approval, T&E personnel participate in reviews to ensure that
all program requirements are testable and validated against the operational mission.
(7) T&E is involved in the procurement package development and source selection
process to ensure that the contract complies with the test strategy and scope
documented in acquisition planning and test strategy documents.
(8) An integrated test plan (e.g., Test and Evaluation Master Plan (TEMP), see Section
5.2) is developed and completed early in the program lifecycle, and routinely
updated to reflect the evolution of the program.
(9) Operational Testing is executed in a test environment (hardware, software,
interfaces, etc.) representative of the expected in-service operational conditions.
Operational Testing is conducted with a representative team of end users who are
expected to interface with or operate the system under test.
(10) Required modifications for defects are verified and validated under conditions
equivalent to those that existed when the defects were initially identified.
Regression testing is conducted to verify and validate that fixes and modifications
do not adversely affect other interrelated operations and functionality.
1.6 TAILORING OF T&E PROCESSES
Every T&E program is unique and has its own set of challenges and differences. Tailoring is a
critical activity that allows controlled changes to processes to meet the specific needs of a project
or organization.
1.6.1 TAILORING CRITERIA AND PROCESSES
To meet the unique needs of a program, the T&E processes in this Handbook may be tailored
based on:
a) Program complexity/scope
b) Risks
c) Level of effort
d) Acquisition strategies and type (e.g., Commercial Off-the-Shelf (COTS)/Non-
Development Item (NDI), Services, Software, Hardware, procurement of systems or
equipment, modification of facilities, changes in the physical infrastructure,
development of functional interfaces, spiral development implementation, etc.)
e) Program scope changes and decisions approved by the Joint Resources Council (JRC)
f) Test strategies in the approved TEMP
T&E processes and process elements that are directly related to critical standards and process
objectives and that have the word “must” in the Process Description Document (PDD) should not
be tailored without a valid programmatic or technical justification. Processes defined in this
Handbook that have been changed or omitted in a program require a clear rationale that relates to
a specific unique program element or technical variable (e.g., Integration Testing may not be
required on non-complex systems). The Fundamental T&E Practices defined in Section 1.5 must
be evident and remain intact in any tailored or developed process.
The tailored program processes are documented and approved in the Process Conformance
Checklists as defined in Section 1.7. The TSB reviews and assesses the tailoring approach and
rationale documented in the compliance checklists and determines if the T&E Handbook
processes and fundamental practices for quality T&E are maintained. If the checklists are
adequate, they are signed by the TSB. Any major deviations from test standards must be
rationalized, documented, and approved in a Request for Waiver form. This request is initiated
by the Test Director or T&E First Line Supervisor. The Quality Management Lead will identify
the approval authority.
Note: Refer to the V&V Repository for the Request for Waiver form.
Note: Refer to the V&V Repository for current versions of the Test Planning, DT, and OT
Process Conformance Checklist templates.
b) Quality Assurance (QA) – provides processes for QA audit activities in support of the
T&E goals, and provides management with insight on the level of adherence to
applicable standards.
c) Configuration Management (CM) – provides CM processes performed by
practitioners in support of T&E-related services to ensure that Configuration Items
(CIs) are managed and controlled.
d) Document Management and Control (DMC) – provides processes to manage and
control T&E and Quality Management System (QMS) documents.
e) Peer Review – provides a process to improve the quality of T&E work products and
to help verify that work products meet requirements.
f) Calibration – provides a process to define the requirements for the calibration of test,
measurement and diagnostic equipment, diagnostic software, and test data sets.
g) Corrective and Preventive Action – provides a process to identify and correct the
cause(s) of nonconformities and/or potential nonconformities in products and
processes within the T&E organization.
h) Control of Records – provides a process for identification, collection, maintenance,
protection, and storage of records.
i) Customer Feedback – provides a process to describe how the T&E organizations
receive and process feedback from customers.
j) Management Review – provides a process by which the T&E organizations review
the QMS to ensure that it is suitable, adequate, and effective in meeting customer
requirements.
k) Nonconforming Material – provides a process for identifying, reporting, segregating,
controlling and processing nonconforming test equipment, software, and custom test
setups, to prevent unauthorized use during formal testing.
l) Special Support Activities – provides a process describing how the T&E
organizations handle special projects, studies, or provide subject matter expertise
based upon the customer’s needs.
m) Quality Manual – provides a reference of processes to ensure quality testing of
systems and services entering into the NAS.
1.9 PEER REVIEWS
Peer reviews are a critical component of V&V for T&E work products generated throughout a
product’s AMS lifecycle. Peer reviews are an effective verification practice for the early
removal of defects from selected work products and are the first mechanism for verification prior
to having partially or fully assembled product components.
Peer reviews involve a methodical examination of work products by the producer's peers as a
quality check to identify defects and to recommend changes or improvements. Peer reviews
provide strategic technical, operational, and procedural input for specific work products. The
peer review process must be conducted in accordance with the T&E Peer Review PDD.
Note: Refer to the V&V Repository for the T&E Peer Review PDD.
Peer reviews should be completed prior to the final TSB review. See Appendix D for the
recommended sequence of the peer review for specific work products.
For contract deliverables (e.g., the Contractor Master Test Plan (CMTP), DT Test Plans, DT Test
Procedures, and DT Test Reports), the DT test team should check if the deliverables have
undergone internal peer reviews by the prime contractor, prior to FAA review and approval (see
Section 6.1.3.2).
Figure 3-1 illustrates the AMS lifecycle phases, beginning with Service Analysis & Strategic Planning and CRD, IA, SI, and ending with ISM. The diagram also
indicates the key decisions, milestones, major test efforts, technical reviews, and most
importantly, the major T&E work products and when they would be initiated relative to the AMS
lifecycle phase. This figure depicts the typical FAA investment program lifecycle, and the
implementation of specific activities (i.e., test efforts or T&E work products) or events (i.e., key
decisions, milestones, or technical reviews).
Note: Each program is unique and may deviate from the typical lifecycle approach.
Sections 4 through 8 of this Handbook identify and define the specific T&E support required for
each of the AMS program lifecycle phases. The AMS Lifecycle phases shown in Figure 3-1 are
to be followed for any investment which has been identified by FAA as requiring executive
decision.
NAS Enterprise Level Capability programs need to be coordinated across multiple systems, each
of which has its own requirements. Enterprise Level Capability T&E teams need to be put in
place to:
a) Identify the hierarchy of requirements,
b) Allocate the requirements to appropriate organizations, and
c) Establish roles, responsibilities, and procedures for coordinating T&E activities,
including problem resolution.
Figure 3-2 illustrates test phases for both System/Solution level T&E, and NAS Enterprise Level
Capability T&E. The arrows show which set of requirements undergoes T&E in each test phase.
FIGURE 3-1. TYPICAL T&E APPROACH THROUGHOUT THE AMS LIFECYCLE
T&E support for the development of program requirements during PRD development includes:
a) Participating in product engineering and implementation reviews
b) Ensuring that all new or modified requirements for functions or services are defined
and address the operational mission
c) Ensuring that all interfaces required for the NAS operational mission are defined
d) Reviewing and commenting on requirements for testability. (Requirements must be
precisely defined and leave no room for subjective interpretation. Parameters and
thresholds must be measurable. Functional performance requirements must be
verifiable and expressed in unambiguous terms)
e) Verifying that Critical Operational Issues (COIs) are completely described, are
operational in nature, represent observable events, and are testable
f) Verifying that program requirements essential to meeting the mission are identified as
Critical Performance Requirements (CPRs)
g) Structuring the test program to address all target system or subsystem components
and interfaces by:
1) Defining potential test strategies and seeking feedback from engineering and
implementation teams
2) Providing test strategy briefings to the PMO as required
3) Writing the T&E section of the PRD to include essential FAA and contractor tests
The T&E section of the PRD must be reviewed by the TSB, PMO, and the T&E Senior Manager,
but no formal approvals or endorsements are required. Refer to Appendix D, Figure D-1, for the
complete T&E section of the PRD review and approval cycle.
Note: Refer to the V&V Repository for a sample of the T&E section of the PRD, and for
Requirements Review Guidance, TSPAT-D3-GDE-004.
The preliminary TEMP (pTEMP) defines the investment program test strategy and scope. It is
developed based on the concepts and functions documented in the preliminary requirements
document prior to IID and is not expected to contain the complete level of detail required to fully
implement the T&E program. The iTEMP for each T&E program must be submitted by the DT
and OT Test Directors for approval by the Program Manager and the Technical Center Director
prior to the Final Investment Decision. The iTEMP is required for the Final Investment Decision
(FID). The iTEMP also is not expected to contain the complete level of detail required to fully
implement the T&E program. However, the iTEMP must contain estimates of the testing scope
that are sufficient to address ISPD requirements and development of T&E requirements for the
Screening Information Request (SIR).
The TEMP is a living document and is updated as the program progresses and more detailed
supporting information becomes available. Unaddressed or incomplete areas within early TEMP
versions which require refinement must be identified as “To Be Determined,” (TBD), and must
be included in the final or revised final versions as additional information becomes available.
The final TEMP (fTEMP) should be completed after design reviews, such as Critical Design
Review (CDR), are completed, and prior to delivery of the CMTP, if applicable, and is generally
revised at major program milestones (see Section 5.2.2). The pTEMP, iTEMP, fTEMP, and all
revisions to the fTEMP impacting test strategy or scope require TSB endorsement and the
approval signatures of the Program Manager and the Technical Center Director. Minor changes
to the TEMP that do not impact test strategy or scope (e.g., minor schedule changes, editorial
corrections, etc.) will be considered working drafts only and do not require these approval
signatures but are still subject to Document Management and Control. The T&E strategies and
methods described in the TEMP should be briefed to the Program Manager to obtain input and
concurrence with the general T&E approach. The briefing should occur prior to submitting the
initial and fTEMP for the Program Manager’s review and approval.
Refer to Appendix D, Figure D-3, for the complete TEMP review and approval cycle.
Note: Refer to the V&V Repository for the TEMP templates and TEMP samples.
5.2.1 TEMP OBJECTIVES
The TEMP must accomplish the following objectives to ensure a comprehensive test program:
a) Provide structure for managing the T&E effort
b) Define a test plan baseline that addresses all required test activities
c) Document the ITT’s consensus on the scope of testing that is required to evaluate the
system or service in a thorough and efficient manner
d) Define test strategies and evaluation methods for the system(s) or service(s) under test
e) Identify program requirements to be verified and validated, also noting those that will
not be evaluated.
f) Identify the CPRs from the PRD and associated criteria
g) Document traceability of program requirements and COIs to test activities in a
Verification Requirements Traceability Matrix (VRTM).
h) Decompose COIs into Measures of Effectiveness (MOEs), Measures of Suitability
(MOSs), and Measures of Performance (MOPs) in the VRTM.
i) Define a logical schedule of T&E activities
the evaluation of the COIs, MOEs and MOSs. Decomposition must be considered
preliminary for the pTEMP and iTEMP.
c) Mapping MOEs and MOSs to PRD requirements in the VRTM. If a conflict is
discovered between the PRD and any MOEs, MOSs, or MOPs, the conflict must be
resolved.
Figure 5-4 illustrates the decomposition of COIs down to test cases, and their relationship to
operational readiness.
b) Maintainability
c) Availability
d) Compatibility
e) Transportability
f) Interoperability
g) Safety
h) Human factors
i) Logistics
j) Supportability
k) Documentation
l) Personnel
m) Training requirements
n) Site adaptability
o) Fault tolerance
p) Security
Note: Refer to the V&V Repository for the COI Decomposition Guide and a COI
Decomposition sample.
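For illustration only, the following minimal sketch (Python, with entirely hypothetical COI, MOE, MOS, and MOP text and identifiers) shows one way a single COI might be decomposed into measures that can later be mapped to test cases in the VRTM. It is an example of the decomposition principle described above, not content from any actual program or from the COI Decomposition Guide.

# Illustrative only: hypothetical decomposition of one COI into MOEs/MOSs and MOPs.
coi_decomposition = {
    "COI-1": {
        "statement": "Can the system deliver surveillance data to controllers "
                     "in time under expected traffic loads?",  # hypothetical
        "MOE-1.1": {
            "statement": "Surveillance data is delivered within operationally acceptable time.",
            "MOPs": {
                "MOP-1.1.1": "End-to-end latency of surveillance messages (seconds)",
                "MOP-1.1.2": "Percentage of messages delivered without error",
            },
        },
        "MOS-1.2": {
            "statement": "The system can be operated and maintained by field personnel.",
            "MOPs": {
                "MOP-1.2.1": "Mean time to restore service after a failure (minutes)",
            },
        },
    },
}

def list_measures(decomposition):
    """Flatten the hierarchy into (COI, measure, MOP id, MOP text) tuples for VRTM mapping."""
    for coi, measures in decomposition.items():
        for key, value in measures.items():
            if key == "statement":
                continue
            for mop_id, mop_text in value["MOPs"].items():
                yield coi, key, mop_id, mop_text

for row in list_measures(coi_decomposition):
    print(row)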
5.2.3.3 CRITICAL PERFORMANCE REQUIREMENTS EVALUATION APPROACH
CPRs are program requirements identified in the PRD as essential to the successful performance
of the system or service in meeting the program’s mission needs. Special emphasis is placed
upon the evaluation of CPRs to ensure the timely assessment of system or service capabilities
and to promote program success.
Therefore, CPRs are tracked and their status is reported throughout the test program to provide
visibility to management in support of making prudent decisions. The evaluation approach
should determine the strategy for testing the CPR and the required milestones for assessing CPR
performance.
If CPRs have not been identified in the PRD or the APB, then it is recommended that the ITT
select a PMO-approved set of CPRs from the PRD, in accordance with the PRD template
guidance. Additional CPRs may also be identified from other program work products. The full
set of CPRs is documented in the TEMP.
5.2.3.4 DEFINING TEST ACTIVITIES
Requirements are allocated to a set of DT and OT activities to support a structured evaluation of
the system. In addition to the COIs and CPRs identified above, test design should consider the
following factors affecting test:
a) NAS integration requirements
b) System interface requirements testing
Development of the VRTM is an iterative process. During early TEMP development, the TEMP
VRTM is generated with a mapping of COIs/MOEs/MOSs/MOPs and program requirements
down to the test activity level for OT and to the DT phase, as appropriate. Detail below the COI
level must be considered preliminary and may need to be revised and expanded as OT planning
progresses. As the program progresses, additional information such as that included in the ISPD,
the CMTP, and the FAA specification will provide further details that must be incorporated into
the VRTM.
For DT, the developing contractor will use the FAA System Specification to develop their
contractor specification(s) defining the DT requirements. The CMTP will then be developed,
containing a DT VRTM which maps the DT requirements to the DT activities. Subsequently,
the developing contractor will refine the DT VRTM with test case information during DT Test
Plan development (see Sections 6.2.2, 6.2.3, and 6.2.5). The TEMP VRTM should be updated
with the DT activity and test case information from the DT VRTM.
For OT, the TEMP VRTM will provide the basis for developing the OT VRTM. During OT
Test Plan development, the VRTM will be further refined to include COI/MOE/MOS/MOP and
program requirement mapping to the individual OT cases (see Sections 7.4.1 and 7.4.2).
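Because VRTM development is iterative, a simple completeness check can help confirm that every program requirement is still mapped to at least one test activity or test case as the matrix is refined. The sketch below (Python, with hypothetical requirement and test identifiers) is only an illustration of that traceability principle; it is not an FAA tool or a prescribed format.

# Illustrative only: flag requirements that have no test-activity/test-case mapping.
vrtm = [
    {"req": "SR-001", "measure": "MOP-1.1.1", "activity": "OT Test Case OT-TC-05"},
    {"req": "SR-002", "measure": "MOP-1.1.2", "activity": "DT System Test DT-ST-02"},
    {"req": "SR-003", "measure": None, "activity": None},  # not yet mapped
]

def unmapped_requirements(rows):
    """Return requirement IDs that have no associated test activity or test case."""
    return sorted({row["req"] for row in rows if not row["activity"]})

print("Requirements still unmapped:", unmapped_requirements(vrtm))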
Note: Refer to the V&V Repository for the Project Management PDD.
milestones for when IARs are planned to be delivered should consider when critical decision
points in the program can best be supported by the report and the availability of meaningful data.
Recommended reporting milestones include major reviews (e.g., preliminary, critical, software,
system and program design reviews), software development, start of formal DT, completion of
DT, and start of OT. The OT Test Director documents strategic plans for IARs in the TEMP and
is the responsible individual for the deliverable. For more on the IAR content, see Section 7.7.1,
OT Interim Assessment Report.
Note: Refer to the V&V Repository for a sample of the T&E section of the ISPD.
Note: Refer to the V&V Repository for the Requirements Review guidance.
i) Assess the product components, system design, and system implementation approach
for developing potential test strategies.
j) Define potential test strategies and seek feedback from implementation and
engineering teams. Provide test specification briefings to the PMO, as required, to
attain buy-in.
k) Develop test program structure to address target system components, subsystems, and
critical interfaces.
l) Write the verification section of the FAA System Specification to define essential
verification areas and methods and strategies that are required to verify the
documented requirements.
m) Ensure that operational considerations have been incorporated in the selection of
verification methods (inspection, analysis, demonstration, and test).
Note: Refer to the V&V Repository for an FAA System Specification sample.
The DT Test Director must ensure that all T&E-related input to the FAA System Specification
has been peer-reviewed prior to submission (see Section 1.9). Refer to Appendix D, Figure D-4,
for the T&E Input Review Cycle for FAA System Specification.
5.5 SCREENING INFORMATION REQUEST (SIR) SUPPORT
The FAA procures systems or services from contractors using agreements defined in contracts.
Before it can select a contractor to provide the system or service, the FAA issues a SIR to define
the specific efforts to be provided by the contractor. The test strategy from the TEMP is the
basis for determining what test and evaluation items belong in the SIR. The DT and OT Test
Directors must ensure that this test strategy is properly reflected in the SIR as described in the
following sections. Additionally, the DT and OT Test Directors must ensure that all T&E-
related input to the SIR has been peer-reviewed prior to submission (see Section 1.9).
5.5.1 SECTION L OF THE SIR: INSTRUCTIONS, CONDITIONS, AND NOTICES TO
OFFERORS
Section L of the SIR contains a description of the technical requirements that the contractor must
address within the proposal that they submit in response to the SIR. Contractor proposal
requirements that may be addressed in Section L of the SIR with respect to testing include:
a) Relevant experience of the contractor in testing similar systems
b) DT cost and schedule information
c) Test tools to be used
d) Test environment(s) proposed for formal tests
e) Integration and test management approach
f) Proposed test approaches and strategy to include expected conditions for CPR
evaluation
g) Specific operational conditions and loading that supports early evaluation of
operational capabilities during DT
h) Cost information for OT support
i) Test Configuration Management (CM) methods and practices to be used
j) OCD/OCT evaluation criteria (if required)
k) Risk management approach
5.5.2 SECTION C OF THE SIR: DESCRIPTION OF WORK REQUIREMENTS
Section C of the SIR contains the description of the work to be performed under the contract.
This section typically includes one of the following Government developed documents:
Statement of Work (SOW), Performance Work Statement (PWS) or Statement of Objectives
(SOO). The subsequent sections define each document purpose followed by a list of T&E areas
that should be considered when drafting the description of work requirements.
5.5.2.1 STATEMENT OF WORK (SOW)/PERFORMANCE WORK STATEMENT (PWS)
The SOW defines the specific tasks that the contractor must perform. The SOW should specify
in clear, understandable terms the work to be done in developing or producing the goods to be
delivered or services to be performed by the contractor. The SOW forms the basis for successful
performance by the contractor and effective administration of the contract by the government. A
well-written SOW serves as the standard for determining if the contractor meets the stated
requirements.
A PWS is a statement of work for Performance-Based Service Acquisitions (PBSAs) that
describes the required results in clear, specific, and objective terms with measurable outcomes.
The key to a PBSA is that all aspects of the acquisition are structured around the expected
outcome. This shifts the performance risk away from the government and often results in
savings by allowing the contractor to provide innovative solutions for the stated need. A PWS is
typical of a contract that engages the time and effort of a contractor whose primary purpose is to
perform an identifiable task rather than to furnish an end item.
The SOW/PWS should accomplish the following T&E goals that are common across most FAA
investment programs:
a) Describe the T&E events and activities to be accomplished by the contractor that
reflect the program T&E strategy described in the TEMP, ISPD and Integrated
Master Schedule (IMS)
b) Indicate the use of T&E processes that are critical for program success (e.g., test
program management, integrated testing, test capability accreditation, discrepancy
reporting etc.)
c) Ensure that the government has access to contractor data, test activities, and results
d) Require government review and approval of contractor deliverables such as test plans,
procedures, and reports
e) Require contractor T&E support for government-run testing (e.g., contractor personnel,
government training, meetings, DR/PTR review boards, test readiness reviews, etc.)
Refer to Appendix D, Figure D-4, for the T&E Input Review Cycle for SIR Proposal
Requirements and Statements of Work.
Note: Refer to the V&V Repository for the FAA Statement of Work (SOW) Preparation
Guide and SOW sample.
Note: Refer to the V&V Repository for the Contractor Test Cost Proposal Review
guidance.
The DT and OT Test Directors and test team members assist in developing technical evaluation
criteria that will be used to evaluate vendor proposals to the SIR. For COTS and/or NDI
equipment procurements, the FAA may institute and conduct a “try before you buy” product
review in the technical evaluation segment. This evaluation approach is conducted within the
scope of either an OCD or an OCT. For either approach, prospective equipment vendors develop
their proposals based on evaluation criteria defined within the SIR.
Test and evaluation best practices, as described in this Handbook, should be maintained and
tailored to the extent feasible to ensure the integrity of the test program and thus support the
overall objective of operational readiness. Users of this Handbook should consult AMS for
updates as FAA refines its adoption of agile acquisition methods.
6.1.2 DT REQUIREMENTS
DT verification is based on contractually-required activities that the prime contractor must
perform to demonstrate conformance to the FAA-developed system specifications. The prime
contractor may develop and maintain separate specifications that are derived from the FAA
specifications and approved by the FAA as a CDRL document. The FAA-developed system
specifications or the contractor system specification (A-Level) is the “test to” document that
drives the conduct of DT. The prime contractor and associated subcontractors may develop
subsystem specifications (B-Level) and design documents that form the basis for B-Level
verification and system level procedure development.
The DT Test Director and test team review the contractor’s test CDRL documentation and
witness all tests during DT. The DT Test Director reviews the team’s comments and
recommends Government approval or disapproval to the Contracting Officer (CO) and
Contracting Officer’s Representative (COR). The DT test team must be proficient in their
particular program domain (e.g., Communications, Navigation, Surveillance, Weather, and Air
Traffic Automation) and eventually become experts on the system under test. System expertise
is necessary to ensure that the tests performed by the system contractor are valid and
comprehensive.
The DT functional flow diagram depicted in Figure 6-2, FAA DT Process Flow, identifies the
tasks and functions that support DT from the Screening Information Request (SIR) through
delivery of the DT Final Test Report. The tasks and functions that support SAT, from generation
of the SAT Plan through the start of Field Familiarization (FF), are identified in Figure 6-3, SAT
Process Flow.
on the documents, typically 30 days. The schedule includes a time period for the contractor to
incorporate comments and redeliver the document to the FAA, typically no less than 15 days.
Prior to submitting formal draft or final CDRL documents to the FAA, the test team should
check if the contractor has conducted internal peer reviews. These reviews help ensure that the
documents fully reflect the contractor's technical and programmatic approach for satisfying the
Government requirements. The contractor must conduct draft reviews of CDRL documents with
the FAA prior to delivering the final document. This review is conducted to ensure that all FAA
comments are clear and to verify the context in which the contractor intends to incorporate
comments.
The minimum test documentation for any procurement consists of the following:
a) CMTP
b) DT VRTM (may be contained in the CMTP, DT Test Plan(s), or in a separate CDRL
document)
c) DT Test Plan(s), Test Procedures, and Test Report
d) Integration and Test Development Plan (ITDP) and Report (as required)
e) Other contractor documentation that may be required and that impacts T&E,
including:
1) CM Plan
2) Software Development Plan
3) COTS/NDI Documentation
4) Reliability, Maintainability, and Availability (RMA) Test Report(s)
6.1.3.3 CONTRACTOR TEST DOCUMENT REVIEW STANDARDS
The DT Test Director and test team review and comment on all contractor-delivered test
documents. The Test Director provides comments and recommendations (approval/disapproval)
on the test documents to the CO and COR. The following guidelines should be used in
document review:
a) Ensure that the test team is familiar with the system design, NAS integration
environments, and general NAS architecture prior to reviewing the test plans and
procedures.
b) Ensure that the contractor has developed the DT VRTM based on the FAA System
Specification in accordance with the terms of the SOW/PWS.
c) Ensure that the approved DT VRTM has been delivered prior to the FAA reviewing
the test procedures.
d) Review the documents for technical content and compliance with the applicable DID.
e) Ensure that test documents are consistent with all relevant requirements and contain
clearly defined test objectives and success criteria.
f) Ensure that all interested parties receive CDRL documents with sufficient time for a
comprehensive review.
g) Collect comments from test team document reviews. Ensure that FAA comments are
clear, detailed, specific, and based on compliance with FAA requirements.
Consolidate and submit comments by the due date.
h) Reject the deliverable if it does not meet contract requirements and standards.
Technical directions, as approved and conveyed through the Contracting Office, are
provided to the contractor to rectify the shortcomings of the deliverable.
i) Use a database, when practical, to manage comments for delivery to the contractor.
Note: Refer to the V&V Repository for the DT Comment Form template.
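Where a database is used to manage review comments (item i above), even a simple structured record supports consolidation and on-time submission. The sketch below (Python) shows one hypothetical record layout and a consolidation step; it does not represent the DT Comment Form or any FAA system, and all field names and comment text are invented for the example.

import csv
import io

# Hypothetical review-comment records collected from the test team.
comments = [
    {"document": "DT Test Plan", "section": "3.2", "reviewer": "Reviewer A",
     "comment": "Success criteria for the capacity test are not measurable.",
     "recommendation": "State the required message rate and acceptable error rate."},
    {"document": "DT Test Plan", "section": "4.1", "reviewer": "Reviewer B",
     "comment": "Test environment description does not identify required GFE.",
     "recommendation": "List GFE items and delivery dates."},
]

def consolidate(rows):
    """Write consolidated comments to CSV text for delivery to the contractor."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(consolidate(comments))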
manufacturing process used to develop the hardware needs to be qualified via testing. When a
significant quantity of hardware items is to be produced, only a representative subset of the
produced hardware is subject to test.
DT Hardware Testing focuses on the following areas:
a) Verifying that the hardware conforms to applicable specifications, is free from
manufacturing defects, and is substantially identical to qualified hardware (only for
PAT)
b) Verifying hardware-related human factors and safety requirements
c) Evaluating the manufacturing process for newly developed hardware
d) Testing of COTS/NDI products
e) Testing the hardware in the racks or assemblies to be fielded
f) Testing the interfaces between Line Replaceable Units (LRUs). This interface testing
can be conducted using special test software, operational software, or both.
g) Electrical power testing
h) Thermal testing
i) Acoustic testing
j) Electromagnetic Interference (EMI)/Electromagnetic Compatibility (EMC) testing
k) Seismic, shock, and vibration testing
6.2.1.4 FACTORY ACCEPTANCE TESTING (FAT)
FAT is performed at the subsystem or partially integrated system level to verify some system
level requirements, non-system level software requirements, and hardware subsystem
requirements. FAT may also provide for the final verification of A-level requirements that do
not require the final baseline environment. FAT is conducted in accordance with the CMTP and
the SOW/PWS. FAT may be conducted at a contractor’s facility or at the FAA WJHTC.
The following items should be considered when planning for FAT:
a) FAT is a prerequisite to DT System Testing
b) The contractor obtains Government concurrence of the success criteria
c) FAT plans, procedures, and reports are CDRL deliverables that are approved by the
FAA
d) The FAA test team will witness FAT activities
e) The test executions are conducted on a configuration-managed baseline
6.2.1.5 FUNCTIONAL QUALIFICATION TESTING (FQT)
FQT is conducted on programs that have a vendor or subcontractor under contract to the prime
contractor for delivery of a product or subsystem. FQT demonstrates capabilities to integrate
with the NAS through the use of drivers and simulators, where applicable.
Prior to DT System Testing, the prime contractor directs associate subcontractors to perform
FQTs in accordance with the CMTP, the FAA contract SOW/PWS, and the prime
contractor/subcontractor SOW/PWS. FQT may be conducted at the vendor/subcontractor
facility, contractor facility, or the FAA WJHTC. The FAA will monitor the prime contractor and
subcontractor on all FQT activities.
The following items should be considered when planning FQT:
a) The prime contractor plans and approves all FQT activities
b) All FQT Test Plans, Test Procedures, and Test Reports are to be provided to the
Government for review (including formal deliverables from subcontractors)
c) FQT verifies product specification requirements associated with the subsystem
delivered by the subcontractor
d) The prime contractor utilizes the requirements from the product specification to
address and provide traceability to the FAA system specifications
e) All pertinent test data will be collected and logged in accordance with the approved
Quality Assurance (QA) processes
f) FQT may not be subject to all of the formal FAA processes that apply to other DT
activities, since it is under the auspices of the prime contractor
6.2.1.6 DT INSTALLATION AND INTEGRATION TESTING
To ensure an efficient process, DT Installation and Integration (I&I) Testing is scheduled early in
the development of a system and made a prerequisite to DT System Testing. Early integration
testing is effective in finding low-level issues that might have been overlooked if this testing was
not conducted. For systems where the integration process is complex, specific integration
milestones are required.
The installation and integration of the hardware into laboratory environments and operational
sites to be used during system tests must be verified through DT I&I Testing prior to conducting
DT System Testing. DT I&I Testing ensures that the system is properly installed and
functioning, correctly interfaced with GFE, and ready to begin DT System Testing. It includes
hardware and NAS integration test activities. The FAA will witness all DT I&I Testing
conducted by the contractor.
6.2.1.7 DT SYSTEM TESTING
DT System Testing verifies the system’s ability to satisfy the requirements of the applicable
system specifications assigned to the contractor. The FAA will witness all DT System Testing
conducted by the contractor. System test plans, procedures, and reports are deliverables that are
approved by the FAA. System tests can include:
a) Integration and Interface Verification: The verification of subsystem integration and
NAS interfaces.
b) System Capacity and Response Time Performance Verification: The verification of
system processing time and stress thresholds.
c) Failure Mode and Failure Recovery Verification: The verification of failure mode
conditions and system recovery capabilities.
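As an illustration of how a System Capacity and Response Time Performance Verification result might be reduced against a threshold, the sketch below (Python) computes a 95th-percentile response time from sample data and compares it to an assumed requirement value. The data and the 0.5-second threshold are invented for the example and are not taken from any FAA specification.

import statistics

# Hypothetical measured response times (seconds) captured during a system test run.
measured = [0.42, 0.45, 0.39, 0.51, 0.47, 0.44, 0.60, 0.43, 0.46, 0.41]

# Assumed requirement for this example: 95th-percentile response time <= 0.5 s.
REQUIRED_P95 = 0.5

def percentile_95(samples):
    """Return the 95th percentile (last of 19 cut points from statistics.quantiles, n=20)."""
    return statistics.quantiles(samples, n=20)[-1]

p95 = percentile_95(measured)
print(f"95th percentile = {p95:.3f} s;",
      "PASS" if p95 <= REQUIRED_P95 else "FAIL")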
Prior to entering DT System Testing, a System Test Entrance Checklist is completed by the DT
Test Director as an internal FAA assessment of DT issues and test readiness. This checklist is
submitted to the TSB for their review and comment and must be endorsed by the T&E First Line
Supervisor prior to the Test Readiness Review (TRR) (see Section 6.3.2).
Note: Refer to the V&V Repository for the DT System Test Entrance Checklist template.
g) One or more “Key Sites” is/are designated as the first location(s) in the system site
deliveries. More than one Key Site may be necessary to verify system performance,
depending on the scope of the tests. Some considerations in determining a Key Site
include:
1) Amount of support available from site personnel to resolve issues found during
testing.
2) Experience of the site personnel in managing the planned testing.
3) Ability to coordinate and document facility procedures, and plan for use by
subsequent sites.
4) Location has a low risk for Air Traffic Control (ATC) safety impacts during
system integration at the site.
5) Representative field conditions and environment.
6) Available operational interfaces.
The AMS T&E Guidelines recommends that Key Site SAT be conducted following the
completion of formal OT. However, if a test program is designed to verify a significant number
of system specification requirements at the Key Site, the Key Site SAT should be planned for
completion prior to the start of formal OT.
Systems often must be adapted depending on where the system is deployed. The contractor, with
FAA approval, identifies which parameters are site-specific and ensures those parameters are
included in the software and hardware delivered to the site. Though CM is critical throughout
the test program, precautions are taken to ensure that site-specific data is correct. The contractor
briefs the site personnel on the configuration of the system prior to SAT at the TRR and Pre-Test
Briefing. These configurations are verified by the QRO and DT test team prior to formal test
conduct.
6.2.1.10 DT TEST CAPABILITY ACCREDITATION
DT test capabilities must be accredited in accordance with Section 9.1.
6.2.1.11 COORDINATION BETWEEN DT AND OT TEST DIRECTORS
The DT and the OT Test Directors coordinate on test strategies throughout the lifecycle of the
system. This coordination begins with the Integrated Test Team (ITT) when the TEMP is
developed and continues through the TWGs.
To reduce the risk of encountering major system problems during OT, the DT Test Director
should incorporate operational conditions and conduct evaluations from an operational
perspective. To accomplish this, both the DT and OT Test Directors must ensure that
operationally oriented test conditions and evaluation criteria are planned for by addressing them
appropriately in the SIR, SOW/PWS, CMTP, and system specification.
6.2.1.12 DT TRAINING AND FAMILIARIZATION
The DT Test Director is expected to be an expert in the domain of the program that person
leads. However, T&E-specific training and qualification are required for each DT Test
Director. This training ensures that the DT Test Director understands the policies and processes
related to T&E.
Note: Refer to the V&V Repository for the CMTP template and a CMTP sample.
The DT test team, in accordance with the contract, conducts an early informal review of the
CMTP with the contractor in order to discuss test strategies and issues in a collaborative
environment. This collaboration will ensure that the contractor’s and FAA’s test planning is in
agreement and that the CMTP is contractually adequate and compliant. Subsequently, the DT
test team and the TSB review the revised CMTP prior to the DT Test Director’s recommendation
for approval by the CO. There may be several iterations of drafts as the CMTP is worked toward
acceptance by the government. The TSB can consult with the DT Test Team to determine
whether TSB review will be required for all iterations.
Refer to Appendix D, Figure D-5, for the complete CMTP review and approval cycle.
For planning purposes, the CMTP can be required as early as Contract Award to as late as 30
days after the Critical Design Review (CDR). Generally, if the product is COTS or an NDI, the
CMTP would be required sooner (i.e., towards Contract Award) rather than later. For
development products, general guidance would be based on the maturity level of the product or
the technology. For example, a development program using a proven technology may require
the CMTP to be delivered at the System Requirements Review (SRR) or no later than the
Preliminary Design Review (PDR). Conversely, a development project with an experimental
technology or other high risk may require a draft CMTP to be delivered at PDR and a final
CMTP at CDR. To summarize, using sound engineering judgment based on the product’s
maturity level is the best guidance for required delivery of the CMTP.
6.2.3 DT VERIFICATION REQUIREMENTS TRACEABILITY MATRIX (VRTM)
The DT VRTM summarizes how each requirement is verified. It includes:
a) specific requirement identifier
b) short description of the requirement
c) type of verification method that the contractor performs to show that the requirement
is satisfied (Demonstration, Test, Analysis, or Inspection, as defined in Section
5.2.3.5)
d) specific DT activity
e) unique test identifier during which the requirement is verified.
The initial DT VRTM is developed based on system level requirements contained in the FAA
System Specification in accordance with the terms of the SOW/PWS. Lower level DT VRTMs
are established and documented during the planning for hardware and software testing.
The baseline DT VRTM is contained in an appendix to the approved CMTP. Subsequent
changes to the DT VRTM do not require a resubmission of the entire CMTP. Instead, the
contractor can deliver formal updates via a revised CMTP appendix as a contract correspondence
that requires FAA approval. Additionally, during DT Test Plan development, the contractor will
refine the DT VRTM with test case information and will include the updated DT VRTM as part
of or an attachment to the DT Test Plan(s).
In addition to test case mapping, the DT VRTM includes success criteria, which describe the
condition needed to determine if the requirement is met. Success criteria are used to measure a
system task’s accomplishment and/or system characteristics, as well as measures of operational
capabilities in terms of the system’s ability to perform its mission in an operational environment.
Success criteria are developed for each contract specification requirement (per the CMTP DID)
and are generated by the contractor through a detailed analysis of requirements and provided to
the FAA for concurrence prior to the start of each final test execution. Once FAA concurrence is
provided, the contract specification requirements are assessed against the established success
criteria during DT. Any changes to the established, agreed-upon success criteria normally
require agreement in writing by the FAA and contractor, with updates incorporated into the DT
VRTM.
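Note: The sketch below is provided for illustration only; it shows one possible way to represent a DT VRTM entry programmatically, using the content items and success criteria described above. The field names and example values are hypothetical and do not constitute a required schema.

```python
# Illustrative sketch of a DT VRTM entry; field names and example values are hypothetical.
from dataclasses import dataclass
from enum import Enum

class VerificationMethod(Enum):
    DEMONSTRATION = "Demonstration"
    TEST = "Test"
    ANALYSIS = "Analysis"
    INSPECTION = "Inspection"

@dataclass
class VrtmEntry:
    requirement_id: str            # specific requirement identifier
    description: str               # short description of the requirement
    method: VerificationMethod     # verification method performed by the contractor
    dt_activity: str               # specific DT activity
    test_id: str                   # unique test identifier
    success_criteria: str = ""     # condition used to determine if the requirement is met
    test_cases: tuple = ()         # test case mapping added during DT Test Plan development

example = VrtmEntry(
    requirement_id="SSS-3.2.1.4",  # hypothetical identifier
    description="Display of track altitude",
    method=VerificationMethod.DEMONSTRATION,
    dt_activity="Factory Acceptance Test",
    test_id="FAT-017",
    success_criteria="Altitude displayed within specified tolerance of input value",
)
```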
d) Plans and itemized lists for required GFE, Government Furnished Information (GFI),
and Government Furnished Property (GFP)
e) Program Trouble Reporting and corrective action process (see Section 6.3.7)
f) Configuration Management
g) DT VRTM
h) Accreditation plans for test capabilities
Note: Refer to the V&V Repository for the DT Test Plan template and a DT Test Plan
sample.
Similar to the review and approval process described in Section 6.2.2 for the CMTP, the DT test
team conducts an early informal review of the DT Test Plan with the contractor in order to
discuss test strategies and issues in a collaborative environment. This collaboration will ensure
that the contractor’s and FAA’s test planning is in agreement. Subsequently, the DT test team
and the TSB review the revised DT Test Plan prior to the DT Test Director’s recommendation
for approval by the CO.
Refer to Appendix D, Figure D-6, for the complete DT Test Plan review and approval cycle.
6.2.6 DT TEST PROCEDURES
The DT Test Procedures are developed by the contractor based on their respective DT Test
Plan(s). The Test Procedures include all of the details for conducting a particular test to verify
requirements as specified in the respective Test Plan(s). These details include:
a) Tables of step-by-step instructions to run the test
b) Observations to be made during the test
c) Expected results, including success criteria
d) Objectives
e) Test limitations
f) GFE, GFI, and GFP required for the specific test
g) Notations of the requirements being tested by a particular test and step
h) Data collection, reduction, and analysis required
i) Test tools and equipment required
j) Configuration of the system under test and the test environment
The DT test team reviews the DT Test Procedures and provides comments to the contractor for
disposition. Once the dispositions of comments are agreed to by both the FAA and the
contractor, the DT Test Director recommends approval or disapproval of the document to the CO
and COR. The test procedures are then executed via dry run testing (see Section 6.3.1) prior to
formal test conduct.
Refer to Appendix D, Figure D-7, for the complete DT Test Procedures review and approval
cycle.
Note: Refer to the V&V Repository for the DT Test Procedures template and a DT Test
Procedures sample.
6.2.7 DT ENTRANCE CRITERIA
Prior to entering any formal DT activity, the DT Test Director ensures that the following
minimum DT entrance criteria are met:
a) All entrance criteria as defined in the approved CMTP and DT Test Plan(s) are
satisfied
b) DT Test Procedures have been submitted, reviewed, and approved
c) The test configuration is known and documented
d) Dry run testing has been completed by the contractor
e) The configuration under test does not have known deficiencies that affect the
functions to be verified by the test
f) If required, test capability accreditation has been conducted and approved
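Note: As an illustration only, the minimum entrance criteria above can be treated as a simple checklist, as in the sketch below. The criterion wording and function name are assumptions for the example; actual entrance criteria are defined in the approved CMTP and DT Test Plan(s).

```python
# Illustrative checklist sketch for the minimum DT entrance criteria listed above.
DT_ENTRANCE_CRITERIA = [
    "CMTP and DT Test Plan entrance criteria satisfied",
    "DT Test Procedures submitted, reviewed, and approved",
    "Test configuration known and documented",
    "Contractor dry run testing completed",
    "No known deficiencies affecting the functions to be verified",
    "Test capability accreditation conducted and approved (if required)",
]

def ready_for_formal_dt(status: dict) -> bool:
    """Return True only if every minimum entrance criterion is marked satisfied."""
    unmet = [c for c in DT_ENTRANCE_CRITERIA if not status.get(c, False)]
    for criterion in unmet:
        print(f"Entrance criterion not met: {criterion}")
    return not unmet
```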
6.2.8 DT EXIT CRITERIA
Prior to exiting any formal DT activity, the DT Test Director ensures that the following
minimum DT exit criteria are met:
a) All exit criteria as defined in the approved CMTP and DT Test Plan(s) are satisfied
b) Completion of all DT in accordance with approved Test Plans and Test Procedures
c) All Post-Test Reviews are complete
d) Test results are documented and accepted by the FAA
e) All DT Program Trouble Reports (PTRs) are fully documented, assessed, and status
reported
f) All contractor performance requirements are addressed in accordance with the
contract (may include PTR priority stipulations, or requirement pass-rate percentages)
6.3 DT TEST CONDUCT
For all DT, the DT Test Director ensures that the contractor performs testing in accordance with
the SOW/PWS and approved Test Plans and Procedures. The DT Test Director also ensures that
the DT test team is prepared to witness the tests. The contractor conducts debug and dry run
testing followed by a TRR prior to each formal DT activity. The DT Test Director ensures that
the minimum DT entrance criteria as defined in Section 6.2.7 have been met.
The DT Test Director and/or the Test Lead must report and record the status of all test activities.
Following each formal DT activity, the DT Test Director ensures that the minimum DT exit
criteria as defined in Section 6.2.8 have been satisfied.
6.3.1 DT DEBUG AND DRY RUN TESTING
Prior to formal DT, debug and dry run testing of the procedures must be performed. In debug
testing, the test procedures are executed against the system under test to ensure that the test
steps are correct, complete, and produce repeatable expected results. During this testing, the
procedures are refined and updated to provide a logical flow to the sequence of test steps.
Dry runs are a complete end-to-end execution of the DT Test Procedures using formal test
configurations and accredited scenarios, simulations, and/or test tools to assess the following
criteria:
a) The laboratory environment is prepared
b) The system has been properly installed and configured
c) The system has the correct versions of both software and adaptation available, and all
system parameters that need to be set have been identified
d) Procedures are mature and redline text changes are fully incorporated (i.e., test
procedures can be run start to finish, as documented, without major deviations or
anomalies)
Dry runs are executed by the contractor and witnessed by the DT test team for each test prior to
entering formal DT, with dry run test results presented at the DT TRR. For each witnessed dry
run, a Test Status Report is prepared by the DT Test Lead (see Section 6.3.6). If a dry run is
executed as if it were a formal test run, in accordance with test plans and procedures, and
completes without any significant problems, the contractor may request that the FAA waive a
second execution. If the FAA concurs, the dry run execution is accepted as a formal test run. In
these cases, a Pre-Test Briefing and a Post-Test Review are conducted to capture all test
execution details.
6.3.2 DT TEST READINESS REVIEW
The TRR is presented by the contractor to the FAA as required by the SOW/PWS. The
objective of the TRR is to establish officially that the contractor is prepared and ready to start
formal testing of the system and is not entering it prematurely. The TRR covers the following
items:
a) Overview of testing to be performed
b) Status of contractor development and integration milestones and checkpoints
c) Status of all applicable test documentation
d) Identification of required Government and contractor resources and personnel
e) Configuration control and accreditation of test tools and test items (both hardware and
software) and any other items necessary for the successful conduct of testing,
including Data Reduction and Analysis (DR&A) tools and equipment
f) Prior test results, including those from any dry runs of tests
g) Summary of all PTRs, including status of relevant hardware, firmware, and software
problems
h) Review of test baseline, configuration, and environment
i) System CM status
j) GFE, GFI, and GFP status (if applicable)
k) Traceability between requirements and their associated tests using the DT VRTM
l) Test schedules
m) Test entrance and exit criteria
A draft TRR briefing package must be provided to the DT Test Director at a contractually
specified date prior to the planned TRR meeting date. Successful completion of the TRR
establishes the readiness for testing. The DT Test Director advises the COR on whether or not
DT may commence. A copy of the TRR briefing package must be included with the associated
DT Test Report.
6.3.3 DT PRE-TEST BRIEFINGS
Prior to each test, the contractor conducts a Pre-Test Briefing to ensure readiness to begin the
respective test. The Pre-Test Briefing covers the following items:
a) Test objectives and success criteria in accordance with the approved Test Plan and
Test Procedures
b) Proposed procedure changes (redlines)
c) Test configuration definition (e.g., hardware, adaptation, software version levels,
patches, configuration files), including any GFE
d) Test personnel assignments and Government and contractor resources
e) Test conduct walkthrough
f) Results of the CM audit
g) Test limitations
h) Review of known system anomalies that might impact testing
i) Planned deviations
j) DR&A methods
k) Results of any dependent testing that demonstrate readiness for test conduct
The DT Test Director and contractor test manager (or their designees) review and provide
signature approval of all planned deviations presented at the Pre-Test Briefing.
6.3.4 FORMAL DT TEST EXECUTION
The standard process items for formal test execution include:
a) The DT Test Director may delegate signature approval to the appropriate DT test
team member. The signature approval grants those test team members the authority
to initial changes to the formal Pre-Test Briefing and Post-Test Review packages and
redlines during formal test conduct.
b) The DT test team witnesses the formal runs of all tests. Copies of procedures are
provided to the FAA personnel witnessing formal test runs.
c) Proper test configuration is verified prior to starting formal test (including hardware,
software, firmware, adaptation parameters, test equipment, and test tools).
d) A Test Status Report is prepared for each formal test run (see Section 6.3.6).
e) The DT test team witnesses deviations and changes to the Test Procedures.
f) The DT test team ensures that anomalies are properly documented (at the time of
occurrence) in the test log and that the test log is signed at the completion of testing.
g) The contractor provides the “as-run procedures” (i.e., with mark-ups) and test log
informally after test completion.
h) The contractor performs a walkthrough review of the DR&A results of the test data
with the DT test team to verify requirements.
6.3.5 DT POST-TEST REVIEWS
The contractor conducts a Post-Test Review with the FAA for each test performed to confirm
test results and completion. The Post-Test Review consists of:
a) Overall test results, including a summary of all associated requirements
b) Status of test objectives and exit criteria as specified in the associated Test Plan and
Test Procedures
c) Test conduct details (e.g., start date and time, stop date and time, etc.)
d) Any test configuration changes since the Pre-Test Briefing
e) All problems encountered, including where and how they are documented
f) Descriptions of all deviations and anomalies encountered
g) Test procedure changes
h) Details on any failed steps and requirements
i) Review of DR&A results and walkthrough findings
j) Regression test recommendations
k) Documentation of the outstanding test issues with action plans for closure
6.3.6 DT TEST STATUS REPORTS
The DT Test Director (or designee) prepares a Test Status Report for each dry run and formal
test run. The Test Status Report is distributed to the DT test team and entered into the test status
database (which is managed by the test team). The test status database is used to document all
information pertaining to a given test execution for a particular DT activity, including: test
name, FAA and prime contractor participants, facility name where the test is conducted, software
version, lab configurations, summary and specific details of the test, issues/concerns, and the
next test activities.
Note: Refer to the V&V Repository for the Test Status Report template.
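Note: For illustration only, the information elements listed above could be captured in a record structure such as the sketch below. The field names are assumptions, not a mandated design for the test status database.

```python
# Illustrative sketch of a test status database record; field names are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestStatusRecord:
    test_name: str
    participants: List[str]        # FAA and prime contractor participants
    facility: str                  # facility where the test is conducted
    software_version: str
    lab_configuration: str
    summary: str                   # summary and specific details of the test
    issues_concerns: List[str] = field(default_factory=list)
    next_activities: List[str] = field(default_factory=list)
```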
reporting of all test problems. The contractor uses the database for tracking problems associated
with any system, equipment, software, or firmware that has been placed under formal
configuration control. The FAA reviews the overall design of the database to ensure that it
provides the operational use and functionality required by the Government. The contractor will
provide Government personnel with access to the database and will furnish reports at the
Government's request.
PTRs entered into the database should be prioritized by the contractor (with review and
concurrence by the FAA) according to the following definitions:
Priority 1 is assigned to a problem that prevents the accomplishment of an operational,
mission-critical or mission-essential capability, or jeopardizes safety, security or other
requirements designated as Critical Performance Requirements (CPRs).
Priority 2 is assigned to a problem that adversely affects the accomplishment of an
operational, mission-critical or mission-essential capability and no work-around solution
is known.
Priority 3 is assigned to a problem that adversely affects the accomplishment of an
operational, mission-critical or mission-essential capability, but a work-around solution is
known.
Priority 4 is assigned to a problem that results in a user/operator inconvenience or
annoyance, but does not affect a required operational, mission-critical or mission-
essential capability.
Priority 5 is assigned to any other problem/defect not described above (e.g., system
documentation errors).
These definitions are also applicable to Discrepancy Reports (DRs). DRs may be generated prior
to OT for all issues discovered that impact operational requirements but are not being addressed
and need to be evaluated during OT (see Section 7.6.4).
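Note: The priority definitions above can be read as simple decision logic, illustrated by the sketch below. The boolean inputs are assumptions used only to express the logic; actual prioritization is performed by the contractor with FAA review and concurrence.

```python
# Illustrative sketch of the PTR/DR priority definitions above; inputs are assumptions.
def ptr_priority(prevents_capability: bool,
                 adversely_affects_capability: bool,
                 workaround_known: bool,
                 user_inconvenience_only: bool) -> int:
    if prevents_capability:            # includes jeopardy to safety, security, or other CPRs
        return 1
    if adversely_affects_capability:
        return 3 if workaround_known else 2
    if user_inconvenience_only:
        return 4
    return 5                           # any other problem/defect (e.g., documentation errors)
```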
The contractor problem-reporting system must be defined in the T&E activities section of the
Test Plan. The contractor submits the planned corrective action for each problem and identifies
the proposed regression testing or future modification(s) to the testing program required to
validate the successful corrective action. If a component fails during testing, the contractor must
perform failure analysis to identify the cause of the failure. Failed steps, with or without
associated problems, will be explained to the satisfaction of the Government. All anomalies will
be jointly analyzed by the contractor and the Government to determine a recovery plan.
The contractor is responsible for any corrective actions necessary to ensure full specification
compliance. The contractor completes repairs or procedural changes prior to submission for
regression testing. GFP-induced anomalies will be identified to determine Government
responsibilities for corrective actions. GFP anomalies do not relieve the contractor from
compliance with specification requirements.
6.3.8 DT REGRESSION TESTING
The contractor conducts regression tests when changes have been made to the hardware or
software for which the Government has previously accepted test results. The contractor
recommends and briefs the Government on the level of regression testing as part of the
corrective action. The DT Test Director will determine the extent of regression testing required.
Regression testing will not be started by the contractor until receiving Government concurrence
to proceed with the regression test. In addition, regression testing and analysis will ensure that
the fix did not cause a problem elsewhere in the system.
The contractor will determine and fully document the cause of the noncompliance in the
problem-reporting database for any failed test and provide a written notification to the
Government. The contractor will conduct the regression testing using the Government-approved
test plans and test procedures. The Government reserves the right to witness all regression
testing.
6.4 DT TEST REPORT
The DT Test Report(s) addresses test results, including test conduct, data collected, the DR&A
process, and conclusions to be drawn from the test data. The FAA utilizes the results contained
in the test report to verify that a contractor has furnished a product that conforms to all contract
requirements for acceptance. A DT Test Report includes the following:
a) A copy of the approved TRR briefing package
b) All approved deviations and waivers generated as a result of testing
c) As-run test procedures with test logs and witness signatures
d) A test summary providing the status of each requirement tested
e) A DT VRTM showing requirement verification status
f) All test results, including data analysis (graphs, pictures, etc.)
g) Evaluation of system deficiencies and performance based on the testing results
h) Pre-Test Briefing and Post-Test Review packages
i) Complete identification of the test configuration, including hardware, software,
firmware, adaptation parameters, test equipment, and test tools
The DT Test Report also includes separate sections for test activities and results from Security
and Safety requirements verification, as defined in the contract.
The objectives of the DT Test Report are to provide essential information in support of decision-
making, assessing technical and investment risks, and verifying the attainment of technical
performance specifications and objectives documented in the DT Test Plan.
Upon receipt from the contractor, the DT Test Director forwards the DT Test Report to the DT
test team and the TSB for review and comment. Subsequently, the DT test team and the TSB
review the revised DT Test Report prior to the DT Test Director’s recommendation for approval
by the CO.
Refer to Appendix D, Figure D-8, for the complete DT Test Report review and approval cycle.
Note: Refer to the V&V Repository for the DT Test Report template and a DT Test Report
sample.
For programs using Agile development methods (see Section 6.1.1), there may be incremental
deployments to the field. For every increment that will be put into operational use, the
Government must verify and validate that the system is operationally ready.
Note: Refer to the V&V Repository for the OT Test Plan template and OT Test Plan
samples.
Note: Refer to the V&V Repository for the Request for Waiver form.
should participate in the dry run process to ensure that the test environment, procedures, and
system are ready for formal testing.
7.5.1 OT TEST PROCEDURE DEVELOPMENT
OT test procedure development consists of the following three stages:
a) Procedure Development: The formulation of procedures that take into account
operational conditions, measures, user stimuli, and scenarios that address the OT
requirements identified in the OT VRTM. This step must also associate and
incorporate the MOEs, MOSs, and MOPs. It may also include system checkout
activities needed to support procedure development. Evaluating specific functionality
requires that the procedures contain step-by-step detailed instructions to ensure test
objectives are met and the intended data is collected. To evaluate operational
capabilities, high-level nonspecific test procedures are used to allow the participant to
realistically use the system and respond to events.
b) Procedure Debugging: The running of the test procedures against the system under
test to ensure that the test steps are correct and complete. During this stage, the
procedures are refined and updated to provide a logical flow to the sequence of test
steps.
c) Dry Runs: A complete end-to-end execution of the OT Test Procedures using formal
test configurations to assess the following criteria:
1) The laboratory environment is prepared
2) The system has been properly installed and configured
3) The system has the correct versions of both software and adaptation available, and
all system parameters that need to be set have been identified
4) Any new or modified test scenarios, simulations, or test tools are valid
5) Procedures are mature and redline text changes are fully incorporated (i.e., test
procedures can be run start to finish, as documented, without major deviations or
anomalies)
Note: Refer to the V&V Repository for the OT Test Procedures template and an OT Test
Procedures sample.
c) Support the assessment of the capabilities of the system to support operational tasks.
d) Be in the form of numerical rating scales, or “yes” or “no” answers to questions. Test
subjects may provide subjective elaborations, or explanations on specific questions, to
support their responses.
Appropriate union approval from the National Air Traffic Controllers Association (NATCA),
National Association of Air Traffic Specialists (NAATS), or Professional Airways System
Specialists (PASS) is required before a questionnaire may be used. Approval may involve
significant lead time.
Note: Refer to the V&V Repository for the OT Questionnaire guidance and OT
Questionnaire samples.
Note: Refer to the V&V Repository for the OT Test Procedures Status Matrix and Test
Status Report templates.
2) Coordinates all tasks required for test conduct. This includes laboratory
coordination and scheduling, data collection activities, Pre-Test Briefings and
Post-Test Reviews, observers, simulation support, and interface support.
3) Maintains all test schedules.
4) Establishes procedural checklists.
5) Conducts preliminary OT reviews of all scheduled activities with team members
and the Facility Manager.
6) Ensures that the Facility Manager is informed on all planned test activities.
7) Ensures that the Facility Manager coordinates planned test activities with adjacent
facilities.
8) Ensures that the Facility Manager coordinates and schedules required live testing
procedures, interfaces, and events.
7.6.2 OT TEST READINESS REVIEW
The OT TRR is conducted by the OT test team prior to the start of the formal OT activity. This
review verifies and approves the readiness to start formal OT and ensures participants understand
the outstanding issues. The TRR also communicates risks and limitations for formal OT. The
TRR includes:
a) Review of OT entrance criteria as documented in the OT Test Plan
b) Discussion and documentation of impacts if entry criteria are not fully met
c) OT approach, test objectives, and test structure used to determine operational
readiness
d) Test schedule
e) Hardware and software versions and configurations
f) CM process and test data management
g) Test team roles and responsibilities
h) Approval of all OT Test Plans and Test Procedures by the appropriate authorities
i) Successful completion of all OT procedure checkouts and dry runs
j) DR process
k) Review of known problems and workarounds (identifying test impacts and the
criticality of operational issues)
l) Documentation of test limitations, risks, and deviations from the approved OT Test
Plan
m) Availability of required, system-trained test personnel to support testing
n) Availability of current technical and user documentation during testing
A draft TRR package is provided to the TSB for review at least five days prior to the TRR. The
TSB may participate in the TRR event.
7.6.3 FORMAL OT CONDUCT
Formal OT conduct consists of Pre-Test Briefings, test executions, Post-Test Reviews, and status
reporting.
7.6.3.1 OT PRE-TEST BRIEFINGS
The OT Pre-Test Briefing is presented by the OT Test Lead and consists of the approach for each
test to be conducted. It includes a review of:
a) Test objectives and success criteria
b) Test configurations (hardware, adaptation, software level, patches, test tool
certification, and configuration files)
c) Test personnel assignments
d) Test conduct walkthrough
e) Proposed procedural changes or planned deviations
f) Test limitations
g) The status of all known problems and anomalies that may impact the test or its results
h) Methods of data collection
7.6.3.2 OT EXECUTION
OT case executions are the culmination of OT planning efforts resulting in the complete and
official running of the OT Test Procedures. These test cases are the primary data source for OT
requirements validation.
During formal OT, the Test Lead manages the test execution, maintains a log of all system state
changes, and records information concerning any laboratory or system issue that results in a
deviation from, or impact on, the test. Such incidents include, but are not necessarily limited to,
system reboots and hard failures. Test observer logs are used to capture issues that impact
operational effectiveness or suitability. If questionnaires are being used to capture qualitative
and quantitative data, they must be completed and collected immediately following the test
execution. The Test Lead also maintains a list of DRs that identify all issues encountered during
testing (see Section 7.6.4).
7.6.3.3 OT POST-TEST REVIEWS AND TEST STATUS REPORTING
The OT Test Director should hold reviews at appropriate intervals during a test activity, such as:
a) Reviews of test case results at the end of each day
b) Reviews at the end of a test-case group
c) Reviews at the end of a test activity
d) Final Post-Test Review
These reviews are all held prior to the OT Caucus. The main focus is to get field personnel to
clarify any issues that are not fully documented and to hold initial discussions on any impacts on
operational effectiveness or suitability. The development engineers responsible for
troubleshooting issues should attend to ensure that they fully understand the issues. The OT Test
Director or Test Lead generates briefing minutes and assigns action items.
The reviews may include:
a) Review of test results, including a summary of verified requirements
b) The status of test objectives and success criteria
c) Review of issues in the respective logs of the OT Test Lead and observers
d) Review of problems for accuracy, completeness, and recommended dispositions
e) Review of blue-line deviations from the test procedures (changes made to test
procedures during formal test conduct)
f) Recommendations for retesting or regression testing
Test status reports are generated following the completion of the above reviews. These reports
summarize the results of testing, highlight significant findings, and provide an assessment of
whether the objectives were met. They also record DRs and comments made against the system
documentation. The contents of the OT Post-Test Reviews and Test Status Reports are used to
support development of the OT Quicklook Test Report and Final Test Report.
Note: Refer to the V&V Repository for the Test Status Report template.
Note: Refer to the V&V Repository for the Discrepancy Report form.
The OT Test Director and test team must conduct DR reviews on a regular basis to ensure
accurate descriptions, determine validity, disposition priorities, and assess proposed resolutions.
To facilitate efficient tracking of issues discovered during OT, the OT Test Director maintains a
DR database. DRs are tracked until dispositions are documented for each and are closed out
when the OT Test Director and the relevant stakeholders are satisfied that the issue has been
resolved or mitigated. All DRs that require corrective action by the contractor are raised to PTR
status. DRs that have been raised to PTRs are not closed out as DRs until corrective action has
been implemented by the contractor and verified by the FAA.
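Note: The sketch below illustrates, for explanatory purposes only, the DR tracking and escalation rules described above; the class, state, and method names are assumptions and do not represent a required tool or database design.

```python
# Illustrative sketch of DR tracking and escalation to PTR status; names are assumptions.
from enum import Enum, auto

class DrState(Enum):
    OPEN = auto()
    RAISED_TO_PTR = auto()   # corrective action required from the contractor
    CLOSED = auto()

class DiscrepancyReport:
    def __init__(self, dr_id: str, description: str):
        self.dr_id = dr_id
        self.description = description
        self.state = DrState.OPEN

    def raise_to_ptr(self) -> None:
        self.state = DrState.RAISED_TO_PTR

    def close(self, corrective_action_verified: bool = False) -> None:
        # A DR raised to PTR status is not closed until the contractor's corrective
        # action has been implemented and verified by the FAA.
        if self.state is DrState.RAISED_TO_PTR and not corrective_action_verified:
            raise ValueError("Corrective action not yet verified by the FAA")
        self.state = DrState.CLOSED
```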
7.6.5 OT CAUCUS
The OT Caucus, chaired by the OT Test Director, is conducted to present and review all findings
and test results related to operational effectiveness and suitability of the system. The focus of the
caucus is to gain concurrence on the operational impact and criticality of all DRs and PTRs. At a
minimum, participants include representatives from the OT test team and the field team(s).
The OT Caucus is held after the completion of formal testing. In preparation for the caucus, all
OT DRs and PTRs are entered into an OT Problem Traceability Matrix. At the caucus, the
matrix is reviewed to assess the impact of DRs and PTRs on COIs, MOEs, MOSs, MOPs, and
CPRs, and to determine their operational criticality. All DRs that require corrective action by the
contractor are raised to PTR status. PTRs are assigned initial priorities to assist with resolution
plans. At the caucus, each COI must be answered based on the criticality of the problem(s)
impacting it. Figure 7-2 shows the process flow for determining the degree to which COIs and
program requirements are met.
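Note: For illustration only, the sketch below shows one way the impact of DRs and PTRs traced to a COI could be rolled up into the Yes/Ltd/No status described in the Glossary. The impact labels are assumptions; the actual determination is made by the caucus participants.

```python
# Illustrative roll-up of DR/PTR impacts into a COI status (Yes / Ltd / No).
# The 'impact' labels are assumptions made for the example.
def coi_status(issues: list) -> str:
    """issues: dicts with an 'impact' key of 'major', 'limiting', or 'none'
    for each DR/PTR traced to the COI through its MOPs, MOEs, and MOSs."""
    impacts = {issue.get("impact", "none") for issue in issues}
    if "major" in impacts:
        return "No"    # significant issues prevent the COI from being met
    if "limiting" in impacts:
        return "Ltd"   # less significant issues limit, but do not prevent, the COI
    return "Yes"       # no issues impacting the COI; fully met under all test cases
```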
The following standard practices and activities must be considered for the conduct of the OT
Caucus:
a) The assessment of COIs must focus on operational impacts:
1) Ensure that adverse conditions are clearly linked to operational issues.
2) Clearly identify the specific operational functions, procedures, or services that are
interrupted, delayed, or degraded.
3) Indicate suitable workarounds, or if workarounds are not feasible.
b) Every issue must be well supported by references to the established CPRs and COIs.
The DRs are related to COIs through their constituent MOPs, MOEs and MOSs in the
OT Problem Traceability Matrix. The matrix shows whether each COI is fully met,
or not met due to significant issues. It can also indicate when a COI is limited, due to
less significant issues, but not enough to prevent the system from being operationally
effective or suitable. (See Glossary for COI status definitions of Yes, Ltd (Limited),
and No).
c) In addition to issues specifically impacting COIs, discuss and document general
issues and concerns that the OT participants found to affect operational suitability and
effectiveness. Ensure that these issues and concerns have sufficient justification to be
documented as an operational impact (i.e. impact or degradation to operational
functions or services).
d) The caucus should not make programmatic decisions (such as milestone, deployment,
scheduling, or staffing decisions).
1) The OT Caucus should only provide supporting information for programmatic
decisions.
2) The caucus can address site-unique considerations that support site deployment
decisions or key site planning.
e) The OT Caucus should review the established OT Exit Criteria, as indicated in
Section 7.4.4, and report status of any outstanding items.
f) Ensure that the caucus activities provide sufficient time for stakeholders from
different functional areas to collaborate on operational issues, to provide a
comprehensive and holistic assessment of operational impacts.
An OT Caucus Summary Report is produced to document the status of each COI, MOE, MOS,
MOP, and CPR for the system under test. This report supports resolution of critical PTRs prior
to operational deployment. This report and the PTR resolution plans provide the framework for
any regression testing that may be necessary to resolve outstanding critical PTRs. The Caucus
Summary Report is the artifact that provides the data that feed into the Quicklook and Final
Test Reports.
Note: Refer to the V&V Repository for the OT Problem Traceability Matrix and OT
Caucus Summary Report templates.
use. Regression testing may require the participation of the OT field team(s). Regression testing
must:
a) Ensure the functionality of corrective actions
b) Demonstrate that the corrective actions do not impact other system functionality
c) Re-run OT test procedures that were impacted by the problem
d) Re-run OT test procedures that may be impacted by the fix
e) Follow the same processes required for formal OT
f) Be documented in an OT Quicklook Test Report or the OT Final Test Report
7.7 OT TEST REPORTING
OT test reporting consists of the following reports:
a) Interim Assessment Report(s)
b) Quicklook Test Report(s)
c) Final Test Report
7.7.1 OT INTERIM ASSESSMENT REPORT
The OT IAR is an optional reporting mechanism that will provide management with an
assessment of the current state and maturity of the system by identifying system capabilities and
limitations as tested for that reporting period. OT IARs are developed following specific
milestones, whether issues exist or not, as defined in the TEMP. The OT IAR must be provided
to the TSB and ITT for their review and comment.
The OT IAR will provide sufficient data to support resolution plans and program decisions.
Additionally, the OT IAR will assist in the planning for future test activities and support
planning for system implementation and deployment. Specifically, the OT IAR:
a) Provides the status of critical performance criteria defined in the TEMP
b) Analyzes issues based on their impact on COIs and CPRs
c) Provides early operational reporting for:
1) DT
2) Pre-formal OT conduct
3) Formal OT conduct
d) Highlights critical system issues which may impact the following operational
milestones:
1) Initial Operational Capability (IOC)
2) ISD
3) Operational Readiness Dates at field sites
e) Provides support for programmatic decision-making (including scheduling, test
planning, site deployment, and site acceptance testing)
consult the additional guidance in the OT Quicklook Test Report Template, which includes
endorsement by the Test Standards Board (TSB) and signature of the WJHTC Director.
Refer to Appendix D, Figure D-13, for the complete OT Quicklook Test Report review and
approval cycle.
Note: Refer to the V&V Repository for the OT Quicklook Test Report template and an OT
Quicklook Test Report sample.
Example 2: “The OT test team found the system Operationally Ready because all COIs
were satisfactorily answered and CPRs were successfully met. All issues found during
testing were rated Priority 3 or below.”
For large or high-risk test programs, the OT Test Director must conduct an OT Final Test Report
Out-Brief to the ITT prior to delivering a draft Final Test Report for review. Additionally, the
OT Test Director must ensure that the report has been peer-reviewed (see Section 1.9) prior to
submission to management for review/endorsement/approval. The OT Final Test Report
requires:
a) Endorsement of the TSB, the T&E First Line Supervisor, and the respective T&E
Senior Manager
b) Signature of the OT Test Director
c) Approval of the Technical Center Director
The Technical Center Director will approve the final version of the report based on earlier
endorsements and approvals, and an OT Final Test Report Out-Brief by the OT Test Director (as
required).
The approved OT Final Test Report should be delivered within 60 calendar days from the
completion of the test. The approved OT Final Test Report is delivered to the TSB and the
Program Manager. The AMS requires the OT Final Test Report as part of the entrance criteria
for the ISD. The Technical Center Director will provide an OT Final Test Report Out-Brief, as
required, to the Assistant Administrator for NextGen (ANG-1) prior to the ISD milestone.
Refer to Appendix D, Figure D-14, for the complete OT Final Test Report review and approval
cycle.
Note: Refer to the V&V Repository for the OT Final Test Report and OT Final Test
Report Out-Brief templates and samples.
facility to gain confidence in the system and attain a higher level of hands-on familiarization.
The level and type of FF support are determined through coordination with the PMO, AT site
representatives, and Technical Operations site representatives to identify specific areas that
require T&E capabilities. Once the test organization’s role is determined, an FF Support Plan is
developed that details the activities. The FF Support Plan should clearly identify who generates
and maintains the FF artifacts such as daily status reports and site final report. The plan must be
provided to the TSB for their review and comment prior to final delivery. FF support consists of
the following activities:
a) Conduct initial planning meetings with operational and in-service management
organizations to gain insight into operational concerns
b) Meet with field sites to identify the desired types of faults for system familiarization
c) Perform an assessment of familiarization needs for the site and define roles for the FF
support team
d) Develop a support schedule and status tracking matrix
e) Develop a resource allocation matrix
f) Assess engineering resources required for test case development and support
g) Define FF support team training requirements
h) Define FF test tools as required
i) Develop and document test case procedures
j) Dry run test cases
k) Develop schedules to track all FF-related activities
l) Distribute the FF Support Plan to the PMO and the site(s)
Refer to Appendix D, Figure D-15, for the complete Field Familiarization Support Plan review
and approval cycle.
Note: Refer to the V&V Repository for the FF Support Plan template.
e) Develop and maintain FF database as central repository for FF-related documents and
data storage
Note: Refer to the V&V Repository for FF Test Procedures and FF Test Case Format
samples.
Note: Refer to the V&V Repository for FF Report and FF Status Tracking Matrix
samples.
Note: Refer to the V&V Repository for the ISM TEMP, OT Test Plan and OT Final Test
Report templates.
Note: Refer to the V&V Repository for the Special Support Activities PDD.
will be driven by the required test capability performance, functionality, and fidelity necessary to
support fulfillment of the objectives of a test activity or an individual test case. Examples of the
test capabilities typically accredited and types of accreditation are as follows:
a) Testbeds – A testbed is a stand-alone or distributed environment created for testing
purposes. It may consist of a combination of specialized hardware, software, and real or
simulated environmental conditions. Verification of the testbed includes ensuring
that interfaces and components of the testbed function as designed and are free of
defects. Validation of the testbed involves ensuring that the environment and
associated stimulus provide sufficient representation of the conditions required for the
test objectives.
b) Simulated environments include:
1) Simulation Files (Scenarios/Scripts) – Simulation files or software scenarios are
used to automate procedures, system load, specific complex situations, etc.
Verification of simulations ensures that the simulation performs as scripted and is
reliable, repeatable, and free of defects. Simulations are validated to ensure that
they are realistic, comprehensive for the operational environment being simulated,
and sufficient for the intended use.
2) Simulated Interfaces – One or more interfaces are typically simulated when live
data or other system interfaces are not available or practical. Verification of the
simulated interface includes ensuring that it conforms to the Interface Control
Document (or other interface requirements). Validation of the interface should
ensure that all the appropriate messages, data, timing, and content are sufficiently
emulated by the simulated interface.
c) Instrumentation and Test Tools – Instrumentation and test tools are equipment items
that include data collection and analysis tools, drivers, oscilloscopes, meters,
analyzers, and probes. Instrumentation and test tools may be COTS or developmental
items. Verification of the instrumentation and test tools ensures that they are
performing as designed. Validation of the instrumentation and test tools ensures that
they meet the needs of the test. Calibration may be performed to accomplish
the accreditation. COTS items may use the vendor artifacts and documentation for
verification and validation.
d) Modeling – Modeling is a physical, mathematical, or otherwise logical representation
of a system, entity, or process. Typically, modeling is used to test and evaluate future
real-world capabilities without actually implementing them. Verification of a model
ensures that the model is fully functional, reliable, and accurately
processing/reporting data. Validation of a model focuses on ensuring that the model
encompasses all relevant variables, is representative of intended environments, and
sufficiently emulates the real-world capabilities that are identified in the test case.
9.1.3 ACCREDITATION PROCESS
Test capabilities that require accreditation are identified during test design and are documented
in the TEMP and test plans. Accreditation Plans and Procedures are developed in parallel with
the test plans. This provides the opportunity to explore alternative test approaches or
modifications to the overall test process. Additionally, Accreditation Plans and Procedures may
be either standalone documents referenced in the test plans, or they may be included as part of
the test plans. The prime contractor (for DT activities) or the OT Test Director (for OT
activities) is responsible for the development of the respective Accreditation Plan(s). For DT,
the DT test team conducts an early informal review of the plan(s) to ensure that they are in
agreement with the contractor’s accreditation strategy and to provide any other initial comments.
Subsequently, the DT test team and the TSB review the revised plan(s) for technical sufficiency
prior to the DT Test Director’s recommendation of approval by the Contracting Officer (CO).
For OT, the TSB endorses the Accreditation Plan(s) prior to final approval by the respective
T&E Senior Manager.
The FAA will witness all test capability accreditation activities conducted by the prime
contractor for DT, and will conduct the test capability accreditation activities for OT. The
accreditation analysis and results from these activities should be completed and approved no
later than 30 days before the test activity, or commencement of the simulation effort, requiring
the use of the test capability(ies). If this is not possible, the prime contractor (for DT) or OT Test
Director (for OT) may propose an alternative completion date. The proposal should contain
pertinent rationale and identify any associated risks.
Results of the accreditation process are described in the Accreditation Report. This report
provides information on the risks associated with using the test capability, and recommendations
on whether to proceed. For DT, the prime contractor prepares the Accreditation Report and the
DT test team conducts an early informal review of the report to ensure that they are in agreement
with the contractor’s accreditation results and to provide any other initial comments.
Subsequently, the DT test team and the TSB review the revised report prior to the DT Test
Director’s approval recommendation to the CO and COR. For OT, the OT Test Director
prepares the Accreditation Report, the TSB reviews the report, and the T&E First-Line
Supervisor endorses the report prior to final approval by the respective T&E Senior Manager.
Configuration Management (CM) of test capabilities and archiving of all Accreditation Plans,
Procedures, and Reports is the responsibility of the prime contractor (for DT) or the OT Test
Director (for OT).
Refer to Appendix D, Figure D-9, for the complete DT Accreditation Plan and Accreditation
Report review and approval cycle, and Figure D-16 for the complete OT Accreditation Plan and
Accreditation Report review and approval cycle. For OT Accreditation Procedures refer to
Figure D-11, OT Test Procedures Review and Approval Cycle, which also applies to
Accreditation Procedures.
9.1.4 ACCREDITATION GUIDANCE
Accreditation activities for new test capabilities generally consist of the following steps:
a) Describe the function and features of the test capability being accredited as well as
the inputs and outputs.
b) Determine the intended use of the capability during testing.
c) Determine the test capability accreditation criteria. This may include documents such
as the user manual, specification and design document or may include subject matter
expertise.
d) Using known inputs, perform accreditation testing to exercise the capability, verifying
and validating the functions and features according to the intended use for the test.
e) Compare the outputs of the accreditation testing against the accreditation criteria for
the test capability and document the results.
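Note: The sketch below illustrates step e) above, comparing accreditation test outputs against the documented accreditation criteria. The criterion names, values, and exact-match comparison are assumptions made for the example.

```python
# Illustrative comparison of accreditation test outputs against accreditation criteria.
def compare_against_criteria(outputs: dict, criteria: dict) -> dict:
    """Return an expected/observed/pass record for each accreditation criterion."""
    results = {}
    for name, expected in criteria.items():
        observed = outputs.get(name)
        results[name] = {"expected": expected, "observed": observed, "pass": observed == expected}
    return results

# Hypothetical example: a simulated interface's message rate and format
criteria = {"message_rate_hz": 1, "message_format": "per ICD"}
outputs = {"message_rate_hz": 1, "message_format": "per ICD"}
assert all(r["pass"] for r in compare_against_criteria(outputs, criteria).values())
```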
For established capabilities used in past testing or previously accredited capabilities, an
approved documented artifact (e.g., a prior Accreditation Report, test logs, as-run authenticated
test procedures, test data, etc.) is required in lieu of accreditation. However, if the test capability or
environment has changed, then the test capability requires re-accreditation in the new test
environment.
The Accreditation Plan must describe the methods used to verify and validate the test capability
and must also identify any supporting artifacts. Once a test capability has been accredited, it
must be placed under CM in accordance with the CM Plan and monitored to ensure that the
accreditation standards are maintained. The Accreditation Plan must document the conditions
under which the test capability will require a re-accreditation process. Examples of such
conditions include changes to the test capability algorithms, inputs or outputs.
Note: Refer to the V&V Repository for the Test Capability Accreditation template and the
CM PDD.
and preparation is necessary. In addition to the information referenced on the NCP, the case file
documentation for a test modification must include:
a) The general method for accomplishing the modification, including:
1) A description of system or system modification to be tested
2) A description of connections and interfacing systems
3) Descriptive diagrams as required
4) An installation plan
b) The applicable PRD or requirement statement(s)
c) A test plan and specific test procedures, including:
1) Test objectives
2) Evaluation plan
3) Test steps
4) Expected or desired results
5) Exit criteria
6) Anticipated duration of test
d) A description of the removal plan
e) An estimate of associated costs
f) A complete schedule of all planned tasks and activities
10.2 GLOSSARY
Adaptation Unique site-dependent data/functions required by the
operational program to provide the flexible capability necessary
for individual site performance determined during
implementation.
A-Level Specification Another name for the System Specification Document (SSD).
The SSD must show traceability to the Product Requirements
Document (PRD).
Blue-line changes Changes made to test procedures during formal test conduct.
COI status is assessed as: Yes – COI/CPR fully met under all test cases (No significant
issues impacting COI/CPR)
Ltd (Limited) – COI/CPR could not be met under all test
cases/conditions; ability to meet COI/CPR is limited (One or
more significant issues impacting COI/CPR)
No – COI/CPR not met (Serious shortfalls affecting COI/CPR)
Enterprise Level Capabilities NAS Requirements that may involve more than one system to
implement. Changes may be required in multiple FAA systems,
ATC procedures and avionics systems developed by industry.
Verification and Validation of Enterprise Level capabilities may
require multiple systems to reach specified states of
development, and may be performed by a dedicated Enterprise
Level Capability Test Team.
Field Familiarization Tests conducted at each site by Air Traffic and Technical
Operations personnel to verify that the site is ready to switch
over to the new system.
Initial Operational Capability (IOC) IOC is the declaration by site personnel that the system is
ready for conditional operational use in the NAS and denotes
the end of field familiarization at that site.
Operational Effectiveness The degree to which a product accomplishes its mission when
used by representative personnel in the expected operational
environment.
Operational Suitability The degree to which a product intended for field use satisfies its
availability, compatibility, transportability, interoperability,
reliability, maintainability, safety, human factors, logistics
supportability, documentation, personnel, and training
requirements.
Red-line changes Changes made to test procedures during dry run conduct.
Test Activity A category of test hierarchy between Test Phase and Test Case
Group, with an identifiable title and reporting requirements.
Test Case Group A collection of test cases linked by a common purpose, such as
to verify a specific set of product requirements. A group may
consist of a similar set of test cases, performed under a range of
differing conditions. Identified by paragraph number in a Test
Procedures document.
Test Phase Highest level subdivision in a test program (e.g., T&E Program
Planning, DT, OT)
Test processes A general term for methods that may be used or procedures
which may be followed in order to conduct test.
Test program All of the identified activities that are required to perform V&V
of a NAS System or a NAS Enterprise Level Capability.
Test Steps A subset of a test case that directs test personnel to perform
actions, and document responses.
Test Tools Automated HW/SW support equipment that allows the re-
verification of existing baseline performance and the T&E of
new functions/fixes. The thoroughness of the tools and the
amount of automation of the tools directly affect the level of
verification that can be done in a reasonable timeframe.
_______________________________ ____________
DT Test Director
_______________________________ ____________
OT Test Director
_______________________________ ____________
TSB Representative
Note 1: Initial – Checklist will be filled out upon notification/assignment to a new T&E project, w/anticipated action tailoring included; Dated and signed
Revision – Checklist will be updated each time a changed, newly anticipated or actual but not previously anticipated tailored action is required;
Dated and signed for each revision
Checklist Complete – Checklist will be finalized when the associated program phase is complete and all actions have been addressed; Dated and signed
Note 2: Tailoring – Identify any required change(s) to the Action, and provide justification for each specific change (Sec. 1.6 of this T&E Handbook can be used to provide
justifications related to tailoring the T&E process); Leave blank if no tailoring is required
Note 3: Target Date – Provide at least the month and year of the anticipated completion of the Action; Revise only when a major program schedule change has occurred; Leave
blank if the Action already occurred prior to the completion of the Initial version of the checklist
Note 4: Completion Date – Provide the actual completion date of the Action; If the Action was completed prior to the completion of the Initial version of the checklist, provide at
least the year of the Action completion or leave blank if unknown; If the Action is completed after the completion of the Initial version of the checklist, provide the full
date of the Action completion (month/day/year); For Actions that have multiple items to complete, provide completion dates for each item and explain in the
Status/Comments column
Note 5: Status/Comments – Provide any supporting information for the Action; if the Action was completed prior to the completion of the Initial version of the checklist and the
actual date of completion is unknown, enter the words “Action occurred prior to the implementation of this T&E process” in this column
Note 6: Each signed and dated version (Initial, Revisions, and Checklist Complete) of this checklist will be maintained by the DT Test Director in either a hard copy or in an
electronic copy with a scanned signature page. Each “working copy” of this checklist can be maintained electronically on the DT Test Director’s PC. All signed and
dated versions of this checklist will be maintained for a period of two (2) years beyond the completion of the OT Checklist and in accordance with the ANG-E Document
Management and Control (DMC) Process Description Document (PDD)
_______________________________ ____________
T&E First Line Supervisor(s)
_______________________________ ____________
DT Test Director
_______________________________ ____________
TSB Representative
Note 1: Initial – Checklist will be filled out upon notification/assignment to a new T&E project, w/anticipated action tailoring included; Dated and signed
Revision – Checklist will be updated each time a changed, newly anticipated or actual but not previously anticipated tailored action is required; Dated and signed for each
revision
Checklist Complete – Checklist will be finalized when the associated program phase is complete and all actions have been addressed; Dated and signed
DT11 (Sections 6.2.1.10, 9.1): Witness accreditation of DT test capabilities – Tailoring: N/A
Note 2: Tailoring – Identify any required change(s) to the Action, and provide justification for each specific change (Sec. 1.6 of this T&E Handbook can be used to provide
justifications related to tailoring the T&E process); Leave blank if no tailoring is required
Note 3: Target Date – Provide at least the month and year of the anticipated completion of the Action; Revise only when a major program schedule change has occurred; Leave
blank if the Action already occurred prior to the completion of the Initial version of the checklist
Intentionally Blank
_______________________________ ____________
T&E First Line Supervisor(s)
_______________________________ ____________
OT Test Director
_______________________________ ____________
TSB Representative
Note 1: Initial – Checklist will be filled out upon notification/assignment to a new T&E project, w/anticipated action tailoring included; Dated and signed
Revision – Checklist will be updated each time a changed, newly anticipated or actual but not previously anticipated tailored action is required; Dated and signed for each
revision
Checklist Complete – Checklist will be finalized when the associated program phase is complete and all actions have been addressed; Dated and signed
Note 2: Tailoring – Identify any required change(s) to the Action, and provide justification for each specific change (Sec. 1.6 of this T&E Handbook can be used to provide
justifications related to tailoring the T&E process); Leave blank if no tailoring is required
Note 3: Target Date – Provide at least the month and year of the anticipated completion of the Action; Revise only when a major program schedule change has occurred; Leave
blank if the Action already occurred prior to the completion of the Initial version of the checklist
Note 4: Completion Date – Provide the actual completion date of the Action; If the Action was completed prior to the completion of the Initial version of the checklist, provide at
least the year of the Action completion or leave blank if unknown; If the Action is completed after the completion of the Initial version of the checklist, provide the full
date of the Action completion (month/day/year); For Actions that have multiple items to complete, provide completion dates for each item and explain in the
Status/Comments column
Note 5: Status/Comments – Provide any supporting information for the Action; if the Action was completed prior to the completion of the Initial version of the checklist and the
actual date of completion is unknown, enter the words “Action occurred prior to the implementation of this T&E process” in this column
Note 6: Each signed and dated version (Initial, Revisions, and Checklist Complete) of this checklist will be maintained by the OT Test Director in either hard copy or
electronic copy with a scanned signature page. Each “working copy” of this checklist can be maintained electronically on the OT Test Director’s PC. All signed and
dated versions of this checklist will be maintained for a period of two (2) years beyond the completion of the OT Checklist and in accordance with the ANG-E Document
Management and Control (DMC) Process Description Document (PDD)
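The version and retention rules described in Notes 1 through 6 can also be summarized in a small model. The following Python sketch is illustrative only and is not part of the Handbook or the checklist; the names used here (VersionType, ChecklistVersion, RETENTION_YEARS, retention_end) and the example dates are assumptions chosen for this illustration.

    import datetime
    from dataclasses import dataclass
    from enum import Enum

    class VersionType(Enum):
        INITIAL = "Initial"              # filled out upon assignment to a new T&E project (Note 1)
        REVISION = "Revision"            # updated for each changed or newly identified tailored action (Note 1)
        COMPLETE = "Checklist Complete"  # finalized when the program phase is complete (Note 1)

    @dataclass
    class ChecklistVersion:
        version_type: VersionType
        signed_date: datetime.date       # each version is dated and signed
        signatories: tuple = ("T&E First Line Supervisor(s)", "OT Test Director", "TSB Representative")

    RETENTION_YEARS = 2  # signed versions are kept two years beyond checklist completion (Note 6)

    def retention_end(completion_date: datetime.date) -> datetime.date:
        # Approximates the end of the Note 6 retention period.
        return completion_date.replace(year=completion_date.year + RETENTION_YEARS)

    # Hypothetical history: an Initial version, one Revision, and the Checklist Complete version.
    history = [
        ChecklistVersion(VersionType.INITIAL, datetime.date(2016, 3, 1)),
        ChecklistVersion(VersionType.REVISION, datetime.date(2016, 9, 15)),
        ChecklistVersion(VersionType.COMPLETE, datetime.date(2017, 5, 16)),
    ]
    print(retention_end(history[-1].signed_date))  # date through which all signed versions are retained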
2. An early review is recommended to identify major work product deficiencies. This is typically accomplished by the project’s TSB POC and one other TSB member. It is
recommended that the TSB Early Review start in parallel with the Peer Review but be completed prior to Comment Workoff and Tech Edit.
3. The completed review results in a “tabling” meeting in which the TSB determines the level of conformance to established standards and establishes its position on whether
the work product is endorsable. Consult the TSB Work Product Review and Endorsement Guide for the number of reviewers to include, commensurate with the priority level
of the program.
4. Endorsement is defined as a recommendation for approval or disapproval of a work product with supporting comments. Endorsement can be accomplished via email,
endorsement letter, or a written signature on the signature page of the document by the endorser(s).
The following pages contain diagrams of the process flow cycles for reviewing and approving
the test work products identified in the Handbook. The processes in the diagrams may be
tailored to meet program needs. A process may be restarted if major revisions are required based
on reviews or major program changes.
The following terminology is provided to aid in understanding the flow diagrams:
o Signature: On specified documents, the responsible author(s) of the document (normally the Test Director(s)) must sign the signature page.
o Review: A Review is defined as an assessment of a draft or final draft document to
provide comments and input. A review results in the delivery of a revised draft or a final
document. A Peer Review is a structured type of review that involves a methodical
examination of a completed draft document. Peer Reviews are conducted in accordance
with the Peer Review PDD to ensure the quality of the work product, and are performed
by unbiased subject matter experts who are independent of the development and approval
of the document. Peer reviewers use knowledge, experience, established standards, and
known best practices to provide editorial and technical comments.
o Endorsement: Endorsement is defined as a recommendation for approval or disapproval
of a work product with supporting comments. Endorsement can be accomplished via
email, endorsement letter, or a written signature on the signature page of the document by
the endorser(s). Note: The TSB endorses work products via TSB Endorsement Position
Papers.
o Approval: For DT work products, approval is defined as the DT Test Director providing
an approval recommendation to the Contracting Officer after all appropriate Government
authorities have reviewed and endorsed the document. For OT work products, approval is
defined as the designated authority’s written signature on the signature page of the
document, following his or her review and approval of the document.
The following symbology definitions describe the connector lines between the blocks in the flow
diagrams (a notional sketch of a generic cycle follows the list):
o Solid line: Mandatory path for activity to follow
o Dashed line: Recommended path for activity to follow
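For readers who find it helpful to see these conventions expressed in another form, the sketch below models a generic work product review and approval cycle in Python. It is a notional illustration only, not part of the Handbook: the step names approximate the activities discussed above (Peer Review, TSB Early Review, Comment Workoff and Tech Edit, TSB endorsement, signature, and approval), and the EdgeType values mirror the solid and dashed connector lines.

    from enum import Enum

    class EdgeType(Enum):
        MANDATORY = "solid line"      # mandatory path for activity to follow
        RECOMMENDED = "dashed line"   # recommended path for activity to follow

    # A generic cycle expressed as (from_step, to_step, edge_type) tuples. Step names are
    # illustrative; the authoritative steps for each work product appear in the flow diagrams.
    CYCLE = [
        ("Draft Document", "Peer Review", EdgeType.MANDATORY),
        ("Draft Document", "TSB Early Review", EdgeType.RECOMMENDED),
        ("Peer Review", "Comment Workoff and Tech Edit", EdgeType.MANDATORY),
        ("Comment Workoff and Tech Edit", "TSB Review and Endorsement", EdgeType.MANDATORY),
        ("TSB Review and Endorsement", "Signature by Test Director(s)", EdgeType.MANDATORY),
        ("Signature by Test Director(s)", "Approval by Designated Authority", EdgeType.MANDATORY),
    ]

    def mandatory_path(cycle):
        # Collect, in listed order, only the steps joined by solid (mandatory) lines.
        steps = []
        for src, dst, edge in cycle:
            if edge is EdgeType.MANDATORY:
                if not steps:
                    steps.append(src)
                steps.append(dst)
        return steps

    print(" -> ".join(mandatory_path(CYCLE)))

Running the sketch prints the mandatory sequence from draft through approval; the recommended TSB Early Review edge is represented but not required, consistent with the dashed-line convention.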
Figure D-3. pTEMP, iTEMP & fTEMP Review & Approval Cycle