Non-Functional Test Plan Template
Lifecycle phases: Plan, Requirements, Analysis, Design, Build, Test, Train/Deploy, Maintenance
TO THE DOCUMENT OWNER: This template is provided as a guideline and resource. The structure and instructions
give detail about what might go into a completed document. Only you and your team, however, know what will
best fit the needs of your specific effort, and you are encouraged to adapt the template as appropriate to meet
those needs.
Executive Summary
Summarize the approach, process, assumptions, dependencies, and risks.
The goal of the non-functional testing is to confirm that the <Project> system is robust, secure, scalable,
and usable. The Non-Functional Test Plan includes the tests, both automated and manual, to be
conducted by the project team to confirm that all non-functional requirements have been met and all
pertinent university policies are observed.
Test Approach
Different projects will require different types of non-functional testing. Describe the appropriate types for
your project, as selected in the Comprehensive Test Plan, and describe the test scenarios, scripts, or use
cases for each type of test, as well as the expected outcome for each type of testing. Include information
about which tests will be automated versus manual.
Performance
It is anticipated that this section is needed to detail what was introduced in the Comprehensive Test Plan.
Performance testing is conducted to verify that the system meets expected targets under normal
conditions. Describe how testing will be performed. Define/list performance objectives and
requirements, including the following when applicable:
- Latency (the time it takes data to travel from source to destination)
- Number of clients
- Client request frequency
- Client request arrival rate
- Acceptable response time
- Acceptable throughput
- Acceptable memory utilization
- Acceptable input/output rates
- Response time versus number of concurrent users
- Percentage of requested static pages that must meet the acceptable response time
- Percentage of requested scripts that must meet the acceptable response time
- Baseline multiplier (2X, 4X, …) that the system must be capable of handling
- Stubs to create
- Metrics to collect (e.g., hits per second, concurrent connections)
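The response-time and throughput metrics listed above can be gathered with a small measurement harness. The sketch below is a minimal, hypothetical example: the `make_request` stub stands in for a real client call and would be replaced with an actual request against the system under test.

```python
import time
import statistics

def make_request():
    """Stub standing in for a real client request; replace with an HTTP call."""
    time.sleep(0.001)  # simulate ~1 ms of server work

def measure_latency(request_fn, samples=100):
    """Time repeated requests and return latency metrics in milliseconds."""
    latencies = []
    for _ in range(samples):
        start = time.perf_counter()
        request_fn()
        latencies.append((time.perf_counter() - start) * 1000.0)
    percentiles = statistics.quantiles(latencies, n=100)  # 99 cut points
    return {
        "mean_ms": statistics.mean(latencies),
        "p95_ms": percentiles[94],  # 95th-percentile response time
        "max_ms": max(latencies),
        "throughput_rps": samples / (sum(latencies) / 1000.0),
    }

metrics = measure_latency(make_request)
print(metrics)
```

Percentile targets (such as "95% of static pages must meet the acceptable response time") map directly onto the reported p95 value.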
Describe how testing will be performed. Define/list load objectives and requirements, including the
following when applicable:
- Peak ratio that the system must be capable of handling
- Graceful degradation
- Load while maintaining acceptable response times
- Critical points
- Network bandwidth
- Message queue buffer size
- Interactions among components
- Interactions with databases
- Many users performing the same action
- Many users performing different actions
- Longevity (e.g., run appropriate tests in a loop, or take the database offline and restart it)
Security
Use this section if detail in addition to the Comprehensive Test Plan is needed.
User Acceptance
It is anticipated that this section is needed to detail what was introduced in the Comprehensive Test Plan.
User Acceptance testing is to be conducted by end-users who will execute pre-defined test scripts. It is
intended to show that the needs of the business are satisfied and to instill confidence in the user set.
Users may also perform additional tests not detailed in the plan, provided they are relevant and within
the scope of the project. Identify participants in the activity, hardware to be used during the test, how
defects will be tracked, any user training that is necessary, and any required test data.
Other Testing
Refer to the Comprehensive Test Plan and expand this section as necessary to appropriately describe
other testing efforts. If Recovery and Error testing has not been covered in the functional testing of the
application, include that here. This could include verifying behavior in the case of a power outage.
If Usability Testing needs to be detailed, include interfaces to be considered. Much of this testing may be
included in the User Acceptance testing. If so, document that here.
Test Process
Identify the methods and criteria used in performing test activities. Define the specific methods and
procedures for each type of test. Define the detailed criteria for evaluating test results. Also, the “order of
events” should be documented. If Regression testing is done at the end of each phase, then document it
in this paragraph. Are there any dependencies or other elements that will impact the order in which the
tests will be performed? Include skill sets required by the test team.
Test Prep
Identify the set of tasks necessary to prepare for and perform testing activities. Identify any inter-task
dependencies and any specific skills required. Include the build of test scaffolding, test data creation, and
test case automation.
Test Deliverables
Identify the deliverable documents from the test process. This includes Jira issues, test execution logs,
and test summary reports. Identify the location and availability of each of these.
Environment Requirements
Specify both the necessary and desired properties of the test environment, including the physical
characteristics, communications, mode of usage, and testing supplies. Also provide the levels of security
required to perform test activities. Identify special test tools needed and other testing needs.
Software
Identify the software requirements needed to complete test activities. This includes operating systems
and versions, browsers and versions, and applications and versions.
Test Deliverables
Identify artifacts coming out of this test, including test scripts, updated test plans, test execution results,
and issues logged.
Tools
Identify any testing tools or infrastructure that may need to be developed. Identify the software tools
employed, including use of each tool. Include defect tracking tools and any required configurations.
Documentation
Identify the documents required to support testing activities.
Assumptions
Identify the assumptions associated with this plan, such as “The above environment will be complete.”
Dependencies
Identify the dependencies, such as resource availability, hardware availability, code delivery, and
dependencies on external teams.
Risks
Identify the risks associated with this plan, specifying mitigation and/or contingency plans.
A test case defines a set of conditions to be verified and has an expected and repeatable outcome.
Documentation of the test case should also include specific set-up requirements and other entry
conditions. A test script is an automated, semi-automated, or manual set of steps used to implement a
test case. These scripts are used to determine if a function meets expected behavior and delivers an
expected result. There is a 1-1 mapping of a test case to a test script. Some examples are provided
below.
Non-Functional Area | Test Case Name | Goal | Assumptions | Input | Output
System/Integration | SYS-25 | Test interface between A & B | A and B are functioning | valid user id and password | data requested is returned and formatted correctly
User Acceptance | UAT-3 | Complete a form and logoff | form does not already exist | valid user id and password | completed, saved form
Documentation | UAT-3 | Test fill-a-form documentation | documentation is complete | valid user id and password | user completed, saved form
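The 1-1 mapping between a test case and its test script can be made concrete in an automation harness. The sketch below is a hypothetical structure, not part of the template itself: it pairs a test-case record modeled on SYS-25 with the script function that executes it, and the `script_sys_25` body is a stand-in for real calls to systems A and B.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """A test case: conditions to verify, with an expected, repeatable outcome."""
    name: str
    goal: str
    assumptions: str
    expected: str

def script_sys_25(credentials):
    """Script for SYS-25: hypothetical interface check between systems A and B."""
    # A real script would send a request to system A and validate B's response.
    response = {"user": credentials["user"], "status": "ok"}
    return response["status"] == "ok"

case = TestCase(
    name="SYS-25",
    goal="Test interface between A & B",
    assumptions="A and B are functioning",
    expected="data requested is returned and formatted correctly",
)
scripts = {case.name: script_sys_25}  # 1-1 mapping: test case -> test script
passed = scripts[case.name]({"user": "tester", "password": "secret"})
print(case.name, "passed" if passed else "failed")
```

Keeping the mapping in a dictionary keyed by test-case name makes it easy to verify that every documented case has exactly one executable script.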
Revision History
Identify document changes.