PMO TCoE UA Test Plan MCTRA 3.0 Pricing SAMPLE


Department of Health and Human Services (SCDHHS) Project

Management Office (PMO)

User Acceptance Test Plan

MCTRA 3.0
User Acceptance Test Plan

Table of Contents

Section Heading Page number


Revision History............................................................................................................................3
Approvals......................................................................................................................................3
Definitions.....................................................................................................................................4
Reference Documents....................................................................................................................4
1. Document Purpose.....................................................................................................................5
2. Project Overview.......................................................................................................................5
3. Scope.........................................................................................................................................5
3.1 In Scope Requirements....................................................................................................5
3.2 Out of Scope Requirements.............................................................................................5
4. User Acceptance Test (UAT) Objectives..................................................................................6
5. Assumptions/Dependencies/Issues/Risks..................................................................................6
5.1 Assumptions.....................................................................................................................6
5.2 Dependencies...................................................................................................................6
5.3 Constraints.......................................................................................................................6
5.4 Risks.................................................................................................................................6
6. User Acceptance Test (UAT) Phase..........................................................................................7
6.1 Test Planning/Preparation................................................................................................7
6.2 Test Cases and Traceability.............................................................................................7
6.3 Test Execution/Management/Reporting..........................................................................7
6.4 Test Closure Tasks...........................................................................................................8
7. UAT Environments Requirements............................................................................................8
8. UA Test Data.............................................................................................................................9
9. UAT Deliverables......................................................................................................................9
10. UAT Schedule.........................................................................................................................9
11. Roles and Responsibilities.......................................................................................................9
12. UAT Team.............................................................................................................................10
13. UAT Defects..........................................................................................................................10
13.1 UAT Defect Tracking..................................................................................................10
13.2 UAT Defect Severity and Priority Standards...............................................................10
14. Defect escalation procedure..................................................................................................12
15. Integration and Intersystem Interfaces..................................................................................12
16. Suspension and Resumption Criteria.....................................................................................13
16.1 Suspension Criteria......................................................................................................13
16.2 Resumption Criteria.....................................................................................................13
17. Communication and Escalation.............................................................................................13

Page 2 of 14
628626592.doc_date
User Acceptance Test Plan

Revision History
Version No. Date Revised By Description of Change
0.1 12/19/2017 Rajesh Kadarkarai Initial Draft
1.0 12/21/2017 Rajesh Kadarkarai Updated Review comments

Approvals
The undersigned acknowledge that they have reviewed the User Acceptance Test Plan and agree with the
information presented within this document. Changes to this plan will be coordinated with, and
approved by, the undersigned or their designated representatives. The Project Sponsor will be
notified when approvals occur.

Signature: Date: 12/21/2017


Print Name: Rajesh Kadarkarai
Title: TCoE Lead
Role: Test Lead

Signature: Date: 12/21/2017
Print Name: Raja Gampa
Title: Performance Manager
Role: Program Director

Signature: Date: 12/21/2017
Print Name: Mark Summers
Title: TCoE Manager
Role: Test Manager

Signature: Date: 12/21/2017
Print Name: Sreelekha Vuppalancha
Title: TCoE Lead
Role: PMO TCOE Auditor


Definitions
Acronym or Term | Definition
MCTRA | The Medicaid Clinical Translation, Review and Approval (MCTRA) System is used to store, update, and export all medical code sets into MMIS.
HCPCS/CPT | The Healthcare Common Procedure Coding System (HCPCS) is a set of health care procedure codes based on the American Medical Association's Current Procedural Terminology (CPT).
BPM | Business Process Management
ICD-10 | The International Classification of Diseases, Tenth Revision
RBRVS | Resource-Based Relative Value System
PMPM | Per Member, Per Month

Reference Documents
Document | Repository Path
Requirement Stories from JIRA | JIRA Stories
MCTRA BRD | MCTRA BRD
FRS | MCTRA-FRS


1. Document Purpose
The purpose of this document is to outline the User Acceptance Testing (UAT) process for the MCTRA 3.0
Pricing. Project Sponsors from all participating departments are intended to review this document.
Approval of this document implies that reviewers are confident that, following the execution of the test
plan, the resulting system will be considered fully tested and eligible for implementation.

UAT is to be completed by the business departments (the UAT Team) that will be utilizing the software
and/or by support departments. The testing is conducted to enable users to validate that the software
meets the agreed-upon acceptance criteria.

2. Project Overview
This project provides the capability to create Change Requests for HCPCS/CPT codes and route them to
approvers. It eliminates the manual effort of updating code details in MMIS and keeps both
systems in sync.

3. Scope

3.1 In Scope Requirements


<List the BRD and FRS requirements and the corresponding UAT test case numbers in tabular format>

Ref ID | Functionality
1 | Develop a hierarchy for applying reimbursement rates based on approved rules
2 | Establish a process for updating and modifying rules as needed, to include ancillary reimbursement, budgets, and actuary as needed
3 | Define the agency approval and review process

3.2 Out of Scope Requirements


<List the BRD and FRS requirements that are out of scope for UAT testing>

Ref ID | Functionality
1 | Anything not mentioned in the In Scope section


4. User Acceptance Test (UAT) Objectives


User Acceptance Testing is conducted to ensure that the system satisfies the needs of the business as
specified in the functional requirements and provides confidence in its use. Modifications to the
aforementioned requirements will be captured and tested to the highest level of quality allowed within
the project timeline.
Testing also aims to identify and expose defects and associated risks, communicate all known issues to
the project team, and ensure that all issues are addressed appropriately prior to implementation.

5. Assumptions/Dependencies/Issues/Risks
This section captures Test Assumptions, Dependencies and Constraints specific to User Acceptance Test
(UAT) which are known at this time.

5.1 Assumptions
1) Business Requirements/Software System Requirement Specifications are clear, concise, and able
to be translated into test cases.
2) Any approved PCRs that the QA Team has not had a chance to estimate will not be included in
testing until they have been estimated, planned, and approved.
3) All impacted application(s)/system(s) and their respective interfaces will be tested at least once
during the testing phase's lifecycle.
4) All necessary development will be complete in time to start testing.
5) JIRA/Adaptavist will be used as the test management tool. All test cases, test results, and defects
will be available in JIRA at: Project MCTRA (MCTRA).
6) All team members will have access to JIRA/Adaptavist.

5.2 Dependencies
1) All SDLC artifacts are complete and signed off
2) Test resource availability syncs with the project schedule
3) All test scripts are uploaded to Adaptavist prior to the commencement of UAT execution
4) The test environments are available and connectivity has been established between all the
interfaces identified on this project
5) All necessary access is provided for the UAT Team
6) Availability of test cases and specific test data according to the requirements
7) Changes in scope or redesign will require a project change request to be submitted and approved

5.3 Constraints
1) Any unidentified or future changes or inclusions that may adversely affect the test schedule
2) Any technology 'freeze' periods
3) Resource contention and availability of Business, IT, and external SMEs throughout all work streams
due to current allocation on other projects
4) Timely resolution of issues and key decisions

5.4 Risks

This section lists all potential test related risks known at this time, the proposed mitigation and
contingency measures to be adopted by the UAT Team.

When conducting risk analysis, two components should be considered:


 The probability that the negative event will occur.
 The potential impact or loss associated with the event.

Refer to the Project Risk Log for the full inventory of project related risks.

Ref ID | Risk | Probability (H/M/L) | Impact (H/M/L) | Mitigation | Contingency
1 | Availability of data for validation | L | H | N/A | Extend the testing timeline
2 | Availability of SME | L | H | A backup plan needs to be in place for each SME | The project plan needs to be aligned based on SME availability

6. User Acceptance Test (UAT) Phase

6.1 Test Planning/Preparation


Test planning and preparation involves ensuring the required framework is in place to support test
execution activities. The aim of testing is to ensure that the system matches the requirements specified
and that the probability of the occurrence of mistakes endangering real-time operation does not exceed
the acceptable level.
The following Test Planning/Preparation activities must be completed prior to initiation of User Acceptance
Test (UAT) execution activities:

6.2 Test Cases and Traceability


Test cases contain a detailed step-by-step breakdown of each test to be performed by the UA tester.
Each script contains: test case number, product, test description, requirement number, requestor, tester,
action to be performed, test data to be utilized, expected results, error descriptions (if applicable),
pass/fail results, date tested, and any additional comments from the UA tester.
Location of Test Cases: MCTRA-UA Test Cases

Location of Traceability: MCTRA RTM

6.3 Test Execution/Management/Reporting

A. Test Execution
Test execution begins when the UAT Plan has been completed and signed off, a complete set of test
cases has been written covering all of the functional specifications and applicable non-functional
specifications, and the test environment is available. Test execution consists of running the test cases
according to the test plan: for each test case, follow the test steps described in the test case and
validate the expected results against the actual results. If the expected results for all steps of the test
case are achieved, the test passes; otherwise, the test case fails. Any failure is documented as a defect,
with accompanying screenshots or other attachments that will help reproduce the defect.
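The pass/fail rule described above can be sketched in a few lines. This is an illustrative sketch only; the tuple-based step representation is an assumption for the example, not the Adaptavist data model:

```python
def run_test_case(steps):
    """Evaluate one test case, where each step is an (expected, actual)
    result pair. The case passes only when every step's actual result
    matches its expected result; otherwise it fails, and the failing
    step numbers are returned so they can be logged in a defect along
    with supporting screenshots or attachments."""
    failures = [i for i, (expected, actual) in enumerate(steps, start=1)
                if expected != actual]
    return ("Pass", []) if not failures else ("Fail", failures)
```

For example, `run_test_case([("rate saved", "rate saved")])` reports a pass, while any single mismatched step fails the whole case.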

B. Entry/Exit Criteria
Entry Criteria
 The application works functionally as defined in the specifications
 No outstanding "Critical" or "High" defects
 All identified QA test cases have been executed with a pass rate of 98%
 Any open defects from QA have a resolution plan
 Testing has started on all areas, unless pre-agreed by the UAT stakeholder/Test and Project
Managers
 The entire system is functioning and all new components are available, unless previously
agreed between the UAT stakeholder/Test Manager and the Project Managers
 All test cases are documented and reviewed prior to the commencement of UAT

Exit Criteria
 The acceptance tests have been completed with a pass rate of not less than 98%
 No outstanding "Critical" or "High" defects
 Fewer than 5 significant defects outstanding
 All test cases have been completed
 No new defects have been discovered for one week prior to Production Implementation
 All test results recorded and approved
 UAT test summary report documented and approved
 UAT close-off meeting held
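The measurable exit criteria above lend themselves to a simple automated gate. The sketch below is illustrative only; the parameter names are assumptions, and in practice the counts would come from JIRA/Adaptavist reports:

```python
def uat_exit_gate(executed: int, passed: int,
                  open_critical_or_high: int,
                  open_significant: int,
                  days_since_last_new_defect: int) -> bool:
    """Return True when the measurable UAT exit criteria are met:
    >= 98% pass rate, no Critical/High defects, fewer than 5
    significant defects, and no new defects for a week."""
    pass_rate = passed / executed if executed else 0.0
    return (pass_rate >= 0.98
            and open_critical_or_high == 0
            and open_significant < 5
            and days_since_last_new_defect >= 7)
```

The non-measurable criteria (approved reports, close-off meeting held) would still be verified manually.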

C. Test Management (Tracking)


Depending on the test management tools utilized by the team, test execution and results are logged
either manually or in a test management tool. If a test management tool is being utilized, results will be
available and summarized via a dashboard or test metric reporting. Tracking is a necessity in the testing
process, as quality metrics are required to effectively track how the test effort is progressing and
to measure the quality of the system/application.

Test Management activities to be performed are:

a) Test Case Creation and Execution will be performed in Adaptavist Test Management tool
b) JIRA will be used for Defect Management

D. Test Reporting
Test reporting provides the ability to evaluate testing efforts and communicate test results to project
stakeholders. The objective of reporting is to assess the current status of project testing against testing
timelines and to provide details about the overall quality of the application or system under test.

Test Reporting activities to be performed are:

a) A Weekly Test Status Report will be generated and shared with project stakeholders
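A weekly status report typically summarizes execution counts and the pass rate. The helper below is a hypothetical sketch; the status strings ("Pass"/"Fail"/"Blocked") are assumptions for the example, not JIRA field values:

```python
from collections import Counter

def weekly_status_summary(executions):
    """Summarize test execution results for a weekly status report.
    `executions` is a list of per-test status strings, e.g.
    "Pass", "Fail", or "Blocked". Returns overall totals, the
    pass rate, and a count per status."""
    counts = Counter(executions)
    total = len(executions)
    pass_rate = counts.get("Pass", 0) / total if total else 0.0
    return {"total": total, "pass_rate": round(pass_rate, 3), **counts}
```

A dashboard in the test management tool would normally produce the same figures; this merely shows the arithmetic behind the report.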

6.4 Test Closure Tasks


Test closure activities collect data from completed test activities to consolidate experience, testware, and
metrics. Test closure activities occur at project milestones, such as when a system/application is released,
a project is completed (or cancelled), or a test milestone is achieved (e.g., completion of the UAT phase).

Test Closure activities to be performed are:

a) A Test Closure Report will be prepared at the end of the UAT phase, along with
recommendations

7. UAT Environments Requirements


Component | No. of Nodes/Machines | CPU/Node | Memory | Storage
Load Balancer | 1 | N/A | N/A | N/A
ML Cluster | 6 | 8 vCPUs | 32 GB/node | 1 TB/node
ML Data Hub Framework | 1 | 8 vCPUs | 16 GB | 500 GB (Note: this mount will be shared between the Data Hub and Ingest Processing)
Ingest Processing | 1 | 8 vCPUs | 16 GB | Uses the shared mount above (shared between the Data Hub and Ingest Processing)
Application Server | 1 | 4-8 vCPUs | 16 GB | 100 GB
Build Server | 1 | 4 vCPUs | 16 GB | 100 GB

8. UA Test Data
[List the required data that must be received before testing begins - i.e. access to systems, accounts,
etc.]
Test Suite | Test Data # | Test Data Description
UAT | 1 | Source data from MMIS, MEDS, OnBase, and Curam will be used for testing
UAT | 2 | Codes from CMS will be used for pricing

9. UAT Deliverables
The following sections detail milestones crucial to the completion of the UAT phase of the project. Once
all dependent milestones have been completed, UAT will formally sign off on the system's functionality
and distribute an e-mail to all project stakeholders.

 UAT Plan – A strategy-based document defining test methodology and criteria is distributed to the
team.
 UA Test Cases – A document that details each specific test case that will be performed during the
UAT process.
 UAT Closure Report – Formal sign-off indicating the system satisfies the needs of the business as
specified in the functional requirements and provides confidence in its use.


10. UAT Schedule

Application/System | Cycle # | Environment | Planned Start Date | Planned End Date
MCTRA | 1 | UAT | 10/25/17 | 11/1/17
MMIS | 1 | UAT | 10/25/17 | 11/1/17

11. Roles and Responsibilities

Phase | Activity | Test Lead | Dev Lead | PM | BA | Unix, DBA Lead | Comments
Requirement Analysis | Providing a detailed list of requirements in scope for the release | I | I | I | A&R | R | The functional lead is also responsible for any functional requirements developed for the release
Test Case Development | Requirement analysis and test case development | A&R | C | C | C | C | During requirement understanding and test case development, the test lead would require help from other stakeholders to finalize the test plan
Test Case Development | Test requirements review and sign-off | A | R | I | R | R | While the test lead has final accountability for finalizing the test plan, it is the other stakeholders' responsibility to review and provide sign-off
User Acceptance Test Execution | UAT execution | C | R | C | A | R |

R – Responsible  A – Accountable  C – Consulted  I – Informed

Responsible: Those who do the work to achieve the task. There is typically one role with a participation
type of Responsible, although others can be delegated to assist in the work required.
Accountable (also Approver or final approving authority): The one ultimately accountable for the
correct and thorough completion of the deliverable or task, and the one to whom Responsible is
accountable. In other words, an Accountable must sign off (approve) the work that Responsible provides.
There must be only one Accountable specified for each task or deliverable.
Consulted: Those whose opinions are sought, and with whom there is two-way communication.
Informed: Those who are kept up to date on progress, often only on completion of the task or
deliverable, and with whom there is just one-way communication.

12. UAT Team


The test team comprises members who possess a thorough knowledge of the current systems and
processing methods, i.e., SMEs. These team members are better able to initiate test input, review the
results, and be intuitively familiar with the impact on other related business issues and staff
activities. Members should be detail-oriented and diligent in collecting proper documentation to
support the test results. Team members are selected based, in part, on the ability of management to
reassign the daily duties they will have to forgo while performing the testing.

All team members will be presented with an overview of the test process and what their specific role in
UAT will be. The Business Analyst’s role in the UAT process is to oversee testing by assigning scripts to
SMEs, providing general support, and serving as the primary UAT contact point throughout the test cycle.
The BA will be expected to filter out any duplicate defects found and escalate high priority issues to the
team in a time sensitive manner.

Name | Project Role | Email | Phone | Location
Rajesh Kadarkarai | Test Lead | [email protected] | 82131 | Jefferson

13. UAT Defects


Defects will be entered and tracked in JIRA, the test management tool, during the UAT process. Each
entry will include detailed information about the defect.

13.1 UAT Defect Tracking


Team members will be provided with instruction on how to effectively execute test scripts, as well as
how to identify, capture, and report defects. Utilization of Microsoft Office applications and screen-capture
programs (e.g., SnagIt) will be required to document defects for escalation. Team members will be
expected to present findings at regularly scheduled touch-point meetings in the event that end-user
support and/or Development require additional detail.

13.2 UAT Defect Severity and Priority Standards

Severity | Definition | Expected Time for Closure
Critical | A complete software system, subsystem, or software unit (program or module) within the system has lost its ability to perform its required function (a failure) and no workaround is available; OR testing of a significant number of tests cannot continue without closure; OR the defect is a potential show-stopper for the Go/No-Go decision to enter the next stage or cutover | 1 Business Day
High | The software system, subsystem, or software unit (program or module) within the system produces incorrect, incomplete, or inconsistent results; OR the defect impairs usability (the capability of the software to be understood, learned, used, and attractive to the user when used under specified conditions [ISO 9126]) | 2 Business Days
Medium/Low | Anything that is not Critical or High | 3 Business Days
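The expected-closure times in the table can be translated into a due-date calculation. The sketch below is illustrative only and deliberately simplified: it skips weekends but ignores holidays, and the severity names and day counts are taken directly from the table above:

```python
from datetime import date, timedelta

# Business days allowed for closure, per the severity table above.
CLOSURE_SLA_DAYS = {"Critical": 1, "High": 2, "Medium": 3, "Low": 3}

def add_business_days(start: date, days: int) -> date:
    """Advance `days` business days from `start`, skipping weekends
    (holidays are intentionally not handled in this sketch)."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            days -= 1
    return current

def closure_due_date(severity: str, logged_on: date) -> date:
    """Return the date by which a defect of `severity` should be closed."""
    return add_business_days(logged_on, CLOSURE_SLA_DAYS[severity])
```

For instance, a Critical defect logged on Friday 12/22/2017 would be due the following Monday, since weekend days do not count toward the SLA.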

IMPORTANT NOTE: It is recommended that this document be printed and used for reference during test
execution activities to ensure uniform categorization of defects across all test phases.
 Examples of Defects and Related Severity Classifications

The following list provides examples of defects and their related severity classifications. It provides
uniform guidance to the line(s) of business to assist in assigning severity levels to defects.
Severity levels (Critical, High, Medium, or Low) are measured in terms of the impact to the business, as
well as to any other systems, devices, or users with which the new system/application interfaces.
Critical

1. Crashes the system

2. Crashes the user session

3. Corrupts data

4. No work-around exists

5. Prevents completion of a given task within specified business time requirements

6. Missing security

7. Violates security policy of business

8. Negatively impacts customers or business’ ability to service customers

9. Causes violation of regulatory rule or guideline

10. Prevents or impedes implementation per the required schedule

11. Unable to interface with specified devices and applications (includes failed or degraded
performance due to device or application failure), e.g., printers, interfaces, and TCA

12. Failure to meet scalability, resiliency and performance requirements within specified
thresholds.

High

1. Work-around exists

2. Work-around negatively impacts customer interaction or business process with regard to
performance, scalability, stability, and resiliency.

3. Probability of occurrence is high and/or the defect is easily reproduced – High means it occurs in
daily/regular operations of the application, device, or interface.

Medium

1. Work-around exists

2. Work-around negatively impacts operation of the application, device, or interface – does
not occur in regular operational use of the application, device, or interface. Not easily
reproduced.

3. Probability of occurrence is low and/or the defect is not easily reproduced during regular operations
of the application, device, or interface. This does NOT mean that all issues that are difficult
to reproduce fall into this category. Issue severity is based on the effect on the system or
users, NOT the difficulty of reproducing issues. This implies auxiliary functionality.

Low

1. Work-around exists

2. Spelling

3. Grammar

4. Cosmetic – user interface issues that have minimal impact on the operation of the
application

5. Any help documentation or context-sensitive information

14. Defect escalation procedure


The table below provides information on when to escalate a defect.
Defect Severity | # Blocking Test Cases | Slipped SLA | Candidate for Escalation
Any level | >10% of total test cases | | Yes
Critical | >5% of total test cases | | Yes
Any level | Any number | Yes / a Go–No-Go meeting is scheduled within 5 days of the current day | Yes
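The escalation thresholds in the table can be expressed as a simple decision rule. The sketch below is an illustration under assumptions: the parameter names are invented for the example, and the 10%, 5%, and 5-day values mirror the table rows:

```python
def needs_escalation(severity: str, blocking: int, total_cases: int,
                     slipped_sla: bool, days_to_go_no_go: int) -> bool:
    """Apply the escalation thresholds from the table above:
    any severity blocking >10% of test cases, a Critical defect
    blocking >5%, a slipped SLA, or a Go/No-Go meeting within
    5 days all make a defect a candidate for escalation."""
    blocking_pct = blocking / total_cases if total_cases else 0.0
    if blocking_pct > 0.10:
        return True
    if severity == "Critical" and blocking_pct > 0.05:
        return True
    if slipped_sla or days_to_go_no_go <= 5:
        return True
    return False
```

As the note below the table says, these thresholds may be tightened as the go/no-go decision approaches, so the constants here should be treated as configurable.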

Defect communication and escalation procedure:

First level of notification: As soon as a defect is logged in the defect tracking tool, an auto-generated
email is sent to the assigned person. Since the defect will be assigned to the development team alias,
everyone subscribed to the alias will receive the email.
Daily status review meeting: Along with the test execution status discussions, all outstanding
defects are discussed in this meeting. The development team, business team, basis team, QA
management, and other stakeholders as appropriate join the meeting. Defect details and the estimated
time of fix are documented in the defect tracking tool accordingly.
Defect disposition meeting: This twice-weekly meeting discusses in detail only the high-impact
defects identified as candidates for escalation. Development team management and QA team
management, along with the respective leads, discuss the finer details and put an action plan in place
to resolve them.
Escalation email to the development team/SME team manager: The QA Manager sends an email with
the details of defects that need immediate attention to the development team/SME team manager; on a
need basis, a triage call involving senior management is organized to discuss the associated risks,
establish a resolution plan, and review status.
Note: The escalation criteria above can be adjusted during execution based on the number of days left before the release
go/no-go decision.

15. Integration and Intersystem Interfaces


The table below lists the various interfaces/applications involved in the integration testing of this
project, along with the individual point of contact who will be used for coordinating any integration
testing.



System ID | Application/Functional Area | Testing Responsibility/SME
MMIS UAT 2 | MMIS | Rajesh Kadarkarai

16. Suspension and Resumption Criteria


The purpose of this section is to identify the conditions that warrant a temporary suspension of testing
and the criteria for resumption.

16.1 Suspension Criteria


Suspension of test activities will be at the discretion of the Test Manager/Test Lead, based on the
following conditions:
 Environment not available/unstable
 Major functionality not working
 Incorrect data/files loaded in the test environment
 A blocking defect that would prevent further testing of the application/system
 Poor code quality, evidenced by a larger-than-normal number of defects identified in the
first few days

16.2 Resumption Criteria


Resumption of testing will be at the discretion of the Test Manager/Test Lead, based on the following
requirements:

 Environment setup issues corrected
 Application instability issue resolved
 Correct data/files loaded in the test environment
 Blocking defect fixed and retested to provide evidence testing may continue

17. Communication and Escalation


Category | Type | Participants | Mode | Type of Reporting
Bi-weekly project meeting | Project | Project Manager, Development Team, QA Team, DB & Environment Team | Telephone conference | High-level project status; key issues and risks; action tracker
Weekly status meeting | PMO | Project Manager, Development Team, QA Team, DB & Environment Team, Sponsor | Telephone conference | Progress against plan; key issues and risks; action tracker
Daily status reporting | Project | All stakeholders | Email | Daily reporting of tasks and progress against plan

