PMO TCoE UA Test Plan MCTRA 3.0 Pricing SAMPLE
MCTRA 3.0
User Acceptance Test Plan
Table of Contents
Revision History
Version No. | Date | Revised By | Description of Change
0.1 | 12/19/2017 | Rajesh Kadarkarai | Initial draft
1.0 | 12/21/2017 | Rajesh Kadarkarai | Incorporated review comments
Approvals
The undersigned acknowledge that they have reviewed the User Acceptance Test Plan and agree with the information presented within this document. Changes to this plan will be coordinated with, and approved by, the undersigned or their designated representatives. The Project Sponsor will be notified when approvals occur.
12/21/2017
Signature: Date:
Print Name: Raja Gampa
Title: Performance Manager
Role: Program Director
12/21/2017
Signature: Date:
Print Name: Mark Summers
Title: TCoE Manager
Role: Test Manager
12/21/2017
Signature: Date:
Print Name: Sreelekha Vuppalancha
Title: TCoE Lead
Role: PMO TCOE Auditor
Definitions
Acronym or Term | Definition
MCTRA | The Medicaid Clinical Translation, Review and Approval (MCTRA) System is used to store, update, and export all medical code sets into MMIS.
HCPCS/CPT | The Healthcare Common Procedure Coding System (HCPCS) is a set of health care procedure codes based on the American Medical Association's Current Procedural Terminology (CPT).
BPM | Business Process Management
ICD-10 | The International Classification of Diseases, Tenth Revision
RBRVS | Resource-Based Relative Value Scale
PMPM | Per Member, Per Month
Reference Documents
Document | Definition | Repository Path
Requirement Stories from JIRA | JIRA Stories |
MCTRA BRD | MCTRA BRD |
FRS | MCTRA-FRS |
1. Document Purpose
The purpose of this document is to outline the User Acceptance Testing (UAT) process for MCTRA 3.0 Pricing. Project Sponsors from all participating departments are expected to review this document. Approval of this document implies that reviewers are confident that, following execution of the test plan, the resulting system will be considered fully tested and eligible for implementation.
UAT is to be completed by the business departments (the UAT Team) that will be utilizing the software, and/or by support departments. The testing is conducted to enable users to validate that the software meets the agreed-upon acceptance criteria.
2. Project Overview
This project provides the capability to create change requests on HCPCS/CPT codes and send them to approvers. This removes the manual intervention currently required to update code details in MMIS and keeps both systems in sync.
3. Scope
In Scope
Ref ID | Functionality
1 | Develop hierarchy for applying reimbursement rates based on approved rules
2 | Establish process for updating and modifying rules as needed, to include ancillary reimbursement, budgets, and actuary as needed
3 | Define agency approval and review process

Out of Scope
Ref ID | Functionality
1 | Anything not mentioned in In Scope
5. Assumptions/Dependencies/Issues/Risks
This section captures the test assumptions, dependencies, and constraints specific to User Acceptance Testing (UAT) that are known at this time.
5.1 Assumptions
1) Business Requirements/Software System Requirement Specifications are clear, concise, and able to be translated into test cases.
2) Any approved PCRs that the QA Team has not had a chance to estimate will not be included in testing until they have been estimated, planned, and approved.
3) All impacted applications/systems and their respective interfaces will be tested at least once during the testing phase's lifecycle.
4) All necessary development will be complete in time to start testing.
5) JIRA/Adaptavist will be used as the test management tool. All test cases, test results, and defects will be available in JIRA at: Project MCTRA (MCTRA).
6) All team members will have access to JIRA/Adaptavist.
5.2 Dependencies
1) All SDLC artifacts are complete and signed off.
2) Test resource availability is in sync with the project schedule.
3) All test scripts are uploaded to Adaptavist prior to commencement of UAT execution.
4) The test environments are available, and connectivity has been established between all interfaces identified on this project.
5) All necessary access is provided for the UAT Team.
6) Test cases and specific test data are available according to the requirements.
7) Changes in scope or redesign will require a project change request to be submitted and approved.
5.3 Constraints
1) Any unidentified or future changes or inclusions that may adversely affect the test schedule
2) Any technology "freeze" periods
3) Resource contention and availability of Business, IT, and external SMEs throughout all work streams, due to current allocation on other projects
4) Timely resolution of issues and key decisions
5.4 Risks
This section lists all potential test-related risks known at this time, along with the proposed mitigation and contingency measures to be adopted by the UAT Team.
Refer to the Project Risk Log for the full inventory of project-related risks.
Ref ID | Risk | Risk Probability (H/M/L) | Risk Impact (H/M/L) | Mitigation | Contingency
A. Test Execution
Test execution begins when the UAT Plan has been completed and signed off, a complete set of test cases covering all functional specifications (and applicable non-functional specifications) has been written, and the test environment is available. During execution, testers run the test cases according to the test plan: for each test case, they follow the steps described in the case and validate the expected results against the actual results. If the expected results for all steps are achieved, the test case passes; otherwise it fails. Any failure is documented as a defect, with accompanying screenshots or other attachments that will help reproduce it.
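The pass/fail rule above can be expressed as a small sketch. This is illustrative only (the plan prescribes Adaptavist/JIRA, not code); the function name and record shape are hypothetical, and a case passes only when every step's expected result matches the actual result.

```python
# Minimal sketch of the UAT execution flow: each test case is a list of
# steps pairing an expected result with the observed actual result. The
# case passes only if every step matches; mismatches become defect records.

def execute_test_case(case_id, steps):
    """steps: list of (description, expected, actual) tuples."""
    defects = []
    for number, (description, expected, actual) in enumerate(steps, start=1):
        if expected != actual:
            # Failed step: capture enough detail to reproduce the defect.
            defects.append({
                "case": case_id,
                "step": number,
                "description": description,
                "expected": expected,
                "actual": actual,
            })
    status = "Pass" if not defects else "Fail"
    return status, defects
```

In practice the defect records would be raised in JIRA with screenshots attached, as described above.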
B. Entry/Exit Criteria
Entry Criteria The application works functionally as defined in the specifications
No outstanding “Critical or High” defects
All the identified QA Test Cases are executed with the pass rate of 98%
Any open defects from QA should have resolution plan
All areas have had testing started on them unless pre agreed by UAT
stakeholder/Test and Project managers
Entire system functioning and all new components available unless previously
agreed between UAT stakeholder/Test manager and project managers
All test cases are documented and reviewed prior to the commencement of
UAT
Exit Criteria:
- The acceptance tests have been completed with a pass rate of not less than 98%
- No outstanding "Critical" or "High" defects
- Fewer than 5 significant defects outstanding
- All test cases have been completed
- No new defects have been discovered for a week prior to production implementation
- All test results recorded and approved
- UAT test summary report documented and approved
- UAT close-off meeting held
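The quantitative exit criteria above lend themselves to a simple check. The sketch below is illustrative only: the thresholds (98% pass rate, no Critical/High defects, fewer than 5 significant defects) come from this plan, but the function name and the assumption that "significant" means Critical, High, or Medium are hypothetical.

```python
# Illustrative exit-criteria check against this plan's thresholds.
# ASSUMPTION: "significant" defects are taken to mean Critical/High/Medium.

def exit_criteria_met(executed, passed, open_defect_severities):
    """open_defect_severities: list of severity strings, e.g. ["Medium"]."""
    pass_rate = passed / executed if executed else 0.0
    critical_or_high = [s for s in open_defect_severities
                        if s in ("Critical", "High")]
    significant = [s for s in open_defect_severities
                   if s in ("Critical", "High", "Medium")]
    return (pass_rate >= 0.98          # pass rate of not less than 98%
            and not critical_or_high   # no outstanding Critical/High defects
            and len(significant) < 5)  # fewer than 5 significant defects
```

The remaining exit criteria (approvals, reports, close-off meeting) are procedural and are tracked outside any such check.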
C. Test Tools
a) Test case creation and execution will be performed in the Adaptavist Test Management tool
b) JIRA will be used for defect management
D. Test Reporting
Test reporting provides the ability to evaluate testing efforts and communicate test results to project stakeholders. The objective of reporting is to assess the current status of project testing against testing timelines and to provide details about the overall quality of the application or system under test.
a) A weekly test status report will be generated and shared with project stakeholders
b) A test closure report will be prepared at the end of the UAT phase, along with recommendations
Server | Qty | CPU | Memory | Storage
Ingest Processing | 1 | 8 vCPUs | 16 GB | Uses the mount above, which should be shared between the Data Hub and Ingest Processing
Application Server | 1 | 4-8 vCPUs | 16 GB | 100 GB
Build Server | 1 | 4 vCPUs | 16 GB | 100 GB
8. UA Test Data
[List the required data that must be received before testing begins - i.e. access to systems, accounts,
etc.]
Test Suite | Test Data # | Test Data Description
UAT | 1 | Source data from MMIS, MEDS, OnBase, and Curam will be used for testing
UAT | 2 | Codes from CMS will be used for pricing
9. UAT Deliverables
The following sections detail milestones crucial to the completion of the UAT phase of the project. Once all dependent milestones have been completed, UAT will formally sign off on the system's functionality and distribute an e-mail to all project stakeholders.
UAT Plan – A strategy-based document defining test methodology and criteria is distributed to the
team.
UA Test Cases – A document that details each specific test case that will be performed during the
UAT process.
UAT Closure Report – Formal sign-off indicating the system satisfies the needs of the business as
specified in the functional requirements and provides confidence in its use.
Application/System | Cycle | Environment | Planned Start Date | Planned End Date
MCTRA | 1 | UAT | 10/25/17 | 11/1/17
MMIS | 1 | UAT (Unix) | 10/25/17 | 11/1/17

Phase | Activity | Test Lead | Dev Lead | PM | DBA Lead | BA | Comments
Requirement Analysis | Providing a detailed list of requirements in scope for the release | I | I | I | A&R | R | The functional lead is also responsible for any functional requirements developed for the release
Test Case Development | Requirement analysis and test case development | A&R | C | C | C | C | During requirement understanding and test case development, the test lead will require help from other stakeholders to finalize the test plan
Test Case Development | Test requirements review and sign-off | A | R | I | R | R | While the test lead has final accountability for finalizing the test plan, it is the other stakeholders' responsibility to review and provide sign-off
User Acceptance Test Execution | UAT execution | C | R | C | A | R |

R – Responsible, A – Accountable, C – Consulted, I – Informed
Responsible: Those who do the work to achieve the task. There is typically one role with a participation type of Responsible, although others can be delegated to assist in the work required.
Accountable (also Approver or final approving authority): Those who are ultimately accountable for the correct and thorough completion of the deliverable or task, and the one to whom Responsible is accountable. In other words, an Accountable must sign off (approve) the work that Responsible provides. There must be only one Accountable specified for each task or deliverable.
Consulted: Those whose opinions are sought, and with whom there is two-way communication.
Informed: Those who are kept up to date on progress, often only on completion of the task or deliverable, and with whom there is just one-way communication.
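The "exactly one Accountable per task" rule above is mechanically checkable. The sketch below is illustrative only; the matrix shape and function name are hypothetical, and combined assignments such as "A&R" count as Accountable.

```python
# Sketch of a RACI sanity check: every task must have exactly one
# Accountable role. Combined assignments like "A&R" contain an "A"
# and therefore count as Accountable.

def accountability_errors(matrix):
    """matrix: {task: {role: assignment}}; returns tasks breaking the rule."""
    errors = []
    for task, roles in matrix.items():
        accountable = [role for role, assignment in roles.items()
                       if "A" in assignment]
        if len(accountable) != 1:
            errors.append(task)
    return errors
```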
All team members will be presented with an overview of the test process and what their specific role in
UAT will be. The Business Analyst’s role in the UAT process is to oversee testing by assigning scripts to
SMEs, providing general support, and serving as the primary UAT contact point throughout the test cycle.
The BA will be expected to filter out any duplicate defects found and escalate high-priority issues to the team in a time-sensitive manner.
IMPORTANT NOTE: It is recommended that this document be printed and used for reference during test execution activities to ensure uniform categorization of defects across all test phases.
Examples of Defects and Related Severity Classifications
The following list provides examples of defects and their related severity classifications. It offers uniform guidance to lines of business to assist in assigning severity levels to defects. Severity levels (Critical, High, Medium, or Low) are measured in terms of the impact to the business, as well as to any other systems, devices, or users with which the new system/application interfaces.
Critical
1. Corrupts data
2. No work-around exists
3. Missing security
4. Unable to interface with specified devices and applications (includes failed or degraded performance due to device or application failure), e.g., printers, interfaces, and TCA
5. Failure to meet scalability, resiliency, and performance requirements within specified thresholds
High
1. Work-around exists
Medium
1. Probability of occurrence is low and/or the issue is not easily reproduced during regular operation of the application, device, or interface. This does NOT mean that all issues that are difficult to reproduce fall into this category; issue severity is based on the effect on the system or users, NOT the difficulty of reproducing the issue. This implies auxiliary functionality.
Low
1. Spelling
2. Grammar
3. Cosmetic – user interface issues that have minimal impact on the operation of the application
Severity | Open Defects | Suspend Testing
Critical | >5% of total test cases | Yes
Any level | Any number | Yes; a Go/No-Go meeting is scheduled within 5 days of the current day
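The Critical-defect threshold above can be sketched as a simple check. This is illustrative only; the 5% figure comes from this plan, while the function name is hypothetical, and the second row's judgment-based Go/No-Go decision is not modeled.

```python
# Sketch of the suspension threshold: testing is suspended when open
# Critical defects exceed 5% of total test cases.

def should_suspend_testing(total_test_cases, open_critical_defects):
    if total_test_cases == 0:
        return False  # no test cases executed yet, nothing to suspend
    return open_critical_defects / total_test_cases > 0.05
```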
- Environment setup issues corrected
- Application instability issues resolved
- Correct data/files loaded in the test environment
- Blocking defects fixed and retested, with evidence that testing may continue