Test Strategy Template


TESTING STRATEGY

PROJECT IDENTIFICATION
Project Name
CPI/Project Number
Project Type (Business Consulting, Implementation, Upgrade, Internal, other)

Customer Name

Customer Number

Planned Start/Finish

Project Sponsor

Program Manager

Project Manager (Customer)

Project Manager (SAP)

SAP Service Partner(s)

Project Manager (Service Partner)

Table of Contents
Introduction
  Purpose
  Scope
Data Migration Business Object Scope
Data Migration Process Definition
  Migration Solution Steps
Test Cycles Definition
  Test Cycle Definitions and Descriptions
Roles and Responsibilities
  Project Roles
  Project Responsibilities
Data Migration Testing Strategy
  Project Role Assignments
  Unit Testing
  System Testing
  Integration Testing
  User Acceptance Testing
  Cutover / Performance Testing

Introduction
Purpose
The purpose of the Testing Strategy document is to define the strategy to be used to establish an effective testing approach for the legacy data migration solution. The actual migration solution will vary among projects due to the deployment and availability of software tools, but the need to establish a structured testing approach remains constant regardless of the level of automation. The migration test strategy forms part of, and is input to, the project level test planning and execution process.

This document provides the framework for the detailed planning of the tasks required for validating the legacy data migration solution. The objectives are:

- To define the test cycles for the data migration work stream and their purpose
- To describe the dependencies on and integration with the project level test strategy
- To define the roles and responsibilities required to successfully test and validate the data migration solution
- To identify the steps within the migration process and determine specific test objectives
- To provide a basis for the test planning for each of the data migration test cycles
- To ensure the data migration solution meets the documented functional and technical requirements

Scope
For the purpose of this document, it is assumed that the project will contain the following test cycles:

- Unit Testing
- System Testing
- Integration Testing
- User Acceptance Testing
- Cutover / Performance Testing

The scope of this document is the definition of the strategy required to test the migration process, which is defined by the movement of data from a legacy source system to the SAP Application. It is not intended to define an approach or strategy for the testing of application master or transactional data. It is assumed that once the data is loaded into the SAP Application, the testing strategy and test scenarios developed for the master and transactional data within the Testing work stream will be utilized to validate application functionality and business scenario requirements.


Data Migration Business Object Scope


The following table lists the Business Objects that will be migrated as part of the SAP Application implementation.

ID | SAP Business Object | Source Application | Source Table / File | Manual / Automated | Data Volume
(rows 1-10 to be completed for the project)

Data Migration Process Definition


Migration Solution Steps
Regardless of the actual automated migration solution, the following steps are performed to support the movement of data from a legacy source system to the SAP Application:

1. Data Extraction - extraction of a specific data set from the source application or system to a specified format
2. Data Transformation - the validation, harmonization and enrichment of legacy data based on specified business and validation rules
3. Generation of Load Ready Data - the generation of data to a specified format
4. Loading of Data into the SAP Application - the loading of load ready data into the SAP Application using standard SAP load utilities such as LSMW
5. Validation of Migrated Data - the validation and reconciliation of migrated data to ensure data completeness and validate usability
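
As an illustration only, the sketch below shows how steps 1 to 3 might be scripted for a single business object when a lightweight, file-based solution is used. The function names, the CSV formats and the example business rule are assumptions, not part of the template; steps 4 and 5 are normally performed with the standard SAP load utilities (such as LSMW) and by reconciling the resulting record counts inside the SAP Application.

```python
import csv
from pathlib import Path

def extract_from_source(source_file: Path) -> list[dict]:
    """Step 1 - Data Extraction: read a legacy extract (CSV assumed) into memory."""
    with source_file.open(newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def transform(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Step 2 - Data Transformation: apply validation and business rules,
    splitting the data set into accepted and rejected records."""
    accepted, rejected = [], []
    for rec in records:
        # Hypothetical rule: a record must carry a non-empty NAME field.
        if rec.get("NAME", "").strip():
            rec["NAME"] = rec["NAME"].strip().upper()  # simple harmonization example
            accepted.append(rec)
        else:
            rejected.append(rec)
    return accepted, rejected

def generate_load_ready(records: list[dict], target_file: Path) -> None:
    """Step 3 - Generation of Load Ready Data: write the accepted records
    in the flat-file format expected by the load utility (assumed here)."""
    if not records:
        return
    with target_file.open("w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)

def run_migration_flow(source_file: Path, load_ready_file: Path) -> dict:
    """Steps 1-3 for one business object; the returned counts feed step 5,
    where they are reconciled against the records actually loaded into SAP."""
    source_records = extract_from_source(source_file)
    accepted, rejected = transform(source_records)
    generate_load_ready(accepted, load_ready_file)
    return {"source": len(source_records),
            "accepted": len(accepted),
            "rejected": len(rejected)}

if __name__ == "__main__":
    print(run_migration_flow(Path("customer_extract.csv"),
                             Path("customer_load_ready.csv")))
```

The counts returned at the end are exactly the figures that later feed the data quality tracking described in the test cycles below.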

Test Cycles Definition

Test Cycle Definitions and Descriptions
This document assumes that the test cycles listed below are included in the overall testing strategy. The migration test strategy may need to be adjusted if the implementation testing strategy is different. The table below provides a description for each test cycle and its dependency on the overall project.


Test Cycle: Unit
Dependency: Self-contained within the Migration Track
Purpose: Test the individual migration programs and processes.

Test Cycle: System
Dependency: Self-contained within the Migration Track
Purpose: Test the complete set of migration programs and processes, along with data quality and usability.

Test Cycle: Integration
Dependency: Integrated with project level Integration testing
Purpose: Test the complete set of migration programs and processes, along with execution times, data quality and usability. Populate the SAP Application Test environment to support project level integration testing.

Test Cycle: User Acceptance
Dependency: Integrated with project level User Acceptance testing
Purpose: Test the complete set of migration programs and processes, along with execution times, data quality and usability. Populate the SAP Application Test environment to support project level user acceptance testing.

Test Cycle: Cutover / Performance
Dependency: Integrated with project level Cutover / Performance testing
Purpose: Simulate the full production load to capture execution times, tune the process and confirm execution sequences.

Ro"es and Responsibi"ities


The following section defines the project roles required to support the testing of the data migration process and identifies the appropriate project members who will be responsible for fulfilling these roles.

Project Roles
The table below contains recommendations on project roles and descriptions. This information should be adjusted to align with the individual project structure and requirements.

1. Migration Lead (Consulting) - Responsible for the planning and execution of the migration track. Works directly with the Testing Track Lead to coordinate testing schedules.
2. Migration Developer (Consulting) - Responsible for the design, development and testing of migration programs and processes.
3. Report Developer (Consulting) - Responsible for the design, development and testing of quality reports for monitoring test results.
4. Technical Architect (Consulting) - Responsible for establishing the technical platform for the migration solution, if a software solution is being deployed.
5. Functional Expert (Consulting) - Responsible for providing SAP business object and application expertise for the design and testing of the migration solution.
6. Data Owner (Client) - The owner of the data from the business side who understands the use and business rules associated with the data from a business perspective.
7. Business User (Client) - The business user of the legacy application where the data is being used.
8. Legacy Application Administrator (Client) - The administrator of the legacy application where the legacy data resides. Responsible for extracting and mapping of the legacy data.
9. Legacy Data Analyst (Client) - The technical or business analyst who understands the underlying business rules for the business object and data elements from an application or technical perspective. Responsible for mapping the legacy data to the target SAP application.
10. Testing Track Lead (Consulting / Client) - The project level lead for the testing track.
11. Cutover Lead (Consulting) - The project level lead for the cutover track.

Project Responsibilities
The table below contains a mapping of the project roles and the individuals assigned to these roles for each of the business objects or migration processes.

Business Object: n/a
  Project Roles: Migration Lead; Testing Track Lead; Cutover Track Lead; Technical Architect
  Responsible Resource: (to be assigned per role)

Business Object: [Business Object #1]
  Project Roles: Migration Developer; Report Developer; Functional Expert; Data Owner; Business User; Legacy Application Administrator; Legacy Data Analyst
  Responsible Resource: (to be assigned per role)

Business Object: [Business Object #2]
  Project Roles: Migration Developer; Report Developer; Functional Expert; Data Owner; Business User; Legacy Application Administrator; Legacy Data Analyst
  Responsible Resource: (to be assigned per role)

Data Migration Testing Strategy


This section describes the approach, goals and objectives for each of the testing cycles. As the project progresses, the migration solution will mature with each test cycle and the focus of testing moves from the underlying solution to the quality and availability of migrated legacy data. The migration team needs to align the testing tasks, goals and objectives to the overall project requirements to ensure that it is able to deploy an efficient solution while satisfying the project level data requirements and milestones.

Project Role Assignments
The table below contains recommendations on the assignment and allocation of project roles across the various test cycles.

ID ' 2 3 4 5 " $ 6 & '( ''

Project Ro e Migration :ead Migration De)eloper Report De)eloper Tec+nical rc+itect 2unctional 7.pert Data 5,ner 4usiness #ser :egacy pplication d!inistrator :egacy Data nalyst Testing Trac/ :ead Cuto)er Trac/ :ead

#nit Aes Aes Aes Aes Bo Bo Bo Aes Aes Bo Bo

'ystem Aes Aes Aes Aes Aes Bo Bo Aes Aes Bo Bo

Inte!ration Aes Aes Bo Aes Aes Aes Aes Aes Aes Aes Bo

#ser Acceptance Aes Aes Bo Aes Aes Aes Aes Aes Aes Aes Bo

Cuto)er Per-orman Aes Aes Bo Aes Bo Bo Bo Aes Aes Aes Aes

Unit Testing
The Unit Test cycle is self-contained within the migration track and is designed to test and validate the underlying migration platform, programs and processes. There is no dependency on the main project and no requirement to deliver legacy data to other project tracks from this test cycle.

Subject Area: Platform, Programs and Processes
Objective: Validate connectivity of the platform; validate file format and data content of legacy extracts; exercise validation and business rules in the migration programs and processes; validate file format and data content of load ready data; validate SAP load utilities.
Strategy: Develop unit test plans that describe the execution steps and results. The plans are designed to validate the design specifications for the programs and processes and the underlying platform.

Subject Area: Data Requirements
Objective: There is no need for production level data volumes. For extracts, data necessary to validate format and content; for programs and processes, data necessary to validate and exercise business rules; for load ready data, data necessary to validate format and content and execute the load utilities.
Strategy: There will be different resources focused on the various migration solution components. Each resource will be responsible for managing the data requirements to satisfy their test plan.

Subject Area: Application Environment and Execution Plan
Objective: Perform in the Migration environment. Tests can be independent of each other.
Strategy: Establish a separate application environment dedicated to the migration track to eliminate dependencies on other project tracks.

Subject Area: Application Business Object Unit Testing
Objective: Not required during this test cycle.

Subject Area: Application Business Scenario Testing
Objective: Not required during this test cycle.

Subject Area: Timing Statistics Tracking
Objective: Not required during this test cycle.

Subject Area: Data Quality Tracking
Objective: Not required during this test cycle.

Subject Area: Test Results Tracking
Objective: Track results at the migration program and process component level for each migration flow.
Strategy: Establish a spreadsheet or project plan to track tasks. Issue tracking is owned by the individual migration team member.

Subject Area: Success Criteria
Objective: Successful execution of each migration program and process component based on the design specifications.
Strategy: Signoff of individual test plans or completion of individual testing tasks on the tracking sheet or plan.
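
To make the unit test plans concrete, the snippet below shows one way a single validation/business rule could be exercised in isolation. The rule (normalize_country_code) and its mapping table are hypothetical and stand in for whatever rules the migration programs actually implement; the point is the pattern of testing known values, pass-through values and rejected values.

```python
import unittest

def normalize_country_code(value: str) -> str:
    """Hypothetical transformation rule: map free-text country values
    to the two-character codes expected by the SAP Application."""
    mapping = {"UNITED STATES": "US", "U.S.A.": "US", "GERMANY": "DE"}
    key = value.strip().upper()
    if key in mapping:
        return mapping[key]
    if len(key) == 2:
        return key  # already a two-character code
    raise ValueError(f"Unmapped country value: {value!r}")

class NormalizeCountryCodeTest(unittest.TestCase):
    def test_known_values_are_mapped(self):
        self.assertEqual(normalize_country_code(" United States "), "US")

    def test_codes_pass_through(self):
        self.assertEqual(normalize_country_code("de"), "DE")

    def test_unmapped_values_are_rejected(self):
        with self.assertRaises(ValueError):
            normalize_country_code("Atlantis")

if __name__ == "__main__":
    unittest.main()
```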

System Testing
The System Test cycle is self-contained within the migration track. It is designed to expand on the Unit Test by directing focus to the execution sequence, execution times and the quality of legacy data. There is no dependency on the main project or requirement to deliver legacy data to other project tracks from this test cycle.

Subject Area: Platform, Programs and Processes
Objective: Validate connectivity of the platform; validate platform sizing requirements; validate file format and data content of legacy extracts; exercise validation and business rules in the migration programs and processes; validate file format and data content of load ready data; validate SAP load utilities.
Strategy: Same as the unit test. The data volumes will be larger, so the underlying solution should continue to be tested.

Subject Area: Data Requirements
Objective: Close to production level data volumes are required for capturing timing statistics. The data set should contain most data anomalies to properly exercise the validation and business rules.
Strategy: The legacy data set needs to come directly from the source systems and applications; no manual creation of data is required.

Subject Area: Application Environment and Execution Plan
Objective: Perform in the Migration environment. The migration flows need to be executed in the required sequence.
Strategy: Establish a separate application environment dedicated to the migration track to eliminate dependencies on other project tracks. The execution sequence needs to be validated and confirmed to account for interdependencies.

Subject Area: Application Business Object Unit Testing
Objective: Business object unit testing to be performed once the data is loaded, to validate usability.
Strategy: Utilize the business object test plans developed within the Test track for this testing.

Subject Area: Application Business Scenario Testing
Objective: Business scenario testing to be performed once the data is loaded, to validate usability.
Strategy: Utilize the business scenario test plans developed within the Test track for this testing.

Subject Area: Timing Statistics Tracking
Objective: Capture the execution times for each execution step within the migration flow for each business object.
Strategy: Create a spreadsheet to capture the execution times by execution run.

Subject Area: Data Quality Tracking
Objective: Capture the record counts for the source data set, records rejected and records loaded for each step within the migration flow for each business object.
Strategy: Create a spreadsheet to capture the record count results by execution run. Capture data issues in an issues log.

Subject Area: Test Results Tracking
Objective: Track the results at the business object level to reflect the completion of the end-to-end migration process. The key metrics include extraction, load and usability.
Strategy: Establish a spreadsheet or project plan to track metrics at the business object flow level. Issue tracking is owned by the migration track lead.

Subject Area: Success Criteria
Objective: Successful execution of the business object migration flow, a high percentage of quality data and functional usability of the migrated data.
Strategy: Signoff of completion of individual business object migration flows.
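
The record-count tracking sheet described above can be as simple as a CSV file that the migration programs append to after every execution run. The sketch below illustrates the data quality side of that tracking; the file name, column names and example figures are assumptions for illustration only.

```python
import csv
from datetime import datetime
from pathlib import Path

TRACKING_SHEET = Path("data_quality_tracking.csv")
COLUMNS = ["run_timestamp", "business_object", "migration_step",
           "source_records", "rejected_records", "loaded_records", "load_rate_pct"]

def record_run(business_object: str, migration_step: str,
               source_records: int, rejected_records: int, loaded_records: int) -> None:
    """Append one row of record-count results for a single execution run."""
    new_file = not TRACKING_SHEET.exists()
    load_rate = round(100.0 * loaded_records / source_records, 2) if source_records else 0.0
    with TRACKING_SHEET.open("a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(COLUMNS)  # write the header only once
        writer.writerow([datetime.now().isoformat(timespec="seconds"),
                         business_object, migration_step,
                         source_records, rejected_records, loaded_records, load_rate])

# Example usage for one step of a customer migration flow (placeholder figures):
record_run("Customer Master", "Load",
           source_records=12500, rejected_records=75, loaded_records=12425)
```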


Integration Testing
The planning and execution of the Integration Test cycle is the responsibility of the Testing Track Lead. The migration team is responsible for providing migrated legacy data to support the end-to-end testing. By this test cycle, the migration solution and execution sequence need to be stable, so the focus continues to be on execution times and the quality of legacy data.

Subject Area: Platform, Programs and Processes
Objective: Assume a stable environment. Monitor throughput performance.
Strategy: React to issues encountered with migration programs and processes; no specific testing is performed. Analyze and address performance bottlenecks.

Subject Area: Data Requirements
Objective: Close to production level data volumes are required for capturing timing statistics. The data set should contain most data anomalies to properly exercise the validation and business rules.
Strategy: The legacy data set needs to come directly from the source systems and applications; no manual creation of data is required.

Subject Area: Application Environment and Execution Plan
Objective: Perform in the Test environment. The migration flows need to be executed in the required sequence.
Strategy: Execute the migration process in the Test environment.

Subject Area: Application Business Object Unit Testing
Objective: Business object unit testing to be performed once the data is loaded, to validate usability.
Strategy: The Testing Team is responsible for the execution of the test scripts.

Subject Area: Application Business Scenario Testing
Objective: Business scenario testing to be performed once the data is loaded, to validate usability.
Strategy: The Testing Team is responsible for the execution of the test scripts.

Subject Area: Timing Statistics Tracking
Objective: Capture the execution times for each execution step within the migration flow for each business object.
Strategy: Capture the execution times in the tracking sheet created during the System test cycle.

Subject Area: Data Quality Tracking
Objective: Capture the record counts for the source data set, records rejected and records loaded for each step within the migration flow for each business object.
Strategy: Capture the record count results in the tracking sheet created during the System test cycle.

Subject Area: Test Results Tracking
Objective: Track the results at the business object level to reflect the completion of the end-to-end migration process. The key metrics include extraction, load and usability.
Strategy: The project plan needs to track metrics at the business object flow level. Issue tracking is owned by the Testing Track Lead.

Subject Area: Success Criteria
Objective: Successful execution of the business object migration flow, a high percentage of quality data and functional usability of the migrated data.
Strategy: The Testing Team is responsible for signoffs and approvals.
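
Because the focus in this cycle stays on execution times, it can help to capture timings automatically rather than by hand. The sketch below is one illustrative way to time each execution step and append the result to the timing tracking sheet started during the System test cycle; the file name and column layout are assumptions.

```python
import csv
import time
from contextlib import contextmanager
from datetime import datetime
from pathlib import Path

TIMING_SHEET = Path("timing_statistics.csv")

@contextmanager
def timed_step(business_object: str, step_name: str):
    """Time one execution step of a migration flow and append the result."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = time.perf_counter() - start
        new_file = not TIMING_SHEET.exists()
        with TIMING_SHEET.open("a", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            if new_file:
                writer.writerow(["run_timestamp", "business_object", "step", "elapsed_seconds"])
            writer.writerow([datetime.now().isoformat(timespec="seconds"),
                             business_object, step_name, round(elapsed, 2)])

# Example usage around one step of the flow:
with timed_step("Customer Master", "Extraction"):
    time.sleep(0.1)  # placeholder for the real extraction call
```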

User Acceptance Testing
The planning and execution of the User Acceptance Test cycle is the responsibility of the Testing Track Lead. The migration team is responsible for providing migrated legacy data to support the end-to-end testing. By this test cycle, the migration solution and execution sequence need to be stable, so the focus continues to be on execution times and the quality of legacy data.

Subject Area: Platform, Programs and Processes
Objective: Assume a stable environment. Monitor throughput performance.
Strategy: React to issues encountered with migration programs and processes; no specific testing is performed. Analyze and address performance bottlenecks.

Subject Area: Data Requirements
Objective: Production level data volumes are required for capturing timing statistics. The data set should contain most data anomalies to properly exercise the validation and business rules.
Strategy: The legacy data set needs to come directly from the source systems and applications; no manual creation of data is required.

Subject Area: Application Environment and Execution Plan
Objective: Perform in the Test environment. The migration flows need to be executed in the required sequence.
Strategy: Execute the migration process in the Test environment.

Subject Area: Application Business Object Unit Testing
Objective: Business object unit testing to be performed once the data is loaded, to validate usability.
Strategy: The Testing Team is responsible for the execution of the test scripts.

Subject Area: Application Business Scenario Testing
Objective: Business scenario testing to be performed once the data is loaded, to validate usability.
Strategy: The Testing Team is responsible for the execution of the test scripts.

Subject Area: Timing Statistics Tracking
Objective: Capture the execution times for each execution step within the migration flow for each business object.
Strategy: Capture the execution times in the tracking sheet created during the System test cycle.

Subject Area: Data Quality Tracking
Objective: Capture the record counts for the source data set, records rejected and records loaded for each step within the migration flow for each business object.
Strategy: Capture the record count results in the tracking sheet created during the System test cycle.

Subject Area: Test Results Tracking
Objective: Track the results at the business object level to reflect the completion of the end-to-end migration process. The key metrics include extraction, load and usability.
Strategy: The project plan needs to track metrics at the business object flow level. Issue tracking is owned by the Testing Track Lead.

Subject Area: Success Criteria
Objective: Successful execution of the business object migration flow, a high percentage of quality data, functional usability of the migrated data and close to acceptable execution time for the migration process.
Strategy: The Testing Team is responsible for signoffs and approvals.
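
For test results tracking at the business object level, the record-count sheet maintained since the System test cycle can be summarized per business object to show extraction, load and rejection totals. A minimal aggregation sketch follows; it assumes the CSV layout used in the earlier data quality tracking example.

```python
import csv
from collections import defaultdict
from pathlib import Path

def summarize_by_business_object(tracking_sheet: Path) -> dict[str, dict[str, int]]:
    """Sum source/rejected/loaded record counts per business object."""
    totals: dict[str, dict[str, int]] = defaultdict(
        lambda: {"source": 0, "rejected": 0, "loaded": 0})
    with tracking_sheet.open(newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            obj = row["business_object"]
            totals[obj]["source"] += int(row["source_records"])
            totals[obj]["rejected"] += int(row["rejected_records"])
            totals[obj]["loaded"] += int(row["loaded_records"])
    return dict(totals)

if __name__ == "__main__":
    for obj, counts in summarize_by_business_object(Path("data_quality_tracking.csv")).items():
        load_pct = 100.0 * counts["loaded"] / counts["source"] if counts["source"] else 0.0
        print(f"{obj}: loaded {counts['loaded']:,} of {counts['source']:,} "
              f"({load_pct:.1f}%), rejected {counts['rejected']:,}")
```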

Cutover / Performance Testing
The planning and execution of the Cutover / Performance Test cycle is the responsibility of the Cutover Track Lead. The test is designed to simulate the production cutover process, so the focus for the migration team is on execution times and data quality. The migration team is responsible for executing the complete migration process with production volumes and for addressing performance and data quality issues.

Subject Area: Platform, Programs and Processes
Objective: Assume a stable environment. Monitor throughput performance.
Strategy: React to issues encountered with migration programs and processes; no specific testing is performed. Analyze and address performance bottlenecks.

Subject Area: Data Requirements
Objective: The full production data set is required for capturing timing statistics and validating data quality.
Strategy: The legacy data set needs to come directly from the source systems and applications; no manual creation of data is required.

Subject Area: Application Environment and Execution Plan
Objective: Perform in the Cutover environment. The migration flows need to be executed in the required sequence.
Strategy: Execute the migration process in the Cutover environment.

Subject Area: Application Business Object Unit Testing
Objective: Project dependent.

Subject Area: Application Business Scenario Testing
Objective: Project dependent.

Subject Area: Timing Statistics Tracking
Objective: Capture the execution times for each execution step within the migration flow for each business object.
Strategy: Capture the execution times in the tracking sheet created during the System test cycle.

Subject Area: Data Quality Tracking
Objective: Capture the record counts for the source data set, records rejected and records loaded for each step within the migration flow for each business object.
Strategy: Capture the record count results in the tracking sheet created during the System test cycle.

Subject Area: Test Results Tracking
Objective: Track the results at the business object level to reflect the completion of the end-to-end migration process. The key metrics include extraction, load and usability.
Strategy: The project plan needs to track metrics at the business object flow level. Issue tracking is owned by the Cutover Track Lead.

Subject Area: Success Criteria
Objective: Successful execution of the business object migration flow, a high percentage of quality data, functional usability of the migrated data and an acceptable execution time for the migration process.
Strategy: The Cutover Team is responsible for signoffs and approvals.
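
The timing statistics collected in earlier cycles can be turned into a rough, first-cut estimate of the cutover execution window before the simulation is run. The sketch below shows the arithmetic only; the business objects, volumes and throughput rates are placeholders, and sequential execution is assumed.

```python
# Rough cutover-window estimate from measured throughput (illustrative figures).
measured_rates = {                 # records loaded per hour, taken from the timing sheet
    "Customer Master": 60_000,
    "Material Master": 45_000,
    "Open Sales Orders": 20_000,
}
production_volumes = {             # full production record counts
    "Customer Master": 250_000,
    "Material Master": 300_000,
    "Open Sales Orders": 40_000,
}

total_hours = 0.0
for obj, volume in production_volumes.items():
    hours = volume / measured_rates[obj]
    total_hours += hours
    print(f"{obj}: {volume:,} records at {measured_rates[obj]:,}/h -> {hours:.1f} h")

# Sequential execution assumed; add contingency for validation and reconciliation.
print(f"Estimated sequential load time: {total_hours:.1f} h")
```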

