Data-Migration Methodology For SAP V01a
DATA MIGRATION METHODOLOGY FOR SAP
TABLE OF CONTENTS
SECTION 1 - INTRODUCTION TO DATA CONVERSION
  1.1 Introduction
      Overview
  1.2
      Think SAP
      Prepare the Legacy Database
      Before the last test run, take into account the customizations of your new system
      Reduce the amount of historical data to be transferred
      Use controls edition in SAP
      Small is beautiful
      Be wise
      Play it safe
  1.3

SECTION 2
  2.1 Overview
  2.2 Business Objects
      Data type
      Information to complete in the conversion plan
      Main Business Objects sequence of conversion
  2.3 WBS
  2.4 Calendar Planning
      Overview
      MS-Project or not
      Sequencing the tasks
      Key users and consultant availability to work on Master Data
      End 'at best' vs 'most probable'
      Are you sure?
      Workload analysis

SECTION 3
  3.1 Overview
  3.2 - 3.10

CONCLUSION

APPENDIX - VARIOUS TEMPLATES
1.1 INTRODUCTION
Overview
Implementing SAP is an important challenge, both in terms of resources (people, money, time) and in business processes. A lot is at stake and, for most of you, failure is not an option you can afford. To put the odds on your side, you need a good methodology: one that will provide you with a realistic plan, a solid organization, a way to manage the process, and control tools to detect and correct slippage before it becomes a problem.
An important part of this challenge will be the data conversion. Previous implementations of SAP have shown that data migration can amount to about 40% of the entire project. Poor data conversion will make your Go Live very difficult, if not impossible.
This guide is aimed at helping you organize the data conversion process, which in turn will lead to a successful implementation.
There are many things being worked on at the same time. Yet, most of them are not progressing.
There are documents all over the place and, somehow, they always seem to be outdated.
There is a lot of misunderstanding, friction and frustration between the functional and technical teams.
At Go Live
Master data deadlines were constantly missed and the production load is done in 'rush mode' at the last minute.
Some key parts of the data cannot be loaded in production. Patches are applied to the master data in order to force-load it.
Some data just will not get in at all; it will have to be entered after Go Live.
After Go Live
Some data needs to be corrected and entered after Go Live. Because the production system is now live, data are moving targets. This makes the process difficult and time consuming, which translates into a costly operation.
After discussing with people who lived through these situations (managers, functional and technical staff), we identified the following points:
Planning and resource load estimates were way off (when they existed at all).
Information does not travel well between the functional and technical teams. As you get near Go Live, this becomes much worse.
Philosophy vs. technique
The approach I take to data conversion is as much a state of mind as a technique. Both aspects must be applied for results to show up. This is actually true of any concept: most concept failures are due to applying the technique while neglecting its philosophical aspect.
The mindset required in our case is that we must do things right from the start and solve issues as they occur. Take the time it requires to do things properly and thoroughly. No expediting, no bypassing of steps, no piling up of unsolved issues just to keep going.
Results will initially be slower to come. However, because you will get things right the first time, you will eventually pick up an impressive speed. As in car racing, it is not the speed at which you enter a turn that matters most, it is the one at which you get out of it.
A few facts
The data conversion is not some technical stuff to hand to the programmers and wait until it comes back. Most, if not all, of the issues and problems you will encounter in the conversion process will be functional. Although the extract/load process itself will not be effortless, it is the part between the extract and the load that is the most difficult. Getting the right data to the right place with the values required for your business process is always a functional problem.
SAP is a process-oriented system and master data is an integral part of this process. Nice, but what does it mean? The answer is that everything is tied together. Master data depends on the customizing, the customizing is made according to the way you run your processes, and master data is needed to run those processes. If you change master data, it will most probably change the behavior of the process. If you change customizing, your master data may become incomplete or incorrect.
Whichever phase of the project you are in, data conversion always seems to be the one step that can be pushed a little further out when you run behind on the overall schedule. Doing this will put the conversion process too close to the end of the project. In that situation, you will end up shoveling a ton of data into SAP at full speed with little control, if any, over data quality and coherence. Remember the old saying: "garbage in, garbage out".
There is no 'easy does it' way to do the data conversion, and it takes time. Data conversion is made of a lot of brain work mixed with hard work and some programming. No technological gadget or guru will make it otherwise.
Organization of the data conversion (Project Manager & Data Conversion Coordinator)
Data conversion plan
The WBS with workload estimates
The calendar planning with resource loading
Going on with the Business Objects data conversion (the resource responsible for the Business Object DC)
Validation of converted data and Key User + Business Object Owner signoff
Full conversion into the PRODUCTION SYSTEM and final signoff
1.2
Think SAP
Forget your current system and understand SAP. First and foremost, get familiar with the SAP business processes you will be implementing. Then, according to the needs of the SAP processes, establish what the Master Data requirements are. Then, and only then, see what can be salvaged from your legacy system.
Think SAP; do not try to fit your old system into it.
Before the last test run, take into account the customizations of your new system
Because both the organizational structure and the actual customizing influence the data you transfer for business objects, finalize all customizations before the last test run. Customizing changes after the final transfer may result in additional required fields, which means preparing and transferring more data. They can also invalidate the loaded data, which leaves you with an incoherent data set that will be very costly to correct after Go Live.
Small is beautiful
Start small. The first time you transfer data, begin with a few records of a business object. This way, you learn how the
program works. After transferring some records successfully, try transferring a larger amount of data. Make sure that you
transfer each different type of data before you transfer on a larger scale.
Be wise
The full data integration into your production system is the end of the process and should mostly be a technical operation where we push some buttons to get some results. Reaching this goal implies that all functional and technical issues were dealt with before starting the full-size transfer from the Legacy System. The hard work is in the mapping and the establishment of conversion rules from the old system to the new. That is where you will make or miss your conversion. Don't even think about loading large volumes into production if you are not completely ready.
Play it safe
I strongly suggest that you perform a system backup (or client copy) after transferring a significant amount of data. The
backup allows you to secure a specific level you have reached during the data conversion process. If you have any
problems, you can return to this level, and you do not have to begin the process all over again.
1.3
SAP Data Transfer Made Easy guidebook. It can be found on the SAP Simplification Group web site.
Quick Reference Guide LSMW (How-to) and Presentation of LSMW. They can be found on the web site http://service.sap.com/lsmw (it requires a user name and password).
(Section overview: WBS, Calendar planning)
2.1 OVERVIEW
This section describes the organization of the conversion. This is the first building block of your conversion process and
must be completed right at the beginning of the project. This part of the process is to be done by the project manager and
the data conversion coordinator.
2.2 Business Objects
A Business object is a general category for data that defines something like material master, vendor master, stocks, orders,
purchase requisitions or organizational units. The first step is identifying which business objects are required in your SAP
implementation.
Data type
There are three types of data involved in a SAP system: master data, transactional data, and historical data.
Master Data. Application master data tends to be more static once defined. Most master data can be driven by the
legacy applications. Examples include vendors, customers, charts of accounts, assets, bills of materials, material
masters, info records, and so on.
Transactional Data. Transactional data is current and outstanding transaction data that needs to be captured from the
legacy system and defined to the SAP R/3 applications for business process completion. Examples include accounting
documents, open purchase orders, open sales orders, back orders, and so on.
Historical Data. Historical data needs to be brought over from the legacy system to the SAP R/3 System for reference
purposes. Examples include closed purchase orders, closed sales orders, summary general ledger information, and so
on.
What
Which business objects will be converted from the legacy system into SAP.

Where
Where are the data: which Legacy Systems are involved in the extraction.

How much
Estimate the number of records to be ultimately loaded into SAP.

How
Automatic extraction, manual entry, or a combination of both. The data transfer method you choose will determine the types of resources you need. For example, you may need temporary employees for the manual data entry and programmers for writing your own extraction programs. You need to know both what data is in your legacy system and which SAP applications correspond to the business objects that will be transferred. One person does not have to know all of this, but the people who know this information should work closely together.

Who
Key user (functional responsible for the BO conversion: rules, manual data corrections, tests, validations) and consultant.
This part seems easy enough. However, you will quickly see that getting a clear answer to these questions is no easy task. Take the time and energy needed to answer them meticulously. It will avoid a lot of going around in circles and save you a lot of time throughout the project.
Oh yes, one more thing: MAKE SURE that everyone whose name is on the document is aware of it, understands what it means, and approves it.
Main Business Objects sequence of conversion:

Pre-required:
- GL account master (includes primary cost & revenue elements)
- Profit center hierarchy
- Profit centers
- Cost center hierarchy
- Cost centers
- Characteristics
- Classes
- Activity types

Optional:
- Internal orders
- WBS elements (if PS module)

Then, in sequence:
- Material Master
- Work Centers
- Banks
- Document Master
- Customer Master
- Vendor Master
- BOM
- Condition records (discount, pricing)
- Purchase info records
- Sales info records
- Routings
- Texts
- Source lists
- Storage bins
- Open A/P
- Open A/R
- Opening balances
- Stocks (WM)
- Stocks (IM)
- Open sales orders
- Contracts
- Open purchase orders
2.3 WBS

Why a WBS?
Estimates for a project plan must be deduced from, and justified by, a logical process. They represent the real workload required for the different tasks of the project. The WBS is a great tool to figure out these numbers. It lets you estimate the workload of each task without any duration or calendar consideration. Ignoring the date factor helps you stay as objective as possible. The workload is calculated in Person/Days. Whether there are one or five persons assigned to a task, the workload is always the same. Using Person/Days will help in getting a more precise calendar plan and will make evaluating the conversion's progress easier.
WBS Sample
How to
The idea is to break the project into chunks and then break each of these into tasks. You then proceed to evaluate the workload required for each of these elements. It is much easier to get accurate and objective numbers on small, specific tasks than on a large chunk. How to break it down, and at which level, is more a mix of art and experience than a science. The more WBS you do, the better you will get at it.
If your WBS is not granular enough, your estimate is more difficult to get and will be less accurate. An error on one element will also have a greater impact. As for progress follow-up, it will be less accurate, since any detected slippage will involve higher numbers because the element itself is too big.
If the WBS is too granular, you will get lost in a forest of details and numbers. The follow-up will also be much more difficult, and it will be hard to get the whole team to use it (too complex).
In this methodology, the WBS I suggest is a middle ground between these two limits. I got to it by trial and error on different projects. I think it is granular enough to be precise and usable for efficient follow-up, yet simple enough for the whole team to easily contribute in evaluating the numbers.
It is important at this point to set aside completely any calendar planning or target date. Forget when this should be finished or how long it should last. Just try to figure out the real workload needed to complete each element of the DC process. After that, we will see how we can meet deadlines by acting on the organization of the project rather than by "fixing up" the estimates. Starting a WBS while keeping in mind a goal to meet, like a specific date or a target total of Person/Days, will only lead to a complacent plan that will be false and get you in trouble.
Volumes
- Quantity of records
- Quantity of fields
- % Manual fields

Business Objects Mapping & Conversion (for detailed information on these items, refer to section 3)

Total
When getting the numbers for the original WBS, you average each element. Overall, you underestimate some and overestimate others, but the law of averages will make the total a globally reasonable measure. However, if you start concentrating on some numbers while forgetting others, the law of averages goes out the window. This is why you must consider both the large and the small values when re-evaluating a WBS.
Here are some suggestions I give to those concerned when re-evaluating a WBS:
Explain clearly what a Person/Day is: "let's say you have only this one task to do and you have to do it alone; how many days will it take?".
Explain the work to evaluate. For example, making the conversion rules means: talking about them with the consultants and Legacy System experts; writing the first version; holding meetings to answer gray areas; doing some tests for uncertain fields; cross-reading the documents and, finally, some reflection time. As you can see, it is a lot more than just figuring how long it would take to write some lines of rules.
Count everyone's time. In the above example, you must count time for you, the consultants, the LS experts, those present at the meetings, those doing tests, etc. It adds up very quickly.
Explain the law of averages I mentioned above and make sure they re-estimate all the elements with the same scrutiny. While some high-workload tasks may be overestimated, some smaller ones are probably underestimated as well.
Avoid talking about deadlines or total workload. They have to evaluate all elements independently from each other.
I personally went through that process a few times. Interestingly, in all cases the re-evaluation turned out with a slightly higher global number, mainly because people realize there are more small but still time-consuming tasks than originally thought.
If there are two legacy systems, it will take twice the time (see 'Ballpark figures' below for more info).
As mentioned earlier, avoid thinking about deadlines or total workload. Just honestly evaluate each element independently from the others.
For some elements, you will be clueless. It is very difficult to find someone who knows it all, but there is always someone who can help you on a specific topic. This is where splitting a project into small elements will help. Do not hesitate to ask around.
Take into account the number of fields you need for each Business Object. If you have no clue, take 200 for Material Master, 100 for Customers, 100 for Vendors, 40 for BOM, 40 for Routings. These figures are for an implementation with the modules FI, CO, MM, PP and SD. Later on, you can adjust them to more exact values.
For BOM and Routings, if they are merged in a single structure in your LS (i.e. multilevel), count that BOM will take double the time you originally estimated and Routings will most probably have to be done manually from scratch. SAP is single level (unless it changed in newer versions), which means that the materials hierarchy is in the BOM and the operation sequences are in the routings.
Material Master is huge. It requires time and energy, lots of it. On top of being a difficult one, it is the first one you will
have to do.
There is much to learn while doing the Material Master, and this learning will call into question processes that key users thought they had already cornered.
Different people come up with their own sets of rules, which need to be put together in a single Material Master. This will create collisions and conflicts, which will need meetings, discussions and testing before the issues are solved.
The conversion rules are different for each Material Type, and it is not always the same key users who have the info for the different types.
Other than the Big Five, workload estimates are rarely linked to the number of fields. The key is then the quality of the Legacy System data. Here are some factors that will make the process much longer:
Historical data that was never purged.
Be conservative; when in doubt, overestimate rather than underestimate. No matter how much you investigate or know the LS, there is always one business object where you will discover, at the last minute, that it just will not fit into SAP without major unplanned efforts. It is not bad luck, it just happens every time. Bad luck is when you did not consider it in your planning.
If the data is not extracted from the LS but generated manually, it will take longer. The time is however more predictable, as manual data is rarely buggy.
When you extract data automatically from the LS, it should be faster. However, take into account that programming means possible bugs. It also needs modifications when the rules change (and change they will), which again may bring bugs.
If you have part automatic and part manual, like "yes, we can extract most of it, but we need to do some adjustments in Excel", add extra time (50 to 100% more). At first glance, this seems like the easiest way to go. Well, it's not! Trust me, these will be real headaches. Although they are almost impossible to avoid, try having as few of these as possible. In all cases, prefer maximum usage of conversion rules.
Ballpark figures
Here are some figures to give you a ballpark from the projects I worked on. These are not absolute figures, as they vary from one project to another.
In projects involving the modules FI, CO, MM, PP and SD, having from 20 000 to 40 000 material master items with all related BOM and Routings, about 2 000 vendors/customers, 10 000 inventory records and all other basic DC stuff, it gave me something between 400 and 600 Person/Days per legacy system.
I say per legacy system, and this is something important to consider. If you have different legacy systems, you tend to think the second one will go faster than the first. There is absolutely no gain. Each system must be evaluated as if it were a different project. If one takes 500 Person/Days, then two LS will take 1000 Person/Days. This will probably be a major point of disagreement within the team when you show your numbers. Keep in mind that for all the projects I worked on, it proved to be true. Mapping is different, conversion rules are different, and issues with LS data will not be the same. Since these three items represent the bulk of the DC process, you can see why two LS will take twice the energy.
Based on the volume data from your WBS, you can calculate as follows:
For mapping: count 10 min per field (0.02 day per field) and add 1 day to the total for setup and explanations.
Conversion rules: count 10 rules per day (0.1 day per rule).
Data and Rules Adaptation: count 12 seconds (~0.000416 day) per record per field (number of fields x number of records x 0.000416). There is more, later on, explaining what Data and Rules Adaptation is.
As you see, you need to establish how many fields need Data and Rules Adaptation. I use a percentage in the WBS so that I can recalculate all the workload easily as I learn more about the LS. Base this on the number of fields you will populate in SAP. I usually count that about 80% of the fields are solved by conversion rules and 20% will need data and rules adaptation. If the data are in bad shape in the LS, go toward 70%-30%.
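The formulas above can be sketched as a small calculator. This is only an illustration of the arithmetic in this section: the day-per-field factors are the rule-of-thumb values given above, and the sample field/record counts are the suggested defaults, not measured data.

```python
# Rough Person/Days estimate for one business object, using the
# rule-of-thumb factors from this section.

def dc_workload(num_fields, num_records, adaptation_pct=0.20):
    """Return the estimated workload in Person/Days."""
    mapping = num_fields * 0.02 + 1                 # 10 min per field + 1 day setup
    rules = num_fields * 0.1                        # ~10 rules per day
    adaptation_fields = num_fields * adaptation_pct # share needing adaptation
    adaptation = adaptation_fields * num_records * 0.000416  # 12 s per record per field
    return mapping + rules + adaptation

# Customer master: ~100 fields, 2 000 records, 20% of fields needing adaptation
print(round(dc_workload(100, 2_000), 1))  # -> 29.6 Person/Days
```

Recomputing with the adaptation percentage as a parameter is exactly why the WBS keeps it as a percentage: when you learn the LS data is dirtier than expected, you bump 0.20 to 0.30 and every number updates.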
This formula is most pertinent for the Material, Customer and Vendor masters. For BOM and Routings, the time is less dependent on the number of fields than on the complexity of the data to extract. For those two, you can use the formula and then add between 50% and 100%, depending on the legacy data complexity. As stated earlier, if BOM and Routings are merged in a single structure in your LS (i.e. multilevel), count that BOM will take double the time you originally estimated and Routings will most probably have to be done manually from scratch.
Other than the Big Five, the number of fields matters little. It is the complexity of the process that needs to be taken into consideration. If you really count all the time spent on one business object, none will take less than 10 Person/Days. Use your judgment and apply between 10 and 30 Person/Days per business object according to the expected complexity. Each time someone tells you "this is a one-day thing", make a note of it and follow the time it really took, from start-up to a loaded and validated data set: you will see, nothing takes less than 10 days.
Another business object which is also special is inventory. At first it looks simple enough, but getting 100% of the data into SAP will prove to be a challenge... and if you plan to shoot for less than 100%, go back to the beginning of this document.
If you use WM in SAP, it will be even more challenging.
If you use WM in your LS and it is not working perfectly, it will be a great challenge.
If you do not have WM in your LS and want to use WM in SAP, then you are in for a heck of a ride. In this situation, consider converting without WM and implementing it later, once your system has been stable for a while in production.
For inventory, count 30 days for IM plus up to 100 days for WM, according to the three possible scenarios I just mentioned. If you have a doubt, try finding someone who went through it before. Done right, the inventory load takes lots of work but the process will go well. Badly managed, it will keep you up 24 hrs/day for a few days before Go Live and after.
2.4 CALENDAR PLANNING
Overview
At this point, you have assigned resources in the Data Conversion Plan and estimated the workload for each of the WBS elements in Person/Days. You must now transform this information into a duration for each task; this is the calendar planning.
To do the calendar planning, using MS-Project or another planning tool, you will enter the tasks and complete them with the following information:
Task effort in Person/Days
Task dependencies
Names of the resources assigned to each task and the percentage of their availability on it
This will not only give you a calendar plan based on an objective workload estimate, but it will also permit quick identification of resource over-allocation, overlapping of dependent tasks, and delays due to non-working time and bottlenecks.
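The effort-to-duration arithmetic the planning tool performs can be sketched as follows. This is only a sanity-check illustration; the resource availability percentages are assumptions, not values from this document.

```python
# Calendar duration of a task from its WBS effort and the availability
# (fraction of a working day) of each resource assigned to it.

def duration_days(effort_person_days, availabilities):
    """availabilities: e.g. [0.50, 0.25] for a key user at 50% and a consultant at 25%."""
    capacity_per_day = sum(availabilities)  # Person/Days produced per calendar day
    return effort_person_days / capacity_per_day

# A 10 Person/Day task staffed by one key user at 50% and one consultant at 25%
print(round(duration_days(10, [0.50, 0.25]), 1))  # -> 13.3 calendar days
```

Note how the Person/Days figure from the WBS never changes: only the resource allocation stretches or compresses the calendar duration, which is exactly the lever recommended below.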
On most conversions, key user overload is a major problem. Your key users will be strongly solicited right from the beginning of the project. Keep in mind that the further you go in your project, the more they will be solicited to troubleshoot problems, on top of their normal conversion work. The result is that their availability will only get lower as the project goes on. Do not underestimate this fact in your planning.
Once you are done with the DC calendar planning, you must integrate it into the overall project plan and do a resource load analysis. This task is most difficult, time consuming and very frustrating (especially if you do not master MS-Project).
MS-Project or not
Most probably, the only planning tool you'll have available will be MS-Project. Although it is a nice tool, it also has a great talent for 'auto messing-up' your schedules (make backup copies and make them often).
My first advice is that you should learn the basics of MS-Project before you get into it. It will be a much less frustrating experience. Some quick-learning books can be found and are useful.
Whichever tool you use must be able to give you a resource load analysis. This will be a key element of your planning.
At the very end of the calendar planning, I will add a 200-day task with 5 resources. This will translate into a 20-day duration buffer for the lead key users.
Believe me, it is very difficult to do better than the most probable date.
- Do not parallelize tasks hoping to save time. There are only 24 hrs in a day and people need sleep.
- Do not forget the 20-25% margin.
- Do not change the Person/Days established in the WBS; they are the most objective values you can get.
- If you need to finish a task faster, never change an end date or the workload. Change the resource allocation to obtain the timeframe you want, then re-validate the resource workload.
Workload analysis
Here you are: now you have to identify the resource overloads and play with the task sequencing until all resources are at a normal workload.
This is a very difficult and frustrating step. In addition, since MS-Project will regularly mess things up for you, MAKE BACKUP copies before making changes to the calendar plan.
Once the plan is done and the resource workload is realistic, you are ready to go. From that point on, you'll only have to identify slippage as the project goes and take corrective action before it has an impact on the project duration.
Data Purging & Cleansing

(Process overview diagram: Project kick-off ... Load ... Unit Testing)
3.1 OVERVIEW
This section gives you information on the major steps involved for each Business Object. Each person who is responsible for a Business Object should read this.
In the previous section we saw one of the methodology's main ingredients. It involved mainly planning and is actually a basic management concept applicable to any kind of project, computing or otherwise. The second main ingredient of the methodology, the one that makes it so efficient, is the way we deal with the conversion process itself.
3.2 Data Purging & Cleansing
Purging and cleansing the Legacy System will save you a lot of time and effort in the following steps of the conversion. Start this as soon as possible and do as much as possible. It can be done without specific knowledge of SAP.
Data Purging
Before transferring data from your legacy system, delete all the old and obsolete data. For example, you may delete all one-time customers, or those for which there were no transactions in the last two years; also delete unused materials.
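As an illustration, a purge of this kind can be sketched as a simple filter over exported legacy records. The record layout and field names below are hypothetical, not an actual legacy schema:

```python
# Illustrative purge filter: drop one-time customers and customers with no
# transaction in the last two years. Field names are made up for the example.
from datetime import date, timedelta

def purge_customers(customers, today):
    cutoff = today - timedelta(days=2 * 365)
    return [c for c in customers
            if not c["one_time"] and c["last_transaction"] >= cutoff]

legacy = [
    {"id": "C001", "one_time": False, "last_transaction": date(2003, 5, 1)},
    {"id": "C002", "one_time": True,  "last_transaction": date(2003, 6, 1)},
    {"id": "C003", "one_time": False, "last_transaction": date(1998, 1, 1)},
]
kept = purge_customers(legacy, today=date(2003, 9, 1))
print([c["id"] for c in kept])  # -> ['C001']
```

The point is that the purge criteria are explicit and repeatable: you can run the same filter on every extract and get the same result, instead of purging by hand.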
Data Cleansing
This process corrects data inconsistencies and ensures the integrity of the existing data during the migration process.
For example, there are often lots of inconsistencies in Customer and Vendor address fields. You will quickly find that
SAP will not let you load any address fields unless you get them clean.
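A minimal cleansing sketch for such address fields follows. The normalization steps shown (whitespace, casing, a couple of abbreviations) are illustrative only, not an exhaustive rule set:

```python
# Illustrative address cleansing: normalize whitespace, casing and a few
# abbreviations so that the same address is always represented one way.
import re

def clean_address(raw):
    addr = re.sub(r"\s+", " ", raw).strip()  # collapse stray whitespace
    addr = addr.title()                      # consistent casing
    addr = addr.replace("St.", "Street").replace("Ave.", "Avenue")
    return addr

print(clean_address("  125  MAIN st. "))  # -> '125 Main Street'
```

Running every legacy address through one function like this, before the load, is what turns "lots of inconsistencies" into a data set SAP will accept.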
3.3 Conversion Rules

The documentation of each business object will contain the data conversion rules (or specification), which include:
Key users are asked to question their Legacy System values and integrity.
Rules permit a clear statement of what a key user thinks, making it possible to identify conflicts and
misunderstandings between domains.
Rules documents can be versioned, making change management easier as the project progresses.
Communication between functional and technical people is facilitated by using a common ground language.
You'll have to keep in mind that these will be written by key users who may not be familiar with writing computing rules.
Therefore, it is necessary to give some examples and to explain some basic key elements of rule writing.
Basic properties of a rule:
Field names
This is a crucial one. When discussing or writing notes, ALWAYS refer to a field in the form TABLE-FIELD.
You will quickly realize that as the project goes, different people will start using different names for the same
field. As well, they may start using the same name for different fields.
On top of this, some fields exist in different views in SAP master data. Sometimes it is the same field that is
shown in two places, while other times it is really two different fields. The best way to know which case applies
is to have the TABLE + FIELD information.
Example:
In Material Master, the field Availability check exists in the "MRP2" and the "Sales Gen" views. If you
look at the TABLE-FIELD of each view you get:
MRP2 : MARC-MTVFP
Sales Gen : MARC-MTVFP
It is the same field, simply shown in two different views.
Taking the example of the field Goods receipt processing time in days (MARC-WEBAZ), it can be
decided among the domains to put it in the Purchasing view (and nowhere else).
This means that Purchasing is the lead on this, but it does not stop other domains from using it or having specific
rules for it. It is however Purchasing's role to make sure this field is used correctly.
Be careful, make sure it is really the same field. For example:
In Customer Master, the field "Terms of Payment" exists in the "Payment Transactions" and "Billing"
views. If you look at the TABLE-FIELD of each view you get:
Payment Transactions : KNB1-ZTERM
Billing : KNVV-ZTERM
It is not the same field. In the Payment Transactions view, the field is linked to the Company Code while in
the Billing view it is linked to the Sales Organization (you find this by looking at the table
keys). So these two fields can have different values.
SPECIAL MULTI VIEWS
It can however happen that no specific domain can be identified as the lead or main user of a field.
Let's take for example MM / PP status (MARC-MMSTA), which is in the views: Purchasing, MRP1,
Work Scheduling, Costing 1, Quality Management.
If no one can be clearly identified as the lead for that field, then we put it in a dummy view called
SPECIAL multi views. This is used to hold all the fields that exist in different views and for which we
can't assign a lead functional domain.
2nd step: Regroup the selections of all domains
Once this is done for each domain, then you (the DC manager) have to put together all the results. This would
yield something that looks like this:
3rd step: Build the data conversion rules template with all the selected fields
Specify for each field which domain selected it. If more than one domain selected the same field, put the names of
all the domains who selected it.
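The regrouping step can be sketched as a simple merge of the per-domain field selections; the domain names and field choices below are illustrative:

```python
# Per-domain field selections (TABLE-FIELD form), as each domain hands
# them in. The entries here are examples, not a real selection.
selections = {
    "Purchasing": ["MARC-WEBAZ", "MARA-MATNR"],
    "MRP":        ["MARC-MTVFP", "MARA-MATNR"],
    "Sales":      ["MARC-MTVFP"],
}

# Invert: for each selected field, list every domain that asked for it.
template: dict = {}
for domain, fields in selections.items():
    for field in fields:
        template.setdefault(field, []).append(domain)

for field, domains in sorted(template.items()):
    print(f"{field}: selected by {', '.join(domains)}")
```

Fields claimed by several domains stand out immediately, which is exactly where the lead-domain discussion of the previous step is needed.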
You now have Material Master conversion rules documented and a TODO LIST to follow up on the issues to be solved
before you can load the data.
RULE G002
Note that the SAP term "Security deposit" equals "Retention" in PRMS.

RULE G003
Type of transaction
Mapping of the TYPE field in PRMS:
    Partial PMT  -> Pay.
    Credit Memo  -> Cr M.
    Debit Memo   -> Dr M.
    Invoice      -> Inv.
    Non A/R cash -> Non AR
    Adjustments  -> Adj
Any other type is an error.
Validation to apply both at extraction and load:
    Partial PMT must be negative in PRMS, if not ERROR
    Credit Memo must be negative in PRMS, if not ERROR
    Debit Memo must be positive in PRMS, if not ERROR
    Invoice must be positive in PRMS, if not ERROR
    Any other type is an ERROR.

LSM Load parameters:
    KTOPL - Chart of accounts : CA00
    BUKRS - Company code      : 0070
    GSBER - Business Area     : 0040
    BUDAT - Posting Date      : 05-31-02 or last day of last closed period
    OFFSET Account (2)        : REPRISECL
    SKPERR - Skip err         : X
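The mapping and sign validations of rule G003 can be coded once and applied at both extraction and load. A sketch follows; the error message wording is illustrative:

```python
# Rule G003: map the PRMS TYPE field to its SAP code and enforce the
# sign conventions. Apply at extraction AND at load.
TYPE_MAP = {
    "Partial PMT": "Pay.", "Credit Memo": "Cr M.", "Debit Memo": "Dr M.",
    "Invoice": "Inv.", "Non A/R cash": "Non AR", "Adjustments": "Adj",
}
MUST_BE_NEGATIVE = {"Partial PMT", "Credit Memo"}
MUST_BE_POSITIVE = {"Debit Memo", "Invoice"}

def validate(tx_type: str, amount: float) -> str:
    """Return the SAP code, or an ERROR string if the record is invalid."""
    if tx_type not in TYPE_MAP:
        return "ERROR: unknown type"
    if tx_type in MUST_BE_NEGATIVE and amount >= 0:
        return f"ERROR: {tx_type} must be negative"
    if tx_type in MUST_BE_POSITIVE and amount <= 0:
        return f"ERROR: {tx_type} must be positive"
    return TYPE_MAP[tx_type]

print(validate("Invoice", 100.0))     # Inv.
print(validate("Credit Memo", 50.0))  # rejected: must be negative
```

Sharing one validation routine between the extract and the load program keeps the two ends of the pipeline from drifting apart when the rule changes.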
Directory organization
As you go, you will end up with lots of documents and versions. To store the different files on your local server, use a
specific directory structure. I suggest having a structure with a directory for each Business Object to store all the files
relevant to the data conversion. Here is an example.
C:\
  Data Conversion
    00 - Organization
      DC PLAN
      DC WBS
      DC SCHEDULE
      Old
        < Store here previous versions of above mentioned documents >
    01 - Material Master
      Material Master - Field Selection Sheet.xls
      Material Master - Data Conversion Spec.doc
      < Keep here only the latest version of each document >
      Old
        < Store here previous versions of above mentioned documents >
      Working Files
        < Put here various working files >
    xx - BO name
      BO - Data Conversion Spec.doc
      < Keep here only the latest version of each document >
      Old
        < Store here previous versions of above mentioned documents >
      Working Files
        < Put here various working files >
Freezing of V02
Once everyone agrees that V02 is OK (functional and technical staff), freeze the version:
Password protect V02, so no one changes it afterwards.
Make a copy of the document as V03.
In V03, accept all changes so that there is no visible change.
In V03, activate MS-Word change tracking (should already be on).
Put V02 in the "Old" directory.
Unprotect V03.
And so on.
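The file-handling part of this freeze-and-roll procedure can be scripted. A sketch in Python, assuming the VNN naming convention shown above; the Word-specific steps (password protection, accepting changes, change tracking) stay manual and are only noted as comments:

```python
import shutil
import tempfile
from pathlib import Path

def roll_version(doc: Path, old_dir: Path) -> Path:
    """Copy a frozen 'VNN' document to 'VNN+1' and archive 'VNN' in Old/."""
    stem, version = doc.stem.rsplit("V", 1)            # e.g. "... Spec V02"
    next_doc = doc.with_name(f"{stem}V{int(version) + 1:02d}{doc.suffix}")
    shutil.copy2(doc, next_doc)                        # V03 starts as a copy of V02
    old_dir.mkdir(exist_ok=True)
    shutil.move(str(doc), str(old_dir / doc.name))     # V02 goes to Old/
    # Manually, in Word: accept all changes in V03, keep change tracking
    # on, and remove the password protection from V03.
    return next_doc

with tempfile.TemporaryDirectory() as tmp:
    doc = Path(tmp) / "Material Master Spec V02.doc"
    doc.write_text("frozen spec")
    new_doc = roll_version(doc, Path(tmp) / "Old")
    print(new_doc.name)  # Material Master Spec V03.doc
```

Scripting the mechanical part removes the most common mistake in this routine: editing the frozen copy instead of the new version.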
If you are not familiar with MS-Word change tracking, I strongly suggest that you get acquainted with this functionality.
If you have many large MS-Office documents, at least a few of them will become totally corrupted during the project. It
always happens.
It also always happens that someone messes up a document, usually the most critical one.
This all happens because of a certain Murphy's law, and there is no way around it.
3.4 EXTRACTION AND LOAD PROGRAMS
The rules documents for each BO have sections for Legacy sources and extraction procedures and for Purging and
cleansing rules. These should explain what to clean/purge, what to extract and how to proceed. If they do not, IT MUST BE
DONE BEFORE YOU DO ANY PROGRAMMING.
It is essential that this be clearly documented and validated by the key user responsible for the BO. We are at the start of the
process and any error or omission at this point will affect us for the rest of the project. Once this is all documented,
validated and understood, you can start work on the extraction programs and process.
Once the extraction is clear, we must look at the load programs and process. This covers all the aspects of loading the
required fields into each BO. Again, as for extraction, if there is ambiguity or incomplete information in the specs, DO
NOT PROCEED. At this point, take your time to solve the issues you can see on paper before you do any programming.
3.5 THE ITERATIVE CONVERSION PROCESS
This is, by far, the most time consuming part of the conversion process. This is also the most difficult one, where System
Migration Managers can lose control of the conversion process. Remember this throughout the whole process: as in car
racing, it is not the speed at which you enter a turn that is most important, it is the one at which you get out of it.
The migration process is an iterative one.
After you have the specs, you do the extract and load programs
Some questions or issues will arise. You must address them and document in the specs whatever
corrective actions are needed. The specs must always be updated to reflect the new requirements
BEFORE you correct any program.
Do unit testing of the load
Some questions or issues will arise. You must address them and document in the specs whatever
corrective actions are needed. The specs must always be updated to reflect the new requirements
BEFORE you correct any program.
Following the modifications, you need to redo the unit testing of the load.
Full size test cycle
Some questions or issues will arise. You must address them and document in the specs whatever
corrective actions are needed. The specs must always be updated to reflect the new requirements
BEFORE you correct any program.
Following the modifications, you need to redo the unit testing of the load and redo a full size test cycle.
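The spec-first discipline of these iterations can be modeled as a loop in which the spec is always corrected before the load program is regenerated. A toy sketch with invented data, not a real build system:

```python
def run_cycle(load, data):
    """Run one test cycle; return the list of issues found."""
    return [issue for issue in (load(rec) for rec in data) if issue]

def iterate(spec, data, make_loader, fix_spec):
    """Repeat cycles until clean. The spec is updated FIRST; only then
    is the load program regenerated from the new spec."""
    cycles = 0
    while True:
        cycles += 1
        issues = run_cycle(make_loader(spec), data)
        if not issues:
            return spec, cycles
        fix_spec(spec, issues)   # document the change in the spec ...
        # ... the next pass rebuilds the loader from the updated spec

# Toy example: the spec whitelists transaction codes; the data contains
# one code the spec forgot, which a cycle surfaces as an issue.
spec = {"allowed": {"Inv."}}
data = ["Inv.", "Cr M.", "Inv."]
make_loader = lambda s: (lambda rec: None if rec in s["allowed"] else rec)
fix_spec = lambda s, issues: s["allowed"].update(issues)

final_spec, cycles = iterate(spec, data, make_loader, fix_spec)
print(cycles)  # 2: one failing cycle, then one clean re-test
```

The point of the model is the ordering: the program is never patched directly; every fix flows through the spec and forces a re-test.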
For all the System Migrations I did, these iterations accounted for a major part of the DC process duration. This is time
consuming and can be very frustrating if not properly managed.
In a nutshell, here are the guidelines most useful at this point:
1. First, when something needs a change, it must be documented and validated in the specs so that it is clear and
unambiguous to all. Keep in mind that a change from one domain can affect others, and they will find this out only
later in the process, once everyone has forgotten what exactly was changed.
2. The Business Object key user is the one accountable for the rules (all of them) relating to his/her BO, for their
maintenance and for the end result. This is not the technicians' part of the work. The tech team is there to develop
programs according to what the rules say.
3. If it is not in the rules, it does not exist. So if you do not see what you need, get it documented.
4. Have the discipline to manage change by versioning the documents and making sure they are cross-validated by
all implicated stakeholders.
5. It is better to take your time and do it correctly rather than rushing it, which usually means you may have to slow
down at some points. Results will initially be slower to come, but you will eventually progress faster and faster. Just as
problems can accumulate in a snowball effect, so does success (and this is good news). Take the time to do it
right now and you will eventually pick up an impressive speed in the next steps.
6. The specs must always be updated to reflect the new situation BEFORE you correct any programs. You must be
thinking: OK, stop repeating the same thing again and again, I got it. I do so because this is the most common
error I saw on projects that failed. Trying to save time, a team starts working on new solutions without taking the time
to document, cross-read and validate the specs. They sure are getting output quickly this way, but are they going
in the right direction?
If you start running in all directions and can't keep your head out of the water, STOP, and go back to step 1.

3.6 UNIT TESTING OF THE LOAD
Here we want to test the load programs at a unitary level. The goal is to see if we can load all the fields for all the data types
without error. We are more concerned with getting through the load cycle without SAP stopping us than with validating
the correctness of the values. For this we use a very small volume of data, usually creating data manually from scratch
rather than using real extracted data.
The kinds of issues you usually encounter at this point look like the following:
Some mandatory fields are missing.
Some dependency between two fields in one view was not considered and you can't load as expected.
Invalid values were given to you in the rules.
Errors in the load programs.
Using a load program, SAP did not behave as you expected (it sometimes behaves very differently with a load tool
than it does with manual data entry).
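A pre-load sanity check can catch the first three kinds of issues before SAP does. A sketch follows; it uses real SAP field names (MARA-MATNR, MARA-MTART, MARC-EKGRP) but the mandatory list, value list and dependency rule are invented for illustration, not actual SAP checks:

```python
# Illustrative pre-load checks for a Material Master record.
MANDATORY = ["MARA-MATNR", "MARA-MTART"]
VALID = {"MARA-MTART": {"FERT", "ROH", "HALB"}}

def check_record(rec: dict) -> list:
    errors = [f"missing {f}" for f in MANDATORY if not rec.get(f)]
    for field, allowed in VALID.items():
        if rec.get(field) and rec[field] not in allowed:
            errors.append(f"invalid value for {field}: {rec[field]}")
    # Example two-field dependency (hypothetical rule): a raw material
    # needs a purchasing group before it can be loaded.
    if rec.get("MARA-MTART") == "ROH" and not rec.get("MARC-EKGRP"):
        errors.append("ROH material requires MARC-EKGRP")
    return errors

rec = {"MARA-MATNR": "100-100", "MARA-MTART": "ROH"}
print(check_record(rec))  # ['ROH material requires MARC-EKGRP']
```

Every rule the key users write in the specs can be mirrored here, so a load attempt only starts once the file passes cleanly.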
As mentioned earlier, the conversion process is iterative. Following the issues you'll find here, you will have to go back to
the functional key user, find solutions and document them in the specs (this is the responsibility of the key user).
3.7 FULL SIZE TEST CYCLES
At this point, we want to know if we can load the data with the processes we developed (extraction and load), as well as
validate the results of the conversion in SAP.
At the end of this step, we will have a fully functioning and validated conversion cycle.
This is done in two steps:
1. Load data that comes from the full extraction process (starting with a small size and progressing with larger data sets,
up to 50% of the complete data to be converted).
2. Load a full size data set (at least 50% of the complete data to be converted and progressing towards 100%) using
the full Extract and Load cycle. The goal is to achieve 100% of loaded data.
After each load, the functional key user that is responsible for the business object must validate the data. This is time consuming and
must be done as soon as possible. Remember the bottleneck with key users I mentioned earlier in the planning section: the
farther you are in the project, the less availability you'll get from the key user responsible for the BO.
Finally, of course, following the issues you will find here, you will have to go back to the functional key user, find solutions
and document them in the specs (this is the responsibility of the key user).
3.8 READINESS FOR PRE-PRODUCTION
Before going into pre-production, you must be able to start from an empty SAP client, extract, load 100% of the data, and
have full data validation from the BOs' key users and the BOs' Owners.
From this point on, if you change something, you must redo a full extract-load-validation cycle for the affected BO. Failure
to do so (i.e. making last minute changes and not validating them full size) will 99% of the time create bugs in the load
process at production time.
Again, to achieve this you will need lots of discipline. While you can correct bugs in dev, it is difficult (and suicidal) to correct them in
pre-prod. If at the end you load in production and create data inconsistencies, it can take up to a year to
correct this. Because the production system is living its own life and can't be controlled as in dev, correcting Master Data
errors is like shooting a moving target: the more you try to fix it, the more you seem to break everything around it. I
have seen teams cheating on this step and still having corrupted BOMs after a year. Since BOMs affect Purchasing, Costing,
Manufacturing, Inventory, etc., you can imagine what condition their SAP system was in after a while.
3.9 PRE-PRODUCTION AND PRODUCTION LOADS
Since you went through all the steps and followed all the methodology guidelines, this is just a formality.
You do a full load in pre-prod, starting from a copy of prod and doing everything exactly as you will do in prod.
You get the whole thing validated by the BOs' key users and the BOs' Owners.
Then you get written signoff from the BOs' Owners that all is A-OK (insist on having it written, not verbal).
After that, you do the final production load and get final signoff.
Finally, you have nothing else to do, as by following this methodology you managed to load 100% accurate Master
Data and no rework is needed (yes, this happened to me for real).
Here are the guidelines to follow here. Again simple, but requiring a lot of discipline:
It is better to make manual corrections, or to create programs that correct the data after loading it. This
way you do not induce regression into the extract/load process.
When it is time to load in prod, do exactly as you did in pre-prod, and then apply the manual changes or
run the data correcting programs exactly as you did in pre-prod.
If you must change anything in the load/extract process or programs, you must again do a full pre-prod load test/validation of everything.
CONCLUSION
Now you know how I did different Master Data migrations faster and better than others did. I successfully used this
methodology in different industries, in different countries, with starting projects as well as with projects already running
that needed to be turned around.
The methodology itself is mainly a mix of good old common sense and management 101. Each step is the foundation
needed for the next step. Complete each step at 100% and the next one will be easier. This will have a snowball effect,
permitting you to gain more speed from step to step. Not following this rule will also have a snowball effect, but in the
other direction, reaching a point where the conversion process becomes totally out of control.
The difficulty is not in understanding the methodology. Pressure to show rapid results (any results), the tendency to push
issues forward so we can keep progressing, and resistance to slowing down when the process is starting to get out of hand:
these are the most difficult challenges you will have to overcome in order to complete each step at 100%.
Although it may sometimes look to others like you are taking the long route to get there, remember that the objective is not
to finish a specific step ASAP. The goal is to complete the whole process in the best time possible and to deliver a
complete set of Master Data that will not need rework once in production.
This is how I managed to keep my head above the water and continuously see where we were going, even when everyone
else seemed to be in crisis.
B - WBS template
WBS Template.xls
Routing conversion spec sample.doc
Materials Classes and Characteristics structure.ppt