Oracle BI Applications 11.1.1.9.2 Upgrade Guide
Table of Contents
Overview ....................................................................................................................................................... 4
Pre-requisites ................................................................................................................................................ 5
NOTE: Need to select each component one by one and upgrade it in PSA .......................................... 15
Post-Upgrade Tasks ......................................................................................................................... 73
APPENDIX ........................................................................................................................................ 73
Financials ..................................................................................................................................... 76
Procurement ............................................................................................................................... 79
Projects ....................................................................................................................................... 83
Manufacturing ............................................................................................................................ 85
This document describes the steps to upgrade BI Applications 11.1.1.8.1 PB6 (patch bundle 6) to version
11.1.1.9.2.
Overview
The Upgrade of BI Applications 11.1.1.8.1 PB6 to version 11.1.1.9.2 consists of the upgrade of the
following components, repositories (schema and content) and data:
1. Platform components
2. BI Applications binaries
3. BI Applications Component Repository (BIACOMP)
4. JAZN, RPD and Presentation Catalog
5. ODI Repository content (BIA_ODIREPO)
6. Business Analytics Warehouse (DW) - schema
7. Data Migration of existing data in the Business Analytics Warehouse
NOTE: The upgrade from BI Applications 7.9.6.x to BI Applications 11.1.1.9.2 is not supported. Upgrade
from BI Apps for Fusion Apps (11.1.1.5.x, 11.1.1.7.0, 11.1.1.8.0) is also not supported.
Sequence of Steps
The sequence of the steps in the upgrade of BI Applications 11.1.1.8.1 PB6 to 11.1.1.9.2 is outlined
below.
1. Complete Upgrade Pre-requisites.
2. Run the BI Applications 11.1.1.9.2 installer to upgrade the BI Application binaries from version
11.1.1.8.1 PB6 to 11.1.1.9.2.
3. Apply the FMW Middleware Patches for BI Applications 11.1.1.9.2.
4. Use the PSA tool to upgrade BIACOMP schema (ATGLite, FSM, BIACM and BIACM_IO component
upgrades).
5. Run script to upgrade deployment changes in BI Applications 11.1.1.9.2.
6. Use the BI Update Metadata Tool to upgrade the JAZN
Pre-requisites
Complete the following pre-requisites before performing the upgrade from BI Applications 11.1.1.8.1
PB6 to BI Applications 11.1.1.9.2.
Review the Certification Matrix for BI Applications version 11.1.1.9.2. The Certification Matrix is
available on the Fusion Middleware Certification Page on Oracle Technology Network (OTN).
I.
Select the inventory directory and OS group name and click OK. The next step displays a pop-up
box where you need to run some scripts with root access, as shown below.
4. The Specify Installation Location screen displays the MW_HOME and BI_ORACLE_HOME for your
existing BI Applications 11.1.1.8.1 PB6 environment. Verify the locations and click Next.
5. A Warning dialog is received asking if you wish to upgrade the existing BI_ORACLE_HOME. Click
Yes.
6. On the Summary screen, review the installation details and click the Install button to proceed.
7. Click Next on the Installation Progress screen when the installation is complete.
8. Click Finish on the Complete screen to complete the installation.
The version of ODI used by BI Applications has not changed between BI Applications 11.1.1.8.1 PB6 and
11.1.1.9.2. The version of ODI supported for BI Applications 11.1.1.9.2 is 11.1.1.7.0. An ODI patch is
applied to 11.1.1.7.0 as part of this FMW patch application step.
To apply platform patches:
NOTE: You will run a script to apply the patches. The script is a Perl script and is available in
<BI_Oracle_Home>/biapps/tools/bin/APPLY_PATCHES.pl.
The Perl script you will run to apply the patches requires a parameter input file
(apply_patches_import.txt). In this procedure, before you run the Perl script, you will update the
parameter input file to reflect the appropriate directory paths.
1. Ensure that the WebLogic Administration Server, BI and ODI Managed Servers, Node Manager
and BI processes are shut down.
2. Download "Oracle Fusion Middleware Platform Patches for Oracle Business Intelligence
Applications" and "Oracle Fusion Middleware Platform Patches for Oracle Business Intelligence
Applications for <OS>" from the Oracle Business Intelligence Applications 11.1.1.9.2 media pack
on Oracle Software Delivery Cloud. Download all parts.
3. Extract all .zip files into the same Patch Home directory, as follows:
Extract the contents of the downloaded .zip files containing the patches into the same directory,
for example, C:\patches or PATCH_HOME/patches.
Note: The directory structure of the extracted contents is not patches4fa/dist/ps6rc3. The
patches are contained in folders: biappsshiphome, odi, weblogic and oracle_common. You do
not have to unzip the individual patches.
4. Update the parameter input file (apply_patches_import.txt) to reflect the paths as specified in
the text file:
1. Create a writable directory where logs and temporary patch files will be stored. In the
apply_patches_import.txt file, you will set the WORKDIR parameter to point to the path
for this directory.
2. Open apply_patches_import.txt, which is located in the
<BI_Oracle_Home>/biapps/tools/bin directory.
3. Specify the following directory paths:
Directory                      Path
JAVA_HOME
INVENTORY_LOC
ORACLE_HOME
MW_HOME
COMMON_ORACLE_HOME
WL_HOME
ODI_HOME
WINDOWS_UNZIP_TOOL_EXE
WORKDIR
PATCH_ROOT_DIR
For example: C:\patches or PATCH_HOME/patches
The patch application writes log files including:
biappshiphome_generic_patches.log
biappshiphome_<OS specific>_patches.log
odi_generic_patches.log
oracle_common_generic_patches.log
weblogic_patching.log
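For reference, a hypothetical apply_patches_import.txt is sketched below. The parameter names are those listed above; every value is a placeholder for your environment, and the KEY=VALUE layout is an assumption; treat the file shipped in <BI_Oracle_Home>/biapps/tools/bin as the authoritative format.

```text
# Hypothetical apply_patches_import.txt (all paths are placeholders)
JAVA_HOME=/u01/app/java/jdk1.7.0_80
INVENTORY_LOC=/u01/app/oraInventory
ORACLE_HOME=/u01/app/fmw/Oracle_BI1
MW_HOME=/u01/app/fmw
COMMON_ORACLE_HOME=/u01/app/fmw/oracle_common
WL_HOME=/u01/app/fmw/wlserver_10.3
ODI_HOME=/u01/app/fmw/Oracle_ODI1
WINDOWS_UNZIP_TOOL_EXE=C:\tools\unzip.exe
WORKDIR=/u01/patches/work
PATCH_ROOT_DIR=/u01/patches
```

With the file updated, the script would then be invoked with Perl from <BI_Oracle_Home>/biapps/tools/bin, for example perl APPLY_PATCHES.pl apply_patches_import.txt; the exact argument form is an assumption, so check the script's usage notes before running it.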
Enter database details such as the database host, port and SID in the PSA screen. Enter sys as
sysdba as the username and enter the sys password.
Click on <button> to find the BIACOMP schema name and select it. Click Next.
The next screen will show the components to be upgraded. It will list entries for ATG, FSM, BIACM
and BIACM_IO. Select them and click Next.
PSA will upgrade those components in the BIACOMP schema and finally show a success message.
Using a SQL client tool, log in as sys to the database that has the BIACOMP schema, and run the
following:
update schema_version_registry
set version='11.1.1.9.2', upgraded = 'Y', start_time=sysdate, modified=sysdate
where OWNER='OTBIE_BIACOMP' and comp_id='BIACM';
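As a quick sanity check after the update, you can query the same row back. This is a hedged example: the OWNER value assumes the same OTBIE_BIACOMP schema owner used in the update statement above, so adjust it to your prefix.

```sql
-- Confirm the BIACM component now reports the upgraded version.
SELECT owner, comp_id, version, upgraded
FROM schema_version_registry
WHERE owner = 'OTBIE_BIACOMP'
  AND comp_id = 'BIACM';
-- version should be '11.1.1.9.2' and upgraded should be 'Y'.
```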
NOTE: You need to select each component one by one and upgrade it in PSA.
Example
Linux:
./wlst.sh /Middleware_Home/Oracle_BI1/dwtools/scripts/REL8_REL9_postupgrade.py
/Middleware_Home/user_projects/domains/bifoundation_domain
Windows:
wlst.cmd C:\Middleware_Home\Oracle_BI1\dwtools\scripts\REL8_REL9_postupgrade.py
C:\Middleware_Home\user_projects\domains\bifoundation_domain
This will deploy a shared library on the ODI server.
2. Start all the servers.
Optional Parameters:
log.level - default is INFO. Valid values are FINEST, FINE, INFO, WARNING, SEVERE.
2. Restart all BI processes using Oracle Process Manager and Notification Server (OPMN).
3. It is recommended to update GUIDs in OBIEE.
10.
This section describes the steps to upgrade an existing BI Applications ODI Repository from version
11.1.1.8.1 PB6 to 11.1.1.9.2.
In order to retain the existing topology and security configuration in ODI while at the same time
minimizing the impact on other tools that interact with ODI (that is, BI Applications Configuration
Manager and the BI Applications instance of WebLogic Server), the upgrade process has you export the
existing or pre-upgrade configurations from the ODI Master Repository, drop the existing pre-upgrade
ODI Work and Master Repositories, import the new upgrade ODI Work and Master Repositories into the
same schema, then import the pre-upgrade ODI Repository configurations.
This effectively replaces all content (interfaces, packages, models, knowledge modules, load plans, etc.)
while retaining the configuration definitions.
The process also allows you to retain any customizations performed by exporting the customizations
from the pre-upgrade repository and importing them back in after the repository has been upgraded.
The overall flow is:
1. Export content from the 11.1.1.8.1 ODI Repository: export connections, export security, export
customized datastores, and export customizations (export the Custom folder).
2. Drop the 11.1.1.8.1 ODI Repository schema.
3. Connect to the ODI Repository for 11.1.1.9.2.
4. Import content into the 11.1.1.9.2 ODI Repository: import connection details, import security,
reconfigure external authentication, change the ID of the 11.1.1.9.2 ODI Repository, verify the
imported details, and regenerate load plans.
Creating a Backup of the 11.1.1.8.1 ODI Repository
Use the Oracle Database Export and Import utility to export the ODI Repository for BI Applications
11.1.1.8.1 PB6 from the existing schema and import it into another schema. This will allow you to
connect to the 11.1.1.8.1 PB6 ODI repository in case you need to reference it.
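The guide does not mandate a particular export tool; one common option is Oracle Data Pump. The sketch below is hypothetical: the schema name DEV_BIA_ODIREPO, the backup schema name, the credentials and the DATA_PUMP_DIR directory object are all placeholders for your environment.

```shell
# Export the pre-upgrade ODI repository schema (all names are placeholders).
expdp system/<password> schemas=DEV_BIA_ODIREPO directory=DATA_PUMP_DIR \
  dumpfile=bia_odirepo_8196pb6.dmp logfile=bia_odirepo_exp.log

# Import it into a different schema so the 11.1.1.8.1 PB6 repository
# stays available for reference after the original schema is dropped.
impdp system/<password> directory=DATA_PUMP_DIR dumpfile=bia_odirepo_8196pb6.dmp \
  remap_schema=DEV_BIA_ODIREPO:DEV_BIA_ODIREPO_BAK logfile=bia_odirepo_imp.log
```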
This section describes how to create a new connection in ODI Studio to the back-up of the ODI
Repository for 11.1.1.8.1 PB6 which is now in a new database schema. See section Creating a Back up of
the 11.1.1.8.1 ODI Repository above for reference.
To create a new connection in ODI Studio to the back-up of the ODI Repository for 11.1.1.8.1 PB6:
Configure the connection to use the same ODI user you used before. Configure the connection as a
Master Repository connection. Connect to the repository and navigate to Topology -> Repositories ->
BIAPPS_WORKREP. Edit the repository to change the database user details used by the Work Repository
to use the new schema.
Disconnect from the repository, reconfigure the connection to include the Work Repository, connect
and verify all details are correct.
1. Launch the ODI Studio client and connect to the ODI repository for BI Applications 11.1.1.8.1
PB6. (Do not connect to the back-up of the ODI Repository).
2. Navigate to the Topology tab. From the Connect Navigator (Topology icon dropdown on the top
right side of the navigator pane), select Export. As part of the procedures described below, you
will export files to a local directory.
Exporting Connections
3. Launch the Smart Export wizard from the Export selection dialog.
4. Drag the Global context into the Objects to be Exported window. Provide a meaningful name
for the export file. Click Export. This will export the logical and physical topology including
assigned Datasource Num ID values and database connect details.
Exporting Security
1. Connect to the 11.1.1.8.1 ODI Repository schema using an Oracle database client
tool (SQL*Plus, SQL Developer, etc.) and execute the script below.
/* Script Begins */
UPDATE SNP_FLEX_FIELD SET I_FF=26040 WHERE
FF_CODE='OBI_DATASTORE_DYNAMIC_FILTER1' AND I_OBJECTS=2400 AND
FF_TECHNO='ORACLE_BI';
UPDATE SNP_FLEX_FIELD SET I_FF=31040 WHERE
FF_CODE='OBI_DATASTORE_DYNAMIC_FILTER2' AND I_OBJECTS=2400 AND
FF_TECHNO='ORACLE_BI';
COMMIT;
/* Script Ends */
2. Navigate to Topology > Export and select the Export Security Settings action.
3. Choose to export to a local file (directory). This action exports your user configurations.
Exporting Customizations
If you have introduced any customizations in your ODI repository, you will need to export these as well.
Export Custom Folder
Per the customization methodology, all custom and customized ETL tasks should be in a separate
CUSTOM folder. Right click the Custom Folder and select the Export option. In the next window,
ensure the Child Components Export box is checked.
Drag and drop your purely custom and customized datastores. Be sure the Export child objects option
is checked. In the example below, WC_ALLOC_INV_BALANCES_F/FS are purely custom tables while
W_GL_OTHER_F/FS are out of the box tables that have been customized.
ODI requires that the ID of the repository you import objects into be different from the ID of the repository
that objects were exported from. The ID of the 11.1.1.8.1 PB6 ODI Repository will need to be noted and
after the repository is replaced with the 11.1.1.9.2 repository, the repository ID will be updated to a
non-conflicting number.
The default value 500 is assigned to all repositories that are shipped by Oracle. If you have migrated the
repository across environments, the value could be different from this default value.
To note the ID of the ODI Repository for BI Applications 11.1.1.8.1 PB6:
1. Navigate to Topology -> Repositories -> Master Repository -> Right Click and select Open ->
Version -> Information -> Internal ID.
2. Note the ID.
3. Do the same for the Work Repository. The Repository ID should have the same value.
Use the BI Applications RCU to drop the existing 11.1.1.8.1 PB6 ODI Repository schema. You can use
either the BI Applications 11.1.1.8.1 RCU or the 11.1.1.9.2 RCU. You will be prompted with a list of
schemas that have already been installed; select the schema where the ODI Repository for 11.1.1.8.1
exists.
Important: ONLY drop the <prefix>_BIA_ODIREPO schema. Do not drop any other schema.
Before dropping the ODI Repository schema, stop the ODI Managed Server odi_server1 from the
console URL and restart the database.
1. Launch the BI Applications RCU. Select the Drop radio button.
2. In the Database Connection Details screen, provide the connection details to the database which
hosts the ODI Repository for BI Applications 11.1.1.8.1 PB6.
3. From the Prefix All Schema Owners dropdown, select the prefix for your BI Applications
11.1.1.8.1 PB6 schemas.
4. Select only the Oracle Data Integration Master and Work Repository from the Select
Components screen. Do NOT select any of the other schema components.
5. Click Drop to drop the ODI Repository schema for 11.1.1.8.1 PB6.
The following steps are the same as when installing a fresh ODI repository. The only difference is that
we select the option to restore the ODI Master and Work repositories only.
e) Creating the ODI Repository for BI Applications 11.1.1.9.2
You must run the BI Applications 11.1.1.9.2 RCU to create the schema for the ODI Repository for
11.1.1.9.2. This schema will use the same name as the 11.1.1.8.1 PB6 ODI Repository schema that was
just dropped.
Important: You must select the option to use an existing prefix and re-use the same prefix that
was used by the schema that was previously dropped.
1. Unzip the BI Applications 11.1.1.9.2 RCU downloaded from the BI Applications 11.1.1.9.2 media
pack.
2. If you are not running RCU on the database host machine, then you must copy the obia_odi.dmp
file to a directory with global write access on the appropriate database server machine. (RCU
writes log files to this directory.) The .dmp file is located in
BIA_RCU_HOME/rcu/integration/biapps/schema.
3. Launch the BI Applications RCU for 11.1.1.9.2 from BIAPPS_RCU_HOME\bin:
UNIX: ./rcu
Windows: rcu.bat
4. Select the Create radio button.
5. In the Database Connection Details screen, provide the connection details to the database which
previously hosted the ODI Repository for BI Applications 11.1.1.8.1 PB6 which you dropped in
the previous procedure.
6. In the Select Components screen, from the Select an existing Prefix dropdown, select the same
prefix as that of your ODI Repository for 11.1.1.8.1 PB6. If you do not see the prefix as an
existing prefix, then choose the Create a new Prefix radio button and enter the same prefix as
you had used before for the ODI Repository for 11.1.1.8.1 PB6.
7. Select the Oracle Data Integration Master and Work Repository. Do not select any other
component.
8. In the Value field in the Custom Variables screen, for the <prefix>_BIA_ODIREPO schema enter
the directory path of the folder on the database server that contains the obia_odi.dmp file. See
step 2 above.
Note: Do not include the name of the .dmp file in the directory path.
9. Complete the ODI Repository creation.
Connecting to the ODI Repository for 11.1.1.9.2
Create a connection in ODI Studio to the ODI Repository for 11.1.1.9.2 which you created in the previous
step. The repository is set to Internal Authentication. The user and password you use to connect to the
repository:
User: SUPERVISOR
Password: welcome
The ID of the repository has to be changed from the default to avoid conflicts when importing the
configurations, objects and customizations from the ODI Repository for 11.1.1.8.1 PB6. In the section
Noting the ID of the 11.1.1.8.1 ODI Repository above you made a note of the Repository ID for the
Master and Work repositories (default value is 500). In this procedure you will update the value in the
ODI Repository for 11.1.1.9.2 to a different value.
1. In ODI Studio, navigate to Topology -> Repositories -> Master Repository -> Right Click and select
Renumber.
3. On the Renumbering the repository Step 2 dialog, enter a new ID that has not been used for
any of your existing ODI Repositories. Oracle suggests incrementing the value you noted in
section Noting the ID of the 11.1.1.8.1 ODI Repository by 1. Click OK.
Note that this incremented value should not be same number as the Repository ID of any
existing ODI Repository. The value you enter must be a numeric ID between 501 and 899.
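To make the renumbering rule above concrete, here is a small illustrative sketch. The function name and sample ID are hypothetical; only the 501-899 range and the increment-by-1 suggestion come from the text.

```python
def suggest_new_repo_id(pre_upgrade_id: int) -> int:
    """Suggest a new ODI repository ID: increment the noted pre-upgrade
    ID by 1; the result must lie between 501 and 899 and must not match
    the ID of any existing ODI repository (checked manually)."""
    candidate = pre_upgrade_id + 1
    if not 501 <= candidate <= 899:
        raise ValueError("new repository ID must be between 501 and 899")
    return candidate

# Oracle ships repositories with ID 500, so the suggested new ID is 501.
print(suggest_new_repo_id(500))
```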
4. Verify the new number by selecting the Master Repository. Right click and select Open to view
the Internal Id value that was set in the previous step.
5. Repeat steps 1-4 to renumber the Work repository to the same value as the renumbered
Master repository.
f)
1. Launch the ODI Studio client and connect to the ODI repository for BI Applications 11.1.1.9.2.
2. Navigate to the Topology tab. From the Connect Navigator (Topology icon dropdown on the top
right side of the navigator pane), select Import. As part of the procedures described below, you
will import the file created by exporting the Global context in step Exporting Connections
above.
4. In the File Selection screen, specify the export file created in step Exporting Connections above.
5. The default behavior of Smart Import is to Merge details in the target repository. Ensure that no
issues are reported; if issues are reported, resolve them to ensure the existing details are
replaced by the details being imported. Select the BIAPPS_DW and BIAPPS_BIACOMP Data
Servers, select the Overwrite option, and continue.
As part of this procedure you will verify that the Physical Servers have the correct details and the
Physical Schemas have the correct Datasource Num ID value set in the DATASOURCE_NUM_ID flexfield.
1. In ODI Studio, navigate to Topology > Physical Architecture. Open the BIAPPS_DW physical
server.
2. Verify the User and Password are correctly populated under the Definition tab.
3. On the JDBC tab, verify that the JDBC URL is correctly set. If you used the default Merge action
during the Smart Import, the User and Password will be updated but the JDBC URL will remain
unchanged.
5. Verify that the physical schemas for BIACOMP and DW are set as defaults. If they are not set as
defaults, Load Plans will fail.
1. In ODI Studio, navigate to Topology > Physical Architecture.
2. Open the physical schema under the BIAPPS_BIACOMP physical server.
3. Verify that the Default check box is checked on the Definition tab.
4. Check the Default check box if it is not checked. Save changes.
5. Repeat steps 2 to 4 for the physical schema under the BIAPPS_DW physical server.
6. Verify the source connection details. For example, if you need to extract from an instance of
eBusiness Suite 11.5.10, open the corresponding physical server (here named
EBS11510_DEFAULT) and verify the User/Password and JDBC.
7. Also verify the associated physical schema. Navigate to the Flexfields tab and verify the
DATASOURCE_NUM_ID flexfield is set to the value you originally assigned. This value of the
DATASOURCE_NUM_ID must match the value in BI Applications Configuration Manager for this
source connection.
8. Open the corresponding logical schema and verify the DATASOURCE_NUM_ID flexfield is also
set with the same value.
Importing Security
As part of this procedure you will import the Security settings that you exported in step Exporting
Security above. The import of Security settings is done using the insert and update options.
Follow the steps below to import the security objects:
1. In the Topology tab, from the Connect Navigator (Topology icon dropdown on the top right side
of the navigator pane), select Import.
2. In the Import Selection dialog, select Import Security Settings.
3. In the Import Security Settings dialog, select Synonym Mode INSERT for the Import Mode.
Select the Import from a Folder radio button. Enter the directory location to which you had
exported the Security settings in step Exporting Security above.
7. In the Import Security Settings dialog, select Synonym Mode UPDATE for the Import Mode.
Select the Import from a Folder radio button. Enter the directory location to which you had
exported the Security settings in step Exporting Security above.
After the import is complete, the SUPERVISOR user may no longer be enabled. To ensure you can still
connect to the ODI repository in case of any issues, enable this user by ensuring the
Supervisor property is set and that the user does not have an expiration date. Once external authentication
is complete, you can log in with another administrative user and disable the SUPERVISOR user.
1. Disconnect from the ODI Repository by selecting the ODI menu and then the Disconnect
<User> menu item.
2. From the ODI menu, select the Switch Authentication Mode menu item.
3. Enter the database connection details on the Login screen. Click Next.
4. On the Credentials screen, click Finish.
The following Information dialog is displayed to indicate that the repository was successfully switched to
External Authentication. You should see at least two users are matched. Note that the SUPERVISOR user
defined in ODI will not be matched to anything in the security store.
You can now connect to ODI using externally authenticated users; for example, the BI Applications
Administrator User.
If you receive the following error when switching to External Authentication, then you have not
configured the security files required for external authentication on the instance of ODI Studio
you are using.
To configure user access, see section 3.3.9.2 Configuring User Access for ODI Studio in the BI
Applications Installation Guide for 11.1.1.9.2.
After importing the security settings, disconnect from ODI repository and Switch Authentication
Mode.
If you get an unexpected error window
NOTE:
a) When switching from ODI internal to FMW security, make sure that the ODI users have no expiry
dates (including SUPERVISOR).
b) If users had expiry dates, change the expiry dates and make sure none of the users are disabled
after doing the security import.
Load Plans that were originally generated in the ODI Repository for 11.1.1.8.1 PB6 do not exist in the
ODI Repository for 11.1.1.9.2. The steps to transfer content from the 11.1.1.8.1 PB6 Repository to the
11.1.1.9.2 Repository do not include transferring the original load plans. The load plans will not reflect
any changes introduced as part of the upgrade so a new load plan must be generated.
Configuration Manager retains the Load Plan definitions on upgrade even though any metadata
associated with this load plan that was stored in ODI is no longer available. Use these existing
definitions in Configuration Manager to regenerate load plans including Domains Only Load Plans. Note
that any tasks that had previously executed pre-upgrade will execute in incremental mode post-upgrade
while any new tasks that may be introduced in the generated load plan will initially execute in full mode.
1. Log into BI Applications Configuration Manager as the BI Applications Administrator user.
2. Navigate to Manage Load Plans.
3. Regenerate all Load Plans including the Domains-Only Load Plans.
Refer to the Configuration Manager online help for the Manage Load Plans screen and the BI
Applications ETL Guide for 11.1.1.9.2 for more details on how to regenerate the load plan with the
existing load plan definition.
11.
Once the DDL and Data Upgrade steps are complete, it is time to import the customizations into the
post-upgrade repository.
If you have separate ODI repositories for Development (DEV), Testing (TEST) and Production (PROD),
there is a difference in the steps for getting the customizations into the post-upgrade DEV repository
and into the post-upgrade TEST or PROD repository. Assuming that only DEV is open to developers to
make changes and TEST and PROD instances are locked down so content can only be migrated, the
following summarizes the differences. Refer to the respective documents for the exact implementation
of each step.
DEV:
- Customizations imported using Regular Export/Import
- Version Model
- Version Model again
- Import Custom Datastores
- Import Custom Folders
- Reapply Customizations
- Generate Custom Scenarios
- Apply Customizations to Generated Load Plan
DEV to TEST/PROD:
- Customizations migrated using Smart Export/Import into the Test/Prod repository
The following sections describe the process to import the customizations previously exported from the
pre-upgrade repository into the post-upgrade DEV repository. Refer to the T2P ODI Migration
document for the steps to migrate the customizations from the DEV repository to TEST and PROD
repositories.
An important difference between the two processes is the use of Regular versus Smart import. Smart
import's default behavior is to overwrite the target, while Regular import allows us to merge with the
target. Smart import also brings many extra objects, while Regular import brings only the objects you specified.
When moving from pre-upgrade to post-upgrade, we want to move only the customized objects. Using
Smart import would bring almost all objects from the pre-upgrade repository and by default overwrite
the objects in the post-upgrade repository. As the post-upgrade repository includes bug fixes and
enhanced functionality, we would lose all of that and replace it with the legacy pre-upgrade objects.
Regular import does not bring these extra objects with it.
When moving from DEV to TEST, the objects in the DEV repository should be replacing the objects in
TEST as they represent the bug fixes and enhanced functionality. For migrating, we use Smart Import to
bring all objects as these objects should always take precedence and overwrite what is in the target.
Very important! The following steps are implemented in the DEV repository only. For migrating
changes to TEST and PROD repositories, follow the T2P Migration document.
a) Import Datastores
It is important to import the datastores first prior to importing the customized ETL tasks.
To import the customizations:
1. Launch the ODI Studio client and connect to the ODI Repository for BI Applications 11.1.1.9.2.
2. Navigate to Designer -> Models -> Oracle BI Applications (folder) -> Oracle BI Applications
(Model)
3. Create Original and Custom versions of the model
a. Right click the model, select Version -> Create Version. Create an initial version.
b. Perform these steps again to create the version with customizations.
c. Once the second version is complete, the two versions will match. However, after the
customizations are imported, the two will no longer match. The original version reflects
the out-of-the-box datastores while the new version reflects the merged datastores,
allowing comparison between the two.
4. Import the customized datastores
Right click the Oracle BI Applications model.
Select the Import > Import Datastore option.
In the Import Datastore window, ensure Import Type is Synonym Mode INSERT. Update
mode will update existing columns to reflect their pre-upgrade state while Insert/Update
mode will delete columns that were introduced in the upgrade repository but do not exist in
the pre-upgrade repository.
Navigate to the directory where you previously exported the custom datastores
Check the boxes for each datastore to be imported and click OK
If prompted to declare the repository ID and continue with the import, select Yes. You may see this
prompt multiple times; click Yes each time.
When the import completes, you should see a summary report similar to the following.
The upgrade datastores will now be merged with the pre-upgrade customizations.
Review the customized out-of-the-box datastores that have been imported.
Navigate to the Columns tab
In INSERT mode, the import will bring the custom columns into the datastore. In the 11.1.1.8.1 PB6
repository, these are usually the last columns. In the 11.1.1.9.2 repository, Oracle may have added
columns and these will have a conflicting position ID. Or a column simply may have moved between
11.1.1.8.1 PB6 and 11.1.1.9.2. These columns will appear to have the same position number but this
does not cause any issues. As an optional step, you can have ODI recalculate the position numbers.
Double click on any column in the datastore. Without making any changes, now select another column.
ODI will automatically recalculate the position numbers. Save the datastore.
1. Launch the ODI Studio client and connect to the ODI Repository for BI Applications 11.1.1.9.2.
2. Navigate to Designer -> Projects -> BI Apps Project -> Mappings
3. Right click Mappings and select the Import > Import Sub-Folder option.
In the Import Sub-Folder window, ensure Import Type is Synonym Mode INSERT.
Navigate to the directory where you previously exported the custom folders
Check the boxes for each folder to be imported and click OK
When the import completes, you can see your custom folder and its corresponding customized objects.
At this point, it is necessary to merge any customizations with any changes that may have been
introduced in the upgrade repository. For example, a column may have been added to a table that was
customized (as in the W_GL_OTHER_F example earlier) or bug fixes applied or content otherwise
changed by Oracle.
There is no automatic merge mechanism available. The customized and out-of-the-box ETL tasks must
be inspected for changes and the changes manually incorporated into the other. The recommended
approach is to re-copy the out-of-the-box ETL task and re-apply the customizations to this new copy.
Since you are far more familiar with the changes you have made it should be easier to incorporate these
into the new copy rather than figure out the changes Oracle made and incorporate these into your
original copy.
1. Rename the customized ETL task to reflect the pre-upgrade version it was based on.
2. Duplicate the out-of-the-box ETL task per the customization methodology and move to the
Custom folder.
Inspect the customized ETL task. The custom columns are populated but any new columns
introduced by Oracle are not.
Below is the copied interface where the new columns introduced by Oracle are populated but
the custom columns are not. Apply the customizations noted previously in this interface.
5. Generate a Scenario for the newly customized ETL task. Ensure the Scenario Name matches the
out-of-the-box Scenario Name, but use a Version Number that is larger than any previously used
Version Number. The BI Apps Load Plans are configured to run the scenario with the largest
version number. By retaining the same scenario name and assigning the largest scenario version
number, you do not have to make any changes to the load plan.
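The version-selection behavior described above can be sketched as follows (the repository structure and names are illustrative only, not the actual ODI SDK):

```python
# Sketch: how a load plan step that runs the "latest" version of a scenario
# resolves which copy to execute. Names and structures are illustrative.

def pick_scenario(scenarios, name):
    """Return the scenario with the given name that has the highest version."""
    candidates = [s for s in scenarios if s["name"] == name]
    return max(candidates, key=lambda s: s["version"])

repo = [
    {"name": "SIL_GLOTHERFACT", "version": 1},  # out-of-the-box scenario
    {"name": "SIL_GLOTHERFACT", "version": 2},  # re-generated after customization
]

# The load plan keeps referencing the same scenario name; the customized
# copy wins because it carries the larger version number.
chosen = pick_scenario(repo, "SIL_GLOTHERFACT")
print(chosen["version"])  # -> 2
```

This is why regenerating under the same name with a higher version requires no load plan edits.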
6. Apply Customizations to the Generated Load Plans as per the Customization methodology.
II. Data Warehouse Schema and Data Upgrade
This section outlines the steps to perform a warehouse schema and data upgrade from
11.1.1.8.1 PB6 to 11.1.1.9.2. Before performing the steps outlined in this section of the
document, you must have completed all steps described in section Part I - BI Applications
Infrastructure, Metadata and Schema Upgrades.
1. Pre-Upgrade Tasks
None
This privilege must be granted before you can execute the Upgrade Load Plans described later in this
section.
Execute the adapter-specific upgrade Load Plan to upgrade the schema and warehouse data.
Note: If the upgrade Load Plan fails because of the Upgrade DW DDL procedure's DDL statement
execution on your warehouse, follow the steps below to run the Upgrade DW DDL procedure outside
the upgrade Load Plan.
Step 1:
Note down the parameter values of the failed GENERATE_UPGRADE_DDL procedure step by opening the
upgrade load plan as shown below.
For example, in the screenshot above, four scenario variables are overwritten. Note down these
parameter values; when running the Upgrade DW DDL procedure outside the upgrade Load Plan, use
the same values.
BIAPPS.UTIL_GEN_UPG_DDL_TABLE_LIST:
W_AP_HOLDS_F, W_AP_XACT_F, W_AR_XACT_F, W_PURCH_RQSTN_LINE_F, W_RQSTN_LINE_COST_F,
W_NEG_RESPONSES_F, W_NEG_AWARDS_F, W_NEG_LINES_F, W_NEG_INVITATIONS_F, W_PURCH_COST_F,
W_PURCH_RCPT_F, W_PURCH_SCHEDULE_LINE_F, W_PURCH_CHANGE_ORDER_F, W_AP_INV_DIST_F
BIAPPS.UTIL_GEN_UPG_DDL_RUN_MODE: COPY_MODE
BIAPPS.UTIL_GENDDL_CHAR_CLAUSE
BIAPPS.UTIL_GENDDL_RUN_DDL
Step 2:
When running the procedure, it prompts for variable values as shown below. Overwrite the four
variable values noted in Step 1. In addition, overwrite the BIAPPS.UTIL_GENDDL_CREATE_SCRIPT_FILE
variable value with Y to create a script file, and overwrite the BIAPPS.UTIL_GENDDL_SCRIPT_LOCATION
variable value with a valid server location for the generated script/log files.
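As a sketch, the overrides from Steps 1 and 2 can be collected like this. The COPY_MODE pairing and the script location shown here are assumptions to confirm against your own failed load plan step:

```python
# Sketch: collecting the variable overrides needed to re-run the Upgrade DW DDL
# procedure standalone. Values are illustrative examples; take the real ones
# from the failed GENERATE_UPGRADE_DDL step in your own load plan.

overrides = {
    "BIAPPS.UTIL_GEN_UPG_DDL_TABLE_LIST": ",".join([
        "W_AP_HOLDS_F", "W_AP_XACT_F", "W_AR_XACT_F",
        # ... remaining tables noted in Step 1
    ]),
    # Assumed pairing: COPY_MODE as the run-mode value (verify in your LP).
    "BIAPPS.UTIL_GEN_UPG_DDL_RUN_MODE": "COPY_MODE",
    # Additional overrides from Step 2:
    "BIAPPS.UTIL_GENDDL_CREATE_SCRIPT_FILE": "Y",
    "BIAPPS.UTIL_GENDDL_SCRIPT_LOCATION": "/tmp/upgrade_ddl",  # any valid server path
}

for name, value in overrides.items():
    print(f"{name} = {value}")
```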
Step 3:
After the successful execution of the Upgrade DW DDL procedure, mark the Upgrade Load Plan step
Upgrade_DW_Copy as complete and restart the Load Plan to continue with the remaining steps.
4. Post-Upgrade Tasks
Run an incremental load in the upgraded warehouse to complete the warehouse data upgrade steps.
III.
APPENDIX
This section lists the upgrade changes between 11.1.1.8.1 PB6 and 11.1.1.9.2 in the Common
Dimension for the EBS 11510, R12xx, PeopleSoft, and Fusion adaptors.
EBS11510:
Supplier Account Dimension
1) What tables are truncated or records deleted
W_SUPPLIER_ACCOUNT_D
2) What new columns are included
W_SUPPLIER_ACCOUNT_D
PYMNT_MTHD_LKP_CODE
Summary
Previously, W_SUPPLIER_ACCOUNT_D.SPLR_RECPT_TYPE_CODE was sourced from
PO_VENDOR_SITES_ALL.PAYMENT_METHOD_LOOKUP_CODE instead of
PAY_ON_RECEIPT_SUMMARY_CODE. A new column was added to extract the payment method code,
and the receipt type code was mapped to W_SUPPLIER_ACCOUNT_D.SPLR_RECPT_TYPE_CODE.
3) What issue/bug fixes are covered
Logic changed for SPLR_RECPT_TYPE_CODE
Logic added for PYMNT_MTHD_LKP_CODE
EBSR12xx:
Supplier Account Dimension
1) What tables are truncated or records deleted
W_SUPPLIER_ACCOUNT_D
2) What new columns are included
W_SUPPLIER_ACCOUNT_D
PYMNT_MTHD_LKP_CODE
Summary
Starting with Oracle E-Business Suite R12, the payment method for payees is stored in the
following IBY tables:
IBY_EXT_PARTY_PMT_MTHDS
IBY_APPLICABLE_PMT_MTHDS
All payees are stored in the following IBY table:
IBY_EXTERNAL_PAYEES_ALL
MINORITY_OWNED
SMALL_BUSINESS
WOMEN_OWNED
DISABLED_VETERAN_OWNED
HUB_ZONE
VETERAN_OWNED
The first three classifications have been supported since OBIA 7.9.6.x; the last three
classifications are extracted from OBIA 11.1.1.9.2 onwards.
b) Financials
This section lists the upgrade changes from the 11.1.1.8.1 to 11.1.1.9.2 upgrade.
Account Receivables (AR)
4) What tables are truncated or records deleted
W_AR_XACT_F
New Attributes:
Financials - AR Transactions > Receivables Receipt Business Unit
(PSFTxx)
(EBS11510)
(EBSR12xx)
(JDExx)
-NA-
W_AP_XACT_F
PAYABLES_PMT_ORG_WID (applicable only to Fusion Adaptor)
THIRD_PARTY_WID (applicable only to Fusion Adaptor)
AP_INV_CURR_AMT (applicable only to Fusion Adaptor)
DISCOUNT_TAKEN_INV_CURR_AMT (applicable only to Fusion Adaptor)
INVOICE_CURR_CODE (applicable only to Fusion Adaptor)
W_AP_HOLDS_F
THIRD_PARTY_WID (applicable only to Fusion Adaptor)
3) What tables have records added
-NA-
4) What metrics may be calculated differently
-NA-
5) What issue/bug fixes are covered
W_AP_HOLDS_F
(Fusion)
Populated THIRD_PARTY_WID = 0 for all pre-upgrade fact records.
W_AP_XACT_F
(Fusion)
New Attributes:
Financials - AP Transactions > Payables Payment Business Unit
Third Party Payments - populated THIRD_PARTY_WID = 0 for all pre-upgrade fact data across all
flows.
New Attributes:
Financials - AP Transactions >Third Party Remit-to Supplier
New Attributes/Metrics:
Financials - AP Transactions > Document Details
  Invoice Currency Code
Financials - AP Transactions > Facts - AP Transactions
  AP Invoice Currency Amount
  Discount Taken Invoice Currency Amount
(PSFTxx)
(EBS11510)
(EBSR12xx)
(JDExx)
-NA-
c) Procurement
Changes made in REL9/REL9.2 for FUSION adaptor OBIA Procurement Requisition Area
This section summarizes, at a high level, the changes introduced in REL9 and REL9.2 since REL8.1.
Note that the changes below were made only for the FUSION adaptor.
Approved Date, Submitted Date, and Status at both header and line level
Fusion SSP supports approved date, submitted date, and status at both the Requisition Header and
Requisition Line levels in REL9. Therefore, the OBIA Procurement Requisition subject area also
supports both levels, for the FUSION adaptor only.
The following four attributes were introduced as well, but note that they are attributes, not
metrics, so they cannot be aggregated. These new attributes must be used in a line-level or
distribution-level report; they will produce incorrect results if used in higher-level reports. To
prevent this, the attribute descriptions were written as follows to guide users.
Purchase Requisition Line Amount/Purchase Requisition Quantity: Always selected with Purchase
Requisition Number, Purchase Requisition Line Number, and Requisition Business Unit Number
together.
Purchase Requisition Distribution Amount/ Purchase Requisition Distribution Quantity: Always
selected with Purchase Requisition Number, Purchase Requisition Line Number, Purchase Requisition
Distribution Number, and Requisition Business Unit Number together.
Here is the list of new/obsolete columns of Requisition Details folder in BI Answer UI.
Sub Folder | Presentation Column | Change Type
Requisition Details | Resubmit Date | Obsolete
Requisition Details | Submitted Date | New
2.1 W_PURCH_RQSTN_LINE_FS

Seq # | Column Name | Data Type | Change Type <Existing | New | Obsolete>
74 | HEADER_SUBMITTED_ON_DT | DATE | New
75 | HEADER_APPROVED_ON_DT | DATE | New
76 | HEADER_APPROVAL_STATUS_ID | VARCHAR2(80) | New
77 | PARENT_REQ_LINE_ID | VARCHAR2(80) | New
78 | PARENT_SUBMITTED_ON_DT | DATE | New
79 | PARENT_APPROVED_ON_DT | DATE | New
2.2 W_PURCH_RQSTN_LINE_F

Seq # | Column Name | Data Type (length, precision) | Change Type <Existing | New | Obsolete>
95 | HEADER_SUBMITTED_ON_DT | DATE | New
96 | HEADER_APPROVED_ON_DT | DATE | New
97 | HEADER_APPROVAL_STATUS_ID | VARCHAR2(80) | New
98 | PARENT_REQ_LINE_ID | VARCHAR2(80) | New
99 | PARENT_SUBMITTED_ON_DT | DATE | New
100 | PARENT_APPROVED_ON_DT | DATE | New
2.3 W_RQSTN_LINE_COST_FS

Seq # | Column Name | Data Type (length, precision) | Change Type <Existing | New | Obsolete>
83 | HEADER_SUBMITTED_ON_DT | DATE | New
84 | HEADER_APPROVED_ON_DT | DATE | New
85 | HEADER_APPROVAL_STATUS_ID | VARCHAR2(80) | New
2.4 W_RQSTN_LINE_COST_F

Seq # | Column Name | Data Type | Change Type <Existing | New | Obsolete>
99 | SUBMITTED_ON_DT | DATE | New
100 | APPROVED_ON_DT | DATE | New
101 | HEADER_SUBMITTED_ON_DT | DATE | New
102 | HEADER_APPROVED_ON_DT | DATE | New
103 | HEADER_APPROVAL_STATUS_ID | VARCHAR2(80) | New
d) Projects
This section lists the upgrade changes from the 11.1.1.8.1 to 11.1.1.9.2 upgrade for the R11510
and R12xx adaptors.
9) What tables are truncated or records deleted
NA
10) What new columns are included in base fact/dimension
W_PROJECT_D.CAPITAL_COST_TYPE_CODE
W_PROJECT_DS.CAPITAL_COST_TYPE_CODE
11) What new tables have been introduced
A construction-in-process (CIP) asset is an asset you construct over a period of time. Create and
maintain your CIP assets as you spend money for raw materials and labor to construct them. Since
a CIP asset is not yet in use, it does not depreciate and is only in the corporate book. When you
finish building the CIP asset, you can place it in service and begin depreciating it. Customers need
visibility on the Project CIP costs with drill to Project Cost fact and Integration to the Finance
Fixed Assets subject area via the Fixed Asset dimension (W_FIXED_ASSET_D).
The typical analysis question is: "I see a posting of $1M as the CIP amount for an asset. How did
I get this amount, and why is it high or low?" It is simply not possible to answer these questions
from Financials alone, even with the Project and Task dimensions on the Financials side. While
Projects is not a true subledger, it is just like other subledgers such as AR and AP, where you
cannot get all the information from GL alone.
W_PROJ_ASSET_DS / W_PROJ_ASSET_D
W_PROJ_CAPITAL_EVENT_DS/ W_PROJ_CAPITAL_EVENT_D
W_PROJ_ASSET_DS_TL/ W_PROJ_ASSET_D_TL
W_PROJ_CIP_HDR_FS/ W_PROJ_CIP_HDR_F
W_PROJ_CIP_DTL_FS/ W_PROJ_CIP_DTL_F
12) What tables have records added
NA
13) What issue/bug fixes are covered
No major design changes
This section lists the upgrade changes from the 11.1.1.8.1 to 11.1.1.9.2 upgrade for the PSFTxx
adaptors.
1) What tables are truncated or records deleted
NA
2) What new columns are included in base fact/dimension
W_PROJECT_D.CAPITAL_COST_TYPE_CODE
3) What new tables have been introduced
A construction-in-process (CIP) asset is an asset you construct over a period of time. Create and
maintain your CIP assets as you spend money for raw materials and labor to construct them. Since
a CIP asset is not yet in use, it does not depreciate and is only in the corporate book. When you
finish building the CIP asset, you can place it in service and begin depreciating it. Customers need
visibility on the Project CIP costs with drill to Project Cost fact and Integration to the Finance
Fixed Assets subject area via the Fixed Asset dimension (W_FIXED_ASSET_D).
The typical analysis question is: "I see a posting of $1M as the CIP amount for an asset. How did
I get this amount, and why is it high or low?" It is simply not possible to answer these questions
from Financials alone, even with the Project and Task dimensions on the Financials side. While
Projects is not a true subledger, it is just like other subledgers such as AR and AP, where you
cannot get all the information from GL alone.
W_PROJ_ASSET_DS / W_PROJ_ASSET_D
W_PROJ_CAPITAL_EVENT_DS/ W_PROJ_CAPITAL_EVENT_D
W_PROJ_ASSET_DS_TL/ W_PROJ_ASSET_D_TL
W_PROJ_CIP_HDR_FS/ W_PROJ_CIP_HDR_F
W_PROJ_CIP_DTL_FS/ W_PROJ_CIP_DTL_F
4) What tables have records added
W_PROJ_REVENUE_LINE_F - Amount Based Revenue support for PSFT
Until Release 8, only rate-based Project revenue transactions were supported in
W_PROJ_REVENUE_LINE_F. Support for amount-based transactions was introduced in Release 9. If you
do not wish to extract amount-based transactions, set the PROJ_AMT_BASED_REVENUE_ENABLE CM
parameter to 'N'; out of the box, it is set to 'Y'.
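A minimal sketch of how this parameter gates the extract (the row structure and filter function are illustrative; the real filter lives in the ODI mapping):

```python
# Sketch: effect of PROJ_AMT_BASED_REVENUE_ENABLE on which revenue
# transactions are extracted. Illustrative logic only.

def filter_revenue_rows(rows, amt_based_enabled="Y"):
    """Keep rate-based rows always; keep amount-based rows only when enabled."""
    if amt_based_enabled == "Y":
        return rows
    return [r for r in rows if r["basis"] != "AMOUNT"]

rows = [
    {"txn_id": 1, "basis": "RATE"},
    {"txn_id": 2, "basis": "AMOUNT"},
]

print(len(filter_revenue_rows(rows, "Y")))  # -> 2 (OOTB default: both extracted)
print(len(filter_revenue_rows(rows, "N")))  # -> 1 (amount-based excluded)
```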
5) What issue/bug fixes are covered
No major design changes
e) Manufacturing
This section lists the upgrade / data-patch changes from the 11.1.1.8.1 to 11.1.1.9.2 upgrade for
the R12xx adaptors.
(1) The following MFG/EAM fact and dimension tables did not have any constraints or indexes
defined for creation in the database. The proper indexes were enabled/created.
W_MFG_SEIBAN_D
W_MFG_OPERATIONS_D
W_MFG_PLAN_D
W_MFG_PLAN_D_TL
W_WORKORDER_D
W_MFG_PLANNED_PRODUCTION_F
W_MFG_MATERIAL_USAGE_F
W_MFG_OPERATION_DETAIL_F
W_MFG_PROD_COST_F
W_MFG_RES_CAPACITY_F
W_MFG_RES_USAGE_PLAN_F
W_MFG_RES_XACT_F
W_RES_COST_HISTORY_F
W_PEGGING_DETALS_F
W_PRC_SPEC_RESULTS_F
W_PRC_SAMPLES_D
W_PRC_SPEC_D
W_PRC_SPEC_TESTS_D
W_PRC_TESTS_D
W_PRC_TESTS_D_TL
W_LOT_GENEALOGY_F
W_KANBAN_CARD_D
W_KANBAN_REPLEN_CYCLE_F
W_QA_CHAR_D
W_QA_CHAR_D_TL
W_QA_PLAN_D
W_QA_PLAN_D_TL
W_QA_SPEC_D
W_QA_SPEC_D_TL
W_QA_SPEC_TEST_D
W_QA_RESULTS_F
W_EAM_ASSET_D
W_EAM_ASSET_D_TL
W_EAM_ASSET_F
W_EAM_COST_F
W_EAM_LOCATION_D
W_EAM_LOCATION_D_TL
W_EAM_MATERIAL_USAGE_F
W_EAM_METER_D
W_EAM_METER_D_TL
W_EAM_METER_READING_F
W_EAM_RSRC_ACT_F
W_EAM_RSRC_STD_F
W_EAM_WORKORDER_F
W_EAM_WO_SNAP_F
(3) SILOS_SIL_MFGMATERIALUSAGEFACT, SDE_ORA_MfgProductionCostFact_Process, and
SDE_ORA_MfgResourceUsageFact_Actual_Process mappings were modified.
(4) As per the design, the Lot Genealogy and Pegging facts always run a full load (there is no
incremental load for these facts), and neither fact has soft-delete mappings (to handle records
deleted from the OLTP system).
Hence, a truncate/insert strategy is needed during the fact table load.
As part of this fix, the table maintenance procedure is called as part of the two SIL LP
components for each of these fact groups.
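A minimal sketch of this truncate/insert (full-load) strategy, using an illustrative in-memory table rather than the actual ODI table maintenance procedure:

```python
# Sketch: truncate/insert full-load strategy for facts that have no
# incremental load and no soft-delete mappings. Illustrative table API.

class FactTable:
    def __init__(self):
        self.rows = []
    def truncate(self):
        self.rows.clear()
    def insert(self, rows):
        self.rows.extend(rows)

def full_load(target, source_rows):
    # Truncating first means rows deleted in the OLTP source simply
    # disappear from the warehouse, so no soft-delete handling is needed.
    target.truncate()
    target.insert(source_rows)

t = FactTable()
full_load(t, [{"id": 1}, {"id": 2}])
full_load(t, [{"id": 2}])   # id=1 was deleted in the source
print(len(t.rows))          # -> 1
```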
(5) SIL_MfgOperationDimension, SIL_MfgPlanDimension, SIL_MfgSeibanDimension, and
SIL_MfgProductionCostFact mappings were modified.
The lookup condition against W_USER_D should return only a single record. The effective-date
logic should not use a <= condition on EFFECTIVE_TO_DT, as this results in duplicates; to prevent
this, the condition was changed to the < operator.
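The duplicate-producing boundary condition can be illustrated with a short sketch (illustrative dates and a simplified lookup, not the actual mapping logic):

```python
# Sketch: why <= on EFFECTIVE_TO_DT returns two records when the lookup
# date falls exactly on a record boundary. Illustrative dates only.
from datetime import date

history = [
    {"user": "A", "eff_from": date(2014, 1, 1), "eff_to": date(2015, 1, 1)},
    {"user": "A", "eff_from": date(2015, 1, 1), "eff_to": date(9999, 12, 31)},
]

def lookup(rows, d, inclusive_to):
    if inclusive_to:
        # Faulty form: EFFECTIVE_FROM_DT <= d AND d <= EFFECTIVE_TO_DT
        return [r for r in rows if r["eff_from"] <= d <= r["eff_to"]]
    # Fixed form: EFFECTIVE_FROM_DT <= d AND d < EFFECTIVE_TO_DT
    return [r for r in rows if r["eff_from"] <= d < r["eff_to"]]

boundary = date(2015, 1, 1)
print(len(lookup(history, boundary, inclusive_to=True)))   # -> 2 (duplicate)
print(len(lookup(history, boundary, inclusive_to=False)))  # -> 1 (single record)
```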
The duplicates existed because the insert into the flow table was faulty and inserted two
records: one with delete_flg='Y' and another with delete_flg='N'. The soft-deleted record
(delete_flg='Y') is not supposed to be inserted into the flow table; this happened because there
was no filter on delete_flg.
The bug fix was to introduce a filter W_EAM_WORKORDER_F.DELETE_FLG='N' in the temporary interface
PLP_EAMWorkOrderMonthlySnapshot.W_EAM_WO_SNAP_F_SQ_W_EAM_WORKORDER_F.
SDE_PSFT_CycleCountABCClassDimension
SDE_PSFT_CycleCountFact
SDE_PSFT_CycleCountHeaderDimension
SDE_PSFT_DomainGeneral_CST_COST_ELEMENTS
SDE_PSFT_InventoryAgingFact
SDE_PSFT_InventoryBalanceFact_Temporary
SDE_PSFT_InventoryDailyBalanceFact
SDE_PSFT_InventoryLotDailyBalanceFact
SDE_PSFT_ItemCostGeneral
SDE_PSFT_LotDimension
SDE_PSFT_LotDimension_Translate
SDE_PSFT_MovementTypeDimension_ReasonCode
SDE_PSFT_ProductLotTransactionFact
SDE_PSFT_Stage_GLAccountDimension_ProdLotTransactionDerive
SDE_PSFT_TransactionTypeDimension_CST_Cost_Elements
Requirement Description
To show the correct On Hand Amount for supplier-consigned inventory from a Fusion FSCM REL9
source. The BI Apps 11g data model already supports consigned inventory for Apps Unlimited
sources, so the requirement from PM is to add the wiring of this logic from the corresponding
Fusion Apps BI VO:
FA will add two new columns to the existing VO. The two attributes we need are Owning Entity ID
and Owning Type in the InventoryOnhandPVO (see Table 1). The BI VOs in REL9 will be updated with
these changes.
The Owning Entity ID and Owning Type will only be populated for records that are
supplier-consignment related. For now, FA will only support this type of consignment in REL9.
For the supplier consigned transactions, the Available Consigned Qty column will be populated.
Table 1
VO Name: InventoryOnhandPVO
Columns added: Owning Entity ID, Owning Type
Functional Description: FA will add two new columns, Owning Entity ID and Owning Type, in the
InventoryOnhandPVO.
In the SDE FUSION mappings for Inventory Daily Balance Fact, changes will be made to check for
the supplier consigned transactions using the values passed in via the two new columns.
Specifically, when the Owning Type is NOT NULL, the associated quantity will be classified as an
Available Consigned Quantity.
For the supplier consigned transactions, the Available Consigned Qty column (see table 2) will
be populated in the FS table instead of the Available Quantity (ON_HAND_QTY).
Other, non-consigned, transactions will populate the Available Quantity, but not Available
Consigned Quantity.
No other consigned-related columns (e.g., INSP_CONSIGN_QTY, RESTRICTED_CONSIGN_QTY,
BLOCKED_CONSIGN_QTY) are supported as part of this project.
No RPD or SIL/PLP changes are planned; the only changes are in the ETL SDE logic.
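The SDE classification rule described above can be sketched as follows (illustrative function and values, not the actual SDE mapping):

```python
# Sketch: classifying an inventory balance record based on Owning Type.
# When Owning Type is populated, the quantity is treated as an Available
# Consigned Quantity; otherwise it is a regular on-hand quantity.

def classify_balance(qty, owning_type):
    """Return (on_hand_qty, available_consign_qty) for one balance record."""
    if owning_type is not None:   # supplier-consigned record
        return (None, qty)
    return (qty, None)            # regular on-hand record

print(classify_balance(100, "SUPPLIER"))  # -> (None, 100)
print(classify_balance(100, None))        # -> (100, None)
```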
Table 2
Subject Area: Inventory Balance
Column Name: AVAILABLE_CONSIGN_QTY
Datatype: NUMBER(28,10)