OFSAAI User Guide 8.1.2.0.0


Oracle Financial Services Analytical

Applications Infrastructure
User Guide

Release 8.1.2.0.0

April 2024
OFS Analytical Applications Infrastructure User Guide
Copyright © 2024 Oracle and/or its affiliates. All rights reserved.
This software and related documentation are provided under a license agreement containing
restrictions on use and disclosure and are protected by intellectual property laws. Except as expressly
permitted in your license agreement or allowed by law, you may not use, copy, reproduce, translate,
broadcast, modify, license, transmit, distribute, exhibit, perform, publish, or display any part, in any
form, or by any means. Reverse engineering, disassembly, or decompilation of this software, unless
required by law for interoperability, is prohibited.
The information contained herein is subject to change without notice and is not warranted to be error-
free. If you find any errors, please report them to us in writing.
If this is software or related documentation that is delivered to the U.S. Government or anyone
licensing it on behalf of the U.S. Government, then the following notice is applicable:
U.S. GOVERNMENT END USERS: Oracle programs, including any operating system, integrated
software, any programs installed on the hardware, and/or documentation, delivered to U.S.
Government end users are “commercial computer software” pursuant to the applicable Federal
Acquisition Regulation and agency-specific supplemental regulations. As such, use, duplication,
disclosure, modification, and adaptation of the programs, including any operating system, integrated
software, any programs installed on the hardware, and/or documentation, shall be subject to license
terms and license restrictions applicable to the programs. No other rights are granted to the U.S.
Government.
This software or hardware is developed for general use in a variety of information management
applications. It is not developed or intended for use in any inherently dangerous applications,
including applications that may create a risk of personal injury. If you use this software or hardware in
dangerous applications, then you shall be responsible to take all appropriate fail-safe, backup,
redundancy, and other measures to ensure its safe use. Oracle Corporation and its affiliates disclaim
any liability for any damages caused by use of this software or hardware in dangerous applications.
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be
trademarks of their respective owners.
Intel and Intel Xeon are trademarks or registered trademarks of Intel Corporation. All SPARC
trademarks are used under license and are trademarks or registered trademarks of SPARC
International, Inc. AMD, Opteron, the AMD logo, and the AMD Opteron logo are trademarks or
registered trademarks of Advanced Micro Devices. UNIX is a registered trademark of The Open Group.
This software or hardware and documentation may provide access to or information about content,
products, and services from third parties. Oracle Corporation and its affiliates are not responsible for
and expressly disclaim all warranties of any kind with respect to third-party content, products, and
services unless otherwise set forth in an applicable agreement between you and Oracle. Oracle
Corporation and its affiliates will not be responsible for any loss, costs, or damages incurred due to
your access to or use of third-party content, products, or services, except as set forth in an applicable
agreement between you and Oracle.
For information on third-party licenses, click here.

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 2


Document Control

Version Number | Revision Date | Change Log

2.8 | April 2024
1. Updated XLAdmin roles (36053604).
2. Added MIS Date details in Fire Run execution (36419167).

2.7 | March 2024
1. Updated DB schema name restrictions (32414718).
2. Updated the Edit feature for Instance access token (36342497).
3. Updated Erwin version: removed 12.5 (36340048).

2.6 | February 2024
1. Updated the maximum number of allowed characters in a PLC code for CALL, CREATE, and REPLACE functions (35592826).

2.5 | January 2024
1. Updated the Session timeout value in Update General Details to more than 10 minutes (Doc 36099483).
2. Added REST API for Object Migration (36150017).
3. Updated Erwin version: 12.5 (36167048).
4. Included SAML_Entity in System Configuration (36198751).

2.4 | November 2023
1. Added:
• Creating Forms Using Data Exporter (35716664)
• Exporting Forms Created Using Data Exporter (35716664)
• Approving and Rejecting Records with Two-Factor Authentication
2. Removed View Workflow History from Questionnaire, as the feature is not supported (35972233).
3. Added CPP Execution Performance Enhancements (35027469).

2.3 | February 2023
Updated restricted characters for creating Attributes and Members in Adding Attribute Definition and Adding Member Definition (34908950).

2.2 | February 2023
Updated information regarding Model Upload Script Path and JSON Comparison (35060351) in the Sequence of Scripts Execution and Model Upload Using OFSAA Data Model Descriptor (JSON) File sections.

2.1 | January 2023
Included the procedure to attach supporting documents during data entry in Data Maintenance Interface.
Updated the steps for two-factor authentication for approving records in Data Maintenance Interface (34751606).

2.0 | December 2022
Updated the Setting Preferred Language section (Doc 34787889).

1.9 | November 2022
Updated alert information related to incorrect username, password, FTP server, and port in the Database Server and Application Server sections (34826719).

1.8 | October 2022
Updated record selection details in Editing Form Details and Deleting Form Details (34598676).

1.7 | August 2022
Updated the Erwin versions supported by OFSAAI in the Upload Business Model section (Doc 34511386).

1.6 | July 2022
Added a note on JIT Provisioning in the Update General Details section (Doc 34122972).

1.5 | July 2022
Added the Configuring Token-Based RFI section (Doc 34250090).

1.4 | June 2022
The following sections are updated:
• Update General Details (34104078)
• Passing Runtime Parameters in Data Mapping (34098279)
• Prerequisites (Doc 34237550)

1.3 | March 2022
Added the Command Line Utility for Partition-Based Derived Entities section (Doc 33482484).

1.22 | March 2022
The following sections are updated (Doc 33929561):
• Creating User Status Report
• Creating User Access Report

1.21 | February 2022
The following section is updated:
• Passing Runtime Parameters in Data Mapping (Doc 33684371)

1.2 | January 2022
Added the Command-Line Utility to Bulk Import User Groups to IDCS section (Doc 33410774).

1.1 | December 2021
• Updated the General Configurations if Big Data Processing License is Enabled section (33453948).
• Updated the Data Maintenance Interface section (Doc 33188698).
• Added the Command Line Utility for Resave, Refresh and Delete Partitions section (Doc 33482484).
• Updated the Monitoring Batch section (Doc 33598325).

1.0 | November 2021
Created. Added the Command-Line Utility for SQL Modeler to JSON (ODM) section for the enhancement in Release 8.1.2.0.0.



Table of Contents
1 Preface .................................................................................................................................. 20

1.1 What is New in this Release of OFSAAAI Application Pack...................................................................................... 20


1.1.1 New Features in Release 8.1.2.0.0 ............................................................................................................................ 20

1.1.2 Deprecated Features .................................................................................................................................................... 21


1.1.3 Desupported Features ................................................................................................................................................. 21

1.2 About this Manual ............................................................................................................................................................... 21


1.3 Audience .............................................................................................................................................................................. 22
1.4 Recommended Skills ......................................................................................................................................................... 22
1.5 Recommended Environment .......................................................................................................................................... 22
1.6 Prerequisites ........................................................................................................................................................................ 22
1.7 Conventions and Acronyms ............................................................................................................................................ 22

2 OFSAAI - An Overview .......................................................................................................... 25

2.1 Components of OFSAAI.................................................................................................................................................... 25


2.2 Accessing OFSAA Applications....................................................................................................................................... 26
2.3 OFSAA Login Page ............................................................................................................................................................. 27
2.3.1 Log in as System Administrator ............................................................................................................................... 27
2.3.2 Log in as System Authorizer ..................................................................................................................................... 28

2.3.3 Log in as Business User .............................................................................................................................................. 28

2.4 Setting Preferred Language ............................................................................................................................................ 29


2.4.1 Setting Language from OFSAA Login Page .......................................................................................................... 29

2.4.2 Setting Language from OFSAA Landing Page...................................................................................................... 29

2.5 Changing Password ........................................................................................................................................................... 29


2.6 OFSAA Landing Page ......................................................................................................................................................... 31
2.6.1 Header ............................................................................................................................................................................ 32

2.6.2 Navigation Drawer ....................................................................................................................................................... 33

2.7 Modules in OFSAAI ............................................................................................................................................................ 34


2.8 Logging in OFSAA .............................................................................................................................................................. 35
2.8.1 Purging of Logs ............................................................................................................................................................ 35

2.8.2 Log File Format ............................................................................................................................................................ 36

3 Data Model Management .....................................................................................................37



3.1 Upload Business Model .................................................................................................................................................... 39
3.1.1 Model Upload Using JSON / erwin XML ................................................................................................................. 41

3.1.2 Model Upload Using DB Catalog ............................................................................................................................. 44

3.1.3 Model Upload Using OFSAA Data Model Descriptor (JSON) File .....................................................................46

3.2 OFSAA Data Model Extensions through the SQL Data Modeler ............................................................................50
3.2.1 Customization Process ...............................................................................................................................................50

3.2.2 Steps for Creating XML File ..................................................................................................................... 52

3.2.3 Triggering Model Upload Process ............................................................................................................................ 53

3.3 Sequence of Scripts Execution ....................................................................................................................................... 53


3.4 Configuring Session Parameters .................................................................................................................................... 54
3.4.1 Specify Database Session Level Parameters ......................................................................................................... 55

3.5 Partitioning Support .......................................................................................................................................................... 56


3.5.1 Registering Partition Information ............................................................................................................................ 56
3.5.2 Sub Partitioning Support ............................................................................................................................................ 57

3.6 Configurations for File Formats for Hive Infodom..................................................................................................... 57


3.7 Model Versioning ............................................................................................................................................................... 58
3.8 Viewing Log Details ........................................................................................................................................................... 58
3.9 Log File Download ............................................................................................................................................................. 59

4 Data Management Framework ........................................................................................... 60

4.1 Data Management Tools ................................................................................................................................................. 60


4.2 Components of Data Management Tools .....................................................................................................................61
4.3 Data Sources ........................................................................................................................................................................61
4.3.1 Creating a Data Source .............................................................................................................................................. 63
4.3.2 Versioning and Make Latest Feature ...................................................................................................................... 77

4.3.3 Modifying a Data Source ........................................................................................................................................... 78

4.3.4 Viewing a Data Source................................................................................................................................................ 78


4.3.5 Copying a Data Source ............................................................................................................................................... 78

4.3.6 Deleting Data Sources ................................................................................................................................................ 78

4.3.7 Purging Data Sources ................................................................................................................................................. 79


4.4 Data Mapping...................................................................................................................................................................... 79
4.4.1 Creating Data Mapping Definition ............................................................................................................................81

4.4.2 Modifying a Data Mapping Definition ................................................................................................................... 110

4.4.3 Versioning and Make Latest Feature of Data Mapping ..................................................................................... 110



4.4.4 Copying Data Mapping Definition ........................................................................................................................... 111

4.4.5 Viewing Data Mapping Definition ............................................................................................................................ 111

4.4.6 Deleting Data Mapping Definitions ......................................................................................................................... 111

4.4.7 Purging Data Mapping Definitions .......................................................................................................................... 111

4.5 Post Load Changes ........................................................................................................................................................... 112


4.5.1 Creating Post Load Changes Definition................................................................................................................. 113

4.5.2 Versioning and Make Latest Feature ..................................................................................................................... 118

4.5.3 Modifying Post Load Changes Definition .............................................................................................................. 119

4.5.4 Viewing Data Mapping Definition ........................................................................................................................... 119

4.5.5 Deleting Post Load Changes Definition ................................................................................................................. 119

4.5.6 Purging Post Load Changes Definitions ............................................................................................................... 120

4.6 User Defined Functions .................................................................................................................................................. 120


4.6.1 Creating User Defined Functions (UDFs) ............................................................................................................... 121
4.6.2 Viewing UDFs ...............................................................................................................................................................123

4.6.3 Modifying the User Defined Functions ..................................................................................................................123

4.6.4 Purging User Defined Functions ............................................................................................................................. 124


4.7 DMT Configurations ........................................................................................................................................................ 124
4.7.1 General Configurations if Big Data Processing License is Enabled ................................................................ 124

4.7.2 General Configurations if Big Data Processing License is not enabled......................................................... 128

4.7.3 Cluster Registration ................................................................................................................................................... 129


4.7.4 Performance Optimizations .....................................................................................................................................132

4.8 Slowly Changing Dimensions (SCD) ............................................................................................................................ 134


4.8.1 Creating Slowly Changing Dimension ................................................................................................................... 136

4.8.2 Executing SCDs........................................................................................................................................................... 139

4.8.3 SCD Execution for Heterogeneous Support ......................................................................................................... 140

4.8.4 Modifying SCD Definition .......................................................................................................................................... 141

4.8.5 Viewing SCD Definition .............................................................................................................................................. 141

4.8.6 Purging SCD Definitions ............................................................................................................................................ 141

4.9 CPP Execution Performance Enhancements ............................................................................................................. 141


4.10 Data Quality Framework ................................................................................................................................................. 142
4.10.1 Data Quality Rules ..................................................................................................................................................... 142

4.10.2 Data Quality Groups.................................................................................................................................................. 163

4.10.3 Configure Dynamic Degree of Parallelism (DOP) in DQ Framework ...............................................................172



4.11 References ..........................................................................................................................................................................172
4.11.1 Flat file ...........................................................................................................................................................................172

4.11.2 RDBMS ..........................................................................................................................................................................172

4.11.3 RAC.................................................................................................................................................................................173

4.11.4 Expression Builder ......................................................................................................................................................173

4.11.5 Passing Runtime Parameters in Data Mapping .................................................................................................. 175

4.11.6 Populating Assignment Type Details .................................................................................................................... 176

5 Unified Analytical Metadata .............................................................................................. 178

5.1 Alias ..................................................................................................................................................................................... 178


5.1.1 Adding Alias ................................................................................................................................................................ 179

5.1.2 Viewing Alias ............................................................................................................................................................... 180

5.1.3 Deleting Alias .............................................................................................................................................................. 180

5.2 Derived Entity .................................................................................................................................................................... 180


5.2.1 Creating Derived Entity.............................................................................................................................................. 181

5.2.2 Adding Partition Values............................................................................................................................................ 185

5.2.3 Copying Derived Entity ............................................................................................................................................. 186

5.2.4 Viewing Derived Entity Properties .......................................................................................................................... 186

5.2.5 Modifying Derived Entity .......................................................................................................................................... 187

5.2.6 Deleting Derived Entity ............................................................................................................................................. 187

5.3 Datasets .............................................................................................................................................................................. 188


5.3.1 Creating Dataset ........................................................................................................................................................ 190

5.3.2 Viewing Dataset Details ........................................................................................................................................... 192

5.3.3 Modifying Dataset Details ....................................................................................................................................... 192

5.3.4 Copying Dataset Details ........................................................................................................................................... 193

5.3.5 Deleting a Dataset ..................................................................................................................................................... 193


5.4 Dimension Management ................................................................................................................................................ 193
5.4.1 Components of Dimension Management ............................................................................................................ 194

5.4.2 Attributes ..................................................................................................................................................................... 195

5.4.3 Members ...................................................................................................................................................................... 199


5.4.4 Build Hierarchy .......................................................................................................................................................... 204

5.4.5 Hierarchy Maintenance ............................................................................................................................................ 210

5.5 Measure ...............................................................................................................................................................................217



5.5.1 Creating Business Measure ..................................................................................................................................... 218

5.5.2 Viewing Business Measure .......................................................................................................................................221

5.5.3 Modifying Business Measure ...................................................................................................................................221

5.5.4 Copying Business Measure .......................................................................................................................................221

5.5.5 Deleting Business Measure .......................................................................................................................................221

5.6 Business Processor .......................................................................................................................................................... 222


5.6.1 Adding Business Processor...................................................................................................................................... 223

5.6.2 Viewing Business Processor .................................................................................................................................... 226

5.6.3 Editing Business Processor ...................................................................................................................................... 227

5.6.4 Copying Business Processor .................................................................................................................................... 227

5.6.5 Deleting Business Processor.................................................................................................................................... 227

5.7 Expression.......................................................................................................................................................................... 228


5.7.1 Adding Expression Definition .................................................................................................................................. 229
5.7.2 Viewing Expression .....................................................................................................................................................231

5.7.3 Modifying Expression.................................................................................................................................................231

5.7.4 Copying Expression ....................................................................................................................................................231


5.7.5 Checking Dependencies ............................................................................................................................................231

5.7.6 Deleting Expression ................................................................................................................................................... 232

5.8 Filter ..................................................................................................................................................................................... 232


5.8.1 Navigating to Filters .................................................................................................................................................. 232
5.8.2 Adding Filter Definition ............................................................................................................................................ 233

5.8.3 Viewing Filter Definition ...........................................................................................................................................242

5.8.4 Modifying Filter Definition .......................................................................................................................................242

5.8.5 Copying Filter Definition...........................................................................................................................................242

5.8.6 Checking Dependencies ...........................................................................................................................................242

5.8.7 Viewing SQL of Filter .................................................................................................................................................243

5.8.8 Deleting Filter Definition ..........................................................................................................................................243

5.8.9 Download Filter Data, Bulk Edit, and Upload ......................................................................................................243

5.9 Map Maintenance ............................................................................................................................................................247


5.9.1 Creating a Mapper Definition................................................................................................................................. 248

5.9.2 Mapper Maintenance ............................................................................................................................................... 250

5.9.3 Default Secure Map ................................................................................................................................................... 255

5.9.4 Modifying Mapper Definition .................................................................................................................................. 255

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 9


5.9.5 Copying Mapper Definition...................................................................................................................................... 255

5.9.6 Deleting Mapper Definition .....................................................................................................................................256

5.9.7 Non-Dynamic Mapper Definitions ..........................................................................................................................256

5.10 Analytics Metadata .......................................................................................................................................................... 257


5.10.1 Dimension .................................................................................................................................................................... 257

5.10.2 Cubes ............................................................................................................................................................................ 262

5.11 References ........................................................................................................................................................................ 268


5.11.1 Scenario to Understand Dataset Functionality .................................................................................................. 268

5.11.2 Operator Types.......................................................................................................................................................... 268

5.11.3 Function Types and Functions ............................................................................................................................... 269

5.11.4 Creating Expression using Expression Builder .................................................................................................... 275

5.11.5 Base and Computed Measures ............................................................................................................................... 277

5.11.6 Business Hierarchy Types ........................................................................................................................................278


5.11.7 Measure Types ...........................................................................................................................................................287

5.11.8 Read Only Selected in Mapper Window ............................................................................................................... 290

6 Data Entries Forms and Queries ........................................................................................292

6.1 Excel Upload (Atomic) ..................................................................................................................................................... 292


6.1.1 Navigating to Excel Upload (Atomic) ....................................................................................................................293

6.1.2 Excel-Entity Mappings ..............................................................................................................................................293

6.1.3 Adding Excel-Entity Mappings ...............................................................................................................................293

6.1.4 Excel Upload ...............................................................................................................................................................295

6.2 Forms Designer ............................................................................................................................................................... 296


6.2.1 Creating a New Form ................................................................................................................................................297

6.2.2 Altering Existing Forms ........................................................................................................................................... 304

6.2.3 Copying Forms........................................................................................................................................................... 305


6.2.4 Deleting Forms .......................................................................................................................................................... 306

6.2.5 Assigning Rights........................................................................................................................................................ 306

6.2.6 Message Type Maintenance .................................................................................................................................. 307


6.3 Forms Authorization ...................................................................................................................................................... 308
6.4 Data Entry .......................................................................................................................................................................... 310
6.4.1 Viewing Form Details ................................................................................................................................................. 311

6.4.2 Searching Records ......................................................................................................................................................312



6.4.3 Editing Form Details ...................................................................................................................................................313

6.4.4 Adding Form Data ......................................................................................................................................................313

6.4.5 Authorizing Record .................................................................................................................................................... 314

6.4.6 Exporting Form Data ................................................................................................................................................. 316

6.4.7 Copying Form Data ....................................................................................................................................................317

6.4.8 Deleting Form Details ................................................................................................................................................317

6.4.9 References ....................................................................................................................................................................317

7 Data Maintenance Interface.............................................................................................. 324

7.1 Prerequisites ...................................................................................................................................................................... 325


7.1.1 Enabling DMI User Interface in Application Menu Tree .................................................................................... 325
7.1.2 User Role Mapping and Access Rights .................................................................................................................. 325

7.2 Creating Forms Definition .............................................................................................................................................. 326


7.2.1 Before you Begin ........................................................................................................................................................ 327

7.2.2 Creating Forms Definition Using Excel Upload ................................................................................................... 327

7.2.3 Creating Forms Using Data Exporter .....................................................................................................................331

7.2.4 Creating Forms Definition Using Designer ........................................................................................................... 332

7.3 Creating Data Filters for New Form Definitions........................................................................................................ 338


7.5 Enabling User Security for New Form Definitions.................................................................................................... 339
7.6 Editing a Forms Definition ............................................................................................................................................ 340
7.7 Approving or Rejecting Forms Definition .................................................................................................................. 341
7.7.1 Before you Begin ........................................................................................................................................................ 341

7.7.2 Approving a Forms Definition ................................................................................................................................. 341

7.7.3 Rejecting a Forms Definition ...................................................................................................................................342

7.8 Data Entry ..........................................................................................................................................................................343


7.8.1 Viewing Data Entry ....................................................................................................................................................343
7.8.2 Data Entry for Forms Created Without the Auto-Approve Option ..................................................................343

7.8.3 Data Entry for Forms Created using the Auto-Approve Option .................................................................... 349

7.8.4 Other Options for Data Entry ................................................................................................................................. 349


7.8.5 Exporting Forms Created Using Data Exporter ....................................................................................................351

8 Rule Run Framework .......................................................................................................... 353

8.1 Components of Rule Run Framework .......................................................................................................................354



8.2 Rule ......................................................................................................................................................................................354
8.2.1 Components of Rule Definition............................................................................................................................... 355

8.2.2 Create Rule ..................................................................................................................................................................356

8.2.3 View Rule Definition ................................................................................................................................................. 370

8.2.4 Edit Rule Definition .....................................................................................................................................................371

8.2.5 Copy Rule Definition .................................................................................................................................................. 372

8.2.6 Authorize Rule Definition ......................................................................................................................................... 372

8.2.7 Export Rule to PDF ..................................................................................................................................................... 372

8.2.8 Trace Rule Definition Details ................................................................................................................................... 373

8.2.9 Delete Rule Definition ............................................................................................................................................... 373

8.2.10 Backdated Execution ................................................................................................................................................374

8.3 Process................................................................................................................................................................................376
8.3.1 Create Process ............................................................................................................................................................378
8.3.2 View Process Definition ........................................................................................................................................... 386

8.3.3 Edit Process Definition ............................................................................................................................................. 386

8.3.4 Copy Process Definition ...........................................................................................................................................387


8.3.5 Authorize Process Definition ...................................................................................................................................387

8.3.6 Export Process to PDF.............................................................................................................................................. 388

8.3.7 Trace Process Definition Details............................................................................................................................ 389


8.3.8 Delete Process Definition ........................................................................................................................................ 389

8.4 Run...................................................................................................................................................................................... 389


8.4.1 Create Run .................................................................................................................................................................. 390

8.4.2 View Run Definition .................................................................................................................................................. 400

8.4.3 Edit Run Definition ..................................................................................................................................................... 401

8.4.4 Copy Run Definition ................................................................................................................................................. 402

8.4.5 Authorize Run Definition ......................................................................................................................................... 402

8.4.6 Export Run to PDF .................................................................................................................................................... 402

8.4.7 Fire Run ....................................................................................................................................................................... 404

8.4.8 Delete Run Definition ............................................................................................................................................... 405

8.5 Manage Run Execution .................................................................................................................................................. 405


8.5.1 Creating Manage Run Definition ........................................................................................................................... 406

8.5.2 Viewing Manage Run Definition ............................................................................................................................ 408

8.5.3 Editing Manage Run Definition ............................................................................................................................. 408



8.6 Utilities ................................................................................................................................................................................ 410
8.6.1 Component Registration .......................................................................................................................................... 410

8.7 References ......................................................................................................................................................................... 412


8.7.1 How Run Rule Framework is used in LLFP Application ..................................................................................... 412

8.7.2 How Run Rule Framework is used in LRM Application ...................................................................................... 413

8.7.3 Process Hierarchy Members .................................................................................................................................... 415

8.7.4 Hierarchical Member Selection Modes ................................................................................................................. 416

8.7.5 Significance of Pre-Built Flag .................................................................................................................................. 416

8.7.6 Seeded Component Parameters in RRF ................................................................................................................ 417

9 Operations .......................................................................................................................... 427

9.1 Batch Maintenance ..........................................................................................................................................................427


9.1.1 Adding Batch Definition .......................................................................................................................................... 428

9.1.2 Specify Task Details .................................................................................................................................................. 431

9.2 Batch Execution ............................................................................................................................................................... 434


9.2.1 Executing Batch .........................................................................................................................................................435

9.2.2 Modifying Task Definitions of a Batch................................................................................................................. 439

9.3 Batch Scheduler ............................................................................................................................................................... 440


9.3.1 Creating Batch Schedule .......................................................................................................................................... 441

9.3.2 Updating Existing Batch Schedule ........................................................................................................................ 443

9.4 Batch Monitor .................................................................................................................................................................. 443


9.4.1 Crash Handling of Backend Servers ..................................................................................................................... 444

9.4.2 Monitoring Batch ...................................................................................................................................................... 444

9.5 Processing Report ........................................................................................................................................................... 447


9.6 Execution View Log ........................................................................................................................................................ 448
9.7 Batch Cancellation .......................................................................................................................................................... 449
9.7.1 Cancelling Batch ....................................................................................................................................................... 450

9.7.2 Aborting Batch ........................................................................................................................................................... 451


9.8 View Log ............................................................................................................................................................................. 451
9.8.1 Search and View Task ID Log ..................................................................................................................................452

9.9 References .........................................................................................................................................................................453


9.9.1 Task Component Parameters .................................................................................................................................453

10 Questionnaire..................................................................................................................... 465



10.1 Know the Questionnaire Workflow ............................................................................................................................. 465
10.2 Questionnaire Types ...................................................................................................................................................... 466
10.3 Use Search in the Questionnaire ................................................................................................................................. 466
10.3.1 Use the Basic Search ................................................................................................................................................ 467

10.3.2 Use the Advanced Search ....................................................................................................................................... 467

10.4 Configure the Questionnaire Attributes .................................................................................................................... 468


10.4.1 Add Questionnaire Attributes ................................................................................................................................ 470

10.4.2 Edit the Questionnaire Attributes ..........................................................................................................................473

10.4.3 Delete the Questionnaire Attributes ..................................................................................................................... 474

10.5 Define the Questions ...................................................................................................................................................... 474


10.5.1 Create the Questions in the Library ...................................................................................................................... 475

10.5.2 Edit the Questions From the Library ..................................................................................................................... 481

10.5.3 Create Questions by Copying Existing Questions ............................................................................................. 482


10.5.4 Delete the Questions from the Library ................................................................................................................. 482

10.5.5 View the Associated Questionnaires .................................................................................................................... 482

10.5.6 Wrap and Unwrap Questions from the Library .................................................................................................. 483
10.6 Define the Questionnaires ............................................................................................................................................ 483
10.6.1 Create the Questionnaire in the Library .............................................................................................................. 484

10.6.2 Approve the Questionnaires .................................................................................................................................. 492

10.6.3 Edit the Questionnaire From the Library ............................................................................................................. 493


10.6.4 Create the Questionnaire by Copying an Existing Questionnaire ................................................................. 494

10.6.5 Delete the Questionnaire from the Library ......................................................................................................... 494

10.6.6 Wrap and Unwrap the Questionnaire from the Library ................................................................................... 494

10.7 Configuring Token-Based RFI ...................................................................................................................................... 495

11 System Configuration and Identity Management .......................................................... 497

11.1 System Configuration .................................................................................................................................................... 497


11.1.1 Navigating to System Configuration .................................................................................................................... 497

11.1.2 Components of System Configuration................................................................................................................. 498


11.1.3 Database Server ........................................................................................................................................................ 498

11.1.4 Application Server .................................................................................................................................................... 503

11.1.5 Web Server ................................................................................................................................................................. 505


11.1.6 Database Details ....................................................................................................................................................... 509



11.1.7 OLAP Details ................................................................................................................................................................513

11.1.8 Configure Email Configuration ............................................................................................................................... 516

11.1.9 Instance Access Token ............................................................................................................................................. 517

11.1.10 Information Domain .................................................................................................................................................. 523

11.1.11 Configuration .............................................................................................................................................................. 527

11.1.12 Application ................................................................................................................................................................. 544

11.1.13 View OFSAA Product Licenses After Installation of Application Pack ......................................................... 547

11.2 Identity Management ..................................................................................................................................................... 548


11.2.1 Navigating to Identity Management .................................................................................................................... 548

11.2.2 Components of Identity Management ................................................................................................................. 548

11.2.3 Mappings in Identity Management ....................................................................................................................... 549

11.2.4 User Administrator ................................................................................................................................................... 550


11.2.5 System Administrator ...............................................................................................................................................567

11.2.6 User Activity Report................................................................................................................................................... 577

11.2.7 User Profile Report.....................................................................................................................................................578


11.2.8 Enable User .................................................................................................................................................................579

11.3 References ........................................................................................................................................................................ 580


11.3.1 List of Objects Created in Information Domain ................................................................................................. 580

11.3.2 Authentication and Logging .................................................................................................................................. 580


11.3.3 Populating Execution Statistics ............................................................................................................................. 580

11.3.4 SMS Auto Authorization .......................................................................................................................................... 581

12 Reports................................................................................................................................ 582

12.1 Accessing Reports ............................................................................................................................................................582


12.2 Creating User Status Report ..........................................................................................................................................582
12.3 Creating User Attribute Report .....................................................................................................................................585
12.4 Creating User Admin Activity Report ......................................................................................................................... 586
12.5 Creating User Access Report .........................................................................................................................................587
12.6 Creating Audit Trail Report ........................................................................................................................................... 589
12.7 Resizing and Sorting Reports ....................................................................................................................................... 590

13 Object Administration ....................................................................................................... 592

13.1 Access Object Administration and Utilities based on Information Domain ......................................................592
13.2 Object Security Concept in OFSAAI .............................................................................................................................593

13.2.1 User Group Authorization ........................................................................................................................................593

13.2.2 User Group Scope ......................................................................................................................................................593

13.2.3 User Group Access Right ......................................................................................................................................... 594

13.2.4 Object Access Type ...................................................................................................................................................595

13.3 OFSAA Seeded Security..................................................................................................................................................595


13.3.1 OFSAA Seeded User Groups....................................................................................................................................595

13.3.2 OFSAA Seeded Roles ................................................................................................................................................597

13.3.3 OFSAA Seeded Actions and Functions ................................................................................................................ 598

13.4 Object Security ................................................................................................................................................................. 599


13.4.1 Metadata Segment Mapping ................................................................................................................................. 599

13.4.2 Batch Execution Rights............................................................................................................................................ 600

13.5 Object Migration .............................................................................................................................................................. 602


13.5.1 Offline Object Migration .......................................................................................................................................... 602

13.5.2 Online Object Migration ...........................................................................................................................................623


13.6 Translation Tools ............................................................................................................................................................. 632
13.6.1 Config Schema Download........................................................................................................................................632

13.6.2 Config Schema Upload .............................................................................................................................................633

13.7 Utilities ............................................................................................................................................................................... 634


13.7.1 Metadata Authorization ...........................................................................................................................................635
13.7.2 Save Metadata .......................................................................................................................................................... 636

13.7.3 Write-Protected Batch ..............................................................................................................................................637

13.7.4 Metadata Difference ................................................................................................................................................ 638

13.7.5 Patch Information ..................................................................................................................................................... 638

13.7.6 Transfer Documents Ownership............................................................................................................................ 639

13.8 References ........................................................................................................................................................................ 640


13.8.1 Scenario to Understand Hierarchy Security........................................................................................................ 640

13.8.2 Role Mapping Codes ................................................................................................................................................. 641

13.8.3 Function Role Mapping ........................................................................................................................................... 642

14 Command Line Utilities ..................................................................................................... 648

14.1 Command Line Utility to Migrate Objects ................................................................................................................. 648


14.1.1 Prerequisites............................................................................................................................................................... 649
14.1.2 Migrating Objects Using OBJECTMIGRATION.xml File .................................................................................... 650

14.1.3 Migrating Objects Using CSV Files ........................................................................................................................ 658

14.1.4 Limitations.................................................................................................................................................................. 663

14.1.5 Objects Supported for Command Line Migration ............................................................................................. 664

14.1.6 Dependent Objects ................................................................................................................................................... 670

14.1.7 Migrating Security Management System (SMS) Objects ..................................................................................673

14.2 Command Line Utilities to Execute RRF Definitions................................................................................................675


14.2.1 Command Line Utility for Rule Execution ............................................................................................................675

14.2.2 Command Line Utility for Fire Run Service\Manage Run Execution ........................................................... 676

14.3 Command Line Utility for DMT Migration .................................................................................................................677


14.3.1 Prerequisites................................................................................................................................................................677

14.3.2 Modes of Operation ................................................................................................................................................. 679

14.3.3 Few Important Pointers ............................................................................................................................................ 681

14.3.4 Logs.............................................................................................................................................................................. 682


14.3.5 Troubleshooting ........................................................................................................................................................ 682

14.4 Command Line Utility for File Encryption ................................................................................................................. 683


14.5 Command Line Utility to Publish Metadata in Metadata Browser ...................................................................... 685
14.6 Command Line Utility for Object Application Mapping in Metadata Browser ................................................. 686
14.7 Command Line Utility for Resaving UAM Hierarchy Objects ............................................................................... 687
14.7.1 Executing RUNIT.sh from Console ........................................................................................................................ 687

14.7.2 Executing RUNIT.sh from Operations Module (ICC) ......................................................................................... 688


14.7.3 Executing RUNIT.sh from RRF Module ................................................................................................................ 688

14.7.4 Utility Status Information........................................................................................................................................ 689

14.8 Command Line Utility for Resaving Derived Entities and Essbase Cubes ........................................................ 689
14.8.1 Command Line Utility for Resave, Refresh and Delete Partitions .................................................................. 691

14.8.2 Command Line Utility for Partition-Based Derived Entities ............................................................................ 691

14.9 Command Line Utility for Mapper Pushdown ......................................................................................................... 696


14.10 Command Line Utility for Downloading Metadata Objects in PDF Format ...................................................... 697
14.11 Command Line Utility for LDAP Migration ............................................................................................................... 698
14.12 Model Upload Utility ....................................................................................................................................................... 699
14.12.1 Run the Model Upload Utility ................................................................................................................................. 700

14.12.2 Model Upload Details ............................................................................................................................................... 702

14.13 Command Line Utility for Object Registration ......................................................................................................... 703


14.14 Command Line Utility for Transforming erwin XML to Database XML or JSON(ODM) ................................ 704

14.15 Command Line Utility for Generating Slice JSON (ODM) ..................................................................................... 705
14.16 Command-Line Utility for SQL Modeler to JSON (ODM) ...................................................................................... 706
14.17 Command Line Utility for Populating AAI_DMM_METADATA Table ................................................................ 706
14.18 Command-line Utility to Bulk Import User Groups to IDCS .................................................................................. 707

15 Rest APIs for Object Migration Utility .............................................................................. 709

15.1 Objectmigration Export API .......................................................................................................................................... 709


15.1.1 Endpoint Details ........................................................................................................................................................ 709
15.1.2 Sample ......................................................................................................................................................................... 709

15.2 Objectmigration Import API.......................................................................................................................................... 709


15.2.1 Endpoint Details ........................................................................................................................................................ 709

15.2.2 Sample .......................................................................................................................................................................... 710

15.3 Objectmigration Export Status API .............................................................................................................................. 710


15.3.1 Endpoint Details ......................................................................................................................................................... 710

15.4 Objectmigration Import Status API .............................................................................................................................. 710


15.4.1 Endpoint Details ......................................................................................................................................................... 710

15.5 Summary Objecttypes API .............................................................................................................................................. 711


15.5.1 Endpoint Details .......................................................................................................................................................... 711

15.6 Summary Objectcodes API ............................................................................................................................................. 711


15.6.1 Endpoint Details .......................................................................................................................................................... 711

15.7 Object Migration Download Dump API ........................................................................................................................ 711


15.7.1 Endpoint Details .......................................................................................................................................................... 711

15.8 Object Migration Upload Dump API ............................................................................................................................. 711


15.8.1 Endpoint Details .......................................................................................................................................................... 711

16 References ........................................................................................................................... 712

16.1 Calendar ..............................................................................................................................................................................712


16.2 Function Mapping Codes ................................................................................................................................................712
16.3 External Scheduler Interface Component ...................................................................................................................712
16.3.1 Architecture ..................................................................................................................................................................713

16.3.2 Scope of Integration ...................................................................................................................................................713


16.3.3 ESIC Invocation........................................................................................................................................................... 714

16.3.4 Batch Execution Mechanism ................................................................................................................................... 715

16.3.5 External Scheduler Batch Run ID........................................................................................................................... 720

16.3.6 Batch Monitoring ........................................................................................................................................................721

16.3.7 Advantages of ES ........................................................................................................................................................721

16.3.8 OFSAAI Standard XML ..............................................................................................................................................721

16.3.9 Exit Status Specifications ......................................................................................................................................... 722

16.3.10 ESIC Operations using Wrapper Scripts ................................................................................................................ 723

16.3.11 ESIC Operations Using Command Line Parameters and Job Types .............................................................. 725

16.3.12 Additional Information on ESIC ..............................................................................................................................729

16.4 File Upload Requirements ............................................................................................................................................. 730

17 Preferences ......................................................................................................................... 731

18 Appendix A .......................................................................................................................... 732

18.1 OFS Analytical Applications Infrastructure User Groups and Entitlements....................................................... 732
18.2 OFS Analytical Applications Infrastructure User Roles ........................................................................................... 732
18.3 OFS Analytical Applications Infrastructure Functions ............................................................................................743
18.4 OFS Analytical Applications Infrastructure Group - Role Mapping .................................................................... 764

PREFACE
WHAT IS NEW IN THIS RELEASE OF OFSAAAI APPLICATION PACK

1 Preface
OFSAAI provides the framework for building, running, and managing applications, with out-of-the-box support for various deployment models, compliance with technology standards, and support for a host of operating systems, middleware, and databases, as well as integration with enterprise-standard infrastructure.
The information contained in this document is intended to give you an understanding of the features in Oracle Financial Services Analytical Applications Infrastructure.
• What is New in this Release of OFSAAAI Application Pack
• About this Manual
• Audience
• Recommended Skills
• Recommended Environment
• Prerequisites
• Conventions and Acronyms

1.1 What is New in this Release of OFSAAAI Application Pack


This section lists new features and changes in the OFSAAAI Application Pack for Release 8.1.2.0.0.

1.1.1 New Features in Release 8.1.2.0.0


This section lists the new features described in this User Guide.

Table 1: New features in the OFSAAAI Application Pack Release 8.1.2.0.0

Feature Description

Data Maintenance Interface The enhancements are as follows:


• The Audit-Trail Window now shows the Audit Time-stamp in
milliseconds.
• The Master-Child Data Maintenance Support enables Banks to
maintain the data for the Master Table and its Child Tables (up to
four Child Tables). Users must provide data for all selected tables in
a single action.

Filter Download Filter Data, Bulk Edit, and Upload using the Utility to download
and upload Filter Definitions.

Link Analysis Performance Optimization to reduce the time taken to load graphs.

Model Upload The Command-Line Utility for SQL Modeler to JSON (ODM) to transform
SQL Modeler to ODM (JSON).

Object Migration The reverse population of Migrated Hierarchies using Offline Migration.

Operations Menu Privileged-based access to the functionalities in all of the Batch Screens.


System Configuration • Web to Processing Tier communication through Hostname where


the Processing Tier binds to the IP Address and not the Hostname.
• UI Configuration to enable Just in Time (JIT) Provisioning to
synchronize the User, Group, and User-Group Mapping in external
systems such as LDAP, SAML, and SSO into OFSAA when a User
logs in.

Refresh Rate of Monitoring Change the default setting to 10 seconds to refresh the Batch Monitor
Batch and Batch Execution Windows.

Security Management • Utility to generate CSV file containing User, Groups, and User-Group
System Mapping in User Administrator.
• Following enhancements in User Maintenance:
 When you create a User, the Enable User check box is
selected by default.
 When you create a User, Start Date-Time displays the Current Date-Time by default, and the End Date displays 31-December-2050 by default.
 Lock statuses such as Enable, Disable, and Delete, with authorization required for status updates.
 Authorization is not required to view the Enable User
Window in the User Activity Report.

For more details, see the Oracle Financial Services Advanced Analytical Applications Infrastructure Release
8.1.2.0.0 Readme.

1.1.2 Deprecated Features


There are no Deprecated Features in this release.

1.1.3 Desupported Features


There are no Desupported Features in this release.

1.2 About this Manual


This manual explains the functionalities of Oracle Financial Services Analytical Applications Infrastructure (OFSAAI) in a procedural approach. OFSAAI is integrated with multiple modules that cover areas such as data extraction and transformation, definition and execution of rules and processes for molding a set of data, and application of different techniques on raw data for model design purposes.
It also encompasses modules that are essential to make the Infrastructure Application flexible according to user requirements. These modules perform administration; definition of servers, databases, and Information Domains; and other configuration processes such as segment and metadata mapping, hierarchy security, and design of the Infrastructure menu functions. The last section of this document contains references and feedback information pertaining to any issues noticed within the document.


1.3 Audience
This guide is intended for:
• Business Analysts who are instrumental in solution designing and creation of statistical models
using historical data.
• System Administrators (SA) who are instrumental in maintaining and executing batches, making the
Infrastructure Application secure and operational, and configuring the users and security of
Infrastructure.

1.4 Recommended Skills


• System Administrators should be aware of database concepts and the underlying database
structure of the Infrastructure Application from an operational perspective. System Administrators
also need to be technically sound in configuring the databases for data extraction procedures.
• Business Analysts must have in-depth knowledge of the underlying data sources that store the
organization's data, the ETL concepts of data warehousing and associated terminology, and the
statistical techniques for model design and execution.

1.5 Recommended Environment


For best viewing of Infrastructure pages, set the window resolution to a minimum of 1024 x 768
pixels.
For a list of compatible browsers, see the Oracle Financial Services Analytical Applications 8.1.2.0.0
Technology Matrix.

1.6 Prerequisites
• Successful installation of Infrastructure and related software.
• Good understanding of business needs and administration responsibilities.
• In-depth working knowledge of business statistics.

1.7 Conventions and Acronyms


The table describes the Conventions and Acronyms that are used in this document.

Table 2: Conventions and Acronyms

Conventions Description

Window Names are italicized.

Window actions are indicated in Bold

ALM Asset Liability Management

AMHM Attributes Members Hierarchies Module

ANSI American National Standards Institute


API Application Programming Interface

ARIMA Auto Regressive Integrated Moving Average

ASCII American Standard Code for Information Interchange

AW Analytical Workspace

BA Business Analysts

BI Business Intelligence

BMM Business Metadata Management

BP Business Processor

CF Cash Flow

CSV Comma Separated Values

DBA Database Administrator

DEFQ Data Entry Forms and Queries

DMP Window or Memory Dump

DQ Data Quality

DSN Data Source Name

ELT Extract Load Transform

EPM Enterprise Performance Management

ES External Scheduler

ESIC External Scheduler Interface Component

ETL Extract Transform Load

EWMA Exponentially Weighted Moving Average

FTP File Transfer Protocol

GARCH Generalized Auto Regressive Conditional Heteroskedasticity

GMV General Market Variable

HTML Hyper Text Markup Language

HTTP Hypertext Transfer Protocol

Infodom Information Domain

IP Internet Protocol

JDBC Java Database Connectivity

JSON JavaScript Object Notation

JVM Java Virtual Machine

LDAP Lightweight Directory Access Protocol

LHS menu Left hand side menu


MDB Microsoft Access Database

MOLAP Multidimensional Online Analytical Processing

NE Non Editable

OBIEE Oracle Business Intelligence Enterprise Edition

ODBC Open Database Connectivity

OFSAAI Oracle Financial Services Analytical Applications Infrastructure

OHC Oracle Help Center

OLAP Online Analytical Processing

PDF Portable Document Format

PFT Profitability Management

PR2 Process Run Rule framework

RAC Real Application Cluster

RDBMS Relational Database Management System

RHS Right Hand Side

RRF Run Rule Framework

SA System Administrator

SFTP Secure File Transfer Protocol

SID System ID

SMS Security Management System

SQL Structured Query Language

T2T Table to Table

TBD To be Deleted

TFM Technical File Maintenance

TNS Name Transparent Network Substrate Name

TP Transfer Pricing

URL Uniform Resource Locator

VaR Value at Risk

XML Extensible Markup Language

OFSAAI - AN OVERVIEW
COMPONENTS OF OFSAAI

2 OFSAAI - An Overview
Oracle Financial Services Analytical Applications Infrastructure (OFSAAI) is a general-purpose Analytics
Applications infrastructure that provides the tooling platform necessary to rapidly configure and develop
analytic applications for the financial services domain. It is built with Open-Systems Compliant
architecture providing interfaces to support business definitions at various levels of granularity.
Applications are built using OFSAAI by assembling business definitions or business metadata starting
from data-model to lower grain objects like Dimensions, Metrics, Security Maps, and User Profile to higher
order objects like Rules, Models, and Analytic Query Templates which are assembled using the lower grain
ones. In addition to application definition tools, it provides the entire gamut of services required for
Application Management including Security Service, Workflow Service, Metadata Management,
Operations, Life-cycle Management, public APIs and Web Services that are exposed to extend and enrich
the tooling capabilities within the applications.
Oracle Financial Services Analytical Applications Infrastructure is the complete end-to-end Business
Intelligence solution that is easily accessible via your desktop. A single interface lets you tap your
company’s vast store of operational data to track and respond to business trends. It also facilitates
analysis of the processed data. Using OFSAAI, you can query and analyze data that is complete, correct,
and consistently stored in a single place. It also lets you filter the data that you are viewing and using for
analysis.
It allows you to personalize information access to the users based on their role within the organization. It
also provides a complete view of your enterprise along with the following benefits:
• Track enterprise performance across information data store.
• Use one interface to access all enterprise databases.
• Create consistent business dimensions and measures across business applications.
• Automate the creation of coordinated data marts.
• Use your own business language to get fast and accurate answers from all your databases.
• Deploy an open XML and web-based solution against all major relational or multi-dimensional
databases on Microsoft Windows and UNIX servers.
This chapter provides an overview of Infrastructure, its components, and explains how these components
are organized in the Splash window with the user login process.

2.1 Components of OFSAAI


The OFSAA Infrastructure consists of the following components/modules that are used to deploy an
analytical solution.
• Data Model Management
• Data Management Tools
• Unified Analytical Metadata
• Data Entry Forms and Queries
• Data Management Framework
• Data Maintenance Interface

• Rules Run Framework


• Infrastructure Modules
• Operations
• Questionnaire
• System Configuration and Identity Management
• Object Administration

All components are encapsulated within a common Security and Operational framework as shown in the
following figure:

Figure 1: Security and Operational framework

Infrastructure also supports many business analytical solution(s) like Operational Risk, PFT, and Basel,
which are licensed separately to the organization. This manual provides an overview of only the
technological components.
For a detailed overview of OFSAAI modules, see the Modules in OFSAAI section.

2.2 Accessing OFSAA Applications


OFSAA can be accessed through your web browser as soon as the System Administrator (SA) installs and
configures Oracle Financial Services Analytical Applications.


The SA will provide you with a link through which you can access Oracle Financial Services Analytical
Applications. You can access the login window through your web browser using the URL
http(s)://<IP Address of the Web Server>:<servlet port>/<context name>/login.jsp.
You can also login to the application with the host name instead of the IP address.
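As an illustration of how the pieces of this URL fit together, the short sketch below assembles it from its parts. The host, port, and context name shown are hypothetical examples, not values from any particular installation:

```python
# Sketch: assembling the OFSAA login URL from its parts.
# Host, port, and context values below are hypothetical examples.
def build_login_url(host, servlet_port, context, use_https=True):
    scheme = "https" if use_https else "http"
    return f"{scheme}://{host}:{servlet_port}/{context}/login.jsp"

# Either an IP address or a host name works for the host part.
print(build_login_url("10.0.0.15", 9080, "OFSAAI"))
print(build_login_url("ofsaa.example.com", 9080, "OFSAAI", use_https=False))
```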

2.3 OFSAA Login Page


On entering the URL (<IP Address/hostname of the Web Server>:<servlet port>/<context
name>/login.jsp) in your browser window, the OFSAA Login Page is displayed:

Figure 2: OFSAA Login Page

You can select the required language from the Language drop-down list. The language options displayed
in the drop-down list are based on the language packs installed for the OFSAA infrastructure. Based on
the selected Language, the appropriate language login window is displayed.
Enter the User ID and Password provided by the System Administrator and click Login. You will be
prompted to change your password on your first login. For details on how to change password, see the
Changing Password section.
In case the OFSAA setup has been configured for OFSAA native Security Management System (SMS)
Authentication, the password to be entered will be as per the password restrictions set in the OFSAA SMS
repository.

2.3.1 Log in as System Administrator


Post installation, the first login into Infrastructure is possible only for a System Administrator through user
ID “sysadmn”. This ID is created at the time of installation with default password as “password0”.


Enter User ID as “sysadmn” and password as “password0”. Click Login.

2.3.2 Log in as System Authorizer


System Authorizer ID is also created at the time of installation with the default password “password0”.
This ID is required to authorize the users created by the system administrator.
Enter login id as “sysauth” and password as “password0”. Click Login.

2.3.3 Log in as Business User


Business users are created by the System Administrator and authorized by the System
Authorizer.
Enter User ID and Password provided by the System Administrator and click Login.

2.3.3.1 OFSAA Login if LDAP Servers are configured


If the OFSAA setup has been configured for LDAP Authentication, the Login Page is displayed as shown:

Figure 3: OFSAA Login Page

1. Enter your User ID and Password (as in LDAP store) in the respective fields.
2. Select the appropriate LDAP Server from the drop-down list, against which you want to get
authenticated. This is optional. If you do not select any server, you will be authenticated against the
appropriate LDAP server.


NOTE For SYSADMN/SYSAUTH/GUEST users, there is no need to select any LDAP server, as
they are always authenticated against the SMS store. Additionally, if a specific user has been
marked as “SMS Auth Only” in the User Maintenance window, that user is authenticated
against the SMS store instead of the LDAP store even though the OFSAA instance is
configured for LDAP authentication, and has to enter the password as per the SMS store.

2.4 Setting Preferred Language


You can set your preferred language from,
• OFSAA Login Page
• OFSAA Landing Page

2.4.1 Setting Language from OFSAA Login Page


If you set your preferred language from the OFSAA Login page, the language will be changed for all the
future sessions.
You can select the required language from the Language drop-down list. The language options displayed
in the drop-down list are based on the language packs installed for the OFSAA infrastructure. Based on
the selected Language, the appropriate language login window is displayed.
When you select a new language and log in, the OFSAA Landing Page is displayed in the previously
selected language. To view the application in the newly selected language, log in to the application again.

2.4.2 Setting Language from OFSAA Landing Page


Select the language from the drop-down list to view the current session in a different language.
If you set your preferred language from the OFSAA Landing Page, the selection is valid only for the
current session. When you log in again, the language reverts to the preferred language selected from the
Login page.

NOTE By default, it is set to US-English.

2.5 Changing Password


You can change your password at any time by clicking your user name in the top-right corner and
selecting Change Password.
Note that this option is available:
• If SMS Authentication & Authorization is configured as Authentication Type from the
Configuration window.


• If LDAP Authentication & SMS Authorization is configured as Authentication Type from the
Configuration window and the SMS Auth Only checkbox is selected for the user in the User
Maintenance window.
• If SSO Authentication & SMS Authorization is configured as Authentication Type from the
Configuration window and the SMS Auth Only checkbox is selected for the user in the User
Maintenance window.

Figure 4: OFSAA Change Password window

In the Change Password window, enter a new password, confirm it, and click OK to view the OFSAA Login
window. Refer to the following guidelines for Password Creation:
• Passwords are displayed as asterisks (stars) while you enter. This is to ensure that the password is
not revealed to other users.
• Ensure that the entered password is at least six characters long.
• The password must be alphanumeric with a combination of numbers and characters.
• The password should not contain spaces.
• Passwords are case sensitive and ensure that the Caps Lock is not turned ON.
• By default, the currently used password is checked for validity if password history is not set.
• The new password should be different from previously used passwords based on the password
history, which can be configured.
For more information, see the Configuration section in System Configuration chapter.
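As an illustration only, the password guidelines listed above (at least six characters, a mix of letters and numbers, no spaces) can be expressed as a small check. This is a sketch of the documented rules, not the actual server-side OFSAA SMS validation, which also enforces configurable password history:

```python
# Sketch: checking a candidate password against the guidelines above.
# Illustrative only; the real validation happens in the OFSAA SMS repository.
def check_password(password):
    problems = []
    if len(password) < 6:
        problems.append("must be at least six characters long")
    if " " in password:
        problems.append("must not contain spaces")
    if not (any(c.isalpha() for c in password) and any(c.isdigit() for c in password)):
        problems.append("must combine letters and numbers")
    return problems  # an empty list means the password passes these checks

print(check_password("abc123"))  # passes all three checks
print(check_password("abc"))     # too short and has no digit
```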
If you encounter any of the following problems, contact the System Administrator:
• Your user ID and password are not recognized.
• Your user ID is locked after three consecutive unsuccessful attempts.


• Your user ID has been disabled.


• The guest user cannot change the password.

2.6 OFSAA Landing Page


On successful login, the OFSAA Landing Page is displayed.

Figure 5: OFSAA Landing Page

OFSAA Landing Page shows the available Applications as tiles, for which a user has access to. Clicking the
respective Application Tile launches that particular Application. You can change the Landing Page based
on your preference.
For more information, see the Preferences section.


2.6.1 Header
Figure 6: OFSAA Header

Hamburger/Navigation Menu Icon: This icon is used to trigger the Application Navigation Drawer.
Application Icon: This icon is used to show the available Applications installed in your environment at
any time.
Administration Icon: This icon is used to go to the Administration window. The Administration window
displays modules like System Configuration, Identity Management, Database Details, Manage OFSAA
Product Licenses, Create New Application, Information Domain, Translation Tools, and Process Modelling
Framework as Tiles.
Reports Icon: This icon is used to launch various User Reports such as User Status Report, User Attribute
Report, User Admin Activity Report, User Access Report, and Audit Trail Report.
Language Menu: It displays the language you selected on the OFSAA Login screen. The language options
displayed in the Language Menu are based on the language packs installed in your OFSAA instance. Using
this menu, you can change the language at any point of time.
User Menu: Clicking this icon displays the following menu:

Figure 7: User Menu

• Preferences: To set the OFSAA Landing Page.
• Change Password: To change your password. For more information, see the Change Password
section. This option is available only if SMS Authorization is configured.
• Log Out: To log out from OFSAA applications.


Last Login Details - This displays the last login details as shown:

Figure 8: Last Login Details

2.6.2 Navigation Drawer


Click Hamburger Icon to launch the Navigation Drawer as shown:

Figure 9: Navigation List drawer

Here the navigation items appear as a list. The First Level menu shows the installed applications. Clicking
an application displays the second-level menu with the application name and Common tasks menu. The
arrangement of the menu depends on your installed application.
Clicking an item in the menu displays the next level sub menu and so on. For example, to display Data
Sources, click Financial Services Enterprise Modeling>Data Management>Data Management
Framework>Data Management Tools>Data Sources.

Figure 10: Data Management Tools Menu

Click Hierarchical Menu to display the navigation path of the current sub menu as shown:


Figure 11: Data Management Tools Hierarchical Menu

The RHS Content Area shows the Summary Page of Data Sources. Click anywhere in the Content Area to
hide the Navigation Drawer. To launch it again, click the Hamburger icon.
Click Home to display the OFSAA Landing Screen.

2.7 Modules in OFSAAI


• Data Model Management is intended for uploading the warehouse data from the operational
systems to database schema using erwin XML file.
• Data Management Framework is a comprehensive data integration platform that facilitates all the
data integration requirements from high-volume and high-performance batch loads to event-
driven integration processes and SOA-enabled data services. This module is used for managing
Data movement. This includes sub modules like Data Sources, Data Mapping, Post Load Changes
and Data Quality Framework.
• Data Entry Forms and Queries module facilitates you to design web-based user friendly Data Entry
windows with a choice of layouts for easy data view and data manipulation. This module has sub
modules like Forms Designer, Data Entry, and Excel Upload.
• Data Maintenance Interface module helps in the design and creation of forms that are in a user-
specified format. Authorized users with the required privileges can use these forms to view and
update existing data in the database.
• Unified Analytical Metadata is intended for the Information and Business Analysts who are
instrumental in supporting and affecting analytical decisions. This module is used to define and
maintain analytical metadata definitions. This module has sub modules like Alias, Derived Entity,
Dataset, Dimension Management, Business Measure, Business Processor, Build Hierarchy, Business
Dimension, Essbase Cube, Filters, Expression, Map Maintenance, and Cube Migration.
• Rule Run Framework facilitates you to define a set of rules, reporting objects, and processes that
are required to transform data in a warehouse. This module has sub modules like Rule, Process,
Run, and Manage Run Execution.
• Metadata Browser module provides extensive browsing capabilities of metadata, helps in tracking
the impact of changes to metadata, and trace through to the source of originating data. The
metadata in the Metadata Browser window is organized into different categories like Data
Foundation Metadata, Business Metadata, and Process Metadata.
• Operations module facilitates you in administration and processing of business data to create the
highest level of efficiency within the system and to derive results based on a specified rule. It
includes sections like Batch Maintenance, Batch Execution, Batch Scheduler, Batch Monitor, Batch
Processing Report, Batch Cancellation, and View Log.


• Questionnaire module is an assessment tool, which presents a set of questions to users, and
collects the answers for analysis and conclusion. It can be interfaced or plugged into OFSAA
application packs.
• System Configuration & Identity Management module facilitates System Administrators to
provide security and operational framework required for Infrastructure. Administration window has
a Tiles menu with Tiles like System Configuration, Identity Management, Database Details, Manage
OFSAA Product Licenses, Create New Application, Information Domain, Translation Tools and
Process Modelling Framework.
• Object Administration facilitates System Administrators to define the security framework with the
capacity to restrict access to the data and metadata in the warehouse, based on a flexible, fine-
grained access control mechanism. These activities are mainly done at the initial stage and then on
a need basis. It includes sections like Object Security, Object Migration, and Utilities (consisting of
Metadata Difference, Metadata Authorization, Save Metadata, Write-Protected Batch, Component
Registration, Transfer Document Ownership, and Patch Information).

NOTE For information about OFSAA Product Licenses after installation of Application Packs, see the
View OFSAA Product Licenses After Installation of Application Pack section.

2.8 Logging in OFSAA


Logging in OFSAA is done using Log4j. The log files are available in the following locations:
• UI/Web Logs: <DEPLOYED LOCATION>/<Context>.ear/<Context>.war/logs
• Application Logs: $FIC_HOME/logs
• Execution Logs: /ftpshare/logs/<MISDATE>/<INFODOM>/<COMPONENT NAME>/<LOG FILE
NAME>.log
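The execution log location follows a fixed pattern, so the path for a given run can be composed mechanically. The sketch below illustrates the pattern; the MISDATE format, Infodom, and component values shown are hypothetical examples:

```python
# Sketch: composing an execution log path from the documented pattern
# /ftpshare/logs/<MISDATE>/<INFODOM>/<COMPONENT NAME>/<LOG FILE NAME>.log
# The example values are hypothetical.
def execution_log_path(misdate, infodom, component, log_file):
    return "/ftpshare/logs/" + "/".join([misdate, infodom, component, log_file]) + ".log"

print(execution_log_path("20240405", "MYINFODOM", "DATA_QUALITY", "DQ_RUN_1"))
```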

2.8.1 Purging of Logs


Configure the logger-related attributes in the RevLog4jConfig.xml file available in the
$FIC_HOME/conf/ folder. This file has an appender for each log file, and the attributes pertaining to a
particular appender can be changed.
The default size of the log files is set to 5000 KB, and the maximum number of backup log files retained is
set to 5; both are configurable. Whether these parameters can be increased to higher values depends on
the server hardware configuration, and increasing them may reduce performance.
To configure the Logs file size, follow these steps:
1. Navigate to $FIC_HOME/conf folder or <DeployedLocation>/<context.war>/<context>/
and locate RevLog4jConfig.xml file.
2. Configure the logger related attributes in the RevLog4jConfig.xml file. This file will have
Appenders for each log file.
Sample Appender for UMM log file is shown:


<RollingFile name="UMMAPPENDER"
    fileName="/scratch/ofsaaweb/weblogic/user_projects/domains/cdb/applications/cdb.ear/cdb.war/logs/UMMService.log"
    filePattern="/scratch/ofsaaweb/weblogic/user_projects/domains/cdb/applications/cdb.ear/cdb.war/logs/UMMService-%i.log">
    <PatternLayout>
        <Pattern>[%d{dd-MM-yy HH:mm:ss,SSS zzz aa}{GMT}] [%-5level] [WEB] %m%n</Pattern>
    </PatternLayout>
    <Policies>
        <SizeBasedTriggeringPolicy size="5000 KB" />
    </Policies>
    <DefaultRolloverStrategy max="5" /> <!-- number of backup files -->
</RollingFile>
3. To change the log file size, modify the value set for SizeBasedTriggeringPolicy size.
4. To change the number of backup files to be retained, modify the value set for
DefaultRolloverStrategy max.
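Steps 3 and 4 amount to editing two attribute values in the appender. As a sketch of what such an edit looks like programmatically, the snippet below adjusts a trimmed in-memory sample of the appender configuration; in practice you would load and save the actual $FIC_HOME/conf/RevLog4jConfig.xml file, and the new values shown are arbitrary examples:

```python
import xml.etree.ElementTree as ET

# Trimmed stand-in for the RevLog4jConfig.xml appender section (illustrative).
SAMPLE = """<Appenders>
  <RollingFile name="UMMAPPENDER">
    <Policies>
      <SizeBasedTriggeringPolicy size="5000 KB"/>
    </Policies>
    <DefaultRolloverStrategy max="5"/>
  </RollingFile>
</Appenders>"""

root = ET.fromstring(SAMPLE)
appender = root.find(".//RollingFile[@name='UMMAPPENDER']")
# Step 3: change the log file size; Step 4: change the number of backups.
appender.find(".//SizeBasedTriggeringPolicy").set("size", "10000 KB")
appender.find(".//DefaultRolloverStrategy").set("max", "10")
print(ET.tostring(root, encoding="unicode"))
```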

2.8.2 Log File Format


In OFSAA, log format is standardized and can be read by any standard log analysis tool. The standard log
format is as follows:
[GMT TIMESTAMP] [LOGGER LEVEL] [LOGGER LOCATION] [MODULE/COMPONENT] [LOGGED
IN USER] [JAVA CLASS] <LOG MESSAGE>
Sample:
[25-04-18 10:08:41,066 GMT AM] [INFO ] [WEB] [UMM] [UMMUSER]
[BUSINESSMETADATA] Inside createImplicitObjectsForAllInfodom
[25-04-18 10:08:41,069 GMT AM] [INFO ] [WEB] [UMM] [UMMUSER]
[BUSINESSMETADATA] Call createImplicitObjectsForMapper for infodom = TESTCHEF
[25-04-18 10:08:42,142 GMT AM] [DEBUG] [WEB] [UMM] [UMMUSER]
[BUSINESSMETADATA] Source created successfully for infodom TESTCHEF
[25-04-18 10:08:42,142 GMT AM] [INFO ] [WEB] [UMM] [UMMUSER]
[BUSINESSMETADATA] Start - code added to create user group hierarchy for this
infodom
[25-04-18 10:08:42,142 GMT AM] [INFO ] [WEB] [UMM] [UMMUSER]
[BUSINESSMETADATA] Inside createUserGroupHierarchyForInfodom
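Because the format is standardized, each line can be split into its fields with a simple pattern. The sketch below is one way to do that, assuming exactly the six bracketed fields shown above followed by the free-text message:

```python
import re

# Sketch: splitting a line of the standardized OFSAA log format into fields.
# Assumes six bracketed fields followed by the message, as documented above.
LOG_LINE = re.compile(
    r"\[(?P<timestamp>[^\]]*)\]\s+"
    r"\[(?P<level>[^\]]*)\]\s+"
    r"\[(?P<location>[^\]]*)\]\s+"
    r"\[(?P<module>[^\]]*)\]\s+"
    r"\[(?P<user>[^\]]*)\]\s+"
    r"\[(?P<java_class>[^\]]*)\]\s+"
    r"(?P<message>.*)"
)

def parse_log_line(line):
    match = LOG_LINE.match(line)
    if match is None:
        raise ValueError("line does not match the standard log format")
    fields = match.groupdict()
    fields["level"] = fields["level"].strip()  # levels are space-padded, e.g. "INFO "
    return fields

sample = ("[25-04-18 10:08:41,066 GMT AM] [INFO ] [WEB] [UMM] [UMMUSER] "
          "[BUSINESSMETADATA] Inside createImplicitObjectsForAllInfodom")
print(parse_log_line(sample))
```

A parser like this lets any standard log analysis tool, or a small script, filter OFSAA logs by user, module, or level.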


3 Data Model Management


Model refers to a data structure that consists of well-organized business data for analysis. Data Model
explicitly determines the structured data which stores persistent information in a relational database and
is specified in a data modeling language.
Data Model Maintenance within the Infrastructure system facilitates you to upload the warehouse data
from the operational systems to database schema using JSON (ODM)/ erwin XML file or Database
Catalog.
An erwin XML file is a standard tagged XML file based on the Object Property Model that can create the
required data models. You can upload the XML file by hosting it on the server and customize the update
process while uploading a Business Model.
You can upload Database.XML or JSON (ODM) files instead of erwin XML for Model Upload. In
addition, you can also upload an erwin XML file and convert it to JSON (ODM). A command line utility,
TransformErwin.sh, is provided that can be run on a lower environment to generate Database.XML or
JSON (ODM) files from the erwin XML file, thereby saving the time taken to transform erwin XML to
Database.XML or JSON (ODM) during the model upload process. For more information, see Command
Line Utility for Transforming erwin XML to Database XML or JSON (ODM).
For a Sliced upload, you can also use a command line utility to validate and generate only the updated
JSONs for the Model Upload, which reduces the number of files required. You can use the provided
generateSliceJson.sh utility, which can be run on a lower environment to generate a JSON (ODM) file
from the old Database.XML or erwin XML file and the new Database.XML or erwin XML file. For more
information, see Command Line Utility for Generating Slice JSON (ODM).
The Database Catalog feature is used to generate a business model out of the database catalog
information. This can be used when a database physically exists and the business model has to be
reverse-generated for OFSAA metadata references. The reverse model generation feature can be
extended to RDBMS based Infodoms as well. This populates the following:
• OFSAA logical model abstraction layer, that is, the JSON files for the Infodom.
• Object registration repository
Following are the prerequisites while working with Business Model upload:
• Buffer pool has to be available to cache the table and index data.
• The page size for the Tablespace has to be created appropriately.
Following are the Model Upload modes available in the Business Model Upload window:

Table 3: Fields in the Business Model Upload and their Descriptions

Field Description

New You can upload a new business model only when you are uploading a model
for the first time for the selected Information Domain. This option is not
available for subsequent model uploads.
JSON / erwin and DB Catalog options are available for New Model Upload.

Incremental Supported incremental changes include:


• Add tables
• Drop tables
• Alter table to add a column
• Alter table to change/remove an existing column
The existing model details are extracted and uploaded along with the
specific incremental updates. This option is available only with the
subsequent model uploads and captures all the metadata pertaining to the
changes in the database schema. The same can be tracked to assess the
impact.
The Incremental option is not supported if DB Catalog is selected for the
Model Upload option.

Rebuild You can re-build a model on the existing model in the database. The
existing model is replaced with the current model details. This option is
available with the subsequent model uploads and the current model
uploaded is considered as the latest model for the selected Information
Domain.
Any incremental changes are considered as a ‘Rebuild’ if DB Catalog is
selected as the Model Upload option.

Sliced You can quickly upload the Sliced model with only the incremental changes,
without merging the tables or columns of an existing model. In a Sliced
Model Upload you can incrementally add new tables, add/update columns
in the existing tables, and add/update primary/foreign keys in the existing
model. You can also drop a column or primary/foreign key. However,
dropping a table is not supported. This option is available only with the
subsequent model uploads.
• Sliced Model Upload is faster compared to other upload types as it
optimizes the system memory usage and reduces the file size of
erwin.xml.
• Sliced is not supported if DB Catalog is selected for the Model Upload
option.
In sliced model upload, if the version of the Base model existing in the
environment is higher than the Sliced model getting uploaded, then the
columns (which are not present in the Sliced model) are not dropped. For
more information, see the Model Versioning section.
Sliced Model Upload compares the existing entity JSON available in the
aai_dmm metadata table. Based on the checksum values:
• If the checksum matches, it will ignore the JSON.
• If the checksum values do not match, then the model upload is carried
out and overwrites the existing JSON.
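The checksum decision described for the Sliced upload can be sketched in a few lines. This is an illustration of the idea only: the use of SHA-256 and an in-memory dict standing in for the aai_dmm metadata table are assumptions, not OFSAA's actual implementation:

```python
import hashlib

# Sketch: deciding whether an entity JSON needs re-upload by comparing
# checksums. SHA-256 and the plain dict (standing in for the aai_dmm
# metadata table) are illustrative assumptions.
def checksum(json_text):
    return hashlib.sha256(json_text.encode("utf-8")).hexdigest()

def needs_upload(stored_checksums, json_name, json_text):
    # Unchanged JSONs (matching checksum) are ignored; others are uploaded.
    return stored_checksums.get(json_name) != checksum(json_text)

stored = {"TBL_ACC~80000.json": checksum('{"table": "TBL_ACC"}')}
print(needs_upload(stored, "TBL_ACC~80000.json", '{"table": "TBL_ACC"}'))     # unchanged
print(needs_upload(stored, "TBL_ACC~80000.json", '{"table": "TBL_ACC_V2"}'))  # changed
```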


NOTE To access the Import Model framework within the Infrastructure system, you (Business
Analysts) must have the IBMADD (Import Business Model) Function Role mapped. To access
the Data Model Upload window and add Models, you must have the DMM_ADD Function
mapped to the Role, and the Role (for example, DMMWRITE) must be mapped to the
particular User Group. For additional information, see My Oracle Support Document ID
2773375.1.

Figure 12: Business Model Upload Summary window

The Business Model Upload Summary window lets you upload the required Business Model and
displays a summary of previously uploaded Business Models with their Name, Type (New/
Incremental/Rebuild/Sliced), Enable NoValidate status (Y or N), Result of upload
(Success/Failed/Running), Start Date, End Date, Log File path, and Status. You can click the View Log link
in the Status column corresponding to the required model to view the Model Upload details in the View
Log Details window.

NOTE To display the summary of the previous Model Uploads, you must have a connection pool
established to access data from the database. For more information on connection pooling,
see the OFS AAAI Application Pack Installation & Configuration Guide available in the OHC
Documentation Library.

You can also search for a specific model based on the Name or Type (New / Incremental / Rebuild /
Sliced) existing within the system.

3.1 Upload Business Model


You can upload a new model or update/re-build an existing model to the database schema. The option to
upload a business model is available based on the existing model in the selected Information Domain.
Note the following:


• OFSAAI supports erwin 9.8, 2018 R1, 2019 R1, 2020 R1, 2020 R2, 2021 R1, and 12.1 generated XMLs in
the Model Upload process.
• From time to time, erwin withdraws support for older versions. However, you can open prior-version
data models using the latest versions of the erwin modeler and save them as repository files in the
OFSAA-supported versions.
• By default, OFSAAI supports Data Model up to 2 GB. To configure the permissible size specific to
requirements, see the Frequently Asked Questions section in OFS AAAI Installation Guide.
• Ensure that the XML file to be uploaded is saved in “All Fusion Repository Format”.
• Datatypes of TIMESTAMP WITH TIME ZONE and TIMESTAMP WITH LOCAL TIME ZONE are
supported for Model Upload. However, the processing of these datatypes is not supported in
OFSAAI.
To upload a Business Model:
1. From the Business Model Upload Summary window, click Add. The Business Model Upload
window is displayed.
2. (Mandatory) Enter a Name for the model being uploaded. Ensure that the name specified does not
exceed 30 characters in length and does not contain special characters such as #, %, &, ‘,
and “.
3. Select the required Upload Option. The options are JSON / erwin XML, DB Catalog, and Data
Model Descriptor. For more information on each option, see the corresponding sections:
 Model Upload Using JSON / erwin XML
 Model Upload Using DB Catalog
 Model Upload Using OFSAA Data Model Descriptor

NOTE For subsequent model uploads, you must select the same
Upload Option as used in the first model upload. That is, if you
selected erwin as the Upload Option for the first-time model
upload, then the subsequent model uploads must be done
using the erwin option only.

4. Click Upload Model. The model upload execution is triggered and you are re-directed to the Model
Upload Summary window with the upload details in the summary grid. The “Status” of current
upload is indicated as Running and after the process is completed, the status is updated as either
Success or Failed depending on the execution.

NOTE Index creation is not supported from Apache Hive version 3.0 and higher. To skip the index
creation, you can update APACHE_HIVE_VERSION in the Configuration table and restart
FICServer.


NOTE To display the current upload status, you must have a connection pool
established to access data from the database. For more information on
connection pooling, see OFS AAAI Installation Guide.

You can click View Log to view the model upload details and also Download Log File to a location for
reference.

NOTE Even if the object registration fails, the Model Upload process will be
successful. In such cases, you must manually do the object registration by
running the Command line utility for Object Registration, since object
registration is mandatory for subsequent model upload to be successful.

NOTE The Model Upload process is stopped if any errors are encountered. It
does not proceed until completion to capture all the errors.
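The naming restriction in step 2 (at most 30 characters, none of #, %, &, ', ") can be expressed as a small check. This is a sketch of the rule as documented, not an official OFSAA validation routine:

```python
# Sketch: checking a model name against the restriction in step 2:
# at most 30 characters; no #, %, &, ', or " characters.
FORBIDDEN = set('#%&\'"')

def is_valid_model_name(name):
    return 0 < len(name) <= 30 and not (set(name) & FORBIDDEN)

print(is_valid_model_name("FSDF_MODEL_812"))                        # valid
print(is_valid_model_name("MODEL#1"))                               # contains '#'
print(is_valid_model_name("A_VERY_LONG_MODEL_NAME_OVER_30_CHARS"))  # too long
```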

3.1.1 Model Upload Using JSON / erwin XML


You can upload the warehouse data from the operational systems to the database schema using the erwin
XML, JSON or Database XML file. Using the stand-alone command line utility TransformErwin.sh, you
can transform the erwin XML into Database XML or JSON.
You can also use the DB.XML or JSON instead of erwin XML to speed up the model upload process. For
more information, see Command Line Utility for Transforming erwin XML to Database XML or
JSON(ODM).
If you are using other utilities to convert the file to the JSON (ODM) format, you must ensure the
following:
• A zip file is created with the model name as its name and .ODM as its extension.
• The zip file contains:
 A master XML file, generated once the transformation is completed. The master XML file
contains the model name (if provided while invoking the utility) along with the list of generated
JSON files.
 A JSON file for each table definition in the erwin XML file. Each JSON file name is made up of the
table name and the erwin model version separated by a ~ (tilde) symbol.
For example:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jsonupload>
    <modelname> </modelname> <!-- model name -->
    <jsonfiles>
        <jsonfile>TBL_MSG~80000.json</jsonfile> <!-- JSON file names -->
        <jsonfile>TBL_ACC~80000.json</jsonfile>
        <jsonfile>TBL_ACC_CLASS~80000.json</jsonfile>
        <jsonfile>TBL_LOAN_APP~80000.json</jsonfile>
        <jsonfile>TBL_CUST~80000.json</jsonfile>
        <jsonfile>TBL_LOAN~80000.json</jsonfile>
    </jsonfiles>
</jsonupload>
To upload a Business Model, host the JSON or XML file (erwin or Database) on the server; you can
customize the update process during the upload.
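The package layout described above can be sketched in shell. The model name, table files, and the 80000 version suffix are illustrative, and the final zip step is shown commented because packaging details may vary by utility:

```shell
# Assemble a hypothetical JSON(ODM) package for a model named MDL_SAMPLE.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# One JSON file per table definition: <table name>~<erwin model version>.json
printf '{}' > 'TBL_MSG~80000.json'
printf '{}' > 'TBL_ACC~80000.json'

# Master XML listing the model name and the generated JSON files.
{
  echo '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>'
  echo '<jsonupload>'
  echo '<modelname>MDL_SAMPLE</modelname>'
  echo '<jsonfiles>'
  for f in *.json; do
    echo "<jsonfile>$f</jsonfile>"
  done
  echo '</jsonfiles>'
  echo '</jsonupload>'
} > master.xml

# Finally, zip the master XML and the JSON files as MDL_SAMPLE.ODM, e.g.:
# zip MDL_SAMPLE.ODM master.xml *.json
```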

Figure 13: Business Model Upload window for JSON / erwin XML

To perform Model Upload using the JSON / erwin option, follow these steps:
1. In the Business Model Upload window, select Upload Options as JSON / erwin XML.
2. Select the Upload Mode from the drop-down list. You can select New only for the first model
upload. For subsequent uploads, you can select Incremental, Rebuild, or Sliced upload mode. For
more information, see Model Upload modes. For the Sliced Model Upload, you can use SQL Data
Modeler. For more information, see OFSAA Data Model Extensions through the SQL Data Modeler.
3. Select the Object Registration Mode from the drop-down list as Full Object Registration or
Incremental Object Registration. You can select Incremental Object Registration for the Upload
Mode as Incremental and Sliced. It is recommended to select incremental only if the changes are
minimal.


NOTE Incremental object registration is not supported for models having
super type-sub type and partitions.

4. Select the Upload File Details pane, and select the upload file type from the following:
 JSON: Select the ODM File for Upload from the drop-down list.

 XML: Select the erwin XML or Database XML file for upload from the drop-down list.

5. The list displays the ODM, erwin, or Database files that reside in the default server path, that is,
ftpshare/<infodom>/erwin/erwinXML in the Application layer.

NOTE The erwin XML file name should contain only alphanumeric
characters and underscores.

6. In the Additional Options grid, perform the following tasks:


a. Select Yes to directly Update the Database Schema with Model changes.
 If you select Yes, the generated SQL scripts are executed at runtime to update the model
changes in the database.
 If you select No, it restricts the system from updating the database automatically with the
model changes and only the model scripts are created. Later, you must execute the SQL
scripts in the correct sequence in order to make the Infodom Schema to be consistent with
the JSONs persisted in the DB. For more information, see Sequence of Execution of
Scripts.
Additionally, when you select No, ensure the following:
 You have a third party tool or ETL tool to manage the schema updates.
 Database consistency and schema updates are maintained manually by the database
administrator.

NOTE Only the table scripts are created and they must be updated
manually. If you choose this option for the first time and later
perform an Incremental / Sliced / Complete Model Re-build,
you must manually synchronize the schema with the Database
Schema.


b. Select Yes for the Generate DDL Execution Logs option if you want execution audit
information such as the execution start time, end time, and status of each SQL statement run as
part of the Model Upload process. The execution log file is available under the
ftpshare/<INFODOM>/executionlogs folder.
c. Select Yes for the Refresh Session Parameters option to use Database session parameters
during the model upload process.
For more information, see Configuring Session Parameters section.
d. Select Yes to directly update the Alter constraints in NOVALIDATE State. During the
Incremental or Sliced Model Upload, alterations to the constraints consume a lot of time as the
constraints have to be validated.
 If you select Yes, an option to alter the constraints in the NOVALIDATE state is enabled
and it will not check the existing data for the integrity constraint violation. It is useful when
the existing data is clean. Therefore, NOVALIDATE potentially reduces the additional
overhead of the constraint validation and enhances the performance.
 By default, the option selected is No and the option to alter the constraints is not enabled.
It checks the existing data for the integrity constraint violation.

NOTE Note the following points about the NOVALIDATE option.


• Constraints in the NOVALIDATE state are supported
only in Incremental and Sliced modes.
• The Model Upload process, irrespective of the status of
success or failure, brings the constraints into the
NOVALIDATE state. Therefore, ENABLE VALIDATE
must be done as a post-model upload activity. That is,
Rollback does not validate the constraints that are
non-validated during the upload activity.
• The NOVALIDATE option is not relevant for the HDFS
systems.

7. Click Upload Model.
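As a quick guard against the file-naming restriction in the note above, a name can be checked before the file is placed in the upload directory. The helper name and sample file names here are illustrative:

```shell
# Returns success only when the name (minus the .xml extension) contains
# solely alphanumeric characters and underscores.
is_valid_erwin_name() {
  printf '%s' "$1" | grep -Eq '^[A-Za-z0-9_]+\.xml$'
}
```

For example, `is_valid_erwin_name "FSDF_Model_8121.xml"` succeeds, while a name containing spaces or hyphens fails.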

3.1.2 Model Upload Using DB Catalog


The Database Catalog (DB Catalog) feature is used to generate a business model out of the database
catalog information. This can be used when a database physically exists, and the business model has to be
reverse-generated for OFSAA metadata references. The reverse model generation feature can also be
extended to RDBMS based Infodoms. This Model Upload process populates the following:
• OFSAA logical model abstraction layer, that is, the JSON files for the Infodom.


• Object registration repository.

NOTE erwin is the primary and bootstrap mode to register the Data
Model with the OFSAA ecosystem. The DB Catalog option does
not take care of the logical artifacts. Hence, do not consider DB
Catalog as a replacement for erwin.

To perform Model Upload using the DB Catalog option:


1. From the Business Model Upload window, select Upload Options as DB Catalog.

Figure 14: Business Model Upload window for DB Catalog

2. Select the Upload Mode from the drop-down list. You can select New only for the first upload. For
subsequent uploads, you can select Rebuild.
For more information, see the Model Upload modes section.
 If the table details are specified in the
$OFSAA_HOME/conf/dmm/Input_DBCatalog_Objects.properties file, then the
application selects the specified tables for DB Catalog. The Entity Filters are not available
for selection if the table details are specified in the properties file.
 If the tables are not specified, the application uploads all the tables from the database.
3. Specify the Entity Filters by entering details in the Starts With, Contains, and Ends With fields.
The Filters are patterns for entity names in the Database and can restrict the Database Model
generation to a specific set of entities. The Database Model is generated even if one of the specified
filter conditions matches.
4. You can also specify multiple conditions for a single filter type using comma-separated values. For
example, tables starting with TB and TM can be specified as “TB, TM”.


5. Click Upload Model.
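The comma-separated filter behavior (for example, a Starts With value of "TB, TM") can be sketched as follows; the table names used here are illustrative:

```shell
# Keep only tables whose names start with one of the comma-separated patterns.
starts_with_filter="TB, TM"
tables="TB_ACCOUNT TM_RATES FCT_LEDGER TB_CUSTOMER"

matched=""
for t in $tables; do
  # Split the filter on commas; surrounding spaces disappear via word splitting.
  for p in $(printf '%s' "$starts_with_filter" | tr ',' ' '); do
    case "$t" in
      "$p"*) matched="$matched $t"; break ;;
    esac
  done
done
matched=${matched# }   # trim the leading space
echo "$matched"        # the tables that pass the filter
```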

3.1.3 Model Upload Using OFSAA Data Model Descriptor (JSON) File
This feature allows you to resume the Data Model upload from the logical Data Model, in the form of the
OFSAA Data Model Descriptor File (JSON) generated in the base environment. This helps speed up the
Model Upload process by skipping the XSL transformation in the primary environment.
This feature can be used if the same model in the development environment should be uploaded to
multiple OFSAA instances in the production environment. In such scenarios, you can copy the model
definition (JSON) files and scripts to the target environment and run the command line utility
CopyUpload.sh, to integrate those files in the target environment. You can choose to resume the model
upload process from script generation or script execution.
Following are the steps involved in the model upload using OFSAA Data Model Descriptor file:
1. Copy the required files from source to target environment based on the start point from where you
want to resume the model upload process.
2. Execute the CopyUpload utility.
3. Perform Model Upload.

3.1.3.1 Copying the Required Files


Based on your start point, copy the required files from your source environment to the desired location:
1. If the start point is script generation, copy the JSON files from
/ftpshare/<INFODOM>/json/fipjson/ folder on the source.
2. If the start point is script execution, copy the JSON files from the
/ftpshare/<INFODOM>/json/fipjson/ folder and the DB scripts from the
/ftpshare/<INFODOM>/json/scripts and /ftpshare/<INFODOM>/scripts folders.
The following table describes the Start point and the Required files. After copying the required files,
proceed with Executing the CopyUpload Utility

Table 4: Start Point and the Required Files

Start point Required Files

Script generation /ftpshare/<INFODOM>/json/fipjson_-1/*.json

Script execution /ftpshare/<INFODOM>/json/fipjson_-1/*.json
DB Scripts - /ftpshare/<INFODOM>/json/scripts and
/ftpshare/<INFODOM>/scripts folders

3.1.3.2 Executing the CopyUpload Utility


The command line utility CopyUpload is used to prepare the target OFSAA instance to resume the model
upload process from script generation or script execution. This utility is available in the
$FIC_HOME/ficapp/common/FICServer/bin/ folder.
Following are the prerequisites for executing the utility:
• CopyUpload.sh must have Execute permissions.


• Appropriate permissions must be granted on the source folders.


• All the required files must be copied to the target environment.
For details, see Copying the Required Files.
To run the utility from the console:
1. Navigate to $FIC_HOME/ficapp/common/FICServer/bin folder.
2. Execute the utility using the command:
./CopyUpload.sh
3. Enter the following when prompted:
 Enter ftpshare location – the path of the ftpshare location.
 Enter dsnname – the information domain name.
 Enter absolute filepath of fipjson folder - the path of the folder in which the
JSON files are available.
 Continue with scripts transfer? [y,n]– Enter ‘y’ if you want to copy the scripts, else
enter ‘n’.
 Enter absolute path for table folder– the path of the folder in which the table is
available.
 Enter absolute path for alter table– the path of the folder in which the alter table
file is available.
 Enter absolute path for scripts– the path of the folder in which the DB scripts are
available.
4. After the utility is executed successfully, the files are copied to the following locations:
 //ftpshare/archive/<INFODOM>/json/fipjson_-1/*.json
 //ftpshare/archive/<INFODOM>/json/scripts_-1/
 //ftpshare/archive/<INFODOM>/scripts
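Because the utility reads its inputs from prompts, the answers can be supplied with a here-document. The stand-in function below only mimics the prompt sequence listed above so the technique can be shown self-contained; in practice, replace it with the real ./CopyUpload.sh, and note that the sample paths and Infodom name are illustrative:

```shell
# Stand-in that reads answers in the same order as the CopyUpload.sh prompts
# (ftpshare location, dsnname, fipjson folder path, continue-with-scripts flag).
copy_upload_standin() {
  read -r ftpshare_loc
  read -r dsnname
  read -r fipjson_path
  read -r transfer_scripts
  # Echo back what was received (fipjson_path is read but not echoed here).
  echo "ftpshare=$ftpshare_loc infodom=$dsnname scripts=$transfer_scripts"
}

# Feed the answers non-interactively via a here-document.
answers=$(copy_upload_standin <<EOF
/scratch/ftpshare
OFSINFODOM
/scratch/ftpshare/OFSINFODOM/json/fipjson
n
EOF
)
```

When the scripts-transfer answer is `n`, the remaining path prompts are not asked, which is why the here-document stops there.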

3.1.3.3 Triggering the Model Upload


Trigger the model upload process either through command line or through UI.

NOTE The CopyUpload.sh script must have been executed successfully.

To perform Model Upload using Data Model Descriptor:


1. From the Business Model Upload window, select Upload Option as Data Model Descriptor.


Figure 15: Business Model Upload window for Data Model Descriptor

2. Select the Object Registration Mode from the drop-down list as Full Object Registration or
Incremental Object Registration. It is recommended to select incremental only if the changes are
minimal.

NOTE Incremental Object Registration must be opted only if the
object registration on the base environment was incremental.
Full Object Registration can be performed irrespective of the
mode opted in the base environment.

3. Select the Use archived JSON files check box.


4. Select the Use archived scripts check box if the starting point of the Model Upload process is from
the script execution, that is, if you have copied the DB scripts to the target environment. Otherwise,
deselect the check box.
5. In the Additional Options grid, perform the following tasks:
a. Select Yes to directly Update the database schema with Model changes.
 If you select Yes, the generated SQL scripts are executed at Runtime to update the model
changes in the database.
 If you select No, it restricts the system from updating the database with the data model
changes and only the model scripts are created. Later, you must execute the SQL scripts in
the correct sequence in order to make the Infodom Schema to be consistent with the
JSON files.
For more information, see Sequence of Execution of Scripts.
Additionally, when you select No, ensure the following:


 You have a third party tool or ETL tool to manage the schema updates.
 Database consistency and schema updates are maintained manually by the Database
Administrator.

NOTE Only the table scripts are created and they must be updated
manually. If you choose this option for the first time and later
perform an Incremental / Sliced / Complete Model Re-build,
you must manually synchronize the schema with the database
schema.

b. Select Yes for the Generate DDL execution logs option if you want execution audit information
such as the execution start time, end time, and status of each SQL statement run as part of the
Model Upload process. The execution log file is available under the
ftpshare/<INFODOM>/Erwin/executionlogs folder.
c. Select Yes for the Refresh Session Parameters option to use Database session parameters
during the model upload process. For more information, see the Configuring Session
Parameters section.
d. Select Yes to directly update the Alter constraints in NOVALIDATE State. During the
Incremental or Sliced Model Upload, alterations to the constraints consume a lot of time as the
constraints have to be validated.
 If you select Yes, an option to alter the constraints in the NOVALIDATE state is enabled
and it will not check the existing data for the integrity constraint violation. It is useful when
the existing data is clean. So, NOVALIDATE potentially reduces the additional overhead of
the constraint validation and enhances the performance.
 By default, the option selected is No and the Option to alter the constraints is not enabled.
It checks the existing data for the integrity constraint violation.

NOTE Note the following points about the NOVALIDATE option.


• Constraints in the NOVALIDATE state are supported
only in Incremental and Sliced modes.
• The Model Upload process, irrespective of the status of
success or failure, brings the constraints into the
NOVALIDATE state. Hence, ENABLE VALIDATE must
be done as a post-model upload activity. That is,
Rollback does not validate the constraints that are
non-validated during the upload activity.
• The NOVALIDATE option is not relevant for the HDFS
systems.

6. Click Upload Model.
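Since rollback does not validate constraints left in the NOVALIDATE state, the ENABLE VALIDATE statements must be issued as a post-upload activity. A small generator for such statements is sketched below; the table and constraint names are illustrative:

```shell
# Emit the post-upload validation statement for one constraint.
gen_enable_validate() {
  # $1 = table name, $2 = constraint name
  echo "ALTER TABLE $1 ENABLE VALIDATE CONSTRAINT $2;"
}

stmt=$(gen_enable_validate DIM_ACCOUNT FK_DIM_ACCOUNT_CUST)
echo "$stmt"
```

Feeding the real list of altered constraints from your model to this helper yields the SQL to run after the upload completes.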


3.1.3.4 Rollback
Rollback of the Model Upload restores the state to just before the CopyUpload.sh process. The migrated
files are preserved under the ftpshare/<INFODOM>/archive path.
1. Automatic Rollback occurs in the following cases:
a. When your start point is script generation:
 Creation of script failed
 Execution of script failed
b. When your start point is script execution:
 The execution of scripts failed.
2. In case of failure, for troubleshooting, check the following log files:
 $FIC_HOME/ficapp/common/FICServer/bin/nohup.out
 $FIC_HOME/ficapp/common/FICServer/logs/ETLService.log
 $FIC_HOME/ficapp/common/FICServer/logs/SMSService.log
 $FIC_HOME/ficapp/common/FICServer/logs/UMMService.log
 ftpshare/logs/
 ftpshare/executelogs
Contact Oracle Support services for further information.
3. You can trigger the Model Upload again, if required, using the files available in the path:
ftpshare/archive/<INFODOM>. It is not required to execute the CopyUpload utility again.

3.2 OFSAA Data Model Extensions through the SQL Data


Modeler
OFSAA out-of-the-box data models continue to be released as erwin Data Models. However, Oracle SQL
Data Modeler is supported for Data Model extensions.
Proposed Data Model Extensions are:
1. Modifying a column of an existing table. Note that only data length modifications are allowed.
2. Adding one or more columns to an existing table.
3. Adding one or more tables.
A SQL Modeler template is released by Oracle Financial Services Data Foundation Pack (Enh 29467329 -
SUPPORT FOR DATA MODEL EXTENSION USING ORACLE SQL MODELER) that must be used for all
customizations. Refer to the Patch Readme and guidelines for more information on the process.

3.2.1 Customization Process


3.2.1.1 Modification of Columns of Existing Tables
• Column UDP ‘Custom’ must be set as YES for all the columns being customized. (Table UDP
‘Custom’ is not required to be set for out-of-the-box tables.)


• Support is extended for column length change and addition of new columns. Ensure that the
existing column, when represented in SQL Modeler, must be intact with the base model definition
for information such as UDPs, domains, and other logical information. Otherwise, it can create
inconsistencies in the populated information of the OFSAA metadata repository.

NOTE Oracle recommends that you import only the altered columns into the
SQL Modeler. If you import all the columns (altered and unaltered), the
changes from the previous upload will be overwritten.
However, if you choose to import all the columns and avoid overwriting
the existing changes, select the blank value (do not select BYTE or CHAR)
from the Units drop-down list in the Column Properties tab in the SQL
Modeler.

• As model-level UDPs are not supported by SQL Modeler, the Model UDP VERSION is expected to be
added at the table level. Ensure that the version for an existing table undergoing customization is
equal to or higher than that of the previous model; otherwise, the customizations may be ignored. If
the version is missing for any table, the default value is 80000.

3.2.1.2 Addition of New Tables


• Tables are created only when the physical table UDP ‘Custom’ is set to YES.
• Columns of a custom table are considered as Custom. It is not required to mark them explicitly as
Custom with a UDP.
• One or more custom tables having a relationship with each other can be added together.
• If any of the custom tables is establishing a relationship with an existing table from OOB, then
ensure that the parent tables with keys or the entire parent table structure are available in the SQL
Modeler Model. Only the immediate parent is required to be added, not beyond that.
• Table and Column display names must be represented as notes in SQL Modeler (whereas it used to
be a logical name in erwin).
• As model level UDPs are not supported by SQL Modeler, Model UDP - VERSION is expected to be
added at the table level. If it is missing for any table, the default value will be 80000.

3.2.1.2.1 Limitations

• Index tablespace is not supported.


• Logical table UDPs are not supported.

NOTE Customizations are tracked under the table
AAI_DMM_MODEL_EXT_AUDIT_TRAIL.

3.2.1.3 OOB Model after Customization


• All customizations are retained after OOB slice.


• During the upgrade, if the out of the box model comes with a Primary Key (PK) change that is
referenced by a custom table, the custom table is expected to be modified accordingly to hold the
Foreign Key (FK) change prior to the OOB upload.
For instance, if the parent table PK is modified to have an additional column, the following steps
must be performed to achieve the latest changes in the out-of-the-box model.
a. The child table (added as an extension) is expected to be altered to have the additional column
via the SQL Modeler mode of upload.
b. Proceed with the upgrade of the OOB Model upload.

3.2.2 Steps for Creating XML File


1. In the Design Properties window, select General, Model Persistence, and then select Model
Persistence as Model in one file.
2. Save the Model as Relational Model under <DesignName>/rel/<ID> folder with .model.local
extension.
Example: D:\SQLMOD001\rel\F7706246-5EAEB0DCA216\F7706246-
5EAEB0DCA216.model.local
3. Rename .model.local to <Model_name>_RELATIONAL.xml.
Example: MDL_01_RELATIONAL.xml
4. If tablespace information is expected to be brought in during customization, the model upload
process expects input from physical model also. Physical model must be located under <Design
Name>/rel/phys/<ID> folder with .model.local extension.
Example: D:\SQLMOD001\rel\F7706246-5EAEB0DCA216\phys\32076570-
BF29817DFF70\32076570-BF29817DFF70.model.local
5. Rename .model.local to <Model_name>_PHYSICAL.xml.
Example: MDL_01_PHYSICAL.xml
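Steps 2 through 5 amount to renaming the saved .model.local files. A shell sketch follows, with illustrative design and ID folder names in place of the generated identifiers:

```shell
set -e
# Illustrative layout mirroring <DesignName>/rel/<ID> and <DesignName>/rel/<ID>/phys/<ID>.
design=$(mktemp -d)/SQLMOD001
mkdir -p "$design/rel/ID1/phys/ID2"
: > "$design/rel/ID1/ID1.model.local"
: > "$design/rel/ID1/phys/ID2/ID2.model.local"

# Rename to <Model_name>_RELATIONAL.xml and <Model_name>_PHYSICAL.xml.
mv "$design/rel/ID1/ID1.model.local" "$design/rel/ID1/MDL_01_RELATIONAL.xml"
mv "$design/rel/ID1/phys/ID2/ID2.model.local" "$design/rel/ID1/phys/ID2/MDL_01_PHYSICAL.xml"
```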


3.2.3 Triggering Model Upload Process


Figure 16: Business Model Upload window for SQL Modeler

From the Business Model Upload window, perform the following steps:
1. Enter a Name for the model being uploaded.
2. Select Sliced from the Upload Mode drop-down list.
3. Select SQL Modeler as the Upload Options.
This option is displayed only if you select Sliced as Upload Mode.
4. Select the XML file for upload from the File Name drop-down list.
The XML file is the one you created as explained in the Steps for Creating XML File section.
5. Click Upload Model.

NOTE • The Model Upload command line utility does not support SQL
Modeler as of now.
• You can only choose upload type for SQL Modeler Upload as
XML. JSON(ODM) files are not supported.

3.3 Sequence of Scripts Execution


When the Model Upload is performed with the option Update the Database Schema with Model
changes set to No, or <runscriptsFlag> is set to FALSE, you must execute the SQL scripts generated as
part of the OFSAAI Model Upload process in the exact sequence, in order to make the Infodom Schema
consistent with the JSON files persisted in the database.
The sequence is explained in the following table.

Table 5: Sequence of Scripts Execution

Sequence Action Folder name Rollback folder name

1. Drop indexes droppedindex r_droppedindex

2. Drop foreign keys alterdropfkey r_alterdropfkey

3. Drop primary keys droppkey r_droppkey

4. Drop tables dropoldtable r_dropoldtable

5. Create new tables newtables droptable

6. Alter columns altercolumn r_altercolumn

7. Add primary keys addpkey r_addpkey

8. Add foreign keys addfkey r_addfkey

9. Add foreign keys for new tables newfkeys dropfkey

10. Create indexes createdindexes r_createdindexes

NOTE The folders are available at the
ftpshare/<INFODOM>/json/scripts/altertable location.

Rollback scripts must be executed in the reverse order in case of failures. That is, if the 4th step caused
the rollback, then the rollback scripts from step 4 to step 1 must be executed in sequence. Rollback scripts
are available in the same path, with the file name prefixed with r_.
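A driver loop over the script folders in the documented order might look like the sketch below. The sqlplus invocation is commented out because connection details are environment-specific, and INFODOM is a placeholder:

```shell
base="ftpshare/INFODOM/json/scripts/altertable"
order="droppedindex alterdropfkey droppkey dropoldtable newtables \
altercolumn addpkey addfkey newfkeys createdindexes"

executed=""
for folder in $order; do
  executed="$executed $folder"
  # Run every script in the folder against the Infodom schema, e.g.:
  # for f in "$base/$folder"/*.sql; do
  #   sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" @"$f"
  # done
done
executed=${executed# }
echo "$executed"   # the folders, in execution order
```

For a rollback, iterate the r_-prefixed folders from the failing step back to the first, reversing this order.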

3.4 Configuring Session Parameters


Model Upload becomes relatively time-consuming as the data and model size grow. This feature allows
you to set the database session parameters according to an individual database environment, thereby
improving the performance of the Model Upload process.
The configuration file Session_Parameters.conf is available in the $FIC_HOME/conf/dmm folder.
Following are the steps involved:
1. Specify database session level parameter settings in the Session_Parameters.conf file.
2. Set the option to refresh session parameters from the configuration files to TRUE either through
command line or using UI.


3.4.1 Specify Database Session Level Parameters


The Session_Parameters.conf file contains ALTER SESSION statements that must be set when a
connection is established. Any valid Oracle session setting can be specified. It is a single file that contains
parameter specifications for different Infodoms, separated by the INFODOM parameter. The first parameter
in the file is the INFODOM parameter, which identifies the DB parameters for that particular Infodom.
After that, enter the session settings for the second Infodom, which again starts with the INFODOM
parameter.
# The file specifies the database session level parameter settings for better
performance
# of model upload process. The db session will be set with the following
statements mentioned.

# Parameter settings for Infodom 1


INFODOM = <INFODOM_NAME1>
#<alter session statement1;>
#<alter session statement2;>
#For example,
#<alter session set db_cache_size=200G;>
#<ALTER SESSION FORCE PARALLEL DML PARALLEL 49;>

# Parameter settings for Infodom 2


INFODOM = <INFODOM_NAME2>
#<alter session statement1;>
#<alter session statement2;>
#For example,
#<alter session set db_cache_size=200G;>
#<ALTER SESSION FORCE PARALLEL DML PARALLEL 49;>
#End of Parameter settings for Infodom 2
When the database session for Model Upload is initiated, the particular database session is initialized with
the specified settings. The settings are valid till the session ends.


NOTE • The ALTER SESSION statements mentioned in the
Session_Parameters.conf file must adhere to the
privileges of the respective OFSAA users.
• Every ALTER SESSION statement must start on a new
line and need not end with a semicolon (;); the
component takes care of it.
• The syntax of the ALTER SESSION statements is
validated against the syntax tree of Oracle DB to
ensure credibility and to protect from any vulnerability.
If the syntax validation fails, the model upload
operation fails.
• RESUMABLE, SYNC, and CLOSE DB LINK statements
are not supported.

3.5 Partitioning Support


Oracle Partitioning is supported for the Model Upload process using erwin. The supported partition types
are Range Partitions, List Partitions, Hash Partitions, and Interval Partitions.

NOTE • In the Sliced Model Upload mode, partition information
can be added only to new tables; partitioning an
existing table is not supported.
• By default, the date format for partition columns of
DATE type is set as MM/DD/YYYY, and it is seeded in
the DMM_PARTITION_DATEFORMAT parameter in the
Configuration table. If the date format for DATE
partition columns is different in the erwin Model,
update the parameter value appropriately before
performing the Model Upload.
• Partition information is considered based on the
DMM_MODEL_INCLUDE_PARTITION configuration
parameter. The default value is Y. If the value is N, then
partition information is skipped during the Model
Upload.

3.5.1 Registering Partition Information


You can register the partition information during the Model Upload process. Partition information for
tables is retrieved and registered into the OFSAAI object registration table REV_TAB_PARTITIONS during
the Model Upload process.
Partition table name and column names are added to V_TABLE_NAME and V_COLUMN_NAME
respectively in the REV_TAB_PARTITIONS table. Partition Sequence is stored in
N_PARTITION_SEQUENCE. The sequence starts from 1 for the major partition column and the maximum
sequence number is equal to the number of partitioned columns. V_PARTITION_VALUE holds the value
for a particular partition to be considered for any executions. Data into this column can be populated
manually or with the help of any OFSAAI table data load options.
Hive supports static and dynamic partitions. Values for static partitions are known in the query, whereas
dynamic partition values are known only at execution time. If V_PARTITION_VALUE is null in
REV_TAB_PARTITIONS, the table is considered dynamic partitioned. AAI executions run on both static
and dynamic partitions.

3.5.2 Sub Partitioning Support


Sub Partitions of type Range-Hash, List-Hash, and Interval-Hash are supported for the Model Upload
process using erwin.

3.6 Configurations for File Formats for Hive Infodom


Hive file format refers to how records are stored in the file. The supported file formats are Text, Sequence,
RC, Avro, Parquet and ORC. Model Upload component accepts the Input File Format and Output File
Format as inputs at three levels:
1. Configuration table entries.
This is the OFSAA instance-level configuration. This is applicable to all Information Domains in the
instance. Configuration table entries are:
 HIVE_INPUT_FILE_FORMAT – Default value is seeded as
org.apache.hadoop.mapred.TextInputFormat.
 HIVE_OUTPUT_FILE_FORMAT – Default value is seeded as
org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat.
2. Model-level properties (Model UDP)
You can define Model UDPs to hold the input and output file formats. These will be applied to all
tables in the model. UDP names are the same as the configuration parameters
(HIVE_INPUT_FILE_FORMAT and HIVE_OUTPUT_FILE_FORMAT).
3. Table-level properties (Table UDP)
File formats can be applied at an individual table-level by specific table level UDPs. UDP names are
the same as the configuration parameters (HIVE_INPUT_FILE_FORMAT and
HIVE_OUTPUT_FILE_FORMAT).

NOTE • Configuration Table data are overridden by Model
UDPs, which in turn are overridden by Table UDPs.
• Hive file formats are supported only for creating new
tables.
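The three-level precedence (Configuration table entry, overridden by a Model UDP, overridden by a Table UDP) can be sketched as a simple fallback chain. The default below is the seeded value from the text, while the UDP values are illustrative:

```shell
# Resolve the effective input file format for a table.
config_default="org.apache.hadoop.mapred.TextInputFormat"   # instance-wide default
model_udp=""                                                # not set at model level
table_udp="org.apache.hadoop.mapred.SequenceFileInputFormat"

# Table UDP wins over Model UDP, which wins over the Configuration entry.
effective="${table_udp:-${model_udp:-$config_default}}"
echo "$effective"
```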

The supported File Formats are listed in the following table.


Table 6: Supported File Formats

Types Input File Format Output File Format

Text File org.apache.hadoop.mapred.TextInputFormat
org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

Sequence File org.apache.hadoop.mapred.SequenceFileInputFormat
org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat

RC File org.apache.hadoop.hive.ql.io.RCFileInputFormat
org.apache.hadoop.hive.ql.io.RCFileOutputFormat

Avro File org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat
org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat

ORC File org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat

Parquet File parquet.hive.DeprecatedParquetInputFormat
parquet.hive.DeprecatedParquetOutputFormat

3.7 Model Versioning


A model-level UDP known as VERSION is available with every model. Five-digit OFSAA version numbering
is followed for model versions. Each table inherits the model version into the table version as a table-level
UDP. Model Upload registers the version against each entity during the Model Upload process.
Sliced Model Upload checks the model version to decide whether columns should be dropped. When the
SLICE and BASE models have common tables, and the BASE entity version is higher than the SLICE, the
entity in the BASE is retained unchanged. If the SLICE entity version is higher than or equal to the BASE
version, the entity in the SLICE replaces the BASE. After the entity is brought into the BASE model, its
version is stamped against it. Any models or tables prior to OFSAAI version 80100 are defaulted to version
80000.
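The Sliced-upload decision rule described above can be sketched as a version comparison. The version numbers are illustrative, and a missing version defaults to 80000 as stated:

```shell
# Decide which entity survives when a table exists in both BASE and SLICE.
resolve_entity() {
  base_ver=$1
  slice_ver=${2:-80000}   # missing SLICE version defaults to 80000
  if [ "$slice_ver" -ge "$base_ver" ]; then
    echo "SLICE"          # SLICE replaces BASE
  else
    echo "BASE"           # BASE retained unchanged
  fi
}
```

For example, `resolve_entity 80100 80200` prints SLICE, while `resolve_entity 80200 80100` prints BASE.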

3.8 Viewing Log Details


Log details of all the Model Uploads done to date in the current Information Domain can be viewed in
the Model Upload Summary window. Click View Log in the Status column corresponding to the
required Model to view the Model Upload details of the selected Model in the View Log Details (Log
Information) window. The View Log Details window also displays other details such as Task ID, Sequence
of upload, Severity, Message Description, Message Date, and Message Time.
You can also access the View Log window through the LHS menu (Operations > View Log) to find the log
details of all the Model Uploads done to date. You can use the Search option to find the required Model
Upload details by selecting Model Upload as the Component Type from the drop-down list.


3.9 Log File Download


In the Model Upload Summary window, you can download the log file of the listed Model Uploads by
clicking on the log file name in the Log File column corresponding to the required Model.
In the File Download dialog, you can either open the file directly or save a copy for future reference. The
Log file contains the following information:
• Log File Name
• Model Upload Started At
• Source erwin XML File
• Model Upload Mode
• Using erwin.xsl File at
• Target XML File
• Information Domain
• Current Version Is
• Model Upload Completed at
• Object Registration Started as part of Model Upload at
• Object Registration Completed at


4 Data Management Framework


Data Management framework within the Infrastructure system is a comprehensive data integration
platform that facilitates all the data integration requirements from high-volume and high-performance
batch loads to event-driven integration processes and SOA-enabled data services.
Data Management Framework consists of the following sections:
• Data Management Tools
• Data Quality Framework

4.1 Data Management Tools


Data Management Tools is a software application based on the ETL (extract, transform, load) structure, used for data transformation and merging. The E-LT (extract-load, transform) structure in Data Management Tools eliminates the need for a separate ETL server, and the analytical rules facilitate optimized performance, efficiency, and scalability.

Figure 17: Illustration of Data Management Tools

The Data Management Tools module is equipped with a set of automated tools and tested data integration methodologies that allow you to position the advanced N-tier web-based architecture and integrate enterprise data sources from the mainframe to the desktop.
In Data Management Tools, you can standardize and integrate various source system data into a single standard format for data analysis. You can also populate the warehouse in a defined period using the ETL process for data extraction, transformation, and loading.
The following are the prerequisites for working with Data Management Tools:
• You can transform data using the options Before Load, While Load, or After Load.
• For source system information, source files can be either fixed-length or delimited.
• The source types that can be loaded into the system are RDBMS and Flat Files. For an RDBMS source type, ensure that the appropriate drivers are installed.
• Ensure that you are aware of the process flow before you start with the extraction, transformation, and loading process.
As part of the 8.0.6.0.0 release, the Data Management Tools User Interface was reorganized and the OJET/ALTA theme adopted for better usability. All metadata in DMT is now persisted in the Database instead of XML files.


NOTE For migrating DMT metadata in previous versions to 8.0.6.0.0


version and above, see DMT Metadata Migration Guide.

4.2 Components of Data Management Tools


Data Management Tools consists of the following sections. Click the following links to view the sections in
detail:
• Data Sources
• Data Mapping
• Post Load Changes
• User Defined Functions
• DMT Configurations

4.3 Data Sources


Data Sources within the Data Management Tools of the Infrastructure system facilitates defining Data Sources and generating data models of the source systems. Source model generation happens while defining a Data Source itself.
The Data Source type is classified as:
• File based
 HDFS
 Flat File (Local to OFSAA or on a Remote Machine)
 WebLog
• Table based
 HDFS (HIVE)
 RDBMS (Oracle, MSSQL, DB2)

NOTE HDFS and WebLog based options are displayed only if the Big
Data Processing license is enabled.

DMT metadata is stored in Database tables, instead of the earlier approach of storing it in XML, and it is Infodom-specific.
Since the source model generation is done for Flat file based Data Sources while defining a Data Source,
there is no separate Data File Mapping window for creating mapping definition. In other words, F2T and
F2H can be defined from the Data Mapping window itself.
If the Data Source is an OFSAA Infodom and model upload has already been done for the Infodom, there
is no need to create another Data Source pointing to this Infodom. The Infodom can directly be used in the


Data Mapping definition as a source. In addition, Dataset filters can be applied to this Infodom to get a
further subset of Entities.
The roles mapped to Data Sources are as follows:
• SRCACCESS
• SRCREAD
• SRCWRITE
• SRCPHANTOM
• SRCAUTH
• SRCADV

For all the roles, functions, and descriptions, see Appendix A.

Figure 18: Data Sources window

The Data Sources Summary window displays the list of pre-defined Data Sources with details such as
Code, Name, Source Type, Upload Type, Created By, Creation Date, Version, and Active. You can add,
view, modify, copy, authorize, delete, or purge Data Source definitions. You can make any version of a
Data Source definition as the latest. For more information, see Versioning and Make Latest Feature.
To sort the fields, mouse over the end of the column heading and click the sort icons to sort in ascending or descending order.
You can search for a Data Source based on Code, Name, Source Type, and Record Status (Active, Inactive, or Deleted). In the Search and Filter pane, enter the details of the Data Source you want to search in the respective fields, and then click the search icon.


4.3.1 Creating a Data Source


Data Source refers to the physical structure or location of the source system. The Data Source can be a
file, a table, or WebLogs.
• In case of File, it can be a flat file that can be local to OFSAA or remote to OFSAA, or a file on HDFS.
• In case of a table, it can be an RDBMS table or HDFS table.
• In case of WebLogs, it can be in a local file system or in an HDFS cluster. If it is in the HDFS cluster,
you must register a cluster with the required information from the DMT Configurations>Register
Cluster window.
For tables, the connection and authentication details are defined in the System Configuration > Database
Details section. Proper connection pooling must be done if you have to create an external Data Source on
a database without an Information Domain created on it. Applications access the data source using an FTP
connection.

NOTE 1. Source creation implicitly does a source model generation.


2. Defining the structure of a Flat File is mandatory during
the creation of Flat File based sources.
3. Data Sources cannot be defined on Configuration Schema.
By default, OFSAA generates Data Sources on
Configuration Schema and they can only be viewed; you
cannot edit them.

To create a Data Source follow these steps:


1. From the Data Sources window, click Add.
The Data Source window is displayed.

Figure 19: Data Source window

The ID will be automatically generated once you create a data source. The Folder field is not
enabled.
2. Enter a distinct Code to identify the Data Source. Ensure that the code is alphanumeric with a
maximum of 50 characters in length and there are no special characters except underscore “_”.


3. Enter the Name of the Data Source.


4. Enter a Description for the Data Source.

4.3.1.1 Creating a Data Source Based on Local File System


This feature allows you to extract unstructured data from a Flat File for loading into a table based on certain criteria. Ensure that ASCII file types are not loaded into the staging area using FTP, since this can corrupt the file and cause load failure. The flat file can be local or remote to OFSAA.
To create a data source based on LFS:
1. Select the Source Type as File.
2. Select the Based on as LFS.
3. Enter details as tabulated:
The following table provides the details of the Field and its description.

Table 7: Fields in the Data Sources window and their Descriptions

Fields marked with a red asterisk (*) are mandatory.

Type: Select Local or Remote from the drop-down list.

If Type is selected as Local:
  Source Date Format: Specify the Source Date Format to be used as the default date format for source data extraction and mapping.

If Type is selected as Remote:
  Server Name: Enter the Server Name or IP address where the Data Source exists.
  Server Port: Enter the active server port number that contains the flat files.
  User ID: Enter the FTP User ID required to connect to the server.
  Password: Enter the FTP user password required to connect to the server.
  FTP Share: Enter the ASCII files location for loading if it is located in a staging area other than the default staging area of the Infrastructure Database Server.
  FTP Drive: Enter the FTP server path. In case of Unix servers, the home directory path is taken as the default.
  Source Date Format: Enter the Source Date Format that will be used as the default date format for source data extraction and mapping. The date format you enter is validated against the supported date formats of the database to which the Config Schema points.

4. Select the required File Type. The options are:


 Delimited - Select Delimited if the data is separated by a delimiter.
 Enter the delimiter in the Field Delimiter field. This is a mandatory field.
 Fixed - Select Fixed if it is Fixed Width or Fixed Position File (it refers to a Flat File in which the
data is defined by the character position (tab space)).


5. From the Generate Model pane, click Select if the File Type is Delimited or Fixed. This allows you
to select the table whose structure is similar to the structure of your source. Using this option, you
can generate a model based on the selected table. The Source Entities window is displayed.

Figure 20: Source Entities window

a. Select the Infodom from the drop-down list.


b. Select the Table from Available Values pane.

 Select the required Entity and click the move button to move it to the Selected Values pane.

 Click the move-all button to select all entities.

 Select an entity and click the remove button to de-select it.

 Click the remove-all button to de-select all entities.

 To search for an entity, enter its name in the text field and click the search button. Click the reset button to reset the search field.
c. Click OK. All the columns in the selected Entity will be displayed in the Generate Model pane.
The available columns are Source Table, Table Logical Name, Source Column, Column Logical
Name, Data Type, Field Order, Start Position, Length, and Logical Data Type.

Figure 21: Generate Model pane


You can perform the following actions:


 Click the add button to add a new row to specify a new column.

 Select a row and click the delete button to delete it.

 Double-click the Field Order number and update it if you want to change the order in which the columns appear in the source file. Click Reorder to sort and reorder the Field Order numbers to fill any missing numbers.
 Mouse over the end of the column heading and click the sort icons to sort the fields in ascending or descending order.
6. From the Generate Model pane, click Properties to specify the source properties.
For more information, see Specifying Source Properties.
7. Click Save. The Data Source definition will be saved as version 1.
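To illustrate the difference between the two File Type options above, the sketch below parses the same hypothetical record once as a delimited file and once as a fixed-width file. The field names, widths, and sample values are made up for illustration only.

```python
# Illustrative only: a delimited record is split on a delimiter character,
# while a fixed-width record's fields are defined by character position.
def parse_delimited(record, delimiter=","):
    return record.split(delimiter)

def parse_fixed(record, widths):
    fields, pos = [], 0
    for width in widths:          # each field occupies a fixed character span
        fields.append(record[pos:pos + width].strip())
        pos += width
    return fields

assert parse_delimited("1001,John,5000") == ["1001", "John", "5000"]
assert parse_fixed("1001John  5000", [4, 6, 4]) == ["1001", "John", "5000"]
```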

4.3.1.2 Creating a Data Source for WebLogs


In the case of WebLogs, it can be in a local file system (LFS) or in an HDFS cluster. If it is in the HDFS
cluster, you must register a cluster with the required information from the DMT Configurations>Register
Cluster window.
To create a data source based on WebLogs:
1. Select the Source Type as File.
2. Select the Based on as LFS if the WebLogs are present in the local file system or as HDFS if
WebLogs are present in the HDFS cluster.
3. If Based on is selected as LFS, enter details as tabulated:
The following table provides the details of the Field and its description:

Table 8: Fields in the Data Source for WebLogs and their Descriptions

Fields marked with a red asterisk (*) are mandatory.

Type: Select Local or Remote from the drop-down list.

If Type is selected as Local:
  Source Date Format: Specify the Source Date Format to be used as the default date format for source data extraction and mapping.

If Type is selected as Remote:
  Server Name: Enter the Server Name or IP address where the Data Source exists.
  Server Port: Enter the active server port number that contains the flat files.
  User ID: Enter the FTP User ID required to connect to the server.
  Password: Enter the FTP user password required to connect to the server.
  FTP Share: Enter the ASCII files location for loading if it is located in a staging area other than the default staging area of the Infrastructure Database Server.
  FTP Drive: Enter the FTP server path. In case of Unix servers, the home directory path is taken as the default.
  Source Date Format: Enter the Source Date Format that will be used as the default date format for source data extraction and mapping. The date format you enter is validated against the supported date formats of the database to which the Config Schema points.

4. If Based on is selected as HDFS, enter the details:


a. Select the HDFS cluster in which the file/folder is present, from the Cluster drop-down list. This
list displays the clusters that are registered from Register Cluster tab in the DMT Configurations
window.
For more information, see Cluster Registration section.
b. Enter the folder path present within the HDFS System in the HDFS File Path field. All files
present inside this folder will be loaded.
c. The Source Date Format field is not editable. The supported source date format is YYYY-MM-DD.
5. Select the File Type as Regex.
6. Select the File Format from the drop-down list. The options are Text File, Sequence File, Parquet,
RC File, Avro, and Input Format.
7. From the Generate Model pane, click Derive. The Source Model Generation window is displayed. See
Source Model Generation for WebLog for detailed information.

NOTE Source model generation of HDFS files on Derive mode is not


supported. The workaround is to derive the model on local files
and point the source to the HDFS before saving the Data
Source definition.

4.3.1.2.1 Source Model Generation for WebLog

Source Model Generation (SMG) for WebLog files is done by reverse-generating the Data Model from WebLog files. That is, you choose a sample file from the source base folder, and the SMG process tries to fit the data file to a known log type or to a custom log model. It validates the data model against a few


records from the file and presents the results to you. If you find the model satisfactory, you can save the model. Otherwise, you can edit the model and re-validate it.
When source is saved from the UI, SMG logs will be available in the <web local
path>/<infodom>/dmt/source/<source code>/log folder. When source is saved from utilities (any
non j2ee container), logs will be written to <app ftpshare>/<infodom>/dmt/source/<source
code>/log folder.
To generate Source Model for WebLog:
1. From the Generate Model pane in the Data Sources window, click Derive.
The Source Model Generation window is displayed.

Figure 22: Source Model Generation window

All the files/folders from the base folder of the WebLog source are listed in the File Browser pane.
You can search for a particular file by entering the filename in the Search field. All special characters are supported in the search except +, \, #, ~, %, &, *, ?, (, ), [, ], \\ and ,. The selected file will be used to generate the Data Model for the whole WebLog source.
2. Select the file from the File Browser pane.
The File Format field displays the selected File format from the Generate pane.
3. Enter the number of records (n) to be fetched from the selected file for the preview. By default, 5 is
displayed. These records will be finally used to validate the Data Model.
4. Click Preview.


Figure 23: Preview pane

You can view the “n” number of records displayed in the Preview pane.
5. Select a record from the Sample Data based on which you want to generate a Data Model. By
default, the last record is selected.
6. Select the appropriate Logger Type from the drop-down list. The available options are:
 APACHE - Sample - Select this if you know the log format of your data is in Apache log format.
 MICROSOFT-IIS - Sample - Select this if you know the log format of your data is in Microsoft
log format.
 Custom- Select this option if you are not sure about the log format. It will intelligently try to fit
data to a standard log format or generate a custom log model. Select the Delimited checkbox if
the data is separated by a delimiter and enter it in the Field Delimiter field.

NOTE Standard logger types and their details are seeded in the
AAI_DMT_WEBLOG_TYPES table. By default, details for Apache
and Microsoft-IIS logs are pre-populated. You can add other
logger methods to the table to make them visible in the UI. For
more information, see the Logger Type Seeded Table section in
OFSAAI Administration Guide.

7. Click Generate Data Model. If the model generation is successful, you can view the Data Model
Preview pane. Model is generated based on the selected record in the Preview pane.

Figure 24: Data Model Preview window


 If you have selected standard Logger Type, standard column names are displayed. If Custom is
selected, column names are set as fld_0, fld_1, fld_2, and so on.
 The supported Data Types are String and Int.
 If Custom is selected as Logger Type and the Delimited checkbox is selected, the Regex field
will be non-editable and the Input Regex field will not be displayed.
 The Data Model is based on the generated Input Regex value. For the standard logger types,
this value is hard-coded. The regex is fuzzy-logically computed in the case of Custom Logger
Type.
 For more information on tweaking the Data Model, see the Model Customization section.
8. Click Validate to validate the “n” number of records against the model.

Figure 25: Data Validation window

If there are any records that do not conform to the model, an alert with the number of invalid
records is displayed. You can scroll the grid to check the erroneous data marked in red or optionally
click the Invalid Data button in the Data Validation grid.
In case of invalid records, you can tweak the Input Regex (Regular Expression) and re-validate the
model. For more details, see the Model Customization section.
9. Click Save when you are satisfied with the model.
Even if there are erroneous records, you can still save the model. In that case, during the final load, those records will result in erroneous data being loaded into the final table. You can then separately apply data correction rules to weed out those records.
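The validation step can be pictured with a minimal sketch: records that do not match the generated Input Regex are counted as invalid. The regex and sample records below are made-up assumptions, not output of the product.

```python
import re

# Minimal sketch of validating "n" preview records against an Input Regex.
input_regex = r"(\S+) (\d+)"   # hypothetical generated Input Regex

def count_invalid(records, pattern):
    """Return how many records fail to conform to the model's regex."""
    return sum(1 for r in records if not re.fullmatch(pattern, r))

records = ["GET 200", "POST 404", "malformed-line"]
assert count_invalid(records, input_regex) == 1   # one record fails the model
```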

4.3.1.2.2 Model Customizations

Clubbing Columns
Consider a scenario in which you want to club columns appearing in the Data Model Preview pane. You can do this by deleting one of the columns and then updating the column name and the Input Regex of the retained column appropriately.
Suppose you want to combine Status and Size columns, as shown in the following figure.

Figure 26: Size and Columns

• Rename the Status column to “Status + Size”.


• Change the Regex of the renamed column by combining the value within brackets(). For example, in
this case the Regex should be ([0-9]* [0-9]*).


• Click the delete icon corresponding to the Size column.

• Click the refresh icon to refresh or reset the Input Regex based on the modifications you made.
• Click Validate to generate the model again.
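The clubbing example above can be checked with a quick regex experiment: two capture groups yield two columns, while the combined group ([0-9]* [0-9]*) yields one clubbed "Status + Size" column. The sample log fragment below is hypothetical.

```python
import re

# Sketch of clubbing two columns into one by merging their capture groups.
line = "200 4523"   # made-up Status and Size values from a log record

separate = re.match(r"([0-9]*) ([0-9]*)", line)
combined = re.match(r"([0-9]* [0-9]*)", line)

assert separate.groups() == ("200", "4523")   # two columns: Status, Size
assert combined.group(1) == "200 4523"        # one clubbed column: Status + Size
```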
Adding New Columns
Consider a scenario where you want to split a single column appearing in the Data Model Preview pane into multiple columns. This can be done by clicking Add and tweaking the Input Regex appropriately.
For example, you may want to split the Time column into Date and Time columns, as shown in the following figure.

Figure 27: Time column

• Click Add to add a new column. A new record is added at the end.
• Enter the Regex appropriately for both columns.
• If you want to add a column in between, change the Input Regex field appropriately. That is, the Regex of the newly added column should be placed after the Regex of the column after which you want to insert it. Although the change is not reflected in the Data Model Preview pane, it is displayed properly in the Data Validation pane.
URI and Referer Parsing
URI and Referer fields are considered complex attributes since apart from the hierarchical part
(scheme://example.com:123/path/data), there is a query part to it (?key1=value1&key2=value2). The query
part by convention is mostly a sequence of attribute-value pairs. SMG process identifies these keys as
potential attributes of interest and therefore, an option to keep them in the Data Model is provided.
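Since the query part is by convention a sequence of attribute-value pairs, it decomposes cleanly, as the sketch below shows using Python's standard library; the URL is the made-up example from the text.

```python
from urllib.parse import urlparse, parse_qsl

# Sketch: splitting a URI into its hierarchical part and its query
# attribute-value pairs, as the SMG process does for URI/Referer fields.
uri = "http://example.com:123/path/data?key1=value1&key2=value2"
parsed = urlparse(uri)

assert parsed.path == "/path/data"   # hierarchical part
assert dict(parse_qsl(parsed.query)) == {"key1": "value1", "key2": "value2"}
```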

In both Standard and Custom logger methods, the URI and Referer fields show an icon only if the selected record’s URI or Referer field has a query part. You can choose a different record with a query part instead.

Figure 28: URI or Referer Field

• Click the icon in the URI or Referer field.
The Attribute Browser window is displayed.

Figure 29: Attributes Browser window


• Enter the number of records you want to look up beyond the previously selected n records for attributes and click the search icon.
The Available Attributes column will get refreshed.
• Select the required attributes that you want to add as columns in your Data Model and click OK.
• Click Add to add an attribute that is not part of the data file.
• Click Save.

NOTE The selected attributes might become a sparse column after


the data load. Also, these attributes will not be available
separately in the data validation grid.


4.3.1.3 Creating a Data Source Based on Table


This feature allows you to create a Data Source from an RDBMS table or a Hive table. The source model generation for RDBMS and HIVE is done using the following options:
• erwin - This mode works the same way for an RDBMS or HIVE table. The erwin.xml file is read and an XSLT converts it into the SOURCE_DATABASE.xml file.
• Catalog - In this option, the database catalog (HIVE metastore or RDBMS) is directly queried to get the list of tables and columns. This metadata is then saved into the SOURCE_DATABASE.xml file. This component captures the logical names of the tables and columns in addition to the physical names. This option can be used for both RDBMS and HIVE.
To create a data source based on a table:
1. Select the Source Type as Table.
2. Select the required database from the Database Name drop-down list. If RDBMS is selected, the
drop-down list displays the available RDBMS tables. If HDFS is selected, it displays the available
HDFS table based sources (HIVE).
3. In case of an Oracle database, enter the schema name in the Table Owner field.
4. Source Date Format is displayed as mm-dd-yyyy. You cannot modify it.
5. From the Generate Model pane, select the Upload Type as erwin or Catalog.
By default, Catalog is selected.
a. If Catalog is selected:

Figure 30: Generate Model pane

Specify the Filter criteria by entering details in the Starts with, Contains, and Ends with fields.
Filters are patterns for entity names in the Database and can restrict the source model
generation to a specific set of entities. The Source Model is generated even if one of the
specified filter conditions matches. You can also specify multiple conditions for a single filter
type using comma-separated values. For example, tables starting with TB and TM can be
specified as “TB, TM”.
b. If erwin is selected:


Figure 31: General Model pane

Select the required erwin File from the drop-down list. The files that are placed inside the
ftpshare/<Infodom name>/dmt/erwin folder are displayed in the drop-down list.
Or
Click Attach and select the erwin file from your local system. Click Upload. You can see the
progress of the file upload in percentage. After being uploaded, select that file from the drop-
down list.
6. Click Save. The Data Source definition will be saved as version 1.
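The filter semantics of the Catalog option above (a table matches if any of the specified conditions match, and each filter type accepts comma-separated values such as "TB, TM") can be sketched as follows. The function and table names are hypothetical illustrations, not product code.

```python
# Sketch of Catalog filter matching: any condition matching is enough,
# and each filter accepts comma-separated values, e.g. "TB, TM".
def matches(table, starts_with="", contains="", ends_with=""):
    split = lambda s: [v.strip() for v in s.split(",") if v.strip()]
    return (
        any(table.startswith(p) for p in split(starts_with))
        or any(c in table for c in split(contains))
        or any(table.endswith(s) for s in split(ends_with))
    )

assert matches("TB_ACCOUNT", starts_with="TB, TM")     # matches "TB"
assert matches("TM_RATES", starts_with="TB, TM")       # matches "TM"
assert not matches("FCT_LEDGER", starts_with="TB, TM") # excluded
```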

4.3.1.4 Creating a Data Source Based on HDFS File


This option is used if the file is present on HDFS cluster.
To create a data source based on HDFS File:

Figure 32: Source Details pane

1. Select Source Type as File.


2. Select Based on as HDFS.
3. Select the HDFS cluster in which the file/folder is present, from the Cluster drop-down list. This list
displays the clusters that are registered from the Register Cluster tab in the DMT Configurations
window. For more information, see Cluster Registration.
4. Enter the folder path present within the HDFS System in the HDFS File Path field. All files present
inside this folder will be loaded.


5. The Source Date Format field is not editable. The supported source date format is YYYY-MM-DD.

4.3.1.5 Specifying Source Properties


1. From the Generate Model pane in the Data Sources window, click Properties.
The Properties window is displayed.

Figure 33: Properties window

You can click the information button pertaining to a field to view related information in a pop-up dialog.
2. Enter the details as tabulated:

Table 9: Fields in the Properties window Field and their Descriptions

Field Description

File Sort
This section is applicable for File Type selected as Delimited or Fixed.

Sort Basis Select the basis on which the data file should be sorted, from the drop-
down list. The options are:
• Entire Record- By default, this option is selected.
• Primary Key- Select this option if the destination table has
primary keys.
• List of Fields- Select this option if you want to sort based on
some particular field.

Sort Order Select whether you want to sort the data file based on Binary or
Linguistic, from the drop-down list.

Sort File Select whether you want to sort it in Ascending or Descending order,
from the drop-down list.

Sort Fields This field is applicable only if you have selected Sort Basis as List of
Fields.
Specify the field based on which you want to sort the data file.

Miscellaneous

Record Delimiter Specify the record separator used in the data file.
By default, \n is selected as record delimiter. Modify if required.
Note: This is the only field applicable in case of WebLogs.

File Date Format Select the Regional Settings from the drop-down list if the Data File is
created with the date format of the Regional Settings of the Database
server.
By default, Database Settings is selected.

Data File Locale Select EN_US.UTF-8 from the drop-down list.

Oracle
This section is applicable only if File Type is selected as Delimited.

Optionally Enclosed By Specify any optional Field Identifier used in the Data File, apart from
the Field Delimiter. It can be Fields enclosed by "Field".

Rules
This section is applicable only if File Type is selected as Delimited or Fixed.

Check Rules Select Header, Trailer, Header and Trailer or No from the drop-down
list depending on where the Validity rules are specified in the Data File.
If you select No, all other fields will be disabled.

Header Identifier This field is enabled only if you select Header or Header and Trailer
options for Check Rules.
Specify the first character or string that identifies the header record.

Data File Name Select Yes if the name of the Data File is part of the Header/Trailer.

Information Date Select Yes if Information Date (MIS Date) in the Data File is provided
as part of Header/Trailer.

Number of Records Select Yes if the number of records in the Data File is provided as part
of the Header/Trailer.


Check Sum Select Yes if Check Sum value in the Data File is provided as part of
Header/Trailer.
NOTE:
For checksum to be computed in F2T, it is mandatory that there must
be a column mapping to identify the current load. The supported
mappings are as follows:
1. Constant mapped to #MISDATE
2. Constant mapped to #FILENAME

Basis of Check Sum Specify the Source Column name on which the Check Sum is
computed. Ensure that the source column is a numeric column.

Trailer Identifier This field is enabled only if you select Trailer or Header and Trailer
options for Check Rules.
Specify the first Character or String that identifies the Trailer Record.

Header Field Order This field is enabled only if you select Header or Header and Trailer
options for Check Rules.
Specify the header field order as comma separated values: 1-Header
Identifier,2-Data File Name, 3-Information Date, 4-Number of records,
5-Value of Checksum, 6-Basis of Checksum.
For example, if you specify 1, 3, 2, 4, 5, 6; the header fields will be
Header Identifier, Information Date, Data File Name, Number of
records, Value of Checksum, Basis of Checksum.

Trailer Field Order This field is enabled only if you select the Trailer or Header and Trailer
options for Check Rules.
Specify the Trailer field order as comma-separated values: 1-Trailer Identifier, 2-Data File Name, 3-Information Date, 4-Number of Records, 5-Value of Checksum, 6-Basis of Checksum.

3. Click OK.
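The Header Field Order and Trailer Field Order properties above can be read as a permutation of the six standard fields. The sketch below interprets such an order string; the field list comes from the property descriptions, while the function name is a hypothetical illustration.

```python
# Sketch: interpreting a comma-separated Header/Trailer Field Order string
# as a reordering of the six standard fields described in Table 9.
FIELDS = ["Header Identifier", "Data File Name", "Information Date",
          "Number of Records", "Value of Checksum", "Basis of Checksum"]

def ordered_fields(order_csv):
    """Map "1,3,2,4,5,6" to the field names in that order (1-based)."""
    return [FIELDS[int(i) - 1] for i in order_csv.split(",")]

# Per the example in the table: order 1,3,2,4,5,6 swaps fields 2 and 3.
assert ordered_fields("1,3,2,4,5,6")[:3] == [
    "Header Identifier", "Information Date", "Data File Name"
]
```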

4.3.2 Versioning and Make Latest Feature


When a new definition is created, it is saved as version 1. After you modify and save a definition, it is saved with the version as the highest existing version +1. That is, if you modify version 2, which is the highest version available, and save it, the version becomes 3.
To make any older version as latest:

1. From the Data Sources window, turn OFF the Active toggle button and click Search.
All inactive definitions are displayed.

2. Select the required definition and click Make Latest.


The selected definition becomes active and the current active definition becomes inactive.

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 77


DATA MANAGEMENT FRAMEWORK
DATA SOURCES

4.3.3 Modifying a Data Source


This option allows you to modify a Data Source. You cannot modify inactive versions of a Data Source
definition. To activate an inactive version, make that version the latest.
To modify a Data Source:

1. From the Data Sources window, select the data source that you want to edit and click Edit. The
Data Source window is displayed.
2. Modify the required details. You cannot modify Code and Name.
For more information, see Creating a Data Source section.
3. Click Save.
The definition will be saved as the highest version +1. That is, if you are modifying version 3 of a
definition and the highest version available is 5, the definition is saved as version 6.

4.3.4 Viewing a Data Source


You can view individual Data Source definition details at any given point.
To view an existing Data Source definition:

1. From the Data Sources window, select the data source that you want to view and click View. The
Data Source window is displayed.
The Data Source window displays the details of the selected Data Source definition. The Audit Panel
section displays the creation and modification information of the Data Source definition. The
Comments section displays additional information or notes added for the definition if any.

4.3.5 Copying a Data Source


This feature lets you quickly create a new Data Source definition based on an existing one by
updating the required fields.
To copy a Data Source definition:

1. From the Data Sources window, select the data source that you want to copy and click Copy. The
Data Source window is displayed.
2. Enter Code and Name for the definition. Modify the required fields.
For more information, see Creating a Data Source section.

4.3.6 Deleting Data Sources


This option allows you to delete data sources. Note that this is only a soft deletion. To permanently delete a
data source from the system, you need to purge it.
To delete Data Sources:

1. From the Data Sources window, select the data source that you want to delete and click Delete.
You can select multiple Data Sources for deletion. A confirmation message is displayed.
2. Click Yes to confirm the deletion or No to cancel the deletion.


4.3.7 Purging Data Sources


This option allows you to remove deleted Data Sources permanently from the system. The DMTADMIN
user role must be mapped to your user group.
To purge Data Sources:
1. Search for the Deleted records by selecting Deleted from the Record Status drop-down list and
click Search.
2. Select the required Data Source definitions you want to permanently remove from the system and
click Purge.
3. Click OK to confirm purging.

4.4 Data Mapping


Data Mapping refers to the process of retrieving unstructured data from data sources for further data
processing, storage, or migration. The intermediate extraction process can be followed by data
transformation and metadata addition before exporting it to the staging area or to the Business Data
Model.
Data movement can be from:
• RDBMS source to RDBMS target (T2T)
• RDBMS source to Flat File target (T2F)
• RDBMS source to HDFS-Hive target (T2H)
• HDFS-Hive source to RDBMS target (H2T)
• HDFS-Hive source to HDFS target (H2H)
• HDFS/Local-WebLog Source to HDFS Target (L2H)
• HDFS-Hive source to Flat File target (H2F)
• Flat File to RDBMS target (F2T)
• Flat File present in Local File System (LFS) to HDFS target or HDFS file to HDFS target (F2H)

NOTE A file present in the HDFS system cannot be loaded into an RDBMS
target Infodom.
F2T and F2H can be defined from the Data Mapping window.
There is no separate Data File Mapping window.

Data movement between Hive and RDBMS can be enhanced using third-party tools like SQOOP and
Oracle Loader for Hadoop (OLH). You must set parameters from the DMT Configurations window. For
details, see the DMT Configurations section. For details on the configurations for SQOOP and OLH, see
OFSAAI Administration Guide available in the OHC Documentation Library.
For the configurations required to support WebLog ingestion (L2H), see the Data Movement of WebLog
Source to HDFS target section in the OFSAAI Administration Guide available in the OHC Documentation
Library.


The roles mapped to Data Mapping are as follows:


• DMACCESS
• DMREAD
• DMWRITE
• DMPHANTOM
• DMAUTH
• DMADV

For all the roles, functions and descriptions, see Appendix A.

Figure 34: Data Mappings window

The Data Mappings window displays the list of pre-defined Data Mapping definitions with Record Status
as Executable with details such as Code, Name, Source, Type, Created By, Creation Date, Version, and
Active. You can add, view, modify, delete, or purge Data Mapping definitions. You can make any version of
a Data Mapping definition as the latest. For more information, see Versioning and Make Latest Feature of
Data Mapping.
To sort the fields, mouse over the end of the column heading and click the sort icon to sort in
ascending or descending order.
You can search for a Data Mapping definition based on Code, Name, Type (F2T, T2F, and T2T), Source,
and Record status. The options for Record Status are Executable, Active, Inactive, and Deleted.


• Executable- Displays all active versions of Data Mapping definitions and inactive versions of the
same Data Mapping definitions with distinct sources.
• Active- Displays only the active version of all Data Mapping definitions.
• Inactive- Displays all the inactive versions of Data Mapping definitions.
• Deleted- Displays all the deleted Data Mapping definitions.

4.4.1 Creating Data Mapping Definition


This option facilitates you to extract data from data sources and load them to a table. The data source and
target can be an RDBMS table, HDFS-HIVE table, or Flat File. It can also be a WebLog source and HDFS-Hive
target. You can load data incrementally from any data source to a table based on certain criteria.

NOTE If DB2 is selected as the source database, map data from Table
to File (T2F) and then File to Table (F2T).
Processing on Datatypes TIMESTAMP WITH TIME ZONE and
TIMESTAMP WITH LOCAL TIME ZONE is not supported, even
though source model generation is supported for those
datatypes.

Defining Data Mapping involves the following steps:


• Specifying Data Mapping Details
• Selecting Model
• Defining Data Mapping to Table or File
• Defining Mapping Properties
• Associating DQ rules to the Data Mapping Definition

4.4.1.1 Specifying Data Mapping Definition Details


1. From the Data Mappings window, click Add. The Data Mapping window is displayed.

Figure 35: Data Mapping window

The ID will be automatically generated after you create a data mapping definition. The Folder field
is not enabled.


2. Enter a distinct Code to identify the Data Mapping definition. Ensure that the code is alphanumeric
with a maximum of 50 characters in length and there are no special characters except underscore
“_”.
3. Enter the Name of the Data Mapping definition.
4. Enter a Description for the Data Mapping definition.

4.4.1.2 Selecting Model


Figure 36: Select Model pane

1. Select the Source as External Source or Infodom. By default, Infodom is selected.


2. If External Source is selected as Source, select the Data Source from the External drop-down list.
All the Data Sources you have defined in the current Infodom will be displayed in the drop-down
list.
3. If Infodom is selected as Source:
• Select the Information Domain from the Infodom drop-down list.
• Turn on the Filter By Dataset toggle button if you want to filter the Infodom by a dataset.
Select the Dataset from the drop-down list. The Dataset drop-down is enabled only if the Filter
By Dataset toggle button is turned on.

4.4.1.3 Defining Data Mapping to Table (T2T, F2T, H2T, T2H, H2H, F2H, L2H)
In case of F2T or F2H, the source data file should be located at
/ftpshare/<INFODOM>/dmt/source/<SOURCE_NAME>/data/<MIS_DATE>. In case of a multi-tier
setup, if the dmt/source/<SOURCE_NAME>/data/<MIS_DATE>/ folder structure is not present in
the /ftpshare/<INFODOM> location, manually create the folder structure.
For local L2H executions, create the execution file path explicitly in the app layer. Since the source folders
get created in the web local path, the execution searches for the data file in the
ftpshare/<infodom>/dmt/<sourcename>/data/<datefolder>/ folder in the app layer.
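The expected source file location can be composed from the Infodom, Source name, and MIS date. The helper below is hypothetical; only the folder layout comes from the guide:

```python
# Builds the F2T/F2H source data file folder documented above:
#   /ftpshare/<INFODOM>/dmt/source/<SOURCE_NAME>/data/<MIS_DATE>
# The function itself is an illustration, not part of the product.

def source_data_path(infodom, source_name, mis_date):
    return f"/ftpshare/{infodom}/dmt/source/{source_name}/data/{mis_date}"

# Example values are invented for illustration.
print(source_data_path("OFSINFO", "FLATFILE_SRC", "20240401"))
# /ftpshare/OFSINFO/dmt/source/FLATFILE_SRC/data/20240401
```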

NOTE Data source based on a file present in the HDFS system cannot be
loaded into an RDBMS target Infodom.

1. Select the Load to Table option as Load Type.


2. From the Mapping Details pane, click Map.
The DI Mapping window is displayed.


Figure 37: DI Mapping window

3. Select the required table from the Source Entities drop-down list.
The list displays all the tables that are part of the source model.
The selected source entity attributes are displayed in the Source Entities pane.
4. Select the target table from the Target Entities drop-down list.
The selected entities are displayed in the Target Entities pane of the Target Table Map Panel.
If the Target column is a partitioned column, it is indicated using a superscript P and if it has a static
value, mouse over the column to display the partition value.

To view the Entity details, select an entity and click the view details icon. To remove an Entity from the
Definition pane or Target Entities pane, select the entity and click the delete icon. You cannot remove an
entity if any of its attributes are mapped. The mapped attribute is indicated using a superscript m.

NOTE You can create a new table by clicking the create table icon if the
target information domain is based on the HDFS database. The
newly created table will be part of the OFSAAI Data Model and it is
visible and available to all other modules. For more
information, see Dynamic Creation of Table.

5. To map a source to target, do one of the following:

• Select the required attribute from the Source Entities pane, select an attribute from the
Target Entities pane, and click the map icon.

• Click the auto-map icon to map source and target attributes automatically. Auto mapping
happens if both source and target attributes have the same name.


• To remove a mapping, select the target column and click the unmap icon. To remove all
mappings in the Target Entities pane, click the unmap-all icon.
• To remove all mappings from a Target Entity, select the target table from the Target Entities
pane and click the unmap icon.
• To define an expression to transform a source column and map it to a target column, select
EXPRESSION from the Source Entities pane, select an attribute from the Target Entities
pane, and click Transform Map. From the Expression Builder window, define an
expression to transform the column.
• To modify an expression, expand EXPRESSION from the Source Entities pane, select the
expression you want to modify, and click Transform Map. Modify the expression from the
Expression Builder window. This modifies the value for all target columns mapped to this
expression, irrespective of the target column selected while defining the expression.
A confirmation pop-up message is displayed.
• To map an existing expression to a new target column, expand EXPRESSION from the Source
Entities pane, select the expression you want to map, and click the map icon.

NOTE For a single DI Mapping, you can use different target tables.
That is, after mapping a source column to a column in a Target
Entity, you can select another Target Entity and start mapping
source columns to that target table columns. Also, the same
source column can be mapped to different target columns of
different target entities.

6. For an F2T definition, you can map Row Level Transformation (RLT) functions, that is, SysDate() and
Constant values, to a target column:
• Select SysDate() under Entity Details in the Source Entities pane and the required target column
in the Target Entities pane, and click the map icon. The target column should be a Date column.
• Select Constant Value under Entity Details in the Source Entities pane and the required target
column in the Target Entities pane, and click the map icon. Select the required constant value type
from the drop-down list. The supported constant values are #DEFINITIONNAME, #SOURCENAME,
#MISDATE, and #FILENAME. Ensure the Data Type of the target column matches the
constant value Data Type.
The options for Constants are:
• #DEFINITIONNAME- The name of the Data Mapping (F2T) definition will be transformed
at Row level and loaded into the mapped target column.
• #SOURCENAME- The name of the Source on which the Data Mapping (F2T) definition is
defined will be transformed at Row level and loaded into the mapped target column.
• #MISDATE- The execution date of the Data Mapping (F2T) definition will be transformed at
Row level and loaded into the mapped target column.


NOTE Columns mapped to #MISDATE use the NLS date format of the DB
for loading. For loading to succeed, set the
DB_DATE_FORMAT given in the AAI_DB_PROPERTY table to
the NLS date format of the corresponding atomic schema. To
find the NLS date format of the DB, run the following query:
select * from V$nls_Parameters

• #FILENAME- The name of the file used for loading will be transformed at Row level and
loaded into the mapped target column.
• Others- Enter a user-defined constant value in the textbox provided. To map a constant
date to a target column, the date must be given in the NLS format of the database. That is, if
the NLS format is DD-MON-RR, the value in the text box should be 25-OCT-19.
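To illustrate the NLS-format requirement, here is a hedged sketch that renders a date in the DD-MON-RR style mentioned above. Python's strftime stands in for Oracle's NLS formatting; the equivalence is approximate, and the real mask is whatever V$NLS_PARAMETERS reports:

```python
# Renders a date in Oracle's DD-MON-RR style (e.g. 25-OCT-19) so it can be
# typed as an "Others" constant. Python's %d-%b-%y is only an approximation of
# the Oracle NLS mask and assumes an English month locale.

from datetime import date

def to_dd_mon_rr(d):
    return d.strftime("%d-%b-%y").upper()

print(to_dd_mon_rr(date(2019, 10, 25)))  # 25-OCT-19
```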

NOTE • Row Level Transformation is supported only for F2T.
• In case of date-based columns in F2T, when you map a
source date column to multiple target columns, an
expression value is added to all the mapped target
columns, except to the first mapped column. The
expression is in this format: TO_DATE(<<first
record>>,'mm-dd-yyyy').

Figure 38: Join/Filter pane

If you are mapping from multiple Source Tables, define an expression to join the column data
corresponding to each table. You can pass Runtime Parameters through Expressions, Joins, and
Filter conditions. For more information, see Passing Runtime Parameters in Data Mapping.
7. Specify the ANSI Join or Join to join the source tables and enter the Filter criteria and Group By to
include during extraction. For example, “$MISDATE” can be a filter for Run-time substitution of the
MIS Date.
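Conceptually, run-time parameters such as $MISDATE are substituted into the filter text before the extraction SQL is executed. A simplified, hypothetical sketch of that substitution step (the placeholder name comes from the guide; the mechanics here are not the actual engine behavior):

```python
# Hypothetical illustration of run-time parameter substitution in a filter
# clause. Only the $MISDATE placeholder name is from the guide; the column
# name and substitution logic are invented for illustration.

def substitute_params(filter_clause, params):
    for name, value in params.items():
        filter_clause = filter_clause.replace(f"${name}", value)
    return filter_clause

clause = "FIC_MIS_DATE = TO_DATE('$MISDATE','YYYYMMDD')"
print(substitute_params(clause, {"MISDATE": "20240401"}))
# FIC_MIS_DATE = TO_DATE('20240401','YYYYMMDD')
```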


NOTE If the defined expression uses a function that has a placeholder
or calls a stored procedure that has a placeholder for the String
data type, enclose the placeholder in single quotes. Using
double quotes would generate an error during extract
definition or batch execution. Additionally, expressions with
Date/Timestamp data type placeholders are not supported.

Figure 39: Prescript/Hint

8. Specify any Source Prescript or Target Prescript if you want to use it. Prescripts are supported for
all HIVE based target Infodoms, that is, for H2H and T2H definitions. In case of H2T, the prescripts
are fired on the source.
For more information, see Prescripts.
9. Specify Source Hint and Target Hint (if any) for faster loading. Oracle hints follow the format as /*+
HINT */. The mapping level hint is applicable for T2T, H2T, and H2H definitions only.
For example, /*+ PARALLEL */.

Figure 40: Target Table Map Details

The Target Table Map Details pane displays the mapping details.

NOTE The View SQL and Validate buttons will be enabled only if
your user group is mapped to the User Role DMADV.

10. Click View SQL to view the complete query in the SQL/Plan pane.


11. Click Validate to validate the query by converting to the selected data source. If Validation is
successful, the Explain Plan for the SQL query is displayed in the SQL/Plan pane. Otherwise, the
SQL exception is displayed.
12. To modify an expression, select the expression name and click Edit Expression. Modify the
expression in the Expression Builder window.
For T2T definitions, it is recommended to use source-level expressions because the source and
target expressions are similar in T2T. Target expression for T2T is mainly provided to edit the target
level expression of the migrated Data Mapping definitions.
13. Click OK in the DI Mapping window.
14. Click Properties to specify the properties.
See Specifying Properties for Load To Table Option.
15. Click Save to save the mapping details. The Data Mapping definition will be saved as version 1.

NOTE 1. If a partitioned column is not mapped and the static value
is not set for the partitioned column, an alert is displayed.
Saving the mapping definition does not fail. You
can set a static value at any time before execution.
2. For an H2H definition, if the source and target are pointing to
two different Hive Schemas, it is mandatory to prefix the
schema name to the source tables. Otherwise, the
execution will fail.
3. When you click Save, if there are Primary Key Columns in
the Target Entities which are not mapped, then the
following alert appears:
[8368] Mandatory Columns are not Mapped [9024] Do you
want to continue?
You can click OK if no change is required and proceed, or
click Cancel to stay on the current window.

4.4.1.3.1 Specifying Properties for Load To Table Option

• T2T
• T2H
• H2H
• F2H
• H2T
• F2T


For T2T definition:

Figure 41: Properties window

The following table describes the Property Name and Value in the Properties window.

Table 10: Property Name and Value in the Properties window

Property Name Property Value

Constraints
Delete Duplicate Select Yes if you want to delete the duplicate records after insertion when
Primary Keys are disabled.

Disable Primary Key Select Yes to disable Primary Key while loading the data.
In case of Batch and Bulk modes, if any of the foreign keys are in Disabled
state before loading the data using T2T or the property Disable Primary
Key is set to Yes, then all the Primary Keys and corresponding Foreign Keys
are disabled before loading and are enabled back after loading. Hence the
initial status of foreign and primary keys can be changed from Disabled to
Enabled.
In case of Direct mode, if the Disable Primary Key property is not set
(selected as No), then the Delete Duplicate property is set to Yes
automatically, which in turn reports all the duplicate records in the error log
table.

File


Frequency Select the frequency of loading the data file into Data Warehouse. This
property can be used to schedule Batch operations.
The options are Daily, Weekly, Monthly, Quarterly, Yearly, and One Time
Load.

Load Empty If this is set to Yes, the task will be successful even if there are no records to
load or if all the records are discarded or rejected.

MIS Date Field Specify the MIS Date field in the source data file. If MIS Date is not part of
the download, then you can use the MISDate() function in the Data
Mapping window to add MIS Date to the table automatically.

Loading

Load Previous Set to Yes if you want to load the data of the previous period when the
current period data is not available.

Loading Type Select the loading type from the drop-down list. The options are:
• Insert- The records will be overwritten.
• Append- The records will be appended to the target table.

Read Priority Choose the priority of reading the data from either Memory Store or
Persistent Store, from the drop-down list.

Write Priority Choose the priority of writing the data into either Memory Store or
Persistent Store, from the drop-down list.

Loading Mode

Record Load Limit If the number of records in the source table exceeds the Record Load Limit
value, the data loading will not happen. If the value is set as 0 or not
specified, the record count check is skipped.

Direct or Batch or Bulk Specify the Loading Mode as Direct, Batch, or Bulk.
In Bulk Mode of loading, note that:
Loading is possible only when the target database and the data source
created for the definition are in the same database.
If the schema used for source and target is different but the database is the
same, then the target schema should be granted “Select” access for the
source table.
You cannot specify the Batch Size, and commit happens at the end of the
batch load.
Batch loading is faster for a smaller number of records; for a larger number
of records, it can sometimes lead to loss of data while loading.

Batch Size Specify the Batch Size if you want to load the records in batches. The ideal
values for batch sizes are 1024, 2048, 10000, or 20000. Huge batch sizes
may result in failure if the required system resources are not available.
If it is not specified, commit is done on the entire set.

Source Fetch Size Specify the Source Fetch Size for fetching data from the source system.
For T2T definitions, Source Fetch size is applicable to both Batch and Direct
loading methods.
For example, the default Source Fetch Size for Oracle JDBC is 10.


Rejection

Rejection Threshold Enter the maximum number of errors, in absolute value, that a Data File can
have for the Data Load still to be marked successful.
After the erroneous record count exceeds the Rejection Threshold value,
the data loading task will fail and the inserted values will be rolled back for
that table. Inserts for the previous tables won't be reverted. Rejection
Threshold will be applied to each target table individually in a batch.
By default, the value is set as UNLIMITED.
Note the behavior of Rejection Threshold and Rejection Threshold %:
• Rejection Threshold is checked before Rejection Threshold %. If you set
a value for Rejection Threshold, it will be considered as the rejection
limit and any value given to Rejection Threshold % is not considered.
• If you set the Rejection Threshold as UNLIMITED or blank, it checks for
Rejection Threshold % and the value set for Rejection Threshold % will
be taken as rejection limit.
• If you set both Rejection Threshold and Rejection Threshold % as
UNLIMITED or blank, the whole Data file will be loaded irrespective of
the number of errors.

Rejection Threshold % Set the Rejection Threshold as a percentage of the number of rows in the
Data file.
Enter the maximum number of errors that a Data File can have, as a
percentage of the number of rows in the data file, for the Data Load to be
marked as successful.
By default, the value is set as UNLIMITED.
Rejection Threshold % is considered only if Rejection Threshold is set to
UNLIMITED or blank.
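The precedence rules above can be summarized: an absolute threshold wins over the percentage, and only when the absolute threshold is UNLIMITED or blank is the percentage consulted. A sketch of that decision logic (illustrative only, not the product's code):

```python
# Illustrative decision logic for Rejection Threshold vs Rejection Threshold %,
# following the precedence described above. Returns the effective maximum
# number of rejected rows, or None when the whole file loads regardless.

def effective_rejection_limit(threshold, threshold_pct, total_rows):
    unlimited = (None, "", "UNLIMITED")
    if threshold not in unlimited:
        return int(threshold)                          # absolute value wins
    if threshold_pct not in unlimited:
        return total_rows * int(threshold_pct) // 100  # fall back to percentage
    return None                                        # both unlimited: load everything

print(effective_rejection_limit("50", "10", 1000))                # 50
print(effective_rejection_limit("UNLIMITED", "10", 1000))         # 100
print(effective_rejection_limit("UNLIMITED", "UNLIMITED", 1000))  # None
```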

For T2H definition:

Figure 42: Properties window

The following table describes the Property Name and Value in the Properties window.


Table 11: Property Name and Value in the Properties window

Property Name Property Value

Loading

Loading Type Select the loading type from the drop-down list. The options are:
Insert- The records will be overwritten.
Append- The records will be appended to the target table.

Read Priority Choose the priority of reading the data from either Memory Store or
Persistent Store, from the drop-down list.

Write Priority Choose the priority of writing the data into either Memory Store or Persistent
Store, from the drop-down list.

Loading Mode

Record Load Limit If the number of records in the source table exceeds the Record Load Limit
value, the data loading will not happen. If the value is set as 0 or not specified,
the record count check is skipped.

Source Fetch Size Specify the Source Fetch Size for fetching data from the source system.
For example, the default Source Fetch Size for Oracle JDBC is 10.

Sqoop

Split By Column This is applicable only if you are using Sqoop for loading data into Hive tables.
Specify the Split By Column in the format “TableName.ColumnName”. It
should not be an expression. Additionally, the column should not be of data
type “Date” and it should not have Null data.
This is a mandatory field for T2H executions using Sqoop.
If you have not provided any value for this field, the T2H Sqoop engine defaults
the value to the last mapped source column.
Ideally, you should set the Split-by column to a PK numeric column. If the split
by column is String-based, the Generic Options property needs to be set to
-Dorg.apache.sqoop.splitter.allow_text_splitter=true.
Generic Options This field is applicable only in Sqoop SSH mode.
Specify the generic arguments that will be appended before all the tool-specific
arguments. For example, -Doraoop.nologging=true

Specific Options This field is applicable only in Sqoop SSH mode.


Specify any tool specific arguments that will be appended at the end of the
Sqoop command. For example, --connection-param-file
ora.properties --update-mode allowinsert --update-key
<COLUMN_NAME>
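Because Split By Column must be a plain "TableName.ColumnName" reference rather than an expression, a quick format check like the following can catch bad values before execution (a hypothetical pre-check, not part of the product):

```python
# Hypothetical pre-check for the Sqoop "Split By Column" property: it must be
# "TableName.ColumnName" with no expressions or function calls. The sample
# table and column names below are invented for illustration.

import re

SPLIT_BY = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*\.[A-Za-z_][A-Za-z0-9_]*$")

def valid_split_by(value):
    return bool(SPLIT_BY.match(value))

print(valid_split_by("STG_ACCOUNT.N_ACCT_SKEY"))    # True
print(valid_split_by("UPPER(STG_ACCOUNT.V_CODE)"))  # False: expression
print(valid_split_by("N_ACCT_SKEY"))                # False: table name missing
```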


For H2H Definition:

Figure 43: Properties window

The following table describes the Property Name and Value in the Properties window.

Table 12: Property Name and Value in the Properties window

Property Name Property Value

Loading

Loading Type Select the loading type from the drop-down list. The options are:
Insert- The records will be overwritten.
Append- The records will be appended to the target table.

Read Priority Choose the priority of reading the data from either Memory Store or
Persistent Store, from the drop-down list.

Write Priority Choose the priority of writing the data into either Memory Store or
Persistent Store, from the drop-down list.

Loading Mode

Record Load Limit If the number of records in the source table exceeds the Record Load
Limit value, the data loading will not happen. If the value is set as 0 or
not specified, the record count check is skipped.


For F2H Definition

Figure 44: Properties window

The following table describes the Property Name and Value in the Properties window.

Table 13: Property Name and Value in the Properties window

Property Name Property Value

File

Data File Enter the name of the Data File that needs to be extracted. You can specify
multiple files separated by ‘/’.
This property is useful to create metadata definitions for multiple Flat-Files
of the same structure by copying the Definition File.

Hive and Impala

Is File Local To Hive Server Select Yes if the file is on the server where HiveServer is running, else
select No from the drop-down list. This is applicable only for remote file
source.

Loading

Loading Type Select the loading type from the drop-down list. The options are:
Insert- The records will be overwritten.
Append- The records will be appended to the target table.

Read Priority Choose the priority of reading the data from either Memory Store or
Persistent Store, from the drop-down list.

Write Priority Choose the priority of writing the data into either Memory Store or
Persistent Store, from the drop-down list.


For H2T Definition

Figure 45: Properties window

The following table describes the Property Name and Value in the Properties window.

Table 14: Property Name and Value in the Properties window

Property Name Property Value

Loading

Loading Type Select the loading type from the drop-down list. The options are:
Insert- The records will be overwritten.
NOTE:
Limitation: In the Insert Mode for H2T SQOOP Execution, the
Target Tables are truncated. If a Task fails, the changes cannot be
rolled back.
Append- The records will be appended to the target table.

Read Priority Choose the priority of reading the data from either Memory Store or
Persistent Store, from the drop-down list.

Write Priority Choose the priority of writing the data into either Memory Store or
Persistent Store, from the drop-down list.

Loading Mode

Record Load Limit If the number of records in the source table exceeds the Record Load
Limit value, the data loading will not happen. If the value is set as 0 or
not specified, the record count check is skipped.


Batch Size Specify the Batch Size if you want to load the records in batches. The
ideal values for batch sizes are 1024, 2048, 10000, or 20000. Huge batch
sizes may result in failure if the required system resources are not
available.
If it is not specified, commit is done on the entire set.

Rejection

Rejection Threshold Enter the maximum number of errors, in absolute value, that a Data File can
have for the Data Load still to be marked successful.
Once the erroneous record count exceeds the Rejection Threshold
value, the data loading task will fail and the inserted values will be rolled
back for that table. Inserts for the previous tables won't be reverted.
Rejection Threshold will be applied to each of the target tables
individually in a batch.
By default, the value is set as UNLIMITED.

Sqoop

Generic Options This field is applicable only in Sqoop SSH mode.


Specify the generic arguments which will be appended before all the
tool-specific arguments. For example, -Doraoop.nologging=true

Specific Options This field is applicable only in Sqoop SSH mode.


Specify any tool-specific arguments, which will be appended at the end
of the Sqoop command. For example, --connection-param-file
ora.properties --update-mode allowinsert --
update-key <COLUMN_NAME>

NOTE:
To parse the date column values, set this property as follows:
• In Sqoop cluster:
--connection-param-file <path to the
ora.properties file on the sqoop node>
• In Sqoop client mode:
--connection-param-file
$FIC_DB_HOME/bin/ora.properties
Update the ora.properties file with the following parameter:
oracle.jdbc.mapDateToTimestamp=false
Use Staging Select Yes to use a staging table during Sqoop export.


For F2T Definition

Figure 46: Model Dialog window

The following table describes the Property Name and Value in the Properties window.

Table 15: Property Name and Value in the Properties window

Property Name Property Value

File


Frequency Select the frequency of loading the data file into Data Warehouse. This
property can be used to schedule Batch operations.
The options are Daily, Weekly, Monthly, Quarterly, Yearly, and One Time
Load.

MIS Date Field Specify the MIS Date field in the source data file. If MIS Date is not part of the
download, then use the MISDate() function in the Data Mapping window to
add MIS Date to the table automatically.

Data File Enter the data file name if it is different from the Definition name. This
property is useful to create metadata definitions for multiple Flat-Files of the
same structure by copying the Definition File.
Note: For F2T CPP execution, you should not enter “/” in the Data File name.

Load Empty If this is set to Yes, the task will be successful, even if there are no records to
load or if all the records are discarded or rejected.

Prefix Enter the string that is prefixed to the data file name, separated by an
underscore (_).

Suffix • Select No if the data file name is not suffixed.


• Select Information Date if the data file name is suffixed with
Information Date or MIS Date in YYYYMMDD format separated by an
underscore (_).

Constraints

Disable Primary Key Select Yes to disable the Primary Key while loading the data.
In Batch and Bulk modes, if any of the foreign keys are in Disabled state
before loading the data using T2T, or if the Disable Primary Key property is
set to Yes, then all the Primary Keys and corresponding Foreign Keys are
disabled before loading and enabled again after loading. Hence, the initial
status of foreign and primary keys can change from Disabled to Enabled.
In Direct mode, if the Disable Primary Key property is not set (that is,
selected as No), the Delete Duplicate property is set to Yes automatically,
which in turn reports all the duplicate records in the error log table.

Disable Check Constraints Select Yes if you want to disable the Check Constraints on columns of the
table or select No to load with the constraints enabled.

Loading Mode

Record Load Limit If the number of records in the source file exceeds the Record Load Limit
value, the data loading will not happen. If the value is set as 0 or not
specified, the record count check is skipped.

Loading

Load Previous Set to Yes if you want to load the data of the previous period when the
current period data is not available.

Loading Type Select the loading type from the drop-down list. The options are:
• Insert- The records will be overwritten.
• Append- The records will be appended to the target table.

Duplicate Row

Duplicate Row Checks Select Yes to check for Duplicate Rows and to remove them from the Data
File.

Duplicate Row This field determines which of the duplicate records is removed when
duplicates are found. The options are Keep Last Occurrence and Keep First Occurrence.

Misc

Abort-Failure Condition Select Stop to stop the loading on reaching the Rejection Threshold. Select
Continue to ensure the reading of the entire Data File.

Query Enter the Query that needs to be executed before file loading.

Discard Max Enter the maximum errors allowed for SQL*Loader Discards while loading.

Edit and Reload Select Yes to have the option of editing the error file and re-loading it.

Oracle

Continue If Enter a condition which when satisfied will continue the file load.

Direct Load • Select Yes to do Fast Load into the Oracle Database only if you have not
defined any target expressions.
• Select Force to do Fast Load into the Oracle Database if target
expressions have only constant values.
• Select No if you do not want to enable Fast Load.

Load When Enter a condition which when satisfied will start the file load.

Parallel Load Select Yes to load the data in parallel into the Database table for faster
loading, else select No.

Preserve Blanks Select Yes to retain blank values in the Data without trimming.

BINDSIZE For conventional path loads, BINDSIZE specifies the maximum size (in bytes)
of the bind array. The size given by BINDSIZE overrides the default size
(which is system dependent) and any size determined by ROWS.

Number of ROWS For conventional path loads, ROWS specifies the number of rows in the bind
array.
For direct path loads, ROWS identifies the number of rows you want to read
from the data file before a data save. The default is to read all rows and save
data once at the end of the load.

Trailing Null Columns Select Yes to retain Trailing Null Columns in the Data File.

Growth

Incremental Growth Enter the Incremental Growth of Data in absolute values over the previous
period.

Incremental Growth % Enter the Incremental Growth of Data in percentage over the previous period.

Rejection

Rejection Threshold Enter the maximum number of errors (in absolute value) that a Data File can
have for the Data Load to still be marked as successful.
Once the erroneous record count exceeds the Rejection Threshold value, the
data loading task fails and the inserted values are rolled back for that table.
Inserts for the previous tables will not be reverted. The Rejection Threshold
is applied to each of the target tables individually in a batch.
By default, the value is set as UNLIMITED.
Rejection Threshold is considered only if Rejection Threshold % is set to
UNLIMITED or blank.
If you set both Rejection Threshold % and Rejection Threshold as UNLIMITED
or blank, the whole Data File will be loaded irrespective of the number of
errors.

Rejection Threshold % Enter the maximum number of errors, as a percentage of the number of rows
in the Data File, that the file can have for the Data Load to still be marked as
successful.
By default, the value is set as UNLIMITED.
Note the behavior of Rejection Threshold % and Rejection Threshold:
• Rejection Threshold % is checked before Rejection Threshold. If you set a
value for Rejection Threshold %, it will be considered as the rejection
limit and it will not check Rejection Threshold.
• If you set Rejection Threshold % as UNLIMITED or blank, it checks for
Rejection Threshold and the value set for Rejection Threshold will be
taken as rejection limit.
• If you set both Rejection Threshold and Rejection Threshold % as
UNLIMITED or blank, the whole Data file will be loaded irrespective of the
number of errors.
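The precedence rules between the two thresholds described above can be sketched in a few lines (an illustrative simplification, not the actual loader code; the function and parameter names are invented for the example):

```python
UNLIMITED = None  # UNLIMITED or blank both mean "no rejection limit"

def load_fails(error_count, total_rows,
               threshold_pct=UNLIMITED, threshold_abs=UNLIMITED):
    """Return True when the load should be marked as failed.

    Rejection Threshold % is checked first; the absolute Rejection
    Threshold applies only when the percentage is UNLIMITED or blank.
    """
    if threshold_pct is not UNLIMITED:
        return error_count > (threshold_pct / 100.0) * total_rows
    if threshold_abs is not UNLIMITED:
        return error_count > threshold_abs
    # Both UNLIMITED/blank: the whole file loads regardless of errors.
    return False

# 5% of 1000 rows allows 50 errors, so 60 rejections fail the load.
print(load_fails(60, 1000, threshold_pct=5))                     # True
# The percentage wins even if the absolute threshold is more lenient.
print(load_fails(60, 1000, threshold_pct=5, threshold_abs=100))  # True
# Both thresholds unlimited: never fails on rejections.
print(load_fails(999, 1000))                                     # False
```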

4.4.1.4 Defining Data Mapping for File Extraction (T2F, H2F)


You can map data from a source table to the specified file in the Data Mapping window. The source can be
an RDBMS table or HDFS source. To load data to a file along with other sources, you need to define the
Data Mapping and specify the Source Entities. Source-Target mapping is not required since the table
structure is completely extracted to the specified file. However, if you want to do an F2T after T2F, source
to target mapping is required. For example, for DB2 you cannot directly load data from DB2 to RDBMS, so
you need to map data from Table to File (T2F) and then File to Table (F2T).
After execution of a T2F or H2F definition, the extracted file will be present in
/ftpshare/<INFODOM>/dmt/def/<DEFINITIONNAME>/<BATCH_ID>/<DATE_FOLDER>. The column
names in the table will not be present in the extracted file.

Figure 47: Source Entity Details pane

1. Select Extract to File option as Load Type.


2. Click Select.
The Entity Selection window is displayed.

Figure 48: Entity Selection window

The Select Entity grid displays all entities in the selected Source or Infodom. Expand the Entity name
to view the attributes in each entity.
3. Select the required entities or attributes you want to extract to file:

 Select an entity and click if you want to extract all attributes in an entity.
 For extracting only selected attributes in an entity, expand the required entity, select the
attribute and click .

 Click to select all entities.

 To remove an attribute from the Selected Values, select the attribute and click .

 Click to remove all selected values.


4. Click Select to populate the selected entities or attributes in the Source Entity Details grid.

NOTE Whenever you make any changes in the Select Entity grid, click
Select to refresh the Source Entity Details grid to reflect the
changes.

Figure 49: Join/Filter pane

5. If you are mapping from multiple Source Tables, define an expression to join the column data
corresponding to each table. Specify the ANSI Join or Join to join the source tables and enter the
Filter criteria and Group By to include during extraction. For example, “$MISDATE” can be a filter
for Run-time substitution of the MIS Date.

NOTE If the defined expression uses a function that has a placeholder
or calls a stored procedure that has a placeholder for the String
data type, enclose the placeholder in single quotes. Using
double quotes generates an error during extract definition or
batch execution. Additionally, expressions with Date/Timestamp
data type placeholders are not supported.

6. Specify Source Prescript if any.


For more information, see Prescripts.
7. Specify Source Hint, if any, for faster loading. Oracle hints follow the (/*+ HINT */) format. The
mapping-level hint is not applicable.
For example, /*+ PARALLEL */.

NOTE Hints are not supported for T2F definitions.

Figure 50: Source Entity Details pane

8. Perform the following actions if required:

 Click Add to add a new custom column by defining it from the Expression Builder window.

 Click Edit to edit the Expression Value defined using the Expression Builder window. You
can also edit the expression value by double-clicking the Expression Value column and
manually typing the proper expression.
 Double-click the Field Order number and update the value to change the order in which
columns should appear in the target file.

NOTE No validation is provided for missing Field Orders. Hence,
during execution, the columns after the missing field order
will be omitted. Click Reorder to sort and reorder the Field
Order numbers to fill any missing numbers.

 Double-click the Logical Data Type and select the required option from the drop-down list to
change the Data Type of the target column. The available Data types are Number, String, Date
Time, Integer, and Timestamp.
 Double-click the Date Format and modify the date format, if required, for the target column.

NOTE Date Format must be specified for target columns with
Logical Data Type as Date Time. Otherwise, the execution will fail.

 Select an attribute and click Delete if you do not want that attribute in the target file.

NOTE The View SQL and Validate button will be enabled only if your
user group is mapped to the User Role DMADV.

9. Click View SQL to view the complete query in the SQL Plan pane.
10. Click Validate to validate the query by converting to the selected data source.
If validation is successful, the Explain Plan for the SQL query is displayed in the SQL Plan pane.
Otherwise, the SQL exception is displayed.
11. Click Ok to save the changes in the Entity Selection window.
12. Click Properties to specify the properties.
See Specifying Properties for Extract To File Option section.
13. Click Save to save the mapping details.
The Data Mapping definition will be saved as version 1.

4.4.1.4.1 Specifying Properties for Extract To File Option

For T2F or H2F definition:

Figure 51: Modal Dialog window

The following table describes the fields in the Modal Dialog window.

Table 16: Modal Dialog window Fields and their Descriptions

Property Name Property Value

File

Data File Enter the data file name.


Data File Name can be different from the Definition File Name. This property
is useful to create metadata definitions for multiple Flat-Files of the same
structure by copying the Definition File.

Suffix • Select No if you do not want to suffix the data file name.
• Select Information Date if you want to suffix the data file name with
Information Date or MIS Date in YYYYMMDD format separated by an
underscore (_).

Prefix Enter the string that you want to prefix to the data file name, separated by
an underscore (_).

Misc

Field Delimiter Enter the field separator used in the Data File. By default, comma (,) is
selected.

Rules

Check Rules Select Header, Trailer, Header and Trailer or No from the drop-down list
depending on where the Validity rules are specified in the Data File.

Header Identifier This field is enabled only if you select Header or Header and Trailer options
for Check Rules.
Specify the first Character or String that identifies the Header Record.

Header Field Order This field is enabled only if you select Header or Header and Trailer options
for Check Rules.
Specify the header field order as comma-separated values: 1-Header
Identifier, 2-Data File Name, 3-Information Date, 4-Number of Records,
5-Value of Checksum, 6-Basis of Checksum.
For example, if you specify 1,3,2,4,5,6, the header fields will be Header
Identifier, Information Date, Data File Name, Number of Records, Value of
Checksum, Basis of Checksum.

Trailer Identifier This field is enabled only if you select Trailer or Header and Trailer options
for Check Rules.
Specify the first Character or String that identifies the Trailer Record.

Trailer Field Order This field is enabled only if you select Trailer or Header and Trailer options
for Check Rules.
Specify the trailer field order as comma-separated values: 1-Trailer
Identifier, 2-Data File Name, 3-Information Date, 4-Number of Records,
5-Value of Checksum, 6-Basis of Checksum.

Data File Name Select Yes if the name of the data file should be provided as part of the
Header/Trailer.

Information Date Select Yes if the Information (MIS) Date in the Data File should be provided as
part of the Header/Trailer.

Number of Records Select Yes if the number of records in the Data File should be provided as
part of the Header/Trailer.

Checksum Select Yes if a Check Sum Value should be provided as part of the
Header/Trailer.

Basis of Checksum Specify the Source Column Name on which the Check Sum is computed. It
has to be a Numeric column.

Source Fetch Size Specify the Source Fetch Size for fetching data from the source system.
This property is applicable only for T2F.
For example, the default Source Fetch Size for Oracle JDBC is 10.
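To illustrate the Header and Trailer properties above, suppose Check Rules is set to Header and Trailer, both field orders are 1,2,3,4,5,6, and the field delimiter is a comma. The first and last records of the data file might then look like this (the identifier strings, file name, and values are hypothetical):

```
HH,FCT_ACCOUNT_20240405.dat,20240405,100000,54321.99,N_EOP_BALANCE
...data records...
TT,FCT_ACCOUNT_20240405.dat,20240405,100000,54321.99,N_EOP_BALANCE
```

Here HH and TT are the Header Identifier and Trailer Identifier, followed by the Data File Name, Information Date, Number of Records, Value of Checksum, and Basis of Checksum in the specified order.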

4.4.1.5 Associating DQ Rules to a Data Mapping Definition


Data Quality Rules can be associated with Data Mapping definitions so that Data Quality (DQ) checks are
done on the source and Data Correction (DC) is done while loading to the target table. Thus, DC is
segregated from DQ checks. This is supported for both RDBMS and HIVE based Data Mapping definitions.
However, DC on DQ Generic Check is not supported in T2H, H2T, and H2H. Additionally, associating DQ
Rules to Data Mapping is not supported for H2T OLH (Oracle Loader for Hadoop) mode.
If you associate DQ Rules with a T2T definition and execute the batch, both the T2T and all the DQ rules
defined on the Source table are executed. You have the option to include or exclude the associated DQ
rules. If you exclude a DQ check and execute the batch, then only the T2T operation is performed and not
the DQ check.
Prerequisites
• De-select the Allow Correction on DI Source checkbox from the Configuration window.
For more information, see the Updating Others Tab section.
• The DI Source should exist as an information domain.
To associate DQ rules to Data Mapping definition:
1. Click the button in the Associated DQ Rules toolbar. The Data Quality Rule Association window is
displayed.
2. All DQ Rules defined on the source table are displayed.
3. Select the Exclude checkboxes corresponding to the DQ rules to exclude them being executed
along with the T2T operation.
4. Enter the sequence in which the selected DQ Rules should get executed in the Sequence column.
5. Click Save.

NOTE When a DQ rule is associated with a T2T mapping and the
Allow Correction on DI Source checkbox is not selected in the
System Configuration > Configuration > Others tab, DQ rule
checking is done on the source, but data correction is done
while loading to the target table.

4.4.1.6 Replacing Source or Target of Data Mapping Definition during Execution


You can replace the source of the Data Mapping definition during execution by using the Run-time
parameter EXEC_ENV_SOURCE. Therefore, you can convert a T2T definition into H2T, T2H into H2H, or
H2H into T2H. However, if the resultant definition is T2T, execution of T2T using CPP engine is not
supported.
Similarly, you can replace the target of the Data Mapping definition during execution by using the
run-time parameter EXEC_ENV_TARGET. Thus, you can convert a T2T definition into T2H, H2T into H2H, or
H2H into H2T. However, if the resultant definition is T2T, execution of T2T using the CPP engine is not
supported.
If you are executing the Data Mapping definition through the RRF module, you should pass the parameter
with double quotes.
For example,
“EXEC_ENV_SOURCE”,”newSourceName”
“EXEC_ENV_TARGET”,”newTargetName”
If you are executing the Data Mapping definition through the ICC module, you should pass the parameter
with square brackets. For more information, see Component: LOAD DATA.
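For example, the same replacement expressed as ICC task parameters would use the square-bracket form (the source and target names below are placeholders):

```
[EXEC_ENV_SOURCE]=newSourceName
[EXEC_ENV_TARGET]=newTargetName
```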

NOTE • Ensure the structure of the source/target in the mapping
definition is the same as that of the replacing source/target.
• You can use both EXEC_ENV_SOURCE and EXEC_ENV_TARGET
together as well. The only limitation is that if the resultant
definition is T2T, it cannot be executed using the CPP engine.

4.4.1.7 Executing H2H on Spark


Following are the configurations required for executing H2H on Spark:
1. Register a cluster from DMT Configurations > Register Cluster with the following details:
 Name- Enter the name of the target information domain of the H2H mapping.
 Description- Enter a description for the cluster.
 Livy Service URL- Enter the Livy Service URL used to connect to Spark from OFSAA.
2. To execute H2H on Spark, set the EXECUTION_ENGINE_MODE parameter as SPARK from ICC or
RRF.
 Execution through Operations module- Pass [EXECUTION_ENGINE_MODE]=SPARK while
defining the H2H tasks from the Task Definition window.
For more information, see Component: LOAD DATA section.
 Execution through RRF module- Pass the following as a parameter while defining H2H as jobs
from the Component Selector window:
“EXECUTION_ENGINE_MODE”,”SPARK”

3. Spark Session Management- In a batch execution, a new Spark session is created when the first
H2H-spark task is encountered and the same spark session is reused for the rest of the H2H-spark
tasks in the same Run.
For the spark session to close at the end of the run, set the CLOSE_SPARK_SESSION to YES in the
last H2H-spark task in the batch.
 Execution through Operations module- Pass [CLOSE_SPARK_SESSION]=YES while defining the
last H2H-Spark task from the Task Definition window.
For more information, see Component: LOAD DATA section.
 Execution through RRF module- Pass the following as a parameter while defining the last H2H-
spark job from the Component Selector window:
“CLOSE_SPARK_SESSION”,”YES”

NOTE:
1. Ensure that precedence is set so that the task with “CLOSE_SPARK_SESSION”,”YES”
executes after all the other H2H-spark tasks.
2. By default, the created Spark session is closed when any of the H2H-spark
tasks fails.
3. Execution of H2H with a large number of mappings may fail because Spark
restricts the length of the SQL code in the spark.sql file to a maximum of
65535 (2^16 - 1) characters.
4. When you run an H2H Load with Hive and Apache Spark, it can fail with the
following error:
Error executing statement : java.lang.RuntimeException:
Cannot create staging directory
'hdfs://<HOST_NAME>/user/hive/warehouse/hivedatadom.db/dim_account/.hive-staging_hive_2020-07-06_22-44-57_448_3115454008595470139-1':
Permission denied: user=<USER_NAME>, access=WRITE,
inode="/user/hive/warehouse/hivedatadom.db/dim_account":hive:hive:drwxrwxr-x
Provide the required permissions to the logged-in user in the Hive Database Storage,
which enables the user to access and perform tasks in the storage.
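As a sketch, granting the required access on the warehouse directory from the error message above could look like this (the path and group are illustrative; follow your site's Hadoop security policy, for example Sentry or Ranger, rather than blanket filesystem permissions where applicable):

```
hdfs dfs -chmod -R 775 /user/hive/warehouse/hivedatadom.db
hdfs dfs -chown -R hive:ofsaauser /user/hive/warehouse/hivedatadom.db
```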

4.4.1.8 Dynamic Table Creation


This option allows you to create a new table on the fly if the target Information Domain of the Data
Mapping is based on the HDFS database. You can use the newly created table for mapping. The newly
created table becomes part of the OFSAAI Data Model and is made visible and available to all other
modules.
Note that you cannot create a partitioned table.
To dynamically create a table follow these steps:

1. From the DI Mapping window, click in the Target Entities pane.


The Create Table window is displayed.

2. Enter a table name and click Generate.


The new table name is displayed on the Target Entities pane.
3. Select the required attributes from the Definition pane and map them to the new Table in the Target

Entities pane by clicking button.


4. After defining all mappings, click Save. The table will be created in the HDFS/ HIVE system with the
structure/data types of the mapped columns, and it will be added to the metadata repository (both
database xml and the object registration tables). The newly created table will be available for use in
other metadata like Datasets, Hierarchies, and so on.

4.4.1.9 Prescripts
Prescripts are fired on a Hive connection before firing a select from or an insert into a Hive table. While
defining a Prescript, note the following:
• A Prescript must begin with the keyword "SET".
• Multiple Prescripts must be semicolon separated.
• Prescripts are validated for SQL Injection. The following keywords are blacklisted:
"DROP", "TRUNCATE", "ALTER", "DELETE", "INSERT", "UPDATE", "CREATE", "SELECT"
All validations applicable in the UI are also checked on execution. If a Prescript fails any of the validations
or if there is an error in firing the Prescript, the load operation is exited.

NOTE For H2T, the Prescript is fired on the source.
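The validation rules listed above can be approximated in a short sketch (illustrative only; is_valid_prescript is an invented name and the product performs additional checks):

```python
BLACKLIST = {"DROP", "TRUNCATE", "ALTER", "DELETE",
             "INSERT", "UPDATE", "CREATE", "SELECT"}

def is_valid_prescript(prescript):
    """Validate a semicolon-separated list of Hive prescripts.

    Every statement must begin with the keyword SET, and none may
    contain a blacklisted keyword (a simplified injection guard).
    """
    for statement in prescript.split(";"):
        statement = statement.strip()
        if not statement.upper().startswith("SET"):
            return False
        words = set(statement.upper().replace("=", " ").split())
        if words & BLACKLIST:
            return False
    return True

print(is_valid_prescript(
    "SET hive.exec.parallel=true;"
    "SET hive.exec.dynamic.partition.mode=nonstrict"))  # True
print(is_valid_prescript("DROP TABLE t"))               # False
```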

4.4.1.10 Handling Partitioned Target Tables


Data loading into a partitioned Hive target table is supported. The partitioned columns are indicated using
a superscript P in the DI Mapping window.
You can set a static value to a partitioned column from the REV_TAB_PARTITIONS table. If it is set, you
can view it from the DI Mapping window by pointing the mouse over the column name. You do not need
to map the target column to any source column. If you map a source column to a target partitioned
column that already has a static value, the static value takes precedence.
If no static value is set to a partitioned column, you can pass a dynamic partition value by mapping a
source column to the target partitioned column. If there is no mapping and no static value is set, an
empty or blank value is passed as the partition value, and Hive defaults the partition to
__HIVE_DEFAULT_PARTITION__. There is no loss of data in the non-partitioned columns.

NOTE If you need to enable dynamic partition in non-strict mode, set
the following property as a Prescript in the Data Mapping window:
set hive.exec.dynamic.partition.mode=nonstrict

A static partition value can also be set with placeholders. The placeholders supported in Data Mapping are
$RUNID, $PHID, $EXEID, $RUNSK, $SYSDATE, $TASKID, and $MISDATE. Additionally, a partition value can
be provided as a parameter within square brackets, for example, [PARAM1]. Passing the parameter values
at runtime from the RRF/Operations module is the same as for the other Run-time parameters in the Data
Management Framework. The values for the placeholders/additional parameters are substituted as the
static partition values at Run-time. For more information, see Passing Runtime parameters in
Data Mapping.
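For example, a static partition value for a date-partitioned column might be entered in either of these forms (illustrative values):

```
$MISDATE
[PARAM1]
```

Here $MISDATE is resolved to the MIS Date at Run-time, while [PARAM1] is substituted from the parameter values passed via the RRF or Operations module.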

4.4.1.11 Expression Builder


Figure 52: Expression Builder window

1. In the Expression Builder window, do the following:
 Enter the Expression Name.
 Select the Data Type from the drop-down list. The available options are String, Date Time,
Number, Integer, and Timestamp. If you have selected Date Time as the Type, set the Date Format
by double-clicking the attribute/field from the Source Entities pane.
2. Define an expression by doing the following:
 Select the Table in the Entities section.
 Select the Function. You can select Transformations, Database Functions, or Extraction
Functions. Extract functions are populated from the “DATABASE_ABSTRACT_LAYER” table that
resides in the Config Schema.

 Define the Operators by selecting Arithmetic, Concatenation, Comparison, Logical, or other
operators.
For more information, see Operators.
 Specify the ANSI Join or Join to map the table columns, and enter the filter criteria to include
during extraction. For example, “$MISDATE” can be a filter for Run-time substitution of the MIS
Date.

NOTE If the defined expression uses a function that has a placeholder
or calls a stored procedure that has a placeholder for the String
data type, enclose the placeholder in single quotes. Using
double quotes generates an error during extract definition or
batch execution. Additionally, expressions with Date/Timestamp
data type placeholders are not supported.

3. Click Ok.

4.4.2 Modifying a Data Mapping Definition


This option allows you to modify a Data Mapping definition. You cannot modify inactive versions of a Data
Mapping definition. To make an inactive version active, make that version the latest.
To modify a Data Mapping definition:
1. From the Data Mappings window, select the Data Mapping definition that you want to edit and click
Edit.
The Data Mapping window is displayed.
2. Modify the required details. You cannot modify Code and Name.
For more information, see Creating Data Mapping Definition section.
3. Click Save. The definition will be saved as the highest version +1. That is, if you are modifying a
definition with version number 3 and the highest version available is 5, the definition will be saved
as version 6.

4.4.3 Versioning and Make Latest Feature of Data Mapping


When a new definition is created, it is saved as version 1. After you modify and save a definition, it is
saved as the highest version +1. That is, if you modify version 2, which is the highest version
available, and save it, the version becomes 3.
In earlier versions, Data Mapping definitions having the same name but different sources could co-exist;
this is not allowed in OFSAAI version 8.0.6.0.0 and above. Therefore, while migrating Data Mapping
definitions from earlier OFSAAI versions, the second occurrence of the definition with a different source is
saved as version 2. Version 2 is then active and version 1 inactive, and both are executable.
However, you can modify only the active versions.
To make any older version as latest:

1. From the Data Mapping window, select INACTIVE from the Record Status drop-down list and click
Search.
All inactive definitions are displayed.

2. Select the required definition and click Make Latest.


The selected definition becomes active and the current active definition becomes inactive.

4.4.4 Copying Data Mapping Definition


This feature facilitates you to quickly create a new Data Mapping definition based on an existing one by
updating the required fields.
To copy a Data Mapping definition follow these steps:
1. From the Data Mappings window, select the Data Mapping definition that you want to copy and
click Copy.
The Data Mapping window is displayed.
2. Enter Code and Name for the definition. Additionally, modify the required fields.
For more information, see Creating Data Mapping Definition section.

4.4.5 Viewing Data Mapping Definition


You can view individual Data Mapping definition details at any given point.
To view the existing Data Mapping definition:
1. From the Data Mappings window, select the Data Mapping definition that you want to view and click
View.
The Data Mapping window is displayed.
2. The Data Mapping window displays the details of the selected Data Mapping definition.
The Audit Panel section at the bottom of the window displays creation and modification information
of the Data Mapping definition. The Comments section displays additional information or notes
added for the definition, if any.

4.4.6 Deleting Data Mapping Definitions


This option allows you to delete a Data Mapping definition. However, this is a soft deletion only. To
permanently delete a definition from the system, you need to purge it.
To delete a Data Mapping definition:
1. From the Data Mapping window, select the Data Mapping definition that you want to delete and
click Delete. You can select multiple definitions for deletion.
A confirmation message is displayed.
2. Click Yes to confirm deletion or No to cancel deletion.

4.4.7 Purging Data Mapping Definitions


This option allows you to remove deleted Data Mapping definitions permanently from the system. You
must have the DMTADMIN user role mapped to your user group.

To purge Data Mapping definitions:


1. Search for the Deleted Data Mapping definitions by selecting Deleted from the Record Status
drop-down list in the Data Mappings window and click Search.
2. Select the required Data Mapping definitions you want to permanently remove from the system and
click Purge.
3. Click OK to confirm purging.

4.5 Post Load Changes


Post Load Changes (PLC) refers to a rule describing the conversion of data from sources to Staging, or
from Staging to Processing (destination) tables. During data extraction, a Post Load Changes rule
helps in structuring the required data from sources to the target or intermediate systems for further
processing. Based on the selected mode, Post Load Changes can be applied to execute the process
successfully.
Post Load Changes within the Data Management Tools framework allows you to define transformations to
the source data before extracting/loading it to the target database to populate the Data Warehouse.
The User Roles mapped to the Post Load Changes module are as follows:
• PLCACCESS
• PLCREAD
• PLCWRITE
• PLCPHANTOM
• PLCAUTH
• PLCADV

For all the roles, functions and descriptions, see Appendix A.

Figure 53: Post Load Changes window

The Post Load Changes Summary window displays the list of pre-defined Post Load Changes definitions
with details such as Code, Name, Type, Created By, Creation Date, Version, and Active status. You can add,
view, modify, authorize, delete or purge Post Load Changes definitions. Note that copy functionality is not
yet available. You can make any version of a Post Load Changes definition as latest. For more information,
see Versioning and Make Latest Feature.
To sort the fields, hover over the column heading and click the sort icons to sort in ascending or
descending order.
You can search for a Post Load Changes definition based on Code, Name, Type, and Record Status
(Active, Inactive or Deleted). In the Search and Filter pane, enter the details of the Post Load Changes
definition you want to search in the respective fields and then click Search.

4.5.1 Creating Post Load Changes Definition


This feature allows you to create a Post Load Changes definition based on a Transformation, Stored
Procedure, or External Library.
The Post Load Change window helps you define Post Load Changes. You can create three types of
Transformations, as follows:
• Insert/Update Transformation
• Stored Procedure Transformation
• External Library


4.5.1.1 Specifying Transformation Definition Details


1. From the Post Load Changes window, click Add.
The Post Load Change window is displayed.

Figure 54: Post Load Change window

The ID is automatically generated once you create the Post Load Changes definition. The Folder field is not
enabled.
2. Enter a distinct Code to identify the transformation definition. Ensure that the code is alphanumeric
with a maximum of 50 characters in length and there are no special characters except underscore
“_”.
3. Enter the Name of the transformation definition.
4. Enter a Description for the transformation definition.
5. Select the PLC Type from the drop-down list. The options are:
 Insert Transformation
 Update Transformation
 Stored Procedure
 External Library

4.5.1.2 Adding Parameter Definition


1. Click the Add icon in the Parameter Definition pane. A new row is inserted, which allows you to define
the run-time parameters for the transformation.


Figure 55: Parameter Definition pane

2. Enter the parameter name.


3. Click the Data Type cell and select the required Data Type from the list by using the List icon.
The supported data types are Integer, Decimal, Number, Char, Varchar2, and Date.
4. Double-click the Default Value cell and enter the default value for the parameter.
You can add more parameters by inserting additional rows and entering the appropriate details.
To edit a Parameter Name or Default Value, double-click the required cell and edit the value.
Additionally, you can delete a parameter by selecting the row and clicking the Delete icon.

4.5.1.3 Insert/Update Transformation


Insert/Update Transformation facilitates you to define transformation parameters; create expression with
source, destination, and join/filter conditions; add transformation logic and query the SQL Rule generated.
To insert or update a transformation:
1. Select Insert Transformation or Update Transformation from the Type drop-down list in the PLC
Type pane.
2. Enter the details in the Source Shuttle pane as tabulated:

Figure 56: Source Shuttle pane

The following table describes the fields in the Source Shuttle pane.

Table 17: Source Shuttle pane Field and its Description

Field Description

Fields marked with a red asterisk (*) are mandatory.


Source Click Source Entity Selection. The Source Entities window is displayed.

• Select the entities from the Available Values list and click the Move icon.

• Search for a specific entity by entering the keywords and clicking the
Search icon. You can also deselect an entity by selecting it in the Selected
Values list and clicking the Move Back icon.

• Click the Move All icon to select all entities, or click the Move All Back
icon to remove all the selected entities.
• Click OK.

Join/Filter Click the adjacent icon to define the join or filter condition for the source
entities. The Expression Builder window is displayed.
For more information, see Expression Builder.

Destination Select the destination entity from the drop-down list.

3. From the Transformation Logic pane, perform the following tasks to add the transformation logic:

Figure 57: Transformation Logic pane

a. Click the Add icon; a new row is added.


b. Double-click the Target Column cell and enter the target column name.
c. Double-click the equal to cell and select =.
d. Double-click the Value cell and enter the value to define the transformation logic.
e. Double-click the filter cell and enter the filter criteria if you want to apply filter for the
transformation logic.


f. Click Generate Logic to generate the transformation logic and view the SQL query in the Query
Generated grid.

NOTE The Generate Logic button is enabled only if your user group
is mapped to the User Role DTADV.

4. Click Check Syntax (adjacent to the Save button) to check the syntax of the query generated.
5. Click Save to save the definition.
The Post Load Changes definition is added to the Post Load Changes Summary window.
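For illustration, an Insert Transformation combining the source entity, target column mappings, and a filter condition typically produces a query of the following shape in the Query Generated grid. All table and column names here are hypothetical placeholders, not seeded OFSAA names:

```sql
-- Illustrative query generated for an Insert Transformation.
-- stg_product_master, dim_product, and the columns are hypothetical names.
INSERT INTO dim_product (v_prod_code, v_prod_desc)
SELECT s.v_prod_code,
       UPPER(s.v_prod_desc)          -- Value entered against the target column
  FROM stg_product_master s          -- Source entity chosen in the Source Shuttle
 WHERE s.f_active_flag = 'Y';        -- Join/Filter condition from the Expression Builder
```

Use Check Syntax to validate the generated query before saving the definition.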

4.5.1.4 Stored Procedure Transformation


The Stored Procedure Transformation feature facilitates you to define complex transformations involving
multiple tables which are contained in a pre-defined stored procedure/function. The recommended
method is to use CALL <function name>, provided the function is present in the Atomic Schema.
To define a Stored Procedure Transformation:
1. Select Stored Procedure from the Type drop-down list in the PLC Type pane.
2. Add the parameters as explained in the Adding Parameter Definition section.

Figure 58: Stored Procedure Editor pane

3. In the Stored Procedure Editor field, enter the CALL function to invoke the function stored in the
Atomic Schema. You can also enter the SQL block of the stored procedure/function. Ensure that all
the parameters used in your stored procedure are added from the Parameter Definition grid. Every
function you create should contain BatchID (VARCHAR2) and MisDate (VARCHAR2) as the first two
parameters.


NOTE In case of CALL function, do not add BatchID (VARCHAR2) and


MisDate (VARCHAR2) as Parameters from the Parameter
Definition grid since these two mandatory parameters are
appended while creating the procedure.
If you want to pass Task_ID or Infodom name to the stored
procedure/function, define a parameter and explicitly pass the
parameter value as TASKID or INFODOM from ICC or RRF.
During execution, TASKID will be replaced with the task ID and
INFODOM will be replaced with the Information Domain name.

4. (Optional) Click Check Syntax (adjacent to the Save button) to check the syntax of the stored
procedure.
5. Click Save to save the Stored Procedure Transformation definition.
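As a sketch of such a function stored in the Atomic Schema, the following PL/SQL is illustrative. The function name, table, and columns are hypothetical; only the rule that BatchID and MisDate come first as VARCHAR2 parameters is from this guide:

```sql
-- Hypothetical function for a Stored Procedure Transformation.
-- BatchID and MisDate (both VARCHAR2) must be the first two parameters.
CREATE OR REPLACE FUNCTION fn_plc_update_rates (BatchID  VARCHAR2,
                                                MisDate  VARCHAR2,
                                                p_factor VARCHAR2)  -- run-time parameter from the Parameter Definition grid
RETURN INTEGER AS
BEGIN
  UPDATE stg_exchange_rates            -- hypothetical staging table
     SET n_exchange_rate = n_exchange_rate * TO_NUMBER(p_factor)
   WHERE d_mis_date = TO_DATE(MisDate, 'YYYYMMDD');  -- assumed date format
  COMMIT;
  RETURN 1;
END;
/
```

If the function already exists in the Atomic Schema, the Stored Procedure Editor would instead contain only CALL fn_plc_update_rates(), with BatchID and MisDate appended automatically as described in the note above.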

4.5.1.5 External Library


An External Library consists of built-in functions and procedures, which enable you to define complex SQL
Rule Transformations that are compiled and stored as an executable file. You can load the External Library
procedures and functions using the transformation wizard.
To define External Library Transformation:
1. Select External Library from the Type drop-down list in the PLC Type pane.
2. Add the parameters as explained in the Adding Parameter Definition section.

Figure 59: External Library detail pane

3. In the External Library detail grid, enter the name of the executable library file (.sh file) located in the
default ficdb/bin path in the External Library field. You can also specify the full path up to the file
name.
4. Click Save to save the External Library Transformation definition.

4.5.2 Versioning and Make Latest Feature


When a new definition is created, it is saved as version 1. After you modify and save a definition, it will be
saved with version as highest version +1. That is, if you modify version 2, which is the highest version
available, and save it, the version becomes 3.
To make any older version as latest:

1. From the Post Load Changes Summary window, turn OFF the Active toggle button and click
Search. All inactive definitions are displayed.


2. Select the required definition and click Make Latest.


The selected definition becomes active and the current active definition becomes inactive.

4.5.3 Modifying Post Load Changes Definition


This option allows you to update the Post Load Changes definitions. You cannot modify inactive versions
of a Post Load Changes definition. To make an inactive version active, you must make that version the
latest.

To modify a Post Load Changes definition:


1. From the Post Load Changes Summary window, select the definition you want to modify and click
Edit.
2. Modify the required details.
For more information, see Creating Post Load Changes Definition.

NOTE • A PLC code in a PLC type stored procedure with a
CALL function can have a maximum of 25 characters.
• A PLC code in a PLC type stored procedure with a
CREATE/REPLACE function can have a maximum of
27 characters.
In both scenarios, you cannot modify seeded definitions
whose code exceeds the permissible length.

3. Click Save. The definition will be saved as highest version +1. That is, if you are modifying a
definition of version number as 3 and the highest version available is 5, the definition will be saved
as version 6.

4.5.4 Viewing Post Load Changes Definition


This option allows you to view individual Post Load Changes definition details at any given point.
To view the existing Post Load Changes definition:
1. From the Post Load Changes Summary window, select the Post Load Changes definition that you
want to view and click View.
The Post Load Changes window is displayed.
2. The Post Load Changes window displays the details of the selected definition.

4.5.5 Deleting Post Load Changes Definition


This option allows you to delete a Post Load Changes definition. However, this is a soft deletion only. To
permanently delete the definition from the system, you must purge it.
To delete a Post Load Changes definition:


1. From the Post Load Changes Summary window, select the definition you want to delete and click
Delete.
You can select multiple definitions for deletion.
2. Click OK in the information dialog to confirm deletion.

4.5.6 Purging Post Load Changes Definitions


This option allows you to remove deleted Post Load Changes definitions permanently from the system.
You must have the DMTADMIN user role mapped to your user group.
To purge the Post Load Changes definitions:
1. From the Post Load Changes Summary window, search for the deleted PLC definitions by selecting
Deleted from the Record Status drop-down list in the Post Load Changes window and click
Search. The deleted PLC definitions are displayed.
2. Select the required PLC definitions you want to permanently remove from the system and click
Purge.
3. Click OK to confirm purging.

4.6 User Defined Functions


This feature allows you to register Hive Permanent and Temporary User Defined Functions that can be
used in Expression Builders in OFSAAI.
Hive supports many built-in SQL-like functions in HiveQL. However, a few functions that are available in
Oracle are not yet supported in Hive. OFSAAI provides a Java implementation of such functions as custom
Hive UDFs.
• TO_NUMBER(String input [, String format])
The TO_NUMBER function converts String input to a value of NUMBER datatype.
• TO_DATE(String input, String format)
 The TO_DATE function converts input to a value of DATE datatype in the specified format.
 Native Hive to_date(String) function when format is not specified works as is, and expects the
input to be specified in yyyy-MM-dd [HH:mm:ss] format.
• TO_CHAR(Number/Date input [, String format])
The TO_CHAR function converts a Date, Number, or String input to a String expression in a
specified format.
• NVL2(T Input1, T Input2, T Input3)
NVL2 lets you determine the value returned by a query based on whether a specified expression is
null or not null. If Input1 is not null, then NVL2 returns Input2. If Input1 is null, then NVL2 returns
Input3.
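Once registered, these custom UDFs can be used in HiveQL like any built-in function. For example (the table and column names are placeholders):

```sql
-- Illustrative HiveQL using the OFSAAI custom UDFs.
SELECT TO_NUMBER(v_amount_txt),                  -- string converted to NUMBER
       TO_DATE(v_txn_date, 'dd-MM-yyyy'),        -- string to DATE, Java date format
       TO_CHAR(d_mis_date, 'yyyy-MM-dd'),        -- date to formatted string
       NVL2(v_alt_name, v_alt_name, v_name)      -- use v_name when v_alt_name is null
  FROM stg_transactions;                         -- hypothetical Hive table
```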
These functions are registered in OFSAAI and are available in the User Defined Functions Summary
window for use in metadata definitions. However, you must register the OFSAAI Hive UDF JAR in
the Hive server. The Hive UDF classes are packaged in the
$OFSAA_HOME/utility/DMT/UDF/lib/ofsaa-hive-udf.jar file. Copy the JAR to the
$HIVE_AUX_LIB path on the Hive server and then restart Hive services to use the functions in HiveQL.
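For a Temporary function, registration from the Hive console follows the pattern below. The class name mirrors the TO_CHAR class shown later in this section; the JAR path is a placeholder:

```sql
-- Illustrative temporary-function registration from the Hive console.
-- Not needed if the JAR is already in the Hive Auxiliary JARs path.
ADD JAR /path/to/ofsaa-hive-udf.jar;
CREATE TEMPORARY FUNCTION to_char
  AS 'com.ofs.aai.service.dmt.udf.custom.TO_CHAR';
```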

NOTE User Defined Functions support only Java Date format.

The Roles mapped for User Defined Functions are as follows:


• UDFACCESS
• UDFREAD
• UDFWRITE
• UDFPHANTOM
• UDFAUTH
• UDFADV
For all the roles, functions, and descriptions, see Appendix A.

Figure 60: UDF Summary window

The User Defined Functions Summary window displays the available UDFs with details such as Function
Name, Function Description, Origin, Type, and Category. You can add new UDFs, modify, view, and purge
existing UDFs.

4.6.1 Creating User Defined Functions (UDFs)


This option allows you to create Hive Permanent and Temporary User Defined Functions. Once
registered, the UDFs can be used in expression builders in OFSAAI (Data Mapping, Data Quality
Rules, Business Processor, Measure, Hierarchy, and Dataset).


4.6.1.1 Prerequisites
1. The UDF JAR must be present in the Hive Auxiliary JARs path.
To create an Auxiliary JAR path, see Cloudera Documentation on Creating Temporary Functions.
2. If you want to use Permanent functions, following are the additional prerequisites:
a. Create permanent functions as shown in the following example:
Execute the following command from Hive CLI/Hue/Hive browser:
CREATE FUNCTION toChar AS 'com.ofs.aai.service.dmt.udf.custom.TO_CHAR'
USING JAR 'hdfs:///path/to/jar';

NOTE Specify the schema name before the function name. If it is
omitted, the default schema is used.

b. Check if the UDF can be accessed through Hive Console.
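A quick check from the Hive console could look like this (the function was created in the preceding step; the table name is a placeholder):

```sql
-- Confirm the permanent function is visible and callable.
SHOW FUNCTIONS LIKE '*tochar*';
SELECT default.toChar(12345) FROM sample_table LIMIT 1;  -- sample_table is hypothetical
```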


To register User Defined Functions:
1. From the UDF Summary window, click Add from the toolbar.
The UDF Registration window is displayed.

Figure 61: UDF Registration window

2. Enter the details as tabulated:


The following table describes the fields in the UDF Registration window.

Table 18: Fields in the UDF Registration and their Descriptions

Field Description

Fields marked with a red asterisk (*) are mandatory.

Function Name Enter the function name.


Function Description Enter a description of the function.

Origin Select the Origin from the drop-down list. Only HIVE is supported now.

Type Select the function type from the drop-down list. The options are
TEMPORARY and PERMANENT.
Note: Permanent Functions must be saved individually from Hive
CLI/Hue/Hive browser before registering in OFSAAI using the UI.

Category Select the category of the function from the drop-down list.
For HIVE, the categories available are UDF, UDAF, and UDTF.

Function Arguments Enter the arguments to be passed for the function.


For example, STRING and INT.

Class Name Enter the class name of the function.

Return Type This field is not applicable for HIVE UDFs.

Jar Path This field is not applicable for HIVE UDFs.


Note: For HIVE, the jars should be present in the Hive Auxiliary JARs
directory.

3. Click Save.

4.6.2 Viewing UDFs


This option allows you to view the User Defined Functions.
To view UDF definitions:

1. From the UDF Summary window, select the UDF and click View from the toolbar.
The UDF Registration window is displayed.
2. You can view the details of the selected UDF definition.
3. Click Close.

4.6.3 Modifying the User Defined Functions


This option allows you to modify the Type, Function Arguments, and Return Type of User Defined
Functions.
To modify the User Defined Functions:

1. From the User Defined Functions Summary window, select the UDF and click Edit from the
toolbar.
The User Defined Functions Registration window is displayed.
2. Modify the required details.
For more information, see Creating User Defined Functions (UDFs).


4.6.4 Purging User Defined Functions


This option allows you to remove User Defined Functions from the system. You should have DMTADMIN
user role mapped to your user group.
To purge User Defined Functions:
1. From the User Defined Functions Summary window, select the required User Defined Functions you
want to permanently remove from the system and click Purge.
2. Click OK to confirm purging.

4.7 DMT Configurations


This section explains the configurations to be performed for a Data Mapping definition or PLC definition.
The role mapped to DMT Configurations is DMTADMIN. For the functions and descriptions, see Appendix
A.
• General Configurations if Big Data Processing License is enabled
• General Configurations if Big Data Processing License is not enabled
• Cluster Registration
• Performance Optimizations

4.7.1 General Configurations if Big Data Processing License is Enabled


Figure 62: DMT Configurations window

The following table describes the fields in the DMT Configurations window.

Table 19: Fields in the DMT Configurations window and their Description

Property Name Property Value

Generic


T2T Mode Select the mode of T2T to be used for execution of Data Mapping definition, from
the list. The options are Default (for Java engine) and CPP (for CPP engine).

H2T Mode Select the mode of H2T to be used for execution of Data Mapping definition, from
the list. The options are Default, Sqoop, and OLH.
OLH (Oracle Loader for Hadoop) must have been installed and configured in your
system. For more information on how to use OLH for H2T, see Oracle® Loader for
Hadoop (OLH) Configuration section in OFS Analytical Applications Infrastructure
Administration Guide.
Sqoop should have been installed and configured in your system. For more
information, see the Sqoop Configuration section in OFS Analytical Applications
Infrastructure Administration Guide. Additionally, you should register the cluster
information of the source Information domain using the Register Cluster tab.

T2H Mode Select the mode of T2H to be used for execution of Data Mapping definition, from
the list. The options are Default and Sqoop.
For the Default option, additional configurations are required, which is explained in
the Data Movement from RDBMS Source to HDFS Target (T2H) section in OFS
Analytical Applications Infrastructure Administration Guide. Additionally, you should
register the cluster information of the target Information domain using the Register
Cluster tab.
For the Sqoop option, Sqoop should have been installed and configured in your
system. For more information, see the Sqoop Configuration section in OFS
Analytical Applications Infrastructure Administration Guide. Additionally, you should
register the cluster information of the source Information domain using the Register
Cluster tab.

PLC Mode Select the mode of execution to be used for Post Load Changes definition, from the
list. The options are Default (for Java engine) and CPP (for CPP engine).


SCD MODE This field is applicable only if SCD uses a merge approach.
• DEFAULT_V1- Select this option to perform SCD execution using JAVA engine
with a single Merge query for both Update and Insert. This is the default
execution mode.
• DEFAULT_V2- Select this option to perform SCD execution using JAVA engine
with a Merge query for updates and Insert query for inserts. Since Insert is a
separate query, the sequence used for SKEY will be incremented only for the
required records making the SKEY column value continuous.
• CPP_V1- Select this option to perform SCD execution using CPP engine with a
single Merge query for both Update and Insert. This is the default execution
mode.
• CPP_V2- Select this option to perform SCD execution using CPP engine with a
Merge query for updates and Insert query for inserts. Since Insert is a separate
query, the sequence used for SKEY will be incremented only for the required
records making the SKEY column value continuous.
• BACKDATED_V1-Backdated support for CPP_V1.
• BACKDATED_V2- Backdated support for CPP_V2.

Note: For Backdated Executions containing Type 2 column mappings, the following
column mappings are mandatory:
• Start date
• End date

Is Hive Local This is applicable for T2H and F2H.
Select Yes if HiveServer is running locally to OFSAA; otherwise, select No from the
drop-down list.

Validate Definition Select Yes to validate the SQL Query of the Data Mapping definition on save.
Query on Save

Generic Working Specify the path of the HDFS working directory for generic operations. By default,
Directory the path is set as /user/ofsaa/GenericPath.

Allow Pre806 Data This field is applicable only in case of upgrade from an earlier version to 8.1.0.0.0
File Path version. If yours is a fresh installation of 8.1.0.0.0 version using Full installer, this
field is not applicable.
For F2T, the path for Data File in versions before 8.0.6.0.0 is
/<ftpshare>/STAGE/<FileBasedSource>/<MISDate>/<dataFile.dat>. In 8.1.0.0.0, it
is changed to /ftpshare/<INFODOM>/dmt/source/<Data Source
Code>/data/<MISDATE>/<dataFile.dat>.
Select Yes to allow the old Data File path in 8.1.0.0.0 version.

SMG Mode By default, the Source Model Generation (SMG) mode is set as Dictionary.
When SMG Mode is selected as Dictionary, the time taken for generating Source
models of Views from the database is optimized.
Select Default for the earlier mode.


Allow Pre806 T2F In the versions before 8.0.6.0.0, the T2F extract file path is
File Path <ftpshare>/STAGE/<SOURCE_CODE>/<MISDATE>.
Select Yes, if you want to set the preceding extract path.
If you select No, the extract file path is set to
<ftpshare>/<INFODOM>/dmt/def/<DEFINITION_CODE>/<BATCH_ID
>_<TASK_ID>/<MISDATE>.
Sqoop
(This section is applicable only if you select Sqoop for T2H Mode or H2T Mode.).

Sqoop Mode Select Client to execute Sqoop in client mode or select Cluster to execute Sqoop in
cluster mode, from the drop-down list.
If you select Cluster as Sqoop Mode, you should register the cluster from Register
Cluster tab. For more details, see Registering a Cluster.
Note: Copying of any Sqoop jars and Hadoop/Hive configuration XMLs to OFSAAI
is not required in cluster mode.

Sqoop Working Specify the path of the HDFS working directory for Sqoop related operations.
Directory

WebLog
(This section is applicable only for L2H)

Keep Weblog Select Yes or No from the drop-down list.


Processed File Yes- The working directory will be retained with the processed WebLog files. If the
data loading was successful, the WebLog file name will be appended with
Processed. Else, the WebLog file name will be appended with Working.
No- The working directory will be deleted after data loading.

Weblog Temp File Enter the extension of the Weblog temporary file.
Ext

Weblog Working Enter the name of the temporary working directory in HDFS.
Directory

File Encryption

Encryption At rest Select Yes from the drop-down list, if encryption is required for T2F or H2F and
decryption is required for F2T or F2H.

Key File Name Enter the name of the Key File that you used for encrypting the Data File.

Key File Path Enter the absolute path of the Key File that you used for encrypting the Data File.

NOTE You can use the BackendServerProperties.conf in the


ficdb/conf layer to support the required Timezone and Time
Format in the CPP logs.


4.7.2 General Configurations if Big Data Processing License is not Enabled

Figure 63: DMT Configurations window

The following table describes the fields in the DMT Configurations window.

Table 20: Fields in the DMT Configuration window and their Description

Property Name Property Value

Generic

T2T Mode Select the mode of T2T to be used for execution of Data Mapping definition, from the list.
The options are Default (for Java engine) and CPP (for CPP engine).

PLC Mode Select the mode of execution to be used for the Post Load Changes definition, from
the list. The options are Default (for Java engine) and CPP (for CPP engine).

SCD MODE This field is applicable only if SCD uses a merge approach.
• CPP_V1- Select this option to perform execution using a single Merge query for both
Update and Insert. This is the default execution mode.
• CPP_V2- Select this option to perform execution using Merge query for updates and
using Insert query for inserts. Since Insert is a separate query, the sequence used for
SKEY will be incremented only for the required records making the SKEY column
value continuous.
• BACKDATED_V1-Backdated support for CPP_V1.
• BACKDATED_V2- Backdated support for CPP_V2.

Note: For Backdated Executions containing Type 2 column mappings, the following column
mappings are mandatory:
• Start date
• End date

Validate Definition Select Yes to validate the SQL Query of the Data Mapping definition on save.
Query on Save


Allow Pre806 Data This field is applicable only in case of upgrade from an earlier version to 8.0.6.0.0 version
File Path and above. If yours is a fresh installation of 8.1.0.0.0 version using Full installer, this field
is not applicable.
For F2T, the path for Data File in versions before 8.0.6.0.0 is
/<ftpshare>/STAGE/<FileBasedSource>/<MISDate>/<dataFile.dat>.
In 8.0.6.0.0, it is changed to /ftpshare/<INFODOM>/dmt/source/<Data
Source Code>/data/<MISDATE>/<dataFile.dat>.
Select Yes to allow the old Data File path in 8.1.0.0.0 version.

SMG Mode By default, the Source Model Generation (SMG) mode is set as Dictionary.
When SMG Mode is selected as Dictionary, the time taken for generating Source models
of Views from the database is optimized.
Select Default for the earlier mode.

File Encryption

Encryption At rest Select Yes from the drop-down list, if encryption is required for T2F and decryption is
required for F2T.

Key File Name Enter the name of the Key File, which you used to encrypt the Data File.

Key File Path Enter the absolute path of the Key File, which you used to encrypt the Data File.

NOTE You can use the BackendServerProperties.conf in the


ficdb/conf layer to support the required Timezone and Time
Format in the CPP logs.

4.7.3 Cluster Registration


This is required only if you have enabled Big Data Processing within your application pack.
Cluster registration is required for creating Data Sources based on HDFS Files or WebLogs in the HDFS
cluster, and also for using Sqoop in Cluster mode.


Figure 64: DMT Configurations window – Register Cluster

This window allows you to register a new cluster, modify, view, copy, or delete an existing cluster. You can
search for a cluster based on Name.
To sort the fields, hover over the end of the column heading and click the Sort Ascending icon to sort in
the ascending order, or the Sort Descending icon to sort in the descending order.

4.7.3.1 Registering a Cluster


This option allows you to register a cluster.

NOTE In case of T2H, cluster details should be given against target


Infodom name, and in case of H2T, cluster details must be
given against source name.

To register a cluster:
1. From the Register Cluster tab in the DMT Configurations window, click Add. The Cluster
Configurations window is displayed.


Figure 65: Cluster Configurations window

2. Enter the details as tabulated.


The following table describes the fields in the Cluster Configurations window.

Table 21: Fields in the Cluster Configurations window and their Descriptions

Field Name Description

Generic

Name Enter a unique name for the cluster.

Description Enter a brief description of the cluster.

Details
(This section is not applicable for Sqoop Cluster mode.)

Authentication Type Enter the authentication type:


• KRB- Kerberos with Key Tab for secured cluster
• DEFAULT- for non-secured cluster

Configuration File Path Enter the path where Kerberos Configuration files such as core-
site.xml, hdfs-site.xml reside.

Principal Enter the Kerberos Principal name.

Keytab File Name Enter the name of the Key Tab file.

KRB5 Conf File Name Enter the name of the Kerberos Realm file.

Core Configuration XML Enter the name of the core-site.xml file.

HDFS Configuration XML Enter the name of the hdfs-site.xml file.

MapReduce Configuration XML Enter the name of the mapred-site.xml file.

Yarn Configuration XML Enter the name of the yarn-site.xml file.

Hive Configuration XML Enter the name of Hive configuration XML file.


SSH Details
(This section is applicable only for Sqoop in Cluster mode.)

SSH Server Name Enter the IP address of the node having Sqoop client installed.

SSH Port Enter the SSH port on the node, usually 22.

SSH Auth Alias Select the Auth Alias entered for SSH server from the drop-down
list.

3. Click Save.

4.7.4 Performance Optimizations


This feature allows you to externalize the Optimization parameters like Source Hint, Source Prescript,
Target Hint, and Target Prescript for OOB metadata definition. Since these parameters are external to the
metadata definition, they will not be overridden by OOB metadata during an upgrade, and as a result the
customized data will remain intact.
The Optimization parameters can be set from the following windows:
1. From the Data Mapping window, while creating the Data Mapping definition.
2. From the DMT Configurations>Optimizations tab, set the optimization parameters in the
Performance Parameter Table (AAI_DMT_PERFORMANCE_PARAMS) at following levels:
 OFSAA_INSTANCE level
 INFODOM level
 Definition level
3. From the Task Definition window, at execution parameter level.
For more information, see Component: LOAD DATA section.
Precedence
Following is the precedence in the descending order:
1. Task level square bracket parameters from the Task Definition window
2. Definition level parameters from DMT Configurations>Optimizations tab
3. Definition level parameters from the Data Mapping window
4. INFODOM level parameters from DMT Configurations>Optimizations tab
5. OFSAA_Instance/setup level parameters from DMT Configurations>Optimizations tab


NOTE:
1. Precedence is at the parameter level and not at the definition level (record level). That is, you can override only the Target Hint from the Optimizations tab and still use the Target Prescript from the Data Mapping definition.
2. For the CPP engine, the OracleDB.conf parameters are fired first, and then the Optimization parameters from the Performance Parameter table (AAI_DMT_PERFORMANCE_PARAMS) are fired.
3. For the ORACLE database, Prescripts must start with ALTER SESSION, and for the HIVE database, Prescripts must start with SET; otherwise, they are skipped.
4. Source Hint and Source Prescript are not relevant at the Infodom and OFSAA Instance levels.
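The prescript rule in note 3 can be expressed as a small validation check. This is a hypothetical sketch (not OFSAA code) of the test an engine might apply before firing a prescript:

```python
def prescript_is_valid(db_type, prescript):
    """Per the note: Oracle prescripts must start with ALTER SESSION,
    Hive prescripts must start with SET; anything else is skipped."""
    s = prescript.strip().upper()
    if db_type == "ORACLE":
        return s.startswith("ALTER SESSION")
    if db_type == "HIVE":
        return s.startswith("SET")
    return False

print(prescript_is_valid("ORACLE", "alter session enable parallel dml"))  # True
print(prescript_is_valid("HIVE", "set hive.exec.parallel=true"))          # True
print(prescript_is_valid("ORACLE", "set hive.exec.parallel=true"))        # False, skipped
```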

Figure 66: DMT Configurations window - Optimizations

The Optimizations tab displays all active Data Mapping definitions available in the setup. Additionally, an entry for the OFSAA instance and each Information Domain will also be present. It displays Data Mapping definition details such as Code, Name, Source Prescript, Source Hint, Target Prescript, and Target Hint. You can edit, view, and delete performance parameters.

4.7.4.1 Configuring Performance Parameters


This option allows you to externalize performance parameters like Source Hint, Source Prescript, Target
Hint and Target Prescript for OOB metadata definition.

• For T2T- Source Hint, Source Prescript, Target Hint, and Target Prescript are applicable.
• For T2F - Source Hint and Source Prescript are applicable.
• For F2T - None of these parameters are supported.
To configure Performance Parameters:
1. From the Optimizations tab in the DMT Configurations window, select the required Data Mapping
definition for which you want to configure performance parameters and click Edit. The
Performance Parameters window is displayed.

Figure 67: Performance Parameters

2. Specify Source Prescript or Target Prescript if you want to use it. Prescripts are supported for all
HIVE based target Infodoms, that is, H2H and T2H. In case of H2T, the prescripts are fired on the
source.
For more information, see Prescripts.
3. Specify Source Hint and Target Hint (if any) for faster loading. Oracle hints follow the
/*+ HINT */ format.
The mapping-level hint is applicable for T2T, H2T, and H2H only.
For example, /*+ PARALLEL */.
4. Click Save.

4.8 Slowly Changing Dimensions (SCD)


A Slowly Changing Dimension (SCD) is a dimension that stores and manages both current and historical
data over time in a data warehouse. There are three types of SCDs:
Type 1 SCDs - Overwriting
In a Type 1 SCD, the new data overwrites the existing data. Thus the existing data is lost as it is not stored
anywhere else. No additional information is to be specified to create a Type 1 SCD.
Type 2 SCDs - Creating another dimension record
A Type 2 SCD retains the full history of values. When the value of a chosen attribute changes, the current
record is closed. A new record is created with the changed data values and this new record becomes the
current record. Each record contains the effective time and expiration time to identify the time period
between which the record was active.
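The Type 2 behavior described above can be sketched as closing the current record and opening a new one whenever a tracked attribute changes. The record layout below is hypothetical, purely for illustration; actual dimension column names are covered later in this chapter:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open" expiration date

def apply_type2_change(history, key, new_value, effective):
    """Close the current record for `key` and append a new current record."""
    for rec in history:
        if rec["key"] == key and rec["end_date"] == HIGH_DATE:
            rec["end_date"] = effective          # expire the current record
    history.append({"key": key, "value": new_value,
                    "start_date": effective, "end_date": HIGH_DATE})

dim = [{"key": 1, "value": "Gold", "start_date": date(2023, 1, 1),
        "end_date": HIGH_DATE}]
apply_type2_change(dim, 1, "Platinum", date(2024, 4, 1))
print(len(dim))              # 2 records: full history retained
print(dim[0]["end_date"])    # 2024-04-01 (closed record)
print(dim[1]["end_date"])    # 9999-12-31 (current record)
```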

Type 3 SCDs - Creating a current value field


A Type 3 SCD stores two versions of values for certain selected level attributes. Each record stores the
previous value and the current value of the selected attribute. When the value of any of the selected
attributes changes, the current value is stored as the old value and the new value becomes the current
value.
OFSAA supports Type 1 and Type 2 SCDs. You can define and manage SCD metadata using the
Slowly Changing Dimension window. For information on constraints and assumptions of SCD execution on
Hive Information Domain, see SCD execution on Hive Information Domain and Heterogeneous Support for
SCD to RDBMS sections in OFS Analytical Applications Infrastructure Administration Guide.
The Roles mapped for Slowly Changing Dimensions module are as follows:
• SCDACCESS
• SCDREAD
• SCDWRITE
• SCDPHANTOM
• SCDAUTH
• SCDADV

Figure 68: Slowly Changing Dimension Summary window

The Slowly Changing Dimension Summary window displays the available SCDs with details such as Map
Reference Number, Table Name, Stage Table Name, and Source Priority. You can add new SCDs, modify,
view, and purge existing SCDs.
You can search for an SCD based on Stage Table Name, Dimension Table Name, and Map Reference
Number.

4.8.1 Creating Slowly Changing Dimension


This option allows you to create a new SCD entry.
To create an SCD:
1. From the Slowly Changing Dimension Summary window, click Add.
The Slowly Changing Dimension window is displayed.

Figure 69: Slowly Changing Dimension window

2. Enter the details as tabulated. The following table describes the fields in the Slowly Changing Dimension window.

Table 21: Fields in the Slowly Changing Dimension window and their Description

Field Name Description

Define SCD

Map Reference Number Enter a Mapping Reference Number for this unique mapping of a
Source to a Dimension Table. The supported numbers are from 0
to 999.
If it is given as -1, SCD will execute for all Map Reference Numbers.

Stage Table Name Enter the stage table name.

Source Priority Enter the priority of the source when multiple sources are mapped
to the same target.

Table Name Enter the dimension table name, whose record needs to be
updated.

SCD Details

Source Type Enter the type of the Source for a Dimension, that is, Transaction or
Master Source.

Source Key Enter the Source Key.

Data Offset Enter the offset for calculating the Start Date based on the File
Received Date.

Source Process Sequence Enter the sequence in which the various sources for the
DIMENSION will be taken up for processing.

3. From the Column Mapping tab, click the add button. A new row is added.
4. Double-click each cell to edit it. Enter the following details for each record.
The following table describes the fields in the Slowly Changing Dimension window.

Table 22: Fields in the Slowly Changing Dimension window and their Description

Column Name Description

Sr. No. Enter a unique serial number.

Stage Column Name Enter the stage column name.

Column Name Enter the Column name in the Dimension Table.

Column Type Enter the type of the column. For information on the possible
values, see Column Types.
You must enter information about at least the following column
types:
PK - Primary Key, SK - Surrogate Key, SD - Start Date, LRI - Latest
Record Indicator, ED - End Date, DA - Dimensional Attribute, and
MD - MIS Date.

Column Datatype Enter the column data type.

SCD Type Enter the SCD type. The options are:


• 1 – Type I SCD
• 2 – Type II SCD
• NULL – No SCD handling for such attributes
For information on different SCD types, see SCD Types.

Priority Lookup Required Specify whether Lookup is required for Priority of Source against
the Source Key Column or not. The possible values are Y and N.

Column Format Enter the format of the column.

5. Click the Optimizations tab to add optimizer hints for merge execution mode.

Figure 71: Optimizations tab

6. Enter statement-level optimizer hints for the merge statement in the Source Hint field.
7. Enter statement-level optimizer hint for the select statement in merge in the Merge Hint field.
8. Enter alter statements to enable session level execution before merge statement in the Session
Enable Statement field.

Format: "<enable stmt1>","<enable stmt2>"


For example: "alter session enable parallel dml","alter session enable parallel query"
9. Enter alter statements to disable session-level execution after the merge statement in the Session Disable Statement field.
Format: "< disable stmt1>","< disable stmt2>"
For example: "alter session disable parallel dml","alter session disable parallel query"
10. Click Save.
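The session statement fields above take a quoted, comma-separated list of statements. As a hypothetical sketch (not OFSAA code), that format can be split into individual ALTER statements with standard CSV parsing:

```python
import csv
import io

def split_session_statements(field_value):
    """Split '"<stmt1>","<stmt2>"' into a list of individual statements."""
    reader = csv.reader(io.StringIO(field_value), skipinitialspace=True)
    return next(reader)

stmts = split_session_statements(
    '"alter session enable parallel dml","alter session enable parallel query"'
)
print(stmts)  # two separate ALTER SESSION statements
```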

4.8.1.1 Column Types


The possible values for column type in SYS_STG_JOIN_MASTER are:
1. PK – Primary Dimension Value (may be multiple for a given “Mapping Reference Number”)
2. SK – Surrogate Key
3. DA – Dimensional Attribute (may be multiple for a given “Mapping Reference Number”)
4. DS – Works the same as DA; additionally inserts descriptions for default entries (MSG and OTH) into DS type columns
5. SD – Start Date
6. ED – End Date
7. LRI – Latest Record Indicator (Current Flag)
8. CSK – Current Surrogate Key
9. PSK – Previous Surrogate Key
10. SS – Source Key
11. LUD – Last Updated Date / Time
12. LUB – Last Updated By
13. NN – Not Null columns
14. MD – MIS Date

NOTE:
• For records of Column type SK, the value of STG_COL_NM for that record should be SEQUENCE_NAME.nextval. The name of the sequence can be of the form SEQ_DIMTABLENAME, and it has to be created before executing the SCD.
• For records of Column type DA (the value of COL_TYP in SYS_STG_JOIN_MASTER is DA), the value of the column SCD_TYP_ID should be set to 1 or 2 (depending upon the SCD type). Since SKEY is a sequence, it is available only in the dimension table and cannot be considered for the change in the values of the fields; hence, for any non-DA columns, SCD_TYP_ID cannot be set to 1 or 2 and must be set to NULL.
• For records of Column type ED, the value that goes into the column STG_COL_NM should be '31-dec-9999'.
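The constraints in the note above can be captured as a small validation sketch. This is illustrative only; the function and the sequence name are placeholders, not OFSAA APIs:

```python
def validate_scd_column(col_type, stg_col_nm=None, scd_typ_id=None):
    """Check a column mapping row against the rules in the note above."""
    if col_type == "SK":
        # Stage column must reference a sequence, e.g. SEQ_DIM_ACCOUNT.nextval
        return stg_col_nm is not None and stg_col_nm.upper().endswith(".NEXTVAL")
    if col_type == "DA":
        return scd_typ_id in (1, 2)      # only DA columns carry the SCD type
    if col_type == "ED":
        return stg_col_nm == "'31-dec-9999'"
    return scd_typ_id is None            # non-DA columns must leave SCD_TYP_ID NULL

print(validate_scd_column("SK", stg_col_nm="SEQ_DIM_ACCOUNT.nextval"))  # True
print(validate_scd_column("DA", scd_typ_id=2))                          # True
print(validate_scd_column("PK", scd_typ_id=1))                          # False
```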

4.8.2 Executing SCDs


You can execute SCDs through the Operations (ICC) module or the Rule Run Framework (RRF).

4.8.2.1 SCD Execution using Operations Module


This section is applicable for SCDs defined on an RDBMS source and RDBMS target (T2T) or a HIVE source and HIVE target (H2H).
To execute SCDs from Operations:
1. From the Batch Maintenance window, create a new Batch.
For more information, see Adding Batch Definition section.
2. Create a task with Task parameters as shown:

Figure 72: Task Definition pane

• The Executable field format is <SCD_Name>,<Map_Reference_Number>. For example, SCD,1
• Set the Batch parameter as Y for all cases.
• If Wait is set as Y, then Run executable waits for the SCD component to finish task execution and then updates the task status.
3. Click Save.
4. Execute the Batch.

4.8.2.2 SCD Execution using RRF


This section is applicable only for SCDs defined on an RDBMS source and RDBMS target (T2T) or a HIVE source and HIVE target (H2H).
To execute SCDs using RRF:
1. Navigate to the RRF module and define a Run with Job as Executable.
2. Click the button adjacent to the component name. The Parameters window is displayed.

Figure 73: Parameters window

3. Specify Parameters in the following format:
"scd","<Map Reference Number>"
For example, "scd","1"

4.8.3 SCD Execution for Heterogeneous Support


Assumptions:
1. The DIM table in Hive and RDBMS should have the same table and column names; the column order may differ, but the data types must match.
2. You need to log in to the ICC/RRF pages from the source Infodom, that is, the Hive Infodom.
3. You need to pass two additional parameters, DBSERVERNAME and DBSERVERIP, while invoking the SCD using the Run Executable component.
For SCD execution from the Operations (ICC) module, the Executable format is as follows:
<SCD EXECUTABLE NAME>,<REFERENCE NUMBER>,<TARGET RDBMS NAME>,<TARGET RDBMS SERVER>
For example: scd,78,ofsaatm,192.168.1.0
From RRF, specify Parameters in the format:

"<SCD EXECUTABLE NAME>","<REFERENCE NUMBER>","<TARGET RDBMS NAME>","<TARGET RDBMS SERVER>"
For example: "scd","78","ofsaatm","192.168.1.0"
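The two invocation formats above differ only in quoting. As a hypothetical helper (not part of OFSAA), both strings can be built from the same parts:

```python
def build_icc_executable(parts):
    """ICC Executable format: comma-separated, unquoted."""
    return ",".join(parts)

def build_rrf_parameters(parts):
    """RRF Parameters format: each part double-quoted, comma-separated."""
    return ",".join(f'"{p}"' for p in parts)

parts = ["scd", "78", "ofsaatm", "192.168.1.0"]
print(build_icc_executable(parts))   # scd,78,ofsaatm,192.168.1.0
print(build_rrf_parameters(parts))   # "scd","78","ofsaatm","192.168.1.0"
```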

4.8.4 Modifying SCD Definition


This option allows you to update the SCD definition.
To modify an SCD definition, follow these steps:
1. From the Slowly Changing Dimension Summary window, select the definition you want to modify
and click Edit.
2. Modify the required details.
For more information, see Creating Slowly Changing Dimension section.
3. Click Save.

4.8.5 Viewing SCD Definition


You can view individual SCD definition details at any given point.
To view an existing SCD definition, follow these steps:
1. From the Slowly Changing Dimension Summary window, select the SCD definition that you want to
view and click View.
The Slowly Changing Dimension window is displayed.
2. This window displays the details of the selected definition.

4.8.6 Purging SCD Definitions


This option allows you to remove SCD definitions permanently from the system. You must have the DMTADMIN user role mapped to your user group to purge SCD definitions.
To purge SCD definitions:
1. From the Slowly Changing Dimension Summary window, select the SCD definition which you want
to purge and click Purge.
2. Click OK to confirm purging.

4.9 CPP Execution Performance Enhancements


You can enhance the CPP execution performance to reduce the execution time between tasks for the Data Management component. To enhance the CPP execution performance, invoke the CPP Engine directly, without initializing intermediate JVMs, based on the system variable CPP_DIRECT_EXECUTION.

Add the CPP_DIRECT_EXECUTION variable to the .profile file, and set the following execution flags:
• When the CPP_DIRECT_EXECUTION flag is set to "true":
The DMT configuration properties T2T_MODE and PLC_MODE are overridden. When a T2T/F2T/DT task is triggered by the ICC Batch Execution, the corresponding CPP Engine is invoked in an optimized manner. The Java task logs are not generated.

• When the CPP_DIRECT_EXECUTION flag is set to "false":
The original behavior is restored, where execution happens based on the T2T_MODE and PLC_MODE properties set in the DMT configuration.

NOTE: After adding the system variable, you must restart the services.
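Since the flag is a plain environment variable, the branch it controls can be illustrated with a minimal sketch (assumed behavior, not actual OFSAA code):

```python
import os

def use_direct_cpp_execution(env=os.environ):
    """True only when CPP_DIRECT_EXECUTION is explicitly set to 'true';
    otherwise execution falls back to the T2T_MODE/PLC_MODE DMT settings."""
    return env.get("CPP_DIRECT_EXECUTION", "false").strip().lower() == "true"

print(use_direct_cpp_execution({"CPP_DIRECT_EXECUTION": "true"}))   # True
print(use_direct_cpp_execution({"CPP_DIRECT_EXECUTION": "false"}))  # False
print(use_direct_cpp_execution({}))                                 # False (default)
```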

4.10 Data Quality Framework


Data Quality Framework consists of a scalable rule-based engine that uses a single-pass integration process to standardize, match, and duplicate information across global data. The Data Quality Framework within the Infrastructure system enables you to define rules and execute them to query, validate, and correct the transformed data existing in an Information Domain.
Data Quality Framework consists of the following sections:
• Data Quality Rules
• Data Quality Groups

4.10.1 Data Quality Rules


Data Quality Rules allows you to create a DQ (Data Quality) definition and define nine specific validation checks based on Range, Data Length, Column Reference/Specific Value, List of Value/Code, Null Value, Blank Value, Referential Integrity, Duplicity, and Custom Check/Business. You can also correct data for range, column reference, list of values, null value, and blank value parameters. The defined Data Quality Rule checks can be logically grouped and executed together.
Control Total Check
Data Quality Rules that are supported by OFSAA, integral to the OFSAA eco-system, and domain specific are largely technical checks, and the result of these checks leads to data correction. Enterprise Data Quality tools support only technical checks; business semantic-driven checks are not typically available as seeded rule types. OFSAAI provides a comprehensive, business semantic-rich and FSI (Financial Services and Insurance) domain-centric Data Quality Rule Type. This type of quality check allows the configuration of entity attributes (multiple ones) checked against a reference entity with its own set of attributes. The attributes on both sides need not match (though the data types must match). Both the LHS (subject entity) and RHS (reference entity) permit tagging aggregate functions to attributes, adding dimension filters to the where-clause, and supporting group-by predicates (that are also dimensional filters or attributes specific to the LHS and RHS entity, respectively). The group-by columns need not match the filter criteria columns in the where-clause of the LHS and RHS.
The result of the check is a log of whether the check failed or succeeded, along with the criteria used with the subject and reference. If there is a group-by, failure or success is recorded against every row of the result on the LHS (subject) and RHS (reference).
The roles mapped to DQ Rule are as follows:
• DQ Access
• DQ Advanced
• DQ Authorize
• DQ Auto Authorizer
• DQ Phantom
• DQ Read Only
• DQ Write
• DQ View Query
See Appendix A for the functions and roles required to access the framework.

Figure 74: Data Quality Rules window

The Data Quality Rule Summary window displays the list of pre-defined Data Quality Rules with other
details such as Name, Table, Access Type, Check Type, Folder, Creation Date, Created By, Last
Modification Date, Status, Is Grouped, Is Executed, Version, and Active. A defined rule is displayed in
Saved status until it is Approved/Rejected by the approver. The approved rules can be grouped further for
execution and the rejected rules are sent back to the user with the Approver comments.
You can add, view, modify, copy, approve/reject, resave, or delete Data Quality Rules within the Data
Quality Rule Summary window. You can make any version of a Data Quality Rule as the latest. For more
information, see Versioning and Make Latest Feature section. You can also search for a Data Quality Rule
based on Name, On Source, Source, Folder, Check Type, Table, or Record Status (Active, Inactive and All).

4.10.1.1 Creating a Data Quality Rule


You can create a Data Quality Rule definition by specifying the DQ Definition details along with the type of
validation check on the required table and defining the required validation conditions to query and correct
the transformed data. Data Quality Rules can be defined on entities of the Infodom as well as on Data Sources that are defined from the Data Sources window. Before defining a DQ Rule on a Data Source, Source Model Generation must have been performed.

NOTE:
• Data Quality Rules can be defined only on the DI Sources whose underlying schema resides in the same database where the OFSAAI METADOM or atomic schema exists.
• If you are defining a Data Quality check on a Data Management Source, only the quality check is done; data correction is not done, since it is an external source.
• A Data Quality Rule supports a maximum of 8 primary key columns.

To create a Data Quality Rule:


1. Click Add in the Data Quality Rules window. The Add button is disabled if you have selected any
checkbox in the grid.
The Data Quality Definition window is displayed.

Figure 75: Data Quality Definition window

2. In the DQ definition section:
• Enter the Name by which you can identify the DQ definition.
• Enter a Description or related information about the definition.
• Select the On DI Source checkbox if you want to define the data quality check on a Data Source. This is optional.
• Select the required Data Source from the drop-down list. The Source drop-down list displays sources created on Oracle DB and Hive DB if it is an RDBMS Information Domain, or sources created on Hive DB if it is a Hive Information Domain.
• Select the Folder (available for the selected Information Domain) from the drop-down list.
• Select the Access Type as either Read Only or Read/Write. The Read Only option enables only the creator to modify the rule details. Other users can only view the DQ rules. The Read/Write
option enables all users to view, modify any fields (including Access Type), and delete the DQ
rule.
3. Select the Check Type from the drop-down list. The options are Specific Check, Generic Check, and Control Total Check. You can mouse over each option for more information.

4.10.1.1.1 Specific Check

This check is used to define conditions based on individual checks on a single column.

Figure 76: Check Type pane

If Specific Check is selected, perform the following steps:


1. Select Table and Base Column Name from the drop-down list.
The list displays all the tables that are marked for Data Quality Rule in a data model; that is, based
on ENABLE_CLASSIFICATION parameter. For more information, see Table Classification.

2. Click the selection button and select the Identifier Columns. The list displays all PK columns of the selected base table.
This feature allows you to view the DQ results report based on the selected identifier columns apart from the PK columns. You can select up to 8 Identifier Columns, including the PK columns. It is mandatory to select the PK columns.
3. If the selected Base Column is of Varchar/Char data type, select the Substring checkbox and enter
numeric values in the Parameters Position and Length characters fields.

4. Click and define the Filter condition using the Specify Expression window.
For more information, see Specify Expression.

NOTE While defining the filter condition, you can also include the
Runtime Parameter name, which you will be specifying in
Additional Parameters condition while executing the DQ Rule.

5. Define the required Validation Checks by selecting the appropriate grid and specify the details. You
can define nine specific validation checks based on Range, Data Length, Column Reference/Specific
Value, List of Value/Code, Null Value, Blank Value, Referential Integrity, Duplicity, and Custom
Check/Business.

NOTE: A minimum of one Validation check must be defined to generate a query.

• Ensure that you select the Enable checkbox for every check to be applied as a part of the rule.
• While defining any of the validation checks, you must specify the Severity (Error, Warning, or Information). You can add an Assignment only when the Severity is selected as Warning or Information. Assignments are added when you want to correct or update record(s) in the base column data / selected column data. However, selecting the Severity as Error indicates there are no corrections and only facilitates reporting the quantity of bad records.

Figure 77: Validation Checks window

Table 23: Fields in the Validation Checks window and their Descriptions

Check Type Description

Range Check Range Check identifies if the base column data falls outside a specified range of Minimum
and Maximum value.
Example: If the Base Table is STG_CASA, Base Column is N_MIN_BALANCE_YTD,
Minimum value is 9, and Maximum value is 99, then the check with the Inclusive
checkbox enabled (by default) is defined as ‘STG_CASA.N_MIN_BALANCE_YTD < 9 and
STG_CASA.N_MIN_BALANCE_YTD > 99’. Here the base column data less than 9 and
greater than 99 are identified as invalid.
If the Inclusive checkbox is not selected for Minimum and Maximum, then the check is
defined as, ‘If STG_CASA.N_MIN_BALANCE_YTD <= 9 and
STG_CASA.N_MIN_BALANCE_YTD >= 99’. Here the base column data less than 10 and
greater than 98 are identified as invalid, where 9 and 99 are also included in the
validation and considered as invalid.
1. Select Enabled checkbox. This option is available only if the selected Base Column is
either of Date or Number data type.
Select the Severity as Error, Warning, or Information.
If the selected Base Column is of “Date” type, select Minimum and Maximum date range
using the Calendar. If the selected base column is of “Number” type, enter the Range
value. You can specify numeric, decimal, and negative values for number Data type. The
Inclusive checkbox is selected by default and you can deselect the same to include the
specified date/value during the validation check.

Click and specify an expression for Additional Condition using the Specify
Expression window. For more information, see Define Expression.
(Optional) If the Severity is set to Warning/Information:
2. Select the Assignment checkbox.
3. Select the Assignment Type from the drop-down list. For more information, see
Populating Assignment Type Details in the References section.
4. Specify the Assignment Value.
5. Select the Message Severity as 1 or 2 from the drop-down list.
6. Select the Message to be displayed from the drop-down list.

Data Length Check Data Length Check checks the length of the base column data against a minimum and maximum value and identifies if it falls outside the specified range.
Example: If the Base Table is STG_CASA, Base Column is N_MIN_BALANCE_YTD, the
Minimum value is 9, and the Maximum value is 12, then the check is defined as ‘If the
length of STG_CASA.N_MIN_BALANCE_YTD < 9 and > 12’. Here the base column data with
characters less than 9 and greater than 12 are identified as invalid.
Select Enabled checkbox.
Select the Severity as Error, Warning, or Information.
Specify the Minimum data length characters.
Specify the Maximum data length characters.

Click and specify an expression for Additional Condition using Specify Expression
window. For more information, see Define Expression.

Column Reference / Specific Value Check Column Reference / Specific Value Check compares the base column data with another column of the base table or with a specified direct value using the list of pre-defined operators.
Example: If the Base Table is STG_CASA, Base Column is N_MIN_BALANCE_YTD, and if
Column Reference check is defined against a specific value ‘100’ with the operator ‘>=’
then the check is defined as, ‘If STG_CASA.N_MIN_BALANCE_YTD < 100’. Here the base
column data with value less than 100 are considered as invalid.
Or, if Column Reference check is defined against another column N_MIN_BALANCE_MTD
with the operator ‘=’ then the check is defined as, ‘If STG_CASA.N_MIN_BALANCE_YTD <>
STG_CASA.N_MIN_BALANCE_MTD’. Here the reference column data not equal to the
base column data is considered as invalid.
Select Enabled checkbox. This option is available only if the selected Base Column is
either of Date or Number data type.
Select the Severity as Error, Warning, or Information.
Select the Mathematical Operator from the drop-down list.
Select the Filter Type as one of the following:
Select Specific Value and specify the Value. You can specify numeric, decimal, and
negative values for number Data type.
Select Another Column and select Column Name form the drop-down list.

Click and specify an expression for Additional Condition using Specify Expression
window. For more information, see Define Expression.
(Optional) If the Severity is set to Warning/Information:
Select the Assignment checkbox.
Select the Assignment Type from the drop-down list. For more information, see
Populating Assignment Type Details in Reference section.
Specify the Assignment Value.
Select the Message Severity from the drop-down list.
Select the Message from the drop-down list.

List of Value / Code Check List of Value / Code Check can be used to verify values where a dimension / master table is not present. This check identifies if the base column data does not match any value or code specified in a list of values.
Example: If the Base Table is STG_CASA, Base Column is N_MIN_BALANCE_YTD, and the list of values mentioned is "100, 101, 102, 103, 104", then the check is defined as, 'If STG_CASA.N_MIN_BALANCE_YTD is NOT IN (100, 101, 102, 103, 104)'. Here the base column data apart from the values specified (that is, 100, 101, 102, 103, 104) are considered invalid.
Or, for Code Check,
If the Base Table is CURRENCY_MASTER, Base Column is COUNTRY_CODE, and the list of values mentioned is 'IN', 'US', 'JP', then the check is defined as, 'If CURRENCY_MASTER.COUNTRY_CODE is NOT IN ('IN', 'US', 'JP')'. Here the base column data apart from the values specified (that is, 'IN', 'US', 'JP') are considered invalid.
Select Enabled checkbox.
Select the Severity as Error, Warning, or Information.
Select the Filter Type as one of the following:
Select Input Values and specify the List of Values. You can specify numeric, decimal,
string (Varchar /char), and negative values.

Select Code and click in the List of Values column. The Code Selection window is displayed. Select the required code and move it to the selected list. You can also select all the available codes at once. Click OK.

Click and specify an expression for Additional Condition using Specify Expression
window. For more information, see Define Expression.
(Optional) If the Severity is set to Warning or Information:
Select the Assignment checkbox.
Select the Assignment Type from the drop-down list. For more information, see
Populating Assignment Type Details in the References section.
Specify the Assignment Value.
Select the Message Severity from the drop-down list.
Select the Message from the drop-down list.

Null Value Check Null Value Check identifies if “NULL” is specified in the base column.
Example: If the Base Table is STG_CASA and the Base Column is N_MIN_BALANCE_YTD,
then the check is defined as, ‘If STG_CASA.N_MIN_BALANCE_YTD is NULL’. Here the base
column data, which is null, are considered as invalid.
Select Enabled checkbox.
Select the Severity as Error, Warning, or Information.

Click and specify an expression for Additional Condition using Specify Expression
window. For more information, see Define Expression.
(Optional) If the Severity is set to Warning or Information:
Select the Assignment checkbox.
Select the Assignment Type from the drop-down list. For more information, see
Populating Assignment Type Details in the References section.
Specify the Assignment Value.
Select the Message Severity from the drop-down list.
Select the Message from the drop-down list.
Note: The Null Value Check supports the TIMESTAMP datatype.

Blank Value Check Blank Value Check identifies if the base column is blank, without any values, considering the blank space.
Example: If the Base Table is STG_CASA and Base Column is N_MIN_BALANCE_YTD, then
the check is defined as, ‘If Length of data of STG_CASA.N_MIN_BALANCE_YTD after trim
is null’. Here the base column data that is blank/empty are considered as invalid.
Select Enabled checkbox.
Select the Severity as Error, Warning, or Information.

Click and specify an expression for Additional Condition using Specify Expression
window. For more information, see Define Expression.
(Optional) If the Severity is set to Warning or Information:
Select the Assignment checkbox.
Select the Assignment Type from the drop-down list. For more information, see
Populating Assignment Type Details in the References section.
Specify the Assignment Value.
Select the Message Severity from the drop-down list.
Select the Message from the drop-down list.
Note: The Blank Value Check supports the TIMESTAMP datatype.

Referential Integrity Check Referential Integrity Check identifies all base column data that has not been referenced by the selected column of the referenced table. Here, the reference table and columns are user specified.
Example: If the Base Table is STG_CASA, Base Column is N_MIN_BALANCE_YTD, the Reference Table is STG_CASA_TXNS, and the Reference Column is N_TXN_AMOUNT_NCY, then the check is defined as, '(not exists (select STG_CASA_TXNS.N_TXN_AMOUNT_NCY from STG_CASA_TXNS where STG_CASA_TXNS.N_TXN_AMOUNT_NCY = STG_CASA.N_MIN_BALANCE_YTD))'. Here, if the STG_CASA.N_MIN_BALANCE_YTD column value does not match STG_CASA_TXNS.N_TXN_AMOUNT_NCY, those base table records are considered invalid.
This check can be used to validate attributes like Geography dimension, currency
dimension, and so on.
Select Enabled checkbox.
Select the Severity as Error, Warning, or Information.
Select the Table (Referential Integrity Check dimension table) from the drop-down list.
The base table selected under the Select grid is excluded from the drop-down list.
Select the Column from the drop-down list.
The list displays those columns that have the same Data Type as that of the Base Column
selected under Select grid.
Select the Is Composite Key checkbox if the base column is part of a Composite Key.
Enter the Additional Reference Condition for the Composite Key. For example,
baseTable.column2=refTable.column2 and baseTable.column3=refTable.column3 where
column1, column2, column3 are part of the Composite Keys, baseTable.column1 is the
base column and refTable.column1 is the reference column.

Click and specify an expression for Additional Condition using Specify Expression
window. For more information, see Define Expression.
Note: The SELECT privilege must be granted to the METADOM (atomic schema) user on the Base Table and Reference Table for all DQ rules defined on “Data Management Sources”.
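The NOT EXISTS pattern from the example can be tried out directly. In this hypothetical sketch the two tables are reduced to just the columns involved, with invented data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE STG_CASA (ACCOUNT_ID TEXT, N_MIN_BALANCE_YTD REAL)")
conn.execute("CREATE TABLE STG_CASA_TXNS (N_TXN_AMOUNT_NCY REAL)")
conn.executemany("INSERT INTO STG_CASA VALUES (?, ?)",
                 [("A1", 100.0), ("A2", 250.0)])
conn.execute("INSERT INTO STG_CASA_TXNS VALUES (100.0)")

# Referential Integrity Check: base rows whose value is never referenced
# by the selected column of the reference table are invalid.
invalid = conn.execute(
    "SELECT ACCOUNT_ID FROM STG_CASA WHERE NOT EXISTS ("
    "  SELECT 1 FROM STG_CASA_TXNS"
    "  WHERE STG_CASA_TXNS.N_TXN_AMOUNT_NCY = STG_CASA.N_MIN_BALANCE_YTD)"
).fetchall()
print(invalid)  # A2 has no matching transaction amount
```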

Duplicate Check
Duplicate Check can be used when a combination of columns is unique; it identifies all the duplicate data of the base table in terms of the columns selected for the duplicate check.
Example: If the Base Table is STG_CASA, the base column is N_MIN_BALANCE_YTD, and the duplicity columns selected are N_MIN_BALANCE_MTD and N_MIN_BALANCE_ITD, then records with duplicate values for the combination of columns STG_CASA.N_MIN_BALANCE_YTD, STG_CASA.N_MIN_BALANCE_MTD, and STG_CASA.N_MIN_BALANCE_ITD are considered invalid.
Select Enabled checkbox.
Select the Severity as Error, Warning, or Information.

Click in Column list and select the required column.

Click and specify an expression for Additional Condition using Specify Expression
window. For more information, see Define Expression.
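A duplicate check of this kind is typically expressed as a GROUP BY … HAVING query. The following sketch uses invented data to flag a repeated column combination:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE STG_CASA ("
             "N_MIN_BALANCE_YTD REAL, N_MIN_BALANCE_MTD REAL, N_MIN_BALANCE_ITD REAL)")
conn.executemany("INSERT INTO STG_CASA VALUES (?, ?, ?)",
                 [(10, 1, 5), (10, 1, 5), (20, 2, 6)])

# Duplicate Check: combinations of the selected columns that occur more
# than once mark the corresponding base-table records as invalid.
dupes = conn.execute(
    "SELECT N_MIN_BALANCE_YTD, N_MIN_BALANCE_MTD, N_MIN_BALANCE_ITD, COUNT(*)"
    " FROM STG_CASA"
    " GROUP BY N_MIN_BALANCE_YTD, N_MIN_BALANCE_MTD, N_MIN_BALANCE_ITD"
    " HAVING COUNT(*) > 1"
).fetchall()
print(dupes)  # the (10, 1, 5) combination appears twice
```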


Custom Check/Business Check
Custom Check/Business Check is a valid SQL query that identifies data with the query specified as the custom/business SQL. You can define the SQL, but the Select clause of the query has to follow the order specified in the template of the Custom Check panel.
Example: When you want all the bad records based on a two-column selection from the same table, such as identifying all the error records from the Investments table where the account number is not null and the account group code is null:
• select PK_NAMES,PK_1,PK_2,PK_3,PK_4,PK_5,PK_6,PK_7,PK_8,ERROR_COLUMN from (SELECT NULL PK_NAMES, NULL PK_1,NULL PK_2,NULL PK_3,NULL PK_4,NULL PK_5,NULL PK_6,ACCOUNT_NUMBER PK_7, ACCOUNT_GROUP_CD PK_8,1 ERROR_COLUMN FROM FSI_D_INVESTMENTS WHERE ACCOUNT_GROUP_CD IS NULL AND ACCOUNT_NUMBER IS NOT NULL)
• Select Enabled checkbox.
• Select the Severity as Error, Warning, or Information.
• Enter the Custom/Business Check parameters within the brackets. Ensure that each
parameter is separated by a comma.
Note: The threshold check is performed only when the parameter DQ_ENABLE_CUSTOM_THRESHOLD is set to Y. By default, the value is N.
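The documented Select-clause template can be exercised against a mock table. The sample rows here are invented for illustration; the query itself is the one from the example above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE FSI_D_INVESTMENTS (ACCOUNT_NUMBER TEXT, ACCOUNT_GROUP_CD TEXT)")
conn.executemany("INSERT INTO FSI_D_INVESTMENTS VALUES (?, ?)",
                 [("ACC1", "GRP1"), ("ACC2", None), (None, None)])

# Custom/Business Check: the Select clause must follow the documented column
# order (PK_NAMES, PK_1..PK_8, ERROR_COLUMN); unused key slots are padded with NULL.
rows = conn.execute(
    "SELECT PK_NAMES, PK_1, PK_2, PK_3, PK_4, PK_5, PK_6, PK_7, PK_8, ERROR_COLUMN"
    " FROM (SELECT NULL PK_NAMES, NULL PK_1, NULL PK_2, NULL PK_3, NULL PK_4,"
    "              NULL PK_5, NULL PK_6, ACCOUNT_NUMBER PK_7,"
    "              ACCOUNT_GROUP_CD PK_8, 1 ERROR_COLUMN"
    "       FROM FSI_D_INVESTMENTS"
    "       WHERE ACCOUNT_GROUP_CD IS NULL AND ACCOUNT_NUMBER IS NOT NULL)"
).fetchall()
print(rows)  # only ACC2: account number present but group code missing
```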

1. Click Generate Query.


The details are validated and the validated query along with the status is displayed in the Generated
Query section.
2. Click Save.
The defined Data Quality Rule definition is displayed in the Data Quality Rule Summary window with
the Status as “Saved” and Active as "N". After it is approved, it becomes active.
Additional conditions are appended to the RI check criteria, that is, to the NOT EXISTS clause in conjunction with an AND.

NOTE For all checks except Referential Integrity Check, the additional
condition is expected to be defined on the base table; whereas
for RI check, it can be done on the base table as well as the
reference table.

4.10.1.1.2 Generic Check

Generic Check is used to define conditions based on multiple columns of a single base table. These checks
are not pre-defined and can be specified (user-defined) as required.
If Generic Check is selected, do the following:


Figure 78: Generic Check pane

1. Select Table Name from the drop-down list.


The list displays all the tables which are marked for Data Quality Rule in a data model; that is, based
on ENABLE_CLASSIFICATION parameter. For more information, see Table Classification section.

2. Click and define the Filter condition using the Specify Expression window.
For more information, see Define Expression.

NOTE While defining the filter condition, you can also include the
Runtime Parameter name which you would be specifying in
Additional Parameters condition while executing the DQ Rule.

3. Click Add in the Condition grid.


The Specify Expression window is displayed. Define the Condition expression. For more information,
see Define Expression.

NOTE The length of the condition is restricted to 4000 characters.

The Expression is displayed with the “IF” and “Else” conditions along with the Severity status as
Error or Warning or Information.
You can change the Severity by selecting the checkbox corresponding to the condition and
selecting the Severity as Warning or Information from the drop-down list.


NOTE You can add an Assignment only when the Severity is selected as Warning or Information. Assignments are added when you want to correct or update records in the base column data or selected column data. There can be one or more assignments tagged to a single condition. However, selecting the severity as Error indicates there are no corrections; it only facilitates reporting the quantity of bad records.

4. Select the checkbox adjacent to the required Condition expression and click Add in the
Assignment grid.
The assignment details are populated.

NOTE You can add an Assignment only if the Severity is Warning or Information. There can be one or more assignments tagged to a single condition.

5. Specify the Assignment details as tabulated.

Table 24: Fields in the Generic Value pane and their Descriptions

Field Description

Column Name Select the Column Name from the drop-down list.

Assignment Type Select the Assignment Type from the drop-down list. For more
information, see Populating Assignment Type Details in the References
section.

Assignment Value Select the Assignment Value from the drop-down list according to the
Assignment Type selected.

Message Severity Select the Message Severity as either 1 or 2 from the drop-down list.

Message Select the required Message for the Severity from the drop-down list.

You can also add multiple assignments by clicking Add in Assignment grid.

NOTE A minimum of one condition must be defined to save the Rule.

6. Click Save.
The defined Data Quality Rule definition is displayed in the Data Quality Rule Summary window with
Status as “Saved” and Active as "N". After it is approved, it becomes active.


4.10.1.1.3 Control Total Check

Using Control Total check, you can compare a constant reference value or reference entity against single
or multiple values obtained by applying aggregate functions on the columns of a master/main table, with
supporting dimensional filters. The dimensional filters can be time, currency, geography or so on.
There is no data correction configurable for the Control Total check. This check provides summary level
information on the entity used, attributes used, aggregate function applied, dimension-filters, group-by
columns/predicates selected, number of records subject to the check and so on.
Example of Control Total check based on Constant/Direct Value
Consider an example where you want to check whether the sum of the loan amount for currency code ‘INR’ is greater than or equal to a constant value. In the LHS, select Table as “stg_loan_transactions”, Dimensional Filter as “dim_currency.n_currency_code=‘INR’“ and Group By as “dim_legal_entities.le_code, dim_lob.lob_code, dim_branch.branch_code, dim_product.product_id”. In this case, the query for the LHS Criteria will be
Select sum(end_of_period_balance)
from stg_loan_transactions SLT, dim_currency DC
where SLT.n_currency_skey=DC.n_currency_skey and DC.n_currency_code = ‘INR’ and
fic_mis_date = ‘12/12/2015’
group by dim_legal_entities.le_code, dim_lob.lob_code, dim_branch.branch_code,
dim_product.product_id
If the result of the aggregate function is greater than or equal to the specified constant value, it will be
marked as Success, else Failure. After execution, the results can be viewed in DQ reports.
Example of Control Total check based on Reference Entity
Consider an example where you want to compare the sum of the loan amount for currency code ‘INR’ with the sum of the transaction amount for currency code ‘INR’ for a period with MIS DATE as 12/12/2015. In the LHS, select Table as “stg_loan_transactions”, Dimensional Filter as “dim_currency.n_currency_code=‘INR’“ and Group By as “dim_legal_entities.le_code, dim_lob.lob_code, dim_branch.branch_code, dim_product.product_id”. In the RHS, select Table as “gl_master”, Dimensional Filters as “dim_currency.n_currency_code=‘INR’“ and fic_mis_date = 12/12/2015, and Group By as “dim_legal_entities.le_code, dim_lob.lob_code, dim_branch.branch_code, dim_product.product_id”. In this case, the query for the LHS criteria will be the same as given in the previous example, and the query for the RHS criteria will be:
select sum(end_of_period_balance)
from gl_master GM, dim_currency DC, dim_time_date DTD
where GM.n_currency_skey = DC.n_currency_skey and GM.gl_code = ‘LES_001’ and
DTD.fic_mis_date = ‘12/12/2015’ and DC.n_currency_skey = ‘INR’
group by dim_legal_entities.le_code, dim_lob.lob_code, dim_branch.branch_code,
dim_product.product_id
Suppose you have selected the Operator as “>=”. Then, if the result of the aggregate function in the LHS is greater than or equal to the result of the aggregate function in the RHS, it is marked as Success; otherwise, Failure. After execution, the results can be viewed in DQ reports.
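The LHS/RHS comparison can be sketched as follows. For brevity, the dimensional joins are collapsed into a single currency column here — an illustrative simplification of the queries above, with invented figures:

```python
import operator
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE STG_LOAN_TRANSACTIONS (CCY TEXT, END_OF_PERIOD_BALANCE REAL)")
conn.execute("CREATE TABLE GL_MASTER (CCY TEXT, END_OF_PERIOD_BALANCE REAL)")
conn.executemany("INSERT INTO STG_LOAN_TRANSACTIONS VALUES (?, ?)",
                 [("INR", 500.0), ("INR", 700.0), ("USD", 300.0)])
conn.executemany("INSERT INTO GL_MASTER VALUES (?, ?)",
                 [("INR", 1000.0), ("USD", 400.0)])

# Control Total Check: aggregate the LHS and RHS under the same dimensional
# filter and evaluate them with the selected numeric operator (">=" here).
lhs = conn.execute("SELECT SUM(END_OF_PERIOD_BALANCE) FROM STG_LOAN_TRANSACTIONS"
                   " WHERE CCY = 'INR'").fetchone()[0]
rhs = conn.execute("SELECT SUM(END_OF_PERIOD_BALANCE) FROM GL_MASTER"
                   " WHERE CCY = 'INR'").fetchone()[0]
result = "Success" if operator.ge(lhs, rhs) else "Failure"
print(lhs, rhs, result)  # 1200.0 >= 1000.0, so the check succeeds
```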
If Control Total Check is selected, do the following:


Figure 79: Control Total Check pane

1. Select Table Name from the drop-down list.


The list displays all the tables which are marked for Data Quality Rule in a data model; that is, based
on ENABLE_CLASSIFICATION parameter. For more information, see Table Classification section.

2. Click and select the Identifier Columns from the Column Selection window.
The list displays all PK columns of the selected base table.
This feature allows you to view the DQ results report based on the selected identifier columns apart
from the PK columns. You can select up to 8 Identifier columns including the PK columns. It is
mandatory to select the PK Columns.

3. Click and define the Filter condition using the Specify Expression window.
For more information, see Define Expression.

NOTE While defining the filter condition, you can also include the
Runtime Parameter name which you would be specifying in
Additional Parameters condition while executing the DQ Rule.

4. Select the Severity as Error, Warning or Information.


5. Enter the details in the LHS grid as tabulated:

Table 25: Fields in the LHS pane and their Descriptions

Field Description

Aggregate Expression
Click and define the Aggregate Expression using the Specify
Expression window. For more information, see Define Expression.

Additional Entities
Click and add additional entities if required from the Additional
Entities Selection window. This is optional.

ANSI Join Condition Specify ANSI Join condition if you have added Additional Entities.
For DQ rules defined on source, prefix the table names with “$SOURCE$”
if you are directly entering the ANSI Join Condition in the Expression
editor.

Join Condition Specify Join condition if you have added Additional Entities.

Additional Condition Specify additional condition if any.

Group By
Specify the group by predicates/ columns by clicking and selecting
Table and Column from the respective drop-down lists.
Note: The group-by columns need not match the filter criteria columns in
the where clause of LHS. If Group By columns are not selected on LHS
and RHS, a single row on LHS will be compared with a single row on RHS.

Group By Join Condition Specify the Group By Join condition in the form LHS.GRPBY_COL1 =
RHS.GRPBY_COL1 AND LHS.GRPBY_COL2 = RHS.GRPBY_COL2 and so
on. LHS and RHS will be joined based on this.
If the number of Group By columns on LHS does not match with the
number of Group By columns on RHS, it is mandatory to enter Group By
Join Condition.
If Group By Join Condition is not specified and the number of Group By
columns on LHS and RHS are equal, Group By Join Condition will be
automatically generated in the form “LHS.GRPBY_COL1 =
RHS.GRPBY_COL1 AND LHS.GRPBY_COL2 = RHS.GRPBY_COL2”.
If Group By columns are present only on LHS, every row on LHS will be
compared against the single row on RHS. Group By Join Condition will be
generated in the form “RHS.R_ID=1”.
If Group By columns are present only on RHS, the single row in LHS will
be compared against every row on RHS. Group By Join Condition will be
generated in the form “LHS.L_ID=1”.

6. Select the appropriate Operator from the drop-down list. The available operators are >, <, =, <>, <=,
and >=. Evaluation is done based on the selected numeric operator.
7. Select the Reference Type as:
 Direct Value- Enter the reference value in the Value field.
 Another Entity- This is used when you want to compare LHS with a different entity with its set
of attributes. Enter the details as follows:
 Reference Base Table- Select the reference table from the drop-down list.


 Specify Aggregate Expression, Additional Entities, ANSI Join Condition, Join Condition, Additional Condition, and Group By in the respective fields.
For more information, see the preceding table.
 Relative Reference- Here Reference value is the same aggregate function on the subject entity
itself, but dimensional filters can vary. Reference Base Table and Aggregate Expression are
pre-seeded as in the LHS grid. You cannot modify them. Enter the other details as follows:
 Specify Additional Entities, ANSI Join Condition, Join Condition, Additional Condition
and Group By in the respective fields.
For more information, see the preceding table.

NOTE Control Total check is allowed only on numeric columns.
Group By clauses on the LHS and RHS should be defined in such a way that the outputs of the RHS and LHS are semantically valid to compare. That is, the RHS and LHS should not result in two different sets that cannot be compared against each other. Hence, ensure the rule definitions are technically validated to meet this.

8. Click Generate Query.


The details are validated and the validated query along with the status is displayed in the Generated
Query section.
9. Click Save.
The defined Data Quality Rule definition is displayed in the Data Quality Rule Summary window with
the Status as “Saved” and Active as "N". After it is approved, it becomes active. If you are mapped to
the DQAUTOAUTHR role, the definition is automatically authorized and it becomes active.

NOTE No corrections or assignments are done by the framework for the Control Total check.

4.10.1.1.4 Table Classification

Whether DQ rules can be defined on a table is decided by the Servlet parameter ENABLE_CLASSIFICATION, which is present in the web.xml file.
If ENABLE_CLASSIFICATION is set to Y, only tables with classification code 340 can be selected as the base table for DQ rule definition. This is the old behavior.
If ENABLE_CLASSIFICATION is set to N, then irrespective of the classification, any table can be selected as the base table for DQ rule definition.

4.10.1.1.5 Defining Data Quality Rules on Partitioned Tables

Data correction on a partitioned table is accomplished by overwriting the particular partition specified. At run time, the DQ engine looks for partition information in the OFSAAI object registration table REV_TAB_PARTITION. If the base table is partitioned, the REV_TAB_PARTITIONS table will have the partition column, value, and sequence registered in it.


If PARTITION_VALUE is not present in the REV_TAB_PARTITIONS table for a TABLE_NAME.COLUMN_NAME, it is considered a dynamic partition.
Hive allows operations on a dynamic partition only in non-strict mode. Non-strict mode is set by the DQ engine when it identifies that REV_TAB_PARTITION.V_PARTITION_VALUE is null.
A static partition value can also be set with placeholders, for example, $PARAM1 and $PARAM2, and the same can be mentioned as ‘Additional Parameters’ during DQ batch execution. Values for the placeholders/additional parameters are substituted as the static partition values at DQ run time.
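The placeholder substitution described above might look like the following sketch. The function name and the partition-value format are hypothetical, not the engine's actual implementation:

```python
# Hypothetical sketch: substitute $PARAM placeholders in a static partition
# value from the Additional Parameters supplied at DQ run time.
def substitute_placeholders(partition_value, params):
    for name, value in params.items():
        partition_value = partition_value.replace(name, value)
    return partition_value

resolved = substitute_placeholders(
    "country='$PARAM1', mis_date='$PARAM2'",
    {"$PARAM1": "US", "$PARAM2": "12/12/2015"},
)
print(resolved)  # country='US', mis_date='12/12/2015'
```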

4.10.1.2 Versioning and Make Latest Feature


When a new definition is created, it is saved as version 1; once it is authorized, it is in Active status. After you modify a DQ rule and save it, it is saved with the version set to the highest available version + 1. For example, if you modify a DQ rule of version 2 and the highest available version is 4, after you save the definition its version becomes 5. Only the latest version is in Active status.
To make any older version as latest:

1. From the Data Quality Rules window, select the Record Status as Inactive and click Search. All
inactive definitions are displayed.

2. Select the required definition and click Make Latest. The selected definition becomes active and
the current active definition becomes inactive.

4.10.1.3 Viewing Data Quality Rule


You can view individual Data Quality Rule definition details at any given point. A system generated ID is
assigned to each Data Quality Rule when it is created, which can be viewed in the Audit Trail section.
To view the existing Data Quality Rule definition in the Data Quality Rule Summary window:
1. Select the checkbox adjacent to the required DQ Name.

2. Click View from the Data Quality Rules tool bar.


The Data Quality Definition (View Mode) window displays the details of the selected Data Quality
definition. The Audit Trail section at the bottom of View - DQ Definition window displays System ID and
metadata information about the Data Quality Rule defined.

4.10.1.4 Modifying Data Quality Rule


You can modify the saved Data Quality Rule definition(s) which are not grouped in the Data Quality
framework. A grouped Data Quality Rule definition can still be edited by unmapping the same from the
associated group(s).

NOTE An approved rule, irrespective of whether it is mapped to group(s) or has been executed, cannot be edited if the Data Quality Approval configuration parameter is set to ‘N’.


You can update all the definition details except for the Definition Name, Check Type, Table, and the Base
Column selected. To update the required Data Quality Rule definition details in the Data Quality Rule
Summary window:
1. Select the checkbox adjacent to the required DQ Name.

NOTE You can only edit rules that have the status Saved or Rejected, or that are Approved (but not mapped to any group). If you want to edit an Executed rule, you need to unmap the rule from the group.

2. Click Edit from the Data Quality Rules tool bar. The Edit button is disabled if you have selected
multiple DQ Names.
The Data Quality Definition (Edit Mode) window is displayed.
3. Update the details as required. For more information, see Create Data Quality Rule.
4. Click Save and update the changes. The Status is changed to Saved and it will be inactive. The rule
should undergo authorization to become active. If you are mapped to the DQAUTOAUTHR role, the
definition is automatically authorized and it becomes active.

4.10.1.5 Copying Data Quality Rule


You can copy the existing Data Quality Rule to quickly create a new DQ definition based on the existing
rule details or by updating the required parameters. To copy an existing Data Quality Rule definition in the
Data Quality Rule Summary window:
1. Select the checkbox adjacent to the required DQ Name in the list whose details are to be duplicated.

2. Click Copy from the tool bar. The Copy button is disabled if you have selected multiple
checkboxes.
The Data Quality Definition (Copy Mode) window is displayed.
3. Edit the DQ definition Name and other details as required.
For more information, see Create Data Quality Rule.
4. Click Save. The defined Data Quality Rule definition is displayed in the Data Quality Rule Summary
window with the status as “Saved”.

4.10.1.6 Approving/ Rejecting Data Quality Rule


An authorizer can approve a pre-defined Data Quality Rule definition for further execution or reject an
inappropriate DQ definition listed within the Data Quality Rule Summary window. You should be mapped
to DQ Authorizer function role to approve or reject a DQ definition.
To approve/ reject Data Quality Rule in the Data Quality Rule Summary window:
1. Select the checkbox adjacent to the required DQ Name. Ensure that you select the “Saved” DQ
definition based on the Status indicated in the Data Quality Rules grid.
2. Do one of the following:


 To approve a DQ definition, click Approve.


The User Comments window is displayed. Enter the notes or additional information to the user
and click OK.
The selected DQ definition is approved and a confirmation dialog is displayed. The definition
becomes active after it is approved.

 To reject a DQ definition, click Reject.


The User Comments window is displayed. Enter the notes or additional information to the user
and click OK. The selected DQ definition is rejected and a confirmation dialog is displayed.

NOTE The authorizer can approve/reject only one definition at a time.

The Approved/Rejected status of the DQ definition is indicated in the Status column of the Data
Quality Rule Summary window. You can mouse-over to view the Approver comments in a pop-
up.

4.10.1.7 Resaving Data Quality Rule


The DQ rule definition undergoes changes when the OFSAA data model alters the base table’s attributes (columns, primary keys) as a part of model versioning. The Resave option allows you to select multiple DQs and save them in one go, instead of re-generating and re-saving the rules one by one. For DQ Rules defined on Infodom tables, resave persists the default PK columns as identifier columns and regenerates the query. So the custom identifier columns selected at the time of rule definition are not considered when you resave the DQ rule. For DQ rules defined on a Source, as the PK columns of source tables are not identifiable, resave just re-generates and re-saves the query; it does not update the identifier columns.
To resave data quality rule:
1. From the Data Quality Rules window, select the DQ Rules which you want to resave and click
Resave.
2. A status message is displayed showing whether the Resave was successful or failed.

4.10.1.8 Deleting Data Quality Rule


You can remove the Data Quality Rule definition(s) that are not grouped in the Data Quality framework. A
grouped and non-executed Data Quality Rule definition can still be deleted by unmapping the same from
all the associated group(s).
1. Select the checkbox adjacent to the required DQ Name whose details are to be removed.

2. Click Delete button from the Data Quality Rules tool bar.
3. Click OK in the information dialog to confirm deletion.


4.10.2 Data Quality Groups


Data Quality Groups enable you to logically group the defined DQ definitions and schedule them for execution. DQ definitions can be executed either through the Data Quality Groups Summary window of the Data Management Tools framework or through the Batch Execution window of the Operations module.
The roles mapped to DQ Group are as follows:
 DQ Group Access
 DQ Group Advanced
 DQ Group Authorize
 DQ Group Phantom
 DQ Group Ready
 DQ Group Write

Figure 80: Data Quality Groups Summary window

The Data Quality Groups Summary window displays the list of pre-defined Data Quality Groups with the
other details such as Name, Folder, Creation Date, Created By, Last Modification Date, Last Modified By,
Last Run Date, and Last Run Status. You can create and execute DQ Group definitions and view, modify,
copy, refresh, or delete DQ Group definitions within the Data Quality Groups Summary window.


NOTE
• The “Last Run Status” column in the Data Quality Groups Summary grid displays the Group execution status as Not Executed, Ongoing, Interrupted, Successful, or Failed.
• Data Quality groups created in the Operations module with the execution status Held, Excluded, or Cancelled are displayed as Not Executed in the Data Quality Groups Summary grid. However, the same can be viewed in the Operations > Batch Monitor window.
• The “Last Run Status” column in the Data Quality Rules summary grid displays the Rule execution status as Ongoing, Successful, or Failed. You can click the status to view additional details in the View Log window.

You can also search for a DQ Group definition based on Name, Description, Folder, Rule Name, On Source,
or Source.

4.10.2.1 Creating Data Quality Group


You can create a DQ Group definition by defining the DQ Definition details and mapping the required DQ Rules that are authorized and approved within the system. The DQ Group definition is flexible and purpose driven. Groups can be created for different subject areas such as Credit and Market, or they can be application specific, like Basel II or Economic Capital.
To create DQ Group in the Data Quality Groups Summary window:

1. From the Data Quality Groups Summary window, click Add button in the Data Quality Groups
tool bar. Add button is disabled if you have selected any checkbox in the grid.
The Data Quality Group Definition window is displayed.


Figure 81: Data Quality Group Definition (New mode)

2. In the Data Quality Group Definition section, do the following:


 Enter the Name by which you can identify the DQ Group.
 Enter a description or related information about the DQ Group.
 Select the On DI Source checkbox if you want to group DQ Rules defined on DI Sources.
 Select the Source from the drop-down list.
The Source drop-down list displays sources created on Oracle DB and Hive DB if it is RDBMS
Information Domain or sources created on Hive DB if it is Hive Information Domain.


NOTE A DQ rule defined on a particular application-source mapping cannot be grouped together with DQ rules defined on another application-source mapping.

 Select the Folder (available for selected Information Domain) from the drop-down list.
3. In the Map DQ Rules section, do the following:

 Select the required DQ Rule from the Available Rules list and click the select button. You can also search for a specific DQ Rule by entering the required keyword and clicking the search button.

NOTE If a DQ group has interdependent rules, such rules will not give the expected result.

 To select all the listed DQ Rules, click the select-all button.
You can also deselect a DQ Rule by selecting it from the Mapped Rules list and clicking the deselect button, or deselect all the mapped rules by clicking the deselect-all button. You can search to deselect a specific DQ Rule by entering the keyword and clicking the search button.
4. Click Save. The defined DQ group is listed in the Data Quality Rule Summary window and can be
executed for processing.
For more information, see Executing Data Quality Group.

4.10.2.2 Executing Data Quality Group


You can execute a defined DQ Group definition along with the mapped Rules and validation checks in the Data Quality Group Summary window. This in turn creates a Batch in the Operations module. You can also create and execute a DQ Group in the Batch Execution window of the Operations module. When a Data Quality Group is executed for processing, the execution details can be viewed in the View Data Quality Group Summary Log.

NOTE Ensure the Allow Correction on DI Source checkbox is selected in the System Configuration > Configuration > Others tab if you want to do the Data Quality check and correction simultaneously through the DCDQ framework.

Note that the results of execution of Data Quality Rules are stored in the table
DQ_RESULT_DETL_MASTER of respective METADOM schema. During the OFSAAI installation ensure the
Oracle database tablespace in which this table resides is configured to AUTOEXTEND ON. Otherwise, the
DQ Rule executions might result in error due to insufficient storage space available (ORA-01653 - Unable
to extend tablespace by 1024). To mitigate this error, ensure sufficient storage for the tablespace has been
allocated. For a single check (DQ) on a row of data, the table DQ_RESULT_DETL_MASTER stores the
results in 1 row. Thus, for 2 checks on a row, the table would store results in 2 rows and so on.


A provision to run DQ Rules in a DQ Group in parallel is available. Two parameters, DQ_ENABLE_PARALLEL_EXEC and DQ_MAX_NO_OF_EXEC_THREADS, are added in the CONFIGURATION table. If the DQ_ENABLE_PARALLEL_EXEC parameter is set to 'Y', DQ rules within the group are executed in parallel. DQ_MAX_NO_OF_EXEC_THREADS can be used to specify the number of rules that should be run simultaneously.
If the DQ_ENABLE_PARALLEL_EXEC parameter is set to 'N' or is not present, rules within the group are executed sequentially.
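The effect of the two CONFIGURATION parameters can be sketched with a thread pool. The function and the rule callables below are hypothetical stand-ins for the DQ engine's behavior, not product code:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch: when parallel execution is enabled, up to
# DQ_MAX_NO_OF_EXEC_THREADS rules run simultaneously; otherwise the
# rules in the group run one after another.
def run_group(rules, enable_parallel="N", max_threads=4):
    if enable_parallel == "Y":
        with ThreadPoolExecutor(max_workers=max_threads) as pool:
            return list(pool.map(lambda rule: rule(), rules))
    return [rule() for rule in rules]

rules = [lambda: "rule1 ok", lambda: "rule2 ok", lambda: "rule3 ok"]
results = run_group(rules, enable_parallel="Y", max_threads=2)
print(results)
```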

NOTE The 'Fail if threshold breaches' flag is not considered for parallel execution.

To execute a DQ Group in the Data Quality Group Summary window:


1. From the Data Quality Groups Summary window, select the checkbox adjacent to the required DQ
Group Name.

2. Click Run button from the Data Quality Groups tool bar. The Run button is disabled if you have
selected multiple checkboxes.
The Group Execution window is displayed.

Figure 82: Group Execution window

3. In the Batch details section, do the following:


 Select the MIS Date using the Calendar. MIS Date is mandatory and refers to the date with
which the data for the execution would be filtered. In case the specified MIS date is not
present in the target table, execution completes with the message “No Records found” in
View Log window.

NOTE If there is an As_Of_Date column in the table, the engine looks for an As_Of_Date matching the specified MIS Date.
The DQ Batch ID is auto-populated and is not editable.


 Specify the Threshold (%) limit as a numeric value. This refers to the maximum percentage of
records that can be rejected in a job. If the percentage of failed records exceeds the Rejection
Threshold, the job fails. If the field is left blank, the default value is 100%.
 Specify the Additional Parameters as filtering criteria for execution in the pattern
Key#Datatype#Value;Key#Datatype#Value; and so on.
Here the Datatype of the value should be “V” for Varchar/Char, “D” for Date in
“MM/DD/YYYY” format, or “N” for numeric data. For example, to filter specific
region codes, you can specify the Additional Parameters value as
$REGION_CODE#V#US;$CREATION_DATE#D#07/06/1983;$ACCOUNT_BAL#N#10000.50;

You can mouse-over for more information.

NOTE If the Additional Parameters are not specified, the default
value is taken as NULL. Except for the standard placeholders
$MISDATE and $RUNSKEY, all additional parameters for DQ
execution should be enclosed in single quotes. For example,
STG_EMPLOYEE.EMP_CODE = '$EMPCODE'.

 Select Yes or No from the Fail if Threshold Breaches drop-down list. If Yes is selected,
execution of the task fails if the threshold value is breached. If No is selected, execution of the
task continues.
 For executing DQ rules on Spark, specify ‘EXECUTION_VENUE=Spark’ in the Optional
Parameters field. Before execution, you should have registered a cluster from DMT
Configurations > Register Cluster window with the following details:
 Name: Enter the name of the Hive information domain.
 Description: Enter a description for the cluster.
 Livy Service URL: Enter the Livy Service URL used to connect to Spark from OFSAA.
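The Key#Datatype#Value pattern used by the Additional Parameters field can be parsed as in the following sketch. It only illustrates the documented format; the actual interpretation is done by OFSAAI, and `parse_additional_parameters` is a hypothetical helper:

```python
from datetime import datetime

def parse_additional_parameters(raw):
    """Parse 'Key#Datatype#Value;...' into a dict of typed values:
    'V' -> string (Varchar/Char), 'D' -> date in MM/DD/YYYY format,
    'N' -> numeric."""
    params = {}
    for entry in filter(None, raw.split(";")):  # skip the trailing empty entry
        key, dtype, value = entry.split("#")
        if dtype == "D":
            params[key] = datetime.strptime(value, "%m/%d/%Y").date()
        elif dtype == "N":
            params[key] = float(value)
        else:  # 'V' for Varchar/Char
            params[key] = value
    return params

parsed = parse_additional_parameters(
    "$REGION_CODE#V#US;$CREATION_DATE#D#07/06/1983;$ACCOUNT_BAL#N#10000.50;"
)
```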
4. Click Execute.
A confirmation message is displayed and the DQ Group is scheduled for execution.
After the DQ Group is executed, you can view the details of the execution along with the log
information in the View Log window.
For more information, see Viewing Data Quality Group Summary Log.

4.10.2.3 Viewing Data Quality Group


You can view individual Data Quality Group definition details at any given point.

To view the existing DQ Group definition in the Data Quality Group Summary window:
1. From the Data Quality Groups Summary window, select the checkbox adjacent to the required DQ
Group Name.
The mapped DQ Rules are displayed in the Data Quality Rules grid.

2. Click the View button on the Data Quality Groups toolbar.


The Data Quality Group Definition window displays the DQ Group definition details and the mapped
DQ rules.

4.10.2.4 Modifying Data Quality Group


You can update the existing DQ Group definition details except for the Group Name. To update the
required DQ Group definition details in the Data Quality Groups Summary window:
1. From the Data Quality Groups Summary window, select the checkbox adjacent to the required
Group Name.

2. Click the Edit button on the Data Quality Groups toolbar.
The Edit - DQ Group - DQ Definition Mapping window is displayed.
3. Update the details as required.
For more information, see Creating Data Quality Group.
4. Click Save to update the changes.

4.10.2.5 Copying Data Quality Group


You can copy the existing DQ Group details to quickly create a new DQ definition, either based on the
existing details or after updating the required parameters. To copy an existing DQ Group definition in the
Data Quality Groups Summary window:
1. From the Data Quality Groups Summary window, select the checkbox adjacent to the required
Group Name in the list whose details are to be duplicated.

2. Click the Copy button on the toolbar. The Copy button is disabled if you have selected multiple
checkboxes.
The Copy - DQ Group - DQ Definition Mapping window is displayed.
3. Edit the DQ Group Name and other details as required.
For more information, see Create Data Quality Group.
4. Click Save.
The new DQ Group definition is displayed in the Data Quality Groups Summary window.

4.10.2.6 Viewing Data Quality Group Summary Log


You can view the execution log details of Data Quality Rules in the View Log window. The View Log
window displays the details such as Check Name, Log Message, Message Date, Message Time, Total
Rows, Rows Impacted, Assignment Type, Assignment Severity, and Severity Message of the executed Data
Quality Rules.
To view the Data Quality Rule execution log details in the Data Quality Groups Summary window:
1. From the Data Quality Groups Summary window, select the DQ Group Name whose execution log
you want to view.
The Data Quality Rules associated with the selected Group are displayed in the Data Quality Rules
grid.


2. Click the link in the Last Run Status column corresponding to the required Data Quality Rule.
Or
Select the required Data Quality Rule and click View Log from the Data Quality Rules toolbar.
The View Log window is displayed with the latest execution data pertaining to the selected Data
Quality Rule.

Figure 83: View Log window

 Select the Information Date from the drop-down list. Based on selection, you can select the
Group Run ID and Iteration ID from the corresponding drop-down lists.

 Click the View Log button on the Group Execution Details toolbar. The Data Quality Rule Log
grid displays the execution details of the selected Data Quality Rule. You can also click the
Reset button on the Group Execution Details toolbar to reset the selection.

4.10.2.7 Viewing Data Quality Report


You can view the execution summary report of Data Quality Rules in the Data Quality Reports window. The
Data Quality Summary Report grid displays details such as Group Name, Description, Category, Table,
Column, Total Rows, and Rows Impacted. By clicking the corresponding DQ check link under Category,
you can view the Data Quality Detailed Report grid, which displays details of the records that have a data
correction, such as Primary Key Columns, Error Value, and Assignment Value.

NOTE If you have opted to run T2T with data correction, the data
quality checking is done in the source, and the generated Data
Quality Report is only a preview of the actual execution. That is,
even though the execution may have failed, you can view the
Data Quality Report.


To view the Data Quality Reports window:


1. From the Data Quality Groups Summary window, select the DQ Group Name whose DQ Report you
want to view.
The Data Quality Rules associated with the selected Group are displayed in the Data Quality Rules
grid.
2. Select the checkbox corresponding to the DQ Rule and click the View Reports button in the Data
Quality Rules grid.
The Data Quality Reports window is displayed.
3. Select the Information Date from the drop-down list. Based on selection, you can select the Group
Run ID and Iteration ID from the corresponding drop-down lists.
4. Click the button on the Group Execution Details toolbar.
The Data Quality Summary Report grid is displayed.
5. Click the DQ check link under the Category column.
The Data Quality Detailed Report grid is displayed.

Figure 84: Data Quality Reports window

For Control Total Check type, the Data Quality Detailed Report displays Subject Reference Value, Operator,
Aggregate Reference Value, Group By columns, Aggregate Row Status and Rows Impacted.

4.10.2.8 Deleting Data Quality Group


You can remove DQ Group definitions that you created and that are no longer required in the system by
deleting them from the Data Quality Groups Summary window.
1. From the Data Quality Groups Summary window, select the checkbox adjacent to the required
Group Name whose details are to be removed.

2. Click the Delete button on the Data Quality Groups toolbar.
3. Click OK in the information dialog to confirm deletion.

DATA MANAGEMENT FRAMEWORK
REFERENCES

4.10.3 Configure Dynamic Degree of Parallelism (DOP) in DQ Framework

This feature allows you to apply Oracle parallelism or any other setting change before executing the DQ
component. You can add scripts in the preScriptDQDC.conf file located in the $FIC_DB_HOME/conf/
folder. These scripts are executed before the DQ task runs. They are generic scripts, common to all the
DCDQ tasks.

NOTE This is applicable only to Oracle-based Information Domains.

You can define any optimization statement inside the preScriptDQDC.conf file as stated below:
1. Statements starting with # are treated as comments and ignored.
2. Statements containing keywords such as CREATE, TRUNCATE, DROP, SELECT, and UPDATE are ignored.
3. Separate statements with either a semicolon (;) or a new line.
4. Accepted (filtered) statements are executed, and the log shows each with an execution status of
SUCCESS or FAILURE.
5. If the optimization statements cannot be executed, or if the file is not present in the respective path,
the log shows a message, but DCDQ does not fail; it continues with the execution.
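The filtering rules above can be approximated with the following sketch. It is illustrative only; the real filter is implemented inside the DCDQ component:

```python
import re

BLOCKED = {"CREATE", "TRUNCATE", "DROP", "SELECT", "UPDATE"}

def filter_prescript(text):
    """Return the statements from a preScriptDQDC.conf-style text that
    would be accepted under the documented rules."""
    accepted = []
    # Statements are separated either by ';' or by a new line.
    for stmt in re.split(r"[;\n]", text):
        stmt = stmt.strip()
        if not stmt or stmt.startswith("#"):
            continue  # blank line or comment: ignored
        words = {w.upper() for w in re.findall(r"\w+", stmt)}
        if words & BLOCKED:
            continue  # contains a blocked keyword: ignored
        accepted.append(stmt)
    return accepted

conf = """# tune the session before DCDQ runs
ALTER SESSION ENABLE PARALLEL DML;
DROP TABLE t1
ALTER SESSION FORCE PARALLEL QUERY PARALLEL 4"""
```

Here `filter_prescript(conf)` keeps only the two ALTER SESSION statements: the comment line is skipped, and the DROP statement is filtered out by the keyword rule.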

4.11 References
This section of the document consists of information related to intermediate actions that needs to be
performed while completing a task. The procedures are common to all the sections and are referenced
where ever required. You can see to the following sections based on your need.

4.11.1 Flat file


Flat files are data files that store records with no structured relationships. You can define the data source
of a flat file present locally or on a remote server.
A flat file in a local data source resides in the staging area of the Infrastructure Database Server.
Additional metadata information, such as file format properties, is required to interpret these files. A flat
file on a remote server can be accessed through an FTP connection to load the remote data file into the
staging area of the Infrastructure Database Server.
The Data Source for a flat file serves the purpose of logically grouping a set of flat files loaded
into the Warehouse from a defined source application.

4.11.2 RDBMS
An RDBMS, or relational database management system, stores data in the form of tables along with the
relationships of each data component. The data can be accessed or reassembled in many different ways
without having to change the table forms.


An RDBMS data source lets you define the RDBMS engine present locally or on a remote server using FTP
access. It can be defined to connect to any RDBMS, such as Oracle, Sybase, IBM DB2, or MS SQL Server,
or to any RDBMS through native connectivity drivers.
A separate license is required for third-party JARs, and the client has to procure it.

4.11.3 RAC
Real Application Clusters (RAC) allows multiple computers to run RDBMS software simultaneously while
accessing a single database, thus providing a clustered database.
In an Oracle RAC environment, two or more computers (each with an instance) concurrently access a
single database. This allows an application or user to connect to any of the computers and have access to
a single coordinated set of data. RAC addresses areas such as fault tolerance, load balancing, and
scalability.

4.11.4 Expression Builder


You can define an expression in the Expression Builder window to join two selected tables.

Figure 85: Expression Builder window

The Expression Builder window consists of the following sections:


• Entities - consists of the Entities folder with the list of tables that you selected from the Entity
Groups folder. Double-click the Entities folder to view the selected dimension tables (Product and
Segment tables).


• Functions – This is divided into Database Functions and User Defined Functions. Database Functions
consist of functions that are specific to databases such as Oracle and MS SQL Server. You can use
these functions along with Operators to specify the join condition.
The Function categories are displayed based on the database types, as tabulated.

Table 26: Database and its Functions

Database Functions

Transact SQL Specific to MS SQL server which consists of Date and Time, Math, and
System functions.

SQL OLAP Specific to Microsoft OLAP which consists of Array, Dimension,
Hierarchy, Logical, Member, Number, Set, and String functions.

SQL Specific to Oracle which consists of String, Aggregate, Date and Time,
and Mathematical functions.

NOTE It is not mandatory to specify a Function for a join condition.

• Operators - Consists of the function operators categorized into folders as tabulated.

Table 27: Operator and its Types

Operator Types

Arithmetic +, -, %, * and /

Comparison '=', '!=', '< >', '>', '<', >=, <=,'IN', 'NOT IN', 'ANY', 'BETWEEN', 'LIKE',
'IS NULL', and 'IS NOT NULL'.

Logical 'NOT', 'AND' and 'OR'

Set UNION, UNION ALL, INTERSECT and MINUS

Other The Other operators are 'PRIOR', '(+)', '(' and ')'.

To specify the join condition:


1. Select the Entity of the fact table to which you want to join the dimension entities.
2. Select a Function depending on the database type.
3. Select the Operator you want to use for the join condition.
4. Select the second Entity from the Entities pane that you want to join with the first entity. You can
also select more than one dimension table and link to the fact table.
The defined expression is displayed in the Expression pane. You can click Reset to reset the values.
5. Click OK.
The defined expression is validated as per the selected table and entity definition and on successful
validation, it is displayed in the main window.


4.11.5 Passing Runtime Parameters in Data Mapping


The following Parameters are supported in Expressions, Joins and Filters used in the Data Mapping
definition.
• $RUNID
• $PHID
• $EXEID
• $RUNSK
• $SYSDATE
• $TASKID
• $MISDATE
• $BATCHRUNID
Apart from the above $ parameters, any other parameter can be passed within square brackets. For
example, [PARAM1], [PARAM2], [XYZ], [ABCD].
Apart from these, L2H/H2H/T2H/H2T/F2H mappings also support the following additional default
parameters. Values for these are implicitly passed from ICC/RRF.
• $MISDT_YYYY-MM-DD - Data type is String and can be mapped to VARCHAR2. Value will be the
MISDATE in ‘yyyy-MM-dd‘ format.
• $MISYEAR_YYYY - Data type is String and can be mapped to VARCHAR2. Value will be the year
value in ‘yyyy‘ format from MISDATE.
• $MISMONTH_MM - Data type is String and can be mapped to VARCHAR2. Value will be the month
value in ‘MM‘ format from MISDATE.
• $MISDAY_DD - Data type is String and can be mapped to VARCHAR2. Value will be the date value in
‘dd‘ format from MISDATE.
• $SYSDT_YYYY-MM-DD - Data type is String and can be mapped to VARCHAR2. Value will be the
System date in ‘yyyy-MM-dd‘ format.
• $SYSHOUR_HH24 - Data type is String and can be mapped to VARCHAR2. Value will be the hour
value in ‘HH24‘ format from System date.
• $MISDT_YYYYMMDD - Data type is String and can be mapped to VARCHAR2. Value will be
MISDATE in YYYYMMDD date format.
• $SYSDATE_YYYYMMDD - Data type is String and can be mapped to VARCHAR2. Value will be the
System date in YYYYMMDD date format.

NOTE The aforementioned parameters are not supported for T2T and
F2T.

The T2T/L2H/H2H/T2H/H2T/F2H/T2F mappings also support the following parameter in v8.1.2.1.0
and later versions to get MISDATE as a number:


• $MISDT_SKEY - where the Data Type is Integer and can be mapped to NUMBER. The value is
MISDATE represented as a number.
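The values that the date-format parameters listed above resolve to can be illustrated for a sample MISDATE using Python's strftime. This is a sketch of the documented formats only, not framework code:

```python
from datetime import datetime

misdate = datetime(2024, 4, 9, 14, 30)  # a sample MISDATE / system date

values = {
    "$MISDT_YYYY-MM-DD": misdate.strftime("%Y-%m-%d"),  # '2024-04-09'
    "$MISYEAR_YYYY": misdate.strftime("%Y"),            # '2024'
    "$MISMONTH_MM": misdate.strftime("%m"),             # '04'
    "$MISDAY_DD": misdate.strftime("%d"),               # '09'
    "$SYSHOUR_HH24": misdate.strftime("%H"),            # '14'
    "$MISDT_YYYYMMDD": misdate.strftime("%Y%m%d"),      # '20240409'
}
```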
Two additional parameters are also supported for L2H mappings:
• [INCREMENTALLOAD] – Specify the value as TRUE/FALSE. If set to TRUE, historically loaded data
files will not be loaded again (load history is checked against the definition name, source name,
target Infodom, target table name and the file name combination). If set to FALSE, the execution is
similar to a snapshot load, and everything from the source folder/file will be loaded irrespective of
load history.
• [FOLDERNAME] – Value provided will be used to pick up the data folder to be loaded.
 For HDFS based Weblog source: Value will be suffixed to HDFS File Path specified during the
source creation.
 For a Local File System based Weblog source: By default, the system looks for an execution date
folder (MISDATE: yyyymmdd) under STAGE/<source name>. If the user has specified the
FOLDERNAME for this source, the system ignores the MISDATE folder and looks for the directory
provided as [FOLDERNAME].
Passing values to the Runtime Parameters from the RRF module
 Values for $Parameters are implicitly passed through RRF
 Values for dynamic parameters (given in square brackets) need to be passed explicitly as:
"PARAM1","param1Value","PARAM2","param2Value"
Passing values to the Runtime Parameters from the Operations module
 Value for $MISDATE is passed implicitly from ICC
 Value for other $ parameters and dynamic parameters (given in square brackets) is passed as:
[PARAM] = param1VALUE, $RUNSK = VALUE

NOTE If the Runtime parameter is a string or involves string comparison, ensure that appropriate single quotes are given in the DI UI. For example, the Filter Condition can be DIM_COUNTRY.CountryName = ‘[PARAMCNTRY]’.

4.11.6 Populating Assignment Type Details


To populate the Assignment Type details, select one of the following Assignment Type options from the
drop-down list and do the following:
• No Assignment: This assignment is selected by default and does not have any target column
update, but the message details are pushed.
• Direct Value: Enter the Assigned Value. You can specify numeric, decimal, string (Varchar/Char),
and negative values as required. If the length of the specified Assigned Value exceeds the base
column length, a system alert message is displayed.
• Another Column: Select the required Column as Assigned Value from the drop-down list.


• Code: If any code/leaf values exist for the selected base column, select the required Code as
Assigned Value from the drop-down list. If not, you are alerted with a message indicating that no
Code values exist for the selected base column.

• Expression: Click the button in the Assignment Value column and specify an expression using the
Specify Expression window. For more information, see Specify Expression.

NOTE The Expression you define in an Assignment Type field derives
the Assignment value and is not a filter condition as defined for
the Additional Condition field. Therefore, you need to specify an
expression that derives only the resultant value, which is
updated into the base column.
For example, the expression
“STG_NON_SEC_EXPOSURES.n_accrued_interest * 1.34”, on
validation, updates the base column with the value derived by
multiplying “n_accrued_interest” by 1.34. Expressions such as
“STG_NON_SEC_EXPOSURES.n_accrued_interest = 1.34” are
therefore considered invalid.

UNIFIED ANALYTICAL METADATA
ALIAS

5 Unified Analytical Metadata


The Unified Analytical Metadata transforms your ability to manage your enterprise by distributing a
consistent view of the business dimensions and key measures to every decision maker and application
developer. Oracle Financial Services Analytical Applications Infrastructure’s unique technology allows your
enterprise to define a consistent set of business terms and securely deploy them across the entire range
of analytic applications from your data warehouses and data marts to your business intelligence and
alerting tools to your data distribution and portal applications.
The Unified Analytical Metadata is intended for the Information and Business Analysts who are
instrumental in supporting and affecting analytical decisions. This section includes the following topics:
• Alias
• Derived Entity
• Datasets
• Dimension Management
• Measure
• Business Processor
• Expression
• Filter
• Map Maintenance
• Analytics Metadata

5.1 Alias
Alias refers to an assumed name or pseudonym. Alias section within the Infrastructure system facilitates
you to define an Alias for a table and specify the join condition between fact and dimension table. Alias
defined to a table help you to query data for varied analytical requirements.
The roles mapped to Alias module are as follows:
• Alias Access
• Alias Advanced
• Alias Authorize
• Alias Phantom
• Alias Read Only
• Alias Write
For all the roles and descriptions, see Appendix A.


Figure 86: Alias Summary window

The Alias Summary window displays the Alias name of the selected Entity. You can also add a new Alias,
view the Alias details and delete an existing Alias. Click the Column header names to sort the column
names in ascending or descending order. Click if you want to retain your user preferences so that
when you login next time, the column names will be sorted in the same way. To reset the user preferences,
click .

5.1.1 Adding Alias


This option allows you to add an Alias to an Entity. Your user group should be mapped to the role Alias
Write to add an Alias.
To create an Alias:
1. Select an Entity from the drop-down list for which you need to create an Alias and click Add. The
Add Alias window is displayed.

Figure 87:Alias Details Add window

The Alias Details grid in the Add Alias window displays the entity name you have selected in a non-editable field.

UNIFIED ANALYTICAL METADATA
DERIVED ENTITY

2. Enter the Alias name you wish to provide for the selected entity in the Alias Name field.
3. Click Save. The Alias name is listed under the Aliases grid for the selected entity.
The User Info section at the bottom of Add Alias window displays metadata information about the Alias
Name created. The User Comments section facilitates you to add or update additional information as
comments.

5.1.2 Viewing Alias


You need to be mapped to the role Alias Read Only to view an Alias.
To view the existing Alias:

Select an Entity from the drop-down list whose Alias details you want to view and click View. The
View Details window is displayed.
The User Info grid at the bottom of the window displays the metadata information about the Alias
definition along with the option to add comments.

5.1.3 Deleting Alias


You need to be mapped to the role Alias Write to delete an Alias.
To delete an Alias, follow these steps:

1. Select an Entity from the drop-down list whose Alias you want to delete, and click Delete on
the Aliases toolbar.
2. Click OK in the warning dialog to confirm deletion.
The selected Alias names are removed.

5.2 Derived Entity


An Entity refers to a table in which data is stored. Derived Entity within the Infrastructure system enables
you to define entities that are populated through a series of data transformation processes based on an
existing Dataset or a Data Source. A Derived Entity can be used to define other Business Metadata
such as Measures, Hierarchies, Dimensions, Datasets, and Cubes.
Partitioning support is available for Dataset based Derived Entities that have partitions enabled on the
FACT table. This facilitates fetching data from the specified partitions only, thus resulting in better
performance. The partition values can be provided dynamically.


Figure 88: Summary window

The Derived Entity Summary window displays the list of pre-defined Derived Entities with their Code, Short
Description, Long Description, Creation Date, Source Type, and Materialize View status. By clicking the
Column header names, you can sort the column names in ascending or descending order. Click if you
want to retain your user preferences so that when you login next time, the column names will be sorted in
the same way. To reset the user preferences, click .
You can add, view, edit, copy, and delete a Derived Entity. You can search for a specific Derived Entity
based on the Code, Short Description, Source Type, and Authorization status.
Based on the role that you are mapped to, you can access, read, modify or authorize Derived Entity. For all
the roles and descriptions, see Appendix A. The roles mapped to Derived Entity are as follows:
• Derived Entity Access
• Derived Entity Advanced
• Derived Entity Authorize
• Derived Entity Phantom
• Derived Entity Read Only
• Derived Entity Write

5.2.1 Creating Derived Entity


This feature allows you to create a Derived Entity based on a Dataset, an Entity, or a union of Derived
Entities. For the Union and Union All options, the metadata used in the participating Derived Entities
determines the columns of the physicalized materialized view. For a Union based Derived Entity, even if
the participating Derived Entities have metadata in common, the resultant materialized view in the
database will contain unique columns.
The same is explained in a tabular format:


Table 28: Derived Entity based on the Dataset

Union Based DE: UN001
Participating DEs and their metadata:
DE001 - MSR001, MSR002, MSR003
DE002 - MSR001, MSR004, MSR005
Final physicalized materialized view for the Union based DE:
MSR001, MSR002, MSR003, MSR004, MSR005

In case of a Union All based definition, the resultant materialized view in the database may have
repetition of data, based on the data present in the participating Derived Entities.
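The column resolution shown in Table 28 amounts to an order-preserving union of the participating measures. A sketch (not the actual physicalization logic):

```python
def union_columns(*participating):
    """Order-preserving union of the measure lists of participating
    Derived Entities, as for a Union based DE."""
    seen, columns = set(), []
    for measures in participating:
        for m in measures:
            if m not in seen:  # shared measures appear only once
                seen.add(m)
                columns.append(m)
    return columns

de001 = ["MSR001", "MSR002", "MSR003"]
de002 = ["MSR001", "MSR004", "MSR005"]
```

Here `union_columns(de001, de002)` yields MSR001 through MSR005 with the shared MSR001 appearing once, matching the table. For Union All, the columns are resolved the same way; only the data rows may repeat.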

NOTE To define a Derived Entity based on an Entity in a Data Source, you should have defined permissions for the particular Data Source in the Atomic schema.

You can approve a Derived Entity created by other users if you have the authorizer rights. You need to be
mapped to the role Derived Entity Write to add or create a Derived Entity.
Partitioning is supported for Dataset based Derived Entities which have partitions enabled on the FACT
table.
To create a Derived Entity:
1. Click Add from the Derived Entity toolbar. The Derived Entity Details window is displayed.


Figure 89: Derived Entity Details window

2. Enter the details as tabulated.


The following table describes the fields in the Derived Entity window.

Table 29: Fields in the Derived Entity window Descriptions

Field Description

Code Enter a distinct code to identify the Derived Entity. Ensure that the code is
alphanumeric with a maximum of 8 characters in length and contains no
special characters except underscore “_”.
Note the following:
The code can be indicative of the type of Derived Entity being created.
A pre-defined Code and Short Description cannot be changed.
The following reserved words cannot be used as the Code or Short
Description for an Essbase installation: “$$$UNIVERSE$$$”, “#MISSING”,
“#MI”, “CALC”, “DIM”, “ALL”, “FIX”, “ENDFIX”, “HISTORY”, “YEAR”,
“SEASON”, “PERIOD”, “QUARTER”, “MONTH”, “WEEK”, “DAY”.

Short Description Enter a Short Description based on the defined code. Ensure that the
description is of a maximum of 80 characters in length and does not contain
any special characters except “_, ( ), -, $”.


Long Description Enter the Long Description if you are creating subject-oriented Derived
Entity to help users for whom the Derived Entity is being created or other
details about the type/subject. Ensure that the description is of a maximum
of 100 characters in length.

Source Type Select the source type from the drop-down list. The options are Dataset,
Entity, Union and Union All. The Union and Union All options are used to
create a Derived Entity by combining 2 or more existing Derived Entities.

Aggregate This field is enabled only if Source Type is selected as Dataset.


Turn ON the Aggregate toggle button to collate the information for the
Derived Entity.

Materialize View Turn ON the Materialize View toggle button if you are using Oracle
database to create a Materialized View with the Derived Entity Name and
short description.
Note: You cannot enable the Materialize View option if you are using IBM
DB2 database.

Dataset Name This field is enabled only if the Source Type is selected as Dataset.
Select the Dataset Name from the drop-down list. The Short Description for
the Datasets is available in the drop-down list to select.

Source Name This field is enabled only if the Source Type is selected as Entity.
Select the Source Name from the drop-down list.

Refresh Interval This field is enabled only if the Materialize View toggle button is turned ON.
Select the appropriate refresh interval from the drop-down list. The options
are:
None- Only materialized view will be created. If you select None for Refresh
Interval, it is mandatory to select None for Refresh Method.
Demand- The refresh of the Materialized View is initiated by a manual
request or a scheduled task.
Commit- The refresh is triggered by a committed data change in one of the
dependent tables.

Refresh Method This field is enabled only if the Materialize View toggle button is turned ON.
Select the appropriate refresh method from the drop-down list. The options
are:
None- Only materialized view will be created. If you have selected None for
Refresh Interval, it is mandatory to select None for Refresh Method.
Complete- This recreates the materialized view replacing the existing data.
This can be a very time-consuming process, especially if there are huge
amounts of data to be read and processed.
Fast- Applies the incremental changes to refresh the materialized view. If
materialized view logs are not present against the source tables in advance,
the creation fails.
Force- A fast refresh is attempted. If it is not possible, it applies Complete
refresh.
Note: Refresh Methods Fast and Commit do not work if the query has some
ANSI Join conditions.


Enable Query Rewrite This toggle button is enabled only if the Materialize View toggle button is
turned ON.
Turn ON the toggle button if you want to create materialized view with the
query rewrite option.

Parallelism

Hint Specify Hints (if any), for optimized execution of query. The specified hints
are appended to the underlying query of the derived entity.
Oracle hints follow (/*+ HINT */) format.
For example, /*+ PARALLEL */.

Prebuilt Table This toggle button is enabled only if the Materialize View toggle button is
turned ON and Source Type is selected as Dataset.
Turn ON the toggle button to enable partition for the Derived Entity.

On selecting the Dataset Name or Source Application Name, the respective fields are displayed in
the Metadata for Source Type list.
3. Double-click Metadata for Source Type.
 For Source Type selected as Dataset, the Metadata for Source Type displays all Hierarchies
and Measures defined on the Entities that are part of the selected Dataset, and Business
processors defined on the selected Datasets.
 For Source Type selected as Entity, it displays all Entities in the selected DI Source.
 For Source Type selected as Union or Union All, it displays all Derived Entities created with
Source Type as Dataset. You can select maximum of 15 Derived Entities.

4. Click to expand the folders. Select the required metadata and click the select button, or click
the select-all button to select all metadata. You can select a metadata and click the deselect
button to remove it, or click the deselect-all button to remove all selected metadata.
5. Select the hierarchy for which you want to add partition from the Partition drop-down list. This field
is enabled only if the Materialize View toggle button is turned ON and Source Type is selected as
Dataset. This drop-down lists the Hierarchies you selected as Metadata for Source Type.
6. Click Save.
A confirmation dialog is displayed.
The details are displayed in the Derived Entity Summary window.

5.2.2 Adding Partition Values


This option is used for adding partition values for the Derived Entity definitions which are created with
the Prebuilt Table flag set as Y. After you provide partition values, data is fetched from the specified
partitions only, thereby resulting in better performance.
To add partition values:


1. From the Derived Entity Summary window, select the Derived Entity for which you want to add
partition values and click Partitions. The Partition Details window is displayed.

Figure 90: Partition Details window

2. Click the add button and enter the partition value in the editable row.


3. Click Save.
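The performance effect of supplying partition values can be sketched as follows. This is an assumption-laden illustration (the partition and table names are hypothetical), not the exact SQL the application generates:

```sql
-- Illustrative sketch only; partition and table names are hypothetical.
-- With a partition value supplied, the read is confined to one partition
-- instead of scanning the whole table.
SELECT n_account_skey, n_eop_balance
FROM   fct_account_summary PARTITION (p_20240331);
```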

5.2.3 Copying Derived Entity


You can copy the pre-defined Derived Entity details to create another entity. You should have the Derived
Entity Write role mapped to your user group to copy a Derived Entity.
To copy a Derived Entity:

1. From the Derived Entity Summary window, select the derived entity you want to copy and click
Copy. The Derived Entity Details window is displayed.
2. Enter the required details.
For more information, see Creating Derived Entity section.
3. Click Save.

5.2.4 Viewing Derived Entity Properties


You can view the metadata of the selected Derived Entity.
To view the existing Derived Entity definition details follow these steps:

1. From the Derived Entity Summary window, select the derived entity you want to view and click
View. The Derived Entity Details window is displayed.


The View Derived Entity Details window displays the details of the selected Derived Entity definition.
The User Info grid at the bottom of the window displays the metadata information about the
Derived Entity definition created along with the option to add comments.
2. Click Close.

5.2.5 Modifying Derived Entity


You can modify a Derived Entity definition as required. A Derived Entity definition in the unauthorized
state (modified by other users) cannot be modified. You can modify Derived Entity if you have Derived
Entity Write role mapped to your user group.
1. From the Derived Entity Summary window, select the derived entity you want to modify and click
Edit.
The Derived Entity Details window is displayed.
2. Modify the required details such as Short Description, Long Description and the metadata to be
associated with the Derived Entity.
For more information, see Create Derived Entity.
3. Click Save and update the details.
When you modify a Derived Entity which is mapped to other metadata definition, the Affected
Metadata Dialog is displayed with the list of mapped Datasets, Measures, and Hierarchies which
gets auto updated. Click OK to confirm, else click Cancel.

Figure 91: Message window

5.2.6 Deleting Derived Entity


You can delete a Derived Entity that you have created, or one that you are authorized to delete. A
Derived Entity in Unauthorized state (modified by other users) cannot be deleted. You can delete a
Derived Entity if you have the Derived Entity Write role mapped to your user group.
The Delete function permanently removes the Derived Entity from the database. Ensure that you have
verified the details as indicated below:


• A Derived Entity definition marked for deletion is not accessible for other users.
• Every delete action has to be Authorized/Rejected by the authorizer.
 On Authorization, the Derived Entity details are removed.
 On Rejection, the Derived Entity details are reverted back to authorized state.
• You cannot update Derived Entity details before authorizing/rejecting the deletion.
• An unauthorized Derived Entity definition can be deleted.
To delete a Derived Entity in the Derived Entity window:

1. From the Derived Entity Summary window, select the derived entity you want to delete and click
Delete.
2. Click OK in the confirmation dialog.

5.3 Datasets
Dataset refers to a group of tables whose inter-relationship is defined by specifying a join condition
between the various tables. It is a basic building block used to create queries that execute on a data
warehouse for a large number of functions and to generate reports.
The Dataset function within the Infrastructure system facilitates you to create Datasets and specify
rules that fine-tune the information for querying, reporting, and analysis. Datasets improve query time
by pre-defining the names of the tables required for an operation (such as aggregation), and also
provide the ability to optimize the execution of multiple queries on the same table set. For more
information, see the Scenario to Understand the Dataset Functionality section.


Figure 92: Data Sets Summary window

The Datasets window displays the list of pre-defined Datasets with their Code, Short Description and Long
Description. You can add, view, edit, copy, and delete the required Dataset. You can also search for a
specific dataset based on the Code, Short Description, and Authorization status or view the list of existing
datasets within the system.
By clicking the column header names, you can sort the columns in ascending or descending order.
Click the save-preferences button if you want to retain your user preferences so that, when you log in
next time, the columns will be sorted in the same way. To reset the user preferences, click the reset button.
Based on the role that you are mapped to, you can access read, modify or authorize Datasets. For all the
roles and descriptions, see Appendix A. The roles mapped to Datasets are as follows:
• Dataset Access
• Dataset Advanced
• Dataset Authorize
• Dataset Phantom
• Dataset Read Only
• Dataset Write


5.3.1 Creating Dataset


You can create a Dataset by defining the Dataset Details, Entities, and Dataset Definition. You need to
have the Dataset Write role mapped to create Datasets.
To create a Dataset in the Datasets window:

1. From the Dataset Summary window, click Add from the Datasets tool bar.
The Dataset Details window is displayed.

Figure 93: Dataset Details window

2. Enter the details in the Dataset Details section as tabulated.


The following table describes the fields in the Dataset Details window.

Table 30: Fields in the Dataset Details window and their Descriptions

Field Description

Code
Enter a distinct code to identify the Dataset. Ensure that the code is
alphanumeric with a maximum of 8 characters in length and there are no
special characters except underscore "_".
Note the following:
The code can be indicative of the type of Dataset being created.
A pre-defined Code and Short Description cannot be changed.
The same Code or Short Description cannot be used for an Essbase installation:
"$$$UNIVERSE$$$", "#MISSING", "#MI", "CALC", "DIM", "ALL", "FIX",
"ENDFIX", "HISTORY", "YEAR", "SEASON", "PERIOD", "QUARTER",
"MONTH", "WEEK", "DAY".
In Unauthorized state, users having Authorize rights can view all the
unauthorized Metadata.

Short Description
Enter a Short Description based on the defined code. Ensure that the
description is a maximum of 8 characters in length and does not contain
any special characters except underscore "_".

Long Description
Enter the Long Description if you are creating a subject-oriented Dataset, to
help the users for whom the Dataset is being created, or to capture other
details about the type/subject.
Ensure that the description is a maximum of 100 characters in length.

3. From the Entities pane, you can perform the following:

 Select the required entity and click the select button.

 To select all entities, click the select-all button.


 To remove an entity, select the entity from the Selected Values grid and click the deselect button.

 To remove all entities from the Selected Values grid, click the deselect-all button.


4. Specify the required table-join condition in the Dataset Definition pane as tabulated:

Figure 94: Dataset Definition pane

The following table describes the fields in the Dataset Definition pane.

Table 31: Fields in the Dataset Definition pane and their Descriptions

Field Description

ANSI Join
The ANSI Join condition defines which sets of data are joined, along with
the type of join condition. It also describes the exact operations to be
performed while joining the Datasets. In an ANSI join, the join logic is
clearly separated from the filtering criteria.

Join/Filter Condition
The Join/Filter Condition facilitates the objective of creating Datasets.
Datasets with tables linked using join conditions help in reducing the
query time. There are two ways of defining the join condition:
The JOIN condition for a SQL Server/SQL OLAP combination should contain
only EQUI JOIN conditions, as required by SQL OLAP.
In case of SQL Server/Essbase and Oracle/Essbase, the Dataset must be
defined. Multiple cubes can be built in a single pass, and the underlying
Dataset definition should be the same for all the mapped cubes, which
reduces the aggregation time considerably.

Date Filter
The Date Filter condition enables you to cascade the cubes that are using
the Dataset with the defined Date Filter.

Order By
The Order By condition enables you to sort the dimension data in order.
The order of the Dimension nodes is maintained only for Business
Intelligence enabled hierarchies. The Order By condition is specific to the
Essbase database.
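As a hedged illustration of the ANSI Join style described above (the table and column names are hypothetical, and $MISDATE stands in for the MIS DATE parameter entered at preview time), the join logic sits in the ON clause while filtering stays in the WHERE clause:

```sql
-- Illustrative sketch only; object names are hypothetical.
SELECT c.n_customer_skey,
       a.n_eop_balance
FROM   dim_customer c
       INNER JOIN fct_account_summary a
               ON a.n_customer_skey = c.n_customer_skey        -- join logic
WHERE  a.fic_mis_date = TO_DATE('$MISDATE', 'YYYYMMDD');       -- filter
```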

5. Enter the required expression, or click the button to define an expression using the Expression
Builder window.
For more information, see Expression Builder.
6. Click Preview.
The Data of Dataset <<dataset name>> window is displayed.


Figure 95: Data of Dataset CBRC Mitigant Dataset window

This window displays an error message if the query execution fails. Up to 400 records of data are
displayed in the Summary Grid pane.
7. Click Show Query to view the query.
8. Enter the values for MIS DATE (YYYYMMDD) and RUN SKEY parameters.
9. Click Save and save the Dataset Definition details.

5.3.2 Viewing Dataset Details


You can view individual Dataset details at any given point. You need to have Dataset Read Only role
mapped to view the Datasets. To view the existing Dataset definition details in the Datasets window:
1. From the Dataset Summary window, select the checkbox adjacent to the required Dataset code.

2. Click View from the Datasets toolbar.


The View Datasets window displays the details of the selected Dataset definition. The User Info grid
at the bottom of the window displays the metadata information about the Dataset definition created
along with the option to add comments.

5.3.3 Modifying Dataset Details


You can update the existing Dataset definition details except for the Code and Short Description. You
need to have Dataset Write role mapped to modify the Datasets. To update the required Dataset details
in the Datasets window:
1. From the Dataset Summary window, select the checkbox adjacent to the required Dataset code.

2. Click Edit from the Datasets toolbar.


The Edit Datasets window is displayed.
3. Update the required details.
For more information, see Create Dataset.
4. Click Save and update the changes.


5.3.4 Copying Dataset Details


You can copy the existing Dataset details to quickly create a new Dataset. You can later modify the Dataset
Code or Short Description, add/remove tables, and also define the join conditions. You need to have
Dataset Write role mapped to copy the Dataset definitions. To copy an existing Dataset definition in the
Datasets window:
1. From the Dataset Summary window, select the checkbox adjacent to the required Dataset code.

2. Click Copy from the Datasets toolbar.


The Dataset definition details are copied and a confirmation message is displayed.

5.3.5 Deleting a Dataset


You can remove Dataset definitions that you have created and that are no longer required in the
system by deleting them from the Datasets window. You need to have the Dataset Write role mapped to
delete a Dataset. The Delete function permanently removes the Dataset details from the database.
Ensure that you have verified the details as indicated below:
• A Dataset definition marked for deletion is not accessible for other users.
• Every delete action has to be Authorized/Rejected by the authorizer.
 On Authorization, the Dataset details are removed.
 On Rejection, the Dataset details are reverted back to authorized state.
• You cannot update Dataset details before authorizing/rejecting the deletion.
• An unauthorized Dataset definition can be deleted.
To delete an existing Dataset in the Datasets window:
1. From the Dataset Summary window, select the checkbox adjacent to the required Dataset code.

2. Click Delete from the Datasets toolbar.


A confirmation dialog is displayed.
3. Click OK. The Dataset details are marked for delete authorization.

5.4 Dimension Management


Dimension Management within the Infrastructure system facilitates you to categorize data into a single
object as a Member; define levels and aggregate data to form the Hierarchies, and distinguish each
member by defining the required Attributes.
The roles mapped to Dimension Management are as follows:
• Dimension Access
• Dimension Advanced
• Dimension Authorize
• Dimension Phantom
• Dimension Read Only


• Dimension Write

Object Security
• This is implemented for Hierarchy, Filter, and Expressions objects.
• There are some seeded user groups and seeded user roles mapped to those user groups. If you are
using the seeded user groups, the restriction on accessing objects based on user groups is
explained in the OFSAA Seeded Security section.
• For creating/editing/copying/removing an object in Dimension Management module, your user
group should have been mapped to the folder in case of public or shared folder, or you should have
been the owner of the folder in case of private folder. Additionally, the WRITE role should be
mapped to your user group. For more information, see Object Security in OFSAAI section.
• To access the link and the Summary window, your user group should have ACCESS role mapped.
You can view all objects created in Public folders, Shared folders to which you are mapped and
Private folders for which you are the owner. For more information, see the Object Security in
OFSAAI section.
• The Folder selector window behavior and consumption of higher objects are explained in User
Scope section.

Hierarchy Member Security


• This is implemented for Hierarchy and Filter objects.
• For each information domain, a mapper definition can be set as the default Security mapper. Based
on this mapper definition, the usage of hierarchy members is restricted.
• The nodes/members in a Hierarchy/Filter which are mapped to your user group are enabled
and can be used. Those which are not mapped can be viewed, but you cannot use them since they
are in a disabled state.
• If a child hierarchy is mapped and the parent is not mapped to your user group, the parent will be
displayed as a disabled node.
• You should have separate roles/functions mapped to add a leaf, sibling, or child to your hierarchy.

5.4.1 Components of Dimension Management


Dimension Management consists of the following sections. Click on the links to view the sections in detail.
• Attributes
• Members
• Build Hierarchy
• Hierarchy Maintenance


5.4.2 Attributes
Attributes refer to the distinguishing properties or qualifiers that describe a dimension member.
Attributes may or may not exist for a simple dimension. The Attributes section is available within the
Dimension Management section of the Financial Services Applications module.

Figure 96: Attributes window

The Attributes window displays the list of pre-defined Dimension Attributes with the other details such as
the Numeric Code, Name, Data Type, Required, and Seeded. You can search for a specific Attribute based
on Numeric Code, Name, or Data Type and view the list of existing definitions within the system.

5.4.2.1 Adding Attribute Definition


Attributes facilitates you to define the properties or qualifiers for the Dimension members. The Write role
should be mapped to your user group, from the User Group Role Map window.
To create an Attribute definition in the Attributes window:
1. From the Attributes window, click Add.
The Attribute Definition (New Mode) window is displayed.

Figure 97: Attributes window

2. In the Dimension section, select the required dimension from the drop-down list.


3. Click the button in the Numeric Code field.
A unique code is auto-generated. You can also manually enter the code in the Numeric Code field.
4. Enter the Name and required Description for the Attribute.

NOTE Name: The characters ' " & ( ) % , ! / -are restricted in the name
field.
Description: The characters ~&+' "@ are restricted in the
description field.

5. Enter the Attribute details as tabulated:


The following table describes the fields in the Attribute window.

Table 32: Fields in the Attributes window and their Descriptions

Field Description

Fields marked with a red asterisk (*) are mandatory.

Type
Select the Data Type as DATE, DIMENSION, NUMBER, or STRING from the
drop-down list.
If NUMBER is selected as the Data Type:
The Scale field is enabled with "0" as the default value.
Enter a Scale value >= 0. If it is left as 0, values for this attribute are
limited to integers. If you wish to enable decimal entries for this attribute,
the maximum Scale value must be > 0 and <= the scale defined for
NUMBER_ASSIGN_VALUE in the dimension's underlying attribute table. See
the Data Model Utilities Guide for further details on the attribute table.

Required Attribute
Select Yes or No. If this is set to No, an attribute value is optional for the
associated dimension members.
Note: This field is disabled in Add and Edit modes if any members already
exist for the Dimension upon which this attribute is defined.

Default Value
If Required Attribute is set to Yes, a Default Value must be entered;
otherwise it is optional.
If DIMENSION is selected as the Data Type:
Select the required Dimension from the drop-down list in the Dimension
field. Select the Default Value from the drop-down list of members mapped
to the selected Dimension. If the required Member is not listed in the
drop-down, select --More-- and the Member Search window is displayed.
For more information, see search.
If NUMBER is selected as the Data Type:
Enter a numeric value in the Default Value field; it must be consistent
with the Scale you have defined.
If DATE is selected as the Data Type:
Click the calendar button to select a valid date as the Default Value.
If STRING is selected as the Data Type:
Enter an alphanumeric value in the Default Value field.
The maximum number of characters allowed in the Default Value field for
the String Data Type is 1000.

6. Click Save.
The entries are validated and the defined Attribute is captured.

5.4.2.2 Viewing Attribute Definition


You can view individual Attribute Definition details at any given point. The Read only role should be
mapped to your user group.
To view the existing Attribute Definition details in the Attribute window:
1. Select the checkbox adjacent to the Numeric Code of the Attribute, whose details are to be viewed.

2. Click View button in the Dimension Attributes tool bar.


The View – Attributes window is displayed with the details such as Dimension, Numeric Code, Name,
Description, and Attribute Properties.

5.4.2.3 Modifying Attribute Definition


You can modify the Name, Description, or Default Value fields of an attribute definition. The Write role
should be mapped to your user group.
To modify an existing Attribute Definition in the Attributes window follow these steps:
1. Select the checkbox adjacent to the Numeric Code of the Attribute, whose details are to be updated.


2. Click Edit button in the Dimension Attribute tool bar. Edit button is disabled if you have selected
multiple Attributes.
The Edit - Attributes window is displayed.
3. Edit the Attribute details such as Name, Description, or Default value.
For more information, see Add Attribute Definition.
4. Click Save to save the changes.

5.4.2.4 Copying Attribute Definition


The Copy Attribute Definition facilitates you to quickly create a new Attribute Definition based on the
existing attributes or by updating the values of the required attributes. The Write role should be mapped
to your user group.
To copy an existing Attribute Definition in the Attributes window:
1. Select the checkbox adjacent to the Numeric Code of the Attribute, whose details are to be
duplicated.

2. Click Copy button in the Dimension Attributes toolbar to copy a selected Attribute definition.
Copy button is disabled if you have selected multiple Attributes.
3. In the Copy – Attributes window you can:
 Create new attribute definition with existing variables. Specify new Numeric Code and
Attribute Name. Click Save.
 Create new attribute definition by updating the required variables. Specify new Numeric Code
and Attribute Name. Update the required details. For more information, see Add Attribute
Definition. Click Save.
The new attribute definition details are displayed in the Attributes window.

5.4.2.5 Attribute Definition Dependencies


You can view the dependencies of Attributes. The Read only role should be mapped to your user group.
To view the dependency of attribute in the Attributes window:
1. Select the checkbox adjacent to the Numeric Code of the Attribute whose dependency is to be
checked.

2. Click the Check Dependencies button in the Dimension Attributes toolbar.


The Check Dependencies button is disabled if you have selected multiple attributes. The Attributes
Dependency Information window is displayed with the dependency details.

5.4.2.6 Deleting Attribute Definition


You can remove the Attribute Definitions which are not required in the system by deleting from the
Attributes window. The Write role should be mapped to your user group.
1. Select the checkbox adjacent to the Numeric Code(s) of the Attributes whose details are to be
removed.


2. Click Delete button in the Dimension Attributes tool bar.


3. Click OK in the information dialog to confirm deletion.

5.4.3 Members
Dimension Members refer to the individual items that constitute a dimension when data is categorized
into a single object, for example, Product, Organization, Time, and so on. Members are available within
the Dimension Management section of the Infrastructure system.
For more information on how to set up alphanumeric and numeric codes, see Configurations to use
Alphanumeric and Numeric Codes for Dimension Members section in OFSAAI Administration Guide.

Figure 98: Members window

The Members window displays the list of pre-defined Dimension Members with the other details such as
the Alphanumeric Code, Numeric Code, Name, and Is Leaf. You can also search for a specific Member
based on Alphanumeric / Numeric Code (irrespective of whether dimension is configured to be numeric or
alphanumeric), Name, Description, Enabled status, Is Leaf status, Attribute Name, or Attribute Value and
view the list of existing definitions within the system.

5.4.3.1 Adding Member Definition


This option allows you to add a Member definition. The Write role should be mapped to your user group.
To create a Member definition in the Members window:
1. Click Add from the toolbar.
The Member Definition (New Mode) window is displayed.


Figure 99: Members Add window

2. In the Dimensions section, select the required Dimension from the drop-down list.
3. Enter the Member Details as tabulated:
The following table describes the fields in the Member Add window.

Table 33: Fields in the Members Add window and their Descriptions

Field Description

Fields marked with a red asterisk (*) are mandatory.

Alphanumeric Code
The Alphanumeric Code field is editable only if the selected Dimension
accepts an Alphanumeric Code, for example, the Billing Method Dimension.
Otherwise, the field is read-only and the value is fetched from the entered
Numeric Code.
Enter the required Alphanumeric Code. Ensure that the code has a
maximum of 14 characters and there are no special characters like & ' ~
" @ + included.

Numeric Code
Enter the Numeric Code by doing any of the following:
To auto-generate a Numeric Code, click the generate button. A system-
generated code is displayed.
Manually enter the required code, which is auto-validated for uniqueness.
A maximum of 14 numeric characters can be specified.
Note: If the selected Dimension accepts only a Numeric Code, the specified
Numeric Code is auto-populated to the Alphanumeric Code field as well.

Name
Enter the Name of the Member.
Note: The characters ' " & ( ) % , ! / are restricted.

Description
Enter the required Description for the Member.
Note: The characters ~&+' "@ are restricted.

Enabled
This field is set to Yes by default and is editable only in the Edit window.
Note: You can change the option to No only when the particular member is
not used in any hierarchy. Disabled members are not displayed in Hierarchy
rules, or in UIs which are based on Hierarchies, such as Hierarchy Filters
and the hierarchical assumption browsers used in applications.

Is Leaf
This field is set to Yes by default.
If Yes, the particular member can be used as a leaf node in any hierarchy,
and a child cannot be added to this node.
If No, the node becomes a non-leaf and can have child nodes.
Note: A member created as Non-Leaf having child nodes in any hierarchy
cannot be made Leaf.

NOTE: If the Dimension is selected as "Common Chart of Accounts", proceed further. Else, jump to
step 5.

4. Click the button in the Copy Attribute Assignment From field.
The Member Browser Properties window is displayed. This field can be left blank so that the Member
Attributes panel can be filled in without considering the values already assigned.


Figure 100:Member Browser Properties window

 Select the required Member from the Dimension Members list.

Click the button in the Search grid to search for a specific Member based on Alphanumeric
Code, Numeric Code, Name, Description, Enabled status, Is Leaf status, Attribute Name, or
Attribute Value. You can also click the find button to locate a member present in the Dimension
Members grid using keywords.
 Click OK.
The selected Member is displayed in the Copy Attribute Assignment From field in New –
Member Details window and the details of selected Attribute are displayed in the Member
Attributes section. You can edit the Attribute details as indicated:
Edit Attribute based on date by clicking the (Calendar) icon.
Edit Attribute based on Dimension Value by selecting from the drop-down list.
Edit Attribute based on Number Value by entering the valid numerical value.
Edit Attribute based on String Value by specifying alphanumerical value.
5. Click Save and the defined Member Definition is captured after validating the entries.


5.4.3.2 Viewing Member Definition


You can view individual Member Definition details at any given point. To view the existing Member
Definition details in the Members window:
1. Select the checkbox adjacent to the Alphanumeric Code of the Member, whose details are to be
viewed.

2. Click View button in the toolbar.


The View – Member Details window is displayed with the details such as Dimension, Member
Details, and Member Attributes details.

5.4.3.3 Modifying Member Definition


To modify an existing Member Definition in the Members window:
1. Select the checkbox adjacent to the Alphanumeric Code of the Member, whose details are to be
updated.

2. Click Edit button in the toolbar.


Edit button is disabled if you have selected multiple Members. The Edit – Member Details window is
displayed.
3. Edit the Member details as required.
For more information, see Add Member Definition.
4. Click Save to save the changes.

5.4.3.4 Copying Member Definition


The Copy Member Definition facilitates you to quickly create a new Member Definition based on the
existing attributes or by updating the values of the required members.
To copy an existing Member Definition in the Members window:
1. Select the checkbox adjacent to the Alphanumeric Code of the Member, whose details are to be
duplicated.
2. Click Copy button in the toolbar to copy a selected Member definition.
Copy button is disabled if you have selected multiple Members.
3. In the Copy – Member Details window you can:
 Create new Member with existing variables. Specify the Numeric Code and new Member
Name.
 Create new Member definition by updating the required variables. Specify the Numeric Code
and new Member Name. Update the required details. For more information, see Add Member
Definition. Click Save.
The new member definition details are displayed in the Members window.

5.4.3.5 Member Definition Dependencies


You can view the dependencies of Members. To view the dependency of member in the Members window:


1. Select the checkbox adjacent to the Alphanumeric Code of the Member, whose dependency is to be
viewed.

2. Click Check Dependencies button in the toolbar.


The Check Dependencies button is disabled if you have selected multiple members. The Members
Dependency Information window is displayed with the dependency details.

5.4.3.6 Deleting Member Definition


You cannot delete predefined members or the members which are the Nodes for a hierarchy.
To delete a Member in the Members window.
1. Select the checkbox adjacent to the Alphanumeric Code(s) of the Members, whose details are to be
removed.

2. Click Delete button in the Dimension Members tool bar.


3. Click OK in the information dialog to confirm deletion.

5.4.4 Build Hierarchy


Business Hierarchy refers to organizing data into a logical tree structure that represents the groups
and relations among the various levels at which a measure can be viewed. A measure can be viewed at
different levels depending upon the hierarchy breakdown of the dimension category.
Based on the role that you are mapped to, you can access read, modify or authorize Build Hierarchy. For
all the roles and descriptions, see Appendix A. The roles mapped to Business Hierarchy are as follows:
 BMM Hierarchy Access
 BMM Hierarchy Advanced
 BMM Hierarchy Authorize
 BMM Hierarchy Phantom
 BMM Hierarchy Read Only
 BMM Hierarchy Write
For example, consider the following structure.


Figure 101: Business Hierarchy

You can view the Number of Customers (Measure) across Income Group (Dimension), which is further broken down by different age groups (Hierarchy). While the number of customers is a metric, it provides better-quality information when viewed against a categorization, such as the customer income profile or customers with an annual income of over USD 100,000.
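This roll-up behavior can be sketched in a few lines of Python (a hypothetical illustration only; the sample data and function name are invented, not part of the product):

```python
# Hypothetical illustration: viewing a measure (customer count) at
# different levels of a dimension hierarchy (income group -> age group).
from collections import defaultdict

# Sample records: (income_group, age_group, customer_count)
records = [
    ("Over 100K", "18-30", 120),
    ("Over 100K", "31-50", 340),
    ("Under 100K", "18-30", 560),
    ("Under 100K", "31-50", 410),
]

def rollup(records, depth):
    """Aggregate the measure at the given hierarchy depth (1 or 2)."""
    totals = defaultdict(int)
    for row in records:
        key = row[:depth]       # truncate the dimension path to the level
        totals[key] += row[-1]  # sum the measure
    return dict(totals)

# Level 1: the measure across Income Group only
print(rollup(records, 1))  # {('Over 100K',): 460, ('Under 100K',): 970}
# Level 2: broken down further by Age Group
print(rollup(records, 2))
```

The same records yield different totals depending on the level at which the hierarchy is cut, which is what the income-group example above describes.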


Figure 102: Business Hierarchy window

The Business Hierarchy window displays the list of pre-defined Business Hierarchies with their Code, Short
Description, Long Description, Hierarchy Type, Hierarchy Sub Type, Entity, and Attribute. You can create
Business Hierarchies for measure(s), and view, edit, copy, or delete the required Business Hierarchies. For
more information on the Business Hierarchy Types and Sub-types, see Business Hierarchy Types.

NOTE When an AMHM hierarchy is created, a UAM Business Hierarchy is also implicitly created and is listed in the Summary window of Business Hierarchy. The Code of the implicitly populated UAM Hierarchy is system-generated, 11 characters in length, and prefixed with AMHM.

You can also search for a specific Business Hierarchy based on the Code, Short Description, Hierarchy
Type, Hierarchy Sub Type, and Authorization status, or view the list of existing Business Hierarchies within
the system.

5.4.4.1 Creating Business Hierarchy


You can create a Business Hierarchy by specifying the Hierarchy definition details and defining the
required Hierarchies. You need to be mapped to the role BMM Hierarchy Write to add or create a business
hierarchy.
To create a Business Hierarchy in the Business Hierarchy window:
1. Click Add button from the Business Hierarchy toolbar.
The Add Business Hierarchy window is displayed.


Figure 103: Add Business Hierarchy window

2. Enter the details in the Business Hierarchy Details section as described in the following table.

Table 34: Fields in the Business Hierarchy window and Descriptions

Field Description

Code: Enter a distinct code to identify the Hierarchy. Ensure that the code is alphanumeric, a maximum of 8 characters in length, and contains no special characters except underscore (_).
Note the following:
The code can be indicative of the type of Hierarchy being created.
A pre-defined Code and Short Description cannot be changed.
The following Codes or Short Descriptions cannot be used in an Essbase installation: "$$$UNIVERSE$$$", "#MISSING", "#MI", "CALC", "DIM", "ALL", "FIX", "ENDFIX", "HISTORY", "YEAR", "SEASON", "PERIOD", "QUARTER", "MONTH", "WEEK", "DAY".
In the Unauthorized state, users with Authorize rights can view all unauthorized Metadata.

Short Description: Enter a Short Description based on the defined code. Ensure that the description is a maximum of 8 characters in length.
Note: The characters ' " & ( ) % , ! / are restricted.

Long Description: Enter the Long Description if you are creating a subject-oriented Hierarchy, to help the users for whom the Hierarchy is being created, or to record other details about the type/subject. Ensure that the description is a maximum of 100 characters in length.
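The Code constraints above can be expressed as a small validation sketch (an illustrative Python reimplementation of the rules in the table; the function name and rules-as-code are assumptions, not the product's actual validation logic):

```python
import re

# Reserved words that cannot be used as Code/Short Description in an
# Essbase installation, per the table above.
ESSBASE_RESERVED = {
    "$$$UNIVERSE$$$", "#MISSING", "#MI", "CALC", "DIM", "ALL", "FIX",
    "ENDFIX", "HISTORY", "YEAR", "SEASON", "PERIOD", "QUARTER",
    "MONTH", "WEEK", "DAY",
}

def is_valid_code(code):
    """Alphanumeric, max 8 chars, underscore is the only special character."""
    if not 1 <= len(code) <= 8:
        return False
    if not re.fullmatch(r"[A-Za-z0-9_]+", code):
        return False
    return code.upper() not in ESSBASE_RESERVED

print(is_valid_code("HIER_01"))        # True
print(is_valid_code("WEEK"))           # False: reserved in Essbase
print(is_valid_code("TOO_LONG_CODE"))  # False: more than 8 characters
```

A check like this would reject a code before it reaches the hierarchy definition, mirroring the constraints the window enforces.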


3. In the Business Hierarchy Definition section, select the Hierarchy Type from the drop-down list.

NOTE Hierarchy Type is the basic differentiator; based on your selection, the other options to define the Business Hierarchy are available.

You can select the following Hierarchy Type/Sub-Type. Click on the links to navigate to the
respective sections and define the required Hierarchy. For detailed information on all the Hierarchy
Types, see Business Hierarchy Types.

Hierarchy Type Description / Hierarchy Sub Type

Regular: In a Regular Hierarchy Type, you can define the following Hierarchy Sub Types:
 Non Business Intelligence Enabled: In a non-Business Intelligence Enabled Hierarchy, you need to manually add the required levels. The levels defined form the Hierarchy.
 Business Intelligence Enabled: You can use a Business Intelligence Enabled Hierarchy when you are not sure of the hierarchy structure's leaf values, when the information is volatile, or when the hierarchy structure can be selected directly from RDBMS columns. In a BI-enabled Hierarchy, you are prompted to specify whether a Total node is required (not mandatory), and the system auto-detects the values based on the actual data. For example, you can define three levels in a BI-enabled hierarchy, such as Region (1), State (2), and Place (3). The auto-generated hierarchy values are:

Region (1) | State (2) | Place (3)
South | Tamil Nadu | Madras
South | Karnataka | Bangalore
South | Andhra Pradesh | Hyderabad
North | Punjab | Chandigarh

 Parent Child: Select this option to define a Parent Child type hierarchy.

Measure: A Measure Hierarchy consists of the defined measures as nodes and supports only the Non Business Intelligence Enabled Hierarchy Sub Type.

Time: A Time Hierarchy consists of levels/nodes of high time granularity and supports only the Business Intelligence Enabled Hierarchy Sub Type.
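The way a BI-enabled hierarchy auto-detects its member values from actual data can be sketched as follows (a hypothetical Python illustration using the Region/State/Place example; not the product's implementation):

```python
# Derive the distinct member values for levels Region -> State -> Place
# directly from tabular data, the way a BI-enabled hierarchy detects
# its values from actual rows rather than from a manually defined list.
rows = [
    ("South", "Tamil Nadu", "Madras"),
    ("South", "Karnataka", "Bangalore"),
    ("South", "Andhra Pradesh", "Hyderabad"),
    ("North", "Punjab", "Chandigarh"),
]

def detect_levels(rows):
    """Return the sorted distinct member values observed at each level."""
    return [sorted({row[i] for row in rows}) for i in range(len(rows[0]))]

regions, states, places = detect_levels(rows)
print(regions)  # ['North', 'South']
print(states)   # ['Andhra Pradesh', 'Karnataka', 'Punjab', 'Tamil Nadu']
```

Because the levels are derived from the data itself, the hierarchy stays correct even when the leaf values are volatile, which is the scenario the sub-type description gives for choosing BI-enabled hierarchies.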


NOTE When the defined Hierarchy consists of more than 100 leaf levels, the system treats it as a Large Hierarchy in order to provide efficient and optimized hierarchy handling. For more information on modifying the default value, see Large Hierarchy.

4. After you have populated the required details in the Business Hierarchy Definition and Hierarchy Details sections, click Save in the Add Business Hierarchy window to save the details.

5.4.4.2 Viewing Business Hierarchy


You can view an individual Business Hierarchy at any given point. You need to be mapped to the BMM Hierarchy Read Only role to view a Business Hierarchy. To view the existing Business Hierarchy definition details in the Business Hierarchy window:
1. Select the checkbox adjacent to the required Business Hierarchy code.

2. Click the View button in the Business Hierarchy toolbar.


The View Business Hierarchy window displays the details of the selected Business Hierarchy
definition. The User Info grid at the bottom of the window displays metadata information about
Business Hierarchy created along with the option to add comments.

5.4.4.3 Modifying Business Hierarchy


You can update the existing Business Hierarchy definition details except for the Code and Hierarchy Type/Sub-Type. You need to be mapped to the BMM Hierarchy Write role to modify a Business Hierarchy.

NOTE You cannot modify the implicitly created Business Hierarchies for AMHM Hierarchies.

To update the required Business Hierarchy details in the Business Hierarchy window:
1. Select the checkbox adjacent to the required Business Hierarchy code.

2. Click the Edit button in the Business Hierarchy toolbar.


The Edit Business Hierarchy window is displayed.
3. Update the required details.
For more information, see Create Business Hierarchy.
4. Click Save and update the changes.

5.4.4.4 Copying Business Hierarchy


You can copy the existing Business Hierarchy details to quickly create a new Business Hierarchy. You need to be mapped to the BMM Hierarchy Write role to copy a Business Hierarchy. To copy an existing Business Hierarchy definition in the Business Hierarchy window:


1. Select the checkbox adjacent to the required Business Hierarchy code.

2. Click the Copy button in the Business Hierarchy toolbar.


The Business Hierarchy definition details are copied and a confirmation message is displayed.

5.4.4.5 Deleting Business Hierarchy


You can remove Business Hierarchy definitions that you created and that are no longer required in the system by deleting them from the Business Hierarchy window. The Delete function permanently removes the Business Hierarchy details from the database. You need to be mapped to the BMM Hierarchy Write role to delete a Business Hierarchy. Ensure that you have verified the following details:
• A Business Hierarchy definition marked for deletion is not accessible to other users.
• Every delete action has to be Authorized/Rejected by the authorizer.
 On Authorization, the Business Hierarchy details are removed.
 On Rejection, the Business Hierarchy details are reverted to the authorized state.
• An unauthorized Business Hierarchy definition can be deleted.
You can delete an implicitly created Business Hierarchy for an AMHM Hierarchy if it is not used in any higher objects. After the Business Hierarchy is deleted, it will not be re-created if you resave the AMHM Hierarchy.

5.4.5 Hierarchy Maintenance


Hierarchies refer to dimension members that are arranged in levels, with each level representing the
aggregated total of the data from the level below. One dimension type can have multiple hierarchies
associated with it. Hierarchies are available within the Dimension Management section of Infrastructure
system.
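The idea that each level holds the aggregated total of the level below can be sketched as follows (a hypothetical Python illustration; the tree and leaf values are invented):

```python
# A small parent -> children hierarchy; each non-leaf node's value is
# the aggregated total of the level below it.
tree = {
    "Total": ["North", "South"],
    "North": ["Punjab"],
    "South": ["Tamil Nadu", "Karnataka"],
}
leaf_values = {"Punjab": 50, "Tamil Nadu": 120, "Karnataka": 80}

def node_total(node):
    """Recursively aggregate a node's value from its children."""
    if node in leaf_values:
        return leaf_values[node]
    return sum(node_total(child) for child in tree[node])

print(node_total("South"))  # 200
print(node_total("Total"))  # 250
```

Walking up the tree this way is exactly the bottom-up aggregation the definition describes: a level's value is nothing more than the total of the level beneath it.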
You can access Hierarchies window by expanding Unified Analytical Metadata and Dimension
Management within the tree structure of LHS menu and selecting Hierarchy Maintenance.

Figure 104: Hierarchies window


The Hierarchies window displays the list of Hierarchies created in all public folders, shared folders to which you are mapped, and private folders that you own, along with other details such as the Name, Display Level, Created By, Creation Date, and Last Modification Date. For more information on how object access is restricted, see the Object Security in AMHM module section.
You can also search for a specific Hierarchy definition based on Folder, Hierarchy Name, Dimension
Member Alphanumeric Code, Dimension Member Numeric Code, or Dimension Member Name and view
the existing definitions within the system.

5.4.5.1 Adding Hierarchy Definition


In the Hierarchies window, you can create a Hierarchy Definition with up to 15 levels by default; the maximum permissible number of levels is 58. To create a hierarchy, the Write role should be mapped to your user group.

NOTE When an AMHM Hierarchy is created, a UAM Business Hierarchy is also implicitly created and is listed in the Summary window of Business Hierarchy.

To create a Hierarchy definition in the Hierarchies window:


1. Click the Add button in the Hierarchies toolbar.
The New – Hierarchy Details window is displayed.

Figure 105: Hierarchies window

2. Select the Dimension from the drop-down list.
The selected Dimension is displayed as the default dimension for which the members are to be defined.
The following table describes the fields in the Hierarchy Properties window.


Table 35: Fields in the Hierarchies window and their Description

Field Description

Fields marked with a red asterisk (*) are mandatory.

Name: Enter the Name of the Hierarchy.
Note: The characters ' " & ( ) % , ! / are restricted.

Description: Enter the required Description for the Hierarchy.
Note: The characters ~ & + ' " @ are restricted.

Folder: Select the folder where the hierarchy is to be stored from the drop-down list. The Folder selector window behavior is explained in the User Scope section. Click to create a new private folder; the Segment Maintenance window is displayed. For more information, see Segment Maintenance.
Note: You can select the Segment/Folder Type as Private and the Owner Code as your own user code only.

Access Type: Select the Access Type as Read Only or Read/Write.
Read-Only: Select this option to give other users access to only view the hierarchy definition.
Note: A user with the Phantom and Write roles can modify or delete the hierarchy even though the access type is selected as Read-Only.
Read/Write: Select this option to give all users access to view, modify (including the Access Type), and delete the hierarchy definition.

Automatic Inheritance: Click Yes to inherit the hierarchy properties of the Parent to the Child. Click No if you want to define a new hierarchy.

Display Signage: Click Yes to display the Signage to the right-hand side of the member in the Show Hierarchy panel. Otherwise, click No.

Show Member Code: Select one of the following from the drop-down list:
Alphanumeric Code to Left of Name: Displays the Alphanumeric Code on the left side of the Member Name.
Alphanumeric Code to Right of Name: Displays the Alphanumeric Code on the right side of the Member Name.
Only Name - No Code: Displays only the Member Name.
Numeric Code to Left of Name: Displays the Numeric Code on the left side of the Member Name.
Numeric Code to Right of Name: Displays the Numeric Code on the right side of the Member Name.

Initial Display Level: Select the Initial Display Level from the drop-down list.

Orphan Branch: Click Yes to display the Orphan Branch in the Show Hierarchy panel. Otherwise, click No.

3. To add Child under the Show Hierarchy tab:


NOTE The TREE_NODE_LIMIT setting, previously in the AMHMConfig.properties file, has been moved to the database. There are two ways to render the members of a hierarchy: paginated mode and non-paginated mode. The mode is decided based on the number of nodes in the hierarchy:
• If the number of members is more than TREE_NODE_LIMIT (configured in the AMHM property tables in the CONFIG schema, not in the properties file), members are loaded in paginated mode.
• If the number of members is less than TREE_NODE_LIMIT, members are loaded in non-paginated mode.
• If TREE_NODE_LIMIT is not configured in the AMHM property tables, the value defaults to 5000. Any change in this table requires a server restart, and the values in the properties file do not affect 8.1.0.0.0+ environments.
Select the Pagination icon to view more options under the available components. Click a record to enable the Pagination buttons.
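The rendering-mode decision described in this note can be sketched as follows (assumed logic reconstructed from the note; the function and constant names are invented, and the behavior at exactly the limit is an assumption since the note only specifies "more than" and "less than"):

```python
# Default applied when TREE_NODE_LIMIT is absent from the AMHM property tables.
DEFAULT_TREE_NODE_LIMIT = 5000

def rendering_mode(member_count, tree_node_limit=None):
    """Decide between paginated and non-paginated rendering of hierarchy members."""
    limit = tree_node_limit if tree_node_limit is not None else DEFAULT_TREE_NODE_LIMIT
    return "paginated" if member_count > limit else "non-paginated"

print(rendering_mode(12000))     # paginated (exceeds the 5000 default)
print(rendering_mode(800))       # non-paginated
print(rendering_mode(800, 500))  # paginated (exceeds a configured limit of 500)
```

The point of the threshold is that very large hierarchies are paged to keep the tree responsive, while small ones load in one pass.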

a. Right-click in the Show Hierarchy tab.


b. Select the Add Child option; the Add Member window is displayed.

c. Select the required Member and click . The Member is displayed in the Selected Members

panel. Click to select all Members which are shown in the Show Members pane. Click
to select all nodes/ members in the server.

You can click to deselect a Member or click to deselect all the Members.
You can click to search for the required member using Alphanumeric code, Numeric Code,
Name, Description, Attribute Name, or Attribute Value.
You can also click button to toggle the display of Numeric Code left, right, or name and click
button to display Alphanumeric Code left, right, or name.
d. Click OK.
The selected Member is displayed as Child under Show Hierarchy panel in the New – Hierarchy
Details window.
4. To add Sibling:
a. Right-click on the Child and select the option Add Sibling.
The Add Member window is displayed.

b. Select the required Member and click .

The Member is displayed in the Selected Members panel. You can click to select all


Members which are shown in the Show Members pane. Click to select all nodes/ members
in the server.

c. You can click to deselect a Member or click to deselect all the Members. You can also
Click to search for the required member.
d. Click Apply.
The selected Member is displayed as Sibling below the Parent under Show Hierarchy panel in
the New – Hierarchy Details window.
5. To add Leaf under a Parent, Child, or Sibling:
a. Right-click the Parent or Child and select Add Leaf.
The Add Member window is displayed.

b. Select the required Member and click .

The Member is displayed in the Selected Members panel. You can click to select all

Members which are shown in the Show Members pane. Click to select all nodes/ members
in the server.

You can click to deselect a Member or click to deselect all the Members. You can also
Click to search for the required member.
c. Click Apply.
The selected Member is displayed as Leaf below the Parent or Sibling under Show Hierarchy
panel in the New – Hierarchy Details window.
6. To define Level Properties:
a. Select Level Properties from the options under Parent, Child, Sibling, or Leaf; the Level Properties window is displayed.
b. Enter the valid Name and Description in the respective fields.
c. Click OK and the Levels defined are displayed in the drop-down in Initial Level Display field in
Hierarchy Properties grid in New – Hierarchy Details window.
7. To cut and paste Child or Sibling:
a. Right-click on any node and select Cut.
b. Right-click on any node and Paste as Child or Paste as Sibling.
8. To Delete and Undelete:
a. Right-click on the node to be deleted and select Delete Node.
The node deleted is stroked out.
b. Right-click and select UnDelete to cancel deletion of the node.
9. To create and add a Child, Sibling, or Leaf:
a. Right-click on any node and select Create and add Child.
The New - Member Details window is displayed. For more information, see Add Member
Definition.


b. Right-click on any node and select Create and add Sibling.


c. Right-click on any node and select Create and add leaf.
10. To view the Member Properties and Member Attributes of a node in the Show Hierarchy panel:
a. Click < button and the Member Property grid is displayed.
b. Click on a Member.
The properties such as Alphanumeric code, Numeric Code, Name, Description, Enabled, Is Leaf,
Created By, Creation Date, Last Modified By, Last Modification Date, Attribute, and Value of the
selected Member are displayed in the Member Properties and Member Attributes grids.
In the Hierarchies window you can also:
 Click to collapse the members under a node.

 Click or to expand a branch or collapse a branch.

 Click or to focus or defocus a selected node except the root node.

 Click or to view the name of members right or left.

 Click or to view the Numeric code values of members right or left.

 Click or to show code or show name of the members.


 Click button to view the Advanced Properties of the nodes.
11. Click Save in the New – Hierarchy Details window to validate and capture the entries.
The Audit Trail section at the bottom of the window displays metadata about the Hierarchy, and the User Comments section lets you add or update additional information as comments.

5.4.5.2 Viewing Hierarchy Definition


You can view individual Hierarchy Definition details at any given point. To view the existing Hierarchy
Definition details in the Hierarchies window:
1. Select the checkbox adjacent to the Hierarchy Name.

2. Click the View button in the Hierarchies toolbar. The View button is disabled if you have selected multiple Hierarchies.
The View – Hierarchy Details window is displayed with all the Hierarchy details.
In the View – Hierarchy Details window you can click button to search for a member using the
Alphanumeric Code, Numeric Code, or Member Name in the Search dialog.

NOTE The search functionality of this button will not return any
values if you search for a node in the Orphan Branch of the
hierarchy.


5.4.5.3 Modifying Hierarchy Definition


You can modify the Name, Description, Folder, Access Type, Automatic Inheritance, Display Signage, Show Member Code, Initial Display Level, Orphan Branch, and Show Hierarchy details in the Edit – Hierarchy Details window.

NOTE When you modify a Hierarchy, the implicitly created UAM Business Hierarchy is also updated.

1. Select the checkbox adjacent to the Hierarchy Name whose details are to be updated.

2. Click the Edit button in the Hierarchies toolbar.
The Edit button is disabled if you have selected multiple Hierarchies. The Edit – Hierarchy Details window is displayed.
In the Edit – Hierarchy Details window you can click button to search for a member using the
Alphanumeric Code, Numeric Code, or Member Name in the Search dialog. Edit the Hierarchy
details as required.
For more information, see Add Hierarchy Definition.
3. Click Save and save the changes.

5.4.5.4 Copying Hierarchy Definition


The Copy Hierarchy Definition feature lets you quickly create a new Hierarchy Definition based on an existing one, either reusing its attributes or updating the values of the required hierarchies.
To copy an existing Hierarchy Definition in the Hierarchies window:
1. Select the checkbox adjacent to the Hierarchy name whose details are to be duplicated.

2. Click the Copy button in the Hierarchies toolbar to copy the selected Hierarchy definition.
The Copy button is disabled if you have selected multiple Hierarchies. The Copy – Hierarchy Details window is displayed.
In the Copy – Hierarchy Details window you can click button to search for a member using the
Alphanumeric Code, Numeric Code, or Member Name in the Search dialog.
3. In the Copy – Hierarchy Details window you can:
 Create new hierarchy definition with existing variables. Specify a new Hierarchy Name. Click
Save.
 Create new hierarchy definition by updating the required variables. Specify a new Hierarchy
Name and update the required details.
For more information, see Add Hierarchy Definition. Click Save.
The new Hierarchy definition details are displayed in the Hierarchies window.

5.4.5.5 Hierarchy Definition Dependencies


You can view the dependencies of Hierarchies. To view the dependency of Hierarchy in the Hierarchies
window:


1. Select the checkbox adjacent to the Hierarchy Name.

2. Click the Check Dependencies button in the Hierarchies toolbar. The Check Dependencies button is disabled if you have selected multiple Hierarchy definitions. The Hierarchies Dependency Information window is displayed.

5.4.5.6 Deleting Hierarchy Definition


You can remove Hierarchy Definitions that are no longer required in the system by deleting them from the Hierarchies window.

NOTE When you delete an AMHM Hierarchy, the implicitly created UAM Business Hierarchy is also deleted, if it is not used in higher objects.

1. Select the checkbox adjacent to Hierarchy Name(s) whose details are to be removed.

2. Click the Delete button in the Hierarchies toolbar.


3. Click OK in the information dialog to confirm deletion.
To delete an existing Business Hierarchy in the Business Hierarchy window:
1. Select the checkbox adjacent to the required Business Hierarchy code.

2. Click the Delete button in the Business Hierarchy toolbar.


A confirmation dialog is displayed.
3. Click OK. The Business Hierarchy details are marked for delete authorization.

5.5 Measure
Business Measure refers to a uniquely named data element of relevance which can be used to define views within the data warehouse. It typically implies aggregated information, as opposed to the detailed granular information available before adequate transformations.
Based on the role that you are mapped to, you can access, read, modify, or authorize a Measure. For all the roles and descriptions, see Appendix A. The roles mapped to Measure are as follows:
 Measure Access
 Measure Advanced
 Measure Authorize
 Measure Phantom
 Measure Read Only
 Measure Write
The Business Measure function within the Infrastructure system lets you create measures based on the area of analysis. While creating a measure, you can choose the aggregation type and apply business exclusion rules based on your query/area of analysis. Business Measures can be stored as Base and


Computed Measures and can also be reused in defining other multi-dimensional stores and query data
using the various modules of Oracle Analytical Application Infrastructure.

Figure 106: Business Measure Summary window

The Business Measures window displays the list of pre-defined Business Measures with their Code, Short
Description, Long Description, Aggregation Function, Entity, and Attribute. You can add, view, edit, copy,
and delete the required Business Measures. You can also search for a specific Business Measure based on
the Code, Short Description, and Authorization status or view the list of existing Business Measures within
the system.

5.5.1 Creating Business Measure


You can create a Business Measure by specifying the Business Measure Details and defining the Business Measure Definition. You can create a Business Measure if the Measure Write role is mapped to your user group.
To create a measure in the Business Measures window:
1. Click the Add button in the Business Measures toolbar.
The Add Business Measures window is displayed.


Figure 107: Business Measure Details window

2. Enter the details in the Business Measure Details pane as described in the following table.

Table 36: Fields in the Business Measure Details and their Description

Field Description

Fields marked with a red asterisk (*) are mandatory.

Code: Enter a distinct code to identify the Measure. Ensure that the code is alphanumeric, a maximum of 8 characters in length, and contains no special characters except underscore (_).
Note the following:
The code can be indicative of the type of Measure being created.
A pre-defined Code and Short Description cannot be changed.
The following Codes or Short Descriptions cannot be used in an Essbase installation: "$$$UNIVERSE$$$", "#MISSING", "#MI", "CALC", "DIM", "ALL", "FIX", "ENDFIX", "HISTORY", "YEAR", "SEASON", "PERIOD", "QUARTER", "MONTH", "WEEK", "DAY".
In the Unauthorized state, users with Authorize rights can view all unauthorized Metadata.

Short Description: Enter a Short Description based on the defined code. Ensure that the description is a maximum of 8 characters in length and does not contain any special characters except underscore (_).

Long Description: Enter the Long Description if you are creating a subject-oriented Measure, to help the users for whom the Measure is being created, or to record other details about the type/subject. Ensure that the description is a maximum of 100 characters in length.


3. Enter the details in the Business Measure Definition section.


a. Select the required Aggregation Function from the drop-down list.
The list consists of various metrics based on which a Measure can be aggregated.
The available aggregation functions are as tabulated.

Table 37: Aggregation Functions and its Descriptions

Aggregator Description

SUM: Adds the actual values of the attribute or data element to get the measure value.

COUNT: Counts the records for the data element to get the measure value, that is, counts the number of occurrences.

MAXIMUM: Takes the maximum of the data element to get the measure value.

MINIMUM: Takes the minimum of the data element to get the measure value.

COUNT DISTINCT: This function is different from a simple count aggregation function. The peculiarity of these measures is that they are linked to dimensions and vary across the hierarchies of those dimensions. With a Count Distinct aggregation function, a simple roll-up cannot determine the values at the intermediate nodes of the hierarchies from their leaf-level values.

Based on the selected Aggregation Function the Data Type is auto populated.
i. Select the Entity to load the data for the Measure from the drop-down list.
The list displays all the entities in the information domain, to which your application is
connected.
ii. Select the required Attribute from the drop-down list.
The list displays all the attributes in the selected entity.
iii. Define the Business Exclusions rules for the base Measure. You can enter the expression
or click button to define using the Expression Builder window.
iv. Define Filter Expression to filter the aggregation process. You can enter the expression or
click button to define using the Expression Builder window.
v. Turn on the Roll Up toggle to calculate the measure values and display the nodes at the total level. By default, this option is enabled if the Aggregation Type is Maximum, Minimum, Count, or Sum. Note that selecting the Roll Up option with Percentage Measures results in incorrect values at the intermediate/total levels.
4. Click Save to save the Business Measure details or click Close to discard the changes.
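The aggregation functions in Table 37, and the COUNT DISTINCT roll-up caveat, can be illustrated with a short Python sketch (invented sample data; not product code):

```python
# Illustrative aggregation over a measure column, mirroring Table 37.
values = [10, 20, 20, 40]

aggregations = {
    "SUM": sum(values),                 # 90
    "COUNT": len(values),               # 4
    "MAXIMUM": max(values),             # 40
    "MINIMUM": min(values),             # 10
    "COUNT DISTINCT": len(set(values)), # 3
}
print(aggregations)

# Why COUNT DISTINCT cannot be rolled up: two child nodes may share members.
node_a = {"cust1", "cust2"}
node_b = {"cust2", "cust3"}
# Summing the children's distinct counts over-counts the shared member:
print(len(node_a) + len(node_b))  # 4 (wrong at the parent level)
print(len(node_a | node_b))       # 3 (correct distinct count at the parent)
```

The last two lines show why the table says a simple roll-up cannot produce distinct counts at intermediate nodes: the parent must be computed from the underlying members, not from the children's totals.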


5.5.2 Viewing Business Measure


You can view an individual Business Measure at any given point. You can view a Business Measure if the Measure Read Only role is mapped to your user group. To view the existing Business Measure definition details in the Business Measures window:
1. Select the checkbox adjacent to the required Business Measure code.

2. Click the View button in the Business Measures toolbar.


The View Business Measures window displays the details of the selected Business Measure
definition.
The User Info grid at the bottom of the window displays the metadata information about the
Business Measure created along with the option to add comments.

5.5.3 Modifying Business Measure


You can modify a Business Measure if the Measure Write role is mapped to your user group. You can update the existing Business Measure definition details except for the Code and Short Description.
To update the required Business Measure details in the Business Measure window:
1. Select the checkbox adjacent to the required Business Measure code.

2. Click the Edit button in the Business Measures toolbar.


The Edit Business Measure window is displayed.
3. Update the required details.
For more information, see Create Business Measure.
4. Click Save and update the changes.

5.5.4 Copying Business Measure


You can copy the existing Business Measure details to quickly create a new Business Measure. You can later modify the Code or Short Description, add or remove Entities and Attributes, and define the join/filter conditions. You can copy a Business Measure if the Measure Write role is mapped to your user group.
To copy an existing Business Measure definition in the Business Measure window:
1. Select the checkbox adjacent to the required Business Measure code.

2. Click Copy button from the Business Measures tool bar.


The Business Measure definition details are copied and a confirmation message is displayed.

5.5.5 Deleting Business Measure


You can remove Business Measure definitions that you created and that are no longer required in the system by deleting them from the Business Measures window. To delete a Business Measure,



UNIFIED ANALYTICAL METADATA
BUSINESS PROCESSOR

you need to be mapped with the Measure Write role. The Delete function permanently removes the Business Measure details from the database. Before deleting, note the following:
• A Business Measure definition marked for deletion is not accessible for other users.
• Every delete action has to be Authorized/Rejected by the authorizer.
 ▪ On Authorization, the Business Measure details are removed.
 ▪ On Rejection, the Business Measure details are reverted to the authorized state.
• You cannot update Business Measure details before authorizing/rejecting the deletion.
• An unauthorized Business Measure definition can be deleted.
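The authorize/reject flow described above can be pictured as a small state machine. The sketch below is illustrative only; the class and state names are assumptions for the illustration and are not actual OFSAAI APIs.

```python
# Hypothetical sketch of the delete-authorization workflow described above.
# Names (BusinessMeasure, state strings) are illustrative, not OFSAAI APIs.

class BusinessMeasure:
    def __init__(self, code):
        self.code = code
        self.state = "AUTHORIZED"      # a saved, authorized definition

    def mark_for_delete(self):
        # The definition becomes inaccessible to other users and
        # cannot be updated until the authorizer acts.
        self.state = "PENDING_DELETE"

    def authorize_delete(self):
        # On Authorization, the details are removed permanently.
        assert self.state == "PENDING_DELETE"
        self.state = "DELETED"

    def reject_delete(self):
        # On Rejection, the definition reverts to the authorized state.
        assert self.state == "PENDING_DELETE"
        self.state = "AUTHORIZED"

m = BusinessMeasure("MEASURE01")
m.mark_for_delete()
m.reject_delete()
print(m.state)   # AUTHORIZED
```

The point of the sketch is that a definition marked for deletion is in a holding state: it can only move forward (removed) or back (authorized) once the authorizer acts.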
To delete an existing Business Measure in the Business Measure window:
1. Select the checkbox adjacent to the required Business Measure code.

2. Click Delete button from the Business Measure tool bar.


A confirmation dialog is displayed.
3. Click OK. The Business Measure details are marked for delete authorization.

5.6 Business Processor


A Business Processor refers to a uniquely named data element of relevance that can be used to define views within the data warehouse. It typically implies aggregated information, as opposed to information at a detailed granular level that is available before adequate transformations.
A Business Processor encapsulates the business logic for assigning a value to a measure as a function of observed values of other measures. Measurements that require complex transformations, where data is transformed as a function of available base measures, require Business Processors. A supervisory requirement necessitates the definition of such complex transformations with available metadata constructs.
Business Processors are the metadata constructs used in the definition of such complex rules. They are designed to update a measure with another computed value. When a rule defined with a Business Processor is processed, the newly computed value is updated on the defined target.
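Conceptually, a Business Processor applies a function of base measures to each record and writes the result to a target measure. The sketch below is illustrative only; the function name, the dictionary-based record, and the example formula are assumptions, not OFSAAI APIs.

```python
# Illustrative sketch only: a Business Processor updates a target measure
# with a value computed from other (base) measures on the same record.

def business_processor(record, target, formula):
    """Apply a rule: set record[target] = formula(record)."""
    record[target] = formula(record)
    return record

# Base measures observed on a record.
row = {"principal_balance": 200000.0, "interest_rate": 0.05}

# The encapsulated business logic: annual interest accrual.
row = business_processor(
    row,
    target="accrued_interest",
    formula=lambda r: r["principal_balance"] * r["interest_rate"],
)
print(row["accrued_interest"])   # 10000.0
```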
Based on the role that you are mapped to, you can access, read, modify, or authorize Business Processors. For all the roles and descriptions, see Appendix A. The roles mapped to Business Processor are as follows:
• BMM Processor Access
• BMM Processor Advanced
• BMM Processor Authorize
• BMM Processor Phantom
• BMM Processor Read Only
• BMM Processor Write
You can access Business Processor window by expanding Unified Analytical Metadata within the tree
structure of LHS menu and selecting Business Processor.


Figure 108: Business Processor window

The Business Processor window displays the list of pre-defined Business Processors with their Code, Short Description, Long Description, Dataset, and Measure. The Business Processor window allows you to generate values that are functions of base measure values. Using the metadata abstraction of a Business Processor, power users can design rule-based transformations of the underlying data within the data warehouse/store. You can use the Search and Filter option to search for specific Business Processors based on Code, Short Description, or Authorized status. The Pagination option helps you to manage the view of existing Business Processors within the system.

5.6.1 Adding Business Processor


You need to be mapped to the role group BMM Processor Write to add a Business Processor.
To create a Business Processor from the Business Processor window:
1. Click Add button.
The Add Business Processor window is displayed.


Figure 109: Business Processor Add window

2. Enter the details as tabulated:


The following table describes the fields in the Business Processor window.

Table 38: Fields in the Business Processor window and their Description

Field Description

Code While creating a new Business Processor, you need to define a distinct
identifier/Code. It is recommended that you define a code that is
descriptive or indicative of the type of Business Processor being created.
This will help in identifying it while creating rules.
Note the following:
It is mandatory to enter a Code.
The Code should be a minimum of eight characters in length; it can contain
alphabetic, numeric (0-9), or alphanumeric characters.
The Code should start with an alphabetic character.
The Code cannot contain special characters, with the exception of the
underscore symbol (_).
The saved Code or Short Description cannot be changed.


Short Description Short description is useful in understanding the content of the Business
Processor you are creating. It would help to enter a description based on
the code.
Note the following:
It is mandatory to enter a Short Description.
The Short Description should be a minimum of one character and a
maximum of 80 characters in length.
Only alphanumeric, non-English, and special characters such as <blank
space>, ".", "$", "&", "%", "<", ">", ")", "(", "_", and "-" are permitted in
the Short Description field.

Long Description The Long Description gives an in-depth understanding of the Business
Processor you are creating. It helps to enter a Long Description based on
the code.
The Long Description should be a minimum of one character and a
maximum of 100 characters in length.

Dataset Select the Dataset from the drop-down list. The list of available Datasets
for the selected Information Domain will appear in the drop-down.
The Short Description of the Datasets as entered in the Datasets window
under Business Metadata Management will be reflected in the drop-
down.

Measure Select the Measure from the drop-down list. All base measures that are
defined on any of the tables present in the selected Dataset will appear
in the drop-down.
If the underlying measure is deleted after the Business Processor
definition, then the corresponding Business Processor definition will
automatically be invalidated.

Expression Click the button. The Expression window is displayed.
For more details on creating an expression using entities, functions, and
operators, see the Create Expression section.
The placeholder option enables you to provide values for the constants in
the expression. You can specify values for the Business Processor
expression at Run time, rather than at definition time, through the
placeholders defined while specifying the expression. You can specify the
expression in the Expression field.
Note the following:
The values for the placeholders can be alphanumeric.
Specifying placeholders enables you to execute the same Business
Processor definition with different values at Run time.


Expression has Aggregate Function The expression may require an aggregation function depending on the
business logic. The aggregation functions have to be entered in the
expression field per acceptable syntax. If an aggregation function is
used in the expression, the Expression has Aggregate Function checkbox
must be enabled. Leave the checkbox blank if your expression does not
contain an aggregation function.

You can also:

 Click button in the Business Processor Definition grid to refresh the entries.
 Click Parameters to specify default values for any of the placeholders defined.
The Parameters window is displayed.

Figure 110: Parameters window

i. Enter a default value for the placeholders defined along with the expression in the Default
Value field.
ii. Click Save to save the default value for a placeholder.
The User Info grid at the bottom of the window displays the metadata information about the
Business Processor definition created along with the option to add comments.
3. Click Save. The Business Processor is saved and listed in the Business Processor window after
validating the entries.
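The placeholder mechanism and the Code format rules described in Table 38 can be sketched as follows. This is an illustration only: the `#name#` placeholder syntax, the helper names, and the client-side `eval` are assumptions made for the sketch; OFSAAI defines and evaluates expressions through its own UI and engine.

```python
import re

# Sketch of the placeholder mechanism from Table 38 (illustrative names;
# the actual OFSAAI expression syntax may differ). A definition written
# once can be executed with different placeholder values per run.

CODE_RULE = re.compile(r"^[A-Za-z][A-Za-z0-9_]{7,}$")   # >= 8 chars, starts
                                                        # with a letter, only
                                                        # underscore allowed

def run_processor(code, expression, defaults, run_values=None):
    assert CODE_RULE.match(code), "invalid Business Processor code"
    values = {**defaults, **(run_values or {})}   # run-time values override defaults
    for name, value in values.items():
        expression = expression.replace("#" + name + "#", str(value))
    return eval(expression)   # illustration only; OFSAAI evaluates server-side

expr = "1000 * #rate#"
print(run_processor("BP_ACCRUAL", expr, defaults={"rate": 0.05}))
print(run_processor("BP_ACCRUAL", expr, defaults={"rate": 0.05},
                    run_values={"rate": 0.08}))
```

The second call shows the point of placeholders: the same definition runs with a different value supplied at Run time, without editing the saved expression.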

5.6.2 Viewing Business Processor


You need to be mapped with the role group BMM Processor Read Only to view a Business Processor.
You can view individual Business Processor definition details at any given point. To view the existing
Business Processor definition in the Business Processor window:
1. Select the checkbox adjacent to the required Business Processor code.

2. Click View button from the Business Processor tool bar.


The View Business Processor window displays the details of the selected Business Processor
definition. The User Info grid at the bottom of the window displays the metadata information about
the Business Processor definition along with the option to add comments.

5.6.3 Editing Business Processor


You need to be mapped with the role group BMM Processor Write to edit Business Processor.
You can update the existing Business Processor definition details except for the Business Processor Code
and Short Description. To update the required Business Processor definition details in the Business
Processor window:
1. Select the checkbox adjacent to the required Business Processor code.

2. Click Edit button from the Business Processor tool bar.


The Edit Business Processor window is displayed.
3. Update the details as required.
For more information see Add Business Processor.
4. Click Save and update the changes.

5.6.4 Copying Business Processor


You need to be mapped with the role group BMM Processor Write to copy a Business Processor.
You can copy the existing Business Processor to quickly create a new Business Processor definition based
on the existing rule details or by updating the required parameters. To copy an existing Business
Processor definition in the Business Processor window:
1. Select the checkbox adjacent to the required Business Processor code in the list whose details are to
be duplicated.

2. Click Copy button from the Business Processor tool bar. Copy button is disabled if you have
selected multiple checkboxes.
The Copy Business Processor window is displayed.
3. Edit the Business Processor details as required. It is mandatory that you change the Code and Short
Description values.
For more information see Add Business Processor.
4. Click Save.
The defined Business Processor is displayed in the Business Processor window.

5.6.5 Deleting Business Processor


You need to be mapped with the BMM Processor Write role to delete a Business Processor.
You can remove Business Processor definitions that are no longer required in the system by deleting
them from the Business Processor window.
1. Select the checkbox(s) adjacent to the Business Processor codes whose details are to be removed.

2. Click Delete button from the Business Processor tool bar.



UNIFIED ANALYTICAL METADATA
EXPRESSION

3. Click OK in the Warning dialog to confirm deletion.


The selected Business Processor definitions are removed.

5.7 Expression
An Expression is a user-defined tool that supplements other IDs and enables you to manipulate data flexibly.
An Expression has three different uses:
• To specify a calculated column that the Oracle Financial Services Analytical Application derives
from other columns in the database.
• To calculate assignments in data correction.
• To create calculated conditions in data and relationship filters.
Example: Calculations such as average daily balances, current net book balance, average current net book
balance, and weighted average current net rate can be created through Expressions.
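As a worked illustration of one of the example calculations above, a weighted average current net rate is a balance-weighted mean of account rates. The record layout below is assumed for the sketch and is not an OFSAA structure.

```python
# Illustrative sketch: the kind of calculated value an Expression can
# define, here a weighted average current net rate over account records.

accounts = [
    {"balance": 100000.0, "rate": 0.04},
    {"balance": 300000.0, "rate": 0.06},
]

total_balance = sum(a["balance"] for a in accounts)
weighted_avg_rate = (
    sum(a["balance"] * a["rate"] for a in accounts) / total_balance
)
print(round(weighted_avg_rate, 4))   # 0.055
```

The larger balance pulls the average toward its rate: 0.055 sits three quarters of the way from 0.04 to 0.06 because 300,000 is three quarters of the total balance.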
Based on the role that you are mapped to, you can access, read, modify, or authorize the Expression window.
For all the roles and descriptions, see Appendix A.
The roles mapped to Expression are as follows:
• Expression Access
• Expression Advanced
• Expression Authorize
• Expression Phantom
• Expression Read Only
• Expression Write

Figure 111: Expression Summary window

The Expression Summary window displays the list of pre-defined Expressions with other details such as
the Expression Name, Folder Name, Return Type, Created By, and Creation Date. For more information on
how object access is restricted, see Object Security in Dimension Management module section.
You can also search for a specific Expression definition based on Folder Name, Expression Name, or
Return Type and view the list of existing definitions within the system.


5.7.1 Adding Expression Definition


This option allows you to add an expression definition using variables, functions, and operators. The Write
role should be mapped to your user group.
To create a new Expression from the Expressions Summary window:
1. Click Add button in the Expressions Toolbar.
The New - Expression window is displayed.

Figure 112: Expression Summary New window

2. In the Expression Details grid:


 Enter the Expression Name and the required Description.

NOTE Expression Name: The characters &' " are restricted in the
name field.
Description: The characters ~&+' "@ are restricted in the
description field.

 Select the Folder Name from the drop-down list.


 The Folder selector window behavior is explained in User Scope section.
 Click to create a new private folder.
The Segment Maintenance window is displayed. For more information, see Segment
Maintenance.

NOTE You can select Segment/Folder Type as Private and the Owner Code as your user code only.

 Select the Access Type as Read Only or Read/Write.


 Read-Only: Select this option to give other users the access to only view the expression.


NOTE A user with Phantom and Write role can modify or delete the
expression even though the access type is selected as Read-
only.

 Read/Write: Select this option to give all users the access to view, modify (including Access
Type) and delete the expression.
3. In the Entity Group Selection grid:

 In the Variants section, click button The Variant Selection window is displayed.
 Select the Entity Type and Entity Name from the drop-down lists.

 Select the required member and click .


The member is displayed in the Selected Members list. Click to select all the Members.

You can also click to deselect a Member or click to deselect all Members.
 Click OK.
The selected Entity Name and Members are displayed in the Variants section in the New
Expression window.
 In the Variant’s section, click “+” to expand Entity Group and double-click to select the required
Entity.
The selected Entity is displayed in the Expression grid.
 In the Function section, click “+” to expand Functions and select a function such as
Mathematical, Date, String, or Others options.
The selected Function is displayed in the Expression grid. For more information see Function
Types and Functions.
 In the Operators section, click “+” to expand Operators and select an operator such as
Arithmetic, Comparison, or Others.
The selected Operator is displayed in the Expression grid. For more information see Operator
Types.

 You can click button from the Add Constant grid to specify a Constant Value. Enter
the numerical value and click .
 In the Expression grid, you can right-click on the expression and do the following:

 Click Replace Expression ( ) to replace the expression with a new one.


 Click Insert Expression After ( ) to insert a new expression after the selected
expression.

 Click Delete ( ) to delete a selected expression.

 You can also click button in the Expression grid to clear the Expression.
4. Click Save to validate the entries and save the new Expression.


5.7.2 Viewing Expression


You can view individual Expression details at any given point. To view the existing Expression details in the
Expression Summary window:
1. Select the checkbox adjacent to the Expression Name.

2. Click View button in the Expressions tool bar.


The View Expression window is displayed with the Expression details.

5.7.3 Modifying Expression


You can modify the Expression details as required in the Edit – Expression screen.
1. Select the checkbox adjacent to the Expression Name whose details are to be updated.

2. Click Edit button and the Edit – Expression window is displayed. Modify the required changes.
For more information, see Add Expression Definition.
3. Click Save to update the changes.

5.7.4 Copying Expression


The Copy Expression facilitates you to quickly create a new Expression based on the existing parameters
or by updating the values. To copy an existing Expression in the Expression Summary window:
1. Select the checkbox adjacent to the Expression Name of which you want to create a copy.

2. Click Copy button in the Expressions tool bar. Copy button is disabled if you have selected
multiple checkboxes.
The Copy – Expression window is displayed.
3. In the Copy – Expression window, you can:
 ▪ Create a new Expression with the existing variables. Specify a new Expression Name and click Save.
 ▪ Create a new Expression by updating the required variables. Specify a new Expression Name and
update the required details.
For more information, see Add Expression Definition. Click Save.
The new Expression details are displayed in the Expression Summary window.

5.7.5 Checking Dependencies


You can view the dependencies of a defined Expression in the Expression Summary screen:
1. Select the checkbox adjacent to the required Expression Name.

2. Click button in the Expressions tool bar. The Check Dependencies button is disabled if you
have selected multiple expressions.
The Dependent Objects window is displayed with Object id, Name, and id type of the dependent Objects.



UNIFIED ANALYTICAL METADATA
FILTER

5.7.6 Deleting Expression


You can delete an expression which has Read/Write Access Type. To delete an expression from the
Expression Summary window:
1. Select the checkbox adjacent to the Expression Name(s) whose details are to be removed.

2. Click Delete in the Expressions tool bar.


3. Click OK in the information dialog to confirm deletion.

5.8 Filter
Filters in the Infrastructure system allow you to filter metadata using the defined expressions.

5.8.1 Navigating to Filters


You can access Filters by expanding the Unified Analytical Metadata section within the tree structure of the LHS
menu and selecting Filter.
Based on the role that you are mapped to, you can access, read, modify, or authorize the Filters window. For all
the roles and descriptions, see Appendix A. The roles mapped to Filters are as follows:
• Filter Access
• Filter Advanced
• Filter Authorize
• Filter Phantom
• Filter Read Only
• Filter Write

Figure 113: Filter Summary window

The Filters Summary window displays the list of Filters created in all public folders, shared folders to which
you are mapped and private folders for which you are the owner, along with the other details such as the
Name, Type, Modification Date, and Modified By.
For more information on how object access is restricted, see Object Security in Dimension Management
module section.


You can also search for a specific Filter definition based on Folder Name, Filter Name, or Type and view
the list of existing definitions within the system. If you have selected Hierarchy from the Type drop-down
list, the Dimension drop-down list is also displayed.

5.8.2 Adding Filter Definition


This option allows you to add a filter. A filter can be of four types, namely Data Element, Hierarchy, Group, and
Attribute. To create a filter definition, the Write role should be mapped to your user group.
To create a new filter from the Filters Summary window:
1. Click Add button in the Filters toolbar.
The Filter Definition window is displayed.

Figure 114: Filter Definition New window

2. Enter the Filter Details section details as tabulated:


The following table describes the fields in the Filter Definition window.

Table 39: Fields in the Filter Definition window and their Description

Field Description


Filter Details

Folder Name Select the Folder Name where the Filter is to be stored from the drop-
down list.
The Folder selector window behavior is explained in User Scope section.
Click to create a new private folder. The Segment Maintenance window
is displayed. For more information, see Segment Maintenance.
Note: You can select Segment/Folder Type as Private and the Owner
Code as your user code only.

Access Type Select the Access Type as Read Only or Read/Write.


Read-Only: Select this option to give other users the access to only view
the filter definition.
Note: A user with Phantom and Write role can modify or delete the filter
even though the access type is selected as Read-only.
Read/Write: Select this option to give all users the access to view, modify
(including Access Type) and delete the filter definition.

Filter Name Enter the filter name in the Filter Name field.
Note: The characters &’ ” are restricted.

Description Enter the description of the filter in the Description field.


Note: The characters ~&+' " @ are restricted.

3. From the Filter Type Selection pane, select the Filter Type from the drop-down list.
There are four different Filter Types available in the Filter Type Selection grid as tabulated. Click the
links to navigate to the appropriate sections.
The following table describes the fields in the Filter Type pane.

Table 40: Fields in the Filter Type pane and their Description

Filter Description

Data Element Data Element Filter is a stored rule that expresses a set of constraints.
Only columns that match the data type of your Data Element selection are
offered in the Data Element drop-down list box.
Example: Balances between 10,000 and 20,000; Accounts opened in the
current month; Loans with amortization terms greater than 20 years.
Data Element Filters can access most instrument columns and most
columns in the Management Ledger. Data Element Filters are used within
other OFSAA rule types (for example, Allocation rules, Transfer Pricing
rules, Asset | Liability Management rules, and others).


Hierarchy Hierarchy Filter allows you to utilize rollup nodes within a Hierarchy to
help you exclude (filter out) or include data within an OFSAA rule.
Example: You might want to process data for a specific set of divisions or
lines of business where you have a Hierarchy rule that expresses those
divisions or lines of business as rollup nodes. A Hierarchy Filter could be
constructed to "enable" the Commercial and Retail lines of business while
NOT enabling the Wealth Management line of business. Each of these lines
of business might include a handful or even thousands of cost centers.
When incorporated into an OFSAA processing rule, this Hierarchy Filter
would include every cost center in the Commercial and Retail lines of
business.

Group Group Filters can be used to combine multiple Data Element Filters with a
logical "AND".
Example: If Data Element Filter #1 filtered on mortgage balances greater
than 100,000 and Data Element Filter #2 filtered on current mortgage
interest rates greater than 6%, you could construct a Group Filter to utilize
both Data Element Filters. In this case, the resulting Group Filter would
constrain your data selection to mortgage balances greater than 100,000
AND current mortgage interest rates greater than 6%.

Attribute Attribute Filters are created using defined Attributes. Attribute filters
facilitates you to filter on one or more Dimension Type Attributes. For each
attribute, you can select one or more values.
Example: Consider a filter that selects all records where the dimension
Common Chart of Account member represents an attribute value Expense
account, i.e., the attribute "Account Type" = Expense.
Now, using Attribute Filters, you can specify complex criteria as given
below:
Common Chart of Accounts where the Account Type attribute is Earning
Assets or Interest-bearing Liabilities, and the Accrual Basis attribute is
Actual/Actual
You could further refine the filter by adding another condition for:
Organizational Unit where the Offset Org ID is a specific Org member
The Filter then saves these criteria rather than the member codes which
meet the criteria at the time the Filter is saved. During execution, the
engine dynamically selects all records from your processing table (e.g.
Mortgages, Ledger, etc.), which meet the specified member attribute
criteria.

After the required filter conditions are defined, save the Filter definition.
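The semantics of the four filter types in the table above can be sketched as predicates over a record. The sketch is illustrative only: the column names, the set of enabled lines of business, and the predicate form are assumptions, not OFSAAI syntax.

```python
# Conceptual sketch of the four filter types (illustrative predicates only).

record = {
    "balance": 15000.0,
    "account_type": "Expense",
    "line_of_business": "Retail",
}

# Data Element Filter: a stored constraint on a column.
data_element = lambda r: 10000 <= r["balance"] <= 20000

# Hierarchy Filter: include members under enabled rollup nodes.
enabled_lobs = {"Commercial", "Retail"}          # Wealth Management excluded
hierarchy = lambda r: r["line_of_business"] in enabled_lobs

# Group Filter: multiple Data Element Filters combined with logical AND.
group = lambda r: data_element(r) and hierarchy(r)

# Attribute Filter: match on dimension-member attribute values.
attribute = lambda r: r["account_type"] == "Expense"

print(all(f(record) for f in (data_element, hierarchy, group, attribute)))
```

Note that, as the Attribute Filter description says, the saved definition holds the criteria rather than the matching members, so the engine re-evaluates the predicate against current data at execution time.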

5.8.2.1 Define Data Element Filter


When you have selected the Filter Type as Data Element, define the Filter conditions by doing the
following in the Data Element Selection section:

1. In the Data Element Selection section, click button.


The Data Element Selection window is displayed.


Figure 115: Data Element Selection window

 Select any of the following Filter Classification Type from the drop-down list:
 Classified - This is the default selection and displays all the classified EPM specific
entities. If you are an EPM user, you need to select this option while defining Data Element
Filter to list all the related entities.
Unclassified - This option displays all the non-classified, that is, non-EPM specific, entities. If
you are a non-EPM user, select this option while defining a Data Element Filter
to list all the related entities.
All - This option selects all the tables available in the selected Information Domain,
irrespective of whether an entity's table is classified or not.
 Select the required database table from the Entity Name drop-down list. The associated
members are displayed in the Show Members section.

 Select the required member and click . The member is listed in the Selected Members panel.
Click to move all Members.

You can click to deselect a Member or click to deselect all Members.


 Click OK. The selected Data Elements are displayed in the Data Element Selection field.
2. Select the Filter Method from the drop-down list.


For each column you wish to include in your Data Filter definition, you must specify one of the
following Filter Methods:
The following table describes the fields in the Data Filter Definition.

Table 41: Fields in the Data Filter Definition window and their Description

Filter Description

Specific Values Specific Values are used to match a selected database column to a
specific value or values that you provide. You may either include or
exclude Specific Values.
You can add additional values by clicking the Add button. Click
adjacent to Add button to add 3, 5, 10 rows by selecting the checkbox
adjacent to 3, 5, or 10 respectively. You can add custom number of rows by
specifying the number in the text box provided, as shown and click .

To remove a row, select the checkbox and click Delete.


When comparing Specific Values for a character type column, you must
provide Specific Values that are character strings.
When comparing Specific Values for a date type column, you must provide
Specific Values that are dates (the application displays a Calendar control).
When comparing Specific Values for a numeric column, you must provide
Specific Values that are numbers.
Select Include Values or Exclude Values to include or exclude the
selected values.


Ranges Ranges are used to match a selected database column to a range of values
or to ranges of values that you provide. You may either include or exclude
Range values.
Range Type is available for OFSA Datatype Term, Frequency, Leaf, Code,
and Identity and Column Datatype Date, Numeric and Varchar.
You can add additional values by clicking the Add button. Click
adjacent to Add button to add 3, 5, 10 rows by selecting the checkbox
adjacent to 3, 5, or 10 respectively. You can add custom number of rows by
specifying the number in the text box provided, as shown and click .

To remove a row, select the checkbox and click Delete.


If the Column Datatype is VARCHAR, provide Specific Values
(alphanumeric) that are character strings.
If the Column Datatype is DATE, provide Specific Values that are dates (the
application displays a Calendar control).
If the Column Datatype is Numeric, provide Specific Values that are
numbers.

If OFSA Datatype is LEAF, provide either numeric values or click to


select the numeric member ids.

If OFSA Datatype is CODE, provide either numeric values or click to


select the numeric member ids.
If OFSA Datatype is IDENTITY, provide specific numeric values. However,
no validation is done during save to validate the input value for a valid
identity code.
Select Include Values or Exclude Values to include or exclude the
selected values

Another Data Element Another Data Element is used to match a selected database column to
another database column. When constructing an Another Data Element
Filter Method, you may only compare a column to other columns that you
have already selected (the Data Element drop-down list box will only
contain columns that you have already selected).
You may use any of the following operators when choosing the Another
Data Element Filter Method:
=, <> (meaning "not equal to"), <, >, <=, or >=.

Expression Expression is used to match a selected database column to the results of
an OFSAAI Expression rule.
You may use any of the following operators when choosing the Expression
Filter Method:
=, <> (meaning "not equal to"), <, >, <=, or >=.

 Click Add to list the completed filter conditions in the Filter Conditions grid.
 Click Update after modifying a filter condition to update in the Filter Conditions grid.


 Click or buttons to move a selected Filter Condition up or down.

 Click button to delete selected individual Filter Conditions records.


3. Click Add or Edit in the Filter Definition window if you are creating a new or updating an existing
Filter definition.
4. Click Save to validate the entries and save the filter details.
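The Filter Methods described above behave like predicates on a row. The sketch below is illustrative only: the column names and set-based matching are assumptions for the illustration, not OFSAAI internals.

```python
# Sketch of the Filter Methods as predicates (illustrative only).

row = {"balance": 150000.0, "rate": 0.07, "org_id": 42}

# Specific Values: match a column against provided values,
# either including or excluding them.
specific = lambda r: r["org_id"] in {42, 57}            # Include Values
excluded = lambda r: r["org_id"] not in {99}            # Exclude Values

# Ranges: match a column against one or more value ranges.
in_range = lambda r: 100000 <= r["balance"] <= 200000

# Another Data Element: compare one selected column to another
# already-selected column using =, <>, <, >, <=, or >=.
col_vs_col = lambda r: r["balance"] > r["rate"]

print(all(f(row) for f in (specific, excluded, in_range, col_vs_col)))
```

A row is selected only when every condition added to the Filter Conditions grid holds, which is why the order of conditions matters for readability but not for the result.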

5.8.2.2 Define Hierarchy Filter


When you have selected the Filter Type as Hierarchy, define the Filter conditions by doing the following in
the Hierarchy Selection section:
1. Select the required Dimension from the drop-down list.
2. Select the associated Hierarchy from the drop-down list. You can select More to search for a
specific Hierarchy in the Hierarchy more dialog.
3. Select any combination of rollup points and leaf (last descendent child) values.

Figure 116: Show Hierarchy Tab

The Show Hierarchy tab displays the leaves in each node in ascending order of Members.
To sort the nodes alphabetically, set HIERARCHY_IN_FILTER_SORT-$INFODOM$-
$DIMENSION_ID$=$VALUE$ to Y in the AMHMConfig.properties file present in the deployed location.
You should add such an entry for each required Dimension ID for the sort functionality to work for
those dimensions.
For example:
HIERARCHY_IN_FILTER_SORT-OFSAAINFO-4345=Y
Restart servers after making any change in AMHMConfig.properties file for the change to take
effect.
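For instance, to enable alphabetical sorting for two dimensions in the same Infodom, the entries in AMHMConfig.properties would look like the following sketch (the Infodom name OFSAAINFO is taken from the example above; the second Dimension ID, 4346, is purely illustrative):

```
HIERARCHY_IN_FILTER_SORT-OFSAAINFO-4345=Y
HIERARCHY_IN_FILTER_SORT-OFSAAINFO-4346=Y
```

As noted above, the servers must be restarted after editing the file for the entries to take effect.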


NOTE Select the Pagination icon to view more options under the
available components. Click the More Options (three dots) icon
to enable the Pagination buttons.

From this pane, you can:

 Click the Search button to search for a hierarchy member using Dimension Member Alphanumeric
Code, Dimension Member Numeric Code, Dimension Member Name, or Attribute, and by keying
in Matching Values in the Search dialog. The search results are also displayed in ascending
order of Member Names.
 Click the Collapse button to collapse the members under a node.

 Use the corresponding buttons to display the names of the members on the right or on the left.

 Use the corresponding buttons to display the Numeric code values of the members on the right or
on the left.

 Use the corresponding buttons to show the code or the name of the members.

 Use the corresponding buttons to focus or defocus a selected node (except the root node).

You can also click the Search button to find a member in the nodes list using keywords. For a large
tree (more than 5,000 nodes), this search does not return any value if the tree is not expanded.
4. Click Save to validate the entries and save the filter details.

Figure 117: Show Members Tab

The Show Members tab displays all the selected nodes in a list view, which helps you visualize all
the selected nodes as a list rather than as a tree. Currently, this feature is available in the Edit and
View mode of the Hierarchy Filter.


5.8.2.3 Define Group Filter


When you have selected the Filter Type as Group, define the Filter conditions by doing the following in the
Data Element Filters grid:
1. Select the checkbox(es) adjacent to the required member names in the Available Filters section and
click the Select button. The selected members are displayed in the Selected Filters section. Click the
Select All button to select all the Members.

You can click the Deselect button to deselect a Member, or click the Deselect All button to deselect all
the Members.

You can also click the Search button to search for a member in the Data Element Filter Search dialog
using Folder Name and Filter Name.
2. Click Save to validate the entries and save the filter details.

5.8.2.4 Define Attribute Filter


When you have selected the Filter Type as Attribute, define the Filter conditions by doing the following in
the Attribute Selection section:
1. Select the required Dimension from the drop-down list.
2. Select the associated Attribute from the drop-down list. The list consists of only Dimension Type
attributes for selection.

3. Click the browse button in the Attribute Values grid.

The Attribute Values window is displayed.
In the Attribute Values window, the Dimension field under the Dimension grid is auto-populated with
the Dimension name with which the selected Attribute is defined, and is non-editable. In the Search
grid, you can search for Attribute Values based on Alphanumeric Code, Numeric Code, or Name.
4. Select the checkbox(es) adjacent to the Alphanumeric Codes of the required Attribute Values in the
Attribute Values grid and click OK. The Attribute Values grid displays the selected attribute values.

Select Attribute Value(s) in the Attribute Values grid and click the Delete button to delete them.
You can use the Attribute Values present in the Attribute Values grid to generate conditions.
5. Click the Add button in the Attribute Values grid. The Filter Conditions grid is populated with the
filter condition using all the Attribute values.
You cannot define two conditions using the same attribute, because conditions are joined with a
logical ‘AND’, which would make the query invalid.
In the Filter Conditions grid, you can select a condition to view the Attribute Values used to generate
it and can update the condition.

You can also click the View SQL button to view the SQL statement in the View SQL window. Click the
View Condition button to view a long filter condition in the View Condition dialog.
6. Click Save. The Attribute Filter definition is saved.


5.8.3 Viewing Filter Definition


You can view individual Filter details at any given point.
To view the existing Filter Definition details in the Filters Summary window:
1. Select the checkbox adjacent to the Filter Name.

2. Click the View button in the Filters toolbar.


The View – Filter Details window is displayed with the filter details.

5.8.4 Modifying Filter Definition


This option allows you to modify the details of Filters.
1. Select the checkbox adjacent to the Filter Name whose details are to be updated.

2. Click the Edit button. The Edit – Filter Details window is displayed. Make the required changes.
For more information, see Add Filter Definition.
3. Click Save to save the changes.

5.8.5 Copying Filter Definition


The Copy Filter Definition option enables you to quickly create a new Filter Definition based on the
existing parameters or by updating the values.
To copy an existing Filter Definition in the Filters window:
1. Select the checkbox adjacent to the Filter Name of which you want to create a copy.

2. Click the Copy button in the Filters toolbar. The Copy button is disabled if you have selected multiple
checkboxes. The Copy – Filter Details window is displayed.
3. In the Copy – Filter Details window you can:
 Create a new filter definition with the existing variables. Specify a new Filter Name and click Save.
 Create a new filter definition by updating the required variables. Specify a new Filter Name and
update the required details. For more information, see Add Filter Definition. Click Save.
The new filter definition details are displayed in the Filters Summary window.

5.8.6 Checking Dependencies


You can view the dependencies of a defined Filter. You can use a filter in a Run definition. However, Run
definitions are not shown as dependent objects when you check the dependencies of a filter. This is a
known limitation.
To check the dependencies of a filter from the Filters Summary window:
1. Select the checkbox adjacent to the Filter Name.

2. Click the Check Dependencies button in the Filters toolbar. The Check Dependencies button is
disabled if you have selected multiple members.


The Dependent Objects window is displayed with Object ID, Name, and ID Type of the dependent
Objects.

5.8.7 Viewing SQL of Filter


You can view the corresponding SQL of a defined filter.
To view the SQL of a filter from the Filters Summary window:
1. Select the checkbox adjacent to the filter to view the SQL.

2. Click View SQL button. The SQL equivalent of the selected filter is displayed in the View SQL
window.

5.8.8 Deleting Filter Definition


You can remove Filter Definitions that are no longer required in the system by deleting them from the
Filters Summary window.

NOTE A filter definition with dependency cannot be deleted. However,


if the dependent object is a Run Definition, you are able to
delete the filter definition. This is a limitation.

1. Select the checkbox adjacent to the Filter Name whose details are to be removed.

2. Click Delete in the Filters tool bar.


3. Click OK in the information dialog to confirm deletion.

5.8.9 Download Filter Data, Bulk Edit, and Upload


The Filter Definitions may undergo changes frequently based on business requirements. You can modify
filter conditions through the Edit Operation option available in the Filters Summary Window. However, to
perform a bulk edit of Filter Definitions in OFSAA, you can use the Download Filter Data feature.
The mechanism of the bulk edit of Filter Definitions is through the modification of data in an Excel-based
(XLS) template and uploading it back to OFSAA. After successful upload of the modified Filter Definitions,
the updated information is available in the OFSAA system.
Currently, the Download Filter Data feature is available for the following types of Filter Definitions:
1. Hierarchy Filters
2. Attribute Filters
3. Data Element Filters
The following sections in this topic describe the cycle required to complete the Download of Filter Data,
Bulk Edit, and Upload.


5.8.9.1 Download the Filter Data in XLS Format


To download the Filter Data in XLS format from the Filters Summary Window, follow the steps given
below. You must have Filter Write access to select one or more definitions to download from the Filters
Summary Window.
1. Select the Definition(s), which require modification.

NOTE The Definitions that you select within a page are considered for
download. To select records across multiple pages, consider
performing multiple downloads or increase the page size to
display more records.
You can also select all the filters that appear in the current page
by clicking the Select All checkbox in the header of the records.

2. Click Download.
A prompt appears to download the OFSAA_FILTER.xls file.
3. Download the XLS file to your local machine.

5.8.9.2 Edit the XLS File


The downloaded XLS file contains the Filter Definition Records with the following sheets in it:
• SNAPSHOT – This sheet contains a summary of the Filter Definitions in the XLS file and provides
information about the User who downloaded the file and the timestamp details of when the file was
downloaded.
• DATA ELEMENT FILTER – This sheet contains details of all the Data Element Filters that were
selected for the download.
• HIERARCHY FILTER - This sheet contains details of all the Hierarchy Filters selected for the
download.
• ATTRIBUTE FILTER - This sheet contains details of all the Attribute Filters selected for the
download.
A Filter Definition may contain multiple rows to cover all of the filter attributes.
• REFERENCES - This sheet contains information for valid values that you can refer to when
modifying the DATA ELEMENT FILTER, HIERARCHY FILTER, and ATTRIBUTE FILTER Sheets.
You can modify the Filter Definitions by the following actions:
• Add a new row to add a new Filter Condition.
• Delete a row to remove a Filter Condition.
• Modify a row to update an existing Filter Condition.

5.8.9.2.1 Delete Filter Definitions

To delete the Filter Definitions in the XLS file, follow these steps:
1. Delete a row to remove the Filter Conditions from the definition.


Similarly, delete other rows that are not required.


2. Save the XLS file.

5.8.9.2.2 Add or Update Filter Definitions

To add or update a Filter Definition in the XLS file, you must populate certain mandatory columns in each
of the sheets. The details are described in the following sections.

NOTE The Bulk Upload utility does not support incremental updates
of the definitions (Delta updates). The upload deletes the
existing Filter Conditions and replaces them with the Filter
Conditions present in the XLS file.

Data Element Filter


The columns which can be edited in an existing row or added in a new row in this sheet are as follows:
• FILTER ID
• TABLE NAME
• COLUMN NAME
• FILTER METHOD (Input 0 for Specific Values or 1 for Range)
• SEQUENCE
• EXCLUDE
• FROM OPERATOR
• FROM VALUE
• TO OPERATOR (for the Range Filter Method only)
• TO VALUE (for the Range Filter Method only)
Other columns and their values are for reference purposes and they will not affect the Filter Definitions
when uploaded back in the OFSAA System.
Hierarchy Filter
The columns which can be edited in an existing row or added in a new row in this sheet are as follows:
• FILTER ID
• MEMBER
Other columns and their values are for reference purposes and they will not affect the Filter Definitions
when uploaded back in the OFSAA System.
Attribute Filter
The columns which can be edited in an existing row or added in a new row in this sheet are as follows:
• FILTER ID


• SEQUENCE
• DIMENSION ID
• ATTRIBUTE ID
• ATTRIBUTE VALUE
Other columns and their values are for reference purposes and they will not affect the Filter Definitions
when uploaded back in the OFSAA System.

5.8.9.3 Prepare to Upload Filter Definitions


The upload activity requires that you perform the following steps to prepare the modified XLS file for
upload before you proceed to upload the file:
1. Create a Directory and name it FilterUpload in the $FIC_HOME/Utility/ Directory.
The directory name is case-sensitive and you must ensure that the directory name matches
correctly. If the names do not match, the upload process displays an error.
2. Assign 755 permission to the FilterUpload Directory.
3. Copy the edited XLS file to the FilterUpload Directory.
You can now proceed to upload the Filter Definitions.
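As a convenience, the preparation steps above can be scripted. The following is a minimal sketch, not part of the product: it assumes the $FIC_HOME environment variable points to the OFSAA installation (with a scratch fallback for illustration) and that the edited workbook OFSAA_FILTER.xls is in the current directory (here created as an empty placeholder).

```shell
# Fall back to a scratch location for illustration if FIC_HOME is not set
FIC_HOME="${FIC_HOME:-/tmp/ofsaa_demo}"

# 1. Create the FilterUpload directory (the name is case-sensitive)
mkdir -p "$FIC_HOME/Utility/FilterUpload"

# 2. Assign 755 permission to the FilterUpload directory
chmod 755 "$FIC_HOME/Utility/FilterUpload"

# 3. Copy the edited XLS file to the FilterUpload directory
touch OFSAA_FILTER.xls   # placeholder for the edited workbook in this sketch
cp OFSAA_FILTER.xls "$FIC_HOME/Utility/FilterUpload/"
```

In a real deployment, replace the placeholder `touch` with your actual edited workbook, then run the upload utility as described in the next section.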

5.8.9.4 Upload the Filter Definitions


The upload of the Filter Definitions in the XLS saves the filter definitions into the OFSAA System and is the
final step in the Bulk Upload mechanism.

NOTE The Bulk Upload utility does not support incremental updates
of the definitions. The upload deletes the existing Filter
Conditions and replaces them with the Filter Conditions present
in the XLS file. In other words, the entries present in the XLS file
for a particular definition are the final metadata for that
definition in OFSAA.

The following sections describe the options to upload the Filter Definitions.

5.8.9.4.1 Upload the Filter Definitions through the Command Line

To upload the Filter Definitions through the Command Line, follow these steps:
1. Copy the modified XLS file to the $FIC_HOME/utility/FilterUpload directory in the OFSAA
Installation.
2. Run the Shell Script Utility from the $FIC_HOME/ficdb/bin directory as shown in the following:
./FilterUploadUtility.sh <infodom> <userid> <UNIQUE_IDENTIFIER>
For example,
./FilterUploadUtility.sh INFODOM USER UNIQUE_IDENTIFIER_1111


The parameter <UNIQUE_IDENTIFIER> is optional; it helps you trace issues registered in the
Database Table that is mapped to the UNIQUE_IDENTIFIER.
The XLS is uploaded and the Filter Definitions are saved post validation.
The respective errors and logs are available in the log files and the Config Schema. You can filter them by
UNIQUE_IDENTIFIER and view them.
The Table names with reference to the logs are: AAI_UTILS_AUDIT and AAI_UTILS_AUDIT_DETAILS.

5.8.9.4.2 Upload the Filter Definitions through the OFSAA Batch Window

To upload the Filter Definitions through the OFSAA Batch Window, follow these steps:
1. Go to the Batch Maintenance Window.
2. Select Add to add a new Batch or proceed to step 5 to update an existing Batch.
3. Provide the required details such as Batch Name, Batch Description, Batch ID, Sequential Batch, and
Duplicate Batch.
4. Click Save to save the Batch.
5. Select the Batch again from the list of all Batches to add a task.
6. Click Add to add a new task or proceed to step 13 to update an existing task.
7. Enter the Name and Description for the task.
8. Select Run Executable from the Components drop-down.
9. Select Datastore Type, Datastore Name, and Primary IP for Runtime Processes.
10. Enter details in the Executable field as shown in the following:
./FilterUploadUtility.sh, [infodom], [userId]
For example,
./FilterUploadUtility.sh, INFODOM_NAME, EXAMPLEUSER
11. Enter Wait Value, Batch Parameter, and Optional Parameters.
12. Click Save to save the task.
13. Go to the Batch Execution Window.
14. Select the required Batch.
15. Enter the date for the Batch to Run.
16. Select Execute Batch.
17. Click Ok in the confirmation popup.
The selected Batch runs to update the definitions in the OFSAA System.

5.9 Map Maintenance


The Map Maintenance window enables you to control how dimension members are shared among users
within an information domain. You can map multiple hierarchies to user groups available in OFSAAI so
that only the mapped members can be viewed by the users belonging to that user group. You can set a


mapper definition as the default Security mapper for an information domain. Based on the members
mapped in a security mapper, the hierarchy browser window in OFSAAI framework displays the members
of the hierarchy along with its descendants.

NOTE Since a hierarchy’s member security is maintained at the user
group level, member maintenance functions such as add, edit,
copy, and delete are the same for all users across all the
enabled members in the hierarchy maintenance window.

For understanding the Hierarchy Security feature, see Scenario to Understand Hierarchy Security section.
To access the Map Maintenance window, you should be mapped to the Access role. To create, modify, and
delete a mapper, you should be mapped to the Write role.
Based on the role that you are mapped to, you can access, read, modify, or authorize Map Maintenance.
For all the roles and descriptions, see Appendix A. The roles mapped to Map Maintenance are as follows:
 Mapper Access
 Mapper Advanced
 Mapper Authorize
 Mapper Phantom
 Mapper Read Only
 Mapper Write

Figure 118: Map Maintenance window

The Map Maintenance window displays the Name, Version, Description, Dynamic, Inherit Member, Map
Type, and Database View name for the available mapper definitions created in the selected Segment and
Infodom. Segments facilitate the classification of related metadata in a single segment. You have access to
only those metadata objects that are mapped to the same segment to which you are mapped.

5.9.1 Creating a Mapper Definition


This option allows you to create a mapper definition by selecting the required hierarchies. You can create
a data filter or security filter type mapper definition. For a security filter mapping, you should select the
default user group hierarchy present in OFSAAI as a hierarchy. You can select up to 9 hierarchies in a
mapper definition. You need to be mapped to the Mapper Write role to create a mapper definition.
To create a new mapper definition from the Map Maintenance window:


1. Click Create new Map from the tool bar. The Mapper Definition – New window is displayed.

Figure 119: Mapper Definition New window

All hierarchies including the default user group hierarchy for the selected Infodom are listed under
the Members pane.
2. Enter the mapper definition details as tabulated:
The following table describes the fields in the Mapper Definition window.

Table 42: Fields in the Mapper Definition and their Description

Field Description

Fields marked in red asterisk (*) are mandatory.

Description Enter a description for the map definition.

Dynamic By default, the checkbox is selected and you do not have the option to
deselect this. The dynamic attribute is associated with a mapper definition
which facilitates the accommodation of latest members of a slowly
changing dimension by leveraging the push down functionality.



Map Type This drop-down list is enabled only if the Dynamic checkbox is selected.
Otherwise, data filter is selected and this field is disabled.
Select the Map type. The available options are:
Data Filter: Select this option to define a data filter type mapping, which
does not require a user group hierarchy to be selected among the
participating hierarchies.
Security Filter: Select this option to define a security filter type mapping,
which can be used to restrict access to members of a hierarchy based on
user groups. For a security filter, the user group hierarchy should be
attached with the definition. You can add other hierarchies to this definition
and will not have the option of saving the mapper definition without using a
User Group hierarchy.

Pushdown Select the checkbox if you want implicit push down of the mappings
whenever mappings are modified and saved through the Mapper
Maintenance window.

Database Entity Name Enter the name for the table/entity to be created in the atomic schema that
will be used to store the exploded mappings. The database entity name can
be alphanumeric; however, it should not start with a numeric character.

Comments Enter any additional information about the mapper definition.

Database View Name Enter the Database View name to be created for the selected database
entity. The View will be created in the atomic schema with Hierarchy code
as the column name.

3. Click the required hierarchies from the Members pane. The selected hierarchies are displayed under
the Selected Members pane.

NOTE • The User Group hierarchy should be selected for a security
mapper definition. If it is not selected, a validation
message prompting you to select the User Group
hierarchy is displayed during the save operation.
• The Hierarchies selected in the Mapper Definition
window should not contain the special characters “~”
(Tilde) and “$” (Dollar) in their node descriptions.

4. Click Save to save the mapper definition details.


The Mapper definition is saved with the version number as 1 in the authorized state and the same is
displayed in the refreshed Mapper List grid in Map Maintenance window.

5.9.2 Mapper Maintenance


The Mapper Maintenance feature allows you to define the mappings among the participating hierarchies
in the Mapper Definition window. You should select at least one member from each hierarchy to define a


mapping. You can add multiple mappings among the hierarchies. The mappings are stored in the
database entity/table you created during the mapper definition, for further processing, that is, the push
down operation. After defining all mappings, you can push down the mappings to make them effective in
the system (the push down is implicit if that option was selected at mapper definition time). You need to
be mapped to the Mapper Access role to access the Mapper Maintenance feature.
To define the mappings:

1. From the Map Maintenance window, select the mapper definition and click Mapper
Maintenance. The Map window is displayed.

Figure 120: Map Maintenance window

Based on the hierarchies participating in the mapper definition, the search fields will be displayed.
The Search fields are enhanced with the autocomplete drop-down feature. You need to enter at
least 4 characters to display the drop-down options.
2. Click Add on the Member Combinations toolbar.
The hierarchies that were selected in the Mapper Definition window appear in the Map window,
along with their members.


Figure 121: Mapper Definition window

You can select the Pagination icon to view more options under the selected member.
3. Select the required hierarchy members from each hierarchy and click View Mappings to view the
already available mapping combinations with the selected hierarchy members. The View Mappings
window is displayed.

Figure 122: View Mapping Dialog window

4. Click Close.
5. To add a new mapping from the Add Mappings window, select the required hierarchy members
from each hierarchy and the corresponding user group to which you want to map in case of security


mapper, and click Go. Each mapping definition is listed in the grid below. You should select at
least one member from each hierarchy to obtain a complete mapping.

NOTE If a child is mapped and parent is not mapped, the parent will
be displayed as disabled in the hierarchy browser window.

You can perform the following actions:

 Use the sort buttons to sort members based on path, to sort the hierarchy (top to bottom), or to
sort based on level.

 Use the collapse and expand buttons to collapse or expand the members under a node, or to
collapse or expand the selected branch.

 Click the focus button to focus only on the selected branch. The Available Values pane shows the
members of the selected branch only. Click the defocus button to go back to the normal view.

 Use the corresponding buttons to display the members' numeric codes or alphanumeric codes on
the right or on the left, or to show only member names (the default view).
6. Enter the mapping details as tabulated:
The following table describes the fields in the Mapping Definition window.


Table 43: Fields in the Mapping Definition window and their Description

Field Description

Macro This drop-down list allows you to define conditions based on which the
members will be mapped. The options are:
Self Only: Select this option if you want only the selected member to be
mapped. If this option is selected, the hierarchy browser will display the
selected member in enabled mode. If it has any descendants, those will be
displayed in disabled mode.
Self & Desc: Select this option if you want the selected members along with their descendants to be
mapped.

Exclude Select Yes if you want to exclude certain members from being mapped.
For example, if you want to map a hierarchy to all user groups except one
user group say UG1, then map the hierarchy to UG1 and select the Exclude
option as Yes. This will ensure that all users belonging to user groups
except UG1 can access all the members of the hierarchy.

7. Click Save. All the mappings will be listed in the Member Combinations grid.
8. You can use the copy functionality to copy an already created mapping and edit the required fields.
To copy a mapping:

a. Select the mapping you want to copy, from the Member Combinations grid and click Copy.
The Copy Mapping window is displayed with all Hierarchies participating in the mapping.

Figure 123: Copy Mapping window

b. Select the Macro and Excluded information for the mapping and click Save. The copy of the
mapping will appear in the Member Combinations grid.
9. Click Pushdown to refresh the mappings of the participating hierarchies available in the system. A
service pushes down the mappings based on the Config Schema data (used combinations having
macros) into the atomic schema (exploded mappings). The pushed-down mappings, that is, the
exploded mappings, are displayed in the Mapped Members pane.
10. Select a mapping from the first panel and click Remove if you want to remove the mapping from
the mapper. You must click Pushdown to make these changes effective in the system.


5.9.3 Default Secure Map


This option allows you to set a mapper definition as the default security mapper at the Infodom level. You
can have different security filter type mapper definitions, but in the OFSAAI platform, the default security
mapper is used to provide hierarchy member security. If a mapper is not set as the default security
mapper, the hierarchy browser displays all the members of the hierarchy in enabled mode, and hierarchy
member security is not available under such circumstances.

Click the Default Security Map button on the toolbar to set a mapper as the default security mapper. Once
set, this information is displayed in the Mapper Summary window. A delete icon is also available adjacent
to it to remove the default security map from the system.

NOTE A Security Filter type mapper definition having the user group
hierarchy (seeded by OFSAAI) in its definition can only be
identified as a default security mapper and this validation will
be performed by the application. When a mapper is set as the
default security map in an information domain, it overrides the
existing default security map if present in the Infodom.

5.9.4 Modifying Mapper Definition


You can update only the Comments field and the pushdown option. You need to have the Mapper Write role
mapped to your user group to modify a Mapper definition.
To update the Map Maintenance details in the Map Maintenance window:
1. Select the checkbox adjacent to the required Mapper code.

2. Click the Edit Map button on the toolbar. The Mapper Definition window is displayed.
3. Update the Comments field or the push down option as desired (the push down option is available
for edit only for dynamic mapper definitions and is disabled for non-dynamic mapper
definitions).
4. Click Save and update the changes.

5.9.5 Copying Mapper Definition


The Copy Mapper Definition allows you to quickly create a new Mapper Definition based on the existing
hierarchies and mappings. You can then add more hierarchies and mappings as required.
Note the following points:
• When you copy a Mapper definition, all the existing hierarchies and mappings get preselected and
copied to the new Mapper definition.
• You cannot remove the existing hierarchies from the new Mapper definition.
• You can add up to 9 hierarchies (including the existing ones) to the new Mapper definition.
• If you are copying a Mapper definition which has mappings (done through the Mapper Maintenance
window), then


 The parent node/default node of the new hierarchy gets mapped to the existing hierarchy
member combinations.
 You need to select a hierarchy that has default data. Otherwise, an alert message is displayed
prompting you to select a hierarchy with default data.
• You cannot edit the fields Dynamic and Map Type.
• Pushdown will not happen automatically. You need to do the Pushdown operation of the new
Mapper definition explicitly.
To copy an existing Mapper Definition in the Map Maintenance window:
1. Select the checkbox adjacent to the Mapper Name that you want to copy.

2. Click the Copy Map button on the toolbar. The Copy Map button is disabled if you have selected
multiple checkboxes. The Mapper Definition – Copy window is displayed.
3. Enter the required details in the Description, Database Entity Name, Database View Name and
Comments fields. For more information, see Creating a Mapper Definition.
4. Select the Pushdown checkbox if you want implicit push down of the mappings whenever
mappings are modified.
5. Select the required hierarchies from the Members pane. The selected hierarchies are displayed
under the Selected Members pane. Click Save.
The new Mapper definition details are displayed in the Map Maintenance window. Select the new
Mapper and click the Mapper Maintenance button on the toolbar to add mappings to the newly
added hierarchies.

5.9.6 Deleting Mapper Definition


You can remove the Mapper definition(s) created by you that are no longer required in the system by
deleting them from the Map Maintenance window. You need to have the Mapper Write role mapped to
your user group to delete a Mapper definition.
To delete a Mapper definition from the Map Maintenance window:
1. Select the checkbox adjacent to the required Mapper definition code.

2. Click Delete Map button from the tool bar. A confirmation dialog is displayed. If a default
security map was selected for deletion, then the same will be indicated in the confirmation dialog.
The mapper code will be followed by ‘(D)’ to indicate that the default security map has also been
selected for deletion.
3. Click OK. The Mapper definition details are deleted.

5.9.7 Non-Dynamic Mapper Definitions


The existing mapper definitions available in the system are treated as non-dynamic mapper
definitions. You can continue to use such mapper definitions; all functionality applicable to an
existing mapper definition remains available to you.


5.10 Analytics Metadata


The Analytics Metadata section consists of the following:
• Dimension
• Essbase Cube
• OLAP Cube
• Catalog

5.10.1 Dimension
Business Dimension within the Infrastructure system enables you to create a logical connection with
measures. It gives you various options across which you can view measures. A Business Dimension is a
structure of one or more logical groupings (hierarchies) that classifies data. It is the categorization
across which measures are viewed. A dimension can have one or more hierarchies.
You can access the Business Dimension window by expanding Unified Analytical Metadata and Analytics
Metadata within the tree structure of the LHS menu and selecting Dimension.
The dimension specific details are explained in the following table.

Table 44: Fields in the Dimension window and their Description

Dimension Properties: Displays the Dimension Type and Data Type of the selected dimension object.

Depends on: Displays the Hierarchy object which is used in creating the dimension. Click the object
link to drill down for more details.

Used In: Displays the Essbase cube object in which the dimension is used. Click the object link to
drill down for more details.

Applications: Displays the applications in which the dimension is used.

Based on the role that you are mapped to, you can access, read, modify, or authorize Dimension. For all
the roles and descriptions, see Appendix A. The roles mapped to Business Dimension are as follows:
• Dimension Access
• Dimension Advanced
• Dimension Authorize
• Dimension Phantom
• Dimension Read Only
• Dimension Write
Based on the user requirements you can define different dimensions as Regular, Time, or Measure. A
Dimension combined with measures helps in business query. Since dimension data is collected at the
lowest level of detail and then aggregated into higher-level totals, it is useful for analysis.


Figure 124: Business Dimension window

The Business Dimension window displays the list of pre-defined Business Dimensions with their Code,
Short Description, Long Description, and Dimension Type. In the Business Dimension window, you are
required to enter the Dimension code and a description when defining it for the first time. You are also
required to select the dimension type and data type, and map available hierarchies to the dimension.
You can also search for a specific business dimension based on the Code, Short Description, and
Authorization status, or view the list of existing business dimensions within the system.

5.10.1.1 Creating Business Dimension


You can create a Business Dimension by specifying the Dimension definition details and defining the
required Dimension. You can define a Business Dimension only if you have Dimension Write role mapped
in the Infrastructure system.
To create a new Business Dimension from the Business Dimension window:
1. Click Add button from the Business Dimensions toolbar. The Add Business Dimension window is
displayed.


Figure 125: Add Business Dimension window

2. Enter the details in the Business Dimension Details section as tabulated:


The following table describes the fields in the Add Business Dimension Details window.

Table 45: Fields in the Add Business Dimension Details window and their Description

Code: Enter a distinct code to identify the Dimension. Ensure that the code is alphanumeric with a
maximum of eight characters in length and there are no special characters except underscore "_".
Note the following:
• The code can be indicative of the type of Dimension being created.
• A pre-defined Code and Short Description cannot be changed.
• The same Code or Short Description cannot be used for an Essbase installation:
"$$$UNIVERSE$$$", "#MISSING", "#MI", "CALC", "DIM", "ALL", "FIX", "ENDFIX", "HISTORY",
"YEAR", "SEASON", "PERIOD", "QUARTER", "MONTH", "WEEK", "DAY".
• In Unauthorized state, users having Authorize rights can view all the unauthorized Metadata.

Short Description: Enter a Short Description based on the defined code. Ensure that the description
is a maximum of eight characters in length and does not contain any special characters except
underscore "_".


Dimension Type: Select the Dimension Type from the drop-down list. The available options are:
• Regular: A regular dimension can have more than one hierarchy mapped to it. The option of
mapping multiple hierarchies is available only for a non-SQLOLAP environment.
• Time: In a time dimension, the hierarchy defined has leaves/nodes of high time granularity.
• Measure: A measure dimension can have hierarchies of only type measure mapped to it. The
Measure hierarchy type is specific to Essbase MOLAP.

Data Type: The Data Type is automatically selected based on the selected dimension type. The
default data type for the Business Dimension definition is Text.

Long Description: Enter the Long Description if you are creating a subject-oriented Dimension, to
help users for whom the Dimension is being created, or other details about the type/subject. Ensure
that the description is a maximum of 100 characters in length.

3. Click the button in the Hierarchies grid. The Hierarchy Browser window is displayed.
Based on the dimension type, the hierarchies are displayed in the Members pane. You can expand
and view the members under the Hierarchies by clicking the "+" button.

 Select the hierarchies from the Members pane and click . The selected hierarchies are
moved to the Selected Members pane.

 If you want to map all the available hierarchies, click .


 If you want to remove a selected hierarchy, select it from the Selected Members pane and click

. To deselect all the selected hierarchies, click .


 Click OK and the selected hierarchies are listed in the Hierarchies grid.
The User Info grid at the bottom of the window displays the metadata information about the
Business Dimension created along with the option to add comments.
4. Click Save in the Add Business Dimension window and save the details.

5.10.1.2 Viewing Business Dimension


You can view the details of an individual Business Dimension at any given point. You need to be
mapped to the Dimension Read Only role to view a Business Dimension.
To view the existing Business Dimension definition details in the Business Dimension window:
1. Select the checkbox adjacent to the required Business Dimension code.

2. Click View button from the Business Dimension tool bar.


The View Business Dimension window displays the details of the selected Business Dimension
definition. The User Info grid at the bottom of the window displays metadata information about
Business Dimension created along with the option to add comments.


5.10.1.3 Modifying Business Dimension


You can update the existing Business Dimension definition details except for the Code, Short Description,
Dimension Type, and Data Type. You need to have the Dimension Write role mapped to modify the
Business Dimension definitions.
To update the required Business Dimension details in the Business Dimension window:
1. Select the checkbox adjacent to the required Business Dimension code.

2. Click Edit button from the Business Dimension tool bar. The Edit Business Dimension window is
displayed.
3. Update the required details. For more information, see Create Business Dimension.
4. Click Save and update the changes.

5.10.1.4 Copying Business Dimension


You can copy the details of an existing Business Dimension to quickly create a new Business Dimension.
You need to have the Dimension Write role mapped to copy the Business Dimension definitions. To copy
an existing Business Dimension definition in the Business Dimension window:
1. Select the checkbox adjacent to the required Business Dimension code.

2. Click Copy button from the Business Dimension tool bar.


3. The Business Dimension definition details are copied and a confirmation message is displayed.

5.10.1.5 Deleting Business Dimension


You can remove Business Dimension definitions that you created and that are no longer required in the
system by deleting them from the Business Dimension window. You need to have the Dimension Write role
mapped to delete a Business Dimension. The Delete function permanently removes the Business Dimension
details from the database. Ensure that you have verified the details as indicated below:
• A Business Dimension definition marked for deletion is not accessible for other users.
• Every delete action has to be Authorized/Rejected by the authorizer.
 On Authorization, the Business Dimension details are removed.
 On Rejection, the Business Dimension details are reverted back to authorized state.
• You cannot update Business Dimension details before authorizing/rejecting the deletion.
• An un-authorized Business Dimension definition can be deleted.
To delete an existing Business Dimension in the Business Dimension window:
1. Select the checkbox adjacent to the required Business Dimension code.

2. Click Delete button from the Business Dimension tool bar. A confirmation dialog is displayed.
3. Click OK. The Business Dimension details are marked for delete authorization.


5.10.2 Cubes
Cube represents a multi-dimensional view of data which is vital in business analytics. It gives you the
flexibility of defining rules that fine-tune the information required to reflect in the hierarchy. Cube
enhances query time and provides a decision support for Business Analysts.
A cube is a combination of measures and dimensions, that is, measures represented along multiple
dimensions and at different logical levels within each dimension. For example, in a cube, you can view
Number of Customers, Number of Accounts, and Number of Relationships by Product, Time, and
Organization.

5.10.2.1 Essbase Cubes


Essbase has been derived from a history of OLAP applications based in the middle tier. The Essbase
strategy focuses mainly on custom analytics and Business Intelligence applications, addressing the
what-if modeling and future-oriented questions that companies need answered today in order to see into
the future.
Essbase - A Separate-Server OLAP: Essbase is the OLAP server that provides an environment for rapidly
developing custom analytic and EPM applications. The data management strategy allows Essbase to easily
combine data from a wide variety of data sources, including the Oracle Database. Essbase is part of the
Oracle Fusion Middleware architecture.
Based on the role that you are mapped to, you can access read, modify or authorize Essbase Cube. For all
the roles and descriptions, see Appendix A. The roles mapped to Essbase cubes are as follows:
• Essbase Cube Access
• Essbase Cube Advanced
• Essbase Cube Authorize
• Essbase Cube Phantom
• Essbase Cube Read Only
• Essbase Cube Write


Figure 126: Essbase Cube Summary window

The Essbase Cube Summary window displays the list of pre-defined Essbase Cubes with their Code, Short
Description, Long Description, and MDB Name. By clicking the Column header names, you can sort the
column names in ascending or descending order. Click if you want to retain your user preferences so
that when you login next time, the column names will be sorted in the same way. To reset the user
preferences, click .
You can add, view, edit, copy, and delete an Essbase Cube. You can search for a specific Essbase Cube
based on the Code, Short Description, and Authorization status.

5.10.2.1.1 Creating Essbase Cube

When you are defining Essbase cube for the first time, you need to specify the Cube definition details and
the Cube-Building components such as Dimension, Variation, Intersecting details, DataSet, Formulae, and
Roll Off period details. Your User Group should be mapped with the User Role ‘Essbase Cube Write’ to
create or add an Essbase Cube.
Note the following:
• Cube Build with OLAP type as Essbase: If there is a Business Intelligence (BI) hierarchy in the
cube definition, cube build is supported only if the data length for BI Hierarchy processing is
less than 50.
• You must define at least two Business Dimensions. Otherwise, an alert message is displayed.


To create an Essbase Cube:


1. From the Essbase Cube Summary window, click Add. The Essbase Cube Details window is
displayed.
2. Enter the Essbase Details as tabulated.
The following table describes the fields in the Essbase Details window.

Table 46: Fields in the Essbase Details window and their Description

Code: Enter a distinct code to identify the Cube. Ensure that the code is alphanumeric with a
maximum of 8 characters in length and there are no special characters except underscore "_".
Note the following:
• The code can be indicative of the type of Cube being created.
• A pre-defined Code and Short Description cannot be changed.
• The same Code or Short Description cannot be used for an Essbase installation:
"$$$UNIVERSE$$$", "#MISSING", "#MI", "CALC", "DIM", "ALL", "FIX", "ENDFIX", "HISTORY",
"YEAR", "SEASON", "PERIOD", "QUARTER", "MONTH", "WEEK", "DAY".
• In Unauthorized state, users having Authorize rights can view all the unauthorized Metadata.

Short Description: Enter a Short Description based on the defined code. Ensure that the description
is a maximum of 8 characters in length and does not contain any special characters except
underscore "_".

Long Description: Entering the Long Description is helpful when creating a Cube. It could indicate
the contents of the cube or any other useful information that would help an analyst. You can enter a
Long Description with a minimum of one character and a maximum of 100 characters in length.

MDB Name: Enter the name by which you want to identify the cube while saving it in a
multi-dimensional database. Saving a cube to a multi-dimensional database is different from saving
the Cube definition, wherein the definition (like all other metadata definitions) is stored in the
repository. When saved, the cube details are updated with the cube name that you have attributed to
it. Example: NoofProd (Number of Products).
Note: Ensure that the name is within 1 to 8 characters in length and contains alphabetical,
numerical (only 0-9), or alphanumerical characters without special characters and extra spaces.

Is Build Incremental: Turn ON the toggle button if you wish to capture all incremental changes made
to the database. Cube definitions with the Is Build Incremental toggle button turned ON can be
executed with different MIS dates.

3. Enter the Cube Components in each of the tabs as tabulated.

Table 47: Fields in the Cube Components window and their Description


Dimension (default): In the Dimension tab, the Available list consists of the pre-defined Dimensions.
Select the required Dimension for the cube and click the button. You can click the button to select
all the listed Dimensions. You can also click the button to deselect a Dimension, or click the
button to deselect all the selected Dimensions.
Note: It is mandatory to select at least two dimensions. One dimension should be of Measure
Dimension Type.

Variation: In the Variation tab, you can define the Variation by mapping the Dimension against the
defined Measure.
Figure 127: Variation tab
To map a Dimension to a Measure, select the corresponding check box.

Intersection: The Intersection option is specific to Count Distinct Measures. Count Distinct
Measures should be intersected only across those dimensions on which a duplicate is expected for
that measure. For example, no customer has gender as both Male and Female, so intersecting a Count
Distinct measure across a Gender dimension does not make sense. Similarly, Count Distinct measures
will have duplicates across Products or Regions, so the intersection can be across those dimensions
(Product/Region). For more information, see "Selecting Aggregation Function" in the Business
Measures section.
Figure 128: Intersection tab
Select the required Dimension from the drop-down list corresponding to the Measure.
Note: The mapped Intersection should be a subset of the mapped Variation.


Dataset: In the Dataset tab, you can select the Dataset for the cube along with additional filters
like the Date Filter and Business Exclusions.
Figure 129: Dataset tab
Select the required Dataset from the drop-down list. The From Clause and Join Condition for the
selected Dataset are displayed. To define the Date Filter, click the button; the Expression Builder
window is displayed. Define the required expression by selecting the appropriate Entities,
Functions, and Operator, and click OK. To define the Business Exclusion, click the button; the
Expression Builder window is displayed. Define the required expression by selecting the appropriate
Entities, Functions, and Operator, and click OK.

Formulae: The Formulae tab is specific to Essbase MOLAP. In the Formulae tab, you can apply filters
to a hierarchy node.
Figure 130: Formula tab
When you select a Dimension from the Selected Dimensions drop-down list, the mapped Hierarchies are
listed in the Hierarchies drop-down list. Click the button adjacent to Node Formula; the Expression
Builder window is displayed. Define the required expression by selecting the appropriate Entities,
Functions, and Operator, and click OK.

Roll Off: In the Roll Off tab, you can define the start date of the cube to specify the history of
the data which is to be picked up during aggregation. The maximum period of data history that can
be specified is 24 months. The Roll Off option is enabled only for BI-enabled hierarchies.
Figure 131: Roll Off tab
Turn ON the Roll Off Required toggle button. Click to specify the Roll Off Period value (in integer)
for which the data should be maintained in the system; the data is automatically rolled off with the
addition of new nodes to the cube. Select the Dimension for which you want to specify the roll off
period from the drop-down list. Select the Level from the drop-down list; the list contains the
hierarchy levels of the selected Dimension.

4. Click Save and save the Essbase Cube Definition details. A confirmation dialog is displayed.


The Cube definitions are stored in the repository and accessed for queries. Once saved, the cube
details are displayed with non-editable Code and Short Description fields.

5.10.2.1.2 Viewing Essbase Cube Details

You can view the metadata of a selected Essbase Cube definition at any given point. You need to be
mapped to the Essbase Cube Read Only User Role to view an Essbase Cube definition.
To view the existing Essbase Cube definition details:

From the Essbase Cube Summary window, select the Essbase Cube definition and click View.
The Essbase Cube Details window is displayed.
 The User Info tab displays the metadata properties such as Created By, Creation Date, Last
Modified By, Modified Date, Authorized By, and Authorized Date.
 The User Comments tab has a text field to enter additional information as comments about the
created Cube definition.

 Click Close.

5.10.2.1.3 Copying Essbase Cube Details

The Copy function is similar to “Save As” functionality and helps you to copy the pre-defined Essbase
Cube details to quickly create another Essbase Cube. Your User Group should be mapped to ‘Essbase
Cube Write’ User Role to copy the Cube details.
To copy Essbase Cube definition:

1. From the Essbase Cube Summary window, select the Essbase Cube definition and click Copy.
The Essbase Cube Details window is displayed.
2. Enter the Code, Short Description, Long Description and MDB Name. For more information, see
Create Essbase Cube section. You can also modify the cube components as required.
3. Click Save and save the updated details. A confirmation dialog is displayed.

5.10.2.1.4 Modifying Essbase Cube Details

1. From the Essbase Cube Summary window, select the Essbase Cube definition and click Edit. The
Essbase Cube Details window is displayed.
2. Modify the Essbase Cube definition with the cube components details as required. For more
information, see Create Essbase Cube section.
3. Click Save and save the updated details. A confirmation dialog is displayed.

5.10.2.1.5 Deleting Essbase Cube Details

You can remove Essbase Cube definitions that you created and that are no longer required in the system
by deleting them from the Essbase Cube Summary window. You need to have the Essbase Cube Write User
Role mapped to delete an Essbase Cube. The Delete function permanently removes the Essbase Cube
details from the database. Ensure that you have verified the details as indicated below:
• An Essbase Cube definition marked for deletion is not accessible for other users.


• Every delete action has to be Authorized/Rejected by the authorizer.


 On Authorization, the Essbase Cube details are removed.
 On Rejection, the Essbase Cube details are reverted back to authorized state.
• You cannot update Essbase Cube details before authorizing/rejecting the deletion.
• An un-authorized Essbase Cube definition can be deleted.
To delete an existing Essbase Cube:
1. From the Essbase Cube Summary window, select the Essbase Cube definition you want to delete
and click Delete. A confirmation dialog is displayed.
2. Click OK. The Cube details are marked for delete authorization.

5.11 References

5.11.1 Scenario to Understand Dataset Functionality


Consider a scenario where you want to analyze Customer Relationship Management through various
profiles of a customer against the various transactions and the channels through which the actual
transactions have happened.
This information is maintained in relational tables. In a typical Star Schema implementation of the
relations, customer profiles like Age, Gender, Residence, and Region are maintained in individual
Dimension tables. Similarly, the Transaction Types and Channels would be maintained in separate
Dimension tables. The actual transactions performed by the Customers are stored in a Fact table.
A Dataset allows you to collate all the tables with a valid join condition. The tables defined in the Dataset
would form the FROM clause while aggregating for the Cube.
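As an illustration of how such a Dataset resolves to SQL, the tables can be assembled into a FROM clause and the join conditions into a WHERE clause. The table and column names below are hypothetical, not OFSAAI metadata; this is a sketch of the idea, not the product's generated SQL:

```python
# Hypothetical star-schema dataset: a fact table plus dimension tables.
tables = ["FCT_TRANSACTIONS", "DIM_CUSTOMER", "DIM_CHANNEL"]
joins = [
    "FCT_TRANSACTIONS.N_CUST_ID = DIM_CUSTOMER.N_CUST_ID",
    "FCT_TRANSACTIONS.N_CHANNEL_ID = DIM_CHANNEL.N_CHANNEL_ID",
]

def dataset_clause(tables, joins):
    # The Dataset's tables form the FROM clause; its valid join
    # condition becomes the WHERE clause used while aggregating for the cube.
    return "FROM " + ", ".join(tables) + " WHERE " + " AND ".join(joins)

print(dataset_clause(tables, joins))
```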

5.11.2 Operator Types


The operators available are of three types:
• Arithmetic
• Comparison
• Other
The following table describes the Operator Type with an example.

Table 48: Operator Types

Arithmetic operators:
• + : CUR_BOOK_BAL = CUR_PAR_BAL + DEFERRED_CUR_BAL
• - : AS_OF_DATE = MATURITY_DATE - REMAIN_TERM_C
• * : Remaining Balance after Offset = Opening balance - (Expected balance on every payment date *
Mortgage offset %)
• / : CUR_PAYMENT = ORG_BOOK_BAL / (ORG_TERM / PMT_FREQ [in months])


Comparison operators:
• = : CUR_PAYMENT = principal + interest
• <> : If ADJUSTABLE_TYPE_CD <> 0, INTEREST_RATE_CD = 001 to 99999.
• > : If ORIGINATION_DATE > AS_OF_DATE, LAST_PAYMENT_DATE = ORIGINATION_DATE.
• >= : AS_OF_DATE >= ORIGINATION_DATE
• < : AS_OF_DATE < NEXT_REPRICE_DATE
• <= : If ORIGINATION_DATE <= AS_OF_DATE, LAST_PAYMENT_DATE >= ORIGINATION_DATE

Other operators:
• ( ) : Parentheses group segments of an expression to make logical sense. Example:
MATURITY_DATE <= NEXT_PAYMENT_DATE + (REMAIN_NO_PMTS_C * PMT_FREQ)
• , : The comma separates statements of a function.
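To make the operator rows concrete, the arithmetic example for "/" can be checked with plain numbers. The balance and term values below are made-up sample data, not values from any OFSAAI table:

```python
# Illustrative check of CUR_PAYMENT = ORG_BOOK_BAL / (ORG_TERM / PMT_FREQ [in months]).
org_book_bal = 120_000.0  # original book balance (sample value)
org_term = 360            # original term, in months (sample value)
pmt_freq = 1              # payment frequency, in months (sample value)

cur_payment = org_book_bal / (org_term / pmt_freq)
print(round(cur_payment, 2))  # 333.33 - the balance spread over 360 payments

# Comparison operators behave as in the table, for example:
as_of_date, origination_date = "1994-06-30", "1994-01-15"
print(as_of_date >= origination_date)  # True for these ISO-formatted dates
```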

5.11.3 Function Types and Functions


You select the type of function for your expression from the Type list.
The choices are:
• Mathematical Functions
• Date Functions
• String Functions
• Other Functions
The type of function you select determines the choices available in the Function box. These unique
functions in the Functions sub-container enable you to perform various operations on the data.
The following table lists each available function, with details on the operation of each.


Table 49: Function Name and Type Description

Function Type: Mathematical

Absolute: Notation ABS(a). Returns the positive value of the database column. Syntax: {ABS(}
followed by {EXPR1 without any embedded or outermost left-right parentheses pair} followed by {)}.
Example: ABS(-3.5) = 3.5. ABS(F), ABS(F + C), ABS(F + C * R + F) are possible. However,
ABS((F + C + R)), ABS((F + (MAX * CEILING))) are not possible.

Ceiling: Notation Ceiling(a). Rounds a value to the next highest integer. Syntax: Ceiling(column or
expression). Example: 3.1 becomes 4.0; 3.0 stays the same.

Greatest: Notation Greatest(a,b). Returns the greater of 2 numbers, formulas, or columns. Syntax:
GREATEST(column or expression, column or expression). Example: Greatest(1.9, 2.1) = 2.1.

Least: Notation Least(a,b). Returns the lesser of 2 numbers, formulas, or columns. Syntax:
LEAST(column or expression, column or expression). Example: Least(1.9, 2.1) = 1.9.

Natural Log: Notation LN(a). Returns the natural logarithm of a number. Natural logarithms are based
on the constant e (2.71828182845904). Syntax: LN(number), where number is the positive real number
for which you want the natural logarithm. Example: LN(86) equals 4.454347; LN(2.7182818) equals 1.

Minimum: Notation Min(a). Returns the minimum value of a database column. Syntax: Min(Column).

Maximum: Notation Max(a). Returns the maximum value of a database column. Syntax: Max(Column).
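The mathematical rows above have direct Python equivalents, which can be used to sanity-check the table's examples:

```python
import math

# Python equivalents of the mathematical functions listed above,
# checked against the examples given in the table.
print(abs(-3.5))               # Absolute: ABS(-3.5) = 3.5
print(math.ceil(3.1))          # Ceiling: 3.1 becomes 4
print(max(1.9, 2.1))           # Greatest: 2.1
print(min(1.9, 2.1))           # Least: 1.9
print(round(math.log(86), 6))  # Natural Log: LN(86) = 4.454347
```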


Power: Notation Power(a,b), that is, POWER(coefficient, exponent). Raises one value to the power of
a second. Syntax: {POWER(} followed by {EXPR1 without any embedded or outermost left-right
parentheses pair} followed by {,} followed by {EXPR1 without any embedded or outermost left-right
parentheses pair} followed by {)}. Valid examples: POWER(F, R); POWER(F + C * R, F / R). Invalid
examples: POWER((F/R), F + R); POWER((F + C), (C * R)); POWER(F + POWER, R); POWER(MAX, C).

Round: Notation Round(a,b), that is, ROUND(number, precision). Rounds a value to a number of decimal
places. Syntax: Round(x, n) returns x rounded to n decimal places. Example:
Round(10.52354, 2) = 10.52.

Sum: Notation Sum(a). Sums the total value of a database column. Sum is a multi-row function, in
contrast to +, which adds 2 or more values in a given row (not column). Syntax: Sum(Column).


Weighted Average: Notation WAvg(a,b), that is, WAvg(column being averaged, weight column). Takes a
weighted average of one database column by a second column. WAvg cannot appear in any expression. If
you have two formulas called F1 and F2, both of which are WAvg functions, then you can form a third
formula F3 as F1 + F2. If F3 is chosen as a calculated column, then an error message appears and the
SQL code is not generated for that column. This is similar for nested WAvg functions, if F3 is WAvg
and it has F1 or F2 or both as its parameters. Syntax: WAvg(Column A, Column B). Example:
WAvg(DEPOSITS.CUR_NET_RATE, DEPOSITS.CUR_BOOK_BAL).

Note: You cannot use the Maximum and Minimum functions as calculated columns or in Data Correction
Rules. The Maximum, Minimum, Sum, and Weighted Average functions are multi-row formulas. They use
multiple rows in calculating the results.
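The multi-row behavior of WAvg can be sketched in plain Python; the rate and balance rows below are made-up sample data, not DEPOSITS values:

```python
# Weighted average of a rate column by a balance column, in the spirit of
# WAvg(DEPOSITS.CUR_NET_RATE, DEPOSITS.CUR_BOOK_BAL). Sample rows only.
cur_net_rate = [0.05, 0.03, 0.04]
cur_book_bal = [1000.0, 3000.0, 1000.0]

def wavg(values, weights):
    # A multi-row formula: it consumes whole columns, unlike "+",
    # which combines values within a single row.
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

print(wavg(cur_net_rate, cur_book_bal))  # (50 + 90 + 40) / 5000 = 0.036
```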

Function Type: Date

Build Date: Notation BuildDate(year, month, days). Requires three parameters (CCYY, MM, DD): century
and year, month, day. It returns a valid date and enables you to build a date from components.
Syntax: BUILDDATE(CCYY,MM,DD). Example: BuildDate(95,11,30) is invalid (invalid century);
BuildDate(1995,11,30) is valid. CAUTION: If the parameters are entered incorrectly, the date is
invalid.
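A minimal sketch of BuildDate's behavior, assuming the check is simply that a full century-year is supplied and that the components form a real calendar date (this is an illustration of the documented semantics, not the product's implementation):

```python
from datetime import date

def build_date(ccyy, mm, dd):
    # Build a date from (CCYY, MM, DD) components, as BuildDate does.
    # A two-digit year such as 95 is rejected: the century must be given.
    if ccyy < 1000:
        raise ValueError("invalid century")
    # date() itself raises ValueError for impossible month/day values.
    return date(ccyy, mm, dd)

print(build_date(1995, 11, 30))  # 1995-11-30
# build_date(95, 11, 30) raises ValueError: invalid century
```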


Go Month: Notation GoMonth(date, months). Advances a date by x number of months. GoMonth does not
know the calendar; for example, it cannot predict the last day of a month. Syntax: GOMONTH(Date
column, Number of months to advance). Example: GOMONTH(DEPOSITS.ORIGINATION_DATE,
DEPOSITS.ORG_TERM). Valid examples: GOMONTH(F, F + R + C); GOMONTH(F, R). Invalid examples:
GOMONTH(F + (R + C), MAX); GOMONTH((F * C), F).
Typical functionality is illustrated in the following table:

Date Column | No of Months | GOMONTH | Comment
1/31/94 | 1 | 2/28/94 | Because 2/31/94 does not exist
1/15/94 | 2 | 3/15/94 | Exactly 2 months: 15th to 15th
2/28/94 | 3 | 5/28/94 | Goes 28th to 28th; does not know that 31st is the end of May
6/30/94 | -1 | 5/30/94 | Goes back 30th to 30th; does not know that 31st is the end of May
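The clamping behavior in the table (1/31/94 plus one month giving 2/28/94) can be reproduced with a short add-months helper. This is a sketch of the documented semantics, not the product's implementation:

```python
import calendar
from datetime import date

def go_month(d, months):
    # Advance a date by a number of months, keeping the day-of-month where
    # possible and clamping to the target month's last day otherwise.
    total = d.month - 1 + months
    year = d.year + total // 12
    month = total % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

print(go_month(date(1994, 1, 31), 1))   # 1994-02-28, since 2/31/94 does not exist
print(go_month(date(1994, 1, 15), 2))   # 1994-03-15, exactly 2 months
print(go_month(date(1994, 6, 30), -1))  # 1994-05-30, back 30th to 30th
```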

Year: Notation Year(date). Year(x) returns the year for x. Syntax: Year(Column) returns the year in
the column, where the column is a date column. Example: Year(Origination Date) returns the year of
the origination date.

Month: Notation Month(date). Month(x) returns the month in x, where x is a numbered month. Syntax:
Month(Column) returns the month in the column, where the column is a date column. Example: Month(9)
returns September; Month(Origination Date) returns the month of the origination date.


Function Function
Notation Description Syntax Example
Type Name

String Trim All AllTrim(a) Trims leading


and following
spaces,
enabling the
software to
recognize
numbers
(entered in
All Trim) as a
numeric value,
which can
then be used
in calculating

Function Type: Other
Function Name: If statement
Notation: If(a=b,c,d)
Description: The IF function should always have an odd number of parameters separated by commas. The first parameter is an expression followed by a relational operator, which is in turn followed by an expression.
Note: Avoid embedding multiple individual formulas in subsequent formulas. This can create an invalid formula.
Syntax: If(Condition, Value if True, Value if False). That is, {IF(} followed by EXPR2, followed by {> | < | <> | = | >= | <=}, followed by EXPR2, followed by {{,} followed by EXPR {,} followed by EXPR}n, followed by {)}, where n = 1, 2, 3, .....
Example: If(LEDGER_STAT.Financial = 110, LEDGER_STAT.Month 1 Entry, 0)
IF(((MAX + SUM) >= 30), F, POWER) is valid.
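The odd-parameter rule can be read as a chain of condition/value pairs with a final default; this illustrative Python sketch (the function name is hypothetical, not part of the product) shows why an even parameter count is invalid:

```python
def if_chain(*args):
    """Illustrative IF(cond1, val1, cond2, val2, ..., default):
    pairs of (condition, value) followed by one default value,
    which is why the parameter count must always be odd."""
    if len(args) % 2 == 0:
        raise ValueError("IF requires an odd number of parameters")
    *pairs, default = args
    for cond, val in zip(pairs[0::2], pairs[1::2]):
        if cond:
            return val
    return default

# IF(Financial = 110, Month-1 entry, 0) with sample values:
financial, month1_entry = 110, 2500
print(if_chain(financial == 110, month1_entry, 0))
```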


Function Name: Lookup
Notation: Lookup(OrigCol,LookupCol,…,ReturnedCol)
Description: Enables you to assign values equal to values in another table for data correction. The LOOKUP function should always have an odd number of parameters separated by commas, with a minimum of 3 parameters.
Note: Lookup is used exclusively for data correction.
Syntax: Lookup(O1,L1,O2,L2,...On,Ln,R), where O = Column from the Original table, L = Column from the Lookup table, and R = Column to be Returned. So the previous statement would read: where O1=L1 and O2=L2 ..., return value R.
Valid examples: LOOKUP(F, R, R); LOOKUP(F, R, F, F, F)
Invalid examples: LOOKUP(F); LOOKUP(F, R); LOOKUP(F + R, (F + R), MAX)
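The LOOKUP parameter pattern — matching pairs of original/lookup columns followed by one returned column — can be sketched as follows (illustrative Python over row dictionaries; all names are hypothetical):

```python
def lookup(orig_row, lookup_rows, pairs, returned_col):
    """Illustrative LOOKUP sketch: find the lookup-table row where every
    (orig_col, lookup_col) pair matches, and return returned_col."""
    for row in lookup_rows:
        if all(orig_row[o] == row[l] for o, l in pairs):
            return row[returned_col]
    return None

orig = {"product_id": 7}
corrections = [{"id": 7, "clean_name": "Savings"}]
print(lookup(orig, corrections, [("product_id", "id")], "clean_name"))
```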

5.11.4 Creating Expression using Expression Builder


You can define an expression in the Expression Builder window to join two selected tables. Click button to display the Expression Builder window.


Figure 132: Expression Builder window

The Expression window consists of the following sections:


• Entities - consists of the Entities folder with the list of tables that you selected from the Entity Groups folder. Double-click the Entities folder to view the selected dimension tables (Product and Segment tables).
• Functions - consists of functions that are specific to databases such as Oracle and MS SQL Server. You can use these functions along with Operators to specify the join condition.
The Function categories are displayed based on the database types, as tabulated.

Table 50: Database and its Functions

Transact SQL - Specific to MS SQL Server; consists of Date & Time, Math, and System functions.
SQL OLAP - Specific to Microsoft OLAP; consists of Array, Dimension, Hierarchy, Logical, Member, Number, Set, and String functions.
SQL - Specific to Oracle; consists of Character, Conversion, Date, and Numeric functions.


NOTE It is not mandatory to specify a Function for a join condition.

• Operators - consists of the function operators categorized into folders, as tabulated.

The following table shows the Operators and their types.

Table 51: Operator and its Types

Operator Types

Arithmetic +, -, %, * and /

Comparison '=', '!=', '< >', ‘>', '<', 'IN', 'NOT IN, 'ANY', 'SOME', 'LIKE' and 'ALL'.

Logical 'NOT', 'AND' and 'OR'

Set UNION, UNION ALL, INTERSECT and MINUS

Others The Other operators are 'PRIOR', '(+)', '(' and ')'.

To specify the join condition:


1. Select the Entity of the fact table to which you want to join the dimension entities.
2. Select a Function depending on the database type.
3. Select the Operator which you want to use for the join condition.
4. Select the second Entity from the Entities pane that you want to join with the first entity. You can
also select more than one dimension table and link to the fact table.
The defined expression is displayed in the Expression pane. You can click Reset to clear the
Expression pane.
5. Click OK and save the join condition details.

5.11.5 Base and Computed Measures


A Base Measure refers to a measure where the aggregation is done directly on the raw data from the database. It represents some operation on the actual data available in the warehouse and the storage of that data in its aggregated form in another data store. This is different from metrics that are not stored in physical form, but as functions that can be operated on other measures at viewing time. Such functions defined on other measures are called Computed Measures and are dealt with separately. The choice between a base and a computed measure is as much a design decision about storage optimality as it is about the desired query response speed. A Base Measure is the metric definition itself, such as amount of sales or count of customers.
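The distinction can be sketched as follows (illustrative Python; the measure names and values are hypothetical): a base measure is aggregated from raw data and stored, while a computed measure is a function over other measures evaluated at viewing time.

```python
# Base measures: aggregated from raw rows and stored in a data store.
raw_sales = [120.0, 80.0, 200.0]
stored = {"total_sales": sum(raw_sales), "customer_count": 2}

# Computed measure: a function over stored measures, evaluated on demand
# rather than stored in physical form.
def sales_per_customer(measures):
    return measures["total_sales"] / measures["customer_count"]

print(stored["total_sales"])
print(sales_per_customer(stored))
```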


5.11.6 Business Hierarchy Types


The available Business Hierarchies are as tabulated.

Table 52: Business Hierarchy and its Description

Regular - In a Regular Hierarchy Type, you can define the following Hierarchy Sub Types:
• Non Business Intelligence Enabled - In a non Business Intelligence Enabled Hierarchy, you need to manually add the required levels. The levels defined form the Hierarchy.
• Business Intelligence Enabled - You can enable a Business Intelligence hierarchy when you are not sure of the hierarchy structure leaf values, when the information is volatile, or when the hierarchy structure can be selected directly from RDBMS columns. The system automatically detects the values based on the actual data.
• Parent Child - Select this option to define a Parent Child type hierarchy.

Measure - A Measure Hierarchy consists of the defined measures as nodes and has only Non Business Intelligence Enabled as its Hierarchy Sub Type.

Time - A Time Hierarchy consists of levels/nodes of high time granularity and has only Business Intelligence Enabled as its Hierarchy Sub Type.

You can select the required Business Hierarchy from the drop-down list and specify the Hierarchy Sub Type details. The window options differ for each Hierarchy Type selected. Click the following links to view each section in detail.
• Regular Hierarchy
• Measure Hierarchy
• Time Hierarchy

5.11.6.1 Regular Hierarchy


When you select Regular Hierarchy, you can define the Hierarchy Sub Type for Non Business Intelligence
Enabled, Business Intelligence Enabled, and Parent Child Hierarchy. Select the required Hierarchy Sub
Type from the drop-down list. Click on the following links to view the section in detail.
• Non Business Intelligence Enabled Hierarchy
• Business Intelligence Enabled Hierarchy
• Parent Child Hierarchy

5.11.6.1.1 Non Business Intelligence Enabled Hierarchy

When you have selected Regular - Non Business Intelligence Enabled Hierarchy option, do the following:

1. Click button in the Entity field. The Entity and Attribute window is displayed.


Figure 133: Entity and Attribute window

 You can either search for a specific Entity using the Search and Filter pane or select the checkbox adjacent to the required Entity in the Available Entities list. The list of defined Attributes for the selected entity is displayed in the Available Attributes list.
 You can either search for a specific Attribute using the Search and Filter pane or select the checkbox adjacent to the required Attribute in the Available Attributes list.
 Click Save. The selected Entity and Attribute are displayed in the Add Business Hierarchy window.

NOTE Ensure that the values present in the Attribute column do not contain new line characters, because the hierarchy node descriptions in the hierarchy browser are treated as text fields and do not permit new line characters.

2. Click button from the Business Hierarchy tool bar. The Add Node Values window is displayed.


Figure 134: Add Node Values window

 Enter the details in Hierarchy Values section as tabulated.


The following table describes the fields in the Add Node Values window.

Table 53: Fields in the Add Node Values window and their Description

Field Description

Node The Node value is auto-populated and is editable.

Short Description Enter the required short description for the node.

Node Identifier Click button and define an expression in the Expression window for
the Node Identifier. For more information, see Create Expression.

Sort Order Enter the sort order as a numeric value.
Note: The sort order of the default (OTHERS) node should be greater than that of the rest of the nodes if this hierarchy is used in an RRF Filter condition.

 From the Node Attributes grid, select the Storage Type from the drop-down list. There are four Storage Types, as described in the following table.

Table 54: Storage Types and their Description

Data Store - This storage type allocates a data cell in the database for the information to be stored. The consolidated value of the data is stored in this cell. The consolidation for the node occurs during the normal process of rollup.

Dynamic Calc - In this storage type, no cell is allocated, and the consolidation is done when the data is viewed. The consolidation for the node is ignored during the normal process of rollup; the consolidation of the node occurs when you use the OLAP tool to view the data.

Dynamic Calc & Store - In this storage type, a cell is allocated, but the data is stored only when it is consolidated on being viewed for the first time. The consolidation for the node is ignored during the normal process of rollup; it occurs only when you first retrieve the data from the database.

Label - In this storage type, a cell is not allocated, nor is the data consolidated; it is only viewed.
Note: The Label storage type is specific to Essbase MOLAP. Storage Type is applicable only for the Regular hierarchy type and Measure. If you want to specify a dynamic calc option on level members in a multi-level time hierarchy, this is provided through the OLAP execution utility.

 Click Save. The Node values are displayed in the Add Business Hierarchy window.
3. Click Save in the Add Business Hierarchy window to save the details.
In the Business Hierarchy toolbar, you can also do the following:

 Click button to add subsequent node(s). For the second or subsequent node, you can define the Hierarchy Tree and Node Attributes details as explained below.
The following table describes the fields in the Hierarchy Browser pane.

Table 55: Hierarchy Browser pane Fields and their Description

Add Hierarchy Node - Click button adjacent to the Child of field and select the required Member in the Hierarchy Browser window. Click OK.

Consolidation Type - The Consolidation Type option is available for Essbase MOLAP. There are six consolidation types: Addition, Subtraction, Product, Division, Percent, and Ignore. Select the required option from the drop-down list.

 Click button after selecting the required Node level checkbox to edit the Node details.

 Click button to delete the defined Node details.

5.11.6.1.2 Business Intelligence Enabled Hierarchy

When you have selected the Regular - Business Intelligence Enabled Hierarchy option, do the following:
1. Select the Total Required checkbox if you want the total of all the nodes.
2. Select the List checkbox to retrieve information from the database when queried.

NOTE A List hierarchy can have only one level, and you cannot select the List option if the Total Required option has been selected. See List Hierarchy.


3. Click button in the Entity field. The Entity and Attribute window is displayed.
 You can either search for a specific Entity using the Search field or select the checkbox adjacent to the required Entity in the Available Entities list. The list of defined Attributes for the selected entity is displayed in the Available Attributes list.
 You can either search for a specific Attribute using the Search field or select the checkbox adjacent to the required Attribute in the Available Attributes list.
 Click Save. The selected Entity and Attribute are displayed in the Add Business Hierarchy window.

NOTE Ensure that the values present in the Attribute column do not contain new line characters, because the hierarchy node descriptions in the hierarchy browser are treated as text fields and do not permit new line characters.

4. Click button from the Business Hierarchy tool bar. The Add Hierarchy levels window is displayed.
 Enter the details in Level Details section as tabulated.

Table 56: Fields in the Business Hierarchy window

Field Description

Level The Level value is auto-populated and is editable.

Short Description Enter the required short description for the level.

Level Identifier Click button and define an expression in the Expression window for the
Level Identifier. For more information, see Create Expression.

Level Description Click button and define an expression in the Expression window for the
Level Description. For more information, see Create Expression.

 Click Save. The Level details are displayed in the Add Business Hierarchy window.
The BI Hierarchy value refresh on the On Load property is not functional for data loads performed through Excel Upload. It is applicable only to data loads that run through a batch process.
5. Click Save in the Add Business Hierarchy window to save the details.
In the Business Hierarchy tool bar, you can also do the following:
• Click button to add subsequent Levels. For the second or subsequent levels, the level numbers are incremented.

• Click button after selecting the required level checkbox to edit the Level details.

• Click button to delete the defined Level details.

5.11.6.1.3 Parent Child Hierarchy

When you have selected Regular - Parent Child Hierarchy option, do the following:


1. Click button in the Entity field. The Entity and Attribute window is displayed.
 You can either search for a specific Entity using the Search field or select the checkbox adjacent to the required Entity in the Available Entities list. The list of defined Attributes for the selected entity is displayed in the Available Attributes list.
 You can either search for a specific Attribute using the Search field or select the checkbox adjacent to the required Attribute in the Available Attributes list.
 Click Save. The selected Entity and Attribute are displayed in the Add Business Hierarchy window.

NOTE Ensure that the values present in the Attribute column do not contain new line characters, because the hierarchy node descriptions in the hierarchy browser are treated as text fields and do not permit new line characters.

2. The Business Hierarchy section displays the pre-defined nodes such as Child code, Parent Code,
Description, Storage Type, Consolidation Type, and Formula. You can modify the node values by
doing the following:

 Click button from the Business Hierarchy tool bar. The Edit Hierarchy Values window is
displayed.

 Click button adjacent to the required node field and define the expression in the Expression
window. For more information, see Create Expression.
 Click Save. The node details are displayed in Add Business Hierarchy window.
3. Click Save in the Add Business Hierarchy window and save the details.


NOTE • When the size of the hierarchy is large, a Parent Child Hierarchy can be configured to be treated as a Business Intelligence enabled hierarchy for optimal performance. The hierarchy behaves like a non-Business Intelligence hierarchy until a limit on the number of nodes is reached. This limit (default value 2048), which decides whether a hierarchy is BI or non-BI, is configurable and can be given a value considering the system and JVM capabilities.
• Creating Parent Child Hierarchy with Roll-up Option - It is possible to roll up the values of child nodes in a Parent Child hierarchy to the parent level. If the parent node itself has some value and its child nodes also have associated values, the value of the parent node can be displayed as the sum of its own value and the child values.
To use the Roll-up option, you must specify parameters in the Consolidation Type for the node field. Based on the column specified in the Consolidation Type field, the values of the child nodes are rolled up, that is, added to the parent level. This can then be viewed using the OBIEE reporting server. However, when the Consolidation Type is not selected, it is referred to as a Parent Child Hierarchy without the Roll-up option.
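The roll-up behavior described in the note can be sketched as follows (illustrative Python; the node names and values are hypothetical): a parent's displayed value becomes its own value plus the recursively rolled-up values of its children.

```python
def rolled_up_value(node, children_of, own_value):
    """Illustrative parent-child roll-up: a node's displayed value is
    its own value plus the (recursively rolled-up) child values."""
    total = own_value.get(node, 0)
    for child in children_of.get(node, []):
        total += rolled_up_value(child, children_of, own_value)
    return total

# The parent has its own value (100) plus its children's values (40 + 60).
children_of = {"Region": ["Branch A", "Branch B"]}
own_value = {"Region": 100, "Branch A": 40, "Branch B": 60}
print(rolled_up_value("Region", children_of, own_value))
```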

5.11.6.2 Measure Hierarchy


When you select Measure Hierarchy, the Hierarchy Sub Type is selected as Non Business Intelligence
Enabled by default. To define a Measure Hierarchy in the Add Business Hierarchy window, do the
following:

1. Click button in the Entity field. The Entity and Attribute window is displayed.
 You can either search for a specific Entity using the Search field or select the checkbox adjacent to the required Entity in the Available Entities list. The list of defined Attributes for the selected entity is displayed in the Available Attributes list.
 You can either search for a specific Attribute using the Search field or select the checkbox adjacent to the required Attribute in the Available Attributes list.
 Click Save. The selected Entity and Attribute are displayed in the Add Business Hierarchy window.

NOTE Ensure that the values present in the Attribute column do not contain new line characters, because the hierarchy node descriptions in the hierarchy browser are treated as text fields and do not permit new line characters.


2. In the Add Business Hierarchy window, select the Hierarchy Type as Measure.

3. Click button in the Entity field. The Entity and Attribute window opens.
 A list of all the available entities will be listed under Available Entities. Select the required
entity. The attributes for that entity will be listed under Available Attributes.
 Select the required Attribute and click Save. Click Cancel to quit the window without saving.
After saving, the Entity and Attribute will be displayed in their respective fields.
4. Click button from the Business Hierarchy tool bar. The Add Node Values window is displayed.
Enter the details in the Node Details section as tabulated.
The following table describes the fields in the Add Node Values window.

Table 57: Fields in the Add Node Values window and their Description

Field Description

Node The Node value is auto-populated and is editable.

Short Description Enter the required short description for the node.

 In the Node Attributes section, do the following:

 Select the Storage Type from the drop-down list. For more information, see the Storage Types section.
 Select the TB Type as First, Average, or Last from the drop-down list.
 Click Save. The Node values are displayed in the Add Business Hierarchy window.
5. Click Save in the Add Business Hierarchy window to save the details.
In the Business Hierarchy tool bar, you can also do the following:
• Click button to add subsequent Nodes/Measures. For the second or subsequent node, you can also define the Hierarchy Tree and Consolidation Type details as explained below.
The following table describes the fields in the Business Hierarchy tool bar.

Table 58: Fields in the Business Hierarchy Tool bar and their Description

Select Hierarchy Node - Click button adjacent to the Child of field and select the required Member in the Hierarchy Browser window. Click OK.

Consolidation Type - The Consolidation Type option is available for Essbase MOLAP. There are six consolidation types: Addition, Subtraction, Product, Division, Percent, and Ignore. Select the required option from the drop-down list.

• Click button after selecting the required Node level checkbox to edit the Node details.

• Click button to delete the defined Node details.


5.11.6.3 Time Hierarchy


When you select Time Hierarchy, the Hierarchy Sub Type is selected as Business Intelligence Enabled and
the “Total Required” checkbox is selected by default.
To define a Time Hierarchy in the Add Business Hierarchy window, do the following:

1. Click button in the Entity field.


The Entity and Attribute window is displayed.
 You can either search for a specific Entity using the Search field or select the checkbox adjacent to the required Entity in the Available Entities list. The list of defined Attributes for the selected entity is displayed in the Available Attributes list.
 You can either search for a specific Attribute using the Search field or select the checkbox adjacent to the required Attribute in the Available Attributes list.
 Click Save.
The selected Entity and Attribute are displayed in the Add Business Hierarchy window.

NOTE Ensure that the values present in the Attribute column do not contain new line characters, because the hierarchy node descriptions in the hierarchy browser are treated as text fields and do not permit new line characters.

2. Select the Time Hierarchy Type from the drop-down list. Depending on the selection, the
Hierarchy Levels are displayed in the Business Hierarchy section.
You can also Edit the required Hierarchy Level. Select the checkbox adjacent to the required Level
and click button.
The Edit Hierarchy Levels window is displayed. You can update Short Description, Level Identifier,
and Level Description details.
3. Specify Hierarchy Start Date by selecting Month and Day from the drop-down list.
4. Click Save and save the Time Hierarchy details.

5.11.6.4 Large Hierarchy Type


A large hierarchy refers to a hierarchy having a large number of leaf levels. In order to provide efficient and optimized hierarchy handling, such a hierarchy is flagged as Large in Oracle Infrastructure. A default value is set for the number of hierarchy nodes that a hierarchy can contain, for example, 100. If a hierarchy exceeds the specified default value, the system treats it as a large hierarchy.


NOTE • The maximum hierarchy node limit can be configured to a higher number in the FIC_HOME / CONFIG file. However, the recommended default value is 100.
• A large hierarchy is possible only when you are defining a Time or BI enabled hierarchy.
• A large hierarchy cannot be user-defined; it is handled automatically by the system.

5.11.6.5 List Hierarchy Type


A list hierarchy is a flat hierarchy, that is, one with only one level. In a list hierarchy, all the nodes are displayed, unlike in a large hierarchy. You can create a hierarchy based on business terms such as Customer, Product, Geography, and so on. The information for this hierarchy is generated from the metadata framework, which encapsulates these business terms. This enables you to generate a report in the OBIEE reporting server based on these business terms.
The advantage of defining a list hierarchy is that you need not know technical terminology or have technical knowledge. It also allows you to specify a range of values. You can also define a summary or group total and perform a sort on the list hierarchy based on the hierarchy member value or attribute value; these two features are available only for the fact-less view.
Note that when you save a BI enabled hierarchy, the defined hierarchy structure is formed (in a back-end process) and stored in an XML format (as Hierarchycode.xml) on the application server. However, when you save a BI-enabled List hierarchy, the hierarchy structure is not formed, and hence no BIHIER.XML is formed. Whenever this hierarchy is queried, the data is fetched from the atomic database.

5.11.7 Measure Types


You can choose the type of computed measure you want. The type options available are as follows:
• Simple Relationship
• Growth Function
• Time-series Function
• Other – referring to the advanced mode, where you can define measures to suit your requirements.
Each of the computed measure types has sub-types. Each of these sub-options is explained below to help
you choose the right computed measure type.

5.11.7.1 Simple Relationship


The Simple Relationship type computed measure is of five types. They are:
• Ratio
• Ratio as Percentage
• Difference
• Addition
• Percentage Difference


1. When you select the Ratio option, the window displays a simple ratio of two measures. To define the relationship as a ratio, double-click the first <<Select Measure>> option to open the Select Measure pop-up.
2. The pop-up displays the Measure folder. Double-click the folder to expand the list of measures under it. Depending on the Information Domain you are logged in to, the measures for that domain are displayed.
3. Select the measure for which you want to compute the ratio and click OK. To close the pop-up without saving the selected measure option, click Cancel. Repeat the same procedure to choose the second measure.

NOTE The method of selecting the Measures is common to all the sub-options of the Simple Relationship type.

When you select the Ratio as Percentage option, the window displays the ratio percentage of the selected measures. When you select the Difference option, the value displayed is the difference between the two selected measures. When you select the Addition option, the summed value of the selected measures is displayed. When you select the Percentage Difference option, the percentage difference of the selected measures is computed.
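The five Simple Relationship sub-types reduce to elementary arithmetic over two measures a and b. The following illustrative Python sketch is not the product's engine, and the percentage-difference formula (relative to the second measure) is an assumption for illustration:

```python
def simple_relationship(a, b, kind):
    """Illustrative arithmetic behind the five Simple Relationship
    computed-measure sub-types."""
    return {
        "ratio": a / b,
        "ratio_as_percentage": 100.0 * a / b,
        "difference": a - b,
        "addition": a + b,
        # Assumed convention: difference relative to the second measure.
        "percentage_difference": 100.0 * (a - b) / b,
    }[kind]

print(simple_relationship(50.0, 200.0, "ratio"))
print(simple_relationship(50.0, 200.0, "ratio_as_percentage"))
```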

5.11.7.1.1 Growth Function

Growth type computed measures are used to calculate the growth of a measure over a certain time period. The Growth type measures are of two types:
• Absolute – the growth of a measure is calculated in absolute terms, that is, as a simple difference.
• Percentage – the growth of a measure is calculated on a percentage basis.
Absolute Growth Option
1. Select the Absolute Growth option and enter the details as tabulated.
The following table describes the fields in the Absolute Growth option.

Table 59: Fields in the Absolute Growth Option and their Description

Select the base on which to calculate the growth - Select it from the drop-down list. The available option is Consecutive Period.
Select the period - Select the period from the drop-down list for which you want the growth to be monitored. The available options are Year, Quarter, or Month.

NOTE If the time Dimension periods specified in the cube are Year, Quarter, and Month, it takes the previous period of the Time Level.


2. Select the measure from the Select the Measure pane. Depending on the Information Domain you are logged in to, the measures for that domain are displayed in the pane. On selecting the measure, its growth is calculated for the consecutive period of a year.
Percentage Growth Option
1. Select the Percentage Growth option and enter the details as tabulated.
The following table describes the fields in the Percentage Growth option.

Table 60: Fields in the Percentage Growth Option and their Description

Select the base on which to calculate the growth - Select it from the drop-down list. The available option is Consecutive Period.
Select the period - Select the period from the drop-down list for which you want the growth to be monitored. The available options are Year, Quarter, or Month.

2. Select the measure from the Select the Measure pane. Depending on the Information Domain you are logged in to, the measures for that domain are displayed in the pane. On selecting the measure, its growth is calculated for the consecutive period of a year.
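Both growth sub-types compare a measure with its value in the previous (consecutive) period; this illustrative Python sketch (the series and period labels are hypothetical) shows the two calculations:

```python
def absolute_growth(series, period):
    """Growth versus the consecutive (previous) period, as a difference."""
    i = list(series).index(period)
    values = list(series.values())
    return values[i] - values[i - 1]

def percentage_growth(series, period):
    """Growth versus the consecutive (previous) period, as a percentage."""
    i = list(series).index(period)
    values = list(series.values())
    return 100.0 * (values[i] - values[i - 1]) / values[i - 1]

quarterly_sales = {"Q1": 200.0, "Q2": 250.0}
print(absolute_growth(quarterly_sales, "Q2"))
print(percentage_growth(quarterly_sales, "Q2"))
```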

5.11.7.1.2 Time-Series Function

The Time Series type measures are time dependent. The Time Series types are:
• Aggregation type – This option computes an estimate of the periodical performance on a period-to-date basis.
• Rolling Average – This option computes the average of the previous N values based on the given dynamic value (N). This dynamic range could vary from a period of three months to any number of months.
Aggregation Type Option
1. Select the Aggregate option.
2. Select the measure from the Select the Measure pane. Depending on the Information Domain you are logged in to, the measures for that domain are displayed in the pane.
Rolling Average Option
1. Select the Rolling Average option.
2. Enter the rolling average in the Select the number of periods for which to calculate the rolling average field.

NOTE The duration/period refers to the number of periods with respect to the current level in the time dimension of the chosen cube. That is, the rolling average is (the current value of the time dimension + the previous X values) / (X + 1), where X is the number of periods you specify (for example, 10).


3. Select the measure from the Select the Measure pane. Depending on the Information Domain you
are logged in to, the measures for that domain are displayed in the pane.
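The note's formula — (current value + previous X values) / (X + 1) — can be sketched as follows (illustrative Python, not the product's engine):

```python
def rolling_average(values, x):
    """Average of the current value and the previous x values,
    i.e. (current + previous x values) / (x + 1)."""
    window = values[-(x + 1):]
    return sum(window) / len(window)

monthly = [10.0, 20.0, 30.0, 40.0]
print(rolling_average(monthly, 3))  # (10 + 20 + 30 + 40) / 4
```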

5.11.7.1.3 Other (Advanced Mode) Type

The Advanced computed measures option allows you to specify a formula for computation of the measure. To enter the formula, it is assumed that you are familiar with MDB-specific OLAP functions.
There are two ways in which you can enter a formula.
You can define the function/condition for a measure and/or dimension by entering the expression in the pane. It is not essential that you select the measure/dimension and the function in the order displayed. You can select the function and then proceed to specify the parameters, which can be either a measure or a dimension or both.
You can define it by following the procedure below:
Selecting the Measure
1. Click Insert Measure to open the Select Measure pop-up. The pop-up displays the Measure folder. Double-click the folder to expand the list of measures under it. Depending on the Information Domain you are logged in to, the measures for that domain are displayed.
2. Click OK to confirm the measure selection. To close the pop-up without saving the selected measure option, click Cancel.
Selecting the Dimension
1. Click Insert Dimension to open the Select Dimension pop-up. The pop-up displays the Dimension folder. Double-click the folder to expand the list of dimensions under it. Depending on the Information Domain you are logged in to, the dimensions for that domain are displayed.
2. Click OK to confirm the dimension selection. To close the pop-up without saving the selected dimension option, click Cancel.
Selecting the Function
1. Click Insert Function to open the Select Function pop-up. Double-click the Functions folder to expand the list of functions within it. The functions available are those specific to Essbase. The parameters for the function are displayed in the Parameters pane.

NOTE The functions displayed are based on the OLAP type and therefore vary for SQL OLAP and Essbase.

2. Click OK to confirm the function selection. To close the pop-up without saving the selected function option, click Cancel.

5.11.8 Read Only Selected in Mapper Window


1. After selecting the Read Only option in the Mapper window (New), click Save.

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 290



2. In the Mapper List window, the Read Only option against the created Map appears as Y. Select
the defined Map and click the button. The Mapper window is displayed.
3. The Save Mapping and Delete Mapping options are disabled.
4. Select the Node and click View Mapping. The View Mapping window is displayed. The Delete
button is inactive.
5. Click Close to exit the window.


6 Data Entries Forms and Queries


Data Entry Forms and Queries (DEFQ) within the Infrastructure system facilitates you to design web-based,
user-friendly Data Entry windows with a choice of layouts for easy data view and data manipulation. An
authorized user can enter new data and update the existing data in the shared database. Data Entry Forms
are primarily focused on creating data entry systems which access the database and load the generated
input data.

NOTE Starting from the 8.1.x.x.x version, refer to MOS Note 2907369.1 for
maintainability of the module.
1. The DEFQ module will be supported on an as-is, where-is basis
for the existing features.
2. Bug fixes, if any, will be reviewed and fixed based on the
criticality of the issue.
3. Nice-to-have features, lower severity bugs, and
enhancements will be reviewed but may not be prioritized
and fixed.

To access Data Entries Forms and Queries:


1. Login to OFSAA.

2. Click from the header to display the applications in a Tiles menu.


3. Select the Financial Services Enterprise Modeling application from the Tiles menu. The
Navigation list to the left is displayed.
4. Click Common Tasks to expand the list.
5. Click Data Entries Forms and Queries to expand the list further. The following links are displayed
on the Navigation list:
a. Excel Upload (Atomic)
b. Forms Designer
c. Forms Authorization
d. Data Entry

6.1 Excel Upload (Atomic)


The Atomic Schema Upload window consists of Excel utilities such as Excel-Entity Mappings and Excel
Upload. The Excel-Entity Mappings and Upload utilities have restricted access depending on the
following function roles mapped:
• Users with the XLADMIN function role can only define mapping and authorize, but cannot upload the
file.
• Users with the XLUSER function role can only retrieve mapping definitions (pre-defined by the XLADMIN
user) and can upload the file based on the retrieved mapping.
Click the following links to view the sections in detail.


• Excel-Entity Mappings
• Excel Upload

6.1.1 Navigating to Excel Upload (Atomic)


You can access Excel Upload window by expanding Data Entries Forms and Queries from the Navigation
list to the left and clicking Excel Upload (Atomic).

6.1.2 Excel-Entity Mappings


Excel-Entity Mapping helps you to map Excel data to the destination table in the database. Excel-Entity
Mapping supports Excel files created in Microsoft Excel 2007 and earlier versions, along with the option to
map and upload multiple sheets created within a single Excel file. You need to have the XLADMIN function
role mapped in order to define mapping.

6.1.3 Adding Excel-Entity Mappings


To define mapping in the Excel-Entity Mappings window:
1. From the LHS menu of DEFQ- Excel Upload window, click Excel-Entity Mappings.
The Excel-Entity Mappings window is displayed.

Figure 135: Excel-Entity Mappings window

2. Click button in the Mappings Summary toolbar.


The Add Excel-Entity Mappings window is displayed.
3. Enter the Mapping Name and a brief Description.
4. Click Browse.
The Choose File to Upload dialog is displayed.

5. Select the required Excel file to be used as the template and click button.
The columns in the selected Excel template are listed in the Select Excel Columns grid and the
database tables are listed in the Select Entities grid.


Figure 136: Excel-Entity Mappings window

6. Enter the format in which the dates are stored in the Excel sheet in the Source Date Format field.
7. Select the Apply to all Dates checkbox if you want to apply the source date format to all date fields
in the Excel sheet.
8. Select the First Row is the Header checkbox, if your Excel template has a header row.
9. Select the Template Validation Required checkbox to validate whether the Excel template you use
is the same as the Excel sheet you upload in the Excel Upload window. The validation is done when
you upload the Excel sheet. An error is displayed if there is any mismatch between the Excel
template you use to map and the actual Excel sheet you upload.
This field is displayed only if you have selected the First Row is the Header checkbox.
10. Select the Bulk Authorization checkbox to assign the “Excel_Name” across the selected column.
For example, the selected column “v_fic_description” will have the Excel Name assigned.

NOTE Ensure that the selected "Varchar2" column contains the
required length to hold the Excel Name. In order to select Bulk
Authorization, you need to have the Save with Authorization
checkbox selected.

11. Select Save with Authorization checkbox to authorize the data upon successful data load. The
three mandatory fields namely Maker ID, System Date, and Authorization Status are displayed in
the Select Excel Columns grid.
You need to map these fields to the corresponding columns in the Select Entities grid. The value for
Maker ID column is updated with the User ID of the user who is performing the Excel Upload. The


value for Maker Date is updated with the current System Date during which the upload is performed
and the value for Authorization Status is updated with flag 'U'. See Save with Authorization to create
a Form where the uploaded data can be authorized.
12. Select a column from the Select Excel Columns grid and select an attribute or column from the
required table from the Select Entities grid. Click Map.
13. Click Automap. The respective columns with the similar names in the Excel sheet and the database
are mapped. You need to manually map the other columns. The mapping details are displayed in
the Mapping Information grid which facilitates you to edit the details as required.
14. Click Save Mapping.
The Excel-Entity Mapping window displays the excel-database table mapping details.
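The Automap step described above matches Excel columns to entity columns with similar names. Conceptually, this is a normalized name comparison; the sketch below illustrates the idea and is not the product's actual implementation (function and column names are invented):

```python
# Hypothetical sketch of name-based auto-mapping between Excel columns
# and database entity columns, as described for the Automap step.
def normalize(name):
    """Compare names case-insensitively, ignoring spaces and underscores."""
    return name.lower().replace("_", "").replace(" ", "")

def automap(excel_columns, entity_columns):
    """Return {excel_column: entity_column} for similarly named columns.
    Columns without a match must still be mapped manually."""
    by_norm = {normalize(col): col for col in entity_columns}
    return {
        col: by_norm[normalize(col)]
        for col in excel_columns
        if normalize(col) in by_norm
    }

mapping = automap(
    ["Mapping Name", "v_fic_description", "Extra"],
    ["MAPPING_NAME", "V_FIC_DESCRIPTION"],
)
print(mapping)
```

In this sketch, "Extra" has no similarly named entity column, so it is left out of the result, mirroring the manual mapping the procedure requires for unmatched columns.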
In the Excel-Entity Mappings window, you can also do the following:

 Click button in the Mappings Summary tool bar to View the mapping details.

 Click button in the Mappings Summary tool bar to Edit the mapping details.

 Click button in the Mappings Summary tool bar to Delete the mapping details.

 Click button to download the Excel template used in the mapping.

6.1.4 Excel Upload


Excel Upload helps you to upload Excel data to the destination table in the database. You need to have the
"XLUSER" function role mapped to access the Excel Upload window and retrieve mapping definitions (pre-
defined by the XLADMIN user) to upload Excel data. Excel Upload supports Excel files created in Microsoft
Excel 2007 and earlier versions, along with the option to map and upload multiple sheets created within a
single Excel file. You need to ensure that the Excel data contains the dates in the format as defined in the
Add Excel-Entity Mapping definition.
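Conceptually, the upload has to reparse each date cell using the Source Date Format defined in the mapping, which is why a mismatch causes an error. A minimal sketch of that conversion (the format strings are examples for illustration, not product defaults):

```python
from datetime import datetime

# Illustrative only: convert date strings from an assumed source format
# (as entered in the Source Date Format field) into ISO dates for loading.
def convert_dates(values, source_format):
    converted = []
    for value in values:
        # strptime raises ValueError on a format mismatch, which is the
        # kind of error the upload validation surfaces.
        parsed = datetime.strptime(value, source_format)
        converted.append(parsed.strftime("%Y-%m-%d"))
    return converted

print(convert_dates(["31-12-2023", "01-04-2024"], "%d-%m-%Y"))
```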
To upload excel data in the Excel Upload window:
1. Click Browse in the Excel File to Upload grid.
The Choose File to Upload dialog is displayed.

2. Select the required Excel file and click button.


Select the required sheet in the Excel file from the Sheet drop-down list and the Preview grid
displays the data of the selected sheet of the Excel file.

Figure 137: Excel Upload window


3. Click in the Excel-Entity Mappings grid.


The Mapping Selector dialog is displayed with the pre-defined mapping details.
4. Select the checkbox adjacent to the required mapping definition and click OK.

NOTE You can download the Excel template used in the mapping by
clicking button.

5. Click Upload.
A confirmation dialog is displayed on successful upload and the excel data is uploaded to the
database table. You can click on View Log to view the log file for errors and upload status.

NOTE You must be mapped to the XLCNFADVNC Role to download
the logs when you click View Log.
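As the Save with Authorization steps describe, uploaded rows carry a Maker ID, the system date, and the flag 'U' (unauthorized) until a different user authorizes them. The sketch below illustrates that stamping with an in-memory SQLite table; the table and column names are assumptions for the example, not the product schema:

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE stg_upload ("
    "v_fic_description TEXT, maker_id TEXT, maker_date TEXT, auth_status TEXT)"
)

def upload_rows(rows, user_id):
    """Insert spreadsheet rows, stamping audit columns; status 'U' means
    the data still awaits authorization."""
    today = date.today().isoformat()
    conn.executemany(
        "INSERT INTO stg_upload VALUES (?, ?, ?, 'U')",
        [(value, user_id, today) for value in rows],
    )
    conn.commit()

upload_rows(["record one", "record two"], user_id="XLUSER01")
statuses = [r[0] for r in conn.execute("SELECT auth_status FROM stg_upload")]
print(statuses)
```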

6.2 Forms Designer

NOTE 1. This functionality does not work when CSRF is enabled. To
disable CSRF, see the section Update General Details.
2. This functionality displays only on the Microsoft Internet
Explorer™ browser.

Forms Designer within the Data Entry Forms and Queries section facilitates you to design web-based, user-
friendly Forms using the pre-defined layouts. You can access DEFQ - Forms Designer by expanding Data
Management Framework and Data Entry Forms and Queries within the tree structure of the LHS menu and
selecting Forms Designer.


Figure 138: Forms Designer window

The DEFQ - Forms Designer window displays a list of pre-defined options to create, modify, and delete
Forms. You can also assign rights and define messages. By default, the option to Create a New Form is
selected and the left pane indicates the total steps involved in the process. The available options are as
indicated below. Click the links to view the sections in detail.
• Creating a New Form
• Altering Existing Forms
• Copying Forms
• Deleting Forms
• Assigning Rights
• Message Type Maintenance

6.2.1 Creating a New Form


To design a new Form in the DEFQ - Forms Designer window:
1. Ensure that Create a New Form option is selected and do the following:
 Specify the application name by either entering the New Application Name or selecting
Available Applications from the drop-down list.
 Enter the New Form Name.
2. Click Next. The DEFQ - Layout Window is displayed with the range of pre-defined layouts for you to
choose.


Figure 139: DEFQ – Layout window (Step 3 of Designing Form)

The following table describes the layouts in the DEFQ – Layout window.

Table 61: Layouts in the DEFQ – Layout window and their Description

Layout Description

Grid Layout: It is the default layout, which displays the records in the form of a
table/grid with multiple rows of data.

Single Record Layout: It displays a single record at a time.

Edit/View Layout: It is a combination of the Single Record and Grid layouts. By selecting a
record in the grid, the record is displayed in a single record format, which
is editable. By default, the first record is displayed in the editable grid.
Note: The column names are editable only while altering the created
Form.

Multi Column Layout: It displays a single record with its columns in a grid format. You can view
a multi column layout Form without having to scroll, or with minimum
scrolling, to view all the columns.

Wrapping Row Layout: It displays rows of a single record in a wrapped manner in a grid format.
You can view a wrapping row layout Form easily without having to scroll
horizontally to view all the data.

Tree View Layout: It displays the hierarchical dimensional table with the selected dimension
details. You can select the following options:
- Dimensional Table Tree
- Parent Child Tree
Note: The process to create a Form using the Tree View Layout differs
from the procedure explained below. For more information, refer to Create
Tree View Form in the References section.


3. Select the required layout and click Next.


The List of Available Tables is displayed.
4. Select the required Table from the list on which the Form is to be created.

Figure 140: DEFQ – List of Available Tables Selection window (Step 4 of Designing Form)

NOTE You should use tables with names not longer than 25
characters. This is a limitation.


For multiple selections, you can either press the Ctrl key for nonadjacent selections or the SHIFT key for
adjacent selections. Click Next, and the Fields Selection window is displayed.

NOTE If multiple tables are selected, you need to specify Join
Conditions. Select the Table from the drop-down list and select
the Available Fields. Specify the Join Condition. Click Next; the
join conditions are validated and the Fields Selection window is
displayed.
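When multiple tables are selected, the join condition ties them together before fields are chosen. The sketch below shows the kind of query such a selection implies; the table and column names are invented purely for illustration:

```python
# Hypothetical: build the SELECT statement implied by choosing two tables
# and a join condition, as the note above describes. Names are invented.
def build_join_query(tables, fields, join_condition):
    return "SELECT {} FROM {} WHERE {}".format(
        ", ".join(fields), ", ".join(tables), join_condition
    )

query = build_join_query(
    tables=["DIM_PRODUCT", "FCT_SALES"],
    fields=["DIM_PRODUCT.N_PRODUCT_ID", "FCT_SALES.N_AMOUNT"],
    join_condition="DIM_PRODUCT.N_PRODUCT_ID = FCT_SALES.N_PRODUCT_ID",
)
print(query)
```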

5. Select the fields to be joined from the Available Fields list and click . You can press Ctrl key for
multiple selections and also click to select all the listed fields. All mandatory fields are auto
selected and are indicated on the window with an asterisk (*).

Figure 141: DEFQ – Fields Selection window(Step 5 of Designing Form)


NOTE You can click the buttons to arrange the fields in the
required order as intended to display in the Data Entry Form.
The field order need not be similar to the arrangement in the
underlying table.
Ensure the fields selected are not of the CLOB data type, since it
is not supported in DEFQ.

6. Click Next. The Sort Fields Selection window is displayed.

Figure 142: DEFQ – Sort Fields Selection window (Step 6 of Designing Form)

You can sort the fields in the required order as intended to display in the Data Entry Form. The
mandatory fields which need user input are indicated with the '*' symbol and are auto selected in the
Selected Fields pane.


 Select the field from the Available Fields list and click . You can press Ctrl key for multiple
selections and also click to select all the listed fields.
 (Optional) To arrange multiple fields, select Sort by Descending checkbox.
 (Optional) Select the Excel Map checkbox to enable Bulk Authorization.

NOTE In case you have selected Excel Map checkbox, you need to
select “Excel Name” from the Store Field As list in the DEFQ
Field Properties window. Only on selection, the
“SelectExcelSheetName” list is displayed for authorizer in the
DEFQ - Data Entry window.

7. Click Next. The DEFQ Field Properties window is displayed with the Form details such as Field
Name, Display Name, In View, In Edit/Add, Allow Add, Store Field as, Rules, and Format Type.

Figure 143: DEFQ – Field Properties window (Step 7)

Specify the parameters for each field as tabulated.


The following table describes the fields in the DEFQ – Field Properties window.

Table 62: Fields in the DEFQ – Field Properties window and their Description

Field Description

Display Name: Edit the default Display Name if required.

In View: Select either Display or Do not Display to display the field in the Form.
If the field is a foreign key field or if more than one table is selected, then
the following options are available in the drop-down list:
- Same Field
- Alternate Display Field
- Do not Display

In Edit/Add: Specify the edit parameters by selecting from the drop-down list. The
available options depend on the type of field selected.
- For normal fields, you can select Text Field, Text Area, Select List,
Protected Field, Read Only, and Do Not Show.
- For foreign key fields, you can select Read Only, Select List, and Do Not
Show.
- For primary key fields, you can select Read Only and Do Not Show.
- For calendar fields, you can select Calendar and Do Not Show.
Note: If you choose the Select List option, you need to define the values.
For more information, refer to Define List of Values.

Allow Add: Select the checkbox to permit users to add a new record.
Note: An alert message is displayed if you try to save a Form with the
add option disabled for mandatory fields.

Store field as: Select the required option from the drop-down list. You can select the
store format as Normal, Sequence Generator, Maker Date, Checker Date,
Created Date, Modified Date, Auth Flag, Maker id, Checker id,
Checker Remarks, Maker Remarks, and Excel Name (if Excel Map is
selected in the Sort Fields Selection window).

Rules: Click Rules and specify Rules and Expressions for the selected field in the
Specifying Rules and Expressions for Data Validations window.
For more information, refer to the Applying Rules section in References.

Format Type: Select the required Format Type from the drop-down list depending on
the field type selected.
The CLOB data type is not supported.

Batch Commit: Select the checkbox to group all the set of table Forms into a batch.
All the Form tables are executed along with the batch execution and if a
Form in the table fails to execute, the entire set of Forms is returned.

Message Details: Click Message Details to define the message type for the Creator and
Authorizer in the Messaging Details for a Form window. For more
information, refer to Define Message Details.

Form Filter: Click Form Filter to define an expression for the Form-level filter
condition in the Filter for Form window.

Data Versioning: Click Data Versioning to perform data versioning on an authorized Form.
For more information, refer to Form Data Versioning.
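The Batch Commit behavior described above, where a failure in one Form causes the whole set to be undone, corresponds to running the set inside a single database transaction. A generic sketch of that semantics (not the product's code), using SQLite:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE form_data (id INTEGER PRIMARY KEY, val TEXT)")

def batch_commit(rows):
    """Insert all rows in one transaction: if any insert fails,
    roll back so none of the batch is kept."""
    try:
        with conn:  # sqlite3 commits on success, rolls back on exception
            for row_id, val in rows:
                conn.execute("INSERT INTO form_data VALUES (?, ?)", (row_id, val))
        return True
    except sqlite3.IntegrityError:
        return False

ok = batch_commit([(1, "a"), (2, "b")])
bad = batch_commit([(3, "c"), (1, "duplicate id")])  # second insert fails
count = conn.execute("SELECT COUNT(*) FROM form_data").fetchone()[0]
print(ok, bad, count)
```

Even though row 3 inserted cleanly, the duplicate key in the same batch rolls both back, so only the first batch's two rows remain.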

8. Click either Save to only save the Form details or click Save for Authorization to save the changes
with authorization. For more details, refer Save for Authorization section.


Figure 144: DEFQ – Save window

NOTE Sometimes, on clicking Save, the form does not get saved. This
can happen when the Java heap size setting for the OFSAAI service
is set too high and the web server memory setting is too low.
Contact the System Administrator to modify it to the appropriate
setting by viewing the log file created in the path:
$FIC_APP_HOME/common/FICServer/logs/.

While saving, the User for Mapping - DEFQ window is displayed which facilitates you to assign user
rights to the Form. For more information, refer Assign Rights.

6.2.2 Altering Existing Forms


To alter the field details of an existing Form in the DEFQ - Forms Designer window:
1. Select Alter Existing Forms from the available options and do the following:
 Select the Available Applications from the drop-down list.
 Select the Available Forms from the drop-down list. The listed Forms are dependent on the
DSN (Data Source Name) that you have specified.
2. Click Next. The Fields Selection Window is displayed.
Add or remove the selected fields as required to be displayed in the Form. You can choose a field
from the Available Fields list and click to add, or choose the selected field from the Fields to


Display list and click to de-select. You can press Ctrl key for multiple selections and also click
or buttons to select/de-select all the listed fields.
3. Click Next. The Sort Fields Selection Window is displayed.
 Sort the fields in required order as intended to display in the Form. You can choose a field from
the list and click or buttons to select/deselect. You can also click or buttons to
select/de-select all the listed fields.

 Select a field and click or buttons to arrange fields in the required order.
 (Optional) To arrange multiple fields, select Sort by Descending checkbox.
 (Optional) Select the Excel Map checkbox to enable Bulk Authorization.

NOTE In case you have selected Excel Map checkbox, you need to
select “Excel Name” from the Store Field As list in the DEFQ
Field Properties window. Only on selection, the
“SelectExcelSheetName” list is displayed for authorizer in the
DEFQ - Data Entry window.

4. Click Next. The DEFQ Field Properties window is displayed.


Modify the parameters for each field as required. Refer DEFQ Field Properties details.
5. Click either Save to save the Form details or click Save for Authorization to save the changes with
authorization.
While saving, the User for Mapping - DEFQ window is displayed which facilitates you to assign user
rights to the Form. For more information, refer Assign Rights.

6.2.3 Copying Forms


You can duplicate and recreate a Form with the required variants from an existing Form. You can also
change user rights or display options and other subtle variations for the selected layout.
To Copy a Form in the DEFQ - Forms Designer window:
1. Select Copy Forms from the available options and do the following:
 Select the application that contains the Form you want to copy from the From Application
drop-down list.
 Select the application to which you want to copy the Form from the To Application drop-down
list.
 Select the required Form from the Save Form drop-down list.
 Enter a name for the Form in the As Form field.
2. Click Next. The specified Form is duplicated as a new Form and a confirmation dialog is displayed
with the status.


6.2.4 Deleting Forms


You can remove the forms which are not required in the system by deleting from the DEFQ - Forms
Designer window.
1. Select Delete Forms from the available options and do the following:
 Select the application that contains the Form you want to delete from the Available Application
drop-down list.
 Select the Form you want to delete from the Available Forms drop-down list.
2. Click Next. An information dialog is displayed for confirmation. Click OK.

6.2.5 Assigning Rights


You can assign user permissions to view, add, edit, and delete the Form details in the User for Mapping -
DEFQ window.
1. Select Assign Rights from the available options and do the following:
 Select the required application from the Available Applications drop-down list.
 Select the required form for which you want to assign rights to a user from the Available Forms
drop-down list.
2. Click Next. The DEFQ- Assign Rights window is displayed.

3. Select the required user from Available User List. You can also click or buttons to reload
previous/next set of users in the list.
4. Select the checkbox corresponding to the user permissions such as View, Add, Edit, Delete, or All
Above. You must give View permission in order to allow users to Edit or Delete a Form.
5. Select Authorize or Auto-Authorize checkbox as required.
The Authorize and Auto-Authorize options are applicable to all the forms that have been saved
with the Authorize option. The Auto-Authorize feature for records is applicable in scenarios where
the Creator and Authorizer are the same. If a user has Add and Auto-Authorize permissions, the
data entered by the user is auto-authorized and will be in Authorized status. In case of normal
authorization, the record added by the creator has to be authorized by a different user who has
Authorize permissions.

NOTE The Auto-Authorize feature in Forms Designer is applicable


only for data entered through Data Entry window and not
through Excel Upload window.
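The authorization flow above can be summarized in a small decision sketch. The 'U' (unauthorized) flag follows the convention mentioned for uploaded data; the 'A' (authorized) flag is an assumption made for this illustration:

```python
# Illustrative decision logic for the Auto-Authorize behavior described
# above. 'A' (authorized) is an assumed flag value for this sketch;
# 'U' (unauthorized) is the flag mentioned earlier for uploaded data.
def record_status(creator_permissions):
    if "Add" not in creator_permissions:
        raise PermissionError("user cannot add records")
    if "Auto-Authorize" in creator_permissions:
        return "A"  # creator and authorizer are effectively the same user
    return "U"      # awaits a different user with Authorize permission

print(record_status({"Add", "Auto-Authorize"}))
print(record_status({"Add"}))
```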

6. Select the Show Data Created by Current Users Only checkbox if you want the current user to
view data created by him only.
7. Click User Value Map to map users to the form based on data filter.
8. Click Save Access Rights. A confirmation dialog is displayed after saving and the user is added to
the Assigned User List.


User Value Map


This feature allows you to create a data filter based on any field/column of the table you selected for
designing the Form. When a user tries to access the form in the Data Entry window, data is filtered
and displayed based on the selected field for the users associated with that column.

NOTE The data type of the field/column you select to define the filter
should be NUMBER or VARCHAR. The users mapped to the DEFQ
form whose assign rights are authorized through Forms
Authorization can save the filter.

There are two types of filters, Global Data Filter and Custom Data Filter.
Global Data Filter: In this filter, the value is fetched from the DEFQ_GLOBAL_VALUES table of the
Atomic schema, which is automatically created during Information Domain creation. The table needs to be
populated manually through Excel upload. The table contains all the entities and the users mapped to
them.
Custom Data Filter: This filter enables you to provide a custom filter for the form you design. In this
filter, you should enter values for all the users mapped to the form manually.
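At query time, the effect of either filter type is the same: a user sees only the rows whose filter column matches a value mapped to that user. A hypothetical sketch of this filtering (table, column, and user names are invented for illustration):

```python
# Hypothetical illustration of a DEFQ data filter: rows are shown to a
# user only when the filter column's value is mapped to that user.
rows = [
    {"branch": "NORTH", "amount": 100},
    {"branch": "SOUTH", "amount": 250},
    {"branch": "NORTH", "amount": 75},
]
# user -> allowed values for the filter column (as a Global or Custom
# Data Filter would supply them)
user_filter = {"USER1": {"NORTH"}, "USER2": {"SOUTH"}}

def visible_rows(user_id, filter_column="branch"):
    allowed = user_filter.get(user_id, set())
    return [r for r in rows if r[filter_column] in allowed]

print([r["amount"] for r in visible_rows("USER1")])
```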
To set a Data Filter:
1. Click User Value Map in the DEFQ- Assign Rights window.
The User Value Map window is displayed.
2. Select the Global Data Filter option to filter the data globally.
 Select the field based on which the data should be filtered and displayed for the user, from the
Fields to Display section.

NOTE Normally the user can access all the data from the table
whenever the DEFQ form is created. Based on this filter, the
user will be displayed only the data which is mapped to him.

3. Select the Custom Data Filter option to provide a custom filter for a specific DEFQ Form.
 Select the User ID from the drop-down list and enter Values for that user. This is mandatory.
4. Click Save.

6.2.6 Message Type Maintenance


You can manage the Message Type details which alert the Creator of the Form or an Authorizer in the
DEFQ Message Type Maintenance window. Message Type details can be defined while creating a Form.
For more information, refer to Define Messaging Details.
In the DEFQ - Forms Designer window, do the following:
1. Select Message Type Maintenance from the available options and click Next.
The DEFQ - Message Type Maintenance window is displayed.


2. Select the message category from the Message Type drop-down list.
3. Edit the message details by doing the following:
 The defined Message Subject and Message Content are auto populated. Edit the details as
required.
 Add or remove the defined recipients. Double-click on the required member to toggle between
Available and Mapped Recipients list.

NOTE Selecting Authorizer alerts all the selected authorizers for
authorization.

4. Click Save. A confirmation is displayed on updating the Message Type details.

6.3 Forms Authorization

NOTE 1. This functionality does not work when CSRF is enabled. To disable
CSRF, see the section Update General Details.
2. This functionality displays only on the Microsoft Internet Explorer™
browser.

Forms Authorization within the Data Entry Forms and Queries section of the Infrastructure system
facilitates you to view and authorize/approve any changes that are made to the privileges assigned to a
user in a particular Form.
You need to have the FRMAUTH function role mapped to access the Forms Authorization window.

NOTE You cannot authorize or reject a right request created by you,
even if you have the FRMAUTH function role mapped.

You can access the Forms Authorization window from the left hand side (LHS) menu of the Infrastructure
Home Page. Click "+" to expand Data Model Management and select Data Entry Forms and Queries.


Figure 145: Forms Authorization window

The Forms Authorization window displays the list of privileges assigned to a user in different Forms. These
privileges include create, view, modify, delete, authorize, and auto-authorize records. The Forms
Authorization window allows you to select a user from the drop-down list adjacent to the User ID field. This
field displays the User IDs associated with the selected Information Domain.
On selecting a user from the User ID field, the columns in the Forms Authorization window list the grants
requested for that user on different Forms as described below.
The following table describes the columns in the Forms Authorization window.

Table 63: Column Names in the Forms Authorization window and their Description

Column Name Description

Application: Lists the specific application to which the Form has been assigned.

Form Name: Displays the Form Name.

Access Rights Before: Displays the available Right Requests for the selected user in the Form.
Note: For a new Form, the column remains blank.

Access Rights After: Displays the Right Requests raised for authorization:
- DV - DEFQ VIEW
- DA - DEFQ ADD
- DE - DEFQ EDIT
- DD - DEFQ DELETE
- A - AUTHORIZE
- DU - AUTO AUTHORIZE
- S - SHOW DATA CREATED BY CURRENT USER ONLY

Operations: Displays the operation carried out in the Form.
For example, "ADD" indicates a new Form is created and specific roles are
assigned.

Created By: Displays the USER ID from which the Right Request has been created.

Created Date: Displays the Date on which the Right Request has been created.

Last Saved By: Displays the USER ID from which the previous Right Request change has
been saved.

Last Saved Date: Displays the Date on which the previous Right Request change has been
saved.

Checked By: Displays the USER ID from which the Right Request has been authorized.

Checked Date: Displays the Date on which the Right Request has been authorized.
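The short codes in the Access Rights columns can be expanded mechanically. A small sketch of decoding such a rights string follows; the comma-separated storage format is an assumption made for illustration:

```python
# Decode the Access Rights short codes listed in the table above.
# Assumes, for illustration only, that rights arrive as a
# comma-separated string such as "DV,DA,DE".
RIGHT_CODES = {
    "DV": "DEFQ VIEW",
    "DA": "DEFQ ADD",
    "DE": "DEFQ EDIT",
    "DD": "DEFQ DELETE",
    "A": "AUTHORIZE",
    "DU": "AUTO AUTHORIZE",
    "S": "SHOW DATA CREATED BY CURRENT USER ONLY",
}

def decode_rights(rights):
    return [RIGHT_CODES.get(code, "UNKNOWN") for code in rights.split(",")]

print(decode_rights("DV,DA,A"))
```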

To authorize or Reject a form in the Forms Authorization window:


1. Select the User ID from the drop-down box. The Right Requests submitted on various forms
are displayed.
2. Select the checkboxes adjacent to the requests to authorize/reject.
You can also select all the requests at once for a user by clicking the Select All checkbox.
3. Click Authorize/Reject to authorize or reject the selected Right Requests.
Once Form action privileges are authorized for a user, those actions can be performed on the Form.
For an existing Form with certain rights, the rights remain the same until the changes are
authorized/rejected by an authorizer.

NOTE Special characters are not allowed in DEFQ definitions, except
the underscore (_).

6.4 Data Entry

NOTE 1. This functionality does not work when CSRF is enabled. To
disable CSRF, see the section Update General Details.
2. This functionality displays only on the Microsoft Internet
Explorer™ browser.

Data Entry within the Data Entry Forms and Queries section of Infrastructure system facilitates you to
view, add, edit, copy, and delete data using the various layout formats and Authorize/Re-authorize data
records based on the permissions defined during the Form creation.
You can use the Search option to query the records for specific data and also export the data in Microsoft
Excel format for reference. You can launch multiple instances of Data Entry window using the URL to
search and update records simultaneously.
You can access DEFQ - Data Entry by expanding the Data Entry Forms and Queries section of the Data
Model Management module within the tree structure of the LHS menu.

NOTE An alert message is displayed if you are not mapped to any
Forms in the system.


Figure 146: DEFQ – Data Entry window

The DEFQ - Data Entry window displays the list of Data Entry Forms and Query Forms mapped to the
logged-in user in the LHS menu. You can select the required Form to view the details. In the DEFQ - Data
Entry window, you can do the following:
• Viewing Form Details
• Editing Form Details
• Adding Form Data
• Authorizing Records
• Exporting Form Data
• Copying Form Data
• Deleting Form Details

6.4.1 Viewing Form Details


The DEFQ - Data Entry window displays the selected Form data in View mode by default. The Forms
are displayed based on the application names in the LHS menu. Various layouts are available to
customize the view; by default, the Form details are displayed in the layout in which the Form was designed.
The following layout types are available in the DEFQ - Data Entry window. You can click any of these
layouts to view the Form details. The Previous Page, Back, Next, and Next Page buttons
help you navigate through the records. However, the customized header sorting does not apply when
you navigate to the Previous or Next pages.

NOTE: The Roll Back option can be used only for authorized records; that is, after the records are
edited and saved, you can roll back/undo the changes in View mode.

The following table describes the Layouts in the DEFQ – Data Entry window.

Table 64: Layouts in the DEFQ – Data Entry window and their Description

• Single Record: View the details of a single record at any given point. You can use the navigation
buttons to view the next record in the table.
• Editable View: View and edit a single record. A list of five rows/records is displayed by default;
this can be changed by entering the required number in Display Rows. Select the required record
from the list to view/edit it, and click Save to update the changes.
• Grid (Default): View all the records in a list. A list of five rows/records is displayed by default;
this can be changed by entering the required number in Display Rows. You can click a column
header to alphabetically sort the list of records in the table.
• Multi column: View all the columns of a selected record. This layout enables you to view a record
without having to scroll, or with minimum scrolling, to view all the columns.
• Wrapped rows: View all the rows of a selected record. This layout enables you to view a wrapping
row easily without having to scroll horizontally to view the columns.

6.4.2 Searching Records


In the DEFQ - Data Entry window, you can Search for a record in the View, Edit, and Authorize modes. You
can perform a quick Search to find a specific record or run an Advanced Search to further query the
record for the required details.
To search for a record in the DEFQ - Data Entry window:

1. Click the Search icon. The search fields are displayed.
2. Select Field Name from the drop-down list.
3. Enter the value/data in the Search field.
4. Click Go.
The search results are displayed in the list.
To perform an Advanced search in the DEFQ - Data Entry window:

1. Click the Advanced Search icon within the Search fields.
The Advanced Search window is displayed.

Figure 147: Advanced Search window

2. Select the required Parentheses/Join, Field, Operator from the drop-down list and enter the Value
as required to query the Form data.
3. Click GO.
The results are displayed with the field names containing the searched data.
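The Parentheses, Join, Field, Operator, and Value inputs above combine into a single filter expression. The sketch below illustrates that assembly; the row structure and function name are hypothetical examples, not the product's internal API.

```python
# Hypothetical sketch: assembling an advanced-search expression from
# (open_paren, field, operator, value, close_paren, join) rows, mirroring
# the inputs of the Advanced Search window. Illustrative only.

def build_filter(rows):
    """Each row: (open_paren, field, operator, value, close_paren, join)."""
    parts = []
    for open_p, field, op, value, close_p, join in rows:
        parts.append(f"{open_p}{field} {op} '{value}'{close_p}")
        if join:                      # AND / OR connecting the next condition
            parts.append(join)
    return " ".join(parts)

print(build_filter([
    ("(", "COUNTRY", "=", "INDIA", "", "OR"),
    ("", "COUNTRY", "=", "USA", ")", "AND"),
    ("", "STATUS", "=", "A", "", ""),
]))
# (COUNTRY = 'INDIA' OR COUNTRY = 'USA') AND STATUS = 'A'
```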

6.4.3 Editing Form Details


You can edit the permitted Form field values in the DEFQ - Data Entry window. However, you cannot
modify the primary key fields which are displayed in non-editable format.
To edit Form Details in the DEFQ - Data Entry window:
1. Open the required Form in View mode.

2. Select the record to be edited and click the Edit icon. The editable fields are enabled.
3. Enter/update the required details.
4. Click Save to update the changes.
5. If required, you can click Reset to undo the changes and return to the original field values.
If you have edited an Authorized record, the same is again marked for authorization. Once the
record is updated, a modified status flag is set, and only these record changes can be rolled back.
The Roll Back option is supported in view mode only for authorized records, i.e. records which are
updated and saved.

6.4.4 Adding Form Data


You can add a row to the required table and enter the field details. To Add Form Data in the DEFQ - Data
Entry window:

1. Open the required Form in View mode and click the Add icon.


2. By default, five rows are displayed. You can modify this by specifying the required number of rows in
the Display Rows field and clicking Reset.
3. Enter the required numeric data in the new fields. If you want to view the numeric data separated by
commas, enter the details accordingly.
4. Click Save to update the data in the selected table.


6.4.5 Authorizing Records


You need to have the DEFQMAN and SYSAUTH function roles mapped to access and authorize Forms in the
DEFQ framework. You can Authorize a single record or all the records of a selected Form in the
DEFQ - Data Entry window. You can Authorize records only in a table that has a primary key field. A primary
key field in the record is indicated by “PK”. You need to have the authorization rights defined by the user
who created the record. You can also Reject or Hold inappropriate records in the table.

Figure 148: DEFQ - Data Entry Authorization window

The status of each record in the table is indicated with an “AuthFlag” as indicated below:
• Unauthorized records are displayed with the status flag “U”.
• Authorized records are displayed with the status flag “A”.
• Rejected records are displayed with the status flag “R”.
• Modified records are displayed with the status flag “M”.
• Deleted records are displayed with the status flag “D”.
• If an Unauthorized record is on Hold, the status flag is displayed as “H”.
• If a Modified record is on Hold, the status flag is displayed as “X”.
• If a Deleted record is on Hold, the status flag is displayed as “Z”.
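The AuthFlag values above can be summarized as a lookup table. The following is an illustrative sketch, not product code; the helper's notion of "awaiting a decision" is an assumption drawn from the descriptions in this section.

```python
# Sketch: the AuthFlag status codes listed above as a lookup table.
AUTH_FLAGS = {
    "U": "Unauthorized",
    "A": "Authorized",
    "R": "Rejected",
    "M": "Modified (needs re-authorization)",
    "D": "Deleted (needs re-authorization)",
    "H": "Unauthorized, on Hold",
    "X": "Modified, on Hold",
    "Z": "Deleted, on Hold",
}

def needs_authorization(flag):
    # Records that still await an authorizer's decision (assumption:
    # everything except already-Authorized and already-Rejected records).
    return flag in {"U", "M", "D", "H", "X", "Z"}

assert needs_authorization("M")
assert not needs_authorization("A")
```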
To Authorize Data in the DEFQ - Data Entry window:

1. Open the required Form in View mode and click the Authorize icon.


The list of available records for Authorization is displayed. If there are no records for Authorization
in the selected Information Domain, an alert message is displayed.
2. Select the “Auth” checkbox adjacent to the required record with the status flag “Unauthorized /
Put On Hold” and click Save.
A confirmation dialog is displayed. Click OK.


You can also do a Bulk Authorization if Excel Map is selected in the Sort Fields Selection window.
Select the mapped Excel Name from the “SelectExcelSheetName” drop-down list.
The DEFQ - Data Entry window displays only those records which are uploaded through the selected
Excel sheet. Click Authorize Excel.
A confirmation dialog is displayed. Click OK.
You can Reject / Hold a record by doing the following:
• To Reject a record, select the checkbox in the “Rej” column adjacent to the required record and
click Save.
A confirmation dialog is displayed. Click OK.
You can also Reject records in Bulk Mode if Excel Map is selected in the Sort Fields Selection
window. Select the mapped Excel Name from the “SelectExcelSheetName” drop-down list.
The DEFQ - Data Entry window displays only those records which are uploaded through the selected
Excel sheet. Click Reject Excel. A confirmation dialog is displayed. Click OK.
• To Hold a record and to authorize or reject at a later point, select the checkbox in the “Hold”
column adjacent to the required record and click Save.
In the DEFQ - Data Entry window, you can also do the following:
• Click Authorize All and then click Save to authorize all the records displayed on the current page.
• Click Reject All and then click Save to reject all the records displayed on the current page.
• Click Hold All and then click Save to hold all the records displayed on the current page.
If you have enabled the option to send alerts to the Creator of the Form in Message Type Maintenance
window, a message is sent indicating that the records are authorized/rejected/put-on-hold.

6.4.5.1 Re-authorizing Records


You can re-authorize an authorized record which has been updated by other users. When an authorized
record is updated, the status flag (AuthFlag) is set to “M” indicating that the record has been modified and
needs re-authorization.

Figure 149: DEFQ - Data Entry Re-Authorize window

To re-authorize modified records in the DEFQ - Data Entry window:

1. Open the required Form in View mode and click the Authorize icon.


The list of available records with the Authorization status is displayed. If there are no records for
Authorization in the selected Information Domain, an alert message is displayed.
2. Click Reauthorize Records. The DEFQ Authorization Window is displayed.
3. Select the “Auth” checkbox adjacent to the required record.
4. Click Save. On re-authorization, a confirmation message is displayed.
You can also select the checkbox adjacent to “Rej” to reject the record, or “Hold” to re-authorize or
reject at a later point. A message is sent to the Form creator indicating that records are
authorized/rejected/put-on-hold.

6.4.5.2 Re-authorizing Deleted Records


You can re-authorize the delete action when an authorized record has been deleted by other users. When
an authorized record is deleted, the status flag (AuthFlag) is set to “D” indicating that the record has been
deleted and needs re-authorization.

Figure 150: DEFQ - Data Entry Re-Authorize Deleted Records window

To re-authorize deleted records in the DEFQ - Data Entry window:

1. Open the required Form in View mode and click the Authorize icon.


The list of available records with the Authorization status is displayed. If there are no records for
Authorization in the selected Information Domain, an alert message is displayed.
2. Click Reauthorize Deleted Records. The DEFQ Authorization Window is displayed.
3. Select the “Auth” checkbox adjacent to the required record.
4. Click Save. On re-authorization, a confirmation message is displayed.
You can also select the checkbox adjacent to “Rej” to reject the record, or “Hold” to re-authorize or
reject at a later point. A message is sent to the Form creator indicating that records are
authorized/rejected/put-on-hold.

6.4.6 Exporting Form Data


You can export the required record(s) to a selected location in CSV format. To Export Form Data in the
DEFQ - Data Entry window:


1. In the View mode, select the checkbox adjacent to the record(s) that you want to export.

2. Click the Export icon. The File Download dialog is displayed.


3. Click Save. The Save As dialog is displayed.
4. Select the location and click Save. The selected record is exported.
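The CSV export described above behaves, in effect, like the following sketch. The field names, record values, and file path are made-up examples, not the product's actual columns.

```python
import csv

# Illustrative sketch only: writing selected form records to a CSV file,
# the format used by the Export option described above.
def export_records(records, path):
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0]))
        writer.writeheader()   # column names become the CSV header row
        writer.writerows(records)

# Hypothetical selected records.
export_records(
    [{"ACCOUNT_ID": "A1", "BRANCH": "001"},
     {"ACCOUNT_ID": "A2", "BRANCH": "002"}],
    "export.csv",
)
```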

6.4.7 Copying Form Data


You can copy the existing fields and create new fields in a record. When you copy a field, the primary key
values are incremented from the pre-defined value to the next acceptable value. However, the other fields
can be modified as required.
To copy fields in the DEFQ - Data Entry window:

1. Open the required Form in View mode and click the Copy icon.


The list of available records is displayed. All the primary field data (indicated by *) is incremented by
default.
2. Click Save. The field values are added to the record.
You can click Edit to modify the values or click Next to copy the next set of fields.
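The copy behaviour described above, where the primary-key value is stepped to the next acceptable value while the other fields carry over, can be sketched as follows. This is a hedged illustration; field names and the "next acceptable value" rule are assumptions.

```python
# Sketch of the copy behaviour: the primary key is advanced past existing
# values, other fields are carried over unchanged. Illustrative only.
def copy_record(record, pk_field, existing_keys):
    new = dict(record)
    key = record[pk_field]
    while key in existing_keys:          # advance until the key is free
        key += 1
    new[pk_field] = key
    return new

src = {"ID": 10, "NAME": "Savings"}
copied = copy_record(src, "ID", existing_keys={10, 11})
# copied["ID"] == 12, copied["NAME"] == "Savings"
```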

6.4.8 Deleting Form Details


You can remove Form details that are no longer required by deleting them from the DEFQ - Data Entry
window.
1. In the View mode, select the record to be deleted.

2. Click the Delete icon. An information dialog is displayed.


3. Click OK to confirm and delete the record.

6.4.9 References
This section consists of information related to intermediate actions that need to be performed while
completing a task. The procedures are common to all the sections and are referenced wherever
required. You can refer to the following sections based on your need.

6.4.9.1 Creating Tree View Form


The process to create a Form using the Tree View Layout differs from the procedure as explained for
other layouts. You can create a Form using the Tree View Layout, by selecting either Dimensional Table
Tree or Parent Child Tree.

6.4.9.2 Dimensional Table Tree


If you want to create a Form using the Dimension Table Tree, select the Tree View > Dimension Table Tree
option in the DEFQ - Layout window. On clicking Next, you need to provide the required details in the
following windows:


1. Dimension Table Selection: Enter the Root Name and select the Table. Click Next.
2. Fields Selection: Select required Fields to Display from Available fields and click Next.
3. Dimension Node Selection: Select Field Nodes from Available fields and click Next.
4. Select Dimensional Tree Nodes for the selected fields and click Next.
5. DEFQ Field Properties window: Specify the required details. For more information, refer to DEFQ
Field Properties.

6.4.9.3 Parent Child Tree


If you want to create a Form using the Parent Child Tree, select Tree view > Parent Child Tree option in
the DEFQ - Layout window. On clicking Next, you need to provide the required details in the following
windows:
1. Hierarchy Table Selection: Enter the Root Name and select the Table. Click Next.
2. Parent-Child Node Selection: Select Parent Node, Child Node, and Node Description from the
drop-down list.
3. Fields Selection: Select required Fields to Display from Available fields and click Next.
4. DEFQ Field Properties window: Specify the required details. For more information, refer to DEFQ
Field Properties.

6.4.9.4 Applying Rules


You can apply rules to validate Form data in specific fields such as Text Field, Text Area, or Protected
Field. To specify rules for a field in the DEFQ - Forms Designer DEFQ Field Properties window:
1. Click Rule adjacent to the required field. The Specifying Rules and Expressions for Data Validations
window is displayed.
2. Select the required Fields, Operators, and Functions from the list.
3. Enter the Rule Expression in the Expression Viewer field.
4. Depending on the data type of the selected field, the following column constraints are displayed.
Select the required checkbox.
 No Spaces
 Characters Only
 Alpha Numeric
 Not Null
 Non Negative
5. Select the Alignment type from the drop-down list.
6. Click OK to save the details.
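The column constraints listed in step 4 amount to simple value checks. The sketch below assumes straightforward semantics for each constraint; the product's exact validation logic is not documented here, so treat this as an illustration only.

```python
# Hedged sketch of the DEFQ column constraints: No Spaces, Characters Only,
# Alpha Numeric, Not Null, Non Negative. Semantics are assumptions.
def check(value, *, no_spaces=False, characters_only=False,
          alpha_numeric=False, not_null=False, non_negative=False):
    if not_null and (value is None or value == ""):
        return False
    s = str(value)
    if no_spaces and " " in s:
        return False
    if characters_only and not s.isalpha():
        return False
    if alpha_numeric and not s.isalnum():
        return False
    if non_negative and float(value) < 0:
        return False
    return True

assert check("AB12", alpha_numeric=True, no_spaces=True)
assert not check("-5", non_negative=True)
```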


6.4.9.5 Defining List of Values


While creating a Form, if you choose the Select List field parameter option in the In Edit/Add column in
the DEFQ Field Properties window, you need to define the list of values in the Select List window. However,
you do not need to define the values for foreign key fields and primary key fields.
In the Select List Window, select the required Field Type from the following options:
• Comma Separated Values: Supports only the user specified values while creating a Form.
• Dynamic List of Values: Supports field values fetched from a table and stored in the database. The
same can be used during Data Entry.
If Comma Separated Values is selected:
1. Enter the List of Values to be displayed.
2. Specify Alternate Display Values to be displayed.
3. Click OK and save the specified list of values.
If Dynamic List of Values is selected:
1. Select Table Value, List Value and Display Value field.
2. Select the Field, Operator, and Functions from the list.
3. Define a filter condition for the selected values.
4. Click OK and save the specified list of values.
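The two Select List variants above can be sketched as follows. All names, rows, and the storage shape are illustrative; the actual definition format is internal to DEFQ.

```python
# Sketch of the two Select List variants. Illustrative only.

# Comma Separated Values: user-specified values with alternate display text.
values = "S,C".split(",")
display = "Savings,Current".split(",")
comma_separated = dict(zip(values, display))

# Dynamic List of Values: values come from a table column, optionally
# restricted by a filter condition.
def dynamic_list(rows, list_col, display_col, condition=lambda r: True):
    return {r[list_col]: r[display_col] for r in rows if condition(r)}

rows = [{"CODE": "IN", "NAME": "India"}, {"CODE": "US", "NAME": "USA"}]
assert dynamic_list(rows, "CODE", "NAME")["IN"] == "India"
assert comma_separated["S"] == "Savings"
```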

6.4.9.6 Defining Messaging Details


While creating a Form, you can click Message Details in the DEFQ Field Properties window to define the
messaging details. You can specify an alert message which is sent to the Creator of the Form or to an
Authorizer.


Figure 151: Messaging Details window


In the Messaging Details for a Form window:


1. Select Messaging Required checkbox to activate the Messenger feature.

NOTE: If the option is not selected, a single mail is sent for the entire batch. Message details such as
recipients, subject, and contents are fetched from the metadata.

2. Select the required Available Message Types from the list and click the move button.
3. Select the Message Type from the drop-down list based on specific action.
4. Select Specific Messages Required to add a specific message.

5. Select Available Fields for Subject, Content, & Recipients from the list and click the move button.
6. Click Save to save the messaging details. You also need to select Save with Authorization in the
DEFQ Field Properties window for the messages to be functional.

6.4.9.7 Form Data Versioning


You can perform data versioning on an authorized Form. The modifications made to the particular Form
are tracked and displayed according to date-based versioning. In the Data Versioning for Form window, do the following:
1. Select Enable Data Versioning checkbox to ensure that the version is tracked.
2. Select the Table and Version Identifier from the drop-down list.
3. Click OK and save the versioning details.

6.4.9.8 Save with Authorization


The Save with Authorization feature in Forms Designer (Sort Fields Selection window) allows you to
authorize the uploaded data. Authorization serves as a checkpoint for validation of uploaded data.


Figure 152: DEFQ – Data Entry Save Authorization window

To authorize the uploaded data, you need to create a Form in DEFQ with the Save with Authorization
checkbox selected.
1. Before any DEFQ Form is created to authorize the data, the underlying table in the data model
needs to have the following columns added to its structure. You need to perform a data model upload
so that the new structures are reflected in the application.
Columns required:
V_MAKER_ID VARCHAR2(20),
V_CHECKER_ID VARCHAR2(20),
D_MAKER_DATE DATE,
D_CHECKER_DATE DATE,
F_AUTHFLAG VARCHAR2(1),
V_MAKER_REMARKS VARCHAR2(1000),
V_CHECKER_REMARKS VARCHAR2(1000)
2. Navigate to Create a New Form in the Forms Designer section and complete the design steps up to
Step 6. From the DEFQ Field Properties window explained in step 7, select the appropriate values as
listed below for Store Field As depending on the columns selected:
V_MAKER_ID - MakerID
V_CHECKER_ID - CheckerID
D_MAKER_DATE - Maker Date
D_CHECKER_DATE - Checker Date
F_AUTHFLAG - AuthFlag
V_MAKER_REMARKS - Maker Remarks
V_CHECKER_REMARKS - Checker Remarks
3. Click Save with Authorization. Once data is loaded into the table, you can log in as an Authorizer and
navigate to the Data Entry window. Select the Form to open it and authorize the loaded records.
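The maker-checker columns required in step 1 can be expressed as a single ALTER TABLE statement. The sketch below generates it; the table name FCT_SAMPLE is a placeholder, and in practice the change must go through a data model upload rather than direct DDL.

```python
# Sketch: the maker-checker columns listed in step 1, assembled into an
# Oracle ALTER TABLE statement. FCT_SAMPLE is a hypothetical table name.
COLUMNS = [
    ("V_MAKER_ID", "VARCHAR2(20)"),
    ("V_CHECKER_ID", "VARCHAR2(20)"),
    ("D_MAKER_DATE", "DATE"),
    ("D_CHECKER_DATE", "DATE"),
    ("F_AUTHFLAG", "VARCHAR2(1)"),
    ("V_MAKER_REMARKS", "VARCHAR2(1000)"),
    ("V_CHECKER_REMARKS", "VARCHAR2(1000)"),
]

def alter_statement(table):
    cols = ", ".join(f"{name} {dtype}" for name, dtype in COLUMNS)
    return f"ALTER TABLE {table} ADD ({cols})"

print(alter_statement("FCT_SAMPLE"))
```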

DATA MAINTENANCE INTERFACE
DATA ENTRY

7 Data Maintenance Interface


Data Maintenance Interface (DMI) helps you design a Data Form in a user-specified format and perform
maintenance activities using the designed Form.
Data Form Designer
The designer allows the user to design a Form to maintain the underlying data.
Data Maintenance
This allows the user to maintain the data either through the Form defined using the Form Designer, or
through a bulk upload using the Excel upload mechanism. A strong data governance process is
enabled through an approval workflow for the maintained data.
The Data Maintenance Interface (DMI) feature of OFSAAAI provides the capability to add or modify data in
any table in the Atomic Schema. The feature adopts a role-based approach so that different users in your
organization can perform the various stages involved in the modification of the table data. The different
stages also go through an approval cycle that also follows a role-based approach.
The modification of the table data by using the DMI features involves the following main processes:
• Configuring the required table and its attributes by using a template
• Performing the data entry for the configured templates
The configuring of the table and its attributes is achieved by using a template called Forms Definition. You
can use any of the following methods to configure the template:
• Use an Excel file with pre-existing data to modify the required table
• Use the Designer option to configure the required table and its attributes
When the Forms Definition is approved, the data entry for the template can be performed.
If the template is created by using an Excel file, the data entry involves the verification of table records that
are added or modified by the Excel file. The verified records are then sent for final approval.
If the template is created by using the Designer option, the data entry involves entering the values for the
table records. These records are then sent for final approval.
For example, consider the scenario where an Analyst in a Bank wants to modify the attributes, ACCOUNT
BRANCH CODE and ACCOUNT DESCRIPTION in the table DIM_ACCOUNT. The Analyst can create the
Forms Definition by using an Excel file with pre-existing data and request a supervisor to validate and
approve the Forms Definition. When it is approved, a member of the operations team can perform the
data entry by verifying the records modified by the Excel file. The verified records are then sent for
approval.

NOTE: To use this feature:
• You must apply the mandatory patch 34794130 on top of the OFS AAI 8.1.2.1.0 ML installation.
• Oracle Financial Services Analytical Applications Infrastructure Extension Pack (OFS AAIEP) is
required. For more details, refer to the OFS AAIEP Release Notes.


Topics:
• Prerequisites
• Creating Forms Definition
• Approving or Rejecting Forms Definition
• Data Entry

7.1 Prerequisites
The following are the prerequisites to access and perform functions in the DMI user interface:
• Mapping DMI Menu into Application Menu Tree
• User Role Mapping and Access Rights

7.1.1 Enabling DMI User Interface in Application Menu Tree


You can configure DMI to appear in any relevant menu of your choice in the application.
For example, you could add to Common Tasks menu.
Add the 'AAI_DMI' menu entry to the aai_menu_tree table in the CONFIG schema to enable DMI in the LHS
menu.
After you have added the menu entry, follow the instructions described in the section User Role Mapping
and Access Rights.

7.1.2 User Role Mapping and Access Rights


User access to the DMI UI and the ability to perform functions in it is dependent on the mapping of the
user profile to the roles in the OFS AAI application and the access rights assigned.
Users of DMI must be mapped to the following roles in OFSAA to access the DMI windows:

Table: User Role Mapping for DMI

• DMIDSGNREAD (Data Designer Read): Assign this role to the user to access the Designer View menu
from the Navigation Tree. NOTE: The mapping of this role does not allow view, edit, and add actions.
• DMIDSGNAUTH (Data Designer Auth): Assign this role to the user to Authorize Excel Upload and
Designer Summary definitions.
• DMIDSGNREJ (Data Designer Reject): Assign this role to the user to Reject Excel Upload and
Designer Summary definitions.
• DMIDGNFORM (Data Designer Form): Assign this role to the user to create a Designer Form
Definition.
• DMIDGNTEMPLATE (Data Designer Template): Assign this role to the user to create an Excel Upload
Definition.
• DMIDSGNDEL (Data Designer Delete): Assign this role to the user to Delete Excel Upload and
Designer Summary definitions.
• DMIDGNVIEW (Data Designer View): Assign this role to the user to create a View Definition.
• DMIDSGNWRITE (Data Designer Write): Assign this role to the user to Add, Edit, and Copy all kinds of
definitions in the Designer screen.
• DMIDATAREAD (Data Entry Read): Assign this role to the user to access the Data View menu from the
Navigation Tree. NOTE: The mapping of this role does not allow view, edit, and add actions.
• DMIDATAALL (Data All Summary): Assign this role to view the list of all Component Records in the
Data Entry screen.
• DMIDATAWRITE (Data Entry Write): Assign this role to the user to Add and Edit Records in the Data
Entry screen.
• DMIDATAAUTH (Data Entry Auth): Assign this role to Authorize a Record Summary in the Data Entry
screen.
• DMIDATAREJ (Data Entry Reject): Assign this role to Reject a Record Summary in the Data Entry
screen.
• DMIDATADEL (Data Entry Delete): Assign this role to Delete a Record Summary in the Data Entry
screen.
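The role-to-action relationships above can be sketched as a simple lookup. The role codes are taken from the table; the action names and the helper are illustrative, not a product API.

```python
# Sketch: DMI data-entry roles mapped to the actions they permit, plus a
# helper that checks whether a user's roles allow an action. Illustrative.
ROLE_ACTIONS = {
    "DMIDSGNWRITE": {"add", "edit", "copy"},
    "DMIDSGNAUTH": {"authorize"},
    "DMIDSGNREJ": {"reject"},
    "DMIDSGNDEL": {"delete"},
    "DMIDATAWRITE": {"add", "edit"},
    "DMIDATAAUTH": {"authorize"},
    "DMIDATAREJ": {"reject"},
    "DMIDATADEL": {"delete"},
}

def can(user_roles, action):
    return any(action in ROLE_ACTIONS.get(role, set()) for role in user_roles)

assert can({"DMIDATAWRITE"}, "edit")
assert not can({"DMIDATAREAD"}, "edit")   # read-only role permits no edits
```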

7.2 Creating Forms Definition


You can create a Forms Definition from Forms Definition - Summary to select the table and the
attributes to modify. You can create a Forms Definition by using either of the following methods:
• Creating Forms Definition Using Excel Upload
• Creating Forms Definition Using Designer
When a Forms Definition is created, a supervisory user with the necessary role can validate the
configuration of the Forms Definition and approve it. This approved form can be later used by a different
user to perform Data Entry. For more information, see Data Entry.

NOTE: A user who creates a Form cannot perform Data Entry on it, even if the same user grants the
required permissions while creating the Form.
For example, if the user AAIUSER creates a Form to modify the product data and grants permission to
perform data entry while creating it, the same user cannot perform this operation from Data Entry,
and the following error is displayed:
“Maker cannot Approve/Reject the Record”


You can also auto-approve Forms that are created using the Designer option. When you auto-approve the
Form, the user performing the Data Entry does not have to send the entries for an approval cycle.

7.2.1 Before you Begin


The following roles must be assigned to you to perform actions from the Forms Definition Summary:
• DMIDSGNWRITE (Data Designer Write): Add, Edit, or Copy a Forms Definition.
• DMIDGNFORM (Data Designer Form): Create a Forms Definition. You must be assigned this role
in addition to the role DMIDSGNWRITE.
• DMIDGNTEMPLATE (Data Designer Template): Create a Template Definition. You must be assigned
this role in addition to the role DMIDSGNWRITE.
For information on assigning roles, see Role Maintenance.

7.2.2 Creating Forms Definition Using Excel Upload


You can use an Excel file that has column names as per the table in the Atomic Schema. You can also
modify the mapping for the attributes while you create the Forms Definition. When the Forms Definition
that you create using the Excel option is approved from the Forms Definition – Summary, users with the
necessary role and permission can perform Data Entry for the records updated by the Excel file.
To create a Forms Definition by using an Excel file, perform the following steps:
1. Click Financial Services Enterprise Modelling from the Navigation List on your Home Page.
2. Click Common Tasks, Data Maintenance Interface, and then select Designer View.
The Forms Definition – Summary Page is displayed.

3. Click Create New from the Forms Definition – Summary Page.


The Create Forms Definition Pane is displayed.
4. Click Apply after completing the fields as per the following table:

Table 65: Options in the Forms Definition Pane

• Type: Select Excel Upload as the method to create the Forms Definition.
• Auto Map Entities: This option is enabled when the Type is selected as Excel Upload. Select this box
to auto map the attributes in the Excel file with the attributes in the Entity Table.
• Name: Enter a name for the Forms Definition.
• Description: Enter a description for the Forms Definition.

The File Upload tab of the Configure Page is displayed.


5. Enter a name and description for the Excel template in the Template Name and Description
fields.


6. Click Drag and Drop, and select the Excel file to update the required table.

TIP: You can also drag and drop the required Excel file into the Drag and Drop field.

The Excel file is uploaded and a confirmation box is displayed.


7. Click Apply.
The Mapped Entities tab is displayed.
8. Enter the name of the table that you want to modify in the Primary Entity Field.
If the table has Child tables, the Child tables are also displayed in the Mapped Entities tab. You can
select the required Child tables for which data should be input during data entry.

9. Select Enable Bulk Authorization if you want to enable the bulk authorization of all the records
when you edit an approved Form from Data Entry.
10. Click Apply.
The Mapped Attributes Tab is displayed.
11. Click the table in the Entity Name field.
The source attributes from the table and the mapped attributes from the Excel file are displayed.

Figure 153: Expanded Table


If the selected table has Child tables, the Child tables that you select from the Mapped Entities tab
are also displayed in the Attributes tab. You can configure the attributes for the master table and
its child tables here.

Figure 154: Master Child Tables in the Mapped Attributes Tab

12. Click the required mapping in the Override Mapping Column and enter the required attribute name
if you want to change the default mapping.
13. Select Participate in Data Security if you want to configure a specific condition. The condition that
you configure is applicable when a user performs the data entry for the table records for each
approved Forms Definition from the Data Entry Page.
For example, consider that you configure the condition DIM_ACCOUNT_COUNTRY_NAME =
‘INDIA’ for the reference table DIM_ACCOUNT. When a user performs the data entry for this
Forms Definition from the Forms Definition - Summary Page and enters a country name other
than ‘INDIA’, the record is rejected by the application when another user attempts to approve it.


Complete the following steps if you want to configure Data Security for the Forms Definition:
a. Select the check box in the Participate in Data Security Column.

The icon is displayed.

b. Click the icon.


The Data Security pop-up box is displayed.
c. Select the required table based on which you want to build your condition from the Reference
Table drop-down list.
d. Build your expression by selecting the required column, condition, and filter value.

Figure 155: Expression for Data Security

e. Click Apply.
The filter condition is applied to the selected mapping.

14. Click User Security to select the user or user groups who can perform data entry to maintain the
data in the table.
The User Security Pane is displayed.
15. Enter the required user group or user to assign permissions from the Map Users / Groups Field.
When you select the user group or user, the permissions for each approved Forms Definition are
displayed. These permissions are the actions that the selected user group or user can perform while
performing Data Entry.

Figure 156: User Security Options in the Entity Details Page


16. Select the following permissions in the Map Users / Groups Pane that the user group or user can
perform for each Forms Definition from the Data Entry Page:

Table 66: Permissions in the Map Users / Groups Pane

Option Description

Add /Edit Add or modify records in an approved Forms Definition

Delete Delete records in an approved Forms Definition

Authorize Authorize the records in an approved Forms Definition

Duration From Optional. Select the start date for which the permissions
are available to the user or user group.

Duration To Optional. Select the end date for which the permissions
are available to the user or user group.

The User Security Configuration is complete.

NOTE If you select a user group for User Security, you can view the
users mapped to that particular group by clicking the icon.

17. Select DMI_WorkFlow from the Workflow drop-down list if you want the process of data entry to
go through the Process Modelling Framework (PMF) workflow that you have already created.
For information on PMF workflows, see the OFSAAI PMF Orchestration Guide.
18. Click Save as Draft if you want to save the Forms Definition in draft format.
19. Click Submit if you want to submit the Forms Definition for approval.
The Forms Definition is created and is displayed in Forms Definition - Summary in Awaiting
status.

7.2.3 Creating Forms Using Data Exporter


Forms created using Data Exporter are used to export table data to CSV or JSON format.
You can also include filters to view and export a specific set of data.
1. Select Data Exporter in Create New Form Definition page.
 Code - The application generates a unique value for Form Code and does not require any input.
 Name - The name of the form in Form Name. You can enter between 3 and 100 characters. Only
alphabets, numbers, spaces, and underscores are allowed.
 Description - The Form Definition description. You can enter between 3 and 100 characters. Only
alphabets, numbers, spaces, and underscores are allowed.
 Created By - The username of the logged-in user who created the form.
2. Click Apply to proceed with the Form creation.
Click Close to return to the Form Designer Summary page.


The Table selection tab is displayed.


3. Select the Table from the drop-down list and click Apply.
The Attributes tab is displayed.
4. Click the drop-down arrow corresponding to the table in the Entity Name field.
The source attributes from the table are displayed. Select the required source attributes.
5. Apply Data Filters to the form data.
6. Click Data Preview to preview the form data.
7. Select Auto Approve if you do not want the Forms Definition to go through the PMF workflow. When
you select this option, the Forms Definition is automatically approved from the Forms Definition
Summary Page and is available for Data Entry. A user with the required role can then perform the
data entry without the need for an approval process.
If you do not select the Auto Approve option, the Form Definition goes for Manual Approval. A
user with the required role has to approve the form manually.
8. Click Save as Draft if you want to save the Forms Definition in draft format.
9. Click Submit if you want to submit the Forms Definition for approval.

7.2.4 Creating Forms Definition Using Designer


You can use the Designer option to create a Forms Definition and select the table and attributes that you
want to modify. When the Forms Definition that you create using the Designer option is approved from
the Forms Definition – Summary, you can enter the values for the table records in the approved Forms
Definition from Data Entry.
To create a Forms Definition by using the Designer option, perform the following steps:
1. Click Financial Services Enterprise Modelling from the Navigation List on your Home Page.
2. Click Common Tasks, Data Maintenance Interface, and then select Designer View.
Forms Definition – Summary is displayed.

3. Click Create New .


The Forms Definition Pane is displayed.
4. Select the Type as Designer.
5. Enter a name and description for the Forms Definition, and click Apply.
The Entities tab of the Configure Page is displayed.
6. Enter the name of the table that you want to modify in the Primary Entity Field.
If the selected table has child tables, they are displayed here. You can select the
required child tables for which you want to input data during data entry.

Figure 157: Child tables


NOTE You can select a maximum of six child tables for each master table.

7. Select Enable Bulk Authorization, if you want to enable the bulk authorization of records while
performing data entry.
8. Click Apply.
The Attributes Tab is displayed.
9. Click the table name in the View Name field.
The attributes in the entity table are displayed.

Figure 158: Expanded Table


If your table has child tables, the child tables that you select from the Entities tab are also
displayed in the Attributes tab.

Figure 159: Master and Child Tables in the Attributes Tab

10. Select the attributes for which you want to modify the data from the Attribute Name field.
11. Select Participate in Data Security if you want to configure a specific condition. The condition that
you configure is applicable when a user performs the data entry for the table records for each
approved Forms Definition from the Data Entry Page.
For example, consider that you configure the condition DIM_ACCOUNT_COUNTRY_NAME =
'INDIA' for the reference table DIM_Account. When a user performs the data entry for this
Forms Definition from the Forms Definition - Summary Page and enters a country name other
than 'INDIA', the record is rejected by the application when another user approves this
record.
Complete the following steps if you want to configure Data Security for the Forms Definition:
a. Select the check box in the Participate in Data Security column against the required mapping.

The icon is displayed.

b. Click the icon.


The Data Security pop-up box is displayed.
c. Select the required table based on which you want to build your condition from the Reference
Table drop-down list.
d. Build your expression by selecting the required column, condition, and filter value.

Figure 160: Expression for Data Security

e. Click Apply.
The filter condition is applied to the selected mapping.
12. Complete the following steps if you want to add filters to the Forms Definition:

a. Click Launch Filter Condition .


The Filter Condition pop-up is displayed.
b. Build your expression by selecting the required operator, column, and filter value.

NOTE You can add multiple conditions or groups in the Filter Condition pop-up.

c. Click Apply.
The filter is displayed in the Filter Condition Field.
For example, consider the table dim_product_book that has the column v_product_code. The
column has values ranging from 1 to 500. If you want to view or modify the records that have
values less than 118 for the column v_product_code, you can create the following expression
using the Filter Condition pop-up:


((DIM_PRODUCT_BOOK.V_PRODUCT_CODE) < ('118')).


See the following image for reference:

Figure 161: Example for Filter Condition
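The effect of such a filter condition can be sketched as follows (illustrative Python, not OFSAAI code; the sample rows and the numeric reading of the product code are assumptions made for this example):

```python
# Illustrative sketch of the filter intent V_PRODUCT_CODE < 118 on the
# hypothetical table dim_product_book. Codes are treated as integers here,
# matching the intent "values less than 118" described above.

def product_filter(row: dict) -> bool:
    return int(row["V_PRODUCT_CODE"]) < 118

rows = [{"V_PRODUCT_CODE": "100"}, {"V_PRODUCT_CODE": "118"}, {"V_PRODUCT_CODE": "250"}]

# Only rows that satisfy the condition remain visible in the form.
visible = [r for r in rows if product_filter(r)]
```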

13. Click Apply.


The Ruleset tab is displayed. This tab enables you to give permission to add data during data entry
for those attributes that are in read-only mode. Attributes that you select from the child tables are
also displayed in the Ruleset tab.
14. Select the checkbox against the Allow Add column for the attributes that you want to modify.

15. Click User Security to select the user or user groups who can perform data entry to maintain the
data in the table.
The User Security Pane is displayed.
16. Enter the required user group or user to assign permissions from the Map Users / Groups field.
When you select the user group or user, the permissions for each approved Forms Definition are
displayed. These permissions are the actions that the selected user group or user can perform while
performing Data Entry.

Figure 162: User Security Options in the Entity Details Page


17. Select the following permissions in the Map Users / Groups Pane that the user group or user can
perform for each Forms Definition from the Data Entry Page:

Table 67: Permissions in the Map Users / Groups Pane

Permission Description

Add /Edit Add or modify records in an approved Forms Definition

Delete Delete records in an approved Forms Definition

Authorize Authorize the records in an approved Forms Definition

Duration From Optional. Select the start date for which the permissions
are available to the user or user group.

Duration To Optional. Select the end date for which the permissions are
available to the user or user group.

The User Security Configuration is complete.

NOTE If you select a user group for User Security, you can view the
users mapped to that particular group by clicking the icon.

18. Select DMI_WorkFlow from the Workflow drop-down list if you want the process of data entry to
go through the Process Modelling Framework (PMF) workflow that you have already created.
For information on PMF workflows, see the OFSAAI PMF Orchestration Guide.
19. Select Auto Approve if you do not want the Forms Definition to go through the PMF workflow. When
you select this option, the Forms Definition is automatically approved from Forms Definition –
Summary and is available for Data Entry. A user with the required role can then perform the data
entry without the need for an approval process. For more information, see Data Entry for Forms
Created using the Auto-Approve Option.
20. Click Save as Draft if you want to save the Forms Definition in draft format.
21. Click Submit if you want to submit the Forms Definition for approval.


The Forms Definition is created and is displayed in the Forms Definition - Summary Page in
Awaiting status. If you have auto-approved it, the Form is ready for Data Entry.

7.3 Creating Data Filters for New Form Definitions


Complete the following steps if you want to add filters to the Forms Definition:
1. Click Launch Filter Condition.
The Filter Condition pane is displayed.
2. Enter or select the following details:
 Column - Select the column on which to apply the filter.
 Condition - Select one of the following filter conditions to filter the column data:
 = - Equal to
 IN - Matches any value in a specified list of values
 <> - Not equal to
 < - Less than
 <= - Less than or equal to
 > - Greater than
 >= - Greater than or equal to
 IS - Checks for a null value (for example, IS NULL)
 Type - Select one of the following filter types:
 Static - Select Static to enter a value and execute the filter using only one value. You
cannot change the value at a later point.
 Dynamic - Select Dynamic to change the filter value when needed.
After setting the filter type to Dynamic, select the Placeholder and set one of the default
seeded values to process the filter.

NOTE:

Only values that are already seeded in the Database table, are displayed in the Placeholder drop-down list.

 Filter Value - Select/enter the filter value.

NOTE:

For Language Placeholder the default locale language is displayed and cannot be modified.
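The filter conditions listed above map onto familiar comparison semantics. The following sketch (illustrative Python, not OFSAAI code; the IN and IS readings are assumptions based on standard SQL semantics) shows one reading of each operator:

```python
# Illustrative mapping of the listed filter conditions to predicates.
# The IN and IS readings are assumptions based on standard SQL semantics.
import operator

CONDITIONS = {
    "=":  operator.eq,
    "<>": operator.ne,
    "<":  operator.lt,
    "<=": operator.le,
    ">":  operator.gt,
    ">=": operator.ge,
    "IN": lambda value, choices: value in choices,    # membership in a value list
    "IS": lambda value, expected: value is expected,  # e.g. IS NULL -> value is None
}

def apply_filter(value, condition, filter_value):
    """Evaluate one filter expression: <column value> <condition> <filter value>."""
    return CONDITIONS[condition](value, filter_value)
```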

3. Click Add to add a new Filter expression. You can add multiple Filter expressions to the same filter.
The filter is added to the list of filters.
Hover over the placeholder filter to view more details about the filter.


4. Click Validate to verify the filter condition is valid.


A confirmation message is displayed if the filter is valid.
5. Click Apply.
The filter is displayed in the Filter Condition Field.
6. Click Reset to clear all the filter expressions and create a new expression.
7. Click Delete to delete an existing filter expression.
8. Click Edit to modify a filter expression. After editing the expression, click Validate to verify that the
condition is valid.
9. Click Apply to add the filter expression to the form definition.

7.4 Enabling Data Security for New Form Definitions


Data security conditions allow you to apply certain filters when a user performs data entry for the
table records for each approved Forms Definition from the Data Entry page.
For example, consider that you configure the condition COUNTRY_NAME = ‘INDIA’ for the reference
table DIM_COUNTRY. When a user performs the data entry for this Forms Definition from the Forms
Definition - Summary Page and enters a country name other than ‘INDIA’, the record gets rejected by the
application when another user approves this record.
Complete the following steps to configure Data Security for the Forms Definition:
1. Select the check box next to the Attribute Name, in the Participate in Data Security Column.

NOTE:
Data Security information must be configured for each attribute name, separately.

2. Click Data Security.
The Data Security page is displayed.
3. Select the table based on which you want to build your condition from the Reference Table
drop-down list.
4. Build your expression by selecting the required column, condition, and filter value.
5. Click Apply.

7.5 Enabling User Security for New Form Definitions


The User Security option helps you to select the users/user groups who can add, edit, delete and/or
authorize data entry.
1. Click User Security to select the user or user groups who can perform data entry to maintain the
data in the table.
The User Security page is displayed.
2. Enter the required user group or user to assign permissions from the Map Users /Groups Field.


When you select the user group or user, the permissions for each approved Forms Definition are
displayed. These permissions are the actions that the selected user group or user can perform while
performing Data Entry.

Table 68: Permissions in the Map Users / Groups Pane

Field Description

Add /Edit Add or modify records in an approved Forms Definition.

Delete Delete records in an approved Forms Definition.

Authorize Authorize the records in an approved Forms Definition.

Duration From Optional. Select the start date for which the permissions are available to
the user or user group.

Duration To Optional. Select the end date for which the permissions are available to
the user or user group.

The User Security Configuration is complete.

NOTE:

If you select a user group for User Security, you can view the users mapped to that group by clicking the Users icon.

7.6 Editing a Forms Definition


You can edit the Forms Definitions that are saved in draft format.
To edit a Forms Definition, perform the following steps:

1. Click Menu Button in the Forms Definition that is in Draft status, and then select Edit.

Figure 163: Options for Menu Button

The Configure Page is displayed.


2. Perform the required modifications to the Forms Definition and then click Submit.
The Forms Definition is changed to Awaiting status in Forms Definition- Summary. If you select
Auto Approve while editing, the Forms Definition is automatically approved.


7.7 Approving or Rejecting Forms Definition


You can validate and approve the Forms Definition created by a user in your organization if you are a
supervisor and have the required role assigned to you. For more information on Forms
Definition, see Overview.
For example, as the Operations Manager in your organization, you can approve the Forms
Definition created by one of your analysts to modify the records for the table DIM_PRODUCT.
If the configuration in the Forms Definition is incorrect, you can reject the Forms Definition. The rejected
Forms Definition changes into Draft status. You can then request the required user to edit the Forms
Definition and submit it for approval again.
You can also view, copy, and edit each Forms Definition from the Forms Definition – Summary Page by
clicking Menu button . These actions are available based on the roles assigned to you.

7.7.1 Before you Begin


The following roles must be assigned to you before you can approve or reject a Forms Definition from the
Forms Definition Summary Page:
• DMIDGNWRTE (Data Designer Write): Add, Edit, or Copy Forms Definition.
• DMIDSGNREAD (Data Designer Read): View the Forms Definition - Summary Page. This role
does not allow add, edit, or copy actions.
• DMIDSGNSUM (Designer Records List): View the list of Designer Forms Definition
• DMIEXLSUM (Excel Uploaded Records List): View the list of Excel Forms Definition
• DMIALLSUM (Designer and Excel Records List): View the list of all Forms Definition
• DMIDGNATH (Data Designer Auth): Approve Forms Definitions
• DMIDGNREJ (Data Designer Reject): Reject Forms Definitions
For information on assigning roles, see Role Maintenance.

7.7.2 Approving a Forms Definition


To approve a Forms Definition, perform the following steps:
1. Click Financial Services Enterprise Modelling from the Navigation List on your Home Page.
2. Click Common Tasks, Data Maintenance Interface, and then select Designer View.
The Forms Definition Summary Page is displayed.

3. Click Menu Button in the Forms Definition that is in Awaiting status, and then click Approve.
The Configure Page is displayed.
4. Click Approve and then enter the required description for the approval in the Comments Field.

Figure 164: Comments for Approval


5. Click Submit.
The Forms Definition is approved and is displayed in the Data Entry Page as an entry.

7.7.3 Rejecting a Forms Definition


To reject a Forms Definition, perform the following steps:

1. Click Menu Button in the Forms Definition that is in Awaiting status, and then click Reject.
The Configure Page is displayed.
2. Click Reject and then enter the required description for the approval in the Comments Field.

Figure 165: Comments for Rejection

3. Click Submit.
The Forms Definition is rejected and is displayed in Forms Definition – Summary in Draft status.
You can then edit the Forms Definition in draft status and submit it for approval again.
For more information on editing a Forms Definition, see Editing a Forms Definition.


7.8 Data Entry


The Data Entry feature of DMI enables you to maintain or modify the table data by using the Forms
Definition that is created and approved from Forms Definition – Summary. For more information on
Forms Definitions, see Creating Forms Definition.
If the approved Forms Definition is created by using the designer option, a user with the necessary role
can add or modify the records in the table as per the configuration in the Forms Definition. These records
are then sent to another user with the necessary permission for final approval.
If the approved Forms Definition is created by using an Excel file, a user with the necessary permission can
verify and approve the records that are modified with the values from the Excel file. If the records
modified by the Excel file are incorrect, the user can reject the records. The rejected record can be
modified by a different user with the necessary role and can be sent for the final approval again. The
Forms Definitions that are created by using an Excel file are labeled with an Excel icon in Data Entry.

Figure 166: Excel Definitions in Data Entry

7.8.1 Viewing Data Entry


Before you Begin
The role DMIDATAREAD (Data Entry Read) must be assigned if you want to view Data Entry.
Procedure
To view Data Entry, perform the following steps:
1. Click Financial Services Enterprise Modelling from the Navigation List on your Home Page.
2. Click Common Tasks, Data Maintenance Interface, and then select Data Entry.
The Data Entry Page is displayed.

7.8.2 Data Entry for Forms Created without the Auto-Approve Option
7.8.2.1 Data Entry for Forms created using the Designer Option
If the Forms Definition is created by using the designer option, the user with the necessary role can enter
the values for the table records as per the configuration in the Forms Definition. This user can also add or
delete records. These records are then submitted for approval to another user with the necessary role.


Before you Begin


The following roles must be assigned to you if you want to perform data entry using Forms Definitions
that are created by using the Designer option:
• DMIDATAREAD (Data Entry Read): View Data Entry. This role does not allow edit and add
actions for each Forms Definition.
• DMIDATWRTE (Data Entry Write): Add, edit, or copy records for data entry.
• DMIDATDEL (Data Entry Delete): Delete a record for data entry.
Procedure
To perform Data Entry for the table records in a Forms Definition created by using the Designer option,
perform the following steps:

1. Click Menu button in the required Forms Definition from the Data Entry Page.
2. Click Edit.
The Entity Details Page is displayed.
3. Select Ready from the Status drop-down list.
The entity records that are ready for entering data are displayed.

4. Click the Edit icon on the record for which you want to enter data.
An edit pane is displayed.
5. Enter the values in the attributes that you want to modify, and click OK.
The status of the modified record is changed from Ready to Draft. You can repeat the steps for all
the records for which the data needs to be entered.

6. Select Add if you want to add new records.


The new records are displayed in Draft status.

7. Select the required record and select Delete if you want to remove records that are in draft
status.

8. Click the modified record in draft status, and then click Send for Approval .
The record is sent for approval and is changed to Awaiting status. A user with the necessary role can
approve these records. For more information, see Approving and Rejecting Records after Data Entry.

ATTENTION If the user has configured the Participate In Data Security option
while creating a Forms Definition, you must enter the value as per
the configured condition. If you enter a value that does not meet
the condition, then the record is rejected by the application and the
approval fails. You can view the details of the rejection by
using the Audit trail option for each record.
For information on the Participate In Data Security option, see
Creating Forms Definition Using Designer.


7.8.2.2 Attaching Supporting Documents for Draft Records


When a record is added to a form definition and is in the Draft status, you can attach supporting
documents to validate the data updated in the record. You can configure the file extensions that can be
attached.
For more information about configuring the list of file formats and size, refer to Configure Document
Upload File Formats and Size.
Complete the following procedure to add, delete, or download a supporting document:

1. Click Menu button in the required Forms Definition from the Data Entry Page.
2. Click Edit.
The Entity Details Page is displayed.
3. Select Draft from the Status drop-down list.
The entity records that are in Draft status are displayed.
4. Click the Actions button corresponding to the record for which you need to add, delete or download
a supporting document.
5. Select Attach, to open the Supporting Documents pane.
6. Click Drag and Drop, to attach a file.
You can attach only specific file formats. For more information, refer to Configure Document
Upload File Formats and Size.
A confirmation message is displayed, after the file is attached and the new file is added to the
supporting documents list.
7. Click the Comments button corresponding to a document to add valid reasons and the document
details.
8. Click the Download button to download the supporting document.
9. Click the Delete button to remove the supporting document.

7.8.2.3 Data Entry for Forms created by using the Excel option
When a Forms Definition created by using an Excel file is approved from Forms Definition – Summary,
the table records in the selected table are modified by the data in the Excel file. These records are in
Awaiting status for the approved Forms Definition in Data Entry. You can verify the records modified by
the Excel file records and approve them if you are assigned to the necessary role. If the records modified
by the Excel file are incorrect, you can reject the records. The status of the rejected records is changed to
Draft. A user with the necessary role can edit the records in draft status and submit them for approval
again.
• To approve records, see Approving a Record.
• To reject records, see Rejecting a Record.
• To edit a record in draft status, see Editing a Rejected Record.


7.8.2.4 Approving and Rejecting Records With 2 Factor Authentication

7.8.2.4.1 Approving a Record with 2 Factor Authentication

Before you Begin


The role DMIDATATH (Data Entry Auth) must be assigned to you if you want to approve a record for Data
Entry.
Procedure
To approve records that are in the Awaiting status, perform the following steps:

1. Click Menu button in the required Forms Definition from Data Entry.
2. Click Edit.
The Entity Details Page is displayed.
3. Select Awaiting from the Status drop-down list.
The entity records that are waiting for final approval are displayed.

4. Select the required record or records, and then click Approve .

5. Enter the required comment in the Comments Field, and then click Approve .
A unique code is sent to the registered email ID of the authorized user who has logged in to approve
the valid records. The unique code is valid only for 90 seconds. If you have not received the unique
code or the unique code is invalid, click Resend in the unique code entry page, to receive a new
code.
6. Enter the unique code and click Submit.
The record is approved successfully with the values from the Excel file.
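The 90-second validity window can be pictured with a small sketch (illustrative Python, not OFSAAI code; the validation logic and the sample code value are assumptions made for illustration — OFSAAI performs this check server-side):

```python
# Illustrative sketch of a one-time approval code that expires 90 seconds
# after it is issued. This only models the timing rule described above;
# the function and code value are hypothetical.

CODE_VALIDITY_SECONDS = 90

def is_code_valid(entered: str, issued: str, issued_at: float, now: float) -> bool:
    within_window = (now - issued_at) <= CODE_VALIDITY_SECONDS
    return within_window and entered == issued

issued_at = 0.0
ok = is_code_valid("483920", "483920", issued_at, issued_at + 60)        # inside the window
expired = is_code_valid("483920", "483920", issued_at, issued_at + 120)  # expired: click Resend
```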

7.8.2.4.2 Rejecting a Record With 2 Factor Authentication

You can reject the modified record if the modified records are incorrect and if you have the necessary role
assigned to you. A different user with the necessary role can then modify the records again, and then
submit the record for approval.
Before you Begin
The role DMIDATREJ (Data Entry Reject) must be assigned to you if you want to reject a record for Data
Entry.
Procedure

To reject a record in Awaiting status, perform the following steps:

1. Click Menu button in the required Forms Definition from the Data Entry Page.
2. Click Edit.
The Entity Details Page is displayed. The modified records that are awaiting the final approval
are displayed here.
3. Select the required record, and then click Reject.
4. Enter the required comment in the Comments field, and then click Reject.
A unique code is sent to the registered email ID of the authorized user who has logged in to approve
the valid records. The unique code is valid only for 90 seconds. If you have not received the unique
code or the unique code is invalid, click Resend in the unique code entry page to receive a new
code.
5. Enter the unique code and click Submit.
The record is rejected, and the status is changed to Draft. A user with the necessary role can now
edit the record.

7.8.2.4.3 Editing a Rejected Record

You can edit the records that are in Draft status and send them for approval to a user with the necessary
role.
Before you Begin
The role DMIDATWRTE (Data Entry Write) must be assigned to you if you want to edit a record for Data
Entry.
Procedure
To edit a record, perform the following steps:
1. Select Draft from the Status drop-down list.

2. Click Edit in the record that you want to edit.


The Edit Pane is displayed.

Figure 167: Edit Pane for Attributes in Each Record

3. Modify the required attributes, and then click OK.

4. Select the record and then click Send for Approval .


The modified record is now in Awaiting status. A user with the necessary role can approve the record.

ATTENTION If the user has configured the Participate In Data Security option
while creating a Forms Definition, you must enter the value as per
the configured condition. If an incorrect value is entered, the record
is rejected by the application and the approval fails. You can
view the details of the rejection by using the Audit Trail option
for each record.
For information on the Participate in Data Security option, see
Creating Forms Definition Using Excel Upload.


7.8.3 Data Entry for Forms Created using the Auto-Approve Option
You can perform the Data Entry for auto-approved Forms without the need for approval from a
different user. For more information on auto approving Forms, see Creating Forms Definition.

Forms that are auto-approved do not have the Approve and Reject buttons in Data Entry.

Figure 168: Auto Approved Form

If the required permission and role are assigned to you, perform the data entry and use
the Submit with Auto Approve option to submit and modify the table data.

7.8.4 Other Options for Data Entry


The Audit Trail option for each record enables you to view the history of changes made to that
record.

Figure 169: Audit Trail Option for Each Record

Figure 170: Audit Trail Pane


You can also drag and scale the white bar to view the history of changes based on months, days, or
hours.

Figure 171: Day Wise Audit History

Figure 172: Audit History Based on Hours


You can select all the records together by selecting the check box against the Status Field. You must
enable Bulk Authorization while creating a Forms Definition for this option to appear in Data Entry.
For more information on this, see Creating Forms Definition Using Excel Upload or Creating Forms
Definition Using Designer.

Figure 173: Bulk Select Option for Records

7.8.5 Exporting Forms Created Using Data Exporter


When you create forms using the Data Exporter option, you can export the report to CSV or JSON
format.
To export forms created using Data Exporter (Table) option and Static Filter type:
1. In the Data Entry page, click Action next to the form to be exported and click Export.
The Data View page with the Table details associated with the form, is displayed.


2. Click the Attribute Selection tab to review the values and filters, and modify them if required.
You can also use the default values for the export.
3. Click Data Preview to view the form based on the selected table, columns, and the set filter
attributes.
4. To export the report, complete one of the following steps:
 Click Export CSV to export the report in CSV format.
 Select the File Format as CSV or JSON and click Export.
A confirmation message is displayed after the export is completed, and the Data Entry
Summary is displayed.
5. To download an exported report, click Action and click Status.
The Data Exporter Status page with the list of all the reports that are exported is displayed.
 Click Download to save the report to the local directory.
 Click Download Link to copy the link. You can paste the link in a Web browser and
download the CSV report to the local directory.
 Click Delete to delete the exported report.
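Once downloaded, an exported report can be consumed programmatically. The following Python sketch is an illustration, not part of the product; it assumes a simple flat report layout, with a header row in the CSV export and a flat array in the JSON export:

```python
import csv
import io
import json

def load_report(payload: str, fmt: str):
    """Load an exported Data Exporter report into a list of records.

    Assumption: the CSV export has a header row and the JSON export is a
    flat array of objects; adjust the parsing for your actual report layout.
    """
    if fmt == "CSV":
        # Each data row becomes a dict keyed by the header columns.
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "JSON":
        return json.loads(payload)
    raise ValueError(f"unsupported file format: {fmt}")
```

Adjust the column handling to match the attributes actually selected when the form was exported.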


8 Rule Run Framework


Financial institutions require constant monitoring and measurement of risk in order to conform to
prevalent regulatory and supervisory standards. Such measurement often entails significant
computations and validations with an organization’s data. Data must be transformed to support such
measurements and calculations. The data transformation is achieved through a set of defined rules.
Rules Run Framework within the infrastructure system facilitates you to define a set of rules, reporting
objects, and processes that are required to transform data in a warehouse. You can execute Rules and
Processes and manage the predefined rules within the system.
The Rules Run Framework is used for the following main purposes:
• To design a set of rules and processes and to structure the execution flow of processes that are
required to transform data in a data warehouse or data store.
• To design reporting objects based on previously transformed data that is stored as aggregated
data in multidimensional databases.
• To design reporting objects based on the atomic information stored in the data warehouse
or data store.
See How Run Rule Framework is used in LLFP Application and How Run Rule Framework is used in
LRM Application sections to know how the RRF module is used in other applications.
Before you begin, do the following:
• Select the required Application: An Application is mapped to an Information Domain, which
refers to a logical grouping of specific information and defines the underlying data warehouse
or data store in which the physical data model has been implemented. When you log in to the
Infrastructure system, you can access only those Applications to which your user ID is mapped.
Contact System Administrator for permissions to access a specific Application.
• Select the associated Segment: Segments are defined through the Administration module. A
Segment facilitates you to classify all the related metadata in the selected Information Domain.
You are authorized to access only those metadata objects to which the segment and user roles
have been mapped.
Object Security in RRF framework
• There are seeded user groups, and seeded user roles are mapped to those user groups. If
you are using the seeded user groups, the restrictions on accessing objects based on user groups
are explained in the OFSAA Seeded Security section.
• To create, edit, copy, or remove an object in the RRF framework, you must be mapped to the
folder in the case of a public or shared folder, or you must be the owner of the folder in the case
of a private folder. Additionally, the WRITE role must be mapped to your user
group. For more information, see Object Security in OFSAAI.
• To access the link and the Summary window, your user group should be mapped to the ACCESS
role. You can view all objects created in Public folders, Shared folders to which you are mapped,
and Private folders for which you are the owner.
• In the Component Selector window, you can view the RRF objects like Rule and Process that are
created in Public or Shared folders to which you are mapped and Private folders for which you
are the owner.


• The Folder selector window behavior is explained in the User Scope section.
Hierarchy Member Security
• For each information domain, a default security mapper can be set. Based on this mapper
definition, the Hierarchy Browser window will be displayed.
• In the Hierarchy Browser window, the members that are mapped to your user group are enabled
and can be used. You can view the members that are not mapped, but you cannot use
them because they are disabled.
• If a child hierarchy is mapped and the parent is not mapped to your user group, the parent will
be displayed as a disabled node.
• For all AMHM hierarchies, the corresponding Business Hierarchy is created implicitly. Thus, you
can view and use AMHM hierarchies in the RRF framework, provided they are mapped to your
user group.
• Hierarchy member security is applied only for Source hierarchies. No security is used for Target
hierarchies, Rule Condition, Run Condition, and Process Condition.

8.1 Components of Rules Run Framework


Rules Run Framework consists of the following sections. Click the links to view the section details.
• Rule
• Process
• Run
• Manage Run
• Utilities

8.2 Rule
Financial institutions require constant monitoring and measurement of risk in order to conform to
prevalent regulatory and supervisory standards. Such measurement often entails significant
computations and validations with an organization’s data. Data must be transformed to support such
measurements and calculations. The data transformation is achieved through a set of defined rules.


Figure 174: Rule window

The Rules option in the Rules Run Framework provides a framework that facilitates the definition and
maintenance of a transformation. The metadata abstraction layer is used in the definition of rules
where the user is permitted to re-classify the attributes in the data warehouse model, thus
transforming the data. The underlying metadata objects such as Hierarchies, which are non-large or
non-list, Datasets, and Business Processors drive the Rule functionality. An authorizer must approve
actions such as the creation, modification, copying, and deletion of a Rule for them to take effect.
The Rule window displays the rules created in the current Information Domain with the metadata
details such as Code, Name, Description, Type, Folder, Dataset, Version, and Active status. For more
information on how object access is restricted, see Object Security.
You can search for specific Rules based on Code, Name, Folder, Dataset, Version, Active status, or
Type. The Folder drop-down list displays all public folders, shared folders to which your user group is
mapped, and Private folders for which you are the owner. The Pagination option helps you to manage
the view of existing Rules within the system. You can also click the Code, Name, Description, Type,
Folder, Dataset, Version, or Active column headers to sort the Rules in the List grid in either ascending
or descending order.
The Roles mapped for the Rule module are Rule Access, Rule Advanced, Rule Authorize, Rule Read
Only, Rule Write, and Rule Phantom. Based on the roles mapped to your user group, you can access
various screens in the Rule module. For more information, see Appendix A.

8.2.1 Components of Rule Definition


A Rule is defined using existing metadata objects. The various components of a rule definition are as
tabulated.


Table 69: Components in the Rule Definition and their Descriptions

Component Description

Dataset This is a set of tables that are joined together by keys. A dataset must have at
least one FACT table. The values in one or more columns of the FACT tables
within a dataset are transformed with a new value.

Source This component determines the basis on which a record set within the dataset
is classified. The classification is driven by a combination of members of one
or more hierarchies. A hierarchy is based on a specific column of an
underlying table in the data warehouse model. The table on which the
hierarchy is defined must be part of the selected dataset. One or more
hierarchies can participate as a source as long as the underlying tables on
which they are defined, belong to the selected dataset.

Target This component determines the column in the data warehouse model that will
be impacted by an update. It also encapsulates the business logic for the
update. The identification of the business logic can vary depending on the
type of rule that is being defined.

Mapping This operation classifies the final record set of the target that is to be updated
into multiple sections. It also encapsulates the update logic for each section.
The logic for the update can vary depending on the hierarchy
member/business processor used. The logic is defined through the selection
of members from an intersection of a combination of source members with
target members.

Node Identifier This is a property of a hierarchy member. In a Rule definition, the members of
a hierarchy that cannot participate in a mapping operation are target
members, whose node identifiers identify them to be an ‘Others’ node, ‘Non-
Leaf’ node or those defined with a range expression. Source members, whose
node identifiers identify them to be ‘Non-Leaf’ nodes, can also be mapped.
For more information on Hierarchy properties, see Defining Business
Hierarchies.

NOTE The hierarchies and their nodes/members that are displayed in
the Hierarchy Browser window depend on the security mapper
definition for the selected information domain. For more
information, see Map Maintenance.

8.2.2 Create Rule


You can create rule definitions using the existing metadata objects. The Write role should be mapped
to your user group from the User Group Role Map window.
To create a Rule definition:

1. Click the New button in the toolbar of the Rule window. The Rule Definition (New Mode)
window is displayed.


Figure 175: Rule Definition (New Mode) window

2. From the Linked to pane, click in the Folder field. The Folder Selector dialog is displayed.
The folders that are mapped to your user group are displayed.
a. Select the checkbox adjacent to the required folder. Click OK.

b. Click New from the List toolbar to create a new folder/segment. For more information,
see Segment Maintenance.

c. To search for a folder, specify any keyword and click .

3. From the Linked to pane, click in the Dataset field. The Dataset Selector dialog is displayed
with the list of datasets available under the selected information domain.
a. Select the checkbox adjacent to the required Dataset name and click OK.

b. To search for a particular dataset, specify any keyword and click .

c. To view the properties of the selected Dataset, click .


4. Enter the details in the Master information pane as tabulated.
The following table describes the Field Name in the Master Information pane.

Table 70: Field Names in the Master Information pane and their Descriptions

Field Name Description

ID The ID will be automatically generated once you create the rule.


Code Enter a valid code for the rule. Ensure that the rule code is alphanumeric
with a maximum of 30 characters in length and there are no special
characters except underscore "_".

Name Enter a valid name for the rule. Ensure that the Rule Name is alphanumeric and
does not contain any of the following special characters: #, %, &, +, ", and ~.

Version By default, the Version field is displayed as <<NA>> for the new rule being
created. Once the rule definition is saved, an appropriate version is assigned
as either -1 or 0 depending on the authorization permissions. For more
information, see Rule Definition Versioning.

Active By default, the Active field is displayed as <<NA>> for the new rule being
created. Once the rule definition is saved, the status is set to Yes if you are
an Authorizer creating the rule, or to No if the created rule needs to be
authorized by an Authorizer.

Type Select the Type based on which you want to create the rule from the drop-down
list. The options are Computation and Classification.
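The Code and Name constraints described above can be checked up front, for example when preparing rule definitions in bulk. A minimal Python sketch follows; the function names are illustrative, not part of the product:

```python
import re

# Alphanumeric plus underscore, maximum 30 characters (per the Code field rules)
RULE_CODE_RE = re.compile(r"^[A-Za-z0-9_]{1,30}$")
# Special characters the Name field must not contain
NAME_FORBIDDEN = set('#%&+"~')

def is_valid_rule_code(code: str) -> bool:
    """Check the documented Code constraints."""
    return bool(RULE_CODE_RE.fullmatch(code))

def is_valid_rule_name(name: str) -> bool:
    """Check the documented Name constraints (assumes spaces are permitted)."""
    return bool(name) and not any(ch in NAME_FORBIDDEN for ch in name)
```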

5. Click in the Master information pane to edit the properties of the Rule definition. The
Properties window is displayed.

Figure 176: Properties window

Data in the Query Optimization Settings pane is derived from the global properties (if defined)
in the Optimization tab of the System Configuration > Configuration window. However, some
options defined in Global Preferences take precedence over the Rule-level properties that you define here.


The following table describes the fields in the Query Optimization Settings pane.

Table 71: Field Names in the Query Optimization Settings pane and their Descriptions

Field Name Description

Properties

Effective Start Date, Effective End Date Effective Dating is not implemented for Rule definitions.

Last operation type By default, this field displays the last change done to the Rule
definition. While creating a Rule, this field displays the operation type as Created.

Pre processing

Pre Built Flag This field refers to the pre-compiled rules that are executed with the query
stored in the database. While defining a rule, you can use the Pre Built Flag to speed up rule
execution by reusing existing technical metadata details, so that the rule query is not rebuilt
during Rule execution. Select the required option from the drop-down list.
By default, the Pre Built Flag status is set to No. This indicates that the query statement is
formed dynamically by retrieving the technical metadata details.
If the Pre Built Flag status is set to Yes, then the relevant metadata details required to form
the rule query are stored in the database on saving the rule definition. When this rule is
executed, the database is accessed to form the rule query based on the stored metadata
details, thus ensuring performance enhancement during rule execution. For more information,
see Significance of Pre-Built Flag.

Query Optimization Settings

Merge Hints Specify the SQL Hint that can be used to optimize the Merge Query.
For example, "/*+ ALL_ROWS */".
In a Rule Execution, the Merge Query formed using the definition-level Merge Hint takes
precedence over the Global Merge Hint Parameters defined in the Optimization tab of the
System Configuration > Configuration window. If the definition-level Merge Hint is empty or
null, the Global Merge Hint (if defined) is included in the query.

Select Hints Specify the SQL Hint that can be used to optimize the Merge Query by
selecting the specified query.
For example, "SELECT /*+ IS_PARALLEL */".
In a Rule Execution, the Merge Query formed using the definition-level Select Hint takes
precedence over the Global Select Hint Parameters defined in the Optimization tab of the
System Configuration > Configuration window. If the definition-level Select Hint is empty or
null, the Global Select Hint (if defined) is included in the query.


Pre Script Refers to a set of semicolon (;) separated statements that need to
be executed before the Merge Query on the same connection object.
During Rule Execution, the Global Pre Script Parameters (defined in the Optimization tab of
the Configuration window) are added to a Batch, followed by the Rule definition-level Pre
Script statements, if provided during rule definition. However, it is not mandatory to have a
Pre Script at either the Global or the definition level.

Post Script Refers to a set of semicolon (;) separated statements that are to be
executed after the Merge Query on the same connection object.
During Rule Execution, the Global Post Script Parameters (defined in the Optimization tab of
the Configuration window) are added to a Batch, followed by the Rule definition-level Post
Script statements, if provided during rule definition. However, it is not mandatory to have a
Post Script at either the Global or the definition level.

Use ROWID Select this checkbox to create a Merge Statement based on ROWID
instead of Primary Keys.
During Rule Execution, ROWID is considered while creating the Merge Statement if the Use
ROWID checkbox is selected in either the Global Parameters (defined in the Optimization tab
of the Configuration window) or the Rule definition properties.
If the Use ROWID checkbox is not selected in either place, the flag is set to "N" and Primary
Keys are considered while creating Merge Statements.

6. Click OK. The properties are saved for the current Rule definition.
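The precedence and batching rules described in the table can be summarized in a short Python sketch. This is an illustration of the documented behavior, not product code:

```python
def effective_hint(definition_hint, global_hint):
    # A definition-level Merge/Select Hint takes precedence over the global one;
    # when it is empty or None, the global hint (if defined) is used instead.
    return definition_hint if definition_hint else global_hint

def script_batch(global_script, definition_script):
    # Pre/Post Scripts are semicolon-separated statements: global statements
    # are added to the batch first, followed by definition-level statements.
    # Neither script is mandatory.
    statements = []
    for script in (global_script, definition_script):
        if script:
            statements.extend(s.strip() for s in script.split(";") if s.strip())
    return statements
```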

8.2.2.1 Add Members to Filter


You can define filters for a rule definition such as Hierarchy, Filter-Data Element, Filter-Hierarchy, or
Filter Group.

NOTE In order to access the Filter Selector window and to select the
pre-defined filters, you must have the FILTERRULE function
mapped to your user role.

To create a filter for a rule:

1. Click Selector from the List grid and select Filter. The Filter Selector window is
displayed.


Figure 177: Filter Selector window

In case of Hierarchy and Data Element Filter, the List pane of the Filter Selector window displays
all members based on the selected Information Domain and Dataset. Filtering based on Dataset
is not supported for other Filters like Group, Hierarchy, and Attribute.
2. Select any of the following filters from the drop-down list in the Search in pane:
The following table describes the Member Types in the Search pane.

Table 72: Member Types in the Search pane and their Descriptions

Member Type Description

Hierarchy Hierarchy refers to the defined Business Hierarchies and lists all the UAM
Hierarchies (can be implicitly created UAM hierarchies for AMHM hierarchy)
pertaining to the selected dataset.

Filter-Data Element Data Element Filter is a stored rule that expresses a set of constraints. Only
columns that match the data type of your Data Element selection are offered in
the Data Element drop-down list.

Filter-Hierarchy Hierarchy Filter allows you to utilize rollup nodes within a Hierarchy to help
you exclude (filter out) or include data within an OFSAA rule.

Filter-Group Group Filters can be used to combine multiple Data Element Filters with a
logical "AND".

Filter-Attribute Attribute Filters are created using defined Attributes. Attribute filters facilitate
you to filter on one or more Dimension Type Attributes.

3. Select the checkbox adjacent to the members you want to select.

4. Click to move the selected members to the Selected Filters pane.


NOTE You can select a maximum of nine Filters for a Rule.

In the Filter Selector window, you can perform the following actions:

 To search based on a specific member type, select it from the drop-down list and click .
You can also modify your search criteria by specifying the nearest keyword in the like field.

 Click to view the details of a selected member.


 Click Ascending or Descending to sort the selected components in the ascending or
descending alphabetical order.

 Click or to re-arrange the selected list of members.

NOTE The re-ordering of hierarchies does not affect the resulting SQL
query.

 Click to remove selected members from the Selected Filters pane.


5. Click OK. The selected filters are listed in the Rule Definition (New Mode) window.

8.2.2.2 Add Hierarchies to Source


The Source and Target can be selected from the List grid.
To select the source for a Rule:

1. Click Selector button from the List grid and select Source. The Hierarchy Selector
window is displayed.

Figure 178: Hierarchy Selector window


The LHS pane of the Hierarchy Selector window displays the available hierarchies under the
selected Information Domain and Dataset.
2. Select the checkbox adjacent to the Hierarchies you want to select as Source.

3. Click to move the selected hierarchies to the Selected Hierarchies pane.

NOTE You can select a maximum of nine Sources for a Rule.

In the Hierarchy Selector window, you can:

 To search for a member, specify the nearest keyword and click .

 To view the details of a selected hierarchy, click .


 Click Ascending or Descending to sort the selected components in Ascending or
Descending alphabetical order.

 Select the hierarchy and click or button to re-arrange the order of hierarchies.

 Click to remove selected hierarchies from the Selected Hierarchies pane.


4. Click OK. The selected hierarchies are listed in the Rule Definition (New Mode) window.

8.2.2.3 Add Measures / Hierarchies to Target


To select the Target for a Rule in the Rule Definition (New Mode) window:

1. Click Selector from the List grid and select Target. The Measure Selector / Hierarchy
Selector window is displayed.
The Measure Selector or the Hierarchy Selector window is displayed depending on the type of
Rule you have selected, that is, Computation Rule or Classification Rule, respectively.
The LHS pane of the Measure Selector / Hierarchy Selector window displays the available
Measures / Hierarchies under the selected Information Domain and Dataset.
2. Select the checkbox(s) adjacent to the members you want to select as Target.

3. Click to move the selected measures to the Selected Measures / Selected Hierarchies pane.

NOTE Measures from different entities are not allowed as target
measures. You can select a maximum of five measures and a
single Hierarchy as the target.

In the Measure Selector / Hierarchy Selector window, you can:

 To search for a member, specify the nearest keyword and click .

 To view the details of a selected hierarchy, click .


 Click Ascending or Descending button to sort the selected components in Ascending or


Descending order.

 Click or button to re-arrange the selected list of members.

 Click to remove selected measures from the Selected Measures / Selected Hierarchies
pane.
4. Click OK. The selected members are listed in the Rule Definition (New Mode) window.
In the List grid you can also:

• Click Move to move a selected member between Filter, Source, or Target.

• Click Show Details to view the selected member details.


Once all the necessary information in the first window of the Rule Definition (New Mode) is populated,
click Next to navigate to the concurrent procedures of defining a Rule.

8.2.2.4 Hierarchical Member Selection


The second window of Rule Definition (New Mode) displays all the information you have provided in
the Linked to and Master info grids. You can view the filters you have selected in the Rule Condition
grid.

Figure 179: Rule Condition (New Definition) window

From the Rule Condition grid, you can apply conditions for each of the BMM hierarchy filters.

NOTE In the case of Data Element, Group, or Hierarchy filters, you can
only view the SQL query.

To apply a condition for a BMM hierarchy filter and view the SQL query in the Rule Condition grid:

1. Click button adjacent to the filter details. The Hierarchy Browser window is displayed.


Figure 180: Hierarchy Browser window

You can select (pagination) icon to view more options under the available member.

2. Select a member/node and click to select the same. Click to select the member as Self,
Self & Descendants, Self & Children, Parent, Siblings, Children, Descendants, or Last
Descendants. For more information, see Hierarchical Member Selection Modes.
In the Hierarchy Browser window you can also:

 Click to sort members based on the path.

 Click to sort hierarchy (top to bottom).

 Click to sort based on level.

 Click or to collapse or expand the members under a node respectively.

 Click or to collapse or expand the selected branch respectively.

 Click to focus only on the selected branch. The Available Values pane shows the

members of the selected branch only. Click to go back to normal view.

 Click to display member's numeric codes on the right. The icon changes to .

 Click to display member's numeric codes on the left. The icon changes to .

 Click to show only member names. This is the default view. The icon changes to .

 Click to display member's alphanumeric codes on the right. The icon changes to .

 Click to display member's alphanumeric codes on the left. The icon changes to .


 Click to display only member names. This is the default view. The icon changes to .

 Select a member and click or to re-arrange the members in the Selected Values
pane.

 Select a member and click to move it to the top or click to move it to the
bottom.

 Click to launch the Search panel. Here you can search based on Dimension Member
Numeric Code, Dimension Member Name or Dimension Member Alphanumeric Code.
You can also search in the grid based on member name using the Search field.

NOTE You can add up to 1000 members or nodes in the Selected
Members pane under the target hierarchy.

3. Click to view the filter details. The Preview SQL Query window is displayed with the
resultant SQL query.

8.2.2.5 Select Hierarchy Members of Source Hierarchy and Move Source to Slicer
The selected Source and Target Hierarchies are displayed under the Combination Mapper pane. You
can move the source Hierarchies from the Combination Mapper pane to Slicer.
To move a source Hierarchy from the Combination Mapper pane to the Slicer pane:
1. Click the Hierarchy member and drag it to the Slicer pane.

2. Click to select the members of a Hierarchy. The Hierarchy Browser window is displayed.
Whenever a Source/ Target hierarchy is selected, by default, the root node will appear in the
Selected Members pane without checking hierarchy member security.

NOTE The Hierarchy members that are mapped to your user group
are enabled to be used; those that are not mapped are
disabled.

For more details on the Hierarchy Browser window, see Hierarchy Browser.

3. Click . The CombiFilter Node Browser window is displayed.


Figure 181: CombiFilter Node Browser window

4. Select the checkbox adjacent to the member name and click OK.

8.2.2.6 Select Business Processor as Target


The Measures selected as target are displayed under the Target Page column in the Combination
Mapper pane. You can select the Business Processors (BP) from these Measures.

NOTE If you are not able to view the Combination Mapper pane
properly due to resolution issues, click Collapse View in the
Map toolbar.

To select the Business Processors from a Measure:

1. Click adjacent to the Measure displayed under the Target Page column. The Business
Processor Selector window is displayed.


Figure 182: Business Processor Selector window

2. Select the checkbox adjacent to the Business Processor name and click .
In the Business Processor Selector window, you can:

 Search for a Business Processor by specifying the nearest keyword and clicking .

 Click to view the details of a selected Business Processor.

 Click to define a new Business Processor. For more information see Create Business
Processor.
 Click Ascending or Descending to sort the selected components in the ascending or
descending order.

 Click or to re-arrange the selected list of Business Processors.

 Click to remove the selected Business Processors from the Selected Business Processors
pane.
3. Click OK. The selected Business Processors are listed under the Combination Mapper pane
along with the Source and Filter definition details.
After selecting Business Processor(s) in the Combination Mapper pane, you can set the Default
Target member, specify Parameters, and exclude child nodes for the Rule definition, if required.
 You can set the selected Target member as default by clicking on the header bar of the
required Business Processor and selecting the Default Member checkbox.
When a Target member is selected as default, all the unmapped Source member combinations
for that Target object will be logically mapped to the default member and the corresponding
target appears disabled. Runtime parameters cannot be applied for such defaulted target BPs.
However, the logical mappings will not overwrite the physical mapping.


 You can specify parameters for the selected Business Processor. Select the checkbox
adjacent to the required Business Processor and click the icon adjacent to the selected
checkbox. The Parameters pop-up is displayed.

NOTE A physical mapping is established when mapping is explicitly
done upon a combination of source and target members.

 For Classification Rules and Computation Rules with non-parameterized BP, the
Parameters pane is displayed as shown here:

Figure 183: Parameter pane

Enter the required note in the text field and click OK.
 For a Computation Rule with parameterized BP, the Parameters pop-up is displayed as
given.

Figure 184: Parameter pane

Enter the required note in the text field. The Parameter Default Value is fetched from
the Business Processor definition, and the Assign Value can be entered manually; it is
considered during Rule execution at runtime. You can also clear the Assign
Value field by clicking Clear Values. Click OK.
 You can exclude child node(s) in the Combination Mapper pane, if they are not
required in the Rule execution. Click (Exclude). The Rule Exclude window is
displayed.


NOTE The exclude icon is available only for the combinations with
physical mappings. When a default member is removed from
the target member, all logical mappings are removed retaining
only physical mappings.

Figure 185: Rule Exclude window

The Rule exclude window displays only the child nodes associated with a Parent node. Ensure
that the selected parent has associated child nodes and is not the default member in the target.
 Select the checkbox adjacent to Rule code that you want to exclude and click OK.
Once all the necessary details are entered, click Save. The Rule definition is saved with the
provided details and is displayed in the Rule window.
Note that the default version of a new Rule definition created by an authorizer is 0 and the one created by a non-authorizer is -1. For more details on versioning, see Rule Definition Versioning.
The Audit Trail section of the Rule Definition (New Mode) window displays metadata information about the Rule definition created. The User Comments section enables you to add or update additional information as comments.

8.2.3 View Rule Definition


You can view individual rule definition details at any given point.
To view the existing rule definition details in the Rule window:
1. Select the checkbox adjacent to the Rule Code whose details are to be viewed.

2. Click the View button in the List toolbar.


The Rule Definition (View Mode) window is displayed with all the details of the selected Rule.
Click Next and Back to navigate back and forth in the Rule Definition (View Mode) window.


8.2.4 Edit Rule Definition


You can modify all the details except ID, Code, Version, Active, and Type of a rule definition. An
authorizer must approve the modified rule. Otherwise, it will be in an inactive state.

NOTE When a hierarchy that is part of the default security mapper is used as a Source in a Rule definition, you must open the Hierarchy Browser window (from the second window of Rule Definition) and resave the selection of nodes based on the latest accessible members, in accordance with the default security mapper definition. This ensures that the Rule definition is executed based on the latest available hierarchy member security.

To modify an existing rule definition:


1. From the Rule window, select the checkbox adjacent to the Rule Code whose details are to be
updated.

2. Click Edit in the List toolbar. The Edit button is disabled if you have selected multiple rules.
The Rule Definition (Edit Mode) window is displayed.
3. Edit the rule details as required. For more information, see Create Rule.
4. Click Save to save the changes.

8.2.4.1 Rule Definition Versioning


For an authorizer:
When you create a new rule, its version is 0. When you edit an existing rule and save it, you are prompted to choose whether to save it as a new version. If you click Yes, the edited copy becomes the new version 0, and the previous version 0 is archived as maximum version + 1. If you click No, the existing rule is overwritten and the version remains unchanged.
For a non-authorizer:
When you create a new rule, its version is -1. Once the rule is approved by an authorizer, the version becomes 0. When you edit an existing rule and save it, you are prompted to choose whether to save it as a new version. If you click Yes, a new rule is created with version -1; once it is approved, its version becomes 0 and the previous version 0 is archived as maximum version + 1. If you click No, the existing rule is overwritten and the Active flag of the rule becomes N (which you can view from the Summary window); the version remains the same. Once the rule is approved, its Active flag changes to Y.

NOTE • The rule with version 0 is the latest one and it can have many versions, say 1 to n, where 1 is the oldest rule and n is the next to the latest.
• A rule with version -1 is always in an inactive state.


You can view all the versions of a particular rule by providing the rule’s name or code and clicking Search in the Search and Filter grid. (Ensure the Version field is cleared, since it is auto-populated with 0.)
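The version bookkeeping described above can be summarized in a short sketch. This is illustrative pseudologic only, not OFSAAI source code: the function names and the list-based history representation are assumptions made purely for clarity.

```python
# Illustrative sketch of the Rule versioning behavior described above.
# NOT OFSAAI code: names and data structures are assumptions.

def save_edit(history, is_authorizer, save_as_new_version):
    """Return the updated version history after saving an edited copy.

    `history` is a list of version numbers for one rule code, where 0 is
    the latest approved copy and 1..n are archived copies (1 = oldest).
    """
    if not save_as_new_version:
        return list(history)            # overwrite in place; version unchanged
    if is_authorizer:
        # Old version 0 is archived as max version + 1; the edited copy
        # becomes the new version 0 (auto-approved).
        archived = max(v for v in history if v >= 0) + 1
        return [0, archived] + [v for v in history if v > 0]
    # Non-authorizer: the edited copy is saved as version -1 (inactive)
    # until an authorizer approves it.
    return [-1] + list(history)

def approve(history):
    """Authorizer approval: the pending -1 copy becomes version 0 and the
    previous version 0, if any, is archived as max version + 1."""
    if -1 not in history:
        return list(history)
    approved = [v for v in history if v >= 0]
    if approved:
        archived = max(approved) + 1
        rest = [archived if v == 0 else v for v in history if v != -1]
    else:
        rest = []
    return [0] + rest
```

For example, an authorizer editing a rule whose history is [0, 1] and saving as a new version ends with [0, 2, 1]: the edited copy is version 0 and the old latest is archived as version 2.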

8.2.5 Copy Rule Definition


This feature enables you to quickly create a new rule definition based on an existing rule or by updating the values of the required rule.
To copy an existing rule definition:
1. From the Rule window, select the checkbox adjacent to the Rule Code whose details are to be
duplicated.

2. Click Copy in the List toolbar. The Rule Definition (Copy Mode) window is displayed. The Copy button is disabled if you have selected multiple Rules.
In the Rule Definition (Copy Mode) window, you can:
• Create a new Rule definition with existing variables. Specify a new Rule Code and Folder. Click Save.
• Create a new Rule definition by updating the required variables. Specify a new Rule Code, Folder, and update other required details. For more information, see Create Rule. Click Save.
The new Rule definition details are displayed in the Rule window. By default, version “0” is set if you have authorization rights; otherwise, the version is set to “-1”.

8.2.6 Authorize Rule Definition


A rule definition, when created or modified, must be approved by an authorizer. An authorizer can approve or reject a pre-defined rule definition listed within the Rule window. To approve or reject a rule in the Rule window, you need to have the Authorize role mapped to your user group.
If you are an authorizer, then all the Rule definitions created or modified by you are auto-approved and the Active status is set to Yes. Otherwise, the Active status is set to No and an authorizer needs to approve it to change the Active status to Yes.
To approve or reject a rule definition:
1. Select the checkbox(s) adjacent to the required Rule Code(s).
2. Do one of the following:
• To approve the selected rule definitions, click Authorize and select Approve.
• To reject the selected rule definitions, click Authorize and select Reject.
A rule is made available for use only after approval. For a rejected definition, a comment with the rejection details is added.

8.2.7 Export Rule to PDF


You can export single/multiple rule definition details to a PDF file.


To export the rule definition details in the Rule window:


1. Select the checkbox(s) adjacent to the Rule Code(s) you want to export.

2. Click the Export button in the toolbar and select PDF. The Export dialog is displayed.

Figure 186: Export dialog window

The Export dialog displays the Export Format, Definition Type, and the names of the Selected
Definitions.
3. Click Export. The process is initiated and is displayed in a pop-up specific to the current
download. Once the PDF is generated, you can open/save the file from the File Download dialog
box.
You can either save the file on the local machine or view the file contents in a PDF viewer. The
downloaded PDF displays all the details such as Linked to, Properties, Master information, Audit
Trail, List, Mapping Details, and Comments of all the Rule definitions selected.

8.2.8 Trace Rule Definition Details


You can trace the metadata details of individual rule definitions.
To trace the underlying metadata details of a rule definition in the Rule window:
1. Select the checkbox adjacent to the Rule Code whose details are to be traced.

2. Click the Trace Definition button in the toolbar.


The Trace Definition window is displayed with details such as the Traced Object (Name and Definition Type) and the Processes and Runs in which the selected Rule is used. In the Trace Definition window, you can also select an individual Process or Run and click Show Details to view the definition details.

8.2.9 Delete Rule Definition


You can remove rule definition(s) that are no longer required in the system by deleting them from the Rule window. However, this is a soft deletion only.
To delete a rule definition:
1. Select the checkbox(s) adjacent to the Rule Code(s) which you want to delete.

2. Click the Remove button in the toolbar.


3. Click OK in the information dialog to confirm the deletion.
An information dialog is displayed confirming the deletion of the rule definition(s) and requesting authorization for the deletion.

8.2.10 Backdated Execution


Backdated Execution refers to the provision for making retroactive changes to dimensional records after those records have already been processed and loaded into the warehouse. Support for backdated execution is currently limited to Rules and Data Transformation objects in OFSAAI.
Backdated execution of Rules in OFSAA normally works with the active records present within a dimension (data with the latest record indicator set to Y). With the addition of backdated execution support, however, Rule definitions have the flexibility to continue using the old version of the data (based on the start date and end date of the records) for historical reporting purposes, leaving the changed data in the new record to impact the fact data only from that point forward. This feature is an enhanced capability, and the existing way of working on active records continues. The following sections outline the changes and configurations required to implement this feature.

8.2.10.1 Configuration of Backdated Execution Parameters


To use the backdated execution feature, the first step is to identify the participating target hierarchies in a Rule definition and make an entry in the AAI_BACKDATED_EXEC_INFO table, which exists in the Config Schema of an OFSAA environment.
To perform the configuration, follow these steps:
1. Specify the Hierarchy details entry to be provided in the AAI_BACKDATED_EXEC_INFO table as tabulated.

Table 73: Hierarchy details entry to be provided in AAI_BACKDATED_EXEC_INFO

Column Name Description

V_METADATA_CODE The Hierarchy Code

V_APP_ID Application ID

V_METADATA_TYPE Specify the value as 3

V_INFODOM The Infodom Name

F_IS_RECORD_ACTIVE Specify the value as Y

F_EXECUTION_CRITERIA Specify the value as B

V_ENTITY_NAME The dimension on which the Hierarchy is defined

V_START_DATE_COLUMN_NAME The dimension table start date column name

V_END_DATE_COLUMN_NAME The dimension table end date column name

V_LRI_COLUMN_NAME The dimension table LRI column name

An example of the Hierarchy details entry to be provided in AAI_BACKDATED_EXEC_INFO is shown below:


Figure: Sample entries in AAI_BACKDATED_EXEC_INFO
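The sample entries appear only as a screenshot in the original guide. As an illustrative sketch, such an entry could be created with an INSERT statement like the following; the hierarchy code, application ID, infodom, and date/LRI column names are placeholder assumptions, not verified configuration values, so substitute the metadata of your own environment:

```sql
-- Illustrative only: every value below is a placeholder assumption.
INSERT INTO AAI_BACKDATED_EXEC_INFO (
    V_METADATA_CODE, V_APP_ID, V_METADATA_TYPE, V_INFODOM,
    F_IS_RECORD_ACTIVE, F_EXECUTION_CRITERIA, V_ENTITY_NAME,
    V_START_DATE_COLUMN_NAME, V_END_DATE_COLUMN_NAME, V_LRI_COLUMN_NAME
) VALUES (
    'HPRODTYPE', 'MY_APP', '3', 'MYINFODOM',
    'Y', 'B', 'DIM_STANDARD_PRODUCT_TYPE',
    'D_RECORD_START_DATE', 'D_RECORD_END_DATE', 'F_LATEST_RECORD_INDICATOR'
);
```

The dimension name and LRI column here mirror the hierarchy expression example in step 2 below; the start/end date column names are hypothetical.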

2. Metadata change in Business Hierarchy

The general recommendation has been to provide the expression in a hierarchy definition along with the Latest Record Indicator (LRI) flag. However, this restricts the usage of older records when such hierarchies are used in a Rule definition. Hence, for a Rule to support backdated execution, the underlying target hierarchy needs to be defined without any consideration of the LRI flag.
Example:
• Existing level Expression in a hierarchy (where LRI is used)
CASE
WHEN DIM_STANDARD_PRODUCT_TYPE.f_latest_record_indicator = 'Y' THEN
DIM_STANDARD_PRODUCT_TYPE.v_standard_product_type_code END
• Proposed level Expression in a hierarchy (where LRI is not used)
DIM_STANDARD_PRODUCT_TYPE.v_standard_product_type_code

NOTE The hierarchy definition requires a resave after this change. Use the Business Hierarchy Edit operation to do this.

3. Run Definition change

Since backdated execution is a runtime option, the flag indicated below needs to be selected to enable the system to recognize this mode of execution.

Figure 187: Backdated Execution Required checkbox


4. Enable the flag to adjust the Rule query to pick up the dimensional data where the MISDATE of execution falls between the start date and end date of the records present in the dimension table.

8.3 Process
A set of rules collectively form a Process. A process definition is represented as a Process Tree. The
Process option in the Rules Run Framework provides a framework that facilitates the definition and
maintenance of a process. By defining a process, you can logically group a collection of rules that
pertain to a functional process.
You can define a process with the existing metadata objects using a hierarchical structure, which
facilitates the construction of a process tree. A Process tree can have many levels and one or many
nodes within each level. Sub-processes are defined at level members and process hierarchy members
form the leaf members of the tree. See Process Hierarchy Members for more information.
Note the following:
• Precedence defined to each process determines the Process Initiation Sequence.
• If precedence is defined, the process execution (along with the associated Rules) happens based
on the precedence defined to each component.
• If no precedence is defined, all the processes within the process tree are initiated together in its
natural hierarchical sequence.
Consider the following illustration:
• If natural precedence is defined to the sub process SP1, process execution is triggered in the
sequence Rule 1 > SP1a > Rule 2 > SP1.
• If no precedence is defined, all the sub processes SP1, SP2, Rule 4, and Rule 5 are executed in
parallel.
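The precedence behavior illustrated above resembles a simple dependency-ordered scheduler: tasks whose predecessors have all finished are initiated together, and with no precedence defined every task starts in the first wave. The sketch below is purely illustrative (not OFSAAI code; the function name and data shapes are assumptions):

```python
# Minimal sketch of precedence-driven initiation: tasks with no
# predecessors start together; a task starts only after all of its
# predecessors finish. Illustrative only, not OFSAAI code.

def execution_waves(tasks, precedence):
    """Group tasks into waves that can be initiated in parallel.

    `precedence` maps a task to the set of tasks that must finish first.
    With an empty precedence map, every task lands in the first wave,
    i.e. all are initiated together.
    """
    remaining = set(tasks)
    done, waves = set(), []
    while remaining:
        wave = sorted(t for t in remaining
                      if precedence.get(t, set()) <= done)
        if not wave:            # cyclic dependency: nothing can start
            raise ValueError("cyclic precedence among: %s" % sorted(remaining))
        waves.append(wave)
        done |= set(wave)
        remaining -= set(wave)
    return waves
```

With the precedence from the illustration, the waves come out as Rule 1, then SP1a, then Rule 2, then SP1; with an empty precedence map, SP1, SP2, Rule 4, and Rule 5 all land in a single wave. This also mirrors why cyclic dependencies are rejected when setting precedence (see the note in Add Precedence for Selected Components).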


Figure 188: Business Scenarios Illustration

Further, the business may require simulating conditions under different business scenarios and
evaluate the resultant calculations with respect to the baseline calculation. Such simulations are done
through the construction of Processes and Process trees. Underlying metadata objects such as Rules,
T2T Definitions, Processes, and Database Stored Procedures drive the process functionality.
Concurrent Rule Execution
You can define a process to combine different computation/ classification rules for concurrent
execution by marking the process or sub process as executable.
Conditions for execution
• Rules defined on different datasets cannot be combined together
• The executable process or sub process should update the same FACT table
• Aggregation rules will be merged as separate rules for execution
The Roles mapped for the Process module are Process Access, Process Advanced, Process Authorize, Process Read Only, Process Write, and Process Phantom. Based on the roles mapped to your user group, you can access various screens in the Process module. For more information on functions mapped to these roles, see Appendix A.


Figure 189: Process window

The Process window displays the processes created in the current Information Domain with the
metadata details such as Code, Name, Folder, Version, and Active. For more information on how
object access is restricted, see Object Security.
You can search for specific Processes based on Code, Name, Folder, Version, or Active. The Folder
drop-down list displays all Public folders, shared folders to which your user group is mapped and
Private folders for which you are the owner. The Pagination option helps you to manage the view of
existing Processes within the system.

8.3.1 Create Process


You can build a process by adding one or more members called Process Nodes. If there are
Predecessor Tasks associated with any member, the tasks defined as predecessors precede the
execution of that member. The Write role should be mapped to your user group, from the User Group
Role Map window.
To define a process in the Process window:
1. Click the New button in the List toolbar. The Process Definition (New Mode) window is displayed.


Figure 190: Process Definition (New Mode) window

2. Click the button adjacent to the Folder field in the Linked to grid. The Folder Selector window is displayed. The folders to which your user group is mapped are displayed.
a. Select the checkbox adjacent to the required folder. Click OK.
b. Click New from the List toolbar to create a new folder/segment. For more information, see Segment Maintenance.
c. Search for a folder by specifying any keyword and clicking the Search button.

3. Enter the details of the Master information grid as tabulated:

Table 74: Fields in the Master Information pane and their Descriptions

Field Name Description

ID Refers to the default ID of a newly created process and is <<New>>.

Code Enter a valid code for the process. Ensure that the code is alphanumeric with a maximum of 30 characters in length and there are no special characters except underscore “_”.

Name Enter a valid name for the process. Ensure that the process name is alphanumeric and does not contain any of the following special characters: #, %, &, +, ", and ~.

Version By default, the version field is displayed as <<NA>> for the new process being created. Once the process definition is saved, an appropriate version is assigned as either -1 or 0 depending on the authorization permissions. For more information, see Process Definition Versioning.

Active By default, the Active field is displayed as <<NA>> for the new process being created. Once the process definition is saved, the status is set to Yes if you are an authorizer, or No if the created process needs to be authorized by an authorizer.

Type Select the process type, based on which you would like to create the rule, from the drop-down list.


Executable Select the checkbox if you want to bunch rule executions for concurrency. If you select the checkbox, you can add only Computation or Classification Rules as Components. For more information, see Concurrent Rule Execution.

Route Execution to High Precedence Node Select the checkbox if you want to route the execution of this Process definition to the high precedence node set up in the AM server.

4. Click Properties in the Master Information grid. The Properties window is displayed.

Figure 191: Properties window

You can edit the following tabulated details in the Properties window.

Table 75: Fields in the Properties window and their Descriptions

Field Name Description

Effective Start Date, Effective End Date Effective Dating is not implemented for process definitions.

Last Operation Type By default, this field displays the last change done to the process definition. While creating a process, the field displays the operation type as Created.

5. Click OK. The properties are saved for the current process definition.

8.3.1.1 Define Sub Process to Root


You can define sub processes to the base process being created or for a pre-defined sub process
under a base process.
This option is not available if you have selected the base process as executable. A process can have multiple executable sub processes; however, an executable sub process cannot have a sub process within it. It can have only computation/classification rules as components.


To create a sub process in the Process Definition (New Mode) window:

1. Click the Subprocess button. The Subprocess in ROOT window is displayed.

Figure 192: Subprocess in ROOT window

2. Enter the Subprocess Code. You cannot enter any special characters except underscore “_”.
3. Select the Executable checkbox to club the rules for concurrent execution. An executable sub process can have only Classification/Computation Rules.
4. Click OK.
The sub process is listed under the root process as a branch.

NOTE You can further create sub processes for the existing processes
or the base process by selecting the process and following the
aforementioned procedure; however, an executable sub
process cannot have a sub process within it.

8.3.1.2 Add Component to Base Process / Sub Process


You can add process components to the base process as well as the sub processes. For concurrent
rule execution, you should select only the rules that come under the Base Rules node. See Concurrent
Rule Execution for the conditions to select the rules.
To add the process components from the Process Definition (New Mode) window:
1. Select the process for which you want to add the component.

2. Click the Component button.


The Component Selector window is displayed.


Figure 193: Component Selector window

You can use the pagination icon to view more options under the available components. For more information, see Process Hierarchy Members.

3. Select a Process Component and click the move button to move the component to the Tasks In <Process Name> pane.
In the Component Selector window, you can also:
• Search for a component by specifying the nearest keyword in the Search field and clicking the Search button.
• Click Ascending or Descending to sort the selected components in ascending or descending alphabetical order.
• Click the up or down arrow buttons to move the selected components up or down.
• Click the button adjacent to the component name to add parameters for the selected components. The parameters must be specified in double quotes; for multiple parameters, specify the values separated by commas. For example, "value 1", "value 2".
• Click the remove button to remove the selected components from the Tasks In <Process Name> pane.

NOTE Sub processes listed in the Tasks In <Process Name> pane cannot be removed.

4. Click OK. The components are listed under the selected process.

8.3.1.3 Merging Rules for Concurrent Execution


After selecting Rules as components for concurrent execution, you can merge rules in a sub process to define them as a single logical rule.


To merge rules in a sub process:


1. From the Component Selector window, select the required rules.

Figure 194: Component Selector window

2. Select the rules to be merged and click Merge Rules.

NOTE You can merge only rules which are part of the same dataset.

3. Specify the sub process code. The Executable checkbox is selected by default and cannot be modified.
4. Click OK. The merged rules are placed under the new sub process.

8.3.1.4 Add Precedence for Selected Components


You can add precedence for the selected components in the Process Definition (New Mode) window.
Precedence can be defined for peer processes in a selected parent process.

NOTE Precedence cannot be set for the executable sub processes.

To add precedence for a selected component:


1. Select the process for whose components you want to set precedence.
2. Click the Precedence button. The Precedence Selector window is displayed.


Figure 195: Precedence Selector window

3. Select Auto Map to override the predefined precedence and to set predecessor tasks as
precedence.
4. To manually select predecessor tasks for a task:
• Select a task from the Tasks In <Process Name> drop-down list. The other tasks are listed in the Available Precedence pane.
• Select the tasks to set as predecessor tasks and click the move button. The selected tasks are listed in the Existing Precedence pane.

NOTE You cannot select tasks as predecessor tasks if they have cyclic
dependencies with the selected task.

In the Precedence Selector window, you can also:
• Click the Ascending or Descending button to sort the selected tasks in ascending or descending order.
• Click the up or down arrow buttons to move the selected tasks up or down.
• Click the remove button to remove selected tasks from the Existing Precedence pane.


5. Click OK. The precedence is set for the tasks in the selected process.


8.3.1.5 Move Tasks among Processes


You can move the tasks which have no dependency among different processes in the Process Definition (New/Edit Mode) window.
To move tasks:
1. Select the task to be moved, or the sub process under which it resides. The task or sub process details are displayed in the right pane.
2. Select the checkbox(s) adjacent to the tasks to be moved to a different process.

3. Click the Move button. The Move window is displayed.

Figure 196: Move window

4. Select the process/ sub process to which you want to move the task.
5. Click OK. The window is refreshed and the task is displayed under the selected process.

8.3.1.6 Remove Tasks from a Process


You can remove/delete the tasks which have no dependency from the Process Definition (New/Edit Mode) window.
To remove tasks:
1. Select the task to be removed, or the sub process under which it resides. The task or sub process details are displayed in the right pane.
2. Select the checkbox(s) adjacent to the tasks you want to remove.

3. Click Remove. The Warning dialog is displayed.


4. Click OK. The selected tasks are removed from the process.
In the Process Definition (New/Edit Mode) window, you can also view the details of a selected task by clicking the Show Details button.


Click Save. The process definition is saved with the provided details and is displayed in the Process
window.
Note that the default version of a new process definition created by an authorizer is 0 and the one created by a non-authorizer is -1. For more details on versioning, see Process Definition Versioning.
The Audit Trail section at the bottom of the Process Definition (New Mode) window displays metadata information about the Process definition created. The User Comments section enables you to add or update additional information as comments.


8.3.2 View Process Definition


You can view individual process definition details at any given point.
To view the existing process definition details in the Process window:
1. Select the checkbox adjacent to the Process Code whose details are to be viewed.

2. Click View in the List toolbar.


The Process Definition (View Mode) window is displayed with all the details of the selected
Process.

8.3.3 Edit Process Definition


You can modify all the details except ID, Code, Version, Active status, Executable flag, and Type of a Process definition. An authorizer needs to approve the modified process. Otherwise, it will be in an Inactive state.
To modify an existing process definition in the Process window:
1. Select the checkbox adjacent to the Process Code whose details are to be updated.

2. Click the Edit button in the List toolbar. The Edit button is disabled if you have selected multiple Processes. The Process Definition (Edit Mode) window is displayed.
3. Modify the process details as required. For more information, see Create Process.
4. Click Save to save the changes.

8.3.3.1 Process Definition Versioning


For an authorizer:
When you create a new process, its version is 0. When you edit an existing process and save it, you are prompted to choose whether to save it as a new version. If you click Yes, the edited copy becomes the new version 0, and the previous version 0 is archived as maximum version + 1. If you click No, the existing process is overwritten and the version remains unchanged.
For a non-authorizer:
When you create a new process, its version is -1. Once the process is approved by an authorizer, the version becomes 0. When you edit an existing process and save it, you are prompted to choose whether to save it as a new version. If you click Yes, a new process is created with version -1; once it is approved, its version becomes 0 and the previous version 0 is archived as maximum version + 1. If you click No, the existing process is overwritten, and the Active flag of the process becomes N (which you can view from the Summary window); the version remains the same. Once the process is approved, its Active flag changes to Y.

NOTE • The process with version 0 is the latest one and it can have many versions, say 1 to n, where 1 is the oldest process and n is the next to the latest.
• A process with version -1 is always in an Inactive state.


You can view all the versions of a particular process by providing the process’s name or code and clicking Search in the Search and Filter grid. (Ensure the Version field is cleared, since it is auto-populated with 0.)

8.3.4 Copy Process Definition


The Copy Process Definition feature enables you to quickly create a new process definition based on an existing process or by updating the values of the required process.
To copy an existing process definition in the Process window:
1. Select the checkbox adjacent to the Process Code whose details are to be duplicated.

2. Click the Copy button in the List toolbar to copy a selected process definition. The Process Definition (Copy Mode) window is displayed. The Copy button is disabled if you have selected multiple processes.
In the Process Definition (Copy Mode) window, you can:
• Create a new process definition with existing variables. Specify a new Process Code and Folder. Click Save.
• Create a new process definition by updating the required variables. Specify a new Process Code, Folder, and update other required details. For more information, see Create Process. Click Save.
The new process definition details are displayed in the Process window. By default, version 0 is set if you have authorization rights; otherwise, the version is set to -1.

8.3.5 Authorize Process Definition


A process definition, when created or modified, must be approved by an authorizer. An authorizer can approve or reject a pre-defined process definition listed within the Process window. To approve or reject process(es) in the Process window, you need to have the Authorize role mapped to your user group. If you are an authorizer, then all the process definitions created or modified by you are auto-approved and the Active status is set to Yes. Otherwise, the Active status is set to No and an authorizer must approve it to change the Active status to Yes.
1. Select the checkbox(s) adjacent to the required Process Code(s).
2. Do one of the following:

• To approve the selected process definitions, click Authorize and then click the Approve button.
• To reject the selected process definitions, click Authorize and then click the Reject button.
A process is made available for use only after approval. For a rejected definition, a comment with the rejection details is added.


8.3.6 Export Process to PDF


You can export single/multiple process definition details to a PDF file. To export the process definition
details in the Process window:
1. Select the checkbox(s) adjacent to the required Process Codes.

2. Click Export in the toolbar and select PDF. A confirmation message is displayed.
3. Click Yes to confirm. The Export Options window is displayed.

Figure 197: Export Options window

The Export Options window displays the Export Format, Definition Type, the names of the
Selected Definitions, and the Trace Options.
4. To select the Trace Options:
• Select the checkbox(s) adjacent to the available options.
• Click the move button. The selected options are displayed in the Selected Trace Options pane. You can also select a trace option and click the remove button to deselect it from the Selected Trace Options pane.
5. Click Export. The process is initiated and is displayed in a pop-up specific to the current
download. Once the PDF file is generated, you can open/ save the file from the File Download
window.


You can either save the file on the local machine or view the file contents in a PDF viewer. The
downloaded PDF displays all the details such as Linked to, Properties, Master info, Audit Trail, List,
Mapping Details, and Comments of all the Process definitions selected.

8.3.7 Trace Process Definition Details


You can trace the metadata details of individual process definitions. To trace the underlying metadata
details of a process definition in the Process window:
1. Select the checkbox adjacent to the Process Code whose details are to be traced.

2. Click the Trace Definition button in the toolbar.


The Trace Definition window is displayed with details such as the Traced Object (Name and Definition Type) and the other Processes and Runs in which the selected Process is used. You can also select an individual Process or Run and click Show Details to view the definition details.

8.3.8 Delete Process Definition


You can remove process definitions which are no longer required in the system by deleting them from
the Process window. Note that this is only a soft deletion.
To delete a process definition:
1. Select the checkbox(es) adjacent to the Process Code(s) whose details are to be removed.

2. Click Remove from the toolbar. An information dialog is displayed confirming the deletion of the
Process definition(s) and seeking authorization for the same.

3. Click OK in the information dialog to confirm the deletion.

8.4 Run
The Run feature in the Rules Run Framework helps you to combine various components and/or
processes together and execute them with different underlying approaches. Further, run conditions
and/or job conditions can be specified while defining a run.
Three types of runs can be defined, namely Base Run, Simulation Run, and Instance Run.
Base Run allows you to combine different rules and processes together as jobs and apply run
conditions and job conditions.
Simulation Run allows you to compare the resultant performance or calculations with respect to the
baseline runs by replacing an existing job with a simulation job (a job can be a rule or a process). This
comparison provides useful insights into the effect of anticipated changes to the business.
Instance Run allows you to combine Base Runs and Simulation Runs, in addition to other components
from multiple information domains, as Jobs. This eliminates the need for separate Run definitions
when some Jobs are available in a Hive Information Domain and others are present in an RDBMS
Information Domain.


The Roles mapped for the Run module are Run Access, Run Advanced, Run Authorize, Run Read Only,
Run Write, and Run Phantom. Based on the roles mapped to your user group, you can access various
screens in the Run module. For more information on functions mapped to these roles, see Appendix
A.

Figure 198: Run window

The Run window displays the runs created in the current Information Domain with the metadata
details such as Code, Name, Type, Folder, Version, and Active status. For more information on how
object access is restricted, see Object Security.
You can search for specific runs based on Code, Name, Folder, Version, Active status, or Type. The
Folder drop-down list displays all Public folders, shared folders to which your user group is mapped,
and Private folders for which you are the owner. The Pagination option helps you to manage the view
of existing runs within the system.

8.4.1 Create Run


You can create run definitions using the existing metadata objects. The various components that can
be used to form run definitions are listed in Process Hierarchy Members. The Write role should be
mapped to your user group in the User Group Role Map window.
The following table describes the filter conditions that can be applied to a run definition:


Table 76: Condition Types in the Create Run and their Descriptions

Run Condition: A Run Condition is defined as a filter, and all hierarchies (defined in the current
information domain) are available for selection. You can select up to 9 run conditions. A Run
condition is defined for all Jobs, but it is applied to a Job only if the underlying target/destination
entities of both the Job and the Hierarchy are common.

Job Condition: A Job Condition is a further level of filter that can be applied at the component
level. This is achieved through a mapping process by which you can apply a Job Condition to the
required job. You can select only one Job Condition, and a hierarchy that you have already selected
as a run condition cannot be selected as the Job Condition again.

NOTE Filter conditions are not applicable for Instance Runs.
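As an illustration of the applicability rule above, the check that a Run condition applies to a Job only when the Job's target/destination entities and the Hierarchy's entity overlap could be sketched as follows (the function and entity names are hypothetical, not part of the product):

```python
# Illustrative check of whether a Run condition applies to a Job, per the rule
# above: the condition is applied only when the Job's target/destination
# entities and the Hierarchy's entities have something in common.
# The entity names below are invented sample data.

def condition_applies(job_target_entities: set, hierarchy_entities: set) -> bool:
    """True when the Job and the Hierarchy share at least one entity."""
    return bool(job_target_entities & hierarchy_entities)

print(condition_applies({"FCT_ACCOUNT_SUMMARY"}, {"FCT_ACCOUNT_SUMMARY"}))  # True
print(condition_applies({"FCT_ACCOUNT_SUMMARY"}, {"DIM_PRODUCT"}))          # False
```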

To create a run definition in the Run window:


1. Click New from the toolbar. The Run Definition (New Mode) window is displayed.

Figure 199: Run Definition (New Mode) window

2. Click the browse button adjacent to the Folder field in the Linked to grid. The Folder Selector
window is displayed, listing the folders to which your user group is mapped.
a. Select the checkbox adjacent to the required folder and click OK.


b. Click New from the List toolbar to create a new folder/segment. For more information,
see Segment Maintenance.

c. Search for a folder by specifying any keyword and clicking the Search button.


3. Enter the details in the Master information grid. The following table describes the fields in the
Master information pane.

Table 77: Field Names in the Master information pane and their Descriptions

ID: Refers to the system-generated ID for a newly created run. When you create a run, it is
displayed as <<New>>.

Code: Enter a valid code for the run. Ensure that the code value specified is a maximum of 30
characters in length and does not contain any special characters except "_". The code is unique
and case sensitive, and is used to identify a run definition during execution.
Note: You cannot use the same code as a run which has been deleted from the UI.

Name: Enter a valid name for the run. Ensure that the Run Name is alphanumeric and does not
contain any of the following special characters: #, %, &, +, ", and ~. Note that the name is not
required to be unique.

Version: By default, the Version field is displayed as <<NA>> for the new run being created. Once
the run definition is saved, an appropriate version is assigned as either -1 or 0 depending on the
authorization permissions. For more information, see Run Definition Versioning.

Active: By default, the Active field is displayed as <<NA>> for the new run being created. Once the
run definition is saved, the status becomes Yes if you are an authorizer, or No if the created Run
needs to be authorized by an authorizer.

Type: Select the type of the run from the drop-down list. The available types are Base Run,
Simulation Run, and Instance Run.

Route Execution to High Precedence Node: Select the checkbox if you want to route the execution
of this Run definition to the high precedence node set up in the AM server.
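The Code and Name rules in the table above can be expressed as simple validation checks. The following sketch is illustrative only; the helper names are hypothetical and the actual validation is performed by the application:

```python
import re

# Hypothetical helpers illustrating the Code and Name rules from Table 77.
# The rule set comes from the guide; the function names are illustrative only.

CODE_PATTERN = re.compile(r"^[A-Za-z0-9_]{1,30}$")  # max 30 chars, only "_" allowed
NAME_FORBIDDEN = set('#%&+"~')                      # characters the Name must not contain

def is_valid_run_code(code: str) -> bool:
    """Code: up to 30 characters, no special characters except underscore."""
    return bool(CODE_PATTERN.match(code))

def is_valid_run_name(name: str) -> bool:
    """Name: non-empty text that avoids #, %, &, +, double quote, and ~."""
    return bool(name) and not (set(name) & NAME_FORBIDDEN)

print(is_valid_run_code("RUN_BASEL_2024"))   # True
print(is_valid_run_code("RUN#1"))            # False - '#' is not allowed
print(is_valid_run_name("Quarterly Run"))    # True
print(is_valid_run_name('Run "final"'))      # False - contains double quotes
```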

4. Click Properties in the Master information grid. The Properties window is displayed.


Figure 200: Properties window

You can edit the following tabulated details in the Properties window:

Table 78: Fields in the Properties window and their Descriptions

Effective Start Date, Effective End Date: Effective Dating is not implemented for the Run
definition.

Last Operation Type: By default, this field displays the last change done to the run definition.
While creating a run, the field displays the operation type as Created.

5. Click OK. The properties are saved for the current Run definition.

8.4.1.1 Select Run Condition for Run


You can select conditions to preset the initialization mechanism of a run definition.

NOTE Run Condition is not applicable for Instance Run.

To select a condition for a run in the Run Definition (New Mode) window:

1. Click Selector from the List toolbar and select Run Condition. The Filter Selector window
is displayed.


Figure 201: Filter Selector window

You can click the pagination icon to view more options under the available components. The
List pane displays Hierarchies or Filters based on the option selected in the drop-down list in
the Search in pane. The options are:
 Hierarchy - Displays all Business Hierarchies defined in the information domain.
 Filter-Data Element - Displays all Data Element Filters defined in the information domain.
 Filter-Hierarchy - Displays all Hierarchy Filters defined in the information domain.
 Filter-Group - Displays all Group Filters defined in the information domain.
 Filter-Attribute - Displays all Attribute Filters defined in the information domain.
2. Select the checkbox adjacent to the Hierarchy or Filter that you want to select as the Run
condition and click the move button.
To know about the operations you can perform in this window, see the Filter Selector
window.
3. Click OK. The selected Hierarchies are listed in the Run Definition (New Mode) window.
4. If the selected Run condition is a Parent Child hierarchy, the Use Descendants checkbox is
displayed. If the checkbox is selected for a hierarchy, the descendants will be automatically
applied and need not be selected in node selection from the Hierarchy Browser window.

8.4.1.2 Select Jobs for Run


You can select the required jobs for the run definition being created.
To select jobs for Base and Simulation Run:

1. Click Selector from the List toolbar and select Job. The Component Selector window is
displayed.


Figure 202: Component Selector window

In the List pane, click the expand button to expand the members and view the job
components. For more information, see Process Hierarchy Members.

2. Select a job component and click the move button to move the component to the Tasks pane.

NOTE You cannot select different Jobs with the same unique code in a
run definition. In such cases, the Jobs should be added to a
process and the process should be added to the run definition.

In the Job Selector window, you can also:

 Search for a component by specifying the nearest keyword and clicking the Search button.
The search may not return results if the branch of that component has not been expanded.
 Click the Ascending or Descending button to sort the selected components in ascending or
descending alphabetical order.

 Click the up or down arrow buttons to re-order the selected components.

 Click the parameters button to add parameters for the selected components.

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 395


RULE RUN FRAMEWORK
RUN

NOTE Parameters can be given in the format "param1","param1VALUE"
or "$PARAM2","param2VALUE". Single quotes should not be
used.
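As a sketch of the format described in the note above, a parameter string could be assembled as follows (the helper is hypothetical; the framework only requires that the final text follow the "name","value" convention without single quotes):

```python
# Illustrative helper for composing the component parameter string in the
# "name","value" format noted above. The names and values are examples only.

def build_param_string(params: dict) -> str:
    """Render parameters as "name","value" pairs, using double quotes only,
    since the guide states that single quotes must not be used."""
    for key, value in params.items():
        if "'" in key or "'" in str(value):
            raise ValueError("single quotes are not permitted in parameters")
    return ",".join(f'"{k}","{v}"' for k, v in params.items())

print(build_param_string({"param1": "param1VALUE"}))
# "param1","param1VALUE"
print(build_param_string({"$PARAM2": "param2VALUE"}))
# "$PARAM2","param2VALUE"
```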

 Click the remove button to remove the selected components from the Tasks pane.


3. Click OK. The components are listed under the List pane in the Run Definition window.
To select Jobs for an Instance Run:

1. Click Selector from the List toolbar and select Job. The Component Selector window is
displayed.

Figure 203: Component Selector window

For Instance Run, you can add Base Run and Simulation Run as Jobs.
2. From the Infodom drop-down list, select the information domain in which the job component
you want to add is present. By default, the selected Application's Information Domain is displayed.
The drop-down list displays all information domains to which your user group is mapped, except
sandbox information domains.


3. Select a job component and click the move button to move the component to the Tasks pane.
 If you want to add a job component from another information domain, select the required
information domain from the drop-down list. The Component list refreshes and you can
add the required Job components.
 For more information, see Job Selector.
4. Click OK. The components are listed under the List pane in the Run Definition window.

8.4.1.3 Select Job Condition for Run


You can select only a single job condition for the execution of predefined jobs in a run. A hierarchy,
which is already selected as a run condition, cannot be selected as a job condition.

NOTE The Cumulative Node Expression for Hierarchy Nodes used as
a Job Condition in a Run definition should not exceed 4000
characters. If it does, you will get an error while executing the
Run definition.
Job Condition is not applicable for Instance Run.

To select the job condition for a run:

1. Click Selector from the List toolbar and select Job Condition. The Filter Selector window
is displayed.
2. Select the checkbox adjacent to the hierarchy that you want to select as the Job condition and
click the move button.
To know about the operations you can perform in this window, see the Filter Selector window.

NOTE Ensure that you have selected only one Job Condition and the
same hierarchy is not selected as both Run and Job conditions.

3. Click OK.
From the List grid in the Run Definition (New Mode) window, you can also:

 Click Move to change a selected run condition to a job condition and vice versa. For
Instance Run, Move is disabled.

 Click Show Details to view the metadata information of the selected member.
 If the selected Job condition is a Parent Child hierarchy, the Use Descendants checkbox is
displayed. If the checkbox is selected for a hierarchy, the descendants will be automatically
applied and need not be selected in node selection from the Hierarchy Browser window.
Once all the necessary information in the first window of the Run Definition (New Mode) is populated,
click Next to navigate to the second window of the Run Definition (New Mode) window.


Figure 204: Run Definition (New Mode) window

The second window of the Run Definition (New Mode) window displays all the information you have
provided in the Linked to and Master information grids. You can view the selected filters in the Run
Condition grid, and the selected jobs along with the job condition in the Detail Information grid, in
case of a Base Run or Simulation Run. For an Instance Run, only jobs are displayed.
When you expand a job which is a process, the Object, Parent Object, Precedence, and Type columns
are populated.

8.4.1.4 Hierarchical Member Selection


In the Run Condition grid, you can modify the run conditions by including hierarchical members.

NOTE This option will be available only if you have selected Hierarchy
as the Run condition.

To modify a run condition:


1. Click the browse button corresponding to the run condition you want to modify. The Hierarchy
Browser window is displayed.


Figure 205: Hierarchy Browser window

2. Select a member/node and click the move button to select it. Alternatively, use the selection
mode button to select the member as Self, Self & Descendants, Self & Children, Parent, Siblings,
Children, Descendants, or Last Descendants. For more information, see Hierarchical Member
Selection Modes.
In the Hierarchy Browser window, you can also:

 Sort members based on the path.

 Sort the hierarchy (top to bottom).

 Sort based on level.

 Collapse or expand the members under a node.

 Collapse or expand the selected branch.

 Focus only on the selected branch, so that the Available Values pane shows the members of
the selected branch only, and then go back to the normal view.

 Display the member's numeric codes on the right or on the left of the member names, or
show only member names (the default view).

 Display the member's alphanumeric codes on the right or on the left of the member names,
or show only member names (the default view).

 Select a member and use the up or down arrow buttons to re-arrange the members in the
Selected Values pane, or move the member to the top or the bottom of the list.

 Launch the Search panel, where you can search based on Dimension Member Numeric Code,
Dimension Member Name, or Dimension Member Alphanumeric Code. You can also search in
the grid based on member name using the Search field.

3. Click the preview button corresponding to the run condition to view the SQL query. The SQL
query is formed based on the hierarchical member selection mode, and the Preview SQL Query
window is displayed with the resultant SQL equivalent of the run condition.
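As an illustration of how a hierarchical member selection might resolve to SQL (the actual query is generated by the framework from the hierarchy metadata; the table, column, and member codes below are invented):

```python
# Hypothetical sketch of resolving a hierarchical member selection into a SQL
# filter. The framework derives the real query from the hierarchy metadata;
# everything named here is sample data for illustration.

def build_run_condition_sql(column: str, selected_codes: list,
                            include_descendants: bool,
                            descendants_by_code: dict) -> str:
    """Expand the selection (optionally with descendants) into an IN clause."""
    codes = list(selected_codes)
    if include_descendants:
        for code in selected_codes:
            codes.extend(descendants_by_code.get(code, []))
    in_list = ", ".join(f"'{c}'" for c in dict.fromkeys(codes))  # de-duplicate, keep order
    return f"{column} IN ({in_list})"

sql = build_run_condition_sql(
    column="DIM_PRODUCT.V_PROD_CODE",
    selected_codes=["LOANS"],
    include_descendants=True,  # mirrors the Use Descendants checkbox
    descendants_by_code={"LOANS": ["HOME_LOANS", "AUTO_LOANS"]},
)
print(sql)  # DIM_PRODUCT.V_PROD_CODE IN ('LOANS', 'HOME_LOANS', 'AUTO_LOANS')
```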
The Detail Information grid displays the jobs and the job condition defined for the run definition.

 Click the arrow buttons adjacent to the job names to re-order the selected jobs.

 Click the button beside the job condition to launch the Hierarchy Browser window. This
option is available only if a Hierarchy is selected as the Job condition.
 Select the checkbox corresponding to a job if you want to apply the Job condition to that
job.
 Click a job to view its definition details. For example, if it is a Rule, the Show Details window
displays the Rule Definition (View Mode) window.
You can click Back to navigate back to the first page of the Run Definition (New Mode) window
to modify any details.
Once all the necessary details are entered, click Save. If you are an authorizer, the version of the
run definition will be 0; otherwise, it will be -1.
The Audit Trail section at the bottom of Run Definition (New Mode) window displays metadata
information about the Run definition created. The User Comments section facilitates you to add
or update additional information as comments.

8.4.2 View Run Definition


You can view individual run definition details at any given point. To view the existing Run definition
details in the Run window:


1. Select the checkbox adjacent to the Run Code whose details are to be viewed.

2. Click View in the List toolbar.


The Run Definition (View Mode) window is displayed with all the details of the selected Run.
Click Next and Back buttons to navigate back and forth in the Run Definition (View Mode)
window.

8.4.3 Edit Run Definition


You can modify all the details except ID, Code, Version, Active Status, and Type of a run definition. To
modify an existing run definition in the Run window:
1. Select the checkbox adjacent to the Run Code whose details are to be updated.

2. Click Edit in the List toolbar. The Edit button is disabled if you have selected multiple Runs. The
Run Definition (Edit Mode) window is displayed.
3. Edit the Run details as required. For more information, see Create Run.
4. Click Save to save the changes.

8.4.3.1 Run Definition Versioning


For an authorizer:
When you create a new run, its version will be 0. When you edit an existing run and try to save it, you
are prompted whether to save it as a new version. If you click Yes, a new Run is created with version 0,
and the Run that previously had version 0 is saved with version maximum version + 1. If you click No,
the existing run is overwritten and the version remains as it is.
For a non-authorizer:
When you create a new run, its version will be -1. After the Run is approved by an authorizer, the
version becomes 0. When you edit an existing Run and try to save it, you are prompted whether to
save it as a new version. If you click Yes, a new Run is created with version -1. Once the Run is
approved, its version becomes 0 and the Run that previously had version 0 is saved with version
maximum version + 1. If you click No, the existing Run is overwritten, and the Active flag of the Run
becomes N (which you can view from the Summary window); the version remains the same. After the
Run gets approved, its Active flag changes to Y.

NOTE • The run with version 0 is the latest one, and it can have
many older versions, say 1 to n, where 1 is the oldest Run
and n is the next to latest.
• A run with version -1 will always be in an Inactive state.

You can view all the versions of a particular run by providing the run's name or code and clicking
Search in the Search and Filter grid. (Ensure the Version field is cleared, since it is auto-populated with
0.)
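The versioning rules described above can be modeled as follows. This is only a sketch of the behavior stated in this section, not the framework's implementation:

```python
# Sketch of the versioning behavior described above, modeled from the prose
# only. Version 0 is the latest approved copy; -1 is pending authorization.

def save_edited_run(versions: list, save_as_new: bool, is_authorizer: bool) -> list:
    """versions: list of version numbers for one run code.
    Returns the updated version list after a save."""
    if not save_as_new:
        return versions  # the existing run is overwritten; version unchanged
    new_version = 0 if is_authorizer else -1  # -1 stays Inactive until approved
    # the copy that was version 0 is renumbered to maximum version + 1
    renumbered = [max(v for v in versions if v >= 0) + 1 if v == 0 else v
                  for v in versions]
    return renumbered + [new_version]

print(save_edited_run([0], save_as_new=True, is_authorizer=True))    # [1, 0]
print(save_edited_run([0, 1], save_as_new=True, is_authorizer=True)) # [2, 1, 0]
```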


8.4.4 Copy Run Definition


This option facilitates you to quickly create a new run definition based on an existing run by updating
the values of the required fields.
To copy an existing Run Definition in the Run window:
1. Select the checkbox adjacent to the Run Code whose details are to be duplicated.

2. Click Copy in the List toolbar to copy a selected Run definition. The Run Definition (Copy
Mode) window is displayed. The Copy button is disabled if you have selected multiple Runs.
In the Run Definition (Copy Mode) window you can:
 Create a new Run definition with existing variables. Specify a new Run Code and Folder.
Click Save.
 Create a new Run definition by updating the required variables. Specify a new Run Code,
Folder, and update other required details. For more information, see Create Run. Click
Save.
The new Run definition details are displayed in the Run window. By default, version 0 is set if you have
authorization rights, else the version is set to -1.

8.4.5 Authorize Run Definition


All the actions on a run definition should be approved by an authorizer. An authorizer can approve a
predefined Run definition for further execution or reject an inappropriate Run definition listed in
the Run window. To approve or reject Run definitions in the Run window, you need to have the
Authorize role mapped to your user group.
If you are an authorizer, the Run definition is auto-approved as you save it and the Active status is set
to Yes. Otherwise, the Active status is set to No and an authorizer needs to approve it to change the
Active status to Yes.
To approve or reject Runs:
1. Select the checkbox(es) adjacent to the required Run Codes.
2. Do one of the following:

 To approve the selected run definitions, click Authorize and select Approve.

 To reject the selected run definitions, click Authorize and select Reject.
A run is made available for use only after the approval. For a rejected definition, a comment with the
rejection details is added.

8.4.6 Export Run to PDF


This option allows you to export multiple run definitions to a PDF file. You have the option to export
only the rules or processes in the run definition to PDF by selecting the required Trace Options. In case
of Instance Run, you can select Runs that you want to export, apart from Rules and Processes.
To export the run definitions in the Run window:


1. Select the checkbox(es) adjacent to the required Run Codes.

2. Click the Export button in the List toolbar and click the PDF button in the pop-up. The
Export dialog is displayed.

Figure 206: Export window

The Export window displays the Export Format, Definition Type, the names of the Selected
Definitions, and the Trace Options.
 Select the checkbox adjacent to Rule or Process if you want to export only the Rule details or
Process details respectively. If you do not select any checkbox, all details of the selected run
definitions will be exported.

 Click the move button. The selected options are displayed in the Selected Trace Options pane. You can
also select a trace option and click the remove button to deselect it from the Selected Trace Options pane.
3. Click Export. The export process is initiated and its progress is displayed in a pop-up specific to the current
download. Once the PDF is generated, you can open or save the file from the File Download
dialog.
You can either save the file on the local machine or view the file contents in a PDF viewer. The
downloaded PDF displays all the details such as Linked to, Properties, Master info, Audit Trail, List, and
Comments of all the Run definitions selected.


8.4.7 Fire Run


This feature facilitates you to execute a previously created Run. You can execute the run definition as
a batch from the Operations module.
To execute a run definition:

1. Select the checkbox adjacent to the Run Code which you want to execute and click Fire
Run in the List toolbar. The Fire Run window is displayed.
2. Enter the field details as described in the following table.

Table 79: Fields in the Fire Run window and their descriptions

Name: This field displays the name of the selected run.

Request Type: Select the request type either as Single or as Multiple from the drop-down list.
Single Request - You need to provide the MIS Date during Batch execution from the Operations
module. Multiple Request - You can run the batch with the same MIS date multiple times from the
Operations module.

Batch: Select the Batch either as Create or as Create & Execute from the drop-down list. Create -
The batch will be created and needs to be executed from the Operations module. Create & Execute -
The batch will be created and executed; you can monitor it from the Operations module.

MIS Date: Select the MIS Date for execution. For the Request Type Single with Batch mode as
Create, the MIS Date is optional.

Wait: Select Yes and provide the Duration in seconds after which the run definition should be
executed, or select No to execute it immediately.

Execution Description: (Optional) A detailed execution description. You can enter a maximum of
2000 characters.

Parameters: Enter the required parameters in the field provided. The parameters provided in this
field are considered for Run execution.

Filters: Enter the filter details in the field provided. The filters provided in this field are considered
for Run execution.

3. Click OK. The details are saved and the run definition is executed as per the Fire Run details. For
information on runtime parameters supported during run execution, see the Passing Runtime
Parameters section.


8.4.8 Delete Run Definition


You can remove Run definitions which are no longer required in the system by deleting them from the
Run window. Note that this is only a soft deletion; an authorizer has to approve the deletion.
1. Select the checkbox(es) adjacent to the Run Codes whose details are to be removed.

2. Click Remove from the List toolbar. An information dialog is displayed confirming the deletion
of the Run definitions and seeking authorization for the same.

3. Click OK in the information dialog to confirm the deletion.

8.5 Manage Run Execution


Manage Run Execution enables you to have a workflow for Run execution. The predefined Run
definitions can be executed in a unique batch depending on the Type of the Manage Run Execution
defined. These batches can then be executed from the Operations module.
The Roles mapped for the Manage Run Execution module are: Manage Run Access, Manage Run
Advanced, Manage Run Authorize, Manage Run Read Only, Manage Run Write, and Manage Run
Phantom. Based on the roles mapped to your user group, you can access various screens in the
Manage Run Execution module. For more information on functions mapped to these roles, see
Appendix A.

Figure 207: Manage Run Execution window

The Manage Run Execution window displays the Run Execution requests created in the current
Information Domain with the metadata details such as Run Name, Run Execution Description, Run
Execution ID, Type, MIS Date, and Request Status. If Object Security is implemented, see the Object
Security section to understand the behavior.
You can also search for specific Runs based on Run Name, Run Execution Description, MIS Date, Run
Execution ID, Type, or Request Status. The Pagination option helps you to manage the view of existing
Runs within the system.

8.5.1 Creating Manage Run Definition


You can create Manage Run Definitions from the Manage Run Execution window. The Write role
should be mapped to your user group in the User Group Role Map window.
To create a Manage Run Definition:
1. Click the New button in the List toolbar. The Manage Run Definition (New Mode) window is
displayed.

Figure 208: Manage Run Definition (New Mode) window

2. Click the browse button adjacent to the Run field. The Run Selector window is displayed.

a. Click the view button to view the details of the selected Run definition.

b. Search for a Run definition by specifying any keyword and clicking the Search button.
c. Select the checkbox adjacent to the Run definition you want to select and click OK.
The selected Run is displayed in the Run field, along with the Run ID.

3. Click the view button adjacent to the Run field to view the details of the selected Run.

4. Enter the details in the Master Information and Execution Details grids. The following table
describes the fields in the Master Information and Execution Details grids.


Table 80: Fields in the Master Information and Execution Details pane and their Descriptions

Master Information grid

Run Execution ID: The default ID of a newly created Run Execution is <<New>>.

Run Execution Code: Enter a valid Run Execution Code. Ensure that the Run Execution Code
specified is a maximum of 30 characters in length and does not contain any special characters
except "_".

Run Execution Name: Enter the name of the Run Execution. Ensure that the Run Execution Name is
alphanumeric and does not contain any of the following special characters: #, %, &, +, ", ~, and '.

Type: Select the type of the Run Execution either as Single Request or as Multiple Request.
Single Request - You need to provide the MIS Date during Batch execution from the Operations
module. Multiple Request - You can run the batch with the same MIS date multiple times from the
Operations module.

Execution Details grid

Execution ID: The default Execution ID of a newly created Run Execution is <<NA>>.

Request Status: Select the request status either as Open or as Closed. Status Open creates a
Manage Run definition. Status Closed creates a Manage Run definition along with a Batch.

MIS Date: This field is displayed only if you have selected Type as Multiple Request. MIS Date
refers to the date with which the data for the execution would be filtered. Click the calendar icon to
display the Calendar and select the MIS Date.

Execution Status: The default Execution Status of a newly created Run Execution is <<NA>>.

5. Click Save. The Run Execution is saved and a confirmation dialog appears. For information on
runtime parameters supported during Manage Run Execution, see the Passing Runtime
Parameters section.
The Audit Trail section at the bottom of the Manage Run Definition (New Mode) window displays
metadata information about the Manage Run definition created. The User Comments section
facilitates you to add or update additional information as comments.


8.5.1.1 Passing Runtime Parameters


The following runtime parameters are supported during run execution:
• $RUNID
• $PHID
• $EXEID
• $RUNSK
• $MISDATE
• $BATCHRUNID
Values for the runtime parameters are implicitly passed while executing the Run definition.
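As an illustration, implicit substitution of these parameters into an expression might look like the following sketch (the parameter values shown are sample data supplied here for demonstration; the framework supplies the real values at execution time):

```python
# Illustrative substitution of the implicit runtime parameters listed above.
# All values in this mapping are invented sample data.

RUNTIME_PARAMS = {
    "$RUNID": "1001",
    "$PHID": "2001",
    "$EXEID": "EXEC_001",
    "$RUNSK": "15",
    "$MISDATE": "20240430",
    "$BATCHRUNID": "INFODOM_20240430_1",
}

def resolve_runtime_params(expression: str, params: dict = RUNTIME_PARAMS) -> str:
    """Replace each $-parameter token with its runtime value."""
    # Replace longer names first so a shorter name never clobbers a longer one
    for name in sorted(params, key=len, reverse=True):
        expression = expression.replace(name, params[name])
    return expression

print(resolve_runtime_params("N_MIS_DATE = '$MISDATE' AND N_RUN_SKEY = $RUNSK"))
# N_MIS_DATE = '20240430' AND N_RUN_SKEY = 15
```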

8.5.2 Viewing Manage Run Definition


You can view individual Manage Run definition details at any given point. To view the existing Manage
Run definition details in the Manage Run Execution window:
1. Select the checkbox adjacent to the Run Name whose details are to be viewed.

2. Click View in the List toolbar.


The Manage Run Execution Definition (View Mode) window is displayed with all the details of
the selected Manage Run Definition.

8.5.3 Editing Manage Run Definition


You can modify the Run Execution Description and Request Status details of a Manage Run definition.
To modify an existing Manage Run definition in the Manage Run Execution window:
1. Select the checkbox adjacent to the Manage Run Definition name whose details are to be
updated.

2. Click Edit in the List toolbar. The Edit button is disabled if you have selected multiple Manage
Run Definitions.
The Manage Run Definition (Edit Mode) window is displayed.
3. Edit the Manage Run definition details as required.
For more information, see Manage Run Definition.
You can select the Request Status as Open, Closed, To be Deleted, or Final depending on the current status of the definition:
• Status Open creates or updates a Manage Run definition.
• Status Closed creates a Manage Run definition along with a Batch.
• Status To be Deleted indicates the Manage Run definition is marked for deletion.
• Status Final indicates the Manage Run definition was successfully executed with the expected results.


The Execution Status field displays the current execution status of a triggered Run as Success,
Failure, or Ongoing and <<NA>> for a non-executed Run.
4. Click Save to save the changes.


8.6 Utilities
This section consists of information related to the utilities available in the Rules Run Framework
module of OFSAAI.

8.6.1 Component Registration


The Component Registration section allows you to add components by defining certain parameters in
the Component Registration window.

NOTE Before you begin, ensure that you have registered all the
required components within the Run Rule Framework (RRF).
For detailed information, see OFSAAI Administration Guide.

Figure 209: Component Registration window

The Component Registration window displays the current components in the left pane and the field
values of the selected component in the right pane. The parameters described for a component in this
window are Component ID, ICC Component ID, Image Name, Parent ID, Class Path, and Tree Order.
The Audit Trail section at the bottom of the Component Registration window displays metadata
information about the Component selected/created.

8.6.1.1 Registering Components


You can register new components from the Component Registration window.
To register a new component:


1. From the Component Registration window, click New. The fields in the right pane of the
Component Registration window are reset.
2. Enter the details as described in the following table.

Table 81: Fields in the Component Registration window and their Descriptions

• Component ID: Enter the Component ID.
• Parent ID: Select the Parent ID from the drop-down list. The list displays the abbreviated forms of the component IDs; see the Available Components pane for the corresponding full forms.
• ICC Component ID: Select the ICC Component ID from the drop-down list.
• Class Path: Enter the class path.
• Image Name: Enter the image name allocated for the component.
• Tree Order: Enter the tree order as a numeric value.

3. Click Save. The fields are validated and the component is saved.

8.6.1.2 Editing Component Definition


You can modify all the details except the Component ID of a Component. To modify an existing
component in the Component Registration window:

NOTE Seeded Components cannot be modified.

1. Select the Component from the left pane tree structure, whose details are to be updated.

2. Click the Edit button. The fields of the selected component become editable.
3. Edit the Component details as required. For more information, see Create Component.
4. Click Save to save the changes.

8.6.1.3 Removing Component Definition


You can remove individual Component definitions that are no longer required in the system by deleting them from the Component Registration window.

NOTE The seeded Components cannot be deleted.


1. Select the Component whose details are to be removed and click Remove.
2. Click OK in the warning dialog to confirm deletion.
The Component Registration window confirms the deletion of the component definition.

8.7 References
This section of the document consists of information related to intermediate actions that are required
while completing a task. The procedures are common to all the sections and are referenced wherever
required.

8.7.1 How Run Rule Framework is used in LLFP Application


8.7.1.1 Rules
The following two types of Rules are available in Run Rule Framework for Oracle Financial Services
Loan Loss Forecasting and Provisioning (LLFP) Application:
• Classification Rules
• Computation Rules
Classification Rules
These Rules reclassify table records in the Data Model based on criteria that include complex Group By clauses and subqueries within the tables.
In LLFP, various methods are used for calculations (for example, provision matrix method, cash flow
method, and so on). To determine a set of bank accounts that use one of these methods, you can use
the Run Rule Framework (RRF).
Example:
Consider a scenario to determine the required methods for Product Type and Customer Type and
move data from Staging to FACT_ACCOUNT_DETAILS table using a T2T.
Here, source and target hierarchies, one each for Product Type and Customer Type, are used. Based on the values of each combination of Product Type and Customer Type, the target hierarchies are assigned. The target hierarchy represents the method, such as the provision matrix method or the cash flow method.
That is, based on the satisfied combinations from source hierarchies (Product Type and Customer
Type), the method SKey in the FACT_ACCOUNT_DETAILS table is updated.
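Conceptually, this re-classification behaves like a lookup from the (Product Type, Customer Type) combination to a method surrogate key. The sketch below is illustrative only; the combinations, the column name, and the surrogate-key values are invented, not LLFP seeded data.

```python
# Illustrative mapping from source-hierarchy combinations to a method SKey.
# All values below are invented for the example.
METHOD_SKEY = {
    ("Retail Mortgage", "Individual"): 1,  # e.g. provision matrix method
    ("Corporate Bond", "Corporate"): 2,    # e.g. cash flow method
}

def assign_method_skey(account: dict) -> dict:
    """Set the method SKey on an account row, as the Rule does for FACT_ACCOUNT_DETAILS."""
    key = (account["product_type"], account["customer_type"])
    account["n_method_skey"] = METHOD_SKEY.get(key)
    return account

row = assign_method_skey({"product_type": "Corporate Bond", "customer_type": "Corporate"})
# row["n_method_skey"] == 2
```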
Computation Rules
These Rules compute new values/matrices based on Simple Measures and update an identified set of
records within the data model.
For example:
In LLFP, Expected Credit Loss (ECL) is calculated by creating Rules using the following formula:
ECL = Outstanding Amount × Probability of Default (PD) × Loss Given Default (LGD)
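Worked numerically, with invented figures, the formula above gives:

```python
def expected_credit_loss(outstanding: float, pd: float, lgd: float) -> float:
    """ECL = Outstanding Amount x Probability of Default x Loss Given Default."""
    return outstanding * pd * lgd

# Example with invented figures: 100,000 outstanding, 2% PD, 45% LGD.
ecl = expected_credit_loss(100_000, 0.02, 0.45)  # approximately 900.0
```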


To calculate this, a DT is created using RRF, where necessary expressions are defined. The instructions
to multiply the values of all these three columns are encapsulated in the Rule.

8.7.1.2 Process and Run


After a Rule is created, it is assigned to a Process (which is a Batch in AAI). Multiple Rules can be
assigned to a Process with pre-determined order of execution. Later these Batches are executed as
Runs.

8.7.2 How Run Rule Framework is used in LRM Application


The process “LRM – BIS – Determining Revised Maturity for calculating the revised maturity dates” is created for the BIS regulations requirement in LRM. This process is used to select the assets and liabilities used for LCR computation.
The process is divided into the following five Computational Rules:

8.7.2.1 LRM - BIS Conservative Approach for Outflows


1. This Rule is created to update the Revised Maturity Date for the outflows as the First Call Date of the liability and derivative products with embedded options flag ‘Y’.
2. The source hierarchies related to standard product type and the embedded options flag are
considered.
3. The destination Measure of Revised Maturity Date SKey is defined as the target in the Rule.
4. The Business Processor containing the First Call Date column is mapped with the destination
Measure.
5. The relevant dataset LRM - Conservative Approach for Outflows is updated to fetch the relevant
data from where the selection occurs based on the criteria. The Revised Maturity Date for
Derivatives and liabilities for which embedded option flag is Y is updated with First Call Date.

8.7.2.2 LRM - BIS Conservative Approach for Inflows


1. This Rule is created to update the Revised Maturity Date for the inflows of the asset and the
derivative products based on the BIS regulations.
2. The source hierarchies related to standard product type, embedded options flag, and re-
hypothecated flag are considered.
3. The destination Measure of Revised Maturity Date SKey is defined as the target in the Rule.
4. The Business Processor containing the expression based on the BIS requirement is mapped to
the destination Measure.
5. The relevant dataset LRM - Conservative Approach for Inflows is updated to fetch the relevant data from which the selection occurs based on the criteria.

8.7.2.3 LRM - Updating Revised Maturity Date Surrogate Key With Maturity Date
Surrogate Key
1. This Rule is created to update the Revised Maturity Date for the assets and liability accounts
when the revised maturity date is absent.


2. The source hierarchies related to Date and Run are considered.


3. The destination Measure of Revised Maturity Date SKey is defined as the target in the Rule.
4. The Business Processor containing the Original Maturity Date associated with the account is
mapped to the destination Measure.
5. The relevant dataset LRM - Updating the Revised Maturity Date Surrogate Key is updated to
fetch the relevant data and match the Business Processor, hierarchies, Measures, and tables
used in processing this Rule.

8.7.2.4 LRM - Updating Columns Using Revised Maturity Date


1. This Rule is created to update the respective Residual Maturity Band SKeys (obtained from the
preceding Rules) and the effective Residual Maturity Band SKeys.
2. The source hierarchies related to Date and Run are considered.
3. The destination Measures of the residual maturity band SKey and effective Residual Maturity
Band maturity date SKey with the relevant time bucket SKeys are defined as the target in the
Rule.
4. The Business Processors related to the destination Measures (Effective Residual Maturity Date
SKey, Residual Maturity Band SKey, Residual Maturity Time Bucket SKey and Revised Maturity
Time Bucket SKey) are mapped to the physical columns.
5. The relevant dataset LRM - Updating columns using Revised Maturity Date is updated to fetch
the relevant data and match the Business Processor, hierarchies, Measures, and tables used in
processing this Rule.

8.7.2.5 LRM - Residual Maturity Less Than Liquidity Horizon Flag Update
1. This Rule is created to update the accounts as ‘Y’, where the Residual Maturity Date falls within
the liquidity horizon.
2. The source hierarchy related to Run is considered.
3. The destination Measure is a flag which indicates if the Residual Maturity is less than the
liquidity horizon, and is defined as the target in the Rule.
4. The business process containing the flag related to the Residual Maturity that is less than the
liquidity horizon is mapped to the destination Measure.
5. The relevant dataset LRM - Residual Maturity Less Than Liquidity Horizon Flag Update is
created and updated to fetch the relevant data and match the Business Processor, hierarchies,
Measures, and tables used in processing this Rule.
After these Rules are created, they are added to the process ‘LRM – BIS – Determining Revised Maturity’ in the order mentioned above. This process is stitched to a Run which is used to process the LCR calculation related to the BIS regulations in LRM.


8.7.3 Process Hierarchy Members


The Process Hierarchy Members and their descriptions are tabulated below.

Table 82: Components in the Process Hierarchy Members and their Descriptions

• Data Extraction Rules: Displays all the Extract definitions defined through OFSAAI Data Management Tools.
• Load Data Rules: Displays the following two sub types of definitions: File Loading Rules display all the File to Table definitions defined through OFSAAI Data Management Tools; Insertion Rules (Type 1 Rules) display all the Table to Table definitions defined through OFSAAI Data Management Tools.
• Transformation Rules: Displays the following definition sub type: Database Functions-Transformations display all the DT definitions defined in OFSAAI Data Management Tools.
• Base Rules: Displays the following two sub types of definitions: Classification Rules (Type 2 Rules) display all the Type 2 Rules defined in the Rules Run Framework that have Active status “Yes” and Version “0”; Computation Rules (Type 3 Rules) display all the Type 3 Rules defined in the Rules Run Framework that have Active status “Yes” and Version “0”.
• Processes: Displays all the existing Processes defined through the Process Framework that have Active status “Yes” and Version “0”.
• Essbase Cubes: Displays all the Essbase cubes defined for the selected Information Domain in OFSAAI Data Model Management. Note: Only the cubes under the segment to which the user is mapped are displayed.
• Model: Displays all the existing model definitions defined in the Modeling Framework windows.
• Stress Testing: Displays all the existing stress testing definitions defined in the Variable Shock Library, Scenario Management, and Stress Definition windows.
• Data Quality: Displays all the Data Quality groups defined in the OFSAAI Data Quality Framework. The DQ Rule framework is registered with RRF, and additional parameters are passed differently during RRF execution than during DQ Group execution. For example, if the additional parameters to be passed are $REGION_CODE#V#US;$CREATION_DATE#D#07/06/1983;$ACCOUNT_BAL#N#10000.50, they are passed as: "REGION_CODE","V","US","CREATION_DATE","D","07/06/1983","ACCOUNT_BAL","N","10000.50". To input a threshold percentage (for example, 50%), prefix it to the parameter string: "50","REGION_CODE","V","US","CREATION_DATE","D","07/06/1983","ACCOUNT_BAL","N","10000.50". In the absence of the threshold parameter, it defaults to 100%.


The parameters needed to execute all the listed components are explained in the Seeded Component
Parameters section.

8.7.4 Hierarchical Member Selection Modes


To aid the selection process, certain standard modes are offered through a drop-down list. The available modes are Self, Self Children, Parent, Siblings, and Children.
Based on the hierarchy member security applied, the nodes/members of the hierarchy are displayed in enabled or disabled mode. Only the members in enabled mode, that is, those mapped to your user group, can be selected. For example, if you choose Self Children, only the immediate children of the selected hierarchy member that are mapped to your user group are moved to the RHS pane.
• The Self mode is the default mode displayed. In this mode, only the specific member selected in
the LHS pane will be selected onto the RHS pane.
• Choose the Self Children mode when you want a specific member and only its immediate
children to be selected onto the RHS pane.
• Choose the Parent mode when you want to select only the parent member of a selected
member onto the RHS pane.
• Choose the Siblings mode when you want to select all the sibling members of the selected
member (those members under the same parent) onto the RHS pane.
• Choose the Children mode when you want only the immediate children of a specific member to be selected onto the RHS pane.

You can also use the Select All button to move all the members to the Selected Values pane. Use the Deselect button to remove a selected member from the Selected Values pane, or the Deselect All button to remove all the members.

8.7.5 Significance of Pre-Built Flag


While defining a Rule, you can use the Pre-Built Flag to speed up the Rule execution process by making use of pre-compiled technical metadata details. The purpose of the Pre-Built Flag is to enhance Rule execution by bypassing the need to search for the required technical metadata across multiple database tables.
The following table shows the conditions and their process flow.

Table 83: Conditions and its Process flow

Creating a Rule:
• Pre-Built Flag set to “Y” > Build the Rule query during Rule save.
• Pre-Built Flag set to “N” > Do not build the Rule query during Rule save.
Executing a Rule:
• Pre-Built Flag set to “Y” > Retrieve the Rule query from the appropriate table and execute it.
• Pre-Built Flag set to “N” > Build the Rule query by referencing the related metadata tables and then execute it.

For example, consider a scenario where Rule 1 (RWA calculation), using a Dataset DS1 is to be
executed. If the Pre-Built Flag condition is set to “N”, then the metadata details of From Clause and
Filter Clause of DS1 are searched through the database to form the query. Whereas, when the Pre-
Built Flag condition is set to “Y”, then the From Clause and Filter Clause details are retrieved from the
appropriate table to form the query and thereby triggered for execution.
Like Dataset, pre-compiled rules also exist for other Business Metadata objects such as Measures,
Business Processors, Hierarchies, and so on.
Note the following:
When you are sure that the Rule definition will not be modified in a specific environment (for example, production), you can set the flag to “Y” for all Rule definitions. This improves performance during Rule execution. However, if the Rule is migrated to a different environment and there is a change in the query, set the flag back to “N”; you may also need to resave the Rule, since the metadata could have changed.
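The branch described in Table 83 can be sketched as follows; the function and argument names are illustrative, not actual OFSAAI internals.

```python
from typing import Callable, Optional

def resolve_rule_query(pre_built_flag: str,
                       saved_query: Optional[str],
                       build_from_metadata: Callable[[], str]) -> str:
    """Return the query to execute for a Rule, per the Pre-Built Flag semantics."""
    if pre_built_flag == "Y":
        # "Y": retrieve the pre-compiled query from the appropriate table.
        return saved_query
    # "N": rebuild the query from the related metadata tables before execution.
    return build_from_metadata()

q = resolve_rule_query("Y", "SELECT 1", lambda: "rebuilt")
# q == "SELECT 1"
```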

8.7.6 Seeded Component Parameters in RRF


The seeded component parameters available within OFSAAI are as follows:

8.7.6.1 Cube Aggregate Data (CubeAggregateData)


The following table describes the parameters and their default values.

Table 84: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Cube Parameter (System Defined): Unique name of the component definition.
• Optional Parameters (System Defined): A set of parameters such as Run ID, Process ID, Exe ID, and Run Surrogate Key. For example, $RUNID=123,$PHID=234,$EXEID=345,$RUNSK=456.
• Operation (User Defined): A drop-down list with the values "ALL", "GENDATAFILES", and "GENPRNFILES" to generate Data files, PRN files, or both during Cube build. Default value: ALL.

8.7.6.2 Create Cube (CubeCreateCube)


The following table describes the parameters and their default values.

Table 85: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Cube Parameter (System Defined): Unique name of the component definition.
• Operation (User Defined): A drop-down list with the values "ALL", "BUILDDB", "TUNEDB", "PROCESSDB", "DLRU", "ROLLUP", "VALIDATE", "DELDB", and "OPTSTORE". Default value: ALL.

8.7.6.3 Data Extraction Rules (ExtractT2F)


The following table describes the parameters and their default values.

Table 86: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Extract Name (System Defined): Unique name of the component definition.
• Source Name (System Defined): The scope of T2F is limited to the Source of the tables; this gives the name of the source.

8.7.6.4 Load Data Rules (LoadF2T)


The following table describes the parameters and their default values.

Table 87: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• File Name (System Defined): Unique name of the component definition.
• Source Name (System Defined): The scope of this component is limited to the source; this gives the name of the source file.
• Load Mode (System Defined): Additional parameter to differentiate between F2T and T2T. Default value: File To Table.
• Data File Name (User Defined): Name of the source file. If not specified, the source name provided in the definition is used.

8.7.6.5 Load Data Rules (LoadT2T)


The following table describes the parameters and their default values.

Table 88: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• File Name (System Defined): Unique name of the component definition.
• Source Name (System Defined): The scope of this component is limited to the source; this gives the name of the source table.
• Load Mode (System Defined): Additional parameter to differentiate between F2T and T2T. Default value: Table To Table.
• Default Value (System Defined): A set of parameters such as Run ID, Process ID, Exe ID, and Run Surrogate Key. For example, $RUNID=123,$PHID=234,$EXEID=345,$RUNSK=456.
• Data File Name (User Defined): Not applicable, since this parameter is used only for F2T, not T2T.

8.7.6.6 Modeling Framework - Model (MFModel)


The following table describes the parameters and their default values.

Table 89: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Operation (System Defined): Refers to the operation to be performed. You can click the drop-down list to select additional parameters to direct the engine behavior. Default value: ALL.
• Model Code (System Defined): Unique name of the component definition.
• Optional Parameters (System Defined): A set of parameters such as Run ID, Process ID, Exe ID, and Run Surrogate Key. For example, $RUNID=123,$PHID=234,$EXEID=345,$RUNSK=456.


8.7.6.7 Modeling Framework - Optimizer (MFOptimizer)


The following table describes the parameters and their default values.

Table 90: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Operation (System Defined): Refers to the operation to be performed. You can click the drop-down list to select additional parameters to direct the engine behavior. Default value: ALL.
• Model Code (System Defined): Unique name of the component definition.
• Optional Parameters (System Defined): A set of parameters such as Run ID, Process ID, Exe ID, and Run Surrogate Key. For example, $RUNID=123,$PHID=234,$EXEID=345,$RUNSK=456.

8.7.6.8 Modeling Framework - Pooling (MFPoolling)


The following table describes the parameters and their default values.

Table 91: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Operation (System Defined): Refers to the operation to be performed. You can click the drop-down list to select additional parameters to direct the engine behavior. Default value: ALL.
• Model Code (System Defined): Unique name of the component definition.
• Optional Parameters (System Defined): A set of parameters such as Run ID, Process ID, Exe ID, and Run Surrogate Key. For example, $RUNID=123,$PHID=234,$EXEID=345,$RUNSK=456.

8.7.6.9 Process
The process component does not have any seeded parameters and is the same as defined in the Process window.

8.7.6.10 Base Rules - Classification Rule (RuleType2)


The following table describes the parameters and their default values.

Table 92: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Rule Code (System Defined): The Rule ID.
• Build Flag (System Defined): The status Y (yes) or N (no) indicates whether the Rule query has to be rebuilt before execution. Default value: N.
• Optional Parameters (System Defined): A set of parameters such as Run ID, Process ID, Exe ID, and Run Surrogate Key. For example, $RUNID=123,$PHID=234,$EXEID=345,$RUNSK=456.

8.7.6.11 Base Rules - Computation Rule (RuleType3)


The following table describes the parameters and their default values.

Table 93: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Rule Code (System Defined): The Rule ID.
• Build Flag (System Defined): The status Y (yes) or N (no) indicates whether the Rule query has to be rebuilt before execution. Default value: N.
• Optional Parameters (System Defined): A set of parameters such as Run ID, Process ID, Exe ID, and Run Surrogate Key. For example, $RUNID=123,$PHID=234,$EXEID=345,$RUNSK=456.

8.7.6.12 Run Executable (RunExecutable)


The following table describes the parameters and their default values.

Table 94: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Wait (System Defined): Determines whether the executable is Synchronous (Y) or Asynchronous (N). Default value: Y.
• Batch Parameter (System Defined): Determines whether the implicit system parameters such as Batch ID, MIS Date, and so on are to be passed. Default value: Y.
• Executable (User Defined): The name of the ".sh" file that has to be executed through this Run Executable component.


8.7.6.13 Stress Testing -Variable Shocks (SSTVariableShock)


The following table describes the parameters and their default values.

Table 95: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular information domain are installed. This IP Address also specifies the location (server hostname/IP Address) where the component is executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Variable Shock Code (System Defined): Unique name of the component definition.
• Operation (System Defined): Refers to the operation to be performed. You can click the drop-down list to select additional parameters to direct the engine behavior. Default value: ALL.
• Optional Parameters (System Defined): Consists of the Run Surrogate Key.

8.7.6.14 Transformation Rules (TransformDQ)


The following table describes the Parameters and its default value.

Table 96: Parameter Description and its Default Value

Parameter Name / (Type) Description Default


Value

Refers to the IP Address of the server where the OFSAAI


Database components for the particular information domain
IP Address (System Defined) have been installed. This IP Address also specifies the
location (server hostname / IP Address) where the
component is to be executed.

Datastore Type (System


Enterprise Data Warehouse (EDW) EDW
Defined)

Datastore Name (System


Information Domain Name
Defined)

Rule Name (System Defined) Unique Name of the component definition

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 424


RULE RUN FRAMEWORK
REFERENCES

• Parameter List (User Defined): A user defined parameter list along with system defined parameters like Run ID, Process ID, Exe ID, and Run Surrogate Key, only if the subtype is SP (Stored Procedure) or EXT (External). For example, <<ParameterList>>,"$RUNID=123","$PHID=234","$EXEID=345","$RUNSK=456"; otherwise it will be only "$RUNID=123","$PHID=234","$EXEID=345","$RUNSK=456".
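To make the assembly rule concrete, the sketch below (a hypothetical helper, not part of OFSAAI) builds the parameter string for the two cases described above:

```python
def build_parameter_list(run_id, phid, exe_id, run_sk, user_params=None, subtype="SP"):
    """Assemble the RRF parameter string described above.

    System tokens ($RUNID, $PHID, $EXEID, $RUNSK) are always present;
    the user defined list is prepended only for SP (Stored Procedure)
    or EXT (External) subtypes.
    """
    system = [f'"$RUNID={run_id}"', f'"$PHID={phid}"',
              f'"$EXEID={exe_id}"', f'"$RUNSK={run_sk}"']
    if subtype in ("SP", "EXT") and user_params:
        return ",".join([user_params] + system)
    return ",".join(system)

print(build_parameter_list(123, 234, 345, 456, user_params="<<ParameterList>>"))
# -> <<ParameterList>>,"$RUNID=123","$PHID=234","$EXEID=345","$RUNSK=456"
```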

8.7.6.15 Transformation Rules (TransformDT)


The following table describes the Parameters and their default values.

Table 97: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular Information Domain have been installed. This IP Address also specifies the location (server hostname / IP Address) where the component is to be executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Rule Name (System Defined): Unique Name of the component definition.
• Parameter List (User Defined): A user defined parameter list along with system defined parameters like Run ID, Process ID, Exe ID, and Run Surrogate Key, only if the subtype is SP (Stored Procedure). For example, <<ParameterList>>,"$RUNID=123","$PHID=234","$EXEID=345","$RUNSK=456"; otherwise it will be only "$RUNID=123","$PHID=234","$EXEID=345","$RUNSK=456".

8.7.6.16 Data Quality Groups (Run DQ)


The following table describes the Parameters and their default values.




Table 99: Parameter Description and its Default Value

• IP Address (System Defined): Refers to the IP Address of the server where the OFSAAI Database components for the particular Information Domain have been installed. This IP Address also specifies the location (server hostname / IP Address) where the component is to be executed.
• Datastore Type (System Defined): Enterprise Data Warehouse (EDW). Default value: EDW.
• Datastore Name (System Defined): Information Domain Name.
• Data Quality Group Name: Name of the DQ group to be executed.
• Parameters: Comma-separated parameters, where the first value is considered as the threshold percentage, followed by additional parameters, each a combination of three tokens. For example, "90","PARAM1","D","VALUE1","PARAM2","V","VALUE2". Note: The parameter "Fail if threshold is breached" is defaulted to "Yes" for RRF executions.
• Optional Parameter: You can pass Run Surrogate Key (RUNSK) as a filter. For example, $RUNSK=456.
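The threshold-plus-token-triples layout of the Parameters value can be illustrated with a small sketch (hypothetical parsing helper only; the actual parsing is done by the DQ engine):

```python
import csv
import io

def parse_dq_parameters(raw):
    """Split the DQ Parameters string described above into its parts.

    The first value is the threshold percentage; the remaining values
    are grouped into (name, type, value) token triples.
    """
    tokens = next(csv.reader(io.StringIO(raw)))
    threshold = int(tokens[0])
    triples = [tuple(tokens[i:i + 3]) for i in range(1, len(tokens), 3)]
    return threshold, triples

threshold, params = parse_dq_parameters('"90","PARAM1","D","VALUE1","PARAM2","V","VALUE2"')
print(threshold, params)
# -> 90 [('PARAM1', 'D', 'VALUE1'), ('PARAM2', 'V', 'VALUE2')]
```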

NOTE: If you want to configure components other than the seeded components, see the Component Registration section in the OFSAAI Administration Guide.




9 Operations
Operations refers to administration and processing of business data to create the highest level of
efficiency within the system and to derive results based on a specified rule. Operations framework
within the Infrastructure system facilitates you (system administrator) to:
• Configure and operate the business processes effectively.
• Maintain the Operator Console by Defining and Executing Batches through the Operations
menu.
• Monitor the Batches scheduled for execution.
The roles mapped for the Operations module are Batch Access, Batch Advanced, Batch Read Only, and Batch Write.
If you require users to access only selected modules, enable access to the specific module functions and do not enable access to the Operator Console. Enabling access to the Operator Console gives users access to all the Batch modules.
For example, if a user needs to access only the Batch Monitor module, map the user to the Batch Monitor Link function and ensure the user does not have access to the Operator Console function.
For more details on roles and functions, see Appendix A.
The Operations section discusses the following:
• Batch Maintenance
• Batch Execution
• Batch Scheduler
• Batch Monitor
• Processing Report
• Batch Cancellation
• View Log

9.1 Batch Maintenance


Batch refers to a set of executable processes based on a specified rule. The Batch Maintenance framework within the Infrastructure system facilitates you to create and maintain Batch Definitions. You can process the Batch scheduled for execution from Batch Maintenance, and also from other modules and applications such as the Rules Run Framework and Enterprise Modeling.
You should have the Batch Write User Role mapped to your User Group to create and maintain a Batch. The Batch Maintenance window displays a list of Batches scheduled for maintenance with other details such as Batch ID, Batch Description, and the editable state of the Batch.




Figure 210: Batch Maintenance window

In the Batch Maintenance window, you can do the following:


• Create Batch Definitions and assign task details to a Batch. You can also set the task
precedence, specify component, and define the dynamic parameters based on the component.
• View the Batch Definition details.
• Change the Batch Definition Status to Non Editable (NE).
• Delete Batch Definition details.
You can also search for a specific Batch based on the Batch ID, Batch Description, Module, or Last
Modified Date.
You can transfer batch ownership from one user to another user. For details, see Transferring Batch
Ownership section in the OFSAAI Administration Guide.

9.1.1 Adding Batch Definition


You can either define an empty Batch or duplicate an existing Batch and specify the task details. To
add Batch definition in the Batch Maintenance window:
1. Click the Add button in the Batch Name tool bar. The Add Batch Definition window is displayed.




Figure 211: Batch Maintenance Add window

2. Enter the Batch details as tabulated.


The following table describes the fields in the Add Batch Maintenance window.

Table 101: Fields in the Batch Maintenance Add window and their Descriptions

• Batch Name: The Batch Name is auto generated by the system. You can edit it to specify a Batch Name based on the following conditions:
 The Batch Name should be unique across the Information Domain.
 The Batch Name must be alphanumeric and should not start with a number.
 The Batch Name should not exceed 41 characters in length.
 The Batch Name should not contain any special characters except "_".
• Batch Description: Enter a description for the Batch based on the Batch Name. The Batch Description should be alphanumeric. The following special characters are allowed: $ (dollar sign), & (ampersand), { } (braces), [ ] (square brackets), ( ) (parentheses), , (comma), < (less than sign), = (equal sign), > (greater than sign), # (pound sign), % (percent), _ (underscore), - (hyphen), : (colon), . (period), and blank space.
 Note: The special characters that are not supported are: ! (exclamation point), " (double quote), ` (back quote), * (asterisk), + (plus sign), ; (semicolon), ? (question mark), ^ (caret), | (pipe), ~ (tilde), ' (apostrophe), \ (backslash), / (forward slash), and @ (at sign).
• Duplicate Batch: (Optional) Select the checkbox to create a new Batch by duplicating the existing Batch details. On selection, the Batch ID field is enabled.
• Batch ID (if Duplicate Batch is selected): It is mandatory to specify the Batch ID if the Duplicate Batch option is selected. Select the required Batch ID from the list.
• Sequential Batch: Select the checkbox if the Batch has to be created sequentially based on the tasks specified. For example, if there are 3 tasks defined in a Batch, task 3 should have precedence as task 2, and task 2 should have precedence as task 1.
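The Batch Name conditions above can be expressed as a single validation check. The sketch below is illustrative only (the function name is hypothetical, and uniqueness across the Information Domain must still be verified against existing Batches on the server):

```python
import re

def is_valid_batch_name(name):
    """Check a Batch Name against the documented conditions:
    alphanumeric plus underscore, must not start with a number,
    and at most 41 characters long.
    """
    return bool(re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]{0,40}", name))

print(is_valid_batch_name("DAILY_LOAD_01"))  # -> True
print(is_valid_batch_name("1BATCH"))         # -> False (starts with a number)
```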

3. Click Save to save the Batch definition details. The new Batch definition details are displayed in
the Batch Name section of Batch Maintenance window with the specified Batch ID.
In the Batch Name tool bar of Batch Maintenance window, you can select the Batch ID and do
the following:

 Click the View button to view the Batch Definition details.

 Click the Edit button to change the status of the Batch to Non Editable (NE).

NOTE: Non Editable Batch status cannot be reverted to Editable status later.

By default the new Batch created will have the status set as Editable (E).

 Click Delete button to delete the Batch definition details.

9.1.2 Specify Task Details


The Tasks Details section of Batch Maintenance window displays the list of tasks associated with a
specific Batch definition. In the Task Details section you can do the following:
• Update the pre-defined task and assign new tasks.
• Specify the Task Precedence.
• Update the pre-defined Component or specify new component.
• Specify the Dynamic Parameters based on the component selected.

9.1.2.1 Adding Task Details


To specify the task details in the Batch Maintenance window:
1. Click Add from the Task Details tool bar.
The Add Task Definition window is displayed.




Figure 212:Task Definition Add window

2. Enter the task details as tabulated.


The following table describes the fields in the Add Task Definition window.

Table 102: Fields in the Task Definition Add window and their Descriptions

• Task ID: The Task ID is auto generated by the system depending on the precedence level and is not editable.
• Description: Enter the task description. No special characters are allowed in the Task Description. Words like "Select From" or "Delete From" (identified as potential SQL injection vulnerable strings) should not be entered in the Description.
• Components: Components refer to individual functional units that are put together to form a process. A component triggers its own set of processes in the back-end to achieve the final output. For more information on each component Property and Value Description, see Task Component Parameters. Select the required component from the drop-down list.
• Dynamic Parameters List: On selecting a task component, a list of dynamic parameters is displayed. It is mandatory to select the parameter values based on the component. Specify the value for each parameter by selecting from the drop-down list. Click the following links to view the component parameter details: AGGREGATE DATA, CREATE CUBE, EXTRACT DATA, LOAD DATA, MODEL, PROCESS_EXECUTION, RULE_EXECUTION, RUN DQ RULE, RUN EXECUTABLE, SQL RULE, TRANSFORM DATA, VARIABLE SHOCK, WORKFLOW EXECUTION.
• Datastore Type: Refers to the type of data store such as Enterprise Data Warehouse (EDW), which refers to the Multi-dimensional Database/Cubes.
• Datastore Name: Refers to the name of the Information Domain. By default, the Information Domain to which the selected Application is mapped is selected. The unique combination of the Datastore Name and the Datastore Type determines the physical machine on which the task will be executed. It is assumed that the user gives the correct information, else task invocations may fail at runtime.
• Primary IP For Runtime Processes: Refers to the IP Address of the primary machine for runtime processes. Select the IP address of the machine on which you want to execute the task from the drop-down list.

3. Click Save to save the task definition details. The new task details are displayed in the Task
Details of the Batch Maintenance window with the Task ID.
In the Task Details tool bar of Batch Maintenance window you can select the Task ID and do the
following:
 Click Add button to add another Task.

 Click View button and view the selected Task details.

 Click Edit to modify the selected Task details.

 Click Delete button to delete the selected Task details.




9.1.2.2 Defining Task Precedence


Task Precedence indicates the execution-flow of a Batch. Task Precedence value in the Task Details
facilitates you to determine the order in which the specific Tasks of a Batch are executed.
For example, consider a Batch consisting of 4 Tasks. The first 3 Tasks do not have a precedence defined and hence will be executed simultaneously during the Batch execution. But Task 4 has precedence value as Task 1, which indicates that Task 4 is executed only after Task 1 has been successfully executed.
You can set Task precedence between Tasks, or schedule a Task to run after another Task, or even define to run a Task after a set of other tasks. However, multiple tasks can be executed simultaneously and cyclical execution of tasks is not permitted. If the precedence for a Task is not set, the Task is executed immediately on Batch execution.
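The precedence behavior above (tasks with no precedence run immediately, dependent tasks wait, and cycles are rejected) can be sketched as a topological ordering. This is an illustration only, not the product's scheduler:

```python
from graphlib import TopologicalSorter, CycleError

def execution_waves(precedence):
    """Group tasks into waves that can run simultaneously.

    precedence maps each task to the set of tasks it must wait for;
    tasks with no precedence land in the first wave. Raises CycleError
    if tasks depend on each other cyclically.
    """
    ts = TopologicalSorter(precedence)
    ts.prepare()
    waves = []
    while ts.is_active():
        ready = sorted(ts.get_ready())
        waves.append(ready)
        ts.done(*ready)
    return waves

# Tasks 1-3 have no precedence; Task 4 runs only after Task 1.
print(execution_waves({"Task4": {"Task1"}, "Task1": set(), "Task2": set(), "Task3": set()}))
# -> [['Task1', 'Task2', 'Task3'], ['Task4']]
```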
To define the task precedence in the Batch Maintenance window:

1. Click the button under the Precedence column of the task for which you want to add a precedence task.
The Task Precedence Mapping browser is displayed.

NOTE Task Precedence option is disabled if a batch has only one task
associated.

 Select the required Task from the Task List and click . You can press Ctrl key for multiple
selections.

 To select all the listed Tasks, click .

 To remove a Task, select the task from Select Tasks pane and click .

 To remove all the selected Tasks, click .


2. Click OK and update Task Precedence definition.

9.2 Batch Execution


Batch Execution refers to the process of initiating a Batch for current processing. When a Batch is
submitted for execution, a series of commands are sent to the database with respect to the defined
component parameters. This in turn returns an array of update counts (required value definitions)
when the commands are executed successfully.
You should have Batch Advanced User Role mapped to your User Group to execute a Batch.




Figure 213: Batch Execution window

The Batch Execution window displays the list of only those Batches which have at least one task associated, along with other details such as Batch ID and Batch Description. When you select a Batch ID in the list, the Task Details section displays all the defined Tasks associated with the Batch.
The Batch Details section in the Batch Execution window lists the Batches depending on the Batch
Mode selected.
• The Run mode displays the Batch definitions which are newly defined and which have been scheduled for execution.
• The Restart mode displays the Batch definitions which were not executed successfully or were interrupted during the previous Batch execution.
• The Rerun mode displays the Batch definitions which have been successfully executed, failed, cancelled, or even interrupted during the previous Batch execution.
You can search for a specific Batch based on the Batch ID, Batch Description, Module, or Last Modified
Date. The pagination option helps you to view the list of existing Batches within the system.

9.2.1 Executing Batch


You can Run/Execute the Batches which are scheduled for execution in the Batch Execution window.
You can also modify the pre-defined Batch schedule or define a new schedule using the Batch
Scheduler. In the Batch Execution window you can execute a Batch in Run, Restart, or Rerun modes.




On completion of batch execution, if the batch fails, a notification mail is sent to all users mapped to
the user group with the OPRMON role mapped to them.

9.2.1.1 Run/Execute Batch


You can Run/Execute Batch(s) which have been scheduled for execution in the Batch Execution
window. You can also Run/Execute a Batch using the External Scheduler (ES) which has the “External
Scheduler Interface Component” (ESIC) integrated with Infrastructure system. For more information,
see External Scheduler Interface Component.
To execute a Batch in the Batch Execution window:
1. Select Run as Mode in the Batch Mode pane. The list of Batches scheduled for execution is
displayed in the Batch Details pane.

Figure 214: Batch Details pane

2. Select the checkbox adjacent to the Batch ID which has to be executed. The specified task(s)
defined to the selected Batch are displayed in the Task Details section.

 In the Batch Details tool bar, click Schedule Batch button to define new or modify the
pre-defined Batch Schedule. For more information, see Batch Scheduler.

Figure 215: Task Details pane

 In the Task Details tool bar, click Exclude/Include button to Exclude/Include a task, or
click Hold/Release button to hold or release a task before executing the Batch. For
more information, see Modify Task Definitions of a Batch.




3. Specify the Information Date (mandatory) by clicking the Calendar button. The specified date is recorded for reference.

NOTE You can also modify the required task parameters of the
selected Batch and include the changes during the Batch rerun.
For more information, see Specify Task Details.

4. Click Execute Batch button and select OK in the information dialog to confirm Batch Execution.
An information dialog is displayed indicating that Batch Execution is triggered successfully.

9.2.1.2 Restart Batch


You can restart a Batch which has not been executed successfully or which has been explicitly
interrupted, or cancelled, or put on hold during the execution process. These Batches are categorized
separately and listed in the Restart mode within the Batch Execution window. By restarting a Batch,
you can continue Batch execution directly from the point of interruption or failure and complete
executing the remaining tasks.
To Restart a Batch in the Batch Execution window:
1. Select Restart as Mode in the Batch Mode section. The list of interrupted/failed Batches during
execution is displayed in the Batch Details section.

Figure 216: Batch Details window

2. Select the checkbox adjacent to the Batch ID which has to be executed. The specified Task(s)
defined to the selected Batch are displayed in the Task Details section.

 In the Batch Details tool bar, click Schedule Batch button to define new or modify the
pre-defined Batch Schedule. For more information, see Batch Scheduler.
3. Select the Information Date from the drop-down list. This is a mandatory field.
4. Select the Batch Run ID from the drop-down list. This is a mandatory field.




 In the Task Details tool bar, click Exclude/Include button to exclude or include a task,
or click Hold/Release button to hold or release a task before executing the Batch. For
more information, see Modify Task Definitions of a Batch.

NOTE The Tasks in a Batch which have failed during the execution
process are indicated in Red in the Task Details section. You
can modify the required task parameters in Specify Task
Details window and include the changes during the Batch
restart. Else, the tasks fail again during the Batch Restart.

5. Click Execute Batch button and select OK in the information dialog to confirm Batch Execution.
An information dialog is displayed indicating that Batch Execution is triggered successfully.

9.2.1.3 Rerun Batch


You can rerun a Batch which has previously been executed. Rerun Batch facilitates you to run the Batch irrespective of the previous execution state. A new Batch Run ID is generated during the Rerun process and the Batch is executed similar to a new Batch Run.
To rerun a Batch in the Batch Execution window:
1. Select Rerun in the Batch Mode section. The list of executed Batches is displayed in the Batch
Details section.
2. Select the checkbox adjacent to the Batch ID which has to be executed. The specified Task(s)
defined to the selected Batch are displayed in the Task Details section.

 In the Batch Details tool bar, click Schedule Batch button to define new or modify the
pre-defined Batch Schedule. For more information, see Batch Scheduler.
3. Select the Information Date from the drop-down list. This is a mandatory field.
4. Select the Batch Run ID from the drop-down list. This is a mandatory field.

 In the Task Details tool bar, click Exclude/Include button to exclude or include a task, or click Hold/Release button to hold or release a task before executing the Batch. For more information, see Modify Task Definitions of a Batch.

NOTE You can also modify the required task parameters of the
selected Batch and include the changes during the Batch rerun.
For more information, see Specify Task Details.

5. Click Execute Batch button and select OK in the information dialog to confirm Batch Execution.
An information dialog is displayed indicating that Batch Execution is triggered successfully.




9.2.2 Modifying Task Definitions of a Batch


You can modify the task definition state in the Batch Execution window to exclude or hold the defined
task in a Batch from execution. The excluded tasks are therefore assumed to have completed
execution and get excluded during the Batch Run.
While executing a Batch in the Batch Execution window, you can:
• Exclude a task or Include the excluded task.
• Hold a task and Release the held task.
When you modify the task definition(s) in the Task Details section:
• The Excluded task(s) are displayed in “Grey” with the Task Status set to “K”.
• The task(s) on Hold are displayed in “Red” with the Task Status set to “H”.

NOTE: You are not permitted to Hold/Release an Excluded task or Exclude/Include a task which is on Hold.

9.2.2.1 Exclude Task Definitions


You can Exclude Task(s) or Include the Excluded Task(s) during Batch Execution. The remaining task components are then executed in the normal process, assuming that the Excluded Task(s) have completed execution.
To exclude Task(s) in the Batch Execution window:

1. Click Exclude/Include button in the Task Details tool bar.


2. In the Task Mapping window, do one of the following:

 To exclude a task, select the required task from the Available Tasks list and click . You can
press Ctrl key for multiple selections.

 To exclude all tasks in the Available Tasks list, click .


3. Click OK and return to the Batch Execution window.
The Excluded Task(s) in the task details section are marked in “Grey” with the Task Status set to
“K”.

9.2.2.2 Include Excluded Task Definitions


To include Excluded Task(s) in the Batch Execution window:

1. Click Exclude/Include button in the Task Details tool bar.


2. In the Task Mapping window, do one of the following:

 To include an excluded task, select the required task from the Set Tasks list and click .
You can press Ctrl key for multiple selections.

 To include all excluded tasks in the Set Tasks list, click .




3. Click OK and return to the Batch Execution window.

9.2.2.3 Hold Task Definitions


You can Hold task(s) definition or Release the held task(s) during Batch Execution. In the Batch Run,
the task(s) which are on Hold along with the defined components are skipped during execution.
However, at least one task should be available in a Batch without being held/excluded for Batch
execution.
To hold Task(s) in the Batch Execution window:

1. Click Hold/Release button in the Task Details tool bar.


2. In the Task Mapping window, do one of the following:

 To Hold a task, select the required task from the Available Tasks list and click . You can
press Ctrl key for multiple selections.

 To Hold all tasks in the Available Tasks list, click .


3. Click OK and return to the Batch Execution window.
The Task(s) on Hold in the task details section are marked in “Red” with the Task Status set to
“H”.

9.2.2.4 Release Held Task Definitions


To Release Task(s) on Hold in the Batch Execution window:

1. Click Hold/Release button in the Task Details tool bar.


2. In the Task Mapping window, do one of the following:

 To release a held task, select the required task from the Set Tasks list and click . You can
press Ctrl key for multiple selections.

 To release all tasks in the Set Tasks list, click .


3. Click OK and return to the Batch Execution window.

9.3 Batch Scheduler


Batch Scheduler in the Infrastructure system facilitates you to schedule a Batch for later processing.
You can define a new Batch schedule or update a previously defined Batch schedule for processing.
You should have Batch Advanced User Role mapped to your User Group to schedule a Batch. The
Batch Scheduler window displays the list of Batches scheduled for execution with the other details
such as Batch ID and Batch Description. When you select a Batch in the list, the Batch Scheduler
options are displayed.

You can click the Refresh button in the Server Time section to view the Current Server Time while defining a Batch schedule. You can search for a specific Batch based on the Batch ID Like, Batch Description Like, Module, or Last Modified Date.




9.3.1 Creating Batch Schedule


You can define a new schedule for processing Batch by specifying the required day(s) and time
intervals. The Batch is executed when the server time synchronizes with the scheduled time.

NOTE Any change made to the Server Time to accommodate for


Daylight Savings Time will not be reflected automatically in the
Batch Scheduler. All OFSAA services have to be restarted after
the time has been changed in the server to reflect the change in
time in the Batch Scheduler.

Figure 217: Batch Scheduler window

To create a schedule for Batch processing in the Batch Scheduler window:


1. Select the checkbox adjacent to the Batch ID whose details are to be updated.
The options to schedule a new Batch are displayed. By default, the Schedule type is selected as
New Schedule in the Batch Scheduler section.
2. In the New Schedule section, enter the Schedule Name to identify the task.




3. Select the Schedule option as one of the following, and specify the related details as tabulated.
The following table shows the Schedule Options and their Schedule Task Details.

Table 103: Schedule Options and its Schedule Task Details

• Once (default option):
 Specify the Date on which the Batch has to be scheduled for processing, using the Calendar.
 Enter the Run Time at which the Batch schedule should be run, in hours (hh) and minutes (mm) format.
 Enter the number of Lag days, which signifies the MIS Date with which the Batch is run. For the schedule type "Once", Lag days is optional.
• Daily:
 Specify the Start and End Dates between which the Batch has to be scheduled for processing, using the Calendar.
 Enter the Run Time at which the Batch schedule should be run, in hours (hh) and minutes (mm) format.
 Enter the number of Lag days, which signifies the MIS Date with which the Batch is run.
 Enter the frequency of the Batch Run in the Every field as per the defined schedule type. For example, Every 2 day(s).
• Weekly:
 Specify the Start and End Dates between which the Batch has to be scheduled for processing, using the Calendar.
 Enter the Run Time at which the Batch schedule should be run, in hours (hh) and minutes (mm) format.
 Enter the number of Lag days, which signifies the MIS Date with which the Batch is run.
 Enter the frequency of the Batch Run in the Every field as per the defined schedule type. For example, Every 2 week(s).
 Select the checkbox adjacent to the Days of the Week on which you need to run the Batch schedule.
• Monthly:
 Specify the Start and End Dates between which the Batch has to be scheduled for processing, using the Calendar.
 Enter the Run Time at which the Batch schedule should be run, in hours (hh) and minutes (mm) format.
 Enter the number of Lag days, which signifies the MIS Date with which the Batch is run.
 Select the Interval option to enter the frequency of the Batch Run in the Every field, or select Random and then select the checkbox adjacent to the Months in which you need to run the Batch schedule.
 Do one of the following: select the Dates (default) option and enter the Dates of the Month on which you need to run the Batch schedule (select the Include Month's Last Date checkbox to include the last date of each month), or select Occurrence, specify the occurrence of the week day, and select the specific weekday from the drop-down list.
• Adhoc:
 Specify the Information Date of the Batch schedule using the Calendar.
 Specify the Run Date of the Batch schedule using the Calendar.
 Enter the Run Time of the Batch schedule in hours (hh) and minutes (mm) format.
 You can also click to add another row or click to delete a row in the Schedule Time tool bar.

4. Click Save to save the new Batch schedule details.

9.3.2 Updating Existing Batch Schedule


You can modify the required details and later schedule the previously defined Batch for processing.
To update existing Batch schedule in the Batch Scheduler window:
1. Select the checkbox adjacent to the Batch ID whose details are to be updated. The various Batch
schedule options are displayed.
2. In the Batch Scheduler section, select Existing Schedule as the Schedule type. The window is
refreshed and displays the Existing Schedule options.
3. Select the Schedule name whose details you want to modify from the drop-down list.

4. Click button in the Existing Schedule toolbar. The details of the scheduled Batch are
displayed in the Batch Scheduler pane.
5. Modify the required details. You can modify the Start and End dates, Run Time, Lag days, and
other details depending on the Schedule Type selected. For more information, see Creating
Batch Schedule.
6. Click Save to save the modified details of an existing Batch schedule.
You can also do the following in the Existing Schedule section of the Batch Scheduler window:

 Click the button to view details of the selected Batch schedule.

 Click button to view Task Logs.


 Click button to view all the log details for the selected Batch.

 Click button to delete the selected Batch schedule.

 Click button to reset the Batch scheduler details.

9.4 Batch Monitor


Batch Monitor in the Infrastructure system facilitates you to view the status of executed Batch definitions along with the task details. You can track issues, if any, at regular intervals and ensure smoother Batch execution. An event log provides you the real-time status of the executed Batches.




You should have Batch Read Only User Role mapped to your User Group to monitor a Batch. The
Batch Monitor window displays a list of Batches with the other details such as Batch ID and Batch
Description.
You can search for a specific Batch based on Date range, Module, Status, and Batch Description. The
Batches listed in the Batch Details section can be sorted based on the current state as Successful,
Failed, Held, or New.

9.4.1 Crash Handling of Backend Servers


There are 3 different servers involved in executing a specific executable: ICC, Router, and Activation Manager (AM). A request from ICC goes to the Router and gets forwarded to the Activation Manager (AM). The AM then executes the task and sends the result back to the Router, which in turn forwards it to ICC.
If any of these servers crashes while executing the batch, then when recovery happens, the status is sent back to the ICC server.
• Router goes down: When the Router goes down, the Task Status becomes Indeterminate and the Batch Status becomes Failed.
• AM goes down: If the AM goes down while executing a task, as soon as the AM comes up, the status of all tasks in the Batch changes to Indeterminate and the Batch Status becomes Failed.
• ICC goes down: When ICC goes down, the status of the task becomes Interrupted and the Batch Status becomes Failed.
 ICC marks all the task statuses as Interrupted even though some of the tasks might have executed successfully.
 You have to manually validate the data before you re-trigger the batch again.

9.4.2 Monitoring Batch


The Batch Details section in the Batch Monitor window lists all the Batches that are scheduled or
executed within the Infrastructure system.


Figure 218: Batch Monitor window

You can view and monitor the required Batch definitions and the corresponding task details. You can
also export the values in Microsoft Excel format for reference.
To monitor a Batch in the Batch Monitor window:
1. Select the checkbox adjacent to the Batch ID whose details are to be monitored.
You can also search for a specific Batch by using the Search option and filter the search results
by selecting the required Status as Successful, Failed, Held, or Not Started in the drop-down list.
2. Enter the Batch Run Details as tabulated.
The following table describes the fields in the Batch Run Details window.

Table 104: Fields in the Batch Run Details window and their Descriptions

Information Date: Select the information date from the drop-down list, which consists of recently
executed Batch Information dates.

Monitor Refresh Rate: Specify the refresh rate, in seconds, at which the latest Batch status details
have to be fetched. You can enter a value between 5 and 999 seconds.
NOTE: The default value of Monitor Refresh Rate is set to 10 seconds.

Batch Run ID: Select the Batch Run ID from the drop-down list, which consists of the Batch Run IDs
with which the Batch has been executed.

3. Click the Start Monitoring button in the Batch Run Details toolbar.
The state of the selected Batch is monitored and the status is displayed in the following order:


Figure 219: Batch Status pane

 The Batch Status pane displays the Batch Run ID with the Batch Status as Successful,
Failed, Held, or Not Started.
 Successful: Batch execution is successful.
 Failed: Batch execution failed. A notification mail is sent to all users mapped to the
user groups with the OPRMON role mapped to them. The mail shows the exact task
status as Not Run, Excluded, Held, Interrupted, Indeterminate, or Cancelled.
 Held: Batch execution is put on hold.
 Not Started: Batch execution has not started.
 The Task Details section displays the executed task details such as Task ID, Task
Description, Metadata Value, Component ID, Task Status, and Task Log. Click the View Log
link to view the View Logger window. You can select the checkbox adjacent to the Task ID to
view the task component execution details in the Event Log section.

NOTE: If the component used in the task is Data Transformation, the status will be Successful or
Failed based on whether the invocation of the function/procedure succeeds or fails. Errors produced
by PL/SQL will not have an impact on the task status unless an Oracle exception is thrown.

 The Event Log section displays the list of errors and events of the Batch being executed.
The events are displayed in reverse chronological order, with the latest event at the top.
The Event Log consists of:
 Message ID, which is auto-generated.
 Description, which has the error details.
 Severity, which can be Fatal, Inform, or Successful.
 Time, which indicates the time of the event.
4. In the Batch Run Details tool bar, you can do the following:

 Click the Stop button to stop the Batch monitoring process.

 Click the Reset button to reset the Batch Run Details.


5. In the Event Log toolbar, you can click the Export button to export the event log details to a
Microsoft Excel file for reference.

9.5 Processing Report


Batch Processing Report in the Infrastructure system enables you to view the execution status of
each task component defined in a Batch. The Batch Processing Report window displays the Batch
execution details such as Component, Task, Parameters, and Status. By default, the details of the
latest Batch Run are displayed.
You should have the Batch Read Only User Role mapped to your User Group to view a Batch Processing Report.

Figure 220: Batch Processing window

To view the status of the required Batch, in the Batch Processing Report window:
1. Select the Information Date from the drop-down list. The list consists of executed Batch
Information dates in the descending order with the latest Batch Run details being displayed at
the top.
2. Select the required Batch Status from the drop-down list. The available batch statuses are:
 ALL
 Not Started
 Ongoing
 Complete
 Failed
 Cancelled


The window is refreshed and displays the status of each executed component of the selected
Batch with the Task ID, defined Parameters, and the Status.
See the following table for the available task Status Codes and their descriptions.

Table 105: Status Code and its Description

Status Code Description

N: Not Run - Task has not been executed.

F: Failed - Task execution failed due to an error.

S: Success - Task has been successfully executed.

O: Ongoing - Task is being executed.

C: Completed - Task execution completed.

R: Restart - Task restarted.

H: Held - Task is on hold.

K: Excluded - Task has been excluded.

I: Interrupted - Task has been interrupted because the ICC server was down.

Q: Task Cancelled - Task has been manually cancelled during execution.

D: Indeterminate - When the Router or AM server goes down and comes up again during task
execution, the task status becomes Indeterminate.
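For quick reference, the status codes above can be mirrored in a simple lookup table. This is purely an illustration of the mapping in Table 105; the dictionary itself is hypothetical and not part of any product API:

```python
# Illustrative lookup of the task status codes from Table 105.
# The codes and meanings come from the guide; the dict itself is hypothetical.
TASK_STATUS = {
    "N": "Not Run",
    "F": "Failed",
    "S": "Success",
    "O": "Ongoing",
    "C": "Completed",
    "R": "Restart",
    "H": "Held",
    "K": "Excluded",
    "I": "Interrupted",
    "Q": "Task Cancelled",
    "D": "Indeterminate",
}

print(TASK_STATUS["K"])  # Excluded
```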

9.6 Execution View Log


The Execution View Log feature allows you to view, in the View Logger window, the log files generated
during a Batch execution.
1. Log in to OFSAA.

2. Click the button from the header to display the applications in a Tiles menu.


3. Select the Financial Services Enterprise Modeling application from the Tiles menu. The
Navigation list to the left is displayed.
4. Click Common Tasks to expand the list.
5. Click Operations to expand the list further.
6. Click Execution View Log to display the View Logger window.


Figure 221: View Logger window

7. Enter the details on the window as instructed in the following:


a. MIS Date (mandatory): Click the Calendar button and select the Management Information System
date for the log from the Date Editor.
b. Infodom (mandatory): Select the required Infodom from the drop-down list.
c. Wildcard (optional): Enter any wildcard value to filter the search.
d. Component (mandatory): Select the required component from the drop-down list.
e. Log File: Select the required log file from the drop-down list.
8. Click View Log to view the log details in the Log File Contents pane. Click Download to
download the log file if required. Click Reset to clear the selected data on the window.

9.7 Batch Cancellation


Batch Cancellation in the Infrastructure system enables you to cancel or abort a Batch, or a specific
Task, that is either scheduled or in the process of execution.
In Batch Cancellation:
• When a Batch is aborted, the Task that is in the process of execution is interrupted and any
scheduled Task is cancelled from execution.
• When a Batch is cancelled, the Task that is in the process of execution is executed
completely and any scheduled Task is cancelled from execution.
• When a Task is cancelled, all the dependent Tasks are also cancelled automatically.
You should have the Batch Advanced User Role mapped to your User Group to cancel a Batch. The Batch
Cancellation window displays a list of scheduled and currently processing Batches along with other
details such as Batch Run ID, Batch ID, Batch Description, Start Time, and Elapsed Time.


Figure 222: Batch Cancellation window

In the Batch Cancellation window, you can do the following before cancelling a Batch/Task:
• In the Refresh Interval section, you can define the required Refresh Rate in seconds to fetch the
current status of Batches being executed.

Click the Refresh button to refresh the window and fetch the current status of Batches being
executed.
• In the Legend section, you can refer to the defined colors that are used to
indicate a particular state of a Task during Batch execution.

Indicates - Not Started

Indicates - On Going

Indicates - Successful

Indicates - Cancelled

9.7.1 Cancelling Batch


You can cancel a Batch or a specific Task within the Batch, when you want to postpone or reschedule
the Batch for later execution. To cancel a Batch in the Batch Cancellation window:
1. Select the checkbox adjacent to the Batch Run ID which has to be cancelled.

2. Click Cancel Batch in the Batch Details toolbar. The selected Batch is cancelled from
processing and the results are displayed in a confirmation dialog. Click OK.
The Tasks associated with the cancelled Batch are also cancelled, excluding the ongoing Tasks.
The cancelled Batch can be viewed in the Restart and Rerun Batch lists within the Batch Execution
window.

9.7.1.1 Cancel Task Details


To cancel the specific Task(s) in a Batch from processing:
1. Select the checkbox adjacent to the Batch Run ID.

2. Click Fetch Task Details in the Batch Details toolbar. The defined Task(s) are displayed in
the Task Details section.


3. Click Cancel Task in the Task Details toolbar.

NOTE: The Cancel Task button will be disabled if you are not
mapped to the TASKCANCEL function role.

The selected Task is cancelled from processing and the results are displayed in a confirmation dialog.
Click OK.

9.7.2 Aborting Batch


You can abort a Batch when you want to terminate the Batch execution before completion. To abort a
Batch in the Batch Cancellation window:
1. Select the checkbox adjacent to the Batch Run ID which has to be aborted.

2. Click the Abort Batch button in the Batch Details toolbar. The selected Batch is aborted from
processing and the results are displayed in a confirmation dialog. Click OK.

NOTE: The Abort Batch button is disabled if you are not mapped
to the OPRABORT function role.

The Tasks associated with the aborted Batch are also cancelled, including the ongoing Tasks. The
aborted Batch can be viewed in the Restart and Rerun Batch lists within the Batch Execution window.

9.8 View Log


View Log in the Infrastructure system enables you to view the execution status of each task
component defined in a Batch.

NOTE: Currently, only a limited number of Component Types are supported for viewing logs. The
supported component types can be viewed in the Component Type drop-down list in the
Search grid.

You should have the Batch Read Only User Role mapped to your User Group to view the log.


Figure 223: View Log window

The View Log window displays Task ID information such as Component, Task Name, Task ID,
Process Type, Status, Start Date, End Date, Elapsed Time, User, Batch Run ID, As of Date, Process Step,
Records Processed, and Number of Errors for the respective Component Type selected.

9.8.1 Search and View Task ID Log


To search for a Task ID and view the log information:
1. Specify the details in any or all of the following parameters.
The following table describes the fields in the Search and View Task window.

Table 106: Fields in the Search and View Task window and their Descriptions

Component Type: Select the Component Type from the drop-down list. The available component
types are listed, and based on the component type selected, the Task ID details are displayed.
For example, if the component type selected is Object Validation, the Task ID Information section
displays the Date, Component, Batch Run ID, and Task ID.
Note: No log records are displayed for some component types such as SQL Rules. This is a limitation.

As Of Date: Select the date using the Calendar. This field is not applicable for some component types.

Folder: Select the folder from the drop-down list. This field is not applicable for some component
types.


Task Name: This field is not applicable for some component types. Click the browse button; the
Task Name Browser window is displayed.
• Search for the required Task by entering a keyword in the Search field and clicking the
search button.
• Select the required task from the Available Task list and click the select button. You can
also click the deselect button to remove a Task from the selected list.
• Click OK.

User: Enter the user details. This field is not applicable for some component types.

Batch Run ID: Enter the Batch Run ID, which has a unique ID (timestamp) and a short
description for identification. This field is not applicable for some component types.

2. Click Search. The Task ID Information section displays the search results based on the
specified parameters.

You can click Reset to reset the search fields.


3. In the Task ID Information section, click the Task ID of the required component. The View Log
Details window is displayed with additional information.

NOTE: There are differences in timestamps between View Log and
FSI_MESSAGE_LOG.

9.9 References
This section of the document consists of information related to intermediate actions that needs to be
performed while completing a task. The procedures are common to all the sections and are referenced
where ever required. You can refer to the following sections based on your need.

9.9.1 Task Component Parameters


Components are individual functional units that are put together to form a process. Task Component
Parameters reflect the parameters that are applied to the selected task. Each component
triggers its own set of processes in the back end to achieve the final output.
The parameters required for each of the component IDs are as tabulated.

NOTE The FIRERUN Component in ICC is not supported.


9.9.1.1 Component: AGGREGATE DATA


The following table describes the properties of the Aggregate Data component.

Table 107: Aggregate Data Properties and their Descriptions

Cube Parameter: Refers to the cube identifier as defined through the Business Metadata (Cube) menu
option. Select the cube code from the drop-down list.

Operation: Select the operation to be performed from the drop-down list. The available options
are ALL, GENDATAFILES, and GENPRNFILES.

Optional Parameters: Refers to additional parameters that have to be processed during runtime. You
can specify the runsk value that should be processed as a runtime parameter during execution. By
default, the value is set to “null”.

9.9.1.2 Component: CREATE CUBE


The following table describes the fields of the Create Cube.

Table 108: Fields in the Create Cube and their Description

Cube Parameter: Refers to the cube identifier as defined through the Business Metadata (Cube) menu
option. Select the cube code from the drop-down list.

Operation: Refers to the operation to be performed. Select the required Operation from the
drop-down list. The options are:
• ALL – This option will execute BUILDDB and DLRU.
• BUILDDB – This option should be used to build the outline in the Essbase cube. The
outline is built based on the parentage file(s) contents.
• TUNEDB – This option should be used to analyze data and optimize cube
settings. For example, if you are trying to achieve the best block size, where 64K
bytes is the ideal size.
• PROCESSDB – This option will execute BUILDDB and DLRU, and is the same as the
ALL option. Selecting this option will internally assign it as ALL.
• DLRU – This option should be used to load data into the Essbase cube and trigger
a Rollup.
• ROLLUP – ROLLUP refers to populating data in parent nodes based on
calculations (for example, addition). This option should be used to trigger just the
ROLLUP operation, wherein the CALC scripts are executed. The same is applicable
for the DLRU option also.
• VALIDATE – This option will validate the outline.
• DELDB – This option will delete the Essbase cube.
• OPTSTORE – This option will create the optimized outline for the cube.


9.9.1.3 Component: EXTRACT DATA


The following table describes the fields of the Extract Data component.

Table 109: Fields in the Extract Data and their Descriptions

Source Name: Select the source from which the extract you want to execute is derived, from the
drop-down list. Sources defined in the Source Designer window of Data Management Tools are
displayed in the drop-down list.

Extract Name: Select the required extract name from the drop-down list. The list displays the Data
Mapping definitions (T2F and H2F) defined on the selected source, from the Data Mapping window.

Default Value

9.9.1.4 Component: LOAD DATA


The following table describes the fields of the Load Data.

Table 110: Fields in the Load Data and their Descriptions

Load Mode: Select the load mode from the drop-down list. The options are Table to Table and
File to Table. Table to Table should be selected for Data Mapping definitions such as T2T, T2H,
H2T, H2H, and L2H definitions. File to Table should be selected for Data Mapping definitions such
as F2T and F2H definitions.

Source Name: Select the required source on which the Data Mapping or Data File Mapping
definition you want to execute is defined, from the drop-down list. Based on the selection of Load
Mode, the list displays the corresponding sources.

File Name: Select the Data Mapping or Data File Mapping definition you want to execute, from
the drop-down list. Based on the selected Load Mode and Source Name, the list displays the
corresponding definitions.

Data File Name: The data file name refers to the .dat file that exists in the database. Specifying
Data File Name is mandatory when Load Mode is File to Table and optional when it is Table to
Table. If the file name or the .dat file name is incorrect, the task fails during execution.
In case of L2H, you can specify the WebLog name.

Default Value: Used to pass values to the parameters defined in the Load Data definition.
You can pass multiple runtime parameters while defining a batch by specifying the
values separated by commas.
For example, $MIS_DATE=value,$RUNSKEY=value,[DLCY]=value and so on.
Note the following:


• The parameters can either be specified with $ or within [ ]. For example,
$RUNSKEY=value or [RUNSKEY]=value. When the definition is saved from the
UI, no value is assigned to these parameters and they are passed for syntax
correctness only. Actual values will be passed to these parameters while defining
an ICC batch or a RUN.
• The list of valid Default Parameters are:
 RUNID- Data type is String and can be mapped to VARCHAR2
 PHID- Data type is String and can be mapped to VARCHAR2
 EXEID- Data type is String and can be mapped to VARCHAR2
 RUNSK- Data type is Integer and can be mapped to VARCHAR2 or INTEGER.
 SYSDATE- Data type is Date and can be mapped to DATE, VARCHAR2.
 TASKID- Data type is String and can be mapped to VARCHAR2
 MISDATE- Data type is Date and can be mapped to DATE, VARCHAR2.
 BATCHRUNID- Data type is String and can be mapped to VARCHAR2
Note: RUNID, PHID, EXEID, RUNSK, MISDATE, and BATCHRUNID are implicitly
passed through RRF. The rest must be explicitly passed.
 EXEC_ENV_SOURCE - This parameter is used to replace an External Data
Source or an Infodom-based Data Source of the T2T, T2H, H2T or H2H definition
during run time, provided the structure of the source in the mapping
definition is the same as that of the replacing source. Hence you can convert a
T2T definition into H2T, or T2H into H2H, and so on. If the resultant definition
is T2T, then T2T execution using the CPP engine is not supported.
For an external Data Source, prefix it with ‘EXT.’ and for Infodom-based sources,
prefix it with ‘INF.’. For example, [EXEC_ENV_SOURCE]=EXT.<newSourceName>
or
[EXEC_ENV_SOURCE]=INF.<newSourceName>
Additionally, Cluster properties of the currently logged-in Infodom will be
considered for the execution of the Data Mapping definition.
• EXEC_ENV_SOURCE_OWNER_INFODOM –This parameter is used to specify the
Infodom where the Data Source being replaced (<newSourceName>) was
created, in case that Infodom is different from the current Infodom where the
batch is executed. If this is not provided, it will look for the Data Source in the
current Infodom and may result in failed execution.
• EXEC_ENV_TARGET - This parameter is used to replace the target Infodom of the
T2T, T2H, H2T or H2H definition during run time, provided the structure of the
target in the mapping definition is the same as that of the replacing target. Hence
you can convert a T2T definition into T2H, or H2T into H2H, and so on. But if the
resultant definition is T2T, then T2T execution using the CPP engine is not
supported.
For example, [EXEC_ENV_TARGET]=newTargetName
Also, DMT Configurations and Cluster properties of the new target Infodom will
be considered for the execution of the Data Mapping definition.
Note: You can use both EXEC_ENV_SOURCE and EXEC_ENV_TARGET together
as well. The only limitation is that if the resultant definition is T2T, execution
using the CPP engine is not supported.
Note: If you are converting a mapping definition to T2H using
EXEC_ENV_SOURCE/EXEC_ENV_TARGET, there is no provision in UI to specify


the Split By Column/Generic Options. In such scenarios, execution via Sqoop
may fail when the split-by column is defaulted to a string/date column.
• EXECUTION_ENGINE_MODE- This parameter is used to execute H2H on Spark.
For example, [EXECUTION_ENGINE_MODE]=SPARK
• CLOSE_SPARK_SESSION- This parameter is used to close the Spark session after
executing the last H2H-Spark task in the batch.
In a batch execution, a new Spark session is created when the first H2H-Spark
task is encountered, and the same Spark session is reused for the rest of the
H2H-Spark tasks in the same run. For the Spark session to close at the end of
the run, user needs to set the CLOSE_SPARK_SESSION to YES in the last H2H-
spark task in the batch.
For example, [CLOSE_SPARK_SESSION]=YES
• SRCHINT- This parameter is used to provide Source Hints. For example,
[SRCHINT]= FIRST_ROWS(2)
Note that the value should not contain /*+ */. Only the content should be
given.
• SRCPRESCRIPT- This parameter is used to provide Source Prescript.
Note: ALTER keyword is not supported.
• TARGETHINT- This parameter is used to provide Target Hints. For example,
[TARGETHINT]= FIRST_ROWS(2)
Note that the value should not contain /*+ */. Only the content should be
given.
• TARGETPRESCRIPT- This parameter is used to provide Target Prescript.
Note: ALTER keyword is not supported.
Apart from these, L2H/H2H/T2H/H2T/F2H data mappings also support the following
additional default parameters. Values for these are implicitly passed from ICC/RRF.
• $MISDT_YYYY-MM-DD - Data type is String and can be mapped to VARCHAR2.
Value will be the MISDATE in ‘yyyy-MM-dd‘ format.
• $MISYEAR_YYYY - Data type is String and can be mapped to VARCHAR2. Value
will be the year value in ‘yyyy‘ format from MISDATE.
• $MISMONTH_MM - Data type is String and can be mapped to VARCHAR2. Value
will be the month value in ‘MM‘ format from MISDATE.
• $MISDAY_DD - Data type is String and can be mapped to VARCHAR2. Value will
be the date value in ‘dd‘ format from MISDATE.
• $SYSDT_YYYY-MM-DD- Data type is String and can be mapped to VARCHAR2.
Value will be the System date in ‘yyyy-MM-dd‘ format.
• $SYSHOUR_HH24 - Data type is String and can be mapped to VARCHAR2. Value
will be the hour value in ‘HH24‘ format from System date.
Note: The aforementioned parameters are not supported for T2T and F2T.
• Only those variables which start with $ or [ will be replaced at run time, and the
value of the variable will be equal to anything starting after “=” and ending
before the comma “,”.
For example, if $DCCY/[DCCY]=’USD’, $RUNSKEY=1, then the replaced value in the
query for $DCCY will be ‘USD’ and for $RUNSKEY will be 1.
• If you are using “RUNSKEY” parameter in ICC Batch, then ensure that you specify
the value of it instead of specifying $RUNSKEY / [RUNSKEY]. For example,


FCT_STANDARD_ACCT_HEAD.N_RUN_SKEY=’$RUNSKEY’, since the value of
RUNSKEY will not be replaced during runtime.
• If there are quotes specified in parameter name, then ensure not to use quotes
while defining the expression or vice versa to avoid SQL errors. For example, if
the parameter name is $DCCY=’USD’ and the expression is defined using
‘$DCCY’ instead of $DCCY, then the final value will be ‘ ‘USD’ ’.
• When you execute a RUN, the run is always tagged with a RUNSK value (a unique
value for each run fired directly from the RRF). You might have a DERIVED
COLUMN in your T2T with expression like $RUNSK. If you execute this T2T
through a RUN, a unique RUNSK value is passed implicitly to the T2T engine,
which then assigns that value wherever $RUNSK is found. But if you try to
execute the T2T through ICC, then you need to explicitly pass a $RUNSK as a
parameter so that the T2T engine can use it.
Two additional parameters are now supported for L2H mappings:
• [INCREMENTALLOAD] – Specify the value as TRUE/FALSE. If set to TRUE,
historically loaded data files will not be loaded again (load history is checked
against the definition name, source name, target infodom, target table name and
the file name combination). If set to FALSE, the execution is similar to a snapshot
load, and everything from the source folder/file will be loaded irrespective of
load history.
• [FOLDERNAME] – Value provided will be used to pick up the data folder to be
loaded.
 For HDFS based Weblog source: Value will be suffixed to HDFS File Path
specified during the source creation.
 For Local File System based Weblog source: By default the system will look
for execution date folder (MISDATE: yyyymmdd) under STAGE/<source
name>. If the user has specified the FOLDERNAME for this source, system
will ignore the MISDATE folder and look for the directory provided as
[FOLDERNAME].
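Putting several of the parameters above together, a hypothetical Default Value entry for an H2H task executed on Spark might look like the following; the values shown are placeholders, not product defaults:

```text
$RUNSKEY=101,[DLCY]='USD',[EXECUTION_ENGINE_MODE]=SPARK,[CLOSE_SPARK_SESSION]=YES
```

Each parameter is separated by a comma, and each can be written either with a leading $ or within square brackets, as described above.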

9.9.1.5 Component: MODEL


The following table describes the fields of the Model.

Table 111: Fields in the Model and their Descriptions

Rule Name: Refers to the model that has to be processed. This is a system-generated code that is
assigned at the time of model definition.

Operation: The ALL definition for the Operation field conveys the process of extracting the data
from the flat files and applying the run regression on the extracted data.
For Batches that are being built for the first time, the data will be extracted from the
flat files and the run regression will be applied on it.

Optional Parameters: Refers to the set of parameters specific to the model that has to be processed.
This set of parameters is automatically generated by the system at the time of definition.
You must NOT define a Model using the Define mode under Batch Scheduling. You
must define all models using the Modeling framework menu.


9.9.1.6 Component: PROCESS_EXECUTION


This component combines all the rules to create single or multiple merge queries. Only rules
defined on the same dataset can be merged. For the creation of queries, the current order of the rules
inside the process or sub-process is taken into consideration. The following validations are performed
on subsequent rules to determine whether single or multiple DMLs are created for merging Rules:
• For a classification-classification or classification-computation rule combination, the target
column of the prior classification rule must not be used as a source hierarchy in any of the
subsequent rules in the executable process or sub-process. Also, the same target hierarchy must
not be used as a target in the subsequent rule.
• For a computation-computation rule combination, the target measures of the prior computation
rule must not be used in any of the subsequent computation rules in the executable process or
sub-process.
All the merge queries created after satisfying all the conditions will be executed in a single transaction.
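The merge-eligibility validations above can be sketched as follows. This is a simplified illustration only, with rules reduced to hypothetical dictionaries of source and target names; it is not the actual RRF engine logic:

```python
# Simplified sketch of the rule-merge validations described above.
# Each rule is modeled as a dict with a 'type' ('classification' or
# 'computation'), a set of 'sources', and a set of 'targets'.
def can_merge(prior, subsequent):
    """Return True if `subsequent` may be merged after `prior`."""
    if prior["type"] == "classification":
        # Prior classification targets must not appear as sources in the
        # subsequent rule, nor be reused as targets there.
        return (prior["targets"].isdisjoint(subsequent["sources"])
                and prior["targets"].isdisjoint(subsequent["targets"]))
    # computation-computation: prior target measures must not be reused
    # as sources in the subsequent computation rule.
    return prior["targets"].isdisjoint(subsequent["sources"])
```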

NOTE:
• The RRF framework cannot validate the semantic correctness of the rules grouped for
merge. It is left to the application developer/user to make a conscious choice.
• If the merge results in an ill-formed or runaway SQL, the framework will not be able to
detect it at design time. This again is left to the application developer/user to design a
grouping that is syntactically valid.

The following table describes the fields in the Process Execution.

Table 112: Fields in the Process Execution and their Description

Process Code: Displays the codes of the RRF Processes defined under the selected Infodom. Select
the required Process from the drop-down list.

Sub Process Code: Displays the codes of the Sub Processes available under the selected Process.
Select the required Sub Process from the drop-down list.

Build Flag: Select the required option from the drop-down list as “Yes” or “No”.
Build Flag refers to the pre-compiled rules, which are executed with the query stored
in the database. While defining a Rule, you can make use of the Build Flag to speed up the Rule
execution process by making use of existing technical metadata details, wherein the
rule query is not rebuilt again during Rule execution.
Build Flag status set to “No” indicates that the query statement is formed
dynamically by retrieving the technical metadata details. If the Build Flag status is set to
“Yes”, then the relevant metadata details required to form the rule query are stored in the
database on “Save” of a Rule definition. When this rule is executed, the database is
accessed to form the rule query based on the stored metadata details, thus ensuring
performance enhancement during Rule execution. For more information, refer to
Significance of Pre-Built Flag.


Optional Parameters: Refers to the set of parameters which would behave as filter criteria for the
merge query.

9.9.1.7 Component: RULE_EXECUTION


The following table describes the fields in the Rule Execution.

Table 113: Fields in the Rule Execution and their Descriptions

Field Description

Rule Code Display the codes of the RRF Rules defined under the selected Infodom.

Build Flag   Select the required option from the drop-down list as "Yes" or "No".
Build Flag refers to pre-compiled rules, which are executed with the query stored in the database. While defining a Rule, you can use the Build Flag to speed up Rule execution by reusing existing technical metadata details, so that the rule query is not rebuilt during Rule execution.
A Build Flag status of "No" indicates that the query statement is formed dynamically by retrieving the technical metadata details. If the Build Flag status is set to "Yes", the relevant metadata details required to form the rule query are stored in the database when the Rule definition is saved. When this rule is executed, the database is accessed to form the rule query based on the stored metadata details, thus ensuring better performance during Rule execution. For more information, see Significance of Pre-Built Flag.

Optional Parameters   Refers to the set of parameters that act as filter criteria for the merge query.

9.9.1.8 Component: RUN DQ RULE


The following table describes the fields in the Run DQ Rule.

Table 114: Fields in the Run DQ Rule and their Descriptions

Property Description

DQ Group Name   Refers to the Data Quality Groups consisting of associated Data Quality Rule definition(s). Select the required DQ Group from the drop-down list.

Rejection Threshold   Specify the Rejection Threshold (%) limit as a numeric value. This refers to the maximum percentage of records that can be rejected in a job. If the percentage of failed records exceeds the Rejection Threshold, the job fails. If the field is left blank, the default value is set to 100%.

Additional Parameters   Specify the Additional Parameters as filtering criteria for execution in the pattern Key#Data type#Value;Key#Data type#Value; and so on.
Here, the Data type of the value should be "V" for Varchar/Char, "D" for Date with the "MM/DD/YYYY" format, or "N" for numeric data. For example, if you want to filter some specific region codes, you can specify the Additional Parameters value as $REGION_CODE#V#US;$CREATION_DATE#D#07/06/1983;$ACCOUNT_BAL#N#10000.50;
Note: If the Additional Parameters are not specified, the default value is fetched from the corresponding table in the configuration schema for execution.

Parameters   Comma-separated parameters, where the first value is considered the threshold percentage, followed by additional parameters that are a combination of three tokens. For example, "90","PARAM1","D","VALUE1","PARAM2","V","VALUE2".
Note: The parameter "Fail if threshold is breached" is defaulted to "Yes" for RRF executions.

Optional Parameter   For DQ Rule execution on Spark, specify EXECUTION_VENUE=Spark in this field.
Note: You must have registered a cluster from the DMT Configurations > Register Cluster window with the following details:
• Name - Enter the name of the Hive information domain.
• Description - Enter a description for the cluster.
• Livy Service URL - Enter the Livy Service URL used to connect to Spark from OFSAA.
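The Key#Data type#Value pattern described above can be assembled programmatically. The following Python sketch is illustrative only; the helper function is not part of OFSAA and its name is an assumption made for this example:

```python
# Illustrative helper (not part of OFSAA) for composing the Additional
# Parameters string in the Key#Data type#Value; pattern described above.
# Data types: "V" (Varchar/Char), "D" (Date, MM/DD/YYYY), "N" (numeric).

def build_additional_params(params):
    """params: list of (key, dtype, value) tuples."""
    tokens = []
    for key, dtype, value in params:
        if dtype not in ("V", "D", "N"):
            raise ValueError(f"Unsupported data type: {dtype}")
        tokens.append(f"{key}#{dtype}#{value}")
    # Each Key#Data type#Value triplet is terminated with a semicolon.
    return ";".join(tokens) + ";"

print(build_additional_params([
    ("$REGION_CODE", "V", "US"),
    ("$CREATION_DATE", "D", "07/06/1983"),
    ("$ACCOUNT_BAL", "N", "10000.50"),
]))
# $REGION_CODE#V#US;$CREATION_DATE#D#07/06/1983;$ACCOUNT_BAL#N#10000.50;
```

Validating the data type token before joining avoids silently passing an unsupported filter to the execution.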

9.9.1.9 Component: RUN EXECUTABLE


The following table describes the fields in the Run Executable.

Table 115: Fields in the Run Executable and their Descriptions

Field Description

Executable   Refers to the executable path on the DB Server. The Executable parameter contains the executable name as well as the parameters to the executable. These executable parameters have to be specified as they are specified at a command line. In other words, the Executable parameter is the exact command line required to execute the executable file.
Enter the path to the executable in quotes if the executable name has a space in it. In other words, the details entered here should look exactly as you would enter them in the command window while calling your executable. The parameter value is case-sensitive, so ensure that you take care of the spaces, quotes, and case. Also, commas are not allowed while defining the parameter value for the executable.
To pass parameters like $RUNID, $PHID, $EXEID, and $RUNSK to the RUN EXECUTABLE component, specify RRFOPT=Y or rrfopt=y along with the other executable details.


Wait   When the file is being executed, you can choose to either wait till the execution is completed or proceed with the next task. Select Y (Yes) or N (No) from the drop-down list.
• Y - Select this if you want to wait for the execution to be completed.
• N - Select this if you want to proceed with the next task.
If the task uses the FICGEN/RUN EXECUTABLE component and no precedence is set for the task, then Wait should always be set to 'N'.

Batch Parameter   Y - Select Yes if you want to pass the Batch parameters to the shell script file being executed.
• If Wait is selected as Y and Batch Parameter is selected as Y, the following parameters are passed to the executable:
NIL <BatchExeRunID> <ComponentId> <Task> <Infodate> <Infodom> <DatstoreType> <IPAddress>
• If Wait is selected as N and Batch Parameter is selected as Y, the following parameters are passed to the executable:
<BatchExeRunID> <ComponentId> <Task> <Infodate> <Infodom> <DatstoreType> <IPAddress>
N - Select No if the Batch parameters should not be passed to the shell script.

Optional Parameters   This field is considered only if you have specified RRFOPT=Y or rrfopt=y in the Executable field. Specify the optional parameters that you want to pass to the executable, for example, $RUNID, $PHID, $EXEID, $RUNSK.
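An executable invoked with Batch Parameter set to Y receives the parameters in the order listed above, with a leading literal NIL when Wait is Y. The following Python sketch is illustrative only (it is not an Oracle-supplied script) and shows one way such a script might normalize its arguments:

```python
# Illustrative sketch (not an Oracle-supplied script) of how an executable
# invoked by the RUN EXECUTABLE component might read the Batch parameters
# described above when Batch Parameter is set to Y.
import sys

def parse_batch_args(argv):
    """argv: arguments after the script name, in the documented order."""
    # When Wait is Y, the first token is the literal "NIL"; drop it so both
    # invocation styles yield the same parameter mapping.
    if argv and argv[0] == "NIL":
        argv = argv[1:]
    keys = ("BatchExeRunID", "ComponentId", "Task",
            "Infodate", "Infodom", "DatstoreType", "IPAddress")
    return dict(zip(keys, argv))

if __name__ == "__main__":
    params = parse_batch_args(sys.argv[1:])
    print(params.get("BatchExeRunID"), params.get("Infodate"))
```

Stripping the optional NIL token up front lets the same script work whether Wait was selected as Y or N.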

9.9.1.10 Component: SQLRULE


The following table describes the fields in the SQL Rule.

Table 116: Fields in the SQL Rule and their Descriptions

Field Description

Folder   Refers to the location where the SQL Rule definition resides. Click the drop-down list in the Value column to select the desired Folder.

SQL Rule Name   Refers to the defined SQL rule. Click the drop-down list in the Value column to select the SQL Rule.

9.9.1.11 Component: TRANSFORM DATA


The following table describes the fields in the Transform Data.


Table 117: Fields in the Transform Data and their Descriptions

Field Description

Rule Name   Refers to the Data Transformation name that was defined in the Post Load Changes window of the Data Management Tools framework. Select the rule name from the drop-down list.

Parameter List   Is the list of parameters defined in the Data Transformation. The parameters must be in the same order as in the definition and must be separated by a comma (","), irrespective of the data type of the parameter defined in the procedure. A parameter specified through the front-end does not need to be specified within quotes (' ').
Note: Commas are used as delimiters for parameter values internally by the ICC Batch component. Ensure that commas are not used in any of the parameter values; that is, "a, b, c" should not be a parameter value in the list of parameter values being passed to the TRANSFORM DATA task. For example, if the parameter values to this task are required to be passed as (val1, val2, (a, b, c), val4), the correct way would be to pass these values as (val1, val2, (a*b*c), val4). You can use any other character as a separator.
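The comma-delimiter restriction above can be handled with a small pre-processing step. The following Python sketch is illustrative only; the helper is not part of OFSAA, and the "*" separator is just the example used in the note above:

```python
# Illustrative helper (not part of OFSAA) for preparing a TRANSFORM DATA
# parameter list. Embedded commas are replaced with an alternate separator
# (here "*", as in the example above) before the values are joined with the
# commas that the ICC Batch component uses as its own delimiter.

def build_parameter_list(values, sep="*"):
    """values: parameter values in the same order as the procedure definition."""
    return ", ".join(v.replace(",", sep) for v in values)

print(build_parameter_list(["val1", "val2", "(a,b,c)", "val4"]))
# val1, val2, (a*b*c), val4
```

The receiving procedure would then split its third value on "*" to recover the original components.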

9.9.1.12 Component: VARIABLE SHOCK


The following table describes the fields in the Variable Shock.

Table 118: Fields in the Variable Shock and their Descriptions

Field Description

Variable Shock Code   Refers to the variable shock that has to be processed. This is a system-generated code that is assigned at the time of variable shock definition.

Operation   Refers to the operation to be performed. Click the drop-down list in the Value field to select the Operation. The available options are ALL, GENDATAFILES, and GENPRNFILES.

Optional Parameters   Refers to the Process ID and the User ID. Click in the text box adjacent to the Optional Parameters field and enter the Process ID and User ID.

9.9.1.13 Component: Workflow Execution


The following table describes the fields in the Workflow Execution.

Table 119: Fields in the Workflow Execution and their Descriptions

Field Description

Object ID   Enter an object ID of your choice. This ID appears as the Entity ID in the Process Monitor window.

Workflow   Select the workflow you want to execute from the drop-down list. It displays all the workflows defined in the Process Modeller window.


Optional Parameters   Enter the value you want to pass to the Dynamic Parameters of the Run Task during the execution of the workflow.



QUESTIONNAIRE
KNOW THE QUESTIONNAIRE WORKFLOW

10 Questionnaire
The Questionnaire is an assessment tool that presents a set of questions to users and collects their answers for analysis and conclusion. It can be interfaced or plugged into OFSAA application packs, for example, the Enterprise Modeling Framework (EMF) application pack. It is role and permission-based, and you can create a library of questions and use the library to create a questionnaire.

NOTE: In the examples mentioned in this topic, it is assumed that the Questionnaire window is configured to appear in the Application Builder Component in Common Tasks.

The topics discussed in this guide are specific to end-users. However, if you are looking for
information on configuring the Questionnaire, see the Oracle Financial Services Analytical
Applications Infrastructure Administration User Guide.

10.1 Know the Questionnaire Workflow


The Questionnaire provides the following functions on the OFSAA user-interface:
• Configure the Questionnaire Attributes
• Define the Questions
• Define the Questionnaires
The workflow for the Questionnaire starts with the configuration of the Questionnaire Attributes. You
must have the required user roles and permissions assigned to your profile to configure the
Questionnaire Attributes. After you or a user with the requisite access has configured the attributes,
you can define and include questions in the Questions Library. You can combine questions and
questionnaire attributes to create Questionnaires.

NOTE: Access to the Questionnaire menus is based on roles and permissions granted to users.


Figure 224: Questionnaire Flow chart

10.2 Questionnaire Types


Create the following types of Questionnaire in OFSAA as required:
1. Basic: This Questionnaire type follows a linear sequence in the flow. For example, if there are 20
questions in the questionnaire, the questions start from 1 and end at 20.
2. Decision Tree (DT): This Questionnaire type displays the next question based on the answer
selected for the current question. For example, a question, “Are you living in the US?”, can
display the answer options “Yes” or “No”. If you select “Yes”, the next question displayed can be
“Which State are you from?”. The list can display states in the US in a drop-down list. However,
if you answer “No”, the next question displayed can be “Which Country are you from?”. For this
question, the list can display countries in a drop-down list.
3. Score Based: In this Questionnaire type, you assign a number value to a question for it to be considered in the set of questions. The score represents the percentage of the questionnaire's total that the question contributes. For example, a question with a score of 20 out of 100 contributes 20% of the score of the questionnaire. Score-based questionnaires, by default, are of the type Basic. However, you can select branching logic on the UI and make a score-based questionnaire of the Decision Tree type.
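The score-based arithmetic can be sketched as a weighted sum. The following Python example is illustrative only and is not OFSAA code; the function name and the achieved-fraction representation are assumptions made for this sketch:

```python
# Illustrative sketch (not OFSAA code) of how a score-based questionnaire's
# overall result can be derived from per-question scores: each question's
# assigned score is its weight out of the questionnaire total.

def questionnaire_score(results):
    """results: list of (assigned_score, achieved_fraction) pairs, where
    achieved_fraction is between 0.0 and 1.0 for each answered question."""
    return sum(score * fraction for score, fraction in results)

# A question scored 20 out of a 100-point questionnaire contributes 20%
# of the total when fully achieved.
total = questionnaire_score([(20, 1.0), (30, 0.5), (50, 1.0)])
print(total)  # 85.0
```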

10.3 Use Search in the Questionnaire


Search for existing questionnaire attributes from the Questionnaire Attributes Configuration window,
search for existing questions from the Questions Library window, and search for existing
questionnaires from the Questionnaire Library window. The respective windows display a Search
section at the top. There are two types of search:
1. Basic Search – a simple form of search.


2. Advanced Search – a complex form of search with combinations to filter results.

10.3.1 Use the Basic Search


The basic search is the default search. Enter the nearest matching keywords to search, and filter the
results by entering information in the additional fields.

Click Go to start a search and click Reset to clear the Search fields.

10.3.2 Use the Advanced Search


The Advanced Search option helps you find information faster and for specific combinations. Click
Advanced Search from the Search toolbar to display the Advanced Search fields.

Click Go to start a search and click Reset to clear the Search fields.

10.3.2.1 Description of the Search Fields


The search section provides fields to enter details and filter search results. The following table provides descriptions for the fields (both Basic and Advanced Search) on the various windows in the Questionnaire.

Table 120: Fields in the Basic and Advanced Search windows and their Descriptions

Field Description

Questionnaire Attributes Configuration

Component   Select the type of questionnaire component configured in the system from the drop-down.

Subcomponent Select the subcomponent for the selected Component.

Questions Library

ID Enter the system-generated identifier for the question. This is a unique value.

Question Enter the title of the question.

Category   Select the category of classification for the question from the following options:
• External
• IT
• Infrastructure

Question Type   Select the type of question from the following options:
• Single Choice
• Multiple Choice
• Free Text
• Number
• Range


Display Type   Select the type of user-interface element that is displayed, for example, drop-down, text field, and so on. The options are available based on the Question Type selected.

Status Select the status of the question. For example, Draft, Open, and so on.

Last Modified From   Select the From date for the last update on the question to search in a date range.

Last Modified To   Select the To date for the last update on the question to search in a date range.

Questionnaire Library

ID   Enter the system-generated identifier for the questionnaire. This is a unique value.

Name Enter the name of the questionnaire.

Component Select the type of questionnaire component configured in the system.

Type   Select the type of questionnaire from the following options:
• Basic
• Decision Tree
• Score Based

Status   Select the status of the questionnaire, for example, Draft, Open, Pending, and In Review.

Last Modified From   Select the From date for the last update on the questionnaire to search in a date range.

Last Modified To   Select the To date for the last update on the questionnaire to search in a date range.

10.4 Configure the Questionnaire Attributes


This feature allows you to configure the Questionnaire Attributes, which uniquely identify the Questionnaires that end users work with.
To access the Questionnaire Configuration window, expand the menu in the left pane where the
Questionnaire is configured and click Questionnaire. From the Questionnaire window, click
Questionnaire Configuration.

NOTE: Configure the Questionnaire to appear in the menu of your choice based on your application's requirement. For information on how to configure Questionnaire menus, see the Oracle Financial Services Advanced Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.


Figure 225: Questionnaire window

The window displays the list of defined Attributes. It also displays the OFSAA Application that is
interfaced to the Questionnaire module. For example, Financial Services Enterprise Modeling. Create,
modify, or delete Questionnaire Attributes from this window.

Figure 226: Questionnaire Attributes Configuration window

The following table describes the fields displayed on the Questionnaire Attributes Configuration
window.

Table 121: Fields in the Questionnaire Attributes Configuration window and their Descriptions

Field Description

Component   Displays the type of questionnaire component configured in the system.
Note: For information on configuring components, see the Oracle Financial Services Advanced Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.

Subcomponent Displays the subcomponent for the selected Component.

Attribute Code   Displays the code of the attribute as entered in the Add Attribute window. Once defined, this code cannot be edited.

Attribute Name   Displays the name of the attribute as entered in the Add Attribute window.

Attribute Value   Displays the condition executed at run time to display attribute values used on the Create Questionnaire window.

Is Mandatory   Displays whether the attribute is mandatory or not. The values are Yes and No.

Last Updated Displays the last updated date and time details for the attribute.

Selection Type Displays the Attribute Selection Type as entered in the Add Attribute window.


Associated Questionnaires   Displays the number of Questionnaires that are linked to the Attribute and are in Open and Pending Approval status.

Search for existing questionnaire attributes based on the Component. For more information, see the
Use Search in the Questionnaire section.

10.4.1 Add Questionnaire Attributes


You can use this option to create Questionnaire Attributes.
To add a Questionnaire Attribute:
1. Click Add from the Questionnaire Configuration window. The Add Attribute window is
displayed.

Figure 227: Add Attribute window

2. Enter the details for the fields in the Add Attribute window.
The following table describes the fields in the Add Attribute window.

Table 122: Fields in the Add Attribute window and their Descriptions

Field Description

Fields marked with a red asterisk (*) are mandatory.

Application   Displays the name of the OFSAA application that is interfaced to the Questionnaire module, for example, Financial Services Enterprise Modeling. This is a read-only field and is not editable.

Component   Select the Component from the drop-down list.
Note: For information on configuring components, see the Oracle Financial Services Advanced Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.


Subcomponent   Select the Subcomponent for the selected Component from the drop-down list. This field is enabled only if the selected Component is configured to have subcomponent(s).

Attribute Code   Enter the attribute code for the questionnaire attribute. This is a unique value. If the code exists in the system, a message displays "The Attribute Code exists in the system, enter another value".

Attribute Name   Enter a name for the questionnaire attribute. This is a unique value.

Mandatory   Select whether the attribute is mandatory or optional from the drop-down list. The options are Yes and No.

Attribute Type   The type of attribute that is displayed on the Questionnaire Definition window. For example, selecting DropDown displays a drop-down questionnaire in the Questionnaire Definition window. Similarly, SQL Query displays data fetched from the query on the Questionnaire Definition window.
Select the type of attribute from the drop-down list. The options are:
• DropDown - Select this if you want a drop-down list in the Questionnaire Definition window.
• SQL Query
• Hierarchy
• External
• Static
Note: Selecting any of these options results in the display of a different heading for the field right below the Attribute Type field. The fields are also of different types based on the Attribute Type selection. For example, selecting DropDown results in the display of a drop-down in the field below, and selecting SQL Query results in the display of a text field. The row "(Headings for the field below Attribute Type field.)" provides details for the different fields that appear on Attribute Type selection.

Attribute Selection Type   Select whether you want the attribute to be a single-selection or multiple-selection type attribute.


(Headings for the field below Attribute Type field.)   The options displayed in the field below the Attribute Type field are dynamic and vary based on the selection of the attribute type, as described in the following list:
• DropDown - Selecting this attribute type displays a drop-down, Dimension Source, with options that list the dimension tables acting as a source for the attribute being created. Select from the following options:
- Attr Dim Single
- Attributes Dimension Composite
Note: The preceding drop-down is displayed on the selection of DropDown as the dimension, and it is configurable. For information on configuring dimension tables, see the Oracle Financial Services Advanced Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.
• SQL Query - Selecting this attribute type displays a text field, SQL Query, where you have to enter a SQL query to fetch the data for the attribute being created.
• Hierarchy - Selecting this attribute type displays a drop-down, Hierarchy Source, with options that list the hierarchy codes acting as a data source for the attribute being created.
• External - Selecting this attribute type displays a text field, Web-Service URL, where you have to enter a Web-Service URL to fetch data for the attribute being created.
• Static - Selecting this attribute type displays a drop-down, Static Type, with options that list static types to fetch data for the attribute being created. Select from the following options:
- Is Default
- Sign Off Type
- Reassign Required
- Is Confidential
Note: The preceding drop-down is displayed on the selection of the Attribute Type as Static, and it is configurable. For information on how to configure it, see the Oracle Financial Services Advanced Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.


Source Options   Additional options for the values selected in the Static Type drop-down. This field is displayed when you select any of the following options from the Static Type drop-down:
• Sign Off Type
• Reassign Required
• Is Confidential
Select from the following options in the drop-down:
• Sign Off Type - the source options for this type are:
- Two Level Sign Off
- Single Sign Off
- No Sign Off
• Reassign Required - the source options for this type are:
- No
- Yes
• Is Confidential - the source options for this type are:
- No
- Yes

3. Click Save to save the questionnaire attribute or click Cancel to discard the changes and close
the window.

10.4.2 Edit the Questionnaire Attributes

NOTE: Attributes cannot be modified if they are linked to Questionnaires that are in Open or Pending Approval status and display a count greater than zero in the Associated Questionnaires column on the Questionnaire Attributes Configuration window.

Edit the questionnaire attributes from this window. Follow these steps to edit a questionnaire attribute:
1. Select an Attribute from the Questionnaire Configuration window that you want to edit.

2. Click Edit to display the Edit Attribute window.


3. Modify the details for the fields in the Edit Attribute window. See the Field Description table in
the Add the Questionnaire Attributes section for field details.


NOTE: The Application, Component, Subcomponent, and Attribute Code fields are not editable.

4. Click Save to save the edited questionnaire attribute or click Cancel to discard the changes and
close the window.

10.4.3 Delete the Questionnaire Attributes


Delete the questionnaire attributes in the Questionnaire Attributes window. However, you can delete only Questionnaire Attributes that do not have any Questionnaires linked to them.
Remove the Questionnaires linked to a Questionnaire Attribute before you delete it. For more information on how to remove Associated Questionnaires, see Edit Questionnaire From the Library, where the field Component corresponds to Questionnaire Attributes. For information on how to delete a Questionnaire, see Delete Questionnaire From the Library.
To delete a questionnaire attribute, follow these steps:
1. From the Questionnaire Attributes Configuration window, select the checkbox adjacent to the Attribute that you want to delete and click Delete. You can also select multiple rows to delete. A confirmation message is displayed.
2. Click Delete to delete the selected attribute(s) or click Cancel to discard the changes and close
the window.

10.5 Define the Questions


Define a library of questions from the Questions Library window that you can link to create a
Questionnaire.
To access the Questions Library window, expand the menu in the left pane where the Questionnaire is
configured and click Questionnaire. From the Questionnaire window, click Question Library.

NOTE: Configure the Questionnaire to appear in the menu of your choice based on your application's requirement. For information on how to configure Questionnaire menus, see the Oracle Financial Services Advanced Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.


Figure 228: Questions Library window

The window displays a list of defined Questions. Create, modify, copy, and delete Questions from this
window.
The following table describes the fields displayed on the Questions Library window.

Table 123: Fields in the Questions Library and their Descriptions

Field Description

ID   Displays the system-generated identifier for the question. This is a unique value.

Question Displays the title of the question.

Category   Displays the category of classification for the question from the following options: External, IT, and Infrastructure.

Question Type   Displays the type of question from the following options: Single Choice, Multiple Choice, Free Text, Number, and Range.

Display Type   Displays the type of user-interface element that is displayed, for example, drop-down, text field, and so on. The options are available based on the Question Type selected.

Questionnaires   Displays the number of questionnaires associated with the question. For example, 7 indicates that there are seven questionnaires linked to the question. Click the link to display the list of linked questionnaires in the Associated Questionnaires window.

Status Displays the status of the question. For example, Draft, Open, and so on.

Last Modified Displays the date and time for the last update on the question.

Search for existing questions based on ID and Question. For more information, see the Use Search in
the Questionnaire section.

10.5.1 Create the Questions in the Library


Create questions in the Questions Library window. Follow these steps to create a question:
1. Click Create Question from the Questions Library window to display the Question Details
window.
2. Enter the details for the fields in the Question Details window.
The following table describes the fields in the Question Details window.


Table 124: Fields in the Question Details window and their Descriptions

Field Description

ID   Displays the identification number of the question. This value is generated by the system during question creation and is unique.

Question Enter the question in this field.

Description Enter more details in the description of the question that you are creating.

Category   Select the category of classification for the question that you are creating from the drop-down options. For example:
• External - the question is of the category external.
• IT - the question is under the IT category.
• Infrastructure - the question is in the infrastructure category.
Note: This field is optional and these options are an example from the OR application. This field can be configured in the table AAI_ABC_DIM_QTN_CATEGORY and its MLS table.

Question Type   Select the type of user-interface elements for the question that you are creating from the following drop-down options:
• Single Choice - select to create a single-choice type of question.
• Multiple Choice - select to create a multiple-choice type of question.
• Free Text - select to create a free-text type of question.
• Number - select to create a type of question that requires a number input.
• Range - select to create a type of question that requires input in a defined range or a number input.
Note: When you select a Question Type option, details for the question type are displayed on the window. The instructions to enter the details are described in the following subsections:
• Select Question Type – Single Choice
• Select Question Type – Multiple Choice
• Select Question Type – Free Text
• Select Question Type – Number
• Select Question Type – Range

3. Click Save Draft to save the details, or click Submit if you have entered all details and are
ready to submit. Click Close to discard the changes and close the window.

10.5.1.1 Select the Question Type – Single Choice


Select Single Choice to create a question with a single-choice answer option. After you select this
option, add details for the list of answers that would be available to users as either a drop-down or a
radio button. Users can select only one from the list configured by you. The following list shows the
procedure to add the details:
1. Click Single Choice from Questions Type to display the Single Choice section in the
Question Details window.


2. Enter the details for the fields in the Question Details window.
The following table describes the fields in the Question Details window.

Table 125: Fields in the Question Details window and their Descriptions

Field Description

Display as Drop down   Select this option to display the answer choices in a drop-down.
Note: This option is selected by default.

Display as Radio Buttons   Select this option to display the answer choices in radio buttons.

Static   Select this option to make either the drop-down or radio buttons display static answer choices.
After you select this option, you must enter the values that appear in the static fields. Enter these values in the Response Options form appearing below it. The following steps show the procedure to enter response options:
• Click Add Option and enter the answer choice in the text field. To delete an option, select the checkbox on the option row that you want to delete and click Delete Option.
• Similarly, you can add more options. These options will appear in the choice of answers in either a drop-down or radio button format as selected by you.


Dynamic: Select this option to make either the drop-down or radio buttons display dynamic
answer choices. After you select this option, you are presented with various text fields and
condition options. Follow these steps to enter these values:
1. Enter the Primary Column from the database to fetch the answer from. This could be the key.
2. Enter the Display Column from the database to display the answer in the drop-down or the
radio buttons.
3. Enter the name of the table where the Primary Column and the Display Column exist in
Reference Table.
4. Enter the filter criteria to apply to the table data being fetched in Filter Condition. This step
is optional.
5. Click Validate to validate the query formed by these steps. On validation, the Preview
Options drop-down appears.
6. Enter the Option Type Column name in the Advanced section. The value entered here
appears in the Option Type Column in the Conditions section.
7. Click Add in the Conditions section and enter a name for the answer choice in the Name
text field. Select a condition from the Condition drop-down. For example, Not Equal To. Enter
the required data in the Option Value Type field. Select either Static or Dynamic from the
Scope drop-down. If you select Dynamic, then you must enter a subquery to filter the options
further. To delete a condition, select the checkbox on the condition row that you want to
delete and click Delete.
8. Similarly, you can add more conditions. These conditions appear in the choice of answers in
either a drop-down or radio button format, as selected by you.
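
Steps 1 through 5 above effectively assemble a lookup query from the four inputs. The sketch
below is an illustration only: the function, column, and table names are hypothetical, and the
query the module actually builds may differ in form.

```python
def build_options_query(primary_col, display_col, ref_table, filter_cond=None):
    """Sketch of how Primary Column, Display Column, Reference Table, and the
    optional Filter Condition could combine into the lookup that Validate checks."""
    query = f"SELECT {primary_col}, {display_col} FROM {ref_table}"
    if filter_cond:  # Filter Condition is optional (step 4)
        query += f" WHERE {filter_cond}"
    return query

# Hypothetical example: country codes as keys, country names as display values
print(build_options_query("COUNTRY_CODE", "COUNTRY_NAME", "DIM_COUNTRY",
                          "REGION = 'APAC'"))
# → SELECT COUNTRY_CODE, COUNTRY_NAME FROM DIM_COUNTRY WHERE REGION = 'APAC'
```

On successful validation of a query of this shape, the fetched rows become the answer choices
shown to the user.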

3. Click Save Draft to save the details or click Submit if you have entered all details and are
ready to submit. Click Close to discard the changes and close the window.

10.5.1.2 Select the Question Type – Multiple Choice


Select Multiple Choice to create a question with the option to choose multiple answers. After you
select this option, add details for the list of answers that will be available to users as either a
checkbox list or a combo box. Users can select multiple answers from the list you configure. Follow
these steps to add the details:
1. Click Multiple Choice from Question Type to display the Multiple Choice section in the
Question Details window.
2. Enter the details for the fields in the Question Details window.
The following table describes the fields in the Question Details window.

Table 126: Fields in the Question Details window and their Descriptions

Display as Checkbox List: Select this option to display the multiple-choice answers in a list of
checkboxes.
Note: This option is selected by default.

Display as a Combo Box: Select this option to display the multiple-choice answers in a combo
box list.

Static: Select this option to make either the checkbox list or combo box display static answer
choices. After you select this option, you must enter the values that appear in the static fields.
Enter these values in the Response Options form that appears below it. To enter response
options, click Add Option and enter the answer choice in the text field. To delete an option,
select the checkbox on the option row that you want to delete and click Delete Option.
Similarly, you can add more options. These options appear in the choice of answers in either a
checkbox list or combo box format, as selected by you.

Dynamic: Select this option to make the checkbox list or combo box display dynamic answer
choices. After you select this option, you are presented with various text fields and condition
options. Enter these values as described in the following steps:
1. Enter the Primary Column from the database to fetch the answer from. This could be the key.
2. Enter the Display Column from the database to display the answer in the checkbox list or
combo box.
3. Enter the name of the table where the Primary Column and the Display Column exist in
Reference Table.
4. Enter the filter criteria to apply to the table data being fetched in Filter Condition. This step
is optional.
5. Click Validate to validate the query formed by these steps. On validation, the Preview
Options drop-down appears.
6. Enter the Option Type Column name in the Advanced section. The value entered here
appears in the Option Type Column in the Conditions section.
7. Click Add in the Conditions section and enter a name for the answer choice in the Name
text field. Select a condition from the Condition drop-down. For example, Not Equal To. Enter
the required data in the Option Value Type field. Select either Static or Dynamic from the
Scope drop-down. If you select Dynamic, then you must enter a subquery to filter the options
further. To delete a condition, select the checkbox on the condition row that you want to
delete and click Delete.
8. Similarly, you can add more conditions. These conditions appear in the choice of answers in
either a checkbox list or combo box format, as selected by you.

3. Click Save Draft to save the details or click Submit if you have entered all details and are
ready to submit. Click Close to discard the changes and close the window.

10.5.1.3 Select the Question Type – Free Text


Select Free Text to create a question with either a text field or text area as the answer input option for
users. Follow these steps to add the details:
1. Click Free Text from Question Type to display the Free Text section in the Question
Details window.
2. Enter the details for the fields in the Free Text pane.
The following table describes the fields in the Free Text pane.

Table 127: Fields in the Free Text pane and their Descriptions

Display as Text Field: Select this option to input the answer in a text field.
Note: This option is selected by default.

Display as Text Area: Select this option to input the answer in a text area.

Question to be used while defining DT Logic?: Select Yes or No to apply Decision Tree logic to
the question.

3. Click Save Draft to save the details or click Submit if you have entered all details and are
ready to submit. Click Close to discard the changes and close the window.

10.5.1.4 Select the Question Type – Number


Select Number to create a question where users can input a numeric value as the response to the
question. Follow these steps to add the details:
1. Click Number from Question Type to display the Number section in the Question Details
window.
2. Enter the details for the fields in the Number section. For the Question to be used while
defining DT Logic? field, select Yes or No to apply Decision Tree logic to the question.
3. Click Save Draft to save the details or click Submit if you have entered all details and are
ready to submit. Click Close to discard the changes and close the window.

10.5.1.5 Select the Question Type – Range


Select Range to define numeric upper and lower limits, which form the range that users use to
respond to the question. After you select this option, add rows of upper and lower limit values
for the user to select, using either a radio button or a number field.
The ranges you define need not be continuous; however, they must not overlap. For example,
you can define Range 1 from 0 to 100 and Range 2 from 200 to 300. This is a non-continuous
range set, since Range 2 does not start from 101. However, you cannot define Range 1 from 0 to
100 and Range 2 from 100 to 200, since the upper limit of Range 1 (100) overlaps with the lower
limit of Range 2 (100).
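
The non-overlap rule from the example above can be sketched as a simple check. This is an
illustration of the rule only; the function name and structure are hypothetical and not part of the
product.

```python
def ranges_valid(ranges):
    """Return True if no two (lower, upper) ranges overlap.
    Ranges need not be continuous, but boundaries must not touch or cross."""
    ordered = sorted(ranges)  # sort by lower limit
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if lo2 <= hi1:  # next range starts at or before the previous upper limit
            return False
    return True

print(ranges_valid([(0, 100), (200, 300)]))  # non-continuous but valid: True
print(ranges_valid([(0, 100), (100, 200)]))  # shared boundary 100 overlaps: False
```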
Follow these steps to add the details:
1. Click Range from Question Type to display the Range section in the Question Details
window.
2. Enter the details for the fields in the Range pane.
The following table describes the fields in the Range pane.

Table 128: Fields in the Range pane and their Descriptions

Display as Range of Values: Select this option to display a drop-down list of range values for the
answer. Define the range in the Add Option/Delete Option section.
Note: This option is selected by default.

Display as a Number: Select this option to input the answer in number format.

Add Option/Delete Option for Range of Values: Add options in this section for the Range of
Values that you want to be available as the list of answers for the question. To enter the range
values, click Add Option and enter the range in the Lower Limit and Upper Limit fields. To delete
an option, select the checkbox on the option row that you want to delete and click Delete
Option. Similarly, you can add more range value options. These options appear in the choice of
answers as a list of range values.

3. Click Save Draft to save the details or click Submit if you have entered all details and are
ready to submit. Click Close to discard the changes and close the window.

10.5.2 Edit the Questions From the Library


Edit questions from the Questions Library window. Follow these steps to edit a question:
1. Click Question ID in the ID column in the Questions Library window to display the Questions
Details window.

2. Click Edit to enable editing the question in the Questions Details window.
3. Enter the details for the fields in the Question Details window. See the Field Description table in
Create the Questions in the Library section for field details.

NOTE The ID field is read-only.

4. Click Update to save the modified question. Click Submit after you are ready to submit
the edited question. Click Close to discard the changes and close the window.

10.5.3 Create Questions by Copying Existing Questions


Copy an existing question from the library to create a new question. All the contents of the question
are carried forward to the new question with a new ID. You can copy a question from the Questions
Library window and also from the Question Details window.

NOTE Associated Questionnaires are not copied over to the newly created question. You must
associate questionnaires separately.

Follow these steps to copy a question and create a new question from the Questions Library window:
1. Click Select to select a Question from the Questions Library window.

2. Click Copy Question .


A message is displayed on the successful execution of the copy operation.

10.5.4 Delete the Questions from the Library


Delete questions from the Question Library window. Follow these steps to delete a question:
1. Click Select to select a Question in the Questions Library window that you want to delete.

2. Click Delete Question to display the Delete Confirmation pop-up window.


3. Click OK to delete the question or click Cancel to discard and close the pop-up window.

NOTE You can delete a question only if it is in Draft status.

10.5.5 View the Associated Questionnaires


Questions are linked to Questionnaires (for more information, see the Link a Question to the
Questionnaire section), and you can view the details of these associations in this window. Follow
these steps to view associated questionnaires:
1. Click the Question ID on the ID column in the Questions Library window to display the
Questions Details window.
2. Click the Associated Questionnaires tab to display the Associated Questionnaires window.
View the associated Questionnaire details in this window.
The following table describes the fields in the Questions Library.

Table 129: Fields in the Questions Library and their Descriptions

ID: Displays the unique identifier number for the questionnaire.

Name: Displays the title of the questionnaire.

Application: Displays the application interfaced with the questionnaire.

Component: Displays the purpose of the use of the questionnaire.

Type: Displays the type of questionnaire from the following options: Basic, Decision Tree, and
Score Based.

No. of Questions: Displays the number of questions linked to the questionnaire.

Status: Displays the status of the questionnaire. For example, Draft, Open, and so on.

Last Modified: Displays the date and time of the last modified action on the questionnaire.

Note: For more details on the Questionnaire, see the Define the Questionnaires section.

3. Click the Details tab to go back to the Question Details window.


4. Click Close to go back to the Questions Library.

10.5.6 Wrap and Unwrap Questions from the Library


Wrap and unwrap questions from the library to collapse or expand the details entered in the fields.
Follow these steps to wrap and unwrap a question:
1. Click Select to select a Question from the Questions Library window.
2. Click Unwrap to unwrap a question. If the question is unwrapped, click Wrap .

10.6 Define the Questionnaires


Define the Questionnaires from this window by combining defined attributes and questions.
To access the Questionnaires Library window, expand the menu in the left pane where the
Questionnaire is configured and click Questionnaire. From the Questionnaire window, click
Questionnaire Library.

NOTE Configure the Questionnaire to appear in the menu of your choice, based on your
application's requirement. For information on how to configure Questionnaire menus, see the
Oracle Financial Services Advanced Analytical Applications Infrastructure Application Pack
Administration and Configuration Guide.

This window displays a list of existing Questionnaires. Create, modify, copy, and delete Questionnaires
from this window.

The following table describes the fields displayed in the Questionnaire Attributes Configuration
window.

Table 130: Fields in the Questionnaire Attributes Configuration window and their Descriptions

ID: Displays the system-generated identifier for the questionnaire. This is a unique value.

Name: Displays the name of the questionnaire.

Application: Displays the OFSAA application that is interfaced to the Questionnaire module. For
example, Financial Services Enterprise Modeling.

Component: Displays the type of questionnaire component configured in the system.
Note: For information on configuring components, see the Oracle Financial Services Advanced
Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.

Type: Displays the type of questionnaire from the following options: Basic, Decision Tree, and
Score Based.

No. of Questions: Displays the number of questions linked at the time of the creation of the
questionnaire.

Status: Displays the status of the questionnaire. For example, Draft, Open, Pending, and In
Review.

Last Modified: Displays the date and time of the last update on the questionnaire.

Search for existing questionnaires based on ID and Name. For more information, see Use Search in
the Questionnaire section.

10.6.1 Create the Questionnaire in the Library


Create questionnaires in the Questionnaires Library window. Follow these steps to create a
questionnaire:
1. Click Create Questionnaire from the Questionnaire Library window to display the
Questionnaire Details window.

NOTE To edit a Questionnaire, see the Editing the Questionnaire from the Library section.

2. Enter the details for the fields in the Questionnaire Details window.


The following table describes the fields in the Questionnaire Details window.

Table 131: Fields in the Questionnaire Details window and their Descriptions

Name: Enter a relevant name for the questionnaire in this field.

ID: Displays the identification number of the questionnaire. This value is generated by the system
during questionnaire creation and is unique.

Description: Enter a description of the questionnaire that you are creating.

Application: Displays the OFSAA application that is interfaced to the Questionnaire. For example,
Financial Services Enterprise Modeling.

Type: Select the type of questionnaire from the following drop-down options:
• Basic – select to create a questionnaire with questions that are arranged sequentially.
• Decision Tree – select to create a questionnaire that displays the next set of questions based
on the answer selected.
Note: Selecting this option displays the Result Categories drop-down.
• Hybrid – select to create a questionnaire that displays the next set of questions whether or not
the current question is answered. This is a combination of the Basic and Decision Tree types;
unlike Decision Tree, it does not make answering a question mandatory in order to display the
next question.
• Score Based – select to create a questionnaire that can apply scores based on the answer
selected.
Note: Selecting this option displays the Enable Branching Logic checkbox.

Enable Branching Logic: Select this checkbox to enable a score-based questionnaire to display
the next set of questions based on the answer to the current question.
Note: This field is displayed when you select Score Based in the Type drop-down.

Component: Select the required type of questionnaire component from the drop-down.
Note: For information on configuring components, see the Oracle Financial Services Advanced
Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.

User defined attributes: Select user-defined attribute values from the drop-down.
Note: For more information, see the Adding the Questionnaire Attributes section.

3. Click Save Draft to create the Questionnaire and save the details.

4. After you have entered the details described in the preceding table, you must create sections
and link questions to the sections. For simplicity, these topics are discussed in subsections
within this section. Click Edit and see the following sections for instructions:
• Create a Section in a Questionnaire
• Link a Question to a Questionnaire
• Configure the Questions in a Section
• Rearrange the Sequence of Sections and Questions
• Delink a Question From a Questionnaire
• Attach URLs to a Questionnaire Section
• Edit a Section in a Questionnaire
• Delete a Section in a Questionnaire
• Wrap and Unwrap Sections in a Questionnaire
5. Click Submit after you have entered all details and are ready to submit. Click Close to
discard the changes and close the window. The Questionnaire moves from Draft to Pending
Approval status, and an approver must approve it to move it to Open status.
For more information, see Approving Questionnaires.

10.6.1.1 Create a Section in a Questionnaire


Create a section for your questionnaire; the section name appears as a heading when the
questionnaire is displayed to users. For example, if you create the sections "Your Profile" and
"Your Education", the user of the questionnaire sees the headings "Your Profile" and "Your
Education", each containing the questions you have linked to it. Follow these steps to create a
section:
1. Enter a name for the section in the Section Name field.
2. Click Add . The section appears in the Sections and Questions section with subsections for
URL. Similarly, you can add more sections to your questionnaire. You must follow section
creation with the linking of questions. See the Linking a Question to a Questionnaire section.

10.6.1.2 Link a Question to a Questionnaire


Link questions that should appear in the questionnaire from the Questionnaire Details window.

NOTE You can link only Questions that are in Open status.

Follow these steps to link a question:

1. Click Edit to enable editing the questionnaire in the Questionnaire Details window.
2. Click Link Question to display the Link Questions window. For more information on the
fields displayed in this window, see the Define the Questions section.
3. Click Select to select a Question from the Link Questions window.

4. Click Link to display a message pop-up window. Click OK to link the question to the
questionnaire. Click Close to close the window.

10.6.1.3 Configure the Questions in a Section


On linking a question, the section displays the question. Link questions to different sections that you
have created and create a questionnaire. After you have linked a question to a section, you can
change the question configuration by following these steps:
1. Open the section in the Questionnaire Details window to view the linked questions. Expand
the section if it is collapsed; the questions are displayed below the section name heading.
The following table describes the various fields in the question linked to a section:

Table 132: Fields and Descriptions of the Section

□ (checkbox): Select and click Edit Linked Question to view and edit the Response Options in a
linked question.

ID: Displays the system-generated unique identifier for the question.

Question: Displays the title of the question.

Question Type: Displays the type of user interface element for the question from the following
options:
• Single Choice
• Multiple Choice
• Free Text
• Number
• Range
Note: For more information, see the Create the Questions in the Library section.

Status: Displays the status of the question. For example, Open.

Last Modified: Displays the last modified date of the question.

Weightage: Enter the comparative value to apply a weight function to the question. The sum of
all the weight values should be 100. For example, if you have three questions A, B, and C, and
you assign question A a weight value of 35 and question B a weight value of 45, then you must
assign a weight value of 20 to question C.
Note: This field is displayed if you have selected the Type as Score Based. This field cannot be
edited if you have linked Questions where the Question Type is either Free Text or Number.

Is Question Mandatory?: Displays whether the question is mandatory. The default value is
mandatory; however, you can disable it if required.
Note: Removing the mandatory condition disables the Weightage field and removes values
entered in it. This field is not displayed if the Questionnaire Type is Decision Tree.

Is Comment Required?: Displays whether the question requires a comment for the answer. The
default value is selected as required; you can remove the selection if required.
Note: This field is not displayed if the Questionnaire Type is Decision Tree.

Is Document Required?: Displays whether the question requires any supporting documents. The
default value is selected as not required.

2. Click Edit Linked Question to view and edit the Response Options for a question.
The following table describes the fields in the Response Option.

Table 133: Response Option Fields and their Descriptions

□ (checkbox): Select a response option from the list to perform various actions.

Response Options

From: Enter the valid "from" value of the range for the response.
Note: This field is displayed only for Question Type – Range.

To: Enter the valid "to" value of the range for the response.
Note: This field is displayed only for Question Type – Range.

Score: Enter the score for the response.
Note: This field is displayed only for Score Based questions.

Selected Logic: Click the button to display the Show Logic window.

Selected Result: Select from the options: Hard Stop and Soft Stop.
Note: This field is displayed only for Score Based questionnaires with branching and Decision
Tree type questionnaires.

Comment Mandatory?: Select if you want to make it mandatory to enter a comment.
Note: This field is not displayed for Decision Tree questions.

Legend: Select to enable a legend.

3. Click Save to save the entries, or click Close to close the response options section.
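
The Weightage rule described in Table 132 (weight values across the questions summing to 100)
can be sketched as a simple check. This is an illustration only; the function name and data
structure are hypothetical and not part of the product.

```python
def weightage_valid(weights):
    """Check that question Weightage values sum to exactly 100.

    `weights` maps question names to their Weightage values, as in the
    example from Table 132: A=35 and B=45, so C must be 20."""
    return sum(weights.values()) == 100

print(weightage_valid({"A": 35, "B": 45, "C": 20}))  # True
print(weightage_valid({"A": 35, "B": 45, "C": 30}))  # False: sums to 110
```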

10.6.1.4 Rearrange the Sequence of Sections and Questions


Rearrange the sequence of appearance of the questions in each section and also rearrange the
sequence of sections in a Questionnaire. This allows you to restructure the sections in a questionnaire
and the questions in the sections after you have linked them.

NOTE To perform this function, the Questionnaire must be in Draft status.

Follow these steps to sequence sections and questions:


1. Click Sequence Questions to display the Sequence Sections & Questions window. Change
the sequence of sections and the sequence of questions in the sections from this window.
• To move a question within a section, click Move Question. The Change Question Number
field appears. In the From field, enter the number of the question that you want to move. In
the To field, enter the number where you want to move the question to. Click Change to
move the question, or click Close to discard the change. Alternatively, use the Up and Down
buttons in the Sequence column: click the buttons for the row that you want to move up or
down.
• To move questions between sections, select a question or a set of questions from a section
that you want to move to another section and click Move to Section. The Move Selected
Questions to drop-down appears. Select the section from the drop-down where you want to
move the questions to. Click Change to move the questions to the selected section, or click
Cancel to discard the change.
• To move sections, click Move Section. The Change Section Number field appears. In the
From field, enter the number of the section that you want to move. In the To field, enter the
number where you want to move the section to. Click Change to move the section, or click
Close to discard the change.

NOTE The section numbers are in the header rows below the section names, as shown in the
following illustration:

Figure 229: Preview Questionnaire window

Another option is to use the Up and Down buttons in the Sequence column. Click the buttons
for the section that you want to move up or down.
2. Click Save Sequence to save the sequence rearrangement or click Close to discard and
close the window.

10.6.1.5 Delink a Question From a Questionnaire


Delink a question from a questionnaire from the Questionnaire Details window. Follow these steps to
delink a question:

1. Click Edit to enable editing the questionnaire in the Questionnaire Details window.
2. Click Select to select a Question from the section.
3. Click Delink Question to display the delink confirmation pop-up window. Expand the
section if it is collapsed to view the Delink Question button at the top.
4. Click OK to delink the question or click Cancel to discard and close the pop-up window.

10.6.1.6 Attach URLs to a Questionnaire Section


Add or attach URLs using two options in the Questionnaire: the Add URL button in the top bar of
the Sections & Questions section, and the Attach URL(s) button in the URL section. Use the top bar
in a section to add URLs to the Sections & Questions section, and use the URL section to attach
URLs to the Questionnaire.
Follow these steps to add a URL to the Sections & Questions section using the Add URL button from
the top bar:

1. Click Edit to enable editing the questionnaire in the Questionnaire Details window.

2. Click Add URL to display the Add URL pop-up window. Expand the section if it is collapsed
to view the Add URL button at the top.
3. Enter the details for the fields in the Add URL pop-up window.
The following table describes the fields in the Add URL window.

Table 134: Fields in the Add URL window and their Descriptions

Component: Displays the name of the component. This is a read-only field.

Section: Displays the name of the section. This is a read-only field.

Entity Type: Select the type of entity that the URL is being linked to. The options are: Section
and Questions.

Question: Select the Question that the URL is to be linked to. This drop-down is enabled on
selecting Questions for Entity Type.

URL Name: Enter a common name for the URL.

URL: Enter the URL. For example, www.oracle.com.

URL Description: Enter a description of the URL.

4. Click Save to add the URL, and repeat the process to add another URL. Click Close when done.
The added URLs are displayed in the URL section, where you can attach them to the
questionnaire: click Attach URL(s) to attach URLs to the Questionnaire. To delete a URL, select
a URL and click Delete.
Follow these steps to attach a URL to a Questionnaire using the Attach URLs from the URL section:

1. Click Attach URL(s) from the URL section in the Questionnaire Details window. The Attach
URL pop-up window is displayed.
2. Enter the details for the fields in the pop-up window.
The following table describes the fields in the Attach URL window.

Table 135: Fields in the Attach URL window and their Descriptions

Questionnaire Name: Displays the name of the questionnaire. This is a read-only field.

URL Name: Enter a common name for the URL.

URL: Enter the URL. For example, www.oracle.com.

URL Description: Enter a description of the URL.

3. Click Save to attach the URL and repeat the process to attach another URL. Click Close when
done. The added URLs are displayed in the URL section in the Questionnaire Details window. To
delete a URL, select a URL and click Delete .

10.6.1.7 Edit a Section in a Questionnaire


Edit sections in questionnaires from the Questionnaire Details window. Follow these steps to edit a
questionnaire section:

1. Click Edit to enable editing the questionnaire in the Questionnaire Details window.

2. Click Edit Section . The section name field is active. Expand the section if it is collapsed, to
view the Edit Section button at the top.
3. Enter the change in the Section Name field and click Save Section to save the details.

4. Click Update to save the modified questionnaire. Click Submit after you are ready to
submit the edited questionnaire.
Click Close to discard the changes and close the window.

10.6.1.8 Delete a Section in a Questionnaire


Delete sections in a questionnaire from the Questionnaire Details window. Follow these steps to delete
a section:

1. Click Edit to enable editing the questionnaire in the Questionnaire Details window.

2. Click Delete Section to display the delete confirmation pop-up window. Expand the section
if it is collapsed, to view the Delete Section button at the top.
3. Click OK to delete the section, or click Cancel to discard and close the pop-up window.

NOTE You can delete a section only if the questionnaire is in Draft or In Review status. If you
delete a section, any question that you have linked to the section is also deleted.

10.6.1.9 Wrap and Unwrap Sections in a Questionnaire


You can wrap and unwrap sections in a questionnaire from the library to collapse or expand the details
entered in the fields.
Follow these steps to wrap and unwrap a questionnaire section:
1. Select the section to wrap or unwrap. Expand the section if it is collapsed, to view the Wrap or
Unwrap button at the top.
2. Click Unwrap to unwrap a questionnaire section. If the section is unwrapped, click Wrap.

10.6.2 Approve the Questionnaires


The Questionnaire module is configured with an n-eyes system, which enables a submitted
Questionnaire to be reviewed and approved by one or more levels of supervisors or approvers.
After approval, the Questionnaire moves into Open status and is active. Before it can move into
Open status, however, the Questionnaire can be moved through stages of review until the approver
is satisfied with the Questionnaire and approves it.
The following is a description of the various statuses when the n-eyes functionality is enabled:
• Draft – Questionnaire created by a user and not yet submitted.
• Pending Approval – Questionnaire submitted for approval to a supervisor.
• Open – Questionnaire approved and ready for use.
• In Review – Questionnaire in Open status that is edited by a user is moved to In Review. After
the changes are done, the submitted Questionnaire moves to Pending Approval status again for
the supervisor’s approval. For related topics, see Editing Questionnaires in Open Status –
Review Questionnaire.
You (the approver) can approve Questionnaires that users have submitted and that are now in Pending Approval status. If changes must be made to the Questionnaire before you approve it, you can reject it after entering relevant comments. The Questionnaire moves back to Draft or In Review status and is assigned to the user for editing. The user can incorporate your comments and submit the Questionnaire again, moving it to Pending Approval status.

NOTE You must be mapped to the QLOCAUTHRL role to approve Questionnaires. For more information, see the Oracle Financial Services Advanced Analytical Applications Infrastructure Application Pack Administration and Configuration Guide.

Follow these steps to approve a questionnaire:


1. Log in to the system with an Approver role user ID.


2. Click the My Inbox tab and then click My Task to display a list of tasks assigned to you.
3. Search Questionnaire in Entity Type to display the list of Questionnaires that are in Pending
Approval status or search by the Questionnaire ID in Entity Name.
4. Click Task ID to open the Questionnaire and review.

5. Click Edit and update the Questionnaire, if required. Click Approve to approve and move
the Questionnaire to Open status. Click Reject if you have to recommend changes. The
Questionnaire moves into the Draft status and goes back to the user’s view in the
Questionnaire Library.

10.6.3 Edit the Questionnaire From the Library


Edit questionnaires in the Draft and In Review statuses from the Questionnaire Library window.

10.6.3.1 Edit the Questionnaires in Draft Status


Follow these steps to edit a Questionnaire in Draft status:
1. Click the Questionnaire ID on the ID column in the Questionnaires Library window to display
the Questionnaire Details window.

2. Click Edit to enable editing the questionnaire in the Questionnaire Details window.
3. Enter the details for the fields in the Questionnaire Details window.
See the field description table in Creating the Questionnaire in the Library section for field
details.

NOTE The ID field is read-only and is not editable.

4. Click Update to save the modified questionnaire. Click Submit when you are ready to submit the edited questionnaire. Click Close to discard the changes and close the window.

10.6.3.2 Edit the Questionnaires in Open Status – Review Questionnaire


Questionnaires that are in Open status can only be edited using the Review Questionnaire feature.
Follow these steps to edit a Questionnaire in Open status:
1. Click the Questionnaire ID on the ID column in the Questionnaires Library window to display
the Questionnaire Details window.
2. Click Review Questionnaire to edit the Questionnaire in the Questionnaire Details window.
3. Edit the details as required. See the field description table in Creating Questionnaire in the
Library section for field details.


NOTE The ID field is read-only and is not editable.

4. Click Update to save the modified questionnaire. Click Submit when you are ready to submit the edited questionnaire. The Questionnaire moves to Open status if no approval is required. However, if approval is required, the Questionnaire moves to Pending Approval status. See Approving Questionnaires for more details. Click Close to discard the changes and close the window.

10.6.4 Create the Questionnaire by Copying an Existing Questionnaire


Copy an existing questionnaire from the library to create a new questionnaire. All the contents of the copied questionnaire are carried forward to the new questionnaire with a new ID. Copy a questionnaire from the Questionnaire Library window.
Follow these steps to copy a questionnaire and to create a new questionnaire from the Questionnaire
Library window:
1. Click Select to select a Questionnaire from the Questionnaire Library window.

2. Click Copy Questionnaire.


A message is displayed on the successful execution of the copy operation.

10.6.5 Delete the Questionnaire from the Library


Delete questionnaires from the Questionnaire Library window. Follow these steps to delete a
questionnaire:
1. Click Select to select a Questionnaire in the Questionnaire Library window that you want to
delete.

2. Click Delete Questionnaire to display the delete confirmation pop-up window.


3. Click OK to delete the questionnaire or click Cancel to discard and close the pop-up window.

NOTE You can delete a questionnaire only if it is in Draft status.

10.6.6 Wrap and Unwrap the Questionnaire from the Library


Wrap and unwrap questionnaires from the library to collapse or expand the details entered in the
fields.
Follow these steps to wrap and unwrap a questionnaire:
1. Click Select to select a Questionnaire from the Questionnaire Library window.
2. Click Unwrap to unwrap a questionnaire. If the questionnaire is unwrapped, click Wrap.


10.7 Configuring Token-Based RFI


Token-based RFI enables banks to share RFI links with users without needing an OFSAA user ID.
Token-enabled RFI allows users to access questionnaires anywhere, anytime.
When an RFI is sent to an individual, they receive a token-based link to access the RFI Questionnaire. On clicking the token-based Questionnaire link, the user is prompted to enter the unique captcha that is shared via email. Without a valid captcha, access to the link is prevented.
To configure token-based RFI, follow these steps:
1. Navigate to the CONFIGURATION table in the Config schema.
2. Update the RFI_TOKEN_VALIDITY parameter to provide the amount of time the RFI Token
should remain valid.
The value for this parameter must be provided in minutes. The default value is 60.
3. Configure the Captcha Email which respondents will receive by updating the following
parameters:
 QTNR_CAPTCHA_SENDER_MAIL_ID: Configure the RFI Captcha Sender Mail ID.
For example: [email protected]
 QTNR_CAPTCHA_MAIL_SUBJECT: Configure the RFI Captcha Mail Subject line.
For example, RFI Unique Code.
 QTNR_CAPTCHA_MAIL_BODY_TXT: Configure the RFI Captcha Mail Body text.
For example: Please Enter this code to view the RFI page.
 QTNR_CAPTCHA_MAIL_BODY_SENDER_NAME: Configure the RFI Captcha Sender Name,
such as Administrator.
4. Service Authentication is done through the service account in the Token Enabled RFI screen.
Configure the service account by updating the OFSAA_SRVC_ACC parameter.
The default value is SYSADMN.
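The updates described in the steps above can be applied with SQL run against the Config schema. The following is a hypothetical sketch only: the connection string and the column names PARAMNAME and PARAMVALUE are assumptions for illustration, so verify the actual schema user and column names of your CONFIGURATION table before running anything similar.

```sql
-- Hypothetical sketch: adjust token-based RFI parameters in the
-- CONFIGURATION table of the Config schema. Column names PARAMNAME
-- and PARAMVALUE are assumed; confirm them in your environment.

-- Keep RFI tokens valid for 120 minutes instead of the default 60
UPDATE CONFIGURATION SET PARAMVALUE = '120'
 WHERE PARAMNAME = 'RFI_TOKEN_VALIDITY';

-- Configure the captcha email sent to respondents
UPDATE CONFIGURATION SET PARAMVALUE = 'RFI Unique Code'
 WHERE PARAMNAME = 'QTNR_CAPTCHA_MAIL_SUBJECT';
UPDATE CONFIGURATION SET PARAMVALUE = 'Please Enter this code to view the RFI page'
 WHERE PARAMNAME = 'QTNR_CAPTCHA_MAIL_BODY_TXT';

COMMIT;
```

Run the statements as the Config schema user and commit before testing a token-based RFI link.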

NOTE Oracle recommends creating an "SMS Auth Only" user from the User Maintenance window for the service account rather than using SYSADMN.

ATTENTION For WebLogic and WebSphere environments, the configurations found in the OFSAAI Administration Guide must be completed for REST Services Authorization for the respective server. Refer to Section 12.10.1, Configuring WebLogic for REST Services Authorization, and Section 12.11.1, Configuring WebSphere for REST Services Authorization.


11 System Configuration and Identity Management


The System Configuration and Identity Management module is an integral part of the Infrastructure administration process. It enables System Administrators to provide the security and operational framework required for Infrastructure.
System Configuration and Identity Management activities should be performed by the Infrastructure administrator using the admin credentials.
This section consists of the following topics:
• System Configuration
• Identity Management

11.1 System Configuration


The Administration and Configuration section allows System Administrators to configure the Server details, Database details, OLAP details, and Information Domain, along with other configuration processes such as segment and metadata mapping, and mapping segments to security.
System Configuration is mostly a one-time activity that helps the System Administrator make the Infrastructure system operational for usage.

11.1.1 Navigating to System Configuration


Click from the header to display the Administration tools in Tiles menu. Click System
Configuration from the Tiles menu to view a submenu list.
Note: After you have accessed a tool from the submenu, the options are also available in the
Navigation List to the left. Click button to access the Navigation List.

Figure 230: Navigation List drawer


Figure 231: System Configuration sub-menu

You (System Administrator) need to have full access rights to the ftpshare folder, with the appropriate User ID and password, to add and modify the server details.

11.1.2 Components of System Configuration


System Configuration consists of the following sections. Click on the links to view the sections in
detail.
• Database Server
• Application Server
• Web Server
• Database Details
• OLAP Details
• Information Domain
• Configuration
• Create Application

11.1.3 Database Server


A database server refers to a computer in the network that is dedicated to supporting database storage and retrieval. The database layer of the Infrastructure system can be represented by a single database server.
The Database Server Details window within the System Configuration section of the Infrastructure system facilitates you to add and modify the database server details on which the Infrastructure Database, Application, and Web components have been installed. A database server can support multiple Information Domains; however, one Information Domain can be mapped to only one database layer.

Click from the header to display the Administration tools in Tiles menu. Click System
Configuration from the Tiles menu to view a submenu list. Click Configure Database Server to view
the Database Server Details window.


Figure 232: Database Server Details window

By default, the Database Server Details window displays the pre-configured database server details. In
order to add or modify the database server details, you need to ensure that:
• The FTP/SFTP service should be installed on the Web/Application and DB Server.
• The FTP/SFTP ID for Web/App and DB server has to be created through the Computer
Management option under Administrative Tools for all the installations other than UNIX
installations.
• This user should belong to the administrator group.
• The FTP/SFTP password for Web/App and DB server needs to be specified in the Computer
Management option under Administrative Tools. Also, the Password Never Expires option has
to be checked.
• If the User enters an incorrect username, password, FTP Share and/or Port and clicks Save, the
following alert message is displayed.
Password or ShareName incorrect on XXXXXXXXXXXXXXXXXXXcom on port X2
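Before saving the server details, the FTP/SFTP credentials and share path can be sanity-checked from the command line. The sketch below is illustrative only: the user name, host, and share directory are placeholder values, not names taken from this guide.

```shell
# Hypothetical sketch: confirm that the SFTP user can log in to the DB server
# and reach the configured share directory before entering it in OFSAA.
# Replace dbftpuser, dbserver.example.com, and /ftpshare with your values.
sftp -P 22 dbftpuser@dbserver.example.com <<'CMDS'
cd /ftpshare
pwd
CMDS
```

A successful login that prints the share path confirms the user ID, password, port, and share name before they are saved in the Database Server Details window.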

NOTE • The Password verification is enabled only when One-off patch (34763896) is applied on the 8121 ML release.
• The Database Server Details window displays the pre-configured Database Server Details specified during OFSAA Infrastructure Installation.

11.1.3.1 Adding Database Server Details


You can add a database server by specifying the Database Server Details, FTP Details, and Security
Details. To add database server details:
1. Select Add button from the Database Server Details window. The window is refreshed and
enables you to populate the required data in the fields.


Figure 233: Database Server Details window

2. Enter the Database Server Details as tabulated.

NOTE A few of the fields in the Database Server details are auto-populated based on the options specified during application installation and are not editable.

The following table describes the fields in the Database Server Details window.

Table 136: Fields in the Database Server Details window and their Descriptions

Field Description

IP Address: If the IP address of the Infrastructure configuration servers is specified during setup, the same is auto-populated and cannot be modified. If not, select the IP address from the drop-down list.

Socket Server Port: The socket server port is auto-populated from the dynamicservices.xml file in the ficserver/configuration path and should not be edited. By default, the port number is 10101.

OS Type: The OS type (Operating System) of the database is auto-detected by the Infrastructure Application and cannot be edited. The system supports only similar OS types in a single implementation and does not support a UNIX with NT combination.


FTP/SFTP/LOCAL: FTP refers to the transfer of files such as metadata and staging files from one server to another. SFTP refers to secure FTP for transfer of files from one server to another. LOCAL is selected to transfer files within the same server.
Note the following:
• The FTP/SFTP option specified during setup is auto-populated and is not editable.
• The FTP/SFTP information should be created manually, prior to entering the details. The application validates the information, ensuring that the value in FTP/SFTP and Host DB is not blank.
• When there is a change to the FTP/SFTP path, the old files should be physically moved to the new path. The system ensures that all new files are generated/transferred into the new path.
• The Radio Button LOCAL is available on OFSAAI 8.0.6.1.0 and later release versions.
• The FTP of the Database Server, Application Server, and the Web Server must be the same. For example, if you select SFTP for the Database Server, repeat the same selection for the Application Server and the Web Server too.
• At any time, if you modify the existing FTP selection, ensure that you resave so that the changes take effect.

The FTP Details consist of:
 Technical Metadata tab, which consists of the path to the erwin file, which in turn stores TFM, Database Model XML files, and Table Creation scripts.
 Business Metadata tab, which consists of the path to the business logic XMLs such as Cube Configuration files and Hierarchy Parentage files.
 Staging Area tab, which stores the path to FLAT files (data files) which can be loaded through Data Management Tools. This is the only path that is not tagged to any Information Domain.
3. Enter the FTP details in the Technical Metadata, Business Metadata, and Staging Area tabs as tabulated. The Technical Metadata tab is selected by default, and the details specified here are replicated as default values to the Business Metadata and Staging Area tabs.

NOTE It is recommended to define the same FTP share directory for Technical Metadata, Business Metadata, and Staging Area.

The following table describes the fields in the Technical Metadata and Business Metadata tabs.


Table 137: Fields in the Technical Metadata and Business Metadata and their Descriptions

Field Description

Drive: Specify the physical path of the FTP/SFTP shared directory/Drive. For example: e:\dbftp\

Port Number: Specify the database FTP/SFTP port number. By default, the SFTP port number is 22 and can be changed if the port is enabled.

User ID: Specify the user ID that is used to perform an FTP/SFTP in the machine where the database server is located. It is mandatory to specify the FTP/SFTP User ID.

Password: Enter the password, which is the same as the password specified for the FTP/SFTP user ID by the administrator.
Note: The password is represented by asterisks (*) for security reasons.

4. Click Next and enter the Security Details as tabulated:


The following table describes the fields in the Security Details tab.

Table 138: Fields in the Security Details tab and their Descriptions

Field Description

Security User ID: Enter the user ID which has the same user rights as the user who installed Infrastructure. The Application server validates the database user ID/password to the database server(s) for connection purposes.

Security Password: Specify the password for the user who will be accessing the security share name. The password is represented by asterisks (*) for security reasons.

Security Share Name: Enter the path locating the DB components installation folder which has been specified by the user who has installed the Infrastructure system. For example: D:\Infrastructure

5. Click Save to save the Database Server details.

11.1.3.2 Modifying Database Server Details


To update the existing database server details:
1. Select Modify button from the Database Server Details window. The window is refreshed and
enables you to edit the required data in the fields.
2. Update the Database Server details as required.
Except for the auto populated OS type, you can edit all other details including IP Address, Server
Socket Port, and FTP details in Technical Metadata, Business Metadata, and Staging Area tabs.
For more information, see Add Database Server Details.
3. Click Save to save the changes.


11.1.4 Application Server


An Application Server refers to a computer in a distributed network which provides the business logic for an application program. The Application Server in the Infrastructure system maintains the application layer, which in turn consists of shared services, subsystem services, and the ICC server to manage the warehouse operations.
The Application Server section within the System Configuration section of the Infrastructure system facilitates you (System Administrator) to maintain the Application Server setup details. Click System Configuration from the Tiles menu to view a submenu list. Click Configure Application Server to view the Application Server Details window.

Figure 234: Application Server Details window

By default, the Application Server Details (Server Master) window displays the pre-configured application server details in the View mode.
The Application Server Details window is displayed in the Add mode when accessed for the first time during the installation process to enter the application server setup details. Subsequently, the window is displayed in View mode, providing the option to only update the defined application server details.
If the User enters an incorrect username, password, FTP Share and/or Port and clicks Save, the
following alert message is displayed.
Password or ShareName incorrect on XXXXXXXXXXXXXXXXXXXcom on port X2

NOTE The Password verification is enabled only when One-off patch (34763896) is applied on the 8121 ML release.

11.1.4.1 Modifying Application Server Details


You can update the pre-defined Application Server details and FTP/SFTP/LOCAL details in the
Application Server Details window. To update the existing application server details:
1. Select Modify button from the Application Server Details window. The window is refreshed and
enables you to edit the required data in the fields.
2. Update the Application Server Details as tabulated.


NOTE The data in some of the fields are auto populated with the pre-
defined Application Server details. Ensure that you edit only the
required fields.

The following table describes the fields in the Application Server Details window.

Table 139: Fields in the Application Server Details window and their Descriptions

Field Description

Primary IP for Runtime Processes: Enter the new IP address of the application server.
Note the following: if the IP Address of the Application server is changed in any of the following two scenarios, contact Infrastructure Support for help:
• Change in IP Address of the Application server machine in use.
• Application server is physically moved from one machine to another.

FTP/SFTP/LOCAL: Select the option as either FTP or SFTP. FTP refers to the transfer of files such as metadata and staging files from one server to another. SFTP refers to secure FTP for transfer of files from one server to another. LOCAL is selected to transfer files within the same server.
Note the following:
• The FTP/SFTP option specified during setup is auto-populated.
• The FTP/SFTP information should be created manually, prior to entering the details. The application validates the information, ensuring that the value in FTP/SFTP and Host DB is not blank.
• When there is a change to the FTP/SFTP path, the old files should be physically moved to the new path. The system ensures that all new files are generated/transferred into the new path.
• The FTP of the Database Server, Application Server, and the Web Server must be the same. For example, if you select SFTP for the Database Server, repeat the same selection for the Application Server and the Web Server too.
• At any time, if you modify the existing FTP selection, ensure that you resave so that the changes take effect.

Authentication Type: Select the authentication type from the following:
• Password Auth – login authentication through password entries.
• PublicKey Auth – login authentication through public key authentication for enhanced security.

3. Enter the FTP details in the Technical Metadata, Business Metadata, and Staging Area tabs as
tabulated. The Technical Metadata tab is selected by default and the details specified here are
replicated as default values to Business Metadata, and Staging Area tabs.


NOTE It is recommended to define the same FTP share directory for Technical Metadata, Business Metadata, and Staging Area.

The following table describes the fields in the Technical Metadata, Business Metadata, and Staging
Area tabs.

Table 140: Fields in the Technical Metadata, Business Metadata, and Staging Area tabs and their
Descriptions

Field Description

Drive: Specify the new physical path of the FTP/SFTP shared directory/Drive. For example: e:\dbftp\

Port Number: Specify the database FTP/SFTP port number. By default, the SFTP port number is 22 and can be changed if the port is enabled.

User ID: Specify the user ID that is used to perform an FTP/SFTP in the machine where the database server is located. It is mandatory to specify the FTP/SFTP User ID.

Password: Enter the password, which is the same as the password specified for the SFTP user ID by the administrator. The password is represented by asterisks (*) for security reasons.

4. Click Save to save the changes.

11.1.5 Web Server


A web server refers to a computer program that delivers (serves) content, such as Web pages, using the Hypertext Transfer Protocol (HTTP) over the World Wide Web. The Web Server in the Infrastructure system constitutes the presentation layer.
The Infrastructure Web Server (presentation layer) can be implemented in the following two ways:
• Installation of a Single Web Server.
• Installation of a Primary Web Server and a Secondary Server.
The Web Server section within the System Configuration section of the Infrastructure system facilitates you (System Administrator) to add and modify the Web Server setup details. Click from the header to display the Administration tools in Tiles menu. Click System Configuration from the Tiles menu to view a submenu list. Click Configure Web Server to view the Web Server Details window.
By default the Web Server Details (Server Master) window displays the pre-configured web server
details in the View mode.


11.1.5.1 Adding Web Server Details


In the Infrastructure system, you can create multiple web servers to route users through different web servers. For example, you can route internal and external users through different web servers. However, one of the Web Servers has to be defined as the primary server.
You can add a web server by specifying the Web Server details and FTP/SFTP/LOCAL Details in the Web Server Details window.

Figure 235: Web Server Details window

To add web server details:


1. Select Add button from the Web Server Details window. The window is refreshed and enables
you to populate the required data in the fields.
2. Enter the Web Server details as tabulated.
The following table describes the fields in the Web Server Details window.

Table 141: Fields in the Web Server Details window and their Descriptions

Field Description

IP Address: Enter the IP address of the web server.

Servlet Port: Specify the web server port number. For example: 21

Local Path: Specify the local path (location) where the static files need to be copied in the primary server. For example: e:\revftp\ The static files, such as Infrastructure OBIEE reporting server pages, are copied to the specified location.
Note: The web server Unix user must have read/write privileges on the Local Path directory. If not, contact your system administrator.

Protocol: Select the protocol as either HTTP or HTTPS from the drop-down list. Infrastructure supports FTP/SFTP into the Web Server and streaming of files. In case FTP/SFTP is not allowed in a Web Server due to security reasons, the system can stream the data across Web Servers so that the Client need not compromise on their Security policy.


Transfer Protocol Details: Select this checkbox to enter public key authentication details. On selecting, the FTP Details pane is displayed.

3. (Optional) If you have selected the FTP Enabled checkbox, you can specify the Drive, Port
Number, and user details in the FTP details pane. Select the option as either FTP, SFTP or
LOCAL and enter the other details as tabulated.
The following tables describes the fields in the FTP Details pane.

Table 142: Fields in the FTP Details pane and Descriptions

Field Description

FTP Details

FTP/SFTP/LOCAL: Select either FTP, SFTP, or LOCAL based on your web server requirement. Enter the details based on the option displayed for the selections on the Application Server Details window. The option displayed can be either Password Auth or PublicKey Auth.
Note:
• The FTP of the Database Server, Application Server, and the Web Server must be the same. For example, if you select SFTP for the Database Server, repeat the same selection for the Application Server and the Web Server too.
• At any time, if you modify the existing FTP selection, ensure that you resave so that the changes take effect.

Authentication Type: Select from the following:
• Password Auth – Select to enter details for User ID and Password.
• PublicKey Auth – Select to enter details for Private Key Path and Passphrase. This value is available only for SFTP.
Note: This field is not available if you select LOCAL.

Password Auth: Enter details for User ID and Password.

User ID: Specify the user ID that is used to perform an FTP/SFTP in the machine where the database server is located. It is mandatory to specify the FTP/SFTP User ID.

Password: Enter the password, which is the same as the password specified for the FTP/SFTP user ID by the administrator. The password is represented by asterisks (*) for security reasons.

PublicKey Auth: Enter details for Private Key Path and Passphrase.

Private Key Path: Enter the Private Key Path that is used to perform the FTP/SFTP in the database server. This is a mandatory field.

Passphrase: Enter the passphrase to access the database server for FTP/SFTP.

4. Click Save to save the Web Server details.
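For the PublicKey Auth option above, a key pair can be generated with ssh-keygen. This is an illustrative sketch: the directory, key file name, and passphrase are example values, not names taken from this guide. The resulting private key path and passphrase are what would go into the Private Key Path and Passphrase fields, once the public key has been installed on the target server.

```shell
# Generate an RSA key pair for SFTP public key authentication.
# The directory, file name, and passphrase here are illustrative.
KEYDIR="$(mktemp -d)"
ssh-keygen -t rsa -b 2048 -N 'example-passphrase' -f "$KEYDIR/ofsaa_sftp_key" -q

# The private key ($KEYDIR/ofsaa_sftp_key) stays on the OFSAA server and is
# referenced in the Private Key Path field; the public key (.pub) is appended
# to ~/.ssh/authorized_keys on the SFTP host.
ls "$KEYDIR"
```

Keep the private key readable only by the OFSAA application user; the passphrase entered in the Passphrase field must match the one used at key generation.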


11.1.5.2 Modifying Web Server Details


You can update the pre-defined Web Server details and FTP/SFTP Details in the Web Server Details
window. To update the existing web server details:
1. Select Modify button from the Web Server Details window. The window is refreshed and
enables you to edit the required data in the fields.
2. Update the Web Server details as required.
You can edit all the Web Server Details and FTP details in the Web Server Details window. For
more information, see Add Web Server Details.
3. Click Save to save the changes.


11.1.6 Database Details


Database Details in the System Configuration section facilitates you to define the database setup
details after you have configured the database server within the Infrastructure System. The
Infrastructure Database server for which you need to specify the database setup details could have
been installed in any of the following ways:
• Single tier with multiple Information Domains hosted across machines.
• Multi-tier with Multiple Information Domains hosted across machines.
• Single tier with single Information Domain on the same machine.
• Multi-tier with single Information Domain on the same machine as Infrastructure DB Server.
OFSAAI supports heterogeneous databases such as Oracle and HDFS. Database authentication details
are stored separately for each connection for reusability.
You (System Administrator) need to have SYSADM function role mapped to your role to access and
modify the database details. Click from the header to display the Administration tools in Tiles
menu. Click Database Details from the Tiles menu to view the Database Master window, or click
button to access the Navigation List. Click Database Details to view the Database Master window.

Figure 236: Database Master window

You can view the various databases defined for the database server. The Database Master window
allows you to add a new database and modify the existing ones.

11.1.6.1 Adding Database Details for DB Server


You can add a new database by specifying the name, Schema name, DB properties and connection
details. Ensure that the Server Details are specified and the database is created before adding the
database details.
You should not create Database details with Hive Server1 and Hive Server2 in the same setup since
Hive Server 1 and Hive Server 2 drivers cannot run at the same time in the same JVM. Loading both
drivers at the same time causes Hive Server 2 connection failure. This issue will be addressed in a
future release of the Hive driver.
You cannot configure multiple Database details using different Hive Drivers in a single OFSAA setup.
That is, multiple Data Sources using different Hive Drivers is not supported.
To add a new database:
1. Click the Add button from the toolbar in the Database Master window.

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 509



Figure 237: Database Master window

2. Enter the Database details as tabulated.


The following table describes the fields in the Database Master window.

Table 143: Fields in the Database Master window and their Descriptions

Field Description

DB Server: Select the Database IP Address from the drop-down list. This list displays the database server IP addresses defined during the set-up.

Name: Enter the database name. Ensure that there are no special characters or extra spaces. Note that, for an Oracle database, the TNS (Transparent Network Substrate) database name should be the same as the SID. The name should not exceed 20 characters.

Schema Name: Enter the schema name for the database. Use only lowercase or uppercase alphabets; mixed-case schema names are not supported.

DB Type: The available options are ORACLE, MSSQL, DB2UDB, and HIVE. For Information Domain creation, only Oracle and Hive database types are supported. For DI source creation, MSSQL and DB2UDB are also supported. You can create a Hive database instance for a single Hive server/CDH. Multiple data sources pointing to different Hive servers are not supported.

Auth Type: Select the authentication type from the drop-down list. Based on the database you have selected, the drop-down list displays the supported authentication mechanisms. Select Default for DB2UDB, ORACLE, and MSSQL databases. If the DB Type is HIVE, then KERBEROS, KERBEROS_WITH_KEYTAB, LDAP, and Default are supported. If the Auth Type is configured as KERBEROS_WITH_KEYTAB for the Hive database, then you must use the Keytab file to log in to Kerberos. The Keytab and Kerberos files should be copied to $FIC_HOME/conf and $FIC_WEB_HOME/webroot/conf of the OFS AAAI installation directory.

Connection Details

Alias Name: This field is not applicable for HIVE DB with Auth Type as Default. Select the Alias name (connection) used to access the database from the drop-down list, or click the add button to add a new database connection/atomic schema user. The Alias Details window is displayed with the following fields:
• Auth Alias: Enter a name for the database connection.
• User/Principal Name: Enter the atomic schema User ID to access the database. The system authenticates the specified User ID before providing access.
• Auth String: Enter the password required to access the database/schema. The system authenticates the specified password before providing access. The maximum length allowed is 30 characters. Special characters are not allowed.
Note: If the Authentication Type is KERBEROS_WITH_KEYTAB, the Auth String (password) is not required. Since the Auth String is a mandatory field, enter a dummy password.

Auth Type: Displays the Authentication Type. This field is read-only.

TNS Entry String: This field is applicable only for ORACLE DB with Auth Type as Default. TNS is the SQL*Net configuration file that defines the database address used to establish the connection. Enter the TNSNAME created for the Information Domain.

Date Format: Enter the date format used in the database server. You can find this in the nls_date_format entry for the database. This date format will be used in all the applications using date fields.


JDBC Connection String: The default JDBC connection string is auto-populated based on the database type selected. This is the JDBC (Java Database Connectivity) URL configured by the administrator to connect to the database.
• For ORACLE DB type, it is jdbc:oracle:thin:@<<DB Server Name>>:<<Port Number>>:<<Oracle SID>>
• For MSSQL DB type, it is jdbc:microsoft:sqlserver://<<DB Server Name>>:<<Port Number>>
• For DB2 DB type, it is jdbc:db2://<<DB Server Name>>:<<Port Number>>/<<Database Name>>
• For HIVE DB type, it is jdbc:hive2://<<DB Server Name>>:10000/default
Specify the appropriate details corresponding to the information suggested in brackets. For example, for ORACLE DB you can specify the port number as 1521 and the SID as ORCL.

JDBC Driver Name: The default JDBC driver name is auto-populated based on the database type selected.
• For ORACLE DB type, it is oracle.jdbc.driver.OracleDriver.
• For MSSQL DB type, it is com.microsoft.jdbc.sqlserver.SQLServerDriver.
• For DB2 DB type, it is com.ibm.db2.jcc.DB2Driver.
• For Hive with Auth Type as Kerberos with Keytab, it is com.cloudera.hive.jdbc4.HS2Driver.
In case of modification, ensure that the specified driver name is valid since the system does not validate the driver name. Multiple data sources pointing to different Hive servers are not supported.

JNDI Name: This field is applicable and mandatory for ORACLE DB. Enter the JNDI name. The JNDI name should be entered if you want to create an Information Domain for this DB schema. If the DB schema is for Data Sources, you can use any dummy data for this field.

Key Tab File Name: This field is applicable for Authentication Type selected as KERBEROS WITH KEYTAB. Enter the name of the Keytab file.

REALM File Name: This field is applicable for Authentication Type selected as KERBEROS and KERBEROS WITH KEYTAB. Enter the name of the Kerberos Realm file.

KERBEROS KDC Name: This field is applicable for Authentication Type selected as KERBEROS. Enter the name of the Kerberos Key Distribution Center (KDC).

KERBEROS REALM Name: This field is applicable for Authentication Type selected as KERBEROS. Enter the name of the Kerberos Realm file.

JAAS File Name: This field is applicable for Authentication Type selected as KERBEROS. Enter the name of the Java Authentication and Authorization Service (JAAS) file.
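As an aside, the connection-string patterns above are plain templates. The following Python sketch (illustrative only; the helper names and the host, port, and SID values are assumptions, not part of OFSAA) shows how the Oracle and Hive forms are filled in:

```python
# Illustrative only: fills in the JDBC URL patterns listed above.
# Host names, ports, and SIDs below are placeholder values, not real servers.

def oracle_jdbc_url(host, port, sid):
    """Oracle thin-driver URL: jdbc:oracle:thin:@<host>:<port>:<sid>."""
    return f"jdbc:oracle:thin:@{host}:{port}:{sid}"

def hive_jdbc_url(host, port=10000, db="default"):
    """HiveServer2 URL: jdbc:hive2://<host>:<port>/<db>."""
    return f"jdbc:hive2://{host}:{port}/{db}"

print(oracle_jdbc_url("dbhost.example.com", 1521, "ORCL"))
print(hive_jdbc_url("hivehost.example.com"))
```

The placeholder values mirror the example in the table (port 1521, SID ORCL for Oracle; port 10000 and the default database for Hive).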


3. Click Save to save the Database Details for DB Server.

11.1.6.2 Modifying Database Details


You can modify the database details by selecting the required Database schema from the Database
Master window. Fields such as Name, Schema Name, DB Type, and Auth Type are not editable. You
can add a new Alias (database connection) or modify the details of an existing Alias. For example, the
password for the database connection can be modified by clicking the edit button in the Alias Name
field and entering a new password in the Auth String field in the Alias Details window. For more
information, see Add Database Details for DB server.

NOTE The database date format, when modified, does not get
auto-updated. You need to manually update the
NLS_DATE_FORMAT database parameter and restart the DB.
Also, the to_date function translation is not performed during
the data load.

Once you have updated all the required information, click Save to save the Database Details.

11.1.7 OLAP Details


OLAP or Online Analytical Processing is an approach to swiftly answer multi-dimensional analytical
queries. Any database configured for OLAP uses a multidimensional data model, allowing for complex
analytical and ad-hoc queries with a rapid execution time.
OLAP Details in the System Configuration section facilitates you to define the OLAP details after you
have configured the OLAP server within the Infrastructure system. The Infrastructure design makes it
mandatory for the System Administrators to define the OLAP details, which is usually a one-time
activity. Once defined, the details cannot be modified except for the user credentials.
You (System Administrator) need to have the SYSADM function role mapped to your role to access and
modify the OLAP details. Click the Administrator icon from the header to display the Administration
tools in the Tiles menu. Click System Configuration from the Tiles menu to view a submenu list and
click Configure OLAP Details to view the OLAP Details window. Alternatively, click the Menu
Navigation icon to access the Navigation List, click System Configuration, and click Configure OLAP
Details to view the OLAP Details window.


Figure 238: OLAP Details window

By default the OLAP Details window displays the pre-configured server details specified during the
installation.

11.1.7.1 Adding OLAP Details


You can add OLAP details by specifying the server IP, database type, and locale. Ensure that the OLAP
server is configured before adding the OLAP details. To add OLAP details:
1. Click the Add button in the OLAP Details window. The window is refreshed and enables you to
populate the required data in the fields.

Figure 239: OLAP Details Add window

2. Enter the OLAP details as tabulated.


The following table describes the fields in the OLAP Details window.

Table 144: Fields in the OLAP Details window and their Descriptions

Field Description

Server IP: Enter or select the OLAP server IP from the drop-down list. The OLAP Server IP address is the IP address of the machine on which the OLAP server is running.


Type: Select the OLAP database type from the drop-down list. The available options are:
• SQLOLAP
• ESSBASE
• EXPRESS
• DB2OLAP
• ORACLE
Note the following while selecting the OLAP DB type:
• If you select ESSBASE or DB2OLAP, you need to specify different user IDs and passwords for Cube Creation and Cube Viewing to avoid locking of the cube when the cube is being built.
• If you select SQLOLAP or EXPRESS, you need to specify one set of user ID and password common to both Cube Creation and Cube Viewing.
• If you select ORACLE, you need not specify a user ID and password for Cube Creation and Cube Viewing.
Multiple OLAP types can be installed on the same server and configured in OFSAAI.

Locale Identifier: Select the locale from the drop-down list. The specified locale is identified at the time of localization set-up.

3. Specify the User ID and Password in the For Cube Creation section, based on the selected
OLAP DB Type. Ensure that the User ID does not have any special characters or extra spaces
and does not exceed 16 characters.
 For SQLOLAP, the User ID should be created in Microsoft Windows with appropriate
privileges for cube creation.
 For EXPRESS, the User ID should be created in EXPRESS with appropriate privileges for cube
creation.
4. Specify the User ID and Password For Cube Viewing, based on the selected OLAP DB Type.
Ensure that there are no special characters and extra spaces.
 Enter the FIV User ID to view the cube. If ESSBASE is selected as the database type, the cube
can be viewed in OBIEE reporting server.
5. Click Save to save the OLAP Details.

11.1.7.2 Modifying OLAP Details


By default, the OLAP Details window displays the OLAP details specified during the installation. The
defined OLAP details are not editable and you can only modify the user privileges for Cube Creation
and Viewing based on the selected OLAP DB Type. For more information, see Add OLAP Details.
Once you have updated all the required information, click Save to save the OLAP Details.


11.1.8 Configure Email Configuration


The Email Configuration feature helps you add email IDs and map their details in OFSAA. The
configured email IDs receive notifications through network communication channels when any
feature that is mapped to email notifications is triggered.
You must have the SYSADM function role mapped to your role to access and modify the Email
Configuration details.

11.1.8.1 Add an Email Configuration


To add email configuration in OFSAA, follow these steps:
1. Log in as a User with System Administrator privileges.
2. Click the Administrator icon from the Header to display the Administration window.
The Administrator tools are displayed in the Tiles menu.
3. Click System Configuration from the Tiles menu to display a submenu list.
4. Click Configure Email Configuration to view the Email Configuration window
Alternatively, to access the Email Configuration window, follow these steps:
a. Log in as a User with System Administrator privileges.
b. Click the Menu Navigation icon and access the Navigation List.
c. Click System Configuration and then click Configure Email Configuration.
5. In the Email Configuration window, to add an Email Configuration record, click Add and enter
the details as given in the following:
a. Email Service Name: Enter the name of the email service provider.
For example, oracle
b. Protocol: Enter the protocol of the email server.
For example, SMTP, IMAP, or POP.
c. Host: Enter the host name or the IP address of the email server.
For example, 192.0.2.1 or example.com
d. Port: Enter the port number of the email server.
For example, 25.
e. Authentication: Select True if you require authentication or select False.
If you select True, then the User Name and Password fields are enabled.
Username: Enter the email User ID.
Password: Enter the password to access the email.
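The fields above map onto a standard mail-client configuration. The following Python sketch (illustrative only; the helper name, the allowed-value sets, and the sample record are assumptions, not OFSAA code) checks such a record before it is saved:

```python
# Illustrative validation of an email configuration record (not OFSAA code).
ALLOWED_PROTOCOLS = {"SMTP", "IMAP", "POP"}

def validate_email_config(cfg):
    """Return a list of problems found in an email configuration record."""
    problems = []
    if cfg.get("protocol") not in ALLOWED_PROTOCOLS:
        problems.append("protocol must be SMTP, IMAP, or POP")
    port = cfg.get("port")
    if not isinstance(port, int) or not (1 <= port <= 65535):
        problems.append("port must be an integer between 1 and 65535")
    # When Authentication is True, the User Name and Password fields become mandatory.
    if cfg.get("authentication") and not (cfg.get("username") and cfg.get("password")):
        problems.append("username and password are required when authentication is True")
    return problems

cfg = {"service_name": "oracle", "protocol": "SMTP", "host": "example.com",
       "port": 25, "authentication": False}
print(validate_email_config(cfg))  # an empty list means the record looks valid
```

The sample record reuses the example values from the steps above (protocol SMTP, host example.com, port 25).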

11.1.8.2 View an Email Configuration


In addition to adding an Email Configuration in the Email Configuration window, you can select a
record and click View to view the Email Configuration details.


11.1.8.3 Edit an Email Configuration


To edit an Email Configuration, select the required record and click Edit in the Email Configuration
window. The Email Configuration is displayed in Edit Mode. Update the fields as required. The Email
Service Name field is not editable.
For information about the fields in the window, see the steps in the Add an Email Configuration
section.

11.1.8.4 Delete an Email Configuration


To delete an Email Configuration, select the required record and click Delete in the Email
Configuration window. Select OK in the Confirmation window to delete.

11.1.9 Instance Access Token


The Instance Access Token enables you to invoke RESTful APIs (from external systems) that are
packaged in the OFSAA Applications.
To enable this use case, you have to register the external system through the configuration UI as
explained in the following section.

NOTE The Instance Access Token feature is available from OFS AAI
v8.0.8.3.0 and later versions.

The Instance Access Token provides the following abilities:


• Mechanism to generate multiple Unique Transaction Tokens that can be used to invoke the
RESTful endpoint.
• Unique combination of OFSAA instance and a given external system name.

NOTE • The OFSAA Administrator user is responsible for
generating the token (or tokens) and sharing it with
the external systems.
• Every external system invoking the RESTful endpoints
on the OFSAA instance must generate a separate
unique Instance Access Token.

• Ability to generate the Instance Access Token multiple times, if the previous token is misplaced
or lost.
• Endpoint to generate the Unique Transaction Token requests based on the input of Instance
Name and Instance Access Token.


11.1.9.1 Creating the Instance Access Token


As an OFSAA Administrator you can create the Instance Access Tokens and share it with the external
systems for the RESTful endpoints access.
To create an Instance Access Token, perform the following steps:
1. Log in as a user with System Administrator privileges and click System Configuration.
2. Select Configure Instance Access Token option from the System Configuration drop-down
menu.

Figure 240: Administration Page

The Configure Instance Access Token page is displayed.

3. Click Add.

Figure 241: Configure Instance Access Token Page

4. Specify the Instance Name.

Figure 242: Configure Instance Access Token Page


5. Click Generate Token.


The Instance Access Token is generated and displayed in the Instance Access Token Details
pane.
6. Copy the text from the Instance Access Token Details pane to a text file and save it as
xxxxxxx.props properties file.
The generated Instance Access Token details should be used to further generate multiple Unique
Transaction Tokens using GET API calls. For more information, see Generating Unique Transaction
Tokens.
To edit an instance access token, click Edit, next to the particular token.

11.1.9.2 Generating Unique Transaction Tokens


You can use the following API to generate the One-Time Token:
• GET Method
• API: /rest-api/auth/v1/token
For Authorization, use BASIC AUTH with the following values:
• User Name: Instance Name
• Password: Instance Access Token
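BASIC AUTH here means the Instance Name and Instance Access Token travel in a standard Authorization header. A minimal Python sketch of how a client might build that header (the helper name and the credential values are placeholders, not real values):

```python
import base64

def basic_auth_header(instance_name, instance_access_token):
    """Build the HTTP Basic Authorization header for the one-time token request."""
    credentials = f"{instance_name}:{instance_access_token}".encode("utf-8")
    return {"Authorization": "Basic " + base64.b64encode(credentials).decode("ascii")}

# Placeholder values, not real credentials.
header = basic_auth_header("MY_OFSAA_INSTANCE", "abc123token")
print(header["Authorization"])
```

The resulting header would accompany a GET call to /rest-api/auth/v1/token as described above.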

11.1.9.2.1 Token Expiry Configuration

The token expiry time can be configured in the Configuration UI. Specify the expiry limit of the token
in the API token validity in seconds field. By default, the One-Time Token validity is set to one hour.

Figure 243: Token Expiry Configuration


11.1.9.2.2 Configuration for SSO enabled setups

For SSO enabled setups, additionally configure the following fields:


• SSO Enabled
• Enable native authentication for Rest API

11.1.9.2.3 Authentication Policy Setting Configuration on SSO Server

To set the authentication policy settings, perform the following steps:

NOTE The following steps are an example and are applicable if the
SSO software is Oracle Identity Manager (IDM).

1. Login to the OAM Administrator Console.


2. On the Launch Pad, click Application Domains from the Access Manager widget.

Figure 244: Launch Pad

The Application Domain window is displayed.


3. Search for the required application domain for which you want to switch the authentication
scheme, and click its Name in the search results to display the details for the application domain.

Figure 245: Application Domain tab

4. Click Webgate_IDM_11g and then click the Resource tab.


5. Click the Search button.

Figure 246: Webgate IDM

The Search Results are displayed. The REST APIs required for OFSAA are highlighted as
displayed in the figure.
6. Click the Edit icon.
7. Modify the Protection Level from Protected to Excluded.


Figure 247: Webgate IDM

To enable token-based authentication for REST APIs rather than basic authentication, you must
change the Protection Level from Protected to Excluded.
8. Click Apply to save.

11.1.9.2.4 API Response

The GET API generates a One-Time Access Token as response in the JSON format as follows:
API call: /rest-api/auth/v1/token:
Response:

{
"token_type": "Bearer",
"expires_in": 3600,
"token":
"eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9.eyJqdGkiOiI5ZDljZWU4YS0zOGJmLTRkMjMt
OTU1ZC1kMTU5ODA2YTk5NzciLCJpc3MiOiJSU19RVE4iLCJhdWQiOiJPRlNBQSIsInN1YiI6Il
JTX1FUTiIsImlhdCI6MTYwNDk4NzU1OCwiZXhwIjoxNjA0OTkxMTU4fQ.WcxtP3A0NJa4U5bjD
_D8GQzzMd77pI4woW2Of11bxNMXnGM8jJUEI6msD81wayfs7Oemimv6SR4PGgln6xT_ylLXIcL
5qgSBqHifY-
Jb325gvKEMwize97SDEmLNhxz9x9dB5xvUguKIZsXz7CGK1aY8HPTdM4IZBZLHHccJIvgf0arE
3EeZtURdaycT9RbPYZvyyFW-ODK-NKSWATnbCmLVb-
CDZjcaO5KToX_ZXQIOmerWz2Wcj0wS8khceNq_zw-2O5cSAFrH15W0uyDWNLJd-
giT7sAXBi3oChxQ4Ms1qM7IB9xdVw44t0VGWrZfr5C-Yq3BGpkH_qix8R_r_A"
}
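A client can parse this JSON response and track when the token must be refreshed, since expires_in is in seconds (3600, the default one-hour validity). An illustrative Python sketch (the response literal below is abbreviated, not a real token, and the helper name is an assumption):

```python
import json
import time

# Abbreviated stand-in for the JSON response shown above; not a real token.
response_body = '{"token_type": "Bearer", "expires_in": 3600, "token": "eyJ...abbreviated..."}'

def parse_token_response(body, now=None):
    """Extract the bearer token and compute its absolute expiry time."""
    data = json.loads(body)
    issued_at = now if now is not None else time.time()
    return {
        "token": data["token"],
        "token_type": data["token_type"],
        # expires_in is a relative lifetime in seconds (default validity: one hour).
        "expires_at": issued_at + data["expires_in"],
    }

info = parse_token_response(response_body, now=0)
print(info["token_type"], info["expires_at"])  # Bearer 3600
```

A caller would request a fresh one-time token once the current time passes expires_at.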

11.1.9.2.5 Invoking REST API using Bearer Token

To invoke your REST API using the bearer token, refer to the following sample:
Curl Command for logging in the REST API access through bearer token
curl --location --request POST 'http://whf00pfs:8092/ofsa/rest-
api/idm/service/login' \
--header 'Authorization: Bearer
eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9.eyJqdGkiOiI5ZDljZWU4YS0zOGJmLTRkMjMtO
TU1ZC1kMTU5ODA2YTk5NzciLCJpc3MiOiJSU19RVE4iLCJhdWQiOiJPRlNBQSIsInN1YiI6IlJ
TX1FUTiIsImlhdCI6MTYwNDk4NzU1OCwiZXhwIjoxNjA0OTkxMTU4fQ.WcxtP3A0NJa4U5bjD_
D8GQzzMd77pI4woW2Of11bxNMXnGM8jJUEI6msD81wayfs7Oemimv6SR4PGgln6xT_ylLXIcL5
qgSBqHifY-
Jb325gvKEMwize97SDEmLNhxz9x9dB5xvUguKIZsXz7CGK1aY8HPTdM4IZBZLHHccJIvgf0arE
3EeZtURdaycT9RbPYZvyyFW-ODK-NKSWATnbCmLVb-
CDZjcaO5KToX_ZXQIOmerWz2Wcj0wS8khceNq_zw-2O5cSAFrH15W0uyDWNLJd-
giT7sAXBi3oChxQ4Ms1qM7IB9xdVw44t0VGWrZfr5C-Yq3BGpkH_qix8R_r_A'
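The same call can be sketched in Python with the standard library; the endpoint URL and token below are the placeholders from the curl example, not live values:

```python
from urllib import request

def bearer_request(url, token, data=b""):
    """Build a POST request carrying the one-time token as a Bearer header."""
    req = request.Request(url, data=data, method="POST")
    req.add_header("Authorization", "Bearer " + token)
    return req  # pass to urllib.request.urlopen(req) to actually execute the call

# Placeholder URL and token taken from the curl example above.
req = bearer_request("http://whf00pfs:8092/ofsa/rest-api/idm/service/login",
                     "eyJ...one-time-token...")
print(req.get_header("Authorization")[:6])  # Bearer
```

Only the header construction is shown; executing the request requires a reachable OFSAA instance.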

11.1.10 Information Domain


Information Domain within the Infrastructure system refers to a specific area of analysis which
consists of stored data models with the related Technical and Business data definitions for processing.
An Information Domain forms the backbone for all data analysis. An Information Domain comprises
a Metadom Schema and a Datadom Schema. The Metadom Schema holds all the Business data
definitions and the Datadom Schema consists of stored data models. For an RDBMS Infodom, the
Metadom and Datadom Schemas can point to the same Database Schema. For an HDFS Database, the
Metadom must point to an RDBMS Schema and the Datadom Schema must point to the Hive Schema.
Information Domain in the System Configuration section facilitates you to define and maintain the
Information Domain Details within the Infrastructure system.
• The Information Domain Maintenance window can be accessed only if the Server details are
defined and at least one database has been created.
• One Information Domain can be mapped to only one database and one database can be
mapped to only one Information Domain.
• You need to execute the file privileges_config_user.sql which is available under $FIC_HOME
directory by logging into database as sysdba user, to grant privileges to the Database Schema.
• The Information Domain schema makes use of the tables from the configuration schema. To
facilitate that, you need to execute the file “<Infrastructure Database Layer Install
Directory>/config_table_privileges_for_atomic_user.sql” from the Infrastructure config
database before the Information Domain is created.
You (System Administrator) need to have the SYSADM function role mapped to your role to access and
modify the Information Domain details. Click the Administrator icon from the header to display the
Administration tools in the Tiles menu. Click Information Domain from the Tiles menu to view the
Information Domain Maintenance window. Alternatively, click the Menu Navigation icon to access the
Navigation List and click Information Domain to view the Information Domain Maintenance window.

Figure 248: Information Domain Maintenance window

By default the Information Domain Maintenance window displays the pre-configured Information
Domain details and allows you to add, modify, and delete Information Domains.

11.1.10.1 Creating Information Domain


You can create Information Domain only when you have a defined database which has not been
mapped to any Information Domain. To add Information Domain details:
1. Click the Add button in the Information Domain Maintenance window. The window is refreshed
and enables you to populate the required data in the fields.

Figure 249: Information Domain Maintenance Add window

2. Enter the Information Domain details as tabulated:


The following table describes the fields in the Information Domain Details pane.


Figure 250: Fields in the Information Domain Details pane and their Descriptions

Field Description

Name: Enter the name of the Information Domain. Ensure that the name specified is a minimum of 6 characters long and does not contain any special characters or extra spaces.

Description: Enter the description of the Information Domain. Ensure that the description field is neither empty nor exceeds 50 characters.

Is authorization required for Business Metadata?: Select the checkbox if user authorization is required to access Business Metadata.

Is this Staging Information Domain?: Select the checkbox if you are creating a Staging/Temporary Information Domain.

3. Click Next and enter the database details as tabulated:


The following table describes the fields in the Database Details for DB Server pane.

Table 145: Fields in the Database Details for DB Server pane and their Descriptions

Field Description

Database Server: Select the database server from the drop-down list. The list contains all the defined database servers.

Database Name: Select the database name from the drop-down list. The list contains all the database names contained within the server.

OLAP Server: Select the OLAP server from the drop-down list. The list contains all the servers defined in OLAP Details.

OLAP Type: Select the OLAP Type from the drop-down list. The available options are:
• ESSBASE
• ORACLE
• SQLOLAP

Generate BI hierarchy: Select the required option to re-generate all the Business Intelligence Hierarchies either upon Data Load or upon Transformation or both. By default, the None option is selected.

4. Click Next.
5. Specify the file location path of erwin, Log, and Scripts file on the application server. For
example, an erwin file path could be /oracle/app73/ftpshare/<infodom>/erwin.
 erwin file stores TFM and Database Model XML files.
 Log file stores the Log data for all the Backend and Front-end components.
 Script file stores Table Creation scripts.
6. Specify the file location path of erwin, Log, and Scripts file on the database server.
For example, an erwin file path could be /home/db73/ftpshare/<infodom>/erwin.


The specified database and application server details will be mapped to the Information Domain. The
consolidated data is stored in the DSNMASTER table in the Config Schema database.
7. Select the Meta Database Server from the drop-down list. This is the database server of the
Metadom Schema.
8. Enter the Database Name of the Metadata Schema.
9. Click Save to save the Information Domain details.
After creating Information Domain successfully, add persistence unit entry and replace the
$JNDI_KEY_FOR_SERVER_TYPE in GRCpersistence.xml file present under
$FIC_WEB_HOME/webroot/WEB-INF/classes/META-INF folder.

The value for JNDI_KEY_FOR_SERVER_TYPE will vary based on the webserver type.
Similarly add persistence unit entry to persistence.xml file present under
$FIC_DB_HOME/conf/META-INF folder.
On creating an Information Domain a list of objects are created using the script files.
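For illustration, a JPA persistence unit entry generally takes the shape shown below. The unit name is a placeholder, this sketch is not the exact OFSAA entry, and the $JNDI_KEY_FOR_SERVER_TYPE value must be replaced with the key for your web server type as described above:

```xml
<!-- Illustrative persistence unit; the name and JNDI value are placeholders. -->
<persistence-unit name="INFODOM_NAME" transaction-type="JTA">
    <jta-data-source>$JNDI_KEY_FOR_SERVER_TYPE</jta-data-source>
</persistence-unit>
```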

11.1.10.2 Modifying Information Domain


By default, the Information Domain Maintenance window displays the details of the selected
Information Domain. Select the required Information Domain by clicking on the Name drop-down list.
You can edit only the specific information as indicated below:
• In Information Domain Details section you can update the Information Domain Description
and change the option to specify “Is authorization required for Business Metadata?”
• In Generate BI hierarchy section, you can change the option to re-generate all the Business
Intelligence Hierarchies either upon Data Load or upon Transformation or both. By default,
“None” option is selected.
• In Paths on the APP and DB Server, you can update only the Log File Path. The erwin and
Scripts file path is updated automatically by the system when there is a change in the Server
Details. The change in path of Log and MDB files has to be updated manually by moving the
files to the new path.
Once you have updated the required information, click Save to save the Information Details. For more
information, see Create Information Domain.

11.1.10.3 Deleting Information Domain


You can remove an Information Domain in the Infrastructure system only when there are no users
mapped to it. Select the required Information Domain by clicking the Name drop-down list and click
Delete.

NOTE You need to manually drop the Atomic Schema/ objects in the
schema upon deletion of INFODOM.

Perform the following actions:


1. Log in to the WebSphere/WebLogic Admin console.


2. Delete any Data Sources/ Connection Pool entries configured to the Atomic Schema of the
INFODOM being deleted. For more information, see Appendix B in the OFS AAAI Application
Pack Installation and Configuration Guide available in the OHC Documentation Library.
3. Navigate to $FIC_HOME/ficweb/webroot/WEB-INF/ folder.
4. Edit the web.xml file and delete any <resource-ref> entries pointing to the same ATOMIC
schema.
5. Navigate to the folder on your OFSAA instance identified as FTPSHARE.
6. Delete the folder with same name as the INFODOM being deleted.

11.1.11 Configuration
Configuration is the process of defining the System Accessibility Components of an Information
System. Configuration in the System Configuration Section enables you (System Administrator) to
define and maintain the User Accessibility Details within the Infrastructure System.
You (System Administrator) must have the SYSADM Function Role mapped to your role to access and
modify the Configuration details. Click Administration from the Header to display the
Administration Tools in a Tiles Menu. Click System Configuration from the Tiles Menu to view a
submenu list and click Configure System Configuration to view the Configuration Window.

Alternatively, you can click the Navigation Button to access the Navigation List. Click System
Configuration, and click Configure System Configuration to view the Configuration Window.
The Configuration Window consists of the sections: General Details, Guest Login Details, Optimization,
and Others. By default, the General Details Window is displayed with the pre-configured details of the
Server and Database that you are currently working on and allows you to modify the required
information.

11.1.11.1 Update General Details


OFSAAI supports the following types of authentications:
• SMS Authentication & Authorization – This option is selected by default.
• LDAP Authentication & SMS Authorization - Ensure that the LDAP Servers are up and
running if you select this option. You can configure multiple LDAP Servers here. While logging
into the OFSAA instance, you can select the appropriate LDAP Server to process the
authentication requests.
• SSO Authentication & SMS Authorization - Ensure that the SSO Server is configured if you
select this option.
• SSO Authentication (SAML) and SMS Authorization
Specify the configuration details as tabulated:
The following table describes the fields in the Configuration Window.


Table 146: Fields in the Configuration window and their Descriptions

Field Description

Number of invalid logins
This field is not applicable if you select the SSO Enabled check box. Enter the number of attempts permitted to a user to enter incorrect passwords, after which the user account is disabled.

Path for Application Packaging
Enter the Application Packaging path where the JSPs generated in DEFQ are saved.

Session Timeout Value (in minutes)
Enter the permitted duration of inactivity after which the session is automatically timed out and the user is asked to log in again.
Note the following:
• The session timeout value must be at least 10 minutes.
• The session timeout depends on the specified Session Timeout Value and the web server's internal session maintenance. It may vary for different web servers.
• If SSO authentication is selected, ensure that you set the Session Timeout Value equal to the configured server session time to avoid improper application behavior after the session expires.

Session Timeout Popup Interval (in minutes)
Enter the time in the session at which a popup must appear and display a timer showing the time remaining before the session ends.
For example, if you set the Session Timeout Value to 50 minutes and the Session Timeout Popup Interval to 5 minutes, the popup appears after 45 minutes of inactivity and displays a timer that counts down from 5 minutes to 0.

Environment Details
Enter the system environment details such as Development, UAT, Production, and so on. The information is displayed in the application's top banner as the "In Setup" information.

SSO Enabled
Select this check box to enable SSO Authentication & SMS Authorization.
Note: If SSO is enabled, you must configure the SSO URL for Referer Header Validation. For more information, see the Configure Referer Header Validation section in the OFSAA Security Guide.

Enable native authentication for REST API
Select to enable token-based authentication for the REST APIs to authenticate the native password. For more information, see the Using REST APIs for User Management from Third-Party IDMs section in the Oracle Financial Services Advanced Analytical Applications Infrastructure Administration Guide.

Authentication Type (displayed if the SSO Enabled check box is selected)
The options displayed for Authentication Type are:
• SSO Authentication and SMS Authorization
• SSO Authentication (SAML) and SMS Authorization
NOTE: For more information about IDCS for SAML integration, see https://docs.oracle.com/en/cloud/paas/identity-cloud/uaids/add-saml-application.html.


SSO Method (displayed only if you selected Authentication Type as SSO Authentication and SMS Authorization)
Select the required SSO Method. These methods specify how the User ID must be passed from the SSO engine:
• HTTP Request Header - Returns the value of the specified request header as a string from the server. If selected, you must specify the header value in the SSO Header Value field. For example, the SM_USER and iv-user header values are supported in OAM.
• HTTP Request Remote User - Returns the login details of the user who is requesting access to the application remotely.
• HTTP Request User Principal - Returns a "java.security.Principal" object containing the name of the current authenticated user.

SSO Logout URL (displayed only if you selected Authentication Type as SSO Authentication and SMS Authorization)
Enter the URL of the page that invalidates the SSO session.

SSO Redirect URL (displayed only if you selected Authentication Type as SSO Authentication and SMS Authorization)
Enter the URL of the page to which the user must be redirected after the SSO session is invalidated.

OFSAA as Service Provider (displayed only if you selected Authentication Type as SSO Authentication (SAML) and SMS Authorization)
Select this check box if you want to register OFSAA as the service provider. If the check box is not selected, OFSAA acts as one-way SAML authentication; that is, OFSAA only asserts the identity. For more details on how to register OFSAA as service provider, see the SSO Authentication (SAML) Configuration section in the OFSAAI Administration Guide.

Identity Provider URL (displayed only if you selected the OFSAA as Service Provider check box)
Enter the IDP SingleSignOnService URL in the Identity Provider URL field.
NOTE:
• Enter the fully qualified domain URL used to access the Identity Provider.
• The following is an example for IDCS: https://<IDCS_URL>/fed/v1/idp/sso


Identity Provider Logout URL (displayed only if you selected the OFSAA as Service Provider check box)
Enter the IDP SingleLogOutService URL in the Identity Provider Logout URL field.
NOTE:
• Enter the fully qualified domain URL used to access the Identity Provider logout.
• This is an optional field, required only if the IDP URLs for login and logout are different. If this field is not configured, the "Identity Provider URL" is used for both login and logout requests.
• The following is an example for IDCS: https://<IDCS_URL>/fed/v1/idp/slo

SAML User Attribute (displayed only if you selected Authentication Type as SSO Authentication (SAML) and SMS Authorization)
Enter the user attribute name that is used to pass the User ID in the SAMLResponse. If this parameter is not set, the user is retrieved from the "Subject" attribute by default.

SAML Certificate Absolute Path (displayed only if you selected Authentication Type as SSO Authentication (SAML) and SMS Authorization)
Enter the absolute path where the SAML certificate from the Identity Provider is stored; it is required for SAML assertion. If this parameter is not set, the signature from the SAMLResponse is not verified.

SAML Logout URL (displayed only if you selected Authentication Type as SSO Authentication (SAML) and SMS Authorization)
Enter the URL of the SAML logout page to be called on the logout operation.

SAML Request Binding (displayed only if you selected Authentication Type as SSO Authentication (SAML) and SMS Authorization)
Select to use SAML binding to transport messages within the URL.

Generate Logout Request (displayed only if you selected Authentication Type as SSO Authentication (SAML) and SMS Authorization)
Select to generate a SAML request for logout. Deselect this field to direct users to the URL specified in the SAML Logout URL field for logout.


Sign Authentication and Logout Request (displayed only if you selected Authentication Type as SSO Authentication (SAML) and SMS Authorization)
Select this field to display the following fields, which provide capabilities to generate signed SAML requests:
• Private Key - Update this field with the private key used to sign the SAML request. NOTE: We recommend that you use the PKCS#8 format. Do not protect the key with any passphrase.
• X509 Certificate - Update this field with the certificate used to sign the SAML request. Update the sp_metadata.xml file with the same certificate. For more information, see the SAML Service Provider Metadata Configuration with Certificate section in the OFSAAI Administration Guide.
• Signature Algorithm - Enter the URI of the algorithm. The following are a few examples from w3.org:
 http://www.w3.org/2001/04/xmldsig-more#rsa-sha256
 http://www.w3.org/2001/04/xmldsig-more#rsa-sha224
 http://www.w3.org/2001/04/xmldsig-more#rsa-sha384
 http://www.w3.org/2001/04/xmldsig-more#rsa-sha512
NOTE: If you leave the Signature Algorithm field blank, the system applies the default signature algorithm, RSA-SHA256.

Authentication Type
Select the required authentication type from the drop-down list. The options are:
• SMS Authentication and Authorization
• LDAP Authentication and SMS Authorization
When you select Authentication Type as LDAP Authentication and SMS Authorization, the LDAP Server Details popup is displayed. For more details, see LDAP Server Details.


JIT Provisioning Enabled
Select to enable Just-in-Time (JIT) provisioning, which synchronizes the user, group, and user-group mapping in external systems such as LDAP, SAML, and SSO into OFSAA when a user logs in.
NOTE:
• JIT provisioning is available on 8.1.1.2.0 and later versions. However, to enable it in the 8.1.1.1.0 version, apply the 33067589 one-off patch from My Oracle Support.
• JIT provisioning is available on 8.1.2.0.0 and later versions. However, to enable it in the 8.1.2.0.0 version, apply the 34019691 one-off patch from My Oracle Support.
• JIT provisioning is available on the 8.1.2.1.0 version and further maintenance releases.
• Update the Group Domain Mapping in OFSAA when you create groups in LDAP, SAML, or SSO.
• Configure the User Group details in the LDAP Group Details section if you select LDAP.
• For SAML, configure the attribute name user_groups for IDCS.
• For SSO, send the mapped groups in the header with the user_groups key.
• For SAML, configure the following attributes: user_groups, user_email, user_name.
• For SSO, configure the following headers: user_groups (to add more than one user group, specify the user groups separated by commas), user_email, user_name.
Enable JIT Unmapping Operation
Before you select this check box in the UI, ensure that the JIT Provisioning Enabled check box is selected to establish a connection with the external system.
Select to enable the unmap operation of the user groups from the external system to OFSAA during login.
NOTE:
• JIT provisioning is available on 8.1.1.2.0 and later versions. However, to enable it in the 8.1.1.1.0 version, apply the 33067589 one-off patch from My Oracle Support.
• JIT provisioning is available on 8.1.2.0.0 and later versions. However, to enable it in the 8.1.2.0.0 version, apply the 34019691 one-off patch from My Oracle Support.
• JIT provisioning is available on the 8.1.2.1.0 version and further maintenance releases.


Enable Group Creation during JIT Provisioning (NOTE: This is not a field in the UI; it is a parameter added to the Configuration table in the database.)
Before you perform this operation in the database, ensure that the JIT Provisioning Enabled check box is selected to establish a connection with the external system.
Set the JIT_IS_GRP_CRT_ENABLED parameter value to Y in the Configuration table in the database to enable the creation of groups during JIT provisioning. The default value is N. After setting the value to Y, commit and restart the servers.
NOTE:
• JIT provisioning is available on 8.1.1.2.0 and later versions. However, to enable it in the 8.1.1.1.0 version, apply the 33067589 one-off patch from My Oracle Support.
• JIT provisioning is available on 8.1.2.0.0 and later versions. However, to enable it in the 8.1.2.0.0 version, apply the 34019691 one-off patch from My Oracle Support.
• JIT provisioning is available on the 8.1.2.1.0 version and further maintenance releases.

Allow user to login from multiple machines
Select the check box to allow concurrent user login.

Allow Data Redaction
Select the check box to enable Data Redaction. For more details, see the Data Redaction section in the OFS AAI Administration Guide.

Encrypt Login Password
This field is not applicable if you have selected the SSO Enabled check box. Select the check box to encrypt the login password for more protection.
NOTE: For LDAP Authentication and SMS Authorization, this check box should not be selected.

Enable CSRF Protection
Select this check box to enable protection against Cross-Site Request Forgery (CSRF) in the application.

Hierarchy Security Type
Select the hierarchy security node type from the drop-down list. The available options are:
• Group-Based Hierarchy Security
• User-Based Hierarchy Security
Depending on the selection, the user or group details are displayed in the Hierarchy Security window.

Allowed Email Domains
Enter the email domains that you want to allow. Enter multiple domains as comma-separated values to allow more than one domain. For example: oracle.com, oci.oracle.com
During user creation in the User Definition (add mode) window, you can add only email IDs that belong to the allowed domains.

Dormant Days
This field is not applicable if you have selected the SSO Enabled check box. Enter the number of inactive days permitted after which the user is denied access to the system.


Inactive Days
This field is not applicable if you have selected the SSO Enabled check box. Enter the number of inactive days permitted after which the user's access permissions are removed and the delete flag status is set to "Y". Ensure that the number of inactive days is greater than or equal to Dormant Days. Note that the user details still exist in the database and can be revoked by changing the status flag.

Working Hours
This field is not applicable if you have selected the SSO Enabled check box. Enter the working hours (From and To) to restrict user login to the specified time range. The time is specified in 24-hour hh:mm format.

Frequency of Password Change
This field is not applicable if you have selected the SSO Enabled check box. Enter the number of days after which the login password expires and the user is taken directly to the Change Password window.

Password History
This field is not applicable if you have selected the SSO Enabled check box. Enter the number of old passwords to be maintained; the user is restricted from reusing any of them. A maximum of the last 10 passwords can be recorded.

Password Restriction
This field is not applicable if you have selected the SSO Enabled check box. Select one of the following options:
• Restricted - Imposes additional rules and parameters for users while defining a password.
• Unrestricted - Allows users to define any password of their choice, ensuring that the password is alphanumeric without any special characters.

Disclaimer Text
Enter any disclaimer information that you want to make available to the users of the application on the Login window.


Password restriction parameters (these fields are displayed only if you select the Restricted option for Password Restriction)
Specify the following password restriction parameters:
• Password Length - Enter the minimum and maximum number of characters permitted for setting a password. The default range is between 6 and 20 characters.
• Numbers - Enter the minimum and maximum number of numeric characters permitted.
• Upper Case - Enter the minimum and maximum number of uppercase characters permitted.
• Lower Case - Enter the minimum and maximum number of lowercase characters permitted.
• Special Characters Occurrence Allowed - Select the check box if special characters are allowed in passwords.
• Special Character - Enter the minimum and maximum number of special characters permitted.
• Special Character Occurrence Frequency - Enter the number of times the same special character can occur in the password.
• Disallowed Special Characters - Enter the special characters (without spaces) that are not permitted in a password.
• Running Alphabets - Select the check box to allow running alphabets in a password. For example, abc, xyz, AbC, and so on.
• Sequence of Running Alphabets - Enter the number of times the sequence is permitted.
• Running Numbers - Select the check box to allow running numbers in a password. For example, 123, 456, and so on.
• Sequence of Running Numbers - Enter the number of times the sequence is permitted.

Email Notification
Email notifications can be sent based on the following:
• Enable batch operation notification: Notifications are sent to all users mapped to the batch monitor functionality.
• Enable batch owner notification only: Notification is sent to the user who executes the batch.


Security Question Enable
Select to enable security questions that users must answer before they can reset their passwords. This feature enhances user authenticity validation. Enter information for the following fields:
• Question 1 - Enter the first question to be displayed on the Password Reset page.
• Answer 1 - Enter the answer to the first question.
• Question 2 - Enter the second question to be displayed on the Password Reset page.
• Answer 2 - Enter the answer to the second question.
• Question 3 - Enter the third question to be displayed on the Password Reset page.
• Answer 3 - Enter the answer to the third question.
The following illustration is an example:
Figure 251: Security Question Enable Pane

SAML_ENTITY
Makes the SAML_ENTITY of the SAML configuration configurable on the System Configuration screen. You can enter alphanumeric values in this field.
• When SAML_ENTITY is blank and the configuration is saved, the Entity ID in IAM is set to {base_url/context-name}.
• When an alphanumeric value is given in SAML_ENTITY, the IAM Entity ID is set to the value specified in the SAML_ENTITY field.
Example: Test1,./:@-&?_45

Click Save to save the General tab details.
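The Restricted password parameters described in Table 146 amount to a set of independent checks on a candidate password. The following is a minimal illustrative sketch of how such a policy could be validated, not OFSAAI source code; all parameter names, default limits, and the disallowed-character set are assumptions chosen for the example.

```python
def validate_password(pw,
                      min_len=6, max_len=20,     # default Password Length range from the table
                      min_digits=1,
                      min_upper=1, min_lower=1,
                      disallowed_specials="&%",  # illustrative Disallowed Special Characters
                      max_running=3):            # longest allowed run such as "abc" or "123"
    """Return a list of violated rules; an empty list means the password passes."""
    errors = []
    if not (min_len <= len(pw) <= max_len):
        errors.append("length")
    if sum(c.isdigit() for c in pw) < min_digits:
        errors.append("numbers")
    if sum(c.isupper() for c in pw) < min_upper:
        errors.append("upper case")
    if sum(c.islower() for c in pw) < min_lower:
        errors.append("lower case")
    if any(c in disallowed_specials for c in pw):
        errors.append("disallowed special character")
    # Reject running sequences ("abc", "123") longer than max_running characters.
    run = 1
    for a, b in zip(pw, pw[1:]):
        run = run + 1 if ord(b) - ord(a) == 1 else 1
        if run > max_running:
            errors.append("running sequence")
            break
    return errors
```

For example, `validate_password("Secur3pwd")` returns an empty list, while a short password fails the length check and a password containing "abcdef" fails the running-sequence check.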

11.1.11.1.1 LDAP Server Details

This feature allows you to configure and maintain multiple LDAP servers in the OFSAA instance. You can add a new LDAP server, modify or view LDAP server details, and delete an existing LDAP server. The LDAP Server Details window displays details such as ROOT Context, ROOT DN, LDAP URL, LDAP SSL Mode, and LDAP Server name.
To add a new LDAP server:


1. Select LDAP Authentication & SMS Authorization from the Authentication Type drop-down list in the General Details tab. The LDAP Server Details window is displayed.
2. Click the button in the toolbar. The LDAP Server Details window is displayed.

Figure 252: LDAP Server Details window

3. Enter the details as tabulated:


The following table describes the fields in the LDAP Server Details window.

Table 147: Fields in the LDAP Server Details window and their Descriptions

Field Description

Fields marked with * are mandatory.

LDAP Server Details

LDAP URL
Enter the LDAP URL from which the system authenticates the user. For example, ldap://hostname:3060/.

LDAP Server
Enter the LDAP server name. For example, ORCL1.in.oracle.com.

Enable Anonymous Bind
Select this option to log in to the database anonymously and perform functions. This is useful when you are searching for a user in the system and cannot find the user. For example, you cannot find a cn due to a name change and have to map the user to the correct dn; you can use a property such as email to search for the dn and map it correctly.
Note: Selecting this field disables the ROOT DN and ROOT Password fields.

LDAP SSL Mode
Select the check box to enable LDAP over SSL and ensure encryption of user credentials when transferred over a network.

ROOT DN
Enter the ROOT Distinguished Name. For example, cn=orcladmin,cn=Users,dc=oracle,dc=com.

ROOT Password Enter the LDAP server root password for authentication.

LDAP User Details


User Search Base
Enter the full path of the location of the active directory in the LDAP server from which to start the user search. This is a comma-delimited parameter. For example, cn=User,dc=oracle,dc=com

User Search Filter
Enter search filters to limit the user search in the results obtained from 'User Search Base'. For example, objectclass=organizationalPerson.

User Filter Classes
Enter a user search filter to include specific user groups. For example, enter 'top' for the search to access groups up to the top level in the directory.

Login ID Attribute
Specify the login ID attribute (user name) to be used in the system for users. For example, enter 'cn' to use the common name as the login ID attribute.

Login Name Attribute
Specify the attribute that maps to the Login ID. This is used for authentication purposes. For example, 'sn' maps to 'cn'.

User Enabled Attribute
Enter the attribute used to enable or disable a user. For example, 'orclisEnabled' enables a user account in the LDAP server.

User Start Date
Enter the attribute that stores the user-account start-date information. For example, 'orclActiveStartDate' contains the start dates of all users.

User End Date
Enter the attribute that stores the user-account end-date information. For example, 'orclActiveEndDate' contains the end dates of all users.

LDAP Group Details

Group Search Base
Enter the full path of the location of the active directory in the LDAP server from which to start the group search. This is a comma-delimited parameter. For example, cn=Groups,dc=oracle,dc=com

Group Search Filter
Enter search filters to limit the group search in the results obtained from 'Group Search Base'. For example, objectclass=groupOfNames.

Group Filter Classes
Enter a group search filter to include specific groups. For example, groupOfNames.

Group Member Attribute
Enter a member attribute listed for the groups. For example, 'member'.

Group ID Attribute Enter the attribute that identifies the group name. For example, ‘cn’.

Group Name Attribute
Enter the attribute that specifies the full name of the group. For example, 'description'.

4. Click Save.
When a business user accesses the OFSAA login window in an OFSAA instance where multiple LDAP servers are configured, the LDAP Server drop-down list is displayed. If the user selects an LDAP server, the user is authenticated only against the selected LDAP server. If the user does not select any LDAP server, the user is authenticated against the appropriate LDAP server.
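At authentication time, the User Search Base, User Search Filter, and Login ID Attribute settings above effectively combine into a single LDAP search. The sketch below shows, with assumed values, how such a filter string could be composed following RFC 4515 syntax; it illustrates the concept only and is not the OFSAAI implementation.

```python
def build_user_filter(search_filter, login_id_attr, login_id):
    """Compose an LDAP search filter that ANDs the configured
    User Search Filter with a match on the Login ID Attribute."""
    # Escape characters that are special in LDAP filters (RFC 4515).
    escaped = (login_id.replace("\\", r"\5c")
                       .replace("*", r"\2a")
                       .replace("(", r"\28")
                       .replace(")", r"\29"))
    return f"(&({search_filter})({login_id_attr}={escaped}))"

# Example with the values used in Table 147 (user ID 'jdoe' is hypothetical):
flt = build_user_filter("objectclass=organizationalPerson", "cn", "jdoe")
# flt == "(&(objectclass=organizationalPerson)(cn=jdoe))"
```

A directory search rooted at the User Search Base with this filter would then locate the entry whose dn is used for the bind.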


NOTE: SYSADMN, SYSAUTH, and GUEST users need not select any LDAP server as they are always authenticated against the SMS store. Additionally, if a specific user is marked as "SMS Auth Only" in the User Maintenance window, that user is authenticated against the SMS store instead of the LDAP store even though the OFSAA instance is configured for LDAP authentication. The user has to enter the password as per the SMS store.
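The routing rule in the note above (system users and "SMS Auth Only" users always go to the SMS store; everyone else goes to the selected LDAP server) can be expressed as a small decision function. This is a hypothetical sketch of that logic; the function and parameter names (`sms_auth_only`, `selected_ldap`) are invented for the illustration and do not appear in OFSAAI.

```python
SYSTEM_USERS = {"SYSADMN", "SYSAUTH", "GUEST"}

def pick_auth_store(user_id, sms_auth_only=False, selected_ldap=None):
    """Decide which store authenticates a login attempt.

    Returns a tuple (store, server) where store is 'SMS' or 'LDAP'."""
    if user_id.upper() in SYSTEM_USERS:
        return ("SMS", None)       # system users always use the SMS store
    if sms_auth_only:
        return ("SMS", None)       # per-user override from User Maintenance
    if selected_ldap:
        return ("LDAP", selected_ldap)
    return ("LDAP", "default")     # no server chosen: an appropriate server is used
```

For instance, `pick_auth_store("sysadmn")` routes to the SMS store, while `pick_auth_store("jdoe", selected_ldap="ORCL1")` routes to the chosen LDAP server.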

11.1.11.1.2 SSO Authentication and SMS Authorization

Before you configure SSO authentication, ensure that:
• You have configured OAM (Oracle Access Manager) or an equivalent server for SSO user authentication.
• The configured SSO server is up and running and an SSO login page is displayed for users to provide the authentication details.
• The configuration fields are updated correctly before saving the details.
• /<context-name>/login.jsp is the only resource that is protected.
• The following URLs are present in the excluded URL list of the SSO server:
1. MAP_WSDL_LOCATION=$PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/mdbObjAppMap?wsdl
2. MDBPUBLISH_EXECUTION_WSDL_LOCATION=$PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/mdbPublishExecution?wsdl
3. REST service for Object Migration: $PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/rest-api/migrationrest/MigrationRESTService/invokeMigrationService
4. REST service for WSMRE: $PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/rest-api/rrfmrerest/RestfulMREService/RestfulMREInvoke
5. Data Redaction: $PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/rest-api/redaction/redact/summary
6. $PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/servlet/com.iflex.fic.ficml.FICMaster
7. $PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/servlet/com.iflex.fic.icc.iccwl.ICCComm
8. $PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/help.jsp
9. $PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/help/*

NOTE: The placeholders such as $PROTOCOL$, $WEBSERVERHOST$, $WEBSERVERPORT$, and $CONTEXT$ in the URLs should be updated appropriately.
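Expanding those placeholders is a plain textual substitution. The sketch below shows one way to do it; the deployment values (host, port, context) are made up for the example.

```python
def expand_placeholders(template, values):
    """Replace $NAME$ placeholders in an excluded-URL template."""
    for name, value in values.items():
        template = template.replace(f"${name}$", value)
    return template

deployment = {
    "PROTOCOL": "https",
    "WEBSERVERHOST": "ofsaa.example.com",  # hypothetical host
    "WEBSERVERPORT": "9443",               # hypothetical port
    "CONTEXT": "OFSAAI",                   # hypothetical context name
}
url = expand_placeholders(
    "$PROTOCOL$://$WEBSERVERHOST$:$WEBSERVERPORT$/$CONTEXT$/help.jsp",
    deployment)
# url == "https://ofsaa.example.com:9443/OFSAAI/help.jsp"
```

The same substitution applies to each of the nine excluded URLs listed above.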


In case of any errors, the mapped users will not be able to log in to the application, and you may need to correct the details by logging in to the system as sysadmn.
For System Users:
• You can access the OFSAAI application using <Protocol (http/https)>://<IP/HOSTNAME>:<SERVLET PORT>/<CONTEXT NAME>/direct_login.jsp.
• You have to select the appropriate user id from the drop-down list.
For Application Users:
• The Login Page will be their respective SSO Authentication Page.
• After successful login, you can change your locale from the Select Language link in the application header of the Landing page. Move the pointer over the link and select the appropriate language from the listed languages. The languages displayed are based on the locales installed in the application.
• The Change Password link will not be available in the application header.

11.1.11.2 Update Guest Login Details


You (System Administrator) can facilitate guest users to log in to the Infrastructure system by configuring the Guest Login Details. If a password is defined, guest users are required to enter the password during logon and are then navigated to the specific modules based on the mapped roles and functions.
Ensure the following before configuring the guest user details:
• Functions and Roles should be mapped appropriately for tracking the guest user activities on
the system.
For example, when a guest user is permitted to modify metadata, the change cannot be tracked since the system recognizes the guest user as the modifier.
• When there is a provision for guest users to access the Infrastructure system from an external machine, a specific set of JSPs (web pages) has to be defined for the guest user and maintained in the "urllist.cfg" file in the ficweb/conf folder.
For example, if "urllist.cfg" contains "ficportal/Testing.jsp" and "fiv/OpenView.jsp", guest users can view and execute Testing.jsp and OpenView.jsp from the ficportal and fiv contexts.
 Any number of pages can be defined within the "urllist.cfg" file.
 Additions to the CFG file must be done manually.
 Only the links specified in the urllist.cfg file can be accessed through the guest login.
• You can also specify access based on wildcard entries. A wildcard character can be applied at the main folder level only and not to a subset of files within a folder. For example, if access is provided to ficportal/testing/*, then all the pages under the ficportal/testing folder are accessible from guest login.
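The urllist.cfg rules above (exact JSP entries plus a trailing /* wildcard honoured only at folder level) can be modelled as a small matcher. This is an illustrative sketch of the documented behaviour, not the product's actual parser; the entry list mirrors the examples in this section.

```python
def guest_can_access(path, urllist_entries):
    """Return True if the requested path matches an urllist.cfg entry.

    A wildcard is honoured only as a whole trailing folder ('folder/*'),
    which matches every page under that folder."""
    for entry in urllist_entries:
        if entry.endswith("/*"):
            if path.startswith(entry[:-1]):  # keep the trailing slash
                return True
        elif path == entry:
            return True
    return False

entries = ["ficportal/Testing.jsp", "fiv/OpenView.jsp", "ficportal/testing/*"]
# guest_can_access("ficportal/testing/Run.jsp", entries) -> True
# guest_can_access("ficportal/Other.jsp", entries)       -> False
```

Note that, consistent with the rule above, a pattern such as `ficportal/te*.jsp` would not be honoured because the wildcard is not at folder level.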
1. Select Guest Login tab and update the details as tabulated:


Figure 253: Guest Login tab

The following table describes the fields in the Guest login tab.

Table 148: Fields in the Guest login tab and their Descriptions

Field Description

Guest Login
Select one of the following options from the drop-down list:
• ENABLED - To enable guest users and allow them to log in to the system.
• DISABLED - To restrict access for guest users.

Guest Password
You can select one of the following from the drop-down list only if you have set Guest Login to ENABLED:
• Required - Guest users need to specify a password to log on.
• Not Required - Guest users can log on directly.

Guest Password
You can specify the Guest Password only if you selected Required in the previous field. Enter the Guest Password as indicated:
• If Password Restrictions are set in the General Details tab, the specified password must satisfy all the defined parameters. However, guest users are not subject to password change, invalid login attempts, or restrictions on logging in from multiple workstations.
• If no Password Restrictions are set, ensure that the specified password is alphanumeric without any extra spaces.

2. Click Save to save the guest login configuration details.

11.1.11.3 Update Optimization Details


1. Select Optimization Details tab and update the details as tabulated:


Figure 254: Optimization tab

Optimization details such as hints, scripts, and using ROWID instead of primary keys can be specified to optimize merge statements. The defined configurations are also fetched as Query Optimization Settings while defining Rule definition properties.
The following table describes the fields in the Optimization tab.

Table 149: Fields in the Optimization tab and their Descriptions

Field Description

Hint used for MERGE statement
Specify the SQL hint that can be used to optimize the merge query. For example, "/*+ ALL_ROWS */".
In a rule execution, the merge query formed using the definition-level merge hint takes precedence over the global merge hint parameters defined here. If the definition-level merge hint is empty or null, the global merge hint (if defined here) is included in the query.

Hint used for SELECT statement
Specify the SQL hint that can be used to optimize the merge query by selecting the specified query. For example, "SELECT /*+ IS_PARALLEL */".
In a rule execution, the merge query formed using the definition-level select hint takes precedence over the global select hint parameters defined here. If the definition-level select hint is empty or null, the global select hint (if defined here) is included in the query.

Script executed before MERGE statement
Refers to a set of semicolon-separated (;) statements that are executed before the merge query on the same connection object.
In a rule execution, the global pre-script parameters defined here are added to a batch, followed by the rule-definition-level pre-script statements if they have been provided during rule definition. However, it is not mandatory to have a pre-script at either the global or the definition level.

Script executed after MERGE statement
Refers to a set of semicolon-separated (;) statements that are executed after the merge query on the same connection object.
In a rule execution, the global post-script parameters defined here are added to a batch, followed by the rule-definition-level post-script statements if they have been provided during rule definition. However, it is not mandatory to have a post-script at either the global or the definition level.


Use ROWID in ON clause of MERGE statement
You can select the ROWID check box to create a merge statement based on the specified ROWID instead of primary keys.
In a rule execution, ROWID is considered while creating the merge statement if the Use ROWID check box is selected in either the global parameters defined here or the rule definition properties. If the Use ROWID check box is not selected in either place, the flag is set to "N" and primary keys are considered while creating merge statements.

2. Click Save to save the Optimization details.
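The precedence and ordering rules described above can be sketched in Python. This is a simplified illustration of the documented behavior, not the product's actual code; the function names are hypothetical:

```python
def resolve_hint(definition_hint, global_hint):
    """The definition-level hint takes precedence over the global hint;
    when it is empty or None, the global hint (if any) is used."""
    return definition_hint if definition_hint else global_hint

def build_execution(merge_query, pre_global="", pre_definition="",
                    post_global="", post_definition=""):
    """Assemble the ordered statement list for one rule execution:
    global pre-scripts first, then definition-level pre-scripts, the
    merge query itself, then global and definition-level post-scripts.
    Scripts are semicolon-separated sets of statements."""
    def split(script):
        # A script is a set of semicolon (;) separated statements.
        return [s.strip() for s in script.split(";") if s.strip()]
    return (split(pre_global) + split(pre_definition)
            + [merge_query]
            + split(post_global) + split(post_definition))
```

For example, `resolve_hint("", "/*+ IS_PARALLEL */")` falls back to the global hint, while a non-empty definition-level hint wins.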

11.1.11.4 Updating Others Tab


1. Select the Others tab and update the details as tabulated:

Figure 255: Others tab



The following table describes the fields in the Others tab.

Table 150: Fields in the Others tab and their Descriptions

Field Description

Limit on number of mappings displayed
Specify the number of mappings which are to be displayed in the Rule Definition window. A maximum of 9999 records can be displayed.

Page size used in tree pagination
Specify the number of subcomponents that can be displayed in each Component from the Process Component Selector window. A maximum of 9999 records can be displayed.

Application uses new Run Rule Framework
Selecting this option will display only the new Run Rule Framework links in the Metadata Browser and Enterprise Modeling windows.


Enable audit log through Security Management System
You can select this checkbox to enable the Infrastructure system to log all usage and activity reports. A System Administrator can then generate Audit Trail Reports in HTML format to monitor user activity at regular intervals.
Note: This is currently applicable for the Run Rule Framework only.
This feature is disabled by default.

Populate Execution Statistics
Select the checkbox to determine which case statement of a rule has updated how many corresponding records. Though there is no impact on Rule execution, an insert query is used in the back end to list the number of records processed by each condition in the rule.
For more information, see Populate Execution Statistics in the References section.

Allow Correction on DI Source
Select the checkbox to allow data correction on the data source. This enables the data correction to be executed along with data quality checks. If the checkbox is not selected, data corrections are done with T2T (LOAD DATA) executions, that is, while loading the data to the target table.
By default, the checkbox is selected.

2. Click Save to save the Others tab changes.

11.1.12 Application
Once an application pack is installed, you can use only the Production or Sandbox information domain created during the installation process. Although there is an option to create a new Information Domain, there is no menu to work with the frameworks on a newly created information domain; an information domain created this way acts only as a Sandbox Infodom.
The Create New Application feature allows you (System Administrator) to create a new Application
other than the standard OFSAA Applications and associate the standard/default platform framework
menu with it, thereby enabling the new application for usage. The standard platform framework menu
is seeded and rendered.

Click the Administration icon from the header to display the Administration tools in the Tiles menu, and then click Create New Application from the Tiles menu to view the Create New Application window. Alternatively, click the Menu Navigation icon to access the Navigation List, and click Create New Application to view the Create New Application window.
After you create an Application, a new Role is created as <APP_CODE>ACC. This role needs to be mapped to the user group, and the users mapped to that user group will get the new Application listed in the Tiles menu that appears on clicking the Administration icon from the header. Only Enabled applications are listed in this menu.


Figure 256: Create New Application window

The Create New Application window displays the existing Applications with the metadata details such
as Application ID, Application Name, Application Pack Name, Information Domain, and Enabled status.
You can use the Search and Filter option to search for a specific Application based on ID, Name, Application Pack Name, Information Domain, and Enabled status.

11.1.12.1 Creating a New Application


This option allows you (System Administrator) to create a new Application by providing ID, Name, and
Description. You need to select the information domain which you want to map to the newly created
Application. You also have an option to enable or disable the Application.
Note the following points:
• At least one Information domain should be present. For more information on creating an
Information Domain, see the Creating Information Domain section.
• Mapping the same information domain to different Applications is allowed.
• The menu to the new Application will be the complete set of platform framework menus
including Enterprise Modeling and Inline Processing Engine menus that work on DATADOM
schema. Access to the menus is controlled using the User Group-Role mappings.
To create an Application:

1. Click the Administration icon from the header to display the Administration tools in the Tiles menu, and then click Create New Application from the Tiles menu to view the Create New Application window. Alternatively, click the Menu Navigation icon to access the Navigation List, and click Create New Application to view the Create New Application window.
2. Click the Add button from the Applications toolbar. The Create New Application window is displayed.


Figure 257: Create New Application (add) window

3. Enter the details as tabulated:


The following table describes the fields in the Create New Application (add) window.

Table 151: Fields in the Create New Application (add) window and Descriptions

Field Description

Application ID
Enter the Application ID.

Application Name
Enter the name of the Application. A maximum of six characters is supported.

Application Description
Enter the description of the Application.

Application Pack Name
This field is automatically populated after you enter the Application ID. The Application Pack name will be <Application ID>PACK.

Information Domain
Select the Information Domain which you want to map to the Application from the drop-down list. The information domains to which your user group is mapped are displayed in the list.

Enabled
Select the checkbox to enable the Application for usage.

4. Click Save.
The new Application is created and appears in the Summary window. A new User Role is created as <APP_CODE>ACC. You need to map this User Role to the required User Groups from the User Group Role Map window. Once the System Authorizer authorizes the User Group-Role Map, the new Application will be listed in the Select Applications drop-down from the Applications tab for the User Group.

11.1.12.2 Modifying an Application


This option allows you to edit an existing Application. Only Application Name and Description can be
modified.
To modify an Application:


1. Click the Administration icon from the header to display the Administration tools in the Tiles menu, and then click Create New Application from the Tiles menu to view the Create New Application window. Alternatively, click the Menu Navigation icon to access the Navigation List, and click Create New Application to view the Create New Application window.

2. Click the Edit button from the Applications toolbar. The Create New Application (Edit) window is displayed.
3. Modify the required fields. You can edit the Application Name and Application Description.
4. Click Save.

11.1.13 View OFSAA Product Licenses After Installation of Application Pack

You can view the Application's Product Licenses relevant to the Application Pack after the installation process. The information available is read-only.
To view a Product License through the Application UI, follow these steps:
1. Log in as a User with System Administrator privileges.
2. Click the Administrator icon from the Header to display the Administration window.
The Administrator tools are displayed in the Tiles menu.
3. Click System Configuration from the Tiles menu to display a submenu list.
4. Click Manage OFSAA Product Licenses to view the Manage OFSAA Application Pack
Licenses window.
You can view the details for the installed Application Packs.
Alternatively, to access the Manage OFSAA Application Pack Licenses window, follow these
steps:
a. Log in as a User with System Administrator privileges.
b. Click the Menu Navigation icon and access the Navigation List.
c. Click System Configuration and then click Manage OFSAA Product Licenses.
5. In the Manage OFSAA Application Pack Licenses window, select the Application Pack for
which you want to view the details for the Products installed.
You can view the details in the Products in the Application Pack section as shown in the
following illustration:

Figure 258: View OFSAA Application Pack Licenses


NOTE: The following UI Elements in the Products in the Application Pack section are read-only and not clickable:
• Check boxes
• Buttons: View License Agreement and Reset

11.2 Identity Management


Identity Management in the Infrastructure administration process enables System Administrators to provide access to users and to monitor and administer them, along with the Infrastructure metadata operations.
The SMS component incorporates Password Encryption, Single Logon, Role and Data Based Security, Access Control, and Audit Trail features to provide a highly flexible security envelope.
System Administrators can create, map, and authorize users, defining a security framework which can restrict access to the data and metadata in the warehouse based on a fine-grained access control mechanism. These activities are mainly done at the initial stage and thereafter on a need basis.

11.2.1 Navigating to Identity Management


Click the Administration icon from the header to display the Administration tools in the Tiles menu, and then click Identity Management from the Tiles menu to view the Security Management window. Alternatively, click the Menu Navigation icon to access the Navigation List, and click Identity Management to view the Security Management window.

11.2.2 Components of Identity Management


Security Management consists of the following sections. Click on the links to view the sections in
detail.
• User Administrator


• System Administrator
• Audit Trail Report
• User Activity Report
• User Profile Report
• Enable User

11.2.3 Mappings in Identity Management


User - User Group Mappings
• A user is mapped to a single or multiple user groups
• A user group can have multiple users
• User to user group mapping is many to many
Function - Role Mappings
• A function is mapped to multiple roles
• A role can have many functions
• Function to role mapping is many to many
Folder/Segment - Domain Mappings
• A folder/segment is mapped to an information domain
• An information domain can have many folders/segments
• Folder/segment to information domain mapping is one to one, that is, a folder can be mapped
to a single domain
User Group Role Mapping
• A user group is mapped to multiple roles and each role will have multiple functions mapped to
it.
• All users belonging to a user group can do all functions associated with the roles to which the
user group is mapped.

Figure 259: User Group Role Mapping Illustration
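The mappings above can be sketched as nested lookups. The group, role, and function names below are hypothetical examples used only to illustrate the cardinalities; they are not seeded product values:

```python
# Hypothetical mapping data: users map to many groups, groups to many
# roles, and roles to many functions, as described above.
user_to_groups = {"jsmith": {"ANALYST_GRP", "AUTH_GRP"}}
group_to_roles = {"ANALYST_GRP": {"RULE_READ"},
                  "AUTH_GRP": {"RULE_AUTH"}}
role_to_functions = {"RULE_READ": {"VIEW_RULE"},
                     "RULE_AUTH": {"VIEW_RULE", "AUTHORIZE_RULE"}}

def effective_functions(user):
    """All users in a user group can perform every function of every
    role mapped to that group; a user's effective set is the union
    across all groups the user is mapped to."""
    funcs = set()
    for grp in user_to_groups.get(user, set()):
        for role in group_to_roles.get(grp, set()):
            funcs |= role_to_functions.get(role, set())
    return funcs
```

Here `effective_functions("jsmith")` yields the union of functions reachable through both of the user's groups.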


11.2.4 User Administrator


User Administration is one of the core functions of Security Management. It involves administrators creating user definitions and user groups, maintaining profiles, authorizing users and user groups, and mapping users to groups, domains, and roles.
User Administration refers to a process of controlling the user privileges in accessing the
Infrastructure resources and is based on business requirements to provide access to view, create, edit,
or delete confidential data. It also involves the administrator tasks to grant permissions based on user
roles and requirements.
You (System Administrator) need to have SYSADM and METAAUTH function roles mapped to access
User Administrator in LHS menu of Security Management. The options available under User
Administrator are:
• User Maintenance
• User Group Maintenance
• User Group Map
• Profile Maintenance
• User Authorization
• User Group Authorization
• User Group Folder Authorization
• User Group Domain Map
• User Group Role Map
• User Group Folder Role Map
• Reinstating Deleted Users

11.2.4.1 User Maintenance


User Maintenance facilitates you to create user definitions, view, manage, modify, and delete user
information. It also allows you to enable disabled users in the system after authorization.
You can access User Maintenance by expanding User Administrator section within the tree structure
of Navigation List to the left.
The User Maintenance window displays user details such as User ID, Name, Profile Name, Start, and
End dates. You can also identify the user status if enabled to access the Infrastructure system. You can
also search for a specific user or view list of existing users within the system.

11.2.4.1.1 Adding User

To add a user definition in the User Maintenance window:


1. Select the Add button from the User Maintenance toolbar. The Add button is disabled if you have selected any User ID in the grid. The New User window is displayed.

Figure 260: User Definition (add mode)


2. Enter the user details as tabulated.


The following table describes the fields in the User Definition window.

Table 152: Fields in the User Definition window and their Descriptions

Field Description

Fields marked in red asterisk (*) are mandatory.

User ID
Enter a unique user ID. Ensure that the User ID does not contain any special characters or spaces except ".", "@", "-", and "_".

User Name
Enter the user name. The user name specified here will be displayed on the Infrastructure splash window. Ensure that the User Name does not contain any special characters except "–", "'", and ".".

Employee Code
Enter the employee code. Ensure that the Employee Code does not contain any special characters or spaces except ".", "@", "-", and "_". If the employee code is not provided, the User ID is taken as the employee code.

Address
Enter the contact address of the user. It can be the physical location from where the user is accessing the system. Ensure that the Contact Address does not contain any special characters except ".", "#", "-", and ",".


Date Of Birth
Specify the date of birth. You can use the popup calendar to enter the date.

Designation
Enter the user designation. Ensure that the Designation does not contain any special characters except "_", ":", and "-".

Profile Name
Select the profile name from the drop-down list.

Start Date
By default, the Start Date is today's date and you cannot edit this field.

End Date
By default, the End Date value is 12/31/2050 and you cannot edit this field.

Password
Enter the default password for the user for the initial login. The user needs to change the default password during the first login.
A user is denied access in case the user has forgotten the password or enters the wrong password for the specified number of attempts (as defined in the Configuration window). To enable access, enter a new password here.

Database Authentication Principal
Select the Database Principal name from the drop-down list. The list displays the Principal names for the HDFS Kerberos connection.
Click the add button to create a new Database Principal by entering the Principal name and password in the DbAuth Principal and DbAuth String fields respectively.

Notification Time
(Optional) Specify the notification start and end time within which the user can be notified with alerts.

Email ID
Enter the e-mail address of the user.

Mobile Number
(Optional) Enter the mobile number of the user.

Pager Number
(Optional) Enter the pager number of the user.

Enable User
Select the checkbox to allow the user to access the system.
NOTE: By default, this checkbox is selected. A deselected checkbox denies access to the user.

Login on Holidays
Select the checkbox to allow the user to access the system on holidays. A deselected checkbox denies access to the user on holidays.

SMS Auth Only
This field is displayed only if LDAP Authentication & SMS Authorization or SSO Authentication & SMS Authorization is selected in the Configuration window.
Select the checkbox to authenticate the user through SMS even though LDAP Authentication or SSO Authentication is enabled. This feature can be used to bypass LDAP or SSO authentication for selected users.

Enable Proxy
Select the checkbox if you want to enable a proxy user for the database connection.

Proxy User name
Enter the proxy user name for the OFSAAI user, which will be used for the database connection.


3. Click Save to upload the user details.


The new user details are populated in the User Authorization window and have to be authorized by the System Authorizers. Once authorized, the user details are displayed in the User Maintenance window and can then be mapped to the required user group in the User - User Group Map window.
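The User ID constraint stated in the table above can be expressed as a simple client-side check. This is a sketch inferred from the stated rule (alphanumeric characters plus ".", "@", "-", and "_"); the server-side validation may differ in details such as length limits:

```python
import re

# Allowed: letters, digits, and the four special characters the guide
# permits in a User ID; no spaces or other special characters.
USER_ID_PATTERN = re.compile(r"^[A-Za-z0-9.@_-]+$")

def is_valid_user_id(user_id):
    """Return True if the User ID satisfies the documented character rule."""
    return bool(USER_ID_PATTERN.match(user_id))
```

For example, `is_valid_user_id("j.smith@corp-1_x")` is True, while IDs containing spaces or characters such as "#" are rejected.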

11.2.4.1.2 Viewing User Details

You can view individual user details at any given point. To view the existing user details in the User Maintenance window:
1. Select the checkbox adjacent to the User ID.

2. Click the View button in the User Maintenance toolbar.

The View User Details window is displayed with details such as User ID, User Name, Address, Date of Birth, Designation, Profile Description, and the Start and End Dates within which the user can access the Infrastructure system. The View User Details window also displays the notification details and whether the user is enabled to access the system on holidays.

11.2.4.1.3 Modifying User Details

To update the existing user details in the User Maintenance window:


1. Select the checkbox adjacent to the User ID whose details are to be updated.

2. Click the Edit button in the User Maintenance toolbar.


The Edit User Details window is displayed.
3. Update the required information. For more details, see Add User.

NOTE: You cannot edit the User ID. You can view the modifications once the changes are authorized. Also, a new password must be provided during the user details modification.

4. Click Save to save the changes.

11.2.4.1.4 Deleting User Details

You can remove the user definition(s) which are created by you and which are no longer required in
the system, by deleting from the User Maintenance window.
1. Select the checkbox adjacent to the user ID whose details are to be removed.

2. Click the Delete button in the User Maintenance toolbar.


3. Click OK in the information dialog to confirm deletion.

NOTE: The user can access the application until the delete request is authorized.


11.2.4.1.5 Adding User Attributes

This option allows you to input additional user attributes that are configured for a user. Ensure that
the required user attributes are present in the CSSMS_ATTRIB_MAST table. For more information
about how to add additional user attributes, see Setting up User Attribute Master section.
To add attributes to a user in the User Maintenance window:
1. Select the checkbox adjacent to the User ID for whom you wish to add additional attributes.

2. Click the User Attributes button in the User Maintenance toolbar. The User Attributes window is displayed.

Figure 261: User Attributes window

The user attributes present in the CSSMS_ATTRIB_MAST table are displayed in this window.
3. Enter appropriate information or select the required value from the drop-down list, for the
displayed user attributes.
4. Click Save to upload the changes.

11.2.4.2 Setting up User Attribute Master


OFSAAI captures some of the common user attributes such as Address, Designation, Date of Birth,
Employee Code and so on. Additionally, if you want to capture user details such as Branch Code or
Department Name, you can capture them by configuring User Attribute Master
(CSSMS_ATTRIB_MAST) table.
You have to upload the CSSMS_ATTRIB_MAST table after entering the required information in the table. You should have the Config Excel Advanced user role mapped to your user group. Note that this role is not available to the SYSADMN user.
1. Download the CSSMS_ATTRIB_MAST table. For more information on how to download a table
from Config Schema, see Config Schema Download section. You need to select
CSSMS_ATTRIB_MAST from the Select the table drop-down list.
2. Open the downloaded file in MS Excel 2003/ 2007. The excel file will have columns
ATTRIBUTE_ID, ATTRIBUTE_DESC, ALLOWED_VALUES, and TYPE.
3. Add data as shown in the following table:


Table 153: Details of Attribute ID, Description, Values, and Types

ATTRIBUTE_ID ATTRIBUTE_DESC ALLOWED_VALUES TYPE

BRANCH_CODE Branch Code 0

BRANCH_NAME Branch Name New York, Dallas 1

DEPT_CODE Department Code 0

DEPT_NAME Department Name 0

TYPE – Enter Type as 1 if you want to give a list of values from which the user has to select the
attribute value. In the ALLOWED_VALUES column, give the required values for the attribute.
Enter Type as 0 if the attribute value has to be entered in a text field.
4. Save the file.
5. Upload the modified CSSMS_ATTRIB_MAST table. For more information on how to upload a
table to Config Schema, see Config Schema Upload section. Note that you need to select
CSSMS_ATTRIB_MAST from the Select the table drop-down list and Upload Type as
Complete.
An appropriate message based on the success or failure status is displayed.
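A sketch of how the TYPE column drives the User Attributes window, based on the rule above. The rows mirror Table 153; the rendering logic is an illustration, not the product's actual code:

```python
# Hypothetical rows mirroring Table 153. TYPE is 1 when the attribute
# value is chosen from the comma-separated ALLOWED_VALUES list, and 0
# when it is captured as a free-text field.
rows = [
    {"ATTRIBUTE_ID": "BRANCH_CODE", "ATTRIBUTE_DESC": "Branch Code",
     "ALLOWED_VALUES": "", "TYPE": 0},
    {"ATTRIBUTE_ID": "BRANCH_NAME", "ATTRIBUTE_DESC": "Branch Name",
     "ALLOWED_VALUES": "New York, Dallas", "TYPE": 1},
]

def ui_options(row):
    """Return the drop-down options for a TYPE 1 attribute, or None
    when the attribute is rendered as a free-text field (TYPE 0)."""
    if row["TYPE"] == 1:
        return [v.strip() for v in row["ALLOWED_VALUES"].split(",")]
    return None
```

With this data, the Branch Name attribute yields the options "New York" and "Dallas", while Branch Code is presented as a text field.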

11.2.4.3 User Group Maintenance


User Group Maintenance facilitates you to create, view, edit, and delete user groups. You can maintain
and modify the user group information within the User Group Maintenance window.
You can access User Group Maintenance by expanding User Administrator section within the tree
structure of Navigation List to the left.
User Group Maintenance window displays details such as User Group ID, Group Name, Description,
Precedence, and the number of Mapped Users.
You can search for a user group based on User Group ID, Group Name, and Description.

11.2.4.3.1 Adding User Group

To add a User Group in the User Group Maintenance window:


1. Select the Add button from the User Group toolbar. The Add button is disabled if you have selected any User Group ID in the grid. The User Group Maintenance window is displayed.


Figure 262: User Group Definition (add) window

2. Enter the details as tabulated.


The following table describes the fields in the User Group Maintenance pane.

Table 154: Fields in the User Group Maintenance pane and their Descriptions

Field Description

User Group ID
Specify a unique ID for the user group. Ensure that there are no special characters or extra spaces in the ID entered.

Group Name
Enter a name for the user group.

Description
Enter a description for the user group.

Precedence
Enter the Precedence value. You can click the Lookup button to look up the existing precedence values applied to the various user groups.

NOTE: The lower the value in the precedence column, the higher the precedence. A user may be mapped to multiple user groups, and hence the precedence value is required if the Group Based Hierarchy Security setting is selected in the Configuration window.

3. Click Save to upload the user group details. The new User Group details need to be authorized
before associating users to the user group created. Before user group authorization, you need
to map an information domain and role to the user group.
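To illustrate the precedence note above, here is a hedged sketch of how the governing group might be resolved when a user belongs to several groups. The group names and values are made up for the example:

```python
# Hypothetical precedence values; per the note above, a LOWER value
# means HIGHER precedence under Group Based Hierarchy Security.
group_precedence = {"ADMIN_GRP": 100, "ANALYST_GRP": 500, "GUEST_GRP": 900}

def governing_group(mapped_groups):
    """Pick the group whose precedence value is numerically lowest
    among all groups a user is mapped to."""
    return min(mapped_groups, key=lambda g: group_precedence[g])
```

For a user mapped to both ANALYST_GRP and ADMIN_GRP, ADMIN_GRP (value 100) governs.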

11.2.4.3.2 Viewing User Group Details

You can view individual user group details at any given point. To view the existing user group details
in the User Group Maintenance window:
1. Select the checkbox adjacent to the User Group ID.


2. Click the View button in the User Group toolbar.


The View User Group Details window is displayed with the details such as User Group ID, Group
Name, Description, and Precedence value.

11.2.4.3.3 Modifying User Group

To update the existing user group details in the User Group Maintenance window:
1. Select the user group whose details are to be updated by clicking on the checkbox adjacent to
the User Group ID.

2. Click the Edit button in the User Group toolbar. The Edit button is disabled if you have selected multiple groups.
3. Edit the required User Group details except for User Group ID which is not editable. For more
information see Add User Group.
4. Click Save to upload changes.

11.2.4.3.4 Deleting User Group

You can remove user group definition(s) which are created by you, which do not have any mapped
users, and which are no longer required, by deleting from the User Group Maintenance window.
1. Select the checkbox adjacent to the user group ID(s) whose details are to be removed.

2. Click the Delete button in the User Group toolbar.


3. Click OK in the information dialog to confirm deletion.

NOTE: User Groups cannot be deleted if any requests (Domain map/unmap and Role map/unmap) are pending authorization or if any users are mapped to them.

11.2.4.4 User - User Group Map


User - User Group Map facilitates you to map user(s) to a specific user group, which in turn is mapped to a specific Information Domain and role. Every User - User Group mapping needs to be authorized by the System Authorizer. If you have enabled auto authorization, the user - user group mapping is authorized automatically. To enable auto authorization, see the SMS Auto Authorization section.


Figure 263: User – User Group Map window

User - User Group Map window displays details such as User ID, Name, and the corresponding
Mapped Groups. You can view and modify the existing mappings within the User - User Group Map
window.
You can access User - User Group Map window by expanding User Administrator section within the
tree structure of Navigation List to the left. You can also search for specific users based on User ID and
Name.

11.2.4.4.1 Viewing Mapped Groups

This option allows you to view the user groups mapped to a user.
To view the mapped User Groups of a user:
• From the User-User Group Map window, select the checkbox adjacent to the User ID. The list of
user group(s) to which the selected user has been mapped is displayed under Mapped Groups
grid.

11.2.4.4.2 Mapping/Unmapping Users

This option facilitates you to map a user to specific user groups.


To map or unmap a user in the User - User Group Map window:
1. Select the checkbox adjacent to the User ID.

2. Click the button in the Mapped Groups grid.


The User - User Group Mapping window is displayed.


 To map a user group, select the User Group and click the forward arrow button. You can press the Ctrl key for multiple selections.

 To map all the User Groups to a user, click the forward-all arrow button.

 To remove a User Group mapping for a user, select the User Group from the Select Members pane and click the back arrow button.

 To remove all the group mappings of a user, click the back-all arrow button.


In the User - User Group Mapping window, you can search for a User Group using the Search
field.
3. Click OK to save the mappings and return to User-User Group Map window.

NOTE: The newly created user - user group mapping needs to be authorized by the System Authorizer. Once it is authorized, it will be visible in the User - User Group Mapping window. If you have enabled auto authorization, the user - user group mapping is authorized automatically.
A User Group is displayed in the User - User Group Mapping window only if it is mapped to at least one Domain and Role.

11.2.4.5 Profile Maintenance

NOTE: This feature will not be available if the Authentication Type is selected as SSO Authentication and SMS Authorization in the Configuration window.

Profile Maintenance facilitates you to create profiles, specify the time zones, specify the working days of the week, and map the holiday schedule. The Profile Maintenance window displays the existing profiles with details such as the Profile Code, Profile Name, Time Zone, Workdays of Week, Holiday Time Zone, and mapped Holidays. In the Profile Maintenance window you can add, view, edit, and delete user profile definitions.
You can access Profile Maintenance by expanding User Administrator section within the tree
structure of Navigation List to the left. You can also search for specific profile or view the list of
existing profiles within the system.

11.2.4.6 Adding Profile


To add a profile in the Profile Maintenance window:
1. Select the Add button from the Profile Maintenance toolbar. The Add button is disabled if you have selected any Profile Code checkbox in the grid.


Figure 264: Profile Definition (add) window

2. The Profile Definition (add) window is displayed. Enter the details as tabulated.

Table 155: Fields in the Profile Definition (add) and their Descriptions

Field Description

Enter a unique profile code based on the functions that the user
Profile Code executes. For example, specify AUTH if you are creating an authorizer
profile.

Enter a unique profile name. Ensure that Profile Name does not contain
Profile Name
any special characters except ".", "(",")", "_", "-".

Select the Start and End time zone from the drop-down list. Time zones
Time Zone are hourly based and indicate the time at which the user can access the
system.

Select the Holiday Start and End time zone from the drop-down list.
Holiday Time Zone Time zones are hourly based and indicate the time at which the user can
access the system on holidays.

Work Days of Week
Select the work days of a week by clicking the checkbox adjacent to the week days. The specified time zones will be applicable to the selected days.

3. Click Save to save the profile.


11.2.4.7 Mapping Holidays


To enable a user to access the Infrastructure system during holidays, map the profile to the holiday schedule. For the user to access the system on holidays, the Login on Holidays checkbox in the User Maintenance window must be selected.

1. Click the button in the New Holidays grid. The Holiday Mapping window is displayed.
The Holiday Mapping window displays the holidays that are added through the Holiday
Maintenance section.
2. To map a holiday, you can do the following:

 To map a holiday to the user profile, select it from the list and click the forward arrow button.

 To map all the listed holidays to the user profile, click the forward-all arrow button.

 To remove a holiday mapping from the user profile, select it from the list and click the back arrow button.

 To remove the entire holiday mapping for the user profile, click the back-all arrow button.


3. Click OK to save the mapping.

11.2.4.8 Viewing Profile


You can view the profile of a particular user at any given point. To view the existing user profile details
in the Profile Maintenance window:
1. Select the checkbox adjacent to the Profile Code.

2. Click the View button in the Profile Maintenance toolbar.


The Profile Maintenance window displays the profile of the user along with the holiday mapping details.

11.2.4.9 Modifying Profile


You can modify all details of an individual profile except the Profile Code and Profile Name at any
time.
To edit a user profile in the Profile Maintenance window:
1. Select the checkbox adjacent to the Profile Code.

2. Click the Edit button in the Profile Maintenance toolbar.


3. Edit the user profile as required. For more information, see Add Profile.
4. Click Save to upload changes.

11.2.4.10 Deleting Profile


You can remove user profile definitions that you created and that are no longer required in the system
by deleting them from the Profile Maintenance window.
1. Select the checkbox adjacent to the Profile Code(s) whose details are to be removed.

2. Click the Delete button in the Profile Maintenance toolbar.


3. Click OK in the information dialog to confirm deletion.

11.2.4.11 User Authorization


The User Authorization function enables system authorizers to authorize users created or modified by
the system administrator and allow them to access the Infrastructure system. Whenever a new user is
created or an authorized user's details are updated, the user must be authorized by a system
authorizer before they can access the Infrastructure system.
• As a system authorizer, you can:
 View the available user IDs that are to be authorized.
 Authorize or reject users to access the system.
 Authorize or reject modification requests of users.
 View the current, updated, and previous user details for authorization.
 Authorize users based on the user IDs created by the System Administrator.
• As a user, you can log in to the Infrastructure system only if authorized by the system
authorizer.
You can access User Authorization window by expanding User Administrator and selecting User
Authorization within the tree structure of Navigation List to the left.
The User Authorization window displays a list of available users for Authorization. By default, the users
will be displayed in alphabetical order of the User IDs with the other details such as User ID, Name,
User Start Date, and User Expiration Date. You can also search for specific users.

11.2.4.11.1 Authorizing or Rejecting User(s)

In the User Authorization window, do the following:


1. Select the User ID that has to be authorized. The window is refreshed and the user details are
displayed below.
2. In the User Authorization toolbar:
 Click the Authorize button to authorize the user(s).

 Click the Reject button to reject the user(s).


3. Click OK in the information dialog to confirm authorization or rejection. On processing, a
system message is displayed.

11.2.4.12 User Group Authorization


The User Group Authorization function enables system authorizers to authorize or reject the user
groups mapped to a user. This authorization is required if user groups are mapped to Public folders.
• As a system authorizer, you can:
 View the list of mapped/unmapped users to be authorized
 View the list of mapped/unmapped roles to be authorized
 View the list of mapped/unmapped domains to be authorized
 Authorize or reject the mapping/unmapping of user groups to a role or a domain


You can access User Group Authorization window by expanding User Administrator section within
the tree structure of Navigation List to the left and selecting User Group Authorization.
The User Group Authorization window displays a list of available user groups for authorization. When
you select a user group, details such as Mapped/Unmapped Users, Mapped/Unmapped Roles, and
Mapped/Unmapped Domains are displayed. You can search for a specific user group based on Group
Code and Group Name.

NOTE After creating a user group, you need to map an information domain and a role to the user group. Only then will the user group be visible for authorization in the User Group Authorization window.

11.2.4.12.1 Authorizing or Rejecting User Group(s)

In the User Group Authorization window, do the following:


1. Select the required User Group ID for authorization.
The Mapped/Unmapped Users, Mapped/Unmapped Roles, and Mapped/Unmapped Domains
corresponding to the selected User Group are displayed in the respective grids.
2. Select the checkbox adjacent to the mapped or unmapped group/user/role/domain, and:
 Click the Authorize button to authorize it.

 Click the Reject button to reject it.


3. Click OK in the information dialog to confirm authorization or rejection. On processing, a
system message is displayed.

11.2.4.13 Authorization for User Group Folder Mapping


Authorization for User Group Folder Mapping enables system authorizers to authorize or reject the
mapping and unmapping of roles to folders, done from the User Group Role Map window. This
authorization is required for mapping user groups to Shared folders.
As a system authorizer, you can view the list of mapped/unmapped user roles to be authorized for a
selected user group. Once the mapping/unmapping is authorized, the changes take effect.
You can access Authorization for User Group Folder Mapping window by expanding User
Administrator section within the tree structure of Navigation List to the left and clicking
Authorization for User Group Folder Mapping.
To authorize the mapping of roles to a folder:
1. Click Authorization for User Group Folder Mapping under User Administrator in the
Security Management menu. The Authorization for User Group Folder Mapping window is
displayed.


Figure 265: Authorization for User Group Folder Mapping window

2. Select the user group and the folder. The Mapped/Unmapped Roles corresponding to the
selected User Group which requires authorization are displayed in the respective grids.
3. Select the checkbox adjacent to the mapped or unmapped roles, and:
 Click the Authorize button to authorize it.

 Click the Reject button to reject it.


4. Click OK in the information dialog to confirm authorization or rejection. On processing, a
system message is displayed.

11.2.4.14 User Group Domain Map


User Group Domain Map enables System Administrators to view the available user groups and map
the required domains to user groups. System Administrators can also remove a user group mapping
for a specific domain, or map additional domains to a specific user group, to ensure the confidentiality
of restricted Information Domains.
You can access User Group Domain Map window by expanding User Administrator section within the
tree structure of Navigation List to the left.
The User Group Domain Map window displays a list of available user groups in alphabetical order with
the User Group ID, Group Name, and Description. On selecting a user group, the list of available
mapped domains is displayed.

NOTE It is mandatory to map at least one information domain to a user group.

You can search for specific user group based on User Group ID, Group Name, and Description.
To map a user group to a domain, do the following:
1. Select the checkbox adjacent to the required User Group ID. The User Group Domain Map
window is refreshed to display the existing mapped domains.

2. Click the button in the Mapped Domains section toolbar. The User Group Domain Map window
is displayed.

 To map domains to a user group, select the domain from the Members list and click the move button.
You can press the Ctrl key for multiple selections.

 To map all the domains to a user group, click the move-all button.


 To remove a mapping for a user group, select the domain from the Select Members list and click the remove button.

 To remove all domains mapped to the user group, click the remove-all button.


In the User Group Domain Map window, you can search for a Domain using the Search field.
3. Click OK to save the mappings and return to User Group Domain Map window.
Mapping/unmapping of user groups to a domain should be authorized by the System
Authorizer. If you have enabled auto authorization, the User Group-Domain mapping/unmapping
is authorized automatically. To enable auto authorization, see the SMS Auto Authorization section.

11.2.4.15 User Group Role Map


User Group Role Map enables System Administrators to map roles to specific user groups. Each
role has defined functions, and users mapped to a role can perform only those functions.
The following table lists example Group Codes and the Role Codes mapped to them.

Table 156: Group Code and Role Code used for the User Group Role map

GROUP CODE ROLE CODE

ADMIN SYSADM

AUTH SYSATH

CWSADM CWSADMIN

You can access User Group Role Map window by expanding User Administrator section within the
tree structure of Navigation List to the left.
The User Group Role Map window displays a list of available user groups in alphabetical order with the
User Group ID and Description. On selecting a user group, the list of available mapped roles is
displayed.
You can also search for specific user group or view the list of existing user groups within the system.
To map a Role to User Group, do the following:
1. Select the checkbox adjacent to the required User Group ID. The User Group Role Map window
is refreshed to display the existing mapped roles.

2. Click the button in the Mapped Roles section toolbar. The User Group Role Map window is
displayed.
3. In the User Group Role Map window, you can search for a role using the Search field and edit
the mapping.

 To map a role to a user group, select the role from the Members list and click the move button. You can
press the Ctrl key for multiple selections.


 To map all the roles to a specific user group, click the move-all button.

 To remove a mapping for a user group, select the role from the Select Members list and click the remove button.

 To remove all roles mapped to a user group, click the remove-all button.


4. Click OK to save the mappings and return to User Group Role Map window.
Mapping/unmapping of user roles to a user group should be authorized by the System
Authorizer. If you have enabled auto authorization, the User Group-Role
mapping/unmapping is authorized automatically. To enable auto authorization, see the SMS
Auto Authorization section.

11.2.4.16 User Group Folder Role Map


User Group Folder Role Map enables System Administrators to map roles to specific user groups that
are mapped to shared folders. This mapping grants a user access rights to objects belonging to a
shared folder/segment.
To map a user group, folder, and role:
1. Click User Group Folder Role Map under User Administrator in the Security Management
menu. The User Group Folder Role Map window is displayed.

Figure 266: User Group Folder Role Map window

2. Select the user group from the User Group Folder Role Map grid.
All shared folders are displayed in the Infodom-Folder Map grid.

3. Select the shared folder to which you want to map roles and click the move button.

4. Select the required roles and click the move button, or click the move-all button to map all the roles. To remove the mapping of a role, select the role and click the remove button. To remove all mapped roles, click the remove-all button.
5. Click OK.
User Group-Folder-Role mapping/unmapping should be authorized by the System Authorizer.
If you have enabled auto authorization, the mapping/unmapping is authorized
automatically. To enable auto authorization, see the SMS Auto Authorization section.


11.2.4.17 Reinstating Deleted Users


The User Reinstate feature allows System Administrators to reinstate deleted users. After reinstating a
user, you should map the user to the required user groups.
To reinstate deleted users:
1. Click Reinstate User under User Administrator in the Security Management menu.
The User Reinstate window is displayed.

Figure 267: User Reinstate window

All deleted users are displayed in the User Reinstate pane.

2. Select the user you want to reinstate and click the reinstate button.


A confirmation message is displayed.
3. Click OK.
The reinstated user(s) will have the same user ID, and the password is reset to "password0".

11.2.5 System Administrator


System Administration refers to a process of managing, configuring, and maintaining confidential
data in a multi-user computing environment. System Administration in Security Management involves
creating functions, roles, and mapping functions to specific roles. System Administration also involves
maintaining segment information, holiday list, and restricted passwords to ensure security within the
Infrastructure system.
You can access System Administrator from the Security Management Navigation List to the left. The
options available under System Administrator are:
• Function Maintenance
• Role Maintenance
• Function - Role Map
• Segment Maintenance
• Holiday Maintenance
• Restricted Passwords


11.2.5.1 Function Maintenance


A function in the Infrastructure system defines the privileges to access modules or components and to
define or modify the associated metadata information. Function Maintenance allows you to create
functions for users to ensure that only those functions specific to the user's role are executed.
You can access Function Maintenance by expanding System Administrator section within the tree
structure of Navigation List to the left. The Function Maintenance window displays the function details
such as Function Code, Function Name, Description, and the number of Roles Mapped to the function.
The Function Maintenance window also facilitates you to view, create, modify, and delete functions
within the system.
You can also make use of Search and Pagination options to search for a specific function or view the
list of existing functions within the system.

11.2.5.1.1 Creating Function

To create a function in the Function Maintenance window:


1. Click the Add button in the Function Maintenance toolbar. The Add button is disabled if you have selected
any function in the grid. The Function Definition (add) window is displayed.

Figure 268: Function Definition (add) window

2. Enter the function details as tabulated. You can also see pre-defined Function Codes for
reference.
The following table describes the fields in the Function Definition (add) window.

Table 157: Fields in the Function Definition (add) window and their Descriptions

Fields marked with a red asterisk (*) are mandatory.

Function Code: Enter a unique function code. Ensure that there are no special characters or extra spaces in the code entered. For example, DATADD to add a dataset.

Function Name: Enter a unique name for the function. Ensure that the Function Name does not contain any special characters except "(", ")", "_", "-", and ".".

Function Description: Enter the function description. Ensure that the Function Description does not contain any special characters except "(", ")", "_", "-", and ".".

3. Click Save to upload the function details.

The User Info grid at the bottom of the Function Maintenance window displays metadata
information about the function created.

11.2.5.1.2 Viewing Function

You can view individual function details at any given point. To view the existing function details in the
Function Maintenance window:
1. Select the checkbox adjacent to the Function Code.

2. Click the View button in the Function Maintenance toolbar.


The View Function Details window is displayed with the details such as Function Code, Function
Name, and Function Description.

11.2.5.1.3 Modifying Function

To update the existing function details (other than system generated functions) in the Function
Maintenance window:
1. Select the checkbox adjacent to the required Function Code.

2. Click the Edit button in the Function Maintenance toolbar. The Edit Function Details window is
displayed.
3. Update the required information. For more details, see Create Function.

NOTE Function Code cannot be edited.

4. Click Save to upload the changes.

11.2.5.1.4 Deleting Function

You can remove only those functions that you created and that are no longer required in the system,
by deleting them from the Function Maintenance window.
1. Select the checkbox adjacent to the Function Code whose details are to be removed.

2. Click the Delete button in the Function Maintenance toolbar.


3. Click OK in the information dialog to confirm deletion.


11.2.5.2 Role Maintenance


A role in the Infrastructure system is a collection of functions defined for a set of users to execute a
specific task. You can create roles based on the group of functions to which users are mapped.
You can access Role Maintenance by expanding System Administrator section within the tree
structure of Navigation List to the left. The Role Maintenance window displays the role details such as
Role Code, Role Name, Role Description, and the number of Users Mapped to the role. The Role
Maintenance window also facilitates you to view, create, modify, and delete roles within the system.
You can also make use of Search and Pagination options to search for a specific role or view the list of
existing roles within the system.
To view the default roles defined within the Infrastructure application, see Role Mapping Codes.

11.2.5.2.1 Creating Role

To create a role in the Role Maintenance window:


1. Click the Add button in the Role Maintenance toolbar. The Add button is disabled if you have selected any
role in the grid. The New Role window is displayed.

Figure 269: Role Definition (add) window

2. Enter the role details as tabulated. You can also see the pre-defined codes for reference.
The following table describes the fields in the Role Definition (add) window.

Table 158: Fields in the Role Definition (add) window and their Descriptions

Fields marked with a red asterisk (*) are mandatory.

Role Code: Enter a unique role code. Ensure that there are no special characters or extra spaces in the code entered. For example, ACTASR to create Action Assessor.

Role Name: Enter a unique name for the role. Ensure that the Role Name does not contain any special characters except space.

Role Description: Enter the role description. Ensure that the Role Description does not contain any special characters except space.

3. Click Save to upload the role details. The User Info grid at the bottom of the Role Maintenance
window displays metadata information about the role created.

11.2.5.2.2 Viewing Role

You can view individual role details at any given point. To view the existing role details in the Role
Maintenance window:
1. Select the checkbox adjacent to the Role Code.

2. Click the View button in the Role Maintenance toolbar.


The View Role Details window is displayed with the details such as Role Code, Role Name, and
Role Description.

11.2.5.2.3 Modifying Role

To update the existing role details in the Role Maintenance window:


1. Select the checkbox adjacent to the required Role Code.

2. Click the Edit button in the Role Maintenance toolbar. The Edit Role Details window is displayed.
3. Update the required information. For more details, see Create Role.

NOTE Role Code and Role Name cannot be edited.

4. Click Save to upload the changes.

11.2.5.2.4 Deleting Role

You can remove only those roles that you created, that do not have any users mapped, and that are no
longer required in the system, by deleting them from the Role Maintenance window.
1. Select the checkbox adjacent to the Role Code whose details are to be removed.

2. Click the Delete button in the Role Maintenance toolbar.


3. Click OK in the information dialog to confirm deletion.

11.2.5.3 Function - Role Map


Function - Role Map enables you to view and map a set of functions to a specific role within the
Infrastructure system. Functions can be mapped only to a defined set of roles to ensure effective
Infrastructure system security.


You can access Function – Role Map by expanding System Administrator section within the tree
structure of Navigation List to the left. The Function – Role Map window displays a list of available Role
Codes in alphabetical order with the Role Name. On selecting a particular Role Code, the Mapped
Functions are listed in the Mapped Functions grid of Function – Role Map window.
You can also make use of Search and Pagination options to search for a specific role or view the list of
existing roles within the system.
To view the default Function – Role mapping defined within the Infrastructure application, see
Function Role Mapping.

Figure 270: Function – Role Map window

To map a role to a function in the Function – Role Map window, do the following:
1. Select the checkbox adjacent to the required Role Code. The Function – Role Map window is
refreshed to display the existing mapped functions.

2. Click the button in the Mapped Functions section toolbar. The Function Role Mapping window
is displayed.
3. In the Function Role Mapping window, you can search for a function using the Search field and
edit the mapping.

 To map a function to a role, select the function from the Members list and click the move button. You can
press the Ctrl key for multiple selections.

 To map all the functions to the selected role, click the move-all button.

 To remove a function mapping for a specific role, select the function from the Select Members
pane and click the remove button.

 To remove all function mappings for a role, click the remove-all button.


4. Click OK to save the mappings and return to Function – Role Map window.
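Since each role bundles a set of functions, a user's effective privileges are the union of the functions across all roles mapped to them. The sketch below illustrates this with invented mappings (SYSADM and SYSATH are the example role codes from Table 156; DATADD is the example function code from Table 157); it is not the product's actual data model:

```python
# Illustrative sketch: the mappings below are invented examples,
# not the product's seeded role-function data.
role_functions = {
    "SYSADM": {"DATADD", "DATMOD", "DATDEL"},   # system administrator functions
    "SYSATH": {"DATAUTH"},                      # system authorizer function
}
user_roles = {
    "user1": {"SYSADM", "SYSATH"},
    "user2": {"SYSATH"},
}

def effective_functions(user: str) -> set:
    """Union of functions over all roles mapped to the user."""
    return set().union(*(role_functions[r] for r in user_roles[user]))

print(sorted(effective_functions("user1")))
print("DATADD" in effective_functions("user2"))  # False: user2 cannot add datasets
```

The union semantics mean that mapping an extra role to a user group only ever widens access, which is why role mappings require authorization.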


11.2.5.4 Segment Maintenance


Segment is used to control access rights on a defined list of objects. It is mapped to an information
domain.
Segment Maintenance in the Infrastructure system facilitates you to create segments and assign
access rights. You can have different segments for different Information Domains or same segments
for different Information Domains.
User scope is controlled by the segment/folder type with which an object is associated:
• Objects contained in a public folder are displayed to all users.
• Objects contained in a shared folder are displayed if the user belongs to a user group that is
mapped to an access-type role on the corresponding folder.
• Objects contained in a private folder are displayed only to the associated owner.
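The three bullets above amount to a simple visibility rule. The following sketch expresses it in Python; the object and user structures are invented for illustration and do not reflect OFSAAI's internal representation:

```python
# Illustrative sketch of the segment/folder visibility rules described
# above; field names are invented for this example.
def is_visible(obj: dict, user: dict) -> bool:
    folder_type = obj["folder_type"]
    if folder_type == "Public":
        return True                                    # visible to everyone
    if folder_type == "Shared":
        # visible if any of the user's groups holds an access-type role
        # on the object's folder
        return any(obj["folder"] in user["folder_access"].get(g, set())
                   for g in user["groups"])
    if folder_type == "Private":
        return obj["owner"] == user["id"]              # owner only
    return False

obj = {"folder_type": "Shared", "folder": "RISKSEG", "owner": "admin"}
user = {"id": "u1", "groups": {"RISKGRP"},
        "folder_access": {"RISKGRP": {"RISKSEG"}}}
print(is_visible(obj, user))  # True: u1's group has access to RISKSEG
```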
You can access Segment Maintenance by expanding the System Administrator section within the tree
structure of the Navigation List to the left. The Segment Maintenance window displays a list of available
segments with details such as Domain, Segment Code, Segment Name, Segment Description,
Segment/Folder Type, Owner Code, and the number of Users Mapped to the segment. You can view,
create, modify, and delete segments within the Segment Maintenance window.
You can also make use of the Search and Pagination options to search for a specific segment or view the
list of existing segments within the system.

11.2.5.4.1 Creating Segment

To create a segment in the Segment Maintenance window:


1. Click the Add button in the Segment Maintenance toolbar. The Add button is disabled if you have
selected any checkbox in the grid. The New Segment window is displayed.

Figure 271: Segment Maintenance (add) window

2. Enter the segment details as tabulated.


The following table describes the fields in the Segment Maintenance (add) window.


Table 159: Fields in the Segment Maintenance (add) window and their Descriptions

Fields marked with a red asterisk (*) are mandatory.

Domain: Select the required domain for which you are creating a segment from the drop-down list.

Segment Code: Enter a unique segment code. Ensure that the segment code does not exceed 10 characters and contains no extra spaces and no special characters except underscore.

Segment Name: Enter a unique name for the segment. Ensure that there are no special characters in the name entered.

Segment Description: Enter the segment description. Ensure that there are no special characters in the description entered except spaces, "(", ")", "_", "-", and ".".

Segment/Folder Type: Select the type of the segment/folder from the drop-down list. The options are Public, Private, and Shared.

Owner Code: Select the owner code from the drop-down list.

3. Click Save to upload the segment details.

The User Info grid at the bottom of the Segment Maintenance window displays metadata
information about the segment created.

11.2.5.4.2 Viewing Segment

You can view individual segment information at any given point. To view the existing segment details
in the Segment Maintenance window:
1. Select the checkbox adjacent to the required segment.

2. Click the View button in the Segment Maintenance toolbar.


The View Segment Details window is displayed with details such as Domain, Segment Code,
Segment Name, Segment Description, Segment/Folder Type, and Owner Code.

11.2.5.4.3 Modifying Segment

To update the existing segment details in the Segment Maintenance window:


1. Select the checkbox adjacent to the segment.

2. Click the Edit button in the Segment Maintenance toolbar. The Edit Segment Details window is
displayed.
3. Update the Segment Description, Segment/Folder Type, and Owner Code. The other fields are
view-only and cannot be edited. For more details, see Create Segment.
4. Click Save to upload the changes.


11.2.5.4.4 Deleting Segment

You can remove only those segments that you created, that do not have any users mapped, and that
are no longer required in the system, by deleting them from the Segment Maintenance window.
1. Select the checkbox adjacent to the segment whose details are to be removed.

2. Click the Delete button in the Segment Maintenance toolbar.


3. Click OK in the information dialog to confirm deletion.

11.2.5.5 Holiday Maintenance

NOTE This feature will not be available if Authentication is configured to SSO Authentication and SMS Authorization.

Holiday Maintenance enables you to create and maintain a schedule of holidays or non-working days
within the Infrastructure system. On a holiday, you can provide access to the required users and
restrict all others from accessing the system, from the User Maintenance window.
You can access Holiday Maintenance by expanding the System Administrator section within the tree
structure of the Navigation List to the left. The Holiday Maintenance window displays a list of holidays in
ascending order. In the Holiday Maintenance window, you can create and delete holidays.

11.2.5.5.1 Adding Holiday

To add a holiday date in the Holiday Maintenance window:


1. Click the Add button in the Holiday Maintenance toolbar. The Add button is disabled if you have selected any
checkbox in the grid. The New Holiday window is displayed.

2. Click the calendar button and specify the date using the calendar.

For more information on selecting a date, see the Calendar section.
3. Click Save to upload the changes.

11.2.5.5.2 Deleting Holiday(s)

You can remove a holiday entry by deleting it from the Holiday Maintenance window.
1. Select the checkbox adjacent to the holiday that has to be removed.

2. Click the Delete button in the Holiday Maintenance toolbar.


3. Click OK in the information dialog to confirm deletion.

11.2.5.6 Restricted Passwords


NOTE This feature will not be available if the Authentication Type is selected as SSO Authentication and SMS Authorization from System Configuration > Configuration.

Restricted Passwords enables you to add and store a list of passwords that users are not permitted to
use to access the Infrastructure system.
You can access Restricted Passwords by expanding System Administrator section within the tree
structure of Navigation List to the left. The Restricted Passwords window displays a list of restricted
passwords and allows you to add and delete passwords from the list.
You can also make use of Search and Pagination options to search for a specific password or view the
list of existing passwords within the system.

NOTE While searching for any pre-defined restricted password, you have to key in the entire password.

11.2.5.6.1 Adding Restricted Password

To add a restricted password in the Restricted Passwords window:


1. Click the Add button in the Restricted Passwords toolbar. The Add button is disabled if you have selected
any checkbox in the grid.

Figure 272: Restricted Passwords window

2. Enter the password in the New - Password field. Ensure that the password is alphanumeric,
without any spaces, and that its length is between 6 and 20 characters.
3. Click Save to upload the new password.
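The rule in step 2 (alphanumeric, no spaces, 6 to 20 characters) can be expressed as a single pattern check. This is an illustrative validation sketch, not the product's actual code:

```python
import re

# Illustrative check for the stated rule: alphanumeric only, no spaces,
# length between 6 and 20 characters.
def is_valid_restricted_password(pw: str) -> bool:
    return re.fullmatch(r"[A-Za-z0-9]{6,20}", pw) is not None

print(is_valid_restricted_password("password0"))   # True
print(is_valid_restricted_password("pass word"))   # False: contains a space
print(is_valid_restricted_password("abc"))         # False: too short
```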

11.2.5.6.2 Deleting Restricted Password(s)

You can de-restrict a password by deleting it from the Restricted Passwords window.
1. Select the checkbox adjacent to the password that has to be removed.

2. Click the Delete button in the Restricted Passwords toolbar.


3. Click OK in the information dialog to confirm deletion.

11.2.6 User Activity Report


The User Activity Report enables the System Administrator to view and generate user activity reports
to track and ensure security within the Infrastructure system.
You can access User Activity Report from the Security Management Navigation List to the left. The
User Activity Report window enables you to generate reports of the currently logged in users,
disabled users, deleted users, unauthorized users, and idle users. Additionally, you can generate the Role
Master report, the User ID Population report, and the UAM Admin Activity report.
The table below lists each user type within the User Activity Report window with other details.

Table 160: Report Types in the User Activity Report window and their Descriptions

Currently Logged In Users: Displays the list of users currently accessing the Infrastructure system, with details such as User ID, User Name, and Last Login Date.

Disabled Users: Displays the list of users who are authorized but are currently disabled from accessing the Infrastructure system, with details such as User ID, User Name, and Disabled On date.

Deleted Users: Displays the list of users who are removed from the system with the status as authorized to access the Infrastructure system. The list also displays details such as User ID, User Name, Last Login, Authorization Status, and the Deleted On date.

Unauthorized Users: Displays the User ID and User Name of all users who are not authorized.

Idle Users: Displays the list of users who have not logged in to the Infrastructure system for a certain period, with details such as User ID and User Name. The default number of idle days is 10; the value can be modified by entering the required number of days in the Idle Users (No of Days) field in the Search and Filter grid.

Role Master Report: Displays all OFSAA Roles and the corresponding Functions/rights mapped to each role. That is, if a Function/Right is assigned to a particular role, the corresponding checkbox is in the selected state.

User ID Population Report: To generate this report, enter the User ID of the user whose report you want to generate and click the generate button. The report displays user details such as User ID, User Name, Employee Code, Profiles, Status of the Profiles, Creation Date, Last Password Changed Date, Last Login Date, Maker ID, Maker Date, Checker ID, Checker Date, and Profile End Date.

UAM Admin Activity Report: To generate this report, enter the User ID of the user whose report you want to generate and the duration, and then click the generate button. The report displays the new and old values for User ID, User Name, Employee Code, Profile Name, Activity, Maker ID, Checker ID, Maker Date, and Checker Date. It also displays the list of Admin activities performed on the user within the specified duration, such as User Details modified, User Access rights modified, User Mappings modified, and so on.

For User Activity Reports such as Currently logged in users, Disabled users, Deleted users,
Unauthorized users, and Idle users, you can:
• Click Save to File to generate an HTML format of the report.
The File Download window is displayed.
 Click Open in the File Download window to view the report in your browser.
 Click Save in the File Download window to save a local copy of the report.
For User Activity Reports such as Role Master Report, User ID Population Report, and UAM Admin
Activity Report, you can:

• Click the Excel button to save or open the report in Excel format.

• Click the PDF button to save or open the report in PDF format.

11.2.7 User Profile Report


User Profile Reports in the Infrastructure system provide information about the Segment Name, User Group Name, Role Name, and Function Name to which a user is mapped.
You can access the User Profile Report from the Security Management Navigation List to the left. The User Profile Report window enables you to generate user profile reports. You can make use of the Pagination option to view the list of users within the system.

11.2.7.1 Generate User Profile Report

1. Click the user selection button in the User Profile Report toolbar.


The User Mapping window is displayed.
2. In the User Mapping window, do the following:

 Select the user names from the Members list and click the select button. You can press the Ctrl key for multiple selections.

 To move all users to the Selected Members pane, click the select-all button.

 To remove a selected user, select the user from the Selected Members pane and click the deselect button.

 To remove all the selected users from the Selected Members pane, click the deselect-all button.


3. Click OK to save the mappings and return to User Profile Report window.
4. Select Generate Reports in the User Profile Report window and view the report.

Figure 273: User Profile Report window

NOTE You can select File as the print option to generate an HTML report. The access link to the report is displayed at the bottom of the User Profile Report window.

You can also select Reset to refresh the selections in the User Profile Report window.

11.2.8 Enable User


Enable User facilitates you to search and select the required user and re-define the access to the
Infrastructure system. In the Enabling User window, you can permit user access and clear the
workstation information based on the following conditions:
• When user access is locked due to exceeding the number of invalid login attempts
• When user access is locked due to an abnormal exit from the system
You (System Administrator) need to have SYSADM function role mapped to access the Enable User
within the Utilities section of the Infrastructure system. The Enabling User window displays the details
of a selected user such as User Name, User Start and End Date, Last Disabled, Enabled, and Login
Date, IP Address, along with Enable Login and Clear Station status.
To Enable User in the Enabling User window:
1. Select the User ID for whom you need to enable access, from the drop-down list.


You can also use search to filter the list and find the required ID. Click Search and enter the keyword in the Search For field. Click OK; the list is filtered based on the specified keyword.
2. Enable access to the selected user based on one or both of the following conditions:
 Select the Enable Login checkbox if the user access is denied due to invalid login attempts.
 Select the Clear Station checkbox if the user access is denied due to an abnormal exit from the system.

NOTE To Clear Station (clearing the cache of the previous session) for enabled users, select the Enabled Users List check box.

3. Click Save and update the changes.


The Info grid at the bottom of the window displays the metadata about the changes.

11.3 References
This section of the document consists of information related to intermediate actions that need to be performed while completing a task. The procedures are common to all the sections and are referenced wherever required. See the following sections based on your need.

11.3.1 List of Objects Created in Information Domain


On saving an Information Domain, a list of objects is created in the atomic database mapped to this Information Domain. You can view the list in the My Oracle Support Portal under Document ID 1566694.1.
If the required objects have not been created, there could be a problem in connecting to the database, the required privileges may not be set for the database users, or there may not be enough space in the database. Rectify any of these issues and then save the Information Domain again.

11.3.2 Authentication and Logging


During the Oracle Financial Services Analytical Applications Infrastructure installation, you are provided the option of selecting the authentication type required for OFSAAI users. You can select either SMS authentication and authorization or Lightweight Directory Access Protocol (LDAP) authentication for OFSAAI login.
LDAP is a standalone access directory that provides a single logon, requiring only one user name and password while accessing different software. During installation, if you selected the LDAP Users option in the User Configuration window, the same is configured for authentication.
For example, ldap://iflexop-241:389
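The example URL above follows the standard scheme://host:port form; a quick sketch of how it decomposes, using Python's standard library (illustrative only):

```python
from urllib.parse import urlparse

# Break the example LDAP URL into its components.
url = urlparse("ldap://iflexop-241:389")
print(url.scheme)    # ldap
print(url.hostname)  # iflexop-241
print(url.port)      # 389, the standard (non-SSL) LDAP port
```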

11.3.3 Populating Execution Statistics


This feature allows you to determine how many records each case statement of a rule has updated.


On selecting this checkbox in the Others tab of the System Configuration > Configuration window, an insert query is generated and executed just before the merge statement of the rule is executed. This in turn lists the number of records processed by all mappings and also stores information about Run ID, Rule ID, Task ID, Run Skey, MIS Date, the number of records fetched by each mapping, the order of evaluation of each mapping, and so on, in the configuration table (EXE_STAT).
Typically, the insert query lists the number of records processed by each condition in the rule and is
done just before the task gets executed and not after the batch execution is completed (since the state
of source data might change). This insert query works on all types of query formation including
Computation Rules with and without Aggregation, Classification Rules, Rules with multiple targets,
Rules with default nodes, Rules with Parameters in BPs, and Rules with exclusions.
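As a sketch of how the collected statistics might be inspected, a query along these lines could be run against EXE_STAT. The column names below are assumptions for illustration only; verify them against the actual table definition in your configuration schema:

```sql
-- Hypothetical column names: check your EXE_STAT definition before use.
SELECT RUN_ID, RULE_ID, TASK_ID, RUN_SKEY, MIS_DATE,
       RECORDS_FETCHED, EVALUATION_ORDER
  FROM EXE_STAT
 ORDER BY MIS_DATE DESC;
```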

11.3.3.1 Scenario
Consider the following scenario, where a typical rule contains a series of Hierarchy Nodes (BI/Non BI) as the Source and one or more BPs or BI Hierarchy Leaf Nodes in the Target.
Rule 1 consists of the following:

Table 161: Source and Target details

SOURCE TARGET

Condition 1 Target 1

Condition 2 Target 1

Condition 3 Target 1

Condition 4 Target 2

The insert query execution populates execution statistics based on the following:
• Each rule has processed at least one record.
• Each target in the rule has processed at least one record through Condition 1 / Condition 2 /
Condition 3 and Condition 4.
• Each source in the rule has processed at least one record through Condition 1 / Condition 2 /
Condition 3 and Condition 4.

11.3.4 SMS Auto Authorization


If auto authorization is enabled, the system authorizer need not manually authorize the user-user group mapping, user group-domain mapping, user group-role mapping, and user group-role-folder mapping. The mappings get authorized automatically.
To enable auto authorization:
1. Execute the following query in the Configuration Schema:
UPDATE CONFIGURATION SET PARAMVALUE ='TRUE' WHERE
PARAMNAME='SMS_AUTOAUTH_REQD'
2. Restart the OFSAA server.
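Before restarting, you can verify the change by reading the parameter back; this SELECT is derived directly from the UPDATE above:

```sql
-- Confirm that auto authorization is now enabled.
SELECT PARAMNAME, PARAMVALUE
  FROM CONFIGURATION
 WHERE PARAMNAME = 'SMS_AUTOAUTH_REQD';
```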


12 Reports
Reports for user status, user activity, audit trail, and so on are available to users and support export of the generated data in PDF and MS Excel formats.
The following user reports are available in the application:
• User Status Report
• User Attribute Report
• User Admin Activity Report
• User Access Report
• Audit Trail Report

12.1 Accessing Reports


The following procedure describes how to access reports:
1. Log in to the application to display the OFSAA Landing Page.

You can access the Audit Trail Report from Reports on the header. Click the Reports icon on the header to display the Reports Tiles menu.

Figure 274: Reports window

2. Click any of the reports to display the respective Search and Filter windows.

NOTE You can access reports from the Tiles menu, or by clicking the Hamburger icon to view the Navigation List.

12.2 Creating User Status Report


The User Status Report provides information for deleted, disabled, logged-in, authorized, and idle
users.
The following is the procedure to create User Status Reports:
1. From the Reports Tiles Menu, click User Status Reports to display the User Status Report
Window.
2. In the Search and Filter Section, enter details in the fields as described in the following table:


Table 162: Fields in the User Status Report Window and their Descriptions

User ID: Click the User ID Field to display a drop-down list of User IDs. Select All to display the report for all users in the system, or select a specific User ID to display the report for the selected User ID.

User Name: Click the User Name Field to display a drop-down list of User Names. Select All to display the report for all users in the system, or select a specific User Name to display the report for the selected User Name.

Note: You can select either the User ID or User Name Field. You cannot use a combination of both fields to generate the report.

Disabled Users: Select the check box to filter the report for Users disabled in the system.

Deleted Users: Select the check box to filter for Users deleted in the system.

Currently Logged in Users: Select the check box to filter for users who are currently logged in to the system.

Note: You can use a combination of the Disabled Users and Deleted Users check boxes to filter your reports. Selecting Disabled Users or Deleted Users disables the Currently Logged in Users check box. Conversely, selecting Currently Logged in Users disables the Disabled Users and Deleted Users check boxes.

3. Click Search to generate the report and display the result in the section following the Search
and Filter Section, or click Reset to clear all values from the Search and Filter Section and
enter new criteria to search.
The following table describes the columns in the report:

Table 163: Fields in the Search and Filter Pane and their Descriptions

User ID: Displays the unique User ID of the User.

User Name: Displays the unique User Name of the User.

Last Successful Login: Displays the Date and Time of the last successful login by the User.

Last Failed Login: Displays the Date and Time of the last failed login by the User.

Enabled: Displays whether the User is enabled in the system or not. The values are YES and NO.

Deleted: Displays whether the User is deleted from the system or not. The values are YES and NO.

Authorized: Displays whether the User has been authorized in the system or not. The values are YES and NO. Note: The authorization of Users is done by Administrators who have User Authorization Privileges.

Currently Logged In: Displays whether the User is currently logged in to the system or not. The values are YES and NO.

Idle Days: Displays the number of days that the User has remained idle in the system.

Note: You must apply the 33150367 One-off Patch from My Oracle Support to view the following additional fields: Start Date, End Date, Login Holidays, SMS Auth Only, Created Date, Last Modified Date, Last Password Change Date, Last Enabled Date, Last Disabled Date, and Deleted Date.

Start Date: Displays the configured Start Date of the period for the User to be active in the system.

End Date: Displays the configured End Date of the period for the User to be active in the system.

Login Holidays: Displays whether the user is allowed to access the system on holidays or not. For more information on how to enable this feature, see User Maintenance.

SMS Auth Only: Displays if the User can be authenticated through SMS. For more information on how to enable this feature, see User Maintenance.

Created Date: Displays the date on which the User was created in the system.

Last Modified Date: Displays the date on which the details of the User were last updated in the system.

Last Password Change Date: Displays the date when the password was last changed for the User.

Last Enabled Date: Displays the date when the User was last enabled in the system.

Last Disabled Date: Displays the date when the User was last disabled in the system.

Deleted Date: Displays the date when the User was deleted from the system.

Additional generic features available on the User Status Report Window are as follows:
To export the report, click the Export Button and select either PDF or Excel.

Figure 275: Export Menu Options


Select the Up or Down Icons from the header for a required column to sort the records in
ascending or descending order.
For more information, see Resizing and Sorting Reports.
Enter a number in the Go to Page Field in the footer to navigate to a specific record page. Or
use the First, Previous, Next, and Last Buttons to navigate between the list of records
displayed across multiple pages.
However, after you use the navigation features in the footer, the sorting feature in the header
does not apply.
The default value for this field is 5 records per page.

12.3 Creating User Attribute Report


The User Attribute Report provides information for various user attributes in the application, such as User ID and Employee ID.
The following is the procedure to create User Attribute Reports:
1. From the Reports Tiles menu, click User Attributes Reports to display the User Attribute
Report window.
2. In the Search and Filter section, enter details in the fields in the User Attribute window.
The following table describes the fields in the User Attribute window.

Table 164: Fields in the User Attribute window and their Descriptions

User ID: Click the User ID field to display a drop-down list of User IDs. Select All to display the report for all users in the system, or select a specific User ID to display the report for the selected User ID.

User Name: Click the User Name field to display a drop-down list of User Names. Select All to display the report for all users in the system, or select a specific User Name to display the report for the selected User Name.

Note: You can select either User ID or User Name. You cannot use a combination of both fields to generate the report.

3. Click Search to generate the report and display the result in the section following the Search
and Filter section, or click Reset to clear all values from the Search and Filter section and enter
new criteria to search.
The following table describes the columns in the report.


Table 165: Fields in the Report Columns and their Descriptions

User ID: Displays the unique User ID of the user.

User Name: Displays the unique User Name of the user.

Employee ID: Displays the Employee ID of the user.

Resize and Sort Columns: See Resizing and Sorting Reports.

4. To export the report, click the Export button and select either PDF or Excel.

Figure 276: Export Menu options

12.4 Creating User Admin Activity Report


The User Admin Activity Report provides information for various activities of users.
The following is the procedure to create User Admin Activity Reports:
1. From the Reports Tiles menu, click User Admin Activity Reports to display the User Admin
Activity Report window.
2. In the Search and Filter section, enter details in the fields in the User Admin Activity Report window.
The following table describes the fields in the User Admin Activity Report window.

Figure 277: Fields in the User Admin Activity Report window and their Descriptions

User ID: Click the User ID field to display a drop-down list of User IDs. Select All to display the report for all users in the system, or select a specific User ID to display the report for the selected User ID.

User Name: Click the User Name field to display a drop-down list of User Names. Select All to display the report for all users in the system, or select a specific User Name to display the report for the selected User Name.

Note: You can select either User ID or User Name. You cannot use a combination of both fields to generate the report.

From Date: Select the start date for the report from the Date editor.

To Date: Select the end date for the report from the Date editor.


3. Click Search to generate the report and display the result in the section following the Search
and Filter section, or click Reset to clear all values from the Search and Filter section and enter
new criteria to search.
The following table describes the columns in the report:

Table 166: Fields in the Report Columns and their Descriptions

User ID: Displays the unique User ID of the user.

User Name: Displays the unique User Name of the user.

Profile Name: Displays the name of the profile for the user.

Activity: Displays the type of activity performed on the user by the administrator.

Maker ID: Displays the User ID of the administrator performing the activity for the user.

Checker ID: Displays the User ID of the administrator performing the checker activity.

Maker Date: Displays the date and time of performing the activity by the maker.

Resize and Sort Columns: See Resizing and Sorting Reports.

4. To export the report, click the Export button and select either PDF or Excel.

Figure 278: Export Menu options

12.5 Creating User Access Report


The User Access Report provides information for the access rights of Users based on the Role and
Group Mapping.
The following is the procedure to create User Access Reports:
1. From the Reports Tiles Menu, click User Access Reports to display the User Access Report
Window.
2. In the Search and Filter Section, enter details in the fields in the User Access Reports Window.
The following table describes the fields in the User Access Reports Window:

Figure 279: Fields in the User Access Reports Window and their Descriptions

User ID: Click the User ID Field to display a drop-down list of User IDs. Select All to display the report for all Users in the system, or select a specific User ID to display the report for the selected User ID.

User Name: Click the User Name Field to display a drop-down list of User Names. Select All to display the report for all Users in the system, or select a specific User Name to display the report for the selected User Name.

Note: You can select either User ID or User Name. You cannot use a combination of both fields to generate the report.

Note: You must apply the 33150367 One-off Patch from My Oracle Support to view the following check boxes in the search criteria: Group, Role, and Function.

Group: Select the check box to apply the Group Filter to the report. All the Groups mapped to the selected user are displayed.

Role: Select the check box to apply the Role Filter to the report. All the Groups and Roles mapped to the selected user are displayed.

Function: Select the check box to apply the Function Filter to the report. All the Groups, Roles, and Functions mapped to the selected user are displayed.

Note: You can select the Group, Role, and Function check boxes to filter your reports. Selecting any of the check boxes disables selection for the remaining check boxes. For example, selecting Group disables the Role and Function check boxes.

3. Click Search to generate the report and display the result in the section following the Search
and Filter Section or click Reset to clear all values from the Search and Filter Section and enter
new criteria to search.
The following table describes the columns in the report:

Table 280: Fields in the Report Columns and their Descriptions

User ID: Displays the unique User ID of the User.

User Name: Displays the unique User Name of the User.

Group Name: Displays the Group Name that the User is mapped to.

DSN ID: Displays the Data Source Name (DSN).

Segment Code: Displays the Segment Code.

Role Name: Displays the Role Name that the User is mapped to. Note: This field does not appear in the Window when you select the Group check box.

Function Name: Displays the Function that the User can access. Note: This field does not appear in the Window when you select the Group or Role check boxes.


Additional generic features available on the User Access Report Window are as follows:
To export the report, click the Export Button and select either PDF or Excel.

Figure 281: Export Menu Options

Select the Up or Down Icons from the header for a required column to sort the records in
ascending or descending order.
For more information, see Resizing and Sorting Reports.
Enter a number in the Go to Page Field or select a Page from the displayed numbers in the
footer to navigate to a specific record page. Or use the First, Previous, Next, and Last Buttons
to navigate between the list of records displayed across multiple pages.
However, after you use the navigation features in the footer, the sorting feature in the header
does not apply.
The default value for this field is 5 records per page.

12.6 Creating Audit Trail Report


The Audit Trail Report provides details for user activities in the application, such as login and add actions, the status of the action, and the machine name.
The following is the procedure to create Audit Trail Reports:
1. From the Reports Tiles menu, click Audit Trail Reports to display the Audit Trail Report window.
2. In the Search and Filter section, enter details in the fields in the Audit Trail window.
The following table describes the fields in the Audit Trail window.

Table 167: Fields in the Audit Trail window and their Descriptions

User Name: Click the User Name field to display a drop-down list of User Names. Select All to display the report for all users in the system, or select a specific User Name to display the report for the selected User Name.

Action: Click the Action field to display a drop-down list of actions in the application that users can perform. Select All to display the report for all actions in the system, or select a specific action to display the report for the selected action.

From Date: Select the start date for the report from the Date editor.

To Date: Select the end date for the report from the Date editor.

Action Detail: Enter a few characters to search for a user name and select the required name.


3. Click Search to generate the report and display the result in the section following the Search
and Filter section, or click Reset to clear all values from the Search and Filter section and enter
new criteria to search.
The following table describes the columns in the report:

Table 168: Fields in the Report Columns and their Descriptions

User ID: Displays the unique User ID of the user.

Action Code: Displays the type of action performed by the user.

Action Subtype: Displays the sub type of the action.

Status: Displays the status of the action. The values are successful or failure.

Action Details: Displays the details for the action performed.

Operation Time: Displays the date and time for the action performed.

Workstation: Displays the IP address of the machine from which the action was performed.

Resize and Sort Columns: See Resizing and Sorting Reports.

4. To export the report, click the Export button and select either PDF or Excel.

Figure 282: Export Menu options

12.7 Resizing and Sorting Reports


The generated reports display data in the section following the Search and Filter section. You can resize the columns and sort the data in the columns. The following list describes the procedure to use these features:
1. Access any of the reports. See Accessing Reports for more information.
2. Select and enter data in the fields, and click Search to generate the report. The results are displayed in the section following the Search and Filter section.
3. To resize the columns, right-click to view the Resize and Sort Column option.

Figure 283: Resize and Sort Column options

4. Select and click Resize to view the options for Resize. Select Resize Width.


Figure 284: Resize options

5. Similarly, to Sort Columns, right-click to view the Resize and Sort Column option.
6. Select Sort Columns to view the options Sort Column Ascending and Sort Column Descending, and select the required sorting order.

Figure 285: Sort Columns options

7. You can also sort the columns in ascending or descending order by clicking on the column
headers.


13 Object Administration
Object Administration is an integral part of the Infrastructure system and facilitates system administrators to define the security framework with the capacity to restrict access to the data and metadata in the warehouse, based on a flexible, fine-grained access control mechanism. These activities are mainly done at the initial stage and then on a need basis.
This section deals with the workflow of the Infrastructure Administration process, with related procedures to help you configure and manage the administrative tasks effectively.
You (System Administrator/System Authorizer) need to have SYSATH, SYSADM, and METAAUTH
function roles mapped to access the Object Administration framework within the Infrastructure
system.
Object Administration consists of the following sections. Click the links to view the sections in detail.
• Object Security
• Object Migration
• Translation Tools
• Utilities

13.1 Access Object Administration and Utilities based on Information Domain
Access to Object Administration and Utilities tile menu items on the Administration window is role-
based. System Administrators must have the required permissions to access Object Administration
and Utilities. Select an Information Domain from the drop-down list and then click on Object
Administration or Utilities to access the submenu. The following illustration shows the menu items
and the Information Domain drop-down:

Figure 286: Administration window


Alternatively, the Information Domain drop-down list is also available at the top of the Navigation List.
Click on the Hamburger icon to access the Navigation List. The following illustration shows the
Information Domain drop-down on the Navigation List:

Figure 287: Navigation List – Information Domain window

13.2 Object Security Concept in OFSAAI


The Object Security framework is based on a waterfall model for determining a user's rights to perform an action on an object in the system. That is, if you do not have the top level of object access type, there is no need to check the second level, whereas if you have the top level, then the next level down is checked. The security levels are as follows:
• User Group Authorization
• User Group Scope
• User Group Access Right
• Object Access Type
For Segment/Folder based objects, security will be impacted by the type of the object’s associated
folder.
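The waterfall evaluation described above can be sketched as follows. This is an illustrative model only; the level names come from the list above, but the function and its inputs are assumptions, not OFSAAI APIs:

```python
def check_access(levels):
    """levels: ordered (level_name, passed) pairs, highest level first.
    Evaluation stops at the first level that denies access."""
    for name, passed in levels:
        if not passed:
            return False, name  # denied here; lower levels are not checked
    return True, None

result = check_access([
    ("User Group Authorization", True),
    ("User Group Scope", True),
    ("User Group Access Right", False),  # hypothetical denial
    ("Object Access Type", True),        # never evaluated in this example
])
print(result)  # (False, 'User Group Access Right')
```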

13.2.1 User Group Authorization


User authorization is derived from the user being mapped to a User Group that has a Role with access rights for the module for a specific information domain. The mappings between User Group and Role(s), and between User Group and Domain (Infodom/Folder), are used to achieve this. The Access role enables access to the module/object's main menu link and visibility of the Object Summary Page.

NOTE Objects to be displayed in the Summary window for a specific user will be decided by the type of the folder to which the object belongs.

13.2.2 User Group Scope


This is applicable to Folder-based object types. It governs the visibility of object definitions in Summary Pages and in selectable object definitions within parent objects. For Folder-based object types, user scope is controlled by the segment/folder types with which the object is associated. Folder types are Public, Shared, and Private:

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 593


OBJECT ADMINISTRATION
OBJECT SECURITY CONCEPT IN OFSAAI

• Objects contained in a Public folder will be displayed in the Summary window and in object
selection lists to all users, irrespective of user group mapping. No mapping is required.
• Objects contained in a Shared folder will be displayed in the Summary window and in object
selection lists to users belonging to the user groups which are mapped to the corresponding
folder. The mapping is done from the User Group Folder Role Map window.
• Objects contained in a Private folder will be displayed only to the associated owner (an
individual user).
Consumption within Higher Objects
• A user can consume objects associated with Public folders in another higher object, provided
the Read Only role is mapped to the user group in that folder. This mapping is done through the
User Group Role Map window. For objects in Shared folders also, the Read Only role should be
mapped. This mapping is done through the User Group Folder Role Map window.
For example, consider a Run definition in which a Classification Rule is used. Suppose the
classification rule, say X, is created in a Public folder called Y, and the user belongs to user group
UG. Then, for the user to use rule X in the Run definition, the user group UG should be
mapped to the "Rule Read Only" role. But if rule X is created in a Shared folder Z, the user group
UG should be mapped to the folder Z and to the "Rule Read Only" role.
Folder Selector Behavior
The folders displayed in the Folder Selector window launched from the Object definition window are:
• All Public and Shared folders which are mapped to the user group and on which the user group
has the Write role. Mappings should be done for Public folders through the User Group Role Map
window and the User Group Domain Map window, and for Shared folders through the User
Group Folder Role Map window.
• All Private folders for which you are the owner.

13.2.3 User Group Access Right


This governs the actions that can be performed on an object type. For objects which do not have the
Folder concept, User Group-Role mappings govern object access and the actions that can be
performed on the object.
For objects having the Folder concept, the actions that you can perform depend on the type of the
folder/segment with which the object definition is associated. Folder types are Public, Shared, and
Private:
• For an object contained in a Public folder, the actions which can be performed by the user
depend on the mapping between user group and folder-Infodom and mapping between user
group and function- roles. For visibility in selection lists in parent objects, the User Group must
have at least Read access for the selected object type. For mapping a user group to domain, see
User Group Domain Map and for mapping a user group to a role, see User Group Role Map.
• For an object contained in a Shared folder, the actions which can be performed by the user
depend on User Group Folder Role mapping, which is done from the User Group Folder Role
Map window.
• For an object contained in a Private folder, the user who has been assigned as the owner of the
folder can perform all actions except the Add action.


13.2.4 Object Access Type


Object Access Type determines the special functionalities which can be performed on object definitions by
a user. It determines whether a user can perform operations such as create, view, update, or delete on an
object definition.
OFSAAI supports two access types:
1. Read Only
The user who creates the object sets this property at the object definition level, which restricts other
users from performing Create/Update/Delete operations on the object. Other users can only view the
object details.
2. Read/Write
The user who creates the object sets this property at the object level, which allows other users to
perform Create/Read/Update/Delete operations on the object.
Since single-user maintenance of an object is too restrictive, an override option is provided through the
Phantom role type. If the user group to which the user belongs is mapped to the Phantom role type,
then the user will be able to perform CRUD operations irrespective of the object access type. Both the
Phantom and Write roles should be mapped to the user group.
The Phantom role can be applied at two different levels.
• User Group-Infodom level (applicable to Public Folders)
Map the user group to infodom-folder from User Group Domain Map window and map the user
group to the Phantom role for the required function from the User Group Role Map window. For
example, for a user to override object access type, his user group should be mapped to the
folder in which the object is created and should have been mapped to the Phantom role,
provided the folder in which the object is created is a Public folder. For information on how to
do the mapping, see User Group Domain Map and User Group Role Map sections.
• User Group-Folder-Role level (applicable to Shared Folders)
Map the user group to infodom-folder and then map it to the Phantom role for the required
function from the User Group Folder Role Map window if the folder in which the object is created
is a Shared folder. For information on how to do the mapping, see User Group Folder Role Map
section.

13.3 OFSAA Seeded Security


OFSAA provides various predefined security data such as seeded User Groups, Roles, and the
Functions mapped to those Roles.

13.3.1 OFSAA Seeded User Groups


OFSAA provides the following predefined User Groups and associated Roles for use with various
Infrastructure modules. Users mapped to these User Groups will have access as described below, for
objects in Public folders:


Seeded User Group Name, Description, and Mapped Roles:
• Guest: Users belonging to this user group will have access to the LHS menu and the
associated Summary Pages. Mapped Roles: Access.
• Business User: Users belonging to this user group will have access to the LHS menu and
associated Summary Page, and can view object definitions. Mapped Roles: Access, Read Only.
• Business Owner: Users belonging to this user group will have access to the LHS menu and
associated Summary Page, and can perform CRUD (Create/Read/Update/Delete) operations on
the objects. Mapped Roles: Access, Read Only, Write.
• Business Authorizer: Users belonging to this user group will have access to the LHS menu and
associated Summary Page, and can authorize the CRUD operations (authority to Approve or
Reject objects which require authorization). Mapped Roles: Access, Read Only, Authorize.
• Business Administrator: Users belonging to this user group will have access to the LHS menu
and associated Summary Page, can perform and authorize the CRUD operations, and can
execute and export definitions. Mapped Roles: Access, Read Only, Write, Authorize, Advanced.
• Administrator: Users belonging to this group will have full access to the system. Mapped Roles:
Access, Read Only, Write, Authorize, Advanced, Phantom.


NOTE • The behavior is relevant for Public folders only.
• For Shared folders, irrespective of the OFSAAI seeded user
groups to which you are mapped, your user group
should be mapped to the corresponding roles through
the User Group Folder Role Map window to perform
particular actions.
• For example, consider a user who belongs to the Business
Owner user group. As per the above table, he has
the Access, Read Only, and Write roles mapped by
default. That means he is assigned functions such
as Link, Summary, View, Add, Edit, Copy, Remove, and
so on. For a Public folder, he can perform all the mentioned
functions. However, for a Shared folder, he cannot perform
an action such as Add or Edit unless he is mapped to
the Write role from the User Group Folder Role Map
window.
• In the case of Shared folders, it is mandatory to map the
required Roles to the folder and user group from the
User Group Folder Role Map window.

13.3.2 OFSAA Seeded Roles


OFSAAI seeds the following predefined Roles for each object type, which are mapped to the
corresponding functions as described below:

Seeded Role Name, Role Type, and Mapped Functions:
• Access (Role Type: Access): Link, Summary
• Read Only (Role Type: Action): Summary, View, Trace, Compare, Publish
• Write (Role Type: Action): Add, Edit, Copy, Remove, MAKE_LATEST
• Authorize (Role Type: Action): Authorize
• Advanced (Role Type: Action): Execute, Export, Archive, Restore, Advanced
• Phantom (Role Type: Phantom): Ignore Access Type

For Administrative types of roles, additional roles are seeded from the Security Management System
(SMS) module.

13.3.3 OFSAA Seeded Actions and Functions


An action is a user event which triggers a function for a specific object type. Each action and
object type combination gives a function.
OFSAA seeds the following actions, which are used by different object types to define their
functions.

Seeded Action Name and Description of behavior for the resulting function:
• LINK: Access to the LHS menu link.
• SUMMARY: Access to the Summary Page.
• VIEW: Access to view the Definition Page of the object.
• TRACE: Access to trace the Definition Page of the object.
• ADD: Privilege to create an object.
• EDIT: Privilege to edit the Definition Page of the object.
• COPY: Privilege to copy the object definition.
• REMOVE: Privilege to remove the object from the system.
• PURGE: Privilege to purge the object data from the system.
• APPROVE: Privilege to authorize an object by approving it after any action has been performed.
• REJECT: Privilege to authorize an object by rejecting it after any action has been performed.
• EXECUTE: Privilege to execute the object definition.
• EXPORT: Privilege to export a definition out of the system.
• ARCHIVE: Privilege to archive a definition.
• RESTORE: Privilege to restore any archived definition.
• COMPARE: Privilege to compare any definition with another.
• PUBLISH: Privilege to publish any definition to MDB.
• LATEST: Privilege to make any authorized version of the definition the latest.
• IGNOREACCESS: Privilege to ignore the access right given by a user.
• ADVANCED: Access to object-specific special functionality.

13.4 Object Security


The Object Security submodule consists of the following sections. Click the links to view the sections in
detail.
• Metadata Segment Mapping
• Batch Execution Rights
• Object to Application Mapping

13.4.1 Metadata Segment Mapping


A Segment refers to a logically divided part of the whole object based on a specific requirement.
Metadata Segment Mapping facilitates mapping or unmapping the required business metadata
definitions such as measures, hierarchies, cubes, attributes, and maps to the selected segment within
a specific Information Domain. Based on the mapping, the users mapped to the segment are
restricted to accessing only the relevant metadata for viewing and editing during metadata
maintenance, which supports information security.
To access the Metadata Segment Mapping window, select the Information Domain from the drop-
down list, click Object Administration and select Metadata Segment Mapping.

Figure 288: Metadata Segment Mapping window


You (System Administrator) should have the SYSADM function role mapped to your user role to access
the Metadata Segment Mapping window. By default, this window displays the Information Domain Name
to which you are connected, along with the metadata details of Measure.

13.4.1.1 Mapping Metadata Definitions to Segment


You can map/unmap the required business metadata definitions to a segment available within the
selected Information Domain.
To map the required metadata definitions:
1. Select the required User Segment from the drop-down list.
2. Select the required metadata type that you want to map to the selected Segment. The available
options are Measure, Hierarchy, Cube, Attribute, or Map. The metadata of the selected
metadata type are listed in the Available Metadata pane.
3. To map or unmap the required metadata, perform the following steps:
 To map a metadata definition, select it from the Available Metadata pane and click the move
button. The metadata is added to the Selected Metadata pane. You can press the Ctrl key for
multiple selections.
 To map all the listed metadata definitions, click the move-all button.
 To remove a metadata mapping, select the metadata from the Selected Metadata pane and
click the remove button.
 To remove the entire metadata mapping, click the remove-all button.


4. Click Save to save the metadata mapping details.
The window is refreshed displaying the status of the mapping.
5. Click Show Details to view the status of each mapping.
You can modify the mapping at any point, and the mapping table is updated only on saving the
mapping details. When a metadata definition such as a measure, hierarchy, cube, map, or
attribute is removed from the Information Domain, the same is updated in the mappings
table.

13.4.2 Batch Execution Rights


The Batch Execution Rights feature allows you to map the required User Group to the defined Batches
before you execute them from the Batch Execution or Batch Scheduler window. You can map multiple
user groups in an Information Domain to different batches. If a user is mapped to multiple User
Groups, the combined list of batches mapped to these user groups is available in the Batch Execution
or Batch Scheduler window for execution.
The default User Group of a user who has created the batch has the maximum Precedence Value
among the other User Groups and is automatically mapped for execution. An explicit mapping of this
User Group to the Batch is not required.
You (System Administrator) must have SYSADM function role mapped to access the User Group-
Batch Execution Map window. To access the User Group-Batch Execution Map window, select the

Information Domain from the drop-down list, click Object Administration and select Batch
Execution Rights.

Figure 289: User Group – Batch Execution Map window

The User Group-Batch Execution Map window displays the list of defined Batches for the selected
Information Domain along with the other details such as Batch Name and Batch Description. You can
filter the list of defined batches that are created in Batch Maintenance, Enterprise Modeling, or in
Rules Run Framework. By default the list displays the batches defined in the Batch Maintenance
window.
To map User Group to the required Batch in the User Group-Batch Execution Map window:
1. Select the Information Domain from the drop-down list. By default, the window displays the
Information Domain to which you are connected.
2. Select the User Group to which you want to map the Batches, from the drop-down list.
The list consists of all the User Groups mapped to the selected Information Domain. The
window is refreshed and the list of defined batches is populated.
You can also search for a specific user group by clicking Search and specifying the User Group
Name in the Search for Group window. Click OK.
3. Select Batch Maintenance (default), Enterprise Modeling, or Rules Run Framework to filter
the list of batches. You can also select ALL to list all the defined batches for the selected
Information Domain.
4. Map the User Group to Batch(es) by doing the following:
 To map batch(es) to the selected User Group, select the Batch Map checkbox.
 To map all the batches to the selected User Group, click CheckAll.


You can also click UnCheckAll to remove all the mapping.


5. Click Save to save the User Group-Batch mapping details.

13.5 Object Migration


Object Migration is the process of migrating or moving objects from one environment to another.
You might want to migrate objects for several reasons: you might have multiple environments to
handle a global deployment, or you might want to create multiple environments to separate
development, testing, and production.
In OFSAA, Object Migration can be performed in the following ways:
• Offline Object Migration: Offline Object Migration enables you to create a dump (objects saved
in a file) and move the dump file from one environment to the other. To facilitate offline object
migration, you can either invoke the objects using a command line or use the UI-based
approach. This creates an exportable dump file for the object migration.
• Online Object Migration: Online Object Migration enables you to migrate objects between
the source and target when an OFSAA instance is up and running.

13.5.1 Offline Object Migration


Offline Migration is introduced in OFSAAI to invoke the offline migration of objects using a
Command Line Utility shell script. It also facilitates a UI-based approach to populate the dump file
offline for export or import of the objects by invoking the Command Line Utility.
This enables you to selectively pick the required objects, that is, the object migratable file, which
creates a dump file. For the list of objects that can be migrated, see the Objects Supported for
Command Line Migration section.
Offline Object Migration can be performed either by:
• Invoking shell script using the Command Line Utility
• Using the UI based approach.
Further, offline migration can be performed in either of the following ways:
• By creating an outline: In the case of an outline, the reference of the objects selected by the
user and their dependencies are stored as an outline definition. The actual object state is
extracted from the object source system while performing the export action. The object state
at the time of export is captured in the export definition.
For example, you can store the object definition with only the object reference information. This
information will be used at the time of export to determine the objects that need to be migrated.
The selection has the objects selected, with their included and excluded dependencies. For more
information, see Creating Export Outline Definition.
• By creating a snapshot: In the case of a snapshot, the entire state of the selected objects and
their dependencies is captured and stored as a snapshot definition. This state is extracted at the
time of export. A snapshot to be exported has no dependency on the object source system.
For example,


to restore the previous state of objects, you can export the objects as a snapshot, which
captures the state of the objects at the time of snapshot creation for your migration. For more
information, see Generating a Snapshot.

NOTE The REST authentication is done against the Service Account
user mentioned under the OFSAA_SRVC_ACC parameter in the
CONFIGURATION table. This user should be created with the "SMS
Auth Only" attribute from the User Maintenance window. By
default, the OFSAA_SRVC_ACC parameter is set as SYSADMN.

13.5.1.1 Prerequisites
• Folders (segments) and user groups that are designated for the import should be present in
the target.
• The source and target environments should have the same installed languages. OFSAA
supports 18 languages in total. In the case of a particular language, during export operations
only the specified objects having the same language are exported.
• OFSAA users should have access to folders (Infodom segment mapping) in target as well as
source. This access is required to get the objects in its state as available in the source, to
perform actions such as view and edit.
• Tables accessible to users in source should also exist in target.
For example, if you want to migrate a Data Element Filter based on "Table A" and "Table B" in
the source, those two tables should exist in the target.

NOTE Before you migrate F2T, migrate the respective Data Source
files to the Target Environment or create them in the Target
Environment.

• For AMHM Dimensions and Hierarchies:


 The key processing Dimensions should be the same in both the source and target
environments.
 For Member migration, the Dimension type should have the same attributes in both source
and target environments.
 Numeric Dimension Member IDs should be the same in both the source and target
environments, to ensure the integrity of any Member-based objects.

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 603


OBJECT ADMINISTRATION
OBJECT MIGRATION

NOTE If you have used the Master Table approach for loading
Dimension data and set it up to generate surrogate keys for
Members, this results in different IDs between the source and
target, so it may cause errors if you have objects which depend
on these IDs.
All objects that generate a new ID after migrating to a different
information domain, and all components which are registered
through the Component Registration window and will be used
in the Rules Run Framework (RRF), must be manually entered in the
AAI_OBJ_REF_UPDATE table in the Configuration Schema. The
attributes present in the table are:
• V_OBJECT_TYPE: EPM object type.
• V_RRF_OBJECT_TYPE: RRF object type. The ID can
be referred from the pr2_component_master table.
• V_ICC_OBJECT_TYPE: ICC object type. The ID can be referred
from the component_master table.
• F_IS_FILTER: Whether the object is to be migrated as a
filter or not.
• N_BATCH_PARAMETER_ORDER: The order of the
parameter in the task (if used in a batch).
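As a sketch of the manual entry above, the required row can be staged as an INSERT script before it is run against the Configuration Schema. Every identifier value below is a hypothetical placeholder: look up the actual RRF and ICC type IDs in the pr2_component_master and component_master tables, and verify the column list against your Configuration Schema for your release.

```shell
#!/bin/sh
# Sketch only: stage an INSERT for the AAI_OBJ_REF_UPDATE table in the
# Configuration Schema. Every value below is a hypothetical placeholder.

OBJECT_TYPE="MY_EPM_TYPE"   # V_OBJECT_TYPE: EPM object type (placeholder)
RRF_TYPE="MY_RRF_TYPE"      # V_RRF_OBJECT_TYPE: look up in pr2_component_master
ICC_TYPE="MY_ICC_TYPE"      # V_ICC_OBJECT_TYPE: look up in component_master
IS_FILTER="N"               # F_IS_FILTER: Y if the object is migrated as a filter
PARAM_ORDER=1               # N_BATCH_PARAMETER_ORDER: parameter order in the task

# Write the SQL to a file so it can be reviewed before execution:
cat > aai_obj_ref_update.sql <<EOF
INSERT INTO AAI_OBJ_REF_UPDATE
  (V_OBJECT_TYPE, V_RRF_OBJECT_TYPE, V_ICC_OBJECT_TYPE,
   F_IS_FILTER, N_BATCH_PARAMETER_ORDER)
VALUES
  ('${OBJECT_TYPE}', '${RRF_TYPE}', '${ICC_TYPE}',
   '${IS_FILTER}', ${PARAM_ORDER});
COMMIT;
EOF

echo "Wrote aai_obj_ref_update.sql; review it, then run it against the Configuration Schema."
```

Review the generated script and then execute it against the Configuration Schema with your usual SQL client, for example SQL*Plus.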

13.5.1.2 Exporting Objects


Exporting Objects allows you to export a set of objects to migrate across Information Domains within
the same setup or across different setups. You can select one or more objects within an object type or
within multiple object types and migrate the same along with or without the dependent objects.
The roles mapped to Object Migration Export are as follows:
• OMEXREAD: This role provides you read access for the objects that are migrated.
• OMEXWRITE: This role provides you edit or write access for the objects that are migrated.
• OMEXADVND: This role provides you advanced access for the objects that are migrated.
For all the roles, functions and descriptions, see Appendix A.

Figure 290: Object Migration Export Summary window

The Object Migration Export Summary window displays the list of pre-defined Export Definitions with
their Outline Id and Dump Name. By clicking the Column header names, you can sort the column
names in ascending or descending order. You can add, view, edit, copy, export, delete, and generate


snapshot for the Export Definition. You can search for a specific Export Definition based on the
Outline Id or Dump Name.

13.5.1.2.1 Creating an Export Definition

An Export definition can be created based on either of the following approaches:
• Creating Outline Definition
• Creating Snapshot Definition

13.5.1.2.2 Viewing an Export Definition

You can view individual Export definition details at any given point.
To view an existing Export definition, perform the following steps:
1. From the Object Migration Export Summary window, click the Menu Button and select View.
The Export Objects window is displayed. The Export Objects window displays the details of the
selected Export definition like Outline Id, Dump Name and the objects selected for exporting.

13.5.1.2.3 Editing an Export Definition

You can update the existing Export definition details except the Outline Id.
You can add more objects for exporting or remove the existing objects.
To modify the Export definition, perform the following steps:
1. From the Object Migration Export Summary window, click the Menu Button and select Edit.
The Export Objects window is displayed.
2. Update the required details. For more information, see Creating Export Definition. You can also
exclude dependencies, or include dependencies which were previously excluded, in your export
definition. For more information, see Viewing and Excluding Dependencies in Outline Definition.
3. Click Save to update the changes.

13.5.1.2.4 Copying an Export Definition

This option allows you to quickly create a new Export definition based on an existing Export definition.
You need to provide a new Outline Id and can modify other required details.
To copy an existing Export definition, perform the following steps:
1. From the Object Migration Export Summary window click the Menu Button and select Copy.
The Export Objects window is displayed.
2. Enter a unique Outline Id to identify the status of the migration process.
3. Update other details if required.
For more information, see Creating Export Definition.
4. Click Save.

13.5.1.2.5 Deleting an Export Definition

To delete an Export definition, perform the following steps:


1. From the Object Migration Export Summary window, select the Export definition that you want
to delete and click Delete.
A confirmation message is displayed.
2. Click Yes. The definition gets deleted.

13.5.1.3 Outline Definition of Objects


Creating an outline enables you to migrate objects based on your selection, capturing the latest
state of the objects for migration.

13.5.1.3.1 Creating Export Outline Definition

Exporting Objects allows you to export a set of objects to migrate across Information Domains within
the same setup or across different setups.
To create an export outline definition, perform the following steps:
1. Click the Add button in the Object Migration Export Summary window.
The Outline Definition window is displayed.


Figure 291: Outline Definition window

2. Select Outline and specify the following details:


 Outline Name
 File Name
3. Click Apply.
The Add Object To Outline window is displayed.
4. Select the object types that you want to add from the Object Types drop-down.
5. Select the objects from the object type results.

Figure 292: Add Object To Outline window


The selected objects are displayed in a hierarchy in the Outline pane. In this example, the Object
Types selected are Aliases and Business Hierarchies.
6. Click Save.
The Export definition is available in the Object Migration Export Summary window.
7. Click the Menu Button and select Export to execute.
8. A confirmation message is displayed. Click Ok to trigger the export process.
The dump file will be created in the /ftpshare/ObjectMigration/metadata/archive folder.
You can view the logs in the /ftpshare/ObjectMigration/logs/ folder.
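The export output above can be checked from a shell on the OFSAA server. The snippet below is a sketch: the ftpshare root and the file names are assumptions, and a demo layout is created so the listing commands are safe to try anywhere before you point them at the real directories.

```shell
#!/bin/sh
# Sketch: inspect the export output folders. FTPSHARE_ROOT is an assumption;
# change it to the real ftpshare directory on your server. A demo layout with
# hypothetical file names is created here so the listing can be tried safely.

FTPSHARE_ROOT="/tmp/ftpshare_demo"   # replace with your actual ftpshare path
ARCHIVE_DIR="$FTPSHARE_ROOT/ObjectMigration/metadata/archive"
LOG_DIR="$FTPSHARE_ROOT/ObjectMigration/logs"

# Simulated output of an export run (hypothetical names):
mkdir -p "$ARCHIVE_DIR" "$LOG_DIR"
touch "$ARCHIVE_DIR/MyOutline.DMP" "$LOG_DIR/MyOutline_export.log"

# After clicking Export, the newest dump and log files would be:
ls -t "$ARCHIVE_DIR" | head -1
ls -t "$LOG_DIR" | head -1
```

On a real server, skip the mkdir/touch lines and run only the two ls commands against the actual folders.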

13.5.1.3.2 Viewing and Excluding Dependencies in Outline Definition

When creating an outline, you can specify its dependencies by using the dependency option. This
enables you to review the dependent objects with the outline.
To view the dependency, perform the following steps:
1. Click the Add button in the Object Migration Export Summary window.
The Outline Definition window is displayed.


Figure 293: Outline Definition window

2. Select Outline and specify the following details:


 Outline Name
 File Name
3. Click Apply. The Add Object To Outline window is displayed.
4. Select the object types that you want to add from the Object Types drop-down.
5. Select the objects from the object type results.
The selected objects are displayed in a hierarchy in the Outline pane. In this example, the Object
Type selected is Rule.

Figure 294: Add Object To Outline window


6. Click Show Dependency. The Show Dependency window is displayed.

Figure 295: Show Dependency window

You can exclude or include self and child objects from this window.
7. Right click the object and choose whether to include or exclude from the following options:
 Exclude Children
 Include Self and Children
 Exclude Self and Children

Figure 296: Show Dependency window


8. Based on your selection, the objects are excluded or included. When the rule is migrated, the
excluded or included objects are applied accordingly during execution.
9. Click the Object Selection link to go back to the previous window.
10. Click Save to save the outline.

Figure 297: Add Object To Outline window

11. Click the Menu Button and select Export to trigger the export.
After exporting the objects you can perform the following tasks:
 Viewing an Export Definition
 Editing an Export Definition


 Copying an Export Definition


 Deleting an Export Definition

13.5.1.4 Snapshot Definition of Objects


The Snapshot Definition of Objects is UI-based object migration support for capturing the state of the
system at a given point. You can restore objects from snapshots whenever required.

13.5.1.4.1 Generating a Snapshot

Generating a snapshot saves the details of the objects at a given point, and you can restore the
snapshot whenever required.
To generate a snapshot, perform the following steps:
1. Click the Add button in the Object Migration Export Summary window. The Outline Definition
window is displayed.

Figure 298: Snapshot Definition window

2. Select Snapshot and specify the name of the Snapshot.


The Add Object To Snapshot window is displayed.
3. Select the object that you want to add to the snapshot.


Figure 299: Add Object To Snapshot window

4. Click Save Snapshot.
The saved snapshot is available in the Object Migration Export Summary window.
5. Click the Menu button. You can select Export to export the snapshot or View Log to see the
latest changes.

Figure 300: Add Object To Snapshot window

NOTE • Snapshots cannot be edited or deleted, only exported.
• You can also generate a Snapshot from an existing
Outline Definition.


13.5.1.4.2 Generating Snapshot for the Export Definition

To create a snapshot for the export definition, perform the following steps:
1. From the Object Migration Export Summary window, select the Export definition for which you
want to create a snapshot and click Generate Snapshot.

Figure 301: Export Object window

2. Specify the Snapshot Name and click OK.

Figure 302: Generate Snapshot window

The saved snapshot is available in the Object Migration Export Summary window.
3. Click the Menu Button. You can select Export to export the snapshot or View Log to see the
latest changes.

13.5.1.4.3 Exporting Snapshot into an Archive

The saved snapshot is available in the Object Migration Export Summary window. You can click the
Menu Button and select Export to export the snapshot. The archive is created in FTPSHARE, and you
can view the log files for details.
EXAMPLE PATH: /scratch/ofsaa/ftpshare/ObjectMigration/metadata/archive.


13.5.1.5 Importing Objects


This feature allows you to import objects into your target environment from the archived dump file. The
dump file from the source environment should be downloaded and moved into the
/ftpshare/ObjectMigration/metadata/restore folder in the target system. This folder structure needs
to be created manually.
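The staging step above can be sketched from a shell on the target system. The dump file name and the demo ftpshare root below are assumptions; substitute the actual paths for your installation.

```shell
#!/bin/sh
# Sketch: stage a dump file from the source environment for import on the
# target system. The ftpshare root and dump name below are placeholders.

FTPSHARE_ROOT="/tmp/ftpshare_demo"   # replace with your actual ftpshare path
RESTORE_DIR="$FTPSHARE_ROOT/ObjectMigration/metadata/restore"

# The restore folder structure must be created manually, as noted above:
mkdir -p "$RESTORE_DIR"

# Simulate a dump downloaded from the source environment (hypothetical name):
touch /tmp/MyOutline.DMP
mv /tmp/MyOutline.DMP "$RESTORE_DIR/"

ls "$RESTORE_DIR"   # the dump is now selectable in the Import Definition window
```

On a real target system, replace the touch line with the actual download of the dump file from the source environment.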
The roles mapped to Object Migration Import are as follows:
• OMIMREAD: This role provides you read access for the objects that are migrated.
• OMIMWRITE: This role provides you edit or write access for the objects that are migrated.
• OMIMADVND: This role provides you advanced access for the objects that are migrated.
For all the roles, functions and descriptions, see Appendix A.

Figure 303: Object Migration Import Summary window

The Object Migration Import Summary window displays the list of pre-defined Import Definitions with
their Outline ID and Dump Name. By clicking the column header names, you can sort the columns in
ascending or descending order. You can add, view, edit, copy, and delete Import Definitions.
You can search for a specific Import Definition based on the Outline ID and Dump Name.

13.5.1.5.1 Creating Import Definition from an Archive

To import objects, perform the following steps:


1. Click the Add button in the Object Migration Import Summary window. The Import Definition
window is displayed.


Figure 304: Import Objects window

2. Select the dump file from the drop-down list. It displays the dump files in the
/ftpshare/ObjectMigration/metadata/restore folder. The objects in the dump file are
displayed in the Available Objects pane.
3. Select the required Folder from the drop-down list. This is the default target folder if an
object-specific Folder is not provided. If neither Folder is specified, the source folder
available in the exported dump file is used as the target folder.
4. Select Yes or No for Retain Ids from the drop-down list to indicate whether the IDs of the
source AMHM objects should be retained after migration.
If it is set to Yes, the behavior in different scenarios is as follows:
 Object and ID do not exist in Target: the object is created in the target environment with
the same ID as in the source.
 Object exists in Target with a different ID: the object is migrated and the ID in the target is
retained.


 ID already exists in Target with a different object: the object is migrated to the target
environment and a new ID is generated.
 Same object and ID exist in Target: the behavior depends on the Overwrite flag.
5. Turn ON the Fail On Error toggle button to stop the import process if there is any error. If it is
set OFF, the import process continues with the next object even if there is an error.
6. Turn ON the Import All toggle button to import all objects in the dump file to the target
environment.
7. Turn ON the Overwrite toggle button to overwrite any existing metadata. If it is turned OFF, the
existing object is not overwritten and migration continues with the next object.
8. Click Save.
The Import definition will be available in the Object Migration Import Summary window.
9. Select the definition and click Import to execute it.
10. A confirmation message is displayed. Click OK to trigger the import process.
You can view the logs in the /ftpshare/ObjectMigration/logs folder.
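To follow an import as it runs, you can watch that log folder from a shell; a sketch, in which the ftpshare root, the log file name, and the sample log line are all illustrative:

```shell
# Illustrative ftpshare root; substitute your environment's actual path.
FTPSHARE="${FTPSHARE:-/tmp/ftpshare-demo}"
LOG_DIR="$FTPSHARE/ObjectMigration/logs"

# Create a sample log entry so the command below has output to show;
# in a real environment the import process writes these files.
mkdir -p "$LOG_DIR"
echo "Import of outline DEMO_IMPORT completed" >> "$LOG_DIR/import.log"

# Show the most recent log entries.
tail -n 20 "$LOG_DIR"/*.log
```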

13.5.1.5.2 Creating Import Definition from a Snapshot Archive

To import objects from a snapshot archive, perform the following steps:


1. Click the Add button in the Object Migration Import Summary window. The Import Definition
window is displayed.


Figure 305: Import Definition window

2. Specify the Outline Name.


3. Select the Snapshot from the Snapshot Name drop-down.
4. Select the required Folder from the drop-down list. This is the default target folder if an
object-specific Folder is not provided. If neither Folder is specified, the source folder
available in the exported dump file is used as the target folder.
5. Turn ON the Retain IDs toggle button to retain the IDs of the source AMHM objects after migration.
If it is turned ON, the behavior in different scenarios is as follows:
 Object and ID do not exist in Target: the object is created in the target environment with
the same ID as in the source.
 Object exists in Target with a different ID: the object is migrated and the ID in the target is
retained.
 ID already exists in Target with a different object: the object is migrated to the target
environment and a new ID is generated.
 Same object and ID exist in Target: the behavior depends on the Overwrite flag.


6. Turn ON the Fail On Error toggle button to stop the import process if there is any error. If it is
set OFF, the import process continues with the next object even if there is an error.
7. Turn ON the Import All toggle button to import all objects in the dump file to the target
environment.
8. Turn ON the Overwrite toggle button to overwrite any existing metadata. If it is turned OFF, the
existing object is not overwritten and migration continues with the next object.
9. Click Save.
The Import definition will be available in the Object Migration Import Summary window.
10. Select the definition and click Import to execute it.
11. A confirmation message is displayed. Click OK to trigger the import process.
You can view the logs in the /ftpshare/ObjectMigration/logs folder.

13.5.1.5.3 Viewing an Import Definition

You can view individual Import definition details at any given point.
To view an existing Import definition, perform the following steps:
1. From the Object Migration Import Summary window, click the Menu Button and select View.
The Import Objects window is displayed.
2. The Import Objects window displays the details of the selected Import definition, such as the
Outline ID, Dump Name, and the objects selected for importing.

13.5.1.5.4 Editing an Import Definition

You can update the existing Import definition details except the Outline ID.
You can add more objects for importing or remove existing objects.
To modify the Import definition, perform the following steps:
1. From the Object Migration Import Summary window, click the Menu Button and select Edit.
The Import Objects window is displayed.
2. Update the required details.
For more information, see Creating Import Definition.
3. Click Save and update the changes.

13.5.1.5.5 Copying an Import Definition

This option allows you to quickly create a new Import definition based on an existing Import definition.
You need to provide a new Outline Id and can modify other required details.
To copy an existing Import definition, perform the following steps:
1. From the Object Migration Import Summary window, click the Menu Button and select Copy.
The Import Objects window is displayed.
2. Enter a unique Outline ID to identify the status of the migration process.
3. Update other details if required. For more information, see Creating Import Definition.


4. Click Save.

13.5.1.5.6 Deleting an Import Definition

This option allows you to delete an Import definition.


To delete an Import definition, perform the following steps:
1. From the Object Migration Import Summary window, select the Import definition that you want
to delete and click Delete.
A confirmation message is displayed.
2. Click Yes. The definition gets deleted.

13.5.1.6 Objects Supported for Migration and their Dependent Objects


The objects supported for migration are based on the following:
• Dependent Objects
• Filter SubTypes

13.5.1.6.1 Dependent Objects

The following table lists the objects that are supported for implicit dependency and the dependent
objects:

Table 169: Base Object Names and the Dependent Objects

Base Object Name Dependent Objects

DATA QUALITY RULE DERIVED ENTITY

DATA QUALITY GROUP DATA QUALITY RULE

DATA TRANSFORMATION NA

ETL DATA QUALITY RULE- This is not implemented.

DATA ENTRY FORMS AND QUERIES (DEFQ) NA

ALIAS NA

DATASET

BUSINESS MEASURE
DERIVED ENTITY
BUSINESS HIERARCHY

BUSINESS PROCESSOR

ALIAS
BUSINESS MEASURE
DERIVED ENTITY

BUSINESS DIMENSION BUSINESS HIERARCHY

DERIVED ENTITY
BUSINESS HIERARCHY
BUSINESS MEASURE


ALIAS
DATASET
DERIVED ENTITY

DATASET

BUSINESS PROCESSOR BUSINESS MEASURE

BUSINESS PROCESSOR

DATASET

ESSBASE CUBE BUSINESS MEASURE

BUSINESS DIMENSION

ORACLE CUBE NA

MAPPER Hierarchies

FORMS FRAMEWORK Child Forms

FORMS MENU FORMS and LAYOUTS

FORMS LAYOUT Forms

FORMS TAB NA

FORMS PAGE FORMS and LAYOUTS

DATASET

MEASURE

HIERARCHY

BUSINESS PROCESSOR
RULE
DATA ELEMENT FILTER

GROUP FILTER

ATTRIBUTE FILTER

HIERARCHY FILTER

EXTRACT DATA

LOAD DATA

TRANSFORM DATA

RULE

PROCESS PROCESS

CUBE

DATA QUALITY GROUP

VARIABLE SHOCK

MODEL

RUN EXTRACT DATA


LOAD DATA

TRANSFORM DATA

RULE

PROCESS

RUN

CUBE

DATA QUALITY GROUP

VARIABLE SHOCK

MODEL

DATA ELEMENT FILTER

GROUP FILTER

ATTRIBUTE FILTER

HIERARCHY FILTER

BATCH Not implemented

MEMBERS
DIMENSION
ATTRIBUTES

BUSINESS HIERARCHY

FILTER ATTRIBUTES

FILTER

EXPRESSION EXPRESSION

AMHM HIERARCHY Members

SANDBOX 2 NA

BUSINESS HIERARCHY

BUSINESS MEASURE
VARIABLE
BUSINESS PROCESSOR

DATASET

TECHNIQUE NA

VARIABLE

VARIABLE SHOCK DATASET

BUSINESS HIERARCHY

SCENARIO VARIABLE SHOCK

TECHNIQUE
MODEL
VARIABLE


DATASET

BUSINESS HIERARCHY

DataElement Filter

RUN
STRESS
SCENARIO

CATALOG PUBLISH NA

USER PROFILE

USER GROUP USER

ROLE FUNCTION

FUNCTION NA

PROFILE NA

PMF PROCESS NA

13.5.1.6.2 Filter SubTypes

The following table describes the Object Name and Object SubType ID.

Table 170: Object Name and Object SubType ID

Object Name Object SubType ID

DataElement Filter 4

Hierarchy Filter 8

Group Filter 21

Attribute Filter 25

13.5.2 Online Object Migration


Objects refer to the various definitions defined in the Infrastructure and Financial Services
applications. Object Migration framework within the Infrastructure facilitates you to define a set of
objects to migrate across Information Domains within the same setup or across different setups.
You can select one or more objects within an object type, or within multiple object types, and migrate
them along with the dependencies of the selected objects automatically. For example, if you explicitly
select a Group Filter, the migration automatically includes the Data Element Filters that are
referenced as dependents within that Group Filter.
The following object types are available:
• Infrastructure UAM Objects such as Alias, Business Processor, Essbase Cube, Datasets, Business
Measures, Business Hierarchy, Business Dimension, Data Quality Rule and Data Quality Group.


• Financial Services Applications infrastructure objects such as Dimension, Hierarchy, Filter, and
Expression Rule.
• You can also migrate objects which are specific to applications such as Asset Liability
Management, Funds Transfer Pricing, or Profitability Management, if you have installed those
applications.

NOTE Apart from this method, you can migrate objects through
Command Line Utility to Migrate Objects or Offline Object
Migration (UI Based) process based on whether the objects you
want to migrate are supported in that approach.

The following are the prerequisites for working with Object Migration:


• Both the Source and Target should have the same OFSAA version number.
• Folders (Segments) that are present in the Source should also be present in the Target.
• The Source and Target environment should have the same installed locales for migration.
• Users in Source should be the same in Target (at least for users associated with the migrated
objects).
• Users should have access to Folders in Target similar to the access in Source.
• Tables accessible to users in Source should also exist in Target.
For example, if you want to migrate a Data Element Filter based on "Table A" and "Table B" in
the Source, those two tables should exist in the Target.
• The key processing Dimensions should be the same in both the Source and Target
environments.
• For member migration, the dimension type should have the same Attributes in both Source and
Target environments.
• Numeric dimension member IDs should be the same in both the Source and Target
environments, to ensure the integrity of any member-based assumptions you want to migrate.

NOTE If you have used the Master Table approach for loading
dimension data and set it up to generate surrogate keys for
members, the resulting IDs differ between the Source and
Target. This may cause errors if you try to migrate objects
that depend on these IDs.

• Migration of Infrastructure UAM Objects happens over a secure Java Socket based
communication channel. To facilitate effective communication between the Source and Target
systems, and to display the UAM objects from the source, you need to import the SSL
certificate of the Source into the Target. For information on importing the SSL certificate, see
How to Import SSL Certificate for Object Migration (Doc ID 1623116.1).


• For Object migration across setups, the migration process should always be triggered from the
target setup. You need to log in to the target setup and select the required information domain.
Object Migration works like an IMPORT into the Target. Therefore, when migrating objects
within the same setup across Information Domains, you must be logged into the Target
Information Domain to migrate the objects.
• Before migrating a DQ Group, ensure the DQ Rules present in that DQ Group are unmapped
from all other groups in the target. That is, if a DQ Rule is mapped to one or more DQ Groups in
the target, then it has to be unmapped from all the groups before migration.
• The following object types will not be migrated with their parent objects even though they are
registered as dependencies:
 Currencies registered as dependents of Interest Rate Codes (IRCs).
 Dimension Members registered as dependents.
Ensure that these dependencies exist in the target environment prior to the migration of parent
object.
You (the AAI System Administrator) need to have the FU_MIG_HP function role mapped to access the
Object Migration framework within Infrastructure.

The Object Migration Summary window displays the list of pre-defined Object Migration rules with the
other details such as Name, Folder, Source Infodom, Access Type, Modification Date, Last Execution
Date, Modified By, and Status. You can use the Search option to search for a required Object Migration
rule based on the Name or Folder in which it exists. The pagination option helps you to view the list of
existing Object Migration rules within the system.
In the Object Migration Summary window you can do the following:
• Defining Source Configuration
• Creating Object Migration Definition
• Viewing Object Migration Definition
• Modifying Object Migration Definition
• Copying Migration Rules
• Migrating Stored Object Rules
• Viewing Migration Execution Log


13.5.2.1 Defining Source Configuration


You can define a source configuration by specifying the database connection details and user
credentials to access the database. You can also edit a pre-defined Source configuration.
To define a Source Configuration in the Object Migration Summary window:

1. Click Configuration from the Object Migration tool bar. The Source Configuration window
is displayed with the pre-configured database details.

You can also click View Configuration to view the pre-configured database details.
2. Click the button adjacent to the Name field. The window is refreshed and enables you to enter
the required details.

Figure 306: Source Configuration window

3. Enter a Name for the source connection and add a brief Description.
4. Enter the Source Database details as tabulated:

Table 171: Fields in the Source Configuration window and their Descriptions

Field Description

Fields marked with a red asterisk (*) are mandatory.

JDBC Driver Name: Enter the JDBC (Java Database Connectivity) driver class name configured by the
administrator to connect to the database. For example, oracle.jdbc.driver.OracleDriver.

JDBC Connection String: Enter the connection string in the following format:
"jdbc:oracle:thin:@<hostname>:<port>:<servicename>".

User ID: Enter the user ID required to access the database.


Password: Enter the password required for authentication.

Web Server URL: Enter the web server URL in the format "https://<hostname>:<port>/<domain>".

Source Infodom: Enter the source Information Domain on which the database exists.

5. Click Validate to validate the specified configuration details.
6. Click Save to save the Source Definition details.
The Audit Trail section at the bottom of the Source Configuration window displays the metadata
information about the source definition created.
You can also edit a pre-defined Source Definition by selecting the required source definition from the
Name drop-down list, editing the details, and clicking Save.
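Filled in, a source configuration might look like the following. All values here are illustrative examples, not defaults:

```
Name                   : UAT_SOURCE
JDBC Driver Name       : oracle.jdbc.driver.OracleDriver
JDBC Connection String : jdbc:oracle:thin:@uat-dbhost:1521:OFSAAUT
User ID                : ofsaa_conf
Password               : ********
Web Server URL         : https://uat-apphost:9443/OFSAAI
Source Infodom         : UATINFODOM
```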

13.5.2.2 Creating Object Migration Definition


You can create an Object Migration definition in the target setup and launch the migration from the
definition, or save the definition details and execute the migration process at a later point.
• If source objects exist in the target setup, the objects are migrated only on selection of
Overwrite Object option in Object Migration definition window.
• If source objects do not exist in the target setup, then the objects are created in the target setup.
The dependent objects are migrated first and then the parent objects.
To create an Object Migration definition:
1. Click Add button from the Object Migration tool bar. The New - Object Migration window is
displayed.

Figure 307: New Object Migration window

2. Enter the Object Migration details as tabulated:


Table 172: Fields in the Object Migration Details window and their Descriptions

Field Description

Fields marked with a red asterisk (*) are mandatory.

Folder: Select the required folder from the drop-down list. This folder refers to the folder associated
with the Object Migration rule.

Access Type: Select one of the following options:
• Read-Only: Select this option to give other users access to only view the Object Migration
definitions.
• Read/Write: Select this option to give other users access to view, modify (including Access
Type), and delete the Object Migration definitions.

Name: Enter a name for the Object Migration definition. Ensure that there are no special characters or
extra spaces in the name specified.

Description: Enter a brief description of the definition.

Source: Select the required source configuration from the drop-down list. The list displays the
available source configurations that are created from the Configuration window.

Overwrite Object: Select this checkbox to overwrite the target data if source objects exist in the
target setup.

Object Selection and Placement

After you select an object type from the Migration rule's LHS menu, the Object Selection and Placement
section displays the following options related to that object type:

Source Segment/Folder: This field is displayed if you have selected a segment/folder-based object
type. Select the required source segment/folder from the drop-down list. All the registered objects for
the selected source segment/folder are displayed in the Source Infodom table.
Note: If you leave Source Folder blank, the Source Infodom table displays all objects in all the folders
to which you have access in the source environment.

Object-type specific selections (for example, Filter Type): For some object types, there are additional
selections. For example, if you select the object type as Filters, you can select the required Filter Type
from the drop-down list. The Source Infodom table displays all objects belonging to the selected Filter
Type. If you leave Filter Type blank, all filters are displayed.


Target Folder: This field is displayed if you have selected a segment/folder-based object type. The
target folder is the folder to which the selected objects are migrated.
Select the Same as Source option to migrate the objects to the same folder as the source folder. By
default, Same as Source is selected. Select the required folder from the drop-down list if you want a
folder other than the source folder.
Consider the following scenarios to understand how the parent and dependent objects are migrated
to the selected Target Folder.
• Dependent objects are migrated either implicitly or explicitly.
 Implicit Migration: This occurs when the dependents are not explicitly selected. The
dependent is migrated automatically if its parent is selected (regardless of whether it is
folder-based). For folder-based objects, the dependent migration uses "Same as Source"
logic: it uses a Target Folder matching the dependent's Source Folder.
 Explicit Migration: When you need to migrate the dependent objects to a specific folder
(different from the dependent's Source Folder), explicitly select the dependent object and
the desired Target Folder for it.
Note: Explicit selection takes precedence over implicit migration for a dependent.
For folder-based objects, a dependent object does not inherit the parent's Target Folder. This
logic avoids the potential for unintended duplicates; that is, an object could be a dependent of
multiple parent objects, and those parents could each be targeted for a different folder.
An auto validation is done to check whether the Target Folder exists. If it does not exist:
• The object is not migrated.
• The object's parents (if any) are not migrated, regardless of whether the child is implicitly or
explicitly selected for migration.
• If the object has children whose migration could be valid (that is, a valid Target Folder and valid
dependents, if any), then migration is done by migrating a child prior to its parent to ensure the
integrity of the parent.


Source Infodom Table: All available objects are displayed based on your selection of object type and
(if applicable) source segment/folder.
• Select the checkbox corresponding to the required object and click the move button to migrate
the object to the target folder. You can also double-click to select the required object.
• Click the move-all button to select all the listed objects for migration.
• You can use the Search and pagination options to find the required object. Click the Search
button and enter the name or description in the Search window. Use the Reset button to clear
the search criteria.
• Use the find button to locate an object displayed on the current page.

Target Infodom Table: All objects which you have selected for migration are displayed.
• Select the checkbox corresponding to the required object and click the remove button to
remove the object from migration. You can also double-click to remove the required object.
• Click the remove-all button to remove all the selected objects from migration.

3. The Selected Objects grid shows all objects you have explicitly selected, for all object types.

4. Click the refresh button in the Selected Objects tool bar to populate the complete object details,
such as Target Modification Date (if the object exists in the target Infodom) and Operation
(Add/Update) that can be performed during migration.
5. The Dependent Objects grid shows all objects which are automatically migrated due to a
dependency in a parent object.

6. Click the button in the Dependent Objects tool bar to display the dependencies of the
selected objects.
To view the dependencies of a specific object, click on the object Name in either the Selected
Objects grid or the Dependent Objects grid. The parent / child dependencies are displayed in
the Parent / Child Dependency Information window.
You can also toggle the view of Parent / Child dependency information by selecting Parent or
Child in the Dependency Information grid.
7. The Audit Trail section will display details about Object Migration Rule creation and
modification, after it is saved. You can add comments from the User Comments tab.
8. Click Migrate to save and migrate the selected source objects to target setup or click Save to
save the Object Migration definition for future migration. You can later run the saved object
migration rule. For more information, see Migrate Stored Object Definition section.
Once the migration starts, the source objects are migrated to target setup and the Migration details
such as status, start, and end time are recorded. You can click View Log in the Object Migration
Summary window to view the details.


NOTE In case of an error during migration of any dependent objects,


the specific parent object is excluded from migration. You can
view the Migration Execution Log for details.

13.5.2.3 Viewing Object Migration Definition


You can view individual Object details at any given point.
To view the existing Object Migration definition details:
1. Select the checkbox adjacent to the Object Migration Definition Name.

2. Click View button in the Object Migration tool bar. The View - Object Migration window is
displayed.

3. Click the refresh button in the Selected Objects tool bar to refresh the properties.

4. Click the button in the Dependent Objects tool bar to display the dependencies of the
selected Object.
5. To view all dependencies of an object, click the object Name. The parent / child dependencies
are displayed in the Parent / Child Dependency Information window.

13.5.2.4 Modifying Object Migration Definition


To update the existing Object migration definition details:
1. Select the checkbox adjacent to the Object Migration Definition Name.

2. Click Edit in the Object Migration tool bar. The Edit - Object Migration window is displayed.
3. Edit the required details. For more information, see Creating Object Migration Definition.

NOTE You cannot edit the Source details.

4. Click Save to save the changes.

In the Object Migration Summary window, you can also click the Delete button to delete the
Object Migration Definition details.

13.5.2.5 Copying Migration Rules


The Copy Migration Rules option facilitates you to quickly create a new Migration Rule Definition based
on the existing Source-Target Object mappings, or by updating the required mapping details.
To copy an existing Migration Definition:
1. Select the checkbox adjacent to the Rule Name whose details are to be duplicated.

2. Click Copy in the Object Migration tool bar. The Copy button is disabled if you have selected
multiple migration rules.


3. Edit the Migration Rule Definition as required. You can modify the details such as Folder, Name,
Description, Access Type, Overwrite option, and also view the dependencies of the selected
objects. For more information, see Create Object Migration Definition.

NOTE You cannot edit the Source details.

4. Click Migrate to migrate the selected source objects to the target setup or click Save to save the
Object Migration definition for future migration.

13.5.2.6 Migrating Stored Object Definition


You can execute a stored Object Migration Definition and migrate the mapped objects to the target
setup. You can also interrupt the ongoing migration process at any given point.
To execute migration from a stored Object Migration Rule:
1. Select the checkbox adjacent to the Object Migration Definition Name.

2. Click Run in the Object Migration tool bar.


The migration process is triggered and the source objects are migrated to the target setup. The
details can be viewed by clicking View Log in the Object Migration Summary window.
You can also interrupt the ongoing migration process by selecting the object rule definition and
clicking the Cancel Run button.

13.5.2.7 Viewing Migration Execution Log


You can view the status of an executed migration rule definition with the log details of each migrated
object (parent) and its dependencies (child objects) indicated as components, along with their
sequence and severity.
To view the log details of an executed migration rule definition:
1. Click View Log in the Status column corresponding to the required Object Migration Definition.
The View Log window is displayed with the list of all the executed Object Migration Rule
definitions.
2. Click on the Task ID of the required Object Migration Rule and view the migration status such as
Task ID, Sequence, Severity, Message Description as Successful, Started, or Failed, Message
Date, and Message Time.

13.6 Translation Tools

13.6.1 Config Schema Download


Configuration schema refers to the database schema that is referred by all information domains to
access data related to Metadata, System Configuration, Administration Security, and so on.
Configuration schema stores the user security information and metadata used within the applications
which are deployed on OFSAA Infrastructure.


The Config Schema Download window facilitates you to download data from configuration schema
tables, in Microsoft Excel 2003/2007 format, with the option to filter data during download. The
Config Schema Download window has restricted access, and you should have the Config Excel
Advanced user role mapped to your user group to download configuration schema data.
To download Config Schema Data:
1. Select the table from the drop-down list. The list consists of those database objects (tables)
which are mapped to Configuration Schema based on a specific configuration.
2. Select the Format to download from the drop-down list. You can either select Microsoft Excel
2003 or 2007.
3. (Optional) If you want to download only the required data instead of the complete table data,
specify a filter condition in the Filter (where clause) field.
For example, if you want to download Group Code details from the table "cssms_group_mast",
you can specify the filter condition as:
select * from cssms_group_mast where v_group_code in ('AUTH')
4. Select Download.
The File Download dialog box is displayed, providing you with options to Open or Save a copy of
the file in the selected Excel format.

13.6.2 Config Schema Upload


Configuration Schema refers to the Database Schema that is referred by all information domains to
access data related to Metadata, System Configuration, Administration Security, and so on.
Configuration Schema stores the user security information and metadata used within the applications
which are deployed on OFSAA Infrastructure.
To navigate to this screen, go to the Objects Administration tab, expand Translation Tools and click
Config Schema Upload from the LHS menu.
The Config Schema Upload window facilitates you to upload data to the configuration schema table,
either by appending incrementally or by a complete re-load of the existing data, in Microsoft Excel
2003/2007 format. During upload, all the referential constraints (Foreign Key constraints) enabled on
the selected database object (table) are disabled and enabled back after the upload. In case of any
errors while enabling the referential constraints or inserting the new data, the selected database
object (table) is reverted to its original state.
The Config Schema Upload window has restricted access, and you should have the Config Excel
Advanced user role mapped to your user group to upload configuration schema data.
To upload Config Schema Data:
1. Select the table from the drop-down list. The list consists of those database objects (tables)
which are mapped to Configuration Schema based on a specific configuration.
2. In the Select the File to Upload field, click Browse. In the Choose File to Upload dialog box,
navigate to and specify the path of the data file (Microsoft Excel 2003/2007) that you want to upload.
If the Excel file contains multiple sheets, you can select the sheet from which data is to be uploaded.
Otherwise, the first sheet's data is selected for upload by default.

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 633


OBJECT ADMINISTRATION
UTILITIES

3. In the Select the Sheet field, click the browse button. The Sheet Selector pop-up window is
displayed. Select the required sheet from the drop-down list and click OK.
4. In the Upload Type options, select one of the following:
 Incremental - In this type of upload, the data in the Excel sheet is inserted/appended to the
target database object. The upload operation is successful only when all the data in the
selected Excel sheet is uploaded. In case of any error, the uploaded data is rolled back.
 Complete - In this type of upload, the data present in the selected database object is
overwritten with the data in the selected Excel sheet. In case of an error, the data in the
selected database object is reverted to its original state.
5. In Source Date Format field, specify the date format used in the data that you are uploading. An
insert query is formed based on the date format specified.
6. Select Upload. If you have selected the Complete upload type, you need to confirm overwriting
the data in the confirmation dialog box.
An information dialog box is displayed with the status of upload. You can click on View Log to
view the log file for errors and upload status. The log file contains the following information:
 Database object (table) to which the data is uploaded.
 Name of the Excel file from which the data is uploaded.
 Number of records uploaded successfully.
 Number of records that failed during upload and the reason for failure.
 Upload Status (Success/Fail).
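The constraint handling described above (disable the foreign-key checks, load the sheet, re-enable the checks, and revert on error) can be sketched as follows. This is a minimal illustration using Python's sqlite3 module and a simplified one-column cssms_group_mast table; the actual utility works against the Oracle Configuration Schema, so the statements it issues differ.

```python
import sqlite3

def upload_rows(conn, rows):
    """Sketch of the Config Schema Upload pattern: disable referential
    checks, insert the data, then re-enable the checks. The transaction
    rolls back automatically if any insert fails, reverting the table
    to its original state."""
    conn.execute("PRAGMA foreign_keys = OFF")       # disable FK constraints
    try:
        with conn:                                  # commits, or rolls back on error
            conn.executemany(
                "INSERT INTO cssms_group_mast (v_group_code) VALUES (?)",
                rows)
    finally:
        conn.execute("PRAGMA foreign_keys = ON")    # re-enable after the upload

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cssms_group_mast (v_group_code TEXT)")
upload_rows(conn, [("AUTH",), ("GUEST",)])
print(conn.execute("SELECT COUNT(*) FROM cssms_group_mast").fetchone()[0])  # prints 2
```

The same pattern applies to both upload types: for Incremental the rollback leaves the earlier rows intact, while for Complete the revert restores the pre-upload contents.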

13.7 Utilities
Utilities refer to a set of additional tools which help you to fine-tune a defined process or maximize
and ensure the security of a database, based on your need. The Utilities within the Administration
framework of the Infrastructure system facilitate you to maintain the data in the Oracle database using
the various administrative tools. You can define the user access permissions, batch securities, upload
attributes, find metadata differences, and migrate source objects to a target database.
You (System Administrator) need to have the SYSADM function role mapped to access the Utilities
section within the Infrastructure system. You can access the Utilities section within the Administration
framework under the tree structure of the LHS menu.
To access various utilities, go to the Object Administration tab and click Utilities.
Administration Utilities consists of the following sections. Click on the links to view the sections in
detail.
• Metadata Authorization
• Metadata Difference
• Save Metadata
• Write-Protected Batch
• Component Registration


• Transfer Document Ownership


• Object Migration
• Patch Information
• Restructure

13.7.1 Metadata Authorization


Metadata Authorization within the Infrastructure system facilitates you to authorize or reject the
metadata version(s) created as a result of an update to the existing business definitions. The
modifications done to the higher-level metadata or business definitions are recorded as a new version
of the same metadata, which needs to be accepted or rejected to reflect the changes. On
Authorization, the existing metadata is replaced with the selected version. In case of Rejection, the
selected version of the metadata is removed from the system.
You need to have SYSADM and METAAUTH function roles mapped to access the Metadata
Authorization within the Administration framework of the Infrastructure system. The Metadata for
Authorization window displays the list of modified Metadata Type and the total number of eligible
metadata for authorization in the Business Metadata tab (Default).

Figure 308: Metadata for Authorization window

13.7.1.1 Authorize / Reject Metadata


To Authorize or Reject Metadata Types in the Metadata for Authorization window:
1. Select the Module tab as Business Metadata (default) or Rules Run Framework. The list of
Metadata Type eligible for authorization is displayed.
2. Select the required Metadata Type by clicking the Forms to be Authorized link.


Figure 309: Metadata Details pane

A list of the metadata versions is displayed along with the other details such as Code, Short
Description, Action Performed, and Performed By details for the selected metadata definition.
3. Select the checkbox adjacent to the required version of the selected metadata and do one of the
following:
 Click Authorize to accept the metadata changes of the selected version.
 Click Reject to ignore the metadata changes and delete the selected version.
The window is refreshed on every action and the updates are displayed in the respective tab of
the Metadata for Authorization window.
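The two outcomes above can be sketched as follows. This is a minimal model of a version store, assuming a simple dictionary of pending versions per metadata code; the store layout is illustrative only, not the product's internal schema.

```python
# Hypothetical version store: for each metadata code, the authorized
# definition plus the pending versions awaiting a decision.
defs = {"HIER01": {"current": "v1 body",
                   "pending": {2: "v2 body", 3: "v3 body"}}}

def authorize(code, version):
    """Authorize: the existing metadata is replaced with the selected version."""
    defs[code]["current"] = defs[code]["pending"].pop(version)

def reject(code, version):
    """Reject: the selected version is removed from the system."""
    defs[code]["pending"].pop(version)

reject("HIER01", 3)        # version 3 is discarded
authorize("HIER01", 2)     # version 2 becomes the authorized definition
print(defs["HIER01"]["current"])   # prints v2 body
```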

13.7.2 Save Metadata


Save Metadata within the Infrastructure system facilitates you to resave the changes done to an
authorized metadata for the selected Information Domain. When you resave metadata, all the existing
metadata definitions are updated with the current changes along with the current modified date.

Figure 310: Metadata Resave window

You (System Administrator) need to have SYSADM function role mapped to access the Metadata
Resave window. The Metadata Resave window displays the list of Available Metadata for Hierarchy
(default) for the selected Information Domain.


To resave metadata in the Metadata Resave window:


1. Filter the metadata type by selecting Hierarchy or Derived Entity. The list of Available Metadata
is populated. Do one of the following:

 Select the required metadata from the Available Metadata list and click the Select button.
You can press the Ctrl key for multiple selections.

 To select all the Available Metadata, click the Select All button.

You can also deselect a metadata by selecting it from the Selected Metadata list and clicking the
Deselect button, or deselect all the selected metadata by clicking the Deselect All button.
2. Click Save to update the metadata changes. The status of the operation is displayed.

13.7.3 Write-Protected Batch


Write-Protected Batch facilitates you to change the Editable State of Batches defined in the Batch
Maintenance window of the Infrastructure system. You can either restrict a Batch from being edited,
or remove the restrictions and allow users to modify the Batch Definition details.
You (System Administrator) need to have SYSADM function role mapped to access the Write-
Protected Batch within the Utilities section of the Infrastructure system.

Figure 311: Write Protected Batch window

The Write-Protected Batch window displays the list of defined Batches for the selected Information
Domain along with the other details such as Batch Name, Batch Description, and Write-Protection
status. By default, the Batch list is sorted in ascending order of the Batch Name; the sort order can be
changed by clicking the ascending or descending sort buttons.
To change the Editable State of Batch in the Write-Protected Batch window, do the following:
• To change the Batch state to “Non Editable”, select the Write-Protected Batch checkbox of the
required Batch in the list and click Save. The Batch details are restricted from being edited in
the Batch Maintenance/Scheduler window.
• To change the Batch state to “Editable”, deselect the Write-Protected Batch checkbox of the
required Batch in the list and click Save. The Batch details can be modified as required in the
Batch Maintenance/Scheduler window.


• You can also click Check All to write-protect (restrict editing) all the batches in the list or click
Uncheck All to remove the restriction and allow editing of all the Batches.

13.7.4 Metadata Difference


Metadata Difference within the Infrastructure system facilitates you to view the difference between
two versions of a Metadata within the selected Information Domain. You (System Administrator) need
to have SYSADM function role mapped to access the Metadata Difference within the Utilities section
of the Infrastructure system.
To view the Metadata Difference, do the following:

1. Click the browse button adjacent to Select Metadata.


The Metadata Tree dialog is displayed with a list of metadata available within the Data Model
Management and Rules Run Framework modules of the selected Information Domain.

NOTE The Metadata Difference feature is not supported for RRF metadata.

2. Select the required metadata by expanding the required node. Click OK.

3. Click the browse button adjacent to From Version.


The Version Tree dialog is displayed with the list of available versions for the selected metadata.
4. Select the required version by expanding the required node. Click OK.

5. Click the browse button adjacent to To Version. The Version Tree dialog is displayed.


6. Select the required version by expanding the required node. Click OK.

7. Click the compare button from the Metadata Difference tool bar.


The difference of the selected two metadata versions is displayed.

You can also click the reset button to clear the metadata and version selections.
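Conceptually, the comparison works like a line-by-line diff of the two serialized versions. A minimal sketch with Python's difflib follows; the version bodies here are made-up placeholders, not the product's serialization format.

```python
import difflib

# Hypothetical serialized forms of two versions of the same hierarchy.
from_version = ["Code: HIER01", "Level 1: Region", "Level 2: Branch"]
to_version   = ["Code: HIER01", "Level 1: Region", "Level 2: City"]

# unified_diff yields only the changed lines plus a little context.
diff = list(difflib.unified_diff(from_version, to_version,
                                 fromfile="From Version",
                                 tofile="To Version", lineterm=""))
print("\n".join(diff))
```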

13.7.5 Patch Information


The Patch Information window within the Infrastructure facilitates you to view the list of patches
applied and applications installed to date. You (application user) need to have the SYSADM function
mapped to your role to access the Patch Information window within the Utilities section of the
Infrastructure.


Figure 312: Patch Information window

The Patch Information window dynamically displays the list of applied patches and installed
applications along with the Patch or Application Name, the Information Domain on which the
patch/application has been installed, and Additional Information (if any). These records are fetched
from the corresponding tables in the database and are sorted in ascending order of Applied Date by
default.
You can search for a specific patch/application installation based on Patch/Application Name or
Information Domain.

13.7.6 Transfer Documents Ownership


This feature allows you to transfer the ownership of the uploaded documents to another user or user
group. When a user or user group is deleted, the uploaded documents will be orphaned. This feature
can be used to transfer the ownership of the documents before a user or user group is deleted.
The Transfer Document Ownership link is displayed when the user is mapped to any one of the
following roles:
• Document MGMT advanced
• Document MGMT authorize
• Document MGMT phantom
• Document MGMT write
For more details regarding Roles and Functions, see Appendix A.

13.7.6.1 Transferring Document Ownership to User


To transfer document ownership to user:
1. From the Transfer Documents Ownership window, select the user whose document ownership
you want to transfer from the User drop-down list.
The uploaded documents by the selected user are displayed under the Available Documents
pane.
2. Select the user to whom you want to transfer the document ownership from the Destination
User drop-down list.


3. Select the documents from Available Documents whose ownership you want to transfer by
clicking the Select button. The documents are moved to the Selected Documents pane. You can
click the Select All button to select all documents.
4. Click Save.

13.7.6.2 Transferring Document Ownership to User Group


To transfer document ownership to a user group:
1. From the Transfer Documents Ownership window, select the User Groups option.
2. Select the user group whose document ownership you want to transfer, from the Group drop-
down list.
The uploaded documents by the selected user group are displayed under the Available
Documents pane.
3. Select the group to which you want to transfer the document ownership from the Destination
Group drop-down list.
4. Select the documents from Available Documents whose ownership you want to transfer by
clicking the Select button. The documents are moved to the Selected Documents pane. You can
click the Select All button to select all documents.
5. Click Save.

13.8 References
This section of the document consists of information related to intermediate actions that need to be
performed while completing a task. The procedures are common to all the sections and are referenced
wherever required. You can see the following sections based on your need.

13.8.1 Scenario to Understand Hierarchy Security


Consider a bank “ABC” which has a presence across the country and has split its business based on
regions. Each region is managed by a Relationship Manager reporting to the Chief Executive
Officer. The hierarchy is as indicated below.
Retail Assets Sales Head
• Sales Manager Personal Loans
 Sales Officer 1
 Sales Officer 2
• Sales Manager Mortgages
 Sales Officer 3
 Sales Officer 4
• Sales Manager Credit Cards
 Sales Officer 5


 Sales Officer 6
• Sales Manager Auto Loans
 Sales Officer 7
 Sales Officer 8
Products
• Personal Loans
• Mortgages
• Credit Cards
• Auto Loans
Each product is marketed by a separate team, which is headed by a Sales Manager who reports to the
Sales Head. Each Sales Manager in turn has two Sales Officers who are responsible for the sales and
profitability of the product.
The Sales Head has decided that the Sales Officer of each product will not have access to the
information of other products. However, each Sales Manager will have access to Sales figures of the
other products.
Using the Oracle Infrastructure Security Hierarchy feature, the Administrator can provide information
security at the hierarchy level by defining security options for each hierarchy node. Thus, the Bank can
control access to information at a node level without increasing the overheads.
This is how it is done in Oracle Infrastructure:
• First, the users are created in Oracle Infrastructure and then, a business hierarchy (as defined
above) is created.
• Now, the bank can restrict access of certain information to certain people in the Hierarchy
Security configuration.
• In this window, the administrator can control security by mapping the users to various nodes in
hierarchy.
For example, the administrator maps Sales Officer 1 and Sales Officer 2 to only the Personal
Loans Node in the Product hierarchy. This restricts Sales Officer 1 and 2 to only viewing and
maintaining their particular node in the hierarchy.
By default, all the users mapped to a domain can access all the hierarchy levels to which they
are mapped. This function allows the administrator to restrict or exclude one or more users from
accessing restricted nodes.
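The node-level mapping in this scenario can be sketched as a simple lookup: a user sees a node's figures only if the administrator mapped the user to that node. The mapping table below is illustrative only.

```python
# Hypothetical user-to-node mapping set up by the administrator.
node_access = {
    "Sales Officer 1": {"Personal Loans"},
    "Sales Officer 2": {"Personal Loans"},
    "Sales Manager Personal Loans": {"Personal Loans", "Mortgages",
                                     "Credit Cards", "Auto Loans"},
}

def can_view(user, node):
    """A user can access a hierarchy node only if mapped to it."""
    return node in node_access.get(user, set())

print(can_view("Sales Officer 1", "Personal Loans"))  # prints True
print(can_view("Sales Officer 1", "Mortgages"))       # prints False
```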

13.8.2 Role Mapping Codes


By default, the following roles are defined within the Infrastructure application. See Appendix A.


Table 173: Details of the Role Code, Name, and their Descriptions

Role Code Role Name Role Description

CWSADMIN CWS Administrator CWS Administrator Role

DEFQMAN DEFQ Manager Data Entry Forms and Query Manager Role

DQADMN DQ Rule Admin Data Quality Rule Admin Role

ETLADM ETL Analyst ETL Analyst Role

METAAUTH Metadata Authorizer Metadata Authorizer Role

ORACUB Oracle Cube Administrator Oracle Cube Administrator Role

PR2ADM PR2 Administrator PR2 Administrator Role

SYSADM System Administrator System Administrator Role

SYSAMHM Fusion AMHM Admin Fusion Dimension Maintenance Admin Role

SYSAMHMUAM Fusion AMHM UAM Map Admin Fusion UAM Maintenance Admin Role

SYSATH System Authorizer System Authorizer Role

SYSBAU Business Analyst Business Analyst Role

SYSEXPN Fusion Expressions Admin Fusion Expressions Admin Role

SYSFILTERS Fusion Filters Admin Fusion Filters Admin Role

SYSOBJMIG Object Migration Admin Object Migration Maintenance Admin Role

SYSOPC Data Centre Manager Operator Console Role

SYSSQLRULE SQL Rule Admin SQL Rule Administrator Role

13.8.3 Function Role Mapping


The default roles are mapped to the following functions within the Infrastructure application.


Table 174: Details of the Role and the Function Mappings

Roles Function Mappings

Business Analyst: Add Alias, Add Attributes, Add Business Processor, Add Computed Measure, Add
Cube, Add Dataset, Add Derived Entities, Add Dimension, Add Hierarchy, Add Measure, Add RDM,
Alias Admin, Authorize Attributes, Authorize Dataset, Authorize Dimension, Authorize Hierarchy,
Authorize Measure, Business Analyst User Window, Call Remote Web Services, Cash Flow Equation
Definition, Computed Measure Advanced, Defi Administrator, Defi User, Defq Administrator, Defq
User, Delete Alias, Delete Attributes, Delete Business Processor, Delete Computed Measure, Delete
Cube, Delete Dataset, Delete Derived Entities, Delete Dimension, Delete Hierarchy, Delete Measure,
Delete RDM, Design RDM, Document management Access, Excel Admin, Excel User, Execute Runs
and Rules, Export Metadata, GMV Definition, Hierarchy Attributes, Import Business Model, Import
Metadata, MDB Window, Model Calibration, Model Definition, Model Deployment, Model Execution,
Model Make Champion, Model Outputs, Modify Alias, Modify Attributes, Modify Business Processor,
Modify Computed Measure, Modify Cube, Modify Dataset, Modify Derived Entities, Modify Dimension,
Modify Hierarchy, Modify Measure, Modify RDM, Optimizer Add, Optimizer Delete, Pooling Add,
Pooling Delete, Refresh Hierarchies, Remote SMS Access, Result of own request only, Result of
Request and Status of all, Rule Shock Definition, Sandbox Creation, Sandbox Maintenance, Scenario
Definition, Stress Definition, Variable Definition, Variable Shock Definition, View Alias, View Attributes,
View Business Processor, View Computed Measures, View Cube, View Dataset, View Derived Entities,
View Dimension, View Hierarchy, View Measure, View Metadata, View RDM

CWS Administrator: Call Remote Web Services, Document Management Access, Execute Runs -
Rules, Refresh Hierarchies, Remote SMS Access, Remote UAM Access, Result of own request only,
Result of request - Status of all

Data Centre Manager: Batch Cancellation, Batch Processing, Create Batch, Delete Batch, Execute
Batch, Operator Console, View log

DEFQ Manager: DeFi Excel, Defq Administrator, Defq User, Excel Admin, Excel User

DQ Rule Admin: Data Quality Add Rule, Data Quality Add Rule Group, Data Quality Authorization
Rule, Data Quality Copy Rule, Data Quality Copy Rule Group, Data Quality Delete Rule, Data Quality
Delete Rule Group, Data Quality Edit Rule, Data Quality Edit Rule Group, Data Quality Execute Rule
Group, Data Quality View Rule, Data Quality View Rule Group

ETL Analyst: Data Quality Add, DI Designer, DI User, DTDQ

Fusion AMHM Admin: Fusion Add Attributes, Fusion Add Hierarchies, Fusion Add Members, Fusion
Attribute Home Page, Fusion Attributes - View Dependent Data, Fusion Copy Attributes, Fusion Copy
Hierarchies, Fusion Copy Members, Fusion Delete Attributes, Fusion Delete Hierarchies, Fusion Delete
Members, Fusion Edit Attributes, Fusion Edit Hierarchies, Fusion Edit Members, Fusion Hierarchies -
View Dependent Data, Fusion Hierarchy Home Page, Fusion Member Home Page, Fusion Members -
View Dependent Data, Fusion View Attributes, Fusion View Hierarchies, Fusion View Members

Fusion AMHM UAM Map Admin: Fusion Hierarchies to UAM Mapping

Fusion Expressions Admin: Fusion Add Expressions, Fusion Copy Expressions, Fusion Delete
Expressions, Fusion Edit Expressions, Fusion Expressions Home Page, Fusion View Dependency
Expressions, Fusion View Expressions

Fusion Filters Admin: Fusion Add Filters, Fusion Copy Filters, Fusion Delete Filters, Fusion Edit
Filters, Fusion Filters - View Dependent Data, Fusion Filters - View SQL, Fusion Filters Home Page,
Fusion View Filters

Infrastructure Administrator: Configuration, Database Details, Database Server, Hierarchy Security,
Information Domain, Infrastructure Administrator, Infrastructure Administrator Window, Metadata
Segment Map, Operator Console

Metadata Authorizer: Authorize Alias, Authorize Attributes, Authorize BBs, Authorize Business
Processor, Authorize Computed Measure, Authorize Cube, Authorize Dataset, Authorize DBs,
Authorize Derived Entities, Authorize Dimension, Authorize Hierarchy, Authorize KPIs, Authorize
Measure, Authorize Nested Views, Authorize Oracle Cube, Authorize Pages, Authorize Process Tree,
Authorize RDM, Authorize Reports, Authorize Rule, Authorize Run, Authorize Technique, Authorize
Templates, Authorize Views, Metadata Authorize Window, Model Authorize, Sandbox Authorize, View
Alias, View Attributes, View Business Processor, View Computed Measures, View Cube, View Dataset,
View Derived Entities, View Dimension, View Hierarchy, View Measure, View Oracle Cube, View
Process, View RDM, View Rule, View Run

Object Migration Admin: Cancel Migration Execution, Execute/Run Migration Process, Object
Migration Copy Migration Ruleset, Object Migration Create Migration Ruleset, Object Migration Delete
Migration Ruleset, Object Migration Edit Migration Ruleset, Object Migration Home Page, Object
Migration Source Configuration, Object Migration View Migration Ruleset, Object Migration View
Source Configuration

Oracle Cube Administrator: Add Dataset, Add Dimension, Add Hierarchy, Add Measure, Add Oracle
Cube, Authorize Oracle Cube, Business Analyst User Window, Delete Oracle Cube, Modify Dataset,
Modify Dimension, Modify Hierarchy, Modify Measure, Modify Oracle Cube, View Alias, View Dataset,
View Dimension, View Hierarchy, View Measure, View Oracle Cube

PR2 Administrator: Access to Process, Access to Rule, Access to Run, Add Process Tree, Add Rule,
Add Run, Delete Process, Delete Rule, Delete Run, Modify Process Tree, Modify Rule, Modify Run,
PR2 Windows, View Process, View Rule, View Run

SQL Rule Admin: SQL Rule Add, SQL Rule Copy, SQL Rule Delete, SQL Rule Edit, SQL Rule Run,
SQL Rule View

System Administrator: Administration Window, Application Server Window, Audit Trail Report
Window, Batch Cancellation, Batch Monitor, Configuration, Database Details, Database Server, Design
OFSAAI Menu Window, Enable User Window, Function Maintenance Window, Function Role Map
Window, Global Preferences View, Hierarchy Security, Holiday Maintenance Window, Information
Domain, Locale Desc Upload Window, Metadata Difference Window, Metadata Segment Map, OLAP
Details Window, Operator Console, Restricted Passwords Window, Role Maintenance Window, Rules
Setup Configuration Window, Save Metadata Window, Segment Maintenance Window, System
Administrator, System Administrator Window, User Activity Reports Window, User Attribute Upload
Window, User Group Domain Map Window, User Group Maintenance Window, User Group Role Map
Window, User Group User Map Window, User Maintenance Window, User Profile Report Window,
User-Batch Execution Mapping Window, View log, Web Server Window, Write-Protected Batch
Window

System Authorizer: Administration Window, Infrastructure Administrator Window, Profile
Maintenance Window, System Administrator Window, System Authorizer, User Authorization Window

NOTE To access an object, the respective Group or Role needs to be mapped instead of functions.
See Appendix A.


14 Command Line Utilities


The following command line utilities are introduced in OFSAAI.
• Command Line Utility to Migrate Objects
• Command Line Utilities to Execute RRF Definitions
• Command Line Utility for DMT Migration
• Command Line Utility for File Encryption
• Command Line Utility to publish Metadata in Metadata Browser
• Command Line Utility for Object Application mapping in new Metadata Browser
• Command Line Utility for Resaving UAM Hierarchy Objects
• Command Line Utility for Resaving Derived Entities and Essbase Cubes
• Command Line Utility for Mapper Pushdown
• Command Line Utility for Downloading Metadata Objects in PDF Format
• Command Line Utility for LDAP Migration
• Command Line Utility for Model Upload
• Command Line Utility for Object Registration
• Command Line Utility for Transforming erwin XML to Database XML or JSON(ODM)
• Command Line Utility for Generating Slice JSON (ODM)
• Command-Line Utility for SQL Modeler to JSON (ODM)
• Command-line Utility to Bulk Import User Groups to IDCS

14.1 Command Line Utility to Migrate Objects


Using the command line utility, you can migrate (export/import) Infrastructure metadata objects
across different information domains or setups. You can specify one or more objects within an object
type or within multiple object types.
You can choose where the object migration utility reads the data from, that is, from CSV files or the
OBJECTMIGRATION.xml file. For migrating objects using CSV files, see Migrating Objects using CSV
Files. For migrating objects using the OBJECTMIGRATION.xml file, see Migrating Objects using
OBJECTMIGRATION.xml File.
For the list of objects that can be migrated, see the Objects Supported for Command Line Migration
section. However, currently some objects are not supported. You need to migrate them separately
from the Object Migration UI, or manually recreate them in the target environment.


NOTE The REST authentication is done against the Service Account


user mentioned under OFSAA_SRVC_ACC parameter in the
CONFIGURATION table. This user should be created with "SMS
Auth Only" attribute from the User Maintenance window. By
default, OFSAA_SRVC_ACC parameter is set as SYSADMN.

14.1.1 Prerequisites
• You must have access and execution rights in the $FIC_HOME/utility/Migration/ directory in
both the source and target environment.
• Folders (segments) and user groups that are designated for the import should be present in the
target.
• The source and target environment should have the same installed locales.
• OFSAA users in source should be the same in target (at least for users associated with objects
migrated).
• OFSAA users should have access to folders in target as well as source.
• Underlying tables of the objects being migrated should exist in target.
For example, if you want to migrate a Data Element Filter based on "Table A" and "Table B" in
the source, those two tables should exist in the target.
• For AMHM Dimensions and Hierarchies:
 The key processing Dimensions should be the same in both the source and target
environments.
 For Member migration, the Dimension type should have the same attributes in both source
and target environments.
 Numeric Dimension Member IDs should be the same in both the source and target
environments, to ensure the integrity of any Member-based objects.

NOTE If you have used the Master Table approach for loading
Dimension data and set it up to generate surrogate keys for
Members, this results in different IDs between the source and
target, so it may cause errors if you have objects which depend
on these IDs.

• All objects that generate a new ID after migrating to a different information domain, and all
components which are registered through the Component Registration window and will be
used in the RRF, must be manually entered in the AAI_OBJ_REF_UPDATE table in the
Configuration Schema. The implicit migration of dependent objects is not supported. They
should be migrated explicitly. The attributes present in the table are:
 V_OBJECT_TYPE - EPM Object Type.
 V_RRF_OBJECT_TYPE - RRF Object Type. The ID can be referred from the
pr2_component_master table.
 V_ICC_OBJECT_TYPE - ICC Object Type. The ID can be referred from the
component_master table.
 F_IS_FILTER - Whether the object is to be migrated as a filter or not.
 N_BATCH_PARAMETER_ORDER - The order of the parameter in the task (if used in a batch).
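As an illustration, a manual entry into AAI_OBJ_REF_UPDATE might be sketched as below. The type codes are placeholders, not real values; look up the actual RRF and ICC type IDs in the pr2_component_master and component_master tables as noted above.

```sql
-- Hypothetical values for illustration only: replace the placeholders
-- with the EPM object type and the IDs from pr2_component_master and
-- component_master before running against the Configuration Schema.
INSERT INTO AAI_OBJ_REF_UPDATE
    (V_OBJECT_TYPE, V_RRF_OBJECT_TYPE, V_ICC_OBJECT_TYPE,
     F_IS_FILTER, N_BATCH_PARAMETER_ORDER)
VALUES
    ('<EPM_OBJECT_TYPE>', '<RRF_TYPE_ID>', '<ICC_TYPE_ID>', 'N', 1);
```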

14.1.2 Migrating Objects Using OBJECTMIGRATION.xml File


This section explains how to migrate objects using the OBJECTMIGRATION.xml file. In this case, you
have to populate the migration.properties file and the OBJECTMIGRATION.xml file. These files are
present in the $FIC_HOME/utility/Migration/conf folder. You do not have to make any entries in the
export_input.csv and import_input.csv files, present in the same folder.
To migrate objects using OBJECTMIGRATION.xml file, perform the following steps:
1. Navigate to the $FIC_HOME/utility/Migration/conf folder.
2. Populate the migration.properties file with appropriate values as explained in the
following table.

NOTE The values in the properties file are updated by the installer. If
you want to run this utility from another location, the values
should be specified accordingly.

Table 175: Entries in the migration.properties File and their Descriptions

Name Description

EXPORTIMPORT_BASEPATH Absolute path of the directory where the metadata/archive and
metadata/restore folders are created.
For example: EXPORTIMPORT_BASEPATH=
/oracle/rhelapp/ofs73app/utility/Migration

FIC_HOME OFSAAI installation directory.
For example: FIC_HOME=/oracle/rhelapp/ofs73app

READ_FROM_CSV Set this as N. The utility then reads from the OBJECTMIGRATION.xml
file.

NOTE The remaining entries in the migration.properties file are not required when you migrate
objects using the OBJECTMIGRATION.xml file.
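Put together, a migration.properties prepared for XML-driven migration might look like the following sketch. The paths are the examples given in the table above; adjust them to your installation.

```properties
EXPORTIMPORT_BASEPATH=/oracle/rhelapp/ofs73app/utility/Migration
FIC_HOME=/oracle/rhelapp/ofs73app
READ_FROM_CSV=N
```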

3. Update the OBJECTMIGRATION.xml file as explained below based on whether you want to
import or export objects:


NOTE The OBJECTMIGRATION.xml file is available with the installer.


The Tag name, Attribute and the entries to be made in the XML
file are case sensitive.

NOTE Any updates done are available in the
OBJECTMIGRATION_template.xml file. Before invoking the
command line utility, ensure that the updates available in the
OBJECTMIGRATION_template.xml file are available in the
OBJECTMIGRATION.xml file that you are using to migrate
objects.

14.1.2.1 For Exporting Objects


Table 176: Details and Descriptions of the Tags and their Attributes

Tag Name Attribute Description

USERID Specify the user ID of the OFSAAI user who will be running the
migration utility. Ensure the user is mapped to the specific
source Information Domain / Segment.
The user id should be provided in capital letters.
Note: The User ID or Service accounts are “SMS Auth Only” in
case of SSO and LDAP configured setups.

LOCALE Set this as en_US.

INFODOM Specify the Information Domain from where objects need to


be exported.
The information domain name should be provided in capital
letters.

FOLDER Not Applicable, only used for importing.

MODE Set the mode of the operation as EXPORT.

FILE Specify the name of the dump file which will be created under
$FIC_HOME/utility/Migration/metadata/archive folder as a
.DMP file.

FAILONERROR Not Applicable, only used for importing.

OVERWRITE Not Applicable, only used for importing.

RETAIN_IDS Not Applicable, only used for importing.

MIGRATION_CODE Enter the unique migration code to identify the status of the
migration process.
For example: 8860


OBJECT Code Specify the object Code, which should be a unique identifier of
the definition according to the Type of the object in the
Information Domain. The Code is either system generated or a
user defined unique code. See the Objects Supported for
Command Line Migration section to know whether a particular
object's code is user defined or system generated.
Note: Object Code is case sensitive.
You can specify the Code value as the wildcard "*" if you are
migrating all objects of that Type.
For example, to export all Rules from RRF:
<OBJECTS>
<OBJECT Code="*" Type="112" />
</OBJECTS>
To export multiple objects of a particular object type, make a
separate entry with each object code in the
OBJECTMIGRATION.xml file.
For example, if you want to export three different rules, the
entries should be made as given below:
<OBJECTS>
<OBJECT Code="Rule Code_1" Type="112" />
<OBJECT Code="Rule Code_2" Type="112" />
<OBJECT Code="Rule Code_3" Type="112" />
</OBJECTS>
To export ETL objects, the format is the Data Mapping Code
followed by Type="122".
For example:
<OBJECT Code="FCTPRODUCT" Type="122" />
Note: Only the latest version will be archived, and it will be
restored as a new version.
To export Enterprise Modeling Objects which support
versioning, the version of the object should be part of the
Code attribute:
<OBJECTS>
<OBJECT Code="ModelID_Version" Type="1305" />
</OBJECTS>

Additionally, if you want to include or exclude a few of the
dependent objects, you can set the depType as exclude or
include.
<OBJECTS>
<OBJECT Code="BP_Code_1" Type="105">
<OBJECT Code="Dataset_Code_1" depType="include" Type="104"/>
<OBJECT Code="Measure_Code_1" depType="exclude" Type="101"/>
</OBJECT>
</OBJECTS>


Object Type Specify the Type ID of the required metadata objects to be
exported. Refer to the Objects Supported for Command Line
Migration section.

SubType SubType is available for Filters and AMHM hierarchy only. This
is a mandatory field.
For filters, SubType indicates the type of the filter. For
hierarchies, this indicates the Dimension ID.
See the table for filter SubTypes.
Example: For Group Filter,
<OBJECTS>
<OBJECT Code="200265" Type="1" SubType="21"/>
</OBJECTS>

1. After you have updated the files with required information in the source environment, navigate
to $FIC_HOME/utility/Migration/bin path and execute ObjectMigration.sh. The
dump file will be created.
2. Once executed, you can view the related log files from the
$FIC_HOME/utility/Migration/logs location.
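Putting the export tags above together, a minimal export configuration might look like the following sketch. The element layout is an assumption based on the tag descriptions in Table 176 and should be verified against the OBJECTMIGRATION_template.xml shipped with the installer; the user ID, Infodom, and file name are placeholders.

```xml
<!-- Illustrative export configuration: all values below are placeholders. -->
<OBJECTMIGRATION>
  <USERID>MIGUSER</USERID>
  <LOCALE>en_US</LOCALE>
  <INFODOM>SRCINFODOM</INFODOM>
  <FOLDER/>
  <MODE>EXPORT</MODE>
  <FILE>EXPORT_RULES</FILE>
  <MIGRATION_CODE>8860</MIGRATION_CODE>
  <OBJECTS>
    <!-- Export every RRF Rule (Type 112) using the wildcard Code -->
    <OBJECT Code="*" Type="112" />
  </OBJECTS>
</OBJECTMIGRATION>
```

Running ObjectMigration.sh against a file like this would create EXPORT_RULES.DMP under the metadata/archive folder.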

14.1.2.2 For Importing Objects


Table 177: Details and Descriptions of the Tags and their Attributes

Tag Name Attribute Description

USERID Specify the user ID of the OFSAAI user who will be running
the migration utility. Ensure that the user is mapped to the
specific target Information Domain / Segment.
The user ID must be provided in capital letters.
Note: The User ID or Service accounts must be "SMS Auth Only" in
case of SSO and LDAP configured setups.

LOCALE Set this as en_US.

INFODOM Specify the Information Domain where objects need to be
imported.
The Information Domain name must be provided in capital
letters.


FOLDER Specify the Code of the folder/segment to which you need to
import objects.
This field is optional. The folder value should be provided in
capital letters.
Note: This is the default target folder if an object specific
TargetFolder is not provided. However, if both FOLDER and
TargetFolder are not specified, then the source folder available
in the exported dump file is considered as the target folder.
For behavior in this release, see the Limitations section.

MODE Set the Mode of the operation as IMPORT.

FILE Specify the name of the file to be imported, which is present
under the $FIC_HOME/utility/Migration/metadata/restore
folder.

IMPORTALL Y indicates that all exported objects in the .DMP file (dump)
will be imported (regardless of any specific OBJECT entries in
the OBJECTMIGRATION.XML file).
Example:
<IMPORTALL TARGETFOLDER="BASEG">Y</IMPORTALL>
N indicates that only objects explicitly specified in the
OBJECTMIGRATION.XML file will be imported (provided they
are already exported and available in the dump file).
Note: When migrating Sandbox, IMPORTALL should be N.

FAILONERROR Specify whether to fail the operation on any error.
Y - Stops the import process if there is any error.
N - Continues with the next object in the import process even
if there is an error.

OVERWRITE Specify whether to overwrite any existing metadata.
Y - Overwrites metadata even if the metadata already exists.
N - Does not overwrite the object if it already exists and
continues migrating the next object.


RETAIN_IDS Specify whether to retain the source AMHM objects' System IDs
after migration.
Y - Retains the Source AMHM objects' System IDs.
N - Does not retain the Source AMHM objects' System IDs.
If 'Y' is selected, the different scenarios and behaviors are as
follows:
Object and ID do not exist in Target - the object is created in
the target environment with the same ID as that in the source.
Object exists in Target with a different ID - the object is
migrated and the ID in the target is retained.
ID already exists in Target with a different object - the object
is migrated to the target environment and a new ID is generated.
Same object and ID exist in Target - in this case, the behavior
depends on the OVERWRITE flag.

MIGRATION_CODE Enter the unique migration code to identify the status of the
migration process.
For example: 8860


OBJECT Code Specify the object Code, which should be a unique identifier
of the definition according to the Type of the object in the
Information Domain. The Code is either system generated or a
user defined unique code. See the Objects Supported for
Command Line Migration section to know whether a particular
object's code is user defined or system generated.
Note: Object Code is case sensitive.
You can specify the Code value as the wildcard "*" if you are
importing all objects of that Type.
For example:
<OBJECTS>
<OBJECT Code="*" Type="112" />
</OBJECTS>
To import multiple objects of a particular metadata type, make
a separate entry with each metadata code in the
OBJECTMIGRATION.xml file.
For example, if you want to import three different rules, the
entries should be made as given below:
<OBJECTS>
<OBJECT Code="Rule Code_1" Type="112" />
<OBJECT Code="Rule Code_2" Type="112" />
<OBJECT Code="Rule Code_3" Type="112" />
</OBJECTS>
Note: Specify only those Codes that are present in the
exported dump file.
To import Enterprise Modeling Objects which support
versioning, the version of the object should be part of the
Code attribute:
<OBJECTS>
<OBJECT Code="ModelID_Version" Type="1305" />
</OBJECTS>
Type Specify the Type ID of the required metadata objects to be
imported. Refer to the Objects Supported for Command Line
Migration section.
Note: You need to specify only those Types, which are
present in the exported dump file.


SubType SubType is available for Filters and AMHM hierarchy only.
This is a mandatory field.
For filters, SubType indicates the type of the filter. For
hierarchies, this indicates the Dimension ID.
See the table for filter SubTypes.
Example: For Group Filter,
<OBJECTS>
<OBJECT Code="200265" Type="1" SubType="21"/>
</OBJECTS>
OBJECTS TargetFolder Specify the optional attribute TargetFolder in the <OBJECTS> tag
to import objects to a specific folder. Objects can be migrated
individually or in groups.
Example:
<OBJECTS TargetFolder="FSGBSEG">
<OBJECT Code="200143" Type="14"/>
</OBJECTS>
<OBJECTS TargetFolder="BASEG">
<OBJECT Code="M0001NW" Type="101"/>
<OBJECT Code="H0002CRP" Type="103"/>
</OBJECTS>
Note the following:
If you have not specified the TargetFolder, the objects are
imported to the folder specified in the FOLDER tag.
If the default FOLDER value is also not provided, then the
source folder value in the dump file is taken as the target
folder.
For the Catalog Publish object, the TargetFolder is mandatory.
For behavior in this release, see the Limitations section.

1. Once you have updated the files with the required information in the target environment:
 Create the metadata/restore folder under the $FIC_HOME/utility/Migration directory (if
not present).
 Copy the exported .DMP file that needs to be imported to the
$FIC_HOME/utility/Migration/metadata/restore folder.
 Navigate to the $FIC_HOME/utility/Migration/bin path and execute
ObjectMigration.sh.
2. Once executed, you can view the related log files from the
$FIC_HOME/utility/Migration/logs location.
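Combining the import tags, a minimal import configuration could be sketched as below. As with the export example, the element layout is an assumption to be checked against OBJECTMIGRATION_template.xml; the user ID, Infodom, folder codes, object codes, and dump file name are placeholders.

```xml
<!-- Illustrative import configuration: all values below are placeholders. -->
<OBJECTMIGRATION>
  <USERID>MIGUSER</USERID>
  <LOCALE>en_US</LOCALE>
  <INFODOM>TGTINFODOM</INFODOM>
  <FOLDER>BASEG</FOLDER>
  <MODE>IMPORT</MODE>
  <FILE>EXPORT_RULES</FILE>
  <IMPORTALL>N</IMPORTALL>
  <FAILONERROR>N</FAILONERROR>
  <OVERWRITE>Y</OVERWRITE>
  <RETAIN_IDS>N</RETAIN_IDS>
  <MIGRATION_CODE>8861</MIGRATION_CODE>
  <OBJECTS TargetFolder="BASEG">
    <!-- Import only these two objects from the dump file -->
    <OBJECT Code="M0001NW" Type="101" />
    <OBJECT Code="H0002CRP" Type="103" />
  </OBJECTS>
</OBJECTMIGRATION>
```

Because IMPORTALL is N here, only the two listed objects are picked out of EXPORT_RULES.DMP; setting it to Y would import everything in the dump regardless of the OBJECT entries.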


14.1.3 Migrating Objects Using CSV Files


This section explains how to migrate objects using the export_input.csv file and the import_input.csv
file. These files are present in the $FIC_HOME/utility/Migration/conf folder. This folder also
contains the migration.properties file and the OBJECTMIGRATION.xml file. You need not make any
entry in the OBJECTMIGRATION.xml file.
To migrate objects, perform the following steps:
1. Navigate to the $FIC_HOME/utility/Migration/conf folder.
2. Populate the migration.properties file with appropriate values as explained in the following
table.
The values in the properties file are updated by the installer. If you want to run this utility from
another location, the values should be specified accordingly.

Table 178: Names in the Property file and their Descriptions

Name Description

EXPORTIMPORT_BASEPATH Absolute path of the directory where the metadata/archive and
metadata/restore folders are created.
For example:
EXPORTIMPORT_BASEPATH=/scratch/ofsaaweb/OFSAAI/utility/Migration

FIC_HOME OFSAAI installation directory.
For example: FIC_HOME=/scratch/ofsaaweb/OFSAAI

READ_FROM_CSV Specify whether to read the inputs from CSV files or the
OBJECTMIGRATION.xml file.
Set this as Y. The utility then reads from the export_input.csv
file for exporting objects or from the import_input.csv file for
importing objects.

USERID Specify the user ID of the OFSAAI user who will be running the
migration utility. Ensure that the user is mapped to the specific
source Information Domain / Segment.
The user ID must be provided in capital letters.
Note: The User ID or Service accounts must be "SMS Auth Only" in
case of SSO and LDAP configured setups.

LOCALE Set this as en_US.

INFODOM Specify the Information Domain from where objects need to be
exported/imported.
The Information Domain name must be provided in capital
letters.


FOLDER This is applicable only for importing.
Specify the Code of the folder/segment to which you need to
import objects. The folder value should be provided in capital
letters.
If IMPORTALL_TARGET_FOLDER is not specified in case of
IMPORTALL=Y, then the objects are imported to this FOLDER.

MODE Set the mode of the operation as:
EXPORT - for exporting objects
IMPORT - for importing objects

DUMP_FILE_NAME For exporting, specify the name of the file to be exported, which
will be created under the
$FIC_HOME/utility/Migration/metadata/archive
folder as a .DMP file.
For importing, specify the name of the file to be imported, which
is present under the
$FIC_HOME/utility/Migration/metadata/restore
folder.

IMPORTALL Y indicates that all exported objects in the .DMP file (dump) will
be imported (regardless of any specific OBJECT entries in the
import_input.csv or OBJECTMIGRATION.xml file).
N indicates that only objects explicitly specified in the
import_input.csv or OBJECTMIGRATION.xml file will
be imported (provided they are already exported and available in
the dump file).
Note: When migrating Sandbox, IMPORTALL should be N.

IMPORTALL_TARGET_FOLDER Specify the target folder to which you want to import objects
when you specify IMPORTALL as Y. If this is not specified, the
objects are imported to FOLDER.

FAILONERROR Specify whether to fail the operation on any error.
Y - Stops the import process if there is any error.
N - Continues with the next object in the import process even if
there is an error.

OVERWRITE Specify whether to overwrite any existing metadata.
Y - Overwrites metadata even if the metadata already exists.
N - Does not overwrite the object if it already exists and
continues migrating the next object.


RETAIN_IDS Specify whether to retain the source AMHM object IDs after
migration.
Y - Retains the Source AMHM object IDs.
N - Does not retain the Source AMHM object IDs.
If you have chosen the value 'Y' for RETAIN_IDS and the Target
system does not consume the object ID of the Source object, the
ID is retained during migration. If the object in the Target
system consumes the object ID of the Source, the ID is not
retained during migration; instead, a new ID is generated.

MIGRATION_CODE Enter the unique migration code to identify the status of the
migration process.
For example: 8860
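A migration.properties file populated for a CSV-driven export run might look like the following sketch; the paths, user ID, Infodom, and dump file name are placeholder values for illustration.

```properties
# Illustrative values only; adjust for your environment.
EXPORTIMPORT_BASEPATH=/scratch/ofsaaweb/OFSAAI/utility/Migration
FIC_HOME=/scratch/ofsaaweb/OFSAAI
READ_FROM_CSV=Y
USERID=MIGUSER
LOCALE=en_US
INFODOM=SRCINFODOM
FOLDER=
MODE=EXPORT
DUMP_FILE_NAME=EXPORT_OBJECTS
IMPORTALL=N
IMPORTALL_TARGET_FOLDER=
FAILONERROR=N
OVERWRITE=N
RETAIN_IDS=N
MIGRATION_CODE=8860
```

For an import run on the target environment, the same file would be switched to MODE=IMPORT with the target Infodom and a FOLDER value.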

3. Update the import_input.csv or export_input.csv file based on whether you want to
import or export objects, as explained in the following tables:

NOTE: Any updates are made available in the
export_input_template.csv and
import_input_template.csv files. Before invoking the
command line utility, ensure that the updates available in the
template files are available in the export_input.csv and
import_input.csv files.

14.1.3.1 For Exporting Objects


Following are the entries in the export_input.csv file:

Table 179: Column Names in the export file and their Descriptions

Column Name Description

Object Code Specify the object Code which should be a unique identifier of the definition
based on the Object Type. It should be either system generated or user
defined unique code. See the Objects Supported for Command Line
Migration section to know for a particular object whether the code is user
defined or system generated.
You can specify the object Code value as wildcard “*” if you are migrating all
objects of that Object Type.

Object Type Specify the Type ID of the required metadata objects to be exported. Refer to
the Objects Supported for Command Line Migration section.


Object Sub Type SubType is available for Filters and AMHM hierarchy only. This is a
mandatory field.
For filters, SubType indicates the type of the filter. For hierarchies, this
indicates the Dimension ID.
See the table for filter SubTypes.

Sandbox Infodom Specify the Sandbox Information Domain name to export Sandbox.

With Models Specify Y if you want to export all models present in the Sandbox Infodom
along with the Sandbox.
Specify N if you want to export only the Sandbox.

Include Dependency Specify Y if you want to export all dependent objects along with the base
objects.
Specify N if you want to export only the mentioned object.

Include Instances This is applicable only for PMF migration.
Specify Y if you want to export Questionnaire related workflow instance
data.

Is Response Data Required This is applicable only for Questionnaire migration.
Specify Y if you want to export the responses for the Questionnaire.
Specify N if you want to skip it.

Application Code This is applicable only for Questionnaire migration.
Specify the application code for which you want to export the Questionnaire
data. For example, to migrate KYC related Questionnaire data, specify the
application code OFS_KYC. Similarly, you can specify the application code
for other applications and migrate the related Questionnaire data.

1. After entering the required details of the objects you want to export in the export_input.csv
file, navigate to $FIC_HOME/utility/Migration/bin path and execute
ObjectMigration.sh. The dump file will be created, which will have an import_input.csv with
list of all objects (including dependent ones) that are being exported.
2. Once executed, you can view the related log files from the
$FIC_HOME/utility/Migration/logs location.
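Based on the columns above, an export_input.csv that exports all Rules without dependents, plus one specific hierarchy together with its dependents, might look like the sketch below. The header row and column order are assumptions; verify them against the export_input_template.csv shipped with the installer, and note that the hierarchy code is a placeholder.

```csv
Object Code,Object Type,Object Sub Type,Sandbox Infodom,With Models,Include Dependency,Include Instances,Is Response Data Required,Application Code
*,112,,,,N,,,
H0002CRP,103,,,,Y,,,
```

Empty fields are simply left blank between commas; columns such as Sandbox Infodom or Application Code are filled only for the object types they apply to.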

14.1.3.2 For Importing Objects


Following are the entries in the import_input.csv file:


Table 180: Column Names in the import file and their Descriptions

Column Name Description

Object Code Specify the object Code which should be a unique identifier of the definition
based on the Object Type. It should be either system generated or user defined
unique code. See the Objects Supported for Command Line Migration section
to know for a particular object whether the code is user defined or system
generated.
You can specify the Object Code value as wildcard “*” if you are importing all
objects of that Object Type.
Note: Specify only those Codes that are present in the exported dump file.

Object Type Specify the Type ID of the required metadata objects to be imported. See the
Objects Supported for Command Line Migration section for Object Type IDs.

Object SubType SubType is available for Filters and AMHM hierarchy only. This is a mandatory
field.
For filters, SubType indicates the type of the filter. For hierarchies, this
indicates the Dimension ID.
See the table for filter SubTypes.

Sandbox Infodom Specify the Sandbox Information Domain name to import Sandbox.

With Models Specify Y if you want to import all models present in the Sandbox Infodom
along with the Sandbox.
Specify N if you want to import only the Sandbox.

Include Dependency Specify Y if you want to import all dependent objects along with the base
objects.
Specify N if you want to import only the mentioned object.

Is Base Object This attribute is for information and is not read while processing the input. This
will be set as Y if the exported object is a base object and will be N for all the
exported dependent objects.

Object Group and Object Specify a unique ID to the Object Group and the folder to which you want to
Group Target Folder import all the objects in that Object Group.
If Object Group is not specified, by default it takes the object group ID of the
preceding entry with Object Group. If the object group ID for the first entry is
not explicitly entered, it is assigned the value as ‘1’.
If object Group ID is specified and Object Group Target Folder is kept blank, the
objects of that Object Group will be imported to the folder mentioned in the
FOLDER tag in the migration.properties file. If that is also not
mentioned, it will be imported to the source folder mentioned in the dump file.
Note: An object with an Object Group ID different from the preceding object will
go to a new group. Hence, enter all the objects which you want to import to the
same folder successively.

Include Instances This is applicable only for PMF migration.
Specify Y if you want to import Questionnaire related workflow instance data.

Once you have updated the files with required information in the target environment:


 Create the metadata/restore folder under the $FIC_HOME/utility/Migration directory (if
not present).
 Copy the exported .DMP file that needs to be imported to the
$FIC_HOME/utility/Migration/metadata/restore folder.
 Navigate to $FIC_HOME/utility/Migration/bin path and execute
ObjectMigration.sh.
After it is executed, you can view the related log files from the
$FIC_HOME/utility/Migration/logs location.
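The preparation steps above can be sketched as a shell sequence. This is an assumption-laden illustration: in a real environment FIC_HOME is already set by the installer, the dump file path is a placeholder, and the actual copy and utility invocation are disabled here with the `:` no-op so the sketch stays self-contained.

```shell
#!/bin/sh
# Fall back to a scratch directory so the sketch can run standalone;
# in a real environment FIC_HOME points at the OFSAAI installation.
FIC_HOME="${FIC_HOME:-/tmp/ofsaai_demo}"

# 1. Create the restore folder if it is not present.
mkdir -p "$FIC_HOME/utility/Migration/metadata/restore"

# 2. Copy the exported dump into the restore folder
#    (placeholder path; disabled in this sketch with the ':' no-op).
: cp /path/to/EXPORT_OBJECTS.DMP "$FIC_HOME/utility/Migration/metadata/restore/"

# 3. Run the utility from its bin directory, then inspect the logs
#    (also disabled here; requires a full OFSAAI installation).
: cd "$FIC_HOME/utility/Migration/bin" && ./ObjectMigration.sh
echo "Restore folder ready: $FIC_HOME/utility/Migration/metadata/restore"
```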

Figure 313: Sample Import file

• mig_group_001 and mig_group_002 belong to Group 1, and they will be imported to the folder
EMFLD.
• mig_group_003 and mig_group_004 belong to Group 2, and they will be imported to the folder
IPEFLD.
• mig_group_005 will be imported to the default folder set under the <FOLDER> tag.
• mig_group_006 will be imported to the default folder set under the <FOLDER> tag even though
its Object Group ID is the same as that of mig_group_001. If you want mig_group_006 to be
imported to the same folder (EMFLD), then either you have to explicitly give the Object Group
Target Folder along with the Object Group, or the mig_group_006 entry should be inserted
before a change in the Object Group ID, that is, in the previous example, before the entry for
mig_group_003.

NOTE: If nothing is specified for the Include Dependency column, all the
dependent objects are exported.
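The grouping rules above can be illustrated with a fragment of an import_input.csv. Only the columns relevant to grouping are shown, and the header is an assumption based on the template file; the other columns described in Table 180 would appear between Object Type and Object Group.

```csv
Object Code,Object Type,Object Group,Object Group Target Folder
mig_group_001,112,1,EMFLD
mig_group_002,112,,
mig_group_003,112,2,IPEFLD
mig_group_004,112,,
mig_group_005,112,3,
mig_group_006,112,1,
```

Here mig_group_002 and mig_group_004 inherit the group of the preceding entry; mig_group_005 falls back to the FOLDER value because its group has no target folder; and mig_group_006 starts a new group despite reusing ID 1, so it also falls back to FOLDER.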

14.1.4 Limitations
• For AMHM objects, irrespective of the values specified in the TargetFolder or FOLDER tags, the
objects are migrated to the source folder available in the exported dump file. Hence, ensure that a
folder with the same name as in the dump file is present in the target environment.
• Ensure that the specified folder is present in the target environment during the IMPORT
operation. Currently, this is not validated.


14.1.5 Objects Supported for Command Line Migration


Table 181: Details of the Objects for Command Line Migration

Each entry lists the Object Name and Object Type ID, whether the wildcard "*" (Select ALL) option is supported, the Object Code, and the location of the Object Code (From UI / From Backend).

DATA QUALITY RULE (Type ID 120): Wildcard "*": Yes. Object Code: System generated code. From UI: In the Audit Trail pane, Object Code is displayed as System ID. From Backend: DQ_CHECK_MASTER > N_RULE_SYS_ID.

DATA QUALITY GROUP (Type ID 1003): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Name" in the Data Quality Groups Summary window.

DATA TRANSFORMATION 1 (Type ID 121): Wildcard "*": No. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Post Load Changes Summary window.

DATA SOURCES (Type ID 2102): Wildcard "*": Yes. Object Code: User defined name of the Data Source. From UI: Object Code is displayed as "Code" in the Data Sources Summary window.

SLOW CHANGING DIMENSIONS (SCD) 2 (Type ID 2103): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Map Reference number" in the Slow Changing Dimension Summary window.

ETL (Type ID 122): Wildcard "*": No. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Data Mapping Summary window.

1 Data Transformation objects, that is, Post Load Changes definitions based on Stored Procedures only are supported for migration.
2 Object migration support for Slow Changing Dimensions definitions is available from the OFSAAI 8.1.2.2.0 release.

DATA ENTRY FORMS AND QUERIES (DEFQ) (Type ID 124): Wildcard "*": Yes. Object Code: User defined unique code.

ALIAS (Type ID 54): Wildcard "*": Yes. Object Code: User defined unique code. From UI: In the Alias Summary window, select the Entity and the Code is displayed as "Alias".

DERIVED ENTITY (Type ID 128): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Derived Entity Summary window.

BUSINESS MEASURE (Type ID 101): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Business Measures Summary window.

BUSINESS DIMENSION (Type ID 102): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Business Dimension Summary window.

BUSINESS HIERARCHY (Type ID 103): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Business Hierarchy Summary window.

DATASET (Type ID 104): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Datasets Summary window.

BUSINESS PROCESSOR (Type ID 105): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Business Processor Summary window.

ESSBASE CUBE (Type ID 106): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Code" in the Business Processor Summary window.

ORACLE CUBE (Type ID 133): Wildcard "*": Yes. Object Code: User defined unique code. From UI: NA.

MAPPER (Type ID 136): Wildcard "*": Yes. Object Code: System generated code. From UI: Object Code is displayed as "Name" in the Map Maintenance window.

FORMS FRAMEWORK (Type ID 126): Wildcard "*": Yes. Object Code: User defined unique code. From Backend: FORMS_MASTER > FORM_CODE.

FORMS MENU (Type ID 125): Wildcard "*": Yes. Object Code: User defined unique code. From Backend: MENU_ITEMS > MENU_ID.

FORMS TAB (Type ID 1125): Wildcard "*": Yes. Object Code: User defined unique code. From Backend: TAB_MASTER > TAB_ID.

FORMS PAGE (Type ID 1127): Wildcard "*": Yes. Object Code: User defined unique code. From Backend: JSP_CONFIG_DETAILS > JSP_ID.

FORMS LAYOUT/TEMPLATE (Type ID 1126): Wildcard "*": Yes. Object Code: User defined unique code. From Backend: TEMPLATE_MASTER > TEMPLATE_ID.

RULE (Type ID 112): Wildcard "*": Yes. Object Code: System generated code. From UI: Object Code is displayed as "Code" in the Rule Summary window.

PROCESS (Type ID 111): Wildcard "*": Yes. Object Code: System generated code. From UI: Object Code is displayed as "Code" in the Process Summary window.

RUN (Type ID 110): Wildcard "*": Yes. Object Code: System generated code. From UI: Object Code is displayed as "Code" in the Run Summary window.

BATCH (Type ID 123): Wildcard "*": Yes. Object Code: System generated code. From UI: Object Code is displayed as "Batch ID" in the Batch Maintenance window.

DIMENSION (Type ID 12): Wildcard "*": Yes. Object Code: System generated code. From Backend: REV_DIMENSIONS_B > DIMENSION_ID.

FILTER (Type ID 1): Wildcard "*": Yes. Object Code: System generated code. From UI: In the Audit Trail pane, Object Code is displayed as System ID.

EXPRESSION (Type ID 14): Wildcard "*": Yes. Object Code: System generated code. From UI: In the Audit Trail pane, Object Code is displayed as System ID.

AMHM HIERARCHY (Type ID 5): Wildcard "*": Yes. Object Code: System generated code. From UI: In the Audit Trail pane, Object Code is displayed as System ID.

SANDBOX 3 (Type ID 1300): Wildcard "*": No. Object Code: System generated code. From UI: Object Code is displayed as "Sandbox ID" in the Sandbox Maintenance window in the Production Infodom.

VARIABLE (Type ID 1301): Wildcard "*": Yes. Object Code: System generated code. From UI: Object Code is displayed as "Variable ID" in the Variable Management window in the Production Infodom.

3 You can specify the name of the sandbox Infodom which you want to migrate for SANDBOXINFODOM attribute and Y for WITHMODELS attribute to
migrate the models along with the sandbox.

TECHNIQUE (Type ID 1302): Wildcard "*": Yes. Object Code: System generated code. From UI: Object Code is displayed as "Technique ID" in the Technique Registration window in the Production Infodom.

VARIABLE SHOCK (Type ID 1303): Wildcard "*": Yes. Object Code: System generated code with '_' and Version number. From UI: NA.

SCENARIO (Type ID 1304): Wildcard "*": Yes. Object Code: System generated code with '_' and Version number. From UI: NA.

MODEL (Type ID 1305): Wildcard "*": Yes. Object Code: System generated code with '_' and Version number. From UI: Object Code is displayed as "Model ID" and the version number as "Version" in the Model Management window in the Sandbox Infodom.

STRESS (Type ID 1306): Wildcard "*": Yes. Object Code: System generated code. From UI: Object Code is displayed as "Stress ID" in the Stress Definition window in the Production Infodom.

CATALOG PUBLISH (Type ID 1307): Wildcard "*": Yes. Object Code: System generated code.

User (Type ID 2000): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "User ID" in the User Maintenance window. From Backend: CSSMS_USR_PROFILE > V_USR_ID.

User Group (Type ID 2001): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "User Group ID" in the User Group Maintenance window. From Backend: CSSMS_GROUP_MAST > V_GROUP_CODE.

Role (Type ID 2002): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Role Code" in the Role Maintenance window. From Backend: CSSMS_ROLE_MAST > V_ROLE_CODE.

Function (Type ID 2003): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Function Code" in the Function Maintenance window. From Backend: CSSMS_FUNCTION_MAST > V_FUNCTION_CODE.

Profile (Type ID 2004): Wildcard "*": Yes. Object Code: User defined unique code. From UI: Object Code is displayed as "Profile Code" in the Profile Maintenance window. From Backend: CSSMS_PROFILE_MAST > V_PROFILE_CODE.

PMF Process (Type ID 8000): Wildcard "*": Yes. Object Code: User defined unique code. From UI: In the Process Modeller window, the Object Code to be used is displayed as Process ID. From Backend: AAI_WF_PROCESS_B > V_PROCESS_ID.

Questionnaire Configuration Attributes (Type ID 8001): Wildcard "*": Yes. Object Code: User defined code.

Question Definitions (Type ID 8002): Wildcard "*": Yes. Object Code: System generated code.

Questionnaire Definitions (Type ID 8003): Wildcard "*": Yes. Object Code: System generated code.


The following table provides the details of the filter Object Names and their SubType IDs.

Table 182: Details of the Object Name and their Types

Object Name Object SubType ID

DataElement Filter 4

Hierarchy Filter 8

Group Filter 21

Attribute Filter 25

14.1.6 Dependent Objects


The following table lists the objects that are supported for implicit dependency and the dependent
objects:

Table 183: Base Objects and their Dependent Objects

Each entry lists the Base Object Name and Base Object Type ID, followed by the Dependent Objects and their Type IDs.

DATA QUALITY RULE (120): DERIVED ENTITY (128).

DATA QUALITY GROUP (1003): DATA QUALITY RULE (120).

DATA TRANSFORMATION (121): NA.

DATA SOURCES (2102): NA.

ETL (122): DATA QUALITY RULE - this is not implemented.

DATA ENTRY FORMS AND QUERIES (DEFQ) (124): NA.

ALIAS (54): NA.

DERIVED ENTITY (128): DATASET (104), BUSINESS MEASURE (101), BUSINESS HIERARCHY (103), BUSINESS PROCESSOR (105).

BUSINESS MEASURE (101): ALIAS (54), DERIVED ENTITY (128).

BUSINESS DIMENSION (102): BUSINESS HIERARCHY (103).

BUSINESS HIERARCHY (103): DERIVED ENTITY (128), BUSINESS MEASURE (101), ALIAS (54), DATASET (104).

BUSINESS PROCESSOR (105): DATASET (104), BUSINESS MEASURE (101), BUSINESS PROCESSOR (105).

ESSBASE CUBE (106): DATASET (104), BUSINESS MEASURE (101), BUSINESS DIMENSION (102).

ORACLE CUBE (133): NA.

MAPPER (136): Hierarchies (103).

FORMS FRAMEWORK (126): Child Forms (126).

FORMS MENU (125): FORMS and LAYOUTS.

FORMS LAYOUT (1126): Forms (126).

FORMS TAB (36494): NA.

FORMS PAGE (1127): FORMS and LAYOUTS (126, 1126).

RULE (112): DATASET (104), MEASURE (101), HIERARCHY (103), BUSINESS PROCESSOR (105), DATA ELEMENT FILTER (4), GROUP FILTER (21), ATTRIBUTE FILTER (25), HIERARCHY FILTER (8).

PROCESS (111): EXTRACT DATA (122), LOAD DATA (122), TRANSFORM DATA (121), RULE (112), PROCESS (111), CUBE (106), DATA QUALITY GROUP (1003), VARIABLE SHOCK (1303), MODEL (1305).

RUN (110): EXTRACT DATA (122), LOAD DATA (122), TRANSFORM DATA (121), RULE (112), PROCESS (111), RUN (110), CUBE (106), DATA QUALITY GROUP (1003), VARIABLE SHOCK (1303), MODEL (1305), DATA ELEMENT FILTER (4), GROUP FILTER (21), ATTRIBUTE FILTER (25), HIERARCHY FILTER (8).

BATCH 123 Not implemented

MEMBERS NA
DIMENSION 12
ATTRIBUTES NA

BUSINESS HIERARCHY 103

FILTER 1 ATTRIBUTES NA

FILTER 1

EXPRESSION 14 EXPRESSION 14

AMHM HIERARCHY 5 Members NA

SANDBOX 2 1300 NA NA

BUSINESS HIERARCHY 103

BUSINESS MEASURE 101


VARIABLE 1301
BUSINESS PROCESSOR 105

DATASET 104

TECHNIQUE 1302 NA NA

VARIABLE 1301

VARIABLE SHOCK 1303 DATASET 104

BUSINESS HIERARCHY 103

SCENARIO 1304 VARIABLE SHOCK 1303

TECHNIQUE 1302
MODEL 1305
VARIABLE 1301


DATASET 104

BUSINESS HIERARCHY 103

DATA ELEMENT FILTER 4

RUN 110
STRESS 1306
SCENARIO 1304

CATALOG PUBLISH 1307 NA NA

USER 2000 PROFILE 2004

USER GROUP 2001 USER 2000

ROLE 2002 FUNCTION 2003

FUNCTION 2003 NA NA

PROFILE 2004 NA NA

PMF PROCESS 8000 NA NA

Questionnaire Configuration Attributes 8001 NA NA

Question Definitions 8002 NA NA

Questionnaire Definitions 8003 Questionnaire Configuration Attributes 8001
Question Definitions 8002

14.1.7 Migrating Security Management System (SMS) Objects


The Security Management System (Administration) objects, such as Users, User Groups, Roles,
Functions, and Profiles, can be migrated using the Command Line Utility.
The Command Line Utility enables migration of the following SMS objects along with their mappings:
 Users along with the User-User Group Mapping, User-Profile Mapping, and User-Attribute
Mapping
 User Groups along with the User Group-Role Mapping and User Group-Folder-Role
Mapping
 Roles along with the Role-Function Mapping
 Functions
 Profiles along with the Profile-Holiday Mapping

14.1.7.1 Pre-requisites
To ensure successful migration of all mappings, you must import the SMS objects in the following
order:


• Functions
• Roles
• User Group
• User
For example, to import the User-User Group mapping, you must migrate the User Group first,
followed by the User.
For more information on migrating objects, see the Migrating Objects section.

14.1.7.2 Object specific Migration


This section provides information about the prerequisites, Object Type IDs, dependent objects,
limitations, and dependencies for object-specific migration.
This section includes the following topics:
• Object Name: USERS
• Object Name: USERGROUP
• Object Name: ROLES
• Object Name: FUNCTION
• Object Name: PROFILE

14.1.7.3 Object Name: USERS


• Type ID: 2000
• Dependency: The dependent objects should be migrated to the Target system before the
object itself is migrated. If the dependent objects are not available in the Target system, then
only the object definitions are migrated and not the mappings.
• Dependent Objects: User Group, Profile

14.1.7.4 Object Name: USERGROUP


• Type ID: 2001
• Dependency:
 The dependent objects should be migrated to the Target system before the object itself is
migrated. If the dependent objects are not available in the Target system, then only the
object definitions are migrated and not the mappings.
 For User Group-Folder-Role mapping, the shared folder type should be available in the
Target system with the same name as in the Source and should be mapped to a domain in
the Target with the same name as in the Source. Also, the roles should be available in the
Target.
• Dependent Objects: Roles


14.1.7.5 Object Name: ROLES


• Type ID: 2002
• Dependency: The dependent objects should be migrated to the Target system before the
object itself is migrated. If the dependent objects are not available in the Target system, then
only the object definitions are migrated and not the mappings.
• Dependent Objects: Function

14.1.7.6 Object Name: FUNCTION


• Type ID: 2003

14.1.7.7 Object Name: PROFILE


• Type ID: 2004

NOTE While importing the Profile-Holiday mapping, if the holiday is not defined in the
target system, a new holiday is created.

14.2 Command Line Utilities to Execute RRF Definitions


RRF Rule definitions can be executed through the following command line utilities:
• Command Line Utility for Rule Execution
• Command Line Utility for Run Execution

14.2.1 Command Line Utility for Rule Execution


You can execute RRF Rule definitions through the command line utility.
To execute Rule definitions, do the following:
1. Navigate to $FIC_HOME/utility/RuleExecution/bin of OFSAAI APP tier.
2. Execute RuleExecution.sh (UNIX) along with the required arguments such as <BatchRunExeID>
<ComponentID> <TaskID> <MisDate> <DataStoreType> <INFODOM> <IPaddress> <RuleID>
<BuildFlag> <OptionalParameters> in the same order.

Table 184: Arguments in the Rule Definition and their Descriptions

Arguments Description

BatchRunExeID Refers to the Execution ID of the Batch being executed.

ComponentID Refers to The Type of component to be executed.

TaskID Refers to the Task ID.

MisDate Refers to the date with which the data for the execution would be filtered.


DataStoreType Refers to the type of data store, such as Enterprise Data Warehouse (EDW),
which refers to the Multi-dimensional Database/Cubes.

INFODOM Refers to the Information Domain mapped.

IPaddress Refers to the IP Address of the machine on which the Infrastructure Database
Components have been installed.

RuleID Refers to the Rule definition to be executed.

BuildFlag Refers to pre-compiled rules, which are executed with the query stored in the
database. If the Build Flag is set to "No", the query statement is formed dynamically by
retrieving the technical metadata details. If the Build Flag is set to "Yes", the relevant
metadata details required to form the rule query are re-compiled in the database.

OptionalParameters Refers to the set of parameters which behave as filter criteria for the
merge query.

For example,
ksh RuleExecution.sh RRFATOM_exec_rule_20120904_1 RULE_EXECUTION Task1
20120906 EDW RRFATOM A.B.C.D 1344397138549 N
'$RUNID=,$PHID=,$EXEID=,$RUNSK='
3. You can access the location $FIC_HOME/utility/RuleExecution/logs to view the related log files.
Also, the component-specific logs can be accessed in the location fic_home/ftpshare/logs.

14.2.2 Command Line Utility for Fire Run Service / Manage Run
Execution
The Manage Run Execution utility can be used to execute Run definitions through a RESTful Web
Service call. To achieve this, a RESTful Service, a Client, and a Shell script are available.

NOTE The REST authentication is done against the Service Account user mentioned under
the OFSAA_SRVC_ACC parameter in the CONFIGURATION table. This user should be
created with the "SMS Auth Only" attribute from the User Maintenance window. By
default, the OFSAA_SRVC_ACC parameter is set as SYSADMN.

Following are the pre-requisites before executing this utility:


1. Ensure that JAVA_HOME is pointing to JAVA bin installation directory.
2. Ensure FIC_HOME is pointing to application installation directory.
3. Set the PATH variable as $ICC_HOME/bin.
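The three prerequisites above amount to a shell environment of the following shape. This is an illustrative sketch only; the directory paths are placeholder assumptions, so substitute your own installation directories.

```shell
# Example environment setup for the prerequisites above.
# The directory paths below are placeholders, not actual installation paths.
export JAVA_HOME=/usr/java/jdk1.8.0
export FIC_HOME=/scratch/ofsaauser/OFSAAI
export ICC_HOME=$FIC_HOME/ficapp/icc
# Put $ICC_HOME/bin on the PATH, as required before running the utility.
export PATH=$ICC_HOME/bin:$PATH
echo "PATH now starts with: ${PATH%%:*}"
```

Setting these in the shell profile of the user that runs the utility avoids having to export them for every session.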
To execute this utility, do the following:
1. Navigate to $FIC_HOME/ficapp/icc/bin of OFSAAI APP tier.


2. Provide the following parameters in the command line.

Table 185: Arguments in the Run Definition and their Descriptions

Arguments Description

RUNCODE Refers to Run Code to be executed.

INFODOM Refers to the mapped Information Domain.

SEGMENT/FOLDER Refers to the Folder/Segment name in which the run is executed.

Run Execution Description Refers to the batch description.
Note: If the Run Execution description contains a space, it can be passed using double quotes.

USERNAME Refers to the user name of the user executing the run.
Note: The User ID or Service accounts are "SMS Auth Only" in the case of SSO and LDAP
configured setups.

MISDATE Refers to the date with which the data for the execution would be filtered.

3. Execute WSMRERequest.sh <Run Code> <Infodom> <Segment/Folder Code> <Run Execution
Description> <Username> <MIS Date <yyyyMMdd>>.
For example,
./WSMRERequest.sh "1305855689766" "APP" "APPSEG" "App approach"
"APPUSER" "20001231"
4. You can access the location $FIC_HOME/ficapp/icc/log/WSMRERequest.log to view the
related log files. Also the component specific logs can be accessed in the location <OFSAAI
deployed path>/logs.
Every execution of Fire Run Service creates a text file in the location ficapp/icc/mre which
contains the Batch ID created for that particular Run. The text file has the following format:
INFODOM_RUNID_MISDATE.mre
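Using the values from the example above, the generated file name can be sketched as follows (an illustrative assumption: the Run Code from the example stands in for RUNID):

```shell
# Illustrative construction of the .mre file name format described above,
# using the values from the WSMRERequest.sh example. The assumption that the
# example's Run Code serves as RUNID is ours, not stated by the utility.
INFODOM="APP"
RUNID="1305855689766"
MISDATE="20001231"
echo "${INFODOM}_${RUNID}_${MISDATE}.mre"
```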

14.3 Command Line Utility for DMT Migration


This is a standalone utility which can be used to migrate the DMT metadata stored in XML files into
corresponding tables in the database. This utility can be executed from the command line. This utility
supports migration of metadata for metadata types Data Mapping, Data File Mapping, Table based
Data Sources, Post Load Changes (DT), and DMT Big Data related XMLs (ETLLoader.properties,
Cluster.XML). This utility has four modes of operation with various sub modes.

14.3.1 Prerequisites
• All the required XML files, such as the TFM XML, ETL Repository XML, Definition XML, Properties
XML, and Mapping XML, must be present in the standard paths (relative to the ftpshare folder).


• Table AAI_ETL_SOURCE must be present in the Config Schema, with all appropriate
information.
• Ensure the DMTUpgradeUtility_806.sh file is present in
$FIC_HOME/utility/DMT/Migration/bin folder.
• Ensure aai-dmt-migration.jar is present in
$FIC_HOME/utility/DMT/Migration/lib. (This jar and other dependent OFSAA jars are
available in the aforementioned path. The DMTUpgradeUtility_806.sh file contains the list
of such jars.)
• Ensure the Clusters.XML file is present in the $FIC_HOME/conf directory.
• Ensure the ETLLoader.properties file is present in the $FIC_HOME/ficdb/conf directory.
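A quick pre-flight check of the files listed above can be sketched as follows. This is illustrative only and not part of the utility; $FIC_HOME must point to your OFSAA installation.

```shell
# Illustrative pre-flight check for the prerequisite files listed above.
# Assumes $FIC_HOME points to the OFSAA installation directory.
for f in "$FIC_HOME/utility/DMT/Migration/bin/DMTUpgradeUtility_806.sh" \
         "$FIC_HOME/utility/DMT/Migration/lib/aai-dmt-migration.jar" \
         "$FIC_HOME/conf/Clusters.XML" \
         "$FIC_HOME/ficdb/conf/ETLLoader.properties"; do
  if [ -f "$f" ]; then echo "OK: $f"; else echo "MISSING: $f"; fi
done
```

Any MISSING line should be resolved before running the migration utility. (The AAI_ETL_SOURCE table in the Config Schema must be checked separately through the database.)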
To run the utility directly from the console:
1. Navigate to $FIC_HOME/utility/DMT/Migration/bin folder.
2. Execute ./DMTUpgradeUtility_806.sh with the following arguments:

Table 186: Argument Descriptions and their Values

Argument Name Description Value

MIGRATION TYPE Specify the mode of operation.
• UPGRADE (recommended mode)
• ONLY_DEFINITION (recommended mode)
• UPGRADE_AS_VERSION
• ONLY_DEFINITION_AS_VERSION
For more information, see the Modes of Operation section.

METADATA TYPE Specify the metadata type that you want to migrate.
• ALL - to migrate all metadata types.
• Enter the specific metadata type that you want to migrate. The available metadata types are
DMT_SRC, DMT_PLC, DMT_DM (to migrate F2T, T2T, and T2F), CLUSTERINFO (to migrate
Cluster information), and ETLPROPINFO (to migrate ETLLoader.properties).
Note: The DMT_SRC Metadata Type is supported only for Migration Type set as UPGRADE or
ONLY_DEFINITION. Only Data Sources based on Table and WebLog are supported for
migration.

INFODOM NAME Specify the information domain name. This argument is applicable only for
MIGRATION TYPE as ONLY_DEFINITION and ONLY_DEFINITION_AS_VERSION.
• ALL - to migrate metadata from all information domains.
• Enter the specific information domain name if you want to migrate metadata of a particular
information domain only.


DEFINITION NAME Specify the definition name that you want to migrate. This argument is
applicable only for MIGRATION TYPE as ONLY_DEFINITION and ONLY_DEFINITION_AS_VERSION.
• ALL - to migrate all definitions.
• Enter the specific definition name that you want to migrate.
• For DMT_SRC metadata type, specify as <Source Name 1>~<Infodom 1>,<Source Name
2>~<Infodom 2>,<Source Name 3>~<Infodom 3>. That is, a list of source and corresponding
Infodom combinations separated by commas.
• For DMT_DM metadata type, specify as <Application Name>~<Source Name>~<Definition
Name>.
• For DMT_PLC metadata type, specify the definition name.

14.3.2 Modes of Operation


Based on the value specified for the argument MIGRATION TYPE, the utility can be operated in
different modes:

NOTE Recommended modes are UPGRADE and ONLY_DEFINITION.

MIGRATION TYPE set as UPGRADE


./DMTUpgradeUtility_806.sh UPGRADE <METADATA_TYPE>
In this scenario, the utility will check for the value set for METADATA TYPE. If it is set as ALL, the XML
data of all metadata types will be migrated to the corresponding tables. If METADATA TYPE is set to a
specific metadata, then the XML data of only that specific metadata will be migrated.
For example,
./DMTUpgradeUtility_806.sh UPGRADE DMT_DM
Note that INFODOM NAME and DEFINITION NAME will be implicitly set to ALL, irrespective of what
the user sets.
If metadata type is not set, it is implicitly set as ALL. For example, if you execute the following
command, all metadata types will be migrated:
./DMTUpgradeUtility_806.sh UPGRADE
In case of rerun of the migration utility, if a metadata is already present in the target environment, that
metadata will be skipped.
MIGRATION TYPE set as UPGRADE_AS_VERSION
./DMTUpgradeUtility_806.sh UPGRADE_AS_VERSION <METADATA_TYPE>


In this scenario, the specified metadata type will be migrated to the corresponding tables by
incrementing the version if the definition already exists in the target environment. If
<METADATA_TYPE> is set as ALL, all metadata types will be migrated.
For example,
./DMTUpgradeUtility_806.sh UPGRADE_AS_VERSION DMT_PLC
Note that INFODOM NAME and DEFINITION NAME will be implicitly set to ALL, irrespective of what
the user sets.
If metadata type is not set, it is implicitly set as ALL. For example, if you execute the following
command, all metadata will be migrated:
./DMTUpgradeUtility_806.sh UPGRADE_AS_VERSION
MIGRATION TYPE set as ONLY_DEFINITION
./DMTUpgradeUtility_806.sh ONLY_DEFINITION <Metadata type> <information
domain name> <Definition name>
This mode is used to migrate XML data of a particular definition to the corresponding tables. In this
mode, it is mandatory to set METADATA TYPE, INFODOM NAME and DEFINITION NAME arguments.
Otherwise, the utility execution will fail.
For example,
./DMTUpgradeUtility_806.sh ONLY_DEFINITION DMT_DM OFSAAINFO <Application
Name>~<Source Name>~<Definition Name>
./DMTUpgradeUtility_806.sh ONLY_DEFINITION DMT_SRC <Source Name 1>~<Infodom
1>,<Source Name 2>~<Infodom 2>,<Source Name 3>~<Infodom 3>

NOTE The Metadata Type DMT_SRC is supported only for table based
sources in ONLY_DEFINITION mode.
For Metadata Type DMT_DM, <information domain name>
should be a valid Infodom name, but the definition will not be
migrated to the specified Infodom name. It will be migrated to
all its mapped Information Domains, which are listed in the
ETLrepository.xml file.

In case of rerun of the migration utility, if a metadata definition is already present in the target
environment, that definition will be skipped.
MIGRATION TYPE set as ONLY_DEFINITION_AS_VERSION
./DMTUpgradeUtility_806.sh ONLY_DEFINITION_AS_VERSION <Metadata type>
<information domain name> <Definition name>
This mode is used to migrate XML data of a particular definition to the corresponding tables by
incrementing the version if the definition already exists in the target environment. In this mode, it is
mandatory to set METADATA TYPE, INFODOM NAME and DEFINITION NAME arguments. Otherwise,
the utility execution will fail.
For example,
./DMTUpgradeUtility_806.sh ONLY_DEFINITION_AS_VERSION DMT_DM OFSAAINFO
F2Tdefinition1


For Metadata Type DMT_DM, <information domain name> should be a valid Infodom name, but the
definition is not migrated to the specified Infodom name. It will be migrated to all its mapped
Information Domains, which are listed in the ETLrepository.xml file.

14.3.3 Few Important Pointers


1. To reflect the migration changes, OFSAA services should be restarted.
2. All metadata should have a Metadata Code with a maximum length of 250 characters. Old XML-
based DMT definitions had only a name, so after migration, the existing name is used as the
Code. If the name exceeds 250 characters, migration of that metadata is skipped.
3. DMT_SRC is supported only for table based source in ONLY_DEFINITION mode.
4. While migrating a Data Mapping metadata (T2T, T2F), the underlying table based source will
also be migrated.
5. While migrating a Data File Mapping metadata (F2T) there are some assumptions that we need
to make, as the File based Sources have undergone a design change in the 8.0.6 version.
a. Each existing data file mapping definition (F2T) has a unique file based source.
b. The File based Source will be migrated implicitly by the utility when the F2T definition is
being migrated.
c. The source properties of the existing F2T definition will be set as the Properties of the File
Based Source.
d. If there are more than one F2T definition mapped to a single File Based source, then a new
unique File Based Source will be created for each F2T. Name of the new source will be
<Source Name>_ <Definition Name>. All references to the Source Name for this F2T in ICC
and RRF tables will be updated by the migration utility.
6. The new 806 table structure does not support a definition with the same name being present in
more than one source. For such definitions, the second occurrence of the definition is made
unique by appending the source name to the definition:
 Modified Definition Name: <Definition Name>_<Source Name>
 All references to the definition name in ICC and RRF will be modified by the migration utility.
7. There have been a few modifications to property names present in the
ETLLoader.properties file, which are migrated to the AAI_DMT_CONFIG table. Following
are the old property codes and the corresponding new ones.
 T2TMode -> T2T_MODE
 T2HMode -> T2H_MODE
 H2TMode -> H2T_MODE
 H2HMode -> H2H_MODE
 F2HMode -> F2H_MODE
 KEEP_WEBLOG_PROCESSED_FILES -> KEEP_WEBLOG_PROCESSED_FILE
 ISHIVELOCAL -> IS_HIVE_LOCAL
 SQOOPURL -> SQOOP_URL
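The renames above can be captured in a small helper function. This is an illustrative sketch only, not part of the migration utility; it simply restates the mapping listed above.

```shell
# Illustrative mapping of old ETLLoader.properties codes to the new
# AAI_DMT_CONFIG codes listed above (not part of the migration utility).
map_property_code() {
  case "$1" in
    T2TMode)                     echo "T2T_MODE" ;;
    T2HMode)                     echo "T2H_MODE" ;;
    H2TMode)                     echo "H2T_MODE" ;;
    H2HMode)                     echo "H2H_MODE" ;;
    F2HMode)                     echo "F2H_MODE" ;;
    KEEP_WEBLOG_PROCESSED_FILES) echo "KEEP_WEBLOG_PROCESSED_FILE" ;;
    ISHIVELOCAL)                 echo "IS_HIVE_LOCAL" ;;
    SQOOPURL)                    echo "SQOOP_URL" ;;
    # Codes not in the rename list pass through unchanged.
    *)                           echo "$1" ;;
  esac
}
map_property_code T2TMode
map_property_code SQOOPURL
```

Such a helper is handy when grepping old property files to predict what the migrated AAI_DMT_CONFIG entries will look like.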


8. The following properties have been changed and will not be migrated from the
ETLLoader.properties file into the AAI_DMT_DB_CLUSTER_PROPERTY table. You must
manually update the AAI_DMT_DB_CLUSTER_PROPERTY table with the new values, or use the
DMT Configurations window to update them. The values must go into the source or target
clusters as required.
 SQOOPSERVER_NAME -> SSH_HOST_NAME
 SQOOPSERVER_SSH_PORT -> SSH_PORT
 SQOOPSERVER_SSH_USERID -> SSH_USERID
 SQOOPSERVER_SSH_PASSWORD -> SSH_PASSWORD
9. In case of PLC Migration, ensure the function defined for the Stored Procedure in the <Infodom
name>_TFM.XML is the same as the actual function in the Atomic Schema. In case of a
mismatch, in the Edit mode of the PLC definition, the actual function in the Atomic Schema is
replaced by the function in the <Infodom name>_TFM.XML. If the SQL in the Transformation
has compilation errors, modification of the PLC definition will fail.

14.3.4 Logs
The following logs will be created in $FIC_HOME/utility/DMT/Migration/log folder:
• DMTMigrationUtility.log - This is a debug log. All parsing-related information is
available in this log file.
• DMTMigrationUtilityReport.log - This log file gives the status of all metadata that has
been migrated.
For errors during metadata save, see <Deployed Path>/webroot/logs/OFSAA.log.

14.3.5 Troubleshooting
In case of unsuccessful migration, refer to the following logs for further debugging:
1. Make a note of failed T2Ts, if any, from the report log (DMTMigrationUtilityReport.log). If
migration failed due to seeded XML errors, it is logged in the detailed migration log
(DMTMigrationUtility.log). Search this log with the Definition code to find the exact error.
2. If this doesn’t give sufficient information, see $ftpshare/logs/Migration/DMT/
DMTMigrationService.log for further details. Search this log with the Definition code to
find the exact error.

NOTE For FAQs and use cases related to DMT Metadata Migration
Utility, see FAQ section in OFSAA DMT Metadata Migration
Guide.


14.4 Command Line Utility for File Encryption


This is a standalone utility which is used to encrypt and decrypt data files. It supports
generation of a symmetric encryption key in AES 256-bit format.
This utility has no dependency on OFSAA or the DMT module. However, running it requires the
log4j-core*.jar and log4j-api*.jar files.
Use Cases:
• If the user has opted for File Encryption from the DMT Configurations window:
 In case of T2F or H2F, the output file will be an encrypted file. To decrypt the data file, the
user needs to use this utility.
 In case of F2T or F2H, the input file should be an encrypted file. To encrypt the data file, the
user needs to use this utility.

Prerequisites
• Ensure the following files are present in $FIC_HOME/utility/DMT/encryption/bin folder.
 dmtfileencryption.sh
 aai-dmt-encryption.jar
 log4j-core*.jar
 log4j-api*.jar
• Since the utility uses AES 256 bit encryption, it is mandatory to apply policy files. Perform the
following instructions to apply policy files:
a. Download the Java Cryptography Extension (JCE) Unlimited Strength Jurisdiction Policy
Files from Oracle. Be sure to download the correct policy file updates for your version of
Java (Java 7 or 8).
b. Uncompress and extract the downloaded file. The download includes a Readme.txt and two
.jar files with the same names as the existing policy files.
c. Locate the two existing policy files inside the folder <java-jre-home>/lib/security/.
 local_policy.jar
 US_export_policy.jar
d. Replace the existing policy files with the unlimited strength policy files you extracted.
To run the utility directly from the console:
1. Navigate to $FIC_HOME/utility/DMT/encryption/bin folder.
2. Execute ./dmtfileencryption.sh with the following arguments:


Table 187: Argument Descriptions and their Values

Argument Name Description Value

MODE Specify the mode of operation.
• genkey
• encrypt_file
• decrypt_file
For more information, see the Modes of Operation section.

KEYFILE Absolute path of the key file, including the key file name.

INPUTFILE Absolute path of the input file, including the input file name.

OUTPUTFILE Absolute path of the output file, including the output file name.

Modes of Operation
Based on the value specified for the argument MODE, the utility can be operated in different modes:
MODE set as genkey
./dmtfileencryption.sh genkey <KEYFILE>
In this mode, the utility takes as input the absolute path to which the key has to be written. It
creates a 256-bit AES key and writes it to the location given in the <KEYFILE> argument.
MODE set as encrypt_file
./dmtfileencryption.sh encrypt_file <INPUTFILE> <OUTPUTFILE> <KEYFILE>
In this mode, the utility takes the input file path, output file path, and key file path as inputs. Using
the 256-bit AES key in the given key path, the input file is encrypted and written to the given
output file path.
MODE set as decrypt_file
./dmtfileencryption.sh decrypt_file <INPUTFILE> <OUTPUTFILE> <KEYFILE>
In this mode, the utility takes the input file path, output file path, and key file path as inputs. Using
the 256-bit AES key in the given key path, the input file is decrypted and written to the given
output file path.

NOTE Input and output file absolute paths should be different.
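Putting the three documented modes together, a typical round trip looks like the following sketch. It is run from the utility's bin directory on the OFSAA server; the file paths are example assumptions, and the guard lets the snippet degrade gracefully on hosts where the utility is absent.

```shell
# Illustrative round trip over the three documented modes: generate a key,
# encrypt a data file, then decrypt it to a different output path.
# File paths are examples only; input and output paths must differ.
BIN=./dmtfileencryption.sh
KEY=/tmp/dmt_aes256.key
if [ -x "$BIN" ]; then
  "$BIN" genkey "$KEY"
  "$BIN" encrypt_file /tmp/stage/data.dat /tmp/stage/data.enc "$KEY"
  "$BIN" decrypt_file /tmp/stage/data.enc /tmp/stage/data.out "$KEY"
else
  echo "dmtfileencryption.sh not found in the current directory"
fi
```

The same key file must be used for both the encrypt and the decrypt steps, and it should be protected with restrictive file permissions.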

Logs
The DMTFileEncryption.log file will be created in $FIC_HOME/utility/DMT/encryption/log
folder.


14.5 Command Line Utility to Publish Metadata in Metadata Browser
A command line utility MDBPublishExecution.sh is available to publish Metadata in Metadata Browser.
Following are the pre-requisites before executing this utility:
1. If the FICSERVER is configured to cache the metadata at the start up of the server, you need to
wait till the caching of metadata is completed to invoke this utility.
2. Ensure that JAVA_HOME is pointing to JAVA bin installation directory.
3. Ensure that the following jar files are present in the $FIC_DB_HOME/lib directory:
aai-wsclient-mdbpublish.jar, aai-wsmdbpublishservice.jar
4. Ensure that MDBPublishExecution.properties file is present in $FIC_DB_HOME/conf
folder.
You can also manually update the properties file in the path
$FIC_DB_HOME/conf/MDBPublishExecution.properties to point to the required
ServiceURL.
MDBPUBLISH_EXECUTION_WSDL_LOCATION = URL of WebService (For example,
http://<<IP ADDRESS>>/OFSAAI/mdbPublishExecution?wsdl)
5. Metadata should be present.

NOTE Metadata definitions of length more than 200 characters are not supported for MDB
Publish.

To execute Metadata Browser publish utility:


1. Navigate to $FIC_DB_HOME/bin of OFSAAI FIC DB tier.
2. Execute MDBPublishExecution.sh (UNIX)
For example, ./MDBPublishExecution.sh
3. While executing, provide any of the following parameters as required:
 ALL - To publish metadata to all the available information domains.
 INFODOM1 - To publish metadata to only one (specified) information domain.
 INFODOM1~INFODOM2~INFODOM3 - To publish metadata to multiple (specified)
information domains separated by tilde "~".

NOTE If no parameter is specified, by default the "ALL" option is considered.

4. You can access the location $FIC_DB_HOME/log/MDBPublishExecution.log to view the
related log files.


5. The publish execution specific log information is present in the MDBPublish.log file available
at the <DEPLOYED LOCATION>/<Context>.ear/<Context>.war/logs folder.
To run the utility through the Operations module:
1. Navigate to the Operations module and define a batch.
2. Add a task by selecting the component as RUN EXECUTABLE.
3. Enter Metadata Value as mentioned in the example.
For Example:
Component ID: RUN EXECUTABLE
Metadata Value (Executable) like:
MDBPublishExecution.sh,LANG611INFO
(where LANG611INFO is the Infodom)
Batch = Y

14.6 Command Line Utility for Object Application Mapping in Metadata Browser
The following command line utility is introduced to perform Object Application mapping.
Following are the pre-requisites before executing this utility:
1. Ensure that JAVA_HOME is pointing to JAVA bin installation directory.
2. Ensure that the following jar files are present in the $FIC_DB_HOME/lib directory:
aai-wsclient-mdbpublish.jar, aai-wsmdbpublishservice.jar
3. Ensure that ObjAppMap.properties file is present in $FIC_DB_HOME/conf folder.
You can also manually update the properties file in the path $FIC_DB_HOME/conf/
ObjAppMap.properties to point to the required ServiceURL.
MAP_WSDL_LOCATION = URL of WebService (For example,
https://<<IP ADDRESS>>/OFSAAI/mdbObjAppMap?wsdl)
To execute Metadata Object Application Mapping utility:
1. Navigate to $FIC_DB_HOME/bin of OFSAAI FIC DB tier.
2. Execute MDBObjAppMap.sh (UNIX)
For example, ./MDBObjAppMap.sh
3. While executing, provide any of the following parameters as required:
 ALL - To do object application mapping in all the available information domains.
 INFODOM1 - To do object application mapping in only one (specified) information domain.
 INFODOM1~INFODOM2~INFODOM3 - To do object application mapping in multiple
(specified) information domains separated by tilde "~".


NOTE If no parameter is specified, by default the "ALL" option is considered.

4. You can access the location $FIC_DB_HOME/log/MDBObjAppMap.log to view the related log
files.

14.7 Command Line Utility for Resaving UAM Hierarchy Objects
OFSAAI provides a utility called RUNIT.sh to resave UAM Hierarchy Objects. This file resides in
the ficdb/bin area.

14.7.1 Executing RUNIT.sh from Console


To run the utility directly from the console:
1. Navigate to $FIC_DB_HOME/bin of OFSAAI FIC DB tier.
2. Execute RUNIT.sh (UNIX).
For example, ./RUNIT.sh
This will resave all the available hierarchy objects.
3. Provide the following parameters if you want to resave only specific hierarchy objects:
 INFODOM - Specify the information domain name.
 USERID - Specify the user ID.

NOTE The User ID or Service accounts are “SMS Auth Only” in case of
SSO and LDAP configured setups.

 HIERARCHY Code - Specify the hierarchy codes separated by tilde "~" or caret "^" to resave
only those hierarchies. Specify the hierarchy codes separated by an exclamation mark "!" to
exclude those hierarchies from resaving.
 Asynchronous Mode - Specify whether you want to save the hierarchy in a synchronous
manner or not. No indicates that saving of hierarchies happens only after the population of
the REV_BIHIER and REV_LOCALE_HIER tables in the atomic schema. This is an optional
parameter; if it is not mentioned, the utility runs in asynchronous mode.
./RUNIT.sh INFODOM USERID HIERARCHY_CODE1^HIERARCHY_Code2 OPTIONAL
PARAMETER
Example 1:
./RUNIT.sh OFSAAINFO AAAIUSER HR01^HR02 NO
Or
./RUNIT.sh OFSAAINFO AAAIUSER HR01~HR02 NO
This will resave the hierarchies HR01 and HR02 in the OFSAAINFO information domain.


Example 2:
./RUNIT.sh OFSAAINFO AAAIUSER HIE001!HIE002 NO
This will resave all the hierarchies in the OFSAAINFO information domain except the hierarchies
HIE001 and HIE002.

NOTE If you want to exclude only one hierarchy, it should be preceded with "!".
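The separator conventions above can be illustrated with a small sketch. This only mimics the argument format for clarity; it is not RUNIT.sh's actual implementation, and the function name is our own.

```shell
# Illustrative reading of the HIERARCHY code argument (not RUNIT.sh internals):
# "~" or "^" separate codes to resave; "!" separates codes to exclude.
classify_hierarchy_arg() {
  case "$1" in
    # Any "!" in the argument means every listed code is excluded.
    *!*) echo "exclude: $(echo "$1" | tr '!' ' ')" ;;
    # Otherwise the listed codes (or a single code) are resaved.
    *)   echo "resave: $(echo "$1" | tr '~^' '  ')" ;;
  esac
}
classify_hierarchy_arg "HR01^HR02"
classify_hierarchy_arg "HIE001!HIE002"
```

The first call reports HR01 and HR02 as hierarchies to resave; the second reports HIE001 and HIE002 as exclusions, matching Examples 1 and 2 above.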

14.7.2 Executing RUNIT.sh from Operations Module (ICC)


To run the utility through the Operations module:
1. Navigate to the Operations module and define a batch.
2. Add a task by selecting the component as RUN EXECUTABLE.
3. Under the Dynamic Parameter List panel, specify the following in the Executable field:
a. To resave all the available hierarchy objects, use the following command:
./RUNIT.sh
b. To resave particularly some hierarchy objects, use the following command:
./RUNIT.sh,INFODOM,USERID,HIERARCHY_code1^HIERARCHY_code2,No
Example 1:
./RUNIT.sh,OFSAAINFO,USERID,Hier01^Hier02^Hier03,No
This will resave the hierarchies Hier01, Hier02, and Hier03 in the OFSAAINFO information
domain.
Example 2:
./RUNIT.sh,OFSAAINFO,AAAIUSER,HIE001!HIE002
This will resave all the hierarchies in the OFSAAINFO information domain except the hierarchies
HIE001 and HIE002. That is, specify the hierarchy codes separated by exclamation mark “!” to
exclude those hierarchies from resaving.
If you want to exclude only one hierarchy, it should be preceded with “!”.
4. After saving the Batch Definition, execute the batch to resave the UAM Hierarchy Objects.

14.7.3 Executing RUNIT.sh from RRF Module


To run the utility through the RRF module:
1. Navigate to the RRF module and define a Run with Job as Executable:
2. Click the button adjacent to the component name and specify the parameters in the following
format:
To resave all the available hierarchy objects:
“./RUNIT.sh”

To resave only specific hierarchy objects:
"./RUNIT.sh","INFODOM","USERID","HIERARCHY_code1^HIERARCHY_code2","No"
Example 1:
"./RUNIT.sh","OFSAAINFO","USERID","Hier01^Hier02^Hier03","No"
This will resave the hierarchies Hier01, Hier02, and Hier03 in the OFSAAINFO information
domain.
Example 2:
"./RUNIT.sh","OFSAAINFO","AAAIUSER","HIE001!HIE002"
This will resave all the hierarchies in the OFSAAINFO information domain except the hierarchies
HIE001 and HIE002. That is, specify the hierarchy codes separated by exclamation mark “!” to
exclude those hierarchies from resaving.
If you want to exclude only one hierarchy, it should be preceded with “!”.
3. After saving the Run Definition, execute it to resave the UAM Hierarchy Objects.

14.7.4 Utility Status Information


You can view the status of the utility and the hierarchies that are saved from the following tables:
• AAI_UTILS_AUDIT table - Stores the utility run status (execution started, completed, or failed). A transaction ID generated for each run is also stored here.
• AAI_UTILS_AUDIT_DETAILS table - Mapped to each transaction ID generated in AAI_UTILS_AUDIT; stores the status of each hierarchy (success/exception/completed). This table also stores the Data save and Metadata save status (success/exception/completed) for each hierarchy.
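For a quick check after a run, the two audit tables can be queried from any SQL client connected to the schema that holds them. The sketch below only prints the queries; the table names come from this guide, while any further filtering columns must be verified against your schema.

```shell
# Print the queries used to inspect the utility run status; feed them to
# sqlplus or another SQL client. Only the table names come from the guide.
q1="SELECT * FROM AAI_UTILS_AUDIT ORDER BY 1"
q2="SELECT * FROM AAI_UTILS_AUDIT_DETAILS"
printf '%s;\n%s;\n' "$q1" "$q2"
```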

14.8 Command Line Utility for Resaving Derived Entities and Essbase Cubes
OFSAAI provides a utility called MetadataReSave.sh to resave Derived Entity objects and Essbase Cubes. This file resides in the ficdb/bin directory. When resaving Derived Entities, you can apply additional runtime filters dynamically to refresh only a selected set of records in the Derived Entities.
To run the utility directly from the console:
1. Navigate to $FIC_DB_HOME/bin of OFSAAI FIC DB tier.
2. Execute MetadataReSave.sh (UNIX) with proper parameters:
 INFODOM- Specify the information domain name.
 USERID- Specify the user id.

NOTE The User ID or Service accounts are “SMS Auth Only” in case of
SSO and LDAP configured setups.

 Metadata Service Type - 856 for Derived Entity and 5 for Essbase Cube
 Derived Entity Code for resaving Derived Entities - Specify the derived entity codes separated by tilde "~"
Or
Essbase Cube Code for resaving Essbase Cubes - Specify the Essbase Cube code
 Runtime filter - For a Derived Entity, specify the runtime filter to refresh only a selected set of records.
For example,
For resaving Derived Entities:
./MetadataReSave.sh,INFODOM,USERID,856,<Derived Entity code1>~<Derived Entity code2>
For resaving Derived Entities with Runtime Filters:
./MetadataReSave.sh OFSAAAIINFO AAAIUSER 856 DE006 3^4 -f "DIM_ACCOUNT.f_Latest_Record_Indicator = 'Y'"
For resaving Essbase Cube:
./MetadataReSave.sh,INFODOM,USERID,5,<Essbase Code>

NOTE ~ is not supported for Essbase Cubes. Only one Essbase Cube
can be resaved at a time.
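When the runtime filter contains spaces, it must reach MetadataReSave.sh as a single quoted argument. The sketch below assembles the console example above and counts the arguments; nothing is executed against OFSAA, and the values are taken from the example.

```shell
# Illustrative only: assemble the console invocation so the -f filter
# reaches MetadataReSave.sh as one argument (values from the example above).
filter="DIM_ACCOUNT.f_Latest_Record_Indicator = 'Y'"
set -- ./MetadataReSave.sh OFSAAAIINFO AAAIUSER 856 DE006 3^4 -f "$filter"
echo "$#"   # 8 tokens in total; the quoted filter counts as one
```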

To run the utility through the Operations module:


1. Navigate to the Operations module and define a batch.
2. Add a task by selecting the component as RUN EXECUTABLE.
3. Under the Dynamic Parameter List panel, specify the following in the Executable field:
For resaving Derived Entities:
./MetadataReSave.sh,INFODOM,USERID,856,<Derived Entity code1>~<Derived Entity code2>
For resaving Derived Entities with Runtime Filters:
./MetadataReSave.sh,OFSAAAIINFO,AAAIUSER,856,DE006,4^5,-f,DIM_STANDARD_ACCT_HEAD.V_STD_ACCT_HEAD_ID='CAP622'
For resaving Essbase Cube:
./MetadataReSave.sh,INFODOM,USERID,5,<Essbase Code>
4. Select Yes or No for the Wait and Batch Parameter drop-down lists. For more information, see
Component: RUN EXECUTABLE section.
After saving the Batch Definition, execute the batch to resave Derived Entity Objects or Essbase
Cubes.
You can find the logs in $FIC_DB_HOME/log/MetadataReSave.log.

14.8.1 Command Line Utility for Resave, Refresh and Delete Partitions
A command line utility called RefreshByPartition.sh is available to resave, refresh and delete
partitions.
To run the utility directly from the console:
1. Navigate to $FIC_DB_HOME/bin of OFSAAI FIC DB tier.
2. Execute RefreshByPartition.sh with proper parameters:
./RefreshByPartition.sh <DSNNAME> <USERNAME> <METADATA SERVICE TYPE>
[<METADATACODE>] <ADD_or_REFRESH_PARTITIONS(SEPARATED BY "^")>
<DELETE_PARTITION(SEPARATED BY "^")>
 <DSNNAME> - Information Domain name
 <USERNAME> - User Name of the logged-in user
 <METADATA SERVICE TYPE> - 856 for Derived Entity
 [<METADATACODE>] - Derived Entity Code for which you want to refresh, add, or delete partitions
 <ADD_or_REFRESH_PARTITIONS> - Specify the partitions that need to be added or refreshed, separated by ^
 <DELETE_PARTITION> - Specify the partitions that need to be deleted, separated by ^
For example:
./RefreshByPartition.sh TESTCHEF TESTUSER 856 DE003 1^2^3^4^5^6 2^4
Assume partitions 1, 2, 3, and 4 already exist. In this case, 1 and 3 are refreshed, 5 and 6 are added, and 2 and 4 are deleted.

NOTE • Deleting partitions happens before adding partitions.
• Existing partitions will continue to exist if they are not mentioned in the parameter list.
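The add/refresh/delete semantics above can be sketched in plain shell: a requested partition that already exists is refreshed, one that does not is added, and a partition in the delete list is removed regardless. This is an illustration of the rules using the example's values, not code from the utility.

```shell
# Classify requested partitions against the existing ones (illustrative;
# mirrors the RefreshByPartition.sh example: existing 1-4, addp 1-6, delp 2,4).
existing="1 2 3 4"
requested="1 2 3 4 5 6"   # addp-style list, shown space-separated here
deletes="2 4"             # delp-style list
refresh=""; add=""
for p in $requested; do
  case " $deletes " in *" $p "*) continue ;; esac   # delete wins
  case " $existing " in
    *" $p "*) refresh="$refresh $p" ;;              # already exists -> refreshed
    *)        add="$add $p" ;;                      # new -> added
  esac
done
echo "refresh:${refresh}  add:${add}  delete: ${deletes}"
```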

14.8.2 Command Line Utility for Partition-Based Derived Entities


The command line utility RefreshPartitions.sh enables the handling of interdependency,
performance improvements, executions, rebuilds, and migrations for partition-based Derived Entities.
The utility refreshes partition-based Derived Entities. It does not affect non-partition-based Derived Entities, or Derived Entities that were changed to partition-based from the backend without being explicitly resaved after backend metadata seeding. In such cases, resave the Derived Entities before referencing them in the refresh partition utility.
Based on the parameters provided, the utility can add, purge, and refresh partitions for Derived Entities. It can also apply the Run filter provided as a parameter. You can specify partitions to add, partitions to delete, or both. The filter condition is not mandatory; if provided, it is applied only when partitions are added or refreshed.

The salient features of this command line utility are as follows:


• Enhanced to rebuild and retain all available partitions along with the history.
• Provides a wrapper materialized view on pre-built tables, configurable through a flag.
• Provides a separate METADATA_PERFORMANCE_PARAMS table at the Derived Entity code level for performance management.
• Provides a new METADATA_EXECUTION_PARAMS execution table for Derived Entities to register the information for Refresh Partition invocation.
• Introduces a new METADATA_EXECUTION_LOGS execution logging table for Derived Entities to capture the execution log, unified with the batch ID.

NOTE • Deleting partitions happens before adding partitions.
• Existing partitions will continue to exist if they are not mentioned in the parameter list.

TIP You can run the utility as described in the following subsections by providing the parameters explicitly as part of the execution, or by using the externalized runtime parameters method described in the Externalize Dynamic Parameters for Derived Entity Refresh Partition Executions section.

More Topics in this Section:


• Externalize Dynamic Parameters for Derived Entity Refresh Partition Executions
• Manage Derived Entity Performance
• Capture the Derived Entity Execution Logs in the Database
The following options are available to run the utility:
1. Run Directly from the Console
2. Run from the Operations Module

14.8.2.1 Run Directly from the Console


To run the utility directly from the console:
1. Navigate to the $FIC_DB_HOME/bin directory of the OFSAAI FIC DB tier.
2. Execute RefreshPartitions.sh with the required parameters:
./RefreshPartitions.sh dsn=<INFODOM> usr=<USERNAME>
code=<DERIVED_ENTITY_CODE> addp=<PARTITIONS_TO_ADD/REFRESH>
delp=<PARTITIONS_TO_DELETE> filter=<RUN FILTER>

 dsn - Information Domain name
 usr - User Name of the logged-in user
 code - Derived Entity Code for which you want to refresh, add, or delete partitions
 addp - Specify the partitions that need to be added or refreshed, separated by ^
 delp - Specify the partitions that need to be deleted, separated by ^
For example:
./RefreshPartitions.sh dsn=OFSAAAIINFO usr=AAAIUSER code=DE0001 addp=3^4 delp=1^2 filter="DIM_STANDARD_ACCT_HEAD.V_STID IN ('CAP622','CAP628')"

14.8.2.2 Run from the Operations Module


To run the utility from the Operations module (ICC):
1. Navigate to the Operations module and define a batch.
2. Add a task by selecting the component as RUN EXECUTABLE.
3. Under the Dynamic Parameter List panel, enter the following in the Executable field:
./RefreshPartitions.sh dsn=<INFODOM> usr=<USERNAME>
code=<DERIVED_ENTITY_CODE> addp=<PARTITIONS_TO_ADD/REFRESH>
delp=<PARTITIONS_TO_DELETE> filter=<RUN FILTER>
 dsn - Information Domain name
 usr - User Name of the logged-in user
 code - Derived Entity Code for which you want to refresh, add, or delete partitions
 addp - Specify the partitions that need to be added or refreshed, separated by ^
 delp - Specify the partitions that need to be deleted, separated by ^
For example:
./RefreshPartitions.sh,dsn=OFSAAAIINFO,usr=AAAIUSER,code=DE0001,addp=3^4,delp=1^2,filter="DIM_STANDARD_ACCT_HEAD.V_STID IN ('CAP622','CAP628')"

14.8.2.3 Externalize Dynamic Parameters for Derived Entity Refresh Partition Executions
It can be difficult to provide dynamic values to the parameters during execution of the refresh partition utility as described in the Command Line Utility for Partition-Based Derived Entities section. Hence, the application is enhanced to read Derived Entity execution parameters from the METADATA_EXECUTION_PARAMS table. Provide the required parameters in the table, and the utility will pick them up before execution. For details, see the following illustration and table:

Example of the METADATA_EXECUTION_PARAMS Table

METADATA_EXECUTION_PARAMS Table Details

Column Name           Description
V_METADATA_CODE       The DE Code.
N_METADATA_TYPE       The type is 856.
V_METADATA_INFODOM    The name of the metadata Infodom.
V_PARAM_NAME          The partition parameter type. For example, addp, delp, and filter.
V_PARAM_VALUE         The value for the partition parameter type. For example, in the preceding illustration, the value for addp is 7^8, the value for delp is 3^4, and the value for filter is DIM_STANDARD_ACCT_HEAD_LATEST_RECORD_INDICATOR="Y".
D_RECORD_DATE         The date when the data was added or updated.
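A row in this table can be seeded with a plain INSERT. The sketch below only prints such a statement; the column names come from the table above, while the Derived Entity code, Infodom, and partition values are illustrative and must match your environment.

```shell
# Print a hypothetical seed row for METADATA_EXECUTION_PARAMS; run the output
# via sqlplus or another SQL client. Column names are from the guide's table;
# the values DE0001 / OFSAAAIINFO / 7^8 are illustrative only.
sql="INSERT INTO METADATA_EXECUTION_PARAMS
  (V_METADATA_CODE, N_METADATA_TYPE, V_METADATA_INFODOM,
   V_PARAM_NAME, V_PARAM_VALUE, D_RECORD_DATE)
VALUES ('DE0001', 856, 'OFSAAAIINFO', 'addp', '7^8', SYSDATE)"
echo "$sql;"
```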

The following options are available to run the utility:


1. Run Directly from the Console
2. Run from the Operations Module

14.8.2.3.1 Run Directly from the Console

To run the utility directly from the console:


1. Navigate to the $FIC_DB_HOME/bin directory of the OFSAAI FIC DB tier.
2. Execute RefreshPartitions.sh with the required parameters:
RefreshPartitions.sh dsn=<INFODOM> usr=<USERNAME>
code=<DERIVED_ENTITY_CODE> params=external
 dsn - Information Domain name
 usr - User Name of the logged in user
 code - Derived Entity Code for which you want to refresh, add or delete partitions
 params - Required Runtime Partition parameters

NOTE params=external reads data from the table.

14.8.2.3.2 Run from the Operations Module

To run the utility from the Operations module (ICC):


1. Navigate to the Operations module and define a batch.
2. Add a task by selecting the component as RUN EXECUTABLE.
3. Under the Dynamic Parameter List panel, enter the following in the Executable field:
RefreshPartitions.sh dsn=<INFODOM> usr=<USERNAME>
code=<DERIVED_ENTITY_CODE> params=external
 dsn - Information Domain name
 usr - User Name of the logged in user
 code - Derived Entity Code for which you want to refresh, add or delete partitions
 params - Required Runtime Partition parameters

NOTE params=external reads data from the table.

14.8.2.4 Manage Derived Entity Performance


The application is enhanced to manage the performance of derived entities at an enterprise level. You
can configure the performance at an individual derived entity definition level. The database table
METADATA_PERFORMANCE_PARAMS holds this configuration. Provide the required performance
parameters in the database table with the details shown in the following illustration and table:

Example of the METADATA_PERFORMANCE_PARAMS Table

METADATA_PERFORMANCE_PARAMS Table Details

Column Name           Description
V_METADATA_CODE       The DE Code.
N_METADATA_TYPE       The type is 856.
V_METADATA_INFODOM    The name of the metadata Infodom.
V_SELECT_HINT         The query for the select hint.
V_INSERT_HINT         The query for the insert hint.
V_PARALLELISM         The query for parallelism.
V_PRESCRIPTS          The prescripts that modify connection session attributes; these are executed before the derived entity queries are fired.
V_POSTSCRIPTS         The postscripts that roll back connection session attributes; these are executed after the derived entity queries complete.

14.8.2.5 Capture the Derived Entity Execution Logs in the Database


The application is enhanced to store the Derived Entity execution logs in the database for ease of viewing. The METADATA_EXECUTION_LOGS table, present in the Config Schema, can be used to verify the details of the execution logs as shown in the following illustration:

NOTE Define the Derived Entity ICC Batch with the Batch Parameter as Y to capture the execution log identified by the batch ID.

Reference to the METADATA_EXECUTION_LOGS table to View Logs

To view the logs for a particular execution, filter on the following columns:
• V_BATCH_RUN_ID
• V_TASK_ID
• V_METADATA_CODE
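A filtered query over these columns can be sketched as below. The table and column names come from this section; the batch run ID value is illustrative and depends on how your batches are named.

```shell
# Print a query that narrows METADATA_EXECUTION_LOGS to one execution;
# run the output in the Config Schema. The batch run ID is a placeholder.
batch_run_id="SAMPLE_BATCH_RUN_ID"
sql="SELECT * FROM METADATA_EXECUTION_LOGS
WHERE V_BATCH_RUN_ID = '${batch_run_id}'
ORDER BY V_TASK_ID, V_METADATA_CODE"
echo "$sql;"
```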

14.9 Command Line Utility for Mapper Pushdown


OFSAAI provides a utility called MapPushDown, which is used for the pushdown operation of mapper definitions. This utility refreshes the mapping maintained in the atomic table based on the latest members available in the hierarchy and the macros already defined for the mapper definition. It resides in the ficdb/bin directory.
To run the utility directly from the console:

1. Navigate to $FIC_DB_HOME/bin of OFSAAI FIC DB tier, where the utility is present.


2. Execute the following command:
./MapPushDown.sh <INFODOM>
where <INFODOM> is a mandatory parameter that represents the information domain in which the utility will be run.
This command will push down all the mapper definitions in the specified Infodom.
3. Provide the Mapper Codes separated by tilde "~" if you want to push down only specific mapper definitions:
Command:
./MapPushDown.sh <INFODOM> <Mapper code1~ Mapper code2>
For example,
./MapPushDown.sh BASEL 1099999999~1099999998~1099999997
To run the utility as an executable component from RRF:
1. Navigate to the RRF module.
 Define a Process definition with component as Executable.
 Pass parameters as required and add the Process into a Run to be fired.
Or
 Define a Run definition with component as Executable.
 Pass parameters as required and fire the Run definition.
Sample data for creating a Process with Executable component:
"MapPushDown.sh","BASEL","1099999998"
To run the utility through the Operations module:
1. Navigate to the Operations module and define a batch.
2. Add a task by selecting the component as RUN EXECUTABLE.
3. Pass parameters as required.
4. Under Dynamic Parameter List panel, specify ./MapPushDown.sh <INFODOM> or
./MapPushDown.sh <INFODOM> <Mapper code1~ Mapper code2> in the Executable
field.
Sample Data for executing through ICC:
./MapPushDown.sh BASEL 1099999998

14.10 Command Line Utility for Downloading Metadata Objects in PDF Format
A command line utility called MDBPDFDownloadExecution.sh is available to download the details of published metadata objects in PDF format. This utility is present in the $FIC_DB_HOME/bin folder.
To execute the MDBPDFDownloadExecution utility:

1. Navigate to $FIC_DB_HOME/bin of OFSAAI FIC DB tier.


2. Execute MDBPDFDownloadExecution.sh with proper arguments.
./MDBPDFDownloadExecution.sh infodom=<INFODOM> objCodes=[<LIST OF
OBJECT CODES>] folderName=[<Folder Name>]
 infodom=<INFODOM> - Specify the Infodom where the metadata objects you want to download are present.
 objCodes=[<LIST OF OBJECT CODES>] - Specify the object codes of the metadata objects, separated by commas. This is an optional parameter. If it is not given, all objects belonging to the specified Infodom are downloaded.
 folderName=[<Folder Name>] - Specify the fully qualified folder name where the downloaded PDFs should be placed. This is an optional parameter. If it is not given, the PDFs are stored at the ftpshare path.
For example, ./MDBPDFDownloadExecution.sh infodom=OFSAAIINFO
objCodes=HCY001,DIM001 folderName=/scratch/ofsaobie/ofsaa806
The parameters for the utility, such as infodom, objCodes, and folderName, are case-sensitive.
3. You can find the related logs in the following locations:
 $FIC_DB_HOME/log/MDBPDFDownload.log
 <DEPLOYED LOCATION>/<Context>.ear/<Context>.war/logs/MDB.log

14.11 Command Line Utility for LDAP Migration


OFSAAI provides a command line utility called the LDAP Migration utility to migrate:
 users registered in the LDAP server to OFSAA
 user-user group mappings in LDAP to OFSAA
 user groups in OFSAA to the LDAP server
This utility is present in the $FIC_DB_HOME/bin folder.
To run the utility directly from the console:
1. Navigate to $FIC_DB_HOME/bin of OFSAAI FIC DB tier, where the utility is present.
2. To migrate users from LDAP server to OFSAA, execute the following command:
ldapmigration.sh <user> <password> LDAPTOSMS user <ldap_server>
<user_search_filter> <user_base>
3. To migrate users in a particular user group in LDAP server to OFSAA, execute the following
command:
ldapmigration.sh <user> <password> LDAPTOSMS groupmember <ldap_server>
<group_search_filter> <group_base>

NOTE This migration assumes the same user group exists in OFSAA.

4. To migrate only user-user group mapping from LDAP server to OFSAA, execute the following
command:
ldapmigration.sh <user> <password> LDAPTOSMS usergroupmap <ldap_server>
<group_search_filter> <group_base>

NOTE This migration assumes the same user group exists in OFSAA.

5. To migrate user groups from OFSAA to LDAP server, execute the following command:
ldapmigration.sh <user> <password> SMSTOLDAP group <ldap_server>
<group_search_filter>
where
<user> - Specify SYSADMN as the user name.
<password> - Specify the SYSADMN password.
<ldap_server> - Specify the LDAP server name. For example, ORCL1.in.oracle.com.
<user_search_filter> - Specify the filter condition for user search.
<user_base> - Specify the user context base.
<group_search_filter> - Specify the filter condition for user group search.
<group_base> - Specify the group context base.
For example,
ldapmigration.sh SYSADMN password1 SMSTOLDAP group ORCL1.in.oracle.com
OFSAAGRP
ldapmigration.sh SYSADMN password1 LDAPTOSMS user ORCL1.in.oracle.com
objectclass=organizationalPerson cn=Users,dc=oracle,dc=com

14.12 Model Upload Utility


The Model Upload Utility uploads the Data Model through the command line by executing a shell script file. It is used to upload Models that are large. The erwin file or Database XML generated using the TransformErwin.sh utility that contains the Data Model information must be placed at <ftpshare>/<infodom>/erwin/erwinXML. This utility is present in the $FIC_HOME/ficapp/common/FICServer/bin folder.
The following are the prerequisites before executing this utility:
1. Ensure that JAVA_HOME in the .profile is pointing to the JAVA bin installation directory.
2. Set the FIC_HOME path in the user .profile.
3. Ensure that the following jar files are present in the $FIC_HOME/ficapp/common/FICServer/lib directory:
 datamodel.jar
 FICServer.jar

 dateent.jar

14.12.1 Run the Model Upload Utility


1. Navigate to $FIC_HOME/ficapp/common/FICServer/bin location.
2. Execute upload.sh from the command line as shown in the following:
./upload.sh <infodom> <entire file path> <username> <uploadmode N/R/AM/AP> <modelUploadType E/C> <startsFilter> <containsFilter> <endsFilter> <runscriptsFlag> <constraintNoValidateFlag> <EntityJsonFlag> <DDLMigrationFlag> <DDL Logs Flag> <considerCustomization> <Object Registration Mode> <Refresh Params>

NOTE Ensure that you have execute permission on the script.

The following are the descriptions for the arguments in the upload.sh file:
 <infodom> - Refers to the DSN name. The information domain to where the model
upload to be done.
 <entire file path> - Refers to the entire file path of the erwin XML or Database XML.
For example, $FTP_SHARE/$INFODOM/erwin/erwinXML/PFT_model.xml. Set this as
Null for DB Catalog and Data Model Descriptor options.
 <username> - Refers to the username of the OFSAA Application.

NOTE The User ID or Service accounts are “SMS Auth Only” in case of
SSO and LDAP configured setups.

 <uploadmode N/R/AM/AP> - Refers to the Upload Choice Code.


 N - Refers to the New Model Upload.
 R - Refers to the Complete Model Rebuild Upload.
 AM - Refers to the Incremental Model Upload.
 AP - Refers to the Sliced Model Upload.
 <modelUploadType E/C> - Refers to the Model Upload type.
 E - erwin upload
 C - Catalog Generation
 Set this as Null for Data Model Descriptor option.
 <startsFilter> - This argument should be given only for Catalog generation.
For example,
For Catalog - dim_test
For erwin and Data Model Descriptor options - Null
 <containsFilter> - This argument should be given only for Catalog generation.
For example,
For Catalog - dim_test
For erwin and Data Model Descriptor options - Null
 <endsFilter> - This argument should be given only for Catalog generation.
For example,
For Catalog - dim_test
For erwin and Data Model Descriptor options - Null

NOTE Do not alter the filter conditions startsFilter, containsFilter, and endsFilter.

 <runscriptsFlag> - Set this as TRUE or FALSE.
 TRUE - Updates the database/schema with the Model changes.
 FALSE - Does not update the database/schema with the Model changes. If this is set to FALSE, you must execute the SQL scripts generated as part of the OFSAAI model upload process in the correct sequence to make the Infodom schema consistent with the DATABASE.xml. For more information, see the Sequence of Execution of Scripts section.
 <constraintNoValidateFlag> - Provides an option to enable constraints in the NOVALIDATE state. During Incremental and Sliced Model upload, the constraint validation is based on the value provided for this flag.
 TRUE - Enables constraints in the NOVALIDATE state and does not check the existing data for integrity constraint violations.
 FALSE - Does not enable constraints in the NOVALIDATE state and checks the existing data for integrity constraint violations.
 considerCustomization - If customization is allowed on columns, set it as TRUE, else
set it as FALSE.
 EntityJSONflag- Set this as TRUE if the model upload option is selected as Data Model
Descriptor, else set this as FALSE.
 ScriptsMigratedFlag- Set this as TRUE or FALSE.
 FALSE - To resume the model upload process from script generation. That is, if you
have copied only database xml file to your target environment, set this as FALSE.
 TRUE - To resume the model upload process from script execution. That is, if you
have copied only database xml file and DB scripts to your target environment, set this
as TRUE.

 DDL Logs Flag- Set this as TRUE to print execution audit logs for Scripts. The logs can be
found at ftpshare/<infodom>/executelogs/<infodom>_DDLLOG_<last data model
version>_<MM.DD.YYYY>-<HH.MM.SS>.log.
 Refresh Params – Set this as TRUE to use Database session parameters during model
upload process, else set this as FALSE.
 Object Registration Mode – Set it as F for full Object Registration or I for incremental
object registration.

NOTE Incremental object registration should be opted for only if the object registration on the base environment was incremental. Full Object Registration can be performed irrespective of the mode opted for in the base environment.
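Putting the argument list together, a full invocation can be sketched as below. Every value (Infodom, model file name, user, flags) is illustrative, and the positional order follows the command template above; verify it against your installed version before running, since the command is only printed here, not executed.

```shell
# Illustrative upload.sh command line, built as a string (not executed).
# Positional order follows the template above: infodom, file path, username,
# uploadmode, modelUploadType, three filters, runscriptsFlag,
# constraintNoValidateFlag, EntityJsonFlag, DDLMigrationFlag, DDL Logs Flag,
# considerCustomization, Object Registration Mode, Refresh Params.
cmd='./upload.sh OFSAAINFO $FTP_SHARE/OFSAAINFO/erwin/erwinXML/PFT_model.xml AAAIUSER N E null null null TRUE TRUE FALSE FALSE TRUE FALSE F FALSE'
echo "$cmd"
set -- $cmd
echo "$#"   # 17 tokens: the script name plus 16 positional arguments
```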

The various parameters to be passed for different modes are shown in the following matrix:

Table 188: Parameters for Different Modes

Start point         Object Registration status        DatabaseXMLFlag   ScriptsMigratedFlag   ObjectRegistrationFlag
Script generation   Full Object Registration          True              False                 F
Script generation   Incremental Object Registration   True              False                 I
Script execution    Full Object Registration          True              True                  F
Script execution    Incremental Object Registration   True              True                  I

3. Logs are updated in the regular Model Upload log at ftpshare/<infodom>/logs/<infodom>_LOG_<last data model version>_<MM.DD.YYYY>-<HH.MM.SS>.log.

NOTE During incremental model upload, when the uploadmode is set as AM, some of the mappings done in Data Integrator may get invalidated. You are required to save these mappings again.

14.12.2 Model Upload Details


Some Java settings need to be configured while uploading data models with various sizes of XML files. This can be done by:
 Picking from the server
 Model Upload Utility
 Browsing the file in the local computer
These Java settings differ depending on the availability of RAM. You must ensure that the Default and Temporary tablespace assigned to the Oracle user is allocated the required space. The following table lists the Java settings for both client and server machines:

Table 189: Details of the Java Settings for Both Client and Server Machines

Model Upload Option           Size of Data Model XML File   X_ARGS_APP ENV Variable in OFSAAI APP Layer
Pick from Server              106 MB                        "-Xms1024m -Xmx1024m"
                              336 MB                        "-Xms2048m -Xmx2048m"
                              815 MB                        "-Xms4096m -Xmx4096m"
                              1243 MB                       "-Xms6144m -Xmx6144m"
Model Upload Utility          106 MB                        "-Xms1024m -Xmx1024m"
                              336 MB                        "-Xms2048m -Xmx2048m"
                              815 MB                        "-Xms4096m -Xmx4096m"
                              1243 MB                       "-Xms6144m -Xmx6144m"
Save New erwin File in Server 106 MB                        "-Xms1024m -Xmx1024m"
                              336 MB                        "-Xms2048m -Xmx2048m"
                              815 MB                        "-Xms4096m -Xmx4096m"
                              1243 MB                       "-Xms6144m -Xmx6144m"
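The heap values in the table are supplied through the X_ARGS_APP environment variable in the OFSAAI APP layer. A hedged example for an approximately 815 MB model file, assuming X_ARGS_APP is set in your profile:

```shell
# Set the Java heap for a large model upload (values from Table 189).
# Where X_ARGS_APP is sourced in your .profile is installation-specific.
export X_ARGS_APP="-Xms4096m -Xmx4096m"
echo "$X_ARGS_APP"
```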

14.13 Command Line Utility for Object Registration


The Register Objects Utility is used to perform object registration separately if it failed during the Model Upload process. You can execute the shell script file RegisterObjects.sh from the command line. This utility is present in the $FIC_HOME/ficapp/common/FICServer/bin directory.
To run the utility directly from the console:
1. Navigate to $FIC_HOME/ficapp/common/FICServer/bin.
2. Open RegisterObjects.sh and enter the following arguments in the file:
 <infodom> - Refers to the DSN name.
3. Execute the script using the command:
./RegisterObjects.sh

NOTE Ensure that you have execute permission. The log file in the ftpshare folder is empty; the logs are printed only in the console.

14.14 Command Line Utility for Transforming erwin XML to Database XML or JSON (ODM)
A standalone command line utility called TransformErwin.sh is provided that can run on lower environments to generate Database XML or JSON from the erwin XML file. This utility does not have any dependency on OFSAA. It is used for converting OOB and customized erwin XML to Database XML or JSON. This utility is available in the $FIC_HOME/utility directory. You can copy the utility to any machine and run it.
To run the utility directly from the console, perform the following steps:
1. Navigate to the $FIC_HOME/utility/TransformErwin/bin folder, or wherever the utility is placed.
2. Execute TransformErwin.sh using the following command:
TransformErwin.sh <ErwinFilePath> <outputFilePath> <parserType> <generateJson> <modelname>
 <ErwinFilePath> or <databasexmlFilePath> - The absolute path of the erwin XML file or the absolute path of the Database XML file.
 <outputFilePath> - Destination path to store the output XML file.
 <parserType> - Enter S to use the Saxon parser or X to use the Xalan parser. If this parameter is null, the Saxon parser is used.
 <generateJson> - By default, the value is null. If the value is true, the JSON is generated in the ODM (Oracle Data Model) archive file format. If the value is null, the DB XML is generated.
 <modelName> - If generateJson is true, the model name can be provided.
Outcome after using the utility:
 The ODM is generated with the model name. The ODM has the entity-wise JSONs and a master XML, which contains the information about the entity-wise JSONs.
 The Database XML is generated with the same name as the erwin file being transformed, with the suffix _DB. For example, if the erwin file name is OFS_PFT_Datamodel.XML, the resulting DB XML is OFS_PFT_Datamodel_DB.XML.
For example:
DB XML:
TransformErwin.sh …/erwin/erwinXML/MDL_INTVL_PART_BASE.xml …/erwin/erwinXML X
JSON:
TransformErwin.sh …/erwin/erwinXML/MDL_INTVL_PART_BASE.xml …/erwin/erwinXML X true mdl_name

Verify the log files located in the $FIC_HOME/utility/TransformErwin/logs folder.
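The _DB naming convention described above can be sketched with shell parameter expansion; the file name is illustrative:

```shell
# Derive the transformed file name: insert _DB before the extension.
in="OFS_PFT_Datamodel.XML"
out="${in%.*}_DB.${in##*.}"   # strip extension, append _DB, re-add extension
echo "$out"                   # OFS_PFT_Datamodel_DB.XML
```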

14.15 Command Line Utility for Generating Slice JSON (ODM)
A standalone command line utility called generateSliceJson.sh is provided that can run on lower environments to generate Slice JSON from the old XML (erwin XML or Database XML) and the new XML (erwin XML or Database XML) file.
Slice utility compares the old XML (erwin XML or Database XML) and new XML (erwin XML or
Database XML). Based on the checksum values:
 If the checksum matches, it will ignore the JSON.
 If the checksum values do not match, then the JSON (ODM) files are generated.
This utility does not have any dependency on OFSAA. It is used for converting OOB and the
customized erwin XML or Database XML to JSON (ODM) files. This utility is available at
$FIC_HOME/utility directory. You can copy the utility to any machine and run the utility.
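The checksum decision described above can be sketched as follows. This is a conceptual illustration only, not the shipped utility; the stand-in file names and the use of cksum are assumptions:

```shell
# Conceptual sketch of the per-entity decision: emit JSON (ODM) only
# when the old and new entity definitions have different checksums.
printf '<ENTITY name="DIM_A"/>'  > old_entity.xml   # stand-in old XML
printf '<ENTITY name="DIM_AX"/>' > new_entity.xml   # stand-in new XML

old_sum=$(cksum < old_entity.xml)
new_sum=$(cksum < new_entity.xml)
if [ "$old_sum" = "$new_sum" ]; then
  decision="JSON ignored"            # checksums match: skip the entity
else
  decision="JSON (ODM) generated"    # checksums differ: generate JSON
fi
echo "$decision"
```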
To run the utility directly from the console, perform the following steps:
1. Navigate to the $FIC_HOME/utility/SliceJsonGenerateUtility/bin folder (or the folder where the utility is copied).
2. Execute generateSliceJson.sh using the following command:
generateSliceJson.sh <oldXmlPath> <newXmlPath> <destinationFolder> <modelname> <parserType>
• <oldXmlPath> - The absolute path of the old erwin XML file or the old Database XML file.
• <newXmlPath> - The absolute path of the new erwin XML file or the new Database XML file.
• <destinationFolder> - The absolute path of the destination folder where the zip file (.ODM) is created.
• <modelName> - The model name must be provided. The ODM file is generated with the model name.
• <parserType> - Enter S to use the Saxon parser or X to use the Xalan parser. If this parameter is null, the Saxon parser is used.
Outcome after using the utility
• The ODM is generated with the model name. The ODM is a sliced JSON that has the entity-wise JSONs and a master XML, which contains the information about the entity-wise JSONs.
For example:
ODM:
generateSliceJson.sh …/erwin/erwinXML/MDL_INTVL_PART_BASE_OLD.xml …/erwin/erwinXML/MDL_INTVL_PART_BASE_NEW.xml …/erwin/erwinXML X mdl_name
Verify the log files located in the $FIC_HOME/utility/SliceJsonGenerateUtility/logs folder.

14.16 Command-Line Utility for SQL Modeler to JSON (ODM)

A standalone command-line utility, transformsqlmodel.sh, is provided that can run on lower environments to generate JSON (ODM) files from SQL Modeler files. This utility does not have any dependency on OFSAA. It is available in the $FIC_HOME/utility directory. You can copy the utility to any machine and run it.
To run the utility from the console, follow these steps:
1. Navigate to the $FIC_HOME/utility/TransformSQLModel/bin directory (or the directory where the utility exists).
2. Execute transformsqlmodel.sh using the following command:
transformsqlmodel.sh <sqlXmlPath> <destinationFolder> <modelName>
• <sqlXmlPath> - The absolute path of the SQL Modeler files.
• <destinationFolder> - The destination path to store the output JSON (ODM) files.
• <modelName> - The name of the model.
Output after running the utility
The ODM is generated with the model name entered. The ODM has the entity-wise JSONs and a master XML file, which contains the information about the entity-wise JSONs.
For example:
transformsqlmodel.sh …/erwin/erwinXML/MDL_SQL_MODELER.xml …/erwin/erwinXML mdl_name
Verify the log files located in the $FIC_HOME/utility/TransformSQLModel/logs directory.
Limitations
• Supertype-subtype is not supported.
• The model version has to be added entity-wise as a UDP.
• Index tablespace is not supported.
• Logical-table UDPs are not supported.

14.17 Command Line Utility for Populating AAI_DMM_METADATA Table

This utility can be used only when the AAI_DMM_METADATA table is not populated properly or is corrupted.
To run the utility directly from the console, perform the following steps:
1. Log in to the OFSAA Server.


2. Take a backup of the AAI_DMM_METADATA table.
3. Execute the PopulateTableJSON.sh file for the specific infodom as follows:
cd $FIC_HOME/utility/PopulateTableJSON/bin
./PopulateTableJSON.sh <infodom_name>
By default, this utility reads the JSON files from "/ftpshare/<infodom>/json/fipjson/".
4. Optionally, a separate JSON files path can be specified as the second argument to the shell script:
cd $FIC_HOME/utility/PopulateTableJSON/bin
./PopulateTableJSON.sh <infodom_name> <JSON files path>
5. Verify the logs generated in the following path: $FIC_HOME/utility/PopulateTableJSON/logs

14.18 Command-line Utility to Bulk Import User Groups to IDCS

The IDCS utility bulk imports users, groups, and user-group mappings from OFSAA into Oracle Identity Cloud Service (IDCS) by creating CSV files, which can be imported into IDCS. The utility enables the integration of users and groups in OFSAA with IDCS.
To run the utility directly from the console, perform the following steps:
1. Log in to the OFSAA Server.
2. Navigate to the $FIC_HOME/utility/IDCS_Utility directory.
3. Assign Read, Write, and Execute permissions to the IDCSUtility.sh script file using the following command:
chmod 777 IDCSUtility.sh
4. Execute the IDCSUtility.sh script file using the following command:
./IDCSUtility.sh <URL> <UserName> <Password>
Where:
• <URL> - Enter the web server URL in the https://<hostname>:<port>/<domain> format.
• <UserName> - Enter your user name to log in to the web server.
• <Password> - Enter your password to authenticate the user name to log in to the web server.
After you execute the script, the following CSV files are generated:
a. groups.csv
b. users.csv


NOTE In the users.csv file, enter the Work Email ID for each user. The Work Email ID is required to reset the password for each user in IDCS.

5. Import the preceding files into IDCS using the following instructions.
To import the users.csv file, perform the following steps:
a. Log in to IDCS.
b. Select the Users tab.
c. Select Import.
d. Browse and select the users.csv file, and then select the Import button.
Importing the users.csv file imports users into IDCS, and each user receives an email to reset the password.
To import the groups.csv file, perform the following steps:
a. Log in to IDCS.
b. Select the Groups tab.
c. Select the Import tab.
d. Browse and select the groups.csv file, and then select the Import button.
Importing the groups.csv file imports groups into IDCS, and each group is mapped to the users imported earlier.


15 REST APIs for Object Migration Utility

Use the following Object Migration utility APIs to import and export object definitions, get object definition summary codes, and upload and download the dump file.

15.1 Objectmigration Export API


You can use the Objectmigration Export API to export a metadata object from a source environment
to a target environment.

15.1.1 Endpoint Details

• Method - POST
• REST Endpoint - /rest-api/migrationrest/MigrationRESTService/public/migrate/EXPORT
• Content Type - XML

15.1.2 Sample
<OBJECTMIGRATION>
<USERID>user_id</USERID><!--User ID-->
<LOCALE>en_US</LOCALE><!--Locale Information-->
<INFODOM>infodom_name</INFODOM><!--Information Domain-->
<FOLDER>$FOLDER$</FOLDER><!-- Folder/Segment -->
<MODE>EXPORT</MODE><!--EXPORT/IMPORT-->
<FILE>TEST_001</FILE>
<IMPORTALL TargetFolder="$FOLDER$"></IMPORTALL><!-- Applicable only for import -->
<FAILONERROR>Y</FAILONERROR>
<OVERWRITE>Y</OVERWRITE>
<RETAIN_IDS>N</RETAIN_IDS>
<MIGRATION_CODE>migration_code</MIGRATION_CODE>
<OBJECTS TargetFolder="$FOLDER$">
<OBJECT Code="DQ0017" Type="120" INCLUDEDEPENDENCY="Y" /><!-- Object code and object type of the migrated metadata object -->
</OBJECTS>
</OBJECTMIGRATION>
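A hypothetical invocation of this endpoint with curl might look as follows. The host, port, and payload file name (export.xml, holding the XML shown above) are assumptions; the command is assembled and printed rather than executed so it can be reviewed and adapted first:

```shell
# Assemble a curl command for the Objectmigration Export API.
# HOST and PORT are placeholders for your OFSAA web server.
HOST=myhost; PORT=7001
ENDPOINT="/rest-api/migrationrest/MigrationRESTService/public/migrate/EXPORT"
URL="https://${HOST}:${PORT}${ENDPOINT}"
echo curl -X POST "$URL" -H "Content-Type: application/xml" --data @export.xml
```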

15.2 Objectmigration Import API

You can use the Objectmigration Import API to import a file from a source environment to a target environment.

15.2.1 Endpoint Details

• Method - POST
• REST Endpoint - /rest-api/migrationrest/MigrationRESTService/public/migrate/IMPORT
• Content Type - XML

15.2.2 Sample
<OBJECTMIGRATION>
<USERID>user_ID</USERID><!--User ID-->
<LOCALE>en_US</LOCALE><!--Locale Information-->
<INFODOM>infodom_name</INFODOM><!--Information Domain-->
<FOLDER>$FOLDER$</FOLDER><!-- Folder/Segment -->
<MODE>IMPORT</MODE><!--EXPORT/IMPORT-->
<FILE>TEST_001</FILE>
<IMPORTALL TargetFolder="$FOLDER$">Y</IMPORTALL>
<FAILONERROR>Y</FAILONERROR>
<OVERWRITE>Y</OVERWRITE>
<RETAIN_IDS>N</RETAIN_IDS>
<MIGRATION_CODE>TESTAPIIMP02</MIGRATION_CODE>
<OBJECTS TargetFolder="$FOLDER$">
<OBJECT Code="DQ0017" Type="120" INCLUDEDEPENDENCY="Y" />
</OBJECTS><!-- Object code and object type of the migrated metadata object -->
</OBJECTMIGRATION>

15.3 Objectmigration Export Status API


You can use the Objectmigration Export Status API to get the status of all the export migration
definitions present in a specific infodom.

15.3.1 Endpoint Details

• Method - GET
• REST Endpoint - /rest-api/migrationrest/MigrationRESTService/public/migrate/status/EXPORT/{migcode}?infodom={infodom Value}
• Content Type - XML
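As a sketch, the {migcode} path parameter and the infodom query parameter are substituted before issuing the GET. The host, port, migration code, and infodom below are placeholder assumptions; the request is printed rather than sent:

```shell
# Build the status URL for a given migration code and infodom.
MIGCODE=TESTAPIEXP01   # placeholder migration code
INFODOM=INFODOM        # placeholder information domain
URL="https://myhost:7001/rest-api/migrationrest/MigrationRESTService/public/migrate/status/EXPORT/${MIGCODE}?infodom=${INFODOM}"
echo curl -X GET "$URL"
```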

15.4 Objectmigration Import Status API


You can use the Objectmigration Import Status API to get the status of all the import migration
definitions present in a specific infodom.

15.4.1 Endpoint Details

• Method - GET
• REST Endpoint - /rest-api/migrationrest/MigrationRESTService/public/migrate/status/IMPORT/{migcode}?infodom={infodom Value}
• Content Type - XML


15.5 Summary Objecttypes API


You can use the Summary Objecttypes API to get the list of object types available in a specific
infodom.

15.5.1 Endpoint Details

• Method - GET
• REST Endpoint - /rest-api/migrationrest/MigrationRESTService/public/migrate/summary/objecttypes?infodom={infodom Value}

15.6 Summary Objectcodes API


You can use the Summary Objectcodes API to get the object codes associated with the respective
object types present in a specific infodom.

15.6.1 Endpoint Details

• Method - GET
• REST Endpoint - /rest-api/migrationrest/MigrationRESTService/public/migrate/summary/objecttypes/{objectcode}?infodom={infodom Value}&currPage=2&pageSize=5&userId={userValue}

15.7 Object Migration Download Dump API


You can use the Object Migration Download Dump API to download the dump file to the local
directory.

15.7.1 Endpoint Details

• Method - POST
• REST Endpoint - /rest-api/migrationrest/MigrationRESTService/public/migrate/download/{fileName}?infodom={infodom Value}

15.8 Object Migration Upload Dump API


You can use the Object Migration Upload Dump API to upload the dump file from your local directory.

15.8.1 Endpoint Details

• Method - POST
• REST Endpoint - /rest-api/migrationrest/MigrationRESTService/public/migrate/upload/{fileName}?infodom={infodom Value}


16 References

This section of the document consists of information related to intermediate actions that need to be performed while completing a task. The procedures are common to all the sections and are referenced wherever required. You can refer to the following sections based on your need.

16.1 Calendar

The Calendar icon in the user interface helps you specify a date in the DD/MM/YYYY format by selecting it from the pop-up calendar. You can select the specific month and year using the drop-down lists. When you click the required date, it is automatically updated in the date field.

Figure 314: Calendar window

16.2 Function Mapping Codes

The function codes and their descriptions help you identify the user functions that need access to the Infrastructure system and map roles appropriately. See Appendix A.

16.3 External Scheduler Interface Component

The External Scheduler Interface Component (ESIC) is an external command line executable that integrates with the Infrastructure system to run or execute a Batch definition. This integration is achieved by the Run Executable component.
The Operations module (ICC - Information Command Center) within the Infrastructure system manages the execution of all components within OFSAAI. It reports the status of tasks, which are inseparable units of work that must be executed as one single piece during a batch run. It also prompts for the subsequent course of action depending on the success or failure of execution.
A task may have many subtasks, and their execution mechanism is handled internally by the component. A collection of tasks with defined precedence constitutes a Batch. Precedence can be set for tasks to enforce the relative order of execution. The task precedence is responsible for the parallelism achieved during the execution of a batch. Thus, it is essential to take into account the


performance implications while defining task precedence in a batch, apart from the logical or functional reasons that primarily define the relative order in which the tasks may be executed.
For example, consider a batch comprising the tasks in the following figure. The arrows show the precedence involved. The way these tasks are selected for execution is as follows:
• Pick up all the tasks that have START as their parent. It essentially means that these tasks (Task1, Task2, and Task6) can be run independently.
• Subsequently, pick all tasks for execution (at that instance of time) that have successful parent tasks.
• A Batch is marked as successful only if all the executable tasks are successful.

Figure 315: Illustration of Batch Execution
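The selection rule above can be sketched as a toy script. This is an illustration only, not part of OFSAAI; the task names and parentage below mirror the shape of the figure under stated assumptions:

```shell
# Toy pass over a task list: a task runs only when its parent has
# already succeeded; tasks parented by START run first.
graph="Task1:START Task2:START Task6:START Task3:Task1 Task4:Task2"
done_tasks="START"                   # START is implicitly "successful"
for pair in $graph; do
  task=${pair%%:*}; parent=${pair##*:}
  case " $done_tasks " in
    *" $parent "*)                   # parent succeeded: task can run
      echo "run $task"
      done_tasks="$done_tasks $task" ;;
  esac
done
```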

16.3.1 Architecture

The ES executes a component named "External Scheduler Interface Component" (ESIC) and passes the suitable parameters. For more information about these parameters, see ESIC Command Line Parameters and Job Types. The ESIC in turn passes these requests to OFSAAI, fetches the exit status, and interprets it as per the Exit Status Specifications.

16.3.2 Scope of Integration

The integration of an External Scheduler (ES) with OFSAAI provides the following capabilities:

16.3.2.1 Run New Batch

• Initialize Batch: creates an instance of the current definition to be executed against the provided MIS Date.
• Execute the complete Batch.
• De-initialize Batch: updates the status of the instance.
• Restart Failed Batch: on failure of a Batch, execute the Batch in Restart mode after making the necessary corrections.

OFS ANALYTICAL APPLICATIONS INFRASTRUCTURE USER GUIDE | 713


REFERENCES
EXTERNAL SCHEDULER INTERFACE COMPONENT

16.3.2.2 Rerun Batch

• Initialize Batch: creates an instance of the current definition to be executed against the provided MIS Date.
• Execute the complete Batch.
• De-initialize Batch: updates the status of the instance.

16.3.2.3 Execution of Tasks in a Batch

• Initialize the Batch of which the task is a member: creates an instance of the current definition to be executed against the provided MIS Date.
• Execute the individual tasks of the Batch one after the other.
• An option is provided to exclude the precedence specified in AAI for the tasks while executing through ESIC.
• De-initialize Batch: updates the status of the instance.

16.3.2.4 Restart of Failed Task

• On failure of a Task, re-execute the Task after making the necessary corrections.
• De-initialize Batch: updates the status of the instance.

NOTE Explicit initialization is not required for the restart of a failed Batch or Task if it is not de-initialized.

16.3.2.5 Export Batch

• Exports a Batch definition from OFSAAI to a specified location in the OFSAAI standard XML format. An ES can add other ES-specific details after importing the Batch definition to utilize its capabilities.

16.3.3 ESIC Invocation

The ESIC commands can be invoked from anywhere on the machine where Infrastructure is installed, provided $FIC_APP_HOME/icc/bin is added to the $PATH variable. Alternatively, you can navigate to the directory where the ESIC component is installed ($FIC_APP_HOME/icc/bin) and execute the commands.
The log files are generated in $FIC_APP_HOME/icc/log. ESIC handles all exceptions generated during its execution.
The log file names for ESIC for each instance are as follows:
ESIC_<Date>_<Time>_<PID>_<External Unique ID>.log
ESIC_<Date>_<Time>_<PID>_<External Unique ID>_<TaskId>.log
In case of an exception, ESIC logs it appropriately and exits with an appropriate exit status that can be used by the ES.


Ensure the following:
• The ES executes the initialization and de-initialization tasks, which are invocations of ESIC with specific parameters.
• The ES invokes ESIC as a command line executable for each task to be executed, including the initialization and de-initialization tasks.
• Optionally, ESIC can wait for an executed task to complete. Once done, ESIC exits with an appropriate exit status that is fetched by the ES.
• Once an execution has started, the instance of ESIC exists until the request is completed.
• ESIC handles all exceptions generated; in case of an exception, ESIC logs it appropriately and exits with an appropriate exit status that can be fetched by the ES.

NOTE When a Batch is initialized for execution through the ES, ESIC captures the OFSAAI user ID and password as parameters and authenticates them. If the user is already logged in through the UI, and the Allow user to log in from multiple machines checkbox in the Configuration window is not selected, the error message "User Already Logged in" is displayed and the initialization of the batch fails.

Figure 316: Illustration of ESIC

For more details on the ESIC exit status, see the Exit Status Specifications section. For other miscellaneous information on ESIC, see the Additional Information on ESIC section.

16.3.4 Batch Execution Mechanism


The recommendation for Batch Execution with an External Scheduler is as follows:


During the definition of a batch using the Batch Definition window of the Operations module, the Batch is named EXTBATCH and the Information Domain in which this Batch is defined is named INFODOM. Hence, INFODOM_EXTBATCH becomes the Batch ID.
Consider a scenario where the following tasks are run in this Batch:
• The first task, 'Task1', loads data in a warehouse table FCT_CUSTOMER.
• The second task, 'Task2', loads data in a warehouse table DIM_GEOGRAPHY.
• The third task, 'Task3', is a Data Transformation that uses both the tables mentioned above. Hence, it can run only if both the above tasks, Task1 and Task2, are complete.
• If either Task1 or Task2 fails, a new task, namely Task4, can be executed with the Data Transformation that uses the data of the previous load.
• The final task, namely Task5, is a Cube building task. It takes several hours as it builds a Cube with many dimensions and hierarchies and holds a large number of combinations.
The parameters for the tasks are chosen from the drop-down choices provided. OFSAAI provides the choices through its Data Model Management.
Since Task3 or Task5 is executed based on the conditional success or failure of previous tasks, the conditionality needs to be simulated in the ES. If the External Scheduler wants to control the order or conditionality for tasks, they need to be defined in such a way that they have the same precedence. Here, it would be ideal to define the batch as follows. The arrows in the following figure show the precedence involved.

Figure 317: Illustration of Batch Execution

The export of such a Batch from OFSAAI would look like the following. For more information, see
OFSAAI Standard XML.
<BATCH BATCHID="INFODOM_EXTBATCH" NOOFTASKS="5" SYSTEMLOCALE="+5:30 GMT"
INFODOMAIN="INFODOM" REVUSER="OPERADMIN" DEFTYPE="DEF">
<RUNINFO REVUID="" EXTUID="" BATCHSTATUS="" INFODATE="" LAG=""/>
<TASK TASKID="Task1" COMPONENTID="LOAD DATA" TASKSTATUS="N" FILTER="N">
<PRECEDENCE>
<ONSUCCESSOF>
<TASKID/>
</ONSUCCESSOF>
<ONFAILUREOF>
<TASKID/>
</ONFAILUREOF>


</PRECEDENCE>
</TASK>
<TASK TASKID="Task2" COMPONENTID="CUBE CREATE" TASKSTATUS="N" FILTER="N">
<PRECEDENCE>
<ONSUCCESSOF>
<TASKID/>
</ONSUCCESSOF>
<ONFAILUREOF>
<TASKID/>
</ONFAILUREOF>
</PRECEDENCE>
</TASK>
<TASK TASKID="Task3" COMPONENTID="RUN EXECUTABLE" TASKSTATUS="N" FILTER="N">
<PRECEDENCE>
<ONSUCCESSOF>
<TASKID/>
</ONSUCCESSOF>
<ONFAILUREOF>
<TASKID/>
</ONFAILUREOF>
</PRECEDENCE>
</TASK>
<TASK TASKID="Task4" COMPONENTID="EXTRACT DATA" TASKSTATUS="N" FILTER="N">
<PRECEDENCE>
<ONSUCCESSOF>
<TASKID/>
</ONSUCCESSOF>
<ONFAILUREOF>
<TASKID/>
</ONFAILUREOF>
</PRECEDENCE>
</TASK>
<TASK TASKID="Task5" COMPONENTID="TRANSFORM DATA" TASKSTATUS="N" FILTER="N">
<PRECEDENCE>
<ONSUCCESSOF>
<TASKID/>


</ONSUCCESSOF>
<ONFAILUREOF>
<TASKID/>
</ONFAILUREOF>
</PRECEDENCE>
</TASK>
</BATCH>
Valid values for Task Status are:

Table 190: Details of the Task Status and their Value

Task Status   Value
N             Not Started
O             On Going
F             Failure
S             Success

Valid values for Batch Status are:

Table 191: Details of the Batch Status and their Value

Batch Status   Value
N              Not Started
O              On Going
R              For Restart
C              Complete

Valid values for FILTER are:

Table 192: Details of the Filter Status and their Value

Filter Status   Value
H               Hold
K               Exclude/Skip
N               No Filter

When the definition of a Batch is exported and imported in the ES, the Task Status, the Batch Status, and the Filter become irrelevant. They would be relevant only if you exported a specific run of a Batch, which is not currently supported by OFSAAI. They are included as a part of the XML for completeness.


After importing it in the ES, administrators can decide the order in which the tasks must be executed and alter the order of execution without violating the precedence set in OFSAAI. For example, the administrator might configure it as in the following figure.

Figure 318: Illustration of Batch Execution

The invocation of ESIC by the ES and the command line parameters passed for each task for the above configuration are as follows. For more information about command line parameters, see ESIC Command Line Parameters and Job Types.
The ES needs to provide the 'Ext Unique ID'. In this case, it is MAESTRO_INFODOM_EXTBATCH_20031001_1.
To initialize the Batch Run:
esic -JI -Urevuser -Ppassword -RMAESTRO_INFODOM_EXTBATCH_20031001_1 -IINFODOM -BEXTBATCH -D20031001 -F/tmp/INFODOM
Task 1:
esic -JXT -Urevuser -Ppassword -RMAESTRO_INFODOM_EXTBATCH_20031001_1 -IINFODOM -WC -TTask1
Task 2:
esic -JXT -Urevuser -Ppassword -RMAESTRO_INFODOM_EXTBATCH_20031001_1 -IINFODOM -WC -TTask2
Task 3:
esic -JXT -Urevuser -Ppassword -RMAESTRO_INFODOM_EXTBATCH_20031001_1 -IINFODOM -WC -TTask3
Task 4:
esic -JXT -Urevuser -Ppassword -RMAESTRO_INFODOM_EXTBATCH_20031001_1 -IINFODOM -WC -TTask4
Task 5:
esic -JXT -Urevuser -Ppassword -RMAESTRO_INFODOM_EXTBATCH_20031001_1 -IINFODOM -WC -TTask5
De-initialize:
esic -JD -Urevuser -Ppassword -RMAESTRO_INFODOM_EXTBATCH_20031001_1 -IINFODOM -BINFODOM_EXTBATCH -D20031001
Note the following scenarios while executing an ES Batch:
• Every task executed in the ES must have an equivalent task defined in a Batch within the Operations module, except for specific tasks such as Initialization, De-initialization, and Status Query/Alter tasks.
• If the ES requests to alter the status of a task that has already been requested for execution, an error value specific to such a case is returned. The same holds good for a Batch Run as well.
• Task execution must follow the precedence as defined in OFSAAI. Otherwise, the task execution results in failure.
• Re-executing a task of a Batch Run that was already successfully executed results in failure.
• Execution of a Batch whose definition does not exist or has been deleted results in failure. An error value specific to such a case is returned.
• Execution of a task before the initialization of the Batch results in failure.
• Simultaneous execution of the same task of a Batch Run results in failure. The same holds good for a Batch Run as well.

16.3.5 External Scheduler Batch Run ID

The Batch Run ID is a unique identifier used to identify a particular Batch Run, in the following format:
Infodom_Batchname_Infodate_Run
The Batch Run ID consists of the following components:

Table 193: Components in the Batch Run ID and their Descriptions

Component    Description
Infodom      The Information Domain for which the batch is being run.
Batchname    The name of the Batch as assigned by the user.
Infodate     The date on which the batch is run.
Run          The number of times the Batch has been executed. This value is incremented if the Batch is rerun for the same MISDATE.
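The format can be illustrated with a small shell sketch; the component values are taken from the examples in this chapter:

```shell
# Compose a Batch Run ID from its components:
# <Infodom>_<Batchname>_<Infodate>_<Run>
INFODOM=INFODOM; BATCHNAME=EXTBATCH; INFODATE=20031001; RUN=1
BATCH_RUN_ID="${INFODOM}_${BATCHNAME}_${INFODATE}_${RUN}"
echo "$BATCH_RUN_ID"               # INFODOM_EXTBATCH_20031001_1

# The run count is the last underscore-separated component; it is
# incremented when the Batch is rerun for the same MISDATE.
RUN_NO="${BATCH_RUN_ID##*_}"
echo "$RUN_NO"
```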

16.3.6 Batch Monitoring

The Batch Monitoring window in the Operations module facilitates static and real-time monitoring of a Batch. On choosing a particular batch definition, an Infodate, and a Batch Run ID, the status of the tasks inside the selected batch is displayed.

16.3.7 Advantages of ES

Following are the advantages of the ES component:
• The ES is capable of importing a Batch definition that was previously exported in the OFSAAI Standard XML format. This eliminates the necessity to manually redefine the batch as per the OFSAAI format.
• The ES is capable of passing a unique ID for a Batch Run to the Operations module through an initialization mechanism. For more information, see Batch Execution Mechanism.
• Every Batch Run can be uniquely identified in both the ES and the Operations module when tasks are executed under the scope of a particular Batch Run.
• The ES is capable of executing and passing the desired parameters to a Batch. Further, it can fetch an exit status and interpret it as per the Exit Status Specifications.

16.3.8 OFSAAI Standard XML

<BATCH BATCHNAME="Name of the Batch" NOOFTASKS="Total number of tasks in the Batch" SYSTEMLOCALE="The locale of the system where the batch is defined" INFODOMAIN="The Information Domain where the batch is defined" REVUSER="User who defined the batch" DEFTYPE="Identifies whether the XML file describes a batch definition or a run (can take values 'D' in case of definition and 'R' in case of run)">
<RUNINFO REVUID="Batch Run ID" EXTUID="External Unique ID for the Batch Run" BATCHSTATUS="Status of the Batch Run" INFODATE="The info date for the system" LAG="Defines the lag for the Batch"/>
<TASK TASKID="Task1" COMPONENTID="LOAD DATA" TASKSTATUS="O" FILTER="H">
<PRECEDENCE>
<ONSUCCESSOF>
<TASKID></TASKID>
</ONSUCCESSOF>
<ONFAILUREOF>


<TASKID/>
</ONFAILUREOF>
</PRECEDENCE>
</TASK>
<TASK TASKID="Task2" COMPONENTID="RUN EXECUTABLE" TASKSTATUS="O" FILTER="H">
<PRECEDENCE>
<ONSUCCESSOF>
<TASKID></TASKID>
</ONSUCCESSOF>
<ONFAILUREOF>
<TASKID></TASKID>
</ONFAILUREOF>
</PRECEDENCE>
</TASK>
<TASK TASKID="Task3" COMPONENTID="EXTRACT DATA" TASKSTATUS="O" FILTER="N">
<PRECEDENCE>
<ONSUCCESSOF>
<TASKID>TASK1</TASKID>
</ONSUCCESSOF>
<ONFAILUREOF>
<TASKID>Task2</TASKID>
</ONFAILUREOF>
</PRECEDENCE>
</TASK>
</BATCH>
The valid values for FILTER are:

Table 194: Details of the Filter Status and their Value

Filter Status   Value
H               Hold
R               Released
E               Excluded/Skipped
I               Included

16.3.9 Exit Status Specifications

The following table lists the exit statuses of the ESIC and their interpretations.


Table 195: Details of the Exit Status and their Interpretations

Exit Status       Interpretation
0                 Success
-1                Failure
-2                Unable to contact OFSAAI
-3                Unable to query OFSAAI Metadata
-4                Unable to Initialize Batch
-5                Unable to De-Initialize Batch
-6                Failed to Execute a Task because of incorrect parameters passed to the task
-7                Failed to Execute a Task/Batch
-8                Failed to Wait for Task/Batch
-9                Failed to Set Batch as Complete
-10               Failed to Add Filter to Task
-11               Failed to Purge Batch
-12               Failed to Export Batch Definition
-14               Invalid Configuration File
-15               Supplied Parameters Incorrect for Task Execution
-16               Failed to Export Batch Logs
-13, -17 to -31   Reserved
1                 Successful Poll of the Task - Task/Batch Ongoing (O)
2                 Successful Poll of the Task - Task Excluded (K)
3                 Successful Poll of the Task - Task/Batch Held (H)
4                 Successful Poll of the Task - Task/Batch Not Started (N)
5-8               Reserved
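When an ES shell script inspects $?, note that negative C exit codes wrap modulo 256, so ESIC's -1 to -16 surface as 255 down to 240. The helper below is a sketch (not shipped with OFSAAI) that undoes the wrap and maps a few of the statuses from the table; extend the case arms as needed:

```shell
# Map a raw shell exit code back to the signed ESIC status and print
# its meaning per Table 195.
interpret_esic_status() {
  rc=$1
  [ "$rc" -gt 127 ] && rc=$((rc - 256))   # undo the modulo-256 wrap
  case $rc in
    0)  echo "Success" ;;
    1)  echo "Poll: Task/Batch Ongoing (O)" ;;
    2)  echo "Poll: Task Excluded (K)" ;;
    3)  echo "Poll: Task/Batch Held (H)" ;;
    4)  echo "Poll: Task/Batch Not Started (N)" ;;
    -1) echo "Failure" ;;
    -2) echo "Unable to contact OFSAAI" ;;
    -7) echo "Failed to Execute a Task/Batch" ;;
    *)  echo "Other ESIC status: $rc" ;;
  esac
}

# Example: a task whose process exited with code 255 (ESIC status -1).
interpret_esic_status 255
```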

16.3.10 ESIC Operations using Wrapper Scripts


OFSAAI has been enhanced to provide standardized wrapper scripts to perform ESIC batch
operations.


16.3.10.1 Prerequisites

• JAVA_HOME (required) must point to the Java bin installation directory.
• Ensure the NAWK command is available under PATH. Contact your system administrator if the NAWK command does not exist. Example: # yum install nawk
• ES_HOME (required) must point to the ES home folder.
• Copy the ES folder and ensure the following JARs are present in the ES/lib folder:
  - FICServer.jar
  - AESCryptor.jar
  - aai-client.jar
• Update the ES/conf/<Infodom>.ini file and specify the proper values:
  - MISDATE=Information Date in the format mm-dd-yyyy (for example, MISDATE=01-31-2010)
  - USERNAME=OFSAAI login user (for example, USERNAME=BASELUSER)

16.3.10.2 Initialize a Batch for Execution

1. Navigate to the $ES_HOME/bin folder.
2. Run InitializeBatch.sh by passing the following arguments:
  - Infodom: Information Domain name
  - Runid: RRF run code / ICC batch name
  - BatchType: RRF/ICC
Example: ksh InitializeBatch.sh BASELINFO TESTBATCH ICC

16.3.10.3 Execute a Batch

1. Navigate to the $ES_HOME/bin folder.
2. Run ExecuteBatch.sh by passing the following arguments:
  - Infodom: Information Domain name
  - Runid: RRF run code / ICC batch name
  - Mode: run/restart [optional]
Example: ksh ExecuteBatch.sh BASELINFO TESTBATCH run

16.3.10.4 Execute a Task

1. Navigate to the $ES_HOME/bin folder.
2. Run ExecuteTask.sh by passing the following arguments:
  - Infodom: Information Domain name
  - Runid: RRF run code / ICC batch name
  - TaskName: Individual task in the batch
  - TaskPrecedenceCheck: Y/N [optional]
Example: ksh ExecuteTask.sh BASELINFO TESTBATCH Task1 Y

16.3.10.5 De-initializing a Batch

1. Navigate to the $ES_HOME/bin folder.
2. Run DeinitializeBatch.sh by passing the following arguments:
  - Infodom: Information Domain name
  - Runid: RRF run code / ICC batch name
Example: ksh DeinitializeBatch.sh BASELINFO TESTBATCH

16.3.10.6 View Logs for Individual Batch Run


$ES_HOME/log/ESIC_<batchrunid>.log
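A full wrapper-script cycle (initialize, execute, de-initialize) can be sketched as below. The commands are printed rather than executed, since the scripts exist only in a configured $ES_HOME; the infodom and batch name reuse the examples above:

```shell
# Print the three wrapper-script calls for one batch cycle.
INFODOM=BASELINFO; RUNID=TESTBATCH
OUTPUT=$(
  for step in "InitializeBatch.sh $INFODOM $RUNID ICC" \
              "ExecuteBatch.sh $INFODOM $RUNID run" \
              "DeinitializeBatch.sh $INFODOM $RUNID"; do
    echo "ksh \$ES_HOME/bin/$step"
  done
)
echo "$OUTPUT"
```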

16.3.11 ESIC Operations Using Command Line Parameters and Job Types

ESIC command line parameters can be invoked using the following command:
esic -J<Job Type> <Parameters>
The type of the parameters depends on the value of the Job Type. The various Job Types are provided below:

16.3.11.1 I - Initialize a Batch for Execution

This command prepares all the run tables and initializes the run of a batch. It should be executed before any other external API for the execution of a batch, as it registers the <External Unique ID> against the Batch Run ID.
-JI -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -B<Batch Name> -D<Info Date> -F<Temp Directory Name>
The components of the above command are tabulated below:

Table 196: Details of the Command Parameters and their Descriptions

Parameter Description

User ID Enter the User ID used for initializing the Batch execution.

Password Enter the password for initializing the Batch execution. This password is
validated against the V_PASSWORD column in the
CSSMS_USR_PROFILE table.
An encrypted password is expected. If the password is given as clear
text, a warning message is displayed, but validation still proceeds.

Ext Unique ID Enter a unique ID for a batch execution. It is the responsibility of the
External Scheduler/calling program to supply this unique ID to ESIC.
The mapping of this value to the OFSAAI batch execution ID is stored in
the EXT_BATCH_RUN_ID_MAPPING table.

Info Dom Enter the information domain against which the batch is executed.

Batch Name Enter the Batch name.

Info Date Enter the MIS Date for Batch execution.

Temp Directory Name This can be any value chosen by the user.
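The parameters above can be assembled into a complete -JI invocation. The helper below is an illustrative sketch (build_ji_command is not part of the product); it composes and echoes the command line so it can be reviewed before being run:

```shell
# Illustrative helper (not part of OFSAAI): assembles the -JI argument list
# from the parameters in Table 196 and echoes the resulting command so it
# can be reviewed before being executed from $FIC_APP_HOME/icc/bin.
build_ji_command() {
  user=$1; pass=$2; extid=$3; infodom=$4; batch=$5; infodate=$6; tmpdir=$7
  echo "esic -JI -U$user -P$pass -R$extid -I$infodom -B$batch -D$infodate -F$tmpdir"
}
```

The sample user, password, and directory values in any invocation are placeholders; substitute values valid for your installation.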

16.3.11.2 D - DeInitialize/Clean up temporary files created for a Batch Execution


This command de-initializes the run of a Batch. All temporary resources allocated for that run of the
Batch are reclaimed. An attempt to call an API for a batch for which DeInitialize has already been
called returns an error. If DeInitialize is called for an ongoing Batch that has no ongoing tasks, the
batch status is set in accordance with the status of the Tasks under this Batch. If any of the Tasks are
ongoing, this command returns the failure "batch cannot be de-initialized".
-JD -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -B<Batch Name> -D<Info Date>

16.3.11.3 X - Execute a Task/Batch or Restart of Batch


These options can be used to execute a Batch or a Task of a Batch in OFSAAI. In the case of a Batch,
the Batch must have been initialized. In the case of a Task, the Batch of which the Task is a member
must have been initialized by calling the Initialize API.
When a Batch is defined in OFSAAI, each task is assigned a unique ID such as Task1, Task2, and so on.
This task ID has to be supplied for <Task ID>. This command executes the batch/task as in the current
system; the return value depends on the wait mode specified. If the wait mode is 'S', the call returns
success once the task is successfully triggered.
-JXB -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -W<Wait Mode>
-JXT -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -W<Wait Mode> -T<Task ID>
-JXRB -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -W<Wait Mode>
Wait Modes:
• C - Wait Completion of a Task/Batch
• S - Successful Trigger/Relay of Task to OFSAAI
If the wait mode is 'C', the command waits for completion of the task/batch and returns the
task/batch execution return values. Only a Task/Batch marked as 'N' (not started) can be executed
using this API. A task can be executed only if it does not violate the precedence set in the OFSAAI
batch definition.


16.3.11.4 W - Get Task/Batch Status


-JWB -U<User ID> -P<Password> -R<Ext Unique ID> -W<Wait Mode> -I<Info Dom>
-JWT -U<User ID> -P<Password> -R<Ext Unique ID> -W<Wait Mode> -I<Info Dom> -T<Task ID>
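An external scheduler typically calls the status API in a loop until the batch reaches a terminal state. The following sketch shows the polling pattern only; poll_until_done and the status codes O (ongoing), C (complete), and F (failed) are illustrative placeholders, not documented ESIC values:

```shell
# Illustrative polling pattern around the status API. In real use, each
# iteration would invoke "esic -JWB ..." with wait mode S and read its result;
# here each positional argument simulates one successive status value.
# The codes O (ongoing), C (complete), and F (failed) are placeholders,
# not documented ESIC values.
poll_until_done() {
  for s in "$@"; do
    case "$s" in
      C|F) echo "$s"; return 0 ;;  # terminal state reached
      *)   : ;;                    # still running: would sleep and re-poll
    esac
  done
  echo TIMEOUT
  return 1
}
```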

16.3.11.5 S - Finalize the Batch Execution - Primarily Mark the Batch Run as Complete
-JSB -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -V<Batch Status>
Valid Values for Batch Status are:
C - Complete

16.3.11.6 F - Add a Filter to a Task


-JFT -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -T<Task ID> -V<Task Filter>
Valid values for filter are:
H - Hold
R - Release
E - Exclude/Skip
I - Include

16.3.11.7 P - Purge Batch Run data between two info dates


-JP -U<User ID> -P<Password> -I<Info Dom> [-B<Batch Name>] -S<Start Date> -E<End Date>
The Start and End Dates must be in the following format: YYYYMMDD.
-JP -U<User ID> -P<Password> -I<Info Dom> -B<Batch Name> -S<Start Date> -E<End Date> [<Y>]
<Y> - Additional parameter to purge the data from the View Logs table. You must specify
-B<Batch Name> along with <Y> to purge the View Logs data for the specified start and end dates.
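Because esic expects the dates in YYYYMMDD format, a calling script can validate them before invoking the purge. This guard function is an illustrative addition, not part of OFSAAI:

```shell
# Illustrative guard (not part of OFSAAI): verifies that a value matches the
# YYYYMMDD format expected by -S<Start Date> and -E<End Date> before the
# purge command is issued.
is_yyyymmdd() {
  case "$1" in
    [0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]) return 0 ;;
    *) return 1 ;;
  esac
}
```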

16.3.11.8 E - Export a Batch Definition


-JE –U<User ID> -P<Password> -I<Info Dom> -B<Batch Name> -F<File Name>
<File Name> specifies the complete name of the file to be created, overwriting any existing file with
the same name.

16.3.11.9 BL – View messages logged for a batch run


-JBL -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -F<File Name> [-V<Message Format String>]


<File Name> specifies the complete name of the file to be created, overwriting any existing file with
the same name.
<Message Format String> specifies the information that needs to be logged.
Format string can contain parameters that will be replaced with actual values from logs.
Valid values for message parameter are msgid, brid, taskid, component, tstatus, severity, tstamp, and
sysmsg.
Each parameter, when passed in a message format string, should be enclosed within {}.
Example:
A typical message format string would look like:
{msgid}\t{brid}\t{taskid}\t{component}\t{tstatus}\t{severity}\t{tstamp}\t{sysmsg}
If no message format string is supplied, then the log generated will be in the above format, with each
value separated by a tab.
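A log line in the default tab-separated format can be split back into its fields with standard tools. The function and the sample line below are illustrative:

```shell
# Illustrative parsing of the default tab-separated log format. Field order
# follows the list above: msgid, brid, taskid, component, tstatus, severity,
# tstamp, sysmsg. The sample line in any example is invented for demonstration.
extract_task_status() {
  # Print the tstatus (5th) field of each tab-separated log line on stdin.
  awk -F'\t' '{ print $5 }'
}
```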

16.3.11.10 Restart / Rerun Batches on Failure of a Task using JXRB Command


You can restart and rerun batches in the event of failure of any task/batch during execution.
Ensure that the batch execution being restarted has not been de-initialized.
To restart the batch, run the following command:
-JXRB -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -W<Wait Mode>
To rerun a batch, follow these steps:
1. Initialize the batch.
2. Run the following command:
-JXRB -U<User ID> -P<Password> -R<Ext Unique ID> -I<Info Dom> -W<Wait Mode>
3. De-initialize the batch.
The wait modes that can be used in both the above commands are:
• C - Wait Completion of a Task/Batch.
• S - Successful Trigger/Relay of Task to OFSAAI.
The entire batch must be initialized when:
• The batch has failed.
• A task in a batch has failed (the batch of which the task is a member must be initialized).
This initialization can be performed using the Initialize API.
The parameter name/value pairs override the parameters provided to the task during batch definition
in OFSAAI. This command executes the batch/task as in the current system.
The return value depends entirely on the wait mode specified.
• If the wait mode is S, the execution returns success once the task is successfully triggered.


• If the wait mode is C, the command waits for the completion of the task/batch execution and
returns the execution values.

NOTE Only a Task/Batch marked as 'N' (not started) can be executed using this API. A task can be
executed only when it does not violate the precedence set in the batch definition.

16.3.12 Additional Information on ESIC


This section covers miscellaneous details, dependencies, and error logging for ESIC.

16.3.12.1 Miscellaneous Details and Dependencies


• ESIC resides on the App Layer of OFSAAI.
• ESIC expects the environment variable FIC_APP_HOME to be defined for configuration and log
paths.
• If the environment variable FIC_APP_HOME is not defined, ESIC exits with an error message on
the console.
• ESIC and the ICC Server share a single configuration file, which resides in FIC_APP_HOME/icc/conf.
• ESIC resides in FIC_APP_HOME/icc/bin, and paths to dependencies (the ICC API library in this
case) need to be set to FIC_APP_HOME/icc/lib.
• The following platform processes are Java processes, which receive environment variables as
JVM parameters:
▪ FIC Server
▪ ICC Server
▪ Model Upload
▪ Rule Execution
Only these processes can be tracked using JVM commands such as jcmd and jps.

16.3.12.2 Error Logging for ESIC


ESIC opens a file in $FIC_APP_HOME/icc/log for logging, and the file descriptor for that file is passed
to the ICC API library for logging. The log file name for each ESIC instance is as follows:
ESIC_<Date>_<Time>_<External Unique ID>_<TaskID>.log
ESIC logs messages into this file only if the exit status value is -2, -12, -14, or -15. For more
information, see Exit Status Specifications. In all other cases, the ICC Server logs the errors and their
causes, and ESIC only returns the error value as an exit status.


NOTE <External Unique ID> and <Task ID> can be used wherever
applicable.

16.4 File Upload Requirements


When uploading a file to the file system (Windows), the following characters are not allowed in the
file name:
• < (less than)
• > (greater than)
• : (colon)
• " (double quote)
• / (forward slash)
• \ (backslash)
• | (vertical bar or pipe)
• ? (question mark)
• * (asterisk)
In addition, the following characters are also restricted in file names and are not supported by OFSAA:
• , (Comma)
• { (Opening curly brace)
• } (Closing curly brace)
• Trailing space characters in file names (for example, "abc .txt")
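A script that stages files for upload can screen names against these restrictions in advance. The following check is an illustrative sketch, not an OFSAA utility:

```shell
# Illustrative pre-check (not an OFSAA utility): rejects file names that
# contain the restricted characters listed above or end in a space.
is_valid_ofsaa_filename() {
  name=$1
  # Restricted Windows and OFSAA characters.
  if printf '%s' "$name" | grep -q '[<>:"/\\|?*,{}]'; then
    return 1
  fi
  # Trailing space characters.
  case "$name" in
    *' ') return 1 ;;
  esac
  return 0
}
```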


17 Preferences
The Preferences section enables you to set your OFSAA Home Page and the Date Format in which all
Date fields are displayed throughout the application wherever OJET screens are used. This is the
configuration to set the Date Format at the user level.
To set the user preferences:
1. Click the logged in user name and select Preferences from the drop-down menu. The
Preferences window is displayed.

Figure 319: Preferences window

2. Select the application which you want to display as your Home Page from the Set My Home
Page drop-down list.

NOTE Whenever you install a new application, the related value for
that application appears in the drop-down list.

3. Select the required Date Format in which the Date fields in all OJET screens in your application
should be displayed. The options are dd/MM/yyyy and MM/dd/yyyy.
4. Click Save to save your preference.

Setting Date Format


You can set the Date Format in which the Date fields in all OJET screens in your application are
displayed at the user level, application level, and control level. The user-level preference takes
precedence, followed by the application-level preference. If neither is set, the Date Format set at the
control level is used.
User Level Preference for Date Format - See the Preferences section.
Application Level Preference for Date Format - If the user has not set a Date Format at the user level,
the system checks the value of the 'DEFAULT_DATEFORMAT_REQ' parameter in the configuration
table. If it is set to TRUE, the Date fields in all OJET screens in your application are displayed in the
format given in the 'DEFAULT_DATEFORMAT' parameter in the configuration table. If it is set to
FALSE, the Date Format set at the control level is used. By default, the value of the
'DEFAULT_DATEFORMAT_REQ' parameter is FALSE.
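The precedence described above reduces to a simple fallback chain. The function below is an illustrative sketch of that logic (not product code); its arguments stand in for the user preference, the DEFAULT_DATEFORMAT_REQ flag, the DEFAULT_DATEFORMAT value, and the control-level format:

```shell
# Illustrative sketch (not product code) of the Date Format precedence:
# user-level preference first, then the application-level default when
# DEFAULT_DATEFORMAT_REQ is TRUE, otherwise the control-level format.
resolve_date_format() {
  user_fmt=$1; req_flag=$2; app_fmt=$3; control_fmt=$4
  if [ -n "$user_fmt" ]; then
    echo "$user_fmt"              # user-level preference wins
  elif [ "$req_flag" = "TRUE" ]; then
    echo "$app_fmt"               # application-level DEFAULT_DATEFORMAT applies
  else
    echo "$control_fmt"           # fall back to the control-level format
  fi
}
```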


18 Appendix A

18.1 OFS Analytical Applications Infrastructure User Groups and Entitlements
The following table describes the User Groups and Entitlements that are part of the OFSAA 8.0 AAAI
Application Pack release.

Table 197: User Group Name and Description

User Group Name User Group Description

Business Administrator User mapped to this group will have access to all the menu items and actions
for advanced operations of metadata objects.

Business Authorizer User mapped to this group will have access to all the menu items and actions
for authorization of changes to metadata objects.

Business Owner User mapped to this group will have access to all the menu items and actions
for read and write of metadata objects.

Business User User mapped to this group will have access to all the menu items and actions
for access and read of metadata objects.

Guest User mapped to this group will have access to certain menu items with only
access privileges.

Identity Administrator User mapped to this group will have access to all the menu items for
managing User entitlements, User Group Entitlements and Access
Management configurations.

Identity Authorizer User mapped to this group will have access to all the menu items for
authorizing User entitlements, User Group Entitlements and Access
Management configurations.

Object Administrator User mapped to this group will have access to all menu items for managing
object migration and metadata traceability using metadata browser.

System Administrator User mapped to this group will have access to all menu items for managing
the setup configurations.

WorkFlow Delegation Admin User mapped to this group will have access to workflow delegation.

18.2 OFS Analytical Applications Infrastructure User Roles


The following table shows the User Roles Code, Name, and Description.

Table 198: User Roles Code, Name, and Description

V_ROLE_CODE V_ROLE_NAME V_ROLE_DESC

ALIAS_ACSS Alias Access Alias Access

ALIAS_ADVN Alias Advanced Alias Advanced

ALIAS_AUTH Alias Authorize Alias Authorize

ALIAS_PHNT Alias Phantom Alias Phantom

ALIAS_ROLY Alias Read Only Alias Read Only

ALIAS_WRIT Alias Write Alias Write

AUDITROLE Audit Trail Report Role Audit Trail Report Role

BATCH_ACSS Batch Access Batch Access

BATCH_ADVN Batch Advanced Batch Advanced

BATCH_AUTH Batch Authorize Batch Authorize

BATCH_PHNT Batch Phantom Batch Phantom

BATCH_READ Batch Read Only Batch Read Only

BATCH_WRIT Batch Write Batch Write

BPROC_ACSS BMM Processor Access Business Processor Access

BPROC_ADVN BMM Processor Advanced Business Processor Advanced

BPROC_AUTH BMM Processor Authorize Business Processor Authorize

BPROC_PHNT BMM Processor Phantom Business Processor Phantom

BPROC_ROLY BMM Processor Read Only Business Processor Read Only

BPROC_WRIT BMM Processor Write Business Processor Write

BUDIM_ACSS Dimension Access Dimension Access

BUDIM_ADVN Dimension Advanced Dimension Advanced

BUDIM_AUTH Dimension Authorize Dimension Authorize

BUDIM_PHNT Dimension Phantom Dimension Phantom

BUDIM_ROLY Dimension Read Only Dimension Read Only

BUDIM_WRIT Dimension Write Dimension Write

BUHCY_ACSS BMM Hierarchy Access BMM Hierarchy Access

BUHCY_ADVN BMM Hierarchy Advanced BMM Hierarchy Advanced

BUHCY_AUTH BMM Hierarchy Authorize BMM Hierarchy Authorize

BUHCY_PHNT BMM Hierarchy Phantom BMM Hierarchy Phantom

BUHCY_ROLY BMM Hierarchy Read Only BMM Hierarchy Read Only

BUHCY_WRIT BMM Hierarchy Write BMM Hierarchy Write

BUMSR_ACSS Measure Access Measure Access

BUMSR_ADVN Measure Advanced Measure Advanced

BUMSR_AUTH Measure Authorize Measure Authorize

BUMSR_PHNT Measure Phantom Measure Phantom

BUMSR_ROLY Measure Read Only Measure Read Only

BUMSR_WRIT Measure Write Measure Write

DATASECURITY Data Security Role Role to access un-redacted data

DATASECURITYADMIN Data Security Admin Data security admin role for executing redaction policies

DEFQACCESS DEFQ access Data Entry Forms and Queries access

DEFQADVNC DEFQ advanced Data Entry Forms and Queries advanced

DEFQAUTH DEFQ authorize Data Entry Forms and Queries authorize

DEFQMAN DEFQ Manager Data Entry Forma and Query Manager Role

DEFQPHTM DEFQ phantom Data Entry Forms and Queries phantom

DEFQREAD DEFQ read Data Entry Forms and Queries read

DEFQWRITE DEFQ write Data Entry Forms and Queries write

DIADV DI Advanced DI Advanced Role

DI_ACCESS DI Access Data Ingestion Access Role

DI_PHANTOM DI Phantom Data Ingestion Phantom Role

DI_READ DI Read Data Ingestion Read-only Role

DI_WRITE DI Write Data Ingestion Write Role

DMACCESS Data Mapping UI Access User Group mapped will have access to Link and Summary

DMADV Data Mapping Advanced Data Mapping Advanced Role

DMAUTH Data Mapping Authorize User Group mapped will have access to authorize the Data Mapping

DMMACC DMM Access Data Model Maintenance Access Role

DMMADVND DMM Advanced Data Model Maintenance Advanced Role

DMMAUTH DMM Authorize Data Model Maintenance Authorize Role

DMMFILEUPLDR Model Xml Upload Role Model Xml File Upload Role

DMMPHTM DMM Phantom Data Model Maintenance Role

DMMREAD DMM Read Data Model Maintenance Read-only Role

DMMWRITE DMM Write Data Model Maintenance Write Role

DMPHANTOM Data Mapping Phantom Data Mapping Phantom Role.

DMREAD Data Mapping Read Only User Group mapped will have access to View Definition.

DMTADMIN Data Management Admin Data Management Administrator Role

DMTDFMACSS Data File Mapping Access Data File Mapping Access

DMTDMACSS Data Mapping Access Data Mapping Access

DMTSRCACSS Data Sources Access Data Sources Access

DMTUDFACSS UDF Screen Access UDF Screen Access

DMWRITE Data Mapping Write User Group mapped will have access to add, edit, copy
and delete PLC.

DOCMGMTACC Document MGMT access Document management access

DOCMGMTADV Document MGMT advanced Document management advanced

DOCMGMTAUT Document MGMT authorize Document management authorize

DOCMGMTPHT Document MGMT phantom Document management phantom

DOCMGMTRD Document MGMT read Document management read

DOCMGMTWR Document MGMT write Document management write

DQACC DQ Access Data Quality Rule Access Role

DQADVND DQ Advanced Data Quality Rule Advanced Role

DQAUTH DQ Authorize Data Quality Rule Authorize Role

DQPHTM DQ Phantom Data Quality Rule Phantom Role

DQQRYVIEWR DQ View Query Role Data Quality View Query Role

DQREAD DQ Read Data Quality Rule Read-only Role

DQWRITE DQ Write Data Quality Rule Write Role

DRENT_ACSS Derived Entity Access Derived Entity Access

DRENT_ADVN Derived Entity Advanced Derived Entity Advanced

DRENT_AUTH Derived Entity Authorize Derived Entity Authorize

DRENT_PHNT Derived Entity Phantom Derived Entity Phantom

DRENT_ROLY Derived Entity Read Only Derived Entity Read Only

DRENT_WRIT Derived Entity Write Derived Entity Write

DTSET_ACSS Dataset Access Dataset Access

DTSET_ADVN Dataset Advanced Dataset Advanced

DTSET_AUTH Dataset Authorize Dataset Authorize

DTSET_PHNT Dataset Phantom Dataset Phantom

DTSET_ROLY Dataset Read Only Dataset Read Only

DTSET_WRIT Dataset Write Dataset Write

DT_ACCESS DT Access Data Transformation Access Role

DT_PHANTOM DT Phantom Data Transformation Phantom Role

DT_READ DT Read Data Transformation Read-only Role

DT_WRITE DT Write Data Transformation Write Role

DTADV DT Advanced DT Advanced Role

ESCUB_ACSS Essbase Cube Access Essbase Cube Access

ESCUB_ADVN Essbase Cube Advanced Essbase Cube Advanced

ESCUB_AUTH Essbase Cube Authorize Essbase Cube Authorize

ESCUB_PHNT Essbase Cube Phantom Essbase Cube Phantom

ESCUB_ROLY Essbase Cube Read Only Essbase Cube Read Only

ESCUB_WRIT Essbase Cube Write Essbase Cube Write

ETLADM ETL Analyst ETL Analyst Role

EXPACC Expression Access Expression Access Role

EXPADVND Expression Advanced Expression Advanced Role

EXPAUTH Expression Authorize Expression Authorize Role

EXPPHTM Expression Phantom Expression Phantom

EXPREAD Expression Read Only Expression Read Only Role

EXPWRITE Expression Write Expression Write Role

FFWACCESS Forms Renderer access Forms Renderer access

FFWADVNC Forms Renderer advanced Forms Renderer advanced

FFWAUTH Forms Renderer authorize Forms Renderer authorize

FFWPHTM Forms Renderer phantom Forms Renderer phantom

FFWREAD Forms Renderer read Forms Renderer read

FFWWRITE Forms Renderer write Forms Renderer write

FILACC Filter Access Filter Access Role

FILADVND Filter Advanced Filter Advanced Role

FILAUTH Filter Authorize Filter Authorize Role

FILPHTM Filter Phantom Filter Phantom

FILREAD Filter Read Only Filter Read Only Role

FILWRITE Filter Write Filter Write Role

FMCACCESS Forms Conf access Forms Configuration access

FMCADVNC Forms Conf advanced Forms Configuration advanced

FMCAUTH Forms Conf authorize Forms Configuration authorize

FMCPHTM Forms Conf phantom Forms Configuration phantom

FMCREAD Forms Configuration read Forms Configuration read

FMCWRITE Forms Configuration write Forms Configuration write

HBRACC Hier Browser Access Hier Browser Access Role

HBRADVND Hier Browser Advanced Hier Browser Advanced Role

HBRAUTH Hier Browser Authorize Hier Browser Authorize Role

HBRPHTM Hier Browser Phantom Hier Browser Phantom

HBRREAD Hier Browser Read Only Hier Browser Read Only Role

HBRWRITE Hier Browser Write Hier Browser Write Role

HIERACC Hierarchy Access Hierarchy Access Role

HIERADVND Hierarchy Advanced Hierarchy Advanced Role

HIERAUTH Hierarchy Authorize Hierarchy Authorize Role

HIERPHTM Hierarchy Phantom Hierarchy Phantom

HIERREAD Hierarchy Read Only Hierarchy Read Only Role

HIERWRITE Hierarchy Write Hierarchy Write Role

IDMGMTACC Identity MGMT access Identity management access

IDMGMTADVN Identity MGMT advanced Identity management advanced

IDMGMTAUTH Identity MGMT authorize Identity management authorize

IDMGMTPHTM Identity MGMT phantom Identity management phantom

IDMGMTREAD Identity MGMT read Identity management read

IDMGMTWRIT Identity MGMT write Identity management write

INBOXACC Inbox Access Inbox Access

MAPPR_ACSS Mapper Access Mapper Access

MAPPR_ADVN Mapper Advanced Mapper Advanced

MAPPR_AUTH Mapper Authorize Mapper Authorize

MAPPR_PHNT Mapper Phantom Mapper Phantom

MAPPR_ROLY Mapper Read Only Mapper Read Only

MAPPR_WRIT Mapper Write Mapper Write

MDBACCESS MDB Access Metadata Browser Access

MDBREAD MDB Read Metadata Browser Read-only

MDBWRITE MDB Write Metadata Browser Write

METADMIN Publish Metadata Publish Metadata Role

MIGACC Obj Migration Access Object Migration Access Role

MIGADVND Obj Migration Advanced Object Migration Advanced Role

MIGAUTH Obj Migration Authorize Object Migration Authorize Role

MIGPHTM Obj Migration Phantom Object Migration Phantom Role

MIGREAD Obj Migration Read Object Migration Read-only Role

MIGWRITE Obj Migration Write Object Migration Write Role

MREACC Manage Run Access Manage Run Access Role

MREADVND Manage Run Advanced Manage Run Advanced Role

MREAUTH Manage Run Authorize Manage Run Authorize Role

MREPHTM Manage Run Phantom Manage Run Phantom

MREREAD Manage Run Read Only Manage Run Read Only Role

MREWRITE Manage Run Write Manage Run Write Role

OBJADMADV ObjectAdmin advanced ObjectAdmin advanced access

OJFFACC OJFF Access OJFF Access

OMEXADVND Migration Export Advanced Migration Export Advanced Role

OMEXPHTM Migration Export Phantom Migration Export Phantom Role

OMEXREAD Migration Export Read Migration Export Read-only Role

OMEXWRITE Migration Export Write Migration Export Write Role

OMIMADVND Migration Import Advanced Migration Import Advanced Role

OMIMPHTM Migration Import Phantom Migration Import Phantom Role

OMIMREAD Migration Import Read Migration Import Read-only Role

OMIMWRITE Migration Import Write Migration Import Write Role

PLCACCESS PLC Access User Group mapped will have access to Link and
Summary

PLCADV PLC Advanced PLC Advanced Role

PLCAUTH PLC Authorize User Group mapped will have access to authorize the
PLC

PLCPHANTOM PLC Phantom PLC Phantom Role

PLCREAD PLC Read Only User Group mapped will have access to View Definition.

PLCWRITE PLC Write User Group mapped will have access to add, edit, copy
and delete PLC.

PTACC Process Access Process Access Role

PTADVND Process Advanced Process Advanced Role

PTAUTH Process Authorize Process Authorize Role

PTPHTM Process Phantom Process Phantom

PTREAD Process Read Only Process Read Only Role

PTWRITE Process Write Process Write Role

QADMINRL ABC Qtnr Template Admn ABC Qtnr Template Admn

QADMINVWRL ABC Qtnr Template View ABC Qtnr Template View

QLOCADMNRL ABC Qtnr Loc Admin ABC Qtnr Localized Admin

QLOCAUTHRL ABC Qtnr Loc Auth ABC Qtnr Localized Authorizer

QLOCVIEWRL ABC Qtnr Loc View ABC Qtnr Localized View

QSGNOFFRL ABC Qtnr Sign Off ABC Qtnr Sign Off

QTMPADMNRL ABC Qtnr Tmpl Admin ABC Qtnr Template Admin

QTMPVIEWRL ABC Qtnr Tmpl View ABC Qtnr Template View

QTNRADMNRL ABC Qtnr Admin ABC Qtnr Admin

QTNRCONFRL QtnrConfiguration Execute QtnrConfiguration Execute

QTNRCONIRL ABC Qtnr Confidential ABC Qtnr Confidential

QUESTMATRL ABC Qtnr Maintenance ABC Qtnr Maintenance

READLOG READ LOG Execution View Log Reader

RESTRACC Restructure Access Restructure Access

RESTREXEC Restructure Execute Restructure Execute

RESTRMOD Restructure Edit Restructure Edit

RESTRREAD Restructure Read Restructure Read

RESTRSUMM Restructure Summary Restructure Summary

RESTRWRITE Restructure Write Restructure Write

RLACC Rule Access Rule Access Role

RLADVND Rule Advanced Rule Advanced Role

RLAUTH Rule Authorize Rule Authorize Role

RLPHTM Rule Phantom Rule Phantom

RLREAD Rule Read Only Rule Read Only Role

RLWRITE Rule Write Rule Write Role

RNACC Run Access Run Access Role

RNADVND Run Advanced Run Advanced Role

RNAUTH Run Authorize Run Authorize Role

RNPHTM Run Phantom Run Phantom

RNREAD Run Read Only Run Read Only Role

RNWRITE Run Write Run Write Role

ROLREPACC User Role Report Screen User Role Report Screen Access

RTIADMIN IPE Write IPE Write

SCDACCESS SCD Access User Group mapped will have access to SCD Link and
Summary

SCDADV SCD Advanced SCD Advanced Role

SCDAUTH SCD Authorize User Group mapped will have access to authorize the
SCD

SCDPHANTOM SCD Phantom SCD Phantom

SCDREAD SCD Read Only User Group mapped will have access to View SCD

SCDWRITE SCD Write User Group mapped will have access to add, edit, copy
and delete SCD.

SRCACCESS Data Source Access User Group mapped will have access to Link and
Summary

SRCADV Data Source Advanced Data Source Advanced Role

SRCAUTH Data Source Authorize User Group mapped will have access to authorize the
Data Source

SRCPHANTOM Data Source Phantom Data Source Phantom

SRCREAD Data Source Read User Group mapped will have access to View Definition.
Only

SRCWRITE Data Source Write User Group mapped will have access to add, edit, copy
and delete Data Source.

STFACC STF Access Stress Testing Framework Access Role

STFADVND STF Advanced Stress Testing Framework Advanced Role

STFAUTH STF Authorize Stress Testing Framework Authorize Role

STFPHTM STF Phantom Stress Testing Framework Phantom Role

STFREAD STF Read Stress Testing Framework Read-only Role

STFWRITE STF Write Stress Testing Framework Write Role

SYSADMNACC System admin access Identity management access

SYSADMNADV System admin advanced System administration advanced

SYSADMNAU System admin authorize System configuration authorize

SYSADMNPHT System admin phantom System administration phantom

SYSADMNRD System admin read System administration read

SYSADMNWR System admin write System administration write

SYSAMHM Fusion AMHM Admin Fusion Dimension Maintenance Admin Role

SYSAMHMUMM Fusion AMHM UMM Map Admin Fusion UMM Maintenance Admin Role

SYSEXPN Fusion Expressions Admin Fusion Expressions Admin Role

SYSFILTERS Fusion Filters Admin Fusion Filters Admin Role

UAMADMNACC UAM AdminActivity Report UAM AdminActivity Report Screen Access

UDFACCESS UDF Access User Group mapped will have access to UDF Link and
Summary

UDFADV UDF Advanced UDF Advanced Role

UDFAUTH UDF Authorize User Group mapped will have access to authorize the
UDF

UDFPHANTOM UDF Phantom UDF Phantom

UDFREAD UDF Read Only User Group mapped will have access to View UDF.

UDFWRITE UDF Write User Group mapped will have access to add, edit, copy
and delete UDF.

USRPOPACC User Id Population Report User Id Population Report Screen Access

WFACC Workflow Access Workflow Access

WFADMINACC Process Admin User Process Admin User

WFADV Workflow Advanced Workflow Advanced

WFAUTH Workflow Authorize Workflow Authorize

WFDELACC Process Delegation User Process Delegation User

WFDELGADM Workflow Delegation Admin Workflow Delegation Admin

WFMACC Workflow Monitor Access Workflow Monitor Access

WFMWRITE Manage Workflow Monitor Manage Workflow Monitors

WFREAD Workflow Read Workflow Read

WFWRITE Workflow Write Workflow Write

XLATMACCES Atomic excel access Atomic schema excel upload access

XLATMADVNC Atomic excel advanced Atomic schema excel upload advanced

XLATMAUTH Atomic excel authorize Atomic schema excel upload authorize

XLATMPHTM Atomic excel phantom Atomic schema excel upload phantom

XLATMREAD Atomic excel upload read Atomic schema excel upload read

XLATMWRITE Atomic excel upload write Atomic schema excel upload write

XLCNFADVNC Config excel advanced Configuration schema excel upload and download
access

18.3 OFS Analytical Applications Infrastructure Functions


The following table shows the Infrastructure Functions.

Table 199: Infrastructure Function Code, Name, and Description

V_FUNCTION_CODE V_FUNCTION_NAME V_FUNCTION_DESC

ACCPURGE Purge Access Function For Purge Access

ADAPTERS Run Adapters The user mapped to this function will have rights to run
Reveleus adapters

ADDMRE Add Manage Run The user mapped to this function can add the request
for run execution

ADDPROCESS Add Process tree The user mapped to this function can add the process
tree

ADDRULE Add Rule The user mapped to this function can add the rules

ADDRUN Add Run The user mapped to this function can add the run

ADD_F_KBD Add Flexible KBD The user mapped to this function can add Flexible KBD

ADD_RESTR Add Restructure The user mapped to this function can add Restructure

ADD_WF Add Workflow and Process Definitions The user mapped to this function can Create New Workflow and Process definitions

ADMINSCR Administration Screen The user mapped to this function can access the
Administration Screen

ADVDRLTHR Access to Advanced drill thru The User mapped to this function will have access to Advanced Drill thru

ALDADD Add Cube The user mapped to this function can add cubes

ALDATH Authorize Cube The user mapped to this function can authorize cubes

ALDDEL Delete Cube The user mapped to this function will have rights to
delete cubes

ALDLINK Essbase Cube Link Essbase Cube Link

ALDMOD Modify Cube The user mapped to this function can modify cubes

ALDSUMM Essbase Cube Summary Essbase Cube Summary

ALDVIW View Cube The user mapped to this function can view cubes

ALSADD Add Alias The user mapped to this function can add Alias

ALSATH Authorize Alias The user mapped to this function can authorize Alias

ALSDEL Delete Alias The user mapped to this function will have rights to
delete Alias

ALSLINK Alias Link Alias Link

ALSMOD Modify Alias The user mapped to this function can modify Alias

ALSSUMM Alias Summary Alias Summary

ALSVIW View Alias The user mapped to this function can view Alias

APPSRVR Application Server The user mapped to this function can access the
Screen Application Server Screen

ARCPROCES Archive Process The user mapped to this function can archive the
process tree

ARCRULE Archive Rule The user mapped to this function can archive the Rule

ARCRUN Archive Run The user mapped to this function can archive the Run

ATHPROCESS Authorize Process Tree The user mapped to this function can authorize Process
Tree

ATHRULE Authorize Rule The user mapped to this function can authorize the rule

ATHRUN Authorize Run The user mapped to this function can authorize run

ATH_F_KBD Authorize Flexible KBD The user mapped to this function can authorize Flexible
KBD

AUDTR Audit Trail Report This function displays the report for audit summary

AUD_TRL Audit Trail Report The user mapped to this function can access the Audit
Screen Trail Report Screen

AUTH_MAP Authorize Map(s) The user mapped to this function can AUTHORIZE Map
definitions

AUTH_SCR Metadata Authorize The user mapped to this function can see Authorization
Screen Screen

AUTH_WF Authorize Access to The user mapped to this function can Authorize the
Workflow and Process Workflow and Process Definition

BATCHMAINT Batch Maintenance Link The user mapped to this function can access Batch
Maintenance Link

BATCHEXEC Batch Execution Link The user mapped to this function can access Batch
Execution Link

BATCHMON Batch Monitor Link The user mapped to this function can access Batch
Monitor Link

BATCHSCHLD Batch Scheduler Link The user mapped to this function can access Batch
Scheduler Link

BATCHVLOG View Log Link The user mapped to this function can access View Log
Link

BATCHCNCL Batch Cancel Link The user mapped to this function can access Batch
Cancel Link

BATCHREP Batch Processing The user mapped to this function can access Batch
Report Link Processing Report Link

BATPRO Batch Processing The user mapped to this function will have rights to
process batch

BPROCADD Add Business Processor The user mapped to this function can add business
processors

BPROCATH Authorize Business The user mapped to this function can authorize business
Processor processors

BPROCDEL Delete Business The user mapped to this function can delete business
Processor processors

BPROCLINK Business Processor Link Business Processor Link

BPROCMOD Modify Business The user mapped to this function can modify business
Processor processors

BPROCSUMM Business Processor Summary Business Processor Summary

BPROCVIW View Business The user mapped to this function can view business
Processor processors

CATADD Add Catalog This function gives access to add a Catalog.

CATARCH Archive Catalog This function gives access to archive a Catalog.

CATAUTH Authorize Catalog This function gives access to authorize a Catalog.

CATCOMP Compare Catalog This function gives access to compare a Catalog.

CATCOPY Copy Catalog This function gives access to copy a Catalog.

CATDWN Download Catalog This function gives access to download a Catalog.

CATEDIT Edit Catalog This function gives access to edit a Catalog.

CATEXP Export Catalog This function gives access to export a Catalog.

CATGEN Generate Catalog This function gives access to generate a Catalog.

CATIGNACC Ignore Catalog Access This function gives access to ignore a Catalog access.

CATIGNLCK Ignore Catalog Lock This function gives access to ignore a Catalog lock.

CATLAT Latest Catalog This function gives access to make a Catalog latest.

CATLINK Catalog Link This Function gives user access to the LHS link.

CATLOCK Lock Catalog This function gives access to lock a Catalog.

CATPUB Publish Catalog This function gives access to publish a Catalog.

CATPURGE Purge Catalog This function gives access to purge a Catalog.

CATREM Remove Catalog This function gives access to remove a Catalog.

CATREST Restore Catalog This function gives access to restore a Catalog.

CATSUM Catalog Summary This function gives Summary Page access to the
mapped user.

CATTOKEN Catalog Token This function gives access to tokens of a Catalog.

CATTRACE Trace Catalog This function gives access to trace a Catalog.

CATVIEW View Catalog This function gives access to view a Catalog.

CFEDEF Cash Flow Equation The user mapped to this function can view/add the
Definition Cash Flow Equation definitions

CFG Configuration The user mapped to this function will have access to
configuration details

CMPPROCESS Compare Process The user mapped to this function can compare the
process tree

CMPRULE Compare Rule The user mapped to this function can compare the rules

CMPRUN Compare Run The user mapped to this function can compare the run

CONFXLADMN Config ExcelUpload The user mapped to this function can upload data to
Config schema tables

CPYPROCESS Copy Process Tree The user mapped to this function can copy Process Tree

CPYRULE Copy Rule The user mapped to this function can copy Rule

CPYRUN Copy Run The user mapped to this function can copy Run

CRTMAPADV Create Map Advanced The user mapped to this function will have rights to the
advanced options of map maintenance

CRT_MAP Create Map The user mapped to this function can CREATE/SAVEAS
Map definitions

CWSDOCMGMT Document The user mapped to this function can use Document
Management Access Management APIs via Callable Services Framework

CWSEXTWSAS Call Remote Web The user mapped to this function can call web services
Services configured in the Callable Services Framework

CWSHIERRFR Refresh Hierarchies The user mapped to this function can refresh hierarchies
through the Callable Services Framework

CWSPR2ACCS Execute Runs - Rules The user mapped to this function can execute runs and
rules through the Callable Services Framework

CWSSMSACCS Remote SMS Access The user mapped to this function can access SMS APIs
through the Callable Services Framework

CWSUMMACCS Remote UMM Access The user mapped to this function can access UMM APIs
through the Callable Services Framework

CWS_STATUS Result of request - The user mapped to this function can access requests
Status of all status through the Callable Services Framework

CWS_TRAN Result of own request The user mapped to this function can access own
only requests status using Callable Services Framework

DATADD Add Dataset The user mapped to this function can add datasets

DATASEC Data Security Function to see non-redacted data

DATASECADV Data Security Advanced Function to execute the redaction policy batch

DATATH Authorize Dataset The user mapped to this function can authorize datasets

DATDEL Delete Dataset The user mapped to this function will have rights to
delete datasets.

DATLINK Dataset Link Dataset Link

DATMOD Modify Dataset The user mapped to this function can modify datasets.

DATSUMM Dataset Summary Dataset Summary

DATVIW View Dataset The user mapped to this function can view datasets.

DBD Database Details The user mapped to this function will have access to
database details.

DBS Database Server The user mapped to this function will have access to
Database Server details.

DCLSADD Add Data Cluster This function gives access to add a Data Cluster

DCLSCOPY Copy Data Cluster This function gives access to copy a Data Cluster

DCLSEDIT Edit Data Cluster This function gives access to edit a Data Cluster

DCLSPURGE Purge Data Cluster This function gives access to purge a Data Cluster

DCLSVIEW View Data Cluster This function gives access to view a Data Cluster

DEEADD Add Derived Entities The user mapped to this function can add derived
entities.

DEEATH Authorize Derived The user mapped to this function can authorize derived
Entities entities.

DEEDEL Delete Derived Entities The user mapped to this function can delete derived
entities.

DEELINK Derived Entity Link Derived Entity Link

DEEMOD Modify Derived Entities The user mapped to this function can modify derived
entities.

DEESUMM Derived Entity Summary Derived Entity Summary

DEEVIW View Derived Entities The user mapped to this function can view derived
entities

DEFADM Defi Administrator The user mapped to this function will have Defi
Administration rights

DEFAUTH Forms Authorization The user mapped to this function will have rights to
authorize the DEFQ forms

DEFEXL DeFi Excel DeFi Excel

DEFQADM Defq Administrator The user mapped to this function will have Defi
Administration rights

DEFQUSR Defq User The user mapped to this function will have Defi user
rights

DEFUSR Defi User The user mapped to this function will have Defi user
rights

DELPROCESS Delete Process The user mapped to this function can delete the process

DELRULE Delete Rule The user mapped to this function can delete the rules

DELRUN Delete Run The user mapped to this function can delete the run

DEL_MAP Delete Map The user mapped to this function can DELETE Map
definitions

DEL_WF Delete Workflow and The user mapped to this function can Delete Workflow
Process Definitions and Process definitions.

DEPRE_ACC Dummy Menu Dummy Menu

DIMADD Add Dimension The user mapped to this function can add dimensions.

DIMATH Authorize Dimension The user mapped to this function can authorize
dimensions.

DIMDEL Delete Dimension The user mapped to this function will have rights to
delete dimensions.

DIMLINK Business Dimension Link Business Dimension Link

DIMMOD Modify Dimension The user mapped to this function can modify
dimensions

DIMSUMM Business Dimension Summary Business Dimension Summary

DIMVIW View Dimension The user mapped to this function can view dimensions

DMADD Add Data Mapping This function gives access to add a Data Mapping

DMAUTH Authorize Data This function gives access to authorize a Data Mapping
Mapping

DMCONFEDIT Data Management This Function gives user access to add/edit a DMT
Configuration Edit Configuration Property.

DMCONFSUMM Data Management This Function gives user access to the DMT
Configuration Configuration Summary.

DMCOPY Copy Data Mapping This function gives access to copy a Data Mapping

DMDEL Delete Data Mapping This function gives access to delete a Data Mapping

DMEDIT Edit Data Mapping This function gives access to edit a Data Mapping

DMLAT Make Latest Data This function gives access to make latest a Data
Mapping Mapping

DMMFILEUPL Model Xml Upload The user mapped to this function can upload erwin
Model File for Model Upload

DMPURGE Purge Data Mapping This function gives access to purge a Data Mapping

DMSUMM Data Mapping This Function gives user access to the Data Mapping
Summary Summary and LHS Link.

DMTDFM Data File Mapping The user mapped to this function can access the Data
Screen File Mapping Screen

DMTDM Data Mapping Screen The user mapped to this function can access the Data
Mapping Screen

DMTSRC Data Sources Screen The user mapped to this function can access the Data
Sources Screen

DMTUDF UDF Screen The user mapped to this function can access the UDF
Screen

DMVIEW View Data Mapping This function gives access to view a Data Mapping

DMVIEWSQL View SQL Data Mapping This function gives access to view/validate a Data
Mapping/File Mapping SQL

DPPDEL Delete DMT This function gives access to delete a DMT Performance
Performance Params Parameters

DPPEDIT Edit DMT Performance This function gives access to edit a DMT Performance
Params Parameters

DQLADD Data Quality Add This function is for Data Quality Map applet

DQ_ADD Data Quality Add Rule The user mapped to this function can add DQ Rule

DQ_AUTH Data Quality The user mapped to this function can authorize DQ Rule
Authorisation Rule

DQ_CPY Data Quality Copy Rule The user mapped to this function can copy DQ Rule

DQ_DEL Data Quality Delete Rule The user mapped to this function can delete DQ Rule

DQ_EDT Data Quality Edit Rule The user mapped to this function can edit DQ Rule

DQ_GP_ADD Data Quality Add Rule The user mapped to this function can add DQ Rule
Group Group

DQ_GP_CPY Data Quality Copy Rule The user mapped to this function can copy DQ Rule
Group Group

DQ_GP_DEL Data Quality Delete Rule The user mapped to this function can delete DQ Rule
Group Group

DQ_GP_EDT Data Quality Edit Rule The user mapped to this function can edit DQ Rule
Group Group

DQ_GP_EXEC Data Quality Execute The user mapped to this function can execute DQ Rule
Rule Group Group

DQ_GP_VIW Data Quality View Rule The user mapped to this function can view DQ Rule
Group Group

DQ_LNK_ACC Data Quality Link The user mapped to this function can access the DQ
Access Links

DQ_QRY_VIW Data Quality View The user mapped to this function can generate the rule
Query query and view the generated query.

DQ_SUMM Data Quality Summary The user mapped to this function can access the DQ
Access Summary Pages.

DQ_VIW Data Quality View Rule The user mapped to this function can view DQ Rule.

EDIT_WF Edit Workflow and The user mapped to this function can Edit Workflow and
Process Definitions Process definitions.

ENABLEUSR Enable User Screen The user mapped to this function can access the Enable
User Screen.

ETLDEF DI Designer Defining Application, Extract, Flat-File, Mapping

ETLDTQ DTDQ Data Quality Rules and Data Transformation

ETLUSR DI User The user mapped to this function will be a Data
Integrator user

EXEC_RESTR Execute Restructure The user mapped to this function can execute
Restructure Process

EXEPROCESS Execute Process The user mapped to this function can execute process
tree

EXERULE Execute Rule The user mapped to this function can execute rules

EXERUN Execute Run The user mapped to this function can execute run

EXEVIEWLOG Execution Log Viewer Screen for viewing execution logs

EXPMD Export Metadata The user mapped to this function can Export Metadata

EXTPROCESS Export Process The user mapped to this function can export process
tree

EXTRULE Export Rule The user mapped to this function can export Rule

EXTRUN Export Run The user mapped to this function can export Run

FFWSCREEN Forms Renderer Screen Forms Renderer Screen

FILTERRULE Filters in Rule The user mapped to this function can apply filters to the
rules

FLOCADMFN ABC Questionnaire ABC Questionnaire Localized Admin
Localized Admin

FLOCAUTHFN ABC Questionnaire Loc ABC Questionnaire Loc Auth
Auth

FLOCVIEWFN ABC Questionnaire Loc ABC Questionnaire Loc View
View

FRMMGR Forms Manager The user mapped to this function can use Forms
Manager

FTMPLADMFN ABC Questionnaire ABC Questionnaire Template Admin
Template Admin

FTMPLVIEWF ABC Questionnaire ABC Questionnaire Template View
Template View

FUNCMAINT Function Maintenance The user mapped to this function can access the
Screen Function Maintenance Screen

FUNCROLE Function Role Map The user mapped to this function can access the
Screen Function Role Map Screen

FU_ATR_ADD Fusion Add Attributes The user mapped to this function can Create New
Attributes

FU_ATR_CPY Fusion Copy Attributes The user mapped to this function can Copy Attributes

FU_ATR_DD Fusion Attributes - View The user mapped to this function can View Dependent
Dependent Data Data for Attributes

FU_ATR_DEL Fusion Delete Attributes The user mapped to this function can Delete Attributes

FU_ATR_EDT Fusion Edit Attributes The user mapped to this function can Edit Attributes

FU_ATR_HP Fusion Attribute Home The user mapped to this function can view Attribute
Page Home Page

FU_ATR_VIW Fusion View Attributes The user mapped to this function can View Attributes

FU_EXP_ADD Fusion Add Expressions The user mapped to this function can Create New
Expressions

FU_EXP_CPY Fusion Copy The user mapped to this function can Copy Expressions
Expressions

FU_EXP_DD Fusion View The user mapped to this function can View Dependent
Dependency Data for Expressions
Expressions

FU_EXP_DEL Fusion Delete The user mapped to this function can Delete
Expressions Expressions

FU_EXP_EDT Fusion Edit Expressions The user mapped to this function can Edit Expressions

FU_EXP_HP Fusion Expns Home The user mapped to this function can view Expressions
Page Home Page

FU_EXP_IGN Fusion Expression The user mapped to this function can ignore the access
Ignore Access type for Expression

FU_EXP_LNK Fusion Expressions Link The user mapped to this function can view Expression
Summary Page in LHS Menu

FU_EXP_VIW Fusion View The user mapped to this function can View Expressions
Expressions

FU_FIL_ADD Fusion Add Filters The user mapped to this function can Create New Filters

FU_FIL_CPY Fusion Copy Filters The user mapped to this function can Copy Filters

FU_FIL_DD Fusion Filters - View The user mapped to this function can View Dependent
Dependent Data Data for Filters

FU_FIL_DEL Fusion Delete Filters The user mapped to this function can Delete Filters

FU_FIL_EDT Fusion Edit Filters The user mapped to this function can Edit Filters

FU_FIL_HP Fusion Filters Home The user mapped to this function can view Filters Home
Page Page

FU_FIL_IGN Fusion Filters Ignore The user mapped to this function can ignore the access
Access type for Filters

FU_FIL_LNK Fusion Filters Link The user mapped to this function can access Fusion
Filters Summary Link

FU_FIL_SQL Fusion Filters - View The user mapped to this function can view SQL for
SQL Filters

FU_FIL_VIW Fusion View Filters The user mapped to this function can View Filters

FU_GP_VIW Global Preferences View The user mapped to this function can view Global
Preferences

FU_HBR_ADD Fusion Hier Browser The user mapped to this function can add member in
Add AMHM Hierarchy Browser

FU_HBR_DEL Fusion Hier Browser The user mapped to this function can delete member in
Delete AMHM Hierarchy Browser

FU_HBR_EDT Fusion Hier Browser The user mapped to this function can edit in AMHM
Edit Hierarchy Browser

FU_HBR_SMY Fusion Hier Browser The user mapped to this function can use shared folder
Summary in AMHM Hierarchy Browser

FU_HIE_ADD Fusion Add Hierarchies The user mapped to this function can Create New
Hierarchies

FU_HIE_CPY Fusion Copy Hierarchies The user mapped to this function can Copy Hierarchies

FU_HIE_DD Fusion Hierarchies - The user mapped to this function can View Dependent
View Dependent Data Data for Hierarchies

FU_HIE_DEL Fusion Delete The user mapped to this function can Delete Hierarchies
Hierarchies

FU_HIE_EDT Fusion Edit Hierarchies The user mapped to this function can Edit Hierarchies

FU_HIE_HP Fusion Hierarchy Home The user mapped to this function can view Hierarchy
Page Home Page

FU_HIE_IGN Fusion Hierarchy Ignore The user mapped to this function can ignore the access
Access type for Hierarchies

FU_HIE_LNK Fusion Hierarchy Link The user mapped to this function can view Hierarchy
Summary Page Link in LHS Menu

FU_HIE_UMM Fusion Hierarchies to The user mapped to this function can Map Fusion
UMM Mapping Hierarchies to UMM Hierarchies

FU_HIE_VIW Fusion View Hierarchies The user mapped to this function can View Hierarchies

FU_MEM_ADD Fusion Add Members The user mapped to this function can Create New
Members

FU_MEM_CPY Fusion Copy Members The user mapped to this function can Copy Members

FU_MEM_DD Fusion Members - View The user mapped to this function can View Dependent
Dependent Data Data for Members

FU_MEM_DEL Fusion Delete Members The user mapped to this function can Delete Members

FU_MEM_EDT Fusion Edit Members The user mapped to this function can Edit Members

FU_MEM_HP Fusion Member Home The user mapped to this function can view Member
Page Home Page

FU_MEM_VIW Fusion View Members The user mapped to this function can View Members

FU_MIG_ADD Object Migration Create The user mapped to this function can Create Migration
Migration Ruleset Ruleset

FU_MIG_CFG Object Migration Source The user mapped to this function can manipulate Source
Configuration Configuration

FU_MIG_CPY Object Migration Copy The user mapped to this function can Copy Migration
Migration Ruleset Ruleset

FU_MIG_CRN Cancel Migration The user mapped to this function can Cancel migration
Execution execution

FU_MIG_DEL Object Migration Delete The user mapped to this function can Delete Migration
Migration Ruleset Ruleset

FU_MIG_EDT Object Migration Edit The user mapped to this function can Edit Migration
Migration Ruleset Ruleset

FU_MIG_HP Object Migration Home The user mapped to this function can access the Object
Page Migration Link

FU_MIG_RUN Execute/Run Migration The user mapped to this function can Run the migration
Process process

FU_MIG_SUM Object Migration The user mapped to this function can view ruleset
Summary Page summary

FU_MIG_VCF Object Migration View The user mapped to this function can view Source
Source Configuration Configuration

FU_MIG_VIW Object Migration View The user mapped to this function can View Migration
Migration Ruleset Ruleset

FU_SQL_ADD SQL Rule Add This function is for SQL Rule Add

FU_SQL_CPY SQL Rule Copy This function is for SQL Rule Copy

FU_SQL_DEL SQL Rule Delete This function is for SQL Rule Delete

FU_SQL_EDT SQL Rule Edit This function is for SQL Rule Edit

FU_SQL_RUN SQL Rule Run This function is for SQL Rule Run

FU_SQL_VIW SQL Rule View This function is for SQL Rule View

F_KBD_LINK Flexible KBD Link The user mapped to this function can see the Flexible
KBD Link

F_KBD_SUM Flexible KBD Summary The user mapped to this function can view summary of
Flexible KBD

GMVDEF GMV Definition The user mapped to this function can view/add the
General Market Variable definitions

GSTMNU Menu for Guest User Menu for Guest User

HCYADD Add Hierarchy The user mapped to this function can add hierarchies

HCYATH Authorize Hierarchy The user mapped to this function can authorize
hierarchies

HCYDEL Delete Hierarchy The user mapped to this function will have rights to
delete hierarchies

HCYLINK Business Hierarchy Link Business Hierarchy Link

HCYMOD Modify Hierarchy The user mapped to this function can modify hierarchies

HCYSUMM Business Hierarchy Summary Business Hierarchy Summary

HCYVIW View Hierarchy The user mapped to this function can view hierarchies

HOLMAINT Holiday Maintenance The user mapped to this function can access the Holiday
Screen Maintenance Screen

IBMADD Import Business Model The user mapped to this function can import business
models

IMPMD Import Metadata The user mapped to this function can Import Metadata

INBOXLINK Link Access to Inbox The user mapped to this function can open Inbox

IND Information Domain The user mapped to this function will have access to
information domain details

LCKPROCESS Lock Process The user mapped to this function can lock process tree

LCKRULE Lock Rule The user mapped to this function can lock rules

LCKRUN Lock Run The user mapped to this function can lock run

LCK_F_KBD Lock Flexible KBD The user mapped to this function can lock Flexible KBD

LCK_RESTR Lock Restructure The user mapped to this function can lock Restructure

LINK_WF Link Access to Workflow The user mapped to this function can see the Workflow
and Process Definitions and Process Orchestration Link

LOCDESC Locale Desc Upload The user mapped to this function can access the Locale
Screen Desc Upload Screen

MAN_WF_M Manage Workflow and The user mapped to this function can Manage Workflow
Process Monitor and Process Monitor

MAPLINK Map Maintenance Link Map Maintenance Link

MAPSUMM Map Maintenance Map Maintenance Summary
Summary

MDDIFF Metadata Difference The user mapped to this function can access the
Screen Metadata Difference Screen

MDLAUTH Model Authorize The user mapped to this function can Authorize Model
Maintenance

MDLCALIB Model Calibration The user mapped to this function can view/add the
Model Calibration screen

MDLCHAMP Model Make Champion The user mapped to this function can view the
Champion Challenger screen

MDLDEF Model Definition The user mapped to this function can view/add the
Model definitions

MDLDEPLOY Model Deployment The user mapped to this function can access the Model
Deployment screen

MDLEXEC Model Execution The user mapped to this function can access the Model
Execution screen

MDLOUTPUT Model Outputs The user mapped to this function can view the Model
Outputs

MDMP Metadata Segment Map The user mapped to this function will have rights to
perform metadata segment mapping

METMAP Map Metadata The user mapped to this function can Map Metadata to
Application

METPUB Metadata Publish The user mapped to this function can publish metadata

METVIW View Metadata The user mapped to this function can access metadata
browser

MLPROCESS Make Latest Process The user mapped to this function can make latest
Process

MLRULE Make Latest Rule The user mapped to this function can make latest rule

MLRUN Make Latest Run The user mapped to this function can make latest run

MODMRE Modify Manage Run The user mapped to this function can modify the
request for run execution

MODPROCESS Modify Process Tree The user mapped to this function can modify Process
Tree

MODRULE Modify Rule The user mapped to this function can modify the rules

MODRUN Modify Run The user mapped to this function can modify run

MOD_F_KBD Edit Flexible KBD The user mapped to this function can edit Flexible KBD

MOD_MAP Modify Map The user mapped to this function can SAVE Map
definitions

MOD_RESTR Edit Restructure The user mapped to this function can edit Restructure

MRELINK Manage Run Link The user mapped to this function can view the manage
run link

MRESUM Manage Run Summary The user mapped to this function can view the manage
run summary

MSRADD Add Measure The user mapped to this function can add measures

MSRATH Authorize Measure The user mapped to this function can authorize
measures

MSRDEL Delete Measure The user mapped to this function will have rights to
delete measures

MSRLINK Business Measure Link Business Measure Link

MSRMOD Modify Measure The user mapped to this function can modify measures

MSRSUMM Business Measure Summary Business Measure Summary

MSRVIW View Measure The user mapped to this function can view measures

OBJMGR_EXP Export Objects The user mapped to this function can Export Objects

OBJMGR_IMP Import Objects The user mapped to this function can Import Objects

OFSAAAI FS Enterprise Modeling The user mapped to this function can access Financial
Access Code Services Enterprise Modeling Application

OFSIPE FS Inline Processing The user mapped to this function can access Financial
Engine Access Code Services Inline Processing Engine Application

OJFFLINK Access to OJET Forms The user mapped to this function can access OJET
Framework Forms Framework

OJFF_MASK Access to OJET Forms The user mapped to this function can access OJET
Framework Masking Forms Framework Masking Screen

OLAPDETS OLAP Details Screen The user mapped to this function can access the OLAP
Details Screen

OM_EX_ADD Add Export Definitions The user mapped to this function can add export
definitions

OM_EX_COPY Copy Export Definitions The user mapped to this function can copy export
definitions

OM_EX_DLTE Delete Export The user mapped to this function can delete export
Definitions definitions

OM_EX_EDIT Edit Export Definitions The user mapped to this function can edit export
definitions

OM_EX_TRGR Trigger Export The user mapped to this function can trigger export
Definitions definitions

OM_EX_VIEW View Export Definitions The user mapped to this function can view export definitions

OM_IM_ADD Add Import Definitions The user mapped to this function can add import definitions

OM_IM_COPY Copy Import Definitions The user mapped to this function can copy import definitions

OM_IM_DLTE Delete Import Definitions The user mapped to this function can delete import definitions

OM_IM_EDIT Edit Import Definitions The user mapped to this function can edit import definitions

OM_IM_TRGR Trigger Import Definitions The user mapped to this function can trigger import definitions

OM_IM_VIEW View Import Definitions The user mapped to this function can view import definitions

OPRABORT Batch Abort The user mapped to this function can Abort Batch

OPRADD Create Batch The user mapped to this function will have rights to define batches

OPRCANCEL Batch Cancellation The user mapped to this function can Cancel Batch

OPRDEL Delete Batch The user mapped to this function will have rights to delete batches

OPREXEC Execute Batch The user mapped to this function will have rights to run, restart and rerun batches

OPRLINK Batch Link This function gives access to the LHS Link for Operations.

OPRMON Batch Monitor The user mapped to this function will have rights to monitor batches

OPRSCHEDUL Schedule Batch The user mapped to this function can schedule batches

ORACBADD Add Oracle Cube The user mapped to this function can add Oracle cubes

ORACBATH Authorize Oracle Cube The user mapped to this function can authorize Oracle cubes

ORACBDEL Delete Oracle Cube The user mapped to this function will have rights to delete Oracle cubes

ORACBMOD Modify Oracle Cube The user mapped to this function can modify Oracle cubes

ORACBVIW View Oracle Cube The user mapped to this function can view Oracle cubes

ORACLINK Oracle Cube Link Oracle Cube Link

ORACSUMM Oracle Cube Summary Oracle Cube Summary

PATCHINFO View Patch Information The user mapped to this function can view the list of all fixes/patches applied


PBLPROCESS Publish Process The user mapped to this function can publish the process tree

PBLRULE Publish Rule The user mapped to this function can publish the rules

PBLRUN Publish Run The user mapped to this function can publish the run

PLCADD Add Post Load Changes This function gives access to add a PLC

PLCAUTH Authorize Post Load Changes This function gives access to authorize a PLC

PLCCOPY Copy Post Load Changes This function gives access to copy a PLC

PLCDEL Delete Post Load Changes This function gives access to delete a PLC

PLCEDIT Edit Post Load Changes This function gives access to edit a PLC

PLCGENLOG Generate DT Logic This function gives access to Generate the DT Logic

PLCLAT Make Latest Post Load Changes This function gives access to make latest a PLC

PLCPURGE Purge Post Load Changes This function gives access to purge a PLC

PLCSUMM PLC Summary This function gives user access to the PLC Summary.

PLCVIEW View Post Load Changes This function gives access to view a PLC

PR2SCREEN PR2 Screens The user mapped to this function can access PR2 screens

PRGPROCESS Purge Process The user mapped to this function can purge the process tree

PRGRULE Purge Rule The user mapped to this function can purge the rules

PRGRUN Purge Run The user mapped to this function can purge the run

PROFMAINT Profile Maintenance Screen The user mapped to this function can access the Profile Maintenance Screen

PTIGNACC Process Ignore Access If mapped, the user will be able to add or remove access type restrictions on the process object

PTIGNLCK Process Ignore Lock If mapped, the user will be able to add or remove a lock on the process object

PTLINK Process Link The user mapped to this function can view the process link

PTSUM Process Summary The user mapped to this function can view the process summary

QADMINFN ABC Questionnaire Template Admin Func ABC Questionnaire Template Admin Func


QADMINVWFN ABC Questionnaire Template View Func ABC Questionnaire Template View Func

QCODMNUFN ABC Qstnaire Coordn Menu Questionnaire Coordinator Menu

QCONFIDNFN ABC Qtnr Confidential Func ABC Questionnaire Confidential Function

QLOCADMFN ABC Questionnaire Localized Admin Func ABC Questionnaire Localized Admin Func

QLOCAUTFN ABC Questionnaire Localized Auth Func ABC Questionnaire Localized Auth Func

QLOCVIWFN ABC Questionnaire Localized View Func ABC Questionnaire Localized View Func

QSIGNOFFFN ABC Questionnaire Signoff Func ABC Questionnaire Signoff Func

QTNRADMFN ABC Questionnaire Admin Func ABC Questionnaire Admin Func

QTNRCONFFN Configure Questionnaire Attributes The user mapped to this function can execute the QtnrConfiguration Process

REGRRFCOMP Component Registration The user mapped to this function can register Components for the Rules Framework

RESTPASS Restricted Passwords Screen The user mapped to this function can access the Restricted Passwords Screen

RESTR_LINK Restructure Link The user mapped to this function can see the Restructure Link

RESTR_SUM Restructure Summary The user mapped to this function can view the summary of Restructure

RLIGNACC Rule Ignore Access If mapped, the user will be able to add or remove access type restrictions on the rule object

RLIGNLCK Rule Ignore Lock If mapped, the user will be able to add or remove a lock on the rule object

RLLINK Rule Link The user mapped to this function can view the rule link

RLSETCFG Rules Setup Configuration Screen The user mapped to this function can access the Rules Setup Configuration Screen

RLSUM Rule Summary The user mapped to this function can view the rule summary

RNIGNACC Run Ignore Access If mapped, the user will be able to add or remove access type restrictions on the run object

RNIGNLCK Run Ignore Lock If mapped, the user will be able to add or remove a lock on the run object


RNLINK Run Link The user mapped to this function can view the run link

RNSUM Run Summary The user mapped to this function can view the run summary

ROLEMAINT Role Maintenance Screen The user mapped to this function can access the Role Maintenance Screen

RRFSCREEN Rules Framework Screens The user mapped to this function can access Rules Framework screens

RSTPROCESS Restore Process The user mapped to this function can restore the process tree

RSTRULE Restore Rule The user mapped to this function can restore the Rule

RSTRUN Restore Run The user mapped to this function can restore the Run

RTIACC Real Time Infrastructure Function Real Time Infrastructure Function

RTIASS Real Time Assessment Access Real Time Assessment Access

RTIEVAL Real Time Evaluation Access Real Time Evaluation Access

RTIPROF Real Time Profile Access Real Time Profile Access

SANDBXAUTH Sandbox Authorize The user mapped to this function can Authorize a Sandbox Maintenance

SANDBXCR Sandbox Creation The user mapped to this function can view/add the Sandbox definitions

SANDBXMOD Sandbox Maintenance The user mapped to this function can view the Sandbox Maintenance

SAVEMD Save Metadata Screen The user mapped to this function can access the Save Metadata Screen

SCDADD Add SCD This function gives access to add a Slowly Changing Dimension

SCDAUTH Authorize SCD This function gives access to authorize a Slowly Changing Dimension

SCDCOPY Copy SCD This function gives access to copy a Slowly Changing Dimension

SCDDEL Delete SCD This function gives access to delete a Slowly Changing Dimension

SCDEDIT Edit SCD This function gives access to edit a Slowly Changing Dimension

SCDLAT Make Latest SCD This function gives access to make latest a Slowly Changing Dimension


SCDPURGE Purge SCD This function gives access to purge a Slowly Changing Dimension

SCDSUMM SCD Summary This function gives user access to the Slowly Changing Dimension Summary

SCDVIEW View SCD This function gives access to view a Slowly Changing Dimension

SCNDEF Scenario Definition The user mapped to this function can define the scenarios

SCROPC Operator Console The user mapped to this function will have access to the operator console

SCRSAU System Administrator Screen The user mapped to this function can access system administrator screens

SCR_MDB MDB Screen The user mapped to this function can access the MDB screen

SEGMAINT Segment Maintenance Screen The user mapped to this function can access the Segment Maintenance Screen

SRCADD Add Data Source This function gives access to add a Data Source

SRCAUTH Authorize Data Source This function gives access to authorize a Data Source

SRCCOPY Copy Data Source This function gives access to copy a Data Source

SRCDEL Delete Data Source This function gives access to delete a Data Source

SRCEDIT Edit Data Source This function gives access to edit a Data Source

SRCLAT Make Latest Data Source This function gives access to make latest a Data Source

SRCPURGE Purge Data Source This function gives access to purge a Data Source

SRCSUMM Source Summary This function gives user access to the Data Source Summary

SRCVIEW View Data Source This function gives access to view a Data Source

STRESSDEF Stress Definition The user mapped to this function can define the stress

SUM_WF Summary Access to Workflow and Process Definitions The user mapped to this function can View Summary of Workflow and Process definitions

SYSADM System Administrator The user mapped to this function will be a system administrator

SYSATH System Authorizer The user mapped to this function will be a system authorizer

TASKCANCEL Cancel Task The user mapped to this function can Cancel Task

TECHAUTH Authorize Technique The user mapped to this function can authorize techniques

TECHDEF Add Technique The user mapped to this function can define techniques


TRANS_DOC Access to Transfer Documents Ownership The user mapped to this function will have access to Transfer Documents Ownership

TRCPROCESS Trace Process The user mapped to this function can trace the process tree

TRCRULE Trace Rule The user mapped to this function can trace the Rule

TRCRUN Trace Run The user mapped to this function can trace the Run

UACCR User Access Report This function displays the Report for user access rights

UADAR User Admin Activity Report This function displays the Report for various activities of the user

UAMADMNREP UAM AdminActivity Reports Screen The user mapped to this function can access the UAM AdminActivity Reports Screen

UATTR User Attribute Report This function displays the Report for various user attributes

UDFADD Add UDF This function gives access to add a User Defined Function

UDFAUTH Authorize UDF This function gives access to authorize a User Defined Function

UDFCOPY Copy UDF This function gives access to copy a User Defined Function

UDFDEL Delete UDF This function gives access to delete a User Defined Function

UDFEDIT Edit UDF This function gives access to edit a User Defined Function

UDFLAT Make Latest UDF This function gives access to make latest a User Defined Function

UDFPURGE Purge UDF This function gives access to purge a User Defined Function

UDFSUMM UDF Summary This function gives user access to the User Defined Function Summary

UDFVIEW View UDF This function gives access to view a User Defined Function

UGDOMMAP User Group Domain Map Screen The user mapped to this function can access the User Group Domain Map Screen

UGFLROLMAP User Group Folder Role Map Screen The user mapped to this function can access the User Group Folder Role Map Screen

UGMAINT User Group Maintenance Screen The user mapped to this function can access the User Group Maintenance Screen

UGMAP User Group User Map Screen The user mapped to this function can access the User Group User Map Screen

UGROLMAP User Group Role Map Screen The user mapped to this function can access the User Group Role Map Screen


UPLOADSCN Upload Scenario The user mapped to this function can upload the scenario data

USRACTREP User Activity Reports Screen The user mapped to this function can access the User Activity Reports Screen

USRATH User Authorization Screen The user mapped to this function can access the User Authorization Screen

USRATTUP User Attribute Upload Screen The user mapped to this function can access the User Attribute Upload Screen

USRBATMAP User-Batch Execution Mapping Screen The user mapped to this function can access the User-Batch Execution Mapping Screen

USRMAINT User Maintenance Screen The user mapped to this function can access the User Maintenance Screen

USRPOPREP User Id Population Reports Screen The user mapped to this function can access the User Id Population Reports Screen

USRPROFREP User Profile Report Screen The user mapped to this function can access the User Profile Report Screen

USRROLREP User Role Reports Screen The user mapped to this function can access the User Role Report Screen

USTATR User Status Report This function displays the Report for deleted, disabled, logged in, authorized and idle users

VARDEF Variable Definition The user mapped to this function can view/add the Variable definitions

VARSHKDEF Variable Shock Definition The user mapped to this function can define the variable shocks

VEU_MAP View Map The user mapped to this function can view Map definitions

VIEWLOG View log The user mapped to this function will have rights to view the log

VIEWMRE View Manage Run The user mapped to this function can view the request for Run execution

VIEWPROC View Process The user mapped to this function can view the process tree definitions

VIEWRULE View Rule The user mapped to this function can view the rules definitions

VIEWRUN View Run The user mapped to this function can view the run definitions

VIEW_F_KBD View Flexible KBD The user mapped to this function can view the summary of Flexible KBD

VIEW_HOME View APP Landing Home Screen from Forms Framework View the APP Landing Home Screen from Forms Framework


VIEW_RESTR View Restructure The user mapped to this function can view the summary of Restructure

VIEW_WF View Workflow and Process Definitions The user mapped to this function can View Workflow and Process definitions

VIEW_WF_M View Workflow and Process Monitor The user mapped to this function can View Workflow and Process Monitor

WEBSRVR Web Server Screen The user mapped to this function can access the Web Server Screen

WFADMLINK Link Access to Process Admin The user mapped to this function will have rights to open Process Admin

WFDELLINK Link Access to Process Delegation The user mapped to this function will have rights to open Process Delegation

WF_DLG_ADM Delegation Admin The user mapped to this function will have rights to be a delegation admin

WRTPR_BAT Write-Protected Batch Screen The user mapped to this function can access the Write-Protected Batch Screen

XLADMIN Excel Admin The user mapped to this function can define Excel Mapping

XLUSER Excel User The user mapped to this function can Upload Excel Data
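For scripted access audits, a function catalog like the one above can be treated as plain lookup data. The sketch below is a minimal illustration, assuming a few entries have been exported into a Python mapping; the `describe` helper and the sample list of mapped codes are hypothetical, not an OFSAAI API.

```python
# Sample entries copied from the function catalog above. The dict and the
# describe() helper are illustrative only; OFSAAI stores these mappings in
# its configuration schema, not in application code.
FUNCTIONS = {
    "UDFADD": "This function gives access to add a User Defined Function",
    "UDFDEL": "This function gives access to delete a User Defined Function",
    "XLADMIN": "The user mapped to this function can define Excel Mapping",
    "XLUSER": "The user mapped to this function can Upload Excel Data",
}

def describe(mapped_codes):
    """Return {code: description} for the codes a user is mapped to,
    silently skipping codes that are not in the catalog sample."""
    return {code: FUNCTIONS[code] for code in mapped_codes if code in FUNCTIONS}

if __name__ == "__main__":
    # A hypothetical user mapped to two known codes and one unknown code.
    print(sorted(describe(["XLUSER", "UDFADD", "UNKNOWN"])))
```

Extending the dict with the remaining rows of the table turns this into a quick reference check for any user-to-function mapping exported from the system.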

18.4 OFS Analytical Applications Infrastructure Group - Role Mapping

The following table shows the Infrastructure Group Name and Role Code.

Table 200: Infrastructure Group Name and Role Code

GROUP NAME ROLE CODE

Business Administrator ALIAS_ACSS

Business Administrator ALIAS_ADVN

Business Administrator ALIAS_AUTH

Business Administrator ALIAS_ROLY

Business Administrator ALIAS_WRIT

Business Administrator BATCH_ACSS

Business Administrator BATCH_ADVN

Business Administrator BATCH_AUTH

Business Administrator BATCH_READ

Business Administrator BATCH_WRIT


Business Administrator BPROC_ACSS

Business Administrator BPROC_ADVN

Business Administrator BPROC_AUTH

Business Administrator BPROC_ROLY

Business Administrator BPROC_WRIT

Business Administrator BUDIM_ACSS

Business Administrator BUDIM_ADVN

Business Administrator BUDIM_AUTH

Business Administrator BUDIM_ROLY

Business Administrator BUDIM_WRIT

Business Administrator BUHCY_ACSS

Business Administrator BUHCY_ADVN

Business Administrator BUHCY_AUTH

Business Administrator BUHCY_ROLY

Business Administrator BUHCY_WRIT

Business Administrator BUMSR_ACSS

Business Administrator BUMSR_ADVN

Business Administrator BUMSR_AUTH

Business Administrator BUMSR_ROLY

Business Administrator BUMSR_WRIT

Business Administrator CATACC

Business Administrator CATADV

Business Administrator CATAUTH

Business Administrator CATREAD

Business Administrator CATWRITE

Business Administrator DEFQACCESS

Business Administrator DEFQREAD

Business Administrator DEFQWRITE

Business Administrator DI_ACCESS

Business Administrator DI_READ

Business Administrator DI_WRITE

Business Administrator DMMACC

Business Administrator DMMADVND


Business Administrator DMMAUTH

Business Administrator DMMREAD

Business Administrator DMMWRITE

Business Administrator DOCMGMTACC

Business Administrator DOCMGMTADV

Business Administrator DOCMGMTRD

Business Administrator DOCMGMTWR

Business Administrator DQACC

Business Administrator DQADVND

Business Administrator DQAUTH

Business Administrator DQREAD

Business Administrator DQWRITE

Business Administrator DRENT_ACSS

Business Administrator DRENT_ADVN

Business Administrator DRENT_AUTH

Business Administrator DRENT_ROLY

Business Administrator DRENT_WRIT

Business Administrator DTSET_ACSS

Business Administrator DTSET_ADVN

Business Administrator DTSET_AUTH

Business Administrator DTSET_ROLY

Business Administrator DTSET_WRIT

Business Administrator DT_ACCESS

Business Administrator DT_READ

Business Administrator DT_WRITE

Business Administrator ESCUB_ACSS

Business Administrator ESCUB_ADVN

Business Administrator ESCUB_AUTH

Business Administrator ESCUB_ROLY

Business Administrator ESCUB_WRIT

Business Administrator EXPACC

Business Administrator EXPREAD

Business Administrator EXPWRITE


Business Administrator FFWACCESS

Business Administrator FFWREAD

Business Administrator FFWWRITE

Business Administrator FILACC

Business Administrator FILREAD

Business Administrator FILWRITE

Business Administrator FMCACCESS

Business Administrator FMCREAD

Business Administrator FMCWRITE

Business Administrator F_KBDACC

Business Administrator F_KBDAUTH

Business Administrator F_KBDREAD

Business Administrator F_KBDWRITE

Business Administrator HBRACC

Business Administrator HBRREAD

Business Administrator HBRWRITE

Business Administrator HIERACC

Business Administrator HIERREAD

Business Administrator HIERWRITE

Business Administrator MAPPR_ACSS

Business Administrator MAPPR_ADVN

Business Administrator MAPPR_AUTH

Business Administrator MAPPR_ROLY

Business Administrator MAPPR_WRIT

Business Administrator MDBACCESS

Business Administrator MDBREAD

Business Administrator MDBWRITE

Business Administrator MIGACC

Business Administrator MIGADVND

Business Administrator MIGAUTH

Business Administrator MIGREAD

Business Administrator MIGWRITE

Business Administrator MREACC


Business Administrator MREADVND

Business Administrator MREAUTH

Business Administrator MREREAD

Business Administrator MREWRITE

Business Administrator ORCUB_ACSS

Business Administrator ORCUB_ADVN

Business Administrator ORCUB_AUTH

Business Administrator ORCUB_ROLY

Business Administrator ORCUB_WRIT

Business Administrator PTACC

Business Administrator PTADVND

Business Administrator PTAUTH

Business Administrator PTREAD

Business Administrator PTWRITE

Business Administrator RESTRACC

Business Administrator RESTREXEC

Business Administrator RESTRMOD

Business Administrator RESTRREAD

Business Administrator RESTRSUMM

Business Administrator RESTRWRITE

Business Administrator RLACC

Business Administrator RLADVND

Business Administrator RLAUTH

Business Administrator RLREAD

Business Administrator RLWRITE

Business Administrator RNACC

Business Administrator RNADVND

Business Administrator RNAUTH

Business Administrator RNREAD

Business Administrator RNWRITE

Business Administrator WFACC

Business Administrator WFMACC

Business Administrator WFMWRITE


Business Administrator WFREAD

Business Administrator WFWRITE

Business Administrator XLATMACCES

Business Administrator XLATMREAD

Business Administrator XLATMWRITE

Business Administrator XLCNFADVNC

Business Authorizer ALIAS_ACSS

Business Authorizer ALIAS_AUTH

Business Authorizer ALIAS_ROLY

Business Authorizer BATCH_ACSS

Business Authorizer BATCH_AUTH

Business Authorizer BATCH_READ

Business Authorizer BPROC_ACSS

Business Authorizer BPROC_AUTH

Business Authorizer BPROC_ROLY

Business Authorizer BUDIM_ACSS

Business Authorizer BUDIM_AUTH

Business Authorizer BUDIM_ROLY

Business Authorizer BUHCY_ACSS

Business Authorizer BUHCY_AUTH

Business Authorizer BUHCY_ROLY

Business Authorizer BUMSR_ACSS

Business Authorizer BUMSR_AUTH

Business Authorizer BUMSR_ROLY

Business Authorizer CATACC

Business Authorizer CATAUTH

Business Authorizer CATREAD

Business Authorizer DEFQAUTH

Business Authorizer DI_ACCESS

Business Authorizer DI_READ

Business Authorizer DMMACC

Business Authorizer DMMAUTH

Business Authorizer DMMREAD


Business Authorizer DOCMGMTAUT

Business Authorizer DQACC

Business Authorizer DQAUTH

Business Authorizer DQREAD

Business Authorizer DRENT_ACSS

Business Authorizer DRENT_AUTH

Business Authorizer DRENT_ROLY

Business Authorizer DTSET_ACSS

Business Authorizer DTSET_AUTH

Business Authorizer DTSET_ROLY

Business Authorizer DT_ACCESS

Business Authorizer DT_READ

Business Authorizer ESCUB_ACSS

Business Authorizer ESCUB_AUTH

Business Authorizer ESCUB_ROLY

Business Authorizer EXPACC

Business Authorizer EXPREAD

Business Authorizer FFWAUTH

Business Authorizer FILACC

Business Authorizer FILREAD

Business Authorizer FMCAUTH

Business Authorizer F_KBDACC

Business Authorizer F_KBDAUTH

Business Authorizer F_KBDREAD

Business Authorizer HBRACC

Business Authorizer HBRREAD

Business Authorizer HIERACC

Business Authorizer HIERREAD

Business Authorizer MAPPR_ACSS

Business Authorizer MAPPR_AUTH

Business Authorizer MAPPR_ROLY

Business Authorizer MIGACC

Business Authorizer MIGAUTH


Business Authorizer MIGREAD

Business Authorizer MREACC

Business Authorizer MREAUTH

Business Authorizer MREREAD

Business Authorizer ORCUB_ACSS

Business Authorizer ORCUB_AUTH

Business Authorizer ORCUB_ROLY

Business Authorizer PTACC

Business Authorizer PTAUTH

Business Authorizer PTREAD

Business Authorizer RESTRACC

Business Authorizer RESTREXEC

Business Authorizer RESTRREAD

Business Authorizer RESTRSUMM

Business Authorizer RLACC

Business Authorizer RLAUTH

Business Authorizer RLREAD

Business Authorizer RNACC

Business Authorizer RNAUTH

Business Authorizer RNREAD

Business Authorizer WFACC

Business Authorizer WFAUTH

Business Authorizer WFREAD

Business Authorizer XLATMAUTH

Business Owner ALIAS_ACSS

Business Owner ALIAS_ROLY

Business Owner ALIAS_WRIT

Business Owner BATCH_ACSS

Business Owner BATCH_READ

Business Owner BATCH_WRIT

Business Owner BPROC_ACSS

Business Owner BPROC_ROLY

Business Owner BPROC_WRIT


Business Owner BUDIM_ACSS

Business Owner BUDIM_ROLY

Business Owner BUDIM_WRIT

Business Owner BUHCY_ACSS

Business Owner BUHCY_ROLY

Business Owner BUHCY_WRIT

Business Owner BUMSR_ACSS

Business Owner BUMSR_ROLY

Business Owner BUMSR_WRIT

Business Owner CATACC

Business Owner CATREAD

Business Owner CATWRITE

Business Owner DEFQACCESS

Business Owner DEFQREAD

Business Owner DEFQWRITE

Business Owner DI_ACCESS

Business Owner DI_READ

Business Owner DI_WRITE

Business Owner DMMACC

Business Owner DMMREAD

Business Owner DMMWRITE

Business Owner DOCMGMTACC

Business Owner DOCMGMTRD

Business Owner DOCMGMTWR

Business Owner DQACC

Business Owner DQREAD

Business Owner DQWRITE

Business Owner DRENT_ACSS

Business Owner DRENT_ROLY

Business Owner DRENT_WRIT

Business Owner DTSET_ACSS

Business Owner DTSET_ROLY

Business Owner DTSET_WRIT


Business Owner DT_ACCESS

Business Owner DT_READ

Business Owner DT_WRITE

Business Owner ESCUB_ACSS

Business Owner ESCUB_ROLY

Business Owner ESCUB_WRIT

Business Owner EXPACC

Business Owner EXPREAD

Business Owner EXPWRITE

Business Owner FFWACCESS

Business Owner FFWREAD

Business Owner FFWWRITE

Business Owner FILACC

Business Owner FILREAD

Business Owner FILWRITE

Business Owner FMCACCESS

Business Owner FMCREAD

Business Owner FMCWRITE

Business Owner F_KBDACC

Business Owner F_KBDREAD

Business Owner F_KBDWRITE

Business Owner HBRACC

Business Owner HBRREAD

Business Owner HBRWRITE

Business Owner HIERACC

Business Owner HIERREAD

Business Owner HIERWRITE

Business Owner MAPPR_ACSS

Business Owner MAPPR_ROLY

Business Owner MAPPR_WRIT

Business Owner MDBACCESS

Business Owner MDBREAD

Business Owner MDBWRITE


Business Owner MIGACC

Business Owner MIGREAD

Business Owner MIGWRITE

Business Owner MREACC

Business Owner MREREAD

Business Owner MREWRITE

Business Owner ORCUB_ACSS

Business Owner ORCUB_ROLY

Business Owner ORCUB_WRIT

Business Owner PTACC

Business Owner PTREAD

Business Owner PTWRITE

Business Owner RESTRACC

Business Owner RESTRREAD

Business Owner RESTRSUMM

Business Owner RESTRWRITE

Business Owner RLACC

Business Owner RLREAD

Business Owner RLWRITE

Business Owner RNACC

Business Owner RNREAD

Business Owner RNWRITE

Business Owner WFACC

Business Owner WFMACC

Business Owner WFMWRITE

Business Owner WFREAD

Business Owner WFWRITE

Business Owner XLATMACCES

Business Owner XLATMREAD

Business Owner XLATMWRITE

Business Owner XLCNFADVNC

Business User ALIAS_ACSS

Business User ALIAS_ROLY


Business User BATCH_ACSS

Business User BATCH_READ

Business User BPROC_ACSS

Business User BPROC_ROLY

Business User BUDIM_ACSS

Business User BUDIM_ROLY

Business User BUHCY_ACSS

Business User BUHCY_ROLY

Business User BUMSR_ACSS

Business User BUMSR_ROLY

Business User CATACC

Business User CATREAD

Business User DEFQACCESS

Business User DEFQREAD

Business User DI_ACCESS

Business User DI_READ

Business User DMMACC

Business User DMMREAD

Business User DOCMGMTACC

Business User DOCMGMTRD

Business User DQACC

Business User DQREAD

Business User DRENT_ACSS

Business User DRENT_ROLY

Business User DTSET_ACSS

Business User DTSET_ROLY

Business User DT_ACCESS

Business User DT_READ

Business User ESCUB_ACSS

Business User ESCUB_ROLY

Business User EXPACC

Business User EXPREAD

Business User FFWACCESS


Business User FFWREAD

Business User FILACC

Business User FILREAD

Business User FMCACCESS

Business User FMCREAD

Business User F_KBDACC

Business User F_KBDREAD

Business User HBRACC

Business User HBRREAD

Business User HIERACC

Business User HIERREAD

Business User MAPPR_ACSS

Business User MAPPR_ROLY

Business User MDBACCESS

Business User MDBREAD

Business User MIGACC

Business User MIGREAD

Business User MREACC

Business User MREREAD

Business User ORCUB_ACSS

Business User ORCUB_ROLY

Business User PTACC

Business User PTREAD

Business User RESTRACC

Business User RESTRMOD

Business User RESTRREAD

Business User RESTRSUMM

Business User RLACC

Business User RLREAD

Business User RNACC

Business User RNREAD

Business User WFACC

Business User WFREAD


Business User WFWRITE

Business User XLATMACCES

Business User XLATMREAD

Data Controller ALIAS_ACSS

Data Controller ALIAS_ADVN

Data Controller ALIAS_AUTH

Data Controller ALIAS_PHNT

Data Controller ALIAS_ROLY

Data Controller ALIAS_WRIT

Data Controller BATCH_ACSS

Data Controller BATCH_ADVN

Data Controller BATCH_AUTH

Data Controller BATCH_PHNT

Data Controller BATCH_READ

Data Controller BATCH_WRIT

Data Controller BPROC_ACSS

Data Controller BPROC_ADVN

Data Controller BPROC_AUTH

Data Controller BPROC_PHNT

Data Controller BPROC_ROLY

Data Controller BPROC_WRIT

Data Controller BUDIM_ACSS

Data Controller BUDIM_ADVN

Data Controller BUDIM_AUTH

Data Controller BUDIM_PHNT

Data Controller BUDIM_ROLY

Data Controller BUDIM_WRIT

Data Controller BUHCY_ACSS

Data Controller BUHCY_ADVN

Data Controller BUHCY_AUTH

Data Controller BUHCY_PHNT

Data Controller BUHCY_ROLY

Data Controller BUHCY_WRIT


Data Controller BUMSR_ACSS

Data Controller BUMSR_ADVN

Data Controller BUMSR_AUTH

Data Controller BUMSR_PHNT

Data Controller BUMSR_ROLY

Data Controller BUMSR_WRIT

Data Controller CATACC

Data Controller CATADV

Data Controller CATAUTH

Data Controller CATPHAN

Data Controller CATREAD

Data Controller CATWRITE

Data Controller DATASECURITYADMIN

Data Controller DEFQACCESS

Data Controller DEFQADVNC

Data Controller DEFQAUTH

Data Controller DEFQMAN

Data Controller DEFQPHTM

Data Controller DEFQREAD

Data Controller DEFQWRITE

Data Controller DI_ACCESS

Data Controller DI_PHANTOM

Data Controller DI_READ

Data Controller DI_WRITE

Data Controller DMMACC

Data Controller DMMADVND

Data Controller DMMAUTH

Data Controller DMMPHTM

Data Controller DMMREAD

Data Controller DMMWRITE

Data Controller DMTDFMACSS

Data Controller DMTDMACSS

Data Controller DMTSRCACSS


Data Controller DMTUDFACSS

Data Controller DOCMGMTACC

Data Controller DOCMGMTADV

Data Controller DOCMGMTAUT

Data Controller DOCMGMTPHT

Data Controller DOCMGMTRD

Data Controller DOCMGMTWR

Data Controller DQACC

Data Controller DQADVND

Data Controller DQAUTH

Data Controller DQPHTM

Data Controller DQREAD

Data Controller DQWRITE

Data Controller DRENT_ACSS

Data Controller DRENT_ADVN

Data Controller DRENT_AUTH

Data Controller DRENT_PHNT

Data Controller DRENT_ROLY

Data Controller DRENT_WRIT

Data Controller DTSET_ACSS

Data Controller DTSET_ADVN

Data Controller DTSET_AUTH

Data Controller DTSET_PHNT

Data Controller DTSET_ROLY

Data Controller DTSET_WRIT

Data Controller DT_ACCESS

Data Controller DT_PHANTOM

Data Controller DT_READ

Data Controller DT_WRITE

Data Controller ESCUB_ACSS

Data Controller ESCUB_ADVN

Data Controller ESCUB_AUTH

Data Controller ESCUB_PHNT


Data Controller ESCUB_ROLY

Data Controller ESCUB_WRIT

Data Controller ETLADM

Data Controller EXPACC

Data Controller EXPADVND

Data Controller EXPAUTH

Data Controller EXPPHTM

Data Controller EXPREAD

Data Controller EXPWRITE

Data Controller FFWACCESS

Data Controller FFWADVNC

Data Controller FFWAUTH

Data Controller FFWPHTM

Data Controller FFWREAD

Data Controller FFWWRITE

Data Controller FILACC

Data Controller FILADVND

Data Controller FILAUTH

Data Controller FILPHTM

Data Controller FILREAD

Data Controller FILWRITE

Data Controller FMCACCESS

Data Controller FMCADVNC

Data Controller FMCAUTH

Data Controller FMCPHTM

Data Controller FMCREAD

Data Controller FMCWRITE

Data Controller F_KBDACC

Data Controller F_KBDAUTH

Data Controller F_KBDREAD

Data Controller F_KBDWRITE

Data Controller HBRACC

Data Controller HBRADVND

Data Controller HBRAUTH

Data Controller HBRPHTM

Data Controller HBRREAD

Data Controller HBRWRITE

Data Controller HIERACC

Data Controller HIERADVND

Data Controller HIERAUTH

Data Controller HIERPHTM

Data Controller HIERREAD

Data Controller HIERWRITE

Data Controller IDMGMTACC

Data Controller IDMGMTADVN

Data Controller IDMGMTAUTH

Data Controller IDMGMTPHTM

Data Controller IDMGMTREAD

Data Controller IDMGMTWRIT

Data Controller INBOXACC

Data Controller MAPPR_ACSS

Data Controller MAPPR_ADVN

Data Controller MAPPR_AUTH

Data Controller MAPPR_PHNT

Data Controller MAPPR_ROLY

Data Controller MAPPR_WRIT

Data Controller MDBACCESS

Data Controller MDBREAD

Data Controller MDBWRITE

Data Controller METADMIN

Data Controller MFACC

Data Controller MFADVND

Data Controller MFAUTH

Data Controller MFPHTM

Data Controller MFREAD

Data Controller MFWRITE

Data Controller MIGACC

Data Controller MIGADVND

Data Controller MIGAUTH

Data Controller MIGPHTM

Data Controller MIGREAD

Data Controller MIGWRITE

Data Controller MREACC

Data Controller MREADVND

Data Controller MREAUTH

Data Controller MREPHTM

Data Controller MREREAD

Data Controller MREWRITE

Data Controller OBJADMADV

Data Controller OJFFACC

Data Controller ORCUB_ACSS

Data Controller ORCUB_ADVN

Data Controller ORCUB_AUTH

Data Controller ORCUB_PHNT

Data Controller ORCUB_ROLY

Data Controller ORCUB_WRIT

Data Controller PR2ADM

Data Controller PTACC

Data Controller PTADVND

Data Controller PTAUTH

Data Controller PTPHTM

Data Controller PTREAD

Data Controller PTWRITE

Data Controller QADMINRL

Data Controller QADMINVWRL

Data Controller QLOCADMNRL

Data Controller QLOCAUTHRL

Data Controller QLOCVIEWRL

Data Controller QSGNOFFRL

Data Controller QTMPADMNRL

Data Controller QTMPVIEWRL

Data Controller QTNRADMNRL

Data Controller QTNRCONFRL

Data Controller QTNRCONIRL

Data Controller QUESTMATRL

Data Controller RESTRACC

Data Controller RESTREXEC

Data Controller RESTRMOD

Data Controller RESTRREAD

Data Controller RESTRSUMM

Data Controller RESTRWRITE

Data Controller RLACC

Data Controller RLADVND

Data Controller RLAUTH

Data Controller RLPHTM

Data Controller RLREAD

Data Controller RLWRITE

Data Controller RNACC

Data Controller RNADVND

Data Controller RNAUTH

Data Controller RNPHTM

Data Controller RNREAD

Data Controller RNWRITE

Data Controller ROLREPACC

Data Controller RTIADMIN

Data Controller STFACC

Data Controller STFADVND

Data Controller STFAUTH

Data Controller STFPHTM

Data Controller STFREAD

Data Controller STFWRITE

Data Controller SYSADMNACC

Data Controller SYSADMNADV

Data Controller SYSADMNAU

Data Controller SYSADMNPHT

Data Controller SYSADMNRD

Data Controller SYSADMNWR

Data Controller SYSAMHM

Data Controller SYSAMHMUMM

Data Controller SYSEXPN

Data Controller SYSFILTERS

Data Controller UAMADMNACC

Data Controller USRPOPACC

Data Controller WFACC

Data Controller WFADMINACC

Data Controller WFADV

Data Controller WFAUTH

Data Controller WFDELACC

Data Controller WFDELGADM

Data Controller WFMACC

Data Controller WFMWRITE

Data Controller WFREAD

Data Controller WFWRITE

Data Controller XLATMACCES

Data Controller XLATMADVNC

Data Controller XLATMAUTH

Data Controller XLATMPHTM

Data Controller XLATMREAD

Data Controller XLATMWRITE

Data Controller XLCNFADVNC

Guest HBRACC

Guest HIERACC

Guest MAPPR_ACSS

Guest MDBACCESS

Guest MIGACC

Guest MREACC

Guest ORCUB_ACSS

Guest PTACC

Guest RESTRACC

Guest RESTRSUMM

Guest RLACC

Guest RNACC

Guest WFACC

Guest WFREAD

Guest XLATMACCES

Guest ALIAS_ACSS

Guest BATCH_ACSS

Guest BPROC_ACSS

Guest BUDIM_ACSS

Guest BUHCY_ACSS

Guest BUMSR_ACSS

Guest CATACC

Guest DEFQACCESS

Guest DI_ACCESS

Guest DMMACC

Guest DOCMGMTACC

Guest DQACC

Guest DRENT_ACSS

Guest DTSET_ACSS

Guest DT_ACCESS

Guest ESCUB_ACSS

Guest EXPACC

Guest FFWACCESS

Guest FILACC

Guest FMCACCESS

Guest F_KBDACC

Identity Administrator IDMGMTACC

Identity Administrator IDMGMTADVN

Identity Administrator IDMGMTPHTM

Identity Administrator IDMGMTREAD

Identity Administrator IDMGMTWRIT

Object Administrator ALIAS_ACSS

Object Administrator ALIAS_ADVN

Object Administrator ALIAS_AUTH

Object Administrator ALIAS_PHNT

Object Administrator ALIAS_ROLY

Object Administrator ALIAS_WRIT

Object Administrator BATCH_ACSS

Object Administrator BATCH_AUTH

Object Administrator BATCH_PHNT

Object Administrator BATCH_READ

Object Administrator BATCH_WRIT

Object Administrator BPROC_ACSS

Object Administrator BPROC_ADVN

Object Administrator BPROC_AUTH

Object Administrator BPROC_PHNT

Object Administrator BPROC_ROLY

Object Administrator BPROC_WRIT

Object Administrator BUDIM_ACSS

Object Administrator BUDIM_ADVN

Object Administrator BUDIM_AUTH

Object Administrator BUDIM_PHNT

Object Administrator BUDIM_ROLY

Object Administrator BUDIM_WRIT

Object Administrator BUHCY_ACSS

Object Administrator BUHCY_ADVN

Object Administrator BUHCY_AUTH

Object Administrator BUHCY_PHNT

Object Administrator BUHCY_ROLY

Object Administrator BUHCY_WRIT

Object Administrator BUMSR_ACSS

Object Administrator BUMSR_ADVN

Object Administrator BUMSR_AUTH

Object Administrator BUMSR_PHNT

Object Administrator BUMSR_ROLY

Object Administrator BUMSR_WRIT

Object Administrator CATACC

Object Administrator CATADV

Object Administrator CATAUTH

Object Administrator CATPHAN

Object Administrator CATREAD

Object Administrator CATWRITE

Object Administrator DEFQACCESS

Object Administrator DEFQADVNC

Object Administrator DEFQPHTM

Object Administrator DEFQREAD

Object Administrator DEFQWRITE

Object Administrator DI_ACCESS

Object Administrator DI_PHANTOM

Object Administrator DI_READ

Object Administrator DI_WRITE

Object Administrator DMMACC

Object Administrator DMMADVND

Object Administrator DMMAUTH

Object Administrator DMMPHTM

Object Administrator DMMREAD

Object Administrator DMMWRITE

Object Administrator DOCMGMTACC

Object Administrator DOCMGMTADV

Object Administrator DOCMGMTPHT

Object Administrator DOCMGMTRD

Object Administrator DOCMGMTWR

Object Administrator DQACC

Object Administrator DQADVND

Object Administrator DQAUTH

Object Administrator DQPHTM

Object Administrator DQREAD

Object Administrator DQWRITE

Object Administrator DRENT_ACSS

Object Administrator DRENT_ADVN

Object Administrator DRENT_AUTH

Object Administrator DRENT_PHNT

Object Administrator DRENT_ROLY

Object Administrator DRENT_WRIT

Object Administrator DTSET_ACSS

Object Administrator DTSET_ADVN

Object Administrator DTSET_AUTH

Object Administrator DTSET_PHNT

Object Administrator DTSET_ROLY

Object Administrator DTSET_WRIT

Object Administrator DT_ACCESS

Object Administrator DT_PHANTOM

Object Administrator DT_READ

Object Administrator DT_WRITE

Object Administrator ESCUB_ACSS

Object Administrator ESCUB_ADVN

Object Administrator ESCUB_AUTH

Object Administrator ESCUB_PHNT

Object Administrator ESCUB_ROLY

Object Administrator ESCUB_WRIT

Object Administrator EXPACC

Object Administrator EXPPHTM

Object Administrator EXPREAD

Object Administrator EXPWRITE

Object Administrator FFWACCESS

Object Administrator FFWADVNC

Object Administrator FFWPHTM

Object Administrator FFWREAD

Object Administrator FFWWRITE

Object Administrator FILACC

Object Administrator FILPHTM

Object Administrator FILREAD

Object Administrator FILWRITE

Object Administrator FMCACCESS

Object Administrator FMCADVNC

Object Administrator FMCPHTM

Object Administrator FMCREAD

Object Administrator FMCWRITE

Object Administrator HBRACC

Object Administrator HBRREAD

Object Administrator HBRWRITE

Object Administrator HIERACC

Object Administrator HIERPHTM

Object Administrator HIERREAD

Object Administrator HIERWRITE

Object Administrator MAPPR_ACSS

Object Administrator MAPPR_ADVN

Object Administrator MAPPR_AUTH

Object Administrator MAPPR_PHNT

Object Administrator MAPPR_ROLY

Object Administrator MAPPR_WRIT

Object Administrator MDBACCESS

Object Administrator MDBREAD

Object Administrator MDBWRITE

Object Administrator MIGACC

Object Administrator MIGADVND

Object Administrator MIGAUTH

Object Administrator MIGPHTM

Object Administrator MIGREAD

Object Administrator MIGWRITE

Object Administrator MREACC

Object Administrator MREADVND

Object Administrator MREAUTH

Object Administrator MREPHTM

Object Administrator MREREAD

Object Administrator MREWRITE

Object Administrator OBJADMADV

Object Administrator ORCUB_ACSS

Object Administrator ORCUB_ADVN

Object Administrator ORCUB_AUTH

Object Administrator ORCUB_PHNT

Object Administrator ORCUB_ROLY

Object Administrator ORCUB_WRIT

Object Administrator PTACC

Object Administrator PTADVND

Object Administrator PTAUTH

Object Administrator PTPHTM

Object Administrator PTREAD

Object Administrator PTWRITE

Object Administrator RLACC

Object Administrator RLADVND

Object Administrator RLAUTH

Object Administrator RLPHTM

Object Administrator RLREAD

Object Administrator RLWRITE

Object Administrator RNACC

Object Administrator RNADVND

Object Administrator RNAUTH

Object Administrator RNPHTM

Object Administrator RNREAD

Object Administrator RNWRITE

Object Administrator XLATMACCES

Object Administrator XLATMADVNC

Object Administrator XLATMPHTM

Object Administrator XLATMREAD

Object Administrator XLATMWRITE

Object Administrator XLCNFADVNC

System Administrator SYSADMNACC

System Administrator SYSADMNADV

System Administrator SYSADMNAU

System Administrator SYSADMNPHT

System Administrator SYSADMNRD

System Administrator SYSADMNWR

System Administrator WFACC

System Administrator WFMACC

System Administrator WFMWRITE

System Administrator WFREAD

System Administrator WFWRITE

WorkFlow Delegation Admin WFDELGADM

OFSAA Support
For queries related to OFSAA applications, raise a Service Request (SR) in My Oracle Support (MOS).

Send Us Your Comments


Oracle welcomes your comments and suggestions on the quality and usefulness of this publication.
Your input is an important part of the information used for revision.
• Did you find any errors?
• Is the information clearly presented?
• Do you need more information? If so, where?
• Are the examples correct? Do you need more examples?
• What features did you like most about this manual?
If you find any errors or have any other suggestions for improvement, indicate the title and part number of the documentation, along with the chapter, section, and page number (if available), and contact Oracle Support.
Before sending your comments, ensure that you have the latest version of the document, in which your concerns may already have been addressed. All revised and recently released documents are available on the My Oracle Support site.
