Oracle® Fusion Middleware Configuration Guide for Oracle Business Intelligence Applications
11g Release 1 (11.1.1.7)

Part Number E36171-03

B Functional Configuration Task Reference

This is a reference section that contains Help topics for Informational Tasks in Functional Setup Manager (FSM). Informational Tasks display conceptual information, or display configuration steps that are performed in tools external to FSM (for example, in Oracle Data Integrator, or Oracle BI EE Administration Tool).

The Help topics in this section are displayed in FSM when you click Go to Task for an Informational Task, or you click a Help icon for additional information about an FSM Task.

This chapter contains the following sections:

B.1 Example Functional Configuration Tasks For Multiple Offerings

This section lists example Tasks that apply to multiple Offerings.

Common Areas and Dimensions

Configure Data Load Parameters for File Based Calendars

Configure Enterprise List

Configure Global Currencies

Configure Initial Extract Date

Configure Reporting Parameters for Year Prompting

Configure Slowly Changing Dimensions

Define Enterprise Calendar

Specify Gregorian Calendar Date Range

B.2 Informational Task Reference - Miscellaneous

This section contains miscellaneous Help topics.

B.2.1 Getting Started With Functional Configuration

To get started with Functional Configuration, see Section 3.2, "Roadmap for Functional Configuration".

A BI Application Offering and one or more Functional Areas are selected during the creation of an Implementation Project. A list of Functional Setup tasks is generated based on the selected Oracle BI Applications Offering and Functional Area(s).

There are four main types of Functional Task:

  • Tasks to configure Data Load Parameters - Clicking on the Go To Task button for these tasks launches Oracle BI Applications Configuration Manager and the Manage Data Load Parameter setup user interface is displayed with the appropriate set of Data Load Parameters required to perform a task.

  • Tasks to manage Domains and Mappings - Clicking on the Go To Task button for these tasks launches Oracle BI Applications Configuration Manager and the Manage Domains and Mappings setup user interface is displayed with the appropriate set of Domain Mappings.

  • Tasks to configure Reporting Parameters - Clicking on the Go To Task button for these tasks launches Oracle BI Applications Configuration Manager and the Manage Reporting Parameter setup user interface is displayed with the appropriate set of Reporting Parameters required to perform a task.

  • Tasks that are informational - These tasks provide either:

    • conceptual, background or supporting information.

    • instructions for configuration that is performed in tools external to FSM (for example, in Oracle Data Integrator, or Oracle BI EE Administration Tool).

B.2.2 How to Add Closed Orders to Backlog Calculations

By default, the Oracle Supply Chain and Order Management Analytics application extracts only open sales orders from the Sales Order Lines table (W_SALES_ORDER_LINE_F) and the Sales Schedule Lines table (W_SALES_SCHEDULE_LINE_F) for backlog calculations to populate the Backlog tables. Open sales orders are defined as orders that are neither canceled nor complete. Only open orders are extracted because, in most organizations, closed orders are no longer part of backlog. However, if you want to extract sales orders that are marked as closed, you can remove the default filter condition from the extract mapping.

For example, assume your customer orders ten items. Six items are invoiced and shipped, but four items are placed on operational and financial backlog. This backlog status continues until one of two things happens:

  • The items are eventually shipped and invoiced.

  • The remainder of the order is canceled.

If you choose to extract sales orders that are flagged as closed, you must remove the open-order filter condition used to set the Backlog flag. To do so, use the following procedure.

The BACKLOG_FLAG in the W_SALES_ORDER_LINE_F table is also used to identify which sales orders are eligible for backlog calculations. By default, all sales order types have their Backlog flag set to Y. As a result, all sales orders are included in backlog calculations.

To remove open order extract filters:

  1. In Oracle Data Integrator, open the Mappings folder, and then the SDE_ORA11510_Adaptor, SDE_ORAR12Version_Adaptor, or SDE_FUSION_V1_Adaptor folder.

  2. Open SDE_ORA_SalesOrderLinesFact - Interfaces - SDE_ORA_SalesOrderLinesFact.W_SALES_ORDER_LINE_FS for the E-Business Suite adaptors, or SDE_FUSION_SalesOrderLinesFact - Interfaces - SDE_FUSION_SalesOrderLinesFact.W_SALES_ORDER_LINE_FS for the Fusion adaptor.

  3. Click the Quick-Edit tab and expand Mappings.

  4. Find OPR_BACKLOG_FLG and open its Mapping Expression. Then remove SQ_BCI_SALES_ORDLNS.OPEN_FLAG = 'Y' AND for the E-Business Suite adaptors, or SQ_FULFILLLINEPVO.FulfillLineOpenFlag = 'Y' AND for the Fusion adaptor (see the example after this procedure).

  5. Find FIN_BACKLOG_FLG and open its Mapping Expression. Then remove SQ_BCI_SALES_ORDLNS.OPEN_FLAG = 'Y' AND for the E-Business Suite adaptors, or SQ_FULFILLLINEPVO.FulfillLineOpenFlag = 'Y' AND for the Fusion adaptor.

  6. Save your changes to the repository.

  7. Open the Mappings folder, and then the PLP folder.

  8. Open PLP_SalesBacklogLinesFact_Load_OrderLines - Interfaces - PLP_SalesBacklogLinesFact_Load_OrderLines.W_SALES_BACKLOG_LINE_F.SQ_SALES_ORER_LINES_BACKLOG.

  9. Click the Quick-Edit tab and expand Filters.

  10. Find the filter W_STATUS_D.W_STATUS_CODE<>'Closed' and remove it.

  11. Open PLP_SalesBacklogLinesFact_Load_ScheduleLines - Interfaces - PLP_SalesBacklogLinesFact_Load_ScheduleLines.W_SALES_BACKLOG_LINE_F.SQ_W_SALES_SCHEDULE_LINE_F.

  12. Click the Quick-Edit tab and expand Filters.

  13. Find the filter W_STATUS_D.W_STATUS_CODE<>'Closed' and remove it.

  14. Save your changes to the repository.
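
The edit in steps 4 and 5 removes only the leading open-order condition; the remainder of the backlog-flag expression is left unchanged. Schematically (the remainder of the expression is elided here and will differ in your repository), the E-Business Suite mapping expression changes from:

SQ_BCI_SALES_ORDLNS.OPEN_FLAG = 'Y' AND <rest of the backlog-flag expression>

to:

<rest of the backlog-flag expression>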

B.2.3 How to Include Non-booked Orders in Order Line and Schedule Line Tables

This task applies only to Oracle E-Business Suite source systems (that is, to the SDE_ORA11510_Adaptor and SDE_ORAR12Version_Adaptor adaptors). By default, only booked orders are extracted from the Oracle source system, as shown in Figure B-1.

Figure B-1 Handling Booked and Non-booked Orders

This diagram is described in surrounding text.

Therefore, all orders loaded into the Sales Order Lines, Sales Schedule Lines, and Sales Booking Lines tables are booked.

However, you can also load non-booked orders into Sales Order Lines (W_SALES_ORDER_LINE_F) and Sales Schedule Lines (W_SALES_SCHEDULE_LINE_F), while loading only booked orders into Sales Booking Lines (W_SALES_BOOKING_LINE_F).

If you want to load non-booked orders into the Sales Order Lines and Sales Schedule Lines tables, you have to configure the extract so that it does not filter out non-booked orders. The OE_ORDER_LINES_ALL.BOOKED_FLAG = 'Y' condition indicates that an order is booked; therefore, this statement is used to filter out non-booked orders. So, to load all orders, including non-booked orders, remove the filter condition from the temp interfaces of the following mappings:

  • SDE_ORA_SalesOrderLinesFact

  • SDE_ORA_SalesOrderLinesFact_Primary

Also, if you include non-booked orders in the Sales Order Lines and Sales Schedule Lines tables, you have to exclude non-booked orders when you populate the Sales Booking Lines table from the Sales Order Lines or from the Sales Schedule Lines. You can do this by adding the W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y' or W_SALES_SCHEDULE_LINE_F.BOOKING_FLG = 'Y' condition to the interfaces of the following mappings:

  • SIL_SalesBookingLinesFact_Load_OrderLine_Credit

  • SIL_SalesBookingLinesFact_Load_OrderLine_Debit

  • SIL_SalesBookingLinesFact_Load_ScheduleLine_Credit

  • SIL_SalesBookingLinesFact_Load_ScheduleLine_Debit

To include non-booked orders in the Sales Order Lines and Sales Schedule Lines tables (for both full and Incremental load):

  1. In ODI Designer Navigator, open the SDE_ORA11510_Adaptor or SDE_ORAR12Version_Adaptor folder.

  2. Find SDE_ORA_SalesOrderLinesFact and SDE_ORA_SalesOrderLinesFact_Primary. Then open the temp interfaces below.

    • SDE_ORA_SalesOrderLinesFact.W_SALES_ORDER_LINE_FS_SQ_BCI_SALES_ORDLNS

    • SDE_ORA_SalesOrderLinesFact_Primary.W_SALES_ORDER_LINE_F_PE_SQ_BCI_SALES_ORDLS

  3. Find and delete the filter condition OE_ORDER_LINES_ALL.BOOKED_FLAG='Y' from the temp interfaces mentioned above.

  4. Save your changes to the repository.

    Follow the steps below to make changes to the Sales Booking Lines table.

To include only booked orders in the Sales Booking Lines table:

  1. In ODI Designer Navigator, open the SILOS folder.

  2. Open the following interfaces, and then add the specified filter to the Filters section (as illustrated in the sketch after this procedure).

    • SIL_SalesBookingLinesFact_Load_OrderLine_Credit folder: Open the Quick-Edit tab of the SIL_SalesBookingLinesFact_Load_OrderLine_Credit.W_SALES_BOOKING_LINE_F_SQ_W_SALES_ORDER_LINE_F interface, and add W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y' to the Filters section.

    • SIL_SalesBookingLinesFact_Load_OrderLine_Debit folder: Open the Quick-Edit tab of the SIL_SalesBookingLinesFact_Load_OrderLine_Debit.W_SALES_BOOKING_LINE_F interface, and add SQ_W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y' to the Filters section.

    • SIL_SalesBookingLinesFact_Load_ScheduleLine_Credit folder: Open the Quick-Edit tab of the SIL_SalesBookingLinesFact_Load_ScheduleLine_Credit.W_SALES_BOOKING_LINE_F_SQ_W_SALES_SCHEDULE_LINE_F interface, and add W_SALES_SCHEDULE_LINE_F.BOOKING_FLG = 'Y' to the Filters section.

    • SIL_SalesBookingLinesFact_Load_ScheduleLine_Debit folder: Open the Quick-Edit tab of the SIL_SalesBookingLinesFact_Load_ScheduleLine_Debit.W_SALES_BOOKING_LINE_F interface, and add SQ_W_SALES_SCHEDULE_LINE_F.BOOKING_FLG = 'Y' to the Filters section.

  3. Save your changes to the repository.
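
The net effect of the two procedures can be summarized by the following sketch. It is not the exact repository content; any other filters and expressions in your repository remain unchanged.

-- SDE temp interfaces (SDE_ORA_SalesOrderLinesFact and
-- SDE_ORA_SalesOrderLinesFact_Primary): delete this filter so that
-- non-booked order lines are also extracted.
OE_ORDER_LINES_ALL.BOOKED_FLAG = 'Y'

-- SIL booking-line interfaces: add the appropriate filter so that only
-- booked lines are loaded into W_SALES_BOOKING_LINE_F.
W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y'
W_SALES_SCHEDULE_LINE_F.BOOKING_FLG = 'Y'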

B.2.4 How to Track Multiple Attribute Changes in Bookings

The W_SALES_BOOKING_LINE_F table tracks changes in SALES_QTY, NET_AMT, and certain attributes defined in the BOOKING_ID column. BOOKING_ID is calculated in the SDE mappings of the Sales Order Line table as follows:

  • For SDE_ORA11510_Adaptor and SDE_ORAR12Version_Adaptor:

    TO_CHAR(SQ_BCI_SALES_ORDLNS.LINE_ID)||'~'||TO_CHAR(SQ_BCI_SALES_ORDLNS.INVENTORY_ITEM_ID)||'~'||TO_CHAR(SQ_BCI_SALES_ORDLNS.SHIP_FROM_ORG_ID)
    
  • For SDE_FUSION_V1_Adaptor:

    TO_CHAR(SQ_FULFILLLINEPVO.FulfillLineId)||'~'||TO_CHAR(SQ_FULFILLLINEPVO.FulfillLineInventoryItemId)||'~'||TO_CHAR(SQ_FULFILLLINEPVO.FulfillLineFulfillOrgId)
    

However, if you want to track changes on another attribute, then you must concatenate the source column of the attribute with the default mapping expression. For example, if you want to track changes in Customer Account, then concatenate the source column of Customer Account in the BOOKING_ID column as follows:

  • For SDE_ORA11510_Adaptor and SDE_ORAR12Version_Adaptor:

    TO_CHAR(SQ_BCI_SALES_ORDLNS.LINE_ID)||'~'||TO_CHAR(SQ_BCI_SALES_ORDLNS.INVENTORY_ITEM_ID)||'~'||TO_CHAR(SQ_BCI_SALES_ORDLNS.SHIP_FROM_ORG_ID)||'~'||TO_CHAR(INP_CUSTOMER_ACCOUNT_ID)
    
  • For SDE_FUSION_V1_Adaptor:

    TO_CHAR(SQ_FULFILLLINEPVO.FulfillLineId)||'~'||TO_CHAR(SQ_FULFILLLINEPVO.FulfillLineInventoryItemId)||'~'||TO_CHAR(SQ_FULFILLLINEPVO.FulfillLineFulfillOrgId)||'~'||TO_CHAR(SQ_FULFILLLINEPVO.HeaderSoldToCustomerId)
    

To track multiple dimensional attribute changes in bookings:

  1. In ODI Designer Navigator, open the SDE_ORA11510_Adaptor, SDE_ORAR12Version_Adaptor, or SDE_FUSION_V1_Adaptor folder.

  2. Open the main interface of SDE mappings of Sales Order Line table:

    • SDE_ORA_SalesOrderLinesFact.W_SALES_ORDER_LINE_FS

    • SDE_FUSION_SalesOrderLinesFact.W_SALES_ORDER_LINE_FS

  3. Find the BOOKING_ID column and modify the mapping expression as described above.

    If you want to track changes in multiple attributes, then you must concatenate all of the attributes' source columns (see the sketch after this procedure).

  4. Save your changes to the repository.
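
For instance, to track changes in Customer Account and one additional attribute at the same time, keep appending columns to the same expression. The sketch below uses the E-Business Suite expression; INP_SECOND_ATTRIBUTE is a hypothetical placeholder for whichever additional source column you choose, and the Customer Account column is the one used in the example above:

TO_CHAR(SQ_BCI_SALES_ORDLNS.LINE_ID)||'~'||TO_CHAR(SQ_BCI_SALES_ORDLNS.INVENTORY_ITEM_ID)||'~'||TO_CHAR(SQ_BCI_SALES_ORDLNS.SHIP_FROM_ORG_ID)||'~'||TO_CHAR(INP_CUSTOMER_ACCOUNT_ID)||'~'||TO_CHAR(INP_SECOND_ATTRIBUTE)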

B.2.5 Review Table Partitioning for Human Resource Analytics

The Human Resource application benefits from table partitioning, especially on larger systems where data volumes are greater.

The main benefits of table partitioning are:

  • Faster ETL, as indexes are rebuilt only over the table partitions that have changed.

  • Faster reports, as partition pruning is a very efficient way of getting to the required data.

Optional or Mandatory

This task is optional; by default, no tables are partitioned.

Applies to

Systems where Oracle Business Analytics Warehouse is implemented on an Oracle database.

Dependencies

No dependencies.

Task

The latest recommendations for table partitioning of Human Resource tables are published as Tech Notes on My Oracle Support; review them before taking any action.

A table partitioning utility provided in ODI can be used to create partitioned tables. This utility can be run at any time to implement a particular partition strategy on a table, and it is re-runnable, so it can also be used to change the strategy if needed. It backs up the existing table, creates the partitioned table in its place, and copies in the data and indexes.

For example, to implement table partitioning on the table W_WRKFC_EVT_MONTH_F:

  1. Execute the scenario IMPLEMENT_DW_TABLE_PARTITIONS passing in the parameters as follows:

    Table B-1 Parameters for Table Partitioning

    Parameter Name      Value                    Description
    CREATE_SCRIPT_FILE  Y(es)                    Whether or not to create a file with the partition table script.
    PARTITION_KEY       EVENT_MONTH_WID          Column acting as partition key.
    RUN_DDL             N(o)                     Whether or not to execute the script.
    SCRIPT_LOCATION     C:/Scripts/Partitioning  Location on file system to create the script.
    TABLE_NAME          W_WRKFC_EVT_MONTH_F      Name of table to partition.


  2. If required, then review the script and adjust the partitioning definition.

    For the workforce fact table, monthly snapshot records are created from a specified date (HR Workforce Snapshot Date, default value 1st January 2008). Therefore, it would be logical to make this date the cutoff for the first partition, and then partition monthly or quarterly thereafter.

    This is done by changing the script from:

    CREATE TABLE W_WRKFC_EVT_MONTH_F 
    …
    PARTITION BY RANGE (EVENT_MONTH_WID) INTERVAL(1)
     (PARTITION p0 VALUES LESS THAN (1)) 
    …
    

    To:

    CREATE TABLE W_WRKFC_EVT_MONTH_F 
    …
    PARTITION BY RANGE (EVENT_MONTH_WID) INTERVAL(3) 
     (PARTITION p0 VALUES LESS THAN (200801)) 
    …
    
  3. Execute the script against Oracle Business Analytics Warehouse.
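
After the script has run, one way to confirm that the partitions were created as intended is to query the Oracle data dictionary. This is a sketch only, and it assumes that you are connected as the owner of the warehouse schema:

-- Lists the partitions and their upper bounds for the newly partitioned table.
SELECT partition_name, high_value
FROM   user_tab_partitions
WHERE  table_name = 'W_WRKFC_EVT_MONTH_F'
ORDER  BY partition_position;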

B.2.6 How to Implement Accounts Receivable Security for PeopleSoft

Financial Analytics supports security over the Billing and Revenue Management Business Unit in Accounts Receivable. This Business Unit is the same as the Receivables Business Unit in PeopleSoft, and the list of Receivables Business Units that a user has access to is determined by the grants in PeopleSoft.

Configuring Accounts Receivable Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Accounts Receivable security for PeopleSoft, enable Oracle PeopleSoft initialization block and make sure the initialization blocks of all other source systems are disabled. The initialization block names relevant to various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Initialization Blocks

  • Oracle Fusion Applications: Receivables Business Unit

  • Oracle E-Business Suite: Operating Unit Organizations EBS

  • Oracle PeopleSoft: Receivables Organizations

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables to display the Variables dialog.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the RPD file.

Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Accounts Receivable subject area.

  • AR Analyst PSFT

  • AR Manager PSFT

These duty roles control which subject areas and dashboard content users can access. They also ensure that data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.7 How to Implement Accounts Payable Security for Oracle Fusion Applications

Financial Analytics supports security over Payables Invoicing Business Unit in Accounts Payable subject areas. This Business Unit is the same as Business Unit in Oracle Fusion Applications, and the list of Business units that a user has access to is determined by the grants in Oracle Fusion applications.

Configuring Accounts Payable Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. The initialization block names relevant to various source systems are given below. Oracle Fusion Applications security is enabled by default so there is no change required in the setup. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems. For example:

  • Oracle Fusion Applications: Payables Business Unit

  • Oracle E-Business Suite: Operating Unit Organizations EBS

  • Oracle PeopleSoft: Payables Organizations

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (RPD file).

  2. Choose Manage, then Variables to display the Variables dialog.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the RPD file.

Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Accounts Payable subject area.

  • OBIA_ACCOUNTS_PAYABLE_MANAGERIAL_ANALYSIS_DUTY

This duty role controls which subject areas and dashboard content users can access. It also ensures that data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.8 How to Set Up Accounts Payable Security for Oracle E-Business Suite

Financial Analytics supports security over Payables Invoicing Business Unit in Accounts Payable subject areas. This Business Unit is the same as Operating Unit Organizations in E-Business Suite, and the list of Operating Unit Organizations that a user has access to is determined by the grants in E-Business Suite.

Configuring Accounts Payable Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Accounts Payable security for E-Business Suite, enable Oracle E-Business Suite initialization block and make sure the initialization blocks of all other source systems are disabled. The initialization block names relevant to various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems. For example:

  • Oracle Fusion Applications: Payables Business Unit

  • Oracle E-Business Suite: Operating Unit Organizations EBS

  • Oracle PeopleSoft: Payables Organizations

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (RPD file).

  2. Choose Manage, then Variables to display the Variables dialog.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the RPD file.

Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Accounts Payable subject area.

  • AP Analyst

  • AP Manager

These duty roles control which subject areas and dashboard content users can access. They also ensure that data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.9 How to Implement Accounts Payable Security for PeopleSoft

Financial Analytics supports security over the Payables Invoicing Business Unit in Accounts Payable subject areas. This Business Unit is the same as the Payables Business Unit in PeopleSoft, and the list of Payables Business Units that a user has access to is determined by the grants in PeopleSoft.

Configuring Accounts Payable Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Accounts Payable security for PeopleSoft, enable Oracle PeopleSoft initialization block and make sure the initialization blocks of all other source systems are disabled. The initialization block names relevant to various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems. For example:

  • Oracle Fusion Applications: Payables Business Unit

  • Oracle E-Business Suite: Operating Unit Organizations EBS

  • Oracle PeopleSoft: Payables Organizations

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (RPD file).

  2. Choose Manage, then Variables to display the Variables dialog.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the RPD file.

Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Accounts Payable subject area.

  • AP Analyst PSFT

  • AP Manager PSFT

These duty roles control which subject areas and dashboard content users can access. They also ensure that data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.10 How to Set Up Accounts Receivable Security for Oracle Fusion Applications

Financial Analytics supports security over Billing and Revenue Management Business Unit in Accounts Receivable subject areas. This Business Unit is the same as Business Unit in Oracle Fusion Applications, and the list of Business units that a user has access to is determined by the grants in Oracle Fusion applications.

Configuring Accounts Receivable Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. The initialization block names relevant to various source systems are given below. Oracle Fusion Applications security is enabled by default so there is no change required in the setup. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems. For example:

  • Oracle Fusion Applications: Receivables Business Unit

  • Oracle E-Business Suite: Operating Unit Organizations EBS

  • Oracle PeopleSoft: Receivables Organizations

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (RPD file).

  2. Choose Manage, then Variables to display the Variables dialog.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the RPD file.

Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Accounts Receivable subject area.

  • OBIA_ACCOUNTS_RECEIVABLE_MANAGERIAL_ANALYSIS_DUTY

This duty role controls which subject areas and dashboard content users can access. It also ensures that data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.11 How to Set Up Accounts Receivable Security for Oracle E-Business Suite

Financial Analytics supports security over Billing and Revenue Management Business Unit in Accounts Receivable subject areas. This Business Unit is the same as Operating Unit Organization in E-Business Suite, and the list of Operating Unit Organizations that a user has access to is determined by the grants in E-Business Suite.

Configuring Accounts Receivable Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Accounts Receivable security for E-Business Suite, enable Oracle E-Business Suite initialization block and make sure the initialization blocks of all other source systems are disabled. The initialization block names relevant to various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems. For example:

  • Oracle Fusion Applications: Receivables Business Unit

  • Oracle E-Business Suite: Operating Unit Organizations EBS

  • Oracle PeopleSoft: Receivables Organizations

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (RPD file).

  2. Choose Manage, then Variables to display the Variables dialog.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the RPD file.

Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Accounts Receivable subject area.

  • AR Analyst

  • AR Manager

These duty roles control which subject areas and dashboard content users can access. They also ensure that data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.12 How to Deploy Objects in Oracle E-Business Suite for Exploding the Bill Of Materials

The Bill of Materials (BOM) functional area enables you to determine the profit margin of the components that comprise the finished goods. BOM enables you to keep up with the most viable vendors in terms of cost and profit, and to keep your sales organization aware of product delivery status, including shortages.

To deploy objects in Oracle E-Business Suite for exploding the BOM, ensure that the Oracle E-Business Suite source environment meets the minimum patch level for your version, as follows:

  • Customers with Oracle E-Business Suite version R12 must be at or above patch level 16023729.

  • Customers with Oracle E-Business Suite version R12.0.x or OPI patch set A must be at or above patch level 16037126:R12.OPI.A.

  • Customers with Oracle E-Business Suite version R12.1.x or OPI patch set B must be at or above patch level 16037126:R12.OPI.B.

  • Customers with Oracle E-Business Suite version 11i must be at or above patch level 16036191.

Refer to the System Requirements and Supported Platforms for Oracle Business Intelligence Applications for full information about supported patch levels for your source system.

Note: Systems at or above these minimum patch levels include the package OPI_OBIA_BOMPEXPL_WRAPPER_P in the APPS schema, and include the following tables in the OPI schema with alias tables in the APPS schema:

  • OPI_OBIA_W_BOM_HEADER_DS

  • OPI_OBIA_BOM_EXPLOSION

  • OBIA_BOM_EXPLOSION_TEMP

How to Configure the Bill of Materials Explosion Options

The Bill of Materials (BOM) functional area enables you to analyze the components that comprise the finished goods. BOM enables you to determine how many products use a certain component. It also enables you to get visibility into the complete BOM hierarchy for a finished product. In order to explode BOM structures, certain objects need to be deployed in your E-Business Suite system.

Note: To run the ETL as the apps_read_only user, you must first run the following DCL commands from the APPS schema:

Grant insert on opi.opi_obia_w_bom_header_ds to &read_only_user;
Grant analyze any to &read_only_user;

You can explode the BOM structure with three different options:

  • All. All the BOM components are exploded regardless of their effective date or disable date. To explode a BOM component is to expand the BOM tree structure.

  • Current. The incremental extract logic considers any changed components that are currently effective, any components that are effective after the last extraction date, or any components that are disabled after the last extraction date.

  • Current and Future. All the BOM components that are effective now or in the future are exploded. The disabled components are left out.

These options are controlled by the EXPLODE_OPTION variable. The EXPLODE_OPTION variable is preconfigured with a value of 2 (explode the Current BOM structure).

There are five different BOM types in a source system: 1- Model, 2 - Option Class, 3 - Planning, 4 - Standard, and 5 - Product Family. By default, only the Standard BOM type is extracted and exploded. You can control this selection using the EBS_BOM_TYPE parameter.

The SDE_ORA_BOMItemFact_Header mapping invokes the OPI_OBIA_BOMPEXPL_P package in the E-Business Suite database to explode the BOM structure. The table below lists the variables used to control the stored procedure.

Table B-2 Variables for the BOM Explosion Stored Procedure

Input Variable     Preconfigured Value  Description
BOM_OR_ENG         1                    1—BOM; 2—ENG
COMMIT_POINT       5000                 Number of records to trigger a Commit.
COMP_CODE          Not applicable       This parameter is deprecated and no longer affects the functionality of the procedure.
CST_TYPE_ID        0                    This parameter is deprecated and no longer affects the functionality of the procedure.
EXPLODE_OPTION     2                    1—All; 2—Current; 3—Current and Future
EXPL_QTY           1                    Explosion quantity.
IMPL_FLAG          1                    1—Implemented Only; 2—Implemented and Non-implemented
LEVELS_TO_EXPLODE  10                   Number of levels to explode.
MODULE             2                    1—Costing; 2—BOM; 3—Order Entry; 4—ATO; 5—WSM
ORDER_BY           1                    Controls the order of the records. 1—Operation Sequence Number, Item Number; 2—Item Number, Operation Sequence Number.
PLAN_FACTOR_FLAG   2                    1—Yes; 2—No
RELEASE_OPTION     0                    Option to use released items.
STD_COMP_FLAG      0                    1—Explode only standard components; 2—All components
UNIT_NUMBER        Not applicable       When entered, limits the components exploded to the specified Unit.
VERIFY_FLAG        0                    This parameter is deprecated and no longer affects the functionality of the procedure.


B.2.13 How To Configure JD Edwards EnterpriseOne Category Codes

The Dimension tables listed below contain twenty generic attribute columns to allow for storage and display of customizable values from the source application system:

  • W_PRODUCT_ADDL_ATTR_D

  • W_PARTY_ORG_ADDL_ATTR_D

  • W_CUSTOMER_ACCOUNT_D (contains the new Attributes columns in the base table itself)

  • W_INT_ORG_D (contains the new Attributes columns in the base table itself)

B.2.14 How to Configure Self-Service Timecard Attributes to Business Object Natural Keys

In Oracle E-Business Suite, self-service timecards are defined based on templates. Those templates have seeded mappings that correspond to the physical table and column name of a given timecard attribute.

The BI Apps product is shipped with mappings based on the seeded time card templates.

Optionally, customers can change this mapping by editing the associated variable for each column mapping. In the first release, this must be done within ODI metadata variables with the prefix HR_TIMECARD_FLEX_MAP.

When changing the values, be careful not to introduce an ETL runtime error, for example by introducing a SQL syntax error.

B.2.15 How to Make Corrections to the Group Account Number Configuration for PeopleSoft

Note: Refer to "How to set up Group Account Numbers for Peoplesoft" for general concepts about group account number and Financial Statement Item code.

When a user maps a GL natural account to an incorrect group account number, incorrect accounting entries might be inserted into the fact table. For example, the natural account 620000 is mistakenly classified under 'AR' group account number when it should be classified under 'AP' group account number. When this happens, the ETL program will try to reconcile all GL journals charged to account 620000 against sub ledger accounting records in AR Fact (W_AR_XACT_F). Since these GL journal lines did not come from AR, the ETL program will not be able to find the corresponding sub ledger accounting records for these GL journal lines. In this case, the ETL program will insert 'Manual' records into the AR fact table because it thinks that these GL journal lines are 'Manual' journal entries created directly in the GL system charging to the AR accounts.

To make corrections to group account number configurations for Peoplesoft, correct the mapping of GL natural account to the correct group account in the input CSV file called file_group_acct_codes_psft.csv.

If you add values, then you also need to update the BI metadata repository (that is, the RPD file).

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

For example, before correction, a CSV file has the following values (Incorrect Group Account Number assignment):

  • BUSINESS_UNIT = AUS01

  • FROM ACCT = 620000

  • TO ACCT = 620000

  • GROUP_ACCT_NUM = AR

After correction, account '620000' should now correctly point to 'AP' group account number, and the CSV file would have the following (corrected) values:

  • BUSINESS_UNIT = AUS01

  • FROM ACCT = 620000

  • TO ACCT = 620000

  • GROUP_ACCT_NUM = AP

Based on the Group Account corrections made in the CSV file, the next ETL process would reassign the group accounts correctly and fix the entries that were made to the fact tables from the previous ETL run(s).
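
In terms of the file itself, and following the column layout used elsewhere in this appendix for file_group_acct_codes_psft.csv (BUSINESS_UNIT, FROM ACCT, TO ACCT, GROUP_ACCT_NUM), the correction above amounts to changing the row:

AUS01, 620000, 620000, AR

to:

AUS01, 620000, 620000, AP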

B.2.16 Configure Data Load Parameter for JD Edwards EnterpriseOne Rate Type

Setting up the JDE_RATE_TYPE parameter

The concept of Rate Type in JD Edwards EnterpriseOne is different from that defined in Oracle Business Analytics Warehouse.

In Oracle's JD Edwards EnterpriseOne, the Rate Type is an optional key; it is not used during Currency Exchange Rate calculations.

ODI uses the JDE_RATE_TYPE parameter to populate the Rate_Type field in the W_EXCH_RATE_GS table. By default, the JDE_RATE_TYPE parameter has a value of "Actual." The query and lookup on W_EXCH_RATE_G will succeed when the RATE_TYPE field in the W_EXCH_RATE_G table contains the same value as the GLOBAL1_RATE_TYPE, GLOBAL2_RATE_TYPE and GLOBAL3_RATE_TYPE fields in the W_GLOBAL_CURR_G table.
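
The following query is a conceptual sketch of that lookup rather than the SQL that ODI generates, and the exchange-rate columns shown (FROM_CURCY_CD, TO_CURCY_CD, EXCH_RATE) are assumed here for illustration:

-- A rate is found only where RATE_TYPE matches one of the global rate types,
-- which is why JDE_RATE_TYPE must agree with the values in W_GLOBAL_CURR_G.
SELECT r.FROM_CURCY_CD,
       r.TO_CURCY_CD,
       r.EXCH_RATE
FROM   W_EXCH_RATE_G r,
       W_GLOBAL_CURR_G g
WHERE  r.RATE_TYPE = g.GLOBAL1_RATE_TYPE;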

B.2.17 How to Add Dates to the Order Cycle Time Table for Post-Load Processing

To add more dates, you need to understand how the Order Cycle Times table is populated. If you want to change the dates loaded into the Order Cycle Time table (W_SALES_CYCLE_LINE_F), then you have to modify the interfaces for both the full load and the incremental load that take the dates from the W_* tables and load them into the Cycle Time table.

To add dates to the Cycle Time table load:

  1. In ODI Designer Navigator, expand Models - Oracle BI Applications - Oracle BI Applications - Fact.

  2. Find W_SALES_CYCLE_LINE_F and add a column to store the date that you want to add (see the example after this procedure).

    For example, if you are loading the Validated on Date in the W_SALES_CYCLE_LINE_F table, then you need to create a new column, VALIDATED_ON_DT, and modify the target definition of the W_SALES_CYCLE_LINE_F table.

  3. Save the changes.

  4. Open Projects - BI Apps Project - Mappings - PLP folders.

  5. Find PLP_SalesCycleLinesFact_Load folder and modify interfaces under the folder to select the new column from any of the following source tables, and load it to the W_SALES_CYCLE_LINE_F target table:

    • W_SALES_ORDER_LINE_F

    • W_SALES_INVOICE_LINE_F

    • W_SALES_PICK_LINE_F

    • W_SALES_SCHEDULE_LINE_F

  6. Modify the temp interfaces and the main interfaces for both a full load and an incremental load.
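
As a concrete illustration of step 2 using the Validated on Date example, the corresponding change to the physical warehouse table would look like the following sketch. The column is added to the ODI datastore definition as described above, and the physical table must be altered to match it:

-- Adds the illustrative VALIDATED_ON_DT column to the Cycle Time fact table.
ALTER TABLE W_SALES_CYCLE_LINE_F ADD (VALIDATED_ON_DT DATE);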

B.2.18 How to Set Up Group Account Numbers for Oracle E-Business Suite

This section explains how to map Oracle General Ledger Accounts to Group Account Numbers, and includes the following topics:

Note:

It is critical that the GL account numbers are mapped to the group account numbers (or domain values) because the metrics in the GL reporting layer use these values.

B.2.18.1 Overview of Mapping Oracle GL Accounts to Group Account Numbers

Group Account Number Configuration is an important step in the configuration of Financial Analytics, because it determines the accuracy of the majority of metrics in the General Ledger and Profitability module. Group Accounts in combination with Financial Statement Item Codes are also leveraged in the GL reconciliation process, to ensure that subledger data reconciles with GL journal entries. This topic is discussed in more detail later in this section.

You set up General Ledger accounts using the following configuration file:

  • file_group_acct_codes_ora.csv - this file maps General Ledger accounts to group account codes.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

You can categorize your Oracle General Ledger accounts into specific group account numbers. The group account number is used during data extraction as well as front-end reporting. The GROUP_ACCT_NUM field in the GL Account dimension table W_GL_ACCOUNT_D denotes the nature of the General Ledger accounts (for example, cash account, payroll account). For a list of the Group Account Number domain values, see Oracle Business Analytics Warehouse Data Model Reference. The mappings to General Ledger Accounts Numbers are important for both Profitability analysis and General Ledger analysis (for example, Balance Sheets, Profit and Loss, Cash Flow statements).

The logic for assigning the group accounts is located in the file_group_acct_codes_ora.csv file. Table B-3 shows an example configuration of the file_group_acct_codes_ora.csv file.

Table B-3 Example Configuration of file_group_acct_codes_ora.csv File

CHART OF ACCOUNTS ID  FROM ACCT  TO ACCT  GROUP_ACCT_NUM
1                     101010     101099   CA
1                     131010     131939   FG INV
1                     152121     152401   RM INV
1                     171101     171901   WIP INV
1                     173001     173001   PPE
1                     240100     240120   ACC DEPCN
1                     261000     261100   INT EXP
1                     181011     181918   CASH
1                     251100     251120   ST BORR


In Table B-3, in the first row, all accounts within the account number range from 101010 to 101099 that have a Chart of Account (COA) ID equal to 1 are assigned to Current Asset (that is, CA). Each row maps all accounts within the specified account number range and within the given chart of account ID.

If you need to create a new group of account numbers, you can create new rows in Oracle BI Applications Configuration Manager. You can then assign GL accounts to the new group of account numbers in the file_group_acct_codes_ora.csv file.

You must also add a new row in Oracle BI Applications Configuration Manager to map Financial Statement Item codes to the respective Base Table Facts. Table B-4 shows the Financial Statement Item codes to which Group Account Numbers must map, and their associated base fact tables.

Table B-4 Financial Statement Item Codes and Associated Base Fact Tables

Financial Statement Item Codes  Base Fact Tables
AP                              AP base fact (W_AP_XACT_F)
AR                              AR base fact (W_AR_XACT_F)
COGS                            Cost of Goods Sold base fact (W_GL_COGS_F)
REVENUE                         Revenue base fact (W_GL_REVN_F)
TAX                             Tax base fact (W_TAX_XACT_F) (see Footnote 1)
OTHERS                          GL Journal base fact (W_GL_OTHER_F)

Footnote 1: E-Business Suite adapters for Financial Analytics do not support the Tax base fact (W_TAX_XACT_F).

By mapping your GL accounts against the group account numbers and then associating the group account number to a Financial Statement Item code, you have indirectly associated the GL account numbers to Financial Statement Item codes as well. This association is important to perform GL reconciliation and to ensure the subledger data reconciles with GL journal entries. It is possible that after an invoice has been transferred to GL, a GL user might adjust that invoice in GL. In this scenario, it is important to ensure that the adjustment amount is reflected in the subledger base fact as well as balance tables. To determine such subledger transactions in GL, the reconciliation process uses Financial Statement Item codes.

Financial Statement Item codes are internal codes used by the ETL process to process GL journal records during the GL reconciliation process against the subledgers. When the ETL process reconciles a GL journal record, it looks at the Financial Statement Item code associated with the GL account that the journal is charging against, and then uses the value of the Financial Statement Item code to decide which base fact the GL journal should reconcile against. For example, when processing a GL journal that charges to a GL account that is associated with the 'AP' Financial Statement Item code, the ETL process goes against the AP base fact table (W_AP_XACT_F) and tries to locate the corresponding matching AP accounting entry. If the GL account is associated with the 'REVENUE' Financial Statement Item code, then the ETL program goes against the Revenue base fact table (W_GL_REVN_F) and tries to locate the corresponding matching Revenue accounting entry.
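
As a conceptual illustration of this routing logic (this is not the shipped ETL code, and the fin_stmt_item_code column name is used here only for illustration), the choice of reconciliation target can be expressed as:

-- Maps a Financial Statement Item code to the base fact table that a
-- GL journal line is reconciled against (see Table B-4).
SELECT CASE fin_stmt_item_code
         WHEN 'AP'      THEN 'W_AP_XACT_F'
         WHEN 'AR'      THEN 'W_AR_XACT_F'
         WHEN 'COGS'    THEN 'W_GL_COGS_F'
         WHEN 'REVENUE' THEN 'W_GL_REVN_F'
         WHEN 'TAX'     THEN 'W_TAX_XACT_F'
         ELSE                'W_GL_OTHER_F'
       END AS target_base_fact
FROM   (SELECT 'AP' AS fin_stmt_item_code FROM dual);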

B.2.18.2 How to Map Oracle GL Account Numbers to Group Account Numbers

This section explains how to map Oracle General Ledger Account Numbers to Group Account Numbers.

Note:

If you add new Group Account Numbers to the file_group_acct_codes_<source system type>.csv file, you must also add metrics to the BI metadata repository (that is, the RPD file). See Section B.2.18.3, "Example of Adding Group Account Number Metrics to the Oracle BI Repository" for more information.

To map Oracle GL account numbers to group account numbers:

  1. Edit the file_group_acct_codes_ora.csv file.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. For each Oracle GL account number that you want to map, create a new row in the file containing the following fields:

    Field Name            Description
    CHART OF ACCOUNTS ID  The ID of the GL chart of accounts.
    FROM ACCT             The lower limit of the natural account range. This is based on the natural account segment of your GL accounts.
    TO ACCT               The higher limit of the natural account range. This is based on the natural account segment of your GL accounts.
    GROUP_ACCT_NUM        The group account number of the Oracle General Ledger account, as specified in the warehouse domain Group Account in Oracle BI Applications Configuration Manager. For example, 'AP' for Accounts Payables, 'CASH' for cash account, 'GEN PAYROLL' for payroll account, and so on.


    For example:

    101, 1110, 1110, CASH
    101, 1210, 1210, AR
    101, 1220, 1220, AR
    

    Note:

    You can optionally remove the unused rows from the CSV file.

  3. Ensure that the values that you specify in the file_group_acct_codes_ora.csv file are consistent with the values that are specified in Oracle BI Applications Configuration Manager for Group Accounts.

  4. Save and close the CSV file.

B.2.18.3 Example of Adding Group Account Number Metrics to the Oracle BI Repository

If you add new Group Account Numbers to the file_group_acct_codes_<source system type>.csv file, then you must also use Oracle BI EE Administration Tool to add metrics to the Oracle BI repository to expose the new Group Account Numbers, as described in this example.

This example is applicable to the tasks that set up Group Account Numbers for Oracle E-Business Suite and for PeopleSoft (see Section B.2.18 and Section B.2.19).

This example assumes that you have a new Group Account Number called 'Payroll' (Domain member code 'PAYROLL'), and that you want to add a new metric to the Presentation layer called 'Payroll Expense'.

To add a new metric in the logical table Fact – Fins – GL Other Posted Transaction:

  1. Using Oracle BI EE Administration Tool, edit the BI metadata repository (RPD file).

    For example, the file OracleBIAnalyticsApps.rpd is located at:

    ORACLE_INSTANCE\bifoundation\OracleBIServerComponent\coreapplication_
    obis<n>\repository
    
  2. In the Business Model and Mapping layer:

    1. Create a logical column named 'Payroll Expense' in the logical table 'Fact – Fins – GL Journals Posted'.

      For example, right-click the Core\Fact - Fins - GL Journals Posted\ object and choose New Object, then Logical Column, to display the Logical Column dialog. Specify Payroll Expense in the Name field.

    2. Display the Aggregation tab, and then choose 'Sum' in the Default aggregation rule drop-down list.

    3. Click OK to save the details and close the dialog.

    4. Expand the Core\Fact - Fins - GL Journals Posted\Sources\ folder and double click the Fact_W_GL_OTHER_GRPACCT_FSCLPRD_A source to display the Logical Table Source dialog.

    5. Display the Column Mapping tab.

    6. Select Show unmapped columns.

    7. Locate the Payroll Expense expression, and click the Expression Builder button to open Expression Builder.

    8. Use Expression Builder to specify the following SQL statement:

      FILTER("Core"."Fact - Fins - GL Journals Posted"."Transaction Amount" USING "Core"."Dim - GL Account"."Group Account Number" = 'PAYROLL')
      

      The filter condition refers to the new Group Account Number 'Payroll'.

    9. Repeat steps (d) to (h) for each Logical Table Source.

      Steps (d) to (h) must be repeated because, in this example, this logical table has multiple Logical Table Sources (for the fact table and for the aggregation tables). Modify the expression in step (h) appropriately for each Logical Table Source by using the fact or aggregate table to which it corresponds.

  3. Save the details.

  4. To expose the new repository objects in end users' dashboards and reports, drag the new objects from the Business Model and Mapping layer to an appropriate folder in the Presentation layer.

To add a new metric in the logical table Fact – Fins – GL Balance:

  1. Using Oracle BI EE Administration Tool, edit the BI metadata repository (RPD file).

    For example, the file OracleBIAnalyticsApps.rpd is located at:

    ORACLE_INSTANCE\bifoundation\OracleBIServerComponent\coreapplication_
    obis<n>\repository
    
  2. In the Business Model and Mapping layer:

    1. Create a logical column named 'Payroll Expense' in logical table 'Fact – Fins – GL Balance'.

      For example, right-click the Core\Fact – Fins – GL Balance object and choose New Object, then Logical Column, to display the Logical Column dialog. Specify Payroll Expense in the Name field.

    2. In the Column Source tab, select Derived from existing columns using an expression.

    3. Click the Expression Builder button to display Expression Builder.

    4. Use Expression Builder to specify the following SQL statement:

      FILTER("Core"."Fact - Fins - GL Balance"."Activity Amount" USING "Core"."Dim - GL Account"."Group Account Number" = 'PAYROLL')
      

      The filter condition refers to the new Group Account Number 'PAYROLL'.

  3. Save the details.

  4. To expose the new repository objects in end users' dashboards and reports, drag the new objects from the Business Model and Mapping layer to an appropriate folder in the Presentation layer.

B.2.19 How to Set Up Group Account Numbers for Peoplesoft

This section explains how to map General Ledger Accounts to Group Account Numbers, and includes the following topics:

Note:

It is critical that the GL account numbers are mapped to the group account numbers (or domain values) because the metrics in the GL reporting layer use these values. For a list of domain values for GL account numbers, see Oracle Business Analytics Warehouse Data Model Reference.

B.2.19.1 Overview of Mapping GL Accounts to Group Account Numbers

Group Account Number configuration is an important step in the configuration of Financial Analytics, as it determines the accuracy of the majority of metrics in the General Ledger and Profitability module. Group Accounts in combination with Financial Statement Item codes are also leveraged in the GL reconciliation process, to ensure that subledger data reconciles with GL journal entries. This topic is discussed in more detail later in this section.

You can categorize your PeopleSoft General Ledger accounts into specific group account numbers. The GROUP_ACCT_NUM field denotes the nature of the General Ledger accounts.

You set up General Ledger accounts using the following configuration file:

  • file_group_acct_codes_psft.csv - this file maps General Ledger accounts to group account codes.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

Examples include Cash account, Payroll account, and so on. For a list of the Group Account Number domain values, see Oracle Business Analytics Warehouse Data Model Reference. The group account number configuration is used during data extraction as well as front-end reporting. For example, the group account number configuration is used heavily in both Profitability Analysis (Income Statement) and General Ledger analysis. The logic for assigning the accounts is located in the file_group_acct_codes_psft.csv file.

Table B-5 Layout of file_group_acct_codes_psft.csv File

BUSINESS_UNIT  FROM_ACCT  TO_ACCT  GROUP_ACCT_NUM
AUS01          101010     101099   AP
AUS01          131010     131939   AR
AUS01          152121     152401   COGS
AUS01          171101     173001   OTHER
AUS01          240100     240120   REVENUE
AUS01          251100     251120   TAX (see Footnote 1)

Footnote 1: Oracle's PeopleSoft adapters for Financial Analytics do not support the Tax base fact (W_TAX_XACT_F).

In Table B-5, in the first row, all accounts within the account number range from 101010 to 101099 that have a Business Unit equal to AUS01 are assigned to AP. Each row maps all accounts within the specified account number range and with the given Business Unit. If you need to create a new group of account numbers, create new rows in Oracle BI Applications Configuration Manager, and then assign GL accounts to the new group account number in the file_group_acct_codes_psft.csv file.

You must also add a new row in Oracle BI Applications Configuration Manager to map Financial Statement Item codes to the respective Base Table Facts. Table B-6 shows the Financial Statement Item codes to which Group Account Numbers must map, and their associated base fact tables.

Table B-6 Financial Statement Item Codes and Associated Base Fact Tables

Financial Statement Item Codes  Base Fact Tables
AP                              AP base fact (W_AP_XACT_F)
AR                              AR base fact (W_AR_XACT_F)
COGS                            Cost of Goods Sold base fact (W_GL_COGS_F)
REVENUE                         Revenue base fact (W_GL_REVN_F)
TAX                             Tax base fact (W_TAX_XACT_F) (see Footnote 1)
OTHERS                          GL Journal base fact (W_GL_OTHER_F)

Footnote 1: Oracle's PeopleSoft adapters for Financial Analytics do not support the Tax base fact (W_TAX_XACT_F).

By mapping your GL accounts against the group account numbers and then associating the group account number to a Financial Statement Item code, you have indirectly associated the GL account numbers to Financial Statement Item codes as well. This association is important to perform GL reconciliation and ensure the subledger data reconciles with GL journal entries. It is possible that after an invoice has been transferred to GL, a GL user might adjust that invoice in GL. In this scenario, it is important to ensure that the adjustment amount is reflected in the subledger base fact as well as balance tables. To determine such subledger transactions in GL, the reconciliation process uses Financial Statement Item codes.

Financial Statement Item codes are internal codes used by the ETL process to process GL journal records during the GL reconciliation process against the subledgers. When the ETL process reconciles a GL journal record, it looks at the Financial Statement Item code associated with the GL account that the journal is charging against, and then uses the value of the Financial Statement Item code to decide which base fact the GL journal should reconcile against. For example, when processing a GL journal that charges to a GL account that is associated with the 'AP' Financial Statement Item code, the ETL process goes against the AP base fact table (W_AP_XACT_F) and tries to locate the corresponding matching AP accounting entry. If the GL account is associated with the 'REVENUE' Financial Statement Item code, then the ETL program goes against the Revenue base fact table (W_GL_REVN_F) and tries to locate the corresponding matching Revenue accounting entry.

B.2.19.2 How to Map GL Account Numbers to Group Account Numbers

This section explains how to map General Ledger Account Numbers to Group Account Numbers.

Note:

If you add new Group Account Numbers to the file_group_acct_codes_<source system type>.csv file, you must also add metrics to the BI metadata repository (that is, the RPD file). See Section B.2.18.3, "Example of Adding Group Account Number Metrics to the Oracle BI Repository" for more information.

To map PeopleSoft GL account numbers to group account numbers:

  1. Edit the file_group_acct_codes_psft.csv file.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. For each GL account number that you want to map, create a new row in the file containing the following fields:

    Field Name | Description
    BUSINESS_UNIT | The ID of the Business Unit.
    FROM ACCT | The lower limit of the natural account range. This is based on the natural account segment of your GL accounts.
    TO ACCT | The higher limit of the natural account range. This is based on the natural account segment of your GL accounts.
    GROUP_ACCT_NUM | The group account number of the General Ledger account, as specified in the Group Account domain in Oracle BI Applications Configuration Manager. For example, 'AP' for Accounts Payables, 'CASH' for cash accounts, 'GEN PAYROLL' for payroll accounts, and so on.


    For example:

    AUS01, 1110, 1110, CASH
    AUS01, 1210, 1210, AR
    AUS01, 1220, 1220, AR
    

    Note:

    You can optionally remove the unused rows in the CSV file.

  3. Ensure that the values that you specify in the file_group_acct_codes_psft.csv file are consistent with the values that are specified for domains in Oracle BI Applications Configuration Manager.

  4. Save and close the CSV file.

B.2.20 How to Configure GL Account and GL Segments for Oracle E-Business Suite

This section explains how to configure General Ledger Account and General Ledger Segments for Oracle E-Business Suite, and contains the following topics:

B.2.20.1 Overview

If you are deploying Oracle Financial Analytics, Oracle Procurement and Spend Analytics, or Oracle Supply Chain and Order Management Analytics, then you must configure GL account hierarchies as described in this topic.

Up to 30 segments are supported in which you can store the accounting flexfield segments. Flexfields are flexible enough to support complex data configurations. For example:

  • You can store data in any segment.

  • You can use more or fewer segments per chart of accounts, as required.

  • You can specify multiple segments for the same chart of accounts.

B.2.20.2 Example of Data Configuration for a Chart of Accounts

A single company might have a US chart of accounts and an APAC chart of accounts, with the following data configuration:

Table B-7 Example Chart of Accounts

Segment Type | US Chart of Account (4256) value | APAC Chart of Account (4257) value
Company | Stores in segment 3 | Stores in segment 1
Natural Account | Stores in segment 4 | Stores in segment 3
Cost Center | Stores in segment 5 | Stores in segment 2
Geography | Stores in segment 2 | Stores in segment 5
Line of Business (LOB) | Stores in segment 1 | Stores in segment 4


This example shows that in the US Chart of Accounts, 'Company' is stored in the segment 3 column in the Oracle E-Business Suite table GL_CODE_COMBINATIONS. In the APAC Chart of Accounts, 'Company' is stored in the segment 1 column in the GL_CODE_COMBINATIONS table. The objective of this configuration file is to ensure that when segment information is extracted into the Oracle Business Analytics Warehouse table W_GL_ACCOUNT_D, segments with the same nature from different charts of accounts are stored in the same column in W_GL_ACCOUNT_D.

For example, we can store 'Company' segments from US COA and APAC COA in the segment 1 column in W_GL_ACCOUNT_D; and Cost Center segments from US COA and APAC COA in the segment 2 column in W_GL_ACCOUNT_D, and so on.

B.2.20.3 How to Set Up the GL Segment Configuration File

Before you run the ETL process for GL accounts, you must specify the segments that you want to analyze. To specify the segments, you use the ETL configuration file named file_glacct_segment_config_<source_system>.csv.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

Figure B-2 Example values in file_glacct_segment_config_ora.csv

This screenshot or diagram is described in surrounding text.

In file_glacct_segment_config_ora.csv, you must specify the segments of the same type in the same column. For example, you might store all Cost Center segments from all charts of accounts in one column, and all Company segments from all charts of accounts in a separate column.

The file file_glacct_segment_config_ora.csv contains a pair of columns for each accounting segment to be configured in the warehouse. In the first column of each pair, enter the actual segment column name in Oracle E-Business Suite where this particular entity is stored; this column takes values such as SEGMENT1, SEGMENT2, and so on up to SEGMENT30, and the values are case-sensitive. In the second column, enter the corresponding VALUESETID used for this chart of accounts and segment in Oracle E-Business Suite.
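
For instance, assuming that the 'Company' segment of one chart of accounts is stored in SEGMENT3 with a purely hypothetical Value Set ID of 1000504, the pair of columns for the first warehouse segment would be filled in along the following lines (the headings below are placeholders only; the delivered file defines the actual column names, as shown in Figure B-2):

<segment name column for warehouse segment 1>, <value set ID column for warehouse segment 1>, ...
SEGMENT3, 1000504, ...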

For example, you might want to do the following:

  • Analyze GL account hierarchies using only Company, Cost Center, Natural Account, and LOB.

    You are not interested in using Geography for hierarchy analysis.

  • Store all Company segments from all COAs in ACCOUNT_SEG1_CODE column in W_GL_ACCOUNT_D.

  • Store all Cost Center segments from all COAs in ACCOUNT_SEG2_CODE column in W_GL_ACCOUNT_D.

  • Store all Natural Account segments from all COAs in ACCOUNT_SEG3_CODE column in W_GL_ACCOUNT_D.

  • Store all LOB segments from all COAs in ACCOUNT_SEG4_CODE column in W_GL_ACCOUNT_D.

    Note: Although the examples above map the Natural Account, Balancing Segment, and Cost Center segments to segment columns in the file, you are not required to map these three segments in this file. Dedicated dimensions are provided to populate these three segments, and they are populated automatically by default whether or not you map them in this file. It is preferable not to map these three segments in this file, to avoid redundant segment dimensions that give the same information.

GL Segment Configuration for Budgetary Control

For Budgetary Control, the first two segments are reserved for Project and Program segments respectively. Therefore, to use one or both of these, configure file_glacct_segment_config_ora.csv in this particular order:

1. Put your Project segment column name in the 'SEG_PROJECT' column in the CSV file.

2. Put your Program segment column name in the 'SEG_PROGRAM' column in the CSV file.

If you do not have any one of these reserved segments in your source system, leave that particular segment empty in the CSV file.
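
For example, if your Project segment is stored in SEGMENT4 and you do not use a Program segment, the two reserved columns might be filled in as in the following sketch (SEGMENT4 is a hypothetical value, and the other columns of the file are omitted):

SEG_PROJECT, SEG_PROGRAM, ...
SEGMENT4, , ...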

B.2.20.4 How to Configure GL Segments and Hierarchies Using Value Set Definitions

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  1. Configure file_glacct_segment_config_ora.csv, as follows:

    1. Edit the file file_glacct_segment_config_ora.csv.

      For example, you might edit the file located in \src_files\EBS11510.

    2. Follow the steps in Section B.2.20.3, "How to Set Up the GL Segment Configuration File" to configure the file.

  2. Edit the BI metadata repository (that is, the RPD file) for GL Segments and Hierarchies Using Value Set Definitions.

    The metadata contains multiple logical tables that represent each GL Segment, such as Dim_W_GL_SEGMENT_D_ProgramSegment, Dim_W_GL_SEGMENT_D_ProjectSegment, Dim_W_GL_SEGMENT_D_Segment1, and so on. Because all these logical tables are mapped to the same physical table, W_GL_SEGMENT_D, a filter must be specified in the logical table source of each of these logical tables in order to restrict the output of the logical table to values pertaining to that particular segment. You must set the filter on the physical column SEGMENT_LOV_ID to the Value Set IDs that are applicable for that particular segment. The list of Value Set IDs is the same as the Value Set IDs you configured in the CSV file mentioned above.

    Specify a filter in the Business Model and Mapping layer of the Oracle BI Repository, as follows.

    1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

      The OracleBIAnalyticsApps.rpd file is located in ORACLE_INSTANCE\bifoundation\OracleBIServerComponent\coreapplication_obis<n>\repository.

    2. Expand each logical table, for example, Dim - GL Segment1, and open the logical table source under it. Display the Content tab. In the 'Use this WHERE clause…' box, apply a filter on the corresponding physical table alias of W_GL_SEGMENT_D.

      For example: "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_D_Segment1"."SEGMENT_LOV_ID" IN (comma separated values IDs).

    3. Enter all Value Set IDs that correspond to this segment, separated by commas.
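
      For instance, if the hypothetical Value Set IDs 1000504 and 1000505 were configured for this segment in the CSV file, the filter would read:

      "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_D_Segment1"."SEGMENT_LOV_ID" IN (1000504, 1000505)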

  3. Oracle Financial Analytics supports up to 30 segments in the GL Account dimension, and by default delivers ten GL Segment dimensions in the RPD. If you need more than ten GL Segments, perform the following steps to add new segments:

    In the Physical Layer:

    1. Create two new physical aliases of W_GL_SEGMENT_D, named "Dim_W_GL_SEGMENT_D_SegmentXX" and "Dim_W_GL_SEGMENT_D_SegmentXX_GLAccount".

      To do this, right-click the physical table W_GL_SEGMENT_D, select New Object and then Alias, and name the new aliases "Dim_W_GL_SEGMENT_D_SegmentXX" and "Dim_W_GL_SEGMENT_D_SegmentXX_GLAccount".

    2. Create four new aliases of W_GL_SEGMENT_DH:

      - "Dim_W_GL_SEGMENT_DH_SegmentXX"

      - "Dim_W_GL_SEGMENT_DH_Security_SegmentXX"

      - "Dim_W_GL_SEGMENT_DH_SegmentXX_GLAccount"

      - "Dim_W_GL_SEGMENT_DH_Security_SegmentXX_GLAccount"

    3. Create a Foreign Key from "Dim_W_GL_SEGMENT_D_SegmentXX" to "Dim_W_GL_SEGMENT_DH_SegmentXX" and "Dim_W_GL_SEGMENT_DH_Security_SegmentXX".

      The foreign key is similar to the one from "Dim_W_GL_SEGMENT_D_Segment1" to "Dim_W_GL_SEGMENT_DH_Segment1" and "Dim_W_GL_SEGMENT_DH_Security_Segment1".

      The direction of the foreign key should be from W_GL_SEGMENT_DH to W_GL_SEGMENT_D; for example, on a '0/1': N cardinality join, W_GL_SEGMENT_DH will be on the '0/1' side and W_GL_SEGMENT_D will be on the 'N' side. See Oracle Fusion Middleware Metadata Repository Builder's Guide for Oracle Business Intelligence Enterprise Edition for more information about how to create physical foreign key joins.

    4. Create a similar physical foreign key from "Dim_W_GL_SEGMENT_D_SegmentXX_GLAccount" to "Dim_W_GL_SEGMENT_DH_SegmentXX_GLAccount" and "Dim_W_GL_SEGMENT_DH_Security_SegmentXX_GLAccount".

    5. Similarly, create a physical foreign key join between Dim_W_GL_SEGMENT_D_SegmentXX and Dim_W_GL_ACCOUNT_D, with W_GL_SEGMENT_D on the '1' side and W_GL_ACCOUNT_D on the 'N' side.

    6. Save your changes.

  4. In the Business Model and Mapping Layer, do the following:

    1. Create a new logical table "Dim - GL SegmentXX" similar to "Dim – GL Segment1".

      This logical table should have a logical table source that is mapped to the physical tables created above (for example, it will have both Dim_W_GL_SEGMENT_DH_SegmentXX and Dim_W_GL_SEGMENT_DH_SegmentXX_GLAccount).

      This logical table should also have all attributes similar to "Dim – GL Segment1" properly mapped to the respective physical tables, Dim_W_GL_SEGMENT_DH_SegmentXX and Dim_W_GL_SEGMENT_DH_SegmentXX_GLAccount.

    2. In the Business Model Diagram, create a logical join from "Dim – GL SegmentXX" to all the relevant logical fact tables similar to "Dim – GL Segment1", with the GL Segment Dimension Logical table on the '0/1' side and the logical fact table on the 'N' side.

      To see all the relevant logical fact tables, first include Dim – GL Segment1 on the Business Model Diagram, and then right-click that table and select Add Direct Joins.

    3. Add the content filter in the logical table source of "Dim – GL SegmentXX" as described in the previous step.

    4. Create a dimension by right-clicking "Dim – GL SegmentXX", and select Create Dimension. Rename this to "GL SegmentXX". Make sure the drill-down structure is similar to "GL Segment1".

      If you are not sure how to do this, follow these steps: By default, the dimension will have two levels: the Grand Total Level and the Detail Level. Rename these levels to "All" and "Detail – GL Segment" respectively.

      Right-click the "All" level and select "New Object" and then "Child Level". Name this level as Tree Code And Version. Create a level under Tree Code And Version and name it as Level31. Similarly create a level under Level31 as Level30. Repeat this process until you have Level1 under Level2.

    5. Drag the "Detail – GL Segment" level under "Level1" so that it is the penultimate level of the hierarchy. Create another child level under "Detail – GL Segment" and name it as "Detail – GL Account".

    6. From the new logical table Dim - GL SegmentXX, drag the Segment Code, Segment Name, Segment Description, Segment Code Id and Segment Value Set Code attributes to the "Detail – GL Segment" level of the hierarchy. Similarly pull in the columns mentioned below for the remaining levels.

      Detail – GL Account – Segment Code – GL Account

      Levelxx – Levelxx Code, Levelxx Name, Levelxx Description and Levelxx Code Id

      Tree Code And Version – Tree Filter, Tree Version ID, Tree Version Name and Tree Code

    7. Navigate to the properties of each level and, from the Keys tab, create the appropriate keys for each level as listed below. Select the primary key and the "Use for Display" option for each level as indicated in the matrix below.

      Table B-8 Configuration values for GL Segments and Hierarchies Using Value Set Definitions

      Level | Key Name | Columns | Primary Key of that Level? | Use for Display?
      Tree Code And Version | Tree Filter | Tree Filter | Y | Y
      Levelxx | Levelxx Code | Levelxx Code | Y | Y
      Levelxx | Levelxx ID | Levelxx Code Id | <empty> | <empty>
      Detail - GL Segment | Segment ID | Segment Code Id | Y | <empty>
      Detail - GL Segment | Segment Code | Segment Value Set Code and Segment Code | <empty> | Y
      Detail - GL Account | Segment Code - GL Account | Segment Code - GL Account | Y | Y


    8. Once you have created these new levels, you must set the aggregation content for all the logical table sources of the newly created logical table Dim - GL SegmentXX. Set the Aggregation Content in the Content tab for each LTS as follows:

      Dim_W_GL_SEGMENT_DH_SegmentXX – Set the content level to "Detail – GL Segment".

      Dim_W_GL_SEGMENT_DH_SegmentXX_GLAccount – Set the content level to "Detail – GL Account".

    9. Set the aggregation content to all relevant fact logical table sources. Open all Logical Table Sources of all the logical fact tables that are relevant to the new logical table one at a time. Display the Content tab. If the LTS is applicable for that newly created segment, then set the aggregation content to "Detail – GL Account". If not, skip that logical table source and go to the next one.

    10. Drag your new "Dim - GL Segment XX" dimensions into the appropriate subject areas in the Presentation layer. Typically, you can expose these GL Segment dimensions in all subject areas where the GL Account dimension is exposed. You can also find all appropriate subject areas by right-clicking Dim – GL Segment1 and select Query Related Objects, then selecting Presentation, and then selecting Subject Area.

    11. Save your changes and check global consistency.

  5. Each GL Segment corresponds to one or more meaningful value sets in your OLTP system. To clearly identify each segment in reports, you can rename the presentation table "GL SegmentX", the logical dimension "GL SegmentX", and the logical table "Dim - GL SegmentX" according to its meaning.

    For example, if you populate Product segment into Segment1, you can rename logical table "Dim - GL Segment1" as "Dim – GL Segment Product" or any other appropriate name and then rename the tables in the Presentation layer accordingly.

B.2.21 How to Configure GL Account and GL Segments for Oracle PeopleSoft

The GL Account dimension in the Oracle Business Analytics Warehouse is at a granularity of a combination of chartfields. PeopleSoft Financials provides several chartfields for GL accounts, such as account, alternate account, operating unit, department, and so on. The ETL program extracts all possible combinations of these chartfields that you have used and stores each of these chartfields individually in the GL Account dimension. It extracts the combinations of chartfields used from the following PeopleSoft account entry tables:

  • PS_VCHR_ACCTG_LINES (Accounts Payable)

  • PS_ITEM_DST (Accounts Receivable)

  • PS_BI_ACCT_ENTRY (Billings)

  • PS_CM_ACCTG_LINE (Costing)

  • PS_JRNL_LN (General Ledger)

The GL Account dimension (W_GL_ACCOUNT_D) in the Oracle Business Analytics Warehouse provides a flexible and generic data model to accommodate up to 30 chartfields. These are stored in the generic columns named ACCOUNT_SEG1_CODE, ACCOUNT_SEG2_CODE and so on up to ACCOUNT_SEG30_CODE, henceforth referred to as segments. These columns store the actual chartfield value that is used in your PeopleSoft application.

Mapping PeopleSoft Chartfields

A CSV file named file_glacct_segment_config_psft.csv is provided to map the PeopleSoft chartfields to the generic segments.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

The first row in the file is a header row; do not modify this line. The second row in the file is where you specify how to do the mapping. The value for the column ROW_ID is hard coded to '1'; there is no need to change this.

Note that the file contains 30 columns: SEG1, SEG2, and so on up to SEG30. Specify which chartfield to populate in each of these columns by entering one of the supported chartfield values; a mapping example is sketched after the list of chartfields below. The following list shows the chartfields currently supported for the PeopleSoft application.

Note:

Values are case sensitive. You must specify the values exactly as shown in the following list.

  • Activity ID

  • Affiliate

  • Alternate Account

  • Analysis Type

  • Book Code

  • Budget Reference

  • Budget Scenario

  • Business Unit PC

  • ChartField 1

  • ChartField 2

  • ChartField 3

  • Class Field

  • Fund Affiliate

  • GL Adjust Type

  • Operating Unit

  • Operating Unit Affiliate

  • Product

  • Program Code

  • Project

  • Resource Category

  • Resource Sub Category

  • Resource Type

  • Statistics Code

Note:

You only need to include the chartfields in the CSV file that you want to map.
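
For example, to map Operating Unit to the first warehouse segment and Product to the second, leaving the remaining segments unmapped, the second row of file_glacct_segment_config_psft.csv might look like the following sketch (this particular mapping is purely illustrative, the trailing unmapped SEG columns are left empty, and the middle columns are abbreviated with "..."):

ROW_ID, SEG1, SEG2, SEG3, ..., SEG30
1, Operating Unit, Product, , ..., 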

B.2.22 How to Make Corrections to the Group Account Number Configuration for Oracle E-Business Suite

When a user maps a GL natural account to an incorrect group account number, incorrect accounting entries might be inserted into the fact table. For example, the natural account 1210 is mistakenly classified under 'AR' group account number when it should be classified under 'AP' group account number. When this happens, the ETL program will charge all the GL journal lines to account 1210 and try to reconcile these GL journal lines against subledger accounting records in the AR fact table (W_AR_XACT_F). Since these GL journal lines did not come from AR, the ETL program will not be able to find the corresponding subledger accounting records for these GL journal lines. In this case, the ETL program will insert 'Manual' records into the AR fact table because it thinks that these GL journal lines are 'Manual' journal entries created directly in the GL system charging to the AR accounts.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

How to Make Corrections to the Group Account Number Configuration for Oracle E-Business Suite:

  1. Correct the mapping of GL natural account to the group account in the input CSV file named file_group_acct_codes_ora.csv.

    For example, before correction, a CSV file has the following values (Incorrect Group Account Number assignment):

    • CHART OF ACCOUNTS ID = 101

    • FROM ACCT = 2210

    • TO ACCT = 2210

    • GROUP_ACCT_NUM = AR

    After correction, account '2210' should now correctly point to 'AP' group account number, and the CSV file would have the following (corrected) values:

    • CHART OF ACCOUNTS ID = 101

    • FROM ACCT = 2210

    • TO ACCT = 2210

    • GROUP_ACCT_NUM = AP
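
    In file_group_acct_codes_ora.csv, the corrected row would therefore look similar to the following (a sketch that assumes the delivered column order of CHART OF ACCOUNTS ID, FROM ACCT, TO ACCT, and GROUP_ACCT_NUM):

    101, 2210, 2210, AP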

  2. Save the file.

    Based on the Group Account corrections made in the CSV file, the next ETL process will reassign the group accounts correctly and correct the entries that were made to the fact tables from the previous ETL run(s).

B.2.23 How to Configure Number of Days based Metrics for PeopleSoft

For certain metrics to function properly, you must configure the following two internal metrics in the Oracle BI Applications metadata repository (RPD):

  • # of Elapsed Days

  • # of Cumulative Elapsed Days

These metrics affect the calculation of other metrics, such as Days Sales Outstanding, Days Payables Outstanding, AP Turnover, AR Turnover, and so on.

To configure Number of Days based metrics:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

    The RPD file is located at:

    ORACLE_INSTANCE\bifoundation\OracleBIServerComponent\coreapplication_obisn\repository

  2. On Business Model and Mapping layer, find logical table "Fact - Fins - Period Days Count".

  3. Under Sources, select Fact_W_DAY_D_PSFT logical table source.

  4. Clear the Disabled option in the General tab and click OK.

  5. Open the other two logical table sources, Fact_W_DAY_D_ORA and Fact_W_DAY_D_JDE, and select the Disabled option.

  6. Add "Fact - Fins - Period Days Count" and "Dim – Legal Entity" logical tables to the Business Model Diagram. To do so, right-click the objects and select Business Model Diagram, Selected Tables Only.

  7. In the Business Model Diagram, create a new logical join from "Dim – Legal Entity" to "Fact - Fins - Period Days Count." The direction of the foreign key should be from "Dim – Legal Entity" logical table to "Fact - Fins - Period Days Count" table. For example, on a (0,1):N cardinality join, "Dim – Legal Entity" will be on the (0/1) side and "Fact - Fins - Period Days Count" will be on the N side.

  8. Under "Fact - Fins - Period Days Count" logical table, open "# of Elapsed Days". Go to the Levels tab. For Legal Entity dimension, the Logical Level is set to All. Click the X button to remove this setting.

  9. Under "Fact - Fins - Period Days Count" logical table, open "# of Cumulative Elapsed Days". Go to the Levels tab. For Legal Entity dimension, the Logical Level is set to All. Click the X button to remove this setting.

  10. Check Global Consistency to ensure there are no errors, and then save the RPD file.

B.2.24 How to Update Dashboard Pages with PeopleSoft Prompts

Data Source specific dashboard prompts are provided with Financial Analytics to accommodate source specific filtering across all application Dashboard pages. You need to add each PeopleSoft dashboard prompt listed in Table B-9 to its associated dashboard page as part of the application configuration process.

Table B-9 Financial Analytics Dashboard Pages with Pre-configured PeopleSoft Path and Prompt Names

Dashboard | Dashboard Page | Catalog Path (under Shared Folders/Financials/Analytic Library) | PeopleSoft Prompt Name
General Ledger | Overview | /General Ledger/Key Ratios | Oracle PSFT - GL Key Ratios Prompt
General Ledger | Balance Sheet | /General Ledger/Balance Sheet | Oracle PSFT - GL Balance Sheet Prompt
General Ledger | Cash Flow | /General Ledger/Cash Flow | Oracle PSFT - GL Cash Flow Prompt
General Ledger | Budget vs. Actual | /General Ledger/Budget Actual | Oracle PSFT - GL Budget Prompt
General Ledger | Asset Usage | /General Ledger/Asset Usage | Oracle PSFT - GL Asset Usage Prompt
General Ledger | Liquidity | /General Ledger/Liquidity | Oracle PSFT - GL Liquidity Prompt
General Ledger | Financial Structure | /General Ledger/Financial Structure | Oracle PSFT - GL Financial Structure Prompt
General Ledger | GL Balance | /General Ledger/Transactions | Oracle PSFT - GL Balance Transactions Prompt
General Ledger | Trial Balance | /General Ledger/Trial Balance | Oracle PSFT - GL Trial Balance Prompt
Payables | Overview | /Payables/Overview | Oracle PSFT - AP Overview Prompt
Payables | AP Balance | /Payables/AP Balance | Oracle PSFT - AP Balance Prompt
Payables | Payments Due | /Payables/Payments Due | Oracle PSFT - AP Payments Due Prompt
Payables | Effectiveness | /Payables/Effectiveness | Oracle PSFT - AP Effectiveness Prompt
Payables | Payment Performance | /Payables/Payment Performance | Oracle PSFT - AP Payment Performance Prompt
Payables | Supplier Report | /Payables/Supplier Report | Oracle PSFT - AP Supplier Report Prompt
Payables | Holds and Discounts | /Payables/Supplier Report | Oracle PSFT - AP Holds and Discounts Prompt
Payables | Invoice Details | /Payables/Invoice Details | Oracle PSFT - AP Invoice Details Prompt
Payables | All AP Transactions | /Payables/All AP Transactions | Oracle PSFT - AP Txn Prompt
Profitability | Overview | /Profitability/Overview | Oracle PSFT - GL Profitability Overview Prompt
Profitability | P&L | /Profitability/P&L | Oracle PSFT - GL Profitability P&L Prompt
Profitability | Margins | /Profitability/Margins | Oracle PSFT - GL Profitability Margins Prompt
Profitability | Revenue | /Profitability/Revenue | Oracle PSFT - GL Profitability Revenue Prompt
Profitability | Products | /Profitability/Products | Oracle PSFT - GL Profitability Products Prompt
Profitability | Customers | /Profitability/Customers | Oracle PSFT - GL Profitability Customer Prompt
Receivables | Overview | /Receivables/Overview | Oracle PSFT - AR Overview Prompt
Receivables | AR Balance | /Receivables/AR Balance | Oracle PSFT - AR Balance Prompt
Receivables | Payments Due | /Receivables/Payments Due | Oracle PSFT - AR Payments Due Prompt
Receivables | Effectiveness | /Receivables/Effectiveness | Oracle PSFT - AR Effectiveness Prompt
Receivables | Payment Performance | /Receivables/Payment Performance | Oracle PSFT - AR Payment Performance Prompt
Receivables | Customer Report | /Receivables/Customer Report | Oracle PSFT - AR Customer Prompt
Receivables | Invoice Details | /Receivables/Invoice Details | Oracle PSFT - AR Invoice Details Prompt
Receivables | All AR Transactions | /Receivables/All AR Transactions | Oracle PSFT - AR Transactions Prompt


To update dashboard pages with PeopleSoft prompts:

These instructions explain how to modify the General Ledger dashboard's Overview page prompt as an example of how to modify a prompt.

  1. Access the dashboard page.

  2. Click the Page Options button and then select Edit Dashboard to launch Dashboard Editor.

  3. Remove the existing dashboard prompt from the top section in Dashboard Editor.

    For the Overview page in the General Ledger dashboard, remove the "Oracle FUSION - GL Key Ratios Prompt" from Section 1.

    Note:

    Remove the prompt, not the Section.

  4. From the selection pane in the Saved Content area, browse to the Shared Folders where the dashboard prompt to be used for this dashboard page is stored.

    For the Overview page in the General Ledger dashboard, the catalog path is stored in the following location:

    /Shared folders/Financials/Analytic Library/General Ledger/Key Ratios
    Prompt name: Oracle PSFT - GL Key Ratios Prompt
    
  5. Drag and drop the dashboard prompt from the shared folder into the section where you removed the prompt in step 3.

  6. Click the Save button to save the dashboard page and exit Dashboard Editor.

    This updates the dashboard page with the PeopleSoft prompt.

  7. Repeat these steps for all Financial Analytics dashboard pages listed in Table B-9.

B.2.25 How to Configure Original Job Requisition Status

Overview

In Oracle E-Business Suite, the Job Requisition status is not preserved as historical information in the OLTP. Therefore, as a Job Requisition status changes, for example from "Drafted" to "Approved" to "Open" to "Closed", the OLTP saves only the last status.

With respect to the Oracle Business Intelligence Applications data warehouse, the Job Requisition Open event is a significant event because several Job Requisition metrics depend on it, such as Job Requisition Open to Assessment Start Days, Job Requisition Open Since (Days), Job Requisition Age (Months), Job Requisition Opened, Job Requisition Open, Job Requisition Open (Period Begin), and so on. Therefore, you must track this event by configuring the original Job Requisition status event, which occurs on the Job Requisition start date.

This task configures Job Requisition "start" events.

Optional or Mandatory

This is a mandatory task, although there are defaults set up already in the installed solution. Oracle recommends that you read this section and then decide whether the default settings meet your business needs. If not, then you must change the configuration to suit your business needs.

Applies to

This configuration is required only for E-Business Suite source applications.

Task description

In order to infer the "start" event of a Job Requisition that can be at any state now (we call it "Job Requisition Current Status" – source conformed domain), we need to look at the possible previous "most significant" status. We call this as "Job Requisition Original Status" - a source conformed domain. In this task, you will map your "Job Requisition Current Status" domain members to one of the possible members of the "Job Requisition Original Status" domain.

For example, if the current status is "Closed", then you can infer that at one point it had a status of "Open". Therefore, you should map the original status to "Open" against the current status "Closed". However, if the current status is "Approval Denied", then it might make sense to assume that the requisition was never opened. Therefore, you should map the original status to another value, such as "Requested", against the current status "Approval Denied".

Both of the involved source conformed domains "Job Requisition Current Status" (**) and "Job Requisition Original Status" have members that come from the same set of values. In fact, all values of "Job Requisition Original Status" should exist as a value in "Job Requisition Current Status".

(**) Before you configure "Job Requisition Original Status" you must configure the source conformed domain "Job Requisition Current Status". For more information, refer to the task "Manage Domains and Member Mappings for Recruitment Fact Group".

The "Additional Information" section below gives some extra information related to this task that should help understand the concept better, in terms of how these configurations are used downstream. Also, it provides a list of installed mappings between the two source-conformed domain members related to this task.

Additional Information

In the current task, you configure the probable and most significant original status of a job requisition, given its current status, with the intent of tracking its "Requisition Open" event. However, mapping the original status alone does not complete the event configuration. After you map the original status, you must also map statuses to the appropriate events in a later task called "Manage Domains and Member Mappings for Recruitment Event Type". These two tasks together complete the configuration of requisition start events.

For example, if the "Job Requisition Current Status" is "Closed", it may mean that the job requisition was "Open" on an earlier date. In this case, the "Job Requisition Original Status" can be configured as "Approved". The "Approved" status can then be mapped to RQSTN_OPEN as W_EVENT_CODE, W_SUB_STAGE_CODE, and W_STAGE_CODE at the configuration task "Manage Domains and Member Mappings for Recruitment Event Type". This completes the process of identifying a job requisition's "Opened" event when the job requisition is currently "Closed" and its previous statuses are not tracked in E-Business Suite.

As another example, if the "Job Requisition Current Status" is "Rejected", it may mean that this job requisition previously had a status of "Pending" on an earlier date and was never in "Open" status. In this case, the "Job Requisition Original Status" can be configured as "Pending" instead of "Open". The "Pending" status can then be mapped to RQSTN_APPROVAL_PENDING as W_EVENT_CODE and W_SUB_STAGE_CODE, and to RQSTN_PENDING as the stage code, at the configuration task "Manage Domains and Member Mappings for Recruitment Event Type". Within the data warehouse, this job requisition is then treated as never having been opened.

The following table shows the installed mapping between the two source-conformed domain members. If this does not meet your business needs, then you will need to edit the member mappings.

Table B-10 Job Requisition Current Status to Job Requisition Original Status Member Mapping

Source Member (Current Status) | Source Member Code (Current Status) | Target Member (Original Status) | Target Member Code (Original Status)
APPROVED | APPROVED | APPROVED | APPROVED
CANCELLED | CANCELLED | PENDING | PENDING
CLOSED | CLOSED | APPROVED | APPROVED
HOLD | HOLD | APPROVED | APPROVED
OPEN | OPEN | APPROVED | APPROVED
PENDING | PENDING | APPROVED | APPROVED
REJECTED | REJECTED | PENDING | PENDING
RQSTN_CANCELLED | UNAPPROVED | PENDING | PENDING
UNDER REVIEW | UNDER REVIEW | PENDING | PENDING


Dependency

If you edit this mapping, you must carry out a full ETL load of your Data Warehouse.

B.2.26 How to Set Up Fixed Asset Security for Oracle Fusion Applications

Financial Analytics supports security over fixed asset books in Fixed Asset subject areas. The list of asset books that a user has access to is determined by the grants in Oracle Fusion application.

In order for data security filters to be applied, the appropriate initialization blocks must be enabled, depending on the deployed source system. The initialization block names relevant to the various source systems are given below. Fusion Applications security is enabled by default; therefore, no manual configuration is required. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

  • Oracle Fusion Applications: Fixed Asset Book

  • Oracle E-Business Suite: Fixed Asset Book EBS

  • Oracle PeopleSoft: Fixed Asset Book PSFT

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

Configuring BI Duty Roles

The following BI Duty Role is applicable to the Fixed Asset subject areas for Oracle Fusion Applications:

  • OBIA_ASSETS_ACCOUNTING_MANAGERIAL_ANALYSIS_DUTY

This duty role controls which subject areas and dashboard content the user has access to. It also ensures that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.27 How to Set Up Fixed Asset Security for E-Business Suite

In order for data security filters to be applied, the appropriate initialization blocks must be enabled, depending on the deployed source system. To enable Fixed Asset security for E-Business Suite, enable the Oracle E-Business Suite initialization block and make sure that the initialization blocks of all other source systems are disabled. The initialization block names relevant to the various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

  • Oracle Fusion Applications: Fixed Asset Book

  • Oracle E-Business Suite: Fixed Asset Book EBS

  • Oracle PeopleSoft: Fixed Asset Book PSFT

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

Configuring BI Duty Roles

The following BI Duty Role is applicable to the Fixed Asset subject areas for E-Business Suite:

  • Fixed Asset Accounting Manager EBS

This duty role controls which subject areas and dashboard content the user has access to. It also ensures that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.28 How to Set Up Fixed Asset Security for PeopleSoft

Financial Analytics supports security over fixed asset books in Fixed Asset subject areas. The list of asset books that a user has access to is determined by the grants in PeopleSoft.

In order for data security filters to be applied, the appropriate initialization blocks must be enabled, depending on the deployed source system. To enable Fixed Asset security for PeopleSoft, enable the Oracle PeopleSoft initialization block and make sure that the initialization blocks of all other source systems are disabled. The initialization block names relevant to the various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

  • Oracle Fusion Applications: Fixed Asset Book

  • Oracle E-Business Suite: Fixed Asset Book EBS

  • Oracle PeopleSoft: Fixed Asset Book PSFT

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

Configuring BI Duty Roles

The following BI Duty Role is applicable to the Fixed Asset subject areas for PeopleSoft:

  • Fixed Asset Accounting Manager PSFT

This duty role controls which subject areas and dashboard content the user has access to. It also ensures that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.29 How to Configure the Asset Category Dimension for E-Business Suite

Asset Category is defined in the Oracle E-Business Suite Fixed Asset application using the Key Flexfield (KFF) feature. You can set up the KFF using different segments based on your business needs.

The configuration file file_fa_category_segment_config_ora.csv is used to configure the segment mapping between the Category KFF in your E-Business Suite Fixed Asset application and the Asset Category dimension in Oracle Business Analytics Warehouse. This configuration must be done before the ETL load. During ETL, the configuration CSV file determines which KFF segment is loaded into which segment column in the Asset Category dimension table in Oracle Business Analytics Warehouse.

For example, assume that in Oracle Business Analytics Warehouse the segment columns store the following conformed values:

  • W_ASSET_CATEGORY_D.major_category stores Major category (such as COMPUTER)

  • W_ASSET_CATEGORY_D.minor_category stores Minor category (such as LAPTOP, or DESKTOP)

  • W_ASSET_CATEGORY_D.segment1 stores Major category

  • W_ASSET_CATEGORY_D.segment2 stores Minor category

Also assume that in your Oracle E-Business Suite instance you use segment 2 and segment 3 to store the major and minor category:

  • FA_CATEGORIES_B.segment1 not used

  • FA_CATEGORIES_B.segment2 stores Major category

  • FA_CATEGORIES_B.segment3 stores Minor category

In this example, the configuration CSV file should be set up as follows:

Table B-11 Asset Category Dimension configuration

MAJOR_CATEGORY | MINOR_CATEGORY | SEG1 | SEG2 | SEG3 | SEG4 | SEG5 | Etc.
SEGMENT2 | SEGMENT3 | SEGMENT2 | SEGMENT3 | | | |

Note: The MAJOR_CATEGORY and MINOR_CATEGORY columns contain the segment numbers that represent the major and minor category respectively.
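
In CSV form, this configuration might look like the following sketch (it assumes the column order shown in Table B-11, with the unused SEG columns left empty):

MAJOR_CATEGORY, MINOR_CATEGORY, SEG1, SEG2, SEG3, SEG4, SEG5, SEG6, SEG7
SEGMENT2, SEGMENT3, SEGMENT2, SEGMENT3, , , , , 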

How to configure the Asset Category Dimension using file_fa_category_segment_config_ora.csv:

  1. Edit the file file_fa_category_segment_config_ora.csv.

    The file file_fa_category_segment_config_ora.csv is used to match the segment fields in Oracle E-Business Suite to the segment fields in the Oracle Business Analytics Warehouse table W_ASSET_CATEGORY_D.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter the segment mapping information into the fields.

    The columns SEG1 to SEG7 represent the data warehouse segment columns in the Asset Category dimension table. For each of these segments, fill in the corresponding mapped KFF segment. Fill the MAJOR_CATEGORY and MINOR_CATEGORY columns with the segment number that represents the major and minor category respectively.

    Leave the field empty if there is no mapping.

  3. Save the file.

B.2.30 How to Configure the Customer Costs Lines and Product Costs Lines Tables

This section explains how to configure Customer Costs Lines and Product Costs Lines for Oracle Financial Analytics, and contains the following topics:

B.2.30.1 About the Customer Costs Lines and Product Costs Lines Tables for Financial Profitability Analytics

This configuration is required only if you are implementing Financial Profitability Analytics and you want to allocate your expenses by the product or customer dimensions. The default adapter does not capture the miscellaneous costs and expenses associated with generating revenue from a customer or from a product (for example, marketing campaign expenses). You must provide this miscellaneous data through the CSV files, as described in this section.

The data files file_customer_cost_line_fs.csv and file_product_cost_line_fs.csv are used to enter data into the Customer Cost Lines table and the Product Cost Lines table before an ETL full load. Depending on the INTEGRATION_ID values in these files, the ETL performs an insert or an update in the Customer Cost Lines and Product Cost Lines tables. If an INTEGRATION_ID in the files already exists in the fact table, then the ETL updates that transaction row in the fact table. You must populate these data files before an ETL load.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  • The file_customer_cost_line_fs.csv file loads the Customer Costs Lines data into the W_CUSTOMER_COST_LINE_F table. Columns are categorized into four types, as follows:

    • Integration_ID - A unique identifier for the individual rows in the file.

    • FK_ID - The integration ID of the corresponding dimension. For example, CUSTOMER_ID should be populated with the Integration ID of the Customer dimension.

    • Amount Columns - For example, CUST_COST_AMT, the amount for the particular transaction line.

    • Attribute Columns - For example, COST_LINE_DOC_ITEM, COST_LINE_DOC_SUB_ITEM, EXPENSED_ON_DT, and so on. When inserting data into the _DT columns, make sure the data is entered in 'YYYYMMDDHH24MISS' format.

  • The file_product_cost_line_fs.csv file loads the Product Costs Lines data into the W_PRODUCT_COST_LINE_F table. Columns are categorized into four types, as follows:

    • Integration_ID - A unique identifier for the individual rows in the file.

    • FK_ID - The integration ID of the corresponding dimension. For example, PRODUCT_ID should be populated with the Integration ID of the Product dimension.

    • Amount Columns - For example, CUST_COST_AMT, the amount for the particular transaction line.

    • Attribute Columns - For example, COST_LINE_DOC_ITEM, COST_LINE_DOC_SUB_ITEM, EXPENSED_ON_DT, and so on. When inserting data into the _DT columns, make sure the data is entered in 'YYYYMMDDHH24MISS' format.
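
For illustration only, a row in file_customer_cost_line_fs.csv might look like the following sketch. Only a subset of the columns described above is shown, and all values are hypothetical; the delivered file contains the full column set in its own order:

INTEGRATION_ID, CUSTOMER_ID, CUST_COST_AMT, EXPENSED_ON_DT
CUST_COST_0001, CUSTOMER~1001, 2500, 20110315000000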

B.2.30.2 How to Configure the Customer Costs Lines and Product Costs Lines Tables for Financial Profitability Analytics

Before you perform a full load ETL, you must follow this procedure to configure the Customer Cost Lines and Product Costs Lines.

To configure the Customer Costs Lines and Product Costs Lines tables:

  1. Copy the data files file_customer_cost_line_fs.csv and file_product_cost_line_fs.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Edit the file_customer_cost_line_fs.csv.

  3. Insert a record into the file for each customer costing transaction that you want to load into the Customer Cost fact table.

  4. Save the file.

  5. Edit the file_product_cost_line_fs.csv.

  6. Insert a record into the file for each product costing transaction that you want to load into the Product Cost fact table.

  7. Save the file.

    You are now ready to perform a full load ETL to load the Customer Cost Lines and Product Costs Lines.

B.2.31 How to Configure Budgets for Oracle General Ledger Analytics

If you are using Oracle E-Business Suite, PeopleSoft, or JD Edwards EnterpriseOne, and you would like to extract the budget data from these sources and import it into the data warehouse, you can use the preconfigured adapter mappings. However, if you want to use budget data from other external systems, you can import the data into the data warehouse using the Universal adapter as described in this section. This section includes the following topics:

B.2.31.1 About Configuring Universal Source Files

The following tables describe the columns in the universal source CSV files file_budget.csv and file_acct_budget.csv, their data types, and how to populate them where applicable.

Table B-12 shows the structure of the file_budget.csv file. The records in file_budget.csv are loaded into W_BUDGET_D.

Table B-12 Universal Source for Budget Fact (file_budget.csv)

Column Name | Datatype | Size | Description
BUDGET_NAME | string | 80 | Budget name.
BUDGET_VERSION | string | 30 | Budget version.
BUDGET_STATUS | string | 30 | Budget status.
BUDGET_TYPE | string | 30 | Budget type.
CREATED_BY_ID | string | 80 | ID of created by user. Populate with Integration_ID from w_user_d.
CHANGED_BY_ID | string | 80 | ID of changed by user. Populate with Integration_ID from w_user_d.
CREATED_ON_DT | string | 14 | Created date.
CHANGED_ON_DT | string | 14 | Changed date. Used for updating an existing record in the warehouse. Increase the date if you want to update the record. If a record with the same integration_ID already exists in the target table W_BUDGET_D, then the load process will compare the CHANGED_ON_DT values between this record and the record in W_BUDGET_D. If this record's CHANGED_ON_DT is later than the record in W_BUDGET_D, then the load process will perform an update against the record in W_BUDGET_D; otherwise the load process will ignore this record, and no update or insertion will occur. If there is no matching record in W_BUDGET_D with the same integration_ID, then the load process will insert this record into W_BUDGET_D.
AUX1_CHANGED_ON_DT | string | 14 | -
AUX2_CHANGED_ON_DT | string | 14 | -
AUX3_CHANGED_ON_DT | string | 14 | -
AUX4_CHANGED_ON_DT | string | 14 | -
DELETE_FLG | string | 1 | -
DATASOURCE_NUM_ID | number | 10 | A number for your data source. Populate the same datasource_num_id as your main source application.
INTEGRATION_ID | string | 80 | A unique identifier for the record.
TENANT_ID | string | 80 | -
X_CUSTOM | string | 10 | -


Table B-13 shows the structure of the file_acct_budget.csv file. The records in file_acct_budget.csv are loaded into W_ACCT_BUDGET_F.

Table B-13 Universal Source for Budget Fact (file_acct_budget.csv)

Column Name | Datatype | Size | Description
ADJUSTMENT_FLG | string | 1 | -
AUX1_CHANGED_ON_DT | string | 14 | -
AUX2_CHANGED_ON_DT | string | 14 | -
AUX3_CHANGED_ON_DT | string | 14 | -
AUX4_CHANGED_ON_DT | string | 14 | -
BUDG_BUSN_AREA_ORG_ID | string | 80 | Company Org identifier. Populate with integration_id from w_int_org_d where business_area_flg = Y.
BUDG_CTRL_AREA_ORG_ID | string | 80 | Company Org identifier. Populate with integration_id from w_int_org_d where ctrl_area_flg = Y.
BUDG_FIN_AREA_ORG_ID | string | 80 | Company Org identifier. Populate with integration_id from w_int_org_d where fin_area_flg = Y.
BUDGET_CALENDAR_ID | string | 80 | -
BUDGET_DOC_AMT | number | 22 | Budget amount in document currency.
BUDGET_GRP_AMT | number | 22 | -
BUDGET_ID | string | 80 | Populate with the value from integration_id in file_budget.csv.
BUDGET_LEDGER_ID | string | 80 | -
BUDGET_LOC_AMT | number | 22 | Budget amount in local currency.
CHANGED_BY_ID | string | 80 | ID of changed by user. Populate with Integration_ID from w_user_d.
CHANGED_ON_DT | string | 14 | Changed date. Used for updating an existing record in the warehouse. Increase the date if you want to update the record. If a record with the same integration_ID already exists in the target table W_ACCT_BUDGET_F, then the load process will compare the CHANGED_ON_DT values between this record and the record in W_ACCT_BUDGET_F. If this record's CHANGED_ON_DT is later than the record in W_ACCT_BUDGET_F, then the load process will perform an update against the record in W_ACCT_BUDGET_F; otherwise the load process will ignore this record, and no update or insertion will occur. If there is no matching record in W_ACCT_BUDGET_F with the same integration_ID, then the load process will insert this record into W_ACCT_BUDGET_F.
COMPANY_ORG_ID | string | 80 | Company Org identifier. Populate with integration_id from w_int_org_d where company_flg = Y.
COST_CENTER_ID | string | 80 | Cost Center identifier. Populate with integration_id from w_cost_center_d.
CREATED_BY_ID | string | 80 | ID of created by user. Populate with Integration_ID from w_user_d.
CREATED_ON_DT | string | 14 | Created date.
DATASOURCE_NUM_ID | number | 10 | A number for your data source. Populate the same datasource_num_id as your main source application.
DELETE_FLG | string | 1 | -
DOC_CURR_CODE | string | 30 | Document currency code.
GL_ACCOUNT_ID | string | 80 | GL Account identifier. Populate with integration_id from w_gl_account_d.
GRP_CURR_CODE | string | 30 | -
INTEGRATION_ID | string | 80 | A unique identifier for the record.
LOC_CURR_CODE | string | 30 | Local currency code.
PERIOD_BEGIN_DT | string | 14 | -
PERIOD_END_DT | string | 14 | Populate with your budget period's end date. If your budget is monthly, populate with the month end date.
POSTED_ON_DT | string | 14 | A date on which this transaction can be reported.
PRODUCT_ID | string | 80 | Product identifier. Populate with integration_id from w_product_d.
PROFIT_CENTER_ID | string | 80 | Profit Center identifier. Populate with integration_id from w_profit_center_d.
PROJECT_ID | string | 80 | -
TENANT_ID | string | 80 | -
X_CUSTOM | string | 10 | -


Note: Date columns should be populated in the CSV file as a number in the format YYYYMMDDHH24MISS.
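
For illustration only, the following sketch shows how a budget header and one budget line might be expressed using a subset of the columns described above. The delivered CSV files contain the full column sets in their own order, and all values here are hypothetical (the data source number 900 matches the Universal Adaptor instance mentioned later in this section):

file_budget.csv:
INTEGRATION_ID, BUDGET_NAME, BUDGET_VERSION, BUDGET_STATUS, CHANGED_ON_DT, DATASOURCE_NUM_ID
FY2012_OPS_BUDGET, FY2012 Operating Budget, 1, Approved, 20120101000000, 900

file_acct_budget.csv:
INTEGRATION_ID, BUDGET_ID, GL_ACCOUNT_ID, COST_CENTER_ID, BUDGET_LOC_AMT, LOC_CURR_CODE, PERIOD_END_DT, CHANGED_ON_DT, DATASOURCE_NUM_ID
FY2012_OPS_BUDGET~JAN, FY2012_OPS_BUDGET, 12345, 200, 50000, USD, 20120131000000, 20120101000000, 900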

Use Table B-14 to understand how the integration_id (key) values of some of the key dimensions are constructed for the Oracle E-Business Suite source system. You can use this information to populate the dimension foreign key identifiers in the universal source CSV files for the budget facts, if you have to use the budget facts in conjunction with dimensions populated from Oracle E-Business Suite.

Table B-14 Populating the integration_id fields in Oracle E-Business Suite Source Systems

Field How to populate

GL_ACCOUNT_ID (w_gl_account_d)

.GL code combination ID.

COMPANY_ORG_ID (w_int_org_d)

No need to populate; will be calculated based on GL Account ID.

COST_CENTER_ID (w_cost_center_d)

No need to populate; will be calculated based on GL Account ID.

PROFIT_CENTER_ID (w_profit_center_d)

No need to populate; will be calculated based on GL Account ID.

LEDGER_ID (w_ledger_d)

For Oracle E-Business Suite 11i, populate with the Set of Books ID. For Oracle E-Business Suite R12, populate with the Ledger ID.


Use Table B-15 to understand how the integration_id (key) values of some of the key dimensions are constructed for Oracle's JD Edwards EnterpriseOne source systems. You can use this information to populate the dimension foreign key identifiers in the universal source CSV file for the budget fact above, if you need to use the budget fact in conjunction with dimensions populated from Oracle's JD Edwards EnterpriseOne.

Table B-15 Populating the integration_id fields in Oracle's JD Edwards EnterpriseOne

Field How to populate

GL_ACCOUNT_ID (w_gl_account_d)

GBAID||'~'||GBSBL||'~'||GBSBLT

COMPANY_ORG_ID (w_int_org_d)

GBCO

COST_CENTER_ID (w_cost_center_d)

GBMCU

PROFIT_CENTER_ID (w_profit_center_d)

GBCO

LEDGER_ID (w_ledger_d)

GBCO

PRODUCT_ID (w_product_d)

If GBSBLT indicates an item, then populate the Product ID with that GBSBL value.

PROJECT_ID (w_project_d)

Not applicable

BUDG_BUSN_AREA_ORG_ID (w_int_org_d)

GBMCU

BUDG_FIN_AREA_ORG_ID (w_int_org_d)

GBMCU

BUDG_CTRL_AREA_ORG_ID (w_int_org_d)

GBMCU

BUDGET_ID (w_budget_d)

Not applicable
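
As an illustration only, the following SQL sketch shows how these concatenation rules could be applied to derive the dimension identifiers for the budget CSV file. The source table F0902 (Account Balances) and the ledger type filter GBLT = 'BA' are assumptions; substitute the table and filters that match how budget amounts are stored in your JD Edwards EnterpriseOne system.

SELECT GBAID || '~' || GBSBL || '~' || GBSBLT AS GL_ACCOUNT_ID,   -- GL Account integration_id
       GBCO                                   AS COMPANY_ORG_ID,
       GBMCU                                  AS COST_CENTER_ID,
       GBCO                                   AS PROFIT_CENTER_ID,
       GBCO                                   AS LEDGER_ID
FROM   F0902                                  -- assumed budget balance source table
WHERE  GBLT = 'BA';                           -- assumed budget ledger type; adjust as needed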


B.2.31.2 How to Import Budget Data into the Data Warehouse Through the Universal Adapter

Follow these steps to import budget data into the data warehouse through the Universal adapter.

  1. Populate the file_budget.csv and file_acct_budget.csv files with your budget data.

    Refer to the tables above for details of how to populate these files.

  2. Build a Load Plan with one fact group: "900: Universal Adaptor Instance"."GL Budget".

  3. Run the Load Plan that you created in the previous step.

    Note: This Load Plan must be run only after the regular Load Plan that populates Oracle Business Analytics Warehouse for the other Subject Areas has completed.

  4. Load the budget data or changes to existing budget data.

    Repeat Step 1 and Step 3 as needed to load new budget data for the next fiscal period or to correct already loaded budget data.

B.2.32 How to Set Up Workforce Frozen Snapshots

The Workforce Frozen Snapshot fact periodically captures a snapshot of the workforce data and keeps it unchanged, so that you can report on how the data actually appeared at a particular point in time. The snapshot frequency can be configured in several modes.

Optional or Mandatory

This task is optional; however, the default setting does not collect any frozen snapshots.

Applies to

All sources (E-Business Suite, PeopleSoft and Fusion).

Task description in detail

To enable the feature, configure the following parameter:

HR_WRKFC_SNAPSHOT_HIST_MODE_CODE

The table below describes the valid values.

Table B-16 Values for HR_WRKFC_SNAPSHOT_HIST_MODE_CODE

Value Description

None (default)

The feature is disabled; no frozen snapshots are collected. When the feature is enabled, the frozen snapshots are stored in the table W_WRKFC_SNP_F. The other modes are described below.

Start of Month

Snapshots taken once per month on the first day of each month. No other parameters need to be configured.

End of Month

Snapshots taken once per month on the last day of each month. No other parameters need to be configured.

Relative to Start

Snapshots taken once per month on the nth occurrence of a given day of each month, for example 3rd Tuesday of each month. Other parameters need to be configured:

  • HR_WRKFC_SNAPSHOT_DAY_CODE to specify the day, for example Tuesday.

  • HR_WRKFC_SNAPSHOT_DAY_OCCURENCE to specify the nth occurrence, for example 3rd Tuesday.

Relative to End

Snapshots taken once per month on the nth occurrence of a given day counted from the end of each month, for example the 3rd Tuesday from the end of each month. Other parameters need to be configured:

  • HR_WRKFC_SNAPSHOT_DAY_CODE to specify the day, for example Tuesday.

  • HR_WRKFC_SNAPSHOT_DAY_OCCURENCE to specify the nth occurrence, for example 3rd Tuesday.

Day of Week

Snapshots taken once per week on a specific day of week, for example every Monday. Other parameters need to be configured:

  • HR_WRKFC_SNAPSHOT_DAY_CODE to specify the day, for example Monday.


Dependency

No dependencies.

B.2.33 How to Use Workforce Deployment Subject Area

The Workforce Deployment Subject Area contains two logical facts:

  • Workforce Balance Information – for reporting on balances at a point in time, for example Month End Headcount and Salary.

  • Workforce Event Information – for reporting on events occurring in a period of time, for example Number of Terminations per Year.

Most reports should use one logical fact or the other, and it is important to know the difference for the report to make sense. It is possible to design reports that use both logical facts together; however, care must be taken over dimension usage.

Task description in detail

Workforce Balance Information

The Workforce Balance fact shows status at a given point in time. The time dimension should be used with this fact. If the time dimension is not used, the status for all time periods will be calculated and then only the current one returned. This is very inefficient.

Except for the current period, whichever time period is used (Year, Quarter, Month, and Day) the fact gives the status as of the period end date. For the current period the behaviour can be configured either to show status as of the future period end date or as of the current date. This is controlled by the HR_WRKFC_MAX_EFFECTIVE_DT session variable. For more about this variable, see task "How to Control Future Transaction Data Visibility".

This fact does not include terminated workers. The status is given only for workers who have not been terminated. For E-Business Suite this is inclusive of the "Actual Termination Date" which is the last day of employment where the worker is still active on the system.

Workforce Event Information

The Workforce Event fact shows events that have happened within a given period of time. The time dimension must be used with this fact otherwise all time periods will be shown, which would result in much slower performance. Events are shown which have occurred within each time period. For the current period the behaviour can be configured either to show events up to the current date, or all future events in the period as well. This is controlled by the HR_WRKFC_MAX_EFFECTIVE_DT session variable. For more about this variable, see task "How to Control Future Transaction Data Visibility".

This fact does include termination events, and defines the termination event date to be the first day the worker is inactive. For E-Business Suite this is the day after the "Actual Termination Date" which is the last day of employment where the worker is still active on the system.

Workforce Event Dimension

The Workforce Event Fact joins to the Workforce Event Dimension at detail level. The dimension stores information about each event, such as the type of event (for example, Hire, Termination) along with a number of change flags to indicate whether any of the main dimensions changed (for example, Organization, Job, and Grade).

The Workforce Balance Fact joins to the Workforce Event Dimension at summary level. This means that reports cannot use filters or attributes based on the dimension. However for reports combining balance and event information it is possible to create metrics based on the Workforce Event Dimension.

The Workforce Event Dimension is useful in two main ways:

  • Creating detail reports such as "Show me all the terminations in 2011" or "Show me all the organization changes of workers who have less than 1 year of service".

  • Creating event based metrics. For example, suppose you wanted to report on "Headhunted" events. In the workforce event domain mapping you map terminations with a reason of "Headhunted" to a new event "Headhunted" (mapped to sub-group Voluntary Termination and group Termination):

    • The "Headhunted" detail report should be straightforward – a listing of attributes where the Workforce Event Dimension Event Code = 'HEADHUNTED'.

    • The summary report needs to show numbers of terminations, including voluntary, involuntary and headhunted. The other metrics are already defined. Add a new metric "Headhunted Count" to the event logical fact defined as:

      CASE WHEN "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_WRKFC_EVENT_TYPE_D"."W_EVENT_CODE" = 'HEADHUNTED' THEN 1 ELSE 0 END
      

      Set the aggregation rule of this metric to be "Sum".
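
As a sketch only, the "Headhunted" detail report could be expressed as logical SQL along the following lines. The subject area, presentation table, and column names shown are hypothetical and will differ in your Presentation layer; the Event Code filter value is the one mapped above.

SELECT "Employee"."Employee Name",
       "Time"."Date",
       "Workforce Event"."Event Name"
FROM   "Human Resources - Workforce Deployment"
WHERE  "Workforce Event"."Event Code" = 'HEADHUNTED'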

B.2.34 How to Retrieve Information Stored in Flexfield Columns

Flexfields are customer-configured fields that are used to store necessary information such as Job Code, Job Family, Job Function, Job Level, and Pay Level Name. A flexfield can be either a key flexfield or a descriptive flexfield. It is possible to bring information from flexfield columns into the data warehouse. This is done by configuring Oracle BI Applications Configuration Manager to identify and map such columns. This section explains how Oracle BI Applications Configuration Manager should be configured to achieve this.

Optional or Mandatory

This configuration is required only if flexfields are implemented. If this configuration is not done, then the columns that depend on flexfield data (such as Job Code, Job Family, Job Function and Job Level in the Job dimension, and Pay Level Name in the Grade dimension) will not contain any data.

Applies to

Applicable to E-Business Suite source systems.

Task description in detail

Oracle BI Applications Configuration Manager is used to configure flexfields. For example, for the "Job" dimension, flexfields for Job Code, Job Family, Job Function and Job Level are pre-seeded.

First, identify the flexfield structures and columns that store the information for flexfield dependent data warehouse columns such as Job family, Job Function, etc. For example, a source configuration can have the following scenarios:

For PER_JOB_DEFINITIONS.ID_FLEX_NUM = '52119', PER_JOB_DEFINITIONS.SEGMENT1 stores Job Family and/or PER_JOB_DEFINITIONS.SEGMENT2 stores Job Function. In this case, the SEGMENT1 column stores Job Family values and SEGMENT2 stores Job Function values. ID_FLEX_NUM is the flexfield structure number.
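
To confirm which segments hold the required values for a given flexfield structure, you can run a query such as the following against the E-Business Suite source (a sketch only; the structure number 52119 is simply the example used above):

SELECT id_flex_num,
       segment1,    -- in this example, stores Job Family
       segment2     -- in this example, stores Job Function
FROM   per_job_definitions
WHERE  id_flex_num = '52119';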

To configure the flexfield, two source domains and one target domain are used for each area. These are already defined in Oracle BI Applications Configuration Manager. As an example, for "Job" dimension the source domains are "KEY:800:JOB" and "DESCRIPTIVE:800:PER_JOBS". The target domain is "W_FLEX_JOB_ATTRIBUTES".

"W_FLEX_JOB_ATTRIBUTES" has its domain members pre-seeded. The members are: Job Code, Job Family, Job Function and Job Level.

The source domain members for the source domain code "KEY:800:JOB" can either be seeded manually or be loaded through the domain ETL, and will get values such as 52119:SEGMENT1 and 52119:SEGMENT2 (assuming these are key flexfields; a similar approach can be followed for descriptive flexfields).

The source domain (KEY:800:JOB or DESCRIPTIVE:800:PER_JOBS) can be seeded by either of the two methods described below.

1. Manually populate the flexfield value in the source domain. For example, for the Job dimension, populate "52119:SEGMENT1" in KEY:800:JOB or DESCRIPTIVE:800:PER_JOBS, depending on whether it is a key or descriptive flexfield. Then map this value to JOB_FAMILY and run the Domain-only Load Plan once to complete the configuration.

2. Run the Domain-only Load Plan once to automatically populate the values (for example, 52119:SEGMENT1) in the source domain (KEY:800:JOB or DESCRIPTIVE:800:PER_JOBS). Then map these values (for example, "52119:SEGMENT1 -> JOB_FAMILY"). Run the Domain-only Load Plan a second time to complete the configuration.

You need to map the source domain members to the pre-seeded target domain members in Oracle BI Applications Configuration Manager. A sample member mapping is provided below.

Table B-17 Example data for retrieving data from Flexfields

Source Domain Member Name Source Domain Member Code Target Domain Member Name Target Domain Member Code

52119:SEGMENT1

52119:SEGMENT1

JOB_FAMILY

JOB_FAMILY

52119:SEGMENT2

52119:SEGMENT2

JOB_FUNCTION

JOB_FUNCTION


This finishes the configuration required. When the load plan is executed, the values in SEGMENT1 will be used to populate Job family and SEGMENT2 will be used to populate Job function.

List of flexfield-dependent fields:

  • Job dimension - Job Code, Job Family, Job Function, Job Level

  • Grade dimension - Pay Level name

  • HR Position dimension - Position Number

Dependency

None.

B.2.35 How to Add Payroll Balances to BI Payroll Balance Group

Purpose

In order to extract payroll balances into Oracle Business Analytics Warehouse, the balances must be assigned to the BI Balance Group in the Fusion Applications system, and Elements must be assigned to the Element Group in PeopleSoft Global Payroll.

For PeopleSoft North American Payroll and E-Business Suite Payroll, it is strongly recommended to create a custom table in the OLTP environment with all the balances/earnings/deductions/taxes that need to be extracted into Oracle Business Analytics Warehouse.

By limiting the balances extracted, the performance of ETL and reports will be improved. In addition, only certain types of balance are suitable for inclusion in the warehouse. You should only extract run balances, as other types of balances may not be fully additive (for example, year-to-date balances cannot be added together).

To ensure additivity of measures, only run balances are supported. For each payroll run, the actual run balances processed are stored. Because these are not broken down by context, run balances can be combined across time to form higher-level balances, for example PTD, MTD, and YTD.
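
For example, because run balances are additive, period-to-date figures can be derived simply by summing the stored run balances. The following sketch uses hypothetical table and column names (not the actual warehouse schema) purely to illustrate the idea:

-- Hypothetical names, for illustration of run-balance additivity only.
SELECT balance_name,
       SUM(CASE WHEN pay_period_end_dt >= TRUNC(SYSDATE, 'MM')   THEN run_balance_amt ELSE 0 END) AS mtd_amount,
       SUM(CASE WHEN pay_period_end_dt >= TRUNC(SYSDATE, 'YYYY') THEN run_balance_amt ELSE 0 END) AS ytd_amount
FROM   payroll_run_balances
GROUP  BY balance_name;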

For E-Business Suite Payroll and PeopleSoft North American Payroll, refer to the Help Topic ID 243 for more information.

Optional or Mandatory

Mandatory for Fusion Payroll and PeopleSoft Global Payroll. Optional, but highly recommended for E-Business Suite Payroll and PeopleSoft North American Payroll.

For Fusion and Global Payroll, the ETL is configured to extract only the balances that are assigned to the 'BI Balance Group' and 'GLOBAL BI BALGRP' Element Group respectively.

Applies to

Fusion Payroll, PeopleSoft Global Payroll, PeopleSoft North American Payroll and E-Business Suite Payroll.

Task description in detail

Refer to the appropriate section for your source system:

B.2.35.1 Steps for Fusion

Pre-requisites

  • Access to Fusion Applications Payroll Administration area.

  • Office 2007 with Oracle ADF 11g Plug In.

  • List of defined balances required to add to BI Balance Group.

  • Listed by Balance Dimension (which must be Run) and Balance Type.

  • Listed by Legislative Data Group.

These instructions cover the steps required to add balances to the BI Balance Group for inclusion in Oracle Business Analytics Warehouse. More details are available in the Payroll Administration documentation, which covers exceptions and verification reports to validate the setup.

Steps to create a batch:

  1. Log into Fusion Applications and navigate to the Payroll Administration area (Navigator => Payroll => Payroll Administration).

  2. In the Task pane select Batch Processing => Batch Loader.

  3. Click the Download button to open the Batch Loader Spreadsheet, re-entering your login details as requested.

  4. In the Batch Header Sheet tab, enter a name for the batch and the Legislative Data Group and Save.

  5. Double-click the batch name to select the batch and open the Batch Content Sheet tab.

  6. Click the Add button and select the 'Add a Defined Balance' action.

  7. Enter the details for each defined balance to be added to the BI Balance Group:

    • Line Sequence.

    • Attribute Definition – 'Global BI Attribute'.

    • Legislative Data Group – as entered in step 4.

    • Balance Dimension – balance dimension name; this should be a simple run balance without any contexts.

    • Balance Type – balance type name for the defined balance

  8. Click Save.

Steps to transfer the batch:

  1. In Fusion Applications navigate to the Checklists page (Navigator => Payroll => Checklists).

  2. In the Task pane select Payroll Flows => Submit a Process or Report.

  3. Select the Legislative Data Group for the batch.

  4. Select the 'Transfer Batch' process and click Next.

  5. Enter the details:

    Give a name for the Payroll Flow.

    For the batch parameter, select the batch name entered in step 4 of 'Steps to create a batch'.

  6. Submit the process.

B.2.35.2 Steps for PeopleSoft Global Payroll

  1. Navigate to Setup HRMS, then Product Related, then Global Payroll & Absence Mgmt, then Elements, then Element Groups, then Add a New Value.

    Element Group Name dialog.
  2. Provide the Name of the Element Group as 'GLOBAL BI BALGRP' and provide any meaningful description.

    Element Group Name dialog.
  3. Click on Element Group Members tab.

  4. Add the Earnings/Deductions that need to be extracted into Oracle Business Analytics Warehouse, to the Element Group created.

    The Global Payroll ETL is configured to extract only the Earnings/Deductions that are assigned to the Element Group GLOBAL BI BALGRP. The screenshot below shows how to assign Earnings/Deductions to the Element Group.

    Element Group Name dialog.

B.2.35.3 Steps for PeopleSoft North America Payroll

For PeopleSoft North American Payroll and E-Business Suite Payroll, it is strongly recommended to create a custom table in the OLTP environment with all balances/earnings/deductions/taxes that need to be extracted into Oracle Business Analytics Warehouse.

For example, you might use the following PeopleSoft North American Payroll Custom Table Script:

-- Filter table listing the balance codes to be extracted into Oracle Business Analytics Warehouse.
CREATE TABLE OBIA_PAY_BAL_FILTER (BALANCE_ID VARCHAR2 (50));
INSERT INTO OBIA_PAY_BAL_FILTER (BALANCE_ID)
SELECT DISTINCT A.BALANCE_CODE FROM
(
-- Deduction codes
SELECT D.DEDCD AS BALANCE_CODE
FROM PS_DEDUCTION_TBL D
WHERE D.DEDCD IN ('401','B00-23','B10-02','B10-15','B10-16')
UNION
-- Earnings codes
SELECT E.ERNCD AS BALANCE_CODE
FROM PS_EARNINGS_TBL E
WHERE E.ERNCD IN ('001','007','B14','B30')
UNION
-- Special earnings codes
SELECT S.ERNCD_SPCL AS BALANCE_CODE
FROM PS_SPCL_EARNS_TBL S
WHERE S.ERNCD_SPCL IN ('100','142','143','145')
UNION
-- US state taxes
SELECT ST.STATE AS BALANCE_CODE
FROM PS_STATE_TAX_TBL ST
WHERE ST.STATE IN ('AK','AL','AR','AS')
UNION
-- Canadian provincial taxes
SELECT CT.PROVINCE AS BALANCE_CODE
FROM PS_CAN_TAX_PROV CT
WHERE CT.PROVINCE IN ('AB','BC','MB','NB')
) A;
CREATE UNIQUE INDEX OBIA_PAY_BAL_FILTER_U1 ON OBIA_PAY_BAL_FILTER (BALANCE_ID);

Add all the Earnings/Deductions/Taxes that need to be extracted to the respective IN clauses of the above query.

B.2.35.4 Steps for E-Business Suite

E-Business Suite Payroll Custom Table Script:

CREATE TABLE OBIA_PAY_BAL_FILTER (BALANCE_ID VARCHAR2 (50));
INSERT INTO OBIA_PAY_BAL_FILTER (BALANCE_ID) 
SELECT DISTINCT DB.DEFINED_BALANCE_ID 
FROM 
PAY_BALANCE_TYPES BT,
PAY_DEFINED_BALANCES DB,
PAY_BALANCE_DIMENSIONS BD 
WHERE 
BT.BALANCE_TYPE_ID = DB.BALANCE_TYPE_ID AND 
DB.BALANCE_DIMENSION_ID = BD.BALANCE_DIMENSION_ID AND 
BT.BALANCE_NAME IN ('Payments','Overtime','Regular Earnings','Regular Salary');
CREATE UNIQUE INDEX OBIA_PAY_BAL_FILTER_U1 ON OBIA_PAY_BAL_FILTER (BALANCE_ID);

BT.BALANCE_NAME IN ('Payments','Overtime','Regular Earnings','Regular Salary') – list all of the balances that need to be extracted into Oracle Business Analytics Warehouse.

B.2.36 How to Configure Band Dimensions

Purpose

Oracle BI Applications HR Analytics makes use of seven Band Dimensions. The purpose of this task is to provide information about the band dimensions and about what is needed to configure the "bands" in these tables.

Optional or Mandatory

The default solution has 'bands' configured based upon industry best practices. If the default bands meet your business needs, then no further configuration is required. Otherwise, this is mandatory.

Task description – Overview of Band Dimensions

To enable data analysis based on various groups of a given attribute, Oracle BI Applications provides an option to configure your choice of groups, or bands, for these seven attribute families:

  • Person Age

  • Job Requisition Age

  • Time Card Age

  • Performance Ratings

  • Period of Service

  • Compa Ratio

  • Learning Grade

The band data that you configure is stored in seven corresponding dimension tables. The following table provides a description of each of these tables.

Table B-18 Dimension Tables that Store Band Data

Dimension Table Description

W_AGE_BAND_D

Age Band table. This table breaks down the ages of people into different bands to help determine the age ranges the people fall into. The table has two levels:

LEVEL_ID = AGE_BAND. This level defines the age bands.

LEVEL_ID = AGE. This level defines the age (in months) for a person.

W_JOB_RQSTN_AGE_BAND_D

Job Requisition Age Band table. This table breaks down the age of the job requisition into different bands to help determine the age range the job requisition falls into. The table has two levels:

LEVEL_ID = RQSTN_AGE_BAND. This level defines the job requisition age bands.

LEVEL_ID = RQSTN_AGE. This level defines the job requisition age (in months).

W_TLB_AGE_BAND_D

Timecard Age Band table. This table breaks down the age of the time card into different bands to help determine the age range the time card falls into. The table has two levels:

LEVEL_ID = TIMECARD_AGE_BAND. This level defines the time card age bands.

LEVEL_ID = TIMECARD_AGE. This level defines the age (in days) for a time card.

W_PERFORMANCE_BAND_D

Performance Band table. This table breaks down the performance ratings into different bands to help determine the level of quality of a candidate. The table has two levels:

LEVEL_ID = PERF_BAND. This level defines the performance rating bands.

LEVEL_ID = PERF_RTNG. This level defines each normalized performance rating (an integer up to 100) for a person.

W_PRD_OF_WRK_BAND_D

Period of Work Band table. This table breaks down employees and contingent workers into different bands to help determine the time that the employees or the contingent workers have been employed. The table has three levels:

Two levels define the bands: LEVEL_ID = POW_BAND_EMP defines the employees' period of work band; LEVEL_ID = POW_BAND_CWK defines the contingent workers' period of work band.

LEVEL_ID = POW. This level defines the period of work (in months) for a person.

W_COMPA_RATIO_BAND_D

Compa Ratio Band table. This table breaks down employees' Compa Ratio (percentage) values into different bands to help determine the distribution of Compa Ratio within the organization. The table has two levels:

LEVEL_ID = COMPA_RATIO_BAND. This level defines the Compa Ratio bands.

LEVEL_ID = COMP. This level defines the Compa Ratio for a person.

W_LM_GRADE_BAND_D

Learning Grade Band table. This table breaks down the Learning scores into different grade bands to help determine the quality of a learner with respect to the course opted for. The table has two levels:

LEVEL_CODE = LM_GRADE_BAND. This level defines the learning grade bands.

LEVEL_CODE = GRADE. This level defines the learning score for a learner corresponding to a given course.


Task description – Configuring Bands for all Band Dimensions

The bands for all seven supported attributes are pre-seeded within Oracle BI Applications Configuration Manager. However, if you want to change the default bands to use different base values, you need to edit the pre-seeded configurations. To do so, go to the 'Manage Domain Mappings and Hierarchies' area within Oracle BI Applications Configuration Manager. Screenshots of the pre-seeded mapping for each supported attribute are shown below.

Age Bands

This screenshot is described in surrounding text.

Job Requisition Age Bands

This screenshot is described in surrounding text.

Timecard Age Bands

This screenshot is described in surrounding text.

Performance Bands

This screenshot is described in surrounding text.

Period Of Work Ratio Bands - Employee

This screenshot is described in surrounding text.

Period Of Work Ratio Bands – Contingent Workers

This screenshot is described in surrounding text.

Compa Ratio Bands

This screenshot is described in surrounding text.

Learning Grade Bands

This screenshot is described in surrounding text.

B.2.37 How to Control HR Future-Dated Transaction Data Visibility

Purpose

In most HR systems it is common to enter transactions in advance; these are often termed future-dated transactions. Which roles can see future-dated transactions is often tied to role-based access control. In BI Applications Human Resources, a number of facts are secured in such a way as to limit access to future-dated transactions. This is achieved via session-level OBIEE Initialization Blocks and associated variables that return a date value. This date value dictates how far into the future the user has access to data, if at all. The purpose of this task is to set this date.

Optional or Mandatory

By default, no future data access is provided for users. If users need future data access, then this task is mandatory.

Task Description

There are five Session variables, and two related Initialization Blocks involved here. These are:

Table B-19 Session variables

Session Variable Name Initialization Block used Configuration Needed (Y/N)

HR_WRKFC_MAX_EFFECTIVE_DT

HR_WRKFC_MAX_EFFECTIVE_DT

Y

HR_WRKFC_MAX_EFFECTIVE_DT_WID

HR Workforce Max Effective Dates

N

HR_WRKFC_MAX_EFFECTIVE_DT_MONTH_WID

HR Workforce Max Effective Dates

N

HR_WRKFC_MAX_EFFECTIVE_DT_QTR_WID

HR Workforce Max Effective Dates

N

HR_WRKFC_MAX_EFFECTIVE_DT_YEAR_WID

HR Workforce Max Effective Dates

N


You need to configure only the first Initialization Block. The default delivered behaviour of the Initialization Block 'HR_WRKFC_MAX_EFFECTIVE_DT' is to always return today's date, and the SQL statement is:

select DAY_DT
from "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_DAY_D_Common"
where DAY_DT = CURRENT_DATE

If your requirement is to allow future data access up until one year ahead of 'today', then you need to change the SQL to:

select CAST(TIMESTAMPADD(SQL_TSI_YEAR, 1, DAY_DT) AS DATE) 
from "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_DAY_D_Common" 
where DAY_DT = CURRENT_DATE

If you wish to change the default behaviour of the Initialization Block based on the authenticated user, then you will need to change the delivered SQL for Initialization Block "HR_WRKFC_MAX_EFFECTIVE_DT" to something like this:

select case when ':USER' in ('A','B','C') then 
      CAST(TIMESTAMPADD(SQL_TSI_YEAR, 1, DAY_DT) AS DATE) 
     else 
      DAY_DT 
     end 
from "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_DAY_D_Common"
where DAY_DT = CURRENT_DATE

The above represents a use case where users A, B and C get future data access up to one year ahead of "today", whereas all other users have no future data access.

Regardless of the use case, depending on what the variable HR_WRKFC_MAX_EFFECTIVE_DT returns, the second Initialization Block "HR - Future-dated Data Date (WIDs)" returns the appropriate values for the four dependent variables. The SQL is as follows (for your information only; no change is needed):

select
CAST(ROW_WID AS INT)        AS HR_WRKFC_MAX_EFFECTIVE_DT_WID,
CAST(CAL_MONTH_WID AS INT)  AS HR_WRKFC_MAX_EFFECTIVE_DT_MWID,
CAST(CAL_QTR_WID AS INT)    AS HR_WRKFC_MAX_EFFECTIVE_DT_QWID,
CAST(CAL_YEAR_WID AS INT)   AS HR_WRKFC_MAX_EFFECTIVE_DT_YWID
from "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_DAY_D_Common"
where DAY_DT = CAST(VALUEOF(NQ_SESSION.HR_WRKFC_MAX_EFFECTIVE_DT) AS DATE)

The exact SQL will depend on the user/role requirements.

Note: Future-dated security is applied only to HR facts, not dimensions, at the time of writing.

Dependency

Future data restriction is available for the following Logical Facts:

  • Fact - HR - Workforce - Balance Information

  • Fact - HR - Workforce - Event Information

  • Fact - HR - Recruitment Event Information

  • Fact - HR - Workforce Gains and Losses - Event Information

  • Fact - HR - Workforce Gains and Losses - Balance Information

  • Fact - HR - Payroll Balance Detail

  • Fact - HR - Payroll Balance Summary

  • Fact - HR - Accrual Transactions - Balance Information

  • Fact - HR - Accrual Transactions - Event Information

B.2.38 How to Set Up Manager Hierarchy Based Security for PeopleSoft

Note that although the task title mentions the PeopleSoft adaptor, this task applies to all AU customers who implement manager or resource hierarchy-based security.

Manager/Resource hierarchy-based security in PS1 is implemented by using the initialization block Manager Hierarchy Level and one or more of the initialization blocks HR Security Person ID List (Fusion), HR Security Person ID List (Siebel), HR Security Person ID List (EBS), and HR Security Person ID List (PeopleSoft) that correspond to the adaptors of your choice. The initialization block Manager Hierarchy Level depends on those HR Security Person ID List initialization blocks.

In the security implementation, you must first identify how many of these HR Security Person ID List initialization blocks you have in your BI metadata repository. Note that not all of them may exist in your RPDs depending on the specific Oracle BI Applications products (for example, CRM, HCM, etc) that you are using. In what follows, we assume that all four of them exist in your RPD (but you may just see a subset of them in reality).

This screenshot is described in surrounding text.

By default (that is, on installation), the initialization block HR Security Person ID List (Fusion) is disabled. As a security best practice, you should disable unused initialization blocks. If unused initialization blocks are not disabled, they will still run to populate their corresponding variables. Because different AU adaptors have different data structures and formats for storing employee information, this might in rare cases lead to more than one eligible employee login ID value being used in Manager Hierarchy Level, which would in turn impact the security setting.

Specifically:

  • If you are implementing EBS only, then only HR Security Person ID List (EBS) must be enabled, while HR Security Person ID List (Siebel) and HR Security Person ID List (PeopleSoft) must be disabled.

  • If you are implementing PeopleSoft only, then only HR Security Person ID List (PeopleSoft) must be enabled, while HR Security Person ID List (EBS) and HR Security Person ID List (Siebel) must be disabled.

  • If you are implementing Siebel only, then only HR Security Person ID List (Siebel) must be enabled, while HR Security Person ID List (EBS) and HR Security Person ID List (PeopleSoft) must be disabled.

  • If you are implementing both EBS and PeopleSoft, then HR Security Person ID List (EBS) and HR Security Person ID List (PeopleSoft) must be enabled, while HR Security Person ID List (Siebel) must be disabled.

  • If you are implementing both EBS and Siebel, then HR Security Person ID List (EBS) and HR Security Person ID List (Siebel) must be enabled, while HR Security Person ID List (PeopleSoft) must be disabled.

  • If you are implementing both Siebel and PeopleSoft, then HR Security Person ID List (Siebel) and HR Security Person ID List (PeopleSoft) must be enabled, while HR Security Person ID List (EBS) must be disabled.

  • If you are implementing EBS, Siebel, and PeopleSoft, then HR Security Person ID List (EBS), HR Security Person ID List (Siebel), and HR Security Person ID List (PeopleSoft) must all be enabled.

The screen shot below shows an example using HR Security Person ID List (PeopleSoft).

This screenshot is described in surrounding text.

Click OK after making this setting.

Do the same for the initialization blocks corresponding to the other AU adaptors that you want to enable or disable.

Save your changes.

B.2.39 How To Configure Quarters for a Fiscal Time Dimension

Oracle's JD Edwards EnterpriseOne does not have a concept of defining the quarters for a fiscal pattern or a fiscal year. Therefore, a configurable flat file is provided to populate quarter information. This configuration file enables you to feed quarter information such as Quarter Number for each period, Quarter Start Date, and Quarter End Date.

For information about how to configure this flat file, see Section B.2.98.1, "How to Configure the file_lkp_fiscal_period_Qtr_Config_jde.csv". Each fiscal pattern can have a varying number of periods as supported by the OLTP. Therefore, the quarter configuration is required for each fiscal year and for each fiscal pattern. The table below shows example values specified in the file file_lkp_fiscal_period_Qtr_Config_jde.csv.

Table B-20 Example of file_lkp_fiscal_period_Qtr_Config_jde.csv

Fiscal Pattern   Year   Period   QuarterNo   QuarterStart   QuarterEnd

F                4      1        1           6/1/2004       8/30/2004
F                4      2        1           6/1/2004       8/30/2004
F                4      3        1           6/1/2004       8/30/2004
F                4      4        2           9/1/2004       11/30/2004
F                4      5        2           9/1/2004       11/30/2004
F                4      6        2           9/1/2004       11/30/2004
F                4      7        3           12/1/2004      2/28/2005
F                4      8        3           12/1/2004      2/28/2005
F                4      9        3           12/1/2004      2/28/2005
F                4      10       4           3/1/2005       3/31/2005
F                4      11       4           3/1/2005       3/31/2005
F                4      12       4           3/1/2005       3/31/2005
F                4      13       4           3/1/2005       3/31/2005
F                4      14       4           3/1/2005       3/31/2005


For each fiscal year in the F0008 table, you must define the quarters for each fiscal period. The quarter information is used in the calculation of aggregates by quarter. The W_MCAL_CONTEXT_G table in the Oracle Business Analytics Warehouse stores calendars associated with the ORG ID, Ledger ID, and Operating Unit columns. In Oracle's JD Edwards EnterpriseOne, the fiscal date patterns are associated with the company, which forms the ORG_ID and LEDGER_ID.

The W_MCAL_CAL_D table stores the calendar information. Every distinct Fiscal Date Pattern stored in the Fiscal Date Pattern table (F0008) has an entry in this table. The grain of this dimension is the Date Pattern Type, which identifies the Calendar in the Oracle Business Analytics Warehouse. This dimension does not have an association with the Fiscal year for that pattern. The MCAL_CAL_WID column is a four digit number that is reset to 1000 each time the ETL is run and incremented by one for each date pattern type stored in W_MCAL_CAL_D.

B.2.40 How to Map GL Accounts to Group Account Numbers for JD Edwards EnterpriseOne

This section explains how to map General Ledger Accounts to Group Account Numbers.

Note:

It is critical that the GL account numbers are mapped to the group account numbers (or domain values) because the metrics in the GL reporting layer use these values. For a list of domain values for GL account numbers, see Oracle Business Analytics Warehouse Data Model Reference.

B.2.40.1 Overview of Mapping GL Accounts to Group Account Numbers

Group Account Number Configuration is an important step in the configuration of Financial Analytics, as it determines the accuracy of the majority of metrics in the General Ledger and Profitability module. Group Accounts in combination with Financial Statement Item Codes are also leveraged in the GL reconciliation process, to ensure that subledger data reconciles with GL journal entries. This topic is discussed in more detail later in this section.

You can categorize your General Ledger accounts into specific group account numbers. The GROUP_ACCT_NUM field denotes the nature of the General Ledger accounts.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  • file_group_acct_codes_jde.csv - this file maps General Ledger accounts to group account codes.

The associations in this file are used in conjunction with the values defined for the following Domains:

  • W_GL_GROUP_ACCOUNT

  • W_GL_ACCT_CATEGORY

  • W_FIN_STMT

These Domain values and the mappings between them classify accounts into sub-groups, like Revenue and Cost of Goods Sold, as well as dividing accounts between Balance Sheet and Profit and Loss. Before you load your data, you must ensure that the account values are mapped consistently across these three collections. In particular, the GROUP_ACCOUNT_NUM domain that is specified in Oracle BI Applications Configuration Manager must contain valid members of the W_GL_GROUP_ACCOUNT Domain. Those values, in turn, are mapped to members of the W_GL_ACCT_CATEGORY and W_FIN_STMT Domains.

You can categorize the General Ledger accounts in Oracle's JD Edwards EnterpriseOne into specific group account numbers. The group account number is used during data extraction as well as front-end reporting.

The GROUP_ACCT_NUM field in the GL Account dimension table W_GL_ACCOUNT_D denotes the nature of the General Ledger accounts (for example, Cash account, AR account, Long Term Debt account Payroll account). For a list of the Group Account Number domain values, see Oracle Business Analytics Warehouse Data Model Reference.

The mappings to General Ledger Accounts Numbers are important for both Profitability analysis and General Ledger analysis (for example, Balance Sheets).

Using the file_group_acct_codes_jde.csv file, you can specify which group account (among the available group accounts) an object account is associated with. The Company column in this CSV file is the actual company to which the object account belongs.

In addition to the From Account and To Account range, the system uses the incoming company as a parameter for the association. If the incoming company has not been configured in the group account flat file, the system inserts 00000 as the default value for Company for lookups. You can choose to not configure group accounts for any company other than 00000 if you are using a single global chart of accounts. However, if you configure group accounts for additional companies, you must configure all possible From Account and To Account ranges for these companies. In addition, you must always configure the entire range of accounts for company 00000.

Table B-21 below shows example values specified in the file file_group_acct_codes_jde.csv.

Table B-21 Example of file_group_acct_codes_jde.csv

COMPANY   FROM ACCT   TO ACCT   GROUP_ACCT_NUM

00000     4100        4190      AP
00000     1200        1299      AR
00000     2120        2195      ACC DEPCN
00000     4200        4211      ACC LIAB
00000     1100        1121      CASH
00000     4900        4910      CMMN STOCK
00000     1401        1469      FG INV
00000     3990        3990      GOODWILL
00000     4690        4690      LT DEBT
00000     3900        3940      OTHER ASSET
00000     1310        1400      OTHER CA
00000     4212        4550      OTHER CL
00000     4950        4950      OTHER EQUITY
00000     4610        4685      OTHER LIAB


The Domain mapping from W_GL_GROUP_ACCOUNT to W_FIN_STMT specifies the relationship between a group account number and a Financial Statement Item code.

Table B-22 shows the Financial Statement Item codes to which Group Account Numbers must map, and their associated base fact tables.

Table B-22 Financial Statement Item Codes and Associated Base Fact Tables

Financial Statement Item Codes Base Fact Tables

AP

AP base fact (W_AP_XACT_F)

AR

AR base fact (W_AR_XACT_F)

COGS

Cost of Goods Sold base fact (W_GL_COGS_F)

REVENUE

Revenue base fact (W_GL_REVN_F)

OTHERS

GL Journal base fact (W_GL_OTHER_F)


By mapping your GL accounts against the group account numbers and then associating the group account number to a Financial Statement Item code, you have indirectly associated the GL account numbers to Financial Statement Item codes as well.
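
After the warehouse is loaded, you can spot-check this mapping. For example, a query along the following lines lists accounts that did not pick up a group account number (the GL_ACCOUNT_NUM column name is an assumption; use the account identifier column of your W_GL_ACCOUNT_D table):

SELECT gl_account_num,
       group_acct_num
FROM   w_gl_account_d
WHERE  group_acct_num IS NULL;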

B.2.40.2 How to Map GL Account Numbers to Group Account Numbers

This section explains how to map General Ledger Account Numbers to Group Account Numbers.

Note:

If you add new Group Account Numbers to the file_group_acct_codes_<source system type>.csv file, you must also add metrics to the BI metadata repository (that is, the RPD file). See Section B.2.18.3, "Example of Adding Group Account Number Metrics to the Oracle BI Repository" for more information.

To map GL account numbers to group account numbers:

  1. Edit the file_group_acct_codes_jde.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. For each GL account number that you want to map, create a new row in the file containing the following fields:

    Field Name Description

    COMPANY

    The ID of the COMPANY.

    FROM ACCT

    The lower limit of the natural account range. This is based on the natural account segment of your GL accounts.

    TO ACCT

    The higher limit of the natural account range. This is based on the natural account segment of your GL accounts.

    GROUP_ACCT_NUM

    This field denotes the group account number of the General Ledger account, as specified in a domain in Oracle BI Applications Configuration Manager. For example, 'AP' for Accounts Payables, 'CASH' for cash account, 'GEN PAYROLL' for payroll account, and so on.


    For example:

    1000, 1110, 1110, CASH
    1000, 1210, 1210, AR
    1000, 1220, 1220, AR
    

    Note:

    You can optionally remove the unused rows in the CSV file.

  3. Ensure that the values that you specify in the file_group_acct_codes_jde.csv file are consistent with the domain members of Group Account (W_GL_GROUP_ACCOUNT).

    In particular, the GROUP_ACCOUNT_NUM field in file_group_acct_names.csv must contain valid members of the W_GL_GROUP_ACCOUNT Domain. Those values, in turn, are mapped to members of the W_GL_ACCT_CATEGORY and W_FIN_STMT Domains.

  4. Save and close the CSV file.

B.2.41 How to Configure Incremental Extract for Projects Facts for PeopleSoft

This section explains how to configure incremental extract for project facts for PeopleSoft. There are two ways to configure incremental extract: using database triggers (see Section B.2.41.1) or using materialized views (see Section B.2.41.2).

This section explains both methods. It does not cover the advantages or disadvantages of each method, or advise how to choose which method to adopt.

B.2.41.1 How to Configure Incremental extract for projects facts for PeopleSoft Using DB Triggers

This section describes an incremental extract solution, based on database triggers, that mitigates the performance issues of incremental ETL from PeopleSoft for the Project Budget, Cost, Revenue, Commitment, Forecast, Cross Charge, and Retention facts.

To configure incremental extract for Project Facts Using Database Triggers:

  1. Read the Overview, which includes deploying the appropriate SQL code in your source system (for more information, see Section B.2.41.1.1, "Overview").

  2. Make updates in Oracle BI Applications Configuration Manager (for more information, see Section B.2.41.1.2, "Updates in Oracle BI Applications Configuration Manager").

  3. Make updates in Oracle Data Integrator (for more information, see Section B.2.41.1.3, "Updates in Oracle Data Integrator").

  4. Modify the Temporary Interfaces (for more information, see Section B.2.41.1.4, "Modify Temporary Interfaces").

  5. Modify the Load Plan (for more information, see Section B.2.41.1.5, "Modify Load Plan").

  6. Modify the Metadata Repository (for more information, see Section B.2.41.1.6, "Metadata Repository (RPD) Changes").

B.2.41.1.1 Overview

Before you start, read this overview.

Supported Versions

Supported DB: Oracle / SQL Server

Supported Oracle BI Applications releases: 11.1.1.7.1 (PS1) onwards

Supported Apps releases: PeopleSoft 8.9 onwards

Overview

The PeopleSoft ESA application does not populate the DTTM_STAMP column in PS_PROJ_RESOURCE correctly; this restricts the ability to devise incremental load logic around this column, which leads to a performance overhead while loading the Project Budget, Forecast, Commitment, Cross Charge, Cost, Retention and Revenue facts.

This section outlines the steps to facilitate changed data capture (CDC) for the PeopleSoft OLTP based on database triggers. Note: By default, CDC using GoldenGate and ODI is supported. If you do not have a license for GoldenGate, then the solution outlined here can be followed for CDC and incremental loading of the relevant project fact tables.

Note: This approach is only supported for Oracle and SQL Server databases.

This section lists the code to be deployed in the source system (PeopleSoft applications), the ODI XML files which have to be imported to the ODI repository, and the RPD changes required to successfully implement this solution.

Summary

  • A Trigger on PS_PROJ_RESOURCE will be created in the OLTP which will insert PKs of changed rows into the PROJ_RESOURCE_UPD_AUD table (Refer to the Steps section for code).

  • A view (OBIEE_PS_PROJ_RESOURCE_VW) will be created over PS_PROJ_RESOURCE and the trigger table (PROJ_RESOURCE_UPD_AUD); this view is what will be used as the SDE fact extract source. (Refer to the Steps section for code.)

  • The deleted rows will be captured in the PE tables via ETL from the trigger table (PROJ_RESOURCE_UPD_AUD) filtered on update_type = 'D'.

  • The rest of the delete strategy interfaces will also be updated to properly handle the soft delete logic.

  • In the ODI model layer, the resource name for the PS_PROJ_RESOURCE object will be replaced by the view OBIEE_PS_PROJ_RESOURCE_VW.

  • The deleted rows will be captured in the <fact>_DEL tables via ETL from the OBIEE_PS_PROJ_RESOURCE_DEL_VW.

  • The SIL fact interface will properly handle the soft delete logic once we set the values of variables to:

    - SOFT_DELETE_PREPROCESS = 'N' (This will not populate the <fact>_PE table)

    - SOFTDELETE_FEATURE_ENABLED = 'Y'

Summary of steps:

  1. Create the trigger and db objects.

  2. Run the full ETL.

  3. Modify the data in OLTP.

  4. Run the SDE fact extracts.

  5. Run the SIL fact loads.

  6. Run the soft delete ETLs.

Assumptions

  • There will be some performance impact on the OLTP application due to the presence of the trigger.

  • Actual deletes from PS_PROJ_RESOURCE will be treated as Soft Delete in Oracle Business Analytics Warehouse.

  • Important: You need to truncate the trigger table PROJ_RESOURCE_UPD_AUD from time to time (say, every week) to ensure that it does not grow so large that it begins to impact ETL run times; see the example after this list.
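
A minimal housekeeping statement for this is shown below. Run it directly against the OLTP schema, and only after the changed rows it contains have been processed by the ETL, so that no pending changes are lost.

TRUNCATE TABLE PROJ_RESOURCE_UPD_AUD;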

Database Changes Required for an Oracle Database

If your source OLTP database is an Oracle database instance, then execute the SQL script in the file psft_orcl_trigger.txt, which is located in the installation folder <Oracle Home for BI>/biapps/etl/src_specific/PSFT/oracle.

For example:

/*ORACLE SCRIPT TO IMPLEMENT INCREMENTAL SOLUTION FOR PEOPLESOFT ADAPTOR FOR OBIA PROJECT ANALYTICS */
DROP TABLE PROJ_RESOURCE_UPD_AUD;
/
CREATE TABLE PROJ_RESOURCE_UPD_AUD(
        ROW_WID number(10),
BUSINESS_UNIT varchar2(5) NULL,
PROJECT_ID varchar2(15) NULL,
...
...
And so on.

Database Changes Required for an MS SQL Database

If your source OLTP database is an MS SQL Server database instance, then execute the SQL script in the file psft_mssql_trigger.txt, which is located in the installation folder <Oracle Home for BI>/biapps/etl/src_specific/PSFT/ms_sql_server.

For example:

/*MSSQL SCRIPT TO IMPLEMENT INCREMENTAL SOLUTION FOR PEOPLESOFT ADAPTOR FOR OBIA PROJECT ANALYTICS */

/* Replace <DB> with the actual schema name */

USE <DB>

DROP TABLE PROJ_RESOURCE_UPD_AUD;

CREATE TABLE PROJ_RESOURCE_UPD_AUD(
ROW_WID INT IDENTITY(1,1) PRIMARY KEY,
...
...
And so on.
B.2.41.1.2 Updates in Oracle BI Applications Configuration Manager

In Oracle BI Applications Configuration Manager, make the following changes:

  1. Set the value of SOFT_DELETE_PREPROCESS to 'N'.

  2. Set the value of SOFTDELETE_FEATURE_ENABLED to 'Y' for the Fact Group Level variables.

B.2.41.1.3 Updates in Oracle Data Integrator

Make the following changes:

  1. In the ODI model layer, replace the resource name for the PS_PROJ_RESOURCE object with the view OBIEE_PS_PROJ_RESOURCE_VW.

    If you are using PeopleSoft 9.0, then follow this navigation; otherwise, navigate to the folder for your PeopleSoft version. That is, in ODI Designer Navigator, navigate to Models, then Peoplesoft 9.0, then peoplesoft 9.0 FNSCM, then FPC-Projects, then open the object PROJ_RESOURCE and change the 'Resource Name' to OBIEE_PS_PROJ_RESOURCE_VW.

    This screenshot is described in surrounding text.
    This screenshot is described in surrounding text.
  2. Add column LAST_UPDATE_DT TIMESTAMP to the Object PROJ_RESOURCE in ODI model.

  3. Click Save.

  4. Regenerate the appropriate SDE scenarios for all the facts mentioned above.

    That is, in ODI Designer Navigator, navigate to Projects, then Mapping, then SDE_PSFT_90_Folder, navigate to the fact folder (for example, SDE_PSFT_ProjectCostLineFact), then Packages, then Scenarios, then right-click the scenario name and select Regenerate.

B.2.41.1.4 Modify Temporary Interfaces

Modify temporary interfaces, as follows:

  1. In ODI temporary interfaces the mapping for CHANGED_ON_DT needs to be changed to include the LAST_UPDATE_DT column in the code.

  2. For example, consider the temporary interface for Cost Fact:

    SDE_PSFT_ProjectCostLineFact.W_PROJ_COST_LINE_FS_SQ_PROJ_RESOURCE  
    
  3. The CHANGED_ON_DT is mapped with:

    RUN_REPLICATED_TRANSACTIONAL('#IS_SDS_DEPLOYED',PS_PROJ_RESOURCE.DTTM_STAMP,PS_PROJ_RESOURCE.CDC$_SRC_LAST_UPDATE_DATE)
    
  4. Change the update date:

    RUN_REPLICATED_TRANSACTIONAL('#IS_SDS_DEPLOYED',COALESCE(PS_PROJ_RESOURCE.LAST_UPDATE_DT,PS_PROJ_RESOURCE.DTTM_STAMP),PS_PROJ_RESOURCE.CDC$_SRC_LAST_UPDATE_DATE)
    
  5. Change the incremental filter, and replace PS_PROJ_RESOURCE.DTTM_STAMP with COALESCE(PS_PROJ_RESOURCE.LAST_UPDATE_DT,PS_PROJ_RESOURCE.DTTM_STAMP).

  6. Regenerate scenarios.

    Repeat the above steps for the remaining Budget, Forecast, Commitment, Cross Charge, Retention and Revenue facts.

    This screenshot is described in surrounding text.
    This screenshot is described in surrounding text.
    This screenshot is described in surrounding text.
    This screenshot is described in surrounding text.
B.2.41.1.5 Modify Load Plan

Modify your Load Plan, as follows:

The Interface to load the Deleted rows needs to be added to the Load Plan, as follows:

  1. In ODI Designer Navigator, navigate to Load Plans and Scenarios, then Your Generated Load Plan and open it.

  2. In the Steps tab, navigate to the '1 SDE Extract' step, then '2 SDE Fact Group', then 'Parallel (Persisted/Temporary Staging Table)', then '3 SDE PS PROJECT'.

  3. Create a Serial Step "Projects_Identify_Deletes" under the root step and add the identify delete fact scenarios for Project Cost, Budget, Forecast, Commitment, Cross Charge, Retention and Revenue facts steps as shown in the screenshot below.

  4. Add the scenarios that correspond to your Load Plan. For instance, scenarios with the prefix SDE_PSFT_90_ADAPTOR should be added for a PeopleSoft 9.0 Load Plan, and ones with the prefix SDE_PSFT_91_ADAPTOR should be added for a PeopleSoft 9.1 Load Plan.

    These tasks have to be set to 'Restart from Failure'.

  5. Click these newly added tasks and edit them to use the '-1' version, as shown in the screenshot.

    This is necessary to ensure that the latest scenario is used in case there are multiple scenarios.

  6. Save the Load Plan.

    This screenshot is described in surrounding text.
    This screenshot is described in surrounding text.
    This screenshot is described in surrounding text.
B.2.41.1.6 Metadata Repository (RPD) Changes

Change the BMM filters for the base facts to filter out deleted records. It is recommended that these changes are made in offline mode.

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. In the BMM layer, go to 'Fact – Project Budget'.

  3. Expand this logical fact to view its logical table sources (LTSs).

  4. Add a filter (Delete Flg = 'N') to the Fact_W_PROJ_BUDGET_F_Budget_Fact LTS, as shown in the screenshot below.

  5. Add a similar filter (Delete Flg = 'N') for the Budget Fact ITD LTS.

    This screenshot is described in surrounding text.

    Similarly, modify for Cost/Forecast/Commitment/Cross charge/Revenue/Retention facts.

B.2.41.2 How to Configure Incremental extract for projects facts for PeopleSoft Using Materialized Views

The PeopleSoft ESA application does not populate the DTTM_STAMP column in PS_PROJ_RESOURCE correctly; this restricts the ability to devise incremental load logic around this column, which leads to a performance overhead while loading the Project Budget, Forecast, Commitment, Cross Charge, Cost and Revenue facts.

To configure incremental extract for Project Facts Using Materialized Views:

  1. Read the Overview, which includes deploying the appropriate SQL code in your source system (for more information, see Section B.2.41.2.1, "Overview").

  2. Make updates in Oracle BI Applications Configuration Manager (for more information, see Section B.2.41.2.2, "Updates in Oracle BI Applications Configuration Manager").

  3. Make updates in Oracle Data Integrator (for more information, see Section B.2.41.2.3, "Updates in Oracle Data Integrator").

  4. Modify the Temporary Interfaces (for more information, see Section B.2.41.2.4, "Modify Temporary Interfaces").

  5. Modify the Load Plan (for more information, see Section B.2.41.2.5, "Modify Load Plan").

  6. Modify the Metadata Repository (for more information, see Section B.2.41.2.6, "Metadata Repository (RPD) Changes").

B.2.41.2.1 Overview

This section describes the steps to facilitate changed data capture (CDC) for PeopleSoft OLTP based on fast refresh of Materialized View (using MV log). By default, CDC using GoldenGate and ODI is supported. If you do not have a license for GoldenGate, then the solution outlined here can be followed for CDC and incremental loading of the relevant project fact tables.

Note: This approach is only supported for Oracle databases.

This section lists the code to be deployed in the source system (PeopleSoft applications), the ODI XML files which have to be imported to the ODI repository, and the RPD changes required to successfully implement this solution.

Supported Versions

Supported DB: Oracle Database 10.2.0.4 with RDBMS patch 9580103, or Oracle Database 11g onwards

Supported Oracle BI Applications releases: 11.1.1.7.1 (PS1) onwards

Supported Apps releases: PeopleSoft 8.9 onwards

Summary

  • An MV log on PS_PROJ_RESOURCE will be created in the OLTP (Refer to the Steps section for code).

  • A PK constraint based on the unique index on PS_PROJ_RESOURCE will need to be created.

  • An MV will be created on PS_PROJ_RESOURCE with an additional column of LAST_UPDATE_DT which will be populated based on the sysdate. This new field will be indexed. (Refer to the Steps section for code).

  • A view will be created on the MV, and this view is what will be used as the SDE fact extract source. (Refer to the Steps section for code.)

  • A one-time complete refresh of the MV will be done.

  • Prior to the daily ETL run, the MV will be fast refreshed. The MV refresh will be integrated in the Load Plan; for details, refer to the Steps section below. Note: Oracle automatically purges the MV log once a fast refresh is run. Since the MV log can grow substantially, it is recommended to run the ETL daily so that the MV fast refresh is quick.

  • In the ODI model layer the resource name for the object PS_PROJ_RESOURCE table will be replaced by the View on MV (OBIEE_PS_PROJ_RESOURCE_VW).

    The deleted rows will be captured in the <fact>_DEL tables via ETL from the MV log filtered on DMLTYPE$$ = 'D'. This ETL will be run prior to the fast refresh as the data will be truncated otherwise.

  • The SIL fact interface will properly handle the soft delete logic once we set the values of variables to:

    - SOFT_DELETE_PREPROCESS = 'N' (This will not populate the <fact>_PE table)

    - SOFTDELETE_FEATURE_ENABLED = 'Y'

Summary of steps:

  1. Create the MV log/MV/View.

  2. Do a one-time complete refresh of MV.

  3. Run the full ETL.

  4. Modify the data in OLTP.

  5. Run the Primary delete capture ETLs.

  6. Fast refresh the MV (MV log gets truncated automatically).

  7. Run the SDE fact extracts.

  8. Run the SIL fact loads.

Assumptions

  • There will be some performance impact on the OLTP application due to the presence of the MV log, and the MV log can grow large (slowing the refresh) if the MV is not refreshed frequently. Oracle recommends refreshing the MV on a daily basis to avoid this problem.

  • This solution requires Oracle RDBMS version 10.2.0.4 with patch (RDBMS patch 9580103) or version 11.1.2.0 or above. If Oracle database behavior when updating a Materialized view based on a prebuilt table changes, this solution may need to be revisited.

  • If another MV is created using the same MV log (for whatever reason), then all dependent MVs must be refreshed before the log is purged.

  • Actual deletes from PS_PROJ_RESOURCE will be treated as Soft Delete in Oracle Business Analytics Warehouse.

Database Changes

  1. Run the following statements in the OLTP database instance, in the order specified:

    ALTER TABLE PS_PROJ_RESOURCE ADD CONSTRAINT PS_PROJ_RESOURCE_PK PRIMARY KEY (BUSINESS_UNIT,PROJECT_ID,ACTIVITY_ID,RESOURCE_ID) USING INDEX PS_PROJ_RESOURCE;
     
    CREATE MATERIALIZED VIEW LOG ON PS_PROJ_RESOURCE NOCACHE LOGGING NOPARALLEL  WITH SEQUENCE;
     
    CREATE TABLE OBIEE_PS_PROJ_RESOURCE_MV AS SELECT * FROM PS_PROJ_RESOURCE WHERE 1=2;
     
    ALTER TABLE OBIEE_PS_PROJ_RESOURCE_MV ADD (LAST_UPDATE_DT DATE DEFAULT SYSDATE);
     
    CREATE MATERIALIZED VIEW OBIEE_PS_PROJ_RESOURCE_MV ON PREBUILT TABLE  REFRESH FAST  ON DEMAND AS SELECT * FROM PS_PROJ_RESOURCE;
     
    CREATE VIEW OBIEE_PS_PROJ_RESOURCE_VW AS SELECT * FROM OBIEE_PS_PROJ_RESOURCE_MV;
     
    CREATE VIEW OBIEE_PS_PROJ_RESOURCE_DEL_VW AS SELECT  business_unit,
            project_id,
            activity_id,
            resource_id
    FROM   (SELECT business_unit,
                   project_id,
                   activity_id,
                   resource_id,
                   dmltype$$,
                   CASE
                     WHEN sequence$$ = MAX(sequence$$) over (PARTITION BY
                                       business_unit,
                                       project_id,
                                       activity_id,
                                       resource_id ) THEN sequence$$
                     ELSE NULL
                   END                                 AS sequence$$
            FROM   mlog$_ps_proj_resource)   
    WHERE  sequence$$ IS NOT NULL
    And dmltype$$ ='D';
    
  2. Refresh the Materialized view.

    Perform a one-time complete refresh, for example, using the command EXECUTE DBMS_MVIEW.REFRESH('OBIEE_PS_PROJ_RESOURCE_MV', 'C');. This enables the MV to be fast refreshed during subsequent incremental runs.

  3. Determine which indexes to create on the MV by looking at the extract SQL and running a query plan.

    For example, an index is required on the LAST_UPDATE_DT field and a unique index on the PK fields, as in the sketch below.
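
    The following is a minimal sketch of such indexes. The index names are hypothetical; adjust the names and any storage clauses to your own standards, and confirm the choice of indexes against your actual query plans:

    -- Index to support the incremental filter on LAST_UPDATE_DT (hypothetical name)
    CREATE INDEX OBIEE_PS_PROJ_RESOURCE_MV_N1
      ON OBIEE_PS_PROJ_RESOURCE_MV (LAST_UPDATE_DT);

    -- Unique index on the primary key columns (hypothetical name)
    CREATE UNIQUE INDEX OBIEE_PS_PROJ_RESOURCE_MV_U1
      ON OBIEE_PS_PROJ_RESOURCE_MV (BUSINESS_UNIT, PROJECT_ID, ACTIVITY_ID, RESOURCE_ID);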

B.2.41.2.2 Updates in Oracle BI Applications Configuration Manager

In Oracle BI Applications Configuration Manager, make the following changes:

  1. Set the value of SOFT_DELETE_PREPROCESS to 'N'.

  2. Set the value of SOFTDELETE_FEATURE_ENABLED to 'Y' for the Fact Group Level variables.

B.2.41.2.3 Updates in Oracle Data Integrator

The best option to maintain up-to-date custom MVs is to merge their refresh into the ODI Load Plan. The following PL/SQL call performs a fast refresh of OBIEE_PS_PROJ_RESOURCE_MV:

BEGIN
  DBMS_MVIEW.REFRESH('OBIEE_PS_PROJ_RESOURCE_MV', 'F');
END;

In ODI Designer Navigator, make the following changes:

  1. In the ODI model layer, the resource name for the PROJ_RESOURCE object (the PS_PROJ_RESOURCE table) must be replaced with the view OBIEE_PS_PROJ_RESOURCE_VW.

    The following navigation applies to PeopleSoft 9.0; if you are using a different PeopleSoft release, navigate to the equivalent folder for that release. In ODI Designer Navigator, navigate to Models, then Peoplesoft 9.0, then Peoplesoft 9.0 FNSCM, then FPC-Projects, then open the object PROJ_RESOURCE and change the 'Resource Name' to OBIEE_PS_PROJ_RESOURCE_VW.

  2. Add the column LAST_UPDATE_DT (TIMESTAMP) to the object PROJ_RESOURCE in the ODI model.

  3. Click Save.

  4. Regenerate the appropriate SDE scenarios for all the facts mentioned above.

    That is, in ODI Designer Navigator, navigate to Projects, then Mapping, then SDE_PSFT_90_Folder, then the fact folder (for example, SDE_PSFT_ProjectCostLineFact), then Packages, then Scenarios, then right-click the scenario name and select Regenerate.

B.2.41.2.4 Modify Temporary Interfaces

Modify temporary interfaces, as follows:

  1. In the ODI temporary interfaces, the mapping for CHANGED_ON_DT needs to be changed to include the LAST_UPDATE_DT column.

  2. For example, consider the temporary interface for Cost Fact:

    SDE_PSFT_ProjectCostLineFact.W_PROJ_COST_LINE_FS_SQ_PROJ_RESOURCE  
    
  3. The CHANGED_ON_DT is mapped with:

    RUN_REPLICATED_TRANSACTIONAL('#IS_SDS_DEPLOYED',PS_PROJ_RESOURCE.DTTM_STAMP,PS_PROJ_RESOURCE.CDC$_SRC_LAST_UPDATE_DATE)
    
  4. Change the mapping to use the new update date, as follows:

    RUN_REPLICATED_TRANSACTIONAL('#IS_SDS_DEPLOYED',COALESCE(PS_PROJ_RESOURCE.LAST_UPDATE_DT,PS_PROJ_RESOURCE.DTTM_STAMP),PS_PROJ_RESOURCE.CDC$_SRC_LAST_UPDATE_DATE)
    
  5. Change the incremental filter, and replace PS_PROJ_RESOURCE.DTTM_STAMP with COALESCE(PS_PROJ_RESOURCE.LAST_UPDATE_DT,PS_PROJ_RESOURCE.DTTM_STAMP).

  6. Regenerate scenarios.

    Repeat the above steps for the remaining Budget, Forecast, Commitment, Cross Charge, Retention and Revenue facts.

B.2.41.2.5 Modify Load Plan

Modify your Load Plan, as follows:

The Interface to load the Deleted rows needs to be added to the Load Plan.

  1. In ODI Designer Navigator, navigate to Load Plans and Scenarios, then your Generated Load Plan, and open it.

  2. In the Steps tab, navigate to the '1 SDE Extract' step, then '2 SDE Fact Group', then 'Parallel (Persisted/Temporary Staging Table)', then '3 SDE PS PROJECT'.

  3. Create a Serial Step named "Projects_Identify_Deletes" under the root step and add the identify-delete scenarios for the Project Cost, Budget, Forecast, Commitment, Cross Charge, Retention, and Revenue facts.

  4. Add the scenarios that correspond to your Load Plan. For example, scenarios with the prefix SDE_PSFT_90_ADAPTOR should be added to a PeopleSoft 9.0 Load Plan, and scenarios with the prefix SDE_PSFT_91_ADAPTOR should be added to a PeopleSoft 9.1 Load Plan.

    Set these tasks to 'Restart from Failure'.

  5. Click each newly added task and edit it to use the '-1' version.

    This ensures that the latest scenario is used if there are multiple scenario versions.

  6. Save the Load Plan.

B.2.41.2.6 Metadata Repository (RPD) Changes

Change the BMM filters for the base facts to filter out deleted records. It is recommended that these changes be made in offline mode.

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. In the BMM layer, go to Fact – Project Budget.

  3. Expand the logical fact to view its logical table sources (LTSs).

  4. Add the filter Delete Flg = 'N' to the Fact_W_PROJ_BUDGET_F_Budget_Fact LTS.

  5. Add a similar filter for Budget Fact ITD LTS (Delete Flg = 'N').


    Similarly, modify for Cost/Forecast/Commitment/Cross charge/Revenue/Retention facts.

B.2.42 How to Set Up Project Cost and Control Security for PeopleSoft

Overview

Oracle Project Analytics supports security over the following dimensions in Project Costing and Project Control subject areas.

Table B-23 Supported Project Costing and Project Control subject areas

Security Entity              Cost    Commitment    Budget    Forecast
Project Business Unit        Y       Y             Y         Y
Project Organization         N       N             N         N
Expenditure Business Unit    N       N             N         N
Contract Business Unit       N       N             N         N
Project                      Y       Y             Y         Y
Resource Organization        N       N             N         N
Ledger                       N       N             N         N


Configuring Project Cost and Control Security For PeopleSoft

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

To enable data security for Project Cost and Control in PeopleSoft, enable the PeopleSoft data security initialization blocks listed below (based on your PeopleSoft security configuration) and make sure that the initialization blocks of all other source systems are disabled. If more than one source system is deployed, then you must also enable the initialization blocks of those deployed source systems.

About Data Security Configuration in PeopleSoft

In PeopleSoft, you access the security configuration pages for securing Project transactions by selecting Main Menu, then Set up Financials/Supply Chain, then Security, then Security Options.

B.2.42.1 Security by Business Unit

Init Blocks:

  • Project Business Unit List Budget PSFT

  • Project Business Unit List Costing PSFT

  • Project Business Unit List Forecast PSFT

  • Expenditure Business Unit List PSFT

If you are securing the Project data by Project/Expenditure Business Unit only, then follow the steps below to disable the Project dimension security:

  1. In Oracle Application Control, select Business Application Instance, then Application Roles, then Select the Oracle BI Applications Stripe, and query for the OBIA_PROJECT_DATA_SECURITY Application Role.

    Note that OBIA_PSFT_PROJECT_DATA_SECURITY is listed as one of the members.

  2. Remove OBIA_PSFT_PROJECT_DATA_SECURITY as a member of the OBIA_PROJECT_DATA_SECURITY Duty Role.

  3. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Action, then Synchronize Application Roles.

B.2.42.2 Security by Project

Init Blocks:

  • Project List Budget PSFT

  • Project List Costing PSFT

  • Project List Forecast PSFT

If you are securing the Project data by Project dimension only, then follow the steps below to disable the Project BU dimension security:

  1. Disable Project Business Unit Security, as follows:

    1. In Oracle Application Control, select Business Application Instance, then Application Roles, then Select the Oracle BI Applications Stripe, and query for the OBIA_PROJECT_BUSINESS_UNIT_DATA_SECURITY Application Role.

      Note that OBIA_PSFT_PROJECT_DATA_SECURITY is listed as one of the members.

    2. Remove OBIA_PSFT_PROJECT_DATA_SECURITY as a member of the OBIA_PROJECT_BUSINESS_UNIT_DATA_SECURITY Duty Role.

    3. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Action, then Synchronize Application Roles.

  2. Disable Expenditure Business Unit Security, as follows:

    1. In Oracle Application Control, select Business Application Instance, then Application Roles, then Select the Oracle BI Applications Stripe, and query for the OBIA_PROJECT_EXPENDITURE_BUSINESS_UNIT_DATA_SECURITY Application Role.

      Note that OBIA_PSFT_PROJECT_DATA_SECURITY is listed as one of the members.

    2. Remove OBIA_PSFT_PROJECT_DATA_SECURITY as a member of the OBIA_PROJECT_EXPENDITURE_BUSINESS_UNIT_DATA_SECURITY Duty Role.

    3. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Action, then Synchronize Application Roles.

B.2.42.3 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project Costing and Control subject area.

  • OBIA_PSFT_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_PSFT_PROJECT_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_PSFT_PROJECT_DATA_SECURITY

These duty roles control which subject areas and dashboard content users get access to, and also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.43 How to Set Up Project Billing and Revenue Security for PeopleSoft

Overview

Oracle Project Analytics supports security over the following dimensions in Project Billing and Revenue subject areas.

Table B-24 Supported Project Billing and Revenue subject areas

Security Entity              Billing  Revenue  Contract  Funding  Cross Charge - Receiver  Cross Charge - Provider  Cross Charge - Invoice  Cross Charge - Revenue
Project Business Unit        Y        Y        N         Y        Y                        N                        Y                       Y
Project Organization         N        N        N         N        N                        N                        N                       N
Expenditure Business Unit    N        N        N         N        N                        Y                        N                       N
Contract Business Unit       Y        Y        Y         Y        N                        N                        N                       Y
Project                      Y        Y        N         Y        Y                        Y                        Y                       Y
Resource Organization        N        N        N         N        N                        N                        N                       N
Ledger                       N        N        N         N        N                        N                        N                       N


Configuring Project Billing and Revenue Security for PeopleSoft

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

You must enable data security for Project Billing and Revenue in PeopleSoft based on your PeopleSoft security configuration. If security by Business Unit has been implemented, follow the Security by Business Unit section (and ignore the Security by Project section); if security by Project has been implemented, follow the Security by Project section (and ignore the Security by Business Unit section). In either case, enable the data security initialization blocks listed in the sections below. If only one source system is deployed, make sure that all Project Security initialization blocks for other adapters are disabled. If more than one source system is deployed, you must also enable the initialization blocks of those source systems.

About Data Security Configuration in PeopleSoft

In PeopleSoft, you access the security configuration pages for securing Project transactions by selecting Main Menu, then Set up Financials/Supply Chain, then Security, then Security Options.

Depending on your security configuration, you can use any supported combination of the Project Business Unit and Project dimensions. Based on that, you need to change the default installed configuration to match the OLTP security setup.

B.2.43.1 Security by Business Unit

Init Blocks:

  • Expenditure Business Unit List PSFT

  • Project Business Unit List Funding PSFT

  • Project Business Unit List Invoice PSFT

  • Project Business Unit List Revenue PSFT

  • Project Contract Business Unit List PSFT

  • Project Contract Business Unit List Invoice PSFT

  • Project Contract Business Unit List Revenue PSFT

If you are securing the Project data by Project/Expenditure/Contract Business Unit only, then follow the steps below to disable the Project dimension security:

  1. In Oracle Application Control, select Business Application Instance, then Application Roles, then Select the Oracle BI Applications Stripe, and query for the OBIA_PROJECT_DATA_SECURITY Application Role.

    Note that OBIA_PSFT_PROJECT_DATA_SECURITY is listed as one of the members.

  2. Remove OBIA_PSFT_PROJECT_DATA_SECURITY as a member of the OBIA_PROJECT_DATA_SECURITY Duty Role.

  3. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Action, then Synchronize Application Roles.

    Alternatively this step can be performed by restarting all BI Services.

B.2.43.2 Security by Project

Init Blocks:

  • Project List Funding PSFT

  • Project List Invoice PSFT

  • Project List Revenue PSFT

If you are securing the Project data by Project dimension only, then follow the steps below to disable the Project BU dimension security:

  1. Disable Project Business Unit Security, as follows:

    1. In Oracle Application Control, select Business Application Instance, then Application Roles, then Select the Oracle BI Applications Stripe, and query for the OBIA_PROJECT_BUSINESS_UNIT_DATA_SECURITY Application Role.

      Note that OBIA_PSFT_PROJECT_DATA_SECURITY is listed as one of the members.

    2. Remove OBIA_PSFT_PROJECT_DATA_SECURITY as a member of the OBIA_PROJECT_BUSINESS_UNIT_DATA_SECURITY Duty Role.

    3. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Action, then Synchronize Application Roles.

      Alternatively this step can be performed by restarting all BI Services.

  2. Disable Expenditure Business Unit Security, as follows:

    1. In Oracle Application Control, select Business Application Instance, then Application Roles, then Select the Oracle BI Applications Stripe, and query for the OBIA_PROJECT_EXPENDITURE_BUSINESS_UNIT_DATA_SECURITY Application Role.

      Note that OBIA_PSFT_PROJECT_DATA_SECURITY is listed as one of the members.

    2. Remove OBIA_PSFT_PROJECT_DATA_SECURITY as a member of the OBIA_PROJECT_EXPENDITURE_BUSINESS_UNIT_DATA_SECURITY Duty Role.

    3. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Action, then Synchronize Application Roles.

  3. Disable security filters, as follows:

    1. In Oracle BI EE Administration Tool, select Manage, then Identity, then OBIA_PROJECT_CONTRACT_BUSINESS_UNIT_DATA_SECURITY, then Permissions, then Data Filters, and disable the data security filters for all facts except Funding and Contract.

B.2.43.3 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project Billing and Revenue subject areas.

  • OBIA_PSFT_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_PSFT_PROJECT_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_PSFT_PROJECT_DATA_SECURITY

These duty roles control which subject areas and dashboard content users get access to, and also ensure that the data security filters are applied to all queries.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.44 How to Define New Groups and Mappings for Users and BI Roles

Note:

The following terms are synonymous:

- Enterprise Role

- Job Role

- Group

Oracle BI Applications implements data and object security using a set of BI duty roles. Separate sets of roles are used to implement BI object security and BI data security.

To simplify BI duty role provisioning to application users, Oracle BI Applications also provides a BI duty role hierarchy of the roles used in BI object security and BI data security. The BI duty role hierarchy is structured in such a way that for each application area, typically a single BI duty role encapsulates the BI object and data security role hierarchy that is built using multiple other BI duty roles. This single BI duty role can then be provisioned to the end user for enabling the end user to have access to a specific BI application.

For example, the BI duty role 'Fixed Asset Accounting Manager EBS' provides the encapsulation for EBS Fixed Asset Accounting security. The end user must be provisioned with this single BI duty role – there is no need to separately provision underlying data/object security BI duty roles for this application to the user.

There are two ways to provision a BI duty role to a BI end user:

B.2.44.1 How to Use Fusion Middleware (FMW) to Provision an End User

To use FMW provisioning for BI duty roles, the users and enterprise roles must be present in an LDAP directory, and that LDAP directory must be configured as the authentication source for BI.

In this approach, you can use your existing enterprise roles to associate a BI duty role with an end user. If your installation has existing enterprise roles that you wish to use for BI security, consider using this approach.

For example, assume that an installation's LDAP directory has the enterprise role "ABC Corp Americas Account Manager", the users and enterprise roles are present in this LDAP directory, and this LDAP directory is used as the authentication source for the BI installation. Use Enterprise Manager (EM) of the BI instance to make the enterprise role "ABC Corp Americas Account Manager" a member of the BI duty role "Fixed Asset Accounting Manager EBS". After this association, all users that are members of the enterprise role "ABC Corp Americas Account Manager" inherit the BI duty role "Fixed Asset Accounting Manager EBS".

Note that the enterprise role "ABC Corp Americas Account Manager" is a custom enterprise role defined by the customer (it could be any custom-defined enterprise role); this enterprise role does not have to be provided by Oracle.

Oracle BI Applications also provides a sample set of enterprise roles (also called groups) that inherit the BI duty role hierarchy. For example, BI Applications provides the enterprise role "Fixed Asset Accounting Manager EBS", which is a member of the BI duty role "Fixed Asset Accounting Manager EBS". You can associate this enterprise role with your BI users. Any users that are made members of the enterprise role "Fixed Asset Accounting Manager EBS" automatically inherit the BI duty role "Fixed Asset Accounting Manager EBS" and get the right security for Fixed Assets Accounting reporting for EBS. This association can be done in either of the following ways:

  • When you install Oracle BI Applications, the installation provides a default LDAP that is Weblogic Embedded LDAP. If you plan to use this LDAP as the LDAP for users and enterprise roles, then you can upload all enterprise roles that are provided with Oracle BI Applications. These enterprise roles are provided as a .ldif file and once the LDIF file is uploaded to Weblogic Embedded LDAP, then the default installed enterprise roles like "Fixed Asset Accounting Manager EBS" are available in the embedded LDAP. You can define your BI users in this LDAP and associate the users to one or more enterprise roles like "Fixed Asset Accounting Manager EBS". Once a user is associated with enterprise role "Fixed Asset Accounting Manager EBS", the user will inherit the BI duty role "Fixed Asset Accounting Manager EBS" and thus will have proper security for that application. Creation of users and association of users to enterprise roles in Weblogic Embedded LDAP is done using Weblogic Administration Console.

  • If your installation has an existing LDAP directory that is being used for authentication (and you do not wish to use the Weblogic Embedded LDAP that comes with the installation), then you can add the enterprise role "Fixed Asset Accounting Manager EBS" to your LDAP directory and associate this enterprise role with existing users. Any users that are made members of the enterprise role "Fixed Asset Accounting Manager EBS" in your LDAP directory inherit the BI duty role "Fixed Asset Accounting Manager EBS" and thus get the right security for this application. Adding the enterprise role to your LDAP directory and associating enterprise roles with users should be done using native LDAP tools.

    The FMW approach for associating a BI duty role with a user can be used only if Fusion Middleware is also used for user authentication. If the user is being authenticated by a method other than Fusion Middleware Authentication, then Fusion Middleware cannot be used to associate a BI duty role with the user. For example, if a user is being authenticated using an Init Block, then the association of the user with a BI duty role cannot be done using Fusion Middleware.

B.2.44.2 How to Use An RPD Init Block to Provision an End User

Oracle BI Applications provides an init block named "Authorization" that queries the roles/responsibilities associated with the user in the source system and populates an Oracle BI EE session variable called GROUP. Oracle BI EE then assigns to the user the BI duty roles whose names are populated in the GROUP variable.

The init block approach for associating a BI duty role with a user can be used only if the user is not being authenticated using Fusion Middleware.

For example, to associate the BI duty role "Fixed Asset Accounting Manager EBS" with a user using the init block approach, the following steps are needed:

a. Enable the "Authorization" init block if disabled.

b. Update the init block SQL to use the EBS SQL that populates the user's EBS responsibilities (see the illustrative sketch at the end of this section).

Oracle BI Applications provides different SQL statements for E-Business Suite, Siebel, and PeopleSoft for this init block.

c. Create the responsibility "Fixed Asset Accounting Manager EBS" in the E-Business Suite source system and assign it to the user.

d. When the init block is run for the user, the GROUP variable will be populated with value "Fixed Asset Accounting Manager EBS". The BI server will then assign BI duty role "Fixed Asset Accounting Manager EBS" to the user (that is, the BI duty role of the same name).

e. If the user has multiple responsibilities in the source system, the GROUP variable will contain the names of all of those responsibilities.

Oracle BI EE assigns the BI duty roles that match any names contained in the GROUP variable. If a name within the GROUP variable does not match any BI duty role, Oracle BI EE ignores that name. For example, if the GROUP variable contains the values (A, B, C, D) and BI duty roles named A, B, and C exist, the user is assigned BI duty roles (A, B, C); the value D is ignored.
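
Purely as an illustration of this pattern (this is a sketch, not the SQL shipped with Oracle BI Applications; it assumes standard E-Business Suite FND tables and the Oracle BI EE ':USER' placeholder for the login name), a query of the following shape returns the responsibility names that would populate the GROUP variable:

-- Illustrative only: the shipped "Authorization" init block SQL differs per source system.
SELECT DISTINCT RTL.RESPONSIBILITY_NAME
FROM   FND_USER U,
       FND_USER_RESP_GROUPS URG,
       FND_RESPONSIBILITY_TL RTL
WHERE  U.USER_ID = URG.USER_ID
AND    URG.RESPONSIBILITY_ID = RTL.RESPONSIBILITY_ID
AND    RTL.LANGUAGE = 'US'
AND    UPPER(U.USER_NAME) = UPPER(':USER')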

B.2.45 How to Configure Projects Capitalizable Flag for PeopleSoft

This section describes how to configure the Project Capitalizable flag in the Project dimension for a PeopleSoft source, based on the project type.

The Project Capitalizable Flag is associated with the Project Type in the flat file file_project_capitalizable_flag_psft.csv.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

To Configure Projects Capitalizable Flag for PeopleSoft:

  1. Identify the Project Type Class Codes by using the following SQL (to be run in the OLTP source):

    SELECT T.PROJECT_TYPE||'~'||T.SETID AS PROJECT_TYPE
    FROM PS_PROJ_TYPE_TBL T
    WHERE T.EFFDT = (SELECT MAX(EFFDT)
                   FROM PS_PROJ_TYPE_TBL T1
                   WHERE T1.SETID = T.SETID AND T1.PROJECT_TYPE = T.PROJECT_TYPE
                   AND T1.EFFDT <= CURRENT_TIMESTAMP
                   GROUP BY T1.SETID, T1.PROJECT_TYPE)
    
  2. Edit the file file_project_capitalizable_flag_psft.csv.

  3. Copy the data from the PROJECT_TYPE column to the PROJECT_TYPE_CLASS_CODE column in the CSV file.

  4. Map each Project Type Class to a value of Y or N in the CAPITALIZABLE_FLG column. If the Project Type Class is considered capitalizable, enter Y. Otherwise, enter N.

  5. Save and close the file.

B.2.46 How to Configure Project Budget Fact for Oracle E-Business Suite

In E-Business Suite, baselined budgets are extracted into the Budget Fact (W_PROJ_BUDGET_F) table, the grain of which is Budget Line. Because only the baselined budgets are extracted, the records in this table are not updated after they are loaded into the data warehouse; only new records are inserted during the incremental ETL run. Budgets are stored in the Budget dimension (W_PROJ_BUDGET_D).

Note: For E-Business Suite, Transaction Currency is the Document Currency for this fact.

Defining Filters on Budget Metrics

A user can create multiple budgets for a single Project and multiple versions for the same budget type. Therefore, all exposed metrics are filtered by the following filters:

  • Approved Budget Type. A project can have only one Cost Budget with a budget type as "Approved Cost Budget" and one Revenue Budget with a budget type as "Approved Revenue Budget." Therefore, all Cost Budget metrics are filtered by the Approved Cost Budget and Approved Revenue Budget flags to ensure that the metrics include data from one budget only.

  • Current or Original Budget. Each Project budget can have multiple versions. The Current Version may not be the same as the Original Version. Therefore, to show only one budget version at a time, there are separate metrics for the Current version and the Original version. These flags are set automatically in OLTP when the budget is baselined, but users can update them manually.

The user can still see the metrics for any other budget type or version by bringing the non-filtered metrics from Fact - Project Budget fact table into the Presentation area. But to avoid duplicate data, the report must have a filter on "Dim - Project Budget Version.Budget Type" and "Dim - Project Budget Version.Budget Version".

Before running the ETL for the first time, go to the Financial Plan Type page in the HTML application and set your Approved Cost Budget Type and Approved Revenue Budget Type.


Budgets Created in Forms Client

For budgets entered through the Forms client, the PA_BUDGET_TYPES.PLAN_TYPE column is not populated for the two predefined budget types, AC and AR. Therefore, the following ETL logic is incorporated in the interface SDE_ORA_ProjectBudgetDimension_BudgetType.W_PROJ_BUDGET_DS in the SDE_ORA_ProjectBudgetDimension folder:

DOMAIN_DEFAULT_UNASSIGNED( TO_CHAR( CASE WHEN ISNULL(SQ_PA_BUDGET_VERSIONS.PLAN_TYPE) THEN DECODE(SQ_PA_BUDGET_VERSIONS.BUDGET_TYPE_CODE1,'AC','BUDGET','AR','BUDGET','FC','FORECAST','FR','FORECAST',SQ_PA_BUDGET_VERSIONS.PLAN_TYPE) ELSE SQ_PA_BUDGET_VERSIONS.PLAN_TYPE END ) )

Budget Fact Canonical Date

The Budget Fact contains the following two sets of Accounting Date and Period WIDs:

  • PROJ_ACCT_START_DT_WID, PROJ_ACCT_END_DT_WID, and PROJ_PERIOD_WID

    PROJ_ACCT_START_DT_WID and PROJ_ACCT_END_DT_WID are populated using START_DATE and END_DATE of budget line only for budgets that are time-phased using the Project Accounting (PA) Calendar.

  • GL_ACCT_START_DT_WID, GL_ACCT_END_DT_WID, and GL_PERIOD_WID

    The GL_ACCT_START_DT_WID and GL_ACCT_END_DT_WID are populated using the START_DATE and END_DATE of budget line for budgets that are time-phased by the General Ledger (GL) Calendar.

    For budgets defined with Time Phase equal 'P'(PA), 'N'(No Time Phase) or 'R'(Date Range), the GL_ACCT_START_DT_WID and GL_PERIOD_WID are resolved using the START_DATE of the budget line by choosing the period containing that date in the GL Calendar (pinned by the GL_MCAL_CAL_WID).

    This approach assumes that, for time phases 'P', 'N', and 'R', there is always a GL period containing the START_DATE for the given GL Calendar in the OLTP database (see the verification sketch below).
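
    The following query is a minimal sketch for verifying that assumption in the E-Business Suite source; the period set name 'Accounting' is hypothetical and must be replaced with the name of the GL calendar actually used by your Project OU:

    -- Budget lines whose START_DATE falls outside every period of the given GL calendar
    SELECT BL.RESOURCE_ASSIGNMENT_ID,
           BL.START_DATE
    FROM   PA_BUDGET_LINES BL
    WHERE  NOT EXISTS
           (SELECT 1
            FROM   GL_PERIODS GP
            WHERE  GP.PERIOD_SET_NAME = 'Accounting'   -- hypothetical calendar name
            AND    GP.ADJUSTMENT_PERIOD_FLAG = 'N'
            AND    BL.START_DATE BETWEEN GP.START_DATE AND GP.END_DATE);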

For Forms-based budgets, even though the application does not allow creating budget lines in a currency different from the Project Functional currency, the Project Functional Currency is used as the default value of the Document Currency field. This enables Budget Amounts to be analyzed in the Global Currencies. For example, Doc Raw Cost Amount is populated as:

COALESCE(SQ_PA_BUDGET_LINES.TXN_RAW_COST,
IIF(SQ_PA_BUDGET_LINES.TXN_CURRENCY_CODE = SQ_PA_BUDGET_LINES.PROJFUNC_CURRENCY_CODE,SQ_PA_BUDGET_LINES.RAW_COST,
NULL))

B.2.47 How to Configure Project Cross Charge Fact for PeopleSoft

In the services industry, employees may work on projects that are outside of their own organizations. In such cases, the organization that owns a project and the organization that owns the human resource (the employee) may be different. To handle these scenarios in PeopleSoft, the Organizational Sharing method of project accounting is used to share the costs and revenue that the project or activity generates between the entities. Rules and accounting procedures are set up that define the agreement between the organization that owns the project and the organization that owns the human resource.

If sharing rules are defined and activated, the Pricing process calls the Sharing Application Engine process (PSA_SHARING) to search for rows that are designated for sharing. These rows are loaded to either the Cost Fact or the Revenue Fact, depending on the analysis type. A row is eligible for the Sharing process if it satisfies the conditions set in the sharing setup page to identify shared rows: for example, the organization that owns the project or activity differs from the organization that owns the transaction, an applicable sharing rule exists, and the row does not qualify as an exception to the sharing rules.

Organization Sharing Example

Assume there is an 80% revenue sharing rule in place between US004 (Receiver) and US001 (Provider), with no exceptions set.


The shared row that is created in PeopleSoft is loaded to the Cross Charge fact table.

Note: Internal contract sharing is not supported in this release. In addition, sharing rows created directly via the Add transactions page in PeopleSoft are not supported (that is, using Project Costing > Transaction Definition > Add Transactions).

B.2.48 How to Configure Project Forecast Fact for Oracle E-Business Suite

The Forecast fact table is based on PA_BUDGET_LINES. A filter is applied to the Budget Version table to extract only baselined Forecasts for the Forecast fact. The grain of this table is a Forecast line. The ETL extracts only baselined forecasts, so the records in this table are not updated after they are loaded to the data warehouse; only new records are inserted during an incremental run. Forecasts are stored in the Budget dimension (W_PROJ_BUDGET_D) as well.

Note:

For E-Business Suite, Transaction Currency is the Document Currency for this fact.

Defining Filters on Forecast Metrics

Users can create multiple forecasts for a single Project and multiple versions for the same forecast type. Therefore, Oracle BI Applications filter all exposed metrics using the following filters:

  • Primary Forecast Type: One project can have only one Cost Forecast with a forecast type of "Primary Cost Forecast" and one Revenue Forecast with a Forecast type of "Primary Revenue Forecast." Therefore, all Cost and Revenue Forecast metrics are filtered on two flags, Primary Cost Forecast and Primary Revenue Forecast, to make sure we are showing data for only one forecast.

  • Current or Original Forecast: One Project forecast can have multiple versions. To show only one forecast version at a time, separate metrics are provided for the Current Version and the Current Original Version. These flags are set automatically in OLTP when the forecast is baselined, but users can update them manually.

Users can still view metrics for any other forecast type or version by bringing the non-filtered metrics from the Fact - Project Forecast fact table into the Presentation area. But to avoid duplicate data, the report must have a filter on Dim - Project Forecast Version.Forecast Type and Dim - Project Forecast Version.Forecast Version.

Before running the ETL for the first time, access the Financial Plan Type page in the HTML client, and select your Primary Cost forecast and Primary Revenue forecast types.

Forecasts Created in Forms Client

For Forecasts entered through the Forms client, the PA_BUDGET_TYPES.PLAN_TYPE column is not populated for the two predefined budget types, 'FC' and 'FR'. Therefore, the following ETL logic is incorporated in SDE_ORA_ProjectBudgetDimension_BudgetType.W_PROJ_BUDGET_DS in the SDE_ORA_ProjectBudgetDimension folder:

DOMAIN_DEFAULT_UNASSIGNED( TO_CHAR( CASE WHEN ISNULL(SQ_PA_BUDGET_VERSIONS.PLAN_TYPE) THEN DECODE(SQ_PA_BUDGET_VERSIONS.BUDGET_TYPE_CODE1,'AC','BUDGET','AR','BUDGET','FC','FORECAST','FR','FORECAST',SQ_PA_BUDGET_VERSIONS.PLAN_TYPE) ELSE SQ_PA_BUDGET_VERSIONS.PLAN_TYPE END ) )

For 'FC' and 'FR' types of Forecast versions created in the Forms client, the PRIMARY_COST_FORECAST_FLAG and PRIMARY_REV_FORECAST_FLAG are not populated in PA_BUDGET_VERSIONS. Therefore, the following ETL logic is incorporated in SDE_ORA_ProjectBudgetDimension_BudgetType.W_PROJ_BUDGET_DS in the SDE_ORA_ProjectBudgetDimension folder:

COALESCE(SQ_PA_BUDGET_VERSIONS.PRIMARY_COST_FORECAST_FLAG, case when SQ_PA_BUDGET_VERSIONS.BUDGET_TYPE_CODE1 = 'FC' THEN 'Y' ELSE NULL END)
COALESCE(SQ_PA_BUDGET_VERSIONS.PRIMARY_REV_FORECAST_FLAG, case when SQ_PA_BUDGET_VERSIONS.BUDGET_TYPE_CODE1 = 'FR' THEN 'Y' ELSE NULL END)

For Forms-based forecasts, even though the application does not allow the creation of forecast lines in a currency different from the Project Functional currency, the Project Functional Currency is used as the default value of the Document Currency field, so that the Forecast Amounts can also be analyzed in the Global Currencies. For example, Doc EAC Raw Cost Amount is populated as:

COALESCE(SQ_PA_BUDGET_LINES.TXN_RAW_COST,IIF(SQ_PA_BUDGET_LINES.TXN_CURRENCY_CODE = SQ_PA_BUDGET_LINES.PROJFUNC_CURRENCY_CODE, SQ_PA_BUDGET_LINES.RAW_COST,NULL))

Forecast Fact Canonical Date: The Forecast fact has the following two sets of Accounting Date and Period WIDs:

  • PROJ_ACCT_START_DT_WID, PROJ_ACCT_END_DT_WID & PROJ_PERIOD_WID

    PROJ_ACCT_START_DT_WID and PROJ_ACCT_END_DT_WID are populated using START_DATE and END_DATE of forecast line only for Forecasts that are time phased using the Project Accounting (PA) Calendar.

  • GL_ACCT_START_DT_WID, GL_ACCT_END_DT_WID and GL_PERIOD_WID

    The GL_ACCT_START_DT_WID and GL_ACCT_END_DT_WID are populated using START_DATE and END_DATE of forecast line for Forecasts time phased by the General Ledger (GL) Calendar.

    For Forecasts with a Time Phase equal to 'P' (PA), 'N' (No Time Phase), or 'R' (Date Range), the GL_ACCT_START_DT_WID and GL_PERIOD_WID are resolved using the START_DATE of the forecast line by choosing the Period containing that date in the corresponding GL Calendar.

    This approach assumes that, for time phases 'P', 'N', and 'R', there is always a GL period containing the START_DATE for the given GL Calendar in the OLTP database.

B.2.49 How to Set Up Project Billing and Revenue Security for Oracle Fusion

Overview

Oracle Project Analytics supports security over the following dimensions in Project Billing and Revenue subject areas.

Table B-25 Supported Project Billing and Revenue subject areas

Security Entity              Billing  Revenue  Contract  Funding  Cross Charge - Receiver  Cross Charge - Provider  Cross Charge - Invoice  Cross Charge - Revenue
Project Business Unit        Y        Y        N         Y        Y                        N                        Y                       Y
Project Organization         Y        Y        N         Y        Y                        N                        Y                       Y
Expenditure Business Unit    N        N        N         N        N                        Y                        N                       N
Contract Business Unit       Y        Y        Y         Y        N                        N                        N                       Y
Project                      Y        Y        N         Y        Y                        Y                        Y                       Y
Resource Organization        N        N        N         N        N                        N                        N                       N
Ledger                       N        N        N         N        N                        N                        N                       N


Configuring Project Billing and Revenue Security for Oracle Fusion

In order for data security filters to be applied, ensure that the following initialization blocks are enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

You must disable Project Security initialization blocks for all other adapters. If more than one source system is deployed, make sure that the initialization blocks of those source systems are enabled.

Init Blocks:

  • Project Business Unit List Funding Fusion

  • Project Business Unit List Invoice Fusion

  • Project Business Unit List Revenue Fusion

  • Project Contract Business Unit List Fusion

  • Project Contract Business Unit List Invoice Fusion

  • Project Contract Business Unit List Revenue Fusion

  • Project List Funding Fusion

  • Project List Invoice Fusion

  • Project List Revenue Fusion

  • Project Organization List Funding Fusion

  • Project Organization List Invoice Fusion

  • Project Organization List Revenue Fusion

B.2.49.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project Billing and Revenue subject areas.

  • OBIA_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_PROJECT_MANAGEMENT_ANALYSIS_DUTY

These duty roles control which subject areas and dashboard content users get access to, and also ensure that the data security filters are applied to all queries.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.50 How to Extend the Project Task Hierarchy Dimension for E-Business Suite

Task dimension data is sourced from the task table (PA_TASKS) in E-Business Suite, as well as from other task-related OLTP tables such as:

  • PA_PROJ_ELEMENTS

  • PA_PROJ_ELEMENT_VERSIONS

  • PA_PROJ_ELEM_VER_STRUCTURE

  • PA_PROJ_ELEM_VER_SCHEDULE

Attributes such as WBS_NUMBER, PRIORITY_CODE, SCHEDULE_START_DATE, and SCHEDULE_END_DATE are sourced from these tables. Oracle BI Applications support only the latest version of the Financial Structure by using the following filter conditions:

PA_PROJ_ELEM_VER_STRUCTURE.STATUS_CODE = 'STRUCTURE_PUBLISHED'
AND PA_PROJ_ELEM_VER_STRUCTURE.LATEST_EFF_PUBLISHED_FLAG = 'Y'

The W_TASK_DH hierarchy table stores the flattened hierarchy for every task in W_TASK_D. It is at the same grain as W_TASK_D and is modeled as a Type I dimension. All tasks in the hierarchy support these columns:

  • TASK_NAME

  • TASK_NUMBER

  • WBS_LEVEL

  • WBS_NUMBER

Because both tables, W_TASK_D and W_TASK_DH, are at the same grain, fact tables do not have a separate foreign key to join with this table; instead, the join is on the Task Foreign Key.

By default, Oracle BI Applications support 20 levels in the flattened hierarchy. The levels are Base, 1, 2, and so forth up to 18, and Top. The base level represents the hierarchy record, and Top level is the Top hierarchy under the Project. If your financial structure contains more than 20 levels, you can extend the number of levels in the schema and ETL to support all levels.

To Extend the Project Task Hierarchy Dimension:

  1. To extend the levels, you need to add the columns (TASK_NUMBER, WBS_LEVEL, and WBS_NUMBER) for every new level that you want to the W_TASK_DHS and W_TASK_DH tables in the Models sub-tab in ODI Designer Navigator.

  2. Extend the interfaces in the SDE and SILOs folder, as follows:

    1. Depending on the source, navigate to the correct SDE folder for E-Business Suite or PeopleSoft.

    2. Edit and update the correct main interface by providing the correct mappings for the new columns.

      For example, SDE_ORA_TaskDimensionHierarchy.W_TASK_DHS or SDE_PSFT_TaskDimensionHierarchy.W_TASK_DHS.

    3. Open the SILOS folder and edit and update the ODI interface SIL_Project_TaskDimensionHierarchy.

  3. Regenerate the SDE/SILOS scenarios by expanding the Packages folder, right-clicking each scenario, and selecting Regenerate.


    You must also update the following objects in the metadata repository:

    • W_TASK_DH table in the physical layer.

    • Dim - Task Hierarchy Logical Table and Task Hierarchy Dimension in the logical layer.

    • All the Task Hierarchy Presentation tables in the Presentation Area.

B.2.51 How to Configure Project Customer in Projects Analytics for E-Business Suite

By default, E-Business Suite only has the 'PRIMARY' relationship code in the PA_PROJECT_CUSTOMERS table. Therefore, the value is included in the ODI filter used in the source extract mapping for the Project dimension to get the customer for a project. Customers can define an additional value such as 'OVERRIDE CUSTOMER' as the relationship value. In this case, the filter must be edited to include any additional values.

To edit the filter:

  1. In ODI Designer Navigator, connect to your ODI repository.

  2. Open the folder appropriate to your source system (for example, SDE_ORA_11510_Adaptor for Oracle V11.5.10, or SDE_ORA_R12_Adaptor for Oracle V12).

  3. Expand the SDE_ORA_ProjectDimension folder and open the interface SDE_ORA_Project.W_PROJECT_DS.LKP_PROJ_CUST and click on the 'Quick-Edit' tab.

  4. Expand the Filters tab and edit the expression column for the second filter.

  5. Remove the existing SQL and add the following sample SQL where it is assumed the values are 'PRIMARY' and 'OVERRIDE CUSTOMER'.

    Modify it according to your configuration. The sample filter is: UPPER(PA_PROJECT_CUSTOMERS.PROJECT_RELATIONSHIP_CODE (+)) IN ('PRIMARY', 'OVERRIDE CUSTOMER'). If you want the lookup to be independent of any relationship, then remove the filter on PROJECT_RELATIONSHIP_CODE altogether.

  6. Note: If the lookup returns more than one customer, then apply a MAX function on the customer ID so that it always returns one row (see the sketch after these steps).

  7. Review the mapping to ensure that it is valid, then click OK and save the interface.

  8. Regenerate the scenario by expanding the Packages folder, right-clicking the scenario, and selecting Regenerate.
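
The following is a minimal sketch of the idea behind the MAX note in step 6, expressed as standalone SQL against the E-Business Suite source. It assumes the lookup returns CUSTOMER_ID from PA_PROJECT_CUSTOMERS and that the relationship codes in the IN list match your configuration:

-- One customer per project: MAX() collapses multiple matching relationship rows to a single value
SELECT PC.PROJECT_ID,
       MAX(PC.CUSTOMER_ID) AS CUSTOMER_ID
FROM   PA_PROJECT_CUSTOMERS PC
WHERE  UPPER(PC.PROJECT_RELATIONSHIP_CODE) IN ('PRIMARY', 'OVERRIDE CUSTOMER')
GROUP  BY PC.PROJECT_ID;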

B.2.52 How to Configure Project Classification Dimension in Projects Analytics for Oracle E-Business Suite

Every project can be optionally classified into different categories. Within these categories, a project can be further categorized into different classification codes. Depending on how these classification categories are defined in the application, for some categories, a project can be classified with more than one classification code.

The Project Classification Table (W_PROJ_CLASSIFICATION_D) is at the grain of Project, Classification Category, and Classification Code. The Project facts do not have an explicit foreign key for joining with the Project Classification Dimension; instead, the join is on the Project Foreign Key. Because specifying a Classification Category is optional for a project, the logical join in the metadata repository between the facts and the Project Classification Dimension is a right outer join, to avoid losing records when a project has not been classified.

Note: A particular classification code might exist for more than one classification category. Therefore, to avoid double counting, it is important that a classification category is fixed in a report that has classification code as one of the reporting attributes. If a Project belongs to more than one Classification Category under the same Classification, the Project metrics (Cost, Revenue, and so forth) will be double counted.

B.2.53 How to Configure Project Funding Fact for E-Business Suite

Funding is based on the Funding Line, which represents allocations made to a project or task. The line-level funding information is held in the Funding Line fact (W_PROJ_FUNDING_LINE_F), which is based on the PA_PROJECT_FUNDINGS table in the Billing Module of E-Business Suite. Data is also extracted from the Summary Funding table (PA_SUMMARY_PROJECT_FUNDINGS) to retrieve additional metrics, such as Unbaselined Amount, Baselined Amount, Invoiced Amount, and Revenue Accrued, which are not available in the Funding Line fact; these are available in the Funding Header fact (W_PROJ_FUNDING_HDR_F). Before running any ODI ETL job, you need to run the following process in E-Business Suite to update this table: PRC: Refresh Project Summary Amounts.

Note: For E-Business Suite, Funding Currency is the Document Currency for this fact.

The following Domains are used in the Project Funding area:

  • Project_Funding_Category: Used for categorizing funding allocation types.

  • Project_Funding_Level: This flat file is used to indicate whether a funding line is for a Task or a Project. It is not used in any metric definition by default.

Funding Fact Canonical Date

The GL Date is not populated in the OLTP application. Therefore, in the data warehouse, the GL Date for E-Business Suite is based on the Funding Allocation Date, using the GL Calendar of the Project OU. This enables cross-functional analysis on the GL Calendar; for example, cross analysis of funding and billing by Fiscal Year is not possible without a GL Date in the Funding fact. Customers who do not want to perform analysis based on the GL Calendar can instead base it on the Enterprise Calendar.

The GL date (Funding Allocation Date) is the canonical date for this table and is also used for the global exchange rate calculation.

B.2.54 How to Configure Projects Resource Class for PeopleSoft

Resource Class involves classification of resources into people, equipment, material items, and financial elements.

B.2.54.1 Identify Resource Class based on a Source Type, Category, and Subcategory Combination of Values

To use this identification during the ETL process, you need to set the variable RESOURCE_CLASS_TYPECATSUB to 1 in FSM.

The ETL process uses the domainValues_Project_Cost_Resource_Class_TypeCatSub_psft.csv flat file to assign Resource Class to project cost records.

Use the following flat files to identify Resource Class based on a Source Type, Category, and Subcategory Combination of Values:

  • file_project_cost_resource_class_typecatsub_config_psft.csv

    Use this file to specify the columns (Source Type, Category, and Subcategory) to use in the lookup.

  • file_project_cost_resource_class_typecatsub_psft.csv

    The ETL process uses this flat file to list all Source Type, Category, Subcategory combinations of values to use for Resource Class. Enter values for only the columns that are selected in the file_Project_Cost_Resource_Class_TypeCatSub_config_psft.csv file. All columns must be included in the flat file and unselected columns must not contain a value.

You must identify each row as either People (L) or Equipment (A) as the last value.

To configure file_Project_Cost_Resource_Class_TypeCatSub_config_psft.csv (config file):

  1. Edit the file file_Project_Cost_Resource_Class_TypeCatSub_config_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter only one row with RowID of 1. Enter a Y in each column that represents the combination to be assigned a Resource Class. The columns are:

    Row ID
    Source Type
    Category
    Subcategory
    

    The following is an example of using a combination of Source Type and Category:

    1,Y,Y,

    In this example, Source Type and Category combinations stored in file_project_cost_resource_class_typecatsub_psft.csv are classified as People or Equipment when the values match.

  3. Save and close the file.

To configure the file_project_cost_resource_class_typecatsub_psft.csv file (data file):

  1. Edit the file_Project_Cost_Resource_Class_TypeCatSub_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter Resource Type, Category, and Subcategory combinations that are to be considered as Resource Class People or Equipment. For Resource Class of People, enter an L as the last value.

    For Resource Class of Equipment, enter an A as the last value. The format is:

    XXXXX,XXXXX,XXXXX,X

    You must specify each combination of lookup values. Wildcards are not supported.

    The following is an example of classifying costs with LABOR or SUBCN Source Type/no Category as People costs and costs with DIRCT Source Type/HRDWR Category as Equipment costs:

    LABOR,,,L
    SUBCN,,,L
    DIRCT,HRDWR,,A
    

    Note:

    This CSV file is used in conjunction with the file_Project_Cost_Resource_Class_TypeCatSub_config_psft.csv configuration file. In this example, this configuration file would contain the value 1,Y,Y,

    You must specify each combination of lookup values. The lookup will use columns with a Y in the configuration file.

  3. Save and close the file.

B.2.54.2 Identifying Resource Class Based on a ChartField Combination of Values

To use this identification during the ETL process, you need to set the variable RESOURCE_CLASS_CHARTFIELD to 1 in FSM.

The ETL process uses the file_project_cost_resource_class_chartfield_psft.csv flat file to assign Resource Class to Project Cost records.

To assign Resource Class based on a Chartfield combination of values, use the following CSV files:

  • file_Project_Cost_Resource_Class_ChartField_config_psft.csv

    Use this flat file to specify the Chartfield columns to use in the lookup.

  • file_project_cost_resource_class_chartfield_psft.csv

    Use this flat file to assign all ChartField combinations of values to a Resource Class. Enter values for only the columns that are selected in the file_Project_Cost_Resource_Class_ChartField_config_psft.csv file.

All columns must be included in the flat file and unselected columns must not contain a value. You must identify each row as either People (L) or Equipment (A) as the last value.

To configure the file_Project_Cost_Resource_Class_ChartField_config_psft.csv (config file):

  1. Edit the file file_Project_Cost_Resource_Class_ChartField_config_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter only one row with RowID of 1. Enter a Y in each column that represents the combination to be assigned a Resource Class. The columns are:

    Row ID
    Account
    Alternate Account
    Operating Unit
    Fund
    Dept ID
    Program
    Class
    Budget
    Project
    Business Unit
    Project
    Activity
    Source Type
    Category
    Subcategory
    Affiliate
    Affiliate 1
    Affiliate 2
    ChartField 1
    ChartField 2
    ChartField 3
    

    The following is an example of using a combination of Fund Code and Program:

    ,,,Y,,Y,,,,,,,,,,,,,,,
    

    In this example, Fund Code and Program Code combinations stored in the file_project_cost_resource_class_chartfield_psft.csv are classified as People or Equipment when the values match.

  3. Save and close the file.

To configure the file_Project_Cost_Resource_Class_ChartField_psft.csv (data file):

  1. Edit the file file_project_cost_resource_class_chartfield_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter ChartField combinations that are to be considered as Resource Class People or Equipment. For Resource Class of People, enter an L as the last value.

    For Resource Class of Equipment, enter an A as the last value. The format is:

    X,X,X,X,X,X,X,X,X,X,X,X,X,X,X,X,X,X,X,X,X,X

    Each X represents a ChartField value in the combination.

    Each combination of lookup values must be specified. Wildcards are not supported.

    The following example shows how to classify costs with Fund Code FND01 and Program Code P2008 as People costs:

    ,,,FND01,,P2008,,,,,,,,,,,,,,,,L
    

    Note:

    This CSV file is used in conjunction with the file_Project_Cost_Resource_Class_ChartField_config_psft.csv configuration file. In this example, this configuration file would contain the value ,,,Y,,Y,,,,,,,,,,,,,,,.

    In the above example, Project Costing records with the Fund Code FND01 and Program Code P2008 are classified as Resource Class People.

    You must specify each combination of lookup values. Columns with a Y in the configuration file will be considered in the lookup.

  3. Save and close the file.
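
The following SQL sketch shows, conceptually, how the lookup classifies Project Cost records for the Fund Code and Program Code example above. The table and column names are hypothetical placeholders for the staging data and the flat-file contents; the actual lookup is implemented inside the ODI interfaces.

-- Conceptual sketch only. project_cost_stage and resource_class_chartfield_lkp are
-- hypothetical placeholders: the lookup table holds the rows from
-- file_project_cost_resource_class_chartfield_psft.csv, and the join uses only the
-- columns flagged with Y in the configuration file (Fund Code and Program Code here).
select cost.project_id,
       cost.fund_code,
       cost.program_code,
       case lkp.resource_class_code
            when 'L' then 'PEOPLE'
            when 'A' then 'EQUIPMENT'
       end as resource_class
from   project_cost_stage cost
left outer join resource_class_chartfield_lkp lkp
       on  cost.fund_code    = lkp.fund_code      -- flagged Y in the config file
       and cost.program_code = lkp.program_code;  -- flagged Y in the config file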

B.2.55 How to Include Incomplete Invoice Lines

By default, the Oracle Supply Chain and Order Management Analytics application is configured to extract only completed sales invoices when performing the Sales Invoice data extract. Oracle 11i and Oracle R12 use a flag to indicate whether a sales invoice is complete. In particular, completed sales invoices are those where RA_CUSTOMER_TRX_ALL.COMPLETE_FLAG = Y in Oracle 11i and Oracle R12. To extract incomplete sales invoices as well as complete invoices, remove the extract filter statement.
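
For reference, the default extract behavior is equivalent to the following simplified SQL sketch; the actual extract SQL is generated by ODI from the temp interface, and only the WHERE clause shown here corresponds to the filter that you remove.

-- Simplified sketch of the default (completed-invoices-only) extract against E-Business Suite.
select trx.*
from   RA_CUSTOMER_TRX_ALL trx
where  trx.COMPLETE_FLAG = 'Y';  -- removing this filter also extracts incomplete invoices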

To remove the extract filter for sales invoices:

  1. In ODI, open the SDE_ORA115<ver>_Adaptor or SDE_ORAR12_Adaptor folder.

  2. Open the SDE_ORA_SalesInvoiceLinesFact.W_SALES_INVOICE_LINE_FS.SQ_BCI_SALES_IVCLNS temp interface, and click the Quick Edit tab.

  3. Expand the Filter section, and select the filter that you want to remove: "RA_CUSTOMER_TRX_ALL.COMPLETE_FLAG='Y'".

    The red cross (delete) button becomes available for the highlighted filter.

  4. Click the red cross button to delete the filter.

  5. Save your changes to the repository.

  6. Regenerate the scenario.

  7. Repeat steps 2 - 6 for the temp interface - SDE_ORA_SalesInvoiceLinesFact_Primary.W_SALES_INVOICE_LINE_F_PE_SQ_BCI_SALES_IVCLNS.

Oracle Fusion Applications uses a flag to indicate whether a sales invoice is complete. In particular, completed sales invoices are those where the SALESINVOICECUSTOMERTRXLINESPVO.TransactionHeaderCompleteFlag='Y'. To extract incomplete sales invoices, as well as complete invoices, remove the extract filter statement.

To remove the extract filter for sales invoices:

  1. In ODI, open the SDE_FUSION_Adaptor.

  2. Open the SDE_FUSION_SalesInvoiceLinesFact.W_SALES_INVOICE_LINE_FS_SQ_TRANSACTIONLINEPVO temp interface, and click the Quick Edit tab.

  3. Expand the Filter section, and select the filter that you want to remove: "SALESINVOICECUSTOMERTRXLINESPVO.TransactionHeaderCompleteFlag='Y'".

    The red cross (delete) button becomes available for the highlighted filter.

  4. Click the red cross button to delete the filter.

  5. Save your changes to the repository.

  6. Regenerate the scenario.

  7. Repeat steps 2 - 6 for the temp interface - SDE_ORA_SalesInvoiceLinesFact_Primary.W_SALES_INVOICE_LINE_F_PE_SQ_BCI_SALES_IVCLNS.

B.2.56 How to Set Up Project Cost and Control Security for Oracle Fusion

Overview

Oracle Project Analytics supports security using the following dimensions in Project Costing and Project Control subject areas.

Table B-26 Supported security dimensions in Project Costing and Project Control subject areas

Security Entity               Cost   Commitment   Budget   Forecast

Project Business Unit         Y      Y            Y        Y
Project Organization          Y      Y            Y        Y
Expenditure Business Unit     Y      Y            N        N
Contract Business Unit        N      N            N        N
Project                       Y      Y            Y        Y
Resource Organization         N      N            N        N
Ledger                        N      N            N        N

(Cost, Commitment, Budget, and Forecast are the Project Costing and Control facts.)


Configuring Project Cost and Control Security For Oracle Fusion Applications

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

Ensure that the Project Security initialization blocks for adapters that are not deployed are disabled. If more than one source system is deployed, then you must enable the initialization blocks of each of those source systems.

Init Blocks:

  • Expenditure Business Unit List Fusion

  • Project Business Unit List Budget Fusion

  • Project Business Unit List Costing Fusion

  • Project Business Unit List Forecast Fusion

  • Project List Budget Fusion

  • Project List Costing Fusion

  • Project List Forecast Fusion

  • Project Organization List Budget Fusion

  • Project Organization List Costing Fusion

  • Project Organization List Forecast Fusion

B.2.56.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project Costing and Control subject area.

  • OBIA_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_PROJECT_MANAGEMENT_ANALYSIS_DUTY

These duty roles control which subject areas and dashboard content the user can access. They also ensure that the data security filters are applied to all queries.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.57 How to Set Up CRM Primary Organization Based Security for Siebel

Overview

Siebel CRM primary organization based security is applied in partner subject areas. In the Siebel partner application, the primary organization is the partner organization to which the partner user belongs. Primary organization based security gives the partner user access only to the entities for which that partner organization is the primary owner organization.

Configuring Primary Organization Based Security

The session variable ORGANIZATION stores the list of organization IDs that the user belongs to. It is initialized via the initialization block 'Orgs for Org-Based Security' when the user logs in, and is then used as a data filter in the primary organization based data security duty role.

B.2.57.1 Configuring BI Duty Roles

'Primary Org-Based Security' is the internal BI duty role that defines the data filter for primary organization based data security. By default, it has the following members:

  • Partner Executive Analytics User

  • Partner Operations Analytics User

  • Partner Sales Manager Analytics User

  • Partner Service Manager Analytics User

These duty roles control which subject areas and dashboard content the user can access. As members of 'Primary Org-Based Security', they also ensure that the primary organization based data security filters are applied.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.58 How to Set Up CRM Primary Employee/Position Hierarchy Based Security for Siebel

Overview

Primary employee/position hierarchy based security is widely used in many CRM subject areas, such as Sales, Marketing and Partner Management. For the Siebel data source, CRM BI shares the same concept of Position and Position Hierarchy as defined in the Siebel application.

Primary Employee/Position Hierarchy Based Security control starts with the user's login and that login's level in the position hierarchy. The user's login is compared with the login defined at that level of the position hierarchy and used as a data filter in queries. In this way, the user is granted data visibility to the transactions that he or she directly owns and to the transactions owned by his or her subordinates.

Note: In CRM Siebel Forecasting Analytics, in addition to the position hierarchy, further data visibility is granted to the login user via the "Indirect Sales Hierarchy", which is defined in the Siebel Forecasting application and brought over to the data warehouse by ETL.

Configuring Resource Hierarchy Based Security

There are two session variables used in "Primary Employee/Position Hierarchy Based Security" for Siebel.

  • USER is the OBIEE system session variable, which is populated automatically when a user logs in to BI.

  • HIER_LEVEL contains the level in the position hierarchy that the login user belongs to. This variable is initialized via the session initialization block "User Hierarchy Level".

B.2.58.1 Configuring BI Duty Roles

All the primary employee and position based security roles should be defined as members of the internal role "Primary Employee/Position Hierarchy-based Security". In the default configuration, "Primary Employee/Position Hierarchy-based Security" has the following members.

  • Partner Sales Rep Analytics User

  • Partner Service Rep Analytics User

  • Pricing Manager

  • Primary Owner-Based Security

  • Sales Manager Analytics

  • Sales Representative Analytics

  • Usage Accelerator - Sales Manager

These duty roles also control which subject areas and dashboard content the user can access.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.59 How to Configure Workforce Bypass Fast Formula

Purpose

Headcount and FTE may be derived in a number of different ways depending on the OLTP setup. If they are not stored in the table PER_ASSIGNMENT_BUDGET_VALUES_F then a fast formula is executed to calculate the value for each assignment record.

Fast formula execution is typically slow. To avoid performance issues on larger systems, the Workforce Bypass Fast Formula feature retains the flexibility of the fast formulas without the execution cost.

Optional or Mandatory

This task is optional; however, the default option of running the fast formulas is slower.

Applies to

All versions of E-Business Suite.

Task description in detail

To bypass fast formula execution, configure the parameter HR_WRKFC_BYPASS_FF. Once that is done, the ETL will calculate the Headcount and FTE values using the same logic as in the default fast formulas TEMPLATE_HEAD and TEMPLATE_FTE (although values entered directly in the ABV table will still take precedence).

If the template formula logic is not adequate then it is possible to configure that in the ETL, although this is quite a complex task as it involves modifying the SQL expressions that implement the formula logic. The template formula SQL expressions for Headcount and FTE are stored in ODI variables HR_WRKFC_BYPASS_HDC_CALC and HR_WRKFC_BYPASS_FTE_CALC. The variable values may be overridden with the required logic in the generated load plan.

HR_WRKFC_BYPASS_HDC_CALC

Implements the logic from the fast formula TEMPLATE_HEAD but calculated directly from the base tables. The logic implemented is:

  • If the assignment is primary then the headcount is 1.

  • Otherwise headcount is 0.

The variable expression is:

(case when asg.primary_flag = 'Y' then 1 else 0 end)

If you override this expression, take care to ensure that all references match up in every data set of the interface. Joins may be added provided they do not change the number of rows being processed.
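
For example, a site that wants non-primary assignments to contribute half a headcount could, as a sketch only, override the variable in the generated load plan with an expression such as the following, which reuses the asg.primary_flag reference from the delivered expression:

(case when asg.primary_flag = 'Y' then 1 else 0.5 end)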

HR_WRKFC_BYPASS_FTE_CALC

Implements the logic from the fast formula TEMPLATE_FTE but calculated directly from the base tables. The logic implemented is:

  • If the assignment has full time employment category then the FTE is 1.

  • If the assignment has part time employment category then calculate the FTE based on working hours of assignment / expected working hours of assignment.

  • Otherwise FTE is 0.

The expected working hours of the assignment come from the position, organization, business group (in that order of precedence). If the assignment hours are given in a different frequency to the expected working hours then some conversion is necessary.

The variable expression is:

(case when asg.employment_category in ('FT','FR') then 1
      when asg.employment_category in ('PT','PR') then
       round((case when NVL(pos.working_hours,
                     NVL(org.org_information3, bus.org_information3)) = 0
              then 0
              else (decode(NVL(pos.frequency,
                               NVL(org.org_information4, bus.org_information4)),
                    'H', 1, 'D', 8, 'W', 40, 'M', 169) *
                    asg.normal_hours)
              / (decode(asg.frequency,
                 'HO', 1, 'D', 8, 'W', 40, 'M', 169) *
                 NVL(pos.working_hours,
                     NVL(org.org_information3, bus.org_information3)))
        end), 2)
   else 0
  end)

If you override this expression, take care to ensure that all references match up in every data set of the interface. Joins may be added provided they do not change the number of rows being processed.

Dependency

No dependencies.

B.2.60 How to Disable Projects Invoice Line Fact in the RPD for Fusion Applications

This topic explains how to disable the Project Invoice Line fact for Fusion users. By default, the Billing metrics are mapped to both the Invoice Line fact (W_PROJ_INVOICE_LINE_F) and the Invoice Line Distribution fact (W_PROJ_INVOICE_DIST_F). For customers whose only data source is the Fusion database, the Invoice Line fact is an obsolete table and must be disabled. Before disabling it, the metrics that are defined only in the line fact must be deleted. The metrics listed below relate to the Retention area and are not currently supported in the Fusion application.

To Disable Projects Invoice Line Fact, do the following:

  1. Delete unnecessary metrics, as described in Section B.2.60.1, "How to Delete Unnecessary Metrics".

  2. Disable Invoice Line Fact, as described in Section B.2.60.2, "How to Disable Invoice Line Fact".

Note: Oracle recommends that before you start this process you make a backup of your metadata repository (RPD file).

B.2.60.1 How to Delete Unnecessary Metrics

Before disabling the Logical Table Sources, the metrics defined using them must be unmapped or deleted so that the RPD remains consistent, as follows:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. In the Business Model and Mapping layer, navigate to Fact – Project Billing.

  3. Delete the metrics listed below.

    List of 71 metrics that must be deleted:

    1.     Fact - Project Billing.--------------- Retention Amount ---------------
    2.     Fact - Project Billing.Current Withheld Amount
    3.     Fact - Project Billing.Current Withheld Amount - ITD
    4.     Fact - Project Billing.Current Withheld Amount - MTD
    5.     Fact - Project Billing.Current Withheld Amount - QTD
    6.     Fact - Project Billing.Current Withheld Amount - YTD
    7.     Fact - Project Billing.---------- Retention Billed -------------
    8.     Fact - Project Billing.Retention Billed
    9.     Fact - Project Billing.Retention Billed - ITD
    10.     Fact - Project Billing.Retention Billed - MTD
    11.     Fact - Project Billing.Retention Billed - QTD
    12.     Fact - Project Billing.Retention Billed - YTD
    13.     Fact - Project Billing.----------- Retention Withheld -------------
    14.     Fact - Project Billing.Total Retained Amount
    15.     Fact - Project Billing.Total Retained Amount - ITD
    16.     Fact - Project Billing.Total Retained Amount - MTD
    17.     Fact - Project Billing.Total Retained Amount - QTD
    18.     Fact - Project Billing.Total Retained Amount - YTD
    19.     Fact - Project Billing.----------- Retention Write-off -----------
    20.     Fact - Project Billing.Retention Write-off
    21.     Fact - Project Billing.Retention Write-off - ITD
    22.     Fact - Project Billing.Retention Write-off - MTD
    23.     Fact - Project Billing.Retention Write-off - QTD
    24.     Fact - Project Billing.Retention Write-off - YTD
    25.     Fact - Project Billing.-------------- Unearned Revenue ----------------
    26.     Fact - Project Billing.Unearned Revenue
    27.     Fact - Project Billing.Unearned Revenue - ITD
    28.     Fact - Project Billing.Unearned Revenue - MTD
    29.     Fact - Project Billing.Unearned Revenue - QTD
    30.     Fact - Project Billing.Unearned Revenue - YTD
    31.     Fact - Project Billing.-------------- Unbilled Receivables ------------
    32.     Fact - Project Billing.Unbilled Receivables
    33.     Fact - Project Billing.Unbilled Receivables - ITD
    34.     Fact - Project Billing.Unbilled Receivables - MTD
    35.     Fact - Project Billing.Unbilled Receivables - QTD
    36.     Fact - Project Billing.Unbilled Receivables - YTD
    37.     Fact - Project Billing.# of Unapproved Invoices
    38.     Fact - Project Billing.Retention Amount - MTD - Enterprise Calendar
    39.     Fact - Project Billing.Retention Amount - QTD - Enterprise Calendar
    40.     Fact - Project Billing.Retention Amount - YTD - Enterprise Calendar
    41.     Fact - Project Billing.Retention Billed - MTD - Enterprise Calendar
    42.     Fact - Project Billing.Retention Billed - QTD - Enterprise Calendar
    43.     Fact - Project Billing.Retention Billed - YTD - Enterprise Calendar
    44.     Fact - Project Billing.Retention Withheld - MTD - Enterprise Calendar
    45.     Fact - Project Billing.Retention Withheld - QTD - Enterprise Calendar
    46.     Fact - Project Billing.Retention Withheld - YTD - Enterprise Calendar
    47.     Fact - Project Billing.Retention Write-off - MTD - Enterprise Calendar
    48.     Fact - Project Billing.Retention Write-off - QTD - Enterprise Calendar
    49.     Fact - Project Billing.Retention Write-off - YTD - Enterprise Calendar
    50.     Fact - Project Billing.Unearned Revenue - MTD - Enterprise Calendar
    51.     Fact - Project Billing.Unearned Revenue - QTD - Enterprise Calendar
    52.     Fact - Project Billing.Unearned Revenue - YTD - Enterprise Calendar
    53.     Fact - Project Billing.Unbilled Receivables - MTD - Enterprise Calendar
    54.     Fact - Project Billing.Unbilled Receivables - QTD - Enterprise Calendar
    55.     Fact - Project Billing.Unbilled Receivables - YTD - Enterprise Calendar
    56.     Fact - Project Billing.Internal UBR UER Metric (Invoice Fact)
    57.     Fact - Project Billing.Internal Unearned Revenue - ITD
    58.     Fact - Project Billing.Internal Unbilled Receivable - ITD
    59.     Fact - Project Revenue.------------- Unearned Revenue --------------
    60.     Fact - Project Revenue.Unearned Revenue
    61.     Fact - Project Revenue.Unearned Revenue - ITD
    62.     Fact - Project Revenue.Unearned Revenue - QTD
    63.     Fact - Project Revenue.Unearned Revenue - MTD
    64.     Fact - Project Revenue.Unearned Revenue - YTD
    65.     Fact - Project Revenue.------------ Unbilled Receivable -------------
    66.     Fact - Project Revenue.Unbilled Receivable
    67.     Fact - Project Revenue.Unbilled Receivable - ITD
    68.     Fact - Project Revenue.Unbilled Receivable - MTD
    69.     Fact - Project Revenue.Unbilled Receivable - QTD
    70.     Fact - Project Revenue.Unbilled Receivable - YTD
    71.     Fact - Project Revenue.Internal UBR UER Metric (Invoice Fact)
    

B.2.60.2 How to Disable Invoice Line Fact

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. In the Business Model and Mapping layer, navigate to Fact – Project Billing.

  3. Select the 'Fact_W_PROJ_INVOICE_LINE_F_Invoice_Line' Logical Table Source, then right click, and choose Edit.

  4. Display the General tab, and select the 'Disabled' check box, then click OK.

  5. Select the 'Fact_W_PROJ_INVOICE_LINE_F_Invoice_Line_ITD' Logical Table Source, then right click, and choose Edit.

  6. Display the General tab, and select the 'Disabled' check box, then click OK.

  7. Save the changes.

  8. Run the Consistency Check and ensure that there are no errors, save the RPD file, and clear Oracle BI Enterprise Edition Cache.

    If you are making the changes in offline mode, then restart the Oracle BI Server and Oracle BI Presentation Services.

B.2.61 How to Set Up SIA Student Financial Security for Peoplesoft

The Student Financial module is secured by Business Unit and Academic Institution (except the Credit History Subject Area, which is secured only by Business Unit). From the object security perspective, the table below shows the default job roles for Student Financial module access.

Table B-27 Role names and Descriptions for Student Financial Analytics

Role Name Description

Bursar

Manages student receivables, student tuition and fee charges, student billing, student payments and student collections.

Campus Solutions Administrator

Administrator of Campus Solutions.

Student

Student of the Academic Institution.


The table below shows the duty roles and data security roles that are used by the Student Financial module.

Table B-28 Role Names and Roles Types for Student Financial Analytics

Role Name Role Type

OBIA_Student_Accounts_Analysis_Duty

Duty Role

OBIA_SIA_Admin_Analysis_Duty

Duty Role

OBIA_Student_Analysis_Duty

Duty Role

OBIA_STUDENT_INSTITUTION_DATA_SECURITY

Data Security Role

OBIA_STUDENT_BUSINESS_UNIT_DATA_SECURITY

Data Security Role

OBIA_STUDENT_DATA_SECURITY

Data Security Role


To set up SIA Student Financial Security for Peoplesoft:

  1. Log in to the WebLogic console.

  2. Click the "Lock and Edit" button.

  3. Navigate to Security Realms, then myrealm, then Users and Groups, then Users.

  4. On the Users tab, create a new user.

    Ensure that the same user is present in the Peoplesoft Campus Solution OLTP system.

  5. Navigate to Security Realms, then myrealm, then Users and Groups, then Groups.

  6. On the Groups tab, create the same group as that available in the JAZN file.

    For example, Bursar, or Admissions Manager.

  7. Navigate to Security Realms, then myrealm, then Users and Groups, then Users, and click on the newly created user.

  8. Click the Groups tab, associate the user with the appropriate application role along with the BIAuthors and BIConsumers roles, and save the changes.

  9. Click the Release Configuration button.

All role mappings are accomplished inside the JAZN file, which is provided with the Oracle BI Applications installation. Any new role mapping is a part of the customization effort and the JAZN file needs to be updated.

B.2.62 How to Set Up SIA Administration Recruiting Security for Peoplesoft

The Admissions and Recruiting module is secured by Academic Institution. From the object security perspective, the table below shows the default job roles for Admissions and Recruiting module access.

Table B-29 Role Names and Descriptions for Admissions and Recruiting

Role Name Description

Admissions Manager

Manages Recruiting processes to meet the enrollment targets.

Campus Solutions Administrator

Administrator of Campus Solutions.


The table below shows the duty roles and data security roles that are used by the Admissions and Recruiting module.

Table B-30 Role Names and Role Types for Admissions and Recruiting

Role Name Role Type

OBIA_Student_Admissions_Analysis_Duty

Duty Role

OBIA_SIA_Admin_Analysis_Duty

Duty Role

OBIA_STUDENT_INSTITUTION_DATA_SECURITY

Data Security Role


To set up SIA Admissions and Recruiting Security for Peoplesoft:

  1. Log in to the WebLogic console.

  2. Click the "Lock and Edit" button.

  3. Navigate to Security Realms, then myrealm, then Users and Groups, then Users.

  4. On the Users tab, create a new user.

    Ensure that the same user is present in the Peoplesoft Campus Solution OLTP system.

  5. Navigate to Security Realms, then myrealm, then Users and Groups, then Groups.

  6. On the Groups tab, create the same group as that available in the JAZN file.

    For example, Bursar, or Admissions Manager.

  7. Navigate to Security Realms, then myrealm, then Users and Groups, then Users, and click on the newly created user.

  8. Click the Groups tab, associate the user with the appropriate application role along with the BIAuthors and BIConsumers roles, and save the changes.

  9. Click the Release Configuration button.

All role mappings are accomplished inside the JAZN file, which is provided with the Oracle BI Applications installation. Any new role mapping is a part of the customization effort and the JAZN file needs to be updated.

B.2.63 How to Set Up SIA Student Records Security for Peoplesoft

The Student Records module is secured by Academic Institution. From the object security perspective, the table below shows the default job roles that have Student Records module access.

Table B-31 Role names and Descriptions for Student Records

Role Name Description

Registrar

The Registrar is the head of the Student Records Office and is one of the key owners of the Student Information System.

Campus Solutions Administrator

Administrator of Campus Solutions.

Student

Student of the Academic Institution.


Note: The Student does not have access to the following three Subject Areas:

  • Institution summary

  • Class instructor

  • Class meeting pattern

The table below shows the duty roles and data security roles that are used by the Student Records module.

Table B-32 Role Names and Roles Types for Student Records

Role Name Role Type

OBIA_Student_Records_Analysis_Duty

Duty Role

OBIA_SIA_Admin_Analysis_Duty

Duty Role

OBIA_Student_Analysis_Duty

Duty Role

OBIA_STUDENT_INSTITUTION_DATA_SECURITY

Data Security Role

OBIA_STUDENT_DATA_SECURITY

Data Security Role


To set up SIA Student Records Security for Peoplesoft:

  1. Log in to the WebLogic console.

  2. Click the "Lock and Edit" button.

  3. Navigate to Security Realms, then myrealm, then Users and Groups, then Users.

  4. On the Users tab, create a new user.

    Ensure that the same user is present in the Peoplesoft Campus Solution OLTP system.

  5. Navigate to Security Realms, then myrealm, then Users and Groups, then Groups.

  6. On the Groups tab, create the same group as that available in the JAZN file.

    For example, Bursar, or Admissions Manager.

  7. Navigate to Security Realms, then myrealm, then Users and Groups, then Users, and click on the newly created user.

  8. Click the Groups tab, associate the user with the appropriate application role along with the BIAuthors and BIConsumers roles, and save the changes.

  9. Click the Release Configuration button.

All role mappings are accomplished inside the JAZN file, which is provided with the Oracle BI Applications installation. Any new role mapping is a part of the customization effort and the JAZN file needs to be updated.

B.2.64 How to Configure Projects Forecast Fact for PeopleSoft

Estimate to Complete (ETC, not to be confused with the Analysis Type value 'ETC') Cost and Revenue data is extracted from the PeopleSoft Projects Costing source for Project Forecast. You need to configure the Analysis Types used for Estimate to Complete in the PROJ_FORECAST_FILTER parameter in FSM.

In FSM, go to the Manage Data Load Parameters section; filter on Source PeopleSoft 9.0 or 9.1 FINSCM, Offering Oracle Project Analytics, and Functional Area Project Control and Costing.

For the variable PROJ_FORECAST_FILTER, set the Analysis Types for the Cost and Revenue metrics from the Projects Costing area, in quotes, for example: 'ETC','ETB'.
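
Conceptually, the value of PROJ_FORECAST_FILTER becomes an IN-list filter on the Analysis Type of the extracted Project Costing rows, as in the following simplified sketch. The actual extract SQL is generated by ODI; PS_PROJ_RESOURCE is shown here only as a representative PeopleSoft Project Costing source table.

-- Simplified sketch of how the configured Analysis Types restrict the forecast extract.
select pr.*
from   PS_PROJ_RESOURCE pr
where  pr.ANALYSIS_TYPE in ('ETC', 'ETB');  -- the value configured in PROJ_FORECAST_FILTER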

Identifying Project Forecast Costs and Revenue ETC Metrics Based on Analysis Type, Source Type, Category, and Subcategory Combination of Values

You must configure the following flat files to identify Project Forecast ETC Costs and Revenues based on an Analysis Type (mandatory), Source Type, Category, and Subcategory combination of values.

  • Configuring file_Project_Forecast_config_psft.csv

    The ETL process uses this flat file to designate which columns (Analysis Type, Source Type, Category, and Subcategory) are used in the lookup. A parameter specified in Oracle BI Applications Configuration Manager determines whether this lookup is performed for an implementation.

    Example (1): to configure the filter only on Analysis Type:

    Table B-33 Example data for Configuring file_Project_Forecast_config_psft.csv

    ROWID  ANALYSIS_TYPE  RESOURCE_TYPE  RESOURCE_CAT  RESOURCE_SUB_CAT  RETURN_VALUE
    1      1

    Example (2): to configure the filter on RESOURCE_TYPE and RESOURCE_CAT (ANALYSIS_TYPE is mandatory):

    Table B-34 Example data for Configuring file_Project_Forecast_config_psft.csv

    ROWID  ANALYSIS_TYPE  RESOURCE_TYPE  RESOURCE_CAT  RESOURCE_SUB_CAT  RETURN_VALUE
    1      1              1              1

  • Configuring file_Project_Forecast_psft.csv

    The ETL process uses this flat file to list all Analysis Type, Source Type, Category, and Subcategory combinations of values to use for Project Forecast ETC Cost and Revenue. Example for the configuration in (1) above:

    Table B-35 Example data for Configuring file_Project_Forecast_psft.csv

    ANALYSIS_TYPE  RESOURCE_TYPE  RESOURCE_CAT  RESOURCE_SUB_CAT  RETURN_VALUE
    ETC                                                           C
    ETB                                                           R


    Example for the configuration in (2):

    Table B-36 Example data for Configuring file_Project_Forecast_psft.csv

    ANALYSIS_TYPE  RESOURCE_TYPE  RESOURCE_CAT  RESOURCE_SUB_CAT  RETURN_VALUE
    ETC            LABOR          TECH                            C
    ETB            MATER          TECH                            R


B.2.65 How to Implement Security For Order Management Analytics

To implement security for Oracle Order Management Analytics, do the following:

B.2.65.1 How to implement OM Inventory Org Based Security for EBS

Overview

Order Management Analytics supports security over Inventory Organizations in OM subject areas. The list of Inventory Organizations that a user has access to is determined by the grants in EBS.

Configuring Inventory Org Based Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Inventory Org Based security for EBS, enable the E-Business Suite initialization block and make sure that the initialization blocks of all other source systems are disabled. The initialization block names for the various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Oracle Fusion Applications: SCOM_AN: SECURITY: Inv Org Shipments List

E-Business Suite: Inventory Organizations EBS

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Open the variable by navigating to: Manage, then Variables, then Session, then Variables, then Non-System, then INV_ORG.

  3. Open the initialization block by navigating to: Manage, then Variables, then Session, then Initialization blocks, then Inventory Organizations EBS.

  4. Clear the Disabled check box.

  5. Save the RPD.

B.2.65.1.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Order Management subject area.

  • Order Management Analyst

  • Order Management Executive

  • Order Fulfillment Analyst

  • Order Fulfillment Executive

These duty roles control which subject areas and dashboard content the user can access. They also ensure that the data security filters are applied to all queries.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.65.2 How to implement OM Inventory Org Based Security for Oracle Fusion Applications

Overview

Order Management Analytics supports security over Inventory Organizations in OM subject areas. The list of Inventory Organizations that a user has access to is determined by the grants in Oracle Fusion Applications.

Configuring Inventory Org Based Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Inventory Org Based security for Fusion, enable the Oracle Fusion initialization block and make sure that the initialization blocks of all other source systems are disabled. The initialization block names for the various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Oracle Fusion Applications: SCOM_AN: SECURITY: Inv Org Shipments List

E-Business Suite: Inventory Organizations EBS

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Open the variable by navigating to: Manage, then Variables, then Session, then Variables, then Non-System, then INV_ORG_SHIPMENTS.

  3. Open the initialization block by navigating to: Manage, then Variables, then Session, then Initialization blocks, then SCOM_AN:SECURITY:Inv Org Shipments List.

  4. Clear the Disabled check box.

  5. Save the RPD.

B.2.65.2.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Order Management subject area.

  • OBIA_SHIPPING_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_EXTENDED_SHIPPING_MANAGEMENT_ANALYSIS_DUTY

These duty roles control which subject areas and dashboard content the user can access. They also ensure that the data security filters are applied to all queries.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.65.3 How to implement OM Operating Unit Org-based Security for EBS

Overview

Order Management Analytics supports security over Operating Unit Organizations in OM subject areas. The list of Operating Unit Organizations that a user has access to is determined by the grants in EBS.

Configuring Operating Unit Org Based Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Operating Unit Org Based security for EBS, enable the E-Business Suite initialization block and make sure that the initialization blocks of all other source systems are disabled. The initialization block names for the various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Oracle Fusion Applications: Order Fulfillment Orchestration BU List

E-Business Suite: Operating Unit Organizations EBS

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Open the variable by navigating to: Manage, then Variables, then Session, then Variables, then Non-System, then OU_ORG____EBS.

  3. Open the initialization block by navigating to: Manage, then Variables, then Session, then Initialization blocks, then Operating Unit Organizations EBS.

  4. Clear the Disabled check box.

  5. Save the RPD.

B.2.65.3.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Order Management subject area.

  • Order Management Analyst

  • Order Management Executive

  • Order Fulfillment Analyst

  • Order Fulfillment Executive

These duty roles control which subject areas and dashboard content the user can access. They also ensure that the data security filters are applied to all queries.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.65.4 How to implement OM Operating Unit Org-based Security for Oracle Fusion Applications

Overview

Order Management Analytics supports security over Operating Unit Organizations in OM subject areas. The list of Operating Unit Organizations that a user has access to is determined by the grants on the Oracle Fusion Applications.

Configuring Operating Unit Org Based Security

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Operating Unit Org Based security for Fusion, enable the Oracle Fusion initialization block and make sure that the initialization blocks of all other source systems are disabled. The initialization block names for the various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Oracle Fusion Applications: Order Fulfillment Orchestration BU List

E-Business Suite: Operating Unit Organizations EBS

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Open the variable by navigating to: Manage, then Variables, then Session, then Variables, then Non-System, then OM_BU.

  3. Open the initialization block by navigating to: Manage, then Variables, then Session, then Initialization blocks, then Order Fulfillment Orchestration BU List.

  4. Clear the Disabled check box.

  5. Save the RPD.

B.2.65.4.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Order Management subject area.

  • OBIA_EXTENDED_ORDER_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_ORDER_MANAGEMENT_ANALYSIS_DUTY

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.65.5 How to Grant Cross Functional Access to Order Management Users

Overview

Clients access data from across the enterprise and deliver deep insight directly to business users. They perform cross-functional analysis to understand cause and effect relationships between key performance indicators across different departments. Cross-functional reporting from a variety of relational databases and data sources is also possible. Oracle BI Applications is a prepackaged data warehouse that enables historical analysis and cross-domain insight, and its common conformed dimensions enable cross-fact, cross-subject-area, and federated reporting.

Order Management enables users to analyze OM data along with data from Inventory, Accounts Receivable, GL Revenue, and so on. For example, cross-functional reporting across Order Management and Supply Chain can answer questions such as the following:

Q. How many of my top customers bought products from my worst suppliers?

Q. Which of my top suppliers are also my top customers?

E-Business Suite and JD Edwards EnterpriseOne adapters:

By default, Order Management's security implementation for the E-Business Suite and JD Edwards EnterpriseOne adapters is enabled with cross-functional capability. The following BI Duty Roles enable users to access the Order Management subject areas.

  • Order Management Analyst

  • Order Management Executive

  • Order Fulfillment Analyst

  • Order Fulfillment Executive

  • Order Management Analyst JDE

  • Order Management Executive JDE

  • Order Fulfillment Analyst JDE

  • Order Fulfillment Executive JDE

Table B-37 Duty Roles and Subject Areas

Duty Role Description Subject Areas:

Order Fulfillment Executive, Order Fulfillment Executive JDE.

This role provides secured access to Sales Order fulfillment managers and supply chain executives with insight into orders, backlogs, shipments and inventory to track fulfillment performance.

Sales - Invoice Lines

Sales - Schedule Lines

Sales - Backlog Lines

Sales - Pick Lines

Sales - Sales Revenue

Sales - Sales Receivables

Sales - Sales Overview

Sales - Orders, Backlog and Invoices

Sales - Order Process

Sales - Order Lines

Sales - Customer Activity

Sales - Inventory & Backlog

Sales - Backlog History

Order Fulfillment Analyst, Order Fulfillment Analyst JDE.

This role provides secured access to Sales Order fulfillment analysts with detailed insight into order line, booking line and backlog line details.

Sales - Backlog History

Sales - Backlog Lines

Sales - Booking Lines

Sales - Order Lines

Sales - Schedule Lines

Sales - Invoice Lines

Sales - Pick Lines

Sales - Sales Revenue

Sales - Sales Receivables

Sales - Sales Overview

Sales - Orders, Backlog and Invoices

Sales - Order Process

Sales - Customer Activity

Sales - Inventory & Backlog

Order Management Analyst, Order Management Analyst JDE.

This role provides secured access to Sales Order management analysts with detailed insight into order lines, booking line and invoice line details.

Sales - Overview

Sales - Booking Lines

Sales - Invoice Lines

Sales - Sales Revenue

Sales - Orders and Invoices

Sales - Orders, Backlog and Invoices

Sales - Order Lines

Sales - Customer Activity

Order Management Executive, Order Management Executive JDE.

This role provides order management executives with secured access to sales revenue, orders, invoices and backlog details.

Sales - Overview

Sales - Booking Lines

Sales - Invoice Lines

Sales - Sales Revenue

Sales - Orders and Invoices

Sales - Orders, Backlog and Invoices

Sales - Order Lines

Sales - Customer Activity


Fusion Adapter:

Order Management's security implementation for the Fusion adapter has two modes.

For Fusion embedded BI deployment, use these duty roles:

  • OBIA_ORDER_FULFILLMENT_ORCHESTRATION_BUSINESS_UNIT_DATA_SECURITY

  • OBIA_INVENTORY_ORGANIZATION_SHIPMENT_DATA_SECURITY

For OBIA standalone deployment, which requires cross functional reporting, use these duty roles:

  • OBIA_EXTENDED_ORDER_FULFILLMENT_ORCHESTRATION_BUSINESS_UNIT_DATA_SECURITY

  • OBIA_EXTENDED_INVENTORY_ORGANIZATION_SHIPMENT_DATA_SECURITY

The following table shows the subject areas granted to each duty role. Note that the 'extended' roles have broader access.

Table B-38 Duty Roles and Subject Areas

Duty Role Description Subject Areas

OBIA_SHIPPING_MANAGEMENT_ANALYSIS_DUTY

Description - This BI duty role is for Shipping Managers who are responsible for overseeing both processes and people for picking, packing and shipping items. This duty role allows Shipping Managers to get insight into Shipping, Backlogs, Inventory Transactions and Inventory Balances

Is a member of:

OBIA_INVENTORY_ORGANIZATION_SHIPMENT_DATA_SECURITY

Sales - Pick Lines

OBIA_EXTENDED_SHIPPING_MANAGEMENT_ANALYSIS_DUTY

Description - This duty role provides cross-module access to the shipping manager job role for stand-alone BI Apps content.

Member of - OBIA_EXTENDED_INVENTORY_ORGANIZATION_SHIPMENT_DATA_SECURITY

Sales - Pick Lines

Sales - Order Process

Sales - Inventory & Backlog

OBIA_ORDER_MANAGEMENT_ANALYSIS_DUTY

Description - This BI duty role is for Order Managers who are responsible for processing orders, managing backlogs and optimizing fulfillment performance. This duty role allows Order Managers to analyze Orders, Bookings, Holds, Orchestration Process, Shipping, Backlogs, Invoices and Inventory

Is a member of: OBIA_ORDER_FULFILLMENT_ORCHESTRATION_BUSINESS_UNIT_DATA_SECURITY

Sales - Backlog History

Sales - Backlog Lines

Sales - Booking Lines

Sales - Order Lines

Sales - Schedule Lines

Sales - Order Process

DOO Process Instances

Sales - Order Holds

OBIA_EXTENDED_ORDER_MANAGEMENT_ANALYSIS_DUTY

Description - This duty role provides cross-module access to the order manager job role for stand-alone BI Apps content. The cross-module access will include invoice, inventory, backlog, AR and shipping

Is a member of: OBIA_EXTENDED_ORDER_FULFILLMENT_ORCHESTRATION_BUSINESS_UNIT_DATA_SECURITY

Sales - Backlog History

Sales - Backlog Lines

Sales - Booking Lines

Sales - Order Lines

Sales - Schedule Lines

Sales - Invoice Lines

Sales - Pick Lines

Sales - Sales Revenue

Sales - Sales Receivables

Sales - Sales Overview

Sales - Orders, Backlog and Invoices

Sales - Orders & Invoices

Sales - Order Process

Sales - Customer Activity

Sales - Inventory & Backlog

Sales - Order Holds

DOO Process Instances


How to Grant Cross Functional Access to Order Management Users

Note: The following section describes a post-installation and optional configuration task.

  1. To enable OM users (such as the Order Manager and Shipping Manager) to perform deeper, cross-functional analysis beyond their regular duties, Oracle Supply Chain and Order Management Analytics configures data and functional security so that cross-functional information (such as inventory, backlog, and shipping information) can be accessed through extended duty roles. If you want to provision such a duty to Order Management users, follow the instructions in this task.

  2. Understanding Extended Duty Roles: The seeded security roles for Oracle BI Applications for Fusion Applications include the following additional duty roles. These extended roles are not mapped to any enterprise job roles by default, but they are pre-configured within Oracle BI Applications to enforce object and data level security for Inventory transactions.

  3. 'Extended Order Management Analysis Duty' role (Role name: OBIA_EXTENDED_ORDER_MANAGEMENT_ANALYSIS_DUTY) – This duty role provides cross-module access to the order manager job role for stand-alone Oracle BI Applications content. The cross-module access will include invoice, inventory, backlog, AR and shipping information. Data security on Oracle BI Applications is implemented using OBIA_ORDER_FULFILLMENT_ORCHESTRATION_BUSINESS_UNIT_DATA_SECURITY.

  4. 'Extended Shipping Management Analysis Duty' role (Role name: OBIA_EXTENDED_SHIPPING_MANAGEMENT_ANALYSIS_DUTY) – This duty role provides cross-module access to the shipping manager job role for stand-alone Oracle BI Applications content. The cross-module access will include inventory, backlog and orders information. Data security on Oracle BI Applications is implemented using 'OBIA_INVENTORY_ORGANIZATION_SHIPMENT_DATA_SECURITY'.

  5. Follow the steps below to implement Extended Duty roles in Supply Chain and Order Management Analytics:

    1. Assign BI duty 'OBIA_EXTENDED_ORDER_MANAGEMENT_ANALYSIS_DUTY' to Fusion Applications job role, 'Order Manager' or similar.

    2. Assign BI duty 'OBIA_EXTENDED_SHIPPING_MANAGEMENT_ANALYSIS_DUTY' to Fusion Applications job role, 'Shipping Manager' or similar.

    3. Assign appropriate Fusion Applications duty roles to the job role - 'Order Manager' and assign BU privileges. Data security of 'OBIA_ORDER_MANAGEMENT_ANALYSIS_DUTY' (Oracle BI Applications duty role) is controlled by the BUs assigned to the user.

    4. Customize Presentation Catalog permissions for subject areas that include cross-functional content (for example, Sales - Inventory & Backlog) and Subject Area permissions as desired for the roles mentioned above.

B.2.66 How to Deploy Audit Trail Stored Procedures

Stored procedures are a group of SQL statements that perform particular tasks on the database. For example, stored procedures can help improve the performance of the database. You deploy stored procedures by copying the stored procedure files from your Oracle BI Applications installation and deploying them to Oracle Business Analytics Warehouse.

Note: Some sessions may fail if these procedures are not compiled in the database before running the workflows.

To deploy stored procedures:

  1. Navigate to the <ORACLE_BI_HOME>/biapps/etl/etl_stored_procs/<database technology folder> folder.

    For example, for Oracle database, navigate to <ORACLE_BI_HOME>/biapps/etl/etl_stored_procs/oracle.

  2. Execute the SQL script 'FIND_AUDIT_VALUES.sql' to deploy the stored procedure into the Oracle Business Analytics Warehouse.

  3. Compile the stored procedures in the Oracle Business Analytics Warehouse schema.

    The schema is typically named something like: %PREFIX%_DW. For example, BIAPPS_DW.

Note: If you have problems deploying the stored procedures, refer to your database reference guide, or contact your database administrator.
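
As a quick sanity check (not part of the documented procedure), you can confirm that no procedures were left invalid in the warehouse schema after compilation, for example:

-- Run as the Oracle Business Analytics Warehouse schema owner (for example, BIAPPS_DW).
select object_name, object_type, status
from   user_objects
where  object_type = 'PROCEDURE'
and    status      <> 'VALID';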

B.2.67 How to Configure Aggregate tables for Inventory Balances and Transactions

About Configuring the Inventory Monthly Balance Tables

To configure the Inventory Monthly Balance (W_INVENTORY_DAILY_BALANCE_F) and Inventory Lot Monthly Balance (W_INV_LOT_MONTHLY_BAL_F) aggregate tables, you need to consider the aggregation level, the time period to update the aggregation, and the time period to keep records in the Inventory Balance tables. You need to configure three parameters to configure the Inventory Monthly Balance tables:

  • GRAIN

    The GRAIN parameter controls the time span for which the latest balance is kept. This parameter has a preconfigured value of Month. The possible values for the GRAIN parameter are:

    • DAY

    • WEEK

    • MONTH

    • QUARTER

    • YEAR

  • KEEP_PERIOD

    The KEEP_PERIOD parameter, in conjunction with NUM_OF_PERIOD, controls how many periods worth of data are retained in the Inventory Daily Balance tables. For example, if KEEP_PERIOD is CAL_MONTH and NUM_OF_PERIOD is 3, then only the most recent 3 months of data are retained. This parameter has a preconfigured value of CAL_MONTH. Values for the KEEP_PERIOD parameter include:

    • CAL_DAY

    • CAL_WEEK

    • CAL_MONTH

    • CAL_QTR

    • CAL_YEAR

  • NUM_OF_PERIOD

    The NUM_OF_PERIOD parameter has a preconfigured value of 3. The value for the NUM_OF_PERIOD parameter is a positive integer, for example, 1, 2, 3, and so on.

Note: If you need "Inventory Turns" data for a period older than 3 months, you must change the parameter values for KEEP_PERIOD and NUM_OF_PERIOD. For example, if you want data for the last 3 years, then set KEEP_PERIOD to CAL_YEAR and NUM_OF_PERIOD to 3.

About Configuring the Product Transaction Aggregate Table

There are two aggregation scenarios to configure the Product Transaction aggregate (W_PRODUCT_XACT_A) table—the initial ETL run and then the incremental ETL run.

For your initial ETL run, you need to configure the aggregation grain, using the GRAIN parameter, and the length of history kept in the Product Transaction fact table.

For the incremental ETL run, you need to configure the aggregation level, the update period in aggregation, and the length of history kept in the Product Transaction fact table, using the following parameters:

  • GRAIN

    The GRAIN parameter specifies the aggregation level. Valid values are DAY, WEEK, MONTH (preconfigured value), QUARTER, YEAR.

  • REFRESH_PERIOD

    The REFRESH_PERIOD parameter, together with NUM_OF_PERIOD, indicates the number of periods of records that will be refreshed from the transaction table to the aggregate table. Valid values are DAY, WEEK, MONTH (preconfigured value), QUARTER, YEAR.

  • NUM_OF_PERIOD

    The NUM_OF_PERIOD parameter, together with REFRESH_PERIOD, indicates the number of periods of records that will be refreshed from the transaction table to the aggregate table. Valid values are positive integers, for example, 1, 2, 3 (preconfigured value).

Before you run the initial ETL and then the incremental ETL to load the Product Transaction aggregate table, you need to configure the Product Transaction Aggregate Table, as follows.

To configure the Product Transaction Aggregate Table

You need to configure three parameters: REFRESH_PERIOD = 'MONTH', GRAIN = 'MONTH', and NUM_OF_PERIOD = 3.

To configure the Product Transaction aggregate table for the initial ETL run

  1. Retrieve the records in the Product Transaction fact (W_PRODUCT_XACT_F) table, and aggregate the records to the Product Transaction aggregate (W_PRODUCT_XACT_A) table at a certain grain level.

    For example, if GRAIN=MONTH then the records in the W_PRODUCT_XACT_F fact table are retrieved and aggregated to the W_PRODUCT_XACT_A table at a monthly level.

    Running the PLP_ProductTransactionAggregate scenario implements this step.

To configure the Product Transaction aggregate table for the incremental ETL run

  1. Delete the refreshed records from the Product Transaction aggregate (W_PRODUCT_XACT_A) table for a certain time.

    The REFRESH_PERIOD and the NUM_OF_PERIOD parameters determine the time period for the deletion.

    For example, if REFRESH_PERIOD=MONTH, NUM_OF_PERIOD=1, and the date is May 15, 2005, then all records for April and the current month (May) are deleted from the W_PRODUCT_XACT_A table (see the SQL sketch after this procedure).

    Running the PLP_ProductTransactionAggregate_Update scenario implements this step.

  2. Retrieve the records in the Product Transaction fact (W_PRODUCT_XACT_F) table, and aggregate the records to the W_PRODUCT_XACT_A table at a certain grain level.

    For example, if GRAIN=MONTH then the records in the W_PRODUCT_XACT_F fact table are retrieved and aggregated to the W_PRODUCT_XACT_A table at a monthly level.

    Running the PLP_ProductTransactionAggregate scenario implements this step.
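
The deletion window in step 1 of the incremental run can be pictured with the following simplified SQL sketch for an Oracle database, assuming REFRESH_PERIOD = MONTH and NUM_OF_PERIOD = 1. PERIOD_START_DT is a hypothetical column name used only for illustration; the actual logic is implemented inside the PLP_ProductTransactionAggregate_Update scenario.

-- On May 15, 2005 this deletes April and the current month (May), matching the example above.
delete from W_PRODUCT_XACT_A
where  PERIOD_START_DT >= add_months(trunc(sysdate, 'MM'), -1);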

B.2.68 How to Deploy the Stored Procedure for the Left Bound and Right Bound Calculation Option

The SIL_BOMItemFact mapping contains the stored procedure called COMPUTE_BOUNDS which traverses the exploded BOM tree structure and calculates the left bound and right bound. By default, the COMPUTE_BOUNDS stored procedure is turned off. If you want to turn on the procedure, see Section B.2.68.1, "How to Configure the Left Bound and Right Bound Calculation Option".

This procedure applies to E-Business Suite and Oracle Fusion source systems.

Note: This procedure is not required for JD Edwards EnterpriseOne, where the left and right bounds are calculated automatically by the UBE (R30461).

B.2.68.1 How to Configure the Left Bound and Right Bound Calculation Option

You can use the left bound and the right bound calculation to expedite some reporting requirements. For example, you can find the components in a subassembly within a finished product. The left bound and right bound are stored in the W_BOM_ITEM_F table; each BOM node has one row of data in that table. The COMPUTE_BOUNDS stored procedure traverses the exploded BOM tree structure and calculates the left bound and right bound. By default, the COMPUTE_BOUNDS stored procedure is turned off and the W_BOM_ITEM_F.LEFT_BOUNDS and W_BOM_ITEM_F.RIGHT_BOUNDS columns are empty.

The figure below illustrates a sample BOM structure with the left bound and right bound values listed for each node. To find all the components under node B, you select the components with a top product key value of A, a left bound value greater than 2, and a right bound value less than 17.

Figure B-3 Example BOM Structure

This diagram is described in surrounding text.
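
Using that example, a query such as the following SQL sketch returns all the components under node B. LEFT_BOUNDS and RIGHT_BOUNDS are the W_BOM_ITEM_F columns named above, while TOP_PRODUCT_KEY is a placeholder for whichever column identifies the top (finished) product in your warehouse table.

-- Sketch only: find every component in the subassembly rooted at node B
-- (left bound 2, right bound 17) of finished product A.
select *
from   W_BOM_ITEM_F
where  TOP_PRODUCT_KEY = 'A'   -- placeholder for the top product key column
and    LEFT_BOUNDS  > 2
and    RIGHT_BOUNDS < 17;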

You can use the following process to turn on the left bound and the right bound calculation and populate the W_BOM_ITEM_F.LEFT_BOUNDS and W_BOM_ITEM_F.RIGHT_BOUNDS columns.

To configure the left bound and right bound calculation option:

  1. In ODI, navigate to SILOS -> SIL_BOMItemFact -> Packages and edit the SIL_BOMItemFact package.

  2. Display the Diagram tab.

  3. Choose the 'Next step on success' tool (that is, the green OK arrow button).

  4. Draw a line connecting the Refresh IS_INCREMENTAL icon to the Run COMPUTE_BOUNDS icon.

  5. Draw a line connecting the Run COMPUTE_BOUNDS icon to the Run SIL_BOMItemFact icon.

  6. Save the Package.

  7. Generate the associated Scenario.

Note: The first step of the COMPUTE_BOUNDS ODI procedure attempts to create or replace the associated stored procedure in Oracle Business Analytics Warehouse. The user account under which the scenario runs must have the appropriate permissions for this step to succeed. Alternatively, the stored procedure can be deployed manually and the first step of the ODI procedure can then be disabled to avoid granting such permissions.
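As a rough sketch, if you choose to grant the scenario's user the required permission rather than deploying the stored procedure manually, the grant would typically look like the following (the user name is hypothetical and your security policy may differ):

-- Hypothetical example: allow the ETL user to create or replace the
-- COMPUTE_BOUNDS stored procedure in its own warehouse schema.
GRANT CREATE PROCEDURE TO obia_dw_etl_user;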

B.2.69 How to Configure Hours Per Absence Day

Purpose

This topic explains the configuration of the ODI variable HR_ABS_WORKING_HOURS_PER_DAY used for the hours-per-day calculation.

Optional or Mandatory

By default, Oracle BI Applications uses a SQL expression for the variable HR_ABS_WORKING_HOURS_PER_DAY that is based on the fast formula named TEMPLATE_BIS_DAYS_TO_HOURS, which is called by the source HR function hri_bpl_utilization.convert_days_to_hours(). If the template formula is used, this configuration is optional.

However, if the fast formula is customized at source, you must review and change the SQL expression used in the HR_ABS_WORKING_HOURS_PER_DAY variable.

Applies to

This applies to all extracts done for the Absences module from Oracle E-Business Suite 11.1.10 and R12.x.x.

Task description in detail

Check the logic used in the fast formula TEMPLATE_BIS_DAYS_TO_HOURS. If this fast formula is not customized, the default value for this variable works; otherwise, the variable value must be changed to a suitable SQL expression.

From the fast formula text, determine the values of (a) default hours per day, (b) working days per week, and (c) working days per month, and assign these values to the ODI variables HR_ABS_DFLT_HOURS_PER_DAY, HR_ABS_WORKING_DAYS_PER_WEEK, and HR_ABS_WORKING_DAYS_PER_MONTH respectively.

Review the rest of the fast formula text and determine the formula used for calculating working hours per day.

Based on this information, form the SQL expression that exactly fits the fast formula logic. Refer to the default SQL expression provided below for guidance.

The HR_ABS_WORKING_HOURS_PER_DAY variable is used in the interface SDE_ORA_AbsenceEventDimension.W_ABSENCE_EVENT_DS_SQ_PER_ABSENCE_ATTENDANCES_TMP for the column UTL_HOURS_IN_DAY.

The default SQL expression used in HR_ABS_WORKING_HOURS_PER_DAY is as follows:

ROUND(
  CASE
    WHEN tab.asg_freq IS NOT NULL AND tab.asg_hours IS NOT NULL THEN
      (CASE
         WHEN tab.asg_freq = 'W' THEN tab.asg_hours/tab.working_days_per_week
         WHEN tab.asg_freq = 'M' THEN tab.asg_hours/working_days_per_month
         WHEN tab.asg_freq = 'D' THEN asg_hours
         ELSE 0
       END)
    ELSE
      (CASE
         WHEN tab.full_freq IS NOT NULL AND tab.full_hours IS NOT NULL THEN
           (CASE
              WHEN tab.full_freq = 'W' THEN tab.full_hours/tab.working_days_per_week
              WHEN tab.full_freq = 'M' THEN tab.full_hours/working_days_per_month
              WHEN tab.full_freq = 'D' THEN full_hours
              ELSE 0
            END)
         ELSE dflt_hours_per_day
       END)
  END, 2)
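As a worked illustration of the expression above, a weekly assignment frequency ('W') with 40 assignment hours and HR_ABS_WORKING_DAYS_PER_WEEK set to 5 resolves to 8 working hours per day:

-- Illustration only: evaluates the weekly branch of the default expression
-- with sample literal values (40 hours per week, 5 working days per week).
SELECT ROUND(40/5, 2) AS utl_hours_in_day FROM dual;
-- Returns 8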

Dependency

None.

B.2.70 How to Configure Payroll Balance Filter

Purpose

This parameter is used to selectively extract balances into the warehouse. Limiting the balances extracted improves the performance of both ETL and reports. In addition, only certain types of balance are suitable for inclusion in the warehouse. You should extract only run balances, as other types of balances may not be fully additive (for example, year-to-date balances cannot be added together).

For both E-Business Suite Payroll and PeopleSoft North American Payroll, the customer needs a mechanism to choose which balances (for E-Business Suite Payroll) or earnings/deductions/taxes (for PeopleSoft North American Payroll) are tracked in the Pay Run Balance Detail fact table.

To ensure additivity of measures, only run balances are supported. For each payroll run, the actual run balances processed are stored. Because these are not broken down by context, run balances can be combined across time to form higher-level balances, for example, PTD, MTD, and YTD.

Optional or Mandatory

Optional for E-Business Suite Payroll and PeopleSoft North American Payroll, but highly recommended.

Applies to

E-Business Suite Payroll and PeopleSoft North American Payroll.

Dependency

Instructions

Create a custom table in the OLTP system with the list of balances that need to be extracted for reporting. The SDE ETL will extract only these balances from the source system. For example:

CREATE TABLE OBIA_PAY_BAL_FILTER (BALANCE_ID VARCHAR2 (50));
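You might then populate the table with the source balance identifiers to be extracted, for example (the balance IDs shown are purely illustrative):

-- Illustrative balance IDs only; use the balance identifiers from your
-- own source system.
INSERT INTO OBIA_PAY_BAL_FILTER (BALANCE_ID) VALUES ('110');
INSERT INTO OBIA_PAY_BAL_FILTER (BALANCE_ID) VALUES ('115');
COMMIT;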

The parameter HR_PAYROLL_FILTER_CLAUSE in ODI holds a SELECT statement against the custom table that the customer has created in the source system, as shown below.

SELECT <COLUMN_NAME> FROM <SCHEMA>.<TABLE_NAME>

For example: SELECT BALANCE_ID FROM EMDBO.OBIA_PAY_BAL_FILTER

If the customer does not create a custom table in the source system, the SDE extract fetches all the balances, which could lead to performance issues.

If you need to extract all balances, set this parameter to 1=1 (the default value on installation).

You set the value for variable HR_PAYROLL_FILTER_CLAUSE using Oracle BI Applications Configuration Manager.

For E-Business Suite Payroll or PeopleSoft North American Payroll, use the following settings:

  • To filter balances, use SELECT <COLUMN_NAME> FROM <TABLE_NAME>.

  • To extract all balances, use 1=1.

HR_PAYROLL_FILTER_CLAUSE parameter in ODI


This parameter is set to refresh the value from Oracle BI Applications Configuration Manager.

HR_PAYROLL_FILTER_CLAUSE parameter in Configuration Manager


B.2.71 How to Configure Accrual Extract Months for Incremental ETL

Purpose

This topic explains the configuration of the ODI variable HR_ACCRUAL_EXTRACT_MTHS_INCR used for extraction of HR Accrual module data when running in incremental mode.

Optional or Mandatory

By default, Oracle BI Applications sets the value of the HR_ACCRUAL_EXTRACT_MTHS_INCR variable to 3 months. This variable ensures that data from the last 3 months is refreshed in the warehouse when incremental ETL is executed. If this value is suitable, no changes are needed.

However, if incremental data collection must cover a period other than the last 3 months, then setting this variable accordingly is a mandatory step.

Applies to

This applies to all extracts done for the Accrual module from Oracle E-Business Suite 11.1.10 and R12.x.x.

Task description in detail

Accrual metrics are calculated on the fly and extracted from source. There are no source tables where E-Business Suite Accrual data is stored.

By default, the Oracle BI Applications extraction from Oracle E-Business Suite extracts incremental Accrual data for the last 3 months, counting back from the current day. The HR_ACCRUAL_EXTRACT_MTHS_INCR variable is therefore used in the incremental extract filter clause.
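As an illustration only, an incremental filter of the following form limits the extract to the configured number of months; the date column and the #VARIABLE reference style are assumptions, and the actual interface filter may differ:

-- Conceptual sketch of the incremental extract filter clause.
PER_TIME_PERIODS.END_DATE >= ADD_MONTHS(TRUNC(SYSDATE), -#HR_ACCRUAL_EXTRACT_MTHS_INCR)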

Note: Setting a higher value progressively degrades incremental extract performance. Choose this value judiciously based on incremental query performance.

Dependency

None.

B.2.72 How to Add New Country Code

In Oracle Business Analytics Warehouse, country is a data warehouse domain with the domain code set to 'W_COUNTRY'. The ISO alpha-2 letter code is stored as a domain code in Oracle BI Applications Configuration Manager.

New country codes are published by the ISO standard from time to time; for example, the South Sudan country code 'SS' was published on August 11, 2011. If new country codes are added in the OLTP, then make the following changes in Oracle BI Applications:

  1. In Oracle BI Applications Configuration Manager, add the new country code as a new domain code to domain 'W_COUNTRY'.

  2. In Oracle BI Applications Configuration Manager, add the new country code as a new domain code to domain 'COUNTRY' for the given Product Line Version.

  3. In Oracle BI Applications Configuration Manager, create the domain maps between the source domain 'COUNTRY' to the target domain 'W_COUNTRY' for the new country code.

  4. Reload the data warehouse table W_GEO_COUNTRY_D.

    The source is the csv file file_GeoCountry_ISO_Country_Codes_FUSION. This file needs to be updated with the new country code.

B.2.73 How to Configure Accrual Fast Formula Function Calls

Purpose

This document explains the configuration of the following ODI variables used by E-Business Suite Accrual Extract interfaces:

  • a) HR_ACCRUAL_PERIOD_GRANT_AMT

  • b) HR_ACCRUAL_BALANCE_AMT

  • c) HR_ACCRUAL_CARRYOVER_AMT

  • d) HR_ACCRUAL_ABSENCE_AMT

  • e) HR_ACCRUAL_OTHER_AMT

Optional or Mandatory

By default, Oracle BI Applications uses a SQL expression for each of the five variables listed above to execute the template fast formulas. Each SQL expression is a function call for the various metrics used in the Accrual extract, as follows:

  • OBIA_ACCRUAL_FUNCTIONS.GET_NET_ACCRUAL() - used in variables (a) and (b).

  • OBIA_ACCRUAL_FUNCTIONS.GET_CARRY_OVER() - used in variable (c).

  • OBIA_ACCRUAL_FUNCTIONS.GET_ABSENCE() - used in variable (d).

  • OBIA_ACCRUAL_FUNCTIONS.GET_OTHER_NET_CONTRIBUTION()- used in variable (e).

If the Accrual fast formulas are customized at source, this setup step is mandatory.

Applies to

This applies to all extracts done for Accrual Module from Oracle E-Business Suite 11.1.10 and R12.x.x.

Task description in detail

Configuring ODI variable HR_ACCRUAL_PERIOD_GRANT_AMT

This variable calls a function that fetches the Period Leave Accrual granted to an employee for a given Accrual Plan and Period.

The default value is:

OBIA_ACCRUAL_FUNCTIONS.GET_NET_ACCRUAL(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,PER_ALL_ASSIGNMENTS_F.PAYROLL_ID,PER_ALL_ASSIGNMENTS_F.BUSINESS_GROUP_ID,-1,PER_TIME_PERIODS.END_DATE,PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,PER_TIME_PERIODS.START_DATE,NULL)

When a customized function is called, the following example shows the expectation from the function call when an employee receives 1.5 days of period accrual grant per accrual period:

This screen shot is described in surrounding text.

If the template fast formula is customized the function call must also be suitably changed inside the ODI variable value.

A sample Accrual period is shown in the Additional Information/Notes section at the end of this topic.
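For illustration only, if the template fast formula were replaced by a customized one, the variable value would be changed to call the corresponding custom function. The package and function names below are hypothetical; the argument list must match whatever the custom function expects.

-- Hypothetical custom function call, following the same argument pattern
-- as the default value shown above.
XX_CUSTOM_ACCRUAL_FUNCTIONS.GET_NET_ACCRUAL(
  PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,
  PER_ALL_ASSIGNMENTS_F.PAYROLL_ID,
  PER_ALL_ASSIGNMENTS_F.BUSINESS_GROUP_ID,
  -1,
  PER_TIME_PERIODS.END_DATE,
  PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,
  PER_TIME_PERIODS.START_DATE,
  NULL)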

Configuring ODI variable HR_ACCRUAL_BALANCE_AMT

This variable calls a function that fetches the Balance Leave Accrual of an assignment for a given Accrual Plan and Period.

The default value is:

OBIA_ACCRUAL_FUNCTIONS.GET_NET_ACCRUAL(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,PER_ALL_ASSIGNMENTS_F.PAYROLL_ID,PER_ALL_ASSIGNMENTS_F.BUSINESS_GROUP_ID,-1,PER_TIME_PERIODS.END_DATE,PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID)

When a customized function is called, the following example shows the expectation from such a function call for an employee receiving 1.5 days of period accrual grant and no Absences, Carryovers, Other Net Accruals:

This screen shot is described in surrounding text.

If the template fast formula is customized the function call must also be suitably changed inside the ODI variable value.

Configuring ODI variable HR_ACCRUAL_CARRYOVER_AMT

This variable calls a function that fetches the Carryover amount when a new Accrual term begins.

The default value is:

OBIA_ACCRUAL_FUNCTIONS.GET_CARRY_OVER(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,PER_TIME_PERIODS.END_DATE,PER_TIME_PERIODS.START_DATE )

When a customized function is called, the following example shows the expectation from such a function.

This screen shot is described in surrounding text.

If the template fast formula is customized, the function call must also be suitably changed inside the ODI variable value. A sample Accrual term is shown in the Additional Information/Notes section at the end of this topic.

Configuring ODI variable HR_ACCRUAL_ABSENCE_AMT

This variable calls a function that fetches the Absence amount for a given accrual period.

The default value is:

OBIA_ACCRUAL_FUNCTIONS.GET_ABSENCE(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,PER_TIME_PERIODS.END_DATE,PER_TIME_PERIODS.START_DATE )

When a customized function is called, the following example shows the expectation from such a function.

This screen shot is described in surrounding text.

If the template fast formula is customized the function call must also be suitably changed inside the ODI variable value.

Configuring ODI variable HR_ACCRUAL_OTHER_AMT

This variable calls a function that fetches the Other Net Accrual amounts for a given accrual period.

The default value is:

OBIA_ACCRUAL_FUNCTIONS.GET_OTHER_NET_CONTRIBUTION(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,PER_TIME_PERIODS.END_DATE,PER_TIME_PERIODS.START_DATE )

When a customized function is called, the following example shows the expectation from such a function

This screen shot is described in surrounding text.

If the template fast formula is customized the function call must also be suitably changed inside the ODI variable value.

Dependency

None.

Additional Information/Notes

- Absences are always subtracted from the Accrual balance.

- Carryover is always added to the Accrual balance.

- Other Net Contributions are always added to the Accrual balance, but with the appropriate sign.

For example, if Accrual balance is 10 and Other Net Contribution is 2, then Net Accrual balance is 10+2=12. If Accrual balance is 10 and Other Net Contribution is (-3), then Net Accrual Balance is 10+(-3) = 7.

- The example data set below shows Accrual term and Accrual period.

This screen shot is described in surrounding text.

B.2.74 How to set up HR Supervisor Hierarchy Based Data Security


B.2.74.1 Introduction

Data can be secured via the HR Supervisor Hierarchy using list variables, with associated data roles and security filters that are applied at the physical SQL level as an IN (a, b, c, ...) clause.
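Conceptually, the list variable resolves at sign-on to the set of person IDs the user may see, and that set appears as an IN list in the generated physical SQL. The fragment below is a hypothetical illustration (the table alias, column name, and ID values are invented for clarity):

-- Illustration only: the HR Supervisor Hierarchy list expands to an IN
-- predicate in the physical SQL sent to the warehouse.
WHERE sup_hier.TOP_LVL_SOURCE_PERSON_ID IN (100, 101, 102)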

How to choose / assign the Duty Role

Each BI Apps Duty role grants access to one or more subject areas, and is a member of at least one data security role.

You need to map a source role/responsibility to one or more BI Apps Duty roles. For instructions on how this is done refer to the FSM task in Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

HR Data Role to Duty Role Mapping

Note: The following are applicable to HR Supervisor Hierarchy security.

Table B-39 HR Data Role to Duty Role Mapping

Data (Security) Role Duty Role

Line Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Line Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Payroll Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Payroll Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Compensation Analyst (secured by HR Supervisor Hierarchy List) AU BI Data

Compensation Analyst (secured by HR Supervisor Hierarchy List) AU BI Duty

Compensation Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Compensation Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Recruiting Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Recruiting Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Recruiting VP (secured by HR Supervisor Hierarchy List) AU BI Data

Recruiting VP (secured by HR Supervisor Hierarchy List) AU BI Duty

Time Collection Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Time Collection Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resource VP (secured by HR Supervisor Hierarchy List) AU BI Data

Human Resource VP (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resource Analyst (secured by HR Supervisor Hierarchy List) AU BI Data

Human Resource Analyst (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resource Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Human Resource Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Learning Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Learning Manager (secured by HR Supervisor Hierarchy List) AU BI Duty


B.2.74.2 Line Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Line Manager HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-40 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Line Manager HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-41 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Calendar

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Enrollment Events

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Absence Event

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Recruitment Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Time and Labor - Reported Time

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Time and Labor - Processed Time

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Summary

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.3 Payroll Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Payroll Manager HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-42 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Payroll Manager HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-43 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Summary

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Detail

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.4 Compensation Analyst (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Compensation Analyst HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-44 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Compensation Analyst HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-45 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Summary

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Detail

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.5 Compensation Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Compensation Manager HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-46 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Compensation Manager HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-47 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Summary

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Detail

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.6 Recruiting Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Recruiting Manager HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-48 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Recruiting Manager HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-49 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Recruitment Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.7 Recruiting VP (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Recruiting VP HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-50 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Recruiting VP HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-51 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Recruitment Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.8 Time Collection Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Time Collection Manager HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-52 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Time Collection Manager HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-53 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Time and Labor - Reported Time

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Time and Labor - Processed Time

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Summary

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.9 Human Resource VP (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Human Resource VP HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-54 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Human Resource VP HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-55 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Calendar

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Enrollment Events

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Absence Event

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Recruitment Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Summary

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Detail

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.10 Human Resource Analyst (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Human Resource Analyst HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-56 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Human Resource Analyst HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-57 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Summary

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Payroll Balance Detail

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.11 Human Resource Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Human Resource Manager HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-58 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Human Resource Manager HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-59 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Calendar

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Enrollment Events

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Absence Event

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Recruitment Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Balance Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Workforce Gains and Losses - Event Information

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.12 Learning Manager (secured by HR Supervisor Hierarchy List) AU BI Data

Initialization Blocks

The Learning Manager HR Supervisor Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-60 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_PERSON_ID

Not applicable, this is a multi source variable population see below.

HR_SEC_PERSON_ID____EBS

HR Security Person ID List (EBS)

HR_SEC_PERSON_ID____PSFT

HR Security Person ID List (PeopleSoft)


Data Security Role Filters

The Learning Manager HR Supervisor Hierarchy list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-61 Data Security Role Filters

Name Filter

Dim - HR Supervisor Hierarchy

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Calendar

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)

Fact - HR - Learning Enrollment Events

"Core"."Dim - HR Supervisor Hierarchy"."Top Level Source Person ID" = VALUEOF(NQ_SESSION.HR_SEC_PERSON_ID)


B.2.74.13 HR Duty Role to Oracle BI Applications HR Presentation Catalog Mapping

Note: The following are applicable to HR Supervisor Hierarchy security.

Table B-62 HR Duty Role to HR Presentation Catalog Mapping

Duty Role HR Presentation Catalog Mapping

Line Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resources – Time and Labor

Payroll Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources – Payroll

Human Resources – Time and Labor

Compensation Analyst (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources - Compensation

Human Resources – Payroll

Compensation Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources - Compensation

Human Resources – Payroll

Recruiting Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources – Recruitment

Recruiting VP (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources – Recruitment

Time Collection Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources – Time and Labor

Human Resource VP (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Payroll

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resources - Workforce Effectiveness

Human Resource Analyst (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resource Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Learning Manager (secured by HR Supervisor Hierarchy List) AU BI Duty

Human Resources - Learning Enrollment and Completion

Fusion Workforce Deployment Analysis Duty

Human Resources - Workforce Deployment

Fusion Compensation Analysis Duty

Human Resources - Compensation

Human Resources – Payroll

Fusion Absence and Leave Accrual Analysis Duty

Human Resources - Absence and Leave Accrual


B.2.75 How to set up Department Based Data Security


B.2.75.1 Introduction

Human Resource Analytics supports data security over Human Resources subject areas and facts via the Department dimension, using department list variables. Each variable is used by one or more data roles, and the data roles ensure that security filters are applied at the physical SQL level as an IN (a, b, c, ...) clause.

How to choose / assign the Duty Role

Each BI Apps Duty role grants access to one or more subject areas, and is a member of at least one data security role.

You need to map a source role/responsibility to one or more BI Apps Duty roles. For instructions on how this is done refer to the FSM task in Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

HR Data Role to Duty Role Mapping

Note: The following are applicable to Department security.

Table B-63 HR Data Role to Duty Role Mapping

Data (Security) Role Duty Role

Line Manager (secured by Department List) AU BI Data

Line Manager (secured by Department List) AU BI Duty

Payroll Manager (secured by Department List) AU BI Data

Payroll Manager (secured by Department List) AU BI Duty

Compensation Analyst (secured by Department List) AU BI Data

Compensation Analyst (secured by Department List) AU BI Duty

Compensation Manager (secured by Department List) AU BI Data

Compensation Manager (secured by Department List) AU BI Duty

Recruiting Manager (secured by Department List) AU BI Data

Recruiting Manager (secured by Department List) AU BI Duty

Recruiting VP (secured by Department List) AU BI Data

Recruiting VP (secured by Department List) AU BI Duty

Time Collection Manager (secured by Department List) AU BI Data

Time Collection Manager (secured by Department List) AU BI Duty

Human Resource VP (secured by Department List) AU BI Data

Human Resource VP (secured by Department List) AU BI Duty

Human Resource Analyst (secured by Department List) AU BI Data

Human Resource Analyst (secured by Department List) AU BI Duty

Human Resource Manager (secured by Department List) AU BI Data

Human Resource Manager (secured by Department List) AU BI Duty

Learning Manager (secured by Department List) AU BI Data

Learning Manager (secured by Department List) AU BI Duty


B.2.75.2 Line Manager (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_LINEMGR_DEPT_DATA_SECURITY.

Initialization Blocks

The Line Manager Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-64 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_LINEMGR_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_LINEMGR_LIST____EBS

HR - Line Manager - Department List (EBS)

HR_SEC_DEPT_LINEMGR_LIST____PSFT

HR - Line Manager - Department List (PeopleSoft)


Data Security Role Filters

The Line Manager Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-65 Data Security Role Filters

Name Filter

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Learning Enrollment Events

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Absence Event

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Recruitment Event Information

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Workforce - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Workforce - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Time and Labor - Reported Time

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Time and Labor - Processed Time

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Dim - Requisition Organization

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Learning Calendar

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)

Fact - HR - Payroll Balance Summary

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LINEMGR_LIST)


B.2.75.3 Payroll Manager (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_PYRLMGR_DEPT_DATA_SECURITY.

Initialization Blocks

The Payroll Manager Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-66 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_PYRLMGR_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_PYRLMGR_LIST____EBS

HR - Payroll Manager - Department List (EBS)

HR_SEC_DEPT_PYRLMGR_LIST____PSFT

HR - Payroll Manager - Department List (PeopleSoft)


Data Security Role Filters

The Payroll Manager Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-67 Data Security Role Filters

Name Filter

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_PYRLMGR_LIST)

Fact - HR - Payroll Balance Detail

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_PYRLMGR_LIST)

Fact - HR - Payroll Balance Summary

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_PYRLMGR_LIST)


B.2.75.4 Compensation Analyst (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_CMPALYST_DEPT_DATA_SECURITY.

Initialization Blocks

The Compensation Analyst Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-68 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_CMPALYST_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_CMPALYST_LIST____EBS

HR - Compensation Analyst - Department List (EBS)

HR_SEC_DEPT_CMPALYST_LIST____PSFT

HR - Compensation Analyst - Department List (PeopleSoft)


Data Security Role Filters

The Compensation Analyst Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-69 Data Security Role Filters

Name Filter

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPALYST_LIST)

Fact - HR - Payroll Balance Detail

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPALYST_LIST)

Fact - HR - Payroll Balance Summary

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPALYST_LIST)

Fact - HR - Workforce - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPALYST_LIST)

Fact - HR - Workforce - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPALYST_LIST)


B.2.75.5 Compensation Manager (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_CMPMGR_DEPT_DATA_SECURITY.

Initialization Blocks

The Compensation Manager Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-70 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_CMPMGR_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_CMPMGR_LIST____EBS

HR - Compensation Manager - Department List (EBS)

HR_SEC_DEPT_CMPMGR_LIST____PSFT

HR - Compensation Manager - Department List (PeopleSoft)


Data Security Role Filters

The Compensation Manager Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-71 Data Security Role Filters

Name Filter

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPMGR_LIST)

Fact - HR - Payroll Balance Detail

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPMGR_LIST)

Fact - HR - Payroll Balance Summary

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPMGR_LIST)

Fact - HR - Workforce - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPMGR_LIST)

Fact - HR - Workforce - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_CMPMGR_LIST)


B.2.75.6 Recruiting Manager (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_RCRTMTMGR_DEPT_DATA_SECURITY.

Initialization Blocks

The Recruiting Manager Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-72 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_RCRTMTMGR_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_RCRTMTMGR_LIST____EBS

HR - Recruiting Manager - Department List (EBS)

HR_SEC_DEPT_RCRTMTMGR_LIST____PSFT

HR - Recruiting Manager - Department List (PeopleSoft)


Data Security Role Filters

The Recruiting Manager Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-73 Data Security Role Filters

Name Filter

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_RCRTMTVP_LIST)

Dim - Requisition Organization

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_RCRTMTMGR_LIST)

Fact - HR - Recruitment Event Information

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_RCRTMTMGR_LIST)


B.2.75.7 Recruiting VP (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_RCRTMTVP_DEPT_DATA_SECURITY.

Initialization Blocks

The Recruiting VP Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-74 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_RCRTMTVP_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_RCRTMTVP_LIST____EBS

HR - Recruiting VP - Department List (EBS)

HR_SEC_DEPT_RCRTMTVP_LIST____PSFT

HR - Recruiting VP - Department List (PeopleSoft)


Data Security Role Filters

The Recruiting VP Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-75 Data Security Role Filters

Name Filter

Dim - Department

"Core"."Dim - Department"."Department Number" =

VALUEOF(NQ_SESSION.HR_SEC_DEPT_RCRTMTVP_LIST)

Dim - Requisition Organization

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_RCRTMTVP_LIST)

Fact - HR - Recruitment Event Information

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_RCRTMTVP_LIST)


B.2.75.8 Time Collection Manager (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_TLBRMGR_DEPT_DATA_SECURITY.

Initialization Blocks

The Time Collection Manager Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-76 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_TLBRMGR_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_TLBRMGR_LIST____EBS

HR - Time Collection Manager - Department List (EBS)

HR_SEC_DEPT_TLBRMGR_LIST____PSFT

HR - Time Collection Manager - Department List (PeopleSoft)


Data Security Role Filters

The Time Collection Manager Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-77 Data Security Role Filters

Name Filter

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_TLBRMGR_LIST)

Fact - HR - Time and Labor - Reported Time

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_TLBRMGR_LIST)

Fact - HR - Time and Labor - Processed Time

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_TLBRMGR_LIST)

Fact - HR - Workforce Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_TLBRMGR_LIST)

Fact - HR - Payroll Balance Summary

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_TLBRMGR_LIST)


B.2.75.9 Human Resource VP (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_HRVP_DEPT_DATA_SECURITY.

Initialization Blocks

The Human Resource VP Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-78 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_HRVP_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_HRVP_LIST____EBS

HR - Human Resource VP - Department List (EBS)

HR_SEC_DEPT_HRVP_LIST____PSFT

HR - Human Resource VP - Department List (PeopleSoft)


Data Security Role Filters

The Human Resource VP Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-79 Data Security Role Filters

Name Filter

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Learning Calendar

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Learning Enrollment Events

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Absence Event

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Recruitment Event Information

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Workforce - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Workforce - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Payroll Balance Detail

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Payroll Balance Summary

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Time and Labor - Reported Time

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Fact - HR - Time and Labor - Processed Time

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)

Dim - Requisition Organization

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRVP_LIST)


B.2.75.10 Human Resource Analyst (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_HRALYST_DEPT_DATA_SECURITY

Initialization Blocks

The Human Resource Analyst Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-80 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_HRALYST_LIST

Not applicable, this is a multi source variable population see below.

HR_SEC_DEPT_HRALYST_LIST____EBS

HR - Human Resource Analyst - Department List (EBS)

HR_SEC_DEPT_HRALYST_LIST____PSFT

HR - Human Resource Analyst - Department List (PeopleSoft)


Data Security Role Filters

The Human Resource Analyst Department list Security is applied depending on the roles the user is granted, and when it is applied it is supported by the following HR logical facts and dimensions:

Table B-81 Data Security Role Filters

Name Filter

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Fact - HR - Learning Enrollment Events

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Fact - HR - Absence Event

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Fact - HR - Recruitment Event Information

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Fact - HR - Workforce - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Fact - HR - Workforce - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Dim - Requisition Organization

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)

Fact - HR - Learning Calendar

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRALYST_LIST)


B.2.75.11 Human Resource Manager (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_HRMGR_DEPT_DATA_SECURITY

Initialization Blocks

The Human Resource Manager - Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-82 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_HRMGR_LIST

Not applicable; this is a multi-source variable populated from the source-specific variables below.

HR_SEC_DEPT_HRMGR_LIST____EBS

HR - Human Resources Manager - Department List (EBS)

HR_SEC_DEPT_HRMGR_LIST____PSFT

HR - Human Resources Manager - Department List (PeopleSoft)


Data Security Role Filters

The Human Resource Manager Department list security is applied depending on the roles that the user is granted. When it is applied, it is supported by the following HR logical facts and dimensions:

Table B-83 Data Security Role Filters

Name Filter

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Fact - HR - Learning Enrollment Events

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Fact - HR - Absence Event

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Fact - HR - Recruitment Event Information

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Fact - HR - Workforce - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Fact - HR - Workforce - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Dim - Requisition Organization

"Core"."Dim - Requisition Organization"."Requisition Organization Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)

Fact - HR - Learning Calendar

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_HRMGR_LIST)


B.2.75.12 Learning Manager (secured by Department List) AU BI Data

Name: OBIA_AU_HCM_LRNGMGR_DEPT_DATA_SECURITY

Initialization Blocks

The Learning Manager Department list is determined at user sign-on via one or more Initialization Blocks:

Table B-84 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_DEPT_LRNGMGR_LIST

Not applicable; this is a multi-source variable populated from the source-specific variables below.

HR_SEC_DEPT_LRNGMGR_LIST____EBS

HR - Learning Manager - Department List (EBS)

HR_SEC_DEPT_LRNGMGR_LIST____PSFT

HR - Learning Manager - Department List (PeopleSoft)


Data Security Role Filters

The Learning Manager Department list security is applied depending on the roles that the user is granted. When it is applied, it is supported by the following HR logical facts and dimensions:

Table B-85 Data Security Role Filters

Name Filter

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LRNGMGR_LIST)

Fact - HR - Learning Enrollment Events

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LRNGMGR_LIST)

Dim - Department

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LRNGMGR_LIST)

Fact - HR - Learning Calendar

"Core"."Dim - Department"."Department Number" = VALUEOF(NQ_SESSION.HR_SEC_DEPT_LRNGMGR_LIST)


B.2.75.13 HR Duty Role to Oracle BI Applications HR Presentation Catalog Mapping

Note: The following are applicable to Department security.

Table B-86 HR Duty Role to HR Presentation Catalog Mapping

BI Duty Roles HR Presentation Catalog Mapping

Line Manager (secured by Department List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resources - Time and Labor

Payroll Manager (secured by Department List) AU BI Duty

Human Resources - Payroll

Human Resources - Time and Labor

Compensation Analyst (secured by Department List) AU BI Duty

Human Resources - Compensation

Human Resources - Payroll

Compensation Manager (secured by Department List) AU BI Duty

Human Resources - Compensation

Human Resources - Payroll

Recruiting Manager (secured by Department List) AU BI Duty

Human Resources - Recruitment

Recruiting VP (secured by Department List) AU BI Duty

Human Resources - Recruitment

Time Collection Manager (secured by Department List) AU BI Duty

Human Resources - Time and Labor

Human Resource VP (secured by Department List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Payroll

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resources - Workforce Effectiveness

Human Resource Analyst (secured by Department List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resource Manager (secured by Department List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Learning Manager (secured by Department List) AU BI Duty

Human Resources - Learning Enrollment and Completion


B.2.76 How to Set Up Price Analytics Security for EBS

There is no row-level security applied to Price Analytics reports and metrics. Users who can access Price Analytics Subject areas can view all Order and Quote data in the related reports without any data-security filter.

Configuring BI Duty Roles

The table below lists BI Duty roles that can be assigned to users in order to give them access to Price Analytics Subject Areas.

Table B-87 BI Duty Roles and Subject Areas

BI Duty Roles Subject areas

Price Administrator

Sales – CRM Price Waterfall

Price Order Analytics

Sales – CRM Price Waterfall - Orders

Price Quote Analytics

(Member E-Business Suite responsibilities: Quoting User, Quoting Sales Agent, Quoting Sales Manager)

Sales – CRM Price Waterfall - Quotes


For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.77 How to Perform RPD Modifications for Cost and Revenue Time Grain Changes

This topic explains how to configure the grain of the Cost aggregate (W_PROJ_COST_A) and the Revenue aggregate (W_PROJ_REVENUE_A) to Period, Quarter, or Year. As installed by default, the grain of the cost aggregate and the revenue aggregate is set to Fiscal Period. However, you can modify the grain of each aggregate to Period, Quarter, or Year by setting the FSM parameters COST_TIME_GRAIN and REVENUE_TIME_GRAIN to 'PERIOD', 'QUARTER', or 'YEAR'. In addition, you must make the metadata repository changes described in this section.

Note: This section covers how to configure the parameters and the metadata repository changes that you must make using Oracle BI EE Administration Tool.

This topic contains the following sections:

Section B.2.77.1, "Setting the Time Grain Parameters in FSM"

Section B.2.77.2, "Changing the Time Grain of the Cost Aggregate table to Fiscal/Project/Enterprise Period"

Section B.2.77.3, "Changing the Time Grain of the Revenue Aggregate table to Fiscal/Project/Enterprise Period"

Section B.2.77.4, "Changing the Time Grain of the Cost Aggregate table to Fiscal/Project/Enterprise Quarter"

Section B.2.77.5, "Changing the Time Grain of the Revenue Aggregate table to Fiscal/Project/Enterprise Quarter"

Section B.2.77.6, "Changing the Time Grain of the Cost Aggregate table to Fiscal/Project/Enterprise Year"

Section B.2.77.7, "Changing the Time Grain of the Revenue Aggregate table to Fiscal/Project/Enterprise Year"

Note: Oracle recommends that you back up your metadata repository (RPD file) before making changes.

B.2.77.1 Setting the Time Grain Parameters in FSM

By default, the parameters COST_TIME_GRAIN and REVENUE_TIME_GRAIN are set to 'PERIOD'. If you want to change the grain of the aggregates, you must set these parameters to the desired levels and update the corresponding joins in the RPD to reference the appropriate tables.

To change the values in FSM, navigate to Manage Parameters, select 'COST_TIME_GRAIN' and click the Edit button.

To set the Time Grain Parameters in FSM:

  1. Navigate to Manage Parameters.

  2. Select COST_TIME_GRAIN and click the Edit button.

  3. In the Manage Parameter Default values area, specify a value in the Default Value field. The allowed values are:

    • PERIOD

    • QUARTER

    • YEAR

  4. Repeat the above steps for REVENUE_TIME_GRAIN.

B.2.77.2 Changing the Time Grain of the Cost Aggregate table to Fiscal/Project/Enterprise Period

This is the default configuration. You must ensure that the COST_TIME_GRAIN is set to PERIOD in FSM, and that the following RPD joins are in place.

  1. Verify the joins to Fiscal Calendar (Dim-Date Fiscal Calendar).

    In the Business Model and Mapping layer, select the 'Dim_W_MCAL_PERIOD_D_Fiscal_Period' Logical Table Source from the 'Dim - Date Fiscal Calendar' and the 'Fact_Agg_W_PROJ_COST_A_Project_Cost' and 'Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD' Logical Table Sources in 'Fact - Project Cost', then right-click and select 'Physical Diagram -> Selected Objects Only'. Verify the following physical joins, then click OK.

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_GL_ACCT_PRD_STRT_DAY_WID"
    
    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_END_DAY_WID" >=   "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_MCAL_CAL_WID"
    
    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_ACCT_PRD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_QTR_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_MCAL_CAL_WID"
    

    Join D:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_MCAL_CAL_WID"
    
  2. Verify the joins to Project Calendar (Dim-Date Project Calendar).

    In the Business Model and Mapping layer, select the 'Dim_W_MCAL_PERIOD_D_Project_Period' Logical Table Source from the 'Dim - Date Project Calendar' and the 'Fact_Agg_W_PROJ_COST_A_Project_Cost' and 'Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD' Logical Table Sources in 'Fact - Project Cost', then right-click and select 'Physical Diagram -> Selected Objects Only'. Verify the following physical joins, then click OK.

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_PRJ_ACCT_PRD_ST_DAY_WID"
    
    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PRJ_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PROJ_MCAL_CAL_WID"
    
    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PRJ_ACCT_PRD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_QTR_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PROJ_MCAL_CAL_WID"
    

    Join D:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PRJ_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PROJ_MCAL_CAL_WID"
    
  3. Verify the joins to Enterprise Calendar (Dim-Date).

    In the Business Model and Mapping layer, select the 'Dim_W_ENT_PERIOD_D' Logical Table Source from the 'Dim - Date' and the 'Fact_Agg_W_PROJ_COST_A_Project_Cost' and 'Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD' Logical Table Sources in 'Fact - Project Cost', then right-click and select 'Physical Diagram -> Selected Objects Only'. Verify the following physical joins, then click OK.

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."ENT_PERIOD_START_DAY_WID"
    
    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_END_DT_WID"
    
    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_QTR_D"."ENT_QTR_END_DT_WID"
    

    Join D:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_YEAR_D"."ENT_YEAR_END_DT_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."ENT_PERIOD_END_DAY_WID"
    
  4. Change the Content Aggregation Level in the Business Model and Mapping layer.

    As installed by default, the grain for the cost aggregate is set to Period against the dimensions Dim - Date Fiscal Calendar, Dim - Date Project Calendar, and Dim - Date. In the Business Model and Mapping layer, open the 'Fact_Agg_W_PROJ_COST_A_Project_Cost' and 'Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD' Logical Table Sources in 'Fact – Project Cost' and verify that the grain is set to the Period level.

  5. Save the changes.

    Run the Consistency Check and ensure that there are no errors, save the RPD file, and clear Oracle BI Enterprise Edition Cache. If you are making the changes in offline mode, then restart the Oracle BI Server and Oracle BI Presentation Services.

B.2.77.3 Changing the Time Grain of the Revenue Aggregate table to Fiscal/Project/Enterprise Period

This is the default configuration. You must ensure that REVENUE_TIME_GRAIN is set to 'PERIOD' in FSM and that the following RPD joins are in place.

  1. Verify the joins to Fiscal Calendar (Dim-Date Fiscal Calendar).

    In the Business Model and Mapping layer, select the 'Dim_W_MCAL_PERIOD_D_Fiscal_Period' Logical Table Source from the 'Dim - Date Fiscal Calendar' and the 'Fact_Agg_W_PROJ_REVENUE_A_Revenue' and 'Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD' Logical Table Sources in 'Fact - Project Revenue', then right-click and select 'Physical Diagram -> Selected Objects Only'. Verify the following physical joins, then click OK.

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."GL_ACCT_PERIOD_START_DAY_WID"
    
    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_MCAL_CAL_WID"
    
    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_ACCT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_QTR_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_MCAL_CAL_WID"
    

    Join D:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_MCAL_CAL_WID"
    
  2. Verify the joins to Project Calendar (Dim-Date Project Calendar).

    In the Business Model and Mapping layer, select the 'Dim_W_MCAL_PERIOD_D_Project_Period' Logical Table Source from the 'Dim - Date Project Calendar' and the 'Fact_Agg_W_PROJ_REVENUE_A_Revenue' and 'Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD' Logical Table Sources in 'Fact - Project Revenue', then right-click and select 'Physical Diagram -> Selected Objects Only'. Verify the following physical joins, then click OK.

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."PROJ_ACCT_PERIOD_START_DAY_WID"
    
    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_MCAL_CAL_WID"
    
    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_ACCT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_QTR_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_MCAL_CAL_WID"
    

    Join D:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_MCAL_CAL_WID"
    
  3. Verify the joins to Enterprise Calendar (Dim-Date).

    In the Business Model and Mapping layer, select the 'Dim_W_ENT_PERIOD_D' Logical Table Source from the 'Dim - Date' and the 'Fact_Agg_W_PROJ_REVENUE_A_Revenue' and 'Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD' Logical Table Sources in 'Fact - Project Revenue', then right-click and select 'Physical Diagram -> Selected Objects Only'. Verify the following physical joins, then click OK.

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."ENT_PERIOD_START_DAY_WID"
    
    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_END_DT_WID"
    
    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_QTR_D"."ENT_QTR_END_DT_WID"
    

    Join D:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_YEAR_D"."ENT_YEAR_END_DT_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."ENT_PERIOD_END_DAY_WID"
    
  4. Change the Content Aggregation Level in the Business Model and Mapping layer.

    As installed by default, the grain for the revenue aggregate is set to Period against the dimensions Dim - Date Fiscal Calendar, Dim - Date Project Calendar, and Dim - Date. In the Business Model and Mapping layer, open the 'Fact_Agg_W_PROJ_REVENUE_A_Revenue' and 'Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD' Logical Table Sources in 'Fact – Project Revenue' and verify that the grain is set to the Period level.

  5. Save the changes.

    Run the Consistency Check and ensure that there are no errors, save the RPD file, and clear Oracle BI Enterprise Edition Cache. If you are making the changes in offline mode, then restart the Oracle BI Server and Oracle BI Presentation Services.

B.2.77.4 Changing the Time Grain of the Cost Aggregate table to Fiscal/Project/Enterprise Quarter

If the grain of the Cost aggregate is at the quarter level, then you must ensure that COST_TIME_GRAIN is set to 'QUARTER' in FSM. In addition, make the following metadata changes for the Fiscal, Project, and Enterprise calendars:

  1. Delete the joins to Dim_W_MCAL_PERIOD_D_Fiscal_Period, Dim_W_MCAL_PERIOD_D_Project_Period, and Dim_W_ENT_PERIOD_D.

    Delete the existing physical joins from Fact_Agg_W_PROJ_COST_A_Project_Cost (under the logical fact 'Fact – Project Cost') to Dim_W_MCAL_PERIOD_D_Fiscal_Period (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_PERIOD_D_Project_Period (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_PERIOD_D (under the logical dimension 'Dim - Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_GL_ACCT_PRD_STRT_DAY_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_PRJ_ACCT_PRD_ST_DAY_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."ENT_PERIOD_START_DAY_WID"
    

    Delete the existing physical joins from Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD (under the logical fact 'Fact – Project Cost') to Dim_W_MCAL_PERIOD_D_Fiscal_Period (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_PERIOD_D_Project_Period (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_PERIOD_D (under the logical dimension 'Dim - Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_END_DAY_WID" >=   "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PRJ_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PROJ_MCAL_CAL_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_END_DT_WID"
    
  2. Create join to Dim_W_MCAL_QTR_D_Fiscal_Quarter.

    In the Business Model and Mapping layer, select the Dim_W_MCAL_QTR_D_Fiscal_Quarter and Dim_W_MCAL_YEAR_D_Fiscal_Year Logical Table Sources from the 'Dim - Date Fiscal Calendar' and the Fact_Agg_W_PROJ_COST_A_Project_Cost and Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD Logical Table Sources in 'Fact - Project Cost', then right-click, select 'Physical Diagram -> Selected Objects Only', and click OK. Create the following physical join:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_QTR_START_DAY_WID" = 
            "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_GL_ACCT_PERIOD_START_DAY_WID"
    

    And verify the following joins:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_ACCT_PERIOD_END_DAY_WID" <=       "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_QTR_END_DAY_WID" AND 
            "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_MCAL_CAL_WID" = 
            "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_YEAR_END_DAY_WID"
      >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_ACCT_PERIOD_END_DAY_WID"
            AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_CAL_WID" = 
            "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_MCAL_CAL_WID"
    
  3. Create joins to Dim_W_MCAL_QTR_D_Project_Quarter.

    In the Business Model and Mapping layer, select the Dim_W_MCAL_QTR_D_Project_Quarter and Dim_W_MCAL_YEAR_D_Project_Year Logical Table Sources from the 'Dim - Date Project Calendar' and the Fact_Agg_W_PROJ_COST_A_Project_Cost and Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD Logical Table Sources in 'Fact - Project Cost', then right-click, select 'Physical Diagram -> Selected Objects Only', and click OK. Create the following physical join:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_QTR_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_PRJ_ACCT_PRD_START_DAY_WID"
    

    And verify the following joins:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PRJ_ACCT_PRD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_QTR_END_DAY_WID" 
    AND "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PROJ_MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_YEAR_END_DAY_WID" >= 
    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PRJ_ACCT_PRD_END_DAY_WID" 
    AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PROJ_MCAL_CAL_WID"
    
  4. Create joins to Dim_W_ENT_QTR_D.

    In the Business Model and Mapping layer, select the Dim_W_ENT_QTR_D and Dim_W_ENT_YEAR_D Logical Table Sources from the 'Dim - Date' and the Fact_Agg_W_PROJ_COST_A_Project_Cost and Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD Logical Table Sources in 'Fact - Project Cost', then right-click, select 'Physical Diagram -> Selected Objects Only', and click OK. Create the following physical join:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_QTR_D"."ENT_QTR_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."ENT_PERIOD_START_DAY_WID"
    

    And verify the following joins:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."ENT_PERIOD_END_DAY_WID" <=  
            "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_QTR_D"."ENT_QTR_END_DT_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_YEAR_D"."ENT_YEAR_END_DT_WID" >= 
            "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."ENT_PERIOD_END_DAY_WID"
    
  5. Change the Content Aggregation Level in the Business Model and Mapping layer.

    As installed by default, the grain for the cost aggregate is set to Period against the dimensions Dim - Date Fiscal Calendar, Dim - Date Project Calendar, and Dim - Date.

    Instead of Fiscal/Project/Enterprise Period you must set this to Fiscal Quarter for Dim – Date Fiscal Calendar, Project Quarter for Dim – Date Project Calendar and Enterprise Quarter for Dim - Date.

  6. Save the changes.

    When these changes are complete, run the Consistency Check and ensure that there are no errors, save the RPD file, and clear the Oracle BI Enterprise Edition Cache. If you are making the changes in offline mode, then restart the Oracle BI Server and Oracle BI Presentation Services.

B.2.77.5 Changing the Time Grain of the Revenue Aggregate table to Fiscal/Project/Enterprise Quarter

If the grain of the Revenue aggregate is at the quarter level, then you must ensure that REVENUE_TIME_GRAIN is set to 'QUARTER' in FSM. Also, make the following metadata changes for the Fiscal, Project, and Enterprise calendars:

  1. Delete the joins to Dim_W_MCAL_PERIOD_D_Fiscal_Period, Dim_W_MCAL_PERIOD_D_Project_Period, and Dim_W_ENT_PERIOD_D.

    Delete the existing physical joins from Fact_Agg_W_PROJ_REVENUE_A_Revenue (under the logical fact 'Fact – Project Revenue') to Dim_W_MCAL_PERIOD_D_Fiscal_Period (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_PERIOD_D_Project_Period (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_PERIOD_D (under the logical dimension 'Dim - Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."GL_ACCT_PERIOD_START_DAY_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."PROJ_ACCT_PERIOD_START_DAY_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."ENT_PERIOD_START_DAY_WID"
    

    Delete the existing physical joins from Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD (under the logical fact 'Fact – Project Revenue') to Dim_W_MCAL_PERIOD_D_Fiscal_Period (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_PERIOD_D_Project_Period (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_PERIOD_D (under the logical dimension 'Dim - Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_MCAL_CAL_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_END_DT_WID"
    
  2. Create joins to Dim_W_MCAL_QTR_D_Fiscal_Quarter.

    In the Business Model and Mapping layer, select the Dim_W_MCAL_QTR_D_Fiscal_Quarter and Dim_W_MCAL_YEAR_D_Fiscal_Year Logical Table Sources from the 'Dim - Date Fiscal Calendar' and the Fact_Agg_W_PROJ_REVENUE_A_Revenue and Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD Logical Table Sources in 'Fact - Project Revenue', then right-click, select 'Physical Diagram -> Selected Objects Only', and click OK. Create the following physical join:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."GL_ACCT_PERIOD_START_DAY_WID"
    

    And verify the following joins:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_ACCT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_QTR_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Fiscal_Quarter"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_MCAL_CAL_WID"
    
  3. Create joins to Dim_W_MCAL_QTR_D_Project_Quarter.

    In the Business Model and Mapping layer, select the Dim_W_MCAL_QTR_D_Project_Quarter and Dim_W_MCAL_YEAR_D_Project_Year Logical Table Sources from the 'Dim - Date Project Calendar' and the Fact_Agg_W_PROJ_REVENUE_A_Revenue and Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD Logical Table Sources in 'Fact - Project Revenue', then right-click, select 'Physical Diagram -> Selected Objects Only', and click OK. Create the following physical join:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_QTR_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."PROJ_ACCT_PERIOD_START_DAY_WID"
    

    And verify the following joins:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_ACCT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_QTR_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_QTR_D_Project_Quarter"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_MCAL_CAL_WID"
    
  4. Create joins to Dim_W_ENT_QTR_D.

    In the Business Model and Mapping layer, select the Dim_W_ENT_QTR_D and Dim_W_ENT_YEAR_D Logical Table Sources from the 'Dim - Date' and the Fact_Agg_W_PROJ_REVENUE_A_Revenue and Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD Logical Table Sources in 'Fact - Project Revenue', then right-click, select 'Physical Diagram -> Selected Objects Only', and click OK. Create the following physical join:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_QTR_D"."ENT_QTR_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."ENT_PERIOD_START_DAY_WID"
    

    And verify the following joins:

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_QTR_D"."ENT_QTR_END_DT_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_YEAR_D"."ENT_YEAR_END_DT_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."ENT_PERIOD_END_DAY_WID"
    
  5. Change the Content Aggregation Level in the Business Model and Mapping layer.

    As installed by default, the grain for the revenue aggregate is set to Period against the dimensions Dim - Date Fiscal Calendar, Dim - Date Project Calendar, and Dim - Date.

    Instead of Fiscal/Project/Enterprise Period, you must set this to Fiscal Quarter for Dim – Date Fiscal Calendar, Project Quarter for Dim – Date Project Calendar, and Enterprise Quarter for Dim - Date.

  6. Save the changes.

    When these changes are complete, run the Consistency Check and ensure that there are no errors, save the RPD file, and clear the Oracle BI Enterprise Edition Cache. If you are making the changes in offline mode, then restart the Oracle BI Server and Oracle BI Presentation Services.

B.2.77.6 Changing the Time Grain of the Cost Aggregate table to Fiscal/Project/Enterprise Year

If the grain of the Cost aggregate is at the year level, then you must ensure that COST_TIME_GRAIN is set to 'YEAR' in FSM. Also, make the following metadata changes for the Fiscal, Project, and Enterprise calendars:

  1. Delete the joins to Dim_W_MCAL_PERIOD_D_Fiscal_Period, Dim_W_MCAL_PERIOD_D_Project_Period, and Dim_W_ENT_PERIOD_D.

    Delete the existing physical joins from Fact_Agg_W_PROJ_COST_A_Project_Cost (under the logical fact 'Fact – Project Cost') to Dim_W_MCAL_PERIOD_D_Fiscal_Period (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_PERIOD_D_Project_Period (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_PERIOD_D (under the logical dimension 'Dim - Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_GL_ACCT_PRD_STRT_DAY_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_PRJ_ACCT_PRD_ST_DAY_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."ENT_PERIOD_START_DAY_WID"
    

    Delete the existing physical joins from Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD (under the logical fact 'Fact – Project Cost') to Dim_W_MCAL_PERIOD_D_Fiscal_Period (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_PERIOD_D_Project_Period (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_PERIOD_D (under the logical dimension 'Dim - Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_END_DAY_WID" >=   "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PRJ_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PROJ_MCAL_CAL_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_END_DT_WID"
    
  2. Create joins to Dim_W_MCAL_YEAR_D_Fiscal_Year/ Dim_W_MCAL_YEAR_D_Project_Year/ Dim_W_ENT_YEAR_D.

    Create the following physical joins between the Fact_Agg_W_PROJ_COST_A_Project_Cost Logical Table Source (under the logical fact 'Fact – Project Cost') and Dim_W_MCAL_YEAR_D_Fiscal_Year (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_YEAR_D_Project_Year (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_YEAR_D (under the logical dimension 'Dim – Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_YEAR_START_DAY_WID" =    
            "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_GL_ACCT_PERIOD_START_DAY_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_YEAR_START_DAY_WID" =    
            "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."PRVDR_PRJ_ACCT_PRD_START_DAY_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_YEAR_D"."ENT_YEAR_START_DT_WID" = 
            "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost"."ENT_PERIOD_START_DAY_WID"
    
  3. Verify the joins to Dim_W_MCAL_YEAR_D_Fiscal_Year/ Dim_W_MCAL_YEAR_D_Project_Year/ Dim_W_ENT_YEAR_D.

    Ensure that there are joins from the Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD Logical Table Source in 'Fact - Project Cost' to the Dim_W_MCAL_YEAR_D_Fiscal_Year Logical Table Source from 'Dim - Date Fiscal Calendar', the Dim_W_MCAL_YEAR_D_Project_Year Logical Table Source from 'Dim - Date Project Calendar', and the Dim_W_ENT_YEAR_D Logical Table Source from 'Dim - Date'. These joins exist by default.

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_GL_MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PRJ_ACCT_PRD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."PRVDR_PROJ_MCAL_CAL_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_YEAR_D"."ENT_YEAR_END_DT_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_COST_A_Project_Cost_ITD"."ENT_PERIOD_END_DAY_WID"
    
  4. Change the Content Aggregation Level in the Business Model and Mapping layer.

    As installed by default, the grain for the cost aggregate is set to Period against the dimensions Dim - Date Fiscal Calendar, Dim - Date Project Calendar, and Dim - Date.

    Instead of Fiscal/Project/Enterprise Period, you must set this to Fiscal Year for Dim – Date Fiscal Calendar, Project Year for Dim – Date Project Calendar, and Enterprise Year for Dim - Date.

  5. Save the changes.

    When these changes are complete, run the Consistency Check and ensure that there are no errors, save the RPD file, and clear the Oracle BI Enterprise Edition Cache. If you are making the changes in offline mode, then restart the Oracle BI Server and Oracle BI Presentation Services.

B.2.77.7 Changing the Time Grain of the Revenue Aggregate table to Fiscal/Project/Enterprise Year

If the grain of the Revenue aggregate is at the year level, then you must ensure that REVENUE_TIME_GRAIN is set to 'YEAR' in FSM. Also, make the following metadata changes for the Fiscal, Project, and Enterprise calendars:

  1. Delete the joins to Dim_W_MCAL_PERIOD_D_Fiscal_Period, Dim_W_MCAL_PERIOD_D_Project_Period, and Dim_W_ENT_PERIOD_D.

    Delete the existing physical joins from Fact_Agg_W_PROJ_REVENUE_A_Revenue (under the logical fact 'Fact – Project Revenue') to Dim_W_MCAL_PERIOD_D_Fiscal_Period (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_PERIOD_D_Project_Period (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_PERIOD_D (under the logical dimension 'Dim - Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."GL_ACCT_PERIOD_START_DAY_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_START_DAY_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."PROJ_ACCT_PERIOD_START_DAY_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."ENT_PERIOD_START_DAY_WID"
    

    Delete the existing physical joins from Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD (under the logical fact 'Fact – Project Revenue') to Dim_W_MCAL_PERIOD_D_Fiscal_Period (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_PERIOD_D_Project_Period (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_PERIOD_D (under the logical dimension 'Dim - Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Fiscal_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_PERIOD_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_PERIOD_D_Project_Period"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_MCAL_CAL_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."ENT_PERIOD_END_DAY_WID" <= "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_PERIOD_D"."ENT_PERIOD_END_DT_WID"
    
  2. Create joins to Dim_W_MCAL_YEAR_D_Fiscal_Year/ Dim_W_MCAL_YEAR_D_Project_Year/ Dim_W_ENT_YEAR_D.

    Create the following additional physical joins between the Fact_Agg_W_PROJ_REVENUE_A_Revenue Logical Table Source (under the logical fact 'Fact – Project Revenue') and Dim_W_MCAL_YEAR_D_Fiscal_Year (under the logical dimension 'Dim – Date Fiscal Calendar'), Dim_W_MCAL_YEAR_D_Project_Year (under the logical dimension 'Dim – Date Project Calendar'), and Dim_W_ENT_YEAR_D (under the logical dimension 'Dim – Date').

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_YEAR_START_DAY_WID" =    
    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."GL_ACCT_PERIOD_START_DAY_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_YEAR_START_DAY_WID" =    
    "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."PROJ_ACCT_PERIOD_START_DAY_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_YEAR_D"."ENT_YEAR_START_DT_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue"."ENT_PERIOD_START_DAY_WID"
    
  3. Verify the joins to Dim_W_MCAL_YEAR_D_Fiscal_Year/ Dim_W_MCAL_YEAR_D_Project_Year/ Dim_W_ENT_YEAR_D.

    Ensure that there are joins from the Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD Logical Table Source in 'Fact - Project Revenue' to the Dim_W_MCAL_YEAR_D_Fiscal_Year Logical Table Source from 'Dim - Date Fiscal Calendar', the Dim_W_MCAL_YEAR_D_Project_Year Logical Table Source from 'Dim - Date Project Calendar', and the Dim_W_ENT_YEAR_D Logical Table Source from 'Dim - Date'. These joins exist by default.

    Join A:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Fiscal_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."GL_MCAL_CAL_WID"
    

    Join B:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_YEAR_END_DAY_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_ACCT_PERIOD_END_DAY_WID" AND "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_MCAL_YEAR_D_Project_Year"."MCAL_CAL_WID" = "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."PROJ_MCAL_CAL_WID"
    

    Join C:

    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_ENT_YEAR_D"."ENT_YEAR_END_DT_WID" >= "Oracle Data Warehouse"."Catalog"."dbo"."Fact_Agg_W_PROJ_REVENUE_A_Revenue_ITD"."ENT_PERIOD_END_DAY_WID"
    
  4. Change the Content Aggregation Level in the Business Model and Mapping layer.

    As installed by default, the grain for the revenue aggregate is set to Period against the dimensions Dim - Date Fiscal Calendar, Dim - Date Project Calendar, and Dim - Date.

    Instead of Fiscal/Project/Enterprise Period, you must set this to Fiscal Year for Dim – Date Fiscal Calendar, Project Year for Dim – Date Project Calendar, and Enterprise Year for Dim - Date.

  5. Save the changes.

    When these changes are complete, run the Consistency Check and ensure that there are no errors, save the RPD file, and clear the Oracle BI Enterprise Edition Cache. If you are making the changes in offline mode, then restart the Oracle BI Server and Oracle BI Presentation Services.

B.2.78 How to Configure E-Business Suite Accrual Extract Mode

Purpose

This document explains the configuration of the ODI variable HR_ACCRUAL_EXTRACT_MODE used for E-Business Suite Accrual extraction programs.

Depending on the value of HR_ACCRUAL_EXTRACT_MODE, the variable HR_ACCRUAL_THREADS_TOTAL also needs to be set up as part of the extraction.

Optional or Mandatory

This is a mandatory step. The default value of HR_ACCRUAL_EXTRACT_MODE must be reviewed and its impact must be understood before setting a value or continuing with the default value, which is NOPARALLEL. See the 'Task Description in Detail' section for further information on this mode.

Applies to

This applies to all extracts for the Accrual module from Oracle E-Business Suite 11.1.10 and R12.x.x.

Task description in detail

The E-Business Suite Accrual extraction uses fast formula calls to extract accrual data for assignments. Fast formula function calls are inherently slow and may cause performance problems. However, if the number of assignments is small and/or the number of periods of history being collected is small, review the time taken to call the fast formulas for the various metrics. See the Additional Information section below for the SQL used to estimate the timings.

There are 3 modes that can be used:

  • NOPARALLEL: This value is used when Accrual extraction runs in single-thread mode, which ensures that the Accrual extraction is done as part of the Accrual Load Plan. ODI requires permission to create a PL/SQL package in the E-Business Suite source schemas for this mode to work. Use this mode when the data volume is small, for example, in an incremental load or when the HR_ACCRUAL_EXTRACT_DATE parameter is set to a value that fetches a smaller data set. This is the default extraction mode.

  • PARALLEL: This value is used when Accrual extraction runs in parallel threads, which improves loading speed. To configure this mode, assign a value to the HR_ACCRUAL_THREADS_TOTAL variable; its numeric value determines the number of parallel threads spawned by the accrual source extract program. The default value is 8, meaning that eight parallel threads are spawned. You can extend this up to 10 threads, in which case the load plan steps for thread 9 and thread 10 must be enabled (by default, eight parallel steps are enabled). See the summary sketch after this list.

  • STANDALONE: This value is used when the Accrual extraction process is not part of the Accrual Load Plan and is executed independently, in a standalone manner, before the Accrual Load Plan runs. This prevents the Accrual Load Plan from spending too much time on the Accrual extract interfaces. Use standalone mode when the extraction volume in a full load is high and the extraction takes a long time to complete. This mode also uses the PL/SQL-based wrapper approach, so ODI requires permission to create a PL/SQL package in the E-Business Suite source schemas for this mode to work.
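
For quick reference, the following sketch summarizes the two ODI variables discussed above with example settings. The notation is schematic and the values are illustrative only, not recommendations; set the values on the corresponding ODI variables as described in the mode descriptions above.

-- Illustrative sketch only (not actual ODI syntax or recommended values).
-- HR_ACCRUAL_EXTRACT_MODE:  one of NOPARALLEL (default), PARALLEL, STANDALONE
-- HR_ACCRUAL_THREADS_TOTAL: used with PARALLEL mode; default 8, can be extended
--                           to 10 if the load plan steps for threads 9 and 10 are enabled
HR_ACCRUAL_EXTRACT_MODE  = PARALLEL
HR_ACCRUAL_THREADS_TOTAL = 8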

Dependency

The incremental load extraction depends on the value set for HR_ACCRUAL_EXTRACT_DATE. When this variable is set to a value that fetches a larger data set, STANDALONE mode is best.

Additional Information

The following SQL can be used to estimate the Accrual Extraction Time:

SELECT PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID ,PER_TIME_PERIODS.END_DATE,
PER_UTILITY_FUNCTIONS.GET_NET_ACCRUAL(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,PER_ALL_ASSIGNMENTS_F.PAYROLL_ID,
PER_ALL_ASSIGNMENTS_F.BUSINESS_GROUP_ID,-1,PER_TIME_PERIODS.END_DATE,PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,PER_TIME_PERIODS.START_DATE,NULL),
PER_ACCRUAL_CALC_FUNCTIONS.GET_CARRY_OVER(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,
PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,PER_TIME_PERIODS.END_DATE,PER_TIME_PERIODS.START_DATE ),
PER_ACCRUAL_CALC_FUNCTIONS.GET_ABSENCE(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,
PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,PER_TIME_PERIODS.END_DATE,
PER_TIME_PERIODS.START_DATE ),
PER_ACCRUAL_CALC_FUNCTIONS.GET_OTHER_NET_CONTRIBUTION(PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID,
PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ID,PER_TIME_PERIODS.END_DATE,PER_TIME_PERIODS.START_DATE)
FROM   APPS.PAY_ELEMENT_ENTRIES_F   PAY_ELEMENT_ENTRIES_F,
       APPS.PAY_ELEMENT_LINKS_F     PAY_ELEMENT_LINKS_F,
       APPS.PAY_ELEMENT_TYPES_F     PAY_ELEMENT_TYPES_F,
       APPS.PER_ALL_ASSIGNMENTS_F   PER_ALL_ASSIGNMENTS_F,
       APPS.PER_TIME_PERIODS        PER_TIME_PERIODS,
       APPS.PAY_ACCRUAL_PLANS       PAY_ACCRUAL_PLANS
WHERE  (1=1)
       AND (PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_TYPE IN ('E','C'))
       AND (PER_TIME_PERIODS.END_DATE < SYSDATE)
       AND (PER_TIME_PERIODS.END_DATE > #HR_ACCRUAL_EXTRACT_DATE) -- Set the start date
       AND (PER_ALL_ASSIGNMENTS_F.PAYROLL_ID IS NOT NULL )
       AND (PER_ALL_ASSIGNMENTS_F.ASSIGNMENT_ID=PAY_ELEMENT_ENTRIES_F.ASSIGNMENT_ID)
       AND (PER_ALL_ASSIGNMENTS_F.PAYROLL_ID=PER_TIME_PERIODS.PAYROLL_ID)
       AND (PAY_ELEMENT_ENTRIES_F.ELEMENT_LINK_ID=PAY_ELEMENT_LINKS_F.ELEMENT_LINK_ID)
       AND (PAY_ELEMENT_LINKS_F.ELEMENT_TYPE_ID=PAY_ELEMENT_TYPES_F.ELEMENT_TYPE_ID)
       AND (PAY_ELEMENT_LINKS_F.ELEMENT_TYPE_ID=PAY_ACCRUAL_PLANS.ACCRUAL_PLAN_ELEMENT_TYPE_ID)
       AND (PER_TIME_PERIODS.END_DATE BETWEEN PAY_ELEMENT_ENTRIES_F.EFFECTIVE_START_DATE AND PAY_ELEMENT_ENTRIES_F.EFFECTIVE_END_DATE)
       AND (PER_TIME_PERIODS.END_DATE BETWEEN PAY_ELEMENT_LINKS_F.EFFECTIVE_START_DATE AND PAY_ELEMENT_LINKS_F.EFFECTIVE_END_DATE)
       AND (PER_TIME_PERIODS.END_DATE BETWEEN PER_ALL_ASSIGNMENTS_F.EFFECTIVE_START_DATE AND PER_ALL_ASSIGNMENTS_F.EFFECTIVE_END_DATE)
       AND (PER_TIME_PERIODS.END_DATE BETWEEN PAY_ELEMENT_TYPES_F.EFFECTIVE_START_DATE AND PAY_ELEMENT_TYPES_F.EFFECTIVE_END_DATE);
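
Note that #HR_ACCRUAL_EXTRACT_DATE is an ODI variable, so it must be replaced with a literal date before the query can be run directly against the source database. For example, in SQL*Plus you might time the query as follows (the date literal is only an illustration):

SET TIMING ON
-- Run the estimation query above, replacing the ODI variable with a literal, for example:
--   AND (PER_TIME_PERIODS.END_DATE > TO_DATE('2010-01-01','YYYY-MM-DD'))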

B.2.79 How to Configure GL Account Hierarchies using FSG definitions for E-Business Suite

You must configure GL account hierarchies if you are deploying Oracle Financial Analytics, Oracle Procurement and Spend Analytics, and Oracle Supply Chain and Order Management Analytics.

For information on how to configure Hierarchies using GL Accounting flexfields value sets definitions, see Section B.2.20, "How to Configure GL Account and GL Segments for Oracle E-Business Suite".

If you need to define GL account hierarchies based on multiple segments within a chart of accounts, then you can use the Oracle FSG report definition in E-Business Suite to define them.

First use the Oracle FSG form to define a row set or a column set; Oracle BI Applications then extracts the row set or column set definition and converts it into a hierarchy.

Oracle FSG hierarchies are extracted from the following E-Business Suite source tables:

  • RG_REPORT_AXIS_CONTENTS

    This table defines the relationship between the FSG report axis and GL code combinations. The GL code combinations with segment values within the value range defined for that axis are categorized as children of that axis.

  • RG_REPORT_AXIS_SETS

    This table stores information for each row set or column set that you define. There is one record for each row set or column set. Each record includes an axis set identifier, a row or column set name, and a structure identifier that assigns a specific chart of accounts to the row set or column set.

  • RG_REPORT_CALCULATIONS

    This table stores formulas for calculating each row or column in the row set or column set. An example of a row calculation might be to sum the amounts from the previous five rows. An example of a column calculation might be to calculate column five by subtracting column four from column three.

For example, in an Income Statement, 'Net Income' is the calculation result of 'Gross Profit from Revenue' minus 'Total Expense'. When converted to a hierarchy, Net Income becomes the parent of 'Gross Profit from Revenue' and 'Total Expense'. Therefore, the hierarchy can be derived based on the information in RG_REPORT_CALCULATIONS.

The following diagram shows an example hierarchy, with the top level Net Income node having two child nodes, Total Expense, and Gross Profit from Revn, and the Total Expense node having two child nodes, Operating Expense, and Depreciation Expense.

The following diagram shows how an income statement is derived from a hierarchy:

This screenshot or diagram is described in surrounding text.

This hierarchy would be converted into a flattened hierarchy and stored in W_HIERARCHY_D in the following format:

Table B-88 Example of Flattened Hierarchy Stored in W_HIERARCHY_D

HIER Name | HIER1 | HIER2 | HIER3 | HIER4 | ... | HIER20

Income Statement | Net Income | Gross Profit... | Gross Profit... | Gross Profit... | ... | Gross Profit...

Income Statement | Net Income | Total Expenses | Operating Expenses | Operating Expenses | ... | Operating Expenses

Income Statement | Net Income | Total Expenses | Depreciation Expense | Depreciation Expense | ... | Depreciation Expense


Fact tables join to the W_HIERARCHY_D table through the GL Account dimension table (W_GL_ACCOUNT_D).

The W_GL_ACCOUNT_D table contains six fields (HIER1_WID, HIER2_WID, HIER3_WID, ..., HIER6_WID), which are foreign keys to W_HIERARCHY_D.row_wid. Therefore, each General Ledger Code combination can participate in up to six different hierarchies. You can decide which of the six hierarchies to drill on based on the column you use to join to W_HIERARCHY_D. For example, if you want to drill using the third hierarchy, you use W_GL_ACCOUNT_D.hier3_wid = W_HIERARCHY_D.row_wid.
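
For illustration only, the following query sketch drills on the third hierarchy. The join columns W_GL_ACCOUNT_D.HIER3_WID and W_HIERARCHY_D.ROW_WID come from the text above; the fact table (W_GL_BALANCE_F), its foreign key (GL_ACCOUNT_WID), the measure column (BALANCE_ACCT_AMT), and the level column (HIER4_NAME) are assumptions for this sketch and should be verified against your warehouse schema:

SELECT h.HIER4_NAME,                        -- assumed level-name column in W_HIERARCHY_D
       SUM(f.BALANCE_ACCT_AMT) AS amount    -- assumed measure column on the fact table
FROM   W_GL_BALANCE_F  f,                   -- assumed fact table for this example
       W_GL_ACCOUNT_D  ga,
       W_HIERARCHY_D   h
WHERE  f.GL_ACCOUNT_WID = ga.ROW_WID        -- assumed fact-to-dimension foreign key
AND    ga.HIER3_WID     = h.ROW_WID         -- drill using the third hierarchy, as described above
GROUP BY h.HIER4_NAME;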

Note:

Mathematical operators, such as '+', '-', '*', '/' (addition, subtraction, multiplication, division, and so on) are not extracted from the FSG definitions. For example, both A + B = C and A - B = C would give the same hierarchy, with a node C having two child nodes A and B, as shown in the following diagram:

This diagram shows node C having two child nodes A and B.

About the ETL Process for Oracle FSG Report

Before you run the ETL process for GL accounts, you must specify the hierarchies that you want to reference. To specify the FSG hierarchies that you want to reference, use the file file_gl_hierarchy_assignment_ora.csv.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

Figure B-4 Example values in file_gl_hierarchy_assignment_ora.csv

This screenshot or diagram is described in surrounding text.

In this file, for each chart of accounts, you can specify up to six FSG hierarchies using axis_set_id, which is a column from the RG_REPORT_AXIS_SETS table. It is the unique ID of the row set or column set that you want to store in the GL account dimension table for the code combinations that belong to that chart of accounts.

The DATASOURCE_NUM_ID field specifies the data source to which the configurations apply. If you have multiple source systems, there might be a chart of accounts across the multiple source systems with the same ID. Therefore, you must use the DATASOURCE_NUM_ID value to distinguish between them.

For example, suppose you have an income statement FSG report and a balance sheet FSG report and you want to input both of their hierarchy structures into the data warehouse. Oracle BI Applications assumes that both reports are derived from the same set of GL accounts with CHART_OF_ACCOUNTS=101. The axis_set_id of the income statement is 1001, and for the balance sheet, it is 1003. The DATASOURCE_NUM_ID for this application is 2.

In addition, for those GL accounts that belong to the two reports, assume you want to associate their HIER1 column (in W_GL_ACCOUNT_D) with the income statement hierarchy structure and their HIER3 column with the balance sheet hierarchy structure.

In this case, you would add one row into file_gl_hierarchy_assignment_ora.csv with fields set as follows:

CHART OF ACCOUNTS - 101

HIER1_AXIS_SET_ID - 1001

HIER3_AXIS_SET_ID - 1003

DATASOURCE_NUM_ID - 2

(Leave the other row values blank.)

This row indicates that, for all of the GL accounts with CHART_OF_ACCOUNTS=101 and DATASOURCE_NUM_ID=2, the hierarchies with axis_set_id = 1001, null, 1003, null, null, null are assigned to the HIER1 through HIER6 columns respectively. Therefore, after extraction and loading, for those affected GL account rows, the HIER1 column will be the foreign key to the income statement hierarchy row ID in W_HIERARCHY_D, and the HIER3 column will be the foreign key to the balance sheet hierarchy row ID in W_HIERARCHY_D.
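
For reference, a sketch of what this row might look like in the CSV file, assuming the columns are ordered CHART_OF_ACCOUNTS, HIER1_AXIS_SET_ID through HIER6_AXIS_SET_ID, and DATASOURCE_NUM_ID (verify the order against the header row of your installed file):

    101,1001,,1003,,,,2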

Note: Axis_set_id must be specified in file_gl_hierarchy_assignment_ora.csv for Financial Analytics to load the hierarchies.

To set up hierarchies with FSG Report Definition:

  1. Configure the file_gl_hierarchy_assignment_ora.csv file to specify the hierarchies you want to reference for each CHART_OF_ACCOUNTS, as follows:

    1. Edit the file file_gl_hierarchy_assignment_ora.csv.

      Note:

      The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

      Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

      Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

      Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

    2. Specify the segments that you want to analyze.

    3. Save and close the file.

  2. Review the configuration provided by default in the BI metadata repository relating to GL Account Hierarchy using FSG:

    1. Six physical table aliases for GL Account Hierarchy using FSG are created, with joins to the GL Account Dimension table (Dim_W_GL_ACCOUNT_D).

      This screen shot is described in surrounding text.
    2. Logical tables for the above six dimension hierarchy physical tables are created along with the BMM joins to relevant logical facts.

    3. Appropriate logical Levels and content filters have been set for the 6 FSG Logical dimensions that are provided by default.

      This screen shot is described in surrounding text.
    4. All relevant Logical Table sources of the Logical Fact tables have been set with necessary Aggregation content for the six Logical dimensions that are provided by default.

      This screen shot is described in surrounding text.
  3. The following additional configuration might be needed to expose the necessary attributes relating to FSG.

    1. Using Oracle BI EE Administration Tool, in the Presentation layer of the Oracle BI Repository, drag the new hierarchies from appropriate Logical dimensions into the Presentation folder.

    2. If required, then rename the hierarchies in the presentation layers.

B.2.80 How to Configure Domains and Member Mappings for Employment Dimension

Purpose

The Employment Dimension has a number of conformed domains which are used in many of the HCM metrics. These domains must be configured correctly for the reports to contain accurate information.

Optional or Mandatory

This task is optional; however, the default configuration may not adequately reflect the OLTP setup, so review it to ensure that the reports are accurate.

Applies to

All sources.

Task description in detail

Configure the domain mappings related to the Employment Dimension. The most important one of these, as it is used by many metrics, is the mapping to worker type and subtype. These are designed as a hierarchy, where worker types from different systems can be conformed onto a single classification. Worker Type is primarily used to distinguish between Employees and Contingent Workers, and Worker Subtype gives a more detailed breakdown within each type.

The domain mapping for Worker Type is derived from the source domain "System Worker Type and User Worker Type". This is derived differently for each source system, and examples are given below for each.

Example for E-Business Suite

The System Worker Type and User Worker Type domain is based on:

  • System Person Type

  • User Person Type

Example Requirements: By default contingent workers are all grouped together. Add a sub-type of contingent workers for Interns, identified by the corresponding User Person Type.

Example Implementation:

  1. Add the following mappings to the domain map System Worker Type and User Worker Type -> Worker Subtype:

    CWK~INTERN -> CONTINGENT_INTERN

    The remaining definition for regular contingent workers is already seeded, so no change is required.

    The table below shows how the resulting domain mappings will look, with rows 1 and 2 showing the seeded domain mappings:

Table B-89 Example Domain Mappings

Source Member Code | Column 1 Member Code | Column 2 Member Code | Target Member Code

CWK~__ANY__ | CWK | __ANY__ | CONTINGENT_CONTINGENT

EMP~__ANY__ | EMP | __ANY__ | EMPLOYEE_REGULAR

CWK~INTERN | CWK | INTERN | CONTINGENT_INTERN


Note: Multiple matches are allowed; for example, a contingent worker with User Person Type "Intern" would match the mapping to either CONTINGENT_CONTINGENT or CONTINGENT_INTERN. The exact match on User Person Type takes precedence over the "any" type, so the result would be CONTINGENT_INTERN.

Example for Peoplesoft

The System Worker Type and User Worker Type domain is based on:

  • Organizational Relationship

  • Empl Class

Example Requirements: By default contingent workers are all grouped together. Add a sub-type of contingent workers for Interns, identified by the corresponding Empl Class within Set Id XXX.

Example Implementation:

  1. Add the following mappings to the domain map System Worker Type and User Worker Type -> Worker Subtype:

    CWR~XXX:I -> CONTINGENT_INTERN

    The remaining definition for regular contingent workers is already seeded, so no change is required.

    The table below shows how the resulting domain mappings will look, with rows 1 to 7 showing the seeded domain mappings:

Table B-90 Example Domain Mappings

Source Member Code | Column 1 Member Code | Column 2 Member Code | Target Member Code

CWR~STD:G | CWR | STD:G | CONTINGENT_TEMP

CWR~STD:I | CWR | STD:I | CONTINGENT_INTERN

CWR~STD:T | CWR | STD:T | CONTINGENT_TRAINEE

CWR~__ANY__ | CWR | __ANY__ | CONTINGENT_CONTRACTOR

EMP~STD:E | EMP | STD:E | EMPLOYEE_EXPATRIATE

EMP~__ANY__ | EMP | __ANY__ | EMPLOYEE_REGULAR

POI~__ANY__ | POI | __ANY__ | NONWORKER

CWR~XXX:I | CWR | XXX:I | CONTINGENT_INTERN


Note: Multiple matches are allowed; for example, a contingent worker with Empl Class "Intern" would match the mapping to either CONTINGENT_CONTRACTOR or CONTINGENT_INTERN. The exact match on Empl Class takes precedence over the "any" class, so the result would be CONTINGENT_INTERN.

Example for Fusion

The setup in Fusion is exactly the same as for E-Business Suite.

Dependency

None.

B.2.81 How to Configure Projects Costing Burden Cost for PeopleSoft

Actual Costs are extracted from Project Costing for all Analysis Types within the project's Actual Cost Analysis Group.

All costs extracted will be loaded into the Cost Fact Line table as Raw Cost unless you perform one or both of the following configurations:

B.2.81.1 Identifying Project Cost Burden Costs based on Analysis Type

To use this identification during the ETL process, you need to set the variable BURDEN_ANALYSIS_TYPE to 1 in FSM.

The ETL process uses the file_Project_Cost_Burden_Analysis_Type_psft.csv flat file to list all Analysis Types for Project Cost Burden Cost.

If the ETL process finds the Analysis Type in this flat file, it will not perform further lookups against other lookup tables to determine Project Cost Burden Cost.

To identify the Project Cost Burden Costs based on Analysis Type:

  1. Edit the file file_Project_Cost_Burden_Analysis_Type_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter the list of Analysis Types to be considered as Burden Costs.

    The format is XXX,1, where XXX is an Analysis Type. The 1 is a return value that indicates that this is a Burden Cost.

    The following example shows how to classify Costs with BUR and BRD Analysis Types as Burden Costs:

    BUR,1
    BRD,1
    
  3. Save and close the file.

Note:

The FSM parameter BURDEN_ANALYSIS_TYPE is common to both the Cost and Budget subject areas for identifying Burden Costs.

If the requirements differ between Project Budget and Project Cost in your implementation, you can create separate FSM and ODI variables. Note: If you create separate FSM variables for the budget and cost facts to identify burden costs, the ETL must be modified to use these new variables.

B.2.81.2 Identifying Project Cost Burden Costs based on a Source Type, Category, and Subcategory Combination of Values

To use this identification during the ETL process, you need to set the variable BURDEN_TYPECATSUB to 1 in FSM.

You must configure the following flat files to identify Project Cost Burden Costs based on a Source Type, Category, and Subcategory combination of values:

  • file_Project_Cost_Burden_TypeCatSub_config_psft.csv - Use this flat file to specify which of the columns (Source Type, Category, and Subcategory) are used in the lookup.

  • file_Project_Cost_Burden_TypeCatSub_psft.csv - Based on the columns enabled in the configuration file, use this flat file to list all Source Type, Category, and Subcategory combinations of values to treat as Project Cost Burden Cost.

Note:

Both Project Budget and Project Cost use these flat files, along with an FSM parameter, to load data into the W_PROJ_LOOKUP_PS table and identify Burden Costs.

You can customize these files if the requirements differ between Project Budget and Project Cost in your implementation.

To configure the file_Project_Cost_Burden_TypeCatSub_config_psft.csv file (Config file):

  1. Edit the file file_Project_Cost_Burden_TypeCatSub_config_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter only one row with RowID of 1. Enter a Y in each column that represents the combination to be evaluated as a Project Cost Burden Cost. The columns are:

    Row ID
    Source Type
    Category
    Subcategory
    

    The following is an example of using a combination of Source Type and Category:

    1,Y,Y
    

    (For Source Type and SubCategory combination it would be 1,Y,,Y.)

  3. Save and close the file.

To configure the file_Project_Cost_Burden_TypeCatSub_psft.csv file (Data file):

  1. Edit the file file_Project_Cost_Burden_TypeCatSub_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter a list of Resource Type, Resource Category, and Resource Subcategory combinations to be considered as Project Cost Burden Costs. The format is:

    XXXXX,XXXXX,XXXXX,1

    XXXXX represents each combination of Resource Type, Resource Category, and Resource Subcategory.

    The 1 is a return value that indicates that this is a Burden Cost. Each combination of lookup values must be specified. Wildcards are not supported.

    The following example shows how to classify Costs with G&A or FRNG Source Types as Project Cost Burden Costs:

    G&A,,,1

    FRNG,,,1

    Note: This CSV file is used in conjunction with the file_Project_Cost_Burden_TypeCatSub_config_psft.csv configuration file. In this example, this configuration file would contain the value 1,Y.

  3. Save and close the file.

B.2.82 How to Configure Project Commitment Fact for EBS

Oracle Project Analytics for E-Business Suite includes a Project Commitments subject area that provides the ability to report on project commitments. Project commitments include total raw and burdened amounts for requisitions, purchase orders, and supplier invoices for organizations, projects, tasks, resources, suppliers, and associated hierarchies. The subject area provides the ability to track commitments at the commitment document level.

Oracle Business Analytics Warehouse includes a star schema to support the Project Commitments subject area. This star contains metrics to report on total commitments and their components, which include quantity and amount (raw and burdened) for requisitions, purchase orders, and supplier invoices.

The W_PROJ_COMMITMENT_F fact table at the center of the star schema stores the latest commitment data, sourced from the transactional source PA_COMMITMENT_TXNS.

Configuration of Commitment Snapshot

Because commitment data is transient, a snapshot table, W_PROJ_COMMITMENT_SNP_F, is populated. The grain of data in the snapshot table is controlled by the ETL parameter PROJ_COMMITMENT_GRAIN, which is specified using a Task in FSM and can have the values WEEK, MONTH, QUARTER, and YEAR. For example, PROJ_COMMITMENT_GRAIN = 'WEEK' means that the snapshot table stores one snapshot per week: if the ETL is run multiple times within a week, each run overwrites the previous snapshot for that week, so the Friday record is kept and a new record is generated the following Monday for the new week. The default is 'Month'.

Set the value for this parameter in FSM by navigating to the Manage Data Load Parameters section and filtering for the Oracle Project Analytics offering.

B.2.83 Manage Domains and Member Mappings for Time and Labor - Reported Time

Purpose

The Time and Labor - Reported Time dimension has a number of conformed domains which are used in many of the Time and Labor metrics. These domains must be configured correctly for the reports to contain accurate information.

Optional or Mandatory

This task is optional; the default values may prove good enough.

Applies to

E-Business Suite and PeopleSoft.

Task description in detail

Configuring the domains on the Time and Labor - Reported Time dimension is key to successfully attributing time reporting entries to the warehouse reporting unit of measure and punch types.

Source Time Entry Unit of Measure Code -> Timecard Unit of Measure Code

This task is optional; the default values may prove good enough.

Used to identify how Source Time Entry Unit of Measure Code maps to delivered target Timecard Unit of Measure Code domain members. The target domain is Extensible - customers can add to but not delete from it.

Example for E-Business Suite

The Source Time Entry Unit of Measure Code is currently always assumed to be Hours.

Example Implementation

Table B-91 Source Member Codes and Target Member Codes

Source Member Code (Name) | Target Member Code (Name)

HOURS (Hours) | HOURS (Hours)


Source TRC Type Code~Source Unit Of Measure -> Timecard Unit of Measure Code

This task is optional; the default values may prove good enough.

Used to identify how the combination of Source TRC Type Code and Source Unit of Measure maps to delivered target Timecard Unit of Measure Code domain members. The target domain is Extensible - customers can add to but not delete from it.

This domain is a Multi Code domain member type; it uses two source domains (Source TRC Type Code & Source Unit of Measure) in combination to map to a target domain.

Example for PeopleSoft

Source TRC Type Code

On PeopleSoft the Source TRC Type Code is the PSXLATITEM TRC_TYPE_CODE.

Source Unit of Measure (UOM)

On PeopleSoft the Source Unit of Measure is the PSXLATITEM TBA.

Example Implementation

Table B-92 Source TRC Type Codes, Source Units of Measure, and Target Member Codes

Source TRC Type Code | Source Unit Of Measure | Target Member Code

A (Amount) | __ANY__ (Any) | AMOUNT (Amount)

H (Hours) | __ANY__ (Any) | HOURS (Hours)


Source Timecard Punch Type Code -> Timecard Punch Type Code

This task is optional; the default values may prove good enough.

Used to identify how Source Timecard Punch Type Code maps to delivered target Timecard Punch Type Code domain members. The target domain is Extensible - customers can add to but not delete from it.

Example for E-Business Suite

The Source Timecard Punch Type Code is currently always assumed to be Elapsed.

Example Implementation

Table B-93 Source Member Codes and Target Member Codes

Source Member Code (Name) | Target Member Code (Name)

ELAPSED (Elapsed) | ELAPSED (ELAPSED)


Example for PeopleSoft

Source Timecard Punch Type Code

On PeopleSoft the Source Timecard Punch Type Code is the PSXLATITEM PUNCH_TYPE.

Example Implementation

Table B-94 Source Member Codes and Target Member Codes

Source Member Code (Name) | Target Member Code (Name)

0 (Elapsed) | ELAPSED (ELAPSED)

1 (In) | IN (In)

2 (Out) * | OUT (Out)

3 (Meal) | MEAL (Meal)

4 (Break) | BRK (Break)

5 (Transfer) | XFR (Transfer)


* At the time of writing, "Out" punches are not extracted into the data warehouse, to reduce volume.

B.2.84 How to set up HR Position Hierarchy Based Data Security


B.2.84.1 Introduction

Data can be secured via the HR Position Hierarchy using list variables, with associated data roles and security filters that are applied at the physical SQL level as JOIN conditions with the variables.

How to choose / assign the Duty Role

Each BI Apps Duty role grants access to one or more subject areas, and is a member of at least one data security role.

You need to map a source role/responsibility to one or more BI Apps Duty roles. For instructions on how this is done refer to the FSM task in Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

HR Data Role to Duty Role Mapping

Note: The following are applicable to HR Position Hierarchy security.

Table B-95 HR Data Role to Duty Role Mapping

Data (Security) Role | Duty Role

Line Manager (secured by HR Position Hierarchy List) AU BI Data | Line Manager (secured by HR Position Hierarchy List) AU BI Duty

Payroll Manager (secured by HR Position Hierarchy List) AU BI Data | Payroll Manager (secured by HR Position Hierarchy List) AU BI Duty

Compensation Analyst (secured by HR Position Hierarchy List) AU BI Data | Compensation Analyst (secured by HR Position Hierarchy List) AU BI Duty

Compensation Manager (secured by HR Position Hierarchy List) AU BI Data | Compensation Manager (secured by HR Position Hierarchy List) AU BI Duty

Recruiting Manager (secured by HR Position Hierarchy List) AU BI Data | Recruiting Manager (secured by HR Position Hierarchy List) AU BI Duty

Recruiting VP (secured by HR Position Hierarchy List) AU BI Data | Recruiting VP (secured by HR Position Hierarchy List) AU BI Duty

Time Collection Manager (secured by HR Position Hierarchy List) AU BI Data | Time Collection Manager (secured by HR Position Hierarchy List) AU BI Duty

Human Resource VP (secured by HR Position Hierarchy List) AU BI Data | Human Resource VP (secured by HR Position Hierarchy List) AU BI Duty

Human Resource Analyst (secured by HR Position Hierarchy List) AU BI Data | Human Resource Analyst (secured by HR Position Hierarchy List) AU BI Duty

Human Resource Manager (secured by HR Position Hierarchy List) AU BI Data | Human Resource Manager (secured by HR Position Hierarchy List) AU BI Duty

Learning Manager (secured by HR Position Hierarchy List) AU BI Data | Learning Manager (secured by HR Position Hierarchy List) AU BI Duty


B.2.84.2 Line Manager (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Line Manager HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-96 Initialization Blocks

Variable Name | Initialization Block Name
HR_SEC_POS_LIST | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_VER | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_ID | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_LIST____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_LIST____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_LIST____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_VER____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_VER____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_VER____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_ID____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_ID____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_ID____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_LVL_POS_ID | HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

The Line Manager HR Position Hierarchy list security is applied depending on the roles granted to the user. When applied, it is supported by the following HR logical facts and dimensions:

Table B-97 Data Security Role Filters

The same security filter is applied to each of the logical objects listed in this table:

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

The filter is applied to:

  • Dim - HR Position Hierarchy
  • Fact - HR - Learning Calendar
  • Fact - HR - Learning Enrollment and Completion
  • Fact - HR - Learning Enrollment Events
  • Fact - HR - Absence Event
  • Fact - HR - Recruitment Event Information
  • Fact - HR - Workforce Event Information
  • Fact - HR - Workforce Balance Information
  • Fact - HR - Accrual Transactions - Event Information
  • Fact - HR - Accrual Transactions - Balance Information
  • Fact - HR - Time and Labor - Reported Time
  • Fact - HR - Time and Labor - Processed Time


B.2.84.3 Payroll Manager (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Payroll Manager HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-98 Initialization Blocks

Variable Name | Initialization Block Name
HR_SEC_POS_LIST | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_VER | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_ID | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_LIST____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_LIST____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_LIST____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_VER____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_VER____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_VER____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_ID____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_ID____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_ID____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_LVL_POS_ID | HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

The Payroll Manager HR Position Hierarchy list security is applied depending on the roles granted to the user. When applied, it is supported by the following HR logical facts and dimensions:

Table B-99 Data Security Role Filters

The same security filter is applied to each of the logical objects listed in this table:

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

The filter is applied to:

  • Dim - HR Position Hierarchy
  • Fact - HR - Payroll Balance Summary
  • Fact - HR - Payroll Balance Detail


B.2.84.4 Compensation Analyst (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Compensation Analyst HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-100 Initialization Blocks

Variable Name | Initialization Block Name
HR_SEC_POS_LIST | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_VER | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_ID | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_LIST____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_LIST____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_LIST____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_VER____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_VER____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_VER____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_ID____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_ID____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_ID____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_LVL_POS_ID | HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

The Compensation Analyst HR Position Hierarchy list security is applied depending on the roles granted to the user. When applied, it is supported by the following HR logical facts and dimensions:

Table B-101 Data Security Role Filters

The same security filter is applied to each of the logical objects listed in this table:

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

The filter is applied to:

  • Dim - HR Position Hierarchy
  • Fact - HR - Payroll Balance Summary
  • Fact - HR - Payroll Balance Detail
  • Fact - HR - Workforce Event Information
  • Fact - HR - Workforce Balance Information


B.2.84.5 Compensation Manager (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Compensation Manager HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-102 Initialization Blocks

Variable Name | Initialization Block Name
HR_SEC_POS_LIST | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_VER | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_ID | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_LIST____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_LIST____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_LIST____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_VER____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_VER____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_VER____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_ID____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_ID____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_ID____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_LVL_POS_ID | HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

The Compensation Manager HR Position Hierarchy list security is applied depending on the roles granted to the user. When applied, it is supported by the following HR logical facts and dimensions:

Table B-103 Data Security Role Filters

The same security filter is applied to each of the logical objects listed in this table:

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

The filter is applied to:

  • Dim - HR Position Hierarchy
  • Fact - HR - Payroll Balance Summary
  • Fact - HR - Payroll Balance Detail
  • Fact - HR - Workforce Event Information
  • Fact - HR - Workforce Balance Information


B.2.84.6 Recruiting Manager (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Recruiting Manager HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-104 Initialization Blocks

Variable Name | Initialization Block Name
HR_SEC_POS_LIST | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_VER | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_ID | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_LIST____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_LIST____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_LIST____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_VER____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_VER____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_VER____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_ID____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_ID____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_ID____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_LVL_POS_ID | HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

The Recruiting Manager HR Position Hierarchy list security is applied depending on the roles granted to the user. When applied, it is supported by the following HR logical facts and dimensions:

Table B-105 Data Security Role Filters

The same security filter is applied to each of the logical objects listed in this table:

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

The filter is applied to:

  • Dim - HR Position Hierarchy
  • Fact - HR - Recruitment Event Information


B.2.84.7 Recruiting VP (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Recruiting VP HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-106 Initialization Blocks

Variable Name | Initialization Block Name
HR_SEC_POS_LIST | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_VER | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_ID | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_LIST____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_LIST____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_LIST____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_VER____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_VER____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_VER____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_ID____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_ID____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_ID____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_LVL_POS_ID | HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

The Recruiting VP HR Position Hierarchy list security is applied depending on the roles granted to the user. When applied, it is supported by the following HR logical facts and dimensions:

Table B-107 Data Security Role Filters

The same security filter is applied to each of the logical objects listed in this table:

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

The filter is applied to:

  • Dim - HR Position Hierarchy
  • Fact - HR - Recruitment Event Information


B.2.84.8 Time Collection Manager (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Time Collection Manager HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-108 Initialization Blocks

Variable Name | Initialization Block Name
HR_SEC_POS_LIST | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_VER | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_ID | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_LIST____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_LIST____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_LIST____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_VER____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_VER____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_VER____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_ID____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_ID____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_ID____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_LVL_POS_ID | HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

The Time Collection Manager HR Position Hierarchy list security is applied depending on the roles granted to the user. When applied, it is supported by the following HR logical facts and dimensions:

Table B-109 Data Security Role Filters

The same security filter is applied to each of the logical objects listed in this table:

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

The filter is applied to:

  • Dim - HR Position Hierarchy
  • Fact - HR - Time and Labor - Reported Time
  • Fact - HR - Time and Labor - Processed Time
  • Fact - HR - Workforce Balance Information
  • Fact - HR - Payroll Balance Summary


B.2.84.9 Human Resource VP (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Human Resource VP HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-110 Initialization Blocks

Variable Name | Initialization Block Name
HR_SEC_POS_LIST | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_VER | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_HIER_ID | Not applicable; this is a multi-source variable population. See below.
HR_SEC_POS_LIST____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_LIST____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_LIST____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_VER____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_VER____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_VER____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_ID____EBS | HR Position Hierarchy ID Version and Position List (EBS).
HR_SEC_POS_HIER_ID____PSFT | HR Position Hierarchy ID Version and Position List (PeopleSoft).
HR_SEC_POS_HIER_ID____FUSN | HR Position Hierarchy ID Version and Position List (Fusion).
HR_SEC_POS_HIER_LVL_POS_ID | HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

The Human Resource VP HR Position Hierarchy list security is applied depending on the roles granted to the user. When applied, it is supported by the following HR logical facts and dimensions:

Table B-111 Data Security Role Filters

Name Filter

Dim - HR Position Hierarchy

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Calendar

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Enrollment Events

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Absence Event

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Recruitment Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Workforce Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Time and Labor - Reported Time

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Time and Labor - Processed Time

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))


B.2.84.10 Human Resource Analyst (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Human Resource Analyst HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-112 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_POS_LIST

Not applicable; this is a multi-source variable, populated by the source-specific initialization blocks listed below.

HR_SEC_POS_HIER_VER

Not applicable; this is a multi-source variable, populated by the source-specific initialization blocks listed below.

HR_SEC_POS_HIER_ID

Not applicable; this is a multi-source variable, populated by the source-specific initialization blocks listed below.

HR_SEC_POS_LIST____EBS

HR Position Hierarchy ID Version and Position List (EBS).

HR_SEC_POS_LIST____PSFT

HR Position Hierarchy ID Version and Position List (PeopleSoft).

HR_SEC_POS_LIST____FUSN

HR Position Hierarchy ID Version and Position List (Fusion).

HR_SEC_POS_HIER_VER____EBS

HR Position Hierarchy ID Version and Position List (EBS).

HR_SEC_POS_HIER_VER____PSFT

HR Position Hierarchy ID Version and Position List (PeopleSoft).

HR_SEC_POS_HIER_VER____FUSN

HR Position Hierarchy ID Version and Position List (Fusion).

HR_SEC_POS_HIER_ID____EBS

HR Position Hierarchy ID Version and Position List (EBS).

HR_SEC_POS_HIER_ID____PSFT

HR Position Hierarchy ID Version and Position List (PeopleSoft).

HR_SEC_POS_HIER_ID____FUSN

HR Position Hierarchy ID Version and Position List (Fusion).

HR_SEC_POS_HIER_LVL_POS_ID

HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

Security based on the Human Resource Analyst HR Position Hierarchy list is applied depending on the roles granted to the user; when applied, it is supported by the following HR logical facts and dimensions:

Table B-113 Data Security Role Filters

Name Filter

Dim - HR Position Hierarchy

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Calendar

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Enrollment Events

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Absence Event

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Recruitment Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Workforce Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Payroll Balance Summary

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))


B.2.84.11 Human Resource Manager (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Human Resource Manager HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-114 Initialization Blocks

Variable Name Initialization Block Name

HR_SEC_POS_LIST

Not applicable; this is a multi-source variable, populated by the source-specific initialization blocks listed below.

HR_SEC_POS_HIER_VER

Not applicable; this is a multi-source variable, populated by the source-specific initialization blocks listed below.

HR_SEC_POS_HIER_ID

Not applicable; this is a multi-source variable, populated by the source-specific initialization blocks listed below.

HR_SEC_POS_LIST____EBS

HR Position Hierarchy ID Version and Position List (EBS).

HR_SEC_POS_LIST____PSFT

HR Position Hierarchy ID Version and Position List (PeopleSoft).

HR_SEC_POS_LIST____FUSN

HR Position Hierarchy ID Version and Position List (Fusion).

HR_SEC_POS_HIER_VER____EBS

HR Position Hierarchy ID Version and Position List (EBS).

HR_SEC_POS_HIER_VER____PSFT

HR Position Hierarchy ID Version and Position List (PeopleSoft).

HR_SEC_POS_HIER_VER____FUSN

HR Position Hierarchy ID Version and Position List (Fusion).

HR_SEC_POS_HIER_ID____EBS

HR Position Hierarchy ID Version and Position List (EBS).

HR_SEC_POS_HIER_ID____PSFT

HR Position Hierarchy ID Version and Position List (PeopleSoft).

HR_SEC_POS_HIER_ID____FUSN

HR Position Hierarchy ID Version and Position List (Fusion).

HR_SEC_POS_HIER_LVL_POS_ID

HR Position Hierarchy Fixed Hier Level.


Data Security Role Filters

Security based on the Human Resource Manager HR Position Hierarchy list is applied depending on the roles granted to the user; when applied, it is supported by the following HR logical facts and dimensions:

Table B-115 Data Security Role Filters

Name Filter

Dim - HR Position Hierarchy

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Calendar

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Enrollment Events

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Absence Event

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Recruitment Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Workforce Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Workforce Balance Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Accrual Transactions - Event Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Accrual Transactions - Balance Information

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))


B.2.84.12 Learning Manager (secured by HR Position Hierarchy List) AU BI Data

Initialization Blocks

The Learning Manager HR Position Hierarchy list is determined at user sign-on via one or more Initialization Blocks:

Table B-116 Initialization Blocks

Variable Name Initialization Block Name
   
   

Data Security Role Filters

Security based on the Learning Manager HR Position Hierarchy list is applied depending on the roles granted to the user; when applied, it is supported by the following HR logical facts and dimensions:

Table B-117 Data Security Role Filters

Name Filter

Dim - HR Position Hierarchy

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Calendar

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Enrollment and Completion

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))

Fact - HR - Learning Enrollment Events

"Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Based Position Id" = VALUEOF(NQ_SESSION.HR_SEC_POS_LIST) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy Version" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_VER)) AND "Core"."Dim - HR Position Hierarchy"."Sec Filter Col - Hierarchy ID" = (VALUEOF(NQ_SESSION.HR_SEC_POS_HIER_ID))


B.2.84.13 HR Duty Role to Oracle BI Applications HR Presentation Catalog Mapping

Note: The following duty role to presentation catalog mappings are applicable to security.

Table B-118 HR Duty Role to Oracle BI Applications HR Presentation Catalog Mapping

BI Duty Roles HR Presentation Catalog Mapping

Line Manager (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resources - Time and Labor

Payroll Manager (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Payroll

Human Resources - Time and Labor

Compensation Analyst (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Compensation

Human Resources - Payroll

Compensation Manager (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Compensation

Human Resources - Payroll

Recruiting Manager (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Recruitment

Recruiting VP (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Recruitment

Time Collection Manager (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Time and Labor

Human Resource VP (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Payroll

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resources - Workforce Effectiveness

Human Resource Analyst (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Human Resource Manager (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Absence and Leave Accrual

Human Resources - Compensation

Human Resources - Learning Enrollment and Completion

Human Resources - Recruitment

Human Resources - Workforce Deployment

Learning Manager (secured by HR Position Hierarchy List) AU BI Duty

Human Resources - Learning Enrollment and Completion


B.2.85 How to set up Payroll Legislative Data Group Based Data Security

There is no topic for this FSM Task.

B.2.86 How to set up Payroll Based Data Security

There is no topic for this FSM Task.

B.2.87 How to Configure Workforce Revalidate Option

Purpose

Some cases of bad data are handled by the Workforce ETL processes in the Persisted Staging layer: instead of raising an error, the bad data is flagged as invalid and excluded from the Oracle Business Analytics Warehouse. If the bad data is subsequently cleaned up in the OLTP source, use the Revalidate option to reprocess it; records that are then valid are included in the Oracle Business Analytics Warehouse.

Optional or Mandatory

This task is optional. Note, however, that the default option always reprocesses the bad data, which adds a small overhead if the data is never going to be cleaned up.

Applies to

All sources.

Task description in detail

Set the parameter HR_WRKFC_REVALIDATE. The ETL tasks affected by this parameter can be found by searching for SDE%Workforce%Validate%. The invalid data can be reviewed by checking the records marked with VALID_FLG = 'N' in the persisted staging tables. The tables and the checks implemented for each are listed below; a sample review query follows the tables.

Table B-119 E-Business Suite

Persisted Staging Table Type of bad data excluded

Assignment - W_ORA_WEVT_ASG_PS

Overlapping effective dates

Overlapping active primary assignments

FTE - W_ORA_WEVT_FTE_PS

Overlapping effective dates

Grade Rate - W_ORA_WEVT_GRT_PS

Overlapping effective dates

Headcount - W_ORA_WEVT_HDC_PS

Overlapping effective dates

Performance Reviews - W_ORA_WEVT_PERF_PS

More than one review per person/assignment per day

Person - W_ORA_WEVT_PSN_PS

Overlapping effective dates

Person Type - W_ORA_WEVT_PTYP_PS

Overlapping effective dates

Salary - W_ORA_WEVT_SAL_PS

Overlapping effective dates


Table B-120 PeopleSoft

Persisted Staging Table Type of bad data excluded

Headcount - W_PSFT_WEVT_HDC_PS

Overlapping active primary assignments

Performance Reviews - W_PSFT_WEVT_PERF_PS

More than one review per person/assignment per day


Table B-121 Fusion

Persisted Staging Table Type of bad data excluded

Assignment - W_FSN_WEVT_ASG_PS

Overlapping effective dates

Overlapping active primary assignments

Performance Reviews - W_FSN_WEVT_PERF_PS

More than one review per person/assignment per day

Salary - W_FSN_WEVT_SAL_PS

Overlapping effective dates

Supervisor - W_FSN_WEVT_SUP_PS

Overlapping effective dates
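
For example, a quick way to review the records that a given check has excluded is to count the rows flagged as invalid in the relevant persisted staging table. The following is a minimal sketch only: it uses the W_ORA_WEVT_ASG_PS table and the VALID_FLG column named in this topic, and it should be run against the schema that holds the persisted staging tables, substituting the table for the source you are checking.

  -- Count workforce assignment records excluded by the validation checks above.
  -- Substitute the persisted staging table for your source (EBS, PeopleSoft, or Fusion).
  SELECT COUNT(*) AS invalid_rows
  FROM   W_ORA_WEVT_ASG_PS
  WHERE  VALID_FLG = 'N';

A count of zero after the source data has been cleaned up and the Revalidate option has been run indicates that no records remain excluded by that check.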


Dependency

No dependencies.

B.2.88 How to Manage Domains and Member Mappings for Recruitment Event Type

Purpose

This task is a critical configuration step for Recruitment Analytics. It helps you to configure the domain member mappings from the Recruitment pipeline 'source statuses' (both Job Requisition and Applicant statuses), and the 'source reasons' associated with those statuses, to a set of warehouse conformed domains. These delivered warehouse conformed domains include Recruitment Event, Recruitment Event Reason, Recruitment Event Reason Type, Recruitment Sub Stage, Recruitment Stage, Applicant Event Flag and Job Requisition Event Flag.

Relationships exist between the delivered warehouse conformed domains, and default member mappings are already configured in your system. You can change these mappings if you wish. However, the primary focus of this task is to make sure that your source domain members (statuses and reasons) are mapped meaningfully to the warehouse conformed domains.

Optional or Mandatory

This is a mandatory task for Recruitment Analytics.

Background

Before going into the actual task, it is worth clarifying the relationships and mappings that exist between the domains.


The domain member mappings between the warehouse conformed domains are seeded by default, and if the default settings meet your business needs, then no further configuration is required.

Metrics in Recruitment Analytics are heavily dependent upon the warehouse conformed domains Recruitment Event, Recruitment Sub Stage and Recruitment Stage. In a recruitment process, depending on certain activities or events, the status of the application process (or job requisition process) can change. As the recruitment process pipeline progresses, an application (or job requisition) can be thought of as entering a particular "stage", or leaving one "stage" and entering a new "stage", which indicates where the application is in the overall process. Each "stage" can be further classified into "sub stages" for finer-grained analysis. For example, applicants get an initial screening (sub stage = INITIAL_SCREENING), those who qualify move to a written test (sub stage = ASSESSMENT), and those who pass the written test are interviewed (sub stage = INTERVIEW). In the broad picture, the candidate has gone through two major stages: INITIAL_SCREENING and ASSESSMENT. For this example, this is how the default domain member mappings are configured:

Sub Stage to Stage map

Table B-122 Sub Stage to Stage map

Recruitment Sub Stage (Warehouse conformed) Recruitment Stage (Warehouse conformed)

INITIAL_SCREENING

INITIAL_SCREENING

ASSESSMENT

ASSESSMENT

INTERVIEW

ASSESSMENT


An application enters and leaves a "stage" or a "sub stage" because of "events" that occur. When you map the warehouse conformed domain "Recruitment Event" to the "sub stage", you link together the cause and the effect. Consider the possible events that might cause an application to enter the INITIAL_SCREENING sub stage. To start with, initial screening is carried out after an application is received. "Application Received" is a seeded value of the "Recruitment Event" domain; it is the "cause" (or event) that triggers the "effect" of the application moving into "Initial Screening". A few other examples are below:

Recruitment Event to Sub Stage map

Table B-123 Recruitment Event to Sub Stage map

Recruitment Event (Warehouse conformed) Recruitment Sub Stage (Warehouse conformed)

APPLICATION_RECEIVED

INITIAL_SCREENING

ASSESSMENT_START

ASSESSMENT

ASSESSMENT_INTERVIEW

INTERVIEW

ASSESSMENT_INTERVIEW1

INTERVIEW

ASSESSMENT_INTERVIEW2

INTERVIEW


These are just examples showing a few warehouse conformed domain member mappings, and are intended to introduce you to the topic of recruitment pipeline stages, sub stages, and related events. As mentioned earlier, you do not have to alter these mappings unless your business requirements do not match what is delivered. A complete list of warehouse conformed domain member mappings is provided at the end of this help topic (see the "Additional Information" section), for your information.

Before moving on to the source to warehouse conformed domain member maps, here is a short note on the other two warehouse conformed domain mappings. Examples are not provided for these; they are only touched upon briefly.

Recruitment Event to Recruitment Event Sequence Map

You can order your warehouse conformed recruitment events using numeric sequencing. This helps your business users to see the recruitment process more clearly. Some businesses prefer to carry out background checks at the end of the process, right before hiring; others may do so during the assessment stage. The recruitment event sequence order does not matter very much when processing Recruitment Analytics data, but it can be helpful for reporting purposes.

Recruitment Event Reason to Recruitment Event Reason Type Map

The Recruitment Event is the "cause" of a status change in the recruitment pipeline, and usually there is a "reason" for the event that happened. For example, "Application Termination" may have happened because the candidate found "Another Job" elsewhere; in this case, "Another Job" is the reason. How to map your source reasons to the warehouse conformed Recruitment Event Reason is shown later, but Recruitment Analytics also provides a higher level of analysis grain through Recruitment Event Reason Type. This "type" level primarily attempts to segregate the Recruitment Event Reasons into three buckets: VOLUNTARY, INVOLUNTARY, or OTHER (when the type cannot be determined). In this particular case, the application appears to have been terminated voluntarily, since the candidate got another job.

Task description

Now we turn to the actual task to be carried out: mapping your source domain members to the different warehouse conformed domain members. The mappings "Source Recruitment Status and Event Reason to Recruitment Event" and "Source Recruitment Event Reason to Recruitment Event Reason" already exist in the default solution. In other words, you do not have to create these domain mappings; you only have to carry out the domain member mappings between these domains.

In order to map a "status" to an "event" accurately, you need to associate a reason with the status. This is why the source status and reason together should be, and can be, used to map to the warehouse conformed events, as well as to the warehouse conformed event reasons.


Source Recruitment Status and Event Reason to Recruitment Event Map

The Source Recruitment Status and Event Reason (or Status alone) to Recruitment Event domain map is different for E-Business Suite and PeopleSoft. Two distinct source domains are delivered: one for E-Business Suite (DOMAIN_CODE = ASSIGNMENT_SYSTEM_STATUS~ASSIGNMENT_USER_STATUS~RECRUITMENT_EVENT_REASON) and one for PeopleSoft (CODE = RECRUITMENT_STATUS). These are discussed separately; however, the overall objective is the same: to map a source status and reason to one of the warehouse conformed values of the Recruitment Event domain.

E-Business Suite Applications:

In this case, the true status for an applicant assignment can be inferred from the system status and user status in the PER_ASSIGNMENT_STATUS_TYPES table. The system status is filtered on the list ('ACTIVE_APL', 'INTERVIEW1', 'INTERVIEW2', 'OFFER', 'ACCEPTED') to consider only applicant statuses, and the user status adds a more user-friendly value to the actual status. The reason comes from HR_STANDARD_LOOKUPS.LOOKUP_TYPE = 'APL_ASSIGN_REASON' and is tagged against all possible applicant statuses. Similarly, for application termination statuses, the system status used is 'TERM_APL' and the reason lookup type is 'TERM_APL_REASON'. These two are of type "Applicant Events". In addition, we also get vacancy statuses (HR_STANDARD_LOOKUPS.LOOKUP_TYPE = 'VACANCY_STATUS'). There are no reasons associated with vacancy statuses, and hence 'Unspecified' is substituted for the vacancy reason. In a nutshell, the E-Business Suite source domain contains members for application assignment statuses and reasons, application termination statuses and reasons, and vacancy statuses and reasons ('Unspecified' only). You are expected to map these source domain members to one of the members of the Recruitment Event warehouse conformed domain.
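
As an illustration only, the following queries preview the raw E-Business Suite values that feed this source domain. They are a sketch and are not part of the delivered ETL: the table names are taken from the description above, and the column names (PER_SYSTEM_STATUS, USER_STATUS, LOOKUP_TYPE, LOOKUP_CODE, MEANING) are assumptions that you should verify against your E-Business Suite schema.

  -- Applicant assignment and termination statuses considered by the extract.
  SELECT PER_SYSTEM_STATUS, USER_STATUS
  FROM   PER_ASSIGNMENT_STATUS_TYPES
  WHERE  PER_SYSTEM_STATUS IN ('ACTIVE_APL', 'INTERVIEW1', 'INTERVIEW2', 'OFFER', 'ACCEPTED', 'TERM_APL');

  -- Reasons paired with those statuses, and the vacancy statuses described above.
  SELECT LOOKUP_TYPE, LOOKUP_CODE, MEANING
  FROM   HR_STANDARD_LOOKUPS
  WHERE  LOOKUP_TYPE IN ('APL_ASSIGN_REASON', 'TERM_APL_REASON', 'VACANCY_STATUS');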

Below is a sample list of the delivered E-Business Suite source domain member mappings to Recruitment Event domain values. You need to review and configure the appropriate mappings based on the real values in your source E-Business Suite iRecruitment configuration.

Table B-124 (EBS) Source Recruitment Status and Event Reason (Source Domains) to Recruitment Event (Warehouse Conformed) Member Mappings

Source Member Code Target Member Target Member Code

__ANY__

Other Event

OTHER

ACCEPTED~__ANY__~__ANY__

Offer Accepted

OFFER_ACCEPTED

ACTIVE_APL~__ANY__~__ANY__

Application Received

APPLICATION_RECEIVED

INTERVIEW1~__ANY__~__ANY__

Assessment Interview

ASSESSMENT_INTERVIEW

INTERVIEW2~__ANY__~__ANY__

Assessment Second Interview

ASSESSMENT_INTERVIEW2

OFFER~__ANY__~__ANY__

Offer Extended

OFFER_EXTENDED

TERM_APL~__ANY__~__ANY__

Application Terminated

APPLICATION_TERMINATED


PeopleSoft Applications:

Status, together with status area, is needed to correctly identify the recruitment event. PS_HRS_STS_TBL tracks the valid combinations of statuses by status area, and this table becomes the primary source for the source domain members. Both applicant statuses (STATUS_AREA = 3) and job requisition statuses (STATUS_AREA = 1) are brought in as source domain members. This combination then needs to be mapped to the warehouse conformed Recruitment Event domain members.
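
As an illustration only, a query such as the following lists the status codes that become source domain members; the combination of status area and status code corresponds to the Source Member Code values (for example, 3~020) shown in the table below. The table name PS_HRS_STS_TBL is taken from the description above, and the column names STATUS_AREA and STATUS_CODE are assumptions that you should verify against your PeopleSoft schema.

  -- List job requisition (status area 1) and applicant (status area 3) statuses.
  SELECT STATUS_AREA, STATUS_CODE
  FROM   PS_HRS_STS_TBL
  WHERE  STATUS_AREA IN ('1', '3')
  ORDER  BY STATUS_AREA, STATUS_CODE;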

The following table shows a sample set of delivered member mappings. You need to review and configure the appropriate mappings based on the real values in your PeopleSoft configuration.

Table B-125 PeopleSoft Source Recruitment Status (Source Domains) to Recruitment Event (Warehouse Conformed) Member Mappings

Source Member Source Member Code Target Member Target Member Code

Applied

3~020

Application Received

APPLICATION_RECEIVED

Cancelled

1~120

Requisition Cancelled

RQSTN_CANCELLED

Closed

1~110

Requisition Filled Closed

RQSTN_FILLED_CLOSED

Denied

1~008

Requisition Approval Denied

RQSTN_APPROVAL_DENIED

Denied

3~008

Application Terminated

APPLICATION_TERMINATED

Draft

1~005

Requisition Drafted

RQSTN_DRAFTED

Draft

3~005

Application Received

APPLICATION_RECEIVED

Hire Dec

3~078

Offer Extended

OFFER_EXTENDED

Hired

3~090

Hire

HIRE

Hold

1~100

Requisition on Hold

RQSTN_HOLD

Hold

3~100

Non Pipeline Events

NON_PIPELINE

Inactive

3~140

Non Pipeline Events

NON_PIPELINE

Interview

3~060

Assessment Interview

ASSESSMENT_INTERVIEW

Linked

3~015

Application Received

APPLICATION_RECEIVED

Linked Quest

3~019

Application Received

APPLICATION_RECEIVED

Offer

3~070

Offer Extended

OFFER_EXTENDED

Offer Accepted

3~071

Offer Accepted

OFFER_ACCEPTED

Open

1~010

Requisition Opened

RQSTN_OPEN

Pending

1~006

Requisition Approval Pending

RQSTN_APPROVAL_PENDING

PreOffAcc

3~076

Offer Accepted

OFFER_ACCEPTED

PreOffDec

3~069

Offer Extended

OFFER_EXTENDED

PreOffNot

3~075

Offer Extended

OFFER_EXTENDED

PreOffRej

3~077

Offer Extended

OFFER_EXTENDED

Ready to Hire

3~080

Offer Accepted

OFFER_ACCEPTED

Reject

3~110

Application Terminated

APPLICATION_TERMINATED

Review

3~010

Application Received

APPLICATION_RECEIVED

Route

3~050

Assessment Start

ASSESSMENT_START

Screen

3~030

Assessment Start

ASSESSMENT_START

Withdrawn

3~120

Application Terminated

APPLICATION_TERMINATED


Source Recruitment Event Reason to Recruitment Event Reason Map

The members of the source domain "Source Recruitment Event Reason" (RECRUITMENT_EVENT_REASON) comprise all vacancy and applicant status reasons for both E-Business Suite and PeopleSoft. For E-Business Suite, as a minimum you are only required to map termination-related status reasons; all other reasons are mapped to "Other" in the target domain. (Note that this is only the minimum requirement; you can map all other reasons too.) For PeopleSoft, source members (or reasons) are made up of three components: status area, status, and reason. These three together are required to map to the appropriate target domain member reasons. These mappings exist by default, but you are expected to treat them as samples only and provide your own. The following table shows a set of sample member mappings, for your information only.

Table B-126 (All Sources) Source Recruitment Event Reason (Source Domains) to Recruitment Event Reason (Warehouse Conformed) Member

Source Member Source Member Code Target Member Target Member Code Product Line

APPLICATON WITHDRAWN

APL_ASSIGN_REASON:EBSTM WA

Application Withdrawn

WITHDRAWN

EBS

APPLICATON WITHDRAWN

APL_ASSIGN_REASON:WA

Application Withdrawn

WITHDRAWN

EBS

APPLICATON WITHDRAWN

TERM_APL_REASON:W

Application Withdrawn

WITHDRAWN

EBS

Any

__ANY__

Other

OTHER

EBS

DECLINED POSITION

APL_ASSIGN_REASON:DEC

Ineligible

INELIGIBLE

EBS

DECLINED POSITION

APL_ASSIGN_REASON:EBSTM DEC

Ineligible

INELIGIBLE

EBS

DECLINED POSITION

TERM_APL_REASON:D

Ineligible

INELIGIBLE

EBS

DECLINED POSITION

TERM_APL_REASON:DEC

Ineligible

INELIGIBLE

EBS

Not Selected

3~110:150

Disqualified In Interview

DISQUALIFIED

PSFT

Another Applicant was Hired

3~110:010

Failed To Start

FAILED_TO_START

PSFT

Another Job

3~110:190

Another Job

ANOTHER_JOB

PSFT

Ineligible - Basic Eligibility

3~110:100

Ineligible

INELIGIBLE

PSFT

Ineligible - Employment Cond

3~110:110

Ineligible

INELIGIBLE

PSFT

Ineligible - Min Grade/Salary

3~110:120

Ineligible

INELIGIBLE

PSFT

Lacks Other Min Qualifications

3~110:130

Ineligible

INELIGIBLE

PSFT

Lacks Required Credentials

3~110:140

Ineligible

INELIGIBLE

PSFT

Lacks Required Education

3~110:090

Ineligible

INELIGIBLE

PSFT

Lacks Required Experience

3~110:160

Ineligible

INELIGIBLE

PSFT

Misrepresentation

3~110:070

Disqualified In Interview

DISQUALIFIED

PSFT

No Opening

3~110:180

Headcount Not Available Or Hiring Freeze

HEADCOUNT_NOT_AVAILABLE

PSFT

No Show for Interview

3~110:060

No Show For Interview

NO_SHOW

PSFT

No Skills Match

3~110:040

Disqualified In Interview

DISQUALIFIED

PSFT

Poor Interview

3~110:030

Disqualified In Interview

DISQUALIFIED

PSFT

Rejected by Works Council

3~110:200

Offer Rejected

OFFER_REJECTED

PSFT

Rejected by Works Council

3~110:210

Offer Rejected

OFFER_REJECTED

PSFT

Selected for Other Position

3~110:080

Another Job

ANOTHER_JOB

PSFT

Unable to Contact

3~110:020

No Show For Interview

NO_SHOW

PSFT

Underqualified

3~110:050

Disqualified In Interview

DISQUALIFIED

PSFT


Dependency

The default Oracle BI Applications Recruitment Analytics solution depends heavily on correct domain member maps and other configurations. If you change any domain member map, you need to carry out a full load ETL.

Extensibility

While target domain members are usually extensible, that is not the case for two Recruitment Analytics target domains, "Recruitment Stage" and "Recruitment Sub Stage". ETL logic expects the seeded domain values for these two target warehouse domains, and any extensions/alterations will require changes to the delivered ETL logic.

Additional Information

The following is a list of the warehouse conformed domain member mappings. As mentioned earlier, you do not have to alter these mappings unless your business requirements do not match the way they are mapped.

Table B-127 Recruitment Event (Warehouse Conformed) to Recruitment Sub Stage (Warehouse Conformed) Member Mappings

Source Member Source Member Code Target Member Target Member Code

Application Received

APPLICATION_RECEIVED

Initial Screening

INITIAL_SCREENING

Application Terminated

APPLICATION_TERMINATED

Application Terminated

APPLICATION_TERMINATED

Assessment First Interview

ASSESSMENT_INTERVIEW1

Interview

INTERVIEW

Assessment Interview

ASSESSMENT_INTERVIEW

Interview

INTERVIEW

Assessment Second Interview

ASSESSMENT_INTERVIEW2

Interview

INTERVIEW

Assessment Start

ASSESSMENT_START

Assessment

ASSESSMENT

Employed after completing first period of work band

EMP_FST_POW_COMPLETION

Employment beyond 1st period of work band

EMPLOYMENT_POST_POW1

First Appraisal / Review

EMP_PERF_REVIEW

Employment before 1st period of work band

EMPLOYMENT_PRE_POW1

Hire

HIRE

Employment before 1st period of work band

EMPLOYMENT_PRE_POW1

Non Pipeline Events

NON_PIPELINE

Non Pipeline

NON_PIPELINE

Offer Accepted

OFFER_ACCEPTED

Start Pending

START_PENDING

Offer Extended

OFFER_EXTENDED

Offer

OFFER

Other Event

OTHER

Non Pipeline

NON_PIPELINE

Requisition Approval Denied

RQSTN_APPROVAL_DENIED

Requisition Approval Denied

RQSTN_APPROVAL_DENIED

Requisition Approval Pending

RQSTN_APPROVAL_PENDING

Requisition Approval Pending

RQSTN_APPROVAL_PENDING

Requisition Cancelled

RQSTN_CANCELLED

Requisition Cancelled

RQSTN_CANCELLED

Requisition Drafted

RQSTN_DRAFTED

Requisition Drafted

RQSTN_DRAFTED

Requisition Filled Closed

RQSTN_FILLED_CLOSED

Requisition Filled or Closed

RQSTN_FILLED_CLOSED

Requisition Opened

RQSTN_OPEN

Requisition Open

RQSTN_OPEN

Requisition on Hold

RQSTN_HOLD

Requisition on Hold

RQSTN_HOLD

Terminated prior to completing first period of work band

EMP_TERMINATED

Employment Terminated

EMPLOYMENT_TERMINATED

Transferred prior to completing first period of work band

EMP_TRANSFER

Employment before 1st period of work band

EMPLOYMENT_PRE_POW1


Table B-128 Recruitment Sub Stage (Warehouse Conformed) to Recruitment Stage (Warehouse Conformed) Member Mappings

Source Member Source Member Code Target Member Target Member Code

Application Terminated

APPLICATION_TERMINATED

Application Terminated

APPLICATION_TERMINATED

Assessment

ASSESSMENT

Assessment

ASSESSMENT

Employment Terminated

EMPLOYMENT_TERMINATED

Employment Terminated

EMPLOYMENT_TERMINATED

Employment before 1st period of work band

EMPLOYMENT_PRE_POW1

Employment

EMPLOYMENT

Employment beyond 1st period of work band

EMPLOYMENT_POST_POW1

Employment

EMPLOYMENT

Initial Screening

INITIAL_SCREENING

Initial Screening

INITIAL_SCREENING

Interview

INTERVIEW

Assessment

ASSESSMENT

Non Pipeline

NON_PIPELINE

Non Pipeline

NON_PIPELINE

Offer

OFFER

Offer

OFFER

Requisition Approval Denied

RQSTN_APPROVAL_DENIED

Requisition Rejected

RQSTN_REJECTED

Requisition Approval Pending

RQSTN_APPROVAL_PENDING

Requisition Pending

RQSTN_PENDING

Requisition Cancelled

RQSTN_CANCELLED

Requisition Closed

RQSTN_CLOSED

Requisition Drafted

RQSTN_DRAFTED

Requisition Pending

RQSTN_PENDING

Requisition Filled or Closed

RQSTN_FILLED_CLOSED

Requisition Closed

RQSTN_CLOSED

Requisition Open

RQSTN_OPEN

Requisition Open

RQSTN_OPEN

Requisition on Hold

RQSTN_HOLD

Requisition Open

RQSTN_OPEN

Start Pending

START_PENDING

Start Pending

START_PENDING


Table B-129 Recruitment Event (Warehouse Conformed) to Recruitment Event Sequence (Warehouse Conformed) Member Mappings

Source Member Source Member Code Target Member Target Member Code

Requisition Drafted

RQSTN_DRAFTED

10

10

Requisition Approval Pending

RQSTN_APPROVAL_PENDING

20

20

Requisition Approval Denied

RQSTN_APPROVAL_DENIED

30

30

Requisition Opened

RQSTN_OPEN

40

40

Requisition on Hold

RQSTN_HOLD

50

50

Requisition Cancelled

RQSTN_CANCELLED

60

60

Requisition Filled Closed

RQSTN_FILLED_CLOSED

70

70

Application Received

APPLICATION_RECEIVED

100

100

Assessment Start

ASSESSMENT_START

110

110

Assessment Interview

ASSESSMENT_INTERVIEW

120

120

Assessment First Interview

ASSESSMENT_INTERVIEW1

130

130

Assessment Second Interview

ASSESSMENT_INTERVIEW2

140

140

Offer Extended

OFFER_EXTENDED

150

150

Offer Accepted

OFFER_ACCEPTED

170

170

Application Terminated

APPLICATION_TERMINATED

190

190

Hire

HIRE

200

200

First Appraisal / Review

EMP_PERF_REVIEW

230

230

Transferred prior to completing first period of work band

EMP_TRANSFER

240

240

Employed after completing first period of work band

EMP_FST_POW_COMPLETION

250

250

Terminated prior to completing first period of work band

EMP_TERMINATED

270

270

Non Pipeline Events

NON_PIPELINE

1000

1000

Other Event

OTHER

1010

1010


Table B-130 Recruitment Event Reason (Warehouse Conformed) to Recruitment Event Reason Type (Warehouse Conformed) Member Mappings

Source Member Source Member Code Target Member Target Member Code

Another Job

ANOTHER_JOB

Voluntary

VOLUNTARY

Application Withdrawn

WITHDRAWN

Voluntary

VOLUNTARY

Disqualified In Interview

DISQUALIFIED

Involuntary

INVOLUNTARY

Failed To Start

FAILED_TO_START

Voluntary

VOLUNTARY

Headcount Available

HEADCOUNT_AVAILABLE

Other

OTHER

Headcount Not Available Or Hiring Freeze

HEADCOUNT_NOT_AVAILABLE

Other

OTHER

Ineligible

INELIGIBLE

Involuntary

INVOLUNTARY

No Show For Interview

NO_SHOW

Voluntary

VOLUNTARY

Offer Rejected

OFFER_REJECTED

Involuntary

INVOLUNTARY

Unspecified

UNSPECIFIED

Other

OTHER


B.2.89 How to Set Up Project GL Reconciliation Security for Peoplesoft

Overview

Project Analytics supports security over the following dimensions in Project GL Reconciliation. In the Oracle Business Intelligence Applications solution, the "Business Unit" entity refers to "Operating Unit Organizations" in EBS. The list of Business Units that a user has access to is determined by E-Business Suite grants.

Table B-131 Project GL Reconciliation Facts

Security Entity GL Recon Cost Fact GL Recon Revenue Fact

Project Business Unit

N

N

Project Organization

N

N

Expenditure Business Unit

N

N

Contract Business Unit

N

N

Project

N

N

Resource Organization

N

N

Ledger

Y

Y


Configuring Project GL Reconciliation For E-Business Suite

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

You must enable data security for Project GL Reconciliation in E-Business Suite by enabling the E-Business Suite data security initialization block listed below. If only one source system is deployed, then make sure that all Project Security initialization blocks for other adapters are disabled. If more than one source system is deployed, then you must also enable the initialization blocks for those source systems.

Init Blocks

EBS: Project GL Recon Ledger List EBS

To Set Up Project GL Reconciliation Security for EBS

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Application Roles.

  2. Double click on OBIA_PROJECT_LEDGER_DATA_SECURITY, navigate to Permissions, then Data Filters, and enable all data security filters.

  3. Save the metadata repository.

B.2.89.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project GL Recon subject area.

  • OBIA_EBS_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_EBS_PROJECT_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_EBS_PROJECT_DATA_SECURITY

These duty roles control which subject areas and dashboard content the user gets access to. They also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.90 How to Configure Project Commitment Fact for PeopleSoft

The commitment subject area from PeopleSoft is filtered at the source using the PS_INSTALLATION_PC table, so ensure that the analysis types for commitment transactions for Purchase Orders and Requisitions are mapped appropriately in PeopleSoft.
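
As a quick sanity check, which is a sketch only and not part of the delivered ETL, you can review the Project Costing installation options directly in the source database. The table name is taken from the paragraph above; the specific analysis type columns to inspect depend on your PeopleSoft release, so verify them in your environment.

  -- Review the Project Costing installation options, including the analysis types
  -- used for Purchase Order and Requisition commitment transactions.
  SELECT * FROM PS_INSTALLATION_PC;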

Project Costing Integration

Configuration of Commitment Snapshot

Because commitment data is transient, a snapshot table, W_PROJ_COMMITMENT_SNP_F, is populated. The grain of the data in the snapshot table is controlled by the ETL parameter PROJ_COMMITMENT_GRAIN. This parameter is set in FSM and can have the values WEEK, MONTH, QUARTER, and YEAR. For example, PROJ_COMMITMENT_GRAIN = 'WEEK' means that the snapshot table stores one snapshot per week: if the ETL is run multiple times within a week, each later snapshot overwrites the earlier one until the end of the week, so the Friday record is kept and a new record is generated the following Monday for the new week. This grain is specified using Tasks in FSM. The default is 'Month'.

Set the value for this parameter in FSM by navigating to the Manage Data Load Parameters section and filtering on the offering Oracle Project Analytics.

B.2.91 How to Set Up Project Resource Management Security for Peoplesoft

Overview

Project Analytics supports security using the following dimensions in the Project Resource Management subject areas.

Table B-132 Supported Project Resource Management Security subject areas

Securing Entity Resource Availability Resource Requirement Resource Utilization Assignment Resource Utilization Capacity Resource Utilization Expected Employee Job/Competency

Project Business Unit

N

Y

Y

N

Y

N

Project Organization

N

N

N

N

N

N

Expenditure Business Unit

N

N

N

N

N

N

Contract Business Unit

N

N

N

N

N

N

Project

N

Y

Y

N

Y

N

Resource Organization

N

N

N

N

N

N

Ledger

N

N

N

N

N

N


Configuring Project Resource Management For PeopleSoft

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

Enable data security for Project Resource Management in PeopleSoft based on your PeopleSoft security configuration. That is, if security by Business Unit has been implemented, follow the Security by Business Unit section (and ignore the Security by Project section); if security by Project has been implemented, follow the Security by Project section (and ignore the Security by Business Unit section). In either case, enable the data security initialization blocks listed in the sections below and disable the Project Security initialization blocks for other adapters. If more than one source system is deployed, then you must also enable the initialization blocks for those source systems.

About Data Security Configuration in PeopleSoft

In PeopleSoft, you access the security configuration pages for securing Project transactions by selecting Main Menu, then Set up Financials/Supply Chain, then Security, then Security Options.

Depending on your security configuration, you use the Project Business Unit dimension, the Project dimension, or a combination of both. Based on that, you need to change the default configuration to match the OLTP security setup.

B.2.91.1 Security by Business Unit

Init Blocks:

  • Project Business Unit List RM PSFT

If you are securing the Project data by Project BU only, then follow the steps below to disable the Project dimension security:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity.

  2. Double click on OBIA_PROJECT_BUSINESS_UNIT_DATA_SECURITY, navigate to Permissions, then Data Filters, and enable all data security filters that are disabled.

    This activates Project BU Security, which is required for the Resource Management Module in PeopleSoft.

  3. In Oracle Application Control, select Business Application Instance, then Application Roles, then Select the Oracle BI Applications Stripe, and query for the OBIA_PROJECT_DATA_SECURITY Application Role.

    Note that OBIA_PSFT_PROJECT_DATA_SECURITY is listed as one of the members.

  4. Remove OBIA_PSFT_PROJECT_DATA_SECURITY as a member of the OBIA_PROJECT_DATA_SECURITY Duty Role.

  5. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Action, then Synchronize Application Roles.

B.2.91.2 Security by Project

Init Blocks:

  • Project List RM PSFT

If you are securing the Project data by Project dimension only, then follow the steps below to disable the Project BU dimension security:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity.

  2. Double click on OBIA_PROJECT_DATA_SECURITY, navigate to Permissions, then Data Filters, and enable all data security filters that are disabled.

    This activates Project based Security, which is required for the Resource Management Module in PeopleSoft.

  3. In Oracle Application Control, select Business Application Instance, then Application Roles, then Select the Oracle BI Applications Stripe, and query for the OBIA_PROJECT_BUSINESS_UNIT_DATA_SECURITY Application Role.

    Note that OBIA_PSFT_PROJECT_DATA_SECURITY is listed as one of the members.

  4. Remove OBIA_PSFT_PROJECT_DATA_SECURITY as a member of the OBIA_PROJECT_BUSINESS_UNIT_DATA_SECURITY Duty Role.

  5. In Oracle BI EE Administration Tool, select Manage, then Identity, then Action, then Synchronize Application Roles.

Note: Tree-based Project Security queries are not supported in the default application.

B.2.91.3 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project Resource Management subject area.

  • OBIA_PSFT_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_PSFT_PROJECT_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_PSFT_PROJECT_DATA_SECURITY

These duty roles control which subject areas and dashboard content the user gets access to. They also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.92 How to Set Up RPD For Budgetary Control For PeopleSoft

To set up the RPD for Budgetary Control for PeopleSoft:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

    The RPD file is located at:

    ORACLE_INSTANCE\bifoundation\OracleBIServerComponent\coreapplication_obisn\repository

  2. In the Business Model and Mapping layer, go to the logical table Fact - Fins - GL Journals Budgetary Control.

  3. Under Sources, select the Fact_W_GL_OTHER_F_PSFT logical table source.

  4. Clear the Disabled option in the General tab and click OK.

  5. Repeat step 4 for the logical table sources - Fact_W_GL_OTHER_GRPACCT_DAY_A_PSFT and Fact_W_GL_OTHER_GRPACCT_FSCLPRD_A_PSFT.

  6. Under Sources, select the Fact_W_GL_OTHER_F_EBS logical table source.

  7. Select the Disabled option in the General tab and click OK.

  8. Repeat step 7 for the logical table sources - Fact_W_GL_OTHER_GRPACCT_DAY_A_EBS and Fact_W_GL_OTHER_GRPACCT_FSCLPRD_A_EBS.

  9. In the Business Model and Mapping layer, go to the logical table Fact - Fins - Activity Budgetary Control.

  10. Under Sources, select the Fact_W_GL_BALANCE_F_PSFT logical table source.

  11. Clear the Disabled option in the General tab and click OK.

  12. Repeat step 11 for the other logical table source - Fact_W_GL_BALANCE_A_PSFT.

  13. Under Sources, select the Fact_W_GL_BALANCE_F_EBS logical table source.

  14. Select the Disabled option in the General tab and click OK.

  15. Repeat step 14 for the other logical table source - Fact_W_GL_BALANCE_A_EBS.

B.2.93 How to Set Up General Ledger Security for Peoplesoft

Overview

Financial Analytics supports a combination of the following security mechanisms for GL subject areas:

  • Security using Ledgers

  • Security using PeopleSoft Chartfields

B.2.93.1 Configuring Ledger Security

In order for data security filters to be applied, the appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Ledger Security for PeopleSoft, enable the PeopleSoft initialization block and make sure that the initialization blocks of all other source systems are disabled. The initialization block names for the various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks for those source systems.

  • E-Business Suite 11i: Ledgers EBS11

  • E-Business Suite R12: Ledgers EBS12

  • Oracle PeopleSoft: Ledgers PeopleSoft

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

B.2.93.2 Configuring GL Segment Security using PeopleSoft Chartfields

Note: This section is applicable only if you have enabled Commitment Control in PeopleSoft. If you do not have Commitment Control, then you can skip this section.

This section gives an overview of the segment security using PeopleSoft Chartfields and supported scenarios in BI Applications.

There are various options using which you can setup security rules in the Commitment Control module in PeopleSoft. Oracle BI Applications supports only the following two security rules:

  • Security rules setup using the "Allow" access attribute.

  • Security rules setup using the "Tree Node" parameter.

A user can have two different types of access for each chartfield:

  • Partial Access - User has access to specific values within the tree defined for a chartfield. The node, for which the user has access to, is defined using the "Allow" and the "Tree Node" parameter in PeopleSoft. When the user is given access to a node within the tree, it means that the user has access to that node and all its child nodes.

    For example, if a user is granted access to node C, then the user has access to nodes C, D, E, F and G.

  • Full Access – The user has complete access to all the SETIDs for that chartfield.

B.2.93.3 Configuring GL Segment Security

GL Segment Security can be applied on the qualified GL Segment dimensions 'Dim – Cost Center', 'Dim – Natural Account', and 'Dim – Balancing Segment', as well as on the 10 generic dimensions 'Dim – GL Segment1' to 'Dim – GL Segment10', which can be configured to map to any of the chartfields. That mapping is configured in the task <taskname & link>.

Before setting up security, first identify which of these segment dimensions you need to apply security on, depending on your security requirements and the security setup in the Commitment Control module. Once that is determined, repeat the following RPD metadata configuration steps for each secured segment dimension.

Initialization Blocks and Session Variables

  1. Create a "row wise" session initialization block and a corresponding session variable to get all the parent nodes the user has access to in a tree. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-133 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    SELECT DISTINCT 'GL_SEC_COSTCENTER_FILTEREDACCESS____PSFT', 'Department'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='DEPTID' AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    UNION

    SELECT DISTINCT 'GL_SEC_COSTCENTER_FILTEREDACCESS____PSFT', 'Department'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_CLSS_RULES CLSS, PSOPRDEFN OP, PSROLEUSER ORL, PSROLECLASS RCL WHERE CLSS.OPRCLASS = RCL.CLASSID AND OP.OPRID = ORL.ROLEUSER AND ORL.ROLENAME = RCL.ROLENAME AND CLSS.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='DEPTID' AND OP.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    GL_SEC_COSTCENTER_FILTEREDACCESS____PSFT

    Dim – Natural Account

    SELECT DISTINCT 'GL_SEC_ACCOUNT_FILTEREDACCESS____PSFT', 'Account'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='ACCOUNT' AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    UNION

    SELECT DISTINCT 'GL_SEC_ACCOUNT_FILTEREDACCESS____PSFT', 'Account'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN,PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_CLSS_RULES CLSS, PSOPRDEFN OP, PSROLEUSER ORL, PSROLECLASS RCL WHERE CLSS.OPRCLASS = RCL.CLASSID AND OP.OPRID = ORL.ROLEUSER AND ORL.ROLENAME = RCL.ROLENAME AND CLSS.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='ACCOUNT' AND OP.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    GL_SEC_ACCOUNT_FILTEREDACCESS____PSFT

    Dim – Balancing Segment

    SELECT DISTINCT 'GL_SEC_BALANCING_FILTEREDACCESS____PSFT', 'Fund Code'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='FUND_CODE' AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    UNION

    SELECT DISTINCT 'GL_SEC_BALANCING_FILTEREDACCESS____PSFT', 'Fund Code'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_CLSS_RULES CLSS, PSOPRDEFN OP, PSROLEUSER ORL, PSROLECLASS RCL WHERE CLSS.OPRCLASS = RCL.CLASSID AND OP.OPRID = ORL.ROLEUSER AND ORL.ROLENAME = RCL.ROLENAME AND CLSS.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='FUND_CODE' AND OP.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    GL_SEC_BALANCING_FILTEREDACCESS____PSFT

    Dim – GL Segment<n>

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_FILTEREDACCESS____PSFT', '<ChartfieldString>'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='<ChartfieldCode>' AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    UNION

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_FILTEREDACCESS____PSFT', '<ChartfieldString>'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_CLSS_RULES CLSS, PSOPRDEFN OP, PSROLEUSER ORL, PSROLECLASS RCL WHERE CLSS.OPRCLASS = RCL.CLASSID AND OP.OPRID = ORL.ROLEUSER AND ORL.ROLENAME = RCL.ROLENAME AND CLSS.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='<ChartfieldCode>' AND OP.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    GL_SEC_SEGMENT<n>_FILTEREDACCESS____PSFT


    Connection Pool: "PeopleSoft OLTP"."PeopleSoft OLTP DbAuth Connection Pool"

    Notes:

    - For the Dim – GL Segment<n> init blocks, use the appropriate chartfield string and chartfield code based on the chartfield you are securing. You can get the chartfield code from the PeopleSoft source system, and the chartfield string should match the names used in the file_glacct_segment_config_psft.csv file (a worked sketch for a hypothetical chartfield is shown after Step 4 below).

    - Use 'Default' as the default value for these variables.

    - All the variables created above should end with ____PSFT (four underscores followed by the string PSFT). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.

  2. Create a "row wise" session initialization block and a corresponding session variable to get the hierarchy level in the tree at which each of the above nodes falls. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-134 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    SELECT DISTINCT 'GL_SEC_COSTCENTER_FILTEREDACCESSLEVELS____PSFT', FIXED_HIER_LEVEL FROM W_COST_CENTER_DH WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.GL_SEC_COSTCENTER_FILTEREDACCESS____PSFT)) AND CURRENT_FLG='Y'

    GL_SEC_COSTCENTER_FILTEREDACCESSLEVELS____PSFT

    Dim – Natural Account

    SELECT DISTINCT 'GL_SEC_ACCOUNT_FILTEREDACCESSLEVELS____PSFT', FIXED_HIER_LEVEL FROM W_NATURAL_ACCOUNT_DH WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.GL_SEC_ACCOUNT_FILTEREDACCESS____PSFT)) AND CURRENT_FLG='Y'

    GL_SEC_ACCOUNT_FILTEREDACCESSLEVELS____PSFT

    Dim – Balancing Segment

    SELECT DISTINCT 'GL_SEC_BALANCING_FILTEREDACCESSLEVELS____PSFT', FIXED_HIER_LEVEL FROM W_BALANCING_SEGMENT_DH WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.GL_SEC_BALANCING_FILTEREDACCESS____PSFT)) AND CURRENT_FLG='Y'

    GL_SEC_BALANCING_FILTEREDACCESSLEVELS____PSFT

    Dim – GL Segment<n>

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_FILTEREDACCESSLEVELS____PSFT', FIXED_HIER_LEVEL FROM W_GL_SEGMENT_DH WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.GL_SEC_SEGMENT<n>_FILTEREDACCESS____PSFT)) AND CURRENT_FLG='Y'

    GL_SEC_SEGMENT<n>_FILTEREDACCESSLEVELS____PSFT


    Connection Pool: "Oracle Data Warehouse"."Oracle Data Warehouse Repository Initblocks Connection Pool"

    Notes:

    - The second variable name in the SQL (the one referenced inside VALUELISTOF) comes from the variable names defined in Step 1. Make sure you use the same names.

    - Use 0 as the default value for these variables.

    - All the variables created above should end with ____PSFT (four underscores followed by the string PSFT). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.

  3. Create a "row wise" session initialization block and a corresponding session variable to get all the SETIDs to which the user has partial access for a given segment. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-135 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    SELECT DISTINCT 'GL_SEC_COSTCENTER_FILTEREDACCESSVALUESETS____PSFT', 'Department'||'~'||DEFN.SETID FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='DEPTID' AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    UNION

    SELECT DISTINCT 'GL_SEC_COSTCENTER_FILTEREDACCESSVALUESETS____PSFT', 'Department'||'~'||DEFN.SETID FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN,PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_CLSS_RULES CLSS, PSOPRDEFN OP, PSROLEUSER ORL, PSROLECLASS RCL WHERE CLSS.OPRCLASS = RCL.CLASSID AND OP.OPRID = ORL.ROLEUSER AND ORL.ROLENAME = RCL.ROLENAME AND CLSS.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='DEPTID' AND OP.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    GL_SEC_COSTCENTER_FILTEREDACCESSVALUESETS____PSFT

    Dim – Natural Account

    SELECT DISTINCT 'GL_SEC_ACCOUNT_FILTEREDACCESSVALUESETS____PSFT', 'Account'||'~'||DEFN.SETID FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='ACCOUNT' AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    UNION

    SELECT DISTINCT 'GL_SEC_ACCOUNT_FILTEREDACCESSVALUESETS____PSFT', 'Account'||'~'||DEFN.SETID FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN,PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_CLSS_RULES CLSS, PSOPRDEFN OP, PSROLEUSER ORL, PSROLECLASS RCL WHERE CLSS.OPRCLASS = RCL.CLASSID AND OP.OPRID = ORL.ROLEUSER AND ORL.ROLENAME = RCL.ROLENAME AND CLSS.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='ACCOUNT' AND OP.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    GL_SEC_ACCOUNT_FILTEREDACCESSVALUESETS____PSFT

    Dim – Balancing Segment

    SELECT DISTINCT 'GL_SEC_BALANCING_FILTEREDACCESSVALUESETS____PSFT', 'Fund Code'||'~'||DEFN.SETID FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='FUND_CODE' AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    UNION

    SELECT DISTINCT 'GL_SEC_BALANCING_FILTEREDACCESSVALUESETS____PSFT', 'Fund Code'||'~'||DEFN.SETID FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN,PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_CLSS_RULES CLSS, PSOPRDEFN OP, PSROLEUSER ORL, PSROLECLASS RCL WHERE CLSS.OPRCLASS = RCL.CLASSID AND OP.OPRID = ORL.ROLEUSER AND ORL.ROLENAME = RCL.ROLENAME AND CLSS.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='FUND_CODE' AND OP.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    GL_SEC_BALANCING_FILTEREDACCESSVALUESETS____PSFT

    Dim – GL Segment<n>

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_FILTEREDACCESSVALUESETS____PSFT', '<ChartfieldString>'||'~'||DEFN.SETID FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='<ChartfieldCode>' AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    UNION

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_FILTEREDACCESSVALUESETS____PSFT', '<ChartfieldString>'||'~'||DEFN.SETID FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN,PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_CLSS_RULES CLSS, PSOPRDEFN OP, PSROLEUSER ORL, PSROLECLASS RCL WHERE CLSS.OPRCLASS = RCL.CLASSID AND OP.OPRID = ORL.ROLEUSER AND ORL.ROLENAME = RCL.ROLENAME AND CLSS.KSEC_RULE=EVENTS.KSEC_RULE AND EVENTS.KSEC_RULE=RULES.KSEC_RULE AND RULES.KSEC_RULE=DEFN.KSEC_RULE AND EVENTS.KSEC_EVENT='INQUIRE' AND RULES.KSEC_ATTRIB='A' AND DEFN.KSEC_RULE_PARAM='TRE' AND DEFN.CHARTFIELD='<ChartfieldCode>' AND OP.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    GL_SEC_SEGMENT<n>_FILTEREDACCESSVALUESETS____PSFT


    Connection Pool: "PeopleSoft OLTP"."PeopleSoft OLTP DbAuth Connection Pool"

    Notes:

    - For the Dim – GL Segment<n> init blocks, use the appropriate chartfield string and chartfield code based on the chartfield you are securing. You can get the chartfield code from the PeopleSoft source system, and the chartfield string should match the names used in the file_glacct_segment_config_psft.csv file.

    - Use 'Default' as the default value for these variables.

    - All the variables created above should end with ____PSFT (four underscores followed by the string PSFT). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.

  4. Create a "row wise" session initialization block and a corresponding session variable to get all the SETIDs to which the user has full access for a given chartfield. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-136 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    SELECT DISTINCT 'GL_SEC_COSTCENTER_FULLACCESS____PSFT', COST_CENTER_LOV_ID FROM W_COST_CENTER_D WHERE COST_CENTER_LOV_ID NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_COSTCENTER_FILTEREDACCESSVALUESETS____PSFT)

    GL_SEC_COSTCENTER_FULLACCESS____PSFT

    Dim – Natural Account

    SELECT DISTINCT 'GL_SEC_ACCOUNT_FULLACCESS____PSFT', NATURAL_ACCOUNT_LOV_ID FROM W_NATURAL_ACCOUNT_D WHERE NATURAL_ACCOUNT_LOV_ID NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_ACCOUNT_FILTEREDACCESSVALUESETS____PSFT)

    GL_SEC_ACCOUNT_FULLACCESS____PSFT

    Dim – Balancing Segment

    SELECT DISTINCT 'GL_SEC_BALANCING_FULLACCESS____PSFT', BALANCING_SEGMENT_LOV_ID FROM W_BALANCING_SEGMENT_D WHERE BALANCING_SEGMENT_LOV_ID NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_BALANCING_FILTEREDACCESSVALUESETS____PSFT)

    GL_SEC_BALANCING_FULLACCESS____PSFT

    Dim – GL Segment<n>

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_FULLACCESS____PSFT', SEGMENT_LOV_ID FROM W_GL_SEGMENT_D WHERE SEGMENT_LOV_ID NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_SEGMENT<n>_FILTEREDACCESSVALUESETS____PSFT) AND SEGMENT_LOV_ID LIKE '<ChartfieldString>%'

    GL_SEC_SEGMENT<n>_FULLACCESS____PSFT


    Connection Pool: "Oracle Data Warehouse"."Oracle Data Warehouse Repository Initblocks Connection Pool"

    Notes:

    - For the generic GL Segment dimensions, Dim – GL Segment 1 - 10, you will need to apply an appropriate filter to restrict the SETIDs to those applicable for that chartfield. You can apply a filter on the chartfield string column, which must exactly match the name used in the file_glacct_segment_config_psft.csv file.

    - The second variable name in the SQL (the one referenced inside VALUELISTOF) comes from the variable names defined in Step 3. Make sure you use the same names.

    - Use 'Default' as the default value for these variables.

    - All the variables created above should end with ____PSFT (four underscores followed by the string PSFT). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.
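
The following is a minimal sketch showing how the placeholders in the Dim – GL Segment<n> rows of Steps 1 and 4 might be substituted. It assumes, purely for illustration, that 'Dim – GL Segment3' is mapped to a hypothetical PeopleSoft chartfield whose chartfield code is PROGRAM_CODE and whose chartfield string in file_glacct_segment_config_psft.csv is 'Program'; substitute your own chartfield code, chartfield string, and segment number. Only the first branch of the Step 1 query is shown (the UNION branch that resolves role-based grants is substituted in exactly the same way), and remember that the Step 1 query runs against the PeopleSoft OLTP connection pool while the Step 4 query runs against the Oracle Data Warehouse connection pool.

    -- Step 1 (first branch): filtered-access tree nodes for the hypothetical
    -- Segment3 chartfield PROGRAM_CODE (chartfield string 'Program')
    SELECT DISTINCT 'GL_SEC_SEGMENT3_FILTEREDACCESS____PSFT',
           'Program'||'~'||DEFN.SETID||'~'||DEFN.TREE_NAME||'~'||DEFN.TREE_NODE
    FROM PS_KSEC_RULES RULES, PS_KSEC_RULES_DEFN DEFN, PS_KSEC_RULES_EVEN EVENTS, PS_KSEC_OPR_RULES OPR
    WHERE OPR.KSEC_RULE=EVENTS.KSEC_RULE
      AND EVENTS.KSEC_RULE=RULES.KSEC_RULE
      AND RULES.KSEC_RULE=DEFN.KSEC_RULE
      AND EVENTS.KSEC_EVENT='INQUIRE'
      AND RULES.KSEC_ATTRIB='A'
      AND DEFN.KSEC_RULE_PARAM='TRE'
      AND DEFN.CHARTFIELD='PROGRAM_CODE'
      AND OPR.OPRID = 'VALUEOF(NQ_SESSION.USER)'

    -- Step 4: full-access value sets for the same segment; the LIKE filter
    -- restricts the list to value sets belonging to the 'Program' chartfield string
    SELECT DISTINCT 'GL_SEC_SEGMENT3_FULLACCESS____PSFT', SEGMENT_LOV_ID
    FROM W_GL_SEGMENT_D
    WHERE SEGMENT_LOV_ID NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_SEGMENT3_FILTEREDACCESSVALUESETS____PSFT)
      AND SEGMENT_LOV_ID LIKE 'Program%'

The corresponding session variable names would be GL_SEC_SEGMENT3_FILTEREDACCESS____PSFT and GL_SEC_SEGMENT3_FULLACCESS____PSFT.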

Security Id Expression in the logical dimensions

  1. Each dimension has 32 security columns, Level 0 Security Id through Level 31 Security Id, as shown below. The expression for each of these logical columns needs to be modified using the hierarchy level variable created above.

    Columns used in Security
  2. Open the logical table source of the dimension that maps to the warehouse dimension table and set the expression for each of these columns, following the example for the "Dim – Cost Center" dimension (a sketch for Cost Center is shown after this list). For example, if you are securing by "Dim – GL Segment3" and the hierarchy level variable for this segment is "GL_SEC_SEGMENT3_FILTEREDACCESSLEVELS", you would set the expression for each "Level <n> Security Id" column as follows:

    INDEXCOL( IFNULL( VALUEOF(<n>, NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESSLEVELS"),  VALUEOF(0, NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESSLEVELS")), 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL31_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL30_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL29_SECURITY_ID", 
    …and so on for each security id column…
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL1_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL0_SECURITY_ID")
    
  3. Repeat the above steps for each segment dimension to be secured.
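
As a further illustration, the same expression pattern applied to the "Dim – Cost Center" dimension is sketched below, using the Cost Center hierarchy level variable from Step 2 of the initialization block setup (written without the ____PSFT suffix, following the naming used in the Segment3 example above). The physical alias name Dim_W_COST_CENTER_DH_Security is an assumption; replace it with the actual alias of W_COST_CENTER_DH used in your repository.

    INDEXCOL( IFNULL( VALUEOF(<n>, NQ_SESSION."GL_SEC_COSTCENTER_FILTEREDACCESSLEVELS"),  VALUEOF(0, NQ_SESSION."GL_SEC_COSTCENTER_FILTEREDACCESSLEVELS")), 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_COST_CENTER_DH_Security"."LEVEL31_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_COST_CENTER_DH_Security"."LEVEL30_SECURITY_ID", 
    …and so on for each security id column…
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_COST_CENTER_DH_Security"."LEVEL1_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_COST_CENTER_DH_Security"."LEVEL0_SECURITY_ID")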

Security filters in the "Data Security" application roles

Do the following:

  1. Navigate to "Manage –> Identity" from the menu, open the "General Ledger Data Security" application role, and navigate to "Permissions -> Data Filters". For each of the logical facts secured under this role, you will see some existing filters, which handle ledger security. You will need to append the segment security filters to these filters with an 'AND' condition. A snippet of the segment security filters to be appended for a given segment dimension is given below, assuming that the security is on "Dim – GL Segment3" and that the session variable prefix used in the previous steps was "GL_SEC_SEGMENT3".

    (
    "Core"."Dim - GL Segment3"."Segment Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - GL Segment3"."Segment Value Set Code" = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FULLACCESS") OR
    "Core"."Dim - GL Segment3"."Level 0 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 1 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 2 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - GL Segment3"."Level 30 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 31 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - GL Segment3"."Current Flag Security" = 'Y')
    )
    
  2. Repeat the above for each segment dimension that is secured, using the appropriate variable names for each segment and appending each block of filters with an AND condition. For example, if you are securing by the cost center and segment3 dimensions, the filter will look like the following, which includes the ledger security:

    /* Ledger  security filters */
     (
    "Core"."Dim - Ledger"."Key Id" = VALUEOF(NQ_SESSION."LEDGER")
    )
    /* cost center segment security filters */
    AND
     (
    "Core"."Dim - Cost Center"."Cost Center Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - Cost Center"."Cost Center Value Set Code" = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FULLACCESS") OR
    "Core"."Dim - Cost Center"."Cost Center Level 0 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 1 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 2 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - Cost Center"."Cost Center Level 30 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 31 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - Cost Center"."Current Flag Security" = 'Y')
    )
    /* segment3 security filters */
    AND
     (
    "Core"."Dim - GL Segment3"."Segment Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - GL Segment3"."Segment Value Set Code" = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FULLACCESS") OR
    "Core"."Dim - GL Segment3"."Level 0 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 1 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 2 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - GL Segment3"."Level 30 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 31 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - GL Segment3"."Current Flag Security" = 'Y')
    )
    

    Note: When a tree has more than one version, the security filters are always applied to the current version of that tree (CURRENT_FLG='Y'). You can navigate through any other version of the tree in reports, but security is always applied based on the current version.

B.2.93.4 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the General Ledger subject area.

  • Budget Director PSFT

  • Budget Analyst PSFT

  • Financial Analyst PSFT

  • CFO Group PSFT

  • Controller Group PSFT

These duty roles control which subject areas and dashboard content users get access to. They also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

Note: These roles have access to Accounts Payable, Accounts Receivable, and Fixed Assets data in BI to facilitate drill down from GL to those modules. However, access to data in the respective modules must be provisioned in the PeopleSoft system for these users in order to use the drill-down capabilities.

B.2.94 How to Set Up Project GL Reconciliation Security for PeopleSoft

Overview

Project Analytics supports security using the Ledger dimension in the Project GL Recon subject area.

Table B-137 Project Costing and Control Facts

Security Entity               GL Recon Cost Fact    GL Recon Revenue Fact
Project Business Unit         N                     N
Project Organization          N                     N
Expenditure Business Unit     N                     N
Contract Business Unit        N                     N
Project                       N                     N
Resource Organization         N                     N
Ledger                        Y                     Y


Configuring Project GL Recon for PeopleSoft

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

Enable data security for Project GL Reconciliation for PeopleSoft by enabling the PeopleSoft data security initialization block listed below. If only one source system is deployed, make sure that all Project Security initialization blocks for other adapters are disabled. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Init Blocks

PeopleSoft: Project GL Recon Ledger List PSFT

To Set Up Project GL Reconciliation Security for PeopleSoft

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity, then Application Roles.

  2. Double click on OBIA_PROJECT_LEDGER_DATA_SECURITY, navigate to Permissions, then Data Filters, and enable all data security filters.

  3. Save the metadata repository.

B.2.94.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project GL Recon subject area.

  • OBIA_PSFT_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_PSFT_PROJECT_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_PSFT_PROJECT_DATA_SECURITY

These duty roles control which subject areas and dashboard content users get access to. They also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.95 How to Configure Project Budget Fact for PeopleSoft

Cost Budget is extracted from Project Costing for all Analysis Types within the project's Cost Budget Analysis Group. All extracted Cost Budgets are loaded into the Budget fact table as Raw Cost unless you perform one or both of the following configurations described in this section.

Identifying Project Budget Burden Costs Based on Analysis Type

The ETL process uses the file_Project_Budget_Burden_Analysis_Type_psft.csv flat file to list all Analysis Types for Project Budget Burden Cost. If the ETL process finds the Analysis Type in this flat file, it will not perform further lookups against other lookup tables to determine Project Budget Burden Cost.

To configure the file_Project_Budget_Burden_Analysis_Type_psft.csv file:

  1. Edit the file_Project_Budget_Burden_Analysis_Type_psft.csv file.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter a list of Analysis Types to be considered as Project Budget Burden Costs.

    The format is XXX,1 where XXX is an Analysis Type. The 1 is used as a return value to indicate that this is a Project Budget Burden Cost.

    The following is an example of classifying Costs with BUR and BRD Analysis Types as Project Budget Burden Costs:

    BUR,1
    BRD,1
    
  3. Save and close the file.

Identifying Project Budget Burden Costs Based on a Source Type, Category, and Subcategory Combination of Values

You must configure the following flat files to identify Project Budget Burden Costs based on a combination of Source Type, Category, and Subcategory values. The FSM parameter BURDEN_TYPECATSUB determines whether this lookup is performed for an implementation:

  • file_Project_Cost_Burden_TypeCatSub_config_psft.csv

    The ETL process uses this flat file to designate which columns (Source Type, Category, and Subcategory) are used in the lookup.

  • file_Project_Cost_Burden_TypeCatSub_psft.csv

    The ETL process uses this flat file to list all Source Type, Category, and Subcategory value combinations to be treated as Project Budget Burden Cost.

To configure the file_Project_Cost_Burden_TypeCatSub_config_psft.csv file:

  1. Edit the file_Project_Cost_Burden_TypeCatSub_config_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter only one row with RowID of 1. Enter a Y in each column that represents the combination to be evaluated as a Burden Cost. The columns are:

    • Row ID

    • Source Type

    • Category

    • Subcategory

    The following example shows how to use combinations of Source Type and Category:

    1,Y,Y,
    
  3. Save and close the file.

To configure the file_Project_Cost_Burden_TypeCatSub_psft.csv file:

  1. Edit the file_Project_Cost_Burden_TypeCatSub_psft.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. Enter a list of Resource Type, Resource Category, and Resource Subcategory combinations to be considered as Burden costs. The format is:

    XXXXX,XXXXX,XXXXX,1

    Where XXXXX,XXXXX,XXXXX are the Resource Type, Resource Category, and Resource Subcategory values, respectively.

    The 1 is a return value that indicates that this is a Project Budget Burden Cost. Each combination of lookup values must be specified.

    Wildcards are not supported.

    The following is an example of classifying costs with G&A or FRNG Source Type as Project Budget Burden Costs:

    G&A,,,1
    FRNG,,,1
    

    Note: This CSV file is used in conjunction with the file_Project_Cost_Burden_TypeCatSub_config_psft.csv configuration file. In this example, this configuration file would contain the value 1,Y.

    Table B-138 Example file_Project_Cost_Burden_TypeCatSub_config_psft.csv configuration file

    Source Type    Category    Subcategory
    G&A            <Blank>     <Blank>
    FRNG           LUX         TEMP
    FRNG           BONUS       <Blank>


    Note: You must specify each combination of lookup values. The lookup will use columns with a Y in the configuration file.

  3. Save and close the file.

How to Configure Project Budget Analytics

This section describes how to configure Project Budget Analytics.

  1. In FSM, go to the Manage Data Load Parameters section, then filter for Source PeopleSoft 9.0 or 9.1 FINSCM, Offering Oracle Project Analytics, and Functional Area Project Control and Costing.


  2. Set the following parameters:

    BURDEN_ANALYSIS_TYPE

    Use this parameter to specify Analysis Types as Burden Cost for the lookup. Valid values are:

    - 1. Enables the implementation to perform this lookup.

    - 0. (Default) Disables this lookup.

    BURDEN_TYPECATSUB

    Use this parameter to specify a combination of Source Type, Category, and Subcategory values as Burden Cost for the lookup. Valid values are:

    - 1. Enables this lookup.

    - 0. (Default) Disables this lookup.

  3. Save the details.

B.2.96 How to Set Up General Ledger Security for E-Business Suite

Overview

Financial Analytics supports a combination of the following security mechanisms for GL subject areas:

  • Security using Ledgers

  • Security using GL Accounting Segments

This section gives an overview of the segment security using GL Accounting Segments and supported scenarios in BI Applications.

One or more value sets are used to define the accounting segments in your OLTP. A user can have two different types of access for each value set:

  • Partial Access - The user has access to specific nodes within a value set. If the value set has hierarchical relationships defined between nodes, access can be granted to the user for a given node using the "include" access type. This allows the user to access that node and all its child nodes.

    For example, if a user is granted access to node C, then the user has access to nodes C, D, E, F and G.


    Note: Oracle BI Applications does not support security rules that are set up using the "exclude" access type in Oracle E-Business Suite.

  • Full Access – The user has complete access to the entire value set.

B.2.96.1 Configuring Ledger Security

In order for data security filters to be applied, the appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Ledger Security for E-Business Suite, enable the initialization block for your E-Business Suite version and make sure that the initialization blocks of all other source systems are disabled. The initialization block names relevant to the various source systems are given below. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

  • E-Business Suite 11i: Ledgers EBS11

  • E-Business Suite R12: Ledgers EBS12

  • Oracle PeopleSoft: Ledgers PeopleSoft

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

B.2.96.2 Configuring GL Segment Security

GL Segment Security can be applied on the qualified GL Segment dimensions 'Dim – Cost Center', 'Dim – Natural Account', and 'Dim – Balancing Segment', as well as on the 10 generic dimensions 'Dim – GL Segment1' to 'Dim – GL Segment10', which can be configured to map to any of the accounting segments. That mapping is configured in the task <taskname & link>.

Before setting up security, first identify which of these segment dimensions you need to apply security on, depending on your security requirements and the security setup in the E-Business Suite system. Once that is determined, repeat the following RPD metadata configuration steps for each secured segment dimension.

Initialization Blocks and Session Variables

  1. Create a "row wise" session initialization block and a corresponding session variable to get all the value sets corresponding to the dimension that is secured. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-139 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    SELECT DISTINCT 'GL_SEC_COSTCENTER_VALUESETS____EBS', COST_CENTER_LOV_ID FROM W_COST_CENTER_D WHERE ROW_WID > 0

    GL_SEC_COSTCENTER_VALUESETS____EBS

    Dim – Natural Account

    SELECT DISTINCT 'GL_SEC_ACCOUNT_VALUESETS____EBS', NATURAL_ACCOUNT_LOV_ID FROM W_NATURAL_ACCOUNT_D WHERE ROW_WID > 0

    GL_SEC_ACCOUNT_VALUESETS____EBS

    Dim – Balancing Segment

    SELECT DISTINCT 'GL_SEC_BALANCING_VALUESETS____EBS', BALANCING_SEGMENT_LOV_ID FROM W_BALANCING_SEGMENT_D WHERE ROW_WID > 0

    GL_SEC_BALANCING_VALUESETS____EBS

    Dim – GL Segment<n>

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_VALUESETS____EBS', SEGMENT<n>_ATTRIB FROM W_GLACCT_SEG_CONFIG_TMP

    GL_SEC_SEGMENT<n>_VALUESETS____EBS


    Connection Pool: "Oracle Data Warehouse"."Oracle Data Warehouse Repository Initblocks Connection Pool"

    Notes:

    - For the generic GL Segment dimensions, Dim – GL Segment 1 - 10, you will need to select the corresponding segment column from W_GLACCT_SEG_CONFIG_TMP, which contains all the value sets corresponding to that segment (a worked sketch for a hypothetical Segment3 is shown after the last step in this list).

    - Use 'Default' as the default value for these variables.

    - All the variables created above should end with ____EBS (four underscores followed by the string EBS). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.

  2. Create a "row wise" session initialization block and a corresponding session variable to get all the parent nodes the user has access to in a value set. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-140 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    select DISTINCT 'GL_SEC_COSTCENTER_FILTEREDACCESS____EBS', TO_CHAR(C.FLEX_VALUE_SET_ID) ||'~'||C.FLEX_VALUE from FND_FLEX_VALUE_RULE_USAGES a, FND_FLEX_VALUE_RULE_LINES B, FND_FLEX_VALUES C

    where a.FLEX_VALUE_RULE_ID = B.FLEX_VALUE_RULE_ID and a.FLEX_VALUE_SET_ID = B.FLEX_VALUE_SET_ID and B.FLEX_VALUE_SET_ID = C.FLEX_VALUE_SET_ID and C.FLEX_VALUE between B.FLEX_VALUE_LOW and B.FLEX_VALUE_HIGH and B.INCLUDE_EXCLUDE_INDICATOR = 'I' and C.SUMMARY_FLAG = 'Y' and TO_CHAR(a.FLEX_VALUE_SET_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_COSTCENTER_VALUESETS____EBS) and TO_CHAR(a.RESPONSIBILITY_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_EBS_RESP_ID) and a.APPLICATION_ID = 101

    GL_SEC_COSTCENTER_FILTEREDACCESS____EBS

    Dim – Natural Account

    select DISTINCT 'GL_SEC_ACCOUNT_FILTEREDACCESS____EBS', TO_CHAR(C.FLEX_VALUE_SET_ID) ||'~'||C.FLEX_VALUE from FND_FLEX_VALUE_RULE_USAGES a, FND_FLEX_VALUE_RULE_LINES B, FND_FLEX_VALUES C

    where a.FLEX_VALUE_RULE_ID = B.FLEX_VALUE_RULE_ID and a.FLEX_VALUE_SET_ID = B.FLEX_VALUE_SET_ID and B.FLEX_VALUE_SET_ID = C.FLEX_VALUE_SET_ID and C.FLEX_VALUE between B.FLEX_VALUE_LOW and B.FLEX_VALUE_HIGH and B.INCLUDE_EXCLUDE_INDICATOR = 'I' and C.SUMMARY_FLAG = 'Y' and TO_CHAR(a.FLEX_VALUE_SET_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_ACCOUNT_VALUESETS____EBS) and TO_CHAR(a.RESPONSIBILITY_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_EBS_RESP_ID) and a.APPLICATION_ID = 101

    GL_SEC_ACCOUNT_FILTEREDACCESS____EBS

    Dim – Balancing Segment

    select DISTINCT 'GL_SEC_BALANCING_FILTEREDACCESS____EBS', TO_CHAR(C.FLEX_VALUE_SET_ID) ||'~'||C.FLEX_VALUE from FND_FLEX_VALUE_RULE_USAGES a, FND_FLEX_VALUE_RULE_LINES B, FND_FLEX_VALUES C

    where a.FLEX_VALUE_RULE_ID = B.FLEX_VALUE_RULE_ID and a.FLEX_VALUE_SET_ID = B.FLEX_VALUE_SET_ID and B.FLEX_VALUE_SET_ID = C.FLEX_VALUE_SET_ID and C.FLEX_VALUE between B.FLEX_VALUE_LOW and B.FLEX_VALUE_HIGH and B.INCLUDE_EXCLUDE_INDICATOR = 'I' and C.SUMMARY_FLAG = 'Y' and TO_CHAR(a.FLEX_VALUE_SET_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_BALANCING_VALUESETS____EBS) and TO_CHAR(a.RESPONSIBILITY_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_EBS_RESP_ID) and a.APPLICATION_ID = 101

    GL_SEC_BALANCING_FILTEREDACCESS____EBS

    Dim – GL Segment<n>

    select DISTINCT 'GL_SEC_SEGMENT<n>_FILTEREDACCESS____EBS', TO_CHAR(C.FLEX_VALUE_SET_ID) ||'~'||C.FLEX_VALUE from FND_FLEX_VALUE_RULE_USAGES a, FND_FLEX_VALUE_RULE_LINES B, FND_FLEX_VALUES C

    where a.FLEX_VALUE_RULE_ID = B.FLEX_VALUE_RULE_ID and a.FLEX_VALUE_SET_ID = B.FLEX_VALUE_SET_ID and B.FLEX_VALUE_SET_ID = C.FLEX_VALUE_SET_ID and C.FLEX_VALUE between B.FLEX_VALUE_LOW and B.FLEX_VALUE_HIGH and B.INCLUDE_EXCLUDE_INDICATOR = 'I' and C.SUMMARY_FLAG = 'Y' and TO_CHAR(a.FLEX_VALUE_SET_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_SEGMENT<n>_VALUESETS____EBS) and TO_CHAR(a.RESPONSIBILITY_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_EBS_RESP_ID) and a.APPLICATION_ID = 101

    GL_SEC_SEGMENT<n>_FILTEREDACCESS____EBS


    Connection Pool: "Oracle EBS OLTP"."Oracle EBS OLTP DbAuth Connection Pool"

    Notes:

    - The second variable name in the SQL (the one referenced inside VALUELISTOF) comes from the variable names defined in Step 1. Make sure you use the same names.

    - Use 'Default' as the default value for these variables.

    - All the variables created above should end with ____EBS (four underscores followed by the string EBS). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.

  3. Create a "row wise" session initialization block and a corresponding session variable to get the hierarchy level at which the above nodes fall in a hierarchical value set. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-141 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    SELECT DISTINCT 'GL_SEC_COSTCENTER_FILTEREDACCESSLEVELS____EBS', FIXED_HIER_LEVEL FROM W_COST_CENTER_DH WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.GL_SEC_COSTCENTER_FILTEREDACCESS____EBS)) AND CURRENT_FLG='Y'

    GL_SEC_COSTCENTER_FILTEREDACCESSLEVELS____EBS

    Dim – Natural Account

    SELECT DISTINCT 'GL_SEC_ACCOUNT_FILTEREDACCESSLEVELS____EBS', FIXED_HIER_LEVEL FROM W_NATURAL_ACCOUNT_DH WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.GL_SEC_ACCOUNT_FILTEREDACCESS____EBS)) AND CURRENT_FLG='Y'

    GL_SEC_ACCOUNT_FILTEREDACCESSLEVELS____EBS

    Dim – Balancing Segment

    SELECT DISTINCT 'GL_SEC_BALANCING_FILTEREDACCESSLEVELS____EBS', FIXED_HIER_LEVEL FROM W_BALANCING_SEGMENT_DH WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.GL_SEC_BALANCING_FILTEREDACCESS____EBS)) AND CURRENT_FLG='Y'

    GL_SEC_BALANCING_FILTEREDACCESSLEVELS____EBS

    Dim – GL Segment<n>

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_FILTEREDACCESSLEVELS____EBS', FIXED_HIER_LEVEL FROM W_GL_SEGMENT_DH WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.GL_SEC_SEGMENT<n>_FILTEREDACCESS____EBS)) AND CURRENT_FLG='Y'

    GL_SEC_SEGMENT<n>_FILTEREDACCESSLEVELS____EBS


    Connection Pool: "Oracle Data Warehouse"."Oracle Data Warehouse Repository Initblocks Connection Pool"

    Notes:

    - The second variable name in the SQL (the one referenced inside VALUELISTOF) comes from the variable names defined in Step 2. Make sure you use the same names.

    - Use 0 as the default value for these variables.

    - All the variables created above should end with ____EBS (four underscores followed by the string EBS). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.

  4. Create a "row wise" session initialization block and a corresponding session variable to get all the value sets to which the user has partial access for a given segment. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-142 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    select DISTINCT 'GL_SEC_COSTCENTER_FILTEREDACCESSVALUESETS____EBS', TO_CHAR(A.FLEX_VALUE_SET_ID) FROM FND_FLEX_VALUE_RULE_USAGES A WHERE TO_CHAR(A.FLEX_VALUE_SET_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_COSTCENTER_VALUESETS____EBS) AND TO_CHAR(A.RESPONSIBILITY_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_EBS_RESP_ID) AND A.APPLICATION_ID = 101

    GL_SEC_COSTCENTER_FILTEREDACCESSVALUESETS____EBS

    Dim – Natural Account

    select DISTINCT 'GL_SEC_ACCOUNT_FILTEREDACCESSVALUESETS____EBS', TO_CHAR(A.FLEX_VALUE_SET_ID) FROM FND_FLEX_VALUE_RULE_USAGES A WHERE TO_CHAR(A.FLEX_VALUE_SET_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_ACCOUNT_VALUESETS____EBS) AND TO_CHAR(A.RESPONSIBILITY_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_EBS_RESP_ID) AND A.APPLICATION_ID = 101

    GL_SEC_ACCOUNT_FILTEREDACCESSVALUESETS____EBS

    Dim – Balancing Segment

    select DISTINCT 'GL_SEC_BALANCING_FILTEREDACCESSVALUESETS____EBS', TO_CHAR(A.FLEX_VALUE_SET_ID) FROM FND_FLEX_VALUE_RULE_USAGES A WHERE TO_CHAR(A.FLEX_VALUE_SET_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_BALANCING_VALUESETS____EBS) AND TO_CHAR(A.RESPONSIBILITY_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_EBS_RESP_ID) AND A.APPLICATION_ID = 101

    GL_SEC_BALANCING_FILTEREDACCESSVALUESETS____EBS

    Dim – GL Segment<n>

    select DISTINCT 'GL_SEC_SEGMENT<n>_FILTEREDACCESSVALUESETS____EBS', TO_CHAR(A.FLEX_VALUE_SET_ID) FROM FND_FLEX_VALUE_RULE_USAGES A WHERE TO_CHAR(A.FLEX_VALUE_SET_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_SEGMENT<n>_VALUESETS____EBS) AND TO_CHAR(A.RESPONSIBILITY_ID) = VALUELISTOF(NQ_SESSION.GL_SEC_EBS_RESP_ID) AND A.APPLICATION_ID = 101

    GL_SEC_SEGMENT<n>_FILTEREDACCESSVALUESETS____EBS


    Connection Pool: "Oracle EBS OLTP"."Oracle EBS OLTP DbAuth Connection Pool"

    Notes:

    - The second variable name in the SQL (the one referenced inside VALUELISTOF) comes from the variable names defined in Step 1. Make sure you use the same names.

    - Use 'Default' as the default value for these variables.

    - All the variables created above should end with ____EBS (four underscores followed by the string EBS). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.

  5. Create a "row wise" session initialization block and a corresponding session variable to get all the value sets to which the user has full access for a given segment. Use the SQL queries and session variable names given in the table below, depending on the dimension that is secured.

    Table B-143 Initialization Blocks and Session Variables

    Dimension SQL Variable Name

    Dim – Cost Center

    SELECT DISTINCT 'GL_SEC_COSTCENTER_FULLACCESS____EBS', COST_CENTER_LOV_ID FROM W_COST_CENTER_D WHERE COST_CENTER_LOV_ID NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_COSTCENTER_FILTEREDACCESSVALUESETS____EBS)

    GL_SEC_COSTCENTER_FULLACCESS____EBS

    Dim – Natural Account

    SELECT DISTINCT 'GL_SEC_ACCOUNT_FULLACCESS____EBS', NATURAL_ACCOUNT_LOV_ID FROM W_NATURAL_ACCOUNT_D WHERE NATURAL_ACCOUNT_LOV_ID NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_ACCOUNT_FILTEREDACCESSVALUESETS____EBS)

    GL_SEC_ACCOUNT_FULLACCESS____EBS

    Dim – Balancing Segment

    SELECT DISTINCT 'GL_SEC_BALANCING_FULLACCESS____EBS', BALANCING_SEGMENT_LOV_ID FROM W_BALANCING_SEGMENT_D WHERE BALANCING_SEGMENT_LOV_ID NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_BALANCING_FILTEREDACCESSVALUESETS____EBS)

    GL_SEC_BALANCING_FULLACCESS____EBS

    Dim – GL Segment<n>

    SELECT DISTINCT 'GL_SEC_SEGMENT<n>_FULLACCESS____EBS', SEGMENT<n>_ATTRIB FROM W_GLACCT_SEG_CONFIG_TMP WHERE SEGMENT<n>_ATTRIB NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_SEGMENT<n>_FILTEREDACCESSVALUESETS____EBS)

    GL_SEC_SEGMENT<n>_FULLACCESS____EBS


    Connection Pool: "Oracle Data Warehouse"."Oracle Data Warehouse Repository Initblocks Connection Pool"

    Notes:

    - For the generic GL Segment dimensions, Dim – GL Segment 1 - 10, you will need to select the corresponding segment column from W_GLACCT_SEG_CONFIG_TMP, which contains all the value sets corresponding to that segment.

    - The second variable name in the SQL (the one referenced inside VALUELISTOF) comes from the variable names defined in Step 4. Make sure you use the same names.

    - Use 'Default' as the default value for these variables.

    - All the variables created above should end with ____EBS (four underscores followed by the string EBS). This supports multi-source implementations, where the same variable can be initialized using multiple SQL statements for multiple source systems.
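
The following is a minimal sketch showing how the Dim – GL Segment<n> placeholders might be substituted in the E-Business Suite case, assuming purely for illustration that 'Dim – GL Segment3' is the segment being secured. Only the Step 1 (value sets) and Step 5 (full access) queries are shown, both of which run against the "Oracle Data Warehouse" connection pool; the Step 2 through Step 4 queries are substituted in the same way, replacing <n> with 3.

    -- Step 1: all value sets configured for Segment3 in the warehouse
    SELECT DISTINCT 'GL_SEC_SEGMENT3_VALUESETS____EBS', SEGMENT3_ATTRIB
    FROM W_GLACCT_SEG_CONFIG_TMP

    -- Step 5: Segment3 value sets to which the user has full (unfiltered) access
    SELECT DISTINCT 'GL_SEC_SEGMENT3_FULLACCESS____EBS', SEGMENT3_ATTRIB
    FROM W_GLACCT_SEG_CONFIG_TMP
    WHERE SEGMENT3_ATTRIB NOT IN VALUELISTOF(NQ_SESSION.GL_SEC_SEGMENT3_FILTEREDACCESSVALUESETS____EBS)

The corresponding session variable names would be GL_SEC_SEGMENT3_VALUESETS____EBS and GL_SEC_SEGMENT3_FULLACCESS____EBS.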

Logical Column Expression in the BMM layer

  1. Each dimension has 32 security columns, Level 0 Security Id through Level 31 Security Id, as shown below. The expression for each of these logical columns needs to be modified using the hierarchy level variable created above.

    Columns used in Security
  2. Open the logical table source of the dimension that maps to the warehouse dimension table and set the expression for each of these columns using the example from "Dim – Cost Center" dimension.

    For example, if you are securing by "Dim – GL Segment3" and the hierarchy level variable for this segment is "GL_SEC_SEGMENT3_FILTEREDACCESSLEVELS", you would set the expression for each of the "Level <n> Security Id" column with the following:

    INDEXCOL( IFNULL( VALUEOF(<n>, NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESSLEVELS"),  VALUEOF(0, NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESSLEVELS")), 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL31_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL30_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL29_SECURITY_ID", 
    …and so on for each security id column…
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL1_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL0_SECURITY_ID")
    

Security filters in the "Data Security" application roles

Do the following:

  1. Navigate to "Manage –> Identity" from the menu, open the "General Ledger Data Security" application role, and navigate to "Permissions -> Data Filters". For each of the logical facts secured under this role, you will see some existing filters, which handle ledger security. You will need to append the segment security filters to these filters with an 'AND' condition. A snippet of the segment security filters to be appended for a given segment dimension is given below, assuming that the security is on "Dim – GL Segment3" and that the session variable prefix used in the previous steps was "GL_SEC_SEGMENT3".

    (
    "Core"."Dim - GL Segment3"."Segment Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - GL Segment3"."Segment Value Set Code" = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FULLACCESS") OR
    "Core"."Dim - GL Segment3"."Level 0 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 1 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 2 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - GL Segment3"."Level 30 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 31 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - GL Segment3"."Current Flag Security" = 'Y')
    )
    
  2. Repeat the above for each segment dimension that is secured, using the appropriate variable names for each segment and appending each block of filters with an AND condition. For example, if you are securing by the cost center and segment3 dimensions, the filter will look like the following, which includes the ledger security:

    /* Ledger security filters */
    (
    "Core"."Dim - Ledger"."Key Id" = VALUEOF(NQ_SESSION."LEDGER")
    )
    /* cost center segment security filters */
    AND
     (
    "Core"."Dim - Cost Center"."Cost Center Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - Cost Center"."Cost Center Value Set Code" = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FULLACCESS") OR
    "Core"."Dim - Cost Center"."Cost Center Level 0 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 1 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 2 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - Cost Center"."Cost Center Level 30 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 31 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_COST_CENTER_FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - Cost Center"."Current Flag Security" = 'Y')
    )
    /* segment3 security filters */
    AND
     (
    "Core"."Dim - GL Segment3"."Segment Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - GL Segment3"."Segment Value Set Code" = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FULLACCESS") OR
    "Core"."Dim - GL Segment3"."Level 0 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 1 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 2 Security Id"    = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - GL Segment3"."Level 30 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 31 Security Id"   = VALUEOF(NQ_SESSION."GL_SEC_SEGMENT3_FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - GL Segment3"."Current Flag Security" = 'Y')
    )
    

    Note: When a tree has more than one version, the security filters are always applied to the current version of that tree (CURRENT_FLG='Y'). You can navigate through any other version of the tree in reports, but security is always applied based on the current version.

B.2.96.3 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the General Ledger subject area.

  • Budget Director

  • Budget Analyst

  • Financial Analyst

  • CFO Group

  • Controller Group

These duty roles control which subject areas and dashboard content users get access to. They also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

Note: These roles have access to Accounts Payable, Accounts Receivable, and Fixed Assets data in BI to facilitate drill down from GL to those modules. However, access to data in the respective modules must be provisioned in the E-Business Suite system for these users in order to use the drill-down capabilities.

B.2.97 How to Set Up General Ledger Security for Fusion Applications

This topic describes how to implement GL segment security in Oracle BI Applications with a Fusion Applications source system, and contains the following sections:

B.2.97.1 Introduction

Oracle Financial Analytics supports a combination of the following security mechanisms for GL subject areas:

  • Security using GL Data Access Sets

  • Security using GL Accounting Segments

Data Access Set security is configured during installation and does not require additional configuration. This section gives an overview of the segment security, and describes how to configure security using GL Accounting Segments.

One or more value sets define the accounting segments in your OLTP. You can set up these value sets as a tree value set or non-tree value set. Users can have different types of access for each value set:

  • NOACCESS - User has access to none of the values in that value set.

  • FULLACCESS - User has access to all the values in that value set.

  • FILTEREDACCESS - User has access to specific values in that value set, defined as follows:

    • Tree value set: If the value set has a tree, then the user can be granted access using the "is-descendant of" hierarchical operator. This means that the user has access to that node and all descendants of that node within the value set.

      For example, in the following illustration, if the user is granted "is-descendant of" node C, then the user has access to nodes C, D, E, F, and G.

      [Figure: example showing "is-descendant of" access granted on node C.]
    • Non-tree value set: If the value set does not have a tree, then the user can be granted access to specific nodes or a range of nodes.

B.2.97.2 Configuring GL Segment Security

Before configuring segment security in the Oracle BI Repository, you should have finished configuring the segment dimensions in the Oracle BI Repository by mapping the segment VOs to the appropriate logical dimensions using BI Extender. Then perform the following tasks for each of the segments that you are securing. Based on the value sets used for those segments, a segment can be either a tree-enabled segment or a non-tree segment; the security implementation is different for each case.

B.2.97.2.1 Tree Segment Security Implementation

Perform the following steps when the segment on which security is to be applied is a tree-based segment.

Task 1   Define Initialization Blocks and Session Variables
  1. For tree-based value sets, the data security VO "FscmTopModelAM.DataSecurityAM.KFFHierFilter1" provides the different access types for the user, as described in the previous section. Create a row-wise session initialization block that reads from this VO. A sample SQL statement for this initialization block is as follows.

    SET VARIABLE DISABLE_SQL_BYPASS=1, ApplicationIdBind='101', KeyFlexfieldCodeBind='GL#', SegmentLabelCodeBind='FA_COST_CTR':
    SELECT DISTINCT
        'COST_CENTER_'||AccessType,
        CASE WHEN AccessType = 'FULLACCESS' THEN ValueSetCode
             ELSE ValueSetCode||'~'||TreeCode||'~'||TreeNodePk1Value
        END
    FROM "oracle.apps.fscm.model.analytics.applicationModule.FscmTopModelAM_FscmTopModelAMLocal"..."FscmTopModelAM.DataSecurityAM.KFFHierFilter1"
    

    Turn ON the "Allow deferred execution" option for this initialization block.

    Use the appropriate segment label code for the particular segment and a suitable prefix for the variable name. In the example above, the segment label code used is "FA_COST_CTR" (the SegmentLabelCodeBind value) and the variable prefix used is "COST_CENTER_". This SQL returns (a) the value set codes to which the user has been granted full access and/or (b) the specific parent nodes within a tree to which the user has been granted access using the "is-descendant of" operator.

  2. Create two session variables for the initialization block with the names <prefix>_FULLACCESS and <prefix>_FILTEREDACCESS, where <prefix> is the variable prefix used in the initialization block SQL. For example, in the above case you would define two session variables named COST_CENTER_FULLACCESS and COST_CENTER_FILTEREDACCESS. Default them to the value '-1' (Varchar).

  3. When the user has filtered access, you need to determine the level in the hierarchy (tree) at which each granted node falls. For this, create another row-wise session initialization block. A sample SQL statement is shown below; it uses the FILTEREDACCESS variable created in the previous step.

    SELECT DISTINCT 'COST_CENTER_LEVELS', FIXED_HIER_LEVEL FROM "Oracle Data Warehouse"."Catalog"."dbo"."W_COST_CENTER_DH" WHERE LEVEL0_SECURITY_ID IN (VALUELISTOF(NQ_SESSION.COST_CENTER_FILTEREDACCESS)) AND CURRENT_FLG='Y'
    

    Turn ON the "Allow deferred execution" option for this initialization block.

    Use "<prefix>_LEVELS" for the variable name in the select clause, where <prefix> is the same variable prefix that is used in Steps 2 and 3. Note: The variable name used (in the where clause), should be the same as defined in the previous initialization block.

  4. Create a session variable for the initialization block with the same name as used in the initialization block SQL (COST_CENTER_LEVELS in this example) and default it to the value 0 (number). Set the execution precedence so that the initialization block created in Step 1 (which populates the FILTEREDACCESS variable) runs first.

  5. Use the initialization blocks "Cost Center Security" and "Cost Center Security Top Node Levels" in the default installed repository as references when creating the two initialization blocks above.

  6. Repeat the previous steps for each of the segments to be secured, using different names for the two initialization blocks and the three session variables for each segment.

Task 2   Security id Expression in the logical dimensions

Each segment dimension in the Oracle BI Repository (Dim - Cost Center, Dim - Balancing Segment, Dim - Natural Account Segment, and Dim - GL Segment 1-10) can be either a tree or a non-tree segment, based on your requirements. If you have configured them as tree segments, perform the following steps after creating the initialization blocks and variables described in Task 1.

  1. Each dimension has 32 security columns, Level 0 Security Id through Level 31 Security Id, as shown below. The expression for each of these logical columns must be modified using the hierarchy level variable created above.

    [Figure: shows the Level 0 Security Id through Level 31 Security Id columns used in security.]
  2. Open the logical table source of the dimension that maps to the warehouse dimension table and set the expression for each of these columns, using the "Dim - Cost Center" dimension as an example. For instance, if you are securing by "Dim - GL Segment3" and the hierarchy level variable for this segment is "SEGMENT3_LEVELS", set the expression for each "Level <n> Security Id" column as follows (see the note on INDEXCOL after these steps):

    INDEXCOL( IFNULL( VALUEOF(<n>, NQ_SESSION."SEGMENT3_LEVELS"),  VALUEOF(0, NQ_SESSION."SEGMENT3_LEVELS")), 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL31_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL30_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL29_SECURITY_ID", 
    …and so on for each security id column…
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL1_SECURITY_ID", 
    "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_GL_SEGMENT_DH_Security_Segment3"."LEVEL0_SECURITY_ID")
    
  3. Repeat the above steps for each of the segment dimensions to be secured.
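
    Note: INDEXCOL selects one expression from its list by zero-based position, so the value of the hierarchy level variable determines which physical LEVEL<n>_SECURITY_ID column is exposed at query time. A minimal sketch of the selection behavior (the column names below are placeholders, not repository objects):

    INDEXCOL(0, "ColA", "ColB", "ColC")   /* evaluates to "ColA" */
    INDEXCOL(2, "ColA", "ColB", "ColC")   /* evaluates to "ColC" */

    In the expression in Step 2, the IFNULL appears to fall back to the first value of the row-wise <prefix>_LEVELS variable when the requested value is not populated.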

Task 3   Security Filters in the Data Security Duty Roles

After completing Task 2, add filters to the appropriate Duty Role so that data security predicates are applied to queries.

  1. Navigate to "Manage -> Identity" from the menu.

  2. Open the "OBIA_GENERAL_LEDGER_DATA_SECURITY" Duty Role.

  3. Navigate to "Permissions -> Data Filters".

    For each of the logical facts secured under this role, you will see existing filters that handle data access security. Append the segment security filters to these with an 'AND' condition. A snippet of the segment security filters to be appended for a given segment dimension is shown below, assuming the security is on "Dim - GL Segment3" and the session variable prefix used in the previous steps is "SEGMENT3".

    ("Core"."Dim - GL Segment3"."Segment Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - GL Segment3"."Segment Value Set Code" = VALUEOF(NQ_SESSION."SEGMENT3_FULLACCESS") OR
    "Core"."Dim - GL Segment3"."Level 0 Security Id"    = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 1 Security Id"    = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 2 Security Id"    = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - GL Segment3"."Level 30 Security Id"   = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 31 Security Id"   = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - GL Segment3"."Current Flag Security" = 'Y')
    )
    
    
  4. Repeat the above for each tree-based segment dimension that is secured, using the appropriate variable names for each segment and appending each block of filters with an AND condition. For example, if you are securing by the cost center and segment3 dimensions, the complete filter, including the data access set security, is as follows:

    /* data access security filters */
     (
    "Core"."Dim - GL Data Access Set Security"."Ledger List" = VALUEOF(NQ_SESSION."LEDGER_LIST") 
    OR
    "Core"."Dim - GL Data Access Set Security"."Ledger BSV List" = VALUEOF(NQ_SESSION."LEDGER_BSV_LIST") 
    OR
    "Core"."Dim - GL Data Access Set Security"."Ledger MSV List" = VALUEOF(NQ_SESSION."LEDGER_MSV_LIST")
    )
    /* cost center segment security filters */
    AND
     (
    "Core"."Dim - Cost Center"."Cost Center Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - Cost Center"."Cost Center Value Set Code" = VALUEOF(NQ_SESSION."COST_CENTER_FULLACCESS") OR
    "Core"."Dim - Cost Center"."Cost Center Level 0 Security Id"    = VALUEOF(NQ_SESSION." COST_CENTER _FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 1 Security Id"    = VALUEOF(NQ_SESSION." COST_CENTER _FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 2 Security Id"    = VALUEOF(NQ_SESSION." COST_CENTER _FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - Cost Center"."Cost Center Level 30 Security Id"   = VALUEOF(NQ_SESSION." COST_CENTER _FILTEREDACCESS") OR 
    "Core"."Dim - Cost Center"."Cost Center Level 31 Security Id"   = VALUEOF(NQ_SESSION." COST_CENTER _FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - Cost Center"."Current Flag Security" = 'Y')
    )
    /* segment3 security filters */
    AND
     (
    "Core"."Dim - GL Segment3"."Segment Value Set Code" IS NULL OR 
    ((
    "Core"."Dim - GL Segment3"."Segment Value Set Code" = VALUEOF(NQ_SESSION."SEGMENT3_FULLACCESS") OR
    "Core"."Dim - GL Segment3"."Level 0 Security Id"    = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 1 Security Id"    = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 2 Security Id"    = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
    ...and so on for each security id column...
    "Core"."Dim - GL Segment3"."Level 30 Security Id"   = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
    "Core"."Dim - GL Segment3"."Level 31 Security Id"   = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS")
    )
    AND 
    "Core"."Dim - GL Segment3"."Current Flag Security" = 'Y')
    )
    
    

    Note: When a tree has more than one version, the security filters are always applied to the current version of that tree (CURRENT_FLG='Y'). You can navigate through any other version of the tree in reports, but security is always applied to the current version.

B.2.97.2.2 Non-Tree Segment Security Implementation

Perform the following steps when the segment on which security is to be applied is not a tree-based segment.

Task 1   Define Initialization Blocks and Session Variables
  1. Determine the name of the VO that was generated for the segment. It will follow a naming pattern such as FLEX_VS_<label>_VI, where <label> is the segment label defined in the OLTP.

  2. Create a row-wise session initialization block that reads from this VO.

    A sample SQL statement might be:

    SELECT 'GL_MANAGEMENT_FILTEREDACCESS', ValueSetCode||'~'||Value
    FROM "oracle.apps.fscm.model.analytics.applicationModule.FscmTopModelAM_FscmTopModelAMLocal"..."FscmTopModelAM.AccountBIAM.FLEX_VS_GL_MANAGEMENT2_VI"
    

    Use an appropriate prefix for the variable name; in the example above, the prefix is "GL_MANAGEMENT_". This initialization block returns a concatenation of the value set code and the values the user has access to.

  3. Create a session variable with the same name as the one used above and default it to the value '-1' (Varchar). In the above example, the variable name is "GL_MANAGEMENT_FILTEREDACCESS".

  4. Repeat the above steps for each non-tree segment that needs to be secured.

Task 2   Security Filters in the "Data Security" Data Roles

After you have completed Task 1, add filters to the appropriate Data Role so that data security predicates are applied to queries.

  1. Navigate to "Manage -> Identity" from the menu, and open the "OBIA_GENERAL_LEDGER_DATA_SECURITY" Data Role.

  2. Navigate to "Permissions -> Data Filters", and for each of the logical facts secured under this role, append the following filter to any existing filters with an 'AND' condition. The sample filter will look like:

    (
    "Core"."Dim - GL Segment2"."Segment Value Set Code" IS NULL OR 
    "Core"."Dim - GL Segment2"."Segment Code Id"  = VALUEOF(NQ_SESSION."GL_MANAGEMENT_FILTEREDACCESS")
    )
    
  3. Repeat the previous steps for each non-tree segment dimension that is secured, using the appropriate variable names for each segment and appending each block (one block per segment) with an 'AND' condition. If you have a combination of non-tree and tree segments, apply the data filters accordingly (as explained for each case), appending each filter with an 'AND' condition; see the sketch after this list.
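
For illustration only, the following is a minimal sketch of a combined filter built from the snippets shown earlier in this section, assuming a tree-based "Dim - GL Segment3" (session variable prefix SEGMENT3) and a non-tree "Dim - GL Segment2" secured by the GL_MANAGEMENT_FILTEREDACCESS variable:

    /* tree-based segment3 security filters */
    ("Core"."Dim - GL Segment3"."Segment Value Set Code" IS NULL OR 
    (("Core"."Dim - GL Segment3"."Segment Value Set Code" = VALUEOF(NQ_SESSION."SEGMENT3_FULLACCESS") OR
      "Core"."Dim - GL Segment3"."Level 0 Security Id"    = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS") OR 
      ...and so on for each security id column...
      "Core"."Dim - GL Segment3"."Level 31 Security Id"   = VALUEOF(NQ_SESSION."SEGMENT3_FILTEREDACCESS"))
     AND "Core"."Dim - GL Segment3"."Current Flag Security" = 'Y'))
    /* non-tree segment2 security filters */
    AND
    ("Core"."Dim - GL Segment2"."Segment Value Set Code" IS NULL OR 
     "Core"."Dim - GL Segment2"."Segment Code Id" = VALUEOF(NQ_SESSION."GL_MANAGEMENT_FILTEREDACCESS"))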

B.2.98 How to Configure Financial Analytics CSV Files for JD Edwards EnterpriseOne

The table below lists the CSV worksheet files and the domain values for Financial Analytics for JD Edwards EnterpriseOne.

Table B-144 CSV worksheet files and the domain values for Financial Analytics for JD Edwards EnterpriseOne.

Worksheet File Name | Description

file_group_acct_codes_jde.csv | Lists the Group Account Codes and the corresponding domain values for the JD Edwards EnterpriseOne application.

file_lkp_fiscal_period_Qtr_Config_jde.csv | Lists the Time Dimension Fiscal Period and the corresponding domain values for the JD Edwards application.


How to Configure the file_group_acct_codes_jde.csv

This section explains how to configure the file_group_acct_codes_jde.csv. This flat file is used to identify the Group Account Codes for each object account range for each company. For example, for company 00001 you might specify group account codes for accounts 1000 to 2000 as REVENUE. For a detailed list of the domain values for all possible Group Account Codes, see Oracle Business Analytics Warehouse Data Model Reference.

To configure file_group_acct_codes_jde.csv:

  1. Edit the file file_group_acct_codes_jde.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  2. In the Company field, enter the company that you are setting up, and specify the account ranges in the From and To columns of that row along with the corresponding Group Account Code (see the example after these steps).

  3. Save and close the file.
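
For illustration, configured rows might look like the following. The header names shown here are taken from the Company, From, To, and Group Account Code fields described above and may not match the exact column headings in the delivered file, so always follow the headers in the file you were given; REVENUE comes from the example in this section, and the second code is illustrative only (see Oracle Business Analytics Warehouse Data Model Reference for the full list of domain values).

    Company,From,To,Group Account Code
    00001,1000,2000,REVENUE
    00001,2100,2999,CASH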

B.2.98.1 How to Configure the file_lkp_fiscal_period_Qtr_Config_jde.csv

This section explains how to configure the file_lkp_fiscal_period_Qtr_Config_jde.csv. You must configure this file to support the metrics that are based on the Fiscal Quarter.

To configure file_lkp_fiscal_period_Qtr_Config_jde.csv:

  1. Identify the Fiscal Quarter data in your JD Edwards EnterpriseOne source system by using the following SQL:

    Select CDDTPN,CDFY from F0008
    
  2. Edit the file file_lkp_fiscal_period_Qtr_Config_jde.csv.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

  3. For each Date pattern, set the following fields:

    Table B-145 Fields and Values for file_lkp_fiscal_period_Qtr_Config_jde.csv

    Field | Value
    FiscalPattern | CDDTPN
    YEAR | CDFY
    Period | Period Number Numeric.
    Quarter No | As there is no concept of Quarter in JD Edwards EnterpriseOne, define a numeric quarter number to which the period belongs.
    Quarter Start Date | Customized Quarter Start Date for each period. Each quarter can span as many periods as users configure. The format is DD/MM/YYYY.
    Quarter End Date | Customized Quarter End Date for each period. Each quarter can span as many periods as users configure. The format is DD/MM/YYYY.


    Note: Ensure that there are no unnecessary spaces in the flat file.

  4. Save and close the file.
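
For illustration, using the fields in Table B-145, configured rows for one fiscal pattern might look like the following. The fiscal pattern code, year, and dates are illustrative values only, and the header row should match the delivered file.

    FiscalPattern,YEAR,Period,Quarter No,Quarter Start Date,Quarter End Date
    R,13,1,1,01/01/2013,31/03/2013
    R,13,2,1,01/01/2013,31/03/2013
    R,13,3,1,01/01/2013,31/03/2013
    R,13,4,2,01/04/2013,30/06/2013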

B.2.99 How to Set Up CRM Territory Hierarchy Based Security for Oracle Fusion

Overview

Territory hierarchy based security is widely used in many CRM subject areas, such as Sales, Marketing and Partner Management. Territory based security control starts with the list of territories that the login user works for and the levels these territories belong to in the territory hierarchy. The list of territories and the levels in the territory hierarchy are then used as part of the data filter condition in queries.

There are variations of territory hierarchy based security as it is applied in different areas, although all of them are territory based by nature.

  • For Opportunity and Revenue, visibility is granted to the login user via the following:

    • As member of the territory team that the opportunity is assigned to.

    • As owner or administrator of a parent territory in the hierarchy.

  • For Territory Quota and Resource Quota, visibility is granted to the login user via the following:

    • As team member of the territory that the Quota is created on.

    • As an owner or administrator of a parent territory in the hierarchy.

  • For Forecasting, visibility is granted to the login user via the following:

    • As team member of the territory that the Forecast is created on.

    • As owner or administrator of a parent territory in the hierarchy.

  • For Leads, visibility is granted to the login user via the following:

    • As a team member of the territory that is assigned to the lead.

    • As owner or administrator of a parent territory in the hierarchy of the territory assigned to the lead.

Configuring Territory Hierarchy Based Security

There are three session variables used in territory hierarchy based data security roles.

  • TERR_LIST contains the list of IDs of the territories in which the login user is a team member. This variable is initialized via the session initialization block "Territory List".

  • SUPER_TERR_LIST contains the list of IDs of the territories of which the login user is an owner or administrator. This variable is initialized via the session initialization block "Super Territory List".

  • TERR_HIER_LEVEL_LIST contains the list of levels in the territory hierarchy at which the login user is an owner or administrator of a territory. This variable is initialized via the session initialization block "Territory Hierarchy Level List".
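
As a purely illustrative sketch (the logical table and column names below are hypothetical and are not taken from the delivered repository), a data filter using these variables could follow the same pattern as the GL security filters shown earlier in this appendix:

    /* hypothetical logical columns; see the delivered OBIA_TERRITORY_HIERARCHY_DATA_SECURITY role for the actual filters */
    "Core"."Dim - Territory"."Territory Id" = VALUEOF(NQ_SESSION."TERR_LIST")
    OR
    ("Core"."Dim - Territory"."Parent Territory Id" = VALUEOF(NQ_SESSION."SUPER_TERR_LIST")
     AND "Core"."Dim - Territory"."Territory Hierarchy Level" = VALUEOF(NQ_SESSION."TERR_HIER_LEVEL_LIST"))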

B.2.99.1 Configuring BI Duty Roles

All the Territory Hierarchy Based security roles should be defined as members of the internal role OBIA_TERRITORY_HIERARCHY_DATA_SECURITY, under which all the necessary data filters are defined. In the default configuration, OBIA_TERRITORY_HIERARCHY_DATA_SECURITY has the following members:

  • OBIA_LEAD_ANALYSIS_DUTY

  • OBIA_PARTNER_ANALYSIS_DUTY

  • OBIA_PARTNER_ADMINISTRATIVE_ANALYSIS_DUTY

  • OBIA_PARTNER_CHANNEL_ACCOUNT_MANAGER_ANALYSIS_DUTY

  • OBIA_PARTNER_CHANNEL_ADMINISTRATIVE_ANALYSIS_DUTY

  • OBIA_PARTNER_CHANNEL_ANALYSIS_DUTY

  • OBIA_OPPORTUNITY_LANDSCAPE_ANALYSIS_DUTY

  • OBIA_SALES_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_SALES_MANAGERIAL_ANALYSIS_DUTY

  • OBIA_SALES_TRANSACTIONAL_ANALYSIS_DUTY

These duty roles control which subject areas and dashboard content a user has access to. They also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.100 How to Configure the Asset Location Dimension for E-Business Suite

Asset Location is defined in the Fixed Asset Application in E-Business Suite using the Key Flex Field (KFF) feature. You set up KFF using different segments based on your business needs. For example, one source system might have KFF set up using segment 1, 2, 3, 4 for country, state, city and address, while another source system might have segment 1, 2, 3, 4 for address, city, state, and country, or other additional information.

The configuration file file_fa_location_segment_config_ora.csv is used to configure the segment mapping between the Location KFF in your Fixed Asset application and the Asset Location dimension in Oracle Business Analytics Warehouse. You must configure this file before you start ETL.

Setting up the config file: file_fa_location_segment_config_ora.csv

The file_fa_location_segment_config_ora.csv file is used to map the segment fields in E-Business Suite to the segment fields in the asset location table W_ASSET_LOCATION_D in Oracle Business Analytics Warehouse.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

Edit this file and fill in the segment mapping information. The columns SEG1 to SEG7 represent the data warehouse segment columns in the asset location dimension table. For each of these columns, fill in the corresponding mapped KFF segment. If there is no mapping, leave the field empty.

Example

In Oracle Business Analytics Warehouse, the segment columns store the following conformed values:

W_ASSET_LOCATION_D.segment1 stores the value of Country

W_ASSET_LOCATION_D.segment2 stores the value of State

W_ASSET_LOCATION_D.segment3 stores the value of County

W_ASSET_LOCATION_D.segment4 stores the value of City

W_ASSET_LOCATION_D.segment5 stores Address

In the E-Business Suite, County is not used. In KFF you have Country, State, City, and Address, as follows:

FA_LOCATIONS.segment1 stores the value of Country

FA_LOCATIONS.segment2 stores the value of State

FA_LOCATIONS.segment3 stores the value of City

FA_LOCATIONS.segment4 stores the value of Address

To deploy this scenario, the file_fa_location_segment_config_ora.csv file would be configured as follows:

Table B-146 Example configuration in file_fa_location_segment_config_ora.csv

SEG1 | SEG2 | SEG3 | SEG4 | SEG5 | SEG6 | SEG7

SEGMENT1 | SEGMENT2 | <Empty> | SEGMENT3 | SEGMENT4 | <Empty> | <Empty>
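
Expressed as file content, and assuming the delivered file's header row uses the SEG1 through SEG7 column names shown above (check the actual headers in the file you were given), the configured row for this example might look like:

    SEG1,SEG2,SEG3,SEG4,SEG5,SEG6,SEG7
    SEGMENT1,SEGMENT2,,SEGMENT3,SEGMENT4,,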


B.2.101 How to Configure Project Invoice Fact for E-Business Suite

Line level invoice information is extracted from the Invoice Line table (PA_DRAFT_INVOICE_ITEMS) in the Billing Module of E-Business Suite and loaded into the Invoice Line Fact (W_PROJ_INVOICE_LINE_F). All invoices at any stage of the invoice generation process, such as creation, approval, release, and transfer, are loaded into this table so that customers can see a full view of the invoices. Some of the information available in the Invoice Header table (PA_DRAFT_INVOICES_ALL), such as GL Date and PA Date, and flags such as Write-Off Flag, Concession Flag, Cancelled Flag, and Retention Invoice Flag in E-Business Suite, has also been denormalized into the Invoice Line Fact.

For E-Business Suite, Invoice Currency is the Document Currency for this fact.

Note:

The E-Business Suite concurrent programs for generating draft invoices (such as PRC: Generate Draft Invoices for a Single Project or PRC: Generate Draft Invoices for a Range of Projects) and for transferring invoices to Receivables (such as PRC: Interface streamline Process) should be run before the ETL that loads Oracle Business Analytics Warehouse.

B.2.102 How to Set Up Project Billing and Revenue Security for E-Business Suite

Overview

Project Analytics supports security over the following dimensions in the Project Billing and Revenue subject areas. In Oracle Business Intelligence Applications, the 'Business Unit' entity refers to 'Operating Unit Organizations' in E-Business Suite. The list of Business Units that a user has access to is determined by E-Business Suite grants.

Table B-147 Supported Project Billing and Revenue subject areas

Project Billing & Revenue Facts secured by each dimension:

Dimension Used For Securing | Billing | Revenue | Contract | Funding | Cross Charge - Receiver | Cross Charge - Provider | Cross Charge - Invoice
Project Business Unit | Y | Y | N | Y | Y | N | Y
Project Organization | Y | Y | N | Y | Y | N | Y
Expenditure Business Unit | N | N | N | N | N | Y | N
Contract Business Unit | Y | Y | Y | Y | N | N | N
Project | Y | Y | N | Y | Y | Y | Y
Resource Organization | N | N | N | N | N | N | N
Ledger | N | N | N | N | N | N | N


Configuring Project Billing and Revenue Security for E-Business Suite

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

Enable data security for Project Billing and Revenue in E-Business Suite by enabling the initialization blocks listed below based on the E-Business Suite adaptor. If only one source system is deployed, then you must make sure that all Project Security initialization blocks for other adapters are disabled. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Initialization Blocks for Project Billing and Revenue

  • For R11x

    • Expenditure Business Unit List EBS11x

    • Project Business Unit List Funding EBS11x

    • Project Business Unit List Invoice EBS11x

    • Project Business Unit List Revenue EBS11x

    • Project Contract Business Unit List EBS11x

    • Project Contract Business Unit List Invoice EBS11x

    • Project Contract Business Unit List Revenue EBS11x

  • For R12

    • Expenditure Business Unit List EBSR12

    • Project Business Unit List Funding EBSR12

    • Project Business Unit List Invoice EBSR12

    • Project Business Unit List Revenue EBSR12

    • Project Contract Business Unit List EBSR12

    • Project Contract Business Unit List Invoice EBSR12

    • Project Contract Business Unit List Revenue EBSR12

  • For both R11x and R12

    • Project List Funding EBS

    • Project List Invoice EBS

    • Project List Revenue EBS

    • Project Organization List Funding EBS

    • Project Organization List Invoice EBS

    • Project Organization List Revenue EBS

B.2.102.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project Billing and Revenue subject area.

  • OBIA_EBS_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_EBS_PROJECT_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_EBS_PROJECT_DATA_SECURITY

These duty roles control which subject areas and dashboard content a user has access to. They also ensure that the data security filters are applied to all queries.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.103 How to Set Up CRM Resource Organization Based Security for Oracle Fusion

Overview

Oracle Fusion CRM resource organization based security is applied when the Fusion marketing managers or marketing operational managers access marketing campaigns. It provides the users with access to all marketing campaigns primarily owned by their organizations or child organizations.

Configuring Resource Organization Based Security

There are two session variables used in the resource organization based data security role.

  • RESOURCE_ORG_LIST contains the list of resource organization IDs that the login user belongs to. It is initialized via the session initialization block "Resource Org List".

  • RESOURCE_ORG_HIER_LEVEL_LIST contains the list of levels of the resource organization hierarchy. It is initialized via the session initialization block "RESOURCE_ORG_HIER_LEVEL_LIST".

B.2.103.1 Configuring BI Duty Roles

OBIA_RESOURCE_ORGANIZATION_HIERARCHY_DATA_SECURITY is the internal BI duty role that defines the data filters for resource organization hierarchy based data security. By default, it has the following members:

  • OBIA_MARKETING_OPERATIONAL_ANALYSIS_DUTY

  • OBIA_MARKETING_MANAGERIAL_ANALYSIS_DUTY

These duty roles control which subject areas and dashboard content a user has access to. As members of OBIA_RESOURCE_ORGANIZATION_HIERARCHY_DATA_SECURITY, they also ensure that the primary resource organization hierarchy based data security filters are applied to all queries involving marketing campaigns.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.104 How to Configure Flat Files For Order Item Fact E-Business Suite

Configuration File: file_orderitem_fs.csv

The file is generic and therefore does not support any source Order system-specific features, such as recurring order lines.

Each line in this file will supplement existing E-Business Suite Order Lines with data to populate the following prices:

  • Order Item Approved Invoice Price

  • Order Item Approved Pocket Margin

  • Order Item Approved Pocket Price

  • Order Item Guideline Invoice Price

  • Order Item Guideline Pocket Margin

  • Order Item Guideline Pocket Price

  • Order Item Requested Invoice Price

  • Order Item Requested Pocket Margin

  • Order Item Requested Pocket Price

The granularity of this file is each Order Line ID, which should be the root Order Line ID in the case of configured products.

Each price above should be the rolled-up price in the case of a configured product.

Table B-148 file_orderitem_fs.csv Field Descriptions

Column Name | Data Type | Sample Data | Description
INTEGRATION_ID | VARCHAR2(80) | 344946 | The INTEGRATION_ID for this file will be the Order Line ID. This should be the root Order Line ID in the case of configured products.
GLN_INV_PRI | NUMBER(28,10) | 101.55 | The sales guideline (price policy) invoice price that was applied to the order. If there was no price exception, this is the same as the actual invoice price.
GLN_PKT_PRI | NUMBER(28,10) | 91.534 | The sales guideline (price policy) pocket price that was applied to the order, or the derived guideline pocket price based on the guideline invoice price. If there was no price exception, this is the same as the actual pocket price.
GLN_PKT_MARGIN | NUMBER(28,10) | 30 | What the pocket margin would have been for the order based on its quantity and the guideline pocket price. If there was no price exception, this is the same as the actual pocket margin.
REQ_INV_PRI | NUMBER(28,10) | 1345.12 | The invoice price that was requested (usually by sales) on the price exception request. If there was no price exception, this is the same as the actual invoice price.
REQ_PKT_PRI | NUMBER(28,10) | 122.2 | The pocket price based on the requested invoice price (usually by sales) on the price exception request. If there was no price exception, this is the same as the actual pocket price.
REQ_PKT_MARGIN | NUMBER(28,10) | 1.3 | What the pocket margin would have been for the order based on its quantity and the requested pocket price. If there was no price exception, this is the same as the actual pocket margin.
APPR_INV_PRI | NUMBER(28,10) | 44.44 | The invoice price that was last approved for the order. If there was no price exception, this is the same as the actual invoice price.
APPR_PKT_PRI | NUMBER(28,10) | 22222.2 | The pocket price that was last approved for the order. If there was no price exception, this is the same as the actual pocket price.
APPR_PKT_MARGIN | NUMBER(28,10) | 172222.2 | What the pocket margin would have been for the order based on its quantity and the invoice price that was last approved for the order. If there was no price exception, this is the same as the actual pocket margin.
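
For illustration, one supplemental record for file_orderitem_fs.csv is shown below as column = value pairs, using the sample data from Table B-148. The physical column order and header rows in the delivered file may differ; per the file specifications in Section B.2.110, data rows start from line six of the file.

    INTEGRATION_ID  = 344946
    GLN_INV_PRI     = 101.55
    GLN_PKT_PRI     = 91.534
    GLN_PKT_MARGIN  = 30
    REQ_INV_PRI     = 1345.12
    REQ_PKT_PRI     = 122.2
    REQ_PKT_MARGIN  = 1.3
    APPR_INV_PRI    = 44.44
    APPR_PKT_PRI    = 22222.2
    APPR_PKT_MARGIN = 172222.2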


B.2.105 How to Configure Flat Files For Price Segment Dimension For E-Business Suite

Configuration File: file_pri_segment_ds.csv

The file is generic and therefore does not support any source Pricing system-specific features.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

Each row of this file will supplement existing E-Business Suite Customer Classes that are extracted using the following SQL:

SELECT LOOKUP_CODE, MEANING FROM FND_LOOKUP_VALUES WHERE LOOKUP_TYPE = 'CUSTOMER CLASS' AND VIEW_APPLICATION_ID = 220 AND LANGUAGE = '<Base Language>'

The granularity of this file is each Customer Class Lookup code that has been created in the E-Business Suite Application.

Table B-149 file_pri_segment_ds.csv Field Descriptions

Column Name | Data Type | Sample Data | Description
INTEGRATION_ID | VARCHAR2(80) | Technology | The INTEGRATION_ID for this file will be the Customer Class Code, that is, FND_LOOKUP_VALUES.LOOKUP_CODE.
PROF_ATTR_<n>_CODE | VARCHAR2(50) | Credit | Profile Attribute Code. The Code-Name pair values have to be supplemented as Source domains via file_domain_member_gs.csv.
AUX1_CHANGED_ON_DT | NUMBER(28,10) | <Date> | In order to effect a Type 1 change, this column should be populated with the update date of the record in an Incremental Load.
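
For illustration, one supplemental record for file_pri_segment_ds.csv is shown below as column = value pairs, based on the sample data in Table B-149. The date value is illustrative and uses the YYYYMMDDHH24MISS format described in Section B.2.110.

    INTEGRATION_ID     = Technology
    PROF_ATTR_1_CODE   = Credit
    AUX1_CHANGED_ON_DT = 20130101000000   (populate only to effect a Type 1 change in an incremental load)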


B.2.106 How to Configure Flat Files For Price Strategy Dimension For E-Business Suite

Configuration File: file_pri_strategy_ds.csv

The file is generic and therefore does not support any source Pricing system-specific features.

Each row of this file will supplement existing E-Business Suite Sales Channels that are extracted using the following SQL:

SELECT LOOKUP_CODE, MEANING FROM FND_LOOKUP_VALUES WHERE LOOKUP_TYPE = 'SALES_CHANNEL' AND VIEW_APPLICATION_ID = 660 AND LANGUAGE = '<Base Language>'

The granularity of this file is each Sales Channel Lookup code that has been created in the E-Business Suite Application.

Table B-150 file_pri_strategy_ds.csv Field Descriptions

Column Name | Data Type | Sample Data | Description
INTEGRATION_ID | VARCHAR2(80) | ALUMNI_VISIT | The INTEGRATION_ID for this file will be the Sales Channel Code, that is, FND_LOOKUP_VALUES.LOOKUP_CODE.
STRATEGY_NUM | VARCHAR2(30) | PRODUCT_GL_PLAN_NAME | Not available.
PRODUCT_GL_PLAN_NAME | VARCHAR2(200) | PRICE_LIST_NAME | Not available.
PRICE_LIST_NAME | VARCHAR2(100) | Corporate Price List | Not available.
DEAL_GL_PLAN_NAME | VARCHAR2(200) | Standard Deal Policies | Not available.
PRI_SEGMENT_NAME | VARCHAR2(200) | Industrial – Retain/Harvest | Not available.
VERSION | NUMBER(10) | 1 | Not available.
COMPETITOR_NAME | VARCHAR2(200) | Name | Not available.
AUX1_CHANGED_ON_DT | VARCHAR2(19) | Not available. | This column should be populated in the case of an Incremental load, which should effect a Type 2 change for a Sales Channel.


B.2.107 How to Configure Flat Files For Price Waterfall Element Dimension For E-Business Suite

Configuration File: file_pwf_element_ds.csv

The file is generic and therefore does not support any source Pricing system-specific features.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

Each row of this file will supplement existing E-Business Suite Modifier Line IDs that are extracted using the following SQL:

SELECT QP_LIST_LINES.LIST_LINE_ID MODIFIER_LINE_ID
FROM APPS.QP_LIST_LINES
INNER JOIN APPS.QP_LIST_HEADERS_B
    ON QP_LIST_LINES.LIST_HEADER_ID = QP_LIST_HEADERS_B.LIST_HEADER_ID
INNER JOIN APPS.QP_LIST_HEADERS_TL
    ON QP_LIST_HEADERS_B.LIST_HEADER_ID = QP_LIST_HEADERS_TL.LIST_HEADER_ID
WHERE QP_LIST_HEADERS_TL.LANGUAGE = 'US'
AND QP_LIST_LINES.LIST_LINE_TYPE_CODE IN ('DIS','SUR','PBH','FREIGHT_CHARGE')
AND EXISTS
    (SELECT 1 FROM ASO_PRICE_ADJUSTMENTS
     WHERE QP_LIST_LINES.LIST_LINE_ID = ASO_PRICE_ADJUSTMENTS.MODIFIER_LINE_ID
     AND ASO_PRICE_ADJUSTMENTS.APPLIED_FLAG = 'Y'
     UNION ALL
     SELECT 1 FROM OE_PRICE_ADJUSTMENTS
     WHERE QP_LIST_LINES.LIST_LINE_ID = OE_PRICE_ADJUSTMENTS.LIST_LINE_ID
     AND OE_PRICE_ADJUSTMENTS.APPLIED_FLAG = 'Y')

The granularity of this file is each Modifier Line ID of Modifier Line Type Discount, Surcharge, Price Break Header, or Freight Charge that has caused an adjustment for an Order or Quote line.

Table B-151 file_pwf_element_ds.csv Field Descriptions

Column Name | Data Type | Sample Data | Description
INTEGRATION_ID | VARCHAR2(80) | 15678 | The INTEGRATION_ID for this file will be the Modifier Line ID, that is, QP_LIST_LINES.LIST_LINE_ID.
ELEMENT_TYPE_CODE | VARCHAR2(80) | Revenue Adjustment | Price Element Type Code. Code-Name pair values are supplemented using the file file_domain_member_gs.csv.
BASIS_SEGMENT | VARCHAR2(50) | Ceiling Revenue | Not available.
TOKEN | VARCHAR2(50) | OFF_CEILING | Not available.
REVN_COST_IND | NUMBER(10) | 1 | Not available.
DISP_ON_ZERO_FLG | VARCHAR2(1) | N | Not available.
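
For illustration, one supplemental record for file_pwf_element_ds.csv is shown below as column = value pairs, using the sample data from Table B-151. The ELEMENT_CODE value is hypothetical (element codes such as ceiling adjustment and invoice adjustment are discussed in Section B.2.111), and the physical column order in the delivered file may differ.

    INTEGRATION_ID    = 15678
    ELEMENT_TYPE_CODE = Revenue Adjustment
    ELEMENT_CODE      = Invoice Adjustment   (hypothetical value)
    BASIS_SEGMENT     = Ceiling Revenue
    TOKEN             = OFF_CEILING
    REVN_COST_IND     = 1
    DISP_ON_ZERO_FLG  = N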


B.2.108 How to Configure Flat Files For Quote Item Fact For E-Business Suite file_quoteitem_fs.csv

Configuration File: file_quoteitem_fs.csv

The file is generic and therefore does not support any source quote system-specific features, such as recurring quote lines.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

Each line in this file will supplement existing EBS Quote Lines with data to populate the following prices:

  • Quote Item Approved Invoice Price

  • Quote Item Approved Pocket Margin

  • Quote Item Approved Pocket Price

  • Quote Item Guideline Invoice Price

  • Quote Item Guideline Pocket Margin

  • Quote Item Guideline Pocket Price

  • Quote Item Requested Invoice Price

  • Quote Item Requested Pocket Margin

  • Quote Item Requested Pocket Price

The granularity of this file is each Quote Line ID, which should be the root Quote Line ID in the case of configured products.

Each price above should be the rolled-up price in the case of a configured product.

Table B-152 file_quoteitem_fs.csv Field Descriptions

Column Name | Data Type | Sample Data | Description
INTEGRATION_ID | VARCHAR2(80) | 344946 | The INTEGRATION_ID for this file will be the Quote Line ID. This should be the root Quote Line ID in the case of configured products.
GLN_INV_PRI | NUMBER(28,10) | 101.55 | The sales guideline (price policy) invoice price that was applied to the quote. If there was no price exception, this is the same as the actual invoice price.
GLN_PKT_PRI | NUMBER(28,10) | 91.534 | The sales guideline (price policy) pocket price that was applied to the quote, or the derived guideline pocket price based on the guideline invoice price. If there was no price exception, this is the same as the actual pocket price.
GLN_PKT_MARGIN | NUMBER(28,10) | 30 | What the pocket margin would have been for the quote based on its quantity and the guideline pocket price. If there was no price exception, this is the same as the actual pocket margin.
REQ_INV_PRI | NUMBER(28,10) | 1345.12 | The invoice price that was requested (usually by sales) on the price exception request. If there was no price exception, this is the same as the actual invoice price.
REQ_PKT_PRI | NUMBER(28,10) | 122.2 | The pocket price based on the requested invoice price (usually by sales) on the price exception request. If there was no price exception, this is the same as the actual pocket price.
REQ_PKT_MARGIN | NUMBER(28,10) | 1.3 | What the pocket margin would have been for the quote based on its quantity and the requested pocket price. If there was no price exception, this is the same as the actual pocket margin.
APPR_INV_PRI | NUMBER(28,10) | 44.44 | The invoice price that was last approved for the quote. If there was no price exception, this is the same as the actual invoice price.
APPR_PKT_PRI | NUMBER(28,10) | 22222.2 | The pocket price that was last approved for the quote. If there was no price exception, this is the same as the actual pocket price.
APPR_PKT_MARGIN | NUMBER(28,10) | 172222.2 | What the pocket margin would have been for the quote based on its quantity and the invoice price that was last approved for the quote. If there was no price exception, this is the same as the actual pocket margin.


B.2.109 How to Configure Flat Files For Source Domains in Price Analytics for E-Business Suite

This section provides instructions on how to configure the Source Domain Member Name values for those Codes seeded through the Price flat files. The table below captures the lineage from Presentation Column to Flat file column and relevant Source domain codes.

Table B-153 Configure Flat Files For Source Domains in Price Analytics

Presentation Table | Presentation Column | File Name | File Column | Source Domain Code
Price Waterfall Element | Source Element Code | file_pwf_element_ds.csv | ELEMENT_CODE | PRC_ELEMENT_CODE
Price Waterfall Element | Source Element Type Code | file_pwf_element_ds.csv | ELEMENT_TYPE_CODE | PRC_ELEMENT_TYPE_CODE
Price Profile | Profile Attribute 1 Code | file_pri_segment_ds.csv | PROF_ATTR_1_CODE | PRC_PROFILE_1
Price Profile | Profile Attribute 2 Code | file_pri_segment_ds.csv | PROF_ATTR_2_CODE | PRC_PROFILE_2
Price Profile | Profile Attribute 3 Code | file_pri_segment_ds.csv | PROF_ATTR_3_CODE | PRC_PROFILE_3
Price Profile | Profile Attribute 4 Code | file_pri_segment_ds.csv | PROF_ATTR_4_CODE | PRC_PROFILE_4
Price Profile | Profile Attribute 5 Code | file_pri_segment_ds.csv | PROF_ATTR_5_CODE | PRC_PROFILE_5


Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

Configuration File: file_domain_member_gs.csv

The file is generic and therefore does not support any source Pricing system-specific features.

This file is used to populate data for the following domains, which are not available in E-Business Suite, and should be loaded only if you are supplementing data for any of the dimensions listed below:

  • Price Waterfall Element Dimension – Price Element Type Code (the Conformed Price Element Type will have to be mapped to any new domains introduced through this).

  • Price Segment Dimension - Price Profile Attribute 1 Code

  • Price Segment Dimension - Price Profile Attribute 2 Code

  • Price Segment Dimension - Price Profile Attribute 3 Code

  • Price Segment Dimension - Price Profile Attribute 4 Code

  • Price Segment Dimension - Price Profile Attribute 5 Code

Task SDE_ORA_DomainGeneral_PriceElementType will load the file data to Warehouse staging table W_DOMAIN_MEMBER_GS.

The granularity of this file is each domain member per language for any of the domains listed above.

Table B-154 file_domain_member_gs.csv Field Descriptions

Column Name | Data Type | Sample Data | Description
DOMAIN_CODE | Not available. | Not available. | This should be populated with the Domain Code corresponding to the Source Domain that is to be configured, as listed in Table B-153.
DOMAIN_TYPE_CODE | Not available. | Not available. | Defaulted to 'S', which indicates this is a Source Domain Code.
DOMAIN_MEMBER_CODE | Not available. | Not available. | This should be populated with the CODE value supplied in any of the above files.
DOMAIN_MEMBER_NAME | Not available. | Not available. | This should be populated with the NAME value that corresponds to the Member Code supplied.
DOMAIN_MEMBER_DESCR | Not available. | Not available. | Not available.
DOMAIN_MEMBER_REF_CODE | Not available. | Not available. | Hardcode to '__NOT_APPLICABLE__'.
DOMAIN_MEMBER_DEFN_TYPE_CODE | Not available. | Not available. | Not available.
LANGUAGE_CODE | Not available. | Not available. | Warehouse Language Code.
SRC_LANGUAGE_CODE | Not available. | Not available. | Source Language Code.
INTEGRATION_ID | Not available. | Not available. | This is the unique ID for the record. The INTEGRATION_ID for this file can also be populated as DOMAIN_CODE~DOMAIN_MEMBER_CODE.
DATASOURCE_NUM_ID | Not available. | Not available. | The unique Data Source ID of the source instance you are configuring.
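
For illustration, one record of file_domain_member_gs.csv is shown below as column = value pairs, mapping a source Price Element Type Code supplied in file_pwf_element_ds.csv to its display name. The language codes and data source number are placeholders that you must replace with the values for your installation, and the physical column order in the delivered file may differ.

    DOMAIN_CODE            = PRC_ELEMENT_TYPE_CODE
    DOMAIN_TYPE_CODE       = S
    DOMAIN_MEMBER_CODE     = Revenue Adjustment
    DOMAIN_MEMBER_NAME     = Revenue Adjustment
    DOMAIN_MEMBER_REF_CODE = __NOT_APPLICABLE__
    LANGUAGE_CODE          = <Warehouse Language Code>
    SRC_LANGUAGE_CODE      = <Source Language Code>
    INTEGRATION_ID         = PRC_ELEMENT_TYPE_CODE~Revenue Adjustment
    DATASOURCE_NUM_ID      = <your DATASOURCE_NUM_ID>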


B.2.110 How to Configure Flat Files in Price Analytics for E-Business Suite

Background

By default, Oracle Price Analytics sources data from the Quoting, Order Management, and Advanced Pricing modules that are available for E-Business Suite. Additionally, a flat file option is provided to supplement dimension attributes and additional Order or Quote line prices (for example, Guideline Invoice Price) that are not available in vanilla implementations of the above modules.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

ETL from Flat Files

The ETL process loads the non-E-Business Suite data from flat files and data from E-Business Suite Applications database tables into staging tables; then loads data from the staging tables into Oracle Business Analytics Warehouse.

Table B-155 Flat Files and target tables in Oracle Business Analytics Warehouse

Flat File | Description | Target Table
file_pri_strategy_ds | This file holds data for additional attributes to supplement EBS Sales Channel Codes. | W_PRI_STRATEGY_D (Price Strategy: a grouping of pricing rules that define the approach for achieving a specific goal around selling and pricing and products, targeted at a Pricing Segment, sub-segment and specific selling situation.)
file_pri_segment_ds | This file holds data for additional attributes to supplement EBS Customer Class Codes. | W_PRI_SEGMENT_D (Price Segment: a collection or grouping of customers that exhibit a common set of characteristics and buying behaviors in relation to a vendor of products or services. The Pricing Segment is usually a further refinement of the Market Segment in order to allow the pricing strategist to categorize and understand sets of customers who will respond to common pricing tactics.)
file_pwf_element_ds | This file holds data for additional attributes to supplement EBS Modifier Lines. | W_PWF_ELEMENT_D (Price Waterfall Element: PWF elements are the various components that make up a Price Waterfall and include both (a) summed-up price/revenues such as ceiling and segment revenues and (b) the various adjustments, which may be either a unitized adjustment or an actual dollar adjustment for a discounting program, incentive, expense or cost summed-up price/revenue.)
file_quoteitem_fs | This file holds the Approved, Requested and Guideline Invoice/Pocket Price and Margins corresponding to transaction data for Quote Item. | W_QUOTEITEM_F (Quote Item: stores the various quote line revenue amounts and adjustments. The grain of this table is the Quote Line.)
file_orderitem_fs | This file holds the Approved, Requested and Guideline Invoice/Pocket Price and Margins corresponding to transaction data for Order Item. | W_ORDERITEM_F (Order Item: stores the various order line revenue amounts and adjustments. The grain of this table is the Order Line.)


Configuring Domains via Flat Files

Source Domain member values for the Code dimension attributes that are supplemented using these files can be populated in the Warehouse via a domains file, that is, file_domain_member_gs.csv.

File Specifications

These files are used across all adaptors; only the columns supported for the E-Business Suite 12.1.3 adaptor need to be populated, and the other columns should be populated with NULL.

The columns supported for a file are listed under File Structure in subsequent sections.

These files must exist in the E-Business Suite 12.1.3 source files folder even if they are not being used to supplement data; otherwise, the Extract tasks will fail.

The data in the source files should conform to the following specifications:

  • Data should be in CSV files (*.csv).

  • For Full Load ETL, the files should contain all initial records that are supposed to be loaded into Oracle Business Analytics Warehouse; for incremental ETL, the files should contain only new or updated records.

  • All columns in the files should follow E-Business Suite application data model terms and standards, and all ID columns in the files are expected to have corresponding E-Business Suite IDs.

  • Data should start from line six of each file. The first five lines of each file will be skipped during ETL process.

  • Each row represents one unique record in staging table.

  • All date values should be in the format of YYYYMMDDHH24MISS. For example, 20071231140300 should be used for December 31, 2007, 2:03 pm.

  • Amount or Price column values should be in the same Document Currency Code used in the OLTP transaction.

  • Column INTEGRATION_ID in all flat files cannot be NULL, because it serves either as (i) the lookup key when supplementing OLTP data or (ii) the primary key when the file serves as the primary source.

B.2.111 How to Configure Flat Files For Source Domains in Price Analytics for Siebel Applications

This section contains configuration steps that you need to perform on Oracle Price Analytics to populate Source and Conformed Domain Code Members. It contains the following topics: Section B.2.111.1, "How to Configure Source Domain Member Name Values"; Section B.2.111.2, "How To Configure Conformed Domain Members"; and Section B.2.111.3, "Default Seeded Domain Members".

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

B.2.111.1 How to Configure Source Domain Member Name Values

This section provides instructions on how to configure the Source Domain Member Name values for those Codes seeded through the Price flat files. The table below captures the lineage from Presentation Column to Flat file column and relevant Source domain codes.

Table B-156 Configure Flat Files For Source Domains in Price Analytics

Presentation Table | Presentation Column | File Name | File Column | Source Domain Code
Price Waterfall Element | Source Element Code | file_pwf_element_ds.csv | ELEMENT_CODE | PRC_ELEMENT_CODE
Price Waterfall Element | Source Element Type Code | file_pwf_element_ds.csv | ELEMENT_TYPE_CODE | PRC_ELEMENT_TYPE_CODE
Price Waterfall Element | Price Group Code | file_pwf_element_ds.csv | GROUP_CODE | PRC_GROUP_CODE
Price Profile | Profile Attribute 1 Code | file_pri_segment_ds.csv | PROF_ATTR_1_CODE | PRC_PROFILE_1
Price Profile | Profile Attribute 2 Code | file_pri_segment_ds.csv | PROF_ATTR_2_CODE | PRC_PROFILE_2
Price Profile | Profile Attribute 3 Code | file_pri_segment_ds.csv | PROF_ATTR_3_CODE | PRC_PROFILE_3
Price Profile | Profile Attribute 4 Code | file_pri_segment_ds.csv | PROF_ATTR_4_CODE | PRC_PROFILE_4
Price Profile | Profile Attribute 5 Code | file_pri_segment_ds.csv | PROF_ATTR_5_CODE | PRC_PROFILE_5


For example, the different Element Types used in the sample data in the file file_pwf_element_ds.csv are the following:

  • Segment

    The revenues that are part of a waterfall, such as ceiling revenue, list revenue, and so on.

  • Revenue Adjustment

    The adjustments made to the segment elements, for example, ceiling adjustment, invoice adjustment, and so on.

  • Cost Adjustment

    All other adjustments that are not part of any segment.

The corresponding Name values for the above Element Types, or for any of the source domain members specified in Table B-156, have to be supplied via the file file_domain_member_gs.csv in order for them to show up when querying Names in Analytics.

File_domain_member_gs.csv

  • The file is generic and therefore does not support any source Pricing system specific features.

  • Task SDE_DomainGeneral_PriceElementType will load the file data to Warehouse staging table W_DOMAIN_MEMBER_GS.

  • The granularity of this file is each domain member per language for any of the domains listed above.

Table B-157 file_domain_member_gs.csv Field Descriptions

Column Name | Description (Data Type and Sample Data values are not available for this file.)
DOMAIN_CODE | This should be populated with the Domain Code corresponding to the Source Domain that is to be configured, as per Table B-156.
DOMAIN_TYPE_CODE | Defaulted to 'S', which indicates that this is a Source Domain Code.
DOMAIN_MEMBER_CODE | This should be populated with the CODE value supplied in any of the above files.
DOMAIN_MEMBER_NAME | This should be populated with the NAME value that corresponds to the Member Code supplied.
DOMAIN_MEMBER_DESCR | Not available.
DOMAIN_MEMBER_REF_CODE | Hardcode to '__NOT_APPLICABLE__'.
DOMAIN_MEMBER_DEFN_TYPE_CODE | Not available.
LANGUAGE_CODE | Warehouse Language Code.
SRC_LANGUAGE_CODE | Source Language Code.
INTEGRATION_ID | This is the unique ID for the record. The INTEGRATION_ID for this file can also be populated as DOMAIN_CODE~DOMAIN_MEMBER_CODE.
DATASOURCE_NUM_ID | The unique Data Source ID of the Siebel Instance you are configuring.
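
For illustration only, the following sketch shows how Name values for the three Element Types described earlier (Segment, Revenue Adjustment, and Cost Adjustment) could be supplied in file_domain_member_gs.csv. The column order, the 'US' language codes, and the DATASOURCE_NUM_ID value of 999 are assumptions for this sketch; use the header row of the file delivered with the installation and the Data Source Number configured for your source instance, and leave the columns marked "Not available." empty unless instructed otherwise.

DOMAIN_CODE,DOMAIN_TYPE_CODE,DOMAIN_MEMBER_CODE,DOMAIN_MEMBER_NAME,DOMAIN_MEMBER_DESCR,DOMAIN_MEMBER_REF_CODE,DOMAIN_MEMBER_DEFN_TYPE_CODE,LANGUAGE_CODE,SRC_LANGUAGE_CODE,INTEGRATION_ID,DATASOURCE_NUM_ID
PRC_ELEMENT_TYPE_CODE,S,Segment,Segment,,__NOT_APPLICABLE__,,US,US,PRC_ELEMENT_TYPE_CODE~Segment,999
PRC_ELEMENT_TYPE_CODE,S,Revenue Adjustment,Revenue Adjustment,,__NOT_APPLICABLE__,,US,US,PRC_ELEMENT_TYPE_CODE~Revenue Adjustment,999
PRC_ELEMENT_TYPE_CODE,S,Cost Adjustment,Cost Adjustment,,__NOT_APPLICABLE__,,US,US,PRC_ELEMENT_TYPE_CODE~Cost Adjustment,999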


B.2.111.2 How To Configure Conformed Domain Members

There are two conformed domains used in Oracle Price Analytics as summarized in the table below:

Table B-158 Configure Flat Files For Source Domains in Price Analytics

Presentation Table | Presentation Column | File Name | File Column | Conformed Domain
Price Waterfall Element | Element Code | file_pwf_element_ds.csv | W_ELEMENT_CODE | Conformed Price Waterfall Element
Price Waterfall Element | Element Type | file_pwf_element_ds.csv | W_ELEMENT_TYPE_CODE | Conformed Price Waterfall Element Type


The source file file_pwf_element_ds.csv should already have the conformed domain mapped and should have values for the above columns for the corresponding Source Domain Code Member.

As these conformed domains are extensible by the user, the Name values for the conformed domain members specified in the table above should be entered in Oracle BI Applications Configuration Manager.
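
For illustration only, a row in file_pwf_element_ds.csv might carry both the source codes and the conformed codes along the following lines. The ELEMENT_CODE and GROUP_CODE values, and the layout, are hypothetical; only the columns named in the tables above are shown, and the conformed values come from the default mappings listed in the next section.

ELEMENT_CODE = INVOICE_PRICE          (hypothetical source element code)
ELEMENT_TYPE_CODE = Segment
GROUP_CODE = STD                      (hypothetical price group code)
W_ELEMENT_CODE = Invoice Revenue      (Conformed Price Waterfall Element)
W_ELEMENT_TYPE_CODE = Segment         (Conformed Price Waterfall Element Type)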

B.2.111.3 Default Seeded Domain Members

This section lists the domain members that are seeded by default. The table below shows the default mapping from the source Price Waterfall Element domain to the Conformed Price Waterfall Element domain.

Table B-159 Domain Map: Price Waterfall Element -> Conformed Price Waterfall Element

Source Price Waterfall Element Member Code | Source Price Waterfall Element Name | Conformed Price Waterfall Element Member Code | Conformed Price Waterfall Element Name
Ceiling Revenue | Ceiling Revenue | Ceiling Revenue | Ceiling Revenue
Cost | Cost | Cost | Cost
Invoice Revenue | Invoice Revenue | Invoice Revenue | Invoice Revenue
Pocket Margin | Pocket Margin | Pocket Margin | Pocket Margin
Pocket Revenue | Pocket Revenue | Pocket Revenue | Pocket Revenue
Segment Revenue | Segment Revenue | Segment Revenue | Segment Revenue


The table below shows the default mapping from the source Price Waterfall Element Type domain to the Conformed Price Waterfall Element Type domain.

Table B-160 Domain Map: Price Waterfall Element Type -> Conformed Price Waterfall Element Type

Source Price Waterfall Element Type Member Code | Source Price Waterfall Element Type Name | Conformed Price Waterfall Element Type Member Code | Conformed Price Waterfall Element Type Name
Cost Adjustment | Cost Adjustment | Cost Adjustment | Cost Adjustment
Revenue Adjustment | Revenue Adjustment | Revenue Adjustment | Revenue Adjustment
Segment | Segment | Segment | Segment


These members can be included in the file file_pwf_element_ds.csv, or new source and conformed domain members can be entered as described in the previous sections.

B.2.112 How To Configure Flat Files in Price Analytics for Siebel Applications

This section describes how to configure Oracle Price Analytics. It contains the following topics:

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

B.2.112.1 Configuration Required Before a Full Load for Oracle Price Analytics

This section contains configuration steps that you need to perform on Oracle Price Analytics before you do a full data load.

Configuration Steps for Universal Sources

Oracle Price Analytics relies on data from universal sources, such as flat files, for waterfall related data.

The Table below lists the flat file source tables and the corresponding data warehouse tables for waterfall related data.

Table B-161 Flat File Source Tables and Corresponding Warehouse Tables

Flat File | Description | Loads Target
FILE_PRI_STRATEGY_DS | This file holds information about the different pricing strategies being used. | W_PRI_STRATEGY_D
FILE_PRI_SEGMENT_DS | This file holds the different pricing segment details. | W_PRI_SEGMENT_D
FILE_PWF_ELEMENT | This file contains information about the different waterfall elements. | W_PWF_ELEMENT_D
FILE_ORDIT_WTR_LOG_FS | This file holds the waterfall information for all the transaction data for Order Item. | W_ORDIT_WTR_LOG_F
FILE_QTEIT_WTR_LOG_FS | This file holds the waterfall information for all the transaction data for Quote Item. | W_QTEIT_WTR_LOG_F


B.2.112.1.1 Populating Flat File Data For Siebel Sources

This section provides guidelines for populating pricing data into flat files when the source is Siebel.

Oracle Price Analytics does not provide a way to load pricing strategy, pricing segment or price waterfall element information from a Siebel source. All such dimensions must be loaded with a universal source, such as flat files.

The source files for the pricing-related dimensions must conform to the following rules:

  • The Pricing Segment and Pricing Strategy IDs provided in the flat file must be the same for all the order lines in any given order.

  • The ROW_ID must be unique in all the flat files because the ROW_IDs are used to form the Integration IDs.

  • The information added must be consistent with the existing data in the Siebel system. For instance, the Competitor Name added in the file must exist in the source system for proper resolution.

  • The Order Line IDs in the Order Item Waterfall fact source must exist in the source table S_ORDER_ITEM.

  • The Quote Line IDs in Quote Item Waterfall fact source must be a part of source table S_QUOTE_ITEM.
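
A quick way to verify the last two rules is a NOT EXISTS query against the Siebel source. This is a sketch only: it assumes the Order Item waterfall file has already been loaded into a staging table (called W_ORDIT_WTR_LOG_FS here) that exposes the order line ID in a column called ORDER_ITEM_ID, and that the staging table and S_ORDER_ITEM are reachable from the same connection (for example, over a database link); adjust the names to your environment. The same pattern applies to quote lines against S_QUOTE_ITEM.

-- Flag order line IDs in the waterfall staging data that have no match in the
-- Siebel source table S_ORDER_ITEM. The staging table and column names are
-- assumptions for this sketch.
SELECT stg.ORDER_ITEM_ID
FROM   W_ORDIT_WTR_LOG_FS stg
WHERE  NOT EXISTS (SELECT 1
                   FROM   S_ORDER_ITEM oi
                   WHERE  oi.ROW_ID = stg.ORDER_ITEM_ID);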

The Oracle Price Analytics facts W_ORDIT_WTR_LOG_F and W_QTEIT_WTR_LOG_F are loaded using the Order Item and Quote Item facts as well as flat files.

The pricing columns in the Order Item and Quote Item facts are loaded as shown in the table below.

Table B-162 Pricing columns in the Order Item and Quote Item facts

Column Name | Expression
CEIL_PRI | IIF(ISNULL(FLAT_FILE_DATA), START_PRI, FLAT_FILE_DATA)
SEG_PRI | IIF(ISNULL(FLAT_FILE_DATA), START_PRI, FLAT_FILE_DATA)
INV_PRI | IIF(ISNULL(FLAT_FILE_DATA), NET_PRI, FLAT_FILE_DATA)
PKT_PRI | IIF(ISNULL(FLAT_FILE_DATA), NET_PRI, FLAT_FILE_DATA)
PKT_MARGIN | IIF(ISNULL(FLAT_FILE_DATA), START_PRI - NET_PRI, FLAT_FILE_DATA)


If you need to load different values for the pricing columns other than the existing prices, you can use the flat files FILE_ORDERITEM_FS.csv and FILE_QUOTEITEM_FS.csv. Based on the Integration IDs, the pricing data is looked up from these flat files and loaded into the fact tables.
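
The fallback logic in the table above can also be read as a COALESCE. The following is a sketch only: the table aliases (oi for the Order Item source data, ff for a staging of FILE_ORDERITEM_FS.csv) and the assumption that the flat file carries columns with the same names are for illustration.

-- Illustrative restatement of the expressions in the table above; each pricing
-- column falls back to the existing price when no flat file value is supplied.
SELECT COALESCE(ff.CEIL_PRI,   oi.START_PRI)               AS CEIL_PRI,
       COALESCE(ff.SEG_PRI,    oi.START_PRI)               AS SEG_PRI,
       COALESCE(ff.INV_PRI,    oi.NET_PRI)                 AS INV_PRI,
       COALESCE(ff.PKT_PRI,    oi.NET_PRI)                 AS PKT_PRI,
       COALESCE(ff.PKT_MARGIN, oi.START_PRI - oi.NET_PRI)  AS PKT_MARGIN
FROM   ORDER_ITEM_SRC oi                    -- hypothetical Order Item source
LEFT JOIN FILE_ORDERITEM_STG ff             -- hypothetical staging of FILE_ORDERITEM_FS.csv
       ON ff.INTEGRATION_ID = oi.INTEGRATION_ID;   -- the lookup is by Integration ID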

Note: Even if you are not supplementing QUOTEITEM or ORDERITEM data, empty files must be present in the adaptor source files folder so that the extract tasks do not fail.

B.2.112.1.2 Populating Flat File Data for Non-Siebel Sources

This section provides guidelines for populating pricing data into flat files for non-Siebel sources.

For non-Siebel sources, the source files for the pricing-related dimensions must conform to the following rules:

  • The Order Line IDs in the Order Item Waterfall fact source must exist in fact file source FILE_ORDERITEM_FS.

  • The Quote Line IDs in Quote Item Waterfall fact source must be a part of the fact file source FILE_QUOTEITEM_FS.

  • Ensure all the ROW_IDs are unique so as to avoid any duplication or index issues.

  • All the fact IDs added must be consistent with the ROW_ID of dimension file sources for proper resolution.

B.2.112.1.3 Data Standards for Flat Files

The flat files used for Oracle Price Analytics facts, such as FILE_ORDIT_WTR_LOG_FS and FILE_QTEIT_WTR_LOG_FS, must be consistent with the line item tables. The prices in the waterfall log table must match the aggregated price in the line item tables. In the case of assembled or packaged products, the item tables store the package or assembly and the individual items that make up the package or assembly as separate line items. The line items in the flat file must store the individual prices and not rolled-up prices. That is, if a package does not have a price and only the items inside it have prices, then either the price of the package should be 0 and the items should carry their prices, or the package should carry the rolled-up price and the item prices should be 0, to prevent double counting. For example, if a laptop package sells for 1,000 and consists of a base unit and a docking station, either the package line carries 1,000 and the two item lines carry 0, or the package line carries 0 and the item lines carry prices that sum to 1,000. Also, the waterfall log table should store only the package or assembly, and not the items that comprise it, and the price should be the rolled-up price for a unit package or assembly.

B.2.112.2 Price Waterfall Element Sample Data

This section provides price waterfall element sample data.

B.2.112.2.1 Example of an Order for a Simple Product

In this scenario, a simple order is created for a company that manufactures and sells lap tops. The graphic below shows an example of the order information in the Order Item fact table.

Sample Data for a Simple Product


The graphic below shows an example of the Order Item waterfall log fact data for the transaction.

Order Item Waterfall Log Fact Data for a Simple Product


As this example shows, each waterfall element is stored as an individual record and the Waterfall Element dimension identifies whether the element is a revenue or an adjustment.

B.2.112.2.2 Example of an Order for a Configured Product

This section shows an example of an order for an assembled product that has multiple child products.

Sample Data for an Assembled Product


The Price Waterfall is stored for the packaged product and not the individual child items. The graphic below shows an example of the Order Item waterfall log fact data for the transaction.

Order Item Waterfall Log Fact Data for an Assembled Product


B.2.113 How to Configure Flat Files For Source Domains in Price Analytics for Universal

This section contains configuration steps that you need to perform on Oracle Price Analytics to populate Source and Conformed Domain Code Members. It contains the following topics:

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

B.2.113.1 How to Configure Source Domain Member Name Values

This section provides instructions on how to configure the Source Domain Member Name values for those Codes seeded through the Price flat files. The table below captures the lineage from Presentation Column to Flat file column and relevant Source domain codes.

Table B-163 Configure Flat Files For Source Domains in Price Analytics

Presentation Table | Presentation Column | File Name | File Column | Source Domain Code
Price Waterfall Element | Source Element Code | file_pwf_element_ds.csv | ELEMENT_CODE | PRC_ELEMENT_CODE
Price Waterfall Element | Source Element Type Code | file_pwf_element_ds.csv | ELEMENT_TYPE_CODE | PRC_ELEMENT_TYPE_CODE
Price Waterfall Element | Price Group Code | file_pwf_element_ds.csv | GROUP_CODE | PRC_GROUP_CODE
Price Profile | Profile Attribute 1 Code | file_pri_segment_ds.csv | PROF_ATTR_1_CODE | PRC_PROFILE_1
Price Profile | Profile Attribute 2 Code | file_pri_segment_ds.csv | PROF_ATTR_2_CODE | PRC_PROFILE_2
Price Profile | Profile Attribute 3 Code | file_pri_segment_ds.csv | PROF_ATTR_3_CODE | PRC_PROFILE_3
Price Profile | Profile Attribute 4 Code | file_pri_segment_ds.csv | PROF_ATTR_4_CODE | PRC_PROFILE_4
Price Profile | Profile Attribute 5 Code | file_pri_segment_ds.csv | PROF_ATTR_5_CODE | PRC_PROFILE_5


For example, the different Element Types used in the sample data in file – file_pwf_element_ds.csv are the following:

  • Segment

    The revenues that are part of a waterfall, such as ceiling revenue, list revenue, and so on.

  • Revenue Adjustment

    The adjustments made to the segment elements, for example, ceiling adjustment, invoice adjustment, and so on.

  • Cost Adjustment

    All other adjustments that are not part of any segment.

The corresponding Name values for the above Element Types, or for any of the source domain members specified in Table B-163, must be supplied via the file file_domain_member_gs.csv in order for them to appear when querying Names in Analytics.

File_domain_member_gs.csv

  • The file is generic and therefore does not support any source Pricing system specific features.

  • Task SDE_DomainGeneral_PriceElementType will load the file data to Warehouse staging table W_DOMAIN_MEMBER_GS.

  • The granularity of this file is each domain member per language for any of the domains listed above.

Table B-164 file_domain_member_gs.csv Field Descriptions

Column Name | Description (Data Type and Sample Data values are not available for this file.)
DOMAIN_CODE | This should be populated with the Domain Code corresponding to the Source Domain that is to be configured, as per Table B-163.
DOMAIN_TYPE_CODE | Defaulted to 'S', which indicates that this is a Source Domain Code.
DOMAIN_MEMBER_CODE | This should be populated with the CODE value supplied in any of the above files.
DOMAIN_MEMBER_NAME | This should be populated with the NAME value that corresponds to the Member Code supplied.
DOMAIN_MEMBER_DESCR | Not available.
DOMAIN_MEMBER_REF_CODE | Hardcode to '__NOT_APPLICABLE__'.
DOMAIN_MEMBER_DEFN_TYPE_CODE | Not available.
LANGUAGE_CODE | Warehouse Language Code.
SRC_LANGUAGE_CODE | Source Language Code.
INTEGRATION_ID | This is the unique ID for the record. The INTEGRATION_ID for this file can also be populated as DOMAIN_CODE~DOMAIN_MEMBER_CODE.
DATASOURCE_NUM_ID | The unique Data Source ID of the Siebel Instance you are configuring.


B.2.113.2 How To Configure Conformed Domain Members

There are two conformed domains used in Oracle Price Analytics as summarized in the table below:

Table B-165 Configure Flat Files For Source Domains in Price Analytics

Presentation Table | Presentation Column | File Name | File Column | Conformed Domain
Price Waterfall Element | Element Code | file_pwf_element_ds.csv | W_ELEMENT_CODE | Conformed Price Waterfall Element
Price Waterfall Element | Element Type | file_pwf_element_ds.csv | W_ELEMENT_TYPE_CODE | Conformed Price Waterfall Element Type


The source file file_pwf_element_ds.csv should already have the conformed domain mapped and should have values for the above columns for the corresponding Source Domain Code Member.

As these conformed domains are extensible by the user, the Name values for the conformed domain members specified in Table B-165 should be entered in Oracle BI Applications Configuration Manager.

B.2.113.3 Default Seeded Domain Members

This section lists the domain members that are seeded by default. The table below shows the default mapping from the source Price Waterfall Element domain to the Conformed Price Waterfall Element domain.

Table B-166 Domain Map: Price Waterfall Element -> Conformed Price Waterfall Element

Source Price Waterfall Element Member Code | Source Price Waterfall Element Name | Conformed Price Waterfall Element Member Code | Conformed Price Waterfall Element Name
Ceiling Revenue | Ceiling Revenue | Ceiling Revenue | Ceiling Revenue
Cost | Cost | Cost | Cost
Invoice Revenue | Invoice Revenue | Invoice Revenue | Invoice Revenue
Pocket Margin | Pocket Margin | Pocket Margin | Pocket Margin
Pocket Revenue | Pocket Revenue | Pocket Revenue | Pocket Revenue
Segment Revenue | Segment Revenue | Segment Revenue | Segment Revenue


The table below shows the default mapping from the source Price Waterfall Element Type domain to the Conformed Price Waterfall Element Type domain.

Table B-167 Domain Map: Price Waterfall Element Type -> Conformed Price Waterfall Element Type

Source Price Waterfall Element Type Member Code | Source Price Waterfall Element Type Name | Conformed Price Waterfall Element Type Member Code | Conformed Price Waterfall Element Type Name
Cost Adjustment | Cost Adjustment | Cost Adjustment | Cost Adjustment
Revenue Adjustment | Revenue Adjustment | Revenue Adjustment | Revenue Adjustment
Segment | Segment | Segment | Segment


These members can be included in the file file_pwf_element_ds.csv, or new source and conformed domain members can be entered as described in the previous sections.

B.2.114 How to Set Up Price Analytics Security for Siebel Applications

Overview

Primary Employee/Position Hierarchy based data security is applied to Price Analytics reports and metrics. Users who can access the Price Analytics subject areas can view Order and Quote data in the related reports either with or without data security filters, depending on the BI Duty Role assigned, as specified in the following section.

B.2.114.1 Configuring BI Duty Roles

The table below lists the BI Duty Roles (and the applicable data security) that can be assigned to users to give them access to the Price subject areas.

Table B-168 BI Duty roles and applicable data security

BI Duty Role | Data Security | Subject Areas
Pricing Administrator | None | Sales – CRM - Price; Sales – CRM Price Waterfall; Sales – CRM Price Waterfall – Orders; Sales – CRM Price Waterfall – Quotes
Pricing Manager | Primary Employee/Position Hierarchy based data security | Sales – CRM - Price; Sales – CRM Price Waterfall; Sales – CRM Price Waterfall – Orders; Sales – CRM Price Waterfall – Quotes


For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.115 How to configure UOMs

To configure Units of Measure (UOM), you use Externally Conformed Domains in Oracle BI Applications Configuration Manager. For more information about how to configure an externally conformed domain, see Section 4.4.8, "How to Configure Externally Conformed Domains".

B.2.116 How to Set Up Default Fiscal Calendars for PeopleSoft

Semantic layer (RPD) metadata contains session variables to store the current fiscal year, fiscal quarter, fiscal period, and so on for the user. To support multiple fiscal calendars, you need to get the default fiscal calendar for a user based on the Ledger and/or Business Unit assigned to the user, and then get the current fiscal year, quarter, and so on based on this default fiscal calendar. The following initialization blocks need to be created and/or modified in the RPD metadata to get the default fiscal calendar for a user. These initialization blocks read information from the PeopleSoft security tables, such as PS_SEC_BU_OPR, PS_SEC_BU_CLS, PS_SEC_LEDGER_OPR, and PS_SEC_LEDGER_CLS.

To set up default Fiscal Calendar for PeopleSoft:

Note:

You need to use the PeopleSoft OLTP Connection Pool for the initialization blocks below.

  1. Create a session initialization block to get one Business Unit for a user. This could be a GL BU, AP BU, AR BU, and so on. Follow these steps:

    1. Create a new initialization block called Operating Unit for Fiscal Calendar, and populate a variable called OU_ORG_FSCL_CALENDAR.

    2. Use the following SQL for the initialization block:

      SELECT MAX(BUSINESS_UNIT) FROM (
      SELECT BUSEC.BUSINESS_UNIT
      FROM PS_SEC_BU_OPR BUSEC, PS_INSTALLATION_FS INST
      WHERE INST.SECURITY_TYPE = 'O' AND BU_SECURITY = 'Y' AND BUSEC.OPRID = ':USER'
      UNION ALL
      SELECT BUSEC.BUSINESS_UNIT
      FROM PS_SEC_BU_CLS BUSEC, PS_INSTALLATION_FS INST, PSOPRDEFN OPR
      WHERE INST.SECURITY_TYPE = 'C' AND BU_SECURITY = 'Y' AND BUSEC.OPRCLASS = OPR.OPRCLASS AND OPR.OPRID = ':USER');
      
  2. Modify the existing, default initialization block (provided for E-Business Suite Customers) to get one GLBU+Ledger combination for a user, because calendar is defined at a GLBU+Ledger combination in PeopleSoft, not on Ledger alone.

    Note:

    If Security is completely turned OFF in PeopleSoft, you can disable all of these initialization blocks.

    Follow these steps:

    1. Modify existing session initialization block called Ledgers_MCAL, and populate a variable called LEDGERS_MCAL.

    2. Use the following SQL for the initialization block:

      Note:

      When you assemble the SQL for the initialization block, do not include the descriptive lines shown before each fragment below (for example, "If security is set at user level with filters on Business Unit and Ledger"); they only indicate which fragment applies to which security setup.

      SELECT 'LEDGER_MCAL',MAX(LEDGER_ID_FOR_MCAL) FROM
      (<insert the appropriate SQL from the SQL code below, based on the user level and business filters>)
      

      If security is set at user level with filters on Business Unit and Ledger.

      SELECT A.BUSINESS_UNIT||'~'||C.SETID||'~'||C.LEDGER LEDGER_ID_FOR_MCAL
      FROM PS_SEC_BU_OPR BUSEC, PS_SEC_LEDGER_OPR LEDSEC, PS_BU_LED_GRP_TBL A,PS_SET_CNTRL_REC B, PS_LED_GRP_LED_TBL C, PS_INSTALLATION_FS INST
      WHERE BUSEC.BUSINESS_UNIT = A.BUSINESS_UNIT AND LEDSEC.LEDGER_GROUP = A.LEDGER_GROUP AND LEDSEC.LEDGER = C.LEDGER AND A.BUSINESS_UNIT = B.SETCNTRLVALUE AND B.RECNAME = 'LED_GRP_LED_TBL' AND B.SETID = C.SETID AND
      A.LEDGER_GROUP = C.LEDGER_GROUP AND INST.SECURITY_TYPE = 'O' AND BU_SECURITY = 'Y' AND LEDGER_SECURITY = 'Y' AND BUSEC.OPRID = ':USER' AND LEDSEC.OPRID = ':USER' UNION ALL
      

      If security is set at user level with filters on Business Unit only.

      SELECT A.BUSINESS_UNIT||'~'||C.SETID||'~'||C.LEDGER LEDGER_ID_FOR_MCAL
      FROM PS_SEC_BU_OPR BUSEC, PS_BU_LED_GRP_TBL A, PS_SET_CNTRL_REC B, PS_LED_GRP_LED_TBL C, PS_INSTALLATION_FS INST
      WHERE BUSEC.BUSINESS_UNIT = A.BUSINESS_UNIT AND
      A.BUSINESS_UNIT = B.SETCNTRLVALUE AND B.RECNAME = 'LED_GRP_LED_TBL' AND B.SETID = C.SETID AND 
      A.LEDGER_GROUP = C.LEDGER_GROUP AND INST.SECURITY_TYPE = 'O' AND BU_SECURITY = 'Y' AND LEDGER_SECURITY = 'N' AND BUSEC.OPRID = ':USER' UNION ALL
      

      If security is set at user level with filters on Ledger only.

      SELECT A.BUSINESS_UNIT||'~'||C.SETID||'~'||C.LEDGER LEDGER_ID_FOR_MCAL FROM PS_SEC_LEDGER_OPR LEDSEC, PS_BU_LED_GRP_TBL A, PS_SET_CNTRL_REC B, 
      PS_LED_GRP_LED_TBL C, PS_INSTALLATION_FS INST
      WHERE LEDSEC.LEDGER_GROUP = A.LEDGER_GROUP AND LEDSEC.LEDGER = C.LEDGER AND A.BUSINESS_UNIT = B.SETCNTRLVALUE AND B.RECNAME = 'LED_GRP_LED_TBL' AND B.SETID = C.SETID AND
      A.LEDGER_GROUP = C.LEDGER_GROUP AND INST.SECURITY_TYPE = 'O' AND BU_SECURITY = 'N' AND LEDGER_SECURITY = 'Y' AND LEDSEC.OPRID = ':USER' UNION ALL
      

      If security is set at permission list level with filters on Business Unit and Ledger.

      SELECT A.BUSINESS_UNIT||'~'||C.SETID||'~'||C.LEDGER LEDGER_ID_FOR_MCAL
      FROM PS_SEC_BU_CLS BUSEC, PS_SEC_LEDGER_CLS LEDSEC, PS_BU_LED_GRP_TBL A, 
      PS_SET_CNTRL_REC B, PS_LED_GRP_LED_TBL C, PS_INSTALLATION_FS INST, PSOPRDEFN OPR WHERE BUSEC.BUSINESS_UNIT = A.BUSINESS_UNIT AND LEDSEC.LEDGER_GROUP = A.LEDGER_GROUP AND LEDSEC.LEDGER = C.LEDGER AND A.BUSINESS_UNIT = B.SETCNTRLVALUE AND B.RECNAME = 'LED_GRP_LED_TBL' AND B.SETID = C.SETID AND
      A.LEDGER_GROUP = C.LEDGER_GROUP AND INST.SECURITY_TYPE = 'C' AND BU_SECURITY = 'Y' AND LEDGER_SECURITY = 'Y' AND LEDSEC.OPRCLASS = OPR.OPRCLASS AND BUSEC.OPRCLASS = OPR.OPRCLASS AND OPR.OPRID = ':USER' UNION ALL
      

      If security is set at permission list level with filters on Business Unit only.

      SELECT A.BUSINESS_UNIT||'~'||C.SETID||'~'||C.LEDGER LEDGER_ID_FOR_MCAL
      FROM PS_SEC_BU_CLS BUSEC, PS_BU_LED_GRP_TBL A, PS_SET_CNTRL_REC B, PS_LED_GRP_LED_TBL C, PS_INSTALLATION_FS INST, PSOPRDEFN OPR
      WHERE BUSEC.BUSINESS_UNIT = A.BUSINESS_UNIT AND
      A.BUSINESS_UNIT = B.SETCNTRLVALUE AND B.RECNAME = 'LED_GRP_LED_TBL' AND B.SETID = C.SETID AND
      A.LEDGER_GROUP = C.LEDGER_GROUP 
      AND INST.SECURITY_TYPE = 'C' AND BU_SECURITY = 'Y' AND LEDGER_SECURITY = 'N' AND BUSEC.OPRCLASS = OPR.OPRCLASS AND OPR.OPRID = ':USER' UNION ALL
      

      If security is set at permission list level with filters on Ledger only.

      SELECT A.BUSINESS_UNIT||'~'||C.SETID||'~'||C.LEDGER LEDGER_ID_FOR_MCAL
      FROM PS_SEC_LEDGER_CLS LEDSEC, PS_BU_LED_GRP_TBL A, PS_SET_CNTRL_REC B, 
      PS_LED_GRP_LED_TBL C, PS_INSTALLATION_FS INST, PSOPRDEFN OPR
      WHERE LEDSEC.LEDGER_GROUP = A.LEDGER_GROUP AND LEDSEC.LEDGER = C.LEDGER AND A.BUSINESS_UNIT = B.SETCNTRLVALUE AND B.RECNAME = 'LED_GRP_LED_TBL' AND B.SETID = C.SETID AND
      A.LEDGER_GROUP = C.LEDGER_GROUP
      AND INST.SECURITY_TYPE = 'C' AND BU_SECURITY = 'N' AND LEDGER_SECURITY = 'Y' AND LEDSEC.OPRCLASS = OPR.OPRCLASS AND OPR.OPRID = ':USER');
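
For example, if security is set at user level with filters on Business Unit only, the assembled initialization block SQL combines the outer query from step 2 with just that fragment. Note that each fragment above except the last already ends with UNION ALL; drop the trailing UNION ALL from whichever fragment you include last, and close the outer parenthesis exactly once, as shown in this assembled sketch:

SELECT 'LEDGER_MCAL', MAX(LEDGER_ID_FOR_MCAL) FROM (
  SELECT A.BUSINESS_UNIT||'~'||C.SETID||'~'||C.LEDGER LEDGER_ID_FOR_MCAL
  FROM PS_SEC_BU_OPR BUSEC, PS_BU_LED_GRP_TBL A, PS_SET_CNTRL_REC B, PS_LED_GRP_LED_TBL C, PS_INSTALLATION_FS INST
  WHERE BUSEC.BUSINESS_UNIT = A.BUSINESS_UNIT AND
        A.BUSINESS_UNIT = B.SETCNTRLVALUE AND B.RECNAME = 'LED_GRP_LED_TBL' AND B.SETID = C.SETID AND
        A.LEDGER_GROUP = C.LEDGER_GROUP AND INST.SECURITY_TYPE = 'O' AND BU_SECURITY = 'Y' AND LEDGER_SECURITY = 'N' AND BUSEC.OPRID = ':USER'
)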
      

B.2.117 How to Configure Cost Fact In Projects Analytics for EBS

Actual Costs are extracted from the Cost Distribution Lines table in the Project Costing module in E-Business Suite and loaded into the Cost Line Fact (W_PROJ_COST_LINE_F) table. For E-Business Suite, Transaction Currency is the Document Currency for this fact.

Note:

The GL Date is assigned to the Cost Distribution Line only (during Cost distribution) and not to the Expenditure Item records. The Expenditure data can only be analyzed by the Enterprise Calendar dimension and not by the GL calendar. The Expenditure data cannot be analyzed by the GL Account because the GL account is associated only when the data is distributed.

Cost Fact Canonical Date

The Canonical Date dimension for the Cost fact is based on the PRVDR_GL_DATE from Distribution Line table, whereas the Canonical Date dimension for the Expenditure fact is based on the EXPENDITURE_DATE from the Expenditure Items table.

The multi calendar date dimension contains calendars for multiple organizations. It is essential that all records in a report analyzing data by the Fiscal Calendar (Dim - Fiscal Calendar) point to the same calendar. For this reason, all reports in the dashboard are filtered on the Project Business Unit. To make all Cost records in a Project Business Unit point to the same calendar, the RCVR_GL_DATE and RCVR_PA_DATE columns are used to populate the GL_ACCOUNTING_DT_WID and PROJ_ACCOUNTING_DT_WID columns in the fact table respectively. Expenditure OU view (in Cost Fact) can be built using Enterprise Calendar as well.

Domain Values for Cost Fact

The Project Cost Transfer Status has been modeled as a domain value and can be configured in FSM.

Incremental Logic for Cost Fact

The incremental extract logic for the Cost fact table depends on the REQUEST_ID field of the Cost Distribution Lines table. The W_PROJ_ETL_PS parameter table facilitates this logic. Using a separate ODI interface, the maximum Request Id in the source table at the time of the ETL run is stored in this table, and it is subsequently used to populate the SDE task (SDE_ORA_PROJECTCOSTLINE) level ODI variable #EBS_REQUEST_ID_1. The variable is initialized using the following query:

SELECT COALESCE((SELECT PRE_REQUEST_ID FROM QUALIFY_DS(W_PROJ_ETL_PS) WHERE TBL_NAME = 'PA_COST_DISTRIBUTION_LINES_ALL'),0) FROM_DUAL()
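
In effect, the incremental extract restricts the source to distribution lines whose REQUEST_ID is greater than the stored value. The following is an illustration of that filter only, not the actual SDE extract SQL, which selects many more columns and joins:

-- Sketch of the incremental filter applied to the source table.
SELECT cdl.*
FROM   PA_COST_DISTRIBUTION_LINES_ALL cdl
WHERE  cdl.REQUEST_ID > #EBS_REQUEST_ID_1   -- ODI variable holding the Request Id captured by the previous run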

Note:

If you are missing some Cost records in W_PROJ_COST_LINE_F after an incremental update, download patch 9896800 from My Oracle Support. The Tech Note included with the patch explains the scenarios where this can happen, and the proposed solution.

Configuring the Project Cost Aggregate Table

The Project Cost aggregate table (W_PROJ_COST_A) is used to capture information about the project cost distributions for the expenditure items. You need to configure the Project Cost Lines aggregate table before the initial ETL run and subsequent incremental ETL runs. Before the initial ETL run, you need to configure the COST_TIME_GRAIN parameter in FSM for the time aggregation level in the Project Cost Lines aggregate fact table. By default, the COST_TIME_GRAIN parameter has a value of PERIOD. The possible values for the COST_TIME_GRAIN parameter are:

PERIOD

QUARTER

YEAR

The Project Cost Lines aggregate table is fully loaded from the base table in the initial ETL run. The table can grow to millions of records. Therefore, the Project Cost aggregate table is not fully reloaded from the base table after each incremental ETL run. The Oracle Business Analytics Warehouse minimizes the incremental aggregation effort by modifying the aggregate table incrementally as the base table is updated.

The process is as follows:

  1. Oracle Business Analytics Warehouse finds the records to be updated in the base table since the last ETL run, and loads them into the W_PROJ_COST_LINE_TMP table. The measures in these records are multiplied by (-1). The mapping responsible for this task is SIL_ProjectCostLinesFact_Derive_PreLoadImage.

  2. Oracle Business Analytics Warehouse finds the inserted or updated records in the base table since the last ETL run, and loads them into the W_PROJ_COST_LINE_TMP table, without changing their sign. The mapping responsible for this task is SIL_ProjectCostLinesFact_Derive_PreLoadImage, which is run before PLP_ProjectCostLinesFact_Derive_PostLoadImage updates or inserts records in the base table.

  3. Oracle Business Analytics Warehouse aggregates the W_PROJ_COST_LINE_TMP table and loads the result into W_PROJ_COST_A_TMP, which has the same granularity as the W_PROJ_COST_A table.

  4. The PLP_ProjectCostLinesAggregate_Derive mapping looks up the W_PROJ_COST_A aggregate table to update existing buckets or insert new buckets in the aggregate table (the mapping is PLP_ProjectCostLinesAggregate_Load).
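
A compact way to picture steps 1 and 3 of the process above is sketched below. The real work is done by the ODI mappings named in the steps; the grain columns, the measure column (COST_AMT), the TMP column list, and the change-detection predicate are simplified assumptions for illustration only.

-- Step 1 (sketch): write a negative image of the changed base rows into the TMP table.
INSERT INTO W_PROJ_COST_LINE_TMP (PROJECT_WID, GL_ACCOUNTING_DT_WID, COST_AMT)
SELECT f.PROJECT_WID,
       f.GL_ACCOUNTING_DT_WID,
       -1 * f.COST_AMT
FROM   W_PROJ_COST_LINE_F f
WHERE  f.CHANGED_ON_DT > TO_DATE('2013-01-01', 'YYYY-MM-DD');  -- stand-in for "changed since the last ETL run"

-- Step 3 (sketch): aggregate the +/- images to the grain of W_PROJ_COST_A.
INSERT INTO W_PROJ_COST_A_TMP (PROJECT_WID, GL_ACCOUNTING_DT_WID, COST_AMT)
SELECT PROJECT_WID,
       GL_ACCOUNTING_DT_WID,
       SUM(COST_AMT)
FROM   W_PROJ_COST_LINE_TMP
GROUP BY PROJECT_WID, GL_ACCOUNTING_DT_WID;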

Configuring Revenue Fact for E-Business Suite

Actual Revenue Line records are extracted from the Revenue/Event Distribution Lines tables (PA_CUST_REV_DISTRIB_LINES_ALL and PA_CUST_EVENT_DIST_ALL) in the Project Costing module in E-Business Suite and are loaded into the Revenue Line Fact (W_PROJ_REVENUE_LINE_F) table.

For E-Business Suite, Revenue Transaction Currency Code is the Document Currency Code for this fact.

Note:

E-Business Suite concurrent programs (such as PRC: Generate Draft Revenue for a Single Project or PRC: Generate Draft Revenue for a Range of Projects) for distributing revenue should be run before the ETL is run to load the data warehouse.

For the Revenue Header Fact (W_PROJ_REVENUE_HDR_F), the primary source is the PA_DRAFT_REVENUES table. Revenue line metrics, such as Bill and Revenue amounts, are aggregated in this table as well.

Revenue Fact Canonical Date

The Revenue Fact Canonical Date dimension is based on the GL_DATE from the Draft Revenues table.

Revenue Facts Staging Table

The Revenue Facts Staging Table is a common staging table that loads both the header and the line level revenue fact tables.

Revenue Fact Multicurrency Support

Some metrics such as Unearned Revenue, Unbilled Receivables, Realized Gains, and Realized Losses are only available in Local Currency and Global Currencies. There are three columns in w_proj_revenue_line_f and w_proj_revenue_hdr_f respectively for revenue amounts in global currencies.

Revenue Fact Domain Values

The project revenue status has been modeled as a domain value and can be configured in FSM.

Incremental Logic for Revenue Fact

The incremental extract logic for the Revenue fact table depends on the REQUEST_ID field of the Revenue Distribution Lines table. The W_PROJ_ETL_PS parameter facilitates this logic, and through a separate ODI process, the maximum Request Id in the source table at the time of the ETL run is stored in this table, which is subsequently used to populate the following variables for the SDE_ORA_ProjectRevenueLine task in ODI:

#EBS_REQUEST_ID_2

#EBS_REQUEST_ID_3

#EBS_REQUEST_ID_4

They are initialized using the following queries:

SELECT COALESCE((SELECT COALESCE(PRE_REQUEST_ID,0) FROM QUALIFY_DS(W_PROJ_ETL_PS) WHERE TBL_NAME ='PA_CUST_EVENT_RDL_ALL'),0) FROM_DUAL()
SELECT COALESCE((SELECT COALESCE(PRE_REQUEST_ID,0) FROM QUALIFY_DS(W_PROJ_ETL_PS) WHERE TBL_NAME ='PA_CUST_REV_DIST_LINES_ALL'),0) FROM_DUAL()
SELECT COALESCE((SELECT COALESCE(PRE_REQUEST_ID,0) FROM QUALIFY_DS(W_PROJ_ETL_PS) WHERE TBL_NAME ='PA_DRAFT_REVENUES_ALL'),0) FROM_DUAL()

Configuring the Project Revenue Aggregate Table

The Project Revenue aggregate table (W_PROJ_REVENUE_A) is used to capture information about the project revenue distributions. You need to configure the Project Revenue Lines aggregate table before the initial ETL run and subsequent incremental ETL runs.

Before the initial ETL run, you need to configure the REVENUE_TIME_GRAIN parameter in FSM for the time aggregation level in the Project Revenue Lines aggregate fact table.

By default, the REVENUE_TIME_GRAIN parameter has a value of PERIOD. The possible values for the REVENUE_TIME_GRAIN parameter are:

PERIOD

QUARTER

YEAR

The Project Revenue Lines aggregate table is fully loaded from the base table in the initial ETL run. The table can grow to millions of records. Therefore, the Project Revenue aggregate table is not fully reloaded from the base table after each incremental ETL run. The Oracle Business Analytics Warehouse minimizes the incremental aggregation effort by modifying the aggregate table incrementally as the base table is updated.

The process is as follows:

  1. Oracle Business Analytics Warehouse finds the records to be updated in the base table since the last ETL run, and loads them into the W_PROJ_REVENUE_LINE_TMP table. The measures in these records are multiplied by (-1). The mapping responsible for this task is SIL_ProjectRevenueLinesFact_Derive_PreLoadImage.

  2. Oracle Business Analytics Warehouse finds the inserted or updated records in the base table since the last ETL run, and loads them into the W_PROJ_REVENUE_LINE_TMP table, without changing their sign. The mapping responsible for this task is SIL_ProjectRevenueLinesFact_Derive_PreLoadImage, which is run before PLP_ProjectRevenueLinesFact_Derive_PostLoadImage updates or inserts records in the base table.

  3. Oracle Business Analytics Warehouse aggregates the W_PROJ_REVENUE_LINE_TMP table and loads the result into W_PROJ_REVENUE_A_TMP, which has the same granularity as the W_PROJ_REVENUE_A table.

  4. The PLP_ProjectRevenueLinesAggregate_Derive mapping looks up the W_PROJ_REVENUE_A aggregate table to update existing buckets or insert new buckets in the aggregate table (the mapping is PLP_ProjectRevenueLinesAggregate_Load).

How To Configure Project UOM For E-Business Suite

  1. Use the following SQL to obtain the project UOMs:

    select lookup_code, meaning, description from fnd_lookup_values where lookup_type='UNIT' and LANGUAGE='US';
    
  2. If the codes are not already mapped, map the project UOMs to the warehouse (conformed) UOMs coded in FSM.

For more information, see About Working With Domains and Domain Mappings.

B.2.118 How to configure Project UOM for E-Business Suite

To get the project UOMs, use the SQL below in the OLTP source database, and then map them to warehouse (conformed) UOMs coded in FSM if the codes are not already mapped. For example:

select lookup_code, meaning, description from fnd_lookup_values where lookup_type='UNIT' and LANGUAGE='US';

B.2.119 How to Set Up Project Cost and Control Security for E-Business Suite

Overview

Oracle Project Analytics supports security over the following dimensions in the Project Costing and Project Control subject areas. In Oracle Business Intelligence Applications, the 'Business Unit' entity refers to 'Operating Unit Organizations' in E-Business Suite. The list of Business Units that a user has access to is determined by E-Business Suite grants.

Table B-169 Supported Project Costing and Project Control subject areas

(The Cost, Commitment, Budget, and Forecast columns are Project Costing and Control facts.)
Security Entity | Cost | Commitment | Budget | Forecast
Project Business Unit | Y | Y | Y | Y
Project Organization | Y | Y | Y | Y
Expenditure Business Unit | Y | N | N | N
Contract Business Unit | N | N | N | N
Project | Y | Y | Y | Y
Resource Organization | N | N | N | N
Ledger | N | N | N | N


Configuring Project Cost and Control Security For E-Business Suite

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

You must enable data security for Project Cost and Control in E-Business Suite by enabling the initialization blocks listed below based on your E-Business Suite adaptor. You must disable Project Security initialization blocks for other adapters. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Init Blocks: EBS R11x

  • Expenditure Business Unit List EBS11x

  • Project Business Unit List Budget EBS11x

  • Project Business Unit List Costing EBS11x

  • Project Business Unit List Forecast EBS11x

Init Blocks: EBS R12

  • Expenditure Business Unit List EBSR12

  • Project Business Unit List Budget EBSR12

  • Project Business Unit List Costing EBSR12

  • Project Business Unit List Forecast EBSR12

Init Blocks: EBS R11x and EBS R12

  • Project List Budget EBS

  • Project List Costing EBS

  • Project List Forecast EBS

  • Project Organization List Budget EBS

  • Project Organization List Costing EBS

  • Project Organization List Forecast EBS

B.2.119.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project Costing and Control subject area.

  • OBIA_EBS_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_EBS_PROJECT_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_EBS_PROJECT_DATA_SECURITY

These duty roles control which subject areas and dashboard content the user gets access to. These duty roles also ensure that the data security filters are applied to all the queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.120 How to Configure Project Retention Fact for PeopleSoft

This topic explains how to configure the Project Retention Fact for the PSFT adaptor, and includes the following sections:

Retention metrics are supported for the EBS and PSFT adaptors. Because the source of truth for the EBS adaptor is the billing fact, the Retention amounts are mapped to the Invoice Line fact by default. For the PSFT adaptor these mappings are not valid, and the amounts must be sourced from the Retention fact instead. Therefore, the metrics defined on the Invoice Line fact must be unmapped and the Retention fact must be enabled.

Oracle recommends that you make a back up of your metadata repository (RPD file) before applying changes.

B.2.120.1 How to enable Retention Fact

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode.

  2. In the Business Model and Mapping layer, select the 'Fact_W_PROJ_RETENTION_F_Retention_Amounts' Logical Table Source under 'Fact - Project Billing', then right-click it and choose Edit.

  3. Display the General tab and clear the Disabled check box as shown in the screenshot below.

    This screenshot is described in surrounding text.
  4. Save the BI metadata repository.

B.2.120.2 How to un-map extraneous metric definitions

By default, the Retention related amounts are mapped to Invoice Line fact. But for a PeopleSoft adaptor, this mapping is not valid, therefore you must un-map these metrics.

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode.

  2. Navigate to Fact – Project Billing, select the metric Retention Billed, right click and edit.

  3. Display the Column Source tab, select the definition mapped to Fact_W_PROJ_INVOICE_LINE_F_Invoice_Line, and click on Unmap as shown in the screenshot below.

    This screenshot is described in surrounding text.
  4. Repeat steps 2 and 3 for following metrics for Fact_W_PROJ_INVOICE_LINE_F_Invoice_Line:

    • Total Retained Amount

    • Retention Write-off

  5. Run the Consistency Check and ensure that there are no errors, then save the BI metadata repository, and clear Oracle BI Enterprise Edition Cache.

  6. Restart the Oracle BI Server and Oracle BI Presentation Services.

B.2.121 How to implement Security for Supply Chain Analytics

Overview

Supply Chain Analytics supports role-based and organization-based security in Inventory and Costing subject areas. Assign users to the appropriate roles to control which subject areas they can access. The list of Inventory Organizations that a user has access to is determined by the grants in the source application system.

Configuring Inventory Org Based Security for E-Business Suite

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system. To enable Inventory Org Based security for E-Business Suite, enable the Oracle EBS initialization block. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Open the variable by navigating to: Manage, then Variables.

  3. Open the initialization block that needs to be enabled under Session – Initialization Blocks (Inventory Organizations EBS).

  4. Clear the Disabled check box.

  5. Repeat the above steps for the following initialization blocks:

    • SCOM_AN:SECURITY:Inv Org CycleCount List

    • SCOM_AN:SECURITY:Inv Org InvTxns List

    • SCOM_AN:SECURITY:Inv Org Onhand List

    • SCOM_AN:SECURITY:Inv Org Shipments List

  6. Save the RPD.

B.2.121.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Inventory and Costing subject areas.

  • Inventory Analyst

    This role provides secured access to Inventory Analysts with detailed insight into transactions, balances, aging, bill of materials, cycle counts and returns, and covers the following Subject Areas:

    • Inventory – Cycle Count

    • Inventory – Transactions

    • Inventory – Customer and Supplier Returns

    • Inventory – Bill of Materials

    • Inventory – Balances

    • Inventory - Aging

  • Inventory Manager

    This role provides secured access to Inventory Managers with insight into inventory details and costing data, and covers the following Subject Areas:

    • Inventory – Cycle Count

    • Inventory – Transactions

    • Inventory – Customer and Supplier Returns

    • Inventory – Bill of Materials

    • Inventory – Balances

    • Inventory - Aging

    • Costing – Margin Analysis

    • Costing – Item Cost

    • Costing – Inventory Valuation

These duty roles control which subject areas and dashboard content the user gets access to. These duty roles also ensure that the data security filters are applied to all the queries.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.122 How to Configure Job Dimension in Projects Analytics for E-Business Suite

The Job dimension is maintained in the Human Resources Analytics module.

B.2.122.1 How to Extend the Project Task Hierarchy Dimension for E-Business Suite

Task dimension data is sourced from the task table (PA_TASKS) in E-Business Suite, as well as from other task-related OLTP tables such as:

  • PA_PROJ_ELEMENTS

  • PA_PROJ_ELEMENT_VERSIONS

  • PA_PROJ_ELEM_VER_STRUCTURE

  • PA_PROJ_ELEM_VER_SCHEDULE

Attributes such as WBS_NUMBER, PRIORITY_CODE, SCHEDULE_START_DATE, and SCHEDULE_END_DATE are sourced from these tables. Oracle BI Applications supports only the latest version of the Financial Structure by using the following filter conditions:

  • PA_PROJ_ELEM_VER_STRUCTURE.STATUS_CODE = 'STRUCTURE_PUBLISHED'

  • AND PA_PROJ_ELEM_VER_STRUCTURE.LATEST_EFF_PUBLISHED_FLAG = 'Y'

The W_TASK_DH hierarchy table stores the flattened hierarchy for every task in W_TASK_D. It is at the same grain as W_TASK_D and is modeled as a Type I dimension. All tasks in the hierarchy support these columns:

  • TASK_NAME

  • TASK_NUMBER

  • WBS_LEVEL

  • WBS_NUMBER

Because both tables, W_TASK_D and W_TASK_DH, are at the same grain, fact tables do not have a separate foreign key to join with this table; instead, the join is on the Task Foreign Key.

By default, Oracle BI Applications supports 20 levels in the flattened hierarchy. The levels are Base, 1, 2, and so forth up to 18, and Top. The Base level represents the hierarchy record itself, and the Top level is the top of the hierarchy under the Project. If your financial structure contains more than 20 levels, then you can extend the number of levels in the schema and ETL to support all levels.
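
Reporting against the flattened hierarchy then typically joins a Project fact to W_TASK_DH on the task key and groups by one of the flattened level columns. The following is a sketch only: the measure column (COST_AMT), the fact foreign key (TASK_WID), and the level column name (LVL1_TASK_NAME) are assumptions for illustration; check the actual W_TASK_DH and fact definitions in your warehouse.

-- Sketch: roll project cost up to the level-1 task of the financial structure.
-- TASK_WID, COST_AMT, and LVL1_TASK_NAME are assumed names for illustration.
SELECT dh.LVL1_TASK_NAME,
       SUM(f.COST_AMT) AS TOTAL_COST
FROM   W_PROJ_COST_LINE_F f
JOIN   W_TASK_DH dh
       ON dh.ROW_WID = f.TASK_WID     -- join on the task foreign key (W_TASK_DH shares the grain of W_TASK_D)
GROUP BY dh.LVL1_TASK_NAME;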

To Extend the Project Task Hierarchy Dimension for E-Business Suite:

  1. In ODI Designer Navigator, display the Models tab, and add the change capture columns (TASK_NUMBER, WBS_LEVEL and WBS_NUMBER) for every new level that you want in the W_TASK_DHS and W_TASK_DH tables.

    This screenshot is described in surrounding text.
  2. Extend the interfaces in the SDE and SILOs folder, as follows:

    1. Depending on the source, navigate to the correct SDE folder for EBS or PSFT.

    2. Edit and update the correct main interface (for example, SDE_ORA_TaskDimensionHierarchy.W_TASK_DHS or SDE_PSFT_TaskDimensionHierarchy.W_TASK_DHS) by providing the correct mappings for the new columns.

      This screenshot is described in surrounding text.
    3. Open the SILOS folder and edit and update the ODI interface SIL_Project_TaskDimensionHierarchy.

  3. Regenerate the SDE/SILOS scenarios by expanding the Packages folder, right-clicking the scenario, and choosing Regenerate.

    This screenshot is described in surrounding text.

    You must also use Oracle BI EE Administration Tool to update the following objects in the metadata repository:

    • W_TASK_DH table in the physical layer.

    • Dim - Task Hierarchy Logical Table and Task Hierarchy Dimension in the logical layer.

    • All the Task Hierarchy Presentation tables in the Presentation Area.

B.2.122.2 How to Configure Project Customer in Projects Analytics for E-Business Suite

By default, E-Business Suite only has the 'PRIMARY' relationship code in the PA_PROJECT_CUSTOMERS table. Therefore, the value is included in the ODI filter used in the source extract mapping for the Project dimension to get the customer for a project. You can define an additional value such as 'OVERRIDE CUSTOMER' as the relationship value. In this case, the filter must be edited to include any additional values, as follows.

To Configure Project Customer in Projects Analytics for E-Business Suite:

  1. In ODI Designer Navigator, connect to your ODI repository.

  2. Open the correct folder (for example, SDE_ORA_11510_Adaptor or SDE_ORA_R12_Adaptor), depending on the source.

  3. Expand the SDE_ORA_ProjectDimension folder and open the interface SDE_ORA_Project.W_PROJECT_DS.LKP_PROJ_CUST and click on the 'Quick-Edit' tab.

  4. Expand the Filters tab and edit the expression column for the second filter.

  5. Remove the existing SQL and add the following sample SQL where it is assumed the values are 'PRIMARY' and 'OVERRIDE CUSTOMER'. Modify it according to your configuration.

    UPPER(PA_PROJECT_CUSTOMERS.PROJECT_RELATIONSHIP_CODE (+)) IN ('PRIMARY', 'OVERRIDE CUSTOMER')

    If you want the lookup to be independent of any relationships, remove the filter on PROJECT_RELATIONSHIP_CODE entirely.

    Note: If the lookup returns more than one customer, then you must apply a max on the id so that it always returns one row.

  6. Review the mapping to ensure it is valid, then click OK and save the interface.

  7. Regenerate the scenario by expanding the Packages folder, right-clicking the scenario, and choosing Regenerate.

    This screenshot is described in surrounding text.

B.2.122.3 About Configuring Project Classification Dimension in Projects Analytics for E-Business Suite

Every project can be optionally classified into different categories. Within these categories, a project can be further categorized into different classification codes. Depending on how these classification categories are defined in the application, for some categories, a project can be classified with more than one classification code.

The Project Classification Table (W_PROJ_CLASSIFICATION_D) is at the grain of Project, Classification Category, and Classification Code. The Project facts do not have an explicit foreign key for joining with the Project Classification dimension; instead, the join is on the Project Foreign Key. Because specifying a Classification Category is optional for a project, the logical join in the metadata repository between the facts and the Project Classification dimension is defined as a right outer join, to avoid losing records when a project has not been classified.

Note: A particular classification code might exist for more than one classification category. Therefore, to avoid double counting, it is important that a classification category is fixed in a report that has classification code as one of the reporting attributes. If a Project belongs to more than one Classification Category under the same Classification, the Project metrics (Cost, Revenue, and so forth) will be double counted.
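
The following query illustrates the point made in the note above: fix one Classification Category whenever Classification Code is a reporting attribute. It is a sketch only; the column names (PROJECT_WID, CLASS_CATEGORY, CLASS_CODE, COST_AMT) and the category value are assumptions for illustration.

-- Sketch: fixing the category prevents the same project (and its metrics)
-- from being counted once per category.
SELECT c.CLASS_CODE,
       SUM(f.COST_AMT) AS TOTAL_COST
FROM   W_PROJ_COST_LINE_F f
JOIN   W_PROJ_CLASSIFICATION_D c
       ON c.PROJECT_WID = f.PROJECT_WID     -- join is on the Project foreign key
WHERE  c.CLASS_CATEGORY = 'Industry'        -- hypothetical category value
GROUP BY c.CLASS_CODE;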

B.2.122.4 About Configuring Project Funding Fact for E-Business Suite

Funding is based on the Funding Line, which represents allocations made to a project or task. The line-level funding information is held in the Funding Line fact (W_PROJ_FUNDING_LINE_F), which is based on the PA_PROJECT_FUNDINGS table in the Billing module of E-Business Suite. Data is also extracted from the Summary Funding table (PA_SUMMARY_PROJECT_FUNDINGS) to retrieve additional metrics such as Unbaselined Amount, Baselined Amount, Invoiced Amount, and Revenue Accrued, which are not available in the Funding Line fact; these are available in the Funding Header fact (W_PROJ_FUNDING_HDR_F). Before running any ODI ETL job, you need to run the following process in E-Business Suite to update this table: PRC: Refresh Project Summary Amounts.

Note: For E-Business Suite, Funding Currency is the Document Currency for this fact.

  • Project_Funding_Category: Used for categorizing funding allocation types.

  • Project_Funding_Level: This flat file is used to indicate whether a funding line is for a Task or a Project. It is not used in any default metric definition.

  • Funding Fact Canonical Date: The GL Date is not populated in the OLTP application, so in the data warehouse the GL Date for E-Business Suite is based on the Funding Allocation Date, using the GL Calendar of the Project OU. This enables cross-functional analysis on the GL Calendar. For example, cross analysis of funding and billing by Fiscal Year is not possible if there is no GL Date in the Funding fact. Customers who do not want to perform analysis based on the GL Calendar can instead base it on the Enterprise Calendar.

  • The GL date (Funding Allocation Date) is the canonical date for this table and is also used for global exchange rate calculation.

B.2.123 How to Configure Projects GL Reconciliation Solution for E-Business Suite 11.5.10

Projects GL Reconciliation solution is supported by default for E-Business Suite V12 and PeopleSoft V90 adaptors. To support this solution for E-Business Suite V11510 adaptor, you must perform the steps below. These steps include adding a join in the PLP GL reconciliation ODI interfaces.

Note: The additional join is required for E-Business Suite V11510 adaptor because sub-ledger accounting was introduced from E-Business Suite V12 onwards and the joins between Projects and GL source tables in E-Business Suite V11510 and V12 are different.

To Configure Projects GL Reconciliation Solution for E-Business Suite 11.5.10:

  1. In ODI Designer Navigator, connect to your ODI repository.

  2. Navigate to "BI Apps Project" -> "Mappings" -> "PLP" -> "PLP_Project_GLReconciliationFact".

    This screen shot is described in surrounding text.
  3. Before making any changes, right click "PLP_Project_GLReconciliationFact" folder and create a version of existing folder.

    This screen shot is described in surrounding text.
  4. Enter description: "Before adding GL Account join for 11510 source", then click OK.

    This screen shot is described in surrounding text.
  5. Modify the following three temp interfaces:

    • PLP_Project_GLReconciliationFact.SQ_AGG_CDL_AMOUNTS

    • PLP_Project_GLReconciliationFact.SQ_W_PROJ_GL_RECNCLIATION_F_A ("Cost Distributions with mismatch" dataset)

    • PLP_Project_GLReconciliationFact.SQ_W_PROJ_GL_RECNCLIATION_F_U

    1. Navigate to "BI Apps Project" -> "Mappings" -> "PLP" -> "PLP_Project_GLReconciliationFact".

    2. Open the temp interface -> Go to "Quick-Edit" Tab.

    3. Expand "Sources" and Click "Add Sources".

      This screen shot is described in surrounding text.
    4. The Source Wizard dialog box opens. Click the "Interfaces" tab and search for LKP_W_GL_ACCOUNT_D. Select the lookup interface under the PLP folder, give it the alias LKP_W_GL_ACCOUNT_D, check the "Use Temporary Interface as Derived Table (Sub-Select)" option, and click OK. Click No when ODI prompts "Do you want to perform Automatic Mapping?"

      This screen shot is described in surrounding text.
    5. In this step, we will add two joins:

      - A join between W_GL_LINKAGE_INFORMATION_G and LKP_W_GL_ACCOUNT_D.

      - A join between W_PROJ_COST_LINE_F and LKP_W_GL_ACCOUNT_D.

      e.1 Expand "Joins" section on Quick-Edit tab and click on "Add Joins".

      This screen shot is described in surrounding text.

      e.2 Select "General Ledger Linkage Information" as Left Source and LKP_W_GL_ACCOUNT_D as Right Source. Join on fields:

      W_GL_LINKAGE_INFORMATION_G.GL_ACCOUNT_ID = LKP_W_GL_ACCOUNT_D.INTEGRATION_ID
      AND W_GL_LINKAGE_INFORMATION_G.DATASOURCE_NUM_ID = LKP_W_GL_ACCOUNT_D.DATASOURCE_NUM_ID
      

      e.3 Select Join Type as "Left Outer Join" and Click OK.

      This screen shot is described in surrounding text.

      e.4 Click "Add Joins" again. Select "W_PROJ_COST_LINE_F" as Left Source and LKP_W_GL_ACCOUNT_D as Right Source. Join on fields:

      W_PROJ_COST_LINE_F.COST_GL_ACCOUNT_WID=LKP_W_GL_ACCOUNT_D.ROW_WID
      AND W_PROJ_COST_LINE_F.CR_GL_ACCOUNT_WID=LKP_W_GL_ACCOUNT_D.ROW_WID
      

      e.5 Select join Type as "Inner Join" and Click OK.

      This screen shot is described in surrounding text.

      e.6 In the "Joins" section, scroll to right and "Edit" the newly added join between W_PROJ_COST_LINE_F and LKP_W_GL_ACCOUNT_D.

      This screen shot is described in surrounding text.

      e.7 Edit the join condition to:

      W_PROJ_COST_LINE_F.COST_GL_ACCOUNT_WID = COALESCE(LKP_W_GL_ACCOUNT_D.ROW_WID,#ETL_UNSPEC_NUM)
      OR W_PROJ_COST_LINE_F.CR_GL_ACCOUNT_WID = COALESCE(LKP_W_GL_ACCOUNT_D.ROW_WID,#ETL_UNSPEC_NUM)
      

      e.8 Make sure that you change the join condition to use OR, in addition to adding the COALESCE function. Click OK.

      This screen shot is described in surrounding text.

      e.9 Check the "Ordered" check box for the Left Outer join between GL Linkage and GL Account lookup.

      This screen shot is described in surrounding text.
  6. Save the interface.

    Repeat steps 1 to 5 (including sub-steps e.1 to e.9) for each of the following three temp interfaces:

    • PLP_Project_GLReconciliationFact.SQ_AGG_CDL_AMOUNTS

    • PLP_Project_GLReconciliationFact.SQ_W_PROJ_GL_RECNCLIATION_F_A ("Cost Distributions with mismatch" dataset only. Do not add joins in "Journal Lines with mismatch" dataset)

    • PLP_Project_GLReconciliationFact.SQ_W_PROJ_GL_RECNCLIATION_F_U

  7. Regenerate the scenario.

    1. After all three temp interfaces are modified and saved, navigate to "PLP_Project_GLReconciliationFact" -> "Packages" -> "PLP_Project_GLReconciliationFact" -> "Scenarios". Right click scenario "PLP_PLP_PROJECT_GLRECONCILIATIONFACT" and click "Regenerate".

      This screen shot is described in surrounding text.
    2. On the "Regenerate Scenario" dialog box, click OK.

      This screen shot is described in surrounding text.
    3. On Scenario Variables dialog, select 'Use All' in the Startup Parameters drop down list, then click OK.

      This screen shot is described in surrounding text.

B.2.124 Additional Information About GL Reconciliation in Project Analytics

This section provides an overview of GL Reconciliation for Project Analytics, and contains the following sections:

B.2.124.1 Overview

The sub ledger to General Ledger account reconciliation is a common task in the accounting process. The reconciliation process involves comparing account balances between the General Ledger (GL) and a sub ledger, such as Projects. Balance differences between an account in the GL and the sub ledger are then explained, or "reconciled", by finding unmatched journal entries. Differences can arise because the Cost/Revenue Distribution processes in the Projects module and the GL journal creation/posting in the Finance module run asynchronously. For example, a cost distribution might have been transferred to the sub ledger while the corresponding journal was either not created or not posted to the GL.

Reconciliations provide assurance that numbers in the GL are correct and synchronized with their corresponding drill down distributions, which is important as these numbers are used to generate financial statements.
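
To illustrate the comparison at the heart of reconciliation, the following standalone SQL sketch compares balances per GL account between the GL and the Projects sub ledger. The account codes and amounts are made-up illustrative data; the sketch does not reflect the delivered warehouse tables or use cases.

  -- Illustrative only: sample GL and sub ledger balances built inline from DUAL.
  WITH gl_balances AS (
    SELECT 'GL-1000' AS gl_account, 5000 AS amount FROM dual UNION ALL
    SELECT 'GL-2000' AS gl_account, 3000 AS amount FROM dual
  ),
  subledger_balances AS (
    SELECT 'GL-1000' AS gl_account, 5000 AS amount FROM dual UNION ALL
    SELECT 'GL-2000' AS gl_account, 2500 AS amount FROM dual  -- a distribution not yet posted to GL
  )
  SELECT g.gl_account,
         g.amount - NVL(s.amount, 0) AS difference_to_reconcile
  FROM   gl_balances g
         LEFT OUTER JOIN subledger_balances s ON s.gl_account = g.gl_account
  WHERE  g.amount - NVL(s.amount, 0) <> 0;

The delivered solution goes further than such a balance difference: it identifies the unmatched cost, revenue, and journal lines that explain it, as described below.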

New Functionality

To assist project accountants in reconciling the Project sub ledger with the GL, Oracle Project Analytics introduces a reconciliation solution that identifies six situations, or use cases, that explain why the GL and the Project sub ledger are not balanced. Section 3, below, identifies and explains these use cases and what they mean for each adapter.

For Oracle Business Intelligence Applications release 11.1.1.7.1, the reconciliation solution is available for project cost and revenue transactions, for E-Business Suite 11.5.10, E-Business Suite R12x and PeopleSoft 9x source systems.

For this solution, Oracle Business Intelligence Applications introduces two new subject areas and over 30 new metrics. The Catalog includes two new dashboard pages in the Project Executive dashboard and 22 reports. The reports in the dashboard pages show the count of exceptions found for each of the use cases and their total amount. Users can slice these reports by time, organization and project when related to cost and revenue lines, and by time, ledger and natural segment when related to journal lines.

The reports are designed to help users find where they need to take action to reconcile the Projects sub ledger and the GL. For this, the reports identify the cost lines, revenue lines, and journal lines that explain the differences between the sub ledger and the GL.

Note that at implementation time, and depending on the customer's source system, customizations to the ETL code and metadata may be needed to enable support for some use cases. This document lists the FSM tasks that contain these instructions for the E-Business Suite and PeopleSoft source systems, and the use cases that apply to each one.

B.2.124.2 Configuring ETL Parameters

Project GL reconciliation ETL runs for a specific period window. Customers can specify the period for which they want to identify reconciliation issues by configuring the following two variables. These variables should not be null.

  • PROJ_GL_PERIODS_TO_LOAD

    • Specifies the number of periods to include for reconciliation.

    • Default (installed) value: 1.

    • Permissible values: non-negative integers (0, 1, 2, 3, and so on).

  • PROJ_GL_RECON_DT

    • The date from which the specified number of periods is counted backward when loading data for reconciliation.

    • Default (installed) value: "DEFAULT". When this variable has the value "DEFAULT", SYSDATE is used to identify the current period.

    • Permissible values: String "DEFAULT" or a date in YYYY-MM-DD format.

Examples

Table B-170 Examples

Sample Values Behavior

PROJ_GL_RECON_DT: DEFAULT

PROJ_GL_PERIODS_TO_LOAD: 1

When both these variables have default values, the ETL will run to reconcile data for the current period (based on SYSDATE) and 1 previous period.

PROJ_GL_RECON_DT: DEFAULT

PROJ_GL_PERIODS_TO_LOAD: 3

With these values, the ETL will run to reconcile data for the current period (based on SYSDATE) and 3 previous periods.

PROJ_GL_RECON_DT: 2012-12-31

PROJ_GL_PERIODS_TO_LOAD: 1

With these values, the ETL will run to reconcile data for the period in which 31-DEC-2012 falls and 1 previous period.

So if the calendar is monthly, it would reconcile the current period, DEC-2012, and the previous period, NOV-2012.

PROJ_GL_RECON_DT: 2012-06-30

PROJ_GL_PERIODS_TO_LOAD: 3

With these values, the ETL will run to reconcile data for the period in which 30-JUN-2012 falls and 3 previous periods.

So if the calendar is monthly, it would reconcile the current period, JUN-2012, and the previous periods MAR-2012, APR-2012, and MAY-2012.
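
As a minimal sketch of the arithmetic behind the last example, and assuming a simple monthly calendar (the actual ETL resolves periods from the configured enterprise calendar), the reconciliation window can be derived as follows, with the ADD_MONTHS offset equal to PROJ_GL_PERIODS_TO_LOAD:

  -- PROJ_GL_RECON_DT = 2012-06-30, PROJ_GL_PERIODS_TO_LOAD = 3:
  -- the window runs from the start of MAR-2012 to the end of JUN-2012.
  SELECT ADD_MONTHS(TRUNC(TO_DATE('2012-06-30', 'YYYY-MM-DD'), 'MM'), -3) AS window_start_date,
         LAST_DAY(TO_DATE('2012-06-30', 'YYYY-MM-DD'))                    AS window_end_date
  FROM   dual;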


B.2.125 How to Set Up CRM Partner Organization Based Security for Oracle Fusion

Oracle Fusion CRM partner organization based security is applied when a Fusion partner administrator accesses partner organizations, partner-owned leads, and opportunities/revenue. A partner administrator should be able to access only those entities owned by his or her partner organization.

Configuring Partner Organization Based Security

The session variable PARTNER_ORG_HIER_LIST stores the list of partner organizations that the logged-in user belongs to. It is initialized by the initialization block Partner Org Hierarchy List and is then used in the partner organization based data security role.

B.2.125.1 Configuring BI Duty Roles

OBIA_PARTNER_ORGANIZATION_DATA_SECURITY is the internal BI duty role used to define the data filter for partner organization based data security. By default, it has one member BI duty role:

  • OBIA_PARTNER_ADMINISTRATIVE_ANALYSIS_DUTY

These duty roles control which subject areas and dashboard content the user gets access to. As members of OBIA_RESOURCE_ORGANIZATION_HIERARCHY_DATA_SECURITY, they also ensure that the primary resource organization hierarchy based data security filters are applied to all queries involving marketing campaigns.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.126 How to Configure Projects GL Reconciliation Manual Journal Entries

The Projects GL Reconciliation solution in the Oracle BI Applications PS1 release supports an additional use case, identifying manually created project journals in the GL, in addition to the default supported use cases.

For EBS R11.5.10 and R12.x, customers most often configure a descriptive flexfield in the source system and populate it with a project number or identifier when creating manual journals, to associate a project with the journal. In such cases, the ETL code can be customized to extract and report these manually created journals. If a customer has created a descriptive flexfield in the EBS GL Journal Headers or Lines table and uses it to populate the project number for manually created journals, the Projects GL Reconciliation ODI interfaces can identify those transactions after some customization of the expression that populates the JOURNAL_SOURCE field during extract.

By default, JOURNAL_SOURCE is populated directly from GL_JE_HEADERS. It is not linked with the Project sub ledger, and in that case such transactions are filtered out.

This document describes the sample steps to customize the ETL code in ODI interfaces to identify manually created journal entries assuming the following:

The descriptive flexfield ATTRIBUTE1 in table GL_JE_HEADERS is used to store the Project Number (or some other Project key).

To Configure Projects GL Reconciliation Manual Journal Entries:

  1. In ODI Designer Navigator, connect to your ODI repository.

  2. Navigate to "BI Apps Project" -> "Mappings" -> <EBS adaptor folder> -> SDE_ORA_GLJournalsFact -> SDE_ORA_GLJournalsFact.W_GL_OTHER_FS_SQ_GL_JE_LINES.

  3. Open the temp interface and check the expression for JE_SOURCE.

    The expression mapped by default is GL_JE_HEADERS.JE_SOURCE.

  4. Modify JE_SOURCE field to the following value:

    GL_JE_HEADERS.JE_SOURCE || (CASE WHEN GL_JE_HEADERS.CONTEXT = 'Project Context' AND GL_JE_HEADERS.ATTRIBUTE1 IS NOT NULL THEN '~PA' ELSE NULL END)
    

    In general, you need to customize this expression to:

    GL_JE_HEADERS.JE_SOURCE || (CASE WHEN <Manual Entries for Projects> THEN '~PA' ELSE NULL END)
    
  5. Save the interface.

  6. Re-generate the scenario for SDE_ORA_GLJOURNALSFACT.

    The next time the ETL is run, the JOURNAL_SOURCE field in the W_GL_OTHER_F table is populated with the value "Manual~PA" for manually created journals for Projects, and the GL Reconciliation ODI interfaces identify them as manual journals related to Projects, as illustrated in the sketch below.
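
    As a hypothetical verification step (not part of the delivered solution), an ad hoc query such as the following could be run after the load to confirm that manual project journals are being tagged. Only the JOURNAL_SOURCE column and the W_GL_OTHER_F table name are taken from this section; the '%~PA' pattern matches the suffix added by the customized expression.

    -- Journals tagged by the customized expression carry the '~PA' suffix,
    -- for example 'Manual~PA'.
    SELECT *
    FROM   W_GL_OTHER_F
    WHERE  JOURNAL_SOURCE LIKE '%~PA';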

B.2.127 How to Define Aging Bucket Ranges in Accounts Payable and Accounts Receivable

Oracle Financial Analytics does not support overlapping aging bucket ranges. When you set up aging buckets in Fusion Applications or E-Business Suite, you must use only non-overlapping aging buckets for AP and AR aging analysis in Financial Analytics. Also, when buckets are defined to separate due and overdue amounts into different buckets, make sure that the days_from and days_to values for each bucket are defined so that the bucket holds either due or overdue transactions, but not both. To do this, make sure that the days_from value for overdue buckets starts at 1 or a higher positive number, not at 0 or a negative number. The following examples illustrate the supported aging bucket ranges.

Supported:

a) -60 days to -31 days
b) -30 days to -11 days
c) -10 days to 0 days
d) 1 days to 10 days
e) 11 days to 30 days
f) 31 days to 60 days and so on.

In this example, a/b/c are 'due' buckets and d/e/f are 'overdue' buckets. The bucket ranges are defined correctly with no overlaps and all overdue buckets start from 1 or above.

Not Supported:

a) -60 days to -31 days
b) -30 days to -11 days
c) -10 days to -1 days
d) 0 days to 10 days
e) 9 days to 30 days
f) 31 days to 60 days and so on.

In this example, the bucket ranges are not defined correctly. Note that bucket d starts from 0. So this bucket could hold some invoices that are due and some that are overdue. Thus a report that shows only overdue buckets could include some invoices that are not overdue. Furthermore, buckets d and e are overlapping. Therefore, some invoices could be reported in both buckets, thus making the total outstanding amount appear larger than it is.

Not Supported:

a) -60 days to -31 days
b) -30 days to -11 days
c) -10 days to -6 days
d) -5 days to 5 days
e) 6 days to 30 days
f) 31 days to 60 days and so on.

In this example, the bucket ranges are not defined correctly. Here, bucket d starts from a negative number and ends with a positive number. Similar to the previous example, this bucket could hold some invoices that are due and some that are overdue.
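
The following standalone SQL sketch shows how the supported bucket ranges in the first example behave. The invoice rows, the as-of date, and the bucket labels are illustrative assumptions, and days past due is computed as the as-of date minus the due date (negative for invoices that are not yet due). Because the ranges do not overlap and all overdue buckets start at 1, each invoice falls into exactly one bucket.

  WITH invoices AS (
    SELECT 101 AS invoice_id, DATE '2013-01-10' AS due_date FROM dual UNION ALL
    SELECT 102 AS invoice_id, DATE '2012-12-15' AS due_date FROM dual
  ),
  params AS (
    SELECT DATE '2012-12-31' AS as_of_date FROM dual
  )
  SELECT i.invoice_id,
         p.as_of_date - i.due_date AS days_past_due,
         CASE
           WHEN p.as_of_date - i.due_date BETWEEN -60 AND -31 THEN 'Due in 31 to 60 days'
           WHEN p.as_of_date - i.due_date BETWEEN -30 AND -11 THEN 'Due in 11 to 30 days'
           WHEN p.as_of_date - i.due_date BETWEEN -10 AND   0 THEN 'Due in 0 to 10 days'
           WHEN p.as_of_date - i.due_date BETWEEN   1 AND  10 THEN 'Overdue 1 to 10 days'
           WHEN p.as_of_date - i.due_date BETWEEN  11 AND  30 THEN 'Overdue 11 to 30 days'
           WHEN p.as_of_date - i.due_date BETWEEN  31 AND  60 THEN 'Overdue 31 to 60 days'
         END AS aging_bucket
  FROM   invoices i
         CROSS JOIN params p;
  -- Invoice 101 is due in 10 days ('Due in 0 to 10 days'); invoice 102 is
  -- 16 days overdue ('Overdue 11 to 30 days').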

B.2.128 Manage Domains and Member Mappings for Timecard Entry Status

Purpose

The Timecard Entry Status dimension is key to understanding the status of a timecard entry. The Timecard Entry Status is different for Reported and Processed (also known as Payable) time. Oracle provides domain member mappings for both Reported and Processed time.

Optional or Mandatory

This task is mandatory.

Applies to

E-Business Suite, and PeopleSoft

Task description in detail

Configuring the domains on the Timecard Entry Status dimension is key to the successful attribution of time reporting entries to warehouse reporting categories and subcategories.

Source Reported Timecard Entry Status Code -> Reported Timecard Status Code

This task is mandatory. This mapping identifies how Source Reported Timecard Entry Status codes map to the delivered target Reported Timecard Status domain members; target domain members are used in the delivered metrics, dashboards, and reports, for example, WORKING (Working), APPROVED (Approved). The target domain is extensible - customers can add to it but cannot delete from it.

Example for E-Business Suite

The Source Reported Timecard Entry Status Code is based on the values in the FND look-up (HXC_APPROVAL_STATUS).

Example Implementation

Table B-171 Example Target Member Codes

Source Member Code (Name) Target Member Code (Name)

APPROVED (Approved)

APPROVED (Approved)

ERROR (Error)

ERROR (Error)

REJECTED (Rejected)

REJECTED (Rejected)

SUBMITTED (Submitted)

SUBMITTED (Submitted)

WORKING (Working)

WORKING (Working)


Example for PeopleSoft

The Source Reported Timecard Entry Status Code is based on the values in PSXLATITEM (REPORTED_STATUS).

Example Implementation

Table B-172 Example Target Member Codes

Source Member Code (Name) Target Member Code (Name)

AP (Approved)

APPROVED

CN (Cancelled)

CANCELLED

DN (Denied)

REJECTED

IE (In Error)

ERROR

IP (In Process)

IN_PROCESS

NA (Needs Approval)

SUBMITTED

NW (New)

WORKING

PR (Processed)

PROCESSED

PB (Pushed Back)

PUSHED_BACK

SV (Saved)

WORKING

SB (Submitted)

SUBMITTED

VD (Voided)

VOIDED
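
The reported-status mapping in Table B-172 is maintained through domain member mappings in Oracle BI Applications Configuration Manager, not in SQL. As an illustration only, it is logically equivalent to a CASE expression over the REPORTED_STATUS translate values, as in the following standalone sketch (the sample_statuses inline view is made-up test data):

  WITH sample_statuses AS (
    SELECT 'AP' AS reported_status FROM dual UNION ALL
    SELECT 'NW' AS reported_status FROM dual UNION ALL
    SELECT 'VD' AS reported_status FROM dual
  )
  SELECT reported_status,
         CASE reported_status
           WHEN 'AP' THEN 'APPROVED'
           WHEN 'CN' THEN 'CANCELLED'
           WHEN 'DN' THEN 'REJECTED'
           WHEN 'IE' THEN 'ERROR'
           WHEN 'IP' THEN 'IN_PROCESS'
           WHEN 'NA' THEN 'SUBMITTED'
           WHEN 'NW' THEN 'WORKING'
           WHEN 'PR' THEN 'PROCESSED'
           WHEN 'PB' THEN 'PUSHED_BACK'
           WHEN 'SV' THEN 'WORKING'
           WHEN 'SB' THEN 'SUBMITTED'
           WHEN 'VD' THEN 'VOIDED'
         END AS reported_timecard_status_code
  FROM   sample_statuses;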


Source Processed Timecard Entry Status Code -> Processed Timecard Entry Status Code

This task is mandatory.

This mapping identifies which Source Processed Timecard Entry Status codes are mapped to Processed Timecard Entry Status codes; target domain members are used in the delivered metrics, dashboards, and reports, for example, ESTIMATE (Estimated), TRNSFRD_TO_PAY (Transferred to Payroll). The target domain is extensible - customers can add to it but cannot delete from it.

Example for E-Business Suite

On Oracle EBS (HXT_) the Source Processed Timecard Entry Status is the Batch Status.

On Oracle EBS (HXC_) the Source Processed Timecard Entry Status is a combination of the retrieval status and the retrieving application.

Example Implementation

Table B-173 Example Target Member Codes

Source Member Code (Name) Target Member Code (Name)

SUCCESS:PAY (Retrieved by Payroll)

TRNSFRD_TO_PAY (Transferred to Payroll)

SUCCESS:PA (Retrieved by Projects)

TRNSFRD_TO_PROJ (Transferred to Projects)

SUCCESS:PO (Retrieved by Purchasing)

TRNSFRD_TO_PURCH (Transferred to Purchasing)


Example for PeopleSoft

On PeopleSoft the Source Processed Timecard Entry Status is the PSXLATITEM (PAYABLE_STATUS).

Example Implementation

Table B-174 Example Target Member Codes

Source Member Code (Name) Target Member Code (Name)

OE (Online Estimate)

ONLINE_ESTIMATE (Online Estimate)

NA (Needs Approval)

NEEDS_APPROVAL (Needs Approval)

ES (Estimate)

ESTIMATED (Estimated)

AP (Approved)

APPROVED (Approved)

CL (Closed)

<No value>

SP (Sent to Payroll)

<No value>

RP (Rejected by Payroll)

<No value>

TP (Taken by Payroll)

TAKEN_BY_PAY (Taken by Payroll)

PD

<No value>

DL

<No value>

IG

<No value>

RV

<No value>

NP

TRNSFRD_TO_PROJ (Transferred to Projects)

DN

<No value>

PB

<No value>


B.2.129 Manage Domains and Member Mappings for Payroll Balance Dimension

Purpose

This section explains the mapping from source Payroll balances/earnings/deductions/taxes to the data warehouse Payroll summary measures through the use of domain values.

The HR Analytics Payroll model has two fact tables: the detailed fact table and the summary fact table. The detailed fact table has all the balances (in the case of EBS Payroll) and Earnings/Deductions/Taxes (in the case of PeopleSoft North American and Global Payroll) extracted from the source system as separate rows. This table is used for detail reports and ad-hoc analysis.

To support analytic reporting, the summary fact table delivers a set of default summary measures. Source payroll balances are mapped to the summary measures and loaded into summary fact table. More than one payroll balance can be mapped to a summary metric, in which case the individual source balances will be summed to form a summary measure.

This diagram is described in surrounding text.

For example, suppose Base Pay consists of Regular Salary, Total Earnings consists of Regular Salary and Bonus, and Total Tax consists of Income Tax and Social Insurance Tax.

The source to target balance mapping should then be done as shown below.

  • Regular Salary is mapped to Pay_Base and Total Earnings summary measures.

  • Bonus is mapped to Total Earnings (Total Earnings = Regular Salary + Bonus).

  • Income Tax and Social Insurance Tax are mapped to Total Tax (Total Tax = Income Tax + Social Insurance Tax).

To ensure the additive property of measures, only run balances are supported. For each payroll run, the actual run balances processed are stored. Because these are not broken down by context, run balances can be combined across time to form higher level balances, for example, PTD, MTD, and YTD.

If the source balance mapping to the summary measures is not done in Oracle BI Applications Configuration Manager, the Payroll Summary Fact table is not loaded with any data, which causes the reports based on the summary fact table to return no data.

Optional or Mandatory

Mandatory, as the default HR Analytics payroll reports that are based on the Summary Fact table will return no data if the Source balance mapping is not configured.

Task description in detail

The graphic below shows the domains used for mapping the source balances to the target summary measures.

This diagram is described in surrounding text.

Source payroll balances can be mapped to the target summary measures in two ways:

- One to One Mapping (using W_PAY_BALANCE domain).

- Many to One Mapping (using W_PAY_MAP_FACTOR_xxxx domain).

In one-to-one mapping, a source balance is directly mapped to a summary measure in Oracle BI Applications Configuration Manager.

For example, if you have a source balance called Base Pay, you can map it to the PAY_BASE summary measure code in Configuration Manager using the Domain Mappings.

If you have multiple source balances, Earns1 and Earns2, that constitute the summary measure PAY_BASE, you can map both source balances to that single summary measure.

The source balances will be aggregated to populate the summary measure.

PAY_BASE = Earns1 + Earns2
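
The following standalone SQL sketch illustrates this aggregation behavior only; the mapping itself is maintained through the W_PAY_BALANCE domain in Configuration Manager and applied by the delivered ETL, and the balance names and amounts here are illustrative assumptions.

  WITH detail_balances AS (
    SELECT 'Earns1' AS source_balance, 1000 AS balance_amount FROM dual UNION ALL
    SELECT 'Earns2' AS source_balance,  500 AS balance_amount FROM dual
  ),
  balance_mapping AS (
    SELECT 'Earns1' AS source_balance, 'PAY_BASE' AS summary_measure_code FROM dual UNION ALL
    SELECT 'Earns2' AS source_balance, 'PAY_BASE' AS summary_measure_code FROM dual
  )
  SELECT m.summary_measure_code,
         SUM(d.balance_amount) AS summary_amount  -- PAY_BASE = Earns1 + Earns2 = 1500
  FROM   detail_balances d
         JOIN balance_mapping m ON m.source_balance = d.source_balance
  GROUP BY m.summary_measure_code;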

Steps for One-to-One Balance Mapping

The following are the steps to be followed to map the Source Balances to the Warehouse Summary Measures.

  1. Identify the Source Balances to be extracted in the ETL.

    1. Refer to task "How to Add Balances to BI Balance Group" for restricting the balances extraction.

  2. Run the Domains ETL and extract the source domains into Oracle BI Applications Configuration Manager.

    1. Create a Domains ETL load plan in the Configuration Manager with the Fact Group as Payroll Fact Group.

    2. Execute the load plan and the source domains will be extracted into the Configuration Manager schema.

  3. Map the Source balances to the corresponding summary measures.

    1. Navigate to 'Manage Source Domains' under Domains Administration to check if the source domains are populated.

      This screenshot is described in surrounding text.
    2. Navigate to 'Manage Warehouse Domains' under Domains Administration to verify the target domains (summary measures) are present.

      This screenshot is described in surrounding text.
    3. Navigate to 'Manage Domain Mappings and Hierarchies' for mapping the Source balances to the Summary Measures.

      This screenshot is described in surrounding text.
    4. Click on the Edit button in the Domain Member Mappings section to map the source domains to the target domains.

    5. Save and Close.

  4. Run the main Load Plan to load Oracle Business Analytics Warehouse.

  5. The identified balances are loaded into Payroll detail fact table as separate rows.

  6. The Summary Measures are loaded in the summary fact table as per the mapping done in step 3.

Steps for Many to One Balance Mapping

In Many-to-One mapping, you can also derive a summary measure using various balances.

For example, NET_PAY can be derived using a calculation like:

Earns1 + Earns2 – Ded1 – Ded2, where Earns1, Earns2, Ded1, and Ded2 are source balances.

To achieve this, you need to map the above source balances to the warehouse domain W_PAY_MAP_FACTOR_PAY_NET, as illustrated in the sketch below.
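
As a standalone sketch of the calculation only (the actual mapping is done through the W_PAY_MAP_FACTOR_PAY_NET domain in Configuration Manager), the following SQL shows how multipliers of 1 for earnings and -1 for deductions, as described in the steps below, combine the source balances; the balance names and amounts are illustrative assumptions.

  WITH detail_balances AS (
    SELECT 'Earns1' AS source_balance, 1000 AS balance_amount FROM dual UNION ALL
    SELECT 'Earns2' AS source_balance,  500 AS balance_amount FROM dual UNION ALL
    SELECT 'Ded1'   AS source_balance,  200 AS balance_amount FROM dual UNION ALL
    SELECT 'Ded2'   AS source_balance,  100 AS balance_amount FROM dual
  ),
  factor_mapping AS (
    SELECT 'Earns1' AS source_balance,  1 AS multiplier FROM dual UNION ALL
    SELECT 'Earns2' AS source_balance,  1 AS multiplier FROM dual UNION ALL
    SELECT 'Ded1'   AS source_balance, -1 AS multiplier FROM dual UNION ALL
    SELECT 'Ded2'   AS source_balance, -1 AS multiplier FROM dual
  )
  SELECT SUM(d.balance_amount * f.multiplier) AS pay_net  -- 1000 + 500 - 200 - 100 = 1200
  FROM   detail_balances d
         JOIN factor_mapping f ON f.source_balance = d.source_balance;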

  1. Follow the above steps from 1 to 3.a (in Section 'Steps for One-to-One Balance Mapping' above)

  2. Navigate to Manage Warehouse Domains and search for W_PAY_MAP_FACTOR domain code.

    This screenshot is described in surrounding text.

    You can add the desired multiplier as the Domain Member Code. For example, use 1 if you want the balance to be added once, or –1 to subtract it once.

  3. Navigate to the Manage Domain Mappings and Hierarchies and search for the domain W_PAY_MAP_FACTOR.

    This screenshot is described in surrounding text.
  4. Select the Net Pay domain mapping and click on Edit button in the Domain Member Section.

    This screenshot is described in surrounding text.

    In this screen, you can map the source balances to the balance multiplier.

    For example: If NET_PAY is calculated as Earns1 + Earns2 – Ded1 – Ded2, then Source Earns1 is mapped to 1, Earns2 is mapped to 1, Ded1 is mapped to –1 and Ded2 is mapped to –1.

    For each employee, per pay period, the NET_PAY is calculated with the above formula and loaded into PAY_NET column of the Payroll Summary Fact table.

  5. Run the Main Load Plan to extract the balances from source to warehouse.

  6. The identified balances are loaded into Payroll detail fact table as separate rows.

  7. The Summary Measures are loaded in the summary fact table as per the mapping done in step 3.

List of delivered Payroll Summary Measures

FLEX_BALANCEx summary measures can be used to map any source balance that does not fit into the out-of-the-box summary measures. The flex balance list can be extended as part of any customization.

The following list shows the tab-separated data for Summary Measure Code, Category, and Description for the delivered Payroll Summary Measures.

SUMMARY MEASURE CODE      CATEGORY        DESCRIPTION
BEN_COST_EMPLOYEE Benefits        Benefit costs paid by an employee such as employee premium for medical, dental, vision, disability and life insurance. 
BEN_COST_EMPLOYER Benefits        Benefit costs paid by the employer, such as employer premiums for medical, dental, vision, disability, and life insurance, retirement funding, and educational assistance. Employer-paid benefit cost is a key metric in analyzing employee total compensation and workforce cost.
BEN_TAXABLE      Benefits        Taxable benefits are employer-provided "non-cash" compensation or fringe benefits, such as employer-provided vehicles, complimentary tickets, and educational assistance, that are subject to tax rules.
DEDUCTIONS_INVOL Other Deductions        Involuntary deductions are payroll deductions that the employer is mandated by law to withhold from an employee's paycheck, e.g. income tax withholding, social security taxes, and court ordered garnishments such as child support, bankruptcy orders, and tax levies.
DEDUCTIONS_POST_TAX       Other Deductions        Payroll deductions that are deducted after taxes are withheld.   Examples of post tax deductions are union dues, transportation fees, garnishments etc.  These deductions do not reduce taxable wages. 
DEDUCTIONS_PRE_TAX        Other Deductions        Payroll deductions that are deducted before taxes are withheld.   Examples of before tax deductions are health insurance premium, 401K deductions, etc.  These deductions reduce taxable wages.
DEDUCTIONS_VOL   Other Deductions        Voluntary deductions are payroll deductions that have been authorized by an employee e.g. retirement saving deduction, health and life insurance premiums, contribution to disability and health saving plans.  Some voluntary deductions are before-tax withholdings whereas others are withheld after taxes.  
FLEX_BALANCE1    Flex Balances   Extensible balance field 1
FLEX_BALANCE10   Flex Balances   Extensible balance field 10
FLEX_BALANCE11   Flex Balances   Extensible balance field 11
FLEX_BALANCE12   Flex Balances   Extensible balance field 12
FLEX_BALANCE13   Flex Balances   Extensible balance field 13
FLEX_BALANCE14   Flex Balances   Extensible balance field 14
FLEX_BALANCE15   Flex Balances   Extensible balance field 15
FLEX_BALANCE16   Flex Balances   Extensible balance field 16
FLEX_BALANCE17   Flex Balances   Extensible balance field 17
FLEX_BALANCE18   Flex Balances   Extensible balance field 18
FLEX_BALANCE19   Flex Balances   Extensible balance field 19
FLEX_BALANCE2    Flex Balances   Extensible balance field 2
FLEX_BALANCE20   Flex Balances   Extensible balance field 20
FLEX_BALANCE3    Flex Balances   Extensible balance field 3
FLEX_BALANCE4    Flex Balances   Extensible balance field 4
FLEX_BALANCE5    Flex Balances   Extensible balance field 5
FLEX_BALANCE6    Flex Balances   Extensible balance field 6
FLEX_BALANCE7    Flex Balances   Extensible balance field 7
FLEX_BALANCE8    Flex Balances   Extensible balance field 8
FLEX_BALANCE9    Flex Balances   Extensible balance field 9
HEALTHCARE_EMPLOYEE      Other Deductions        Employee contribution to healthcare insurance premiums including medical, dental and vision plans.
HEALTHCARE_EMPLOYER      Benefits        Employer contribution towards the cost of employee healthcare insurance including medical, dental and vision insurance premium, or other employer-assisted wellness plans.
HOLIDAY_HOURS    Hours   Holiday hours are hours compensated for paid company holidays such as New Year, Christmas, etc.
OVERTIME_HOURS   Hours   Overtime hours paid  
PAY_BASE Standard Earnings       Base salary is the fixed salary or wage paid to an employee based on an employment contract.  Base pay does not include variable pay components such as bonus, overtime or sales commission.
PAY_BONUS        Standard Earnings       Bonus pay is pay over and above the amount specified as a base salary or hourly rate of pay.
PAY_COMMISSION   Standard Earnings       The amount of money that an individual receives based on the level of sales he or she has obtained. Sales commission is the amount earned in addition to his/her base salary.
PAY_GROSS        Standard Earnings       Gross amount of remuneration for each pay type including regular pay, overtime pay, allowances, commissions, bonuses, and any other amounts, before any deductions are made.
PAY_HOLIDAY      Standard Earnings       Holiday pay is pay for paid company holidays such as New Year, Christmas, etc.
PAY_NET  Standard Earnings       The remaining amount of an employee's gross pay after deductions, such as taxes and retirement contributions, are made.
PAY_OTHER        Standard Earnings       Other types of pay that are not base pay, bonus, overtime, or commission pay.
PAY_OVERTIME     Standard Earnings       The amount of pay for hours worked beyond an employee's normal working hours that are entitled to an overtime premium.
PAY_VARIABLE     Standard Earnings       Variable pay, also known as performance pay, is used to recognize and reward employee performance above and beyond normal job requirements. Variable pay may include profit sharing, bonuses, holiday bonuses, or other forms of cash, as well as goods and services such as a company-paid trip.
PENSION_EMPLOYEE Pension The amount contributed by an employee towards his/her retirement funding such as an employee's contribution to a retirement saving plan
PENSION_EMPLOYER Pension The amount contributed by the employer towards an employee's retirement funding such as employer contribution to an employee's retirement saving plan
REGULAR_HOURS    Hours   Hours compensation for an employee's normal working hours based on an employment contract
SICK_HOURS                       Hours   An employee's sick time that is compensated
SICK_PAY                         Standard Earnings       Amount paid for an employee's sick time
SOC_INS_EMPLOYEE  Other Deductions        Social security insurance taxes paid by an employee
SOC_INS_EMPLOYER  Other Deductions        Social security insurance taxes paid by the employer
STOCK_VESTED_VAL  Benefits        The value of an employee's vested stock options
TAX_EMPLOYEE     Tax     Payroll taxes withheld from an employee's pay check such as income taxes, social security and Medicare taxes, etc.
TAX_EMPLOYER     Tax     Employer paid taxes are payroll taxes paid by the employer for social security, Medicare tax withholding, unemployment insurance tax, or any other form of employer payroll taxes. Employer-paid tax is a key metric in analyzing employee total compensation and workforce cost.
TOTAL_DEDUCTIONS Totals  Total before and after tax deductions including benefit deductions, taxes, and other voluntary or involuntary deductions.
TOTAL_EARNINGS   Totals  Total gross pay; this is the grand total of all gross pays on a pay check
VACATION_HOURS                   Hours   Total number of hours paid for an employee's vacation time or personal time off.
VACATION_PAY                     Standard Earnings       Amount compensated for an employee's vacation time or personal time off

B.2.130 How to Set Up Service Analytics Security for Siebel

Overview

There is no row-level security applied to Service Analytics reports and metrics. Users who can access the Service Analytics subject areas can view all data in the related reports without any data security filter.

Configuring BI Duty Roles

This table lists the BI Duty Roles that can be assigned to users in order to give them access to the Service subject areas.

Table B-175 BI Duty Roles and Associated Subject Areas

BI Duty Roles Subject Areas

Service Agent

Service - CRM Activities

Service - CRM Service Requests

Service - CRM Agreements

Service - CRM Assets

Service - CRM Customer Satisfaction

Service - CRM Orders

Service Manager

Service - CRM Activities

Service - CRM Service Requests

Service - CRM Agreements

Service - CRM Assets

Service - CRM Email Response

Service - CRM Customer Satisfaction

Service - CRM Orders

Service Executive

Service - CRM Activities

Service - CRM Service Requests

Service - CRM Agreements

Service - CRM Assets

Service - CRM Customer Satisfaction

Service - CRM Orders

Service Delivery and Costs Analyst

Service - CRM Agreements


For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.131 How to implement Inventory Org Based Security for EBS Manufacturing Analytics

Overview

Manufacturing Analytics supports security over Inventory Organizations in manufacturing subject areas. The list of Inventory Organizations to which a user has access is determined by the grants in E-Business Suite.

Configuring Inventory Org Based Security

In order for data security filters to be applied, the appropriate initialization blocks must be enabled, depending on the deployed source system. To enable Inventory Org based security for EBS, enable the E-Business Suite initialization block and make sure that the initialization blocks of all other source systems are disabled.

Oracle EBS: Inventory Org-based Security

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. From the Manage menu, select Variables to open the Variable Manager, and locate the 'INV_ORG' variable.

  3. Open the initialization block that you need to enable under Session – Initialization Blocks (Inventory Organizations EBS).

  4. Clear the Disabled check box.

  5. Save the metadata repository.

Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Manufacturing Analytics subject area.

  • OBIA_MANUFACTURING_EXECUTION_ANALYSIS_DUTY

  • OBIA_MANUFACTURING_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_MANUFACTURING_COST_ANALYSIS_DUTY

Table B-176 BI Duty Roles and Associated Subject Areas

BI Duty Roles Subject Areas

OBIA_MANUFACTURING_EXECUTION_ANALYSIS_DUTY

Manufacturing Execution Analyst for E-Business Suite. This role provides secured access to Manufacturing execution subject areas for Operations Managers and Production Supervisors.

Manufacturing – Material Usage

Manufacturing – Work Order Performance

Manufacturing – Work Order Snapshot

Manufacturing – Resource Usage

Manufacturing – Resource Utilization

Manufacturing – Work Order Cycle Time

Manufacturing – Kanban

Manufacturing – Work Order Aging

OBIA_MANUFACTURING_EXECUTIVE_ANALYSIS_DUTY

Manufacturing Executive for E-Business Suite. This role provides secured access for the VP of Manufacturing, Plant Manager, and Plant General Manager, with insight into planning, Manufacturing execution, and costing.

Manufacturing – Production Plan

Manufacturing – Actual Production

Manufacturing – Material Usage

Manufacturing – Work Order Performance

Manufacturing – Work Order Snapshot

Manufacturing – Resource Usage

Manufacturing – Resource Utilization

Manufacturing – Work Order Cycle Time

Manufacturing – Kanban

Manufacturing – Work Order Aging

Manufacturing – Production Cost

Manufacturing – Plan to Produce

Manufacturing – Discrete Quality

OBIA_MANUFACTURING_COST_ANALYSIS_DUTY

Manufacturing Cost Analyst for E-Business Suite. This role provides secured access for Production Controllers and Cost Accountants, with insight into production costing.

Manufacturing – Production Cost


These duty roles control which subject areas and dashboard content the user gets access to. These duty roles also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.132 How to Perform System Setups and Post Install Tasks for BI Applications

For information about setting up Oracle BI Applications after installation, refer to Oracle Fusion Middleware Installation Guide for Oracle Business Intelligence Applications.

B.2.133 How to Set Up Drill Down in Oracle BI from General Ledger to Subledger

To set up drill down in Oracle BI Answers from General Ledger to subledger:

  1. Create your subledger request from 'Financials - AP Transactions' or 'Financials - AR Transactions' catalog as applicable.

  2. In your request, add a filter on the column 'GL Journal ID' under the 'Document Details' subfolder of the 'AP Line Details' or 'AR Line Details' folder, and then set the operator of the filter to 'Is Prompted'.

  3. Build your GL Journal request from the 'Financials - GL Detail Transactions' catalog.

  4. To your request, add the column 'GL Journal ID' under the 'Document Details' folder.

  5. Navigate to the Column Properties of this column, and in the Interaction tab set the Value Primary Interaction property to 'Action Links'.

  6. Add a navigation target and set the target location to the sub ledger request you created earlier.

You may add multiple navigation targets if your GL report shows transactions from multiple subledgers and you want to drill from GL to the appropriate subledger report. For example, if your GL report shows transactions from AP, AR, and Revenue, and you have a subledger analysis for each of these, you can add three navigation targets (by adding an Action Link for each analysis) and set the target locations to each of these analyses. Subsequently, when you run the GL report and click on the 'GL Journal ID' column value, a popup appears, in which you need to click the appropriate target based on the journal you clicked. This does not happen automatically. For example, if you click on a journal transaction originating from AP, you need to pick the appropriate subledger report (that is, the AP report in this case) to drill into the AP report and see the details. You can add the Group Account Number attribute from the GL Account dimension to your GL report to easily identify the subledger that a GL transaction belongs to.

Note:

For COGS, the 'GL Journal ID' column is not exposed in any presentation catalogs. It is available in the business model layer of the RPD metadata under the logical table 'Dim - GL COGS Details'. As a workaround, you can create presentation catalogs to report on detail-level transactions for COGS and expose this column under the 'Document Details' folder in the presentation catalog. Use similar steps as described above to set up a drill-down from GL to COGS.

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

    The RPD file is located in the \bifoundation\OracleBIServerComponent\coreapplication_obisn\repository folder.

  2. Create an empty presentation catalog (for example, Financials – GL Cost of Goods Sold). Set its properties following the pattern of the other presentation catalogs.

  3. Drag 'Dim – GL COGS Details' and 'Fact - Fins - GL Cost of Goods Sold Posted' to the presentation catalog.

  4. Drag other dimensions.

  5. Rename the presentation table 'Dim – GL COGS Details' to 'Document Details'.

  6. Rename the presentation table 'Fact - Fins - GL Cost of Goods Sold Posted' to 'Facts - GL Cost of Goods Sold'. Rename other dimensions if necessary.

You might also follow this same process to create a Presentation Table for Revenue to be able to drill from GL to Revenue level detail transactions.

B.2.134 How to Integrate Financial Analytics with Project Analytics

You can enable Oracle Financial Analytics to use dimension tables in Oracle Project Analytics. You can only perform this integration if you have licensed Oracle Project Analytics.

You can configure the following Oracle Financial Analytics Subject Areas to join to certain Project Dimensions:

  • Financials - Payables (Project, Task, Financial Resource and Expenditure Organization Dimensions)

  • Financials - Receivables (Contract Dim)

The following Oracle Financial Analytics fact tables integrate with Project Analytics dimensions:

  • W_AP_XACT_F

  • W_AR_XACT_F

  • W_AR_AGING_INVOICE_A

To use dimensions from Oracle Project Analytics in Oracle Financial Analytics, you must select the Oracle Project Analytics offering during installation and setup.

B.2.135 How to Implement Asset Category and Asset Location Dimensions in Fusion Application

No Help topic is available for this FSM Task.

B.2.136 How to Implement GL Segment, GL Account, Asset Category and Asset Location Dimensions for Fusion Applications

Follow the steps in this section to implement GL Segment and GL Segment Hierarchy Dimensions.

Guidelines

  • If you need to report on only concatenated segments, then no configuration is required, and you can skip this section.

  • If you want only Group Account Num (and related attributes), then at a minimum you need to configure just the Natural Account dimension.

  • If you are exposing any GL Segments (including cost center, balancing segment, natural account), then you must go through the full configuration.

  • If you are exposing any Financial fact, then at a minimum you need to configure the Natural Account dimension, because you need group account number.

B.2.136.1 Configuring the BI Extender

Before using BI Extender, you must perform the task named 'Performing Preconfiguration Tasks for the BI Extender' in Oracle Fusion Middleware Metadata Repository Builder's Guide for Oracle Business Intelligence Enterprise Edition (Oracle Fusion Applications Edition).

B.2.136.2 Configuring GL Segment and GL Account Dimensions

This section explains how to configure GL Segment and GL Account Dimensions.

B.2.136.2.1 Mapping the segment labels to BI Objects in Fusion Applications

In order to enable the GL Accounting Flexfield in Oracle BI Applications, the following configuration must be performed in the Manage Key Flexfields UI in Fusion Applications. This configuration enables the Accounting Flex Segments for BI and provides the mapping to the BI Object names that should be used as dimensions for each of the Accounting Flexfield segments.

To map the segment labels to BI Objects in Fusion Applications:

  1. In Fusion Applications, navigate to Manage Key Flexfields.

  2. In each of the Accounting Flexfield segments, set the BI Enabled Flag to Y, as follows:

    1. Query for 'GL#' as Key Flexfield Code.

    2. Click Manage Structure Instances.

    3. Edit each of the segments and select the BI enabled check box, then save the details.

      This should be done for all segments in every Structure instance that you intend to be mapped in the BI Metadata Repository (RPD).

      This screenshot is described in surrounding text.
  3. Populate the BI Object Name for each of the Segment Labels, as follows:

    Note: This name is the Logical table name in the RPD which is used as the dimension for the corresponding segment.

    1. In the Manage Key Flexfields UI in Fusion Applications, query for 'GL#' as Key Flexfield Code.

    2. Choose Actions, then Manage Segment Labels.

      This screenshot is described in surrounding text.
    3. Populate the BI Object Name for all the segment labels that you need to map in the RPD, then save the details.

      This screenshot is described in surrounding text.

    The following table shows the BI Object Names for each Qualified Segment label.

    Table B-177 Mappings for Segment Label Codes to BI Object Names

    Segment Label Code BI Object Name

    FA_COST_CTR

    Dim - Cost Center

    GL_BALANCING

    Dim - Balancing Segment

    GL_ACCOUNT

    Dim - Natural Account Segment


    For the non-qualified segment labels, the BI Object Name should be populated with one of the 10 numbered segment dimensions: Dim - GL Segment1, Dim - GL Segment2, and so on, up to Dim - GL Segment10.

  4. Click the Deploy Flexfield option to deploy the Flexfields.

B.2.136.2.2 Configuring GL Segments and GL Account using the BI Extension process

This section describes how to configure the GL Accounting Segment Dimension in the BI Metadata Repository (RPD), and how to extend the ETL metadata to populate the corresponding tables in Oracle Business Analytics Warehouse.

Overview of the BI Extension Process

There are no default mappings to populate the segment dimension data warehouse tables (W_COST_CENTER_D, W_COST_CENTER_DH, W_NATURAL_ACCOUNT_D, W_NATURAL_ACCOUNT_DH, W_BALANCING_SEGMENT_D, W_BALANCING_SEGMENT_DH, W_GL_SEGMENT_D, W_GL_SEGMENT_DH). Mappings to populate these tables are generated by the BI extension process. This process is driven through the RPD metadata. The logical dimensions in the RPD metadata corresponding to these tables are 'Dim – Cost Center', 'Dim – Balancing Segment', 'Dim – Natural Account Segment' and all 'Dim – GL Segment<n>' dimensions. These dimension tables are populated from a Tree View Object (VO) or from a Value Set View Object (VO), depending on whether a tree was associated with the segment or not in Fusion Applications.

For each segment associated with trees, two VOs will be generated (Tree and TreeCode) with the following naming structure:

- FscmTopModelAM.AccountBIAM.FLEX_TREE_VS_<segment label> _VI

- FscmTopModelAM.AccountBIAM.FLEX_TREECODE_VS_<segment label>_VI

For each segment without trees, one VO will be generated with the following naming structure:

- FscmTopModelAM.AccountBIAM.FLEX_VS_<XXX>_VI

In addition to the segment dimension tables, the BI Extension process also extends the installed ETL mapping that populates the GL Account Dimension (W_GL_ACCOUNT_D). This dimension table has a pair of columns for each segment dimension: for example, COST_CENTER_NUM and COST_CENTER_ATTRIB for the Cost Center dimension, BALANCING_SEGMENT_NUM and BALANCING_SEGMENT_ATTRIB for the Balancing Segment dimension, and ACCOUNT_SEGn_CODE and ACCOUNT_SEGn_ATTRIB for the generic GL Segment<n> dimensions. These columns are populated from the Flex BI Flattened VO, FscmTopModelAM.AccountBIAM.FLEX_BI_Account_VI. This VO has a pair of columns for each segment: <segment label>_ and <segment label>_c. For example, for a Cost Center segment that has the segment label FA_COST_CTR, there are two columns in this VO named FA_COST_CTR_ and FA_COST_CTR_c.

BI Extension Process Flow

  • Step 1 - Import the appropriate View Objects (VOs) from the ADF data source.

  • Step 2 – Verify the automatic mapping of the VOs to the logical objects in the mapping dialog.

  • Step 3 - Provide connection information such as user name and password for repositories.

  • Step 4 - Click finish, and the appropriate metadata is generated and updated in the respective repositories.

Prerequisite

Before you start the BI Extension process, you must enable the Extender For BIAPPS setting using Oracle BI EE Administration Tool. To do this, choose Tools, then Options, then General, to display the Options dialog, and select the Extender for BIAPPS check box.

This screenshot is described in surrounding text.

To configure GL Segments and GL Account using the BI Extension process:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Navigate to the Oracle ADF database in the physical layer:

    oracle.apps.fscm.model.analytics.applicationModule.FscmTopModelAM_FscmTopModelAMLocal
    

    Then right click on the connection pool and select Import Metadata.

  3. In the 'Select Metadata Objects' dialog, do the following:

    1. Ensure that you have selected the 'Automatically include any missing joined objects' radio button.

    2. Click on the 'Synchronize with data source' icon, as shown in the example screenshot below.

      This screenshot is described in surrounding text.

      These settings import all VOs that need to be mapped to the logical tables in the RPD based on the mapping done between the segment labels and the BI Objects.

      This screenshot is described in surrounding text.

      Asset Category and Asset Location Dimension Configuration

      For Implementing Fixed Asset Category and Asset Location dimensions, the following Flex BI Flattened View Objects and Segment columns will be imported in the same import process.

      - FscmTopModelAM.CategoryBIAM.FLEX_BI_Category_VI

      - FscmTopModelAM.LocationBIAM.FLEX_BI_Location_VI

      The example below shows the import process for FLEX_BI_Category_VI.

      This screenshot is described in surrounding text.

      The example below shows the import process for FLEX_BI_Location_VI.

      This screenshot is described in surrounding text.
  4. Click on Next after you complete the import.

    Note: When some complex Chart of Account structures are defined in Fusion Applications, more than one VO might be generated for the same segment label. In this case you will see a warning message as shown in the screenshot below. Copy the information posted in the message, as this might be required in later steps. Click OK to proceed.

    This screenshot is described in surrounding text.
  5. In the 'Map to Logical Model' dialog, note that the VOs imported in Step 3 are automatically mapped to the appropriate logical tables. Note also that the logical columns are automatically mapped to the VO columns in the bottom panel.

    This screenshot is described in surrounding text.

    Validation:

    • For tree based segments, both the Tree and the Tree Code VO should be mapped to the same logical table. The 'Hierarchy' option should be checked for both.

    • For non-tree based segments, 'Hierarchy' option should not be checked.

    • FscmTopModelAM.AccountBIAM.FLEX_BI_Account_VI is mapped to 'Dim – GL Account'.

    • For the VOs that are mapped to logical tables, the necessary VO columns are also mapped to appropriate logical columns.

      Note: If you received the warning message in step 4, then none of the VOs mentioned in the message are mapped to a logical table. If you want to map these VOs in Oracle BI Applications, then you need to map to one of the generic GL segment dimensions (Dim – GL Segment<n>) manually at this stage. For each of the VOs that you manually map at this step, you also need to map the corresponding columns in FscmTopModelAM.AccountBIAM.FLEX_BI_Account_VI to the appropriate logical column in "Dim – GL Account".

      Asset Category and Asset Location Dimension Configuration

      In the Map to Logical Model dialog, note that the VOs imported in Step 3 are automatically mapped to the appropriate logical tables. Note also that the logical columns are automatically mapped to the VO columns in the bottom panel.

      This screenshot is described in surrounding text.

      Validation: You can validate at this stage that all the automatic mappings have happened as expected using the guidelines below:

      - FLEX_BI_Category_VI is mapped to logical table 'Dim – Asset Category' and FLEX_BI_Location_VI is mapped to logical table 'Dim – Asset Location'.

      - The segment columns in these VOs are mapped to the appropriate logical columns in these dimensions based on the Segment Label Code to BI Object Name mapping.

  6. When you have validated your mappings, click on Next and this will take you to the 'Publish to Warehouse' dialog. Provide the necessary details and click on Finish to complete the extension process.

  7. Validate and save your changes.

  8. Validation: If you have successfully completed the extension process, you will see new mappings in the ODI repository that populate the necessary tables. The mappings are named using the following naming convention: SDE_<Logical Table Name>_<Physical Target Name>.

    1. If a segment is mapped from a Tree and a Tree Code VO, then two mappings are generated: one for loading the segment dimension and the other for the hierarchy dimension. For example, SDE_Dim_Cost_Center_W_COST_CENTER_D and SDE_Dim_Cost_Center_W_COST_CENTER_DH.

    2. If a segment is mapped from a Value Set VO, then there will be one mapping generated for loading the segment dimension. For example, SDE_Dim_GL_Segment1_W_GL_SEGMENT_D.

    3. For the GL Account extension, you would see that the mapping SDE_FUSION_GLAccountDimension will be extended to populate the new columns that were mapped in the previous steps.

  9. When you have completed the BI Extension process, rebuild the appropriate Load Plan or create a new Load Plan.

B.2.136.2.3 Validating the Logical Table Source Filters for Generic GL Segment Divisions

The RPD metadata contains multiple logical tables that represent the generic GL segments, such as Dim – GL Segment1, Dim – GL Segment2, and so on. Because these logical tables are mapped to the same physical table, W_GL_SEGMENT_D, a filter must be specified in the logical table source of each of these logical tables to restrict the output of the logical table to only that particular segment. These filters must be applied on the physical column SEGMENT_LOV_ID, using the Value Set Codes that are applicable for that particular segment.

The extension process applies the content Logical Table Source filters for all the generic Dim – GL Segment<n> dimensions mapped in the previous steps. Validate that the filters have been applied correctly, then save your changes.

To validate the Logical Table Source Filters for Generic GL Segment Divisions:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. In the Business Model and Mapping layer, double-click 'Dim – GL Segment<n>' and open the Logical Table Source. Navigate to the Content tab, where you can see the filters that have been applied; they should look similar to the filters in the example below.

    This screenshot is described in surrounding text.

    Note: You can find the list of value set codes for a particular segment by opening the segment VO table object in the physical layer of the RPD. It will be stored in the 'description' field of the table object.
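
    For example, if Dim – GL Segment1 were mapped to value sets whose codes are 1001 and 1002 (illustrative values only; use the codes found in the 'description' field as noted above, and match the actual data type of SEGMENT_LOV_ID in your repository), the Logical Table Source filter would be similar to the following expression:

    W_GL_SEGMENT_D.SEGMENT_LOV_ID IN (1001, 1002)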

B.2.136.2.4 Re-configuring Segment Dimensions

If, while configuring the segment dimensions as described above, you mapped an incorrect VO to a segment dimension and generated metadata, you must revert the changes and re-map using the correct VO(s).

To re-configure Segment Dimensions:

  1. Delete the existing VO Logical Table Source from the corresponding logical table to bring it to the initial state.

  2. Delete the Logical Table Source filters if any applied on the DW Logical Table Source (only for the generic segment dimensions).

  3. Import the new VO (re-import if the VO already exists in the physical layer) and re-do the extension process as mentioned in the previous sections.

  4. If the process completes successfully, then the previously created mapping is replaced with a new mapping with the new VO.

B.2.137 How To Customize Extended Cross Functional Security for Employee Expenses

To enable procurement users (such as a Procurement VP or Spend Analyst) to perform deeper, cross-functional analysis beyond their regular duties, Oracle Procurement and Spend Analytics provides data and functional security for access to employee expenses transactions (such as expense reports, credit card transactions, and expense violations) through extended duty roles. If you would like to provision this duty to procurement and spend users, follow the steps below.

Understanding Extended Duty Roles: The BI seeded duty roles for Fusion Applications include a 'Procurement Executive Analysis Duty' role (Role name: OBIA_PROCUREMENT_EXECUTIVE_ANALYSIS_DUTY) that acts as a Spend Analyst/Executive duty. This extended role is not mapped to any enterprise job roles by default, but it is pre-configured within Oracle BI Applications to enforce object and data level security for Employee Expenses. Internally, data security is implemented using 'Extended Procurement and Spend Business Unit Data Security' (Role name: OBIA_EXTENDED_PROCUREMENT_AND_SPEND_BUSINESS_UNIT_DATA_SECURITY). This data security role enables cross-functional analysis secured by 'manage spend' Business Unit data security.

Follow the steps below to implement 'Procurement Executive Analysis Duty' role:

  1. Create 'VP of Procurement' or similar executive job role in your Fusion Applications deployment and assign BI duty 'Procurement Executive Analysis Duty' to 'VP of Procurement'.

  2. Assign appropriate Fusion Applications duty roles to the 'VP of Procurement' job role and assign BU privileges. Data security for the 'Procurement Executive Analysis Duty' (OBIA duty role) is controlled by the BUs assigned to the user through the 'manage spend' agent access action.

  3. Customize Presentation catalog permissions (for Employee Expense dashboard and related analyses) and Subject Area permissions as desired for 'Procurement Executive Analysis Duty' role.

For more information on how to create and manage job roles in Fusion Applications, refer to section 'Understanding How to Secure Oracle Fusion Applications' in Oracle Fusion Applications Administrator's Guide.

B.2.138 How To Customize Extended Cross Functional Security for Accounts Payables

To enable procurement users (such as Category Managers and Procurement Managers) to perform deeper, cross-functional analysis beyond their regular duties, Oracle Procurement and Spend Analytics includes configured data and functional security for access to accounts payable transactions (such as invoices, payments, and payment schedules) through extended duty roles. To implement these duties, follow the steps below.

Understanding Extended Duty Roles: Seeded security roles for Oracle BI Applications for Fusion Applications include the following additional duty roles. These extended roles are not mapped to any enterprise job roles by default, but they are pre-configured within Oracle BI Applications to enforce object and data level security for Accounts Payables.

  • 'Procurement Managerial Extended Analysis Duty' role (Role name: OBIA_PROCUREMENT_MANAGERIAL_ANALYSIS_DUTY) – This BI Duty role enables users to perform cross functional analysis outside of Category Management. Internally, data security on Oracle BI Applications is implemented using 'Extended Procurement and Payable Business Unit Data Security' (Role name: OBIA_EXTENDED_PROCUREMENT_AND_PAYABLE_BUSINESS_UNIT_DATA_SECURITY).

  • 'Category Manager Extended Analysis Duty' role (Role name: OBIA_CATEGORY_MANAGER_ANALYSIS_DUTY) – This BI Duty role enables users to perform cross functional analysis outside of Procurement Management. Internally, data security on Oracle BI Applications is implemented using 'Extended Procurement and Payable Business Unit Data Security' (Role name: OBIA_EXTENDED_PROCUREMENT_AND_PAYABLE_BUSINESS_UNIT_DATA_SECURITY).

  • 'Procurement Executive Analysis Duty' role (Role name: OBIA_PROCUREMENT_EXECUTIVE_ANALYSIS_DUTY) – This BI Duty role also acts as a Spend Analyst/Executive duty. Internally, data security on Oracle BI Applications is implemented using 'Extended Procurement and Spend Business Unit Data Security' (Role name: OBIA_EXTENDED_PROCUREMENT_AND_SPEND_BUSINESS_UNIT_DATA_SECURITY). This data security role enables cross functional analysis based on the user's 'manage spend' Business Unit data security assignments.

Follow the steps below to implement these extended duty roles:

  1. Assign BI duty 'Procurement Managerial Extended Analysis Duty' to Fusion Applications job role, 'Procurement Manager' or similar.

  2. Assign BI duty 'Category Manager Extended Analysis Duty' to Fusion Applications job role, 'Category Manager' or similar.

  3. Create 'VP of Procurement' or similar executive job role in your Fusion Applications deployment and assign BI duty 'Procurement Executive Analysis Duty' to 'VP of Procurement'.

  4. Assign appropriate Fusion Applications duty roles to the 'VP of Procurement' job role and assign BU privileges. Data security for the 'Procurement Executive Analysis Duty' (OBIA duty role) is controlled by the BUs assigned to the user through the 'manage spend' agent access action.

  5. Customize Presentation catalog permissions (for Supplier Performance – AP Transactions related content) and Subject Area permissions as desired for the roles mentioned above.

For more information on how to create and manage job roles in Fusion Applications, refer to section 'Understanding How to Secure Oracle Fusion Applications' in Oracle Fusion Applications Administrator's Guide.

B.2.139 How to Grant GL Data Role to HR VP Users

In Oracle Business Intelligence Applications, for a BI user with the VP of HR job role to see GL data, the user must be provisioned with a GL data role pertaining to a Financial Analyst job role. The provisioned GL data role controls the data security that is enforced on the GL data the user is trying to view. For more details on how GL data access is provisioned in Fusion Applications, refer to the Oracle General Ledger User's Guide.

B.2.140 How to Assign Group Account Numbers to Natural Accounts for HR Analytics

You must assign the following group account numbers in Fusion Applications for HR Analytics. (You can skip this task if you have already completed it for General Ledger.)

  • 'CONT EXP' for 'Contracting Expenses'

  • 'COGS' for 'Cost of Goods Sold'

  • 'DEPCN' for 'Depreciation Expenses'

  • 'EMP BENFT' for 'Employee Benefits Related Expenses'

  • 'EMP OVERTIME' for 'Employee Overtime Expenses'

  • 'EMP SUPP' for 'Employee Support Expenses'

  • 'GEN PAYROLL' for 'General Admin and Other Payroll'

  • 'MISC OPER EXP' for 'Miscellaneous Operating Expenses'

  • 'MKTG PAYROLL' for 'Payroll Expenses'

  • 'SLS PAYROLL' for 'Payroll Expenses'

  • 'R&D PAYROLL' for 'Payroll Expenses' (GEN PAYROLL is already listed)

  • 'VARIANCE EXP' for 'Product Variance Expenses'

  • 'REVENUE' for 'Revenue'

Note: 'Other Operating Expenses' is a derived column. It does not need a group account number assignment.

How to Assign Group Account Numbers to Natural Accounts:

  1. Login to Fusion Applications.

  2. Click the Applcore menu.

  3. Identify the value set used for your natural account.

  4. Open the window to maintain value set values.

  5. Assign financial categories to each natural account from the list of values.

The following group account numbers (financial categories) are seeded:

ACC DEPCN - Accumulated Depreciation
ACC LIAB - Accrued Liabilities
AP - Account Payables
AR - Account Receivables
CASH - Cash
CMMN STOCK - Common Stock
COGS - Cost Of Goods Sold
CONT EXP - Contracting Expenses
DEFERRED COGS - Deferred Cost of Goods Sold
DEFERRED REVENUE - Deferred Revenue
DEPCN - Depreciation Expenses
EMP BENFT - Employee Benefits Related Expenses
EMP OVERTIME - Employee Overtime
EMP SUPP - Employee Support and Cafeteria Expenses
FG INV - Finished Goods Inventory
FREIGHT - Freight Expenses
GEN PAYROLL - General Admin And Other Payroll 
GOODWILL - Goodwill 
INC TAX - Income Tax 
INT EXP - Interest Expenses 
LT DEBT - Long Term Debt 
MISC OPER EXP - Miscellaneous Operating Expenses 
MKTG PAYROLL - Marketing Payroll 
OTHER ASSET - Other Assets 
OTHER CA - Other Current Assets 
OTHER CL - Other Current Liabilities 
OTHER EQUITY - Other Equity Related 
OTHER INC - Other Income 
OTHER LIAB - Other Liabilities 
OTHER MKTG EXP - Other Marketing Expenses 
OTHER R&D EXP - Other R&D Expenses 
OTHER SLS EXP - Other Sales Expenses 
PPAID EXP - Prepaid Expenses 
PPE - PPE 
PREF STOCK - Preferred Stock 
PURCH - Purch 
R&D PAYROLL - R&D Payroll 
RET EARNING - Retained Earning 
REVENUE - Sales Revenue 
RM CONS - RM Cons 
RM INV - Raw Material Inventory 
SLS PAYROLL - Sales Payroll 
ST BORR - ST Borr 
TAX LIAB - Tax Liabilities 
TRAVEL & ENT EXP - Travel & Entertainment Expenses 
VARIANCE EXP - Product Variance Expenses 
WIP INV - WIP Inventory

B.2.141 How to Set Up Group Account Numbers for Fusion Applications

Assign Financial Categories (Group Account Num) to natural accounts as follows. You need access to Fusion Applications - Application Core Setup.

  1. In Fusion Applications, go to Application Core Setup.

  2. Click Manage Key Flexfields.

  3. Search for Key Flexfield Code 'GL#'.

  4. Click Manage Structure Instance.

  5. Find a structure instance for your chart of accounts.

  6. Select the structure instance and click Edit.

  7. Click Value Set Code for the Account segment to open Manage Value Sets.

  8. Click Manage Values.

  9. Search for a natural account to which you want to assign financial categories.

  10. Select a value and click Edit.

  11. Assign a financial category from the list of values.

  12. Save the changes.

B.2.142 How to Set up GL Segments Which Need to be Aggregated for GL Balances

Aggregated GL balances are populated in W_GL_BALANCE_A. By default, this table stores GL balances by only three qualified GL segments: the Natural Account segment, the Balancing segment, and the Cost Center segment. As installed, GL balances are not summarized by non-qualified segments. If you want to include non-qualified segments, you must modify the ODI interfaces as follows. (A conceptual SQL sketch of the resulting aggregation appears after the steps below.)

To Set Up GL Balance Segment Aggregates:

  1. In ODI Designer Navigator, connect to your ODI repository.

  2. Open the temporary interface PLP_GLBalanceAggrByAcctSegCodes.SQ_W_GL_BALANCE_F.

  3. Open the Mapping tab.

  4. Right-click GL_SEGMENT<N>_WID in General Account Balances (W_GL_BALANCE_F) and click "Add Column to Target Table".

  5. In the Property Inspector, change "Execute On" for GL_SEGMENT<N>_WID to Staging.

  6. Open the main interface PLP_GLBalanceAggrByAcctSegCodes.W_GL_BALANCE_A.

  7. Modify the expression of SEGMENT<N>_WID as follows: SQ_W_GL_BALANCE_F.GL_SEGMENT<N>_WID

  8. In the Property Inspector, change "Execute On" to Source.

B.2.143 How To Customize Security for Procurement Executive / Spend Analyst

To enable procurement users (such as a Procurement VP or Spend Analyst) to perform deeper, cross-functional analysis beyond their regular duties, Oracle Procurement and Spend Analytics includes data and functional security for accessing employee expense transactions (such as expense reports, credit card transactions, and expense violations) through extended duty roles. If you want to provision such a duty to procurement and spend users, follow the steps below.

Understanding Extended Duty Roles: The seeded BI duty roles for Fusion Applications include a 'Procurement Executive Analysis Duty' role (Role name: OBIA_PROCUREMENT_EXECUTIVE_ANALYSIS_DUTY) that also acts as a Spend Analyst/Executive duty. This extended role is not mapped to any enterprise job roles by default, but it is pre-configured within Oracle BI Applications to enforce object and data level security for Spend Analysis. Internally, data security on Oracle BI Applications is implemented using 'Extended Procurement and Spend Business Unit Data Security' (Role name: OBIA_EXTENDED_PROCUREMENT_AND_SPEND_BUSINESS_UNIT_DATA_SECURITY). This data security role enables cross functional analysis based on the user's 'manage spend' Business Unit data security assignments.

Follow the steps below to implement 'Procurement Executive Analysis Duty' role:

  1. Create 'VP of Procurement' or similar executive job role in your Fusion Applications deployment and assign BI duty 'Procurement Executive Analysis Duty' to 'VP of Procurement'.

  2. Assign appropriate Fusion Applications duty roles to the 'VP of Procurement' job role and assign BU privileges. Data security for the 'Procurement Executive Analysis Duty' (OBIA duty role) is controlled by the BUs assigned to the user through the 'manage spend' agent access action.

  3. Customize Presentation catalog permissions (for Spend Analyzer dashboard and related analyses) and Subject Area permissions as desired for 'Procurement Executive Analysis Duty' role.

For more information on how to create and manage job roles in Fusion Applications, refer to section 'Understanding How to Secure Oracle Fusion Applications' in Oracle Fusion Applications Administrator's Guide. For more information on how to define and customize security in Oracle BI Applications, refer to Oracle Fusion Middleware Security Guide for Oracle Business Intelligence Applications.

B.2.144 How to Remove Spend Classification Integration Metadata

If you are not implementing Oracle Spend Classification, Oracle recommends that you remove or hide the Oracle Spend Classification integration metadata that is included in the Presentation layer of the BI repository. Hiding or deleting this metadata avoids potential confusion among business end users.

To remove or hide Oracle Spend Classification Integration Metadata:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

    Deployed RPD files are located in ORACLE_HOME\bifoundation\OracleBIServerComponent\coreapplication_obis<n>\repository.

  2. In the Presentation layer pane, expand the folder 'Procurement and Spend - Invoice Lines'.

    The Oracle Spend Classification metadata in the Presentation layer consists of the following objects:

    Data Classification

    Auto UNSPSC

    Auto Purchasing Category

    Auto Custom Category 1

  3. To remove the metadata objects listed above, right click on the objects and select Delete.

    Note:

    If you decide later to implement Oracle Spend Classification, you need to do the following:

    1. In the Business Model and Mapping layer, drag and drop the following dimensions from a copy of the 'Procurement and Spend - Invoice Lines' folder into the Presentation layer of your metadata repository:

      • Dim - Auto UNSPSC

      • Dim - Auto Purchasing Category

      • Dim - Auto Custom Category1

  4. To hide the objects listed above from end users, right click and select Properties, then Permissions, and clear the Read permission check box for the appropriate user or group.

    Note:

    If you decide later to implement Oracle Spend Classification, you need to do the following:

    1. To display the following objects to end users, right click and select Properties, then Permissions, and select the Read permission check box for the appropriate user or group:

      • Data Classification

      • Auto UNSPSC

      • Auto Purchasing Category

      • Auto Custom Category 1

  5. Save and close the metadata repository.

B.2.145 How to Enable Project Dimensions

You can enable Oracle Supply Chain and Order Management to use dimension tables in Oracle Project Analytics. You can only perform this integration if you have licensed Oracle Project Analytics. You can configure the Oracle Supply Chain and Order Management Subject Areas listed below to join to certain Project Dimensions: Inventory Transactions (Project Dim, Task Dim, Financial Resource Dim).

The following Supply Chain fact table integrates with Project Analytics dimensions:

W_PRODUCT_XACT_F

Due to a limitation in Fusion Applications, the following Subject Areas of Oracle Supply Chain and Order Management Analytics are included in the configuration tag 'Enable Project Dimension', but are inactivated by default. Note: These settings are intentional, and they should not be re-activated.

  • SCOM_AN: Order Backlog

  • SCOM_AN: Order Booking

  • SCOM_AN: Order Credit

  • SCOM_AN: Order Customer Status History

  • SCOM_AN: Order Cycle

  • SCOM_AN: Order Fulfillment

  • SCOM_AN: Order Hold

  • SCOM_AN: Order Invoice

  • SCOM_AN: Order Invoice Credit

  • SCOM_AN: Order Scheduling

  • SCOM_AN: Order Shipping

B.2.146 How to Integrate Project Analytics with Financial Analytics

You can enable Oracle Financial Analytics to use dimension tables in Oracle Project Analytics. You can only perform this integration if you have licensed Oracle Project Analytics. You can configure the following Subject Areas in Oracle Financial Analytics to use Oracle Project Analytics tables:

  • Financials - Payables

  • Financials - Receivables

The following Oracle Financial Analytics fact tables integrate with Project Analytics dimensions:

  • W_AP_XACT_F

  • W_AP_BALANCE_F

  • W_AR_XACT_F

  • W_AR_AGING_INVOICE_A

B.2.147 How to Configure Order Item and Service Request Flat Files For ETL

Background

In Fusion Applications, several entities are sourced from non-Fusion Applications systems. Fusion Applications CRM leverages Oracle Business Intelligence Applications (OBIA) to integrate data from Fusion Applications and non-Fusion Applications source systems. The Oracle BI Applications metadata layer consolidates disparate physical data sources and makes them ready for analysis by Fusion Applications users. Sales Prospector (SPE) is a Fusion application that helps sales users manage their pipeline and whitespace effectively. SPE expects Order Item and Service Request data to be supplied from non-Fusion applications.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

ETL from Flat Files

Non-Fusion Applications data such as Order Item and Service Request can be directly loaded into Oracle Business Analytics Warehouse as long as the data can be presented in the specified flat file format. The ETL process loads the non-Fusion Applications data from the flat files and Fusion Applications data from Fusion Applications database tables into staging tables; then loads data from the staging tables into Oracle Business Analytics Warehouse.

SPE ETL Preparation

SPE needs non-Fusion Applications data for Order Item Fact, Service Request Fact and Service Request Dimension. The data should be presented in flat files according to the following specifications:

  • Data should be in CSV files (*.csv).

  • For full ETL, the files should contain all initial records that are supposed to be loaded into Oracle Business Analytics Warehouse; for incremental ETL, the files should contain only new or updated records.

  • The files are specially formatted for Fusion Sales Prediction Engine (SPE) data mining use only. All columns in the files should follow Fusion application data model terms and standards, and all ID columns in the files are expected to have corresponding Fusion Integration ID.

  • Data should start from line six of each file. The first five lines of each file are skipped during the ETL process.

  • Each row represents one record in the staging table.

  • All date values should be in the format of YYYYMMDDHH24MISS. For example, 20071231140300 should be used for December 31, 2007, 2:03 pm. (A SQL example of producing this format appears after this list.)

  • Columns DATASOURCE_NUM_ID and INTEGRATION_ID in all flat files cannot be NULL.

  • Column DATASOURCE_NUM_ID needs to be fixed to 200, which is also the Fusion Applications data source number.
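
As referenced in the date-format bullet above, if the flat files are generated from an Oracle database (an assumption for this example), the required date format can be produced with TO_CHAR:

-- Illustrative only: format a date value as YYYYMMDDHH24MISS.
SELECT TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS') AS formatted_dt
FROM DUAL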

The flat files for the Order Item Fact, Service Request Fact, and Service Request Dimension are file_orderitem_fs.csv, file_srvreq_fs.csv, and file_srvreq_ds.csv respectively.

Before starting the ETL run, prepare these flat files based on the formats provided in the sections below.

B.2.147.1 Flat file file_orderitem_fs.csv

The file is generic and therefore does not support any source order system specific features, such as recurring order lines. Each line in this file contributes to the total order amount. The granularity of this file is each order line.

The file is specially formatted for Fusion Sales Prediction Engine (SPE) data mining use only.

Table B-178 File Structure for file_orderitem_fs.csv

Column Name Data Type Sample Data Description

CUSTOMER_ID

VARCHAR(80)

999997551042159

Customer Party ID. There can be more than one customer ID on an order (bill to, ship to, invoice to, and so on); this is the primary ID for BI analysis use.

Foreign key to HZ_PARTIES.PARTY_ID.

CURCY_CD

VARCHAR(20)

USD

Currency Code, the currency that the order line amounts are based on.

CRM_CURR_EXCHANGE_RATE

NUMBER(28,10)

1.00

CRM Currency Exchange Rate, which is for the conversion of the order line amounts to the CRM common currency.

CRM_CORP_CURR_CODE

VARCHAR(20)

USD

CRM Common Currency Code.

ORDER_ID

VARCHAR(80)

4171787

Order header ID.

PROD_ID

VARCHAR(80)

999997500678718

Product Inventory Item ID.

Foreign key to EGP_SYSTEM_ITEMS_B.INVENTORY_ITEM_ID.

PROD_GROUP_ID

VARCHAR(80)

Null

Product Group ID. Optional for SPE ETL use. Leave null.

RESOURCE_ID

VARCHAR(80)

123445623

Resource ID, order owner Resource ID for order.

Foreign key to HZ_PARTIES.PARTY_ID

RESOURCE_ORG_ID

VARCHAR(80)

3453453453

Resource Organization ID, order owner's organization ID.

Foreign key to HR_ALL_ORGANIZATION_UNITS_F.ORGANIZATION_ID.

SOURCE_ID

VARCHAR(80)

100000016742344

Marketing campaign source code defined in MKT_SC_SOURCE_CODES.

ORDER_DT

DATE

20061220000000

Order Date in the format of YYYYMMDDHH24MISS. This is the date when the order is placed. This date is used in ETL as the canonical date for resolving dimensional FKs.

DATASOURCE_NUM_ID

NUMBER(10)

200

Data Source Number ID. This needs to be fixed to 200, which is the same value as the Fusion Applications data source in ETL.

INTEGRATION_ID

VARCHAR(80)

12149813

Integration ID, the order Line ID. Typically, each order may have one order header and multiple order lines.

DISCNT_AMT

NUMBER(28,10)

2.33

Discount Amount, deduction made to the unit price.

NET_PRI

NUMBER(28,10)

45.752

Net Price of order line item. This is the final price after deducting discount amount.

QTY_REQ

NUMBER(28,10)

12

Quantity Ordered for the line item.

PR_TERR_ID

VARCHAR(80)

1000000000023112

Primary Territory ID, ID of primary sales territory where order is placed.

Territory ID is defined in MOT_TERRITORIES.

CREATED_BY_ID

VARCHAR(80)

SALES_ADMIN

Created By ID, Login ID of the user who created the row.

CREATED_ON_DT

DATE

20071231140300

Created On Date in the format of YYYYMMDDHH24MISS.

CHANGED_BY_ID

VARCHAR(80)

SALES_ADMIN

Changed By ID, Login ID of the user who modified the row.

CHANGED_ON_DT

DATE

20071231140300

Changed On Date in the format of YYYYMMDDHH24MISS.

DELETE_FLG

VARCHAR(1)

Null, Y or N

Delete Flag, indicates if the record is deleted since last ETL. Default to N if null.

X_CUSTOM

VARCHAR(10)

Null

ETL reserved. Leave null.


B.2.147.2 Flat file file_srvreq_fs.csv

The columns listed below are required for SPE ETL use. The granularity of this file is each Service Request. The file is specially formatted for Fusion Sales Prediction Engine (SPE) data mining use only. An illustrative sample row is shown after the table.

Table B-179 File Structure for Flat file file_srvreq_fs.csv

Column Name Data Type Sample Data Description

DATASOURCE_NUM_ID

NUMBER(10)

200

Data Source Number ID. Data Source Number ID needs to be fixed to 200, the same value for Fusion Applications data source in ETL.

INTEGRATION_ID

VARCHAR(80)

12149813

Integration ID, the unique identifier for each Service Request.

CLOSE_DT

DATE

20030616174947

Closed Date, date in the format of YYYYMMDDHH24MISS when service request was closed.

OPEN_DT

DATE

20020516174947

Open Date, date in the format of YYYYMMDDHH24MISS when service request was open.

DELETE_FLG

VARCHAR(1)

Null, Y or N

Delete Flag, indicates if the record is deleted since last ETL. Default to N if null.

CREATED_BY_ID

VARCHAR(80)

SALES_ADMIN

Created By ID, Login ID of user who created the row.

CREATED_ON_DT

DATE

20071231140300

Created On Date in the format of YYYYMMDDHH24MISS.

CHANGED_BY_ID

VARCHAR(80)

SALES_ADMIN

Changed By ID, Login ID of the user who modified the row.

CHANGED_ON_DT

DATE

20071231140300

Changed On Date in the format of YYYYMMDDHH24MISS.

X_CUSTOM

VARCHAR(10)

Null

ETL reserved. Leave null.

CUSTOMER_ID

VARCHAR(80)

999997551042159

Customer Party Id.

Foreign key to HZ_PARTIES.PARTY_ID.

PROD_ID

VARCHAR(80)

999997500678718

Product Inventory Item ID.

Foreign key to EGP_SYSTEM_ITEMS_B.INVENTORY_ITEM_ID.
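
For illustration only, a single data row in file_srvreq_fs.csv might look like the following. This sketch assumes that the physical column order matches the order in which the columns are listed in Table B-179 and simply reuses the sample values from that table; the first five lines of the file (which the ETL skips) are not shown:

200,12149813,20030616174947,20020516174947,N,SALES_ADMIN,20071231140300,SALES_ADMIN,20071231140300,,999997551042159,999997500678718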


B.2.147.3 Flat file file_srvreq_ds.csv

The columns listed below are required for SPE ETL use. The granularity of this file is each Service Request. The file is specially formatted for Fusion Sales Prediction Engine (SPE) data mining use only.

Table B-180 File Structure for Flat file file_srvreq_ds.csv

Column Name Data Type Sample Data Description

DATASOURCE_NUM_ID

NUMBER(10)

200

Data Source Number Id. Data Source Number Id needs to be fixed to 200, the same value for Fusion Applications data source in ETL.

INTEGRATION_ID

VARCHAR(80)

1-10E-5

Integration ID, the unique identifier for each Service Request.

CLOSE_DT

DATE

20020516174947

Closed Date, date in the format of YYYYMMDDHH24MISS when service request was closed.

OPEN_DT

DATE

20020516174947

Open Date, date in the format of YYYYMMDDHH24MISS when service request was open.

SEV_CD

VARCHAR(80)

SR_SEVERITY~3-Medium

Severity Code of the Service Request.

Possible values are: SR_SEVERITY~1-Critical,

SR_SEVERITY~2-High,

SR_SEVERITY~3-Medium,

SR_SEVERITY~4-Low.

STATUS

VARCHAR(80)

SR_STATUS~Open

Service Request Status.

Possible values are: SR_STATUS~Approved, SR_STATUS~Cancelled, SR_STATUS~Closed, SR_STATUS~Completed, SR_STATUS~Open, SR_STATUS~Pending.

DELETE_FLG

VARCHAR(1)

Null, Y or N

Delete Flag, indicates if the record is deleted since last ETL. Default to N if null.

CREATED_BY_ID

VARCHAR(80)

SALES_ADMIN

Created By ID, Login ID of the user who created the row.

CREATED_ON_DT

DATE

20071231140300

Created On Date in the format of YYYYMMDDHH24MISS.

CHANGED_BY_ID

VARCHAR(80)

SALES_ADMIN

Changed By ID, Login ID of the user who modified the row.

CHANGED_ON_DT

DATE

20071231140300

Changed On Date in the format of YYYYMMDDHH24MISS.

X_CUSTOM

VARCHAR(10)

Null

ETL reserved. Leave null.


B.2.148 How to Incrementally Refresh the Inventory Monthly Balance Table

To incrementally refresh the Inventory Monthly Balance table:

  1. Delete the records from the Monthly Balance (W_INVENTORY_MONTHLY_BAL_F) aggregate table for a certain time.

    The GRAIN parameter determines the time period for the deletion. For example, if GRAIN=MONTH and the date is May 15, 2005, then all records for April and the current month (May) are deleted from the Monthly Balance (W_INVENTORY_MONTHLY_BAL_F) table. (A conceptual SQL sketch of this deletion window appears after these steps.)

    Running the PLP_InventoryMonthlyBalance workflow mapping implements this step.

  2. Retrieve the records in the Inventory Balance (W_INVENTORY_DAILY_BAL_F) fact table and load the records to the Monthly Balance (W_INVENTORY_MONTHLY_BAL_F) table at a certain grain level.

    For example, if GRAIN=MONTH, then the month end balance records in the W_INVENTORY_DAILY_BAL_F fact table are stored in and aggregated to the Monthly Balance (W_INVENTORY_MONTHLY_BAL_F).

    For the current month balance, balance records of the previous day (if it is in the same month) are deleted from W_INVENTORY_MONTHLY_BAL_F, and balance records of the current day are loaded from W_INVENTORY_DAILY_BAL_F to W_INVENTORY_MONTHLY_BAL_F.

    Running the PLP_InventoryMonthlyBalance workflow implements this step.

  3. Remove the old records from the W_INVENTORY_DAILY_BAL_F fact table.

    To remove old records you need to use the KEEP_PERIOD and the NUM_OF_PERIOD parameters. For example, if KEEP_PERIOD=MONTH, NUM_OF_PERIOD=1, and the date is May 15, 2005, then the records for April and the current month (May) are kept and the older records are deleted.

    Running the PLP_InventoryDailyBalance_Trim workflow implements this step.

    Note:

    The trimming process reduces the amount of data in the table. It is important to emphasize that after data trimming you will not be able to see the old daily balance records; however, you will still be able to see the month-end balances. Therefore, make sure that you adjust the NUM_OF_PERIOD value to reflect your data volume and data recency requirements.
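
The deletion window described in step 1 can be expressed as the following conceptual SQL sketch. This is for illustration only and is not the generated ODI code (the actual logic is implemented by the PLP_InventoryMonthlyBalance task); the BALANCE_DT column name is a hypothetical placeholder:

-- Conceptual sketch, not the generated ODI code.
-- With GRAIN=MONTH, rows dated in the previous month and the current
-- month are removed before they are re-aggregated.
DELETE FROM W_INVENTORY_MONTHLY_BAL_F
WHERE  BALANCE_DT >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -1)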

B.2.149 How to Configure the Product Transaction Aggregate Table for ETL Runs

Before you run the initial ETL and then the incremental ETL to load the Product Transaction aggregate table, you need to configure the Product Transaction Aggregate Table, as follows.

To configure the Product Transaction Aggregate Table

Use Oracle BI Applications Configuration Manager to ensure that the required values are set for the following Parameters:

REFRESH_PERIOD = 'MONTH'

GRAIN = 'MONTH'

NUM_OF_PERIOD = 3

Note: If any of these parameters do not exist, create them as Data Type = Text with the specified Values.

To configure the Product Transaction aggregate table for the initial ETL run

Retrieve the records in the Product Transaction fact (W_PRODUCT_XACT_F) table, and aggregate the records to the Product Transaction aggregate (W_PRODUCT_XACT_A) table at a certain grain level.

For example, if GRAIN=MONTH then the records in the W_PRODUCT_XACT_F fact table are retrieved and aggregated to the W_PRODUCT_XACT_A table at a monthly level.

Running the PLP_ProductTransactionAggregate mapping implements this step.

To configure the Product Transaction aggregate table for the incremental ETL run

Delete the refreshed records from the Product Transaction aggregate (W_PRODUCT_XACT_A) table for a certain time.

The REFRESH_PERIOD and the NUM_OF_PERIOD parameters determine the time period for the deletion.

For example, if REFRESH_PERIOD=MONTH, NUM_OF_PERIOD=1, and the date is May 15, 2013, then all records for April and the current month (May) are deleted in the W_PRODUCT_XACT_A table.

Running the PLP_ProductTransactionAggregate mapping implements this step.

Retrieve the records in the Product Transaction fact (W_PRODUCT_XACT_F) table, and aggregate the records to the W_PRODUCT_XACT_A table at a certain grain level.

For example, if GRAIN=MONTH then the records in the W_PRODUCT_XACT_F fact table are retrieved and aggregated to the W_PRODUCT_XACT_A table at a monthly level.

Running the PLP_ProductTransactionAggregate workflow implements this step.
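
The following is a conceptual sketch only of the GRAIN=MONTH aggregation performed by the PLP_ProductTransactionAggregate task; it is not the actual generated code, and the XACT_DT and XACT_QTY column names are hypothetical placeholders used for illustration:

-- Conceptual sketch, not the generated ODI code.
SELECT
  TRUNC(XACT_DT, 'MM')  AS PERIOD_START_DT,   -- monthly grain
  SUM(XACT_QTY)         AS XACT_QTY
FROM W_PRODUCT_XACT_F
GROUP BY TRUNC(XACT_DT, 'MM')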

B.2.150 How to Process Bill of Material Explosion for JD Edwards EnterpriseOne

This section explains how to process the Bill of Materials (BOM) for exploding to a multi-level structure to ultimately populate both the W_BOM_HEADER_D and W_BOM_ITEM_F tables.

JD Edwards EnterpriseOne maintains BOM information in a single level format, but Oracle BI Applications requires it in multi-level format. Therefore, before loading data into Oracle BI Application tables, the single level structure must be exploded into a multi-level structure.

Because all of the BOM information is stored in a single table in the JD Edwards EnterpriseOne source and there are no defined levels for the BOM, the system must loop through the data iteratively to explode the BOM. Also, Oracle BI Applications maintains all revisions to the components as new versions of the BOM along with their effective dates. Considering these facts, it is not feasible to use ETL alone to convert the single-level BOM to a multi-level BOM. Therefore, the logic from an existing JD Edwards EnterpriseOne UBE (R30460) was used to create a new UBE for the explosion.

This new UBE (R30461) extracts the manufactured end products and converts the single-level BOM format into a multi-level BOM format. In addition, it extracts some required information such as left bounds/right bounds and level parents (1 - 10).

The UBE loads the multi-level BOM structure for manufactured end products with each revision into two work files respectively for BOM header and item (component).

The ETL then extracts the data from the two work files and loads it into the Oracle BI Applications tables.

Note:

If you plan to consume analytics on Bill of Materials, it is mandatory to run this UBE before starting the ETL. This UBE and the related JD Edwards EnterpriseOne objects are created solely for the benefit of analytics and therefore will not be available in the existing source system.

B.2.151 Manage Domains and Member Mappings for Timecard Entry Type Dimension

The Timecard Entry Type dimension has a number of conformed domains that are used in many of the Time and Labor metrics. These domains must be configured correctly for the reports to attribute time reporting entries accurately to the warehouse reporting categories and subcategories.

Optional or Mandatory

This task is mandatory.

Applies to

E-Business Suite and PeopleSoft.

Source Timecard Entry Type Code Mapped To Timecard Entry Type Subcategory

This task is mandatory.

Used to identify how Source Timecard Entry Types map to the delivered target Timecard Entry Type Subcategory domain members; target domain members are used in the delivered metrics, dashboards, and reports, for example, REGULAR (Regular), OVERTIME (Overtime). The target domain is extensible: customers can add to it but cannot delete from it.

Example for Oracle E-Business Suite

The Source Timecard Entry Type is the Element Type Id (Element).

Table B-181 Example Implementation for Oracle E-Business Suite

Source Member Code (Name) Target Member Code (Name)

64668 (Overtime Element)

OVERTIME (Overtime)

57941 (Regular Hours Worked)

REGULAR (Regular)

_HOURS_ (Hours)

REGULAR (Regular)

63085 (In Class Training Hours Rebate)

TRAINING (Training)


Example for PeopleSoft

On PeopleSoft the Source Timecard Entry Type is the Time Reporting Code (TRC).

Table B-182 Example Implementation for PeopleSoft

Source Member Code Target Member Code

KAO20 (Overtime - Double Time)

OVERTIME (Overtime)

MCREG (Regular)

REGULAR (Regular)

MTRG (Training)

TRAINING (Training)


Timecard Entry Type Subcategory Mapped to Timecard Entry Type Category

This task is optional. There are seeded mappings delivered with the product.

Used to identify which Timecard Entry Type Subcategories are mapped to which Timecard Entry Type Categories; target domain members are used in the delivered metrics, dashboards, and reports, for example, WORKED (Worked), NON_WORKED (Non-Worked). The target domain is extensible: customers can add to it but cannot delete from it.

Table B-183 Example Seeded Implementation

Source Member Code (Name) Target Member Code (Name)

BL (Bereavement)

NON_WORKED (Non-Worked)

FAML (Family Leave)

NON_WORKED (Non-Worked)

HOL (Holiday)

NON_WORKED (Non-Worked)

OVERTIME (Overtime)

WORKED (Worked)

PREMIUM (Premium)

WORKED (Worked)

REGULAR (Regular)

WORKED (Worked)

SCK (Sickness)

NON_WORKED (Non-Worked)

TRAINING (Training)

OTHER (Other)

WORKING (Working)

WORKED (Worked)


Source Timecard Entry Type Code for Timecard Entry Productive (Y or N) Flag

Used to identify which Source Timecard Entry Types are considered productive; target domain members are used in the delivered metrics, dashboards, and reports, for example, Y (Yes), N (No). The target domain is extensible: customers can add to it but cannot delete from it.

Example for Oracle E-Business Suite

The Source Timecard Entry Type is the Element Type Id (Element).

Table B-184 Example Implementation for Oracle E-Business Suite

Source Member Code (Name) Target Member Code (Name)

64668 (Overtime Element)

Y (Yes)

57941 (Regular Hours Worked)

Y (Yes)

63085 (In Class Training Hours Rebate)

N (No)


Example for PeopleSoft

On PeopleSoft the Source Timecard Entry Type is the Time Reporting Code (TRC).

Table B-185 Example Implementation for PeopleSoft

Source Member Code Target Member Code

KAO20 (Overtime - Double Time)

Y (Yes)

MCREG (Regular)

Y (Yes)

MTRG (Training)

N (No)


Note:

The following Domain Member Mappings should be completed by customers prior to loading the Time and Labor warehouse schema.

Source Timecard Entry Absence Category Mapped to Timecard Entry Absence Category

This task is optional.

Used to identify which Source Timecard Entry Absence Categories are mapped to Timecard Entry Absence Categories; target domain members are not currently used in the delivered metrics, dashboards, and reports. The target domain is extensible: customers can add to it but cannot delete from it.

Source Timecard Entry Earning Category Code for Timecard Entry Type Earning Category

This task is optional.

Used to identify which Source Timecard Entry Earning Categories are mapped to Timecard Entry Type Earning Categories; target domain members are not currently used in the delivered metrics, dashboards, and reports. The target domain is extensible: customers can add to it but cannot delete from it.

B.2.152 How to Set Up the C_STATE_PROV Domain

To configure conformed STATE_PROV, you use Externally Conformed Domains in Oracle BI Applications Configuration Manager. For more information about how to configure externally conformed domains, see Section 4.4.8, "How to Configure Externally Conformed Domains".

B.2.153 How to Configure Flat Files for Spend Planning

Due to the nature of Spend Planning activities, some data elements are not captured in the source procurement system. Instead, these data elements are provided in flat files. In addition, some configuration is done using flat files.

B.2.153.1 How to configure flat files

This section contains the tasks that you must perform. Spend Categories and Expense Categories are two independent ways to classify purchases and spend. Spend Planning allows users to analyze and plan spend based on Spend Categories or Expense Categories. The two most common ways to map spend category and expense category are 1) GL Account or 2) Purchasing Category. You must decide which of the two can distinctly identify spend category and expense category in your transactional system.

B.2.153.1.1 Configuring Spend Category Definition

The following mapping files are provided for the corresponding sources:

  • file_fusion_op_spend_categories.csv: this file is used by Fusion source.

  • file_ora_op_spend_categories.csv: this file is used by E-Business Suite source.

  • file_psft_op_spend_categories.csv: this file is used by PeopleSoft source.

Select purchasing category as the source of the mapping. Map each of the source values to these 5 target spend category values:

  • Indirect

  • Direct

  • Services

  • Travel Expenses

  • Marketing and Sales

Note: In Oracle BI Applications 11g, the Spend Planning application takes only indirect spend into analysis. Oracle recommends that you map all your source values for Purchasing Category.

To map Purchasing category to spend category:

  1. Edit the CSV file.

    Note:

    The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

    Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

    Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

    Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

    Ensure the files are saved in UNIX mode only.

  2. For each GL account number range or purchasing category that you want to map, create a new row in the file containing the following fields:

    Table B-186 Configuration File Details

    MAP_FLAG
    Column value is 'GL' if the mapping is GL Account based and 'PC' if it is Purchasing Category based.
    Applicable for EBS/Fusion: Y. Applicable for PSFT: Y. Required for GL Account mapping: Y. Required for Purchasing Category mapping: Y.

    CHART OF ACCOUNTS ID
    The ID of the GL chart of account. This column will not have data if the mapping is done for PSFT.
    Applicable for EBS/Fusion: Y. Applicable for PSFT: N. Required for GL Account mapping: Y (for EBS/Fusion only). Required for Purchasing Category mapping: N.

    COMPANY_CODE
    The ID of the BUSINESS UNIT. Populated only for PSFT source.
    Applicable for EBS/Fusion: N. Applicable for PSFT: Y. Required for GL Account mapping: Y (for PSFT only). Required for Purchasing Category mapping: N.

    FROM ACCT
    The lower limit of the natural account range. This is based on the natural account segment of your GL accounts.
    Applicable for EBS/Fusion: Y. Applicable for PSFT: Y. Required for GL Account mapping: Y. Required for Purchasing Category mapping: N.

    TO ACCT
    The higher limit of the natural account range. This is based on the natural account segment of your GL accounts.
    Applicable for EBS/Fusion: Y. Applicable for PSFT: Y. Required for GL Account mapping: Y. Required for Purchasing Category mapping: N.

    CATEGORY_ID
    For EBS/Fusion this is based on the Category Set~Category ID concatenation. For PSFT this will have the value SETID~TREENAME~Category ID.
    Applicable for EBS/Fusion: Y. Applicable for PSFT: Y. Required for GL Account mapping: N. Required for Purchasing Category mapping: Y.

    CATEGORY_NAME
    Category Name.
    Applicable for EBS/Fusion: Y. Applicable for PSFT: Y. Required for GL Account mapping: N. Required for Purchasing Category mapping: Y (adding this value makes the file more readable; this data is not used in further mappings).

    SPEND_CATEGORY_ID
    In the V1 Spend Planning application, only indirect spend ('INDIRECT') is taken into analysis.
    Applicable for EBS/Fusion: Y. Applicable for PSFT: Y. Required for GL Account mapping: Y. Required for Purchasing Category mapping: Y.


    Examples:

    GL account numbers to spend category will have sample values like below for EBS:
    GL, 101, 1110, 1110,,,INDIRECT
    GL, 101, 1210, 1210,,,INDIRECT
    GL, 101, 1220, 1220,,,INDIRECT

    Purchasing category to spend category will have sample values like below for EBS:
    PC,,,,2~1, Miscellaneous,INDIRECT
    PC,,,,3~1, Monitors,INDIRECT
    PC,,,,32~1, Electrical Equipment,INDIRECT

Additional Note: SQL for finding the Category ID and Category Name in different ERPs

Fusion V1:

select b.category_id, tl.description from CatalogAM.Category b, CatalogAM.CategorySets cat_set where b.category_id= cat_set.category_id and cat_Set.category_set_id=#PROD_CAT_SET_ID1

#PROD_CAT_SET_ID1 is a CM parameter. This parameter is configured for the Product Dimension. You must pass the value as specified in Oracle BI Applications Configuration Manager.

Category_id in the flat file should be in the format: '#PROD_CAT_SET_ID1~ b.category_id'

Category Name will have the tl.description data

EBS 11510/R12x:

select b.category_id, tl.description from MTL_CATEGORIES_B b,MTL_CATEGORIES_TL tl, MTL_CATEGORY_SET_VALID_CATS cat_set where b.category_id= tl.category_id and cat_set.category_id=b.category_id and cat_Set.category_set_id=#PROD_CAT_SET_ID1

#PROD_CAT_SET_ID1 is a CM parameter. This parameter is configured for the Product Dimension. You must pass the value as specified in Oracle BI Applications Configuration Manager.

Category_id in the flat file should be in the format: '#PROD_CAT_SET_ID1~ b.category_id'

Category Name will have the tl.description data

PSFT 90/PSFT91:

select b.setid, b.category_cd, tl.descrshort from PS_ITM_CAT_TBL b,PS_ITM_CAT_TBL_LNG tl where b.setid=tl.setid(+) and b.category_type=tl.category_type(+) and b.category_id=tl.category_id(+)

AND b.setid = <setid that the item belongs to>

Category_id in the flat file should be in the format: 'b.setid~ #TREE_NAME1~ b.category_cd'

#TREE_NAME1 is a CM parameter. This parameter is configured for the Product Dimension. You must pass the value as specified in Oracle BI Applications Configuration Manager.

B.2.154 How to Configure Projects Resource Management Work Type Dimension for PeopleSoft

Note: This task is only applicable to PeopleSoft Project Resource Management.

Work Type identifies the type of work done on a project or task. Work Type can identify whether a particular task is billable, capitalizable, or for training, and it assigns a weight to the level of utilization of the person performing the task. Many metrics in the Project Analytics Resource Management solution depend on the work type. For example, Training Hours considers only those tasks that are of type Training. In PeopleSoft Resource Management, resource tasks are managed with a set of predefined task categories that are delivered with PeopleSoft Resource Management, and user-definable categories that are available for tasks that are specific to the organization. However, the PeopleSoft Resource Management UI provides no way to specify the work type associated with these task categories. The task types defined in PeopleSoft Resource Management are mapped to the Work Type dimension. The Work Type dimension for PeopleSoft is not applicable to Project subject areas other than Resource Management.

To Configure Projects Resource Management Work Type Dimension for PeopleSoft:

  1. Run the following query in the source database:

    SELECT CONCAT(CONCAT(TASK_TYPE,'~'),SYSTEM_SOURCE) FROM PS_RS_TASK_TYPE WHERE SYSTEM_SOURCE='RS'
    
  2. Copy the output of the above query into the file file_psft_work_type_ds.csv under the first column (INTEGRATION_ID), from row 5 onwards. (An illustrative example row is shown after these steps.)

  3. The Work Type dimension provides two different scales to weight assigned time. This allows two different views of utilization: one at the resource level and the other at the organization level. For example, time spent on rework or training could have full credit at the resource level but only partial credit at the organization level. For each row, specify the following information:

    Table B-187 Columns in file_psft_work_type_ds.csv

    Column Description

    RES_UTILIZATION_PERCENTAGE

    Resource Utilization Percentage for the task category. Any value between 0 and 100.

    ORG_UTILIZATION_PERCENTAGE

    Organization Utilization Percentage for the task category. Any value between 0 and 100.

    REDUCE_CAPACITY_FLG

    Flag that indicates whether any hours charged by the resource to this task category will reduce the capacity of the resource and the appropriate organization. Either Y or N.

    BILLABLE_CAPITALIZABLE_FLG

    Flag that indicates whether the task category type is billable or capitalizable. Either Y or N.

    TRAINING_FLAG

    Flag that indicates whether the task category is of type training.


  4. Save the file.

    Ensure that the file is saved in CSV (text) not in MS Excel format.

  5. This file needs to be placed in the Oracle BI Applications Source Files folder.
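
For illustration only (as referenced in step 2), a configured row might look like the following. This sketch assumes that the columns after INTEGRATION_ID appear in the order listed in Table B-187; the task type code 'TRAINING~RS' and the values shown are hypothetical:

TRAINING~RS,100,50,Y,N,Y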

B.2.155 How to Configure Project Revenue Fact for E-Business Suite

This topic contains the following sections:

B.2.155.1 Overview of Configuring Cost Fact for Oracle E-Business Suite

Actual Costs are extracted from the Cost Distribution Lines table in the Project Costing module in E-Business Suite and loaded into the Cost Line Fact (W_PROJ_COST_LINE_F) table.

For E-Business Suite, Transaction Currency is the Document Currency for this fact.

Note: E-Business Suite concurrent programs (such as PRC: Distribute Labor Costs and PRC: Distribute Usage and Miscellaneous Costs) for distributing Cost should be run before running the ETL to load the data warehouse. If the Cost Distribution program is not run before every incremental ETL run, the data in Cost Distribution Fact will not be synchronized with the actual expenditures in the Expenditure Fact table.

Expenditure Fact

The Expenditure Fact (W_PROJ_EXP_LINE_F) is based on PA_EXPENDITURE_ITEMS_ALL. It shows the actual expenditure data before distribution. This fact should be used by customers who do not distribute their Expenditure on a daily basis, but who have some users who need to see a frequently updated view of Expenditure data.

Note: The GL Date is assigned to the Cost Distribution Line only (during Cost distribution) and not to the Expenditure Item records. Therefore, the Expenditure data can only be analyzed by the Enterprise Calendar dimension and not by the GL calendar. Also, the Expenditure data cannot be analyzed by the GL Account because the GL account is associated only when the data is distributed.

Cost Fact Canonical Date

The Canonical Date dimension for the Cost fact is based on the PRVDR_GL_DATE from Distribution Line table, whereas the Canonical Date dimension for the Expenditure fact is based on the EXPENDITURE_DATE from the Expenditure Items table.

The multi calendar date dimension contains calendars for multiple organizations. It is essential that all records in a report analyzing data by the Fiscal Calendar (Dim - Fiscal Calendar) point to the same calendar. For this reason, all reports in the dashboard are filtered on the Project Business Unit. To make all Cost records in a Project Business Unit point to the same calendar, the RCVR_GL_DATE and RCVR_PA_DATE columns are used to populate the GL_ACCOUNTING_DT_WID and PROJ_ACCOUNTING_DT_WID columns in the fact table respectively. Expenditure OU view (in Cost Fact) can be built using Enterprise Calendar as well.

About Domain Values for Cost Fact

The Project Cost Transfer Status has been modeled as a domain value and can be configured in FSM.

Incremental Logic for Cost Fact

The incremental extract logic for the Cost fact table depends on the 'REQUEST_ID' field of the Cost Distribution Lines table. The W_PROJ_ETL_PS parameter table facilitates this logic.

Using a separate ODI interface, the maximum Request Id in the source table at the time of the ETL run is stored in this table, which is subsequently used to populate the SDE task (SDE_ORA_PROJECTCOSTLINE) level ODI variable #EBS_REQUEST_ID_1. It is initialized using the following query:

SELECT COALESCE((SELECT PRE_REQUEST_ID FROM QUALIFY_DS(W_PROJ_ETL_PS) WHERE TBL_NAME = 'PA_COST_DISTRIBUTION_LINES_ALL'),0) FROM_DUAL()
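
Conceptually, the refreshed variable then drives the incremental filter of the extract. The following is a sketch of the idea only, not the actual ODI-generated extract SQL; #EBS_REQUEST_ID_1 is the ODI variable named above:

-- Sketch only: extract just the cost distribution lines created since the
-- last ETL run, using the stored maximum request id.
SELECT *
FROM   PA_COST_DISTRIBUTION_LINES_ALL
WHERE  REQUEST_ID > #EBS_REQUEST_ID_1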

Note: If you are missing Cost records in W_PROJ_COST_LINE_F after an incremental update, download patch 9896800 from My Oracle Support. The Tech Note included with the patch explains the scenarios where this can happen, and the proposed solution.

B.2.155.2 How to Configure the Project Cost Aggregate Table

The Project Cost aggregate table (W_PROJ_COST_A) is used to capture information about the project cost distributions for the expenditure items. You need to configure the Project Cost Lines aggregate table before the initial ETL run and subsequent incremental ETL.

Before the initial ETL run, you need to configure the COST_TIME_GRAIN parameter in FSM for the time aggregation level in the Project Cost Lines aggregate fact table.

By default, the COST_TIME_GRAIN parameter has a value of PERIOD. The possible values for the COST_TIME_GRAIN parameter are:

  • PERIOD

  • QUARTER

  • YEAR

The Project Cost Lines aggregate table is fully loaded from the base table in the initial ETL run. The table can grow to millions of records. Therefore, the Project Cost aggregate table is not fully reloaded from the base table after each incremental ETL run. The Oracle Business Analytics Warehouse minimizes the incremental aggregation effort by modifying the aggregate table incrementally as the base table is updated, as described below.

To Configure the Project Cost Aggregate Table:

  1. Oracle Business Analytics Warehouse finds the records to be updated in the base table since the last ETL run, and loads them into the W_PROJ_COST_LINE_TMP table. The measures in these records are multiplied by (-1). The mapping responsible for this task is SIL_ProjectCostLinesFact_Derive_PreLoadImage.

  2. Oracle Business Analytics Warehouse finds the inserted or updated records in the base table since the last ETL run, and loads them into the W_PROJ_COST_LINE_TMP table, without changing their sign. The mapping responsible for this task is SIL_ProjectCostLinesFact_Derive_PreLoadImage, which is run before PLP_ProjectCostLinesFact_Derive_PostLoadImage updates or inserts records in the base table.

  3. Oracle Business Analytics Warehouse aggregates the W_PROJ_COST_LINE_TMP table and loads the result into W_PROJ_COST_A_TMP, which has the same granularity as the W_PROJ_COST_A table.

  4. The PLP_ProjectCostLinesAggregate_Derive mapping looks up the W_PROJ_COST_A aggregate table to update existing buckets or insert new buckets in the aggregate table (the load mapping is PLP_ProjectCostLinesAggregate_Load). A conceptual SQL sketch of this delta merge is shown after these steps.
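
The following is a conceptual sketch only of the delta-merge technique described in steps 1 through 4; it is not the actual code of the mappings named above, and the key and measure columns (PROJECT_WID, PERIOD_WID, COST_AMT) are hypothetical placeholders used for illustration:

-- Conceptual sketch, not the generated ODI code.
-- W_PROJ_COST_A_TMP already contains reversal images (measures * -1) for
-- changed base rows plus the new images, so summing the deltas and merging
-- them updates the aggregate without a full reload.
MERGE INTO W_PROJ_COST_A a
USING (
  SELECT PROJECT_WID, PERIOD_WID, SUM(COST_AMT) AS DELTA_AMT
  FROM   W_PROJ_COST_A_TMP
  GROUP  BY PROJECT_WID, PERIOD_WID
) d
ON (a.PROJECT_WID = d.PROJECT_WID AND a.PERIOD_WID = d.PERIOD_WID)
WHEN MATCHED THEN
  UPDATE SET a.COST_AMT = a.COST_AMT + d.DELTA_AMT
WHEN NOT MATCHED THEN
  INSERT (PROJECT_WID, PERIOD_WID, COST_AMT)
  VALUES (d.PROJECT_WID, d.PERIOD_WID, d.DELTA_AMT)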

B.2.155.3 Configuring Revenue Fact for E-Business Suite

Actual Revenue Line records are extracted from the Revenue/Event Distribution Lines tables (PA_CUST_REV_DISTRIB_LINES_ALL and PA_CUST_EVENT_DIST_ALL) in the Project Costing module in E-Business Suite and are loaded into the Revenue Line Fact (W_PROJ_REVENUE_LINE_F) table.

For E-Business Suite, Revenue Transaction Currency Code is the Document Currency Code for this fact.

Note: E-Business Suite concurrent programs (such as PRC: Generate Draft Revenue for a Single Project or PRC: Generate Draft Revenue for a Range of Projects) for distributing revenue should be run before the ETL is run to load the data warehouse.

For the Revenue Header Fact (W_PROJ_REVENUE_HDR_F), the primary source is the PA_DRAFT_REVENUES table. Revenue line metrics, such as Bill and Revenue amounts, are aggregated in this table as well.

Revenue Fact Canonical Date

The Canonical Date dimension is based on the GL_DATE from the Draft Revenues table.

Revenue Facts Staging Table

This is a common staging table that loads both the header and the line level revenue fact tables.

Revenue Fact Multicurrency Support

Some metrics such as Unearned Revenue, Unbilled Receivables, Realized Gains, and Realized Losses are only available in Local Currency and Global Currencies. There are three columns in W_PROJ_REVENUE_LINE_F and W_PROJ_REVENUE_HDR_F respectively for revenue amounts in global currencies.

Revenue Fact Domain Values

The project revenue status has been modeled as a domain value and can be configured in FSM.

Incremental Logic for Revenue Fact

The incremental extract logic for the Revenue fact table depends on the REQUEST_ID field of the Revenue Distribution Lines table. The W_PROJ_ETL_PS parameter facilitates this logic, and through a separate ODI process, the maximum Request Id in the source table at the time of the ETL run is stored in this table, which is subsequently used to populate the following variables for the SDE_ORA_ProjectRevenueLine task in ODI:

  • #EBS_REQUEST_ID_2

    This variable is initialized using the following query:

    SELECT COALESCE((SELECT COALESCE(PRE_REQUEST_ID,0) FROM QUALIFY_DS(W_PROJ_ETL_PS) WHERE TBL_NAME ='PA_CUST_EVENT_RDL_ALL'),0) FROM_DUAL()
    
  • #EBS_REQUEST_ID_4

    This variable is initialized using the following query:

    SELECT COALESCE((SELECT COALESCE(PRE_REQUEST_ID,0) FROM QUALIFY_DS(W_PROJ_ETL_PS) WHERE TBL_NAME ='PA_CUST_REV_DIST_LINES_ALL'),0) FROM_DUAL()
    
  • #EBS_REQUEST_ID_4

    This variable is initialized using the following query:

    SELECT COALESCE((SELECT COALESCE(PRE_REQUEST_ID,0) FROM QUALIFY_DS(W_PROJ_ETL_PS) WHERE TBL_NAME ='PA_DRAFT_REVENUES_ALL'),0) FROM_DUAL()
    

B.2.155.4 How to Configure the Project Revenue Aggregate Table

The Project Revenue aggregate table (W_PROJ_REVENUE_A) is used to capture information about the project revenue distributions. You need to configure the Project Revenue Lines aggregate table before the initial ETL run and subsequent incremental ETL runs.

Before the initial ETL run, you need to configure the REVENUE_TIME_GRAIN parameter in FSM for the time aggregation level in the Project Revenue Lines aggregate fact table.

By default, the REVENUE_TIME_GRAIN parameter has a value of PERIOD. The possible values for the REVENUE_TIME_GRAIN parameter are:

  • PERIOD

  • QUARTER

  • YEAR

The Project Revenue Lines aggregate table is fully loaded from the base table in the initial ETL run. The table can grow to millions of records. Therefore, the Project Revenue aggregate table is not fully reloaded from the base table after each incremental ETL run. The Oracle Business Analytics Warehouse minimizes the incremental aggregation effort by modifying the aggregate table incrementally as the base table is updated.

To configure the Project Revenue Aggregate Table:

  1. Oracle Business Analytics Warehouse finds the records to be updated in the base table since the last ETL run, and loads them into the W_PROJ_REVENUE_LINE_TMP table. The measures in these records are multiplied by (-1). The mapping responsible for this task is SIL_ProjectRevenueLinesFact_Derive_PreLoadImage.

  2. Oracle Business Analytics Warehouse finds the inserted or updated records in the base table since the last ETL run, and loads them into the W_PROJ_REVENUE_LINE_TMP table, without changing their sign. The mapping responsible for this task is SIL_ProjectRevenueLinesFact_Derive_PreLoadImage, which is run before PLP_ProjectRevenueLinesFact_Derive_PostLoadImage updates or inserts records in the base table.

  3. Oracle Business Analytics Warehouse aggregates the W_PROJ_REVENUE_LINE_TMP table and loads the result into W_PROJ_REVENUE_A_TMP, which has the same granularity as the W_PROJ_REVENUE_A table.

  4. The PLP_ProjectRevenueLinesAggregate_Derive mapping looks up the W_PROJ_REVENUE_A aggregate table to update existing buckets or insert new buckets in the aggregate table (the mapping is PLP_ProjectRevenueLinesAggregate_Load).

B.2.155.5 How To Configure Project Uom For E-Business Suite

Use the following SQL against the OLTP to get the project UOMs, and then map them to the warehouse (conformed) UOMs in FSM if the codes are not already mapped.

select lookup_code, meaning, description from fnd_lookup_values where lookup_type='UNIT' and LANGUAGE='US';

B.2.156 How to Set Up CRM Partner Channel Based Security for Oracle Fusion

Overview

Oracle Fusion CRM partner channel based security is applied when a Fusion channel administrator or channel operations manager accesses partner-owned leads and opportunities/revenue. For leads, this means the user can see all leads whose sales channel is stamped as indirect or partner. For opportunities/revenue, this means the user can access all opportunities (and associated revenue) that have a partner assigned to them.

Configuring Partner Channel Based Security

No session variable is involved in setting up this data security.

B.2.156.1 Configuring BI Duty Roles

OBIA_PARTNER_ALL_INDIRECT_TRANSACTIONAL_DATA_SECURITY is the internal BI Duty Role used to define the data filter for partner channel based data security. By default, it has only one member duty role:

  • OBIA_PARTNER_CHANNEL_ADMINISTRATIVE_ANALYSIS_DUTY

This duty role controls which subject areas and dashboard content the user gets access to. As a member of OBIA_PARTNER_ALL_INDIRECT_TRANSACTIONAL_DATA_SECURITY, it ensures that the partner channel based data security filter is applied to all queries involving leads or opportunities/revenue.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.157 How to Configure SIA Presentation Hierarchy in RPD

In Student Information Analytics (SIA), the following presentation hierarchies are implemented in the default delivery of the module:

  • Hierarchy -1 : Academic Year -> Academic Term

  • Hierarchy -2 (Sub-Plan hierarchy) : Academic Institution -> Academic Career -> Academic Program -> Academic Plan -> Academic Sub Plan

  • Hierarchy -3 (Plan hierarchy): Academic Institution -> Academic Career -> Academic Program -> Academic Plan

  • Hierarchy -4 (Program hierarchy) : Academic Institution -> Academic Career -> Academic Program

Hierarchy-1 does not require any configuration. However, Hierarchies 2, 3, and 4 require configuration according to your preferences and data.

The steps to configure the sub-plan hierarchy are mentioned below. Similar steps can be adopted to configure the plan and program hierarchies.

Sub-Plan hierarchy configuration

The Sub-Plan hierarchy is based on the Academic Sub Plan dimension, which is a subset of the Academic Plan dimension. If your fact data is populated down to the sub-plan level for all fact rows, then the default Sub-Plan hierarchy works correctly and requires no configuration.

However, if you determine that in your data all fact rows will have data at least to the academic plan level, and academic sub plan information is optional, then you should use Academic Plan hierarchy (which is already present in the BMM layer).

The steps to configure the sub-plan hierarchy (in the scenario described above) are as follows:

  1. Delete the "SIA Academic Subplan" hierarchy from presentation layer under Academic Institution presentation table.

  2. Drag and drop the "SIA Academic Plan" hierarchy from the BMM layer to Academic Institution presentation table of a specific subject area.

  3. Drag and drop the following presentation columns to the respective presentation hierarchy levels of Academic Plan presentation hierarchy.

Table B-188 Presentation tables and columns

SI Number   Presentation Table                   Presentation Column

1           Academic Institution / Institution   Source Institution
2           Academic Career                      Academic Career
3           Academic Program                     Academic Program
4           Academic Plan                        Academic Plan


However, if you determine that in your data all fact rows will have data at least to the academic program level, and academic sub plan and plan information are optional, then you should use Academic Program hierarchy (which is also present in the BMM layer).

However, if you want to use an Academic Career hierarchy, then you must create it first. The Academic Career hierarchy is not deployed by default, and requires a customization.

B.2.158 How to Configure Backlog Period

The Backlog table (W_SALES_BACKLOG_LINE_F) stores backlog data for the current month. In contrast, the Backlog History table (W_SALES_BACKLOG_HIST_F) stores snapshots of all previous months' historical backlog data. The periods for which the Backlog History table tracks backlog data are defined by the Backlog Period Date. By default, the date is set to the last calendar day of the month; however, you can configure this date. If you want to view backlog history at a more detailed level, such as by day or by week, or at a higher level, such as by quarter or year, set the FSM parameter TIME_GRAIN to 'DAY', 'WEEK', 'QUARTER', or 'YEAR'. In addition, you must make the ODI changes described in this section.

Setting the Time Grain Parameters in FSM

By default, the parameter TIME_GRAIN is set to 'MONTH'. If you want to change the period of aggregation, set this parameter to the desired level. To change the value in FSM, navigate to Manage Parameters, select TIME_GRAIN, and click the Edit button.

To set the Time Grain Parameters in FSM:

  1. Navigate to Manage Parameters.

  2. Select TIME_GRAIN and click on the edit button.

  3. In the Manage Parameter Default values area, specify a value in the Default Value field. The valid values are:

    • DAY

    • WEEK

    • MONTH

    • QUARTER

    • YEAR

    The corresponding Parameter Task in FSM is 'Configure Time Grain for Backlog Period Date'.

Setting the KM option for Backlog Period in ODI

By default, the aggregation period for the Backlog Period is set to the month level. If you want to change the aggregation period, you must set the 'OBI_DELETE_TYPE' KM option on the ODI interface named PLP_SalesBacklogHistoryFact_Load.W_SALES_BACKLOG_HISTORY_F.

To set the OBI_DELETE_TYPE in ODI:

  1. In ODI Designer Navigator, navigate to BIApps Projects, then Mappings, then PLP, then PLP_SalesBacklogHistoryFact_Load folder.

  2. Open the interface 'PLP_SalesBacklogHistoryFact_Load.W_SALES_BACKLOG_HISTORY_F' and go to the Flow tab.

  3. In the Property Inspector, navigate to the KM option OBI_DELETE_TYPE and change its value to one of the following options, matching the FSM TIME_GRAIN parameter setting (see the mapping after these steps). The valid values for this option are:

    • CAL_DAY

    • CAL_WEEK

    • CAL_MONTH

    • CAL_QTR

    • CAL_YEAR

  4. Save the interface and Regenerate the scenario.
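
The correspondence between the FSM TIME_GRAIN value and the OBI_DELETE_TYPE KM option is not spelled out above; the natural pairing, which is assumed here and should be confirmed in your environment, is:

  • DAY maps to CAL_DAY

  • WEEK maps to CAL_WEEK

  • MONTH maps to CAL_MONTH

  • QUARTER maps to CAL_QTR

  • YEAR maps to CAL_YEAR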

B.2.159 How to Reload the Time Dimension Tables After the Data Warehouse Is Loaded

The default value for the last date in the time dimension is December 31, 2020. If you need to extend this date, you must use Oracle BI Applications Configuration Manager to change the default value of the END_DATE variable to a later value (no later than December 31, 2050, which is the upper limit). The time dimension tables are extended automatically in the next incremental load.

To Reload the Time Dimension Tables After the Data Warehouse Is Loaded:

  1. You will see a Subject Area named Common Dimension and Extend Day Dimension. If you have multiple calendars in your Day Dimension, then choose the configuration tag Extend Day Dimension Multiple Calendar Support; otherwise, remove it. Then assemble the Subject Area.

  2. Choose the Task Sil_DayDimension_XTND. Choose a new START_DATE (= END_DATE + 1) and a new END_DATE, and set the parameter values.

  3. Choose the Task SDE_FUSION_TimePeriodMCalPeriod_XTND. Retain the START_DATE and choose a new END_DATE.

  4. Build the corresponding Load Plan with the same name.

  5. Remember to change FILE_MCAL_CAL_D, FILE_MCAL_CONTEXT_G, and FILE_MCAL_PERIOD_DS (these three are in the Universal adaptor), and FILE_MCAL_CONFIG_G, if you use them as a source.

Load Plan steps for a Fusion Applications container:

  1. You will see a Subject Area named 'Common-Extend Day Dimension'. If you have multiple calendars in your Day Dimension, then choose the configuration tag Extend Day Dimension Multiple Calendar Support; otherwise, remove it. Then assemble the Subject Area.

  2. Choose the Task Sil_DayDimension_XTND. Choose a new START_DATE (= END_DATE + 1) and a new END_DATE, and set the parameter values.

  3. Choose the Task SDE_FUSION_TimePeriodMCalPeriod_XTND. Retain the START_DATE and choose a new END_DATE.

  4. Build the corresponding Load Plan named 'Common-Extend Day Dimension Fusion'.

B.2.160 How to Set Up CRM Partner Account Team Based Security for Oracle Fusion

Overview

Oracle Fusion CRM partner account team based security is applied when a Fusion channel account manager accesses partner-owned opportunities/revenue. Partner channel account managers can access all opportunities/revenue owned by any partner organization on whose partner account team they are a member.

Configuring Partner Account Team Based Security

The session variable USER_PARTY_ID is the resource party Id that uniquely identifies the login user. It is initialized by the session initialization block GET_PARTY_ID and is then used in the partner account team based data security role.

B.2.160.1 Configuring BI Duty Roles

OBIA_PARTNER_TEAM_DATA_SECURITY is the internal BI duty role used to define the data filter for partner account team based data security. By default, it has one member BI duty role:

  • OBIA_PARTNER_CHANNEL_ACCOUNT_MANAGER_ANALYSIS_DUTY

This duty role controls which subject areas and dashboard content the user gets access to. As a member of OBIA_PARTNER_TEAM_DATA_SECURITY, it ensures that the partner account team based data security filter is applied to all queries involving opportunity or revenue.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.161 How to Set Up Project Resource Management Security for E-Business Suite

Overview

Project Analytics supports security over the following dimensions in the Project Resource Management subject areas. In the Oracle BI Applications solution, the 'Business Unit' entity refers to 'Operating Unit Organizations' in E-Business Suite. The list of Business Units that a user has access to is determined by E-Business Suite grants.

Table B-189 Supported Project Resource Management Security subject areas

Project Resource Management facts (columns) by securing entity (rows):

Securing Entity             Resource      Resource     Resource Utilization  Resource Utilization  Resource Utilization  Employee
                            Availability  Requirement  Assignment            Capacity              Expected              Job/Competency

Project Business Unit       N             Y            Y                     N                     Y                     N
Project Organization        N             Y            Y                     N                     Y                     N
Expenditure Business Unit   N             N            N                     N                     N                     N
Contract Business Unit      N             N            N                     N                     N                     N
Project                     N             Y            Y                     N                     Y                     N
Resource Organization       N             N            N                     Y                     Y                     Y
Ledger                      N             N            N                     N                     N                     N


Configuring Project Resource Management For E-Business Suite

In order for data security filters to be applied, appropriate initialization blocks need to be enabled depending on the deployed source system.

Note:

On installation, initialization blocks are enabled for E-Business Suite R12. If you are deploying on a source system other than E-Business Suite R12, then you must enable the appropriate initialization blocks.

You must enable data security for Project Resource Management in E-Business Suite by enabling the initialization blocks listed below, based on your E-Business Suite adapter. Make sure that all Project Security initialization blocks for other adapters are disabled. If more than one source system is deployed, then you must also enable the initialization blocks of those source systems.

Init Blocks

Init Blocks for E-Business Suite R11x only:

  • Project Business Unit List RM EBS11x

Init Blocks for E-Business Suite R12 only:

  • Project Business Unit List RM EBSR12

Init Blocks for both E-Business Suite R11x and R12:

  • Project List RM EBS

  • Project Organization List RM

  • Project Resource Organization List

Configuration

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd) in online mode, and select Manage, then Identity.

  2. Double click on OBIA_PROJECT_DATA_SECURITY, navigate to Permissions, then Data Filters, and enable data security filters related to Resource Management Fact Tables.

    This activates Project based Security which is needed for Resource Management Module in E-Business Suite.

  3. Double click on OBIA_PROJECT_RESOURCE_ORGANIZATION_DATA_SECURITY, navigate to Permissions, then Data Filters, and enable all data security filters related to Resource Management Fact Tables.

    This activates Resource Organization based Security which is needed for Resource Management Module in E-Business Suite.

  4. Double click on OBIA_PROJECT_ORGANIZATION_DATA_SECURITY, navigate to Permissions, then Data Filters, and enable data security filters related to Resource Management Fact Tables.

    This activates Project Organization based Security which is needed for Resource Management Module in E-Business Suite.

  5. Double click on OBIA_PROJECT_BUSINESS_UNIT_DATA_SECURITY, navigate to Permissions, then Data Filters, and enable data security filters related to Resource Management Fact Tables.

    This activates Project Business Unit based Security which is needed for Resource Management Module in E-Business Suite.

B.2.161.1 Configuring BI Duty Roles

The following BI Duty Roles are applicable to the Project Resource Management subject area.

  • OBIA_EBS_PROJECT_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_EBS_PROJECT_MANAGEMENT_ANALYSIS_DUTY

  • OBIA_EBS_PROJECT_DATA_SECURITY

These duty roles control which subject areas and dashboard content the user gets access to. These duty roles also ensure that the data security filters are applied to all queries. For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.162 How to Implement Procurement and Spend Analytics Security


About Duty Roles

The following sections explain which Duty Roles you need to deploy for each functional area. Duty roles control which subject areas and dashboard content a user can access. Duty roles also ensure that appropriate data security filters are applied to the SQL queries that power the dashboards and reports.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles", or in FSM refer to the FSM Task 'How to Define New Groups and Mappings for Users and BI Roles'.

B.2.162.1 How to implement Hierarchy Based Security for Employee Expense Subject Areas

This section covers Hierarchy-based security for Employee Expense Subject Areas.

B.2.162.1.1 Overview

The employee expense subject areas support security by employee hierarchy for line managers. The list of values a user has access to is determined by the grants in the source application system.

B.2.162.1.2 Enabling Initialization Blocks

In order for data security filters to be applied, the appropriate initialization blocks need to be enabled depending on the deployed source system. For example, to enable security for EBS, enable the Oracle EBS initialization block and make sure the initialization blocks of all other source systems are disabled.

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

    Use the table below for guidance on which Initialization Blocks to enable for your Source System.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

The table below shows which Initialization Blocks need to be enabled for each Source System.

Table B-190 List of Source Systems and related Initialization Blocks

Source System                Initialization Block

Oracle Fusion Applications   HR Security Person ID List (Fusion)
Oracle EBS                   HR Security Person ID List (EBS)
Oracle PeopleSoft            HR Security Person ID List (PeopleSoft)


B.2.162.1.3 Configuring BI Duty Roles

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles", or in FSM refer to the FSM Task 'How to Define New Groups and Mappings for Users and BI Roles'.

The following BI Duty Roles are applicable to the Employee Expense Subject Areas:

Table B-191 List of BI Duty Roles and Related Subject Areas

Role                                         Employee Expenses -   Employee Expenses -   Employee Expenses -
                                             Credit Card           Overview              Violations

OBIA_LINE_MANAGER_EXPENSE_ANALYSIS_DUTY      X                     X                     X
OBIA_AU_LINE_MANAGER_EXPENSE_ANALYSIS_DUTY   X                     X                     X


B.2.162.2 How to implement Org Based Security for Employee Expense Subject Areas

This section covers Org-based security for Employee Expense Subject Areas.

B.2.162.2.1 Overview

The employee expense subject areas support security by Business Unit for corporate card administrators, expense managers, and spend executives. The list of values a user has access to is determined by the grants in the source application system.

B.2.162.2.2 Enabling Initialization Blocks

In order for data security filters to be applied, the appropriate initialization blocks need to be enabled depending on the deployed source system. For example, to enable security for EBS, enable the Oracle EBS initialization block and make sure the initialization blocks of all other source systems are disabled.

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

    Use the table below for guidance on which Initialization Blocks to enable for your Source System.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

The table below shows which Initialization Blocks need to be enabled for each Source System.

Table B-192 List of Source Systems and related Initialization Blocks

Source System                Initialization Block

Oracle Fusion Applications   PROC_SPEND_AN:SECURITY:Employee Expense Corporate Card BU List
                             PROC_SPEND_AN:SECURITY:Employee Expense Violation BU List
Oracle EBS                   Operating Unit Org-based Security
Oracle PeopleSoft            Operating Unit Org-based Security


B.2.162.2.3 Configuring BI Duty Roles

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles", or in FSM refer to the FSM Task 'How to Define New Groups and Mappings for Users and BI Roles'.

The following BI Duty Roles are applicable to the Employee Expense Subject Areas:

Table B-193 List of BI Duty Roles and Related Subject Areas

Role                                               Employee Expenses -   Employee Expenses -   Employee Expenses -
                                                   Credit Card           Overview              Violations

OBIA_CORPORATE_CARD_ADMINISTRATION_ANALYSIS_DUTY   X                     -                     -
OBIA_EXPENSE_MANAGEMENT_ANALYSIS_DUTY              -                     X                     -
OBIA_PROCUREMENT_EXECUTIVE_ANALYSIS_DUTY           X                     X                     X
Procurement and Spend Executive                    X                     X                     X
Procurement and Spend Executive PSFT               X                     X                     X


B.2.162.3 How to implement Procurement and Spend Subject Areas Security for Suppliers

This section covers security for Suppliers.

B.2.162.3.1 Overview

The procurement and spend subject areas support security for suppliers in the Fusion Applications. The list of values a user has access to is determined by the grants in the source application system.

B.2.162.3.2 Enabling Initialization Blocks

In order for data security filters to be applied, the appropriate initialization blocks need to be enabled depending on the deployed source system. For example, to enable security for EBS, enable the Oracle EBS initialization block and make sure the initialization blocks of all other source systems are disabled.

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

    Use the table below for guidance on which Initialization Blocks to enable for your Source System.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

The table below shows which Initialization Blocks need to be enabled for each Source System.

Table B-194 List of Source Systems and related Initialization Blocks

Source System                Initialization Block

Oracle Fusion Applications   PROC_SPEND_AN:SECURITY:Procurement Supplier Access Level
                             PROC_SPEND_AN:SECURITY:Procurement Supplier Access List


B.2.162.3.3 Configuring BI Duty Roles

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles", or in FSM refer to the FSM Task 'How to Define New Groups and Mappings for Users and BI Roles'.

The BI Duty Role "OBIA_SUPPLIER_ANALYSIS_DUTY" is applicable to these Subject Areas:

  • Fact - Purchasing - Agreement

  • Fact - Purchasing - Order

  • Fact - Purchasing - Receipt

  • Fact - Spend and AP Invoice Distribution

  • Dim - Supplier

  • Dim - Supplier Site

B.2.162.4 How to implement Procurement and Spend Security for Procurement users

This section covers security for Procurement users.

B.2.162.4.1 Overview

The procurement and spend subject areas support agent-based security for procurement users. The list of values a user has access to is determined by the grants in the source application system.

B.2.162.4.2 Enabling Initialization Blocks

In order for data security filters to be applied, the appropriate initialization blocks need to be enabled depending on the deployed source system. For example, to enable security for EBS, enable the Oracle EBS initialization block and make sure the initialization blocks of all other source systems are disabled.

To enable initialization blocks, follow the steps below:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

  2. Choose Manage, then Variables.

  3. Under Session – Initialization Blocks, open the initialization block that you need to enable.

    Use the table below for guidance on which Initialization Blocks to enable for your Source System.

  4. Clear the Disabled check box.

  5. Save the metadata repository (RPD file).

The table below shows which Initialization Blocks need to be enabled for each Source System.

Table B-195 List of Source Systems and related Initialization Blocks

Source System                Initialization Block

Oracle Fusion Applications   PROC_SPEND_AN:SECURITY:Procurement Agreement BU List
                             PROC_SPEND_AN:SECURITY:Procurement Agreement View Others BU List
                             PROC_SPEND_AN:SECURITY:Procurement PurchaseOrder BU List
                             PROC_SPEND_AN:SECURITY:Procurement PurchaseOrder View Others BU List
                             PROC_SPEND_AN:SECURITY:Procurement Requisition BU List
                             PROC_SPEND_AN:SECURITY:Procurement Requisition View Others BU List
                             PROC_SPEND_AN:SECURITY:Procurement Sourcing BU List
                             PROC_SPEND_AN:SECURITY:Procurement Sourcing View Others BU List
                             PROC_SPEND_AN:SECURITY:Procurement Spend View BU List
                             PROC_SPEND_AN:SECURITY:Procurement Supplier Site Access List
                             Operating Unit Organizations
Oracle EBS                   Operating Unit Org-based Security
Oracle PeopleSoft            Operating Unit Org-based Security


B.2.162.4.3 Configuring BI Duty Roles

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles", or in FSM refer to the FSM Task 'How to Define New Groups and Mappings for Users and BI Roles'.

Two graphics in the product documentation show which BI Duty Roles are applicable to the Procurement and Spend Subject Areas.


B.2.163 How to Configure Scorecard Target Before Running ETL

Purpose

This section helps you to understand how to prepare the scorecard target files.

Optional or Mandatory

This task is only required if you choose to implement the procurement scorecard feature.

Task description in detail

Use the file file_purch_scorecard_target.csv to specify the targets for the KPIs. The supported dimensions are the time dimension and the Procurement Business Unit dimension.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

You must specify the following values in the source file, in the required data format (an illustrative example follows the KPI list below):

  • Quarter start date

  • Procurement Business Unit ID

  • KPI name

  • KPI target value

The following KPIs are supported for KPI target value:

# of Negotiation Lines Awarded Per Category
# of POs Per Buyer
# of Suppliers Per Category
% of Fulfilled Requisition Lines Past Expected Date
% of Late Receipts
% of Processed Requisition Lines Past Expected Date
% of Realized Savings
% of Supplier Diversity Spend
% of Unfulfilled Requisition Lines Past Expected Date
Average Negotiation Cycle Time
Average Requisition to Receipt Cycle Time
Electronic Invoice %
Manual Requisition Lines Rate
Non-Agreement Purchase Rate
Overall Accepted %
Perfect Invoices %
Purchase Order Schedule Line Return Rate
Received On Time %
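
For illustration only, a row of file_purch_scorecard_target.csv might look like the following. The header names, column order, date format, and sample values shown here are assumptions; confirm them against the header row of the delivered file before configuring it:

    QUARTER_START_DATE,PROCUREMENT_BU_ID,KPI_NAME,KPI_TARGET_VALUE
    2013-01-01,204,% of Late Receipts,5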

B.2.164 About Configuring Purchase Cycle Lines Aggregate Fact

This topic provides additional information about configuring the purchase cycle line aggregate parameter.

To aggregate the Purchase Cycle Lines table for ETL, you configure the TIME_GRAIN parameter, which is preconfigured to the value of Month. Valid values are DAY, WEEK, MONTH, QUARTER, YEAR.

The Purchase Cycle Lines aggregate table is fully loaded from the base table in the initial ETL run. The table can grow to millions of records. The Purchase Cycle Lines aggregate table is not fully reloaded from the base table after an ETL run. Oracle Business Analytics Warehouse minimizes the incremental aggregation effort by modifying the aggregate table incrementally as the base table is updated.

When the Supply Chain - Purchase Cycle Lines Subject Area is included in a Load Plan in Oracle BI Applications Configuration Manager, the Purchase Cycle Lines data is extracted using the following tasks:

  • SIL_PurchaseCycleLinesAggregate_Derive_PreSoftDeleteImage finds the records to be deleted in the base table since the last ETL run, and loads them into the W_PURCH_CYCLE_LINE_TMP table. The task is run in the source-specific workflow before the records are deleted from the base table.

  • SIL_PurchaseCycleLinesAggregate_Derive_PreLoadImage finds the records to be updated in the base table since the last ETL run, and loads them into the W_PURCH_CYCLE_LINE_TMP table. The measures in these records are multiplied by (-1). The task is run in the source-specific workflow before the records are updated in the base table.

  • PLP_PurchaseCycleLinesAggregate_Derive_PostLoadImage finds the inserted or updated records in the base table since the last ETL run, and loads them into the W_PURCH_CYCLE_LINE_TMP table, without changing their sign. The task is run in the post load-processing workflow after the records are updated or inserted into the base table.

  • PLP_PurchaseCycleLinesAggregate_Load aggregates the W_PURCH_CYCLE_LINE_TMP table, and joins it with the W_PURCH_CYCLE_LINE_A aggregate table to insert new or update existing buckets to the aggregate table.

B.2.165 About Configuring Purchase Receipts Aggregate Fact

This topic provides additional information about configuring the purchase receipt aggregate parameter.

To aggregate the Purchase Receipts table for ETL, you configure the TIME_GRAIN parameter, which is preconfigured to the value of Month. Valid values are DAY, WEEK, MONTH, QUARTER, YEAR.

The Purchase Receipt Lines aggregate table is fully loaded from the base table in the initial ETL run. The table can grow to millions of records. Thus, the Purchase Receipts aggregate table is not fully reloaded from the base table after each incremental ETL run. Oracle Business Analytics Warehouse minimizes the incremental aggregation effort by modifying the aggregate table incrementally as the base table is updated.

When the Supply Chain - Purchase Receipts Subject Area is included in a Load Plan in Oracle BI Applications Configuration Manager, the Purchase Receipts data is extracted using the following tasks:

  • SIL_PurchaseReceiptAggregate_Derive_PreSoftDeleteImage finds the records to be deleted in the base table since the last ETL run, and loads them into the W_PURCH_RCPT_TMP table. The measures in these records are multiplied by (-1). The task is run in the source-specific workflow before the records are deleted from the base table.

  • SIL_PurchaseReceiptAggregate_Derive_PreLoadImage finds the records to be updated in the base table since the last ETL run, and loads them into the W_PURCH_RCPT_TMP table. The measures in these records are multiplied by (-1). The task is run in the source-specific workflow before the records are updated in the base table.

  • PLP_PurchaseReceiptAggregate_Derive_PostLoadImage finds the inserted or updated records in the base table since the last ETL run, and loads them into the W_PURCH_RCPT_TMP table, without changing their sign. The task is run in the post load-processing workflow after the records are updated or inserted into the base table.

  • PLP_PurchaseReceiptAggregate_Load aggregates the W_PURCH_RCPT_TMP table, and joins it with the W_PURCH_RCPT_A aggregate table to insert new or update existing buckets to the aggregate table.

B.2.166 How to Set Up CRM Resource Hierarchy Based Security for Oracle Fusion

Overview

Resource hierarchy based security is widely used in many CRM subject areas, such as Sales, Marketing, and Partner Management. Resource based security control starts with the current login user. The login user's party Id and the levels that the login user belongs to in the resource hierarchy are then used as part of the data filter condition in queries.

There are variations of the resource hierarchy based security rule when it is applied in different areas, although they are all resource based by nature.

For Opportunity and Revenue, visibility is granted to the login user as:

- A member of the opportunity team.

- Direct manager or above in the managerial hierarchy of the team member.

For Resource Quota, visibility is granted to the login user as:

- The resource that the resource quota is created for.

- Direct manager or above in the managerial hierarchy of the owner.

For Leads, visibility is granted to the login user as:

- A member of the lead team.

- Direct manager or above in the managerial hierarchy of the team member.

For Sales Campaigns, visibility is granted to the login user as:

- Direct owner of the campaign.

- Direct manager or above in the managerial hierarchy of the owner.

Configuring Resource Hierarchy Based Security

There are 2 session variables used in resource hierarchy based data security roles.

- RESOURCE_HIER_LEVEL_LIST contains the list of all the possible levels that the login user belongs to. This variable is initialized by the session initialization block "Resource Hierarchy Level List".

- USER_PARTY_ID is the resource party Id that uniquely identifies the login user. This variable is initialized by the session initialization block GET_PARTY_ID.

B.2.166.1 Configuring BI Duty Roles

All the Resource Hierarchy Based security roles should be defined as members of the internal role OBIA_RESOURCE_HIERARCHY_DATA_SECURITY, under which all the necessary data filters are defined. In the default (that is, installed) configuration, OBIA_RESOURCE_HIERARCHY_DATA_SECURITY has the following members:

  • OBIA_LEAD_ANALYSIS_DUTY

  • OBIA_PARTNER_ANALYSIS_DUTY

  • OBIA_PARTNER_ADMINISTRATIVE_ANALYSIS_DUTY

  • OBIA_PARTNER_CHANNEL_ACCOUNT_MANAGER_ANALYSIS_DUTY

  • OBIA_PARTNER_CHANNEL_ADMINISTRATIVE_ANALYSIS_DUTY

  • OBIA_PARTNER_CHANNEL_ANALYSIS_DUTY

  • OBIA_OPPORTUNITY_LANDSCAPE_ANALYSIS_DUTY

  • OBIA_SALES_CAMPAIGN_ANALYSIS_DUTY

  • OBIA_SALES_EXECUTIVE_ANALYSIS_DUTY

  • OBIA_SALES_MANAGERIAL_ANALYSIS_DUTY

  • OBIA_SALES_TRANSACTIONAL_ANALYSIS_DUTY

These duty roles control which subject areas and dashboard content the user gets access to.

For more information about how to define new groups and mappings for Users and BI Roles, see Section B.2.44, "How to Define New Groups and Mappings for Users and BI Roles".

B.2.167 About Manage Domains and Member Mappings for Workforce Event

Purpose

The Workforce Event Dimension has a number of conformed domains which are used in many of the HCM metrics. These domains must be configured correctly for the reports to contain accurate information.

Optional or Mandatory

This task is optional; however the default configuration may not adequately reflect the OLTP setup, so this should be reviewed to ensure the reports are accurate.

Applies to

All sources, however the method of configuring this domain varies for each source.

Task description in detail

Configure the domain mappings related to the Workforce Event Dimension/Fact. The domain mapping for Workforce Event, Sub-Group and Group is used to classify events such as Hires, Terminations and Transfers, and to drill down further into different subtypes such as Voluntary or Involuntary Terminations.

These domains are designed as a hierarchy, so at the base level all events should map onto a conformed Workforce Event Domain which can be extended to include additional events. Custom metrics can be defined on events in this domain. The higher level group domains provide a drill to detail. For example, a hire (group) may break down into a new hire or rehire (sub-group).

Example for E-Business Suite

For E-Business Suite the domain mapping to configure is from a combination of event source, event reason and change combination:

  • Event source is the origin of the event. These are already seeded for example, ASG, FTE, HDC, SAL.

  • Event reason is the corresponding reason derived from the application lookup table. The reason codes are prefixed with the corresponding reason types.

  • Change combination allows events with organization, job, grade, location, position, or supervisor changes to be mapped. Any combination of these can be specified; a few examples have been seeded.

Example Requirements:

  • Define Promotion as an assignment change with reason "Promotion" and an accompanying grade change.

  • Define Restructure as an organization change with reason "Restructure".

  • Define Transfer as an organization change with any other reason.

Example Implementation:

  1. Add "Restructure" to Workforce Event Detail domain.

    1. Map it to "Transfer" in the Workforce Event Sub Group domain.

    2. "Transfer" is already mapped to "Transfer" in the Workforce Event Group domain.

  2. Add the new source domain members that are needed for the required mapping from Source Workforce Event Reason Combination:

    1. ASG~PROMOTION~Grd=Y

    2. ASG~RESTRUCTURE~Org=Y

  3. Add the following mappings to the domain map Source Workforce Event Reason Combination -> Workforce Event Detail:

    1. ASG~PROMOTION~Grd=Y > ASG_PROMOTION

    2. ASG~RESTRUCTURE~Org=Y > RESTRUCTURE

  4. The remaining definition for transfers is already seeded, so no change is required.

    The resulting domain mapping contains the new rows together with the seeded domain mappings.

Notes

Multiple matches are allowed. For example, an assignment change with reason "Restructure" and an organization change would match the mapping to either TRANSFER or RESTRUCTURE. The exact match on reason takes precedence over "any" reason, so the result would be RESTRUCTURE.

Example for PeopleSoft

For PeopleSoft, the domain mapping to configure is from a combination of action and action reason. The action reason code is prefixed with the corresponding action code.

Example Requirements:

  • Define Restructure as a transfer action with reason "Restructure".

  • Define Transfer as a transfer action with any other reason.

Example Implementation:

  1. Add "Restructure" to Workforce Event Detail domain.

    1. Map it to "Transfer" in the Workforce Event Sub Group domain.

    2. "Transfer" is already mapped to "Transfer" in the Workforce Event Group domain.

  2. Add the following mappings to the domain map Source Workforce Event and Reason -> Workforce Event Detail:

    1. XFR~RESTRUCTURE -> RESTRUCTURE

  3. The remaining definition for transfers is already seeded, so no change is required.

    The resulting domain mapping contains the new rows together with the seeded domain mappings.

Notes

Multiple matches are allowed. For example, a transfer action with reason "Restructure" would match the mapping to either TRANSFER or RESTRUCTURE. The exact match on reason takes precedence over "any" reason, so the result would be RESTRUCTURE.

Example for Fusion

For Fusion there are two domain mappings for determining workforce events. The seeded mapping uses action type only to provide a default workforce event. This may be overridden by the domain mapping that uses a combination of action and action reason.

Example Requirements:

  • Define Restructure as a transfer action with reason "Restructure".

  • Define Transfer as a transfer action with any other reason.

Example Implementation:

  1. Add "Restructure" to Workforce Event Detail domain.

    1. Map it to "Transfer" in the Workforce Event Sub Group domain,

    2. "Transfer" is already mapped to "Transfer" in the Workforce Event Group domain.

  2. Add the following mappings to the domain map Source Workforce Event and Reason -> Workforce Event Detail:

    1. TRANSFER~RESTRUCTURE -> RESTRUCTURE

  3. The remaining definition for transfers is already seeded, so no change is required.

    The resulting domain mapping from action and action reason contains the new rows.

Notes

There are no seeded mappings for the Source Workforce Event and Reason domain mapping. If no match is found for this domain mapping then the default is taken from the Source Workforce Event Type domain mapping.

Dependency

No dependencies.

B.2.168 Integration of Procurement and Spend Analytics with Project Analytics

If you have not implemented Project Applications at the minimum required level in your ERP system, if you have not licensed Oracle Project Analytics, or if you consider that project dimensions are not important for Procurement and Spend Analytics, you should disable project integration with Procurement and Spend Analytics. Otherwise, you can enable the integration.

Some Project Management dimensions are supported in certain Procurement facts to allow, for example, analysis of Procurement facts by the Project and Task dimensions.

By default (that is, on installation), these dimensions are populated in the Procurement and Spend Analytics warehouse, and the foreign keys are resolved in the following facts:

  • Expense Overview (Project Dim, Task Dim, Financial Resource Dim)

  • Spend Invoice Distribution (Project Dim, Task Dim)

  • Purchase Orders (Project Dim, Task Dim)

  • Purchase Requisition (Project Dim, Task Dim)

The following Oracle Procurement and Spend Analytics fact tables integrate with Project Analytics dimensions:

  • W_EXPENSE_F

  • W_AP_INV_DIST_F

  • W_AP_XACT_F

  • W_PURCH_COST_F

  • W_RQSTN_LINE_COST_F

To Enable Project Analytics Integration with Procurement and Spend Subject Areas:

The load plan generator automatically pulls the project-related dimensions and tasks into the load plan when you select Procurement and Spend Analytics fact groups. No extra step is required.

To Disable Project Analytics Integration with Procurement and Spend Subject Areas:

  1. In ODI Designer Navigator, navigate to Load Plans and Generated Scenarios, and open your generated Load plan.

  2. Expand your Load Plan as follows:

    1 SDE Extract, then 2 SDE Dimension Group, then Parallel.


  3. Disable the following dimension groups:

    - SDE Dims PROJECT_DIM

    - SDE Dims PROJRSRC_DIM

    - SDE Dims TASK_DIM

  4. Save the Load Plan.

B.2.169 How to Integrate Procurement and Spend Analytics with Spend Classification

This section contains configuration steps that apply to Oracle Procurement and Spend Analytics when deployed with Oracle Spend Classification. For implementing Oracle Spend Classification and required patches, refer to the Oracle Spend Classification product documentation.

If you are not implementing Oracle Spend Classification, you might choose to remove or hide the Oracle Spend Classification integration metadata from the Presentation layer of the BI repository (for more information about removing Oracle Spend Classification metadata, see Section B.2.144, "How to Remove Spend Classification Integration Metadata").

Note: Oracle Spend Classification is not part of the core Oracle BI Applications product suite, and is not packaged with any module of Oracle BI Applications. It is a separate solution offered by Oracle, and a separate license is required. If you are interested in licensing and implementing Oracle Spend Classification, then contact your Oracle Sales Representative.

B.2.169.1 Overview to Oracle Spend Classification Integration

Oracle Spend Classification is a complementary product that can be used in conjunction with Oracle Procurement and Spend Analytics to improve the accuracy of Spend by converting 'unclassified' Spend into item categories. Oracle Procurement and Spend Analytics is designed to work with or without Oracle Spend Classification.

Typical procurement systems have many PO, invoice, and expense transactions without reference to items and item categories, and in most cases they might have item descriptions in a free-text format. When you implement Oracle Procurement and Spend Analytics, these transactions come into the system as 'Unclassified' because they do not have corresponding items and/or item categories. This issue is more prominent if indirect Spend constitutes a major portion of your organization's Spend.

Oracle Procurement and Spend Analytics is installed with the infrastructure required to feed data from Oracle Business Analytics Warehouse to Oracle Spend Classification, and to feed the classified data back into Oracle Business Analytics Warehouse. This infrastructure is provided as an additional feature for customers who would like to take advantage of both Oracle Procurement and Spend Analytics and Oracle Spend Classification.

If you choose not to use Oracle Spend Classification, Oracle Procurement and Spend Analytics can be deployed as a stand alone solution, and the features of Procurement and Spend Analytics can be deployed without any dependency on Oracle Spend Classification.

B.2.169.2 About the Oracle Spend Classification Metadata

This section describes the Oracle Spend Classification metadata and repository metadata that is available for use with Oracle Spend Classification.

The following facts are integrated with Oracle Data Classification to enrich and automatically assign category codes.

  • W_AP_INV_DIST_F

  • W_PURCH_COST_F

  • W_RQSTN_LINE_COST_F

There are five types of taxonomy supported: UNSPSC, Oracle Purchasing Categories, and three custom categories. The classification results are stored in these columns:

  • AUTO_UNSPSC_WID

  • AUTO_PURCHASING_CATEGORY_WID

  • AUTO_CUSTOM_CATEGORY1_WID

  • AUTO_CUSTOM_CATEGORY2_WID

  • AUTO_CUSTOM_CATEGORY3_WID

In the Analytics metadata repository (RPD), the following is configured by default.

  • UNSPSC, Oracle Purchasing Categories, and Custom Category1 are configured up to the Business Model and Mapping layer. The facts and dimension names are as follows:

    • Fact - Spend and AP Invoice Distribution

    • Fact - Purchasing – Order

    • Fact - Purchasing – Requisition

    • Dim - Auto UNSPSC

    • Dim - Auto Purchasing Category

    • Dim - Auto Custom Category1

  • In the Presentation Layer, 'Procurement and Spend - Invoice Lines' contains the columns for data classification, under the following folders:

    • Data Classification

    • Auto UNSPSC

    • Auto Purchasing Category

    • Auto Custom Category 1

B.2.169.3 How to deploy UNSPSC, Oracle Purchasing Categories, and Custom Category1

Follow these steps if you want to expose UNSPSC, Oracle Purchasing Categories, and Custom Category1 for your Purchase Order and Purchase Requisition Subject Area.

To deploy UNSPCC, Oracle Purchasing Categories, and Custom Category1:

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

    The RPD file is located in the \bifoundation\OracleBIServerComponent\coreapplication_obisn\repository folder.

  2. In the Presentation layer, do the following:

    1. Expand the folder 'Procurement and Spend - Invoice Lines'.

    2. Multi-select the following folders and right click to copy:

      Data Classification

      Auto UNSPSC

      Auto Purchasing Category

      Auto Custom Category 1

    3. To implement Oracle Spend Classification in Purchase Orders, select the folder 'Procurement and Spend - Purchase Orders' and right click to paste in the folders.

    4. To implement Oracle Spend Classification in Purchase Requisitions, select the folder 'Procurement and Spend - Purchase Requisitions' and right click to paste in the selected folders.

    5. Verify the new folders.

    6. If required, re-order the folders as you would like the folders to be displayed to business users in the Presentation Services catalog.

  3. Save and close the repository.

B.2.169.4 How to deploy the additional Custom Category2 and Custom Category3

To deploy Custom Category2 and Custom Category3:

Note: This task uses the Fact_W_AP_INV_DIST_F fact as an example, though you can also apply the steps to deploy other facts.

  1. In Oracle BI EE Administration Tool, edit the BI metadata repository (for example, OracleBIAnalyticsApps.rpd).

    The RPD file is located in the \bifoundation\OracleBIServerComponent\coreapplication_obisn\repository folder.

  2. In the Physical layer, do the following:

    1. Right click on 'Dim_W_PROD_CAT_DH_AUTO_CUSTOM_CATEGORY1' under 'Oracle Data Warehouse' and select Duplicate.

    2. Rename it as 'Dim_W_PROD_CAT_DH_AUTO_CUSTOM_CATEGORY2'.

    3. Join dimension 'Dim_W_PROD_CAT_DH_AUTO_CUSTOM_CATEGORY2' and fact 'Fact_W_AP_INV_DIST_F' using the following condition:

      Dim_W_PROD_CAT_DH_AUTO_CUSTOM_CATEGORY2.ROW_WID = Fact_W_AP_INV_DIST_F.AUTO_CUSTOM_CATEGORY2_WID
      
  3. In the Business Model and Mapping layer, do the following:

    1. Immediately below table 'Dim - Auto Custom Category1', create 'Dim - Auto Custom Category2'.

    2. Immediately below the hierarchy 'Auto Custom Category1', create the hierarchy 'Auto Custom Category2' based on the physical table 'Dim_W_PROD_CAT_DH_AUTO_CUSTOM_CATEGORY2'.

    3. Join 'Dim - Auto Custom Category2' to 'Fact - Spend and AP Invoice Distribution'.

    4. Edit the Logical Table Source Fact_W_AP_INV_DIST_F of 'Fact - Spend and AP Invoice Distribution'. Display the Content tab, and set the level of 'Auto Custom Category2' to 'Custom Hierarchy Base Level'.

  4. In the Presentation layer, do the following:

    1. Create a sub-folder called 'Auto Custom Category 2' in the 'Procurement and Spend - Invoice Lines' folder. Edit folder and add this exact string to the Description box.

      Auto Custom Category2 becomes a sub-folder of Data Classification.

    2. Order this folder so that it is after 'Auto Custom Category 1'.

    3. Drag the 'Dim - Auto Custom Category2' columns from the Business Model and Mapping layer into the 'Auto Custom Category 2' folder in the Presentation layer.

  5. Save and close the repository.

  6. Repeat steps 2 - 5 for Custom Category3.

B.2.170 How to Integrate Project Analytics with Financial Analytics

You can enable Oracle Financial Analytics to use dimension tables in Oracle Project Analytics. You can only perform this integration if you have licensed Oracle Project Analytics. You can configure the following Subject Areas in Oracle Financial Analytics to use Oracle Project Analytics tables:

  • Financials - Payables

  • Financials - Receivables

The following Oracle Financial Analytics fact tables integrate with Project Analytics dimensions:

  • W_AP_XACT_F

  • W_AP_BALANCE_F

  • W_AR_XACT_F

  • W_AR_AGING_INVOICE_A

B.2.171 About Configuring Enterprise Calendars

An Enterprise calendar (or reporting calendar) enables cross Subject Area analysis. Enterprise calendar tables have the W_ENT prefix.

Enterprise calendars can be set to one of the OLTP-sourced fiscal calendars, or to one of the warehouse-generated calendars. This is done by setting the following source system parameters in Oracle BI Applications Configuration Manager:

- GBL_CALENDAR_ID

- GBL_DATASOURCE_NUM_ID

The following sections show how to set up the source system parameters for the Enterprise calendar in different scenarios.

Scenario 1: Using an Oracle EBS fiscal calendar as the Enterprise calendar

Source System Oracle BI Applications Configuration Manager Parameters for Oracle EBS Enterprise Calendars:

GBL_CALENDAR_ID

This parameter is used to select the Enterprise Calendar. It should be the MCAL_CAL_NAME~MCAL_PERIOD_TYPE for Non-Generated Calendars. For example, GBL_CALENDAR_ID will be 'Accounting~41' if the Enterprise Calendar id = 'Accounting' and the calendar period_type = '41'.

Note: MCAL_CAL_NAME and MCAL_PERIOD_TYPE are sourced from PERIOD_SET_NAME and PERIOD_TYPE of the GL_PERIODS table (an Oracle EBS OLTP table). To see a valid list of combinations of MCAL_CAL_NAME~MCAL_PERIOD_TYPE, run the following query in the OLTP:

SELECT DISTINCT PERIOD_SET_NAME || '~' || PERIOD_TYPE FROM GL_PERIODS;

GBL_DATASOURCE_NUM_ID

If the Enterprise Calendar is not a Generated Calendar, this should be the DATASOURCE_NUM_ID of the source system from which the Calendar definition is taken. For example, if you have two data sources, PeopleSoft and Oracle, and the Global Calendar is from the Oracle data source, then this parameter value should specify the Oracle data source. The value of DATASOURCE_NUM_ID is pre-determined for each Oracle EBS version.


To set GBL_CALENDAR_ID and GBL_DATASOURCE_NUM_ID, log in to Oracle BI Applications Configuration Manager and click Manage Data Load Parameters in the navigation bar on the left. On the Manage Data Load Parameters page, type GBL_CALENDAR_ID in the parameter field and choose Code as the parameter type. Then click the Search button; the parameter and its current value are returned (for example, 10000 as the current value of GBL_CALENDAR_ID).


To change the value of GBL_CALENDAR_ID, click its current value; an edit dialog opens.


Provide the desired value in the Parameter Value field (you do not need to include single quotes in the value; for example, use Accounting~41 rather than 'Accounting~41'), and then click Save and Close to save your change. The new value of GBL_CALENDAR_ID is now set.

The procedure for setting GBL_DATASOURCE_NUM_ID is similar: search for the parameter, click its current value when it is returned, change the parameter value in the edit dialog, and save the change.

Note: The available Oracle EBS calendars are also loaded into the OLAP warehouse table W_MCAL_CAL_D. Therefore, they can be viewed by running the following query in DW:

SELECT MCAL_CAL_ID, MCAL_CAL_NAME, MCAL_CAL_CLASS, DATASOURCE_NUM_ID FROM W_MCAL_CAL_D

WHERE DATASOURCE_NUM_ID = <the value corresponding to the EBS version that you use>;

Scenario 2: Using a PeopleSoft fiscal calendar as the Enterprise calendar

Source System Oracle BI Applications Configuration Manager Parameters for PeopleSoft Enterprise Calendars:

GBL_CALENDAR_ID

This parameter is used to select the Enterprise Calendar. It should be the SETID~CALENDAR_ID for Non-Generated Calendars. For example, GBL_CALENDAR_ID will be 'SHARE~01' if the Enterprise Calendar ID = '01' and SETID = 'SHARE'.

Note: SETID and CALENDAR_ID are sourced from the PS_CAL_DEFN_TBL table (a PeopleSoft OLTP table). To see a valid list of combinations of SETID~CALENDAR_ID, run the following query in the OLTP:

SELECT DISTINCT SETID || '~' || CALENDAR_ID FROM PS_CAL_DEFN_TBL;

GBL_DATASOURCE_NUM_ID

If the Global Calendar is not a Generated Calendar, this should be the DATASOURCE_NUM_ID of the source system from which the Calendar definition is taken. For example, if you have two data sources, PeopleSoft and Oracle, and the Global Calendar is from the PeopleSoft source, then this parameter value should specify a PeopleSoft data source. The value of DATASOURCE_NUM_ID is pre-determined for each PeopleSoft version.


These two parameters are set in Oracle BI Applications Configuration Manager using the same steps as for Oracle EBS.

Note: The available PeopleSoft calendars are also loaded into the OLAP warehouse table W_MCAL_CAL_D. Therefore, they can be viewed by running the following query in DW:

SELECT MCAL_CAL_ID, MCAL_CAL_NAME, MCAL_CAL_CLASS, DATASOURCE_NUM_ID FROM W_MCAL_CAL_D

WHERE DATASOURCE_NUM_ID = <the value corresponding to the PeopleSoft version that you use>;

Scenario 3: Using a warehouse generated calendar as the Enterprise calendar

Source System Oracle BI Applications Configuration Manager Parameters for Generated Enterprise Calendars:

GBL_CALENDAR_ID

This parameter should be the CALENDAR_ID of the Generated Calendar (4-4-5 or 13-period calendars). By default, the 4-4-5 calendar has a CALENDAR_ID of '10000', and the 13-period calendar has a CALENDAR_ID of '10001'.

GBL_DATASOURCE_NUM_ID

If the Global Calendar is a Generated Calendar, this should be the DATASOURCE_NUM_ID value of the OLAP (data warehouse). In PS1, the DATASOURCE_NUM_ID for the OLAP is 999.

These two parameters are set in Oracle BI Applications Configuration Manager using the same steps as for Oracle EBS.

Note 1: Customers can generate additional warehouse generated calendars which can be picked as the Enterprise calendar.

Note 2: The available data warehouse calendars are also loaded into the OLAP warehouse table W_MCAL_CAL_D. Therefore, they can be viewed by running the following query in DW:

SELECT MCAL_CAL_ID, MCAL_CAL_NAME, MCAL_CAL_CLASS, DATASOURCE_NUM_ID FROM W_MCAL_CAL_D WHERE DATASOURCE_NUM_ID = 999;

Setting GBL_CALENDAR_ID and GBL_DATASOURCE_NUM_ID in a multi-source ETL

In a multi-source ETL run, multiple calendars from different data sources can be loaded. However, in this case, only one calendar can be chosen as the Global Calendar. For example, if you have two data sources, PeopleSoft and Oracle, then you can choose either a calendar from PeopleSoft or a calendar from Oracle as the Global Calendar, but not both. The two parameters GBL_CALENDAR_ID and GBL_DATASOURCE_NUM_ID should be set in Oracle BI Applications Configuration Manager according to the global calendar that you choose. Never provide more than one value for GBL_CALENDAR_ID or GBL_DATASOURCE_NUM_ID in Oracle BI Applications Configuration Manager; doing so causes the ETL run to fail.

B.2.172 How to Configure the Master Inventory Organization in Product Dimension Extract for Oracle 11i Adapter (Except for GL & HR Modules)

In Oracle 11i applications, products are defined in a Master Organization and then copied into the other Inventory Organizations for transactions. The Product dimension extract mapping 'SDE_ORA_ProductDimension_Derive' has been enabled for configuration of this Master Organization, based on the configuration in the OLTP. By default, the organization ID (which is set by the $$MASTER_ORG parameter) is set to 204. This organization ID needs to be changed based on the individual implementation of the OLTP in your deployment; an example query for identifying the Master Organization in the OLTP is shown below.
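To identify the Master Organization in your Oracle EBS OLTP, you might run a query such as the following. This is only a sketch: it assumes the standard EBS inventory parameters table MTL_PARAMETERS, in which an organization whose ORGANIZATION_ID equals its MASTER_ORGANIZATION_ID is a master organization. Verify the table and columns against your EBS release before relying on the result.

SELECT ORGANIZATION_ID, ORGANIZATION_CODE
FROM MTL_PARAMETERS
WHERE ORGANIZATION_ID = MASTER_ORGANIZATION_ID;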

Note: This ETL implementation supports the best practice prescribed by Oracle of creating a single Master Organization for defining the Product master. This ETL implementation does not support multiple Master Organizations if the same product is defined in more than one Master Organization. You can, however, assign multiple Master Organizations under the same parameter by providing a comma-separated string of organization codes (for example, '204','458').

To configure the Master Inventory Organization in FSM, use the FSM Task named 'Configure Data Load Parameters for Master Organization'.

B.2.173 How to Assign UNSPSC Codes to Products

This section explains how to assign United Nations Standard Products and Services Code (UNSPSC) codes to products and commodities. The United Nations Standard Products and Services Code® (UNSPSC®) provides an open, global multi-sector standard for efficient, accurate classification of products and services.

You can assign UNSPSC codes to your Products by associating PRODUCT with UNSPSC codes in the file_unspsc.csv file; these are then loaded into the W_PROD_CAT_DH table. UNSPSC codes are created automatically by the UNSPSC process in the W_PROD_CAT_DHS flow. This process loads the UNSPSC codes from file_unspsc.csv into the W_PROD_CAT_DHS table.

Note:

The configuration file or files for this task are provided on installation of Oracle BI Applications at one of the following locations:

Source-independent files: <Oracle Home for BI>\biapps\etl\data_files\src_files\.

Source-specific files: <Oracle Home for BI>\biapps\etl\data_files\src_files\<source adaptor>.

Your system administrator will have copied these files to another location and configured ODI connections to read from this location. Work with your system administrator to obtain the files. When configuration is complete, your system administrator will need to copy the configured files to the location from which ODI reads these files.

To Assign UNSPSC Codes to Products

  1. Run a SELECT statement on the W_PRODUCT_D table to get the products used in your deployment.

    For example, you might use Oracle SQLDeveloper to run the following SQL command:

    SELECT INTEGRATION_ID PRODUCT_ID, PRODUCT_NAME, PART_NUMBER, DATASOURCE_NUM_ID
    FROM W_PRODUCT_D;
    

    Note: In the above example SQL statement, INTEGRATION_ID is the product that needs classification. PRODUCT_NAME and PART_NUMBER are additional attributes to assist in assigning the UNSPSC codes.

  2. Add an entry to the file_item_to_unspsc.csv file to associate a PRODUCT to a UNSPSC Code.

    The entry must be in the following format:

    (PRODUCT_ID,PRODUCT_NAME,PART_NUMBER,UNSPSC_CODE,DATASOURCE_NUM_ID)
    

    A list of UNSPSC Codes can be found in file_unspsc.csv. A sample entry is shown after these steps.

    Note: When the file_unspsc.csv is configured with data, the PLP_ItemToUNSPSC_Classification workflow updates the rows in the W_PRODUCT_D table.
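For illustration only, an entry in file_item_to_unspsc.csv using the format above might look like the following line. Every value in this row is hypothetical; use the INTEGRATION_ID, PRODUCT_NAME, PART_NUMBER, and DATASOURCE_NUM_ID values returned by the query in step 1, and a UNSPSC code taken from file_unspsc.csv.

10001,Example Widget,WID-100,43211503,1000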

B.3 Informational Task Reference - Security

This section contains links to security-related Help topics.

B.4 Informational Task Reference - ETL Notes and Overviews

This section contains ETL Notes and Overview Help topics.

B.4.1 Getting Started With Functional Configurations

For information about getting started with functional configuration, see Section 3.2, "Roadmap for Functional Configuration".

B.4.2 ETL Notes and Additional Information for Oracle Project Analytics

List of Functional Areas for this Offering:

  • PROJECT_AN: Revenue

  • PROJECT_AN: Funding

  • PROJECT_AN: Forecast

  • PROJECT_AN: Cross Charge

  • PROJECT_AN: Cost

  • PROJECT_AN: Contract

  • PROJECT_AN: Commitment

  • PROJECT_AN: Budget

  • PROJECT_AN: Billing

B.4.3 ETL Notes and Additional Information for Service Analytics

This task is not required for Service Analytics and you can disregard it.

B.4.4 ETL Notes and Additional Information for Oracle Procurement and Spend Analytics

Procurement and Spend Analytics includes the following fact groups:

  • Employee Expense Functional Area

    • Expense Credit Card

    • Expense Overview

    • Expense Violations

  • Sourcing Functional Area

    • Sourcing Negotiation

    • Sourcing Response

  • Procurement Functional Area

    • Purchase Orders

    • Purchase Cycle

    • Purchase Receipts

    • Purchase Requisition

    • Purchase Agreement

    • Spend Invoice Distribution

    • Procurement Scorecard Target Fact Group

    You should also include the Accounts Payable Functional Area and the AP Transactions and Balance fact group in your implementation, because Procurement and Spend Analytics provides cross-functional analysis with Financial Analytics.

B.4.5 ETL Notes and Additional Information for Supply Chain and Order Management Analytics

Supply Chain and Order Management related functional areas and fact groups:

  • Accounts Receivable

    • AR Transactions and Balance (ARTRANS_FG)

    • Customer Financial Profile Fact (FINPROFL_FG)

  • Costing

    • Item Cost (ITEMCOST_FG)

    • Valuation (VALUATION_FG)

    • Cost of Goods Sold (GLCOGS_FG)

  • Logistics

    • Inventory Balance (INVBAL_FG)

    • Inventory Cycle Count (INVCYCNT_FG)

    • Inventory Transactions (INVTRX_FG)

  • Order Management

    • Order Backlog (OMBACKLOG_FG)

    • Order Booking (OMBOOKING_FG)

    • Order Customer Status History (OMCUSTSTATHIST_FG)

    • Order Cycle (OMCYCLE_FG)

    • Order Shipping (OMDELIVERY_FG)

    • Order Orchestration Process (OMDOOPRCSS_FG)

    • Order Invoice Credit (OMINVOICECREDIT_FG)

    • Order Invoice (OMINVOICE_FG)

    • Order Credit (OMORDERCREDIT_FG)

    • Order Fulfillment (OMORDERFULFILL_FG)

    • Order Hold (OMORDERHOLD_FG)

    • Order Scheduling (OMSCHEDULE_FG)

  • Profitability

    • Customer Expenses (CUSTEXP_FG)

    • Cost of Goods Sold (GLCOGS_FG)

    • GL Revenue (GLREVN_FG)

    • Product Expenses (PRODEXP_FG)

  • Supply Chain

    • BOM Item Fact (BOMITEM_FG)

B.4.6 ETL Notes and Additional Information for Project Analytics

No additional information is provided for this Offering.

B.4.7 ETL Notes and Additional Information for Sales Analytics

List of Functional Areas and Fact Groups under the "Oracle Sales Analytics" offering:

Offering

--> Oracle Sales Analytics (SALES_AN_OFRNG)

Functional Area

--> Asset (ASSET_FA)

Fact Group

--> Asset (ASSET_FG)

--> Customer Interactions Management (CUSTINTMGMT_FA)

Fact Group

--> Interactions Coverage (INTCTNCVRG_FG)

--> Interactions (INTERACTIONS_FG)

--> Marketing Leads (LEADS_FA)

Fact Group

--> Interactions (INTERACTIONS_FG)

--> Marketing Lead (MKTGLEAD_FG)

--> Opportunity and Revenue Management (OPTYREVNMGMT_FA)

Fact Group

--> Interactions (INTERACTIONS_FG)

--> Opportunity Revenue (OPTYREVN_FG)

--> Opportunity and Revenue Management for Segmentation (OPTYREVNMGMTSEG_FA)

Fact Group

--> Opportunity Revenue Segmentation (OPTYSEG_FG)

--> Order CRM (ORDRCRM_FA)

Fact Group

--> CRM Order (ORDER_FG)

--> Quota Management (QUOTAMGMT_FA)

Fact Group

--> Resource Quota (RESOURCEQUOTA_FG)

--> Territory Quota (TERRQUOTA_FG)

--> Quote CRM (QTECRM_FA)

Fact Group

--> CRM Quote (QUOTE_FG)

--> Sales Account (SALESACCNT_FA)

Fact Group

--> Sales Account (SALESACCNT_FG)

--> Sales Forecasting Management (SALESFCSTMGMT_FA)

Fact Group

--> Sales Forecast (SALESFCST_FG)

--> Siebel Sales Forecast (SIEBELSALESFCST_FG)

--> Sales Prediction Engine (SPE_FA)

Fact Group

--> Agreement Contract Item Fact (AGREE_FG)

--> Asset (ASSET_FG)

--> Campaign History (CAMPHIST_FG)

--> Marketing Lead (MKTGLEAD_FG)

--> Opportunity Revenue (OPTYREVN_FG)

--> CRM Order (ORDER_FG)

--> Response (RESPONSE_FG)

--> Service Request (SRVREQ_FG)

--> Service Agreement (AGREE_FA)

Fact Group

--> Agreement Invoice Line Fact Group (AGREEINVCLINE_FG)

--> Agreement Contract Item Fact (AGREE_FG)

--> Invoice Fact Group (INVOICE_FG)

--> Service Request (SRVREQ_FA)

Fact Group

--> Activity Fact Group (ACTIVITY_FG)

--> Service Request (SRVREQ_FG)

--> Survey (SURVEY_FG)

--> Territory Management (TERRMGMT_FA)

Fact Group

--> Marketing Lead (MKTGLEAD_FG)

--> Opportunity Revenue (OPTYREVN_FG)

--> Sales Account (SALESACCNT_FG)

--> Sales Forecast (SALESFCST_FG)

--> Usage Accelerator (USGACC_FA)

Fact Group

--> Customer Data Completeness (CUSTDTCMP_FG)

--> Interactions (INTERACTIONS_FG)

--> Party Person Fact (PARTYPERSON_FG)

B.4.8 ETL Notes and Additional Information for Product Information Management Analytics

No additional information is provided for this Offering.

B.4.9 ETL Notes and Additional Information for Partner Analytics

List of Functional Areas and Fact Groups under the "Oracle Partner Analytics" offering:

Offering

--> Oracle Partner Analytics (PARTNER_AN_OFRNG)

Functional Area

--> Partner Deals (DEALS_FA)

Fact Group

--> Marketing Lead (MKTGLEAD_FG)

--> Opportunity and Revenue Management (OPTYREVNMGMT_FA)

Fact Group

--> Interactions (INTERACTIONS_FG)

--> Opportunity Revenue (OPTYREVN_FG)

--> Partner Performance (PARTPERF_FA)

Fact Group

--> Partner Program Measure Fact Group (PRMPROGMSR_FG)

--> Partner Programs (PARTPROG_FA)

Fact Group

--> Partner Enrollment Fact Group (PRMENROLL_FG)

--> Partner Presence Fact Group (PRMPRES_FG)

--> Service Request (SRVREQ_FA)

Fact Group

--> Activity Fact Group (ACTIVITY_FG)

--> Service Request (SRVREQ_FG)

--> Survey (SURVEY_FG)

B.4.10 ETL Notes and Additional Information for Human Resources Analytics

List of Functional Areas and Fact Groups under the Oracle Human Resources Analytics offering:

Offering

--> Oracle Human Resources Analytics (HR_AN_OFRNG)

Functional Area

--> Absence & Accrual (ABSACCRUAL_FA)

Fact Group

--> Absence Event (ABSEVT_FG)

--> Accrual Transaction (ACCRUALTRANS_FG)

--> General Ledger (GENLDGR_FA)

Fact Group

--> GL Budget (BUDGET_FG)

--> GL Balance (GLBAL_FG)

--> GL Journals (GLJRNLS_FG)

--> Learning (LEARNING_FA)

Fact Group

--> Learning Enrollment (LMENROLL_FG)

--> Payroll (PAYROLL_FA)

Fact Group

--> Payroll Balance (PAYROLLBAL_FG)

--> Recruitment (RCRTMNT_FA)

Fact Group

--> Recruitment (RCRTMNT_FG)

--> Time and Labor (TIMELABOR_FA)

Fact Group

--> Time and Labor - Processed / Payable Time (TLPRCSD_FG)

--> Time and Labor - Reported Time (TLRPTD_FG)

--> Workforce Deployment (WRKFCDEPLOY_FA)

Fact Group

--> Workforce Event (WRKFRCEVT_FG)

--> Workforce Effectiveness (WRKFCEFFECT_FA)

Fact Group

--> GL Journals (GLJRNLS_FG)

--> Workforce Event (WRKFRCEVT_FG)

B.4.11 ETL Notes and Additional Information for Marketing Analytics

List of Functional Areas and Fact Groups under the Oracle Marketing Analytics offering:

Offering

--> Oracle Marketing Analytics (MARKETING_AN_OFRNG)

Functional Area

--> Core Marketing (COREMKTG_FA)

Fact Group

--> Campaign History (CAMPHIST_FG)

--> Campaign Opportunity (CAMPOPTY_FG)

--> Household Fact (HOUSEHLD_FG)

--> Interactions (INTERACTIONS_FG)

--> KPI (KPI_FG)

--> Offer Product (OFFRPROD_FG)

--> Party Person Fact (PARTYPERSON_FG)

--> Response (RESPONSE_FG)

--> Marketing Leads (LEADS_FA)

Fact Group

--> Interactions (INTERACTIONS_FG)

--> Marketing Lead (MKTGLEAD_FG)

--> Marketing Plan (MKTGPLAN_FA)

Fact Group

--> Marketing Cost (MKTGCOST_FG)

--> Marketing Goal (MKTGGOAL_FG)

--> Opportunity Landscape (OPTYLANDSCAPE_FA)

Fact Group

--> Customer Purchase (CUSTPURCH_FG)

--> Marketing Lead (MKTGLEAD_FG)

--> Sales Account (SALESACCNT_FG)

--> Opportunity and Revenue Management (OPTYREVNMGMT_FA)

Fact Group

--> Interactions (INTERACTIONS_FG)

--> Opportunity Revenue (OPTYREVN_FG)

--> Order CRM (ORDRCRM_FA)

Fact Group

--> CRM Order (ORDER_FG)

--> Quote CRM (QTECRM_FA)

Fact Group

--> CRM Quote (QUOTE_FG)

--> Service Request (SRVREQ_FA)

Fact Group

--> Activity Fact Group (ACTIVITY_FG)

--> Service Request (SRVREQ_FG)

--> Survey (SURVEY_FG)

B.4.12 ETL Notes and Additional Information for Financial Analytics

List of Functional Areas and Fact Groups for the Oracle Financial Analytics Offering (FIN_AN_OFRNG):

Functional Areas:

--> Account Payables (ACNTPAY_FA)

Fact Group

--> AP Holds (APHOLDS_FG)

--> AP Transactions and Balance (APTRANS_FG)

--> Account Receivables (ACNTREC_FA)

Fact Group

--> AR Transactions and Balance (ARTRANS_FG)

--> Budgetary Control (BUDCNTRL_FA)

Fact Group

--> GL Budget (BUDGET_FG)

--> GL Balance (GLBAL_FG)

--> GL Journals (GLJRNLS_FG)

--> Employee Expense (EMPEXP_FA)

Fact Group

--> Expense Credit Card (EXPCRDTCRD_FG)

--> Expense Overview (EXPOVERVIEW_FG)

--> Expense Violations (EXPVIOLATION_FG)

--> Federal Financials (FEDFIN_FA)

Fact Group

--> GL Budget (BUDGET_FG)

--> GL Balance (GLBAL_FG)

--> GL Journals (GLJRNLS_FG)

--> Fixed Asset (FIXEDASSET_FA)

Fact Group

--> Fixed Asset Balance (ASTBAL_FG)

--> Fixed Asset Transactions (ASTXACT_FG)

--> General Ledger (GENLDGR_FA)

Fact Group

--> GL Budget (BUDGET_FG)

--> GL Balance (GLBAL_FG)

--> GL Journals (GLJRNLS_FG)

--> Profitability (PROFIT_FA)

Fact Group

--> Customer Expenses (CUSTEXP_FG)

--> Cost of Goods Sold (GLCOGS_FG)

--> GL Revenue (GLREVN_FG)

--> Product Expenses (PRODEXP_FG)

B.4.13 ETL Notes and Additional Information for Customer Data Management Analytics

No additional information is provided for this Offering.

B.4.14 Overview of Student Information Analytics

Student Information Analytics (SIA) in BI Applications captures detailed student, instructor, and institution related information in a single environment. This enables institutions to analyze recruiting, admissions, student records, and student financials data, and supports strategic decisions that maximize an institution's student recruiting efforts, improve retention rates, identify successful and unsuccessful courses and programs, analyze faculty workloads, and manage and track student financials.

Student Information Analytics comprises the following three content-specific data marts, which constitute a comprehensive, integrated analytic platform.

B.4.14.1 Admissions & Recruiting Overview

The analytics around student recruiting provide information about prospects and recruiters for an institution and the institution's success in converting prospects to applicants and enrollees, with a focus on the students' academic career, academic program, admit term, and recruiting status.

The analytics around student admissions provide information about applicants to the institution and their eventual records at the institution, with a focus on the students' academic career, academic program, admit term, and program status. Admissions & Recruiting contains the following subject areas:

  • Student Recruiting: Contains data on prospective students, associated academic institutions, academic careers, academic programs, plans, and sub-plans.

  • Student Admission Application: Contains current information about student applications and provides information on applicants and their applications, and the associated academic institutions, academic careers, programs, plans, and sub-plans.

  • Student Admission Applications Status: Contains data on application status changes and their effective dates. Each status change is represented as a row in the table.

  • External Test Scores: Contains data on the external test scores of prospects and applicants.

  • External Academic Summary: Contains data on the external academic information of prospects and applicants.

  • Application Evaluation: Contains information about the application evaluations that are used to evaluate applicants on specific criteria for the academic career and program to which they are applying.

  • Admissions Funnel: Enables funnel report analysis by applicant type, academic level, academic load, and last school attended, as well as by term, institution, career, program, campus, and so forth.

  • Student Response: Contains information about student responses to the institution and enables analysis such as why students chose not to accept admission to the institution.

  • Transfer Credit: Contains information about students' credits. A student can have course credit, test credit, or other credits.

B.4.14.2 Student Records Overview

Student Records helps to manage all aspects of enrollment, including catalog and class schedule maintenance, transfer credits, class start and end dates, wait lists, academic programs, transcripts, and analysis. Analytics around Student Records help institutions to review and manage items such as enrollment and registration metrics, counts of students and faculty by registered courses, available courses, graduation rates, student career, student academic standing, and faculty and student profiles.

Student Records currently contains the following subject areas:

  • Academic Plan Summary - Contains summary information about individual students for a given academic plan and the related academic program.

  • Academic Program Detail - Contains details on individual program actions for every student for a given academic program.

  • Academic Class - Contains information about classes, courses offered, course components, and so on.

  • Class Enrollment - Contains details on an individual student's enrollment in a given class for a given term.

  • Class Instructor - Contains details on individual classes scheduled for a given term.

  • Class Meeting Pattern - Contains metrics such as average class size by subject area or department, average class size by meeting pattern, classroom utilization by times of day and days of the week, class utilization, and grade distribution.

  • Enrollment Requests - Provides analysis of student enrollment requests. It allows managers to analyze metrics on enrollment requests by student, academic career, institution, term, status, action, reason, and so on.

  • Degrees and Honors - Provides analysis of student degrees as well as related honors. It allows managers to analyze metrics on degrees and honors conferred by student, term, institution, academic plan, academic sub-plan, and so on.

  • Term Enrollment - Contains details on an individual student's enrollment for a given term.

  • Institution Summary - Contains details on student headcount, graduation count, and retention count for every academic program, admit term, admit type, student gender, ethnicity, and student cohort.

B.4.14.3 Student Financials Overview

Student Financial Services is used by institutions to manage student receivables, billing, and collections. The subject areas in SIA link student financial information to dimensions such as Student and Account Category, enabling the analysis of student financial information from various viewpoints.

Analytics around Student Financials help institutions to review and manage student financial transactions and monitor outstanding amounts. It allows institutions to analyze student financial information at the transaction level of detail.

The Student Financials module currently contains the following subject areas:

  • Payment Details - Information about payments at a detail level. It allows managers to analyze payments by business unit, payment method, item type, term, academic year, etc.

  • Payment and Charges Cross Reference - Provides information on the payments applied to charges within student financials. It allows managers to analyze payments applied to charges by business unit, item type, account type, term, academic year, etc.

  • Transaction Details - Information about student financial transactions at a detail level. It allows managers to analyze line items by business unit, item type, account type, term, academic year, etc.

  • Credit History - Information on student and external organization accounts by aging set and aging category. It allows student financial managers to analyze credit history trends for a given business unit, account type, student, external org, etc.

B.4.15 Overview of Oracle Human Resources Analytics

Oracle Human Resources Analytics integrates workforce information from different HR functions and Finance. It provides executives, HR managers, and front-line managers with an interactive tool to analyze workforce staffing, employee performance, and payroll cost, to better design compensation that rewards performance, to improve employee retention and reduce absence costs, and to better source high-quality applicants.

The Oracle HR Analytics application has the following functional areas:

Workforce Effectiveness

Workforce Effectiveness combines key HR metrics with the organization's financial data. It allows senior executives to monitor key HR effectiveness metrics at the enterprise level. The correlation of workforce and financial metrics provides insight into how workforce trends directly impact the organization's operations and financial strength.

Workforce Effectiveness delivers the following sample metrics:

  • Contracting Expense

  • Contribution per Headcount

  • Return on Human Capital

  • Revenue per Headcount

  • Workforce Cost

Workforce Deployment

The Workforce Deployment subject area is the foundation for workforce analysis. It provides comprehensive core workforce information to support analysis of headcount, retention, workforce diversity, employee performance, and contingent labor utilization. Key workforce deployment information such as employee, organization, supervisor, performance band, and service band is shared with other HR functional areas. Sensitive personal attributes such as birth date, age, and marital status are organized in a separate folder to allow for restricted access.

Configurable HR event analysis is another key feature of the Workforce Deployment subject area. You can configure various employee assignment actions to support analysis of voluntary and involuntary terminations, hires, transfers, promotions, layoffs, and so on. In addition, changes in an employee's job, organization, location, supervisor, and salary are tracked to support workforce movement analysis.

Workforce Deployment supports the following types of analysis:

  • Headcount analysis

  • Workforce diversity

  • Employee attrition and retention

  • Employee performance

  • Span of control

  • Internal mobility

Workforce Gains and Losses

Understanding headcount movement is essential for managing headcount budget and talent movement. A job transfer can result in a gain, a loss, or no headcount change as you traverse the reporting hierarchy. The Workforce Gains and Losses subject area allows you to analyze the effect of assignment changes (for example, hires, transfers in, transfers out, reorganizations, and terminations) on headcount movement by supervisor hierarchy.

Workforce Gain and Loss delivers the following metrics:

  • Headcount gain from Hire, Reorganization, Global Transfer, Transfer, Supervisor Change

  • Headcount loss from Transfer, Reorganization, Global Transfer, Supervisor Change, Termination

Compensation

The Compensation subject area provides the information that compensation managers and line managers need to manage compensation costs and evaluate the effectiveness of the compensation plan. The delivered compensation metrics allow you to correlate employee pay with performance and to perform compensation parity analyses at different levels by job, grade, and tenure. It proactively detects over- or under-compensated employees, which can have a big impact on your company's ability to maintain a competitive edge.

Compensation subject area supports the following analysis:

  • Salary trend and percentile analysis

  • Salary compression between grades, jobs, and experienced workers

  • Salary and employee performance

  • Compa-ratio band analysis

Absence and Leave Accrual

Absence and Leave Accrual analyzes employee absence trends and leave accrual balances. Absenteeism impedes workforce productivity and increases workforce cost. This subject area analyzes planned and unplanned absence trends and employee working days lost, and identifies employees with frequent unplanned absences, to help reduce absenteeism cost. Absence and Leave Accrual also allows managers to view employees' accrual balances and accrual liability.

Absence and Leave Accrual supports the following types of analysis:

  • Working days lost and Absence Trend

  • Employee absence calendar

  • Bradford Score

  • Accrual balance and liability by organization

Payroll

The Payroll subject area captures transactional payroll details as well as aggregated payroll balances. With the Payroll subject area, you can analyze all earning, deduction, tax, and special accumulator balances from Payroll. It allows you to map source payroll balances to summary measures such as base pay, variable pay, total employer-paid health care cost, and total employer-paid taxes, to facilitate total compensation or workforce cost analysis. The Payroll subject area is extensible: it delivers extension fields that allow customers to map additional balance measures that are not available in the predefined payroll balances.

All payroll measures are available for time trend analysis, for example, Year, Quarter, Month, and Year Ago.

Payroll subject area supports the following types of analysis:

  • Payroll balances

  • Payroll earning and deduction balances YTD, QTD, MTD, YAGO

  • Payroll cost trend

  • Overtime spend and Payroll labor hours

Recruitment

The Recruitment functional area provides executives, recruiting managers, and line managers with intelligence for assessing the efficiency and effectiveness of the recruitment process in sourcing and attracting quality new hires. It delivers over 100 metrics to monitor the entire recruitment life cycle.

Recruitment subject area supports the following types of recruitment analysis:

  • Job vacancy analysis

  • Recruitment events analysis

  • Quality of hire

  • Source of hire

  • Applicant pool analysis

  • Referral analysis

Learning

Learning is a key component of Talent Management. The Learning functional area focuses on the analysis of course offerings, delivery methods, course utilization, and learner enrollment and completion. By combining learning and workforce metrics, the Learning functional area provides critical insight into the effectiveness of learning delivery and how learning contributes to workforce development and employee performance.

Learning subject area supports the following types of analysis:

  • Employee course enrollment and completion rate

  • Learning hours delivered

  • Top 10 course enrollments

  • Course enrollment wait time

  • Learning scores

Time and Labor

The Time and Labor subject area analyzes late timecard submission, reported time, and processed or payable time. Time and Labor stores timecard transaction details from the source time tracking systems. With a configurable time entry category, you can analyze productive versus non-productive hours, overtime trends, estimated cost for reported time, and variance between reported and processed time. Time and Labor also supports project time reporting by analyzing reported and processed time by project and task.

Time and Labor subject area supports the following types of analysis:

  • Reported time by time category and submission status

  • Productive vs. non-productive time

  • Late Timecards and timecard aging

  • Processed or payable time by processing status

  • Estimated cost for reported and processed time

  • Project labor hours

B.4.16 Overview of Oracle Price Analytics

Oracle Price Analytics is aimed at pricing analysis, sales operations, product marketing and management, and finance. It provides pricing analytics across the full price waterfall of contracts, quotes, orders and competitor pricing, allowing business users to do the following:

  • Identify the true realized price, as well as understand how price is diluted at appropriate summary and transactional levels.

  • Monitor price and discounting trends across segment, strategy, channel, geography and time.

  • Understand price and margin variation within the same products or customers through price band analysis.

  • Look for 'outliers' in customer and product profitability to determine the root cause as to why some groups perform better than others and to identify surgical pricing policy and deal opportunities.

  • Combine insights from historical data with Oracle data mining and predictive technologies to improve forward-looking decisions.

  • Break down price elements at the transaction level and identify the true realized price and margin through detailed waterfall analysis.

  • Highlight discounts that are disproportionate to volume and determine which regions, products or customers are responsible.

The following sources can populate pricing data:

  • Siebel CRM 8.1.1.7 and 8.2.2

  • E-Business Suite 12.1.3

  • Universal source

B.4.17 Overview of Service Analytics

No Help topic is available for this FSM Task.

B.4.18 Overview of Oracle Manufacturing Analytics

Oracle Manufacturing Analytics, part of the Oracle Business Intelligence Applications product line, delivers deep insight into manufacturing execution, production costing, inventory builds, and production quality. It helps manufacturing executives, cost accountants and production and operations managers track performance indicators and key trends in manufacturing execution. Manufacturing analytics, coupled with tight integration to other Oracle Business Intelligence Applications analytics offerings, helps organizations make informed decisions to monitor and evaluate supply chain execution and effectiveness.

Oracle Manufacturing Analytics provides analytics support for the following content areas:

Manufacturing Execution

Manufacturing execution captures the work order transactions on the production shop floor to provide summary level performance indicators with drill down to granular work order details. The analytics support includes:

  • Production completions into Inventory.

  • Work order analysis, which includes production schedule adherence, production quality, work order cycle times, work order aging etc.

  • Material issues to production and usage variance analysis.

  • Resource usage variance against standards as well as resource utilization of standard capacity.

  • Kanban card activity and kanban replenishment analysis for lean manufacturing.

Production Costing

This content area provides Production cost accountants with insight into production costs and cost variances. Production costing captures the actual production cost of work order completions with cost element break down by resource, material, overhead etc. In addition to actual cost trending and exception tracking, this content area also helps standard costing organizations track production cost variances by cost elements against pre-defined standard costs for a given item. This variance analysis can be performed at a summary level by GL variance accounts with the ability to drill down to work order and cost element details to track exceptions.

Production Quality

This content area captures quality testing information. It is an attribute-rich content area that captures quality test results for a given quality plan and all the associated collection elements. These test results can be analyzed against predefined thresholds to identify exceptions, along with the nonconformance and disposition details.

The ability to couple actual work order quality measures, such as scrap, rework, and first-pass yield, with quality plans and their test results provides extensive capabilities to monitor and track production quality performance.

Production to Plan

This content area provides the capability to analyze different production plans sourced from ASCP and/or MPS, with the ability to pre-configure a specific plan as the baseline plan. This baseline plan is then compared against actual production completions to better understand production adherence to plan, production attainment to-date, deviations from the baseline plan etc. It also helps compare a given plan against the baseline plan to review deviations in the recommendations of a particular plan relative to the baseline plan.

Inventory

In addition to production performance on the shop floor, there is analytics support for analyzing inventory details to provide a holistic picture of the total supply available in a production plant as well as across all the inventory organizations. The analytics support related to Inventory includes:

  • Inventory Balances: This content area captures daily and monthly snapshots of Inventory balances by product and by lots, if lot control is turned on. It helps analyze trends in Inventory balances across the different Inventory organizations. It also supports aggregate industry standard metrics like Inventory Turn and Days Inventory Outstanding, which are calculated metrics to help provide supply chain executives with KPIs to track supply performance across the organization.

  • Inventory Transactions: This content area captures all the Inventory transactions, including inter-organization transfers, customer and supplier returns, material issues and returns from shop floor, work order completions into inventory etc. It helps analyze the trends by transaction types to better understand the inventory movement patterns in an organization.

  • Inventory Bill of Materials: This content area helps review a flattened bill of material for a given product to better understand all the components and raw materials consumed, collectively across all the levels as well as individually at each stage in the bill of material. This information, based on effectivity dates, helps pull up all the components and materials associated with a production work order and the associated inventory levels for those components and materials.

  • Inventory Aging: This content area provides the ability to track Inventory age, based on receipt into inventory and categorize the Inventory into aging buckets. With shelf life and/or lot expiration dates, this content area also helps classify expiring inventory into different buckets to monitor and expedite.

B.4.19 Overview of Supply Chain and Order Management Analytics

The Oracle Supply Chain and Order Management Analytics application for Fusion Applications allows you to analyze:

  • Bookings

  • Financial and Operational Backlogs

  • Invoices

  • The movement of sales orders through different stages of the sales cycle

  • Orchestration orders analysis

  • Order Hold analysis

  • Inventory held by an organization

  • Inventory movements in, out, and through manufacturing plants, distribution centers, or storage locations

  • Inventory Valuation

  • Inventory cycle count with Hit or Miss and Exact Match analysis

  • Product Information Management covering analytics for Item, Item-Batch and Item-Catalog attributes

  • Product Information analytics to support New Item Requests and Change Order processes in Fusion

The Oracle Supply Chain and Order Management Analytics application consists of orders, invoices, order orchestration, backlogs, inventory, logistics, and product information management. Sales orders are the entry point for the sales process. Invoices are the exit point from the fulfillment process. Backlogs are points of congestion in your fulfillment process. This coverage includes insight into orchestration orders and process durations, and into which items are booked, backlogged, and invoiced, allowing you to evaluate the sales performance of individual sales representatives or departments. The Oracle Supply Chain and Order Management Analytics application also provides you with information on inventory transactions, inventory balances, and customer and supplier returns. This enables companies to monitor inventory level trends against sales performance to reduce cost exposure, increase turnover through inventory level reduction and increased velocity, deploy inventory at the right place at the right time, and better understand customer and supplier returns to maintain quality.

In addition to the above, Oracle Supply Chain and Order Management Analytics has new content for the Fusion Applications source that includes new subject areas in Costing, Distributed Order Orchestration, Logistics, and Product Information Management.

B.4.20 Overview of Marketing Analytics

Oracle Marketing Analytics is a comprehensive analytical solution that provides timely fact-based insight into the marketing activities of the entire organization. It provides new levels of information richness, usability, and reach to marketing professionals throughout the enterprise. It provides actionable intelligence in the following Marketing areas: Marketing Effectiveness, Customer Insight, and Lead Analysis.

The main functional areas within Marketing Analytics are:

  • Core Marketing - Helps to analyze customer and prospect responses to campaigns, marketing activities and marketing offers.

  • Marketing Leads - Helps to perform detailed analysis on leads as they move through the lifecycle, and on the lead interactions that a company has had with its customers and prospects. Analysis includes lead-to-opportunity conversion, what percentage of leads are rejected or retired by the sales team and the main reasons why, how effective the sales force is in converting leads, and so on.

  • Marketing Planning - Helps to analyze marketing planning related information including marketing goals and marketing cost analysis.

  • Opportunity Revenue Management - Helps to analyze the opportunity revenue generated from marketing activities, helping marketers to calculate the Return on Marketing Investment (ROMI).

  • Order CRM - Helps to analyze order revenue generated from marketing activities, helping marketers to calculate the return on Marketing Investment.

  • Quote CRM - Helps to analyze quote revenue generated from marketing activities, helping marketers to calculate the return on Marketing Investment.

  • Service Request - Helps to analyze the various marketing activities a company has had with its customers and prospects.

For a complete end-to-end analysis of marketing campaigns and other activities, you must implement all of the above functional areas.

Opportunity Landscape - Opportunity Landscape is a functional area included within Marketing Analytics, but it is not necessary for Marketing Analytics to function. This module provides analysis for the Fusion Opportunity Landscape application. Refer to the product documentation for Fusion Opportunity Landscape for more details.

B.4.21 Overview of Customer Data Management Analytics

Fusion Customer Data Management Analytics provides insight into the data quality of an organization's customer data. This solution provides a set of data completeness analyses that allow you to monitor, measure, and analyze the completeness of the underlying party information of your enterprise, including organization and person information.

B.4.22 Overview of Project Resource Management Analytics for PeopleSoft

This section provides an overview of Project Resource Management Analytics.

For generic information about using Project Resource Management Analytics with PeopleSoft or E-Business Suite, see Section B.4.22.1, "About Project Resource Management Analytics for PeopleSoft and E-Business Suite".

For information about using Project Resource Management Analytics with PeopleSoft, see Section B.4.22.2, "Notes on Project Resource Management Analytics for PeopleSoft".

Note: For information about using Project Resource Management Analytics with E-Business Suite, see Section B.4.23, "Overview of Project Resource Management Analytics for E-Business Suite".

B.4.22.1 About Project Resource Management Analytics for PeopleSoft and E-Business Suite

This section contains information that applies to Project Resource Management Analytics with PeopleSoft and E-Business Suite.

Overview

With Oracle Business Intelligence Applications release 11.1.1.7.1, Project Analytics introduces a new subject area to analyze Project Resource Management.

This release supports Oracle E-Business Suite 11.5.10 and R12x, and PeopleSoft 9x.

The new subject area comes with over 230 metrics and the catalog includes five new dashboard pages and over 40 new reports. For details, refer to the Oracle BI Applications Metrics Guide.

The following are the areas of analysis that this subject area supports and the new fact tables introduced in the Oracle Business Analytics Warehouse.

Project Requirements

Table W_PROJ_RSRC_RQRMNT_F stores the details about project requirements, including metrics such as the time and number of resources requested and unfulfilled by project requirement. The fact table stores the requirement at the grain of the requirement date range. Requirements are captured for a range of days in the ERP, but to provide the ability to compare requirements to capacity at a daily grain, the requirement hours are distributed linearly among the days within that range in the metadata repository, as sketched below.
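The following is a minimal sketch of the linear spread described above. The column names REQUESTED_HOURS, START_DATE, and END_DATE are hypothetical placeholders; the actual spread is performed in the metadata repository rather than in warehouse SQL.

SELECT INTEGRATION_ID,
       REQUESTED_HOURS / (END_DATE - START_DATE + 1) AS HOURS_PER_DAY  -- even spread across the requirement date range
FROM W_PROJ_RSRC_RQRMNT_F;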

Resource Utilization

Table W_PROJ_RSRC_UTILIZATION_F stores the details about resource assignments for each assigned day, as well as resource capacity for each valid business day for each resource. This fact includes metrics such as capacity, scheduled, available, and unassigned time. Actual utilization is also supported via the Project Cost fact.

Competencies and Jobs

Table W_EMP_JOB_F stores the details about an employee's primary job. Table W_EMP_COMPETENCY_F stores the details about an employee's competencies. The RPD introduces a new logical table source that uses these two facts to provide the ability to compare supply versus demand of jobs and competencies. This fact includes metrics such as # of Employees and # of Employees MAGO.

Resource Availability

This star provides the ability to find resources available for a predetermined number of consecutive business days. There is no physical table for this fact in the warehouse; it is an opaque view in the metadata repository (RPD).

This opaque view is a SELECT query that uses the employee's holiday information and assignment information to calculate availability. This fact includes metrics such as Available Resource Count for Bucket1.

This new subject area requires the parameters listed below to be set at implementation time. These FSM parameters are used by both the EBS and PeopleSoft adaptors and must be configured in FSM. A sketch of how the unit-of-measure parameters interact follows the list.

  • Project Resource Management Capacity Records Creation Period: This parameter determines for how long (in months) the capacity records are created. The default value of this parameter is 12.

  • Project Availability Bucket Size: This parameter is used to specify the number of consecutive available business days used in the search for resources. The default value for this parameter is 5.

  • Project Resource Management UOM: The RPD variable PROJ_RSRC_MNGMT_UOM specifies the unit of measure used for reporting. This can be HOURS, DAYS, or FTE. The default value for this parameter is 'HOURS'.

  • Project Resource Management Value of Unit of Measure in Days: Value of Unit of Measure in Days expressed in hours. This specifies the number of hours in a business day. The default value for this parameter is 8.

  • Project Resource Management Value of Unit of Measure in FTE: Value of Unit of Measure in FTE - full time equivalent weekly hours. The default value for this parameter is 40.

  • Project Resource Management Value of Unit of Measure in Hours: The factor applied when metrics are reported in hours. By default, all metrics are displayed in hours. The default value for this parameter is 1.
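As a hedged illustration of how the unit-of-measure parameters interact, the conversion below expresses capacity captured in hours in each reporting unit, using the default values (8 hours per business day, 40 hours per FTE week). The table and column names are hypothetical and for illustration only; the actual conversion is driven by the PROJ_RSRC_MNGMT_UOM variable in the RPD.

SELECT CAPACITY_HOURS,
       CAPACITY_HOURS / 8  AS CAPACITY_DAYS,  -- Value of Unit of Measure in Days = 8
       CAPACITY_HOURS / 40 AS CAPACITY_FTE    -- Value of Unit of Measure in FTE = 40
FROM RESOURCE_CAPACITY;  -- hypothetical table name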

B.4.22.2 Notes on Project Resource Management Analytics for PeopleSoft

This section provides information that is specific to Project Resource Management Analytics for PeopleSoft.

  • Task Types defined in the Resource Management source application need to be mapped to the Work Type dimension. A configuration point in FSM is provided to specify whether a task type is billable, capitalizable, or used for training. It also specifies the weight given to the time of each type in the calculation of utilization percentages.

  • The Expenditure Type, Expenditure Class, GL Accounting Date, GL Accounting Date Fiscal Calendar, and Projects Calendar dimensions are not supported for this adapter.

  • PeopleSoft Resource Management does not store the capacity of a resource. The capacity is calculated in ETL only for working days.

  • Only those employees who are eligible to be staffed on projects are considered when doing these calculations.

B.4.23 Overview of Project Resource Management Analytics for E-Business Suite

This section contains information that is specific to using Project Resource Management Analytics with E-Business Suite.

For generic information about using Project Resource Management Analytics with PeopleSoft or E-Business Suite, see Section B.4.22.1, "About Project Resource Management Analytics for PeopleSoft and E-Business Suite".

Notes on the E-Business Suite Adapter

Oracle E-Business Suite Resource Management creates the capacity of a resource except in the case where the resource has never been assigned to a project. To ensure uniform reports for all employees, the capacity records of unassigned employees are created during the ETL process and loaded into the Oracle Business Analytics Warehouse. The duration for which the capacity records are created is controlled by the FSM parameter 'Project Resource Management Capacity Records Creation Period'.

B.4.24 Overview of Project Analytics

Oracle Project Analytics offers a comprehensive solution that delivers pervasive insight into several fundamental areas of project management. With Project Analytics, project executives, project managers, and project accountants can track the status of projects through their life cycle to improve their performance and profitability. Oracle Project Analytics is also integrated with other Oracle BI Applications, such as Financials and Procurement Analytics. These integrations deliver cross-functional analysis of AR, AP, and procurement transactions by project.

Project Analytics includes the following Subject Areas:

  • Project - Project Billing

    This subject area provides the ability to report on Invoicing, including amounts and quantities, across projects, tasks, organizations, resources, and associated hierarchies and for external, interproject and intercompany invoicing. This subject area also includes contract metrics.

  • Project - Budget

    This subject area provides the ability to report on cost, revenue, margin budgets, and the budget changes including tracking original and current budgets across projects, tasks, organizations, resources, periods and associated hierarchies at budget line level.

  • Project - Cost

    This subject area provides the ability to report on Cost (Burdened Cost), Raw Cost, Burden Cost for the past and current periods including inception-to-date, year-to-date comparisons across projects, tasks, organizations, resources, suppliers and associated hierarchies. It provides the ability to track the cost at cost distribution level.

  • Project - Forecast

    This subject area provides the ability to report on Cost, Revenue, and Margin Forecasts, and on forecast changes, including tracking original and current forecasts across projects, tasks, organizations, resources, periods, and associated hierarchies. It provides the ability to track the metrics that indicate the past, present, and future performance of cost, revenue, and margin.

  • Project - Funding

    This subject area provides the ability to track the contract amount, funding amount, and other changes to the funding throughout the life cycle of the project. In addition, it provides the ability to perform comparative analysis of Contract Amount, Funding Amount, and Invoice Amount across projects, tasks, customers, organizations, and associated hierarchies.

  • Project - Performance

    This is a consolidated subject area with the combined information from Budgets, Forecasts, Cost, Revenue, and provides the ability to monitor performance by comparing the actual (cost, revenue, margin and margin %) with budgets, and forecasts across projects, tasks, organizations, resources, and associated hierarchies.

  • Project - Revenue

    This subject area provides the ability to report on Revenue transactions for the past and current periods including inception-to-date, year-to-date comparisons across projects, tasks, organizations, resources, suppliers and associated hierarchies. It provides the ability to track the revenue at the distribution level.

  • Project - Commitments

    This subject area provides the ability to report on the obligations for future expenditures that a project has made. Reporting can be done across organizations, projects, tasks, resources and periods. There are metrics showing raw and burdened amounts for requisitions, purchase orders and supplier invoices.

  • Project – Cross Charges

    This subject area provides the ability to report on expenditures that projects or organizations charge to each other for resources that they share. Reporting is possible across periods, organizations, projects, task and resources. Metrics include charges generated by Intercompany Billing or the Borrowing and Lent methods for current and previous periods.

  • Project – Resource Management

    This subject area provides the ability to report on the utilization of resources, the status and attributes of project requirements and the supply and demand of competencies and jobs. Reporting is possible across Gregorian calendar periods, resource organizations, resources, requirements and projects.

  • Project – Cost GL Reconciliation

    This subject area provides metrics and dimensions to track the number of reconciliation exceptions between Projects and the General Ledger and their amount value. Six use cases are supported, covering the flow from the transfer of cost distribution lines from Projects through to the posting of the corresponding journal lines in the General Ledger. The use cases also cover exceptions caused by mismatches between journal lines and the cost distribution lines that they summarize, and by journal lines with no matching cost distribution lines.

  • Project – Revenue GL Reconciliation

    This subject area provides metrics and dimensions to track the number of reconciliation exceptions between Projects and the General Ledger and their amount value. Six use cases are supported, covering the flow from the transfer of revenue distribution lines from Projects through to the posting of the corresponding journal lines in the General Ledger. The use cases also cover exceptions caused by mismatches between journal lines and the revenue distribution lines that they summarize, and by journal lines with no matching revenue distribution lines.

Cross Fact Analysis

The Canonical BU (Canonical Organization) is the common logical BU (Organization) against which data is analyzed across different fact tables. From each fact table, one main BU (Organization) is selected for analyzing data in that fact table (for example, for the Cost fact the canonical BU is the Expenditure BU; for the Revenue fact the canonical BU is the Contract BU), and the corresponding foreign key is used to join to the logical dimension Dim - Business Unit (Dim - Project Organization). The dimensions Dim - Business Unit and Dim - Project Organization are called the Canonical BU and Canonical Project Organization dimensions, respectively. For example, for the Cost fact the join is:

Dim_W_INT_ORG_D_Business_Unit.SCD1_WID = Fact_W_PROJ_COST_LINE_F_Project_Cost.EXPENDITURE_OPER_UNIT_WID

For Revenue Fact the join would be:

Dim_W_INT_ORG_D_Business_Unit.SCD1_WID = Fact_W_PROJ_REVENUE_LINE_F_Revenue_Lines.CONTRACT_BU_WID

In addition, the Canonical BU calendar is used when forming the foreign key to the Fiscal Calendar Day dimension (W_MCAL_DAY_D). For cross fact analysis, you must always ensure that you have a filter on the Canonical BU (the Business Unit Name column under the Organizations folder in the presentation area). This filter on the Canonical BU is required in all dashboards because it ensures that the calendar is unique and prevents double counting. A warehouse-level sketch of the canonical join is shown below.
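For illustration, the following is a minimal warehouse-level sketch of the canonical BU join for the Cost fact, based on the join expression above. The ORG_NAME and BURDENED_COST columns are hypothetical placeholders; actual analysis is performed through the RPD logical model rather than direct SQL.

SELECT org.ORG_NAME,            -- hypothetical organization name column
       SUM(cost.BURDENED_COST)  -- hypothetical cost measure column
FROM W_PROJ_COST_LINE_F cost
JOIN W_INT_ORG_D org
  ON org.SCD1_WID = cost.EXPENDITURE_OPER_UNIT_WID  -- canonical BU join from the example above
GROUP BY org.ORG_NAME;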

The table below lists the Canonical BUs and Canonical Organizations that are available for the logical facts supported in the Project Analytics solution.

Table B-196 List of Facts, Canonical BUs, and Canonical Organizations

Fact                               Canonical Business Unit   Canonical Organization
Project Billing                    Contract BU               Contract Organization
Project Budget                     Project BU                Project Organization
Project Budget - Linear Spread     Project BU                Project Organization
Project Commitment                 Project BU                Project Organization
Project Commitment Snapshot        Project BU                Project Organization
Project Contract                   Contract BU               Contract Organization
Project Cost                       Expenditure BU            Expenditure Organization
Project Cross Charge - Invoice     Project BU                Project Organization
Project Cross Charge - Provider    Expenditure BU            Expenditure Organization
Project Cross Charge - Receiver    Project BU                Contract Organization
Project Cross Charge - Revenue     Contract BU               Contract Organization
Project Forecast                   Project BU                Project Organization
Project Funding                    Contract BU               Contract Organization
Project Revenue                    Contract BU               Contract Organization


B.4.25 Overview of Budgetary Control in Financial Analytics

The Budgetary Control dashboard is targeted at executives managing overall budgets and senior level managers managing budgets by cost centers, funds, programs, projects and accounts. It is designed to provide key analysis pertaining to expense budgets including budget amounts, encumbrances and expenditures as well as revenue budgets including budget amounts and recognized revenues.

There are some prerequisites to meet in order to use Budgetary Control Analytics:

  • Budgetary Control Analytics dashboards allow drill down from summary reports to detail reports on purchase orders, purchase requisitions, and so on, which fall under Procurement and Spend Analytics subject areas. For these drill downs to work, you must license and implement the Procurement and Spend Analytics offering in addition to the Financial Analytics offering.

  • PeopleSoft customers, commercial or public sector, need to implement the Commitment Control module in their PeopleSoft applications to use Budgetary Control Analytics.

B.4.26 Overview of Project GL Reconciliation Analytics for E-Business Suite 11.5.10

For an overview of GL Reconciliation for Project Analytics, see Section B.2.124, "Additional Information About GL Reconciliation in Project Analytics".

Because in E-Business Suite 11.5.10 there is no concept of a linkage (SLA) table, the use cases are a little different from the ones in R12. The following are the use cases supported for the E-Business Suite 11.5.10 source system.

This diagram is described in surrounding text.

Table B-197 Use cases and descriptions

Use Case Description

Cost/Revenue Lines Not Transferred

Cost/Revenue line transactions with transfer status – "Not Transferred".

Cost/Revenue Line Transfer Exceptions

Cost/Revenue lines which have transfer status = "Transferred" but with transfer exceptions. The ETL process identifies these lines because they are not present in table w_gl_linkage_information_G of the Oracle Business Intelligence Date Warehouse.

Lines which have been successfully transferred to General Ledger are included in table w_gl_linkage_information_G.

Cost/Revenue Lines not in GL

Not supported for EBS 11.5.10 as there is no SLA module.

Cost/Revenue Lines with Unposted Journals

Cost/Revenue lines which have been transferred out of Projects and transferred to GL but which are not posted in General Ledger. This use case requires some customizations as described in FSM Task: How to configure projects GL reconciliation manual journal entries use case for EBS R11510 and R12.

Manual Journal Lines

Reports journal lines which are manually created in General Ledger and have no corresponding Cost/Revenue lines in Projects.

In cases where the accounting has been set up such that there is no project segment in the chart of accounts, this reconciliation module cannot match manual journal lines with individual projects.

To match manual journal lines with projects, users must annotate manual journal lines with the corresponding project number using a flexfield, and modify the ETL to pick up this information.

Amounts Mismatch

Reports journal lines and cost/revenue lines for which the GL amount of the journal line does not match the sum of the corresponding cost/revenue line amounts that it summarizes.
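
The following SQL is a minimal, illustrative sketch of the transfer-exception check described above: it flags lines marked as transferred in the source that have no matching row in W_GL_LINKAGE_INFORMATION_G. Only the table name W_GL_LINKAGE_INFORMATION_G is taken from this section; the cost line fact name (W_PROJ_COST_LINE_F) and the columns TRANSFER_STATUS_CODE, INTEGRATION_ID, and SOURCE_DISTRIBUTION_ID are assumed placeholders, not the delivered warehouse schema or the delivered ODI logic.

    -- Illustrative sketch only. W_PROJ_COST_LINE_F, TRANSFER_STATUS_CODE,
    -- INTEGRATION_ID, and SOURCE_DISTRIBUTION_ID are hypothetical names;
    -- substitute the actual fact table and join key used by your ETL.
    SELECT c.integration_id,
           c.project_id,
           c.cost_amount
    FROM   w_proj_cost_line_f c
    WHERE  c.transfer_status_code = 'TRANSFERRED'
    AND    NOT EXISTS (SELECT 1
                       FROM   w_gl_linkage_information_g g
                       WHERE  g.source_distribution_id = c.integration_id);

Lines returned by a query of this shape correspond to the "Cost/Revenue Line Transfer Exceptions" use case; lines that do join to W_GL_LINKAGE_INFORMATION_G have been transferred successfully.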


B.4.27 Overview of GL Reconciliation Analytics for E-Business Suite R12

For an overview of GL Reconciliation for Projects Analytics, see Section B.2.124, "Additional Information About GL Reconciliation in Project Analytics".

The following use cases are supported for the E-Business Suite R12 source system.

This diagram is described in surrounding text.

Table B-198 Use cases and descriptions

Use Case Description

Cost/Revenue Lines Not Transferred

Cost/Revenue line transactions with transfer status = "Not Transferred".

Cost/Revenue Line Transfer Exceptions

Cost/Revenue lines with transfer status = "Transferred" but which are not in the GL Linkage table.

The ETL process identifies these lines because they are not present in table W_GL_LINKAGE_INFORMATION_G of the Oracle Business Intelligence Data Warehouse. Lines which have been successfully transferred to SLA are included in table W_GL_LINKAGE_INFORMATION_G.

Cost/Revenue Lines not in GL

Cost/Revenue lines which have been transferred out of Projects to SLA but which have not been transferred to General Ledger.

Cost/Revenue Lines with Unposted Journals

Cost/Revenue lines with corresponding journal lines not posted to General Ledger.

Manual Journal Lines

Reports journal lines which are manually created in General Ledger and have no corresponding Cost/Revenue lines in Projects.

In cases where the accounting has been set up such that there is no project segment in the chart of accounts, this reconciliation module cannot match manual journal lines with individual projects.

To match manual journal lines with projects, users must annotate manual journal lines with the corresponding project number using a flexfield, and modify the ETL to pick up this information (an illustrative sketch of this kind of match follows this table). This use case requires some customizations as described in FSM Task: How to configure projects GL reconciliation manual journal entries use case for EBS R11510 and R12.

Amounts Mismatch

Journal lines whose amounts do not match those of their corresponding cost or revenue lines.
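
Purely as an illustration of the kind of customization described for the Manual Journal Lines use case, the project match might resemble the query below. Every name here (the journal line fact W_GL_JOURNAL_LINE_F, the flexfield column ATTRIBUTE1 holding the user-entered project number, the W_PROJECT_D dimension, and the MANUAL_ENTRY_FLAG indicator) is an assumption made for the sketch; your flexfield segment and ETL mapping will differ, and the actual steps are described in the FSM task referenced above.

    -- Illustrative sketch only: match manually entered GL journal lines to
    -- projects via a project number keyed into a descriptive flexfield segment.
    -- All table and column names below are hypothetical placeholders.
    SELECT j.journal_line_id,
           j.attribute1      AS entered_project_number,
           p.row_wid         AS project_wid
    FROM   w_gl_journal_line_f j
    LEFT   JOIN w_project_d p
           ON p.project_number = j.attribute1
    WHERE  j.manual_entry_flag = 'Y';

Rows where the join to the project dimension fails would remain unmatched manual journal lines; a modified ETL would carry the resolved project key into the reconciliation reports.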


B.4.28 Overview of Project GL Reconciliation Analytics for PeopleSoft 9.0

For an overview of GL Reconciliation for Projects Analytics, see Section B.2.124, "Additional Information About GL Reconciliation in Project Analytics".

The following use cases are supported for the PeopleSoft 9.0 source system.

Single-Feed Data System

If the source system is set up for single feed, cost transfers and revenue pass through the table PS_CA_ACCTG_LN_PC as an intermediate step before General Ledger.

This diagram is described in surrounding text.

Table B-199 Use cases and descriptions

Use Case Description

Cost/Revenue Lines Not Transferred

Cost/Revenue line transactions with transfer status = "Not Transferred".

Cost/Revenue Line Transfer Exceptions

Cost/Revenue lines which have transfer status = "Transferred" but with transfer exceptions. The ETL process identifies these lines because they are not present in table W_GL_LINKAGE_INFORMATION_G of the Oracle Business Intelligence Data Warehouse.

Lines which have been successfully transferred to PS_CA_ACCTG_LN_PC are included in table W_GL_LINKAGE_INFORMATION_G.

Cost/Revenue Lines not in GL

Cost/Revenue lines which have been transferred out of Projects to the intermediate subledger or GL linkage table but which have not been transferred to General Ledger.

Cost/Revenue Lines with Unposted Journals

Cost/Revenue lines which have been transferred out of Projects and transferred to GL but which are not posted in General Ledger.

Manual Journal Lines

Reports journal lines which are manually created in General Ledger and have no corresponding Cost/Revenue lines in Projects.

Amounts Mismatch

Reports journal lines and cost/revenue lines for which the GL amount of the journal line does not match the sum of the corresponding cost/revenue line amounts that it summarizes.


Dual-Feed Data System

This diagram is described in surrounding text.

Table B-200 Use cases and descriptions

Use Case Description

Cost/Revenue Lines Not Transferred

This use case does not apply for the dual feed setup.

Cost/Revenue Line Transfer Exceptions

This use case identifies lines that are in the Projects subledger but not in General Ledger. The ETL process identifies these lines because they are not present in table W_GL_LINKAGE_INFORMATION_G of the Oracle Business Intelligence Data Warehouse, while lines which have been successfully transferred to General Ledger are included in table W_GL_LINKAGE_INFORMATION_G.

Cost/Revenue Lines not in GL

This use case does not apply for the dual feed setup.

Cost/Revenue Lines with Unposted Journals

Cost/Revenue lines which have been transferred out of Projects and transferred to GL but which are not posted in General Ledger.

Manual Journal Lines

Reports journal lines which are manually created in General Ledger and have no corresponding Cost/Revenue lines in Projects.

Amounts Mismatch

Reports journal lines and cost/revenue lines for which the GL amount of the journal line does not match the sum of the corresponding cost/revenue line amounts that it summarizes.


Note

In PeopleSoft systems there are two types of revenue transactions: Amount-Based Revenue, and Rate-Based Revenue.

For Oracle Business Intelligence Applications release 11.1.1.7.1, the reconciliation supports only Rate-Based Revenue transactions. Amount-Based Revenue rows are not currently captured in the Revenue fact table and are therefore not supported.

B.4.29 Overview of Oracle Procurement and Spend Analytics

Oracle Procurement and Spend Analytics comprises Procurement Analytics, Sourcing Analytics, and Employee Expense Analytics.

Oracle Procurement and Spend Analytics enables organizations to optimize their supply-side performance by integrating data from across the enterprise value chain, enabling executives, managers, and frontline employees to make more informed and actionable decisions. Organizations using Oracle Procurement and Spend Analytics benefit from increased visibility into corporate spend and the complete source-to-pay process, including comprehensive sourcing and procurement analysis, supplier performance analysis, supplier payables analysis, and employee expenses analysis. Through complete end-to-end insight into savings, spend patterns, and supplier performance, organizations can significantly reduce costs, enhance profitability, increase customer satisfaction, and gain competitive advantage. Oracle Procurement and Spend Analytics also integrates with the other applications in the Oracle Business Intelligence Applications product line, such as Oracle Financial Analytics, delivering this insight across the organization to increase the company's effectiveness in managing its customers, suppliers, and financial decisions.

Oracle Procurement and Spend Analytics provides visibility into sourcing, direct and indirect spending across the enterprise, payment, and employee expenses. Oracle Procurement and Spend Analytics comprises the following Subject Areas:

  • Procurement and Spend - Change Orders: This subject area provides the ability to report on changes to purchasing documents after approval, showing the count of changes and cancellations and processing time by Supplier, BU, Buyer, and Change Order attributes such as method, type, initiator, and so on. Note: this subject area applies to the Fusion source only; other sources such as EBS or PeopleSoft do not support it.

  • Procurement and Spend - Invoice Lines: This is a detailed subject area that provides the ability to report on total spend of an organization across suppliers, products, item categories, business units, cost centers, buying locations, supplier locations and associated hierarchy. In addition, this subject area also provides detailed information at invoice distribution level.

  • Procurement and Spend - Procure to Pay: This is a summary subject area that provides the ability to do comparative analysis and report on requested spend, committed spend and actual spend and receipts across business units, buying locations, suppliers, products, item categories and associated hierarchies for both direct and indirect spend (indirect spend being MRO and employee expenses) in detail to allow complete visibility of spending across your organization.

  • Procurement and Spend - Purchase Agreement: This subject area provides the ability to report on Purchase Agreements, showing agreement amount, its consumption and expiration, the number of different agreement types, buyers, suppliers and supplier sites, and agreement lines across Supplier, Supplier Site, Buyer, Item, BUs, and Agreement details.

  • Procurement and Spend - Purchase Cycle Lines: This is a summary subject area that provides the ability to report on cycle time performance, such as requisition to purchase order lead time, purchase order to receipt lead time, and P2P lead time of the suppliers of an organization.

  • Procurement and Spend - Purchase Orders: This is a detailed subject area that combines the information from Purchase Orders, Purchase Order Costs and Purchase Schedules with the ability to report on committed spend, contract compliance and Purchase orders of the suppliers of an organization across suppliers, company, products, item categories and associated hierarchies at purchase order line level.

  • Procurement and Spend - Purchase Orders BU Summary: This is the same as the 'Procurement and Spend - Purchase Orders' Subject Area, except that it does not have data security enabled; it is used only in Fusion Applications embedded reports with an explicit data filter.

  • Procurement and Spend - Purchase Receipts: This is a detailed subject area that provides the ability to report on actual spend and purchase receipts of the suppliers of an organization across suppliers, company, location, products, item categories and associated hierarchies at purchase receipt line level, including reporting based on receiving time.

  • Procurement and Spend - Purchase Requisition BU Summary: This is the same as the 'Procurement and Spend - Purchase Requisitions' Subject Area, except that it does not have data security enabled; it is used only in Fusion Applications embedded reports with an explicit data filter.

  • Procurement and Spend - Purchase Requisition Status: This is a summary subject area that provides the ability to report on requisition status along the approval cycle of purchase requisitions of the suppliers of an organization. This subject area is only populated by the Universal adapter.

  • Procurement and Spend - Purchase Requisitions: This is a detailed subject area that provides the ability to report on requested spend and purchase requisitions (including cyclic requisitions) of the suppliers of an organization across suppliers, company, products, item categories and associated hierarchies at purchase requisition line level.

  • Supplier Performance - Supplier AP Transactions: This is a summary subject area that provides the ability to analyze payment performance and payment due analysis of the suppliers of an organization across suppliers, company, location, products, commodities and associated hierarchies. (Note: In order to populate the Supplier Payables component, you must implement the Accounts Payables module of Oracle Financial Analytics. If you do not implement the Accounts Payables module, then some of the Supplier Payables reports will not be populated.)

  • Procurement and Spend - Scorecard: This subject area supports the Procurement Scorecard. It includes metrics/KPIs and their targets, providing the ability to monitor and analyze trends in the procurement organization's performance. It provides performance and goal attainment information, across time and business units, from different perspectives such as finance, internal customer, operations and supplier.

  • Supplier Performance - Supplier Performance: This subject area (built on Purchase Cycle Lines) contains targeted metrics that allow users to analyze the timeliness, reliability, cost, and quality of goods provided by the suppliers. It helps you to understand how well suppliers are contributing to the success of your organization.

  • Sourcing - Award: This subject area provides the ability to report on Sourcing Awards, showing projected and realized savings, award amount, quantity, price, PO amount, number of suppliers and BUs awarded across sourcing negotiation types, BUs, Suppliers, Buyers and Categories.

  • Sourcing - Negotiation: This subject area provides the ability to report on Sourcing Negotiations, showing negotiation amounts, header/line counts and cycle times across sourcing negotiation types, BUs, Suppliers, Buyers and Categories.

  • Sourcing - Overview: This is a detailed subject area that provides the ability to report on supplier participation and response to sourcing documents, projected and realized savings, award amount, quantity, price, PO amount, the number of suppliers and BUs awarded, and various cycle times across sourcing negotiation types, BUs, Suppliers, Buyers and Categories.

  • Sourcing - Response: This subject area provides the ability to report on Sourcing Responses, showing supplier response and participation across sourcing negotiation types, BUs, Suppliers, Buyers and Categories.

  • Employee Expenses - Credit Card: This subject area provides the ability to report on the corporate card spend of an organization, showing the number and amount of outstanding transactions by business unit, employee, and expense categories.

  • Employee Expenses - Overview: This is a detailed subject area that provides the ability to report on employee spend of an organization across employees, company, cost center and associated hierarchies, including Approvers and cycle time measurements related to Approval, and Employee Expenses by various expense types.

  • Employee Expenses - Violations: This subject area provides the ability to report on policy violations for submitted employee expenses of an organization, across employee and business.

  • Spend Planning - Common: This subject area is used by the spend planning application to provide reference data such as Exchange Rate, UOM conversion, Agreement, and so on. Note that each table in this subject area represents an object that should be queried independently; you should not create queries that cross tables.

  • Spend Planning - Historical Spend: This subject area is used by the spend planning application to extract and analyze historical spend data.

  • Spend Planning - Purchasing: This subject area is used by the spend planning application to extract and analyze historical purchasing data.

B.4.30 Overview of Product Information Management Analytics

Oracle Product Information Management (PIM) Data Hub is an enterprise data management solution that enables customers to centralize all product information from heterogeneous systems. It allows organizations to create a single, enterprise view of their product information, by integrating, standardizing and synchronizing fragmented product data from multiple source systems into a central, operational, data repository ('Hub').

PIM Data Hub solution centralizes the disparate sources of product information and provides a full, 360-degree view of products across all channels. It enables articulated management and communication of product information both within the organization as well as externally to customers and value-chain partners.

Oracle Product Information Management Analytics application comprises the following Subject Areas:

  • PIM - Item: This subject area provides information on creation and approval activities related to items of different Item class, type, phase and status.

  • PIM - Change Orders: This subject area provides information on activities related to Change Orders such as number of change orders in different age range, average age of change orders, different stages of change order life cycle, for example, approved, rejected, draft, pending effective.

  • PIM - New Item Request: This subject area provides information on activities related to New Item requests such as number of new item requests in different age range, average age of new item requests, New Item Request Cycle Time and different stages of new item request life cycle, for example, new, approved, rejected.

  • PIM - Item Catalog: This subject area provides information on activities related to Item Catalogs, such as the number of new catalogs, categories, and shared categories.

  • PIM - Item Batch: This subject area provides information on activities related to Item Import from any external system, such as the number of items excluded, partially imported, successfully imported, and so on during the batch import process.

B.4.31 Overview of Partner Analytics

Partner Analytics helps channel and partner account managers assess partner and program performance on all key fronts - lead generation, deals registered, revenue and enrollments. It also enables partner organization sales representatives and managers to assess their own sales performance.

B.4.32 Overview of Financial Analytics

Oracle Financial Analytics comprises the following Functional Areas:

  • Employee Expenses - The Oracle Employee Expenses Analytics application has been designed to provide visibility into an organization's employee-related expenditures, including corporate card usage, expense policy violations, and the overall submission and approval process. Gain control of the drivers of employee expenses by isolating top spenders across expense categories and identifying recurring policy violations. Visibility into overall expense trends improves the ability to negotiate with key merchants. The default configuration for the Oracle Employee Expenses Analytics application is based on what is identified as the most-common level of detail or granularity. However, you may configure and modify the extracts to best meet your business requirements.

  • Fixed Assets - The Oracle Fixed Assets Analytics application provides finance controllers, asset managers, and cost center managers with a complete picture of the asset life cycle from acquisition through to retirement. Fixed assets comprise approximately 40 to 50% of the balance sheet and are a key component for both commercial and public sector customers. Tracking asset life cycle value and measuring returns on key assets are important to increasing the overall return of the organization. The default configuration for the Oracle Fixed Assets Analytics application is based on what is identified as the most-common level of detail or granularity. However, you may configure and modify the extracts to best meet your business requirements.

  • General Ledger - The General Ledger Analytics application has been designed to provide insight into key areas of financial performance, including balance sheet, cash flow, expenses, budget vs. actual, working capital, and liquidity. Identify the root cause of discrepancies for more timely, informed decisions at all levels of the organization. Gain access to reporting and analysis from intra-period financial information before books are closed. The default configuration for the Oracle General Ledger Analytics application is based on what is identified as the most-common level of detail or granularity. However, you may configure and modify the extracts to best meet your business requirements.

  • Payables - The Oracle Payables Analytics application has been designed to provide an overview of the health of the payables side of the business and enables Finance to best manage its cash outflows and ensure timely payments to its suppliers. The need for this analysis is increasingly important because suppliers are becoming strategic business partners, with a focus on increased efficiency for just-in-time and quality purchasing relationships. The default configuration for the Oracle Payables Analytics application is based on what is identified as the most-common level of detail or granularity. However, you can configure or modify the extracts to best meet your business requirements.

  • Profitability - The Oracle Profitability Analytics application has been designed to provide key data pertaining to profitability, including Profit and Loss Statements, customer and product profitability, margin analysis, ROA, and ROE. Insight into revenue and cost drivers helps drive financial accountability and proactive behavior. The default configuration for the Oracle Profitability Analytics application is based on what is identified as the most-common level of detail or granularity. However, you may configure and modify the extracts to best meet your business requirements.

  • Receivables - The Receivables Analytics application has been designed to provide key data pertaining to receivables, including receivables due, credit risk, payments, and collector efficiency, and enables Finance to best manage cash inflows and its ability to collect debt. Each day that your receivables are past the due date represents a significant opportunity cost to your company. Keeping a close eye on the trends and clearing of AR is one way to assess the efficiency of your sales operations, the quality of your receivables, and the value of key customers. The default configuration for the Oracle Receivables Analytics application is based on what is identified as the most-common level of detail or granularity. However, you may configure and modify the extracts to best meet your business requirements.