Oracle® Business Intelligence Applications Release Notes
Version 7.9.4
E10921-03
1 Release Notes

These release notes describe known issues and workarounds for Oracle Business Intelligence Applications Version 7.9.4, and contain the following sections:

1.1 How to Use These Release Notes

These release notes are updated periodically as new information becomes available. To ensure that you are reading the latest version of the release notes, check the Oracle Business Intelligence Applications Documentation Web site:

http://www.oracle.com/technology/documentation/bi_apps.html

1.2 General Issues and Workarounds

This section describes general issues and workarounds for Oracle Business Intelligence Applications products. It contains the following topics:

1.2.1 Certification Information

For certification information, refer to the System Requirements and Supported Platforms for Oracle Business Intelligence Applications document. This document is part of the Oracle Business Intelligence Applications documentation set. It is also available from the Certify tab on Oracle Metalink at the following location:

http://metalink.oracle.com

1.2.2 Installation and Upgrade

This section provides release notes on installing and upgrading Oracle Business Intelligence Applications. It contains the following topics:

1.2.2.1 Maintain Side-By-Side Environments When Upgrading

If you are upgrading from a previous release, include in your upgrade plans side-by-side instances of the entire Oracle BI Applications environment, not just the Oracle Business Intelligence Presentation Catalog.

Enabling side-by-side instances of the entire Oracle BI Applications environment is a critical success factor for upgrade.

1.2.3 Documentation Corrections

This section provides corrections for various pieces of the documentation set for Oracle Business Intelligence Applications. It contains the following topics:

1.2.3.1 Corrections to Oracle Business Intelligence Applications Installation and Configuration Guide

Note the following corrections to the Oracle Business Intelligence Applications Installation and Configuration Guide:

1.2.3.2 Corrections to Oracle Business Intelligence Applications Upgrade Guide

Not applicable to Oracle Business Intelligence Applications Version 7.9.4.

1.2.3.3 Corrections to Oracle Business Intelligence Data Warehouse Administration Console Guide

Not applicable to Oracle Business Intelligence Applications Version 7.9.4.

1.3 Oracle Business Intelligence Applications: General

This section provides release notes for Oracle Business Intelligence Applications in general. It contains the following topics:

1.3.1 Issue Including Subject Areas from Two Different Sources into One Execution Plan

Data Warehouse Administration Console allows customers to build execution plans that comprise subject areas from different source system containers. However, this is not supported in this release. For example, you cannot have a subject area from Oracle Applications 11.5.8 and one from Siebel Applications 7.8 in the same execution plan.

To work around this issue, create separate execution plans for subject areas in different containers. For example, suppose an execution plan has subject areas such as Sales, Inventory, and Receivables that come from an Oracle Applications 11.5.8 container. If the Sales subject area in Siebel Applications 7.8 must be populated, then it requires another execution plan.
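The grouping rule above can be sketched in a few lines of Python (a hypothetical illustration only; the container and subject-area pairs are examples, not actual DAC metadata):

```python
from collections import defaultdict

# Each subject area is tagged with its source system container.
subject_areas = [
    ("Sales", "Oracle Applications 11.5.8"),
    ("Inventory", "Oracle Applications 11.5.8"),
    ("Receivables", "Oracle Applications 11.5.8"),
    ("Sales", "Siebel Applications 7.8"),
]

# Group by container: each group becomes its own execution plan.
plans = defaultdict(list)
for area, container in subject_areas:
    plans[container].append(area)

for container, areas in sorted(plans.items()):
    print(f"Execution plan for {container}: {', '.join(areas)}")
```

Here the Sales subject area appears once per container, so it lands in two separate execution plans, matching the workaround.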

1.3.2 Incremental Aggregation Tasks Can Run Slowly in Multisource ETL Situation

Data Warehouse Administration Console runs full load or incremental load commands based on the dates when data was last extracted from a source table and when data was loaded into a target table.

Dependence on last extract or last load dates can become an issue when loading data from two discrete source systems for the first time. ETL from the first source runs as a full extract and full load. ETL from the second source runs as a full extract and as an incremental load. This can cause the second load to be slower than the first, because full load mappings are simple while incremental mappings (especially for aggregate tables) must handle delta aggregation, manipulation, and other complicated situations. Thus incremental loads are costlier.

Slow performance when running incremental ETL is not noticeable in daily ETL runs because of lower data volumes. But, in situations where data is extracted and loaded for the first time from two source systems one after the other, there will be a noticeable slowing down of the second load due to the high data volumes loaded in an incremental mode.

For example, suppose that three years of data is extracted and loaded from Oracle Applications 11.5.8 followed by a three-year load of data from Oracle Applications 11.5.10. The load from Oracle Applications 11.5.10 source will be noticeably slower.

This issue has no workaround.

1.3.3 Data Warehouse Administration Console Cannot Use Batch Mode to Build Execution Plans

You cannot use the Data Warehouse Administration Console repository in batch mode when building execution plans on an Oracle 9i Release 2 Database. If batch mode is enabled in this situation, then you might see an error message such as the following one:

MESSAGE:::Error while persisting dependency. EXCEPTION CLASS::: com.siebel.analytics.etl.execution.ExecutionPlanInitializationException

This error occurs because the Oracle 9i JDBC drivers do not handle array inserts or bulk inserts. The execution plan build works well without the bulk inserts.

To work around this issue, perform these steps:

  1. Deselect the "Enable Batch Mode" check box when building execution plans. (Batch mode is enabled by default.)

  2. Use the JDBC driver that is available in Oracle Database 10g and connect to the DAC repository with a thin connection.
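The difference between the two modes can be illustrated with a small Python sketch (a simplified stand-in for the DAC persistence logic, not the actual code; `persist` is a hypothetical helper):

```python
def persist(rows, batch_mode):
    """Return the insert statements that would be issued for the given rows."""
    if batch_mode:
        # Batch mode: one array (bulk) insert for all rows -- the operation
        # the Oracle 9i JDBC drivers cannot handle.
        return [("bulk insert", list(rows))]
    # Batch mode disabled: one single-row insert per dependency row.
    return [("single insert", [row]) for row in rows]

dependencies = ["dep_1", "dep_2", "dep_3"]
print(len(persist(dependencies, batch_mode=True)))   # one bulk statement
print(len(persist(dependencies, batch_mode=False)))  # one statement per row
```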

1.3.4 Incorrect Name for SA System Presentation Column Prevents Delivery of iBots to Applications Users

The Oracle Business Intelligence Delivers iBots use a predefined query against the SA System subject area to retrieve a list of applications users who are associated with the iBot's Recipient Group. When an iBot is run, this predefined query will fail with an error message that is similar to the following one:

State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 27005] Unresolved column: "Time Zone".

The query error prevents the Oracle BI Delivers iBot from generating and delivering any content for the applications users that are associated with the iBot's Recipient Group. This impacts all iBot destination types, including Interactive Dashboards, Email, and Disconnected Application cache generation for Oracle Business Intelligence Disconnected Analytics. The issue occurs because the "Time Zone" presentation column name is incorrectly spelled as "Timezone" in the SA System subject area.

To work around this issue, use the Oracle Business Intelligence Administration Tool to rename the existing "Timezone" presentation column in the SA System presentation catalog to "Time Zone".

For information on the SA System subject area, see the Oracle Business Intelligence Server Administration Guide.

1.3.5 Lack of Time Zone Setting Prevents Delivery of iBots to Applications Users

The Oracle BI Delivers iBots use a predefined query against the SA System subject area to retrieve a list of applications users who are associated with the iBot's Recipient Group. When an iBot is run, users who do not have a time zone specified in user preferences are considered invalid users and iBots are not delivered to them.

This issue occurs because the join type for the S_TIMEZONE join in the S_USER Logical Table Source in the SA System Business Model is defined as INNER when it should be defined as RIGHT OUTER.

To work around this issue, perform these steps:

  1. Display the Oracle Business Intelligence Administration Tool.

  2. In the Business Model and Mapping layer, expand the SA System Business Model and the USER Logical Table.

  3. Double-click the S_USER Logical Table Source under Sources in the USER Logical Table.

  4. In the Logical Table Source - S_USER Dialog Box, change the type to RIGHT OUTER from INNER for the S_TIMEZONE join in the Joins section of the General tab.

For information on the SA System subject area, see the Oracle Business Intelligence Server Administration Guide.

1.3.6 Large Dimensional Tables Causing Slow Fact Lookups

Several dimension tables with more than 255 columns cause fact lookups on those dimensions to perform slowly and the lookup database server to run out of memory. The following tables have been identified as experiencing this issue:

  • W_CUSTOMER_FIN_PROFL_D

  • W_CUSTOMER_LOC_D

  • W_CUSTOMER_LOC_USE_D

  • W_ORG_D

  • W_PRODUCT_D

  • W_SALES_PRODUCT_D

  • W_SUPPLIER_PRODUCT_D

To work around this issue and improve performance, complete the following steps:

  1. Edit the slow running session.

  2. For all transformations that look up the large dimensions listed in this section, change the following settings:

    • Index cache size = 10 MB

    • Data cache size = 20 MB

This is a session-side change. No modifications to the mappings are required.

1.3.7 Fix List of Values and Copy to List Task Hangs When Running Full or Incremental Load

If the primary target table defined for a task is truncated in a full or incremental load, Data Warehouse Administration Console automatically drops and recreates the registered indexes on that table for better ETL performance. Data Warehouse Administration Console also gathers statistics on the ETL indexes automatically after recreating them. Statistics help the database optimizer select the best access path to query the data.

The underlying workflow for this task's full and incremental load command is SDE_ListOfValues, which consists of the SIL_ListOfValuesGeneral and SIL_CopyListOfValuesToOLTP sessions. The first session populates the W_LST_OF_VAL_G table, and the second session populates the S_ETL_LOV table from the W_LST_OF_VAL_G table. Because the S_ETL_LOV table is defined as the primary table for this task, Data Warehouse Administration Console does not analyze the W_LST_OF_VAL_G table.

Even if this task is modified to include the W_LST_OF_VAL_G table as another primary table, Data Warehouse Administration Console will not analyze this table until after executing the entire workflow that comprises these two sessions. This causes the task to hang while it extracts rows from the unanalyzed W_LST_OF_VAL_G table to populate the S_ETL_LOV table.

To work around this issue:

  1. If necessary, make a copy of the deployed container where you want to apply this workaround. You cannot make changes to the out-of-the-box container.

  2. Create two new workflows in the Informatica Workflow Manager: one for the SIL_ListOfValuesGeneral session and the other for the SIL_CopyListOfValuesToOLTP session. The workflow name should match the session name that is used inside the workflow.

  3. Make two copies of the Fix List Of Values and Copy to OLTP task in Data Warehouse Administration Console. Rename the first copy Fix List Of Values and the second copy Copy List Of Values to OLTP. Verify that the properties for both tasks are set as follows:

    • Task 1: Fix List of Values

      • Command for Incremental Load: SIL_ListOfValuesGeneral

      • Command for Full Load: SIL_ListOfValuesGeneral

      • Folder Name: Extract

      • Primary Source: DBConnection_OLAP

      • Primary Target: DBConnection_OLTP

      • Task Phase: General

      • Execution Type: Informatica

      • Priority: 5

      • Source Tables: W_LST_OF_VAL_G (Primary), W_PARAM_G (Lookup)

      • Target: W_LST_OF_VAL_G (Primary).

      Ensure that Truncate Always and Truncate For Full Load options are not selected.

    • Task 2: Copy List of Values to OLTP

      • Command for Incremental Load: SIL_CopyListOfValuesToOLTP

      • Command for Full Load: SIL_CopyListOfValuesToOLTP

      • Folder Name: Extract

      • Primary Source: DBConnection_OLAP

      • Primary Target: DBConnection_OLTP

      • Task Phase: General

      • Execution Type: Informatica

      • Priority: 5

      • Source Tables: W_LST_OF_VAL_G (Primary)

      • Target: S_ETL_LOV (Primary)

      Ensure that Truncate Always and Truncate For Full Load options are selected.

  4. Edit the TASK_GROUP_Extract_lst_of_val task group by removing the Fix List of Values and Copy to List task and adding the newly created tasks Fix List Of Values and Copy List Of Values to OLTP to this task group. Set the execution order for these tasks as follows:

    • Fix List of Values: 1

    • Copy List of Values to OLTP: 2

  5. Reassemble the appropriate subject area to ensure that these new tasks are included in the subject area.

  6. Rebuild the execution plan in which these tasks must be executed.

1.3.8 Issue with Exchange Rates and Transaction Currencies

The current design and support of multiple currencies within Oracle BI Applications and the data warehouse assumes that the transactional (OLTP) system provides exchange rates and the table structures that store them.

If the OLTP system does not provide exchange rates from the transaction currency to the chosen data warehouse currencies, then the fact table will have a null exchange rate value for the conversion from the transaction currency to the Global1 currency. As a result, analysis based on global currencies is not possible for these transactions, and the correctness of data in the various aggregate tables is also affected. The same issue applies to the other two supported currencies (Global2 and Global3).

To work around this issue, ensure up front that the OLTP system has exchange rates from all possible transaction currencies to all three chosen data warehouse currencies. If this is not done beforehand and you encounter a missing exchange rate, you can rerun the affected transactions in "full" mode after you have fixed the missing exchange rate in the OLTP system.
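The failure mode can be illustrated with a short Python sketch (the rate table and function are hypothetical stand-ins for the OLTP exchange-rate lookup, not actual application code):

```python
# Assumed OLTP exchange-rate table: (transaction currency, warehouse
# currency) -> rate. The JPY -> USD rate is deliberately missing.
rates = {("EUR", "USD"): 1.10}

def global_amount(txn_amount, txn_currency, global_currency):
    """Convert a transaction amount to a data warehouse (Global) currency."""
    rate = rates.get((txn_currency, global_currency))
    if rate is None:
        # No rate in the OLTP system: the fact row carries a null exchange
        # rate, and global-currency analysis is impossible for this row.
        return None
    return txn_amount * rate

print(global_amount(100.0, "EUR", "USD"))  # rate present: converted amount
print(global_amount(100.0, "JPY", "USD"))  # missing rate: None
```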

1.3.9 Incorrect Default Physical SQL for Saved Result Sets on Teradata

When you set up segment Saved Result Sets in the Marketing Metadata on a Teradata data source, the default SQL for inserting the saved result set header has incorrect syntax. The following error message appears when you try to save a saved result set for a segment:

Odbc driver returned an error (SQLExecDirect). State: 37000. Code: 10034. [NQODBC] [SQL_STATE: 37000] [nQSError: 10034] The SQL statement has a syntax error. [nQSError: 43093] An error occurred while processing the EXECUTE PHYSICAL statement. [nQSError: 16001] ODBC error state: 37000 code: -3706 message: [NCR][ODBC Teradata Driver][Teradata RDBMS]Syntax error: expected something between the 'COUNT' keyword and ','.. [nQSError: 16014]]

This issue occurs because, on Teradata databases, the COUNT keyword in the default SQL must be enclosed in double quotes. To work around this issue, perform the following steps:

  1. In the Oracle BI Administration Tool, select Marketing from the Manage menu.

  2. Double-click the Target Level and select the Saved Result Sets tab.

  3. Modify the physical SQL that inserts the saved result set header by adding double quotes around the COUNT keyword.

The following example shows SQL code in which double quotes were included around the COUNT keyword.

INSERT INTO M_SR_HEADER (GUID, SEGMENT_PATH, DATE_TIME, TARGET_LEVEL, "COUNT", CREATED_BY) VALUES ('@{guid}', '@{segmentPath}', '@{createdTime}', '@{targetLevel}', @{count}, '@{createdBy}')

1.3.10 Issue with Mapping in Oracle Business Intelligence Applications 7.7.1.x Versions

The "SIL_ListofValuesDimension_MissingActivityCodes" mapping, which was used to move unbounded LOV types from the S_LST_OF_VAL OLTP table to the data warehouse table, is not available. This issue does not affect customers who have only bounded LOV types. However, if customers use unbounded LOV types for sra_resolution_cd, this mapping is still required in the 7.7.1.x versions.

To work around this issue, re-import the old mapping in the new environment and execute it as a one-time job, if your data source is Oracle's Siebel CRM Applications Version 7.7 or higher. If your data source is Oracle's Siebel CRM Applications Version 7.5.3 or lower, then run this job as part of the ETL process.

1.3.11 Issues Inserting Tables on Infa Server in a Non-English Language Deployment

If you run the Informatica server on a non-English language machine and your warehouse is on an Oracle database, then you might encounter an Oracle database problem when the Informatica server inserts records with a decimal data type column into the warehouse. You might see an Oracle error message such as the following one:

Error: ORA-01722: invalid number

To work around this issue, disable the "Enable high precision" option in the session properties. This option determines whether the precision of a decimal port is kept up to 28 digits. When the option is disabled, Informatica keeps a precision of up to only 15 digits for decimal ports. Because Oracle Business Intelligence Applications uses decimal ports for ID columns, a precision of 15 is sufficient.
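A quick way to see why 15 digits are enough: with high precision disabled, a decimal port behaves like an IEEE double, which represents every integer up to 2^53 (about 9 * 10^15) exactly. A minimal Python illustration:

```python
# A 15-digit ID value survives the round trip through a double exactly.
row_wid = 999_999_999_999_999
assert float(row_wid) == row_wid

# Beyond 2**53, consecutive integers are no longer distinguishable,
# which is why 28-digit precision would need true decimal handling.
assert float(2**53 + 1) == float(2**53)
```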

1.3.12 Restoring the Informatica Repository in a Non-English Language Deployment

Due to a limitation in Informatica PowerCenter, you must use the 'PMREPAGENT restore' command to restore the Informatica Repository in a non-English language deployment of Oracle BI Applications (for example, Japanese or Simplified Chinese). If you try to use the Informatica PowerCenter user interface to restore the Informatica Repository in a non-English language deployment of Oracle BI Applications, the process will fail.

To restore the Informatica Repository in a non-English language deployment of Oracle BI Applications, use the 'PMREPAGENT restore' command, as specified in the following sample command:

\OracleBI\dwrep\Informatica\PMREP\PMREPAGENT restore -r <INFA_REP_NAME> -t <INFA_REP_CONNECT_TYPE> -u <INFA_REP_DBOWNER> -p <INFA_REP_PASSWORD> -c <INFA_REP_CONNECT_STR> -h <INFA_REPSERVNAME> -o <INFA_REPSERVPORT> -i <INFA_REP_FILE> -n

For more information about the PMREPAGENT command, see Informatica PowerCenter documentation on installation and configuration.

1.3.13 Contact Geography Attributes in Campaign Contacts_Segmentation Catalog Do Not Join Appropriately

The Contact Geography dimension attributes on the Campaign Contacts_Segmentation catalog, which map to the Person Geography dimension table, are not joined appropriately to the Campaign History Fact table. Therefore, Oracle Business Intelligence Server cannot find an appropriate navigation path when this dimension is used.

To work around this issue, remove the Geography dimension from the Campaign Contacts_Segmentation catalog. If users want to use this dimension, then they can switch to another subject area that has this dimension and reference it there, for example, Customer Profile_Segmentation.

1.3.14 Issue with Reports in Oracle Business Intelligence Presentation Catalog

The following reports within the shipped Oracle BI Presentation Catalog have poor query performance:

  • /shared/Interactive Selling/Pricing/Promotions/Recommended Promotions

  • /shared/Interactive Selling/Pricing/Promotions/Market Basket Analysis - Promotion Recommendations

  • /shared/Marketing/Customer Insight (B2C)/Products/Next Product Purchased

  • /shared/Interactive Selling/Products/Marketing/Next Product Purchased

Do not use these reports in the current release. This issue has no workaround.

1.3.15 Issues With Multi-Source ETL

The DAC Deployment procedure causes Sequence Generator transformations in the ETL to be reset to start from 1 again. This can cause issues when running Source Independent Load (SIL) mappings for different applications and adapters together; these SIL mappings are unable to run together.

To illustrate this limitation, two examples of ETLs being unable to run together are shown below:

  • When the Siebel Vertical adapter and the Oracle EBS adapter are used together, the Siebel adapter leverages the SIL mappings in the SIL_Vert folder, while the Oracle EBS adapter leverages the SIL mappings in the SILOS folder. The sequence generators in the SIL_Vert folder get updated to new values, while the same generators in the SILOS folder do not. As a result, all SILOS mappings for common dimensions (such as Employee and Exchange Rate) fail.

  • The same dimension tables are loaded from mappings in both the SILOS and PLP folders. This results in the same issue as above: when the mappings are run, they fail.

The workaround is to set the Sequence Generator for one of the folders to run between 1 and a sufficiently large finite value (for example, 1,000,000,000), and to set the SILOS folder's Sequence Generator to run between 1,000,000,001 and its maximum limit of 2,000,000,000.
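The workaround amounts to giving each folder a disjoint key range. A small Python check of the boundary values from the text (the range table is illustrative, not actual repository metadata):

```python
# Disjoint Sequence Generator ranges, one per Informatica folder.
ranges = {
    "SIL_Vert": (1, 1_000_000_000),
    "SILOS": (1_000_000_001, 2_000_000_000),
}

def overlaps(a, b):
    """True if two inclusive (start, end) ranges share any value."""
    return a[0] <= b[1] and b[0] <= a[1]

# With disjoint ranges, a surrogate key generated in one folder can
# never collide with one generated in the other.
assert not overlaps(ranges["SIL_Vert"], ranges["SILOS"])
```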

1.3.16 Invalid Expressions In MPLT_SA_ORA_SALESINVOICELINESFACT

The expressions for EXT_SALES_ORDER_NUM, EXT_SALES_ORDER_ITEM, and EXT_ITEM_DETAIL_NUM in MPLT_SA_ORA_SalesInvoiceLinesFact are inconsistent with SALES_ORDLN_ID.

For customers with Versions 7.9, 7.9.1, and 7.9.2, the following steps will fix this issue.

  1. Change the expressions of SALES_ORDER_NUM, SALES_ORDER_LINE and ITEM_DETAIL_NUM in MPLT_SA_ORA_SalesInvoiceLinesFact as follows:

    IIF(INP_INTERFACE_LINE_CONTEXT='ORDER ENTRY', INP_SALES_ORDER, NULL)

    IIF(INP_INTERFACE_LINE_CONTEXT='ORDER ENTRY', INP_SALES_ORDER_LINE, NULL)

    IIF(INP_INTERFACE_LINE_CONTEXT='ORDER ENTRY', INP_ITEM_DETAIL_NUMBER, NULL)

1.3.17 Forecasting Interoperability With Siebel OLTP 8.0

If you are using Siebel OLTP 8.0, you must add a filter in the RPD fiscal layer, as follows:

  1. Display the Oracle BI Administration Tool.

  2. In the physical layer under Forecasting Siebel OLTP connection pool, open the table S_FCST_ITEM_DTL.

  3. Change the Type to Select.

  4. Paste the following SQL into the window.

    S_FCST_ITEM_DTL.AGGR_DIM_NAME_1 IS NULL AND

    S_FCST_ITEM_DTL.AGGR_DIM_NAME_2 IS NULL AND

    S_FCST_ITEM_DTL.AGGR_DIM_NAME_3 IS NULL AND

    S_FCST_ITEM_DTL.AGGR_DIM_NAME_4 IS NULL AND

    S_FCST_ITEM_DTL.AGGR_DIM_NAME_5 IS NULL AND

    S_FCST_ITEM_DTL.AGGR_DIM_NAME_6 IS NULL

  5. Save the changes.

1.3.18 Email Personalization For Siebel 8.0

The email personalization formats that are installed out-of-the-box in Siebel 8.0 do not constrain the generated lists by treatment ID. As a result, when a campaign launch issues the SOAP call to generate list files for a given treatment, everyone who qualifies for the campaign is returned in the list. For example, if a campaign has two email treatments allocated to two different sets of campaign members, this issue causes all campaign members to receive both treatments.

This section explains how to work around this issue.

1.3.18.1 Updating The Repository

This section explains how to update the repository in Oracle BI Administration Tool.

  1. Start the Oracle BI Administration Tool.

  2. In the Physical Layer, add a physical column DCP_ID to the Campaign Promotion physical table in the Marketing OLTP database and specify the following values in the Physical Column dialog:

    • Name: DCP_ID

    • Type: VARCHAR

    • Length: 15

    • Nullable: yes

  3. In the Business Model and Mapping Layer, add a logical column Treatment Id to the OLTP Campaign Promotion logical table in Marketing Contact List business model and specify the following values in the Logical Column dialog:

    • Name: Treatment Id

    • Logical Table Source: S_CAMP_CON

    • Mapped as: "Marketing OLTP".dbo."Campaign Promotion".DCP_ID

  4. In the Business Model and Mapping Layer, add a logical column Treatment Id to the OLTP Campaign Promotion logical table in Marketing Account List business model and specify the following values in the Logical Column dialog:

    • Name: Treatment Id

    • Logical Table Source: S_CAMP_CON

    • Mapped as: "Marketing OLTP".dbo."Campaign Promotion".DCP_ID

  5. In the Presentation Layer, add the Treatment Id presentation column to the Campaign History (Transaction Database) presentation table in the Marketing Contact List presentation catalog and specify the following values in the Presentation Column dialog:

    • Name: Treatment Id

    • Logical Column: "Marketing Contact List"."OLTP Campaign Promotion"."Treatment Id".

  6. In the Presentation Layer, add the Treatment Id presentation column to the Campaign History (Transaction Database) presentation table in the Marketing Account List presentation catalog and specify the following values in the Presentation Column dialog:

    • Name: Treatment Id

    • Logical Column: "Marketing Account List"."OLTP Campaign Promotion"."Treatment Id".

1.3.18.2 Updating The Campaign Load Format and Email Server Format

This section explains how to update the Campaign Load Format and Email Server Format in Siebel Marketing.

  1. Log in to Siebel Marketing.

  2. Add Treatment Id to the Campaign Contact integration component and specify the following values in the Edit Column Formula dialog:

    • Table heading: Campaign Contact

    • Column Heading: Treatment Id

    • Column Formula: '@{treatmentID}{0}'

  3. Add a filter to constrain the output based on the Treatment Id column and specify the following values in the Create/Edit Filter dialog:

    • Operator : is equal to / is in

    • Expression: '@{treatmentID}{0}'

1.3.19 Incorrect Username Displayed In The Greetings Message In Oracle Business Intelligence Interactive Dashboards

The username that appears in the Greetings message in the Oracle Business Intelligence Interactive Dashboards does not display correctly if the source system is Oracle E-Business Suite (EBS) or Oracle's PeopleSoft. The DISPLAY_NAME variable used in the Greetings message in the dashboard header is populated by an initialization block called LOGIN Properties. Out of the box, the connection pool and SQL statements used in this initialization block point to the Siebel OLTP. If you are running Oracle Business Intelligence Applications and your source system is Oracle EBS or PeopleSoft, you must change the connection pool and data source SQL for the LOGIN Properties initialization block, as described below.

To work around this issue:

  1. Open the EnterpriseBusinessAnalytics.rpd file using Oracle BI Administration Tool.

  2. Navigate to Manage > Variables to open the Variables Manager.

  3. Under Session > Initialization Block, select the "LOGIN Properties" initialization block.

  4. Double-click to open the properties dialog box.

  5. Click on the Edit Data Source button.

  6. Click on the Browse button.

  7. In the Select Connection Pool window, select either Oracle EBS OLTP Connection Pool or PeopleSoft OLTP Connection Pool, depending on your OLTP system.

  8. In the Default Initialization String box on the "Session Variable Initialization Block Data Source - LOGIN Properties" window, enter the SQL as appropriate for your source system application and the database platform it is running on.

    Table 1-1 Required SQL Strings For Each Source System Application And Database Combination

    Columns: Source System Application (and Database Platform); SQL Required for Default Initialization String

    Oracle EBS (Oracle RDBMS)

    Select PER.FULL_NAME, 0 from PER_ALL_PEOPLE_F PER,

    FND_USER USR

    WHERE

    USR.USER_NAME= ':USER' AND

    USR.EMPLOYEE_ID=PER.PERSON_ID AND

    (SYSDATE <= USR.END_DATE OR USR.END_DATE IS NULL) AND

    PER.EFFECTIVE_START_DATE <= USR.START_DATE AND

    (USR.START_DATE < PER.EFFECTIVE_END_DATE OR PER.EFFECTIVE_END_DATE IS NULL)

    PeopleSoft (Oracle RDBMS)

    SELECT

    CASE

    WHEN EMPLOYEE_NAME_TODAY.EMPLID IS NULL THEN USR.OPRDEFNDESC

    ELSE EMPLOYEE_NAME_TODAY.NAME

    END DISPLAY_NAME, 0

    FROM

    PSOPRDEFN USR

    LEFT OUTER JOIN (SELECT B.EMPLID, B.NAME

    FROM

    PS_NAMES B, (SELECT EMPLID, MAX(EFFDT) EFFDT

    FROM PS_NAMES

    WHERE NAME_TYPE = 'PRI' AND EFFDT <= SYSDATE

    GROUP BY EMPLID) C

    WHERE B.EMPLID = C.EMPLID AND

    B.EFFDT = C.EFFDT AND

    B.NAME_TYPE = 'PRI')

    EMPLOYEE_NAME_TODAY ON USR.EMPLID = EMPLOYEE_NAME_TODAY.EMPLID

    WHERE USR.OPRID=':USER'

    PeopleSoft (MSSQL RDBMS)

    SELECT

    CASE

    WHEN EMPLOYEE_NAME_TODAY.EMPLID IS NULL THEN USR.OPRDEFNDESC

    ELSE EMPLOYEE_NAME_TODAY.NAME

    END DISPLAY_NAME, 0

    FROM

    PSOPRDEFN USR

    LEFT OUTER JOIN (SELECT B.EMPLID, B.NAME

    FROM

    PS_NAMES B, (SELECT EMPLID, MAX(EFFDT) EFFDT

    FROM PS_NAMES

    WHERE NAME_TYPE = 'PRI' AND EFFDT <= GETDATE()

    GROUP BY EMPLID) C

    WHERE B.EMPLID = C.EMPLID AND

    B.EFFDT = C.EFFDT AND

    B.NAME_TYPE = 'PRI'

    ) EMPLOYEE_NAME_TODAY ON

    USR.EMPLID = EMPLOYEE_NAME_TODAY.EMPLID

    WHERE

    USR.OPRID=':USER'

    PeopleSoft (DB2 RDBMS)

    SELECT

    CASE

    WHEN EMPLOYEE_NAME_TODAY.EMPLID IS NULL THEN USR.OPRDEFNDESC

    ELSE EMPLOYEE_NAME_TODAY.NAME

    END DISPLAY_NAME, 0

    FROM

    PSOPRDEFN USR

    LEFT OUTER JOIN (SELECT B.EMPLID, B.NAME

    FROM

    PS_NAMES B, (SELECT EMPLID, MAX(EFFDT) EFFDT

    FROM PS_NAMES

    WHERE NAME_TYPE = 'PRI' AND EFFDT <= CURRENT TIMESTAMP

    GROUP BY EMPLID) C

    WHERE B.EMPLID = C.EMPLID AND

    B.EFFDT = C.EFFDT AND

    B.NAME_TYPE = 'PRI'

    ) EMPLOYEE_NAME_TODAY ON

    USR.EMPLID = EMPLOYEE_NAME_TODAY.EMPLID

    WHERE

    USR.OPRID=':USER'


1.3.20 Using An ODBC Connection During An Upgrade

This issue applies when upgrading to Oracle Business Intelligence Applications Version 7.9.4 from previous versions of Siebel Analytics (7.5.x or later). While migrating data into the upgraded data warehouse (see the section "Migrating Data into the Upgraded Data Warehouse" in Oracle Business Intelligence Applications Upgrade Guide), if you are connected to an Oracle database and use an ODBC connection instead of the Oracle Native Driver, the upgrade workflows fail with the following error:

CMN_1836 Error: Data for Lookup [LKP_W_POSITION_DH_WID] fetched from the database is not sorted on the condition ports. Please check your database sort order for the lookup condition ports.

The workaround for this issue is to use the Oracle Native driver instead of ODBC.

1.3.21 Errors In Prior Forecast Summary Amounts

The following facts in Sales Forecasting show incorrect values leading to errors in reporting:

  • Prior Forecast Summary Best Case

  • Prior Forecast Summary Worst Case

  • Prior Forecast Summary Revenue

  • Prior Forecast Summary Expected Revenue

  • Prior Forecast Summary Cost

  • Prior Forecast Summary Margin

This issue is caused by the absence of filters that must be applied to the metrics listed above.


Note:

This issue does not affect any out-of-the-box reports or dashboards.

The workaround is to add the column "S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" in the physical layer and make the necessary changes to the metric expressions in the logical layer. Following the detailed steps below filters out the detail rows that are causing the issue.

To add the column "S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" in the physical layer and make the necessary changes:

  1. Open the repository (.rpd file) using the Oracle BI Administration Tool.

  2. In the Physical layer of the repository, locate the table S_FCST_ITEM_DTL (path: Forecasting Siebel OLTP.Catalog.dbo).

  3. Add the column AGGR_DIM_NAME_1 (Data type: varchar; Length: 50) to the table.

  4. In the Business Model and Mapping layer of the repository, look for the metrics listed above under the folder Forecasting.FACTS.

  5. For each metric, do the following:

    1. Right-click the metric and choose Properties.

    2. Highlight the OLTP Prior Forecast Summary under Data Type and choose View….

    3. Wrap the following around the metric expression, substituting the existing expression (including its IFNULL defaults) for <existing expression>:

      CASE WHEN "Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" IS NULL THEN <existing expression> END

    The table below shows the amended expression for each metric.

  6. Save the repository.

Table 1-2 Metric Expressions And Their Values After Being Updated

Metric: Prior Forecast Summary Best Case

Amended Expression:

CASE WHEN "Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" IS NULL
THEN IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."UPSIDE_AMT", 0) *
IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_ETL_EXCH_RATE (Fcst Summary)"."EXCH_RATE", VALUEOF("ETL Unknown Exchange Rate")) END

Metric: Prior Forecast Summary Worst Case

Amended Expression:

CASE WHEN "Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" IS NULL
THEN IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."DOWNSIDE_AMT", 0) *
IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_ETL_EXCH_RATE (Fcst Summary)"."EXCH_RATE", VALUEOF("ETL Unknown Exchange Rate")) END

Metric: Prior Forecast Summary Revenue

Amended Expression:

CASE WHEN "Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" IS NULL
THEN IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."REV_AMT", 0) *
IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_ETL_EXCH_RATE (Fcst Summary)"."EXCH_RATE", VALUEOF("ETL Unknown Exchange Rate")) END

Metric: Prior Forecast Summary Expected Revenue

Amended Expression:

CASE WHEN "Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" IS NULL
THEN IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."REV_AMT" * "Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."WIN_PROB" / 100.0, 0) *
IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_ETL_EXCH_RATE (Fcst Summary)"."EXCH_RATE", VALUEOF("ETL Unknown Exchange Rate")) END

Metric: Prior Forecast Summary Cost

Amended Expression:

CASE WHEN "Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" IS NULL
THEN IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."COST_AMT", 0) *
IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_ETL_EXCH_RATE (Fcst Summary)"."EXCH_RATE", VALUEOF("ETL Unknown Exchange Rate")) END

Metric: Prior Forecast Summary Margin

Amended Expression:

CASE WHEN "Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."AGGR_DIM_NAME_1" IS NULL
THEN IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_FCST_ITEM_DTL"."MARGIN_AMT", 0) *
IFNULL("Forecasting Siebel OLTP"."Catalog"."dbo"."S_ETL_EXCH_RATE (Fcst Summary)"."EXCH_RATE", VALUEOF("ETL Unknown Exchange Rate")) END
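The six amended expressions in Table 1-2 share one pattern: detail rows (where AGGR_DIM_NAME_1 is populated) are excluded by the CASE WHEN wrapper, and for summary rows the amount is defaulted to 0 and multiplied by the exchange rate, itself defaulted when the lookup returns NULL. A minimal Python sketch of that logic follows; the function and argument names are illustrative, not actual repository objects.

```python
def prior_forecast_summary_amount(aggr_dim_name_1, amount, exch_rate,
                                  default_rate=1.0):
    """Sketch of the amended metric pattern from Table 1-2.

    Detail rows (AGGR_DIM_NAME_1 populated) return None, mirroring the
    CASE WHEN ... IS NULL wrapper; summary rows apply IFNULL-style
    defaults before multiplying the amount by the exchange rate.
    """
    if aggr_dim_name_1 is not None:   # CASE WHEN ... IS NULL fails: detail row
        return None
    amount = amount if amount is not None else 0            # IFNULL(amount, 0)
    rate = exch_rate if exch_rate is not None else default_rate  # IFNULL(rate, VALUEOF(...))
    return amount * rate

print(prior_forecast_summary_amount(None, 1000.0, 1.25))       # summary row: 1250.0
print(prior_forecast_summary_amount("Account", 1000.0, 1.25))  # detail row filtered: None
```

The same filter-then-default shape applies to all six metrics; only the source amount column changes.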


1.3.22 Missing Language Folders

The following language folders are missing from the location %:\oraclebidata\disconnected\pharma\messages.

  • l_ar - Arabic

  • l_el - Greek

  • l_hu - Hungarian

  • l_iw - Hebrew

  • l_no - Norwegian

  • l_pl - Polish

  • l_ro - Romanian

  • l_ru - Russian

  • l_sk - Slovakian

  • l_th - Thai

  • l_tr - Turkish

To work around this issue, perform the following steps:

  1. Go to %:\oraclebidata\disconnected\pharma\messages\ and add the following folders:

    • l_ar

    • l_el

    • l_hu

    • l_iw

    • l_no

    • l_pl

    • l_ro

    • l_ru

    • l_sk

    • l_th

    • l_tr

  2. Copy the corresponding _iBotsCaptions.xml and PharmaCaptions.xml files from %:\oraclebidata\web\res\l_XX\Captions\ to %:\oraclebidata\disconnected\pharma\messages\l_XX\.
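The two steps above can also be scripted. The following Python sketch creates each missing folder and copies its caption files; the install root passed to the function is an assumption, so substitute your actual OracleBIData path.

```python
import shutil
from pathlib import Path

LANGS = ["l_ar", "l_el", "l_hu", "l_iw", "l_no", "l_pl",
         "l_ro", "l_ru", "l_sk", "l_th", "l_tr"]
CAPTION_FILES = ["_iBotsCaptions.xml", "PharmaCaptions.xml"]

def restore_language_folders(bi_data_root):
    """Create each missing messages language folder and copy its caption files."""
    root = Path(bi_data_root)
    for lang in LANGS:
        dest = root / "disconnected" / "pharma" / "messages" / lang
        dest.mkdir(parents=True, exist_ok=True)          # step 1: add the folder
        src = root / "web" / "res" / lang / "Captions"   # step 2: copy captions
        for name in CAPTION_FILES:
            if (src / name).exists():
                shutil.copy2(src / name, dest / name)

# Example invocation on Windows (path is an assumption):
# restore_language_folders(r"C:\OracleBIData")
```

Copying with `shutil.copy2` preserves file timestamps, which keeps the copied caption files consistent with their source versions.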

1.3.23 Extraneous Dimension in Campaign Contacts_Segmentation Subject Area

The Campaign Contacts_Segmentation Subject Area contains an 'Industry Name' dimension that was not intended to be included in this Subject Area. Using this dimension in an analysis results in a metadata inconsistency error. Oracle recommends that you remove the column from the Presentation Subject Area or advise users not to use it.

1.3.24 Correct Resolution of Exchange Rates While Extracting 'STAT' Journals From General Ledger

This issue is specific to Oracle EBS adapters used in conjunction with General Ledger Analytics.

During the ETL run, the ETL process tries to resolve exchange rates from each transaction's currency to the warehouse's global currencies. If you extract 'Statistics' journals from the General Ledger, the process attempts to resolve exchange rates for those transactions as well. Because a 'Statistics' journal's transaction currency is 'STAT', which is not a real currency code, the lookup fails and the exchange rate defaults to 0. The global amounts for these transactions therefore all appear as 0, because Oracle Business Intelligence Applications calculates a global amount by multiplying the transaction amount by the global exchange rate.

Typically, however, you want the global amounts for these statistical transactions to equal their original transaction amounts. To achieve this, perform the task below to add a special case for the 'STAT' currency: when the transaction currency is 'STAT', the returned exchange rate is always 1.
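The intended resolution order can be sketched outside Informatica as follows. This is an illustration only, not the mapplet itself; `lookup_exch_rate` is a hypothetical stand-in for the LKP_W_EXCH_RATE lookup, and the sample rate table is made up.

```python
def resolve_exch_rate(doc_curr, target_curr, lookup_exch_rate):
    """Return the document-to-target exchange rate, treating 'STAT' as rate 1."""
    if target_curr is None:
        return None
    if doc_curr == target_curr:
        return 1.0
    if doc_curr == "STAT":            # statistical journals: keep amounts as-is
        return 1.0
    return lookup_exch_rate(doc_curr, target_curr)

rates = {("EUR", "USD"): 1.3}
lookup = lambda d, t: rates.get((d, t), 0.0)   # unresolved rates default to 0

print(resolve_exch_rate("EUR", "USD", lookup))   # 1.3
print(resolve_exch_rate("STAT", "USD", lookup))  # 1.0, so global amount == transaction amount
```

The 'STAT' branch sits before the lookup, mirroring where the new IIF(DOC_CURR_CODE = 'STAT', 1.0, ...) condition is inserted in the expressions below.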

To specify a special case to handle the 'STAT' currency:

  1. Log into Informatica Designer.

  2. Open the SILOS folder.

  3. Locate the reusable mapplet MPLT_CURCY_CONVERSION_RATES and open it.

  4. Open the EXPT_CALC_EXCH_RATES expression transformation.

  5. Use the Expression Editor dialog to change the Expression value of the variables listed in the table below from the value in the Old Expression Value column to the value in the New Expression Value column.

    Table 1-3 EXPT_CALC_EXCH_RATES Variables and Their Old and New Expression Values

    Variable: DOC_TO_LOC_EXCH_RATE_VAR

    Old Expression Value:

    IIF(ISNULL(DOC_TO_LOC_EXCH_RATE),
    IIF(LOC_CURR_CODE = DOC_CURR_CODE, 1.0,
    IIF(ISNULL(LOC_CURR_CODE), NULL, :LKP.LKP_W_EXCH_RATE(DOC_CURR_CODE,LOC_CURR_CODE,EXCH_DT,LOC_RATE_TYPE_VAR,DATASOURCE_NUM_ID))),
    DOC_TO_LOC_EXCH_RATE)

    New Expression Value:

    IIF(ISNULL(DOC_TO_LOC_EXCH_RATE),
    IIF(LOC_CURR_CODE = DOC_CURR_CODE, 1.0,
    IIF(DOC_CURR_CODE = 'STAT', 1.0,
    IIF(ISNULL(LOC_CURR_CODE), NULL, :LKP.LKP_W_EXCH_RATE(DOC_CURR_CODE,LOC_CURR_CODE,EXCH_DT,LOC_RATE_TYPE_VAR,DATASOURCE_NUM_ID)))),
    DOC_TO_LOC_EXCH_RATE)

    Variable: DOC_TO_GLOBAL1_EXCH_RATE_VAR

    Old Expression Value:

    IIF(ISNULL(GLOBAL1_CURR_CODE), NULL,
    IIF(GLOBAL1_CURR_CODE = DOC_CURR_CODE, 1.0,
    IIF(GLOBAL1_CURR_CODE = LOC_CURR_CODE, DOC_TO_LOC_EXCH_RATE_VAR,
    :LKP.LKP_W_EXCH_RATE(DOC_CURR_CODE, GLOBAL1_CURR_CODE, EXCH_DT,GLOBAL1_RATE_TYPE, DATASOURCE_NUM_ID))))

    New Expression Value:

    IIF(ISNULL(GLOBAL1_CURR_CODE), NULL,
    IIF(GLOBAL1_CURR_CODE = DOC_CURR_CODE, 1.0,
    IIF(DOC_CURR_CODE = 'STAT', 1.0,
    IIF(GLOBAL1_CURR_CODE = LOC_CURR_CODE, DOC_TO_LOC_EXCH_RATE_VAR,
    :LKP.LKP_W_EXCH_RATE(DOC_CURR_CODE, GLOBAL1_CURR_CODE, EXCH_DT,GLOBAL1_RATE_TYPE, DATASOURCE_NUM_ID)))))

    Variable: DOC_TO_GLOBAL2_EXCH_RATE_VAR

    Old Expression Value:

    IIF(ISNULL(GLOBAL2_CURR_CODE), NULL,
    IIF(GLOBAL2_CURR_CODE = DOC_CURR_CODE, 1.0,
    IIF(GLOBAL2_CURR_CODE = LOC_CURR_CODE, DOC_TO_LOC_EXCH_RATE_VAR,
    :LKP.LKP_W_EXCH_RATE(DOC_CURR_CODE, GLOBAL2_CURR_CODE, EXCH_DT,GLOBAL2_RATE_TYPE, DATASOURCE_NUM_ID))))

    New Expression Value:

    IIF(ISNULL(GLOBAL2_CURR_CODE), NULL,
    IIF(GLOBAL2_CURR_CODE = DOC_CURR_CODE, 1.0,
    IIF(DOC_CURR_CODE = 'STAT', 1.0,
    IIF(GLOBAL2_CURR_CODE = LOC_CURR_CODE, DOC_TO_LOC_EXCH_RATE_VAR,
    :LKP.LKP_W_EXCH_RATE(DOC_CURR_CODE, GLOBAL2_CURR_CODE, EXCH_DT,GLOBAL2_RATE_TYPE, DATASOURCE_NUM_ID)))))

    Variable: DOC_TO_GLOBAL3_EXCH_RATE_VAR

    Old Expression Value:

    IIF(ISNULL(GLOBAL3_CURR_CODE), NULL,
    IIF(GLOBAL3_CURR_CODE = DOC_CURR_CODE, 1.0,
    IIF(GLOBAL3_CURR_CODE = LOC_CURR_CODE, DOC_TO_LOC_EXCH_RATE_VAR,
    :LKP.LKP_W_EXCH_RATE(DOC_CURR_CODE, GLOBAL3_CURR_CODE, EXCH_DT,GLOBAL1_RATE_TYPE, DATASOURCE_NUM_ID))))

    New Expression Value:

    IIF(ISNULL(GLOBAL3_CURR_CODE), NULL,
    IIF(GLOBAL3_CURR_CODE = DOC_CURR_CODE, 1.0,
    IIF(DOC_CURR_CODE = 'STAT', 1.0,
    IIF(GLOBAL3_CURR_CODE = LOC_CURR_CODE, DOC_TO_LOC_EXCH_RATE_VAR,
    :LKP.LKP_W_EXCH_RATE(DOC_CURR_CODE, GLOBAL3_CURR_CODE, EXCH_DT,GLOBAL1_RATE_TYPE, DATASOURCE_NUM_ID)))))

  6. Save the changes.