Oracle® Business Intelligence Applications Fusion Edition Release Notes, Version 7.9.5, Part Number E12087-08
These release notes describe known issues and workarounds for Oracle Business Intelligence Applications Version 7.9.5, and contain the following sections:
These release notes are updated periodically as new information becomes available. To ensure that you are reading the latest version of the release notes, check the Oracle Business Intelligence Applications Documentation Web site:
This section describes general issues and workarounds for Oracle Business Intelligence Applications products. It contains the following topics:
For certification information, refer to the System Requirements and Supported Platforms for Oracle Business Intelligence Applications Fusion Edition document. This document is part of the Oracle Business Intelligence Applications documentation set, which is available at the following location:
This section provides release notes on installing and upgrading Oracle Business Intelligence Applications. It contains the following topics:
Section 1.2.2.1, "Maintain Side-By-Side Environments When Upgrading"
Section 1.2.2.2, "Installation Error Caused By Invalid Characters In Installation Directory Names"
If you are upgrading from a previous release, then include in your upgrade preparations plans for side-by-side instances of the entire Oracle BI Applications environment, not just the Oracle Business Intelligence Presentation Catalog.
Enabling side-by-side instances of the entire Oracle BI Applications environment is a critical success factor for upgrade.
As part of the overall Oracle Business Intelligence Applications installation, the OBIEE Platform directories with the default names 'OracleBI' and 'OracleBIData' are created. You can choose to override these default names and use different names for these directories. However, if you include the characters 'period' (.) or 'underscore' (_) in the directory names, the Oracle Business Intelligence Applications installation process throws an error, even though these characters are legal in Windows directory names.
Workaround
Oracle recommends that you use the default directory names (that is, 'OracleBI' and 'OracleBIData'). If you do not use the default directory names, avoid using periods and underscores in the directory names that you choose.
This section provides corrections for various pieces of the documentation set for Oracle Business Intelligence Applications. It contains the following topics:
Section 1.2.3.2, "Corrections to Oracle Business Intelligence Applications Upgrade Guide"
Section 1.2.3.4, "Corrections to Oracle Business Analytics Warehouse Data Model Reference"
Note the following corrections to the Oracle Business Intelligence Applications Installation and Configuration Guide:
In Section 10.2.5.2.2 About the Handling of Booked and Nonbooked Orders in the Order Lines and Bookings Table, the task 'To include nonbooked orders in the Sales Order Lines tables' is missing the following two steps:
8. Add 'W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y'' (preceded by AND if there are existing filters) to the Source Filter field inside the Source Qualifier transformation in the following mappings:
SIL_SalesBookingLinesFact_Load_OrderLine_Credit
SIL_SalesBookingLinesFact_Load_OrderLine_Debit
9. Add 'W_SALES_SCHEDULE_LINE_F.BOOKING_FLG = 'Y'' (preceded by AND if there are existing filters) to the Source Filter field inside the Source Qualifier transformation in the following mappings:
SIL_SalesBookingLinesFact_Load_ScheduleLine_Credit
SIL_SalesBookingLinesFact_Load_ScheduleLine_Debit
If you do not perform Steps 8 and 9, the following error occurs: ORA-00001: unique constraint (OBAW.W_SLS_BKG_LN_F_U1) violated.
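For example, if the Source Qualifier in SIL_SalesBookingLinesFact_Load_OrderLine_Credit already contains a filter (the DELETE_FLG condition below is hypothetical and stands in for whatever filter already exists), the combined Source Filter value would be:

W_SALES_ORDER_LINE_F.DELETE_FLG = 'N' AND W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y'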
All references to 'Dim - Security Dimension' should read 'Dim - Position Security'.
In Table 3-1, Linux should be included in the list of supported operating systems for the Oracle Business Analytics Warehouse and the ETL Repositories.
In Chapter 3, the following changes are required:
The following note should be removed from the list at the start of the chapter: "For information about Oracle-specific database settings, see System Requirements and Supported Platforms for Oracle Business Intelligence Applications Fusion Edition."
A new section called "Oracle-Specific Database Guidelines for Oracle Business Analytics Warehouse" must be added as follows:
Note: Oracle Business Intelligence Applications Version 7.9.4 does not support Oracle 11g source systems.
To configure the Oracle Business Analytics Warehouse on Oracle databases more easily, refer to the following init.ora parameter template file, located in <DRIVE>:\<BI Apps install directory>\dwrep\Documentation\. For example, C:\OracleBI\dwrep\Documentation\.
init10gR2.ora - init.ora template for Oracle RDBMS 10gR2
The init.ora parameter template file provides parameter guidelines based on the cost-based optimizer for Oracle 10g. Use these guidelines as a starting point; you will need to make changes based on your specific database sizes, data shape, server size (CPU and memory), and type of storage. The database administrator should adjust the settings based on performance monitoring and tuning.
Copy the template file corresponding to your database version into your $ORACLE_HOME/dbs directory, review the recommendations in the template file, and make the changes that are appropriate for your environment.
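The following fragment illustrates the kind of warehouse-oriented settings such a template contains. These example values are assumptions for illustration only and are not the shipped recommendations; always start from the init10gR2.ora file itself:

# Illustrative init.ora fragment (example values, not the shipped template)
star_transformation_enabled = true
query_rewrite_enabled = true
query_rewrite_integrity = trusted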
Sections 4.1, 4.2, and 4.3 erroneously state that the Oracle Business Intelligence Infrastructure that is required for Oracle Business Intelligence Applications can be Oracle Business Intelligence Standard Edition One. The Oracle Business Intelligence Infrastructure must be Oracle Business Intelligence Enterprise Edition.
In Sections 4.8.1 and 4.11.1 it is stated that JDK 1.5 or higher must be installed. It should state that JDK 1.5.x must be installed.
The task list in Section 4.2 "Installation and Configuration Process Task List" has the following corrections and amendments:
Step 1 should read as follows:
1. Before you start to install Oracle Business Intelligence Applications and Informatica PowerCenter Services, make sure that you have satisfied the following requirements:
a) Make sure that you satisfy the Oracle Business Intelligence Infrastructure requirements that are specified in Section 4.3.1 Oracle Business Intelligence Infrastructure Requirements.
b) Make sure that you satisfy the Informatica PowerCenter requirements that are specified in Section 4.3.2 Informatica PowerCenter Requirements.
c) Make sure that you satisfy the Code Page requirements that are specified in Section 4.3.3 Code Page and Data Movement Requirements.
d) Make sure that you perform the mandatory pre-installation tasks that are specified in Section 4.3.4 Pre-installation Tasks.
Step 11 should read: Create Relational Connections in Informatica Workflow Manager, as specified in Section 4.13 Configuring Relational Connections In Informatica Workflow Manager.
After Step 11, there should be a new Step 12, as follows:
New Step 12 - Configure the SiebelUnicodeDB Custom Property, as specified in Section 4.14 Configuring the SiebelUnicodeDB Custom Property.
The name of the Oracle Business Intelligence Applications repository (RPD) file should be OracleBIAnalyticsApps.rpd, not EnterpriseBusinessAnalytics.rpd as stated.
Table 4-2 in Section 4.7.3 "Setting PowerCenter Integration Services Custom Properties" has the following corrections:
The table should not include the property ValidateDataCodePages. The ValidateDataCodePages property is set in section 4.7.2 Setting PowerCenter Integration Services Relaxed Code Page Validation.
The table should not include the property SiebelUnicodeDB. This property must be set in a later step and is described in section 4.14. Configuring the SiebelUnicodeDB Custom Property.
The overrideMpltVarWithMapVar property is missing the following description: Allows Informatica to perform evaluation of parameters within mapplets.
For the DisableDB2BulkMode entry, the Notes column should read as follows:
Add this custom property and set the value to Yes if your Oracle Business Analytics Warehouse is on a DB2/390 or DB2 UDB database.
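For illustration, the entry as added to the Integration Services custom properties would be as follows (the property name spelling is as corrected above; verify it against your Informatica version):

Name: DisableDB2BulkMode
Value: Yes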
Step 3 in Section 4.12.1 "How to Set DAC System Properties" should read:
3. Set values for the following properties:
DAC Server Host – enter the name or IP address of the machine that hosts the DAC Server.
DAC Server OS – enter the operating system of the machine that hosts the DAC Server. Possible values are Windows, AIX, Solaris, HP-UX, Linux.
Note: Values are case sensitive.
InformaticaParameterFileLocation – enter the path to the Informatica source file directory (that is, the \SrcFiles directory). For example, C:\Informatica\PowerCenter8.11\server\infa_shared\SrcFiles.
Note: For a list of DAC System Properties and their descriptions, see Section A.7.1, "How to Set Up DAC System Properties".
Section 4.15 Miscellaneous and Additional Configuration Tasks should include the following task:
How to Set Up the HOSTS File for Teradata Installations on UNIX
If you are using a Teradata database, you need to set the TDPID parameter on the machine where the Informatica Server is installed by adding an entry in the HOSTS file.
To set up the HOSTS file for Teradata installations on UNIX
In the HOSTS file, enter a line in the following format:
<IP address of remote server> <remote server> <remote server>COP<n>
For example:
172.20.176.208 tdatsvr tdatsvrCOP1
Where tdatsvrCOP1 is the alias for the remote server. The alias must begin with an alphabetic string and end with the COP<n> suffix, where <n> is a number between 1 and the total number of applications processors that are associated with the Teradata communications processor.
Save the HOSTS file.
For more information about setting the TDPID parameter, refer to the Teradata documentation.
Section 10.2.2.10 How to Configure the Bill of Materials Explosion is obsolete, and should be replaced with the steps in Section 1.3.31, "No Data Is Loaded Into W_BOM_ITEM_F And W_BOM_ITEM_FS".
Section 10.2.5.2.17 How to Configure the Left Bound and Right Bound Calculation Option should include the following Note:
Note: Before you run an ETL using a BOM, you must compile and deploy the SQL code in Compute_Bounds_Ora11i.sql (for more information, see Section 8.2.1.1, "How to Deploy Stored Procedures").
Note the following corrections to the Oracle Business Intelligence Applications Upgrade Guide:
Section 5.6 "Upgrading the DAC Repository", Section 6.6 "Upgrading the DAC Repository", and Section 7.6 "Upgrading the DAC Repository", should read as follows:
Before you begin this procedure, do the following:
Determine what customizations were made to your existing DAC repository.
Make sure you have renamed and backed up your existing DAC repository into a different database. When you back up the DAC repository, you export the DAC metadata, in XML format (using the DAC Export tool), into a folder other than the standard DAC export folder where backups are stored (DAC\export). For instructions on exporting DAC metadata, see the Oracle Business Intelligence Data Warehouse Administration Console Guide.
To upgrade the DAC metadata repository
Upgrade your current DAC repository tables to be compatible with the new DAC repository release.
a. Configure the new DAC client version to read the DAC metadata from your existing DAC repository.
b. Log in to the DAC and select Yes if prompted to upgrade the repository tables.
Export the custom applications from your custom repository to a folder other than the standard DAC export folder (DAC\export) or the folder into which you backed up metadata in the previous format.
Import the new DAC metadata for your application from the standard DAC export folder (DAC\export). Select the Truncate Repository Tables check box.
For instructions on importing DAC metadata, see the Oracle Business Intelligence Data Warehouse Administration Console Guide.
Import the customized DAC metadata that you exported in Step 2, and deselect the Truncate Repository Tables check box.
This will append all the custom data to the repository, thus bringing in the customizations, which include the following:
- All modified data.
- All newly created custom data.
- All deleted data.
Refresh the source system container to locate any missing objects in your customized application. The missing objects are any new objects that the preconfigured applications may have that are not referenced in the custom applications.
a. In the DAC menu bar, select File, then select Refresh Source System Containers. A list of available source system containers appears in the container list to the right of the View buttons.
b. Select the appropriate source system container from the list, and click OK. Confirm that you want to refresh the source system container.
Rebuild all the subject areas and execution plans in your customized application to include any new changes in the object dependency. For information about building subject areas and execution plans, see the Oracle Business Intelligence Data Warehouse Administration Console Guide.
Not applicable to Oracle Business Intelligence Applications Version 7.9.5.
Note the following corrections to the Oracle Business Analytics Warehouse Data Model Reference:
Section 2.2.2 Fact Tables in Oracle Business Analytics Warehouse should read as follows:
A fact table in the Oracle Business Analytics Warehouse has a surrogate key only if the records in the fact table need to be updated or if the fact table has an aggregate table on top of it. Each fact table also contains one or more numeric foreign key columns to link it to various dimension tables.
This section provides release notes for Oracle Business Intelligence Applications in general. It contains the following topics:
Section 1.3.1, "Issue Including Subject Areas from Two Different Sources into One Execution Plan"
Section 1.3.2, "Incremental Aggregation Tasks Can Run Slowly in Multisource ETL Situation"
Section 1.3.4, "Lack of Time Zone Setting Prevents Delivery of iBots to Applications Users"
Section 1.3.5, "Large Dimensional Tables Causing Slow Fact Lookups"
Section 1.3.7, "Issue with Exchange Rates and Transaction Currencies"
Section 1.3.8, "Incorrect Default Physical SQL for Saved Result Sets on Teradata"
Section 1.3.9, "Issue with Mapping in Oracle Business Intelligence Applications 7.7.1.x Versions"
Section 1.3.10, "Issues Inserting Tables on Informatica Server in a Non-English Language Deployment"
Section 1.3.11, "Restoring the Informatica Repository in a Non-English Language Deployment"
Section 1.3.13, "Issue with Reports in Oracle Business Intelligence Presentation Catalog"
Section 1.3.15, "Invalid Expressions In MPLT_SA_ORA_SALESINVOICELINESFACT"
Section 1.3.16, "Forecasting Interoperability With Siebel OLTP 8.0"
Section 1.3.20, "Extraneous Dimension in Campaign Contacts_Segmentation Subject Area"
Section 1.3.21, "Changes In Calculating '# of offers' Metric In Marketing Analytics"
Section 1.3.22, "Restoring A Repository In Informatica In A Non-English Operating System"
Section 1.3.23, "Position DH Mapping Login Column Truncation"
Section 1.3.24, "Double Counting in the Forecasting Subject Area in Oracle Sales Analytics"
Section 1.3.25, "Error In Reports Based on 'Opportunity' Under Opportunity Contact Segmentation"
Section 1.3.26, "Filter Non-position Based Forecasting Records In Siebel 8.1"
Section 1.3.28, "Bitmap Index W_GL_REVN_F_F57 Incorrectly Marked As Type ETL"
Section 1.3.29, "Installation Errors With Oracle Applications Server Advanced Security Option"
Section 1.3.30, "SDE_ORA_ExchangeRateGeneral Failure With Oracle DB Version 8i"
Section 1.3.31, "No Data Is Loaded Into W_BOM_ITEM_F And W_BOM_ITEM_FS"
Section 1.3.32, "Issue with DB2 9.1 Databases During Full ETL Loads"
Section 1.3.33, "Issue with DB2 Databases Running Pharma Analytics on AIX and UNIX Platforms"
Section 1.3.34, "'STAT' Currency Journals or Accounts in Financial Analytics"
Section 1.3.35, "FIND_AUDIT_VALUES Transformation In SDE_OPTYSTGFACT Missing in Teradata Repository"
Section 1.3.36, "Division Name Pointing To An Obsolete Column In Oracle Sales Analytics"
Data Warehouse Administration Console allows customers to build execution plans that comprise subject areas from different source system containers. However, this is not supported in this release. For example, you cannot have a subject area from Oracle Applications 11.5.8 and Siebel Applications 7.8 in the same execution plan.
To work around this issue, create separate execution plans for subject areas in different containers. For example, suppose an execution plan has subject areas such as Sales, Inventory, and Receivables that come from an Oracle Applications 11.5.8 container. If the Sales subject area in Siebel Applications 7.8 must be populated, then it requires another execution plan.
Data Warehouse Administration Console runs full load or incremental load commands based on the dates when data was last extracted from a source table and when data was loaded into a target table.
Dependence on last extract or last load dates can become an issue when loading data from two discrete source systems for the first time. ETL from the first source runs as a full extract and full load. ETL from the second source runs as a full extract and as an incremental load. This can cause the second load to be slower than the first, because full load mappings are simple while incremental mappings (especially for aggregate tables) must handle delta aggregation, manipulation, and other complicated situations. Thus incremental loads are costlier.
Slow performance when running incremental ETL is not noticeable in daily ETL runs because of lower data volumes. But, in situations where data is extracted and loaded for the first time from two source systems one after the other, there will be a noticeable slowing down of the second load due to the high data volumes loaded in an incremental mode.
For example, suppose that three years of data is extracted and loaded from Oracle Applications 11.5.8 followed by a three-year load of data from Oracle Applications 11.5.10. The load from Oracle Applications 11.5.10 source will be noticeably slower.
This issue has no workaround.
The Oracle Business Intelligence Delivers iBots use a predefined query against the SA System subject area to retrieve a list of applications users who are associated with the iBot's Recipient Group. When an iBot is run, this predefined query will fail with an error message that is similar to the following one:
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 27005] Unresolved column: "Time Zone".
The query error prevents the Oracle BI Delivers iBot from generating and delivering any content for the applications users that are associated with the iBot's Recipient Group. This impacts all iBot destination types including Interactive Dashboards and Emails and Disconnected Application cache generation for Oracle Business Intelligence Disconnected Analytics. The issue occurs because the 'Time Zone' presentation column name is incorrectly spelled as 'Timezone' in the SA System subject area.
To work around this issue use the Oracle Business Intelligence Administration Tool to change the name of the existing 'Timezone' presentation column in the SA System presentation catalog to 'Time Zone'.
For information on the SA System subject area, see the Oracle Business Intelligence Server Administration Guide.
The Oracle BI Delivers iBots use a predefined query against the SA System subject area to retrieve a list of applications users who are associated with the iBot's Recipient Group. When an iBot is run, users who do not have a time zone specified in user preferences are considered invalid users and iBots are not delivered to them.
This issue occurs because the join type for the S_TIMEZONE join in the S_USER Logical Table Source in the SA System Business Model is defined as INNER when it should be defined as RIGHT OUTER.
To work around this issue, perform these steps:
Display the Oracle Business Intelligence Administration Tool.
In the Business Model and Mapping layer, expand the SA System Business Model and the USER Logical Table.
Double-click the S_USER Logical Table Source under Sources in the USER Logical Table.
In the Logical Table Source - S_USER Dialog Box, change the type to RIGHT OUTER from INNER for the S_TIMEZONE join in the Joins section of the General tab.
For information on the SA System subject area, see the Oracle Business Intelligence Server Administration Guide.
Several dimension tables that have more than 255 columns cause fact lookups on these dimensions to perform slowly and the lookup database server to run out of memory. The following tables have been identified as experiencing this issue:
W_CUSTOMER_FIN_PROFL_D
W_CUSTOMER_LOC_D
W_CUSTOMER_LOC_USE_D
W_ORG_D
W_PRODUCT_D
W_SALES_PRODUCT_D
W_SUPPLIER_PRODUCT_D
To work around this issue and improve performance, complete the following steps:
Edit the slow running session.
For all transformations that look up the large dimensions that are listed in this section, change the following settings:
Index cache size = 10 MB
Data cache size = 20 MB
This is a session-side change. No modifications are required in mapping.
Data Warehouse Administration Console automatically drops and recreates the registered indexes on the primary target table that is defined for a task for better ETL performance if the primary target table is truncated in full or incremental load. Data Warehouse Administration Console also gathers statistics on the ETL indexes automatically after recreating these indexes on this table. Statistics help the database optimizer to select the best access path to query the data.
The underlying workflow for this task's full and incremental load command is SDE_ListOfValues, which consists of the SIL_ListOfValuesGeneral and SIL_CopyListOfValuesToOLTP sessions. The first session populates the W_LST_OF_VAL_G table, and the second session populates the S_ETL_LOV table from the W_LST_OF_VAL_G table. Because the S_ETL_LOV table is defined as the primary table for this task, Data Warehouse Administration Console does not analyze the W_LST_OF_VAL_G table.
Even if this task is modified to include the W_LST_OF_VAL_G table as another primary table, Data Warehouse Administration Console will not analyze this table until after executing the entire workflow that comprises these two sessions. This causes the task to hang while it extracts rows from the unanalyzed W_LST_OF_VAL_G table to populate the S_ETL_LOV table.
To work around this issue:
If necessary, make a copy of the deployed container where you want to apply this workaround. You cannot make changes to the out-of-the-box container.
Create two new workflows in the Informatica Workflow Manager: one for the SIL_ListOfValuesGeneral session and the other for the SIL_CopyListOfValuesToOLTP session. The workflow name should match the session name that is used inside the workflow.
Make two copies of the Fix List Of Values and Copy to OLTP task in Data Warehouse Administration Console. Rename the first copy as Fix List Of Values and the second copy as Copy List Of Values to OLTP. Verify that the properties for both of these tasks are set as follows:
Task 1: Fix List of Values
Command for Incremental Load: SIL_ListOfValuesGeneral
Command for Full Load: SIL_ListOfValuesGeneral
Folder Name: Extract
Primary Source: DBConnection_OLAP
Primary Target: DBConnection_OLTP
Task Phase: General
Execution Type: Informatica
Priority: 5
Source Tables: W_LST_OF_VAL_G (Primary), W_PARAM_G (Lookup)
Target: W_LST_OF_VAL_G (Primary).
Ensure that Truncate Always and Truncate For Full Load options are not selected.
Task 2: Copy List of Values to OLTP
Command for Incremental Load: SIL_CopyListOfValuesToOLTP
Command for Full Load: SIL_CopyListOfValuesToOLTP
Folder Name: Extract
Primary Source: DBConnection_OLAP
Primary Target: DBConnection_OLTP
Task Phase: General
Execution Type: Informatica
Priority: 5
Source Tables: W_LST_OF_VAL_G (Primary)
Target: S_LST_OF_VAL (Primary)
Ensure that Truncate Always and Truncate For Full Load options are selected.
Edit the TASK_GROUP_Extract_lst_of_val task group by removing the original Fix List Of Values and Copy to OLTP task and adding the newly created tasks Fix List Of Values and Copy List Of Values to OLTP to this task group. Set the execution order for these tasks as follows:
Fix List of Values: 1
Copy List Of Values to OLTP: 2
Reassemble the appropriate subject area to ensure that these new tasks are included in the subject area.
Rebuild the execution plan in which these tasks must be executed.
Current design and support of multiple currencies within the Oracle BI Applications and the data warehouse assumes that the transactional system (or OLTP system) provides exchange rates and table structures that store exchange rates.
If the OLTP system does not provide exchange rates from the transaction currency to the chosen data warehouse currencies, then the fact table will have a null exchange rate value for the conversion from the transaction currency to the 'Global1' currency, and analysis based on global currencies will not be possible for these transactions. This also affects the correctness of the data in the various aggregate tables. The same issue applies to the other two supported currencies (Global2 and Global3).
To work around this issue, ensure up front that the OLTP system has exchange rates from all possible transaction currencies to all three chosen data warehouse currencies. If this is not done beforehand and you encounter a missing exchange rate, you can rerun the load in 'full' mode after you have fixed the missing exchange rates in the OLTP system.
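As a sketch only, a query such as the following can help identify transaction currencies with missing Global1 rates. The fact table and column names used here (W_SALES_ORDER_LINE_F, DOC_CURR_CODE, GLOBAL1_EXCHANGE_RATE) are assumptions based on common Oracle Business Analytics Warehouse naming and must be verified against your schema:

SELECT DOC_CURR_CODE, COUNT(*) AS MISSING_RATE_ROWS
FROM W_SALES_ORDER_LINE_F
WHERE GLOBAL1_EXCHANGE_RATE IS NULL
GROUP BY DOC_CURR_CODE;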
When you set up segment Saved Result Sets in the Marketing metadata on a Teradata data source, the default SQL for inserting the saved result set header has incorrect syntax. You will receive the following error message when trying to save a result set for a segment:
Odbc driver returned an error (SQLExecDirect). State: 37000. Code: 10034. [NQODBC] [SQL_STATE: 37000] [nQSError: 10034] The SQL statement has a syntax error. [nQSError: 43093] An error occurred while processing the EXECUTE PHYSICAL statement. [nQSError: 16001] ODBC error state: 37000 code: -3706 message: [NCR][ODBC Teradata Driver][Teradata RDBMS]Syntax error: expected something between the 'COUNT' keyword and ','.. [nQSError: 16014]
This issue is caused by the fact that in the default SQL, the COUNT keyword requires double quotes on Teradata databases. To work around this issue, perform the following steps:
In the Oracle BI Administration Tool, select Marketing from the Manage menu.
Double-click the Target Level and select the Saved Result Sets tab.
Modify the physical SQL that inserts the saved result set header by adding double quotes around the COUNT keyword.
The following example shows SQL code in which double quotes were included around the COUNT keyword.
INSERT INTO M_SR_HEADER (GUID, SEGMENT_PATH, DATE_TIME, TARGET_LEVEL, "COUNT", CREATED_BY) VALUES ('@{guid}', '@{segmentPath}', '@{createdTime}', '@{targetLevel}', @{count}, '@{createdBy}')
The SIL_ListofValuesDimension_MissingActivityCodes mapping, which was used to move unbounded LOV types from the S_LST_OF_VAL OLTP table to the data warehouse table, is not available. This issue does not affect customers who have bounded LOV types. However, if customers use unbounded LOV types for sra_resolution_cd, this mapping is still required in the 7.7.1.x versions.
To work around this issue, re-import the old mapping in the new environment and execute it as a one-time job, if your data source is Oracle's Siebel CRM Applications Version 7.7 or higher. If your data source is Oracle's Siebel CRM Applications Version 7.5.3 or lower, then run this job as part of the ETL process.
If you use the Informatica server on a non-English language machine and your warehouse is on an Oracle database, then you might encounter an Oracle database problem when the Informatica server tries to insert records with a decimal data type column into the warehouse. You might see an Oracle error message such as the following one:
Error: ORA-01722: invalid number
To work around this issue, disable the 'Enable high precision' option in the session property. This option indicates whether to keep the precision of a decimal port up to 28 digits. If this option is disabled, then Informatica keeps a precision of up to only 15 digits for decimal ports. Because Oracle Business Intelligence Applications uses decimal ports for ID columns, a precision of 15 is sufficient.
Due to a limitation in Informatica PowerCenter, you must use the 'PMREPAGENT restore' command to restore the Informatica Repository in a non-English language deployment of Oracle BI Applications (for example, Japanese or Simplified Chinese). If you try to use the Informatica PowerCenter user interface to restore the Informatica Repository in a non-English language deployment of Oracle BI Applications, the process will fail.
To restore the Informatica Repository in a non-English language deployment of Oracle BI Applications, use the 'PMREPAGENT restore' command, as specified in the following sample command:
\OracleBI\dwrep\Informatica\PMREP\PMREPAGENT restore -r <INFA_REP_NAME> -t <INFA_REP_CONNECT_TYPE> -u <INFA_REP_DBOWNER> -p <INFA_REP_PASSWORD> -c <INFA_REP_CONNECT_STR> -h <INFA_REPSERVNAME> -o <INFA_REPSERVPORT> -i <INFA_REP_FILE> -n
For more information about the PMREPAGENT command, see Informatica PowerCenter documentation on installation and configuration.
The Contact Geography dimension attributes on the Campaign Contacts_Segmentation catalog, which map to the Person Geography dimension table, are not joined appropriately to the Campaign History Fact table. Therefore, Oracle Business Intelligence Server cannot find an appropriate navigation path when this dimension is used.
To work around this issue, remove the Geography dimension from the Campaign Contacts_Segmentation catalog. If users want to use this dimension, then they can switch to another subject area that has this dimension and reference it there, for example, Customer Profile_Segmentation.
The following reports within the shipped Oracle BI Presentation Catalog have poor query performance:
/shared/Interactive Selling/Pricing/Promotions/Recommended Promotions
/shared/Interactive Selling/Pricing/Promotions/Market Basket Analysis - Promotion Recommendations
/shared/Marketing/Customer Insight (B2C)/Products/Next Product Purchased
/shared/Interactive Selling/Products/Marketing/Next Product Purchased
The DAC Deployment procedure resets the Sequence Generator transformations in the ETL to start from the number 1 again. This can prevent Source Independent Load (SIL) mappings for the different applications and adapters from running together.
To illustrate this limitation, two examples of ETLs being unable to run together are shown below:
When the Siebel Vertical adapter and the Oracle EBS adapter are used together, the Siebel adapter leverages the SIL mappings in the SIL_Vert folder, while the Oracle EBS adapter leverages the SIL mappings in the SILOS folder. The sequence generators in the SIL_Vert folder get updated to new values, while the same ones in the SILOS folder do not. This causes all SILOS mappings for common dimensions (such as Employee and Exchange Rate) to fail.
The same dimension tables are loaded by mappings in both the SILOS and PLP folders. This results in the same issue as above: when the mappings are run, they fail.
The workaround is to set the Sequence Generator values in one of the folders to run between 1 and a sufficiently large finite value (for example, 1,000,000,000), and to set the SILOS folder Sequence Generator values to run between 1,000,000,001 and the maximum limit of 2,000,000,000, as illustrated below.
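For illustration, assuming the standard Informatica Sequence Generator properties (Current Value and End Value), the settings would be:

SIL_Vert (or PLP) folder Sequence Generators: Current Value = 1, End Value = 1000000000
SILOS folder Sequence Generators: Current Value = 1000000001, End Value = 2000000000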
The expressions for EXT_SALES_ORDER_NUM, EXT_SALES_ORDER_ITEM, and EXT_ITEM_DETAIL_NUM in MPLT_SA_ORA_SalesInvoiceLinesFact are inconsistent with SALES_ORDLN_ID.
For customers with Versions 7.9, 7.9.1, and 7.9.2, the following steps will fix this issue.
Change the expressions of SALES_ORDER_NUM, SALES_ORDER_LINE and ITEM_DETAIL_NUM in MPLT_SA_ORA_SalesInvoiceLinesFact as follows:
IIF(INP_INTERFACE_LINE_CONTEXT='ORDER ENTRY', INP_SALES_ORDER, NULL)
IIF(INP_INTERFACE_LINE_CONTEXT='ORDER ENTRY', INP_SALES_ORDER_LINE, NULL)
IIF(INP_INTERFACE_LINE_CONTEXT='ORDER ENTRY', INP_ITEM_DETAIL_NUMBER, NULL)
If you are using Siebel OLTP 8.0, you must add a filter in the physical layer of the RPD, as follows:
Display the Oracle BI Administration Tool.
In the physical layer under Forecasting Siebel OLTP connection pool, open the table S_FCST_ITEM_DTL.
Change the Type to Select.
Paste the following SQL into the window (a complete SELECT wrapper is sketched after these steps):
S_FCST_ITEM_DTL.AGGR_DIM_NAME_1 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_2 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_3 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_4 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_5 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_6 IS NULL
Save the changes.
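For reference, the complete SELECT statement for the view would then look similar to the following sketch (add a table owner prefix if your schema requires one):

SELECT * FROM S_FCST_ITEM_DTL
WHERE S_FCST_ITEM_DTL.AGGR_DIM_NAME_1 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_2 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_3 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_4 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_5 IS NULL AND
S_FCST_ITEM_DTL.AGGR_DIM_NAME_6 IS NULL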
The email personalization formats that are installed out-of-the-box in Siebel 8.0 are not constraining the generated lists by treatment ID. As a result, when a campaign launch issues the SOAP call to generate list files for a given treatment, everyone that qualifies for the campaign is being returned in the list. For example, if the campaign has two email treatments that have been allocated to two different sets of campaign members, this issue causes all campaign members to receive both treatments.
This section explains how to work around this issue.
This section explains how to update the repository in Oracle BI Administration Tool.
Start the Oracle BI Administration Tool.
In the Physical Layer, add a physical column DCP_ID to the Campaign Promotion physical table in Marketing OLTP database and specify the following values in the Physical Column dialog:
Name: DCP_ID
Type: VARCHAR
Length: 15
Nullable: yes
In the Business Model and Mapping Layer, add a logical column Treatment Id to the OLTP Campaign Promotion logical table in Marketing Contact List business model and specify the following values in the Logical Column dialog:
Name: Treatment Id
Logical Table Source: S_CAMP_CON
Mapped as: "Marketing OLTP".dbo."Campaign Promotion".DCP_ID
In the Business Model and Mapping Layer, add a logical column Treatment Id to the OLTP Campaign Promotion logical table in Marketing Account List business model and specify the following values in the Logical Column dialog:
Name: Treatment Id
Logical Table Source: S_CAMP_CON
Mapped as: "Marketing OLTP".dbo."Campaign Promotion".DCP_ID
In the Presentation Layer, add the Treatment Id presentation column to the Campaign History (Transaction Database) presentation table in the Marketing Contact List presentation catalog and specify the following values in the Presentation Column dialog:
Name: Treatment Id
Logical Column: "Marketing Contact List"."OLTP Campaign Promotion"."Treatment Id".
In the Presentation Layer, add the Treatment Id presentation column to the Campaign History (Transaction Database) presentation table in the Marketing Account List presentation catalog and specify the following values in the Presentation Column dialog:
Name: Treatment Id
Logical Column: "Marketing Account List"."OLTP Campaign Promotion"."Treatment Id".
This section explains how to update the Campaign Load Format and Email Server Format in Siebel Marketing.
Log in to Siebel Marketing.
Add Treatment Id to the Campaign Contact integration component and specify the following values in the Edit Column Formula dialog:
Table heading: Campaign Contact
Column Heading: Treatment Id
Column Formula: '@{treatmentID}{0}'
Add a filter to constrain the output based on the Treatment Id column and specify the following values in the Create/Edit Filter dialog:
Operator: is equal to / is in
Expression: '@{treatmentID}{0}'
The user name that appears in the Greetings message in the Oracle Business Intelligence Interactive Dashboards does not display correctly if the source system is either Oracle E-Business Suite (EBS) or Oracle's PeopleSoft. The variable DISPLAY_NAME used in the Greetings message in the dashboard header gets populated through an Initialization Block called LOGIN Properties. Out of the box, the connection pool and SQL statements used in this init block point to the Siebel OLTP. If you are running Oracle Business Intelligence Applications and your source system is either Oracle EBS or PeopleSoft, you need to change the Connection pool and data source SQL for the Initialization Block: LOGIN Properties, as described below.
The solution to this issue is:
Open the OracleBIAnalyticsApps.rpd file using Oracle BI Administration Tool.
Navigate to Manage > Variables to open the Variables Manager.
Under Session > Initialization Block, select the 'LOGIN Properties' initialization block.
Double-click to open the properties dialog box.
Click on the Edit Data Source button.
Click on the Browse button.
In the Select Connection Pool window, select either Oracle EBS OLTP Connection Pool or PeopleSoft OLTP Connection Pool, depending on your OLTP system.
In the Default Initialization String box on the 'Session Variable Initialization Block Data Source - LOGIN Properties' window, enter the SQL as appropriate for your source system application and the database platform it is running on.
Table 1-1 Required SQL Strings For Each Source System Application And Database Combination
Oracle EBS (Oracle RDBMS):

Select PER.FULL_NAME, 0 from PER_ALL_PEOPLE_F PER, FND_USER USR WHERE USR.USER_NAME= ':USER' AND USR.EMPLOYEE_ID=PER.PERSON_ID AND (SYSDATE <= USR.END_DATE OR USR.END_DATE IS NULL) AND PER.EFFECTIVE_START_DATE <= USR.START_DATE AND (USR.START_DATE < PER.EFFECTIVE_END_DATE OR PER.EFFECTIVE_END_DATE IS NULL)

PeopleSoft (Oracle RDBMS):

SELECT CASE WHEN EMPLOYEE_NAME_TODAY.EMPLID IS NULL THEN USR.OPRDEFNDESC ELSE EMPLOYEE_NAME_TODAY.NAME END DISPLAY_NAME, 0 FROM PSOPRDEFN USR LEFT OUTER JOIN (SELECT B.EMPLID, B.NAME FROM PS_NAMES B, (SELECT EMPLID, MAX(EFFDT) EFFDT FROM PS_NAMES WHERE NAME_TYPE = 'PRI' AND EFFDT <= SYSDATE GROUP BY EMPLID) C WHERE B.EMPLID = C.EMPLID AND B.EFFDT = C.EFFDT AND B.NAME_TYPE = 'PRI') EMPLOYEE_NAME_TODAY ON USR.EMPLID = EMPLOYEE_NAME_TODAY.EMPLID WHERE USR.OPRID=':USER'

PeopleSoft (MSSQL RDBMS):

SELECT CASE WHEN EMPLOYEE_NAME_TODAY.EMPLID IS NULL THEN USR.OPRDEFNDESC ELSE EMPLOYEE_NAME_TODAY.NAME END DISPLAY_NAME, 0 FROM PSOPRDEFN USR LEFT OUTER JOIN (SELECT B.EMPLID, B.NAME FROM PS_NAMES B, (SELECT EMPLID, MAX(EFFDT) EFFDT FROM PS_NAMES WHERE NAME_TYPE = 'PRI' AND EFFDT <= GETDATE() GROUP BY EMPLID) C WHERE B.EMPLID = C.EMPLID AND B.EFFDT = C.EFFDT AND B.NAME_TYPE = 'PRI' ) EMPLOYEE_NAME_TODAY ON USR.EMPLID = EMPLOYEE_NAME_TODAY.EMPLID WHERE USR.OPRID=':USER'

PeopleSoft (DB2 RDBMS):

SELECT CASE WHEN EMPLOYEE_NAME_TODAY.EMPLID IS NULL THEN USR.OPRDEFNDESC ELSE EMPLOYEE_NAME_TODAY.NAME END DISPLAY_NAME, 0 FROM PSOPRDEFN USR LEFT OUTER JOIN (SELECT B.EMPLID, B.NAME FROM PS_NAMES B, (SELECT EMPLID, MAX(EFFDT) EFFDT FROM PS_NAMES WHERE NAME_TYPE = 'PRI' AND EFFDT <= CURRENT TIMESTAMP GROUP BY EMPLID) C WHERE B.EMPLID = C.EMPLID AND B.EFFDT = C.EFFDT AND B.NAME_TYPE = 'PRI' ) EMPLOYEE_NAME_TODAY ON USR.EMPLID = EMPLOYEE_NAME_TODAY.EMPLID WHERE USR.OPRID=':USER'
The following language folders are missing from the location %:\oraclebidata\disconnected\pharma\messages.
l_ar - Arabic
l_el - Greek
l_hu - Hungarian
l_iw - Hebrew
l_no - Norwegian
l_pl - Polish
l_ro - Romanian
l_ru - Russian
l_sk - Slovakian
l_th - Thai
l_tr - Turkish
To work around this issue, perform the following steps:
Go to %:\oraclebidata\disconnected\pharma\messages\ and add the following folders:
l_ar
l_el
l_hu
l_iw
l_no
l_pl
l_ro
l_ru
l_sk
l_th
l_tr
Copy the corresponding _iBotsCaptions.xml and PharmaCaptions.xml files from %:\oraclebidata\web\res\l_XX\Captions\ to %:\oraclebidata\disconnected\pharma\messages\l_XX\.
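For example, assuming drive C: and the Arabic folder (repeat for each l_XX folder in the list above):

copy C:\oraclebidata\web\res\l_ar\Captions\_iBotsCaptions.xml C:\oraclebidata\disconnected\pharma\messages\l_ar\
copy C:\oraclebidata\web\res\l_ar\Captions\PharmaCaptions.xml C:\oraclebidata\disconnected\pharma\messages\l_ar\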
The Campaign Contacts_Segmentation Subject Area contains an 'Industry Name' dimension that was not intended to be included in this Subject Area. Using this column in your analysis will result in a Metadata Inconsistency Error. Oracle recommends that you remove the column from the Presentation Subject Area or advise users to not use it.
Customers on Oracle Business Intelligence Applications 7.9.5 and Siebel Marketing (OLTP) versions prior to 8.0 must make the following changes in the repository RPD file to make sure that the '# of offers' metric is calculated correctly:
Click on the fact 'Fact-CRM–Campaign History' in the Business Model and Mapping Layer and go to 'Sources'.
You will see two logical table sources, Fact_W_CAMP_HIST_F and Fact_W_CAMP_HIST_F_With_Offer.
Click on Fact_W_CAMP_HIST_F_With_Offer and activate the Logical Table Source, Fact_W_CAMP_HIST_F_With_Offer.
Click on the metric '# of offers' under the same Fact, 'Fact-CRM–Campaign History'.
You will see two calculations in this screen.
Remove the '# of offers' calculation from the Logical Table Source, Fact_W_CAMP_HIST_F.
Note: Oracle Business Intelligence Applications customers on pre-7.9.5 versions (7.9.1 or 7.9.2 or 7.9.3 or 7.9.4) and Siebel Marketing 8.0, will have to do the following:
Make sure that Logical Data Source Fact_W_CAMP_HIST_F is activated.
Add a new calculation for '# of offers' based on the table Fact_W_CAMP_HIST_F.
In Informatica 8.1.1, you cannot restore the Oracle_BI_DW_Base.rep file through Informatica Administration Console if Informatica is installed in a non-English operating system (for example, Chinese, Japanese, Korean). You must restore the repository by command prompt, using the following command:
PmRep restore -u <domain_user_name> -p <domain_user_password> -i <input_file_name> -n
Note: The '-n' option is not listed in the help message of PmRep in Informatica 8.1.1. However, when restoring a repository in a non-English environment, you must use the '-n' option; otherwise, PmRep fails to restore the repository because the repository's code page does not match Informatica's code page.
This issue affects Oracle Business Intelligence Applications customers who use position hierarchy and use login names that are longer than 30 characters.
Siebel OLTP supports login names of up to 50 characters. However, the W_POSITION_DH mapping supports only 30 characters for this column. Consequently, login names longer than 30 characters are truncated.
Note: The W_POSITION_D mapping itself supports 50 characters, so this issue does not occur when the Position dimension alone is involved.
Workaround
There is no workaround for this issue.
This issue affects customers using Oracle Business Intelligence Applications Version 7.9.5 with Siebel versions prior to 8.0. Customers on Siebel 8.0 or 8.1 and using Oracle Business Intelligence Applications Version 7.9.5 are not impacted.
Summary forecast metrics under the 'Sales – CRM Forecasting' subject area show inaccurate results due to double counting. This issue has been fixed for this release of Oracle Business Intelligence Applications but the solution leads to errors for customers on Siebel OLTP releases prior to 8.0 and using Oracle Business Intelligence Applications Version 7.9.5. These customers should use the following workaround.
Workaround
Open the repository (.rpd file) using Oracle Business Intelligence Administration Tool.
Locate the view 'S_FCST_ITEM_DTL' (path: Forecasting Siebel OLTP.Catalog.dbo) in the physical layer of the repository.
Change the table type to 'Physical Table'.
Remove the following columns:
S_FCST_ITEM_DTL.AGGR_DIM_NAME_1
S_FCST_ITEM_DTL.AGGR_DIM_NAME_2
S_FCST_ITEM_DTL.AGGR_DIM_NAME_3
S_FCST_ITEM_DTL.AGGR_DIM_NAME_4
S_FCST_ITEM_DTL.AGGR_DIM_NAME_5
S_FCST_ITEM_DTL.AGGR_DIM_NAME_6
Save the changes.
Locate the following physical views under the Forecasting OLTP Connection Pool:
FCST_DTLS_VU
Prior Forecast Details [FCST_DTLS_VU]
Qtr Ago FCST_DTLS_VU
Year Ago FCST_DTLS_VU
Open each of the above objects and remove the following filters:
FID.AGGR_DIM_NAME_1 IS NULL AND FID.AGGR_DIM_NAME_2 IS NULL AND FID.AGGR_DIM_NAME_3 IS NULL AND FID.AGGR_DIM_NAME_4 IS NULL AND FID.AGGR_DIM_NAME_5 IS NULL AND FID.AGGR_DIM_NAME_6 IS NULL
Save the changes.
This issue occurs when creating a report by selecting all of the attributes from OPPORTUNITY under the OPPORTUNITY CONTACT SEGMENTATION subject area. If you create a report with these criteria, the system gives the following ODBC error:
Odbc driver returned an error (SQLExecDirectW).
Error Details
Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 14026] Unable to navigate requested expression: Fact - Marketing - Segmentation Opportunity Contact.Implicit Fact Column. Please fix the metadata consistency warnings. (HY000)
SQL Issued: SELECT Opportunity."Opportunity Name" saw_0, Opportunity."Opportunity Account Name" saw_1, Opportunity."Opty Status" saw_2, Opportunity."Lead Quality" saw_3, Opportunity."Lead Age Category" saw_4, Opportunity."Deal Size" saw_5, Opportunity."Primary Competitor" saw_6, Opportunity."Sales Stage Name" saw_7, Opportunity."Sales Method" saw_8, Opportunity."Targeted Opportunity Flag" saw_9, Opportunity."Reason Won or Lost" saw_10, Opportunity."Competitor ROW_ID" saw_11, Opportunity.ROW_ID saw_12 FROM "Opportunity Contact_segmentation" ORDER BY saw_0, saw_1, saw_2, saw_3, saw_4, saw_5, saw_6, saw_7, saw_8, saw_9, saw_10, saw_11, saw_12
Workaround
There is no workaround for this issue.
Siebel Version 8.1's new 'Customer Adaptive Forecasting' functionality lets users submit a forecast to a person who is not necessarily their manager, that is, outside of their reporting hierarchy. The resulting transaction records need to be filtered out during the extraction process to ensure interoperability with Siebel Version 8.1.
This section is only relevant if you are running your data warehouse in a Teradata database. For detailed information about using the TPump external loader, refer to Informatica documentation and Teradata documentation.
When using a TPump command, if you get the following error messages (or similar error messages), make sure that you have set the parameters listed below:
TRANSF_1_1_1> DBG_21216 Finished transformations for Source Qualifier [Sq_W_POSITION_DS]. Total errors [0]
WRITER_1_*_1> WRT_8047 Error: External loader process [2192] exited with error [12]
WRITER_1_*_1> CMN_1761 Timestamp Event: [Mon Apr 28 14:39:54 2008]
WRITER_1_*_1> WRT_8004 Writer initialization failed [Error opening session output file [\\.\pipe\w_XXXX_d1.out] [error=]]. Writer terminating.
WRITER_1_*_1> CMN_1761 Timestamp Event: [Mon Apr 28 14:39:54 2008]
WRITER_1_*_1> WRT_8047 Error: External loader process [2192] exited with error [12]
WRITER_1_*_1> CMN_1761 Timestamp Event: [Mon Apr 28 14:39:54 2008]
WRITER_1_*_1> WRT_8088 Writer run terminated. [External loader error.]
To run the TPump loader, you need to specify the following parameters at the command line in the order specified:
Database Name: Enter the Teradata Database name.
Error Database Name: Enter the Teradata Database name.
Log Table Database Name: Enter the Teradata Database name.
Error Table name: Enter the name of the table to use as the error table.
Log Table name: Enter the name of the table to use as the log table.
Notes
The following attributes can be specified at the connection level:
Database
Error Database
Log Table Database
The following attributes are specified at the workflow level:
Error Table
Log Table
The General Ledger revenue fact table W_GL_REVN_F has a bitmap index W_GL_REVN_F_F57 declared on column w_gl_revn_f.ledger_wid. This index is incorrectly marked as an ETL index, and should be changed to a Query index, as follows:
Start the Oracle DAC Client and log into the repository.
In the Design View, display the Indices tab, and query for index W_GL_REVN_F_F57.
On the Edit sub tab, change the Index Usage value from 'ETL' to 'Query'.
Save the changes.
When you install Oracle Business Intelligence Applications on top of an installation of Oracle Business Intelligence Enterprise Edition that was installed using the Oracle Applications Server advanced security option, the following error message is displayed:
"Password length is less than the required minimum, 8"
The Oracle Business Intelligence Applications installation will abort, and the OracleBIAnalyticsApps.rpd file will be corrupted.
The error happens because the MINIMUM_PASSWORD_LENGTH setting in the NQSConfig.ini file was set to 8 by the OBIEE installation. The workaround is to change the value of MINIMUM_PASSWORD_LENGTH to 0 in NQSConfig.ini, install Oracle Business Intelligence Applications, and then change the value back to 8, as shown below.
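For example, locate the setting in NQSConfig.ini (typically in the SECURITY section) and edit it as follows before the installation, reverting it to 8 afterwards:

MINIMUM_PASSWORD_LENGTH = 0;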
Oracle database Version 8i does not support the extract SQL used in the SDE_ORA_ExchangeRateGeneral task.
The ETL process fails with the following error message: ORA-00932: inconsistent datatypes.
Workaround
The workaround is to modify the SQL to explicitly cast the Data type so that the DBMS can process the SQL, as follows:
In PowerCenter Designer, open the SDE folder for your application (for example, SDE_ORA1158_Adaptor).
In the Mapplet Designer, check out and open the mapplet mplt_BC_ORA_RateGeneral_Compress.
Edit the Source Qualifier to display the Edit Transformations dialog.
Display the Properties tab.
In the Sql Query Transformation Attribute, replace the Value with the following SQL statement:
SELECT FROM_CURRENCY, TO_CURRENCY, CONVERSION_DATE, CONVERSION_TYPE, CONVERSION_RATE, STATUS_CODE, CREATED_BY, LAST_UPDATED_BY, CREATION_DATE, LAST_UPDATE_DATE, CASE WHEN NEXT_DT = SYSDATE-100000 AND PREV_DT = SYSDATE-100000 THEN DATE_FOR_WRAP /*If the bucket has only one row, no need to wrap from the next row [Dt, Dt+1) is what we need*/ ELSE to_date(to_char(LEAD(DATE_FOR_WRAP, 1) OVER (PARTITION BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE ORDER BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE, CONVERSION_DATE),'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') END TO_DATE, /*Otherwise the wrap the next row as end date*/ CASE WHEN NEXT_DT = SYSDATE-100000 AND PREV_DT <> SYSDATE-100000 THEN 'Filter' ELSE 'Keep' END FILTER /*Filter out rows for constructing Bucket only*/ FROM (SELECT FROM_CURRENCY, TO_CURRENCY, to_date(to_char(CONVERSION_DATE,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') CONVERSION_DATE, CONVERSION_TYPE, CONVERSION_RATE, STATUS_CODE, CREATED_BY , LAST_UPDATED_BY , CREATION_DATE , LAST_UPDATE_DATE, to_date(to_char(CASE WHEN CONVERSION_DATE <> PREV_DT + 1 THEN SYSDATE-100000 ELSE PREV_DT END,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') PREV_DT, /*Equalize first row after the break to a Bucket Start*/ to_date(to_char(CASE WHEN CONVERSION_DATE <> NEXT_DT - 1 THEN SYSDATE-100000 ELSE NEXT_DT END,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') NEXT_DT, /*Equalize last row before the break to a Bucket End*/ to_date(to_char(CASE WHEN NEXT_DT = SYSDATE-100000 OR CONVERSION_DATE <> NEXT_DT - 1 THEN CONVERSION_DATE + 1 ELSE CONVERSION_DATE END,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') DATE_FOR_WRAP /*If it's the end of the Bucket DT+1 should be the D2 valur of [D1, D2)*/ FROM (SELECT FROM_CURRENCY, TO_CURRENCY, to_date(to_char(CONVERSION_DATE,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') CONVERSION_DATE, CONVERSION_TYPE, CONVERSION_RATE, STATUS_CODE, CREATED_BY , LAST_UPDATED_BY , CREATION_DATE , LAST_UPDATE_DATE, LAG(CONVERSION_RATE, 1, -1) OVER (PARTITION BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE, CONVERSION_RATE ORDER BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE,CONVERSION_DATE, CONVERSION_RATE) PREV_RATE, to_date(to_char(LAG(CONVERSION_DATE, 1, SYSDATE-100000) OVER (PARTITION BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE, CONVERSION_RATE ORDER BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE,CONVERSION_DATE, CONVERSION_RATE),'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') PREV_DT, to_date(to_char(LEAD(CONVERSION_DATE, 1, SYSDATE-100000) OVER (PARTITION BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE, CONVERSION_RATE ORDER BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE,CONVERSION_DATE, CONVERSION_RATE),'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') NEXT_DT FROM GL_DAILY_RATES, (SELECT FROM_CURRENCY V_FROM_CURRENCY, TO_CURRENCY V_TO_CURRENCY, CONVERSION_TYPE V_CONVERSION_TYPE, min(CONVERSION_DATE) V_CLEANUP_DATE FROM GL_DAILY_RATES WHERE LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE','MM/DD/YYYY HH24:MI:SS') AND CONVERSION_DATE > SYSDATE - $$XRATE_UPD_NUM_DAY GROUP BY FROM_CURRENCY, TO_CURRENCY, CONVERSION_TYPE) V WHERE GL_DAILY_RATES.CONVERSION_DATE > SYSDATE - $$XRATE_UPD_NUM_DAY AND GL_DAILY_RATES.CONVERSION_DATE >= V.V_CLEANUP_DATE AND GL_DAILY_RATES.FROM_CURRENCY = V.V_FROM_CURRENCY AND GL_DAILY_RATES.TO_CURRENCY = V.V_TO_CURRENCY AND GL_DAILY_RATES.CONVERSION_TYPE = V.V_CONVERSION_TYPE ) WHERE CONVERSION_RATE <> PREV_RATE /*If rate is the same, should filter them to compress*/ OR CONVERSION_DATE <> PREV_DT +1 /*First row of a bucket*/ OR 
CONVERSION_DATE <> NEXT_DT -1 /*Last row of a bucket*/ )
Validate, save, and check-in the mapping.
In PowerCenter Workflow Manager, open the SDE folder for your application (for example, SDE_ORA1158_Adaptor).
Open the Sessions folder.
Check out, validate and save the session SDE_ORA_ExchangeRateGeneral_Compress.
Checkout and open the session SDE_ORA_ExchangeRateGeneral_Compress_Full to display the Edit Tasks dialog.
Display the Mapping tab.
Click on Sources in the left hand pane.
In the Properties area, select the Sql Query attribute, and replace the Value with the following SQL statement:
SELECT FROM_CURRENCY, TO_CURRENCY, CONVERSION_DATE, CONVERSION_TYPE, CONVERSION_RATE, STATUS_CODE, CREATED_BY , LAST_UPDATED_BY , CREATION_DATE , LAST_UPDATE_DATE, CASE WHEN NEXT_DT = SYSDATE-100000 AND PREV_DT = SYSDATE-100000 THEN DATE_FOR_WRAP /*If the bucket has only one row, no need to wrap from the next row [Dt, Dt+1) is what we need*/ ELSE to_date(to_char(LEAD(DATE_FOR_WRAP, 1) OVER (PARTITION BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE ORDER BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE, CONVERSION_DATE),'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') END TO_DATE, /*Otherwise the wrap the next row as end date*/ CASE WHEN NEXT_DT = SYSDATE-100000 AND PREV_DT <> SYSDATE-100000 THEN 'Filter' ELSE 'Keep' END FILTER /*Filter out rows for constructing Bucket only*/ FROM (SELECT FROM_CURRENCY, TO_CURRENCY, to_date(to_char(CONVERSION_DATE,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') CONVERSION_DATE, CONVERSION_TYPE, CONVERSION_RATE, STATUS_CODE, CREATED_BY , LAST_UPDATED_BY , CREATION_DATE , LAST_UPDATE_DATE, to_date(to_char(CASE WHEN CONVERSION_DATE <> PREV_DT + 1 THEN SYSDATE-100000 ELSE PREV_DT END,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') PREV_DT, /*Equalize first row after the break to a Bucket Start*/ to_date(to_char(CASE WHEN CONVERSION_DATE <> NEXT_DT - 1 THEN SYSDATE-100000 ELSE NEXT_DT END,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') NEXT_DT, /*Equalize last row before the break to a Bucket End*/ to_date(to_char(CASE WHEN NEXT_DT = SYSDATE-100000 OR CONVERSION_DATE <> NEXT_DT - 1 THEN CONVERSION_DATE + 1 ELSE CONVERSION_DATE END,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') DATE_FOR_WRAP /*If it's the end of the Bucket DT+1 should be the D2 valur of [D1, D2)*/ FROM (SELECT FROM_CURRENCY, TO_CURRENCY, to_date(to_char(CONVERSION_DATE,'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') CONVERSION_DATE, CONVERSION_TYPE, CONVERSION_RATE, STATUS_CODE, CREATED_BY , LAST_UPDATED_BY , CREATION_DATE , LAST_UPDATE_DATE, LAG(CONVERSION_RATE, 1, -1) OVER (PARTITION BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE, CONVERSION_RATE ORDER BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE,CONVERSION_DATE, CONVERSION_RATE) PREV_RATE, to_date(to_char(LAG(CONVERSION_DATE, 1, SYSDATE-100000) OVER (PARTITION BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE, CONVERSION_RATE ORDER BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE,CONVERSION_DATE, CONVERSION_RATE),'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') PREV_DT, to_date(to_char(LEAD(CONVERSION_DATE, 1, SYSDATE-100000) OVER (PARTITION BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE, CONVERSION_RATE ORDER BY FROM_CURRENCY,TO_CURRENCY,CONVERSION_TYPE,CONVERSION_DATE, CONVERSION_RATE),'MM/DD/YYYY HH24:MI:SS'), 'MM/DD/YYYY HH24:MI:SS') NEXT_DT FROM GL_DAILY_RATES ) WHERE CONVERSION_RATE <> PREV_RATE /*If rate is the same, should filter them to compress*/ OR CONVERSION_DATE <> PREV_DT +1 /*First row of a bucket*/ OR CONVERSION_DATE <> NEXT_DT -1 /*Last row of a bucket*/ )
Validate, save, and check in the session.
The mapping SDE_ORA_BOMItemFact needs to call a stored procedure in the Oracle EBS instance, which inserts rows into a global temporary table (duration SYS$SESSION; that is, the data is lost when the session is closed). This stored procedure does not have an explicit commit. The mapping then needs to read the rows in the temporary table into the warehouse.
For the mapping to work, Informatica needs to share the same connection for the stored procedure and the SQL qualifier during ETL. This feature was available in Informatica release 7.x, but it is not available in Informatica release 8.1.1 (SP4). As a result, W_BOM_ITEM_FS and W_BOM_ITEM_F are not loaded properly.
Workaround
For all Oracle EBS customers:
Open the package body bompexpl.
Locate the text "END exploder_userexit;", scroll a few lines above it, and add a "commit;" statement immediately before "EXCEPTION" (see the sketch following these steps).
Save and compile the package.
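The following fragment shows where the commit goes. This is a sketch only; the surrounding package code is illustrative and differs by Oracle EBS version, and only the COMMIT line is the actual change:

   ...
   COMMIT; -- Added: commits the rows inserted into the temporary table
           -- before control returns to the ETL session.
EXCEPTION
   ... -- existing exception handling, unchanged
END exploder_userexit;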
In addition, customers with Oracle EBS 11.5.9 and above must do the following:
Connect to your OLTP using user BOM.
Recreate BOM_EXPLOSION_TEMP as a regular table (that is, not as a global temporary table).
Connect to your OLTP using the apps user, and re-create the synonym BOM_EXPLOSION_TEMP so that it points to the table bom.BOM_EXPLOSION_TEMP (a sketch of the SQL involved follows these steps).
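For illustration, the SQL might look like the following sketch. It assumes that no sessions are currently bound to the temporary table and that any indexes or grants on the original table are re-created separately; the intermediate table name BOM_EXPLOSION_TEMP_REG is hypothetical:

-- Connected as BOM: copy the structure only (the CTAS returns no rows,
-- because a global temporary table appears empty outside its own session).
CREATE TABLE BOM_EXPLOSION_TEMP_REG AS
  SELECT * FROM BOM_EXPLOSION_TEMP WHERE 1 = 0;
DROP TABLE BOM_EXPLOSION_TEMP;
RENAME BOM_EXPLOSION_TEMP_REG TO BOM_EXPLOSION_TEMP;

-- Connected as APPS: point the synonym at the regular table.
CREATE OR REPLACE SYNONYM BOM_EXPLOSION_TEMP FOR BOM.BOM_EXPLOSION_TEMP;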
During full ETL loads using DB2 9.1 databases, the truncate task fails. This issue generates an error message in the DAC log file similar to the following:
Failed: SIEBTRUN ('@TABLEOWNER.W_QUOTE_MD') With error message: COM.ibm.db2.jdbc.DB2Exception: [IBM][CLI Driver][DB2/AIX64] SQL0668N Operation not allowed for reason code "3" on table "SIEBEL.W_QUOTE_MD". SQLSTATE=57016.
This issue occurs only during full loads. It does not occur during incremental loads, because tables are not truncated during incremental loads.
Workaround
This issue is described as "table in load pending state," and no code fix is needed. The table needs to be returned to its normal state; the ETL should then run without any further issues. It is recommended that a DBA perform this workaround, for example by running a terminating LOAD as sketched below.
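For example, on DB2 a DBA might clear the load pending state by running a terminating LOAD against the table named in the error message (W_QUOTE_MD is shown here; substitute the table and database names from your environment):

db2 connect to <warehouse_database>
db2 "LOAD FROM /dev/null OF DEL TERMINATE INTO SIEBEL.W_QUOTE_MD"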
This issue affects Pharma Analytics on DB2 databases running on AIX or UNIX platforms.
The task LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType_Full fails because the parameter file name is too long.
This issue generates the following error message in the Informatica Workflow log:
Session task instance [LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType_Full]: [VAR_27026 Error: Missing initial value for session parameter:[$DBConnection_OLAP].]
Session task instance [LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType_Full]: [TM_6270 Error: Variable parameter expansion error.]
Workaround
In Informatica Workflow Manager, do the following:
Launch Informatica Workflow Manager.
Log in to the Informatica Repository.
Open folder SIL_Vert.
Check out workflow LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType.
Choose Edit from the Workflows drop-down menu, and change the name to LS_SIL_LOVUnboundedDim_LoadSyndType.
Check out workflow LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType_Full.
Choose Edit from the Workflows drop-down menu, and change the name to LS_SIL_LOVUnboundedDim_LoadSyndType_Full.
Save the changes.
Check in.
In Data Warehouse Administration Console, do the following:
Start the DAC Client.
Log in to the DAC Repository.
Select the appropriate custom container from the container drop-down list in the main toolbar.
Go to the Design view, and click the Task tab.
In the top pane toolbar, click Query, and in the Name field enter LS Load into List Of Values Unbounded Dimension for Syndicated Type.
Click Go.
Change the value in the Command for Incremental Load field to LS_SIL_LOVUnboundedDim_LoadSyndType.
Change the value in the Command for Full Load field to LS_SIL_LOVUnboundedDim_LoadSyndType_Full.
Click Save.
Right-click the task and select Ownership > Push to References.
Click Yes, and then click OK.
This issue is specific to Oracle E-Business Suite adaptors used in conjunction with Financial Analytics.
Oracle BI Applications does not currently support the extraction of statistical GL balances or journals. No accounts or transactions with a STAT currency code are brought into the warehouse. The existing out-of-the-box mappings SDE_ORA_GLBalanceFact and SDE_ORA_GLJournals contain a filter that excludes 'STAT' records.
Workaround
To support the 'STAT' currency, perform the following steps:
Log into Informatica Designer.
Open the SDE folder.
Locate the mapping SDE_ORA_GLBalanceFact.
Open the mapplet mplt_BC_ORA_GL_Balance_Fact.
Remove or comment out the line AND GL_BALANCES.CURRENCY_CODE <> 'STAT' from the SQL Qualifier.
Save the changes.
Perform similar steps for the mapping SDE_ORA_GLJournals:
Log into Informatica Designer.
Open the SDE folder.
Locate the mapping SDE_ORA_GLJournals.
Open the mapplet mplt_BC_ORA_GLXactsJournalsExtract.
Remove or comment out the line AND GL_JE_HEADERS.CURRENCY_CODE <> 'STAT' from the SQL Qualifier (a simplified illustration of this change follows these steps).
Save the changes.
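For illustration only, the effect of the change on a simplified extract query is shown below. The actual SQL overrides in the mplt_BC_ORA_GL_Balance_Fact and mplt_BC_ORA_GLXactsJournalsExtract mapplets select many more columns and carry other filters; the column list and the ACTUAL_FLAG filter here are illustrative:

-- Before: statistical (STAT) balances are excluded.
SELECT GL_BALANCES.SET_OF_BOOKS_ID,
       GL_BALANCES.CURRENCY_CODE,
       GL_BALANCES.PERIOD_NAME
FROM   GL_BALANCES
WHERE  GL_BALANCES.ACTUAL_FLAG = 'A'
AND    GL_BALANCES.CURRENCY_CODE <> 'STAT'

-- After: with the STAT filter removed, statistical balances are extracted as well.
SELECT GL_BALANCES.SET_OF_BOOKS_ID,
       GL_BALANCES.CURRENCY_CODE,
       GL_BALANCES.PERIOD_NAME
FROM   GL_BALANCES
WHERE  GL_BALANCES.ACTUAL_FLAG = 'A'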
This issue affects all Oracle Sales Analytics customers on the OBIA releases noted above who use Siebel as the source system with a Teradata database as the target.
The FIND_AUDIT_VALUES transformation inside the SDE_OptyStgFact mapping is missing from the Teradata ETL repository.
Workaround
Log on to Informatica PowerCenter Designer, and open the Mapping Designer tool.
Open the folder SDE_SBL_80_Adaptor_TD.
Delete the mapping SDE_OptyStgFact (a shortcut from the folder SDE_SBL_78_Adaptor) from this folder.
Create a new shortcut to the mapping SDE_OptyStgFact from the folder SDE_SBL_80_Adaptor in the SDE_SBL_80_Adaptor_TD folder, and then rename it to SDE_OptyStgFact.
Save the changes, and refresh the corresponding session in Informatica PowerCenter Workflow Manager for the changes to take effect.
To deploy the Stored Procedure on the source system, navigate to the OracleBI\dwrep\Informatica\Stored_Procedure_Scripts folder.
Open the folder appropriate to your database platform, and copy the source code from the file FIND_AUDIT_VALUES.sql into the source system schema.
Compile the stored procedures in the source system schema (a hypothetical example follows the note below).
Note: If you need assistance in deploying the stored procedures, see your Database Reference guide, or contact your database administrator.
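For example, assuming an Oracle source database and SQL*Plus, the deployment might look like the following; the Oracle subfolder name and the connect string are placeholders, so use the folder and tool appropriate to your platform:

sqlplus <source_schema_user>/<password>@<source_db>
SQL> @OracleBI\dwrep\Informatica\Stored_Procedure_Scripts\Oracle\FIND_AUDIT_VALUES.sql
SQL> SHOW ERRORS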
In Oracle Sales Analytics deployments, the dimension column 'Organization -> Division Name' refers to the employee's division, which is the Business Unit Name in Siebel (see the web catalog folders under the Usage Accelerator subject areas named 'Usage Accelerator Current' and 'Usage Accelerator Summary'). Division Name points to the physical column W_INT_ORG_D.BU_NAME, which is obsolete in Oracle Business Intelligence Applications Version 7.9.5 and later.
Workaround
The workaround is to add BU_WID to UA facts that are in use, as follows:
In the ETL, do the following:
Extend each fact, including temp tables (for example, W_UAEMP_TMP), with BU_WID (see the DDL sketch after these notes).
This enables you to make the new physical alias of W_INT_ORG_D for BU conform across all facts in UA (see RPD steps below).
Verify that the BU_ID currently in WS_POSTN is correct.
If it is incorrect, add a new column.
Extend W_UAPOS_TMP with BU_WID and modify the mapping to load it.
Modify related SIL mappings to populate the BU_WID in each fact from W_UAPOS_TMP.
Note: There are 15 fact tables and three temp tables (WS_POSTN, W_UAEMP_TMP, W_UAPOS_TMP) that need to be extended with the new column BU_WID. You also need to:
- Modify the mappings that populate the data in the temp tables mentioned above.
- Change all the mappings that are involved in populating the fact tables, to fetch the data for the new column BU_WID.
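As an illustration of the extension, the DDL for one of the tables might look like the following (shown in Oracle syntax; the NUMBER(10) precision and the DEFAULT clause are assumptions, so match the datatype and conventions of the existing *_WID columns in your warehouse):

-- Repeat for each of the 15 fact tables and the three temp tables.
ALTER TABLE W_UAPOS_TMP ADD (BU_WID NUMBER(10) DEFAULT 0 NOT NULL);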
In the RPD, do the following:
In the Physical Layer, create a new physical alias of W_INT_ORG_D.
In the Business Model and Mapping layer, create a new logical dimension for BU from the alias.
Create a simple hierarchy for the new BU logical dimension.
Join (logically and physically) to the facts by BU_WID.
In the Presentation Layer, add the BU name from new BU logical dimension.
Update the reports (if any) following the presentation changes.
Note: If you are upgrading from a version prior to Oracle Business Intelligence Applications Version 7.9.5, you might need to fix the old fact records by a suitable means, such as creating a Workflow.
On AIX, UNIX, and DB2 platforms, the ETL task LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType_Full fails because of the long parameter file name. This problem affects only Oracle Pharma Analytics.
The following error messages appear in the Informatica Workflow log:
Session task instance [LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType_Full]: [VAR_27026 Error: Missing initial value for session parameter:[$DBConnection_OLAP].]
Session task instance [LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType_Full]: [TM_6270 Error: Variable parameter expansion error.]
Workaround
This issue needs to be fixed in two places: in Informatica and in the DAC.
In Informatica, perform the following procedure.
Launch Informatica Workflow Manager.
Log in to the Informatica Repository.
Open the folder SIL_Vert.
Check out the workflow LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType.
Choose Edit from the Workflows drop-down menu, and then change its name to LS_SIL_LOVUnboundedDim_LoadSyndType.
Check out the workflow LS_SIL_ListOfValuesUnboundedDimension_LoadSyndicatedType_Full.
Choose Edit from the Workflows drop-down menu, and then change its name to LS_SIL_LOVUnboundedDim_LoadSyndType_Full.
Save the changes.
Check in your changes.
In the DAC, perform the following procedure.
Start the DAC Client.
Log in to the DAC Repository.
Click Design > Task.
Change the container to Data Warehouse.
Click Query, and enter LS Load into List Of Values Unbounded Dimension for Syndicated Type in the Name field.
Click Go.
Change the Command For Incremental Load field to LS_SIL_LOVUnboundedDim_LoadSyndType.
Change the Command For Full Load field to LS_SIL_LOVUnboundedDim_LoadSyndType_Full.
Click Save.
Right-click the task and choose Ownership > Push to References.
Click Yes.
Click OK.
When you handle booked and nonbooked orders in Oracle Supply Chain and Order Management Analytics for Oracle EBS, if you do not follow Steps 8 and 9 below, you might get the following error: ORA-00001: unique constraint (OBAW.W_SLS_BKG_LN_F_U1) violated.
Workaround
1. Follow Steps 1 to 7 in 'To include nonbooked orders in the Sales Order Lines tables' in Section 10.2.5.2.2 About the Handling of Booked and Nonbooked Orders in the Order Lines and Bookings Table in the Oracle Business Intelligence Applications Installation and Configuration Guide.
Then follow these two additional steps:
8. Add W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y' (preceded by AND if there are existing filters) to the Source Filter field of the Source Qualifier transformation in the following mappings:
SIL_SalesBookingLinesFact_Load_OrderLine_Credit
SIL_SalesBookingLinesFact_Load_OrderLine_Debit
9. Add W_SALES_SCHEDULE_LINE_F.BOOKING_FLG = 'Y' (preceded by AND if there are existing filters) to the Source Filter field of the Source Qualifier transformation in the following mappings (an example of the resulting filter value follows these steps):
SIL_SalesBookingLinesFact_Load_ScheduleLine_Credit
SIL_SalesBookingLinesFact_Load_ScheduleLine_Debit
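For illustration, if the Source Filter field of SIL_SalesBookingLinesFact_Load_OrderLine_Credit already contained a hypothetical filter such as W_SALES_ORDER_LINE_F.DELETE_FLG = 'N', the value after Step 8 would be:

W_SALES_ORDER_LINE_F.DELETE_FLG = 'N' AND W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y'

If the field was previously empty, the value is simply W_SALES_ORDER_LINE_F.BOOKING_FLG = 'Y'.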