1 Release Notes

These release notes describe known issues and workarounds for Oracle Business Intelligence Applications (Oracle BI Applications).

Note:

Earlier Oracle BI Applications 11g releases were only for Oracle Fusion Applications source systems. Information about known issues and workarounds in these earlier Oracle BI Applications 11g releases is documented in the Oracle Fusion Middleware Release Notes. To locate the Oracle Fusion Middleware documentation library appropriate for the release of Oracle Fusion Middleware in use at your company, see Oracle Fusion Middleware Documentation.

About These Release Notes

These release notes are updated periodically as new information becomes available.

To ensure that you're reading the latest version of the release notes, check the Oracle BI Applications Documentation set.

Obtaining Patches from My Oracle Support

Periodically, Oracle BI Applications patches are released.

To view and obtain patches that are available:
  1. Sign in to My Oracle Support.
  2. Click the Patches & Updates tab.
  3. Use the Patch Search area to locate patches.
  4. On the Patch Search Results page, select a patch and click Download to download the patch.
  5. Install the patch by following the instructions in the README file that is included with the patch.

Functional Setup Manager - Statement of Direction

Functional Setup Manager (FSM) has been desupported in the current release. Ignore the references to FSM that you might see in the online Help.

Issues and Workarounds for Installation, Upgrade, and Documentation

These issues and workarounds are for specific areas, such as installation, upgrade, security, and documentation.

Installation

This section applies to issues regarding installing Oracle BI Applications.

The stop/start.sh Script Isn't Running Properly

This issue applies to restarting the Business Intelligence servers.

The stop/start.sh script doesn't run properly when bi_server1 is down for some reason.

Workaround

  1. Kill all the Oracle WebLogic processes using this command:
    kill -9 $(ps -ef | grep weblogic | awk '{print $2}')
  2. Run this script to restart bi_server1:
    ./start.sh

Upgrade

You may encounter these issues while upgrading Oracle BI Applications.

Note:

The issues listed in this section are for your reference, and you can use them as guidance when you recreate the ODI customizations. These issues have been fixed in the ready-to-use ODI repository.
Mapping Fails Due to Aggregate Function Inside Filter During Upgrade

This issue may occur when you upgrade from Oracle Data Integrator 11g to 12c.

If an 11g mapping uses an aggregate function inside a filter, for example IND_SCORE >= (SELECT MAX(avg_score) FROM TABLE_A), then upon upgrade to 12c the filter is converted into an aggregate component. Because the aggregate component has extra "group by columns", it might generate inconsistent code that causes the mapping to fail.

Workaround

  1. Check which columns from the SELECT list aren't included in the "group by columns" list after the upgrade.
  2. Add those columns to the "group by columns" list.

    Note:

    If you expect that the SELECT statement might return duplicate rows, then this solution may not work. Applying "group by" on all columns results in a DISTINCT data set.
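The effect described in the note can be reproduced with a small SQLite sketch (the table and rows are made up for illustration): grouping by every selected column collapses duplicate rows, so a source that legitimately contains duplicates returns fewer rows.

```python
import sqlite3

# Hypothetical table with a legitimate duplicate row.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE table_a (ind_score INTEGER, avg_score INTEGER)")
con.executemany("INSERT INTO table_a VALUES (?, ?)",
                [(10, 5), (10, 5), (20, 7)])  # (10, 5) appears twice

plain = con.execute("SELECT ind_score, avg_score FROM table_a").fetchall()
grouped = con.execute(
    "SELECT ind_score, avg_score FROM table_a GROUP BY ind_score, avg_score"
).fetchall()

print(len(plain))    # 3: the duplicate row survives
print(len(grouped))  # 2: grouping by all columns behaves like DISTINCT
```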
Filters with SELECT Statement Failing with Invalid Identifier Error

If your filter includes a SELECT statement that selects the same table that's used in the main query, then the filter fails after you upgrade Oracle Data Integrator to 12c.

After upgrade, the table alias is changed. If the same table is used in a SELECT statement in the filter, then the filter fails due to an ambiguous reference.

Workaround

Rename the table alias in the filter to match the alias that was defined in Oracle Data Integrator 11g.
Import Source Adaptor IBM DB2 UDB Technology CROSS JOIN Syntax Isn't Getting Updated

The Import Source Adaptor IBM DB2 UDB Technology CROSS JOIN syntax isn't getting updated during the import process.

For IBM_DB2_UDB, the CROSS JOIN syntax must be CROSS JOIN instead of the default ",".

Workaround

  1. For the Source Adaptor with IBM_DB2_UDB technology, run the import command with import mode as IMPORT to import the Source Adaptor ODI metadata.
  2. After import is completed, sign in to ODI Studio, click Topology, expand Logical Architecture, and then expand Technologies.
  3. In Technologies, right-click IBM DB2 UDB, click Open, and then click SQL.
  4. Under the Cross Join syntax, change ',' to CROSS JOIN.
  5. Click Save and sign out of ODI Studio.
  6. Run the import command with Import mode as SCEREGEN to regenerate the scenarios.
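Step 4 changes only the join separator text: both spellings denote the same Cartesian product, as a quick SQLite sketch confirms (illustrative tables, not BI Applications objects).

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE a (x INTEGER)")
con.execute("CREATE TABLE b (y INTEGER)")
con.executemany("INSERT INTO a VALUES (?)", [(1,), (2,)])
con.executemany("INSERT INTO b VALUES (?)", [(10,), (20,)])

# The "," join separator and the explicit CROSS JOIN keyword produce the
# same Cartesian product of the two tables.
comma = con.execute("SELECT x, y FROM a, b ORDER BY x, y").fetchall()
cross = con.execute("SELECT x, y FROM a CROSS JOIN b ORDER BY x, y").fetchall()

print(comma == cross)  # True
print(len(cross))      # 4 rows: 2 x 2
```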
Maps Fail with Invalid Identifier Error

Maps fail with the invalid identifier error due to no space between a variable and expression.

If you have defined a column expression that contains variables and there's no space between the deriving column and the variable, then the mapping fails with the invalid identifier error because Oracle Data Integrator can't recognize the deriving column.

Workaround

Include a space between the deriving column and the variable to enable Oracle Data Integrator 12c to recognize the column and variable correctly.
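A toy tokenizer shows why the space matters (this regex is an illustration, not ODI's actual parser; the column and variable names are hypothetical): without a delimiter, the variable marker is swallowed into the column name, producing an identifier that doesn't exist.

```python
import re

# Toy identifier scan: unquoted Oracle identifiers may contain '#', '$',
# and '_', so a greedy scan swallows the variable marker when no space
# separates the column from the variable.
TOKEN = re.compile(r"[A-Za-z_][\w#$.]*")

print(TOKEN.findall("CREATION_DATE#BIAPPS.LAST_EXTRACT_DATE"))
# ['CREATION_DATE#BIAPPS.LAST_EXTRACT_DATE'] - one unrecognizable identifier
print(TOKEN.findall("CREATION_DATE #BIAPPS.LAST_EXTRACT_DATE"))
# ['CREATION_DATE', 'BIAPPS.LAST_EXTRACT_DATE'] - column and variable split apart
```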
JDBC URL for Non-Fusion Source File Data Servers

For non-Fusion sources, when moving data from the flat file (delimited type) to the warehouse table, the trailing spaces are getting trimmed from the column data, if there are any.

This is the expected functionality.

Workaround

If you want to preserve the trailing spaces in column data, then modify the file data server JDBC URL as follows:
jdbc:snps:dbfile?ENCODING=UTF-8&NO_RTRIM_DEL_STRING=TRUE
NOW User-Defined Function Causing Maps to Fail with Invalid Character Error

The NOW user-defined function is causing the upgraded maps to fail with the invalid character error.

Workaround

Replace %NOW% with (NOW) in the mapping expressions.
TRUNC() User-Defined Function Causing Inconsistent Data Types
The TRUNC() user-defined function is causing some of the upgraded maps to fail with the inconsistent data types error. The DATE unit is replaced with NUMBER during the upgrade process.

Workaround

Replace TRUNC with TRUNC_DATE.
Optimizer Picking Incorrect Index Scan for Queries with CASE Statements

After upgrade, the mapping queries containing CASE statements aren't picking up indexes defined on columns used inside those CASE statements. Instead, the plan displays indexes that aren't of use or defined on columns that aren't used in the query.

Workaround

Rename the alias of the tables used in the CASE statements to match the alias used in Oracle Data Integrator 11g or the actual table name.
Data Issue with Multiple Aggregate Steps After Upgrading to Oracle Data Integrator 12c

This issue applies to upgrading from Oracle Data Integrator 11g to Oracle Data Integrator 12c.

In Oracle Data Integrator 11g, if you use an aggregate function and "execute on" is defined as Staging and Source, then the upgraded map in Oracle Data Integrator 12c has two aggregate stages, AGG_SRC and AGG_STG. This causes the extraction query in Oracle Data Integrator to add two "group by" clauses, which might cause data issues.

Workaround

Move the aggregate logic to one stage (either AGG_SRC or AGG_STG) and update the manual "group by" clause accordingly. Also remove the other aggregate stage from the mapping.
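Consolidating to one aggregate stage matters because aggregating already-aggregated rows generally doesn't equal a single aggregation over the raw rows. A generic SQLite sketch (made-up table and numbers, not the actual mapping) shows the skew:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (grp TEXT, val REAL)")
con.executemany("INSERT INTO src VALUES (?, ?)",
                [("a", 1.0), ("a", 1.0), ("a", 1.0), ("b", 5.0)])

# Single aggregation stage: one GROUP BY over the raw rows.
one_stage = con.execute("SELECT AVG(val) FROM src").fetchone()[0]

# Two stacked stages: aggregate per group first (like AGG_SRC), then
# aggregate the aggregates again (like AGG_STG).
two_stage = con.execute(
    "SELECT AVG(v) FROM (SELECT AVG(val) AS v FROM src GROUP BY grp)"
).fetchone()[0]

print(one_stage)  # 2.0 - (1+1+1+5)/4
print(two_stage)  # 3.0 - (1+5)/2: the second "group by" skews the result
```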
User-Defined Function CONCAT Isn't Getting Resolved

In Oracle Data Integrator 12c, the CONCAT user-defined function isn't getting resolved for the Generic SQL technology, for example when you use Microsoft SQL Server as the source.

In ODI 11g, the user-defined function CONCAT has the syntax || and the implementation $(VALUE1)+$(VALUE2). For example, the mapping expression Column1 || Column2 resolves to Column1 + Column2. In ODI 12c, the mapping expression Column1 || Column2 doesn't resolve to Column1 + Column2.

Workaround

In the mapping expression, replace the concat operator || with the user-defined function CONCAT_OPER(). CONCAT_OPER() resolves to + for Microsoft SQL Server, and to || for Oracle, IBM DB2 UDB, and DB2 400.
Non-Oracle Source Registration Database Connection Issue

For non-Oracle source technologies IBM_DB2_UDB and MICROSOFT_SQL_SERVER, while registering the source in BI Applications Configuration Manager, the Test Database Connection option isn't enabled.

Also, the Schema Name and Catalog Name aren't getting updated in the Oracle Data Integrator repository.

Workaround

  1. In BI Applications Configuration Manager, register the source without testing the database connection.
  2. Sign in to ODI Studio.
  3. Click Topology, expand Physical Architecture, and then expand Non-Oracle Technology IBM_DB2_UDB or MICROSOFT_SQL_SERVER as applicable.
  4. Double-click on the connection name that you provided while registering the source and then click JDBC.
    Ensure that the JDBC format is correct:
    • IBM_DB2_UDB - jdbc:weblogic:db2://hostname:port;DatabaseName=RegDB
    • MICROSOFT_SQL_SERVER - jdbc:weblogic:sqlserver://hostname:port;DatabaseName=RegDB
  5. Click Test Connection, select the configured Physical Agent, and then click Test.
    The test connection must be successful. If it isn't, then verify that the JDBC URL, user name, and password are correct. Update as required and test again.
  6. To update the Schema Name and Catalog Name in the ODI repository, expand the database connection, double-click the required physical schema, and then click Definition.
  7. For IBM DB2 UDB Technology, verify the schema and work schema, and update the actual values with a value similar to the sample value "TESTCTL".
  8. For Microsoft SQL Server Technology, verify the Database (Catalog), Database (Work Catalog), Owner (Schema) and Owner (Work Schema), and update the actual values with a value similar to the sample value "JDE_DEVELOPMENT" and "TESTCTL".
  9. Click Save and close the physical schema.
  10. Repeat the steps for each database connection specified during the source registration.
SDE EBS Mappings Fail Due to Incorrect Joins

An incorrect variable declaration causes an incorrect join or table alias, which results in SDE EBS mappings (such as SDE_ORA_UOMCONVERSIONGENERAL_INTERCLASS) failing.

In Oracle Data Integrator 11g, variable declarations such as #LANGUAGE_BASE were properly resolved. In Oracle Data Integrator 12c, you must revise the variable declaration as #BIAPPS.LANGUAGE_BASE.

Workaround

Replace all variables declared as #Variable_Name in expression editors with #BIAPPS.Variable_Name.
SDE Mappings Fail with Invalid Identifier for the Seed Data Variable
The alias of source tables has changed between Oracle Data Integrator 11g and Oracle Data Integrator 12c. Seed data variables that use the table.column notation fail in 12c with an invalid identifier error, because the variables can't be resolved against the changed alias.

Workaround

Use the REPLACE_MULT_ALIAS user-defined function to replace the alias correctly according to Oracle Data Integrator 12c.
Unable to Specify an Outer Join on a Correlation Column

Source dependent extract (SDE) mappings are failing with an error message that indicates that an outer join can't be specified on a correlation column.

When the same table is part of both a select query and a mapping, and no alias is given for the table in the inner select query, the mapping fails with an error in which the outer join refers to the alias of the outer query.

Workaround

Add an alias to the tables used in the inner select query.
Too Much Recursion in Text Resolution Issue when Generating a Scenario

When trying to generate a scenario, you might see an error message such as "Too much recursion in text resolution." The user-defined function TRUNC causes this issue and prevents the scenario from being generated.

In Oracle Data Integrator 12c, the TRUNC(SYSDATE) syntax used in Oracle Data Integrator 11g changed to TRUNC(SYSDATE,0).

Workaround

Use the TO_DATE(TO_CHAR(SYSDATE,'YYYYMMDD'),'YYYYMMDD') function instead of the TRUNC function.
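Both expressions reduce a timestamp to midnight of the same day. A small Python sketch of the TO_CHAR/TO_DATE round trip (datetime standing in for SYSDATE) shows why the replacement is equivalent to a date truncation:

```python
from datetime import datetime

def trunc_via_roundtrip(ts: datetime) -> datetime:
    # Mirrors TO_DATE(TO_CHAR(SYSDATE,'YYYYMMDD'),'YYYYMMDD'): format the
    # timestamp as a date-only string, then parse it back, dropping the time.
    return datetime.strptime(ts.strftime("%Y%m%d"), "%Y%m%d")

ts = datetime(2024, 5, 17, 13, 45, 9)
print(trunc_via_roundtrip(ts))  # 2024-05-17 00:00:00
```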
SDE Mappings Fail with $CDC_DML_CODE/CDC$_SRC_LAST_UPDATE_DATE Invalid Identifier

After upgrading to Oracle Data Integrator 12c, you might find that SDE mappings are failing with the $CDC_DML_CODE/CDC$_SRC_LAST_UPDATE_DATE invalid identifier.

The aggregator source (AGG_SRC) includes the CDC$ columns. When code is generated, the upgrade process considers the columns in the "select" statement.

Workaround

  1. For the CDC$ columns in Oracle Data Integrator 11g, change the "execute on" setting from "staging" to "source" and import the mappings from 11g to 12c.
  2. If the issue still persists, then complete the following steps in Oracle Data Integrator 12c:
    1. In the temporary mappings, enable the "Generate ANSI Syntax" check box in the join conditions.
    2. Create a separate flow for SDS and non-SDS modes. Refer to the SDE_PSFT_NEGOTIATIONINVITATIONFACT and SDE_PSFT_NEGOTIATIONLINEFACT mappings for the job design changes.
Upgraded Interfaces are Failing with Missing IN or OUT Parameter at Index Exception

Upgraded interfaces are failing with the "Missing IN or OUT parameter at index" exception when a mapping column name is the same as a variable name.

If a column name is prefixed with : and a variable with the same name exists, then the column is recognized as a variable, because variables are specified in this way.

Workaround

If any KM has :[CX_COL_NAME] in its command, then replace :[CX_COL_NAME] with <?= (char) 58 + \u0022[CX_COL_NAME]\u0022 ?>.
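In the replacement, (char) 58 is the ASCII colon and \u0022 is the double quote, so the command still emits a colon-prefixed, quoted column name, but the colon is assembled at a later template-evaluation stage. A two-phase toy model (the phases and regexes are assumptions for illustration, not ODI's actual pipeline) shows why deferring the colon keeps the bind/variable scan from misreading the column:

```python
import re

BIND = re.compile(r":(\w+)")                 # phase 1: scan for :name tokens
SNIPPET = re.compile(r"<\?=\s*COLON\s*\?>")  # phase 2: evaluate deferred snippets

def bind_scan(text):
    # Anything matching :name is treated as a bind variable in this phase.
    return BIND.findall(text)

def evaluate_snippets(text):
    # chr(58) == ':' - the colon only comes into existence in this phase,
    # after the bind scan has already run.
    return SNIPPET.sub(chr(58), text)

literal = ":CX_COL_NAME = 1"
deferred = "<?= COLON ?>CX_COL_NAME = 1"

print(bind_scan(literal))           # ['CX_COL_NAME'] - misread as a variable
print(bind_scan(deferred))          # [] - no literal colon to misread
print(evaluate_snippets(deferred))  # ':CX_COL_NAME = 1'
```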
SDE_FUSION_TIMECARDPROCESSEDTIMEFACT_HWM Fails Due to Missing Columns in Nested SELECT

Fusion SDE mappings such as SDE_FUSION_TIMECARDPROCESSEDTIMEFACT_HWM are failing with TC_SEC_ORGANIZATION_ID invalid identifier error.

These mappings are failing with the invalid identifier error because the table alias prefix is missing from the column name in the Expression Editor.

Workaround

Apply the table alias prefix correctly against all these columns in the Expression Editor. For example, change COLUMN_NAME to TABLE_NAME_ALIAS.COLUMN_NAME.
EBS SDE Mappings Are Failing Due to Missing or Incorrect Reference

EBS SDE mappings such as SDE_ORA_MFGLOTGENEALOGYFACT are failing with SQ_W_LOT_GENEALOGY_FS_UN_ALL.L1_INVENTORY_ITEM_ID invalid identifier error.

Oracle Data Integrator 12c is unable to resolve the attributes if there's a space at the end of L1_INVENTORY_ITEM_ID in the Expression Editor.

Workaround

In the Expression Editor, for all such columns that have an incorrect or missing reference, remove the trailing space or junk characters.
SDE_FUSION_SALESINVOICELINESFACT Fails with Invalid Identifier Error

SDE_FUSION_SALESINVOICELINESFACT is failing with GREATEST_20 invalid identifier error.

AUX4_CHANGED_ON_DT is evaluated using the GREATEST_20 function, but only 15 parameters are provided to it, so SDE_FUSION_SALESINVOICELINESFACT fails.

Workaround

  1. Open mapping SDE_FUSION_SalesInvoiceLinesFact.W_SALES_INVOICE_LINE_FS.
  2. Select AUX4_CHANGED_ON_DT in target W_SALES_INVOICE_LINE_FS and change the expression as follows:
    GREATEST_15(
    COALESCE(LKP_W_FSN_CUSTLOC_USE_PS_SHIP.AUX1_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_FSN_CUSTLOC_USE_PS_SHIP.AUX2_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_FSN_CUSTLOC_USE_PS_SHIP.AUX3_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_FSN_CUSTLOC_USE_PS_SHIP.AUX4_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_FSN_CUSTLOC_USE_PS_SHIP.CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_PAY.AUX1_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_PAY.AUX2_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_PAY.AUX3_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_PAY.AUX4_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_PAY.CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_INVO.AUX1_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_INVO.AUX2_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_INVO.AUX3_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_INVO.AUX4_CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE')),
    COALESCE(LKP_W_CUST_LOC_USE_PS_FSN_INVO.CHANGED_ON_DT,TO_DATE_VAR('#LOW_DATE'))
    )
  3. Save the changes and regenerate the scenario.
SILOS Mapping Fails with Invalid Identifier Error

SILOS mapping is failing with CALCULATE_DT_WID invalid identifier error.

The CALCULATE_DT_WID user-defined function isn't getting resolved; it appears in the generated code as is, and therefore fails.

Workaround

Edit the CALCULATE_DT_WID user-defined function to include '0' (zero) in its argument. For example, CALCULATE_DT_WID_DFLT($(IN_DATE),0).
Incorrect Column Alias Causing SDE FUSION Mappings Failure

Incorrect column alias in the "Create SDS work view" step causes an invalid identifier issue.

Many SDE mappings are failing at the "Create SDS work view" step because the column names in the sub-select and the outer select don't match in the generated SQL. Code generation produces incorrect aliases for some columns, which causes the invalid identifier issue. For example, SDE_FUSION_PRODUCTDIMENSION_DERIVE failed with the ORA-00904: "SQ_BCI_PRODUCTS_SQ_BCI_PRODUCT"."C1_ORGORGANIZATIONDEFINITIO": invalid identifier error.

Workaround

This workaround is an example. Ensure that you modify the Generated Derived TableSQL task based on the KM in use.
  1. In KM IKM BIAPPS Oracle Control Append, go to task Generated Derived TableSQL and edit line number 127 to include ALIAS separator as follows:
    <%=odiRef.getColList(i,"", "[EXPRESSION]\t[ALIAS_SEP] [COL_NAME]", ",\n\t","", "((INS) and REW)")%>
  2. In KM IKM BIAPPS Oracle Control Append, go to task Generated Derived TableSQL and edit line number 233 to include ALIAS separator as follows:
    <%=odiRef.getColList(i,"", "[EXPRESSION]\t[ALIAS_SEP] [COL_NAME]", ",\n\t","", "((INS) and REW)")%><$=selectFlexColList$>
Default Knowledge Module Replaces the Knowledge Module Imported from ODI 12.2.1.3

When you export a mapping from ODI 12.2.1.3 and import that mapping in ODI 12.2.1.4, the default Knowledge Module (KM) replaces the imported KM. This issue occurs because the KM associations aren’t maintained. This issue applies to all the adaptors.

To work around this issue, open the imported mapping, and manually attach or replace the required KM.

Documentation Corrections

This information provides corrections and additions for documentation and Help for Oracle BI Applications.

General Documentation Issues

There are currently no documentation errors.

General Issues and Workarounds

These topics involve general issues and workarounds for Oracle BI Applications.

SDE_JDE_PurchaseAgreementLineFact Map Fails

The SDE_JDE_PurchaseAgreementLineFact map fails with "Invalid number" error in the SDS Mode.

Workaround

  1. In ODI Designer, expand BI Apps Project, expand Mappings, and then click SDE_JDEE1_91_Adaptor.
  2. In the SDE_JDE_PurchaseAgreementLineFact folder, open the SDE_JDE_PurchaseAgreementLineFact.W_PURCH_AGREEMENT_LINE_FS_STANDARD_SQ_RMS reusable mapping and change the mapping expression of the DPPSR column from TO_NUMBER(TRIM(F38012.DPPSR)) to IIF (IS_NUMBER(TRIM(F38012.DPPSR)),TO_CHAR(TO_NUMBER(TRIM(F38012.DPPSR))),TO_CHAR(TRIM(F38012.DPPSR))).
  3. Save the reusable mapping and close it.
  4. Open the SDE_JDE_PurchaseAgreementLineFact.W_PURCH_AGREEMENT_LINE_FS_SQ_RELEASED_AMOUNT_RMS reusable mapping and change the join condition between F38012 and F38111 from F38012.DPDMCT=F38111.DZDMCT AND F38012.DPDMCS=F38111.DZDMCS AND F38012.DPSEQ=F38111.DZSEQ AND TO_NUMBER(TRIM(F38012.DPPSR))=F38111.DZAN8 AND F38012.DPITM=F38111.DZITM to F38012.DPDMCT=F38111.DZDMCT AND F38012.DPDMCS=F38111.DZDMCS AND F38012.DPSEQ=F38111.DZSEQ AND (IIF (IS_NUMBER(TRIM(F38012.DPPSR)),TO_NUMBER(TRIM(F38012.DPPSR)),-999))=F38111.DZAN8 AND F38012.DPITM=F38111.DZITM.
  5. Save the reusable mapping and close it.
  6. Open the SDE_JDE_PurchaseAgreementLineFact.W_PURCH_AGREEMENT_LINE_FS_SQ_F38012_VENDOR_ITEM_TEMP_RMS reusable mapping and change the join condition between F38012 and F4104 from TO_NUMBER(TRIM(F38012.DPPSR))=F4104.IVAN8 AND F38012.DPITM=F4104.IVITM to (IIF (IS_NUMBER(TRIM(F38012.DPPSR)),TO_NUMBER(TRIM(F38012.DPPSR)),-999))=F4104.IVAN8 AND F38012.DPITM=F4104.IVITM.
  7. Save the reusable mapping and close it.
  8. Regenerate the SDE_JDEE1_91_ADAPTOR_SDE_JDE_PURCHASEAGREEMENTLINEFACT scenario.

SDE_PSFT_ProjectAwardFundingFact Map Fails to Convert Data Type

This issue is applicable for the PeopleSoft MSSQL DB source only.

The PeopleSoft MSSQL map SDE_PSFT_ProjectAwardFundingFact fails to convert the data type. For example, this map fails to convert the nvarchar value "2005M03" to data type "smallint".

Workaround

  1. Sign in to ODI Studio Designer, expand BI Apps Project, expand Mappings, click SDE_PSFT_92_Adaptor, and then click SDE_PSFT_ProjectAwardFundingFact.
  2. In the SDE_PSFT_ProjectAwardFundingFact.W_PROJ_AWARD_FUNDING_FS_SQ_AWARD_FUNDING_RMS reusable mapping, in the join between GM_AWARD and LKP_BUDGET_PERIOD, replace the current join condition with the following:
    GM_AWARD.CONTRACT_NUM=LKP_BUDGET_PERIOD.CONTRACT_NUM AND PC_BUD_DETAIL.PROJECT_ID=LKP_BUDGET_PERIOD.PROJECT_ID AND
        PC_BUD_DETAIL.BUDGET_PERIOD=TO_CHAR(LKP_BUDGET_PERIOD.BUDGET_PERIOD_GM)
    You see the TO_CHAR user-defined function added to LKP_BUDGET_PERIOD.BUDGET_PERIOD_GM.
  3. Save the reusable mapping.
  4. Regenerate the SDE_PSFT_92_ADAPTOR_SDE_PSFT_PROJECTAWARDFUNDINGFACT scenario.

BIA_11 ODI XML Import Fails in the Windows Environment

When you unzip the BIA_11 adaptor zip file in Windows, the data store folder name SalesOrderLineAggregate. is automatically created as SalesOrderLineAggregate_.

The unzip tool adds "_" instead of "." at the end of the folder name because Windows doesn't accept a dot at the end of a folder name. Hence, while importing the BIA_11 adaptor, the import command is unable to locate the file for the data store (SalesOrderLineAggregate.). This causes the import process to fail and scenario generation isn't processed.

Workaround

  1. Update the "imp_ctl_55datastore.properties" property file by replacing SalesOrderLineAggregate. with SalesOrderLineAggregate_ in the following path:
    ORACLE/BIA_11/ORACLE/Model/OracleBIApplications/OracleBIApplications/Aggregate/SalesOrderLineAggregate_/TAB_SalesOrderLineAggregate_.xml

    Note:

    The "imp_ctl_55datastore.properties" file is available in the unzipped ORACLE__BIA_11__ORACLE_ODI_Metadata folder at ORACLE/BIA_11/ORACLE.
  2. Create the ODI repository by overwriting the existing repository and start the BIA_11 ODI XML import process.

E-Business Suite Source Extraction in the SDS Mode Fails

When you run the E-Business Suite source extraction process in the SDS mode, the SDE_ORA_MfgProductionCostFact and SDE_ORA_KanbanReplenishmentCycleFact maps fail with the "ORA-00979: not a GROUP BY expression" error. This issue occurs because the RUN_REPLICATED_TRANSACTIONAL user-defined function isn't included in the Manual Group By expression.

Workaround

  1. Sign in to ODI Studio.
  2. Click the Designer tab, expand BI Apps Projects, click Mapping, and then click SDE_ORAR122_Adaptor.
  3. Navigate to the SDE_ORA_MfgProductionCostFact mapping folder.
  4. Open the SDE_ORA_MfgProductionCostFact.W_MFG_PROD_COST_FS_SQ_WIP_PERIOD_BALANCES_RMS reusable mapping.
  5. Edit the Manual Group By clause for the Aggregator component AGG_SRC1.
    After the update, the content of the clause must be as follows:
    WDJ.ORGANIZATION_ID, WDJ.WIP_ENTITY_ID, WDJ.STATUS_TYPE, WE.ENTITY_TYPE,
    WDJ.PRIMARY_ITEM_ID, WDJ.START_QUANTITY, WDJ.QUANTITY_COMPLETED, WDJ.QUANTITY_SCRAPPED,
    PPP.SEIBAN_NUMBER_FLAG, WDJ.PROJECT_ID, WDJ.OUTSIDE_PROCESSING_ACCOUNT,
    WPB.TL_OUTSIDE_PROCESSING_OUT, WPB.PL_OUTSIDE_PROCESSING_OUT,
    WPB.TL_OUTSIDE_PROCESSING_IN, WPB.PL_OUTSIDE_PROCESSING_IN, WDJ.CREATION_DATE,
    WDJ.CREATED_BY,
    RUN_REPLICATED_TRANSACTIONAL("#IS_SDS_DEPLOYED",WDJ.LAST_UPDATE_DATE,WDJ.CDC$_SRC_LAST_UPDATE_DATE),
    WDJ.LAST_UPDATED_BY, MSIB.PRIMARY_UOM_CODE,
    WPB.LAST_UPDATE_DATE, WDJ.OUTSIDE_PROC_VARIANCE_ACCOUNT, MP.PRIMARY_COST_METHOD,
    PPP.COSTING_GROUP_ID, MP.WMS_ENABLED_FLAG, MP.PROJECT_REFERENCE_ENABLED,
    MP.DEFAULT_COST_GROUP_ID, MP.COST_ORGANIZATION_ID, WDJ.DATE_RELEASED,
    GL.LEDGER_ID, GL.CURRENCY_CODE,
    RUN_REPLICATED_TRANSACTIONAL("#IS_SDS_DEPLOYED",'N',CASE WHEN WPB.CDC$_DML_CODE = 'D' THEN 'Y' ELSE 'N' END)
  6. Navigate to the SDE_ORA_KanbanReplenishmentCycleFact mapping folder.
  7. Open the SDE_ORA_KanbanReplenishmentCycleFact.W_KANBAN_REPLEN_CYCLE_FS_SQ_MTL_KANBAN_CARD_ACTIVITY_RMS reusable mapping.
  8. Edit the Manual Group By clause for the Aggregator component AGG_SRC1.
    After the update, the content of the clause must be as follows:
    MTL_KANBAN_CARD_ACTIVITY.ORGANIZATION_ID,
    'STORAGE_LOC'||'~'||TO_CHAR(MTL_KANBAN_CARD_ACTIVITY.ORGANIZATION_ID)||'~'||TO_CHAR(MTL_KANBAN_CARD_ACTIVITY.SUBINVENTORY_NAME)||'~'||TO_CHAR(MTL_KANBAN_CARD_ACTIVITY.LOCATOR_ID),
    MTL_KANBAN_CARD_ACTIVITY.INVENTORY_ITEM_ID,
    TO_CHAR(MTL_KANBAN_CARD_ACTIVITY.INVENTORY_ITEM_ID)||'~'||TO_CHAR(MTL_KANBAN_CARD_ACTIVITY.ORGANIZATION_ID),
    MTL_KANBAN_CARD_ACTIVITY.KANBAN_CARD_ID,
    'STORAGE_LOC'||'~'||MTL_KANBAN_CARD_ACTIVITY.SOURCE_ORGANIZATION_ID||'~'||MTL_KANBAN_CARD_ACTIVITY.SOURCE_SUBINVENTORY||'~'||TO_CHAR(MTL_KANBAN_CARD_ACTIVITY.SOURCE_LOCATOR_ID),
    MTL_KANBAN_CARD_ACTIVITY.REPLENISHMENT_CYCLE_ID,
    MTL_KANBAN_CARDS.LAST_UPDATE_DATE,
    TO_CHAR(MTL_KANBAN_CARD_ACTIVITY.KANBAN_CARD_ID)||'~'||TO_CHAR(MTL_KANBAN_CARD_ACTIVITY.REPLENISHMENT_CYCLE_ID),
    RUN_REPLICATED_TRANSACTIONAL("#IS_SDS_DEPLOYED",MTL_KANBAN_CARD_ACTIVITY.LAST_UPDATE_DATE, MTL_KANBAN_CARD_ACTIVITY.CDC$_SRC_LAST_UPDATE_DATE),
    LKP_LATEST_STATUS.MAX_ACTIVITY_DATE,
    RUN_REPLICATED_TRANSACTIONAL("#IS_SDS_DEPLOYED", 'N',CASE WHEN MTL_KANBAN_CARD_ACTIVITY.CDC$_DML_CODE = 'D' THEN 'Y' ELSE 'N' END)

Change in Language Isn't Working as Expected

In Oracle BI Applications Configuration Manager, when you change your language preference in the Preference section, save, and return to Manage Warehouse Languages in the System Setup section, the values of the "Installed" column in the warehouse Language table aren’t translated to your preferred language.

Workaround

Sign out and sign in to Oracle BI Applications Configuration Manager to resolve the issue.

UCM - RIDC Thin Communication API has been Deprecated

RIDC support has been deprecated from Oracle BI Application 11.1.1.10.3 release onwards.

The following Oracle Data Integrator procedures and packages use the RIDC thin communication APIs to download, delete, and upload a single file, and can no longer use them:
  • UCM Processing - Procedure
  • UCM Delete from UCM - Package
  • UCM Download from UCM - Package
  • UCM Upload to UCM - Package

Workaround

Use the web service APIs to communicate with UCM. Refer to the UCM documentation.

Resource Limit Exception for Data Source When System is Under Stress

This issue applies to Oracle BI Applications Configuration Manager.

When you stress test Oracle BI Applications Configuration Manager by continuously navigating its dialogs for weeks, you may encounter a resource limit exception for the data source "BIAPPSODIRepositoryDS". This issue is very rare and intermittent.

Workaround

Restart the "bi_server1" to resolve the issue.

ODI Agent Overwrites the Maximum Number of Sessions to the Default Value of 1000

During deployment, OdiConfigAgent is overwriting the values for MAX_THREAD and MAX_SESS to the default value of 1000.

You must set the maximum number of sessions for the ODI Agent to no more than five. The default parallelism configuration for sessions and threads is 1000. Follow the steps in the workaround to configure parallelism for sessions and threads.

Workaround

  1. Sign in to the ODI master repository.
  2. Navigate to the Topology tab, expand Physical Architecture, and then expand Agent Option.
  3. Select an agent, such as OracleDIAgent, and open it.
  4. Select the Definition tab and modify the values as follows:
    1. Maximum number of sessions: 5
    2. Maximum number of threads: 5
  5. Save and disconnect from the ODI master repository.

SDE_ORA_QARESULTSFACT Map Fails with the Invalid Identifier Error

This issue is specific to the E-Business Suite 11g database.

The SDE_ORA_QARESULTSFACT map fails with the "QA_CHARS_SQ_W_QA_RESULTS_FS"."CREATION_DATE" invalid identifier error in the load data step when you run it against the E-Business Suite 11g database.

Workaround

Ensure that your source E-Business Suite database is release 12c or later.

Exceeded the Configured Maximum Number of Rows Returned for a View

You may encounter the error that the report has exceeded the configured maximum number of allowed input records.

Workaround for Pivot Tables and Graphs

If you encounter this error for pivot tables and graphs, then manually configure the cube settings. You can use settings within the cube element to affect how data is displayed and processed in pivot tables and graphs.

  1. Open the instanceconfig.xml file for editing. This file is available at BI_DOMAIN/config/fmwconfig/biconfig/OBIPS.
  2. Locate the cube section and add the CubeMaxRecords element. This element specifies the maximum number of records that are returned by an analysis for the view to process. This roughly governs the maximum number of cells that can be populated in a view; unpopulated cells in a sparse view don't count. The default is 40000.
  3. Include the elements and their ancestor elements as appropriate, as in the following example:
    <ServerInstance>
      <Views>
        <Cube>
          <CubeMaxRecords>65000</CubeMaxRecords>
        </Cube>
      </Views>
    </ServerInstance>

    Note:

    Both CubeMaxRecords and the maximum number of rows setting in Fusion Middleware Control limit the number of rows returned; the setting with the larger value determines the limit. See Use Fusion Middleware Control to Set the Maximum Number of Rows Processed to Render a Table.
  4. Save your changes and close the file.
  5. Restart Oracle Business Intelligence.

Workaround for Reports with Trellis Views

  1. Open the instanceconfig.xml file for editing. The file is available at BI_DOMAIN/config/fmwconfig/biconfig/OBIPS.
  2. Include the elements and their ancestor elements as appropriate, as shown in the following example:
    <Views>
      <Trellis>
        <Simple>
          <MaxCells>4000</MaxCells>
          <MaxVisibleSections>10</MaxVisibleSections>
          <MaxVisiblePages>1000</MaxVisiblePages>
          <MaxVisibleRows>2000</MaxVisibleRows>
          <MaxVisibleColumns>2000</MaxVisibleColumns>
          <MaxPagesToRollOutInDelivery>1000</MaxPagesToRollOutInDelivery>
          <DefaultRowsDisplayed>10</DefaultRowsDisplayed>
          <DefaultRowsDisplayedInDelivery>100</DefaultRowsDisplayedInDelivery>
          <DefaultRowsDisplayedInDownload>6500</DefaultRowsDisplayedInDownload>
        </Simple>
        <Advanced>
          <MaxCells>5000</MaxCells>
          <MaxVisibleSections>50</MaxVisibleSections>
          <MaxVisiblePages>1000</MaxVisiblePages>
          <MaxVisibleRows>250</MaxVisibleRows>
          <MaxVisibleColumns>150</MaxVisibleColumns>
          <MaxPagesToRollOutInDelivery>1000</MaxPagesToRollOutInDelivery>
          <DefaultRowsDisplayed>25</DefaultRowsDisplayed>
          <DefaultRowsDisplayedInDelivery>250</DefaultRowsDisplayedInDelivery>
          <DefaultRowsDisplayedInDownload>10000</DefaultRowsDisplayedInDownload>
        </Advanced>
      </Trellis>
    </Views>
  3. Save your changes and close the file.
  4. Restart Oracle Business Intelligence.

    Note:

    Depending on the volume of data, you might need to adjust these values.

Error While Selecting Mark as Complete for Failed Step in the ODI Console

In ODI Console, for a failed load plan execution, you may encounter an error when you right-click the step number of any failed load plan step and select Mark as Complete in the ODI menu.

Workaround

In ODI Studio, mark the failed step as complete.

Distributed Transaction Waiting for Lock

This issue applies to the DBLink mode.

In the DBlink mode, the ETL maps may fail with the "distributed transaction waiting for lock" error.

Workaround

To resolve this error, increase the value of the DISTRIBUTED_LOCK_TIMEOUT parameter to 120 or 180 seconds and restart the database.
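DISTRIBUTED_LOCK_TIMEOUT is a static initialization parameter, so it can't be changed for the running instance only. A sketch of the change, assuming the database starts from an SPFILE:

```sql
-- Check the current value of the parameter.
SELECT value FROM v$parameter WHERE name = 'distributed_lock_timeout';

-- DISTRIBUTED_LOCK_TIMEOUT is static: write the new value to the SPFILE.
-- It takes effect only after the database is restarted.
ALTER SYSTEM SET distributed_lock_timeout = 180 SCOPE = SPFILE;
```

After restarting the database, re-run the failed ETL maps.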

Error While Resetting the Data Warehouse and SDS Schema

You may encounter an error while resetting the data warehouse and SDS schema.

Workaround

When you see the error, click OK and continue. The system resets the data warehouse and the SDS schema in the background.

SDE_PSFT_DOMAINGENERAL_ETHNICGROUP Map Failing During Incremental Run

In the SDS mode, during incremental run, the SDE_PSFT_DOMAINGENERAL_ETHNICGROUP map may fail with the "Non supported SQL92 token" error.

This error occurs because there is a leading space before the table name for PSXLATITEM in the QUALIFY user-defined function.

Workaround

In the SDE_PSFT_DomainGeneral_EthnicGroup.W_DOMAIN_MEMBER_GS_SQ_RMS map, open the Filter2 condition and replace it with the following:
RUN_FULL_INCREMENTAL('#IS_INCREMENTAL',
  1=1,
  RUN_REPLICATED_TRANSACTIONAL('#IS_SDS_DEPLOYED',
    PSXLATITEM.LASTUPDDTTM > TO_DATE_VAR('#LAST_EXTRACT_DATE'),
    (PSXLATITEM.FIELDVALUE, PSXLATITEM.FIELDNAME) IN ( SELECT
PS.FIELDVALUE,PS.FIELDNAME FROM QUALIFY(PSXLATITEM) PS WHERE PS.FIELDNAME IN
('ETHNIC_GROUP','ETHNIC_GRP_CD_UK') AND
    PS.CDC$_SRC_LAST_UPDATE_DATE > TO_DATE_VAR('#LAST_EXTRACT_DATE'))
  )
)

Mark Manifest as Processed Step in Load Plan Gets Delayed

For Oracle Fusion sources, the "Mark manifest as processed" step in the load plan takes between 8 and 40 minutes. The delay depends on the network speed and the number of files extracted in the Universal Content Management system.

Avoid the delay by moving the "Mark manifest as processed" step to the parallel step instead of running it as a serial step.

Workaround

  1. Open the generated load plan in ODI Studio.
  2. Expand SDS Load Phase, expand File to SDS Load Phase, expand 1 FTS Load, expand 2 FTS Fact Group, expand the second Serial, and then expand 3 FTS General Post Load.
  3. In 3 FTS General Post Load, drill down to Post Load.
    You see the "Mark manifest as processed" scenario task.
  4. Expand Source Extract Phase, expand 1 SDE Extract, expand 2 SDE fact Group, and then expand Parallel (Fact groups).
  5. Drag the Mark manifest as processed scenario task from Post Load under 3 FTS General Post Load and drop it in Parallel (Fact groups).

ARG_GROUP Invalid Identifier Error

For any of the upgraded interfaces, if the system displays the "ARG_GROUP": invalid identifier error, then follow this workaround.

Workaround

  1. Enclose the argument in double parentheses, for example, ARG_GROUP((col 1, col 2)).
    This change isn't required when there is only one argument, such as ARG_GROUP(col 1).
  2. Ensure that the entire expression is on a single line, for example:
    IS_FIRST(ARG_GROUP((col 1, col 2, col 3)),ARG_GROUP((col 1, col 2, col 3)))

Exception Appears While Resetting the Source Dependent Schema

This issue applies to resetting the source dependent schema.

While resetting the source dependent schema (SDS) from Oracle Business Intelligence Configuration Manager, an exception is displayed if you click OK immediately after selecting the data source.

Workaround

  1. While resetting the SDS, after selecting the data source, don't click OK.
  2. Wait for the application to load other fields such as context and odiAgent.
  3. Set all the mandatory fields and then click OK.

Improve Fusion FTS Load Performance by Enabling External Table Mode

The FTS map uses the JVM mode to process primary extract files. In JVM mode, data is processed row by row from the pecsv file to the work table, which is slow for large extracts.

To overcome this, you can use the External Table mode for the FTS map, which loads primary extract files much faster than the JVM mode. By default, the FTS map runs in the JVM mode. To enable the External Table mode, you must set the flag manually in the W_FTS_DIRECTORY DW table. You can enable this mode for some or all maps.

Workaround

  1. To enable the External Table mode for all FTS maps, add an entry in the W_FTS_DIRECTORY DW table. For example:

    insert into W_FTS_DIRECTORY (REPLSTG_FILE_PATH, EXT_TBL_LOAD_FLG, TWO_TIER_INST_FLG, COMMENTS)
    values ('/nfs/setup/oracle/work/ReplSTG', 'Y', 'ONPREM:N', '');

    commit;

    Note: If you're using Oracle BI Applications on Cloud Infrastructure services, then create a mount directory that is accessible from the OCI VM and DBaaS, for example, /mnt/biapps/ReplStg/.

    Possible values for the TWO_TIER_INST_FLG attribute:

    • For on-premises, the value is either ONPREM:N or ONPREM:Y
    • For PaaS (OCI and DBaaS), the value is either PAAS:N or PAAS:Y
  2. To enable only a few maps for the External Table mode, add an entry in the W_FTS_LOAD_TYPE DW table. For example:

    insert into W_FTS_LOAD_TYPE (LOAD_TYPE, UPDATED_BY, CREATED_BY, LAST_UPDATED_DATE, CRETED_DATE, SCENARIO_NAME)
    values ('EXT_TABLE', 'XYZ', 'XYZ', sysdate, sysdate, '<Scenario Name>');

    commit;
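Whichever option you use, you can confirm the configuration afterwards with a quick query; the column names below come from the insert statements above:

```sql
-- Global External Table mode: EXT_TBL_LOAD_FLG = 'Y' enables it for all maps.
SELECT REPLSTG_FILE_PATH, EXT_TBL_LOAD_FLG, TWO_TIER_INST_FLG
FROM   W_FTS_DIRECTORY;

-- Per-map enablement: one row per scenario with LOAD_TYPE = 'EXT_TABLE'.
SELECT SCENARIO_NAME, LOAD_TYPE
FROM   W_FTS_LOAD_TYPE
WHERE  LOAD_TYPE = 'EXT_TABLE';
```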

Order of Joins Generated in the Load Data Step Differs in Oracle Data Integrator 12c

In the SDE mappings of all adaptors, the order of the joins generated in the SQL code of the Load Data step differs in Oracle Data Integrator 12c from Oracle Data Integrator 11g.

Workaround

Redesign the SDE mapping by removing all the joins and recreating each one in the same order as the joins from the SDE query in Oracle Data Integrator 11g.

Non-Oracle Source Adapters Scenario Regeneration Issue

Non-Oracle source adapters scenario regeneration displays an error.

Workaround

If the mappings or reusable mappings have lookups, then in all the mappings and reusable mappings, set the Generate ANSI Syntax and Join Order options for the join components, and then generate the scenario. Sample maps fixed in this way are SDE_PSFT_CostCenterDimension_FINSCM and SDE_JDE_ARTransactionFact_Full_SA_WO.

Source Dependent Extract Maps Fail with Invalid Identifier Error

Source dependent extract (SDE) maps might fail with the CDC$_DML_CODE/CDC$_SRC_LAST_UPDATE_DATE invalid identifier message.

This issue occurs because in the non-source dependent schema mode, the CDC columns are included in the “select” clause of the SQL query.

Workaround

  1. In the temporary mappings, enable the Generate ANSI Syntax check box in the join conditions.
  2. Create a separate flow for the source dependent schema and non-source dependent schema modes.
    Refer to the SDE_PSFT_NEGOTIATIONINVITATIONFACT and SDE_PSFT_NEGOTIATIONLINEFACT mappings for job design changes.

Hierarchy Maps Fail with Invalid Identifier Error

Many Hierarchy maps fail with the invalid identifier error in Oracle Data Integrator 12c.

In the FULL load ETL with the SBL-2018 source, the SDE task SDE_PARTYORGANIZATIONHIERARCHY fails with the "DIST_COMP"."BASE_LVL_PARTY_ID": invalid identifier message.

Workaround

  1. Move all the user defined functions (UDF) to the Target mapping.
  2. Remove the unwanted columns from the Distinct operation.
  3. Add the PAR_INTEGRATION_ID column to the Distinct operation.
  4. Change the Alias name of the distinct component from DIST_COMP to the source table name (W_SBL_PARTY_ORGDH_PS).
    Refer to the SDE_PARTYORGANIZATIONHIERARCHY map of Siebel as an example.

Data Types Created Incorrectly While Creating the C$ Table

When the C$ table is created, the data types for some columns are created incorrectly.

Workaround

Add TO_CHAR or TO_DATE functions to the expression based on the data type that you're expecting, for example, TO_CHAR(expression) or TO_DATE(expression).
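When the target column should be a date, an explicit format mask keeps the cast from depending on the session's NLS_DATE_FORMAT. A sketch; the SQ_SOURCE alias and column names below are illustrative, not from an actual map:

```sql
-- Force a character data type for the C$ column.
TO_CHAR(SQ_SOURCE.AMOUNT)

-- Force a date data type, with an explicit mask for a text source column.
TO_DATE(SQ_SOURCE.TXN_DATE_TXT, 'YYYY-MM-DD')
```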

Unable to Generate Scenario for SDE_FUSION_PARTYSUPPLIERSTAGING

This issue applies to scenario generation in Oracle Data Integrator 12c.

Workaround

If your main interface contains filters that apply over the column of reusable mapping, then uncheck the sub-select enabled box for the scenario to be generated successfully. For example, in mapping SDE_FUSION_PartySupplierStaging.W_FSN_PARTY_SUPPLIER_PS, uncheck the sub-select enabled box for reusable mapping component SQ_SUPPLIERPVO.

Unable to Authorize Connection Between the WebLogic Servers

This issue applies to the WebLogic servers.

The connection between Oracle WebLogic Servers isn't being authorized because the nonce value has expired. The clocks in the servers in the domain aren't synchronized.

Workaround

Synchronize the hardware clock by restarting the Network Time Protocol daemon (NTPD) or rebooting your machine.

Map SDE_SBL_822_ADAPTOR_SDE_PRODUCTATTRIBUTERELATION_1 is Failing

In Oracle Data Integrator 12c, the map SDE_SBL_822_ADAPTOR_SDE_PRODUCTATTRIBUTERELATION_1 is failing with the invalid identifier issue.

Workaround

  1. Update the IKM to add the unique keys (UKs) to the WHERE clause. For example, in the IKM BIAPPS SQL Siebel ETL table update, add a new Boolean option OBI_UK_CONSTRAINT with the default value FALSE.
  2. Update the "insert data" task by adding an "if" block that checks for the UK, nested inside another "if" block with the condition odiRef.getOption("OBI_UK_CONSTRAINT").equals("1"), such as:
    <% if(odiRef.getOption("OBI_UK_CONSTRAINT").equals("1") &&(!odiRef.getColList("","X","","","UK").equals(""))) 
    After this change, the KM checks for the UKs defined in the map and executes accordingly.
  3. Change the OBI_UK_CONSTRAINT option value to True in the physical tab for the IKM BIAPPS SQL Siebel ETL table.

ETL Fails Due to ActivityResourcesFact and ActivityFact Failure

This issue applies to the Fusion adaptor.

ETL fails due to SDE_FUSION_ActivityFact and SDE_FUSION_ActivityResourcesFact failure.

Workaround

  1. Log into the Oracle Business Intelligence Administration Tool.
  2. In the Physical Layer, navigate to the Import Metadata option for CRManalytics.
  3. Select Activity, Activity Assignee, and ActivityContact view objects and click Import Selected.
  4. After importing, click Finish.

Regional Settings Aren't Saved After Logging Out and In

This issue applies to the regional settings in Configuration Manager and the Functional Setup Manager (FSM) for non-English languages.

The changes in Language (go to task Preferences, click Regional or Preferences, and select Language) aren't saved properly after you log out of the application and log in again.

Workaround

After logging in, if the preference settings (for example, UI language, number format, date format, time format, time-zone) aren't set to the required values, you can go to task Preferences, click Regional or Preferences, select Language, and change the preferences as required. Save the preferences and then continue to use other tasks without logging out.

Market Basket Analysis Facts and Dimensions Not Supported

Due to performance issues, Market Basket analysis related facts and dimensions aren't supported.

List of logical facts not supported:

  • Fact - CRM - Next Order Same Account

  • Fact - CRM - Next Order Same Contact

  • Fact - CRM - Order Item Same

  • Fact - CRM - Product Affinity

List of logical dimensions not supported:

  • Dim - Market Basket Product

  • Dim - Next Product Purchased

Workaround

There isn't a workaround for this issue.

Human Resources E-Business Suite Payroll Patches for Payroll Analytics

This issue applies to Human Resources Analytics Payroll Subject Area for E-Business Suite.

If you're implementing Human Resources Analytics Payroll Subject Area for E-Business Suite, then it's mandatory to follow the E-Business Suite Payroll patching policy mentioned in My Oracle Support Note 295406.1 Mandatory Family Pack / Rollup Patch (RUP) levels for Oracle Payroll.

Load Plan Schedules Missing After Regeneration

This issue applies to the load plan regeneration.

After the load plan is regenerated in Configuration Manager, the load plan schedule is missing.

Workaround

Before regenerating a load plan, remove all existing schedules and recreate them after the regeneration is complete.

Non-supported Attributes and Metrics

This issue applies to notes on Inventory and Costing subject areas of PeopleSoft Data sources.

PeopleSoft:

The PeopleSoft data source doesn't support the following BMM Dimension Attributes:

"Core"."Dim - Lot"."Best By Date"

"Core"."Dim - Lot"."color"

"Core"."Dim - Lot"."Length"

"Core"."Dim - Lot"."Length UOM"

"Core"."Dim - Lot"."Thickness"

"Core"."Dim - Lot"."Thickness UOM"

"Core"."Dim - Lot"."Volume"

"Core"."Dim - Lot"."colume UOM"

"Core"."Dim - Lot"."Width"

"Core"."Dim - Lot"."Width UOM"

"Core"."Movement Type"."Transaction Action Code"

"Core"."Movement Type"."Transaction Action Description"

The PeopleSoft data source doesn't support the following Metrics:

BMM:  "Core"."Fact - Supply Chain - Inventory Transaction"."Material Quantity"

BMM:  "Core"."Fact - Supply Chain - Inventory Transaction"."Material Amount"

BMM:  "Core"."Fact - Supply Chain - Inventory Transaction"."Total Material Amount Last 365 Days"

Presentation: "Inventory - Transactions".."Fact - Inventory Transactions"."Material Quantity"

Presentation: "Costing - Item Cost".."Fact - Costing - Item Cost"."Profit In Inventory"

Presentation: "Costing - Item Cost".."Fact - Costing - Item Cost"."Percentage of Profit In Inventory"

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”In Transit Quantity”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Receiving Quantity”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Allocated Quantity”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Un Allocated Quantity”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.” # of Products Requiring Reorder”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Value Only Amount”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Value Transfer Amount”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Returned Quantity”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Replenishment Quantity”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.” Inspection Consignment Quantity”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Reorder Point”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.” Inspection Amount”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Returned Amount”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Replenishment Amount”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”Inspection Consignment Amount”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.”WIP Amount”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.” Returned Quantity Year Ago”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.” Chg Returned Quantity Year Ago”

BMM: “Core”.”Fact – Supply Chain - Inventory Balance”.” Chg Returned Quantity % Year Ago”

Presentation: “Facts - Inventory Balance”.”In Transit Quantity”

Presentation: “Facts - Inventory Balance”.”Receiving Quantity”

Presentation: “Facts - Inventory Balance”.”Allocated Quantity”

Presentation: “Facts - Inventory Balance”.”Un Allocated Quantity”

Presentation: “Facts - Inventory Balance”.”Replenishment Quantity”

Presentation: “Facts - Inventory Balance”.”Inspection Consignment Quantity”

Presentation: “Facts - Inventory Balance”.”# of Products Requiring Reorder”

The PeopleSoft data source doesn't support the following Presentation Subject Areas:

Inventory - Bill of Materials

Inventory - Cycle Count

The PeopleSoft data source doesn't support the following Logical Tables and corresponding Dimensions:

Dim - Balancing Segment

Dim - Product Transaction Type (Stays deprecated)

Dim - Cost Valuation Unit

Notes on behavior of specific Attributes and Metrics on the PeopleSoft data source:
  • BMM: "Core"."Fact - Supply Chain - Inventory Aging"."Days In Inventory"

    The “Days In Inventory” metric is defined as the number of days between the Date Received and the date of data extraction. The Date Received, for PeopleSoft Inventory, is retrieved from the latest Receipt Header record which matches a particular Physical Inventory record. If no Receipt Header can be found, then the Staged Date of the Physical Inventory is used. If there's no associated Receipt Header and no Staged Date for a Physical Inventory record, then Days In Inventory aren't populated for that record.

  • BMM: "Core"."Fact - Supply Chain - Inventory Aging"."Days Left Expiry" and “Days Since Expired” (affects of Date Received)

    The “Days Left Expiry” and “Days Since Expired” metrics are defined as the days between the Expiration Date for a Physical Inventory item and the date of data extraction. For lot controlled items, the Expiration Date is derived from the lot control record. For other items, the Expiration Date is derived by adding the item’s Shelf Life to the Date Received.

  • Plant Location

    In the PeopleSoft adaptor, Plant Locations are retrieved for every Business Unit which is marked with a Business Unit Type of “PLNT”. Only “PLNT” type Business Units are represented in the Plant Location Dimension.

Data Extraction Error 'XXXDFFBIVO of type View Definition is not found'

This issue applies to Fusion Applications on-premises data sources.

If during load plan execution you see the error message XXXDFFBIVO of type View Definition is not found (or similar), then you must check your Flexfield deployment. You might have an incomplete flexfield setup.

Workaround

Make sure that your flexfield setup is complete.

BI Metadata Repository Issue in Fusion Applications for Security View

Fusion Applications supports 18 levels of Territory Hierarchy, but the opaque security view "Dim_Security_PrimaryTerritoryHierarchy_TerritoryResourceQuota" in the repository uses only 12 levels.

This results in a data security issue, where users aren't able to view all records.

Workaround

  1. In Oracle BI Administration Tool, edit the repository.
  2. Click Physical, click Oracle Data Warehouse and select Dim_Security_PrimaryTerritoryHierarchy_TerritoryResourceQuota.
  3. Change the Default Initialization String to the following SQL statement:

    Note:

    In the SQL statement, for each of the Territory Hierarchy levels 0 to 17, repeat the 'OR' clause with a CASE statement 'case VALUEOF(n, NQ_SESSION.TERR_HIER_LEVEL_LIST)...'.
    select T.TERR_WID as TERR_WID,
    'Y' as PRI_TERR_HIER_SEC_FLG
    from W_TERR_DH T
    where T.CURRENT_FLG = 'Y'
    and
    (
    (
    case VALUEOF(0, NQ_SESSION.TERR_HIER_LEVEL_LIST)
      when '0' then T.BASE_TERR_ID
      when '1' then T.LVL1_TERR_ID
      when '2' then T.LVL2_TERR_ID
      when '3' then T.LVL3_TERR_ID
      when '4' then T.LVL4_TERR_ID
      when '5' then T.LVL5_TERR_ID
      when '6' then T.LVL6_TERR_ID
      when '7' then T.LVL7_TERR_ID
      when '8' then T.LVL8_TERR_ID
      when '9' then T.LVL9_TERR_ID
      when '10' then T.LVL10_TERR_ID
      when '11' then T.LVL11_TERR_ID
      when '12' then T.LVL12_TERR_ID
      when '13' then T.LVL13_TERR_ID
      when '14' then T.LVL14_TERR_ID
      when '15' then T.LVL15_TERR_ID
      when '16' then T.LVL16_TERR_ID
      when '17' then T.TOP_LVL_TERR_ID
    else 'NO_VALUE'
    end
    ) in (VALUELISTOF(NQ_SESSION.SUPER_TERR_LIST))
    or
    (
    case VALUEOF(1, NQ_SESSION.TERR_HIER_LEVEL_LIST)
      when '0' then T.BASE_TERR_ID
      when '1' then T.LVL1_TERR_ID
      when '2' then T.LVL2_TERR_ID
      when '3' then T.LVL3_TERR_ID
      when '4' then T.LVL4_TERR_ID
      when '5' then T.LVL5_TERR_ID
      when '6' then T.LVL6_TERR_ID
      when '7' then T.LVL7_TERR_ID
      when '8' then T.LVL8_TERR_ID
      when '9' then T.LVL9_TERR_ID
      when '10' then T.LVL10_TERR_ID
      when '11' then T.LVL11_TERR_ID
      when '12' then T.LVL12_TERR_ID
      when '13' then T.LVL13_TERR_ID
      when '14' then T.LVL14_TERR_ID
      when '15' then T.LVL15_TERR_ID
      when '16' then T.LVL16_TERR_ID
      when '17' then T.TOP_LVL_TERR_ID
    else 'NO_VALUE'
    end
    ) in (VALUELISTOF(NQ_SESSION.SUPER_TERR_LIST)) 
    ...
    ...and so on.
    ...
    or
    (
    case VALUEOF(17, NQ_SESSION.TERR_HIER_LEVEL_LIST)
      when '0' then T.BASE_TERR_ID
      when '1' then T.LVL1_TERR_ID
      when '2' then T.LVL2_TERR_ID
      when '3' then T.LVL3_TERR_ID
      when '4' then T.LVL4_TERR_ID
      when '5' then T.LVL5_TERR_ID
      when '6' then T.LVL6_TERR_ID
      when '7' then T.LVL7_TERR_ID
      when '8' then T.LVL8_TERR_ID
      when '9' then T.LVL9_TERR_ID
      when '10' then T.LVL10_TERR_ID
      when '11' then T.LVL11_TERR_ID
      when '12' then T.LVL12_TERR_ID
      when '13' then T.LVL13_TERR_ID
      when '14' then T.LVL14_TERR_ID
      when '15' then T.LVL15_TERR_ID
      when '16' then T.LVL16_TERR_ID
      when '17' then T.TOP_LVL_TERR_ID
    else 'NO_VALUE'
    end
    ) in (VALUELISTOF(NQ_SESSION.SUPER_TERR_LIST))
    )
    

SDE_FUSION_PARTYORGANIZATIONDIMENSION Populates Data with Gaps

This issue applies to SDE_FUSION_PartyOrganizationDimension.

Workaround

  1. Log in to the ODI Studio and navigate to the Designer tab.
  2. Open the BI Apps Project, click Mappings, and select SDE_FUSION_x_Adaptor (where x can be 9, 10, or 11).
  3. Expand SDE_FUSION_PartyOrganizationDimension.
  4. Open MAIN interface (blue) SDE_FUSION_PartyOrganizationDimension.W_PARTY_ORG_DS.
  5. Go to column PRIMARY_PHONE_AREA_CODE and replace the existing expression with SQ_W_FSN_PARTY_ORG_PS.PRIMARY_PHONE_AREA_CODE.

Award_WID Not Populating in Existing Project Facts

This issue applies to Award_WID not getting populated in the existing Project Budget and Cost facts.

Workaround

Use the CA_DETAIL_PROJ table to get the link between contract number, projects, and activities for the following existing facts:

  • W_PROJ_COST_LINE_FS

  • W_PROJ_INVOICE_LINE_FS

  • W_PROJ_REVENUE_LINE_FS

  • W_PROJ_BUDGET_FS

  • W_PROJ_FORECAST_FS

  • W_PROJ_COMMITMENT_FS

SDE_Universal_PayrollBalanceDimension_Translate Fails

This issue applies to the Universal adaptor.

SDE_Universal_PayrollBalanceDimension_Translate fails as the balance name column has more than 80 characters.

Workaround

Ensure that the balance names in the source data are 80 characters or less.
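You can locate the offending rows up front by checking name lengths in the staging data. A sketch; W_PAYROLL_BALANCE_DS and BALANCE_NAME are illustrative names, so substitute the actual staging table and column used by your Universal adaptor source files:

```sql
-- List balance names longer than the 80-character limit.
SELECT BALANCE_NAME, LENGTH(BALANCE_NAME) AS NAME_LEN
FROM   W_PAYROLL_BALANCE_DS
WHERE  LENGTH(BALANCE_NAME) > 80;
```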

ETL Fails with the “ORA-01792: maximum number of columns in a table or view is 1000” error message

This issue applies to Business Intelligence Applications consumer version 11.1.8.1 and later using the Oracle 19c Database.

BI Applications 11.1.1.10.3 ETL fails with the following error: "ORA-01792: maximum number of columns in a table or view is 1000."

The Oracle 19c database supports a maximum of 1000 columns when creating a table.

Workaround

Turn off the fix control for bug 17376322 by setting "_fix_control"='17376322:OFF' at the session or system level:

alter session set "_fix_control"='17376322:OFF';

or

alter system set "_fix_control"='17376322:OFF';

The Configuration Manager Lists Organizations That Aren't Master Orgs

This issue applies to the MASTER ORG parameter.

The MASTER ORG parameter allows you to select from a list of Master Orgs retrieved from your OLTP. This list can also display certain non-Master orgs due to a code issue that's been fixed in the latest version.

Workaround

Ensure that you're aware of the Master Orgs in the system and select only those from the list.

The Taleo Cloud Source System Hits Daily Limit For Bulk Data Export requests

This issue applies to the Taleo Cloud source system. The Taleo Cloud source system enforces daily and transaction limits for bulk data export requests.

During the load plan execution, if a replication job hits the Taleo Cloud source system Daily Limit, then the job is paused for 24 hours but the status of the ODI LP step is marked as Completed.

Subsequent load plan replication steps for the Taleo Cloud source system complete without actually running the replication, and the rest of the ETL steps (SDE, SIL) also complete for the Taleo Cloud source system without running the ETL. The next day, the load plan triggers the rest of the replication and completes the ETL.

The paused job is triggered automatically and runs the replication only for the entity which first hits the daily limit. For replicating the rest of the entities, a new load plan instance needs to be triggered after 24 hours of the previous load plan.

Workaround

If the Taleo zone limits are increased, update the Daily Limit on the Replicator to match the zone value, or set it to 0:

DAILY_EXPORT_LIMIT=0 (0 indicates unlimited)

Features Unavailable in On-Premises Deployments

This issue applies to features that aren't available in the Oracle BI Applications Release 11.1.1.10.3 on-premises deployments.

The Manage Extensible Attribute feature is unavailable in the on-premises deployments.

Workaround

This issue has no workaround currently.

Groups Definition Missing in WebLogic Console

This issue applies to Oracle Business Application Procurement and Spend Analytics CaaS Release: 11.1.1.10.3 CaaS.

The following group definitions are missing in the WebLogic Console:
  • OBIA_AU_LINE_MANAGER_EXPENSE_ANALYSIS_DUTY

  • Purchasing Extended Buyer

Workaround

  1. Sign in to the WebLogic Console.
  2. Go to Security Realms in the left side panel.
  3. Click on myrealm.
  4. Click Users and Groups and then click Groups.
  5. Create a new group named OBIA_AU_LINE_MANAGER_EXPENSE_ANALYSIS_DUTY.
  6. Create a new group named Purchasing Extended Buyer.

Cloud Replicator Doesn't Support Non SSL-Enabled RightNow Sites

This issue applies to the current version of the Cloud Replicator.

The current version of the Cloud Replicator doesn't support extracting data from the non SSL-enabled RightNow sites.

Workaround

For Cloud Replicator to work with the RightNow site as a valid cloud source connection, the site should be SSL-enabled.

SILOS_SIL_HRPERSONLEGISLATION_FLEX_DIMENSION Fails

This issue applies to SILOS_SIL_HRPERSONLEGISLATION_FLEX_DIMENSION.

SIL_HRPersonLegislation_Flex_Dimension.W_HR_PERLEG_ADDL_D (Integration) fails on the target ORACLE connection BIAPPS_DW.

Workaround

  1. In the ODI Studio, click Models, click Oracle BI Applications, and select Dimensions.
  2. Open the table W_HRPERLEG_ADDL_D.
  3. Expand the columns and double-click PERLEG_ATTR3_CHAR.
  4. In the Description tab, set the slowly changing behavior to Overwrite on Change.
  5. Regenerate the scenario for SILOS_SIL_HRPERSONLEGISLATION_FLEX_DIMENSION.

PeopleSoft Tables to be Completely Extracted

This issue applies to the PeopleSoft Cloud Source adaptor.

The following error message comes up after the completion of an ETL process run:
Caused By: java.sql.SQLException: ORA-20000: Error creating Index/Constraint :W_PSFT_INT_ORG_DH_PS_P1 => ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found

Workaround

The following tables need to be completely extracted every time after an ETL process has been run:
  • PSTREENODE

  • PSTREELEAF
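Before re-running the load, you can also identify the duplicate keys behind the ORA-01452 error. A sketch; the grouping columns below are illustrative guesses, so check the actual column list of the W_PSFT_INT_ORG_DH_PS_P1 unique index in your warehouse and group by those columns instead:

```sql
-- Find key values that occur more than once in the flattened hierarchy table.
SELECT INTEGRATION_ID, DATASOURCE_NUM_ID, COUNT(*) AS DUP_COUNT
FROM   W_PSFT_INT_ORG_DH_PS
GROUP BY INTEGRATION_ID, DATASOURCE_NUM_ID
HAVING COUNT(*) > 1;
```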

Incorrect Inventory Organization Prompts on the Logistics Dashboard for Supply Chain Reporting

This issue applies to Supply Chain reporting.

If you don't use Fusion Applications, you might notice incorrect values for Inventory Organization prompts on the Logistics dashboard for Supply Chain reporting. You can edit the SQL statements for the prompts to correct these values.

Workaround

  1. In BI Analytics, under Catalog, navigate to Shared Folders, Supply Chain Management, Analytic Library, Embedded Content, Logistics, and select Prompts.
  2. Edit the Hit or Miss Accuracy prompt.
  3. Edit the Inventory Organization prompt.
  4. Under Option, update the SQL statement to generate the list of values to be SELECT "Inventory Organization"."Inventory Org Name", "Fact - Inventory Cycle Count"."Counted Quantity" FROM "Inventory - Cycle Count"
  5. Click OK and save the changes.
  6. Edit the Inventory Value dashboard page prompts.
  7. Edit the Inventory Organization prompt.
  8. Update the SQL statement to be: SELECT "Inventory Organization"."Inventory Org Name", "Fact - Costing - Inventory Valuation"."Accounted Onhand Amount" FROM "Costing - Inventory Valuation".
  9. Click OK and save the changes.

Cycle Count Subject Area Isn't Supported for the PeopleSoft Adaptor

This issue applies to PeopleSoft users.

The Cycle Count subject area isn't supported for the PeopleSoft adaptor. Therefore, Cycle Count reports aren't populated, including those on the following dashboard pages:
  • Fusion Logistics - Exact Matches Rate

  • Fusion Logistics - Hit or Miss Accuracy Inventory

  • Cycle Count Inventory - Cycle Count Details

Multi-Select Extensible Attribute List-of-Values (LOV) Columns Aren't Supported

This issue applies to BI Applications.

When you add to a report an extensible attribute column that is a list of values (LOV) and that has multiple values selected, you notice that rows are missing.

Workaround

This issue has no workaround.

Email Address Missing While Extracting Data from GlobalPerson

This issue applies to the missing privileges for the provisioned extractor user in the Fusion Applications release 11 POD environments.

While extracting data from the GlobalPersonForETLPVO Fusion view object (VO) using the FUSION_APPS_OBIA_BIEE_APPID user, some of the persons' email addresses are missing. This issue is due to missing privileges for this user in the Fusion Applications release 11 POD environments. This user is the provisioned extractor user that the Oracle BI Applications ETL process uses to extract data from Fusion Applications. To extract secured data such as PII information (for example, a person's home email, home phone, and national identifiers), you must ensure that this user is granted the data security policies that allow it to read the underlying tables.

Workaround

  1. Log in to the Fusion Applications as a user that has the IT Security Manager role.
  2. Navigate to Setup and Maintenance, and search for and launch the Manage Duties task.
    This opens the Oracle Entitlements Server tab, commonly known as the Authorization Policy Manager (APM).
  3. Use the search box to search for the FUSION_APPS_OBIA_BIEE_APPID user.

    Note:

    There's also a role called FUSION_APPS_OBIA_BIEE_APPID. Don't search for and create policies for the role. You must ensure to search for and create policies for the FUSION_APPS_OBIA_BIEE_APPID user.
    The search result displays the Business Intelligence Applications Extract Transform and Load Application Identity user name.
  4. Select this user name in the search result area and click on the Open (folder) icon.
    This opens a new tab displaying details such as the role assignments of this user.
  5. Click Find Global Policies to open the Search Policies tab.
    The Data Security Policies for: region within the Search Policies tab lists all of the data security policies currently granted to this user.
  6. Click New to open the New Policy tab with four sub-tabs and enter the following information in each sub-tab:
    • General Information tab:
      • Name: Custom Grant for Person Email to OBIA

      • DB Resource: Person Email

      • Module: Global Human Resources

      • Start Date: 1/1/01

    • Roles tab: There's no information to enter on this tab. It must, however, display a single entry for the FUSION_APPS_OBIA_BIEE_APPID user name.

    • Rule tab:
      • Row Set: All Values

    • Action tab: Select Report Person Email and move it to the selected actions list.

    • Click Save on the New Policy tab. This saves the policy and returns you to the Search Policies tab with a confirmation message. Click OK to confirm.

  7. Repeat Step 6 for the following DB Resource and Actions:
    • DB Resource: Person Phone; Action: Report Person Phone

    • DB Resource: Human Resources Address; Action: Report Person Address

  8. On the Search Policies tab, sort Data Security Policies by Action (descending), scroll, and verify that the policies for Report Person Email, Report Person Phone, and Report Person Address have been created successfully.
  9. If the BI reports additionally need to extract other secured details, such as national identifiers or visa information, then create data security policies for the following:
    • DB Resource: Person National Identifier; Action: Report Person National Identifier

    • DB Resource: Person Visa Action; Action: Report Person Visa

    • DB Resource: Person Passport; Action: Report Person Passport

    • DB Resource: Person Driver License; Action: Report Driver License

    Note:

    If you're uncertain whether the Oracle Business Intelligence ETL process needs to extract data from one of these objects, then don't create the data security policies for that object. Create them only when you're certain the data is required.

ODI Metadata Patch Import Fails for Non-Oracle Source Databases

This issue applies only to non-Oracle source databases.

When you import the ODI metadata patch for non-Oracle source databases, the import fails with an error similar to: errorMessage=DFW-99998 executionContextId=0000Nf3dM8W2nJjyl0lnWJ1Wxf61000001.

Workaround

  1. Export the BIAPPS_DW data server file to your local directory.
    1. In ODI Studio, click Topology, select Physical Architecture, expand Technologies, click Oracle, and then right-click BIAPPS_DW data server.
    2. Select Export, enter the name for the file, and then click Finish.
  2. Validate the Global ID in the BIAPPS_DW data server XML file exported to your local directory.
    1. Open the exported BIAPPS_DW data server XML file with a text editor and verify the Global ID for BIAPPS_DW data server.
    2. If the value of the Global ID in XML file is "44134a38-7a4a-37bf-a1f5-8eae1bc07046", skip steps 3 and 4.
    3. If the value of the Global ID in the XML file isn't "44134a38-7a4a-37bf-a1f5-8eae1bc07046", edit the XML file, update the value of the Global ID for the BIAPPS_DW data server to "44134a38-7a4a-37bf-a1f5-8eae1bc07046", and then save the file.
  3. Delete the BIAPPS_DW data server.
    1. In ODI Studio, click Topology, select Physical Architecture, expand Technologies, click Oracle, and then right-click BIAPPS_DW data server.
    2. Click Delete.
  4. Import the BIAPPS_DW data server XML file.
    1. In ODI Studio, click Topology, select Physical Architecture, expand Technologies, click Oracle, and then right-click BIAPPS_DW data server.
    2. Select Import and click Import Data Server.
    3. Browse and select the BIAPPS_DW data server XML file that you edited in step 2.
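The Global ID check and fix in step 2 can also be scripted. The sketch below is a hedged illustration, assuming the exported data server XML carries the ID near a GlobalId marker (the sample fragment and the fix_global_id helper are hypothetical; adjust the pattern to your actual export format):

```python
import re

# Expected global ID for the BIAPPS_DW data server (from the workaround above).
EXPECTED_GID = "44134a38-7a4a-37bf-a1f5-8eae1bc07046"

# Generic UUID pattern, used to locate whatever ID the export currently holds.
UUID_RE = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}"
)

def fix_global_id(xml_text: str) -> str:
    """Return xml_text with the first UUID after a GlobalId marker replaced
    by EXPECTED_GID. Assumption: the export stores the ID near 'GlobalId'."""
    def repl(match):
        # Replace only the UUID inside the matched GlobalId span.
        return UUID_RE.sub(EXPECTED_GID, match.group(0), count=1)
    return re.sub(r"GlobalId.*?" + UUID_RE.pattern, repl, xml_text,
                  count=1, flags=re.DOTALL)

# Hypothetical fragment resembling an exported data server definition.
sample = ('<Field name="GlobalId" type="java.lang.String">'
          '<![CDATA[11111111-2222-3333-4444-555555555555]]></Field>')
fixed = fix_global_id(sample)
```

If the file already contains the expected ID, the substitution is a no-op, so you can run the helper unconditionally instead of comparing by hand.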

SDE_PSFT Maps Fail for Non-Oracle Source Databases

This data type conversion issue applies to PeopleSoft sources on a Microsoft SQL Server database.

The SDE_PSFT maps fail for non-Oracle source databases because of the incorrect use of a conversion function when the TO_CHAR_JULIAN UDF function exists in the database.

Workaround

  1. Delete the TO_CHAR_JULIAN user function.
    1. In ODI Studio, click Designer, expand Global Objects, click Global User Functions, and then expand Conversion Operation.
    2. Right-click the TO_CHAR_JULIAN user-defined function, click Delete, and then click Yes to remove it.
  2. Run the import in MODE=SCEREGEN, substituting the appropriate PLV code and source technology to regenerate the scenario.

    For example, for PSFT_9_2 on Microsoft SQL Server:

    /scratch/jdk/jdk_1.8.0_201/bin/java -client \
            -Dfile.encoding=UTF-8 \
            -classpath "/scratch/product/Oracle/Middleware/Oracle_Home/odi/sdk/lib/*:/scratch/product/Oracle/Middleware/Oracle_Home/oracle_common/jlib/*:/scratch/product/Oracle/Middleware/Oracle_Home/oracle_common/modules/oracle.idm/*:/scratch/product/Oracle/Middleware/Oracle_Home/oracle_common/modules/oracle.jps/*:/scratch/product/Oracle/Middleware/Oracle_Home/oracle_common/modules/oracle.jdbc/*:/scratch/product/Oracle/Middleware/Oracle_Home/oracle_common/modules/oracle.nlsrtl/*:/scratch/product/Oracle/Middleware/Oracle_Home/oracle_common/modules/oracle.ucp.jar:/scratch/product/Oracle/Middleware/Oracle_Home/bi/biapps/lib/*:/scratch/product/Oracle/Middleware/Oracle_Home/bi/biapps/biaodiutil/lib/bia-odi-util.jar" \
            -Dcommon.components.home=/scratch/product/Oracle/Middleware/Oracle_Home/oracle_common/ \
            oracle.apps.biaodiutil.Import MODE=SCEREGEN \
            FILEDIR=/scratch/product/Oracle/Middleware/Oracle_Home/bi/biapps/admin/provisioning/odi/ \
            PLV=PSFT_9_2 SRCTECH=MICROSOFT_SQL_SERVER \
            JPSCONFIGFILE=/scratch/product/Oracle/Middleware/Oracle_Home/odi/studio/bin/jps-config-jse.xml \
            LOGDIR=/scratch/product/Oracle/Middleware/Oracle_Home/bi/biapps/admin/provisioning/odi/logs/
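Because only PLV and SRCTECH change between adaptors, it can help to assemble the command programmatically and print it before running it. This is a hedged sketch: all paths are the example paths from the command above, and the variable names are illustrative only.

```python
from pathlib import Path

# Example paths from the command above; adjust for your environment.
ORACLE_HOME = Path("/scratch/product/Oracle/Middleware/Oracle_Home")
JAVA = "/scratch/jdk/jdk_1.8.0_201/bin/java"

# Classpath entries relative to ORACLE_HOME, as listed in the command above.
classpath = ":".join(str(ORACLE_HOME / p) for p in [
    "odi/sdk/lib/*",
    "oracle_common/jlib/*",
    "oracle_common/modules/oracle.idm/*",
    "oracle_common/modules/oracle.jps/*",
    "oracle_common/modules/oracle.jdbc/*",
    "oracle_common/modules/oracle.nlsrtl/*",
    "oracle_common/modules/oracle.ucp.jar",
    "bi/biapps/lib/*",
    "bi/biapps/biaodiutil/lib/bia-odi-util.jar",
])

# Substitute the PLV code and source technology for your adaptor here.
cmd = [
    JAVA, "-client", "-Dfile.encoding=UTF-8",
    "-classpath", classpath,
    f"-Dcommon.components.home={ORACLE_HOME}/oracle_common/",
    "oracle.apps.biaodiutil.Import", "MODE=SCEREGEN",
    f"FILEDIR={ORACLE_HOME}/bi/biapps/admin/provisioning/odi/",
    "PLV=PSFT_9_2", "SRCTECH=MICROSOFT_SQL_SERVER",
    f"JPSCONFIGFILE={ORACLE_HOME}/odi/studio/bin/jps-config-jse.xml",
    f"LOGDIR={ORACLE_HOME}/bi/biapps/admin/provisioning/odi/logs/",
]

# Dry run: print the command so you can verify paths before executing
# (pass cmd to subprocess.run to actually launch it).
print(" ".join(cmd))
```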

Test Connection of an Oracle WebLogic Data Source Fails

This issue applies only to an Oracle WebLogic data source connection to an Oracle Autonomous Data Warehouse database.

You might find that a test connection of an Oracle WebLogic data source to Oracle Autonomous Data Warehouse fails. This failure occurs when Oracle Autonomous Data Warehouse isn’t configured properly.

Workaround

To resolve this failure, apply Patch Set 2 to Oracle BI Applications 10.3 and configure Oracle Autonomous Data Warehouse properly.

Unable To Add Custom Lookup In Dataset For Two Reusable Mappings

In some of the mappings or reusable mappings, you don’t have a direct option to modify a map to add lookups.

Workaround

Convert the dataset to a flow-based mapping, and then edit the mapping or reusable mapping to add the new lookups.

  1. In Designer Navigator, open the mapping or reusable mapping.
  2. Right-click on the title (Default) and select Convert to Flow from the context menu.
  3. Click Yes.
  4. Edit the map and add the lookups.

Dashboard Unresponsive When Report Column Contains Multiple Currencies

If you've configured multiple currencies and set the preferred currency to Ledger Currency, then the Compensation dashboard becomes unresponsive and you get the following error:

Warning: Report column Total Annual Base Salary has different currencies and this isn't supported. Change the currency selection or place additional report filters to ensure that data for only one currency is returned.

Workaround

If you've set the preferred currency to Ledger Currency or Local Currency, then select a Ledger or a Business Unit in the dashboard prompt.

Issues and Workarounds for Oracle GoldenGate

These issues and workarounds are related to the use of Oracle GoldenGate.

Support GoldenGate Integration for GL Segment Related Extract (EBS)

This change is required only when you use GoldenGate to replicate EBS data.

Model Change

Set the flexfield value OBI SDS Load Tech Exclusion List to BIA_OGG_SDS for FND_FLEX_VALUE_CHILDREN_V in EBS 12.2. (This value is already set for the other EBS versions.)
  1. Open the Oracle E-Business Suite R12.2 model, click Application Object Library, and select FND_FLEX_VALUE_CHILDREN_V.

  2. Open the Flexfields tab and set OBI SDS Load Tech Exclusion List to BIA_OGG_SDS.

  3. Save the change.

Package Changes

Modify the following packages to support SDS:

  • SDE_ORA_Stage_BalancingSegmentDimensionHierarchy_Primary

  • SDE_ORA_Stage_GLSegmentDimensionHierarchy_Primary

  • SDE_ORA_Stage_NaturalAccountDimensionHierarchy_Primary

As an example, the following instructions use the Balancing Segment:

  1. Take a backup of the ODI repository.

  2. In the ODI client, open Mappings, click SDE_ORAXXX_Adaptor, and select SDE_ORA_Stage_BalancingSegmentDimensionHierarchy_Primary.

  3. Remove the link between SOFT_DELETE_PREPROCESS and SDE_ORA_Stage_BalancingSegmentDimensionHierarchy_Primary.W_BALANCING_SEGMENT_HIER_PS_PE.

  4. Drag the variables SDS_LOAD_TECHNO and IS_SDS_DEPLOYED to the package.

  5. Rename the SDS_LOAD_TECHNO step to "SDS_LOAD_TECHNO is BIA_OGG_SDS". Set Type to "Evaluate Variable", Operator to "=", and Value to "BIA_OGG_SDS".

  6. For the IS_SDS_DEPLOYED step, set Type to "Set Variable" and Value to "N".

  7. Connect SOFT_DELETE_PREPROCESS to SDS_LOAD_TECHNO is BIA_OGG_SDS.

  8. Connect SDS_LOAD_TECHNO is BIA_OGG_SDS to interface SDE_ORA_Stage_BalancingSegmentDimensionHierarchy_Primary.W_BALANCING_SEGMENT_HIER_PS_PE.

  9. Connect SDS_LOAD_TECHNO is BIA_OGG_SDS to IS_SDS_DEPLOYED.

  10. Connect IS_SDS_DEPLOYED to interface SDE_ORA_Stage_BalancingSegmentDimensionHierarchy_Primary.W_BALANCING_SEGMENT_HIER_PS_PE.

  11. Regenerate the scenario.
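The branching that these package steps create can be sketched as ordinary pseudocode. This is a hedged illustration of the intended control flow only, not ODI syntax; the function name and arguments are hypothetical:

```python
def run_balancing_segment_primary(sds_load_techno: str,
                                  is_sds_deployed: str) -> str:
    """Sketch of the package flow built in steps 3-10 above.

    After SOFT_DELETE_PREPROCESS, the 'SDS_LOAD_TECHNO is BIA_OGG_SDS'
    evaluate-variable step decides whether IS_SDS_DEPLOYED is forced to N
    before the W_BALANCING_SEGMENT_HIER_PS_PE interface runs.
    """
    # SOFT_DELETE_PREPROCESS runs first (not shown here).
    if sds_load_techno == "BIA_OGG_SDS":
        # Evaluate step succeeded: set IS_SDS_DEPLOYED to N (step 6/9),
        # then run the interface (step 10).
        is_sds_deployed = "N"
    # Otherwise the interface runs directly (step 8); either way the
    # W_BALANCING_SEGMENT_HIER_PS_PE interface executes.
    return is_sds_deployed
```

In other words, when GoldenGate SDS replication is active, the soft-delete primary-extract interface runs with IS_SDS_DEPLOYED forced to N; otherwise the variable keeps its refreshed value.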

Load Plan Change

Add variable SDS_LOAD_TECHNO and refresh it at the root level.

  1. Open the BIAPPS Load Plan, click Load Plan Dev Components, click SDE, and select the 3 SDE Dims BALSEG_DIM load plan component for EBS_XXX.

  2. Go to the Variables tab.

  3. Click Add. Search for SDS_LOAD_TECHNO and click OK.

  4. Go to the Steps tab.

  5. Enable the Overwrite and Refresh checkboxes.

  6. Repeat steps 3 to 5 for the following load plan components:
    • 3 SDE Dims GLSEG_DIM

    • 3 SDE Dims NAT_ACCT_DIM

    • 3 SDE Dims GLACCNT_DIM (The corresponding package change is already included in the product for this load plan).