Oracle® Communications Data Model Operations Guide
11g Release 1 (11.2)

Part Number E15883-01

3 Populating an Oracle Communications Data Model Warehouse

This chapter describes how to populate an Oracle Communications Data Model warehouse.

This chapter discusses the following topics:

  • Pre-population Tasks

  • Overview: The ETL for an Oracle Communications Data Model Warehouse

  • Estimating Space for Oracle Communications Data Model

  • Performing an Initial Load of the Warehouse

  • Performing Incremental Data Loading of the Warehouse

  • Managing Errors During Intra-ETL Execution

Note:

The instructions in this chapter assume that after doing the fit-gap analysis described in Performing Fit-Gap Analysis, you have not identified or made any changes to the Oracle Communications Data Model logical or physical model. If you have made changes, you need to modify the ETL accordingly.

Pre-population Tasks

Before you populate your Oracle Communications Data Model warehouse for the first time, take the following steps:

  1. Perform all of the post-installation tasks described in Oracle Communications Data Model Installation Guide.

  2. Familiarize yourself with the ETL you use to populate an Oracle Communications Data Model warehouse as described in "Overview: The ETL for an Oracle Communications Data Model Warehouse".

  3. Determine whether you will be doing an initial load of your warehouse as described in "Performing an Initial Load of the Warehouse" or an incremental load as described in "Performing Incremental Data Loading of the Warehouse". Familiarize yourself with the process of performing whichever type of load you plan to perform.

  4. Familiarize yourself with how you can manage any errors that might occur during that initial load as described in "Managing Errors During Intra-ETL Execution".

  5. Estimate the size of your Oracle Communications Data Model warehouse as described in "Estimating Space for Oracle Communications Data Model".

If you have already performed an initial load of your Oracle Communications Data Model warehouse and are now performing an incremental load of the warehouse, familiarize yourself with the incremental load issues discussed in "Performing Incremental Data Loading of the Warehouse".

Overview: The ETL for an Oracle Communications Data Model Warehouse

In the Oracle Communications Data Model relational model, reference and lookup tables store master, reference, and dimensional data; while base, derived, and aggregate tables store transaction and fact data at different granularities. Base tables store the transaction data at the lowest level of granularity, while derived and aggregate tables store consolidated and summary transaction data.

As with any data warehouse, you use Extract, Transform, and Load (ETL) operations to populate an Oracle Communications Data Model data warehouse. You perform ETL operations using different types of ETL in the following order:

  1. Source-ETL processes that extract data from the source On-Line Transaction Processing (OLTP) system, transform that data, and load the reference, lookup, and base tables of the Oracle Communications Data Model warehouse. Source-ETL is not provided with Oracle Communications Data Model. For information about creating source-ETL, see "Source-ETL in Oracle Communications Data Model".

  2. Intra-ETL processes that populate the remaining structures in an Oracle Communications Data Model warehouse. Intra-ETL does not access the OLTP data at all. All of the data that intra-ETL extracts and transforms is located within the Oracle Communications Data Model warehouse. Intra-ETL is provided with the Oracle Communications Data Model. For information about the intra-ETL, see "Intra-ETL in Oracle Communications Data Model".

Source-ETL in Oracle Communications Data Model

Source-ETL processes extract data from the source On-Line Transaction Processing (OLTP) system, transform that data, and load the transformed data into the lookup, reference, and base tables of an Oracle Communications Data Model warehouse.

Note:

You must design and write source-ETL processes yourself. Source-ETL is not provided with Oracle Communications Data Model.

Keep the following points in mind when designing and writing source-ETL:

Intra-ETL in Oracle Communications Data Model

One component delivered with the Oracle Communications Data Model is the INTRA_ETL_FLW process flow which is designed using the Oracle Warehouse Builder Workflow component.

The INTRA_ETL_FLW process flow uses the data in the Oracle Communications Data Model base, reference, and lookup tables to populate all of the other structures of the Oracle Communications Data Model.

The INTRA_ETL_FLW process flow consists of the following subprocesses; it tracks the dependencies among the individual sub-process flows and executes them in the proper order:

  1. DRVD_FLW

    This sub-process flow contains all the Oracle Warehouse Builder mappings for populating derived tables based on the content of the base, reference, and lookup tables. This process flow allows mappings to be executed concurrently.

  2. AGGR_FLW

    This sub-process flow contains PL/SQL code that uses a partition change tracking strategy to refresh all of the aggregate tables, which are materialized views, in the Oracle Communications Data Model.

  3. MINING_FLW

    This sub-process flow triggers the refresh of the data mining models.

The INTRA_ETL_FLW process flow also includes the OLAP_MAP mapping. OLAP_MAP invokes the analytic workspace build function of the PKG_OCDM_OLAP_ETL_AW_LOAD package to load data from Oracle Communications Data Model aggregate materialized views into the Oracle Communications Data Model analytical workspace and calculates the forecast data. The PKG_OCDM_OLAP_ETL_AW_LOAD package reads OLAP ETL parameters from the DWC_OLAP_ETL_PARAMETER control table in the ocdm_sys schema to determine the specifics of how to load the data and perform the forecasts.

Note:

The shell script ocdm_execute_wf.sh delivered with Oracle Communications Data Model performs the same function as the Oracle Warehouse Builder Workflow INTRA_ETL_FLW but does not require Oracle Warehouse Builder Workflow. For more information about working with this script, see "Manually Executing the Intra-ETL".

See:

Oracle Communications Data Model Reference for detailed information about the Oracle Communications Data Model intra-ETL.

Estimating Space for Oracle Communications Data Model

The size of an Oracle Communications Data Model warehouse depends largely on the volume of network events (that is, CDRs). You can calculate the approximate overall size of your Oracle Communications Data Model warehouse by using the following formula.

OCDM_Size = 150 * NAcct * NCall * 365 * NYear * 4

where:

  • NAcct is the number of accounts.

  • NCall is the average number of calls per account per day.

  • NYear is the number of years of data that you want to keep.

  • 150 is the assumed average size, in bytes, of a single call event record, and 4 is an overall multiplier that allows for derived, aggregate, and index storage.

For example, assume an operator with only a wireless business has 350,000 accounts, every customer makes an average of 2 calls per day, and the operator wants to keep data for 5 years. The size can then be estimated as 766 GB.
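The arithmetic in this worked example can be checked directly with a simple query (a sketch; the figures are the ones from the example above, and GB here means decimal gigabytes):

```sql
-- Check the worked example: 350,000 accounts, 2 calls per account
-- per day, 5 years of history, 150 bytes per record, multiplier 4.
SELECT 150 * 350000 * 2 * 365 * 5 * 4 / 1000000000 AS estimated_gb
  FROM DUAL;
-- Returns 766.5, that is, roughly 766 GB.
```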

Note:

This is a very preliminary estimate. To plan your data warehouse more accurately, you should experiment by loading sample data into the Oracle Communications Data Model schema and measuring the actual storage size.

Performing an Initial Load of the Warehouse

Performing an initial load of an Oracle Communications Data Model warehouse is a multistep process:

  1. Load the reference, lookup, and base tables of the Oracle Communications Data Model warehouse by executing the source-ETL that you have written using the guidelines given in "Source-ETL in Oracle Communications Data Model".

  2. Load the remaining structures in the Oracle Communications Data Model, by taking the following steps:

    1. Update the parameters in the DWC_ETL_PARAMETER control table in the ocdm_sys schema so that this information (that is, the beginning and end date of the ETL period) can be used when loading the derived and aggregate tables and views. See "Updating the Parameters of the DWC_ETL_PARAMETER Table for an Initial Load" for more information.

    2. Update the Oracle Communications Data Model OLAP ETL parameters in DWC_OLAP_ETL_PARAMETER control table in the ocdm_sys schema to specify the build method and other build characteristics so that this information can be used when loading the OLAP cube data. See "Updating DWC_OLAP_ETL_PARAMETER Table for an Initial Load" for more information.

    3. Execute the Intra-ETL in one of the ways described in "Executing the Intra-ETL for Oracle Communications Data Model".

Updating the Parameters of the DWC_ETL_PARAMETER Table for an Initial Load

Before you execute the INTRA_ETL_FLW process to load the derived and aggregate relational tables and views, you must update the parameters of the DWC_ETL_PARAMETER control table in the ocdm_sys schema.

For an initial load of an Oracle Communications Data Model warehouse, specify the values shown in the following table.

Column Value
PROCESS_NAME 'OCDM-INTRA-ETL'
FROM_DATE_ETL The beginning date of the ETL period.
TO_DATE_ETL The ending date of the ETL period.
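For example, a statement along the following lines, run in the ocdm_sys schema, sets these values (a sketch only; the dates are placeholders for the boundaries of your own data window):

```sql
-- Hypothetical example: define the ETL period for the initial load.
-- Replace the dates with the actual start and end of your ETL period.
UPDATE dwc_etl_parameter
   SET from_date_etl = DATE '2010-01-01',
       to_date_etl   = DATE '2010-12-31'
 WHERE process_name = 'OCDM-INTRA-ETL';
COMMIT;
```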

See:

Oracle Communications Data Model Reference for more information on the DWC_ETL_PARAMETER control table.

Updating DWC_OLAP_ETL_PARAMETER Table for an Initial Load

Before you can load OLAP cube data, you need to update the Oracle Communications Data Model OLAP ETL parameters in DWC_OLAP_ETL_PARAMETER control table in the ocdm_sys schema.

For an initial load of the analytic workspace, specify values following the guidelines in Table 3-1.

Table 3-1 Values of Oracle Communications Data Model OLAP ETL Parameters in the DWC_OLAP_ETL_PARAMETER Table for Initial Load

Column Name Value

PROCESS_NAME

'OCDM-INTRA-ETL'

BUILD_METHOD

C, which specifies a complete refresh that clears all dimension values before loading.

CUBENAME

One of the following values that specifies the cubes you want to build:

  • ALL specifies that you want to build all of the cubes in the Oracle Communications Data Model analytic workspace.

  • cubename[[|cubename]...] specifies one or more cubes that you want to build.

MAXJOBQUEUES

A decimal value that specifies the number of parallel processes to allocate to this job. (Default value is 4.) The value that you specify varies depending on the setting of the JOB_QUEUE_PROCESSES database initialization parameter.

CALC_FCST

One of the following values depending on whether or not to calculate forecast cubes:

  • Y specifies calculate forecast cubes.

  • N specifies do not calculate forecast cubes.

NO_FCST_YRS

If the value of CALC_FCST is Y, specify a decimal value that specifies how many years of forecast data you want to calculate; otherwise, specify NULL.

FCST_MTHD

If the value of CALC_FCST is Y, then specify AUTO; otherwise, specify NULL.

FCST_ST_YR

If the value of CALC_FCST is Y, then specify a value in the form yyyy that identifies the "start business year" of the historical period; otherwise, specify NULL.

FCST_END_YR

If the value of CALC_FCST is Y, then specify a value in the form yyyy that identifies the "end business year" of the historical period; otherwise, specify NULL.

OTHER1

Specify NULL.

OTHER2

Specify NULL.
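As an illustration, the following statement sets the parameters for a complete initial build of all cubes without forecasting, following the guidelines in Table 3-1 (this is a sketch, not part of the product; run it in the ocdm_sys schema):

```sql
-- Hypothetical example: initial OLAP load with a complete refresh
-- ('C'), all cubes, 4 parallel processes, and no forecast cubes.
UPDATE dwc_olap_etl_parameter
   SET build_method = 'C',
       cubename     = 'ALL',
       maxjobqueues = 4,
       calc_fcst    = 'N',
       no_fcst_yrs  = NULL,
       fcst_mthd    = NULL,
       fcst_st_yr   = NULL,
       fcst_end_yr  = NULL
 WHERE process_name = 'OCDM-INTRA-ETL';
COMMIT;
```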


See:

"Updating DWC_OLAP_ETL_PARAMETER Table for an Incremental Load" for information on the values to specify for an incremental load of OLAP cube data, and Oracle Communications Data Model Reference for more detailed information on all possible values for the DWC_OLAP_ETL_PARAMETER control table in the ocdm_sys schema.

Executing the Intra-ETL for Oracle Communications Data Model

You can execute the Intra-ETL packages provided with Oracle Communications Data Model in the following ways:

  • By executing the INTRA_ETL_FLW process flow within Oracle Warehouse Builder, as described in "Executing the INTRA_ETL_FLW Workflow Within Oracle Warehouse Builder".

  • By manually executing the ocdm_execute_wf.sh shell script, as described in "Manually Executing the Intra-ETL".

In either case, you can monitor the execution of the Intra-ETL and recover and troubleshoot errors, as described in "Monitoring the Execution of the Intra-ETL Process", "Recovering an Intra ETL Process", and "Troubleshooting Intra-ETL Performance".

Executing the INTRA_ETL_FLW Workflow Within Oracle Warehouse Builder

To execute the INTRA_ETL_FLW process flow from Oracle Warehouse Builder, take the following steps:

See:

Oracle Warehouse Builder User's Guide for information about Oracle Warehouse Builder.


  1. Confirm that Oracle Warehouse Builder Workflow has been installed as described in Oracle Communications Data Model Installation Guide.

  2. Within Oracle Warehouse Builder, go to the Control Center Manager.

  3. Select OLAP_PFLW, then select AGR_PFLW, then select the main process flow INTRA_ETL_FLW.

  4. Right-click INTRA_ETL_FLW and select Set Action. If this is the first deployment, set the action to Create; for subsequent deployments, set the action to Replace. Deploy the process flow.

    After the deployment finishes successfully, INTRA_ETL_FLW is ready to execute.

Manually Executing the Intra-ETL

Oracle Communications Data Model provides a script that you can use to execute the intra-ETL without using Oracle Warehouse Builder Workflow. This shell script is named ocdm_execute_wf.sh. It performs the same function as the Oracle Warehouse Builder Workflow INTRA_ETL_FLW. It can be invoked by another process, such as source-ETL, or on a predefined schedule (for example, as an Oracle Scheduler job).

  1. The ocdm_execute_wf.sh script prompts you to enter the environment variables described in the following table.

    Variable Description
    TSNAME The tnsname of the target database.
    SYSTEM password The password of the SYSTEM account of the target database.
    ocdm_sys password The password of the ocdm_sys account.
    ORACLE HOME The Oracle Database home (that is, the full path without a trailing "/").

  2. The script reads the values from the DWC_ETL_PARAMETER and DWC_OLAP_ETL_PARAMETER control tables in the ocdm_sys schema, then executes the mappings in the correct order.

  3. The result of each table load is tracked in the DWC_INTRA_ETL_PROCESS and DWC_INTRA_ETL_ACTIVITY control tables, as described in "Monitoring the Execution of the Intra-ETL Process".

Performing Incremental Data Loading of the Warehouse

"Performing an Initial Load of the Warehouse" describes how to perform an initial load of an Oracle Communications Data Model data warehouse. After this initial load, you need to load new data into your Oracle Communications Data Model data warehouse regularly so that it can serve its purpose of facilitating business analysis.

To load new data into your Oracle Communications Data Model warehouse, you extract the data from one or more operational systems and copy that data into the warehouse. The challenge in data warehouse environments is to integrate, rearrange and consolidate large volumes of data over many systems, thereby providing a new unified information base for business intelligence.

The successive loads and transformations must be scheduled and processed in a specific order that is determined by your business needs. Depending on the success or failure of the operation or parts of it, the result must be tracked and subsequent, alternative processes might be started.

You can do a full incremental load of the relational tables and views, OLAP cubes, and data mining models all at once, or you can refresh the data sequentially:

  1. Incremental Loading of Relational Tables and Views

  2. Incremental Loading of OLAP Cubes

  3. Refreshing Data Mining Models

In all cases, you can manage errors during the execution of the Intra-ETL as described in "Managing Errors During Intra-ETL Execution".

Incremental Loading of Relational Tables and Views

Refreshing the relational tables and views in an Oracle Communications Data Model is a multi-step process:

  1. Refresh the reference, lookup, and base tables of the Oracle Communications Data Model warehouse with OLTP data by executing the source-ETL that you have written using the guidelines given in "Source-ETL in Oracle Communications Data Model".

  2. Update the parameters of the DWC_ETL_PARAMETER control table in the ocdm_sys schema. For an incremental load of an Oracle Communications Data Model warehouse, specify the values shown in the following table (that is, the beginning and end date of the ETL period).

    Column Value
    PROCESS_NAME 'OCDM-INTRA-ETL'
    FROM_DATE_ETL The beginning date of the ETL period.
    TO_DATE_ETL The ending date of the ETL period.

    See:

    Oracle Communications Data Model Reference for more information on the DWC_ETL_PARAMETER control table.
  3. Refresh the derived tables and aggregate tables, which are materialized views in Oracle Communications Data Model, by executing the DRVD_FLW and AGGR_FLW subprocesses of the INTRA_ETL_FLW process flow. See "Executing the Intra-ETL for Oracle Communications Data Model" for more information.
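For instance, before each scheduled incremental run you might advance the ETL window in the ocdm_sys schema with a statement such as the following (a sketch; the dates are placeholders for your next load period):

```sql
-- Hypothetical example: advance the ETL period to the next month
-- before an incremental run.
UPDATE dwc_etl_parameter
   SET from_date_etl = DATE '2011-02-01',
       to_date_etl   = DATE '2011-02-28'
 WHERE process_name = 'OCDM-INTRA-ETL';
COMMIT;
```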

See also:

Oracle Warehouse Builder User's Guide

Incremental Loading of OLAP Cubes

On a scheduled basis you need to update the OLAP cube data with the relational data that has been added to the Oracle Communications Data Model data warehouse since the initial load of the OLAP cubes.

Take these steps to perform an incremental load of the analytic workspace that is part of the Oracle Communications Data Model warehouse:

  1. Update the aggregate tables, which are materialized views in Oracle Communications Data Model. See "Incremental Loading of Relational Tables and Views" for more information.

  2. Update the OCDM OLAP ETL parameters in DWC_OLAP_ETL_PARAMETER control table in the ocdm_sys schema so that this information (that is, the build method and other build characteristics) can be used when loading the OLAP cube data. See "Updating DWC_OLAP_ETL_PARAMETER Table for an Incremental Load" for more information.

  3. Execute the Intra-ETL to load the cube data in one of the ways described in "Executing the Intra-ETL for Oracle Communications Data Model".

  4. If necessary, recover from errors that happen during the execution of OLAP_MAP by taking the steps outlined in "Recovering from Errors During an Incremental Load of OLAP Cubes".

Updating DWC_OLAP_ETL_PARAMETER Table for an Incremental Load

Before you can load OLAP cube data, you need to specify appropriate Oracle Communications Data Model OLAP ETL parameter values in the DWC_OLAP_ETL_PARAMETER control table in the ocdm_sys schema.

For an incremental load of the analytic workspace, specify values following the guidelines in Table 3-2, "Values for Oracle Communications Data Model OLAP ETL Parameters in the DWC_OLAP_ETL_PARAMETER Table for an Incremental Load". You can either run this mapping in Oracle Warehouse Builder or use OMBPlus (the Oracle Warehouse Builder scripting language) to execute the $ORACLE_HOME/ocdm/pdm/relational/intra_etl/owb_exec/todcolapetl.tcl script from the command line.

Table 3-2 Values for Oracle Communications Data Model OLAP ETL Parameters in the DWC_OLAP_ETL_PARAMETER Table for an Incremental Load

Column Name Value

PROCESS_NAME

'OCDM-INTRA-ETL'

BUILD_METHOD

?, which specifies a fast refresh if possible and otherwise a complete refresh. (This is the default.)

CUBENAME

One of the following values that specifies the cubes you want to build:

  • ALL specifies that you want to build all of the cubes in the Oracle Communications Data Model analytic workspace.

  • cubename[[|cubename]...] specifies one or more cubes that you want to build.

MAXJOBQUEUES

A decimal value that specifies the number of parallel processes to allocate to this job. (Default value is 4.) The value that you specify varies depending on the setting of the JOB_QUEUE_PROCESSES database initialization parameter.

CALC_FCST

One of the following values depending on whether or not to calculate forecast cubes:

  • Y specifies calculate forecast cubes.

  • N specifies do not calculate forecast cubes.

NO_FCST_YRS

If the value of CALC_FCST is Y, a decimal value that specifies how many years of forecast data you want to calculate; otherwise, specify NULL.

FCST_MTHD

If the value of CALC_FCST is Y, then specify AUTO; otherwise, specify NULL.

FCST_ST_YR

If the value of CALC_FCST is Y, then specify a value in the form yyyy that identifies the "start business year" of the historical period; otherwise, specify NULL.

FCST_END_YR

If the value of CALC_FCST is Y, then specify a value in the form yyyy that identifies the "end business year" of the historical period; otherwise, specify NULL.

OTHER1

Specify NULL.

OTHER2

Specify NULL.


See:

Table 3-1, "Values of Oracle Communications Data Model OLAP ETL Parameters in the DWC_OLAP_ETL_PARAMETER Table for Initial Load" for information on the values to specify for an initial load of OLAP cube data, and Oracle Communications Data Model Reference for more detailed information on all possible values for the DWC_OLAP_ETL_PARAMETER control table in the ocdm_sys schema.

Executing the Oracle Communications Data Model OLAP ETL Mapping

You can execute the Oracle Communications Data Model ETL to update the OLAP cubes in the following ways:

  • Refresh all of the data in the warehouse by executing the Oracle Warehouse Builder Workflow INTRA_ETL_FLW in one of the ways that are described in "Executing the Intra-ETL for Oracle Communications Data Model".

    The OLAP Cubes are populated through OLAP_MAP which is a part of Oracle Communications Data Model intra-ETL main workflow INTRA_ETL_FLW.

  • Refresh only the OLAP cube data by executing the OLAP_MAP Oracle Warehouse Builder mapping in the Oracle Warehouse Builder control center.

    Note:

    You must refresh the corresponding materialized views of the OLAP cubes you are refreshing before you execute OLAP_MAP. (For the mapping between OLAP cubes and materialized views, see Oracle Communications Data Model Reference.)

Recovering from Errors During an Incremental Load of OLAP Cubes

To recover from any errors during the execution of OLAP_MAP during an incremental load of OLAP cubes, take the following steps:

  1. Change the value of the BUILD_METHOD column of the ocdm_sys.DWC_OLAP_ETL_PARAMETER table to "C".

  2. In Oracle Warehouse Builder, rerun the OLAP_MAP map.
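Step 1 can be performed with a statement along these lines (a sketch; run it as a user with access to the ocdm_sys schema):

```sql
-- Switch the OLAP build method to a complete refresh ('C') before
-- rerunning the OLAP_MAP mapping.
UPDATE ocdm_sys.dwc_olap_etl_parameter
   SET build_method = 'C'
 WHERE process_name = 'OCDM-INTRA-ETL';
COMMIT;
```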

Refreshing Data Mining Models

The MINING_FLW sub-process flow of the INTRA_ETL_FLW process flow triggers the data mining model refresh. After the initial load of the warehouse, it is recommended that you refresh the data mining models monthly. Refreshing the data models is integrated into the MINING_FLW sub-process flow. You can also manually refresh the data models.

To manually refresh all mining models, call the following procedure:

ocdm_mining.PKG_OCDM_MINING.REFRESH_MODEL( MONTH_CODE,P_PROCESS_NO)
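For example, an anonymous block such as the following runs the full refresh. The argument values '201001' and 1 are placeholders; consult Oracle Communications Data Model Reference for the expected MONTH_CODE and process-number formats.

```sql
-- Hypothetical invocation: refresh all mining models.
-- '201001' and 1 are placeholder values for MONTH_CODE and
-- P_PROCESS_NO; substitute values appropriate to your warehouse.
BEGIN
  ocdm_mining.PKG_OCDM_MINING.REFRESH_MODEL('201001', 1);
END;
/
```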

To manually re-create only one mining model, you can call the corresponding procedure. For example, to re-create the churn SVM model, call the following procedure:

ocdm_mining.create_churn_svm_model( MONTH_CODE );

Managing Errors During Intra-ETL Execution

This topic discusses how you can identify and manage errors during Intra-ETL execution. It contains the following topics:

Monitoring the Execution of the Intra-ETL Process

Two control tables in the ocdm_sys schema, DWC_INTRA_ETL_PROCESS and DWC_INTRA_ETL_ACTIVITY, monitor the execution of the Intra-ETL process. These tables are documented in Oracle Communications Data Model Reference.

Each normal run (as opposed to an error-recovery run) of a separate Intra-ETL execution performs the following steps:

  1. Inserts a record into the DWC_INTRA_ETL_PROCESS table with a monotonically increasing system generated unique process key, SYSDATE as process start time, RUNNING as the process status, and an input date range in the FROM_DATE_ETL and TO_DATE_ETL columns.

  2. Invokes each of the individual Intra-ETL programs in the appropriate order of dependency. Before the invocation of each program, the procedure inserts a record into the Intra-ETL activity detail table, DWC_INTRA_ETL_ACTIVITY, with a system-generated unique activity key in ACTIVITY_KEY, the process key value corresponding to the Intra-ETL process in PROCESS_KEY, an individual program name as the ACTIVITY_NAME, a suitable activity description in ACTIVITY_DESC, SYSDATE as the activity start time, and RUNNING as the activity status.

  3. Updates the corresponding record in the DWC_INTRA_ETL_ACTIVITY table with the activity end time and activity status after the completion of each individual ETL program (either successfully or with errors). For successful completion of the activity, the procedure updates the status to 'COMPLETED-SUCCESS'. When an error occurs, the procedure updates the activity status to 'COMPLETED-ERROR', and also records the corresponding error detail in the ERROR_DTL column.

  4. Updates the record corresponding to the process in the DWC_INTRA_ETL_PROCESS table with the process end time and status, after the completion of all individual intra-ETL programs. When all the individual programs succeed, the procedure updates the status to 'COMPLETED-SUCCESS'; otherwise, it updates the status to 'COMPLETED-ERROR'.

You can monitor the execution state of the Intra-ETL, including current process progress, time taken by individual programs, or the complete process, by viewing the contents of the DWC_INTRA_ETL_PROCESS and DWC_INTRA_ETL_ACTIVITY tables corresponding to the maximum process key. Monitoring can be done both during and after the execution of the Intra-ETL procedure.
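For example, a query along the following lines shows the per-program status of the most recent run (a sketch; see Oracle Communications Data Model Reference for the full table definitions):

```sql
-- Show the activity records of the most recent intra-ETL run,
-- identified by the maximum process key.
SELECT activity_name, activity_status, error_dtl
  FROM ocdm_sys.dwc_intra_etl_activity
 WHERE process_key = (SELECT MAX(process_key)
                        FROM ocdm_sys.dwc_intra_etl_process);
```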

Recovering an Intra ETL Process

To recover an intra-ETL process:

  1. Identify the errors by looking at the corresponding error details that are tracked against the individual programs in the DWC_INTRA_ETL_ACTIVITY table.

  2. Correct the causes of the errors.

  3. Re-invoke the Intra-ETL process.

INTRA_ETL_FLW determines whether it is a normal run or a recovery run by referring to the DWC_INTRA_ETL_ACTIVITY table. During a recovery run, INTRA_ETL_FLW executes only the necessary programs. For example, if a derived population error occurred as part of the previous run, the recovery run executes only the individual derived population programs that produced errors in the previous run. After their successful completion, the run executes the aggregate population programs and materialized view refreshes in the appropriate order.

In this way, the Intra-ETL error recovery is almost transparent and requires little involvement from the data warehouse or ETL administrator. The administrator only needs to correct the causes of the errors and re-invoke the Intra-ETL process; the process then identifies and re-executes only the programs that generated errors.

Troubleshooting Intra-ETL Performance

To troubleshoot Intra-ETL performance, use the techniques described in the following topics:

  • Checking the Execution Plan

  • Monitoring PARALLEL DML Executions

Checking the Execution Plan

Use SQL Developer or another tool to view the package body of the code generated by Oracle Warehouse Builder.

For example, take the following steps to examine DWD_ACCT_PYMT_DAY__MAP.

  1. Copy out the main query statement from code viewer.

    To do this, copy from "CURSOR "AGGREGATOR_c" IS …" to the end of the query, which is right above the next "CURSOR "AGGREGATOR_c$1" IS" declaration.


  2. In a SQL Developer worksheet, issue the following command to enable parallel DML:

    ALTER SESSION ENABLE PARALLEL DML;
    
  3. Paste the main query statement into another SQL Developer worksheet and view the execution plan by pressing F6.

    Carefully examine the execution plan to make sure that the mapping runs according to a valid plan.


Monitoring PARALLEL DML Executions

Check that you are running the mapping in parallel mode by executing the following SQL statement, which counts the executed parallel DML and query statements:

column name format a50
column value format 999,999
SELECT NAME, VALUE 
FROM GV$SYSSTAT
WHERE UPPER (NAME) LIKE '%PARALLEL OPERATIONS%'
  OR UPPER (NAME) LIKE '%PARALLELIZED%'
  OR UPPER (NAME) LIKE '%PX%'
;

If you are running the mapping in parallel mode, you should see the "DML statements parallelized" statistic increase by 1 every time the mapping is invoked. If you do not see this increase, then the mapping was not invoked as parallel DML.

If you instead see "queries parallelized" increase by 1 (one), then typically it means that the SELECT statement inside the INSERT was parallelized, but the INSERT itself was not.


Troubleshooting Data Mining Model Creation

After the data mining models are created, check the error log in the ocdm_sys.dwc_intra_etl_activity table. For example, execute the following code.

set line 160
col ACTIVITY_NAME format a30
col ACTIVITY_STATUS format a20
col error_dtl format a80
select activity_name, activity_status,  error_dtl from dwc_intra_etl_activity;

If all models are created successfully, the ACTIVITY_STATUS is "COMPLETED-SUCCESS" for every row. If the ACTIVITY_STATUS is "COMPLETED-ERROR" for a certain step, check the ERROR_DTL column and fix the problem accordingly.

Some common error messages from ERROR_DTL and ACTIVITY_NAME are listed below.

ORA-20991: Message not available ... [Language=ZHS]CURRENT_MONTH_KEY

This error may happen when there is not enough data in the DWR_BSNS_MO table. For example, if the calendar data is populated with 2004-2009 data, a mining model refresh for year 2010 may result in this error.

To fix this error, execute the Oracle Communications Data Model calendar utility script again to populate the calendar with sufficient data. For example:

Execute Calendar_Population.run('2005-01-01',10);

Tip:

See Oracle Communications Data Model Reference for information on the calendar population utility script.

Message not available ... [Language=ZHS]

'ZHS' is a language code. The language name that it relates to can appear as a different name depending on the database environment. This error happens when ocdm_sys.DWC_MESSAGE.LANGUAGE does not contain messages for the current language.

Check the values in the DWC_MESSAGE table and, if required, update them to the language code specified by the Oracle session variable USERENV('lang').
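For example, the following diagnostic queries (a sketch) compare the language the session reports with the languages for which messages are actually loaded:

```sql
-- What language does the current session report?
SELECT SYS_CONTEXT('USERENV', 'LANG') AS session_lang FROM DUAL;

-- Which languages have messages loaded in the message table?
SELECT DISTINCT language FROM ocdm_sys.dwc_message;
```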

ORA-40113: insufficient number of distinct target values, for "create_churn_svm_model"

This error happens when the target column for the training model contains only one value or no value when it is expecting more than one value.

For example, for the churn svm model, the target column is:

ocdm_mining.DMV_CUST_CHRN_SRC_PRD.chrn_ind

To troubleshoot this error:

  1. Execute a SQL query to check if there are enough values in this column.

    Using the churn svm model as an example, issue the following statement.

    select chrn_ind, count(*) from DMV_CUST_CHRN_SRC_PRD  group by chrn_ind;
    

    The following is a result of the query.

    C   COUNT(*)
    - ----------
    1       1228
    0       4911
    
  2. Check the following tables to make sure customer churn indicators are set properly:

    • ocdm_sys.dwr_cust.chrn_dt should contain the value for the churned customers.

    • ocdm_sys.dwd_acct_sttstc should contain the correct value for each customer in the most recent six months.

  3. Execute the following statement to refresh the mining source materialized views in the OCDM_MINING schema:

    exec pkg_ocdm_mining.refresh_mining_source;
    

ORA-40112: insufficient number of valid data rows, for "create_ltv_glmr_model"

For this model, the target column is OCDM_MINING.DMV_CUST_LTV_PRDCT_SRC.TOT_PYMT_RVN.

To troubleshoot this error:

  1. Execute the following SQL statement:

    select count(TOT_PYMT_RVN) from DMV_CUST_LTV_PRDCT_SRC;
    
  2. Check that the value returned by this query is greater than 0 (zero) and similar to the number of customers. If the number is 0 or too small, check the Oracle Communications Data Model Intra-ETL execution status as described in "Monitoring the Execution of the Intra-ETL Process".

ORG-11130:no data found in the collection, for "create_sentiment_svm_model"

This error occurs when there is not enough data in the source table for text sentiment model training: ocdm_mining.dm_cust_cmmnt.

To ensure that some text is loaded for customer sentiment analysis:

  1. Issue the following SQL statement.

    Select OVRAL_RSLT_CD, count(CUST_COMMENT) from DWB_EVT_PRTY_INTRACN 
    group by OVRAL_RSLT_CD;
    
  2. Check the number of text comments from the customer interaction table (DWB_EVT_PRTY_INTRACN).

  3. If there is not enough data in the customer interaction table, check the ETL logic from the source system to the Oracle Communications Data Model.