35 Oracle Business Intelligence Cloud Connector

It is important to understand how to work with Oracle Business Intelligence (BI) Cloud Connector in Oracle Data Integrator.

This chapter includes the following sections:

  • Introduction

  • Installation and Configuration

  • Setting up the Topology

  • Creating and Reverse-Engineering an Oracle BI Cloud Connector Model

  • Designing a Mapping

35.1 Introduction

Oracle Business Intelligence (BI) Cloud Connector lets you extract Business Intelligence data from a Fusion Applications Cloud data source into an Oracle Storage Service instance or a UCM server.

Oracle Data Integrator (ODI) seamlessly integrates with Oracle BI Cloud Connector. Oracle Data Integrator features such as reverse-engineering and mappings are designed to work with Oracle BI Cloud Connector.

35.1.1 Concepts

The Oracle BI Cloud Connector technology concepts map to the Oracle Data Integrator concepts as follows: an Oracle BI Cloud Connector instance corresponds to a data server in Oracle Data Integrator. Within this instance, a schema maps to an Oracle Data Integrator physical schema.

The Oracle BI Cloud Connector data server contains a reference to the object storage used by Oracle BI Cloud Connector to stage the extracted files. The object storage used can be either Oracle Object Storage or Oracle Storage Cloud Service.

35.1.2 Knowledge Modules

Oracle Data Integrator provides the following Knowledge Modules (KMs) for handling Oracle BI Cloud Connector data.

Table 35-1 Oracle BI Cloud Connector Knowledge Modules

LKM BICC to ADW External Table: Loads data from Oracle BI Cloud Connector to Oracle ADW using the External Table method. You can use this LKM in combination with an Oracle or generic SQL IKM.

LKM BICC to ADW Copy: Loads data from Oracle BI Cloud Connector to Oracle ADW. You can use this LKM in combination with an Oracle or generic SQL IKM.

LKM BICC to ADW Copy Direct: Loads data from Oracle BI Cloud Connector to Oracle ADW. You can use this LKM as a standalone KM; no IKM is required.

35.2 Installation and Configuration

Make sure you have read the information in this section before you start working with the Oracle BI Cloud Connector technology:

  • System Requirements and Certifications

  • Technology Specific Requirements

  • Connectivity Requirements

35.2.1 System Requirements and Certifications

Before performing any installation, you should read the system requirements and certification documentation to ensure that your environment meets the minimum installation requirements for the products you are installing.

The list of supported platforms and versions is available on Oracle Technology Network (OTN):

http://www.oracle.com/technetwork/middleware/data-integrator/documentation/index.html

35.2.2 Technology Specific Requirements

The technology specific requirements for using Oracle BI Cloud Connector in Oracle Data Integrator are:

  • The Oracle BI Cloud Connector technology is used together with either the Oracle Object Storage or the Oracle Storage Cloud Service technology.

  • A Data Server and Physical Schema representing the Object Storage used by Oracle BI Cloud Connector to stage the extracted files are required. For more information on configuring the object storage, see Setting up the Topology.

35.2.3 Connectivity Requirements

To use Oracle BI Cloud Connector in Oracle Data Integrator, you need connection information for the Oracle Applications Cloud instance and for the storage used to stage the extracted files.

35.3 Setting up the Topology

Setting up the topology involves creating the following ODI topology objects:

  • An Oracle Storage Service Data Server and Physical Schema representing the Object Storage used by Oracle BI Cloud Connector to stage the extracted files. The object storage used can be either Oracle Object Storage or Oracle Storage Cloud Service.

  • An Oracle BI Cloud Connector Data Server and Physical Schema representing the Oracle BI Cloud Connector instance.

This section contains the following topics:

  • Creating Topology objects for Oracle Object Storage

  • Creating Topology objects for Oracle Storage Cloud Service

  • Creating Topology objects for Oracle BI Cloud Connector

35.3.1 Creating Topology objects for Oracle Object Storage

This section contains the following topics:

  • Creating an Oracle Object Storage Data Server

  • Creating an Oracle Object Storage Physical Schema

35.3.1.1 Creating an Oracle Object Storage Data Server
Create a data server for the Oracle Object Storage technology using the standard procedure, as described in the Creating an Oracle Object Storage Data Server section. The information needed to fill in these properties comes from the Oracle BI Cloud Connector Console. This section details only the fields required or specific for defining an Oracle Object Storage data server; illustrative example values are shown after the steps below:
  1. In the Definition tab:
    1. Name: Name of the data server representing the Oracle Object Storage instance used by Oracle BI Cloud Connector to stage the extracted data.
    2. Region: Oracle Object Storage region. A region is a localized geographic area composed of one or more availability domains, each of which contains one or more data centers. Most Oracle Cloud Infrastructure resources are either region-specific, such as a virtual cloud network, or availability domain-specific, such as a compute instance. The value specified in this field must be the same as the Host configured in the Oracle BI Cloud Connector Console.
    3. Tenant OCID: Tenant’s Oracle Cloud ID. Every Oracle Cloud Infrastructure resource has an Oracle-assigned unique ID called an Oracle Cloud Identifier (OCID). It is included as part of the resource's information in both the Console and API. The value specified in this field must be the same as the Tenancy OCID configured in the Oracle BI Cloud Connector Console.
    4. User OCID: Oracle Cloud ID of the user logging into Oracle Object Storage. The value specified in this field must be the same as the User OCID configured in the Oracle BI Cloud Connector Console.
    5. Tenant Name: Name of the tenant. The value specified in this field must be the same as the Namespace configured in the Oracle BI Cloud Connector Console.
  2. Click Test Connection to test the connection.
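For reference, the values entered in the Definition tab typically take the following form. All values below are hypothetical and are shown only to illustrate the expected formats; use the values from your own Oracle BI Cloud Connector Console.

    Name: BICC_OBJECT_STORAGE
    Region: us-ashburn-1
    Tenant OCID: ocid1.tenancy.oc1..<unique_ID>
    User OCID: ocid1.user.oc1..<unique_ID>
    Tenant Name: <object_storage_namespace>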
35.3.1.2 Creating an Oracle Object Storage Physical Schema
Create an Oracle Object Storage physical schema using the standard procedure, as described in the Creating an Oracle Object Storage Physical Schema section. This section details only the fields required or specific for defining an Oracle Object Storage physical schema:
  1. In the Definition tab:
    1. Name: Name of the physical schema created.
    2. Bucket Name: Specifies the Oracle Object Storage bucket used for upload, download, and delete operations. Select the required bucket from the Bucket Name drop-down list. The value selected in this field must be the same as the Bucket Name specified in the Oracle BI Cloud Connector Console.
    3. Directory (Work Schema): This is the temporary folder on the local system used for getting files from the Oracle Object Storage bucket during reverse engineering. If the directory does not exist, it will be created. Specify the required location in the local system.
  2. Check the Default box if you want this schema to be the default one for this data server (The first physical schema is always the default one).

35.3.2 Creating Topology objects for Oracle Storage Cloud Service

This section contains the following topics:

  • Creating an Oracle Storage Cloud Service Data Server

  • Creating an Oracle Storage Cloud Service Physical Schema

35.3.2.1 Creating an Oracle Storage Cloud Service Data Server
Create a data server for the Oracle Storage Cloud Service technology using the standard procedure, as described in the Creating an Oracle Storage Cloud Service Data Server section. The information needed to fill in these properties comes from the Oracle BI Cloud Connector Console. This section details only the fields required or specific for defining an Oracle Storage Cloud Service data server:
  1. In the Definition tab:
    1. Name: Name of the data server representing the Oracle Storage Cloud Service instance used by Oracle BI Cloud Connector to stage the extracted data.
    2. Service URL: Oracle Storage Cloud Service URL. The value specified in this field should be a combination of the Protocol and Host names configured in the Oracle BI Cloud Connector Console. For example, protocol://hostname.
    3. Service Name: It denotes the name of the service for the created service URL. You need to specify the first part of the Service Name configured in the Oracle BI Cloud Connector Console as the service name in this field. For example, if the Service Name configured in the Oracle BI Cloud Connector Console is 'servicename-identitydomain', then the value specified in this field should be 'servicename'.
    4. User Name: Name of the user logging into the Oracle Storage Cloud Service. The value specified in this field must be the same as the User Name configured in the Oracle BI Cloud Connector Console.
    5. Password: Password of the logged in user. The value specified in this field must be the same as the Password configured in the Oracle BI Cloud Connector Console.
    6. Identity Domain: It denotes the domain specific to the created storage instance. You need to specify the second part of the Service Name configured in the Oracle BI Cloud Connector Console as the Identity Domain in this field. For example, if the Service Name configured in the Oracle BI Cloud Connector Console is 'servicename-identitydomain', then the value specified in this field should be 'identitydomain'.
  2. Click Test Connection to test the connection.
35.3.2.2 Creating an Oracle Storage Cloud Service Physical Schema
Create an Oracle Storage Cloud Service physical schema using the standard procedure, as described in Creating an Oracle Storage Cloud Service Physical Schema section. This section details only the fields required or specific for defining an Oracle Storage Cloud Service physical schema:
  1. In the Definition tab:
    1. Name: Name of the physical schema representing the container that stores the Oracle BI Cloud Connector files.
    2. Container Name: Specifies the container with which the created physical schema is associated. Select the required container from the Container Name drop-down list. The value selected in this field must be the same as the Container Name specified in the Oracle BI Cloud Connector Console.
    3. Directory (Work Schema): This is the temporary folder on the local system used for getting files from Oracle Storage Cloud Service. If the directory does not exist, it is created. Specify the required location in the local system.
  2. Check the Default box if you want this schema to be the default one for this data server (The first physical schema is always the default one).

35.3.3 Creating Topology objects for Oracle BI Cloud Connector

This section contains the following topics:

  • Creating an Oracle BI Cloud Connector Data Server

  • Creating an Oracle BI Cloud Connector Physical Schema

35.3.3.1 Creating an Oracle BI Cloud Connector Data Server
Create a data server for the Oracle BI Cloud Connector technology using the standard procedure, as described in Creating a Data Server of Administering Oracle Data Integrator. This section details only the fields required or specific for defining an Oracle BI Cloud Connector data server:
  1. In the Definition tab:
    1. Under Data Server, enter the name of the data server representing the Oracle BI Cloud Connector instance and the Oracle BI Cloud Connector web service URL (for example, https://sam37764.fa.dc1.c9dev2.example.com).
    2. Under Connection, enter a user name and password for connecting to the Oracle BI Cloud Connector instance.
    3. Under Storage, select either Oracle Storage Cloud Service or Oracle Object Storage as per requirement.
    4. Depending on the value selected in the Storage block, enter the corresponding storage configuration details.
      • If the value selected in the Storage block is Oracle Storage Cloud Service, enter the following fields:

        • Oracle Storage Cloud Service DataServer: Name of the Oracle Storage Cloud Service data server used by Oracle BI Cloud Connector.

        • Physical Schema: Name of the physical schema corresponding to the Oracle Storage Cloud Service data server.

        • External Storage Name: Name of the external storage as it appears in the Oracle BI Cloud Connector Console.

      • If the value selected in the Storage block is Oracle Object Storage, enter the following fields:

        • Oracle Object Storage DataServer: Name of the Oracle Object Storage data server used by Oracle BI Cloud Connector.

        • Physical Schema: Name of the physical schema corresponding to the Oracle Object Storage data server.

        • External Storage Name: Name of the external storage as it appears in the Oracle BI Cloud Connector Console.

  2. Click Test Connection to test the connection.

    Note:

    You can only test the Oracle BI Cloud Connector data server connection on a local agent. It is not possible to test the data server connection using a Standalone or JEE agent.

35.3.3.2 Creating an Oracle BI Cloud Connector Physical Schema
Create a physical schema for the Oracle BI Cloud Connector data server using the standard procedure, as described in Creating a Physical Schema in Administering Oracle Data Integrator. This section details only the fields required or specific for defining an Oracle BI Cloud Connector physical schema:
  1. In the Definition tab:
    1. Name: Name of the physical schema corresponding to the Oracle BI Cloud Connector data server.
    2. Schema: There are thousands of View Objects (VOs) in Oracle BI Cloud Connector. These VOs are separated into three logical schemas: CrmAnalytics, FscmTopModel and HcmTopModelAnalyticsGlobal. You can select an appropriate schema as per requirement.
  2. Check the Default box if you want this schema to be the default one for this data server (The first physical schema is always the default one).
Create a logical schema for this physical schema using the standard procedure, as described in Creating a Logical Schema in Administering Oracle Data Integrator and associate it in a given context.

35.4 Creating and Reverse-Engineering an Oracle BI Cloud Connector Model

This section contains the following topics:

  • Creating an Oracle BI Cloud Connector Model

  • Reverse-engineering an Oracle BI Cloud Connector Model

35.4.1 Creating an Oracle BI Cloud Connector Model

Create an Oracle BI Cloud Connector model using the standard procedure, as described in Creating a Model of Developing Integration Projects with Oracle Data Integrator.

35.4.2 Reverse-engineering an Oracle BI Cloud Connector Model

Oracle BI Cloud Connector supports customized reverse-engineering, which uses the RKM Oracle BI Cloud Connector. To perform a customized reverse-engineering on an Oracle BI Cloud Connector model, use the usual procedure, as described in Reverse-engineering a Model of Developing Integration Projects with Oracle Data Integrator.

The list of offerings available for reverse-engineering is displayed in the Reverse Engineer tab. Choose the offerings whose data stores you want to bring into ODI.

The RKM Oracle BI Cloud Connector is used to reverse engineer the Oracle BI Cloud Connector data stores. This RKM has the following options:

  • EXTRACT_ATTRIBUTES: If this option is set to False, then only the Oracle BI Cloud Connector data store names will be reverse engineered. If set to True, both the data store names and the attributes of each data store will be reverse engineered.

  • NUMBER_OF_THREADS: The number of threads that will send REST requests simultaneously to the Oracle BI Cloud Connector data server to retrieve the data store metadata.

35.5 Designing a Mapping

You can use Oracle BI Cloud Connector as a source of a mapping.

The KM choice for a mapping determines the abilities and performance of this mapping. The recommendations in this section help in the selection of the KM for different situations concerning an Oracle BI Cloud Connector server.

35.5.1 Loading Data from Oracle BI Cloud Connector

Oracle BI Cloud Connector can be used as a source of a mapping. The LKM used to load data between Oracle BI Cloud Connector and another type of data server is selected in the Loading Knowledge Module tab of the mapping. The choice of LKM determines the abilities and performance of this mapping.

You can use the following knowledge modules for loading Oracle BI Cloud Connector files to a target database:

35.5.1.1 LKM BICC to ADW External Table

This KM loads data from Oracle BI Cloud Connector to Oracle ADW using the External Table method. You can use this LKM in combination with an Oracle or generic SQL IKM.

KM Options

This KM has the following options:

  • CREDENTIAL_NAME — The name of the credential to be stored. The default value is ODI.

  • CREATE_CREDENTIAL — If set to True, ODI creates a new credential; if set to False, ODI reuses the existing credential (see the sketch after this list).

  • GENERATE_FIELD_LIST — If this KM option is set to False, the field_list clause is skipped and the default settings of the ORACLE_LOADER access driver are applied.

  • SUBMIT_BICC_JOB — Oracle BI Cloud Connector has its own scheduler, which can be set up to extract data and store it as zipped CSV files in the Object Store. If you set SUBMIT_BICC_JOB to False, ODI simply picks up the already extracted data from the Object Store (either all data or data extracted since LAST_LOAD_DATE). If you prefer not to use the Oracle BI Cloud Connector scheduler, or you want to ensure that there is nothing more to extract since the last scheduled job, set this option to True: ODI then creates a job to extract the data, waits for it to finish, and fetches the data from the Object Store.

  • LAST_LOAD_DATE — Specifies the date of the previous load of extracted data.

  • BICC_JOB_TIMEOUT — Specifies the Oracle BI Cloud Connector job timeout in seconds.
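
The CREDENTIAL_NAME and CREATE_CREDENTIAL options refer to a database credential object in ADW. As background, such credentials are managed with the DBMS_CLOUD.CREATE_CREDENTIAL procedure. The following is only a minimal sketch with hypothetical values; the KM generates its own equivalent statement, which may differ:

    BEGIN
      -- Create a credential named ODI (the KM's default CREDENTIAL_NAME) that
      -- DBMS_CLOUD uses to read the staged extract files from the object store.
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'ODI',
        username        => 'cloud.user@example.com',  -- hypothetical cloud user
        password        => '<auth_token>'             -- auth token or password for that user
      );
    END;
    /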

Formatting

  • DELIMITED_FILE_FORMAT — Specifies the delimited file format. It can be CSV (default) or a common delimited format known to the ORACLE_LOADER access driver. You can use this KM option only if the source datastore File Format property is set to Delimited.

  • COMPRESSION — Specifies the compression method of the source file. An empty value means no compression; AUTO means the compression type is auto-detected.

  • DATE_FORMAT — Sets a specific date format.

  • TIMESTAMP_FORMAT — Sets a specific timestamp format.

  • REJECT_LIMIT — An error is raised after the specified number of rows have been rejected. The default value is zero.

  • CONVERSION_ERRORS — Specifies how conversion errors are processed: for a row that raises a conversion error, the affected columns are stored as null or the row is rejected.

  • TRIM_SPACES — If set to True, the leading and trailing spaces of the fields are trimmed.

  • IGNORE_BLANK_LINES — If set to True, blank lines are ignored without raising an error.

  • IGNORE_MISSING_COLUMNS — If there are more columns in the field_list than in the source files, the extra columns are stored as null.

  • TRUNCATE_COLUMNS — If the data in a file is too long for a field, this option truncates the value of the field rather than rejecting the row.

Advanced

  • ADD_FORMAT_PROPERTIES — This option allows adding custom format properties.

    Use the following syntax: '<prop1>' VALUE '<value1>', '<prop2>' VALUE '<value2>' ...

  • OVERWRITE_FIELD_LIST — This option lets you redefine the source file field definitions when ODI does not have enough information about your input data. The value entered here is used as the field_list parameter of the DBMS_CLOUD.CREATE_EXTERNAL_TABLE call (see the sketch after this list).

    For more information, see the DBMS_CLOUD package documentation.
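
For reference, the options above roughly correspond to arguments of the DBMS_CLOUD.CREATE_EXTERNAL_TABLE procedure used by the External Table method. The following is only a minimal, hypothetical sketch (table, column, bucket, and file names are invented, and the statement actually generated by the KM may differ in detail):

    BEGIN
      DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
        table_name      => 'C$_0BICC_EXTRACT',   -- hypothetical staging table
        credential_name => 'ODI',                -- CREDENTIAL_NAME option
        file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/file_fscm_extract-*.csv',
        column_list     => 'INVOICE_ID NUMBER, INVOICE_DATE DATE, AMOUNT NUMBER',
        field_list      => 'INVOICE_ID, INVOICE_DATE, AMOUNT',  -- replaced by OVERWRITE_FIELD_LIST if set
        format          => json_object(
                             'type'             value 'csv',    -- DELIMITED_FILE_FORMAT
                             'compression'      value 'auto',   -- COMPRESSION
                             'rejectlimit'      value '0',      -- REJECT_LIMIT
                             'trimspaces'       value 'lrtrim', -- TRIM_SPACES
                             'ignoreblanklines' value 'true'    -- IGNORE_BLANK_LINES
                           )
      );
    END;
    /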

Cleanup

  • CLEANUP_TEMPORARY_OBJECTS — Set this property to True if you want temporary objects to be automatically cleaned up.

  • CLEANUP_CREDENTIAL — Set this property to True if you want the credential object to be automatically cleaned up at the end of every execution. Cleanup happens only if the CREATE_CREDENTIAL option is also set to True.

35.5.1.2 LKM BICC to ADW Copy

This KM loads data from Oracle BI Cloud Connector to Oracle ADW. You can use it in combination with an Oracle or generic SQL IKM.
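
As background, copy-style loads into ADW are typically performed with the DBMS_CLOUD.COPY_DATA procedure, which reads staged files directly into an existing table. The following is only a minimal, hypothetical sketch (table, bucket, and file names are invented; the statement generated by the KM may differ):

    BEGIN
      DBMS_CLOUD.COPY_DATA(
        table_name      => 'W_BICC_STAGE',   -- hypothetical existing staging table
        credential_name => 'ODI',            -- CREDENTIAL_NAME option
        file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/file_fscm_extract-*.csv',
        format          => json_object('type' value 'csv', 'compression' value 'auto')
      );
    END;
    /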

KM Options

This KM has the following options:

  • CREDENTIAL_NAME — The name of the credential to be stored. The default value is ODI.

  • CREATE_CREDENTIAL — If set to True, ODI creates a new credential; if set to False, ODI reuses the existing credential.

  • GENERATE_FIELD_LIST — If this KM option is set to False, the field_list clause is skipped and the default settings of the ORACLE_LOADER access driver are applied.

  • SUBMIT_BICC_JOB — Oracle BI Cloud Connector has its own scheduler, which can be set up to extract data and store it as zipped CSV files in the Object Store. If you set SUBMIT_BICC_JOB to False, ODI simply picks up the already extracted data from the Object Store (either all data or data extracted since LAST_LOAD_DATE). If you prefer not to use the Oracle BI Cloud Connector scheduler, or you want to ensure that there is nothing more to extract since the last scheduled job, set this option to True: ODI then creates a job to extract the data, waits for it to finish, and fetches the data from the Object Store.

  • LAST_LOAD_DATE — Specifies the date of the previous load of extracted data.

Formatting

  • DELIMITED_FILE_FORMAT — Specifies the delimited file format. It can be CSV (default) or a common delimited format known to the ORACLE_LOADER access driver. You can use this KM option only if the source datastore File Format property is set to Delimited.

  • COMPRESSION — Specifies the compression method of the source file. An empty value means no compression; AUTO means the compression type is auto-detected.

  • DATE_FORMAT — Sets a specific date format.

  • TIMESTAMP_FORMAT — Sets a specific timestamp format.

  • REJECT_LIMIT — An error is raised after the specified number of rows have been rejected. The default value is zero.

  • CONVERSION_ERRORS — Specifies how conversion errors are processed: for a row that raises a conversion error, the affected columns are stored as null or the row is rejected.

  • TRIM_SPACES — If set to True, the leading and trailing spaces of the fields are trimmed.

  • IGNORE_BLANK_LINES — If set to True, blank lines are ignored without raising an error.

  • IGNORE_MISSING_COLUMNS — If there are more columns in the field_list than in the source files, the extra columns are stored as null.

  • TRUNCATE_COLUMNS — If the data in a file is too long for a field, this option truncates the value of the field rather than rejecting the row.

Advanced

  • ADD_FORMAT_PROPERTIES — This option allows adding custom format properties.

    Use the following syntax: '<prop1>' VALUE '<value1>', '<prop2>' VALUE '<value2>' ...

  • OVERWRITE_FIELD_LIST — This option lets you redefine the source file field definitions when ODI does not have enough information about your input data. The value entered here is used as the field_list parameter of the DBMS_CLOUD.CREATE_EXTERNAL_TABLE call.

    For more information, see the DBMS_CLOUD package documentation.

Cleanup

  • CLEANUP_TEMPORARY_OBJECTS — Set this property to True if you want temporary objects to be automatically cleaned up.

  • CLEANUP_CREDENTIAL — Set this property to True if you want the credential object to be automatically cleaned up at the end of every execution. Cleanup happens only if the CREATE_CREDENTIAL option is also set to True.

35.5.1.3 LKM BICC to ADW Copy Direct

This KM loads data from Oracle BI Cloud Connector to Oracle ADW. You can use this LKM as a standalone KM; no IKM is required.

KM Options

This KM has the following options:

  • CREDENTIAL_NAME — The name of the credential to be stored. The default value is ODI.

  • CREATE_CREDENTIAL — If set to True, ODI creates a new credential; if set to False, ODI reuses the existing credential.

  • GENERATE_FIELD_LIST — If this KM option is set to False, the field_list clause is skipped and the default settings of the ORACLE_LOADER access driver are applied.

  • SUBMIT_BICC_JOB — Oracle BI Cloud Connector has its own scheduler, which can be set up to extract data and store it as zipped CSV files in the Object Store. If you set SUBMIT_BICC_JOB to False, ODI simply picks up the already extracted data from the Object Store (either all data or data extracted since LAST_LOAD_DATE). If you prefer not to use the Oracle BI Cloud Connector scheduler, or you want to ensure that there is nothing more to extract since the last scheduled job, set this option to True: ODI then creates a job to extract the data, waits for it to finish, and fetches the data from the Object Store.

  • LAST_LOAD_DATE — Specifies the date of the previous load of extracted data.

Formatting

  • DELIMITED_FILE_FORMAT — Specifies the delimited file format. It can be CSV (default) or a common delimited format known to the ORACLE_LOADER access driver. You can use this KM option only if the source datastore File Format property is set to Delimited.

  • COMPRESSION — Specifies the compression method of the source file. An empty value means no compression; AUTO means the compression type is auto-detected.

  • DATE_FORMAT — Sets a specific date format.

  • TIMESTAMP_FORMAT — Sets a specific timestamp format.

  • REJECT_LIMIT — An error is raised after the specified number of rows have been rejected. The default value is zero.

  • CONVERSION_ERRORS — Specifies how conversion errors are processed: for a row that raises a conversion error, the affected columns are stored as null or the row is rejected.

  • TRIM_SPACES — If set to True, the leading and trailing spaces of the fields are trimmed.

  • IGNORE_BLANK_LINES — If set to True, blank lines are ignored without raising an error.

  • IGNORE_MISSING_COLUMNS — If there are more columns in the field_list than in the source files, the extra columns are stored as null.

  • TRUNCATE_COLUMNS — If the data in a file is too long for a field, this option truncates the value of the field rather than rejecting the row.

Advanced

  • ADD_FORMAT_PROPERTIES — This option allows adding custom format properties.

    Use the following syntax: '<prop1>' VALUE '<value1>', '<prop2>' VALUE '<value2>' ...

  • OVERWRITE_FIELD_LIST — This option lets you redefine the source file field definitions when ODI does not have enough information about your input data. The value entered here is used as the field_list parameter of the DBMS_CLOUD.CREATE_EXTERNAL_TABLE call.

    For more information, see the DBMS_CLOUD package documentation.

Cleanup

  • CLEANUP_TEMPORARY_OBJECTS — Set this property to True if you want temporary objects to be automatically cleaned up.

  • CLEANUP_CREDENTIAL — Set this property to True if you want the credential object to be automatically cleaned up at the end of every execution. Cleanup happens only if the CREATE_CREDENTIAL option is also set to True.

Target

  • CREATE_TARG_TABLE — Set this KM option to True if you want to create the target table before loading.

  • TRUNCATE_TARG_TABLE — Set this KM option to True if you want to truncate the target table before loading.

  • DELETE_TARG_TABLE — Set this KM option to True if you want to delete data from the target table before loading.