5.8 Create and Use an Oracle Business Intelligence Cloud Connector Connection

Oracle Business Intelligence Cloud Connector (BICC) allows you to extract business data from a data source and load it into Autonomous Database.

You can use the BICC connection to choose the offerings whose data stores you want to extract. Data Transforms stages the extracted files in the Oracle Object Storage location that Oracle BICC uses for external storage; you can then use these files as a source for mappings. Note that you cannot use an Oracle BICC connection as the target of a mapping.

This topic has the following sections:

  • Creating an Oracle Business Intelligence Cloud Connector Connection
  • Creating an Oracle Business Intelligence Cloud Connector Data Load
  • Running an Oracle Business Intelligence Cloud Connector Data Load
  • Creating an Oracle Business Intelligence Cloud Connector Data Flow

Creating an Oracle Business Intelligence Cloud Connector Connection

To create an Oracle BICC connection you need to first configure external storage using the OCI Object Storage Connection tab in the BICC Console. You need to specify these connection details when you define the connection in Oracle Data Transforms.

To define an Oracle BICC connection:

  1. From the left pane of the Home page, click the Connections tab.

    The Connections page appears.

  2. Click Create Connection.

    The Create Connection page slides in.

  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the Applications tab.
  4. Select Oracle BI Cloud Connector as the connection type.
  5. Click Next.
  6. The Connection Name field is pre-populated with a default name. You can edit this value.
  7. Enter the URL in the BI Cloud Connector Service URL text box.
  8. In the Connection section, enter the following details:
    • In the User text box, enter the user name configured in the Oracle BI Cloud Connector Console.
    • In the Password text box, enter the password configured in the Oracle BI Cloud Connector Console.
  9. In the Storage section, enter the following details (the sketch after this procedure shows one way to look these values up with the OCI SDK):
    • In the External Storage BICC Name text box, enter the name of the external storage as it appears in the Oracle BI Cloud Connector Console.
    • In the External Storage Bucket text box, specify the bucket into which extracts are uploaded. Bucket names are obtained from the OCI Console.
    • In the External Storage Name Space text box, specify the namespace. The namespace is obtained from the OCI Console.
    • In the External Storage Region text box, enter the OCI Object Storage region.
    • In the External Storage User text box, enter your Oracle Cloud Infrastructure user name.
    • In the External Storage Token text box, enter the auth token.
  10. Click Test Connection to test the connection.
  11. After providing all the required connection details, click Create.

    The new connection is created.
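
The Storage section values must match the external storage that you configured for BICC. If you want to confirm them before creating the connection, the following minimal sketch uses the OCI Python SDK to look up the Object Storage namespace, verify the bucket, and create an auth token for the External Storage Token field. The config profile, bucket name, and token description are placeholders for your own values, and creating the token this way is optional; you can also create it in the OCI Console.

    # Minimal sketch (assumes the OCI Python SDK: pip install oci).
    # The bucket name and token description below are placeholders.
    import oci

    config = oci.config.from_file()  # reads ~/.oci/config, DEFAULT profile
    object_storage = oci.object_storage.ObjectStorageClient(config)

    # Value for the External Storage Name Space field.
    namespace = object_storage.get_namespace().data
    print("Namespace:", namespace)

    # Confirm that the bucket for the External Storage Bucket field exists.
    bucket = object_storage.get_bucket(namespace, "bicc-extract-bucket").data
    print("Bucket:", bucket.name, "in region:", config["region"])

    # The External Storage Token field takes an OCI auth token; one can be
    # created for the current user here or in the OCI Console.
    identity = oci.identity.IdentityClient(config)
    token = identity.create_auth_token(
        oci.identity.models.CreateAuthTokenDetails(description="bicc-extracts"),
        config["user"],
    ).data
    print("Auth token (shown only once):", token.token)
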

The newly created connections are displayed in the Connections page.

Click the Actions icon next to the selected connection to perform the following operations:

  • Select Edit to edit the connection details.
  • Select Test Connection to test the connection.
  • Select Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the connection.

You can also search for the required connection to view its details, filtering the list by:

  • Name of the connection.
  • Technology associated with the connection.

Creating an Oracle Business Intelligence Cloud Connector Data Load

To create an Oracle Business Intelligence Cloud Connector data load, you need to select at least one offering to load the BICC public view objects (PVOs).

To create an Oracle Business Intelligence Cloud Connector data load:

  1. Do one of the following:
    • On the Home page, click Load Data. The Create Data Load wizard appears.

      In the Create Data Load tab, enter a name if you want to replace the default value, add a description, and select a project from the drop-down.

    • On the Home page, click Projects, and then the required project tile. In the left pane, click Data Loads, and then click Create Data Load. The Create Data Load wizard appears.
  2. Enter a name if you want to replace the default value and add a description.
  3. For Load Processing, select the Internal radio button and from the Deployment Type drop-down select Data Transforms (Batch).
  4. Click Next.
  5. In the Source Connection tab,
    1. From the Connection Type drop-down, select Oracle BI Cloud Connector as the connection type.
    2. From the Connection drop-down, select the connection that you created using the instructions in Creating an Oracle Business Intelligence Cloud Connector Connection.
    3. From Offerings, you can choose All Offerings to include all the offerings in the data load or Select Offerings to choose the offerings whose data stores you want to load.
    4. If you choose Select Offerings, click the Selected Offerings box to see the list of offerings that you can choose from. You can select multiple offerings, but you must select at least one to load the BICC public view objects (PVOs).
    5. Click Next.
  6. In the Target Connection tab,
    1. From the Connection Type drop-down, select Oracle as the connection type.
    2. From the Connection drop-down, select the connection you want to use to load the data into.
    3. Select the schema that you want to use from the Schema drop-down.
    4. Click Save.

    The Data Load Detail page appears listing all the source tables.
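
After you save the data load, you can optionally confirm that the target connection details work outside the tool. The following is a minimal sketch, assuming the python-oracledb driver and an Autonomous Database wallet; the user, passwords, DSN alias, and wallet paths are placeholders for your own values.

    # Minimal sketch: confirm the target Autonomous Database schema is
    # reachable and list its tables (pip install oracledb).
    # All credentials and paths below are placeholders.
    import oracledb

    connection = oracledb.connect(
        user="DT_TARGET",                  # placeholder target schema user
        password="your-password",
        dsn="mydb_low",                    # TNS alias from the ADB wallet
        config_dir="/path/to/wallet",
        wallet_location="/path/to/wallet",
        wallet_password="wallet-password",
    )
    with connection.cursor() as cursor:
        cursor.execute("SELECT table_name FROM user_tables ORDER BY table_name")
        for (table_name,) in cursor:
            print(table_name)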

Running an Oracle Business Intelligence Cloud Connector Data Load

The Data Load Detail page displays the information that you need to run a data load. When you run an Oracle Business Intelligence Cloud Connector data load, you can apply different actions to the data entities, such as Incremental Merge, Incremental w/ Deletes, and Do Not Load, before loading them into the target schema.

To run a data load:

  1. In the Data Load Detail page, select the data entities that you want to move to the target schema.
  2. Click Modify if you want to add or remove any offerings before you run the data load.
  3. To filter the displayed list, you can do one of the following:
    • Enter the name or part of the name in the Name text box. This search returns data entities that include the specified string in the name. For example, the search string AD returns results such as ADMIN, ADDRESS, BADGE, UPGRADE, WORKLOAD, and so on.
    • Turn on the Use Regular Expression toggle to use pattern matching to search for particular strings of characters. For example, the search string CO.* returns results such as CONTACT, COUNT, COUNTRY and so on.
    You can select multiple data entities and load them to the target schema. You can also filter the displayed list using the following options:
    • All - Displays all the data entities that match the search criteria.
    • Selected - Displays all the rows that you selected.
    • Unselected - Displays all the unselected rows.
    • Invalid - Displays all the invalid rows.

    Note:

    These options display the list of data entities based on the search criteria. To view the list of all data entities, clear any applied filters.
  4. Click the required icon to choose any of the following actions:
    • Incremental Merge - Updates the data in the selected column by comparing the source table with the target table.
    • Incremental w/ Deletes - If the selected column in the target table contains records that have been deleted from the source table, this option deletes those records in the target table to match the source data.
    • Do Not Load - Skips the selected data entity from the data load job. After you click Save, these data entities are no longer available for future data load jobs.

    You can select multiple data entities and apply different actions. Unsaved rows are highlighted in bold. (A conceptual sketch of the Incremental Merge and Incremental w/ Deletes semantics follows this procedure.)

  5. To configure options for running the data load job, click Settings and set the following:
    • BICC job polling interval - Set the polling interval for monitoring the job. The default value is 4 seconds.
    • BICC Job Timeout - Specify the maximum time to wait before the job is considered failed. A blank value means there is no timeout. The default value is 1000 seconds.
    • Conversion errors - If any row errors out because of a conversion error, specify whether the row will be rejected (default) or the related columns will be stored as null.
    • Reject Limit - The query errors out after the specified number of rows are rejected. The default value is 0.
  6. Click the Save icon to save the changes. A green check mark in the row indicates that the changes are saved.
  7. To start the data load, click the Execute icon.

    A confirmation prompt appears when the data load starts successfully.

To check the status of the data load, see the Status panel on the right below the Target Schema details. For details about the Status panel, see Monitor Status of Data Loads, Data Flows, and Workflows. This panel shows links to the jobs that execute to run this data load. Click the link to monitor the progress on the Job Details page. For more information about jobs, see Create and Manage Jobs.
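
The Incremental Merge and Incremental w/ Deletes actions follow familiar merge semantics. The following Python sketch is only a conceptual illustration of the row-level outcome; it is not the SQL that Data Transforms actually runs against the target schema.

    # Conceptual illustration of the load actions, modeling rows as
    # dicts keyed by primary key. Not the actual implementation.

    def incremental_merge(source, target):
        """Insert new rows and update changed rows; keep target-only rows."""
        merged = dict(target)
        merged.update(source)  # source values win on matching keys
        return merged

    def incremental_with_deletes(source, target):
        """Like merge, but also drop target rows deleted in the source."""
        merged = incremental_merge(source, target)
        return {key: row for key, row in merged.items() if key in source}

    source = {1: "ACME", 2: "GLOBEX-v2"}   # row 3 was deleted in the source
    target = {2: "GLOBEX", 3: "INITECH"}

    print(incremental_merge(source, target))
    # {2: 'GLOBEX-v2', 3: 'INITECH', 1: 'ACME'}
    print(incremental_with_deletes(source, target))
    # {2: 'GLOBEX-v2', 1: 'ACME'}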

Creating an Oracle Business Intelligence Cloud Connector Data Flow

You can define a data flow to move data from a source Oracle BICC connection to a target Oracle Autonomous Database.

To create a data flow:

  1. Follow the instructions in Create a Data Flow to create a new data flow.
  2. In the Data Flow Editor, click Add a Schema to define your source connection. From the Connection drop-down, select the Oracle Business Intelligence Cloud Connector connection, and then select the schema that you want to use from the Schema drop-down. Click OK.
  3. Similarly, add more schemas to the data flow as needed.
  4. Drag the table that you want to use as a source in the data flow and drop it on the design canvas.
  5. Drag the table that you want to use as a target in the data flow and drop it on the design canvas.
  6. Save the data flow.
  7. Set the following transfer options, as required:
    • Submit BICC Job: Oracle BI Cloud Connector has its own scheduler, which can be set up to extract data and store it as zipped CSV files in the Object Store. When this option is set to True, Data Transforms submits the BICC extract job itself. If you instead want to use Oracle BI Cloud Connector to schedule the extracts and Data Transforms to schedule only the loading, set up a schedule for the extract in the Oracle BI Cloud Connector Console and set this option to False.
    • Bicc extract job type: Set this option to DATA or DATA_AND_KEYS to extract data files from BICC.

      If this option is set to KEYS or DATA_AND_KEYS, Data Transforms creates a file containing the keys and stores this key file in Object Storage. You need to do this if you want to use the Sync deleted records transfer option to synchronize records between the source and target tables. If the key file is not found, the data flow job will fail.

    • Sync deleted records: Set this to True to delete records from the target that are no longer in the source table. Data Transforms uses the key file to synchronize the source and target tables.

      Note:

      Using this option could affect performance, depending on the size of the data that is processed. Oracle recommends that you create duplicate data flows: schedule one to run daily with this option set to False, and another to run weekly with this option set to True.
  8. Save and execute the data flow.

    When you run a data flow that has a BICC VO as a source and an Oracle table as a target, Data Transforms does the following:

  • Starts the extract job for the VO in BICC, depending on how Submit BICC Job is set.
  • If Bicc extract job type is set to DATA or DATA_AND_KEYS, extracts data files from BICC and stores any new records found since the last extract in Object Storage in the form of compressed CSV files.

    If Bicc extract job type is set to KEYS or DATA_AND_KEYS, creates a file containing the keys and places it in Object Storage.

  • Waits for the extract to complete.
  • Finds files created since the date specified in the Last Load Date transfer option and loads them into the target table. Note that this step occurs regardless of the Bicc extract job type setting.
  • If the Sync deleted records transfer option is set to True, uses the key file to compare the source and the target table and removes any rows in the target that do not exist in the key file.
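
The run-time sequence above can be sketched in Python. In the sketch below, submit_extract and extract_is_complete are hypothetical stubs standing in for work that Data Transforms performs internally; only the Object Storage listing is a real OCI SDK call, and the bucket name, polling interval, and timeout are placeholders that mirror the data load settings described earlier.

    # Conceptual sketch of a BICC-to-Oracle data flow run.
    # submit_extract and extract_is_complete are hypothetical stubs;
    # the Object Storage listing uses the real OCI Python SDK.
    import time
    import oci

    POLL_INTERVAL = 4    # seconds, like the BICC job polling interval setting
    JOB_TIMEOUT = 1000   # seconds, like the BICC Job Timeout setting

    def submit_extract(vo_name, job_type="DATA_AND_KEYS"):
        """Hypothetical stub: submit the BICC extract (Submit BICC Job=True)."""
        ...

    def extract_is_complete(vo_name):
        """Hypothetical stub: check the BICC extract status."""
        return True  # placeholder so the sketch terminates

    def wait_for_extract(vo_name):
        """Poll at POLL_INTERVAL until done, failing after JOB_TIMEOUT."""
        deadline = time.time() + JOB_TIMEOUT
        while not extract_is_complete(vo_name):
            if time.time() > deadline:
                raise TimeoutError("BICC extract did not finish in time")
            time.sleep(POLL_INTERVAL)

    def new_extract_files(last_load_date):
        """List compressed CSVs created since the Last Load Date option.

        last_load_date must be a timezone-aware datetime.
        """
        config = oci.config.from_file()
        client = oci.object_storage.ObjectStorageClient(config)
        namespace = client.get_namespace().data
        listing = client.list_objects(
            namespace,
            "bicc-extract-bucket",           # placeholder bucket name
            fields="name,timeCreated",
        ).data
        return [o.name for o in listing.objects
                if o.time_created > last_load_date]

If Sync deleted records is set to True, the final step would then read the key file from Object Storage and delete any target rows whose keys it does not contain.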