Top FAQs for Data Configuration

The top FAQs for Oracle Fusion Data Intelligence data configuration are identified in this topic.

What should I specify as the initial extract date in the pipeline parameters?

The initial extract date determines how far back the data is loaded into the autonomous data warehouse as part of the initial load or after a warehouse reset is performed. This is specific to your business and reporting needs and data retention policies.

I activated a functional area. Where can I check the status?

Navigate to the Console, click Data Configuration, and then click Request History.

What’s the difference between refresh and reset?

Refresh loads incremental data only, whereas reset reloads all data as of the initial extract date.

Can I reset and refresh the data warehouse?

Yes, you can reset and refresh the data warehouse. See Reset the Data Warehouse.

Can I reset and refresh a data pipeline for a functional area?

Yes, you can reset and refresh a data pipeline for a functional area. See Reload Data for a Data Pipeline and Refresh a Data Pipeline for a Functional Area.

Where do I check the data load status?

Navigate to the Console, click Data Configuration, then click Pipeline Settings, and view Last Refresh Date.

How do I increase the frequency of data refresh?

See Configure Frequent Data Refresh with Original Refresh Mechanism.

Can I load external data into the autonomous data warehouse?

Yes, you can load data from external sources into the autonomous data warehouse and build your own augmentations using connectors. See Manage Data Connections.

Is there a limitation on external data volume that’s extracted into the autonomous data warehouse?

By default, custom data is limited to 50 GB in the autonomous data warehouse associated with Oracle Fusion Data Intelligence. To store custom data beyond 50 GB, you must scale up through the Oracle Cloud Infrastructure Console. This is charged to your cloud account accordingly. See Scale Up Oracle Autonomous AI Lakehouse.
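The scale-up itself is performed in the Oracle Cloud Infrastructure Console as noted above. If you manage your tenancy with the OCI CLI instead, a minimal sketch is shown below; the OCID is a placeholder, the 2 TB size is only an example, and you should verify the parameter names against your CLI version:

    oci db autonomous-database update --autonomous-database-id <your_adw_ocid> --data-storage-size-in-tbs 2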

Can I extract Oracle Fusion Data Intelligence data and export it to external storage systems?

Yes, you can extract Oracle Fusion Data Intelligence data and export it to an external storage system. However, ensure that an adequate number of user licenses are available to access Oracle Fusion Data Intelligence data through the external system.

Can I filter the records by Status on the Request History page?

No, you must select both Status and Submitted by to get the applicable records.

Does daylight saving time impact the data refresh process?

Yes. Incremental refreshes that are scheduled around the start and end of daylight saving time may not run.

Can I clear the cache after the data pipeline is run?

The system clears the cache in the back-end after every data pipeline load as a standard step. You don't have to take any action.

Why are some entries missing for the frequent data refresh scheduled runs in the Warehouse Refresh Statistics report?

If no new records are present at the source to be extracted and published to the warehouse during the scheduled frequent data refresh runs, then the Warehouse Refresh Statistics report skips those entries.

Why does the frequent data refresh schedule for a functional area such as General Ledger get skipped sometimes after performing a reset of multiple functional areas?

This is expected behavior. The incremental refresh after a soft reset of multiple functional areas takes more time than usual because of the high volume of records being processed. As a result, the scheduled frequent General Ledger refresh jobs are skipped. The scheduled frequent General Ledger jobs start as soon as the incremental refresh is complete.

Even though I’ve enabled frequent data refresh for General Ledger, why does data in DW_GL_BALANCE_CA refresh only once a day with the daily pipeline refresh schedule?

Currently, the Frequent Data Refresh process doesn’t support aggregate tables such as DW_GL_BALANCE_CA.

Is the Oracle managed Salesforce pipeline available prebuilt with Fusion CX Analytics?

Yes, it's available prebuilt with Fusion CX Analytics.

What's available with the Oracle managed Salesforce pipeline?

Refer to Reference for Fusion CX Analytics.

I'm a Fusion ERP Analytics customer. How do I enable the Oracle managed Salesforce pipeline?

The Oracle managed Salesforce pipeline requires the Fusion CX Analytics SKU. Contact your sales team.

How do I import a self-signed certificate into the host VM Java truststore?

The certificate’s Common Name (CN) is *.oraclevcn.com. Ensure that the host’s DNS resolution returns <shorthostname>.oraclevcn.com. For example: ext40.oraclevcn.com.
  1. Export the self-signed certificate by running the following command:
    echo | openssl s_client -connect <host_name>.oraclevcn.com:9091 -servername <host_name>.oraclevcn.com | openssl x509 > certificate.crt
    Example:
    echo | openssl s_client -connect ext40mt.oraclevcn.com:9091 -servername ext40mt.oraclevcn.com | openssl x509 > certificate.crt 
  2. Import the exported certificate to the Java truststore by running this command:
    keytool -importcert -trustcacerts -file certificate.crt -alias oraclevcn-cert -keystore <path_to_truststore> -storepass <truststore_password>

    Replace <path_to_truststore> and <truststore_password> with appropriate values for your environment.

  3. After successfully importing the certificate, run the plugin script.
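Optionally, before running the plugin script, you can confirm that the certificate is now present in the truststore. This check uses the same placeholder values as in step 2:

    keytool -list -keystore <path_to_truststore> -storepass <truststore_password> -alias oraclevcn-cert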

How do I add the Rest Connector Source certificate to the Remote Agent Docker?

Follow these instructions:
  1. Open Google Chrome, go to the site, and click the icon to the left of the URL in the address bar. Click the Certificate option. Click Export, select the location, and save the .pem file.
  2. Import the exported certificate to the Remote Agent Docker Java keystore as follows:
    1. Log in to the Docker container with root permission using this command: sudo docker exec -it -u root remoteagent /bin/bash
    2. Import the certificate using this command: keytool -importcert -file /tmp/_.wms.ocs.oraclecloud.com.pem -keystore /usr/lib64/graalvm/graalvm21-ee-java17/lib/security/cacerts -alias 'wmscert'
  3. Restart the Remote Agent Docker container.
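A minimal sketch of the restart, assuming the container is named remoteagent as in step 2, is:

    sudo docker restart remoteagent

You can also confirm the certificate was imported by listing it from the same keystore path used in step 2; the storepass value changeit is the usual Java default and is an assumption here:

    sudo docker exec -it remoteagent keytool -list -keystore /usr/lib64/graalvm/graalvm21-ee-java17/lib/security/cacerts -storepass changeit -alias 'wmscert'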