About Managing Data Connections

You can connect to a variety of data sources and remote applications to provide the background information for reports. You can blend the additional data from the various data sources with the ready-to-use datasets to enhance business analysis.

Oracle Fusion Analytics Warehouse can connect to other pre-validated data sources such as Oracle Object Storage, cloud applications such as Google Analytics, and on-premises applications such as Oracle E-Business Suite. You'll need to create a data connection type before you create a data connection using that type.

You can view the usage of capacity for custom data that is loaded into Oracle Fusion Analytics Warehouse through the connectors in the Custom Data Usage dashboard available in the Common folder. The dashboard shows data loaded daily and monthly from each of the activated external data sources.

Create a Data Connection Type

Connection Type specifies the source to which you are connecting. A connection type can have multiple connections.

You can create a custom data source type for any remote data connection. Be sure to create the data connection type before using it to Create a Data Connection.
  1. Sign in to your service.
  2. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Global Configurations.
  4. On the Manage Connections page, click Create and then click Connection Type.
  5. In the Create Connection Type dialog, enter the Name, Identifier, and Prefix for warehouse for the connection type.
  6. Click Add Property and enter the parameters for each property that defines the connection.
  7. When you've finished adding the connection properties, you can reorder them as needed.
  8. Click Save.
The new connection type is available on the Manage Connections page under Connection Types.

Edit a Data Connection Type

If the properties or parameters for a data connection type change, you can edit them.

  1. Sign in to your service.
  2. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    You see the Data Configuration page.
  3. On the Data Configuration page, click Manage Connections under Global Configurations.
  4. On the Manage Connections page, click Connection Types and then click or search for the connection type you want to edit.
    You can't edit or delete Oracle-managed connections.
  5. Click the Action button next to the connection type you want to change.
  6. In the dialog box for the connection type, edit the details for your connection type, and then click Save.

Delete a Data Connection Type

You can delete a data connection type if you don't need it anymore.

Note:

After you delete a connection type, you can't create new data connections that use it.
  1. Sign in to your service.
  2. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    You see the Data Configuration page.
  3. On the Data Configuration page, click Manage Connections under Global Configurations.
  4. On the Manage Connections page, click Connection Types, then select or search for the connection type you want to delete.
  5. Click the Action menu for the connection type and select Delete.
  6. In the Delete Connection Type dialog box, click Delete.

Create a Data Connection

You can create a data connection for any available connection type.

You need a connection type to create a data connection. You can either use a ready-to-use data connection type or Create a Data Connection Type that meets your business requirements.
  1. Sign in to your service.
  2. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Global Configurations.
  4. On the Manage Connections page, click Create and then click Connection.
  5. In the Create Connection dialog, click or search for the connection type you want to create.
  6. In the dialog box for the connection, enter the details for your connection in the fields.
  7. Click Save.
The new connection is available on the Connections page.

Test a Data Connection

After you create a data connection, you should test it to ensure it works properly.

  1. Sign in to your service.
  2. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Global Configurations.
  4. On the Manage Connections page, click Connections, then select or search for the connection you want to test.
  5. Click the Action menu for the connection and select Test Connection.
  6. On the Request History page, check the status of the request to test the connection.

Update a Data Connection

When you first make a data connection, or when you make changes, you need to initialize and refresh it.

  1. Sign in to the Oracle Cloud Infrastructure Console.
  2. In Oracle Cloud Infrastructure Console, click the Navigation menu icon in the top left corner.
  3. Click Analytics & AI. Under Analytics, click Fusion Analytics Warehouse.
  4. Navigate to your service instances page.
  5. On the Instances page, click the instance for which you want to update the service.
  6. Click Connections, then select or search for the connection you want to update.
  7. Click the Action menu for the connection and select Initialize/Refresh Connection.

Delete a Data Connection

You can delete a custom data connection if you don't need it anymore.

Note:

You can't update or load data from deleted data connections to the warehouse.
  1. Sign in to your service.
  2. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    You see the Data Configuration page.
  3. On the Data Configuration page, click Manage Connections under Global Configurations.
  4. On the Manage Connections page, click Connections, then select or search for the connection you want to delete.
  5. Click the Action menu for the connection and select Delete.
  6. In the Delete Connection dialog box, click Delete.

Load Data from a Remote Agent into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises systems and use the on-premises data to create data augmentations.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle Fusion Analytics Warehouse only once in 24 hours.

Ensure that Remote Agent is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Download the remote agent docker image from here.
  2. Identify a host to deploy the remote agent.
    The host that you identify must meet these minimum system requirements for the basic configuration of a single source agent:
    • CPU: 4 (CORE/CPU)
    • Memory: 8 GB
    • Storage: 8 GB

    Note:

    The host must be able to make a JDBC connection to the JD Edwards database.
  3. Copy the docker image to the host and load it using this script:
    docker load -i <docker image zip>
    # List the images
    docker images
  4. Create and run the docker container using this script:
    docker run -d -p 9091:9091 --name remoteagent -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    If the remote agent user interface isn't accessible, then run this script:

    sudo docker run -d -p 9091:9091 --name remoteagent --network host -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    Note:

    Ensure that the logs directory in /faw/logs/RemoteAgent/ has write permissions and the config folder in /faw/software/remoteagent/config/ is present in case you need to add custom properties.
  5. If you have a firewall that prevents you from accessing the remote agent, then ensure that you complete these steps before starting the docker container for the first time:
    1. Disable the firewall in the Linux host where docker is deployed.
    2. Start the docker container.
    3. Check the logs to see if the container is started.
    4. Enable the firewall.
    5. Open the port using this script:
      sudo firewall-cmd --zone=public --add-port=9091/tcp --permanent
      sudo firewall-cmd --reload
      sudo iptables-save | grep 9091
  6. Verify that the container has started successfully using this script:
    docker ps
  7. Configure the extract service URL to connect using this information:
    1. Sign in to the remote agent user interface using http://<host>:9091/extractservice-remoteagent/index.html.
    2. Configure the extract service URL that the remote agent connects to, and configure any outgoing proxies if required, using the applicable extract service endpoints. You can form the extract service URL from your Oracle Fusion Analytics Warehouse URL by replacing ui/oax/ with the extract service context path. For example, if your product URL is https://myinstance.example.com/ui/oax/, then the extract service URL is https://myinstance.example.com/extractservice.
    3. Configure the JD Edwards data source outbound proxy in the remote agent user interface. You see the extract service URL. For example, you see http://<server IP>/extractservice.
  8. In the remote agent user interface, click Configure to configure the agent.
  9. Copy the configuration details from the text box or download the configuration details.
    You use it to set up the connection on the Data Configuration page in Oracle Fusion Analytics Warehouse.
  10. Configure the remote agent on the Data Configuration page in Oracle Fusion Analytics Warehouse using these instructions:
    1. Open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote Agent as the connection type.
      Remote Agent connection option

    5. In the Create Connection Remote Agent dialog, in Connection Name, you can modify the default name and verify that Remote is displayed in Connectivity Type.
      Create Connection Remote Agent dialog

    6. Enter an email address to receive notifications in Notification Email, and provide the Identifier and Host. In Public Key, click Upload File or Drop Above to fill in the details of the remote agent, and then click Save. You can add the configuration details file that you downloaded or use the configuration details that you copied after configuring the remote agent.
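The extract service URL substitution described in step 7 can be sketched in shell. This is a minimal sketch: the instance name comes from the example in the text, and the sed pattern assumes the product URL ends in ui/oax/.

```shell
# Derive the extract service URL from the product URL by replacing
# the trailing ui/oax/ context path with extractservice.
# PRODUCT_URL is the example instance from the text above.
PRODUCT_URL="https://myinstance.example.com/ui/oax/"
EXTRACT_URL="$(printf '%s\n' "$PRODUCT_URL" | sed 's|ui/oax/$|extractservice|')"
echo "$EXTRACT_URL"   # https://myinstance.example.com/extractservice
```

If your instance uses a vanity URL or a different context path, adjust the pattern accordingly before entering the result in the remote agent user interface.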

Load Data from On-premises E-Business Suite into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises Oracle E-Business Suite system.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle Fusion Analytics Warehouse only once a day.

Ensure that Oracle E-Business Suite On-Prem is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Download the remote agent docker image from here.
  2. Identify a host to deploy the remote agent.
    The host that you identify must meet these minimum system requirements for the basic configuration of a single source agent:
    • CPU: 4 (CORE/CPU)
    • Memory: 8 GB
    • Storage: 8 GB

    Note:

    The host must be able to make a JDBC connection to the E-Business Suite database.
  3. Copy the docker image to the host and load it using this script:
    docker load -i <docker image zip>
    # List the images
    docker images
  4. Create and run the docker container using this script:
    docker run -d -p 9091:9091 --name remoteagent -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    If the remote agent user interface is not accessible, then run this script:

    sudo docker run -d -p 9091:9091 --name remoteagent --network host -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    Note:

    Ensure that the “logs” directory in /faw/logs/RemoteAgent/ has write permissions and the “config” folder in /faw/software/remoteagent/config/ is present in case you need to add custom properties.
  5. Verify that the container has started successfully using this script:
    docker ps
  6. Configure the extract service URL to connect using this information:
    1. Sign in to the remote agent user interface using http://<host>:9091/extractservice-remoteagent/index.html.
    2. Configure the extract service URL that the remote agent connects to and configure any outgoing proxies if required using the applicable extract service end points.
    3. Configure the E-Business Suite data source outbound proxy in the remote agent user interface. You see the extract service URL. For example, you see http://<server IP>/extractservice.
  7. In the remote agent user interface, click Configure to configure the agent.
  8. Copy the configuration details from the text box or download the configuration details.
    You use it to set up the agent on the Data Configuration page in Oracle Fusion Analytics Warehouse.
  9. Configure the remote agent and E-Business Suite data source on the Data Configuration page in Oracle Fusion Analytics Warehouse using these instructions:
    1. Open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote as the remote agent connection type.

      Note:

      The "Remote" and "EBS" connection types are ready-to-use.
    5. Click Add File or drop down to fill in the details of the remote agent. You can add the configuration details file that you had downloaded or use the configuration details that you had copied after configuring the remote agent. See Create a Data Connection.
    6. In Create Connection, select Oracle E-Business Suite as the connection type.
      E-Business Suite connection option

    7. In Create Connection for Oracle E-Business Suite On-Prem, select Remote as connectivity type.
      Create Connection for Oracle E-Business Suite On-Prem dialog

    8. In the Remote Agent field, select the remote agent connection that you created, for example, Remote Agent. Enter an email address to receive notifications in Notification Email, provide the credentials and E-Business Suite connection URL, and select the E-Business Suite offerings that you want to load data from.
    9. Confirm that you see the Remote Agent and E-Business Suite connections on the Manage Connections page.
    10. On the Manage Connections page, select the Actions menu for the E-Business Suite connection and then select Refresh Metadata.

      Note:

      You can’t create augmentations for E-Business Suite unless you perform a metadata extract.
    11. Test both the connections by selecting the Test Connection option in the Actions menu. You can check the statuses of all these requests on the Data Configuration Request History page.
  10. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the E-Business Suite data. Ensure that you select Regular as the source dataset type and EBS as the pillar. Select the applicable E-Business Suite source tables. See Augment Your Data.
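The note in step 2 requires that the host can make a JDBC connection to the E-Business Suite database. Before deploying the agent, you can spot-check basic TCP reachability to the database listener; the host name and port 1521 below are assumptions, so substitute your actual database host and listener port.

```shell
# Hedged sketch: check TCP reachability to the E-Business Suite database
# listener from the remote agent host. EBS_DB_HOST and EBS_DB_PORT are
# hypothetical values; replace them with your own.
EBS_DB_HOST="${EBS_DB_HOST:-ebsdb.example.com}"
EBS_DB_PORT="${EBS_DB_PORT:-1521}"
if timeout 5 bash -c "exec 3<>/dev/tcp/${EBS_DB_HOST}/${EBS_DB_PORT}" 2>/dev/null; then
  echo "listener reachable"
else
  echo "listener not reachable"
fi
```

This only confirms that the port is open; the JDBC connection itself still requires valid credentials and a service name.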

Connect with Your Oracle Eloqua Data Source

If you’ve subscribed for Oracle Fusion CX Analytics and want to load data from your Oracle Eloqua source into Fusion Analytics Warehouse, then create a connection using the Eloqua connection type.

The Oracle Eloqua data that you load into Fusion Analytics Warehouse enables you to augment the data in your warehouse and create varied customer experience-related analytics. Ensure that Oracle Eloqua is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
  2. On the Data Configuration page, click Manage Connections under Global Configurations.
  3. On the Manage Connections page, click Create and then click Connection.
  4. In Create Connection, select Eloqua as the connection type.
  5. In the dialog for the Eloqua connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and the credentials to connect with the Eloqua source in User Name and Password.
  6. In URL, enter the URL for your Eloqua server in this sample format: https://<your eloqua server>/api/odata/1.0.
  7. Click Save.
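Before saving, you can sanity-check that the URL you plan to enter matches the sample format from step 6. The server host below is a hypothetical placeholder, not a real Eloqua pod.

```shell
# Assemble and sanity-check the Eloqua OData URL in the format the
# connection dialog expects. ELOQUA_SERVER is a hypothetical host name.
ELOQUA_SERVER="secure.p01.eloqua.com"
ELOQUA_URL="https://${ELOQUA_SERVER}/api/odata/1.0"
case "$ELOQUA_URL" in
  https://*/api/odata/1.0) echo "URL format OK: $ELOQUA_URL" ;;
  *) echo "unexpected URL format" ;;
esac
```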

Load Data from Your Oracle Eloqua Data Source

Create a data pipeline for the Marketing Campaign Analytics functional area to load data from your Oracle Eloqua source into Oracle Fusion Analytics Warehouse.

  1. Sign in to your service.
  2. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click Customer Experience.
  4. On the Customer Experience page, click Create.
  5. In the wizard, select Customer Experience Marketing Analytics in Offering and Marketing Campaign Analytics in Functional Area to transfer the data to the warehouse, and then click Next.
  6. Review the parameters and click one of the options:
    • Cancel: To cancel the data pipeline for the functional area.
    • Save: To save the data pipeline for the functional area but not activate it.
    • Activate: To schedule when to run the data pipeline for the functional area. See Activate a Data Pipeline for a Functional Area.

Load Data from Enterprise Performance Management into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Fusion Analytics Warehouse extract service to acquire data from the Enterprise Performance Management (EPM) SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

You can connect to these functional modules of EPM:
  • Financial Close and Consolidation (FCCS)
  • Planning and Budgeting (PBCS)
  • Profitability and Cost Management (PCMCS)

Incremental data extraction is manual only: you must update the results file in EPM before starting the next extraction for the updated data. Update the results file by running the integration using Data Exchange, and then access the new results file from the EPM connector in Fusion Analytics Warehouse.

Depending on the functional module you want to connect to, ensure that the applicable feature is enabled on the Enable Features page prior to creating this connection:
  • Oracle EPM - Financial Close and Consolidation
  • Oracle EPM - Planning and Budgeting
  • Oracle EPM - Profitability and Cost Management
See Make Preview Features Available.
  1. In EPM, create an integration, write out the results into a file whose name you provide in Download File Name, and then specify that same file name in List of Data Files while creating the connection to EPM in Fusion Analytics Warehouse to extract the data.
    Create an integration in EPM
  2. In Fusion Analytics Warehouse, create the EPM data connection using these instructions:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select the connection type based on the functional module that you want to connect to. For example, to connect to the "Financial Close and Consolidation (FCCS)" module, select Oracle EPM - Financial Close and Consolidation as the connection type.

      Oracle EPM - Financial Close and Consolidation connection option

    5. In Create Connection for the EPM source, enter these details and click Save:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • User Name and Password: Enter the credentials for your EPM source. Prefix the user name with the domain of your EPM source, such as domain.username.
      • URL: Enter the API URL of your EPM source, including the EPM version number. For example, if the EPM version is 22.10.73, then the API URL is https://epm7-test-a123456.epm.us6.oraclecloud.com/interop/rest/22.10.73.
      • List of Data Files: Specify the file name that you had entered in Download File Name while creating an integration in EPM.
      Create EPM Connection
  3. On the Manage Connections page, select the Actions menu for the EPM connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for EPM unless you perform a metadata extract.
  4. On the Manage Connections page, select the Actions menu for the EPM connection and then select Test Connection.
  5. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the EPM data. Ensure that you select Regular as the source dataset type and EPM as the pillar. Select the applicable EPM source tables. See Augment Your Data.
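The URL format expected in the connection dialog can be sketched as a simple shell assembly; the host and version are the placeholders from the example above.

```shell
# Build the EPM REST API URL from the instance host and EPM version,
# matching the example format in the text. Both values are placeholders.
EPM_HOST="https://epm7-test-a123456.epm.us6.oraclecloud.com"
EPM_VERSION="22.10.73"
EPM_API_URL="${EPM_HOST}/interop/rest/${EPM_VERSION}"
echo "$EPM_API_URL"   # https://epm7-test-a123456.epm.us6.oraclecloud.com/interop/rest/22.10.73
```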

Load Data from On-premises PeopleSoft into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises Oracle PeopleSoft system.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle Fusion Analytics Warehouse only once in 24 hours.

Ensure that Remote Agent and, depending on the functional module you want to connect to, the applicable feature are enabled on the Enable Features page prior to creating this connection:
  • Oracle PeopleSoft On-Prem - Campus Solutions
  • Oracle PeopleSoft On-Prem - Financials
  • Oracle PeopleSoft On-Prem - Human Resources
  • Oracle PeopleSoft On-Prem - Learning Management
See Make Preview Features Available.
  1. Download the remote agent docker image from here.
  2. Identify a host to deploy the remote agent.
    The host that you identify must meet these minimum system requirements for the basic configuration of a single source agent:
    • CPU: 4 (CORE/CPU)
    • Memory: 8 GB
    • Storage: 8 GB

    Note:

    The host must be able to make a JDBC connection to the PeopleSoft database.
  3. Copy the docker image to the host and load it using this script:
    docker load -i <docker image zip>
    # List the images
    docker images
  4. Create and run the docker container using this script:
    docker run -d -p 9091:9091 --name remoteagent -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    If the remote agent user interface is not accessible, then run this script:

    sudo docker run -d -p 9091:9091 --name remoteagent --network host -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    Note:

    Ensure that the “logs” directory in /faw/logs/RemoteAgent/ has write permissions and the “config” folder in /faw/software/remoteagent/config/ is present in case you need to add custom properties.
  5. If you have a firewall that prevents you from accessing the remote agent, then ensure that you complete these steps before starting the docker container for the first time:
    1. Disable the firewall in the Linux host where docker is deployed.
    2. Start the docker container.
    3. Check the logs to see if the container is started.
    4. Enable the firewall.
    5. Open the port using this script:
      sudo firewall-cmd --zone=public --add-port=9091/tcp --permanent
      sudo firewall-cmd --reload
      sudo iptables-save | grep 9091
  6. Verify that the container has started successfully using this script:
    docker ps
  7. Configure the extract service URL to connect using this information:
    1. Sign in to the remote agent user interface using http://<host>:9091/extractservice-remoteagent/index.html.
    2. Configure the extract service URL that the remote agent connects to and configure any outgoing proxies if required using the applicable extract service end points.
    3. Configure the PeopleSoft data source outbound proxy in the remote agent user interface. You see the extract service URL. For example, you see http://<server IP>/extractservice.
  8. In the remote agent user interface, click Configure to configure the agent.
  9. Copy the configuration details from the text box or download the configuration details.
    You use it to set up the connection on the Data Configuration page in Oracle Fusion Analytics Warehouse.
  10. Configure the remote agent and PeopleSoft data source on the Data Configuration page in Oracle Fusion Analytics Warehouse using these instructions:
    1. Open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote Agent as the connection type.
    5. In the Create Connection Remote Agent dialog, in Connection Name, you can modify the default name and verify that Remote is displayed in Connectivity Type.
    6. Enter an email address to receive notifications in Notification Email, and provide the Identifier and Host. In Public Key, click Upload File or Drop Above to fill in the details of the remote agent, and then click Save. You can add the configuration details file that you downloaded or use the configuration details that you copied after configuring the remote agent.
    7. Navigate to the Manage Connections page, click Create and then click Connection.
    8. In Create Connection, select the connection type based on the functional module that you want to connect to. For example, to connect to the "Financials" module, select Oracle PeopleSoft On-Prem - Financials as the connection type.
      Oracle PeopleSoft On-Prem - Financials connection option

    9. In Create Connection for Oracle PeopleSoft On-Prem - Financials dialog, in Connectivity Type, verify that Remote is selected automatically.
      Create Connection for Oracle PeopleSoft On-Prem - Financials dialog

    10. In Remote Agent, select the remote agent connection that you created earlier, for example, Remote Agent.
    11. Enter an email address to receive notifications in Notification Email, provide credentials for your PeopleSoft source in User Name and Password, and the URL of your PeopleSoft source in URL.
    12. Confirm that you see the Remote Agent and PeopleSoft connections on the Manage Connections page.
    13. On the Manage Connections page, select the Actions menu for the PeopleSoft connection and then select Refresh Metadata.

      Note:

      You can’t create augmentations for PeopleSoft unless you perform a metadata extract.
    14. Test both the connections by selecting the Test Connection option in the Actions menu. You can check the statuses of all these requests on the Data Configuration Request History page.
  11. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the PeopleSoft data. Ensure that you select Regular as the source dataset type and PSFT as the pillar. Select the applicable PeopleSoft source tables. See Augment Your Data.
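The minimum host requirements listed in step 2 can be spot-checked with standard Linux tools before deploying the agent. This sketch assumes a Linux host and uses the root filesystem for the storage check; adjust the path to wherever you plan to mount /faw.

```shell
# Spot-check the host against the stated minimums
# (4 CPUs, 8 GB memory, 8 GB storage).
echo "CPUs:        $(nproc)"
echo "Memory (GB): $(awk '/MemTotal/ {printf "%d", $2/1024/1024}' /proc/meminfo)"
# Free space on the filesystem that will hold /faw (path is an assumption)
df -h / | tail -1
```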

Load Data from Google Analytics into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Fusion Analytics Warehouse extract service to acquire data from the Google Analytics SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Before connecting with the Google Analytics source, note these:
  • Fusion Analytics Warehouse supports the Google Analytics extractor for GA4 properties and doesn't support the previous version, Google Universal Analytics (UA) properties.
  • DataStores are the list of GA4 properties.
  • DataStore columns are the list of Dimensions and Metrics for a GA4 property.
  • DataExtract runs the report based on user selection for a GA4 property as DataStore and Dimensions and Metrics as DataStore columns.
  • MetaExtract fetches metadata for all the available GA4 properties (DataStores) and its Dimensions and Metrics (DataStoreColumns).

Ensure that Google Analytics is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Google Cloud (Analytics) Project, create a service account and download the credentials.json file.
  2. Add the service account to the Google Analytics 4 property.
  3. Enable Google Analytics APIs using these instructions:
    1. Using a text editor, open the credentials.json file that you had downloaded and search for the client_email field to obtain the service account email address.
    2. Use this email address to add a user to the Google Analytics 4 property you want to access through the Google Analytics Data API v1.
    Enable Google Analytics APIs
  4. Ensure that the Google Analytics Admin API, Google Analytics Data API are available for your Google Analytics instance.
    View Google Analytics APIs
  5. In Fusion Analytics Warehouse, create the Google Analytics data connection using these instructions:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Google Analytics as the connection type.

      Google Analytics connection option

    5. In the dialog for the Google Analytics connection, select Standard as the connectivity type and enter these details:
      • Notification Email: An email address to receive notifications regarding this connection.
      • Service Account Credentials Json File: The Google Cloud Service Account credentials.json file that you had downloaded.
      • Account ID: Google Analytics account ID.
      • GA4 List of Property ID: A comma-separated list of GA4 property IDs.
      • Lookback Mode: Select either Full or Committed.
      • List of Lookback N days Ago: A comma-separated list of integer day values, such as 7,21.
      Create Connection dialog
      Note these:
      • For the Lookback mode, if you don't provide a value, then lookback isn't applied. The Full option requires a single day value; if you provide multiple values, then the process uses only the first one. You can provide multiple values for the Committed option.
      • For List Data Stores, the REST API returns the list of GA4 property IDs either by using the Account ID (if provided) or from the configured list of property IDs.
      • For List columns, the REST API returns a list of column metadata for the given GA4 Property ID.
    6. Click Save.
  6. On the Manage Connections page, select the Actions menu for the Google Analytics connection and then select Test Connection.

    Note:

    The REST API signature is the same across sources. Test Connection invokes the GA Common Metadata API, which returns the default version values; no calls are made to the source.
  7. On the Manage Connections page, select the Actions menu for the Google Analytics connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Google Analytics unless you perform a metadata extract.
    Metadata extract:
    • Retrieves metadata columns for each GA4 Property ID provided in the source configuration.
    • Prefixes the GA property columns with Dimension_ or Metric_, which Fusion Analytics Warehouse later uses while extracting data to differentiate Dimension and Metric column types.
    • Leaves the payload dataStores array empty.
  8. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Google Analytics data. Ensure that you select Regular as the source dataset type and Google Analytics as the pillar. Select the applicable Google Analytics source tables. See Augment Your Data.
    When you enable data extraction, you can schedule it to run at the time you choose. For data extraction, note these:
    1. Provide date ranges to run the report and fetch data.
    2. Regular data extract uses the initial or last ExtractDate as StartDate and job RunDate as EndDate.
    3. Lookback mode includes additional date ranges along with the regular extract date range, which fetches additional data sets in a single runReport call.
      • The Full option has a single date range: StartDate=ExtractDate - NdaysAgo, EndDate=RunDate.
      • The Committed option can have multiple date ranges. For each configured GA_LIST_OF_N_DAYS_AGO: StartDate=ExtractDate - NdaysAgo, EndDate=RunDate - NdaysAgo.
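
The date-range rules above can be sketched in Python. This is an illustrative helper only, not part of the product; the function name and signature are assumptions:

```python
from datetime import date, timedelta

def lookback_ranges(extract_date, run_date, mode=None, n_days_ago=()):
    """Return the (StartDate, EndDate) pairs a run would request.

    The first pair is always the regular extract window; lookback
    ranges are appended according to the configured mode.
    """
    ranges = [(extract_date, run_date)]              # regular extract window
    if not mode or not n_days_ago:
        return ranges                                # lookback not applied
    if mode == "Full":
        n = n_days_ago[0]                            # Full uses only the first value
        ranges.append((extract_date - timedelta(days=n), run_date))
    elif mode == "Committed":
        for n in n_days_ago:                         # Committed accepts multiple values
            ranges.append((extract_date - timedelta(days=n),
                           run_date - timedelta(days=n)))
    return ranges

# Example: last extract on March 10, run on March 15, lookback of 7 and 21 days
for start, end in lookback_ranges(date(2024, 3, 10), date(2024, 3, 15),
                                  "Committed", [7, 21]):
    print(start, "->", end)
```

All date ranges, regular and lookback, are gathered before the report runs, which matches the single runReport call described above.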

Load Data from Salesforce into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Fusion Analytics Warehouse extract service to acquire data from the Salesforce SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Ensure that Salesforce REST is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.
  1. In Fusion Analytics Warehouse, create the Salesforce data connection using these instructions:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Salesforce REST as the connection type.

      Salesforce REST connection option

    5. In the dialog box for the Salesforce connection, select Standard in Connectivity Type and enter an email address to receive notifications in Notification Email.

      Create Connection for Salesforce REST dialog

    6. Enter the URL of your Salesforce source such as <your Salesforce instance name>.my.salesforce.com in URL.
    7. Copy and paste the token url from your Salesforce instance in Token URL.
    8. Select the type of authorization in Authorization Type.
      Authorization Type can be one of the following: BASICAUTH or OAUTH. Ensure that you enter these authorization types in uppercase letters without any additional characters. You must provide the corresponding fields for the authorization type that you select. For example, if you select BASICAUTH, then you must provide a valid user name, password, security token, URL, client ID, and client secret. If you select OAUTH, then you must provide a valid user name, token URL, client ID, and private key. Remember to update all the authorization fields, because Salesforce may reset or require you to reset them regularly.
    9. Enter the credentials for your Salesforce source in User Name and Password.
    10. Copy and paste the client ID that is usually a long alpha-numeric code from your Salesforce account in Client ID.
    11. Copy and paste the client secret from your Salesforce account in Client Secret.
      This is an alphanumeric code and may contain special characters; however, it isn't visible. It is encrypted and shown as ….
    12. Copy and paste the security token from your Salesforce account in Security Token.
      This is an alphanumeric code and may contain special characters; however, it isn't visible. It is encrypted and shown as ….
    13. Copy and paste the private key from your Salesforce account in Private Key.
    14. Click Save.
  2. On the Manage Connections page, select the Actions menu for the Salesforce connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Salesforce unless you perform a metadata extract.
  3. On the Manage Connections page, select the Actions menu for the Salesforce connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Salesforce data. Ensure that you select Regular as the source dataset type and Salesforce as the pillar. Select the applicable Salesforce source tables. See Augment Your Data.
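
The authorization-type requirements above can be sketched as a small validation helper. This is a hypothetical illustration; the lowercase field names are assumptions, not the product's internal names:

```python
# Required fields per authorization type, mirroring the documentation above.
REQUIRED_FIELDS = {
    "BASICAUTH": {"username", "password", "security_token", "url",
                  "client_id", "client_secret"},
    "OAUTH": {"username", "token_url", "client_id", "private_key"},
}

def missing_fields(auth_type, provided):
    """Return the required fields that are absent or empty for an auth type."""
    auth_type = auth_type.strip().upper()        # the type must be uppercase
    if auth_type not in REQUIRED_FIELDS:
        raise ValueError("Unsupported authorization type: " + auth_type)
    return {field for field in REQUIRED_FIELDS[auth_type]
            if not provided.get(field)}

print(sorted(missing_fields("OAUTH", {"username": "admin@example.com"})))
```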

Load the Bureau of Labor Statistics Data into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Fusion Analytics Warehouse extract service to acquire data from the Bureau of Labor Statistics (BLS) and use it to create data augmentations for various Enterprise Resource Planning, Human Capital Management and Supply Chain Management use cases.

BLS is a repository of many datasets created by the United States government. It includes almost 100 public datasets such as Consumer Price Index, Producer Price Indexes, Import and Export Price Indexes, Pay and Benefits, Unemployment, Employment, and Labor Productivity.

Because the government datasets don't carry timestamps for partial changes and the government publishes each dataset in its entirety when changes occur, you must perform a full extract each time.
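
Without a change timestamp there is no reliable way to load only the changed rows, so each refresh replaces the previously loaded rows outright. A minimal sketch of that full-refresh behavior, with made-up dataset values for illustration:

```python
def full_refresh(warehouse, dataset_name, new_rows):
    """Replace a dataset wholesale: without change timestamps there is no
    reliable way to detect which rows changed, so everything is reloaded."""
    warehouse[dataset_name] = list(new_rows)     # drop the old copy entirely

# A dataset is published, then republished in full with a correction and a new row
wh = {}
full_refresh(wh, "CPI", [("2024-01", 100.0), ("2024-02", 100.4)])
full_refresh(wh, "CPI", [("2024-01", 100.0), ("2024-02", 100.3), ("2024-03", 100.9)])
print(len(wh["CPI"]))   # the warehouse holds only the latest publication
```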

Ensure that U.S. Bureau of Labor Statistics (BLS) is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Analytics Warehouse, create the BLS data connection using these instructions:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select U.S. Bureau of Labor Statistics as the connection type.

      U.S. Bureau of Labor Statistics connection option

    5. In the dialog for the BLS connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide the URL for the BLS data in URL.
      Create Connection for the Bureau of Labor Statistics data
    6. Click Save.
  2. On the Manage Connections page, select the Actions menu for the BLS connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for BLS unless you perform a metadata extract.
  3. On the Manage Connections page, select the Actions menu for the BLS connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the BLS data. Ensure that you select Regular as the source dataset type and BLS as the pillar. Select the applicable BLS source tables. See Augment Your Data.

Load Data from Shopify into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Fusion Analytics Warehouse extract service to acquire data from the Shopify SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Ensure that Shopify is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Analytics Warehouse, create the Shopify data connection using these instructions:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Shopify as the connection type.

      Shopify connection option

    5. In the dialog for the Shopify connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, the applicable token value in Access Token, the store name (such as myfawteststore.myshopify.com) in Store Name, and True in Bulk Extract.

      Create Connection for Shopify dialog

    6. Click Save.
  2. On the Manage Connections page, select the Actions menu for the Shopify connection and then select Test Connection.
  3. On the Manage Connections page, select the Actions menu for the Shopify connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Shopify unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Shopify data. Ensure that you select Regular as the source dataset type and Shopify as the pillar. Select the applicable Shopify source tables. See Augment Your Data.

Load Data from a Secure FTP Source into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Fusion Analytics Warehouse extract service to acquire data from a secure FTP source (SFTP) and use it to create data augmentations.

Ensure that SFTP is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Analytics Warehouse, create the SFTP data connection using these instructions:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select SFTP as the connection type.

      SFTP connection option

    5. In the dialog for the SFTP connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide applicable values in Remote Host, User Name, Private Key, Remote Host Extract Files Directory, File Type, CSV Delimiter, CSV Date Format, and CSV Timestamp Format.

      Ensure that the private key you provide is in the valid OpenSSH format.


      Create Connection for SFTP dialog

    6. Click Save.
  2. On the Manage Connections page, select the Actions menu for the SFTP connection and then select Test Connection.
  3. On the Manage Connections page, select the Actions menu for the SFTP connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for SFTP unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the SFTP data. Ensure that you select Regular as the source dataset type and the secure FTP source as the pillar. Select the applicable secure FTP source tables. See Augment Your Data.
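
The OpenSSH format requirement for the private key can be pre-checked before pasting the key. This is a rough illustrative check using the standard OpenSSH markers; the service performs its own validation:

```python
# Rough format pre-check using the standard OpenSSH private-key markers.
OPENSSH_HEADER = "-----BEGIN OPENSSH PRIVATE KEY-----"
OPENSSH_FOOTER = "-----END OPENSSH PRIVATE KEY-----"

def looks_like_openssh_key(text):
    """Return True if the text is framed by the OpenSSH private-key markers."""
    lines = [line.strip() for line in text.strip().splitlines()]
    return (len(lines) >= 2
            and lines[0] == OPENSSH_HEADER
            and lines[-1] == OPENSSH_FOOTER)

# A PEM/RSA-style key is not in OpenSSH format and would need converting
pem_key = "-----BEGIN RSA PRIVATE KEY-----\nMIIEow...\n-----END RSA PRIVATE KEY-----"
print(looks_like_openssh_key(pem_key))
```

Keys generated with a recent ssh-keygen default to this format; older PEM-style keys can be converted before use.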

Load Data from Oracle Autonomous Database into Oracle Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle Fusion Analytics Warehouse extract service to acquire data from Oracle Autonomous Database and use it to create data augmentations.

You can create connections to up to five autonomous databases. Depending on the number of connections, ensure that options such as Oracle Autonomous Database 1 and Oracle Autonomous Database 2 are enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Analytics Warehouse, create the autonomous database connection using these instructions:
    1. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, depending on the number of connections, select an option such as Oracle Autonomous Database 1 or Oracle Autonomous Database 2 as the connection type.

      Oracle Autonomous Database connection option

    5. In the dialog for the Oracle Autonomous Database connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, credentials to access the database in User Name and Password, and the database service details in Service. In Wallet, drag and drop the database wallet details.

      Create Connection for Oracle Autonomous Database dialog

    6. Click Save.
  2. On the Manage Connections page, select the Actions menu for the autonomous database connection and then select Test Connection.
  3. On the Manage Connections page, select the Actions menu for the autonomous database connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for autonomous database unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the autonomous database data. Ensure that you select Regular as the source dataset type and autonomous database as the pillar. Select the applicable autonomous database source tables. See Augment Your Data.

Load Data from Oracle Object Storage into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Fusion Analytics Warehouse extract service to acquire data from Oracle Object Storage Service and use it to create data augmentations.

The recommended approach is to create one augmentation from one source table after acquiring data from Oracle Object Storage Service. After the augmentation completes, Fusion Analytics Warehouse renames the source table; if you create more than one augmentation from the same source, the other augmentations may fail with a message that the source file wasn't found.

Ensure that Oracle Object Storage Service is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Store the following details in a text file to use while creating the connection to Oracle Object Storage Service in Fusion Analytics Warehouse:
    1. In Oracle Object Storage Service, create the Remote Host Extract Files directory as the base folder in which you must place all your data files. Note down the name of this directory. See the "To create a folder or subfolder" section in Using the Console.
    2. Obtain the URL of the Oracle Object Storage Service by signing into the Oracle Cloud Infrastructure Console and navigating to the bucket to get the details of the region, namespace, and bucket name. For example, the URL must be in the https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<name of the bucket> format. See the "To view bucket details" section in Using the Console.
    3. Obtain a user’s OCID by navigating in the Oracle Cloud Infrastructure Console to Identity & Security, and then Users. On the Users page, search for a user who has access to the bucket used in the connector and copy the OCID. Obtain the tenancy ID by clicking your profile icon and then Tenancy in the Oracle Cloud Infrastructure Console. Under Tenancy information, copy the OCID. See Where to Get the Tenancy's OCID and User's OCID.
    4. Obtain the fingerprint for a user from the Oracle Cloud Infrastructure Console. Navigate to API Keys under Resources on the user page, and then click Add API Keys. In the Add API Keys dialog, ensure that Generate API Key Pair is selected. Download the private and public keys using the Download Private Key and Download Public Key options. You must copy the entire text of the private key along with the comments before and after the actual key. These comments could be as simple as "---------------Begin RSA Private Key --------" and "-----------End of RSA Private Key----------". Don't copy only the alphanumeric key without the header and footer comments. In the Add API Keys dialog, select Choose Public Key File to upload your file, or Paste Public Key if you prefer to paste it into a text box, and then click Add. Copy the fingerprint that you see after you upload the public key in the Console. It looks something like this: 12:34:56:78:90:ab:cd:ef:12:34:56:78:90:ab:cd:ef.
  2. In Fusion Analytics Warehouse, create the Oracle Object Storage connection using these instructions:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Oracle Object Storage Service as the connection type.
      Oracle Object Storage Service connection option
    5. In the dialog for the Oracle Object Storage Service connection, select Standard in Connectivity Type and enter these details:
      • Connection Name: Object Storage
      • Connection Type: Standard
      • Notification Email: An email address to receive notifications
      • Remote Host Extract Files Directory: Name of the base folder in which you must place all your data files in Oracle Object Storage Service
      • URL: URL of the Oracle Object Storage Service that you noted down in a text file
      • User ID: OCID of a user that has access to the applicable bucket in Oracle Object Storage Service
      • Finger Print: The fingerprint that you saw and copied after you uploaded the public key in the Console. It looks something like this: 12:34:56:78:90:ab:cd:ef:12:34:56:78:90:ab:cd:ef
      • Tenant ID: Tenancy in the Oracle Infrastructure Cloud Console that you noted down in the text file
      • Private Key: Paste the private key contents that you previously downloaded
      • File Type: csv
      • CSV Delimiter: Delimiter for the data files
      • CSV Date Format: Date format for the data files
      • CSV Timestamp Format: Timestamp format for the data files
      Create Connection for Oracle Object Storage Service dialog
    6. Click Save.
  3. In Oracle Object Storage Service:
    1. Create the folder structure in the Bucket using these guidelines:
      Base folder
      • The base folder in the bucket must match with the details provided in the connection.
      • Inside the base folder, ensure to place each file in its own folder.
      • Ensure that the file name prefix in each target folder exactly matches the data store name, which is the same as the folder name.

      See the "To create a folder or subfolder" section in Using the Console.

    2. Inside the base folder, create the metadata file for the Data Store List. This file lists the supported data stores. Each data store is a folder that has the actual file used in data augmentation, for example, ASSETS. Ensure that the file name and folder name match and that there aren't any special characters (including spaces) in the data store, folder, or file names.
      Base folder structure
    3. Create the metadata file for each data file under the data store folder using these guidelines:

      The META_DATASTORES.csv must have these columns:

      • DATA_STORE_NAME - A mandatory column to identify the data store name.
      • DATA_STORE_LABEL - A non-mandatory column that identifies the description of the data store.

      Each folder must have:

      • A data file that has the actual data that gets loaded into Oracle Fusion Analytics Warehouse. This file must have a prefix with the DATA STORE NAME.
      • A metadata file that lists all the column information for the data. This file must have a prefix of META_DATASTORES_<DATA_STORE_NAME>_COL.
        For the columns in this metadata, ensure the following:
        • If the column name is ABC, then the metadata can be ABC or "ABC" - the double quotes are ignored.
        • If the column name is "ABC", then the metadata must be ""ABC"" - the first double quotes are ignored.

      Example

      In the image, the folder name is ACTIVITY_TYPES. Hence, the data store name is ACTIVITY_TYPES. You can confirm this from the META_DATASTORES.csv file. In this example, the data file is named ACTIVITY_TYPES.xlsx or ACTIVITY_TYPES.csv. The metadata file must be META_DATASTORES_ACTIVITY_TYPES_COL.csv.
      Sample folder and metadata file

      The META_DATASTORES_ACTIVITY_TYPES_COL.csv has these columns:
      • DATA_STORE_NAME - This is a mandatory column.
      • COLUMN_NAME - This is a mandatory column.
      • COLUMN_LABEL - This is a non-mandatory column.
      • DATA_TYPE - This is a mandatory column.
      • WIDTH - This column identifies the string length.
      • PRECISION - This column value applies to the Numeric data type.
      • SCALE - This column value applies to the Numeric data type.
      • KEY_SEQUENCE - This is a mandatory column that identifies the primary key definition. If you're using a composite primary key, then use column order numbers as the values.
  4. In Fusion Analytics Warehouse, on the Manage Connections page, select the Actions menu for the Oracle Object Storage Service connection and then select Test Connection.
  5. On the Manage Connections page, select the Actions menu for the Oracle Object Storage Service connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for the Oracle Object Storage Service unless you perform a metadata extract.
  6. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the data from the Oracle Object Storage Service. Ensure that you select Regular as the source dataset type and Oracle Object Storage Service as the pillar. Select the applicable source tables from the Oracle Object Storage Service data. See Augment Your Data.
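
The folder and metadata-file naming rules above can be sketched as a small scaffolding helper. This is illustrative only, not an Oracle-provided utility; the sample store and column names are assumptions:

```python
import csv
import pathlib
import tempfile

def scaffold_data_store(base, store, columns):
    """Create the base-folder layout and metadata files described above:
    META_DATASTORES.csv at the base level, plus a per-store folder holding
    the data file and its META_DATASTORES_<DATA_STORE_NAME>_COL.csv file."""
    folder = base / store                        # folder name == data store name
    folder.mkdir(parents=True, exist_ok=True)

    # Data store list at the base folder level
    with open(base / "META_DATASTORES.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["DATA_STORE_NAME",
                                               "DATA_STORE_LABEL"])
        writer.writeheader()
        writer.writerow({"DATA_STORE_NAME": store, "DATA_STORE_LABEL": store})

    # Column metadata file, prefixed META_DATASTORES_<DATA_STORE_NAME>_COL
    with open(folder / ("META_DATASTORES_%s_COL.csv" % store), "w",
              newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["DATA_STORE_NAME", "COLUMN_NAME",
                                               "COLUMN_LABEL", "DATA_TYPE",
                                               "WIDTH", "PRECISION", "SCALE",
                                               "KEY_SEQUENCE"])
        writer.writeheader()
        for col in columns:                      # missing optional cells stay empty
            writer.writerow({"DATA_STORE_NAME": store, **col})

    # The data file carries the data store name as its prefix
    (folder / (store + ".csv")).touch()

base = pathlib.Path(tempfile.mkdtemp())
scaffold_data_store(base, "ACTIVITY_TYPES",
                    [{"COLUMN_NAME": "ACTIVITY_ID", "DATA_TYPE": "NUMBER",
                      "KEY_SEQUENCE": 1}])
print(sorted(p.name for p in base.rglob("*")))
```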

Load Data from On-premises JD Edwards into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises JD Edwards system and use the JD Edwards data to create data augmentations.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle Fusion Analytics Warehouse only once in 24 hours.
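
The once-in-24-hours constraint can be expressed as a simple eligibility check. This is illustrative only; the service enforces the constraint itself:

```python
from datetime import datetime, timedelta

def can_extract(last_run, now):
    """An extract is allowed if none has run yet or 24 hours have elapsed."""
    return last_run is None or now - last_run >= timedelta(hours=24)

print(can_extract(datetime(2024, 3, 1, 9, 0), datetime(2024, 3, 1, 18, 0)))  # too soon
print(can_extract(datetime(2024, 3, 1, 9, 0), datetime(2024, 3, 2, 9, 0)))   # allowed
```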

Ensure that Remote Agent and Oracle JD Edwards On-Prem are enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Download the remote agent docker image from here.
  2. Identify a host to deploy the remote agent.
    The host that you identify must meet these minimum system requirements for the basic configuration of a single source agent:
    • CPU: 4 (CORE/CPU)
    • Memory: 8 GB
    • Storage: 8 GB

    Note:

    The host must be able to make a JDBC connection to the JD Edwards database.
  3. Copy the docker image to the host and load it using this script:
    docker load -i <docker image zip>
    # List the images
    docker images
  4. Create and run the docker container using this script:
    docker run -d -p 9091:9091 --name remoteagent -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    If the remote agent user interface is not accessible, then run this script:

    sudo docker run -d -p 9091:9091 --name remoteagent --network host -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    Note:

    Ensure that the "logs" directory in /faw/logs/RemoteAgent/ has write permissions and that the "config" folder in /faw/software/remoteagent/config/ is present in case you need to add custom properties.
  5. If you have a firewall that is preventing you from accessing the remote agent, then ensure that you complete these steps before starting the docker container for the first time:
    1. Disable firewall in the Linux host where docker is deployed.
    2. Start the docker container.
    3. Check the logs to see if the container is started.
    4. Enable the firewall.
    5. Open the port using this script:
      sudo firewall-cmd --zone=public --add-port=9091/tcp --permanent
      sudo firewall-cmd --reload
      sudo iptables-save | grep 9091
  6. Verify that the container has started successfully using this script:
    docker ps
  7. Configure the extract service URL to connect using this information:
    1. Sign in to the remote agent user interface using http://<host>:9091/extractservice-remoteagent/index.html.
    2. Configure the extract service URL that the remote agent connects to and configure any outgoing proxies if required using the applicable extract service end points.
    3. Configure the JD Edwards data source outbound proxy in the remote agent user interface. You see the extract service URL. For example, you see http://<server IP>/extractservice.
  8. In the remote agent user interface, click Configure to configure the agent.
  9. Copy the configuration details from the text box or download the configuration details.
    You use it to set up the connection on the Data Configuration page in Oracle Fusion Analytics Warehouse.
  10. Configure the remote agent and JD Edwards data source on the Data Configuration page in Oracle Fusion Analytics Warehouse using these instructions:
    1. Open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote Agent as the connection type.
    5. In the Create Connection Remote Agent dialog, in Connection Name, you can modify the default name and verify that Remote is displayed in Connectivity Type.
    6. Enter an email address to receive notifications in Notification Email, and provide the Identifier and Host. In Public Key, click Upload File or Drop Above to fill in the details of the remote agent, and then click Save. You can add the configuration details file that you had downloaded, or use the configuration details that you had copied after configuring the remote agent.
    7. Navigate to the Manage Connections page, click Create and then click Connection.
    8. In Create Connection, select Oracle JD Edwards On-Prem as the connection type.
      Oracle JD Edwards On-Prem connection option

    9. In Create Connection for Oracle JD Edwards On-Prem, in Connectivity Type, verify that Remote is selected automatically.
      Create Connection for Oracle JD Edwards On-Prem

    10. In Remote Agent, select the remote agent connection that you created earlier, for example, Remote Agent.
    11. Enter an email address to receive notifications in Notification Email, provide credentials for your JD Edwards source in User Name and Password, and the URL of your JD Edwards source in URL.
    12. Confirm that you see the Remote Agent and JD Edwards connections on the Manage Connections page.
    13. On the Manage Connections page, select the Actions menu for the JD Edwards connection and then select Refresh Metadata.

      Note:

      You can’t create augmentations for JD Edwards unless you perform a metadata extract.
    14. Test both the connections by selecting the Test Connection option in the Actions menu. You can check the statuses of all these requests on the Data Configuration Request History page.
  11. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the JD Edwards data. Ensure that you select Regular as the source dataset type and JD Edwards as the pillar. Select the applicable JD Edwards source tables. See Augment Your Data.

Load Data from Amazon Simple Storage Service into Oracle Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle Fusion Analytics Warehouse extract service to acquire data from Amazon Simple Storage Service (AWS S3) and use it to create data augmentations.

Ensure that AWS S3 is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Analytics Warehouse, create the AWS S3 data connection using these instructions:
    1. In Oracle Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select AWS S3 as the connection type.

      AWS S3 connection option

    5. In the dialog for the AWS S3 connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide applicable details of your AWS S3.

      Create Connection for AWS S3 dialog

    6. Click Save.
  2. On the Manage Connections page, select the Actions menu for the AWS S3 connection and then select Test Connection.
  3. On the Manage Connections page, select the Actions menu for the AWS S3 connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for AWS S3 unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the AWS S3 data. Ensure that you select Regular as the source dataset type and AWS S3 as the pillar. Select the applicable AWS S3 source tables. See Augment Your Data.

Load Data from Snowflake into Oracle Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle Fusion Analytics Warehouse extract service to acquire data from a Snowflake instance.

You can later use this data to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases. Establish the connection from Fusion Analytics Warehouse to your Snowflake instance to begin acquiring the data and then augmenting it.

Ensure that Snowflake is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Analytics Warehouse, create the Snowflake data connection:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Snowflake as the connection type.
      Snowflake connection option
    5. In Create Connection, enter these details and then click Save:
      • Connectivity Type: Standard.
      • Notification Email: An email address to receive notifications.
      • Host Name: The complete host name of your Snowflake instance.
      • Table Schema: Your Snowflake table schema, such as TPCH_SF1.
      • Database: The database name, listed in your Snowflake account under Data.
      • Warehouse: The compute resources in your Snowflake instance; you can find the available warehouses by running SHOW WAREHOUSES [ LIKE '<pattern>' ]. See SHOW WAREHOUSES.
      • Private Key: If you don’t already have one, generate the private key in Snowflake and paste it here. See Generate the Private Key.

      Create Snowflake connection

  2. On the Manage Connections page, select the Actions menu for the Snowflake connection, and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Snowflake unless you perform a metadata extract.
  3. On the Manage Connections page, select the Actions menu for the Snowflake connection and then select Test Connection.
  4. After the connection is successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Snowflake data. Ensure that you select Regular as the source dataset type and Snowflake as the pillar. Select the applicable Snowflake source tables. See Augment Your Data.
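The Private Key field in the connection dialog expects a key created per Snowflake's key-pair authentication instructions (see Generate the Private Key). As a sketch of those documented openssl commands, assuming openssl is installed and using illustrative file names:

```shell
# Generate an unencrypted 2048-bit RSA private key in PKCS#8 PEM format;
# keep this file secure, and paste its contents into the Private Key field.
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt

# Derive the matching public key, which you register with your Snowflake
# user (for example, via ALTER USER ... SET RSA_PUBLIC_KEY in Snowflake).
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
```

Snowflake's documentation also describes generating an encrypted private key; the unencrypted variant above is the simpler case shown here.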

Load Data from Taleo into Fusion Analytics Warehouse (Preview)

As a service administrator, you can use the Fusion Analytics Warehouse extract service to acquire data from your Taleo instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Ensure that Taleo is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Analytics Warehouse, create the Taleo data connection using these instructions:
    1. In Fusion Analytics Warehouse, open the Navigator menu, click Console, and then click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Taleo as the connection type.

      Taleo connection option

    5. In Connectivity Type, select Standard, then enter an email address to receive notifications in Notification Email, the host name of your Taleo instance in Host Name, and the credentials for your Taleo source in User Name and Password.

      Create Taleo Connection dialog

    6. Click Save.
  2. On the Manage Connections page, select the Actions menu for the Taleo connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Taleo unless you perform a metadata extract.
  3. On the Manage Connections page, select the Actions menu for the Taleo connection and then select Test Connection.
  4. After the connection is successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Taleo data. Ensure that you select Regular as the source dataset type and Taleo as the pillar. Select the applicable Taleo source tables. See Augment Your Data.