2 Configure Oracle NetSuite Analytics Warehouse Data

You can set up data transfer using the NetSuite user interface, or use the Data Configuration tile on the Oracle NetSuite Analytics Warehouse Console.

To set up data transfer using the NetSuite user interface, see Setting Up the Data Transfer.

In Oracle NetSuite Analytics Warehouse, use the Data Configuration page to specify the data load and reporting configuration details, and create data pipelines for functional areas that determine how your NetSuite data is loaded and displayed in the data warehouse. To navigate to the Data Configuration page:
  1. Sign in to your NetSuite account.
  2. Click Setup, click Integration, and then click Configuration.
  3. On the NetSuite Analytics Warehouse Configuration page, click Log in to NetSuite Analytics Warehouse. You see the Oracle NetSuite Analytics Warehouse home page.
  4. On the home page, open the Navigator menu, click Console, and then click Data Configuration under Data Administration. You see the Data Configuration page on which you can perform all the tasks discussed in this section.

Typical Workflow to Configure Data

Follow these tasks as a guide to configure Oracle NetSuite Analytics Warehouse data.

  • Specify the pipeline parameters: Set up the pipeline parameters for your data model file before running your data pipelines for the functional areas. See Set Up the Pipeline Parameters.
  • Create a data pipeline for a functional area: To start analyzing your data, create data pipelines for the functional areas to copy data to the data warehouse. See Create a Data Pipeline for a Functional Area.
  • Activate a data pipeline for a functional area: You must activate the data pipeline for a functional area to run it and load data into the data warehouse. See Activate a Data Pipeline for a Functional Area.
  • Specify the reporting configurations: Set up the reporting configurations for your data model file after activating a data pipeline for a functional area. See Set Up the Reporting Configurations.
  • View request history: View the load request history by functional area, load type, and status. See View Load Request History.
  • Augment your data: Augment your reports by using datasets that you create with specific columns from various data stores. See Augment Your Data.

About Transaction Entities

Oracle NetSuite Analytics Warehouse is organized into logical business groupings called functional areas, which contain tables specific to the transaction types in each functional area.

Table Types

Each type of table follows a naming pattern for all functional areas.
  • <Transaction Type> represents a transaction type stripe. In the NetSuite universal module definition (UMD), the transaction table is striped by transaction type using the TYPE column, and NetSuite Analytics Warehouse creates one set of tables per stripe. A transaction type can be SalesOrd, CustInvc, and so on, but <Transaction Type> uses an expanded, readable format such as _SALES_ORDER_ or _CUSTOMER_INVOICE_ respectively.
  • <NS Table Name> represents a NetSuite table that stores dimensional data in the source.
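As a rough sketch of the naming convention above, the mapping below illustrates how a transaction type code from the TYPE column expands into the readable stripe used in a warehouse table name. The mapping and helper are illustrative only, not part of the product:

```python
# Hypothetical mapping of NetSuite transaction type codes (TYPE column of the
# transaction table) to the expanded stripes used in warehouse table names.
TYPE_STRIPES = {
    "SalesOrd": "SALES_ORDER",
    "CustInvc": "CUSTOMER_INVOICE",
}

def fact_table_name(transaction_type: str, lines: bool = False) -> str:
    """Build a warehouse fact table name from a transaction type code."""
    stripe = TYPE_STRIPES[transaction_type]
    suffix = "_LINES_F" if lines else "_F"
    return f"DW_NS_{stripe}{suffix}"

print(fact_table_name("SalesOrd"))              # DW_NS_SALES_ORDER_F
print(fact_table_name("CustInvc", lines=True))  # DW_NS_CUSTOMER_INVOICE_LINES_F
```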

Table 2-1 NetSuite Analytics Warehouse Table Types

  • Fact — Transaction Header: naming pattern DW_NS_<Transaction Type>_F; derived from the NetSuite source table transaction.
  • Fact — Transaction Lines: naming pattern DW_NS_<Transaction Type>_LINES_F; derived from the NetSuite source tables transaction, transactionline, and transactionaccountingline.
  • Fact — Snapshot: naming pattern DW_NS_<Transaction Type>_SNAPSHOT_F; derived from transaction, systemnote, transactionhistory, and other transactions that act as snapshot events for the transaction.
  • Dimensions: naming pattern DW_NS_<NS Table Name>_D; usually derived directly from the source table.
  • General Helper Table: naming pattern DW_NS_<NS Table Name>_G; usually derived directly from the source table.

Fact Denormalization

Fact — Transaction Lines tables are usually denormalized and contain additional attributes and dimension keys from the Fact — Transaction Header table to simplify reporting. For example, the Sales Order Lines fact contains status and trandate, which are denormalized from the Sales Order Header fact.

Fact — Transaction Header also contains dimension keys or attributes that are non-transactional in nature, taken from the mainline row of Fact — Transaction Lines in the NetSuite source tables.

Dimensions

The dimension keys used in a fact are usually named after the dimension table to which they join. For example, DW_NS_SALES_ORDER_F.CUSTOMER joins to DW_NS_CUSTOMER_D.ID.
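A minimal in-memory sketch of this join convention, using invented sample rows (the column names follow the convention above; the data and helper logic are illustrative, not product behavior):

```python
# Tiny made-up dimension and fact rows demonstrating that the fact's
# dimension key (CUSTOMER) joins to the dimension's ID column.
dw_ns_customer_d = [
    {"ID": 101, "COMPANYNAME": "Acme Corp"},
    {"ID": 102, "COMPANYNAME": "Globex"},
]

dw_ns_sales_order_f = [
    {"TRANSACTION": 32451, "CUSTOMER": 101, "FOREIGNTOTAL": 500.0},
    {"TRANSACTION": 32452, "CUSTOMER": 102, "FOREIGNTOTAL": 750.0},
]

# Equivalent of: fact.CUSTOMER joins to dimension.ID
customers = {row["ID"]: row for row in dw_ns_customer_d}
report = [
    {"order": f["TRANSACTION"],
     "customer": customers[f["CUSTOMER"]]["COMPANYNAME"],
     "total": f["FOREIGNTOTAL"]}
    for f in dw_ns_sales_order_f
]
print(report[0])  # {'order': 32451, 'customer': 'Acme Corp', 'total': 500.0}
```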

Dimensions can be common across functional areas and transaction types. For example, customer and item are some of the common dimensions in NetSuite Analytics Warehouse.

Dimension History

The dimensions used in NetSuite Analytics Warehouse are Type 1, meaning no change history is preserved. Any changes in the source are reflected in NetSuite Analytics Warehouse either incrementally (for high-volume dimensions) or in full update mode.

Base and Accounting Book Amounts for Transaction Amounts

Usually, transaction currency-based amount columns extracted from a source are converted to a subsidiary or base currency in NetSuite Analytics Warehouse fact lines tables. For example, foreignamount has a corresponding base_amount, estgrossprofit has a corresponding base_estgrossprofit, and so on.

Similarly, in line facts, the accounting book currency-based amounts are also converted into base and transaction currency when applicable. For example, accountingbook_credit and accountingbook_debit have base_creditamount and base_debitamount (in base or subsidiary currency) along with credit and debit (in transaction currency). The lines facts also contain the accountingbook_currency and base_currency keys as role-playing dimensions, which identify the currency used for these measure columns.

Header facts contain conversion for the base or subsidiary currency amounts only for corresponding transaction currency amounts, such as foreigntotal, which will have a corresponding base_total along with the base_currency key. Header facts contain the base_currency key as a role-playing dimension. Header facts don’t contain any accounting book amounts.
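As a rough illustration of the naming convention above, a hypothetical transaction-to-base currency conversion might look like the following. The exchange rate, amounts, and helper function are invented; the warehouse performs this conversion itself during the pipeline load:

```python
# Illustrative only: convert a transaction-currency amount (foreignamount)
# to the subsidiary's base currency (base_amount) using a made-up rate.
def to_base_amount(foreign_amount: float, exchange_rate: float) -> float:
    """Convert a transaction-currency amount to the base (subsidiary) currency."""
    return round(foreign_amount * exchange_rate, 2)

line = {"foreignamount": 100.0, "currency": "EUR", "base_currency": "USD"}
line["base_amount"] = to_base_amount(line["foreignamount"], exchange_rate=1.10)
print(line["base_amount"])  # 110.0
```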

Flags Attributes

Both fact and dimension tables in NetSuite Analytics Warehouse contain flag attributes that are useful for reporting purposes. The typical values in these columns are either T or F, indicating true or false, or alternatively Y or N, indicating yes or no.
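If you consume these flags downstream, a small normalization helper such as the hypothetical sketch below can map the documented T/F and Y/N values to booleans (the helper is illustrative, not part of the product):

```python
# Normalize the documented flag conventions (T/F for true/false, Y/N for
# yes/no) into Python booleans; anything else is rejected.
FLAG_TRUE = {"T", "Y"}
FLAG_FALSE = {"F", "N"}

def flag_to_bool(value: str) -> bool:
    v = value.strip().upper()
    if v in FLAG_TRUE:
        return True
    if v in FLAG_FALSE:
        return False
    raise ValueError(f"Unexpected flag value: {value!r}")

print(flag_to_bool("T"), flag_to_bool("N"))  # True False
```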

Snapshot Facts

NetSuite Analytics Warehouse contains snapshot facts for many transaction types. These tables track daily status and amount changes for a transaction type. The granularity of the snapshot tables is a combination of transaction identifier and date. Snapshots usually contain one row for each day from the transaction start date to the transaction close date. If the transaction is open on the ETL run date, then snapshots are built until one day before the ETL run date.

Delete Transactions

NetSuite Analytics Warehouse supports deletion of transactions. When a transaction is deleted in the source system, it is also deleted in NetSuite Analytics Warehouse after an incremental refresh.

Warehouse Table Grain Identification

The granularity of each table (fact, dimension, and so on) can be determined by checking the source_record_id column: when you filter on a single source_record_id value, the table should return only one record. As a convention, source_record_id is placed immediately after the columns that constitute the grain of the table. For example, for DW_NS_SALES_ORDER_LINES_F, the grain consists of transaction, id, and accountingbook, so source_record_id is placed as the fourth column, after those three grain columns.

The source_record_id column contains the concatenated, string-converted values of the grain columns. Hence, for the above example, DW_NS_SALES_ORDER_LINES_F.source_record_id = transaction~id~accountingbook (for example, 32451~2~1).
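The concatenation convention can be sketched as follows. The values and helper are illustrative; the column is populated by the warehouse, not by consumer code:

```python
# Build and split a source_record_id: grain column values converted to
# strings and joined with a tilde, per the convention described above.
def build_source_record_id(*grain_values) -> str:
    return "~".join(str(v) for v in grain_values)

# Grain of DW_NS_SALES_ORDER_LINES_F: transaction, id, accountingbook
srid = build_source_record_id(32451, 2, 1)
print(srid)  # 32451~2~1

transaction, line_id, accountingbook = srid.split("~")
print(transaction)  # 32451
```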

Extraction Dates and Load Volume Control

Both Fact — Transaction Header and Fact — Transaction Lines tend to be high-volume tables and may impact data extract performance. To control the extract data volume, you can set an initial extract date as part of the pipeline parameters during setup. This controls which data is extracted in the load. Incremental loads are controlled using the last modified date attribute in the transaction and transactionline tables. Many dimensions also extract data incrementally using the last modified date attribute, while some dimensions always extract in full; the latter are usually smaller dimensions.

Reporting Parameters

NetSuite Analytics Warehouse has one reporting parameter that needs to be set: subsidiary. The subsidiary parameter is used for consolidating transaction values from other subsidiaries' currencies to this reporting subsidiary currency.

About Data Pipelines for Functional Areas

Data pipelines for functional areas load data specific to a functional area into the data warehouse.

These pipelines hold configuration parameters specific to a functional area such as General Ledger functional area under Finance and Sales functional area under NetSuite.

Allowed Actions for Data Pipelines

Here's a list of the actions allowed for data pipelines when they are in various states. This information enables you to know which actions you can and can't perform when a data pipeline is in a particular state.

  • Saved: Edit and Delete. You can activate the data pipeline using the Edit wizard or delete it.
  • InActive: Edit, Activate, and Delete. You can activate the data pipeline using the Edit wizard or delete it.
  • Activation Scheduled: Edit and Delete. If you edit a scheduled data pipeline, then the previous data pipeline is unscheduled and a new data pipeline is scheduled.
  • Activation in Progress: View. You can view the data pipeline in read-only mode.
  • Activation Complete: Edit, Delete, Reset, Deactivate, and Refresh Data. You can perform these actions:
    • Edit only the Attribute Selection, Column Options, Entity Options, Dimension Keys, Schedule, and Save steps.
    • After the Edit action is completed, the data pipeline is ready for a Refresh Data or ad hoc run. An information icon is displayed.
    • Perform a reset, which marks the data pipeline for data reset and displays an information icon. The next incremental job performs the actual reset, or you can perform the Refresh Data action to reset the data immediately.
    • Refresh the data. If an incremental job isn't running, then the Refresh Data action initiates an ad hoc run immediately to refresh the data. It performs a full refresh if you edited the data pipeline or performed a reset earlier.
    • Deactivate the data pipeline to change its state to InActive.

About Data Refresh Performance

Oracle strives constantly to improve performance for data loading in pipelines.

The performance of loading data for your instance will vary. The time to complete data processing, both full warehouse loads and incremental data loads, depends on various factors. A data pipeline load includes the following:

  • Extracting data from the NetSuite source.
  • Loading the data into Oracle Autonomous Data Warehouse.
  • Transforming the data into the prebuilt schema.
The time to complete each of these steps is determined by various factors including:
  • The availability of the source system.
  • The size and complexity of the source data.
  • The activated functional areas.
  • Custom SQL queries that impact Oracle Autonomous Data Warehouse.
  • Your queries running concurrently.
  • Customizations made on the source system objects (which require a full load for those objects).
  • The patching of source and target systems.
For the daily data refresh and pipeline execution that update all the resulting key metrics and dashboards, the execution time depends on many factors, such as the amount of data being incrementally refreshed. To plan your downstream workflow needs, perform these actions:
  • Turn on the pipeline execution estimation using the Data Refresh Estimate preview feature on the Enable Features page. The estimated refresh completion details are visible on the Pipeline Settings page.
  • Trigger any subsequent operations based on the estimated refresh completion time.
  • Reach out to Oracle Support if the refresh isn't completed by the estimated completion time.

Data refresh is typically completed daily unless the data pipeline jobs are turned off or stuck. You can observe data loading times for your specific source and warehouse configuration to estimate the time it takes for an incremental daily refresh and for a full warehouse load. This information can help you plan for the optimal time in the day to start your daily data refresh. You may want to schedule the data load to run during off-peak hours, for example, run initial full warehouse loads during weekends and incremental loads during weeknights to ensure that users aren't impacted.

Preview

You can view the estimated refresh completion time for daily pipelines on the Pipeline Settings page in the Estimated Refresh Completion field as a Preview feature. This enables you to plan your tasks in the application.
Figure: Estimated Refresh Completion details on the Pipeline Settings page

About Pipeline Parameters

The pipeline parameters apply to all functional areas.

The pipeline parameters that apply at the functional area level are the initial extract date and the time to schedule the incremental job to run.
  • Initial Extract Date: The initial extract date is used when you extract data for a full load. Transactional data created after the initial extract date is processed and loaded into the warehouse, which reduces the initial data load volume. After extracting the data for a functional area, avoid changing the initial extract date. If you must change the initial extract date, then after changing it, reset the data warehouse and reactivate the functional areas. See Reset the Data Warehouse.
  • Data Refresh Schedule: Specify the frequency and time for the incremental data load. When specifying the timezone, the recommendation is to use city names so that daylight saving time is handled correctly. For example, instead of selecting a timezone such as EST or PST, select Europe/Bucharest or America/Los_Angeles. The data refresh process then calculates the value in the Time field based on local time, irrespective of daylight saving time.
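The reason city-based timezone names are recommended can be seen with Python's standard zoneinfo module: an IANA zone such as America/Los_Angeles tracks daylight saving time automatically, whereas a fixed abbreviation like PST always means UTC-8. The dates below are illustrative:

```python
# America/Los_Angeles shifts between UTC-8 (PST) and UTC-7 (PDT) across the
# year, which a fixed-offset zone name would not do.
from datetime import datetime
from zoneinfo import ZoneInfo

la = ZoneInfo("America/Los_Angeles")
winter = datetime(2024, 1, 15, 2, 0, tzinfo=la)
summer = datetime(2024, 7, 15, 2, 0, tzinfo=la)

print(winter.utcoffset())  # -1 day, 16:00:00  (UTC-8, PST)
print(summer.utcoffset())  # -1 day, 17:00:00  (UTC-7, PDT)
```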

Set Up the Pipeline Parameters

Set up the pipeline parameters for your data model file before running your data pipelines for the functional areas.

  1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration.
  2. On the Data Configuration page, under Configurations, click Pipeline Settings.
  3. Under Global Parameters, indicate the initial extract date from which to load the transaction data.
  4. Under Data Refresh Schedule, select the interval and depending on the selected interval, specify the time, day, and month when you want the incremental data load to happen.
  5. Click Save.

About Reporting Configurations

Set up the reporting configurations to specify how data is presented on the key metrics, workbooks, visualizations, dashboards, and reports in Oracle NetSuite Analytics Warehouse.

You must activate a data pipeline for a functional area to be able to set up the reporting configurations. Based on the functional area for which you've activated a data pipeline, you see the applicable reporting configurations. For example, if you've activated a data pipeline for a functional area in Oracle Fusion ERP Analytics, then you see the reporting configurations applicable for Oracle Fusion ERP Analytics.

About the Reporting Configuration for NetSuite Analytics Warehouse

You specify values for these parameters as you configure reporting for Oracle NetSuite Analytics Warehouse:

  • Subsidiary: This parameter controls the subsidiary or subsidiaries used for reporting. If you select parent, all the subsidiaries are covered in the reporting layer.

Set Up the Reporting Configurations

Specify the report parameters for your data model file.

  1. In Oracle NetSuite Analytics Warehouse Console, click Reporting Configuration.
  2. On the Reporting Configuration page, under Global Parameters, in Subsidiary, select a subsidiary to transfer data from that subsidiary.
  3. Click Save.

Create a Data Pipeline for a Functional Area

As a functional administrator, create a data pipeline for a functional area to copy data to the data warehouse. Use these steps to select and schedule runs to update the Oracle Autonomous Data Warehouse instance with the latest data.

Perform this task only after you have set up the pipeline and report parameters. You can schedule an activation of a functional area even if activation of another functional area is in progress. However, you can't update the activation of a functional area that's in progress.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click NetSuite Warehouse.
  4. On the service page, for example, the Data Configuration: Oracle NetSuite Analytics page, click Create.
  5. On the NetSuite Warehouse page, use the wizard to select your application areas to transfer data to the warehouse, and then click Next.
  6. Review the parameters and click one of the options:
    • Cancel: To cancel the data pipeline for the functional area.
    • Save: To save the data pipeline for the functional area but not activate it.
    • Activate: To schedule when to run the data pipeline for the functional area. See Activate a Data Pipeline for a Functional Area.
Oracle NetSuite Analytics Warehouse extracts data only from the initial extract date that you specify. For example, if the initial extract date is 10th March and you've added data on 9th March in your source, then the data added on 9th won’t be part of this data pipeline activation plan. You can see the data pipeline activation plan listed on the Data Configuration page.

Edit a Data Pipeline for a Functional Area

You can change any details of your data pipeline for a functional area prior to activating it.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click NetSuite Warehouse.
  4. On the service page, for example, the Data Configuration: Oracle NetSuite Analytics page, click the Action menu for the data pipeline for a functional area that you want to edit, and click Edit.
  5. Make the changes, review the changes, and then click Save or Activate.

Activate a Data Pipeline for a Functional Area

You must activate the data pipeline for a functional area to run it and load the data into the data warehouse.

Ensure that you don't activate a data pipeline for a functional area in the following situations:
  • Load in progress: If an incremental load is in progress.
  • An impending load: If an incremental load is scheduled to run in the next hour.
  • Exceeded the number of daily refresh requests: The maximum number of ad hoc data refresh requests for the day is four. If you've exceeded this number, then you can submit a request the following day.
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click NetSuite Warehouse.
  4. On the service page, for example, the Data Configuration: Oracle NetSuite Analytics page, click the Action menu for the saved data pipeline for the functional area that you want to activate, and click Edit.
  5. Review the details of the data pipeline for the functional area and then click Activate.
  6. In step 4 of the Data Configuration wizard, select Scheduled Execution Date to specify the date and time on which to run the data pipeline for the functional area, or select Run Immediately to create and run the data pipeline for the functional area immediately. Click Finish.
Oracle NetSuite Analytics Warehouse runs the data pipeline for the functional area, loads data into your data warehouse, and displays your data pipeline for the functional area on the Data Configuration page. Once data is successfully loaded, the system updates the status of the data pipeline for the functional area to Activation Completed.

Activate Value-added Datasets

Oracle NetSuite Analytics Warehouse provides ready-to-use value-added datasets that load data independently into Oracle NetSuite Analytics Warehouse and aren't dependent on activation of any other functional areas. You must specifically activate these datasets.

The value-added datasets use data from multiple functional areas. For example, the Inventory Snapshot dataset sources data from multiple transaction types belonging to functional areas such as Sales, Purchases, and Inventory.
  1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration.
  2. On the Data Configuration page, under Applications, on the NetSuite Warehouse tile, click the drop-down and select a category such as Oracle NetSuite Analytics for Finance and Supply Chain.
  3. On the Activations page, click Create.
  4. Select a value-added dataset such as Inventory Snapshot and then click Next.
  5. Review the parameters and click one of the options:
    • Cancel: To cancel the data pipeline for the functional area.
    • Save: To save the data pipeline for the functional area but not activate it.
    • Activate: To display the page for scheduling when to initiate activation of the functional area. Select Scheduled Execution Date to specify the date and time on which to run the data pipeline for the functional area. Select Run Immediately to create and run the data pipeline for the functional area immediately. After selecting your option, click Finish.

    Note:

    You can schedule an activation of a functional area even if activation of another functional area is in progress. However, you can't update the activation of a functional area that's in progress.

Deactivate a Data Pipeline for a Functional Area

You can deactivate all the incremental setup and jobs of a data pipeline for a functional area.

Deactivating a data pipeline for a functional area ensures that future incremental jobs don't select the specific functional area when the data pipeline runs. You can view the deactivated status of the data pipeline on the Data Configuration page.
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click NetSuite Warehouse.
  4. On the service page, for example, the Data Configuration: Oracle NetSuite Analytics page, click the Action menu for the active data pipeline for the functional area that you want to deactivate, and click Deactivate.
  5. Review the warning message and then click Deactivate.

Delete a Data Pipeline for a Functional Area

You can delete a data pipeline for a functional area prior to activating it.

Deleting a data pipeline for a functional area permanently removes it, so data refreshes of the warehouse won't occur.
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click NetSuite Warehouse.
  4. On the service page, for example, the Data Configuration: Oracle NetSuite Analytics page, click the Action menu for the data pipeline for the functional area that you want to delete, and click Delete.
  5. Review the warning message and then click Delete.

Refresh a Data Pipeline for a Functional Area

Refresh an activated data pipeline for a functional area to bring in new data for the selected functional area.

When you refresh the data pipeline, the system executes an incremental load immediately. The next scheduled run could take additional time to complete because of any new data. If you had reset a data pipeline for the functional area before refreshing data, then the system runs a full load immediately for that functional area.

Note:

Executing an on-demand data refresh while working on the data could cause temporary inconsistencies.
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click NetSuite Warehouse.
  4. On the service page, for example, the Data Configuration: Oracle NetSuite Analytics page, click the Action menu for the data pipeline for the functional area whose data you want to refresh, and then click Refresh Data.
  5. Review the warning message and then click Refresh Data.

Reload Data for a Data Pipeline

When you reload data for a data pipeline, you start the process of completely refreshing the data in the data warehouse for the functional area.

When you issue a request to reload data, the Request History page shows that the request is received and accepted. After you reload the data, the previous data still exists in the data warehouse related to the functional area. On the next data refresh, Oracle NetSuite Analytics Warehouse discards the existing data and loads new data. After reloading the data for the data pipeline, you see the data pipeline for the functional area on the Data Configuration page with the Completed status.

In a 24-hour period, you can reload the data for only four functional areas' data pipelines.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click NetSuite Warehouse.
  4. On the service page, for example, the Data Configuration: Oracle NetSuite Analytics page, click the Action menu for the data pipeline for the functional area that you want to reload, and click Reload Data.
  5. Review the warning message and then click Reload on Next Refresh.

Reset the Data Warehouse

Reset your data warehouse when your data source instance is refreshed from another environment, for example, when the environment is refreshed from production to a test environment. You may also want to reset the data warehouse when the data becomes corrupt.

After resetting the data warehouse, you must activate all of the data pipelines for the functional areas. See Activate a Data Pipeline for a Functional Area.

Note:

Your data isn't available until Oracle NetSuite Analytics Warehouse completes all the activations after resetting the data warehouse.
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Pipeline Settings.
  4. On the Pipeline Settings page, click Actions (Actions menu ellipsis icon) and then click Reset Data Warehouse.
  5. Review the warning message, provide the details, and then click Reset.

Reset the Cache

The data augmentation source column information is cached in the browser cache and is maintained for a week. If your source has new columns and they don't display in the Data Augmentation wizard, then you can reset the browser cache and retry data augmentation.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Pipeline Settings.
  4. On the Pipeline Settings page, click the Reset Cache menu option.
    You see a message that cache has been cleared successfully.

View Load Request History

You can view the data pipeline load request history by functional area, load type, and status along with other details for tracking and reference purposes.

The request history doesn't display incremental loads. Because the request history shows only past load requests, unscheduled loads don't affect the existing pipelines, and you can reactivate any functional area or augmentation.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Activity History, click Request History.
  4. On the Request History page, view the history of loads that have been performed.

About Request Types and Statuses

When you perform tasks on a functional area, you can review the different types of requests involved with those tasks in the Request History area.

Request Types

  • Batch Request: An operation that performs multiple functions, such as mapping attributes or importing and exporting objects.
  • Content Patch: A content patch upgrade run.
  • Full Load (Adhoc): A full load run that happens immediately upon request.
  • Full Load (Scheduled): A request to schedule a full load run.
  • Map Attribute: A NetSuite-specific request run.
  • Module Reset: A request to delete an active functional area or source table.
  • Refresh Data: The system ran a request to refresh data.
  • Reset: A request to refresh the data in the data warehouse for the functional area.
  • Reset Data Warehouse: A request to reset the warehouse. This reset deletes all the customer-added warehouse data.
  • Target Model Upgrade: A request to upgrade the target model to the latest available version.

Request Statuses

  • Activation Completed: The job ran successfully and is now complete.
  • Activation in Progress: The job is running or resolving.
  • Activation Scheduled: The job is scheduled to run within one minute, or at the specified date and time.
  • Deactivation Complete: The job is removed from Active status.
  • InActive: The job isn't saved or scheduled to run.
  • Received: The job request is submitted.
  • Saved: The job is saved but not scheduled to run.
  • Troubleshooting: The job is taking a long time and Oracle is investigating it.

View the Audit Log

You can view the data configuration-related actions that were performed such as activation of a data pipeline for a functional area, setting up of reporting parameters, and saving a data augmentation.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Activity History, click Audit Log.
  4. On the Audit Log page, view the list of all the performed actions.

About Augmenting Your Data

Augment your reports by choosing specific columns from various custom transaction objects.

The data pipeline in Oracle NetSuite Analytics Warehouse extracts and loads data from NetSuite’s standard transaction objects (standard records) into the warehouse. Use the Data Augmentation option on the Data Configuration page in Oracle NetSuite Analytics Warehouse to extract and populate the following custom transaction types as a custom dimension, custom fact, or an extended entity into the warehouse:
  • Custom attributes or records that reference a custom segment and custom list.
  • Custom attributes that don't reference other objects.
  • Custom references that don't reference other objects.

Data augmentation allows you to seamlessly extract and load data from these custom transaction types and make it readily available in tables populated in Oracle Autonomous Data Warehouse. It also extends the subject areas in the semantic model and makes the custom transaction types available for reporting.

Augment Your Data

Extract and load data from your custom transactions and make it readily available in tables populated in the autonomous data warehouse.

You can use the system provided or customer provided source tables that are the custom transaction objects that you created in NetSuite. The system provided tables are pre-validated by Oracle NetSuite Analytics Warehouse. The customer provided tables are other source tables that are available for extraction but aren’t validated by Oracle NetSuite Analytics Warehouse. As a user with the functional administrator or system administrator application role, you can allow usage of a particular table that isn’t pre-validated by Oracle NetSuite Analytics Warehouse. However, Oracle can't ensure the success of processing such custom tables or any performance impacts, such as delays in the daily refreshing of data.

The Dataset augmentation type isn't associated with any other augmentations. Based on the incremental schedule, the data in this dataset gets refreshed during the scheduled pipeline refresh. Unlike other augmentations, you can't mark its attributes as dimensions or measures. This dataset isn't associated with any subject area, because it simply copies the dataset from the source and creates a warehouse table. You can extend the semantic model after the table is created. Before you use this dataset to build joins or incorporate an object from it into your semantic model, run an incremental load, because the incremental load populates the dataset.

You see the data augmentation pipeline jobs on the Data Augmentation page with one of these statuses:
  • Activation in Progress - You can’t edit, delete, or schedule a data augmentation pipeline job while activation is in progress.
  • Activation Completed - You can edit the data augmentation to add or delete VO attributes and save the changes. You can’t modify the schedule in this status.
  • Activation Scheduled - You can edit the data augmentation to add VO attributes, save the changes while retaining the existing schedule, reschedule the execution date and time, or execute the plan immediately.

Note:

You can change the names of the columns that you've added from the various data sources in your data augmentation. If you later delete a data augmentation, you must wait for the daily incremental run to complete to see the change in the reports, cards, and decks.
  1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  2. On the Data Configuration page, under Configurations, click Data Augmentation.
  3. On the Data Augmentation page, click Create.
    You see the data augmentation wizard with multiple steps. You can create the following augmentation types:
    • Create a dimension.
    • Create a fact.
    • Create a dataset.
    • Extend a transaction entity.
  4. In step 1 of the wizard, select an augmentation type. Each augmentation type requires you to complete certain tasks.
  5. For the type of augmentation, if you select Create Dimension to add a new dimension in the target instance, follow these instructions:
    1. Select Custom Record in Source Dataset Type.
    2. Select NetSuite Warehouse in Pillar.
    3. In Source Table Type, specify the source table type using either of the options and then click Next:
      • Select System Provided and then in Source Table, select a table to which you want to add the new dimension.
      • Select Customer Provided and then in Source Table, enter the name of the table to which you want to add the new dimension.
    4. In step 2 of the wizard, in the Available Attributes pane, select the check box for the attributes that you want in your new dimension, and then click Add Selected.

      You see the attributes that you had selected in the Target attributes pane and the recommended (defined) primary key. You can either accept this key or override it with your own primary key definition. You can mark any columns as incremental to ensure that you get new data specific to the columns.

    5. Optionally, select Advanced to reorganize the order of columns that are marked as primary keys. To reorder, select a primary key row and then drag to define a suitable join order.
    6. Click Next.
    7. In step 6 of the wizard, provide the following details and click Finish to save and schedule your data augmentation pipeline job:
      1. Name your augmentation pipeline job; for example, Sales Order.
      2. Enter a suffix for the target table name using underscore in place of spaces between words and don’t use special characters; for example, Sales_Order_D. The augmentation process automatically creates the target table name.
      3. Provide a description.
      4. Select the functional area and one or multiple subject areas in which you want to include this augmentation pipeline job.
      5. Specify the options to save the data augmentation pipeline job without executing it, or schedule the execution date and time, or execute it immediately.
  6. For the type of augmentation, if you select Create Fact to add a new fact table in the target instance, then follow these instructions:
    1. Select Custom Record in Source Dataset Type.
    2. Select NetSuite Warehouse in Pillar.
    3. In Source Table Type, specify the source table type using either of the options and then click Next:
      • Select System Provided and then in Source Table, select a table from where you want to add the new fact table.
      • Select Customer Provided and then in Source Table, enter the name of the table from where you want to add the new fact table.
    4. In step 2 of the wizard, in the Available Attributes pane, select the check box for the attributes that you want in your new fact table, and then click Add Selected.

      You see the attributes that you had selected in the Target attributes pane and the recommended (defined) primary key. You can either accept this key or override it with your own primary key definition. You can mark any columns as incremental to ensure that you get new data specific to the columns.

    5. Optionally, select Advanced to reorganize the order of columns that are marked as primary keys and specify one of your incremental keys to use when determining the initial extract date.
    6. Click Next.
    7. In step 6 of the wizard, provide the following details and click Finish to save and schedule your data augmentation pipeline job:
      1. Name your augmentation pipeline job; for example, Sales Order Line.
      2. Enter a suffix for the target table name using underscore in place of spaces between words and don’t use special characters; for example, Sales_Order_F. The augmentation process automatically creates the target table name.
      3. Provide a description.
      4. Select the functional area and one or multiple subject areas in which you want to include this augmentation pipeline job.
      5. Specify the options to save the data augmentation pipeline job without executing it, or schedule the execution date and time, or execute it immediately.
  7. For the type of augmentation, if you select Dataset to copy a dataset as is into a target warehouse table, then follow these instructions:
    1. Select Supplemental Data in Source Dataset Type.
    2. In Source Table Type, specify the source table type using either of the options:
      • Select System Provided and then in Source Table, select a table for the attributes.
      • Select Customer Provided and then in Source Table, enter the name of the table for the attributes.

      Note:

      You can select your customer extensions, item extensions, vendor extensions, partner extensions, and employee extensions as source tables if you want to add attributes from these entity extensions.
    3. In Source Table, select the applicable table whose attributes you want to add into the dataset and then click Next.
    4. In step 2 of the wizard, select the check box for the attributes from the source table to add to the target table, and then click Next.
    5. In step 3 of the wizard, select the settings for the selected columns, and then click Next.
    6. In step 6 of the wizard, provide the following details and click Finish to save and schedule your data augmentation pipeline job:
      1. Provide a name and description for your augmentation.
      2. Enter a suffix for the target table name using underscore in place of spaces between words and don’t use special characters; for example, Customer_Deposit_EXT. The augmentation process automatically creates the target table name.
      3. Specify the options to save the data augmentation pipeline job without executing it, or schedule the execution date and time, or execute it immediately.
  8. For the type of augmentation, if you select Extend Entity to add attributes to dimensions or measures to facts, then follow these instructions:
    1. Select Custom Attribute in Source Dataset Type.
    2. Select NetSuite Warehouse in Pillar.
    3. In Source Table Type, specify the source table type using either of the options and then click Next:
      • Select System Provided and then in Source Table, select a table for the attributes.
      • Select Customer Provided and then in Source Table, enter the name of the table for the attributes.

      Note:

      You can select your customer extensions, item extensions, vendor extensions, partner extensions, and employee extensions as source tables if you want to add attributes from these entity extensions.
    4. In step 2 of the wizard, in the Available Attributes pane, select the check box for the attributes from the source table that you want in your target table, and then click Add Selected. You see the attributes that you had selected in the Target attributes pane and the recommended (defined) primary key. You can either accept this key or override it with your own primary key definition.
    5. Select Advanced to reorganize the order of columns that are marked as primary keys and specify a date or timestamp data type column as one of your incremental keys to use when determining the initial extract date.
    6. Click Next.
    7. In step 3 of the wizard, map the extension attributes to the warehouse entities by selecting the warehouse entity that is being extended. You see the primary column of the warehouse entity mapped with the attribute that you are extending.
    8. In step 6 of the wizard, provide the following details and click Finish to save and schedule your data augmentation pipeline job:
      1. Name your augmentation pipeline job; for example, Customer Deposit.
      2. Enter a suffix for the target table name using underscore in place of spaces between words and don’t use special characters; for example, Customer_Deposit_EXT. The augmentation process automatically creates the target table name.
      3. Provide a description.
      4. Select the functional area and one or multiple subject areas in which you want to include this augmentation pipeline job.
      5. Specify the options to save the data augmentation pipeline job without executing it, or schedule the execution date and time, or execute it immediately.

Create a Dimension Alias

Dimension aliases are alternative names for the warehouse dimension tables. You can specify alias names for Dimension type augmentations and for data that is already in the warehouse as dimensions. Alias names enable you to reuse the existing warehouse tables under different names in the subject areas.

You create dimension aliases as a type of augmentation. This is useful when you want to reuse a dimension for analysis. For example, if you want to reuse the Vendor dimension as a Shipper dimension as well, you can create an alias for Vendor.

You can also select complex custom attributes as dimension aliases and retain the name defined in NetSuite. Ensure that you enable Dimension Alias in the Custom Attribute Mapper section under the Preview Features tab on the Enable Features page.

When you create the aliases, Oracle NetSuite Analytics Warehouse doesn’t copy the data for the dimension aliases. These augmentations don’t have a pipeline status, but they are visible in the subject area.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Data Augmentation.
  4. On the Data Augmentation page, click Create, and select Dimension Alias.
  5. In the Add Dimension Alias dialog, in Source Dimension, select a dimension from the source tables such as Employee.
  6. In Dimension Alias, enter a name such as Shipper Dim.
    In Table Name, Oracle NetSuite Analytics Warehouse displays the name such as DW_FA_X_SHIPPER_DIM.
  7. In Subject Areas to expose alias, select one or more subject areas from the subject areas where the original source dimension exists.
  8. Add a description of the dimension alias.
  9. Click Save.
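Judging only from the Shipper Dim example above, the generated warehouse table name appears to uppercase the alias, replace spaces with underscores, and add a DW_FA_X_ prefix. This sketch illustrates that apparent convention; it is an assumption based on the single example, not a documented rule:

```shell
# Hypothetical sketch of the table name NSAW generates for a dimension
# alias, inferred from "Shipper Dim" -> DW_FA_X_SHIPPER_DIM.
alias_table_name() {
  printf 'DW_FA_X_%s\n' "$(printf '%s' "$1" | tr ' a-z' '_A-Z')"
}
```

For example, `alias_table_name 'Shipper Dim'` prints DW_FA_X_SHIPPER_DIM.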

About Mapping Custom Attributes

You can attach custom objects to transactions and transaction lines.

When you attach custom objects to transactions or transaction line types, you automatically create extension attributes that appear in Oracle Analytics Cloud reports and dashboards. The process is automated, so you don't need to create dimensions and extend entities.

Map Custom Attributes

Import custom attributes for transactions, transaction lines, transaction accounting lines, and dimension entities using the Custom Attributes Mapper.

Determine the source attributes in NetSuite you want to import before starting this procedure to simplify the mapping process. For the source transaction types or entities that you want to import, NetSuite Analytics Warehouse displays the business names, IDs, data types, and size of the associated custom attributes.
  1. In NetSuite Analytics Warehouse Console, click Data Configuration.
  2. On the Data Configuration page, under Applications, click Custom Attributes Mapper.
  3. On the Custom Attribute Map List page, click Create.
  4. Select the type of attribute you want to create, and the source transaction type or entity you want to import.
  5. Select the attributes you want to associate with the source transaction type.
  6. Click Save.
  7. Click Publish to make the attribute map available to users.

About Autonomous Data Warehouse Wallet and Administrator Credentials

You can download the autonomous data warehouse wallet and view or reset the warehouse administrator credentials.

Download ADW Wallet

The ADW Wallet contains certificate keys for accessing your Autonomous Data Warehouse (ADW) instance.

The client credentials .zip that you download contains the following files:

  • cwallet.sso - Oracle auto-login wallet
  • ewallet.p12 - PKCS #12 wallet file associated with the auto-login wallet
  • sqlnet.ora - SQL*Net profile configuration file that includes the wallet location and TNSNAMES naming method
  • tnsnames.ora - SQL*Net configuration file that contains network service names mapped to connect descriptors for the local naming method
  • Java Key Store (JKS) files - Key store files for use with JDBC Thin Connections
  • README - Contains information about database tools and resources

Caution:

Wallet files, along with the database user ID and password, provide access to data in your Autonomous Database. Store wallet files in a secure location. Share wallet files only with authorized users. If wallet files are transmitted in a way that might be accessed by unauthorized users (for example, over public email), transmit the wallet password separately and securely.
Download the wallet following these instructions:
  1. Click Warehouse on the Console.
  2. On the Warehouse page, under ADW Wallet, click Download Wallet.
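After downloading and unzipping the wallet, you can inspect its contents from the command line. This small helper lists the network service names defined in tnsnames.ora; it is a sketch that assumes each entry begins at the start of a line in the usual <name> = (description=... form:

```shell
# List the network service names defined in a tnsnames.ora file.
# Assumes each entry begins at column 0 as "<name> = (description=...".
tns_service_names() {
  grep -oE '^[A-Za-z0-9_.]+' "$1"
}
```

For example, `tns_service_names /path/to/wallet/tnsnames.ora` prints one service name per line.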

Reset Administrator Credentials

Administrator Credentials allow you to access and manage your Autonomous Data Warehouse through a SQL Developer Portal. You can view your administrator credentials or reset them.

Manage the administrator credentials following these instructions:
  1. Click Warehouse on the Console.
  2. On the Warehouse page, under Administrator Credentials, click Reset Credentials.

Disable Data Pipeline

As the cloud account administrator with the functional administrator or system administrator application role, you can disable the data pipeline and enable it again.

You may want to disable the data pipeline in the following scenarios:
  • If you don’t want to run pipeline jobs for a particular source.
  • If you don’t want to run pipeline jobs for a particular duration such as a quiet time in your business activities.

In cases where the pipeline jobs are failing due to issues, Oracle disables the data pipeline from the backend to investigate and resolve them. On the Pipeline Parameters page, Oracle provides a reason and an action for you if the resolution needs to be done on your side. After you resolve the issue, as an administrator you can enable the data pipeline yourself using the Data Pipeline Disabled toggle.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Pipeline Settings.
  4. On the Pipeline Settings page, under Data Pipeline, select the Data Pipeline Status toggle to Disabled and enter a reason in Details.
  5. Select the Data Pipeline Status toggle again to enable the data pipeline after the reason for disabling it no longer applies.

About Managing Data Connections

You can connect to a variety of data sources and remote applications to provide the background information for reports. You can blend the additional data from the various data sources with the prebuilt datasets to enhance business analysis.

Oracle NetSuite Analytics Warehouse can connect to other pre-validated data sources such as Oracle Object Storage, cloud applications such as Google Analytics, and on-premises applications such as Oracle E-Business Suite.

You can view the usage of capacity for custom data that is loaded into Oracle NetSuite Analytics Warehouse through the connectors in the Custom Data Usage dashboard available in the Common folder. The dashboard shows data loaded daily and monthly from each of the activated external data sources.
Custom Data Usage dashboard in the Common folder

Create a Data Connection Type

Connection Type specifies the source to which you are connecting. A connection type can have multiple connections.

You can create a custom data source type for any remote data connection.
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Create and then click Connection Type.
  5. In the Create Connection Type dialog, enter the Name, Identifier, and Prefix for warehouse for the connection type.
  6. Click Add Property and enter the parameters for each property that defines the connection.
  7. When you've finished adding the connection properties, you can reorder them as needed.
  8. Click Save.
The new connection type is available on the Manage Connections page under Connection Types.

Edit a Data Connection Type

If the properties or parameters for a data connection type change, you can edit them.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Connection Types and then click or search for the connection type you want to edit.
    You can't edit or delete Oracle-managed connections.
  5. Click the Action button next to the connection type you want to change.
  6. In the dialog box for the connection type, edit the details for your connection type, and then click Save.

Delete a Data Connection Type

You can delete a data connection type if you don't need it anymore.

Note:

After you delete a connection type, you can't create new data connections to it.
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Connection Types and then select or search for the connection type you want to delete.
  5. Click the Action menu for the connection type and select Delete.
  6. In the Delete Connection dialog box, click Delete.

Create a Data Connection

You create a data connection to a data source to load data from that source into Oracle NetSuite Analytics Warehouse. You can create a connection for any available connection type.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Create and then click Connection.
  5. In the Create Connection dialog, click or search for the connection type you want to create.
  6. In the dialog box for the connection, enter the details for your connection in the fields.
  7. Click Save.
The new connection is available on the Connections page.

Update a Data Connection

When you first make a data connection, or when you make changes, you need to initialize and refresh it.

  1. Sign in to the Oracle Cloud Infrastructure Console.
  2. In Oracle Cloud Infrastructure Console, click the Navigation menu icon in the top left corner.
  3. Click Analytics & AI. Under Analytics, click NetSuite Analytics Warehouse.
  4. Navigate to your service instances page.
  5. On the Instances page, click the instance for which you want to update the service.
  6. Click Connections, then select or search for the connection you want to test.
  7. Click the Action menu for the connection and select Initialize/Refresh Connection.

Test a Data Connection

After you create a data connection, you should test it to ensure it works properly.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Connections, then select or search for the connection you want to test.
  5. Click the Action menu for the connection and select Test Connection.
  6. On the Request History page, check the status of the request to test the connection.

Delete a Data Connection

You can delete a custom data connection if you don't need it anymore.

Ensure that you delete the functional areas, data augmentations, and custom data configurations related to the data connection before deleting it.

Note:

You can't update or load data from deleted data connections to the warehouse.
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Connections, then select or search for the connection you want to delete.
  5. Click the Action menu for the connection and select Delete.
  6. In the Delete Connection dialog box, click Delete.

Connect With On-premises Sources

Connect with your on-premises applications to provide the background information for reports.

You can blend the additional data from these data sources with the prebuilt datasets to enhance business analysis.

Set up the Remote Agent to Load Data into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises systems such as E-Business Suite, PeopleSoft, and JD Edwards, load data from these on-premises systems into Oracle NetSuite Analytics Warehouse, and then use the on-premises data to create data augmentations.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle NetSuite Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle NetSuite Analytics Warehouse only once in 24 hours.

Note:

After configuring the remote agent on the Data Configuration page, wait for a few minutes, refresh the remote agent page, and when you see the Agent Initialised message, you can proceed with other operations such as testing the connection to the remote agent, testing the connection to the remote source such as E-Business Suite, and refreshing the metadata. This enables you to run these jobs without timeout failures, because the data pipeline has a default timeout of 15 minutes.

Ensure that Remote Agent is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Download the remote agent Docker image from here.
  2. Identify a host to deploy the remote agent.
    The host that you identify must meet these minimum system requirements for the basic configuration of a single source agent:
    • CPU: 4 (CORE/CPU)
    • Memory: 8 GB
    • Storage: 8 GB

    Note:

    The host must be able to make a JDBC connection to the applicable database.
  3. Copy the Docker image to the host and load it using this script:
    docker load -i <docker image zip>
    # List the loaded images
    docker images
  4. Create and run the Docker container using this script:
    docker run -d -p 9091:9091 --name remoteagent -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    If the remote agent user interface isn't accessible, then run this script:

    sudo docker run -d -p 9091:9091 --name remoteagent --network host -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

    Note:

    Ensure that the logs directory in /faw/logs/RemoteAgent/ has write permissions and the config folder in /faw/software/remoteagent/config/ is present in case you need to add custom properties.
  5. If you've a firewall that prevents you from accessing the remote agent, then ensure that you complete these steps before starting the Docker container for the first time:
    1. Disable the firewall in the Linux host where Docker is deployed.
    2. Start the Docker container.
    3. Check the logs to see if the container is started.
    4. Enable the firewall.
    5. Open the port using this script:
      
      sudo firewall-cmd --zone=public --add-port=9091/tcp --permanent
      sudo firewall-cmd --reload
      sudo iptables-save | grep 9091
  6. Verify that the container has started successfully using this script:
    docker ps
  7. Configure the extract service URL to connect using this information:
    1. Sign in to the remote agent user interface using http://<host>:9091/extractservice-remoteagent/index.html.
    2. Configure the extract service URL that the remote agent connects to, and configure any outgoing proxies if required, using the applicable extract service end points. You can form the extract service URL based on your Oracle NetSuite Analytics Warehouse URL by replacing ui/oax/ with the extract service context path. For example, if your product URL is https://myinstance.example.com/ui/oax/, then the extract service URL is https://myinstance.example.com/extractservice.
  8. In the remote agent user interface, click Configure to configure the agent.
  9. Copy the configuration details from the text box or download the configuration details.
    You use it to set up the connection on the Data Configuration page in Oracle NetSuite Analytics Warehouse.
  10. Optional: If you need to upgrade the remote agent, then use the following script:
    
    # Stop the remote agent container
    sudo docker stop remoteagent

    # Remove the remote agent container
    sudo docker rm remoteagent

    # Load the image from the tar ball
    docker load -i <docker image zip>

    # Get the image ID
    sudo docker images

    # Run the image
    sudo docker run -d -p 9091:9091 --name remoteagent -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/:Z -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent:Z <imageid>
  11. Configure the remote agent on the Data Configuration page in Oracle NetSuite Analytics Warehouse using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote Agent as the connection type.
      Remote Agent connection option

    5. In the Create Connection Remote Agent dialog, in Connection Name, you can modify the default name and verify that Remote is displayed in Connectivity Type.
      Create Connection Remote Agent dialog

    6. Enter an email address to receive notifications in Notification Email, provide the Identifier and Host, in Public Key, click Upload File or Drop Above to fill in the details of the remote agent, and then click Save. You can add the configuration details file that you had downloaded or use the configuration details that you had copied after configuring the remote agent.
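The extract service URL derivation described in step 7 amounts to a one-line substitution on the product URL. A minimal sketch (the host name below is a placeholder):

```shell
# Derive the extract service URL from the NSAW product URL by replacing
# the trailing "ui/oax/" context path with "extractservice".
extract_service_url() {
  printf '%s\n' "${1%ui/oax/}extractservice"
}
```

For example, `extract_service_url 'https://myinstance.example.com/ui/oax/'` prints https://myinstance.example.com/extractservice.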
Load Data from On-premises E-Business Suite into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises Oracle E-Business Suite system.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle NetSuite Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle NetSuite Analytics Warehouse only once a day.

Ensure that Oracle E-Business Suite On-Prem is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your on-premises Oracle E-Business Suite system into Oracle NetSuite Analytics Warehouse.
  2. Configure the remote agent and E-Business Suite data source on the Data Configuration page in Oracle NetSuite Analytics Warehouse using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote as the remote agent connection type.

      Note:

      The "Remote" and "EBS" connection types are prebuilt.
    5. Click Upload File or drop the configuration file to fill in the details of the remote agent. You can add the configuration details file that you had downloaded or use the configuration details that you had copied after configuring the remote agent. See Create a Data Connection.
    6. In Create Connection, select Oracle E-Business Suite as the connection type.
      E-Business Suite connection option

    7. In Create Connection for Oracle E-Business Suite On-Prem, select Remote as the connectivity type.
      Create Connection for Oracle E-Business Suite On-Prem dialog

    8. In the Remote Agent field, select the remote agent connection that you created, for example, EBS-Remote Agent. In Notification Email, enter an email address to receive notifications. Provide the credentials in User Name and Password, and in URL, enter the E-Business Suite connection in JDBC format, such as jdbc:oracle:thin:@<HOST>:<PORT>/<DB_NAME/SID_NAME>. In Offerings, you must select the Application Object Library and General Ledger offerings, along with any other E-Business Suite offerings that you want to load data from.
    9. Confirm that you see the Remote Agent and E-Business Suite connections on the Manage Connections page.
    10. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the E-Business Suite connection and then select Refresh Metadata.

      Note:

      You can’t create augmentations for E-Business Suite unless you perform a metadata extract.
    11. Test both connections by selecting the Test Connection option in Actions. You can check the statuses of all these requests on the Data Configuration Request History page.
  3. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the E-Business Suite data. Ensure that you select Regular as the source dataset type and EBS as the pillar. Select the applicable E-Business Suite source tables. See Augment Your Data.
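As a sketch of the JDBC URL format expected in the URL field, the pieces assemble like this; the host, port, and service name below are placeholders, not values from any real environment:

```python
# Hypothetical values; substitute your E-Business Suite database details.
host, port, service = "ebs.example.com", 1521, "EBSDB"

# jdbc:oracle:thin:@<HOST>:<PORT>/<DB_NAME/SID_NAME>
url = f"jdbc:oracle:thin:@{host}:{port}/{service}"
print(url)  # jdbc:oracle:thin:@ebs.example.com:1521/EBSDB
```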
Load Data from On-premises JD Edwards into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises JD Edwards system and use the JD Edwards data to create data augmentations.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle NetSuite Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle NetSuite Analytics Warehouse only once in 24 hours.

Ensure that Remote Agent and Oracle JD Edwards On-Prem are enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your on-premises JD Edwards system into Oracle NetSuite Analytics Warehouse.
  2. Configure the remote agent and JD Edwards data source on the Data Configuration page in Oracle NetSuite Analytics Warehouse using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote Agent as the connection type.
    5. In the Create Connection Remote Agent dialog, in Connection Name, you can modify the default name and verify that Remote is displayed in Connectivity Type.
    6. In Notification Email, enter an email address to receive notifications. Provide the Identifier and Host, and in Public Key, click Upload File or Drop Above to provide the remote agent details, and then click Save. You can add the configuration details file that you downloaded, or use the configuration details that you copied, after configuring the remote agent.
    7. Navigate to the Manage Connections page, click Create and then click Connection.
    8. In Create Connection, select Oracle JD Edwards On-Prem as the connection type.
      Oracle JD Edwards On-Prem connection option

    9. In Create Connection for Oracle JD Edwards On-Prem, in Connectivity Type, verify that Remote is selected automatically.
      Create Connection for Oracle JD Edwards On-Prem

    10. In Remote Agent, select the remote agent connection that you created earlier, for example, Remote Agent.
    11. Enter an email address to receive notifications in Notification Email, provide credentials for your JD Edwards source in User Name and Password, and the URL of your JD Edwards source in URL.
    12. Confirm that you see the Remote Agent and JD Edwards connections on the Manage Connections page.
    13. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the JD Edwards connection and then select Refresh Metadata.

      Note:

      You can’t create augmentations for JD Edwards unless you perform a metadata extract.
    14. Test both connections by selecting the Test Connection option in Actions. You can check the statuses of all these requests on the Data Configuration Request History page.
  3. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the JD Edwards data. Ensure that you select Regular as the source dataset type and JD Edwards as the pillar. Select the applicable JD Edwards source tables. See Augment Your Data.
Load Data from On-premises PeopleSoft into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises Oracle PeopleSoft system.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle NetSuite Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle NetSuite Analytics Warehouse only once in 24 hours.
Ensure that Remote Agent and, depending on the functional module you want to connect to, the applicable feature are enabled on the Enable Features page prior to creating this connection:
  • Oracle PeopleSoft On-Prem - Campus Solutions
  • Oracle PeopleSoft On-Prem - Financials
  • Oracle PeopleSoft On-Prem - Human Resources
  • Oracle PeopleSoft On-Prem - Learning Management
See Make Preview Features Available.
  1. Set up the remote agent to load data from your on-premises Oracle PeopleSoft system into Oracle NetSuite Analytics Warehouse.
  2. Configure the remote agent and PeopleSoft data source on the Data Configuration page in Oracle NetSuite Analytics Warehouse using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote Agent as the connection type.
    5. In the Create Connection Remote Agent dialog, in Connection Name, you can modify the default name and verify that Remote is displayed in Connectivity Type.
    6. In Notification Email, enter an email address to receive notifications. Provide the Identifier and Host, and in Public Key, click Upload File or Drop Above to provide the remote agent details, and then click Save. You can add the configuration details file that you downloaded, or use the configuration details that you copied, after configuring the remote agent.
    7. Navigate to the Manage Connections page, click Create and then click Connection.
    8. In Create Connection, select the connection type based on the functional module that you want to connect to. For example, to connect to the "Financials" module, select Oracle PeopleSoft On-Prem - Financials as the connection type.
      Oracle PeopleSoft On-Prem - Financials connection option

    9. In the Create Connection for Oracle PeopleSoft On-Prem - Financials dialog, in Connectivity Type, verify that Remote is selected automatically.
      Create Connection for Oracle PeopleSoft On-Prem - Financials dialog

    10. In Remote Agent, select the remote agent connection that you created earlier, for example, Remote Agent.
    11. Enter an email address to receive notifications in Notification Email, provide credentials for your PeopleSoft source in User Name and Password, and the URL of your PeopleSoft source in URL.
    12. Confirm that you see the Remote Agent and PeopleSoft connections on the Manage Connections page.
    13. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the PeopleSoft connection and then select Refresh Metadata.

      Note:

      You can’t create augmentations for PeopleSoft unless you perform a metadata extract.
    14. Test both connections by selecting the Test Connection option in Actions. You can check the statuses of all these requests on the Data Configuration Request History page.
  3. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the PeopleSoft data. Ensure that you select Regular as the source dataset type and PSFT as the pillar. Select the applicable PeopleSoft source tables. See Augment Your Data.
Load Data from On-premises MySQL Database into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises MySQL database.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle NetSuite Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle NetSuite Analytics Warehouse only once a day.

Ensure that MySQL On-Prem is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your on-premises MySQL database into Oracle NetSuite Analytics Warehouse.
  2. Check the port number for your on-premises MySQL database and create a service request with server host and port details to enable network connectivity to the on-premises MySQL server.
  3. Configure the remote agent and on-premises MySQL database on the Data Configuration page in Oracle NetSuite Analytics Warehouse by following these steps:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select MySQL On-Prem as the connection type.
      MySQL On-Prem connection option

    5. In Connectivity Type, select Remote.
    6. In the Remote Agent field, select the remote agent connection that you created, for example, MySQL-Remote Agent. Enter an email address to receive notifications in Notification Email, and provide these details:
      • Host Name: Enter the host name or IP address of the MySQL server, such as 100.111.252.64
      • Port Number: Enter the port number on which the server is listening, such as 3306
      • Database: Enter the name of the database that you need to connect to, such as airportdb
      • Credentials to access the database in User Name and Password
      • Last Update Date Column Pattern: Enter "%r%a%o%%"

        Note:

        If a column name matches the provided pattern, then NetSuite Analytics Warehouse marks that column with isLastUpdateDate = true and uses it for the incremental extract. For example, if the provided pattern is “%mo%fie%te%”, then the column name modifiedDate is marked with isLastUpdateDate = true.
      • Initial Extract Date Column Pattern: Enter "%e%art%new"

        Note:

        If a column name matches the provided pattern, then NetSuite Analytics Warehouse marks that column with isCreationDate = true and uses it for the initial extract date extraction. For example, if the provided pattern is “%cr%ted%te%”, then the column name createdDate is marked with isCreationDate = true.

      Create Connection for MySQL On-Prem dialog

    7. Confirm that you see the Remote Agent and on-premises MySQL database connections on the Manage Connections page.
    8. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the on-premises MySQL database connection and then select Refresh Metadata.

      Note:

      You can’t create augmentations for the on-premises MySQL database unless you perform a metadata extract.
    9. Test both connections by selecting the Test Connection option in Actions. You can check the statuses of all these requests on the Data Configuration Request History page.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the on-premises MySQL database data. Ensure that you select Regular as the source dataset type and the pillar applicable to your on-premises MySQL source. Select the applicable on-premises MySQL database source tables. See Augment Your Data.
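The column patterns above use SQL LIKE-style "%" wildcards. As a minimal sketch of the matching behavior (assuming case-insensitive comparison, which the documentation doesn't state), the rule can be modeled like this:

```python
import re

def column_matches(pattern: str, column_name: str) -> bool:
    # "%" matches any run of characters; everything else is literal.
    rx = ".*".join(re.escape(part) for part in pattern.split("%"))
    return re.fullmatch(rx, column_name, re.IGNORECASE) is not None

# The documented example: "%mo%fie%te%" marks modifiedDate as isLastUpdateDate.
print(column_matches("%mo%fie%te%", "modifiedDate"))  # True
print(column_matches("%mo%fie%te%", "orderNumber"))   # False
```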
Load Data from SQL Server into Oracle NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle NetSuite Analytics Warehouse extract service to acquire data from SQL Server and use it to create data augmentations.

Ensure that SQL Server is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your SQL Server into Oracle NetSuite Analytics Warehouse.
  2. In Oracle NetSuite Analytics Warehouse, create the SQL Server data connection using these instructions:
    1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select SQL Server as the connection type.

      SQL Server connection option

    5. In the dialog for the SQL Server connection, enter these details and click Save:
      • Connectivity Type: Select Remote.
      • Remote Agent: Select the remote agent that you had set up to load data from your SQL Server.
      • Notification Email: Enter an email address to receive notifications.
      • SQL Server Name: Enter the SQL Server name.
      • Port Number: Enter the port number on which your SQL Server is available.
      • Database Name: Enter the name of the database in your SQL Server instance.
      • Schema Name: Enter the name of the schema for the dataset that you want to load to run analytics on.
      • User Name and Password: Enter the credentials for your SQL Server instance.
      • Initial Extract Date Column Pattern: Enter the date format used in your initial extract date column, such as MM/dd/yyyy; a sample date is 1/23/1998.
      • Last Update Date Column Pattern: Enter the pattern of the last update date column, which indicates when the last update occurred in your SQL Server database.

      Create Connection for SQL Server dialog

    6. Click Save.
  3. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the SQL Server connection and then select Test Connection.
  4. On the Manage Connections page, select Actions for the SQL Server connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for SQL Server unless you perform a metadata extract.
  5. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the SQL Server data. Ensure that you select Regular as the source dataset type and SQL Server as the pillar. Select the applicable SQL Server source tables. See Augment Your Data.
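For instance, the stated MM/dd/yyyy pattern with sample date 1/23/1998 corresponds to Python's %m/%d/%Y; the translation into strftime notation is an assumption for illustration only:

```python
from datetime import datetime

# MM/dd/yyyy in the documentation's notation ~= %m/%d/%Y in strftime terms.
d = datetime.strptime("1/23/1998", "%m/%d/%Y")
print(d.date())  # 1998-01-23
```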

Connect with Cloud File Storage Sources

Connect with your file storage-based cloud sources to provide the background information for reports.

You can blend the additional data from these data sources with the prebuilt datasets to enhance business analysis.

About Date and Timestamp Formatting for CSV File-based Extractors

Extractors such as Secure FTP (SFTP), Amazon Simple Storage Service (AWS S3), and Oracle Object Storage Service use CSV data files that have date and timestamp fields.

For the CSV file-based extractors, use the format examples to provide the values in the CSV Date Format and CSV Timestamp Format fields while entering the source connection details.

Note:

Ensure that the date and timestamp formats for the data files match the date and timestamp formats in your source; for example, if you've used MM/dd/yyyy and MM/dd/yyyy hh:mm:ss in your source, then you must specify the same formats while creating the applicable data connections.
Examples:

  Example                        Pattern
  1/23/1998                      MM/dd/yyyy
  1/23/1998 12:00:20             MM/dd/yyyy hh:mm:ss
  12:08 PM                       h:mm a
  01-Jan-1998                    dd-MMM-yyyy
  2001-07-04T12:08:56.235-0700   yyyy-MM-dd'T'HH:mm:ss.SSSZ
The guidelines to define the format are:

  Letter   Meaning
  M        Month
  d        Day
  y        Year
  h        Hour (1-12)
  H        Hour (0-23)
  m        Minute
  s        Second
  S        Millisecond
  a        AM/PM
  Z        Time zone
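These pattern letters follow Java SimpleDateFormat-style conventions. A minimal sketch that translates them into Python strftime directives, so you can sanity-check sample values locally (the token-to-directive mapping is an assumption for illustration, not part of the product):

```python
import re
from datetime import datetime

# Longest-match translation of the documented pattern letters.
TOKENS = {
    "yyyy": "%Y", "MMM": "%b", "MM": "%m", "M": "%m",
    "dd": "%d", "d": "%d", "HH": "%H", "hh": "%I", "h": "%I",
    "mm": "%M", "ss": "%S", "SSS": "%f", "a": "%p", "Z": "%z",
}
_rx = re.compile("|".join(sorted(TOKENS, key=len, reverse=True)))

def to_strftime(pattern: str) -> str:
    # Drop the quoted literal 'T' used in ISO-style timestamp patterns.
    return _rx.sub(lambda m: TOKENS[m.group(0)], pattern.replace("'T'", "T"))

print(to_strftime("MM/dd/yyyy"))   # %m/%d/%Y
print(to_strftime("dd-MMM-yyyy"))  # %d-%b-%Y
print(datetime.strptime("1/23/1998", to_strftime("MM/dd/yyyy")).day)  # 23
```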
Load Data from Amazon Simple Storage Service into Oracle NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle NetSuite Analytics Warehouse extract service to acquire data from Amazon Simple Storage Service (AWS S3) and use it to create data augmentations.

Ensure that AWS S3 is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle NetSuite Analytics Warehouse, create the AWS S3 data connection using these instructions:
    1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select AWS S3 as the connection type.

      AWS S3 connection option

    5. In the dialog for the AWS S3 connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide applicable details of your AWS S3.

      Create Connection for AWS S3 dialog

    6. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the AWS S3 connection and then select Test Connection.
  3. On the Manage Connections page, select Actions for the AWS S3 connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for AWS S3 unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the AWS S3 data. Ensure that you select Regular as the source dataset type and AWS S3 as the pillar. Select the applicable AWS S3 source tables. See Augment Your Data.
Load Data from Oracle Object Storage into NetSuite Analytics Warehouse

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from Oracle Object Storage Service and use it to create data augmentations.

The recommended approach is to create one augmentation from one source table after acquiring data from Oracle Object Storage Service. After an augmentation completes, NetSuite Analytics Warehouse renames the source table; if you create more than one augmentation from the same source, the other augmentations may fail with a message that the source file wasn't found.

  1. Store the following details in a text file to use while creating the connection to Oracle Object Storage Service in NetSuite Analytics Warehouse:
    1. In Oracle Object Storage Service, create the Remote Host Extract Files directory as the base folder in which you must place all your data files. Note down the name of this directory. See the "To create a folder or subfolder" section in Using the Console.
    2. Obtain the URL of the Oracle Object Storage Service by signing into the Oracle Cloud Infrastructure Console and navigating to the bucket to get the details of the region, namespace, and bucket name. For example, the URL must be in the https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<name of the bucket> format. See the "To view bucket details" section in Using the Console.
    3. Obtain a user’s OCID by navigating in the Oracle Cloud Infrastructure Console to Identity & Security, and then Users. On the Users page, search for a user who has access to the bucket used in the connector and copy the OCID. Obtain the tenancy ID by clicking your profile icon and then Tenancy in the Oracle Cloud Infrastructure Console. Under Tenancy information, copy the OCID. See Where to Get the Tenancy's OCID and User's OCID.
    4. Obtain the fingerprint for a user from the Oracle Cloud Infrastructure Console. Navigate to API Keys under Resources on the user page, and then click Add API Keys. In the Add API Keys dialog, ensure that Generate API Key Pair is selected. Download the private and public keys using the Download Private Key and Download Public Key options. You must copy the entire text of the private key along with the comments before and after the actual key. These comments could be as simple as: “---------------Begin RSA Private Key --------” and “-----------End of RSA Private Key----------“. Don’t copy only the alphanumeric key without the header and footer comments. In the Add API Keys dialog, select Choose Public Key File to upload your file, or Paste Public Key, if you prefer to paste it into a text box, and then click Add. Copy the fingerprint that you see after you upload the public key in the Console. It looks something like this: 12:34:56:78:90:ab:cd:ef:12:34:56:78:90:ab:cd:ef.
  2. In NetSuite Analytics Warehouse, create the Oracle Object Storage connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Oracle Object Storage Service as the connection type.
      Oracle Object Storage Service connection option
    5. In the dialog for the Oracle Object Storage Service connection, select Standard in Connectivity Type and enter these details:
      • Connection Name: Object Storage
      • Connection Type: Standard
      • Notification Email: An email address to receive notifications
      • Remote Host Extract Files Directory: Name of the base folder in which you must place all your data files in Oracle Object Storage Service
      • URL: URL of the Oracle Object Storage Service that you noted down in a text file
      • User ID: OCID of a user that has access to the applicable bucket in Oracle Object Storage Service
      • Finger Print: The fingerprint that you saw and copied after you uploaded the public key in the Console. It looks something like this: 12:34:56:78:90:ab:cd:ef:12:34:56:78:90:ab:cd:ef
      • Tenant ID: The tenancy OCID from the Oracle Cloud Infrastructure Console that you noted down in the text file
      • Private Key: Paste the private key contents that you previously downloaded
      • File Type: csv
      • CSV Delimiter: Delimiter for the data files
      • Date format for the data files must match the date format in your Oracle Object Storage Service source; for example, if you've used MM/dd/yyyy (01/23/1998) in your source, then you must specify the same format in CSV Date Format. See About Date and Timestamp Formatting for CSV File-based Extractors.
      • Timestamp format for the data files must match the timestamp format in your Oracle Object Storage Service source; for example, if you've used MM/dd/yyyy hh:mm:ss (01/23/1998 12:00:20) in your source, then you must specify the same format in CSV Timestamp Format
      Create Connection for Oracle Object Storage Service dialog
    6. Click Save.
  3. In Oracle Object Storage Service:
    1. Create the folder structure in the Bucket using these guidelines:
      Base folder
      • The base folder in the bucket must match with the details provided in the connection.
      • Inside the base folder, ensure that you place each file in its own folder.
      • Ensure that the prefix of the files in the target folder exactly matches the data store name (which is the same as the folder name).

      See the "To create a folder or subfolder" section in Using the Console.

    2. Inside the base folder, create the metadata file for the Data Store List. This file lists the supported data stores. Each data store is a folder that has the actual file used in data augmentation, for example, ASSETS. Ensure that the file name and folder name match and that there aren’t any special characters (including spaces) in the data store, folder, or file names.
      Base folder structure
    3. Create the metadata file for each data file under the data store folder using these guidelines:

      The META_DATASTORES.csv must have these columns:

      • DATA_STORE_NAME - A mandatory column to identify the data store name.
      • DATA_STORE_LABEL - A non-mandatory column that identifies the description of the data store.

      Each folder must have:

      • A data file that contains the actual data that gets loaded into NetSuite Analytics Warehouse. The file name must be prefixed with the data store name.
      • A metadata file that lists the columns and contains all the column information for the data. The file name must be prefixed with META_DATASTORES_<DATA_STORE_NAME>_COL. For the columns in this metadata, ensure the following:
        • If the column name is ABC, then the metadata can list it as ABC or "ABC"; the double quotes are ignored.
        • If the column name is "ABC" (including the quotes), then the metadata must list it as ""ABC""; the first set of double quotes is ignored.

      Example

      In the image, the folder name is ACTIVITY_TYPES. Hence, the data store name is ACTIVITY_TYPES. You can confirm this from the META_DATASTORES.csv file. In this example, the file is named ACTIVITY_TYPES.xlsx or ACTIVITY_TYPES.csv. The metadata file must be META_DATASTORES_ACTIVITY_TYPES_COL.csv.
      Sample folder and metadata file

      The META_DATASTORES_ACTIVITY_TYPES_COL.csv has these columns:
      • DATA_STORE_NAME - This is a mandatory column.
      • COLUMN_NAME - This is a mandatory column.
      • COLUMN_LABEL - This is a non-mandatory column.
      • DATA_TYPE – This is a mandatory column.
      • WIDTH – This column identifies the string length.
      • PRECISION - This column specifies the precision for the Numeric data type.
      • SCALE - This column specifies the scale for the Numeric data type.
      • KEY_SEQUENCE - This is a mandatory column that identifies the Primary Key definition. If you’re using the composite primary key, then use column order numbers as values.
  4. In NetSuite Analytics Warehouse, on the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Oracle Object Storage Service connection and then select Test Connection.
  5. On the Manage Connections page, select Actions for the Oracle Object Storage Service connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for the Oracle Object Storage Service unless you perform a metadata extract.
  6. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the data from the Oracle Object Storage Service. Ensure that you select Regular as the source dataset type and Oracle Object Storage Service as the pillar. Select the applicable source tables from the Oracle Object Storage Service data. See Augment Your Data.
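The base-folder layout and metadata files described in step 3 can be sketched as follows. This is an illustrative example for a single hypothetical data store named ACTIVITY_TYPES; all folder names, file names, and column values are placeholders:

```python
import csv
import os
import tempfile

# Base folder (Remote Host Extract Files directory) with one data store folder.
base = tempfile.mkdtemp()
store = "ACTIVITY_TYPES"
os.makedirs(os.path.join(base, store))

# META_DATASTORES.csv at the base-folder level lists the data stores.
with open(os.path.join(base, "META_DATASTORES.csv"), "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["DATA_STORE_NAME", "DATA_STORE_LABEL"])
    w.writerow([store, "Activity types"])

# The data file's name is prefixed with the data store (folder) name.
with open(os.path.join(base, store, f"{store}.csv"), "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["ACTIVITY_TYPE_ID", "NAME"])
    w.writerow(["1", "Call"])

# Column metadata file: META_DATASTORES_<DATA_STORE_NAME>_COL.csv.
meta = os.path.join(base, store, f"META_DATASTORES_{store}_COL.csv")
with open(meta, "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["DATA_STORE_NAME", "COLUMN_NAME", "COLUMN_LABEL", "DATA_TYPE",
                "WIDTH", "PRECISION", "SCALE", "KEY_SEQUENCE"])
    w.writerow([store, "ACTIVITY_TYPE_ID", "Activity type ID", "NUMBER",
                "", "10", "0", "1"])
    w.writerow([store, "NAME", "Activity type name", "VARCHAR",
                "100", "", "", ""])

print(sorted(os.listdir(base)))  # ['ACTIVITY_TYPES', 'META_DATASTORES.csv']
print(sorted(os.listdir(os.path.join(base, store))))
```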
Load Data from a Secure FTP Source into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from a secure FTP source (SFTP) and use it to create data augmentations.

Ensure that SFTP is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Check the port number for your secure FTP database and create a service request to get the port opened.

    Note:

    You must provide the IP address of the SFTP server; it must be a public IP address, and it can’t be a hostname, a fully qualified domain name (FQDN), or a class A private IP address.
  2. In NetSuite Analytics Warehouse, create the SFTP data connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select SFTP as the connection type.

      SFTP connection option

    5. In the dialog for the SFTP connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide applicable values in Remote Host, User Name, Private Key, Remote Host Extract Files Directory, File Type, CSV Delimiter, CSV Date Format, and CSV Timestamp Format. In Lock Metadata, specify whether you want to turn off the metadata extracts after first refresh if metadata isn't going to change. This option is useful if the flag to derive metadata from data files using the metadata utility is turned on in your source.
      Ensure the following:
      • The table name and the file name in your SFTP source need to be the same.
      • The private key you provide is in the valid OpenSSH format.
      • Date format for the data files must match the date format in your SFTP source; for example, if you've used MM/dd/yyyy (01/23/1998) in your source, then you must specify the same format in CSV Date Format. See About Date and Timestamp Formatting for CSV File-based Extractors.
      • Timestamp format for the data files must match the timestamp format in your SFTP source; for example, if you've used MM/dd/yyyy hh:mm:ss (01/23/1998 12:00:20) in your source, then you must specify the same format in CSV Timestamp Format.

      Create Connection for SFTP dialog

    6. Click Save.
  3. In your SFTP source:
    1. Create the folder structure using these guidelines:
      Base folder
      • The base folder must match the details provided in the connection.
      • Inside the base folder, ensure that you place each file in its own folder.
      • Ensure that the prefix of the files in the target folder exactly matches the data store name (which is the same as the folder name).


    2. Inside the base folder, create the metadata file for the Data Store List. This file lists the supported data stores. Each data store is a folder that has the actual file used in data augmentation, for example, ASSETS. Ensure that the file name and folder name match and that there aren’t any special characters (including spaces) in the data store, folder, or file names.
      Base folder structure
    3. Create the metadata file for each data file under the data store folder using these guidelines:

      The META_DATASTORES.csv must have these columns:

      • DATA_STORE_NAME - A mandatory column to identify the data store name.
      • DATA_STORE_LABEL - A non-mandatory column that identifies the description of the data store.

      Each folder must have:

      • A data file that contains the actual data that gets loaded into NetSuite Analytics Warehouse. The file name must be prefixed with the data store name.
      • A metadata file that lists the columns and contains all the column information for the data. The file name must be prefixed with META_DATASTORES_<DATA_STORE_NAME>_COL. For the columns in this metadata, ensure the following:
        • If the column name is ABC, then the metadata can list it as ABC or "ABC"; the double quotes are ignored.
        • If the column name is "ABC" (including the quotes), then the metadata must list it as ""ABC""; the first set of double quotes is ignored.

      Example

      In the image, the folder name is ACTIVITY_TYPES. Hence, the data store name is ACTIVITY_TYPES. You can confirm this in the META_DATASTORES.csv file. In this example, the data file is named ACTIVITY_TYPES.xlsx or ACTIVITY_TYPES.csv. The metadata file must be named META_DATASTORES_ACTIVITY_TYPES_COL.csv.

      Sample folder and metadata file

      The META_DATASTORES_ACTIVITY_TYPES_COL.csv has these columns:
      • DATA_STORE_NAME - This is a mandatory column.
      • COLUMN_NAME - This is a mandatory column.
      • COLUMN_LABEL - This is a non-mandatory column.
      • DATA_TYPE – This is a mandatory column.
      • WIDTH – This column identifies the string length.
      • PRECISION - This column value must be of the Numeric data type.
      • SCALE - This column value must be of the Numeric data type.
      • KEY_SEQUENCE - This is a mandatory column that identifies the primary key definition. If you're using a composite primary key, then use column order numbers as values.
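      The metadata files described above can be sketched with a few lines of Python; the ASSETS data store, its columns, and its labels here are hypothetical examples:

```python
import csv

# Illustrative sketch: ASSETS is a hypothetical data store; file and column
# names follow the conventions described in this section.
store = "ASSETS"

# Base-folder metadata file listing the supported data stores.
with open("META_DATASTORES.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["DATA_STORE_NAME", "DATA_STORE_LABEL"])  # label is non-mandatory
    writer.writerow([store, "Fixed assets"])

# Per-data-store column metadata file: META_DATASTORES_<DATA_STORE_NAME>_COL.csv.
with open(f"META_DATASTORES_{store}_COL.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["DATA_STORE_NAME", "COLUMN_NAME", "COLUMN_LABEL",
                     "DATA_TYPE", "WIDTH", "PRECISION", "SCALE", "KEY_SEQUENCE"])
    # KEY_SEQUENCE = 1 marks the primary key; use 1, 2, ... for a composite key.
    writer.writerow([store, "ASSET_ID", "Asset identifier", "NUMBER", "", 10, 0, 1])
    writer.writerow([store, "ASSET_NAME", "Asset name", "VARCHAR", 100, "", "", ""])
```

      The data file itself (for example, ASSETS.csv) then sits alongside this column metadata file in the ASSETS folder.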
  4. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the SFTP connection and then select Test Connection.
  5. On the Manage Connections page, select Actions for the SFTP connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for SFTP unless you perform a metadata extract.
  6. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the SFTP data. Ensure that you select Regular as the source dataset type and the secure FTP source as the pillar. Select the applicable secure FTP source tables. See Augment Your Data.
Load Data from Azure Storage into Oracle NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle NetSuite Analytics Warehouse extract service to acquire data from Azure Storage and use it to create data augmentations.

Ensure that Azure Storage is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle NetSuite Analytics Warehouse, create the Azure Storage data connection using these instructions:
    1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Azure Storage as the connection type.
    5. In the dialog for the Azure Storage connection, enter these Azure Storage instance details:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • File Type: Select CSV.
      • Azure Blob Connection String: Enter your storage account's connection string.
      • Container: Specify the Azure container.
      • Remote Host Extract Files Directory: Name of the base folder in which you must place all your data files in Azure Storage.
      • CSV Delimiter: Delimiter for the data files.
      • CSV Date Format: Date format for the data files must match the date format in your Azure Storage source; for example, if you've used MM/dd/yyyy (01/23/1998) in your source, then you must specify the same format. See About Date and Timestamp Formatting for CSV File-based Extractors.
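      The Azure Blob connection string is a semicolon-separated list of key=value pairs; this minimal sketch parses one with fake account values to sanity-check it before pasting it into the dialog:

```python
# A minimal sanity check for the Azure Blob connection string format;
# the account name and key below are fake, illustrative values.
conn_str = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=mystorageacct;"
    "AccountKey=abc123fakekey==;"
    "EndpointSuffix=core.windows.net"
)

# Split on ";" into key=value pairs; split each pair on the first "=" only,
# because the account key itself may contain "=" padding characters.
parts = dict(p.split("=", 1) for p in conn_str.split(";") if p)
assert {"AccountName", "AccountKey"} <= parts.keys(), "incomplete connection string"
print(parts["AccountName"])  # mystorageacct
```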

      Azure Storage connection dialog

    6. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Azure Storage connection and then select Test Connection.
  3. On the Manage Connections page, select Actions for the Azure Storage connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Azure Storage unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Azure Storage data. Ensure that you select Regular as the source dataset type and Azure Storage as the pillar. Select the applicable Azure Storage source tables. See Augment Your Data.

Connect With Cloud Sources

Connect with your cloud applications to provide the background information for reports.

You can blend the additional data from these data sources with the prebuilt datasets to enhance business analysis. To know about the date and timestamp formatting for the CSV file-based extractors, see About Date and Timestamp Formatting for CSV File-based Extractors.

Load Data from Azure SQL into Oracle NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle NetSuite Analytics Warehouse extract service to acquire data from Azure SQL and use it to create data augmentations.

Ensure that Azure SQL is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle NetSuite Analytics Warehouse, create the Azure SQL data connection using these instructions:
    1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Azure SQL as the connection type.

      Azure SQL connection option

    5. In the dialog for the Azure SQL connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, enter your Azure SQL instance details, and click Save.

      Create Connection for Azure SQL dialog

    6. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Azure SQL connection and then select Test Connection.
  3. On the Manage Connections page, select Actions for the Azure SQL connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Azure SQL unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Azure SQL data. Ensure that you select Regular as the source dataset type and Azure SQL as the pillar. Select the applicable Azure SQL source tables. See Augment Your Data.
Load Data from Oracle Autonomous Database into Oracle NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle NetSuite Analytics Warehouse extract service to acquire data from Oracle Autonomous Database and use it to create data augmentations.

You can create connections to up to five autonomous databases. Depending on the number of connections, ensure that options such as Oracle Autonomous Database 1 and Oracle Autonomous Database 2 are enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle NetSuite Analytics Warehouse, create the autonomous database connection using these instructions:
    1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, depending on the number of connections, select an option such as Oracle Autonomous Database 1 or Oracle Autonomous Database 2 as the connection type.

      Oracle Autonomous Database connection option

    5. In the dialog for the Oracle Autonomous Database connection, provide these details:
      • Select Standard in Connectivity Type.
      • Enter an email address to receive notifications in Notification Email.
      • Enter the credentials to access the database in User Name and Password.
      • Enter the database service details in Service.
      • In Wallet, drag and drop the database wallet details.
      • In Initial Extract Date Column Pattern, provide the initial extract date pattern that matches the pattern in your source.
      • In Last Update Date Column Pattern, provide the last update date pattern that matches the pattern in your source.
      • If your source has flashback support, then select Yes in Enable flashback incremental.
      • In List of Flashback Incremental Exempt datastores, provide a comma-separated list of datastores that you want to exempt from the flashback incremental queries.
      • If your source has ROWSCN support, then select Yes in Enable ROWSCN incremental.
      • In List of ROWSCN Incremental Exempt datastores, specify a comma-separated list of datastores that you want to exclude from the automatic tracking of row changes based on system change numbers.
      • In Schema Name, enter the schema name to extract data from.
      • In Data Store Name Pattern, specify the name pattern of the data stores that you want extracted. If you provide this value, then the connector extracts only data stores matching the pattern.

      Create Connection for Oracle Autonomous Database dialog

    6. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the autonomous database connection and then select Test Connection.
  3. On the Manage Connections page, select Actions for the autonomous database connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for autonomous database unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the autonomous database data. Ensure that you select Regular as the source dataset type and autonomous database as the pillar. Select the applicable autonomous database source tables. See Augment Your Data.
Load Data from Enterprise Data Management Cloud into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the Enterprise Data Management Cloud instance and use it to create data augmentations.

The extracts created in the Enterprise Data Management Cloud service must be public; you must promote your private extracts to public. If the metadata refresh fails for a private extract, review the documentation and the error messages. This connector supports only the CSV data format.

Ensure that Oracle Enterprise Data Management Cloud is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In NetSuite Analytics Warehouse, create the Enterprise Data Management Cloud data connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Oracle Enterprise Data Management Cloud as the connection type.

      Oracle Enterprise Data Management Cloud connection option

    5. In the dialog for the Enterprise Data Management Cloud connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, enter the host name of the Oracle Enterprise Data Management server in Host Name, and enter the credentials to access the Enterprise Data Management Cloud instance in User Name and Password. In Extract Configuration, enter the list of extracts using only a comma as the delimiter. The extract configuration must be a single-line JSON without formatting, using plain quotation marks (" instead of \"), for example:
      [{"applicationName": "Account Reconciliation", "dimensionName": "Profiles","extractName": "Profiles"}]
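      A quick way to sanity-check the extract configuration before pasting it into the dialog is to confirm that it is single-line JSON with plain quotation marks; this sketch uses the example above:

```python
import json

# Sanity-check sketch for the Extract Configuration value: it must be a
# single-line JSON array with plain quotation marks (" rather than \").
config = '[{"applicationName": "Account Reconciliation", "dimensionName": "Profiles", "extractName": "Profiles"}]'

assert "\n" not in config, "configuration must be a single line"
assert '\\"' not in config, "quotation marks must not be escaped"
extracts = json.loads(config)
for extract in extracts:
    # Each entry names the application, dimension, and extract to pull.
    print(extract["applicationName"], extract["dimensionName"], extract["extractName"])
```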

      Create Connection for Oracle Enterprise Data Management Cloud dialog

    6. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Enterprise Data Management Cloud connection and then select Test Connection.
  3. On the Manage Connections page, select Actions for the Enterprise Data Management Cloud connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Enterprise Data Management Cloud unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Enterprise Data Management Cloud data. Ensure that you select Regular as the source dataset type and Enterprise Data Management Cloud as the pillar. Select the applicable Enterprise Data Management Cloud source tables. See Augment Your Data.
Load Data from Enterprise Performance Management into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the Enterprise Performance Management (EPM) SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

You can connect to these functional modules of EPM:
  • Financial Close and Consolidation (FCCS)
  • Planning and Budgeting (PBCS)
  • Profitability and Cost Management (PCMCS)
If you have a pivot table in your source, then the metadata extract supports pivot table metadata extraction for the EPM connectors. You can only extract the incremental data manually because, for incremental extraction, you must update the results file in EPM before starting the next extraction for the updated data. Update the results file by running the integration using Data Exchange, and then access the new results file from the EPM connector in NetSuite Analytics Warehouse. Ensure that you enter all the fields in accordance with your EPM nomenclature and specifics. For example, if you have defined Period in your EPM job as {June-23}, then include exactly the same value in the Create Connection dialog for the EPM source.

Note:

The EPM connectors display the default datatype and size; you must edit these values as applicable while creating data augmentations.
Depending on the functional module you want to connect to, ensure that the applicable feature is enabled on the Enable Features page prior to creating this connection:
  • Oracle EPM - Financial Close and Consolidation
  • Oracle EPM - Planning and Budgeting
  • Oracle EPM - Profitability and Cost Management
See Make Preview Features Available.
  1. In EPM, create an integration, write out the results into a file whose name you provide in Download File Name, and then specify that same file name in List of Data Files while creating the connection to EPM in NetSuite Analytics Warehouse to extract the data.
    Create an integration in EPM
  2. In EPM, when exporting data, use one of the following modes:
    • Standard mode: This built-in workflow process helps manage the data flow process in and out of EPM. For Standard mode, you specify the period when you run the integration.
    • Quick mode: This process bypasses most of the steps in the workflow and provides a significant improvement in the extract process. When using Quick mode, you specify the period in the integration definition filter, and this may include a substitution variable. When using the API, the same rules apply, except that when you need to specify a period, you can't use a substitution variable to pull from Essbase; you must include the period or year on extract.

    See Exporting Data.

  3. In NetSuite Analytics Warehouse, create the EPM data connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select the connection type based on the functional module that you want to connect to. For example, to connect to the "Financial Close and Consolidation (FCCS)" module, select Oracle EPM - Financial Close and Consolidation as the connection type.

      Oracle EPM - Financial Close and Consolidation connection option

    5. In Create Connection for the EPM source, enter these details and click Save:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • User Name and Password: Enter the credentials for your EPM source. Prefix the user name with the domain of your EPM source, such as domain.username.
      • URL: Enter the specific URL of your EPM source using the https://<DOMAIN_NAME>region.ocs.oraclecloud.com format. For example, https://epm7-test-a123456.epm.us6.oraclecloud.com. Ensure that you use the https:// protocol to avoid a timeout error.
      • List of Data Files: Specify the file name that you had entered in Download File Name while creating an integration in EPM.
      • List of Job Names: Provide the EPM integration job names. If you have multiple job names, then ensure that they're comma-separated. For example, FAWEPMTestingV2, FAWEXPORT,FAW Job Testing.
      • List of Period Names: Provide the period names for the corresponding job names. Ensure that multiple period names are comma-separated. You may leave this blank, in which case the connector uses the global point of view. For example, "{Jan-22}-{Oct-22}".
      • Extract Configuration: Provide the EPM job name such as "Job 1" and period name such as "Quarter 1" corresponding to the given job name to extract data for specific periods. This enables you to run multiple EPM jobs.

      Create EPM Connection

  4. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the EPM connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for EPM unless you perform a metadata extract.
  5. On the Manage Connections page, select Actions for the EPM connection and then select Test Connection.
  6. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the EPM data. Ensure that you select Regular as the source dataset type and EPM as the pillar. Select the applicable EPM source tables. See Augment Your Data.
Load Data from Google Analytics into NetSuite Analytics Warehouse

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the Google Analytics SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Before connecting with the Google Analytics source, note these:
  • NetSuite Analytics Warehouse supports the Google Analytics extractor for GA4 properties and doesn't support the previous version, Google Universal Analytics (UA) properties.
  • DataStores are the list of GA4 properties.
  • DataStore columns are the list of Dimensions and Metrics for a GA4 property.
  • DataExtract runs the report based on user selection for a GA4 property as DataStore and Dimensions and Metrics as DataStore columns.
  • MetaExtract fetches metadata for all the available GA4 properties (DataStores) and its Dimensions and Metrics (DataStoreColumns).
  • This connector supports a limited number of Google Analytics metrics. See Dimensions Metrics Explorer to know what's available.
  1. In Google Cloud (Analytics) Project, create a service account and download the credentials.json file.
  2. Add the service account to the Google Analytics 4 property.
  3. Enable Google Analytics APIs using these instructions:
    1. Using a text editor, open the credentials.json file that you had downloaded and search for the client_email field to obtain the service account email address.
    2. Use this email address to add a user to the Google Analytics 4 property you want to access through the Google Analytics Data API v1.
    Enable Google Analytics APIs
  4. Ensure that the Google Analytics Admin API and Google Analytics Data API are available for your Google Analytics instance.
    View Google Analytics APIs
  5. In NetSuite Analytics Warehouse, create the Google Analytics data connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Google Analytics as the connection type.

      Google Analytics connection option

    5. In the dialog for the Google Analytics connection, select Standard as the connectivity type and enter these details:
      • Notification Email: An email address to receive notifications regarding this connection.
      • Service Account Credentials Json File: The Google Cloud Service Account credentials.json file that you had downloaded.
      • Account ID: Google Analytics account ID.
      • GA4 List of Property ID: A comma-separated list of GA4 property IDs.
      • Lookback Mode: Select either Full or Committed.
      • List of Lookback N days Ago: Comma separated list of days (integer) values such as 7,21.
      Create Connection dialog
      Note these:
      • For Lookback Mode, if you don't provide a value, then lookback isn't applied. The Full option requires a single day value; if you provide multiple values, then the process uses only the first value. You can provide multiple values for the Committed option.
      • For List Data Stores, the REST API returns a list of GA4 property IDs using either the Account ID (if provided) or the configured list of property IDs.
      • For List columns, the REST API returns a list of column metadata for the given GA4 Property ID.
    6. Click Save.
  6. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Google Analytics connection and then select Test Connection.

    Note:

    The REST API signature is the same across sources. Test Connection invokes the GA Common Metadata API, which returns the default version values; no calls are made to the source.
  7. On the Manage Connections page, select Actions for the Google Analytics connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Google Analytics unless you perform a metadata extract.
    Metadata extract:
    • Retrieves metadata columns for each GA4 Property ID provided in the source configuration.
    • Prefixes the GA property columns with Dimension_ or Metric_, which NetSuite Analytics Warehouse later uses while extracting data to differentiate the Dimension and Metric column types.
    • Leaves the payload dataStores array empty.
  8. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Google Analytics data. Ensure that you select Regular as the source dataset type and Google Analytics as the pillar. Select the applicable Google Analytics source tables. See Augment Your Data.
    When you enable data extraction, you can schedule it to run when you choose. For data extraction, note these:
    1. Provide date ranges to run the report and fetch data.
    2. Regular data extract uses the initial or last ExtractDate as StartDate and job RunDate as EndDate.
    3. Lookback mode includes additional date ranges along with the regular extract date range, which fetches additional data sets in a single runReport call.
      • The Full option has a single date range: StartDate = ExtractDate - NdaysAgo, EndDate = RunDate.
      • The Committed option can have multiple date ranges. For each configured GA_LIST_OF_N_DAYS_AGO value: StartDate = ExtractDate - NdaysAgo, EndDate = RunDate - NdaysAgo.
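    The date-range rules above can be sketched as follows; the extract date, run date, and lookback values are illustrative:

```python
from datetime import date, timedelta

# Illustrative values; the real dates come from the extract schedule.
extract_date = date(2023, 6, 1)   # initial or last extract date
run_date = date(2023, 6, 8)       # job run date
n_days_ago = [7, 21]              # configured GA_LIST_OF_N_DAYS_AGO values

# Regular extract date range.
ranges = [(extract_date, run_date)]

mode = "Committed"
if mode == "Full":
    # Single extra range; only the first configured value is used.
    n = n_days_ago[0]
    ranges.append((extract_date - timedelta(days=n), run_date))
elif mode == "Committed":
    # One extra range per configured value, shifting both ends back.
    for n in n_days_ago:
        ranges.append((extract_date - timedelta(days=n),
                       run_date - timedelta(days=n)))

for start, end in ranges:
    print(start, end)
```

    All of these ranges are fetched together in a single runReport call.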
Load Data from Mongo Database into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the Mongo database and use it to create data augmentations.

Ensure that MongoDB is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Check the port number for your Mongo database and create a service request with server host and port details to enable network connectivity to the Mongo database server.
  2. In NetSuite Analytics Warehouse, create the Mongo database connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select MongoDB as the connection type.

      MongoDB connection option

    5. In the dialog for the Mongo database connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide these details:
      • Connection Protocol: Enter the connection protocol, such as mongodb+srv or mongodb
      • User Name and Password: Enter the credentials to access the database
      • Host Name: Enter the host name of the Mongo database, such as cluster0.example4.mongodb.net
      • Host Port: Enter the port number where the Mongo database is listening, such as 27017
      • Database Name: Enter a name, such as Analytics
      • Last Update Date Column Pattern: Enter a pattern, such as "%mo%fie%te%"

        Note:

        If a column name matches the pattern provided, then NetSuite Analytics Warehouse marks that column with isLastUpdateDate = true and uses it for the incremental extract. For example, if the pattern provided is "%mo%fie%te%", then the column named modifiedDate is marked with isLastUpdateDate = true.
      • Initial Extract Date Column Pattern: Enter a pattern, such as "%cr%ted%te%"

        Note:

        If a column name matches the pattern provided, then NetSuite Analytics Warehouse marks that column with isCreationDate = true and uses it for the initial extract date extraction. For example, if the pattern provided is "%cr%ted%te%", then the column named createdDate is marked with isCreationDate = true.
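        Both patterns use % as a wildcard, similar to SQL LIKE. This illustrative sketch shows how such a pattern can match column names; the actual matching is performed by the extract service:

```python
import re

# Illustrative sketch only: translate a %-wildcard pattern (as in
# "%mo%fie%te%") into a regular expression and test it against column names.
def matches(pattern: str, column: str) -> bool:
    # % matches any run of characters, like the SQL LIKE wildcard.
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("%")) + "$"
    return re.match(regex, column, re.IGNORECASE) is not None

print(matches("%mo%fie%te%", "modifiedDate"))  # True
print(matches("%cr%ted%te%", "createdDate"))   # True
print(matches("%mo%fie%te%", "orderDate"))     # False
```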

      Create Connection for Mongo database dialog

    6. Click Save.
  3. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Mongo database connection and then select Test Connection.
  4. On the Manage Connections page, select Actions for the Mongo database connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Mongo database unless you perform a metadata extract.
  5. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Mongo database data. Ensure that you select Regular as the source dataset type and Mongo database as the pillar. Select the applicable Mongo database source tables. See Augment Your Data.
Load Data from MySQL Cloud Database into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the MySQL Cloud database and use it to create data augmentations.

Ensure that MySQL Cloud is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Check the port number for your MySQL Cloud database and create a service request with server host and port details to enable network connectivity to the MySQL server.
  2. In NetSuite Analytics Warehouse, create the MySQL Cloud database connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select MySQL Cloud as the connection type.

      MySQL Cloud database connection option

    5. In the dialog for the MySQL Cloud database connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide these details:
      • Host Name: Enter the host name of the MySQL server, such as 100.111.252.64
      • Port Number: Enter the port number where the server is listening, such as 3306
      • Database: Enter the name of the database you need to connect to, such as airportdb
      • User Name and Password: Enter the credentials to access the database
      • Last Update Date Column Pattern: Enter a pattern, such as "%mo%fie%te%"

        Note:

        If a column name matches the pattern provided, then NetSuite Analytics Warehouse marks that column with isLastUpdateDate = true and uses it for the incremental extract. For example, if the pattern provided is "%mo%fie%te%", then the column named modifiedDate is marked with isLastUpdateDate = true.
      • Initial Extract Date Column Pattern: Enter a pattern, such as "%cr%ted%te%"

        Note:

        If a column name matches the pattern provided, then NetSuite Analytics Warehouse marks that column with isCreationDate = true and uses it for the initial extract date extraction. For example, if the pattern provided is "%cr%ted%te%", then the column named createdDate is marked with isCreationDate = true.

      Create Connection for MySQL Cloud database dialog

    6. Click Save.
  3. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the MySQL Cloud database connection and then select Test Connection.
  4. On the Manage Connections page, select Actions for the MySQL Cloud database connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for MySQL Cloud database unless you perform a metadata extract.
  5. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the MySQL Cloud database data. Ensure that you select Regular as the source dataset type and MySQL Cloud database as the pillar. Select the applicable MySQL Cloud database source tables. See Augment Your Data.
Load Data from Oracle Analytics Publisher into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the Analytics Publisher reports and use it to create data augmentations for various use cases.

Currently, the BI Publisher Reports connector supports only:
  • The Analytics Publisher in Oracle Fusion Cloud Applications for data augmentation.
  • Reports that complete within the Analytics Publisher report execution timeout limit, which is typically 300 seconds.

The BI Publisher Reports connector workflow must observe the security rules of Oracle Fusion Cloud Applications. You must ensure that password rotation and updates are done on time before executing the BI Publisher Reports connector pipeline. Otherwise, the pipeline jobs hang and eventually get deleted, and the data source is disabled until you update the password and resubmit the job.

Ensure that BI Publisher Reports is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In BI Publisher, build a comma separated values (CSV) report for the targeted tables. See Create Reports and Generate CSV Output.
  2. In NetSuite Analytics Warehouse, create the data connection to the BI Publisher reports using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select BI Publisher Reports as the connection type.

      BI Publisher connection option

    5. In the dialog for the BI Publisher reports connection, provide these details:
      • Select Standard in Connectivity Type.
      • Enter an email address to receive notifications in Notification Email.
      • Enter the host name of the BI Publisher instance in Host Name.
      • Enter the credentials to access your BI Publisher instance in User Name and Password.
      • Enter a comma-separated list of reports in List of Reports, in the <reports directory>/<report names> format.

        Note:

        Ensure that the reports directory and report names don't have spaces.
      • In Reports Configuration, provide the path of the report in Oracle Analytics Publisher. Select Run Asynchronously if the report isn't large; select Run Synchronously In Chunk if the report has a large volume of data (more than 7 MB in size), because this enables you to download the report in chunks.

        Note:

        • If the report execution completes before the timeout limit and generates a large volume of data, then set runInChunkMode: true. The recommendation is to set runInChunkMode to true if the report output file is over 7 MB.
        • If the report has parameters defined, then provide the values in the "params" section of the report configuration array. If a value in the params array is set with the placeholder __lastUpdateDate__, then the connector applies the initial extract date or the last successful extract date to that parameter.
      • In CSV Date Format, enter the expected date format from the report; for example, "yyyy-MM-dd".
      • In CSV Timestamp Format, enter yyyy-MM-dd'T'hh:mm:ss.

      Create Connection for BI Publisher dialog

    6. Click Save.
  3. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the BI Publisher reports connection and then select Test Connection.
  4. On the Manage Connections page, select Actions for the BI Publisher reports connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for the BI Publisher reports unless you perform a metadata extract.
  5. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the BI Publisher reports data. Ensure that you select Regular as the source dataset type and BI Publisher reports as the pillar. Select the applicable BI Publisher source tables. See Augment Your Data.
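As a rough illustration of the Reports Configuration and CSV format fields described in the steps above — the JSON key for the report path and the parameter name here are assumptions for illustration, not a documented schema; only "params", "runInChunkMode", and the __lastUpdateDate__ placeholder come from this document:

```python
import json
from datetime import datetime

# Illustrative sketch of a Reports Configuration entry. The "report" key and
# the parameter name are hypothetical placeholders.
report_config = [
    {
        "report": "/Custom/Financials/GLJournalReport",  # hypothetical <reports directory>/<report name>
        "runInChunkMode": True,  # recommended when the report output exceeds 7 MB
        "params": {
            # The connector replaces __lastUpdateDate__ with the initial
            # extract date or the last successful extract date.
            "p_last_update_date": "__lastUpdateDate__",
        },
    }
]
print(json.dumps(report_config, indent=2))

# The CSV Date Format and CSV Timestamp Format fields take Java-style
# patterns; the equivalent Python strptime patterns are shown here.
parsed_date = datetime.strptime("2024-03-01", "%Y-%m-%d")                  # yyyy-MM-dd
parsed_ts = datetime.strptime("2024-03-01T09:30:00", "%Y-%m-%dT%H:%M:%S")  # yyyy-MM-dd'T'hh:mm:ss
print(parsed_date.date(), parsed_ts.time())
```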
Load Data from Oracle Database Using JDBC into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use an extract service remote agent to connect to an Oracle database using JDBC and use the data to create data augmentations.

After connecting to an Oracle database using JDBC, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle NetSuite Analytics Warehouse instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the data from an Oracle database into Oracle NetSuite Analytics Warehouse only once every 24 hours.

Ensure that Remote Agent and Oracle JDBC are enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your Oracle database into Oracle NetSuite Analytics Warehouse.
  2. Configure the remote agent and Oracle database data source on the Data Configuration page in Oracle NetSuite Analytics Warehouse using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Remote Agent as the connection type.
    5. In the Create Connection Remote Agent dialog, in Connection Name, you can modify the default name and verify that Remote is displayed in Connectivity Type.
    6. Enter an email address to receive notifications in Notification Email, provide the Identifier and Host, and in Public Key, click Upload File or Drop Above to fill in the details of the remote agent, and then click Save. You can add the configuration details file that you downloaded, or use the configuration details that you copied, after configuring the remote agent.
    7. Navigate to the Manage Connections page, click Create and then click Connection.
    8. In Create Connection, select Oracle JDBC as the connection type.
      Oracle JDBC connection option

    9. In Create Connection for Oracle JDBC, in Connectivity Type, verify that Remote is selected automatically.
      Create Connection for Oracle JDBC

    10. In Remote Agent, select the remote agent connection that you created earlier, for example, Remote Agent.
    11. Enter an email address to receive notifications in Notification Email, provide credentials for the Oracle database source in User Name and Password, and the URL of the Oracle database source in URL.
    12. In Initial Extract Date Column Pattern, provide the initial extract date pattern that matches the pattern in your source.
    13. In Last Update Date Column Pattern, provide the last update date pattern that matches the pattern in your source.
    14. If your source has flashback support, then select Yes in Enable flashback incremental.
    15. In List of Flashback Incremental Exempt datastores, provide a comma separated list of datastores that you want to exempt from the flashback incremental queries.
    16. If your source has ROWSCN support, then select Yes in Enable ROWSCN incremental.
    17. In List of ROWSCN Incremental Exempt datastores, specify a comma-separated list of datastores that you want to exclude from the automatic tracking of row changes based on system change numbers.
    18. In Case Sensitive Data Stores, select Yes or No to specify whether the datastores have case sensitive data.
    19. In Schema Name, enter the schema name to extract data from.
    20. In Data Store Name Pattern, specify the name pattern of the datastores that you want extracted. If you provide this value, then the connector extracts only datastores matching the pattern.
    21. Confirm that you see the Remote Agent and Oracle database connections on the Manage Connections page.
    22. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Oracle database connection and then select Refresh Metadata.

      Note:

      You can’t create augmentations for Oracle database unless you perform a metadata extract.
  3. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Oracle database data. Ensure that you select Regular as the source dataset type and Oracle JDBC as the pillar. Select the applicable Oracle database source tables. See Augment Your Data.
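The URL supplied for the Oracle database source typically follows the standard Oracle thin-driver JDBC syntax. This sketch shows the two common forms; the host, port, and service values are placeholders, not values from this document:

```python
# Sketch of the common Oracle thin-driver JDBC URL forms; host, port, and
# service/SID are placeholders.
def oracle_jdbc_url(host: str, port: int, service: str, use_sid: bool = False) -> str:
    """Build a JDBC URL for the Oracle thin driver."""
    if use_sid:
        # Older SID-based form: jdbc:oracle:thin:@host:port:SID
        return f"jdbc:oracle:thin:@{host}:{port}:{service}"
    # Service-name form: jdbc:oracle:thin:@//host:port/service_name
    return f"jdbc:oracle:thin:@//{host}:{port}/{service}"

print(oracle_jdbc_url("dbhost.example.com", 1521, "ORCLPDB1"))
```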
Load Data from Salesforce into NetSuite Analytics Warehouse

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the Salesforce SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

This version of the Salesforce Connector supports version 57.0 of the API from Salesforce.

  1. In NetSuite Analytics Warehouse, create the Salesforce data connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Salesforce REST as the connection type.

      Salesforce REST connection option

    5. In the dialog box for the Salesforce connection, select Standard in Connectivity Type and enter an email address to receive notifications in Notification Email.

      Create Connection for Salesforce REST dialog

    6. Enter the URL of your Salesforce source such as <your Salesforce instance name>.my.salesforce.com in URL.
    7. Copy and paste the token URL from your Salesforce instance in Token URL.
    8. Select the type of authorization in Authorization Type.
      Authorization Type can be either BASICAUTH or OAUTH. Ensure that you enter these authorization types in uppercase, exactly as shown, without any additional characters. You must complete the corresponding fields for the selected authorization type. For example, if you select BASICAUTH, then you must provide a valid user name, password, security token, URL, client ID, and client secret. If you select OAUTH, then you must provide a valid user name, token URL, client ID, and private key. Remember to keep all the authorization fields up to date, since Salesforce may reset or require you to reset them regularly.
    9. Enter the credentials for your Salesforce source in User Name and Password.
    10. Copy and paste the client ID that is usually a long alpha-numeric code from your Salesforce account in Client ID.
    11. Copy and paste the client secret from your Salesforce account in Client Secret.
      This is an alphanumeric code that may contain special characters. It isn't displayed in clear text; it's encrypted and shown as "…".
    12. Copy and paste the security token from your Salesforce account in Security Token.
      This is an alphanumeric code that may contain special characters. It isn't displayed in clear text; it's encrypted and shown as "…".
    13. Copy and paste the private key from your Salesforce account in Private Key.
    14. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Salesforce REST connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Salesforce unless you perform a metadata extract.
  3. On the Manage Connections page, select Actions for the Salesforce REST connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Salesforce data. Ensure that you select Regular as the source dataset type and Salesforce as the pillar. Select the applicable Salesforce source tables. See Augment Your Data.
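For context on the BASICAUTH fields above: Salesforce's standard username-password OAuth flow expects a token request in which the security token is concatenated to the password. This sketch, with placeholder values, shows the payload that such a flow posts to the token URL; it describes the Salesforce API convention, not the connector's internals:

```python
from urllib.parse import urlencode

# Sketch of the token request body for Salesforce's username-password OAuth
# flow; all values are placeholders. Salesforce expects the security token
# appended to the password for logins from untrusted networks.
def basicauth_token_payload(client_id, client_secret, username, password, security_token):
    return urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password + security_token,  # password and security token, concatenated
    })

payload = basicauth_token_payload(
    "3MVG9exampleClientId", "exampleSecret", "user@example.com", "pw", "TOKEN123"
)
print(payload)  # POST this form body to the Token URL
```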
Load Data from Shopify into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the Shopify SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Ensure that Shopify is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

Note:

This data augmentation connector is temporarily unavailable as of the Platform 24.R1 (March 2024) release. Further updates will be available after an Oracle internal review.
  1. In NetSuite Analytics Warehouse, create the Shopify data connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Shopify as the connection type.

      Shopify connection option

    5. In the dialog for the Shopify connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, enter the applicable token value in Access Token, enter your store name (such as myfawteststore.myshopify.com) in Store Name, and select True in Bulk Extract.

      Create Connection for Shopify dialog

    6. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Shopify connection and then select Test Connection.
  3. On the Manage Connections page, select Actions for the Shopify connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Shopify unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Shopify data. Ensure that you select Regular as the source dataset type and Shopify as the pillar. Select the applicable Shopify source tables. See Augment Your Data.
Connect with Your Oracle Eloqua Data Source

If you’ve subscribed for Oracle Fusion CX Analytics and want to load data from your Oracle Eloqua source into NetSuite Analytics Warehouse, then create a connection using the Eloqua connection type.

The Oracle Eloqua data that you load into NetSuite Analytics Warehouse enables you to augment the data in your warehouse and create varied customer experience-related analytics. Ensure that Oracle Eloqua is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  2. On the Data Configuration page, click Manage Connections under Global Configurations.
  3. On the Manage Connections page, click Create and then click Connection.
  4. In Create Connection, select Eloqua as the connection type.
  5. In the dialog for the Eloqua connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and the credentials to connect with the Eloqua source in User Name and Password.
  6. In URL, enter the URL for your Eloqua server in this sample format: https://<your eloqua server>/api/odata/1.0
  7. Click Save.
Load Data from QuickBooks Online into Oracle NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle NetSuite Analytics Warehouse extract service to acquire data from QuickBooks Online and use it to create data augmentations.

Ensure that QuickBooks Online is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle NetSuite Analytics Warehouse, create the QuickBooks Online data connection using these instructions:
    1. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select QuickBooks Online as the connection type.

      QuickBooks Online connection option

    5. In the dialog for the QuickBooks Online connection, enter these details:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • URL: Enter the complete URL of your QuickBooks Online instance.
      • Client ID: The unique ID of your QuickBooks Online instance.
      • Client Secret: Copy and paste the entire client secret from your QuickBooks Online authentication settings.
      • Refresh Token: QuickBooks changes this token every day; enter the refresh token for the day on which you want to execute this pipeline.
      • Company ID: Your company ID for the QuickBooks Online instance.

      Create Connection for QuickBooks Online dialog

    6. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the QuickBooks Online connection and then select Test Connection.
  3. On the Manage Connections page, select Actions for the QuickBooks Online connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for QuickBooks Online unless you perform a metadata extract.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the QuickBooks Online data. Ensure that you select Regular as the source dataset type and QuickBooks Online as the pillar. Select the applicable QuickBooks Online source tables. See Augment Your Data.
Load Data from Snowflake into Oracle NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the Oracle NetSuite Analytics Warehouse extract service to acquire data from a Snowflake instance.

You can later use this data to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases. Establish the connection from NetSuite Analytics Warehouse to your Snowflake instance to start data acquisition followed by augmentation.

Ensure that Snowflake is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In NetSuite Analytics Warehouse, create the Snowflake data connection:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Snowflake as the connection type.
      Snowflake connection option
    5. In Create Connection, enter these details and then click Save:
      • Connectivity Type: Standard.
      • Notification Email: An email address to receive notifications.
      • Auth Schema: Enter “BASICAUTH” if you’re using username and password to establish the connection. Enter “PRIVATE_KEY” if you’re using token-based authentication.
      • User Name: Enter the user name only if you're using basic authentication.
      • Password: Enter the password for the user name only if you're using basic authentication.
      • Host Name: Complete host name of your Snowflake instance.
      • Table Schema: Your Snowflake table schema such as TPCH_SF1.
      • Database: The database name, listed in your Snowflake account under Data.
      • Warehouse: The compute resources in your Snowflake instance that you can find by running SHOW WAREHOUSES [ LIKE '<pattern>' ]. See SHOW WAREHOUSES.
      • Private Key: Generate the Private Key in Snowflake, if you don’t have one already, and paste it here. See Generate the Private Key.

      Create Snowflake connection

  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Snowflake connection, and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Snowflake unless you perform a metadata extract.
  3. On the Manage Connections page, select Actions for the Snowflake connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Snowflake data. Ensure that you select Regular as the source dataset type and Snowflake as the pillar. Select the applicable Snowflake source tables. See Augment Your Data.
Load Data from Taleo into NetSuite Analytics Warehouse (Preview)

As a service administrator, you can use the NetSuite Analytics Warehouse extract service to acquire data from the Taleo instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Ensure that Taleo is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In NetSuite Analytics Warehouse, create the Taleo data connection using these instructions:
    1. In NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Taleo as the connection type.

      Taleo connection option

    5. In Connectivity Type, select Standard. Enter an email address to receive notifications in Notification Email, the host name of your Taleo instance in Host Name, and the credentials for your Taleo source in User Name and Password.

      Create Taleo Connection dialog

    6. Click Save.
  2. On the Manage Connections page, select Actions (Actions menu ellipsis icon) for the Taleo connection and then select Refresh Metadata.

    Note:

    You can’t create augmentations for Taleo unless you perform a metadata extract.
  3. On the Manage Connections page, select Actions for the Taleo connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page and create a data augmentation using the Taleo data. Ensure that you select Regular as the source dataset type and Taleo as the pillar. Select the applicable Taleo source tables. See Augment Your Data.

Schedule Frequent Refreshes of Data

You can schedule frequent refreshes of the applicable functional areas, prioritized datasets, and warehouse tables to ensure that the incremental data in these is the latest.

You can refresh a limited set of tables currently, with plans to support more functional areas and datasets in future releases. When opting to refresh certain functional areas, prioritized datasets, and warehouse tables, ensure that you understand which tables you can refresh, because the selective refresh of some tables could lead to functional inconsistencies when combining data from multiple subject areas. This frequent data refresh capability is designed for refreshing base tables that capture transactional data; it isn't meant for derived datasets that require aggregations, snapshots, or complex transformation logic. Such processing creates data latency that doesn't support a high volume of frequent data refreshes.

If you've enabled the "Prioritized Data Refresh" preview feature and selected datasets for a prioritized incremental refresh, then those specific datasets are available for a frequent data refresh. See Prioritize Datasets for Incremental Refresh (Preview). If you've enabled the "Frequent Refresh Tables" preview feature and saved your selection of the warehouse tables, then the selected tables are available as "Warehouse Tables" for a frequent data refresh. See Schedule Frequent Refreshes of Warehouse Tables. If you want to select the warehouse tables created by the custom data configurations that were generated prior to release Platform 23.R4, then you must regenerate the applicable custom data configurations for these tables to be visible for selection. From release Platform 23.R4, the warehouse tables created by the custom data configurations are available for a frequent data refresh under the Frequent Refresh tab.

For Oracle NetSuite Analytics Warehouse, you can schedule frequent refreshes for functional areas that are visible in the Frequent Data Refresh Schedule tab on the Pipeline Settings page.

When you select the functional areas for a frequent refresh, you won’t be able to refresh the data pipeline for the applicable functional area using the Refresh Data option on the Data Configuration page. The frequent data refresh process doesn't refresh data from external sources through the data augmentation connectors; Oracle Fusion Data Intelligence processes the data augmentations as part of the incremental loads. The semantic model isn't updated during frequent data refreshes: the semantic model update jobs don't run as part of data augmentations; they run for data augmentations only during incremental loads.

You may want to frequently refresh the data in these sample scenarios:
  • During the last five days of the month or quarter, general ledger data changes quickly, with entries updated more than once a day. In this scenario, a financial analyst wants to ensure that the most recent and accurate data goes into analysis and report generation. Because some data pipelines can take several hours to execute from end to end, the refresh frequency needs to be every 4 hours, or 6 times in a 24-hour period.
  • A retailer has strict guidelines for returns or refunds on purchase orders and needs to bring in the most recent order data for computation and report generation, especially at the beginning of a given fiscal quarter. To cover the time differences between Europe and the East and West Coasts of the U.S., a financial analyst needs to refresh data every 8 hours, or 3 times a day.
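The scheduling arithmetic in these scenarios is a simple interval division over a 24-hour period, as this sketch shows:

```python
from datetime import datetime, timedelta

# Sketch: given a refresh frequency in hours and a start time, list the
# refresh times that fall within one 24-hour period.
def refresh_times(start: datetime, frequency_hours: int):
    return [start + timedelta(hours=h) for h in range(0, 24, frequency_hours)]

runs_4h = refresh_times(datetime(2024, 3, 1, 2, 0), 4)
runs_8h = refresh_times(datetime(2024, 3, 1, 2, 0), 8)
print(len(runs_4h))  # every 4 hours -> 6 refreshes in a 24-hour period
print(len(runs_8h))  # every 8 hours -> 3 refreshes a day
```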

Ensure that Frequent Refresh is enabled in Pipeline Features under the Generally Available Features tab on the Enable Features page prior to scheduling frequent data refreshes. See Enable Generally Available Features.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Pipeline Settings.
  4. On the Pipeline Settings page, click Frequent Data Refresh Schedule.

    Frequent Refresh tab

  5. In the Frequent Data Refresh Schedule tab, in Refresh Frequency (in hours), select the number of hours after which you want the data refresh to happen.
  6. Enter a start time in Start time.
  7. In Last refresh date, view the date when the data was last refreshed.
  8. If you've enabled the "Frequent Refresh Tables" preview feature and saved your selection of the warehouse tables, then select the Perform frequent refresh for selected warehouse tables toggle. Click the "warehouse tables" hyperlink to view or update your selection of the warehouse tables in the Frequent Refresh Tables tab.
  9. Select the check boxes for the functional areas with the data you want refreshed at the specified interval.
  10. Click Save.

Schedule Periodic Full Reload of Functional Area Data

You can schedule a periodic full reload of the applicable functional areas to ensure that all the data in these functional areas is completely reloaded.

For Oracle NetSuite Analytics Warehouse, you can schedule a periodic full reload for functional areas such as Financials and Sales. When you select the functional areas for full reload, ensure that you select the dependent functional areas as well.

WARNING:

When you schedule a full reload, the incremental loads take a longer time; depending on the initial extract date, the incremental data load may take days to complete the extraction.
You may want to fully reload the data for a functional area in these sample scenarios:
  • The Finance department wants to do a full reload of the general ledger (GL) data at the beginning of the month to make sure all data is fresh and clean for the monthly GL entries and for analytics and report generation.
  • At the beginning of every quarter, all data related to purchase orders is updated with returns, refunds, and any adjustments. In this scenario, revenue recognition needs to take all the latest purchase order data to compute all numbers and create accurate reports. Hence, a full reload of the order data is required on the first day of every fiscal quarter, which can differ from the calendar quarter.
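Because a fiscal quarter can differ from the calendar quarter, the reload dates in the second scenario depend on the fiscal year's starting month. This sketch (the February start is an assumed example, not from this document) computes the first day of each fiscal quarter:

```python
from datetime import date

# Sketch: first day of each fiscal quarter for a fiscal year starting in an
# arbitrary month; months before the fiscal start fall in the next calendar year.
def fiscal_quarter_starts(year: int, fiscal_start_month: int):
    months = [((fiscal_start_month - 1 + 3 * q) % 12) + 1 for q in range(4)]
    return [date(year if m >= fiscal_start_month else year + 1, m, 1) for m in months]

# Assumed example: fiscal year starting in February.
print(fiscal_quarter_starts(2024, 2))
# -> [2024-02-01, 2024-05-01, 2024-08-01, 2024-11-01]
```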
You can check the refresh summary in two ways:
  • View details of the requests on the Request History page.
  • View the DW_WH_REFRESH_SUMMARY data warehouse table.

Ensure that Scheduled Full Data Reload is enabled in Pipeline Features under the Generally Available Features tab on the Enable Features page prior to scheduling periodic full reload. See Enable Generally Available Features.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Pipeline Settings.
  4. On the Pipeline Settings page, click Warehouse Full Reload Schedule.

    Warehouse Full Reload Schedule tab

  5. In the Warehouse Full Reload Schedule tab, in Full Reload Frequency, select the frequency type such as weekly, monthly, or quarterly.
  6. Based on the selected frequency type, select the specific day of the week, month, or the first month of the quarter.
  7. Select the check boxes for the functional areas whose data you want reloaded fully at the specified period.
  8. Click Save.

Schedule Frequent Refreshes of Warehouse Tables

You can select warehouse data tables that are part of functional areas, descriptive flexfields custom configurations, and data augmentations for frequent data refresh ahead of other data in the regularly scheduled batch.

For a table to be eligible for a frequent data refresh, ensure that its supporting functional areas, data augmentations, and descriptive flexfields custom configurations are activated. The custom data configurations activated prior to release 23.R4 won't be available automatically. You must regenerate and redeploy them to schedule for frequent refresh. After you select the warehouse tables for frequent data refresh and save your selection, the selected tables are available as "Warehouse Tables" under the Frequent Refresh tab. You can set the frequent refresh schedule for these tables in the Frequent Refresh tab. See Schedule Frequent Refreshes of Data.

Ensure that Frequent Refresh Tables is enabled in Pipeline Features under the Preview Features tab on the Enable Features page prior to scheduling frequent refreshes of the tables. See Make Preview Features Available.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Pipeline Settings.
  4. On the Pipeline Settings page, click Frequent Refresh Tables.

    Frequent Refresh Tables tab

  5. In the Frequent Refresh Tables tab, select the check boxes for the applicable warehouse tables and click Save.

Prioritize Datasets for Incremental Refresh (Preview)

As a functional administrator, you can select datasets to prioritize for incremental refresh over others. For example, at the end of the quarter, you want to prioritize some of the datasets that have bookings data over other datasets.

You can search for and choose datasets based on the functional areas. The incremental refresh process automatically selects the dependent datasets. For example, if you select DW_GL_JOURNAL_CF, then the incremental refresh process automatically pulls in DW_GL_JOURNAL_F. The incremental refresh process runs the priority dataset refresh first from end-to-end, followed by the regular incremental refresh. The objects processed in the priority dataset refresh are reprocessed in the regular incremental refresh. The prioritized datasets are automatically available for frequent data refresh. See Schedule Frequent Refreshes of Data.
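The automatic selection of dependent datasets described above behaves like a transitive closure over a dependency map, as in this sketch (the dependency map itself is illustrative; only the DW_GL_JOURNAL_CF to DW_GL_JOURNAL_F pair comes from the example in the text):

```python
# Sketch of how selecting a dataset pulls in everything it transitively
# depends on; the map is illustrative, not the warehouse's real dependency graph.
def closure(selected, depends_on):
    """Return the selected datasets plus all their transitive dependencies."""
    result, stack = set(), list(selected)
    while stack:
        ds = stack.pop()
        if ds not in result:
            result.add(ds)
            stack.extend(depends_on.get(ds, []))
    return result

deps = {"DW_GL_JOURNAL_CF": ["DW_GL_JOURNAL_F"]}
print(sorted(closure({"DW_GL_JOURNAL_CF"}, deps)))
# -> ['DW_GL_JOURNAL_CF', 'DW_GL_JOURNAL_F']
```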
  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Pipeline Settings.
  4. On the Pipeline Settings page, click the Prioritized Data Refresh tab.

    Prioritized Data Refresh tab on the Pipeline Settings page

  5. In the Prioritized Data Refresh section, select the datasets that you want for the priority refresh.
  6. Click Save.

Reset and Reload the Data Source

As a functional administrator, you can reset and reload the data source for several reasons.

For example:
  • You changed the data source and want to reset the present data source to its original state.
  • There is a data issue with the source system and you need to restart the entire pipeline.
  • You initially set up the connection and the pipeline to test some things and now want to start from the beginning.

You can reset and reload data for all the activated functional areas, augmentations, and custom data configurations. This request may take a long time because all data must be discarded and fetched again. Other requests, such as incremental and full loads, won't start until this request completes.
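Because a reset-and-reload request holds back every other pipeline request until it finishes, the queueing behavior can be pictured as follows. This is a hypothetical model written for illustration; the request names and the `run_order` helper are not a real product API.

```python
# Illustrative sketch only: models the rule that a reset-and-reload request
# runs before, and blocks, incremental and full loads. Names are hypothetical.

BLOCKING = {"RESET_AND_RELOAD"}  # requests that must complete before others start

def run_order(requests):
    """Order pending requests so any reset-and-reload completes first;
    incremental and full loads wait until it finishes."""
    resets = [r for r in requests if r in BLOCKING]
    loads = [r for r in requests if r not in BLOCKING]
    return resets + loads
```

For example, with `["INCREMENTAL_LOAD", "RESET_AND_RELOAD", "FULL_LOAD"]` pending, the reset-and-reload runs first and both loads wait behind it.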

Ensure that Reset and Reload the Data Source is enabled in Pipeline Features under the Generally Available Features tab on the Enable Features page prior to resetting and reloading the data source. See Enable Generally Available Features.

  1. Sign in to your service.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, under Configurations, click Pipeline Settings.
  4. On the Pipeline Settings page, click Actions (Actions menu ellipsis icon) and then click Reset & Reload.

    Reset and Reload option available from the Action icon

  5. Review the warning message, provide the details, and then click Submit.

    The Reset and Reload dialog

Extend Data with Custom Data Configurations

As a functional administrator, you can use the available custom data configurations to extend your data for detailed and varied analytics.

About Extending Data with Custom Data Configurations

Custom data configurations are prebuilt templates that enable you to create custom functional areas with the data that you bring into the Autonomous Data Warehouse. You later use these custom functional areas to augment the data from the source or enrich the data available in the Autonomous Data Warehouse.

When you start using a custom data configuration for the first time and deploy it, Oracle NetSuite Analytics Warehouse assigns the custom data configuration a version number. You can then edit the deployed version, make changes, and save it. This saved version then gets a new version number. While you're changing the definition, the deployed version continues to fetch new data. An example of a currently available custom data configuration is NetSuite Account Analysis.

NetSuite Account Analysis is a custom data configuration that enables you to build custom content by choosing data across 42 transaction types. It also supports posting custom transactions and custom attributes based on the custom attribute mapper. This feature supports granularity at the transaction line level. After the custom data configuration is deployed, the created content manifests as a subject area. This feature eliminates the need to go to the raw tables and link them using SQL to build cross-transaction-type content for those 42 transaction types.
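To picture what the subject area saves you: without it, you would stitch transaction lines together across transaction types yourself. The sketch below is purely illustrative; the record shapes and the `account_analysis_rows` helper are invented for this example and don't reflect actual warehouse table structures.

```python
# Illustrative sketch only: the kind of cross-transaction-type stitching the
# Account Analysis subject area replaces. Record shapes are hypothetical.

def account_analysis_rows(transactions, include_types):
    """Flatten transaction lines across the selected transaction types,
    keeping granularity at the transaction line level."""
    rows = []
    for txn in transactions:
        if txn["type"] not in include_types:
            continue
        for line in txn["lines"]:
            rows.append({"txn_id": txn["id"], "type": txn["type"], **line})
    return rows
```

In this model, each output row is one transaction line tagged with its transaction type, which is the line-level granularity the subject area provides out of the box.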

Prerequisites for Setting up NetSuite Account Analysis

Ensure these prerequisites are met:

  • Prior to configuring the NetSuite Account Analysis template, ensure that you've activated at least one functional area, such as Sales.
  • If you want to use the custom transactions and custom attributes while creating the account analysis subject area, contact Oracle Support to have the custom transactions and custom attributes enabled in your instance.

See Activate a Data Pipeline for a Functional Area.

Enable NetSuite Account Analysis (Preview)

Prior to configuring the NetSuite Account Analysis template to build the account analysis reports in Oracle NetSuite Analytics Warehouse, you must enable it.

  1. Sign in to your service as a functional administrator.
  2. In Oracle NetSuite Analytics Warehouse Console, click Enable Features, and then click Preview Features.
  3. Under the Preview Features tab, select the toggle to enable NetSuite Account Analysis.

Set up NetSuite Account Analysis

Set up the NetSuite Account Analysis subject area to build the account analysis reports in Oracle NetSuite Analytics Warehouse.

While configuring the custom account analysis subject area, note the value in the Usage column for each of the available columns on the NetSuite Account Analysis page:
  • Certain columns are mandatory for the account analysis report; these columns are selected by default and you can't deselect them.
  • Certain columns are recommended and selected by default, but you can deselect them based on your requirements for the account analysis reports.
  • Certain columns are optional, and you can select them based on your requirements for the account analysis reports.
  • Certain columns are related to custom attributes and custom transactions. They are available for selection if you've completed the prerequisite task. See Prerequisites for Setting up NetSuite Account Analysis. After completing the prerequisite task, you must select the Include custom transaction applicable for Account Analysis check box to have the columns from custom transactions available for selection.

After creating the custom account analysis subject area, you can create the account analysis reports based on your business requirements.

  1. Sign in to your service as a functional administrator.
  2. In Oracle NetSuite Analytics Warehouse Console, click Data Configuration, and then on the Data Configuration page, click Custom Data Configurations.
  3. On the Custom Data Configurations page, click Create and then select NetSuite Account Analysis.
  4. On the NetSuite Account Analysis page, select the data elements that you want to include in the custom account analysis template, and then click Save and Exit.
  5. On the Custom Data Configurations page, click Actions for the NetSuite Account Analysis that you created, and then click Generate.
    You can monitor the status of the Generate job on the Request History page.
  6. On the Custom Data Configurations page, when the status of the NetSuite Account Analysis that you created is Generated, click Actions and select Deploy.
After you've successfully deployed the NetSuite Account Analysis subject area, on the Custom Data Configurations page, you can perform these actions by clicking Actions for the NetSuite Account Analysis that you deployed:
  • Undeploy
  • Reload Data
  • Edit
  • Deactivate
  • Refresh Data
  • Delete