Running Integrations

Running an integration entails calling the jobs REST API with the INTEGRATION job type to execute an integration or data load rule, with control over how periods are processed and which source filters apply.

The INTEGRATION job type is an enhanced version of the DATARULE job type (see Running Data Rules in Data Management). It is recommended that you use the INTEGRATION job type for future integration jobs.

The INTEGRATION job type supports running integrations and data load rules based on:

  • period names provided to Planning
  • Global POVs
  • Planning substitution variables
  • source filters selected as runtime parameters
  • target options selected as runtime parameters
  • existing period ranges

It also supports overriding the source filters and target options at runtime without modifying the integration definition.

Prerequisites:

You must have the required privileges to execute a specific data rule/integration.

REST Resource

/aif/rest/{api_version}/jobs

Required Roles

Service Administrator, Power User

Request

Supported Media Types: application/json

Method:

POST

Payload:

{
             "jobType":"INTEGRATION",
             "jobName":"GLDATA",
             "periodName":"{Mar-20}",
             "importMode":"Replace",
             "exportMode":"Merge",
             "fileName":"inbox/GLBALANCE.txt"
}
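Assuming basic authentication and a placeholder host name (both hypothetical, not part of this reference), the POST above can be sketched in Python as follows. Only the payload-building function is definitive here; the request construction is illustrative:

```python
import json
from urllib import request

# Hypothetical instance URL; replace with your EPM Cloud service URL.
BASE_URL = "https://example-epm.oraclecloud.com"


def build_integration_payload(job_name, period_name, import_mode,
                              export_mode, file_name=None):
    """Assemble the JSON payload for an INTEGRATION job."""
    payload = {
        "jobType": "INTEGRATION",
        "jobName": job_name,
        "periodName": period_name,
        "importMode": import_mode,
        "exportMode": export_mode,
    }
    if file_name:  # fileName applies only to native file-based loads
        payload["fileName"] = file_name
    return payload


payload = build_integration_payload("GLDATA", "{Mar-20}", "Replace",
                                    "Merge", "inbox/GLBALANCE.txt")
body = json.dumps(payload).encode("utf-8")
req = request.Request(BASE_URL + "/aif/rest/V1/jobs", data=body,
                      method="POST",
                      headers={"Content-Type": "application/json"})
# request.urlopen(req)  # uncomment and add authentication to submit the job
```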

REST Payload Description

The following table summarizes the REST payload.

Table 15-6 Parameters

Name Description Type Required Default
api_version V1 Path Yes None
jobType INTEGRATION JSON payload Yes None
jobName The name of the integration defined in Data Integration. You should enclose the rule or name in quotation marks if it contains a space. JSON payload Yes None
periodName

Name of the period(s), enclosed in curly brackets ({}).

periodName options include:

  • Single Period—Refers to the Data Integration period name for a single period. The parameter is the period name as defined in the period mapping.

  • Multi-Period—Refers to a multi-period load. The parameter is {Month-Year}{Month-Year}. For example, {Jan-20}{Mar-20} refers to a multi-period load from Jan-20 to Mar-20.

  • Planning Period Name—Refers to a Planning period name. The parameter is {Month#Year}, for example, {Jan#FY20}{Mar#FY20}. Using this convention, the client executing the API does not need to know the Data Integration period names. Instead you specify the Planning member names for the Year and Scenario dimensions.

    This parameter is supported in the Planning, Tax Reporting, and Financial Consolidation and Close business processes. It is functional both for your service applications and for cloud deployments that draw on on-premises data sources.

    This parameter is useful if triggered from an Oracle Enterprise Performance Management Cloud Groovy script by capturing the Year and Period member names. The application period mapping or global period mapping must exist with the Year and Month in the target values of the period mapping.

  • Planning Substitution Variable—This is an extension of the previous Planning period name parameter whereby a substitution variable can be specified instead of the actual Year/Month member names.

    The convention is {Month#&CurYr}{&FcstMonth#&CurYr}; for example, {Jan#&CurYr}{&FcstMonth#&CurYr}.

    A combination of both actual member names as well as substitution variables is supported.

    This parameter is supported in the Planning, Tax Reporting, and Financial Consolidation and Close business processes. It is functional both for your service applications and for cloud deployments that draw on on-premises data sources.

    The application period mapping or global period mapping must exist in the Data Integration of the instance where the API is executed, with the Year and Month in the target values of the period mapping. In this case, Year and Month refer to the current value of the substitution variable during execution.

  • Global POV—Executes the data load for the Global POV period. Use the format {GLOBAL_POV}.

Note:

If you use a period naming convention other than the parameters described above, you get an "Invalid Input – HTTP 400" error message.
JSON payload Yes None
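The periodName conventions above are plain string formats, so they can be assembled with small helpers. A minimal sketch (the function names are illustrative, not part of the API):

```python
def single_period(month_year):
    """Single Data Integration period, e.g. {Mar-20}."""
    return "{%s}" % month_year


def period_range(start, end):
    """Multi-period load, e.g. {Jan-20}{Mar-20}."""
    return "{%s}{%s}" % (start, end)


def planning_period(month, year):
    """Planning period name, e.g. {Jan#FY20}. Either part may be a
    substitution variable such as &CurYr or &FcstMonth."""
    return "{%s#%s}" % (month, year)
```

For example, `planning_period("Jan", "&CurYr") + planning_period("&FcstMonth", "&CurYr")` produces the substitution-variable range shown above.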
importMode Determines how the data is imported into Data Integration.

Acceptable values are:

  • Append—Add to the existing POV data in Data Integration.

  • Replace—Delete the POV data and replace it with the data from the file.

  • Map and Validate—Skip importing the data, but re-process the data with updated Mappings and Logic Accounts.

  • No Import—Skip data import into Data Integration staging table.

  • Direct—Use the direct load method to extract data from your on-premises data sources and load it directly to the EPM Cloud using the EPM Integration Agent. The other import modes are not applicable for the direct load.

JSON payload Yes None
exportMode Determines how the data is exported from Data Integration to the target application.

Acceptable values for Planning business processes are:

  • Merge—Merge the data in the Data Integration staging table with the existing Planning data.

  • Replace—Clear the POV data and replace it with data in the Data Integration staging table. The data is cleared for Scenario, Version, Year, Period, and Entity dimensions.

  • Accumulate—Add the data in the Data Integration staging table to Planning.

  • Subtract—Subtract the data in the Data Integration staging table from the existing Planning data.

  • No Export—Skip data export from Data Integration to Planning.

Acceptable values for Financial Consolidation and Close and Tax Reporting are:

  • Merge—Merge the data in the staging table with the data in the Financial Consolidation and Close and Tax Reporting application.

    If data already exists in the application, the system overwrites the existing data with the new data from the load file. If data does not exist, the new data is created.

  • Replace—Delete the POV data and replace it with the data from the file.

  • No Export—Skip the data export from Data Integration to Financial Consolidation and Close or Tax Reporting.

If you use a regular data load for the Numeric Data Only and Numeric Data and All Data Type load methods, note the export mode requirements based on the load method:

  • For a Numeric Data Only load method, Accumulate and Subtract are the applicable export modes.

  • For a Numeric Data and All Data Type load method, Merge, Replace, and No Export are the applicable export modes.

If you use the direct data load to extract data from your on-premises data sources and then load the data directly to the EPM Cloud using the EPM Integration Agent, note the export mode requirements based on the load method:

  • For a Numeric Data Only load method, Merge, Accumulate, and Subtract are the applicable export modes.

  • For a Numeric Data and All Data Type load method, only Replace is the applicable export mode.

  • The No Export mode is not applicable for the Numeric Data Only and Numeric Data and All Data Type load methods.

When running a Quick Mode load, valid export modes are:

  • Replace
  • Merge
  • Accumulate
JSON payload Yes None
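The export-mode rules above depend on both the load type (regular, direct, or Quick Mode) and the load method. A client-side lookup that mirrors those rules can be sketched as follows; this is illustrative only, since the service performs its own validation:

```python
# Allowed export modes keyed by (load type, load method), taken from
# the rules described above. Quick Mode is independent of load method.
ALLOWED_EXPORT_MODES = {
    ("regular", "Numeric Data Only"): {"Accumulate", "Subtract"},
    ("regular", "Numeric Data and All Data Type"): {"Merge", "Replace", "No Export"},
    ("direct", "Numeric Data Only"): {"Merge", "Accumulate", "Subtract"},
    ("direct", "Numeric Data and All Data Type"): {"Replace"},
    ("quick", None): {"Replace", "Merge", "Accumulate"},
}


def is_valid_export_mode(load_type, load_method, export_mode):
    """Return True when exportMode is acceptable for the load type/method."""
    return export_mode in ALLOWED_EXPORT_MODES.get((load_type, load_method), set())
```

For example, `is_valid_export_mode("direct", "Numeric Data and All Data Type", "Merge")` returns False, because only Replace is applicable for that combination.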
fileName

The fileName parameter is applicable only for native file-based data loads and is ignored if specified for other loads.

The file name is optional. If you do not specify a file name, this API imports the data contained in the file name specified in the data load rule. The data file must already reside in the inbox before you execute the data load rule, for example, inbox/GLBALANCES.txt.

You can also upload a file to folders accessible from the Application Inbox/Outbox Explorer using a file name. Reference the files in this folder using the format #epminbox/<filename>.

JSON payload No None
sourceFilters

A parameter used to update the source filters defined for the data load rule or integration.

File-based applications—You cannot use the sourceFilters parameter for file-based applications. If you use this parameter with a file-based application, you get the HTTP 400 error message: "EPMFDM-ERROR: Data Load Rule does not support sourceFilters for File based loads."

Data Source-based applications—Replaces the pre-defined source filter parameters at the rule level for a data load rule/integration, or specifies them at runtime (if not pre-defined at the application level), when a data source is the type of the source application.

Each filter name and its value should be sent as a key/value pair in the nested JSON object. The parameter name and value should be the English display names as shown in the user interface. Do not specify internal codes for parameter names and values that use LOV validation. LOV validation is performed for parameters that have a restricted list of values.

Not all filter names are mandatory. Only filter names specified in the nested JSON object are replaced or set at runtime. The remaining filters are taken from the application/rule definition.

Oracle Essbase/Planning/Oracle General Ledger-based applications—Replaces the already defined dimension source filters for a data load rule/integration at runtime when Essbase, Planning, or an Oracle General Ledger balance type is the source application.

Each dimension name and its value should be sent as a key/value pair in the nested JSON object.

Replacing or setting the dimension filters at run time is only supported for dimensions already having a filter defined in the data load rule/integration definition.

Not all filter names are mandatory. Only dimension names specified in the nested JSON object are replaced or set at runtime. The remaining filters are taken from the data rule/integration definition.

Dimension filters are supported for all dimensions including the "Scenario" dimension. The "Year" and "Period" dimensions are not supported as filters because they are driven by the POV range.

All other source applications are not supported.

JSON payload No None
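The override semantics described above (runtime filters replace only the filters they name; all others keep their defined values) amount to a simple merge. A client-side illustration, not the service implementation:

```python
def apply_runtime_filters(rule_filters, runtime_filters):
    """Merge runtime sourceFilters over the rule definition.

    Only filters named at runtime override the rule-level values;
    the remaining filters keep their defined values.
    """
    merged = dict(rule_filters)
    merged.update(runtime_filters or {})
    return merged
```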
targetOptions

A parameter used to update the target options defined for the data load rule or integration.

Data Export based applications—Replaces the pre-defined target option parameters at the rule level for a data load rule/integration or specifies them at runtime (if not pre-defined at the rule level) when the Data Export is the type of target application.

Each option name and its value should be sent as a key/value pair in the nested JSON object. The parameter name and value should be the English display names as shown in the user interface. Do not specify internal codes for parameter names and values that use LOV validation. LOV validation is performed for parameters that have a restricted list of values.

Not all option names are required. Only option names specified in the nested JSON object are replaced or set at runtime. The remaining options are taken from the application/rule definition.

Planning target application—Replaces the following options at runtime.

  • Refresh Database—Yes/No

  • Dimension Name—Specify a dimension name for a custom dimension load rule. In this way, the same rule can be used to build multiple dimensions at runtime.

  • Purge Data File—Yes/No

Other target applications are not supported.

JSON payload No None
executionMode

The executionMode parameter is applicable only for Quick Mode integrations.

Available options:

  • SYNC—When executionMode is SYNC, then the REST API call submits and waits until the integration has completed (that is, reached either a successful, failed, or warning state) before returning a response.

  • ASYNC—When executionMode is ASYNC, then the REST API call returns a response immediately without waiting for the integration to complete. Usually the integration is in a running state when the REST API response is received.

For Quick Mode integrations, executionMode is a mandatory parameter and cannot be blank.

JSON payload Yes None
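With ASYNC, the caller typically polls the job's status until it leaves the running state. A minimal sketch of that loop, assuming a caller-supplied fetch_status callable and illustrative status strings (how the callable queries the service, for example a GET on the returned job URL, is up to the caller):

```python
import time


def wait_for_completion(fetch_status, poll_seconds=5.0, timeout_seconds=600.0):
    """Poll fetch_status() until the job leaves the RUNNING state.

    fetch_status is a caller-supplied callable returning a status
    string; the terminal value (success, failed, or warning) is
    returned to the caller.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = fetch_status()
        if status != "RUNNING":
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("integration did not complete in time")
```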

Sample REST Payloads

Below are sample payloads based on the source application type.

File Based Loads

In this example, the integration loads a text file through the native file adapter:

{
"jobType":"INTEGRATION",
"jobName":"ERPDATA",
"periodName":"{Jan-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"fileName":"inbox/TestData.txt"
}

Planning Applications

In these examples, the source is an Essbase/Planning based application. The supported applications include:

  • Planning modules
  • Reporting cubes (plan types) of Planning
  • Financial Consolidation and Close
  • Tax Reporting
  • Profitability and Cost Management
  • Oracle ERP Cloud - Oracle General Ledger Balances Cube

Example of Planning to Financial Consolidation and Close Data Synchronization

{
"jobType":"INTEGRATION",
"jobName":"PBCStoFCCS",
"periodName":"{Jan-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{
	"Account":"@RELATIVE(Acc1,0)",
	"Entity":" @CHILDREN(Europe)",
	"Scenario":"Actual"
}
}

Example of Planning to Data Export to File

{
"jobType":"INTEGRATION",
"jobName":"PBCStoFCCS",
"periodName":"{Jan-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{
	"Account":"@RELATIVE(Acc1,0)",
	"Entity":" @CHILDREN(Europe)",
	"Scenario":"Actual"
},
"targetOptions":{
	"Download File Name":"PlanningToFile.csv",
	"Column Delimiter":",",
	"Include Header":"Yes"
}
}

Data Source Applications

Example of Incremental File adapter:

{
"jobType":"INTEGRATION",
"jobName":"MyIncrementalFileLoad",
"periodName":"{Jan-20}{Mar-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{"Source File":"File1.txt"}
}

Example of the Oracle NetSuite adapter:

{
"jobType":"INTEGRATION",
"jobName":"NetsuiteLoad",
"periodName":"{Jan-20}{Mar-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{
	 "Postingperiod":"This Fiscal Quarter",
	 "Mainline":"True"
}
}

Example of the Oracle NetSuite adapter exporting to Planning dimensions (metadata rule):

{
"jobType":"INTEGRATION",
"jobName":"NetsuiteMetadataLoad",
"periodName":"{Jan-20}{Mar-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"targetOptions":{
	 "Refresh Database":"Yes",
	 "Dimension Name":"Product"
}
}

Example of Oracle ERP Cloud (Payables Transactions):

{
"jobType":"INTEGRATION",
"jobName":"Payables1Load",
"periodName":"{Jan-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{
	"Invoice Type":"Credit Memo",
	"Cancelled Invoices Only":"Yes"
}
}

Example of Quick Mode integration:

{
    "jobType":"INTEGRATION",
    "jobName":"QuickMode_LOC1_DL1",
    "periodName":"{Jan-17}",
    "importMode":"Direct",
    "exportMode":"Merge",
    "executionMode":"ASYNC"
}

Response

Supported Media Types: application/json