Integration Job Type

Use the INTEGRATION job type with the jobs REST API to run an integration or data load rule, controlling how periods are processed and which source filters are applied.

The INTEGRATION job type is an enhanced version of the DATARULE job type (see Running Data Rules). It is recommended that you use the INTEGRATION job type for future integration jobs.

The INTEGRATION job type supports running integrations/data load rules based on:

  • period names provided to Planning
  • global POVs
  • Planning substitution variables
  • source filters selected as runtime parameters
  • existing period ranges

It also supports overriding the source filters at runtime without modifying the integration definition.

Prerequisites:

You must have the required privileges to execute a specific data rule/integration.

REST Resource

/aif/rest/{api_version}/jobs

Method:

POST

Payload:

{
             "jobType":"INTEGRATION",
             "jobName":"GLDATA",
             "periodName":"{Mar-20}",
             "importMode":"Replace",
             "exportMode":"Merge",
             "fileName":"inbox/GLBALANCE.txt"
}
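A minimal Python sketch of building this payload and POSTing it to the jobs resource follows. The host URL, user name, and password are placeholders, and error handling is omitted; `submit_job` is defined but not invoked here.

```python
import base64
import json
import urllib.request

def build_integration_payload(job_name, period_name, import_mode, export_mode,
                              file_name=None, source_filters=None):
    """Assemble the JSON payload for an INTEGRATION job."""
    payload = {
        "jobType": "INTEGRATION",
        "jobName": job_name,
        "periodName": period_name,
        "importMode": import_mode,
        "exportMode": export_mode,
    }
    # Optional parameters are included only when supplied.
    if file_name is not None:
        payload["fileName"] = file_name
    if source_filters is not None:
        payload["sourceFilters"] = source_filters
    return payload

def submit_job(base_url, user, password, payload, api_version="V1"):
    """POST the payload to /aif/rest/{api_version}/jobs (not executed here)."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        f"{base_url}/aif/rest/{api_version}/jobs",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Basic {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_integration_payload("GLDATA", "{Mar-20}",
                                    "Replace", "Merge",
                                    file_name="inbox/GLBALANCE.txt")
print(json.dumps(payload, indent=2))
```

The payload built here matches the sample shown above; swap in your service URL and credentials before calling `submit_job`.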

REST Payload Description

The following table summarizes the REST payload.

Table 10-7 Parameters

api_version

V1

Type: Path; Required: Yes; Default: None

jobType

INTEGRATION

Type: JSON payload; Required: Yes; Default: None

jobName

The name of the integration or data load rule defined in Data Management. Enclose the rule name in quotation marks if it contains a space.

Type: JSON payload; Required: Yes; Default: None
periodName

Name of the period enclosed in curly brackets ({}).

periodName includes:

  • Single Period—Refers to the Data Management period name for a single period. The parameter is the period name as defined in the period mapping.

  • Multi-Period—Refers to a multi-period load. The parameter is {Month-Year}{Month-Year}. For example, {Jan-20}{Mar-20} refers to a multi-period load from Jan-20 to Mar-20.

  • Planning Period Name—Refers to a Planning period name. The parameter is {Month#Year}, for example, {Jan#FY20}{Mar#FY20}. Using this convention, the client executing the API does not need to know the Data Management period names. Instead, you specify the Planning member names for the Period and Year dimensions.

    This parameter is supported in the Planning, Tax Reporting, and Financial Consolidation and Close business processes. It is functional for both your primary service applications and cloud deployments derived from on-premises data sources.

    The application period mapping or global period mapping must already exist with the Year and Month in the target values of the period mapping.

    This parameter is useful when the integration is triggered from an Oracle Enterprise Performance Management Cloud Groovy script that captures the Year and Period member names.

  • Planning Substitution Variable—This is an extension of the previous Planning period name parameter whereby a substitution variable can be specified instead of the actual Year/Month member names.

    The convention is {Month#&CurYr}{&FcstMonth#&CurYr}; for example, {Jan#&CurYr}{&FcstMonth#&CurYr}.

    A combination of both actual member names as well as substitution variables is supported.

    This parameter is supported in the Planning, Tax Reporting, and Financial Consolidation and Close business processes. It is functional for both your primary service applications and cloud deployments derived from on-premises data sources.

    The application period mapping or global period mapping must exist in the Data Management of the instance where the API is executed, with the Year and Month in the target values of the period mapping. In this case, Year and Month refer to the current value of the substitution variable during execution.

  • Global POV—Executes the data load for the Global POV period. Use the format {GLOBAL_POV}.

Note:

If you use a period naming parameter other than those described above, you get an "Invalid Input – HTTP 400" error message.
Type: JSON payload; Required: Yes; Default: None
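The period naming conventions above are plain string formats. The helpers below (illustrative names, not part of the API) show how each convention is built:

```python
def single_period(period):
    """Single Data Management period name, as defined in the period mapping."""
    return "{" + period + "}"

def period_range(start, end):
    """Multi-period load covering start through end."""
    return "{" + start + "}{" + end + "}"

def planning_period(month, year):
    """Planning member names for the Period (month) and Year dimensions."""
    return "{" + month + "#" + year + "}"

# Substitution variables drop straight into the same {Month#Year} format.
fcst_range = planning_period("Jan", "&CurYr") + planning_period("&FcstMonth", "&CurYr")

print(single_period("Mar-20"))           # {Mar-20}
print(period_range("Jan-20", "Mar-20"))  # {Jan-20}{Mar-20}
print(planning_period("Jan", "FY20"))    # {Jan#FY20}
print(fcst_range)                        # {Jan#&CurYr}{&FcstMonth#&CurYr}
```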
importMode

Determines how the data is imported into Data Management.

Acceptable values are:

  • Append—Add to the existing POV data in Data Management.

  • Replace—Delete the POV data and replace it with the data from the file.

  • Map and Validate—Skip importing the data, but re-process the data with updated Mappings and Logic Accounts.

  • No Import—Skip the data import into the Data Management staging table.

Type: JSON payload; Required: Yes; Default: None
exportMode

Determines how the data is exported from Data Management to the target application.

Acceptable values for Planning business processes are:

  • Merge—Merge the data in the Data Management staging table with the existing Planning data.

  • Replace—Clear the POV data and replace it with data in the Data Management staging table. The data is cleared for Scenario, Version, Year, Period, and Entity dimensions.

  • Accumulate—Add the data in the Data Management staging table to Planning.

  • Subtract—Subtract the data in the Data Management staging table from the existing Planning data.

  • No Export—Skip data export from Data Management to Planning.

Acceptable values for Financial Consolidation and Close and Tax Reporting are:

  • Merge—Merge the data in the staging table with the data in the Financial Consolidation and Close and Tax Reporting application.

    If data already exists in the application, the system overwrites the existing data with the new data from the load file. If data does not exist, the new data is created.

  • Replace—Delete the POV data and replace it with the data from the file.

  • No Export—Skip the data export from Data Management to Financial Consolidation and Close or Tax Reporting.

Type: JSON payload; Required: Yes; Default: None
fileName

The fileName parameter applies only to native file-based data loads and is ignored if specified for other load types.

The file name is optional. If you do not specify a file name, the API imports the data from the file specified in the data load rule. The data file must already reside in the inbox before you run the data load rule; for example, inbox/GLBALANCES.txt.

You can also upload a file to a folder accessible from the Application Inbox/Outbox Explorer. Reference files in this folder using the format #epminbox/<filename>.

Type: JSON payload; Required: No; Default: None
sourceFilters

A parameter used to update the source filters defined for the data load rule or integration.

File-based applications—You cannot use the sourceFilters parameter for file-based applications. If you use this parameter with a file-based application, you get the HTTP 400 error message: "EPMFDM-ERROR: Data Load Rule does not support sourceFilters for File based loads."

Data Source based applications—Replaces the pre-defined source filter parameters at the application/rule level for a data load rule/integration, or specifies them at runtime (if not pre-defined at the application level), when a data source is the source application type.

Each filter name and its value should be sent as a key/value pair in the nested JSON object. The parameter names and values must be the English display names as seen in the user interface. Do not specify internal codes for parameter names or for values that use LOV validation. LOV validation is performed for parameters that have a restricted list of values.

Not all filter names are mandatory. Only the filter names specified in the nested JSON object are replaced or set at runtime; the remaining filters are picked up from the application/rule definition.

Essbase/Planning/Oracle General Ledger-based applications—Replaces the dimension source filters already defined for a data load rule/integration at runtime when an Essbase, Planning, or Oracle General Ledger balance-type application is the source.

Each dimension name and its value should be sent as a key/value pair in the nested JSON object.

Replacing or setting dimension filters at runtime is supported only for dimensions that already have a filter defined in the data load rule/integration definition.

Not all filter names are mandatory. Only the dimension names specified in the nested JSON object are replaced or set at runtime; the remaining filters are picked up from the data rule/integration definition.

Dimension filters are supported for all dimensions including the "Scenario" dimension. The "Year" and "Period" dimensions are not supported as filters because they are driven by the POV range.

All other source applications are not supported.

Type: JSON payload; Required: No; Default: None
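The replace-only-what-you-send behavior described for sourceFilters can be sketched as a dictionary merge. The filter values below are hypothetical, for illustration only:

```python
# Source filters defined on the data load rule/integration.
rule_filters = {
    "Account": "@RELATIVE(Acc1,0)",
    "Entity": "@CHILDREN(Europe)",
    "Scenario": "Actual",
}

# Only the filters named in the payload's sourceFilters object are
# overridden at runtime; the rest come from the rule definition.
runtime_filters = {"Scenario": "Plan"}

effective = {**rule_filters, **runtime_filters}
print(effective)
```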

Sample REST Payloads

The following are sample payloads based on the source application type.

File Based Loads

In this example, the source is a text file running an integration through a native File adapter:

{
"jobType":"INTEGRATION",
"jobName":"ERPDATA",
"periodName":"{Jan-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"fileName":"inbox/TestData.txt"
}

Planning Applications

In these examples, the source is an Essbase/Planning based application. The supported applications include:

  • Planning modules
  • Reporting cubes (plan types) of Planning
  • Financial Consolidation and Close
  • Tax Reporting
  • Profitability and Cost Management
  • Oracle ERP Cloud - Oracle General Ledger Balances Cube

Example of Planning to Financial Consolidation and Close Data Synchronization

{
"jobType":"INTEGRATION",
"jobName":"PBCStoFCCS",
"periodName":"{Jan-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{
	"Account":"@RELATIVE(Acc1,0)",
	"Entity":" @CHILDREN(Europe)",
	"Scenario":"Actual"
}
}

Data Source Applications

Example of Incremental File adapter:

{
"jobType":"INTEGRATION",
"jobName":"MyIncrementalFileLoad",
"periodName":"{Jan-20}{Mar-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{"Source File":"File1.txt"}
}

Example of Netsuite adapter:

{
"jobType":"INTEGRATION",
"jobName":"NetsuiteLoad",
"periodName":"{Jan-20}{Mar-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{
	 "Postingperiod":"This Fiscal Quarter",
	 "Mainline":"True"
}
}

Example of Oracle ERP Cloud (Payables Transactions):

{
"jobType":"INTEGRATION",
"jobName":"Payables1Load",
"periodName":"{Jan-20}",
"importMode":"REPLACE",
"exportMode":"NONE",
"sourceFilters":{
		"Invoice Type":"Credit Memo",
	"Cancelled Invoices Only ":"Yes"
}
}