Running Data Rules

Executes a Data Management data load rule based on the start period and end period, and import or export options that you specify.

Prerequisites

  • Data Rules: Data load rules define how Integrations load data from a file. You must have predefined data load rules to load data.

  • You must have the required privileges to execute a specific data rule.

REST Resource

POST /aif/rest/{api_version}/jobs

Request

Supported Media Types: application/json

Parameters

The following table summarizes the client request.

Table 8-5 Parameters

Name Description Type Required Default
api_version Version of the API you are working with, such as V1 Path Yes None
jobType Should be set to "DATARULE". Path Yes None
jobName The name of a data load rule defined in Data Management. You should enclose the rule name in quotation marks if it contains a space. Path Yes None
startPeriod The first period for which data is to be loaded. This period name must be defined in Data Management period mapping. Path Yes None
endPeriod The last period for which data is to be loaded. For a single-period load, this is the same as the start period. This period name must be defined in Data Management period mapping. Path Yes None
importMode Determines how the data is imported into Data Management.

Acceptable values are:

  • APPEND to add to the existing POV data in Data Management

  • REPLACE to delete the POV data and replace it with the data from the file

  • RECALCULATE to skip importing the data, but re-process the data with updated Mappings and Logic Accounts.

  • NONE to skip data import into Data Management staging table

Path Yes None
exportMode Determines how the data is exported to the Oracle Hyperion Planning application.

Acceptable values are:

  • STORE_DATA to merge the data in the Data Management staging table with the existing Planning data

  • ADD_DATA to add the data in the Data Management staging table to Planning

  • SUBTRACT_DATA to subtract the data in the Data Management staging table from existing Planning data

  • REPLACE_DATA to clear the POV data and replace it with data in the Data Management staging table. The data is cleared for Scenario, Version, Year, Period, and Entity

  • NONE to skip data export from Data Management to Planning

Path Yes None
fileName An optional file name. If you do not specify a file name, this API imports the data contained in the file name specified in the load data rule. The data file must already reside in the INBOX prior to data rule execution. Path No None
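
The following sketch shows one way to invoke this resource from a Java client. It is illustrative only: the service URL, user credentials, rule name, and period values are placeholders, and it assumes basic authentication with the parameters sent as a JSON payload, consistent with the server-side sample shown later in this section.

// Illustrative client sketch; host, credentials, and parameter values are placeholders.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RunDataRuleClient {
    public static void main(String[] args) throws Exception {
        String baseUrl = "https://<SERVICE_NAME>-<TENANT_NAME>.<dcX>.oraclecloud.com";

        // Payload keys mirror the parameters in Table 8-5; the rule name and
        // Jan-15 periods are hypothetical values.
        String payload = "{"
                + "\"jobType\":\"DATARULE\","
                + "\"jobName\":\"Balance Sheet Actual Load\","
                + "\"startPeriod\":\"Jan-15\","
                + "\"endPeriod\":\"Jan-15\","
                + "\"importMode\":\"REPLACE\","
                + "\"exportMode\":\"STORE_DATA\""
                + "}";

        HttpURLConnection conn = (HttpURLConnection) new URL(baseUrl + "/aif/rest/V1/jobs").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        String credentials = Base64.getEncoder()
                .encodeToString("user@example.com:password".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + credentials);
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(payload.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}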

Response

Supported Media Types: application/json

Parameters

Table 8-6 Parameters

Name Description

status

Status of the job: -1 = in progress; 0 = success; 1 = error; 2 = cancel pending; 3 = cancelled; 4 = invalid parameter

jobStatus

A text representation of the job status, with one of the following values: "RUNNING", "SUCCESS", "FAILED"

jobId

The process ID generated in Data Management for the job

logFileName

Log File containing entries for this execution.

outputFileName

Name of the output file generated, if any.

processType

Type of the process executed. Contains "COMM_LOAD_BALANCES" for all data rule executions.

executedBy

Login name of the user who executed the rule.

details

Returns the exception stack trace in case of an application error

Example of Response Body

The following shows an example of the response body in JSON format.

{
    "jobStatus": "RUNNING",
    "jobId": 2019,
    "logFileName": "\outbox\logs\Account Reconciliation Manager_2019.log",
    "outputFileName": null,
    "processType": "COMM_LOAD_BALANCES",
    "executedBy": "admin",
    "status": -1,
    "links": [
        {
            "rel": "self",
            "href": "https://<SERVICE_NAME>-<TENANT_NAME>.<dcX>.oraclecloud.com/aif/rest/V1/jobs/2019",
            "action": "GET"
        }
    ],
    "details": null
}
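
A short sketch of consuming this response is shown below. It assumes the java-json.jar (org.json) library already listed in the sample prerequisites; the response string is abbreviated, and the job is re-checked through the "self" link returned in the links array.

// Illustrative sketch: read the job status from the response body using org.json (java-json.jar).
import org.json.JSONArray;
import org.json.JSONObject;

public class JobResponseCheck {
    public static void main(String[] args) {
        // Abbreviated copy of the response body shown above.
        String body = "{\"jobStatus\":\"RUNNING\",\"jobId\":2019,\"status\":-1,"
                + "\"links\":[{\"rel\":\"self\","
                + "\"href\":\"https://server/aif/rest/V1/jobs/2019\",\"action\":\"GET\"}]}";

        JSONObject job = new JSONObject(body);
        int status = job.getInt("status");   // -1 = in progress, 0 = success, 1 = error
        System.out.println("Job " + job.getInt("jobId") + " is " + job.getString("jobStatus"));

        if (status == -1) {
            // While the job is in progress, re-check it with a GET on the "self" link
            // until the status is no longer -1.
            JSONArray links = job.getJSONArray("links");
            String selfHref = links.getJSONObject(0).getString("href");
            System.out.println("Poll " + selfHref + " until the job completes.");
        }
    }
}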

Running Data Rules Sample Code

Java Sample prerequisites: java-json.jar, jersey-core.jar, jersey-client.jar, jersey-json.jar

Common functions: See Common Helper Functions for Java

Example 8-1 Java Sample – RulesResource.java

/*
    File: RulesResource.java - Created on Apr 24, 2015
    Copyright (c) 2015 Oracle Corporation. All Rights Reserved.
    This software is the confidential and proprietary information of Oracle.
 */
 @POST
    @Path("/jobs")
    @Produces(MediaType.APPLICATION_JSON)
    @Consumes(MediaType.APPLICATION_JSON)
    public Object executeJob(String jsonPayload, @HeaderParam("ssoToken") String ssoToken, @HeaderParam("userName") String userName){
        Response.Status httpStatus = Response.Status.OK;
        BaseDTO response = null;
        RulesRequestDO requestDo = null;
        try{
            ObjectMapper mapper = new ObjectMapper();
            requestDo  = mapper.readValue(jsonPayload, RulesRequestDO.class);
        }catch (Throwable t){
                ERPIModelLogger.logError("Error encountered in Parsing JSON payload:",null,t);
                 httpStatus = Response.Status.BAD_REQUEST;
            response = RestResourceUtil.getCustomErrorResponse("EPMFDM-ERROR: Error in parsing JSON Payload",response, StatusCode.INVALID_INPUT);
        }
        if (requestDo != null && RestResourceUtil.isBlankString(requestDo.getJobType())){
            httpStatus = Response.Status.BAD_REQUEST;
            response = RestResourceUtil.getCustomErrorResponse("EPMFDM-ERROR: No Job Type Specified", response, StatusCode.INVALID_INPUT);
        }else if (requestDo != null && 
                  !(requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.DATARULE) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.BATCH)
                  || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.DIRECTIMPORT) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.DIRECTEXPORT)
                  || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.EXPORTTOARM) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.REFRESH_APP_DIMENSIONS)
                  || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.REFRESH_APP_METADATA) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.REPORT)
                  || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.MAPPINGIMPORT) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.MAPPINGEXPORT))){
            httpStatus = Response.Status.BAD_REQUEST;
            response = RestResourceUtil.getCustomErrorResponse("EPMFDM-ERROR: Invalid Job Type Specified", response, StatusCode.INVALID_PARAMETER);
        }
        if (requestDo == null){
            requestDo = new RulesRequestDO();
        }

        ERPIModelLogger.logInfo("Executing Job with options: "+requestDo.toString(), null);
        if ( RestResourceConstants.DATARULE.equalsIgnoreCase(requestDo.getJobType()) ){
            try{
                response = validateDataRuleParams(requestDo);
                if (response.getStatus() == 0){
                    if (RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getImportMode())){
                        requestDo.setImportMode(null);
                    }
                    if (RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getExportMode())){
                        requestDo.setExportMode(null);
                    }
                    response = getRulesServiceV1().executeDataRule(ssoToken, requestDo.getJobName(), requestDo.getStartPeriod(), 
                                                                       requestDo.getEndPeriod(), requestDo.getImportMode(), requestDo.getExportMode(),
                                                                       requestDo.getFileName(),getBaseURL());
                }else{
                    httpStatus = Response.Status.BAD_REQUEST;
                }
            }catch (Throwable t){
                ERPIModelLogger.logError("Error encountered in Execute Rule Service:",null,t);
                httpStatus = Response.Status.INTERNAL_SERVER_ERROR;
                response = RestResourceUtil.getHttpErrorResponse(t,response);
            }
        }else if ( RestResourceConstants.BATCH.equalsIgnoreCase(requestDo.getJobType()) ){
            try{
                if (RestResourceUtil.isBlankString(requestDo.getJobName())){
                    httpStatus = Response.Status.BAD_REQUEST;
                    response = RestResourceUtil.getCustomErrorResponse("Invalid or missing parameter: Job Name", response, StatusCode.INVALID_PARAMETER);
                }else{
                    response = getRulesServiceV1().executeBatch(requestDo.getJobName(), ssoToken, getBaseURL());
                }
            }
            catch (Throwable t){
                ERPIModelLogger.logError("Error encountered in Fetch Batch Definition Service:",null,t);
                httpStatus = Response.Status.INTERNAL_SERVER_ERROR;
                response = RestResourceUtil.getHttpErrorResponse(t,response);
            }
        }
        return Response.status(httpStatus.getStatusCode()).header(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON + "; charset=UTF-8").entity(response).build();
    }

Example 8-2 Java Sample - Run Data Rule Validator, RulesResource.java

private BaseDTO validateDataRuleParams(RulesRequestDO requestDo){
        BaseDTO response = new BaseDTO();
        boolean isError = false;
        String errorMessage = "Invalid or missing parameter: ";
        if (RestResourceUtil.isBlankString(requestDo.getJobName())){
            errorMessage = errorMessage + "Job Name,";
            isError = true;
        }
        if (RestResourceUtil.isBlankString(requestDo.getStartPeriod())){
            errorMessage = errorMessage + "Start Period,";
            isError = true;
        }
        if (RestResourceUtil.isBlankString(requestDo.getEndPeriod())){
            errorMessage = errorMessage + "End Period,";
            isError = true;
        }
        if ((RestResourceUtil.isBlankString(requestDo.getImportMode()) ||  RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getImportMode()) ) && 
            (RestResourceUtil.isBlankString(requestDo.getExportMode()) ||  RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getExportMode())) ){
            errorMessage = errorMessage + "Import/Export mode,";
            isError = true;
        }else {
            if (!RestResourceUtil.isBlankString(requestDo.getImportMode()) && 
                !("APPEND".equalsIgnoreCase(requestDo.getImportMode()) || "REPLACE".equalsIgnoreCase(requestDo.getImportMode()) ||
                  "RECALCULATE".equalsIgnoreCase(requestDo.getImportMode()) || RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getImportMode()) )){
                errorMessage = errorMessage + "Import mode,";
                isError = true;
                  }
            if (!RestResourceUtil.isBlankString(requestDo.getExportMode()) && 
                !("STORE_DATA".equalsIgnoreCase(requestDo.getExportMode()) || "ADD_DATA".equalsIgnoreCase(requestDo.getExportMode()) 
                || "SUBTRACT_DATA".equalsIgnoreCase(requestDo.getExportMode()) || "REPLACE_DATA".equalsIgnoreCase(requestDo.getExportMode()) 
                || "REPLACE".equalsIgnoreCase(requestDo.getExportMode()) || "MERGE".equalsIgnoreCase(requestDo.getExportMode()) 
                || RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getExportMode()) )){
                      errorMessage = errorMessage + "Export mode,";
                      isError = true;
                  }
        }
        if (isError){
            errorMessage = errorMessage.substring(0, errorMessage.length()-1);
            response = RestResourceUtil.getCustomErrorResponse(errorMessage, response, StatusCode.INVALID_PARAMETER);
        }
        return response;
    }