Execute Job

POST /essbase/rest/v1/jobs

Executes the job and returns the record containing job information, such as job ID, status, inputs, and output information for the current job.

Request

Supported Media Types: application/json
Body

Parameters are provided as a JSON string in the request body.

Root Schema : JobsInputBean
Type: object
Nested Schema : ParametersBean
Type: object
  • abortOnError

    If true, the data load stops on the first error. Otherwise, the data load continues.

  • Allowed Values: [ "COMMIT", "ABORT" ]

    Integer or keyword specifying what to do if there is an error committing the load buffer. COMMIT or 1 - commit anyway. ABORT or 2 - terminate the commit operation.

  • For LCMImport job type. Name of artifact list to use (valid only if generateArtifactList was true in the LCMExport job).

  • basedOnQueryData

    For buildAggregation job type. If true, aggregate whichever views Essbase selects, based on collected user querying patterns. This option is available only if query tracking is turned on.

  • bufferId

    Unique ID of a single aggregate storage data load buffer. Must be a number between 1 and 4294967296. Use either this option or the bufferIds option, but not both.

  • bufferIds

    Array of unique aggregate storage data load buffer IDs. Use this option if you want to commit multiple buffers.

  • For exportExcel job type. Valid build methods are PARENT-CHILD (the recommended method) and GENERATION.

  • For importExcel job type. When an application has already been built from a workbook, you can continue to update it by performing additional imports from the workbook. When importing again from a workbook, you can use these incremental-update options: NONE, RETAIN_ALL_DATA, or REMOVE_ALL_DATA.

  • calc

    For exportExcel job type. If true, includes calculation scripts in the export.

  • catalogExcelPath

    For importExcel job type. Path to the application workbook in the file catalog. Use importExcelFileName to specify the file name.

  • columnFormat

    For exportExcel or exportData job type. If true, the cube is exported in tabular format; otherwise, it is exported as an application workbook. Exported tabular data contains data and metadata organized into columns with headers that Essbase can use to deploy a new cube. The exported tabular data contains less information than an application workbook.

  • Integer or keyword describing aggregate storage data load buffer commit options, to specify whether you want to add to existing values, subtract from existing values, or override existing values when committing the contents of the data load buffer to the cube.

    • STORE_DATA (0) - Store data in the load buffer
    • ADD_DATA (1) - Add values in the load buffer to existing stored data values
    • SUBTRACT_DATA (2) - Subtract values in the load buffer from existing stored data values
    • OVERRIDE_ALL_DATA (3) - Remove the current data values from the cube and replace them with the values in the data load buffer.
    • OVERRIDE_INCREMENTAL_DATA (4) - Remove the current contents of all incremental data slices in the cube and create a new data slice with the contents of the specified data load buffer. The new data is created with the data load property add values (aggregate_sum). If there are duplicate cells between the new data and the primary slice, their values are added together when you query them.

  • compress

    For exportExcel or exportData job type. If true, compress the data.

  • connection

    For dimbuild or dataload job types, when useConnection is true. Name of a saved connection.

  • For LCMExport job type. If true, save copy of backup on server storage.

  • createFiles

    For importExcel job type. If true, create cube artifacts in the cube directory.

  • data

    For exportExcel job type. If true, includes data in the export.

  • dataLevel

    For exportExcel or exportData job type. Constant indicating how much data to export. You can pass either the number or the corresponding string value. 0=ALL_DATA, 1=UPPER_LEVEL_BLOCKS, 2=NON_INPUT_BLOCKS, 3=LEVEL_ZERO_BLOCKS, 4=INPUT_LEVEL_DATA_BLOCKS

  • enableAlternateRollups

    For buildAggregation job type. If true, let Essbase consider secondary hierarchies (alternate rollups) for view selection.

  • For importExcel job type. If true, execute calculation scripts. Applicable only if the application workbook contains a calculation worksheet with Execute Calc set to Yes in the definition.

  • For exportData job type when columnFormat is true. Include dynamically calculated sparse members in the tabular data export. These are not included by default.

  • file

    Source file for data load or dimension build. If the file is located somewhere other than the cube directory in the Essbase file catalog, such as in a user or shared directory, prefix the file name with catalog/directory_name/. For example, "file": "catalog/shared/dataload.txt".

  • forceDimBuild

    For dimension build. If true, continue the dimension build even if other user activities are in progress. This cancels active user sessions.

  • generateArtifactList

    For LCMExport job type. If true, generate a text file listing the exported artifacts. You can use the list to control the import; for example, by rearranging the order of artifacts to control the import order, or enumerating which items in the list to import.

  • Specify true to hide the shadow application; otherwise, specify false.

  • importExcelFileName

    For importExcel job type. File name of the application workbook. If it is not in the cube directory, use catalogExcelPath to specify the path.

  • include-server-level

    For LCMExport and LCMImport job types. If true, include globally defined connections and Datasources as part of the export or import.

  • For LCMImport job type. If true, import from backup saved on server storage.

  • loaddata

    For importExcel job type. If true, loads data (if the workbook contains a data worksheet). Otherwise, only metadata is imported into the cube.

  • option

    For the clear data job type. Keyword specifying what to clear. The default, if omitted, is allData.

    The options for block storage cubes are:

    • allData - All data, linked objects, and the outline are cleared.
    • upperLevel - Upper-level blocks are cleared.
    • nonInput - Non-input blocks are cleared.

    The options for aggregate storage cubes are:

    • allData - All data, linked objects, and the outline are cleared.
    • allAggregations - All aggregated data is cleared.
    • partialData - Only a specified region is cleared (an MDX region specified in the partialDataExpression parameter).

  • overwrite

    For LCMExport and LCMImport job types. If true, overwrite an existing backup (for LCMExport) or overwrite an existing application (for LCMImport).

  • For dimbuild or dataload job types, when useConnection is false, and the job uses a rules file that connects to an RDBMS. Password of the user who can connect to the RDBMS.

  • Main or base (not shadow) application name.

  • ratioToStop

    For buildAggregation job type. A stopping value. Use this value to give the ratio of the growth size you want to allow during the materialization of an aggregate storage cube, versus the pre-aggregation size of the cube. (Before an aggregation is materialized, the cube contains only input-level data.) For example, if the size of the cube is 1 GB, a stopping value of 1.2 means that the size of the resulting data cannot exceed 20% of 1 GB, for a total size of 1.2 GB. If you do not want to specify a stopping value, enter 0 for this parameter.

  • recreateApplication

    For importExcel job type. If true, re-create the application if it already exists.

  • restructureOption

    For dimension build. Preservation options for existing data in the cube. PRESERVE_ALL_DATA: Preserves all existing data blocks (valid for block storage and aggregate storage cubes). PRESERVE_NO_DATA: Discards existing data (valid for block storage and aggregate storage cubes). PRESERVE_LEAFLEVEL_DATA: Preserves existing level-zero data (block storage only). PRESERVE_INPUT_DATA: Preserves existing input-level data (block storage only).

  • rtsv

    Runtime substitution variables defined for calc scripts.

  • rule

    Optional rules file (if the job is a data load or dimension build).

  • For calc execution. Calculation script name. Must have a .csc file extension. You do not need to give a full path; files are assumed to be in the relevant cube directory.

  • selectedDimensions
  • Shadow (copied) application name.

  • skipdata

    For LCMExport job type. If true, do not include data in the backup.

  • targetApplicationName

    For LCMImport job type. Optional specification of an application name to import/restore to, if different from the exported/backed-up application name.

  • Allowed Values: [ "INCR_TO_MAIN_SLICE", "INCR_TO_NEW_SLICE", "INCR_TO_NEW_SLICE_LIGHTWEIGHT" ]

    Integer or keyword specifying final options for committing data slices to the cube from the data load buffer.

    • INCR_TO_MAIN_SLICE (or 0): Commit the contents of all incremental data slices as a primary (main) slice.
    • INCR_TO_NEW_SLICE (or 1): Replace the contents of all incremental data slices with a new slice.
    • INCR_TO_NEW_SLICE_LIGHTWEIGHT (or 2): Write the data currently stored in the buffer to a new slice in the cube, as a lightweight operation. This option is intended only for very small data loads of up to 1,000s of cells that occur concurrently (for example, grid client data-update operations).
  • Time interval (in seconds) to wait before forcefully unloading/stopping an application, if it is performing ongoing requests. If a graceful unload process fails or takes longer than permitted by this timeout, Essbase forcefully terminates the application.

  • CompactDesignation
  • useConnection

    For dimbuild or dataload job types. If true, specifies that a saved connection should be used (you must also pass the connection parameter indicating the connection name). If false, and the job uses a rules file that connects to an RDBMS, you must specify the user name and password to connect to the RDBMS.

  • For dimbuild or dataload job types, when useConnection is false, and the job uses a rules file that connects to an RDBMS. Name of the user who can connect to the RDBMS.

  • Waiting period (in seconds) for any active write-operations to complete.

  • zipFileName

    For LCMExport and LCMImport job types. Name of the compressed file that holds the backup files.
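
For orientation, a job request names the application, database (where applicable), and job type, plus a job-specific parameters object. The following sketch shows the general shape, using parameter names taken from the examples later in this topic; the rules-file name Basic.rul is illustrative:

{
  "application": "Sample",
  "db": "Basic",
  "jobtype": "dataload",
  "parameters": {
    "file": "Data_Basic.txt",
    "rule": "Basic.rul",
    "abortOnError": "true"
  }
}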

Nested Schema : bufferIds
Type: array

Array of unique aggregate storage data load buffer IDs. Use this option if you want to commit multiple buffers.

Nested Schema : rtsv
Type: array

Runtime substitution variables defined for calc scripts.

Nested Schema : selectedDimensions
Type: array
Nested Schema : CompactDesignation
Type: object
Nested Schema : RTSV
Type: object

Details about the runtime substitution variable.

  • Set to true to allow data cells for which no data exists, or false to suppress them.

  • The name of the dimension from which this variable pulls information. Supported only if type = MEMBER.

  • Name of the runtime substitution variable.

  • Set to true if only one contextual member selection may be passed to the runtime substitution variable. If there is a single member on the grid or POV, that member is used. If a dimension is on the POV, the active member is used. If a dimension is on the POV and there are multiple members, an error occurs.

    Set to false if all dimension members on the grid or the POV are included.

  • Allowed Values: [ "STRING", "NUMBER", "DATE", "MEMBER" ]

    Specification of whether the variable is for a member, string, or number.

  • value

    Default value of the runtime substitution variable. RTSV values can be strings, constants, member names, or member combinations.

    If the RTSV is designed for calcs executed in Smart View, its value must be set to the constant POV, to indicate that only the current data slice present in the spreadsheet grid should be calculated.

Nested Schema : value
Type: object

Default value of the runtime substitution variable. RTSV values can be strings, constants, member names, or member combinations.

If the RTSV is designed for calcs executed in Smart View, its value must be set to the constant POV, to indicate that only the current data slice present in the spreadsheet grid should be calculated.

Nested Schema : compactDesignationColumn
Type: array
Nested Schema : dateDimensionLeaves
Type: array
Nested Schema : dimCompactDesignationList
Type: array
Nested Schema : CompactDesignationColumn
Type: object
Nested Schema : DimCompactDesignation
Type: object
Nested Schema : attNames
Type: array
Nested Schema : attribute
Type: array
Nested Schema : colNames
Type: array
Nested Schema : columnTypes
Type: array
  • Allowed Values: [ "TEXT", "INTEGER", "FLOAT", "TIME", "DATE", "BOOLEAN", "EMPTY", "UNKNOWN", "OUT_OF_RANGE" ]
Nested Schema : dimGenColumns
Type: array
Nested Schema : genNames
Type: array
Nested Schema : headerText
Type: array
Nested Schema : uniqCount
Type: array

Response

Supported Media Types: application/json

200 Response

OK

Job started successfully. Job information returned in response.

Body
Root Schema : JobRecordBean
Type: object
Nested Schema : jobInputInfo
Type: object
Additional Properties Allowed
Nested Schema : jobOutputInfo
Type: object
Additional Properties Allowed
Nested Schema : additionalProperties
Type: object
Nested Schema : additionalProperties
Type: object

400 Response

Bad Request

The application may not exist, or the application parameter may be incorrect; the database may not exist, or the database parameter may be incorrect; or a null argument may have been passed.

500 Response

Internal Server Error.

503 Response

Service Unavailable

Naming exception or server exception.


Examples

The following examples show how to run Essbase jobs.

These examples use cURL to access the REST API from a Windows shell script. The calling user's ID and password are variables whose values are set in properties.bat.

The jobs you can run are:

Load Data

The following cURL example shows you how to run a data load job. Requires at least Database Update permission.

If you are loading data using an aggregate storage data load buffer, see the Create Buffer endpoint for examples.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./dataload.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in dataload.json.

{
 "application": "Sample", 
 "db": "Basic",
 "jobtype": "dataload", 
 "parameters": 
   { "file": "Data_Basic.txt",
     "abortOnError": "true" }
}

Note:

If the source file for a data load or dimension build is located somewhere other than the cube directory in the Essbase file catalog, such as in a user or shared directory, prefix the file name with catalog/directory_name/. For example: "catalog/shared/dataload.txt". For information about permitted import directories, refer to Specify Files in a Catalog Path.

Example of Response Body

{
  "job_ID": 1346,
  "appName": "Sample",
  "dbName": "Basic",
  "jobType": "Data Load",
  "jobfileName": null,
  "userName": "admin",
  "startTime": 1574456694000,
  "endTime": 1574456694000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "dataFileName": "Data_Basic.txt",
    "abortOnError": true,
    "useConnection": false
  },
  "jobOutputInfo": {
    "recordsProcessed": 0,
    "recordsRejected": 0,
    "errorMessage": ""
  },
  "links": [ ]
}
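
To track the job after submission, you can poll the Get Job endpoint with the returned job_ID until statusMessage changes from In Progress to Completed, as shown in the Export Cube to Application Workbook example later in this topic. For the job above:

call properties.bat
curl -X GET "https://myserver.example.com:9001/essbase/rest/v1/jobs/1346" -H "accept: application/json" -u %User%:%Password%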

Build Dimensions

The following cURL example shows you how to run a dimension build job. Requires at least Database Manager permission.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./dimbuild.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in dimbuild.json.

{
 "application": "ASOSamp",
 "db": "Basic",
 "jobtype": "dimbuild",
 "parameters": 
    { "file": "Dim_Products.txt",
      "rule": "Products.rul"  }
}

Note:

If the source file for a data load or dimension build is located somewhere other than the cube directory in the Essbase file catalog, such as in a user or shared directory, prefix the file name with catalog/directory_name/. For example: "catalog/shared/dataload.txt". For information about permitted import directories, refer to Specify Files in a Catalog Path.

Example of Response Body

{
  "job_ID": 4,
  "appName": "ASOSamp",
  "dbName": "Basic",
  "jobType": "Dimension Build",
  "jobfileName": "Products",
  "userName": "power1",
  "startTime": 1574814746000,
  "endTime": 1574814746000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "rulesFileName": "Products",
    "dataFileName": "Dim_Products.txt",
    "useConnection": false,
    "restructureOption": 1,
    "forceDimBuild": false
  },
  "jobOutputInfo": {
    "recordsProcessed": 0,
    "recordsRejected": 0,
    "errorMessage": ""
  },
  "links": [ ]
}
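
If the dimension build uses a rules file that connects to a relational (RDBMS) source, you can reference a saved connection instead of passing credentials, using the useConnection and connection parameters described earlier. A sketch, in which the connection name oracle_conn is illustrative:

{
 "application": "ASOSamp",
 "db": "Basic",
 "jobtype": "dimbuild",
 "parameters":
    { "rule": "Products.rul",
      "useConnection": "true",
      "connection": "oracle_conn" }
}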

Run Calculation

The following cURL example shows you how to execute a calculation script. Requires at least Database Update permission, as well as provisioned access to the calculation script. Prerequisite: upload the script, as a .csc file, to the cube directory.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./calc.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in calc.json.

{
 "application": "Sample", 
  "db": "Basic", 
  "jobtype": "calc", 
  "parameters": 
     { "file": "CalcAll.csc" }
}

Example of Response Body

{
  "job_ID": 1434,
  "appName": "Sample",
  "dbName": "Basic",
  "jobType": "Calc Execution",
  "jobfileName": null,
  "userName": "admin",
  "startTime": 1574733981000,
  "endTime": 1574733981000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {},
  "jobOutputInfo": {
    "errorMessage": ""
  },
  "links": [ ]
}
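
If the calculation script declares runtime substitution variables, you can pass values for them in the rtsv array described in the request schema. The following sketch assumes RTSV properties named name, value, and type, matching the schema descriptions above; the variable name svMarket is illustrative:

{
 "application": "Sample",
 "db": "Basic",
 "jobtype": "calc",
 "parameters":
    { "file": "CalcAll.csc",
      "rtsv": [ { "name": "svMarket", "value": "New York", "type": "MEMBER" } ] }
}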

Clear Data

The following cURL example shows you how to run a job to clear cube data. Requires at least Database Update permission.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./cleardata.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in cleardata.json.

{
 "application": "Sample", 
  "db": "Basic", 
  "jobtype": "clear", 
  "parameters": 
     { 
      "option": "PARTIAL_DATA",
      "partialDataExpression": "{Feb}"
     }
}

Example of Response Body

{
  "job_ID": 116,
  "appName": "Sample",
  "dbName": "Basic",
  "jobType": "Clear Data",
  "jobfileName": null,
  "userName": "dbupdater",
  "startTime": 1598329480000,
  "endTime": 1598329480000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "clearDataOption": "PARTIAL_DATA"
  },
  "jobOutputInfo": {
    "errorMessage": ""
  },
  "links": [ ]
}
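
To clear all data instead of a region, you can omit the option parameter; allData is the default, per the parameter list. A minimal sketch, assuming the parameters object can be omitted entirely, as in the Clear Aggregations example later in this topic:

{
 "application": "Sample",
 "db": "Basic",
 "jobtype": "clear"
}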

Import Cube from Application Workbook

The following cURL example shows you how to run a job that imports a cube from an Excel application workbook. Requires at least power user role, or Application Manager permission.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./importExcel.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in importExcel.json.

{
  "application": "ASOSamp",  
   "db": "Basic",  
   "jobtype": "importExcel",  
   "parameters": 
      { 
       "loaddata": "false",
       "overwrite": "true",
       "deleteExcelOnSuccess": "false",
       "catalogExcelPath": "/gallery/Applications/Demo Samples/Aggregate Storage/",
       "importExcelFileName": "ASO_Sample.xlsx",
       "recreateApplication": "true",
       "createFiles": "true"
      }
}

Example of Response Body

{
  "job_ID": 2,
  "appName": "ASOSamp",
  "dbName": "Basic",
  "jobType": "Import Excel",
  "jobfileName": "ASO_Sample.xlsx",
  "userName": "power1",
  "startTime": 1574810127000,
  "endTime": 1574810127000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "catalogExcelPath": "/gallery/Applications/Demo Samples/Aggregate Storage/",
    "importExcelFileName": "ASO_Sample.xlsx",
    "isLoadData": false,
    "recreateApplication": true,
    "isCreateFiles": true,
    "isExecuteScript": false
  },
  "jobOutputInfo": {
    "errorMessage": ""
  },
  "links": [ ]
}

Export Cube to Application Workbook

The following cURL example shows you how to run a job that exports a cube to an Excel application workbook. Requires at least Database Manager permission.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./exportExcel.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in exportExcel.json.

{
  "application": "Sample",
  "db": "Basic",
  "jobtype": "exportExcel",
  "parameters": 
     {
      "dataLevel": "ALL_DATA",
      "columnFormat": "false",
      "compress": "false"
     }
}

Example of Response Body

{
  "job_ID": 10,
  "appName": "Sample",
  "dbName": "Basic",
  "jobType": "Export Excel",
  "jobfileName": null,
  "userName": "admin",
  "startTime": 1575413474000,
  "endTime": 1575413474000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "calc": false,
    "data": false
  },
  "jobOutputInfo": {
    "errorMessage": ""
  },
  "links": [ ]
}

Example of Response from Get Job {id}

The following curl command checks on the job status:

curl -X GET "https://myserver.example.com:9001/essbase/rest/v1/jobs/10" -H "accept: application/json" -u %User%:%Password%

The call returns the following response. The status is Completed, and the Excel file is saved in the Essbase file catalog, in the directory specified in metadataFile.

{
  "job_ID": 10,
  "appName": "Sample",
  "dbName": "Basic",
  "jobType": "Export Excel",
  "jobfileName": null,
  "userName": "admin",
  "startTime": 1575413474000,
  "endTime": 1575413490000,
  "statusCode": 200,
  "statusMessage": "Completed",
  "jobInputInfo": {
    "calc": false,
    "data": false
  },
  "jobOutputInfo": {
    "errorMessage": "",
    "metadataFile": "/applications/Sample/Basic/Basic.xlsx"
  },
  "links": [
    {
      "rel": "self",
      "href": "https://myserver.example.com:9001/essbase/rest/v1/jobs/10",
      "method": "GET"
    },
    {
      "rel": "post",
      "href": "https://myserver.example.com:9001/essbase/rest/v1/jobs/10",
      "method": "POST"
    }
  ]
}
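
To export the cube as tabular data rather than as an application workbook, set columnFormat to true, as described in the parameter list. A sketch:

{
  "application": "Sample",
  "db": "Basic",
  "jobtype": "exportExcel",
  "parameters":
     {
      "dataLevel": "LEVEL_ZERO_BLOCKS",
      "columnFormat": "true",
      "compress": "false"
     }
}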

Back Up Application with LCM Export

The following cURL example shows you how to run a job that backs up cube artifacts to a Lifecycle Management (LCM) .zip file. Requires at least Application Manager permission.

This job type can be run from outside the Essbase machine, whereas the LCM utility must be run on the Essbase machine.

curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./lcmExport.json" -u %User%:%Password%

Sample JSON payload -- Back Up All Applications

The following sample JSON payload, passed to REST API in lcmExport.json, is an example for backing up all applications at once. The required parameters are jobtype, allApp, and zipFileName.

{
  "jobtype": "lcmExport",
  "parameters": 
     {
      "allApp": "true",
      "generateartifactlist": "false",
      "include-server-level": "false",
      "zipFileName": "exportedApps.zip",
      "skipdata": "true"
     }
}

Example of Response Body -- Back Up All Applications

The REST API returns the following response, and when the job is completed, saves the zip file in the issuing user's directory in the Essbase file catalog.

{
  "job_ID" : 73,
  "appName" : null,
  "dbName" : null,
  "jobType" : "LCM Export",
  "jobfileName" : "exportedApps.zip",
  "userName" : "appmanager",
  "startTime" : 1660755176000,
  "endTime" : 1660755176000,
  "statusCode" : 100,
  "statusMessage" : "In Progress",
  "jobInputInfo" : {
    "lcmExportFileName" : "exportedApps.zip",
    "skipdata" : true,
    "copyToStorage" : false,
    "threads" : 10,
    "include-server-level" : false,
    "generateArtifactList" : false,
    "filesystemcopy" : false,
    "disasterRecovery" : false,
    "verbose" : false,
    "allApp" : true,
    "exportdata" : false,
    "exportpartitions" : false,
    "exportfilters" : false
  },
  "jobOutputInfo" : {
    "errorMessage" : "",
    "infoMessage" : ""
  },
  "links" : [ ]
}

Sample JSON payload -- Back Up One Application

The following sample JSON payload, passed to REST API in lcmExport.json, is an example for backing up one application only. The required parameters are application, jobtype, and zipFileName.

{
  "application": "Sample",
  "jobtype": "lcmExport",
  "parameters": 
     {
      "zipFileName": "Sample1.zip",
      "skipdata": "true",
      "include-server-level": "false"
     }
}

Example of Response Body -- Back Up One Application

The REST API returns the following response:

{
  "job_ID": 11,
  "appName": "Sample",
  "dbName": null,
  "jobType": "LCM Export",
  "jobfileName": "Sample1.zip",
  "userName": "appmanager",
  "startTime": 1575424208000,
  "endTime": 1575424208000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "lcmExportFileName": "Sample1.zip",
    "skipdata": true,
    "copyToStorage": false,
    "threads": 10,
    "include-server-level": false,
    "generateArtifactList": false,
    "filesystemcopy": false,
    "disasterRecovery": false,
    "verbose": false,
    "allApp" : false,
    "exportdata" : false,
    "exportpartitions" : false,
    "exportfilters" : false
  },
  "jobOutputInfo": {
    "errorMessage": "",
    "infoMessage": ""
  },
  "links": [ ]
}

Example of Response from Get Job {id}

The following curl command checks on the job status:

curl -X GET "https://myserver.example.com:9001/essbase/rest/v1/jobs/11" -H "accept: application/json" -u %User%:%Password%

The call returns the following response. The status is Completed, and the zip file is saved in the Essbase file catalog, in the directory specified in lcmExportFilePath.

{
  "job_ID": 11,
  "appName": "Sample",
  "dbName": null,
  "jobType": "LCM Export",
  "jobfileName": "Sample1.zip",
  "userName": "appmanager",
  "startTime": 1575424208000,
  "endTime": 1575424228000,
  "statusCode": 200,
  "statusMessage": "Completed",
  "jobInputInfo": {
    "lcmExportFileName": "Sample1.zip",
    "skipdata": true,
    "copyToStorage": false,
    "threads": 10,
    "include-server-level": false,
    "generateArtifactList": false,
    "filesystemcopy": false,
    "disasterRecovery": false,
    "verbose": false,
    "allApp" : false,
    "exportdata" : false,
    "exportpartitions" : false,
    "exportfilters" : false
  },
  "jobOutputInfo": {
    "errorMessage": "",
    "infoMessage": "",
    "lcmExportFilePath": "/users/appmanager/Sample1.zip"
  },
  "links": [
    {
      "rel": "self",
      "href": "https://myserver.example.com:9001/essbase/rest/v1/jobs/11",
      "method": "GET"
    },
    {
      "rel": "post",
      "href": "https://myserver.example.com:9001/essbase/rest/v1/jobs/11",
      "method": "POST"
    }
  ]
}
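
To also produce a text file listing the exported artifacts, which you can later edit to control an LCM import, set generateartifactlist to true. A sketch for a single application:

{
  "application": "Sample",
  "jobtype": "lcmExport",
  "parameters":
     {
      "zipFileName": "Sample1.zip",
      "generateartifactlist": "true",
      "skipdata": "true"
     }
}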

Restore Cube with LCM Import

The following cURL example shows you how to run a job that restores cube artifacts from a Lifecycle Management (LCM) .zip file. To do this, you must be the power user who created the application, or a service administrator.

This job type can be run from outside the Essbase machine, whereas the LCM utility must be run on the Essbase machine.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./lcmImport.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in lcmImport.json.

{
  "jobtype": "lcmImport",
  "parameters": 
     {
      "zipFileName": "Sample1.zip",
      "include-server-level": "false",
      "targetApplicationName": "Sample_dup",
      "overwrite": "true"
     }
}

Example of Response Body

{
  "job_ID": 12,
  "appName": null,
  "dbName": null,
  "jobType": "LCM Import",
  "jobfileName": "Sample1.zip",
  "userName": "admin",
  "startTime": 1575425649000,
  "endTime": 1575425649000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "lcmImportFileName": "Sample1.zip",
    "lcmImportTargetApplicationName": "Sample_dup",
    "lcmImportFromStorage": false,
    "overwrite": true,
    "include-server-level": false,
    "verbose": false,
    "useCatalogPath" : false
  },
  "jobOutputInfo": {
    "errorMessage": "",
    "infoMessage": ""
  },
  "links": [ ]
}
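
If you are restoring over the original application, targetApplicationName can be omitted; per the parameter list, it is needed only when the target name differs from the exported application name. A minimal sketch:

{
  "jobtype": "lcmImport",
  "parameters":
     {
      "zipFileName": "Sample1.zip",
      "overwrite": "true"
     }
}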

Build Aggregation

The following cURL example shows you how to run a job that builds an aggregation. Requires at least Database Access permission.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./buildagg.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in buildagg.json.

{
  "application": "ASOSamp",
  "db": "Basic",
  "jobtype": "buildAggregation",
  "parameters":
     {
      "ratioToStop": "1.1",
      "basedOnQueryData": "false",
      "enableAlternateRollups": "false"
     }
}

Aggregations apply to aggregate storage cubes. Aggregations are consolidations, based on outline hierarchy, of level 0 data values in an aggregate storage cube. The term aggregation is used to refer to the aggregation process and the set of values stored as a result of the process.

An aggregation contains one or more aggregate views, which are collections of aggregate cells. When you build an aggregation, Essbase selects aggregate views to be rolled up, aggregates them, and stores the cell values in the selected views. If an aggregation includes aggregate cells dependent on level 0 values that are changed through a data load, the higher-level values are automatically updated at the end of the data load process.

When you build an aggregation, Essbase:

  • selects zero or more aggregate views, based on the stopping value (ratioToStop) and/or on querying patterns (basedOnQueryData), if these parameters are given
  • builds the views that were selected

Note: The MaxL equivalent of this job type is the execute aggregate process statement.

Example of Response Body

{
  "job_ID": 8,
  "appName": "ASOSamp",
  "dbName": "Basic",
  "jobType": "Build Aggregation",
  "jobfileName": null,
  "userName": "dbaccess",
  "startTime": 1575411748000,
  "endTime": 1575411748000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "enableAlternateRollups": false,
    "basedOnQueryData": false,
    "ratioToStop": 1.1
  },
  "jobOutputInfo": {
    "errorMessage": ""
  },
  "links": [ ]
}
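
To base view selection on collected query-tracking data rather than on a growth ratio, set basedOnQueryData to true (query tracking must be turned on, as noted in the parameter list). Per the ratioToStop description, passing 0 means no stopping value. A sketch:

{
  "application": "ASOSamp",
  "db": "Basic",
  "jobtype": "buildAggregation",
  "parameters":
     {
      "ratioToStop": "0",
      "basedOnQueryData": "true",
      "enableAlternateRollups": "false"
     }
}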

Clear Aggregations

The following cURL example shows you how to run a job that clears aggregations on an aggregate storage cube.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./clearagg.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in clearagg.json.

{
  "application": "ASOSamp",
  "db": "Basic",
  "jobtype": "clearAggregation"
}

Example of Response Body

{
  "job_ID": 9,
  "appName": "ASOSamp",
  "dbName": "Basic",
  "jobType": "Clear Aggregation",
  "jobfileName": null,
  "userName": "dbaccess",
  "startTime": 1575412855000,
  "endTime": 1575412855000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobOutputInfo": {
    "errorMessage": ""
  },
  "links": [ ]
}

Export Data

The following cURL example shows you how to export data from a cube. Requires at least Database Manager permission.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./dataexport.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in dataexport.json.

{
 "application": "Sample", 
 "db": "Basic",
 "jobtype": "exportData", 
 "parameters": 
   { 
     "compress": "false",
     "columnFormat": "false", 
     "dataLevel": "LEVEL_ZERO_BLOCKS"
   }
}

Example of Response Body

{
  "job_ID": 28,
  "appName": "Sample",
  "dbName": "Basic",
  "jobType": "Export Data",
  "jobfileName": null,
  "userName": "dbmanager",
  "startTime": 1575920712000,
  "endTime": 1575920712000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {
    "compress": false,
    "columnFormat": false,
    "dataLevel": "LEVEL_ZERO_BLOCKS"
  },
  "jobOutputInfo": {
    "scriptOutputFileName": "",
    "scriptOutputFileNamePath": "",
    "infoMessage": "",
    "errorMessage": ""
  },
  "links": [ ]
}
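
For a full, compressed export of all data, combine the compress and dataLevel parameters described earlier. A sketch:

{
 "application": "Sample",
 "db": "Basic",
 "jobtype": "exportData",
 "parameters":
   {
     "compress": "true",
     "columnFormat": "false",
     "dataLevel": "ALL_DATA"
   }
}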

Run MDX Script

The following cURL example shows you how to run an MDX script that performs an insert or export. Requires at least Database Access permission.

call properties.bat
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs?links=none" -H Accept:application/json -H Content-Type:application/json --data "@./mdxScript.json" -u %User%:%Password%

Sample JSON payload

The cURL example above delivers the following JSON payload in mdxScript.json.

{
 "application": "Sample", 
 "db": "Basic",
 "jobtype": "mdxScript", 
 "parameters": 
   { 
     "file": "share/mdx_scripts/export_examp.mdx"
   }
}

where export_examp.mdx is an MDX script that creates an export file in the cube directory. The MDX script contents are:

EXPORT INTO FILE "example" OVERWRITE USING COLUMNDELIMITER "#~"
SELECT  
   {[Mar],[Apr]} ON  COLUMNS,
   Crossjoin({[100],[200]} , crossjoin({[Actual],[Budget]},
   {[Opening Inventory],[Ending Inventory]})) ON ROWS
FROM [Sample].[Basic]
WHERE ([New York])  

Example of Response Body

{
  "job_ID": 26,
  "appName": "Sample",
  "dbName": "Basic",
  "jobType": "MDX Script",
  "jobfileName": null,
  "userName": "dbaccess",
  "startTime": 1575918425000,
  "endTime": 1575918425000,
  "statusCode": 100,
  "statusMessage": "In Progress",
  "jobInputInfo": {},
  "jobOutputInfo": {
    "errorMessage": ""
  },
  "links": [ ]
}

Additionally, the MDX script writes results into example.txt in the Sample Basic cube directory. The results file contains the following data:

Product#~Scenario#~Measures#~Mar#~Apr
Colas#~Actual#~Opening Inventory#~2041#~2108
Colas#~Actual#~Ending Inventory#~2108#~2250
Colas#~Budget#~Opening Inventory#~1980#~2040
Colas#~Budget#~Ending Inventory#~2040#~2170
Root Beer#~Actual#~Opening Inventory#~2378#~2644
Root Beer#~Actual#~Ending Inventory#~2644#~2944
Root Beer#~Budget#~Opening Inventory#~2220#~2450
Root Beer#~Budget#~Ending Inventory#~2450#~2710