Execute Job
/essbase/rest/v1/jobs
Executes a job and returns a record containing job information, such as the job ID, status, inputs, and outputs.
Request
- application/json
Parameters are provided as a JSON string in the request body.
object
-
application:
string
-
db:
string
-
jobtype:
string
The type of job. Examples: dataload, dimbuild, calc, clear, importExcel, exportExcel, lcmExport, lcmImport, clearAggregation, buildAggregation, asoBufferDataLoad, asoBufferCommit, exportData, mdxScript.
-
parameters(optional):
object ParametersBean
object
-
abortOnError(optional):
string
If true, data load stops on the first error. Otherwise, data load continues.
-
actionType(optional):
string
-
analyzeFileName(optional):
string
-
analyzeSheetName(optional):
string
-
appId(optional):
string
-
artifactList(optional):
string
For LCMImport job type. Name of artifact list to use (valid only if generateArtifactList was true in the LCMExport job).
-
backupType(optional):
string
-
basedOnQueryData(optional):
string
For buildAggregation job type. If true, aggregate whichever views Essbase selects, based on collected user querying patterns. This option is only available if query tracking is turned on.
-
bufferId(optional):
integer(int32)
-
bufferIds(optional):
array bufferIds
-
buildMethod(optional):
string
For exportExcel job type. Valid build methods are PARENT-CHILD (the recommended method) and GENERATION.
-
buildOption(optional):
string
For importExcel job type. When an application has already been built from a workbook, you can continue to update it by performing additional imports from the workbook. When importing again from a workbook, you can use these incremental-update options: NONE, RETAIN_ALL_DATA, or REMOVE_ALL_DATA.
-
calc(optional):
string
For exportExcel job type. If true, includes calculation scripts in the export.
-
catalogExcelPath(optional):
string
For importExcel job type. Path to application workbook in the file catalog. Use importExcelFileName to specify the file name.
-
columnFormat(optional):
string
For exportExcel or exportData job type. If true, the cube is exported in tabular format; otherwise, it is exported as an application workbook. Exported tabular data contains data and metadata organized into columns with headers that Essbase can use to deploy a new cube. The exported tabular data contains less information than an application workbook.
-
commitOption(optional):
string
-
compress(optional):
string
For exportExcel or exportData job type. If true, compress the data.
-
connection(optional):
string
For dimbuild or dataload job types, when useConnection is true. Name of a saved connection.
-
copyToStorage(optional):
string
For LCMExport job type. If true, save copy of backup on server storage.
-
createFiles(optional):
string
For importExcel job type. If true, create cube artifacts in the cube directory.
-
data(optional):
string
For exportExcel job type. If true, includes data in the export.
-
dataLevel(optional):
string
For exportExcel or exportData job type. Constant indicating how much data to export. You can pass either the number or the corresponding string value. 0=ALL_DATA, 1=UPPER_LEVEL_BLOCKS, 2=NON_INPUT_BLOCKS, 3=LEVEL_ZERO_BLOCKS, 4=INPUT_LEVEL_DATA_BLOCKS
-
dbType(optional):
string
-
deleteExcelOnSuccess(optional):
string
-
dimDesignationMode(optional):
string
-
disasterRecovery(optional):
string
-
discoverDimensionTables(optional):
string
-
enableAlternateRollups(optional):
string
For buildAggregation job type. If true, let Essbase consider secondary hierarchies ('alternate rollups') for view selection.
-
enableSandboxing(optional):
string
-
executeScript(optional):
string
For importExcel job type. If true, execute calculation scripts. Applicable only if the application workbook contains a calculation worksheet with Execute Calc set to Yes in the definition.
-
exportDynamicBlocks(optional):
string
For exportData job type when columnFormat is true. Include dynamically calculated sparse members in the tabular data export. These are not included by default.
-
file(optional):
string
Source file for data load or dimension build.
-
filesystemcopy(optional):
string
-
force(optional):
string
-
forceDimBuild(optional):
string
For dimension build. If true, continue the dimension build even if other user activities are in progress. This cancels active user sessions.
-
generateArtifactList(optional):
string
For LCMExport job type. If true, generate a text file list of exported artifacts. You can use the list to control the import; for example, rearranging the order of artifacts to control the import order, or enumerating which items in the list to import.
-
importExcelFileName(optional):
string
For importExcel job type. File name of application workbook. If not in the cube directory, use catalogExcelPath to specify the path.
-
includeServerLevel(optional):
string
For LCMExport and LCMImport job types. If true, include globally defined connections and Datasources as part of the export or import.
-
lcmImportFromStorage(optional):
string
For LCMImport job type. If true, import from backup saved on server storage.
-
loaddata(optional):
string
For importExcel job type. If true, loads data (if the workbook contains a data worksheet). Otherwise, only metadata is imported into the cube.
-
maxParallel(optional):
string
-
option(optional):
string
For clear data job. Keyword specifying what to clear. Default option, if omitted, is allData. The options for block storage cubes are: allData (all data, linked objects, and the outline are cleared), upperLevel (upper-level blocks are cleared), and nonInput (non-input blocks are cleared). The options for aggregate storage cubes are: allData (all data, linked objects, and the outline are cleared), allAggregations (all aggregated data is cleared), and partialData (only a specified region is cleared; an MDX region specified in the partialDataExpression parameter).
-
overwrite(optional):
string
For LCMExport and LCMImport job types. If true, overwrite existing backup (for LCMExport) or overwrite existing application (for LCMImport).
-
partialDataExpression(optional):
string
-
password(optional):
string
For dimbuild or dataload job types, when useConnection is false, and the job uses a rules file that connects to an RDBMS. Password of the user who can connect to the RDBMS.
-
ratioToStop(optional):
string
For buildAggregation job type. A stopping value. Use this value to set the ratio of growth you want to allow during the materialization of an aggregate storage cube, relative to the pre-aggregation size of the cube. (Before an aggregation is materialized, the cube contains only input-level data.) For example, if the size of the cube is 1 GB, a stopping value of 1.2 means that the resulting data cannot grow by more than 20% of 1 GB, for a total size of 1.2 GB. If you do not want to specify a stopping value, enter 0 for this parameter.
-
recreateApplication(optional):
string
For importExcel job type. If true, re-create the application, if it already exists.
-
restructureOption(optional):
string
For dimension build. Preservation options for existing data in the cube. PRESERVE_ALL_DATA: Preserves all existing data blocks (valid for block storage and aggregate storage cubes). PRESERVE_NO_DATA: Discards existing data (valid for block storage and aggregate storage cubes). PRESERVE_LEAFLEVEL_DATA: Preserves existing level zero data (block storage only). PRESERVE_INPUT_DATA: Preserves existing input-level data (block storage only).
-
rtsv(optional):
array rtsv
-
rule(optional):
string
Optional rules file (if the job is a data load or dimension build).
-
script(optional):
string
For calc execution. Calculation script name. Must have .csc file extension. You do not need to give a full path. Files are assumed to be in the relevant cube directory.
-
selectedDimensions(optional):
array selectedDimensions
-
skipdata(optional):
string
For LCMExport job type. If true, do not include data in the backup.
-
targetApplicationName(optional):
string
For LCMImport job type. Optional specification of an application name to import/restore to, if different from the exported/backed up application name.
-
termOption(optional):
string
-
threads(optional):
string
-
timestamp(optional):
string
-
unstructuredAnalysis(optional):
object CompactDesignation
-
useConnection(optional):
string
For dimbuild or dataload job types. If true, specifies that a saved connection should be used (you must also pass connection parameter indicating the connection name). If false, and the job uses a rules file that connects to an RDBMS, you must specify the user name and password to connect to the RDBMS.
-
user(optional):
string
For dimbuild or dataload job types, when useConnection is false, and the job uses a rules file that connects to an RDBMS. Name of the user who can connect to the RDBMS.
-
verbose(optional):
string
-
zipFileName(optional):
string
For LCMExport and LCMImport job types. Name of compressed file to hold backup files.
object
-
badRowListString(optional):
string
-
bsoLimitsExceeded(optional):
boolean
-
columnOffset(optional):
integer(int32)
-
compactDesignationColumn(optional):
array compactDesignationColumn
-
dateColumnId(optional):
integer(int32)
-
dateDimensionLeaves(optional):
array dateDimensionLeaves
-
dateDimString(optional):
string
-
dateFormatString(optional):
string
-
dimCompactDesignationList(optional):
array dimCompactDesignationList
-
dimDesignationMode(optional):
string
Allowed Values:
[ "DIM_DESIGNATION_MODE_ATTRIBS_AS_MULTILEVEL_DIMS", "DIM_DESIGNATION_MODE_ATTRIBS_AS_FLAT_DIMS", "DIM_DESIGNATION_MODE_ALL_FLAT", "DIM_DESIGNATION_MODE_OAV", "DIM_DESIGNATION_MODE_OAV_DIM", "DIM_DESIGNATION_MODE_OAV_SNOWFLAKE_DIM", "CONVERT_TO_CSV", "DIM_DESIGNATION_MODE_ATTRIBS_AS_ATTRIBS" ]
-
excelSheetName(optional):
string
-
fastAnalysis(optional):
boolean
-
file(optional):
string
-
maxDate(optional):
string(date-time)
-
measureDimensionName(optional):
string
-
minDate(optional):
string(date-time)
-
namingPriority(optional):
string
Allowed Values:
[ "NONE", "GENERATIONS", "DIMENSIONS" ]
-
nr(optional):
integer(int32)
-
nrh(optional):
integer(int32)
-
query(optional):
string
-
tableName(optional):
string
object
-
allowMissing(optional):
boolean
-
description(optional):
string
-
dimension(optional):
string
-
limit(optional):
string
-
name(optional):
string
-
singleChoice(optional):
boolean
-
type(optional):
string
Allowed Values:
[ "STRING", "NUMBER", "DATE", "MEMBER" ]
-
value(optional):
object value
object
object
-
attNames(optional):
array attNames
-
attribute(optional):
array attribute
-
colNames(optional):
array colNames
-
columnTypes(optional):
array columnTypes
-
dimGenColumns(optional):
array dimGenColumns
-
dimJoin(optional):
string
-
dimName(optional):
string
-
essbaseConnection(optional):
string
-
fkcolumnNumber(optional):
integer(int32)
-
genNames(optional):
array genNames
-
headerText(optional):
array headerText
-
parentColumnId(optional):
integer(int32)
-
query(optional):
string
-
uniqCount(optional):
array uniqCount
array
-
Array of:
string
Allowed Values:
[ "TEXT", "INTEGER", "FLOAT", "TIME", "DATE", "BOOLEAN", "EMPTY", "UNKNOWN", "OUT_OF_RANGE" ]
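Putting the request schema above together: the body needs application, db, and jobtype, with the job-specific ParametersBean as an optional nested object. A minimal sketch of assembling that body in Python (the helper name build_job_payload is illustrative, not part of the API):

```python
import json

def build_job_payload(application, db, jobtype, parameters=None):
    """Assemble the request body for POST /essbase/rest/v1/jobs.

    `application`, `db`, and `jobtype` mirror the fields described above;
    `parameters` is the optional ParametersBean object for the job type.
    """
    payload = {"application": application, "db": db, "jobtype": jobtype}
    if parameters:
        payload["parameters"] = parameters
    return json.dumps(payload)

# Example: a data load body matching the first cURL example below.
body = build_job_payload(
    "Sample", "Basic", "dataload",
    {"file": "Data_Basic.txt", "abortOnError": "true"},
)
```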
Response
- application/json
200 Response
OK
Job started successfully. Job information returned in response.
object
-
appName(optional):
string
Application name.
-
dbName(optional):
string
Cube name.
-
endTime(optional):
integer(int64)
End time of the job.
-
job_ID(optional):
integer(int64)
ID number of the job.
-
jobfileName(optional):
string
-
jobInputInfo(optional):
object jobInputInfo
Additional Properties Allowed: additionalProperties
-
jobOutputInfo(optional):
object jobOutputInfo
Additional Properties Allowed: additionalProperties
-
jobType(optional):
string
The type of job. Examples: dataload, dimbuild, calc, clear, importExcel, exportExcel, lcmExport, lcmImport, clearAggregation, buildAggregation, asoBufferDataLoad, asoBufferCommit, exportData, mdxScript.
-
links(optional):
array links
-
startTime(optional):
integer(int64)
Start time of the job.
-
statusCode(optional):
integer(int32)
Job status code indicating progress. Each code has a corresponding statusMessage.
100 = IN_PROGRESS, 200 = COMPLETED, 300 = COMPLETED_WITH_WARNINGS, 400 = FAILED.
-
statusMessage(optional):
string
Job status message string indicating progress. Each string has a corresponding statusCode.
100 = IN_PROGRESS, 200 = COMPLETED, 300 = COMPLETED_WITH_WARNINGS, 400 = FAILED.
-
userName(optional):
string
User who ran the job. Users have access to job listings based on their assigned user role. For example, if you have the Service Administrator role, you can see all jobs; if you have the User role, you can see only the jobs you ran.
object
object
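The status codes and messages above pair one-to-one, so a small lookup table is enough to interpret a polled response. A sketch (these helper names are illustrative, not part of the API):

```python
# Mapping from the documented job status codes to their messages.
JOB_STATUS = {
    100: "IN_PROGRESS",
    200: "COMPLETED",
    300: "COMPLETED_WITH_WARNINGS",
    400: "FAILED",
}

def is_done(status_code):
    """A job is finished once it leaves IN_PROGRESS (100)."""
    return status_code != 100

def is_success(status_code):
    """COMPLETED and COMPLETED_WITH_WARNINGS both count as success."""
    return status_code in (200, 300)
```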
400 Response
Bad Request
The application may not exist, or the application parameter may be incorrect; the database may not exist, or the database parameter may be incorrect; or a null argument may have been passed.
500 Response
Internal Server Error.
503 Response
Service Unavailable
Naming exception or server exception.
Examples
The following examples show how to run jobs. In these examples, cURL accesses the REST API from within a Windows shell script.
The calling user's ID and password are variables whose values are set in properties.bat.
The jobs you can run are:
- Load Data
- Build Dimensions
- Run Calculation
- Clear Data
- Import Cube from Application Workbook
- Export Cube to Application Workbook
- Back Up Cube with LCM
- Restore Cube with LCM
- Build Aggregation
- Clear Aggregations
- Export Data
- Run MDX Script
Load Data
The following cURL example shows you how to run a data load job. Requires at least Database Update permission.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"Sample\",
\"db\": \"Basic\",
\"jobtype\": \"dataload\",
\"parameters\":
{ \"file\": \"Data_Basic.txt\",
\"abortOnError\": \"true\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 1346,
"appName": "Sample",
"dbName": "Basic",
"jobType": "Data Load",
"jobfileName": null,
"userName": "admin",
"startTime": 1574456694000,
"endTime": 1574456694000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"dataFileName": "Data_Basic",
"abortOnError": true,
"useConnection": false
},
"jobOutputInfo": {
"recordsProcessed": 0,
"recordsRejected": 0,
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/1346",
"method": "GET"
}
]
}
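Each response carries a links array, and the entry with rel "Job Status" is the GET URL to poll for completion. A minimal sketch of extracting it (the function name is illustrative):

```python
def job_status_link(links):
    """Return the href of the 'Job Status' link, or None if absent."""
    for link in links:
        if link.get("rel") == "Job Status":
            return link["href"]
    return None

# The links array from the data load response above.
links = [
    {"rel": "self", "href": "https://192.0.2.1:443/essbase/rest/v1/jobs", "method": "POST"},
    {"rel": "canonical", "href": "https://192.0.2.1:443/essbase/rest/v1/jobs", "method": "POST"},
    {"rel": "Job Status", "href": "https://192.0.2.1:443/essbase/rest/v1/jobs/1346", "method": "GET"},
]
status_url = job_status_link(links)
```

You would then GET that URL (with the same credentials) until statusCode leaves 100.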
Build Dimensions
The following cURL example shows you how to run a dimension build job. Requires at least Database Manager permission.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"ASOSamp\",
\"db\": \"Basic\",
\"jobtype\": \"dimbuild\",
\"parameters\":
{ \"file\": \"Dim_Products.txt\",
\"rule\": \"Products.rul\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 4,
"appName": "ASOSamp",
"dbName": "Basic",
"jobType": "Dimension Build",
"jobfileName": "Products",
"userName": "power1",
"startTime": 1574814746000,
"endTime": 1574814746000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"rulesFileName": "Products",
"dataFileName": "Dim_Products",
"useConnection": false,
"restructureOption": 1,
"forceDimBuild": false
},
"jobOutputInfo": {
"recordsProcessed": 0,
"recordsRejected": 0,
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/4",
"method": "GET"
}
]
}
Run Calculation
The following cURL example shows you how to execute a calculation script. Requires at least Database Update permission, as well as provisioned access to the calculation script. Prerequisite: upload the script, as a .csc file, to the cube directory.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"Sample\",
\"db\": \"Basic\",
\"jobtype\": \"calc\",
\"parameters\":
{ \"file\": \"CalcAll.csc\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 1434,
"appName": "Sample",
"dbName": "Basic",
"jobType": "Calc Execution",
"jobfileName": null,
"userName": "admin",
"startTime": 1574733981000,
"endTime": 1574733981000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {},
"jobOutputInfo": {
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/1434",
"method": "GET"
}
]
}
Clear Data
The following cURL example shows you how to run a job to clear cube data. Requires at least Database Update permission.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"Sample\",
\"db\": \"Basic\",
\"jobtype\": \"clear\",
\"parameters\": {
\"option\": \"PARTIAL_DATA\",
\"partialDataExpression\":\"{Feb}\"}
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 116,
"appName": "Sample",
"dbName": "Basic",
"jobType": "Clear Data",
"jobfileName": null,
"userName": "dbupdater",
"startTime": 1598329480000,
"endTime": 1598329480000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"clearDataOption": "PARTIAL_DATA"
},
"jobOutputInfo": {
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/116",
"method": "GET"
}
]
}
Import Cube from Application Workbook
The following cURL example shows you how to run a job that imports a cube from an Excel application workbook. Requires at least power user role, or Application Manager permission.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"ASOSamp\",
\"db\": \"Basic\",
\"jobtype\": \"importExcel\",
\"parameters\":
{ \"loaddata\": \"false\",
\"overwrite\": \"true\",
\"deleteExcelOnSuccess\": \"false\",
\"catalogExcelPath\": \"/gallery/Applications/Demo Samples/Aggregate Storage/\",
\"importExcelFileName\": \"ASO_Sample.xlsx\",
\"recreateApplication\": \"true\",
\"createFiles\": \"true\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 2,
"appName": "ASOSamp",
"dbName": "Basic",
"jobType": "Import Excel",
"jobfileName": "ASO_Sample.xlsx",
"userName": "power1",
"startTime": 1574810127000,
"endTime": 1574810127000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"catalogExcelPath": "/gallery/Applications/Demo Samples/Aggregate Storage/",
"importExcelFileName": "ASO_Sample.xlsx",
"isLoadData": false,
"recreateApplication": true,
"isCreateFiles": true,
"isExecuteScript": false
},
"jobOutputInfo": {
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/2",
"method": "GET"
}
]
}
Export Cube to Application Workbook
The following cURL example shows you how to run a job that exports a cube to an Excel application workbook. Requires at least Database Manager permission.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"Sample\",
\"db\": \"Basic\",
\"jobtype\": \"exportExcel\",
\"parameters\":
{
\"dataLevel\": \"ALL_DATA\",
\"columnFormat\": \"false\",
\"compress\": \"false\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 10,
"appName": "Sample",
"dbName": "Basic",
"jobType": "Export Excel",
"jobfileName": null,
"userName": "admin",
"startTime": 1575413474000,
"endTime": 1575413474000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"calc": false,
"data": false
},
"jobOutputInfo": {
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/10",
"method": "GET"
}
]
}
Example of Response from Get Job {id}
{
"job_ID": 10,
"appName": "Sample",
"dbName": "Basic",
"jobType": "Export Excel",
"jobfileName": null,
"userName": "admin",
"startTime": 1575413474000,
"endTime": 1575413490000,
"statusCode": 200,
"statusMessage": "Completed",
"jobInputInfo": {
"calc": false,
"data": false
},
"jobOutputInfo": {
"errorMessage": "",
"metadataFile": "/applications/Sample/Basic/Basic.xlsx"
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/10",
"method": "GET"
},
{
"rel": "post",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/10",
"method": "POST"
}
]
}
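The startTime and endTime values in these responses are epoch timestamps in milliseconds; dividing by 1000 before handing them to a date library recovers wall-clock times. For the completed export above, the two timestamps are 16 seconds apart:

```python
from datetime import datetime, timezone

def job_duration_seconds(start_ms, end_ms):
    """Elapsed time from the millisecond epoch timestamps in a job response."""
    return (end_ms - start_ms) / 1000

# Values from the Get Job response above.
started = datetime.fromtimestamp(1575413474000 / 1000, tz=timezone.utc)
duration = job_duration_seconds(1575413474000, 1575413490000)
```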
Back Up Cube with LCM
The following cURL example shows you how to run a job that backs up cube artifacts to a Lifecycle Management (LCM) .zip file. Requires at least Application Manager permission.
This job type can be run from outside the Essbase machine, whereas the LCM utility must be run on the Essbase machine.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"Sample\",
\"jobtype\": \"lcmExport\",
\"parameters\":
{
\"zipFileName\": \"Sample1.zip\",
\"skipdata\": \"true\",
\"include-server-level\": \"false\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 11,
"appName": "Sample",
"dbName": null,
"jobType": "LCM Export",
"jobfileName": "Sample1.zip",
"userName": "appmanager",
"startTime": 1575424208000,
"endTime": 1575424208000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"lcmExportFileName": "Sample1.zip",
"skipdata": true,
"copyToStorage": false,
"threads": 10,
"include-server-level": false,
"generateArtifactList": false,
"filesystemcopy": false,
"disasterRecovery": false,
"verbose": false
},
"jobOutputInfo": {
"errorMessage": "",
"infoMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/11",
"method": "GET"
}
]
}
Example of Response from Get Job {id}
{
"job_ID": 11,
"appName": "Sample",
"dbName": null,
"jobType": "LCM Export",
"jobfileName": "Sample1.zip",
"userName": "appmanager",
"startTime": 1575424208000,
"endTime": 1575424228000,
"statusCode": 200,
"statusMessage": "Completed",
"jobInputInfo": {
"lcmExportFileName": "Sample1.zip",
"skipdata": true,
"copyToStorage": false,
"threads": 10,
"include-server-level": false,
"generateArtifactList": false,
"filesystemcopy": false,
"disasterRecovery": false,
"verbose": false
},
"jobOutputInfo": {
"errorMessage": "",
"infoMessage": "",
"lcmExportFilePath": "/users/appmanager/Sample1.zip"
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/11",
"method": "GET"
},
{
"rel": "post",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/11",
"method": "POST"
}
]
}
Restore Cube with LCM
The following cURL example shows you how to run a job that restores cube artifacts from a Lifecycle Management (LCM) .zip file. To do this, you must be the power user who created the application, or a service administrator.
This job type can be run from outside the Essbase machine, whereas the LCM utility must be run on the Essbase machine.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"jobtype\": \"lcmImport\",
\"parameters\":
{ \"zipFileName\": \"Sample1.zip\",
\"include-server-level\": \"false\",
\"targetApplicationName\": \"Sample_dup\",
\"overwrite\": \"true\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 12,
"appName": null,
"dbName": null,
"jobType": "LCM Import",
"jobfileName": "Sample1.zip",
"userName": "admin",
"startTime": 1575425649000,
"endTime": 1575425649000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"lcmImportFileName": "Sample1.zip",
"lcmImportTargetApplicationName": "Sample_dup",
"lcmImportFromStorage": false,
"overwrite": true,
"include-server-level": false,
"verbose": false
},
"jobOutputInfo": {
"errorMessage": "",
"infoMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/12",
"method": "GET"
}
]
}
Build Aggregation
The following cURL example shows you how to run a job that builds an aggregation. Requires at least Database Access permission.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"ASOSamp\",
\"db\": \"Basic\",
\"jobtype\": \"buildAggregation\",
\"parameters\":
{ \"ratioToStop\": \"1.1\",
\"basedOnQueryData\": \"false\",
\"enableAlternateRollups\": \"false\" }
}"
-u %User%:%Password%
Aggregations apply to aggregate storage cubes. Aggregations are consolidations, based on outline hierarchy, of level 0 data values in an aggregate storage cube. The term aggregation is used to refer to the aggregation process and the set of values stored as a result of the process.
An aggregation contains one or more aggregate views, which are collections of aggregate cells. When you build an aggregation, Essbase selects aggregate views to be rolled up, aggregates them, and stores the cell values in the selected views. If an aggregation includes aggregate cells dependent on level 0 values that are changed through a data load, the higher-level values are automatically updated at the end of the data load process.
When you build an aggregation, Essbase
- selects 0 or more aggregate views based on the stopping value (ratioToStop) and/or on querying patterns (basedOnQueryData), if these parameters are given
- builds the views that were selected
Note: The MaxL equivalent of this job type is the execute aggregate process statement.
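The ratioToStop parameter used in this example bounds the materialized size relative to the input-level size, as described earlier in the parameter list. The arithmetic, as an illustrative sketch (the function name is not part of the API):

```python
def max_materialized_size_gb(input_level_size_gb, ratio_to_stop):
    """Upper bound on cube size after aggregation, per ratioToStop.

    A ratio of 0 means no stopping value is applied (unbounded).
    """
    if ratio_to_stop == 0:
        return None  # no stopping value
    return input_level_size_gb * ratio_to_stop

# A 1 GB input-level cube with ratioToStop 1.1 (as in the request above)
# may grow to at most 1.1 GB after materialization.
limit = max_materialized_size_gb(1.0, 1.1)
```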
Example of Response Body
{
"job_ID": 8,
"appName": "ASOSamp",
"dbName": "Basic",
"jobType": "Build Aggregation",
"jobfileName": null,
"userName": "dbaccess",
"startTime": 1575411748000,
"endTime": 1575411748000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"enableAlternateRollups": false,
"basedOnQueryData": false,
"ratioToStop": 1.1
},
"jobOutputInfo": {
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/8",
"method": "GET"
}
]
}
Clear Aggregations
The following cURL example shows you how to run a job that clears aggregations on an aggregate storage cube.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"ASOSamp\",
\"db\": \"Basic\",
\"jobtype\": \"clearAggregation\"}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 9,
"appName": "ASOSamp",
"dbName": "Basic",
"jobType": "Clear Aggregation",
"jobfileName": null,
"userName": "dbaccess",
"startTime": 1575412855000,
"endTime": 1575412855000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobOutputInfo": {
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/9",
"method": "GET"
}
]
}
Export Data
The following cURL example shows you how to export data from a cube. Requires at least Database Manager permission.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"Sample\",
\"db\": \"Basic\",
\"jobtype\": \"exportData\",
\"parameters\":
{ \"compress\": \"false\",
\"columnFormat\": \"false\",
\"dataLevel\": \"LEVEL_ZERO_BLOCKS\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 28,
"appName": "Sample",
"dbName": "Basic",
"jobType": "Export Data",
"jobfileName": null,
"userName": "dbmanager",
"startTime": 1575920712000,
"endTime": 1575920712000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {
"compress": false,
"columnFormat": false,
"dataLevel": "LEVEL_ZERO_BLOCKS"
},
"jobOutputInfo": {
"scriptOutputFileName": "",
"scriptOutputFileNamePath": "",
"infoMessage": "",
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/28",
"method": "GET"
}
]
}
Run MDX Script
The following cURL example shows you how to run an MDX script that performs an insert or export. Requires at least Database Access permission.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/jobs"
-H "accept: application/json"
-H "Content-Type: application/json"
-d "{ \"application\": \"Sample\",
\"db\": \"Basic\",
\"jobtype\": \"mdxScript\",
\"parameters\":
{ \"file\": \"shared/mdx_scripts/export_examp.mdx\" }
}"
-u %User%:%Password%
Example of Response Body
{
"job_ID": 26,
"appName": "Sample",
"dbName": "Basic",
"jobType": "MDX Script",
"jobfileName": null,
"userName": "dbaccess",
"startTime": 1575918425000,
"endTime": 1575918425000,
"statusCode": 100,
"statusMessage": "In Progress",
"jobInputInfo": {},
"jobOutputInfo": {
"errorMessage": ""
},
"links": [
{
"rel": "self",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "canonical",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs",
"method": "POST"
},
{
"rel": "Job Status",
"href": "https://192.0.2.1:443/essbase/rest/v1/jobs/26",
"method": "GET"
}
]
}