Create Buffer
/essbase/rest/v1/applications/{applicationName}/databases/{databaseName}/asodataload/buffers
Creates an aggregate storage data load buffer with the specified options.
In the REST API, the jobType for loading data without buffers is dataload. To load data incrementally to aggregate storage cubes (using buffers), use the asoBufferDataLoad and asoBufferCommit job types.
The workflow in REST API to load data to aggregate storage cubes using buffers is:
- Create (initialize) the buffer(s)
- (Optional) List existing buffers (List Buffers)
- Run jobType asoBufferDataLoad to load data into the buffer(s) (Execute Job)
- Run jobType asoBufferCommit to commit data from the buffer(s) to the aggregate storage cube (Execute Job)
- (Optional) Destroy initialized load buffer without committing data (Destroy Dataload Buffer)
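The workflow above can be sketched as request builders. The following Python sketch only constructs the three REST calls (it does not send them, and omits authentication); the host, application, cube, rule, and file names are placeholders, not values mandated by the API.

```python
import json

# Assumed server endpoint; replace host and port with your Essbase instance.
BASE = "https://myserver.example.com:9001/essbase/rest/v1"

def create_buffer_request(app, db, buffer_id, resource_usage=0):
    """Step 1: initialize an ASO data load buffer."""
    url = f"{BASE}/applications/{app}/databases/{db}/asodataload/buffers"
    body = {
        "bufferId": buffer_id,
        "duplicateAggregationMethod": "ADD",
        "loadBufferOptions": "IGNORE_NONE",
        "resourceUsage": resource_usage,
    }
    return ("POST", url, json.dumps(body))

def load_job_request(app, db, buffer_id, rule, file):
    """Step 2: run the asoBufferDataLoad job to fill the buffer."""
    body = {
        "application": app,
        "db": db,
        "jobtype": "asoBufferDataLoad",
        "parameters": {
            "bufferId": buffer_id,
            "rule": rule,
            "file": file,
            "abortOnError": "false",
        },
    }
    return ("POST", f"{BASE}/jobs", json.dumps(body))

def commit_job_request(app, db, buffer_ids):
    """Step 3: run asoBufferCommit to move buffered data into the cube."""
    body = {
        "application": app,
        "db": db,
        "jobtype": "asoBufferCommit",
        "parameters": {
            "bufferIds": buffer_ids,
            "commitOption": "ADD_DATA",
            "actionType": "COMMIT",
            "termOption": "INCR_TO_NEW_SLICE",
        },
    }
    return ("POST", f"{BASE}/jobs", json.dumps(body))
```

Sending each request (with credentials) can be done with cURL, as shown in the examples below, or with any HTTP client.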
Request
Supported Media Types
- application/json
- application/xml
Path Parameters
- applicationName (required): string
  Application name.
- databaseName (required): string
  Database name.
Body (object)
- bufferId: integer(int64)
  Unique ID of a single aggregate storage data load buffer. Must be a number between 1 and 4294967296.
- duplicateAggregationMethod: string
  Allowed values: ADD, ASSUME_EQUAL, USE_LAST
  Option for resolving cell conflicts when the aggregate storage data load buffer contains duplicate cells.
  - ADD: (Default) Add values when the buffer contains multiple values for the same cell.
  - ASSUME_EQUAL: Treat duplicate values as equal.
  - USE_LAST: Combine duplicate cells by using the value of the cell that was loaded last into the data load buffer.
- loadBufferOptions: string
  Allowed values: IGNORE_NONE, IGNORE_MISSING_VALUES, IGNORE_ZERO_VALUES, IGNORE_MISSING_AND_ZERO_VALUES
  Option determining how missing and zero values in the aggregate storage data load buffer should be handled.
  - IGNORE_NONE (0): Do not ignore any values in the incoming data stream.
  - IGNORE_MISSING_VALUES (1): Ignore #MI values in the incoming data stream.
  - IGNORE_ZERO_VALUES (2): Ignore zero values in the incoming data stream.
  - IGNORE_MISSING_AND_ZERO_VALUES (3): Ignore #MI and zero values in the incoming data stream.
- resourceUsage: integer(int64)
  Percentage of the total load buffer resources that the load buffer is allowed to use. Must be within [0, 100]; a value of 0 is interpreted as the default, which is currently 100.
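A client-side check can catch invalid payloads before the call returns a 400 response. The following sketch encodes the constraints stated in the schema above; the helper itself is illustrative and not part of the API.

```python
def validate_buffer_options(opts):
    """Check a Create Buffer request body against the documented constraints.

    Returns (errors, effective_resource_usage).
    """
    errors = []

    buffer_id = opts.get("bufferId")
    if buffer_id is not None and not (1 <= buffer_id <= 4294967296):
        errors.append("bufferId must be between 1 and 4294967296")

    method = opts.get("duplicateAggregationMethod", "ADD")
    if method not in {"ADD", "ASSUME_EQUAL", "USE_LAST"}:
        errors.append(f"invalid duplicateAggregationMethod: {method}")

    lbo = opts.get("loadBufferOptions", "IGNORE_NONE")
    if lbo not in {"IGNORE_NONE", "IGNORE_MISSING_VALUES",
                   "IGNORE_ZERO_VALUES", "IGNORE_MISSING_AND_ZERO_VALUES"}:
        errors.append(f"invalid loadBufferOptions: {lbo}")

    usage = opts.get("resourceUsage", 0)
    if not (0 <= usage <= 100):
        errors.append("resourceUsage must be within [0, 100]")
        effective = usage
    else:
        # A resourceUsage of 0 is interpreted by the server as the
        # default, which is currently 100.
        effective = 100 if usage == 0 else usage

    return errors, effective
```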
Response
- application/json
- application/xml
200 Response
OK
Load buffer created successfully.
400 Response
Bad Request
Failed to create load buffer.
500 Response
Internal Server Error.
Examples
Aggregate storage cubes facilitate analysis of very large dimensions containing up to a million or more members. To support efficient, incremental loading of data values into such large cubes, Essbase:
- Allows the processing of multiple data sources through temporary aggregate storage data load buffers
- Allows you to control the percentage of resources a data load buffer uses
- Allows an aggregate storage database to contain multiple slices of data (a query to the database accesses each slice, collecting all of the data cells)
- Completes the incremental load process in a length of time proportional to the size of the data
To support loading data incrementally to large cubes, multiple data load buffers can be initialized on an aggregate storage cube. When you create a buffer with this REST API, you can specify the following properties:
- bufferId -- ID of a data load buffer (a number between 1 and 4294967296).
- duplicateAggregationMethod -- Method used when the data stream being loaded into the buffer contains multiple values for the same cell:
  - ADD -- Add values when the buffer contains multiple values for the same cell.
  - ASSUME_EQUAL -- Stop loading if there are duplicates with unequal values.
  - USE_LAST -- Combine duplicate cells by using the value of the cell that was loaded last into the load buffer. This is an optimal choice for loading text and date values, to help eliminate invalid aggregations.
- loadBufferOptions -- How missing and zero values should be handled during the load.
  - IGNORE_NONE -- Ignore neither #MISSING nor zero values in the incoming data stream. To select this option, pass either this keyword or 0.
  - IGNORE_MISSING_VALUES -- Ignore #MISSING values in the incoming data stream. To select this option, pass either this keyword or 1.
  - IGNORE_ZERO_VALUES -- Ignore zeros in the incoming data stream. To select this option, pass either this keyword or 2.
  - IGNORE_MISSING_AND_ZERO_VALUES -- Ignore zero and #MISSING values in the incoming data stream. To select this option, pass either this keyword or 3.
- resourceUsage -- Percentage of the total load buffer resources that the load buffer is allowed to use. Must be within [0, 100]. Default is 100, and 0 is interpreted as the default.
Copy/Paste-able Script with cURL Command to Initialize a Load Buffer
The following example shows how to initialize an aggregate storage data load buffer. In this buffer, Essbase accumulates and sorts data values from multiple sources.
This example uses cURL to call the REST API from a Windows shell script.
The calling user's ID and password are variables whose values are set in properties.bat.
call properties.bat
curl -X POST "https://192.0.2.1:443/essbase/rest/v1/applications/ASOSamp/databases/Basic/asodataload/buffers" -H "accept: application/json" -H "Content-Type: application/json" -u %User%:%Password% -d "{\"bufferId\": 100, \"duplicateAggregationMethod\": \"ADD\", \"loadBufferOptions\": \"IGNORE_NONE\", \"resourceUsage\": 25}"
Workflow Example for Two Buffers
The following example shows the sequence of request payloads required to load data incrementally using two aggregate storage data load buffers.
Initialize Load Buffer(s)
Create load buffers.
Sample JSON Payloads:
{
"bufferId": 100,
"duplicateAggregationMethod": "ADD",
"loadBufferOptions": "IGNORE_NONE",
"resourceUsage": 50
}
{
"bufferId": 101,
"duplicateAggregationMethod": "ADD",
"loadBufferOptions": "IGNORE_NONE",
"resourceUsage": 50
}
Sample cURL:
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/applications/ASOSamp/databases/Basic/asodataload/buffers" -H Accept:application/json -H "Content-Type: application/json" -d '{"bufferId": 100, "duplicateAggregationMethod": "ADD", "loadBufferOptions": "IGNORE_NONE", "resourceUsage": 50}'
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/applications/ASOSamp/databases/Basic/asodataload/buffers" -H Accept:application/json -H "Content-Type: application/json" -d '{"bufferId": 101, "duplicateAggregationMethod": "ADD", "loadBufferOptions": "IGNORE_NONE", "resourceUsage": 50}'
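The two payloads above can also be generated programmatically. The following sketch splits the load buffer resources evenly across N buffers; the even split mirrors this example's 50/50 choice and is not an API requirement.

```python
import json

def buffer_payloads(first_id, count):
    """Build Create Buffer payloads for `count` sequentially numbered
    buffers, giving each an equal share of load buffer resources."""
    share = 100 // count
    return [
        json.dumps({
            "bufferId": first_id + i,
            "duplicateAggregationMethod": "ADD",
            "loadBufferOptions": "IGNORE_NONE",
            "resourceUsage": share,
        })
        for i in range(count)
    ]
```

Each payload is then POSTed to the asodataload/buffers endpoint, as in the cURL samples above.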
Load Data Into Buffer(s)
Load data into the initialized buffers using the asoBufferDataLoad job type, available in Jobs.
Parameter | Description
---|---
application | Required. Application name.
db | Required. Cube name.
jobtype | Required. Job type (in this case, asoBufferDataLoad).
bufferId | Required. Unique ID of a single aggregate storage data load buffer. Must be a number between 1 and 4294967296.
rule | Required. Rule file to use for loading into the buffer.
file | Required, unless you will connect to an external source (using user+password OR useConnection+connection). Data file to load into the buffer.
abortOnError | Required. Error handling choice. If true, the data load stops on the first error. If false, the data load continues despite errors.
user | Optional. Use if the rule file is configured to load data using connectivity to ODBC, OCI, or DSN, without using a saved connection. User name authorized to access the external database.
password | Optional, unless you entered a user name for connectivity to an external source. Password of the user authorized to access the external database.
useConnection | Optional. true if the data load should use a saved connection that establishes network connectivity between Essbase and an external source of data. false if the data load is directly from a file in the catalog, or if you are providing a user name and password to establish connectivity to an external source of data.
connection | Required if useConnection is true. Name of a saved connection that establishes network connectivity between Essbase and an external source of data. Example: ASOSamp.conn_name
Sample JSON Payload Using Data File:
{
"application": "ASOSamp",
"db": "Basic",
"jobtype": "asoBufferDataLoad",
"parameters": {
"bufferId": 100,
"rule": "data.rul",
"file": "Basic.txt",
"abortOnError": "false"
}
}
Sample JSON Payload Using Application-level Connection:
{
"application": "ASOSamp",
"db": "Basic",
"jobtype": "asoBufferDataLoad",
"parameters": {
"bufferId": 101,
"rule": "data.rul",
"useConnection": "true",
"connection": "ASOSamp.conn_name",
"abortOnError":"false"
}
}
Sample cURL:
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs" -H "accept: application/json" -H "Content-Type: application/json" -d '{"application":"ASOSamp", "db":"Basic", "jobtype":"asoBufferDataLoad", "parameters":{"bufferId": 101, "rule": "data.rul", "useConnection": true, "connection": "ASOSamp.conn_name", "abortOnError": "false"}}'
Commit Buffer(s)
Move data from the buffers to the aggregate storage cube, using the asoBufferCommit job type, available in Jobs.
Parameter | Description
---|---
application | Application name.
db | Cube name.
jobtype | Job type (in this case, asoBufferCommit).
bufferId OR bufferIds | bufferId: Unique ID of a single aggregate storage data load buffer. Must be a number between 1 and 4294967296. bufferIds: Array of more than one buffer ID.
commitOption | Load buffer commit option to use when committing the contents of the data load buffer to the cube (for example, ADD_DATA, as used in the sample payload below).
actionType | What to do if there is an error committing the load buffer (for example, COMMIT, as used in the sample payload below).
termOption | Final option for committing data slices to the cube from the data load buffer (for example, INCR_TO_NEW_SLICE, as used in the sample payload below).
Sample JSON Payload:
{
"application":"ASOSamp",
"db":"Basic",
"jobtype":"asoBufferCommit",
"parameters":{
"bufferIds":[100,101],
"commitOption":"ADD_DATA",
"actionType":"COMMIT",
"termOption":"INCR_TO_NEW_SLICE"
}
}
Sample cURL:
curl -X POST "https://myserver.example.com:9001/essbase/rest/v1/jobs" -H "accept: application/json" -H "Content-Type: application/json" -d '{"application":"ASOSamp", "db":"Basic", "jobtype":"asoBufferCommit", "parameters":{"bufferIds": [100,101], "commitOption": "ADD_DATA", "actionType": "COMMIT", "termOption":"INCR_TO_NEW_SLICE"}}'
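The commit call can also be issued without cURL. The following Python sketch uses only the standard library; basic authentication and a trusted TLS certificate on the server are assumed, and the host, credentials, and buffer IDs are placeholders.

```python
import base64
import json
import urllib.request

def build_commit_request(base_url, app, db, buffer_ids, user, password):
    """Build an asoBufferCommit job request equivalent to the cURL
    sample above, with a basic-auth Authorization header."""
    payload = {
        "application": app,
        "db": db,
        "jobtype": "asoBufferCommit",
        "parameters": {
            "bufferIds": buffer_ids,
            "commitOption": "ADD_DATA",
            "actionType": "COMMIT",
            "termOption": "INCR_TO_NEW_SLICE",
        },
    }
    req = urllib.request.Request(
        f"{base_url}/jobs",
        data=json.dumps(payload).encode("utf-8"),
        method="POST",
    )
    req.add_header("Accept", "application/json")
    req.add_header("Content-Type", "application/json")
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    req.add_header("Authorization", f"Basic {token}")
    return req

def send(req):
    """Issue the request; returns (HTTP status, parsed JSON body)."""
    with urllib.request.urlopen(req) as resp:
        return resp.status, json.loads(resp.read() or b"{}")
```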