Data Set Operations

This API allows the user to generate the payload for a particular data set and/or start the import for a data set. The purpose of this API is to allow the exchange of data sets between application environments without having to access the application screens.

This documentation refers to the environment that loads the data set as the 'target' environment. The environment on which the data set is generated is referred to as the 'source' environment.

This documentation refers to creating a data set and building a data set. Creating the data set identifies the items to collect in the data set. Building the data set gathers all the details about these items and writes the items and their details to the output. Any updates made to the migration items between the create and the build therefore make it into the result; new items created between the create and the build do not.

This integration point supports the handling of data sets that belong to the following definitions:

  • CLAIMS_CONFIGURATION

  • PROVIDERS

  • PROCEDURES

  • DIAGNOSES

  • PRODUCT_BUILDING_BLOCKS

Alternatively, depending on the application, this integration point supports the handling of data sets that belong to the following definitions:

  • POLICIES_CONFIGURATION

  • PROVIDERS

Operations

This API supports the following operations:

  • PUT /dataexchange/datasetdefinition/{dataSetDefinitionCode}/dataset
    on the source environment creates a data set.

  • POST /dataexchange/createandexport/datasetdefinition/{dataSetDefinitionCode}/dataset
    on the source environment creates and builds a data set.

  • POST /dataexchange/export/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode}
    on the source environment gathers all items of the data set and converts them into a payload structure (builds the data set).

  • GET /dataexchange/export/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode}
    on the source environment retrieves the data set’s payload.

  • GET /dataexchange/datasets/fetchavailabledatasets/{dataSetDefinitionCode}/{dataSetVersion}
    on the target environment collects all the available data sets from the registered source (producer) environments of the respective application.

  • POST /dataexchange/import/datasetdefinition/{dataSetDefinitionCode}
    on the target environment uploads a data set payload file that was downloaded from a source environment.

  • POST /dataexchange/import/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode}
    on the target environment starts the retrieval of the data set from the given source environment and the import of the data on the target environment.

The system also uses the above operations for the Configuration Migration tool.

Source Environment Operations

PUT: Create a Migration Set

A PUT request on /dataexchange/datasetdefinition/{dataSetDefinitionCode}/dataset creates a new data set.

The payload structure for creating the data set is as follows:

Payload Structure for Creating a Data Set
{
  "code":"<migration set code>",
  "description":"<description of the migration set code>",
  "exactVersionMatch": true,
  "disableDeleteByOmission": true,
  "configMigrationItems": [{
    "tableId": <tableId>,
    "recordId" : <recordId>
    },
    ...
  ],
  "excludeOptions": [{
    "tableTopId": <tableTopId>,
    "tableDependentId" : <tableDependentId>
    },
    ...
  ],
  "globalIncludeOptions": [{
    "tableId" : <tableId>
   },
   ...
  ]
}

The following payload example creates a migration set that includes all group clients but excludes the dynamic logic they refer to.

Example Payload to Create a Migration Set that Includes All Group Clients Without the Referenced Dynamic Logic:
{
  "code":"SampleTest",
  "description":"SampleTest",
  "exactVersionMatch" : true,
  "disableDeleteByOmission" : true,
  "configMigrationItems" : [{
    "tableId": 18039405 // id of the GROUP CLIENTS table
   }
  ],
  "excludeOptions": [{
    "tableTopId": 18039405,
    "tableDependentId" : 2220 // id of the DYNAMIC LOGIC table
   }
  ]
}

Response Message

The request creates a data set with the requested items and returns HTTP status code 201, with the URI of the created data set in the Location header.
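
For illustration, the following minimal sketch issues this PUT with Python's requests library. The host, port, credentials, and the CLAIMS_CONFIGURATION definition code are placeholder assumptions, not values prescribed by the application.

import requests

BASE = "https://source-host-1:8080/api"  # placeholder source environment

# Payload as described above; the example in the text uses only a tableId.
payload = {
    "code": "SampleTest",
    "description": "SampleTest",
    "exactVersionMatch": True,
    "disableDeleteByOmission": True,
    "configMigrationItems": [{"tableId": 18039405}],
}

resp = requests.put(
    f"{BASE}/dataexchange/datasetdefinition/CLAIMS_CONFIGURATION/dataset",
    json=payload,
    auth=("a_username", "a_secret_password"),  # user with the 'dataexchange IP' grant
)
resp.raise_for_status()              # expect HTTP 201
print(resp.headers.get("Location"))  # URI of the created data set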

POST: Create and Build the Payload

The POST request message on /dataexchange/createandexport/datasetdefinition/{dataSetDefinitionCode}/dataset on the source environment creates and builds a data set.

The payload structure is the same as for the PUT operation.

For the specified definitions, the system creates a migration set for full migration and builds it to produce an outbound file for full migration. If a migration set with the specified set code already exists, the system returns an error.

The full migration set includes all the top-level items, along with the reference data. Refer to the chapter "Create a Migration Set" in the Configuration Guide of the applications for details on top-level and reference data items.

The request starts a background process that gathers all items, creates a set, and then converts it into a payload structure. The build process is similar to the Build the Payload operation; for details on the response, refer to that operation (described below). The payload is exported on the source environment. It can be imported into a target environment by using one of the import requests, described separately in this guide. Note that the Build the Payload request can be used to rebuild the payload file, if needed, by providing the set code created by this process.

This operation is protected by the access restriction 'datasetoperation createandexport API'.
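
As an illustrative sketch, the request below sends a minimal payload without explicit items, on the assumption that the full migration set is assembled automatically as described above; the host, credentials, set code, and definition code are assumptions, and the response is expected to resemble the build response described below (HTTP 201 with a process URI).

import requests

BASE = "https://source-host-1:8080/api"  # placeholder source environment

# Create the migration set and start the build in a single request.
resp = requests.post(
    f"{BASE}/dataexchange/createandexport/datasetdefinition/CLAIMS_CONFIGURATION/dataset",
    json={
        "code": "FullMigrationSet",
        "description": "Full migration example",
        "exactVersionMatch": True,
        "disableDeleteByOmission": True,
    },
    auth=("a_username", "a_secret_password"),  # user granted 'datasetoperation createandexport API'
)
resp.raise_for_status()
print(resp.headers.get("Location"))  # data set process URI, per the build response below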

The following error messages can be returned in the response:

Table 1. Response Messages

Code           Severity   Message
GEN-MIGR-008   Fatal      It is not possible to start a new build while another build or import is in progress
GEN-MIGR-010   Fatal      It is not possible to start a build with empty data set

POST: Build the Payload

The POST request on /dataexchange/export/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode} constructs the data set’s payload.

The request starts a background process that gathers all items of the data set and converts them into a payload structure. The payload is exported and stored on the source environment itself. It can be imported into a target environment by using one of the import requests, described separately below.

Example of a POST Request Message to Build the Data Set
{
  "inclusionDate" : "2023-12-31",
  "exactVersionMatch": true,
  "disableDeleteByOmission": true
}

The inclusion date is optional.

Response Message

The application responds with an HTTP 201, containing the URI location of the Data Set Process that was started, that is, /generic/datasetprocesses/{id}. The client can use this location to track the export’s progress through a GET operation.

The attributes endDateTime and result in the response message are empty while the process is still running.

Once the process completes, the endDateTime and the result are available in the response message. The result is either S (success) or F (failure).

The response message also includes a link to the interface messages related to the data set build process. If there were any technical errors in the build process, then those can be discovered by following the interface message link.
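
For illustration, the following sketch starts a build and polls the returned process location until the endDateTime and result are filled in; the host, credentials, definition code, and set code are placeholder assumptions.

import time
import requests

BASE = "https://source-host-1:8080/api"  # placeholder source environment
AUTH = ("a_username", "a_secret_password")

# Start the build; the response is HTTP 201 with the process URI in the Location header.
resp = requests.post(
    f"{BASE}/dataexchange/export/datasetdefinition/CLAIMS_CONFIGURATION/dataset/SampleTest",
    json={"exactVersionMatch": True, "disableDeleteByOmission": True},
    auth=AUTH,
)
resp.raise_for_status()
process_url = resp.headers["Location"]

# Poll the data set process; endDateTime and result stay empty while it runs.
while True:
    process = requests.get(process_url, auth=AUTH).json()
    if process.get("endDateTime"):
        break
    time.sleep(5)

print("result:", process["result"])  # 'S' for success, 'F' for failure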

Once the process completes, the system sends out a notification message. Configure the URI for that message by using the system property "ohi.datasetoperations.notification.endpoint.export". The system uses a PUT operation for that purpose, with the following payload structure:

Payload Structure of the Notification Message
<buildDataSetResponse>
  <resultMessages
    result
  >
    <resultMessage
      code
      text
    />
  </resultMessages>
</buildDataSetResponse>

The web service can return the following error messages in the response:

Table 2. Error Messages

Code              Severity   Message
OHI-IP-DATA-001   Fatal      Unknown combination of data set code {code} and data set definition code {code}
GEN-MIGR-008      Fatal      It is not possible to start a new build while another build or import is in progress
GEN-MIGR-010      Fatal      It is not possible to start a build with empty data set

Authorization

This API requires a grant for access restriction dataexchange IP.

Download Payload

To retrieve the data set’s payload, use a GET request on /dataexchange/export/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode}.

The data set has to be built first. If it has not been built, the data set does not have a payload and the request returns HTTP 204 (No Content).

Depending on the data set definition, the response body format can be either XML or ZIP.
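
A sketch of downloading the payload and handling the not-built case follows; the host, credentials, codes, and output file name are placeholder assumptions.

import requests

BASE = "https://source-host-1:8080/api"  # placeholder source environment

resp = requests.get(
    f"{BASE}/dataexchange/export/datasetdefinition/CLAIMS_CONFIGURATION/dataset/SampleTest",
    auth=("a_username", "a_secret_password"),
    stream=True,
)
if resp.status_code == 204:
    print("No payload available; build the data set first.")
else:
    resp.raise_for_status()
    # The body is XML or ZIP depending on the data set definition.
    with open("datasetpayloadfile.zip", "wb") as f:
        for chunk in resp.iter_content(chunk_size=8192):
            f.write(chunk)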

Authorization

This API requires a grant for access restriction dataexchange IP.

Target Environment Operations

The target system can import data sets from files on the target environment (already downloaded from a source environment) or from data sets that are still on a source environment.

In the target environment the following operations are available to fetch and load the data sets.

Read Stop or Start Task Processing Dequeue before starting any import.

Check on Available Datasets

Use a GET request on /dataexchange/datasets/fetchavailabledatasets/{dataSetDefinitionCode}/{dataSetVersion} in the target environment to collect all the available data sets from all the registered source environments of the application. Note that this requires credentials for the source environments; see Credentials below.

In the example response message below, the target environment has three registered source environments: source-host-1, source-host-2, and source-host-3. The user is authorized on 'source-host-1', is not authorized on 'source-host-2', and received a conflict error on 'source-host-3'.

This operation provides HTTP status codes as defined in Response Messages.
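
An illustrative request sketch follows; the host, credentials, definition code, and the data set version segment (here 1) are placeholder assumptions.

import requests

TARGET = "https://target-host:8080/api"  # placeholder target environment

resp = requests.get(
    f"{TARGET}/dataexchange/datasets/fetchavailabledatasets/CLAIMS_CONFIGURATION/1",
    auth=("a_username", "a_secret_password"),
)
resp.raise_for_status()

# Each entry reports the source host plus either an HTTP 200 with data sets
# or the error returned by that source (see the example response below).
for entry in resp.json()["dataSetMetaDataResponse"]:
    meta = entry["dataSetMetaData"]
    print(meta["hostName"], meta.get("httpResponse") or meta.get("errorDetails"))
    for data_set in entry.get("dataSetMetaDataRestEntities", []):
        print("  ", data_set["code"], data_set["description"])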

Example Response Message with Available Data Sets from Three Source Environments
{
  "dataSetMetaDataResponse": [
    {
      "dataSetMetaDataRestEntities": [
        {
          "code": "TEST_321",
          "description": "TEST_CONFIG_321",
          "env": "<host>",
          "url": "https://<source-host-1>:<port>/<application-context-root>/api/dataexchange",
          "numberOfItems": 1,
          "exactVersionMatch": false,
          "generatedDescription": "",
          "disableDeleteByOmission": false,
          "id": "654321",
          "inclusionDate": {
            "value": "2019-11-27T14:04:22.965+01:00"
          },
          "objectLastUpdatedDate": {
            "value": "2019-11-27T14:04:22.965+01:00"
          },
          "creationDate": {
            "value": "2019-11-27T13:59:30.411+01:00"
          },
          "lastPayloadBuiltDateTime": {
            "value": "2020-11-25T14:57:59.959+01:00"
          }
        }
      ],
      "dataSetMetaData": {
        "hostName": "https://<source-host-1>:<port>/<application-context-root>/api/dataexchange",
        "httpResponse": "200"
      }
    },
    {
      "dataSetMetaData": {
        "hostName": "https://<source-host-2>:<port>/<application-context-root>/api/dataexchange",
        "errorDetails": "HTTP 401 Unauthorized"
      }
    },
    {
      "dataSetMetaData": {
        "hostName": "https://<source-host-3>:<port>/<application-context-root>/api/dataexchange",
        "errorDetails": "HTTP 409 Conflict"
      }
    }
  ]
}

Registered Source Environments

Set up the system property ohi.application.uri.CONF in the target environment to configure one or more environments as a source for migrating configuration data. The value of this property should be the URL of the dataexchange API on the source environment. To specify multiple source environments, enter the URLs separated by semicolons.

Send a POST request to /api/generic/properties with the following payload:

Example Request Message to Register Two Environments as Source Environments
{
  "name": "ohi.application.uri.CONF",
  "value": "http://dev.com:port/api/dataexchange;http://test.com:port/api/dataexchange"
}

Set up the system property ohi.application.uri.PRD in the target environment to configure one or more environments as source for migration of product data.

Credentials

Set up the credentials for credential key DataExchangeClient in the target environment. This enables the target environment to access the dataexchange API on the source environment(s).

Example

Send a PUT request to /api/credentials/credential/DataExchangeClient with the following payload:

{
   "username": "a_username",
   "password": "a_secret_password"
}

There is only one credential key DataExchangeClient in the target environment. When registering more than one source environment, the username/password to access the sources must therefore be the same on all source environments.

The user "a_username" must exist in the source environment and have a grant on the access restriction dataexchange IP.

Configuring these properties and credential keys is only required when you use the Import from Environment option for the Data Set Operations.

Import From File

A POST request message on /dataexchange/import/datasetdefinition/{dataSetDefinitionCode} uploads the data set payload file that was downloaded from a source environment.

We recommend uploading files with sizes under 20 MB. Larger files impact the stability of the system.

{
  "file" : "datasetpayloadfile.zip",     (1)
  "exactVersionMatch" : true,            (2)
  "disableDeleteByOmission" : true       (3)
}

The request can hold the following multipart/form-data parameters (an illustrative request sketch follows this list):

1 file - the data set payload file that was downloaded from a source environment. This parameter is mandatory.
2 exactVersionMatch - set this form parameter to true if the import should be allowed only if the data set was built on the same exact version (that includes patch version) as the target system.
This parameter is optional.
3 disableDeleteByOmission - set this form parameter to true if the "Delete By Omission" functionality should be switched off.
This parameter is optional.
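
A minimal sketch of the multipart upload; the host, credentials, definition code, and file name are placeholder assumptions.

import requests

TARGET = "https://target-host:8080/api"  # placeholder target environment

with open("datasetpayloadfile.zip", "rb") as f:
    resp = requests.post(
        f"{TARGET}/dataexchange/import/datasetdefinition/CLAIMS_CONFIGURATION",
        files={"file": ("datasetpayloadfile.zip", f, "application/zip")},
        # Optional form parameters, sent as multipart/form-data fields.
        data={"exactVersionMatch": "true", "disableDeleteByOmission": "true"},
        auth=("a_username", "a_secret_password"),
    )
resp.raise_for_status()              # expect HTTP 201
print(resp.headers.get("Location"))  # data set process URI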

Response Message

The application responds with an HTTP 201 containing the URI location of the Data Set Process that was started, similar to the export process described above.

If the system property "ohi.datasetoperations.notification.endpoint.import" specifies a URI, the system sends a notification to that URI after processing completes. It uses a PUT operation for that purpose, with the following payload:

<importResponse>
  <resultMessages
    result
  >
    <resultMessage
      code
      text
    />
  </resultMessages>
</importResponse>

The following error messages can be returned in the web service response. These are messages that prevent the import from happening.

Table 3. Returned Messages

Code              Severity   Message
GEN-MIGR-007      Fatal      Import file must have a .zip extension
GEN-MIGR-009      Fatal      It is not possible to start a new import while another build or import is in progress
OHI-IP-DATA-004   Fatal      Unknown data set definition code {code}

The response file contains the messages that relate to the imported content.

Import From Environment

A POST request message on /dataexchange/import/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode} imports the data file from the source environment. It starts the retrieval of the data set from the given environment and then starts the import of the data on the target environment. Refer to Check on Available Datasets for details. An illustrative request sketch follows the payload callouts below.

{
  "sourceEnvironment" : "ABC",     (1)
  "exactVersionMatch" : true,      (2)
  "disableDeleteByOmission" : true (3)
}
1 The sourceEnvironment attribute is required and should contain the value set for the system property ohi.environment.identifier in the source environment.
2 Set the attribute exactVersionMatch to true if the import should be allowed only if the data set was built on the same exact version (that includes patch version) as the target system. This is optional.
3 Set the attribute disableDeleteByOmission to true if the Delete By Omission functionality should be switched off. This is optional.
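
A minimal sketch of this request; the host, credentials, codes, and the source environment identifier are placeholder assumptions.

import requests

TARGET = "https://target-host:8080/api"  # placeholder target environment

resp = requests.post(
    f"{TARGET}/dataexchange/import/datasetdefinition/CLAIMS_CONFIGURATION/dataset/SampleTest",
    json={
        "sourceEnvironment": "ABC",  # ohi.environment.identifier of the source
        "exactVersionMatch": True,
        "disableDeleteByOmission": True,
    },
    auth=("a_username", "a_secret_password"),
)
resp.raise_for_status()              # expect HTTP 201
print(resp.headers.get("Location"))  # data set process URI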

Similar to the Import From File described above, an HTTP 201 is returned, and a notification is sent out once the process is done.

Authorization

The dataexchange API requires a grant for access restriction dataexchange IP.

Stop or Start Task Processing Dequeue

It is recommended to stop background processing before importing configuration items that may affect the outcome of the processing.

See Control Task Processing to learn how to check the current process status and how to stop and start the processing queue.

In the Oracle Health Insurance applications that use activity processing for background processing, make sure no activities are running that may be impacted by the import.