Data Set Operations Integration Point

This integration point allows the user to generate the payload for a particular data set and/or start the import of a data set. The purpose of this integration point is to allow the exchange of data sets between application environments without having to access the application screens.

This section refers to the environment that loads the data set as the 'target' environment. The environment on which the data set is generated is referred to as the 'source' environment.

In OHI Claims Adjudication and Pricing, this integration point supports the handling of data sets that belong to the definitions CLAIMS_CONFIGURATION, PROVIDERS, PROCEDURES and DIAGNOSES.

In OHI Enterprise Policy Administration, this integration point supports the handling of data sets that belong to the definitions POLICIES_CONFIGURATION and PROVIDERS.

In Oracle Insurance Gateway, this integration point supports the handling of data sets that belong to the definition GATEWAY_CONFIGURATION.

The following operations are supported:

  • Source environment:

    • Create and Build payload

    • Build payload

    • Download payload

  • Target environment:

    • Stop/start task processing dequeue

    • Import from file

    • Import from environment

Source Environment Operations

Create and Build Payload

This integration point supports the creation and build of a full configuration migration data set through a POST to the URI

 <context-root>/dataexchange/createandexport/datasetdefinition/{dataSetDefinitionCode}/dataset

with the following payload:

{
  "code":"migrationsetcode",
  "description":"description of the migration set code"
  "exactVersionMatch": true,
  "disableDeleteByOmission": true
}

A migration set is created for the latest outbound data set definition with the provided code and description.
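
As an illustration, the request could be submitted as follows. This is a minimal sketch in Python using the requests library; the host name, credentials, and set code are assumptions, not part of the integration point:

import requests

# Assumed values; replace with your environment's context root and credentials.
BASE = "https://source.example.com/api"
AUTH = ("user", "password")

payload = {
    "code": "MIGSET01",  # assumed migration set code
    "description": "Full configuration migration set",
    "exactVersionMatch": True,
    "disableDeleteByOmission": True,
}

# Create and build a full migration set for the CLAIMS_CONFIGURATION definition.
response = requests.post(
    f"{BASE}/dataexchange/createandexport/datasetdefinition/CLAIMS_CONFIGURATION/dataset",
    json=payload,
    auth=AUTH,
)
response.raise_for_status()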

The following table specifies the data set definitions supported by this integration point:

Application           Data Set Definition Code
Authorization         AUTHORIZATIONS_CONFIGURATION
                      DIAGNOSES
                      PROCEDURES
                      PROVIDERS
Claims                CLAIMS_CONFIGURATION
                      DIAGNOSES
                      PROCEDURES
                      PROVIDERS
Policy                POLICIES_CONFIGURATION
                      PROVIDERS
Integration Gateway   GATEWAY_CONFIGURATION

For the specified definitions, a migration set for full migration is created and built to produce an outbound file for full migration. If a migration set with the specified set code already exists, the system returns a 'set already exists' error.

The full migration set includes all the top level items, along with the reference data. Refer to the chapter 'Create a Migration Set' in the Configuration Migration guide of the applications for details on top level and reference data items.

The request starts a background process that gathers all items, creates a set, and then converts it into a payload structure. The build process is similar to that of the Build Payload operation; refer to that operation for details on the response. The payload is exported on the source environment. It can be imported into a target environment by using one of the import requests, described separately in this guide. Note that the Build Payload request (described below) can be used to rebuild the payload file if needed, by providing the set code created by this process.

This operation is protected by the access restriction 'datasetoperation createandexport API'.

The following error messages can be returned in the response:

Code          Sev    Message
GEN-MIGR-008  Fatal  It is not possible to start a new build while another build or import is in progress
GEN-MIGR-010  Fatal  It is not possible to start a build with empty data set

Build Payload

This request is used to construct the data set’s payload. The request starts a background process that gathers all items of the data set and converts them into a payload structure. The payload is exported and stored on the source environment itself. It can be imported into a target environment by using one of the import requests, described separately below.

POST /dataexchange/export/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode}

{
  "inclusionDate" : "2000-12-31",
  "exactVersionMatch": true,
  "disableDeleteByOmission": true
}

The inclusion date is optional. The application responds with an HTTP 201, containing the URI location of the Data Set Process that was started, i.e. "/generic/datasetprocesses/{id}". The client can use that location to track the export’s progress. The attributes endDateTime and result in the response message are empty while the process is still running. Once the process completes, GET "/generic/datasetprocesses/{id}" fills in the endDateTime and the result: 'S' for success and 'F' for failure. In addition, a link to the interface messages related to the data set build process is returned; any technical errors in the build process can be discovered by following that link.
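
For example, a client could start the build and poll the Data Set Process until it completes. The sketch below assumes the process URI is returned in the Location header; host, credentials, and set code are again placeholders:

import time
import requests

BASE = "https://source.example.com/api"  # assumed context root
AUTH = ("user", "password")              # assumed credentials

resp = requests.post(
    f"{BASE}/dataexchange/export/datasetdefinition/CLAIMS_CONFIGURATION/dataset/MIGSET01",
    json={"exactVersionMatch": True, "disableDeleteByOmission": True},
    auth=AUTH,
)
assert resp.status_code == 201
process_uri = resp.headers["Location"]  # ".../generic/datasetprocesses/{id}"

# Poll until endDateTime is filled in; result is 'S' (success) or 'F' (failure).
while True:
    process = requests.get(process_uri, auth=AUTH).json()
    if process.get("endDateTime"):
        break
    time.sleep(10)
print("Build result:", process["result"])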

In addition, once the process completes, a notification message is also sent out. The URI for that message can be configured using the system property "ohi.datasetoperations.notification.endpoint.export". It will use a PUT operation for that purpose, with the following payload structure:

<buildDataSetResponse>
  <resultMessages
    result
  >
    <resultMessage
      code
      text
    />
  </resultMessages>
</buildDataSetResponse>
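
As an illustrative instance (not a captured message), a failed build could produce a notification along these lines, reusing a message code from the table below:

<buildDataSetResponse>
  <resultMessages result="F">
    <resultMessage code="GEN-MIGR-010"
                   text="It is not possible to start a build with empty data set"/>
  </resultMessages>
</buildDataSetResponse>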

The following error messages can be returned in the web service response:

Code             Sev    Message
OHI-IP-DATA-001  Fatal  Unknown combination of data set code {code} and data set definition code {code}
GEN-MIGR-008     Fatal  It is not possible to start a new build while another build or import is in progress
GEN-MIGR-010     Fatal  It is not possible to start a build with empty data set

Download Payload

To retrieve the data set’s payload, the following request can be used. The data set has to be built first; if that has not been done, the data set does not have a payload and HTTP 204 (No Content) is returned.

GET /dataexchange/export/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode}

Depending on the data set definition, the response body format can be either XML or ZIP.
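
A minimal download sketch, assuming the same placeholder host and credentials and a definition that produces a ZIP payload:

import requests

BASE = "https://source.example.com/api"  # assumed context root
AUTH = ("user", "password")              # assumed credentials

resp = requests.get(
    f"{BASE}/dataexchange/export/datasetdefinition/CLAIMS_CONFIGURATION/dataset/MIGSET01",
    auth=AUTH,
    stream=True,
)
if resp.status_code == 204:
    print("No payload yet: build the data set first.")
else:
    # Assumed ZIP payload; some definitions return XML instead.
    with open("MIGSET01.zip", "wb") as f:
        for chunk in resp.iter_content(chunk_size=8192):
            f.write(chunk)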

Target Environment Operations

Stop/start task processing dequeue

It is recommended to stop background processing before importing configuration items that may affect the outcome of the processing. The following resource allows inquiry about the task processing status:

GET /taskprocessing

The response includes the following structure:

{
  "status" : "started" | "stopped",
  "links" : [
    {
       "rel" : "self",
       "href" : ".../taskprocessing"
    },
    {
       "rel" : "taskprocessing:control",
       "href" : ".../taskprocessing?action=start" | ".../taskprocessing?action=stop",
       "httpMethod" : "POST"
    }
  ]
}

In OHI applications that use task processing, the link(s) can be used to control the task dequeue process. In OHI Claims, stopping the dequeue process stops claims from entering the claims flow; claims that are already in the flow will continue to be processed. Starting the dequeue process tells the application to accept new claims for processing.
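
For example, a client could stop dequeuing before an import by following the control link from the structure above (placeholder host and credentials):

import requests

BASE = "https://target.example.com/api"  # assumed context root
AUTH = ("user", "password")              # assumed credentials

status = requests.get(f"{BASE}/taskprocessing", auth=AUTH).json()
if status["status"] == "started":
    # Per the structure above, the control link carries action=stop
    # while processing is started.
    control = next(link for link in status["links"]
                   if link["rel"] == "taskprocessing:control")
    requests.post(control["href"], auth=AUTH)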

In OHI applications that use activity processing for background processing, make sure no activities are running that may be impacted by the import.

Import From File

This request can be used to upload the data set payload file that was downloaded from a source environment.

POST /dataexchange/import/datasetdefinition/{dataSetDefinitionCode}

The request shall contain the following multipart/form-data parameters:

  • file - the data set payload file that was downloaded from a source environment. This is required. Make sure the form data parameter is called "file".

  • exactVersionMatch - set this form parameter to true if the import should be allowed only if the data set was built on the exact same version (including the patch version) as the target system. This is optional.

  • disableDeleteByOmission - set this form parameter to true if the "Delete By Omission" functionality should be switched off. This is optional.

The application responds with an HTTP 201 containing the URI location of the Data Set Process that was started, similar to the export process described above.
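
A sketch of the upload as multipart/form-data, with assumed host, credentials, and file name:

import requests

BASE = "https://target.example.com/api"  # assumed context root
AUTH = ("user", "password")              # assumed credentials

with open("MIGSET01.zip", "rb") as f:
    resp = requests.post(
        f"{BASE}/dataexchange/import/datasetdefinition/CLAIMS_CONFIGURATION",
        files={"file": f},  # required part; must be named "file"
        data={
            "exactVersionMatch": "true",        # optional
            "disableDeleteByOmission": "true",  # optional
        },
        auth=AUTH,
    )
assert resp.status_code == 201
process_uri = resp.headers["Location"]  # poll as with the export process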

If a URI has been configured with the system property "ohi.datasetoperations.notification.endpoint.import", a notification will be sent to that URI after processing completes. It will use a PUT operation for that purpose, with the following payload:

<importResponse>
  <resultMessages
    result
  >
    <resultMessage
      code
      text
    />
  </resultMessages>
</importResponse>

The following error messages can be returned in the web service response. These are messages that prevent the import from happening.

Code             Sev    Message
GEN-MIGR-007     Fatal  Import file must have a .zip extension
GEN-MIGR-009     Fatal  It is not possible to start a new import while another build or import is in progress
OHI-IP-DATA-004  Fatal  Unknown data set definition code {code}

The response file contains the messages that relate to the imported content.

Import From Environment

Instead of importing from a file, the import can also be performed from a source environment. This starts the retrieval of the data set from the given environment and then imports the data on the target environment.

POST /dataexchange/import/datasetdefinition/{dataSetDefinitionCode}/dataset/{dataSetCode}

{
  "sourceEnvironment" : "ABC",
  "exactVersionMatch" : true,
  "disableDeleteByOmission" : true
}

  • The "sourceEnvironment" attribute is required and should contain the SID of the source environment’s database.

  • Set the attribute "exactVersionMatch" to true if the import should be allowed only if the data set was built on the exact same version (including the patch version) as the target system. This is optional.

  • Set the attribute "disableDeleteByOmission" to true if the "Delete By Omission" functionality should be switched off. This is optional.

Similar to the Import From File described above, an HTTP 201 is returned, and a notification is sent out once the process is done.
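
To round off, a sketch of the environment-to-environment import; the host, credentials, set code, and SID are placeholders:

import requests

BASE = "https://target.example.com/api"  # assumed context root
AUTH = ("user", "password")              # assumed credentials

resp = requests.post(
    f"{BASE}/dataexchange/import/datasetdefinition/CLAIMS_CONFIGURATION/dataset/MIGSET01",
    json={
        "sourceEnvironment": "ABC",  # SID of the source environment's database
        "exactVersionMatch": True,
        "disableDeleteByOmission": True,
    },
    auth=AUTH,
)
assert resp.status_code == 201  # the returned Data Set Process can be polled as before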