6 Using Automation

This chapter describes the Oracle Communications Order and Service Management (OSM) automation framework, which enables you to configure and automatically run automated tasks and notifications.

About Automations and the Automation Framework

The OSM automation framework provides the primary interface for outbound and inbound operations that interact with external systems for automated order fulfillment. The automation framework also provides internal data processing for automated tasks within a process workflow. You can create notifications for individual tasks or at the order level that trigger automations. See OSM Concepts for information about automated tasks and notifications.

To run automated tasks, task-level notifications, or order-level notifications, you write automation plug-ins. The automation framework runs instances of automation plug-ins within the context of these tasks and notifications, and that context defines what order data is available to the automation.

An automation plug-in can be a:

  • Custom automation plug-in, which is an automation plug-in that you write, consisting of custom business logic in the form of Java code.

  • Predefined automation plug-in, which is an automation plug-in that is provided with the OSM installation that you can augment with your business logic requirements.

OSM provides the following predefined automation plug-ins:

  • XSLT Plug-in. A plug-in that uses XSLT to generate outbound messages and process inbound messages.

  • XQuery Plug-in. A plug-in that uses XQuery to generate outbound messages and process inbound messages.

  • JDBC Plug-in. A plug-in that uses JDBC to retrieve or update data in the database.

  • Email Plug-in. A plug-in, available for notifications, that sends email messages to external systems.

The automation framework simplifies the process of sending messages to external systems. The automation framework does the following:

  • Uses the JMS communication protocol.

  • Establishes and maintains the various JMS connections.

  • Constructs the JMS messages, setting the required message properties.

  • Correlates requests from OSM with responses from external systems.

  • Guarantees delivery of the message and handles any errors or exceptions, retrying the message until it is delivered.

  • Handles poison messages, for example, messages that cannot be delivered for some reason.

When OSM sends a message to an external system using an automation plug-in, the following processing flow generally occurs:

  1. OSM runs an automated task instance that triggers an automation called a sender plug-in.

  2. The automation framework adds properties to the outbound message to correlate external system responses to requests. For example, for a predefined XQuery or XSLT sender plug-in:

    1. The sender plug-in sets a property on the outbound JMS message as the correlation property.

    2. The automation framework saves the message properties set for each message with the event information.

    3. The automation framework sets the replyTo JMS property on the JMS request based on properties configured for the sender plug-in.

  3. The automation framework sends the JMS message, based on properties configured for the sender plug-in, to the JMS queue and destination type that the external system must subscribe to in order to consume the message.

    Note:

    Custom automations are not restricted to JMS but can use any communication protocol, such as HTTP or FTP. See "About Custom Automation Plug-ins" for more information.

When OSM receives a message in response to the request, the following process flow generally occurs:

  1. After processing the request, the external system copies the properties from the incoming request to the outgoing response.

  2. The external system sends the response message to the reply-to queue based on the replyTo JMS property in the request.

  3. The automation framework routes the response from the queue to the plug-in. The plug-in that receives the response is called an automator.

  4. The automation framework uses the message properties of the response, plus the correlation information, to reload a context for the response message, which in this scenario is the task that sent the original request.

  5. The automator performs business logic, such as updating order data and completing the task.

You can create custom or predefined automation plug-ins in Oracle Communications Design Studio. See the Design Studio Help for more information.

Figure 6-1 shows the flow of an automated task and a notification that call their corresponding automation plug-ins. Design Studio provides the ability to map a specific automated task (Task A) to a specific automation plug-in (Automation Plug-in A), or a specific automated notification (Notification B) to a specific automation plug-in (Automation Plug-in B). This is called automation mapping. The mappings are saved to a cartridge, which is then deployed to the OSM server. OSM processes the automated tasks, which trigger the mapped automation plug-ins when specific events occur. See "About Creating Automations in Design Studio" and "About Internal and External Events that Trigger Automations" for more information.

About Sender and Automator Automation Types

When you create an automation plug-in for a task, task notification, or order notification in Design Studio, you use the Add Automation dialog box to create the plug-in for the task or notification, give it a name, and select the Automation Type (for example, one of the predefined automations or a custom automation). There are two basic types of automation plug-ins: Sender and Automator. Use the Automator type if you want the plug-in to receive data and perform work on the data. Use the Sender type if you want the plug-in to receive data, perform work, and then send the data to external systems.

About Automations in the Order and Task Contexts

You can configure automations in various contexts, such as automated tasks, notifications configured for automated tasks, notifications configured for manual tasks, notifications configured in process flows, and notifications configured at the order level. The data available to these automations depends on the context in which the automation is triggered. The two main contexts, from which all the other contexts derive, are the order context and the task context.

The data available to an automation plug-in in the task context is restricted to the data defined in the automated task's Task Data tab. The data available to an automation plug-in in the order context is restricted to the data defined in the order specification, Permission tab, Query Task subtab. This subtab links to a manual task, designated as a query task, that defines the data available to order-level notifications but is not part of any process flow.

When you create custom automations, you can access these contexts from the OSM Java API com.mslv.automation.oms.AutomationContext class, which is the parent class of OrderContext, which is in turn the parent of TaskContext. These are either parent or sibling classes of all the other contexts. You never need to import AutomationContext because it is inherited by all the other contexts. You can also declare these contexts in predefined automation plug-ins.

Each context class provides methods (or inherits them from parent classes) that you can use in automation plug-ins to perform various functions (a minimal example follows this list), such as:

  • Updating order data

  • Transitioning the task to a new state

  • Suspending the task

  • Completing the task

  • Getting order task data for use in business logic

  • Transitioning the order into a failed state
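
For example, a minimal custom automator might check the context type and then complete the task, as in the following sketch. The class name and the success status value are illustrative assumptions, package names follow those described in this chapter, and error handling is reduced to comments.

import com.mslv.automation.oms.AutomationContext;
import com.mslv.automation.oms.TaskContext;
import com.mslv.automation.plugin.AbstractAutomator;

public class CompleteTaskAutomator extends AbstractAutomator {

    public void run(String inputXML, AutomationContext automationContext) {
        try {
            if (automationContext instanceof TaskContext) {
                TaskContext taskContext = (TaskContext) automationContext;
                // Complete the task with an assumed task status named "success".
                taskContext.completeTaskOnExit("success");
            } else {
                // Log an error: this plug-in expects to run in the task context.
            }
        } catch (Exception e) {
            // Handle or log processing errors as required by your solution.
        }
    }
}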

Figure 6-2 shows the class hierarchy stemming from the AutomationContext.

Figure 6-2 Context Object Class Hierarchy


Some of the methods that the task context inherits from the order context behave differently when run from the task context. For example, the update order method run from the task context can generate historical and contemporary order perspectives that can be used in order amendment analysis, while the update order method run from the order context does not. See "About Compensation for Automations" for more information.

Table 6-1 shows the Design Studio entity where you can configure automations, the types of events that trigger the automations, and the context that gets passed into the plug-in.

Table 6-1 Context Objects Passed To Plug-in

Automation Plug-in Trigger | Design Studio Definition Location | OSM Event | OSM Event Type | Context Object Passed To Plug-in
--- | --- | --- | --- | ---
Automated task | Task editor, Automation tab | Task state transitions to Received | Task Event | TaskContext
Order milestone-based event notification | Order editor, Events tab | Order reaches specified milestone | Order Notification Event | OrderNotificationContext
Task state-based event notification | Task editor, Events tab | Task reaches specified state | Task Notification Event | TaskNotificationContext
Task state-based event notification | Process editor, Events tab on Properties view of a task in the process | Task reaches specified state, then data condition specified by rule evaluates to true | Task Notification Event | TaskNotificationContext
Task status-based event notification | Process editor, Events tab on Properties view of a status in the process | Task reaches specified status, then data condition specified by rule evaluates to true | Task Notification Event | TaskNotificationContext
Order data changed event notification | Order editor, Notifications tab | Specified order data changes | Order Notification Event | OrderDataChangeNotificationContext
Order jeopardy notification | Order Jeopardy editor | The timer conditions for the jeopardy have been reached | System Notification Event | OrderJeopardyNotificationContext
Order jeopardy notification | Order editor, Jeopardy tab | At polling, data condition defined by rule evaluates to true | System Notification Event | OrderNotificationContext
Task jeopardy notification | Task editor, Jeopardy tab | At polling, data condition defined by rule evaluates to true | System Notification Event | TaskNotificationContext if the task-level jeopardy condition Multiple events per Task instance is set; otherwise OrderNotificationContext

All context objects are located in the SDK/automation/automationdeploy_bin/automation_plugins.jar file and are defined in the same package: com.mslv.automation.oms.

About Internal and External Events that Trigger Automations

You must also define where you expect the sender or automator plug-in to receive its data by setting the plug-in Event Type, which specifies whether the plug-in instance receives data events internally from OSM or from external systems. The choices are as follows:

  • Internal Event Receiver (default choice): Internal receiver indicates that the source of the event for the plug-in is internal to OSM. OSM makes order data available to this type of plug-in in its respective context (see "About Automations in the Order and Task Contexts" for more information). For internal event receivers, the following happens:

    • An event occurs within OSM.

    • OSM creates a message and sends it to the oms_events message queue that the OSM installer creates during the installation process. OSM maps order priority to the JMS priority to prioritize internal events.

    • The automation framework subscribes to the internal message queue as part of the OSM installation.

    • The message is picked up by the automation framework and processed.

  • External Event Receiver: The data made available to the automation plug-in comes from a message sent from an external system. For external event receivers, the following happens:

    • An event occurs within an external system, for example, a message sent by an OSM automation plug-in arrives at the external system.

    • The external system creates a response message and sends it to an external message queue. You must explicitly create the external message queues or you can use the oms_events queue that OSM uses for internal message processing.

    • The automation framework subscribes to the external message queue through the information you define on the External Event Receiver tab of the automation definition.

    • The message is picked up by the automation framework and processed.

Automated notifications are always defined as internal event receivers because, as the name implies, notifications are used to notify OSM users or other areas of the OSM system of some event occurring within OSM. That is why notifications do not receive messages from external systems; the information with which to notify always originates within OSM.

The new plug-in appears in the Automation list. After you add a plug-in to your automated task, you define the plug-in properties. See the Design Studio Help for further information.

About Accessing the XML API in Automations

You can use the XML API from within automations. To access the XML API from within a custom automation plug-in, the API user must belong to a WebLogic group that provides the privilege to access the API; for custom automation plug-ins, that group is OSM_automation.

See the Design Studio Help for further information regarding the Run As field, which defines the user of the automation.

About Queues, Correlation, and Property Selectors

Automator or sender plug-ins that are external event receivers (that is, plug-ins that process responses from external systems) listen for responses (JMS messages) from external systems on an external message queue (JMS queue). These are responses to previously sent messages that are correlated back to a task based on a correlation ID. In some cases you must specify filter criteria, defined in Design Studio as a message property selector, which OSM uses to filter messages on the JMS queue. A task receives only those messages from the queue that match the message property selector. If a message is selected, message correlation occurs as normal and the automated task receives the message. The external system must echo back the filter criteria information by extracting it from the request and reinserting it into its response.

Note:

For JMS messages, Oracle recommends that you do not use the JMS prefix for custom headers. Reserve the JMS prefix for predefined JMS headers, for example, JMSCorrelationID, JMSMessageID, JMSPriority, and so on. Using the JMS prefix in custom headers can cause problems.

OSM Request and Response Message Queues

When configuring OSM automation plug-in requests, you must create request queues from which external systems consume OSM request messages. You can configure the JMS settings for these queues based on the order processing requirements of the solution. For example, request queues often require different retry, pause, and resume settings when external systems are down. As such, it is important to configure specific request queues to support the various JMS message consumption scenarios for each external system.

For returning response messages, you can create response queues. The benefit of creating new response queues is that you can configure the JMS settings as the solution requires. Optionally, you can also use the predefined oms_events queue (using the mslv/oms/server_name/internal/jms/events JNDI name) that OSM uses for internal message processing. The benefits of using oms_events exclusively for all response messages include:

  • Design Studio requires less time to build cartridges because the oms_events queue is internal to the oms.ear file. Design Studio does not need to generate a message-driven bean and external automation ear file to listen on the external queue.

  • You can more efficiently deploy and undeploy cartridges where there is no external automation ear file for the response queue.

  • The OSM server consumes less memory when there is no external automation ear file for the response queue.

  • OSM is better able to prioritize messages from different systems when there is only one queue. OSM can observe message priority uniformly across all messages within the queue.

However, if you use oms_events, you cannot configure the JMS settings, such as the pause, retry, or resume settings because these settings are already optimized to process internal OSM messages. Being able to configure these JMS settings can be important in production systems when configuring error queues or when stopping the JMS message flow to certain queues during upgrade or maintenance windows. You must weigh the advantages and disadvantages of using oms_events.

Correlating Requests from OSM to Responses from External Systems

Correlation is a property that associates an incoming external system message with an outbound OSM message previously sent to initiate communication with the external system. In some situations you may need additional message filtering using message property selectors.

You can set the JMS correlation ID in messages sent from OSM to external systems to correlate response messages from the external system with the original request. If you expect the correlated response to return to the task that originally sent the message, then you do not need to programmatically set the correlation ID for the task because this is done for the task when the original sender sent the message. If you expect the correlated response to return to a different task (a receiver task) than the one that sent the message, then you must programmatically set the correlation ID for the outgoing JMS message in the sending task, and configure the receiver task to use the matching correlation ID. For more information about this second scenario, see "Asynchronous Communication: Single or Multiple Requests and Responses." In both scenarios, OSM compares the JMSCorrelationID with the correlation ID set for the task and associates the two messages if the respective values match.

Note:

No correlation configuration is required at the external system that sends the response message.

Correlation is of two types: Message Property and XML Body correlation.

In Message Property correlation, you specify a message header as the correlation ID in the outbound OSM message. For example:

outboundMessage:setJMSCorrelationID($outboundMessage, $corrID)

You can also specify additional message header properties in the outbound message. For example:

outboundMessage:setStringProperty($outboundMessage, $HEADER1, $corrID)

By default, Message Property correlation uses JMSCorrelationID as the correlation ID. The XML Body correlation uses an XPath expression to retrieve the correlation ID from the body of the XML message.

See "Internal XQuery Sender" and "Internal XSLT Sender" for examples of predefined XQuery and XSLT sender that set correlation ID for the outgoing messages. See "Internal Custom Java Sender" for an example of a custom Java sender that sets the correlation ID for the outgoing message.

Intercommunication Between Orders in the Same Domain

There is a special consideration when managing intercommunication between orders, and by extension cartridges that are deployed in the same domain. This situation can occur whenever there are two or more cartridges deployed in the same OSM server that need to communicate with each other.

The automation sender in the child cartridge needs to use the correlation ID specified by the parent order's task. By default, OSM uses the JMSCorrelationID property in the message header as the correlation ID. However, if both parent and child task senders use the same JMSCorrelationID property as the correlation ID, there is a potential situation where duplicate entries will exist in the OSM database with the same correlation ID, resulting in an error when the parent receiver tries to look up an automation context.

The design guideline to handle this is as follows:

  • For the parent automation sender, set the JMSCorrelationID header either programmatically, or allow the system to auto-generate this value.

  • For the child automation sender, set the JMSCorrelationID header to a different correlation ID than what the parent task sent, for example by using a different algorithm than the one used in the automator for the parent, or allowing the system to auto-generate a value. Define a separate custom field in the JMS header to contain the correlation ID expected by the parent task, as shown in the sketch after this list.

  • For the parent automation receiver, use the message property correlation configuration to retrieve the correlation ID from the custom defined JMS header field. This will prevent multiple entries with the same correlation ID in the database and will allow the parent task to correlate the automation context properly.
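
For illustration, the following sketch shows how a child cartridge's automation sender might follow these guidelines in its makeRequest method. The class name, the custom header name ParentCorrelationID, and the helper that obtains the parent's expected correlation ID are assumptions for this example; use the names and data that your solution defines.

import javax.jms.TextMessage;
import com.mslv.automation.oms.AutomationContext;
import com.mslv.automation.plugin.AbstractSendAutomator;

public class ChildCartridgeSender extends AbstractSendAutomator {

    public void makeRequest(String inputXML, AutomationContext automationContext,
            TextMessage outboundMessage) {
        try {
            // Leave JMSCorrelationID to be auto-generated, or set it to a value
            // that differs from the correlation ID the parent task expects.

            // Carry the parent's expected correlation ID in a custom header that
            // the parent receiver's message property correlation reads.
            String parentCorrelationId = getParentCorrelationId(inputXML);
            outboundMessage.setStringProperty("ParentCorrelationID", parentCorrelationId);

            // Populate the message body; a real plug-in would transform the
            // order data in inputXML into the format the parent order expects.
            outboundMessage.setText(inputXML);
        } catch (Exception e) {
            // Handle or log JMS and processing errors as required by your solution.
        }
    }

    private String getParentCorrelationId(String inputXML) {
        // Hypothetical helper: parse the order data in inputXML to find the
        // correlation ID that the parent order's task expects.
        return "parent-correlation-id-placeholder";
    }
}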

About Message Property Selectors

An automation task may have one or more external event receivers listening on the JMS queue.

If the automation task has only one external event receiver, you do not need to specify a message property selector. The automation task can use the JMS queue without the need for filter criteria.

You must specify a unique message property selector for the event receiver if any of the following situations apply:

  • If the automation task has more than one external event receiver listening on the same JMS queue. For example, if you defined multiple automation plug-in external event receivers for the same automation task.

  • If applications other than OSM share the same queue that an external event receiver is listening on.

  • If you use the Legacy build-and-deploy mode to build and deploy cartridges.

  • If you use the Both (Allow server preference to decide) build-and-deploy mode to build and deploy cartridges and configure the Internal dispatch mode for the OSM server.

Note:

Internally, the activation task uses the OSS_API_CLIENT_ID property in the message property selector when listening for response messages from Oracle Communications ASAP. Do not use this property in a non-activation task external automator (even if the activation task is not used in the solution) because this causes OSM to route the response message incorrectly.

For information on how OSM processes plug-ins according to the build-and-deploy mode you set, see "About Building and Deploying Automation Plug-ins." For information on message property selector filter criteria, see the Design Studio Help.

About Automation Plug-in Communication Options

Automated tasks and the automation plug-ins they trigger can handle asynchronous or synchronous communication. Automated notifications and the automation plug-ins they trigger can handle asynchronous communication only, because an automated notification cannot be defined as an external event receiver and therefore cannot process a response.

No External Communication: Data Processing Only

You can define an automation as an internal event receiver that extends AbstractAutomator. In this scenario, the input data is coming from OSM and not being sent anywhere, so there is no communication with an external system. The automation plug-in may perform some internal calculation, or just complete the task. Use this scenario for order-level or task-level notifications because notifications do not require responses. You can also use this scenario with automated task plug-ins.

Figure 6-3 illustrates this scenario. In the figure, Automation Plug-ins A and B are internal event receivers/automators.

Fire-and-Forget Communication: Message Sent to External Systems

You can define an automation as an internal event receiver that extends AbstractSendAutomator. In this scenario, the input data is coming from OSM and being sent to an external system. The automation plug-in sends an asynchronous "fire-and-forget" message. That is, it completes the task and sends a message to an external system, but does not expect a response back from the external system.

Figure 6-4 illustrates this scenario, which builds on Figure 6-1. In the figure, Automation Plug-in A is an internal event receiver/sender.

Figure 6-4 Automation Flow: Fire-and-Forget

Synchronous Communication: Single Request and Response

You can define an automated task that defines two automation plug-ins:

  • You can define the first automation as an internal event receiver that extends AbstractSendAutomator. In this scenario, the input data is coming from OSM and being sent to an external system. The automation plug-in sends a synchronous message which expects a response back from the external system.

  • You can define the second automation as an external event receiver that extends AbstractAutomator. In this scenario, the input data is coming from the external system (it is the response from the message sent by the first automation) and not being sent anywhere. The automation plug-in processes the response and completes the task.

Figure 6-5 illustrates this scenario, which builds upon Figure 6-4. In the figure, Automation Plug-in A-1 is an internal event receiver/sender, and Automation Plug-in A-2 is an external event receiver/automator.

Figure 6-5 Automation Flow: Simple Synchronous

Synchronous Communication: Multiple Requests and Responses

You can define an automated task that defines multiple automation plug-ins:

  • You can define the first automation as an internal event receiver that extends AbstractSendAutomator. In this scenario, the input data is coming from OSM and being sent to an external system. The automation plug-in sends a synchronous message which expects a response back from the external system.

  • You can define the second automation as an external event receiver that extends AbstractSendAutomator. In this scenario, the input data is coming from the external system (it is the response from the message sent by the first automation) and being sent back to the external system. The automation plug-in processes the response and replies back by sending another message.

  • You can define the third automation as an external event receiver that extends AbstractAutomator. In this scenario, the input data is coming from the external system (it is the response from the second message sent by the second automation) and not being sent anywhere. The automation plug-in processes the response and completes the task.

Figure 6-6 illustrates this scenario, which builds upon Figure 6-5. In the figure, Automation Plug-in A-1 is an internal event receiver/sender, Automation Plug-in A-2 is an external event receiver/sender, and Automation Plug-in A-3 is an external event receiver/automator.

There can be multiple exchanges in such a scenario; this is just an example. After some number of messages back and forth, the final automation must be an external event receiver that extends AbstractAutomator, to complete the task. This example shows communication with two different external systems; however, steps 8-13 could continue communications with External System X, rather than with External System Y.

Figure 6-6 Automation Flow: Complex Synchronous

Asynchronous Communication: Single or Multiple Requests and Responses

In the synchronous communication scenario, one task sends a single message and expects a response in return (see "Synchronous Communication: Single Request and Response"). While the task is waiting for the response, the order data associated with that task is not available for amendment processing, effectively blocking any revision order changes or cancelation requests involving that task. This is normally not a problem when the response returns quickly. For more asynchronous communication, where the message can take a longer time to return, the scenario described in this section is more appropriate because it avoids unnecessarily long delays in order amendments or cancelation requests.

You can define an automated task that defines a single automation as an internal event receiver that extends AbstractSendAutomator. In this scenario, the input data is coming from OSM and being sent to an external system. The automation plug-in sets a correlation ID and sends a message. In this case, however, OSM expects a response back from the external system but to a different task.

In this scenario, you must programmatically set the correlation ID for the outgoing message in the sending task. You cannot use the OSM auto-generated correlation ID functionality. For more information, see "Correlating Requests from OSM to Responses from External Systems."

You can define the second automated task with two automation plug-ins:

  • The first plug-in is an internal event receiver that extends AbstractAutomator. In this scenario, the input data is coming from the previous task that sent the initial message and correlation ID to the external system. The automation plug-in configures the correlation ID to correspond to the correlation ID configured on the previous task so that the message is routed to the right location. In addition, this automator uses the TaskContext suspendTask method to transition the task to a new, customer-defined task state (for example, a state called waitingforresponse), which suspends the task. When a task is in the suspended state, it can be amended.

  • The second plug-in is an external event receiver that extends AbstractAutomator. In this scenario, the input data is coming from the response to the message sent by the previous task. When the response arrives, the event automatically transitions the task to a new state (for example, a state called waitForProvisioningCompleted) that moves the task out of the suspended state and completes that task.

Figure 6-7 illustrates this scenario, which is a variant of Figure 6-5. In the figure, Automation Plug-in A-1 is an internal event receiver/sender. Automation plug-in B-1 sets the correlation ID and suspends the task, and Automation Plug-in B-2 is an external event receiver/automator.

Figure 6-7 Automation Flow: Simple Asynchronous


You can also apply this asynchronous communication to the synchronous communication scenario where one task sends and receives multiple messages (see "Synchronous Communication: Multiple Requests and Responses"). In Figure 6-6, replace plug-in A-3 with a new task that includes two automation plug-ins that set the expected correlation ID, suspend the task so that the task data can be amended or canceled while it is waiting for the response, and then complete the task when the response returns.

Storing Response Message as XML Type Parameters

When you receive response messages from external fulfillment systems, you may want to store response message data on the OSM order. To do this, you can use a parameter that you designate as XML Type in the Design Studio Order editor Order Template tab.

However, you must strip the envelope, header, and body from the response message before storing data in this way. Having XML type data that includes the envelope, header, or body prevents OSM from sending any subsequent Web Service request messages because Web Service message envelopes, headers, and bodies cannot be nested.

For example, you could receive response data and assign it to a variable, such as $wsResponseDataXmlData. This variable contains the entire response including the Web Service envelope, header, and body. You could use the following code to strip the envelope, header, and body:

Example 6-1 Stripping the Envelope, Header, and Body

let $wsResponseContentXmlData := $wsResponseDataXmlData/env:Envelope/env:Body/*
 

The new $wsResponseContentXmlData variable now contains only the content of the body.

About Custom Automation Plug-ins

All custom automation plug-in Java source files must reside in the cartridgeName/src directory. You can create subdirectories within the src directory as needed. When you compile the source file, the resultant Java class file is placed in the cartridgeName/out directory. Any subdirectories you created within the src directory are reflected in the out directory.

All custom automation plug-ins must extend one of the following automation classes, located in the SDK/automation/automationdeploy_bin/automation_plugins.jar file:

  • AbstractAutomator

  • AbstractSendAutomator

The custom automation plug-in can directly or indirectly extend AbstractAutomator or AbstractSendAutomator: If needed, there can be one or more layers of inheritance between AbstractAutomator or AbstractSendAutomator, and the automation plug-in.

These classes are hierarchically related. AbstractAutomator is the parent of AbstractSendAutomator as shown in Figure 6-8. Both classes reside in the com.mslv.automation.plugin package.

The AbstractAutomator can receive information, either from OSM or from an external system. The AbstractSendAutomator inherits this ability, so it can also receive information from OSM or from an external system; however, it can also send information. If the purpose of the custom automation plug-in you are writing is to send a message, it should extend the AbstractSendAutomator class; otherwise, it should extend the AbstractAutomator class.

Defining the Custom Automation Plug-in

For every custom automation plug-in you write, you must define a corresponding Custom Automation Plug-in entity in Design Studio. The Custom Automation Plug-in editor associates a Java class representing the custom automation plug-in to the Custom Automation Plug-in Design Studio entity. For example, if you create MyCustomPlugin.java and compile it, the result is MyCustomPlugin.class. You then create a new Custom Automation Plug-in entity, and populate the fields defined on the editor.

There is a difference between the terms custom automation plug-in and Custom Automation Plug-in: The former is a custom Java class, the latter is a Design Studio entity.

About the XML Template

The Custom Automation Plug-in editor also defines the XML Template field.

You must provide XML that defines the implementation for your custom automation plug-in. This is done through the <implement> element, as shown in Example 6-2. The <implement> element is defined in the cartridgeName/customAutomation/automationMap.xsd file, which is available with the creation of an OSM cartridge. See OSM Modeling Guide for more information.

Example 6-2 XML Template

<implement xsi:type="hw:customImplementation"
    xmlns:hw="http://www.example.org/hello/world"
    xsi:schemaLocation="http://www.example.org/hello/world helloWorld.xsd">
  <hw:completionStatus>success</hw:completionStatus>
</implement>
 

You must also provide the corresponding schema file that defines the rules for the XML that you entered in the XML Template field. The schema file name in this example is helloWorld.xsd, shown on the third line of Example 6-2. The content of helloWorld.xsd is shown in Example 6-3.

Example 6-3 Schema for XML Template

<?xml version="1.0" encoding="UTF-8"?>
<schema xmlns="http://www.w3.org/2001/XMLSchema"
    targetNamespace="http://www.example.org/hello/world"
    xmlns:tns="http://www.example.org/hello/world"
    elementFormDefault="qualified"
    xmlns:Q1="http://www.oracle.com/OMS/AutomationMap/2001/11/23">
  <import schemaLocation="automationMap.xsd"
      namespace="http://www.oracle.com/OMS/AutomationMap/2001/11/23">
  </import>
  <complexType name="customImplementation">
    <complexContent>
      <extension base="Q1:Implementation">
        <sequence>
          <element name="completionStatus" type="string"></element>
        </sequence>
      </extension>
    </complexContent>
  </complexType>
</schema>
 

The schema files you create must reside in the cartridgeName/customAutomation directory and the cartridgeName/resources/automation directory.

Note:

The generated automationMap.xml file includes the <implement> element for predefined automation plug-ins, but not for custom automation plug-ins. For additional examples of the implement element, see "AutomationMap.xml File".

When looking at the examples, note that the sub-elements defined for the implement element differ for senders versus automators.

About Creating Custom Automation Plug-ins

AbstractAutomator and AbstractSendAutomator each define abstract methods that their child classes must implement. The custom automation plug-in must define a specific method, depending on which Java class the custom automation plug-in extends:

  • A custom automation plug-in that extends AbstractAutomator must define the method:

    public void run(String inputXML, AutomationContext automationContext)
     
  • A custom automation plug-in that extends AbstractSendAutomator must define the method:

    public void makeRequest(String inputXML, AutomationContext automationContext, TextMessage outboundMessage)
    

Because the custom automation plug-in defines one of these methods, when an automated task or automated notification is triggered, OSM can process the automation mapping and call the method, knowing that it is defined for the class name provided in the automation mapping.

The following sections describe the arguments used in the run and makeRequest methods. See "Custom Java Automation Plug-ins" for sample custom automation senders and receivers that illustrate how you can use these arguments.

inputXML Argument

The inputXML argument is a java.lang.String object. The custom automation plug-in does not need to include an import statement for this object because it is included in the hierarchy from which the custom automation is extending.

The inputXML argument is the input data in XML document form that can be parsed to access the individual pieces of data. If the automation is defined as an internal event receiver, the XML document defines OSM order data. If the automation is defined as an external event receiver, the XML document defines data from an external source. In either case, you need to know the expected XML definition in order to write the code to parse the data.

Data is not stored at the element for a given XML tag; it is stored at the element's child text node, so the approach for retrieving order data is not obvious. Code to retrieve order data looks like this:

Element clli_a = (Element) root.getElementsByTagName("clli_a").item(0);
String text = clli_a.getFirstChild().getNodeValue();
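
The root element in the snippet above would typically be the document element obtained by parsing inputXML. A minimal sketch of that parsing step, using the standard Java DOM API, might look like the following; the class name is illustrative and exception handling is left to the caller.

import java.io.StringReader;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.xml.sax.InputSource;

public class OrderDataParser {

    // Parse the inputXML argument into a DOM document and return its root element.
    public static Element parseRoot(String inputXML) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        DocumentBuilder builder = factory.newDocumentBuilder();
        Document document = builder.parse(new InputSource(new StringReader(inputXML)));
        return document.getDocumentElement();
    }
}
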
AutomationContext Argument and Casting the Context Argument

Within the custom plug-in, you must determine which context object to expect as an argument, and then cast the AutomationContext object to the appropriate child context object (for example, TaskContext or OrderNotificationContext).

For example, in the code below, the expected context object is TaskContext and automationContext is the name of the AutomationContext object argument.

if (automationContext instanceof TaskContext) {
    TaskContext taskContext = (TaskContext) automationContext;
    // Use taskContext here.
} else {
    // Log an error.
}

After the AutomationContext object is cast to the appropriate context object, all methods on the context object are available to the custom plug-in. See "About Automations in the Order and Task Contexts" for more information.

outboundMessage Argument

The outboundMessage argument is a javax.jms.TextMessage object. The custom automation plug-in does not need to include an import statement for this object because it is included in the hierarchy from which the custom automation is extending.

The outboundMessage argument is defined only for the makeRequest method; it is not defined for the run method. The makeRequest method is defined for classes that extend AbstractSendAutomator, which automatically send a message to an external system. You can write custom code that populates outboundMessage, which is sent to the external message queue defined by the automation definition. You do not have to write custom code to connect to the external system or send the message; OSM automation handles the connection and the message upon completion of the makeRequest method.
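
Putting these arguments together, a minimal custom sender might look like the following sketch. The class name, the way the message body is populated, and the decision to complete the task on exit are illustrative assumptions; package names follow those described in this chapter.

import javax.jms.TextMessage;
import com.mslv.automation.oms.AutomationContext;
import com.mslv.automation.oms.TaskContext;
import com.mslv.automation.plugin.AbstractSendAutomator;

public class OrderRequestSender extends AbstractSendAutomator {

    public void makeRequest(String inputXML, AutomationContext automationContext,
            TextMessage outboundMessage) {
        try {
            // Populate the outbound message; a real plug-in would transform the
            // order data in inputXML into the format the external system expects.
            outboundMessage.setText(inputXML);

            // For a fire-and-forget sender, complete the task on exit. Omit this
            // if a separate automator processes the response and completes the task.
            if (automationContext instanceof TaskContext) {
                ((TaskContext) automationContext).completeTaskOnExit("success");
            }
        } catch (Exception e) {
            // Handle or log JMS and processing errors as required by your solution.
        }
        // When makeRequest returns, the automation framework sends outboundMessage
        // to the queue defined in the automation's routing configuration.
    }
}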

Accessing JDBC from Within an Automation Plug-in

Because custom automation plug-ins run inside a J2EE container, JDBC services are readily available.

To use JDBC from a plug-in, you must create a data source through the WebLogic console. The data source contains all the connection information for your proprietary database, such as host names, user names, passwords, number of connections, and so on.

For information on setting up data sources in WebLogic, see the overview of WebLogic Server applications development in the Oracle WebLogic documentation.

The following code illustrates how to connect to a proprietary database from OSM and perform a "SELECT *".

javax.naming.InitialContext initialContext = new javax.naming.InitialContext();
javax.sql.DataSource datasource = (javax.sql.DataSource) initialContext.lookup("java:comp/env/jdbc/DataSource");

java.sql.Connection connection = datasource.getConnection();
java.sql.Statement statement = connection.createStatement();

java.sql.ResultSet resultSet = statement.executeQuery("SELECT * FROM my_custom_table");

Line two, the lookup, uses the JNDI name of the data source as a parameter.
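
In production plug-in code, you should also close the JDBC resources and handle exceptions. The following sketch, under the same assumed JNDI name and table as the example above and assuming a Java 7 or later compiler, uses try-with-resources to do so; the class and method names are illustrative.

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class CustomTableReader {

    public void readRows() throws Exception {
        InitialContext initialContext = new InitialContext();
        DataSource dataSource = (DataSource) initialContext.lookup("java:comp/env/jdbc/DataSource");

        // try-with-resources closes the connection, statement, and result set
        // even if an exception is thrown while reading.
        try (Connection connection = dataSource.getConnection();
             Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT * FROM my_custom_table")) {
            while (resultSet.next()) {
                // Process each row, for example by reading columns by name:
                // String value = resultSet.getString("column_name");
            }
        }
    }
}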

Compiling the Custom Automation Plug-in

You must include the following JAR files in your project library list for the custom automation plug-in to compile:

  • WLS_home/wlserver_10.3/server/lib/weblogic.jar

  • SDK/automation/automationdeploy_bin/automation_plugins.jar

Note:

The version of the automation_plugins.jar that you reference to compile the custom automation plug-in must be the same version that resides in the cartridge osmlib directory. To verify this, check the date and size of each file. If they are different, use the version that came with the OSM installation. To do so, copy the automation_plugins.jar file from the SDK/automation/automationdeploy_bin directory to the osmlib directory of your cartridge. After the file is copied to the cartridge, clean and rebuild the cartridge.

Depending on the content of the custom automation plug-in, you may also need to include additional JAR files.

To include a JAR file in the project library list:

  1. From the Design Studio menu bar, select Project, then select Properties.

    The Properties for CartridgeName window opens.

  2. In the left navigation pane, click Java Build Path.

  3. Click the Libraries tab.

  4. Click Add External JARs.

    The Jar Selection window opens.

  5. Navigate to the location of the JAR file and double-click the JAR file to add it to the library list.

About Predefined Automation Plug-ins

The OSM installation provides several predefined automation plug-ins, as described in the following sections. The sections are presented in the order that the predefined automation plug-ins display within Design Studio, on the Add Automation window Automation Type list field.

All of the predefined automation plug-ins are part of the automation class hierarchy; they extend, either directly or indirectly, the AbstractAutomator class that you use to create custom automations, as shown in Figure 6-9.

Figure 6-9 Predefined Automation Plug-in Class Hierarchy


Note:

The Java classes for the XSLT Automator and XQuery Automator predefined automation plug-ins are XSLTReceiver and XQueryReceiver. The presentation in Design Studio was changed to remove confusion: the names receiver and sender imply that one receives and one sends, which is not true; both receive. The sender just has the added ability to send a message.

XSLT Sender

The XSLT Sender predefined automation plug-in provides a way to transform data and send it to an external system using JMS, with you supplying the extensible stylesheet language transformation (XSLT).

Defining the Automation

When defining the automation on the Add Automation window, select XSLT Sender from the Automation Type list field.

For an automation defined as an internal event receiver, the XSLT must transform the OSM input data to SystemY data, where SystemY is the external system that the automation is sending the transformed data to.

For an automation defined as an external event receiver, the XSLT must transform SystemX data to SystemY data, where SystemX is the external system that the automation is receiving input data from, and SystemY is the external system that the automation is sending the transformed data to.

See "Internal XSLT Sender" and "External XSLT Sender" for sample code.

XSLT Tab

Selecting XSLT Sender from the Automation Type list field results in the XSLT tab being present on the Properties view for the automation. The XSLT tab is where you specify your XSLT file so the predefined automation plug-in can access it. You can specify your XSLT file in one of three ways by choosing the appropriate radio button:

  • When you choose Bundle in, you can select your XSLT file from a list that displays all XSLT files defined in the cartridge resources directory, which populates the XSLT field for you.

  • When you choose Absolute path, you must enter the path and name of your XSLT file in the XSLT field.

  • When you choose URL, you must enter the uniform resource locator (URL) to your XSLT file in the XSLT field.

Note:

Oracle recommends that you choose Bundle in for production mode because it pulls the XSLT files into the PAR file. As a result, you can deploy the EAR file (which contains the PAR file) to any server and, at run time, the application can locate the XSLT files. If you choose Absolute Path or URL for production mode, you can deploy the EAR file to any server but are responsible for ensuring the XSLT files reside in the specified location on the server.

Conversely, Absolute Path or URL are optimal for testing mode because they do not require a rebuild and redeploy to pick up changes to the XSLT.

The XSLTSender class can cache the associated XSLT file, incurring minimal overhead on each invocation. When the automation is defined to cache the XSLT, the implementation detects at runtime whether the XSLT source has changed by checking the URL modification time and the XSLT is automatically reloaded if required. You can configure caching through the Maximum Number in Cache and Cache Timeout fields.

You can set exceptions for the XSLT processing by setting the Exception field. For automations defined on a task, the Exception list field provides the values of success and failure, which are task statuses. If you define additional task statuses, they also appear in the list. (The Exception field is not applicable for automations defined on an order.)

Oracle uses Saxon as the transformer factory to process XSLT. You can specify a different transformer factory by setting a value in the Transformer Factory field.

Note:

Oracle recommends that you use the default Saxon transformer factory.

Routing Tab

The Routing tab consists of two sub-tabs: To and Reply To. Both sub-tabs define the same set of fields. The To sub-tab defines where the outbound message is being routed to, and the Reply To sub-tab defines where the inbound message (replying to the outbound message) is being routed to. You must set the ReplyTo queue on the sender even if you are processing the return message on a different automation plug-in.

Writing the XSLT

When the XSLT transformer is called, it is passed references to the following named parameters that may be used from within the XSLT:

  • Automator: The class instance (for example, the instance of XSLTSender that is calling the XSLT).

  • Log: The automator's instance of org.apache.commons.logging.Log.

  • Context: The context object input parameter to the makeRequest method.

  • OutboundMessage: The outbound JMS TextMessage.

XSLTSender does not automatically complete the associated task after successful processing. If the task needs to be completed, the XSLT must include a call to

TaskContext.completeTaskOnExit(java.lang.String s)

as shown in Example 6-4:

Example 6-4 XSLT Java Call

<xsl:stylesheet version="1.0"
    xmlns="http://java.sun.com/products/oss/xml/ServiceActivation"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:java="http://xml.apache.org/xslt/java"
    xmlns:xalan="http://xml.apache.org/xslt"
    xmlns:sa="http://java.sun.com/products/oss/xml/ServiceActivation"
    xmlns:mslv-sa="http://www.oracle.com/oss/ServiceActivation/2003"
    xmlns:co="http://java.sun.com/products/oss/xml/Common"
    exclude-result-prefixes="xsl java xalan sa mslv-sa">
  <!-- * -->
  <xsl:param name="automator"/>
  <xsl:param name="log"/>
  <xsl:param name="context"/>
  <!-- * -->
  <xsl:output method="xml" indent="yes" omit-xml-declaration="no" xalan:indent-amount="5"/>
  <!-- * -->
  <xsl:template match="/">
    <xsl:variable name="void1" select="java:info($log, 'completing task with status success')"/>
    <xsl:variable name="void" select="java:completeTaskOnExit($context, 'success')"/>
  </xsl:template>
  <!-- * -->
  <xsl:template match="* | @* | text()">
    <!-- do nothing -->
    <xsl:apply-templates/>
  </xsl:template>
</xsl:stylesheet>
 

As the XSLT author, you must ensure that the context parameter provided to the automation plug-in, and so to your XSLT, is an instance of TaskContext or TaskNotificationContext. This implementation attempts to complete the associated task, if applicable, on processing failure, using the exception status defined in the AutomationMap.xml file.

Steps to Follow When Using XSLT Sender

The following high-level steps describe how to set up the XSLT Sender predefined automation plug-in:

  1. Determine the from and to data that your XSLT is to translate.

  2. Write the XSLT.

  3. Define the automated task or automated notification that will trigger the automation plug-in.

  4. Define the automation for the automated task or automated notification:

    1. Select XSLT Sender from the Automation Type list field.

    2. For an automated task, define the automation as an internal or external event receiver.

    3. Populate all applicable automation Properties tabs, including the tabs specific to this type of automation: the XSLT tab and the Routing tab.

  5. Build the cartridge.

  6. Deploy the cartridge to the OSM server.

  7. From within OSM, perform the event that triggers the automation.

  8. XSLTSender uses your XSLT to transform the data and send it to the external system specified by the automation definition.

XSLT Automator

The XSLT Automator predefined automation plug-in provides a way to transform data and, optionally, update OSM with the transformed data, with you supplying the extensible stylesheet language transformation (XSLT).

Defining the Automation

When defining the automation on the Add Automation window, select XSLT Automator from the Automation Type list field.

For an automation defined as an internal event receiver, the scenario is not very plausible because your corresponding XSLT would not need to transform OSM data to OSM data. However, you can write XSLT that executes Java rather than transforming data, so it is possible to define an XSLT Automator as an internal event receiver, but you can accomplish the same thing by writing a custom automation plug-in. The decision on which to use is based on the complexity of the Java code: if it is fairly short and simple, it may be quicker to use the predefined automation plug-in and just write the XSLT, as opposed to writing the custom automation plug-in.

For an automation defined as an external event receiver, your corresponding XSLT must transform SystemX data to OSM data, where SystemX is the external system that the automation is receiving input data from. You can also specify to update OSM with the transformed data.

See "External XSLT Automator" and "Internal XSLT Automator" for sample code.

XSLT Tab

Selecting XSLT Automator from the Automation Type list field results in the XSLT tab being present on the Properties view for the automation. The XSLT tab is where you specify your XSLT so the predefined automation plug-in can access it. You can specify your XSLT in one of three ways by choosing the appropriate radio button:

  • When you choose Bundle in, you can select your XSLT file from a list that displays all XSLT files defined in the cartridge resources directory, which populates the XSLT field for you.

  • When you choose Absolute path, you must enter the path and name of your XSLT file in the XSLT field.

  • When you choose URL, you must enter the uniform resource locator (URL) that locates your XSLT file in the XSLT field.

Note:

Oracle recommends that you choose Bundle in for production mode and Absolute Path or URL for testing mode.

The XSLTReceiver class can cache the associated XSLT file, incurring minimal overhead on each invocation. When the automation is defined to cache the XSLT, the implementation detects at runtime whether the XSLT source has changed by checking the URL modification time; the XSLT is automatically reloaded if required. You can configure caching through the Maximum Number in Cache and Cache Timeout fields.

You can set exceptions for the XSLT processing by setting the Exception field. For automations defined on a task, the Exception list field provides the values of success and failure, which are task statuses. If you define additional task statuses, they also appear in the list. (The Exception field is not applicable for automations defined on an order.)

Oracle uses Saxon as the transformer factory to process XSLT. You can specify a different transformer factory by setting a value in the Transformer Factory field.

Note:

Oracle recommends that you use the default Saxon transformer factory.

When XSLT Automator is selected from the Automation Type list, the XSLT tab also includes the Update Order check box, which is not present when XSLT Sender is selected from the Automation Type list. If the check box is selected, XSLTReceiver updates OSM with the transformed order data. If the check box is deselected, XSLTReceiver just transforms the data; it does not update OSM with the transformed data.

Writing the XSLT

When the XSLT transformer is called, it is passed references to the following named parameters that may be used from within the XSLT:

  • Automator: The class instance (for example, the instance of XSLTReceiver that is calling the XSLT).

  • Log: The automator's instance of org.apache.commons.logging.Log.

  • Context: The context object input parameter to the run method.

XSLTReceiver does not automatically complete the associated task after successful processing. If the task needs to be completed, the XSLT must include a call to

TaskContext.completeTaskOnExit(java.lang.String s)

as shown in Example 6-4.

As the XSLT author, you must ensure that the context parameter provided to the automation plug-in, and so to your XSLT, is an instance of TaskContext or TaskNotificationContext. This implementation attempts to complete the associated task, if applicable, on processing failure, using the exception status defined in the AutomationMap.xml file.

Steps to Follow When Using XSLT Automator

The following high-level steps describe how to set up the XSLT Automator predefined automation plug-in:

  1. Determine the from and to data that your XSLT is to translate.

  2. Write the XSLT.

  3. Define the automated task or automated notification that will trigger the automation plug-in.

  4. Define the automation for the automated task or automated notification:

    1. Select XSLT Automator from the Automation Type list field.

    2. For an automated task, define the automation as an internal or external event receiver.

    3. Populate all applicable automation Properties tabs, including the tab specific to this type of automation; that is, the XSLT tab.

  5. Build the cartridge.

  6. Deploy the cartridge to the OSM server.

  7. From within OSM, perform the event that triggers the automation.

  8. XSLTReceiver uses your XSLT to transform the data and, if you selected the Update Order check box, updates OSM with the transformed data.

XQuery Sender

The XQuery Sender predefined automation plug-in provides a way to extract and manipulate XML data and send it to an external system using JMS, with you supplying the XML query (XQuery).

Defining the Automation

When defining the automation on the Add Automation window, select XQuery Sender from the Automation Type list field.

For an automation defined as an internal event receiver, your corresponding XQuery can manipulate OSM data and send it to SystemY, where SystemY is the external system that the automation is sending the manipulated data to.

For an automation defined as an external event receiver, your corresponding XQuery can manipulate SystemX data and send it to SystemY, where SystemX is the external system that the automation is receiving input data from, and SystemY is the external system that the automation is sending the manipulated data to.

See "Internal XQuery Sender" and "External XQuery Sender" for sample code.

XQuery Tab

Selecting XQuery Sender from the Automation Type list field results in the XQuery tab being present on the Properties view for the automation. The XQuery tab is where you specify your XQuery file so the predefined automation plug-in can access it. You can specify your XQuery file in one of three ways by choosing the appropriate radio button:

  • When you choose Bundle in, you can select your XQuery file from a list that displays all XQuery files defined in the cartridge resources directory, which populates the XQuery field for you.

  • When you choose Absolute path, you must enter the path and name of your XQuery file in the XQuery field.

  • When you choose URL, you must enter the uniform resource locator (URL) to your XQuery file in the XQuery field.

Note:

Oracle recommends that you choose Bundle in for production mode and Absolute Path or URL for testing mode.

The XQuerySender class can cache the associated XQuery file so that minimal overhead is incurred on each invocation. When the automation is defined to cache the XQuery, the implementation detects at runtime whether the XQuery source has changed by checking the URL modification time; the XQuery is automatically reloaded if required. You can configure caching through the Maximum Number in Cache and Cache Timeout fields.

You can set exceptions for the XQuery processing by setting the Exception field. For automations defined on a task, the Exception list field provides the values of success and failure, which are task statuses. If you define additional task statuses, they also appear in the list. (The Exception field is not applicable for automations defined on an order.)

Routing Tab

The Routing tab consists of two sub-tabs: To and Reply To. Both sub-tabs define the same set of fields. The To sub-tab defines where the outbound message is being routed to, and the Reply To sub-tab defines where the in-bound message (replying to the outbound message) is routed. You must set the Reply To queue on the sender even if you are processing the return message with a different automation plug-in.

Writing the XQuery

When the XQuery processor is called, it is passed references to the following named parameters that may be used from within the XQuery:

  • Automator: The class instance (for example, the instance of XQuerySender that is calling the XQuery).

  • Log: The automator's instance of org.apache.commons.logging.Log.

  • Context: The context object input parameter to the makeRequest method.

  • OutboundMessage: The outbound JMS TextMessage.

XQuerySender does not automatically complete the associated task after successful processing. If the task needs to be completed, the XQuery must include a call to

TaskContext.completeTaskOnExit(java.lang.String s)

as shown in Example 6-4.

As the XQuery author, you must ensure that the context parameter provided to the automation plug-in, and so to your XQuery, is an instance of TaskContext or TaskNotificationContext. This implementation attempts to complete the associated task, if applicable, on processing failure, using the exception status defined in the AutomationMap.xml file.
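
The following is a minimal sketch of a sender XQuery, shown only to illustrate the general shape. The namespace bindings, the lowercase parameter names ($automator, $log, $context, $outboundMessage), the automator class named in the automator namespace, and the ProvisioningRequest payload are assumptions that may differ in your environment; the XQuery result is assumed to form the body of the outbound JMS message.

declare namespace oms = "urn:com:metasolv:oms:xmlapi:1";
declare namespace log = "java:org.apache.commons.logging.Log";
(: assumed binding for the automator instance; adjust the class to your environment :)
declare namespace automator = "java:oracle.communications.ordermanagement.automation.plugin.ScriptSenderContextInvocation";

(: named parameters passed by the automation framework :)
declare variable $automator external;
declare variable $log external;
declare variable $context external;
declare variable $outboundMessage external;

(: read the task data for the order :)
let $taskData := fn:root(automator:getOrderAsDOM($automator))/oms:GetOrder.Response
(: hypothetical order field, for illustration only :)
let $orderId := $taskData/oms:OrderID/text()
return (
    log:info($log, fn:concat("Sending request for order ", $orderId)),
    (: the returned element is assumed to become the outbound JMS message body :)
    <ProvisioningRequest>
        <orderId>{ $orderId }</orderId>
    </ProvisioningRequest>
)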

Steps to Follow When Using XQuery Sender

The following high-level steps describe how to set up the XQuery Sender predefined automation plug-in:

  1. Determine the from and to data that your XQuery is to manipulate.

  2. Write the XQuery.

  3. Define the automated task or automated notification that will trigger the automation plug-in.

  4. Define the automation for the automated task or automated notification:

    1. Select XQuery Sender from the Automation Type list field.

    2. For an automated task, define the automation as an internal or external event receiver.

    3. Populate all applicable automation Properties tabs, including the tabs specific to this type of automation: the XQuery tab and the Routing tab.

  5. Build the cartridge.

  6. Deploy the cartridge to the OSM server.

  7. From within OSM, trigger the automation.

  8. XQuerySender uses your XQuery to manipulate the data and send it to the external system specified by the automation definition.

XQuery Automator

The XQuery Automator predefined automation plug-in provides a way to manipulate data or update OSM with the manipulated data, with you supplying the XML Query (XQuery).

Defining the Automation

When defining the automation on the Add Automation window, select XQuery Automator from the Automation Type list field.

For an automation defined as an internal event receiver, your corresponding XQuery can manipulate the OSM input data and, optionally, update OSM with the manipulated data.

For an automation defined as an external event receiver, your corresponding XQuery can manipulate the SystemX input data, where SystemX is the external system that the automation is receiving input data from. You can also update OSM with the manipulated data.

See "External XQuery Automator" and "Internal XQuery Automator" for sample code.

XQuery Tab

Selecting XQuery Automator from the Automation Type list field results in the XQuery tab being present on the Properties view for the automation. The XQuery tab is where you specify your XQuery so the predefined automation plug-in can access it. You can specify your XQuery in one of three ways by choosing the appropriate radio button:

  • When you choose Bundle in, you can select your XQuery file from a list that displays all XQuery files defined in the cartridge resources directory, which populates the XQuery field for you.

  • When you choose Absolute path, you must enter the path and name of your XQuery file in the XQuery field.

  • When you choose URL, you must enter the uniform resource locator (URL) that locates your XQuery file in the XQuery field.

The XQueryReceiver class can cache the associated XQuery file so that minimal overhead is incurred on each invocation. When the automation is defined to cache the XQuery, the implementation detects at runtime whether the XQuery source has changed by checking the URL modification time; the XQuery is automatically reloaded if required. You can configure caching through the Maximum Number in Cache and Cache Timeout fields.

You can set exceptions for the XQuery processing by setting the Exception field. For automations defined on a task, the Exception list field provides the values of success and failure, which are task statuses. If you define additional task statuses, they also appear in the list. (The Exception field is not applicable for automations defined on an order.)

When XQuery Automator is selected from the Automation Type list, the XQuery tab also includes the Update Order check box, which is not present when XQuery Sender is selected from the Automation Type list. If the check box is selected, XQueryReceiver updates OSM with the manipulated data. If the check box is deselected, XQueryReceiver just manipulates the data; it does not update OSM with the manipulated data.

Writing the XQuery

When the XQuery processor is called, it is passed references to the following named parameters that may be used from within the XQuery:

  • Automator: The class instance (for example, the instance of XQueryReceiver that is calling the XQuery).

  • Log: The automator's instance of org.apache.commons.logging.Log.

  • Context: The context object input parameter to the makeRequest method.

XQueryReceiver does not automatically complete the associated task after successful processing. If the task needs to be completed, the XQuery must include a call to

TaskContext.completeTaskOnExit(java.lang.String s)

as shown in Example 6-4.

As the XQuery author, you must ensure that the context parameter provided to the automation plug-in, and so to your XQuery, is an instance of TaskContext or TaskNotificationContext. This implementation attempts to complete the associated task, if applicable, on processing failure, using the exception status defined in the AutomationMap.xml file.
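
The following is a minimal sketch of an automator XQuery, shown only to illustrate the general shape. The namespace bindings, the lowercase parameter names ($automator, $log, $context), the automator class named in the automator namespace, and the "success" status are assumptions that may differ in your environment.

declare namespace oms = "urn:com:metasolv:oms:xmlapi:1";
declare namespace log = "java:org.apache.commons.logging.Log";
declare namespace context = "java:com.mslv.oms.automation.TaskContext";
(: assumed binding for the automator instance; adjust the class to your environment :)
declare namespace automator = "java:oracle.communications.ordermanagement.automation.plugin.ScriptReceiverContextInvocation";

declare variable $automator external;
declare variable $log external;
declare variable $context external;

(: read the task data for the order :)
let $taskData := fn:root(automator:getOrderAsDOM($automator))/oms:GetOrder.Response
return (
    log:info($log, "XQuery Automator processing complete"),
    (: XQueryReceiver does not complete the task automatically; do it explicitly :)
    context:completeTaskOnExit($context, "success")
)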

Steps to Follow When Using XQuery Automator

The following high-level steps describe how to set up the XQuery Automator predefined automation plug-in:

  1. Determine the from and to data that your XQuery is to manipulate.

  2. Write the XQuery.

  3. Define the automated task or automated notification that will trigger the automation plug-in.

  4. Define the automation for the automated task or automated notification:

    1. Select XQuery Automator from the Automation Type list field.

    2. For an automated task, define the automation as an internal or external event receiver.

    3. Populate all applicable automation Properties tabs, including the tab specific to this type of automation; that is, the XQuery tab.

  5. Build the cartridge.

  6. Deploy the cartridge to the OSM server.

  7. From within OSM, trigger the automation.

  8. XQueryReceiver uses your XQuery to manipulate the data or update OSM with the manipulated data.

DatabasePlugin

The DatabasePlugin class is a predefined automation plug-in that provides a way to interact with external databases, with you supplying the SQL and stored procedures to query and update a database. The automation plug-in can also be configured to update OSM with data returned from an external database.

DatabasePlugin is slightly different from the previously described predefined automation plug-ins, in that the input is not accessed through a file. Rather, the input is accessed through the XML Template field on the Custom Automation Plug-in editor. Because this predefined automation plug-in requires the use of the XML Template field, it must be defined as a Custom Automation Plug-in. As a result, DatabasePlugin does not appear in the Automation Type list field on the Add Automation window like the other predefined automation plug-ins do.

Note:

The OSM installation provides samples of the DatabasePlugin predefined automation plug-in, located in the SDK/Samples/DatabasePlugin directory.

Defining the Custom Automation Plug-in

To define the Custom Automation Plug-in for the DatabasePlugin predefined automation plug-in, set the Class field by selecting DatabasePlugin. The DatabasePlugin.class is located in the SDK/automation/automationdeploy_bin/automation_plugins.jar file, which comes with your OSM installation.

XML Template

The XML Template field consists of one or more statements defined under the root <implementation> element. A statement may update the database, or update OSM order data, or both. All statements share the following characteristics:

  • May contain SQL or stored procedure calls.

  • May have one or more parameters.

  • May return one or more result sets, either as a result of a database query or through a stored procedure OUT parameter.

  • May contain one or more bind paths.

  • May be configured to handle database exceptions in an implementation appropriate manner.

  • May run as a single transaction or in a group.

SQL statements are specified by the <sql> element and stored procedure statements are specified by the <call> element. The format of the call element is expected to be of the form {? = call <procedure-name>[<arg1>, <arg2>, ...]} or {call <procedure-name>[<arg1>, <arg2>, ...]}. Parameters are declared with the ? character.

Example 6-5 and Example 6-6 show the SQL statement and the stored procedure call.

Example 6-5 SQL Statement

<implementation xmlns="http://www.oracle.com/Provisioning/database/DatabasePlugin/2006/02/28" ...>
    ...
    <query>
        <sql>SELECT 'dummy' FROM dual</sql>
    ...
    </query>
</implementation>

Example 6-6 Stored Procedure Call

<implementation xmlns="http://www.oracle.com/Provisioning/database/DatabasePlugin/2006/02/28" ...>
    ...
    <update>
        <call>{call a_stored_procedure(?)}</call>
        ...
    </update>
</implementation>

Transaction Element

The <transaction> element allows statements to be grouped. All statements contained in a <transaction> element will be run as part of a single database transaction. If a statement is defined outside of the <transaction> element, it is auto-committed immediately after the statement completes. The available configuration parameters are:

  • dataSource: Mandatory. Specifies the JNDI name of the SQL data source used to create the database connection for the transaction. This data source must be defined in your WebLogic domain before the plug-in is called.

    Note:

    Do not configure the data source to support global transactions. The plug-in instance is called under an enclosing transaction, making this option illegal.

  • isolationLevel: Optional. Specifies the transaction isolation level. Valid values are READ_COMMITTED, READ_UNCOMMITTED, REPEATABLE_READ, and SERIALIZABLE. READ_UNCOMMITTED and REPEATABLE_READ are not supported by Oracle.

  • scrollType: Optional. Specifies the type of result sets to be created as part of the transaction. Valid values are FORWARD_ONLY, SCROLL_SENSITIVE, and SCROLL_INSENSITIVE. The SCROLL values apply only when more than one ResultSet definition is defined for the same result set.

  • update: A statement that updates the database, but does not return results.

  • query: A statement that queries the database for information. The returned results are used to update the order data.

Example 6-7 Transaction Definition

<implementation xmlns="http://www.oracle.com/Provisioning/database/DatabasePlugin/2006/02/28" ...>
    <transaction isolationLevel="READ_COMMITTED" scrollType="SCROLL_INSENSITIVE">
        <dataSource>test/dbplugin/datasource</dataSource>
        <query>
            <sql>SELECT 'dummy' FROM dual</sql>
            <resultSet>
                <column number="1">/path/to/p6/field</column>
            </resultSet>
        </query>
    </transaction>
</implementation>

Bind Path

The <bindPath> element provides a way to correlate outbound parameter values and in-bound result set column values. Instances of the result column will be bound to instances of the specified parameter at the specified path, after which their paths may diverge. This binding is only relevant when a parameter's path includes a multi-instance group element.

Consider the sample OSM order data shown in Example 6-8 and the corresponding plug-in configuration in Example 6-9.

Example 6-8 OSM Order Data

<employees>
    <employee>
        <name>William</name>
        <job/>
    </employee>
    <employee>
        <name>Mary</name>
        <job/>
    </employee>
</employees>

Example 6-9 Plug-in Definition Using a Bind Path

<implementation xmlns="http://www.oracle.com/Provisioning/database/DatabasePlugin/2006/02/28" ... >
    <query>
        <sql>SELECT job FROM employee WHERE name = ?</sql>
        <bindPath id="emp" path="/employee[2]"/>
        <parameter xsi:type="ProvisioningParameterType" bindPathRef="emp" path="name" type="text"/>
        <resultSet appliesTo="1" appliesToRow="all">
            <column number="1" bindPathRef="emp" path="job" updateOnMatch="true"/>
        </resultSet>
    </query>
</implementation>

The emp bind path selects the second employee (with the name Mary). This bind path is used as the basis for the parameter selection and the corresponding result set column value, ensuring that the job field that gets updated is the one corresponding to the employee named Mary.

Parameter

The <parameter> element defines how values are bound to the SQL parameter declarations. Parameters must be defined in the order of the corresponding declarations.

OSMParameterType

Specifies a parameter, the value of which will be bound to a <sql> or <call> statement. Parameters are processed in the order they are declared. The available parameter configuration attributes are:

  • bindPathRef

  • path

  • type

  • mode

bindPathRef and/or path provide the value that will be set on the SQL parameter; type provides the data type of the value; mode specifies whether the parameter is a stored procedure IN, OUT, or INOUT parameter. Each attribute is described in more detail in the sections that follow.

bindPathRef: This is the ID value of a bind path defined elsewhere on the statement. Either bindPathRef, path, or both may be specified. The value bound to the SQL parameter depends on the result of the evaluation of the bind path's XPath expression, as described in Table 6-2.

Table 6-2 Bind Path Evaluation Behavior

  • XPath result: null

    If path is not specified, the SQL parameter is set to null. If path is specified, the SQL parameter is set based on the path evaluation as described below.

  • XPath result: Node-set

    If path is not specified, the SQL parameter is set according to the following algorithm:

    1. The first node encountered in the node-set is selected.

    2. If the node is an XML element, the text contained directly under the element is selected as a String (if none, the SQL parameter is set to null).

    3. If the node is an XML attribute, the value of the attribute is selected as a String.

    4. Otherwise, the node itself (as a Java Object) is selected.

    The parameter value is set using the selected data based on the parameter's type (see Table 6-4).

  • XPath result: Object

    The parameter value is set using the selected data based on the parameter's type (see Table 6-4).

path: The XPath selector in the path attribute is evaluated against the plug-in's input data to determine the SQL parameter's value. The context node against which the path expression is evaluated depends on the format of the input data and on whether bindPathRef evaluated to a node-set of XML Elements. If bindPathRef evaluated to a node-set of Elements, the first encountered Element is used as the context node for the path expression. If the input is an OSM GetOrder.Response document, the context node is the _root element of the document. Otherwise, the context node is the document root element. The value bound to the SQL parameter depends on the result of the evaluation of the path's XPath expression, as described in Table 6-3.

Table 6-3 Path Expression Evaluation Behavior

  • XPath result: null

    The SQL parameter is set to null.

  • XPath result: Node-set

    The SQL parameter is set according to the following algorithm:

    1. The first node encountered in the node-set is selected.

    2. If the node is an XML Element, the text contained directly under the Element is selected as a String (if none, the SQL parameter is set to null).

    3. If the node is an XML Attribute, the value of the Attribute is selected as a String.

    4. Otherwise, the node itself (as a Java Object) is selected.

    The parameter value is set using the selected data based on the parameter's type (see Table 6-4).

  • XPath result: Object

    The parameter value is set using the selected data based on the parameter's type (see Table 6-4).

type: Specifies the data type of the parameter. The data types are OSM-specific. Valid values are: boolean, currency, date, dateTime, numeric, phone, and text.

Table 6-4 shows the SQL data type that will be used to set the SQL parameter based on the specified type and the Java class of the parameter value.

Table 6-4 OSM Data Type to SQL Data Type Mapping

Each entry lists the type attribute value, the SQL data type (Footnote 1), and how the parameter value is evaluated (Footnote 2).

  • boolean: SQL data type boolean. Evaluated according to java.lang.Boolean.parseBoolean() using the String value of the parameter. The OSM values Yes and No are also supported.

  • currency: SQL data type double. Evaluated according to java.lang.Double.parseDouble() using the String value of the parameter.

  • numeric: SQL data type double. Evaluated according to java.lang.Double.parseDouble() using the String value of the parameter.

  • date: SQL data type date. The String value of the parameter is expected to match the format yyyy-MM-dd.

  • dateTime: SQL data type timestamp. The String value of the parameter is expected to match the format yyyy-MM-dd'T'HH:mm:ss z.

  • phone: SQL data type string. Evaluated according to java.lang.String.valueOf().

  • text: SQL data type string. Evaluated according to java.lang.String.valueOf().

Footnote 1: The SQL data type used when the parameter is set as java.sql.PreparedStatement.setXXX(#, value).

Footnote 2: If the class of the parameter is directly assignable to the SQL data type, it is not first evaluated as a String. For example, if the type attribute value is numeric and the class of the parameter value is java.lang.Number, no String evaluation is required.

mode: Specifies the mode of the parameter. Valid values are IN, OUT, and INOUT. Applicable only if the statement is a stored procedure call, that is, defined with <call>.

Exception

The exception statement specifies the behavior that the plug-in should exhibit when a particular Java exception is caught during processing. Exceptions can be ignored or they can complete the associated task with a particular exit status.

If the exception is an instance of java.sql.SQLException, behavior may be further constrained to exceptions that have a particular error code or SQL state value. Exception handlers are evaluated in document order; that is, the first exception handler that matches the thrown exception will be used. If no exception handler exists for a thrown exception, it will be wrapped in a com.mslv.oms.automation.plugin.JDBCPluginException and re-thrown.

Creating the JDBC Data Source

The Database Plug-in must be associated with a JDBC data source that:

  • Uses a non-XA database driver

  • Does not support global transactions (Supports Global Transactions is a check box that is available when defining the WebLogic data source configuration).

When creating the JDBC data source:

  • Create a JDBC Data Source that refers to the schema under which you are running the scripts.

  • The provided Database Plug-in sample assumes that the JNDI name of this Data Source is demo/dbplugin/datasource. The Data Source can have any JNDI name, but the configuration XML files in the config directory need to be updated accordingly.

  • For Database Type, select Oracle.

  • For Database Driver, select Oracle's Driver (Thin) Versions: 9.0.1, 9.2.0, 10.

  • Deselect the Supports Global Transactions check box. (This check box defaults to being selected, so you must deselect it.)

Exception

If you create a JDBC data source that uses an XA database driver or that supports global transactions, the DatabasePlugin implementation throws the exception shown in Example 6-10.

Example 6-10 Exception

An automation exception has occurred at 
AutomationDispatcherImpl.runAutomator:/automation/plugin/internal/task/
database_plugin_demo/1.0/get_employee_names/do.
The reason is: 
com.mslv.oms.automation.AutomationException: 
com.mslv.oms.automation.AutomationException: 
com.mslv.oms.util.jdbc.exception.UncategorizedSQLException:
Unable to commit transaction.
com.mslv.oms.automation.AutomationException: 
com.mslv.oms.automation.AutomationException: 
com.mslv.oms.util.jdbc.exception.UncategorizedSQLException:
Unable to commit transaction.
at com.mslv.oms.automation.plugin.AutomationEventHandlerImpl.a(Unknown Source)
at com.mslv.oms.automation.plugin.AutomationEventHandlerImpl.processMessage
(Unknown Source)
at com.mslv.oms.automation.AutomationDispatcher.onMessage(Unknown Source) 
at weblogic.ejb.container.internal.MDListener.execute(MDListener.java:429) 
at weblogic.ejb.container.internal.MDListener.transactionalOnMessage
(MDListener.java:335)
at weblogic.ejb.container.internal.MDListener.onMessage(MDListener.java:291) 
at weblogic.jms.client.JMSSession.onMessage(JMSSession.java:4060) 
at weblogic.jms.client.JMSSession.execute(JMSSession.java:3953) 
at weblogic.jms.client.JMSSession$UseForRunnable.run(JMSSession.java:4467) 
at weblogic.work.ExecuteRequestAdapter.execute(ExecuteRequestAdapter.java:21) 
at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:145) 
at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:117) 

About Large Orders and Automation Plug-ins

The following sections provide information about developing and managing automation for large orders.

Limiting Automation Concurrency in Large Orders

OSM is designed to provide high levels of order processing concurrency. In most OSM solutions, this high level of concurrency (when coupled with proper system tuning such as database connections, WebLogic Server threads, and so on) is effective at maximizing OSM scalability and performance. However, in some cases, especially when orders are very large or the associated automation plug-in transactions are complex and lengthy, you may need to limit the number of automation plug-in instances that can run at one time. You can restrict the number of automation plug-ins that run concurrently using order automation concurrency control (OACC) policy files (automationConcurrencyModel.xml).

To create an OACC policy file:

  1. In Design Studio, create a file called automationConcurrencyModel.xml and add the file to the resource folder of the cartridge you want the OACC policy to apply to.

  2. Add the following snippet to the file after replacing the placeholders:

    <?xml version="1.0"?>
    <automationConcurrencyModel xmlns="http://xmlns.oracle.com/communications/ordermanagement/model"> 
       <automationConcurrencyPolicy name="name"> 
          <targetPlugins> 
            <cartridgeNamespace>namespace</cartridgeNamespace>
            <cartridgeVersion>version</cartridgeVersion>
            <pluginSelector>pluginSelector</pluginSelector> 
          </targetPlugins> 
          <scope>scope</scope> 
          <concurrencyLevel>concurrencyLevel</concurrencyLevel> 
       </automationConcurrencyPolicy> 
    </automationConcurrencyModel>
    

    where:

    • name: A policy name. Within the automationConcurrencyModel.xml file you can specify one or more automation concurrency policies. Each policy is specified within its own automationConcurrencyPolicy element.

    • You can use the optional child elements within the targetPlugins element to specify plug-ins contained in the automationMap.xml files in deployed cartridges or found on the OSM system class path. OSM must match all specified criteria before applying a policy. If no criteria are specified, OSM applies the policy to all deployed plug-ins.

      • namespace: The value for this field must be a valid cartridge namespace where the automation plug-ins are located.

      • version: The value for this field is the cartridge version.

      • pluginSelector: The value for this field is an XPath 1.0 selector. The context is the automationMap.xml file, which defines every automation plug-in associated with a specific cartridge.

        For example, the following selector matches all automation plug-ins from a cartridge with namespace foo and version 1.2.3.4.5 that are also external event receivers (that is, plug-ins with a receive/jmsSource element):

        .[cartridgeNamespace="foo"][cartridgeVersion="1.2.3.4.5"][count(receive/jmsSource)>0]

        See "About Automation Maps" for more information about the automationMap.xml file.

    • scope: A value that specifies the scope of the policy. The values are:

      • ORDER_ID: The policy applies to every order on each OSM managed server. This scope is appropriate if you want to limit the number of automated transactions that can run in parallel within a given order, but do not want to restrict how many separate orders can be running concurrently.

      • CARTRIDGE_AND_VERSION: The policy applies to a specific cartridge and version. The policy limits the maximum number of concurrent automated transactions that can occur across all orders from the same cartridge namespace and version. This scope is appropriate if you want to limit how many orders can have transactions concurrently processing that were created from within the same version of the same cartridge.

      • CARTRIDGE: The policy applies to a specific cartridge regardless of version. This scope is appropriate if you want to limit how many orders can have transactions concurrently processing that were created from a cartridge with the same namespace regardless of version.

      • SERVER: The policy applies to an entire server. The policy limits the maximum number of concurrent automated transactions that can occur across all orders in any one server regardless of cartridge namespace or version. This scope is appropriate if you want to limit how many orders can be processing on any one managed server regardless of the cartridge namespace and version that they are created from.

      If plug-ins from two cartridge versions were targeted then there would be two group instances (cartridge X version 1, cartridge X version 2).

    • concurrencyLevel: A numerical value specifying the maximum concurrency for each managed server that is allowed within the defined scope. A value of 1 or higher limits concurrency to the specified level within the scope. A value of 0 or less means unlimited concurrency (effectively disabling the policy).

  3. Save and close the file.

  4. Build the cartridge.

  5. Deploy the cartridge.

    Note:

    You can validate that the OACC policy was applied by verifying the WebLogic server domain_home/servers/servername/logs files (where domain_home is the directory that contains the configuration for the domain into which OSM is installed, and servername is the server whose logs you are checking). Details about deployed OACC policies are listed in the automation plug-in deployment summary.

For example, the first policy in the following file limits each order to run one matching automation plug-in at a time, and the second policy allows up to three matching plug-ins to run concurrently in each order:

<?xml version="1.0"?>
<automationConcurrencyModel xmlns="http://xmlns.oracle.com/communications/ordermanagement/model"> 
   <automationConcurrencyPolicy name="name"> 
      <targetPlugins> 
         <pluginSelector>starts-with(./ejbName,'UpdateOACC')</pluginSelector> 
      </targetPlugins> 
      <scope>ORDER_ID</scope> 
      <concurrencyLevel>1</concurrencyLevel> 
   </automationConcurrencyPolicy> 
   <automationConcurrencyPolicy name="policymultithread"> 
      <targetPlugins> 
         <pluginSelector>starts-with(./ejbName,'UpdateMultiThread')
         </pluginSelector> 
      </targetPlugins> 
      <scope>ORDER_ID</scope> 
      <concurrencyLevel>3</concurrencyLevel> 
   </automationConcurrencyPolicy> 
</automationConcurrencyModel>

Using GetOrder and UpdateOrder API Functions in Large Orders

When you design automation plug-ins or interact with OSM from external applications, you can implement XML API or OSM Web Service GetOrder operations with the OrderDataFilter element that explicitly specifies which parts of the order to return data from. This can enhance performance in cases where orders are very large and complex with hundreds of order items and where returning the complete order in a response would be costly in terms of CPU and memory usage. For example, in many cases, an automation plug-in already has advanced knowledge of an order item line ID which you can use with the OrderDataFilter to specify the exact line ID you want to return data for.

See "GetOrder" for more information about the OrderDataFilter element in the XML API GetOrder.Request. See "GetOrder" for more information about the OrderDataFilter in the GetOrder Web Service.

When you use automation plug-ins or external clients, you can create XML API or Web Service UpdateOrder requests with a ResponseView that specifies the order data to be returned in an UpdateOrder response. This ResponseView behaves in the same way as a GetOrder request. You can use the OrderDataFilter with the ResponseView to further restrict the returned data. If the response includes a fulfillment state update, then OSM automatically filters the response so that only order items and order components impacted by the fulfillment state update are included. This auto-filtering of fulfillment state updates in responses avoids expensive XQuery processing within OSM to determine impacted order item and order component fulfillment states. The ResponseView does this by automatically applying an OrderDataFilter from within the OSM server, which can perform this filtering action more efficiently and also avoids having to serialize and parse large amounts of XML not needed by the requesting client or automation plug-in logic.

In addition, you can use the ExternalFulfillmentStates nodes within an XML API or Web Service UpdateOrder to directly update order item fulfillment states. This optional approach improves order processing efficiency because you no longer need complicated XQuery logic to determine the impact of the external fulfillment state change on an order component and order item.

See "UpdateOrder" for more information about the ResponseView, OrderDataFilter, and ExternalFulfillmentStates elements in the XML API UpdateOrder.Request. See "UpdateOrder" for more information about the OrderDataFilter in the UpdateOrder Web Service.

About Compensation for Automations

The following sections describe how automations can be configured for compensation.

About Execution Modes for Automations

In compensation scenarios, internal event receiver sender automations triggered from tasks can run in different execution modes. When the task is in a particular execution mode, only those sender automations configured with the corresponding execution mode can run. For example, a task may have three automations: a sender configured to send messages to external systems in both do and redo modes, a corresponding automator that receives the responses to those messages, and a third sender used for cancelation scenarios or for when the task is no longer required in the process flow. This third sender sends a cancelation request to the external system, which cancels any of the do or redo operations that previously occurred on the external system. The response is returned to the automator plug-in, which contains code that can handle any do, redo, or undo request and transition the task as appropriate.

At the task level:

  • Internal senders and automators configured in the Automation tab can run in do, redo, or undo in normal or fallout modes.

  • Internal senders and automators configured in the Events tab can run in do, redo, or undo in normal mode.

About Automations that Update Order Data and Compensation Analysis

When a revision order triggers compensation analysis for an order, the order data updated by an automation may be included in compensation.

Any update order data changes triggered from automations with TaskContext or TaskNotificationContext objects, regardless of whether the task can be run in different normal or fallout execution modes, participate in task-level compensation analysis. Figure 6-10 illustrates how an update order run during the base order processing of Task A is included in the historical order perspective (HOP) of revision 1 Task A.

Figure 6-10 Update Orders in Task Compensation Analysis


Any update order data changes triggered from automations with OrderNotificationContext or OrderDataChangeNotificationContext objects do not participate in task-level compensation analysis and OSM does not include them in the contemporary order perspective (COP) or historical order perspectives (HOP). Nevertheless, OSM includes these data updates into the real-time order perspective (ROP) and OSM adds the changes to the closest task instances that are created or completed when the data changes occur.

OSM guarantees the accuracy of these data perspectives, according to their definitions, for data update changes made in the task context. Because update order data changes made in the order context are not associated with a specific task, OSM cannot guarantee that the compensation perspectives (COP and HOP) will reflect data changes made in the order context consistently and deterministically.

For more information about how perspectives work in change order management scenarios, see OSM Modeling Guide.

About Using GetOrder Responses to View Compensation Perspectives

During the fulfillment process, an order may fail (also known as fallout) for reasons such as insufficient data or incorrect data. You may have to revise the order data to fix the fallout. If there are multiple revisions on the order, you may need access to previous versions of it so you can provide the information required to roll back the order to the corresponding successful state rather than rolling it back to the previous successful state.

Using the GetOrder TaskExecutionHistory and OrderChangeID elements, you can obtain the order data for all the revisions that happened on an order and use the relevant data in the fulfillment process according to your needs. The GetOrder.Request and GetOrder.Response XML API messages include these elements and are available to OSM automation plug-ins.

For example, consider an order that has been revised three times. You can obtain the order data of all three revisions and use the required data for fulfillment.

See "GetOrder" for more information about these elements.

Use the GetOrder function to retrieve the TaskExecutionHistory element, which returns an OrderChangeID associated with each historical perspective.

The following sample code snippet provides the syntax for the GetOrder function:

let $taskData := fn:root(automator:getOrderAsDOM($automator))/oms:GetOrder.Response
let $orderChangeID := $taskData/oms:TaskExecutionHistory/oms:Task[1]/oms:OrderChangeID/text()
let $prevTaskData := fn:root(automator:getOrderAsDOM($automator, $orderChangeID))/oms:GetOrder.Response

In the example above, the OrderChangeID specifies the revision to look for and roll back. An OrderChangeID with a value of 0 indicates the original base order with no revisions.
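
Building on the snippet above, the following sketch shows one way the historical data might be used. The subscriber_name field and the _root data path are illustrative only, and the oms and log namespace bindings and the $log variable are assumed to be declared as in the earlier sketches.

(: compare a field between the current task data and the first historical revision :)
let $currentValue := $taskData/oms:_root/oms:subscriber_name/text()
let $previousValue := $prevTaskData/oms:_root/oms:subscriber_name/text()
return
    if ($currentValue ne $previousValue) then
        log:info($log, fn:concat("subscriber_name changed from ", $previousValue, " to ", $currentValue))
    else
        log:info($log, "subscriber_name unchanged since the previous revision")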

About Creating Automations in Design Studio

The following sections describe Design Studio tasks involved in creating automations.

About Building and Deploying Automation Plug-ins

Starting with OSM 7.3, OSM runs all automation plug-ins inside the oms.ear file. Running all automation plug-ins in oms.ear improves the performance of automated task processing and improves the performance of building and deploying cartridges with automated tasks.

Figure 6-11 illustrates when automation plug-ins are built and deployed using Optimized mode, which is the only method available in OSM 7.3 and later. Internal event receiver type plug-ins run within the OSM application and do not require their own J2EE application. The figure also illustrates that the business logic of external event receiver type plug-ins is also run within the OSM application and only the automation framework of external event receiver type plug-ins requires its own J2EE application to listen on the external message queue.

Figure 6-11 Dispatch of Automation Plug-ins


External event receiver type automation plug-ins always require their own J2EE application in order to listen on a JMS destination. All of the business logic for external event receiver type plug-in J2EE applications is executed within the OSM application, and these applications need to be rebuilt only when the JNDI name of the JMS destination changes.

External event receiver type automation plug-ins are made up of both:

  • The automation framework (OSM infrastructure) which receives and prepares the incoming message so that it can be executed according to your business logic. The automation framework subscribes to the external message queue (JMS destination) and requires its own J2EE application in order to listen on the external message queue.

  • The business logic itself which determines how the incoming message will be processed (for example, XQuery, XSLT, and custom Java class).

The J2EE application of an external event receiver type automation plug-in contains only the minimum amount of automation framework infrastructure that allows it to listen on the external message queue and forward the message to the core OSM application logic. This means the business logic of the automation plug-in is executed within the OSM application. The automation framework acts primarily to forward the message to OSM. The only time you need to rebuild an external event receiver type automation plug-in is when you decide to use a different external message queue (when the JNDI name of the JMS destination changes).

Note:

Starting with OSM 7.2.4, automations are validated when they are deployed. Prior to that, errors such as missing queues were detected only at run time. The deployment logs provide information about any validation failures. Because the validation can cause the deployment to fail, once you have corrected the problem, you need to redeploy the automation.

About Automation Maps

After you have defined the automated task or automated notification, and defined the automation for it, a successful build of the project automatically generates the automationMap.xml file:

  • This file is governed by the rules defined in the cartridgeName/customAutomation/automationMap.xsd file, which is only visible when in the Java perspective. The customAutomation directory and XSD file are present with the creation of an OSM cartridge.

  • This file is placed in the cartridgeName/cartridgeBuild/automation directory, which is only visible when in the Java perspective.

About Editing the Automation Map

If you are deploying a cartridge outside Design Studio, for example using OSM's cartridge management tools, the first time you upgrade a cartridge from a pre-OSM 7.0.3 version to a version of OSM that is 7.0.3 or later, you need to update the automationMap.xml manually. You need to add two elements to each <taskAutomator> element:

<cartridgeNamespace>Namespace</cartridgeNamespace>
<cartridgeVersion>Version</cartridgeVersion>

These elements are required because of changes to the automationMap.xsd.

If you are upgrading a pre-OSM 7.0.3 cartridge created in Design Studio, to a version that is 7.0.3 or later, no manual change is required.

For examples of generated XML for automations defined for automated tasks and automated notifications, see "AutomationMap.xml File." The information is not included in this chapter because Oracle recommends that when defining the automation, you take the defaults and allow the project build to generate the automationMap.xml file. The information in the appendix is provided for in-depth understanding of the file should you need to modify it for some rare, obscure business reason.

About Mnemonic Values for Design Studio Entities in Automation Maps

For automations defined as internal event receivers, the generated automationMap.xml file contains the <mnemonic> element. The value of this element varies as described in Table 6-5.

The String value of the mnemonic element cannot exceed a length of fifty characters. If the length is greater than fifty, the following build error is encountered:

Exception caught assembling plug-ins: "Parse/validation of automation map cartridgeName/cartridgeBuild/automation/automationMap.xml using schema cartridgeName/customAutomation/automationMap.xsd failed: Invalid text fiftyPlusMnemonicValue in element: mnemonic."

Table 6-5 Mnemonic Values

Each entry lists the automated task or automated notification type, followed by the corresponding <mnemonic> value.

  • Automated task: taskName

  • Order milestone-based event notification: The <mnemonic> element is not generated for order milestone-based event notifications.

  • Task state-based event notification (task Events tab): taskName

  • Task state-based event notification (process Events tab): processName_eventName

  • Task status-based event notification: processName_eventName

  • Order data changed event notification: orderName_eventNotificationName

  • Order jeopardy notification: orderName_jeopardyName

  • Task jeopardy notification: taskName_jeopardyName

About Managing Automations

The following sections describe automation management topics.

Building and Deploying Automation Plug-ins

Building and deploying an automation plug-in is a matter of building and deploying the cartridge that defines the automation plug-in. See OSM Modeling Guide for more information.

Automating the Build and Deploy

You can also automate the build and deployment of an automation plug-in by automating the build and deployment of the cartridge that defines the automation plug-in. See OSM Modeling Guide.

Troubleshooting Automations

If you encounter a problem when attempting to run an automation, you must verify that you are not using multiple versions of the automation_plugins.jar file. You do this by checking that the date and size of the file are the same in the following locations:

  • When you create a new cartridge in Design Studio, the automation_plugins.jar file is placed in the osmlib directory of the cartridge. Verify the date and size of the file by viewing your Eclipse workspace in Windows Explorer, and navigating to the osmlib directory of the cartridge you created within your workspace.

  • When you install OSM, the automation_plugins.jar file is placed in the SDK/automation/automationdeploy_bin directory. This is the version of the automation_plugins.jar file that your project library list references to compile the cartridge project containing the automation. (See "Compiling the Custom Automation Plug-in" for more information.) Verify the date and size of the file by viewing your installation directory, and navigating to the SDK/automation/automationdeploy_bin directory.

If the two versions of the file are not the same, use the version from the OSM installation:

  1. Copy the automation_plugins.jar file from the SDK/automation/automationdeploy_bin directory to the osmlib directory of your cartridge within your Eclipse workspace.

  2. Clean and rebuild the cartridge.

  3. Redeploy the cartridge.

  4. Run the automation.

Note:

When the versions of the automation_plugins.jar file are not the same, you may also encounter a marshalling error when deploying the cartridge, prior to attempting to run the automation. The marshalling error, which states that it cannot find the getProductBuildVersion() method, displays on the WebLogic console; it does not display in Design Studio when deploying the cartridge. If you encounter this error, the resolution is the same. Follow the steps described above.

Upgrading Automation Plug-ins

If you are upgrading from a previous release of OSM, and the previous release included automation plug-ins (custom or predefined), the same steps that are required to define a new automation plug-in are required to define the existing automation plug-in in the new release, with the exception of writing the actual custom Java code.

For example, if the previous release included the automation plug-in genericPlugin, to upgrade genericPlugin in the new release you need to:

  • Define the trigger in Design Studio

  • Define the automation mapping in Design Studio

  • Define the Custom Automation Plug-in in Design Studio

  • Deploy the cartridge that contains genericPlugin to the OSM server

If genericPlugin is a custom automation plug-in, you can reuse the custom Java code by placing the Java source file in the cartridge src directory, compiling it, and selecting the class when defining the Custom Automation Plug-in. If genericPlugin is a predefined automation plug-in, you can select the predefined class when defining the automation, and reuse your XSLT or XQuery files by copying them into the cartridge resource directory.