Provide an Extended Instruction to the OpenAI Model
This use case demonstrates how to provide an array-based instruction to the OpenAI model. Two roles are specified in the input. For each role, you define different content for the OpenAI model to address.
- Configure a REST Adapter trigger connection.
- Configure an OpenAI Adapter invoke connection.
- Create an application integration.
- Drag the REST Adapter trigger connection into the integration canvas for configuration. For this example, the REST Adapter is configured as follows (a sketch of calling the resulting endpoint from an external client appears after these settings):
  - A REST Service URL of /extended3 is specified for this example.
  - A Method of POST is selected.
  - A Request Media Type of JSON is selected and the following sample JSON structure is specified:
{ "model" : "gpt-4o", "input" : [ { "role" : "developer", "content" : "perform the openAI LLM function calling functionality described in the form of text content. Response should be same as that returned for function calling" }, { "role" : "user", "content" : "Define a get_weather function with parameters location, which is a string object. Call the appropriate function for What is the weather like in Paris today?" } ], "instructions" : "", "max_output_tokens" : 234, "metadata" : null, "parallel_tool_calls" : true, "previous_response_id" : null, "store" : true, "stream" : false, "temperature" : 1, "tool_choice" : "auto", "top_p" : 1, "truncation" : "disabled", "user" : "asdf" }
  - A Response Media Type of JSON is selected and the following sample JSON structure is specified:
{ "id" : "resp_67e6322ad4688192abad3f268b66236e05ecd4d8d90549f9", "object" : "response", "created_at" : 1743139370, "status" : "completed", "error" : "df", "incomplete_details" : "asdf", "instructions" : "asdf", "max_output_tokens" : 243, "model" : "gpt-4o-2024-08-06", "output" : [ { "type" : "message", "id" : "msg_67e6322b18c08192a6352151edd8c9fa05ecd4d8d90549f9", "status" : "completed", "role" : "assistant", "content" : [ { "type" : "output_text", "text" : "Get current temperature for a given location." } ] } ], "parallel_tool_calls" : true, "previous_response_id" : "afsd", "reasoning" : { "effort" : "sdaf", "generate_summary" : "asf" }, "store" : true, "temperature" : 1, "text" : { "format" : { "type" : "text" } }, "tool_choice" : "auto", "top_p" : 1, "truncation" : "disabled", "usage" : { "input_tokens" : 43, "input_tokens_details" : { "cached_tokens" : 0 }, "output_tokens" : 68, "output_tokens_details" : { "reasoning_tokens" : 0 }, "total_tokens" : 111 }, "user" : "asdf" }
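  Once the integration is activated, the REST Adapter trigger exposes the /extended3 resource as an endpoint that external clients can call. The following is a minimal sketch of such a call in Python; the base URL, credentials, and authentication scheme are hypothetical placeholders that depend on how your Oracle Integration instance and integration are configured.

    import requests

    # Hypothetical endpoint URL; substitute the URL of your activated integration
    # that exposes the /extended3 resource.
    url = "https://<your-instance-host>/<integration-path>/extended3"

    # Two-role input array, matching the request body used later in this use case.
    payload = {
        "input": [
            {"role": "developer", "content": "Give information only about Boston"},
            {"role": "user", "content": "What is the zipcode of Beacon Hill, Boston?"}
        ]
    }

    # Basic authentication is shown only as a placeholder; use whatever security
    # policy your instance requires.
    response = requests.post(url, json=payload, auth=("user", "password"), timeout=60)
    response.raise_for_status()
    print(response.json())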
- Drag the OpenAI Adapter invoke connection into the integration canvas and configure it as follows:
- From the OpenAI LLM Models list, select the model to use (for this example, gpt-4o is selected).
- From the Request Type list, select Extended Prompt.
  - In the request mapper, map the source Input element to the target Input element.
  - In the response mapper, expand the source Response Wrapper element and target Response Wrapper element.
  - Perform the following mappings.
- Specify the business identifier and activate the integration.
The completed integration looks as follows:
- From the Actions menu, select Run.
  The Configure and run page appears.
- In the Body field of the Request section, enter the following text, then click Run.
  The body includes two roles (developer and user), each with their own content. The developer role takes precedence over the user role in the OpenAI hierarchy. For example, if you were to change the user role content from asking for the Boston zip code to asking for the zip code of a neighborhood in New York, the OpenAI model would not be able to answer the question.
    {
      "input": [
        {
          "role": "developer",
          "content": "Give information only about Boston"
        },
        {
          "role": "user",
          "content": "What is the zipcode of Beacon Hill, Boston?"
        }
      ]
    }
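  To make that precedence concrete, the following is a hypothetical variant of the request body, shown as the Python dictionary a client such as the sketch earlier in this section might send. Only the user content changes; because the developer role restricts the model to Boston, the model would not be able to answer the question. The neighborhood name is illustrative only, and the response shown next corresponds to the original Beacon Hill request, not to this variant.

    # Hypothetical variant: the developer instruction is unchanged, but the user
    # now asks about a New York neighborhood. Per the role precedence described
    # above, the model would not be able to answer the question.
    variant_payload = {
        "input": [
            {"role": "developer", "content": "Give information only about Boston"},
            {"role": "user", "content": "What is the zipcode of Greenwich Village, New York?"}
        ]
    }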
  The Body field of the Response section returns the following output. The zip code of Beacon Hill is returned.
    {
      "output" : [
        {
          "type" : "message",
          "id" : "msg_68477879290c819890e84a6f557f0b560cec1aa24c1b96c8",
          "status" : "completed",
          "role" : "assistant",
          "content" : [
            {
              "type" : "output_text",
              "text" : "Beacon Hill, Boston, is primarily covered by the ZIP code 02108."
            }
          ]
        }
      ]
    }
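  In this response structure, the assistant's text is nested inside the output array, under a message item's content list. The following is a minimal sketch of extracting that text in Python, assuming the response body has already been parsed into a dictionary; the sample used here is a trimmed copy of the response shown above.

    import json

    # Trimmed copy of the response body shown above.
    raw = '''
    {
      "output": [
        {
          "type": "message",
          "status": "completed",
          "role": "assistant",
          "content": [
            {
              "type": "output_text",
              "text": "Beacon Hill, Boston, is primarily covered by the ZIP code 02108."
            }
          ]
        }
      ]
    }
    '''
    body = json.loads(raw)

    # Collect the text of every output_text entry in every message.
    texts = [
        part["text"]
        for item in body.get("output", [])
        if item.get("type") == "message"
        for part in item.get("content", [])
        if part.get("type") == "output_text"
    ]
    print(texts[0])  # Beacon Hill, Boston, is primarily covered by the ZIP code 02108.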
- Expand the activity stream to view the flow of the messages sent and received.
  - Message received by the trigger connection:
  - Message sent by the invoke connection to the OpenAI model:
  - Message received by the invoke connection from the OpenAI model: