Passing JSON Parameters

To send JSON parameters in GET and POST requests:

  • Define the JSON parameters, such as path, action, and so on, as shown below:

Parameter Name   Definition
path             Specifies the URI published by the LLM provider.
action           The HTTP method; POST and GET are supported.
requestPayload   The input payload published by the LLM provider must be enclosed within this parameter.
prompt           The placeholder (<prompt>) that is substituted with the dynamic text sent by the client (Siebel, Postman, and so on).
responseField    The name of the response text parameter, which can differ across provider and model combinations. For example, for Cohere the response field is "text", and for OpenAI it is "content". It is used to display the response in the UI.
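The parameters above can be combined by the client before dispatching the request: read the configuration file, substitute the `<prompt>` placeholder anywhere it appears in `requestPayload`, and use `path` and `action` to issue the call. A minimal sketch of that substitution step is shown below; the `build_request` helper name is hypothetical, not part of the product.

```python
import copy


def build_request(config: dict, prompt: str):
    """Substitute the client's prompt into the configured payload.

    `config` follows the parameter layout described above: `path`,
    `action`, `requestPayload` (containing a `<prompt>` placeholder),
    and `responseField`. The recursive walk replaces every occurrence
    of the `<prompt>` token, wherever the provider's payload nests it.
    This is an illustrative sketch, not the product implementation.
    """
    def substitute(node):
        if isinstance(node, dict):
            return {key: substitute(value) for key, value in node.items()}
        if isinstance(node, list):
            return [substitute(item) for item in node]
        if isinstance(node, str):
            return node.replace("<prompt>", prompt)
        return node

    payload = substitute(copy.deepcopy(config["requestPayload"]))
    return config["path"], config["action"], payload


# Example with a simplified Cohere-style configuration:
config = {
    "path": "https://example.invalid/chat",  # placeholder URI
    "action": "POST",
    "responseField": "text",
    "requestPayload": {"chatRequest": {"message": "<prompt>"}},
}
path, action, payload = build_request(config, "Summarize this account.")
```

The recursive walk matters because, as the examples below show, different providers nest the prompt at different depths of `requestPayload`.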

  • The following example shows an outbound call in JSON format for OCI Cohere Command R:
JSON File Name: oci_cohere.command-r-08-2024
JSON File Contents:
{
    "path": "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com/20231130/actions/chat",
    "requestPayload": {
        "chatRequest": {
            "apiFormat": "COHERE",
            "message": "<prompt>"
        },
        "compartmentId": "<compartmentId>",
        "servingMode": {
            "modelId": "cohere.command-r-08-2024",
            "servingType": "ON_DEMAND"
        }
    },
    "action": "POST",
    "responseField": "text"
}
  • The following example shows an outbound call in JSON format for a sample model provider:
JSON File Name: google_gemini-1.5-flash-001.json
JSON File Contents:
{
    "path": "https://us-central1-aiplatform.googleapis.com/v1/projects/suryageminiproject1/locations/us-central1/publishers/google/models/gemini-1.5-flash-001:generateContent",
    "action": "POST",
    "responseField": "text",
    "requestPayload": {
        "contents": [
            {
                "role": "user",
                "parts": [
                    {
                        "text": "<prompt>"
                    }
                ]
            }
        ]
    }
}