N/llm Module

Note:

The content in this help topic pertains to SuiteScript 2.1.

The N/llm module supports generative artificial intelligence (AI) capabilities in SuiteScript. You can use this module to send requests to the large language models (LLMs) supported by NetSuite and receive responses to use in your scripts.
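For orientation, here is a minimal sketch of a text-generation call. The `options.prompt` parameter name follows the conventional SuiteScript samples; to keep the helper testable outside NetSuite, the `llm` module object is passed in as a parameter rather than loaded directly.

```javascript
// Minimal sketch (not an official sample): send one prompt and return the text.
// In a real server script, load the module in the entry point:
//   define(['N/llm'], function (llm) { ... });
function summarizeMemo(llm, memoText) {
    // llm.generateText(options) sends the prompt and returns an llm.Response.
    var response = llm.generateText({
        prompt: 'Summarize the following memo in one sentence:\n' + memoText
    });
    // Response.text holds the text returned by the LLM.
    return response.text;
}
```

In an actual Suitelet or user event script, wrap the call in the usual `define(['N/llm'], ...)` entry point and handle the response as needed.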

For script samples, see N/llm Module Script Samples.

If you're new to using generative AI in SuiteScript, see SuiteScript 2.x Generative AI APIs. That topic contains essential information about this feature.

The following list summarizes the main features available through the N/llm module:

  • Send a prompt to a supported LLM and receive the response all at once or streamed (llm.generateText(options), llm.generateTextStreamed(options)).

  • Evaluate an existing prompt with values for its variables (llm.evaluatePrompt(options), llm.evaluatePromptStreamed(options)).

  • Provide chat history and source documents with a request, and receive citations that point back into those documents.

  • Generate embeddings for text inputs (llm.embed(options)).

  • Define tools the LLM can request, and send the tool results back to the LLM (llm.createTool(options), llm.createToolResult(options)).

  • Check the remaining free usage for the current month (llm.getRemainingFreeUsage(), llm.getRemainingFreeEmbedUsage()).

This module is available in NetSuite by default when the Server SuiteScript feature is enabled. For more information, see Enabling Features.

Important:

As you work with this module, keep the following considerations in mind:

  • Generative AI features, such as the N/llm module, use creativity in their responses. Make sure you validate the AI-generated responses for accuracy and quality. Oracle NetSuite isn't responsible or liable for the use or interpretation of AI-generated content.

  • SuiteScript Generative AI APIs (N/llm module) are available only for accounts located in certain regions. For a list of these regions, see Generative AI Feature Availability in NetSuite.


N/llm Module Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Object | llm.ChatMessage | Object | Server scripts | The chat message. |
| Object | llm.Citation | Object | Server scripts | A citation returned from the LLM. |
| Object | llm.Document | Object | Server scripts | A document to be used as source content when calling the LLM. |
| Object | llm.EmbedResponse | Object | Server scripts | The embeddings response returned from the LLM. |
| Object | llm.Response | Object | Server scripts | The response returned from the LLM. |
| Object | llm.StreamedResponse | Object | Server scripts | The streamed response returned from the LLM. |
| Object | llm.Tool | Object | Server scripts | A tool the LLM can request. |
| Object | llm.ToolCall | Object | Server scripts | A tool call request from the LLM. |
| Object | llm.ToolParameter | Object | Server scripts | A parameter for a tool. |
| Object | llm.ToolResult | Object | Server scripts | A tool result to send back to the LLM. |
| Method | llm.createChatMessage(options) | llm.ChatMessage | Server scripts | Creates a chat message based on a specified role and text. |
| Method | llm.createDocument(options) | llm.Document | Server scripts | Creates a document to be used as source content when calling the LLM. |
| Method | llm.createTool(options) | llm.Tool | Server scripts | Creates a tool the LLM can request. |
| Method | llm.createToolParameter(options) | llm.ToolParameter | Server scripts | Creates a tool parameter for a tool. |
| Method | llm.createToolResult(options) | llm.ToolResult | Server scripts | Creates a tool result to send back to the LLM. |
| Method | llm.embed(options) | llm.EmbedResponse | Server scripts | Returns the embeddings from the LLM for a given input. |
| Method | llm.embed.promise(options) | Promise | Server scripts | Asynchronously returns the embeddings from the LLM for a given input. |
| Method | llm.evaluatePrompt(options) | llm.Response | Server scripts | Takes the ID of an existing prompt and values for the variables used in the prompt, and returns the response from the LLM. In unlimited usage mode, this method also accepts OCI configuration parameters. |
| Method | llm.evaluatePrompt.promise(options) | Promise | Server scripts | Takes the ID of an existing prompt and values for the variables used in the prompt, and asynchronously returns the response from the LLM. In unlimited usage mode, this method also accepts OCI configuration parameters. |
| Method | llm.evaluatePromptStreamed(options) | llm.StreamedResponse | Server scripts | Takes the ID of an existing prompt and values for the variables used in the prompt, and returns the streamed response from the LLM. In unlimited usage mode, this method also accepts OCI configuration parameters. |
| Method | llm.evaluatePromptStreamed.promise(options) | Promise | Server scripts | Takes the ID of an existing prompt and values for the variables used in the prompt, and asynchronously returns the streamed response from the LLM. In unlimited usage mode, this method also accepts OCI configuration parameters. |
| Method | llm.generateText(options) | llm.Response | Server scripts | Takes a prompt and parameters for the LLM and returns the response from the LLM. In unlimited usage mode, this method also accepts OCI configuration parameters. |
| Method | llm.generateText.promise(options) | Promise | Server scripts | Takes a prompt and parameters for the LLM and asynchronously returns the response from the LLM. In unlimited usage mode, this method also accepts OCI configuration parameters. |
| Method | llm.generateTextStreamed(options) | llm.StreamedResponse | Server scripts | Takes a prompt and parameters for the LLM and returns the streamed response from the LLM. In unlimited usage mode, this method also accepts OCI configuration parameters. |
| Method | llm.generateTextStreamed.promise(options) | Promise | Server scripts | Takes a prompt and parameters for the LLM and asynchronously returns the streamed response from the LLM. In unlimited usage mode, this method also accepts OCI configuration parameters. |
| Method | llm.getRemainingFreeUsage() | number | Server scripts | Returns the number of remaining free requests in the current month. |
| Method | llm.getRemainingFreeUsage.promise() | Promise | Server scripts | Asynchronously returns the number of remaining free requests in the current month. |
| Method | llm.getRemainingFreeEmbedUsage() | number | Server scripts | Returns the number of remaining free embeddings requests in the current month. |
| Method | llm.getRemainingFreeEmbedUsage.promise() | Promise | Server scripts | Asynchronously returns the number of remaining free embeddings requests in the current month. |
| Enum | llm.ChatRole | enum | Server scripts | Holds the string values for the author (role) of a chat message. Use this enum to set the value of the options.role parameter in llm.createChatMessage(options). |
| Enum | llm.EmbedModelFamily | enum | Server scripts | Holds the string values for the large language model used to generate embeddings. Use this enum to set the value of the options.embedModelFamily parameter in llm.embed(options). |
| Enum | llm.ModelFamily | enum | Server scripts | Holds the string values for the large language model to be used. Use this enum to set the value of the options.model parameter in llm.generateText(options). |
| Enum | llm.SafetyMode | enum | Server scripts | Holds the string values for the safety mode to be used for LLM requests. Use this enum to set the value of the options.safetyMode parameter in llm.generateText(options) and llm.generateTextStreamed(options). |
| Enum | llm.ToolParameterType | enum | Server scripts | Holds the data type of a tool parameter. Use this enum to set the value of the options.type parameter in llm.createToolParameter(options). |
| Enum | llm.Truncate | enum | Server scripts | Holds the string values for the truncation method to use when the embeddings input exceeds 512 tokens. Use this enum to set the value of the options.truncate parameter in llm.embed(options). |

ChatMessage Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | ChatMessage.role | string | Server scripts | The author (role) of the chat message. |
| Property | ChatMessage.text | string | Server scripts | Text of the chat message. This text can be either the prompt sent by the script or the response returned by the LLM. |
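Chat messages let a script carry earlier conversation turns into a new request. The sketch below assumes the `llm.ChatRole.USER` and `llm.ChatRole.CHATBOT` enum members and a `chatHistory` option on `llm.generateText(options)`; verify these names against your account's API reference. The `llm` module is injected for testability.

```javascript
// Sketch: include prior turns so the model sees the conversation so far.
// In a script, `llm` comes from define(['N/llm'], ...).
function askFollowUp(llm, priorQuestion, priorAnswer, followUp) {
    var history = [
        llm.createChatMessage({ role: llm.ChatRole.USER, text: priorQuestion }),
        llm.createChatMessage({ role: llm.ChatRole.CHATBOT, text: priorAnswer })
    ];
    var response = llm.generateText({
        prompt: followUp,
        chatHistory: history   // assumed option name
    });
    return response.text;
}
```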

Citation Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Citation.documentIds | string[] | Server scripts | The IDs of the documents where the cited text is located. |
| Property | Citation.end | number | Server scripts | The ending position of the cited text. |
| Property | Citation.start | number | Server scripts | The starting position of the cited text. |
| Property | Citation.text | string | Server scripts | The cited text from the documents. |

Document Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Document.data | string | Server scripts | The content of the document. |
| Property | Document.id | string | Server scripts | The ID of the document. |
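Documents supplied with a request let the model ground its answer in your content and return citations that point back into it. In the sketch below, the `documents` option name on `llm.generateText(options)` is an assumption; the `Document` and `Citation` properties match the member tables above.

```javascript
// Sketch: ground a request in source documents and collect cited document IDs.
function answerFromDocuments(llm, question, sources) {
    // Each source becomes an llm.Document with an ID and its text content.
    var docs = sources.map(function (source) {
        return llm.createDocument({ id: source.id, data: source.data });
    });
    var response = llm.generateText({
        prompt: question,
        documents: docs   // assumed option name
    });
    // Citation.documentIds identifies which supplied documents were cited.
    var citedIds = [];
    response.citations.forEach(function (citation) {
        citation.documentIds.forEach(function (id) {
            if (citedIds.indexOf(id) === -1) { citedIds.push(id); }
        });
    });
    return { text: response.text, citedIds: citedIds };
}
```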

EmbedResponse Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | EmbedResponse.embeddings | number[] | Server scripts | The embeddings returned from the LLM. |
| Property | EmbedResponse.inputs | string[] | Server scripts | The list of inputs used to generate the embeddings response. |
| Property | EmbedResponse.model | string | Server scripts | The model used to generate the embeddings response. |
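Embeddings turn text into numeric vectors for similarity search and related tasks. The sketch below assumes an `inputs` option on `llm.embed(options)` (the member table shows the response echoing an `inputs` list); the optional `embedModelFamily` and `truncate` enums are omitted to stay with the defaults.

```javascript
// Sketch: request embeddings for a batch of short texts.
function embedTexts(llm, texts) {
    var response = llm.embed({
        inputs: texts   // assumed option name; EmbedResponse.inputs echoes it
    });
    // EmbedResponse.embeddings holds the numeric vectors; EmbedResponse.model
    // names the embeddings model that produced them.
    return { vectors: response.embeddings, model: response.model };
}
```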

Response Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Response.chatHistory | llm.ChatMessage[] | Server scripts | List of chat messages. |
| Property | Response.citations | llm.Citation[] | Server scripts | List of citations used to generate the response. |
| Property | Response.documents | llm.Document[] | Server scripts | List of documents used to generate the response. |
| Property | Response.model | string | Server scripts | Model used to produce the LLM response. |
| Property | Response.text | string | Server scripts | Text returned by the LLM. |
| Property | Response.toolCalls | llm.ToolCall[] | Server scripts | Tool calls requested by the LLM. |
| Property | Response.usage | llm.Usage | Server scripts | Token usage for a request to the LLM. |

StreamedResponse Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | StreamedResponse.chatHistory | llm.ChatMessage[] | Server scripts | List of chat messages. |
| Property | StreamedResponse.citations | llm.Citation[] | Server scripts | List of citations used to generate the streamed response. |
| Property | StreamedResponse.documents | llm.Document[] | Server scripts | List of documents used to generate the streamed response. |
| Property | StreamedResponse.model | string | Server scripts | Model used to produce the streamed response. |
| Property | StreamedResponse.text | string | Server scripts | Text returned by the LLM. |
| Property | StreamedResponse.toolCalls | llm.ToolCall[] | Server scripts | Tool calls requested by the LLM. |

Tool Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Tool.description | string | Server scripts | The description of the tool. |
| Property | Tool.name | string | Server scripts | The name of the tool. |
| Property | Tool.parameters | llm.ToolParameter[] | Server scripts | The parameters of the tool. |

ToolCall Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | ToolCall.name | string | Server scripts | The name of the requested tool. |
| Property | ToolCall.parameters | llm.ToolParameter[] | Server scripts | The parameters of the requested tool. |

ToolParameter Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | ToolParameter.description | string | Server scripts | The description of the tool parameter. |
| Property | ToolParameter.name | string | Server scripts | The name of the tool parameter. |
| Property | ToolParameter.type | string | Server scripts | The type of the tool parameter. |

ToolResult Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | ToolResult.call | llm.ToolCall | Server scripts | The originating tool call request from the LLM. |
| Property | ToolResult.outputs | Object[] | Server scripts | The outputs from running the tool specified in the tool call request. |
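The tool objects above combine into a round trip: declare a tool, let the model request it, run it in your script, and send the result back. The sketch below is a hypothetical flow; the `tools` and `toolResults` option names on `llm.generateText(options)` are assumptions, and `llm.ToolParameterType.STRING` is an illustrative enum member. The `getExchangeRate` tool itself is invented for the example.

```javascript
// Sketch of a hypothetical tool-calling round trip.
function runWithTool(llm, prompt, executeTool) {
    var tool = llm.createTool({
        name: 'getExchangeRate',
        description: 'Returns the exchange rate for a currency pair.',
        parameters: [
            llm.createToolParameter({
                name: 'currencyPair',
                description: 'The pair to look up, for example USD/EUR.',
                type: llm.ToolParameterType.STRING // illustrative enum member
            })
        ]
    });
    var response = llm.generateText({ prompt: prompt, tools: [tool] });
    if (response.toolCalls && response.toolCalls.length > 0) {
        // Run each requested tool locally and wrap the output in a ToolResult.
        var results = response.toolCalls.map(function (call) {
            return llm.createToolResult({
                call: call,
                outputs: [executeTool(call)]
            });
        });
        // Send the results back so the model can finish its answer.
        response = llm.generateText({ prompt: prompt, toolResults: results });
    }
    return response.text;
}
```

A production version would validate the requested tool name and parameters before executing anything on the model's behalf.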

Usage Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Usage.completionTokens | number | Server scripts | The number of tokens in the response from the LLM. |
| Property | Usage.promptTokens | number | Server scripts | The number of tokens in the request to the LLM. |
| Property | Usage.totalTokens | number | Server scripts | The total number of tokens for the entire request to the LLM. |
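Together with `llm.getRemainingFreeUsage()`, the `Usage` counters make it straightforward to track consumption per request. In this sketch the property names come from the member tables above, and the logging callback stands in for the N/log module.

```javascript
// Sketch: log token counts and remaining free requests after a call.
function generateAndLogUsage(llm, prompt, log) {
    var response = llm.generateText({ prompt: prompt });
    var usage = response.usage; // llm.Usage, per the Response member table
    log('prompt tokens: ' + usage.promptTokens +
        ', completion tokens: ' + usage.completionTokens +
        ', total: ' + usage.totalTokens +
        ', free requests left this month: ' + llm.getRemainingFreeUsage());
    return response.text;
}
```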

Note:

To learn more about the generative AI models, see the SuiteAnswers article LLM Mapping for Generative AI Features. Be aware that you must be logged in to NetSuite to access articles in SuiteAnswers.

Related Topics

General Notices