LlamaLlmInferenceResponse

class oci.generative_ai_inference.models.LlamaLlmInferenceResponse(**kwargs)

Bases: oci.generative_ai_inference.models.llm_inference_response.LlmInferenceResponse

The generated text result to return.

Attributes

RUNTIME_TYPE_COHERE str(object='') -> str
RUNTIME_TYPE_DALLE3 str(object='') -> str
RUNTIME_TYPE_LLAMA str(object='') -> str
RUNTIME_TYPE_OPENAI str(object='') -> str
choices [Required] Gets the choices of this LlamaLlmInferenceResponse.
created [Required] Gets the created of this LlamaLlmInferenceResponse.
runtime_type [Required] Gets the runtime_type of this LlmInferenceResponse.

Methods

__init__(**kwargs) Initializes a new LlamaLlmInferenceResponse object with values from keyword arguments.
get_subtype(object_dictionary) Given the hash representation of a subtype of this class, use the info in the hash to return the class of the subtype.
RUNTIME_TYPE_COHERE = 'COHERE'
RUNTIME_TYPE_DALLE3 = 'DALLE3'
RUNTIME_TYPE_LLAMA = 'LLAMA'
RUNTIME_TYPE_OPENAI = 'OPENAI'
__init__(**kwargs)

Initializes a new LlamaLlmInferenceResponse object with values from keyword arguments. The default value of the runtime_type attribute of this class is LLAMA and should not be changed. The following keyword arguments are supported (corresponding to the getters/setters of this class); a construction sketch follows the parameter list:

Parameters:
  • runtime_type (str) – The value to assign to the runtime_type property of this LlamaLlmInferenceResponse. Allowed values for this property are: "COHERE", "LLAMA", "OPENAI", "DALLE3".
  • created (datetime) – The value to assign to the created property of this LlamaLlmInferenceResponse.
  • choices (list[oci.generative_ai_inference.models.Choice]) – The value to assign to the choices property of this LlamaLlmInferenceResponse.
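A minimal construction sketch, for illustration only; in normal use the SDK builds this object when it deserializes a service response. The Choice keyword arguments shown (index, text, finish_reason) are assumptions about that model's fields, not taken from this page:

    from datetime import datetime, timezone

    from oci.generative_ai_inference.models import Choice, LlamaLlmInferenceResponse

    # Construct the response directly; runtime_type is set to "LLAMA" by
    # __init__ and should not be changed.
    response = LlamaLlmInferenceResponse(
        created=datetime.now(timezone.utc),
        choices=[
            # index / text / finish_reason are assumed Choice fields,
            # used here purely for illustration.
            Choice(index=0, text="Hello from Llama.", finish_reason="stop"),
        ],
    )

    print(response.runtime_type)  # LLAMA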
choices

[Required] Gets the choices of this LlamaLlmInferenceResponse. A list of generated texts; it can contain more than one entry if n is greater than 1.

Returns: The choices of this LlamaLlmInferenceResponse.
Return type: list[oci.generative_ai_inference.models.Choice]
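A short usage sketch, assuming the response was obtained through GenerativeAiInferenceClient.generate_text and that each Choice exposes index and text attributes (an illustrative assumption, as noted above):

    def collect_texts(llama_response):
        # llama_response: a LlamaLlmInferenceResponse, e.g.
        # client.generate_text(details).data.inference_response
        # Choice.index / Choice.text are assumed fields for illustration.
        return [(choice.index, choice.text) for choice in llama_response.choices]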
created

[Required] Gets the created of this LlamaLlmInferenceResponse. The Unix timestamp (in seconds) of when the generation was created.

Returns: The created of this LlamaLlmInferenceResponse.
Return type: datetime
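The property is exposed as a Python datetime even though the service reports a Unix timestamp; a small sketch of recovering the epoch seconds, assuming llama_response is an instance of this model:

    def creation_epoch_seconds(llama_response):
        # .created is a datetime; .timestamp() recovers the Unix seconds
        # (the unit the service reports).
        return int(llama_response.created.timestamp())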
static get_subtype(object_dictionary)

Given the hash representation of a subtype of this class, use the info in the hash to return the class of the subtype.
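A hedged sketch of what the dispatch typically looks like; the discriminator key name ("runtimeType") and the string return value are assumptions based on the docstring and the SDK's usual pattern for polymorphic models, not confirmed by this page:

    from oci.generative_ai_inference.models import LlamaLlmInferenceResponse

    # A wire-format dictionary for a Llama-runtime response (illustrative only).
    payload = {"runtimeType": "LLAMA", "created": "2024-01-01T00:00:00Z", "choices": []}

    # get_subtype inspects the discriminator and reports which model class to
    # instantiate; for this subtype the expected result is the class's own name.
    print(LlamaLlmInferenceResponse.get_subtype(payload))  # 'LlamaLlmInferenceResponse'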

runtime_type

[Required] Gets the runtime_type of this LlmInferenceResponse. The runtime of the provided model.

Allowed values for this property are: "COHERE", "LLAMA", "OPENAI", "DALLE3", "UNKNOWN_ENUM_VALUE". Any unrecognized values returned by a service will be mapped to "UNKNOWN_ENUM_VALUE".

Returns: The runtime_type of this LlmInferenceResponse.
Return type: str
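A defensive-reading sketch, assuming llama_response is a deserialized response; Choice.text is the same illustrative assumption used earlier:

    def extract_texts(llama_response):
        # Values the installed SDK version does not recognize are surfaced as
        # "UNKNOWN_ENUM_VALUE" rather than raising, so check before reading.
        if llama_response.runtime_type == "UNKNOWN_ENUM_VALUE":
            raise ValueError("Service returned a runtime type this SDK version does not know.")
        if llama_response.runtime_type == "LLAMA":
            return [choice.text for choice in llama_response.choices]
        return []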