ChatChoice

class oci.generative_ai_inference.models.ChatChoice(**kwargs)

Bases: object

Represents a single instance of the chat response.
Methods

__init__(**kwargs)
    Initializes a new ChatChoice object with values from keyword arguments.

Attributes

finish_reason
    [Required] Gets the finish_reason of this ChatChoice.
grounding_metadata
    Gets the grounding_metadata of this ChatChoice.
index
    [Required] Gets the index of this ChatChoice.
logprobs
    Gets the logprobs of this ChatChoice.
message
    [Required] Gets the message of this ChatChoice.
usage
    Gets the usage of this ChatChoice.
__init__(**kwargs)

Initializes a new ChatChoice object with values from keyword arguments. The following keyword arguments are supported (corresponding to the getters/setters of this class):

Parameters:
- index (int) – The value to assign to the index property of this ChatChoice.
- message (oci.generative_ai_inference.models.Message) – The value to assign to the message property of this ChatChoice.
- finish_reason (str) – The value to assign to the finish_reason property of this ChatChoice.
- logprobs (oci.generative_ai_inference.models.Logprobs) – The value to assign to the logprobs property of this ChatChoice.
- usage (oci.generative_ai_inference.models.Usage) – The value to assign to the usage property of this ChatChoice.
- grounding_metadata (oci.generative_ai_inference.models.GroundingMetadata) – The value to assign to the grounding_metadata property of this ChatChoice.
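For example, a minimal sketch constructing a ChatChoice directly from these keyword arguments. In practice the service populates these objects for you; the role string and finish_reason value below are illustrative assumptions, not required values:

    from oci.generative_ai_inference.models import ChatChoice, Message

    # Placeholder assistant message; the service normally fills in the
    # role and content of the generated reply.
    reply = Message(role="ASSISTANT")

    choice = ChatChoice(
        index=0,               # position of this choice in the response
        message=reply,         # the generated message for this choice
        finish_reason="stop",  # illustrative value; see finish_reason below
    )

    print(choice.index, choice.finish_reason)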
finish_reason

[Required] Gets the finish_reason of this ChatChoice. The reason why the model stopped generating tokens: the model stops when it hits a natural stop point or a provided stop sequence, or reports a length-based finish when the output reaches the specified maximum number of tokens.

Returns:
    The finish_reason of this ChatChoice.
Return type:
    str
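A small sketch of checking this field after a chat call; the literal values ("stop" for a natural stop or stop sequence, "length" for hitting the token limit) are assumptions inferred from the description above, not an exhaustive list:

    def was_truncated(choice):
        # Assumed values: "length" when the maximum token count was reached,
        # "stop" for a natural stop point or a provided stop sequence.
        return choice.finish_reason == "length"

    # Usage: warn if the model appears to have run out of tokens mid-answer,
    # e.g. to decide whether to retry with a larger token limit.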
grounding_metadata

Gets the grounding_metadata of this ChatChoice.

Returns:
    The grounding_metadata of this ChatChoice.
Return type:
    oci.generative_ai_inference.models.GroundingMetadata
index

[Required] Gets the index of this ChatChoice. The index of this chat choice in the response.

Returns:
    The index of this ChatChoice.
Return type:
    int
logprobs

Gets the logprobs of this ChatChoice.

Returns:
    The logprobs of this ChatChoice.
Return type:
    oci.generative_ai_inference.models.Logprobs
message

[Required] Gets the message of this ChatChoice.

Returns:
    The message of this ChatChoice.
Return type:
    oci.generative_ai_inference.models.Message
usage

Gets the usage of this ChatChoice.

Returns:
    The usage of this ChatChoice.
Return type:
    oci.generative_ai_inference.models.Usage