ChatChoice

class oci.generative_ai_inference.models.ChatChoice(**kwargs)

Bases: object

Represents a single instance of the chat response.
Methods

__init__(**kwargs)
    Initializes a new ChatChoice object with values from keyword arguments.

Attributes

finish_reason
    [Required] Gets the finish_reason of this ChatChoice.
index
    [Required] Gets the index of this ChatChoice.
logprobs
    Gets the logprobs of this ChatChoice.
message
    [Required] Gets the message of this ChatChoice.
__init__(**kwargs)

Initializes a new ChatChoice object with values from keyword arguments. The following keyword arguments are supported (corresponding to the getters/setters of this class):

Parameters:
- index (int) – The value to assign to the index property of this ChatChoice.
- message (oci.generative_ai_inference.models.Message) – The value to assign to the message property of this ChatChoice.
- finish_reason (str) – The value to assign to the finish_reason property of this ChatChoice.
- logprobs (oci.generative_ai_inference.models.Logprobs) – The value to assign to the logprobs property of this ChatChoice.
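For illustration, a minimal sketch of constructing a ChatChoice with these keyword arguments. In normal use the SDK populates this model for you when it deserializes a chat response; the "stop" finish reason below is an assumed example value.

    from oci.generative_ai_inference.models import ChatChoice, Message

    # Minimal sketch: build a ChatChoice by hand. Keyword arguments are
    # optional at construction time; unset properties default to None.
    choice = ChatChoice(
        index=0,               # position of this choice in the response
        message=Message(),     # generated message (normally filled in by the SDK)
        finish_reason="stop",  # assumed example value; see finish_reason below
    )
    print(choice.index, choice.finish_reason)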
finish_reason

[Required] Gets the finish_reason of this ChatChoice. The reason why the model stopped generating tokens.

It stops if the model hits a natural stop point or a provided stop sequence, and returns length if the tokens reach the specified maximum number of tokens.

Returns: The finish_reason of this ChatChoice.
Return type: str
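A hedged usage sketch: checking the finish reason on each choice of a chat response. This assumes a deserialized response object that exposes a choices list of ChatChoice instances; the exact response shape depends on the API format used.

    # Sketch: report why generation stopped for each returned choice.
    # `chat_response` is assumed to be a deserialized chat response whose
    # `choices` attribute is a list of ChatChoice objects.
    for choice in chat_response.choices:
        if choice.finish_reason == "length":
            print(f"choice {choice.index}: output hit the maximum token limit")
        else:
            print(f"choice {choice.index}: finished ({choice.finish_reason})")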
index

[Required] Gets the index of this ChatChoice. The index of the chat.

Returns: The index of this ChatChoice.
Return type: int
logprobs

Gets the logprobs of this ChatChoice.

Returns: The logprobs of this ChatChoice.
Return type: oci.generative_ai_inference.models.Logprobs
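Because logprobs is not a required property, a short sketch of guarding for its absence before use (assuming the same choice object as above):

    # logprobs may be None when token log probabilities were not requested.
    if choice.logprobs is not None:
        print(choice.logprobs)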
message

[Required] Gets the message of this ChatChoice.

Returns: The message of this ChatChoice.
Return type: oci.generative_ai_inference.models.Message