Usage¶
class oci.generative_ai_inference.models.Usage(**kwargs)¶
Bases: object

Usage statistics for the completion request.

Methods

__init__(**kwargs)
    Initializes a new Usage object with values from keyword arguments.

Attributes

completion_tokens
    Gets the completion_tokens of this Usage.

completion_tokens_details
    Gets the completion_tokens_details of this Usage.

prompt_tokens
    Gets the prompt_tokens of this Usage.

prompt_tokens_details
    Gets the prompt_tokens_details of this Usage.

total_tokens
    Gets the total_tokens of this Usage.
__init__(**kwargs)¶
Initializes a new Usage object with values from keyword arguments. The following keyword arguments are supported (corresponding to the getters/setters of this class):

Parameters:
    completion_tokens (int) – The value to assign to the completion_tokens property of this Usage.
    prompt_tokens (int) – The value to assign to the prompt_tokens property of this Usage.
    total_tokens (int) – The value to assign to the total_tokens property of this Usage.
    completion_tokens_details (oci.generative_ai_inference.models.CompletionTokensDetails) – The value to assign to the completion_tokens_details property of this Usage.
    prompt_tokens_details (oci.generative_ai_inference.models.PromptTokensDetails) – The value to assign to the prompt_tokens_details property of this Usage.
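For illustration, a minimal sketch of constructing a Usage object directly from these keyword arguments. In normal use the SDK populates this model from the service response; the token counts below are made up.

    from oci.generative_ai_inference.models import Usage

    # Hypothetical token counts, for illustration only; the SDK normally
    # builds this model from the service response rather than manual kwargs.
    usage = Usage(
        prompt_tokens=42,
        completion_tokens=128,
        total_tokens=170,
    )

    print(usage.prompt_tokens)      # 42
    print(usage.completion_tokens)  # 128
    print(usage.total_tokens)       # 170
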
completion_tokens¶
Gets the completion_tokens of this Usage. Number of tokens in the generated completion.

Returns:
    The completion_tokens of this Usage.
Return type:
    int

completion_tokens_details¶
Gets the completion_tokens_details of this Usage.

Returns:
    The completion_tokens_details of this Usage.
Return type:
    oci.generative_ai_inference.models.CompletionTokensDetails

prompt_tokens¶
Gets the prompt_tokens of this Usage. Number of tokens in the prompt.

Returns:
    The prompt_tokens of this Usage.
Return type:
    int

prompt_tokens_details¶
Gets the prompt_tokens_details of this Usage.

Returns:
    The prompt_tokens_details of this Usage.
Return type:
    oci.generative_ai_inference.models.PromptTokensDetails

total_tokens¶
Gets the total_tokens of this Usage. Total number of tokens used in the request (prompt + completion).

Returns:
    The total_tokens of this Usage.
Return type:
    int
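
As a usage note, a small sketch of reading these attributes from a populated Usage instance, for example one taken from a chat or completion response. The helper name is hypothetical, and how the Usage instance is obtained from a response is an assumption that can vary by API format and SDK version.

    # Hypothetical helper; assumes `usage` is a populated Usage instance.
    def log_token_usage(usage):
        print(f"prompt tokens:     {usage.prompt_tokens}")
        print(f"completion tokens: {usage.completion_tokens}")
        print(f"total tokens:      {usage.total_tokens}")
        # Per the attribute docs above, total_tokens is prompt + completion.
        assert usage.total_tokens == usage.prompt_tokens + usage.completion_tokens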