Enum LlmInferenceRequest.RuntimeType
- java.lang.Object
  - java.lang.Enum<LlmInferenceRequest.RuntimeType>
    - com.oracle.bmc.generativeaiinference.model.LlmInferenceRequest.RuntimeType
- All Implemented Interfaces:
  BmcEnum, Serializable, Comparable<LlmInferenceRequest.RuntimeType>
- Enclosing class:
  LlmInferenceRequest
public static enum LlmInferenceRequest.RuntimeType extends Enum<LlmInferenceRequest.RuntimeType> implements BmcEnum
The runtime of the provided model.
Method Summary
- static LlmInferenceRequest.RuntimeType create(String key)
- String getValue()
- static LlmInferenceRequest.RuntimeType valueOf(String name)
  Returns the enum constant of this type with the specified name.
- static LlmInferenceRequest.RuntimeType[] values()
  Returns an array containing the constants of this enum type, in the order they are declared.
Enum Constant Detail
Cohere
public static final LlmInferenceRequest.RuntimeType Cohere
Llama
public static final LlmInferenceRequest.RuntimeType Llama
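The two constants above are the only runtimes documented on this page, so callers can branch on them directly. The helper below is a hedged sketch; the method name and the default-branch handling are illustrative, not part of the SDK:

    static String describeRuntime(LlmInferenceRequest.RuntimeType runtimeType) {
        // Switch over the documented constants; enum switches use the unqualified names.
        switch (runtimeType) {
            case Cohere:
                return "Cohere runtime";
            case Llama:
                return "Llama runtime";
            default:
                // Defensive fallback in case additional constants are introduced later.
                return "Unrecognized runtime: " + runtimeType;
        }
    }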
Method Detail
values
public static LlmInferenceRequest.RuntimeType[] values()
Returns an array containing the constants of this enum type, in the order they are declared. This method may be used to iterate over the constants as follows:

    for (LlmInferenceRequest.RuntimeType c : LlmInferenceRequest.RuntimeType.values())
        System.out.println(c);
- Returns:
- an array containing the constants of this enum type, in the order they are declared
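Beyond simple iteration, the returned array is convenient for building a lookup table keyed by each constant's serialized value. A minimal sketch, assuming getValue() yields a distinct key per constant:

    // Builds a lookup table from the serialized key (getValue()) to the constant.
    static java.util.Map<String, LlmInferenceRequest.RuntimeType> byKey() {
        java.util.Map<String, LlmInferenceRequest.RuntimeType> map = new java.util.HashMap<>();
        for (LlmInferenceRequest.RuntimeType c : LlmInferenceRequest.RuntimeType.values()) {
            map.put(c.getValue(), c);
        }
        return map;
    }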
valueOf
public static LlmInferenceRequest.RuntimeType valueOf(String name)
Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are not permitted.)
- Parameters:
  name - the name of the enum constant to be returned.
- Returns:
  the enum constant with the specified name
- Throws:
  IllegalArgumentException - if this enum type has no constant with the specified name
  NullPointerException - if the argument is null
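Because the lookup is by the exact Java identifier ("Cohere" or "Llama") and the documented exceptions are unchecked, callers parsing untrusted input typically wrap the call. A hedged sketch; the helper name and rethrown exception type are illustrative:

    static LlmInferenceRequest.RuntimeType parseStrict(String name) {
        try {
            // The name must match the constant identifier exactly; whitespace or case differences throw.
            return LlmInferenceRequest.RuntimeType.valueOf(name);
        } catch (IllegalArgumentException | NullPointerException e) {
            throw new IllegalStateException("Unsupported runtime type: " + name, e);
        }
    }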
create
public static LlmInferenceRequest.RuntimeType create(String key)
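No description is provided for create(String key) on this page. In the OCI Java SDK, enums implementing BmcEnum generally pair create(String) with getValue() so that the serialized service value, rather than the Java constant name, can be resolved back to a constant. The round trip below is a sketch under that assumption; the behavior for unrecognized keys is not documented here:

    // Round-trip through the serialized key rather than the constant name.
    String key = LlmInferenceRequest.RuntimeType.Cohere.getValue();
    LlmInferenceRequest.RuntimeType restored = LlmInferenceRequest.RuntimeType.create(key);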