Class EmbedTextDetails
Details for the request to embed texts.
Namespace: Oci.GenerativeaiinferenceService.Models
Assembly: OCI.DotNetSDK.Generativeaiinference.dll
Syntax
public class EmbedTextDetails
Properties
CompartmentId
Declaration
[Required(ErrorMessage = "CompartmentId is required.")]
[JsonProperty(PropertyName = "compartmentId")]
public string CompartmentId { get; set; }
Property Value
Type | Description |
---|---|
string | The OCID of the compartment in which to call the Generative AI service to create text embeddings. |
Remarks
Required
InputType
Declaration
[JsonProperty(PropertyName = "inputType")]
[JsonConverter(typeof(StringEnumConverter))]
public EmbedTextDetails.InputTypeEnum? InputType { get; set; }
Property Value
Type | Description |
---|---|
EmbedTextDetails.InputTypeEnum? | Specifies the input type. |
Inputs
Declaration
[Required(ErrorMessage = "Inputs is required.")]
[JsonProperty(PropertyName = "inputs")]
public List<string> Inputs { get; set; }
Property Value
Type | Description |
---|---|
List<string> | The list of input strings to embed. Each string can be a word, a phrase, or a paragraph; the maximum length of each entry in the list is 512 tokens. |
Remarks
Required
IsEcho
Declaration
[JsonProperty(PropertyName = "isEcho")]
public bool? IsEcho { get; set; }
Property Value
Type | Description |
---|---|
bool? | Whether to include the original inputs in the response. Results are index-based. |
ServingMode
Declaration
[Required(ErrorMessage = "ServingMode is required.")]
[JsonProperty(PropertyName = "servingMode")]
public ServingMode ServingMode { get; set; }
Property Value
Type | Description |
---|---|
ServingMode | The model's serving mode, which is either on-demand serving or dedicated serving. |
Remarks
Required
Truncate
Declaration
[JsonProperty(PropertyName = "truncate")]
[JsonConverter(typeof(StringEnumConverter))]
public EmbedTextDetails.TruncateEnum? Truncate { get; set; }
Property Value
Type | Description |
---|---|
EmbedTextDetails.TruncateEnum? | For an input that's longer than the maximum token length, specifies which part of the input text will be truncated. |
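Putting the properties above together, a request object can be built as in the following sketch. The compartment OCID and model ID are placeholders, and `OnDemandServingMode` is assumed here as the concrete `ServingMode` subtype for on-demand models; substitute your own values and serving configuration.

```csharp
using System.Collections.Generic;
using Oci.GenerativeaiinferenceService.Models;

// Sketch: build an EmbedTextDetails request with the required properties
// (CompartmentId, Inputs, ServingMode) plus the optional Truncate and IsEcho.
var details = new EmbedTextDetails
{
    // Placeholder OCID; use the compartment in which to call the service.
    CompartmentId = "ocid1.compartment.oc1..exampleuniqueID",

    // Each entry may be a word, a phrase, or a paragraph (max 512 tokens each).
    Inputs = new List<string> { "Hello world", "Generative AI embeddings" },

    // Assumed on-demand serving with a placeholder embedding model ID.
    ServingMode = new OnDemandServingMode { ModelId = "cohere.embed-english-v3.0" },

    // Truncate the end of any input longer than the maximum token length.
    Truncate = EmbedTextDetails.TruncateEnum.End,

    // Do not echo the original inputs back in the response.
    IsEcho = false
};
```

The resulting `details` object is what you would pass to the embed-text operation on the inference client; omitting `Truncate` means over-length inputs are handled according to the service default.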