Module generativeaiinference/lib/client

Variables

Const Breaker

Breaker: any = require("opossum")
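The opossum dependency above supplies circuit breaking for the client's service calls: after repeated failures the breaker "opens" and rejects calls immediately instead of hammering a failing endpoint. As a rough illustration of the pattern (a simplified sketch written from scratch, not opossum's implementation or API), a breaker can be reduced to a failure counter and a state flag:

```typescript
// Simplified sketch of the circuit-breaker pattern that opossum provides.
// Threshold and state handling here are illustrative, not opossum's semantics.
type BreakerState = "CLOSED" | "OPEN";

class SimpleBreaker<T> {
  private failures = 0;
  private state: BreakerState = "CLOSED";

  constructor(
    private action: () => Promise<T>, // the guarded async operation
    private threshold = 3             // consecutive failures before opening
  ) {}

  async fire(): Promise<T> {
    // When open, fail fast without invoking the action at all.
    if (this.state === "OPEN") throw new Error("circuit open");
    try {
      const result = await this.action();
      this.failures = 0; // a success resets the failure count
      return result;
    } catch (err) {
      if (++this.failures >= this.threshold) this.state = "OPEN";
      throw err;
    }
  }
}
```

With the real library, the equivalent is `new CircuitBreaker(action, options)` followed by `breaker.fire(...)`; opossum additionally supports timeouts, a half-open state for recovery probes, and fallback handlers.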

common

Generative AI Service Inference API

OCI Generative AI is a fully managed service that provides a set of state-of-the-art, customizable large language models (LLMs) that cover a wide range of use cases for text generation, summarization, and text embeddings.

Use the Generative AI service inference API to access your custom model endpoints, or to try the out-of-the-box models to generate text, summarize, and create text embeddings.

To use a Generative AI custom model for inference, you must first create an endpoint for that model. Use the Generative AI service management API to create a custom model by fine-tuning an out-of-the-box model, or a previous version of a custom model, using your own data. Fine-tune the custom model on a fine-tuning dedicated AI cluster. Then, create a hosting dedicated AI cluster with an endpoint to host your custom model. For resource management in the Generative AI service, use the Generative AI service management API.
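Once a hosting dedicated AI cluster exposes an endpoint, inference requests target that endpoint by OCID via a dedicated serving mode. As a hedged sketch of what such a request body looks like (the OCIDs below are placeholders, and the exact inference-request shape varies by model family and SDK version), the payload can be assembled as a plain object before being handed to the client:

```typescript
// Sketch of a text-generation request payload against a custom model endpoint.
// Field names follow the Generative AI inference API described above;
// all OCIDs and parameter values are illustrative placeholders.
function buildGenerateTextDetails(
  compartmentId: string,
  endpointId: string,
  prompt: string
) {
  return {
    compartmentId,
    // A dedicated serving mode routes the call to the custom model's endpoint;
    // an on-demand serving mode would instead name an out-of-the-box model.
    servingMode: { servingType: "DEDICATED" as const, endpointId },
    // The inference request shape depends on the underlying model family.
    inferenceRequest: {
      runtimeType: "COHERE" as const,
      prompt,
      maxTokens: 100,
    },
  };
}

const details = buildGenerateTextDetails(
  "ocid1.compartment.oc1..exampleuniqueID",
  "ocid1.generativeaiendpoint.oc1..exampleuniqueID",
  "Summarize the quarterly report in two sentences."
);
console.log(details.servingMode.servingType); // prints "DEDICATED"
```

With the SDK installed and an authentication provider configured, such a payload would be passed to the client's generate-text operation; creating the endpoint itself remains a management-API concern, as the paragraph above notes.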

To learn more about the service, see the Generative AI documentation.

OpenAPI spec version: 20231130

NOTE: This class is auto-generated by OracleSDKGenerator. Do not edit the class manually.

Copyright (c) 2020, 2024, Oracle and/or its affiliates. All rights reserved. This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.