Class OciModelProvider
- All Implemented Interfaces:
ModelProvider
This provider enables integration with OCI's Generative AI service, which provides access to state-of-the-art foundation models including Cohere Command models for chat and Cohere Embed models for text embeddings.
The provider supports all three model types: embedding models for creating vector representations of text, chat models for conversational AI, and streaming chat models for real-time response generation.
Authentication is handled through OCI's authentication mechanisms including:
- OCI config file authentication
- Instance principals (for compute instances)
- Manual configuration via properties
Configuration is managed through MicroProfile Config with the following required property:
- oci.compartment.id - OCI compartment ID where models are deployed
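A minimal configuration sketch (the OCID below is a placeholder, not a real value). With MicroProfile Config this property can be placed in `META-INF/microprofile-config.properties`, or supplied as a system property or environment variable:

```properties
# Required: the OCI compartment where the GenAI models are deployed (placeholder OCID)
oci.compartment.id=<your-compartment-ocid>
```

Per the standard MicroProfile Config mapping rules, the same value can typically be supplied via an environment variable such as `OCI_COMPARTMENT_ID`.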
Example usage:
@Inject
@Named("OCI")
ModelProvider ociProvider;
EmbeddingModel embeddingModel = ociProvider.getEmbeddingModel("cohere.embed-multilingual-v3.0");
ChatModel chatModel = ociProvider.getChatModel("cohere.command-r-08-2024");
StreamingChatModel streamingModel = ociProvider.getStreamingChatModel("cohere.command-r-08-2024");
- Since:
- 25.09
- Author:
- Aleks Seovic 2025.07.04
-
Constructor Summary
Constructors
OciModelProvider(org.eclipse.microprofile.config.Config config)
Constructor for CDI initialization.
-
Method Summary
Modifier and Type / Method / Description
dev.langchain4j.model.chat.ChatModel getChatModel(String sName)
Returns a chat model instance for the specified model name.
dev.langchain4j.model.embedding.EmbeddingModel getEmbeddingModel(String sName)
Returns an embedding model instance for the specified model name.
dev.langchain4j.model.chat.StreamingChatModel getStreamingChatModel(String sName)
Returns a streaming chat model instance for the specified model name.
protected String ociCompartmentId()
Returns the OCI compartment ID from configuration.
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface com.oracle.coherence.rag.ModelProvider
getScoringModel
-
Constructor Details
-
OciModelProvider
@Inject
public OciModelProvider(org.eclipse.microprofile.config.Config config)
Constructor for CDI initialization.
-
-
Method Details
-
getEmbeddingModel
Returns an embedding model instance for the specified model name.
Embedding models convert text into vector representations that can be used for semantic similarity search and document retrieval operations.
Creates an OCI GenAI embedding model for generating vector representations of text. Supports Cohere embedding models like cohere.embed-multilingual-v3.0 and cohere.embed-english-v3.0.
- Specified by:
getEmbeddingModel in interface ModelProvider
- Parameters:
sName - the name of the embedding model to create
- Returns:
- a configured OciGenAiEmbeddingModel instance
- Throws:
io.helidon.config.ConfigException - if the required compartment ID is not configured
IllegalArgumentException - if the model name is null or empty
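A usage sketch for this method against the langchain4j `EmbeddingModel` API. The wrapper class, method name, and sample text are illustrative; only `getEmbeddingModel` and the model name come from this page:

```java
import com.oracle.coherence.rag.ModelProvider;
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.output.Response;

public class EmbeddingExample {
    // Sketch only: assumes ociProvider was obtained via CDI, as in the class-level example.
    float[] embed(ModelProvider ociProvider) {
        EmbeddingModel model = ociProvider.getEmbeddingModel("cohere.embed-multilingual-v3.0");
        // embed() returns a Response wrapper; content() yields the Embedding itself
        Response<Embedding> response = model.embed("Coherence is a distributed data grid.");
        return response.content().vector();  // the raw vector, usable for similarity search
    }
}
```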
-
getChatModel
Returns a chat model instance for the specified model name.
Chat models provide conversational AI capabilities for generating responses to user queries in a blocking manner. These models are suitable for synchronous chat interactions.
Creates an OCI GenAI chat model for conversational AI. Supports Cohere Command models like cohere.command-r-08-2024 and cohere.command-r-plus-08-2024.
- Specified by:
getChatModel in interface ModelProvider
- Parameters:
sName - the name of the chat model to create
- Returns:
- a configured OciGenAiChatModel instance
- Throws:
io.helidon.config.ConfigException - if the required compartment ID is not configured
IllegalArgumentException - if the model name is null or empty
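A sketch of synchronous use with the langchain4j `ChatModel` API. The wrapper class and prompt are illustrative; only `getChatModel` and the model name come from this page:

```java
import com.oracle.coherence.rag.ModelProvider;
import dev.langchain4j.model.chat.ChatModel;

public class ChatExample {
    // Sketch only: assumes ociProvider was obtained via CDI, as in the class-level example.
    String ask(ModelProvider ociProvider) {
        ChatModel chatModel = ociProvider.getChatModel("cohere.command-r-08-2024");
        // chat() blocks until the full response has been generated
        return chatModel.chat("What is Oracle Coherence?");
    }
}
```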
-
getStreamingChatModel
Returns a streaming chat model instance for the specified model name.
Streaming chat models provide conversational AI capabilities with streaming response generation, allowing for real-time token-by-token response delivery. This is useful for creating responsive chat interfaces.
Creates an OCI GenAI streaming chat model for real-time response generation. Enables progressive response streaming suitable for interactive applications.
- Specified by:
getStreamingChatModel in interface ModelProvider
- Parameters:
sName - the name of the streaming chat model to create
- Returns:
- a configured OciGenAiStreamingChatModel instance
- Throws:
io.helidon.config.ConfigException - if the required compartment ID is not configured
IllegalArgumentException - if the model name is null or empty
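A sketch of token-by-token consumption with the langchain4j `StreamingChatModel` API, using a `StreamingChatResponseHandler` callback. The wrapper class, prompt, and handler bodies are illustrative; only `getStreamingChatModel` and the model name come from this page:

```java
import com.oracle.coherence.rag.ModelProvider;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

public class StreamingExample {
    // Sketch only: assumes ociProvider was obtained via CDI, as in the class-level example.
    void stream(ModelProvider ociProvider) {
        StreamingChatModel model = ociProvider.getStreamingChatModel("cohere.command-r-08-2024");
        model.chat("Summarize Oracle Coherence in one sentence.", new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(String token) {
                System.out.print(token);        // tokens arrive incrementally
            }

            @Override
            public void onCompleteResponse(ChatResponse response) {
                System.out.println();           // the assembled response is also available here
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```

Because the handler is invoked asynchronously, interactive applications would typically forward each partial response to the UI rather than print it.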
-
ociCompartmentId
Returns the OCI compartment ID from configuration.
- Returns:
- the compartment ID where OCI GenAI models are deployed
- Throws:
io.helidon.config.ConfigException- if the compartment ID is not configured
-