Class OciModelProvider

java.lang.Object
com.oracle.coherence.rag.model.oci.OciModelProvider
All Implemented Interfaces:
ModelProvider

@ApplicationScoped @Named("OCI") public class OciModelProvider extends Object implements ModelProvider
ModelProvider implementation for Oracle Cloud Infrastructure (OCI) GenAI models.

This provider enables integration with OCI's Generative AI service, which provides access to state-of-the-art foundation models including Cohere Command models for chat and Cohere Embed models for text embeddings.

The provider supports all three model types: embedding models for creating vector representations of text, chat models for conversational AI, and streaming chat models for real-time response generation.

Authentication is handled through the standard OCI mechanisms, including:

  • OCI config file authentication (an example config file is shown after this list)
  • Instance principals (for compute instances)
  • Manual configuration via properties
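
For example, a typical OCI config file (~/.oci/config) used for config file authentication looks like the following sketch; all values are placeholders:

 [DEFAULT]
 user=ocid1.user.oc1..<unique-id>
 fingerprint=<api-key-fingerprint>
 key_file=~/.oci/oci_api_key.pem
 tenancy=ocid1.tenancy.oc1..<unique-id>
 region=us-ashburn-1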

Configuration is managed through MicroProfile Config with the following required property:

  • oci.compartment.id - OCI compartment ID where models are deployed
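
For example, the property can be set in META-INF/microprofile-config.properties (the compartment OCID below is a placeholder):

 # OCID of the compartment hosting the OCI GenAI models
 oci.compartment.id=ocid1.compartment.oc1..<unique-id>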

Example usage:


 // inject the OCI provider using its CDI qualifier
 @Inject
 @Named("OCI")
 ModelProvider ociProvider;
 
 // obtain the OCI GenAI models by name
 EmbeddingModel embeddingModel = ociProvider.getEmbeddingModel("cohere.embed-multilingual-v3.0");
 ChatModel chatModel = ociProvider.getChatModel("cohere.command-r-08-2024");
 StreamingChatModel streamingModel = ociProvider.getStreamingChatModel("cohere.command-r-08-2024");
 

Since:
25.09
Author:
Aleks Seovic 2025.07.04
  • Constructor Details

    • OciModelProvider

      @Inject public OciModelProvider(org.eclipse.microprofile.config.Config config)
      Constructs a new OciModelProvider for CDI injection.
      Parameters:
      config - the MicroProfile Config instance used to read the provider configuration
  • Method Details

    • getEmbeddingModel

      public dev.langchain4j.model.embedding.EmbeddingModel getEmbeddingModel(String sName)
      Returns an embedding model instance for the specified model name.

      Embedding models convert text into vector representations that can be used for semantic similarity search and document retrieval operations.

      Creates an OCI GenAI embedding model for generating vector representations of text. Supports Cohere embedding models like cohere.embed-multilingual-v3.0 and cohere.embed-english-v3.0.
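
      For example (a minimal sketch; ociProvider is the injected provider from the class-level example, and the calls follow the LangChain4j EmbeddingModel API):

       EmbeddingModel model = ociProvider.getEmbeddingModel("cohere.embed-multilingual-v3.0");
 
       // embed a single piece of text and read back the resulting vector
       Response<Embedding> response = model.embed("Coherence is a distributed data grid.");
       float[] vector = response.content().vector();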

      Specified by:
      getEmbeddingModel in interface ModelProvider
      Parameters:
      sName - the name of the embedding model to create
      Returns:
      a configured OciGenAiEmbeddingModel instance
      Throws:
      io.helidon.config.ConfigException - if the required compartment ID is not configured
      IllegalArgumentException - if the model name is null or empty
    • getChatModel

      public dev.langchain4j.model.chat.ChatModel getChatModel(String sName)
      Returns a chat model instance for the specified model name.

      Chat models provide conversational AI capabilities for generating responses to user queries in a blocking manner. These models are suitable for synchronous chat interactions.

      Creates an OCI GenAI chat model for conversational AI. Supports Cohere Command models like cohere.command-r-08-2024 and cohere.command-r-plus-08-2024.
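
      For example (a minimal sketch; ociProvider is the injected provider from the class-level example, and chat(String) is the LangChain4j ChatModel convenience method):

       ChatModel model = ociProvider.getChatModel("cohere.command-r-08-2024");
 
       // send a user message and block until the complete response is returned
       String answer = model.chat("What is Oracle Coherence?");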

      Specified by:
      getChatModel in interface ModelProvider
      Parameters:
      sName - the name of the chat model to create
      Returns:
      a configured OciGenAiChatModel instance
      Throws:
      io.helidon.config.ConfigException - if the required compartment ID is not configured
      IllegalArgumentException - if the model name is null or empty
    • getStreamingChatModel

      public dev.langchain4j.model.chat.StreamingChatModel getStreamingChatModel(String sName)
      Returns a streaming chat model instance for the specified model name.

      Streaming chat models provide conversational AI capabilities with streaming response generation, allowing for real-time token-by-token response delivery. This is useful for creating responsive chat interfaces.

      Creates an OCI GenAI streaming chat model for real-time response generation. Enables progressive response streaming suitable for interactive applications.
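
      For example (a minimal sketch; ociProvider is the injected provider from the class-level example, and the callbacks follow the LangChain4j StreamingChatResponseHandler API):

       StreamingChatModel model = ociProvider.getStreamingChatModel("cohere.command-r-08-2024");
 
       model.chat("Summarize the Coherence RAG module.", new StreamingChatResponseHandler() {
           @Override
           public void onPartialResponse(String token) {
               System.out.print(token);     // each partial token as it is generated
           }
 
           @Override
           public void onCompleteResponse(ChatResponse response) {
               System.out.println();        // the complete response is available here
           }
 
           @Override
           public void onError(Throwable error) {
               error.printStackTrace();
           }
       });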

      Specified by:
      getStreamingChatModel in interface ModelProvider
      Parameters:
      sName - the name of the streaming chat model to create
      Returns:
      a configured OciGenAiStreamingChatModel instance
      Throws:
      io.helidon.config.ConfigException - if the required compartment ID is not configured
      IllegalArgumentException - if the model name is null or empty
    • ociCompartmentId

      protected String ociCompartmentId()
      Returns the OCI compartment ID from configuration.
      Returns:
      the compartment ID where OCI GenAI models are deployed
      Throws:
      io.helidon.config.ConfigException - if the compartment ID is not configured