Class StreamingChatModelSupplier

java.lang.Object
    com.oracle.coherence.rag.model.AbstractModelSupplier<dev.langchain4j.model.chat.StreamingChatModel>
        com.oracle.coherence.rag.model.StreamingChatModelSupplier
All Implemented Interfaces:
Supplier<dev.langchain4j.model.chat.StreamingChatModel>

@ApplicationScoped
public class StreamingChatModelSupplier
extends AbstractModelSupplier<dev.langchain4j.model.chat.StreamingChatModel>
CDI supplier for streaming chat models in the Coherence RAG framework.

This supplier extends AbstractModelSupplier to provide streaming chat model instances for conversational AI interactions. It supports dynamic model selection based on configuration and maintains cached instances for efficient reuse.

The supplier integrates with various AI service providers through the ModelProvider interface, supporting models from OpenAI, OCI GenAI, Ollama, and other providers. Models are identified using the standard "provider/model" naming convention.

Configuration:

  • Default model: "OpenAI/gpt-4o-mini"
  • Configuration property: model.chat
  • Supports runtime configuration changes
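
As a sketch of how the model.chat property interacts with the default, the configured value could be resolved with MicroProfile Config as shown below. The property key and the fallback constant are documented by this class; reading them through ConfigProvider is an assumption about the surrounding application, not something this supplier requires.

 // Sketch: resolve the chat model name from configuration, falling back to the default.
 // Reading the property through MicroProfile Config is illustrative only.
 org.eclipse.microprofile.config.Config config =
         org.eclipse.microprofile.config.ConfigProvider.getConfig();
 String chatModelName = config.getOptionalValue("model.chat", String.class)
                              .orElse(StreamingChatModelSupplier.DEFAULT_CHAT_MODEL);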

Usage example:

 // Obtain the supplier as a CDI bean (field injection shown for illustration)
 @Inject
 StreamingChatModelSupplier chatModelSupplier;

 // Get the default chat model
 StreamingChatModel model = chatModelSupplier.get();

 // Get a specific model using the "provider/model" naming convention
 StreamingChatModel specificModel = chatModelSupplier.get("oci/meta.llama-3.1-70b-instruct");

Since:
25.09
Author:
Aleks Seovic 2025.07.04
  • Field Details

    • DEFAULT_CHAT_MODEL

      public static final String DEFAULT_CHAT_MODEL
      The default chat model used when no specific model is configured.

      This represents a balanced choice between performance and cost for general-purpose conversational AI tasks.

  • Constructor Details

    • StreamingChatModelSupplier

      public StreamingChatModelSupplier()
  • Method Details

    • description

      protected String description()
      Returns a human-readable description of the model type.

      This description is used in logging and error messages to identify the type of models this supplier manages.

      Specified by:
      description in class AbstractModelSupplier<dev.langchain4j.model.chat.StreamingChatModel>
      Returns:
      "chat" identifying this as a chat model supplier
    • defaultModel

      protected String defaultModel()
      Returns the default model name when no configuration is provided.

      This method provides the fallback chat model that will be used when no explicit configuration is available.

      Specified by:
      defaultModel in class AbstractModelSupplier<dev.langchain4j.model.chat.StreamingChatModel>
      Returns:
      the default chat model name
    • configProperty

      protected String configProperty()
      Returns the configuration property key for chat models.

      This property can be used to configure the default chat model at runtime through various configuration sources.

      Specified by:
      configProperty in class AbstractModelSupplier<dev.langchain4j.model.chat.StreamingChatModel>
      Returns:
      "model.chat" as the configuration property key
    • create

      public dev.langchain4j.model.chat.StreamingChatModel create(ModelName modelName)
      Creates a new streaming chat model instance for the specified model name.

      This method uses the model provider framework to create streaming chat model instances. It looks up the appropriate ModelProvider based on the provider component of the model name and delegates model creation to that provider.

      Specified by:
      create in class AbstractModelSupplier<dev.langchain4j.model.chat.StreamingChatModel>
      Parameters:
      modelName - the name of the chat model to create
      Returns:
      a new StreamingChatModel instance
      Throws:
      io.helidon.config.ConfigException - if the specified model provider is not supported or if the model cannot be created
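
      The lookup-and-delegate flow described above can be sketched as follows. This is a conceptual outline only: the findProvider helper, the provider() accessor, and the createStreamingChatModel method are assumed names, not the actual Coherence RAG or ModelProvider API.

       // Conceptual sketch of create(ModelName); all member names below are hypothetical
       ModelProvider provider = findProvider(modelName.provider());   // resolve by provider prefix
       if (provider == null) {
           // documented failure mode: the requested provider is not supported
           throw new io.helidon.config.ConfigException(
                   "Unsupported model provider: " + modelName.provider());
       }
       // delegate creation of the streaming chat model to the resolved provider
       return provider.createStreamingChatModel(modelName);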