Bring Your Own Models

If you have models you want to use instead of those curated by Data Science, you can bring them into AI Quick Actions from Object Storage by registering the model.

There are two ways to register a model:
  • Register a verified model.
  • Register an unverified model.
A verified model is one for which the Data Science service has tested the deployment and fine-tuning configurations. An unverified model is one the service hasn't tested.
Note

The difference between a service-curated model and a verified model is that, for a verified model, you must download the model artifact into Object Storage and register the model in AI Quick Actions before using it.

Service Managed Inference Containers

Two inference containers are available to use with Bring Your Own Model.

For curated and verified models, Data Science has tested which inference container works best with each model, so the container can't be changed. For unverified models, you must decide which inference container is most suitable for the model. Two service-managed inference containers are available: one for models compatible with the vLLM 0.4.1 inference engine, and one for models compatible with TGI 2.0.1.
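As a sketch of that decision, container choice for an unverified model might hinge on whether its architecture is supported by vLLM 0.4.1 or by TGI 2.0.1. The architecture lists below are illustrative assumptions, not the engines' actual support matrices:

```python
# Hypothetical helper for picking a service-managed inference container.
# The architecture-to-engine mapping is an illustrative assumption, NOT the
# real vLLM 0.4.1 / TGI 2.0.1 support matrix -- check the engine docs.
VLLM_ARCHS = {"LlamaForCausalLM", "MistralForCausalLM"}   # assumed
TGI_ARCHS = {"GPTNeoXForCausalLM", "BloomForCausalLM"}    # assumed


def pick_container(architecture: str) -> str:
    """Return the service-managed container family for a model architecture."""
    if architecture in VLLM_ARCHS:
        return "vLLM 0.4.1"
    if architecture in TGI_ARCHS:
        return "TGI 2.0.1"
    raise ValueError(f"no service-managed container known for {architecture}")


print(pick_container("LlamaForCausalLM"))  # vLLM 0.4.1
```

In practice you would read the architecture from the model's `config.json` and confirm support against the current engine documentation before registering.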

Register Verified Models

Data Science provides tested models you can choose to use, but you must first download the model artifacts to Object Storage.

  • Follow the steps in Prerequisites.
    Steps 2 and 3 show different ways of registering a verified model.
    1. Click Models if it's not already shown.
    2. To register a new model from object storage:
      1. Click Register from object storage.
      2. Click Register verified model to select a model that's been tested by Oracle Data Science for deployment and fine-tuning.
      3. From the Select model list, check whether the model you downloaded is listed. If it is, select the model name.
      4. From the Select compartment list, select the compartment where the model artifact is stored.
      5. From the Object storage location list, select the bucket where the model artifact is stored.
      6. Provide the directory path to the model artifact in Object Storage.
      7. Click Register to register the model. When the model registration finishes, the Model Information screen is displayed. The model is included in the list of models under My models.
      8. Click Fine-tune to fine-tune the model.
      9. Click Deploy to deploy the model.
    3. To register a verified model from the ready-to-register list:
      1. Click Ready-to-register models. A list of models that are ready to register, but have no artifacts, is shown.
      2. Click the model you want to register. The Model Information screen is displayed.
      3. Click Register model.
      4. From the Select compartment list, select the compartment where the model artifact is stored.
      5. From the Object storage location list, select the bucket where the model artifact is stored.
      6. Provide the directory path to the model artifact in Object Storage.
  • For a complete list of parameters and values for AI Quick Actions CLI commands, see AI Quick Actions CLI.

  • This task can't be performed using the API.
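The bucket and directory path selected above resolve to an Object Storage location of the form `oci://<bucket>@<namespace>/<prefix>`. A minimal sketch of assembling that URI (the bucket, namespace, and prefix names are placeholders, not real resources):

```python
def artifact_location(bucket: str, namespace: str, prefix: str) -> str:
    """Build the Object Storage URI for a model artifact directory."""
    prefix = prefix.strip("/")  # normalize slashes from console input
    return f"oci://{bucket}@{namespace}/{prefix}"


# Placeholder names, not real resources.
print(artifact_location("model-artifacts", "mytenancy", "models/llama-7b/"))
# oci://model-artifacts@mytenancy/models/llama-7b
```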

Register Unverified Models

Follow these steps to use models that haven't been tested by Data Science.

  • Follow the steps in Prerequisites.
    1. Click Models if it's not already shown.
    2. To register a custom model from object storage, click Register from object storage.
    3. Click Register unverified model to register an unverified model.
    4. In Model name, give the model a name with which to register it.
    5. From the Select compartment list, select the compartment where the model artifact is stored.
    6. From the Object storage location list, select the bucket where the model artifact is stored.
    7. Provide the directory path to the model artifact in Object Storage.
    8. Under Inference Container, select a container from the list to use for inferencing.
    9. (Optional) To fine-tune the model, select Enable Fine tuning.
    10. Click Register to register the model.
      When the model registration finishes, the Model Information screen is displayed. The model is included in the list of models under My models.
    11. Click Fine-tune to fine-tune the model.
    12. Click Deploy to deploy the model.
  • For a complete list of parameters and values for AI Quick Actions CLI commands, see AI Quick Actions CLI.

  • This task can't be performed using the API.
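Because unverified models aren't tested by the service, it can help to sanity-check that the local artifact directory looks like a standard Hugging Face-format model before uploading it to Object Storage. The expected file names below are common conventions, not requirements stated by AI Quick Actions:

```python
import pathlib

# Typical Hugging Face model layout; illustrative convention, not an
# AI Quick Actions requirement.
REQUIRED_FILES = ["config.json"]
WEIGHT_GLOBS = ["*.safetensors", "*.bin"]


def check_artifact_dir(path: str) -> list[str]:
    """Return a list of problems found in a local model artifact directory."""
    root = pathlib.Path(path)
    problems = []
    for name in REQUIRED_FILES:
        if not (root / name).is_file():
            problems.append(f"missing {name}")
    # any(root.glob(g)) is False only when the glob matches nothing
    if not any(any(root.glob(g)) for g in WEIGHT_GLOBS):
        problems.append("no weight files (*.safetensors or *.bin)")
    return problems
```

An empty result means the directory has at least the typical files; a non-empty list names what to fix before uploading and registering.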