Set Up LangChain for Generative AI
Set up the LangChain packages, including the package for integration with OCI Generative AI. Then, test the installation by chatting with a model hosted on OCI Generative AI.
Overview
Key tasks:
- Install the LangChain and LangChain OCI packages.
- Run the Python code and receive a response from the model.
Before You Begin
To successfully perform this tutorial, you must have the following configuration in place.
- A macOS, Linux, or Windows environment with Windows Subsystem for Linux (WSL).
- Python 3 installed. For the supported Python versions, see the Python SDK documentation.
- Completed the Set Up API Authentication for OCI tutorial, so that you have a working config file and authentication profile.
1. Install LangChain
2. Install the LangChain OCI Package
3. Gather Required Information
Collect all the information that you need to complete this tutorial. Copy the required information into a secure text file.
This tutorial sends one chat message to the Meta Llama 4 Scout model hosted on OCI Generative AI and receives a response. The cost of one on-demand chat message is close to zero dollars, but it's not free. This section also shows you how to estimate cost so that you can decide which model to use when you scale to thousands of transactions.
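The cost arithmetic can be sketched as follows. The price and billing unit below are placeholders, not actual OCI rates; look up the current on-demand rate and billing unit for your model on the OCI Generative AI pricing page before relying on any estimate.

```python
# Hypothetical cost estimate for on-demand chat transactions.
# PRICE_PER_10K_UNITS is a placeholder, not a real OCI rate (assumption);
# check the pricing page for the actual rate and billing unit.
PRICE_PER_10K_UNITS = 0.0002  # placeholder price in USD

def estimate_cost(units_in: int, units_out: int, transactions: int) -> float:
    """Estimate total cost for a number of chat transactions."""
    units_per_txn = units_in + units_out
    return transactions * units_per_txn / 10_000 * PRICE_PER_10K_UNITS

# For example: 1,000 transactions of ~2,000 billable units each.
print(f"${estimate_cost(1_500, 500, 1_000):.4f}")
```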
3.1 Get Region and Model ID
Navigate to Pretrained Foundational Models in Generative AI and select Meta Llama 4 Scout (New).
- From the Available in These Regions section, copy a region that is not marked as (dedicated AI cluster only). For example: US Midwest (Chicago)
- In the On-Demand Mode section, copy the OCI Model Name from the table:
meta.llama-4-scout-17b-16e-instruct
3.2 Get Compartment Information
To get the OCID of a compartment, open the Console navigation menu and select Identity & Security, and then Compartments. Find your compartment in the list, hover over its OCID, and select Copy.
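If you have the OCI CLI configured, you can also list compartments from the command line. This is a sketch; the JMESPath query below assumes the CLI's default JSON response shape.

```shell
# List compartments visible to your profile and print each name and OCID.
oci iam compartment list --all \
  --query 'data[].{name:name, ocid:id}' \
  --output table
```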
3.3 Get Path to Config File
From the Set Up API Authentication for OCI tutorial, copy the following information:
- Path to the config file, such as <your-home-directory>/.oci/config
- Authentication profile name to use in the config file. For example, Default.
3.4 Get Inference API Endpoint
Copy the inference API endpoint for your chosen region. For US Midwest (Chicago), the endpoint is:
https://inference.generativeai.us-chicago-1.oci.oraclecloud.com
3.5 Collected Information
Ensure you have the following information written down for the tutorial.
- Compartment ID: <sandbox-compartment>
  Example: ocid1.compartment.oc1.aaaaaaa...
- Model ID: meta.llama-4-scout-17b-16e-instruct
- API Endpoint: https://inference.generativeai.us-chicago-1.oci.oraclecloud.com
- Config File Path: <path-to-config-file>
  Example: <your-home-directory>/.oci/config
- Authentication Profile Name in Config File: <auth-profile-name-in-config-file>
  Example: Default
4. Chat
Chat using a model hosted on OCI Generative AI. Reach this model with the langchain-oci package.
Be sure to perform the steps within your <sandbox-compartment>. You might not have permission to view or create resources in the tenancy or in other compartments.