Projects
OCI Generative AI projects organize conversations and responses under a shared set of settings. In a project, you define how long data is retained, enable long-term memory to persist context across conversations, and enable short-term memory compaction to optimize how conversation history is processed.
Projects are isolated from each other to support lifecycle management and compliance boundaries. Reference the project OCID in API and SDK calls to apply project settings at runtime.
About Projects
A project is an OCI resource that organizes agent-specific artifacts created through the Generative AI service, including responses, conversations, files, and containers. Projects are isolated from each other, so artifacts in one project aren’t accessible from another project. A project is required to call the OCI OpenAI-compatible API for agent-related tasks.
Project settings can include:
- Data retention for responses and conversations
- Long-term memory settings
- Conversation history compaction (short-term memory compaction) settings

Deleting a project deletes all associated artifacts (responses, conversations, files, and containers) created within that project. API and SDK calls reference the project by its OCID.
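As a minimal sketch of how a call might be scoped to a project, the snippet below assembles a request body that carries the project OCID. The field names and the placeholder OCID are illustrative assumptions, not the service's confirmed request schema.

```python
# Illustrative sketch: scoping a request to one project via its OCID.
# The "project" field name and the OCID value are hypothetical
# placeholders for illustration only.

def build_request(project_ocid: str, prompt: str) -> dict:
    """Assemble a minimal request body scoped to a single project."""
    return {
        "project": project_ocid,   # applies this project's settings at runtime
        "input": prompt,           # the user prompt for the model
    }

request = build_request(
    "ocid1.generativeaiproject.oc1..example",  # placeholder OCID
    "Summarize our last conversation.",
)
print(request["project"])
```

Because projects are isolated from each other, every agent-related call carries exactly one project OCID, and the artifacts it creates stay inside that project.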
Agent Memory
Agent Memory enables agents to retain and use context across interactions. It supports both short-term memory within a conversation and long-term memory across conversations, helping improve continuity, relevance, and efficiency.
Short-Term Memory
Short-term memory refers to the conversation context carried forward within an ongoing conversation. The Responses API and Conversations API simplify conversation state management, enabling multi-turn interactions.
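The chaining pattern behind that state management can be sketched locally. The stand-in client below mimics a Responses-style flow in which each new turn references the previous response ID so the accumulated context is carried forward; the class and field names are hypothetical, not the service's API.

```python
# Illustrative local stand-in for multi-turn conversation state,
# assuming a Responses-style flow where each call can reference the
# previous response ID. Names here are hypothetical placeholders.

class SketchClient:
    """Mimics a service that chains turns by response ID."""

    def __init__(self):
        self._history = {}   # response_id -> accumulated turns
        self._counter = 0

    def create_response(self, text, previous_response_id=None):
        # Start from the context of the referenced response, if any.
        turns = list(self._history.get(previous_response_id, []))
        turns.append(text)
        self._counter += 1
        response_id = f"resp-{self._counter}"
        self._history[response_id] = turns
        return {"id": response_id, "turns": turns}

client = SketchClient()
first = client.create_response("What is OCI?")
second = client.create_response("Tell me more.", previous_response_id=first["id"])
print(second["turns"])  # both turns are carried forward
```

The point of the pattern is that the client only passes an ID between turns; the service, not the application, is responsible for replaying the accumulated context.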
Long-Term Memory
Long-term memory provides persistent context across conversations. When enabled, the service extracts key information from conversations and stores it so it can be recalled in future interactions within the same project.
Long-term memory is useful for scenarios that require continuity across sessions, such as:
- Remembering stable user preferences
- Retaining recurring background context
- Maintaining continuity across interactions
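The extract-store-recall cycle described above can be sketched as follows. The toy heuristic (keeping turns that state a preference) and the in-memory store stand in for the model-driven extraction the service performs; none of this reflects the service's internals.

```python
# Illustrative sketch of the long-term memory cycle: extract key facts
# from a conversation, store them per project, recall them later.
# The extraction heuristic and store shape are simplified stand-ins.

memory_store: dict[str, list[str]] = {}   # project_id -> remembered facts

def extract_and_store(project_id: str, conversation: list[str]) -> None:
    """Keep turns that state a durable preference (toy heuristic)."""
    facts = [turn for turn in conversation if "prefer" in turn.lower()]
    memory_store.setdefault(project_id, []).extend(facts)

def recall(project_id: str) -> list[str]:
    """Return the facts remembered for this project."""
    return memory_store.get(project_id, [])

extract_and_store("project-a", ["I prefer concise answers.", "What's the weather?"])
print(recall("project-a"))  # the stable preference survives; small talk does not
```

Note that recall is keyed by project: because projects are isolated, memories extracted in one project are not visible from another.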
Short-Term Memory Compaction
As conversation history grows, sending the full history can increase token usage and latency. Short-term memory compaction summarizes and compresses earlier conversation history into a smaller, structured representation. This helps preserve key details while reducing the amount of context sent to the model.
This approach:
- Preserves key information from earlier turns
- Reduces token usage for long conversations
- Improves latency by keeping context lightweight
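The compaction idea can be sketched in a few lines: older turns collapse into a single summary entry while the most recent turns stay verbatim. The one-line summarization here is a trivial placeholder for the model-driven compaction the service performs, and the `keep_recent` cutoff is an illustrative assumption.

```python
# Illustrative sketch of short-term memory compaction: compress all but
# the most recent turns into one summary entry. The summary text is a
# placeholder for a real model-generated summary.

def compact(history: list[str], keep_recent: int = 2) -> list[str]:
    """Collapse older turns into a single summary line."""
    if len(history) <= keep_recent:
        return history
    older, recent = history[:-keep_recent], history[-keep_recent:]
    summary = f"[summary of {len(older)} earlier turns]"
    return [summary] + recent

history = ["turn 1", "turn 2", "turn 3", "turn 4", "turn 5"]
print(compact(history))  # ['[summary of 3 earlier turns]', 'turn 4', 'turn 5']
```

The context sent to the model shrinks from five entries to three, which is how compaction trades a small summarization cost for lower token usage and latency on every subsequent turn.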
Data Retention
Data retention defines how long project artifacts are kept before they’re automatically removed. In OCI Generative AI projects, you set retention separately for responses and conversations.
- Response retention: Controls how long generated responses are retained.
- Conversation retention: Controls how long conversations are retained after the most recent update.
Retention settings help you control storage duration and align with your organization’s data handling requirements.
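The two retention windows above can be pictured with a small sketch. The setting names and the day-based units are assumptions for illustration, not the service's actual configuration schema.

```python
# Illustrative sketch of separate retention windows for responses and
# conversations. Field names and day-based units are assumed for
# illustration only.
from datetime import datetime, timedelta

retention_settings = {
    "response_retention_days": 30,       # how long generated responses are kept
    "conversation_retention_days": 90,   # counted from the most recent update
}

def expires_at(last_updated: datetime, retention_days: int) -> datetime:
    """Compute when an artifact ages out of a day-based retention window."""
    return last_updated + timedelta(days=retention_days)

# A conversation last updated on Jan 1 would age out 90 days later.
print(expires_at(datetime(2025, 1, 1),
                 retention_settings["conversation_retention_days"]))
# → 2025-04-01 00:00:00
```

Because conversation retention is measured from the most recent update, an active conversation keeps extending its own window, while idle conversations age out.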
Limits
See Application Limits.
QuickStart Permissions
Manage
You can perform the following tasks to create, list, and use projects: