Overview
The Memory module in CAMEL provides a flexible and powerful system for storing, retrieving, and managing information for AI agents. It enables agents to maintain context across conversations and retrieve relevant information from past interactions, enhancing the coherence and relevance of AI responses.
Getting Started
Installation
Ensure you have CAMEL AI installed in your Python environment:
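```bash
pip install camel-ai
```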
🔑 Setting Up API Keys
You’ll need to set up your API keys for OpenAI.
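For example, the key can be exposed through the `OPENAI_API_KEY` environment variable before any models are created (replace the placeholder with your own key):

```python
import os

# Make the OpenAI API key available to CAMEL.
os.environ["OPENAI_API_KEY"] = "<insert your OpenAI API key>"
```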
Usage
To use the Memory module in your agent (a minimal end-to-end sketch follows this list):
- Choose an appropriate `AgentMemory` implementation (`ChatHistoryMemory`, `VectorDBMemory`, or `LongtermAgentMemory`).
- Initialize the memory with a context creator and any necessary parameters.
- Use `write_records()` to add new information to the memory.
- Use `retrieve()` to get relevant context for the agent’s next action.
- Use `get_context()` to obtain the formatted context for the agent.
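Below is a minimal sketch of this workflow using `ChatHistoryMemory`. The module paths (`camel.memories`, `camel.messages`, `camel.types`, `camel.utils`) and the `ModelType.GPT_4O_MINI` token counter follow recent CAMEL releases and may differ slightly in your installed version.

```python
from camel.memories import ChatHistoryMemory, MemoryRecord, ScoreBasedContextCreator
from camel.messages import BaseMessage
from camel.types import ModelType, OpenAIBackendRole
from camel.utils import OpenAITokenCounter

# 1-2. Choose a memory implementation and initialize it with a context creator.
memory = ChatHistoryMemory(
    context_creator=ScoreBasedContextCreator(
        token_counter=OpenAITokenCounter(ModelType.GPT_4O_MINI),
        token_limit=1024,
    ),
)

# 3. Write new records into the memory.
memory.write_records([
    MemoryRecord(
        message=BaseMessage.make_user_message(
            role_name="User",
            content="Remember that my favorite color is blue.",
        ),
        role_at_backend=OpenAIBackendRole.USER,
    ),
])

# 4. Retrieve the records relevant to the agent's next action.
records = memory.retrieve()

# 5. Get the formatted context (messages plus token count) for the agent.
context, token_count = memory.get_context()
```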
Setting `LongtermAgentMemory`:
Import the required modules, initialize the memory blocks, and write some records:
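The sketch below follows the same `camel.memories` layout as the earlier example; adjust the model type passed to the token counter to whatever your version provides.

```python
from camel.memories import (
    ChatHistoryBlock,
    LongtermAgentMemory,
    MemoryRecord,
    ScoreBasedContextCreator,
    VectorDBBlock,
)
from camel.messages import BaseMessage
from camel.types import ModelType, OpenAIBackendRole
from camel.utils import OpenAITokenCounter

# Initialize LongtermAgentMemory with a chat-history block and a vector-DB block.
memory = LongtermAgentMemory(
    context_creator=ScoreBasedContextCreator(
        token_counter=OpenAITokenCounter(ModelType.GPT_4O_MINI),
        token_limit=1024,
    ),
    chat_history_block=ChatHistoryBlock(),
    vector_db_block=VectorDBBlock(),
)

# Write a pair of records: a user question and the agent's answer.
records = [
    MemoryRecord(
        message=BaseMessage.make_user_message(
            role_name="User",
            content="What is CAMEL AI?",
        ),
        role_at_backend=OpenAIBackendRole.USER,
    ),
    MemoryRecord(
        message=BaseMessage.make_assistant_message(
            role_name="Agent",
            content="CAMEL AI is an open-source multi-agent framework.",
        ),
        role_at_backend=OpenAIBackendRole.ASSISTANT,
    ),
]
memory.write_records(records)

# Get the formatted context and its token count.
context, token_count = memory.get_context()
print(context)
print(f"Token count: {token_count}")
```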
Adding `LongtermAgentMemory` to your `ChatAgent`:
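A short sketch, assuming `ChatAgent` accepts a plain string system message and exposes a writable `memory` attribute (as recent CAMEL releases do):

```python
from camel.agents import ChatAgent

# Initialize an agent and attach the memory populated above.
agent = ChatAgent(system_message="You are a helpful assistant.")
agent.memory = memory

# Ask a question that relies on the stored records.
response = agent.step("Based on what we discussed, what is CAMEL AI?")
print(response.msgs[0].content)
```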
Advanced Topics
Customizing Context Creator
You can create custom context creators by subclassing `BaseContextCreator`:
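A minimal sketch of such a subclass. The members shown (`token_counter`, `token_limit`, and `create_context`) mirror `BaseContextCreator`'s abstract interface; check the base class in your installed version for the exact signatures.

```python
from camel.memories import BaseContextCreator


class MyCustomContextCreator(BaseContextCreator):
    @property
    def token_counter(self):
        # Return the token counter used to measure message sizes.
        ...

    @property
    def token_limit(self):
        # Maximum number of tokens allowed in the generated context.
        return 1024

    def create_context(self, records):
        # Turn the retrieved records into (messages, token_count) for the agent.
        ...
```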
Customizing Vector Database Block
You can customize `VectorDBBlock` by adjusting the embedding model or the vector storage backend:
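For example, the sketch below swaps in an OpenAI embedding model and a Qdrant-backed vector store; `OpenAIEmbedding` and `QdrantStorage` are assumed to be available from `camel.embeddings` and `camel.storages` in your installed version.

```python
from camel.embeddings import OpenAIEmbedding
from camel.memories import VectorDBBlock
from camel.storages import QdrantStorage

# Use an OpenAI embedding model and a Qdrant-backed vector store
# instead of the defaults.
embedding = OpenAIEmbedding()
vector_db_block = VectorDBBlock(
    embedding=embedding,
    storage=QdrantStorage(vector_dim=embedding.get_output_dim()),
)
```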
Performance Considerations
- For large-scale applications, consider using persistent storage backends instead of in-memory storage.
- Optimize your context creator to balance between context relevance and token limits.
- When using `VectorDBMemory`, be mindful of the trade-off between retrieval accuracy and speed as the database grows.