Memory Cookbook#

You can also check this cookbook in Colab here.

Overview#

The Memory module in CAMEL provides a flexible and powerful system for storing, retrieving, and managing information for AI agents. It enables agents to maintain context across conversations and retrieve relevant information from past interactions, enhancing the coherence and relevance of AI responses.

Getting Started#

Installation#

Ensure you have CAMEL AI installed in your Python environment:

[ ]:
%pip install "camel-ai[all]==0.2.1"

🔑 Setting Up API Keys#

You'll need to set up your API keys for OpenAI.

[ ]:
import os
from getpass import getpass

# Prompt for the API key securely
openai_api_key = getpass('Enter your API key: ')
os.environ["OPENAI_API_KEY"] = openai_api_key
Enter your API key: ··········

Usage#

To use the Memory module in your agent:

  1. Choose an appropriate AgentMemory implementation (ChatHistoryMemory, VectorDBMemory, or LongtermAgentMemory).

  2. Initialize the memory with a context creator and any necessary parameters.

  3. Use write_records() to add new information to the memory.

  4. Use retrieve() to get relevant context for the agent's next action.

  5. Use get_context() to obtain the formatted context for the agent (see the end-to-end sketch after this list).
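
For orientation, here is a minimal end-to-end sketch of the five steps using ChatHistoryMemory. It mirrors the constructor arguments used later in this cookbook; treat it as a sketch and check the exact signatures in your CAMEL version.

[ ]:
from camel.memories import (
    ChatHistoryMemory,
    MemoryRecord,
    ScoreBasedContextCreator,
)
from camel.messages import BaseMessage
from camel.types import ModelType, OpenAIBackendRole
from camel.utils import OpenAITokenCounter

# Steps 1-2: choose a memory implementation and initialize it
# with a context creator
memory = ChatHistoryMemory(
    context_creator=ScoreBasedContextCreator(
        token_counter=OpenAITokenCounter(ModelType.GPT_4O_MINI),
        token_limit=1024,
    ),
)

# Step 3: write new records into memory
memory.write_records([
    MemoryRecord(
        message=BaseMessage.make_user_message(
            role_name="User",
            meta_dict=None,
            content="I prefer concise answers.",
        ),
        role_at_backend=OpenAIBackendRole.USER,
    ),
])

# Step 4: retrieve scored records for the agent's next action
context_records = memory.retrieve()

# Step 5: get the formatted context and its token count
messages, token_count = memory.get_context()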

Setting Up LongtermAgentMemory#

Import the required modules:

[ ]:
from camel.memories import (
    ChatHistoryBlock,
    LongtermAgentMemory,
    MemoryRecord,
    ScoreBasedContextCreator,
    VectorDBBlock,
)
from camel.messages import BaseMessage
from camel.types import ModelType, OpenAIBackendRole
from camel.utils import OpenAITokenCounter
[ ]:
# Initialize the memory
memory = LongtermAgentMemory(
    context_creator=ScoreBasedContextCreator(
        token_counter=OpenAITokenCounter(ModelType.GPT_4O_MINI),
        token_limit=1024,
    ),
    chat_history_block=ChatHistoryBlock(),
    vector_db_block=VectorDBBlock(),
)

# Create and write new records
records = [
    MemoryRecord(
        message=BaseMessage.make_user_message(
            role_name="User",
            meta_dict=None,
            content="What is CAMEL AI?",
        ),
        role_at_backend=OpenAIBackendRole.USER,
    ),
    MemoryRecord(
        message=BaseMessage.make_assistant_message(
            role_name="Agent",
            meta_dict=None,
            content="CAMEL-AI.org is the 1st LLM multi-agent framework and "
            "an open-source community dedicated to finding the scaling law "
            "of agents.",
        ),
        role_at_backend=OpenAIBackendRole.ASSISTANT,
    ),
]
memory.write_records(records)

# Get context for the agent
context, token_count = memory.get_context()

print(context)
[{'role': 'user', 'content': 'What is CAMEL AI?'}, {'role': 'assistant', 'content': 'CAMEL-AI.org is the 1st LLM multi-agent framework and an open-source community dedicated to finding the scaling law of agents.'}]
[ ]:
print(token_count)
49

Adding LongtermAgentMemory to your ChatAgent#

[ ]:
from camel.agents import ChatAgent

# Define system message for the agent
sys_msg = BaseMessage.make_assistant_message(
    role_name='Agent',
    content='You are a curious agent wondering about the universe.',
)

# Initialize agent
agent = ChatAgent(system_message=sys_msg)

# Set memory to the agent
agent.memory = memory


# Define a user message
usr_msg = BaseMessage.make_user_message(
    role_name='User',
    content="Tell me which is the 1st LLM multi-agent framework based on what we have discussed",
)

# Sending the message to the agent
response = agent.step(usr_msg)

# Check the response (just for illustrative purposes)
print(response.msgs[0].content)
CAMEL AI is recognized as the first LLM (Large Language Model) multi-agent framework. It is an open-source community initiative focused on exploring the scaling laws of agents, enabling the development and interaction of multiple AI agents in a collaborative environment. This framework allows researchers and developers to experiment with various configurations and interactions among agents, facilitating advancements in AI capabilities and understanding.

Advanced Topics#

Customizing Context Creator#

You can create custom context creators by subclassing BaseContextCreator:

[ ]:
from camel.memories import BaseContextCreator
from camel.types import ModelType
from camel.utils import OpenAITokenCounter

class MyCustomContextCreator(BaseContextCreator):
    @property
    def token_counter(self):
        # Return the token counter used to measure context size,
        # e.g. one matching your backend model
        return OpenAITokenCounter(ModelType.GPT_4O_MINI)

    @property
    def token_limit(self):
        return 1000  # Or any other limit

    def create_context(self, records):
        # Implement your context creation logic: select and order
        # records, convert them to messages, and return them along
        # with the total token count
        pass
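
The custom creator can then be passed anywhere a context creator is expected, for example when constructing LongtermAgentMemory as above (a sketch reusing the earlier setup):

[ ]:
memory = LongtermAgentMemory(
    context_creator=MyCustomContextCreator(),
    chat_history_block=ChatHistoryBlock(),
    vector_db_block=VectorDBBlock(),
)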

Customizing Vector Database Block#

You can customize VectorDBBlock by adjusting the embedding model or vector storage it uses:

[ ]:
from camel.embeddings import OpenAIEmbedding
from camel.memories import VectorDBBlock
from camel.storages import QdrantStorage

embedding = OpenAIEmbedding()

vector_db = VectorDBBlock(
    embedding=embedding,
    storage=QdrantStorage(vector_dim=embedding.get_output_dim()),
)
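
This customized block can then be supplied as the vector_db_block argument when constructing LongtermAgentMemory, exactly as in the earlier setup.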

Performance Considerations#

  • For large-scale applications, consider using persistent storage backends instead of in-memory storage (see the sketch after this list).

  • Optimize your context creator to balance between context relevance and token limits.

  • When using VectorDBMemory, be mindful of the trade-off between retrieval accuracy and speed as the database grows.
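
As a sketch of the first point, QdrantStorage can keep vectors on disk rather than in memory. The path argument below is an assumption about the local-persistence option; check the QdrantStorage parameters in your CAMEL version.

[ ]:
from camel.embeddings import OpenAIEmbedding
from camel.memories import VectorDBBlock
from camel.storages import QdrantStorage

embedding = OpenAIEmbedding()

# Persist vectors to a local path so records survive restarts
# (the path option is assumed; verify it in your version)
persistent_block = VectorDBBlock(
    embedding=embedding,
    storage=QdrantStorage(
        vector_dim=embedding.get_output_dim(),
        path="local_data/camel_memory",
    ),
)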