camel.memories package#
Submodules#
camel.memories.agent_memories module#
- class camel.memories.agent_memories.ChatHistoryMemory(context_creator: BaseContextCreator, storage: BaseKeyValueStorage | None = None, window_size: int | None = None)[source]#
Bases: AgentMemory
An agent memory wrapper of ChatHistoryBlock.
- Parameters:
context_creator (BaseContextCreator) – A model context creator.
storage (BaseKeyValueStorage, optional) – A storage backend for storing chat history. If None, an InMemoryKeyValueStorage will be used. (default: None)
window_size (int, optional) – The number of recent chat messages to retrieve. If not provided, the entire chat history will be retrieved. (default: None)
- get_context_creator() BaseContextCreator [source]#
Gets context creator.
- Returns:
A model context creator.
- Return type:
BaseContextCreator
- retrieve() List[ContextRecord] [source]#
Get a record list from the memory for creating model context.
- Returns:
A record list for creating model context.
- Return type:
List[ContextRecord]
- write_records(records: List[MemoryRecord]) None [source]#
Writes records to the memory, appending them to existing ones.
- Parameters:
records (List[MemoryRecord]) – Records to be added to the memory.
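A minimal usage sketch for ChatHistoryMemory follows. The ChatHistoryMemory, ScoreBasedContextCreator, MemoryRecord, and OpenAIBackendRole names are documented on this page; the import paths for BaseMessage, OpenAITokenCounter, and ModelType, as well as the specific model type, are assumptions based on recent CAMEL releases and may differ in your version.

```python
# Sketch only: import paths for BaseMessage, OpenAITokenCounter, and ModelType
# are assumed; adjust the model type to one available in your CAMEL version.
from camel.memories import ChatHistoryMemory, MemoryRecord, ScoreBasedContextCreator
from camel.messages import BaseMessage
from camel.types import ModelType, OpenAIBackendRole
from camel.utils import OpenAITokenCounter

# Keep only the 5 most recent messages and cap the context at 1024 tokens.
memory = ChatHistoryMemory(
    context_creator=ScoreBasedContextCreator(
        token_counter=OpenAITokenCounter(ModelType.GPT_4O_MINI),
        token_limit=1024,
    ),
    window_size=5,
)

# Write a single user message into the chat history.
memory.write_record(
    MemoryRecord(
        message=BaseMessage.make_user_message(
            role_name="User", content="What is CAMEL-AI?"
        ),
        role_at_backend=OpenAIBackendRole.USER,
    )
)

# get_context() returns the messages in OpenAI format plus the token count.
messages, token_count = memory.get_context()
```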
- class camel.memories.agent_memories.LongtermAgentMemory(context_creator: BaseContextCreator, chat_history_block: ChatHistoryBlock | None = None, vector_db_block: VectorDBBlock | None = None, retrieve_limit: int = 3)[source]#
Bases: AgentMemory
An implementation of the AgentMemory abstract base class for augmenting ChatHistoryMemory with VectorDBMemory.
- Parameters:
context_creator (BaseContextCreator) – A model context creator.
chat_history_block (Optional[ChatHistoryBlock], optional) – A chat history block. If None, a ChatHistoryBlock will be used. (default: None)
vector_db_block (Optional[VectorDBBlock], optional) – A vector database block. If None, a VectorDBBlock will be used. (default: None)
retrieve_limit (int, optional) – The maximum number of messages to be added into the context. (default: 3)
- get_context_creator() BaseContextCreator [source]#
Returns the context creator used by the memory.
- Returns:
The context creator used by the memory.
- Return type:
BaseContextCreator
- retrieve() List[ContextRecord] [source]#
Retrieves context records from both the chat history and the vector database.
- Returns:
- A list of context records retrieved from both
the chat history and the vector database.
- Return type:
List[ContextRecord]
- write_records(records: List[MemoryRecord]) None [source]#
Converts the provided chat messages into vector representations and writes them to the vector database.
- Parameters:
records (List[MemoryRecord]) – Messages to be added to the vector database.
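The sketch below wires both blocks together explicitly. It assumes the same import paths as the previous example and that the default VectorDBBlock embeds messages with an OpenAI embedding backend, so an OpenAI API key would be needed at write time.

```python
# Sketch only: helper import paths and the default embedding backend are
# assumptions; the constructor arguments themselves follow the docs above.
from camel.memories import (
    ChatHistoryBlock,
    LongtermAgentMemory,
    ScoreBasedContextCreator,
    VectorDBBlock,
)
from camel.types import ModelType
from camel.utils import OpenAITokenCounter

memory = LongtermAgentMemory(
    context_creator=ScoreBasedContextCreator(
        token_counter=OpenAITokenCounter(ModelType.GPT_4O_MINI),
        token_limit=2048,
    ),
    chat_history_block=ChatHistoryBlock(keep_rate=0.9),
    vector_db_block=VectorDBBlock(),  # in-memory Qdrant by default
    retrieve_limit=3,                 # at most 3 vector-DB hits per retrieval
)

# write_records() stores records in both blocks; retrieve() later merges the
# recent chat history with the most similar vector-DB entries.
```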
- class camel.memories.agent_memories.VectorDBMemory(context_creator: BaseContextCreator, storage: BaseVectorStorage | None = None, retrieve_limit: int = 3)[source]#
Bases: AgentMemory
An agent memory wrapper of VectorDBBlock. This memory queries messages stored in the vector database. Notice that the most recent messages will not be added to the context.
- Parameters:
context_creator (BaseContextCreator) – A model context creator.
storage (BaseVectorStorage, optional) – A vector storage backend. If None, a QdrantStorage will be used. (default: None)
retrieve_limit (int, optional) – The maximum number of messages to be added into the context. (default: 3)
- get_context_creator() BaseContextCreator [source]#
Gets context creator.
- Returns:
A model context creator.
- Return type:
BaseContextCreator
- retrieve() List[ContextRecord] [source]#
Get a record list from the memory for creating model context.
- Returns:
A record list for creating model context.
- Return type:
List[ContextRecord]
- write_records(records: List[MemoryRecord]) None [source]#
Writes records to the memory, appending them to existing ones.
- Parameters:
records (List[MemoryRecord]) – Records to be added to the memory.
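A short sketch of standalone VectorDBMemory use, under the same assumptions about import paths and the default QdrantStorage plus OpenAI embedding backend:

```python
# Sketch only: the default storage/embedding backends are assumptions noted
# in the docs above; an OpenAI API key is assumed for embedding calls.
from camel.memories import ScoreBasedContextCreator, VectorDBMemory
from camel.types import ModelType
from camel.utils import OpenAITokenCounter

memory = VectorDBMemory(
    context_creator=ScoreBasedContextCreator(
        token_counter=OpenAITokenCounter(ModelType.GPT_4O_MINI),
        token_limit=1024,
    ),
    retrieve_limit=3,  # at most 3 similar messages enter the context
)

# After memory.write_records(...), get_context() is built from the
# retrieve_limit most similar stored messages, not the most recent ones.
```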
camel.memories.base module#
- class camel.memories.base.AgentMemory[source]#
Bases: MemoryBlock, ABC
Represents a specialized form of MemoryBlock, uniquely designed for direct integration with an agent. Two key abstract functions, “retrieve” and “get_context_creator”, are used for generating model context based on the memory records stored within the AgentMemory.
- get_context() Tuple[List[ChatCompletionSystemMessageParam | ChatCompletionUserMessageParam | ChatCompletionAssistantMessageParam | ChatCompletionToolMessageParam | ChatCompletionFunctionMessageParam], int] [source]#
Gets chat context with a proper size for the agent from the memory.
- Returns:
- A tuple containing the constructed
context in OpenAIMessage format and the total token count.
- Return type:
(List[OpenAIMessage], int)
- abstract get_context_creator() BaseContextCreator [source]#
Gets context creator.
- Returns:
A model context creator.
- Return type:
BaseContextCreator
- abstract retrieve() List[ContextRecord] [source]#
Get a record list from the memory for creating model context.
- Returns:
A record list for creating model context.
- Return type:
List[ContextRecord]
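Because retrieve() and get_context_creator() are the only AgentMemory-specific abstract methods, a custom memory can be small. The sketch below is an assumption-level illustration (it also implements clear(), which the MemoryBlock description mentions but which is not listed on this page):

```python
# Sketch only: a toy AgentMemory backed by a plain list, returning every
# stored record with a flat score of 1.0.
from typing import List

from camel.memories import (
    AgentMemory,
    BaseContextCreator,
    ContextRecord,
    MemoryRecord,
)


class ListMemory(AgentMemory):
    def __init__(self, context_creator: BaseContextCreator) -> None:
        self._context_creator = context_creator
        self._records: List[MemoryRecord] = []

    def get_context_creator(self) -> BaseContextCreator:
        return self._context_creator

    def retrieve(self) -> List[ContextRecord]:
        # Offer every stored record to the context creator with score 1.0.
        return [ContextRecord(memory_record=r, score=1.0) for r in self._records]

    def write_records(self, records: List[MemoryRecord]) -> None:
        self._records.extend(records)

    def clear(self) -> None:
        self._records = []
```

The inherited get_context() then combines retrieve() with the context creator to produce the final OpenAI-format message list and token count.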
- class camel.memories.base.BaseContextCreator[source]#
Bases: ABC
An abstract base class defining the interface for context creation strategies.
This class provides a foundational structure for different strategies to generate conversational context from a list of context records. The primary goal is to create a context that is aligned with a specified token count limit, allowing subclasses to define their specific approach.
Subclasses should implement the token_counter, token_limit, and create_context methods to provide specific context creation logic.
- token_counter#
A token counter instance responsible for counting tokens in a message.
- Type:
BaseTokenCounter
- token_limit#
The maximum number of tokens allowed in the generated context.
- Type:
int
- abstract create_context(records: List[ContextRecord]) Tuple[List[ChatCompletionSystemMessageParam | ChatCompletionUserMessageParam | ChatCompletionAssistantMessageParam | ChatCompletionToolMessageParam | ChatCompletionFunctionMessageParam], int] [source]#
An abstract method to create conversational context from the chat history.
Constructs the context from the provided records. The specifics of how this is done and how the token count is managed should be provided by subclasses implementing this method. The order of the output messages should be the same as the order of the input records.
- Parameters:
records (List[ContextRecord]) – A list of context records from which to generate the context.
- Returns:
- A tuple containing the constructed
context in OpenAIMessage format and the total token count.
- Return type:
Tuple[List[OpenAIMessage], int]
- abstract property token_counter: BaseTokenCounter#
- abstract property token_limit: int#
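A custom strategy only has to expose the two properties and create_context(). The sketch below truncates from the oldest end instead of scoring; it assumes OpenAIMessage and BaseTokenCounter are importable from camel.messages and camel.utils, and that the counter provides count_tokens_from_messages(), as in recent CAMEL releases.

```python
# Sketch only: a front-truncating context creator; count_tokens_from_messages
# and the import paths are assumptions about the installed version.
from typing import List, Tuple

from camel.memories import BaseContextCreator, ContextRecord
from camel.messages import OpenAIMessage
from camel.utils import BaseTokenCounter


class TruncatingContextCreator(BaseContextCreator):
    def __init__(self, token_counter: BaseTokenCounter, token_limit: int) -> None:
        self._token_counter = token_counter
        self._token_limit = token_limit

    @property
    def token_counter(self) -> BaseTokenCounter:
        return self._token_counter

    @property
    def token_limit(self) -> int:
        return self._token_limit

    def create_context(
        self, records: List[ContextRecord]
    ) -> Tuple[List[OpenAIMessage], int]:
        # Keep the input order; drop the oldest messages until the limit fits.
        messages = [r.memory_record.to_openai_message() for r in records]
        while messages and (
            self._token_counter.count_tokens_from_messages(messages)
            > self._token_limit
        ):
            messages.pop(0)
        return messages, self._token_counter.count_tokens_from_messages(messages)
```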
- class camel.memories.base.MemoryBlock[source]#
Bases: ABC
An abstract class that serves as the fundamental component within the agent memory system. This class is equipped with “write” and “clear” functions. However, it intentionally does not define a retrieval interface, as the structure of the data to be retrieved may vary in different types of memory blocks.
- write_record(record: MemoryRecord) None [source]#
Writes a record to the memory, appending it to existing ones.
- Parameters:
record (MemoryRecord) – Record to be added to the memory.
- abstract write_records(records: List[MemoryRecord]) None [source]#
Writes records to the memory, appending them to existing ones.
- Parameters:
records (List[MemoryRecord]) – Records to be added to the memory.
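The split between the concrete write_record() and the abstract write_records() means a subclass only implements the batched write. The sketch below shows the likely relationship as an assumption based on the docstrings above, not a copy of the library's implementation:

```python
# Sketch only: a minimal MemoryBlock; write_record(record) on the base class
# presumably just calls write_records([record]).
from typing import List

from camel.memories import MemoryBlock, MemoryRecord


class ListMemoryBlock(MemoryBlock):
    """A MemoryBlock backed by a plain Python list."""

    def __init__(self) -> None:
        self._records: List[MemoryRecord] = []

    def write_records(self, records: List[MemoryRecord]) -> None:
        self._records.extend(records)

    def clear(self) -> None:
        self._records = []
```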
camel.memories.records module#
- class camel.memories.records.ContextRecord(*, memory_record: MemoryRecord, score: float)[source]#
Bases: BaseModel
The result of memory retrieval.
- memory_record: MemoryRecord#
- model_computed_fields: ClassVar[Dict[str, ComputedFieldInfo]] = {}#
A dictionary of computed field names and their corresponding ComputedFieldInfo objects.
- model_config: ClassVar[ConfigDict] = {}#
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- model_fields: ClassVar[Dict[str, FieldInfo]] = {'memory_record': FieldInfo(annotation=MemoryRecord, required=True), 'score': FieldInfo(annotation=float, required=True)}#
Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo] objects.
This replaces Model.__fields__ from Pydantic V1.
- score: float#
- class camel.memories.records.MemoryRecord(*, message: BaseMessage, role_at_backend: OpenAIBackendRole, uuid: UUID = None, extra_info: Dict[str, str] = None)[source]#
Bases: BaseModel
The basic message storing unit in the CAMEL memory system.
- message#
The main content of the record.
- Type:
BaseMessage
- role_at_backend#
An enumeration value representing the role this message played at the OpenAI backend. Note that this value is different from the RoleType used in the CAMEL role playing system.
- Type:
OpenAIBackendRole
- uuid#
A universally unique identifier for this record. This is used to uniquely identify this record in the memory system. If not given, it will be assigned with a random UUID.
- Type:
UUID, optional
- extra_info#
A dictionary of additional key-value pairs that provide more information. If not given, it will be an empty Dict.
- Type:
Dict[str, str], optional
- extra_info: Dict[str, str]#
- classmethod from_dict(record_dict: Dict[str, Any]) MemoryRecord [source]#
Reconstruct a MemoryRecord from the input dict.
- Parameters:
record_dict (Dict[str, Any]) – A dict generated by to_dict().
- message: BaseMessage#
- model_computed_fields: ClassVar[Dict[str, ComputedFieldInfo]] = {}#
A dictionary of computed field names and their corresponding ComputedFieldInfo objects.
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}#
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- model_fields: ClassVar[Dict[str, FieldInfo]] = {'extra_info': FieldInfo(annotation=Dict[str, str], required=False, default_factory=dict), 'message': FieldInfo(annotation=BaseMessage, required=True), 'role_at_backend': FieldInfo(annotation=OpenAIBackendRole, required=True), 'uuid': FieldInfo(annotation=UUID, required=False, default_factory=uuid4)}#
Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo] objects.
This replaces Model.__fields__ from Pydantic V1.
- role_at_backend: OpenAIBackendRole#
- to_dict() Dict[str, Any] [source]#
Convert the MemoryRecord to a dict for serialization purposes.
- to_openai_message() ChatCompletionSystemMessageParam | ChatCompletionUserMessageParam | ChatCompletionAssistantMessageParam | ChatCompletionToolMessageParam | ChatCompletionFunctionMessageParam [source]#
Converts the record to an OpenAIMessage object.
- uuid: UUID#
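A small sketch of creating and serializing a record. BaseMessage.make_assistant_message and the camel.types path for OpenAIBackendRole are assumptions based on recent CAMEL releases.

```python
# Sketch only: helper constructors and import paths are assumed.
from camel.memories import MemoryRecord
from camel.messages import BaseMessage
from camel.types import OpenAIBackendRole

record = MemoryRecord(
    message=BaseMessage.make_assistant_message(
        role_name="Assistant", content="CAMEL is a multi-agent framework."
    ),
    role_at_backend=OpenAIBackendRole.ASSISTANT,
    extra_info={"topic": "introduction"},
)

# Convert to the OpenAI chat format expected by model backends.
openai_msg = record.to_openai_message()

# Round-trip through a plain dict; the uuid is expected to survive.
restored = MemoryRecord.from_dict(record.to_dict())
assert restored.uuid == record.uuid
```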
Module contents#
- class camel.memories.AgentMemory[source]#
Bases: MemoryBlock, ABC
Represents a specialized form of MemoryBlock, uniquely designed for direct integration with an agent. Two key abstract functions, “retrieve” and “get_context_creator”, are used for generating model context based on the memory records stored within the AgentMemory.
- get_context() Tuple[List[ChatCompletionSystemMessageParam | ChatCompletionUserMessageParam | ChatCompletionAssistantMessageParam | ChatCompletionToolMessageParam | ChatCompletionFunctionMessageParam], int] [source]#
Gets chat context with a proper size for the agent from the memory.
- Returns:
- A tuple containing the constructed
context in OpenAIMessage format and the total token count.
- Return type:
(List[OpenAIMessage], int)
- abstract get_context_creator() BaseContextCreator [source]#
Gets context creator.
- Returns:
A model context creator.
- Return type:
BaseContextCreator
- abstract retrieve() List[ContextRecord] [source]#
Get a record list from the memory for creating model context.
- Returns:
A record list for creating model context.
- Return type:
List[ContextRecord]
- class camel.memories.BaseContextCreator[source]#
Bases: ABC
An abstract base class defining the interface for context creation strategies.
This class provides a foundational structure for different strategies to generate conversational context from a list of context records. The primary goal is to create a context that is aligned with a specified token count limit, allowing subclasses to define their specific approach.
Subclasses should implement the token_counter, token_limit, and create_context methods to provide specific context creation logic.
- token_counter#
A token counter instance responsible for counting tokens in a message.
- Type:
BaseTokenCounter
- token_limit#
The maximum number of tokens allowed in the generated context.
- Type:
int
- abstract create_context(records: List[ContextRecord]) Tuple[List[ChatCompletionSystemMessageParam | ChatCompletionUserMessageParam | ChatCompletionAssistantMessageParam | ChatCompletionToolMessageParam | ChatCompletionFunctionMessageParam], int] [source]#
An abstract method to create conversational context from the chat history.
Constructs the context from the provided records. The specifics of how this is done and how the token count is managed should be provided by subclasses implementing this method. The order of the output messages should be the same as the order of the input records.
- Parameters:
records (List[ContextRecord]) – A list of context records from which to generate the context.
- Returns:
- A tuple containing the constructed
context in OpenAIMessage format and the total token count.
- Return type:
Tuple[List[OpenAIMessage], int]
- abstract property token_counter: BaseTokenCounter#
- abstract property token_limit: int#
- class camel.memories.ChatHistoryBlock(storage: BaseKeyValueStorage | None = None, keep_rate: float = 0.9)[source]#
Bases: MemoryBlock
An implementation of the MemoryBlock abstract base class for maintaining a record of chat histories.
This memory block helps manage conversation histories with a key-value storage backend, either provided by the user or using a default in-memory storage. It offers a windowed approach to retrieving chat histories, allowing users to specify how many recent messages they’d like to fetch.
- Parameters:
storage (BaseKeyValueStorage, optional) – A storage mechanism for storing chat history. If None, an InMemoryKeyValueStorage will be used. (default: None)
keep_rate (float, optional) – In historical messages, the score of the last message is 1.0, and with each step taken backward, the score of a message is multiplied by keep_rate. A higher keep_rate leads to a higher probability of keeping history messages during context creation. (default: 0.9)
- retrieve(window_size: int | None = None) List[ContextRecord] [source]#
Retrieves records with a proper size for the agent from the memory based on the window size or fetches the entire chat history if no window size is specified.
- Parameters:
window_size (int, optional) – Specifies the number of recent chat messages to retrieve. If not provided, the entire chat history will be retrieved. (default: None)
- Returns:
A list of retrieved records.
- Return type:
List[ContextRecord]
- write_records(records: List[MemoryRecord]) None [source]#
Writes memory records to the memory. Additionally, performs validation checks on the messages.
- Parameters:
records (List[MemoryRecord]) – Memory records to be added to the memory.
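A windowed-retrieval sketch for ChatHistoryBlock, under the same import-path assumptions as the earlier examples:

```python
# Sketch only: BaseMessage and OpenAIBackendRole import paths are assumed.
from camel.memories import ChatHistoryBlock, MemoryRecord
from camel.messages import BaseMessage
from camel.types import OpenAIBackendRole

block = ChatHistoryBlock(keep_rate=0.9)  # in-memory key-value storage by default
block.write_records(
    [
        MemoryRecord(
            message=BaseMessage.make_user_message(role_name="User", content=text),
            role_at_backend=OpenAIBackendRole.USER,
        )
        for text in ("first message", "second message", "third message")
    ]
)

# Fetch only the two most recent messages; older ones would have their scores
# scaled down by keep_rate during context creation.
recent = block.retrieve(window_size=2)
for ctx in recent:
    print(ctx.score, ctx.memory_record.message.content)
```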
- class camel.memories.ChatHistoryMemory(context_creator: BaseContextCreator, storage: BaseKeyValueStorage | None = None, window_size: int | None = None)[source]#
Bases: AgentMemory
An agent memory wrapper of ChatHistoryBlock.
- Parameters:
context_creator (BaseContextCreator) – A model context creator.
storage (BaseKeyValueStorage, optional) – A storage backend for storing chat history. If None, an InMemoryKeyValueStorage will be used. (default: None)
window_size (int, optional) – The number of recent chat messages to retrieve. If not provided, the entire chat history will be retrieved. (default: None)
- get_context_creator() BaseContextCreator [source]#
Gets context creator.
- Returns:
A model context creator.
- Return type:
BaseContextCreator
- retrieve() List[ContextRecord] [source]#
Get a record list from the memory for creating model context.
- Returns:
A record list for creating model context.
- Return type:
List[ContextRecord]
- write_records(records: List[MemoryRecord]) None [source]#
Writes records to the memory, appending them to existing ones.
- Parameters:
records (List[MemoryRecord]) – Records to be added to the memory.
- class camel.memories.ContextRecord(*, memory_record: MemoryRecord, score: float)[source]#
Bases: BaseModel
The result of memory retrieval.
- memory_record: MemoryRecord#
- model_computed_fields: ClassVar[Dict[str, ComputedFieldInfo]] = {}#
A dictionary of computed field names and their corresponding ComputedFieldInfo objects.
- model_config: ClassVar[ConfigDict] = {}#
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- model_fields: ClassVar[Dict[str, FieldInfo]] = {'memory_record': FieldInfo(annotation=MemoryRecord, required=True), 'score': FieldInfo(annotation=float, required=True)}#
Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo] objects.
This replaces Model.__fields__ from Pydantic V1.
- score: float#
- class camel.memories.LongtermAgentMemory(context_creator: BaseContextCreator, chat_history_block: ChatHistoryBlock | None = None, vector_db_block: VectorDBBlock | None = None, retrieve_limit: int = 3)[source]#
Bases: AgentMemory
An implementation of the AgentMemory abstract base class for augmenting ChatHistoryMemory with VectorDBMemory.
- Parameters:
context_creator (BaseContextCreator) – A model context creator.
chat_history_block (Optional[ChatHistoryBlock], optional) – A chat history block. If None, a ChatHistoryBlock will be used. (default: None)
vector_db_block (Optional[VectorDBBlock], optional) – A vector database block. If None, a VectorDBBlock will be used. (default: None)
retrieve_limit (int, optional) – The maximum number of messages to be added into the context. (default: 3)
- get_context_creator() BaseContextCreator [source]#
Returns the context creator used by the memory.
- Returns:
The context creator used by the memory.
- Return type:
BaseContextCreator
- retrieve() List[ContextRecord] [source]#
Retrieves context records from both the chat history and the vector database.
- Returns:
- A list of context records retrieved from both
the chat history and the vector database.
- Return type:
List[ContextRecord]
- write_records(records: List[MemoryRecord]) None [source]#
Converts the provided chat messages into vector representations and writes them to the vector database.
- Parameters:
records (List[MemoryRecord]) – Messages to be added to the vector database.
- class camel.memories.MemoryBlock[source]#
Bases: ABC
An abstract class that serves as the fundamental component within the agent memory system. This class is equipped with “write” and “clear” functions. However, it intentionally does not define a retrieval interface, as the structure of the data to be retrieved may vary in different types of memory blocks.
- write_record(record: MemoryRecord) None [source]#
Writes a record to the memory, appending it to existing ones.
- Parameters:
record (MemoryRecord) – Record to be added to the memory.
- abstract write_records(records: List[MemoryRecord]) None [source]#
Writes records to the memory, appending them to existing ones.
- Parameters:
records (List[MemoryRecord]) – Records to be added to the memory.
- class camel.memories.MemoryRecord(*, message: BaseMessage, role_at_backend: OpenAIBackendRole, uuid: UUID = None, extra_info: Dict[str, str] = None)[source]#
Bases: BaseModel
The basic message storing unit in the CAMEL memory system.
- message#
The main content of the record.
- Type:
BaseMessage
- role_at_backend#
An enumeration value representing the role this message played at the OpenAI backend. Note that this value is different from the RoleType used in the CAMEL role playing system.
- Type:
OpenAIBackendRole
- uuid#
A universally unique identifier for this record. This is used to uniquely identify this record in the memory system. If not given, it will be assigned with a random UUID.
- Type:
UUID, optional
- extra_info#
A dictionary of additional key-value pairs that provide more information. If not given, it will be an empty Dict.
- Type:
Dict[str, str], optional
- extra_info: Dict[str, str]#
- classmethod from_dict(record_dict: Dict[str, Any]) MemoryRecord [source]#
Reconstruct a MemoryRecord from the input dict.
- Parameters:
record_dict (Dict[str, Any]) – A dict generated by to_dict().
- message: BaseMessage#
- model_computed_fields: ClassVar[Dict[str, ComputedFieldInfo]] = {}#
A dictionary of computed field names and their corresponding ComputedFieldInfo objects.
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True}#
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- model_fields: ClassVar[Dict[str, FieldInfo]] = {'extra_info': FieldInfo(annotation=Dict[str, str], required=False, default_factory=dict), 'message': FieldInfo(annotation=BaseMessage, required=True), 'role_at_backend': FieldInfo(annotation=OpenAIBackendRole, required=True), 'uuid': FieldInfo(annotation=UUID, required=False, default_factory=uuid4)}#
Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo] objects.
This replaces Model.__fields__ from Pydantic V1.
- role_at_backend: OpenAIBackendRole#
- to_dict() Dict[str, Any] [source]#
Convert the MemoryRecord to a dict for serialization purposes.
- to_openai_message() ChatCompletionSystemMessageParam | ChatCompletionUserMessageParam | ChatCompletionAssistantMessageParam | ChatCompletionToolMessageParam | ChatCompletionFunctionMessageParam [source]#
Converts the record to an OpenAIMessage object.
- uuid: UUID#
- class camel.memories.ScoreBasedContextCreator(token_counter: BaseTokenCounter, token_limit: int)[source]#
Bases: BaseContextCreator
A default implementation of context creation strategy, which inherits from BaseContextCreator.
This class provides a strategy to generate a conversational context from a list of chat history records while ensuring the total token count of the context does not exceed a specified limit. It prunes messages based on their score if the total token count exceeds the limit.
- Parameters:
token_counter (BaseTokenCounter) – An instance responsible for counting tokens in a message.
token_limit (int) – The maximum number of tokens allowed in the generated context.
- create_context(records: List[ContextRecord]) Tuple[List[ChatCompletionSystemMessageParam | ChatCompletionUserMessageParam | ChatCompletionAssistantMessageParam | ChatCompletionToolMessageParam | ChatCompletionFunctionMessageParam], int] [source]#
Creates conversational context from chat history while respecting token limits.
Constructs the context from the provided records and ensures that the total token count does not exceed the specified limit by pruning the lowest-scoring messages if necessary.
- Parameters:
records (List[ContextRecord]) – A list of message records from which to generate the context.
- Returns:
- A tuple containing the constructed
context in OpenAIMessage format and the total token count.
- Return type:
Tuple[List[OpenAIMessage], int]
- Raises:
RuntimeError – If it’s impossible to create a valid context without exceeding the token limit.
- property token_counter: BaseTokenCounter#
- property token_limit: int#
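A sketch of using ScoreBasedContextCreator directly on a hand-built record list; the token counter imports and model type are assumptions, while the pruning behaviour follows the description above:

```python
# Sketch only: low-score records are pruned first if the token budget is
# exceeded; a RuntimeError is raised only if no valid context fits the limit.
from camel.memories import ContextRecord, MemoryRecord, ScoreBasedContextCreator
from camel.messages import BaseMessage
from camel.types import ModelType, OpenAIBackendRole
from camel.utils import OpenAITokenCounter

creator = ScoreBasedContextCreator(
    token_counter=OpenAITokenCounter(ModelType.GPT_4O_MINI),
    token_limit=512,
)

records = [
    ContextRecord(
        memory_record=MemoryRecord(
            message=BaseMessage.make_user_message(role_name="User", content=text),
            role_at_backend=OpenAIBackendRole.USER,
        ),
        score=score,
    )
    for text, score in [("an old detail", 0.3), ("the latest question", 1.0)]
]

messages, token_count = creator.create_context(records)
```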
- class camel.memories.VectorDBBlock(storage: BaseVectorStorage | None = None, embedding: BaseEmbedding | None = None)[source]#
Bases: MemoryBlock
An implementation of the MemoryBlock abstract base class for maintaining and retrieving information using vector embeddings within a vector database.
- Parameters:
storage (Optional[BaseVectorStorage], optional) – The storage mechanism for the vector database. Defaults to in-memory Qdrant if not provided. (default: None)
embedding (Optional[BaseEmbedding], optional) – Embedding mechanism to convert chat messages into vector representations. Defaults to OpenAiEmbedding if not provided. (default: None)
- retrieve(keyword: str, limit: int = 3) List[ContextRecord] [source]#
Retrieves similar records from the vector database based on the content of the keyword.
- Parameters:
keyword (str) – This string will be converted into a vector representation to query the database.
limit (int, optional) – The maximum number of similar messages to retrieve. (default: 3)
- Returns:
A list of memory records retrieved from the vector database based on similarity to the keyword.
- Return type:
List[ContextRecord]
- write_records(records: List[MemoryRecord]) None [source]#
Converts the provided chat messages into vector representations and writes them to the vector database.
- Parameters:
records (List[MemoryRecord]) – Memory records to be added to the memory.
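A keyword-retrieval sketch for VectorDBBlock. The defaults imply network calls to an embedding service, so an OpenAI API key is assumed to be configured; import paths for the helpers are assumptions as well.

```python
# Sketch only: default in-memory Qdrant storage plus OpenAI embeddings.
from camel.memories import MemoryRecord, VectorDBBlock
from camel.messages import BaseMessage
from camel.types import OpenAIBackendRole

block = VectorDBBlock()
block.write_records(
    [
        MemoryRecord(
            message=BaseMessage.make_user_message(role_name="User", content=text),
            role_at_backend=OpenAIBackendRole.USER,
        )
        for text in ("Qdrant stores vectors.", "CAMEL agents can role-play.")
    ]
)

# Retrieve up to 2 records most similar to the keyword string.
hits = block.retrieve(keyword="vector database", limit=2)
for ctx in hits:
    print(round(ctx.score, 3), ctx.memory_record.message.content)
```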
- class camel.memories.VectorDBMemory(context_creator: BaseContextCreator, storage: BaseVectorStorage | None = None, retrieve_limit: int = 3)[source]#
Bases: AgentMemory
An agent memory wrapper of VectorDBBlock. This memory queries messages stored in the vector database. Notice that the most recent messages will not be added to the context.
- Parameters:
context_creator (BaseContextCreator) – A model context creator.
storage (BaseVectorStorage, optional) – A vector storage backend. If None, a QdrantStorage will be used. (default: None)
retrieve_limit (int, optional) – The maximum number of messages to be added into the context. (default: 3)
- get_context_creator() BaseContextCreator [source]#
Gets context creator.
- Returns:
A model context creator.
- Return type:
BaseContextCreator
- retrieve() List[ContextRecord] [source]#
Get a record list from the memory for creating model context.
- Returns:
A record list for creating model context.
- Return type:
List[ContextRecord]
- write_records(records: List[MemoryRecord]) None [source]#
Writes records to the memory, appending them to existing ones.
- Parameters:
records (List[MemoryRecord]) – Records to be added to the memory.