
Ability to Persist Messages in External Stores #530

Open
izzyacademy opened this issue Dec 23, 2024 · 9 comments
Labels
enhancement New feature or request

Comments

@izzyacademy
Contributor

I am wondering if there is a plan to add the ability to persist messages in remote stores such as Redis or document stores, instead of using plain memory to cache the messages [1]. There are scenarios where application memory alone is not sufficient and we need to cache the state in a remote store and pick things up later.

If this is already possible and I just have to add a plugin or dependency to do this, please let me know as well.

Thanks.

[1] https://ai.pydantic.dev/message-history/

@sydney-runkle
Member

I think this is something that could be supported in user code for the most part, though we'd be happy to consider support for loading externally stored message history.

@izzyacademy
Contributor Author

Thank you Sydney for sharing your thoughts. I appreciate it.

I think it would be good to have an interface or abstract parent class that users can implement or extend with their own persistent store (Redis, MongoDB, Postgres, MySQL, etc.) to cache these messages. This could then be injected and used internally to cache the message history.

This will help with checkpointing and caching in the event of errors or restarts, so we don't lose all the messages in the history so far. It will also simplify things so that the user does not have to spend too much energy figuring out this persistence implementation.

Those were my thoughts behind this suggestion.

I am open to collaborating to bring this into the framework

@sydney-runkle sydney-runkle added the enhancement New feature or request label Dec 23, 2024
@samuelcolvin
Member

Hi @izzyacademy I agree we need this.

I think we need an ABC, with implementations for:

  • memory
  • file
  • sqlite
  • postgres

and an example of a custom implementation. I would need very strong evidence that people want another specific database before we implement it.

I think it's best if I have a crack at a first implementation of this, since I'll know how I want it to work.
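For illustration only, the file-backed variant could persist each message as one JSON document per line. This is a hypothetical sketch, shown synchronously for brevity (the ABC discussed later in this thread is async); `FileStore` and its methods are not part of any planned API:

```python
import json
from pathlib import Path
from typing import Any


class FileStore:
    """Hypothetical file-backed message store: one JSON document per line."""

    def __init__(self, path: str):
        self.path = Path(path)
        self.path.touch(exist_ok=True)  # create the file if it does not exist

    def append_entry(self, entry: Any) -> None:
        # Append a single message as one JSON line.
        with self.path.open("a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")

    def get_all_entries(self) -> list[Any]:
        # Read every JSON line back into Python objects.
        with self.path.open(encoding="utf-8") as f:
            return [json.loads(line) for line in f if line.strip()]

    def count(self) -> int:
        return len(self.get_all_entries())
```

An append-only JSON-lines file keeps writes cheap and makes the history trivially inspectable, at the cost of reading the whole file for queries.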

@izzyacademy
Contributor Author

@samuelcolvin thanks for the update. I will wait for your initial draft and then share my feedback. After you start with the memory and SQLite implementations, I can work on Redis and PostgreSQL.

I believe there will be a strong need for Redis as the synchronous and async libraries have 45M and 800K downloads per month respectively. I will check back later on this.
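To ground the SQLite case, a minimal synchronous sketch using only the standard-library sqlite3 module might look like the following. The class name, schema, and method names here are purely hypothetical, not a planned API:

```python
import json
import sqlite3
from typing import Any


class SQLiteStore:
    """Hypothetical SQLite-backed message store (synchronous for brevity)."""

    def __init__(self, path: str, conversation_id: str):
        self.conversation_id = conversation_id
        self.conn = sqlite3.connect(path)
        with self.conn:  # commit the DDL
            self.conn.execute(
                "CREATE TABLE IF NOT EXISTS messages ("
                "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
                "  conversation_id TEXT NOT NULL,"
                "  payload TEXT NOT NULL)"
            )

    def append_entry(self, entry: Any) -> None:
        # Store each message as a JSON payload, keyed by conversation.
        with self.conn:
            self.conn.execute(
                "INSERT INTO messages (conversation_id, payload) VALUES (?, ?)",
                (self.conversation_id, json.dumps(entry)),
            )

    def get_all_entries(self) -> list[Any]:
        rows = self.conn.execute(
            "SELECT payload FROM messages WHERE conversation_id = ? ORDER BY id",
            (self.conversation_id,),
        ).fetchall()
        return [json.loads(r[0]) for r in rows]

    def count(self) -> int:
        (n,) = self.conn.execute(
            "SELECT COUNT(*) FROM messages WHERE conversation_id = ?",
            (self.conversation_id,),
        ).fetchone()
        return n
```

Keying rows by `conversation_id` lets one database file hold many conversations, which a Redis variant would typically achieve with one list per conversation key instead.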

@sandeep540

Hi @izzyacademy @samuelcolvin, we are also looking for a similar type of implementation, and Redis is preferred, for short-term memory, user preferences, user details, etc.

CrewAI has something similar: https://docs.crewai.com/concepts/memory (short-term memory, long-term memory, entity memory, and contextual memory)

I looked at mem0, but it requires an LLM and did not fit all our use cases: https://github.com/mem0ai/mem0

@izzyacademy
Contributor Author

@sandeep540 thanks for your comments. I agree. Redis is a great key-value store for this use case.

@LeonidShamis

LeonidShamis commented Jan 16, 2025

I recently came across an interesting video by Adam Lucek about implementing different memory types that referenced two academic papers.

The types of memory that were implemented in that video are:

  • Working Memory - Current conversation and immediate context
  • Episodic Memory - Historical experiences and their takeaways
  • Semantic Memory - Knowledge context and factual grounding
  • Procedural Memory - The "rules" and "skills" for interaction

Could such memory types be considered for the API/ABC/implementation you are planning?

TBH, I'm not entirely sure what you consider suitable "memory" to be integrated into PydanticAI, but I'll share some additional references to "long-term" memory information/tools:

Long-Term Memory Support in LangGraph
mem0 Memory layer for your AI apps
Zep Memory Foundation For Your AI Stack

@izzyacademy
Contributor Author

Like the other external integrations, I think this could be [pydantic-ai-persistence] and we could integrate different data stores that implement the ABC.

Here are my initial thoughts based on the agent or graph workflow interactions with the persistent store

from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Any

@dataclass
class PersistentStore(ABC):
    """A persistent store ABC."""

    conversation_id: str | None = None
    """This unique string is used to track different interactions with the persistent store.

    A unique ID is generated if None is specified.
    """

    @abstractmethod
    async def append_entry(self, entry: Any):
        """Adds a record to the end of the list"""

    @abstractmethod
    async def append_entries(self, entries: list[Any]):
        """Adds records to the end of the list"""

    @abstractmethod
    async def prepend_entry(self, entry: Any):
        """Adds a record to the beginning of the list"""

    @abstractmethod
    async def get_all_entries(self) -> list[Any]:
        """Retrieves all the entries for this conversation"""

    @abstractmethod
    async def get_entries(self, start: int, end: int) -> list[Any]:
        """Retrieves a subset of the entries for this conversation"""

    @abstractmethod
    async def get_first_entry(self) -> Any:
        """Retrieves the first entry for this conversation"""

    @abstractmethod
    async def get_last_entry(self) -> Any:
        """Retrieves the last entry for this conversation"""

    @abstractmethod
    async def clear(self):
        """Wipes the list clean to start from an empty list"""

    @abstractmethod
    async def remove_first_entry(self) -> Any:
        """Removes the first entry for this conversation"""

    @abstractmethod
    async def remove_last_entry(self) -> Any:
        """Removes the last entry for this conversation"""

    @abstractmethod
    async def remove_entries(self, start: int, end: int) -> list[Any]:
        """Removes a subset of the entries for this conversation"""

    @abstractmethod
    async def count(self) -> int:
        """Returns the total number of messages for this conversation"""


class MemoryPersistence(PersistentStore):
    pass

class PostgreSQLPersistence(PersistentStore):
    pass

class RedisPersistence(PersistentStore):
    pass


@asaf

asaf commented Jan 22, 2025

Just be careful not to end up with a library that looks more like a platform :)

I think it should be designed in relation to #695.

If not technically, then at least with best-practice guidance on how to maintain persistence when using graphs.

For example, LangChain's best practice is to push all messages into the graph's state; then there's no requirement to store messages at the agent level any longer.

Another thing to consider is that messages may be shared and modified by multiple agents.
