
Add LRU cache #696

Closed
connorjward opened this issue Apr 6, 2023 · 6 comments
@connorjward (Collaborator)

We should introduce an LRU cache object/function to prevent unbounded memory usage by our in-memory caches. We will need to be careful about including the communicator in the cache key somehow so we never evict items from some ranks but not others.

Related issue #693.
Related discussion firedrakeproject/firedrake#2865

@wence- (Member) commented Apr 6, 2023

Reminder that functools.lru_cache exists.
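For reference, a minimal illustration of how `functools.lru_cache` bounds a cache and evicts the least-recently-used entry (the function name here is just a stand-in for an expensive cached operation):

```python
import functools


@functools.lru_cache(maxsize=2)
def compile_kernel(name):
    # Stand-in for an expensive, cacheable operation.
    return f"compiled:{name}"


compile_kernel("a")  # miss
compile_kernel("b")  # miss
compile_kernel("a")  # hit: "a" becomes most recently used
compile_kernel("c")  # miss: evicts "b", the least recently used
info = compile_kernel.cache_info()  # hits=1, misses=3, currsize=2
```

`cache_info()` makes the eviction behaviour observable, which is handy when testing cache-size choices.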

@connorjward (Collaborator, Author)

It does, but I think we need to be careful about communicators: we basically need an LRU cache per communicator, otherwise eviction won't be collective.

@wence- (Member) commented Apr 6, 2023

The natural thing to do, then, is to hook the cache onto the communicator as an MPI attribute.
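A rough sketch of what that could look like. All names here are hypothetical, not PyOP2's actual implementation; the `Get_attr`/`Set_attr` calls mirror mpi4py's MPI attribute-caching interface, and a tiny stub communicator is included so the sketch runs without MPI:

```python
from collections import OrderedDict


class LRUCache:
    """Minimal LRU cache: inserts go to the back, eviction pops the front."""

    def __init__(self, maxsize=128):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def get(self, key, default=None):
        if key in self._data:
            self._data.move_to_end(key)  # mark as most recently used
            return self._data[key]
        return default

    def put(self, key, value):
        self._data[key] = value
        self._data.move_to_end(key)
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict least recently used


def cache_on_comm(comm, keyval, maxsize=128):
    """Fetch the LRU cache attached to *comm*, creating it on first use.

    Every rank holding *comm* gets its own copy of one cache with one
    insertion order, so evictions happen at the same point on every rank
    (i.e. collectively), provided the lookups themselves are collective.
    """
    cache = comm.Get_attr(keyval)
    if cache is None:
        cache = LRUCache(maxsize)
        comm.Set_attr(keyval, cache)
    return cache


class StubComm:
    """Stand-in for an MPI communicator's attribute-caching interface."""

    def __init__(self):
        self._attrs = {}

    def Get_attr(self, keyval):
        return self._attrs.get(keyval)

    def Set_attr(self, keyval, value):
        self._attrs[keyval] = value
```

With mpi4py the keyval would come from `MPI.Comm.Create_keyval()` and `comm` would be the communicator itself; tying the cache's lifetime to the communicator also means it is cleaned up when the communicator is freed.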

@connorjward (Collaborator, Author)

Good suggestion, thanks! I still need to decide on the right API. I would quite like to implement this in tandem with my idea for a "cache manager" (#693).

@wence- (Member) commented Apr 6, 2023

> Good suggestion, thanks!

Remember to do so on the inner "PyOP2" comm.

@connorjward (Collaborator, Author)

Closing as solved by #724.
