Progressively load large number of keys #471

Open
tugtugtug opened this issue Mar 24, 2022 · 4 comments

tugtugtug commented Mar 24, 2022

Redis Commander often takes a long time to load a large hashset, and can run out of memory and get killed when the hashset is large enough.
I don't think this is a bug, but it makes very little sense for Redis Commander to replicate the entire Redis db in memory.
As an enhancement, could Redis Commander implement progressive loading of large hashsets, or of a large number of keys?
i.e. if the hashset is large, load only the first, say, N entries, and load more when the front end scrolls to the bottom or the user clicks to load more (or load all) - see the sketch below.
This would allow redis-commander to be configured with a reasonable memory limit.
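For illustration, a minimal sketch of what such cursor-based paging could look like on the server side, assuming the ioredis client; `loadHashPage` and `pageSize` are hypothetical names for this sketch, not existing redis-commander APIs. Redis's HSCAN command already provides the cursor semantics needed:

```typescript
// Sketch only (not redis-commander code): page through a large hash with
// HSCAN instead of loading the whole thing at once.
import Redis from "ioredis";

const redis = new Redis(); // defaults to localhost:6379

// Returns one page of hash entries plus the cursor for the next request;
// a returned cursor of "0" means the scan is complete.
async function loadHashPage(
  key: string,
  cursor: string = "0",
  pageSize: number = 100
): Promise<{ cursor: string; entries: Record<string, string> }> {
  // COUNT is only a hint to Redis, so pages may be somewhat larger or smaller.
  const [nextCursor, flat] = await redis.hscan(key, cursor, "COUNT", pageSize);
  // HSCAN returns a flat [field1, value1, field2, value2, ...] array.
  const entries: Record<string, string> = {};
  for (let i = 0; i < flat.length; i += 2) {
    entries[flat[i]] = flat[i + 1];
  }
  return { cursor: nextCursor, entries };
}
```

The UI would call this repeatedly, passing back the returned cursor until it comes back as "0", so only one page of the hash ever sits in redis-commander's memory at a time.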

sseide (Collaborator) commented Mar 24, 2022

You are right - such a feature would be helpful.
As I do not have time to implement it, any PR is welcome. I will help discuss a solution and answer questions that arise during implementation.

JESii commented Apr 29, 2022

Not sure, but I might have a similar situation with a series of SET/SADD commands. I exported a branch (200 items, some with large JSON objects to be inserted). It has been running for ~10 minutes so far, which seems excessive given that a single SET or SADD happens almost instantaneously. Oh... I see "413 Payload too large" in the console. I'll see if I can cut this down and do some more testing.
UPDATE: I had to cut the 200 items down into batches of one to five records to avoid the jQuery failure; one record was too large all by itself (85750 chars), but I was able to get the others done this way. It would be nice to have a larger import capability.

sseide (Collaborator) commented May 2, 2022

@JESii HTTP 413 is a different error - you need to set the config key server.clientMaxBodySize to a bigger value (example below). The exact value depends on the data stored inside your Redis instance. Check docs/configuration.md for more information.
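As an illustration only (docs/configuration.md is the authoritative reference), such an override could look like this in a local JSON config file; the "10mb" value is an assumed example, not a recommendation:

```json
{
  "server": {
    "clientMaxBodySize": "10mb"
  }
}
```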

JESii commented May 2, 2022 via email
