Working with huge graphs #62
Great question, @sharpaper. rdflib-sqlalchemy can work with graphs larger than the available memory, but there are caveats: it will allocate memory proportional to the number of triples you add at once. As a work-around for adding triples, it's somewhat inconvenient, but you can split the inserts into chunks that do fit in memory. There isn't a great work-around to triples() memory usage other than finding queries with smaller result sets.
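A minimal sketch of the chunked-insert work-around. The chunks helper, batch size, and the commented rdflib usage are illustrative assumptions, not part of rdflib-sqlalchemy's API; the demonstration below uses a plain list as a stand-in for the store so it is self-contained:

```python
from itertools import islice

def chunks(iterable, size):
    """Yield lists of at most `size` items, never materializing the whole input."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

# With rdflib-sqlalchemy, the idea (untested sketch) would be something like:
#
#   for batch in chunks(triple_source, 10_000):
#       graph.addN((s, p, o, graph) for s, p, o in batch)
#
# so that each insert call only ever holds 10_000 triples in memory.

# Stand-in demonstration with a plain list instead of a Graph:
store = []
for batch in chunks(range(25), 10):
    store.extend(batch)

print(len(store))                                   # 25 items inserted in total
print(max(len(b) for b in chunks(range(25), 10)))   # no batch exceeded size 10
```

Because chunks() consumes the source lazily, the generator feeding it can itself stream triples from disk, so peak memory is bounded by the batch size rather than the graph size.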
This makes sense; the client should limit the range of possible results. However, have you considered Python iterators? It could be a nice addition.
Using iterators isn't really the problem in this case. Rather, there's a dictionary that's accumulating triples that needn't do that. At least that's the first thing I see.
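The distinction drawn here, client-side iterators versus an internal dictionary that materializes results, can be illustrated with a toy sketch. This is not rdflib-sqlalchemy's actual code; both functions are hypothetical stand-ins:

```python
def triples_materializing(rows):
    """Mimics a store that accumulates every match in a dict before yielding:
    the caller gets an iterator, but memory still grows with the result set."""
    seen = {}
    for s, p, o in rows:
        seen.setdefault(s, []).append((p, o))  # whole result set held here
    for s, pairs in seen.items():
        for p, o in pairs:
            yield s, p, o

def triples_streaming(rows):
    """Yields each match as it arrives; peak memory is a single triple."""
    for s, p, o in rows:
        yield s, p, o

rows = [("s1", "p", "o1"), ("s1", "p", "o2"), ("s2", "p", "o3")]
print(sorted(triples_materializing(rows)) == sorted(triples_streaming(rows)))
```

Both return the same triples, so exposing an iterator to the caller does not help if an intermediate dictionary has already accumulated the full result set internally.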
"huge" as in "too big to fit into memory". I don't think I've seen this mentioned in the readme, so please excuse me if it's a dumb question. But considering that rdflib works in-memory, I was wondering if this plugin also needs to load all the graph in memory in order to work.