An example of where it would be useful to save several dask arrays together:

You can still return the header by setting `key="header"` for a second `memmap_distributed` call. This adds some time to saving the dataset, since the entire dataset might be loaded into RAM with most of it discarded.

What we should really do is add things to a `to_store` context manager and then call:

rosettasciio/rsciio/hspy/_api.py, line 111 in 31bd677
only once. That would merge task graphs as necessary and might reduce the time for saving certain signals. I've thought about this for things like saving lazy markers, or possibly creating an `hs.save()` function for handling multiple signals, if you wanted to save several parts of an analysis efficiently. This is a fairly abstract, higher-level concept, so it might be seldom used.

_Originally posted by @CSSFrancis in #267 (comment)_
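The context-manager idea above could be sketched roughly as follows. Note this `to_store` is hypothetical (it is not part of RosettaSciIO's current API, and the name is only borrowed from the discussion): callers register (dask array, on-disk target) pairs, and a single `da.store` call at exit merges the task graphs, so chunks shared between the arrays are computed only once.

```python
# Sketch only: a hypothetical ``to_store`` context manager, assuming
# plain h5py targets rather than RosettaSciIO's actual writer machinery.
import contextlib

import dask.array as da
import h5py


@contextlib.contextmanager
def to_store(filename):
    pairs = []  # (dask array, on-disk target) pairs to flush at exit

    with h5py.File(filename, "w") as f:

        def add(key, array):
            # Allocate an on-disk dataset matching the lazy array.
            target = f.create_dataset(key, shape=array.shape, dtype=array.dtype)
            pairs.append((array, target))

        yield add
        # One ``da.store`` call over all registered arrays: dask merges
        # the task graphs, so a chunk both arrays depend on is read and
        # computed once instead of once per save call.
        da.store([a for a, _ in pairs], [t for _, t in pairs])


# Usage: save the data and a second, header-like derived array in one pass.
raw = da.arange(64 * 64, chunks=1024).reshape(64, 64)
with to_store("example.h5") as add:
    add("data", raw)        # main dataset
    add("header", raw[:1])  # derived array sharing raw's task graph
```

Because both arrays go through one `da.store`, the chunks of `raw` that `raw[:1]` depends on are not re-read in a separate pass, which is exactly the duplicated-work problem the two-call `memmap_distributed` approach runs into.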