Added link_data --> clear_cache relationship to support repacking Zarr NWB files #215
Conversation
@oruebel, can you enable tests for this PR as well?
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

    @@ Coverage Diff @@
    ##              dev     #215      +/-   ##
    ==========================================
    - Coverage   87.13%   85.53%   -1.61%
    ==========================================
      Files           5        5
      Lines        1174     1182       +8
      Branches      287      291       +4
    ==========================================
    - Hits         1023     1011      -12
    - Misses        100      115      +15
    - Partials       51       56       +5

View full report in Codecov by Sentry.
@rly @mavaylon1, can you please review? Export is more your area of expertise.
@pauladkisson @oruebel @rly I can take a gander this week.
@mavaylon1, still waiting for that gander... has it flown off-course? 🪿
@pauladkisson Said gander has indeed flown off into the wilderness due to SfN. Let me review this tomorrow.
@mavaylon1, giving this a bump.
So I am taking a look, but I need more context. Can you walk me through it, since there are multiple links pointing elsewhere? I'll continue looking at this right now, but let me know as well, @pauladkisson. You are repacking Zarr files, but then you added "data is an h5py.Dataset", so I assume you are exporting at some point. Also, in general it is better to be explicit with variable names. Can you use more descriptive names instead of "o" and "c"?
Sure! I am working on a function to repack NWB files with more appropriate chunking/compression. In order to do so, I need an option to redo chunking/compression when exporting -- hence the link_data -> clear_cache relationship. A related bug also pops up when I enable this option: some datasets with strings are normally already decoded when writing from scratch, but when exporting, they need to be decoded again.
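The string-decoding quirk described above can be sketched as follows. This is an illustrative helper, not the actual PR code: the name decode_strings and the list-based handling are assumptions. The point is that data written from scratch is already Python str, while data read back from an h5py.Dataset arrives as bytes and needs one more decode pass during export.

```python
def decode_strings(values):
    """Decode bytes to str, leaving already-decoded strings untouched.

    Illustrative sketch of the export concern: values read from an
    h5py.Dataset come back as bytes, whereas values written from
    scratch are already str, so decoding must be conditional to avoid
    failing on either source.
    """
    return [v.decode("utf-8") if isinstance(v, bytes) else v for v in values]


# Data written from scratch is already str; data read from HDF5 is bytes.
fresh = ["unit_a", "unit_b"]
exported = [b"unit_a", b"unit_b"]

print(decode_strings(fresh))     # -> ['unit_a', 'unit_b']
print(decode_strings(exported))  # -> ['unit_a', 'unit_b']
```

Both inputs normalize to the same str list, which is the behavior the export path needs regardless of whether the dataset was freshly written or round-tripped through HDF5.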
This code is pretty much an exact copy-paste of L1290-1300, so it should be just as readable.
Yeah, both places aren't ideal, but I understand why you kept the style. I thought we had cleaned that up already. All good, though.
@pauladkisson I am working on a change for exporting, but I think this PR should be good to go regardless.
Fantastic! Thanks for pushing this through @mavaylon1. |
FYI I'm not authorized to merge. |
Motivation
See catalystneuro/neuroconv#1003
Checklist
- Ran ruff from the source directory.