Jupyter kernel crashing with sdata.write #827

Open
josenimo opened this issue Jan 14, 2025 · 1 comment
Comments

@josenimo

Describe the bug
I am writing large sdata objects to Zarr, and the kernel fails in an unpredictable manner.

I parse a large mIF image of shape (15, 44470, 73167) (8-bit) into sdata with scale factors (5, 5) to create a multiscale object. Writing that simple sdata object then seems to fail (each attempt takes about 20 minutes, so I have only tried twice).
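For reference, a minimal sketch of what I do (the element name, file path, and chunk sizes below are placeholders, not my actual values):

```python
import dask.array as da
from spatialdata import SpatialData
from spatialdata.models import Image2DModel

# stand-in for the real 8-bit mIF image of shape (c, y, x) = (15, 44470, 73167)
img = da.zeros((15, 44470, 73167), dtype="uint8", chunks=(1, 4096, 4096))

parsed = Image2DModel.parse(
    img,
    dims=("c", "y", "x"),
    scale_factors=(5, 5),    # two extra pyramid levels, 5x downscaling each
    chunks=(1, 4096, 4096),  # chunking requested from the parser
)

sdata = SpatialData(images={"if_image": parsed})
sdata.write("if_image.zarr")  # this is the step where the kernel dies
```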

Before I send over this large data, are there any expected limitations from writing sdata objects to Zarr in a Jupyter notebook?
My naive concerns are:

  1. Chunking
    (What if the chunk size is larger than the downscaled image size? See the sketch after this list.)
    (Can I chunk different scales dynamically? I use the parser to chunk.)
  2. Hardware
    (I use an M2 MacBook Pro.)
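
To illustrate the first concern, this is roughly how I inspect the chunking of each scale (a sketch assuming the multiscale element is a DataTree whose scales each hold a single DataArray named "image"; the element name is a placeholder):

```python
# Print shape and dask chunk sizes for every pyramid level, to see whether
# any chunk is larger than the downscaled image itself.
msi = sdata.images["if_image"]           # placeholder element name
for scale_name, node in msi.children.items():
    arr = node["image"]                  # per-scale DataArray (variable name assumed)
    print(scale_name, arr.shape, arr.data.chunksize)
```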

These kinds of kernel failures are particularly frustrating because they corrupt the Zarr store: I was writing some new elements (another very large image), the kernel crashed, and the object was destroyed.

@LucaMarconato
Member

LucaMarconato commented Jan 14, 2025

Thanks for reporting, and sorry to hear about this bug; it does indeed sound frustrating.

Before I send over this large data, are there any expected limitations from writing sdata objects to Zarr in a Jupyter notebook?

No, there is no expected limitation in .ipynb vs .py for this task.

chunking
(what if chunk size is larger than downscaled image size?)
(can I chunk different scales dynamically? I use the parser to chunk.)

Yes, you can rechunk the data after calling the parser and before saving, as you see fit. For instance, check the code from here. In any case, your problem could be due to a bug involving compression; please check here: #812 (comment).
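
As a rough illustration (not the exact code linked above), rechunking each pyramid level after parsing and before writing could look like the sketch below; the element name and chunk sizes are placeholders, and depending on the installed xarray/datatree version the method may be `map_over_subtree` or `map_over_datasets`:

```python
import dask.array as da
from spatialdata import SpatialData
from spatialdata.models import Image2DModel

img = da.zeros((15, 44470, 73167), dtype="uint8", chunks=(1, 4096, 4096))
parsed = Image2DModel.parse(img, dims=("c", "y", "x"), scale_factors=(5, 5))

# Rechunk every scale of the multiscale DataTree before writing;
# the chunk sizes here are only illustrative.
parsed = parsed.map_over_subtree(lambda ds: ds.chunk({"c": 1, "y": 4096, "x": 4096}))

sdata = SpatialData(images={"if_image": parsed})
sdata.write("if_image.zarr")
```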
