
Conversions fail with OOM exception #31

Open
forman opened this issue Mar 22, 2021 · 1 comment

forman commented Mar 22, 2021

I have 35 annual CCI Sea Surface Temperature (SST) Level-4 Zarr datasets, each comprising 4 data variables with dimensions (time=365, lat=3600, lon=7200) and chunking (time=16, lat=900, lon=1800). I want to append them to generate a single Zarr (with equal or similar chunking), but none of my nc2zarr jobs has succeeded so far. The append job is always terminated by an out-of-memory exception, even when I assign it 256 GB of RAM. The issue is complicated by the fact that every append step takes 30 minutes, and the OOM may occur only after appending ~20 years. This indicates one or more memory leaks in nc2zarr, xarray, dask, and/or zarr.
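For reference, a minimal sketch of the append pattern described above (an illustration only, not the actual nc2zarr implementation; the paths and year range are hypothetical):

```python
import xarray as xr

# 35 annual input stores and one combined output store (hypothetical paths)
inputs = [f"sst-cci-l4-{year}.zarr" for year in range(1982, 2017)]
target = "sst-cci-l4-all.zarr"

for i, path in enumerate(inputs):
    with xr.open_zarr(path) as ds:
        if i == 0:
            # The first dataset creates (or overwrites) the target store.
            ds.to_zarr(target, mode="w")
        else:
            # Subsequent datasets are appended along the time dimension.
            ds.to_zarr(target, append_dim="time")
    # Each step should only need a few chunks in memory at a time, yet
    # memory usage keeps growing across iterations until the process is
    # killed, even with 256 GB of RAM.
```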

See also

forman added the bug, enhancement, and in progress labels on Mar 22, 2021
forman self-assigned this on Mar 22, 2021
forman added a commit that referenced this issue Mar 22, 2021
TonioF commented Jun 21, 2021
