When using xclim or icclim to calculate indices that involve computing a threshold along the entire time axis (i.e. a percentile-based threshold), memory usage can explode for large multi-dimensional arrays (e.g. high lat/lon resolution).
A solution to this problem (e.g. appropriate file chunking and/or use of dask) needs to be found before we can commit to using xclim or icclim for calculating climate indices.
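The core of the chunking idea can be sketched without dask: instead of loading the full (time, lat, lon) array and calling a percentile function on it, process one spatial block at a time so only that block's full time series is ever in memory. The `chunked_percentile` helper below is hypothetical (not part of xclim or icclim), and in a real workflow the block slice would come from lazily-loaded data (e.g. an `xarray.open_mfdataset(..., chunks={"lat": N})` call) rather than an in-memory array:

```python
import numpy as np

def chunked_percentile(arr, q=90.0, chunk_size=10):
    """Percentile along the time axis (axis 0), computed one
    latitude band at a time to bound peak memory usage.

    arr        : array of shape (time, lat, lon); in practice this
                 would be a lazily-loaded slab, not an in-memory array
    q          : percentile to compute (e.g. 90 for a tx90p-style threshold)
    chunk_size : number of latitude rows processed per block
    """
    ntime, nlat, nlon = arr.shape
    out = np.empty((nlat, nlon), dtype=np.float64)
    for i in range(0, nlat, chunk_size):
        # Only this band's full time series needs to be resident;
        # with a file-backed source, this is where the read happens.
        block = arr[:, i:i + chunk_size, :]
        out[i:i + chunk_size, :] = np.percentile(block, q, axis=0)
    return out
```

This is essentially what chunking along lat/lon (but *not* along time, since the percentile needs the whole time axis) achieves automatically when dask-backed arrays are passed to xclim or icclim.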
Update (cc @ngben @chunhsusu): The latest version of icclim (v6.0.0) is much more efficient memory-wise, to the point where setting up a dask cluster isn't necessary for most (if not all) problems. I've updated my run_icclim.py script in the following repo, and the README has details: https://github.com/AusClimateService/indices