Add tokens to the collections #33
Comments
I'm afraid tokenization is not stable for xarray datasets:

```
In [2]: import dask, xarray as xr, xcollection as xc

In [3]: ds = xr.tutorial.open_dataset('rasm')

In [4]: dask.base.tokenize(ds) == dask.base.tokenize(ds)
Out[4]: False

In [5]: dask.base.tokenize(ds) == dask.base.tokenize(ds)
Out[5]: False
```

The issue appears to be caused by the coordinates:

```
In [8]: dask.base.tokenize(ds.Tair.data) == dask.base.tokenize(ds.Tair.data)
Out[8]: True

In [16]: dask.base.tokenize(ds.Tair._coords) == dask.base.tokenize(ds.Tair._coords)
Out[16]: False
```
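For context, the property being tested above is that a content-based token is deterministic: equal contents should always hash to the same token. This is a minimal stdlib-only sketch of that idea; `content_token` is a hypothetical stand-in for `dask.base.tokenize`, not dask's actual implementation.

```python
import hashlib

def content_token(*parts):
    # Hypothetical stand-in for dask.base.tokenize: hash the byte
    # representation of each part, so equal contents yield equal tokens.
    h = hashlib.sha256()
    for part in parts:
        h.update(repr(part).encode())
    return h.hexdigest()

# Equal contents produce equal tokens, run after run...
assert content_token([1, 2, 3]) == content_token([1, 2, 3])

# ...while any difference in content changes the token.
assert content_token([1, 2, 3]) != content_token([1, 2, 4])
```

The failing `Out[4]`/`Out[5]` checks above show this property breaking for whole Datasets even though it holds for the underlying `.data` arrays, which is why the coordinates were suspected.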
Hmmm... I wonder if there are any other options?
I just re-ran the example above, and things seem to work just fine. I have no idea what's going on... pydata/xarray#4738 (comment)
That is quite odd... glad it's running okay now?
It would be helpful, especially when caching datasets, to have a list of corresponding tokens for each dataset in a collection. These could be auto-generated by `dask.base.tokenize`.
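A sketch of what such a collection might look like: a hypothetical `TokenizedCollection` that records a token whenever a dataset is added, so a cache can later check whether an entry is still current. The `tokenize` helper here is a stdlib stand-in for `dask.base.tokenize`, and the class is illustrative, not xcollection's actual API.

```python
import hashlib

def tokenize(obj):
    # Hypothetical stand-in for dask.base.tokenize (content-based hash).
    return hashlib.sha256(repr(obj).encode()).hexdigest()

class TokenizedCollection:
    """Hypothetical collection that stores a content token per dataset."""

    def __init__(self):
        self._datasets = {}
        self.tokens = {}  # key -> token, auto-generated on insert

    def __setitem__(self, key, dataset):
        self._datasets[key] = dataset
        self.tokens[key] = tokenize(dataset)

    def __getitem__(self, key):
        return self._datasets[key]

    def is_stale(self, key, dataset):
        # A cached entry can be reused only if the token still matches.
        return self.tokens.get(key) != tokenize(dataset)

coll = TokenizedCollection()
coll["rasm"] = {"Tair": [273.1, 274.2]}
assert not coll.is_stale("rasm", {"Tair": [273.1, 274.2]})
assert coll.is_stale("rasm", {"Tair": [0.0]})
```

Note that this scheme only works if the underlying tokenizer is deterministic, which is exactly the property that fails in the xarray example discussed above.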