Problems when array of coordinate bounds is 2D #667
Comments
Is it possible for you to share a netCDF file that reproduces the issue? Based on the ncdump result, I think xray should handle it fine...
I think ultimately, these are going to end up as Variables, not Coordinates. The CF convention refers to them as "Boundary Variables", and although they are essentially metadata for the Coordinates, I don't think that sort of complexity makes sense for xray right now. It shouldn't be too hard to fix the 2-d bounds problem though.
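For reference, CF conventions link a coordinate to its boundary variable through a "bounds" attribute on the coordinate. A rough sketch of checking that linkage, assuming ds is the dataset opened from the netCDF file discussed in this thread, that the file follows the convention, and that the attribute survives decoding (the returned name is what one would expect, not verified here):

In [1]: ds['lat'].attrs.get('bounds')  # CF-style pointer from the coordinate to its boundary variable
Out[1]: 'lat_bnds'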
I think the reason one gets the "Buffer has wrong number of dimensions" error here is still because of the presence of the time_bounds variable. If we drop time_bounds upon reading the file in, I think things work OK.
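A minimal sketch of that workaround, using the same drop call that appears later in this thread (the path here is just a placeholder for a local copy of the file):

In [1]: import xray

In [2]: ds = xray.open_dataset('atmos.201001-201012.t_surf.nc')  # opening succeeds; only the repr fails

In [3]: print(ds.drop('time_bounds'))  # with the 2-D time_bounds variable dropped, printing works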
Sorry, @spencerkclark is right, the ValueError issue we had was also due to the 2-D time bounds array only. For example (the netCDF file used below is also at ftp://ftp.gfdl.noaa.gov/pub/s1h/atmos.201001-201012.t_surf.nc):
In [1]: ds = xray.open_dataset('/archive/Spencer.Hill/am3/am3clim_hurrell/gfdl.ncrc2-intel-prod-openmp/pp/atmos/ts/monthly/1yr/atmos.201001-201012.t_surf.nc')
In [2]: print(ds)
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-2-4d24098ddece> in <module>()
----> 1 print(ds)
/home/s1h/anaconda/lib/python2.7/site-packages/xray/core/dataset.pyc in __repr__(self)
885
886 def __repr__(self):
--> 887 return formatting.dataset_repr(self)
888
889 @property
...
/home/s1h/anaconda/lib/python2.7/site-packages/pandas/tseries/timedeltas.pyc in _convert_listlike(arg, box, unit, name)
47 value = arg.astype('timedelta64[{0}]'.format(unit)).astype('timedelta64[ns]', copy=False)
48 else:
---> 49 value = tslib.array_to_timedelta64(_ensure_object(arg), unit=unit, errors=errors)
50 value = value.astype('timedelta64[ns]', copy=False)
51
pandas/tslib.pyx in pandas.tslib.array_to_timedelta64 (pandas/tslib.c:47046)()
ValueError: Buffer has wrong number of dimensions (expected 1, got 2)
In [3]: ds2 = ds.drop('time_bounds')
In [4]: print(ds2)
<xray.Dataset>
Dimensions: (bnds: 2, lat: 90, lon: 144, time: 12)
Coordinates:
* lat (lat) float64 -89.0 -87.0 -85.0 -83.0 -81.0 -79.0 -77.0 ...
* lon (lon) float64 1.25 3.75 6.25 8.75 11.25 13.75 16.25 18.75 ...
* time (time) datetime64[ns] 2010-01-16T12:00:00 2010-02-15 ...
* bnds (bnds) int64 0 1
Data variables:
average_DT (time) timedelta64[ns] 31 days 28 days 31 days 30 days ...
average_T1 (time) datetime64[ns] 2010-01-01 2010-02-01 2010-03-01 ...
average_T2 (time) datetime64[ns] 2010-02-01 2010-03-01 2010-04-01 ...
lat_bnds (lat, bnds) float64 -90.0 -88.0 -88.0 -86.0 -86.0 -84.0 ...
lon_bnds (lon, bnds) float64 0.0 2.5 2.5 5.0 5.0 7.5 7.5 10.0 10.0 ...
t_surf (time, lat, lon) float64 245.9 245.9 245.8 245.7 245.7 245.6 ...
...
The errors I was thinking of relating to these lat and lon bounds were ultimately due to errors in my own code ... my mistakes appear to be the unifying theme here! Sorry for the confusion.
Most of the netCDF data I work with stores, in addition to the coordinates themselves, the bounds of each coordinate value. Often these bounds are stored as arrays with shape Nx2, where N is the number of points for that coordinate. For example:
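A hedged, hand-built stand-in for such an N x 2 bounds array, using numpy (the values mirror the lat/lat_bnds entries in the dataset repr above):

In [1]: import numpy as np

In [2]: lat = np.arange(-89.0, 90.0, 2.0)                   # 90 cell-center latitudes

In [3]: lat_bnds = np.column_stack([lat - 1.0, lat + 1.0])  # shape (90, 2): lower/upper edge of each cell

In [4]: lat_bnds[:2]
Out[4]:
array([[-90., -88.],
       [-88., -86.]])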
These 2-D bounding arrays lead to the "Buffer has wrong number of dimensions" error in #665. In the case of #665, only the time coordinate has this 2-D bounds array; here other coordinates (namely lat and lon) have it as well.
Conceptually, these bounds arrays represent coordinates, but when read in as a Dataset, they become variables, not coordinates. Perhaps this is part of the problem?
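For what it's worth, a minimal sketch of promoting the bounds variables to (non-index) coordinates by hand, continuing from the ds2 session shown earlier in the thread and assuming a version of xray that provides Dataset.set_coords:

In [5]: ds2 = ds2.set_coords(['lat_bnds', 'lon_bnds'])  # now listed under Coordinates rather than Data variables

In [6]: 'lat_bnds' in ds2.coords
Out[6]: True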