Forward slicing in DataIO #141
Conversation
Thanks @oruebel, I'll take a look and take care of flake8.
Codecov Report
@@ Coverage Diff @@
## dev #141 +/- ##
==========================================
+ Coverage 69.73% 69.76% +0.02%
==========================================
Files 29 29
Lines 5726 5738 +12
Branches 1351 1352 +1
==========================================
+ Hits 3993 4003 +10
- Misses 1306 1307 +1
- Partials 427 428 +1
Continue to review full report at Codecov.
I'm having trouble using these changes, but I'm not quite sure what the problem is.
What trouble do you have? Note, this change just delegates the slicing, i.e., if the object that is being wrapped does not support slicing (e.g., a DataChunkIterator), then slicing will still fail.
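To illustrate the delegation being discussed, here is a minimal sketch of a wrapper that forwards `__getitem__` to its wrapped data. The class below is a simplified stand-in, not the real `hdmf.data_utils.DataIO`:

```python
class DataIO:
    """Simplified sketch of a data wrapper that forwards slicing.

    Illustration only; the real hdmf DataIO differs in detail.
    """

    def __init__(self, data):
        self.data = data

    def __getitem__(self, item):
        # Pure delegation: if the wrapped object does not itself support
        # slicing (e.g. a DataChunkIterator), this still raises.
        return self.data[item]


d = DataIO([10, 20, 30, 40])
print(d[1:3])  # -> [20, 30]
```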
I tried to use it in DynamicTable and I'm getting a recursion error on the tests. But when I test it on its own it works, so I may have made a mistake somewhere. Error:
Traceback (most recent call last):
File "/anaconda3/envs/pynwb/lib/python3.7/site-packages/unittest2/case.py", line 67, in testPartExecutor
yield
File "/anaconda3/envs/pynwb/lib/python3.7/site-packages/unittest2/case.py", line 625, in run
testMethod()
File "/Users/bendichter/dev/pynwb/tests/integration/ui_write/base.py", line 183, in test_roundtrip
self.read_container = self.roundtripContainer()
File "/Users/bendichter/dev/pynwb/tests/integration/ui_write/base.py", line 169, in roundtripContainer
self.writer.write(nwbfile, cache_spec=cache_spec)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 219, in write
call_docval_func(super(HDF5IO, self).write, kwargs)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 327, in call_docval_func
return func(*fargs, **fkwargs)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/io.py", line 41, in write
f_builder = self.__manager.build(container, source=self.__source)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 164, in build
result = self.__type_map.build(container, self, source=source, spec_ext=spec_ext)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 1646, in build
builder = attr_map.build(container, manager, builder=builder, source=source, spec_ext=spec_ext)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 783, in build
self.__add_groups(builder, self.__spec.groups, container, manager, source)
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 995, in __add_groups
self.__add_groups(sub_builder, spec.groups, container, build_manager, source)
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 1011, in __add_groups
self.__add_containers(builder, spec, attr_value, build_manager, source, container)
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 1022, in __add_containers
rendered_obj = build_manager.build(value, source=source, spec_ext=spec)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 164, in build
result = self.__type_map.build(container, self, source=source, spec_ext=spec_ext)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 1646, in build
builder = attr_map.build(container, manager, builder=builder, source=source, spec_ext=spec_ext)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 782, in build
self.__add_datasets(builder, self.__spec.datasets, container, manager, source)
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 971, in __add_datasets
self.__add_containers(builder, spec, attr_value, build_manager, source, container)
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 1022, in __add_containers
rendered_obj = build_manager.build(value, source=source, spec_ext=spec)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 164, in build
result = self.__type_map.build(container, self, source=source, spec_ext=spec_ext)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 1646, in build
builder = attr_map.build(container, manager, builder=builder, source=source, spec_ext=spec_ext)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/build/map.py", line 800, in build
bldr_data = copy(container.data)
File "/anaconda3/envs/pynwb/lib/python3.7/copy.py", line 106, in copy
return _reconstruct(x, None, *rv)
File "/anaconda3/envs/pynwb/lib/python3.7/copy.py", line 281, in _reconstruct
if hasattr(y, '__setstate__'):
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 573, in __getattr__
if not self.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5_utils.py", line 306, in valid
if isinstance(self.data, Dataset) and not self.data.id.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 564, in data
return self.__data
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 573, in __getattr__
if not self.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5_utils.py", line 306, in valid
if isinstance(self.data, Dataset) and not self.data.id.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 564, in data
return self.__data
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 573, in __getattr__
if not self.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5_utils.py", line 306, in valid
if isinstance(self.data, Dataset) and not self.data.id.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 564, in data
return self.__data
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 573, in __getattr__
if not self.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5_utils.py", line 306, in valid
if isinstance(self.data, Dataset) and not self.data.id.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 564, in data
return self.__data
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 573, in __getattr__
if not self.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5_utils.py", line 306, in valid
if isinstance(self.data, Dataset) and not self.data.id.valid:
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 564, in data
return self.__data
File "/Users/bendichter/dev/hdmf/src/hdmf/data_utils.py", line 573, in __getattr__
if not self.valid:
...
RecursionError: maximum recursion depth exceeded while calling a Python object
@oruebel With the current change, this minimal example reproduces the error:

from hdmf.backends.hdf5.h5_utils import H5DataIO
from copy import copy

copy(H5DataIO(data=[1., 2., 3.]))

I don't quite follow what is causing the problem, and why it is triggered only in this situation.
As far as I can tell, this is a problem on the dev branch already, rather than a new bug added by this PR. I'll take a look to see what's going on. Why do you need to copy the H5DataIO object?
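The traceback shows `copy.copy` probing `hasattr(y, '__setstate__')` on a half-built instance, which lands in `__getattr__` before the wrapped data attribute exists. A toy reproduction of that failure mode, together with one common fix (refusing to forward internal names), might look like this. Both classes are simplified sketches, not hdmf's actual implementation:

```python
from copy import copy


class BadWrapper:
    """Forwards unknown attributes to the wrapped data.

    Sketch of the failure mode, not hdmf's actual code.
    """

    def __init__(self, data):
        self.__data = data

    @property
    def data(self):
        return self.__data  # missing on a half-built copy -> __getattr__

    def __getattr__(self, name):
        # copy.copy builds an empty instance, then probes
        # hasattr(obj, '__setstate__'); the probe lands here, self.data
        # fails too, and the two lookups recurse until RecursionError.
        return getattr(self.data, name)


class GoodWrapper:
    """Same idea, but refuses to forward internal/dunder names, so the
    copy protocol sees a plain AttributeError instead of recursing."""

    def __init__(self, data):
        self.__data = data

    @property
    def data(self):
        return self.__data

    def __getattr__(self, name):
        if name.startswith('_'):
            raise AttributeError(name)
        return getattr(self.data, name)


try:
    copy(BadWrapper([1., 2., 3.]))
except RecursionError:
    print("BadWrapper: RecursionError")

print(copy(GoodWrapper([1., 2., 3.])).data)  # -> [1.0, 2.0, 3.0]
```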
@bendichter copy and deepcopy should be fixed now.
Well that clears up some of my confusion. I couldn't see how your change could cause that error.
Thanks for fixing this! I'll test it now.
The current commit caused this error, which showed up several times across the tests for pynwb: Error:
Traceback (most recent call last):
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 907, in __list_fill__
dtype = cls.__resolve_dtype__(dtype, data)
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 529, in __resolve_dtype__
dtype = cls.__resolve_dtype_helper__(dtype)
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 543, in __resolve_dtype_helper__
return np.dtype([(x['name'], cls.__resolve_dtype_helper__(x['dtype'])) for x in dtype])
TypeError: 'numpy.dtype' object is not iterable
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/anaconda3/envs/pynwb/lib/python3.7/site-packages/unittest2/case.py", line 67, in testPartExecutor
yield
File "/anaconda3/envs/pynwb/lib/python3.7/site-packages/unittest2/case.py", line 625, in run
testMethod()
File "/Users/bendichter/dev/pynwb/tests/integration/test_io.py", line 268, in test_append
io.write(self.nwbfile, cache_spec=False)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 222, in write
call_docval_func(super(HDF5IO, self).write, kwargs)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 327, in call_docval_func
return func(*fargs, **fkwargs)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/io.py", line 42, in write
self.write_builder(f_builder, **kwargs)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 451, in write_builder
self.write_group(self.__file, gbldr)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 599, in write_group
self.write_group(group, sub_builder)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 599, in write_group
self.write_group(group, sub_builder)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 599, in write_group
self.write_group(group, sub_builder)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 604, in write_group
self.write_dataset(group, sub_builder)
File "/Users/bendichter/dev/hdmf/src/hdmf/utils.py", line 438, in func_call
return func(self, **parsed['args'])
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 798, in write_dataset
dset = self.__list_fill__(parent, name, data, options)
File "/Users/bendichter/dev/hdmf/src/hdmf/backends/hdf5/h5tools.py", line 910, in __list_fill__
raise_from(Exception(msg), exc)
File "<string>", line 3, in raise_from
Exception: cannot add data to /processing/test_proc_mod/test_proc_dset/test_es - could not determine type
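The `TypeError: 'numpy.dtype' object is not iterable` at the root of this traceback arises because the dtype-resolution branch assumes it has been handed a list of compound-field dicts, so an already-resolved `numpy.dtype` cannot be iterated. A simplified sketch of that branch (assumed; the real `__resolve_dtype_helper__` in h5tools.py is more involved):

```python
import numpy as np


def resolve_dtype_helper(dtype):
    """Simplified sketch of the failing resolution branch (assumption)."""
    if isinstance(dtype, str):
        return np.dtype(dtype)
    # Assumes anything else is a list of compound-field dicts like
    # {'name': ..., 'dtype': ...}. An already-resolved numpy.dtype
    # falls through to here and cannot be iterated.
    return np.dtype([(x['name'], resolve_dtype_helper(x['dtype']))
                     for x in dtype])


print(resolve_dtype_helper('float64'))                          # -> float64
print(resolve_dtype_helper([{'name': 'x', 'dtype': 'int32'}]))  # -> [('x', '<i4')]
try:
    resolve_dtype_helper(np.dtype('float64'))
except TypeError as e:
    print("TypeError:", e)
```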
@bendichter Hmm, that is odd. That seems related to the merged PR #143. I don't know if it is possible, but it might be useful to have an automatic GitHub check of PyNWB for a PR in HDMF, to make sure changes in HDMF don't break PyNWB unexpectedly.
I guess you could run the PyNWB test suite for the dev version of HDMF and PyNWB as a separate pipeline in the HDMF repo. Doing this for just Python 3.7 should be good enough. |
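The suggested pipeline could be sketched roughly as follows; all commands and paths here are assumptions for illustration, not an existing CI configuration:

```shell
# Hypothetical CI job: install the dev branch of HDMF, then run PyNWB's
# test suite against it to catch downstream breakage before merging.
pip install git+https://github.com/hdmf-dev/hdmf.git@dev
git clone --branch dev https://github.com/NeurodataWithoutBorders/pynwb.git
cd pynwb
pip install -r requirements.txt
python test.py
```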
Motivation
See NeurodataWithoutBorders/pynwb#1073
Forward __getitem__ slicing in DataIO.
How to test the behavior?
Checklist
- Run flake8 from the source directory to check code style.
- Reference relevant issues using the #XXX notation, where XXX is the issue number. By including "Fix #XXX" you allow GitHub to close the corresponding issue.