Fix indexing inconsistencies #758
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master     #758      +/-   ##
==========================================
+ Coverage   89.14%   90.84%   +1.69%
==========================================
  Files          64       64
  Lines        9510     8921     -589
==========================================
- Hits         8478     8104     -374
+ Misses       1032      817     -215
@ben-bou I can't ask you to review, but I will be happy to hear feedback.
Revert "… INDEXING NOT CORRECTED YET" (this reverts commit 06f56ae).
Merge branch …tics/heat into bug/754-getitem-indexing.
I got this after indexing a 2D dndarray with (0, 0), i.e. receiving a scalar (0-d).
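For comparison, this is the analogous operation in plain, non-distributed NumPy (a sketch of the case above, without Heat's split/broadcast machinery):

```python
import numpy as np

# NumPy analogue of the reported case: indexing a 2-D array with (0, 0)
# collapses both dimensions and yields a 0-d scalar.
a = np.arange(12).reshape(3, 4)
elem = a[0, 0]
print(np.ndim(elem))      # 0
print(np.shape(elem))     # ()
```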
@ben-bou I can't reproduce this locally. Can you post your code? Thanks a lot!
@ClaudiaComito So it's actually not reproducible but rather the same problem you encountered in the tests: in a loop, the first 4 bcasts fail, after that it works. How did you figure out where those 'wrong' broadcasts were happening?
Could it be that you have a loop calling something like …
@ClaudiaComito You changed the broadcasting in … I tried changing it back to …
Which I believe shouldn't happen. The … As a side note, … EDIT2: the error had nothing to do with …
Looks good to me. Only minor structural changes needed as far as I can tell; the actual code proposed looks good.
@ben-bou indeed I switched to …
rerun tests
Description

Fixing a few inconsistencies in the slicing of DNDarrays:

- the number of dimensions after slicing, esp. if the slice only contains 1 element, or if the slice is along the split axis. We cannot be 100% consistent with numpy/torch here, as we need to keep the split dimension no matter what.
- Slicing one single element along the split axis results in the loss of the split dimension, i.e. in a `DNDarray` with `split=None`. The rank containing the element broadcasts it to the others.

Issues resolved: #656 #754 #770
Changes proposed:

- slicing always returns a `DNDarray`, incl. when local tensors are empty
- `slice()` always keeps the "sliced" dimension if it corresponds to the split axis (N.B. different from numpy/torch)
- `key[split]` must be applied to all other dimensions as well

Also:

- `DNDarray.counts_displs()` returns actual counts and displacements (items and offsets) without assuming that the DNDarray is balanced, leading to...
- `getitem` no longer assumes that the DNDarray is balanced, partially solving "__getitem__ and __setitem__ assume DNDarray is balanced" (#668)

Type of change
Due Diligence
Does this change modify the behaviour of other functions? If so, which?
no
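The counts/displacements bookkeeping mentioned under "Changes proposed" can be sketched as follows. This is an illustration only, not Heat's implementation: `local_lengths` stands in for the per-rank lengths along the split axis (which the real method would obtain from the other processes), and, as in the PR, no balanced distribution is assumed.

```python
import numpy as np

def counts_displs(local_lengths):
    """Per-rank item counts and offsets along the split axis.

    Illustrative sketch (not the Heat implementation): handles
    unbalanced distributions, i.e. ranks may hold differing numbers
    of rows.
    """
    counts = np.asarray(local_lengths)
    # Offset of each rank's slab = sum of the counts of all earlier ranks.
    displs = np.concatenate(([0], np.cumsum(counts)[:-1]))
    return counts, displs

# Hypothetical unbalanced split of a 12-row array over 4 ranks:
counts, displs = counts_displs([3, 5, 2, 2])
print(counts.tolist())  # [3, 5, 2, 2]
print(displs.tolist())  # [0, 3, 8, 10]
```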