bugfix for ConcatLayer with propagate_down set #2972
Merged
Currently, if `propagate_down[i]` is false, `offset_concat_axis` is not correctly updated for any subsequent bottoms (i+1, i+2, ...).
This bug is quite serious for anyone it affects -- if you use ConcatLayer and some of its inputs don't need backpropagation (e.g., because they have no learned parameters below them), you should pull and try again with this fix. Many thanks to @bharath272 for reporting.
This was my bug -- it's been broken since merging #1970 in which I (tried to) generalize blobs and many layers to N dimensions. Sorry for the trouble.
In the near future I plan to go through all multi-bottom layers and at least glance to make sure they don't have similar `propagate_down`-related bugs, since we don't ever really test the `propagate_down` functionality. Edit: After going through the list, I don't think any other (official) Caffe layers are subject to this flavor of bug, but please let me know if I missed anything.