
Add parameter layer for learning any bottom #2079

Merged: 1 commit merged into BVLC:master on May 9, 2016

Conversation

@longjon (Contributor) commented on Mar 9, 2015

(This is a minimal step in the direction of #1474. From discussion with @jeffdonahue.)

This layer simply holds a parameter blob of user-defined shape, and shares it as its single top.

This is useful if you want to learn disconnected bottoms in a net. In theory, all parameters could be handled this way, which would be a full realization of #1474. It is, however, not clear that that's the right thing to do from a user interface perspective: params would lose their semantic distinction, and their sizes would end up being double-specified since current layers already compute them.

Whether or not this ends up being on the solution path, I wanted to go ahead and throw up this PR since I'm already making use of it with other, future PRs.
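
For concreteness, here is a minimal prototxt sketch of declaring such a layer. This assumes the layer is exposed as type "Parameter" with a `parameter_param` shape field, matching the PR's description of a parameter blob of user-defined shape; the 1x64 shape is an arbitrary example.

```
# A learnable blob of user-defined shape, exposed as the single top "weights".
layer {
  name: "weights"
  type: "Parameter"             # assumed registered type name for this layer
  top: "weights"
  parameter_param {
    shape { dim: 1 dim: 64 }    # user-chosen shape of the parameter blob
  }
}
```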

@jeffdonahue (Contributor)

Cool, this looks good to me -- feel free to merge as you see fit. (I did realize after our discussion that this couldn't just be used with an Eltwise SUM/PROD to emulate a hypothetical BiasLayer or DiagInnerProductLayer, since the params would need to be tiled to match the number of instances. I should PR my TileLayer, which would then let you do that sort of thing with this... Edit: see TileLayer in #2083.)
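
For illustration, a sketch of the combination described above: a Parameter layer tiled across the batch and summed elementwise onto a bottom, emulating a hypothetical BiasLayer. The Tile and Eltwise fields follow standard Caffe usage; the batch size of 32, channel count of 64, and the bottom "data" are assumptions for the example.

```
# Emulate a per-channel bias on a 32x64 bottom "data" (shapes assumed).
layer {
  name: "bias"
  type: "Parameter"
  top: "bias"
  parameter_param { shape { dim: 1 dim: 64 } }   # one bias per channel
}
layer {
  name: "bias_tiled"
  type: "Tile"                                   # TileLayer from #2083
  bottom: "bias"
  top: "bias_tiled"
  tile_param { axis: 0 tiles: 32 }               # repeat across the batch
}
layer {
  name: "biased"
  type: "Eltwise"
  bottom: "data"
  bottom: "bias_tiled"
  top: "biased"
  eltwise_param { operation: SUM }               # elementwise add of the bias
}
```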

@longjon (Contributor, Author) commented on May 6, 2016

This has now been rebased, as I'm continuing to use it in some of my models. I'll plan to merge soon per @jeffdonahue's prior approval, unless there are any further comments (in particular, if master has changed in ways that affect this PR -- I can't think of any).

@jeffdonahue (Contributor)

still LGTM

@shelhamer (Member)

Fine by me. This does come up from time to time, so let's merge.

@longjon (Contributor, Author) commented on May 9, 2016

Okay, thanks for the eyes; merging this simple layer, which I've made quite a bit of use of myself.

@longjon merged commit c6bd853 into BVLC:master on May 9, 2016
fxbit pushed a commit to Yodigram/caffe referencing this pull request on Sep 1, 2016: "Add parameter layer for learning any bottom"