737ea5e Merge pull request #1112 from BVLC/next
shelhamer committed Sep 19, 2014
1 parent 58b4bc4 commit ef2f817
Showing 279 changed files with 9,701 additions and 3,353 deletions.
21 changes: 15 additions & 6 deletions development.html
Original file line number Diff line number Diff line change
@@ -1,10 +1,14 @@
<!doctype html>
<html>
<head>
<!-- MathJax -->
<script type="text/javascript"
src="http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML">
</script>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="chrome=1">
<title>
Caffe | Development
Caffe | Developing and Contributing
</title>

<link rel="stylesheet" href="/stylesheets/reset.css">
@@ -84,7 +88,7 @@ <h3 id="the-release-cycle">The release cycle</h3>

<h3 id="issues--pull-request-protocol">Issues &amp; Pull Request Protocol</h3>

<p>Use Github Issues to report <a href="https://github.com/BVLC/caffe/issues?labels=bug&amp;page=1&amp;state=open">bugs</a>, propose features, and ask <a href="https://github.com/BVLC/caffe/issues?labels=question&amp;page=1&amp;state=open">questions</a>.
<p>Use Github Issues to report <a href="https://github.com/BVLC/caffe/issues?labels=bug&amp;page=1&amp;state=open">bugs</a>, propose features, and ask development <a href="https://github.com/BVLC/caffe/issues?labels=question&amp;page=1&amp;state=open">questions</a>.
Large-scale development work is guided by <a href="https://github.com/BVLC/caffe/issues?milestone=1">milestones</a>, which are sets of Issues selected for concurrent release (integration from <code>dev</code> to <code>master</code>).</p>

<p>Please note that since the core developers are largely researchers, we may work on a feature in isolation for some time before releasing it to the community, so as to claim honest academic contribution.
@@ -93,13 +97,18 @@ <h3 id="issues--pull-request-protocol">Issues &amp; Pull Request Protocol</h3>
<p>When you are ready to start developing your feature or fixing a bug, follow this protocol:</p>

<ul>
<li>Do new development in <a href="https://www.atlassian.com/git/workflows#!workflow-feature-branch">feature branches</a> with descriptive names.</li>
<li>Bring your work up-to-date by <a href="http://git-scm.com/book/en/Git-Branching-Rebasing">rebasing</a> onto the latest <code>dev</code>.
<li>Develop in <a href="https://www.atlassian.com/git/workflows#!workflow-feature-branch">feature branches</a> with descriptive names.
<ul>
<li>For new development branch off <code>dev</code>.</li>
<li>For documentation and fixes for <code>master</code> branch off <code>master</code>.</li>
</ul>
</li>
<li>Bring your work up-to-date by <a href="http://git-scm.com/book/en/Git-Branching-Rebasing">rebasing</a> onto the latest <code>dev</code> / <code>master</code>.
(Polish your changes by <a href="https://help.github.com/articles/interactive-rebase">interactive rebase</a>, if you’d like.)</li>
<li><a href="https://help.github.com/articles/using-pull-requests">Pull request</a> your contribution to <code>BVLC/caffe</code>’s <code>dev</code> branch for discussion and review.
<li><a href="https://help.github.com/articles/using-pull-requests">Pull request</a> your contribution to <code>BVLC/caffe</code>’s <code>dev</code> / <code>master</code> branch for discussion and review.
<ul>
<li>Make PRs <em>as soon as development begins</em>, to let discussion guide development.</li>
<li>A PR is only ready for merge review when it is a fast-forward merge to dev, and all code is documented, linted, and tested – that means your PR must include tests!</li>
<li>A PR is only ready for merge review when it is a fast-forward merge, and all code is documented, linted, and tested – that means your PR must include tests!</li>
</ul>
</li>
<li>When the PR satisfies the above properties, use comments to request maintainer review.</li>
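The branch-and-rebase protocol above can be sketched as shell commands. This is a self-contained demo that uses a throwaway local repository standing in for a clone of <code>BVLC/caffe</code>; the branch name <code>my-feature</code> and file <code>fix.txt</code> are illustrative, not part of the project.

```shell
# Sketch of the feature-branch protocol, in a throwaway local repo.
# The local "dev" branch stands in for BVLC/caffe's upstream dev branch;
# "my-feature" and "fix.txt" are illustrative names.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git config user.email you@example.com
git config user.name "You"
echo caffe > README && git add README && git commit -qm "initial commit"
git branch dev                      # stand-in for the upstream dev branch

git checkout -q dev                 # new development branches off dev
git checkout -qb my-feature         # feature branch with a descriptive name
echo fix > fix.txt && git add fix.txt && git commit -qm "add fix"
git rebase -q dev                   # bring work up to date with the latest dev
git log --oneline dev..my-feature   # the commits your PR would contribute
```

In a real contribution you would rebase onto <code>origin/dev</code> after a <code>git fetch</code>, optionally polish history with <code>git rebase -i</code>, then push the branch to your fork and open the pull request against <code>BVLC/caffe</code>'s <code>dev</code> branch.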
99 changes: 50 additions & 49 deletions doxygen/annotated.html


2 changes: 1 addition & 1 deletion doxygen/benchmark_8hpp_source.html
@@ -131,7 +131,7 @@
</div><!-- fragment --></div><!-- contents -->
<!-- start footer part -->
<hr class="footer"/><address class="footer"><small>
Generated on Mon Sep 8 2014 12:10:58 for Caffe by &#160;<a href="http://www.doxygen.org/index.html">
Generated on Thu Sep 18 2014 22:25:43 for Caffe by &#160;<a href="http://www.doxygen.org/index.html">
<img class="footer" src="doxygen.png" alt="doxygen"/>
</a> 1.8.8
</small></address>
158 changes: 80 additions & 78 deletions doxygen/blob_8hpp_source.html


2 changes: 1 addition & 1 deletion doxygen/caffe_8hpp_source.html
@@ -107,7 +107,7 @@
</div><!-- fragment --></div><!-- contents -->
<!-- start footer part -->
<hr class="footer"/><address class="footer"><small>
Generated on Mon Sep 8 2014 12:10:58 for Caffe by &#160;<a href="http://www.doxygen.org/index.html">
Generated on Thu Sep 18 2014 22:25:43 for Caffe by &#160;<a href="http://www.doxygen.org/index.html">
<img class="footer" src="doxygen.png" alt="doxygen"/>
</a> 1.8.8
</small></address>
19 changes: 10 additions & 9 deletions doxygen/classcaffe_1_1AbsValLayer-members.html
@@ -121,18 +121,19 @@
<tr bgcolor="#f0f0f0" class="even"><td class="entry"><b>NeuronLayer</b>(const LayerParameter &amp;param) (defined in <a class="el" href="classcaffe_1_1NeuronLayer.html">caffe::NeuronLayer&lt; Dtype &gt;</a>)</td><td class="entry"><a class="el" href="classcaffe_1_1NeuronLayer.html">caffe::NeuronLayer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">explicit</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a1a3708013b0231e71d725252e10ce6e3">param_propagate_down</a>(const int param_id)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span></td></tr>
<tr class="even"><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#acd4a05def9ff3b42ad72404210613ef7">param_propagate_down_</a></td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">protected</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a899b09f4b91ada8545b3a43ee91e0d69">set_loss</a>(const int top_index, const Dtype value)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span></td></tr>
<tr class="even"><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a9a6fcb843803ed556f0a69cc2864379b">set_param_propagate_down</a>(const int param_id, const bool value)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a5bebaf079cff5bff7016be1733bb996e">SetLossWeights</a>(vector&lt; Blob&lt; Dtype &gt; * &gt; *top)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">protected</span></td></tr>
<tr class="even"><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a4f7809d8e708c4408a96af0752aec481">SetUp</a>(const vector&lt; Blob&lt; Dtype &gt; * &gt; &amp;bottom, vector&lt; Blob&lt; Dtype &gt; * &gt; *top)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a4a1754828dda22cc8daa2f63377f3579">ToProto</a>(LayerParameter *param, bool write_diff=false)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">virtual</span></td></tr>
<tr class="even"><td class="entry"><a class="el" href="classcaffe_1_1AbsValLayer.html#ab556af9217c109c8dae9c8b4a462b4a5">type</a>() const </td><td class="entry"><a class="el" href="classcaffe_1_1AbsValLayer.html">caffe::AbsValLayer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">virtual</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a29dba205196d9aeaa1cfeba4dc891093">type_name</a>() const </td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">virtual</span></td></tr>
<tr bgcolor="#f0f0f0" class="even"><td class="entry"><b>~Layer</b>() (defined in <a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a>)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">virtual</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1NeuronLayer.html#ae4d8d67cbdb21a2953d6dd36e4ec0572">Reshape</a>(const vector&lt; Blob&lt; Dtype &gt; * &gt; &amp;bottom, vector&lt; Blob&lt; Dtype &gt; * &gt; *top)</td><td class="entry"><a class="el" href="classcaffe_1_1NeuronLayer.html">caffe::NeuronLayer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">virtual</span></td></tr>
<tr class="even"><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a899b09f4b91ada8545b3a43ee91e0d69">set_loss</a>(const int top_index, const Dtype value)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a9a6fcb843803ed556f0a69cc2864379b">set_param_propagate_down</a>(const int param_id, const bool value)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span></td></tr>
<tr class="even"><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a5bebaf079cff5bff7016be1733bb996e">SetLossWeights</a>(vector&lt; Blob&lt; Dtype &gt; * &gt; *top)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">protected</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a4f7809d8e708c4408a96af0752aec481">SetUp</a>(const vector&lt; Blob&lt; Dtype &gt; * &gt; &amp;bottom, vector&lt; Blob&lt; Dtype &gt; * &gt; *top)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span></td></tr>
<tr class="even"><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a4a1754828dda22cc8daa2f63377f3579">ToProto</a>(LayerParameter *param, bool write_diff=false)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">virtual</span></td></tr>
<tr><td class="entry"><a class="el" href="classcaffe_1_1AbsValLayer.html#ab556af9217c109c8dae9c8b4a462b4a5">type</a>() const </td><td class="entry"><a class="el" href="classcaffe_1_1AbsValLayer.html">caffe::AbsValLayer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">virtual</span></td></tr>
<tr class="even"><td class="entry"><a class="el" href="classcaffe_1_1Layer.html#a29dba205196d9aeaa1cfeba4dc891093">type_name</a>() const </td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">virtual</span></td></tr>
<tr bgcolor="#f0f0f0"><td class="entry"><b>~Layer</b>() (defined in <a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a>)</td><td class="entry"><a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td><td class="entry"><span class="mlabel">inline</span><span class="mlabel">virtual</span></td></tr>
</table></div><!-- contents -->
<!-- start footer part -->
<hr class="footer"/><address class="footer"><small>
Generated on Mon Sep 8 2014 12:10:59 for Caffe by &#160;<a href="http://www.doxygen.org/index.html">
Generated on Thu Sep 18 2014 22:25:43 for Caffe by &#160;<a href="http://www.doxygen.org/index.html">
<img class="footer" src="doxygen.png" alt="doxygen"/>
</a> 1.8.8
</small></address>
15 changes: 9 additions & 6 deletions doxygen/classcaffe_1_1AbsValLayer.html
@@ -115,7 +115,7 @@
&#160;</td><td class="memItemRight" valign="bottom"><b>AbsValLayer</b> (const LayerParameter &amp;param)</td></tr>
<tr class="separator:a513c13552694e5860b85986b94793eb0"><td class="memSeparator" colspan="2">&#160;</td></tr>
<tr class="memitem:a2a32937ff04041fb76672e1e5bb3e0aa"><td class="memItemLeft" align="right" valign="top">virtual void&#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="classcaffe_1_1AbsValLayer.html#a2a32937ff04041fb76672e1e5bb3e0aa">LayerSetUp</a> (const vector&lt; <a class="el" href="classcaffe_1_1Blob.html">Blob</a>&lt; Dtype &gt; * &gt; &amp;bottom, vector&lt; <a class="el" href="classcaffe_1_1Blob.html">Blob</a>&lt; Dtype &gt; * &gt; *top)</td></tr>
<tr class="memdesc:a2a32937ff04041fb76672e1e5bb3e0aa"><td class="mdescLeft">&#160;</td><td class="mdescRight">Does layer-specific setup: your layer should implement this. <a href="#a2a32937ff04041fb76672e1e5bb3e0aa">More...</a><br /></td></tr>
<tr class="memdesc:a2a32937ff04041fb76672e1e5bb3e0aa"><td class="mdescLeft">&#160;</td><td class="mdescRight">Does layer-specific setup: your layer should implement this function as well as Reshape. <a href="#a2a32937ff04041fb76672e1e5bb3e0aa">More...</a><br /></td></tr>
<tr class="separator:a2a32937ff04041fb76672e1e5bb3e0aa"><td class="memSeparator" colspan="2">&#160;</td></tr>
<tr class="memitem:ab556af9217c109c8dae9c8b4a462b4a5"><td class="memItemLeft" align="right" valign="top"><a class="anchor" id="ab556af9217c109c8dae9c8b4a462b4a5"></a>
virtual LayerParameter_LayerType&#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="classcaffe_1_1AbsValLayer.html#ab556af9217c109c8dae9c8b4a462b4a5">type</a> () const </td></tr>
@@ -131,6 +131,9 @@
<tr class="memitem:a43348846697146ea9f01a773855b0915 inherit pub_methods_classcaffe_1_1NeuronLayer"><td class="memItemLeft" align="right" valign="top"><a class="anchor" id="a43348846697146ea9f01a773855b0915"></a>
&#160;</td><td class="memItemRight" valign="bottom"><b>NeuronLayer</b> (const LayerParameter &amp;param)</td></tr>
<tr class="separator:a43348846697146ea9f01a773855b0915 inherit pub_methods_classcaffe_1_1NeuronLayer"><td class="memSeparator" colspan="2">&#160;</td></tr>
<tr class="memitem:ae4d8d67cbdb21a2953d6dd36e4ec0572 inherit pub_methods_classcaffe_1_1NeuronLayer"><td class="memItemLeft" align="right" valign="top">virtual void&#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="classcaffe_1_1NeuronLayer.html#ae4d8d67cbdb21a2953d6dd36e4ec0572">Reshape</a> (const vector&lt; <a class="el" href="classcaffe_1_1Blob.html">Blob</a>&lt; Dtype &gt; * &gt; &amp;bottom, vector&lt; <a class="el" href="classcaffe_1_1Blob.html">Blob</a>&lt; Dtype &gt; * &gt; *top)</td></tr>
<tr class="memdesc:ae4d8d67cbdb21a2953d6dd36e4ec0572 inherit pub_methods_classcaffe_1_1NeuronLayer"><td class="mdescLeft">&#160;</td><td class="mdescRight">Adjust the shapes of top blobs and internal buffers to accommodate the shapes of the bottom blobs. <a href="#ae4d8d67cbdb21a2953d6dd36e4ec0572">More...</a><br /></td></tr>
<tr class="separator:ae4d8d67cbdb21a2953d6dd36e4ec0572 inherit pub_methods_classcaffe_1_1NeuronLayer"><td class="memSeparator" colspan="2">&#160;</td></tr>
<tr class="inherit_header pub_methods_classcaffe_1_1Layer"><td colspan="2" onclick="javascript:toggleInherit('pub_methods_classcaffe_1_1Layer')"><img src="closed.png" alt="-"/>&#160;Public Member Functions inherited from <a class="el" href="classcaffe_1_1Layer.html">caffe::Layer&lt; Dtype &gt;</a></td></tr>
<tr class="memitem:a7b4e4ccea08c7b8b15acc6829d5735f6 inherit pub_methods_classcaffe_1_1Layer"><td class="memItemLeft" align="right" valign="top">&#160;</td><td class="memItemRight" valign="bottom"><a class="el" href="classcaffe_1_1Layer.html#a7b4e4ccea08c7b8b15acc6829d5735f6">Layer</a> (const LayerParameter &amp;param)</td></tr>
<tr class="separator:a7b4e4ccea08c7b8b15acc6829d5735f6 inherit pub_methods_classcaffe_1_1Layer"><td class="memSeparator" colspan="2">&#160;</td></tr>
@@ -461,17 +464,17 @@
</table>
</div><div class="memdoc">

<p>Does layer-specific setup: your layer should implement this. </p>
<p>Does layer-specific setup: your layer should implement this function as well as Reshape. </p>
<dl class="params"><dt>Parameters</dt><dd>
<table class="params">
<tr><td class="paramname">bottom</td><td>the preshaped input blobs, whose data fields store the input data for this layer </td></tr>
<tr><td class="paramname">top</td><td>the allocated but unshaped output blobs, to be initialized by LayerSetUp</td></tr>
<tr><td class="paramname">top</td><td>the allocated but unshaped output blobs</td></tr>
</table>
</dd>
</dl>
<p>This method should be used to do layer-specific setup. At a minimum, this includes reshaping the empty top blobs to the shape as dictated by the shapes of the bottom blobs and any relevant parameters from the <code>layer_param_</code>. </p>
<p>This method should do one-time layer-specific setup. This includes reading and processing relevant parameters from the <code>layer_param_</code>. Setting up the shapes of top blobs and internal buffers should be done in <code>Reshape</code>, which will be called before the forward pass to adjust the top blob sizes. </p>

<p>Reimplemented from <a class="el" href="classcaffe_1_1NeuronLayer.html#ae289209c576059f6b2928a761515eabd">caffe::NeuronLayer&lt; Dtype &gt;</a>.</p>
<p>Reimplemented from <a class="el" href="classcaffe_1_1Layer.html#a11a81d8a3722fcbceab72e6a964695e2">caffe::Layer&lt; Dtype &gt;</a>.</p>

</div>
</div>
@@ -482,7 +485,7 @@
</div><!-- contents -->
<!-- start footer part -->
<hr class="footer"/><address class="footer"><small>
Generated on Mon Sep 8 2014 12:10:59 for Caffe by &#160;<a href="http://www.doxygen.org/index.html">
Generated on Thu Sep 18 2014 22:25:43 for Caffe by &#160;<a href="http://www.doxygen.org/index.html">
<img class="footer" src="doxygen.png" alt="doxygen"/>
</a> 1.8.8
</small></address>
