feature/dynamic recurrent op forward and backward #4799
Conversation
Commit history (all commits authored by superjom <[email protected]>; all dates 2017, UTC-4):

cc50ca9  Oct 11 16:21  clean code
3850325  Oct 11 16:16  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op-forward-test
15d54bb  Oct 11 16:13  forward run smoothly
d8b0f14  Oct 10 20:39  using namespace related
a8ad874  Oct 10 20:38  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op-forward-test
150acac  Oct 10 19:40  add DySeqBatch
f282903  Oct 10 18:23  create outputs in step scopes
066cbb6  Oct 10 18:21  format
50c033b  Oct 10 18:19  init reorder boot
b5e8a8a  Oct 10 13:56  merge dynamic op changes
dccf0cc  Oct 09 19:45  small fix
4a0cc85  Oct 09 19:44  Merge branch 'feature/dynamic-recurrent-op' into feature/dynamic-recurrent-op-forward-test
d30ada2  Oct 09 19:01  fix missing lib
50c364e  Oct 09 17:44  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op-forward-test
3cfaa8b  Oct 09 16:59  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op
58d53f1  Oct 09 15:57  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op
e578fc3  Oct 09 15:56  add definition for WITH_TESTING
e99a4ce  Oct 09 15:56  add pybind
df46223  Oct 09 14:19  fix maker name
0e40af5  Oct 09 14:14  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op
0c97e0c  Oct 09 14:14  fix gtest_prod.h loss error
7245bac  Oct 09 14:13  make states a map
72706ae  Oct 09 13:38  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op-forward-test
21b84bd  Oct 09 13:35  fix cuda error
b83ffae  Oct 06 20:06  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op
9b1bfc5  Oct 06 18:11  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op-forward-test
9b4991c  Oct 05 13:59  fix gtest
cdc6e5d  Oct 05 13:55  fix gtest
d300ce6  Oct 05 13:54  add pybind
d5349e1  Oct 04 18:41  clean code
7d59aee  Oct 04 18:40  recover op_registry
13c1d6d  Oct 04 18:38  recover op_registry
728b4de  Oct 04 18:06  Merge branch 'feature/dynamic-recurrent-op' of github.com:Superjom/Paddle into feature/dynamic-recurrent-op
f94f0a8  Oct 04 18:06  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op
cf87cd6  Oct 04 18:06  finish
76169e0  Oct 04 18:04  finish debug
d432f17  Oct 04 13:16  finish forward implementation
85dac0f  Oct 02 22:52  add test for dynamic recurrent op
fceaad8  Oct 02 16:34  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op
aaaed8b  Oct 02 13:44  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/dynamic-recurrent-op
cf228a7  Oct 02 13:40  add registry
c138e6b  Sep 30 19:53  add Run
b3d28ae  Sep 30 18:44  Merge branch 'feature/tensor_array' into feature/dynamic-recurrent-op
2719ff6  Sep 30 18:17  Merge branch 'develop' of github.com:PaddlePaddle/Paddle into feature/tensor_array
956f949  Sep 30 15:21  set type
870be95  Sep 30 15:17  update
d77fb89  Sep 30 15:03  add implementation
661f30b  Sep 27 22:09  add tensor array
    def create_forward_op(self):
        # create RNNOp
        self.forward_op = DynamicRecurrentOp(
            # inputs
inlinks => inputs
boot_memories => initial_memory_value
step_net => step
outlinks => outputs
pre_memories => ex_memory
As you are making something innovative, I guess you don't want to simply copy the terminology from Caffe2.
done
boot_memories=["h_boot"], | ||
step_net="stepnet", | ||
# outputs | ||
outlinks=["h@mem"], |
What is the naming convention for `h@mem` and `h@pre`? What does the `@` here mean?
Currently, the Python API is not complete yet, so the high-level syntax we described before cannot be used in this unit test. Both `h@mem` and `h@pre` are underlying names for `memory` and `ex-memory`; the work of defining these two names should be done by the Python API in the future.
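For illustration, here is a minimal sketch of the convention as I read it from the reply above (the helper `Qualified` and the role suffixes are assumptions, not actual Paddle code): the `@` separates a user-visible variable name from an internal role, so `h@mem` is the memory of `h` and `h@pre` is its previous-step (ex-memory) value.

```cpp
#include <iostream>
#include <string>

// Hypothetical helper: build an internal name by appending a role
// suffix after '@', e.g. Qualified("h", "mem") -> "h@mem".
std::string Qualified(const std::string& var, const std::string& role) {
  return var + "@" + role;
}

int main() {
  std::cout << Qualified("h", "mem") << "\n";  // h@mem: the memory of h
  std::cout << Qualified("h", "pre") << "\n";  // h@pre: h's ex-memory (previous step)
  return 0;
}
```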
        a = set()
        backward_op = core.DynamicRecurrentOp.backward(self.forward_op, a)

    def create_step_net(self):
In the design, we use Python's `with` to construct the step block. It seems that this Python test should also do that.
This is an underlying API; the with-statement syntax is not supported yet because the Python API is still under development.
        a = set()
        backward_op = core.DynamicRecurrentOp.backward(self.forward_op, a)

    def create_step_net(self):
We are not going to keep NetOp; we are moving to blocks. Here the step should be represented not by a NetOp instance but by a block. We mustn't go on using the name "step_net".
Currently, this code still focuses on the primary logic of the RNN itself. The implementation is based on `NetOp` because that is stable and simpler. I will start the migration to `block` and `executor` next Monday.
 * Link the pre-state of the first time step to the `boot-state` in parent's
 * scope.
 */
void LinkBootState(const rnn::MemoryAttr& memory);
As we talked before, it is not "boot" but "bootstrap". A plainer name is "initial".
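A sketch of what the suggested rename might look like (hypothetical, not the actual patch; `rnn::MemoryAttr` is stubbed here and the body is elided):

```cpp
namespace rnn {
// Stub of the attribute struct; the field names are assumptions.
struct MemoryAttr {
  const char* var;          // state variable name
  const char* pre_var;      // previous-step state name
  const char* initial_var;  // would replace `boot_var` under the rename
};
}  // namespace rnn

/*
 * Link the pre-state of the first time step to the `initial-state`
 * variable in the parent scope.
 */
void LinkInitialState(const rnn::MemoryAttr& memory) {
  // Body elided: would look up memory.initial_var in the parent scope
  // and share its tensor with the first step's pre-state.
}
```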
/*
 * different `Run` method for forward and backward.
 */
template <ComputeMode _>
What are the possible values of `ComputeMode`? Why do we need the `_` here? We need at least a class comment here to explain such stuff.
// Attributes stored in AttributeMap
AddAttr<std::vector<std::string>>(name.pre_memories,
                                  "names of pre-memories");
pre-memory => ex-memory
std::map<std::string, framework::TensorArray> step_outputs_;
std::map<std::string, std::vector<framework::DySeqMeta>> dy_seq_metas_;
rnn::Argument arg_;
ArgCache cache_;
What is the purpose of `ArgCache`? I saw that the terminology in your code includes "inlinks", "outlinks", and "memory". Are they in `ArgCache`, or what is the relationship between them and "argument"?
This is a cache of all the arguments located in the parent scope, including inlinks, outlinks, memories, and so on. RNN's computation is split into a bunch of protected methods; it is painful and ugly to pass the arguments through all the member functions, so a cache called `ArgCache` is added as a member. `RNNAlgorithm` has only one public member function, `Run`, and the `ArgCache` is refreshed whenever `Run` is invoked.
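A minimal sketch of the relationship described above (the struct layouts and names are assumptions for illustration, not the actual Paddle code): `rnn::Argument` holds the names declared on the op, while `ArgCache` holds whatever those names resolve to in the parent scope, refreshed on each `Run`.

```cpp
#include <map>
#include <string>
#include <vector>

namespace framework {
struct Variable {};  // stand-in for the real framework::Variable
struct Scope {
  // Stub lookup; the real scope would own and resolve variables.
  Variable* FindVar(const std::string& /*name*/) { return nullptr; }
};
}  // namespace framework

namespace rnn {
// Names declared on the operator (attributes), not resolved variables.
struct Argument {
  std::vector<std::string> inlinks;
  std::vector<std::string> outlinks;
  std::vector<std::string> memories;
};
}  // namespace rnn

// Cache of the variables that the names in rnn::Argument resolve to in
// the parent scope, so the protected step methods can share them
// without long parameter lists.
struct ArgCache {
  std::map<std::string, framework::Variable*> inputs;
  std::map<std::string, framework::Variable*> outputs;
  std::map<std::string, framework::Variable*> memories;

  // Re-resolve every name against the current parent scope; called at
  // the start of each Run.
  void Init(const rnn::Argument& arg, framework::Scope* scope) {
    for (const auto& name : arg.inlinks) inputs[name] = scope->FindVar(name);
    for (const auto& name : arg.outlinks) outputs[name] = scope->FindVar(name);
    for (const auto& name : arg.memories) memories[name] = scope->FindVar(name);
  }
};
```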
LGTM!
Fixes #4561