Test recognize_digits_conv #4926
Conversation
""" | ||
return map(param_name -> (grad_name, block_index, op_index)) | ||
""" | ||
assert isinstance(target, Variable) | ||
if no_grad_set is None: | ||
no_grad_set = set() |
I think changing append_backward(self, target, no_grad_set=None) to append_backward(self, target, no_grad_set=set()) would do the same trick?
Well, I think users will sometimes invoke backward as program.append_backward(target=var, no_grad_set=None), so the None case still needs handling.
Default parameters in Python behave very differently from other languages: the default value is evaluated only once, at function definition time, so a mutable default like set() would be shared across all calls that omit the argument.
I see. Thanks for the heads up.
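The pitfall discussed above can be shown in a short, self-contained sketch. The function names here are illustrative, not Paddle's actual API; the point is the contrast between a mutable default and the None-sentinel pattern the PR uses.

```python
# Why `no_grad_set=set()` as a default is risky: Python evaluates default
# values once, at `def` time, so every call that omits the argument shares
# the SAME set object.

def append_backward_bad(target, no_grad_set=set()):
    no_grad_set.add(target)          # mutates the shared default!
    return no_grad_set

def append_backward_good(target, no_grad_set=None):
    if no_grad_set is None:          # the sentinel pattern used in this PR
        no_grad_set = set()          # fresh set on every call
    no_grad_set.add(target)
    return no_grad_set

print(append_backward_bad("a"))      # {'a'}
print(append_backward_bad("b"))      # {'a', 'b'} -- leaked from the first call
print(append_backward_good("a"))     # {'a'}
print(append_backward_good("b"))     # {'b'} -- calls stay independent
```

The sentinel version also handles callers who pass no_grad_set=None explicitly, which is the scenario raised in the thread above.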
@@ -35,7 +35,10 @@ def fc(input,
            "Y": w,
        },
        outputs={"Out": tmp},
        attrs={'x_num_col_dims': num_flatten_dims})
        attrs={
            'x_num_col_dims': num_flatten_dims,
x_num_col_dims and num_flatten_dims refer to the same thing. Maybe we should use the same name?
x_num_col_dims is an attribute of the mul op, while num_flatten_dims is an attribute of the fc layer. They are at different levels.
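The two-level naming can be sketched as follows. This is a hypothetical, heavily simplified stand-in (plain dicts instead of Paddle's real Program/Operator classes): the fc layer exposes num_flatten_dims to users and translates it into the mul op's attribute x_num_col_dims when it emits the op.

```python
# Toy op recorder, illustrating layer-level vs. op-level attribute names.

def append_mul_op(ops, x, y, x_num_col_dims):
    """Op level: `x_num_col_dims` is the mul op's attribute name."""
    ops.append({
        "type": "mul",
        "inputs": {"X": x, "Y": y},
        "attrs": {"x_num_col_dims": x_num_col_dims},
    })

def fc(ops, input, w, num_flatten_dims=1):
    """Layer level: `num_flatten_dims` is the user-facing name.

    The layer forwards it down as the op attribute.
    """
    append_mul_op(ops, input, w, x_num_col_dims=num_flatten_dims)

ops = []
fc(ops, "pixel", "fc_w", num_flatten_dims=2)
print(ops[0]["attrs"])   # {'x_num_col_dims': 2}
```

Keeping the names distinct lets the layer API evolve independently of the op definition, which is the design point made in the reply above.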
program.append_backward(avg_cost, set())
print str(program)
program.append_backward(avg_cost)
# print str(program)
clean up.
Actually, this line should not be committed. It is commented out for now to reduce test output.
@@ -41,21 +42,49 @@ def test_recognize_digits_mlp(self):
# print str(program)
clean up.
Same as the reply above.
    name='pixel', shape=[3, 48, 48], data_type='int32', program=program)
conv2d_layer(
layers.conv2d(
    input=images, num_filters=3, filter_size=[4, 4], program=program)

# print str(program)
clean up.
Same as the reply above.
cost = layers.cross_entropy(input=predict, label=label, program=program)
avg_cost = layers.mean(x=cost, program=program)

program.append_backward(avg_cost)

print str(program)
clean up.
Same as the reply above.
Awesome
fixes #4927