
refactored settings.py for test-derivatives #75

Merged: 9 commits into f-dangel:development on May 29, 2020

Conversation

sbharadwajj (Contributor):

This pull request refactors the settings for test-derivatives.
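For orientation, each entry in the refactored settings describes one module to test. A minimal sketch of the format, assuming a SETTINGS list and a "module_kwargs" key (only "module_fn" and "input_kwargs" appear in the diff fragments below):

import torch

# Hypothetical shape of one settings entry; SETTINGS and "module_kwargs"
# are assumptions, "module_fn" and "input_kwargs" come from this PR's diff.
SETTINGS = [
    {
        "module_fn": torch.nn.ReLU,
        "module_kwargs": {},
        "input_kwargs": {"size": (4, 3, 4, 5)},
    },
]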

# examples #
###############################################################################

example = {
f-dangel (Owner):

Should be a conv example
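A conv entry along these lines would fit the existing format (a sketch only; the Conv2d arguments here are illustrative, not from the PR):

import torch

example = {
    "module_fn": torch.nn.Conv2d,
    # Hypothetical arguments matching the (4, 3, 4, 5) input:
    # batch of 4, 3 input channels, 4x5 spatial size.
    "module_kwargs": {"in_channels": 3, "out_channels": 6, "kernel_size": 2},
    "input_kwargs": {"size": (4, 3, 4, 5)},
}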

"input_kwargs": {"size": (4, 3, 4, 5)},
},
{
"module_fn": torch.nn.ZeroPad2d,
f-dangel (Owner):

Create a new file for paddings
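A padding settings file could look like this sketch (the file name and the PADDING_SETTINGS variable are assumptions, not from the PR):

# hypothetical new file, e.g. padding_settings.py
import torch

PADDING_SETTINGS = [
    {
        "module_fn": torch.nn.ZeroPad2d,
        "module_kwargs": {"padding": 1},  # pad all four sides by one pixel
        "input_kwargs": {"size": (4, 3, 4, 5)},
    },
]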

Activation layers
Convolutional layers
Linear layers
Loss functions
f-dangel (Owner), May 27, 2020:

Add padding layers

Comment on lines 175 to 188
def derivative_prefix_for(self):
    """Return a short category prefix for the derivative's test name."""
    self.derivative = self.make_derivative()
    if is_activation(self.derivative):
        prefix = "act"
    elif is_layer(self.derivative):
        prefix = "layer"
    elif is_pooling(self.derivative):
        prefix = "pool"
    elif is_loss(self.derivative):
        prefix = "loss"
    else:
        prefix = ""
    return prefix

f-dangel (Owner):

Thought a bit more about it and saw the modified names in the failing Travis run. Now I think this does not add sufficient new information. If someone is developing a new MyLayer, it's possible to filter the tests with

pytest -vx . -k MyLayer

Could you undo this prefixing? I might have missed an argument.
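For reference, -k filtering works because parametrized test IDs already carry the module name, so no extra prefix is needed. A minimal sketch (names here are hypothetical, not the suite's actual code):

import pytest
import torch

MODULES = [torch.nn.ReLU, torch.nn.Sigmoid, torch.nn.MSELoss]  # hypothetical

@pytest.mark.parametrize("module_fn", MODULES, ids=lambda fn: fn.__name__)
def test_derivative(module_fn):
    # IDs become test_derivative[ReLU], test_derivative[MSELoss], ...,
    # so `pytest -vx . -k MSELoss` selects exactly one case.
    assert callable(module_fn)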

@f-dangel f-dangel merged commit 258ea91 into f-dangel:development May 29, 2020