Batch norm is not working correctly #18

Open
layerwise opened this issue Jan 14, 2017 · 0 comments

I think issue #15 is part of a bigger problem with conv_batch_norm in infogan/misc/custom_ops.py.

First of all, I noticed that the following block of code (lines 126-128 of infogan/algos/infogan_trainer.py) does not actually invoke the test graph:

        with pt.defaults_scope(phase=pt.Phase.test):
            with tf.variable_scope("model", reuse=True) as scope:
                self.visualize_all_factors()

I think the solution is to define a pt.UnboundVariable('phase') while making the templates, something along the lines of

        with pt.defaults_scope(phase=pt.UnboundVariable('phase', default=pt.Phase.train)):
            if network_type == "mnist":
                with tf.variable_scope("d_net"):
                    template = pt.template("input")

        train_output = template.construct(input=x_var)
        test_output = template.construct(input=x_var, phase=pt.Phase.test)

Not using the test graph for visualization does not affect the MNIST images all that much, because normalizing with the statistics of the test batch seems to work quite well, but for larger images it might have an impact. Have you checked this?

However, I also think there is a larger issue with the test graph of conv_batch_norm. I was not able to make it work even after fixing #15, and had to change the code to something closer to the official batch normalization layer.
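
To make concrete what I mean by "closer to the official batch normalization layer", here is a rough sketch (written from memory, not the exact code in my pull request; the names batch_norm, phase_train and the decay default are just placeholders): during training, normalize with the current batch statistics and update exponential moving averages; at test time, normalize with the stored moving averages instead.

    import tensorflow as tf

    def batch_norm(x, phase_train, name="bn", decay=0.9, epsilon=1e-5):
        # x: 4-D conv activations [batch, height, width, channels].
        # phase_train: scalar boolean tensor, True during training.
        with tf.variable_scope(name):
            channels = x.get_shape().as_list()[-1]
            beta = tf.get_variable("beta", [channels],
                                   initializer=tf.constant_initializer(0.0))
            gamma = tf.get_variable("gamma", [channels],
                                    initializer=tf.constant_initializer(1.0))
            moving_mean = tf.get_variable("moving_mean", [channels],
                                          initializer=tf.constant_initializer(0.0),
                                          trainable=False)
            moving_var = tf.get_variable("moving_var", [channels],
                                         initializer=tf.constant_initializer(1.0),
                                         trainable=False)

            def train_phase():
                # Normalize with the batch statistics and update the moving averages.
                mean, var = tf.nn.moments(x, [0, 1, 2])
                update_mean = tf.assign(moving_mean,
                                        decay * moving_mean + (1 - decay) * mean)
                update_var = tf.assign(moving_var,
                                       decay * moving_var + (1 - decay) * var)
                with tf.control_dependencies([update_mean, update_var]):
                    return tf.identity(mean), tf.identity(var)

            def test_phase():
                # Normalize with the accumulated moving averages instead.
                return moving_mean, moving_var

            mean, var = tf.cond(phase_train, train_phase, test_phase)
            return tf.nn.batch_normalization(x, mean, var, beta, gamma, epsilon)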

Please check whether I am on to something here. I have a pull request ready that includes a small script, test_batch_norm.py, to check whether the batch norm implementation behaves reasonably.
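
By "behaving reasonably" I mean something like the following (only a sketch, not the actual script; it assumes the batch_norm function sketched above): in the training phase the normalized activations should have per-channel mean close to 0 and variance close to 1, and in the test phase the layer should fall back to the moving averages rather than the current batch statistics.

    import numpy as np
    import tensorflow as tf

    x = tf.placeholder(tf.float32, [None, 8, 8, 3])
    phase_train = tf.placeholder(tf.bool, [])
    y = batch_norm(x, phase_train)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        data = np.random.randn(64, 8, 8, 3) * 5.0 + 2.0
        # Train phase: output should be roughly zero-mean, unit-variance per channel.
        out = sess.run(y, {x: data, phase_train: True})
        print(out.mean(axis=(0, 1, 2)), out.var(axis=(0, 1, 2)))
        # Test phase: normalization now uses the moving averages, so after only
        # one update the output should clearly differ from the train-phase output.
        out_test = sess.run(y, {x: data, phase_train: False})
        print(np.abs(out - out_test).max())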
