Some spikes in the total loss #807
Comments
Sometimes it might be caused by an out-of-distribution training image, i.e., an unusual training image that looks quite different from the others. For example, most of your inputs have rich texture and structure, and suddenly you feed the network a flat input.
Do you mean that sometimes the input to the network is a noisy image (i.e., mostly background with black regions, and the foreground is quite small)?
If you want to inspect it, you can add some debugging code: each time the loss rises above some threshold, save the current images to disk.
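The suggestion above can be sketched as a small helper hooked into the training loop. Everything here is an assumption for illustration: the function name `maybe_dump_batch`, the threshold of 2.0, and the output directory are all placeholders, and the actual writing is delegated to an injectable `saver` (e.g. `torchvision.utils.save_image` for image tensors) so the sketch stays framework-agnostic.

```python
import os

def maybe_dump_batch(loss_value, batch, step, threshold=2.0,
                     out_dir="loss_spikes", saver=None):
    """Save the offending batch to disk when the loss spikes.

    `loss_value`: the scalar total loss at this step.
    `saver(batch, path)`: a callable that writes the batch, e.g.
        torchvision.utils.save_image for image tensors.
    `threshold`: a placeholder -- pick a value above your typical loss.
    Returns the path written, or None if the loss was below threshold.
    """
    if loss_value <= threshold:
        return None
    os.makedirs(out_dir, exist_ok=True)
    path = os.path.join(out_dir, f"step_{step:06d}.png")
    if saver is not None:
        saver(batch, path)
    return path
```

In a CycleGAN-style loop you would call this right after computing the total loss, passing the current `real_A`/`real_B` batch, so the images that triggered the spike are preserved for inspection.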
Great suggestion. I will let you know the result. Actually, I hit the issue when I train with multiple GPUs. I increased the crop size, but the issue still happens. Let me debug it.
@junyanz I found the issue. The generator is too strong because I added a new loss to it; hence, D may be easy to fool. Do you have any suggestions for solving this? Do I need to add more layers to the D network (currently I use a 7x7 patch as the output of the D network)?
You can add a few more layers to D, or increase D's learning rate.
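Both fixes above can be sketched in PyTorch. The layer counts, channel widths, and the 2x learning-rate factor below are illustrative assumptions, not the repository's exact `NLayerDiscriminator` architecture or a recommended setting; the stock pix2pix/CycleGAN discriminator uses three downsampling layers, and bumping `n_layers` gives D more capacity while shrinking the output patch.

```python
import torch
import torch.nn as nn

def build_patch_discriminator(in_ch=3, base=64, n_layers=4):
    """PatchGAN-style discriminator sketch with a configurable depth.

    n_layers=4 adds one more stride-2 block than the usual 3, increasing
    D's capacity. The exact structure here is a simplified assumption.
    """
    layers = [nn.Conv2d(in_ch, base, 4, stride=2, padding=1),
              nn.LeakyReLU(0.2, True)]
    ch = base
    for _ in range(1, n_layers):
        nxt = min(ch * 2, 512)  # cap channel growth at 512
        layers += [nn.Conv2d(ch, nxt, 4, stride=2, padding=1),
                   nn.InstanceNorm2d(nxt),
                   nn.LeakyReLU(0.2, True)]
        ch = nxt
    layers += [nn.Conv2d(ch, 1, 4, stride=1, padding=1)]  # 1-channel patch map
    return nn.Sequential(*layers)

# A separate optimizer lets you raise D's learning rate relative to G's;
# the 2x factor over the common 2e-4 default is just a starting guess.
D = build_patch_discriminator()
opt_D = torch.optim.Adam(D.parameters(), lr=4e-4, betas=(0.5, 0.999))
```

If the discriminator then becomes too strong instead, the usual counter-moves are to dial the learning rate back or reduce the weight of the extra generator loss.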
During training, I found that the total loss of CycleGAN rapidly increases at some epochs, as shown in the figure. Could you tell me the technical term for this issue? How can I solve it?