Suggest beginning to train the 20*192-size net #124
Comments
Thank you for asking. The current structure is in fact 12x256, not 12x128. We are going to train 20x256 next, but there are some problems we want to solve first, just to be sure that we are not stalling because of them.
When will you switch to a 20x256 net? A stronger net will attract more contributors.
Thank you as always for your interest, @l1t1. We are trying to train the 20x256 network, but there are technical difficulties slowing down the procedure. First of all, the "RAM" minibatch had to be reduced from 128 to 64 positions, because otherwise training does not fit in GPU memory (which, by the way, is an RTX 2080 Ti with 11 GB). So it's a tricky exercise, but we are spending all our time on it at the moment. I hope we will be able to give good news in a few days and have a 20x256 network ready in three weeks.
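For context on why the jump from 12x256 to 20x256 strains an 11 GB GPU, here is a back-of-the-envelope sketch. It assumes an AlphaZero-style residual tower (two 3x3 convolutions per residual block) and counts only the convolution weights, ignoring batch norm, biases, and the policy/value heads; `tower_params` is a hypothetical helper for illustration, not part of the project's code.

```python
def tower_params(blocks: int, filters: int) -> int:
    """Rough weight count of a residual tower: each block has two
    3x3 convolutions with `filters` input and output channels."""
    return blocks * 2 * 3 * 3 * filters * filters

for b, f in [(12, 256), (20, 256)]:
    print(f"{b}x{f}: ~{tower_params(b, f) / 1e6:.1f}M conv weights")
```

The tower weights alone grow by roughly 2/3 (about 14M to about 24M), and activation memory during training grows proportionally with depth as well, which is consistent with having to halve the minibatch from 128 to 64 positions.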
Awesome. It would be great if we could find the money to buy one, but I don't see that happening for now...
Since 20*224 and 20*256 have all been tested by others, and 12*128 has stalled.