transformer results #2
An implementation of the Transformer can be found at https://github.com/ruotianluo/self-critical.pytorch, which achieves a CIDEr score of ~128.
Hi @Panda-Peter. Thanks for your reply. I am actually following his code, but according to the results there, it achieves 1.266 with standard self-critical training. It reaches 1.295 only with the new self-critical variant he proposed, which you do not use. So his reported score for the Transformer is 1.266.
We also implemented the Transformer baseline based on that code. However, we found that the default hyper-parameters are not optimal. If you tune them, you can obtain ~1.283 CIDEr.
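For context on the two self-critical variants mentioned above, here is a minimal sketch of the reward-baseline computation. The function name `scst_advantages` and the toy reward values are illustrative, not taken from the repository; the classic SCST baseline subtracts the greedy-decode reward, while the newer variant (as I understand it) uses the mean reward of the other sampled captions as the baseline.

```python
def scst_advantages(sample_rewards, greedy_reward=None):
    """Return per-sample advantages for the policy-gradient loss.

    Classic SCST: advantage = r(sample) - r(greedy decode).
    Mean-baseline variant (assumed here): advantage =
        r(sample) - mean reward of the *other* samples.
    """
    if greedy_reward is not None:
        # Classic self-critical: greedy decode is the baseline.
        return [r - greedy_reward for r in sample_rewards]
    n = len(sample_rewards)
    total = sum(sample_rewards)
    # Leave-one-out mean of the remaining samples as the baseline.
    return [r - (total - r) / (n - 1) for r in sample_rewards]


# Toy CIDEr-like rewards for two sampled captions vs. a greedy baseline of 1.0.
classic = scst_advantages([1.5, 0.5], greedy_reward=1.0)
# Three samples, each baselined against the mean of the other two.
mean_based = scst_advantages([1.0, 2.0, 3.0])
```

The advantage is then multiplied by the negative log-probability of each sampled caption to form the loss; captions scoring above the baseline are reinforced, the rest are suppressed.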
Hello. Thanks for your work and for sharing the code.
Could you please share the details of the pure Transformer model you implemented that achieves 128.3 CIDEr? To the best of my knowledge, all implementations reach a maximum of around 126.6, according to the papers that use the Transformer model. Your paper does not provide details on the Transformer, and there is no supplementary material. So may I kindly ask for the details of your re-implementation of the pure Transformer that achieves 128.3?