Can shaders be cached to improve load-time performance? #45
Comments
After doing some further reading, I suspect the WebGPU TF.js backend will likely fix this problem eventually. With that being the case, there is little motivation to improve the WebGL backend, especially in this way. Closing.
Reopening this issue, as there was a great discussion and some new work being done for the WebGL backend over at tensorflow/tfjs#5205. I'm not sure whether the release has dropped yet, but this is probably a good enough reason to at least evaluate upgrading TF.js!
This is now released over at https://github.com/tensorflow/tfjs/releases/tag/tfjs-v3.8.0, so this is unblocked!
Tried this out in #131, but unfortunately load-plus-first-inference performance regressed significantly.
I'm in discussion with the TF.js team over at tensorflow/tfjs#5205 about this, and it appears the regression is not likely to be related to the incomplete uniforms work. I am going to do some bisection work on branches and see if I can narrow down what is causing the issue; I'll need to know this if I'm going to upgrade at some point anyway. As a starting point, I took three "reload" runs using the "wallMs" statistic on the first inference, where shader compilation is performed.
Where this landed is that the one flag's new default does indeed represent a regression for some users, so I'd have to turn that flag on explicitly. On the plus side, the TF.js team also recommended enabling another flag, and that shaves something like a second off load times, which is great!
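For reference, TF.js backend flags are toggled at startup through `tf.env().set(...)`, before the backend compiles any shaders. The sketch below shows the pattern only; the flag name used is an assumption based on the shapes-as-uniforms work discussed in tensorflow/tfjs#5205, and the second flag mentioned above is not named in this thread, so check the TF.js release notes for the exact names.

```javascript
// Sketch only: the flag name below is an assumption, not confirmed by
// this thread. Flags must be set before any model load or inference so
// the WebGL backend sees them when it compiles its shader programs.
import * as tf from '@tensorflow/tfjs';

// Opt in to passing tensor shapes as uniforms (the tfjs#5205 work),
// which reduces the number of distinct shaders that need compiling.
tf.env().set('WEBGL_USE_SHAPES_UNIFORMS', true);

await tf.setBackend('webgl');
await tf.ready();
```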
It appears that Firefox does not yet cache WebGL shaders, see the bug here.
In TensorFlow.js, the majority of the time on first load/predict appears to be spent blocking on upload.
Here is the timing info for the first prediction (note that `tf.loadGraphModel` itself is speedy):

```
TIMING LOADING: {"uploadWaitMs":4634,"downloadWaitMs":0,"kernelMs":3429,"wallMs":11169}
```
And here is a subsequent normal inference call:

```
TIMING NORMAL: {"uploadWaitMs":0,"downloadWaitMs":3,"kernelMs":32,"wallMs":117}
```
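To make the contrast concrete, here is a small standalone calculation over the two timing samples above (the object shape matches what TF.js timing utilities report; the breakdown itself is just arithmetic on the quoted numbers):

```javascript
// The two timing samples quoted above: upload wait, download wait,
// kernel time, and wall-clock time, all in milliseconds.
const firstRun = { uploadWaitMs: 4634, downloadWaitMs: 0, kernelMs: 3429, wallMs: 11169 };
const warmRun  = { uploadWaitMs: 0,    downloadWaitMs: 3, kernelMs: 32,   wallMs: 117 };

// One-time cost paid only on the first inference (upload plus shader
// compilation plus other one-off setup): the wall-clock difference.
const oneTimeOverheadMs = firstRun.wallMs - warmRun.wallMs;
console.log(oneTimeOverheadMs); // 11052

// Upload wait alone accounts for a large share of that overhead.
const uploadShare = firstRun.uploadWaitMs / oneTimeOverheadMs;
console.log(uploadShare.toFixed(2)); // "0.42"
```

This is why caching compiled shaders across page loads is attractive: almost all of the 11-second first call is one-time setup, not kernel work.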
I will comment on the Firefox bug and see if it gets anywhere.
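For readers landing on this issue: the underlying problem is that a WebGL backend's compiled-program cache lives in memory, so repeated inferences within a page are fast but every page reload starts cold. A minimal sketch of that kind of memoized cache is below; the names and structure are illustrative, not TF.js internals, and `compile` stands in for the expensive WebGL compile-and-link step.

```javascript
// Illustrative in-memory program cache, keyed by shader source.
// Within one page lifetime, each distinct shader compiles once;
// a reload discards the Map, which is why persistent (browser-level
// or serialized) shader caching matters for load time.
function makeProgramCache(compile) {
  const cache = new Map();
  let compiles = 0;
  return {
    get(source) {
      if (!cache.has(source)) {
        compiles++;
        cache.set(source, compile(source));
      }
      return cache.get(source);
    },
    get compileCount() { return compiles; },
  };
}

const cache = makeProgramCache((src) => ({ program: `compiled:${src}` }));
cache.get('matmul_f32'); // compiles
cache.get('matmul_f32'); // cache hit, no compile
cache.get('conv2d_f32'); // compiles
console.log(cache.compileCount); // 2
```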