Replies: 1 comment
- In the future we are adding support for caching engines to speed up …
- I have successfully compiled a model, but I'm wondering how to cache it so that I can avoid the long compilation process in the future. E.g., right now I'm using:

  But, ideally, I'd save the result of torch.compile to disk so that I can reuse it in the future.
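For context, here is a minimal sketch of the kind of usage the question describes, assuming a plain torch.compile call with the default backend; the model, shapes, and names are placeholders, not the original poster's code:

```python
import torch
import torch.nn as nn

# Placeholder model standing in for whatever module is being compiled.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# torch.compile wraps the module in an optimized callable; the expensive
# compilation work happens lazily on the first call with real inputs.
compiled_model = torch.compile(model)

# The first call triggers compilation; later calls with the same input shapes
# reuse the compiled code for the lifetime of the process, but by default
# nothing is persisted that a fresh process could simply load from disk.
example_input = torch.randn(4, 128)
output = compiled_model(example_input)
print(output.shape)
```

Depending on the PyTorch version and backend, the default Inductor backend does keep an on-disk cache of some compiled artifacts (its location can be redirected with the TORCHINDUCTOR_CACHE_DIR environment variable), but that is a partial cache of generated kernels rather than a way to save and reload the compiled model itself, which is what the question is asking for.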