Issue in torchscript model inference #2129
Comments
@sourabhyadav we don't provide support for torchscript loading or inference, only export.
@glenn-jocher OK, I will raise it as a question to the community.
Hi @sourabhyadav, I have a custom implementation of loading and inference with torchscript; maybe you could check it here.
Did you fix it?
@zhiqwang if it's possible, could you please give a link to your implementation of loading and inference with torchscript (the current link is no longer available)? I need to speed up inference of my custom yolov5 model, but I'm new to CV and don't know how to implement it myself.
Hi @pugovka91, the notebook has a Python interface for loading and inference with torchscript, and you can check this if you want the C++ interface.
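For reference, a minimal, self-contained sketch of the Python loading-and-inference flow with TorchScript. `TinyNet` and the file name are hypothetical stand-ins for the exported YOLOv5 model; the real model comes from yolov5's export script:

```python
import torch

# Hypothetical stand-in for the exported YOLOv5 network.
class TinyNet(torch.nn.Module):
    def forward(self, x):
        return x * 2.0

# Export side: script the module and serialize it to disk.
torch.jit.script(TinyNet()).save("tiny.torchscript.pt")

# Inference side: load the serialized graph and run it.
model = torch.jit.load("tiny.torchscript.pt")
model.eval()
with torch.no_grad():
    out = model(torch.ones(2, 3))
print(out)  # every element doubled
```

The loaded object behaves like a regular `nn.Module` at call time, so the usual `eval()` + `no_grad()` inference pattern applies unchanged.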
Hello @zhiqwang, it's a bit off topic, but I wanted to ask: is it possible to run detection with the augment flag using the yolort model? Thanks a lot!
Hi @pugovka91, I'm not sure I understand you correctly. Did you mean Test-Time Augmentation (TTA)? If that's the feature you're concerned about, we don't have it implemented in yolort yet.
@zhiqwang yes, exactly! I'll be waiting for this feature to be implemented, thank you!
🐛 Bug
I am facing the below issue when I try to load a saved torchscript model:
To Reproduce (REQUIRED)
Model saving was done using the export.py file. Run:
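The exact command was not captured when this issue was archived; at the time, the TorchScript export was typically produced with something like the following (script path and flags are assumptions based on the yolov5 repo of that era, so check your checkout):

```shell
# Hypothetical invocation; verify the script's location and flags
# in your yolov5 checkout before running.
python models/export.py --weights yolov5s.pt --img 640 --batch 1
```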
Model loading is done like this:
Model loading seems fine, but the issue appears when we try to run inference on the model:
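The original inference snippet was not captured. As a self-contained sketch of the call conventions that usually matter here (eval mode, `no_grad`, a float32 NCHW input), the following uses a hypothetical `FakeDetector`; the tuple-shaped output and the `[0]` indexing are assumptions standing in for the real exported model:

```python
import torch

class FakeDetector(torch.nn.Module):
    # Hypothetical stand-in: the forward returns a tuple whose first
    # element is the prediction tensor (an assumption made for this sketch).
    def forward(self, x):
        return (x.mean(dim=(2, 3)), x)

# Produce a serialized model so the sketch is runnable end to end.
traced = torch.jit.trace(FakeDetector(), torch.zeros(1, 3, 640, 640))
traced.save("fake_detector.torchscript.pt")

# Load on CPU explicitly; a model exported on GPU can otherwise fail
# to load on a CPU-only machine.
model = torch.jit.load("fake_detector.torchscript.pt", map_location="cpu")
model.eval()

# TorchScript models expect a tensor, not a numpy array or PIL image:
# batched, channels-first, float32, matching the export resolution.
img = torch.zeros(1, 3, 640, 640, dtype=torch.float32)
with torch.no_grad():
    pred = model(img)[0]
print(pred.shape)
```

A frequent source of inference-time errors with a successfully loaded TorchScript file is a mismatch in input dtype, shape, or device rather than the load itself, so checking those three against the export settings is a reasonable first step.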
Output:
Environment
Am I missing something here? Please guide me.