
Refactor ONNXRuntime Python interface #176

Merged

zhiqwang merged 8 commits into master from ort-python-inference on Oct 3, 2021

Conversation

zhiqwang (Owner) commented Sep 27, 2021

We introduce the yolort.runtime interface to accelerate deployment of applications on the Python frontend. For example, you can define the runtime and run inference with ONNXRuntime as follows:

from yolort.runtime import PredictorORT

# Load the exported ONNX model into an ONNXRuntime-backed predictor
detector = PredictorORT("best.onnx")

# Run inference on a single image; returns scores, class ids, and boxes
img_path = "bus.jpg"
scores, class_ids, boxes = detector.run_on_image(img_path)
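
The returned detections can then be consumed directly. A minimal usage sketch, assuming scores, class_ids, and boxes are index-aligned arrays with boxes in (x1, y1, x2, y2) pixel coordinates (the exact shapes and dtypes returned by run_on_image are an assumption here):

# Hypothetical follow-up: print each detection; the output layout is assumed.
for score, class_id, box in zip(scores, class_ids, boxes):
    x1, y1, x2, y2 = box  # assumed corner-format coordinates
    print(f"class {class_id}: score {score:.3f}, box ({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")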

zhiqwang added the deployment (Inference acceleration for production) and enhancement (New feature or request) labels on Sep 27, 2021
codecov bot commented Sep 27, 2021

Codecov Report

Merging #176 (39efaa5) into master (f0746c6) will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #176   +/-   ##
=======================================
  Coverage   96.48%   96.48%           
=======================================
  Files          10       10           
  Lines         598      598           
=======================================
  Hits          577      577           
  Misses         21       21           
Flag        Coverage Δ
unittests   96.48% <ø> (ø)

Flags with carried forward coverage won't be shown.


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

zhiqwang force-pushed the ort-python-inference branch 2 times, most recently from ead0a5e to f0f15ef, on September 27, 2021 10:00
itsnine (Contributor) commented Sep 27, 2021

@zhiqwang Hi, do you want to implement this feature yourself, or have you already started writing it? I can take it on if nobody has started yet :)

zhiqwang (Owner, Author) commented Sep 27, 2021

Hi @itsnine, I haven't started this work yet; so far I've only been fixing some docstrings. This feature is very welcome, so go ahead and feel free to start it in a new PR. I'll keep this PR for the docstring fixes, and we can move forward with your PR first.

My current plan is to put the core code for ORT inference into the yolort/runtime directory. We could name the Python script ort_modeling.py (feel free to suggest a better name); we plan to add more Python inference backends to this directory in the future. We would then add a CLI tool, e.g. run_model.py, to the tools directory for common use cases, roughly along the lines of the sketch below.
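
A minimal sketch of what such a tools/run_model.py could look like, assuming only the PredictorORT API from the PR description; the argument names and output format are illustrative, not a final design:

# tools/run_model.py -- hypothetical CLI sketch, not the final implementation.
# Only the PredictorORT interface shown above is assumed to exist.
import argparse

from yolort.runtime import PredictorORT


def main():
    parser = argparse.ArgumentParser(description="Run yolort inference with ONNXRuntime")
    parser.add_argument("--onnx-path", required=True, help="path to the exported ONNX model")
    parser.add_argument("--image-path", required=True, help="path to the input image")
    args = parser.parse_args()

    # Build the ONNXRuntime-backed predictor and run it on the image
    detector = PredictorORT(args.onnx_path)
    scores, class_ids, boxes = detector.run_on_image(args.image_path)

    for score, class_id, box in zip(scores, class_ids, boxes):
        print(f"label: {class_id}, score: {score:.3f}, box: {box}")


if __name__ == "__main__":
    main()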

zhiqwang marked this pull request as draft on September 27, 2021 17:10
zhiqwang changed the title from "Add ONNXRuntime Python interface" to "Refactor ONNXRuntime Python interface" on Sep 28, 2021
zhiqwang force-pushed the ort-python-inference branch 8 times, most recently from a213a99 to f9ab3fa, on September 30, 2021 05:45
@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers has signed the CLA.

✅ zhiqwang
❌ pre-commit-ci[bot]

zhiqwang marked this pull request as ready for review on October 3, 2021 05:19
zhiqwang merged commit bdbef45 into master on Oct 3, 2021
zhiqwang deleted the ort-python-inference branch on October 3, 2021 05:22
zhiqwang added the API (Library use interface) label on Oct 14, 2021
Labels
API (Library use interface), deployment (Inference acceleration for production), enhancement (New feature or request)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

3 participants