Refactor ONNXRuntime Python interface #176
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master     #176   +/-   ##
=======================================
  Coverage   96.48%   96.48%
=======================================
  Files          10       10
  Lines         598      598
=======================================
  Hits          577      577
  Misses         21       21
Flags with carried forward coverage won't be shown. Continue to review the full report at Codecov.
Force-pushed from ead0a5e to f0f15ef.
@zhiqwang hi, do you want to implement that feature, or have you already started writing it? I can take it on if nobody has started it yet :)
Hi @itsnine, I haven't started this work yet; so far I've only worked on fixing some docstrings. This feature is welcome, so go ahead and feel free to start the job with a new PR. I'll hold this PR for the docstring fixes, and we can move forward with your PR first. My current plan is to put the core code for ORT inference into the directory
Force-pushed from a213a99 to f9ab3fa.
Force-pushed from 1d4bdac to 4a7a094.
for more information, see https://pre-commit.ci
We introduce the yolort.runtime interface to accelerate applications on the Python frontend. For example, you can define the runtime and run inference with ONNXRuntime as follows.
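The snippet below is a minimal sketch of that workflow. It assumes yolort.runtime exposes a PredictorORT-style predictor with a predict method; the class name, ONNX model path, and image path are illustrative, so check the yolort documentation for the exact API.

```python
# A minimal sketch of ONNXRuntime inference through yolort.runtime.
# Assumption: the module exposes a PredictorORT-style predictor with a
# `predict` method; the model and image paths below are placeholders.
from yolort.runtime import PredictorORT

# Path to a YOLOv5 model that has been exported to ONNX beforehand.
engine_path = "yolov5s.onnx"

# Build the runtime wrapper around an ONNXRuntime inference session.
y_runtime = PredictorORT(engine_path, device="cpu")

# Run inference on an image file; the result contains the detections
# (boxes, scores, and labels) for that image.
predictions = y_runtime.predict("bus.jpg")
print(predictions)
```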