
[DO NOT UNPIN] Upcoming ORT 1.20 Release Overview #22274

Open
sophies927 opened this issue Sep 30, 2024 · 9 comments

Comments

@sophies927
Contributor

We are planning to release ORT 1.20 at the end of October.

For more information (announcements, major features, etc.), check out the new ONNX Runtime Release Roadmap page on our website: https://onnxruntime.ai/roadmap!

Feel free to respond to this issue with any questions.

@sophies927 sophies927 pinned this issue Sep 30, 2024
@HectorSVC HectorSVC unpinned this issue Oct 1, 2024
@HectorSVC HectorSVC pinned this issue Oct 2, 2024
@FricoRico

FricoRico commented Oct 4, 2024

Is there already documentation I can read about implementing the following points:

  • Add Android QNN support, including a pre-built package, performance improvements, and Phi-3 model support.
  • Add GPU EP support for ORT Mobile.

@sophies927
Contributor Author

> Is there already documentation I can read about implementing the following points:
>
> • Add Android QNN support, including a pre-built package, performance improvements, and Phi-3 model support.
> • Add GPU EP support for ORT Mobile.

@edgchen1 @skottmckay

@edgchen1
Contributor

edgchen1 commented Oct 7, 2024

> Is there already documentation I can read about implementing the following points:
>
> • Add Android QNN support, including a pre-built package, performance improvements, and Phi-3 model support.
> • Add GPU EP support for ORT Mobile.

There is existing documentation about the QNN EP. It's not Android-specific.
https://onnxruntime.ai/docs/execution-providers/QNN-ExecutionProvider.html
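For context, the linked docs show that the QNN EP is requested at session creation time via the `providers` argument. A minimal, hypothetical Python sketch (the model path and `backend_path` value are placeholders; actually running on QNN requires a supported Qualcomm device and backend library):

```python
# Hypothetical sketch of requesting the QNN EP, per the docs linked above.
# "model.onnx" and the backend_path value are placeholders; listing the CPU
# EP last lets ORT fall back to it where QNN is unavailable.
providers = [
    ("QNNExecutionProvider", {"backend_path": "libQnnHtp.so"}),
    "CPUExecutionProvider",
]

def make_session(model_path: str):
    # Import deferred so the provider list above can be inspected even
    # without onnxruntime installed.
    import onnxruntime as ort
    return ort.InferenceSession(model_path, providers=providers)
```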

@einarwar

einarwar commented Oct 9, 2024

Is there any plan for when you will be supporting python 3.13?

@Djdefrag

Djdefrag commented Oct 9, 2024

Please, is this problem #20713 expected to be resolved?

We have been waiting for 6 months...

@tianlinzx

Please, is this problem #18942 expected to be resolved?

We have been waiting for 1 year...

@sophies927
Contributor Author

> Is there any plan for when you will be supporting python 3.13?

Yes, Python 3.13 support will be included in the upcoming ORT 1.20 release.

@FricoRico

> > Is there already documentation I can read about implementing the following points:
> >
> > • Add Android QNN support, including a pre-built package, performance improvements, and Phi-3 model support.
> > • Add GPU EP support for ORT Mobile.
>
> There is existing documentation about the QNN EP. It's not Android-specific. https://onnxruntime.ai/docs/execution-providers/QNN-ExecutionProvider.html

What about the GPU EP for mobile?

@snnn
Member

snnn commented Oct 19, 2024

> Is there any plan for when you will be supporting python 3.13?

Yes. Our release RC packages are not ready yet, but you can now install 3.13 packages from our nightly feed using the following commands:

```
python -m pip install coloredlogs flatbuffers numpy packaging protobuf sympy
python -m pip install -i https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/ORT-Nightly/pypi/simple/ --pre onnxruntime
```
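After installing from the nightly feed, a quick sanity check (not from the thread, just a hypothetical verification step) is to print the interpreter version and, if present, the installed onnxruntime version:

```python
# Hypothetical post-install check: confirm the Python version and report
# the onnxruntime version if the package is importable.
import sys

py = f"{sys.version_info.major}.{sys.version_info.minor}"
print("Python", py)

try:
    import onnxruntime
    print("onnxruntime", onnxruntime.__version__)
except ImportError:
    print("onnxruntime is not installed")
```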
