[Documentation] Execution provider strings #22101
Comments
Can you add this link to the Python docs?

I'm not sure who is in charge of Python documentation, but maybe @natke would know, or maybe @scottmckay would know from the CoreML angle.
https://onnxruntime.ai/docs/api/python/api_summary.html#onnxruntime.get_all_providers
https://onnxruntime.ai/docs/api/python/api_summary.html#onnxruntime.get_available_providers
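The two APIs linked above can be used to discover the provider strings at runtime, which sidesteps the documentation gap. A minimal sketch, guarded with a try/except so it also runs where onnxruntime is not installed:

```python
# Hedged sketch of the two linked onnxruntime APIs. Assumes onnxruntime
# is installed; the import guard lets the snippet run either way.
try:
    import onnxruntime as ort
except ImportError:
    ort = None

if ort is not None:
    # Every execution provider this build of onnxruntime was compiled with.
    print(ort.get_all_providers())
    # The subset actually usable on this machine; this always includes
    # "CPUExecutionProvider" as the fallback.
    print(ort.get_available_providers())
```

Printing `get_available_providers()` is the quickest way to see the exact string spellings a given build accepts.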
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
Describe the documentation issue
It's not clear what the execution provider strings should be in Python.
E.g. I want to enable DirectML or CoreML. I can see them listed here: https://onnxruntime.ai/docs/execution-providers/
But I don't see any mention of the exact string value that should be passed, e.g.
CUDAExecutionProvider
Page / URL
https://onnxruntime.ai/docs/api/python/api_summary.html#load-and-run-a-model
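For context, the provider strings are passed to `InferenceSession` via its `providers` argument, in priority order. A hedged sketch of the pattern the issue is asking to have documented ("CUDAExecutionProvider" is the string named in the issue; `model.onnx` is a placeholder path, and the helper function is hypothetical, not part of the onnxruntime API):

```python
# Hedged sketch: selecting and passing execution-provider strings.
# The import guard lets this run where onnxruntime is not installed.
try:
    import onnxruntime as ort
except ImportError:
    ort = None


def select_providers(preferred, available):
    """Hypothetical helper: keep the preferred provider strings that
    this build actually supports, preserving the preference order."""
    return [p for p in preferred if p in available]


if ort is not None:
    available = ort.get_available_providers()
    providers = select_providers(
        ["CUDAExecutionProvider", "CPUExecutionProvider"], available)
    # Placeholder model path; providers is a list of plain strings.
    # sess = ort.InferenceSession("model.onnx", providers=providers)
```

Because unavailable providers raise an error in recent onnxruntime releases, filtering against `get_available_providers()` before constructing the session is a common defensive pattern.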