DirectMLNpuInference fails to run on the ARM64 NPU #640
Comments
Also, could you add a build script (such as a CMakeLists.txt) to the sample project?
Based on this blog post, DirectML needs to be version 1.15.2 and onnxruntime needs to be 1.18, but the native demo uses onnxruntime 1.17. Its packages.config:
<?xml version="1.0" encoding="utf-8"?>
<packages>
<package id="Microsoft.AI.DirectML" version="1.15.2" targetFramework="native" />
<package id="Microsoft.AI.MachineLearning" version="1.17.0" targetFramework="native" />
<package id="Microsoft.Windows.ImplementationLibrary" version="1.0.220914.1" targetFramework="native" />
</packages>
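If the version mismatch described above is the problem, one possible fix is bumping the onnxruntime-backed package to the 1.18 line. The version number below is illustrative only (I have not verified that this exact package version is published on NuGet; check there before pinning):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- DirectML pinned to 1.15.2, as the blog post requires -->
  <package id="Microsoft.AI.DirectML" version="1.15.2" targetFramework="native" />
  <!-- Illustrative bump to the 1.18 line; confirm the exact published version on NuGet -->
  <package id="Microsoft.AI.MachineLearning" version="1.18.0" targetFramework="native" />
  <package id="Microsoft.Windows.ImplementationLibrary" version="1.0.220914.1" targetFramework="native" />
</packages>
```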
I'm seeing a similar issue on an ASUS X Elite. I updated the drivers based on the blog post; WebNN reports that the driver is not installed.
Based on the DirectMLNpuInference sample and #625 (updating to the Windows 11 SDK, 10.0.26100.0), I got DirectML NPU inference working well on an Intel Lunar Lake client platform: Windows Task Manager shows non-zero NPU usage while the program runs.
However, when I try this sample on my Windows ARM64 machine, the program reports "No NPU device found".
Does anyone know the reason?
Does DirectML not support NPUs on the ARM64 platform?
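For context on where that error comes from, below is a hedged C++ sketch of the DXCore adapter enumeration this kind of sample typically performs (I am assuming the usual MCDM pattern of selecting compute-only adapters; the helper name `FindComputeOnlyAdapter` is mine, not from the sample). If this loop finds no adapter, the sample has nothing to create a DirectML device on, which would surface as "No NPU device found"; that usually means the NPU driver does not expose a compute-only adapter to DXCore on that platform/driver combination.

```cpp
// Sketch: enumerate DXCore adapters and pick a compute-only (NPU-style) one.
// Error handling is trimmed for brevity.
#include <dxcore.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool FindComputeOnlyAdapter(ComPtr<IDXCoreAdapter>& outAdapter) {
    ComPtr<IDXCoreAdapterFactory> factory;
    if (FAILED(DXCoreCreateAdapterFactory(IID_PPV_ARGS(&factory)))) return false;

    // Filter: adapters that support D3D12 core compute (MCDM devices qualify).
    const GUID attrs[] = { DXCORE_ADAPTER_ATTRIBUTE_D3D12_CORE_COMPUTE };
    ComPtr<IDXCoreAdapterList> list;
    if (FAILED(factory->CreateAdapterList(1, attrs, IID_PPV_ARGS(&list)))) return false;

    for (uint32_t i = 0; i < list->GetAdapterCount(); ++i) {
        ComPtr<IDXCoreAdapter> adapter;
        if (FAILED(list->GetAdapter(i, IID_PPV_ARGS(&adapter)))) continue;

        // An NPU is compute-only: skip adapters that also support graphics
        // (those are GPUs).
        if (adapter->IsAttributeSupported(DXCORE_ADAPTER_ATTRIBUTE_D3D12_GRAPHICS))
            continue;

        bool isHardware = false;
        adapter->GetProperty(DXCoreAdapterProperty::IsHardware, &isHardware);
        if (!isHardware) continue;  // skip software/WARP-style adapters

        outAdapter = adapter;
        return true;
    }
    return false;  // nothing found -> the sample reports "No NPU device found"
}
```

Running a sketch like this on the ARM64 machine would at least show whether DXCore sees any compute-only adapter at all, which separates a driver/enumeration problem from a DirectML support problem.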
My device info:
CPU: Snapdragon(R) X 12-core X1E80100 @ 3.40 GHz
GPU: Snapdragon(R) X Elite - X1E80100 - Qualcomm(R) Adreno(TM) GPU
NPU: Snapdragon(R) X Elite - X1E80100 - Qualcomm(R) Hexagon(TM) NPU
My test code: