md\07.Labs\Csharp\src\LabsPhi301 fails #84
Comments
Hi @IntranetFactory, please ensure you have the latest drivers installed for your NPU.
I'm using a Copilot+ PC with a Snapdragon X CPU - will the intel-npu-library work in that case?
@IntranetFactory Thanks for confirming that you're running a Qualcomm device. See https://learn.microsoft.com/en-us/windows/ai/npu-devices/ for the latest drivers and ONNX Runtime support info.
Qualcomm Snapdragon X: Currently, developers should target the Qualcomm QNN Execution Provider (EP), which uses the Qualcomm AI Engine Direct SDK (QNN). Pre-built packages with QNN support are available to download. This is the same stack currently used by the Windows Copilot Runtime and experiences on Copilot+ PC Qualcomm devices.
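For readers unsure what "targeting the QNN EP" means in code, here is a minimal C# sketch using the base Microsoft.ML.OnnxRuntime API. It assumes the Microsoft.ML.OnnxRuntime.QNN package is installed and the app is built for ARM64; the `backend_path` key follows the ONNX Runtime QNN EP documentation, and `"model.onnx"` is a placeholder path. Note this shows only the base ONNX Runtime path - the GenAI layer (`onnxruntime-genai`) discussed below is a separate package.

```csharp
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;

// Sketch: create an inference session that prefers the Qualcomm QNN EP.
var sessionOptions = new SessionOptions();
sessionOptions.AppendExecutionProvider("QNN", new Dictionary<string, string>
{
    // "QnnHtp.dll" targets the NPU (HTP backend); "QnnCpu.dll" is the CPU fallback.
    { "backend_path", "QnnHtp.dll" }
});

// Placeholder path - point this at your exported ONNX model.
using var session = new InferenceSession("model.onnx", sessionOptions);
```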
I'm sorry, this is the first time I've used QNN - what does "target the Qualcomm QNN EP" mean? Do I just need to install that provider, or do I also need to modify the cookbook (e.g. change the CPU target, install NuGet packages)?
I installed the Microsoft.ML.OnnxRuntime.QNN NuGet package. When I select "Any CPU" I get "System.DllNotFoundException: 'Unable to load DLL 'onnxruntime-genai' or one of its dependencies: The specified module could not be found. (0x8007007E)'". When I select the ARM64 architecture I get build errors.
You are experiencing a DllNotFoundException when using the Microsoft.ML.OnnxRuntime.QNN NuGet package with the "Any CPU" configuration. This error typically occurs when the DLL 'onnxruntime-genai' or one of its dependencies cannot be found by the system. To resolve this, ensure that the NuGet package is properly installed and that all required native dependencies are copied to the output directory. Checking the documentation for the Microsoft.ML.OnnxRuntime.QNN package for any platform-specific installation or configuration instructions may also help, as may searching for other reports of build errors when selecting the ARM64 architecture with the same package. I would suggest reaching out to the maintainers of the Microsoft.ML.OnnxRuntime.QNN package or seeking support from the community, as others may have encountered and resolved similar issues.
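One thing worth checking for the ARM64 build errors (a hedged sketch, not a confirmed fix for this issue) is that the project file explicitly declares an ARM64 platform and runtime identifier, so NuGet restores the matching native binaries instead of x64 ones. The property names below are standard .NET SDK/MSBuild settings; the `net8.0` target framework is an assumption, so match it to whatever the lab project actually targets.

```xml
<!-- Hypothetical ARM64-only project configuration (standard .NET SDK properties).
     Whether this resolves the reported build errors is not confirmed. -->
<PropertyGroup>
  <TargetFramework>net8.0</TargetFramework>
  <Platforms>ARM64</Platforms>
  <RuntimeIdentifier>win-arm64</RuntimeIdentifier>
</PropertyGroup>
```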
@IntranetFactory Thank you for your question. The current ONNX Runtime for Generative AI example is based on the x86 framework; ARM64 support is planned for the future. You can follow the roadmap in the GitHub repo: https://github.com/microsoft/onnxruntime-genai. If you are using a Copilot+ PC with ARM64, it is recommended that you use Phi-Silica: https://learn.microsoft.com/en-us/windows/ai/apis/phi-silica
I would love to try Phi-Silica - but it seems that it's also not available yet: https://learn.microsoft.com/en-us/windows/apps/windows-app-sdk/experimental-channel
Is it just the C# samples that don't work on ARM64? Should Python work, or does Phi-3 currently not work on ARM64 at all?
The GenAI NuGets don't support ARM64 currently; there is an issue tracking this here: microsoft/onnxruntime-genai#637. Adding @natke for awareness.
Please use this package: https://www.nuget.org/packages/Microsoft.ML.OnnxRuntime.QNN
@kinfey I tried that already, which causes the build errors described in #84 (comment)
Does anyone know more about this?
I'm trying md\07.Labs\Csharp\src\LabsPhi301 on a new Copilot+ laptop. I adjusted modelPath to point to the correct folder.
When I run the lab I get:
Unable to load DLL 'onnxruntime-genai' or one of its dependencies: The specified module could not be found. (0x8007007E)'
After the first failure I updated all NuGet packages, but still get the same result.
Should that sample work on a Copilot+ laptop?