
md\07.Labs\Csharp\src\LabsPhi301 fails #84

Open
IntranetFactory opened this issue Jul 6, 2024 · 14 comments
Labels
enhancement (New feature or request), question (Further information is requested)

Comments

@IntranetFactory

I'm trying md\07.Labs\Csharp\src\LabsPhi301 on a new Copilot+ laptop. I adjusted modelPath to point to the correct folder.

When I run the lab I get:
Unable to load DLL 'onnxruntime-genai' or one of its dependencies: The specified module could not be found. (0x8007007E)'

After the first failure I updated all NuGet packages, but I still get the same result.

Should that sample work on a Copilot+ laptop?

@leestott
Contributor

leestott commented Jul 9, 2024

Hi @IntranetFactory, please ensure you have the latest drivers installed for your NPU.

Resolution

  1. Update your NPU drivers; see https://github.com/intel/intel-npu-acceleration-library/tree/main
  2. We have validated that there are no problems in the example; please also refer to https://github.com/intel/intel-npu-acceleration-library/blob/main/examples/phi-3.py as another validation check

@IntranetFactory
Author

I'm using a Copilot+ PC with a Snapdragon X CPU - will the intel-npu-acceleration-library work in that case?

@leestott
Contributor

leestott commented Jul 9, 2024

@IntranetFactory Thanks for confirming that you are running a Qualcomm device. See https://learn.microsoft.com/en-us/windows/ai/npu-devices/ for the latest drivers and ONNX Runtime support info.

@leestott
Contributor

leestott commented Jul 9, 2024

Qualcomm Snapdragon X: Currently, developers should target the Qualcomm QNN Execution Provider (EP), which uses the Qualcomm AI Engine Direct SDK (QNN). Pre-built packages with QNN support are available to download. This is the same stack currently used by the Windows Copilot Runtime and experiences on Copilot+ PC Qualcomm devices.
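As a sketch of what "targeting the QNN EP" means in code: with ONNX Runtime you pass a provider priority list when creating an inference session, and the runtime falls back down the list. The helper below only builds that list; the provider names are the standard ONNX Runtime ones, and the availability list is passed in (rather than queried via `onnxruntime.get_available_providers()`) so the sketch stays self-contained:

```python
# Sketch: build an execution-provider priority list for a Copilot+ PC.
# "QNNExecutionProvider" is the ONNX Runtime name for the Qualcomm QNN EP;
# the available-provider list is an input here, not queried from onnxruntime.
def provider_priority(available):
    preferred = ["QNNExecutionProvider", "CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]  # always fall back to CPU

print(provider_priority(["QNNExecutionProvider", "CPUExecutionProvider"]))
# → ['QNNExecutionProvider', 'CPUExecutionProvider']
```

With the `Microsoft.ML.OnnxRuntime.QNN` package (or the Python `onnxruntime-qnn` build) installed, the resulting list would be passed as the `providers` argument when constructing the session.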

@IntranetFactory
Author

I'm sorry, this is the first time I've used QNN - what does "target the Qualcomm QNN EP" mean? Do I just need to install that provider, or do I also need to modify the cookbook (e.g. change the CPU target, install NuGet packages)?

@IntranetFactory
Author

I installed the Microsoft.ML.OnnxRuntime.QNN NuGet package. When I select "Any CPU" I get "System.DllNotFoundException: 'Unable to load DLL 'onnxruntime-genai' or one of its dependencies: The specified module could not be found. (0x8007007E)'" at var model = new Model(modelPath);

When I select the ARM64 architecture I get build errors:

[screenshot: ARM64 build errors]
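A common cause of this pattern is that NuGet only restores native binaries for the runtime identifier being built, so "Any CPU" can leave the output folder without any matching onnxruntime natives. A hedged sketch of pinning the RID in an SDK-style .csproj follows; note that, as confirmed later in this thread, the GenAI package did not ship win-arm64 natives at the time, so on a Snapdragon device this only helps for the x64-emulated case:

```xml
<!-- Hypothetical .csproj fragment: pin a runtime identifier so the matching
     native onnxruntime / onnxruntime-genai binaries are copied to the build
     output. win-arm64 would be the Snapdragon target, but the GenAI package
     did not ship ARM64 natives when this thread was written. -->
<PropertyGroup>
  <RuntimeIdentifier>win-x64</RuntimeIdentifier>
</PropertyGroup>
```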

@leestott
Contributor

leestott commented Jul 10, 2024

@IntranetFactory

You are experiencing a DllNotFoundException when trying to use the Microsoft.ML.OnnxRuntime.QNN NuGet package with the “Any CPU” configuration. This error typically occurs when the DLL ‘onnxruntime-genai’ or one of its dependencies is not found by the system.

To resolve this issue, you might want to ensure that:

- The NuGet package is properly installed and all required dependencies are included.
- The project is configured to copy the native dependencies to the output directory.
- The native dependencies are compatible with the architecture you're targeting.

If you're still facing issues, you might consider opening an issue on the ONNX Runtime repo asking them to ensure compatibility of the native dependencies with Snapdragon.

Additionally, check the documentation for the Microsoft.ML.OnnxRuntime.QNN package for any specific installation or configuration instructions, and see whether anyone else has encountered a build error when selecting the ARM64 architecture after installing the same package.

I would suggest reaching out to the maintainers of the Microsoft.ML.OnnxRuntime.QNN package or seeking support from the community, as they might have encountered and resolved similar issues.

@kinfey
Contributor

kinfey commented Jul 10, 2024

@IntranetFactory Thank you for your question. The current ONNX Runtime for Generative AI example is based on the x86 framework; ARM64 support is planned for the future. You can follow the roadmap in the GitHub repo: https://github.com/microsoft/onnxruntime-genai. If you are using a Copilot+ PC with ARM64, it is recommended that you use Phi Silica: https://learn.microsoft.com/en-us/windows/ai/apis/phi-silica

@IntranetFactory
Author

I would love to try Phi Silica - but it seems that it's also not available. https://learn.microsoft.com/en-us/windows/apps/windows-app-sdk/experimental-channel says: "Phi Silica and OCR APIs are not included in this release. These will be coming in a future 1.6 release." Or is there any other way to get it?

@IntranetFactory
Author

Are only the C# samples not working on ARM64? That is, should Python work, or does Phi-3 currently not work on ARM64 at all?

@nmetulev
Member

The GenAI NuGet packages don't support Arm64 currently; there is an issue tracking this here: microsoft/onnxruntime-genai#637.

Adding @natke for awareness

@leestott leestott added question Further information is requested enhancement New feature or request labels Jul 19, 2024
@kinfey
Contributor

kinfey commented Jul 19, 2024

@IntranetFactory
Author

@kinfey I tried that already, which causes build errors #84 (comment)

@meliolabsadmin

> I would love to try phi-scilica - but it seems that it's also not available https://learn.microsoft.com/en-us/windows/apps/windows-app-sdk/experimental-channel Phi Silica and OCR APIs are not included in this release. These will be coming in a future 1.6 release. or is there any other way to get that?

Does anyone know more on this?

5 participants