DirectMLNpuInference fails to run on the Intel NPU #625
Comments
Hi, I can run this code on an Intel Ultra 7 155U.
Thanks for sharing your experience. I will try updating my OS to the Dev channel. Thank you.
Updating to the Windows 11 SDK (10.0.26100.0) would work.
@Lucashien HW: ThinkPad X1 Carbon Gen 12, Intel(R) Core(TM) Ultra 7 155U
I'm on the older Intel NPU that is present in the Surface Laptop Studio 2; I believe it's a Movidius 3700VC. (Its PCI hardware id is …) Although I was able to force this example to use that device, simply by adjusting this call: THROW_IF_FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_1_0_CORE, IID_PPV_ARGS(&d3dDevice))); it still doesn't run successfully. I've added code to enable the D3D debug layer, and with that in place, I see this:
Initially I was on v31.0.100.2016 of the NPU driver, which is what Windows Update installs. I found that the Intel NPU driver page lists newer versions, but the latest (32.0.100.2820) doesn't actually support this device. However, 32.0.100.2408 does support it, and I've been able to install that. (Apparently there is a package on Windows Update that includes this version, but I couldn't work out how to get Windows to offer it to me.) But I still get the same error. So I think there are two issues here: (1) the sample's adapter-selection logic never picks the NPU, and (2) even when the NPU is selected, the sample still fails to run on it.
I think issue 1 comes down to this line:

else if (forceComputeOnlyDevice && currentGpuAdapter->IsAttributeSupported(DXCORE_ADAPTER_ATTRIBUTE_D3D12_CORE_COMPUTE))

That won't select a compute-only device; it will select any device that offers compute. On my laptop, every device offers compute (Intel(R) Iris(R) Xe Graphics, NVIDIA GeForce RTX 4060 Laptop GPU, Intel(R) NPU, and even the Microsoft Basic Render Driver software device). I think it should probably be this:

else if (forceComputeOnlyDevice && currentGpuAdapter->IsAttributeSupported(DXCORE_ADAPTER_ATTRIBUTE_D3D12_CORE_COMPUTE)
    && !currentGpuAdapter->IsAttributeSupported(DXCORE_ADAPTER_ATTRIBUTE_D3D12_GRAPHICS))

This matches only if the device supports compute and does not support graphics, which is what I'd expect "compute only device" to mean, and it does indeed reject every device except the Intel NPU. But having fixed that, the code still doesn't seem to work. I know the Intel driver still reports DirectML support as "preview". Are there any examples anywhere that show successful DirectML use on the Intel NPU that's in the Surface Laptop Studio 2?
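For reference, the selection rule described in this comment, together with device creation at the core feature level and the debug layer enabled, could look roughly like the sketch below. It is only an illustration, assuming the standard Windows SDK DXCore/D3D12 headers; SelectComputeOnlyAdapter and CreateCoreComputeDevice are made-up helper names, not functions from the sample, which appears to do the equivalent check inline inside its adapter loop.

```cpp
// Sketch only (not the sample's actual code): isolate the "compute-only"
// check described above, then create the device at the core feature level
// with the debug layer enabled. Link against dxcore.lib and d3d12.lib.
#include <initguid.h>   // so the DXCore attribute GUIDs are defined in this translation unit
#include <dxcore.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Returns the first adapter that supports D3D12 core compute but NOT graphics.
// GPUs (and the Microsoft Basic Render Driver) expose both attributes, so
// only an NPU-style device passes this filter.
ComPtr<IDXCoreAdapter> SelectComputeOnlyAdapter()
{
    ComPtr<IDXCoreAdapterFactory> factory;
    if (FAILED(DXCoreCreateAdapterFactory(IID_PPV_ARGS(&factory)))) return nullptr;

    const GUID attributes[] = { DXCORE_ADAPTER_ATTRIBUTE_D3D12_CORE_COMPUTE };
    ComPtr<IDXCoreAdapterList> list;
    if (FAILED(factory->CreateAdapterList(ARRAYSIZE(attributes), attributes,
                                          IID_PPV_ARGS(&list)))) return nullptr;

    for (uint32_t i = 0; i < list->GetAdapterCount(); ++i)
    {
        ComPtr<IDXCoreAdapter> adapter;
        if (FAILED(list->GetAdapter(i, IID_PPV_ARGS(&adapter)))) continue;

        if (adapter->IsAttributeSupported(DXCORE_ADAPTER_ATTRIBUTE_D3D12_CORE_COMPUTE) &&
            !adapter->IsAttributeSupported(DXCORE_ADAPTER_ATTRIBUTE_D3D12_GRAPHICS))
        {
            return adapter;
        }
    }
    return nullptr;
}

// Creates a D3D12 device on a compute-only adapter. The debug layer must be
// enabled before device creation for its messages to show up.
ComPtr<ID3D12Device> CreateCoreComputeDevice(IDXCoreAdapter* adapter)
{
    ComPtr<ID3D12Debug> debug;
    if (SUCCEEDED(D3D12GetDebugInterface(IID_PPV_ARGS(&debug))))
    {
        debug->EnableDebugLayer();
    }

    // Compute-only adapters (such as NPUs) are created at D3D_FEATURE_LEVEL_1_0_CORE.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(adapter, D3D_FEATURE_LEVEL_1_0_CORE,
                                 IID_PPV_ARGS(&device)))) return nullptr;
    return device;
}
```

Filtering on "supports compute and not graphics" is a heuristic: it rejects GPUs and the software rasterizer, but it relies on the NPU driver not advertising the graphics attribute.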
I’m encountering issues when attempting to run DirectML inference on an Intel NPU.
Specifically, the sample code uses my GPU instead of targeting the NPU. Here's the relevant code below.
When I set the GUID to DXCORE_HARDWARE_TYPE_ATTRIBUTE_NPU, the application fails to find the NPU device, printing "No NPU device found."
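As a rough illustration of the kind of lookup being described (not the actual snippet from the sample), the sketch below filters the DXCore adapter list by the NPU hardware-type attribute. It assumes the Windows 11 SDK (10.0.26100.0) headers, where DXCORE_HARDWARE_TYPE_ATTRIBUTE_NPU is defined; FindNpuAdapter is just an illustrative name.

```cpp
// Sketch only: filter the DXCore adapter list by the NPU hardware-type
// attribute. DXCORE_HARDWARE_TYPE_ATTRIBUTE_NPU comes with the Windows 11 SDK
// (10.0.26100.0) headers; if no matching adapter is exposed, the list is
// empty, which matches the "No NPU device found." symptom.
#include <initguid.h>
#include <dxcore.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

ComPtr<IDXCoreAdapter> FindNpuAdapter()
{
    ComPtr<IDXCoreAdapterFactory> factory;
    if (FAILED(DXCoreCreateAdapterFactory(IID_PPV_ARGS(&factory)))) return nullptr;

    // Ask DXCore only for NPU-class adapters.
    const GUID attributes[] = { DXCORE_HARDWARE_TYPE_ATTRIBUTE_NPU };
    ComPtr<IDXCoreAdapterList> list;
    if (FAILED(factory->CreateAdapterList(ARRAYSIZE(attributes), attributes,
                                          IID_PPV_ARGS(&list)))) return nullptr;

    if (list->GetAdapterCount() == 0)
    {
        std::printf("No NPU device found.\n");
        return nullptr;
    }

    ComPtr<IDXCoreAdapter> adapter;
    if (FAILED(list->GetAdapter(0u, IID_PPV_ARGS(&adapter)))) return nullptr;
    return adapter;
}
```

As the comments above suggest, building against the Windows 11 SDK (10.0.26100.0), and in some cases moving to a newer OS build, is what makes this enumeration return the Intel NPU.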
Here are the specifics of my hardware and software setup:
CPU: Intel(R) Core(TM) Ultra 9 185H
GPU: RTX 4060 Laptop
NPU: Intel(R) AI Boost
Driver Version: 32.0.100.2688
DirectX Version: 12
NuGet information:
Has anyone successfully run DirectML inference on an Intel NPU? If so, what steps were taken to properly configure the adapter and ensure the NPU was used?
Thank you for your assistance!