This repository has been archived by the owner on Jun 17, 2024. It is now read-only.

Support for NVIDIA Triton Inference Server in ACI #42

Open
tom-r-o opened this issue Dec 7, 2020 · 0 comments

Comments


tom-r-o commented Dec 7, 2020

Running Triton on ACI (with a V100 GPU) yields the following error from Triton:

ERROR: This container was built for NVIDIA Driver Release 450.51 or later, but
       version 410.104 was detected and compatibility mode is UNAVAILABLE.

       [[CUDA Driver UNAVAILABLE (cuInit(0) returned 803)]]
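
For context, the container group was deployed to ACI with a GPU request roughly along the lines of the sketch below (resource names, region, CPU/memory sizes, and the Triton image tag are assumptions; the issue does not state the exact values used):

    # triton-aci.yaml -- minimal ACI container group requesting one V100 GPU
    apiVersion: 2019-12-01
    location: eastus
    name: triton-gpu
    properties:
      containers:
      - name: tritonserver
        properties:
          image: nvcr.io/nvidia/tritonserver:20.08-py3   # example tag, not confirmed in the issue
          resources:
            requests:
              cpu: 4
              memoryInGB: 16
              gpu:
                count: 1
                sku: V100
          ports:
          - port: 8000   # HTTP
          - port: 8001   # gRPC
          - port: 8002   # metrics
      osType: Linux
      restartPolicy: Never
    type: Microsoft.ContainerInstance/containerGroups

    # Deployed with:
    # az container create --resource-group <resource-group> --file triton-aci.yaml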

Running the same Triton image on an Azure VM with a V100 GPU, created from NVIDIA's VM image, does not produce the error. I'm guessing this is related to an outdated or missing driver on the ACI GPU host.
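
One way to confirm the suspected driver mismatch is to compare the driver version each environment reports (a rough check, assuming nvidia-smi is available inside the Triton container; the resource group and container group names below are placeholders):

    # Inside the ACI container group:
    az container exec --resource-group <resource-group> --name triton-gpu --exec-command "nvidia-smi"

    # On the Azure VM created from NVIDIA's image:
    nvidia-smi

On the VM this should report a driver at or above 450.51, while the ACI host apparently exposes 410.104, matching the error above.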

Triton's quickstart guide is available here.
