Hello, I'm having trouble deploying DistServe with Docker.
This is the command I used to launch the Docker environment:
docker run --gpus all -it --name dist_bench --network=host --shm-size=10g -v /home/hyunmin/dataset:/workspace/dataset nvcr.io/nvidia/pytorch:24.05-py3 bash
I also successfully built DistServe and SwiftTransformer inside the container.
However, when I try to launch the DistServe online api_server or run offline LLM inference, it gets stuck while initializing the engine.
Could this be related to the --shm-size option?
Can anyone share an environment or the commands that successfully launch the server?
Thank you in advance.
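In case it helps with comparing setups: here is a sketch of an alternative launch command, assuming the hang is shared-memory related (NCCL and Ray both use /dev/shm for inter-process communication). The mount path and image tag are just copied from my command above; the extra flags follow NVIDIA's general container-launch recommendations, not anything DistServe-specific.

```shell
# Sketch: same container, but with --ipc=host so /dev/shm inherits the
# host's size instead of being capped by --shm-size. The ulimit flags
# lift memory-lock and stack limits, which NVIDIA recommends for their
# PyTorch containers.
docker run --gpus all -it --name dist_bench \
    --network=host \
    --ipc=host \
    --ulimit memlock=-1 --ulimit stack=67108864 \
    -v /home/hyunmin/dataset:/workspace/dataset \
    nvcr.io/nvidia/pytorch:24.05-py3 bash

# Inside the container, verify how much shared memory is actually available:
df -h /dev/shm
```

If --ipc=host is not an option, checking `df -h /dev/shm` inside the container at least confirms whether the --shm-size=10g setting took effect.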