[Doc] fix third-party model example (vllm-project#9771)
Signed-off-by: Russell Bryant <[email protected]>
Signed-off-by: NickLucche <[email protected]>
russellb authored and NickLucche committed Oct 31, 2024
1 parent 3aceb91 commit 47169a5
Showing 1 changed file with 4 additions and 2 deletions.
6 changes: 4 additions & 2 deletions docs/source/models/adding_model.rst
@@ -133,7 +133,9 @@ If you are running api server with :code:`vllm serve <args>`, you can wrap the entrypoint with the following code:
     from vllm import ModelRegistry
     from your_code import YourModelForCausalLM
     ModelRegistry.register_model("YourModelForCausalLM", YourModelForCausalLM)
-    import runpy
-    runpy.run_module('vllm.entrypoints.openai.api_server', run_name='__main__')
+    if __name__ == '__main__':
+        import runpy
+        runpy.run_module('vllm.entrypoints.openai.api_server', run_name='__main__')
Save the above code in a file and run it with :code:`python your_file.py <args>`.
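
For context, a minimal sketch of how the full wrapper script reads after this change; the :code:`your_code` module and :code:`YourModelForCausalLM` class are the documentation's placeholders for an out-of-tree model, and the comments are added here for explanation only:

    # Register the out-of-tree model with vLLM before launching the server.
    from vllm import ModelRegistry
    from your_code import YourModelForCausalLM  # placeholder for your own model class

    ModelRegistry.register_model("YourModelForCausalLM", YourModelForCausalLM)

    if __name__ == '__main__':
        # Run vLLM's OpenAI-compatible API server as if it were started with
        # `python -m vllm.entrypoints.openai.api_server`; CLI args pass through.
        import runpy
        runpy.run_module('vllm.entrypoints.openai.api_server', run_name='__main__')

The :code:`if __name__ == '__main__':` guard is the substance of this fix: the server may spawn worker processes that re-import this file, and without the guard each re-import would execute the module-level :code:`runpy` call again.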
