Add support for LiteLLM #56
Comments
Great idea! We'll add this
I'm the maintainer of LiteLLM, let me know if you run into any issues
@ishaan-jaff Thanks a lot! Just to clarify, we simply need two new environment variables to be able to use LiteLLM, correct? 😄
We would also need another UI dropdown, because with LiteLLM you can dynamically choose which backend to call; the interface is OpenAI-compatible, but you can load, for example, a Llama 2 model.
Example below:
But `model` can be anything defined by the user in their model list:
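For illustration, a minimal sketch of what such a user-defined model list might look like in a LiteLLM proxy config (the model names here are assumptions, not from the thread):

```yaml
# Hypothetical LiteLLM proxy config: each entry maps a name the client can
# request onto an actual backend model.
model_list:
  - model_name: gpt-3.5-turbo       # name the client sends
    litellm_params:
      model: gpt-3.5-turbo          # routed to OpenAI
  - model_name: my-llama2
    litellm_params:
      model: ollama/llama2          # routed to a local Llama 2 via Ollama
```

A dropdown in the UI would then let the user pick any `model_name` from this list, while the client code stays unchanged.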
Great, thanks for the info! We'll look into it
I suggest the following approach (see the LiteLLM docs: API: Server Endpoints):
One variable:

```python
import openai

client = openai.OpenAI(
    api_key="anything",
    base_url="http://0.0.0.0:8000"
)

# request sent to model set on litellm proxy, `litellm --model`
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": "this is a test request, write a short poem"
        }
    ]
)
print(response)
```
Thanks everyone! We added the environment variable for the
Hi @thomashacker, thanks for this wonderful feature. Wanna check with you if the embedding models in LiteLLM are supported? Thanks a lot! |
Hi there,
I am getting familiar with the source code, and I want the ability to point the embedding and generation settings at an OpenAI proxy server: https://docs.litellm.ai/docs/simple_proxy
We would just need two environment settings: the first is the API key, which you already have, and the second is the URL the OpenAI class should point to.
These are exactly the same settings you would use with the library directly; of course, normally api_base points to Azure:
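A minimal sketch of those two settings, assuming they surface as environment variables (the names `OPENAI_API_KEY` and `OPENAI_BASE_URL` here are an assumption, not confirmed in the thread):

```python
import os

# Hypothetical variable names: the API key Verba already reads, plus a new
# base-URL setting that redirects the OpenAI client to a proxy.
os.environ["OPENAI_API_KEY"] = "anything"              # a local proxy accepts any key
os.environ["OPENAI_BASE_URL"] = "http://0.0.0.0:8000"  # LiteLLM proxy; for Azure this would be the resource endpoint

# The OpenAI client would then be constructed from these values:
api_key = os.environ["OPENAI_API_KEY"]
base_url = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
print(base_url)
```

Everything else about the client usage stays the same; only where the requests are sent changes.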
Let me know.
Cheers!