connecting to hosted llm (NOT LOCAL & NOT OPENAI) #1159
squatchydev9000 started this conversation in General
Replies: 1 comment · 3 replies
LiteLLM supports an OpenAI-compatible API?
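If the point of the reply is that a LiteLLM proxy (or any OpenAI-compatible server) can be reached without pulling in the `openai` SDK at all, a minimal stdlib-only sketch might look like the following. The host, port, and model name are hypothetical placeholders, not anything from this thread:

```python
# Sketch: calling an OpenAI-compatible endpoint (e.g. a LiteLLM proxy)
# with nothing but the Python standard library -- no openai SDK involved.
# The host/port and model name are hypothetical placeholders.
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST to the OpenAI-compatible /v1/chat/completions route."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def chat(base_url: str, model: str, prompt: str) -> str:
    """Send the request and pull the assistant text out of the response."""
    with urllib.request.urlopen(build_chat_request(base_url, model, prompt)) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires a live OpenAI-compatible server at this (hypothetical) address.
    print(chat("http://my-litellm-proxy:4000", "my-model", "Hello"))
```

This still speaks the OpenAI wire format, of course, just without the OpenAI client library; the server decides which backend model actually answers.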
Anyone been able to connect to a remote server hosting an LLM that isn't OpenAI and isn't local (Ollama/LiteLLM/OpenLLM/etc.)?
Without wrappers and workarounds, and without using the OpenAI API system?
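One way to avoid the OpenAI API system entirely is to speak whatever native HTTP API the remote server exposes. As a hedged sketch, assuming the host runs Hugging Face text-generation-inference (whose `POST /generate` route takes `{"inputs": ..., "parameters": {...}}` and returns `{"generated_text": ...}`); the hostname below is a hypothetical placeholder:

```python
# Sketch: talking to a remote LLM server over its own (non-OpenAI) HTTP API.
# Assumes the host runs Hugging Face text-generation-inference (TGI);
# the hostname is a hypothetical placeholder.
import json
import urllib.request


def build_generate_request(base_url: str, prompt: str, max_new_tokens: int = 64) -> urllib.request.Request:
    """Build a POST to TGI's /generate route."""
    payload = {"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}}
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def generate(base_url: str, prompt: str) -> str:
    """Send the prompt and return the server's generated text."""
    with urllib.request.urlopen(build_generate_request(base_url, prompt)) as resp:
        return json.loads(resp.read())["generated_text"]


if __name__ == "__main__":
    # Requires a live TGI server at this (hypothetical) address.
    print(generate("http://my-llm-host:8080", "Hello"))
```

The same pattern works for any self-hosted server with a documented REST API; only the route and payload shape change per backend.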