Doesn’t Ollama already support the OpenAI API? #2107
Replies: 4 comments 8 replies
-
@marklysze would you like to elaborate?
-
Hey @NeverOccurs, yep, Ollama does have an OpenAI-compatible API that can be used directly with AutoGen. If you don't need function/tool calling, I'd say Ollama's API is the way to go, since it saves running LiteLLM as an extra layer. LiteLLM supports the function/tool-calling parts of OpenAI's API, so together with Ollama it lets you use function calling with AutoGen. We are also looking to add a direct Ollama API example to the documentation.
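To make the first option concrete, pointing AutoGen directly at Ollama's OpenAI-compatible endpoint looks roughly like this. This is a minimal sketch: the model name `llama2` is an assumption (use whichever model you have pulled locally), and `http://localhost:11434/v1` is Ollama's default local address.

```python
# Sketch: an AutoGen config list aimed at Ollama's OpenAI-compatible API.
# Assumptions: Ollama is running locally on its default port, and a model
# named "llama2" has been pulled (`ollama pull llama2`).
config_list = [
    {
        "model": "llama2",                        # assumed local model name
        "base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-style endpoint
        "api_key": "ollama",                      # Ollama ignores the key, but a value is required
    }
]

# With pyautogen installed and Ollama running, an agent could then be built as:
# import autogen
# assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})

print(config_list[0]["base_url"])
```

Note that this route covers plain chat completions only; the function/tool-calling fields of the OpenAI API are what LiteLLM adds on top.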
-
Anytime! If you want to chat with others interested in local models, feel free to join AutoGen's Discord; the #alt-models channel is there for exactly that. I'll close this discussion for now, but feel free to open another here or on Discord if you have more questions.
-
Hi, sorry to reopen this, but does this also mean that RetrieveUserProxyAgent isn't suitable for Llama models just yet, since building a RAG agent requires some function calling?
-
Then why the need to wrap Ollama models with LiteLLM?
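Per the earlier reply, the wrapping exists for the function/tool-calling parts of the OpenAI API, which LiteLLM's proxy translates for Ollama. A rough sketch of the AutoGen side, assuming a LiteLLM proxy has been started in front of Ollama (e.g. with `litellm --model ollama/llama2` on the command line); the host, port, and model route below are assumptions, so check the proxy's startup log for the actual URL:

```python
# Sketch: an AutoGen config list aimed at a LiteLLM proxy instead of
# Ollama directly. Assumptions: the proxy route is "ollama/llama2" and
# it listens at the address below -- adjust both to match your setup.
litellm_config_list = [
    {
        "model": "ollama/llama2",                # assumed LiteLLM model route
        "base_url": "http://localhost:4000/v1",  # assumed proxy address
        "api_key": "not-needed",                 # the proxy does not check the key by default
    }
]

# Tool/function-calling requests sent through this endpoint are translated
# by LiteLLM into something Ollama can serve, which plain Ollama (per the
# thread above) could not handle on its own at the time.
print(litellm_config_list[0]["model"])
```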