Replies: 4 comments 9 replies
- Have you checked out LLM proxies like Ollama and LiteLLM?
- Yes. Please see the documentation here: https://microsoft.github.io/autogen/docs/topics/llm_configuration#adding-http-client-in-llm_config-for-proxy
- I have tested the solution in the doc for Gemini; it's not working.
- Hey, I initiated an AgentBuilder class. It requires
- Describe the issue
Can AutoGen be configured with a network proxy so that calls to the official AOAI or OpenAI interfaces go through it? Due to poor local network quality, I would like to set up a network proxy. Can this be done in AutoGen, and where do I set it up?
Steps to reproduce
The system's HTTP and HTTPS proxy settings are already configured, but they do not take effect in the AutoGen program. Do I need to add a network proxy directly in the program?
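If the goal is simply to have the program honor a system-level proxy, one option is to export the standard proxy environment variables inside the process before any OpenAI client is created; `httpx`, which the `openai` package uses under the hood, reads them by default. A minimal sketch; the proxy address is a placeholder:

```python
import os

# Placeholder proxy endpoint; substitute your actual proxy address.
proxy = "http://127.0.0.1:7890"

# Standard variables honored by most HTTP stacks, including httpx,
# provided they are set before the HTTP client is instantiated.
os.environ["HTTP_PROXY"] = proxy
os.environ["HTTPS_PROXY"] = proxy
```

These must be visible to the Python process itself (set in the script or its parent shell), which may explain why a system-wide proxy setting is not being picked up.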
Screenshots and logs
No response
Additional Information
autogenstudio version: 0.0.47
pyautogen version: 0.2.16