add OPENAI_API_BASE_URL as a config parameter so it can point to other OpenAI-compatible proxy servers (e.g. local models) #15
vrijsinghani started this conversation in Ideas
Replies: 1 comment
-
@vrijsinghani thank you for your request. I have a question: can this base_url work with the OpenAI package? I do see that the OpenAI package supports a base_url configuration. If yes, can you please create an issue so that I can add the support?
-
Adding a config parameter such as OPENAI_API_BASE_URL to OpenAIConfig would let it be used via LiteLLM, Ollama, and other OpenAI-compatible servers.
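A minimal sketch of what this could look like, assuming a Python codebase. The `OpenAIConfig` class shape, the `from_env` helper, and the default URL here are hypothetical illustrations, not the project's actual API; only the environment variable name `OPENAI_API_BASE_URL` comes from the request. (The official `openai` Python client does accept a `base_url` argument on its `OpenAI` constructor, so the value read here could be passed straight through.)

```python
import os
from dataclasses import dataclass

# Hypothetical default; the official OpenAI endpoint.
DEFAULT_BASE_URL = "https://api.openai.com/v1"


@dataclass
class OpenAIConfig:
    """Hypothetical config object; field names are illustrative."""
    api_key: str
    base_url: str = DEFAULT_BASE_URL

    @classmethod
    def from_env(cls) -> "OpenAIConfig":
        # Fall back to the official endpoint when the variable is unset,
        # so existing deployments are unaffected.
        return cls(
            api_key=os.environ.get("OPENAI_API_KEY", ""),
            base_url=os.environ.get("OPENAI_API_BASE_URL", DEFAULT_BASE_URL),
        )
```

Pointing `OPENAI_API_BASE_URL` at, say, a local Ollama or LiteLLM proxy (`http://localhost:11434/v1` is Ollama's usual OpenAI-compatible endpoint) would then route all requests there without code changes.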