Support for openai baseurl/models #56
Comments
Thanks for opening the issue. I'm not familiar with those services, but after a brief look I think adding support should be quite easy. I'll look into it 👍🏻
Thank you! Greatly appreciate it. And just as an afterthought: if you look at https://github.com/matatonic , they have a few different OpenAI-compatible APIs for vision, STT, TTS, and image generation. But that is more of a personal want, and I'm not sure those integrations are as straightforward.
Any updates on this feature?
It looks like this is already supported for Groq.
Hi, I was hoping for the same feature for OpenAI + Genkit. I had started working on my own plugin when I found this one, so instead of redoing everything I opened PR #143, which improves the config so we can add custom models and set the base URL as well. It would be a really nice feature since there are a lot of services with OpenAI API support (deepinfra, openrouter, litellm, etc.).
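A minimal sketch of what "custom models and base URL via environment variables" could look like. The variable names (`OPENAI_BASE_URL`, `OPENAI_MODEL`) and the fallback values are assumptions for illustration, not the plugin's actual configuration:

```typescript
// Sketch only: variable names and defaults below are hypothetical,
// not the extension's real config surface.

type Env = Record<string, string | undefined>;

interface OpenAICompatConfig {
  baseURL: string; // e.g. an openrouter, deepinfra, or litellm endpoint
  model: string;   // provider-specific model identifier
}

// Resolve endpoint settings from environment variables, falling back
// to the official OpenAI endpoint when nothing is overridden.
function resolveConfig(env: Env): OpenAICompatConfig {
  return {
    baseURL: env.OPENAI_BASE_URL ?? "https://api.openai.com/v1",
    model: env.OPENAI_MODEL ?? "gpt-4o-mini",
  };
}
```

The resolved `baseURL` would then be passed through to the OpenAI client constructor, so any OpenAI-compatible service can be targeted without code changes.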
Is your feature request related to a problem? Please describe.
No; it's adding to the OpenAI extension so the base URL and model can be set in environment variables.
Describe the solution you'd like
A way to extend the extension to OpenAI-compatible endpoints.
Describe alternatives you've considered
Making it its own extension, since many models are not multimodal, or at least don't have all of OpenAI's features. That said, there are several projects that use the OpenAI API to replace all of those features: TTS, STT, image generation, etc.
Additional context
This would give people who don't have access to Claude, Gemini, etc. the ability to run any LLM through openrouter, deepinfra, etc.
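As a concrete illustration of the request, pointing the extension at an alternative provider could then be just a matter of environment configuration. The variable names and values here are hypothetical:

```shell
# Hypothetical settings for an OpenAI-compatible provider
# (variable names and values are illustrative).
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
export OPENAI_MODEL="meta-llama/llama-3.1-8b-instruct"
export OPENAI_API_KEY="<your-provider-key>"
```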