
Add support for other LLM providers #240

Open

amirhmoradi opened this issue Aug 1, 2023 · 2 comments

Comments

@amirhmoradi (Contributor)

I would like to be able to use any model from OpenAI, Anthropic, Cohere, Forefront, HuggingFace, Aleph Alpha, Replicate, Banana, llama.cpp, and more.

Some projects that provide this feature:

I would love to contribute if you provide me with some help on how to get started :)

@molander

+1!
We can start by just honoring anything with:
OPENAI_API_BASE=<some_other_server>
or perhaps passing the proxy option could work:
OPENAI_API_PROXY=<somewhere_else:/v1/completions>

Great project btw! Kudos and thank you for open-sourcing! Since it is open source, it does indeed make sense to support some other models, especially open-source ones!
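A minimal sketch of what honoring OPENAI_API_BASE could look like (the helper names resolve_api_base and completions_url are hypothetical, not from this project's codebase):

```python
import os

# Official endpoint used when no override is set (assumption: the project
# currently hardcodes something equivalent to this).
DEFAULT_API_BASE = "https://api.openai.com/v1"

def resolve_api_base():
    # Honor OPENAI_API_BASE if the user set it (e.g. a local llama.cpp
    # server or another OpenAI-compatible endpoint), else fall back.
    return os.environ.get("OPENAI_API_BASE", DEFAULT_API_BASE).rstrip("/")

def completions_url():
    # Route the completions call through whatever base was resolved.
    return resolve_api_base() + "/completions"
```

With this in place, `OPENAI_API_BASE=http://localhost:8000/v1` would transparently redirect all completion calls to the user's own server, with no other code changes.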

@krrishdholakia

Hey - we might be able to help - https://github.com/BerriAI/litellm/tree/main/cookbook/proxy-server

Why use a proxy server instead of calling them directly, though? With a server you'd need to run it somewhere and add an extra hop (latency).
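The direct-call alternative could be sketched like this with only the standard library: build an OpenAI-wire-format completion request against any compatible base URL, whether that is the official API, a LiteLLM proxy, or a local server. The helper name build_completion_request is hypothetical, not part of any of the projects mentioned above.

```python
import json
import urllib.request

def build_completion_request(base_url, api_key, model, prompt):
    # Build (but do not send) a POST request in the OpenAI completions
    # wire format. Any server speaking that format can receive it, so
    # swapping providers is just a matter of changing base_url.
    body = json.dumps({"model": model, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        base_url.rstrip("/") + "/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```

Pointing base_url at a proxy keeps the client unchanged, at the cost of the extra hop; pointing it straight at the provider avoids the hop but requires the provider to expose an OpenAI-compatible endpoint.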


3 participants