feat: ollama support #8
Conversation
parameters = {
  stream = true,
},
schema = {
Is this the OpenAPI schema? If so, you may be able to autogenerate this in CI 👀
It's from their chat completions API.
Stevearc brilliantly created a schema in his dotfiles that we leverage for this.
Love the idea of being able to do this for every adapter.
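For context, a single schema entry might look something like the sketch below. This is a minimal illustration, assuming field names like `order`, `mapping`, `type`, `default`, `desc` and `validate`; it is not the exact shape of Stevearc's schema or this plugin's final format.

```lua
-- Hypothetical schema entry for one model setting. The field names
-- (order, mapping, type, default, desc, validate) are assumptions
-- for illustration only.
schema = {
  temperature = {
    order = 1,
    mapping = "parameters", -- where this value lands in the request body
    type = "number",
    default = 1,
    desc = "What sampling temperature to use, between 0 and 2.",
    validate = function(n)
      return n >= 0 and n <= 2, "Must be between 0 and 2"
    end,
  },
}
```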
This is looking dope, I'm excited to try it out with Ollama!!
We're slowly getting there. The OpenAI functionality has almost all been moved over to an adapter; just the inline functionality needs to follow. That shouldn't be too painful. Looking at the Ollama API, that should be a very quick adapter to write.
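To give a rough idea of why it should be quick, an Ollama adapter could be little more than a table of endpoint details and defaults. The URL below is Ollama's default local chat endpoint; the table shape itself is an assumption about how the plugin's adapters will look, not its final interface.

```lua
-- Hypothetical Ollama adapter sketch. The url reflects Ollama's default
-- local chat endpoint; the table shape is an assumption for illustration.
local ollama = {
  name = "ollama",
  url = "http://localhost:11434/api/chat",
  headers = {
    ["Content-Type"] = "application/json",
  },
  parameters = {
    model = "llama2",
    stream = true,
  },
}
```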
@mrjones2014 - Ollama support now added. A couple of tests you can try (with
A few bugs to iron out, but the majority of it is done.
Wow, amazing!!!! 🚀 I will test this out this evening!
Just been playing around with an Anthropic adapter too. That model is so impressive!
Sorry, didn't have a chance to try it out last night. Will get to it as soon as I can!
No probs! I'm going to merge this to
First step is to extract the current OpenAI configuration into its own adapter. Then we can create Ollama and possibly other adapters for the plugin.

Adapters will be an interface which maps to the plenary.curl library, and then some. They'll allow for request parameters and a schema definition (see the sketch below). The latter is important as it will be displayed to the user within the chat buffer, allowing them to tweak settings on the fly. Using the current cmp implementation, users will be able to see where and by how much they can tweak the models. It will also set out how the adapter options map to the parameters during the request.
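As a rough sketch of how an adapter could map onto plenary.curl: the snippet below merges the adapter's default parameters into the request body and streams the response. The adapter table shape and the `request` helper are assumptions for illustration, not the plugin's actual interface; only plenary.curl's documented `post()` options (headers, body, stream) are used.

```lua
-- Minimal sketch, assuming an adapter table with url, headers and
-- parameters fields as in the hypothetical Ollama adapter above.
local curl = require("plenary.curl")

local function request(adapter, messages)
  -- merge the adapter's default parameters with the chat messages
  local body = vim.tbl_extend("force", adapter.parameters, {
    messages = messages,
  })
  return curl.post(adapter.url, {
    headers = adapter.headers,
    body = vim.json.encode(body),
    stream = function(_, chunk)
      -- each streamed chunk would be decoded and written to the chat buffer
    end,
  })
end
```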