
feat: ollama support #8

Merged: 42 commits merged into main from feat-ollama-support on Mar 7, 2024

Conversation

@olimorris (Owner) commented on Feb 29, 2024

First step is to extract the current OpenAI configuration into its own adapter. Then we can create Ollama and possibly other adapters for the plugin.

Adapters will be an interface which maps to the plenary.curl library and then some. They'll allow for:

  • A URL
  • Some headers
  • Some parameters
  • Callbacks which can be leveraged from within the plugin to manipulate and output the API response
  • A defined schema for the API settings

The latter is important as it will be displayed to the user within the chat buffer, allowing them to tweak settings on the fly. Via the current cmp implementation, users will be able to see where and by how much they can tweak the models. The schema will also define how the adapter options map into the parameters during the request; a rough sketch follows.
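
To make that concrete, here's a minimal sketch of the shape an adapter could take. Every field name below is an assumption for illustration only, not the plugin's final interface:

    -- A minimal sketch of a possible adapter shape (illustrative only;
    -- the field names are assumptions, not the plugin's final interface).
    local openai_adapter = {
      url = "https://api.openai.com/v1/chat/completions",
      headers = {
        ["Content-Type"] = "application/json",
        ["Authorization"] = "Bearer " .. (os.getenv("OPENAI_API_KEY") or ""),
      },
      parameters = {
        stream = true, -- merged into the request body
      },
      callbacks = {
        -- shape the payload before plenary.curl sends it
        form_parameters = function(params, settings)
          return vim.tbl_extend("force", params, settings)
        end,
        -- pull the assistant's text out of a streamed response chunk
        handle_output = function(json)
          return json.choices[1].delta.content
        end,
      },
      schema = {
        model = {
          order = 1,
          type = "enum",
          default = "gpt-4",
          choices = { "gpt-4", "gpt-3.5-turbo" },
          desc = "The model to use for the chat completion",
        },
      },
    }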

From the diff itself:

    parameters = {
      stream = true,
    },
    schema = {
      -- ...
    },
A Contributor picked up on the schema field:
Is this the OpenAPI schema? You may be able to autogenerate it in CI if so 👀

@olimorris (Owner, Author) replied:

It's from their chat completion API.

Stevearc had brilliantly created a schema in his dotfiles that we leverage for this:

[Screenshot: the schema definition from Stevearc's dotfiles]

Love the idea of being able to do this for every adapter.
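
As a concrete illustration, a schema entry in that style might describe a single tunable setting like this (the field names are assumptions modelled on the OpenAI chat completion docs, not the exact table from the dotfiles):

    -- Illustrative only: one possible schema entry for a tunable setting.
    local schema = {
      temperature = {
        order = 2,
        type = "number",
        optional = true,
        default = 1,
        desc = "What sampling temperature to use, between 0 and 2",
        -- keep chat-buffer tweaks inside the range the API accepts
        validate = function(n)
          return n >= 0 and n <= 2, "Must be between 0 and 2"
        end,
      },
    }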

@mrjones2014 (Contributor)

This is looking dope, I'm excited to try it out with ollama!!

@olimorris (Owner, Author)

We're slowly getting there.

Almost all of the OpenAI functionality has been moved over to an adapter; just the inline functionality needs to follow. That shouldn't be too painful.

Looking at the Ollama API, that should be a very quick adapter to write.

@olimorris (Owner, Author)

@mrjones2014 - Ollama support now added.

A couple of tests you can try (with ollama serve up and running and assuming you're using llama2):

  1. Open up the action palette and open a chat. Start conversing with the API
  2. In a regular buffer run the command :CodeCompanion print a table of 5 fruits

A few bugs to iron out but the majority of it is done.
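
For anyone who wants to point the whole plugin at Ollama while testing, a hypothetical setup call might look like the following; the option names are illustrative, so check the README for the real ones:

    -- Hypothetical configuration sketch; option names are illustrative.
    require("codecompanion").setup({
      adapters = {
        chat = "ollama",   -- back the chat strategy with the Ollama adapter
        inline = "ollama", -- and the inline :CodeCompanion command too
      },
    })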

@mrjones2014 (Contributor)

Wow, amazing!!!! 🚀 I will test this out this evening!

@olimorris (Owner, Author)

Just been playing around with an Anthropic adapter too. That model is so impressive!

@mrjones2014 (Contributor)

Sorry, didn't have a chance to try it out last night. Will get to it as soon as I can!

@olimorris (Owner, Author)

> Sorry, didn't have a chance to try it out last night. Will get to it as soon as I can!

No probs! I'm going to merge this to main now; the README is fully updated.

@olimorris merged commit cd610db into main on Mar 7, 2024 (2 checks passed)
@olimorris deleted the feat-ollama-support branch on Mar 7, 2024 at 22:33