
Populate ollama models autocompletion with locally available models #45

Merged

Conversation

lazymaniac
Contributor

@lazymaniac lazymaniac commented Apr 13, 2024

Created this one to populate the list of model choices with locally installed models. For Ollama this makes more sense, since there are far more options available compared to closed providers like OpenAI. Also, someone can use a 30B or 70B model if their PC is strong enough.
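The idea of discovering locally installed models can be sketched by querying Ollama's `GET /api/tags` endpoint, which returns the models available on the machine. The sketch below parses a sample response body offline; the `sample` JSON is a hypothetical illustration of the response shape, not captured output:

```python
import json

# Hypothetical sample body from Ollama's GET /api/tags endpoint
# (served at http://localhost:11434/api/tags by default).
sample = '{"models": [{"name": "llama2:7b"}, {"name": "mistral:latest"}]}'

def installed_models(tags_json: str) -> list[str]:
    """Extract installed model names from an /api/tags response body."""
    return [m["name"] for m in json.loads(tags_json).get("models", [])]

print(installed_models(sample))  # ['llama2:7b', 'mistral:latest']
```

In the plugin itself this list would be fetched asynchronously and fed to the completion source, so the dropdown always reflects what `ollama list` would show.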

@olimorris
Owner

I love this as an idea however it doesn't work as I expected it to:

(video attachment: 2024-04-14.13_08_07.-.WezTerm.mp4)

When I clear the model name, I would expect the dropdown to appear, as I can't recall which Ollama models I have installed. I suspect this might be the cmp implementation.

@lazymaniac
Contributor Author

Do you expect autocompletion to trigger automatically when the field is empty? For me it works when I trigger completion with a keymap (C-SPACE in my case). I can tinker with it more.
(screenshot attachment: Screenshot from 2024-04-14 17-51-45)

@olimorris
Owner

Ahhhh. I didn't manually trigger it. Great suggestion and thanks for the PR.

@olimorris olimorris merged commit e8c749a into olimorris:main Apr 16, 2024
2 checks passed