
feat: Adds LlamaCpp LLM #96

Merged · 16 commits into main from bogdan/local-llm · Jan 13, 2024
Conversation

@bruvduroiu (Member) commented Jan 10, 2024

  • Uses llama-cpp-python as the Python bindings for https://github.com/ggerganov/llama.cpp

  • Refactors semantic_router.utils.function_call.{extract_function_inputs, is_valid_input} into class methods on semantic_router.llms.base.BaseLLM (so API-based, llama.cpp-based, and HF-based LLMs can override them)

closes #64
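The refactor described above can be sketched roughly as follows. This is a simplified illustration, not the PR's actual code: the class and method names (`BaseLLM`, `extract_function_inputs`, `_is_valid_inputs`, `LlamaCppLLM`) follow the PR description and file list, but the bodies, signatures, and validation logic here are assumptions. The real `LlamaCppLLM` would wrap a `llama_cpp.Llama` instance; here a plain callable stands in so the sketch is self-contained.

```python
import json
from typing import Any, Callable, Dict, List


class BaseLLM:
    """Simplified sketch of semantic_router.llms.base.BaseLLM.

    The PR moves the function-calling helpers here so that any backend
    (API-based, llama.cpp-based, HF-based) can share or override them.
    Bodies below are illustrative only.
    """

    def __init__(self, name: str):
        self.name = name

    def __call__(self, messages: List[Dict[str, str]]) -> str:
        raise NotImplementedError("Subclasses must implement __call__")

    def _is_valid_inputs(self, inputs: Dict[str, Any], schema: Dict[str, Any]) -> bool:
        # Hypothetical validation: every required parameter must be present.
        return all(param in inputs for param in schema.get("required", []))

    def extract_function_inputs(
        self, query: str, schema: Dict[str, Any]
    ) -> Dict[str, Any]:
        # Ask the underlying model to emit JSON arguments for a function
        # described by `schema`, then validate the parsed result.
        prompt = (
            f"Extract JSON arguments matching this schema: {json.dumps(schema)}\n"
            f"Query: {query}"
        )
        raw = self([{"role": "user", "content": prompt}])
        inputs = json.loads(raw)
        if not self._is_valid_inputs(inputs, schema):
            raise ValueError("LLM output did not match the function schema")
        return inputs


class LlamaCppLLM(BaseLLM):
    """Hypothetical local backend; in the PR this would wrap llama-cpp-python."""

    def __init__(self, llm: Callable[..., Dict[str, Any]], name: str = "llama.cpp"):
        super().__init__(name=name)
        # e.g. llama_cpp.Llama(model_path="path/to/model.gguf")
        self.llm = llm

    def __call__(self, messages: List[Dict[str, str]]) -> str:
        # llama-cpp-python's chat API returns an OpenAI-style completion dict.
        completion = self.llm(messages=messages)
        return completion["choices"][0]["message"]["content"]
```

Because `extract_function_inputs` lives on the base class, a local llama.cpp model gets function-calling support "for free" once `__call__` is implemented, which is the point of the refactor.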

codecov bot commented Jan 10, 2024

Codecov Report

Attention: 6 lines in your changes are missing coverage. Please review.

Comparison: base (666b361) 89.03% vs. head (2285771) 91.25%.

Files                              Patch %   Lines
semantic_router/llms/llamacpp.py    90.00%   4 Missing ⚠️
semantic_router/llms/base.py        96.77%   1 Missing ⚠️
semantic_router/route.py             0.00%   1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main      #96      +/-   ##
==========================================
+ Coverage   89.03%   91.25%   +2.21%     
==========================================
  Files          24       25       +1     
  Lines        1040     1086      +46     
==========================================
+ Hits          926      991      +65     
+ Misses        114       95      -19     


@jamescalam (Member) left a comment:

lgtm @bruvduroiu — this is awesome, I'm going to look into tests and will try locally with your notebook, once everything looks good we can merge

pyproject.toml (review thread, resolved; outdated)
semantic_router/llms/llamacpp.py (review thread, resolved)
@jamescalam changed the title from "Adds LlamaCpp LLM" to "feat: Adds LlamaCpp LLM" on Jan 13, 2024
semantic_router/llms/llamacpp.py (review thread, resolved; outdated)
semantic_router/llms/llamacpp.py (review thread, resolved)
@jamescalam jamescalam merged commit 311e909 into main Jan 13, 2024
8 checks passed
@jamescalam jamescalam deleted the bogdan/local-llm branch January 13, 2024 18:25
Development

Successfully merging this pull request may close these issues.

Support for local LLMs