feat: Adds LlamaCpp LLM #96
Conversation

Force-pushed from d1bfced to 955ceaf.
Codecov Report — Attention: additional details and impacted files below.

@@            Coverage Diff             @@
##             main      #96      +/-  ##
==========================================
+ Coverage   89.03%   91.25%   +2.21%
==========================================
  Files          24       25       +1
  Lines        1040     1086      +46
==========================================
+ Hits          926      991      +65
+ Misses        114       95      -19

View full report in Codecov by Sentry.
Force-pushed from 708b63b to cf1e721.
LGTM @bruvduroiu — this is awesome. I'm going to look into tests and will try it locally with your notebook; once everything looks good we can merge.
This PR:
- uses llama-cpp-python as Python bindings for https://github.com/ggerganov/llama.cpp
- refactors semantic_router.utils.function_call.{extract_function_inputs,is_valid_input} into semantic_router.llm.base.BaseLLM class methods (API-based, llama.cpp-based, and HF-based LLMs can override these methods)
- closes #64
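The refactor described above can be sketched roughly as follows. This is a hypothetical illustration of the pattern, not the actual semantic_router code: the method names `extract_function_inputs` and `is_valid_inputs` mirror the PR description, but the real signatures, prompt, and validation logic in the repository may differ. The `EchoLLM` subclass and the `get_weather` signature are invented stand-ins for an API-, llama.cpp-, or HF-backed implementation.

```python
import json
from typing import Any


class BaseLLM:
    """Base class holding shared function-calling helpers, per the PR's refactor."""

    def __init__(self, name: str):
        self.name = name

    def __call__(self, messages: list[dict]) -> str:
        # Subclasses (OpenAI-, llama.cpp-, HF-based) implement the completion call.
        raise NotImplementedError

    def is_valid_inputs(self, inputs: dict[str, Any], signature: dict) -> bool:
        # Every parameter named in the function signature must be present.
        return all(param in inputs for param in signature["parameters"])

    def extract_function_inputs(self, query: str, signature: dict) -> dict[str, Any]:
        # Ask the underlying model to map the query onto the signature,
        # then validate the JSON it returns. Subclasses may override this.
        prompt = (
            f"Extract arguments for {signature['name']}{signature['parameters']} "
            f"from the query: {query}. Respond with JSON only."
        )
        raw = self([{"role": "user", "content": prompt}])
        inputs = json.loads(raw)
        if not self.is_valid_inputs(inputs, signature):
            raise ValueError("LLM output did not match the function signature")
        return inputs


class EchoLLM(BaseLLM):
    """Toy subclass standing in for a real backend; returns fixed JSON."""

    def __call__(self, messages: list[dict]) -> str:
        return '{"city": "Berlin"}'


signature = {"name": "get_weather", "parameters": ["city"]}
llm = EchoLLM(name="echo")
print(llm.extract_function_inputs("What's the weather in Berlin?", signature))
# → {'city': 'Berlin'}
```

The point of the design is that input extraction and validation live once on the base class, so a LlamaCpp-backed subclass only has to implement the completion call itself.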