Replies: 2 comments
-
Hello @kadirnar. autollm's AutoQueryEngine supports all LiteLLM models in one line: https://docs.litellm.ai/docs/providers. Moreover, autollm supports all llama-index LLMs in 2-3 lines of code: you can pass any llama-index LLM instance to AutoServiceContext (see autollm/autollm/auto/service_context.py, line 29 at commit c369a03).
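For illustration, a minimal sketch of that second path; the `from_defaults(llm=...)` keyword is an assumption based on the service_context.py line referenced above, not a confirmed signature:

```python
# Sketch, not the official autollm API: assumes AutoServiceContext
# exposes a from_defaults classmethod with an `llm` keyword, per the
# service_context.py line referenced above.
from autollm.auto.service_context import AutoServiceContext
from llama_index.llms import OpenAI  # any llama-index LLM instance works here

# Any llama-index LLM; OpenAI is just a stand-in.
llm = OpenAI(model="gpt-3.5-turbo", temperature=0.1)

# Build a service context around the chosen LLM.
service_context = AutoServiceContext.from_defaults(llm=llm)
```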
-
I'm using the llama-cpp-python library to test LLMs. This library supports the GGUF format, but I couldn't find it on that list. Is there sample code? Can I run such models using the autollm library?
-
Example doc for llama-index's LlamaCPP wrapper, which loads GGUF models via llama-cpp-python: https://docs.llamaindex.ai/en/stable/examples/llm/llama_2_llama_cpp.html
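Based on that doc, a rough sketch of loading a local GGUF model through the LlamaCPP wrapper (llama-index ~0.9 import paths; the model path is a placeholder and the parameter values are illustrative):

```python
# Sketch following the llama-index LlamaCPP example linked above.
from llama_index.llms import LlamaCPP
from llama_index.llms.llama_utils import (
    messages_to_prompt,
    completion_to_prompt,
)

llm = LlamaCPP(
    model_path="/path/to/your-model.Q4_K_M.gguf",  # placeholder: local GGUF file
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
    model_kwargs={"n_gpu_layers": 1},  # set to 0 for CPU-only inference
    # Llama-2 chat prompt formatting helpers from the example doc:
    messages_to_prompt=messages_to_prompt,
    completion_to_prompt=completion_to_prompt,
    verbose=True,
)

response = llm.complete("Hello! Can you tell me a poem about cats and dogs?")
print(response.text)
```

The resulting `llm` is a regular llama-index LLM instance, so in principle it can then be handed to AutoServiceContext as described in the first reply above.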