From 5f4fc7bee52f901bd697b0d78e9a4fb432321d41 Mon Sep 17 00:00:00 2001
From: Edgar Ruiz
Date: Fri, 5 Apr 2024 16:45:01 -0500
Subject: [PATCH] Local LLMs section

---
 .../chat-with-llms-using-chattr.Rmd           | 28 +++++++++++++++++++
 1 file changed, 28 insertions(+)

diff --git a/_posts/2024-04-04-chat-with-llms-using-chattr/chat-with-llms-using-chattr.Rmd b/_posts/2024-04-04-chat-with-llms-using-chattr/chat-with-llms-using-chattr.Rmd
index 92499f2e..4cc55418 100644
--- a/_posts/2024-04-04-chat-with-llms-using-chattr/chat-with-llms-using-chattr.Rmd
+++ b/_posts/2024-04-04-chat-with-llms-using-chattr/chat-with-llms-using-chattr.Rmd
@@ -27,3 +27,31 @@ knitr::opts_chunk$set(
   fig.height = 6
 )
 ```
+
+
+## Works with Copilot and ChatGPT
+
+## Works with local LLMs
+
+Open-source, pre-trained models that are able to run on your laptop are widely
+available today. Instead of integrating with each one individually, `chattr`
+works with **LlamaGPTJ-chat**, a lightweight application that communicates
+with a variety of local models. At this time, LlamaGPTJ-chat integrates with
+the following model families:
+
+- **GPT-J** (ggml and gpt4all models)
+- **LLaMA** (ggml Vicuna models from Meta)
+- **Mosaic Pretrained Transformers (MPT)**
+
+LlamaGPTJ-chat runs directly from the terminal. `chattr` integrates with the
+application by starting a 'hidden' terminal session, where it initializes the
+selected model and makes it available so you can start chatting with it.
+
+To get started, install LlamaGPTJ-chat and download a compatible model. More
+detailed instructions are available [here](https://mlverse.github.io/chattr/articles/backend-llamagpt.html#installation).
+
+
+## Integrating with `chattr`
+
+## Next steps
+
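
As a concrete illustration of the setup this patch describes, the sketch below shows how a local LlamaGPTJ-chat backend could be selected from an R session. It is a minimal sketch, not part of the patch: it assumes the `chattr_use()`, `chattr_defaults()`, and `chattr_test()` helpers described in the linked installation article, and the file paths are placeholders for wherever the compiled LlamaGPTJ-chat binary and downloaded model happen to live.

```r
# Minimal sketch: switch chattr to the local LlamaGPTJ-chat backend.
# The two file paths below are placeholders -- replace them with the
# location of your compiled LlamaGPTJ-chat executable and model file.
library(chattr)

# Select the local LlamaGPT integration instead of a remote provider
chattr_use("llamagpt")

# Point chattr at the chat executable and the downloaded model
chattr_defaults(
  path  = "~/LlamaGPTJ-chat/build/bin/chat",
  model = "~/models/ggml-gpt4all-j-v1.3-groovy.bin"
)

# Verify that the hidden terminal session can start the model,
# then send a prompt from the console
chattr_test()
chattr("Write an R function that counts missing values in a data frame")
```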