
Commit

add current model
llama 7b
textspur committed Jan 8, 2024
1 parent 7ea5b30 commit ffe59fd
Showing 2 changed files with 8 additions and 6 deletions.
README.Rmd: 4 changes (2 additions, 2 deletions)
@@ -45,7 +45,7 @@ docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
## Example

The first thing you should do after installation is to pull one of the models from <https://ollama.ai/library>.
- By calling `pull_model()` without arguments, you are pulling the default models --- "llama2":
+ By calling `pull_model()` without arguments, you are pulling the (current) default model --- "llama2 7b":

```{r lib}
library(rollama)
@@ -94,7 +94,7 @@ options(rollama_config = "You make answers understandable to a 5 year old")
query("why is the sky blue?")
```

By default, the package uses the "llama2" model.
By default, the package uses the "llama2 7B" model. Supported models can be found at <https://ollama.ai/library>. To download a specific model make use of the additional information available in "Tags" <https://ollama.ai/library/mistral/tags>.
Change this via `rollama_model`:

```{r model}
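# The rest of this chunk is truncated in the diff. As an illustrative sketch (not part
# of this commit): the default model is switched through the `rollama_model` option, and
# `pull_model()` also accepts a model name; "mistral" is only an example from
# <https://ollama.ai/library>, and tagged builds are listed on each model's "Tags" page.
pull_model("mistral")               # download the model if it is not yet available locally
options(rollama_model = "mistral")  # use it as the default for this session
```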
README.md: 10 changes (6 additions, 4 deletions)
@@ -34,8 +34,8 @@ interface, you can start Ollama locally with one command:
## Example

The first thing you should do after installation is to pull one of the
- models from <https://ollama.ai/library>. By calling `pull_model()`
- without arguments, you are pulling the default models — “llama2”:
+ models from <https://ollama.ai/library>. By calling `pull_model()`
+ without arguments, you are pulling the (current) default model --- "llama2 7b":

``` r
library(rollama)
@@ -189,8 +189,10 @@ query("why is the sky blue?")
#> that's why the sky is blue! Isn't that amazing? 😍
```

- By default, the package uses the “llama2” model. Change this via
- `rollama_model`:
+ By default, the package uses the "llama2 7B" model. Supported models can be found
+ at <https://ollama.ai/library>. To download a specific model, make use of the additional
+ information available under "Tags", e.g. <https://ollama.ai/library/mistral/tags>.
+ Change this via `rollama_model`:

``` r
options(rollama_model = "mixtral")
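# Illustrative sketch (not part of this commit), tying the README pieces together:
# pull the current default model ("llama2 7b") once, then query it.
library(rollama)
pull_model()                    # no argument: pulls the default model
query("why is the sky blue?")   # answered by the model set via options(rollama_model = ...)
```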
