This repository has been archived by the owner on May 9, 2024. It is now read-only.

Merge pull request #47 from premAI-io/32-improve-dolly-documentation
updated documentation dolly
filopedraz authored Jul 11, 2023
2 parents 6e3b495 + f5affa0 commit ed1cc8c
Showing 2 changed files with 5 additions and 5 deletions.
chat-dolly-v2-12b/README.md — 2 changes: 1 addition & 1 deletion

```diff
@@ -6,7 +6,7 @@

 ## 💻 Hardware Requirements

-> **Memory requirements**: 24.5 GB (24576 bytes).
+> **Memory requirements**: 23.91 GiB (24484 MiB).
 To run the `dolly-v2-12b` service, you'll need the following hardware configuration:
```
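The updated figure in the README is a plain MiB-to-GiB conversion of the manifest's `memoryRequirements` value. As a quick sanity check (ordinary unit arithmetic, nothing repo-specific assumed):

```python
# Convert the manifest's memoryRequirements value (in MiB) to GiB,
# matching the figure quoted in the updated README line.
mib = 24484
gib = mib / 1024  # 1 GiB = 1024 MiB
print(f"{gib:.2f} GiB")  # prints "23.91 GiB"
```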
chat-dolly-v2-12b/manifest.json — 8 changes: 4 additions & 4 deletions

```diff
@@ -6,16 +6,16 @@
     "documentation": "",
     "icon": "",
     "modelInfo": {
-        "memoryRequirements": 24576,
-        "tokensPerSecond": 18
+        "memoryRequirements": 24484,
+        "tokensPerSecond": 19
     },
     "interfaces": [
         "chat"
     ],
     "dockerImages": {
         "gpu": {
-            "size": 40689160223,
-            "image": "ghcr.io/premai-io/chat-dolly-v2-12b-gpu:1.0.2"
+            "size": 40689261892,
+            "image": "ghcr.io/premai-io/chat-dolly-v2-12b-gpu:1.0.3"
         }
     },
     "defaultPort": 8000,
```
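A consumer of this manifest would typically read `modelInfo.memoryRequirements` (in MiB) before pulling the GPU image. A minimal sketch of that check, using only the fields visible in the diff above (the `available_mib` value is a hypothetical host reading, not part of the repo):

```python
import json

# Load a service manifest shaped like the one changed in this commit.
manifest = json.loads("""
{
  "modelInfo": {"memoryRequirements": 24484, "tokensPerSecond": 19},
  "interfaces": ["chat"],
  "dockerImages": {
    "gpu": {
      "size": 40689261892,
      "image": "ghcr.io/premai-io/chat-dolly-v2-12b-gpu:1.0.3"
    }
  },
  "defaultPort": 8000
}
""")

required_mib = manifest["modelInfo"]["memoryRequirements"]
available_mib = 24576  # hypothetical: a 24 GiB GPU reports 24576 MiB

# Only pull the image if the GPU has enough memory for the model.
if available_mib >= required_mib:
    print("OK to run", manifest["dockerImages"]["gpu"]["image"])
else:
    print(f"Need {required_mib} MiB, only {available_mib} MiB available")
```

With the new requirement of 24484 MiB, a 24 GiB (24576 MiB) card now passes this check, whereas the old value of 24576 MiB sat exactly at the limit.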
