Ollama Chat

A small tool for testing conversations and tuning parameters of local large language models served by Ollama.

If this project is helpful to you, please give it a ★ Star.
Crafted with ❤︎ by CairoLee and Contributors

Screenshots

Screenshot

Features

  • Fetch the model list from Ollama's server
  • Load the parameters saved with a model (see the API sketch after this list)
  • Quickly re-send the same conversation for testing after changing parameters or prompts
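
Under the hood these features talk to Ollama's HTTP API. The snippet below is a minimal sketch, not code from this repository; it assumes Ollama is running on its default endpoint http://localhost:11434 and uses the /api/tags and /api/show endpoints to list installed models and read a model's saved parameters.

# Minimal sketch of the Ollama HTTP API calls behind "fetch the model list"
# and "load parameters from a model". Assumes the default endpoint; adjust
# OLLAMA_URL if your Ollama server listens elsewhere.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

def list_models() -> list[str]:
    # GET /api/tags returns {"models": [{"name": "...", ...}, ...]}
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def show_model(model: str) -> dict:
    # POST /api/show returns the modelfile, template and saved parameters.
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/show",
        data=json.dumps({"model": model}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    for name in list_models():
        print(name, show_model(name).get("parameters", ""))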

Quick Start

Requirements

Install poetry

We use poetry to manage dependencies. If you don't have poetry installed, you can install it by running the following command on Linux, macOS, or Windows (WSL):

curl -sSL https://install.python-poetry.org | python3 -

If you are using Windows, you can install it by running the following command in PowerShell:

(Invoke-WebRequest -Uri https://install.python-poetry.org -UseBasicParsing).Content | py -

If you want to install poetry in other ways, you can refer to the official documentation.

Install dependencies

Clone the repository and run the following command in the project root directory:

poetry install

Run the app

Run the following command in the project root directory:

poetry run python main.py

Then you can visit http://127.0.0.1:7860 in your browser to chat with Ollama.
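
Note that the app requires a running Ollama server, which by default listens on http://localhost:11434. If the page loads but no models appear, a quick connectivity check (again just a sketch, assuming the default address) is:

import urllib.request

# A healthy Ollama server answers GET / with the plain text "Ollama is running".
print(urllib.request.urlopen("http://localhost:11434").read().decode())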
