An Ollama client made with GTK4 and Adwaita (Python, updated Sep 20, 2024).
Minimalistic UI for Ollama LMs. This React interface for LLMs improves the chatbot experience and works offline.
A single-file tkinter-based Ollama GUI project with no external dependencies.
Odin Runes, a Java-based GPT client, lets you interact with your preferred GPT model directly from your favorite text editor. It also aids prompt engineering by extracting context from diverse sources using technologies such as OCR, enhancing productivity and saving costs.
Simpler than simple: with Ollama you can easily run LLMs on your computer, no GPU required.
Your gateway to both Ollama & Apple MLX models.
Witsy: desktop AI assistant
Guide to self-hosting AI models using Traefik on a home network, offering cost-effective and controlled alternatives to cloud-based services.
open-blazorui is a simple UI for your local LLMs.
Ollama Chat is a GUI for Ollama designed for macOS.
Full featured demo application for OllamaSharp
A cross-platform local AI chat client compatible with all large models that support the Ollama and OpenAI APIs. Local deployment protects your data privacy, and it can serve as both an Ollama client and an OpenAI client.
React-based chat interface that integrates with multiple AI providers including Ollama, OpenAI, and Anthropic to provide an interactive AI chatbot experience. It features a dynamic UI with real-time message updates, code highlighting, HTML preview capabilities, and artifact rendering.
Desktop UI for Ollama made with PyQt.
A simple interface for interacting with LLMs via a local installation of Ollama
A web application for chatting with local LLMs via the Ollama API.
Enables users to interact with LLMs via Ollama through a client-server architecture, using FastAPI as the server-side framework and Streamlit for the user interface.
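Whatever the front end (GTK, Tkinter, Qt, React, Streamlit), these clients ultimately talk to the same local Ollama REST endpoint. A minimal stdlib-only sketch, assuming Ollama is running on its default port 11434 (the model name below is illustrative):

```python
import json
import urllib.request

# Ollama's default local generate endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama API."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt and return the model's reply.

    Requires a running Ollama instance with the model pulled locally.
    """
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Any of the GUIs above could sit on top of a call like `generate("llama3", "Hello!")`; streaming clients instead set `"stream": True` and read the response line by line.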
OllamaOne is an Ollama GUI client.