Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
MemoryCache is an experimental development project to turn a local desktop environment into an on-device AI agent
MLX-VLM is a package for running Vision LLMs locally on your Mac using MLX.
🦙 Ollama Telegram bot, with advanced configuration
Like ChatGPT's voice conversations with an AI, but entirely offline, private, and trade-secret-friendly, using local AI models such as Llama 2 and Whisper
Extract structured data from local or remote LLM models
MVP of an idea using multiple local LLM models to simulate and play D&D
Search your favorite websites and chat with them, on your desktop🌐
Visual summaries for code repositories
Empower Your Productivity with Local AI Assistants
OfflineAI is an artificial intelligence that operates offline and uses machine learning to perform various tasks based on the provided code. It is built on two AI models from Mistral AI.
A browser extension that brings local AI to your device
Catalog of OCI images for popular open-source or open Large Language Models.
Telegram bot that interacts with the local Ollama 🦙 to answer user messages
Email Auto-ReplAI is a Python tool that uses AI to automate drafting responses to unread Gmail messages, streamlining email management tasks.
Telegram bot that works in both OpenAI and LocalAI modes and also supports uncensored GPT models like Wizard-Uncensored. It can be launched on an ordinary gaming PC without dedicated AI-GPU hardware.