mistral
Here are 489 public repositories matching this topic...
Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Updated Dec 24, 2024 - Python
🤖 The free, open-source alternative to OpenAI, Claude, and others. Self-hosted and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many more model architectures. Features: text, audio, video, and image generation, voice cloning, and distributed P2P inference.
Updated Dec 24, 2024 - Go
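Because LocalAI exposes an OpenAI-compatible API, existing OpenAI client code can usually just be repointed at it. A minimal sketch, assuming a local server on port 8080 (its commonly documented default) and a model name that has been configured on that server; both are assumptions here:

```python
# Point the official OpenAI Python client at a local OpenAI-compatible server
# instead of api.openai.com. Assumptions: the server listens on
# http://localhost:8080/v1 and a model named "mistral-7b-instruct" is configured.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")
resp = client.chat.completions.create(
    model="mistral-7b-instruct",  # hypothetical local model name
    messages=[{"role": "user", "content": "Summarize what a GGUF file is."}],
)
print(resp.choices[0].message.content)
```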
Low-code framework for building custom LLMs, neural networks, and other AI models
Updated Dec 23, 2024 - Python
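Ludwig's low-code approach centers on a declarative config rather than hand-written training code. A minimal sketch of that idea; the config keys are recalled from Ludwig's LLM fine-tuning docs and should be treated as assumptions that may differ between versions:

```python
# Declarative config + a single train() call; the config keys below are
# assumptions based on Ludwig's documented LLM fine-tuning style.
from ludwig.api import LudwigModel

config = {
    "model_type": "llm",
    "base_model": "mistralai/Mistral-7B-v0.1",
    "input_features": [{"name": "prompt", "type": "text"}],
    "output_features": [{"name": "response", "type": "text"}],
    "adapter": {"type": "lora"},                  # parameter-efficient fine-tuning
    "trainer": {"type": "finetune", "epochs": 1},
}

model = LudwigModel(config)
model.train(dataset="instruction_pairs.csv")      # hypothetical CSV with prompt/response columns
```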
Run any open-source LLM, such as Llama or Mistral, as an OpenAI-compatible API endpoint in the cloud.
Updated Dec 24, 2024 - Python
Firefly: a training tool for large language models, supporting Qwen2.5, Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large models.
Updated Oct 24, 2024 - Python
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
Updated Dec 24, 2024 - Python
AI suite powered by state-of-the-art models, providing advanced AI/AGI functions. It features AI personas, AGI functions, multi-model chats, text-to-image, voice, response streaming, code highlighting and execution, PDF import, presets for developers, and much more. Deploy on-prem or in the cloud.
Updated Dec 24, 2024 - TypeScript
Enchanted is an iOS and macOS app for chatting with private, self-hosted language models such as Llama2, Mistral, or Vicuna using Ollama.
Updated Nov 7, 2024 - Swift
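Enchanted is a client for an Ollama server, and the same chat endpoint can be exercised from any HTTP client. A minimal sketch, assuming Ollama is running locally on its default port 11434 and the "mistral" model has already been pulled:

```python
# Call Ollama's /api/chat endpoint directly with the requests library.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "mistral",
        "messages": [{"role": "user", "content": "Hello from a script"}],
        "stream": False,  # ask for a single JSON response instead of a stream
    },
    timeout=120,
)
print(resp.json()["message"]["content"])
```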
Efficient Triton Kernels for LLM Training
Updated Dec 23, 2024 - Python
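Liger Kernel works by patching a Hugging Face model's layers with Triton kernels before training. A minimal sketch following the project's README; the helper name is recalled from there and should be treated as an assumption for your installed version:

```python
# Patch the transformers Llama implementation with Liger's Triton kernels,
# then load the model as usual; the rest of the training code is unchanged.
from liger_kernel.transformers import apply_liger_kernel_to_llama  # name per the README (assumption)
from transformers import AutoModelForCausalLM

apply_liger_kernel_to_llama()  # monkey-patches the Llama modeling code in place
model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B")
```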
🧑‍🚀 A summary of the world's best LLM resources.
Updated Dec 24, 2024
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
Updated Sep 23, 2024 - Python
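xTuring's personalization flow is essentially create a model, fine-tune it on a dataset, then generate. A minimal sketch of that flow; the model key "llama_lora", the dataset class, and the dataset path are assumptions recalled from the README rather than a verified API:

```python
# Fine-tune a LoRA variant of LLaMA on an instruction dataset, then generate.
from xturing.datasets import InstructionDataset   # class name per the README (assumption)
from xturing.models import BaseModel

dataset = InstructionDataset("./alpaca_data")     # hypothetical local dataset directory
model = BaseModel.create("llama_lora")            # LoRA-wrapped LLaMA (assumption)
model.finetune(dataset=dataset)
print(model.generate(texts=["What is quantization?"]))
```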
Python SDK for AI agent monitoring, LLM cost tracking, benchmarking, and more. Integrates with most LLMs and agent frameworks like CrewAI, Langchain, and Autogen
Updated Dec 24, 2024 - Python
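AgentOps is wired in by initializing the SDK before the agent runs, so calls made through supported frameworks get recorded into a session. A minimal sketch, with the API key as a placeholder and the session-ending call recalled from the project's README:

```python
# Start a monitored session, run agent code, then mark the session outcome.
import agentops

agentops.init(api_key="<AGENTOPS_API_KEY>")  # placeholder key

# ... run your CrewAI / LangChain / Autogen agent here ...

agentops.end_session("Success")  # recalled from the README; treat as an assumption
```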
LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
Updated Dec 18, 2024 - Rust
[ACL 2024] An Easy-to-use Knowledge Editing Framework for LLMs.
Updated Dec 23, 2024 - Jupyter Notebook
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
Updated Oct 10, 2024 - Python
Lightweight inference library for ONNX files, written in C++. It can run Stable Diffusion XL 1.0 on a Raspberry Pi Zero 2 (or in 298 MB of RAM), as well as Mistral 7B on desktops and servers. ARM, x86, WASM, and RISC-V are supported. Accelerated by XNNPACK.
Updated Dec 24, 2024 - C++
Create chatbots with ease
Updated Oct 15, 2024 - TypeScript