This project aims to share technical principles and hands-on experience with large language models (LLM engineering, LLM application deployment)
Low-code framework for building custom LLMs, neural networks, and other AI models
SkyPilot: Run AI and batch jobs on any infra (Kubernetes or 12+ clouds). Get unified execution, cost savings, and high GPU availability via a simple interface.
Efficient Triton Kernels for LLM Training
An efficient, flexible, and full-featured toolkit for fine-tuning LLMs (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://docs.h2o.ai/h2o-llmstudio/
Code examples and resources for DBRX, a large language model developed by Databricks
dstack is a lightweight, open-source alternative to Kubernetes & Slurm, simplifying AI container orchestration with multi-cloud & on-prem support. It natively supports NVIDIA, AMD, & TPU.
DLRover: An Automatic Distributed Deep Learning System
Nvidia GPU exporter for prometheus using nvidia-smi binary
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
LLM-PowerHouse: Unleash LLMs' potential through curated tutorials, best practices, and ready-to-use code for custom training and inference.
LLM (Large Language Model) Fine-Tuning
irresponsible innovation. Try now at https://chat.dev/
The official repo of Aquila2 series proposed by BAAI, including pretrained & chat large language models.
USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long Context Transformers Model Training and Inference
Open Source LLM toolkit to build trustworthy LLM applications. TigerArmor (AI safety), TigerRAG (embedding, RAG), TigerTune (fine-tuning)
InternEvo is an open-source, lightweight training framework that aims to support model pre-training without the need for extensive dependencies.