🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and Flax.
Python code for the "Probabilistic Machine Learning" book by Kevin Murphy
Flax Engine – multi-platform 3D game engine
Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2023
Large language models (LLMs) made easy: EasyLM is a one-stop solution for pre-training, fine-tuning, evaluating, and serving LLMs in JAX/Flax.
Official code for Score-Based Generative Modeling through Stochastic Differential Equations (ICLR 2021, Oral)
A Library for Uncertainty Quantification.
Long Range Arena for Benchmarking Efficient Transformers
The purpose of this repo is to make it easy to get started with JAX, Flax, and Haiku. It contains my "Machine Learning with JAX" series of tutorials (YouTube videos and Jupyter Notebooks) as well as the content I found useful while learning about the JAX ecosystem.
Original implementation of Prompt Tuning from Lester et al., 2021
JAX (Flax) implementation of algorithms for Deep Reinforcement Learning with continuous action spaces.
Tevatron - A flexible toolkit for neural retrieval research and development.
Run Effective Large Batch Contrastive Learning Beyond GPU/TPU Memory Constraint
Orbax provides common checkpointing and persistence utilities for JAX users
A Jax-based library for designing and training transformer models from scratch.
Pretrained deep learning models for Jax/Flax: StyleGAN2, GPT2, VGG, ResNet, etc.
Train very large language models in Jax.