A diffusion transformer implementation in Flax
Diffusion Transformers in PyTorch and JAX
Lumina-T2X is a unified framework for Text to Any Modality Generation
Minimal DDPM/DiT-based generation of MNIST digits
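Minimal DDPM trainers like the one above all rely on the same closed-form forward process: a clean image is noised directly to any timestep t. A short sketch in NumPy, assuming the standard DDPM linear beta schedule (1e-4 to 0.02 over 1000 steps); the MNIST-sized input here is illustrative:

```python
import numpy as np

betas = np.linspace(1e-4, 0.02, 1000)   # linear schedule (common DDPM defaults)
alpha_bar = np.cumprod(1.0 - betas)     # cumulative product of (1 - beta_t)

def q_sample(x0, t, rng):
    # Forward process: x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps
    eps = rng.normal(size=x0.shape)
    xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return xt, eps  # the model is trained to predict eps from (xt, t)

rng = np.random.default_rng(0)
x0 = rng.normal(size=(28 * 28,))        # a flattened MNIST-sized image (placeholder data)
xt, eps = q_sample(x0, 500, rng)
```

The returned eps is the regression target: the denoiser (a DiT or U-Net) learns to recover it from the noisy sample.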
FORA introduces a simple yet effective caching mechanism in the Diffusion Transformer architecture for faster inference sampling.
The official implementation of "CAME: Confidence-guided Adaptive Memory Optimization"
A modified version of the Diffusion Transformer (DiT)
🔥🔥🔥Official Codebase of "DiT-3D: Exploring Plain Diffusion Transformers for 3D Shape Generation"
Implementation of a Latent Diffusion Transformer model in TensorFlow/Keras
Implementation of a Diffusion Transformer model in PyTorch
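The core of these DiT implementations is a transformer block conditioned via adaLN-Zero: the timestep/class embedding regresses a per-sublayer shift, scale, and gate, with the gate zero-initialized so each block starts as the identity. A minimal single-head sketch in NumPy (hidden size, weight shapes, and the ReLU MLP are simplifications; DiT uses multi-head attention and GELU):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 64  # hidden size (illustrative)

def layer_norm(x, eps=1e-6):
    # LayerNorm without a learned affine, as in DiT's adaLN blocks
    return (x - x.mean(-1, keepdims=True)) / np.sqrt(x.var(-1, keepdims=True) + eps)

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def self_attention(x, Wq, Wk, Wv, Wo):
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    a = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return (a @ v) @ Wo

def dit_block(x, c, p):
    # adaLN-Zero: conditioning c produces 6 modulation vectors
    shift1, scale1, gate1, shift2, scale2, gate2 = np.split(c @ p["W_ada"], 6, axis=-1)
    h = layer_norm(x) * (1 + scale1) + shift1
    x = x + gate1 * self_attention(h, p["Wq"], p["Wk"], p["Wv"], p["Wo"])
    h = layer_norm(x) * (1 + scale2) + shift2
    x = x + gate2 * (np.maximum(h @ p["W1"], 0.0) @ p["W2"])  # ReLU MLP here; GELU in the paper
    return x

p = {
    "W_ada": np.zeros((D, 6 * D)),  # zero-init: gates start at 0, block starts as identity
    "Wq": rng.normal(0, 0.02, (D, D)), "Wk": rng.normal(0, 0.02, (D, D)),
    "Wv": rng.normal(0, 0.02, (D, D)), "Wo": rng.normal(0, 0.02, (D, D)),
    "W1": rng.normal(0, 0.02, (D, 4 * D)), "W2": rng.normal(0, 0.02, (4 * D, D)),
}
x = rng.normal(size=(16, D))   # 16 tokens (image patches)
c = rng.normal(size=(1, D))    # timestep/class conditioning vector
y = dit_block(x, c, p)
```

With the zero-initialized modulation weights, the gates are zero and the block passes its input through unchanged, which is exactly the training-stability trick adaLN-Zero is named for.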
[ICCV 2023] Efficient Diffusion Training via Min-SNR Weighting Strategy