A curated list of awesome Gaussian Splatting resources, inspired by awesome-computer-vision.
Gaussian Splatting is a technique that combines the advantages of explicit 3D scene representations, such as meshes and point clouds, with those of continuous representations like Neural Radiance Fields (NeRF). It introduces 3D Gaussians as a flexible and expressive scene representation that allows for high-quality, real-time rendering at 1080p resolution. The technique is also optimized for fast training, making it a highly efficient solution for real-time rendering of complex scenes captured from multiple photos.
- Must-Read Paper: 3D Gaussian Splatting for Real-Time Radiance Field Rendering
- Must-See Video: Short Presentation on 3D Gaussian Splatting
- Implementation: 3D Gaussian Splatting, reimagined: Unleashing unmatched speed with C++ and CUDA from the ground up!
- Beginner friendly Introduction: Blog: Introduction to 3D Gaussian Splatting
- Beginner friendly 2 minute Video: 3D Gaussian Splatting - Why Graphics Will Never Be The Same
- 1 November 2023: Added beginner friendly content introducing GS
- 31 October 2023: Added the paper: 4D Gaussian Splatting for Real-Time Dynamic Scene Rendering
- 30 October 2023: Added a FAQ section.
Authors: Bernhard Kerbl, Georgios Kopanas, Thomas Leimkühler, George Drettakis
About: The paper introduces a method using 3D Gaussians and a fast rendering algorithm for high-quality, real-time novel-view synthesis of scenes at 1080p resolution.
- 📄 Paper (Low Resolution)
- 📄 Paper (High Resolution)
- 🌐 Project Page
- 💻 Code
- 🎥 Short Presentation
- 🎥 Explanation Video
Authors: Jonathon Luiten, Georgios Kopanas, Bastian Leibe, Deva Ramanan
About: This paper presents a method that simultaneously addresses dynamic scene novel-view synthesis and six degree-of-freedom (6-DOF) tracking of all dense scene elements through a collection of 3D Gaussians.
Authors: Guanjun Wu, Taoran Yi, Jiemin Fang, Lingxi Xie, Xiaopeng Zhang, Wei Wei, Wenyu Liu, Qi Tian, Xinggang Wang
About: The paper introduces 4D Gaussian Splatting (4D-GS) for real-time dynamic scene rendering. It efficiently models both Gaussian motions and shape deformations using a deformation field. The method achieves real-time rendering at high resolutions (70 FPS at 800×800 resolution on an RTX 3090 GPU) while maintaining high quality.
Authors: Ziyi Yang, Xinyu Gao, Wen Zhou, Shaohui Jiao, Yuqing Zhang, Xiaogang Jin
About: The paper proposes a deformable 3D Gaussians Splatting method for dynamic scene reconstruction and real-time rendering, addressing issues faced by implicit methods in rendering dynamic scenes.
Authors: Zilong Chen, Feng Wang, Huaping Liu
About: This paper presents a novel approach for generating high-quality 3D objects using Gaussian Splatting based text-to-3D generation (GSGEN).
Authors: Jiaxiang Tang, Jiawei Ren, Hang Zhou, Ziwei Liu, Gang Zeng
About: DreamGaussian proposes a novel 3D content generation framework which leverages a generative 3D Gaussian Splatting model for efficient and quality 3D content creation.
Authors: Taoran Yi, Jiemin Fang, Guanjun Wu, Lingxi Xie, Xiaopeng Zhang, Wenyu Liu, Qi Tian, Xinggang Wang
About: GaussianDreamer proposes a fast 3D generation framework, where the 3D diffusion model provides point cloud priors for initialization and the 2D diffusion model enriches the geometry and appearance.
3D Gaussian Splatting for Real-Time Radiance Field Rendering
GSGEN: Text-to-3D using Gaussian Splatting
DreamGaussian: Generative Gaussian Splatting for Efficient 3D Content Creation
3D Gaussian Splatting, reimagined: Unleashing unmatched speed with C++ and CUDA from the ground up!
A Three.js implementation that demonstrates 3D Gaussian splatting. This project showcases a viewer for visualizing 3D Gaussian splats in a web-based application.
This project is a WebRTC implementation for viewing Gaussian splatting in real-time, allowing for interactive viewing experiences over the web.
A real-time renderer implemented in WebGL for viewing Gaussian splatting. This project aims to provide a high-performance rendering experience on the web.
A WebGPU-based viewer for Gaussian splatting, showcasing real-time rendering capabilities and leveraging the power of WebGPU for enhanced performance.
A toy project on GitHub showcasing an iOS and Metal AR Gaussian Splat Renderer. This viewer is designed for rendering Gaussian splatting in an AR environment on iOS devices using Metal.
This project demonstrates Gaussian splatting visualization within the Unity game engine. It provides a Unity-based implementation for rendering and viewing Gaussian splats.
A project that showcases CUDA accelerated rasterization of Gaussians with Python bindings. This viewer leverages CUDA for high-performance rendering and rasterization of Gaussians.
Summary: This blog post discusses the essence and benefits of Gaussian Splatting in rendering, shedding light on how it's a game changer for real-time, high-quality rendering, and how it could be applied in game development.
Summary: The author explores techniques and shares insights on how to make Gaussian splats smaller to optimize performance without compromising the quality of rendering.
Summary: A continuation of exploring ways to optimize the size of Gaussian Splats for better performance in rendering tasks.
Summary: The post breaks down the rasterization technique of 3D Gaussian Splatting, elaborating on how it enables real-time rendering of photorealistic scenes from small image samples, its process, and its potential impact on the future of graphics.
Summary: A tutorial video guiding viewers on how to get started with 3D Gaussian Splatting, covering the basics and essential steps to utilize this rendering technique. Watch here
Summary: This tutorial demonstrates how to view 3D Gaussian Splatting scenes in Unity, providing a practical guide for developers interested in integrating this technique in their projects. Watch here
Summary: Dive into Gaussian Splatting: what is it, how are scenes represented, and what fun things can we do with it? Watch here
- A: 3D Gaussian Splatting is a novel technique introduced for real-time, high-quality rendering of 3D scenes. It uses 3D Gaussians as a flexible and expressive scene representation, optimized for both visual quality and computational efficiency.
How does 3D Gaussian Splatting differ from traditional 3D scene representations like meshes and points?
- A: Traditional 3D scene representations like meshes and points are explicit and well-suited for fast GPU/CUDA-based rasterization. However, they lack the continuous nature that some other methods like Neural Radiance Fields (NeRF) offer. 3D Gaussian Splatting combines the best of both worlds, offering a continuous yet efficient representation.
- A: While NeRF methods offer high visual quality, they require costly stochastic sampling for rendering, which can result in noise. 3D Gaussian Splatting allows for state-of-the-art visual quality and competitive training times, without the need for costly sampling. It also ensures real-time rendering at high resolutions like 1080p.
- A: The method has three main components:
- 3D Gaussians as a flexible and expressive scene representation.
- Optimization of the properties of these 3D Gaussians, including their 3D position, opacity, anisotropic covariance, and spherical harmonic coefficients.
- A real-time rendering solution that uses fast GPU sorting algorithms and is inspired by tile-based rasterization.
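The optimized per-Gaussian properties listed above can be sketched as a small data structure. This is an illustrative sketch, not the authors' code: the field names are assumptions, and the anisotropic covariance is built from scale and rotation as Σ = R S Sᵀ Rᵀ, the factorization described in the paper.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Gaussian3D:
    # Illustrative parameter set; names do not match the reference implementation.
    position: np.ndarray   # (3,) center in world space
    opacity: float         # base opacity in [0, 1]
    scale: np.ndarray      # (3,) per-axis scale factors
    rotation: np.ndarray   # (4,) unit quaternion (w, x, y, z)
    sh_coeffs: np.ndarray  # spherical harmonic coefficients for view-dependent color

    def covariance(self) -> np.ndarray:
        """Anisotropic covariance Sigma = R S S^T R^T from scale and rotation."""
        w, x, y, z = self.rotation
        # Rotation matrix from the unit quaternion.
        R = np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ])
        S = np.diag(self.scale)
        M = R @ S
        return M @ M.T
```

Storing scale and rotation separately (rather than a raw 3×3 covariance) keeps the matrix positive semi-definite during gradient-based optimization.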
- A: The method employs a tile-based splatting solution for real-time rendering. It uses fast GPU sorting algorithms and is inspired by tile-based rasterization techniques. This allows for anisotropic splatting that respects visibility ordering and enables a fast and accurate backward pass.
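The visibility ordering mentioned in this answer can be illustrated with a minimal front-to-back compositing sketch for a single pixel. This is a simplification under stated assumptions: real tile-based rasterizers sort splats per tile on the GPU and use the projected 2D Gaussian falloff for alpha, whereas here alphas and depths are given directly.

```python
import numpy as np

def composite_front_to_back(colors, alphas, depths):
    """Blend per-splat contributions at one pixel, nearest first.

    colors: (N, 3) per-splat RGB, alphas: (N,) in [0, 1],
    depths: (N,) camera-space depth of each splat.
    """
    order = np.argsort(depths)   # sort by depth: nearest splat first
    out = np.zeros(3)
    transmittance = 1.0          # fraction of light not yet absorbed
    for i in order:
        out += transmittance * alphas[i] * colors[i]
        transmittance *= 1.0 - alphas[i]
        if transmittance < 1e-4:  # early termination once the pixel saturates
            break
    return out
```

The early-termination check mirrors why the sorted, tile-based approach is fast: once accumulated opacity saturates a pixel, splats behind it can be skipped entirely.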
- A: The method is optimized for fast training times, making it a highly efficient solution for real-time rendering of complex scenes. For example, it can achieve training speeds and quality similar to the fastest existing methods, and it provides the first real-time rendering solution with high quality for novel-view synthesis.
- A: The paper primarily focuses on real-time rendering for scenes captured with multiple photos. However, its flexible and expressive representation makes it a promising candidate for dynamic scenes as well.
MIT