3D Gaussian splatting (3DGS) has recently demonstrated promising advances in RGB-D online dense mapping. Nevertheless, existing methods rely excessively on per-pixel depth cues to perform map densification, which leads to significant redundancy and increased sensitivity to depth noise. Additionally, explicitly storing the 3D Gaussian parameters of a room-scale scene poses a significant storage challenge. In this paper, we introduce OG-Mapping, which leverages the robust scene-structure representation capability of sparse octrees, combined with structured 3D Gaussian representations, to achieve efficient and robust online dense mapping. Moreover, OG-Mapping employs an anchor-based progressive map refinement strategy to recover scene structures at multiple levels of detail. Instead of maintaining a small number of active keyframes within a fixed keyframe window, as previous approaches do, OG-Mapping employs a dynamic keyframe window, allowing it to better tackle false local minima and the forgetting issue. Experimental results demonstrate that OG-Mapping delivers more robust mapping results with superior realism than existing Gaussian-based RGB-D online mapping methods, while using a compact model and requiring no additional post-processing.