Reference-based scene stylization, which edits the appearance of a scene based on a content-aligned reference image, is an emerging research area. Starting from a pretrained neural radiance field (NeRF), existing methods typically learn a novel appearance that matches the given style. Despite their effectiveness, they inherently suffer from time-consuming volume rendering and are thus impractical for many real-time applications. In this work, we propose ReGS, which adapts 3D Gaussian Splatting (3DGS) for reference-based stylization to enable real-time stylized view synthesis. Editing the appearance of a pretrained 3DGS is challenging because it uses discrete Gaussians as the 3D representation, tightly binding appearance to geometry. Simply optimizing the appearance, as prior methods do, is often insufficient for modeling the continuous textures in the given reference image. To address this challenge, we propose a novel texture-guided control mechanism that adaptively adjusts the responsible local Gaussians to a new geometric arrangement that serves the desired texture details. The process is guided by texture cues for effective appearance editing and regularized by scene depth to preserve the original geometric structure. With these novel designs, we show that ReGS produces state-of-the-art stylization results that respect the reference texture while delivering real-time rendering for free-view navigation.
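To make the texture-guided control mechanism concrete, the sketch below illustrates one plausible reading of it under stated assumptions: Gaussians whose accumulated texture error is high are split into smaller children so the optimizer has finer primitives to fit high-frequency reference textures, while an L1 depth term anchors the edited scene to the pretrained geometry. All function names, thresholds, and the splitting heuristic here are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal illustrative sketch of texture-guided densification with depth
# regularization. Hypothetical: tensor layouts, the error threshold, and the
# child-sampling heuristic are assumptions, not the ReGS implementation.
import torch
import torch.nn.functional as F

def texture_guided_split(xyz: torch.Tensor,      # (N, 3) Gaussian centers
                         scale: torch.Tensor,    # (N, 3) per-axis scales
                         tex_err: torch.Tensor,  # (N,) accumulated texture error
                         err_thresh: float = 0.1,
                         n_children: int = 2):
    """Split Gaussians whose texture error exceeds a threshold.

    Each selected Gaussian is replaced by `n_children` smaller copies
    sampled around the parent, yielding a new geometric arrangement of
    primitives fine enough to carry the reference texture details.
    """
    mask = tex_err > err_thresh                  # Gaussians failing to match the texture
    parents_xyz = xyz[mask]
    parents_scale = scale[mask]

    # Sample child centers around each parent (one simple choice of offset).
    offsets = torch.randn(int(mask.sum()), n_children, 3) * parents_scale.unsqueeze(1)
    children_xyz = (parents_xyz.unsqueeze(1) + offsets).reshape(-1, 3)
    children_scale = (parents_scale / n_children).repeat_interleave(n_children, dim=0)

    # Keep well-fitting Gaussians as-is and append the new children.
    new_xyz = torch.cat([xyz[~mask], children_xyz], dim=0)
    new_scale = torch.cat([scale[~mask], children_scale], dim=0)
    return new_xyz, new_scale

def depth_regularizer(rendered_depth: torch.Tensor,
                      original_depth: torch.Tensor) -> torch.Tensor:
    """Penalize deviation from the pretrained scene's rendered depth so
    that appearance edits do not distort the underlying geometry."""
    return F.l1_loss(rendered_depth, original_depth)
```

In a full 3DGS training loop, a step like this would run periodically between optimization iterations, with `tex_err` accumulated from per-pixel stylization losses splatted back onto the contributing Gaussians.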