2407.13520.md


EaDeblur-GS: Event assisted 3D Deblur Reconstruction with Gaussian Splatting

3D deblurring reconstruction techniques have recently seen significant advancements with the development of Neural Radiance Fields (NeRF) and 3D Gaussian Splatting (3DGS). Although these techniques can recover relatively clear 3D reconstructions from blurry image inputs, they still face limitations in handling severe blurring and complex camera motion. To address these issues, we propose Event-assisted 3D Deblur Reconstruction with Gaussian Splatting (EaDeblur-GS), which integrates event camera data to enhance the robustness of 3DGS against motion blur. By employing an Adaptive Deviation Estimator (ADE) network to estimate Gaussian center deviations and using novel loss functions, EaDeblur-GS achieves sharp 3D reconstructions in real-time, demonstrating performance comparable to state-of-the-art methods.

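The abstract describes the core mechanism as an Adaptive Deviation Estimator (ADE) network that predicts per-Gaussian center deviations from event camera data. The paper's actual architecture is not given here, so the following is only a minimal numpy sketch of the idea: a toy two-layer MLP (the layer sizes, the `event_features` input, and the weight shapes are all assumptions for illustration) maps each Gaussian center plus an event-derived feature vector to a 3D offset, which is added to the center before rendering.

```python
import numpy as np

def adaptive_deviation_estimator(centers, event_features, W1, b1, W2, b2):
    """Toy stand-in for the ADE network (hypothetical architecture):
    a two-layer MLP mapping each Gaussian center, concatenated with an
    event-derived feature vector, to a per-center 3D deviation."""
    x = np.concatenate([centers, event_features], axis=1)  # (N, 3 + F)
    h = np.maximum(0.0, x @ W1 + b1)                       # ReLU hidden layer
    return h @ W2 + b2                                     # (N, 3) deviations

# Illustrative dimensions only; the paper does not specify these.
rng = np.random.default_rng(0)
N, F, H = 4, 8, 16                       # Gaussians, event-feature dim, hidden dim
centers = rng.normal(size=(N, 3))        # blurry-scene Gaussian centers
event_features = rng.normal(size=(N, F)) # per-Gaussian event-stream features
W1 = rng.normal(scale=0.1, size=(3 + F, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, 3));     b2 = np.zeros(3)

deviations = adaptive_deviation_estimator(centers, event_features, W1, b1, W2, b2)
deblurred_centers = centers + deviations  # shifted centers used for sharp rendering
print(deviations.shape)
```

In the paper's pipeline these deviations would be trained end-to-end against the novel loss functions mentioned in the abstract; the sketch above only shows the input/output shape of the deviation-estimation step.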