E2GS: Event Enhanced Gaussian Splatting

Event cameras, known for their high dynamic range, absence of motion blur, and low energy usage, have recently found a wide range of applications thanks to these attributes. In the past few years, the field of event-based 3D reconstruction has seen remarkable progress, with Neural Radiance Field (NeRF) based approaches demonstrating photorealistic view synthesis results. However, the volume rendering paradigm of NeRF necessitates extensive training and rendering times. In this paper, we introduce Event Enhanced Gaussian Splatting (E2GS), a novel method that incorporates event data into Gaussian Splatting, which has recently made significant advances in the field of novel view synthesis. Our E2GS effectively utilizes both blurry images and event data, significantly improving image deblurring and producing high-quality novel view synthesis. Our comprehensive experiments on both synthetic and real-world datasets demonstrate that E2GS can generate visually appealing renderings while offering faster training and rendering speeds (140 FPS).
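The abstract does not spell out the training objective, so the following is a minimal PyTorch sketch of the kind of event-based supervision such methods typically combine with a blurry-image term: events are integrated into a per-pixel log-intensity change map and compared against the log-intensity difference of two renderings, while the blurry frame is modeled as the average of sharp renderings across the exposure window. The function names, the contrast threshold `C`, and the averaging blur model are assumptions for illustration, not the paper's definitions.

```python
import torch
import torch.nn.functional as F

def accumulate_events(events: torch.Tensor, H: int, W: int, C: float = 0.2) -> torch.Tensor:
    """Integrate a chunk of events (rows of x, y, t, polarity) into a
    per-pixel log-intensity change map, scaled by an assumed contrast
    threshold C (each event signals a log-brightness change of +/- C)."""
    change = torch.zeros(H, W)
    xs, ys, ps = events[:, 0].long(), events[:, 1].long(), events[:, 3]
    change.index_put_((ys, xs), ps * C, accumulate=True)
    return change

def event_loss(render_t1: torch.Tensor, render_t2: torch.Tensor,
               event_map: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Compare the log-intensity difference of two renderings (grayscale,
    H x W) with the brightness change implied by the events between t1 and t2."""
    pred_change = torch.log(render_t2 + eps) - torch.log(render_t1 + eps)
    return F.mse_loss(pred_change, event_map)

def blur_loss(renders_in_exposure: list[torch.Tensor],
              blurry_image: torch.Tensor) -> torch.Tensor:
    """Model the captured blurry frame as the average of sharp renderings
    sampled at several timestamps within the exposure window."""
    synthetic_blur = torch.stack(renders_in_exposure).mean(dim=0)
    return F.mse_loss(synthetic_blur, blurry_image)
```

In a training loop, one would render the Gaussian Splatting scene at several timestamps, sum `event_loss` over consecutive timestamp pairs, and add `blur_loss` over the renderings inside each exposure; the relative weighting of the two terms is another assumption left to the reader.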