Left: Conventional (frame-based) 3D Gaussian Splatting fails to reconstruct geometric details due to motion blur caused by high-speed robot egomotion. Right: By exploiting the high temporal resolution of event cameras, Event3DGS can effectively reconstruct structure and appearance in the presence of fast egomotion.
By combining differentiable rendering with explicit point-based scene representations, 3D Gaussian Splatting (3DGS) has demonstrated breakthrough 3D reconstruction capabilities. However, to date 3DGS has had limited impact on robotics, where high-speed egomotion is pervasive: egomotion introduces motion blur, which leads to artifacts in the reconstructions produced by existing frame-based 3DGS methods.
To address this challenge, we introduce Event3DGS, an event-based 3DGS framework. By exploiting the exceptional temporal resolution of event cameras, Event3DGS can reconstruct high-fidelity 3D structure and appearance under high-speed egomotion. Extensive experiments on multiple synthetic and real-world datasets demonstrate the superiority of Event3DGS over existing event-based dense 3D scene reconstruction frameworks: Event3DGS substantially improves reconstruction quality (+3 dB) while reducing computational costs by 95%. Our framework also allows one to incorporate a few motion-blurred frame-based measurements into the reconstruction process to further improve appearance fidelity without loss of structural accuracy.
The proposed Event3DGS aims to efficiently reconstruct a 3D scene representation from a given sequence of events (captured by either a grayscale or a color event camera) under high-speed robot egomotion and low-light conditions.
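For readers unfamiliar with event streams, the sketch below illustrates the standard idealized event-camera model that such pipelines build on: a pixel fires an event whenever its log intensity changes by more than a contrast threshold, with polarity indicating the direction of the change. This is a generic, simplified simulator (one event per pixel per frame pair, hypothetical function and parameter names), not the paper's actual data pipeline.

```python
import numpy as np

def simulate_events(log_I_prev, log_I_curr, t_prev, t_curr, threshold=0.2):
    """Toy event generator under the standard event-camera model.

    A pixel (x, y) fires an event when its log intensity changes by at
    least a contrast threshold C; polarity is +1 for brightening and -1
    for darkening. This simplified version emits at most one event per
    pixel between the two input frames.
    """
    diff = log_I_curr - log_I_prev
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarity = np.sign(diff[ys, xs]).astype(np.int8)  # +1 brighter, -1 darker
    # Linearly interpolate a per-event timestamp: assume intensity changed
    # at a constant rate, so the threshold crossing happens at fraction
    # C / |diff| of the interval between the two frames.
    frac = threshold / np.abs(diff[ys, xs])
    ts = t_prev + frac * (t_curr - t_prev)
    return list(zip(xs.tolist(), ys.tolist(), ts.tolist(), polarity.tolist()))
```

Because events encode asynchronous brightness changes rather than absolute intensity, a fast-moving camera produces a dense, sharp event stream where a conventional frame would be motion-blurred, which is the property Event3DGS exploits.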
Visualization on synthetic scenes (event-only). Event sequences were generated using Blender and an event simulator. Event3DGS excels at reconstructing sharp structures and appearance details, such as the ficus leaves (2nd row) and drum racks (3rd row).
Visualization on real-world scenes (event-only). Event sequences were generated from experimentally captured frame-based data using v2e. Event3DGS effectively captures fine details (e.g., the grass behind the bicycle) and preserves 3D consistency.
Visualization on low-light experimental scenes (event-only). Event sequences were captured experimentally using a DAVIS-346C. Event3DGS exhibits superior performance in accurately reconstructing object edges and suppressing noise on non-event background pixels.
@inproceedings{xiongevent3dgs,
  title={{Event3DGS}: Event-Based {3D} Gaussian Splatting for High-Speed Robot Egomotion},
  author={Xiong, Tianyi and Wu, Jiayi and He, Botao and Ferm{\"u}ller, Cornelia and Aloimonos, Yiannis and Huang, Heng and Metzler, Christopher},
  booktitle={8th Annual Conference on Robot Learning},
  year={2024}
}