Semi-dense 3D Reconstruction with a Stereo Event Camera

Date

2018

Authors

Zhou, Yi
Gallego, Guillermo
Rebecq, Henri
Kneip, Laurent
Li, Hongdong
Scaramuzza, Davide

Publisher

Springer Verlag

Abstract

Event cameras are bio-inspired sensors that offer several advantages, such as low latency, high speed, and high dynamic range, for tackling challenging scenarios in computer vision. This paper presents a solution to the problem of 3D reconstruction from data captured by a stereo event-camera rig moving in a static scene, such as in the context of stereo Simultaneous Localization and Mapping. The proposed method consists of the optimization of an energy function designed to exploit small-baseline spatio-temporal consistency of events triggered across both stereo image planes. To improve the density of the reconstruction and to reduce the uncertainty of the estimation, a probabilistic depth-fusion strategy is also developed. The resulting method has no special requirements on either the motion of the stereo event-camera rig or prior knowledge about the scene. Experiments demonstrate that our method can deal with both texture-rich and sparse scenes, outperforming state-of-the-art stereo methods based on event-image representations.
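
The abstract mentions a probabilistic depth-fusion step used to densify the reconstruction and reduce estimation uncertainty, but this record gives no further detail. As a rough illustration only, and not the paper's actual formulation, the Python sketch below fuses two independent Gaussian inverse-depth estimates of the same scene point; the function name and the numeric values are hypothetical.

def fuse_inverse_depth(mu_a, var_a, mu_b, var_b):
    """Fuse two independent Gaussian inverse-depth estimates (mean, variance).

    The product of two Gaussians gives a fused estimate whose variance is
    never larger than either input, which is how a probabilistic fusion step
    reduces uncertainty as more observations of the same point arrive.
    """
    fused_var = (var_a * var_b) / (var_a + var_b)
    fused_mu = fused_var * (mu_a / var_a + mu_b / var_b)
    return fused_mu, fused_var

# Example with made-up values: two noisy inverse-depth observations (in 1/m).
mu1, var1 = 0.50, 0.04
mu2, var2 = 0.46, 0.09
mu_f, var_f = fuse_inverse_depth(mu1, var1, mu2, var2)
print(f"fused inverse depth: {mu_f:.3f} 1/m, variance: {var_f:.4f}")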

Source

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Type

Conference paper

DOI

10.1007/978-3-030-01246-5_15

Restricted until

2037-12-31