Reducing the Sim-to-Real Gap for Event Cameras
Date
2020
Authors
Stoffregen, Timo
Scheerlinck, Cedric
Scaramuzza, Davide
Drummond, Tom
Barnes, Nick
Kleeman, Lindsay
Mahony, Robert
Publisher
Springer
Abstract
Event cameras are paradigm-shifting novel sensors that report asynchronous, per-pixel brightness changes called 'events' with unparalleled low latency. This makes them ideal for high-speed, high-dynamic-range scenes where conventional cameras would fail. Recent work has demonstrated impressive results using Convolutional Neural Networks (CNNs) for video reconstruction and optic flow with events. We present strategies for improving training data for event-based CNNs that result in a 20–40% boost in the performance of existing state-of-the-art (SOTA) video reconstruction networks retrained with our method, and up to 15% for optic flow networks. A challenge in evaluating event-based video reconstruction is the lack of quality ground truth images in existing datasets. To address this, we present a new High Quality Frames (HQF) dataset, containing events and ground truth frames from a DAVIS240C that are well-exposed and minimally motion-blurred. We evaluate our method on HQF plus several existing major event camera datasets.
Source
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Type
Conference paper
Restricted until
2099-12-31