Robust Visual Odometry and Dynamic Scene Modelling
Image-based estimation of camera trajectory, known as visual odometry (VO), has been a popular solution for robot navigation over the past decade due to its low cost and wide applicability. The problem of tracking self-motion as well as the motion of objects in the scene using information from a camera is known as multi-body visual odometry and is a challenging task. The performance of VO is highly sensitive to poor imaging conditions (e.g., direct sunlight, shadow and image blur), which limits its feasibility in many challenging scenarios. Current VO solutions can provide accurate camera motion estimation in largely static scenes. However, the deployment of robotic systems in our daily lives requires them to work in significantly more complex, dynamic environments. This thesis aims to develop VO solutions that are robust in two challenging cases, underwater and highly dynamic environments, by extensively analysing and overcoming the difficulties in each to achieve accurate ego-motion estimation. Furthermore, to better understand and exploit dynamic scene information, this thesis also investigates the motion of moving objects in dynamic scenes, and presents a novel way to integrate ego- and object-motion estimation into a single framework. In particular, underwater VO is challenging due to poor imaging conditions and inconsistent motion caused by water flow. This thesis extensively tests and evaluates possible solutions to these issues, and proposes a stereo underwater VO system that robustly and accurately localises an autonomous underwater vehicle (AUV). Visual odometry in dynamic environments is challenging because dynamic parts of the scene violate the static-world assumption fundamental to most classical visual odometry algorithms. If the moving parts of a scene dominate the static parts, off-the-shelf VO systems either fail completely or return poor trajectory estimates.

Most existing techniques try to simplify the problem by removing dynamic information. Arguably, in most scenarios the dynamics correspond to a finite number of individual objects that are rigid or piecewise rigid, and their motions can be tracked and estimated in the same way as the ego-motion. With this in mind, the thesis proposes a new way to model and estimate object motion, and introduces a novel multi-body VO system that tracks both ego and object motion in dynamic outdoor scenes. Building on the proposed multi-body VO framework, the thesis also exploits the spatial and temporal relationships between camera and object motions, as well as between static and dynamic structure, to obtain more consistent and accurate estimates. To this end, the thesis introduces a novel dynamic-object-aware visual SLAM system that achieves robust tracking of multiple moving objects, accurate estimation of full SE(3) object motions, and extraction of the inherent linear velocity of moving objects, along with accurate robot localisation and mapping of the environment structure. The performance of the proposed system is demonstrated on real datasets, showing its capability to resolve rigid-object motion estimation and yielding results that outperform state-of-the-art algorithms by an order of magnitude in urban driving scenarios.
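The abstract mentions extracting the inherent linear velocity of moving objects from their estimated full SE(3) motions. A minimal sketch of one way this can be done, assuming the frame-to-frame object motion is available as a 4x4 homogeneous transform and a representative object point is known (the function name, frame conventions and example values are illustrative, not taken from the thesis):

```python
import numpy as np

def object_linear_velocity(H, point, dt):
    """Estimate the linear velocity of a rigid object from its SE(3) motion.

    H     : 4x4 homogeneous transform mapping the object's points at time t
            to their positions at time t + dt (expressed in a fixed frame).
    point : a representative 3D point on the object at time t (e.g. its centroid).
    dt    : time elapsed between the two frames, in seconds.
    """
    p_h = np.append(point, 1.0)              # homogeneous coordinates
    displacement = (H @ p_h)[:3] - point     # how far the point moved
    return displacement / dt                 # velocity in metres per second

# Example: a pure translation of 1.5 m along x over 0.1 s.
H = np.eye(4)
H[0, 3] = 1.5
v = object_linear_velocity(H, np.array([2.0, 0.0, 1.0]), 0.1)
# v is [15.0, 0.0, 0.0], i.e. 15 m/s along x
```

For a purely translating object any point gives the same answer; for a rotating object the velocity depends on which point is tracked, which is why a centroid or body-frame reference point is typically chosen.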
Collections: Open Access Theses
File: Zhang_Thesis_2021.pdf (Thesis Material, 35.57 MB, Adobe PDF)