An On-Line POMDP Solver for Continuous Observation Spaces

Authors

Hoerger, Marcus
Kurniawati, Hanna

Publisher

IEEE

Abstract

Planning under partial observability is essential for autonomous robots. A principled way to address such planning problems is the Partially Observable Markov Decision Process (POMDP). Although solving POMDPs is computationally intractable, substantial advancements have been achieved in developing approximate POMDP solvers in the past two decades. However, computing robust solutions for problems with continuous observation spaces remains challenging. Most on-line solvers rely on discretising the observation space or artificially limiting the number of observations considered during planning in order to compute tractable policies. In this paper we propose a new on-line POMDP solver, called Lazy Belief Extraction for Continuous Observation POMDPs (LABECOP), that combines methods from Monte-Carlo-Tree-Search and particle filtering to construct a policy representation which does not require a discretised observation space and avoids limiting the number of observations considered during planning. Experiments on three different problems involving continuous observation spaces indicate that LABECOP performs similarly to or better than state-of-the-art POMDP solvers.
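The abstract mentions that LABECOP combines Monte-Carlo-Tree-Search with particle filtering to handle continuous observations. As background only, the sketch below illustrates a generic weighted particle-filter belief update for a continuous observation; it is not taken from the paper, and the 1-D Gaussian dynamics and observation model, the function names, and all parameter values are assumptions for illustration.

```python
import math
import random

# Hypothetical illustration (not the paper's algorithm): a particle
# belief update for a continuous observation. States are scalars; the
# Gaussian dynamics/observation model is an assumption for the sketch.

def transition(state, action, noise=0.1):
    """Sample a next state: move by `action` plus Gaussian noise."""
    return state + action + random.gauss(0.0, noise)

def obs_likelihood(obs, state, sigma=0.5):
    """Gaussian density of a continuous observation given a state."""
    return math.exp(-0.5 * ((obs - state) / sigma) ** 2) / (
        sigma * math.sqrt(2 * math.pi))

def update_belief(particles, action, obs):
    """Propagate particles through the dynamics, weight each one by
    the likelihood of the continuous observation, then resample."""
    propagated = [transition(p, action) for p in particles]
    weights = [obs_likelihood(obs, p) for p in propagated]
    total = sum(weights)
    if total == 0.0:  # degenerate case: keep the propagated set
        return propagated
    norm = [w / total for w in weights]
    return random.choices(propagated, weights=norm, k=len(particles))

random.seed(0)
belief = [random.gauss(0.0, 1.0) for _ in range(1000)]
belief = update_belief(belief, action=1.0, obs=1.2)
mean = sum(belief) / len(belief)
```

Because the observation is used directly through its likelihood, no discretisation of the observation space is needed, which is the general property the paper's approach also relies on.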

Source

Proceedings of the IEEE International Conference on Robotics and Automation

Restricted until

2099-12-31