Frontiers in Neurorobotics (Oct 2013)

Learning indoor robot navigation using visual and sensorimotor map information

  • Wenjie Yan,
  • Cornelius Weber,
  • Stefan Wermter

DOI
https://doi.org/10.3389/fnbot.2013.00015
Journal volume & issue
Vol. 7

Abstract


As a fundamental research topic, autonomous indoor robot navigation continues to be a challenge in unconstrained real-world indoor environments. Although many models for map-building and planning exist, they are difficult to integrate because of the high levels of noise, dynamics, and complexity. Addressing this challenge, this paper describes a neural model for environment mapping and robot navigation based on learning spatial knowledge. Considering that a person typically moves within a room without colliding with objects, the model learns spatial knowledge by observing the person's movement with a ceiling-mounted camera. A robot can plan and navigate to any given position in the room based on the acquired map, and adapt the map when possible obstacles are identified. In addition, salient visual features are learned and stored in the map during navigation. This anchoring of visual features in the map enables the robot to find and navigate to a target object when shown an image of it. We implement this model on a humanoid robot and conduct tests in a home-like environment. The results of our experiments show that the robot masters complex navigation tasks.
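The paper itself uses a neural model, which is not reproduced here. As a minimal, non-authoritative sketch of the underlying idea described in the abstract, learning free space from a person's observed positions and then planning paths over the acquired map, the following Python example accumulates observed person positions into a grid, treats visited cells as traversable, and runs a Dijkstra search to a goal. All names (GRID_SHAPE, observe_person, plan_path) and the grid discretization are hypothetical and only illustrate the concept, not the authors' implementation.

```python
import heapq
import numpy as np

GRID_SHAPE = (40, 40)                 # hypothetical discretization of the room
visit_counts = np.zeros(GRID_SHAPE)   # how often the person was observed in each cell


def observe_person(positions):
    """Accumulate observed person positions (grid indices) into visit counts."""
    for row, col in positions:
        visit_counts[row, col] += 1


def traversable():
    """Cells the person has visited at least once are assumed to be free space."""
    return visit_counts > 0


def plan_path(start, goal):
    """Dijkstra search over the learned traversability map (4-connected grid)."""
    free = traversable()
    if not (free[start] and free[goal]):
        return None
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, cell = heapq.heappop(queue)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < GRID_SHAPE[0] and 0 <= nc < GRID_SHAPE[1] and free[nr, nc]:
                nd = d + 1.0
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(queue, (nd, (nr, nc)))
    if goal != start and goal not in prev:
        return None                   # goal unreachable through observed free space
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]                 # path from start to goal as grid cells
```

In this sketch, obstacle adaptation as mentioned in the abstract could be approximated by resetting the visit counts of cells where the robot detects a blockage, which removes them from the traversable set before replanning.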

Keywords