IEEE Access (Jan 2020)

A Virtual End-to-End Learning System for Robot Navigation Based on Temporal Dependencies

  • Yanqiu Zhang,
  • Ruiquan Ge,
  • Lei Lyu,
  • Jinling Zhang,
  • Chen Lyu,
  • Xiaojuan Yang

DOI
https://doi.org/10.1109/ACCESS.2020.3010695
Journal volume & issue
Vol. 8
pp. 134111 – 134123

Abstract

Steering a wheeled mobile robot through a variety of environments is a complex task. To achieve this, many researchers have tried to convert the front-facing camera data stream into corresponding steering angles using a convolutional neural network (CNN) model. However, most existing methods suffer from high data acquisition costs and long training cycles. To address these issues, this paper proposes an innovative end-to-end deep neural network model that fully considers the temporal relationships in the data by incorporating long short-term memory (LSTM) into the CNN model. In addition, to obtain enough data to train and test the model, we establish a simulation system capable of creating realistic environments with various weather and road conditions and of placing static and dynamic obstacles for robots to avoid. First, we use the system to capture raw image sequences in different environments as a training set, and then we test the trained model in the system to realize an autonomous mobile robot that can adapt to various environments. The experimental results demonstrate that the proposed model not only effectively and fully extracts the features of road vision information most relevant to navigation, but also learns the temporal dependence of the motion states and image features contained in a sequence.
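
To illustrate the kind of architecture the abstract describes, the following is a minimal sketch of a CNN + LSTM model that maps a sequence of front-facing camera frames to a single steering angle. It assumes a PyTorch implementation; the class name, layer sizes, and input resolution are illustrative assumptions and are not taken from the paper.

    # Hypothetical CNN + LSTM steering model; layer sizes are illustrative only.
    import torch
    import torch.nn as nn

    class CnnLstmSteering(nn.Module):
        """Maps a sequence of camera frames to one steering angle."""

        def __init__(self, hidden_size=64):
            super().__init__()
            # Per-frame feature extractor (hypothetical layer configuration).
            self.cnn = nn.Sequential(
                nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(36, 48, kernel_size=3, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d((4, 4)),
                nn.Flatten(),                       # -> (batch, 48 * 4 * 4)
            )
            # LSTM captures the temporal dependence across the frame sequence.
            self.lstm = nn.LSTM(48 * 4 * 4, hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, 1)   # regress one steering angle

        def forward(self, frames):
            # frames: (batch, seq_len, 3, H, W)
            b, t, c, h, w = frames.shape
            feats = self.cnn(frames.reshape(b * t, c, h, w)).reshape(b, t, -1)
            out, _ = self.lstm(feats)
            return self.head(out[:, -1])            # predict from last time step

    # Usage on dummy data: 8 sequences of 10 frames at 66x200 pixels.
    model = CnnLstmSteering()
    angles = model(torch.randn(8, 10, 3, 66, 200))
    print(angles.shape)  # torch.Size([8, 1])

The key design choice shown here is the split of responsibilities: the convolutional layers extract spatial road features from each frame independently, while the LSTM consumes the resulting feature sequence so that the predicted steering angle reflects the temporal context, not just the current image.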

Keywords