Sensors (Mar 2020)

Multi-Sensor Orientation Tracking for a Façade-Cleaning Robot

  • Manuel Vega-Heredia,
  • Ilyas Muhammad,
  • Sriharsha Ghanta,
  • Vengadesh Ayyalusami,
  • Siti Aisyah,
  • Mohan Rajesh Elara

DOI
https://doi.org/10.3390/s20051483
Journal volume & issue
Vol. 20, no. 5
p. 1483

Abstract

Glass-façade-cleaning robots are an emerging class of service robots. These robots are designed to operate on vertical surfaces, where tracking position and orientation becomes more challenging. In this article, we present a glass-façade-cleaning robot, Mantis v2, which, like others on the market, can shift from one window panel to another. Because of the complexity of panel shifting, we propose and evaluate different methods for estimating its orientation using several kinds of sensors working together on the Robot Operating System (ROS). For this application, we used an onboard Inertial Measurement Unit (IMU), wheel encoders, a beacon-based system, Time-of-Flight (ToF) range sensors, and an external vision sensor (camera) for angular position estimation of the Mantis v2 robot. The external camera monitors the robot’s operation and tracks the coordinates of two colored markers attached along the longitudinal axis of the robot to estimate its orientation angle. ToF lidar sensors attached to both sides of the robot detect the window frame: the ToF sensors measure the distance to the frame, and the difference between the beam readings is used to calculate the robot’s orientation angle. Differential-drive wheel encoder data are used to estimate the robot’s heading angle on the 2D façade surface. An integrated heading angle estimate is also obtained by applying simple fusion techniques, i.e., a complementary filter (CF) and a 1D Kalman filter (KF), to the IMU’s raw data. The heading angle information provided by the different sensory systems is then evaluated in static and dynamic tests against an off-the-shelf attitude and heading reference system (AHRS). We observed that the ToF sensors work effectively from 0 to 30 degrees, the beacon system has a delay of up to five seconds, and the odometry error grows with the distance traveled due to slippage and/or sliding on the glass. Among all tested orientation sensors and methods, the vision-based scheme performed best, with an orientation angle error of less than 0.8 degrees for this application. The experimental results demonstrate the efficacy of the proposed orientation-tracking techniques, which have not previously been applied to this specific class of cleaning robot.
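
As a rough illustration of two of the estimation schemes summarized above, the sketch below derives an orientation angle from the difference between two side-mounted ToF range readings and fuses a gyro rate with an absolute heading reference through a simple complementary filter. The sensor baseline, filter gain, and sample values are illustrative assumptions, not parameters taken from the paper.

```python
import math

def tof_orientation_deg(d_front_m, d_rear_m, baseline_m):
    """Orientation of the robot relative to the window frame, estimated from
    two ToF beams aimed at the frame; baseline_m is the (assumed) spacing
    between the two sensors along the robot's longitudinal axis."""
    return math.degrees(math.atan2(d_front_m - d_rear_m, baseline_m))

class ComplementaryFilter:
    """1D complementary filter: integrate the gyro rate for short-term heading
    and correct its drift with an absolute heading measurement (e.g. from the
    vision markers or an accelerometer/magnetometer-derived reference)."""
    def __init__(self, alpha=0.98, heading_deg=0.0):
        self.alpha = alpha          # weight on the integrated gyro heading (assumed)
        self.heading_deg = heading_deg

    def update(self, gyro_z_dps, abs_heading_deg, dt_s):
        gyro_heading = self.heading_deg + gyro_z_dps * dt_s
        self.heading_deg = (self.alpha * gyro_heading
                            + (1.0 - self.alpha) * abs_heading_deg)
        return self.heading_deg

if __name__ == "__main__":
    # ToF example: front beam reads 2 cm farther from the frame than the rear
    # beam, sensors 30 cm apart (assumed) -> roughly 3.8 degrees of yaw.
    print(round(tof_orientation_deg(0.52, 0.50, 0.30), 2))

    # Fuse a 1 deg/s gyro rate with a constant absolute heading at 100 Hz.
    cf = ComplementaryFilter(alpha=0.98)
    for _ in range(100):
        est = cf.update(gyro_z_dps=1.0, abs_heading_deg=1.0, dt_s=0.01)
    print(round(est, 2))
```

A 1D Kalman filter, as mentioned in the abstract, would additionally track an estimate covariance and weight the correction by the relative noise of the gyro integration and the absolute measurement rather than by a fixed gain.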

Keywords