Remote Sensing (Oct 2018)

Automated Attitude Determination for Pushbroom Sensors Based on Robust Image Matching

  • Ryu Sugimoto,
  • Toru Kouyama,
  • Atsunori Kanemura,
  • Soushi Kato,
  • Nevrez Imamoglu,
  • Ryosuke Nakamura

DOI
https://doi.org/10.3390/rs10101629
Journal volume & issue
Vol. 10, no. 10
p. 1629

Abstract

Accurate attitude information for a satellite image sensor is essential for accurate map projection and for reducing the computational cost of post-processing image registration, both of which enhance image usability in applications such as change detection. We propose a robust attitude-determination method for pushbroom sensors onboard spacecraft that matches land features between well-registered base-map images and observed images, extending an existing method that derives satellite attitude from images taken with 2-D image sensors. Unlike a 2-D image sensor, a pushbroom sensor observes the ground while its position and attitude change along the satellite trajectory. To handle pushbroom observation, the proposed method traces the temporal variation of the sensor attitude by combining the robust matching technique developed for 2-D image sensors with a non-linear least-squares approach that expresses the gradual time evolution of the sensor attitude. Experimental results using images from the visible and near-infrared pushbroom sensor of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) onboard Terra as test images and Landsat-8/OLI images as a base map show that the proposed method determines satellite attitude with an accuracy of 0.003° (corresponding to a 2-pixel scale of ASTER) in the roll and pitch angles, even for scenes containing many cloud patches, whereas the accuracy remains 0.05° in the yaw angle, which affects image-registration accuracy less than the other two axes. In addition to achieving roll and pitch accuracy better than that of star trackers (0.01°), the proposed method does not require any attitude information from onboard sensors. Therefore, the proposed method may help validate and calibrate attitude sensors in orbit, and the improved accuracy can reduce the computational cost of post-processing for image registration.
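To make the abstract's core idea concrete, the sketch below illustrates the second stage only: fitting a smoothly time-varying attitude (roll, pitch, yaw as low-order polynomials of line time) to tie-point offsets with robust non-linear least squares. It is not the authors' implementation; the geometry is reduced to crude small-angle approximations, and the tie points, constants (e.g., the orbit-height/GSD ratio), and polynomial model are hypothetical stand-ins for the paper's image-matching output and attitude model.

```python
# Illustrative sketch (not the authors' code): fit a smooth attitude time
# series to tie-point offsets between a pushbroom scene and a base map,
# using robust non-linear least squares. All constants are hypothetical.
import numpy as np
from scipy.optimize import least_squares

H_OVER_GSD = 705e3 / 15.0      # assumed orbit height / ground sampling distance
DEG = np.pi / 180.0
POLY_ORDER = 2                 # quadratic time evolution per axis

def predicted_offsets(coeffs, t, x_cross):
    """Predict (dx, dy) pixel offsets from polynomial attitude errors.

    coeffs:  concatenated polynomial coefficients for roll, pitch, yaw.
    t:       normalized line (observation) time of each tie point.
    x_cross: cross-track pixel coordinate of each tie point (from swath center).
    """
    n = POLY_ORDER + 1
    roll  = np.polyval(coeffs[0:n],         t)
    pitch = np.polyval(coeffs[n:2 * n],     t)
    yaw   = np.polyval(coeffs[2 * n:3 * n], t)
    dx = H_OVER_GSD * roll                   # roll shifts pixels cross-track
    dy = H_OVER_GSD * pitch + yaw * x_cross  # pitch shifts along-track; yaw rotates the line
    return dx, dy

def residuals(coeffs, t, x_cross, dx_obs, dy_obs):
    dx, dy = predicted_offsets(coeffs, t, x_cross)
    return np.concatenate([dx - dx_obs, dy - dy_obs])

# Synthetic tie points standing in for the image-matching stage.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 1, 200))              # normalized line times
x_cross = rng.uniform(-2000, 2000, t.size)       # cross-track pixel positions
true = np.concatenate([[0, 0.002 * DEG, 0.001 * DEG],   # roll(t)
                       [0, -0.001 * DEG, 0.003 * DEG],  # pitch(t)
                       [0, 0, 0.05 * DEG]])             # yaw(t)
dx_obs, dy_obs = predicted_offsets(true, t, x_cross)
dx_obs = dx_obs + rng.normal(0, 0.5, t.size)     # matching noise (pixels)
dy_obs = dy_obs + rng.normal(0, 0.5, t.size)
outliers = rng.random(t.size) < 0.1              # e.g., mismatches over cloud patches
dx_obs[outliers] += rng.normal(0, 30, outliers.sum())

fit = least_squares(residuals, x0=np.zeros(true.size),
                    args=(t, x_cross, dx_obs, dy_obs),
                    loss="soft_l1", f_scale=1.0)  # robust loss downweights outliers
print("recovered yaw at t=1 [deg]:", np.polyval(fit.x[6:9], 1.0) / DEG)
```

The robust loss (here `soft_l1`) plays the role of tolerating mismatched tie points, such as those over clouds, while the polynomial parameterization enforces the gradual attitude evolution that distinguishes a pushbroom sensor from a 2-D frame sensor.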

Keywords