Frontiers in Neuroscience (Feb 2014)

Asynchronous Visual Event-based Time-to-Contact

  • Xavier Clady,
  • Charles Clercq,
  • Sio-Hoi Ieng,
  • Fouzhan Houseini,
  • Marco Randazzo,
  • Lorenzo Natale,
  • Chiara Bartolozzi,
  • Ryad Benjamin Benosman

DOI
https://doi.org/10.3389/fnins.2014.00009
Journal volume & issue
Vol. 8

Abstract

Reliable and fast sensing of the environment is a fundamental necessity for mobile platforms. Unfortunately, conventional cameras, because of their frame-based acquisition paradigm, output data with low temporal dynamics and high redundancy, leading to high computational costs. This is incompatible with the constraints of mobile platforms, where energy consumption and computational load are major issues, and conflicts with applications requiring high-speed, sensor-based reactive control. This paper introduces a fast obstacle-avoidance method using the output of an asynchronous, event-based, time-encoded imaging sensor. The approach is event-based in the sense that every incoming event contributes to the computation, allowing fast avoidance responses. It introduces an event-based time-to-contact approach relying on the computation of visual event-based motion flows. Experiments on a mobile robot in an indoor environment are presented. Time-to-contact results are compared with those provided by a laser range finder, showing that event-based sensing offers new perspectives for mobile robotics sensing.
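To illustrate the underlying principle (not the paper's specific algorithm), time-to-contact at an image point can be estimated as its radial distance from the focus of expansion (FOE) divided by the radial component of its visual motion flow. The sketch below assumes per-event pixel coordinates, flow vectors, and a known FOE; all function and variable names are hypothetical.

```python
import numpy as np

def time_to_contact(xy, flow, foe):
    """Estimate per-event time-to-contact (seconds).

    Illustrative sketch only, assuming:
    xy   : (N, 2) event coordinates, pixels
    flow : (N, 2) motion-flow vector at each event, pixels/second
    foe  : (2,)   focus of expansion, pixels

    TTC = radial distance from the FOE / outward radial flow speed.
    """
    r = xy - foe                          # radial vectors from the FOE
    dist = np.linalg.norm(r, axis=1)      # radial distance per event
    # outward speed: projection of the flow onto the radial direction
    v_rad = np.sum(flow * r, axis=1) / np.maximum(dist, 1e-9)
    # events moving toward the FOE (v_rad <= 0) pose no collision threat
    return np.where(v_rad > 0, dist / np.maximum(v_rad, 1e-9), np.inf)
```

In an event-driven setting, such an estimate can be updated incrementally as each event arrives, which is what enables the fast avoidance responses described in the abstract.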

Keywords