Nature Communications (May 2024)

An artificial visual neuron with multiplexed rate and time-to-first-spike coding

  • Fanfan Li,
  • Dingwei Li,
  • Chuanqing Wang,
  • Guolei Liu,
  • Rui Wang,
  • Huihui Ren,
  • Yingjie Tang,
  • Yan Wang,
  • Yitong Chen,
  • Kun Liang,
  • Qi Huang,
  • Mohamad Sawan,
  • Min Qiu,
  • Hong Wang,
  • Bowen Zhu

DOI
https://doi.org/10.1038/s41467-024-48103-9
Journal volume & issue
Vol. 15, no. 1
pp. 1–11

Abstract

Human visual neurons rely on event-driven, energy-efficient spikes for communication, while silicon image sensors do not. This energy-budget mismatch between biological systems and machine vision technology has inspired the development of artificial visual neurons for use in spiking neural networks (SNNs). However, the lack of multiplexed data coding schemes limits the ability of artificial visual neurons in SNNs to emulate the visual perception of biological systems. Here, we present an artificial visual spiking neuron that enables rate and temporal fusion (RTF) coding of external visual information. The artificial neuron can code visual information at different spiking frequencies (rate coding) and enables precise, energy-efficient time-to-first-spike (TTFS) coding. This multiplexed sensory coding scheme could improve the computing capability and efficacy of artificial visual neurons. A hardware-based SNN with the RTF coding scheme shows good consistency with real-world ground-truth data and achieves highly accurate steering and speed predictions for self-driving vehicles in complex conditions. The multiplexed RTF coding scheme demonstrates the feasibility of developing highly efficient spike-based neuromorphic hardware.
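To make the two coding schemes named in the abstract concrete, here is a minimal illustrative sketch (not the authors' implementation): rate coding maps stimulus intensity to spike count within a fixed window, while TTFS coding maps intensity to the latency of a single first spike. The function names, the linear mappings, and the `window`/`t_max` parameters are assumptions for illustration only.

```python
def rate_code(intensity, window=10):
    """Rate coding sketch: higher intensity -> more spikes per window.
    `intensity` is normalized to [0, 1]; returns a binary spike train."""
    n_spikes = round(intensity * window)
    return [1] * n_spikes + [0] * (window - n_spikes)

def ttfs_code(intensity, t_max=10.0):
    """Time-to-first-spike sketch: higher intensity -> earlier first spike.
    Returns the latency of the single first spike (assumed linear mapping)."""
    return t_max * (1.0 - intensity)

# A bright stimulus yields more spikes (rate) and a shorter latency (TTFS)
bright, dim = 0.9, 0.2
print(sum(rate_code(bright)), sum(rate_code(dim)))  # spike counts
print(ttfs_code(bright), ttfs_code(dim))            # first-spike latencies
```

TTFS coding is often described as more energy-efficient than rate coding because each neuron fires at most once per stimulus; a fusion scheme such as RTF combines the robustness of rate codes with the speed and sparsity of latency codes.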