InfoMat (Mar 2024)

Artificial visual‐tactile perception array for enhanced memory and neuromorphic computations

  • Jiaqi He,
  • Ruilai Wei,
  • Shuaipeng Ge,
  • Wenqiang Wu,
  • Jianchao Guo,
  • Juan Tao,
  • Ru Wang,
  • Chunfeng Wang,
  • Caofeng Pan

DOI
https://doi.org/10.1002/inf2.12493
Journal volume & issue
Vol. 6, no. 3

Abstract

The emulation of human multisensory functions to construct artificial perception systems is an intriguing challenge for developing humanoid robotics and cross‐modal human–machine interfaces. Inspired by human multisensory signal generation and neuroplasticity‐based signal processing, here, an artificial perceptual neuro array with visual‐tactile sensing, processing, learning, and memory is demonstrated. The neuromorphic bimodal perception array compactly combines an artificial photoelectric synapse network and an integrated mechanoluminescent layer, endowing individual and synergistic plastic modulation of optical and mechanical information, including short‐term memory, long‐term memory, paired pulse facilitation, and "learning‐experience" behavior. Sequential or superimposed visual and tactile stimuli inputs can efficiently simulate the associative learning process of "Pavlov's dog". The fusion of visual and tactile modulation enables enhanced memory of the stimulation image during the learning process. A machine‐learning algorithm is coupled with an artificial neural network for pattern recognition, achieving a recognition accuracy of 70% for bimodal training, which is higher than that obtained by unimodal training. In addition, the artificial perceptual neuron has a low energy consumption of ∼20 pJ. With its mechanical compliance and simple architecture, the neuromorphic bimodal perception array has promising applications in large‐scale cross‐modal interactions and high‐throughput intelligent perception.
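The bimodal-versus-unimodal recognition result can be illustrated with a minimal "early fusion" sketch: visual and tactile feature vectors are concatenated before classification, so a single model sees both modalities. The synthetic data, feature dimensions, and the plain gradient-descent logistic-regression trainer below are all illustrative assumptions for exposition, not the network or dataset used in the paper.

```python
import numpy as np

# Hypothetical setup: each modality carries a weak, noisy cue about the class.
rng = np.random.default_rng(0)
n = 400
y = rng.integers(0, 2, n)                                 # binary class labels
visual = y[:, None] * 0.8 + rng.normal(0, 1.0, (n, 4))    # weak "visual" features
tactile = y[:, None] * 0.8 + rng.normal(0, 1.0, (n, 4))   # weak "tactile" features

def train_logreg(X, y, lr=0.1, steps=500):
    """Train plain gradient-descent logistic regression; return train accuracy."""
    Xb = np.hstack([X, np.ones((len(X), 1))])             # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))                 # sigmoid probabilities
        w -= lr * Xb.T @ (p - y) / len(y)                 # gradient step
    pred = (Xb @ w > 0).astype(int)
    return float((pred == y).mean())

acc_visual = train_logreg(visual, y)                      # unimodal: vision only
acc_tactile = train_logreg(tactile, y)                    # unimodal: touch only
acc_bimodal = train_logreg(np.hstack([visual, tactile]), y)  # early fusion
```

With both cues concatenated, the fused classifier has access to twice the evidence per sample, which is the same qualitative effect the abstract reports for bimodal training over unimodal training.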

Keywords