Intelligent Computing (Jan 2024)

Slimmed Optical Neural Networks with Multiplexed Neuron Sets and a Corresponding Backpropagation Training Algorithm

  • Yi-Feng Liu,
  • Rui-Yao Ren,
  • Dai-Bao Hou,
  • Hai-Zhong Weng,
  • Bo-Wen Wang,
  • Ke-Jie Huang,
  • Xing Lin,
  • Feng Liu,
  • Chen-Hui Li,
  • Chao-Yuan Jin

DOI
https://doi.org/10.34133/icomputing.0070
Journal volume & issue
Vol. 3

Abstract


Optical neural networks (ONNs) have recently attracted extensive interest as potential alternatives to electronic artificial neural networks, owing to their intrinsic capability for parallel signal processing with reduced power consumption and low latency. The parallelism of optical computing has been widely demonstrated by applying wavelength division multiplexing (WDM) to the linear transformation stage of neural networks. However, interchannel crosstalk has prevented WDM technologies from being deployed in the nonlinear activation stage of ONNs. Here, we propose a universal WDM structure called multiplexed neuron sets (MNS), which applies WDM technologies to optical neurons and enables ONNs to be further compressed. We also propose a corresponding backpropagation (BP) training algorithm that alleviates, or even eliminates, the influence of interchannel crosstalk in MNS-based WDM-ONNs. For simplicity, semiconductor optical amplifiers are employed as an example of MNS to construct a WDM-ONN trained using the new algorithm. The results show that the combination of MNS and the corresponding BP training algorithm clearly downsizes the system and improves energy efficiency by a factor of 10 while providing performance similar to that of traditional ONNs.
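The core idea of crosstalk-aware training can be illustrated with a minimal numerical sketch (this is a hypothetical toy model, not the authors' code): interchannel crosstalk between two WDM channels is modeled by an assumed fixed mixing matrix `C`, and backpropagation is carried out through `C` so that the learned weights compensate for the mixing. The crosstalk coefficient `eps`, network sizes, and the sigmoid nonlinearity are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

eps = 0.2  # assumed inter-channel crosstalk coefficient (illustrative)
C = np.array([[1 - eps, eps],
              [eps, 1 - eps]])  # crosstalk mixing between two WDM channels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy task: reproduce an ideal (crosstalk-free) 2-channel response.
X = rng.normal(size=(256, 4))
W_true = rng.normal(size=(2, 4))
Y = sigmoid(W_true @ X.T).T

def train(crosstalk_aware, steps=3000, lr=0.5):
    """Gradient descent; if crosstalk_aware, BP propagates through C."""
    W = np.zeros((2, 4))
    for _ in range(steps):
        Z = W @ X.T                       # linear transform per channel
        M = C @ Z if crosstalk_aware else Z  # model of the physical mixing
        Yhat = sigmoid(M).T
        dM = ((Yhat - Y) * Yhat * (1 - Yhat)).T  # gradient at mixed signal
        dZ = C.T @ dM if crosstalk_aware else dM  # chain rule through C
        W -= lr * dZ @ X / len(X)
    return W

def deployed_mse(W):
    """Evaluate on the physical system, where crosstalk is always present."""
    Yhat = sigmoid(C @ (W @ X.T)).T
    return float(np.mean((Yhat - Y) ** 2))

mse_aware = deployed_mse(train(True))    # crosstalk-aware BP
mse_naive = deployed_mse(train(False))   # crosstalk-ignorant BP
print(mse_aware < mse_naive)
```

In this sketch the crosstalk-aware variant can absorb the mixing into the learned weights (effectively learning `C⁻¹ @ W_true`), whereas the crosstalk-ignorant variant trains on an idealized model and degrades once crosstalk is present at deployment.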