Frontiers in Physics (Nov 2022)

Using photonic reservoirs as preprocessors for deep neural networks

  • Ian Bauwens,
  • Guy Van der Sande,
  • Peter Bienstman,
  • Guy Verschaffelt

DOI: https://doi.org/10.3389/fphy.2022.1051941
Journal volume & issue: Vol. 10

Abstract

Artificial neural networks are very time-consuming and energy-intensive to train, especially when the size of the network is increased in an attempt to improve performance. In this paper, we propose to preprocess the input data of a deep neural network using a reservoir, a concept originally introduced in the framework of reservoir computing. The key idea of this paper is to use such a reservoir to transform the input data into a state in a higher-dimensional state space, which allows the deep neural network to process the data with improved performance. We focus on photonic reservoirs because of their fast computation times and low energy consumption. Based on numerical simulations of delay-based reservoirs using a semiconductor laser, we show that using such preprocessed data improves the performance of deep neural networks. Furthermore, we show that the parameters of the preprocessing reservoir do not need to be carefully fine-tuned.
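To make the pipeline described in the abstract concrete, the sketch below shows the general idea under simplifying assumptions: a fixed, untrained reservoir maps each input sequence to a higher-dimensional state, and only the downstream neural network is trained on those states. This is not the authors' code; the paper simulates a delay-based photonic reservoir built around a semiconductor laser, whereas here a standard software echo-state-style reservoir stands in for it, and the function name reservoir_transform and all parameter values are illustrative choices, not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): a random recurrent
# reservoir used as a fixed preprocessor in front of a small dense network.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def reservoir_transform(X, n_reservoir=200, spectral_radius=0.9, leak=0.3):
    """Map each input sequence to a higher-dimensional reservoir state.

    X: array of shape (n_samples, n_timesteps), one 1-D input sequence per row.
    Returns an array of shape (n_samples, n_reservoir) with the final state.
    """
    n_samples, n_steps = X.shape
    W_in = rng.uniform(-1.0, 1.0, size=n_reservoir)          # fixed input weights
    W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # scale for stability
    states = np.zeros((n_samples, n_reservoir))
    for i in range(n_samples):
        x = np.zeros(n_reservoir)
        for t in range(n_steps):
            # Leaky nonlinear node update; the reservoir itself is never trained.
            x = (1 - leak) * x + leak * np.tanh(W_in * X[i, t] + W @ x)
        states[i] = x
    return states

# Toy usage: distinguish noisy sine snippets from noisy square-wave snippets.
t = np.linspace(0, 2 * np.pi, 50)
X = np.vstack([np.sin(3 * t) + 0.1 * rng.standard_normal(50) for _ in range(100)]
              + [np.sign(np.sin(3 * t)) + 0.1 * rng.standard_normal(50) for _ in range(100)])
y = np.array([0] * 100 + [1] * 100)

H = reservoir_transform(X)                     # fixed preprocessing step
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(H, y)                                  # only the downstream network is trained
print("train accuracy:", clf.score(H, y))
```

In a photonic realization, the reservoir_transform step would be carried out by the optical hardware (here, a simulated delay-based semiconductor-laser reservoir), so the high-dimensional expansion adds little training cost while the digital deep network operates on the expanded states.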

Keywords