Diffractive Deep Neural Networks at Visible Wavelengths
Hang Chen,
Jianan Feng,
Minwei Jiang,
Yiqun Wang,
Jie Lin,
Jiubin Tan,
Peng Jin
Affiliations
Hang Chen
Center of Ultra-precision Optoelectronic Instrument, Harbin Institute of Technology, Harbin 150001, China
Jianan Feng
Center of Ultra-precision Optoelectronic Instrument, Harbin Institute of Technology, Harbin 150001, China
Minwei Jiang
Center of Ultra-precision Optoelectronic Instrument, Harbin Institute of Technology, Harbin 150001, China
Yiqun Wang
Nanofabrication Facility, Suzhou Institute of Nano-Tech and Nano-Bionics, Chinese Academy of Sciences, Suzhou 215123, China
Jie Lin
Center of Ultra-precision Optoelectronic Instrument, Harbin Institute of Technology, Harbin 150001, China; Key Laboratory of Micro-Systems and Micro-Structures Manufacturing, Ministry of Education, Harbin Institute of Technology, Harbin 150001, China; Corresponding authors.
Jiubin Tan
Center of Ultra-precision Optoelectronic Instrument, Harbin Institute of Technology, Harbin 150001, China
Peng Jin
Center of Ultra-precision Optoelectronic Instrument, Harbin Institute of Technology, Harbin 150001, China; Key Laboratory of Micro-Systems and Micro-Structures Manufacturing, Ministry of Education, Harbin Institute of Technology, Harbin 150001, China; Corresponding authors.
Optical deep learning based on diffractive optical elements offers unique advantages in parallel processing, computational speed, and power efficiency. One landmark approach is the diffractive deep neural network (D2NN), fabricated by three-dimensional printing and operated in the terahertz spectral range. Because the terahertz band involves limited interparticle coupling and material losses, this paper extends the D2NN to visible wavelengths. A general theory, including a revised formula, is proposed to resolve the contradictions among wavelength, neuron size, and fabrication limitations. A novel visible-light D2NN classifier is used to recognize unchanged targets (handwritten digits 0 through 9) and changed targets (i.e., targets that have been covered or altered) at a visible wavelength of 632.8 nm. The experimental classification accuracy of 84% and the numerical classification accuracy of 91.57% quantify the agreement between the theoretical design and the performance of the fabricated system. The presented framework can be used to bring the D2NN to a variety of practical applications and to guide the design of new ones.
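The abstract does not give implementation details, but the D2NN forward model it refers to can be summarized as a cascade of phase-only diffractive layers separated by free-space propagation. The following is a minimal numerical sketch, assuming angular-spectrum propagation and illustrative (hypothetical) values for the neuron pitch, layer count, and layer spacing; only the 632.8 nm wavelength is taken from the abstract, and the actual design parameters and training procedure of the paper may differ.

```python
import numpy as np

def angular_spectrum_propagate(field, dx, wavelength, distance):
    """Propagate a complex optical field over `distance` using the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                      # spatial frequencies (cycles per meter)
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    # Transfer function H(fx, fy); evanescent components are suppressed by clipping at zero.
    arg = np.maximum(0.0, (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2)
    H = np.exp(1j * 2.0 * np.pi * distance * np.sqrt(arg))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def d2nn_forward(input_field, phase_masks, dx, wavelength, layer_spacing):
    """Forward pass of a phase-only D2NN: propagate, modulate at each layer, then read intensity."""
    field = input_field
    for phase in phase_masks:
        field = angular_spectrum_propagate(field, dx, wavelength, layer_spacing)
        field = field * np.exp(1j * phase)            # each "neuron" applies a trainable phase delay
    field = angular_spectrum_propagate(field, dx, wavelength, layer_spacing)
    return np.abs(field) ** 2                         # the detector plane records intensity

# Illustrative usage with hypothetical parameters (not taken from the paper).
wavelength = 632.8e-9                                 # He-Ne wavelength quoted in the abstract
dx = 2e-6                                             # assumed neuron pitch
n = 200                                               # neurons per side
phase_masks = [np.random.uniform(0, 2 * np.pi, (n, n)) for _ in range(5)]  # untrained masks
digit = np.zeros((n, n)); digit[80:120, 90:110] = 1.0  # toy input amplitude pattern
intensity = d2nn_forward(digit.astype(complex), phase_masks, dx, wavelength, 0.03)
```

In a trained classifier of this kind, the phase masks would be optimized (e.g., by gradient descent through a differentiable version of this forward model) so that the output intensity concentrates on one of ten detector regions, one per digit class; the sketch above only illustrates the propagation and modulation steps.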