Nuclear Engineering and Technology (Jul 2024)

A counting-time optimization method for artificial neural network (ANN) based gamma-ray spectroscopy

  • Moonhyung Cho,
  • Jisung Hwang,
  • Sangho Lee,
  • Kilyoung Ko,
  • Wonku Kim,
  • Gyuseong Cho

Journal volume & issue
Vol. 56, no. 7
pp. 2690–2697

Abstract

With advancements in machine learning technologies, artificial neural networks (ANNs) are being widely used to improve the performance of gamma-ray spectroscopy based on NaI(Tl) scintillation detectors. Typically, the performance of ANNs is evaluated using test datasets composed of actual spectra. However, generating test datasets that encompass a wide range of actual spectra representing various scenarios is often inefficient and time-consuming. Thus, instead of measuring actual spectra, we generated virtual spectra with diverse spectral features by sampling from categorical distribution functions derived from the base spectra of six radioactive isotopes: 54Mn, 57Co, 60Co, 134Cs, 137Cs, and 241Am. For practical applications, we determined the optimum counting time (OCT) as the point at which the change in the Kullback–Leibler divergence value (ΔKLDV) between the synthetic spectra used for training the ANN and the virtual spectra approaches zero. The accuracies for actual spectra improved significantly when the spectra were measured up to their respective OCTs. The results demonstrate that the proposed method can effectively determine OCTs for ANN-based gamma-ray spectroscopy without the need to measure actual spectra.
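
The abstract does not include an implementation, but the workflow it outlines (sampling virtual spectra from a categorical distribution derived from a base spectrum, then tracking how the Kullback–Leibler divergence against a reference spectrum changes as the counting time grows) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' code: the function names, the fixed count rate, the ΔKLD threshold, and the use of NumPy's multinomial sampler are all assumptions made for the example.

```python
import numpy as np

def virtual_spectrum(base_spectrum, n_counts, rng):
    # Normalize the base spectrum into a categorical (per-channel) distribution
    # and draw n_counts events from it to form a virtual spectrum.
    p = base_spectrum / base_spectrum.sum()
    return rng.multinomial(n_counts, p)

def kl_divergence(p_counts, q_counts, eps=1e-12):
    # KL divergence D(P || Q) between two spectra after normalization;
    # eps avoids log(0) in empty channels.
    p = p_counts.astype(float) + eps
    q = q_counts.astype(float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def find_oct(base_spectrum, reference_spectrum, count_rate_cps, times_s,
             delta_threshold=1e-4, rng=None):
    # Scan increasing counting times and return the first time at which the
    # change in KL divergence (delta KLD) between successive virtual spectra
    # and the reference (synthetic training) spectrum drops below the threshold.
    rng = rng or np.random.default_rng(0)
    prev_kld = None
    for t in times_s:
        spec = virtual_spectrum(base_spectrum, int(count_rate_cps * t), rng)
        kld = kl_divergence(spec, reference_spectrum)
        if prev_kld is not None and abs(kld - prev_kld) < delta_threshold:
            return t, kld
        prev_kld = kld
    return times_s[-1], prev_kld

# Hypothetical usage: 1024-channel base and reference spectra (placeholder
# Poisson data here), scanning counting times from 1 s to 600 s.
rng = np.random.default_rng(42)
base = rng.poisson(50.0, size=1024).astype(float)
reference = rng.poisson(50.0, size=1024).astype(float)
oct_s, kld = find_oct(base, reference, count_rate_cps=500.0,
                      times_s=range(1, 601), delta_threshold=1e-4, rng=rng)
print(f"estimated OCT: {oct_s} s (KLD = {kld:.4f})")
```

In practice the base spectra would come from measured reference sources of the six isotopes and the reference spectra from the synthetic training set, so the numbers above are placeholders only; the stopping rule (successive ΔKLD below a small threshold) is one plausible reading of "approaches zero".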

Keywords