Machine Learning: Science and Technology (Jan 2023)

Exploring machine learning to hardware implementations for large data rate x-ray instrumentation

  • Mohammad Mehdi Rahimifar
  • Quentin Wingering
  • Berthié Gouin-Ferland
  • Hamza Ezzaoui Rahali
  • Charles-Étienne Granger
  • Audrey C Therrien

DOI
https://doi.org/10.1088/2632-2153/ad0d12
Journal volume & issue
Vol. 4, no. 4
p. 045035

Abstract

Over the past decade, innovations in radiation and photonic detectors have considerably improved their resolution, pixel density, sensitivity, and sampling rate, all of which contribute to increased data generation rates. This surge in data increases the amount of storage required, as well as the cabling between the source and the storage units. To overcome this problem, edge machine learning (EdgeML) proposes moving computation units close to the detectors and using machine learning (ML) models to emulate the non-linear mathematical relationships within the detectors' output data. ML algorithms can be implemented in digital circuits, such as application-specific integrated circuits and field-programmable gate arrays, which support both parallelization and pipelining. EdgeML thus combines the benefits of edge computing and of ML models to compress data near the detectors. This paper explores the currently available tool-flows designed to translate software ML algorithms into digital circuits at the edge. The main focus is on tool-flows that provide a diverse range of supported models, optimization techniques, and compression methods. We evaluate their accessibility, performance, and ease of use, and compare them on two high data-rate instrumentation applications: (1) the CookieBox, and (2) a billion-pixel camera.
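As an illustration of the kind of tool-flow surveyed, the sketch below converts a small neural network into an FPGA firmware project with hls4ml, a Python-based translation tool of this type. This is a minimal sketch under stated assumptions: the toy model architecture, output directory, and FPGA part number are illustrative and are not taken from the paper.

```python
# Minimal sketch: translating a small Keras model to an FPGA project
# with hls4ml. Model, output_dir, and part are illustrative assumptions.
import hls4ml
from tensorflow import keras

# Toy compression model: map a 64-sample detector waveform to 4 features.
model = keras.Sequential([
    keras.layers.Dense(16, activation='relu', input_shape=(64,)),
    keras.layers.Dense(4, activation='linear'),
])

# Derive a fixed-point precision configuration from the Keras graph.
config = hls4ml.utils.config_from_keras_model(model, granularity='model')

# Translate the model into an HLS project for a hypothetical FPGA part.
hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir='hls4ml_prj',
    part='xcu250-figd2104-2L-e',
)

# Emulate the fixed-point design in software before committing to synthesis.
hls_model.compile()
```

Calling hls_model.build() afterwards would run HLS synthesis and report the latency and resource estimates on which comparisons of such tool-flows are typically based.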

Keywords