EPJ Web of Conferences (Jan 2024)

Transformers for Generalized Fast Shower Simulation

  • Piyush Raikwar
  • Renato Cardoso
  • Nadezda Chernyavskaya
  • Kristina Jaruskova
  • Witold Pokorski
  • Dalila Salamani
  • Mudhakar Srivatsa
  • Kalliopi Tsolaki
  • Sofia Vallecorsa
  • Anna Zaborowska

DOI: https://doi.org/10.1051/epjconf/202429509039
Journal volume & issue: Vol. 295, p. 09039

Abstract


Recently, transformer-based foundation models have proven to be a generalized architecture applicable to various data modalities, ranging from text to audio and even combinations of multiple modalities. By design, transformers should accurately model the non-trivial structure of particle showers thanks to their lack of strong inductive bias, their better modeling of long-range dependencies, and their interpolation and extrapolation capabilities. In this paper, we explore a transformer-based generative model for detector-agnostic fast shower simulation, where the goal is to generate synthetic particle showers, i.e., the energy depositions in the calorimeter. When trained on an adequate amount and variety of showers, these models should learn better representations than other deep learning models and therefore adapt quickly to new detectors. We present a prototype of such a transformer-based generative model for fast shower simulation and explore key aspects of the architecture, such as input data representation, sequence formation, and the learning mechanism for our unconventional shower data.
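To make the sequence-formation idea concrete, the sketch below shows one common way to frame voxelized calorimeter showers as a token sequence for a transformer: each longitudinal layer of the voxel grid becomes one token, and the incident particle energy is injected as a conditioning vector. This is not the authors' implementation; the module names, the cylindrical grid dimensions, and the use of PyTorch are all illustrative assumptions.

```python
# Minimal sketch (assumed PyTorch; all dimensions hypothetical) of treating
# a voxelized calorimeter shower as a sequence of per-layer tokens.
import torch
import torch.nn as nn

N_LAYERS, N_R, N_PHI = 45, 9, 16     # hypothetical cylindrical voxel grid
TOKEN_DIM = N_R * N_PHI              # one token = one longitudinal layer

class ShowerTransformer(nn.Module):
    def __init__(self, d_model=256, nhead=8, depth=4):
        super().__init__()
        self.embed = nn.Linear(TOKEN_DIM, d_model)            # layer -> token embedding
        self.pos = nn.Parameter(torch.zeros(N_LAYERS, d_model))  # learned layer positions
        self.cond = nn.Linear(1, d_model)                     # condition on incident energy
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, depth)
        self.head = nn.Linear(d_model, TOKEN_DIM)             # predict voxel energies per layer

    def forward(self, showers, e_incident):
        # showers: (batch, N_LAYERS, TOKEN_DIM); e_incident: (batch, 1)
        x = self.embed(showers) + self.pos + self.cond(e_incident).unsqueeze(1)
        return self.head(self.encoder(x))

model = ShowerTransformer()
fake_batch = torch.rand(2, N_LAYERS, TOKEN_DIM)               # placeholder shower data
out = model(fake_batch, torch.tensor([[10.0], [50.0]]))       # incident energies (e.g., GeV)
print(out.shape)                                              # torch.Size([2, 45, 144])
```

The per-layer tokenization above is only one of several plausible sequence formations (per-voxel or per-hit tokens are alternatives); the choice trades sequence length against token dimensionality, which is precisely the kind of design aspect the paper examines.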