Sensors (Jul 2021)

CNN-Based Classifier as an Offline Trigger for the CREDO Experiment

  • Marcin Piekarczyk,
  • Olaf Bar,
  • Łukasz Bibrzycki,
  • Michał Niedźwiecki,
  • Krzysztof Rzecki,
  • Sławomir Stuglik,
  • Thomas Andersen,
  • Nikolay M. Budnev,
  • David E. Alvarez-Castillo,
  • Kévin Almeida Cheminant,
  • Dariusz Góra,
  • Alok C. Gupta,
  • Bohdan Hnatyk,
  • Piotr Homola,
  • Robert Kamiński,
  • Marcin Kasztelan,
  • Marek Knap,
  • Péter Kovács,
  • Bartosz Łozowski,
  • Justyna Miszczyk,
  • Alona Mozgova,
  • Vahab Nazari,
  • Maciej Pawlik,
  • Matías Rosas,
  • Oleksandr Sushchov,
  • Katarzyna Smelcerz,
  • Karel Smolek,
  • Jarosław Stasielak,
  • Tadeusz Wibig,
  • Krzysztof W. Woźniak,
  • Jilberto Zamora-Saa

DOI
https://doi.org/10.3390/s21144804
Journal volume & issue
Vol. 21, no. 14
p. 4804

Abstract

Gamification is known to enhance users’ participation in education and research projects that follow the citizen science paradigm. The Cosmic Ray Extremely Distributed Observatory (CREDO) experiment is designed for the large-scale study of various forms of radiation that continuously reach the Earth from space, collectively known as cosmic rays. The CREDO Detector app relies on a network of involved users and now works worldwide on smartphones and other CMOS sensor-equipped devices. To broaden the user base and activate current users, CREDO makes extensive use of gamification solutions such as the periodic Particle Hunters Competition. However, an adverse effect of gamification is a substantial increase in the number of artefacts, i.e., signals unrelated to cosmic ray detection or openly related to cheating. To tag the artefacts appearing in the CREDO database, we propose a method based on machine learning. The approach involves training a Convolutional Neural Network (CNN) to recognise the morphological differences between signals and artefacts. As a result, we obtain a CNN-based trigger that mimics the signal vs. artefact assignments of human annotators as closely as possible. To enhance the method, the input image is adaptively thresholded and then transformed using Daubechies wavelets. In this exploratory study, we use wavelet transforms to amplify distinctive image features. As a result, we obtain a very good recognition ratio of almost 99% for both signals and artefacts. The proposed solution allows the manual supervision of the competition process to be eliminated.
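
The abstract describes a two-stage pipeline: adaptive thresholding followed by a Daubechies wavelet transform of the hit image, and a CNN that classifies the result as signal or artefact. The sketch below is not the authors' code; it assumes 60x60 grayscale hit images, a simple mean-plus-standard-deviation thresholding rule, the 'db2' wavelet, and a small binary CNN, all of which are illustrative choices that may differ from the paper's actual configuration.

```python
# Minimal sketch of a CREDO-style preprocessing + CNN trigger (assumptions noted above).
import numpy as np
import pywt                      # Daubechies wavelet transforms
import tensorflow as tf


def preprocess(image: np.ndarray, wavelet: str = "db2") -> np.ndarray:
    """Adaptively threshold a hit image, then apply a 2-D wavelet transform."""
    # Hypothetical adaptive threshold: keep only pixels brighter than mean + 1 std.
    thr = image.mean() + image.std()
    binarised = np.where(image > thr, image, 0.0)

    # Single-level 2-D DWT; stack the four coefficient sub-bands as channels.
    cA, (cH, cV, cD) = pywt.dwt2(binarised, wavelet)
    return np.stack([cA, cH, cV, cD], axis=-1)


def build_cnn(input_shape) -> tf.keras.Model:
    """Small CNN for binary signal-vs-artefact classification."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # P(signal)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    # Example usage with a synthetic 60x60 hit image.
    img = np.random.rand(60, 60).astype(np.float32)
    x = preprocess(img)                      # four wavelet sub-bands as channels
    model = build_cnn(x.shape)
    prob_signal = model.predict(x[None, ...], verbose=0)[0, 0]
    print(f"P(signal) = {prob_signal:.3f}")
```

In an offline-trigger setting, such a model would be trained on human-annotated hits and then applied to incoming database entries, flagging likely artefacts before they enter competition scoring.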

Keywords