EPJ Nuclear Sciences & Technologies (Jan 2021)

Nuclear data assimilation, scientific basis and current status

  • Ivanov Evgeny,
  • De Saint-Jean Cyrille,
  • Sobes Vladimir

DOI: https://doi.org/10.1051/epjn/2021008
Journal volume & issue: Vol. 7, p. 9

Abstract


The use of data assimilation methodologies, also known as data adjustment, links the results of theoretical and experimental studies, improving the accuracy of simulation models and giving confidence to designers and regulatory bodies. From a mathematical point of view, it produces an optimized fit to experimental data, inferring unknown causes from known consequences, which is crucial for data calibration and validation. Data assimilation adds value to the nuclear data evaluation process in three ways: it adjusts nuclear data to a particular application, producing a so-called optimized design-oriented library; it calibrates nuclear data using integral experiments, since theories and differential experiments alone provide only relative values; and it supplies an evidence-based foundation for the validation of nuclear data libraries, substantiating the uncertainty quantification (UQ) process. Similarly, it valorizes experimental data and the experiments themselves, bringing them into scientific circulation by extracting the essential information inherently contained in legacy and newly designed experiments, and by helping prioritize dedicated basic experimental programs. A number of popular algorithms, including deterministic ones such as the Generalized Linear Least Squares (GLLS) methodology and stochastic ones such as Backward, Total, and Hierarchic Monte Carlo, differ in their particular numerical formalisms but share a common Bayesian theoretical basis. They have demonstrated sufficient maturity, providing optimized design-oriented data libraries or evidence-based foundations for the science-driven validation of general-purpose libraries in a wide range of practical applications.
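As a minimal illustration of the deterministic (GLLS) update mentioned above, the sketch below applies the standard Bayesian adjustment formulas: posterior parameters are obtained by weighting the discrepancy between measured and calculated integral responses with the prior and experimental covariances. All numbers are toy placeholders, not evaluated nuclear data, and the variable names are assumptions for this sketch.

```python
import numpy as np

def glls_update(sigma, M, S, E, C, V):
    """Generalized Linear Least Squares (Bayesian) adjustment.

    sigma : prior parameter vector (e.g. multigroup cross sections)
    M     : prior parameter covariance matrix
    S     : sensitivity matrix dC/dsigma (experiments x parameters)
    E     : measured integral responses
    C     : responses calculated with the prior parameters
    V     : experimental covariance matrix
    """
    # Kalman-like gain: how strongly each parameter responds to the discrepancy
    K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)
    sigma_post = sigma + K @ (E - C)   # adjusted (posterior) parameters
    M_post = M - K @ S @ M             # reduced posterior covariance
    return sigma_post, M_post

# Toy example: two parameters, one integral experiment
sigma = np.array([1.0, 2.0])
M = np.diag([0.04, 0.09])      # prior variances
S = np.array([[0.5, 0.5]])     # sensitivity of the response to each parameter
C = S @ sigma                  # calculated response from the prior
E = np.array([1.6])            # measured response (slightly above C)
V = np.array([[0.01]])         # experimental variance

post, M_post = glls_update(sigma, M, S, E, C, V)
```

The posterior parameters move toward the measurement, and the posterior variances shrink relative to the prior ones, which is the mechanism by which integral experiments reduce nuclear data uncertainties in an adjusted, design-oriented library.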