Axioms (Jul 2024)

Advancing the Analysis of Extended Negative Dependence Random Variables: A New Concentration Inequality and Its Applications for Linear Models

  • Zouaoui Chikr Elmezouar,
  • Abderrahmane Belguerna,
  • Hamza Daoudi,
  • Fatimah Alshahrani,
  • Zoubeyr Kaddour

DOI
https://doi.org/10.3390/axioms13080511
Journal volume & issue
Vol. 13, no. 8
p. 511

Abstract

This paper introduces a new concentration inequality for Extended Negative Dependence (END) random variables, providing new insights into their almost complete convergence. We apply this inequality to sequences of END variables, focusing in particular on the first-order autoregressive (AR(1)) model. This application highlights the dynamics and convergence properties of END variables and expands the analytical tools available for their study. Our findings contribute to both the theoretical understanding and the practical application of END variables in fields such as finance and machine learning, where modeling dependencies between variables is crucial.
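
For orientation, the sketch below records in LaTeX the definitions of extended negative dependence, almost complete convergence, and the AR(1) recursion as they are commonly stated in the literature. These are generic forms included only as background; the paper's specific concentration inequality, constants, and assumptions are not reproduced here, and the symbols M, θ, and ε_t are placeholder notation rather than the authors' own.

```latex
% Extended negative dependence (END), as commonly defined in the literature:
% {X_i} is END if there exists a constant M > 0 such that, for every n and all
% real numbers x_1, ..., x_n,
\[
  P\bigl(X_1 > x_1,\dots,X_n > x_n\bigr) \le M \prod_{i=1}^{n} P(X_i > x_i),
  \qquad
  P\bigl(X_1 \le x_1,\dots,X_n \le x_n\bigr) \le M \prod_{i=1}^{n} P(X_i \le x_i).
\]
% Almost complete convergence of X_n to X: for every epsilon > 0,
\[
  \sum_{n \ge 1} P\bigl(\lvert X_n - X \rvert > \varepsilon\bigr) < \infty .
\]
% The AR(1) model referred to in the application, with a generic coefficient theta
% (|theta| < 1 is the usual stationarity condition; the paper's exact assumptions
% may differ) and innovations (epsilon_t) forming an END sequence rather than an
% i.i.d. one:
\[
  X_t = \theta\, X_{t-1} + \varepsilon_t, \qquad \lvert\theta\rvert < 1 .
\]
```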

Keywords