Axioms (Jul 2024)
Advancing the Analysis of Extended Negatively Dependent Random Variables: A New Concentration Inequality and Its Applications to Linear Models
Abstract
This paper establishes a new concentration inequality for extended negatively dependent (END) random variables and uses it to derive almost complete convergence results. We apply the inequality to sequences of END variables, focusing in particular on the first-order autoregressive (AR(1)) model; this application illustrates the convergence behavior of END variables and broadens the analytical tools available for their study. Our results contribute both to the theory of END variables and to their practical use in fields such as finance and machine learning, where modeling dependence between variables is essential.
Keywords