Entropy (Dec 2022)
Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis
Abstract
The nature of dependence between random variables has been the subject of statistical investigation for over a century, and research on this topic remains active today, with particular focus on the analysis of nonlinearity. Shannon mutual information has been considered the most comprehensive measure for evaluating total dependence, and several methods have been suggested for discerning the linear and nonlinear components of dependence between two variables. In this study, we propose employing the Rényi and Tsallis mutual information measures for measuring total dependence because of their parametric nature. We first use a residual analysis to remove the linear dependence between the variables, and then we compare the Rényi and Tsallis mutual information measures of the original data with those of the data lacking the linear component to determine the degree of nonlinearity. A comparison against the values of the Shannon mutual information measure is also provided. Finally, we apply our method to the environmental Kuznets curve (EKC) and demonstrate the validity of the EKC hypothesis for Eastern Asian and Asia-Pacific countries.
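The procedure outlined in the abstract can be sketched as follows. This is an illustrative approximation only, not the paper's estimator: it uses histogram-based plug-in estimates with the divergence-form definitions of Rényi and Tsallis mutual information (both reduce to Shannon MI as q → 1), and ordinary least squares to strip the linear component; the function names, bin count, and order q = 2 are assumptions for the sketch.

```python
import numpy as np

def mutual_information(x, y, q=2.0, kind="renyi", bins=10):
    """Histogram (plug-in) estimate of mutual information of order q.

    Uses the divergence between the joint distribution and the product
    of the marginals; kind may be "renyi", "tsallis", or "shannon".
    Illustrative only -- the paper's exact estimator may differ.
    """
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                       # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)         # marginal of x
    py = pxy.sum(axis=0, keepdims=True)         # marginal of y
    prod = px @ py                              # product of marginals
    mask = pxy > 0                              # avoid log/0**q issues
    if kind == "shannon":
        return float(np.sum(pxy[mask] * np.log(pxy[mask] / prod[mask])))
    s = np.sum(pxy[mask] ** q * prod[mask] ** (1.0 - q))
    if kind == "renyi":
        return float(np.log(s) / (q - 1.0))
    return float((s - 1.0) / (q - 1.0))         # Tsallis

def nonlinearity_comparison(x, y, q=2.0, kind="renyi", bins=10):
    """Compare total MI(x, y) with MI(x, e), where e are the OLS
    residuals of y on x, i.e. the data lacking the linear component."""
    b1, b0 = np.polyfit(x, y, 1)                # linear fit y ~ b0 + b1*x
    resid = y - (b0 + b1 * x)                   # remove linear dependence
    total = mutual_information(x, y, q, kind, bins)
    nonlinear = mutual_information(x, resid, q, kind, bins)
    return total, nonlinear
```

A large drop from `total` to `nonlinear` indicates that the dependence was mostly linear; a residual MI close to the total suggests substantial nonlinearity.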
Keywords