Malaysian Journal of Computing (Oct 2023)

VULGARIZED NEIGHBOURING NETWORK OF MULTIVARIATE AUTOREGRESSIVE PROCESSES WITH GAUSSIAN AND STUDENT-T DISTRIBUTED RANDOM NOISES

  • Rasaki Olawale Olanrewaju,
  • Ravi Prakash Ranjan,
  • Queensley C. Chukwudum,
  • Sodiq Adejare Olanrewaju

DOI
https://doi.org/10.24191/mjoc.v8i2.23103
Journal volume & issue
Vol. 8, no. 2
pp. 1574 – 1588

Abstract


This paper introduces the vulgarized network autoregressive process with Gaussian and Student-t random noises. The process relates the time-varying series of a given variable to its own immediate past together with those of its neighbouring variables and the networking structure. The generalized network autoregressive process is fully spelled out to contain the aforementioned random noises with their embedded parameters (the autoregressive coefficients, networking nodes, and neighbouring nodes) and applied to monthly prices of ten (10) edible cereals. The global-α Generalized Network Autoregressive model of autoregressive order two with neighbourhood stages two and zero, that is GNAR(2, [2, 0]), was the ideal generalization for the cereal prices under both Gaussian and Student-t random noises: a model with two autoregressive parameters and network regression parameters on the first two neighbour sets at time lag one. The GNAR model with Student-t random noise produced the smaller BIC of -39.2298, against -18.1683 for the Gaussian GNAR. The residual error was 0.9900 via the Gaussian, against 0.9000 via the Student-t. Additionally, the GNAR forecasting MSE via the Student-t was 15.105% less than that of the Gaussian; similarly, the Student-t GNAR MSE for VAR was 1.59% less than the Gaussian GNAR MSE for VAR. Comparing the histograms of the fitted residuals, both processes produced symmetric residual estimates, but the residuals via the Student-t were more evenly symmetric than those of the Gaussian. As a contribution to the network autoregressive process, the GNAR process with Student-t random noise generalization should always be favoured over Gaussian random noise because of its ability to absorb contamination and spread, and to accommodate time-varying network measurements.
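To illustrate the kind of process the abstract describes, the sketch below simulates a first-order network autoregression, where each node depends on its own past value and the average of its neighbours' past values, once with Gaussian innovations and once with heavier-tailed Student-t innovations. The five-node ring network, the coefficients, and the degrees of freedom are illustrative assumptions, not the paper's fitted cereal-price model; the heavier tails of the Student-t noise show up as larger excess kurtosis in the simulated series.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 5-node ring network (illustrative only, not the paper's cereal network).
N = 5
A = np.zeros((N, N))
for i in range(N):
    A[i, (i - 1) % N] = A[i, (i + 1) % N] = 1.0
W = A / A.sum(axis=1, keepdims=True)  # row-normalised: averages over neighbours

def simulate_nar(T=4000, alpha=0.4, beta=0.3, noise="gaussian", df=5):
    """Simulate x_t = alpha * x_{t-1} + beta * W x_{t-1} + e_t on the network."""
    x = np.zeros((T, N))
    for t in range(1, T):
        if noise == "gaussian":
            e = rng.standard_normal(N)
        else:
            # Student-t innovations, rescaled to unit variance (requires df > 2)
            e = rng.standard_t(df, N) * np.sqrt((df - 2) / df)
        x[t] = alpha * x[t - 1] + beta * (W @ x[t - 1]) + e
    return x

def excess_kurtosis(x):
    """Sample excess kurtosis; ~0 for Gaussian, positive for heavy tails."""
    z = (x - x.mean()) / x.std()
    return float((z ** 4).mean() - 3.0)

x_gauss = simulate_nar(noise="gaussian")
x_student = simulate_nar(noise="student")
print("Gaussian excess kurtosis:", round(excess_kurtosis(x_gauss), 3))
print("Student-t excess kurtosis:", round(excess_kurtosis(x_student), 3))
```

The Student-t run exhibits visibly heavier tails, which is consistent with the abstract's argument that a Student-t noise generalization better absorbs contamination and spread than the Gaussian.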

Keywords