Special Matrices (Jun 2024)
Disjoint sections of positive semidefinite matrices and their applications in linear statistical models
Abstract
Given matrices $A$ and $B$ of the same order, $A$ is called a section of $B$ if $\mathscr{R}(A)\cap\mathscr{R}(B-A)=\{0\}$ and $\mathscr{R}(A^{T})\cap\mathscr{R}((B-A)^{T})=\{0\}$, where $\mathscr{R}(\cdot)$ denotes the range (column space) of a matrix argument and the superscript $T$ stands for the transpose of a matrix. A matrix $G$ is called positive semidefinite (p.s.d.) if $x^{T}Gx$ is nonnegative for every real vector $x$. However, since the applications discussed in this article concern only symmetric p.s.d. matrices, we refer to a symmetric p.s.d. matrix simply as a p.s.d. matrix. An $n\times n$ p.s.d. matrix $G$ admits a symmetric section $G_{X}$ with respect to an $n\times k$ matrix $X$ such that $\mathscr{R}(G_{X})=\mathscr{R}(G)\cap\mathscr{R}(X)$. In this article, sections of the type $G_{X}$ are used in the minimization of quadratic functions under linear constraints and in the splitting of vector random variables into uncorrelated vector random variables. In the general Gauss-Markoff model $y=X\beta+\varepsilon$, with design matrix $X$ and a singular covariance matrix $\sigma^{2}G$ of $\varepsilon$, $y$ is decomposed into four uncorrelated vector random variables as $y=M_{1}y+M_{2}y+M_{3}y+M_{4}y$, where $M_{i}$, $i=1,2,3$, consist of sections of $G$ and $XX^{T}$, and $M_{4}$ is a matrix whose row space is the null space of $G+XX^{T}$.
Keywords