Restoring definiteness via shrinking, with an application to correlation matrices with a fixed block

Research output: Contribution to journal › Article


Indefinite approximations of positive semidefinite matrices arise in various data analysis applications involving covariance matrices and correlation matrices. We propose a method for restoring positive semidefiniteness of an indefinite matrix $M_0$ that constructs a convex linear combination $S(\alpha) = \alpha M_1 + (1-\alpha)M_0$ of $M_0$ and a positive semidefinite target matrix $M_1$. In statistics, this construction for improving an estimate $M_0$ by combining it with new information in $M_1$ is known as shrinking. We make no statistical assumptions about $M_0$ and define the optimal shrinking parameter as $\alpha_* = \min \{\alpha \in [0,1] : S(\alpha) \text{ is positive semidefinite}\}$. We describe three algorithms for computing $\alpha_*$. One algorithm is based on the bisection method, with the use of Cholesky factorization to test definiteness; a second employs Newton's method; and a third finds the smallest eigenvalue of a symmetric definite generalized eigenvalue problem. We show that weights that reflect confidence in the individual entries of $M_0$ can be used to construct a natural choice of the target matrix $M_1$. We treat in detail a problem variant in which a positive semidefinite leading principal submatrix of $M_0$ remains fixed, showing how the fixed block can be exploited to reduce the cost of the bisection and generalized eigenvalue methods. Numerical experiments show that, when applied to indefinite approximations of correlation matrices, shrinking can be at least an order of magnitude faster than computing the nearest correlation matrix.
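The first of the three algorithms, bisection with a Cholesky factorization as the definiteness test, can be sketched as follows. This is a minimal illustration of the idea, not the authors' implementation; the function names and the convergence tolerance are choices made for this sketch, and a practical code would use the fixed-block structure and stopping criteria discussed in the paper.

```python
import numpy as np

def is_positive_definite(A):
    """Test definiteness via Cholesky: the factorization succeeds
    if and only if A is (numerically) positive definite."""
    try:
        np.linalg.cholesky(A)
        return True
    except np.linalg.LinAlgError:
        return False

def shrinking_bisection(M0, M1, tol=1e-12):
    """Approximate alpha_* = min{alpha in [0,1] : S(alpha) PSD},
    where S(alpha) = alpha*M1 + (1-alpha)*M0, by bisection.

    Assumes M0 is symmetric and M1 is a positive definite target
    (e.g. the identity), so that S(1) is positive definite."""
    if is_positive_definite(M0):
        return 0.0  # M0 already definite; no shrinking needed
    lo, hi = 0.0, 1.0  # S(lo) indefinite, S(hi) positive definite
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if is_positive_definite(mid * M1 + (1.0 - mid) * M0):
            hi = mid  # definite: alpha_* lies at or below mid
        else:
            lo = mid  # indefinite: alpha_* lies above mid
    return hi  # smallest tested alpha giving a definite S(alpha)
```

For example, with the indefinite "correlation-like" matrix $M_0 = \begin{pmatrix} 1 & 1.1 \\ 1.1 & 1 \end{pmatrix}$ (eigenvalues $2.1$ and $-0.1$) and target $M_1 = I$, the eigenvalues of $S(\alpha)$ are $\alpha + (1-\alpha)\lambda$, so $\alpha_* = 0.1/1.1 = 1/11$, which the bisection recovers to the requested tolerance.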

Bibliographical metadata

Original language: English
Pages (from-to): 245-263
Number of pages: 19
Journal: SIAM Review
State: Published - 2016