Event Date:
Event Location:
- HSSB 1173
Speaker: Alex Kercheval
Principal components are used throughout the physical and social sciences to reduce the dimension of complex problems to manageable levels and to distinguish signal from noise. Our research identifies and mitigates bias in the leading eigenvector of a sample factor-based covariance matrix estimated in the high-dimension, low sample size (HL) regime. The analysis also illuminates how estimation error in a covariance matrix can affect quadratic optimization. Eigenvector estimation in the HL regime may be useful in disciplines, such as finance, machine learning, and genomics, in which high-dimensional variables must be analyzed from a limited number of observations.
We describe theoretical guarantees showing that a family of customizable James-Stein-style shrinkage operators applied to the sample leading eigendirection reduces estimation error almost surely in the limit as the dimension tends to infinity, without the need for distributional assumptions beyond bounded fourth moments. Simulation experiments show that the asymptotic behavior is already present at realistic values of the dimension.
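To make the idea concrete, here is a minimal simulation sketch of this kind of shrinkage, not the talk's own estimator: it generates one-factor data in the HL regime (many variables, few observations), computes the leading sample eigenvector, and applies an illustrative James-Stein-style pull toward the constant (all-ones) direction with a fixed shrinkage factor `c`; the data-driven choice of operator analyzed in the research is not reproduced here.

```python
import numpy as np

def js_shrink(h, c=0.5):
    """James-Stein-style shrinkage of a unit vector toward the constant
    (all-ones) direction. The factor c is illustrative; the operators in
    the talk choose it from the data."""
    m = h.mean()
    h_js = m * np.ones_like(h) + c * (h - m)   # pull toward m * 1
    return h_js / np.linalg.norm(h_js)         # renormalize to unit length

rng = np.random.default_rng(0)
p, n = 500, 20                                 # HL regime: p >> n
b = np.ones(p) / np.sqrt(p)                    # true leading direction (constant, for illustration)
# One-factor model: strong common factor plus idiosyncratic noise.
X = 4.0 * np.outer(b, rng.standard_normal(n)) + rng.standard_normal((p, n))
S = (X @ X.T) / n                              # sample covariance (mean-zero model)
h = np.linalg.eigh(S)[1][:, -1]                # leading sample eigenvector
h = h if h @ b >= 0 else -h                    # fix the sign ambiguity
h_js = js_shrink(h)
# Squared-sine error of each estimate against the true direction.
raw_err, js_err = 1 - (h @ b) ** 2, 1 - (h_js @ b) ** 2
```

In this toy setup the true direction coincides with the shrinkage target, so the shrunk estimate aligns better with `b` than the raw sample eigenvector; the guarantees described in the talk are far more general.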
This is joint work with Lisa Goldberg and Hubeyb Gurdogan, and closely connected to work of Alex Shkolnik.