A Flexible Framework for Regularized Low-rank Matrix Estimation


Speaker


Abstract

Low-rank matrix estimation plays a key role in many scientific and engineering tasks, including collaborative filtering and image denoising. Low-rank procedures are often motivated by a statistical model in which we observe a noisy matrix drawn from some distribution whose expectation is assumed to have a low-rank representation. The statistical goal is to recover the signal from the noisy data. Classical approaches are centered around singular-value decomposition algorithms. Although the truncated singular value decomposition has been extensively used and studied, the resulting estimator is noisy and its performance can be improved by regularization. Methods based on singular-value shrinkage have achieved considerable empirical success and also have provable optimality properties in the Gaussian noise model (Gavish & Donoho, 2014). In this presentation, we propose a new framework for regularized low-rank estimation that does not start from the singular-value shrinkage point of view. Our approach is motivated by a simple parametric bootstrap idea. In the simplest case of isotropic Gaussian noise, we end up with a new singular-value shrinkage estimator, whereas for non-isotropic noise models, our procedure yields new estimators that perform well in experiments.
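To make the contrast between truncated SVD and singular-value shrinkage concrete, here is a minimal Python sketch under an isotropic Gaussian noise model. It is not the speaker's proposed estimator; the rank, noise level, and the shrinkage threshold are illustrative assumptions, not values from the talk.

```python
import numpy as np

# Minimal sketch (not the speaker's method): compare plain truncated SVD
# with a simple singular-value soft-shrinkage estimator under additive
# isotropic Gaussian noise. The rank r, noise level sigma, and threshold
# tau are illustrative choices only.

rng = np.random.default_rng(0)
n, p, r, sigma = 100, 80, 3, 0.5

# Low-rank signal plus isotropic Gaussian noise.
signal = rng.standard_normal((n, r)) @ rng.standard_normal((r, p))
Y = signal + sigma * rng.standard_normal((n, p))

U, s, Vt = np.linalg.svd(Y, full_matrices=False)

# Truncated SVD: keep the r leading singular values unchanged.
tsvd = (U[:, :r] * s[:r]) @ Vt[:r, :]

# Singular-value soft shrinkage: shrink every singular value toward zero.
tau = sigma * (np.sqrt(n) + np.sqrt(p))  # rough noise-level threshold, assumed
shrunk = (U * np.maximum(s - tau, 0.0)) @ Vt

for name, est in [("truncated SVD", tsvd), ("soft shrinkage", shrunk)]:
    err = np.linalg.norm(est - signal) / np.linalg.norm(signal)
    print(f"{name}: relative error {err:.3f}")
```

In this toy setting the shrinkage estimator typically attains lower error than the unregularized truncated SVD, which is the empirical phenomenon the abstract refers to.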

This event is organised by the Econometric Institute.
Twitter: @MetricsSeminars