CaltechTHESIS
  A Caltech Library Service

Low-Rank Matrix Recovery: Manifold Geometry and Global Convergence

Citation

Zhang, Ziyun (2023) Low-Rank Matrix Recovery: Manifold Geometry and Global Convergence. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/hd6q-g460. https://resolver.caltech.edu/CaltechTHESIS:05302023-222447373

Abstract

Low-rank matrix recovery problems are prevalent in modern data science, machine learning, and artificial intelligence, and the low-rank property of matrices is widely exploited to extract the hidden low-complexity structure in massive datasets. Compared with Burer-Monteiro factorization in Euclidean space, working directly on the low-rank matrix manifold has unique advantages: it eliminates duplicated spurious points and reduces the polynomial order of the objective function. Yet a few fundamental questions remained unanswered until recently. We highlight two problems in particular: the global geometry of the manifold and the global convergence guarantee.

As for the global geometry, we point out that there exist spurious critical points on the boundary of the low-rank matrix manifold Mᵣ: they have rank smaller than r, yet they can serve as limit points of iterative sequences within Mᵣ. For the least squares loss function, the spurious critical points are rank-deficient matrices that capture part of the eigenspaces of the ground truth. Unlike classical strict saddle points, their Riemannian gradient is singular and their Riemannian Hessian is unbounded.
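
For concreteness, the following is a small numerical sketch of such a spurious critical point for the least squares loss f(X) = ½‖X − M‖²_F; the dimensions, the random ground truth, and the use of NumPy are illustrative choices, not taken from the thesis. Taking the best rank-(r−1) approximation of a rank-r ground truth M, the Euclidean gradient at that rank-deficient point is nonzero, but its projection onto the tangent space of the rank-(r−1) manifold vanishes, which is what allows it to act as a limit point of sequences in Mᵣ.

    import numpy as np

    rng = np.random.default_rng(0)
    n, r = 50, 5

    # Rank-r ground truth M and its best rank-(r-1) approximation X_s.
    U, _ = np.linalg.qr(rng.standard_normal((n, r)))
    V, _ = np.linalg.qr(rng.standard_normal((n, r)))
    sigma = np.array([5.0, 4.0, 3.0, 2.0, 1.0])
    M = U @ np.diag(sigma) @ V.T
    X_s = U[:, :r-1] @ np.diag(sigma[:r-1]) @ V[:, :r-1].T   # rank r-1: on the boundary of M_r

    # Euclidean gradient of f(X) = 0.5*||X - M||_F^2 at X_s (equals -sigma_r * u_r v_r^T).
    G = X_s - M

    # Orthogonal projection onto the tangent space of the rank-(r-1) manifold at X_s:
    # P_T(G) = Uk Uk^T G + G Vk Vk^T - Uk Uk^T G Vk Vk^T.
    Uk, Vk = U[:, :r-1], V[:, :r-1]
    PG = Uk @ (Uk.T @ G) + (G @ Vk) @ Vk.T - Uk @ (Uk.T @ G @ Vk) @ Vk.T

    print("||G||_F     =", np.linalg.norm(G))   # = sigma_r > 0: not a critical point in the ambient space
    print("||P_T G||_F =", np.linalg.norm(PG))  # ~ 0: critical for the problem restricted to rank r-1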

We show that randomly initialized Riemannian gradient descent almost surely escapes some of the spurious critical points. To prove this result, we first establish the asymptotic escape of classical strict saddle sets consisting of non-isolated strict critical submanifolds on Riemannian manifolds. We then use a dynamical low-rank approximation to parameterize the manifold Mᵣ and map the spurious critical points to strict critical submanifolds in the classical sense in the parameterized domain, which leads to the desired result. Our result is the first to partially overcome the nonclosedness of the low-rank matrix manifold without altering the vanilla gradient descent algorithm. Numerical experiments are provided to support our theoretical findings.
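
For reference, here is a minimal sketch of the vanilla Riemannian gradient descent iteration discussed above, written for the least squares loss f(X) = ½‖X − M‖²_F with a tangent-space projected gradient step and a truncated-SVD retraction; the step size, initialization scale, and iteration budget are illustrative assumptions rather than the thesis's prescriptions.

    import numpy as np

    def rgd_fixed_rank(M, r, steps=500, eta=0.5, seed=0):
        """Sketch of vanilla Riemannian gradient descent on the rank-r matrix manifold
        for f(X) = 0.5*||X - M||_F^2: project the Euclidean gradient onto the tangent
        space at X, take a step, and retract by truncated SVD."""
        rng = np.random.default_rng(seed)
        n, m = M.shape
        X = rng.standard_normal((n, r)) @ rng.standard_normal((r, m)) / np.sqrt(n * m)  # random rank-r init
        for _ in range(steps):
            U, _s, Vt = np.linalg.svd(X, full_matrices=False)
            U, Vt = U[:, :r], Vt[:r, :]
            G = X - M                                                         # Euclidean gradient
            PG = U @ (U.T @ G) + (G @ Vt.T) @ Vt - U @ (U.T @ G @ Vt.T) @ Vt  # tangent-space projection
            Y = X - eta * PG                                                  # gradient step
            U2, s2, Vt2 = np.linalg.svd(Y, full_matrices=False)
            X = U2[:, :r] @ np.diag(s2[:r]) @ Vt2[:r, :]                      # retraction: truncated SVD
        return X

    # Example usage: X_hat = rgd_fixed_rank(M, r=5) for a rank-5 ground truth M.

The truncated SVD plays the role of the retraction back onto Mᵣ; near a rank-deficient spurious point the r-th singular value of the iterate is tiny, which is precisely the degeneracy that the escape analysis has to handle.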

As for the global convergence guarantee, we point out that earlier approaches to many of the low-rank recovery problems only imply a local geometric convergence rate toward a second-order stationary point. This is in contrast to the numerical evidence, which suggests a nearly linear convergence rate starting from a global random initialization. To establish the nearly linear convergence guarantee, we propose a unified framework for a class of low-rank matrix recovery problems including matrix sensing, matrix completion, and phase retrieval. All of them can be viewed as random sensing problems of low-rank matrices with a linear measurement operator drawn from some random ensemble. These problems share similar population loss functions, each of which is either the least squares loss or a variant of it.
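
To make the unified viewpoint concrete, the sketch below writes all three problems as linear measurements y_i = ⟨A_i, X⟩ of a low-rank matrix X: i.i.d. Gaussian A_i for matrix sensing, single-entry indicators A_i = e_j e_kᵀ for matrix completion, and lifted rank-one matrices A_i = a_i a_iᵀ for phase retrieval, since |⟨a_i, x⟩|² = ⟨a_i a_iᵀ, xxᵀ⟩. The ensembles, sizes, and helper names here are illustrative assumptions, not the thesis's notation.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 30

    def sensing_matrices(kind, m):
        """Measurement matrices A_i so that y_i = <A_i, X> (Frobenius inner product)."""
        if kind == "matrix_sensing":        # i.i.d. Gaussian ensemble
            return [rng.standard_normal((n, n)) for _ in range(m)]
        if kind == "matrix_completion":     # uniformly sampled single entries
            mats = []
            for j, k in rng.integers(0, n, size=(m, 2)):
                A = np.zeros((n, n))
                A[j, k] = 1.0
                mats.append(A)
            return mats
        if kind == "phase_retrieval":       # lifted rank-one Gaussian measurements a_i a_i^T
            return [np.outer(a, a) for a in rng.standard_normal((m, n))]
        raise ValueError(kind)

    def measure(As, X):
        return np.array([np.tensordot(A, X, axes=2) for A in As])  # y_i = <A_i, X>

    # All three models act on the same low-rank matrix through the same linear form.
    L = rng.standard_normal((n, 2))
    X_true = L @ L.T                        # rank-2 PSD matrix, used only to illustrate the linear form
    for kind in ("matrix_sensing", "matrix_completion", "phase_retrieval"):
        y = measure(sensing_matrices(kind, m=5), X_true)
        print(kind, y[:3])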

We show that, under some assumptions, for the population loss function, Riemannian gradient descent starting from a random initialization converges with high probability to the ground truth at a nearly linear convergence rate, i.e., it takes O(log(1/ϵ) + log n) iterations to reach an ϵ-accurate solution. Establishing this nearly optimal convergence guarantee is closely intertwined with the analysis of the spurious critical points S_# on Mᵣ. Outside the local neighborhoods of the spurious critical points, we use the Łojasiewicz inequality as the fundamental convergence tool to derive a linear convergence rate. Inside the spurious regions, i.e., the neighborhoods of the spurious critical points, the Riemannian gradient becomes degenerate and the Łojasiewicz inequality can fail. By tracking the dynamics of the trajectory in three stages, we show that, with high probability, Riemannian gradient descent escapes the spurious regions in a small number of steps.
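
Outside the spurious regions, the one-step argument behind the linear rate can be sketched as follows; this is a schematic version assuming a sufficient-decrease estimate and a Riemannian Łojasiewicz (Polyak-Łojasiewicz type) inequality with exponent 1/2 along the trajectory, with constants c_1, c_2 that are purely illustrative.

    f(X_{k+1}) \le f(X_k) - c_1 \,\|\operatorname{grad} f(X_k)\|^2
    \quad\text{and}\quad
    \|\operatorname{grad} f(X_k)\|^2 \ge c_2 \,\bigl(f(X_k) - f^\ast\bigr)
    \;\Longrightarrow\;
    f(X_{k+1}) - f^\ast \le (1 - c_1 c_2)\,\bigl(f(X_k) - f^\ast\bigr),

so O(log(1/ϵ)) iterations suffice for an ϵ-accurate value once the iterate stays clear of the spurious regions; the additional O(log n) budget accounts for the steps needed to escape those regions after a random initialization.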

After addressing the two problems of global geometry and global convergence guarantee, we use two applications to demonstrate the broad applicability of our analytical tools. The first is the robust principal component analysis problem on the manifold Mᵣ, solved with the Riemannian subgradient method. The second is the convergence rate analysis of the Sobolev gradient descent method for the nonlinear Gross-Pitaevskii eigenvalue problem on the infinite-dimensional sphere manifold. These two examples show that the analysis of first-order manifold algorithms can be extended beyond the previous framework, to nonsmooth functions and subgradient methods, and to infinite-dimensional Hilbert manifolds. This exemplifies how the insights gained and the tools developed for the low-rank matrix manifold Mᵣ can be extended to broader scientific and technological fields.
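
For the first application, the following is a schematic single step of a Riemannian subgradient method for an ℓ1-type robust loss on Mᵣ; the specific loss f(X) = ‖X − M_obs‖₁, the fixed step size, and the truncated-SVD retraction are illustrative assumptions and not necessarily the formulation used in the thesis.

    import numpy as np

    def riemannian_subgradient_step(X, M_obs, r, eta):
        """One step of a Riemannian subgradient method on the rank-r manifold for the
        nonsmooth entrywise loss f(X) = ||X - M_obs||_1: take a Euclidean subgradient
        sign(X - M_obs), project it onto the tangent space at X, step, and retract."""
        U, _s, Vt = np.linalg.svd(X, full_matrices=False)
        U, Vt = U[:, :r], Vt[:r, :]
        G = np.sign(X - M_obs)                                            # Euclidean subgradient
        PG = U @ (U.T @ G) + (G @ Vt.T) @ Vt - U @ (U.T @ G @ Vt.T) @ Vt  # tangent-space projection
        Y = X - eta * PG
        U2, s2, Vt2 = np.linalg.svd(Y, full_matrices=False)
        return U2[:, :r] @ np.diag(s2[:r]) @ Vt2[:r, :]                   # truncated-SVD retraction

    # Example usage: X = riemannian_subgradient_step(X, M_obs, r=5, eta=0.01) inside an iteration loop.

In practice, subgradient methods of this kind are typically run with diminishing or geometrically decaying step sizes, which is one reason their convergence analysis differs from the smooth Riemannian gradient descent case.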

Item Type: Thesis (Dissertation (Ph.D.))
Subject Keywords: Low-rank recovery, low-rank matrix manifold, Riemannian gradient descent, strict saddles, global convergence rate, matrix sensing, matrix completion, phase retrieval, Robust PCA, Gross-Pitaevskii eigenproblem, Lojasiewicz inequality
Degree Grantor: California Institute of Technology
Division: Engineering and Applied Science
Major Option: Applied and Computational Mathematics
Thesis Availability: Public (worldwide access)
Research Advisor(s):
  • Hou, Thomas Y.
Thesis Committee:
  • Stuart, Andrew M. (chair)
  • Owhadi, Houman
  • Tropp, Joel A.
  • Hou, Thomas Y.
Defense Date: 8 May 2023
Non-Caltech Author Email: zyzhang0907 (AT) gmail.com
Record Number: CaltechTHESIS:05302023-222447373
Persistent URL: https://resolver.caltech.edu/CaltechTHESIS:05302023-222447373
DOI: 10.7907/hd6q-g460
Related URLs:
  • https://doi.org/10.1137/19M129437X (DOI): Article adapted for Chapter 2
  • https://doi.org/10.4310/CMS.2022.v20.n2.a4 (DOI): Article adapted for Chapter 7
ORCID:
  • Zhang, Ziyun: 0000-0002-5794-2387
Default Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 15236
Collection: CaltechTHESIS
Deposited By: Ziyun Zhang
Deposited On: 01 Jun 2023 16:46
Last Modified: 08 Jun 2023 17:18

Thesis Files

PDF (Final Version), 3MB. See Usage Policy.
