
Convex Relaxation for Low-Dimensional Representation: Phase Transitions and Limitations


Oymak, Samet (2015) Convex Relaxation for Low-Dimensional Representation: Phase Transitions and Limitations. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/Z9S46PWX.


There is a growing interest in taking advantage of possible patterns and structures in data so as to extract the desired information and overcome the curse of dimensionality. In a wide range of applications, including computer vision, machine learning, medical imaging, and social networks, the signal that gives rise to the observations can be modeled as approximately sparse, and exploiting this fact can be very beneficial. This has led to an immense interest in the problem of efficiently reconstructing a sparse signal from limited linear observations. More recently, low-rank approximation techniques have become prominent tools for approaching problems arising in machine learning, system identification, and quantum tomography.

In sparse and low-rank estimation problems, the challenge is the inherent intractability of the objective function, and one needs efficient methods to capture the low-dimensionality of these models. Convex optimization is often a promising tool to attack such problems. An intractable problem with a combinatorial objective can often be "relaxed" to obtain a tractable but almost as powerful convex optimization problem. This dissertation studies convex optimization techniques that can take advantage of low-dimensional representations of the underlying high-dimensional data. We provide provable guarantees that ensure that the proposed algorithms will succeed under reasonable conditions, and answer questions of the following flavor:

  • For a given number of measurements, can we reliably estimate the true signal?
  • If so, how good is the reconstruction as a function of the model parameters?
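As one concrete instance of the relaxation idea above (a minimal pure-Python sketch, not drawn from the dissertation; the dimensions, step size, and penalty weight are arbitrary toy values): finding the sparsest solution of y = Ax is combinatorial, but replacing the l0 count with the l1 norm yields the lasso, a convex program that a simple proximal-gradient (ISTA) loop can solve.

```python
# Toy sketch (not from the thesis): recover a sparse vector from few
# linear measurements by solving the lasso relaxation
#     min_x  0.5*||A x - y||^2 + lam*||x||_1
# with proximal gradient descent (ISTA). All sizes are toy choices.
import random

random.seed(0)
n, m, k = 20, 12, 2                      # ambient dim, measurements, sparsity
x0 = [0.0] * n                           # planted k-sparse signal
for i in random.sample(range(n), k):
    x0[i] = random.choice([-1.0, 1.0])
A = [[random.gauss(0, 1) / m ** 0.5 for _ in range(n)] for _ in range(m)]
y = [sum(A[i][j] * x0[j] for j in range(n)) for i in range(m)]

def soft(v, t):
    """Soft-thresholding: the proximal operator of t*|.|_1."""
    return max(v - t, 0.0) + min(v + t, 0.0)

lam, step = 0.01, 0.05                   # step kept below 1/||A||^2
x = [0.0] * n
for _ in range(5000):
    r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
    g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
    x = [soft(x[j] - step * g[j], step * lam) for j in range(n)]

err = max(abs(x[j] - x0[j]) for j in range(n))  # recovery error
```

With enough generic measurements (here m = 12 for a 2-sparse signal in dimension 20), the l1 relaxation typically recovers the planted signal up to a small shrinkage bias of order `lam`; how many measurements suffice, and with what constants, is exactly the phase-transition question posed above.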

More specifically: i) focusing on linear inverse problems, we generalize the classical error bounds known for the least-squares technique to the lasso formulation, which incorporates the signal model; ii) we show that intuitive convex approaches do not perform as well as expected on signals that exhibit multiple low-dimensional structures simultaneously; iii) finally, we propose convex relaxations for the graph clustering problem and give sharp performance guarantees for a family of graphs arising from the so-called stochastic block model. We pay particular attention to the following aspects. For i) and ii), we aim to provide a general geometric framework in which the results on sparse and low-rank estimation follow as special cases. For i) and iii), we investigate the precise performance characterization, which yields the right constants in our bounds and the true dependence between the problem parameters.
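To make the stochastic block model concrete (an illustrative sketch only: this samples the model itself, not any convex program from the thesis, and `n`, `p`, `q` are toy values): every pair of nodes is connected independently, with probability p inside a community and q < p across communities, and clustering methods succeed by exploiting this density gap.

```python
# Toy sketch (not the thesis's algorithm): sample a two-community
# stochastic block model and verify that within-community edges are
# denser than cross-community edges. Parameters are arbitrary.
import random

random.seed(1)
n, p, q = 60, 0.7, 0.1               # nodes per community, in/out edge probs
label = [0] * n + [1] * n            # planted community assignment
N = 2 * n

adj = [[0] * N for _ in range(N)]    # symmetric 0/1 adjacency matrix
for i in range(N):
    for j in range(i + 1, N):
        prob = p if label[i] == label[j] else q
        if random.random() < prob:
            adj[i][j] = adj[j][i] = 1

within = sum(adj[i][j] for i in range(N) for j in range(i + 1, N)
             if label[i] == label[j])
across = sum(adj[i][j] for i in range(N) for j in range(i + 1, N)
             if label[i] != label[j])
within_pairs = 2 * (n * (n - 1) // 2)
across_pairs = n * n
dens_in, dens_out = within / within_pairs, across / across_pairs
```

The empirical densities `dens_in` and `dens_out` concentrate around p and q; sharp guarantees of the kind described above pin down how small the gap p − q can be, as a function of the graph size, while the planted communities remain recoverable by a convex relaxation.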

Item Type: Thesis (Dissertation (Ph.D.))
Subject Keywords: convex relaxation; low-dimensionality; high-dimensional estimation; phase transitions; clustering; structured signals
Degree Grantor: California Institute of Technology
Division: Engineering and Applied Science
Major Option: Electrical Engineering
Awards: Charles and Ellen Wilts Prize, 2015
Thesis Availability: Public (worldwide access)
Research Advisor(s):
  • Hassibi, Babak
Thesis Committee:
  • Hassibi, Babak (chair)
  • Tropp, Joel A.
  • Chandrasekaran, Venkat
  • Vaidyanathan, P. P.
  • Fazel, Maryam
Defense Date: 12 June 2014
Record Number: CaltechTHESIS:08182014-091546460
Persistent URL:
Default Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 8635
Deposited By: Samet Oymak
Deposited On: 22 Aug 2014 22:20
Last Modified: 04 Oct 2019 00:06

Thesis Files

PDF (Complete Thesis) - Final Version

