CaltechTHESIS
  A Caltech Library Service

Geometry-Driven Model Reduction

Citation

Budninskiy, Maxim A. (2019) Geometry-Driven Model Reduction. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/0RCX-0369. https://resolver.caltech.edu/CaltechTHESIS:12102018-011614041

Abstract

In this thesis, we bring discrete differential geometry to bear on model reduction, in the contexts of both data analysis and the numerical simulation of physical phenomena.

First, we present a novel controllable as-isometric-as-possible embedding method for low- and high-dimensional geometric datasets through sparse matrix eigenanalysis. This approach is equally suitable for performing nonlinear dimensionality reduction on big data and nonlinear shape editing of 3D meshes and pointsets. At the core of our approach is the construction of a "multi-Laplacian" quadratic form that is assembled from local operators whose kernels only contain locally affine functions. Minimizing this quadratic form produces an embedding that best preserves all relative coordinates of points within their local neighborhoods. We demonstrate the improvements that our approach brings over existing nonlinear local manifold learning methods on a number of datasets, and formulate the first eigen-based as-rigid-as-possible shape deformation technique by applying our affine-kernel embedding approach to 3D data augmented with user-imposed constraints on select vertices.
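
To make the flavor of such local methods concrete, the following is a minimal NumPy sketch of classical Local Tangent Space Alignment (LTSA), a close relative of the affine-kernel approach (not the thesis's multi-Laplacian construction): each local operator annihilates exactly the functions that are affine in the local tangent coordinates, the operators are summed into one sparse quadratic form, and the bottom eigenvectors of that form give the embedding. The helix dataset is an illustrative assumption.

```python
import numpy as np

def ltsa(X, d, k):
    """Local Tangent Space Alignment: assemble local operators whose
    kernels contain only locally affine functions, then embed via the
    bottom eigenvectors of their sum (a stand-in for the thesis's
    multi-Laplacian method, illustrating the same pipeline)."""
    n = len(X)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    M = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D[i])[:k]            # neighborhood (includes i)
        Z = X[nbrs] - X[nbrs].mean(0)          # centered local coordinates
        U = np.linalg.svd(Z, full_matrices=False)[0][:, :d]
        G = np.hstack([np.full((k, 1), k**-0.5), U])
        W = np.eye(k) - G @ G.T                # kernel = locally affine fns
        M[np.ix_(nbrs, nbrs)] += W             # accumulate sparse form
    w, V = np.linalg.eigh(M)                   # ascending eigenvalues
    return V[:, 1:d+1]                         # skip the constant mode

# A 1D manifold (helix) embedded in 3D, uniformly sampled
t = np.linspace(0, 4*np.pi, 200)
X = np.c_[np.cos(t), np.sin(t), 0.5*t]
emb = ltsa(X, d=1, k=10)[:, 0]
r = np.corrcoef(emb, t)[0, 1]                  # |r| near 1: parameter recovered
```

The user-imposed vertex constraints mentioned above would enter this picture as linear constraints on the minimization, turning the eigenproblem into a constrained one.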

Second, we introduce a new global manifold learning approach based on a metric connection for generating a quasi-isometric, low-dimensional mapping from a sparse and irregular sampling of an arbitrary low-dimensional manifold embedded in a high-dimensional space. Much like Isomap, one of the staples of manifold learning, our geometric procedure computes a low-dimensional embedding that best preserves all pairwise geodesic distances over the input pointset, and it exhibits the same strong resilience to noise. Whereas Isomap relies on Dijkstra's shortest-path algorithm to approximate geodesic distances over the input pointset, we instead compute them through "parallel transport unfolding," a discrete form of Cartan's development, which offers robustness to poor sampling and arbitrary topology. This use of discrete differential geometry to evaluate geodesic distances yields markedly improved robustness to irregularities and sampling voids; in particular, it does not suffer from Isomap's limitation to geodesically convex sampled domains. Moreover, it involves only simple linear algebra, significantly improves the accuracy of all pairwise geodesic distance approximations, and has the same computational complexity as Isomap. We also show that our connection-based distance estimation can be used in faster variants of Isomap such as Landmark-Isomap.
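
For reference, the Isomap pipeline that parallel transport unfolding improves upon can be sketched in a few lines of NumPy/SciPy: build a k-nearest-neighbor graph, approximate geodesic distances with Dijkstra (the step PTU replaces with connection-based transport), and apply classical multidimensional scaling. The helix dataset and parameter choices are illustrative assumptions.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

# A 1D manifold (helix) embedded in 3D, uniformly sampled
t = np.linspace(0, 4*np.pi, 200)
X = np.c_[np.cos(t), np.sin(t), 0.5*t]

# k-nearest-neighbor graph weighted by Euclidean edge lengths
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
k = 6
W = np.zeros_like(D)
for i in range(len(X)):
    nbrs = np.argsort(D[i])[1:k+1]
    W[i, nbrs] = D[i, nbrs]

# Geodesic distances via Dijkstra -- the step PTU replaces with
# parallel transport along the k-NN graph
G = shortest_path(csr_matrix(W), method="D", directed=False)

# Classical MDS on the geodesic distance matrix -> 1D embedding
n = len(X)
J = np.eye(n) - 1.0/n                       # centering matrix
B = -0.5 * J @ (G**2) @ J
w, V = np.linalg.eigh(B)                    # ascending eigenvalues
emb = V[:, -1] * np.sqrt(w[-1])             # top eigenpair = 1D coords

# The recovered coordinate matches the helix arc length (up to sign)
r = np.corrcoef(emb, t)[0, 1]
```

With PTU, only the `shortest_path` line changes; the graph construction and MDS stages are shared, which is why the overall computational complexity matches Isomap's.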

Finally, we introduce an operator-adapted multiresolution analysis for finite-element differential forms. From a given continuous, linear, bijective, and self-adjoint positive-definite operator L, a hierarchy of basis functions and associated wavelets for discrete differential forms is constructed in a fine-to-coarse fashion and in quasilinear time. The resulting wavelets are L-orthogonal across all scales, and can be used to obtain a Galerkin discretization of the operator with a block diagonal stiffness matrix composed of uniformly well-conditioned and sparse blocks. Because our approach applies to arbitrary differential p-forms, we can derive both scalar-valued and vector-valued wavelets that block diagonalize a prescribed operator. Our construction applies to various types of computational grids, offers arbitrary smoothness orders of basis functions and wavelets, and can accommodate linear differential constraints such as divergence-freeness. We also demonstrate the benefits of the operator-adapted multiresolution decomposition for coarse-graining and model reduction of linear and nonlinear partial differential equations.
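
A toy two-level version of this idea can be shown with plain linear algebra; the sketch below is a simplified, gamblet-style illustration (assuming a 1D Laplacian stiffness matrix and piecewise-constant coarsening, not the thesis's full differential-form construction): fine-scale wavelets are made L-orthogonal to the coarse space, so the Galerkin stiffness matrix becomes block diagonal.

```python
import numpy as np

# SPD model operator: 1D Dirichlet Laplacian stiffness matrix, n fine DoFs
n = 8
A = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Coarse basis: aggregate pairs of fine DoFs (piecewise-constant)
Phi = np.kron(np.eye(n//2), np.ones((2, 1)))          # n x n/2

# Initial fine-scale complements: differences within each pair
Chi = np.kron(np.eye(n//2), np.array([[1.], [-1.]]))  # n x n/2

# A-orthogonalize the fine functions against the coarse space:
#   Psi = (I - Phi (Phi^T A Phi)^{-1} Phi^T A) Chi  =>  Psi^T A Phi = 0
Ac = Phi.T @ A @ Phi
Psi = Chi - Phi @ np.linalg.solve(Ac, Phi.T @ (A @ Chi))

# Stiffness matrix in the adapted basis [Phi | Psi] is block diagonal:
# the coarse and wavelet scales fully decouple under the operator
Basis = np.hstack([Phi, Psi])
S = Basis.T @ A @ Basis
off = S[:n//2, n//2:]
print(np.allclose(off, 0))   # True
```

Recursing this fine-to-coarse split over a hierarchy of levels, with care taken to keep each basis function local, is what yields the quasilinear-time construction and the uniformly well-conditioned, sparse blocks described above.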

We conclude with a short discussion on how future work in geometric model reduction may impact other related topics such as semi-supervised learning.

Item Type: Thesis (Dissertation (Ph.D.))
Subject Keywords: Model Reduction; Manifold Learning; Nonlinear Dimensionality Reduction; Operator-adapted Wavelets; Finite-element Differential Forms; Numerical Homogenization
Degree Grantor: California Institute of Technology
Division: Engineering and Applied Science
Major Option: Applied and Computational Mathematics
Thesis Availability: Public (worldwide access)
Research Advisor(s):
  • Desbrun, Mathieu
Thesis Committee:
  • Owhadi, Houman (chair)
  • Chandrasekaran, Venkat
  • Stuart, Andrew M.
  • Desbrun, Mathieu
Defense Date: 27 November 2018
Record Number: CaltechTHESIS:12102018-011614041
Persistent URL: https://resolver.caltech.edu/CaltechTHESIS:12102018-011614041
DOI: 10.7907/0RCX-0369
Related URLs:
  • https://doi.org/10.1111/cgf.13250 (DOI): Article adapted for Ch. 3
  • https://arxiv.org/pdf/1806.09039.pdf (arXiv): Article adapted for Ch. 4
ORCID:
  • Budninskiy, Maxim A.: 0000-0002-9288-0249
Default Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 11303
Collection: CaltechTHESIS
Deposited By: Maxim Budninskiy
Deposited On: 08 Jan 2019 20:25
Last Modified: 04 Oct 2019 00:24

Thesis Files

PDF - Final Version, 119 MB (see Usage Policy)
