Citation
Sill, Joseph (1998) Monotonicity and connectedness in learning systems. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/GQWN-1H71. https://resolver.caltech.edu/CaltechETD:etd-09222005-110351
Abstract
This thesis studies two properties, monotonicity and connectedness, in the context of machine learning. The first part of the thesis examines the role of monotonicity constraints in machine learning from both practical and theoretical perspectives. Two techniques for enforcing monotonicity in machine learning models are proposed. The first method adds to the objective function a penalty term measuring the degree to which the model violates monotonicity. The penalty term can be interpreted as a Bayesian prior favoring functions which obey monotonicity. This method can enforce monotonicity only approximately, making it appropriate for situations where strict monotonicity may not hold. The second approach consists of a model which is monotonic by virtue of its functional form. This model is shown to have universal approximation capabilities with respect to the class M of monotonic functions. A variety of theoretical results are also presented regarding M. The generalization behavior of this class is shown to depend heavily on the probability distribution over the input space. Although the VC dimension of M is infinite, the VC entropy (i.e., the expected number of dichotomies) is modest for many distributions, allowing us to obtain bounds on the generalization error. Monte Carlo techniques for estimating the capacity and VC entropy of M are presented.
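The first technique described above, a penalty term added to the objective that measures the degree of monotonicity violation, can be sketched as follows. This is a minimal illustration, not the thesis's exact formulation: the quadratic model, the squared-hinge penalty on the slope, and the finite-difference gradient descent are all simplifying assumptions.

```python
import numpy as np

# Sketch of the penalty-based approach: augment the training objective with a
# term penalizing negative slope (a monotonicity violation) at the sample
# points. All modeling specifics here are illustrative assumptions.

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 50))
y = x + 0.1 * rng.normal(size=50)      # noisy but roughly increasing target

def model(w, x):
    # simple quadratic model: w0 + w1*x + w2*x^2
    return w[0] + w[1] * x + w[2] * x ** 2

def slope(w, x):
    return w[1] + 2.0 * w[2] * x

def loss(w, lam):
    mse = np.mean((model(w, x) - y) ** 2)
    violation = np.maximum(0.0, -slope(w, x))   # positive only where slope < 0
    return mse + lam * np.mean(violation ** 2)  # soft monotonicity penalty

def fit(lam, steps=20000, lr=0.01, eps=1e-6):
    w = np.zeros(3)
    for _ in range(steps):
        # central finite differences keep the sketch dependency-free
        g = np.array([(loss(w + eps * e, lam) - loss(w - eps * e, lam)) / (2 * eps)
                      for e in np.eye(3)])
        w -= lr * g
    return w

w_mono = fit(lam=10.0)
print("min slope on sample:", np.min(slope(w_mono, x)))
```

Because the penalty is soft, the fitted model is only approximately monotone: larger values of `lam` correspond to a stronger prior in favor of monotone functions, matching the Bayesian interpretation above.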
The second part of the thesis considers broader issues in learning theory. Generalization error bounds based on the VC dimension describe a function class by counting the number of dichotomies it induces. In this thesis, a more detailed characterization is presented which takes into account the diversity of a set of dichotomies in addition to its cardinality. Many function classes in common usage are shown to possess a property called connectedness. Models with this property induce dichotomy sets which are highly clustered and have little diversity. We derive an improvement to the VC bound which applies to function classes with the connectedness property.
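The idea of characterizing a dichotomy set by its structure rather than just its cardinality can be illustrated with a toy class. The example below uses 1-D threshold functions (an assumption made for illustration; the thesis treats general function classes): on a random sample they induce far fewer labelings than the 2^n possible ones, and those labelings form a single connected cluster when dichotomies differing on one point are joined by an edge.

```python
import numpy as np

# Toy illustration of counting dichotomies and checking their "connectedness".
# The class of 1-D threshold functions h_t(x) = 1[x >= t] and the adjacency
# notion (labelings differing on exactly one point) are illustrative choices.

rng = np.random.default_rng(1)
n = 8
xs = np.sort(rng.uniform(0.0, 1.0, n))

# Enumerate the distinct labelings thresholds induce on xs: one threshold
# below all points, one above, and one between each adjacent pair.
thresholds = np.concatenate(
    ([xs[0] - 1.0], (xs[:-1] + xs[1:]) / 2, [xs[-1] + 1.0]))
dichotomies = {tuple((xs >= t).astype(int)) for t in thresholds}
print(len(dichotomies), "of", 2 ** n, "possible labelings")   # n + 1, not 2^n

# Build a graph on the induced dichotomies, joining labelings that differ on
# exactly one point, and verify by depth-first search that it is connected,
# i.e. the dichotomy set is one tight cluster with little diversity.
def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

verts = sorted(dichotomies)
adj = {v: [w for w in verts if hamming(v, w) == 1] for v in verts}

seen, stack = {verts[0]}, [verts[0]]
while stack:
    for w in adj[stack.pop()]:
        if w not in seen:
            seen.add(w)
            stack.append(w)
print("connected:", len(seen) == len(verts))
```

Repeating this count over many random samples is one way to estimate the expected number of dichotomies (the VC entropy) by Monte Carlo; the connectedness check captures the clustering property that the improved bound exploits.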
| Field | Value |
| --- | --- |
| Item Type | Thesis (Dissertation (Ph.D.)) |
| Degree Grantor | California Institute of Technology |
| Division | Engineering and Applied Science |
| Major Option | Computation and Neural Systems |
| Thesis Availability | Public (worldwide access) |
| Research Advisor(s) | |
| Thesis Committee | |
| Defense Date | 21 May 1998 |
| Non-Caltech Author Email | joe_sill (AT) yahoo.com |
| Record Number | CaltechETD:etd-09222005-110351 |
| Persistent URL | https://resolver.caltech.edu/CaltechETD:etd-09222005-110351 |
| DOI | 10.7907/GQWN-1H71 |
| Default Usage Policy | No commercial reproduction, distribution, display or performance rights in this work are provided. |
| ID Code | 3689 |
| Collection | CaltechTHESIS |
| Deposited By | Imported from ETD-db |
| Deposited On | 23 Sep 2005 |
| Last Modified | 21 Dec 2019 03:03 |
Thesis Files
PDF (Sill_j_1998.pdf) - Final Version. See Usage Policy. 3MB.