On the Complexity of Neural Network Representations


Kiliç, Kordağ Mehmet (2024) On the Complexity of Neural Network Representations. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/trse-ff38.


The evolution of the human brain was one of the milestones in the history of information after the emergence of life. The underlying biological, chemical, and physical processes of the brain have amazed scientists for a long time. It is still a mystery how the human brain computes a simple arithmetical operation like 2 + 2 = 4. This enigma has spurred investigations into understanding the intrinsic architecture of the brain.

This thesis delves into two primary models for brain architecture: Feedforward Neural Networks and Nearest Neighbor (NN) Representations. Both models are treated under the hypothesis that our brain does not work with "large" numbers and that expressive power is derived from connectivity. Thus, when examining a network or, more precisely, a single neuron model, we strive to minimize the bit resolution of the weights, potentially at the cost of increased depth or circuit complexity.

For NN representations, the memory is defined by a set of vectors in ℝⁿ (which we call anchors), computation is performed by mapping an input vector to its nearest anchor, and the output is the label associated with that anchor. Limiting the bit resolution of the anchor entries may increase the size of the NN representation.
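As an illustration of this computational model, the following sketch builds an NN representation of the 2-input XOR function. The anchor placement here is a simple illustrative assumption (each Boolean input vector serves as its own anchor), not the optimized constructions studied in the thesis.

```python
# A minimal sketch of a nearest-neighbor (NN) representation for the
# 2-input XOR Boolean function. Anchor placement is an illustrative
# assumption: each Boolean point is its own anchor, labeled by XOR.
import itertools
import math

# Anchors in R^2 with associated Boolean labels.
anchors = {
    (0.0, 0.0): 0,
    (1.0, 1.0): 0,
    (0.0, 1.0): 1,
    (1.0, 0.0): 1,
}

def nn_evaluate(x):
    """Return the label of the anchor nearest to the input vector x."""
    nearest = min(anchors, key=lambda a: math.dist(a, x))
    return anchors[nearest]

# The representation reproduces XOR on every Boolean input.
for bits in itertools.product([0, 1], repeat=2):
    assert nn_evaluate(tuple(map(float, bits))) == bits[0] ^ bits[1]
```

Size here means the number of anchors (four); the resolution–size trade-off asks how few anchors, with how few bits per entry, suffice for a given Boolean function.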

In the digital age, computers universally employ the binary numeral system, ensuring the enduring relevance of Boolean functions. This study specifically explores the trade-off between resolution and size in computational models of Boolean functions. It is established that "low resolution" models may require a polynomial or even an exponential increase in size relative to their "high resolution" counterparts, potentially making practical implementation infeasible. Building upon prior research, our goal is to optimize these blow-ups by narrowing the gaps between theoretical upper and lower bounds under various constraints. Additionally, we aim to establish connections between NN representations and neural network models by providing explicit NN representations for well-known Boolean functions in Circuit Complexity Theory.

Item Type: Thesis (Dissertation (Ph.D.))
Subject Keywords: Neural Networks, Boolean functions, Complexity Theory, Circuit Complexity Theory, Information Theory, Machine Learning, Nearest Neighbors
Degree Grantor: California Institute of Technology
Division: Engineering and Applied Science
Major Option: Electrical Engineering
Thesis Availability: Public (worldwide access)
Research Advisor(s):
  • Bruck, Jehoshua
Thesis Committee:
  • Hassibi, Babak
  • Winfree, Erik
  • Umans, Christopher M. (chair)
  • Bruck, Jehoshua
Defense Date: 7 December 2023
Funding Agency (Grant Number):
  • Carver Mead New Adventures Fund (UNSPECIFIED)
  • Caltech Innovation Initiative (CI2) Research Grant (UNSPECIFIED)
Record Number: CaltechTHESIS:06032024-055145735
Persistent URL:
Related URLs:
  • adapted for Chapter 2
  • adapted for Chapter 2
  • adapted for Chapter 3
  • adapted for Chapter 3
  • adapted for Chapter 3
ORCID: Kiliç, Kordağ Mehmet (0009-0005-6321-7113)
Default Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 16476
Deposited By: Kordag Mehmet Kilic
Deposited On: 03 Jun 2024 23:39
Last Modified: 12 Jun 2024 22:39

Thesis Files

PDF (Thesis) - Final Version (see Usage Policy)

