UCLA Electronic Theses and Dissertations

Testing and Learning in High-Dimensions: Monotonicity Testing, Directed Isoperimetry, and Convex Sets

Abstract

This thesis studies testing and learning of monotone functions, k-monotone functions, and convex sets over high-dimensional domains. Our primary focus is monotonicity testing, which has been one of the central problems in property testing since the field's beginnings in the late 1990s. Monotonicity testing has attracted sustained interest, in part due to its connection to isoperimetric inequalities, which are a fundamental tool in the analysis of Boolean functions. Our secondary focus is testing and learning convex sets. Our contributions are presented in four parts, summarized as follows:
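
For reference, the standard notions underlying what follows, stated for the hypercube (the thesis also treats hypergrids and continuous product spaces); this is a hedged restatement of textbook definitions, not the thesis's exact formalization. A function f: {0,1}^d → {0,1} is monotone if x ⪯ y coordinate-wise implies f(x) ≤ f(y), and its distance to monotonicity is

  \varepsilon(f) \;=\; \min_{g \text{ monotone}} \; \Pr_{x \sim \{0,1\}^d}\bigl[\, f(x) \neq g(x) \,\bigr].

A monotonicity tester with one-sided error must accept every monotone function and reject, with probability at least 2/3, every f with \varepsilon(f) \ge \varepsilon.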

1) We present a nearly optimal non-adaptive monotonicity testing algorithm for Boolean functions over d-dimensional hypergrids and continuous product spaces. Chief among our technical contributions is a new isoperimetric inequality for Boolean functions over hypergrids, which serves as the central tool in our proof (a prototype of such an inequality is sketched after this list).

2) Given the impact of isoperimetric inequalities on testing monotonicity of Boolean functions, a natural question is whether these inequalities generalize to larger ranges. We give a black-box reduction showing that the known inequalities in this area generalize to real-valued functions (the generalized form is noted in the sketch below). We use this result to obtain nearly optimal bounds for (non-adaptive, one-sided error) monotonicity testing parameterized by the range size, as well as an improved upper bound for approximating the distance to monotonicity of real-valued functions on the hypercube.

3) We present nearly matching upper and lower bounds for sample-based testing and learning of k-monotone functions over hypercubes and continuous product spaces (one standard definition of k-monotonicity is recalled after this list).

4) Motivated by the limited understanding of convexity testing in high dimensions, we initiate the study of convex sets over the ternary hypercube, the simplest high-dimensional domain where convexity is a non-trivial property (the definition is recalled after this list). We obtain (i) nearly tight bounds on the edge-boundary of convex sets in this domain, (ii) new upper and lower bounds for sample-based testing and learning, and (iii) nearly matching upper and lower bounds for non-adaptive testing with one-sided error.
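
To make parts 1 and 2 concrete, the prototype of the isoperimetric inequalities in question is the directed analogue of Talagrand's inequality, established for Boolean functions on the hypercube by Khot, Minzer, and Safra; the contributions above extend inequalities of this shape to hypergrids and to real-valued functions. A hedged sketch, with constants and normalization elided: for f: {0,1}^d → {0,1}, let I_f^-(x) denote the number of coordinates i such that the edge (x, x ⊕ e_i) violates monotonicity. Then

  \mathbb{E}_{x \sim \{0,1\}^d}\Bigl[\sqrt{I_f^-(x)}\Bigr] \;\ge\; c \cdot \varepsilon(f)

for a universal constant c > 0. The real-valued generalization keeps the same form, with an edge (x, y), x ≺ y, counted as violating whenever f(x) > f(y).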
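
For part 3, one standard formalization of k-monotonicity (conventions differ slightly across papers, so this is a hedged restatement): f: {0,1}^d → {0,1} is k-monotone if there is no chain

  x^{(1)} \prec x^{(2)} \prec \cdots \prec x^{(k+1)}

with f(x^{(1)}) = 1 and f(x^{(i)}) \neq f(x^{(i+1)}) for every i. For k = 1 this forbids exactly the pairs x ≺ y with f(x) = 1 and f(y) = 0, recovering ordinary monotonicity.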
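
For part 4, the notion of convexity on the ternary hypercube {-1, 0, 1}^d, restated here in hedged form: a set S ⊆ {-1, 0, 1}^d is convex if it is the restriction of a convex body to the grid, i.e.

  S \;=\; \mathrm{conv}(S) \,\cap\, \{-1, 0, 1\}^d.

On the binary hypercube {0,1}^d every subset satisfies this condition (each point is an extreme point of the cube), which is why the ternary hypercube is the simplest high-dimensional domain on which convexity becomes non-trivial.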
