This thesis is concerned with the study of the noise sensitivity of boolean functions and its applications in theoretical computer science. Noise sensitivity is defined as follows: Let f be a boolean function and let ε ∈ (0, ½) be a parameter. Suppose a uniformly random string x is picked, and y is formed by flipping each bit of x independently with probability ε. Then the noise sensitivity of f at ε is defined to be the probability that f(x) and f(y) differ.
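This definition is directly algorithmic, so it can be estimated by sampling. The following minimal Monte Carlo sketch (not from the thesis; the names `noise_sensitivity` and `dictator` are illustrative) follows the definition above literally: draw a uniform x, flip each bit with probability ε, and count disagreements of f.

```python
import random

def noise_sensitivity(f, n, eps, trials=20000):
    """Monte Carlo estimate of NS_eps(f): pick a uniform x in {0,1}^n,
    flip each bit of x independently with probability eps to get y, and
    report the fraction of trials on which f(x) != f(y)."""
    disagree = 0
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        y = [b ^ (random.random() < eps) for b in x]  # bool xors as 0/1
        disagree += f(x) != f(y)
    return disagree / trials

# Sanity check: the dictator function f(x) = x_0 has NS_eps(f) = eps
# exactly, since f changes value iff the first bit is flipped.
dictator = lambda x: x[0]
```

With enough trials the estimate concentrates around the true value, so `noise_sensitivity(dictator, 10, 0.1)` returns a number close to 0.1.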
In this thesis we investigate the noise sensitivity of various classes of boolean functions, including majorities and recursive majorities, boolean threshold functions, and monotone functions. Following this we give new complexity-theoretic and algorithmic applications of noise sensitivity: (1) Regarding computational hardness amplification, we prove a general direct product theorem that tightly characterizes the hardness of a composite function g ∘ f in terms of an assumed hardness of f and the noise sensitivity of g. The theorem lets us prove a new result about the hardness on average of NP: If NP is (1 − 1/poly(n))-hard for circuits of polynomial size, then it is in fact (½ + o(1))-hard for circuits of polynomial size. (2) In the field of computational learning theory, we show that any class whose functions have low noise sensitivity is efficiently learnable. Using our noise sensitivity estimates for functions of boolean halfspaces we obtain new polynomial and quasipolynomial time algorithms for learning intersections, thresholds, and other functions of halfspaces. From noise sensitivity considerations we also give a polynomial time algorithm for learning polynomial-sized DNFs under the “Random Walk” model; we also give the first algorithm that learns the class of “junta” functions with efficiency better than that of the brute force algorithm. (3) Finally, we introduce a new collective coin-flipping problem whose study is equivalent to the study of “higher moments” of the noise sensitivity problem. We prove several results about this extension, and find optimal or near-optimal choices for the coin-flipping function for all asymptotic limits of the parameters. Our techniques include a novel application of the reverse Bonami-Beckner inequality. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
Cited By
- Blanc G, Gupta N, Lange J and Tan L Universal guarantees for decision tree induction via a higher-order splitting criterion Proceedings of the 34th International Conference on Neural Information Processing Systems, (9475-9484)
- Wei C and Ermon S General bounds on satisfiability thresholds for random CSPs via fourier analysis Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, (3958-3965)
- Dachman-Soled D, Feldman V, Tan L, Wan A and Wimmer K Approximate resilience, monotonicity, and the complexity of agnostic learning Proceedings of the twenty-sixth annual ACM-SIAM symposium on Discrete algorithms, (498-511)
- Khot S and Vishnoi N (2015). The Unique Games Conjecture, Integrality Gap for Cut Problems and Embeddability of Negative-Type Metrics into ℓ1, Journal of the ACM, 62:1, (1-39), Online publication date: 2-Mar-2015.
- O’Donnell R, Wu Y and Zhou Y (2014). Optimal Lower Bounds for Locality-Sensitive Hashing (Except When q is Tiny), ACM Transactions on Computation Theory, 6:1, (1-13), Online publication date: 1-Mar-2014.
- O'Donnell R and Zhou Y Approximability and proof complexity Proceedings of the twenty-fourth annual ACM-SIAM symposium on Discrete algorithms, (1537-1556)
- Ré C and Suciu D (2008). Approximate lineage for probabilistic databases, Proceedings of the VLDB Endowment, 1:1, (797-808), Online publication date: 1-Aug-2008.
- Klivans A and Sherstov A (2007). Unconditional lower bounds for learning intersections of halfspaces, Machine Learning, 69:2-3, (97-114), Online publication date: 1-Dec-2007.
Index Terms
- Computational applications of noise sensitivity
Recommendations
Bounding the average sensitivity and noise sensitivity of polynomial threshold functions
STOC '10: Proceedings of the forty-second ACM symposium on Theory of computing
We give the first non-trivial upper bounds on the average sensitivity and noise sensitivity of degree-d polynomial threshold functions (PTFs). These bounds hold both for PTFs over the Boolean hypercube {-1,1}^n and for PTFs over R^n under the standard n-...
Average Sensitivity and Noise Sensitivity of Polynomial Threshold Functions
Special Section on the Fifty-First Annual IEEE Symposium on Foundations of Computer Science (FOCS 2010)
We give the first nontrivial upper bounds on the Boolean average sensitivity and noise sensitivity of degree-d polynomial threshold functions (PTFs). Our bound on the Boolean average sensitivity of PTFs represents the first progress toward the resolution of ...
Sensitivity, block sensitivity, and l-block sensitivity of boolean functions
Sensitivity is one of the simplest, and block sensitivity one of the most useful, invariants of a boolean function. Nisan [SIAM J. Comput. 20 (6) (1991) 999] and Nisan and Szegedy [Comput. Complexity 4 (4) (1994) 301] have shown that block sensitivity ...