ABSTRACT
We show the following new lower bounds in two concrete complexity models:
(1) In the two-party communication complexity model, we show that the tribes function on n inputs [6] has two-sided error randomized complexity Ω(n), while its nondeterministic and co-nondeterministic complexities are both Θ(√n). This separation between randomized and nondeterministic complexity is the best possible, and it settles an open problem of Kushilevitz and Nisan [17], which was also posed by Beame and Lawry [5].
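For concreteness, the tribes function of Ben-Or and Linial [6] is an OR of ANDs over disjoint blocks ("tribes") of the input bits. The sketch below is our illustration, not the paper's two-party formulation (in which the input bits are split between the two players); it assumes √n tribes of √n bits each, so n must be a perfect square:

```python
import math

def tribes(bits):
    """Evaluate the tribes function: OR over tribes of the AND of each tribe.

    Assumes len(bits) is a perfect square n = s*s, giving s tribes of s bits.
    """
    n = len(bits)
    s = math.isqrt(n)
    assert s * s == n, "this sketch assumes n is a perfect square"
    # The function is 1 iff some tribe consists entirely of 1s.
    return any(all(bits[i * s:(i + 1) * s]) for i in range(s))

# n = 9, three tribes of three bits each:
print(tribes([1, 1, 1, 0, 0, 0, 0, 1, 0]))  # → True (first tribe is all ones)
print(tribes([1, 0, 1, 0, 1, 0, 0, 1, 0]))  # → False (no tribe is all ones)
```

Tribe width and count can be tuned independently; the balanced √n-by-√n choice above is just the simplest instance matching the "n inputs" statement.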
(2) In the Boolean decision tree model, we show that the recursive majority-of-three function on 3^h inputs has randomized complexity Ω((7/3)^h). The deterministic complexity of this function is Θ(3^h), and its nondeterministic complexity is Θ(2^h). Our lower bound on the randomized complexity is a substantial improvement over any lower bound for this problem obtainable via the techniques of Saks and Wigderson [23], Heiman and Wigderson [14], and Heiman, Newman, and Wigderson [13]. Recursive majority is an important function for which the class of natural algorithms known as directional algorithms does not achieve the best randomized decision tree upper bound.
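The function above is a ternary tree of majority gates of depth h, with the 3^h input bits at the leaves. The sketch below (ours, not the paper's) shows a plain recursive evaluation alongside a directional algorithm, i.e. one that evaluates the children of a node one at a time (here in random order) and stops as soon as two children agree, since their common value is then the majority:

```python
import random

def rec_maj3(bits):
    """Recursive majority-of-three; len(bits) must be a power of 3."""
    if len(bits) == 1:
        return bits[0]
    third = len(bits) // 3
    children = [rec_maj3(bits[i * third:(i + 1) * third]) for i in range(3)]
    return 1 if sum(children) >= 2 else 0

def rec_maj3_directional(bits):
    """Directional evaluation: query subtrees one at a time, in random
    order, and stop once two of the three children agree."""
    if len(bits) == 1:
        return bits[0]
    third = len(bits) // 3
    values = []
    for i in random.sample(range(3), 3):
        values.append(rec_maj3_directional(bits[i * third:(i + 1) * third]))
        for b in (0, 1):
            if values.count(b) == 2:
                return b
    # Unreachable: among three Boolean values, two must agree.

# h = 2: nine leaves, subtree majorities are 1, 0, 1.
print(rec_maj3([1, 1, 0, 0, 0, 1, 1, 0, 1]))  # → 1
```

Directional algorithms never probe a third child whose value cannot change the answer, which makes them a natural candidate for optimality; part of what makes recursive majority interesting is that, as noted above, they are nevertheless not optimal.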
These lower bounds are obtained via generalizations of information complexity, which quantifies the minimum amount of information about the inputs that any correct algorithm in a given model of computation must reveal.
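As background, the standard (external) form of this quantity in the two-party setting can be written as follows; the notation is ours, not quoted from the paper. For inputs (X, Y) drawn from a distribution μ and a protocol with transcript Π that computes f with error at most δ,

```latex
% Information cost of a protocol = mutual information between inputs and transcript;
% information complexity = its minimum over all correct protocols.
\[
  \mathrm{IC}_{\mu,\delta}(f) \;=\; \min_{\Pi\ \text{computing}\ f\ \text{with error}\ \delta} I(XY : \Pi).
\]
% Since I(XY : \Pi) \le H(\Pi) \le |\Pi|, information complexity lower-bounds
% the randomized communication complexity:
\[
  R_{\delta}(f) \;\ge\; \mathrm{IC}_{\mu,\delta}(f) \quad \text{for every } \mu.
\]
```

A transcript can never reveal more information than its length, which is why a lower bound on revealed information translates directly into a communication lower bound.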
REFERENCES
- [1] F. Ablayev. Lower bounds for one-way probabilistic communication complexity and their application to space complexity. Theoretical Computer Science, 157(2):139--159, 1996.
- [2] A. V. Aho, J. D. Ullman, and M. Yannakakis. On notions of information transfer in VLSI circuits. In Proc. 15th Annual ACM Symposium on Theory of Computing, pages 133--139, 1983.
- [3] R. Bar-Yehuda, B. Chor, E. Kushilevitz, and A. Orlitsky. Privacy, additional information, and communication. IEEE Transactions on Information Theory, 39(6):1930--1943, 1993.
- [4] Z. Bar-Yossef, T. Jayram, R. Kumar, and D. Sivakumar. An information statistics approach to data stream and communication complexity. In Proc. 43rd Annual IEEE Symposium on Foundations of Computer Science, pages 209--218, 2002.
- [5] P. Beame and J. Lawry. Randomized versus nondeterministic communication complexity. In Proc. 24th Annual ACM Symposium on Theory of Computing, pages 188--199, 1992.
- [6] M. Ben-Or and N. Linial. Collective coin flipping. In S. Micali, editor, Randomness and Computation, pages 91--115. JAI Press, 1990.
- [7] M. Blum and R. Impagliazzo. Generic oracles and oracle classes. In Proc. 28th Annual IEEE Symposium on Foundations of Computer Science, pages 118--126, 1987.
- [8] H. Buhrman and R. de Wolf. Complexity measures and decision tree complexity: A survey. Theoretical Computer Science, 288(1):21--43, 2002.
- [9] A. Chakrabarti, Y. Shi, A. Wirth, and A. C.-C. Yao. Informational complexity and the direct sum problem for simultaneous message complexity. In Proc. 42nd Annual IEEE Symposium on Foundations of Computer Science, pages 270--278, 2001.
- [10] T. M. Cover and J. A. Thomas. Elements of Information Theory. John Wiley & Sons, 1991.
- [11] M. Fürer. The power of randomness for communication complexity. In Proc. 19th Annual ACM Symposium on Theory of Computing, pages 178--181, 1987.
- [12] J. Hartmanis and L. Hemachandra. Complexity classes without machines: On complete languages for UP. Theoretical Computer Science, 58:129--142, 1988.
- [13] R. Heiman, I. Newman, and A. Wigderson. On read-once threshold formulae and their randomized decision tree complexity. Theoretical Computer Science, 107(1):63--76, 1993.
- [14] R. Heiman and A. Wigderson. Randomized vs. deterministic decision tree complexity for read-once Boolean functions. Computational Complexity, 1:311--329, 1991.
- [15] J. Hromkovič and G. Schnitger. Nondeterministic communication with a limited number of advice bits. In Proc. 28th Annual ACM Symposium on Theory of Computing, pages 551--560, 1996.
- [16] M. Karchmer and A. Wigderson. Monotone circuits for connectivity require super-logarithmic depth. SIAM Journal on Discrete Mathematics, 3(2):255--265, 1990.
- [17] E. Kushilevitz and N. Nisan. Communication Complexity. Cambridge University Press, 1997.
- [18] J. Lin. Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory, 37(1):145--151, 1991.
- [19] I. Newman. Private vs. common random bits in communication complexity. Information Processing Letters, 39:67--71, 1991.
- [20] N. Nisan. CREW PRAMs and decision trees. SIAM Journal on Computing, 20(6):999--1007, 1991.
- [21] R. Raz and A. Wigderson. Monotone circuits for matching require linear depth. Journal of the ACM, 39(3):736--744, 1992.
- [22] M. Saks and X. Sun. Space lower bounds for distance approximation in the data stream model. In Proc. 34th Annual ACM Symposium on Theory of Computing, pages 360--369, 2002.
- [23] M. Saks and A. Wigderson. Probabilistic Boolean decision trees and the complexity of evaluating game trees. In Proc. 27th Annual IEEE Symposium on Foundations of Computer Science, pages 29--38, 1986.
- [24] M. Santha. On the Monte Carlo Boolean decision tree complexity of read-once formulae. Random Structures and Algorithms, 6(1):75--88, 1995.
- [25] M. Snir. Lower bounds for probabilistic linear decision trees. Theoretical Computer Science, 38:69--82, 1985.
- [26] G. Tardos. Query complexity, or why is it difficult to separate NP^A ∩ co-NP^A from P^A by a random oracle. Combinatorica, 9:385--392, 1990.
- [27] A. C.-C. Yao. Some complexity questions related to distributive computing. In Proc. 11th Annual ACM Symposium on Theory of Computing, pages 209--213, 1979.
- [28] A. C.-C. Yao. The entropic limitations on VLSI computations (extended abstract). In Proc. 13th Annual ACM Symposium on Theory of Computing, pages 308--311, 1981.