DOI: 10.1145/780542.780640

Two applications of information complexity

Published: 09 June 2003

ABSTRACT

We show the following new lower bounds in two concrete complexity models:

• (1) In the two-party communication complexity model, we show that the tribes function on n inputs [6] has two-sided error randomized complexity Ω(n), while its nondeterministic and co-nondeterministic complexities are both Θ(√n). This separation between randomized and nondeterministic complexity is the best possible, and it settles an open problem in Kushilevitz and Nisan [17], which was also posed by Beame and Lawry [5].

• (2) In the Boolean decision tree model, we show that the recursive majority-of-three function on 3^h inputs has randomized complexity Ω((7/3)^h). The deterministic complexity of this function is Θ(3^h), and the nondeterministic complexity is Θ(2^h). Our lower bound on the randomized complexity is a substantial improvement over any lower bound for this problem that can be obtained via the techniques of Saks and Wigderson [23], Heiman and Wigderson [14], and Heiman, Newman, and Wigderson [13]. Recursive majority is an important function for which a class of natural algorithms known as directional algorithms does not achieve the best randomized decision tree upper bound.

These lower bounds are obtained using generalizations of information complexity, which quantifies the minimum amount of information about the inputs that every correct algorithm in a given model of computation must reveal.
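For readers unfamiliar with the two functions named in the abstract, the following minimal sketch (in Python, not taken from the paper) spells them out in their plain Boolean-function form: TRIBES as an OR of ANDs over disjoint blocks of variables, and recursive majority-of-three on 3^h bits. The block sizes and input encodings here are illustrative assumptions; in particular, the two-party communication version splits the inputs between the players, which is not modeled here.

```python
# Illustrative sketch only: direct deterministic evaluators for the two
# functions discussed in the abstract. Input encodings are assumptions
# made for this example, not the paper's definitions.

def tribes(rows):
    """TRIBES: an OR of ANDs, taken here as the OR over rows of the AND of each row.

    `rows` is a list of lists of bits; a balanced form uses roughly
    sqrt(n) rows ("tribes") of sqrt(n) bits each.
    """
    return int(any(all(row) for row in rows))

def rec_maj3(bits):
    """Recursive majority-of-three on 3^h input bits.

    Splits the input into three equal blocks, evaluates each block
    recursively, and returns the majority of the three sub-results.
    """
    if len(bits) == 1:
        return bits[0]
    third = len(bits) // 3
    sub = [rec_maj3(bits[i * third:(i + 1) * third]) for i in range(3)]
    return int(sum(sub) >= 2)

if __name__ == "__main__":
    print(tribes([[1, 1], [0, 1]]))               # 1: the first tribe is all ones
    print(rec_maj3([1, 0, 1, 0, 0, 0, 1, 1, 0]))  # h = 2, 9 inputs -> 1
```

A directional randomized evaluator for recursive majority would recurse into the three blocks in a random order and stop as soon as two of the recursive answers agree; as the abstract notes, such directional algorithms do not achieve the best known randomized decision tree upper bound for this function. For the communication bounds, information cost is commonly defined (as in [4, 9]) as the mutual information between the players' inputs and the protocol transcript, minimized over correct protocols; the paper's lower bounds come from generalizations of this quantity.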

References

1. F. Ablayev. Lower bounds for one-way probabilistic communication complexity and their application to space complexity. Theoretical Computer Science, 157(2):139--159, 1996.
2. A. V. Aho, J. D. Ullman, and M. Yannakakis. On notions of information transfer in VLSI circuits. In Proc. 15th Annual ACM Symposium on Theory of Computing, pages 133--139, 1983.
3. R. Bar-Yehuda, B. Chor, E. Kushilevitz, and A. Orlitsky. Privacy, additional information, and communication. IEEE Transactions on Information Theory, 39(6):1930--1943, 1993.
4. Z. Bar-Yossef, T. Jayram, R. Kumar, and D. Sivakumar. An information statistics approach to data stream and communication complexity. In Proc. 43rd Annual IEEE Symposium on Foundations of Computer Science, pages 209--218, 2002.
5. P. Beame and J. Lawry. Randomized versus nondeterministic communication complexity. In Proc. 24th Annual ACM Symposium on Theory of Computing, pages 188--199, 1992.
6. M. Ben-Or and N. Linial. Collective coin flipping. In S. Micali, editor, Randomness and Computation, pages 91--115. JAI Press, 1990.
7. M. Blum and R. Impagliazzo. Generic oracles and oracle classes. In Proc. 28th Annual IEEE Symposium on Foundations of Computer Science, pages 118--126, 1987.
8. H. Buhrman and R. de Wolf. Complexity measures and decision tree complexity: A survey. Theoretical Computer Science, 288(1):21--43, 2002.
9. A. Chakrabarti, Y. Shi, A. Wirth, and A. C.-C. Yao. Informational complexity and the direct sum problem for simultaneous message complexity. In Proc. 42nd Annual IEEE Symposium on Foundations of Computer Science, pages 270--278, 2001.
10. T. M. Cover and J. A. Thomas. Elements of Information Theory. John Wiley & Sons, Inc., 1991.
11. M. Fürer. The power of randomness for communication complexity. In Proc. 19th Annual ACM Symposium on Theory of Computing, pages 178--181, 1987.
12. J. Hartmanis and L. Hemachandra. Complexity classes without machines: On complete languages for UP. Theoretical Computer Science, 58:129--142, 1988.
13. R. Heiman, I. Newman, and A. Wigderson. On read-once threshold formulae and their randomized decision tree complexity. Theoretical Computer Science, 107(1):63--76, 1993.
14. R. Heiman and A. Wigderson. Randomized vs. deterministic decision tree complexity for read-once Boolean functions. Computational Complexity, 1:311--329, 1991.
15. J. Hromkovič and G. Schnitger. Nondeterministic communication with a limited number of advice bits. In Proc. 28th Annual ACM Symposium on Theory of Computing, pages 551--560, 1996.
16. M. Karchmer and A. Wigderson. Monotone circuits for connectivity require super-logarithmic depth. SIAM Journal on Discrete Mathematics, 3(2):255--265, 1990.
17. E. Kushilevitz and N. Nisan. Communication Complexity. Cambridge University Press, 1997.
18. J. Lin. Divergence measures based on the Shannon entropy. IEEE Transactions on Information Theory, 37(1):145--151, 1991.
19. I. Newman. Private vs. common random bits in communication complexity. Information Processing Letters, 39:67--71, 1991.
20. N. Nisan. CREW PRAMs and decision trees. SIAM Journal on Computing, 20(6):999--1007, 1991.
21. R. Raz and A. Wigderson. Monotone circuits for matching require linear depth. Journal of the ACM, 39(3):736--744, 1992.
22. M. Saks and X. Sun. Space lower bounds for distance approximation in the data stream model. In Proc. 34th Annual ACM Symposium on Theory of Computing, pages 360--369, 2002.
23. M. Saks and A. Wigderson. Probabilistic Boolean decision trees and the complexity of evaluating game trees. In Proc. 27th Annual IEEE Symposium on Foundations of Computer Science, pages 29--38, 1986.
24. M. Santha. On the Monte Carlo Boolean decision tree complexity of read-once formulae. Random Structures and Algorithms, 6(1):75--88, 1995.
25. M. Snir. Lower bounds for probabilistic linear decision trees. Theoretical Computer Science, 38:69--82, 1985.
26. G. Tardos. Query complexity, or why is it difficult to separate NP^A ∩ co-NP^A from P^A by a random oracle. Combinatorica, 9:385--392, 1990.
27. A. C.-C. Yao. Some complexity questions related to distributive computing. In Proc. 11th Annual ACM Symposium on Theory of Computing, pages 209--213, 1979.
28. A. C.-C. Yao. The entropic limitations on VLSI computations (extended abstract). In Proc. 13th Annual ACM Symposium on Theory of Computing, pages 308--311, 1981.

Published in

STOC '03: Proceedings of the thirty-fifth annual ACM symposium on Theory of computing, June 2003, 740 pages.
ISBN: 1581136749
DOI: 10.1145/780542
Copyright © 2003 ACM
Publisher: Association for Computing Machinery, New York, NY, United States
