CTrigger: exposing atomicity violation bugs from their hiding places

Published: 07 March 2009

Abstract

Multicore hardware is making concurrent programs pervasive. Unfortunately, concurrent programs are prone to bugs. Among the different types of concurrency bugs, atomicity violation bugs are common and important. Existing techniques for detecting atomicity violation bugs share one limitation: they require the bug to manifest during a monitored run, and making concurrency bugs manifest is itself an open problem in concurrent program testing.
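
To make the bug class concrete, here is a hypothetical Python sketch (not taken from the paper) of an atomicity violation: a read-modify-write on a shared counter that is intended to be atomic but holds no lock. The events are artificial scheduling hooks, an assumption added so the unserializable interleaving occurs deterministically rather than by chance.

```python
import threading

# Hypothetical example: `balance` is shared, and the read-modify-write in
# deposit() is meant to be atomic but is unprotected. The events hold the
# read-write window open so the bug manifests deterministically.
balance = 0

def deposit(amount, after_read=None, resume=None):
    global balance
    tmp = balance              # read shared state
    if after_read is not None:
        after_read.set()       # signal: the read has happened
        resume.wait()          # keep the read-write window open
    balance = tmp + amount     # write back a possibly stale value

after_read, resume = threading.Event(), threading.Event()
t = threading.Thread(target=deposit, args=(10, after_read, resume))
t.start()
after_read.wait()              # thread t has read balance == 0
deposit(5)                     # a second update lands inside t's window
resume.set()                   # t resumes and writes tmp + 10
t.join()
print(balance)                 # 10, not 15: the deposit of 5 is lost
```

In an ordinary run the window between the read and the write is a few instructions wide, which is exactly why such bugs rarely manifest under stress testing.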

This paper makes two contributions. First, it studies the interleaving characteristics of the common practice in concurrent program testing (i.e., running a program over and over) to understand why atomicity violation bugs are hard to expose. Second, it proposes CTrigger to effectively and efficiently expose atomicity violation bugs in large programs. CTrigger focuses on a special class of interleavings (unserializable interleavings) that are inherently correlated with atomicity violation bugs, and uses trace analysis to systematically identify (likely) feasible unserializable interleavings with low occurrence probability. CTrigger then uses minimum execution perturbation to exercise these low-probability interleavings and expose difficult-to-catch atomicity violations.
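
The unserializable interleavings CTrigger targets can be enumerated: with two consecutive accesses to a variable from one thread and one interleaved remote access, four of the eight read/write combinations have no equivalent serial order. A minimal sketch of that classification, assuming the pattern taxonomy from this line of work (the function name is ours, not CTrigger's API):

```python
# Minimal sketch, not CTrigger's actual trace analysis: classify a remote
# access that falls between two consecutive local accesses to the same
# variable. Four of the eight combinations are unserializable, i.e. no
# serial order of the two threads produces the same behavior.
UNSERIALIZABLE = {
    ("R", "W", "R"),  # two local reads observe different values
    ("R", "W", "W"),  # the remote write is silently lost
    ("W", "R", "W"),  # the remote read sees an intermediate value
    ("W", "W", "R"),  # the local read sees the remote write, not its own
}

def is_unserializable(local1: str, remote: str, local2: str) -> bool:
    """Classify one (local, remote, local) access triple to one variable."""
    return (local1, remote, local2) in UNSERIALIZABLE

print(is_unserializable("R", "W", "R"))  # True: a suspect interleaving
print(is_unserializable("R", "R", "R"))  # False: serializable
```

The lost-update example above is the ("R", "W", "W") case: the second thread's write lands between the first thread's read and write.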

We evaluate CTrigger with real-world atomicity violation bugs from four server/desktop applications (Apache, MySQL, Mozilla, and PBZIP2) and three SPLASH-2 applications on 8-core machines. CTrigger efficiently exposes the tested bugs within 1--235 seconds, two to four orders of magnitude faster than stress testing. Without CTrigger, some of these bugs do not manifest even after 7 full days of stress testing. In addition, even without deterministic replay support, once a bug is exposed, CTrigger can help programmers reliably reproduce it for diagnosis. Our tested bugs are reproduced by CTrigger mostly within 5 seconds, 300 to over 60,000 times faster than with stress testing.



• Published in

  ACM SIGPLAN Notices, Volume 44, Issue 3 (ASPLOS 2009), March 2009, 346 pages
  ISSN: 0362-1340
  EISSN: 1558-1160
  DOI: 10.1145/1508284

• Also appears in

  ASPLOS XIV: Proceedings of the 14th International Conference on Architectural Support for Programming Languages and Operating Systems, March 2009, 358 pages
  ISBN: 9781605584065
  DOI: 10.1145/1508244

      Copyright © 2009 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher: Association for Computing Machinery, New York, NY, United States

