Research article
DOI: 10.1145/2001420.2001436

Testing concurrent programs on relaxed memory models

Published: 17 July 2011

ABSTRACT

High-performance concurrent libraries, such as lock-free data structures and custom synchronization primitives, are notoriously difficult to write correctly. Such code is often implemented without locks, using plain loads and stores and low-level operations like atomic compare-and-swap and explicit memory fences. It must run correctly despite the relaxed memory model of the underlying compiler, virtual machine, and/or hardware. These memory models may reorder the reads and writes issued by a thread, greatly complicating reasoning about parallel code.
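
To make the reordering concrete, the classic "store buffering" litmus test is sketched below in C with POSIX threads (our illustration, not an example taken from the paper). Under sequential consistency at least one of the two loads must observe the other thread's store, so r0 == 0 && r1 == 0 is impossible; on x86-TSO and weaker models each store may be delayed in a store buffer past the subsequent load, making exactly that outcome possible.

    /* sb.c -- the classic "store buffering" litmus test (illustration only;
     * not code from the paper).  Build: gcc -pthread sb.c -o sb
     * Under sequential consistency, at least one thread must observe the
     * other's store, so r0 == 0 && r1 == 0 is impossible.  On x86-TSO and
     * weaker models, each store can be delayed in a store buffer past the
     * following load, so the forbidden outcome can appear. */
    #include <pthread.h>
    #include <stdio.h>

    volatile int x = 0, y = 0;   /* shared flags, plain (non-atomic) accesses */
    volatile int r0, r1;         /* values observed by the two threads */

    static void *thread0(void *arg) {
        (void)arg;
        x = 1;                   /* store to x ...                        */
        r0 = y;                  /* ... may take effect after this load   */
        return NULL;
    }

    static void *thread1(void *arg) {
        (void)arg;
        y = 1;
        r1 = x;
        return NULL;
    }

    int main(void) {
        for (int i = 0; i < 100000; i++) {
            pthread_t t0, t1;
            x = y = 0;
            pthread_create(&t0, NULL, thread0, NULL);
            pthread_create(&t1, NULL, thread1, NULL);
            pthread_join(t0, NULL);
            pthread_join(t1, NULL);
            if (r0 == 0 && r1 == 0)          /* impossible under SC */
                printf("non-SC outcome at iteration %d\n", i);
        }
        return 0;
    }

Inserting a full hardware fence between the store and the load in each thread (for example GCC's __sync_synchronize()) rules the outcome back out. In practice the window is tiny and the violation shows up only rarely and non-deterministically, which is precisely the kind of schedule- and memory-model-dependent behavior that plain stress testing struggles to exercise.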

We propose Relaxer, a combination of predictive dynamic analysis and software testing, to help programmers write correct, highly concurrent programs. Our technique works in two phases. First, Relaxer examines a sequentially consistent run of a program under test and dynamically detects potential data races. These races are used to predict possible violations of sequential consistency under alternate executions on a relaxed memory model. In the second phase, Relaxer re-executes the program with a biased random scheduler and a conservative simulation of a relaxed memory model in order to trigger, with high probability, a predicted sequential consistency violation. These executions can be used to test whether a program works as expected when the underlying memory model is not sequentially consistent.
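
As a rough sketch of what a conservative relaxed-memory simulation can do (our own illustration; the paper's simulator is more general and is driven by the races predicted in the first phase), the following single-file C program models TSO-style per-thread store buffers: stores are buffered, loads forward from the issuing thread's own buffer before falling back to memory, and one legal buffer-flush schedule reproduces the outcome that sequential consistency forbids.

    /* tso_sim.c -- a deliberately tiny TSO-style store-buffer simulation
     * (our own sketch, not Relaxer's implementation).  Stores go into the
     * issuing thread's FIFO buffer, loads forward from that buffer before
     * falling back to shared memory, and one legal flush schedule yields
     * the outcome forbidden under sequential consistency. */
    #include <stdio.h>

    #define NVARS 2                       /* shared variables: 0 = x, 1 = y */
    #define BUFSZ 4

    struct buf { int var[BUFSZ]; int val[BUFSZ]; int n; };

    static int        mem[NVARS];         /* shared memory               */
    static struct buf sb[2];              /* one store buffer per thread */

    static void store(int tid, int var, int val) {   /* buffered store   */
        sb[tid].var[sb[tid].n] = var;
        sb[tid].val[sb[tid].n] = val;
        sb[tid].n++;
    }

    static int load(int tid, int var) {   /* forward from own buffer first */
        for (int i = sb[tid].n - 1; i >= 0; i--)
            if (sb[tid].var[i] == var)
                return sb[tid].val[i];
        return mem[var];
    }

    static void flush(int tid) {          /* drain the buffer to memory  */
        for (int i = 0; i < sb[tid].n; i++)
            mem[sb[tid].var[i]] = sb[tid].val[i];
        sb[tid].n = 0;
    }

    int main(void) {
        /* One relaxed schedule: both stores stay buffered until after
         * both loads have executed. */
        store(0, 0, 1);                   /* thread 0: x = 1 (buffered)   */
        store(1, 1, 1);                   /* thread 1: y = 1 (buffered)   */
        int r0 = load(0, 1);              /* thread 0: r0 = y -> reads 0  */
        int r1 = load(1, 0);              /* thread 1: r1 = x -> reads 0  */
        flush(0);
        flush(1);
        printf("r0 = %d, r1 = %d (both 0 is impossible under SC)\n", r0, r1);
        return 0;
    }

Relaxer's second phase combines this kind of conservative memory-model simulation with a biased random scheduler, so that the specific races reported in the first phase are driven, with high probability, to a concrete sequential consistency violation rather than being left as warnings.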

We have implemented Relaxer for C and have evaluated it on several synchronization algorithms, concurrent data structures, and parallel applications. Relaxer generates many executions of these benchmarks with violations of sequential consistency, highlighting a number of bugs under relaxed memory models.

Published in

ISSTA '11: Proceedings of the 2011 International Symposium on Software Testing and Analysis
July 2011, 394 pages
ISBN: 9781450305624
DOI: 10.1145/2001420
Copyright © 2011 ACM

Publisher: Association for Computing Machinery, New York, NY, United States
