DOI: 10.1145/1453101.1453131
research-article

Differential symbolic execution

Published: 9 November 2008

ABSTRACT

Detecting and characterizing the effects of software changes is a fundamental component of software maintenance. Version differencing information can be used to perform version merging, infer change characteristics, produce program documentation, and guide program re-validation. Existing techniques for characterizing code changes, however, are imprecise, leading to unnecessary maintenance effort.

In this paper, we introduce a novel extension and application of symbolic execution techniques that computes a precise behavioral characterization of a program change. This technique, which we call differential symbolic execution (DSE), exploits the fact that program versions are largely similar to reduce cost and improve the quality of analysis results. We define the foundational concepts of DSE, describe cost-effective tool support for DSE, and illustrate its potential benefit through an exploratory study that considers version histories of two Java code bases.
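The core idea behind comparing two largely similar program versions can be illustrated with a minimal sketch. Everything below is hypothetical (the two toy versions and all helper names are invented for illustration, not taken from the paper): each version is summarized as a list of (path condition, effect) pairs, and overlapping paths are checked for agreement. A real differential symbolic execution implementation derives these summaries by symbolic execution and discharges equivalence queries to a constraint solver; here a brute-force check over a small concrete domain stands in for the solver.

```python
# Hypothetical sketch of behavioral differencing between two program versions.
# Each version is summarized as (path condition, effect) pairs, as a stand-in
# for symbolic-execution summaries.

def summarize_v1():
    # version 1: an abs-like function
    return [
        (lambda x: x >= 0, lambda x: x),
        (lambda x: x < 0,  lambda x: -x),
    ]

def summarize_v2():
    # version 2: the branch boundary moved (x > 0 instead of x >= 0)
    return [
        (lambda x: x > 0,  lambda x: x),
        (lambda x: x <= 0, lambda x: -x),
    ]

def behavioral_diff(paths1, paths2, domain):
    """Return inputs where overlapping paths of the two versions disagree.

    The exhaustive loop over `domain` is a toy substitute for asking an
    SMT solver whether (c1 and c2 and e1 != e2) is satisfiable.
    """
    diffs = set()
    for c1, e1 in paths1:
        for c2, e2 in paths2:
            for x in domain:
                if c1(x) and c2(x) and e1(x) != e2(x):
                    diffs.add(x)
    return sorted(diffs)

print(behavioral_diff(summarize_v1(), summarize_v2(), range(-3, 4)))
# → [] : the edit is behavior-preserving on this domain (at x == 0, both
#        versions return 0), so the versions are behaviorally equivalent here
```

In this toy case the syntactic change (moving the branch boundary) produces no behavioral difference, which is exactly the kind of precise characterization a symbolic comparison can provide where a textual diff would report a change.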



Published in

SIGSOFT '08/FSE-16: Proceedings of the 16th ACM SIGSOFT International Symposium on Foundations of Software Engineering
November 2008 · 369 pages
ISBN: 9781595939951
DOI: 10.1145/1453101
Copyright © 2008 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall Acceptance Rate: 17 of 128 submissions, 13%
