DOI: 10.1145/581339.581397

Visualization of test information to assist fault localization

Published: 19 May 2002

ABSTRACT

One of the most expensive and time-consuming components of the debugging process is locating the errors or faults. To locate faults, developers must identify statements involved in failures and select suspicious statements that might contain faults. This paper presents a new technique that uses visualization to assist with these tasks. The technique uses color to visually map the participation of each program statement in the outcome of the execution of the program with a test suite, consisting of both passed and failed test cases. Based on this visual mapping, a user can inspect the statements in the program, identify statements involved in failures, and locate potentially faulty statements. The paper also describes a prototype tool that implements our technique along with a set of empirical studies that use the tool for evaluation of the technique. The empirical studies show that, for the subject we studied, the technique can be effective in helping a user locate faults in a program.
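The color mapping the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Tarantula-style hue computation in which a statement executed mostly by failed test cases is rendered toward red (suspicious) and one executed mostly by passed test cases toward green; the coverage data below is hypothetical.

```python
# Sketch of the visual mapping described in the abstract. For each
# statement, a hue in [0, 1] is computed from the fraction of passed
# and failed test cases that execute it: 0 = red (suspicious),
# 1 = green (safe). Coverage counts here are hypothetical.

def statement_hue(passed_cov, failed_cov, total_passed, total_failed):
    """Hue for one statement, or None if no test executes it."""
    if passed_cov == 0 and failed_cov == 0:
        return None  # never executed: left uncolored in the display
    pass_ratio = passed_cov / total_passed if total_passed else 0.0
    fail_ratio = failed_cov / total_failed if total_failed else 0.0
    # Redder as the failed-test share of executions grows.
    return pass_ratio / (pass_ratio + fail_ratio)

# Example: four statements covered by 10 passed and 2 failed tests.
coverage = {
    "s1": (10, 2),  # executed by every test: mixed hue (0.5)
    "s2": (0, 2),   # only failed tests execute it: red (0.0)
    "s3": (10, 0),  # only passed tests execute it: green (1.0)
    "s4": (0, 0),   # dead under this suite: uncolored
}
for stmt, (p, f) in coverage.items():
    print(stmt, statement_hue(p, f, 10, 2))
```

Under this mapping, a user scanning the display would inspect the reddest statements first, which is the inspection order the abstract's empirical studies evaluate.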

References

  1. xSlice: A tool for program debugging. http://xsuds.argreenhouse.com/html-man/coverpage.html.
  2. H. Agrawal, J. Horgan, S. London, and W. Wong. Fault localization using execution slices and dataflow tests. In Proceedings of IEEE Software Reliability Engineering, pages 143-151, 1995.
  3. T. Ball and S. G. Eick. Software visualization in the large. Computer, 29(4):33-43, April 1996.
  4. J. S. Collofello and S. N. Woodfield. Evaluating the effectiveness of reliability-assurance techniques. Journal of Systems and Software, 9(3):191-195, 1989.
  5. J. Eagan, M. J. Harrold, J. Jones, and J. Stasko. Technical note: Visually encoding program test information to find faults in software. In Proceedings of IEEE Information Visualization, pages 33-36, October 2001.
  6. S. G. Eick, J. L. Steffen, and E. E. Sumner, Jr. Seesoft: a tool for visualizing line oriented software statistics. IEEE Transactions on Software Engineering, 18(11):957-968, November 1992.
  7. S. Elbaum, A. Malishevsky, and G. Rothermel. Prioritizing test cases for regression testing. In Proceedings of the ACM International Symposium on Software Testing and Analysis, pages 102-112, August 2000.
  8. M. J. Harrold, J. Jones, T. Li, D. Liang, A. Orso, M. Pennings, S. Sinha, S. Spoon, and A. Gujarathi. Regression test selection for Java software. In Proceedings of the ACM Conference on Object-Oriented Programming, Systems, Languages, and Applications, pages 312-326, October 2001.
  9. J. Jones and M. J. Harrold. Test-suite reduction and prioritization for modified condition/decision coverage. In Proceedings of the International Conference on Software Maintenance, pages 92-101, November 2001.
  10. H. Pan, R. A. DeMillo, and E. H. Spafford. Failure and fault analysis for software debugging. In Proceedings of COMPSAC 97, pages 515-521, Washington, D.C., August 1997.
  11. G. Rothermel and M. J. Harrold. A safe, efficient regression test selection technique. ACM Transactions on Software Engineering and Methodology, 6(2):173-210, April 1997.
  12. G. Rothermel, M. J. Harrold, J. Ostrin, and C. Hong. An empirical study of the effects of minimization on the fault detection capabilities of test suites. In Proceedings of the International Conference on Software Maintenance, November 1998.
  13. G. Rothermel, R. Untch, C. Chu, and M. J. Harrold. Prioritizing test cases for regression testing. IEEE Transactions on Software Engineering, 27(10):929-948, October 2001.
  14. J. Stasko, J. Domingue, M. Brown, and B. Price, editors. Software Visualization: Programming as a Multimedia Experience. MIT Press, Cambridge, MA, 1998.
  15. Telcordia Technologies, Inc. xATAC: A tool for improving testing effectiveness. http://xsuds.argreenhouse.com/html-man/coverpage.html.
  16. I. Vessey. Expertise in debugging computer programs: A process analysis. International Journal of Man-Machine Studies, 23(5):459-494, 1985.
  17. F. Vokolos and P. Frankl. Empirical evaluation of the textual differencing regression testing techniques. In Proceedings of the International Conference on Software Maintenance, November 1998.

Published in

ICSE '02: Proceedings of the 24th International Conference on Software Engineering
May 2002, 797 pages
ISBN: 1-58113-472-X
DOI: 10.1145/581339

Copyright © 2002 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ICSE '02 paper acceptance rate: 45 of 303 submissions, 15%. Overall acceptance rate: 276 of 1,856 submissions, 15%.
