Research article
DOI: 10.1145/2493394.2493406

Effective and ineffective software testing behaviors by novice programmers

Published: 12 August 2013

ABSTRACT

This data-driven paper quantitatively evaluates software testing behaviors that students exhibited in introductory computer science courses. The evaluation includes data collected over five years (10 semesters) from 49,980 programming assignment submissions by 883 different students. To examine the effectiveness of software testing behaviors, we investigate the quality of students' testing at different stages of their development. We partition testing behaviors into four groups according to when in development students first achieve substantial (at least 85%) test coverage.
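The abstract does not spell out the grouping procedure, but the classification step it describes is straightforward to make concrete. The following is a minimal illustrative sketch, assuming hypothetical Snapshot records (elapsed time and student-written-test coverage per intermediate submission, as a system like Web-CAT might collect) and assuming, purely for illustration, that the four groups correspond to the quarter of the development period in which coverage first reaches 85%. The Snapshot type and coverage_group function are invented names, not the paper's actual method.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Snapshot:
    """One intermediate submission snapshot of a student's assignment (hypothetical record)."""
    elapsed: float   # seconds since the student's first snapshot
    coverage: float  # coverage achieved by the student's own tests, 0.0-1.0

def coverage_group(snapshots: List[Snapshot], threshold: float = 0.85) -> Optional[int]:
    """Classify a chronologically ordered snapshot history by which quarter
    of the development period first reaches `threshold` coverage.

    Returns 1-4 for the quarter, or None if the threshold is never reached.
    (Illustrative partition only; the paper's exact criteria may differ.)
    """
    if not snapshots:
        return None
    total = snapshots[-1].elapsed or 1.0  # avoid division by zero for a lone snapshot
    for snap in snapshots:
        if snap.coverage >= threshold:
            quarter = int((snap.elapsed / total) * 4) + 1
            return min(quarter, 4)  # the final snapshot maps to quarter 4, not 5
    return None

# A student who crosses 85% coverage a third of the way into development:
history = [Snapshot(0, 0.10), Snapshot(3000, 0.90), Snapshot(9000, 0.95)]
print(coverage_group(history))  # -> 2
```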

The study reveals significant results regarding effective and ineffective testing behaviors. A within-subjects comparison finds that higher test coverage early in development is associated with higher-quality code and with completing work earlier. Post-hoc analysis also suggests that the relationship between early testing and these positive outcomes is independent of time management and of individual ability. However, roughly 76% of students exhibit different testing behaviors on different assignments, suggesting an opportunity to foster better, more consistent testing habits among computer science students.


Published in

ICER '13: Proceedings of the Ninth Annual International ACM Conference on International Computing Education Research
August 2013, 202 pages
ISBN: 9781450322430
DOI: 10.1145/2493394

Copyright © 2013 ACM. Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ICER '13 paper acceptance rate: 22 of 70 submissions (31%)
Overall acceptance rate: 189 of 803 submissions (24%)
