DOI: 10.1145/2652524.2652533
ESEM Conference Proceedings · Research article

Discovering buffer overflow vulnerabilities in the wild: an empirical study

Published: 18 September 2014

ABSTRACT

Context: Reporters of security vulnerabilities possess rich information about the security engineering process. Goal: We performed an empirical study of reporters of buffer overflow vulnerabilities to understand the methods and tools they used during discovery. Method: We ran the study as an email questionnaire with open-ended questions. The participants were reporters featured in the SecurityFocus repository during two six-month periods; we collected 58 responses. Results: We found that, in spite of many apparent choices, reporters follow similar approaches. Most reporters typically use fuzzing, but their fuzzing tools are created ad hoc; they use a few debugging tools to analyze the crashes triggered by a fuzzer; and static analysis tools are rarely used. We also found a serious problem in the vulnerability reporting process. Most reporters, especially experienced ones, favor full disclosure and do not collaborate with the vendors of vulnerable software. They believe that public disclosure, sometimes accompanied by a detailed exploit, will put pressure on vendors to fix the vulnerabilities. In practice, however, vulnerabilities not reported to vendors are less likely to be fixed. Conclusions: The results are valuable for beginners exploring how to detect and report buffer overflows, and for tool vendors and researchers exploring how to automate and improve the process.
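The ad-hoc fuzzing workflow the abstract describes — feed a target program random inputs and keep the ones that crash it, then hand the crashing inputs to a debugger — can be sketched roughly as follows. This is an illustrative sketch, not a tool from the study; the target command and iteration count are placeholders.

```python
import random
import string
import subprocess

def random_payload(max_len=4096):
    """Generate a random printable byte string of up to max_len characters."""
    n = random.randint(1, max_len)
    return "".join(random.choice(string.printable) for _ in range(n)).encode()

def fuzz(target_cmd, iterations=100):
    """Run the target repeatedly with random stdin; collect crashing payloads.

    On POSIX, a negative returncode means the process was killed by a
    signal (e.g. -11 for SIGSEGV), which is how a buffer overflow
    typically surfaces during naive fuzzing.
    """
    crashes = []
    for _ in range(iterations):
        payload = random_payload()
        proc = subprocess.run(target_cmd, input=payload, capture_output=True)
        if proc.returncode < 0:  # terminated by a signal
            crashes.append((payload, proc.returncode))
    return crashes

# Placeholder target: replace ["./parser"] with the binary under test.
# Each saved payload would then be replayed under a debugger (e.g. gdb)
# to analyze the crash, matching the workflow the study reports.
```

Such throwaway scripts match the study's finding that reporters rarely reuse off-the-shelf fuzzers: the input generator is usually tuned by hand to the target's input format.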



• Published in

  ESEM '14: Proceedings of the 8th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement
  September 2014, 461 pages
  ISBN: 9781450327749
  DOI: 10.1145/2652524
  Copyright © 2014 ACM


  Publisher: Association for Computing Machinery, New York, NY, United States



  Acceptance Rates

  ESEM '14 paper acceptance rate: 23 of 123 submissions (19%). Overall acceptance rate: 130 of 594 submissions (22%).
