ABSTRACT
Context: Reporters of security vulnerabilities possess rich information about the security engineering process. Goal: We performed an empirical study on reporters of buffer overflow vulnerabilities to understand the methods and tools they used during discovery. Method: We ran the study as an email questionnaire with open-ended questions. The participants were reporters featured in the SecurityFocus repository during two six-month periods; we collected 58 responses. Results: We found that, despite many apparent choices, reporters follow similar approaches. Most reporters use fuzzing, but their fuzzing tools are created ad hoc; they use a few debugging tools to analyze the crashes a fuzzer triggers; and they rarely use static analysis tools. We also found a serious problem in the vulnerability reporting process: most reporters, especially experienced ones, favor full disclosure and do not collaborate with the vendors of the vulnerable software. They believe that public disclosure, sometimes accompanied by a detailed exploit, pressures vendors to fix the vulnerabilities; in practice, however, vulnerabilities not reported to vendors are less likely to be fixed. Conclusions: The results are valuable for beginners exploring how to detect and report buffer overflows, and for tool vendors and researchers exploring how to automate and improve the process.
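The "ad hoc fuzzing" pattern the abstract describes can be illustrated with a minimal sketch: randomly mutate a seed input, feed it to the target, and keep the first input that causes a crash. This sketch is not from the study itself; `toy_target` is a hypothetical stand-in for a vulnerable program (it raises on inputs longer than a fixed "buffer"), and in practice reporters would run a real binary and watch for a crash signal instead.

```python
import random

def mutate(seed: bytes, max_growth: int = 64) -> bytes:
    """Ad hoc mutation: flip a few random bytes and sometimes lengthen the input."""
    data = bytearray(seed)
    for _ in range(random.randint(1, 8)):
        if data:
            data[random.randrange(len(data))] = random.randrange(256)
    # Lengthening the input is what probes fixed-size buffers.
    data += bytes(random.randrange(256) for _ in range(random.randrange(max_growth)))
    return bytes(data)

def toy_target(data: bytes) -> None:
    """Hypothetical vulnerable program: 'overflows' on inputs over 32 bytes."""
    buffer_size = 32
    if len(data) > buffer_size:
        raise OverflowError("buffer overflow at %d bytes" % len(data))

def fuzz(target, seed: bytes, iterations: int = 10_000):
    """Run mutated inputs against the target; return the first crashing input."""
    random.seed(0)  # deterministic so the sketch is repeatable
    for _ in range(iterations):
        candidate = mutate(seed)
        try:
            target(candidate)
        except Exception:
            return candidate  # a 'crash': the raw material for a report
    return None

crasher = fuzz(toy_target, b"GET / HTTP/1.0")
```

The crashing input found this way is what reporters then hand to a debugger to determine whether the crash is exploitable, which matches the fuzz-then-debug workflow the study observed.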
Discovering buffer overflow vulnerabilities in the wild: an empirical study