Software Quality Assurance: In Large Scale and Complex Software-intensive Systems
November 2015
Publisher:
  • Morgan Kaufmann Publishers Inc.
  • 340 Pine Street, Sixth Floor, San Francisco, CA, United States
ISBN: 978-0-12-802301-3
Published: 02 November 2015
Pages: 416
Abstract

Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel, high-quality research approaches that relate the quality of software architecture to system requirements, system and enterprise architecture, and software testing. Modern software has become complex and adaptable due to globalization and the emergence of new software technologies, devices, and networks. These changes challenge both traditional software quality assurance techniques and software engineers to ensure software quality when building today's (and tomorrow's) adaptive, context-sensitive, and highly diverse applications. This edited volume presents state-of-the-art techniques, methodologies, tools, best practices, and guidelines for software quality assurance, and offers guidance for future software engineering research and practice. Each contributed chapter considers the practical application of its topic through case studies, experiments, empirical validation, or systematic comparisons with other approaches already in practice. Topics of interest include, but are not limited to: quality attributes of system/software architectures; aligning enterprise, system, and software architecture from the point of view of total quality; design decisions and their influence on the quality of system/software architecture; methods and processes for evaluating architecture quality; quality assessment of legacy systems and third-party applications; lessons learned and empirical validation of theories and frameworks on architectural quality; and empirical validation and testing for assessing architecture quality.

  • Focused on quality assurance at all levels of software design and development
  • Covers domain-specific software quality assurance issues, e.g., for cloud, mobile, security, context-sensitive, mash-up, and autonomic systems
  • Explains likely trade-offs from design decisions in the context of complex software system engineering and quality assurance
  • Includes practical case studies of software quality assurance for complex, adaptive, and context-critical systems

  303. Banker, R.D., Datar, S.M., Kemerer, C.F., 1991. A model to evaluate variables impacting the productivity of software maintenance projects. Manag. Sci. 37 (1), 1-18. Google ScholarGoogle ScholarDigital LibraryDigital Library
  304. Bass, L., Clements, P., Kazman, R., 2012. Software Architecture in Practice. Addison-Wesley, Upper Saddle River, NJ, USA. Google ScholarGoogle Scholar
  305. Benbasat, I., Goldstein, D.K., Mead, M., 1987. The case research strategy in studies of information systems. MIS Q. 11 (3), 369-386. Google ScholarGoogle ScholarDigital LibraryDigital Library
  306. Bennett, K.H., Rajlich, V.T., 2000. Software maintenance and evolution: a roadmap. In: Proceedings of the Conference on The Future of Software Engineering. Limerick, Ireland, pp. 73-87. Google ScholarGoogle ScholarDigital LibraryDigital Library
  307. Bhattacharya, P., Iliofotou, M., Neamtiu, I., Faloutsos, M., 2012. Graph-based analysis and prediction for software evolution. In: Proceedings of the 34th International Conference on Software Engineering. Zurich, Switzerland. IEEE Press, pp. 419-429. Google ScholarGoogle Scholar
  308. Black, S., 2001. Computing ripple effect for software maintenance. J. Softw. Maintenance Evol. Res. Pract. 13 (4), 263-279. Google ScholarGoogle ScholarDigital LibraryDigital Library
  309. Buckley, J., LeGear, A.P., Exton, C., Cadogan, R., Johnston, T., Looby, B., et al., 2008. Encapsulating targeted component abstractions using software Reflexion Modelling. J. Softw. Maintenance Evol. Res. Pract. 20 (2), 107-134. Google ScholarGoogle ScholarDigital LibraryDigital Library
  310. Chidamber, S.R., Kemerer, C.F., 1994. A metrics suite for object oriented design. IEEE Trans. Softw. Eng. 20 (6), 476-493. Google ScholarGoogle ScholarDigital LibraryDigital Library
  311. DeMarco, T., 1986. Controlling Software Projects: Management, Measurement, and Estimates. Prentice Hall PTR, Upper Saddle River, NJ, USA. Google ScholarGoogle Scholar
  312. Ducasse, S., Anquetil, N., Bhatti, M.U., Cavalcante-Hora, A., 2011. Software Metrics for Package Remodularisation. [Research Report] 2011. ¿hal-00646878¿.Google ScholarGoogle Scholar
  313. Ejiogu, L.O., 1991. Software Engineering with Formal Metrics. QED Information Sciences, Inc. Google ScholarGoogle Scholar
  314. English, M., Buckley, J., Cahill, T., 2010. A replicated and refined empirical study of the use of friends in C11 software. J. Syst. Softw. 83 (11), 2275-2286. Google ScholarGoogle ScholarDigital LibraryDigital Library
  315. English, M., Cahill, T., Buckley, J., 2012. Construct specific coupling measurement for C++ software. Comput. Lang. Syst. Struct. 38 (4), 300-319. Google ScholarGoogle ScholarDigital LibraryDigital Library
  316. Fenton, N.E., Neil, M., 2000. Software metrics: roadmap. In: Proceedings of the Conference on The Future of Software Engineering. ACM, Limerick, Ireland, pp. 357-370. Google ScholarGoogle ScholarDigital LibraryDigital Library
  317. Fenton, N.E., Pfleeger, S.L., 1998. Software Metrics: A Rigorous and Practical Approach. PWS Publishing Co, Boston, MA, USA. Google ScholarGoogle ScholarDigital LibraryDigital Library
  318. Gaffney, J.E. Jr. 1981. Metrics in software quality assurance. In: Proceedings of the ACM'81 Conference. B. Levy, pp. 126-130. Google ScholarGoogle Scholar
  319. Gamma, E., Helm, R., Johnson, R., Vlissides, J., 1995. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley, Longman Publishing Co., Inc., Boston, MA, USA. Google ScholarGoogle ScholarDigital LibraryDigital Library
  320. Harrison, R., Counsell, S.J., Nithi, R.V., 1998. An evaluation of the MOOD set of object-oriented software metrics. IEEE Trans. Softw. Eng. 24 (6), 491-496. Google ScholarGoogle ScholarDigital LibraryDigital Library
  321. IEEE, 1983. IEEE Std. 729-1983 Standard Glossary of Software Engineering Terminology (ANSI).Google ScholarGoogle Scholar
  322. ISO/IEC, 2010. ISO/IEC 25010--Systems and Software Engineering--Systems and Software Quality Requirements and Evaluation (SQuaRE)--System and Software Quality Models, ISO/IEC.Google ScholarGoogle Scholar
  323. Jordan, H., Rosik, J., Herold, S., Botterweck, G., Buckley, J., 2015. Manually Locating Features in Industrial Source Code: The Search Actions of Software Nomads. In: Lucia, A.D., Bird, C., Oliveto, R. (Eds.), International Conference on Program Comprehension. IEEE, Florence, Italy. Google ScholarGoogle Scholar
  324. Kemerer, C., 1995. Software complexity and software maintenance: a survey of empirical research. Ann. Softw. Eng. 1 (1), 1-22.Google ScholarGoogle ScholarCross RefCross Ref
  325. Kemerer, C., Slaughter, S., 1997. Methodologies for performing empirical studies: report from the international workshop on empirical studies of software maintenance. Empir. Softw. Eng. 2 (2), 109-118. Google ScholarGoogle ScholarCross RefCross Ref
  326. Klein, H.K., Myers, M.D., 1999. A set of principles for conducting and evaluating interpretive field studies in information systems. MIS Q. 23 (1), 67-93. Google ScholarGoogle ScholarDigital LibraryDigital Library
  327. Lehman, M.M., 1980. Programs, life cycles, and laws of software evolution. Proc. IEEE 68 (9), 1060-1076.Google ScholarGoogle ScholarCross RefCross Ref
  328. Lethbridge, T., Sim, S., Singer, J., 2005. Studying software engineers: data collection techniques for software field studies. Empir. Softw. Eng. 10 (3), 311-341. Google ScholarGoogle ScholarDigital LibraryDigital Library
  329. Martin, R., 2005. The Tipping Point: Stability and Instability in OO Design. Available from: ¿http://www.drdobbs.com/the-tipping-point-stability-and-instabil/184415285¿ (retrieved 10.03.15.).Google ScholarGoogle Scholar
  330. Martin, R.C., 2000. Design Principles and Design Patterns. Available from: ¿http://www.objectmentor.com/resources/articles/Principles_and_Patterns.pdf¿ (retrieved 10.03.15).Google ScholarGoogle Scholar
  331. Mitchell, B.S., Mancoridis, S., 2006. On the automatic modularization of software systems using the bunch tool. IEEE Trans. Softw. Eng. 32 (3), 193-208. Google ScholarGoogle ScholarDigital LibraryDigital Library
  332. Olague, H.M., Etzkorn, L.H., Messimer, S.L., Delugach, H.S., 2008. An empirical validation of object-oriented class complexity metrics and their ability to predict error-prone classes in highly iterative, or agile, software: a case study. J. Softw. Maintenance Evol. Res. Pract. 20 (3), 171-197. Google ScholarGoogle ScholarDigital LibraryDigital Library
  333. Parnas, D.L., 1972. On the criteria to be used in decomposing systems into modules. Commun. ACM 15 (12), 1053-1058. Google ScholarGoogle ScholarDigital LibraryDigital Library
  334. Parnas, D.L., 1977. Use of Abstract Interfaces in the Development of Software for Embedded Computer Systems, p. 34.Google ScholarGoogle Scholar
  335. Ponisio, L., Nierstrasz, O., 2006. Using Context Information to Re-architect a System. In: SMEF'06, Software Measurement European Forum, pp. 91-103.Google ScholarGoogle Scholar
  336. Rahman, F., Devanbu, P., 2013. How, and why, process metrics are better. In: Proceedings of the 2013 International Conference on Software Engineering. San Francisco, CA. IEEE Press, pp. 432-441. Google ScholarGoogle Scholar
  337. Robson, C., 2002. Real World Research. Blackwell.Google ScholarGoogle Scholar
  338. Runeson, P., Höst, M., 2009. Guidelines for conducting and reporting case study research in software engineering. Empir. Softw. Eng. 14 (2), 131-164. Google ScholarGoogle ScholarDigital LibraryDigital Library
  339. Sant'Anna, C., Figueiredo, E., Garcia, A., Lucena, C.P., 2007. On the modularity of software architectures: a concern-driven measurement framework. In: Software Architecture. F. Oquendo. Springer, Berlin Heidelberg, vol. 4758, pp. 207-224. Google ScholarGoogle Scholar
  340. Sarkar, S., Kak, A.C., Rama, G.M., 2008. Metrics for measuring the quality of modularization of large-scaled object-oriented software. IEEE Trans. Softw. Eng. 34 (5), 700-720. Google ScholarGoogle ScholarCross RefCross Ref
  341. Sarkar, S., Rama, G.M., Kak, A.C., 2007. API-based and information-theoretic metrics for measuring the quality of software modularization. IEEE Trans. Softw. Eng. 33 (1), 14-32. Google ScholarGoogle ScholarDigital LibraryDigital Library
  342. Schneidewind, N.F., 1987. The state of software maintenance. IEEE Trans. Softw. Eng. 13 (3), 303-310. Google ScholarGoogle ScholarDigital LibraryDigital Library
  343. Scitools, 2015. Understand. Available from: ¿https://scitools.com/¿ (retrieved 10.03.15.).Google ScholarGoogle Scholar
  344. Sharif, K.Y., English, M., Ali, N., Exton, C., Collins, J.J., Buckley, J., 2015. An empirically-based characterization and quantification of information seeking through mailing lists during Open Source developers' software evolution. Inf. Softw. Technol. 57 (0), 77-94.Google ScholarGoogle ScholarCross RefCross Ref
  345. Slaughter, S.A., Harter, D.E., Krishnan, M.S., 1998. Evaluating the cost of software quality. Commun. ACM 41, 67-73. Google ScholarGoogle ScholarDigital LibraryDigital Library
  346. Stake, R.E., Savolainen, R., 1995. The Art of Case Study Research. Sage Publications, Thousand Oaks, CA.Google ScholarGoogle Scholar
  347. Stevens, W.P., Myers, G.J., Constantine, L.L., 1974. Structured design. IBM Syst. J. 13 (2), 115-139. Google ScholarGoogle ScholarDigital LibraryDigital Library
  348. Szyperski, C., Gruntz, D., Murer, S., 2002. Component Software: Beyond Object-Oriented Programming. Addison-Wesley, Longman Publishing Co., Inc., Boston, MA, USA. Google ScholarGoogle ScholarDigital LibraryDigital Library
  349. Misic, V.B., 2001. Cohesion is structural, coherence is functional: different views, different measures. In: Software Metrics Symposium, 2001 (METRICS'01). Proceedings. Seventh International. Google ScholarGoogle ScholarCross RefCross Ref
  350. Yin, R.K., 2013. Case Study Research: Design and Methods. SAGE Publications, Thousand Oaks, CA.Google ScholarGoogle Scholar
  351. Yourdon, E., Constantine, L.L., 1979. Structured Design. Prentice-Hall, Englewood Cliffs, NJ.Google ScholarGoogle Scholar
  352. Zelkowitz, M.V., Wallace, D.R., 1998. Experimental models for validating technology. Computer 31 (5), 23-31. Google ScholarGoogle ScholarDigital LibraryDigital Library
  353. Alexander, C., Ishikawa, S., Silverstein, M., 1977. A Pattern Language: Towns, Buildings, Construction, vol. 2. Oxford University Press.Google ScholarGoogle Scholar
  354. Alur, D., Malks, D., Crupi, J., 2001. Core J2EE Patterns: Best Practices and Design Strategies. Prentice Hall PTR, Upper Saddle River, NJ. Google ScholarGoogle Scholar
  355. Aniche, M., Ferreira, T., Gerosa, M., 2011. What concerns beginner test-driven development practitioners: a qualitative analysis of opinions in an Agile conference. 2nd Brazilian Workshop on Agile Methods (WBMA), Fortaleza, Brazil.Google ScholarGoogle Scholar
  356. Aniche, M.F., Gerosa, M.A., 2012. How the practice of TDD influences class design in object-oriented systems: patterns of unit tests feedback. In: Software Engineering (SBES), 2012 26th Brazilian Symposium on, IEEE. pp. 1-10. Google ScholarGoogle Scholar
  357. Astels, D., 2003. Test-Driven Development: A Practical Guide, segunda ed. Prentice Hall. Google ScholarGoogle Scholar
  358. Beck, K., 2002. Test-Driven Development by Example, first ed. Addison-Wesley Professional. Google ScholarGoogle Scholar
  359. Beck, K., 2004. Extreme Programming Explained, second ed. Addison-Wesley Professional. Google ScholarGoogle Scholar
  360. Dogsa, T., Batic, D., 2011. The effectiveness of test-driven development: an industrial case study. Softw. Qual. J.1-19, ¿http://dx.doi.org/10.1007/s11219-011-9130-2¿. Google ScholarGoogle Scholar
  361. e Nat Pryce, S.F., 2009. Growing Object-Oriented Software, Guided by Tests, 1st ed. Addison-Wesley Professional. Google ScholarGoogle Scholar
  362. Erdogmus, H., Morisio, M., Torchiano, M., 2005. On the effectiveness of the test-first approach to programming. IEEE Trans. Softw. Eng. 31, 226-237, ¿http://doi.ieeecomputersociety.org/10.1109/TSE.2005.37¿. Google ScholarGoogle ScholarDigital LibraryDigital Library
  363. Evans, 2003. Domain-Driven Design: Tacking Complexity in the Heart of Software. Addison-Wesley Longman Publishing Co., Inc., Boston, MA. Google ScholarGoogle Scholar
  364. Fairbanks, G., 2010. Just Enough Software Architecture: A Risk-Driven Approach. Marshall & Brainerd.Google ScholarGoogle Scholar
  365. Fowler, M., 1999. Refactoring: Improving the Design of Existing Code. Addison-Wesley Longman Publishing Co., Inc., Boston, MA. Google ScholarGoogle ScholarDigital LibraryDigital Library
  366. Fowler, M., 2002. Patterns of Enterprise Application Architecture. Addison-Wesley Longman Publishing Co., Inc., Boston, MA. Google ScholarGoogle Scholar
  367. Fowler, M., 2007. Mocks aren't stubs. ¿http://martinfowler.com/articles/mocksArentStubs¿ (last accessed 26.11.14.).Google ScholarGoogle Scholar
  368. Freeman, S., Mackinnon, T., Pryce, N., Walnes, J., 2004. Mock roles, objects. In: Companion to the 19th Annual ACM SIG-PLAN Conference on Object-oriented Programming Systems, Languages, and Applications. ACM, New York, NY, pp. 236-246. ¿http://dx.doi.org/10.1145/1028664.1028765¿. Google ScholarGoogle Scholar
  369. Gamma, E., Helm, R., Johnson, R., Vlissides, J., 1995. Design Patterns: Elements of Reusable Object-oriented Software. Addison-Wesley Longman Publishing Co., Inc., Boston, MA. Google ScholarGoogle ScholarDigital LibraryDigital Library
  370. George, B., Williams, L., 2003. An initial investigation of test driven development in industry. In: Proceedings of the 2003 ACM Symposium on Applied Computing. ACM, New York, NY, pp. 1135-1139. ¿http://doi.acm.org/10.1145/952532.952753¿. Google ScholarGoogle Scholar
  371. Guerra, E., 2014. Designing a framework with test-driven development: a journey. Softw. IEEE 31 (1), 9-14. Available from: http://dx.doi.org/10.1109/MS.2014.3. Google ScholarGoogle ScholarDigital LibraryDigital Library
  372. Guerra, E.M., Kinoshita, B., 2012. Patterns for introducing a superclass for test classes. In: Proceedings of the 9th Latin American Conference on Pattern Languages of Programming. ACM, New York, NY. Google ScholarGoogle Scholar
  373. Guerra, E.M., Yoder, J., Aniche, M., Gerosa, M.A., 2013. Test-driven development step patterns for handling objects dependencies. In: Proceedings of the 20th Conference on Pattern Languages of Programs. ACM, New York, NY. Google ScholarGoogle Scholar
  374. Guerra, E.M., Aniche, M., Gerosa, M.A., Yoder, J., 2014. Patterns for preparing for a test driven development session. In: Proceedings of the 21th Conference on Pattern Languages of Programs. ACM, New York, NY. Google ScholarGoogle Scholar
  375. Janzen, D., Saiedian, H., 2006. On the influence of test-driven development on software design. Proceedings of the 19th Conference on Software Engineering Education and Training (CSEET'06). Hawaii, US. pp. 141-148. Google ScholarGoogle Scholar
  376. Janzen, D.S., 2005. Software architecture improvement through test-driven development. In: Companion to the 20th Annual ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications. ACM, New York, NY, pp. 240-241. ¿http://doi.acm.org/10.1145/1094855.1094954¿. Google ScholarGoogle Scholar
  377. Kerievsky, J., 2004. Refactoring to Patterns. Pearson Higher Education. Google ScholarGoogle Scholar
  378. Landre, E., Wesenberg, H., Olmheim, J., 2007. Agile enterprise software development using domain-driven design and test first. In: Companion to the 22nd ACM SIGPLAN Conference on Object-Oriented Programming Systems and Applications Companion. ACM, New York, NY, pp. 983-993. ¿http://dx.doi.org/10.1145/1297846.1297967¿. Google ScholarGoogle Scholar
  379. Langr, J., 2001. Evolution of test and code via test-first design. ¿http://www.objectmentor.com¿ (last accessed 01.03.11.).Google ScholarGoogle Scholar
  380. Lanza, M., Marinescu, R., Ducasse, S., 2005. Object-Oriented Metrics in Practice. Springer-Verlag New York, Inc., Secaucus, NJ. Google ScholarGoogle Scholar
  381. Li, A.L., 2009. Understanding the Efficacy of Test Driven Development. Master's Thesis, Auckland University of Technology.Google ScholarGoogle Scholar
  382. Mackinnon, T., Craig, P., Freeman, S., 2001. Endotesting: unit testing with mock objects. In: Succi, G., Marchesi, M. (Eds.), Extreme Programming Examined. Addison-Wesley Longman Publishing Co., pp. 287-301. Google ScholarGoogle Scholar
  383. Madeyski, L., 2006. The impact of pair programming and test-driven development on package dependencies in object-oriented design--an experiment. In: Munch, J., Vierimaa, M. (Eds.), Product-Focused Software Process Improvement, Lecture Notes in Computer Science, vol. 4034. Springer, Berlin/Heidelberg, pp. 278-289. Google ScholarGoogle Scholar
  384. Martin, R.C., 2002. Agile Software Development, Principles, Patterns, and Practices, primeira ed. Prentice Hall. Google ScholarGoogle Scholar
  385. Merson P. (2013) Ultimate architecture enforcement: custom checks enforced at code-commit time. Hosking A.L., Eugster P.T. SPLASH (Companion Volume), ACM, 153-160, ¿http://dblp.uni-trier.de/db/conf/oopsla/splash2013c.html#Merson13¿. Google ScholarGoogle Scholar
  386. Merson, P., Yoder, J., Guerra, E., Aguiar, A., 2013. Continuous inspection--a pattern for keeping your code healthy and aligned to the architecture. In: Proceedings of the 3rd Asian Conference on Pattern Languages of Programs. ACM, New York, NY.Google ScholarGoogle Scholar
  387. Meszaros, G., 2006. XUnit Test Patterns: Refactoring Test Code. Prentice Hall PTR, Upper Saddle River, NJ. Google ScholarGoogle Scholar
  388. Muller, M., Hagner, O., 2002. Experiment about test-first programming. Softw. IEEE Proc 149 (5), 131-136. Available from: http://dx.doi.org/10.1049/ip-sen:20020540.Google ScholarGoogle ScholarCross RefCross Ref
  389. Siniaalto, M., Abrahamsson, P., 2008. Does test-driven development improve the program code? Alarming results from a comparative case study. Balancing Agility and Formalism in Software Engineering. Springer, Berlin Heidelberg, pp. 143-156. Google ScholarGoogle Scholar
  390. Steinberg, D.H., 2001. The Effect of Unit Tests on Entry Points, Coupling and Cohesion in an Introductory Java Programming Course. XP Universe.Google ScholarGoogle Scholar
  391. Babar, M.A., Zhu, L., Jeffery, R., 2004. A framework for classifying and comparing software architecture evaluation methods. Proc. Australian Software Engineering Conference, 309-318. Google ScholarGoogle Scholar
  392. Browning, T., 2001. Applying the design structure matrix to system decomposition and integration problems: a review and new directions. In: IEEE Trans. on Engineering Management, vol. 48. ACM Press, New York, NY, USA, pp. 292-306.Google ScholarGoogle Scholar
  393. Clements, P., Bachmann, F., Bass, L., Garlan, D., Ivers, J., Little, R., et al., 2010. Documenting Software Architectures: Views and Beyond, second ed. Boston, MA, Addison-Wesley. Google ScholarGoogle Scholar
  394. Danilovic, M., Sandkull, B., 2005. The use of dependency structure matrix and domain mapping matrix in managing uncertainty in multiple project situations. Int. J. Proj. Manage. 3, 193-203.Google ScholarGoogle ScholarCross RefCross Ref
  395. Demirli, E., Tekinerdogan, B., 2011. Software language engineering of architectural viewpoints. In: Proc. of the 5th European Conference on Software Architecture (ECSA 2011), LNCS 6903, pp. 336-343. Google ScholarGoogle ScholarCross RefCross Ref
  396. ISO/IEC 42010:2007, 2011. Recommended Practice for Architectural Description of Software-Intensive Systems (ISO/IEC 42010).Google ScholarGoogle Scholar
  397. Knodel, J., Popescu, D., 2007. A comparison of static architecture compliance checking approaches. In: Proceedings of the 6th Working IEEE/IFIP Conference on Software Architecture, Mumbai, India, p. 12. Google ScholarGoogle Scholar
  398. Koschke, R., Simon, D., 2003. Hierarchical reflexion models. In: Proceedings of the 10th Working Conference on Reverse Engineering, VIC, Canada. Google ScholarGoogle Scholar
  399. Murphy, G., Notkin, D., Sullivan, K., 2001. Software reflexion models: bridging the gap between design and implementation. IEEE Trans. Softw. Eng. 27 (4), 364-380. Google ScholarGoogle ScholarDigital LibraryDigital Library
  400. Rosik, J., Le Gear, A., Buckley, J., Babar, M.A., Connolly, D., 2011. Assessing architectural drift in commercial software development: a case study. Softw. Pract. Exp. 41 (1), 63-86. Google ScholarGoogle ScholarDigital LibraryDigital Library
  401. Sangal, N., Jordan, E., Sinha, V., Jackson, D. 2005. Using Dependency Models to Manage Complex Software Architecture. In: OOPSLA '05, New York, NY, USA, pp. 167-176. Google ScholarGoogle Scholar
  402. Tekinerdogan, B., Demirli, E., 2013. Evaluation framework for software architecture viewpoint languages. In: Proc. of 9th Int. ACM Sigsoft Conference on the Quality of Software Architectures Conference, Vancouver, Canada, June 17-21. Google ScholarGoogle Scholar
  403. Azevedo, L.S., Parker, D., Walker, M., Papadopoulos, Y., Araujo, R.E., 2013. Automatic decomposition of safety integrity levels: optimization by tabu search. In: Workshop CARS (2nd Workshop on Critical Automotive Applications: Robustness & Safety) of the 32nd International Conference on Computer Safety, Reliability and Security, Toulouse, France.Google ScholarGoogle Scholar
  404. Azevedo, L.S., Parker, D., Walker, M., Papadopoulos, Y., Araujo, R.E., 2014. Assisted assignment of automotive safety requirements. IEEE Software 31 (1), 62-68. Available from: http://dx.doi.org/10.1109/MS.2013.118. Google ScholarGoogle ScholarDigital LibraryDigital Library
  405. Bieber, P., Delmas, R., Seguin, C., 2011. DALculus--Theory and tool for development assurance level allocation. In: Proceedings of the 30th International Conference on Computer Safety, Reliability and Security, Naples, Italy. Springer, Berlin, Heidelberg, pp. 43-56. Google ScholarGoogle Scholar
  406. Capelle, T.V., Houtermans, M.J., 2006. Functional Safety: A Practical Approach to End-Users and System Integrators. HIMA Paul Hildebrandt GmbH Co. KG, Germany. Available from: ¿https://www.researchgate.net/publication/228620983_Functional_safety_a_practical_approach_for_end-users_and_system_integrators¿ (retrieved 8.2.14.).Google ScholarGoogle Scholar
  407. EUROCAE, 2010. ED-79A--Guidelines for development of civil aircraft and system. In: EUROCAE (retrieved 2014).Google ScholarGoogle Scholar
  408. Glover, F., 1986. Future paths for integer programming and links to artificial intelligence. Comput. Oper. Res. 13 (5), 533-549. Google ScholarGoogle ScholarDigital LibraryDigital Library
  409. Nordhoff, S., n.d. DO-178C/ED-12C--The New Software Standard for the Avionic Industry: Goals, Changes and Challenges. Available from: ¿www.sqs.com/uk/_download/DO-178C_ED-12C.pdf¿.Google ScholarGoogle Scholar
  410. Papadopoulos, Y., Walker, M., Parker, D., Rude, E., Hamann, R., Uhlig, A., et al., 2011. Engineering failure analysis and design optimisation with HiP-HOPS. Eng. Fail. Anal., 590-608.Google ScholarGoogle ScholarCross RefCross Ref
  411. SC 65-A, 2010. IEC61508--Functional Safety of Electrical/Electronic/Programmable Electronic Safety-Related Systems. International Electrotechnical Commission, Geneva, Switzerland.Google ScholarGoogle Scholar
  412. SC-167, 1992. DO-178B--Software Considerations in Airborne Systems and Equipment Certification, first ed. RTCA Inc.Google ScholarGoogle Scholar
  413. Sharvia, S., Papadopoulos, Y., 2011. IACoB-SA: an approach towards integrated safety assessment. In: IEEE International Conference on Automation Science and Engineering. IEEE, Trieste, Italy, pp. 220-225.Google ScholarGoogle Scholar
  414. S-18, SAE, 2010. ARP4754-A guidelines for development of civil aircraft and systems. SAE Int. Available from: ¿http://standards.sae.org/arp4754a/¿ (retrieved 2013).Google ScholarGoogle Scholar
  415. TC 22/SC3, 2011. ISO 26262--Road Vehicles--Functional Safety. International Organization for Standardization.Google ScholarGoogle Scholar
  416. Adachi, M., Papadopoulos, Y., Sharvia, S., Parker, D., Tohdo, T., 2011. An approach to optimisation of fault tolerant architecture using HiP-HOPS. Softw. Pract. Exp. 41 (11), 1202-1327. Google ScholarGoogle ScholarDigital LibraryDigital Library
  417. Adler, R., Domis, D., Hofig, K., Kemmann, S., Kuhn, T., Schwinn, J., et al. 2011. Integration of component fault trees into the UML. In: Workshops and Symposia at MODELS, pp. 312-327. Google ScholarGoogle Scholar
  418. Aizpurua, J.I., Muxika, E., 2012. Design of dependable systems: an overview of analysis and verification approaches. In: DEPEND'12: Fifth International Conference on Dependability. IARIA, pp. 4-12.Google ScholarGoogle Scholar
  419. Aizpurua, J., Muxika, E., 2013. Model-based design of dependable systems: limitation and evolution of analysis and verification approaches. Int. J. Adv. Sec. 6 (1&2), 12-13.Google ScholarGoogle Scholar
  420. Akerlund, O., Bieber, P., 2006. ISAAC, a framework for integrated safety analysis of functional, geometrical, and human aspects. In: 3rd European Congress on Embedded Real Time System (ERTS), Toulouse, France.Google ScholarGoogle Scholar
  421. Aleti, A., Bjornander, S., Grunske, L., & Meedeniya, I., 2009. ArcheOpterix: an extendable tool for architecture optimization of AADL models. In: MOMPES'09, Vancouver, Canada. Google ScholarGoogle Scholar
  422. Arnold, A., Point, G., Griffault, A., Rauzy, A., 2000. The Altarrica formalism for describing concurrent system. Fundamenta Informaticae 40 (2), 109-124. Google ScholarGoogle Scholar
  423. Azevedo, L., Parker, D., Walker, M., Papadopoulos, Y., Araujo, R., 2013. Assisted assignment of automotive safety requirements. IEEE Softw. 31 (1), 62-68. Google ScholarGoogle ScholarDigital LibraryDigital Library
  424. Batteux, M., Prosvirnova, T., Rauzy, A., Kloul, L., 2013. The AltaRica 3.0 Project for Model-Based Safety Assessment. INDIN. 741-746.Google ScholarGoogle Scholar
  425. Berthomieu, B., Bodeveix, B., Farail, M., Garavel, H., Gaufillet, P., Lang, F., et al. 2008. Fiarce: an intermediate language for model verification in topcased environment. In: ERTS'08.Google ScholarGoogle Scholar
  426. Bieber, P., Castel, C., Seguin, C., 2002. Combination of fault tree analysis and model checking for safety assessment of complex system. In: Proceedings of the 4th European Depting Conference on Dependable Computing, pp. 19-31. Google ScholarGoogle Scholar
  427. Boiteau, M., Dutuit, Y., Rauzy, A., Signoret, J., 2006. The AltarRica dataflow language in use: modeling of production availability of a multi-state system. Reliab. Eng. Syst. Saf. 91 (7), 747-755.Google ScholarGoogle ScholarCross RefCross Ref
  428. Bouissou, M., 2007. A generalization of dynamic fault trees through Boolean Logic Driven Markov Processes (BDMP). In: Proc. ESREL'07, pp. 1051-1058.Google ScholarGoogle Scholar
  429. Bozzano, M., Villafiorita, A., 2003. Improving system reliability via model checking: the FSAP/NuSMV-SA safety analysis platform. In: International Conference on Computer Safety, Reliability, and Security, Edinburgh. pp. 49-62.Google ScholarGoogle Scholar
  430. Bozzano, M., Villafiorita, A., et al., 2003. ESACS: an integrated methodology for design and safety analysis of complex systems. In: ESREL '03.Google ScholarGoogle Scholar
  431. Bozzano, M., Cimatti, A., Katoen, J., Nguyen, V., Noll, T., Roveri, M., 2011. Safety, dependability, and performance analysis of extended AADL models. Comput. J. 54 (5), 754-775. Google ScholarGoogle ScholarDigital LibraryDigital Library
  432. Chen, D.-J., Mahmud, N., Walker, M., Feng, L., Lonn, H., Papadopoulos, Y., 2013. Systems modeling with EAST-ADL for fault tree analysis through HiP-HOPS. In: 4th IFAC Workshop on Dependable Control of Discrete Systems. 4 (1), 91-96.Google ScholarGoogle Scholar
  433. COMPASS, 2013. Correctness, Modeling, and Performance of Aerospace Systems. Retrieved from ¿www.compass.informatik.rwth-aachen.de¿.Google ScholarGoogle Scholar
  434. Distefano, S., Puliafito, A., 2007. Dynamic reliability block diagram VS dynamic fault trees. In: Proceedings of Reliability Availability Maintainability Safety 2007, pp. 71-76. Google ScholarGoogle Scholar
  435. Dugan, J., Bavuso, S., Boyd, M., 1992. Dynamic fault tree models for fault tolerant computer systems. IEEE Trans. Reliabil. 41 (3), 363-377.Google ScholarGoogle ScholarCross RefCross Ref
  436. Edifor, E., Walker, M., Gordon, N., 2012. Quantification of priority-OR gates in temporal fault trees. Comput. Saf. Reliabil. Secur. SE, 99-110. Google ScholarGoogle Scholar
  437. Edifor, E., Walker, M., Gordon, N., Papadopoulos, Y., 2014. Using simulation to evaluate dynamic systems with weibull or lognormal distributions. In: Proceedings of the 9th International Conference on Dependability and Complex Systems, pp. 177-187.Google ScholarGoogle Scholar
  438. ESSaRel, 2005. Embedded Systems Safety and Reliability Analyser. Available from: ¿http://essarel.de¿ (retrieved 3.9.14.).Google ScholarGoogle Scholar
  439. Feiler, P., Rugina, A., 2007. Dependability Modeling with the Architecture Analysis & Design Language (AADL). Tech. Rep. Software Engineering Institute, Carnegie Mellon University, Pittsburgh, US.Google ScholarGoogle Scholar
  440. Feiler, P., Gluch, D., Hudak, J., 2006. The Architecture Analysis & Design Language (AADL): An Introduction. Tech. Rep. Software Engineering Institute, Carnegie Mellon University, Pittsburgh, US.Google ScholarGoogle Scholar
  441. Fenelon, P., McDermid, J., 1993. An integrated toolset for software safety analysis. J. Syst. Softw. 21 (3), 279-290. Google ScholarGoogle ScholarDigital LibraryDigital Library
  442. Fussel, J., Aber, E., Rahl, R., 1976. On the quantitative analysis of Priority-AND failure logic. IEEE Trans. Reliabil R-25 (5), 324-326.Google ScholarGoogle ScholarCross RefCross Ref
  443. Gallina, B., Punnekkat, S., 2014. A formalism for incompletion, inconsistency, interference and impermanence failures' analysis. In: Proceedings of the 37th EUROMICRO Conference on Software Engineering and Advanced Applications, pp. 493-500. Google ScholarGoogle Scholar
  444. Ge, X., Paige, R., McDermid, J., 2009. Probabilistic failure propagation and transformation analysis. In: International Conference on Computer Safety, Reliability, and Security (SAFECOM), pp. 215-228. Google ScholarGoogle Scholar
  445. German, R., Mitzlaff, J., 1995. Transient analysis of deterministic and stochastic Petri Nets with TimeNET. In: Proceedings of the 8th International Conference on Computer Performance Evaluation, Modeling Techniques, and Tools and MMB, pp. 209-223. Google ScholarGoogle Scholar
  446. Grunske, L., Kaiser, B., Papadopoulos, Y., 2005. Model-driven safety evaluation with state-event-based component failure annotations. In: 8th international conference on Component-Based Software Engineering (CBSE'05), pp. 33-48. Google ScholarGoogle Scholar
  447. Güdemann, M., Ortmeier, F., 2010. A framework for qualitative and quantitative formal model-based safety analysis. In: Proceedings of the 12th International Symposium on High-Assurance System Engineering (HASE), pp. 132-141. Google ScholarGoogle Scholar
  448. Güdemann, M., Ortmeier, F., 2011. Towards model-driven safety analysis. In: 3rd International Workshop on Dependable Control of Discrete Systems (DCDS), pp. 53-58.Google ScholarGoogle Scholar
  449. Güdemann, M., Ortmeier, F., Reif, W., 2008. Computation of ordered minimal critical sets. In: Proceedings of the 7th Symposium in Computer Safety, Reliability, and Security.Google ScholarGoogle Scholar
  450. Güdemann, M., Lipaczewski, M., Struck, S., Ortmeier, F., 2012. Unifying Probabilistic and Traditional Formal Model Based Analysis. In: MBEES'12.Google ScholarGoogle Scholar
  451. Helmer, G., Wong, J., Slagell, M., Honavar, V., Miller, L., Wang, Y., Wang, X., Stakhanova, N., 2007. Software fault tree and coloured Petri net -- based specification, design and implementation of agent-based intrusion detection systems. Int. J. Info. Comput. Secur. 1 (1), 109-142. Google ScholarGoogle ScholarDigital LibraryDigital Library
  452. Hura, G., Atwood, J., 1988. The use of Petri Nets to analyze coherent fault trees. IEEE Trans. Reliabil. 37 (5), 469-474.Google ScholarGoogle ScholarCross RefCross Ref
  453. Joshi, A., Vestal, S., Binns, P., 2007. Automatic generation of static fault trees from AADL models. In: DSN Workshop on Architecting Dependable Systems.Google ScholarGoogle Scholar
  454. Kaiser, B., Liggesmeyer, P., Mackel, O., 2003. A new component concept for fault trees. In: Proceedings for the 8th Australian Workshop on Safety Critical Systems and Software (SCS'03). vol. 33, pp. 37-46. Google ScholarGoogle Scholar
  455. Kwiatkowska, M., Norman, G., Parker, D., 2011. PRISM 4.0: verification of probabilistic real-time systems. In: Proceedings of the 23rd International Conference on Computer Aided Verification (CAV'11), pp. 585-591. Google ScholarGoogle Scholar
  456. Lipaczewski, M., Struck, S., Ortmeier, F., 2012. SAML goes eclipse--Combining model-based safety analysis and high-level editor support. In Proceedings of the 2nd International Workshop on Developing Tools as Plug-Ins (TOPI), pp. 67-72. Google ScholarGoogle Scholar
  457. Lisagor, O., Kelly, T., Niu, R., 2011. Model-Based Safety Assessment: Review of Discipline and its Challenges. In: 9th International Conference on Reliability, Maintainability and Safety (ICRMS), pp. 625-632.Google ScholarGoogle Scholar
  458. Marsan, M., Chiola, G., 1987. On Petri nets with deterministic and exponentially distributed firing times. In: Advances in Petri Nets, 266, pp. 132-145. Google ScholarGoogle Scholar
  459. Merle, G., Roussel, J., Lesage, J., Bobbio, A., 2010. Probabilistic algebraic analysis of fault trees with priority dynamic gates and repeated events. IEEE Trans. Reliabil. 59 (1), 250-261.Google ScholarGoogle ScholarCross RefCross Ref
  460. Mian, Z., Bottaci, L., Papadopoulos, Y., Sharvia, S., Mahmud, N., 2014. Model transformation for multi-objective architecture optimization of dependable systems. In: Dependability Problems of Complex Information Systems, 91-110.Google ScholarGoogle Scholar
  461. Niu, R., Tang, T., Lisagor, O., McDermid, J. A., 2011. Automatic safety analysis of networked control system based on failure propagation model. In: IEEE International Conference on Vehicular Electronics and Safety, pp. 53-58.Google ScholarGoogle Scholar
  462. Ortmeier, F., Reif, W., Schellhorn, G., 2005. Deductive cause-consequence analysis. In: Proceedings of the 6th IFAC World Congress, pp. 1434-1439.Google ScholarGoogle Scholar
  463. Paige, R., Rose, L., Ge, X., Kolovos, D., Brooke, P. J., 2008. FPTC: automated safety analysis for domain specific languages. In: Proceedings of Workshop on Non Functional System Properties in Domain Specific Modeling Languages, pp. 229-242.Google ScholarGoogle Scholar
  464. Papadopoulos, Y., Maruhn, M., 2001. Model-based synthesis of fault trees from matlab simulink models. In: International Conference on Dependable Systems and Networks (DSN), pp. 77-82. Google ScholarGoogle Scholar
  465. Papadopoulos, Y., McDermid, J., 1999. Hierarchically performed hazard origin and propagation studies. In: International Conference on Computer Safety, Reliability and Security, pp. 139-152. Google ScholarGoogle Scholar
  466. Papadopoulos, Y., Nggada, S., Parker, D., 2010. Extending HiP-HOPS with Capabilities of Planning Preventative Maintenance, Strategic Advantage of Computing Information Systems in Enterprise Management, editiors. Majid Sarrafzadeh Volume containing revised selected papers from International Conference in Computer Systems and Information Systems 2009-2010, pp. 231-245, ISBN: 978-960-6672-93-4.Google ScholarGoogle Scholar
  467. Papadopoulos, Y., Walker, M., Parker, D., Rude, E., Hamman, R., Uhlig, A., et al., 2011. Engineering failure analysis & design optimization with HiP-HOPS. J. Eng. Fail. Anal 18 (2), 590-608.Google ScholarGoogle ScholarCross RefCross Ref
  468. Point, G., Rauzy, A., 1999. AltaRica: constraint automata as a description language. Eur. J. Autom. 33 (8-9), 1033-1052.Google ScholarGoogle Scholar
  469. Rao, K., Durga, V., Gopika, V., Sanyasi, R., Kushawa, H., Verma, A., et al., 2009. Dynamic fault tree analysis using Monte Carlo simulation in probabilistic safety assessment. Reliabil. Eng. Syst. Saf. 94 (4), 872-883.Google ScholarGoogle ScholarCross RefCross Ref
  470. Robidoux, R., Lu, H., Xing, L., Zhou, M., 2010. Automated modeling of dynamic reliability block diagrams using coloured Petri Nets. IEEE Trans. Syst. Man, Cybernatics 40 (2), 337-351. Google ScholarGoogle ScholarDigital LibraryDigital Library
  471. Rugina, A., Kanoun, K., Kaaniche, M., 2007. A system dependability modeling framework using AADL and GSPNs. In: Architecting Dependable Systems IV, pp. 14-38. Google ScholarGoogle Scholar
  472. Sharvia, S., Papadopoulos, Y., 2011. IACoB-SA: an approach towards integrated safety assessment. In: Proceedings of 7th IEEE International Conference on Automation Science and Engineering, Trieste, Italy. pp. 220-225.Google ScholarGoogle Scholar
  473. Sharvia, S., Papadopoulos, Y., Walker, M., Chen, D., Lonn, H., 2014. Enhancing the EAST-ADL error model with HiP-HOPS semantics. In: Athens ATINER Conference Paper Series.Google ScholarGoogle Scholar
  474. Steiner, M., Keller, P., Liggesmeyer, P., 2012. Modeling the effects of software on safety and reliability in complex embedded systems. Comput. Saf. Reliabil. Secur., 454-465. Google ScholarGoogle ScholarDigital LibraryDigital Library
  475. TOPCASED, 2013. The Open Source Toolkit for Critical System. Available from: ¿www.topcased.org¿ (retrieved 9.11.14.).Google ScholarGoogle Scholar
  476. US Department of Defense, 1980. Procedures of Performing a Failure mode, Effects, and Criticality Analysis. Washington, DC.Google ScholarGoogle Scholar
  477. Vesely, W., Dugan, J., Fragola, J., Minarick, J., Railsback, J., 2002. Fault Tree Handbook with Aerospace Applications. Tech. rep., NASA office of safety and mission assurance, Washington, DC.Google ScholarGoogle Scholar
  478. Villemeur, A., 1991. Reliability, Availability, Maintainability and Safety Assessment: Methods and Techniques. John Wiley & Sons, Chichester.Google ScholarGoogle Scholar
  479. Walker, M., 2009. Pandora: A Logic for the Qualitative Analysis of Temporal Fault Trees PhD Thesis. University of Hull.Google ScholarGoogle Scholar
  480. Walker, M., Bottaci, L., Papadopoulos, Y., 2007. Compositional temporal fault tree analysis. In: Proceedings of the 26th International Conference on Computer Safety, pp. 106-119. Google ScholarGoogle Scholar
  481. Walker, M., Mahmud, N., Papadopoulos, Y., Tagliabo, F., Torchiaro, S., Schierano, W., Lonn, H., 2008. ATESST2: Review of relevant Safety Analysis Techniques. Tech. Rep, 1-121.Google ScholarGoogle Scholar
  482. Yang, Y., Zeckzer, D., Liggesmeyer, P., Hagen, H., 2011. ViSSaAn: visual support for safety analysis. In: Daastuhl Follow-Ups, pp. 378-395.Google ScholarGoogle Scholar
  483. Arrott, M., Demchak, B., Ermagan, V., Farcas, C., Farcas, E., Krüger, I.H., et al., 2007. Rich services: the integration piece of the SOA puzzle. In: Proceedings of the IEEE International Conference on Web Services (ICWS). IEEE Computer Society, Salt Lake City, UT, pp. 176-183.Google ScholarGoogle Scholar
  484. Bachmann, F., 2011. Give the Stakeholders What They Want: Design Peer Reviews the ATAM Style. CrossTalk.Google ScholarGoogle Scholar
  485. Boehm, B., 1988. A spiral model of software development and enhancement. Computer 21 (5), 61-72, IEEE Computer Society. Google ScholarGoogle ScholarDigital LibraryDigital Library
  486. Boehm, B., 2006. Value-based software engineering: overview and agenda. In: Biffl, S., Aurum, A., Boehm, B., Erdogmus, H., Grünbacher, P. (Eds.), Value-Based Software Engineering. Springer, Berlin, pp. 3-14. (Chapter 1).Google ScholarGoogle Scholar
  487. Boehm, B., Jain, A., 2006. An initial theory of value-based software engineerin. In: Biffl, S., Aurum, A., Boehm, B., Erdogmus, H., Grünbacher, P. (Eds.), Value-Based Software Engineering. Springer, Berlin, pp. 15-37. (Chapter 2).Google ScholarGoogle Scholar
  488. Boehm, B., Turner, R., 2003. Balancing Agility and Discipline: Guide for the Perplexed. Longman Publishing Co, Boston, MA. Google ScholarGoogle Scholar
  489. Boehm, B.W., Brown, J.R., Lipow, M., 1976. Quantitative evaluation of software quality. In: Proceedings of the 2nd International Conference on Software Engineering. IEEE Computer Society Press Los Alamitos, CA, pp. 592-605. Google ScholarGoogle Scholar
  490. Boehm, B.W., Brown, J.R., Kaspar, H., Lipow, M., McLeod, G.J., Merritt, M.J., 1978. Characteristics of Software Quality. TRW Series of Software Technology, vol 1 North Holland, Amsterdam.Google ScholarGoogle Scholar
  491. Booth, D., Haas, H., McCabe, F., Newcomer, E., Champion, M., Ferris, C., et al., 2004. Web Services Architecture. W3C Working Group Note. Retrieved from: ¿http://www.w3.org/TR/2004/NOTE-ws-arch-20040211/¿.Google ScholarGoogle Scholar
  492. Carriere, S.J., 2009. Lightweight Architecture Alternative Assessment Method. ¿http://technogility.sjcarriere.com/2009/05/11/its-pronounced-like-lamb-not-like-lame/¿.Google ScholarGoogle Scholar
  493. Clements, P., Kazman, R., Klein, M., 2002. Evaluating Software Architecture: Methods and Case Studies. Addison Wesley, Boston, MA.Google ScholarGoogle Scholar
  494. Cockburn, A., 2000. Writing Effective Use Cases. Addison-Wesley, Boston, MA. Google ScholarGoogle Scholar
  495. Crosby, P.B., 1979. Quality is Free: The Art of Making Quality Certain. McGraw-Hill, New York, NY.Google ScholarGoogle Scholar
  496. Dache, G., 2001. IT Companies will gain competitive advantage by integrating CMM with ISO9001. Qual. Syst. Update 11 (11).Google ScholarGoogle Scholar
  497. Deissenboeck, F., Wagner, S., Pizka, M., Teuchert, S., Girard, J.F., 2007. An activity-based quality model for maintainability. In: Proc IEEE International Conference on Software Maintenance (ICSDM'07). IEEE Press, New York, NY, pp. 184-193.Google ScholarGoogle Scholar
  498. Demchak, B., Krüger, I., 2012. Policy driven development: flexible policy insertion for large scale systems. In: 2012 IEEE International Symposium on Policies for Distributed Systems and Networks. IEEE Computer Society, Chapel Hill, NC, pp. 17-24. Google ScholarGoogle Scholar
  499. Demchak, B., Farcas, C., Farcas, E., Krüger, I., 2007. The treasure map for rich services. In: Proceedings of the 2007 IEEE International Conference on Information Reuse and Integration (IRI). IEEE, Las Vegas, pp. 400-405.Google ScholarGoogle Scholar
  500. Demchak, B., Kerr, J., Raab, F., Patrick, K., Krüger, I., 2012. PALMS: a modern coevolution of community and computing using policy driven development. In: 45th Hawaii International Conference on System Sciences (HICSS), Maui, Hawaii. Google ScholarGoogle Scholar
  501. Deming, W.E., 1986. Out of the Crisis: Quality, Productivity and Competitive Position. Cambridge University Press, 507 pages.Google ScholarGoogle Scholar
  502. Dromey, R.G., 1995. A model for software product quality. IEEE Transactions on Software Engineering 21 (2), 146-163, IEEE Press Piscataway, NJ. Google ScholarGoogle ScholarDigital LibraryDigital Library
  503. Farcas, E., Farcas, C., Krüger, I., 2014. Successful CyberInfrastructures for E-Health. In: Mistrik, I., Bahsoon, R., Zhang, Y., Kazman, R. (Eds.), Economics-driven Software Architecture. Elsevier, Waltham, MA, pp. 259-296, ch. 12.Google ScholarGoogle Scholar
  504. Federal Information Security Management Act of 2002, Title III, E-Government Act of 2002, P.L. 107_347.Google ScholarGoogle Scholar
  505. Feigenbaum, A.V., 1983. Total Quality Control. McGraw-Hill, New York, NY.Google ScholarGoogle Scholar
  506. Fowler, M., 2009. Technical Debt Quadrant, Oct. Available from: ¿http://www.martinfowler.com/bliki/TechnicalDebtQuadrant.html¿ (accessed March 2012).Google ScholarGoogle Scholar
  507. Fowler, M., Beck, K., Brant, J., Opdyke, W., Roberts, D., 1999. Refactoring: Improving the Design of Existing Code. Addison-Wesley Longman Publishing Co., Inc, Boston, MA. Google ScholarGoogle ScholarDigital LibraryDigital Library
  508. Garvin, D.A., 1984. What does product quality really mean? MIT Sloan Manage. Rev. 26 (1), 25-43.Google ScholarGoogle Scholar
  509. Grady, R.B., 1992. Practical Software Metrics for Project Management and Process Improvement. Prentice-Hall. Google ScholarGoogle Scholar
  510. Guo, Y., Seaman, C., 2011 A portfolio approach to technical debt management. Presented at the 2nd Workshop on Managing Technical Debt, Honolulu, HI. Google ScholarGoogle Scholar
  511. Guo, Y., Seaman, C., Gomes, R., Cavalcanti, A., Tonin, G., Da Silva, F.Q.B., et al., 2011. Tracking technical debt--an exploratory case study. In: 27th IEEE International Conference on Software Maintenance (ICSM'11), Williamsburg, VA, pp. 528-531. Google ScholarGoogle Scholar
  512. Halstead, M., 1977. Elements of Software Science. Elsevier Science Inc., New York, NY. Google ScholarGoogle Scholar
  513. Health Insurance Portability and Accountability Act of 1996. P.L. 104_191.Google ScholarGoogle Scholar
  514. Humphrey, W.S., 1989. Managing the Software Process. Addison-Wesley, Reading, MA. Google ScholarGoogle Scholar
  515. Ishikawa, K., 1985. What Is Total Quality Control?: The Japanese Way. Prentice-Hall.Google ScholarGoogle Scholar
  516. ISO, International Organization for Standardization, 2000. ISO 9001:2000, Quality Management Systems--Requirements.Google ScholarGoogle Scholar
  517. ISO, International Organization for Standardization, 2001. ISO 9126-1:2001, Software engineering--Product Quality, Part 1: Quality Model.Google ScholarGoogle Scholar
  518. ISO, International Organization for Standardization, 2011. ISO/IEC 25010:2011: Systems and software engineering--Systems and Software Quality Requirements and Evaluation (SQuaRE)--System and Software Quality Models.Google ScholarGoogle Scholar
  519. Juran, J.M., Gryna, F.M., 1970. Quality Planning and Analysis: From Product Development Through Use. McGraw-Hill, New York, NY.Google ScholarGoogle Scholar
  520. Juran, J.M., Gryna, F.M., 1988. Juran's Quality Control Handbook. McGraw-Hill, 1872 pages.Google ScholarGoogle Scholar
  521. Kan, S.H., 2002. Metrics and Models in Software Quality Engineering, second ed. Addison-Wesley. Google ScholarGoogle Scholar
  522. Kazman, R., Asundi, J., Klein, M., 2001. Quantifying the costs and benefits of architectural decisions. In: Proceedings of the 23rd International Conference on Software Engineering (ICSE'01). IEEE Computer Society, Toronto, Ontario, Canada, pp. 297-306. Google ScholarGoogle Scholar
  523. Kazman, R., Asundi, J., Klein, M., 2002. Making Architecture Design Decisions: An Economic Approach (CMU/SEI-2002-TR-035, ESCTR-2002-035). Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA.Google ScholarGoogle Scholar
  524. Leffingwell, D., 2007. Scaling Software Agility: Best Practices for Large Enterprises (The Agile Software Development Series). Addison-Wesley Professional. Google ScholarGoogle Scholar
  525. MacKenzie, C., Laskey, K., McCabe, F., Brown, P., Metz, R., 2006. Reference Model for Service Oriented Architecture 1.0. OASIS Standard. Retrieved from: ¿http://docs.oasis-open.org/soa-rm/v1.0/soa-rm.pdf¿.Google ScholarGoogle Scholar
  526. Markowitz, H., 1952. Portfolio selection. J. Finance 7, 77-91.Google ScholarGoogle Scholar
  527. McCabe, T.J., 1976. A complexity measure. IEEE Trans. Softw. Eng. 2 (4), 308-320. Google ScholarGoogle ScholarDigital LibraryDigital Library
  528. McCall, J.A., Richards, P.K., Walters, G.F., 1977. Factors in Software Quality, The National Technical Information Service (NTIS), Vols. 1, 2 and 3.Google ScholarGoogle Scholar
  529. Nikzad, N., Ziftci, C., Zappi, P., Quick, N., Aghera, P., Verma, N., et al., 2011. CitiSense-- Adaptive Services for Community-Driven Behavioral and Environmental Monitoring to Induce Change, Tech. Rep. CS2011-0961. University of California, San Diego, CA.Google ScholarGoogle Scholar
  530. Nikzad, N., Verma, N., Ziftci, C., Bales, E., Quick, N., Zappi, P., et al., 2012. CitiSense: Improving Geospatial Environmental Assessment of Air Quality Using a Wireless Personal Exposure Monitoring System. Wireless Health (Best Paper). Google ScholarGoogle Scholar
  531. Nord, R.L., Ozkaya, I., Kruchten, P., Gonzalez-Rojas, M., In search of a metric for managing architectural technical debt. In: 2012 Joint Working IEEE/IFIP Conference on Software Architecture (WICSA) and European Conference on Software Architecture (ECSA), pp. 91, 100, 20-24 August 2012. Google ScholarGoogle ScholarDigital LibraryDigital Library
  532. Object Management Group, 2003. Model Driven Architecture (MDA) v1.0.1. omg/03-06-01, OMG.Google ScholarGoogle Scholar
  533. Ohno-Machado, L., Bafna, V., Boxwala, A.A., Chapman, B.E., Chapman, W.W., Chaudhuri, K., et al., 2012. iDASH: integrating data for analysis, anonymization, and sharing. J. Am. Med. Inform. Assoc.: JAMIA 19 (2), 196-201. Available from: http://dx.doi.org/10.1136/amiajnl-2011-000538.Google ScholarGoogle ScholarCross RefCross Ref
  534. Patrick, K., Wolszon, L., Basen-Engquist, K., Demark-Wahnefried, W., Prokhorov, A., Barrera, S., et al., 2011. CYberinfrastructure for COmparative effectiveness REsearch (CYCORE): improving data from cancer clinical trials. J. Transl. Behav. Med. Practice, Policy, Research 1 (1), 83-88. Available from: ¿http://dx.doi.org/10.1007/s13142-010-0005-z¿.Google ScholarGoogle ScholarCross RefCross Ref
  535. Paulk, M., Weber, C.V., Curtis, B., Chrissis, M.B., 1995. The Capability Maturity Model: Guidelines for Improving the Software Process. Addison-Wesley. Google ScholarGoogle Scholar
  536. Peterson, S.K., Shinn, E.H., Basen-Engquist, K., Demark-Wahnefried, W., Prokhorov, A. V., Baru, C., et al., 2013. Identifying early dehydration risk with home-based sensors during radiation treatment: a feasibility study with head and neck cancer patients. J. Natl. Cancer Inst. Monograph 47, 162-168, Oxford University Press, http://dx.doi.org/10.1093/jncimonographs/lgt016.Google ScholarGoogle ScholarCross RefCross Ref
  537. Saaty, T.L., 1982. Decision Making for Leaders: The Analytical Hierarchy Process for Decision in a Complex World. Lifetime Learning Publications, Belmot, CA.Google ScholarGoogle Scholar
  538. Seaman, C., Guo, Y., 2011. Measuring and monitoring technical debt. Adv. Comput. 82, 22.Google ScholarGoogle Scholar
  539. SEI, Software Engineering Institute, 2001a. Capability Maturity Model Integration (CMMI), Version 1.1, CMMI for Systems Engineering and Software Engineering (CMMI-SE/SW, V1.1), Continuous Representation. Carnegie Mellon University, CMU/SEI-2002-TR-001.Google ScholarGoogle Scholar
  540. SEI, Software Engineering Institute, 2001b. Capability Maturity Model Integration (CMMI), Version 1.1, CMMI for Systems Engineering and Software Engineering (CMMI-SE/SW, V1.1), Staged Representation. Carnegie Mellon University, CMU/SEI-2002-TR-002.Google ScholarGoogle Scholar
  541. Shewhart, W.A., 1931. Economic Control of Quality of Manufactured Product. D. Van Nostrand Company, New York, NY.Google ScholarGoogle Scholar
  542. Woods, E., 2011. Industrial architectural assessment using TARA. In: Ninth Working IEEE/IFIP Conference on Software Architecture (WICSA). Google ScholarGoogle Scholar
  543. Andriole, 1986. In: Andriole, S.J. (Ed.), Software Validation, Verification, Testing, and Documentation. Petrocelli Books, Princeton, NJ. Google ScholarGoogle Scholar
  544. Boehm, B.W., Brown, J.R., Kaspar, H., Lipow, M., McLeod, G., Merritt, M., 1978. Characteristics of Software Quality. North Holland Publishing, Amsterdam, the Netherlands.Google ScholarGoogle Scholar
  545. Brooks, 1975. The Mythical Man Month. Addison-Wesley, Reading, MA (Chapter 14). Google ScholarGoogle Scholar
  546. Cimperman, R., 2006. UAT Defined: A Guide to Practical User Acceptance Testing. Pearson Education, New York City NY, (Chapter 2) ISBN 9780132702621. Google ScholarGoogle Scholar
  547. CMMI®, 2010. CMMI Product Team: CMMI for Development, Version 1.3 (CMU/SEI- 2010-TR-033). Carnegie Mellon University, Software Engineering Institute, Pittsburgh, PA.Google ScholarGoogle Scholar
  548. Coates, IV, J.C., Srinivasan, S., 2014. SOX after ten years: a multidisciplinary review (January 12, 2014). Accounting Horizons. Available at SSRN: ¿http://ssrn.com/abstract=2379731¿.Google ScholarGoogle Scholar
  549. Cohen, J., 2006. Best Kept Secrets of Peer Code Review (Modern Approach. Practical Advice.). Smart Bear Inc., Somerville, MA, ISBN 1-59916-067-6.Google ScholarGoogle Scholar
  550. Deming, W.E., 1986. Out of the Crisis. MIT Center for Advanced Engineering Study, Cambridge, MA.Google ScholarGoogle Scholar
  551. Houston, A., 1988. A Total Quality Management Process Improvement Model. Navy Personnel Research and Development Center, San Diego, CA.Google ScholarGoogle Scholar
  552. Humphrey, W.S., 1987. Characterizing the Software Process: A Maturity Framework. CMU/SEI-87-TR-11. Carnegie Mellon University, Software Engineering Institute, Pittsburgh, PA.Google ScholarGoogle Scholar
  553. IEEE, 1984. Guide to Software Requirements Specifications. ISBN 0-7381-4418-5, IEEE Computer Society Press, Los Alamitos, CA.Google ScholarGoogle Scholar
  554. IEEE, 1990. IEEE Standard Glossary of Software Engineering Terminology" (IEEE Std 610.12-1990). IEEE Computer Society Press, Los Alamitos, CA.Google ScholarGoogle Scholar
  555. ISO 9126-2, 2001. DTR 9126-2: Software Engineering--Software Product Quality Part 2--External Metrics. ISO/IEC JTC1/SC7 N2419, 2001, International Organization for Standardization, Geneva, Switzerland.Google ScholarGoogle Scholar
  556. JUNIT, 2014. Retrieved from ¿http://junit.org¿ (October 2014).Google ScholarGoogle Scholar
  557. Juran, 1999. Juran's Quality Handbook, fifth ed. McGraw-Hill, New York, NY, ISBN 0-07-034003-X.Google ScholarGoogle Scholar
  558. Martínez-Lorente, A.R., Dewhurst, F., Dale, B.G., 1998. Total quality management: origins and evolution of the term. The TQM Magazine. MCB University Publishers Ltd, Bingley, UK.Google ScholarGoogle Scholar
  559. NASA, 2009. NASA software assurance definitions. Retrieved from ¿http://www.hq.nasa.gov/office/codeq/software/umbrella_defs.htm¿ (October 2014).Google ScholarGoogle Scholar
  560. Paulk, M.C., Curtis, B., Chrissis, M.B., Averill, E.L., Bamberger, J., Kasse, T.C., et al., 1991. Capability Maturity Model for Software. CMU/SEI-91-TR-24. Carnegie Mellon University, Software Engineering Institute, Pittsburgh, PA.Google ScholarGoogle Scholar
  561. PMI, 2010. A Guide to the Project Management Body of Knowledge (PMBOK Guide). PMI Standards Committee, Project Management Institute, Newtown Square, PA, ISBN 1-933890-66-5.Google ScholarGoogle Scholar
  562. SEI Cmmi® Maturity Profile Report, 2014. CMMI Maturity Profile Report. Carnegie Mellon University, Software Engineering Institute, Pittsburgh, PA, Retrieved from ¿http://cmmi®institute.com/wp-content/uploads/2014/05/Maturity-Profile-Ending-March-2014.pdf¿ (October 2014).Google ScholarGoogle Scholar
  563. Shewhart, 1931. Economic Control of Quality of Manufactured Product. D. Van Nostrand Company, New York, NY, ISBM0-87389-076-0.Google ScholarGoogle Scholar
  564. SOX, 2002. Public Law 107-204--Sarbanes-Oxley Act of 2002.Google ScholarGoogle Scholar
  565. Tennant, G., 2001. Six Sigma: SPC and TQM in Manufacturing and Services. Gower Publishing Ltd, Farnham, UK, ISBN 0-566-08374-4.Google ScholarGoogle Scholar
Contributors
  • Brunel University London
  • Monash University
  • Wageningen University & Research
