Research article
DOI: 10.1145/1632149.1632166

Usability evaluation methods: mind the gaps

Published: 12 October 2009

ABSTRACT

The strengths and weaknesses of heuristic evaluation have been well researched. Despite its known weaknesses, heuristic evaluation is still widely used, since formal usability testing (also referred to as empirical user testing) is more costly and time-consuming. What has received less attention is the type of information heuristic evaluation conveys in comparison to empirical user testing supported by eye tracking and user observation. If usability methods are combined, it becomes even more important to distinguish the information contributed by each method. This paper investigates the application of two usability evaluation methods, namely heuristic evaluation and empirical user testing supported by eye tracking, to the website of a learning management system, with the intent of discovering the differences in the usability information each yields. Heuristic evaluation, as an inspection method, is accepted to be fundamentally different from empirical user testing. This paper contributes to a deeper understanding of the nature of the differences by identifying the kinds of usability problems uncovered by each method. The findings should be of interest to researchers, designers and usability practitioners involved in website design and evaluation.
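
(Illustrative sketch, not part of the paper.) The comparison the abstract describes can be pictured as a simple set analysis over the problems each method reports: which problems only heuristic evaluation flags, which only eye-tracking-supported user testing reveals, and where the two overlap. The Python sketch below uses invented problem descriptions purely to show the shape of such a tally; the paper's own data and categories are not reproduced here.

```python
# Hypothetical illustration only: compare usability problems reported by two
# evaluation methods. Problem descriptions are invented for this sketch and
# are not taken from the paper.

def compare_methods(heuristic_problems, user_testing_problems):
    """Return problems unique to each method and those found by both."""
    he = set(heuristic_problems)
    ut = set(user_testing_problems)
    return {
        "both": he & ut,                 # problems surfaced by both methods
        "heuristic_only": he - ut,       # e.g. guideline violations users never encountered
        "user_testing_only": ut - he,    # e.g. task failures visible only in observation/eye tracking
    }

# Invented example data: each string stands for one reported usability problem.
heuristic = {"inconsistent navigation labels", "low-contrast links", "no feedback on save"}
user_testing = {"no feedback on save", "search results ignored (few fixations)", "login path not found"}

for group, problems in compare_methods(heuristic, user_testing).items():
    print(f"{group}: {sorted(problems)}")
```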

      • Published in

        SAICSIT '09: Proceedings of the 2009 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists
        October 2009
        225 pages
        ISBN:9781605586434
        DOI:10.1145/1632149

        Copyright © 2009 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 12 October 2009


        Acceptance Rates

        Overall Acceptance Rate: 187 of 439 submissions, 43%
