Usability evaluation methods: mind the gaps

ABSTRACT
The strengths and weaknesses of heuristic evaluation have been well researched. Despite its known weaknesses, heuristic evaluation remains widely used because formal usability testing (also referred to as empirical user testing) is more costly and time-consuming. What has received less attention is the type of information heuristic evaluation conveys compared with empirical user testing supported by eye tracking and user observation. When usability methods are combined, it becomes even more important to distinguish the information contributed by each method. This paper investigates the application of two usability evaluation methods, namely heuristic evaluation and empirical user testing supported by eye tracking, to the website of a learning management system, with the intent of discovering the differences in the usability information each yields. Heuristic evaluation, as an inspection method, is accepted to be fundamentally different from empirical user testing. This paper contributes to a deeper understanding of the nature of those differences by identifying the kinds of usability problems each method uncovers. The findings should be of interest to researchers, designers and usability practitioners involved in website design and evaluation.