DOI: 10.1145/1142405.1142439
Article

What do usability evaluators do in practice?: an explorative study of think-aloud testing

Published: 26 June 2006

ABSTRACT

Think-aloud testing is a widely employed usability evaluation method, yet its use in practice is rarely studied. We report an explorative study of 14 think-aloud sessions, the audio recordings of which were examined in detail. The study shows that immediate analysis of observations made in the think-aloud sessions is done only sporadically, if at all. When testing, evaluators seem to seek confirmation of problems that they are already aware of, and they often ask users about their expectations and about hypothetical situations rather than about experienced problems. In addition, evaluators learn much about the usability of the tested system but little about its utility. The study shows how practical realities that are rarely discussed in the usability evaluation literature influence the sessions. We discuss implications for usability researchers and professionals, including techniques for fast-paced analysis and tools for capturing observations during sessions.


Published in

DIS '06: Proceedings of the 6th conference on Designing Interactive systems
June 2006, 384 pages
ISBN: 1595933670
DOI: 10.1145/1142405

Copyright © 2006 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 1,158 of 4,684 submissions, 25%
