ABSTRACT
Think-aloud testing is a widely employed usability evaluation method, yet its use in practice is rarely studied. We report an explorative study of 14 think-aloud sessions, the audio recordings of which were examined in detail. The study shows that immediate analysis of observations made in the think-aloud sessions is done only sporadically, if at all. When testing, evaluators seem to seek confirmation of problems that they are already aware of. During testing, evaluators often ask users about their expectations and about hypothetical situations, rather than about experienced problems. In addition, evaluators learn much about the usability of the tested system but little about its utility. The study shows how practical realities rarely discussed in the literature on usability evaluation influence sessions. We discuss implications for usability researchers and professionals, including techniques for fast-paced analysis and tools for capturing observations during sessions.