Research Article
DOI: 10.1145/2851581.2851607

Practical Usability Rating by Experts (PURE): A Pragmatic Approach for Scoring Product Usability

Published: 7 May 2016

ABSTRACT

Usability testing has long been considered a gold standard in evaluating the ease of use of software and websites, producing metrics to benchmark the experience and identifying areas for improvement. However, logistical complexities and costs can make frequent usability testing infeasible. Alternatives to usability testing include various forms of expert reviews that identify usability problems but fail to provide task performance metrics. This case study describes a method by which multiple teams of trained evaluators generated task usability ratings and compared them to metrics collected from an independently run usability test on three software products. Although inter-rater reliability ranged from modest to strong and the correlation between actual and predicted metrics did establish fair concurrent validity, opportunities for improved reliability and validity were identified. By establishing clear guidelines, this method can provide a useful usability rating for a range of products across multiple platforms, without costing significant time or money.
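The two statistics at the heart of this evaluation, inter-rater reliability across evaluator teams and the correlation between expert ratings and independently collected test metrics, are straightforward to compute. The sketch below is illustrative only, not the authors' implementation: it assumes the step-level difficulty ratings (1 = easy, 3 = difficult) summed into per-task scores that published descriptions of PURE outline, and it uses plain Pearson correlation as a simple stand-in for whichever agreement and validity statistics the study actually used. All team names, ratings, and task times are invented.

```python
# Illustrative sketch of PURE-style scoring and the two checks the abstract
# reports: inter-rater reliability and concurrent validity. The step ratings
# (1 = easy, 3 = difficult) summed into task scores follow PURE's general
# scheme; all names, ratings, and task times below are hypothetical.
from itertools import combinations


def pearson(xs, ys):
    """Plain Pearson correlation, no external dependencies."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


# ratings[team][task] -> one 1-3 difficulty rating per task step.
ratings = {
    "team_a": {"create account": [1, 2, 1], "find invoice": [2, 2],
               "export report": [2, 3, 3, 2], "reset password": [1, 1]},
    "team_b": {"create account": [1, 1, 1], "find invoice": [2, 3],
               "export report": [2, 3, 2, 2], "reset password": [1, 2]},
    "team_c": {"create account": [2, 2, 1], "find invoice": [3, 2],
               "export report": [3, 3, 3, 2], "reset password": [1, 1]},
}

# A task's score is the sum of its step ratings; averaging across teams
# gives the panel's predicted difficulty for each task.
tasks = sorted(next(iter(ratings.values())))
scores = {team: {t: sum(steps[t]) for t in tasks}
          for team, steps in ratings.items()}
panel = {t: sum(scores[team][t] for team in ratings) / len(ratings)
         for t in tasks}

# Inter-rater reliability: mean pairwise correlation of task scores between
# teams (a simple stand-in for a formal agreement statistic).
pairs = list(combinations(ratings, 2))
reliability = sum(pearson([scores[a][t] for t in tasks],
                          [scores[b][t] for t in tasks])
                  for a, b in pairs) / len(pairs)

# Concurrent validity: correlate the panel's scores with a metric from an
# independently run usability test (hypothetical mean task times, seconds).
observed_time = {"create account": 52.0, "find invoice": 61.0,
                 "export report": 140.0, "reset password": 35.0}
validity = pearson([panel[t] for t in tasks],
                   [observed_time[t] for t in tasks])

print(f"mean inter-team correlation: {reliability:.2f}")
print(f"rating-vs-time correlation:  {validity:.2f}")
```

With real data one would substitute the study's own tasks and an agreement statistic better suited to ordinal ratings, such as weighted kappa or Krippendorff's alpha, rather than raw pairwise correlation.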

Published in

CHI EA '16: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
May 2016, 3954 pages
ISBN: 9781450340823
DOI: 10.1145/2851581

      Copyright © 2016 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

Association for Computing Machinery, New York, NY, United States

Acceptance Rates

CHI EA '16 paper acceptance rate: 1,000 of 5,000 submissions (20%). Overall acceptance rate: 6,164 of 23,696 submissions (26%).
