DOI: 10.1145/2461121.2461136
Research article

Evaluating accessibility-in-use

Published: 13 May 2013

ABSTRACT

Evidence suggests that the guidelines employed in conformance testing do not catch all the accessibility barriers users encounter on the Web. Since accessibility is strongly tied to user experience, the perception of accessibility barriers and of their severity is subjective. Moreover, it is not only intangible qualities that shape how these barriers are perceived: navigation styles, age, onset of disability, expertise and abilities also play a key role. To overcome the limitations of conformance testing and catch the problems that emerge during interaction, we propose a user-interaction-driven method that automatically reports accessibility problems. To do so, we first isolate the problematic situations users face and the tactics they employ in those situations. These tactics are treated as behavioural markers of cognitive processes that indicate problematic situations: the presence of a tactic denotes the presence of a problem. We then design and deploy algorithms that automatically detect the exhibition of these tactics and, consequently, problematic situations. WebTactics, a tool that unobtrusively detects and reports the problematic situations undergone by visually disabled users, illustrates the proposed method.
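The core idea of treating tactics as behavioural markers can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the detector below, the `detect_retracing` name, the event representation (dicts with a `target` key), and the window and threshold values are all illustrative assumptions. It flags spans of an interaction log where the user keeps returning to the same element, one plausible marker of a problematic situation.

```python
from collections import Counter


def detect_retracing(events, window=8, threshold=3):
    """Flag spans where the user revisits the same element repeatedly.

    A hypothetical behavioural marker: within each sliding window of
    `window` events, if any single target is visited at least
    `threshold` times, record that span as a problematic situation.
    """
    flagged = []
    for i in range(len(events) - window + 1):
        # Count how often each element is targeted inside this window.
        counts = Counter(e["target"] for e in events[i:i + window])
        target, n = counts.most_common(1)[0]
        if n >= threshold:
            flagged.append({"start": i, "target": target, "visits": n})
    return flagged


# Example log: the user returns to element "a" three times in eight events.
log = [{"target": t} for t in ["a", "b", "a", "c", "a", "d", "e", "f"]]
print(detect_retracing(log))  # one flagged span centred on "a"
```

A real detector would of course be driven by the empirically identified tactics rather than a fixed revisit count, and would run over events captured unobtrusively in the browser.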



Published in

W4A '13: Proceedings of the 10th International Cross-Disciplinary Conference on Web Accessibility
May 2013, 209 pages
ISBN: 9781450318440
DOI: 10.1145/2461121

            Copyright © 2013 ACM


            Publisher

            Association for Computing Machinery

            New York, NY, United States




            Acceptance Rates

W4A '13 Paper Acceptance Rate: 7 of 20 submissions, 35%. Overall Acceptance Rate: 171 of 371 submissions, 46%.
