DOI: 10.1145/1182475.1182496

Systematic evaluation of e-learning systems: an experimental validation

Published: 14 October 2006

ABSTRACT

The evaluation of e-learning applications deserves special attention, and evaluators need effective methodologies and appropriate guidelines to perform their task. We have proposed a methodology, called eLSE (e-Learning Systematic Evaluation), which combines a specific inspection technique with user testing. This inspection allows inspectors who may not have wide experience in evaluating e-learning systems to perform accurate evaluations. It is based on evaluation patterns, called Abstract Tasks (ATs), which precisely describe the activities to be performed during inspection; for this reason, it is called AT inspection. In this paper, we present an empirical validation of the AT inspection technique: three groups of novice inspectors evaluated a commercial e-learning system by applying AT inspection, heuristic inspection, or user testing. The results show an advantage of AT inspection over the other two usability evaluation methods, demonstrating that Abstract Tasks are effective and efficient tools for guiding evaluators and improving their performance. Important methodological considerations on the reliability of usability evaluation techniques are also discussed.


Published in

NordiCHI '06: Proceedings of the 4th Nordic conference on Human-computer interaction: changing roles
October 2006, 517 pages
ISBN: 1595933255
DOI: 10.1145/1182475

Copyright © 2006 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 379 of 1,572 submissions, 24%
