ABSTRACT
The evaluation of e-learning applications deserves special attention, and evaluators need effective methodologies and appropriate guidelines to perform their task. We have proposed a methodology, called eLSE (e-Learning Systematic Evaluation), which combines a specific inspection technique with user testing. This inspection aims at enabling inspectors who may not have wide experience in evaluating e-learning systems to perform accurate evaluations. It is based on the use of evaluation patterns, called Abstract Tasks (ATs), which precisely describe the activities to be performed during inspection; for this reason, it is called AT inspection. In this paper, we present an empirical validation of the AT inspection technique: three groups of novice inspectors evaluated a commercial e-learning system applying AT inspection, heuristic inspection, or user testing. Results show an advantage of AT inspection over the other two usability evaluation methods, demonstrating that Abstract Tasks are effective and efficient tools for guiding evaluators and improving their performance. Important methodological considerations on the reliability of usability evaluation techniques are discussed.