DOI: 10.1145/2669711.2669913

Supporting competency-assessment through a learning analytics approach using enriched rubrics

Published: 01 October 2014

ABSTRACT

Universities have increasingly emphasized competencies as central elements of students' development. However, assessing these competencies is not an easy task. The data that learners generate in computer-mediated learning offer great potential to study how learning takes place and, thus, to gather evidence for competency assessment using enriched rubrics. The lack of data interoperability and the decentralization of educational applications pose a challenge for exploiting trace data. To address these problems we have designed and developed SCALA (Scalable Competence Assessment through a Learning Analytics approach), an analytics system that integrates usage trace data (how users interact with resources) and social trace data (how students and teachers interact with one another) to support competency assessment. In the SCALA case study, teachers are presented with a dashboard of enriched rubrics built from blended datasets obtained from six assessment learning activities performed by a group of 28 students working on the teamwork competency. For knowledge discovery, we apply clustering and association rule mining algorithms. The result is a visual analytics tool ready to support competency assessment.
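The abstract names two knowledge-discovery techniques applied to the blended trace data: clustering and association rule mining. The following is a minimal, hypothetical sketch (not the authors' implementation) of how per-student usage and social trace counts could be clustered with k-means and mined for simple association rules; the feature names, the synthetic data, and the support/confidence thresholds are assumptions made purely for illustration.

```python
# Minimal sketch (not SCALA's actual code): cluster students by usage/social
# trace counts and mine simple association rules over binarised indicators.
# All feature names, data, and thresholds are hypothetical.
from itertools import combinations

import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-student trace counts: resource views (usage trace data)
# and forum posts / peer replies (social trace data), one row per student.
rng = np.random.default_rng(0)
traces = pd.DataFrame({
    "resource_views": rng.integers(0, 60, size=28),
    "forum_posts": rng.integers(0, 20, size=28),
    "peer_replies": rng.integers(0, 15, size=28),
})

# Clustering: group students with similar activity profiles.
X = StandardScaler().fit_transform(traces)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
traces["cluster"] = kmeans.labels_

# Association rules: binarise activity indicators and keep one-item ->
# one-item rules whose support and confidence exceed the thresholds.
items = pd.DataFrame({
    "high_views": traces["resource_views"] > traces["resource_views"].median(),
    "high_posts": traces["forum_posts"] > traces["forum_posts"].median(),
    "high_replies": traces["peer_replies"] > traces["peer_replies"].median(),
})

def mine_rules(df, min_support=0.3, min_confidence=0.6):
    """Enumerate antecedent -> consequent rules meeting both thresholds."""
    n = len(df)
    found = []
    for a, b in combinations(df.columns, 2):
        for ante, cons in ((a, b), (b, a)):
            support = (df[ante] & df[cons]).sum() / n
            confidence = support / (df[ante].sum() / n) if df[ante].any() else 0.0
            if support >= min_support and confidence >= min_confidence:
                found.append((ante, cons, round(support, 2), round(confidence, 2)))
    return found

print(traces.groupby("cluster").mean())  # activity profile of each cluster
print(mine_rules(items))                 # e.g. high_posts -> high_replies
```

In a system like SCALA, cluster profiles and rules of this kind could be surfaced on the teacher dashboard as evidence attached to rubric criteria.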


    • Published in

      TEEM '14: Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality
      October 2014
      711 pages
      ISBN: 9781450328968
      DOI: 10.1145/2669711

      Copyright © 2014 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 1 October 2014


      Qualifiers

      • research-article

      Acceptance Rates

Overall Acceptance Rate: 496 of 705 submissions, 70%
