Special Section: Learning Analytics: Intelligent Decision Support Systems for Learning Environments

Temporal learning analytics visualizations for increasing awareness during assessment

Abstract

Visual representations of student-generated trace data from learning activities help both students and instructors interpret these data intuitively and quickly perceive aspects of them that would otherwise remain hidden. In this paper, we elaborate on the visualization of temporal trace data during assessment. The goals of the study were twofold: a) to depict students’ engagement in the assessment procedure in terms of time spent and temporal factors associated with learning-specific characteristics, and b) to explore the factors that influence teachers’ Behavioural Intention to use the proposed system as an information system, along with their perceptions of the effectiveness and acceptance of our approach. The proposed visualizations were explored in a study with 32 Secondary Education teachers. We adopted a design-based research methodology and employed a survey instrument based on the Learning Analytics Acceptance Model (LAAM) to measure the expected impact of the proposed visualizations. The analysis of the findings indicates that a) temporal factors can be used to visualize students’ behaviour during assessment, and b) visualizing the temporal dimension of students’ behaviour increases teachers’ awareness of students’ progress, possible misconceptions (e.g., guessing the correct answer) and task difficulty.
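As a concrete illustration of the approach, consider how time-spent traces might be rendered. The following Python sketch is a minimal illustration under stated assumptions, not the authors’ implementation: it assumes a hypothetical trace log of (student, item, seconds spent, correctness) records, plots each student’s time spent per assessment item, and flags correct answers given faster than an assumed threshold as possible guesses, mirroring the misconception and task-difficulty cues described in the abstract. The data are fabricated solely to keep the sketch self-contained.

```python
# Minimal sketch (not the authors' system): visualize time spent per
# assessment item from hypothetical trace records and flag answers that
# were correct but implausibly fast (possible guessing). All field names
# and the threshold below are illustrative assumptions.
from dataclasses import dataclass

import matplotlib.pyplot as plt


@dataclass
class TraceEvent:
    student: str
    item: int
    seconds_spent: float  # time between entering and leaving the item
    correct: bool


# Fabricated trace data for two students over four items.
events = [
    TraceEvent("s1", 1, 42.0, True), TraceEvent("s1", 2, 3.5, True),
    TraceEvent("s1", 3, 95.0, False), TraceEvent("s1", 4, 61.0, True),
    TraceEvent("s2", 1, 38.0, True), TraceEvent("s2", 2, 55.0, True),
    TraceEvent("s2", 3, 120.0, True), TraceEvent("s2", 4, 12.0, False),
]

GUESS_THRESHOLD = 5.0  # assumed cut-off (seconds) for "too fast to be deliberate"

fig, ax = plt.subplots()
for student in sorted({e.student for e in events}):
    trace = sorted((e for e in events if e.student == student),
                   key=lambda e: e.item)
    ax.plot([e.item for e in trace], [e.seconds_spent for e in trace],
            marker="o", label=student)
    # Annotate correct-but-very-fast responses as possible guesses.
    for e in trace:
        if e.correct and e.seconds_spent < GUESS_THRESHOLD:
            ax.annotate("possible guess?", (e.item, e.seconds_spent))

ax.set_xlabel("assessment item")
ax.set_ylabel("time spent (s)")
ax.legend(title="student")
plt.show()
```

A real deployment would read such records from the assessment system’s logs, and aggregating the per-item times across students would additionally surface unusually time-consuming (i.e., potentially difficult) tasks.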

References

  1. Ajzen, I. (2002). Perceived Behavioral Control, Self-Efficacy, Locus of Control, and the Theory of Planned Behavior. Journal of Applied Social Psychology, 32, 665–683. doi: http://dx.doi.org/10.1111/j.1559-1816.2002.tb00236.x

  2. Ali, L., Hatala, M., Gašević, D., & Jovanović, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58(1), 470–489. doi: http://dx.doi.org/10.1016/j.compedu.2011.08.030

  3. Ali, L., Gašević, D., Jovanović, J., & Hatala, M. (2013). Factors influencing beliefs for adoption of a learning analytics tool: An empirical study. Computers & Education, 62, 130–148. doi: http://dx.doi.org/10.1016/j.compedu.2012.10.023

  4. Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25. doi: http://dx.doi.org/10.3102/0013189X11428813

  5. Barclay, D., Higgins, C., & Thompson, R. (1995). The Partial Least Squares approach to causal modelling: Personal computer adoption and use as an illustration. Technology Studies, 2(1), 285–309.

  6. Chin, W. W. (1998). The partial least squares approach to structural equation modeling. In G. A. Marcoulides (Ed.), Modern Methods for Business Research (pp. 295–336). Mahwah, NJ: Lawrence Erlbaum Associates.

  7. Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.

  8. Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: Development of a measure and initial test. MIS Quarterly, 19(2), 189–211. doi: http://dx.doi.org/10.2307/249688

  9. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13, 319–340. doi: http://dx.doi.org/10.2307/249008

  10. Duval, E. (2011). Attention please!: learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 9–17). New York, NY: ACM. doi: http://dx.doi.org/10.1145/2090116.2090118

  11. Economides, A. A. (2005). Adaptive orientation methods in computer adaptive testing. In G. Richards (Ed.), Proceedings E-Learn 2005 World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 1290–1295). Vancouver, Canada.

  12. Govaerts, S., Verbert, K., & Duval, E. (2011). Evaluating the student activity meter: two case studies. In Leung, H., Popescu, E., Cao, Y., Lau, R. H., & Nejdl, W. (Eds.), Proceedings of the 10th International Conference on Advances in Web-Based Learning (pp. 188–197). Berlin, Heidelberg: Springer-Verlag. doi: http://dx.doi.org/10.1007/978-3-642-25813-8_20

  13. Govaerts, S., Verbert, K., Duval, E., & Pardo, A. (2012). The Student Activity Meter for Awareness and Self-reflection. In Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems Extended Abstracts (pp. 869–884). ACM. doi: http://dx.doi.org/10.1145/2212776.2212860

  14. Leony, D., Pardo, A., de la Fuente Valentín, L., de Castro, D. S., & Kloos, C. D. (2012). GLASS: a learning analytics visualization tool. In Buckingham Shum, S., Gasevic, D. & Ferguson, R. (Eds.), Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 162–163). New York, NY: ACM. doi: http://dx.doi.org/10.1145/2330601.2330642

  15. Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588–599. doi: http://dx.doi.org/10.1016/j.compedu.2009.09.008

  16. Mazza, R., & Milani, C. (2005). Exploring usage analysis in learning systems: Gaining insights from visualisations. In Proceedings of the International Conference on Artificial Intelligence in Education. Amsterdam.

  17. Mazza, R., & Dimitrova, V. (2007). CourseVis: A graphical student monitoring tool for supporting instructors in web-based distance courses. International Journal of Human-Computer Studies, 65(2), 125–139. doi: http://dx.doi.org/10.1016/j.ijhcs.2006.08.008

  18. Merceron, A., & Yacef, K. (2005). TADA-Ed for educational data mining. Interactive Multimedia Electronic Journal of Computer-Enhanced Learning, 7(1).

  19. Morris, L. V., Finnegan, C., & Wu, S. S. (2005). Tracking student behavior, persistence, and achievement in online courses. The Internet and Higher Education, 8(3), 221–231. doi: http://dx.doi.org/10.1016/j.iheduc.2005.06.009

  20. Padilla-Melendez, A., Garrido-Moreno, A., & Del Aguila-Obra, A. R. (2008). Factors affecting e-collaboration technology use among management students. Computers & Education, 51(2), 609–623. doi: http://dx.doi.org/10.1016/j.compedu.2007.06.013

  21. Papamitsiou, Z., & Economides, A. A. (2013). Towards the alignment of computer-based assessment outcome with learning goals: The LAERS architecture. In Proceedings of the IEEE Conference on e-Learning, e-Management and e-Services. Malaysia. doi: http://dx.doi.org/10.1109/ic3e.2013.6735958

  22. Soller, A., Martinez, A., Jermann, P., & Muehlenbrock, M. (2005). From mirroring to guiding: A review of state of the art technology for supporting collaborative learning. International Journal of Artificial Intelligence in Education, 15(4), 261–290.

  23. Thomas, J. J., & Cook, K. A. (Eds.) (2005). Illuminating the Path: The R&D Agenda for Visual Analytics. National Visualization and Analytics Center.

  24. Wetzels, M., Odekerken-Schröder, G., & Van Oppen, C. (2009). Using PLS path modelling for assessing hierarchical construct models: Guidelines and empirical illustration. MIS Quarterly, 33(1), 177–195.

  25. Wolpers, M., Najjar, J., Verbert, K., & Duval, E. (2007). Tracking actual usage: The attention metadata approach. Educational Technology & Society, 10(3), 106–121.

  26. Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41(2), 64–70. doi: http://dx.doi.org/10.1207/s15430421tip4102_2


Author information

Correspondence to Zacharoula Papamitsiou.

Keywords

  • temporal learning analytics
  • visualizations
  • awareness
  • monitoring
  • assessment
  • acceptance
