  • Special Section: Learning Analytics: Intelligent Decision Support Systems for Learning Environments
  • Open access

Temporal learning analytics visualizations for increasing awareness during assessment



Visual representations of student-generated trace data during learning activities help both students and instructors interpret them intuitively and perceive hidden aspects of these data quickly. In this paper, we elaborate on the visualization of temporal trace data during assessment. The goals of the study were twofold: a) to depict students’ engagement in the assessment procedure in terms of time spent and temporal factors associated with learning-specific characteristics, and b) to explore the factors that influence the teachers’ Behavioural Intention to use the proposed system as an information system and their perceptions of the effectiveness and acceptance of our approach. The proposed visualizations have been explored in a study with 32 Secondary Education teachers. We adopted a design-based research methodology and employed a survey instrument — based on the Learning Analytics Acceptance Model (LAAM) — in order to measure the expected impact of the proposed visualizations. The analysis of the findings indicates that a) temporal factors can be used for visualizing students’ behaviour during assessment, and b) the visualization of the temporal dimension of students’ behaviour increases teachers’ awareness of students’ progress, possible misconceptions (e.g., guessing the correct answer) and task difficulty.
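The paper's own system is not reproduced here, but the kind of temporal trace analysis the abstract describes — time spent per task, and fast correct answers flagged as possible guesses — can be sketched as follows. All names, the event schema, and the 5-second guessing threshold are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class ItemEvent:
    """One hypothetical trace record: a student viewing and answering one item."""
    student: str
    item: str
    entered: float    # seconds since test start when the item was shown
    answered: float   # seconds since test start when the answer was submitted
    correct: bool

def time_spent(events):
    """Aggregate total seconds each student spent on each item."""
    totals = {}
    for e in events:
        key = (e.student, e.item)
        totals[key] = totals.get(key, 0.0) + (e.answered - e.entered)
    return totals

def flag_possible_guesses(events, threshold=5.0):
    """Correct answers given faster than `threshold` seconds may be guesses."""
    return [(e.student, e.item) for e in events
            if e.correct and (e.answered - e.entered) < threshold]

log = [
    ItemEvent("s1", "q1", 0.0, 42.0, True),
    ItemEvent("s1", "q2", 42.0, 45.5, True),   # 3.5 s and correct: possible guess
    ItemEvent("s2", "q1", 0.0, 90.0, False),
]
print(time_spent(log))
print(flag_possible_guesses(log))  # [('s1', 'q2')]
```

Aggregates like these are what a temporal visualization would plot (e.g., time per item per student), giving a teacher a quick view of task difficulty and suspiciously fast correct responses.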






Author information

Corresponding author

Correspondence to Zacharoula Papamitsiou.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits use, duplication, adaptation, distribution, and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Papamitsiou, Z., Economides, A.A. Temporal learning analytics visualizations for increasing awareness during assessment. Int J Educ Technol High Educ 12, 129–147 (2015).


