
A checklist to guide the planning, designing, implementation, and evaluation of learning analytics dashboards

Abstract

Higher education institutions are moving to design and implement teacher-facing learning analytics (LA) dashboards with the hope that instructors can extract deep insights about student learning and make informed decisions to improve their teaching. While much attention has been paid to developing teacher-facing dashboards, less is known about how they are designed, implemented and evaluated. This paper presents a systematic literature review of existing studies reporting on teacher-facing LA dashboards. Out of the 1968 articles retrieved from several databases, 50 articles were included in the final analysis. Guided by several frameworks, articles were coded based on the following dimensions: purpose, theoretical grounding, stakeholder involvement, ethics and privacy, design, implementation, and evaluation criteria. The findings show that most dashboards are designed to increase teachers’ awareness but with limited actionable insights to allow intervention. Moreover, while teachers are involved in the design process, this is mainly at the exploratory/problem definition stage, with little input beyond this stage. Most dashboards were prescriptive, less customisable, and implicit about the theoretical constructs behind their designs. In addition, dashboards are deployed at prototype and pilot stages, and the evaluation is dominated by self-reports and users’ reactions with limited focus on changes to teaching and learning. Besides, only one study considered privacy as a design requirement. Based on the findings of the study and synthesis of existing literature, we propose a four-dimensional checklist for planning, designing, implementing and evaluating LA dashboards.

Introduction

As (higher) education is becoming digitized and datafied, institutions have access to a greater variety, volume, granularity and velocity of student (learning) data, creating opportunities for data-informed pedagogy and student support. The digitization of higher education also allows institutions to expand educational offerings optimizing scale and asynchronicity, reaching geographically distributed students in a variety of hybrid, distance and distributed learning opportunities. Offering appropriate, timely and effective student support to large and increasingly distributed student cohorts necessitates not only harvesting and analyzing student data but also providing teachers and student support teams with access to such analyses and actionable data.

In recent years, there has been growing interest in using learning analytics (LA) to support teachers’ everyday practices in both online and blended learning environments. LA is concerned with collecting and measuring data about learners and their context for purposes of understanding and optimizing learning and the environment in which it occurs (Siemens & Gasevic, 2012). One of the focuses of LA research is to support teachers with informed teaching decisions, mainly through visualizing students’ learning behaviour using dashboards (Verbert et al., 2020). In this regard, higher education institutions are moving to design and implement teacher-facing LA dashboards with the expectation that instructors can extract deep insights about student learning and make informed decisions to improve their teaching (Li et al., 2021).

Within the LA field, a teacher-facing LA dashboard is an interactive visual display that provides information to teachers based on students’ learning patterns and interactions (Few, 2013; Verbert et al., 2014). A key assumption is that teachers will use the information provided by LA dashboards to help them monitor, reflect on, and regulate the teaching and learning process (van Leeuwen et al., 2019). In large and distributed learning contexts, such feedback allows teachers not to teach in the dark but to respond appropriately to students’ needs. For example, as Rummel (2008) describes in her proposed taxonomy of computer-supported collaborative learning (CSCL) support mechanisms, teacher-facing dashboards can be perceived as technological artefacts that indirectly support teachers during the orchestration of CSCL activities.

Nonetheless, current research has shown that the use of teacher dashboards at scale and the evidence of their impact on teachers’ everyday practice remain limited (Tsai et al., 2020; Viberg et al., 2018). One of the factors behind the limited adoption of teacher dashboards is user concerns around ethics and privacy (Aslan et al., 2019; Drachsler & Greller, 2016; Slade & Prinsloo, 2013). This obstacle has been part of the evolution of research into LA even before its emergence in 2011 as a distinct research focus and practice (e.g., Bach, 2010; Knox, 2010). Another contributing factor to the low adoption of dashboards in teachers’ practice is the limited involvement of teachers in the design and development process of dashboards. As noted by Dollinger et al. (2019), there are still limited examples of mature and transparent collaboration with stakeholders in developing LA tools in the literature to date. Yet, scholars have recently suggested that it is not enough to introduce teachers to LA technologies; they must also be part of the LA creation and design process (Dollinger et al., 2019). By involving teachers in the design process, dashboards become more responsive to teachers’ pedagogical needs and intentions, which could, in turn, favour adoption. Recognising the need to involve teachers in the design of LA systems, recent efforts such as the call for human-centred LA (Buckingham Shum et al., 2019) and ongoing initiatives such as workshops on participatory LA have increased.

Additionally, despite the increasing attention towards participatory approaches and stakeholder involvement in LA, there is still limited evidence on how systems (e.g., dashboards) developed for teachers, one of the key stakeholders in LA, are conceptualized. In particular, details on how teacher-facing dashboards are developed, theorized, implemented and evaluated are lacking. Moreover, even though a number of review studies have been conducted on the theme of LA dashboards, these have specifically focused on student-facing LA dashboards (Bodily & Verbert, 2017; Jivet et al., 2017; Matcha et al., 2019) or LA dashboards in general (Schwendimann et al., 2016). To the best of our knowledge, no systematic study has attempted to explore the state of the art in teacher-facing dashboards. However, given that teachers are primary stakeholders for LA, and dashboards are one of the key intervention tools for LA adoption, it is critical to investigate this area of research to favour wider LA adoption and to develop knowledge and principles to guide researchers and LA dashboard designers.

Against this background, in this study, we present a systematic review of studies that present teacher-facing dashboards. In particular, we investigate the purpose of existing LA dashboards, their theoretical grounding, the extent to which stakeholders are involved in the design process, the maturity of teacher dashboard deployment, and the ways in which these dashboards are evaluated.

This review has important implications for designing actionable intelligence at scale because teacher-facing LA dashboards are an important value proposition of LA (Verbert et al., 2014). As such, their careful design and evaluation should be an important component in institutionalizing LA. This review reports the issues related to the purpose, design, implementation and evaluation of teacher-facing dashboards. Based on the analysis, we provide research and practice implications to guide researchers and technology developers in conducting rigorous, human-centered, and ethically responsible research in realizing the potential of effective instruction at scale. Furthermore, the article contributes to the theory of LA adoption and design by synthesizing the findings to propose an integrated checklist for planning, designing, implementing and evaluating LA dashboards. The suggested checklist adds detail to existing LA models since it captures the whole cycle of planning, designing, implementing, and evaluating dashboards. Our checklist highlights important questions and considerations that support researchers and developers in making informed decisions during the design and implementation of LA dashboards.

Related literature

Due to the increasing use of LA dashboards, researchers have conducted review studies to describe the state of the art on this topic. Bodily and Verbert (2017) reviewed research on student-facing LA dashboards and educational recommender systems based on 93 articles. The review focused exclusively on LA systems that collect click-level student data and report these data directly to students. Yoo et al. (2015) reviewed LA dashboards with specific reference to their evaluation. Based on a sample of 10 studies (7 focusing on student-facing dashboards and 3 on teacher-facing dashboards), the authors concluded that most LA dashboard studies lacked an evaluation. In this regard, the authors created an evaluation framework of 11 items to guide dashboard evaluations. However, the limited number of studies included in that review limits the generalisability of this evaluation framework. Another widely cited review is by Schwendimann et al. (2016), who conducted a comprehensive review of LA dashboards based on 55 studies presenting student-, teacher-, administrator- and researcher-oriented dashboards. The authors concluded that most dashboards were developed for higher education contexts, presented as exploratory and proof-of-concept, and with little evaluation to establish their impact.

Jivet et al. (2017) followed with another review of LA dashboards but specifically focused on how student-facing dashboard developers utilize theories and models from learning sciences. Based on evidence from 28 studies, the findings revealed that very few dashboard evaluations consider educational concepts as a theoretical foundation for their design. Matcha et al. (2019) reviewed LA dashboards based on 29 studies. The results show that existing LA dashboards are rarely grounded in learning theory, offer no information about effective learning tactics and strategies, and have significant limitations in how their evaluation is conducted and reported. More recently, Valle et al. (2021) conducted a review of student-facing LA dashboards with the intention to map theoretical underpinnings and the connection between the LA dashboard’s intended outcomes and the measures used to evaluate them. The findings based on 28 studies included in the final analysis revealed a limited alignment between the intended outcomes of the dashboards and the way they are measured.

While a number of literature reviews on LA dashboards have made important contributions to this area of research, none of them has focused exclusively on teacher-facing LA dashboards. Yet, more specific work in this area of LA is needed to develop theoretical knowledge and principles to guide researchers and designers in engaging non-data experts (e.g., teachers) (Martinez-Maldonado et al., 2015). Besides, none of the reviews to date has gone further to investigate issues related to the ethics and privacy of LA dashboards. Yet, such issues should be at the forefront to support the wider adoption of LA dashboards (Prinsloo & Kaliisa, 2022). Moreover, none of the current studies on LA dashboards has taken a step further to synthesize findings into a checklist or framework to guide the planning, design, implementation, and evaluation of LA dashboards. Yoo et al. (2015), who suggested a LA evaluation framework based on a review of 10 studies, conducted promising work in this direction. However, this framework only captures the evaluation component, not the entire dashboard development process. To the best of our knowledge, there is no comprehensive framework capturing all four main processes involved in the design, testing, evaluation and implementation cycle of LA dashboards.

Focus and research questions

This paper addresses the shortcomings described in the previous section by proposing a four-dimensional checklist for planning, designing, implementing, and evaluating LA dashboards. We hope that this checklist can act as a guide and discussion tool for researchers and technology developers during the process of LA dashboard development. We have synthesised the findings from the 50 reviewed papers and existing LA models to identify the key stages of the dashboard development process, the key questions that can be addressed at each stage and a list of possible responses. While the checklist is based on findings from teacher-facing LA dashboards, we suggest that it can be applied more broadly to LA tools meant for different stakeholders (e.g., students and advisors). The checklist is primarily intended to guide researchers and technology developers.

The following research questions guide our study:

  • RQ1 For what purposes are LA dashboards for teachers designed?

  • RQ2 Which theoretical frameworks inform the design of teacher dashboards?

  • RQ3 To what extent are stakeholders involved?

  • RQ4 To what extent are privacy and ethical issues considered during the design process?

  • RQ5 How mature is the deployment of LA dashboards for teachers?

  • RQ6 How are teacher LA dashboards evaluated?

Methodology

To answer the above research questions, we conducted a systematic review following the guidelines of the PRISMA statement (Moher et al., 2009), such as applying a replicable search strategy to find relevant studies, which were then coded and synthesized into findings (Bond et al., 2020).

Search strategy and study selection procedure

To find relevant studies, we selected the following databases as they contain relevant literature for the field of LA: ACM Digital Library, IEEE Xplore, SpringerLink, ScienceDirect, and Wiley Online Library. Additionally, we included the top 200 Google Scholar search results using our search terms to cover any other sources that were not indexed by the selected databases. We queried the selected databases with the following search terms: widget OR dashboard AND “learning analytics” OR “educational data mining”. Although the scope of this review is limited to dashboards that consider teachers as end-users, it was not possible to articulate this criterion in relevant search terms. To avoid missing some relevant studies, we built a query that retrieved all LA dashboards regardless of their end-users and, during screening, removed the ones whose focus was not teachers.

The search process lasted from 25 August to 1 April 2022. It was limited to studies conducted between 2011 and 2021, as most studies on LA were published after the first International Conference on Learning Analytics in 2011. The initial search process resulted in a total of 1968 hits. We further screened each result for relevance by examining the title and the abstract, removing duplicates and articles that did not present dashboards, were not aimed at teachers, or had no full text available in English. This process resulted in 326 papers, which were later subjected to full-text screening guided by the study’s inclusion criteria and research questions. We kept only the papers that (1) report on dashboards facing teachers or both teachers and students, or on prototypes resulting from co-creation activities; (2) are available in English and publicly accessible; and (3) were published in a peer-reviewed journal or conference, due to the rigorous review process such venues apply, hence ensuring validity. The decision to include conference papers was made because LA is an emergent field and most research findings are published at conferences. We removed studies presenting dashboards as work-in-progress, literature reviews, demos and posters. Following this process, 114 papers remained. On critical examination of these studies, a further 64 papers were removed as they lacked an evaluation of the dashboard, presented the same dashboards, or were duplicates. Finally, our literature survey included 50 papers that satisfied all our criteria. The detailed search process is illustrated in Fig. 1, while Additional file 1: Appendix 1 (Table A1: Summary of the reviewed studies) summarises all 50 studies included in this review.

Fig. 1 Flow diagram for the systematic review process (adapted from Moher et al., 2009)
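To make the screening logic described above concrete, the sketch below encodes the reported query string and inclusion criteria as a simple filter. It is illustrative only: the screening was performed manually by the authors, and the record fields (audience, venue_type, publication_type, has_evaluation) are hypothetical names introduced here for clarity.

```python
from dataclasses import dataclass

# Query string as reported in the review (grouping parentheses added for readability).
SEARCH_QUERY = '(widget OR dashboard) AND ("learning analytics" OR "educational data mining")'

@dataclass
class Record:
    """One retrieved search result; all field names are hypothetical."""
    title: str
    year: int
    audience: str          # "teacher", "student", or "mixed"
    language: str
    venue_type: str        # "journal" or "conference"
    publication_type: str  # "full paper", "poster", "demo", "work-in-progress", "review"
    has_evaluation: bool   # the paper reports an evaluation of the dashboard

def passes_screening(r: Record) -> bool:
    """Inclusion criteria described above: 2011-2021, teacher-facing (or teachers and
    students), English, peer-reviewed journal/conference full paper, with an evaluation."""
    return (
        2011 <= r.year <= 2021
        and r.audience in {"teacher", "mixed"}
        and r.language == "English"
        and r.venue_type in {"journal", "conference"}
        and r.publication_type == "full paper"
        and r.has_evaluation
    )
```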

Coding categories and frameworks: data extraction process

To extract the relevant data from the included articles, a detailed codebook was developed, guided by the research questions. The articles were coded based on the following overarching dimensions: purpose, theoretical grounding, stakeholder involvement, ethics and privacy, design, implementation, and evaluation criteria. In addition, to ensure a systematic and valid coding process, we applied existing models and frameworks, which are briefly elaborated on below; a schematic sketch of how these frameworks map onto the coding dimensions follows the list.

  i. The LA Process Model (Verbert et al., 2013) highlights key aspects to consider while analyzing LA applications. The model is conceptualized through four phases: (1) awareness, which is concerned with data and how it is presented to users; (2) reflection, focusing on how users assess the usefulness and relevance of the presented data and visualizations; (3) sense-making, involving users gaining insight from the presented data and visualizations; and (4) impact, which focuses on the change in user behaviour based on the LA visualizations. This model was used to answer RQ1, which sought to assess the intended purpose of each teacher-facing dashboard.

  ii. The Learning Awareness Tools-User eXperience (LATUX) workflow is a framework for designing and deploying awareness tools for technology-enabled learning settings (Martinez-Maldonado et al., 2015). The framework was used to code the design stages followed during dashboard development. LATUX is composed of five stages: (1) problem identification (i.e., identifying the requirements for LA and user interface design), (2) paper prototype (i.e., initial, high-level representations of the intended design), (3) high-fidelity prototype (i.e., a more detailed and realistic representation of the designed tool), (4) pilot study (i.e., proving a concept and observing live usage of the tool in an authentic context), and (5) in-classroom practice (i.e., using the tool in unconstrained settings, at a larger scale and for a longer duration). Since LATUX is grounded in a well-established design process for creating, testing and re-designing user interfaces, we found it a reliable framework to guide the evaluation of existing teacher-facing LA dashboards. This framework provided the lens to respond to RQs 3 and 5, which sought to identify the extent to which teachers and other stakeholders are involved and the maturity of LA dashboard deployment, respectively.

  iii. Kirkpatrick’s evaluation model. To code for the evaluation of LA dashboards, we used Kirkpatrick’s evaluation model (Smidt et al., 2009), which conceptualizes evaluation at four levels: (a) reaction (e.g., usability, impressions), (b) learning (e.g., insights teachers gained), (c) behavior (e.g., changes in how teachers manage their classroom), and (d) result (e.g., changes in student performance). Kirkpatrick’s model has been widely used in studies evaluating user interfaces and training programs (e.g., Kaliisa & Dolonen, 2022), making it a valid model for answering RQ6, which sought to explore how teacher LA dashboards are evaluated.
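As a schematic illustration of how these frameworks map onto the coding dimensions, the sketch below expresses the three models as enumerations together with a record type for one coded article. This is not the authors' actual codebook; the field names are hypothetical and simplified.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class LAProcessPhase(Enum):       # Verbert et al. (2013); used for RQ1
    AWARENESS = "awareness"
    REFLECTION = "reflection"
    SENSE_MAKING = "sense-making"
    IMPACT = "impact"

class LATUXStage(Enum):           # Martinez-Maldonado et al. (2015); used for RQ3 and RQ5
    PROBLEM_IDENTIFICATION = 1
    PAPER_PROTOTYPE = 2
    HIGH_FIDELITY_PROTOTYPE = 3
    PILOT_STUDY = 4
    CLASSROOM_PRACTICE = 5

class KirkpatrickLevel(Enum):     # Smidt et al. (2009); used for RQ6
    REACTION = "reaction"
    LEARNING = "learning"
    BEHAVIOR = "behavior"
    RESULT = "result"

@dataclass
class CodedArticle:
    """One reviewed paper coded along the overarching dimensions (hypothetical fields)."""
    citation: str
    purposes: set = field(default_factory=set)          # RQ1: LAProcessPhase values
    theories: list = field(default_factory=list)        # RQ2: named theoretical frameworks
    stakeholders: list = field(default_factory=list)    # RQ3: e.g., "teacher", "LA researcher"
    privacy_considered: bool = False                     # RQ4
    deployment_stage: Optional[LATUXStage] = None        # RQ5
    evaluation_levels: set = field(default_factory=set)  # RQ6: KirkpatrickLevel values
```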

Analysis, validity and reporting

We tabulated the included studies guided by our research questions to provide an overview of the different codes. Finally, we undertook a narrative analysis of the identified studies using individual papers as the unit of analysis. To ensure validity during the coding process, the three researchers screened five articles each to establish a joint understanding of the inclusion criteria iteratively. To increase reliability, where differences and discrepancies emerged, we used social moderation, an approach that involves discussion between two or more people until an agreement is reached. In addition, after the papers had been distributed among the three researchers, each researcher took the initiative to check articles coded by the co-researchers to ensure consistency and to check for potential errors or coding inconsistencies.

Findings and discussion

In this section, we present the review results guided by the six research questions.

Descriptive information of included studies

The 50 studies that form the basis of the findings consist of conference papers (n = 28) and journal articles (n = 22). The analysis showed that the number of studies reporting teacher-facing dashboards grew steadily between 2012 (only one study) and 2021, with a marked increase in 2019 (n = 9), 2020 (n = 10) and 2021 (n = 12) (see Fig. 2). This trend suggests that this area is of growing interest and importance in LA research. Similar to previous reviews, most teacher dashboards in this study are implemented in higher education settings (n = 31); fourteen studies present dashboards for K-12 contexts and four studies for MOOCs. Only one study presented a dashboard for an informal learning context. Eleven studies described dashboards built for computer-supported or face-to-face collaborative scenarios. A summarised version of all 50 reviewed studies is included as Additional file 1: Table A1.

Fig. 2 The distribution of teacher-facing dashboard studies between 2012 and 2021

Table 1 Summary of dashboard purposes based on Verbert’s LA process model and the number of papers in which they appear

RQ1. For what purposes are LA dashboards for teachers designed?

Similar to how Molenaar and Knoop-van Campen (2017) used Verbert’s LA process model to investigate teachers’ use of dashboard data, we used the same model to survey the purposes for which dashboard creators envisioned their systems, as shown in Table 1. We extracted and coded all purposes explicitly mentioned in each paper, even if they fell under different levels of the model. Some studies mentioned more than one goal; thus, they appear under more than one category.

We identified 33 papers that explicitly mentioned an aim to support teachers’ awareness by allowing them to monitor students’ progress or performance (n = 23), their behaviour (n = 15) or their emotional states (n = 4). We found two papers presenting dashboards that displayed teacher data instead of student data to facilitate self-reflection on teachers’ practices. Martinez-Maldonado (2019) used classroom sensor data to show teachers their activity and allow them to reflect on their movement around the classroom, while Michos and Hernández-Leo (2018) developed a community awareness dashboard that displays teachers’ activity in a social learning design platform.

The next step in the model addresses a sense-making and reflection level (n = 26), in which teachers process the information displayed on dashboards in order to make pedagogical decisions. Most of the papers here focus on identifying struggling or at-risk students (n = 20). The other twelve papers included here were vague in terms of the specific purpose for their dashboards, using terms like “support decision making” (n = 4) or “understand student needs” (n = 5).

According to Verbert’s model, the ultimate goal of dashboards is to support teachers in acting on the information received through dashboards. Most of the dashboards included in our analysis explicitly outlined an action-oriented purpose (n = 38). A relatively high number of dashboards aimed to support teachers in optimizing or adapting their learning design and learning materials (n = 11) or planning upcoming lessons (n = 3). Several papers aimed to support teachers in providing feedback (n = 6) or offering personalized support to learners (n = 7). However, many were vague about the type of intervention they aimed to facilitate (n = 16).

RQ2. Which theoretical frameworks inform the design of teacher dashboards?

The analysis showed that only five studies mentioned a theoretical underpinning informing their designs. Dourado et al. (2021) structure the information displayed on their dashboard in line with Rogoff’s (2003) social participation theory, which conceives of learning as participation in three interdependent planes of analysis: cultural/institutional, interpersonal and personal. Singh et al. (2020) rely on engagement theory (Kearsley & Shneiderman, 1998) and deep and surface learning (Beattie et al., 1997) to select their data sources and analytics. Michos and Hernández-Leo (2018) use the Cultural Historical Activity Theory framework (Engeström, 2000) to develop a community awareness dashboard that supports communities of teachers who use a social learning design platform. Martinez-Maldonado (2019) relies on spatial pedagogy to create a system that provides feedback to teachers on their movement around the classroom, while Ez-Zaouia and Lavoué (2017) use Ekman’s classification of emotions (Ekman & Friesen, 1976) when creating a dashboard displaying information about learners’ emotions.

In addition, theoretical frameworks were employed to describe and analyze teachers’ use of dashboards in three papers. van Leeuwen et al. (2019) use the teacher noticing framework (Van Es & Sherin, 2002) as the theoretical grounding for analyzing teachers’ interpretation of classroom situations using a dashboard, while Molenaar and Knoop-van Campen (2017) use distributed cognition theory (Hutchins, 2000) as a research paradigm to investigate whether teachers’ dashboard usage connects to their professional routines. Finally, Zheng et al. (2021) use self-regulated learning theory (Zimmerman, 2013) as a lens to understand students’ self-regulated learning behaviours.

RQ3. To what extent are stakeholders involved?

Half of the analyzed papers reported the involvement of stakeholders in the design process (n = 25) (see Fig. 3). Teachers were the most commonly involved stakeholders (n = 22), followed by system engineers or interface designers (n = 7), learning scientists or instructional designers (n = 6) and LA researchers (n = 4).

Fig. 3 The distribution of stakeholders’ involvement during the design of teacher-facing dashboards

Teachers: Most papers (n = 18) that took a participatory approach involved teachers in the problem exploration phase, requesting input from teachers on needs that LA dashboards could address. Teachers’ contributions were sought through multiple methods: contextual inquiry, semi-structured interviews, focus groups, in-classroom observations, workshops or online surveys. In most cases, the initiative to build dashboards, and thus to elicit input from teachers, lay with the authors of the analyzed paper. However, we identified one particular case where the teaching staff requested the institution to develop a dashboard for them to facilitate administrative tasks (e.g., monitoring students’ self-reported time-on-task) (Hilliger et al., 2021). In numerous papers, teachers were also asked for feedback on initial prototypes (n = 16), as authors sought early feedback on low-fidelity prototypes before investing effort into developing the systems. In twelve of these papers, the prototypes were based on input provided by the teachers in the problem exploration phase, while the other four designs were informed by the literature or the authors’ experience.

However, it is important to note that despite half of the analyzed studies including teachers in the design of dashboards, most of the studies only involved teachers at the exploratory stages. We identified only two papers that went beyond exploring teachers’ needs in the problem-exploration phase. For example, during interactive sessions with school teachers, van Leeuwen et al. (2019) first asked teachers to describe what they actually do in the classroom and then presented them with specific possible sources of information that were already logged by the system in order to understand which indicators would be most useful to teachers. Yoo and Jin (2020) asked instructional designers, web designers and engineers to provide multiple sketches and contribute to the design of the dashboard’s interface. These sketches were then reviewed together with teachers for feedback. It is also interesting to note that explicit stakeholder involvement is a recent phenomenon, as all papers emphasizing stakeholder involvement were published after 2017.

Other stakeholders: Studies also involved other stakeholders, including system engineers or interface designers (n = 7), who were involved at the technical stage of designing systems, as well as instructional designers (n = 6) and LA researchers (n = 4), who were involved in the process of seeking feedback on prototype solutions. For example, Singh et al. (2020) involved two instructional designers and two LA experts in designing a dashboard that provides novel visualization designs to teachers. The authors presented early prototypes to instructional designers for feedback, while domain experts were involved in the stage of evaluating the usability and usefulness of the dashboard. One key stakeholder group not reported in any of the studies was institutional leaders, even though they have been reported as key to the successful implementation of LA systems (Kaliisa & Dolonen, 2022).

RQ4. To what extent are privacy and ethical issues considered during the design process?

The vast majority of articles do not mention privacy and ethics. Of the 50 articles in the final corpus, only 10 articles mention ethics and privacy in general (e.g., the use of cameras, multimodal data), but only Bassen et al. (2018) mention privacy as an essential design decision for the teacher dashboard. All 10 of the articles refer to ethics and privacy in relation to student data, with none of the articles mentioning privacy or ethical issues pertaining to teacher data. This finding aligns with Alwahaby et al. (2021), who highlight that issues and concerns pertaining to teacher privacy are currently absent from discourses surrounding LA, specifically the design and use of teacher-facing dashboards. The only article discussing concerns about teacher privacy in the climate of increasing surveillance and the expansion of LA is the research by Harris et al. (2021). The authors refer to findings highlighting “potential emotional impacts that data sharing can have, [which] provide an opening to think about teacher privacy rights in relation to the outcomes of their work (e.g., student scores) and whether public scrutiny should be expected as part of professionalism” (p. 59). We argue that teacher data should be collected, analyzed and used in combination with student data so that data collected by LA dashboards is correlated with teachers’ presence and engagement.

RQ5. How mature is the deployment of LA dashboards for teachers?

The implementation stage describes the stage at which the reported dashboard was deployed. These stages were coded based on LATUX (Learning Awareness Tools-User eXperience), a workflow for designing and deploying awareness tools (e.g., dashboards) for technology-enabled learning settings (Martinez-Maldonado et al., 2015) (see Fig. 4). The findings showed that most of the teacher-facing dashboards were implemented at a pilot stage (n = 23), which means that they were fully developed, automated dashboards used as a proof of concept in an authentic learning environment. This category was followed by dashboards employed as high-fidelity prototypes (n = 19), meaning they were not fully automated but provided a detailed and realistic representation of the proposed tool. The third category included six studies that presented dashboards implemented at a classroom and institutional level (e.g., Li et al., 2021) with authentic and continuous live usage. Such dashboards have gone beyond piloting to continuous usage at institutions and on a relatively large scale. Lastly, two dashboards were at a paper prototype stage, meaning they were at an idea-generating level intended to gain a quick and flexible preliminary vision of the proposed system. The key finding from this category is that many teacher dashboards are developed as part of exploratory work and, therefore, stop at the prototype and piloting stages without moving towards actual classroom use.

Fig. 4 Summary of teacher-facing implementation stages

RQ6. How are teacher LA dashboards evaluated?

Evaluation methods. The methods used to evaluate the deployed dashboards varied. The coding revealed that the most frequently used methods for evaluating teacher dashboards were surveys, interviews and log analysis (22, 20 and 12 studies, respectively). The other methods include observation (n = 5), pre- and post-tests (n = 4), screen capture and grades with three studies each, field notes (n = 2), and experiments, think-aloud protocols, lesson plans, admission information, and researcher experiences with one study each. It is important to note that some studies employed more than one approach. For example, out of the 50 studies, three studies used at least three methods to evaluate the impact of dashboards. Aslan et al. (2012) used videos, logs, grades, teacher audio messages, observations, surveys and achievement tests. Amarasinghe et al. (2020) used screen capture, log data and surveys, while Ez-Zaouia and Lavoué (2017) used logs, field reports and interviews. In addition, most of the analyzed studies had a very small sample of teacher participants, with some studies involving only one teacher (e.g., Martinez-Maldonado, 2019).

Evaluation level. The evaluation category describes the level and focus of dashboard evaluations. Guided by Kirkpatrick’s evaluation model (Smidt et al., 2009), the majority of the studies (n = 33) focused on the reaction level, which describes users’ perceptions and impressions of a particular dashboard. Eleven studies evaluated dashboards at the learning level (e.g., insights teachers gained), and nine studies evaluated dashboards at the behavior level (e.g., how teachers manage the class and help struggling students based on dashboard insights). Eleven studies evaluated the results of a given dashboard by focusing on outcomes such as changes in students’ performance, student satisfaction, and attrition rates. It is also important to note that some studies evaluated more than one level. For example, Amarasinghe et al. (2020) evaluated their dashboard at the three levels of reaction, behavior and results, but only based on a sample of four teachers, which limits the generalization of the findings. Herodotou et al. (2019) looked at all levels of evaluation, thus providing evidence for the impact of the OU Analyze dashboard. For example, the authors reported that when teachers used OU Analyze and intervened with students who were flagged as at risk, the performance of those students was significantly different from that of peers who received no intervention from teachers. Meanwhile, it is important to note that the dashboard presented by Herodotou et al. (2019) was deployed at scale and has been used for a long time, which makes it easier to evaluate its impact on teachers’ behavior and students’ performance.

Implications for research and practice

In this section, we outline a set of recommendations for research and practice based on the key findings from the analysis.

Turn awareness into action: The first research question sought to establish the purposes of existing teacher-facing LA dashboards. The findings showed that most teacher dashboards are aimed at supporting teachers’ various pedagogical actions, e.g., providing informed feedback, offering personalised support or adapting the learning design or learning materials. The mechanism through which this goal is achieved is first allowing teachers to monitor students’ progress, learning behaviour and emotional states. We argue that it is not enough to be ‘aware’ of what is happening without acting, as teachers have an ethical obligation to act (e.g., reach out to students who may be at risk) once they receive information about students’ learning behaviours through the dashboard (Prinsloo & Slade, 2017). Thus, we encourage researchers and dashboard designers to focus on dashboards that support teachers to act based on the insights gained. This implies that for LA dashboards to have a meaningful impact on teaching and student learning, the insights gained should be translated into actionable intelligence (Jørno & Gynther, 2018) by identifying areas where students may be struggling and developing targeted interventions (e.g., revising the learning design) to address them.

Bring theory into the design process: The findings revealed that only five studies explicitly stated that they relied on theoretical grounding to inform the design of teacher dashboards. An important question for dashboard designers and researchers is: on what basis are dashboards designed, if not guided by theory? This is an important question to engage with since teacher-facing dashboards display information based on students’ learning activities, which should be interpreted through appropriate learning perspectives (Khalil et al., 2022). Moreover, by design, LA dashboards make implicit educational claims by capturing actions such as the activities logged. Yet, without being explicit about the theoretical constructs behind the visualisations produced, it is very likely that the metrics produced could misalign with teachers’ pedagogical and theoretical orientation (Bodily & Verbert, 2017). One way to address the theoretical challenge in dashboard design is to avoid designing prescriptive dashboards and instead design dashboards that allow teachers to apply their own theoretical perspectives to make sense of the data. This calls for flexible and customisable dashboard interfaces that allow teachers to select the type of data visualisations that align with their own theoretical perspectives. In addition, since theoretical perspectives might differ by discipline and subject content, it is important to propose several designs that are grounded in and incorporate features from multiple theoretical perspectives (e.g., constructivism, behaviourism, self-regulated learning, and cognitivism).

Engage teachers beyond prototyping: The findings revealed that the level of teacher involvement in the LA dashboard design process is still limited, although much more effort has been invested in these endeavours in recent years. In many analysed papers, teachers are involved either at the beginning of the design process, in the problem exploration phase, or at a later stage where their feedback on initial prototypes is sought, even though the design process is described by authors as iterative. Given that the implementation of most dashboards included in this review is at a high-fidelity prototype or pilot stage, it is possible that it is too early to see systems that report the involvement of stakeholders in subsequent iterations of the design process. We agree that meaningful involvement of stakeholders can lead to fruitful participatory design when designers carefully consider what input teachers can offer and at what stage of the design process their contributions would bring value (Dollinger et al., 2019). At the same time, we recognise that even though teachers’ voices are important in the design process, in some cases teachers may be under-informed about educational research or design requirements. This calls for transparent discussions with stakeholders to identify their level of expertise and provide opportunities to learn more about aspects they might know less about but that are nevertheless important for LA design.

Privacy and ethics as design requirements and moral obligation: It falls beyond the scope of this systematic review to discuss the implications of the paucity of research on issues pertaining to teacher privacy in the design and use of teacher-facing dashboards. In light of the increasing datafication of and surveillance in education, and general concerns about privacy, we moot the need for dashboard researchers and designers to not only look at teacher perceptions and the technical aspects of dashboards but also to consider the issues of privacy and ethics (e.g., students and teachers’ rights, data ownership), while defining the protocols for dashboard designs (Williamson & Kizilcec, 2021). Besides, as the analysis shows, current research on ethical issues in teacher-facing dashboards deals exclusively with ethical issues with the collection, analysis and use of student data. There are, however, several ethical issues not considered in research on teacher-facing dashboards, such as the moral obligation to act once the analysis shows that students are at risk or in need of additional support. The obligation arising from knowing necessitates “the effective allocation of resources to ensure appropriate and effective interventions to increase effective teaching and learning” (Prinsloo & Slade, 2017, p. 46). Responding to identified needs and/or students-at-risk necessitates that we understand teachers’ responsibility and capacity to respond in a broader context of an institutional understanding of the factors impacting student success and retention, the political will and understanding of institutional leadership, integrated institutional sense-making structures and capacities, resource constraints and supporting student autonomy and responsibility (Prinsloo & Slade, 2017).

The two case studies analyzed in this study (see Section X) pointed to the reality that “knowing more about our students and making this information and knowledge available to a range of stakeholders does not necessarily result in action” (Prinsloo & Slade, 2017, p. 53), raising concerns about defaulting on the moral and fiduciary duty of institutions to act. In addition, we argue that central to ethical issues pertaining to teachers’ use of teacher-facing dashboards are also issues pertaining to teachers’ capacity in the context of their workload, teacher to student ratios, and institutional support and the responsiveness of such support. Teacher-facing dashboards, and the ethical implications in their design and envisioned use, therefore have to be understood as entangled in institutional resources and support ecologies and not as stand-alone attempts to capacitate teachers to make evidence-informed decisions.

Moving from prototypes to implementation in the wild: The findings showed that a large number of teacher dashboards are developed as part of exploratory work and, therefore, stop at the prototype and piloting stages without moving towards actual classroom use. A few exceptions were dashboards such as SRES (Vigentini et al., 2020) and OU Analyze (Herodotou et al., 2019), which are currently used across several institutions, teachers and courses. The limited number of studies reporting on the continuous use of dashboards at a classroom and institutional level exposes an important research gap in teacher dashboard research. In other words, because so many dashboards are deployed at a prototype level or in small pilot studies, realising their actual impact is difficult, since they are implemented in controlled situations (e.g., laboratories, simulations or very small-scale studies) whose results cannot be generalised.

Triangulation of evaluation methods: This study found that teacher dashboard evaluations are dominated by self-reported methods such as interviews and surveys, which are subjective and do not fully capture the impact of dashboards in authentic learning and teaching environments. Moreover, the existence of only one experimental study evaluating the impact of teacher dashboards on students’ final learning outcomes suggests that causal inferences cannot be made from the existing teacher dashboard literature. We encourage researchers intending to evaluate dashboards to employ different evaluation techniques, as user perceptions and experiences reported through interviews and surveys might not be accurate and adequate measures of actual impact. We argue that experimental studies complemented with self-reported methods (e.g., interviews) could provide better insights into the impact that teacher dashboards may have on students’ learning and teachers’ own practices.

Moving beyond usability to measuring impact: The findings revealed that the evaluation of teacher dashboards tends to stop at the reaction level (e.g., user experiences), with very few studies reporting on the other levels, such as behaviour and result. These results underscore the need for researchers to move beyond usability to measuring impact. In practice, this might require longitudinal studies that follow the impact of deployed dashboards, rather than the short-term evaluations and pilot studies that dominate the studies reviewed here.

Toward an integrated checklist for planning, designing, implementing and evaluating LA dashboards

The study findings highlight a number of LA dashboard design, implementation and evaluation considerations for researchers, technology developers and practitioners. To support the uptake of these considerations and contribute to the theory of LA design and adoption, we synthesize the findings to propose an integrated checklist to support the planning, design, implementation and evaluation of LA dashboards for teachers, complementing similar work done for student-facing LA (Bodily & Verbert, 2017). The suggested checklist is informed by the synthesis of the findings in this review, as well as ideas from existing frameworks and models such as the LA process model (Verbert et al., 2013) and the user-centred LA systems model (Matcha et al., 2019). Our suggested checklist adds greater detail to existing LA models since it captures the whole cycle of planning, designing, implementing, and evaluating dashboards by highlighting the important questions and considerations that support researchers and developers in making informed decisions during the design and implementation of LA dashboards.

Explanation of the checklist

The suggested checklist is operationalised through four interconnected dimensions: (i) Planning, (ii) Designing, (iii) Implementing, and (iv) Evaluating. The checklist assumes that the different dimensions are interlinked and influence each other. The proposed checklist is generic and can potentially be applied by researchers and technology developers across different domains. The four dimensions and the corresponding guiding questions are briefly described below and illustrated in Table 2.

  i. Planning: This dimension involves gathering relevant information to establish the need for developing a LA dashboard. This is one of the most critical stages of the dashboard development process since it might determine the overall design of the dashboard. One important element of this stage is identifying relevant stakeholders (e.g., teachers, students, administrators, technology designers) and actively engaging them in discussions related to the process of designing a dashboard. As highlighted in the findings section, the involvement of stakeholders should not stop at asking questions about their needs; rather, stakeholders should be actively involved throughout the planning, designing, implementing and evaluating stages.

  ii. Designing: This dimension involves creating LA tools based on the identified needs and relevant theoretical constructs. The designs could be based on prototypes (both paper-based and digital) or advanced automated systems. The design process should be influenced not only by users’ needs and theory but also by ethical and privacy considerations. As highlighted in this review, ethical issues have received limited attention from designers of LA dashboards, yet they are critical for the uptake of LA dashboards by teachers and for using the resulting feedback to support students’ learning.

  iii. Implementing: This dimension involves the implementation of the designed solutions in the real world. As suggested by the LATUX model (Martinez-Maldonado et al., 2015), the implementation can occur at different stages, including prototype, pilot and real classroom application. The findings of this review showed that the majority of teacher dashboards are implemented at a pilot stage, with few dashboards reaching application in authentic teaching environments. This implies that researchers should aim to move beyond small-scale dashboard studies if the potential of LA dashboards is to be realised.

  iv. Evaluating: This dimension involves the evaluation of the potential impact of the dashboard. The findings of the review showed that most dashboard evaluations are aimed at user perceptions. While this is an important indicator of dashboard effectiveness, it is critical to move beyond reactions toward measuring the actual impact on learning and teaching. In addition, researchers can use multiple approaches to evaluate dashboards (e.g., user interviews, focus groups, observation, and log data) to increase the validity of the measured outcomes.

Table 2 Checklist for planning, designing, implementing and evaluating LA dashboards

For each dimension described here, we suggest several questions that can guide researchers and designers to ensure that the most important aspects of the design and implementation process are covered.
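To illustrate how the dimensions and guiding questions could be operationalised in practice, the sketch below encodes the checklist as a simple data structure with a small audit helper. The questions shown are paraphrased examples drawn from this review's findings, not a reproduction of the full contents of Table 2.

```python
# Illustrative only: example guiding questions per dimension, not the full Table 2.
CHECKLIST = {
    "Planning": [
        "What need or problem should the dashboard address?",
        "Which stakeholders (teachers, students, administrators, designers) are involved, and at which stages?",
    ],
    "Designing": [
        "Which theoretical constructs inform the indicators and visualisations?",
        "How are privacy and ethics (e.g., data ownership, teachers' and students' rights) treated as design requirements?",
    ],
    "Implementing": [
        "At which stage is the dashboard deployed (paper prototype, high-fidelity prototype, pilot, classroom/institutional use)?",
        "What is needed to move beyond small-scale pilots towards sustained use in authentic settings?",
    ],
    "Evaluating": [
        "Which evaluation levels are addressed (reaction, learning, behaviour, result)?",
        "Which methods are triangulated (e.g., interviews, surveys, log data, observation)?",
    ],
}

def open_questions(answered: dict) -> list:
    """Return the guiding questions a dashboard project has not yet addressed.

    `answered` maps a dimension name to the list of questions already answered.
    """
    remaining = []
    for dimension, questions in CHECKLIST.items():
        done = set(answered.get(dimension, []))
        remaining.extend(q for q in questions if q not in done)
    return remaining
```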

How to use the checklist

We envision that the proposed checklist will be used in a number of ways. First, we expect LA researchers and designers to use the checklist during the planning, designing, implementing and evaluating stages. In particular, the checklist provides a quick and visual reference for the types of decisions to make at different stages of the LA dashboard design and evaluation process. Second, researchers can use the checklist as a basis to evaluate existing LA dashboards. By doing so, new knowledge and lessons about the limitations of existing dashboards could be generated, thus leading to improved LA dashboard design and evaluation practices. Lastly, technology developers can use the checklist to identify the non-technical aspects (e.g., ethics and privacy) that should be considered during the design process to ensure that the technologies developed promote equitable teaching behaviors and decision-making. This checklist should be treated as a living document, and we encourage researchers to revise it with additional questions depending on the context of use.

Limitations

Whilst this study followed PRISMA, an established and validated guideline for conducting systematic reviews, it is not without limitations. Firstly, during the search process, we restricted the search to papers that use the terms widget OR dashboard AND “learning analytics” OR “educational data mining”. While these terms provided us with detailed results on teacher dashboards, it is possible that some papers not explicitly using the term dashboard or widget were missed. In addition, the analysis of dashboards in this study was descriptive and did not examine whether existing dashboards impact students’ learning or teachers’ practice. Future research could take the analysis in this paper one step further by conducting a meta-analysis of teacher-facing dashboards and how they impact students’ learning or teachers’ practice to provide a novel contribution to the field. In addition, the planning, designing, implementing and evaluating checklist proposed in this review has not been validated and will require further revisions based on empirical evidence. Nevertheless, we believe that this paper provides a simple framework with questions and possible responses to guide researchers and technology developers in developing dashboards that align with stakeholder needs and adhere to ethical and privacy considerations.

Conclusion

As higher education becomes increasingly digitized and datafied, institutions have access to a greater variety and velocity of data, commonly shared through LA dashboards. In this paper, we argue that the potential of LA dashboards to support teaching and learning will only be realized if researchers and technology developers think differently not only about who is involved in the design and implementation of such dashboards but also about how they are involved, and the extent to which non-technical aspects such as ethics and privacy are considered. This paper contributes to this direction by proposing a four-dimensional checklist for the planning, design, implementation and evaluation of LA dashboards, which we hope can act as a guiding and discussion tool for researchers and technology developers during LA dashboard development. We argue that unless the four dimensions in the proposed checklist are considered in a holistic manner, the expected potential of LA dashboards in supporting teachers to optimise teaching and learning may not be realised. We encourage researchers to validate and extend the proposed checklist through empirical studies to build up evidence of the relevance and applicability of its different components.

Availability of data and materials

The datasets analyzed during the current study are publicly available as supplementary materials.

References

  • Alwahaby, H., Cukurova, M., Papamitsiou, Z., & Giannakos, M. (2022). The evidence of impact and ethical considerations of Multimodal Learning Analytics: A Systematic Literature Review. The Multimodal Learning Analytics Handbook, 289–325.

  • Amarasinghe, I., Hernández-Leo, D., Michos, K., & Vujovic, M. (2020). An actionable orchestration dashboard to enhance collaboration in the classroom. IEEE Transactions on Learning Technologies, 13(4), 662–675.


  • Aslan, S., Alyuz, N., Tanriover, C., Mete, S. E., Okur, E., D'Mello, S. K., & Arslan Esme, A. (2019). Investigating the impact of a real-time, multimodal student engagement analytics technology in authentic classrooms. In Proceedings of the 2019 chi conference on human factors in computing systems (pp. 1–12).

  • Bach, C. (2010). Learning analytics: Targeting instruction, curricula and student support. Drexel University.


  • Bassen, J., Howley, I., Fast, E., Mitchell, J., & Thille, C. (2018). OARS: exploring instructor analytics for online learning. In Proceedings of the Fifth Annual ACM Conference on Learning at Scale (pp. 1–10).

  • Beattie, V., IV., Collins, B., & McInnes, B. (1997). Deep and surface learning: A simple or simplistic dichotomy? Accounting Education, 6(1), 1–12.


  • Bodily, R., & Verbert, K. (2017). Review of research on student-facing learning analytics dashboards and educational recommender systems. IEEE Transactions on Learning Technologies, 10(4), 405–418.


  • Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., & Kerres, M. (2020). Mapping research in student engagement and educational technology in higher education: A systematic evidence map. International Journal of Educational Technology in Higher Education, 17(1), 1–30.


  • Buckingham Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). Human-centred learning analytics. Journal of Learning Analytics, 6(2), 1–9.


  • Dollinger, M., Liu, D., Arthars, N., & Lodge, J. M. (2019). Working together in learning analytics towards the co-creation of value. Journal of Learning Analytics, 6(2), 10–26.


  • Dourado, R. A., Rodrigues, R. L., Ferreira, N., Mello, R. F., Gomes, A. S., & Verbert, K. (2021). A teacher-facing learning analytics dashboard for process-oriented feedback in online learning. In LAK21: 11th International Learning Analytics and Knowledge Conference (pp. 482–489).

  • Drachsler, H., & Greller, W. (2016). Privacy and analytics: it's a DELICATE issue a checklist for trusted learning analytics. In Proceedings of the sixth international conference on learning analytics & knowledge (pp. 89–98).

  • Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013). Supporting action research with learning analytics. In Proceedings of the third international conference on learning analytics and knowledge (pp. 220–229).

  • Ekman, P., & Friesen, W. V. (1976). Measuring facial movement. Environmental Psychology and Nonverbal Behavior, 1, 56–75.

  • Engeström, Y. (2000). Activity theory and the social construction of knowledge: A story of four umpires. Organization, 7(2), 301–310.

  • Ez-Zaouia, M., & Lavoué, E. (2017, March). EMODA: A tutor oriented multimodal and contextual emotional dashboard. In Proceedings of the seventh international learning analytics & knowledge conference (pp. 429–438).

  • Few, S. (2013). Information dashboard design: Displaying data for at-a-glance monitoring (Vol. 5). Analytics Press.

  • Harris, L., Wyatt-Smith, C., & Adie, L. (2020). Using data walls to display assessment results: A review of their affective impacts on teachers and students. Teachers and Teaching, 26(1), 50–66.

  • Herodotou, C., Hlosta, M., Boroowa, A., Rienties, B., Zdrahal, Z., & Mangafa, C. (2019). Empowering online teachers through predictive learning analytics. British Journal of Educational Technology, 50(6), 3064–3079.

  • Hilliger, I., Miranda, C., Schuit, G., Duarte, F., Anselmo, M., & Parra, D. (2021, April). Evaluating a learning analytics dashboard to visualize student self-reports of time-on-task: a case study in a Latin American University. In LAK21: 11th International Learning Analytics and Knowledge Conference (pp. 592–598).

  • Hutchins, E. (2000). Distributed cognition. International Encyclopedia of the Social and Behavioral Sciences, 138, 1–10.

  • Jivet, I., Scheffel, M., Drachsler, H., & Specht, M. (2017). Awareness is not enough: Pitfalls of learning analytics dashboards in the educational practice. In Data Driven Approaches in Digital Education: 12th European Conference on Technology Enhanced Learning, EC-TEL 2017, Tallinn, Estonia, September 12–15, 2017, Proceedings 12 (pp. 82–96). Springer International Publishing.

  • Jørnø, R. L., & Gynther, K. (2018). What constitutes an ‘actionable insight’ in learning analytics? Journal of Learning Analytics, 5(3), 198–221.

  • Kaliisa, R., & Dolonen, J. A. (2022). CADA: a teacher-facing learning analytics dashboard to foster teachers’ awareness of students’ participation and discourse patterns in online discussions. Technology, Knowledge and Learning, 1–22.

  • Kearsley, G., & Shneiderman, B. (1998). Engagement theory: A framework for technology-based teaching and learning. Educational Technology, 38(5), 20–23.

  • Khalil, M., Prinsloo, P., & Slade, S. (2022). The use and application of learning theory in learning analytics: a scoping review. Journal of Computing in Higher Education, 1–22.

  • Knox, D. (2010). Spies in the house of learning: A typology of surveillance in online learning environments. Paper presented to EDGE 2010–e-Learning: The horizon and beyond conference, Newfoundland.

  • Li, Q., Jung, Y., & Friend Wise, A. (2021). Beyond first encounters with analytics: Questions, techniques and challenges in instructors’ sensemaking. In LAK21: 11th International Learning Analytics and Knowledge Conference (pp. 344–353).

  • Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. (2015). The LATUX workflow: designing and deploying awareness tools in technology-enabled learning settings. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 1–10).

  • Martinez-Maldonado, R. (2019). A handheld classroom dashboard: Teachers’ perspectives on the use of real-time collaborative learning analytics. International Journal of Computer-Supported Collaborative Learning, 14, 383–411.

  • Matcha, W., Gašević, D., & Pardo, A. (2019). A systematic review of empirical studies on learning analytics dashboards: A self-regulated learning perspective. IEEE Transactions on Learning Technologies, 13(2), 226–245.

  • Michos, K., & Hernández-Leo, D. (2018). Supporting awareness in communities of learning design practice. Computers in Human Behavior, 85, 255–270.

  • Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine, 151(4), 264–269.

  • Molenaar, I., & Knoop-van Campen, C. (2017). Teacher dashboards in practice: Usage and impact. In Data Driven Approaches in Digital Education: 12th European Conference on Technology Enhanced Learning, EC-TEL 2017, Tallinn, Estonia, September 12–15, 2017, Proceedings 12 (pp. 125–138). Springer International Publishing.

  • Prinsloo, P., & Slade, S. (2017). An elephant in the learning analytics room: The obligation to act. In Proceedings of the seventh international learning analytics & knowledge conference (pp. 46–55).

  • Prinsloo, P., & Kaliisa, R. (2022). Data privacy on the African continent: Opportunities, challenges and implications for learning analytics. British Journal of Educational Technology, 53, 894.

  • Rogoff, B. (2003). The cultural nature of human development. Oxford University Press.

  • Rummel, N. (2018). One framework to rule them all? Carrying forward the conversation started by Wise and Schwarz. International Journal of Computer-Supported Collaborative Learning, 13, 123–129.

  • Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., & Dillenbourg, P. (2016). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1), 30–41.

  • Siemens, G., & Gasevic, D. (2012). Guest editorial-learning and knowledge analytics. Journal of Educational Technology & Society, 15(3), 1–2.

  • Singh, S., Meyer, B., & Wybrow, M. (2020, June). UserFlow: A Tool for Visualizing Fine-grained Contextual Analytics in Teaching Documents. In Proceedings of the 2020 ACM Conference on Innovation and Technology in Computer Science Education (pp. 384–390).

  • Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529.

  • Smidt, A., Balandin, S., Sigafoos, J., & Reed, V. A. (2009). The Kirkpatrick model: A useful tool for evaluating training outcomes. Journal of Intellectual and Developmental Disability, 34(3), 266–274.

  • Tsai, Y. S., Rates, D., Moreno-Marcos, P. M., Muñoz-Merino, P. J., Jivet, I., Scheffel, M., & Gašević, D. (2020). Learning analytics in European higher education—Trends and barriers. Computers & Education, 155, 103933.

  • Valle, N., Antonenko, P., Dawson, K., & Huggins-Manley, A. C. (2021). Staying on target: A systematic literature review on learner-facing learning analytics dashboards. British Journal of Educational Technology, 52(4), 1724–1748.

  • Van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’ interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571–596.

  • Van Leeuwen, A., Rummel, N., & Van Gog, T. (2019). What information should CSCL teacher dashboards provide to help teachers interpret CSCL situations? International Journal of Computer-Supported Collaborative Learning, 14, 261–289.

  • Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics dashboard applications. American Behavioral Scientist, 57(10), 1500–1509.

  • Verbert, K., Govaerts, S., Duval, E., Santos, J. L., Van Assche, F., Parra, G., & Klerkx, J. (2014). Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing, 18, 1499–1514.

  • Verbert, K., Ochoa, X., De Croon, R., Dourado, R. A., & De Laet, T. (2020). Learning analytics dashboards: the past, the present and the future. In Proceedings of the tenth international conference on learning analytics & knowledge (pp. 35–40).

  • Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The current landscape of learning analytics in higher education. Computers in Human Behavior, 89, 98–110.

  • Vigentini, L., Liu, D. Y., Arthars, N., & Dollinger, M. (2020). Evaluating the scaling of a LA tool through the lens of the SHEILA framework: A comparison of two cases from tinkerers to institutional adoption. The Internet and Higher Education, 45, 100728.

  • Williamson, K., & Kizilcec, R. F. (2021). Learning Analytics Dashboard Research Has Neglected Diversity, Equity and Inclusion. In Proceedings of the Eighth ACM Conference on Learning @ Scale (pp. 287–290).

  • Yoo, M., & Jin, S. H. (2020). Development and evaluation of learning analytics dashboards to support online discussion activities. Educational Technology & Society, 23(2), 1–18.

  • Yoo, Y., Lee, H., Jo, I. H., & Park, Y. (2015). Educational dashboards for smart learning: Review of case studies. In Emerging issues in smart learning (pp. 145–155). Springer Berlin Heidelberg.

  • Zheng, J., Huang, L., Li, S., Lajoie, S. P., Chen, Y., & Hmelo-Silver, C. E. (2021). Self-regulation and emotion matter: A case study of instructor interactions with a learning analytics dashboard. Computers & Education, 161, 104061.

  • Zimmerman, B. J. (2013). From cognitive modeling to self-regulation: A social cognitive career path. Educational Psychologist, 48(3), 135–147.

Acknowledgements

We thank the anonymous reviewers and editors for their valuable comments that improved the quality of our manuscript.

Author information

Contributions

RK: Conceptualization; Formal analysis; Investigation; Methodology; Project administration; Validation; Writing—original draft; Writing—review & editing. IJ: Formal analysis; Methodology; Validation; Writing—original draft; Writing—review & editing. PP: Formal analysis; Methodology; Validation; Writing—original draft; Writing—review & editing. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Rogers Kaliisa.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1. Summary of the reviewed studies.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Kaliisa, R., Jivet, I. & Prinsloo, P. A checklist to guide the planning, designing, implementation, and evaluation of learning analytics dashboards. Int J Educ Technol High Educ 20, 28 (2023). https://doi.org/10.1186/s41239-023-00394-6

Keywords