Table 2 Checklist for planning, designing, implementing and evaluating LA dashboards

From: A checklist to guide the planning, designing, implementation, and evaluation of learning analytics dashboards

| Dimension | Guiding questions | Guiding responses |
|---|---|---|
| Planning | Why is a dashboard needed? | To support awareness, reflection, feedback, and assessment (Verbert et al., 2014) |
| | Who is the potential user? | Decide in advance who the potential users of the tool will be (e.g., teachers, students, administrators) |
| | How do we understand user needs? | Engage with users, understand their pedagogical challenges, and jointly discuss potential solutions (Matcha et al., 2019) |
| | How and when do we engage the potential user? | Engage users from the beginning and throughout the design and implementation process; explain the dashboard's potential benefits to stakeholders; allow stakeholders to discuss in groups |
| | What is the best way to engage the user? | Elicit users' needs through interviews, surveys, and workshops, and keep users in the loop throughout the process |
| Designing | What theoretical issues should be considered? | Consider the theory behind the pedagogical problem being addressed; leverage theoretical constructs to inform the dashboard design (Jivet et al., 2017); align dashboard features with the learning design |
| | What solutions are needed? | Consider users' needs, theoretical evidence, the learning design, technical requirements, and the resources available |
| | What data is useful to collect? | Consider the data available and how it connects to teachers' needs and theoretical assumptions |
| | How should the solution be designed? | Pay close attention to users' needs and competencies, resources, the learning design, and technical requirements, and include different forms of visualisation (e.g., graphs, text-based feedback) |
| | Who should be involved in the design? | Involve all stakeholders (e.g., teachers, students, designers, technology developers, and researchers) |
| | How should they be involved? | Actively engage stakeholders through co-design workshops, prototype design sessions, and interviews, and maintain close communication with users (Kaliisa & Dolonen, 2022) |
| | What ethical issues should be considered? | Respect users' rights, be aware of data ownership, be mindful of what data is visualised to users, and be prepared to act once the dashboard identifies behaviours that require intervention (Slade & Prinsloo, 2013) |
| Implementing | When should the tool be implemented at scale? | Start with prototypes (paper-based or semi-automated); run multiple small-scale trials; gradually move towards large-scale implementation (Martinez-Maldonado et al., 2015) |
| | What changes need to be made when moving from a prototype or small-scale evaluation to a large-scale implementation? | Incorporate the feedback from the initial stages, consider the required resources (e.g., the increased volume of data that comes with a larger user base), and carry out additional testing and optimisation |
| | Are the potential users well trained in using the dashboard? | Equip users with basic data literacy skills; run training workshops before the dashboard is adopted |
| Evaluating | How do we evaluate the impact of a dashboard? | Use multiple approaches to increase validity (e.g., user interviews, log data, longitudinal studies; a minimal log-data sketch follows this table); evaluate in authentic contexts |
| | How long should the evaluation last? | Conduct multiple evaluations and test the tool over longer periods (Herodotou et al., 2019) |
| | What should the focus of the evaluation be? | Focus on multiple elements, such as user reactions, impact on users' behaviours, and impact on teaching and learning (Yoo et al., 2015) |
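
The evaluation rows above recommend triangulating log data with interviews and longitudinal studies. As a minimal sketch of how such log data might be captured and summarised, the Python snippet below defines a hypothetical interaction-event record and counts how often each dashboard view is opened. The `DashboardEvent` class, its field names, and the view labels are illustrative assumptions, not part of the original checklist.

```python
# Illustrative sketch only: the DashboardEvent schema and field names below
# are assumptions for this example, not a format prescribed by the checklist.
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DashboardEvent:
    """One logged interaction with the dashboard (hypothetical schema)."""
    user_id: str        # pseudonymised identifier (cf. the ethics row above)
    view: str           # dashboard view opened, e.g. "progress_chart"
    timestamp: datetime

def view_open_counts(events: list[DashboardEvent]) -> Counter:
    """Tally how often each dashboard view was opened: one simple log-data
    signal to triangulate with interviews and surveys during evaluation."""
    return Counter(event.view for event in events)

# Usage with a few synthetic events:
events = [
    DashboardEvent("s01", "progress_chart", datetime(2024, 3, 1, 9, 0)),
    DashboardEvent("s02", "progress_chart", datetime(2024, 3, 1, 9, 5)),
    DashboardEvent("s01", "peer_comparison", datetime(2024, 3, 2, 14, 30)),
]
print(view_open_counts(events))  # Counter({'progress_chart': 2, 'peer_comparison': 1})
```

Such simple usage counts say nothing about impact on teaching and learning on their own; as the checklist notes, they are best read alongside user reactions and behavioural outcomes.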