Supporting students’ self-regulated learning in online learning using artificial intelligence applications

Abstract

Self-regulated learning (SRL) is crucial for helping students attain high academic performance and achieve their learning objectives in online learning contexts. However, learners often struggle to apply SRL properly in online learning environments. Recent developments in artificial intelligence (AI) applications have shown promise in supporting learners’ self-regulation in online learning by measuring and augmenting SRL, but research in this area is still in its early stages. The purpose of this study is to explore students’ perceptions of the use of AI applications to support SRL and to identify the pedagogical and psychological aspects that they perceive as necessary for the effective use of those applications. To explore this, a speed dating method with storyboards was employed as an exploratory design approach. Ten AI application storyboards covering the phases and areas of SRL were developed, and semi-structured interviews were conducted with 16 university students from various majors. The results indicated that learners perceived AI applications as useful for supporting metacognitive, cognitive, and behavioral regulation across different SRL areas, but not for regulating motivation. In addition, learners identified three pedagogical and psychological aspects that should be considered when AI applications are used to support SRL: learner identity, learner activeness, and learner position. The findings offer practical implications for the design of AI applications in online learning aimed at supporting students’ SRL.

Introduction

Fully online or blended courses taught via online video have become increasingly prevalent in higher education. However, unlike face-to-face lectures, where instructors can support learners in regulating their own learning, online learning environments often provide learners with high levels of autonomy and low levels of instructor presence (Jansen et al., 2020). Consequently, learners’ self-regulated learning (SRL) becomes critical to their academic success (Kizilcec et al., 2017). Unfortunately, many learners struggle to self-regulate effectively in online learning (Greene & Azevedo, 2010; Jansen et al., 2020) and often fail to apply SRL strategies during online learning (Winne & Baker, 2013). Therefore, it is essential to support learners’ SRL in online learning environments to enable them to achieve their learning goals effectively (Garcia et al., 2018).

External support using artificial intelligence (AI) is one potential way to support learners’ successful SRL (Molenaar, 2022). However, supporting learners’ SRL is challenging because it comprises metacognitive, cognitive, behavioral, and motivational processes (Lodge et al., 2019) and is heavily influenced by contextual and personal factors. Therefore, one-size-fits-all approaches to SRL support are ineffective (Roll et al., 2014). The use of AI to support SRL should serve two objectives: to assess and interpret learners’ SRL behaviors in online learning environments (Noroozi et al., 2019; Roll & Winne, 2015) and to provide support that scaffolds these complex SRL processes. Various AI applications, such as AI plan organizers (Somasundaram et al., 2020), AI companions (Woolf et al., 2010), and AI agents (Goel & Polepeddi, 2016), have been developed to support learners’ SRL.

However, learners’ SRL can be influenced by factors such as motivation (Kizilcec et al., 2017) and self-efficacy (Zimmerman, 2013), as well as by teachers’ expertise and timely feedback. Given the significant impact of perceived teacher roles and learner characteristics on the success of SRL (Jouhari et al., 2015), it is crucial to consider these aspects when supporting SRL with AI applications. Zawacki-Richter and his colleagues (2019) highlighted the need for more research on the pedagogical and psychological considerations perceived by learners in order to design effective AI applications in education. Similarly, Rosé and her colleagues (2019) emphasized the importance of providing interpretable and actionable insights into learners, rather than simply developing AI models that predict learner data more accurately. Additionally, it is important to comprehensively address the potential issues learners may face when AI applications are used in real online learning environments, such as privacy violations, algorithmic biases, and surveillance (Seo et al., 2021a, 2021b).

To develop effective AI applications that support SRL in the long term, it is necessary to investigate students’ perspectives on such applications (Jivet et al., 2020). Students’ perceptions can serve as the basis for identifying pedagogical and psychological considerations in the development of AI applications. The aim of this study was to identify how students perceive the use of AI applications to support SRL and to explore the pedagogical and psychological aspects that learners regard as important when AI applications are used for this purpose. To achieve this, we used the speed dating research method, a user experience design method that allows participants to interact with and experience various AI applications without any technical implementation (Zimmerman & Forlizzi, 2017). The results of this study are expected to provide practical implications for the utilization and design of AI applications aimed at supporting SRL in online learning environments.

Background

Self-regulated learning in online learning

SRL is defined as “the process whereby students activate and sustain cognition, behaviors, and affects, which are systematically oriented toward attainment of their goals” (Schunk & Zimmerman, 1994). SRL describes the cognitive, metacognitive, and motivational strategies that learners employ to manage their learning (Panadero, 2017). Metacognitive strategies specifically guide learners’ use of cognitive strategies to achieve their goals, including setting goals, monitoring learning progress, seeking help, and reflecting on whether the strategies used to meet the goal were useful (Pintrich, 2004; Zimmerman, 2008).

The recent growth of online learning, including fully online or blended courses taught through online video, has brought changes to the learning environment for students and offered them more control over their learning (Jansen et al., 2020). However, to succeed in such an autonomous environment, learners must engage in productive SRL. Previous studies on SRL in online learning environments have shown that effective use of SRL strategies can lead to improvements in learners’ achievement. Moreover, studies have proposed specific strategies to promote learners’ adoption of SRL strategies in online learning (Kim & Hodges, 2012; Taub et al., 2014).

Table 1 summarizes a total of 10 representative SRL strategies that appear across three phases (i.e., forethought, performance, and reflection) and four areas (i.e., cognition, metacognition, motivation, and behavior) in online learning. Zimmerman’s (2000a, 2000b) three-phase cyclical model divides SRL into forethought, performance, and reflection phases, which are explained in the first column of Table 1. The forethought phase refers to the process that occurs before efforts are made to learn, such as setting goals or making plans. The performance phase refers to the process of monitoring whether the learning activities and strategies of learners meet the goals that were set during the first phase. The reflection phase refers to the process by which learners evaluate their own learning process and reflect on all the steps that they have taken to improve with regard to their subsequent learning session. These three phases of SRL involve different SRL strategies in each of the four areas mentioned (as presented in the second and third columns of Table 1). The area of cognition concerns the cognitive strategies used by learners during learning. Metacognition refers to learners’ ability to reflect upon, understand, and control their own learning. The area of motivation relates to the different motivational beliefs that individuals may have regarding their own learning abilities. Lastly, the behavior area reflects the general efforts that learners may make during the learning process.

Table 1 A theoretical framework and description of phases, areas, and strategies in self-regulated learning
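To make the structure summarized in Table 1 concrete, the sketch below encodes SRL strategies as records indexed by phase and area. It is a minimal illustration in Python: the strategy names and their phase/area assignments shown here are assumptions for demonstration, since the full table is not reproduced in the text.

    from dataclasses import dataclass
    from typing import List, Tuple

    PHASES: Tuple[str, ...] = ("forethought", "performance", "reflection")
    AREAS: Tuple[str, ...] = ("cognition", "metacognition", "motivation", "behavior")

    @dataclass(frozen=True)
    class SRLStrategy:
        name: str
        phase: str  # one of PHASES
        area: str   # one of AREAS

    # Illustrative entries only; Table 1 lists the full set of 10 strategies.
    EXAMPLE_STRATEGIES: List[SRLStrategy] = [
        SRLStrategy("goal setting and planning", "forethought", "metacognition"),
        SRLStrategy("monitoring learning progress", "performance", "metacognition"),
        SRLStrategy("help seeking", "performance", "behavior"),
        SRLStrategy("self-evaluation", "reflection", "metacognition"),
    ]

    def strategies_for(phase: str, area: str) -> List[SRLStrategy]:
        """Return the example strategies that fall into a given phase/area cell."""
        return [s for s in EXAMPLE_STRATEGIES if s.phase == phase and s.area == area]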

Human factors affecting self-regulated learning

Effective SRL is essential for learners to achieve their learning goals in online learning. Previous research has investigated the impact of psychological aspects on SRL in online learning. Wong et al. (2019) explored the significance of individual differences for supporting SRL in online learning. Littlejohn et al. (2016) examined how learners’ motivation and commitment to learning can affect their ability to self-regulate their learning. Similarly, Kizilcec et al. (2017) observed that learners with strong motivation for taking a course exhibit more self-regulated learning behaviors. Self-efficacy also plays a crucial role in the conceptualization and development of SRL (Zimmerman, 2000a, 2000b). Learners with high self-efficacy are more likely to adopt SRL strategies to achieve their goals, while those with low self-efficacy may rely on external factors to regulate their learning (Zimmerman, 2013). Bannert and Reimann (2012) found that prompts were ineffective for learners with lower prior knowledge, who were unable to act accordingly when prompted. They suggest that understanding individual differences in commonly available learner characteristics, such as course intentions, education level, self-efficacy, and gender, in combination with real-time behavioral data, could enable adaptive scaffolding for learners.

The pedagogical support and expectations provided by teachers can have a significant impact on learners’ capacity for self-regulated learning. Research has shown that teachers who provide clear and structured instructions, offer feedback, and model self-regulated learning behaviors have a positive influence on learners’ SRL (Zimmerman & Schunk, 2011). Improving teacher support is also essential for promoting self-regulated learning in online learning environments (Albelbisi & Yusop, 2019). In particular, teachers’ expertise, timely and suitable feedback, motivation, and engagement with students in class discussions are educational characteristics that can have a positive effect on students’ self-regulation (Jouhari et al., 2015). Furthermore, teachers’ expectations of their learners’ ability to self-regulate can shape learners’ beliefs about their own ability to do so. For instance, when teachers have high expectations of their learners’ ability to self-regulate, learners are more likely to engage in SRL behaviors and achieve their learning objectives (Jang et al., 2010). According to Vansteenkiste et al. (2012), teacher expectations regarding learning tests and desirable behavior in class play an important role in initiating and regulating students’ learning behavior. Therefore, understanding the role of teacher support and expectations in fostering learners’ self-regulated learning is crucial for improving educational outcomes.

The educational environment can either support or hinder students’ ability to regulate their learning. According to Jouhari et al. (2015), the atmosphere and conditions of the learning environment can act as either facilitating or inhibiting factors for students’ self-regulation. In terms of automated feedback systems, Deeva et al. (2021) conducted a comprehensive review of 190 papers, and their analysis revealed that only 33% of the systems provided automated feedback that reflected learner characteristics. Among the traits used, learning style was the most frequently utilized (44.87%), followed by cognitive abilities (5.12%) (Normadhi et al., 2019). Regarding learner control over feedback, most of the reviewed papers (71.6%) did not give learners any control over the amount, timing, or frequency of automated feedback. Some systems (23.9%) provided mild control to students, but only a small percentage (4.6%) allowed students extensive control over the feedback type, timing, and/or appearance (Deeva et al., 2021). Although some studies reflect learner characteristics, a system that can adaptively implement teachers’ expectations is still needed. Moreover, because not all learners are motivated to outperform others (Jin, 2021), it is also necessary to examine learners’ needs with regard to goal setting.
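As a concrete illustration of the learner-control dimensions discussed above (amount, timing, and frequency of feedback), the following minimal sketch shows what a learner-controlled feedback preference record could look like; the option values and the frequency cap are hypothetical and are not drawn from any of the reviewed systems.

    from dataclasses import dataclass

    @dataclass
    class FeedbackPreferences:
        # Control dimensions reviewed by Deeva et al. (2021); the concrete option
        # values and the default cap below are illustrative assumptions only.
        amount: str = "summary"      # e.g., "summary" or "detailed"
        timing: str = "after_task"   # e.g., "immediate", "after_task", or "weekly"
        max_per_session: int = 3     # learner-chosen cap on feedback frequency

    def should_deliver(sent_this_session: int, prefs: FeedbackPreferences) -> bool:
        """Respect the learner's frequency cap before sending another feedback message."""
        return sent_this_session < prefs.max_per_session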

AI applications for supporting self-regulated learning

Various AI applications are expected to support students’ SRL in online learning. For example, Somasundaram and his colleagues (2020) developed an AI-based plan organizer that can help students set learning goals, suggest action plans, and offer study tips based on historical data drawn from an institution and current data drawn from student profiles and performance. Hussein and colleagues (2014) developed an AI-based question generation application that uses natural language processing to support students’ self-learning across different topics and fields. Craig and Schroeder (2017) developed a virtual human featuring various types of voices and demonstrated its beneficial effects on students’ learning outcomes and cognitive load. Luckin (2017) proposed an intelligent assessment and suggestion application that can suggest study materials and strategies to students based on data regarding each student’s interactions during the online learning process. Seo et al. (2021a, 2021b) proposed an AI analytics application that provides instructors with an analysis of students’ behavioral data (e.g., clickstream, quiz, login/logout, and eye-tracking data) and learning context (e.g., course week, exam, and rewatch). Woolf and colleagues (2010) developed an AI companion that enhances student motivation by providing emotional support to students whose learning progress is slower than planned and by making suggestions regarding the goals that they can achieve to support their desired career path after completing their studies. Goel and Polepeddi (2016) developed an AI agent that can answer student questions before, during, or after online courses based on answers to questions collected from previous courses. Srinivasa and her colleagues (2021) developed the NoteLink application, which allows students to take pictures of their notes to rediscover and play relevant videos on their smartphones or tablets. Ross et al. (2018) designed an adaptive quiz application in which AI provides students with a personalized set of exercise questions suited to their individual level of knowledge. Conati and her colleagues (2018) highlighted the need for research into the use of interpretable machine learning to enable AI to support students’ self-reflection over the course of a semester.
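To illustrate the adaptive quiz idea mentioned above (Ross et al., 2018), the following simplified sketch selects the unanswered question whose difficulty is closest to the learner’s estimated ability and nudges the estimate after each response; the selection and update rules are illustrative assumptions, not the algorithm of the cited system.

    from dataclasses import dataclass
    from typing import List, Optional, Set

    @dataclass
    class Question:
        qid: str
        difficulty: float  # normalized: 0.0 (easy) to 1.0 (hard)

    def next_question(ability: float, answered: Set[str], pool: List[Question]) -> Optional[Question]:
        """Pick the unanswered question whose difficulty best matches the learner's ability."""
        candidates = [q for q in pool if q.qid not in answered]
        if not candidates:
            return None
        return min(candidates, key=lambda q: abs(q.difficulty - ability))

    def update_ability(ability: float, correct: bool, step: float = 0.05) -> float:
        """Naive ability update; real systems typically rely on IRT-style estimation."""
        return min(1.0, ability + step) if correct else max(0.0, ability - step)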

Although various AI applications that can support students’ SRL have been proposed, what may happen when they are introduced into online learning remains unclear. Seo et al. (2021a, 2021b) found that students perceive AI applications as useful but simultaneously become dissatisfied due to the feeling that relying on AI reduces creativity. Students were also concerned about the potential issues pertaining to responsibility, agency, and surveillance that AI could raise in online learning. Understanding students’ perceptions of the use of AI applications to support SRL is critical to addressing the potential issues and challenges posed by AI in online learning. Such an understanding can help researchers design AI applications that successfully support students’ SRL while respecting social boundaries in online learning (Luria et al., 2020).

Research questions

To support learners’ use of SRL strategies in online learning (see Table 1), various forms of AI applications have been developed. Functionally, these AI applications work well, but it remains unclear whether students perceive them as helpful for improving SRL in online learning. Understanding learners’ perceptions of the use of AI applications to support SRL is important for preventing potential problems with AI and for utilizing AI applications more effectively. Our literature review revealed research gaps regarding students’ perceptions of using AI applications to support SRL and regarding the pedagogical and psychological aspects that students consider important for the effective use of these applications, indicating a need for research that addresses these issues. We therefore address the following two research questions:

  • RQ1: How do students perceive the use of AI applications to support SRL in four self-regulation areas (i.e., cognition, metacognition, motivation, and behavior) and in three learning phases (i.e., forethought, performance, and reflection)?

  • RQ2: What pedagogical and psychological aspects do students perceive as important when using AI applications that support SRL?

Methods

In this study, we used the speed dating method with storyboards, an exploratory research method that allows participants to experience different AI applications in the form of storyboards and encourages them to reflect honestly on the impact that each AI application might have on their SRL (Zimmerman & Forlizzi, 2017). Exposure to the many potential AI applications that may be available in the future helps participants shape their perspectives and evaluate AI applications in a more personal context (Luria et al., 2020). One advantage of the speed dating method with storyboards is that it is suitable for participants who have no AI knowledge or experience using AI applications (Luria et al., 2020; Zimmerman & Forlizzi, 2017). Another advantage is that it can be used even with a small number of participants. For example, Dillahunt et al. (2018) and Holstein et al. (2017) conducted speed dating studies with 11 and five participants, respectively. We employed the speed dating method as a design technique to gather insights on learners’ experiences with AI via interviews, and we subsequently used qualitative content analysis to examine the interview data. Our goal was not to evaluate specific AI applications but rather to explore areas in which AI applications would contribute positively to students’ SRL and areas in which more attention is necessary. We first created a set of 10 storyboards related to the use of different types of AI applications to support students’ SRL in online learning (see Sect. "Creating storyboards") and subsequently used these storyboards to conduct the speed dating activity with student participants (see Sect. "Speed dating").

Creating storyboards

To create AI application storyboards that are technically feasible and that positively support SRL, we conducted an online brainwriting activity (Linsey & Becker, 2011), as part of which we asked a team of educational AI researchers to suggest AI application scenarios that support students’ SRL in online learning. During this process, a theoretical framework for SRL strategies was used (see Table 1). Four independent researchers were recruited (three faculty members working in the field of educational technology and one faculty member working in the field of artificial intelligence) with an average of 14.8 years (SD = 6.6 years) of research experience in educational AI. Each team member created scenarios using a Google Docs file and passed that file on to the other team members. This process was repeated four times until all researchers agreed that the AI application scenarios created were technically feasible and supported students’ SRL strategies in online learning. A total of 10 AI application scenarios were created.

Subsequently, a focus group interview was conducted with four educational experts with an average of 14 years (SD = 6.4 years) of research and teaching experience to verify that the 10 AI application scenarios derived from the brainwriting activity would have a positive effect on students’ SRL in online learning. The first two authors conducted the focus group interview via a video conferencing platform (i.e., Zoom). We showed each scenario to the educational experts and asked the following questions: “Do you think the scenario is appropriate for the phases, areas, and categories of SRL?,” “Do you think this scenario can improve students’ SRL in online learning?,” and “Based on your online teaching and research experience, can you improve this scenario to help improve your students’ SRL?” After showing the experts all the scenarios, we asked the following question: “Do you have any research ideas that could be used as a new scenario?” The scenarios were modified to reflect the opinions of these educational experts. The focus group interview lasted approximately 60 min. After the expert interviews, documents containing the experts’ validated opinions were collected via e-mail. Each educational expert was compensated 145 US dollars for their time. This process was given clearance by the Institutional Review Board.

Overall, as shown in Table 2, 10 AI application scenarios were developed. These 10 final scenarios were not intended to cover all AI applications aimed at systematically improving SRL in online learning but rather to investigate changes in students’ SRL resulting from the use of AI applications in online learning.

Table 2 Self-regulated learning strategies and corresponding AI application scenario IDs, titles and summaries

We created storyboards based on the scenarios, which are shown in Table 2. Figure 1 contains an example of a storyboard that details the situation contained in a scenario with captions. We stylized the characters in the storyboards using a single visual style and flat cartoon shading to reduce gender and racial bias and to allow participants to immerse themselves in the characters in each storyboard (Truong et al., 2006; Zimmerman & Forlizzi, 2017). The full storyboards can be viewed at https://osf.io/6eb3v/?view_only=a81d3dfbb4e04a339175279ce318e93c.

Fig. 1 A storyboard example of AI Analytics (S05), which is summarized in Table 2

Speed dating

Participants

We recruited research participants using purposeful sampling. To address the research questions, participants were selected from students who had experienced at least six months of fully online learning at university and who varied in academic achievement level, major background, gender, and year of study. As a result, 16 university students (see Table 3) with backgrounds in 10 different majors across four academic disciplines were selected, considering the characteristics of qualitative research, in which sample size is determined by informational considerations (Lincoln & Guba, 1985). No knowledge of AI applications was required of participants, as we wanted them to focus on the impact of AI applications on their SRL per se. Previous studies have shown that speed dating works well without any prior knowledge of or experience with AI applications (Luria et al., 2020; Zimmerman & Forlizzi, 2017). Each participant was reimbursed for their time with a gift card for coffee.

Table 3 Summary of the university student participants’ demographic information

Procedure

We conducted semi-structured interviews with participants via a video conferencing platform (i.e., Zoom). We designed interview questions to understand how participants perceived the impact of the AI applications described in the storyboards on their SRL (see Appendix A). First, to help students recall their online learning experiences, we asked the following question: “What was the most difficult part of learning online?” Subsequently, participants read each storyboard aloud and explained how they thought the AI would affect their SRL in online learning. Specifically, we asked the following questions: “If this AI application were incorporated into your online learning, when and how would you use it?,” “If this AI application is unlikely to help your SRL in your online learning, why is that the case?,” and “What are some new things you would like to recommend or change with regard to this AI application? What are your concerns?” In addition, to obtain a holistic perspective on SRL in online learning, we asked participants to select the AI applications that they thought would or would not support SRL well. Note that to ensure that participants had the same or similar understanding of the AI applications described in the storyboards, we actively corrected any misconceptions participants had about what AI could do for their SRL. The interviews lasted 45.3 min each on average (SD = 9.3 min), with 3–5 min spent sharing each storyboard and examining participant responses.

Data analysis

Each interview was audio recorded and transcribed for analysis. The interview data were analyzed following a qualitative content analysis procedure (Glaser & Strauss, 1976; Schreier, 2012). First, all five authors repeatedly read the transcribed data and open-coded the content that appeared meaningful in relation to the research questions, identifying a total of 432 semantic codes. These semantic codes were then clustered using axial coding to derive 46 final codes. Of these 46 final codes, 25 represent learners’ perceptions of the use of AI applications to support SRL (RQ1) and are presented in Sect. "Learners’ perceptions on using AI applications to support SRL". The remaining 21 codes were grouped through selective coding into seven themes and three aspects related to the pedagogical and psychological considerations that affect learners’ use of AI applications (RQ2); these results are reported in Sect. "Pedagogical and psychological aspects for learners' use of AI applications". Codes and themes were checked to be mutually exclusive and distinct from one another. To ensure the validity of the data analysis, all authors participated in a total of seven rounds of coordination and discussion until consensus was reached.

Results

Learners’ perceptions on using AI applications to support SRL

Table 4 summarizes the analysis of the 16 participants’ responses regarding the use of AI applications to support SRL. Participants’ perceptions of how AI applications could be helpful for SRL were classified and organized according to the three learning phases (i.e., forethought, performance, and reflection) and four self-regulation areas (i.e., cognition, metacognition, motivation, and behavior). Most participants described intentions to use the AI applications in line with the SRL strategies that corresponded to the applications’ design intentions. Some students also reported being supported in other SRL strategies beyond the applications’ design intentions, while others stated that certain AI applications were not helpful for SRL. Quotations are labeled with scenario ("S") and participant ("P") identifiers when specific responses for each area of SRL are presented.

Table 4 Learners’ perceptions of SRL support by AI applications

Supporting SRL during the forethought phase

SRL strategies during the forethought phase include planning, recognizing task value, and checking prior knowledge. All 16 participants responded that they would use the Plan Organizer (S01) to plan their learning. Students acknowledged that the AI Companion (S06) would be effective in motivating them, especially when preparing for an exam or employment (P04, P05, P10, and P12). In addition, the companion was seen as useful “because it provides a basis for changing the learning method” (P04) or for modifying the learning plan (P02) by allowing students to recognize their deficiencies. However, some participants expressed the opinion that “university students are not motivated just by highlighting the possibility of career achievement because they are realistic” (P05 and P10). Learners would use the Pre-question Generator (S02) to ensure that they had acquired prior knowledge (P01, P03, P04, P06, P09, P11, and P12) or to prepare for the next lesson (P10 and P13). However, some participants stated that they would not use it, either because participating in the quiz was bothersome (P04 and P07) or because it was not reflected in their grades (P15 and P16).

Supporting SRL during the performance phase

The core strategies associated with the performance phase include applying cognitive or motivational strategies, monitoring the learning process, and seeking help with difficulties. Participants noted their intent to use the Intelligent Suggestion (S04) to apply cognitive regulation strategies such as enhancing their content understanding (P03, P05, P06, P07, P08, P09, P10, and P12) and implementing learning strategies (P01, P02, P03, P04, P05, P10, and P11). An additional advantage of promoting such an understanding of learning content is that “learners were less likely to procrastinate on their studies” (P09). Only two participants (P05 and P08) perceived that the Virtual Human (S03), designed to support motivational regulation, increased their focus on learning. Most participants instead noted that it did not help them focus (P03, P04, P05, P09, P10, and P11) or that it interfered with learning (P02, P06, P07, P10, P12, P13, P14, and P15). All participants found the AI Agent (S07) useful because it provides immediate answers to questions. Learners would use AI Analytics (S05) to monitor their learning and establish learning strategies for cognitive regulation. In addition, most participants noted that AI Analytics provided motivation for learning participation, and P04 and P13 suggested that it also increased confidence through comparison with other learners.

Supporting SRL during the reflection phase

During the reflection phase, learners engaged in critical thinking and self-reflection by reviewing learning content and evaluating their own learning processes and results. Core strategies such as reviewing, self-evaluation, and self-satisfaction were commonly applied. Participants reported that the Adaptive Quiz (S09) helped them check how much of the learning content they understood (P01, P05, P06, P08, P10, P11, P12, and P13) and enhanced their understanding of that content. In addition, P02 suggested that the Adaptive Quiz contributed to learners’ feelings of achievement. However, P04 raised a problem in this context: "Solving questions that fit my cognitive level does not help me achieve the learning objectives of the course.” All participants mentioned that they would use NoteLink to review what they did not know from their notes. Self-satisfaction was mainly mentioned when learners felt recognized, praised, and comfortable with their performance after using AI Reflection (S10). For example, P02 said, "Praise will improve not only the sense of achievement and willingness to learn but also self-esteem.” On the other hand, some students (P01, P05, and P07) did not intend to use AI Reflection because the praise they received was not reflected in their grades.

Overall, learners recognized that AI applications supporting metacognitive, cognitive, and behavioral regulation across the different areas of SRL are generally helpful, whereas AI applications supporting motivational regulation (S03 and S06) are not. AI Analytics (S05) was chosen as the most helpful AI application for learning by 10 of the 16 participants, while five participants chose the AI Agent (S07). These findings confirm that learners find dashboards that objectively present their learning status or functions that automatically answer questions to be more useful. However, no participant selected the Virtual Human (S03) or the AI Companion (S06), both of which support motivational regulation, as the most helpful AI application for learning. This finding suggests that AI applications intended to affect learners’ minds may have different effects for each learner. Furthermore, 13 learners selected the Virtual Human and three learners selected AI Reflection as the AI application they would use the least. From this point of view, AI applications must take a more detailed approach that accounts for the needs and characteristics of students in order to have effects consistent with their intention to support learners’ self-regulation.

Pedagogical and psychological aspects for learners' use of AI applications

Participants recognized that for learners to effectively utilize AI applications that support SRL, three major pedagogical and psychological aspects should be considered: learner identity, learner activeness, and learner position. Each aspect emphasizes the multilayered characteristics of the learner in context, and the extent to which an AI application takes these factors into account shapes both the advantages and the limitations that learners perceive in using it. Table 5 presents the aspects revealed by the analysis, along with the related themes and codes.

Table 5 Summary of three aspects for learners’ use of AI applications in supporting self-regulated learning

Learner identity: developing learner and differentiating learner

Learner identity is defined in terms of how individuals feel about themselves as learners and the extent to which they describe themselves as ‘learners’ (Lawson, 2014). Participants noted that their personal characteristics, such as their circumstances, learning styles, life patterns, and interests, can change continuously. For example, some learners claimed that "an individual's life pattern may change frequently due to part-time work outside of schoolwork” (P04) and that "the learner’s level of interest in his or her major changes frequently” (S06, P01). These opinions were mainly expressed with regard to the Plan Organizer (S01), Intelligent Suggestion (S04), and AI Companion (S06), because these scenarios predict future behaviors based on learners' historical data. When participants felt that AI prescriptions did not sufficiently consider the various variables associated with learners, some regarded them as pressure to follow, and learners inevitably came to doubt whether their learning could be improved through AI. For example, P14 identified himself as a developing learner with ‘growth potential’ and suggested that it is important to ensure that AI applications do not end up limiting learners’ abilities, by taking the learner’s identity into consideration.

At the same time, participants differentiated themselves from fellow learners based on various characteristics, such as whether they were interested in celebrities (S03, P07) or whether they had decided on a career path to continue their studies (S06, P03, P04, P06, P13, P14, and P15). In most scenarios, participants judged the usefulness of an AI application by how well it suited their personal characteristics. If the AI’s suggestions did not match their individual characteristics, participants felt negative emotions, such as “AI applications are interfering with me” (S06, P02) or “pressuring me” (S06, P08). P11 likewise expressed, “I am afraid that they will continue to compare me to my fellow students based on the same standards” (S05). In this regard, participants suggested that "learners should be able to directly add or modify the application’s suggestions based on their personal information” (S01, P13 and P15).

Learner activeness: AI dependence, learner-AI cooperation, and learner agency

Learners expressed opinions on different levels of active engagement in their relationship with AI applications, ranging from AI dependence to learner agency. Learner agency is a key element of self-determined learning, in which learners take full responsibility for their own learning experience; it is emphasized even more in online learning, which requires learners to have self-regulation ability (Agonács & Matos, 2021). Dependence on AI can be viewed as the experience that most weakens learner agency. Participants were most concerned about the possibility of becoming dependent on the AI Agent (S07), because it was thought that the AI Agent would “interfere with helping creatively perform the tasks alone” (P02) and raise the possibility of “students solving the task in the same way” (P15). This reflects the concern that “AI can impede learners’ cognitive efforts” (P14).

However, AI was also viewed as a useful way for learners to engage in active learning. We termed this phenomenon "learner-AI cooperation," in the sense that learners collaborate with AI while retaining a leading position. Time efficiency, just-in-time help, and convenience are the features of the “learner-AI cooperation” strategy. Intelligent Suggestion (S04) has the effect of “reducing the time required for learning” (P01) and "preventing cramming learning” (P09) because it suggests lecture content tailored to learners’ styles. The AI Agent (S07) is easy to use “when it is difficult to ask a professor questions late at night” (P03) and “when I want to check what I do not know right away” (P12). Some students noted that using the AI Agent would be convenient when they were “ashamed to ask an instructor publicly using the LMS [learning management system]” (P13 and P14) or “confused about whether a question is too trivial to ask a teacher” (P07).

Meanwhile, some participants presented pedagogical alternatives to support learner agency. Focusing on five scenarios (S01, S04, S05, S09, and S10), they suggested that it would be better for AI applications to point out ‘learning directions’ rather than provide ‘correct answers’ to learning tasks (coded as ‘process oriented’). In the case of AI Reflection (S10), the AI should “encourage growth in the learner’s deficient areas” (P11 and P12) and "tell learners how to make up for deficiencies” (P12). Additionally, participants wanted their own thoughts and needs to be reflected in the data the AI uses (coded as ‘content modification’). They hoped to control the data analyzed by the AI by selecting “the topics they are interested in” (S02, P10) or “the problem-solving level that suits them” (S09, P12). These opinions suggest that supporting learners’ activeness is an important pedagogical and psychological aspect for students when using AI applications to support SRL.

Learner position: independent learner and dependent learner

Regarding the use of AI applications, participants found themselves positioned as both ‘independent learners’ and ‘dependent learners.’ That is, they are students who engage in learning responsibly, but they are simultaneously subordinated to evaluations, credit requirements, and instructors. This ambivalence in the learner's position led learners to perceive the educational usefulness of, and concerns regarding, AI applications differently. First, as ‘independent learners,’ participants viewed AI applications as providing various forms of cognitive support, such as helping them “identify core content” (S02) or “remember previously learned content for a long time” (S09). This usefulness ultimately promoted learning efficiency, such as “reducing course learning time” (S04, P11), “reducing the effort required to search for customized materials” (S04, P14), and “helping to prepare effectively for the test” (S09, P12). As independent learners, participants also expected to be able to check what they did well and what they lacked during the semester and to improve their ‘self-understanding’ through AI applications (S10, P02). They asked for specific feedback and advice on their learning status through the dashboard to enhance their motivation and self-esteem for learning (S10, P02; S05, P10, P11, P12, and P15).

On the other hand, as 'dependent learners,' participants focused mainly on the relationship between the use of AI applications and exams or grades. Even if learning information is tailored to the level of individual learners, its usefulness for tests or credits becomes the most important criterion for whether or not to use it (P01, P03, P15, and P16). Participants perceived quizzes or tests given under conditions unrelated to the final grade or credits as merely a “psychological burden” (S02, P11) or “pressure” (S02, P10). In other words, in relation to “fairness” (P10), it was considered important that learning materials customized to learners should not affect the quality of the final evaluation. It is also characteristic of dependent learners that participants wished to be able to choose whether to disclose the results of the quiz (S02) or AI Analytics (S05) (coded as ‘selective disclosure’). Participants considered the instructor’s requirements rather than their own needs when deciding whether to use AI applications, suggesting, for example, that the quiz questions should match the level of the test questions presented by professors rather than their own learning level (S09, P07). This is related to the university’s evaluation system, in which instructors hold the final decision-making authority. Accordingly, some participants expressed the opinion that direct communication with the instructor would be more beneficial in terms of professionalism (S03, P05; S07, P01), indicating a preference for instructor guidance over AI support as a factor affecting their learning motivation (coded as ‘teacher preference’). This means that learners’ individual learning should not be disconnected from the externally set achievement goals they are expected to reach, and that customized support through AI applications should also have realistic usefulness for learners in this regard.

Discussion

The study results indicated that learners would use AI applications designed to support SRL strategies as intended. This finding aligns with Garcia et al.’s (2018) systematic literature review of e-learning tools, including AI applications, which demonstrated their effectiveness in supporting the 14 self-regulated learning strategies proposed by Zimmerman (1989). However, some participants perceived that AI applications were not useful for supporting motivational regulation. It is important to note that learning is a complex process that involves emotional factors, such as motivation and context, extending beyond cognitive capacity (Bates et al., 2020). Online learning environments face challenges related to a decline in students' motivation, particularly for those who have not developed SRL skills (Karaoglan Yilmaz et al., 2018; Karaoglan Yilmaz & Yilmaz, 2021). Jones and Castellano (2018) suggest that SRL tutoring can help less able learners increase their motivation to engage in SRL practices. However, the present study suggests that AI support may not effectively address this challenge. Learners expressed a preference for human support for motivational regulation due to the trust relationship built with instructors. This finding is consistent with Krüger and Wilson's (2022) study, which suggests that users may not follow AI recommendations without clear trust relationships. To address this challenge, future research is needed to understand the mechanisms that build trust in AI systems and to develop strategies to promote motivation in online learning environments.

Next, the results indicated that the human factors affecting learners’ SRL were perceived at a more sensitive and detailed level in learning environments that use AI applications, where learners are not directly visible to teachers or fellow learners. Learners responded sensitively and tried to capture cues about the human factors that formally or informally shaped their relationships with instructors and fellow learners through AI support. Chen et al. (2014) suggested that the success of AI applications depends heavily on how learners utilize them. Hence, to support SRL effectively through AI applications, it is crucial to design them by considering the pedagogical and psychological aspects of the students using them, rather than focusing only on technical aspects. First, when designing an AI application, it is important to prioritize the learner’s potential progression rather than relying solely on patterns in their past data, because elements of the learner's identity, including their circumstances, learning style, lifestyle, and interests, are subject to ongoing change. For example, in ‘Plan Organizer,’ the AI should be designed to offer suggestions and guidance regarding the action plans the learner must take in his or her current circumstances, even if the plan appears unfavorable based on the learner’s historical data. Second, for successful SRL support, the degree of AI intervention should be adjusted according to the learner’s activeness. If the learner exhibits low activeness, the AI application should take a more proactive role in aiding their SRL; conversely, if the learner is highly active, the AI application should gradually diminish its intervention, enabling the learner to undertake more self-determined learning. The AI’s role should be to bolster, not supplant, the learner’s SRL. This finding can be viewed as a practical design implication for realizing the hybrid human-AI regulation system described by Molenaar (2022). Lastly, when designing AI applications, researchers must carefully consider the positions of learners, instructors, and AI. Dependent learners, who mainly prioritize exams or grades, might sidestep AI assistance even if it improves their SRL when they perceive that the AI application does not contribute to grade improvement. Therefore, in the design phase of AI applications, instructors need to contemplate the influence of AI applications on the interaction between learner and instructor (Seo et al., 2021a, 2021b) and then decide whether to utilize such applications merely as supplementary tools or as integral components embedded within the curriculum. Overall, to support students’ SRL more effectively in online environments, it is crucial to consider pedagogical and psychological design aspects such as learner identity, learner activeness, and learner position during the design and deployment of AI applications.
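As a rough illustration of the second design principle, the sketch below maps an estimate of learner activeness to a coarse level of AI intervention; the thresholds and level names are assumptions for illustration rather than values derived from the study.

    def intervention_level(activeness: float) -> str:
        """Map a learner-activeness estimate in [0, 1] to a coarse AI intervention level.

        The thresholds and level names are illustrative assumptions, not values
        reported in the study.
        """
        if activeness < 0.3:
            return "proactive"    # AI initiates plans, reminders, and suggestions
        if activeness < 0.7:
            return "cooperative"  # AI suggests options; the learner decides
        return "on_demand"        # AI responds only when the learner asks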

Conclusion

In this study, we examined learners’ perceptions of the use of AI applications to support SRL, as well as the related pedagogical and psychological aspects, through a speed dating activity with storyboards. The results showed that learners perceived the use of AI applications to support metacognitive, cognitive, and behavioral regulation in the context of SRL as useful, but they did not perceive the use of AI applications to support motivational regulation as useful. Additionally, three pedagogical and psychological aspects (i.e., learner identity, learner activeness, and learner position) were found to be necessary considerations for the development and utilization of AI applications to support learners’ SRL. Our proposed theoretical and practical implications regarding the ability of AI applications to support students’ SRL in online learning could be used to realize human-AI symbiosis in education.

Availability of data and materials

The full set of storyboards can be viewed at https://osf.io/6eb3v/?view_only=a81d3dfbb4e04a339175279ce318e93c.

References

  • Agonács, N., & Matos, J. F. (2021). Learner agency in distance education settings: Understanding language MOOC learners’ Heutagogical attributes. In S. Hase & L. M. Blaschke (Eds.), Unleashing the Power of Learner Agency. EdTech Books. http://edtechbooks.org/up/MOOC

  • Albelbisi, N. A., & Yusop, F. D. (2019). Factors influencing learners’ self-regulated learning skills in a massive open online course (MOOC) environment. Turkish Online Journal of Distance Education, 20(3), 1–16.

  • Bannert, M., & Reimann, P. (2012). Supporting self-regulated hypermedia learning through prompts. Instructional Science, 40, 193–211.

  • Bates, T., Cobo, C., Mariño, O., & Wheeler, S. (2020). Can artificial intelligence transform higher education? International Journal of Educational Technology in Higher Education, 17(1), 1–12.

  • Chen, C. M., Wang, J. Y., & Chen, Y.-C. (2014). Facilitating English-language reading performance by a digital reading annotation system with self-regulated learning mechanisms. Educational Technology & Society, 17(1), 102–114.

  • Conati, C., Porayska-Pomsta, K., & Mavrikis, M. (2018). AI in Education needs interpretable machine learning: Lessons from Open Learner Modelling. arXiv preprint arXiv:1807.00154.

  • Craig, S. D., & Schroeder, N. L. (2017). Reconsidering the voice effect when learning from a virtual human. Computers & Education, 114, 193–205.

  • Deeva, G., Bogdanova, D., Serral, E., Snoeck, M., & De Weerdt, J. (2021). A review of automated feedback systems for learners: Classification framework, challenges and opportunities. Computers & Education, 162, 104094.

  • Dillahunt, T. R., Lam, J., Lu, A., & Wheeler, E. (2018). Designing future employment applications for underserved job seekers: a speed dating study. In Proceedings of the 2018 Designing Interactive Systems Conference (pp. 33–44).

  • Garcia, R., Falkner, K., & Vivian, R. (2018). Systematic literature review: Self-regulated learning strategies using e-learning tools for computer science. Computers & Education, 123, 150–163.

  • Glaser, B., & Strauss, A. (1976). The discovery of grounded theory. Aldine.

  • Goel, A. K., & Polepeddi, L. (2016). Jill Watson: A virtual teaching assistant for online education. Georgia Institute of Technology.

  • Greene, J. A., & Azevedo, R. (2010). The measurement of learners’ self-regulated cognitive and metacognitive processes while using computer-based learning environments. Educational Psychologist, 45(4), 203–209. https://doi.org/10.1080/00461520.2010.515935

  • Holstein, K., McLaren, B. M., & Aleven, V. (2017). Intelligent tutors as teachers' aides: exploring teacher needs for real-time analytics in blended classrooms. In Proceedings of the seventh international learning analytics & knowledge conference (pp. 257–266).

  • Hu, H., & Driscoll, M. P. (2013). Self-regulation in e-learning environments: A remedy for community college? Journal of Educational Technology & Society, 16(4), 171–184.

  • Hussein, H., Elmogy, M., & Guirguis, S. (2014). Automatic English question generation system based on template driven scheme. International Journal of Computer Science Issues (IJCSI), 11(6), 45.

  • Jang, H., Reeve, J., & Deci, E. L. (2010). Engaging students in learning activities: It is not autonomy support or structure but autonomy support and structure. Journal of Educational Psychology, 102(3), 588.

  • Jansen, R. S., van Leeuwen, A., Janssen, J., Conijn, R., & Kester, L. (2020). Supporting learners’ self-regulated learning in Massive Open Online Courses. Computers & Education, 146, 103771.

  • Jin, S. (2021). Educational effects on the transparency of peer participation levels in asynchronous online discussion activities. IEEE Transactions on Learning Technologies, 14(5), 604–612.

  • Jivet, I., Scheffel, M., Schmitz, M., Robbers, S., Specht, M., & Drachsler, H. (2020). From students with love: An empirical study on learner goals, self-regulated learning, and sense-making of learning analytics in higher education. The Internet and Higher Education, 47, 100758.

  • Jones, A., & Castellano, G. (2018). Adaptive robotic tutors that support self-regulated learning: A longer-term investigation with primary school children. International Journal of Social Robotics, 10(3), 357–370.

  • Jouhari, Z., Haghani, F., & Changiz, T. (2015). Factors affecting self-regulated learning in medical students: A qualitative study. Medical Education Online, 20(1), 28694.

  • Karaoglan Yilmaz, F. G., & Yilmaz, R. (2021). Learning analytics as a metacognitive tool to influence learner transactional distance and motivation in online learning environments. Innovations in Education and Teaching International, 58(5), 575–585.

  • Karaoglan Yilmaz, F. G., Olpak, Y. Z., & Yilmaz, R. (2018). The effect of the metacognitive support via pedagogical agent on self-regulation skills. Journal of Educational Computing Research, 56(2), 159–180.

  • Kim, C., & Hodges, C. B. (2012). Effects of an emotion control treatment on academic emotions, motivation, and achievement in an online mathematics course. Instructional Science, 40(1), 173–192.

  • Kizilcec, R. F., Pérez-Sanagustín, M., & Maldonado, J. J. (2017). Self-regulated learning strategies predict learner behavior and goal attainment in Massive Open Online Courses. Computers & Education, 104, 18–33.

  • Krüger, S., & Wilson, C. (2022). The problem with trust: on the discursive commodification of trust in AI. AI & SOCIETY, 1–9.

  • Lawson, A. (2014). Learner identities in the context of undergraduates: A case study. Educational Research, 56(3), 343–356.

  • Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Sage.

  • Linsey, J. S., & Becker, B. (2011). Effectiveness of brainwriting techniques: comparing nominal groups to real teams. In: Design creativity 2010 (pp. 165–171). Springer.

  • Littlejohn, A., Hood, N., Milligan, C., & Mustain, P. (2016). Learning in MOOCs: Motivations and self-regulated learning in MOOCs. The Internet and Higher Education, 29, 40–48. https://doi.org/10.1016/j.iheduc.2015.12.003

  • Lodge, J. M., Panadero, E., Broadbent, J., & De Barba, P. G. (2019). Supporting self-regulated learning with learning analytics. In Lodge, J., Horvath, J., & Corrin, L. (Eds.). Learning analytics in the classroom: Translating learning analytics research for teachers. ACADEMIA, 45–55.

  • Luckin, R. (2017). Towards artificial intelligence-based assessment systems. Nature Human Behavior, 1(3), 1–3.

  • Luria, M., Zheng, R., Huffman, B., Huang, S., Zimmerman, J., & Forlizzi, J. (2020). Social boundaries for personal agents in the interpersonal space of the home. In: Proceedings of the 2020 CHI conference on human factors in computing systems (pp. 1–12).

  • Molenaar, I. (2022). The concept of hybrid human-AI regulation: Exemplifying how to support young learners’ self-regulated learning. Computers and Education: Artificial Intelligence, 3, 100070.

  • Normadhi, N. B. A., Shuib, L., Nasir, H. N. M., Bimba, A., Idris, N., & Balakrishnan, V. (2019). Identification of personal traits in adaptive learning environment: Systematic literature review. Computers & Education, 130, 168–190.

  • Noroozi, O., Alikhani, I., Järvelä, S., Kirschner, P. A., Juuso, I., & Seppänen, T. (2019). Multimodal data to design visual learning analytics for understanding regulation of learning. Computers in Human Behavior, 100, 298–304.

  • Panadero, E. (2017). A review of self-regulated learning: Six models and four directions for research. Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2017.00422

  • Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16(4), 385–407.

  • Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33–40.

    Article  Google Scholar 

  • Roll, I., Wiese, E. S., Long, Y., Aleven, V., & Koedinger, K. R. (2014). Tutoring self-and co-regulation with intelligent tutoring systems to help students acquire better learning skills. Design Recommendations for Intelligent Tutoring Systems, 2, 169–182.

    Google Scholar 

  • Roll, I., & Winne, P. H. (2015). Understanding, evaluating, and supporting self-regulated learning using learning analytics. Journal of Learning Analytics, 2(1), 7–12.

    Article  Google Scholar 

  • Rosé, C. P., McLaughlin, E. A., Liu, R., & Koedinger, K. R. (2019). Explanatory learner models: Why machine learning (alone) is not the answer. British Journal of Educational Technology, 50(6), 2943–2958. https://doi.org/10.1111/bjet.12858

    Article  Google Scholar 

  • Ross, B., Chase, A. M., Robbie, D., Oates, G., & Absalom, Y. (2018). Adaptive quizzes to increase motivation, engagement, and learning outcomes in a first-year accounting unit. International Journal of Educational Technology in Higher Education, 15(1), 30.

    Article  Google Scholar 

  • Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19(4), 460–475.

    Article  Google Scholar 

  • Schreier, M. (2012). Qualitative content analysis in practice. Sage publications.

    Google Scholar 

  • Schunk, D. H. (2005). Self-regulated learning: The educational legacy of Paul R. Pintrich. Educational Psychologist, 40(2), 85–94.

    Article  Google Scholar 

  • Schunk, D. H., & Zimmerman, B. J. (1994). Self-regulation of learning and performance: Issues and educational applications. Lawrence Erlbaum Associates Inc.

    Google Scholar 

  • Seo, K., Dodson, S., Harandi, N. M., Roberson, N., Fels, S., & Roll, I. (2021a). Active learning with online video: The impact of learning context on engagement. Computers & Education, 165, 104132.

    Article  Google Scholar 

  • Seo, K., Tang, J., Roll, I., Fels, S., & Yoon, D. (2021b). The impact of artificial intelligence on learner-instructors in online learning. International Journal of Educational Technology in Higher Education, 18(1), 1–23.

    Article  Google Scholar 

  • Somasundaram, M., Junaid, K. M., & Mangadu, S. (2020). Artificial intelligence (AI) enabled intelligent quality management systems (IQMS) for the personalized learning path. Procedia Computer Science, 172, 438–442.

    Article  Google Scholar 

  • Srinivasa, R. J., Dodson, S., Seo, K., Yoon, D., & Fels, S. (2021). NoteLink: A Point-and-Shoot Linking Interface between Students' Handwritten Notebooks and Instructional Videos. In 2021 ACM/IEEE Joint Conference on Digital Libraries (JCDL) (pp. 140–149). IEEE.

  • Taub, M., Azevedo, R., Bouchet, F., & Khosravifar, B. (2014). Can the use of cognitive and metacognitive self-regulated learning strategies be predicted by learners’ levels of prior knowledge in hypermedia-learning environments? Computers in Human Behavior, 39, 356–367.

    Article  Google Scholar 

  • Truong, K. N., Hayes, G. R., & Abowd, G. D. (2006). Storyboarding: an empirical determination of best practices and effective guidelines. In: Proceedings of the 6th conference on designing interactive systems (pp. 12–21).

  • Vansteenkiste, M., Sierens, E., Goossens, L., Soenens, B., Dochy, F., Mouratidis, A., et al. (2012). Identifying configurations of perceived teacher autonomy support and structure: Associations with self-regulated learning, motivation and problem behavior. Learning and Instruction, 22(6), 431–439.

    Article  Google Scholar 

  • Wigfield, A., Klauda, S. L., & Cambria, J. (2011). Influences on the development of academic self-regulatory processes. In D. H. Schunk & B. Zimmerman (Eds.), Handbook of self-regulation of learning and performance (pp. 33–48). Routledge.

    Google Scholar 

  • Winne, P. H., & Baker, R. S. J. D. (2013). The potentials of educational data mining for researching metacognition, motivation, and self-regulated learning. JEDM—Journal of Educational Data Mining, 5(1), 1–8. https://doi.org/10.1037/1082-989X.2.2.131

    Article  Google Scholar 

  • Wong, J., Baars, M., Davis, D., Van Der Zee, T., Houben, G. J., & Paas, F. (2019). Supporting self-regulated learning in online learning environments and MOOCs: A systematic review. International Journal of Human-Computer Interaction, 35(4–5), 356–373.

    Article  Google Scholar 

  • Woolf, B. P., Arroyo, I., Muldner, K., Burleson, W., Cooper, D. G., Dolan, R., & Christopherson, R. M. (2010). The effect of motivational learning companions on low achieving students and students with disabilities. In: International conference on intelligent tutoring systems (pp. 327–337). Springer, Berlin, Heidelberg.

  • Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on artificial intelligence applications in higher education—Where are the educators? International Journal of Educational Technology in Higher Education, 16(1), 1–27.

    Article  Google Scholar 

  • Zimmerman, B. J. (1989). A social cognitive view of self-regulated academic learning. Journal of Educational Psychology, 81(3), 329.

    Article  Google Scholar 

  • Zimmerman, B. J. (2000a). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 13–39). Academic Press.

    Chapter  Google Scholar 

  • Zimmerman, B. J. (2000b). Self-efficacy: An essential motive to learn. Contemporary Educational Psychology, 25(1), 82–91.

    Article  Google Scholar 

  • Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal, 45(1), 166–183.

    Article  Google Scholar 

  • Zimmerman, B. J. (2013). From cognitive modeling to self-regulation: A social cognitive career path. Educational Psychologist, 48(3), 135–147.

    Article  Google Scholar 

  • Zimmerman, B. J., & Pons, M. M. (1986). Development of a structured interview for assessing student use of self-regulated learning strategies. American Educational Research Journal, 23(4), 614–628.

    Article  Google Scholar 

  • Zimmerman, B. J., & Schunk, D. H. (2011). Handbook of self-regulation of learning and performance. Routledge.

    Google Scholar 

  • Zimmerman, J., & Forlizzi, J. (2017). Speed dating: Providing a menu of possible futures. She Ji: THe Journal of Design, Economics, and Innovation, 3(1), 30–50.

    Google Scholar 

Download references

Acknowledgements

The authors would like to thank all students, educational experts, and AI experts for their great support and inspiration.

Funding

This research was supported by the research fund of Hanbat National University.

Author information


Contributions

SJ: Conceptualization, Methodology, Investigation, Writing—original draft, Visualization, Project administration, Funding acquisition; KI, MY, & IR: Methodology, Investigation, Writing—original draft; KS: Conceptualization, Methodology, Investigation, Writing—original draft, Supervision, Project administration.

Corresponding author

Correspondence to Kyoungwon Seo.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A: speed-dating interview script

Introduction

  • Hello, thank you for taking the time for this interview today. We’re really looking forward to learning from your experience with online learning.

  • Today, we’ll be discussing a set of 10 storyboards that are related to AI applications to support students’ self-regulated learning (SRL) in online learning. These AI applications are black-box models built using data collected from large numbers of students.

  • When reading the storyboards, try to think about them in the context of your discipline and experiences. Our goal is to understand your perceptions of using AI to support SRL in online learning.

  • For your information, the interview will take about 50 minutes. It will be audio-recorded, and the recording will be kept confidential and deidentified.

  • (To help participants recall their own online learning experiences) What was the most difficult part of learning online?

For each storyboard

  • Do you think this AI application supports SRL? Yes, no, or do you feel neutral? Why?

  • If this AI application were incorporated into your online learning, when and how would you use it?

  • If this AI application seems unlikely to help your SRL in online learning, why is that the case?

  • What are some new things you would like to recommend or change with regard to this AI application? What are your concerns?

After examining all storyboards (capturing participants’ holistic point of view)

  • Of the storyboards shown today, which AI applications do you think would support SRL well in online learning? Why?

  • Which AI applications do you think would not support SRL well in online learning? Why?

Conclusion

  • Do you have any final comments?

  • Thank you for taking the time to interview with us today. We truly appreciate your participating in our study and sharing your expertise. Your insights were very helpful.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Jin, SH., Im, K., Yoo, M. et al. Supporting students’ self-regulated learning in online learning using artificial intelligence applications. Int J Educ Technol High Educ 20, 37 (2023). https://doi.org/10.1186/s41239-023-00406-5

Keywords