
Investigating temporal access in a flipped classroom: procrastination persists

Abstract

This paper reports on a study that examines the learning behaviors and characteristics of students in a mobile applications computer programming class that adopted a “flipped” learning style. By harvesting learning analytics data from a learning management system, we created visualizations of work intensity to explore temporal patterns in students’ behavior and then correlated those patterns with the students’ performance. Findings indicate that low, medium, and high performing students all tend to access learning materials late, with work intensity spiking on the lecture day, specifically during the lecture session. While high and low performing students show no difference in temporal access to material, medium performing students demonstrate the greatest degree of vibrancy regarding course content material access. Further, a discussion of implications and insights on procrastination in the context of flipped classrooms is included.

Introduction

Despite a great deal of effort in supporting learning among students in university-based computer science (CS) courses, many students struggle. In a multi-national study of failure rates among students in introductory computer programming classes at the university level, it was found that even small improvements in failure rates could have a great deal of impact on the field. The authors of this study (Porter, Guzdial, McDowell, & Simon, 2013) state “Assuming that the pass rate found in this survey is representative, approximately 650,000 students every year do not pass CS1. In this light, just a small improvement of the pass rate of CS1 would cause a gigantic increase in the number of students passing (and perhaps eventually graduating) – a one percent increase in the pass rate means 20,000 students extra passing CS1” (p. 35). The authors note that even small changes in the pass rates of students taking entry level computer science courses would impact tens of thousands of students who might otherwise be discouraged by their struggles and leave computer science pursuits altogether.

However, there have been laudable efforts in the reform of computer science instruction. Pair programming, media computation, and peer instruction practices have been found to greatly enhance students’ success rates and increase student retention in the field (Bennedsen & Caspersen, 2007). Studies focused on computer programming classes have identified several key factors related to student success, perseverance, and attrition at major universities. These factors include (1) previous computing experience; (2) work style preference; (3) self-efficacy for computer programming; (4) poor math skills; and (5) poor advising (Beaubouef & Mason, 2005).

It is clear that many students of computer science struggle. We also know that it is very important for computer science students to learn to regulate their behavior as they take on challenging course work early in their studies. Such self-regulation of learning behavior is associated with higher student grades and long-term retention (Shell, Hazley, Soh, Ingraham, & Ramsay, 2013). Procrastination is a challenge that has both academic and extra-academic dimensions, and procrastinating behaviors should be addressed early in a student’s career to prevent the emergence of problems later (Senécal, Julien, & Guay, 2003).

The “flipped classroom” model is a pedagogical approach that offers greater flexibility and more active student engagement than traditional teacher-centered strategies. It reverses the typical pattern in which a lecture is followed by homework assignments: students study the lecture material before class, and class time is devoted to problem-solving exercises, hands-on projects, or in-depth discussions. This model has increased in popularity among students, particularly low achievers (Nouri, 2016). Learning Management Systems (LMSs) such as Moodle, Canvas, Blackboard, and Desire2Learn have become key components in implementing this teaching style.

Fortunately, such systems not only provide an information conduit for students to access course content and complete course assignments, but also provide a rich repository of data regarding students’ behavior. An LMS may record when students access materials and can indicate precisely what materials are accessed. By finding ways to make this data speak to instructors, we may provide them with a tool for making data-driven decisions about their pedagogy. It is now possible for instructors to actually see what materials students access, which students access the materials, and how much time they spend accessing materials. For example, it has been found that digital textbook analytics may be used as an effective early warning system to identify students at risk of academic failure. Junco and Clem point out that such data can be collected in an unobtrusive manner and used to predict problems with students’ performance early (Junco & Clem, 2015). This adds to what instructors already do with assignments and assessments and gives them the ability to help struggling students in more timely and targeted ways. While the online tools that computer science students currently use may be leveraged for learning, student behavior online is complex and difficult to predict. Hao, Wright, Barnes, and Branch, in a study of student online behavior, found that problem difficulty is the most relevant factor in predicting students’ online assistance-seeking behaviors (Hao, Wright, Barnes, & Branch, 2016). But is it possible to determine whether students are struggling with self-regulation, procrastination, or problem difficulty? To understand this, instructors have to come up not only with appropriate and meaningful ways to present materials and assess student success, but also with ways to take advantage of the data harvested by the LMS used in the courses they teach. This paper reports on a study that used multiple data points to assess learning and instruction in a “flipped” computer science course. The objective of the inquiry was to examine patterns of students’ behavior in a course LMS and correlate any observed patterns with learning outcomes.

The following research questions guide our inquiry:

  1. What temporal patterns of behavior may be discernable among university students in a mobile applications computer science course?

  2. To what extent might any such patterns of behavior relate to student performance?

  3. Are there any significant differences among student performance groups with respect to LMS interaction?

This paper is organized as follows: we discuss related work in the “Related work” section; we then present details about the course under study in “The course” section; after that, we explain our methodology in the “Methodology” section, followed by a thorough discussion of results in the “Results” section. We discuss implications in the “Discussion” section and conclude in the “Conclusion & future work” section.

Related work

Computer science online instruction

Computer science online instruction has been researched in different ways and for different purposes. High attrition rates among students (Beaubouef & Mason, 2005) and the increasing trend toward using online LMSs have inspired researchers to investigate methods of improving the learning process. One example is incorporating pedagogical principles into the LMS to better support the learning of programming languages (Rossling et al., 2008). Moreover, combining advanced collaborative tools with the teaching of programming languages in a web-based environment has proven to be effective (Cavus, Uzunboylu, & Ibrahim, 2007). Flipped classroom strategies for CS education have been suggested as a solution to both pedagogical and financial challenges, such as the need to create active learning experiences under increasing financial pressure (Maher, Latulipe, Lipford, & Rorrer, 2015). This work is a step towards understanding students’ interaction with an LMS in a computer science flipped classroom. We aim to identify common behaviors and correlate them with students’ performance.

Patterning students’ behavior on LMSs and identifying variables

The use of online LMS has provided researchers with a rich source of information through the logs of the system. Several studies have been done to process these logs to elicit the following:

  • Students’ learning and interaction

  • Variables for predicting learning outcomes

  • Clustering students into groups based on their interaction

As stated, LMS logs are used to elicit variables that predict user achievement. Variables such as regular study, late submissions of assignments, number of sessions (the frequency of course logins), and proof of reading the course information packets are found to be significant indicators of achievement (You, 2016).

Cerezo et al. (2016) have researched clustering students based on their patterns of interaction with an LMS. The clustering helps in understanding students’ learning behavior, for example as that of a procrastinator or of an individual learner. LMS logs have also been used to automatically model learner profiles, which helps incorporate other variables such as students’ backgrounds and computer skills (Özpolat & Akar, 2009).

Plant et al. (2005) were among the first to question the amount of time spent studying as a predictor of performance, and they found it to be a poor one; instead, they identified other factors such as previous knowledge, skills, and quality of study as strong predictors. Kim et al. (2014) were more interested in comparing the variables that predict achievement in LMS-dependent courses versus traditional courses. They conclude that variables such as login time, login frequency, and regularity of login intervals can linearly predict achievement in traditional face-to-face classes. However, we argue that the situation is more complicated for an LMS-dependent class that adopts a flipped classroom style.

It is not only students’ achievement that researchers are looking to predict. Predicting learner motivation is especially important given the trend towards online course delivery. Munoz-Organero et al. (2010) used each student’s patterns of interaction with the contents and services of an LMS to predict motivation to engage with the content. Another study (Dawson, Macfadyen, & Lockyer, 2009) found significant correlations between students’ achievement orientation and participation in discussion forums when predicting motivation to learn. This work aims to explore common student behaviors in a flipped classroom setting using visualization techniques and to highlight significant factors related to the temporality of students’ interaction with an LMS.

Educational data mining & analytics

Educational data mining (EDM) is an emerging interdisciplinary research area that deals with the development of methods to explore data originating in an educational context. EDM uses computational approaches to analyze data and examine educational outcomes (Romero & Ventura, 2010). With the increased use of automated learning tools, there has been a spike in research focusing on data and analytics for improving teaching and learning. As part of this trend, two research communities have emerged: one focused on EDM and another focused on Learning Analytics and Knowledge (LAK) (Siemens & Baker, 2012). EDM and LAK both reflect the emergence of data-intensive approaches to education, and both communities share the goals of improving education through better assessment, identifying how problems in education are understood, and analyzing how interventions are planned and selected. Siemens and Baker (2012) differentiate the two by categorizing them based on techniques & methods, the type of discovery that is prioritized, and the type of adaptation & personalization. Techniques used in EDM mainly focus on the analysis and visualization of data. Results of such analyses can be used to provide feedback to support instructors, extract recommendations for students, predict students’ performance, and model students’ behavior (Romero & Ventura, 2010).

According to Romero and Ventura (2010), EDM research should inform the development of EDM tools that are usable by people of all skill levels. In this work, we facilitate the interpretation of data analytics by providing users with visualization artifacts in the form of intensity charts.

The temporal aspect of learning analytics has been presented in different forms and for different purposes. Thakur et al. (2014) investigate the temporal stability of students’ grades and use it to design data-driven assessment and feedback mechanisms. Nespereira et al. (2014) examine whether LMS access frequency is a sign of success in college courses, and they obtain temporal behavior patterns for students in a blended learning setting. They found a correlation between students’ interaction with an LMS and their final grades, where interaction is defined by all traces of student activity when using the LMS, including the number of views, quizzes, grades, etc. Moreover, the researchers uncover a trend in the temporal series analysis that can be used to predict students’ success. Bakharia and Dawson (2011) have explored and visualized temporal participant interaction with discussion forums used in an LMS. They observe a correlation between individuals’ connectivity to peers and their overall academic success. Michinov et al. (2011) state that lack of participation in discussion forums during the learning process is positively linked to procrastination and poor performance.

The course

Our inquiry began with an examination of students’ behavior and outcomes in a single CS course: a mobile application development course taught at a large southern university. There were 63 students in the course, comprising both undergraduate senior-level students and graduate students, most of whom were enrolled in a masters computer science program. All the students could be tracked using the course LMS (Moodle). Using data generated by Moodle, the researchers were able to identify students and track their progress in relation to both activity and achievement. The course is generally considered difficult; however, because mobile development is such a popular topic, many students take it. Due to its popularity, the instructor of the course decided to utilize the LMS to inform pedagogical decisions.

Most significantly, the instructor created videos for the students to use in the course and aligned them with course content. These were arranged into weekly course modules. Videos, readings and other course materials were made available to students by way of MOODLE. The intention was to provide students with ample time and materials to nurture effective learning of course content. Because the materials were made available by the LMS, the exact moments the students accessed the materials could be archived. The instructor of the course sought to incorporate adaptive questioning and a project-based approach. Grading automation and forced practice are also key elements specific to the course design as mediated by the LMS.

The course made significant use of instructional videos, as videos have been found to enhance online learning (Zhang, Zhou, Briggs, & Nunamaker, 2006). The design and development of the videos and of the in-class assignments required a substantial investment of the instructor’s time. The in-class assignments were based on the videos and had to be carefully tailored to them, with a high degree of alignment. The assignments were designed to be challenging, and students had to access the videos in order to accomplish them. In such “flipped classes,” the course instructor is not only an expert imparting content but also the developer of instructional materials and a designer of experiences that nurture the emergence of expertise among students (Warter-Perez & Dong, 2012). The flipped classroom makes the work of the instructor more intensive and demanding in terms of time, experience, and content expertise (Strayer, 2012). The approach also gives rise to questions: How might an instructor or instructional designer find out about students’ use of the online materials created for their benefit? How do students behave in such a course in the field of computer science?

Course content

The course is about Mobile Application Development and is designed for both undergraduate and graduate students with prior experience, especially those with a strong programming background. Familiarity with Java or an equivalent object-oriented programming language is expected of those who enroll. Students design and build a variety of mobile applications using a hands-on, project-based approach; the platform adopted is mainly Google Android. Students get to know the various requirements and design decisions tied to mobile application development and how to deal with the limited resources available on mobile devices. They are also exposed to various technologies that can be integrated with mobile applications.

Course schedule

The study took place during the fall semester of 2014; the semester was 16 weeks in duration. The flipped pedagogy seeks to place didactic, one-way communication from the instructor (such as lecturing) outside of class time. In this approach, instructors and course designers use an LMS to post materials such as videos, readings, lecture notes, PowerPoint presentations, and other resources several days before the scheduled lecture. Lecture time is then used solely for problem-solving. The course consistently followed a predictable pattern of posting materials and solving in-class assignments. There were 10 such cycles during the semester under investigation, each between 5 days and 2 weeks in duration (Table 1). Five weeks were devoted to assessments and projects.

Table 1 Course in-class assignments schedule

Methodology

The researchers used queries to access the system logs for the course. The logs provide detailed information about users’ interaction with the course page. We extracted students’ accesses to the online material posted in every cycle, quantifying material access by counting the number of clicks on the material links; clicking a link implies an occurrence of watching a video or reading the material. The accesses were grouped by temporality (per day). This information was then used to extract students’ access patterns, which were then correlated with student performance level. The researchers also considered which time period should be used as a determining factor of students’ behavior; a factor analysis was used to determine the interval.
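To make this step concrete, the sketch below illustrates how such per-day access counts might be derived from an LMS log export. This is a minimal illustration under stated assumptions, not the authors’ actual pipeline: the file name, column names, and lecture date are hypothetical placeholders.

```python
# A minimal sketch of the log-extraction step, assuming a Moodle log export
# with hypothetical columns: user_id, event_time, component, object_id.
import pandas as pd

logs = pd.read_csv("moodle_logs.csv", parse_dates=["event_time"])

# Keep only clicks on course material links (videos and readings).
material = logs[logs["component"].isin(["mod_resource", "mod_url"])]

# Hypothetical lookup: the lecture date for the cycle under analysis.
lecture_day = pd.Timestamp("2014-10-27")  # e.g., cycle 9's lecture date

# Days before the lecture (0 = lecture day, 1 = one day before, ...).
material = material.assign(
    days_before=(lecture_day - material["event_time"].dt.normalize()).dt.days
)
material = material[material["days_before"].between(0, 4)]

# Number of accesses per student per day: the unit used in the analysis.
accesses = (
    material.groupby(["user_id", "days_before"])
    .size()
    .rename("n_accesses")
    .reset_index()
)
print(accesses.head())
```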

To test the hypothesis that there may exist significant differences between students of different performance levels in terms of the frequency and temporality of accessing material, the set of 63 student accesses were collected and categorized based on their overall performance in the course. Three groups were created:

  • Low Performing Group (N = 21, grade < 70)

  • Medium Performing Group (N = 22, 70 ≤ grade < 81)

  • High Performing Group (N = 20, grade ≥ 81)

Results were analyzed with a one-way between-subjects analysis of variance (ANOVA). Assumptions for this model were checked, followed by an omnibus F-test to determine whether there were differences among performance groups with respect to the frequency and temporality of course material access; the null hypothesis is that there is no difference. We examine the relationship between material access and course performance using the F-test and post hoc pairwise comparisons to detect significant differences.
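As an illustration of this analysis, the following sketch runs the same sequence of tests (Levene’s test for homogeneity of variance, the omnibus one-way ANOVA, and unadjusted, LSD-style pairwise comparisons) on randomly generated placeholder data; the actual study used the students’ recorded access counts and grades.

```python
# A sketch of the group comparison; `grade` and `accesses_ld` are
# placeholder arrays standing in for final grades and lecture-day
# access counts of the 63 students.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
grade = rng.uniform(50, 100, 63)      # placeholder final grades
accesses_ld = rng.poisson(5, 63)      # placeholder access counts

low = accesses_ld[grade < 70]
medium = accesses_ld[(grade >= 70) & (grade < 81)]
high = accesses_ld[grade >= 81]

# Check homogeneity of variance, then run the omnibus one-way ANOVA.
print(stats.levene(low, medium, high))
print(stats.f_oneway(low, medium, high))

# LSD-style post hoc comparisons: unadjusted pairwise t-tests.
pairs = {"low-medium": (low, medium),
         "low-high": (low, high),
         "medium-high": (medium, high)}
for name, (a, b) in pairs.items():
    t, p = stats.ttest_ind(a, b)
    print(name, round(p, 3))
```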

We also generated intensity images to visualize students’ interaction with the course in terms of material access for all the course cycles.

Results

In this section, we present the results of the statistical analysis conducted for the following null hypotheses:

  • Students’ interaction with the course in terms of material access is evenly distributed over the days before the lecture time

  • Students’ interaction with the course in terms of material accesses is the same among different performing student groups. This means that student performance level (high, medium, or low) affects neither the intensity nor the temporality of the accesses.

Statistical analysis

Temporal patterns of accessing the material

For all cycles, each student’s number of material accesses during the 4 days before the lecture and on the lecture day itself was recorded. To reduce the number of days to consider and to explore which day(s) have the most impact, the observed days were factor analyzed using principal component analysis with Varimax (orthogonal) rotation. The analysis yielded one factor (Lecture Day, LD) explaining 37.3% of the variance for the entire set of variables; this factor also has the highest eigenvalue, 1.86.

Figure 1 shows how the 5 days relate to each other. The day of the lecture (LD) has the maximum number of accesses in all cycles and stands out distinctly relative to the other days. The day before the lecture (LD-1) explains 19.8% of the variance, the second highest value. In the next analysis, we therefore begin by considering only these two days as the factors representing student access to the material.

Fig. 1 Factor Analysis for Days of Material Access
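The following sketch reproduces the shape of this day-reduction step using the third-party factor_analyzer package, which supports principal-components extraction with Varimax rotation; the access matrix here is randomly generated, so the eigenvalues and loadings will not match the values reported above.

```python
# A sketch of the factor analysis over the five observed days
# (LD-4 ... LD); the data matrix is a random placeholder.
import numpy as np
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(1)
# Rows: 63 students; columns: accesses on LD-4, LD-3, LD-2, LD-1, LD.
X = rng.poisson(lam=[1, 1, 2, 3, 6], size=(63, 5)).astype(float)

fa = FactorAnalyzer(n_factors=1, method="principal", rotation="varimax")
fa.fit(X)

eigenvalues, _ = fa.get_eigenvalues()
print("eigenvalues:", np.round(eigenvalues, 2))  # paper reports 1.86 for the top factor
print("loadings:", np.round(fa.loadings_, 2))    # which days load on that factor
```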

Based on the fact that the lecture day appears to be the most active time when students access the material, we further analyzed student accesses per hour on that day.

Figure 2 shows that peak access occurs during the lecture period itself, while students are working on the in-class assignment.

Fig. 2 Activity During Lecture Day

Temporal access patterns categorized by students’ performance level

Means and standard deviations for the number of material accesses on lecture day (LD) and one day before (LD-1) are presented in Table 2.

Table 2 Descriptive statistics of LD vs. LD-1 vs. BLD

Having 3 groups (low, medium, and high performing), ANOVA is used to compare the group means for the number of accesses. Initially, we have two days of interest, lecture day (LD) and the day before (LD-1), so a one-way between-subjects ANOVA as a function of student performance level is used for both days. The assumption of homogeneity of variance is met for both LD and LD-1: Levene’s F(63) = 0.65, p = 0.52 and F(63) = 0.003, p = 0.997, respectively. All other assumptions are met. There are no statistically significant differences in the number of accesses on LD or LD-1 among the different performing groups: F(63) = 2.25, p = 0.11 and F(63) = 0.933, p = 0.4, respectively. However, in post hoc pairwise comparisons using the Least Significant Difference (LSD) procedure, there is a statistically significant difference for LD-1 between low performing students and medium performing students, p = 0.043. Hence, we conducted another test considering lecture day versus ALL the days before lecture day as one variable (Before Lecture Day, BLD): all days before the lecture day are consolidated into one variable by summing the number of accesses for each day. Means and standard deviations for the number of material accesses on lecture day (LD) and all days before (BLD) are presented in Table 2.

A one-way between-subjects analysis of variance is performed on LD and BLD as a function of student performance level. The assumption of homogeneity of variance is met for both LD and BLD: Levene’s F(63) = 0.64, p = .67 and F(63) = 0.003, p = .25, respectively. All other assumptions are met. There is a statistically significant difference in the number of accesses in BLD, p = .015. The post hoc test (LSD) shows that medium performing students differ significantly from low performing students in accessing the material before the lecture day, p = 0.004. Interestingly, there is no other distinction between any other group comparison, meaning the high and low performing groups do not differ in terms of when they access the material; high and medium performing students are also not statistically different from each other.

Intensity charts

Tufte notes that “graphical displays should show the data, induce the viewer to think about the substance rather than about methodology, graphic design, the technology of graphic production, or something else, avoid distorting what the data have to say and encourage the eye to compare different pieces of data” (p. 13) (Mulrow, 2002). Visualizing interaction can also provide “goal-oriented visualization” that helps students track their progress (Duval, 2011). The purpose of creating the intensity charts is twofold: (a) they indicate visually when the most intense and least intense activity occurs in the course with respect to material access. This is important because it provides feedback to both instructors and students about course activity and may encourage increased attention to reflective practice among students.

Visualizations of progress are common in human-computer interface systems and are a central feature of computer gaming and the gamification of learning systems; in this way, intensity charts may also serve as useful metacognitive tools. (b) The intensity charts indicate the nature and intensity of students’ activity, organized by student performance level. This provides even more feedback for instructors and learners and may also be a means for LMS designers to develop new tools for creating visualizations of course activity. One approach might be automatically sending text messages or emails to students who fail to access the material by a certain date or time.

Visualizations of course activity intensity may also indicate which topics or assignments are most problematic for students, which are most difficult, and which are most likely to contribute to procrastination, task avoidance, overconfidence, and ultimately student success or failure. Such approaches may also allow instructors to share activity patterns with students and other instructors, especially when they correlate (positively or negatively) with student success. These patterns may also be used by instructors to prepare for specific “flipped classroom” activities or to get feedback on the most accessed materials.

The charts were constructed based on the number of accesses to the material for every cycle.

Figure 3 shows an example of student activity in the 9th cycle. The chart captures access per day, starting from the day the materials are posted and proceeding to the day of the lecture (LD). The days are represented as numbers, where each number is the number of days before the lecture day; they are also color coded: red for the lecture day, green for weekend days, and black otherwise. The day of the lecture is represented by a red “0”. The Y-axis represents the material posted for the cycle. For instance, the 9th cycle has four material items (videos and readings) that students should watch or read before coming to the lecture. The number of accesses is represented by the color of the cell, where dark blue represents no access and light yellow represents the maximum number of accesses per day for the semester, which is 95 in this case.

Fig. 3 Intensity chart for cycle # 9
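A chart of this kind is straightforward to generate with matplotlib. The sketch below is a minimal approximation of Fig. 3 under stated assumptions: the counts matrix is a random placeholder, and the dark-blue-to-yellow colormap and countdown day labels mimic the description above.

```python
# A minimal sketch of one intensity chart, assuming `counts` is a
# (materials x days) matrix of daily access counts for one cycle.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
counts = rng.integers(0, 60, size=(4, 8))  # placeholder: 4 items, 8 days

fig, ax = plt.subplots()
# Colormap runs dark blue (no access) to light yellow, scaled to the
# semester maximum of 95 accesses per day, as in Fig. 3.
im = ax.imshow(counts, cmap="cividis", vmin=0, vmax=95, aspect="auto")

# X-axis: days before the lecture, counting down to the lecture day (0).
days = np.arange(counts.shape[1])[::-1]
ax.set_xticks(range(counts.shape[1]), labels=days)
ax.set_yticks(range(counts.shape[0]),
              labels=[f"material {i + 1}" for i in range(counts.shape[0])])
ax.set_xlabel("days before lecture (0 = lecture day)")
fig.colorbar(im, ax=ax, label="number of accesses")
plt.show()
```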

As we can see in Fig. 4, the intensity of student activity is right-shifted in terms of temporality regardless of performing group. In other words, we can empirically demonstrate that students in all three groups (low, medium, and high performing) delay access to the materials until late in the cycle. Due to space limitations, we provide one cycle here as an example to illustrate the pattern we discovered in the data; however, all generated cycles reveal the same pattern of behavior among the students. Figures in the Appendix (“Examples of activity intensity charts categorized by students’ performance level” section) display 3 other cycles. All of the cycles reveal that late material access is a significant feature of student behavior. This demonstrable lag is then correlated with learning outcomes. While all students share the delayed access, the medium performing group is the most active and vibrant in terms of temporal access. This unexpected temporal pattern, especially for high performing students, prompted the researchers to step back and measure a more fundamental variable: the total number of accesses over the entire cycle. Not surprisingly, the total number of accesses is highest among high performing students.

Fig. 4 Activity categorized by students’ performance. a Low, b Medium, c High

Discussion

Steel (2007) defines procrastination as voluntarily delaying an intended course of action despite expecting to be worse off because of the delay. In academic settings, procrastination is a pervasive and potentially maladaptive behavior for many university and college students, often resulting in feelings of psychological distress (Solomon & Rothblum, 1984). In an online learning setting, academic procrastination has been found to have a negative relationship with self-regulated learning, motivation, and performance (Akinsola, Tella, & Tella, 2007; Asarta & Schmidt, 2013; Lee, 2005; Michinov, Brunot, Le Bohec, Juhel, & Delaval, 2011; Rakes & Dunn, 2010; Tuckman, 2005). The regularity of student interaction with learning materials strongly predicts learning performance, and, not surprisingly, irregularity of interaction predicts poor performance (Jo et al., 2014). Thus, to interpret students’ behavior in this study, the investigators use Pintrich’s (2004) framework for assessing motivation and self-regulation in college students. Nonetheless, it is worth mentioning that procrastination is a serious and general obstacle faced by students regardless of their performance level, especially among undergraduates (Artino & Stephens, 2009).

We first considered Pintrich’s perspective and the general assumptions of the self-regulated learning (SRL) model and how they might apply to this case of a “flipped classroom”; we then used the framework to interpret the findings. Four assumptions must be met. First, learners are assumed to construct their own meanings, goals, and strategies from the information available in the external environment and in their own minds. In this class, students were unbounded by any policies or practices that might have altered their own perception of goals or actions, and they were encouraged to review prior background material to help them activate required knowledge. Second, we must assume that learners can potentially monitor, control, and regulate certain aspects of their own cognition, motivation, and behavior. This also applies, as students follow their own pace and control when and how many times they access the materials; it is essentially left to the students to prepare for the class as they choose. The third assumption is the goal, criterion, or standard assumption: the SRL model assumes that there is some type of goal or criterion that determines whether the learning process should continue as is or needs to change. In this course, the fact that in-class assignments and homework were graded on a weekly basis gave the students regular feedback. Students who did not score well in a certain week knew their grade and were left to decide whether they needed to change their learning behavior. Indeed, students recognize that the material has a cumulative nature; they are assumed to develop a sense of urgency that may help them conquer the current cycle’s material in order to follow future material. The final assumption is that self-regulatory activities are mediators between personal and contextual characteristics and actual achievement or performance. In this case, we believe that the blend of students’ individual learning styles and the class environment both contributed to students’ learning outcomes.

Pintrich’s framework for self-regulated learning in the college classroom applies to this case. The model suggests a general time-ordered sequence of phases that students go through as they perform a task; however, there is no strong assumption that the phases are hierarchically or linearly structured such that earlier phases must always occur before later phases. Phase 1 involves planning and goal setting as well as activation of perception and knowledge of the task. Phase 2 relates to various monitoring processes that represent meta-cognitive awareness of different aspects of the task, self, or context. Phase 3 relates to the effort to control and regulate different aspects of the self or the task and context. Finally, Phase 4 concerns various kinds of actions and reflections on the self and the task or context. We explain how the different performing groups would react according to this model, which may help in understanding the intensity charts.

In a flipped classroom setting, students are exposed to a substantial amount of information and course content early. There are repetitive cycles of material posting (phase 1: students target goals, plan effort, and form an initial perception of tasks; phase 3: students select cognitive strategies, increase or decrease effort, persist or give up). The face-to-face class interaction then promotes active learning and student engagement through problem-solving and group work (phase 2: students develop meta-cognitive awareness and self-observe behavior). The students are given feedback in a timely manner so that they can react effectively to their learning behavior (phase 4: students develop cognitive judgments and evaluate tasks).

In phase 1, students are expected to set plans, activate prior knowledge, and develop a perception of the required tasks. Phases 2, 3, and 4 repeat every cycle as students adapt to the course, develop a learning behavior, and reflect on their performance. Top students are expected to possess good programming skills, and expectations for the in-class assignments are set clearly in the first week. Students pace themselves considering the time and effort they expect to expend to complete the assignment, so what appears to be procrastination may turn out to be simply a smart usage of time and effort. From a student’s perspective, it is better to watch the videos just before the lecture so that the content is fresh in mind by the time they are exposed to the assignment. Medium performing students showed higher vibrancy in terms of temporal access, which might be explained by phases 2 and 3. Medium performing students are expected to be as motivated as top students; however, their skill set and programming background might not be as strong, in which case they are more likely to access the material earlier in order to understand the content and solve the problems in the assignment. As for low performing students, they may procrastinate for reasons related to anxiety and low personal standards of achievement (Saddler & Buley, 1999). According to this model, students persist or give up in phase 3; struggling students may therefore simply delay accessing the materials because they give up or cease to expend effort. Further, this course is fast-paced and content-heavy compared to other courses offered by the department, which may intimidate members of this specific group. To summarize, what might appear to be a behavior shared by the 3 groups may have different underlying reasons and may lead to different outcomes because of individual differences among students.

Conclusion & future work

This paper offers a methodological approach to learning analytics, presents findings on student behavior in a particular type of learning environment, and correlates those behaviors with measures of student learning. The goal is to develop a set of practices that enable us to leverage course data for improving learning, and to use this as an approach to research and eventually build theory for designing environments that nurture the emergence of computer science expertise. To help understand students’ behavior in a flipped classroom, our analysis revealed patterns of action in the data. Significantly, we found that medium performing students demonstrated vibrancy with respect to material access. We also found procrastination in all three performance groups. Though procrastination among students is typically considered a significant threat to student success, in this case it does not seem to have such an impact on high performing students. Nevertheless, high performing students demonstrate heavier total access of course materials than the other two groups. These findings speak to the literature not only by supporting the relationship between activity and student success, but also by situating the access of course materials in the context of a “flipped classroom” approach. The visualizations complement this flipped approach and may serve as an effective tool for both instructors and students.

One limitation of this study is that the LMS counted students’ access to materials based on the clicks of links posted on Moodle. While this tells us something about their access, it does not tell us about their interactions with the material outside of the system. For example, students may replay the videos or bookmark them and return to them later without clicking again. A further limitation is that this quantitative analysis did not reveal much about the quality of the students’ interactions with the materials. Computer-based learning can be affected by many variables; computer self-efficacy is one variable, among others, proven to cause anxiety among students (Saadé & Kira, 2009).

Future work in this area should consider characterizing the quality of student work and should address differences in the types of materials offered to students. Obviously, not all materials are likely to elicit the same degree of intensity; further research is necessary to determine which materials are preferable and why. More research is also needed to explore effective flipped classroom techniques that support learning in university computer science courses. Mixed methods approaches may give us more detailed information about how students use the materials, not just when, what, and for how long they access them. Furthermore, researchers should consider internal factors such as students’ skills and background knowledge alongside external factors such as material quality and magnitude.

Appendix

Examples of activity intensity charts for all students

Fig. 5 Intensity chart for cycle # 4

Fig. 6 Intensity chart for cycle # 5

Fig. 7 Intensity chart for cycle # 10

Examples of activity intensity charts categorized by students’ performance level

Table 3 Intensity chart for selected cycles categorized by students’ performing level

References

  • Akinsola, M., Tella, A., & Tella, A. (2007). Correlates of academic procrastination and mathematics achievement of university undergraduate students. Eurasia Journal of Mathematics, Science & Technology Education, 3(4), 363–370.

  • Artino, A., & Stephens, J. (2009). Academic motivation and self-regulation: A comparative analysis of undergraduate and graduate students learning online. The Internet and Higher Education, 12(3), 146–151.

  • Asarta, C., & Schmidt, J. (2013). Access patterns of online materials in a blended course. Decision Sciences Journal of Innovative Education, 11(1), 107–123.

  • Bakharia, A., & Dawson, S. (2011). SNAPP: A bird’s-eye view of temporal participant interaction. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 168–173). ACM.

  • Beaubouef, T., & Mason, J. (2005). Why the high attrition rate for computer science students: Some thoughts and observations. ACM SIGCSE Bulletin, 37(2), 103–106.

  • Bennedsen, J., & Caspersen, M. (2007). Failure rates in introductory programming. ACM SIGCSE Bulletin, 39(2), 32–36.

  • Cavus, N., Uzunboylu, H., & Ibrahim, D. (2007). Assessing the success rate of students using a learning management system together with a collaborative tool in web-based teaching of programming languages. Journal of Educational Computing Research, 36(3), 301–321.

  • Cerezo, R., Sánchez-Santillán, M., Paule-Ruiz, M., & Núñez, J. (2016). Students’ LMS interaction patterns and their relationship with achievement: A case study in higher education. Computers & Education, 96, 42–54.

  • Dawson, S., Macfadyen, L., & Lockyer, L. (2009). Learning or performance: Predicting drivers of student motivation. In R. Atkinson & C. McBeath (Eds.), Same places, different spaces. Proceedings of the 26th Annual ASCILITE International Conference (pp. 184–193). Auckland: University of Auckland, Auckland University of Technology, and Australasian Society for Computers in Learning in Tertiary Education.

  • Duval, E. (2011). Attention please! Learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 9–17). doi:10.1145/2090116.2090118

  • Hao, Q., Wright, E., Barnes, B., & Branch, R. (2016). What are the most important predictors of computer science students’ online help-seeking behaviors? Computers in Human Behavior, 62, 467–474.

  • Jo, I., Kim, D., & Yoon, M. (2014). Analyzing the log patterns of adult learners in LMS using learning analytics. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 183–187). ACM.

  • Junco, R., & Clem, C. (2015). Predicting course outcomes with digital textbook usage data. The Internet and Higher Education, 27, 54–63.

  • Kim, J., Seodaemun-gu, S., Park, Y., Song, J., & Jo, I. (2014). Predicting students’ learning performance by using online behavior patterns in blended learning environments: Comparison of two cases on linear and non-linear model. Korea, 120, 750.

  • Lee, E. (2005). The relationship of motivation and flow experience to academic procrastination in university students. The Journal of Genetic Psychology, 166(1), 5–15.

  • Maher, M., Latulipe, C., Lipford, H., & Rorrer, A. (2015). Flipped classroom strategies for CS education. In Proceedings of the 46th ACM Technical Symposium on Computer Science Education, SIGCSE ’15 (pp. 218–223). New York: ACM. doi:10.1145/2676723.2677252

  • Michinov, N., Brunot, S., Le Bohec, O., Juhel, J., & Delaval, M. (2011). Procrastination, participation, and performance in online learning environments. Computers & Education, 56(1), 243–252.

  • Mulrow, E. (2002). The visual display of quantitative information. Taylor & Francis.

  • Munoz-Organero, M., Munoz-Merino, P., & Kloos, C. (2010). Student behavior and interaction patterns with an LMS as motivation predictors in e-learning settings. IEEE Transactions on Education, 53(3), 463–470.

  • Nespereira, C., Dai, K., Redondo, R. P. D., & Vilas, A. (2014). Is the LMS access frequency a sign of students’ success in face-to-face higher education? In Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality, TEEM 2014 (pp. 283–290). New York: ACM.

  • Nouri, J. (2016). The flipped classroom: For active, effective and increased learning–especially for low achievers. International Journal of Educational Technology in Higher Education, 13(1), 33.

  • Özpolat, E., & Akar, G. (2009). Automatic detection of learning styles for an e-learning system. Computers & Education, 53(2), 355–367.

  • Pintrich, P. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16(4), 385–407.

  • Plant, E., Ericsson, K., Hill, L., & Asberg, K. (2005). Why study time does not predict grade point average across college students: Implications of deliberate practice for academic performance. Contemporary Educational Psychology, 30(1), 96–116.

  • Porter, L., Guzdial, M., McDowell, C., & Simon, B. (2013). Success in introductory programming: What works? Communications of the ACM, 56(8), 34–36.

  • Rakes, G., & Dunn, K. (2010). The impact of online graduate students’ motivation and self-regulation on academic procrastination. Journal of Interactive Online Learning, 9, 78–93. Retrieved from http://www.ncolr.org/jiol/

  • Romero, C., & Ventura, S. (2010). Educational data mining: A review of the state of the art. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 40(6), 601–618.

  • Rossling, G., Joy, M., Moreno, A., Radenski, A., Malmi, L., Kerren, A., Naps, T., Ross, R., Clancy, M., Korhonen, A., Oechsle, R., & Iturbide, J. (2008). Enhancing learning management systems to better support computer science education. ACM SIGCSE Bulletin, 40(4), 142–166. doi:10.1145/1473195.1473239

  • Saadé, R., & Kira, D. (2009). Computer anxiety in e-learning: The effect of computer self-efficacy. Journal of Information Technology Education, 8(1), 177–191.

  • Saddler, C., & Buley, J. (1999). Predictors of academic procrastination in college students. Psychological Reports, 84(2), 686–688.

  • Senécal, C., Julien, E., & Guay, F. (2003). Role conflict and academic procrastination: A self-determination perspective. European Journal of Social Psychology, 33(1), 135–145.

  • Shell, D., Hazley, M., Soh, L.-K., Ingraham, E., & Ramsay, S. (2013). Associations of students’ creativity, motivation, and self-regulation with learning and achievement in college computer science courses. In Proceedings of the 43rd Annual Frontiers in Education Conference. Piscataway, NJ. doi:10.1109/FIE.2013.6685116

  • Siemens, G., & Baker, R. (2012). Learning analytics and educational data mining: Towards communication and collaboration. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, LAK ’12 (pp. 252–254). New York: ACM. doi:10.1145/2330601.2330661

  • Solomon, L., & Rothblum, E. (1984). Academic procrastination: Frequency and cognitive-behavioral correlates. Journal of Counseling Psychology, 31(4), 503.

  • Steel, P. (2007). The nature of procrastination: A meta-analytic and theoretical review of quintessential self-regulatory failure. Psychological Bulletin, 133(1), 65–94.

  • Strayer, J. (2012). How learning in an inverted classroom influences cooperation, innovation and task orientation. Learning Environments Research, 15(2), 171–193.

  • Thakur, G., Olama, M., McNair, A., Sukumar, S., & Studham, S. (2014). Towards adaptive educational assessments: Predicting student performance using temporal stability and data analytics in learning management systems. In Proceedings of the 20th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. New York City.

  • Tuckman, B. (2005). Relations of academic procrastination, rationalizations, and performance in a web course with deadlines. Psychological Reports, 96(3, suppl), 1015–1021.

  • Warter-Perez, N., & Dong, J. (2012). Flipping the classroom: How to embed inquiry and design projects into a digital engineering lecture. In Proceedings of the 2012 ASEE PSW Section Conference. Washington, DC: American Society for Engineering Education.

  • You, J. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. The Internet and Higher Education, 29, 23–30.

  • Zhang, D., Zhou, L., Briggs, R., & Nunamaker, J. (2006). Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management, 43(1), 15–27.


Acknowledgements

Not applicable.

Funding

Not applicable.

Author information


Contributions

AA: statistical data analysis, intensity chart creation, and writing. MT: initial brainstorming, outlining, and writing. MS: initial brainstorming, visioning, and data collection. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Abeer AlJarrah.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

AlJarrah, A., Thomas, M.K. & Shehab, M. Investigating temporal access in a flipped classroom: procrastination persists. Int J Educ Technol High Educ 15, 1 (2018). https://doi.org/10.1186/s41239-017-0083-9
