Insights from a randomized controlled trial of flipped classroom on academic achievement: the challenge of student resistance
International Journal of Educational Technology in Higher Education volume 20, Article number: 41 (2023)
The flipped classroom has been found to positively influence student achievement, but the magnitude of the effect varies greatly according to discipline and local design, and few studies have been methodologically rigorous enough to establish causal evidence. Using a randomized controlled trial, this study addresses a gap in current knowledge by exploring how student responses mediate the impact of the flipped classroom on academic achievement. The empirical setting is a first-year undergraduate macroeconomics course with 415 students. Comparing students in the treatment group with those in a traditional class, we find a positive yet statistically insignificant effect on academic achievement. However, this overall effect masks important mediating effects, as students were unexpectedly reluctant to actively participate in the flipped classroom intervention. Consequently, the intervention has a substantially greater effect on academic achievement when controlling for the mediating effect of student participation, which leads us to consider the challenges of student resistance to the flipped classroom.
Higher education curricula are increasingly being taught and learned fully or partially online, and blends of online and face-to-face learning have been promoted as combining the best of both worlds to deliver higher quality education and keep students engaged (Graham et al., 2014; Vaughan & Cloutier, 2017). During the pandemic, most institutions adopted online and blended learning, but due to the speed of the transformation, decisions on new formats were made without much time to consider the best fit between pedagogy and student learning. This accelerated and widespread use of online and blended learning calls for more rigorous knowledge of the impact of these formats on student learning, including the flipped classroom (FC), a setup where technology is used to shift content out of class, thus reserving in-class time for active learning approaches (Bergmann & Sams, 2014; Bishop & Verleger, 2013; Fisher et al., 2018; Lundin et al., 2018).
The growing body of research on the effects of the FC counts several reviews and meta-analyses (Akçayır & Akçayır, 2018; Cheng et al., 2019; Lundin et al., 2018; Strelan et al., 2020; van Alten et al., 2019) measuring both direct indicators of learning, such as academic achievement, and indirect measures such as satisfaction, motivation, and student engagement. Generally speaking, the FC format appears to positively impact academic achievement (Strelan et al., 2020), but the degree of the effect varies greatly according to discipline and local design, and few studies have been methodologically rigorous enough to establish causal evidence, as many have been based on small sample sizes and self-reported data (Förster et al., 2022; O’Flaherty & Phillips, 2015). Moreover, most studies focus solely on the overall effect of FC on academic achievement, and fewer explore the relation between the two in more detail (for examples of studies that do, see Buhl-Wiggers et al. (2023) on teacher heterogeneity and Sun et al. (2018) on heterogeneity of self-regulated learning).
Even though student responses to a changed learning environment are a key issue in understanding the effectiveness of FC, less is known about how such responses mediate the relationship between FC and academic achievement. FC has been found to increase student engagement (Elmaadaway, 2018; Fang et al., 2022; Merlin-Knoblich et al., 2019), motivation (Awidi & Paynter, 2019), and satisfaction with learning (Martínez-Jiménez & Ruiz-Jiménez, 2020). Moreover, findings show that FC increases interaction and feedback (Hussain et al., 2020), provides a more flexible learning experience (Price & Walker, 2021), and increases student participation (Aguilar et al., 2021; Foldnes, 2017). However, others find challenges of student resistance, which we here define as any possible negative response to the new teaching method (Tharayil et al., 2018). Student resistance to FC may be caused by requirements for more self-regulated learning (Jovanovic et al., 2019; Sun et al., 2018) or by higher workload demands on students (Burke & Fedorek, 2017; Khanova et al., 2015). Other studies point to students’ anxiety towards FC (Porcaro et al., 2016), preferences for teacher-led instruction (Hussain et al., 2020; Tomas et al., 2019), or difficulties abandoning old learning habits (Chen et al., 2014). Despite this body of literature, to our knowledge, no study has hitherto been conducted on the mediating effect of student responses to FC on academic achievement. We address this gap by providing a more rigorous understanding of the effects of FC on academic achievement as well as of how students’ willingness to participate mediates the effect. Our research questions are:
What is the causal effect of FC on students’ academic achievement in a first-year economics class?
How does student participation mediate the relationship between FC and academic achievement?
The empirical context is a first-year macroeconomics course at a large Danish business school. The course was redesigned based on the FC format to induce more active learning and group work. This study makes two main contributions to the literature on FC. First, we conduct a randomized controlled trial (RCT) to study the effect of an intervention inspired by FC on academic achievement. Second, we contribute to the growing literature on the importance of student responses to FC, focusing particularly on student resistance. As a result, we gain deeper knowledge about possible reasons for the variation in effect sizes on academic achievement in FC, and we discuss possible challenges and how to address them.
Academic achievement and student responses in flipped classroom
The effect of flipped classroom on academic achievement
Studies investigating the effects of shifting to the FC format in higher education generally find positive effects on academic achievement, and a recent meta-analysis finds an effect size of 0.48 SDs for higher education (Strelan et al., 2020). Although positive, this overall effect size varies significantly by discipline (from 0.30 SDs in IT to 0.98 SDs in the humanities). In business education, which is the focus of this study, the average effect size is 0.38 SDs, yet this is based on only a few studies, highlighting a need for more evidence in specific disciplines, including business and economics as well as social science in general.
Of the fourteen studies we identified in business and economics education, seven showed positive results (Albert & Beatty, 2014; Calimeris & Sauer, 2015; Chen & Hwang, 2019; Foldnes, 2016; Lento, 2016; Yamarik, 2007; Zhonggen & Guifang, 2016), six showed no difference from a traditional format on average (Bergfjord & Heggernes, 2016; Findlay-Thompson & Mombourquette, 2014; Haughton & Kelly, 2015; Lopes & Soares, 2018; Setren et al., 2021; Wozny et al., 2018), and two showed that the effect depends on students’ prior academic achievement (Asarta & Schmidt, 2017; Wozny et al., 2018). Four studies use random assignment, but with similarly mixed results. Foldnes (2016) finds that students in a team-based FC perform 8.9 percentage points better on the final exam than students in the traditional lecture classroom, and Calimeris and Sauer (2015) show that FCs increase students’ average final exam performance by 0.64 standard deviations. Others find no statistically significant effect on final exam results on average (Setren et al., 2021; Wozny et al., 2018), although Wozny et al. (2018) find that high-achieving students show a positive effect of 0.16 standard deviations. Accordingly, even in studies with rigorous experimental designs, the effect of FC has yet to be established in the context of business and economics education, and the estimated magnitude of the effect varies from modest to large.
Student responses to flipped classroom
The above variation in effect sizes might stem from differences in the underlying responses of students to the FC format. Positive effects include increased student engagement compared to traditional classroom approaches (Elmaadaway, 2018; Fang et al., 2022; Merlin-Knoblich et al., 2019), increased motivation for learning (Awidi & Paynter, 2019), and interest in the subject matter supported by various learning resources and activities (Fang et al., 2022). Accordingly, satisfaction is reported to increase (Martínez-Jiménez & Ruiz-Jiménez, 2020). Positive effects are also reported regarding the opportunity for flexible learning, including the use of videos for out-of-class learning (Price & Walker, 2021), as students can watch the content repeatedly at their own convenience and later discuss it during in-class time (Cabi, 2018). Regarding time in-class, studies of FC often emphasize active participation of students in in-class activities (Aguilar et al., 2021; Foldnes, 2017) as well as the opportunity for asking questions of instructors or peers in-class (Albert & Beatty, 2014; Fang et al., 2022; Hussain et al., 2020) and acquiring feedback and clarification (Cagande & Jugar, 2018; Chen & Hwang, 2019).
While there are many and varied reasons for student resistance to learning in higher education (Winkler & Rybnikova, 2019), various challenges are also mentioned in relation to FC—for more extensive reviews, see Akçayır and Akçayır (2018), Lundin et al. (2018), and Senali et al. (2022). These include increased workload (Khanova et al., 2015), anxiety towards the new method (Porcaro et al., 2016), and difficulties abandoning old learning habits (Chen et al., 2014). Moreover, resistance to FC learning is often attributed to the need for self-regulated learning (Sun et al., 2018) and is thus particularly pronounced in the first year of the undergraduate level, where students are struggling with academic socialization and require more scaffolding (Hussain et al., 2020; Tomas et al., 2019). Also, student expectations of a specific, more teacher-led format have been found to increase resistance to the more active and collaborative FC format (Hao, 2016). Such challenges are found to spur adverse learning behaviors, including procrastination (Förster et al., 2022) and lack of attendance (White et al., 2015), which are forms of resistance to participating in learning activities. Students’ acceptance of FC therefore differs according to their characteristics and competences (Nouri, 2016).
The same mixed responses to FC can also be found among business and economics students. Several empirical studies find that they prefer more interaction in the classroom and therefore see FC as more conducive to learning than a traditional lecture format (Butt, 2014; Phillips & Trainor, 2014; Prashar, 2015; Scafuto et al., 2017; Steen-Utheim & Foldnes, 2018). However, other studies find that students respond negatively to FC even though it leads to better academic performance (Ferreri & O’Connor, 2013; Strayer, 2012). Moreover, Findlay-Thompson and Mombourquette (2014) report mixed results, with some students appreciating the convenience of pre-recorded lectures and the opportunity for questions and feedback, while others expressed dissatisfaction with lectures not taking place in the classroom and found FC to impose a heavier workload. Accordingly, understanding student responses to FC both in- and out-of-class is crucial when trying to understand the effectiveness of shifting to FC formats.
The setting is a standard first-year, second-semester macroeconomics course in a Danish business school’s largest undergraduate program with approximately 600 enrollments each year. The exam is an end-of-term 4-h, closed-book exam counting for 100% of the grade. The macroeconomics course consists of a series of large class lectures as well as weekly tutorial classes with approximately 40 students in each. There is no punitive attendance policy for either lectures or tutorial classes. The empirical focus of this study is the tutorial classes of this course, as these are intended to be interactive, promoting active learning by allowing students to ask questions and teachers to clarify common misunderstandings. However, as students often come to class un(der)prepared, instructors feel compelled to present the curriculum, turning the classes into ‘mini-lectures’ with limited interaction. Hence, the intervention was motivated by the FC idea of increasing in-class activity in the tutorial classes by shifting direct instruction online and out-of-class while using in-class time for problem-solving in groups. Students had not been introduced to the FC format in previous classes during their first semester.
In learning economics, a critical task is to master the necessary mathematical skills, which require procedural knowledge, defined as the ability to consciously choose and execute step-by-step procedures in order to solve problems, acquired through intensive practice (Rittle-Johnson & Schneider, 2014). Accordingly, the pedagogical design included several learning components for activating student learning before, during, and after class. As illustrated in Fig. 1, the treatment group engaged in collaborative groupwork on a weekly assigned problem-set (referred to as the main problem-set), facilitated by instructors. To support preparation for groupwork, students in the treatment group had access to an online homework platform, MyEconLab, which provided extra problem-solving exercises (referred to as the supplementary problem-set). Finally, to ensure clarification of misconceptions, the treatment group had access to videos with guiding solutions to the main problem-set.
The control group engaged in solving the main problem-set independently before class, and an instructor showed the solutions in-class. To ensure that access to more exercises did not drive the treatment effect, the control group received the same supplementary problem-set in a PDF file but had no immediate feedback or online guidance from MyEconLab.
The effectiveness of this intervention was evaluated through an RCT in which seven of the fourteen classes were shifted to the new format, and the other seven were conducted traditionally, serving as the control group. The RCT was introduced to students through e-mails and in class before semester start. Students were informed of their right to deny consent to the use of their data. The institutional Ethics Council approved the research project (approval number 22-020).
First, all students were randomly divided into a FC treatment condition and a control group, stratifying each by gender, age, and high-school location. Next, each group was randomly divided into seven classes, with fourteen classes in total. To avoid bias, each instructor was randomly assigned both a treatment and a control class and we scheduled all classes for the same day with classes switching timeslots halfway through the course. A research assistant monitored the classroom entrance to ensure that only students assigned to the treatment classes gained access and online access was restricted to the treatment group.
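The two-step assignment described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' code: the record fields (`gender`, `age_group`, `hs_location`) and the even split within each stratum are assumptions made for the sketch.

```python
import random
from collections import defaultdict

def stratified_assign(students, n_classes=7, seed=42):
    """Split students into treatment and control within strata defined
    by gender, age group, and high-school location, then spread each
    arm at random over n_classes tutorial classes."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in students:
        strata[(s["gender"], s["age_group"], s["hs_location"])].append(s)

    treatment, control = [], []
    for members in strata.values():
        rng.shuffle(members)              # random order within the stratum
        half = len(members) // 2
        treatment.extend(members[:half])  # first half to treatment
        control.extend(members[half:])    # remainder to control

    def into_classes(group):
        rng.shuffle(group)
        return [group[i::n_classes] for i in range(n_classes)]

    return into_classes(treatment), into_classes(control)
```

Because each instructor then teaches one treatment and one control class in the same timeslots, instructor and scheduling differences drop out of the treatment-control comparison.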
Data collection, measures, and descriptive statistics
Student background and first semester data come from the administrative database of the business school. The remaining data consist of a combination of quantitative data from the RCT on in-class and online participation and qualitative data about student responses to FC from a student survey and semi-structured interviews. Figure 2 illustrates the data collection process.
We focused on academic achievement measured by the grade received at the ordinary exam (a 4-h, written, closed-book exam) on a 7-point scale (national standard), this being the learning outcome of interest for the main quantitative analysis. For comparability of results, we standardized each test score by subtracting the overall mean and dividing by the standard deviation for the control group.
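As a concrete illustration of this standardization step, the following sketch (with made-up scores, not the actual exam data) subtracts the overall mean and divides by the control group's standard deviation, so that effects read in control-group SDs:

```python
from statistics import mean, stdev

def standardize(all_scores, control_scores):
    """Standardize scores: subtract the overall mean and divide by
    the control group's standard deviation."""
    mu = mean(all_scores)
    sd = stdev(control_scores)
    return [(x - mu) / sd for x in all_scores]

# Toy example: overall mean 3.0, control-group SD 2.0
print(standardize([0, 2, 4, 6], [0, 2, 4]))  # [-1.5, -0.5, 0.5, 1.5]
```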
Pre-treatment control variables
To improve estimate precision, we included standard covariates such as age, gender, high-school GPA, high-school location, and previous semester’s microeconomics grade. The gender variable is a dummy, with females coded as one and males as zero. The variable for high-school location is categorical and breaks down into three categories: capital area, non-capital area, and international. Students’ grade from previous semester’s microeconomics course was included as an indicator of the proficiency-level needed for learning economics.
Measures of student responses to flipped classroom
Our primary measure of student responses is in-class participation, which is available for both the treatment and control groups. In-class participation has both a quantity component (how often a student comes to class) and a quality component (how much the student learns while in class). We collected attendance data in every class, and to measure each student’s overall attendance level, we calculated the proportion of classes in which each student had participated throughout the course. This in-class attendance measure is used as the mediator in the mediation analysis. In addition, we video-recorded each class twice and measured both groups’ interaction activity during class by counting instructor-to-student and student-to-student interactions in both the control and treatment groups. We define an interaction as initiating a question or a response. Hence, if a student asks a group member a question, that counts as one student-to-student interaction, and the reply counts as another. The same was done for student-to-instructor and instructor-to-student interactions. Because the videos for each class were recorded only twice during the semester and coded at the classroom level, these data are not included in our main quantitative models but serve to describe how the nature of in-class participation changed for the intervention classes. Recordings were obtained in 12 classes, as two recordings failed due to technical problems.
For the treatment group only, we also measure the use of online materials as proxies for out-of-class online student participation. These data are retrieved from MyEconLab (the supplementary home assignments) and Panopto (the post-class videos). The proportion of exercises a student attempted served as a measure of MyEconLab activity. For the videos, we calculated the proportion of videos watched by each student.
In addition to the quantitative data, we use qualitative data to broaden the understanding of students’ experiences with participating in a FC. Semi-structured interviews were conducted with 24 students (12 from the control group and 12 from the treatment group). Students were recruited by invitation and participated on a voluntary basis. To encourage expansive responses, we designed a short, semi-structured interview guide with open questions (Lee & Aslam, 2017), covering students’ description of the learning experience, what they liked or disliked about the learning activities, their level of satisfaction with the instructor, and their assessment of the social environment in class and of their own contribution to learning. Interviews lasted between 20 and 45 min each and were transcribed verbatim. Finally, students were invited to comment and reflect on their learning experiences in an endline survey.
The study population comprised 596 students enrolled in 2017. We arrived at our analytical sample (415 students) through three steps. First, we removed all the students who dropped out during the course (46 students). Second, we excluded 11 students from the control group for gaining access to the online materials, despite our efforts to minimize spillover. Third, we excluded students who did not take the ordinary exam (146 students) as the retake exam had a different format. Accordingly, the final analytical sample consisted of 415 students (192 in the control and 223 in the treatment group) with attrition comparable to previous years. Additional file 1: Appendix A, Fig. S1 illustrates the data cleaning process. Additional file 1: Appendix B, Table S1 shows balance on pre-intervention observable characteristics of the analytical sample.
Table 1 presents descriptive statistics for the main variables.
The average student was 21.75 years old, 38% of the sample were female, around 49% were from the capital area, and 13% were international students. The average high-school GPA was 9.0, which was higher than the national average of 7.5. The average grades from the macro- and microeconomics courses were 5.3 and 5.5, respectively. This is somewhat lower than the scale average of 7 and signals that both macro- and microeconomics were difficult for the students. The mean attendance was close to 55%, with a minimum of 0 and a maximum of 1. As for the treated students’ online participation, the average share of supplementary exercises attempted was only about 20%. The videos were more popular, with an average use of just below 50%.
The overall treatment effect (total effect) of the FC intervention on academic achievement is described by the following model:

Outcome_i = β0 + β1 T_i + β2' Controls_i + μ_j(i) + ε_i (1)

where Outcome_i is the final grade of student i, T_i is the treatment variable (a dummy with value one for students in the treatment group and zero otherwise), and Controls_i is a vector of pre-treatment confounders. Instructor fixed effects (μ_j(i)) were added so we effectively compared students taught by the same instructor. The parameter β1 is our coefficient of interest and measures the change in academic achievement attributed to the FC intervention.
As the total effect from estimating Eq. (1) may mask important and interesting student responses occurring between treatment and outcome, mediation analysis is applied. Accordingly, we add in-class attendance as the mediating variable to Eq. (1), adding a second equation to capture the indirect path from the treatment through the mediator to the outcome (Hayes, 2018):

Outcome_i = α0 + α1 T_i + b1 Attendance_i + α3' Controls_i + μ_j(i) + ε_i (2)
Attendance_i = γ0 + α2 T_i + γ1' Controls_i + μ_j(i) + u_i (3)

The direct effect is α1 from Eq. (2), and the indirect effect is estimated as the product of parameters (α2 * b1) from Eqs. (2) and (3). When discussing statistical inference for the indirect effect, we relied on bootstrapped standard errors, as this effect is the product of coefficients from two different equations. Although β1 from Eq. (1) can be interpreted as the total causal effect of FC on academic achievement, adding post-treatment variables breaks the causal interpretation. Accordingly, an additional assumption of no intermediate confounding is required for the indirect and direct effects to be causally interpreted.
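The estimation pipeline for the total, direct, and indirect effects with a bootstrapped interval can be sketched in pure Python. This is an illustrative implementation under simplifying assumptions (no controls or instructor fixed effects, percentile bootstrap confidence interval), not the authors' code:

```python
import random

def ols(y, xs):
    """Least-squares fit of y on the columns in xs, with an intercept.
    Returns [b0, b1, ...]; solves the normal equations by elimination."""
    n, k = len(y), len(xs) + 1
    X = [[1.0] + [col[i] for col in xs] for i in range(n)]
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)]
         for r in range(k)]                   # X'X
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]  # X'y
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return beta

def mediation(y, t, m, n_boot=1000, seed=1):
    """Total, direct, and indirect effects of treatment t on outcome y
    through mediator m, with a percentile bootstrap interval for the
    indirect effect (product of coefficients, as in Hayes, 2018)."""
    rng = random.Random(seed)
    total = ols(y, [t])[1]            # Eq. (1): y on t
    _, direct, b1 = ols(y, [t, m])    # Eq. (2): y on t and m
    alpha2 = ols(m, [t])[1]           # Eq. (3): m on t
    indirect = alpha2 * b1
    n = len(y)
    boots = []
    for _ in range(n_boot):           # resample students with replacement
        s = [rng.randrange(n) for _ in range(n)]
        yb = [y[i] for i in s]
        tb = [t[i] for i in s]
        mb = [m[i] for i in s]
        boots.append(ols(mb, [tb])[1] * ols(yb, [tb, mb])[2])
    boots.sort()
    ci = (boots[int(0.025 * n_boot)], boots[int(0.975 * n_boot) - 1])
    return total, direct, indirect, ci
```

Because the model is linear, the total effect decomposes exactly as total = direct + indirect, which is how a negative attendance channel can mask a positive direct effect.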
To obtain more detail about students’ participation in the learning experience, we conducted a qualitative investigation combining responses from the endline survey with interview transcripts. Two authors coded the data in parallel, using deep emergence (Suddaby, 2006) to identify relevant themes that either supported or challenged the quantitative analysis findings. Drawing on thematic analysis (Braun & Clarke, 2006) and grounded theory techniques (Strauss & Corbin, 1998), we generated initial codes by coding text extracts at the semantic level, then searched and reviewed the codes for themes across the data set. The authors met to compare notes on the emerging themes and discuss differences or clarify inconsistencies (Silverman, 2014). If any doubt arose, the authors coded the data anew, and the two sets of codes were compared to resolve any inconsistencies. The codes were then refined and consolidated before writing the final narrative. The themes identified are illustrated in Fig. 3, showing the data structure and exemplary text extracts.
Average treatment effect on academic achievement
Table 2 shows that students in the treatment group, on average, scored 0.102 SDs (p = 0.140) higher on the final exam than the control group. However, the effect does not differ significantly from zero under a two-sided test. Applying a one-sided test for a positive effect yields significance at the 10% level (p = 0.070). The gender coefficient is significantly negative, and the two variables on prior academic achievement are, as expected, both significantly positive. The coefficient for students with an international background is significantly positive at the 10% level, whereas students from the non-capital area show no difference compared to those from the capital area. Finally, the effect of age is insignificant.
The total treatment effect estimates the overall difference in exam scores between the treatment and control groups, thus considering all pathways through which the FC affects the final exam grade. Still, to learn more about how shifting to FC impacts student participation, we turn to mediation analysis.
Student responses to flipped classroom
Student in-class participation
The attendance level started at roughly 70% for both groups in week 1 and subsequently declined over the 13-week period (Fig. 4). This decline was steeper for students in the treatment group, suggesting that the intervention caused the level of attendance to decrease.
Table 3 presents estimation results for the three types of mediation effects described in Eqs. (2) and (3): first, the total effect, which equates with the results in Table 2; second, the direct effect, which estimates how the FC impacts academic achievement when the level of attendance is kept constant; and, finally, the indirect effect, which measures how much of the FC effect on academic achievement operates through attendance.
Table 3 shows a positive and statistically significant direct relationship between the treatment and academic achievement indicating that, when the level of student attendance is held constant, FC has a positive effect on academic achievement. From the total effect, which equals 0.102 SDs (p = 0.140), to the direct effect, which equals 0.163 SDs (p = 0.023), the size increases by over 50%. The indirect effect is negative (p = 0.009) and smaller in size than the total effect. To summarize, when subtracting the indirect effect from the total effect, we see a much higher direct effect of the intervention, which is significant at the 5% level. However, when interpreting these results, one should note that the indirect and direct effects do not have the same causal interpretation as the total effect without an assumption of no intermediate confounding factors. As this assumption is rather strong, we interpret the direct and indirect effects only as associations rather than causal effects. Additional file 1: Appendix C, Table S2 contains more details on these estimation results.
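A quick arithmetic check of this decomposition, using the point estimates reported above (a sketch only; the exact indirect estimate and its inference are in Additional file 1):

```python
# Point estimates reported in the text (standard-deviation units)
total = 0.102    # total effect (Table 2)
direct = 0.163   # direct effect, attendance held constant (Table 3)

# In a linear mediation model, total = direct + indirect,
# so the implied indirect (attendance) effect is:
indirect = total - direct
print(f"implied indirect effect: {indirect:.3f} SDs")  # -0.061
print(f"direct vs. total: {(direct - total) / total:.0%} larger")
```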
As in-class participation is more nuanced than attendance alone, we studied the activity during class in the two groups. Table 4 presents t-tests for equal means of the number of interactions per student—with other students, with the instructor, or in total. All the tests lead to the conclusion that treated students had a higher activity level. The differences are strongest for student-to-student interactions (significant at the 1% level) and weakest for instructor-student interactions (significant at the 5% level).
The influences of online participation
If we focus exclusively on the students assigned to FC, the data also allow us to explore their online participation.
As Table 5 shows, the use of the online options is rather limited for both alternatives. Particularly in the case of MyEconLab, the percentage of students that used 50% or more of the exercises is as low as 11%. For the videos, the corresponding number is 46%. This suggests that students in the intervention did not compensate for their lower in-class attendance with increased online participation. On the contrary, the correlation between MyEconLab use and class attendance is 40%, suggesting that students who engage in online activities are also more likely to attend classes.
In sum, we have shown that the FC was successful in increasing the in-class activity level, yet this did not increase learning on average, as the treatment also caused students to change their attendance pattern. Controlling for this change in attendance, the new format increased student achievement. Moreover, students did not compensate for their lower in-class attendance by increasing their online activity. With these results in mind, we turn to the qualitative data to explore why students in the treatment group were reluctant to participate in learning activities.
Reduced participation as a form of resistance to the new format
The most notable feature of the interviews and answers to open questions was the resistance to engage in the FC format. Many students felt that “it did not work at all,” expressing anger and frustration about the format. They made such statements as “I was very sorry that I didn’t have the ordinary format of teaching” or “I haven’t much positive to say about this course, as I really would have preferred to have ordinary teaching.”
Three narratives illustrate students’ reluctance to the FC format and add nuance to students’ level of participation: Struggling with new roles, Hesitance to collaborate, and Mixed responses to online activities.
Struggling with new roles: Students expressed skepticism about the way the student-centered FC format altered their role as students and that of instructors. For example, students found it “unfair to change to a format with less presentation by the instructor.” As one student put it: “I’m 100% confident that my take-away from the course would have been much greater if I had had instructor-led instruction at the whiteboard.” Another student expressed the same sentiment, commenting, “I missed the instructor’s going through the material and showing me what’s right.” Yet another student explained their disapproval of FC: “I know how I learn the best, which is by listening to the instructor going through the materials.”
Although negative statements dominated, some students also reported appreciation for opportunities to ask questions during the problem-solving exercises and for instructors to elaborate on different aspects of the content during in-class sessions. This indicates that positive emotional experiences align with willingness to participate. Moreover, reflections from students support positive cognitive experiences: “You learn best by trying to solve the problems yourself” and then in the classroom “focus[ing] on exactly those specific problems you couldn’t solve [at home].”
Hesitance to collaborate: A second theme of unwillingness to collaborate was bluntly voiced by a student: “I don’t get any learning out of working in groups.” Others found such collaboration “really annoying” or “tedious” and directly blamed groupwork for their decision not to attend class:
I didn’t come to class to sit together with other students who have exactly the same questions as me, just for us to answer them ourselves.
Students were especially insecure “working on exercises with people I didn’t know.” At times this insecurity was so strong that some students decided to opt out of the in-class learning activities for this reason alone: “I think many didn’t attend class because they didn’t know anyone in their class beforehand.” While reluctance to participate dominated students’ comments on groupwork, the data show more positive comments regarding cognitive involvement, highlighting the relevance of peer-to-peer discussion. One student found it “easier to remember the curriculum when you discuss with or explain it to your group members,” concluding that “learning improved due to the collaborative work.” Another student summarized the perceived benefits of the FC, indicating alignment between positive cognitive involvement and willingness to participate:
By working on the exercises in class, I feel that I have acquired a better understanding than if they had just been explained to me quickly on the whiteboard, and I would have had to concentrate on writing it down but not really have understood it.
Mixed responses to online activities: The third theme relates to students’ reflections on why they were reluctant to participate in online activities. Students explained that they often found neither the time nor the motivation to do homework: “I’m not so good at getting it done at home when there is so much else to do.” This supports the quantitative analysis, which shows that only a small number of students completed the exercises on the online homework platform or watched all videos (see Table 5). As the following quote illustrates, students understand that this lack of preparation impedes groupwork in class:
Yeah, I guess I haven’t prioritized it sufficiently. You know, when you also have other things to do, like reading and stuff, but at least I have attended all the times I was supposed to.
Students who reported engaging cognitively with the online materials commented that the online homework system and the videos were valuable tools: they improved understanding through feedback and enabled repetition of difficult topics, both during the course and when studying for the exam. Some even got "enough out of sitting at home with the videos and this MyEconLab platform." Ironically, students who did not engage in the homework appreciated the in-class groupwork because it provided external motivation and structure for completing the exercises:
What I liked the best was that we could practice in-class while the instructor was there … ahh, in particular because you don’t always get to practice at home beforehand, so that was really great.
All in all, the qualitative data show strong negative emotional responses to shifting from instructor-directed to student-centered learning. Contrary to the aims of the intervention, student responses indicate that the changes gave rise to some frustration, anger, and resistance to participate, which in turn decreased attendance as well as academic achievement.
Results from the first research question showed a small positive but statistically insignificant effect of FC on academic achievement. The current study thereby aligns with previous studies in economics that find small or insignificant effects of FC on academic achievement (Setren et al., 2021; Wozny et al., 2018). Exploring this insignificant effect in more detail, results from the second research question showed that while FC provides opportunities to enhance learning for students who participate, it can also hamper learning by inducing resistance and adverse attendance behavior. In the following, we discuss these findings and provide implications for practice.
An important point for discussion is how student participation mediates the relationship between FC and academic achievement. The intervention was designed on the assumption that students would actively participate in the FC, but students showed a noticeable reluctance to participate, which in turn lowered academic achievement. This finding resonates with previous research documenting lower attendance due to students opting out of learning activities (White et al., 2015), but is at odds with more recent findings showing a positive relationship between FC and class attendance (Aguilar et al., 2021; Foldnes, 2017). This suggests that the relationship between FC, participation, and academic achievement is quite complex and may vary by context. While flexibility is often emphasized as a key advantage of technology-supported learning (Müller et al., 2023; Nouri, 2016), our results show that increased flexibility is not without its challenges, as students can opt out of learning activities before realizing their benefits. As our study was carried out in a first-year course, students' readiness for FC may have been limited, fueling their resistance to participate. Previous findings show that first-year students may not feel confident enough to take control of their learning and may prefer teacher-led instruction over student-centered learning (Nerland, 2020; Tomas et al., 2019).
Lack of attendance in the treatment group can be viewed as an act of resistance and merits discussion. While resistance to learning is a well-known challenge, the reasons for student resistance in higher education are many and varied (Winkler & Rybnikova, 2019). When students are presented with new learning opportunities, they assess the risk that taking on the learning experience poses to their sense of self; when the dominant emotions evoked are frustration or fear, they are less willing to engage in the learning activities than when they experience delight or pride (Lund Dean & Jolly, 2012). Disengagement occurs when students do not accept the risk and opt out of learning opportunities, either by distancing themselves from the opportunities and coping on their own, or by openly rejecting them and voicing their discontent. In either case, they miss opportunities for learning designed by the instructor. Our findings support previous research showing that moving instructor-led guidance to solutions online disincentivizes some students from coming to class (Hussain et al., 2020; Tomas et al., 2019), as they do not regard collaborating with peers as a productive means of learning. Accordingly, when students opt out of in-class activities, the assumed benefits of identifying knowledge gaps through groupwork disappear.
Finally, the challenges of FC have implications for both instructors and institutions. Instructors may expect that all students approach classroom experiences positively and have the necessary psychological resources to cope with and learn from situations they perceive as uncertain or ambiguous. However, in this study, students showed strong emotional responses, which calls for educator awareness of the possibility of student distress and the need to support students in their learning (Hao, 2016; Lai & Hwang, 2016). To increase the likelihood of successful FC implementation, more knowledge is needed on how teachers can mitigate student resistance before shifting current formats into new, less well-researched formats, as also pointed out by Tharayil et al. (2018). Institutions also have a responsibility to support instructors in developing the necessary competences if they intend to encourage teaching innovations and the development of learning activities. Instructors should feel confident about introducing new teaching and learning methods even when students are resistant or unwilling to participate in learning activities; otherwise, the risk is that instructors make safe choices and provide students with what they are used to and immediately identify with, instead of making choices informed by pedagogy and aspirations for enhanced student learning. This includes learning from research on implementations of FC that do not have the expected effects, to avoid painting too rosy a picture (Abeysekera & Dawson, 2015).
Despite a rigorous experimental design, this study has some limitations that bear mentioning. First, without a pure control group, we cannot rule out that the experiment itself affected the results. A longer study timeframe is one way of addressing this issue, because more time could mitigate such concerns by making participants more comfortable and accustomed to the experiment (Baxter et al., 2015). We implemented the intervention over the entire semester, with our main outcome being the result of a high-stakes exam; we therefore expect students to have done their best regardless of their participation in the experiment.
Second, using only attendance as a measure of in-class participation prevents us from ruling out the influence of other post-treatment confounding factors, such as motivation. While our present data cannot address this directly, the students' qualitative responses nuance and support our findings by capturing their emotional and cognitive reactions. Moreover, resistance to the intervention may have been caused by pre-treatment factors that we did not examine, such as self-organization, independent learning abilities, and prior experiences with FC (Asarta & Schmidt, 2020; Scheel et al., 2022).
Finally, the effects may not be generalizable to the overall student population, given that we have only looked at first-year students in a specific disciplinary setting. We therefore encourage replication in different contexts and for different groups of students.
This study has responded to calls for more methodological rigor by conducting an RCT that enables causal interpretation of the effectiveness of FC. This allows us to quantify previously observed mixed responses to FC and to provide evidence on the influence of students' responses to new teaching formats. Based on the RCT, our findings showed a statistically insignificant effect of FC on student achievement. However, this effect masks important behavioral responses, as the treatment also caused students to change their attendance pattern. Controlling for this change in attendance, the new format did indeed increase student performance. Exploring the reasons for the changed attendance pattern in more detail, we found that students resisted participating as they struggled with the shift away from teacher-led instruction and with the demand for increased groupwork, which they found less conducive to learning than studying on their own. Such resistance to participating in new teaching formats may thus hamper learning and should be addressed by instructors and institutions seeking to benefit from FC.
Availability of data and materials
Data are not publicly available, but anonymized datasets and replication files are available from the authors on request.
Abeysekera, L., & Dawson, P. (2015). Motivation and cognitive load in the flipped classroom: Definition, rationale and a call for research. Higher Education Research & Development, 34(1), 1–14. https://doi.org/10.1080/07294360.2014.934336
Aguilar, R., Santana, M., Larrañeta, B., & Cuevas, G. (2021). Flipping the strategic management classroom: Undergraduate students’ learning outcomes. Scandinavian Journal of Educational Research, 65(6), 1081–1096. https://doi.org/10.1080/00313831.2020.1825524
Akçayır, G., & Akçayır, M. (2018). The flipped classroom: A review of its advantages and challenges. Computers & Education, 126, 334–345. https://doi.org/10.1016/j.compedu.2018.07.021
Albert, M., & Beatty, B. J. (2014). Flipping the classroom applications to curriculum redesign for an introduction to management course: Impact on grades. Journal of Education for Business, 89(8), 419–424. https://doi.org/10.1080/08832323.2014.929559
Asarta, C. J., & Schmidt, J. R. (2017). Comparing student performance in blended and traditional courses: Does prior academic achievement matter? The Internet and Higher Education, 32, 29–38. https://doi.org/10.1016/j.iheduc.2016.08.002
Asarta, C. J., & Schmidt, J. R. (2020). The effects of online and blended experience on outcomes in a blended learning environment. The Internet and Higher Education, 44, 100708. https://doi.org/10.1016/j.iheduc.2019.100708
Awidi, I. T., & Paynter, M. (2019). The impact of a flipped classroom approach on student learning experience. Computers & Education, 128, 269–283. https://doi.org/10.1016/j.compedu.2018.09.013
Baxter, K., Courage, C., & Caine, K. (2015). Chapter 13—Field studies. In K. Baxter, C. Courage, & K. Caine (Eds.), Understanding your users (2nd ed., pp. 378–428). Morgan Kaufmann. https://doi.org/10.1016/B978-0-12-800232-2.00013-4
Bergfjord, O. J., & Heggernes, T. (2016). Evaluation of a “flipped classroom” approach in management education. Journal of University Teaching and Learning Practice, 13(5), 17.
Bergmann, J., & Sams, A. (2014). Flipped learning: Gateway to student engagement. International Society for Technology in Education.
Bishop, J. L., & Verleger, M. A. (2013). The flipped classroom: A survey of the research. In 120th American society for engineering education annual conference and exposition (Vol. 30, pp. 1–18).
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
Buhl-Wiggers, J., la Cour, L., Franck, M. S., & Kjærgaard, A. (2023). Investigating effects of teachers in flipped classroom: A randomized controlled trial study of classroom level heterogeneity. International Journal of Educational Technology in Higher Education, 20(1), 26. https://doi.org/10.1186/s41239-023-00396-4
Burke, A. S., & Fedorek, B. (2017). Does “flipping” promote engagement?: A comparison of a traditional, online, and flipped class. Active Learning in Higher Education, 18(1), 11–24. https://doi.org/10.1177/1469787417693487
Butt, A. (2014). Student views on the use of a flipped classroom approach: Evidence from Australia (SSRN Scholarly Paper ID 2331010). Social Science Research Network. https://papers.ssrn.com/abstract=2331010
Cabi, E. (2018). The impact of the flipped classroom model on students’ academic achievement. International Review of Research in Open and Distributed Learning, 19(3), 202–221.
Cagande, J. L. L., & Jugar, R. R. (2018). The flipped classroom and college physics students’ motivation and understanding of kinematics graphs. Issues in Educational Research, 28(2), 288–307.
Calimeris, L., & Sauer, K. M. (2015). Flipping out about the flip: All hype or is there hope? International Review of Economics Education, 20, 13–28. https://doi.org/10.1016/j.iree.2015.08.001
Chen, P.-Y., & Hwang, G.-J. (2019). An IRS-facilitated collective issue-quest approach to enhancing students’ learning achievement, self-regulation and collective efficacy in flipped classrooms. British Journal of Educational Technology, 50(4), 1996–2013. https://doi.org/10.1111/bjet.12690
Chen, Y., Wang, Y., Kinshuk, & Chen, N.-S. (2014). Is FLIP enough? Or should we use the FLIPPED model instead? Computers & Education, 79, 16–27. https://doi.org/10.1016/j.compedu.2014.07.004
Cheng, L., Ritzhaupt, A. D., & Antonenko, P. (2019). Effects of the flipped classroom instructional strategy on students’ learning outcomes: A meta-analysis. Educational Technology Research and Development, 67(4), 793–824. https://doi.org/10.1007/s11423-018-9633-7
Elmaadaway, M. A. N. (2018). The effects of a flipped classroom approach on class engagement and skill performance in a Blackboard course. British Journal of Educational Technology, 49(3), 479–491. https://doi.org/10.1111/bjet.12553
Fang, J., Vong, J., & Fang, J. (2022). Exploring student engagement in fully flipped classroom pedagogy: Case of an Australian business undergraduate degree. Journal of Education for Business, 97(2), 76–85. https://doi.org/10.1080/08832323.2021.1890539
Ferreri, S. P., & O’Connor, S. K. (2013). Instructional design and assessment: Redesign of a large lecture course into a small-group learning course. American Journal of Pharmaceutical Education, 77(1), 1–9.
Findlay-Thompson, S., & Mombourquette, P. (2014). Evaluation of a flipped classroom in an undergraduate business course (SSRN Scholarly Paper ID 2331035). Social Science Research Network. https://papers.ssrn.com/abstract=2331035
Fisher, R., Perényi, Á., & Birdthistle, N. (2018). The positive relationship between flipped and blended learning and student engagement, performance and satisfaction. Active Learning in Higher Education. https://doi.org/10.1177/1469787418801702
Foldnes, N. (2016). The flipped classroom and cooperative learning: Evidence from a randomised experiment. Active Learning in Higher Education, 17(1), 39–49. https://doi.org/10.1177/1469787415616726
Foldnes, N. (2017). The impact of class attendance on student learning in a flipped classroom. Nordic Journal of Digital Literacy, 12(1–2), 8–18. https://doi.org/10.18261/issn.1891-943x-2017-01-02-02
Förster, M., Maur, A., Weiser, C., & Winkel, K. (2022). Pre-class video watching fosters achievement and knowledge retention in a flipped classroom. Computers & Education, 179, 104399. https://doi.org/10.1016/j.compedu.2021.104399
Graham, C. R., Henrie, C. R., & Gibbons, A. S. (2014). Developing models and theory for blended learning research. In A. G. Picciano, C. D. Dziuban, & C. R. Graham (Eds.), Blended learning: Research perspectives (Vol. 2, pp. 13–33). Routledge.
Hao, Y. (2016). Exploring undergraduates’ perspectives and flipped learning readiness in their flipped classrooms. Computers in Human Behavior, 59, 82–92. https://doi.org/10.1016/j.chb.2016.01.032
Haughton, J., & Kelly, A. (2015). Student performance in an introductory business statistics course: Does delivery mode matter? Journal of Education for Business, 90(1), 31–43. https://doi.org/10.1080/08832323.2014.968518
Hayes, A. F. (2018). Introduction to mediation, moderation, and conditional process analysis (2nd ed.). The Guilford Press.
Hussain, S., Jamwal, P. K., Munir, M. T., & Zuyeva, A. (2020). A quasi-qualitative analysis of flipped classroom implementation in an engineering course: From theory to practice. International Journal of Educational Technology in Higher Education, 17(1), 43. https://doi.org/10.1186/s41239-020-00222-1
Jovanovic, J., Mirriahi, N., Gašević, D., Dawson, S., & Pardo, A. (2019). Predictive power of regularity of pre-class activities in a flipped classroom. Computers & Education, 134, 156–168. https://doi.org/10.1016/j.compedu.2019.02.011
Khanova, J., Roth, M. T., Rodgers, J. E., & McLaughlin, J. E. (2015). Student experiences across multiple flipped courses in a single curriculum. Medical Education, 49(10), 1038–1048. https://doi.org/10.1111/medu.12807
Lai, C.-L., & Hwang, G.-J. (2016). A self-regulated flipped classroom approach to improving students’ learning performance in a mathematics course. Computers & Education, 100, 126–140. https://doi.org/10.1016/j.compedu.2016.05.006
Lee, B., & Aslam, U. (2017). Towards the wholesome interview: Technical, social and political dimensions. In C. Cassell, A. L. Cunliffe, & G. Grandy (Eds.), The SAGE handbook of qualitative business and management research methods. Sage.
Lento, C. (2016). Promoting active learning in introductory financial accounting through the flipped classroom design. Journal of Applied Research in Higher Education, 8(1), 72–87. https://doi.org/10.1108/JARHE-01-2015-0005
Lopes, A. P., & Soares, F. (2018). Perception and performance in a flipped financial mathematics classroom. The International Journal of Management Education, 16(1), 105–113. https://doi.org/10.1016/j.ijme.2018.01.001
Lund Dean, K., & Jolly, J. (2012). Student identity, disengagement, and learning. Academy of Management Learning & Education, 11(2), 228–243.
Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: A systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education, 15(1), 20. https://doi.org/10.1186/s41239-018-0101-6
Martínez-Jiménez, R., & Ruiz-Jiménez, M. C. (2020). Improving students’ satisfaction and learning performance using flipped classroom. The International Journal of Management Education, 18(3), 100422. https://doi.org/10.1016/j.ijme.2020.100422
Merlin-Knoblich, C., Harris, P. N., & McCarty Mason, E. C. (2019). Examining student classroom engagement in flipped and non-flipped counselor education courses. Professional Counselor, 9(2), 109–125.
Müller, C., Mildenberger, T., & Steingruber, D. (2023). Learning effectiveness of a flexible learning study programme in a blended learning design: Why are some courses more effective than others? International Journal of Educational Technology in Higher Education, 20(1), 10. https://doi.org/10.1186/s41239-022-00379-x
Nerland, M. (2020). Challenges in accomplishing student-centred learning environments: Exploring sources of student participation hesitancy. In M. Elken, P. Maassen, M. Nerland, T. Prøitz, B. Stensaker, & A. Vabø (Eds.), Quality work in higher education: Organisational and pedagogical dimensions (pp. 1–19). Springer.
Nouri, J. (2016). The flipped classroom: For active, effective and increased learning—especially for low achievers. International Journal of Educational Technology in Higher Education, 13(1), 33. https://doi.org/10.1186/s41239-016-0032-z
O’Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education, 25, 85–95. https://doi.org/10.1016/j.iheduc.2015.02.002
Phillips, C. R., & Trainor, J. E. (2014). Millennial students and the flipped classroom. Journal of Business & Educational Leadership, 5(1), 102–112.
Porcaro, P. A., Jackson, D. E., McLaughlin, P. M., & O’Malley, C. J. (2016). Curriculum design of a flipped classroom to enhance haematology learning. Journal of Science Education and Technology, 25(3), 345–357. https://doi.org/10.1007/s10956-015-9599-8
Prashar, A. (2015). Assessing the flipped classroom in operations management: A pilot study. Journal of Education for Business, 90(3), 126–138. https://doi.org/10.1080/08832323.2015.1007904
Price, C., & Walker, M. (2021). Improving the accessibility of foundation statistics for undergraduate business and management students using a flipped classroom. Studies in Higher Education, 46(2), 245–257. https://doi.org/10.1080/03075079.2019.1628204
Rittle-Johnson, B., & Schneider, M. (2014). Developing conceptual and procedural knowledge of mathematics (Vol. 1). Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199642342.013.014
Scafuto, I., Serra, F., Mangini, E., Maccari, E. A., & Ruas, R. (2017). The impact of flipped classroom in MBA’s evaluation. Education + Training, 59(9), 914–928. https://doi.org/10.1108/ET-06-2016-0097
Scheel, L., Vladova, G., & Ullrich, A. (2022). The influence of digital competences, self-organization, and independent learning abilities on students’ acceptance of digital learning. International Journal of Educational Technology in Higher Education, 19(1), 44. https://doi.org/10.1186/s41239-022-00350-w
Senali, M. G., Iranmanesh, M., Ghobakhloo, M., Gengatharen, D., Tseng, M.-L., & Nilsashi, M. (2022). Flipped classroom in business and entrepreneurship education: A systematic review and future research agenda. The International Journal of Management Education, 20(1), 100614. https://doi.org/10.1016/j.ijme.2022.100614
Setren, E., Greenberg, K., Moore, O., & Yankovich, M. (2021). Effects of flipped classroom instruction: Evidence from a randomized trial. Education Finance and Policy, 16(3), 363–387. https://doi.org/10.1162/edfp_a_00314
Silverman, D. (2014). Interpreting qualitative data. SAGE Publications.
Steen-Utheim, A. T., & Foldnes, N. (2018). A qualitative investigation of student engagement in a flipped classroom. Teaching in Higher Education, 23(3), 307–324. https://doi.org/10.1080/13562517.2017.1379481
Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory. Sage.
Strayer, J. F. (2012). How learning in an inverted classroom influences cooperation, innovation and task orientation. Learning Environments Research, 15(2), 171–193. https://doi.org/10.1007/s10984-012-9108-4
Strelan, P., Osborn, A., & Palmer, E. (2020). The flipped classroom: A meta-analysis of effects on student performance across disciplines and education levels. Educational Research Review, 30, 100314. https://doi.org/10.1016/j.edurev.2020.100314
Suddaby, R. (2006). What grounded theory is not. Academy of Management Journal, 49(4), 633–642.
Sun, Z., Xie, K., & Anderman, L. H. (2018). The role of self-regulated learning in students’ success in flipped undergraduate math courses. The Internet and Higher Education, 36, 41–53. https://doi.org/10.1016/j.iheduc.2017.09.003
Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., & Waters, C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education, 5(1), 7. https://doi.org/10.1186/s40594-018-0102-y
Tomas, L., Evans, N., Doyle, T., & Skamp, K. (2019). Are first year students ready for a flipped classroom? A case for a flipped learning continuum. International Journal of Educational Technology in Higher Education, 16(1), 5. https://doi.org/10.1186/s41239-019-0135-4
van Alten, D. C. D., Phielix, C., Janssen, J., & Kester, L. (2019). Effects of flipping the classroom on learning outcomes and satisfaction: A meta-analysis. Educational Research Review, 28, 100281. https://doi.org/10.1016/j.edurev.2019.05.003
Vaughan, N. D., & Cloutier, D. (2017). Evaluating a blended degree program through the use of the NSSE framework. British Journal of Educational Technology, 48(5), 1176–1187.
White, C., McCollum, M., Bradley, E., Roy, P., Yoon, M., Martindale, J., & Worden, M. K. (2015). Challenges to engaging medical students in a flipped classroom model. Medical Science Educator, 25(3), 219–222. https://doi.org/10.1007/s40670-015-0125-7
Winkler, I., & Rybnikova, I. (2019). Student resistance in the classroom—Functional-instrumentalist, critical-emancipatory and critical-functional conceptualisations. Higher Education Quarterly, 73(4), 521–538. https://doi.org/10.1111/hequ.12219
Wozny, N., Balser, C., & Ives, D. (2018). Evaluating the flipped classroom: A randomized controlled trial. The Journal of Economic Education, 49(2), 115–129. https://doi.org/10.1080/00220485.2018.1438860
Yamarik, S. (2007). Does cooperative learning improve student learning outcomes? The Journal of Economic Education, 38(3), 259–277. https://doi.org/10.3200/JECE.38.3.259-277
Zhonggen, Y., & Guifang, W. (2016). Academic achievements and satisfaction of the clicker-aided flipped business English writing class. Journal of Educational Technology & Society, 19(2), 298–312.
Funding
No funding has been granted for this research.
Ethics approval and consent to participate
Ethical approval was received from the institutional ethics review committee.
The authors declare no competing interests.
Supplementary information
Construction of analytical sample. Figure S1. Study population: data cleaning and loss of observations. Appendix B. Balance of pre-treatment covariates. Table S1. Balancing of pre-treatment covariates between treatment and control group. Appendix C. Estimation results of Eqs. (2) and (3) in the main text. Table S2. Effects of flipped classroom on academic achievement and class attendance.
Cite this article
Buhl-Wiggers, J., la Cour, L. & Kjærgaard, A.L. Insights from a randomized controlled trial of flipped classroom on academic achievement: the challenge of student resistance. Int J Educ Technol High Educ 20, 41 (2023). https://doi.org/10.1186/s41239-023-00413-6
Keywords
- Flipped classroom
- Randomized controlled trial
- Student participation
- Student resistance
- Academic achievement