
Instructional changes instigated by university faculty during the COVID-19 pandemic: the effect of individual, course and institutional factors

Abstract

The purpose of this study was to investigate the instructional changes made by faculty for the emergency online teaching necessitated by the COVID-19 pandemic, and to explore key factors related to those changes from an ecological systems perspective. Data on various individual, course, and institutional factors and on instructional change variables were collected from 201 educators at higher education institutions. Results revealed that the level of instructional change made by faculty fell, on average, between converting their existing course to an online one with some functional improvement (Augmentation, level 3) and critical course redesign (Modification, level 4), but that educators did not reach the level of creating new tasks that were previously inconceivable (Redefinition, level 5). The biggest instructional change was found in teaching behaviors, followed by technology use, with only small changes in beliefs about online teaching. The factors most highly correlated with instructional change were individual educators' technology acceptance and innovation propensity, the media synchronicity of the course, and the fidelity of institutional support. Recommendations are provided to aid strategic coping by universities facing a major crisis, with insights that may ultimately improve the quality of higher education in non-crisis contexts.

Introduction

With the rapid growth of online education, many researchers have explored faculty experiences of online teaching, including those occurring during crises such as natural disasters or socio-political turmoil. In interviews with eight teachers who introduced online technology during the spread of the SARS virus in Hong Kong in 2003, Fox (2007) found that teachers saw both the potential and the shortcomings of using online technology in education, with the negative impacts due mostly to lack of preparation time and teaching competency. After studying online teaching in the wake of Hurricane Katrina, Lorenzo (2008) concluded that higher education is often slow to adopt new tools and innovations and to institute changes accordingly, but can do so when there is a will, even during a crisis. Ayebi-Arthur (2017), in a study of nine faculty members in New Zealand who offered online teaching after the earthquakes of 2010–2011, revealed that while most faculty members perceived the usefulness of online learning during the crisis, frequent communication, technology infrastructure, and support were the keys to faculty successfully adopting online tools. Further, in a case study of academics who taught blended and online courses during student protests in South Africa between 2015 and 2016, Czerniewicz et al. (2019) found that most faculty accepted blended and online approaches as viable options during the university shutdowns, although they regarded face-to-face teaching as the more effective and preferred method of instruction.

In a similar vein, a few studies have investigated emergency online teaching by faculty during the COVID-19 pandemic that began in late 2019 and continues today. In a survey of U.S. faculty and administrators from 672 institutions, conducted in the early weeks of emergency online teaching, Johnson et al. (2020) discovered that regardless of whether faculty had taught online before, they were able to quickly adopt online teaching approaches and make the necessary adjustments to assignments, exams, and grading policies. Similar findings can be observed in studies conducted in less developed countries as well. In a qualitative study based on interviews with 20 professors from Indian universities, Shenoy et al. (2020) found that even those who initially resisted the adoption of technology and perceived online technology as a hindrance quickly developed habits conducive to teaching online, utilizing appropriate tools and in time perceiving technology adoption to be a blessing and a welcome revolution in instruction. Similarly, Almaghaslah and Alsyari (2020) found that of 59 surveyed Saudi Arabian faculty members, the majority reported a smooth shift from classroom teaching to emergency online instruction and expressed an appreciation for the flexibility of the latter approach. A recent autoethnographic study conducted in Japan (Jung et al., 2021) also revealed that, as they gained online teaching experience, faculty members generally became more optimistic and utilized more diversified resources in emergency online teaching. All of these studies of faculty experiences during the COVID-19 pandemic are informative, as they reveal faculty members' ability to quickly adopt online technologies and adapt to emergency online teaching, as well as their confusion, anxiety, and struggles in the early stages of the transition.
What these studies fail to explore, however, are the factors associated with the rapid adoption of new technologies and adjustments to emergency online teaching by faculty.

Several theories and models, such as Social Cognitive Theory, the Technology Acceptance Model, the Theory of Planned Behavior, and the first and second Unified Theory of Acceptance and Use of Technology (UTAUT), have identified key factors that may explain why individuals adopt certain technologies or innovations (Jung & Lee, 2020). For instance, the Technology Acceptance Model predicts user acceptance of information technology based on two key beliefs: perceived usefulness and perceived ease of use (Davis, 1989). Many researchers have used these factors to examine the acceptance of various technologies, including those involved in online teaching in higher education contexts (e.g., Alsofyani et al., 2012; Gibson et al., 2008; Green et al., 2009; King & He, 2006; Yuen & Ma, 2008). Although the robustness of the Technology Acceptance Model is well supported, criticisms of it include its exclusion of social influence and individual features (Ajibade, 2018). The UTAUT presents a viable alternative in that it uses four factors (performance expectancy, effort expectancy, social influence, and facilitating conditions) to explain technology acceptance while acknowledging the moderating effects of age, gender, and experience (Venkatesh et al., 2003). Like the Technology Acceptance Model, the UTAUT has been widely applied in examining faculty members' acceptance of online teaching across different contexts (Gunasinghe et al., 2019a; Hu et al., 2020; Radovan & Kristl, 2017).

Over the past few decades, several studies have examined factors influencing faculty adoption of online education. Some of these studies reveal that faculty members are motivated to teach online when institution-level incentives are offered, including, but not limited to, the existence and quality of institutional support (Alsofyani et al., 2012; Moore & Anderson, 2003; Sumrall, 2002), which may include institutional commitment to online education and university policies that create supporting and facilitating environments for online education (Gannon-Cook & Ley, 2004; Theall, 1999), and personal rewards such as time and monetary compensation (Gannon-Cook & Ley, 2004; Townsend & Hauss, 2002). Other studies report that individual characteristics of faculty, such as age, gender, teaching experience, and intrinsic motivation (e.g., personal growth, satisfaction, and experience gained from teaching online), influence acceptance and continuation of online teaching (Allan & Seaman, 2012; Chapman, 2011; Ko & Rossen, 2003; Shea, 2007). Granic and Marangunic (2019), Thatcher et al. (2007), and Agarwal and Prasad (1998) argue that personal innovativeness, or innovation propensity, influences individuals' technology acceptance. Still other studies find that course-level instructional design and media factors, such as flexibility, collaborative discussions, and multimedia components, affect faculty adoption of online education (Arend, 2009; Chapman, 2011). During a crisis such as an earthquake or indeed COVID-19, studies such as those of Almaiah et al. (2020) and Ayebi-Arthur (2017) point out that an institution's policies and strategies to help its faculty members readily cope with the crisis in question are influential in facilitating the implementation of online education.

While the technology and innovation adoption theories and relevant empirical studies may help researchers better understand factors influencing adoption behaviors across diverse contexts, their focus is mainly on individuals' voluntary use of a technology or their acceptance of an innovation including online teaching, when such opportunities exist (Dillon, 2001; Gunasinghe et al., 2019b; Taherdoost, 2018). These theories are therefore only able to offer limited predictive capabilities for those situations in which individuals have no choice other than to pursue the same end within more or less the same time and resource framework. Such is the case in emergency online teaching situations, where individual faculty members are obliged to develop and implement emergency online courses, using certain available technologies and in accordance with certain requirements specified by their university. Even when all faculty members are mandated to adopt online teaching, they of course make technological and pedagogical choices concerning what media or technology they will use and what instructional practices they will implement. To examine the factors affecting such instructional changes during emergency online teaching, one must consider the social processes that take place between the individual and the surrounding contexts as well as other individual and organizational factors. In this regard, the present study applies, as an umbrella framework, Bronfenbrenner’s ecological systems theory, with its focus on the relationship between individual experience and interactions with surrounding contexts, and further, draws on the Technology Acceptance Model to identify factors affecting instructional changes made by university faculty members providing emergency online teaching during the COVID-19 pandemic.

Theoretical framework

The ecological systems theory, developed by Bronfenbrenner (1977, 1979), served as the theoretical framework for this study. The theory explains how the intrinsic qualities of individuals and their immediate and broader environments interact to influence how they develop and change. Bronfenbrenner’s theory organizes environments into four systems with which an individual interacts (Bronfenbrenner, 1977, pp. 514–515): the micro-system, meso-system, exo-system, and macro-system.

  • The most proximal system to the individual is the micro-system, which refers to the immediate setting where one exists and usually performs a role or activity.

  • Moving outward from the micro-system is the meso-system, which incorporates all of one's micro-systems and the processes and interactions between them.

  • The exo-system is the next outermost system, which contains elements of one's environment that one cannot control or affect, often including one's neighborhood and other social and societal structures.

  • The outermost system is the macro-system, which includes one’s culture or subculture and, in some cases, one’s socio-economic setting, race, and country.

In the present study, the individuals in Bronfenbrenner's formulation were South Korean university faculty members who were obliged to carry out emergency online teaching during the COVID-19 pandemic. Individual factors shown to be influential in previous studies were included: technology acceptance and innovation propensity (Agarwal & Prasad, 1998; Granic & Marangunic, 2019; Stewart et al., 2010; Thatcher et al., 2007) and teaching perspective (Wingo et al., 2017), along with gender, age, and teaching experience (Allan & Seaman, 2012; Shea, 2007). The immediate setting, or micro-system, with which each faculty member interacted was a course they were required to teach online as a result of the pandemic. It included such factors as instructional objectives, media, and instructional design (Arend, 2009; Chapman, 2011), along with course disciplinary area, course level, and class size. The meso-system was the institution in which faculty taught, and included such factors as the size, type, and location of the university (Green et al., 2009), the fidelity of institutional support (Alsofyani et al., 2012; Sumrall, 2002), and organization-level upheaval and nudge strategies employed to cope with the crisis (Almaiah et al., 2020; Ayebi-Arthur, 2017). Factors found to be important mostly for voluntary decisions to accept a technology or online education, such as personal rewards, were not included, since online teaching in the present context was mandatory. The exo-system (e.g., government and national strategies and policies) and macro-system (e.g., national culture) were also not included in the research design, since they were the same across the board for all faculty members involved.

Keeping in mind the theoretical notions about individuals and the micro- and meso-systems of the ecological systems model, we hypothesized that factors at three system levels (individual, course, and institutional) interact with each other to affect the instructional changes instigated by faculty in emergency online teaching. Specifically considered were changes in technology use, teaching behaviors, and beliefs about online teaching, measured across five levels (Fig. 1).

Fig. 1 Theoretical framework of the study

Research questions

We applied our theoretical framework in developing the following research questions:

  1. To what extent did university faculty change their emergency online teaching during the COVID-19 pandemic in terms of: (1) technology use, (2) teaching behaviors, and (3) beliefs about online teaching?

  2. How did factors at individual, course, and institutional levels contribute to the changes that faculty instituted in their emergency online teaching?

Methods

Participants and data collection

A total of 201 university educators at higher education institutions located in South Korea participated in this study. The sample comprised 58.2% (n = 117) women, 40.8% (n = 82) men, and 1.0% (n = 2) who preferred not to state their gender. In regard to age, 6.0% (n = 12) of participants were in their thirties, 39.8% (n = 80) in their forties, 43.3% (n = 87) in their fifties, and 10.9% (n = 22) in their sixties. Teaching experience was relatively evenly distributed: 8.5% (n = 17) had less than 5 years of teaching experience, 23.9% (n = 48) 6 to 10 years, 20.4% (n = 41) 11 to 15 years, 15.9% (n = 32) 16 to 19 years, and 31.3% (n = 63) more than 20 years. Most participants (80.6%, n = 162) could be considered beginners at online teaching, with most at this level (69.2%, n = 139) having no experience prior to COVID-19 and a smaller number (11.4%, n = 23) having less than one year of online teaching experience. Among the participants, 56.2% (n = 113) taught courses in the arts, humanities, and social sciences, 36.8% (n = 74) taught courses in engineering, sciences, and medical disciplines, and 7.0% (n = 14) taught multi-disciplinary courses. Course sizes were diverse: 3.0% (n = 6) of courses contained fewer than 10 students, 40.8% (n = 82) 10 to 30 students, 31.8% (n = 64) 31 to 50 students, 19.4% (n = 39) 51 to 100 students, and 5.0% (n = 10) more than 100 students. Institution sizes ranged from fewer than 10,000 students (62.7%, n = 126) to more than 10,000 students (37.3%, n = 75).

Data were collected from May 21, 2020 to July 2, 2020 using an online survey. The study design was approved by the Research Ethics Committee of the higher education institution of one of the authors (#2020-09). All study participants provided informed consent on the first page of the survey.

Measurement

The theoretical framework shown in Fig. 1, above, guided the development of an initial survey and two pilot tests with five university educators, conducted to detect erroneous or unclear expressions or any parts that might lead to biased responses. The final version consisted of 63 items across four sections designed to measure: (1) individual factors (23 items), (2) course factors (17 items), (3) institutional factors (20 items), and (4) instructional change variables (3 items).

Individual factors The section of the survey measuring individual factors consists of four sub-parts: (1) background information on gender, age, teaching experience and online teaching experience, (2) innovation propensity, (3) technology acceptance, and (4) teaching perspectives. To measure innovation propensity, we adapted 10 items from a survey by Savery (2005) that was developed based on Rogers’ diffusion of innovations theory, which categorizes individuals as innovators, early adopters, early majority, late majority, or laggards based on their behavior in relation to innovations, especially those involving technology. The total innovation propensity score was calculated by combining the scores of innovators and early adopters with the scores of those in the reverse-coded early majority, late majority, and laggards categories. The internal consistency of innovation propensity was 0.729. In regard to technology acceptance, four constructs were measured (perceived ease of use, perceived usefulness, attitude, and enjoyment) using nine items from a survey by Park (2009) modified to fit the context of this study. The internal consistency of technology acceptance was 0.853. To measure teaching perspectives, participants were asked to rate the importance of each of the roles of lecturer, designer, and facilitator for successful online teaching. The internal consistency of the three items that measured teaching perspectives was 0.756. The overall consistency of the individual factors, excluding individual background items, was 0.854.
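For readers who wish to reproduce reliability figures such as the 0.729 reported for innovation propensity, the sketch below shows a generic Cronbach's alpha calculation and the reverse-coding step applied to the late-majority and laggard items. It is a minimal illustration assuming an (n_respondents × n_items) matrix of 1–5 Likert scores; the function names are ours, not from the study instrument.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def reverse_code(scores, scale_min=1, scale_max=5):
    """Reverse-code Likert items (e.g., late-majority/laggard items)."""
    return scale_max + scale_min - np.asarray(scores)
```

Reverse-coded items are transformed before the item matrix is passed to `cronbach_alpha`, so that all items point in the same direction when the composite score is formed.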

Course factors The section of the survey measuring course factors consists of four sub-parts: (1) background information on course discipline, course size, and course objectives, (2) media synchronicity, (3) the extent of modification of the previous course, and (4) the time the educator spent implementing the course. In order to measure the extent to which faculty used synchronous media in the course, three questions were posed about the use of synchronous media for content delivery, faculty-student interaction, and student–student interaction. The internal consistency of the media synchronicity items was 0.808. To measure the extent of modifications required to convert the course to an online format, five items were employed asking about modifications in learning content, instructional strategy, assessment, and interactions. The internal consistency of these items was 0.724. As for time spent on the course, the participants were asked to estimate the time and energy they spent conducting technology-related tasks, preparing lectures, implementing lectures, giving feedback or answering students' questions, facilitating or motivating student learning, and conducting assessments. The internal consistency of the six items on time spent on the course was 0.813. The overall consistency of the course factors, excluding background items, was 0.762.

Institutional factors The section of the survey on institutional factors consists of four sub-parts: (1) background information on the size and location of the institution and the support it offers, (2) the fidelity of support the institution provided, (3) nudging strategies (strategies to influence behavior without restricting options or incentives) (Thaler & Sunstein, 2008), and (4) upheaval strategies (strategies that nations, institutions, or individuals adopt to manage existential crises). To measure the fidelity of institutional support for teaching online, the participants were asked to rate the degree of helpfulness of five different forms of support: (1) face-to-face training, (2) online training, (3) text guidelines, (4) video tutorials, and (5) hotline or consultation. The internal consistency of these five items was 0.812. To examine nudging strategies adopted by the institution during the pandemic, four questions adapted from Thaler and Sunstein (2008) were asked. The internal consistency of these items was 0.645. Eight items derived from Diamond's (2019) twelve successful upheaval strategies, framed in the context of the COVID-19 pandemic, were asked to identify the upheaval strategies of the institution. The internal consistency of these items was 0.924. The overall consistency of the items for the institutional factors, excluding institutional background items, was 0.903.

Instructional changes The section of the survey on instructional changes consists of three items measuring: (1) changes in technology use for courses being transitioned from a traditional course to an emergency online course, (2) changes in teaching behavior, and (3) changes in beliefs about online teaching. The possible responses for degree of change included the four levels of Puentedura’s (2006) Substitution-Augmentation-Modification-Redefinition (SAMR) model, with an additional one added for ‘no change’. The consistency of the three items measuring instructional change was 0.869.

  • No change (Level 1): online teaching with no change from classroom face-to-face teaching

  • Substitution (Level 2): online teaching as a direct tool substitute with little functional change

  • Augmentation (Level 3): online teaching as a direct tool substitute with functional improvement

  • Modification (Level 4): online teaching as a tool for significant task redesign

  • Redefinition (Level 5): online teaching as a tool for the creation of new tasks, previously inconceivable
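As an illustration (the helper names below are ours, not from the survey), the five-level scale can be coded numerically, which also shows how a mean change score is read against adjacent SAMR levels:

```python
# Five-level change scale used for each of the three instructional
# change items (SAMR levels plus "no change").
SAMR_LEVELS = {
    "no change": 1,
    "substitution": 2,
    "augmentation": 3,
    "modification": 4,
    "redefinition": 5,
}

def interpret_mean(mean_level):
    """Describe where a mean change score sits on the SAMR-based scale."""
    names = {v: k for k, v in SAMR_LEVELS.items()}
    lo = int(mean_level)
    if lo >= 5:
        return names[5]
    return f"between {names[lo]} and {names[lo + 1]}"

print(interpret_mean(3.45))  # → between augmentation and modification
```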

Data analyses

To address the posed research questions, descriptive statistics, bivariate correlation analysis, independent and paired samples t-tests, and analyses of variance (ANOVA) were employed. To test the hypotheses in the research model shown in Fig. 1, variance-based structural equation modeling was conducted using the partial least squares approach (PLS-SEM) in SmartPLS 3 software (SmartPLS GmbH, Germany, 2017). PLS-SEM was chosen as it is considered more appropriate for analyzing complex models and for exploratory research (Hair et al., 2017). Calculated descriptive statistics and correlations provided the preliminary data examination. The second-order measurement model was assessed first, followed by the structural model. Assessment of the measurement model involved checking internal consistency reliability using Cronbach's alpha and composite reliability, convergent validity using outer loadings and average variance extracted (AVE), and discriminant validity using cross loadings and the Fornell-Larcker criterion. Based on the results of the measurement model assessment, indicators that fell below the recommended thresholds were deleted, the model was refined, and measurement validation was conducted for the new model. All constructs in the resulting measurement model had acceptable reliability and validity (all Cronbach's alphas greater than 0.6, composite reliability between 0.6 and 0.9, and AVE for all constructs higher than 0.5). Structural relationships among individual factors, course factors, institutional factors, and instructional change variables were validated through a bootstrap analysis with 500 resamples. In assessing the structural model, we checked the collinearity statistics and calculated the path coefficients along with R2, F2, and Q2 to measure the quality of the model.
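The bootstrap step can be sketched in miniature. SmartPLS estimates the full latent-variable model; the NumPy sketch below reproduces only the resampling logic, on a single standardized path estimated by simple regression, and is our illustration rather than the authors' procedure.

```python
import numpy as np

def bootstrap_path(x, y, n_boot=500, seed=0):
    """Bootstrap the standardized simple-regression (path) coefficient.

    Resamples respondents with replacement to estimate the standard
    error of the path coefficient, mirroring the 500-resample logic
    used to judge path significance in PLS-SEM.
    """
    rng = np.random.default_rng(seed)
    x = (np.asarray(x, float) - np.mean(x)) / np.std(x, ddof=1)
    y = (np.asarray(y, float) - np.mean(y)) / np.std(y, ddof=1)
    beta_hat = float(x @ y / (x @ x))
    n = x.size
    boots = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)          # resample respondents
        xb, yb = x[idx], y[idx]
        boots[b] = xb @ yb / (xb @ xb)
    se = float(boots.std(ddof=1))
    return beta_hat, se, beta_hat / se       # estimate, SE, t-statistic
```

With 500 resamples, the ratio of the path estimate to its bootstrap standard error is compared against a t distribution to judge significance, as in the structural model assessment.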

Results

Instructional changes made by faculty during the COVID-19 pandemic

Firstly, the means and standard deviations of the three instructional change variables (faculty changes in the use of technology, teaching behaviors, and beliefs about online teaching), as well as the total average change across all three (Table 1), were calculated. Faculty change in teaching behaviors had the highest mean (M = 3.53, SD = 1.02) and, at a similar magnitude, faculty change in the use of instructional technology had the next highest mean (M = 3.51, SD = 1.11). The mean change in faculty beliefs about online teaching (M = 3.45) was significantly lower than the mean change in technology use [t(200) = 3.61, p < 0.001] and in teaching behaviors [t(200) = 2.73, p < 0.01], and had the largest variance of the three instructional change variables (SD = 1.18). The three change variables were significantly and positively related to each other (r = 0.63 to 0.76), and the total average change for all three variables was 3.45, equivalent to 69.1% of the maximum level.
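The within-educator comparisons above (each of the 201 educators rated all three change variables) use paired-samples t-tests; a minimal NumPy version, with a function name of our choosing:

```python
import numpy as np

def paired_t(a, b):
    """Paired-samples t statistic and degrees of freedom (n - 1)."""
    d = np.asarray(a, float) - np.asarray(b, float)  # per-respondent differences
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t, n - 1
```

For the belief-vs-technology comparison, `a` and `b` would be the 201 paired ratings, yielding the t(200) statistics reported above.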

Table 1 Descriptive statistics of instructional change variables and mean differences by background variables of individual, course, and institutional factors

Secondly, independent samples t-tests and one-way between-subjects analyses of variance (ANOVA) were performed to investigate whether the total instructional change differed by background factors at the individual level (gender, age, teaching experience, and online teaching experience of faculty members), the course level (the discipline area of the course, the number of students taking the course, and the learning objectives the course aims for), and the institutional level (the number of students at the university the faculty member works for, the geographic location of the institution, and whether the institution has an organization that supports faculty teaching during COVID-19). No statistically significant difference in the total change was found for any of the background factors.

The two background factors of gender and class size were, however, associated with differences in some instructional change variables. Change in technology use was significantly higher for female faculty members (M = 3.65, SD = 1.02) than for their male counterparts (M = 3.33, SD = 0.98), with a medium effect size [t(197) = − 2.2, p = 0.03, d = 0.32]. Change in beliefs about online teaching was significantly lower for courses with fewer than 10 students (M = 1.83, SD = 0.98) compared to courses with 10 to 30 students (M = 3.48, SD = 1.19), 31 to 50 students (M = 3.14, SD = 1.13), 51 to 100 students (M = 3.41, SD = 1.16), and more than 100 students (M = 3.80, SD = 1.18), with a large effect size [F(4, 196) = 3.75, p = 0.006, η2 = 0.07]. The differences among the groups of courses with more than 10 students were not significant (p = 0.20). In addition, none of the institutional background factors made a significant difference in instructional change.
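The effect sizes reported above (Cohen's d for the gender comparison and η2 for the class-size ANOVA) follow the standard formulas; the sketch below is a generic implementation, not the authors' code:

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d for two independent groups, using the pooled SD."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = a.size, b.size
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd

def eta_squared(groups):
    """Eta-squared (SS_between / SS_total) for a one-way ANOVA."""
    groups = [np.asarray(g, float) for g in groups]
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    return ss_between / ss_total
```

Here `groups` would be the five lists of belief-change scores, one per class-size band, giving the η2 = 0.07 reported above when applied to the raw data.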

Thirdly, bivariate correlation analyses were conducted among the individual factors (innovation propensity, technology acceptance, and teaching perspectives of faculty members), the course factors (the degree of synchronicity of the media used in the course, the amount of modification from the previous course, and the time and effort faculty members spent modifying the course), and the institutional factors (the fidelity of instructional support provided by the institution, the extent of nudging strategies, and the amount of upheaval strategies used by the institution). Also calculated were the correlations between these factors and the instructional changes (Table 2, below). The results show that most of the sub-factors within each factor were significantly related to each other. Among the individual factors, technology acceptance had the strongest correlation with the instructional change variables (r = 0.37, 0.37, 0.36, and 0.41 for change in media/technology use, practice/behavior, attitude/understanding of teaching and learning, and total change, respectively; all p = 0.00). Among the course factors, media synchronicity was highly correlated with total change (r = 0.35, p = 0.00), and among the institutional factors, the fidelity of instructional support was correlated with change in educators' media/technology use (r = 0.29, p = 0.00).
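The coefficients in Table 2 are Pearson correlations; for completeness, a minimal implementation (ours, not the authors'):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two score vectors."""
    x = np.asarray(x, float) - np.mean(x)   # center x
    y = np.asarray(y, float) - np.mean(y)   # center y
    return float((x * y).sum() / np.sqrt((x * x).sum() * (y * y).sum()))
```

Applied to the raw data, a call such as `pearson_r(technology_acceptance_scores, total_change_scores)` (variable names here are illustrative) would reproduce the r = 0.41 figure above.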

Table 2 Descriptive statistics of individual, course, and institutional factors and correlation to instructional change variables

Factors affecting instructional changes by faculty during the COVID-19 pandemic

To assess both the measurement model and the structural model, PLS-SEM was applied. In assessing the structural model, path coefficients and their significance were examined (Table 3 and Fig. 2) and all were found to be significant except for the paths from institutional factors to course factors (p = 0.063) and from institutional factors to instructional change (p = 0.064), suggesting that institutional factors do not lead to instructional change directly, but via individual factors, and not via course factors. The strongest paths ran from individual factors to course factors (β = 0.47, p = 0.000) and to instructional change (β = 0.40, p = 0.000), followed by the path from institutional factors to individual factors (β = 0.23, p = 0.000). The path from course factors to instructional change was also significant, but with a small effect size (β = 0.15, p = 0.041). The model explains 29.3% of the variance in instructional change by faculty members during the COVID-19 pandemic, which, in education research, is indicative of significant explanatory power (R2 = 0.293, Adj R2 = 0.282) (Eisenhauer, 2009), and has an acceptable predictive relevance value, with Q2 exceeding zero (Q2 = 0.26) (Hair et al., 2017).

Table 3 Results of structural model assessment
Fig. 2 Path coefficients of the research model

Discussion

The purpose of this study was to investigate the instructional changes made by faculty for emergency online teaching necessitated by the COVID-19 pandemic, as well as the key factors influencing those changes.

The results indicate that educators made the most drastic changes in their teaching behaviors, followed by changes made to their use of technology. The change in their beliefs about online teaching was small but significant. This finding is contrary to what one might expect given classical theories such as the Theory of Planned Behavior (Ajzen, 1985), the Technology Acceptance Models (Davis, 1989; Venkatesh, 2000; Venkatesh et al., 2003, 2012), the Information-Motivation-Behavioral Skills Model (Fisher & Fisher, 1992), and the Transtheoretical Model of Change (Prochaska & DiClemente, 1983), which claim that changes in beliefs and attitudes inevitably precede changes in behavior. Our contradictory finding may be due to the unique context of this study, where changes in teaching behaviors and technology adoption were urgently required for emergency online teaching even if attitudinal and belief changes had not occurred. Several studies (Ertmer et al., 2001; Fang, 1996; Farrell & Lim, 2005) have indicated that context, including the school/classroom/course context and the policy context, is an important factor affecting inconsistencies between teacher beliefs and practices. The unavoidable context of the COVID-19 pandemic has likely forced many educators to change their teaching behavior without making corresponding changes in their beliefs. As asserted by Festinger (1957), such a belief-behavior inconsistency would create cognitive dissonance for educators, whose natural human drive for cognitive consistency would motivate them to change their beliefs to justify the behavior they are required to perform (e.g., Cooper, 2007; Festinger & Carlsmith, 1959). Research on educators' behavioral and attitudinal changes in various contexts is needed to test this as yet unproven argument.

Overall, the change identified in this study represents about 70% of the maximum level (M = 3.45 out of 5.00). According to the SAMR model, this level falls between Augmentation (level 3.0) and Modification (level 4.0); that is, university educators, on average, converted their existing courses to online courses with some functional improvements (Augmentation) and a modest degree of critical course redesign (Modification), without quite reaching the level of creating new tasks that were previously inconceivable (Redefinition, level 5.0). This level of change may be regarded as a substantial outcome given that educators were required to create online courses in a short period of time and without much preparation or assistance. What, then, hindered educators from advancing to the Redefinition stage? According to Christensen (2003), innovation in business is of two types: (1) sustained innovation, which includes changes but does not affect an existing market, and (2) disruptive innovation, which creates a new market with a new set of values, ultimately overtaking an existing market. Disruptive innovation may correspond to the Redefinition stage of the SAMR model, which involves radical and structural changes in existing practices. To reach the Redefinition stage and bring about disruptive innovation in online teaching, educators need to develop a high level of competency. They must understand the affordances of new technologies and be able to apply this knowledge to create new online pedagogies. It is to be hoped that further research may delve into the potential factors that can bring about redefinition or disruptive changes.

This study also revealed that at the individual, course, and institutional levels, background factors such as age, teaching experience, online experience, discipline, institution size, geographic location, and supporting organization did not affect instructional change. Gender, however, appears to help explain, in part, the change in technology use. Our finding that female faculty made greater changes in technology use in the context of emergency online teaching may suggest that they used technology less in their pre-COVID-19 teaching than their male counterparts did, requiring more active change in technology adoption in emergency online teaching. Gender differences in technology use have been reported in other studies (e.g., Kahveci, 2010; Yau & Cheng, 2012), where researchers found that males had a higher level of confidence in and usage of technology compared to females. Some recent research (e.g., Gebhardt et al., 2019; Guillén-Gámez et al., 2020) suggests, however, that this gender gap is becoming less evident with the increased prevalence and importance of digital technologies in society and in higher education. Further investigation into gender differences in faculty technology use would help explain the inconsistencies between these findings and previous research. Another possible interpretation of the gender differences in this study is that female educators respond to forced change differently and are more compliant with institutional-level decisions than their male counterparts, a possibility that warrants in-depth qualitative investigation.

Class size, another course background sub-factor, was significant in explaining changes in beliefs concerning online teaching. Contrary to the assertion of numerous studies that a small class size is more beneficial for quality teaching (e.g., Blatchford et al., 2003), this study shows that faculty are more likely to change their beliefs about online teaching when teaching a class of between 10 and 100 students, while those with fewer than 10 students in their class reported less change in their beliefs about online teaching. One explanation could be that course preparation, instructional design, and instructional changes carry more weight in relatively large classes than in small ones, eventually leading to greater changes in faculty beliefs and understandings about online teaching in larger classes (Elison-Bowers et al., 2011).

While most individual, course, and institutional background factors were not influential in explaining the instructional changes of faculty faced with emergency online teaching, some factors did show a degree of impact. Among them, technology acceptance and innovation propensity (individual factors) had two of the strongest associations with instructional change, followed by the fidelity of institutional support (an institutional factor) and media synchronicity (a course factor). Technology acceptance and innovativeness in individual educators have long been recognized as critical prerequisites for technology integration into teaching (e.g., Granic & Marangunic, 2019). In a study of 237 primary and secondary school teachers, Akar (2019) concluded that teachers' personal innovativeness affects their technology acceptance in teaching. Similarly, in a study of 92 university educators, Akgün (2017) found that educators' individual innovativeness and technology acceptance influenced their intention to use technology in instruction. Consistent with these previous findings, this study confirms the importance of helping faculty develop technology acceptance and innovativeness in order to pave the way for resiliency when changes in modes of teaching are required.

At the institutional level, the fidelity of instructional support provided by the institution was another factor strongly associated with instructional change, particularly change in technology use. Instructional support of this sort may be roughly equivalent to perceived ease of use in the Technology Acceptance Model (TAM), perceived behavioral control in the Theory of Planned Behavior (TPB), and facilitating conditions in the UTAUT. In other words, proper instructional support provided by an institution can assist educators in making use of technology more easily and flexibly, regardless of whether the behavior is voluntary or obligatory. Ayebi-Arthur (2017) reported that one of the most important factors that helped a New Zealand university recover from the 2011 earthquakes was institutional support for faculty in the use of e-learning technologies, along with the availability of technological tools, the former inducing ease of use of those tools.

Interestingly, the correlation between instructional change and the synchronicity of the instructional media used in a course might be explained by media synchronicity theory, which holds that synchronous media are more effective for verifying, adjusting, or negotiating participants' mental models and thus for enhancing shared understanding (Dennis et al., 2008). Synchronous media may be more conducive to instructional change than asynchronous media because instructors can receive learners' reactions and feedback directly and readily, in real time, while interacting with them to refine their mental models. Moreover, this result can also be interpreted from a cultural perspective: in a high-context communication culture like Korea, where in-person interaction is highly valued, synchronous, visual technologies tend to be more readily accepted than asynchronous, text-based tools, since they simulate face-to-face presence (Jung & Gunawardena, 2014).

Finally, the structural equation modeling that examined the paths from individual, course, and institutional factors to instructional change provided several insights. It revealed that individual factors reflecting progressive tendencies (e.g., technology acceptance and innovation propensity) had the most powerful influence on course factors (e.g., media selection and the time and effort devoted to the course), eventually leading to instructional change. Conversely, institutional factors had no direct impact on instructional change or course factors. These findings point to the importance of faculty innovativeness and technology acceptance in bringing about changes in teaching and learning, especially during a crisis, as Everett Rogers posited: his diffusion of innovations theory put individual adopters at the center of innovation diffusion, before the later addition of organizational and social influences (Rogers, 2003).

In this study, within its unique context of urgent societal change, such progressive tendencies among individual faculty members, when facilitated by institutional supports and strategies, were a critical driver of the instructional change needed for effective emergency online teaching. This finding suggests that higher education institutions need to help faculty cultivate innovativeness, technology acceptance, and skill development in order to facilitate a successful conversion to online teaching, especially in the midst of a global pandemic. In particular, such institutions need to develop upheaval strategies that include acknowledging the crisis, accepting the responsibility of university leadership, conducting an honest self-appraisal, delineating institutional problems, obtaining material, financial, and emotional help, adopting exemplary models, creating situation-specific institutional flexibility, and securing freedom from institutional constraints (Diamond, 2019).

The present study yields several theoretical and practical implications. Theoretically, planned voluntary change and urgent inevitable change may be radically different, suggesting that studies of the two should pursue different research agendas. While the former may be concerned with the factors that induce change, the latter should focus more on the quality, or depth, of change and its influencing factors. The current lack of theoretical guidance for the latter line of research calls for new theories or models that specifically address forced instructional changes in response to emergencies or crises in higher education, such as the COVID-19 pandemic we find ourselves facing today.

In higher education, institutional intervention in response to a crisis can be implemented in two major ways: demonstrating leadership in strategically coping with the crisis by engaging faculty members in planning, and designing faculty development programs to promote innovativeness. Because information and communications technology (ICT) is growing in importance as an alternative to face-to-face teaching (especially when the latter is not feasible, as during the COVID-19 pandemic), universities need to incorporate an ICT center into their main governance system in order to support technology acceptance by faculty and improve the fidelity of instructional supports.

The data used in this study were collected at the initial stage of the pandemic, between May 2020 and July 2020; the results therefore may not reflect phenomena that occurred at later stages of the crisis. This time specificity suggests the need for well-designed prospective investigations that address the shifting patterns of instructional change over longer periods of time. In addition, the self-report method applied in this study may not be sensitive enough to capture faculty beliefs or subsequent changes in those beliefs. To address this issue, future investigations applying alternative qualitative and quantitative methods, such as phenomenological analysis and beliefs inventories, are needed. Finally, this study excluded two of Bronfenbrenner's ecological systems: the exo-system and the macro-system. Future research exploring the impacts of factors within these two systems may offer a broader picture of the correlates of instructional change by university faculty. Nevertheless, this study offers a fresh perspective that enriches our understanding of the individual, course, and institutional factors affecting instructional changes by faculty during the COVID-19 pandemic, and in so doing contributes to universities' efforts to cope strategically with crises and, ultimately, to improve the quality of higher education in any context.

Conclusion

The results of this study suggest that, in higher education contexts, crisis-driven change may happen differently from pre-planned, voluntary change, and that the factors influencing crisis-driven change differ from those influencing voluntary change, as reported in previous studies based on technology acceptance theories and models. More specifically, individual faculty members' progressiveness in technology adoption and innovation, and the institution's reliable and well-managed support for faculty, may be more important in crisis-driven change processes, while environmental factors such as social and cultural influences and facilitating conditions are more important in non-crisis contexts. This study offers implications for guiding higher education institutions in developing effective faculty development programs and institutional-level policies and strategies for quality online teaching in crisis situations.

Availability of data and materials

The data that support the findings of this study are available on request.

References

  1. Agarwal, R., & Prasad, J. (1998). A conceptual and operational definition of personal innovativeness in the domain of information technology. Information Systems Research, 9, 204–215. https://doi.org/10.1287/isre.9.2.204

  2. Ajibade, P. (2018). Technology acceptance model limitations and criticisms: Exploring the practical applications and use in technology-related studies, mixed method, and qualitative researches. Library Philosophy and Practice, 1941. http://digitalcommons.unl.edu/libphilprac/1941

  3. Ajzen, I. (1985). From intentions to actions: a theory of planned behavior. In J. Kuhl & J. Beckman (Eds.), Action-control: from cognition to behavior (pp. 11–39). Springer.

  4. Akar, S. G. M. (2019). Does it matter being innovative: Teachers’ technology acceptance. Education and Information Technologies, 24, 3415–3432.

  5. Allen, I. E. & Seaman, J. (2012). Conflicted: Faculty and online education. Babson Survey Research Group and Quahog Research Group, LLC. http://www.insidehighered.com/sites/default/server_files/files/IHE-BSRG-Conflict.pdf

  6. Almaghaslah, D., & Alsyari, A. (2020). The effects of the 2019 novel coronavirus disease (COVID-19) outbreak on academic staff members: a case study of a pharmacy school in Saudi Arabia. Risk Management and Healthcare Policy, 13, 795–802. https://doi.org/10.2147/RMHP.S260918

  7. Almaiah, M. A., Al-Khasawneh, A., & Althunibat, A. (2020). Exploring the critical challenges and factors influencing the E-learning system usage during COVID-19 pandemic. Education and Information Technologies, 25, 5261–5280. https://doi.org/10.1007/s10639-020-10219-y

  8. Alsofyani, M., Aris, B., Eynon, R., & Majid, N. (2012). A preliminary evaluation of short blended online training workshop for TPACK development using technology acceptance model. The Turkish Online Journal of Educational Technology, 11(3), 20–32. Retrieved from http://www.tojet.net/articles/v11i3/1133.pdf

  9. Arend, B. (2009). Encouraging critical thinking in online threaded discussions. The Journal of Educators Online. https://files.eric.ed.gov/fulltext/EJ904064.pdf

  10. Ayebi-Arthur, K. (2017). E-learning, resilience and change in higher education: helping a university cope after a natural disaster. E-Learning and Digital Media, 14(5), 259–274.

  11. Blatchford, P., Bassett, P., Goldstein, H., & Martin, C. (2003). Are class size differences related to pupils’ educational progress and classroom processes? Findings from the Institute of Education Class Size Study of children aged 5–7 Years. British Educational Research Journal, 29(5), 709–730.

  12. Bronfenbrenner, U. (1977). Toward an experimental ecology of human development. American Psychologist, 32, 513–531.

  13. Bronfenbrenner, U. (1979). The ecology of human development. Harvard University Press.

  14. Chapman, D. (2011). Contingent and tenured/tenure-track faculty: Motivations and incentives to teach distance education courses. Online Journal of Distance Learning Administration, 14(3), 1–15. http://www.westga.edu/~distance/ojdla/fall143/chapman143.html

  15. Cooper, J. (2007). Cognitive dissonance: Fifty years of a classic theory. Sage Publications.

  16. Czerniewicz, L., Trotter, H., & Haupt, G. (2019). Online teaching in response to student protests and campus shutdowns: Academics’ perspectives. International Journal of Educational Technology in Higher Education, 16(43), 1–22. https://doi.org/10.1186/s41239-019-0170-1

  17. Davis, F. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340. https://doi.org/10.2307/249008

  18. Dennis, A. R., Fuller, R. M., & Valacich, J. S. (2008). Media, tasks, and communication processes: a theory of media synchronicity. MIS Quarterly: Management Information Systems, 32(3), 575–600.

  19. Diamond, J. (2019). Upheaval: turning points for nations in crisis. Little Brown and Company.

  20. Dillon, A. (2001). User acceptance of information technology. Taylor and Francis.

  21. Elison-Bowers, P., Sand, J., Barlow, M. R., & Wing, T. J. (2011). Strategies for managing large online classes. The International Journal of Learning, 18(2), 57-66.

  22. Eisenhauer, J. G. (2009). Explanatory power and statistical significance. Teaching Statistics, 31(2), 42–46.

  23. Ertmer, P. A., Gopalakrishnan, S., & Ross, E. M. (2001). Technology-using teachers: comparing perceptions of exemplary technology use to best practice. Journal of Research on Computing in Education, 33(5).

  24. Farrell, T. S. C. & Lim, P. C. P. (2005). Conceptions of grammar teaching: A case study of teachers’ beliefs and classroom practices. Teaching English as a Second or Foreign Language, 9(2), 1–13.

  25. Fisher, J. D., & Fisher, W. A. (1992). Changing AIDS-risk behavior. Psychological Bulletin, 111, 455–474.

  26. Fox, B. (2007). ICT use during SARS: Teachers’ experiences. Journal of Technology and Teacher Education, 15(2), 191–205. https://www.ascilite.org/conferences/perth04/procs/pdf/fox.pdf

  27. Gannon-Cook, R., & Ley, K. (2004). What’s driving faculty participation in distance education? Paper presented at the 27th Annual Meeting of the Association for Educational Communications and Technology

  28. Gebhardt E., Thomson S., Ainley J., Hillman K. (2019) Teacher gender and ICT. In: Gender differences in computer and information literacy. IEA Research for Education (A Series of In-depth Analyses Based on Data of the International Association for the Evaluation of Educational Achievement (IEA)), vol 8. Springer

  29. Gibson, S. G., Harris, M. L., & Colaric, S. M. (2008). Technology acceptance in an academic context: faculty acceptance of online education. Journal of Education for Business, 83(6), 355–359.

  30. Green, T., Alejandro, J., & Brown, A. (2009). The retention of experienced faculty in online distance education programs: understanding factors that impact their involvement. The International Review of Research in Open and Distance Learning, 10(3), 1–15. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/683/1279

  31. Granic, A., & Marangunic, N. (2019). Technology acceptance model in educational context: a systematic literature review. British Journal of Educational Technology, 50(5), 2572–2593.

  32. Gunasinghe, A., Hamid, J. A., Khatibi, A., & Azam, S. M. F. (2019a). The adequacy of UTAUT-3 in interpreting academician’s adoption to e-Learning in higher education environments. Interactive Technology and Smart Education, 17(1), 86–106. https://doi.org/10.1108/ITSE-05-2019-0020

  33. Guillén-Gámez, F. D., Mayorga-Fernández, M. J., & Álvarez-García, F. J. (2020). A study on the actual use of digital competence in the practicum of education degree. Technology, Knowledge and Learning, 25, 667–684.

  34. Gunasinghe, A., Hamid, J.A., Khatibi, A., & Azam, S.M.F. (2019b). Academicians’ acceptance of online learning environments: A review of information system theories and models. Global Journal of Computer Science and Technology. Retrieved from https://globaljournals.org/GJCST_Volume19/5-Academicians-Acceptance-of-Online.pdf

  35. Hair, J. F., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2017). A primer on partial least squares structural equation modeling (PLS-SEM) (2nd ed.). Sage.

  36. Hu, S., Laxman, K., & Lee, K. (2020). Exploring factors affecting academics’ adoption of emerging mobile technologies-an extended UTAUT perspective. Education and Information Technologies, 25, 4615–4635. https://doi.org/10.1007/s10639-020-10171-x

  37. Johnson, N., Veletsianos, G., & Seaman, J. (2020). US faculty and administrators’ experiences and approaches in the early weeks of the COVID-19 pandemic. Online Learning, 24(2), 6–21. https://doi.org/10.24059/olj.v24i2

  38. Jung, I., Omori, S., Dawson, W. P., Yamaguchi, T., & Lee, S. (2021). Faculty as reflective practitioners in emergency online teaching: an autoethnography. International Journal of Educational Technology in Higher Education. https://doi.org/10.1186/s41239-021-00261-2

  39. Jung, I.S., & Gunawardena, C. L. (Eds.) (2014). Culture and online learning: Global perspectives and research. Stylus.

  40. Jung, I. S., & Lee, J. (2020). A cross-cultural approach to the adoption of open educational resources in higher education. British Journal of Educational Technology, 51(1), 263–280. https://doi.org/10.1111/bjet.12820

  41. Kahveci, A. (2010). Quantitative analysis of science and chemistry textbooks for indicators of reform: A complementary perspective. International Journal of Science Education, 32(11), 1495–1519.

  42. Ko, S., & Rossen, S. (2003). Teaching online: a practical guide. Houghton Mifflin.

  43. King, W., & He, J. (2006). A meta-analysis of the technology acceptance model. Information & Management, 43(6), 740–755. https://doi.org/10.1016/j.im.2006.05.003

  44. Lorenzo, G. (2008). The Sloan Semester. Journal of Asynchronous Learning Networks, 12(2), 5–40. https://files.eric.ed.gov/fulltext/EJ837474.pdf

  45. Moore, G., & Anderson, W. (Eds.). (2003). Handbook of distance education. Lawrence Erlbaum Associates.

  46. Prochaska, J. O., & DiClemente, C. C. (1983). Stages and processes of self-change of smoking: toward an integrative model of change. Journal of Consulting and Clinical Psychology, 51(3), 390–395.

  47. Puentedura, R. (2006). Transformation, technology, and education. Retrieved from http://hippasus.com/resources/tte/

  48. Radovan, M., & Kristl, N. (2017). Acceptance of technology and its impact on teachers’ activities in virtual classroom: Integrating UTAUT and CoI into a combined model. Turkish Online Journal of Educational Technology, 16(3), 11–22.

  49. Park, S. Y. (2009). An analysis of the technology acceptance model in understanding university students’ behavioral intention to use e-learning. Educational Technology & Society, 12(3), 150–162.

  50. Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.

  51. Savery, C. A. (2005). Innovators or laggards: surveying diffusion of innovations by public relations practitioners. The University of Akron.

  52. Shenoy, V., Mahendra, S., & Vijay, N. (2020). COVID 19 lockdown technology adaption, teaching, learning, students engagement and faculty experience. Mukt Shabd Journal, 9(4), 698–702.

  53. Shea, P. (2007). Bridges and barriers to teaching online college courses: A study of experienced online faculty at 36 colleges. Journal of Asynchronous Learning Networks, 11(2), 73–128. Retrieved from http://faculty.weber.edu/eamsel/Research%20Groups/On-line%20Learning/BridgesandBarriersTeachingOnline.pdf

  54. Stewart, C., Bachman, C., & Johnson, R. (2010). Predictors of faculty acceptance of online education. MERLOT Journal of Online Learning and Teaching, 6(3), 597–616. http://jolt.merlot.org/vol6no3/stewartc_0910.pdf

  55. Sumrall, J. (2002). Factors which influence faculty attitudes and perceptions of distance education in analytical subject areas. Dissertation Abstracts International, 65(06), 2081A. (UMI No. 3135309) http://www2.mcdaniel.edu/skerby/articles/Sumrall_dis_factors_influence_perceptions_de.pdf

  56. Taherdoost, H. (2018). A review of technology acceptance and adoption models and theories. Procedia Manufacturing, 22, 960–967.

  57. Thaler, R. H., & Sunstein, C. R. (2008). Nudge: improving decisions about health, wealth, and happiness. Yale University Press.

  58. Thatcher, J. B., Loughry, M. L., Lim, J., & McKnight, D. H. (2007). Internet anxiety: an empirical study of the effects of personality, beliefs, and social support. Information and Management, 44(4), 353–363.

  59. Theall, M. (Ed.) (1999). Motivation from within: approaches for encouraging faculty and students to excel. New Directions for Teaching and Learning, 78. Jossey Bass

  60. Townsend, R., & Hauss, M. (2002). The 2002 AHA-OAH survey of part-time and adjunct faculty. Perspectives on History, October. http://www.historians.org/perspectives/issues/2002/0210/0210aha3.cfm

  61. Venkatesh, V., & Davis, F. (2000). A theoretical extension of the technology acceptance model: four longitudinal field studies. Management Science, 46(2), 186–204. https://doi.org/10.1287/mnsc.46.2.186.11926

  62. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: toward a unified view. MIS Quarterly, 27(3), 425–478.

  63. Venkatesh, V., Thong, J. Y. L., & Xu, X. (2012). Consumer acceptance and use of information technology: extending the unified theory of acceptance and use of technology. MIS Quarterly, 12(1), 157–178.

  64. Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: exploring the literature using the technology acceptance model as an organizing framework. Online Learning, 21(1), 15–35. https://doi.org/10.24059/olj.v21i1.761

  65. Yau, H. K., & Cheng, A. L. (2012). Gender difference of confidence in using technology for learning. The Journal of Technology Studies, 38(2), 74–79.

  66. Yuen, A. H., & Ma, W. W. (2008). Exploring teacher acceptance of e-learning technology. Asia-Pacific Journal of Teacher Education, 36(3), 229–243.

Acknowledgements

Not applicable.

Funding

This study was supported by the Ministry of Education (NRF-2020S1A5A8041947) and the Ministry of Science and ICT (NRF-2021R1F1A1056465) of the Republic of Korea through the National Research Foundation of Korea, and by a 2019–2022 Grant-in-Aid for Scientific Research (Kakenhi) from the Japan Society for the Promotion of Science.

Author information

Affiliations

Authors

Contributions

JL: Conceptualization; Formal analysis; Methodology; Writing—original draft; Writing—review and editing. IJ: Conceptualization; Investigation; Validation; Writing—original draft; Writing—review and editing. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Jihyun Lee.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Institutional Review Board of International Christian University (ICU) (No. 2020-09). Written informed consent was obtained from all participants.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Lee, J., Jung, I. Instructional changes instigated by university faculty during the COVID-19 pandemic: the effect of individual, course and institutional factors. Int J Educ Technol High Educ 18, 52 (2021). https://doi.org/10.1186/s41239-021-00286-7

Keywords

  • COVID-19 pandemic
  • Ecological systems
  • Emergency online teaching
  • Faculty innovativeness
  • Instructional change
  • Technology adoption
  • University educators