
University students experience the COVID-19 induced shift to remote instruction

Abstract

The COVID-19 pandemic required an abrupt shift from face-to-face to online instruction for many students in higher education in the United States. Prior research has raised concerns about both equitable access to online courses and the quality of instruction in online courses compared to face-to-face courses. This survey study used a retrospective pretest approach to compare students’ experiences before and after the transition to online instruction. The sample of 1731 students spanned all available fields of study and all class standings, from first-year students to doctoral students, at a university classified as R1 (Doctoral Universities: Very High Research Activity) under the Carnegie classifications. Quality of instruction was addressed through the three principles of Universal Design for Learning. Students reported that most areas of quality of instruction were poorer after the transition, with Engagement showing the largest drop by effect size. However, Representation showed a small improvement following the transition. Students who preferred online instruction reported less loss of instructional quality, as did students eligible for disability services. Doctoral students reported significantly poorer access on multiple measures compared to students in all four undergraduate class standings. Results are discussed in terms of patterns, exceptions, effect sizes, and recommendations for future research.

Introduction

The abrupt change in higher education from face-to-face to online learning in the spring of 2020 raised concerns about the accessibility of online instruction, as well as the quality of instruction in online learning (Lassoued et al., 2020). As of this writing, a search for “COVID” in the Educational Resources Information Center (ERIC) database yielded 835 results. About half of those did not even mention “online” instruction, and most of those that remained were not based on original empirical research. The few remaining research studies were not based on data from students in higher education addressing concerns about access and quality of instruction related to this transition to online instruction.

A review of studies addressing the COVID-19 pandemic and higher education found that the large majority of studies were descriptive, and that previous reviews had focused primarily on institutional processes (Bond et al., in press). The purpose of that review was to map the content of relevant original research rather than to synthesize its findings. More than half of the studies in that review focused on the experiences of undergraduates with respect to teaching and learning, and only two focused on students with disabilities. The present study, in contrast, sought to include postgraduate students as well, and to focus on students who are eligible for disability services.

Accessibility

Some international studies have looked at the issue of access related to the transition to online instruction caused by the pandemic. For example, two studies of students in Turkey reported that problems with technology hindered their learning after the transition to online learning (Arici, 2020; Hebebci et al., 2020). Algerian students reported similar problems (Blizak et al., 2020), as did students in Saudi Arabia (Al-Nofaie, 2020). These studies did not include data about accessibility prior to the pandemic for comparison.

While studies of accessibility specifically related to the pandemic-induced transition to online instruction are limited, more research is available about accessibility issues in online learning generally. For example, while post-secondary students prefer face-to-face learning (Sutiah et al., 2020), concerns have been raised about the accessibility of online content and resources for particular populations, including students with disabilities and low-income students. Students with visual impairments have reported accessibility problems with massive open online courses (MOOCs) (Park et al., 2019), virtual reality applications (Lannan, 2019), and information and communication support technology (Eligi & Mwantimwa, 2017). Students who are deaf or hard of hearing also report problems with the accessibility of online learning (Batanero et al., 2019; Ferreiro-Lago & Osuna-Acedo, 2017). At the same time, students with a variety of disabilities report preferring online learning (Ilgaz & Gulbahar, 2017; Kent et al., 2018), despite the preference of most students for face-to-face classes. Nevertheless, international analyses of online learning accessibility in general have found learning materials and sites wanting (Alsalem & Abu, 2018; Boateng, 2016; Carvajal et al., 2018; Massengale & Vasquez, 2016).

Low-income students, first-generation students, and older students are also more likely to have problems taking advantage of online courses, because of both access challenges and less experience and expertise with the related technology (Buzzetto-Hollywood et al., 2018). For example, a recent study found that low-income students’ problems with access to technology were exacerbated by the transition to online instruction required by the current pandemic (Kim & Padilla, 2020). Banerjee (2020) confirmed that first-generation students have poorer access to technology. The digital divide between older and younger people in general has been documented, although research on an age-based digital divide specifically in education is limited (Blažic & Blažic, 2020).

Quality of Instruction

The learning platform company Top Hat (2020) surveyed over 3,000 college and university students in the United States and Canada about their experiences with online learning during the fall of 2020. These students reported a reduction in engagement and motivation related to remote learning. These students overwhelmingly preferred face-to-face over remote instruction, and also preferred synchronous remote instruction with live streaming and chat over asynchronous remote instruction. They also recommended a stronger emphasis on active learning and community building in online courses.

A survey of Indonesian students found that, after the transition to online instruction, students were dissatisfied with communication with their instructors and with the quality of knowledge transfer, although there were no results from before the transition for comparison (Syauqi et al., 2020). Students at a university in the United States stressed the need for good communication with their instructors after the transition (Murphy et al., 2020). Authors of another study of students in the United States concluded that student engagement was negatively impacted by the transition (Perets et al., 2020). As is common for research into student engagement, however, the construct of engagement is not well defined in these studies (Bond et al., 2020).

Universal Design for Learning (UDL) is an evidence-based approach to instructional design for effective and inclusive learning experiences. UDL is based on three broad principles (CAST, 2021). The Engagement principle calls for multiple means of motivating learners – the WHY of learning. The Representation principle ensures that content is presented in multiple ways – the WHAT of learning. The Action & Expression principle focuses on multiple means for learners to interact with the content and express what they know – the HOW of learning.

Recent reviews and meta-analyses have confirmed that UDL is effective in traditional face-to-face classes (Al-Azawei et al., 2016; Capp, 2017). However, evidence for the effectiveness of UDL in online education is more limited. Scholars have recommended the application of UDL to online instruction (Catalano, 2014; Pittman & Heiselt, 2014). However, instructors have expressed concerns about implementing UDL in online courses because of their discomfort with technology, pedagogical competencies, available time, and resistance to change (Singleton et al., 2019).

There is some research to support incorporating UDL guidelines into online courses to improve the quality of instruction. For example, students reported better communication about expectations and other course information after UDL was applied to the redesign of an online undergraduate course (Rao & Tanners, 2011). When instructors applied the principle of Action & Expression to a final course project, students reported positive engagement and learning from the project (Boothe & Lohmann, 2020). Similar results were reported by students when entire graduate-level courses were designed to incorporate UDL principles (Scott et al., 2015). Implementation of UDL in online undergraduate classes was also a predictor of student acceptance of online learning (Al-Azawei et al., 2017).

The COVID-19 pandemic prompted an abrupt shift from face-to-face to remote instruction in universities. However, prior research has raised concerns about the quality of instruction in online courses, as well as equity and accessibility issues for online courses. At the same time, relevant research based on student responses is limited, and often does not include comparison data about experiences before the transition to online instruction. In addition, as the overall use of online instruction continues to increase, perhaps with some additional impetus from experiences with the COVID-19 pandemic, the implications of research on these issues are broad and have long-term importance. For these reasons, I addressed the following four research questions:

What changes in quality of instruction did university students experience related to the transition to remote instruction due to the pandemic?

Were quality of instruction experiences different for university students eligible for disability services?

Was access to instruction and course materials different across specialties, classes, and preferences for online versus face-to-face instruction?

Was access to instruction and course materials different for university students eligible for disability services?

Methods

The project proposal was reviewed and approved through an ethics review process mandated by federal law in the United States, and the project was carried out without any deviations from the original proposal (Project Number: 1646025-1). Data were collected through an anonymous and voluntary online survey. Some items were based on a retrospective pretest–posttest design to identify changes in perceived experiences. Although this was an original survey, most of the items were based on previous research, as described below. Based on university records, all students enrolled during the spring 2020 semester were invited by email to participate in the study. They were given three weeks to complete the survey, and a reminder was sent to the same email list halfway through the three-week window. A total of 1,731 students responded from a distribution list of 19,752.

Context and participants

Data were collected from 1731 students at a Very High Research Activity university in the western part of the United States. Prior to the COVID-19 pandemic, the large majority of courses were taught in face-to-face classrooms. In March of 2020, all classes shifted to a totally online format, and that mandated online format continued through the early summer of 2021, with plans to return to face-to-face instruction in the fall of 2021. Students were invited to participate in the study on September 24, 2020, by email, with two follow-up reminders during the following four weeks.

Retrospective pretest

Given the circumstances surrounding the pandemic-induced shift from face-to-face to remote instruction in higher education, randomized controlled trials (RCTs) would not be practical for examining the effects of this shift on student experiences. Under these circumstances, a retrospective pretest design may be more appropriate (Pelfrey & Pelfrey, 2009). The retrospective pretest design “involves asking participants at the time of the posttest to retrospectively respond to questionnaire items thinking back to a specified pretest period. In effect, participants rate each item twice within a single sitting (“then” and “now”) to measure self-perceptions of change” (Little et al., 2020, p. 175). Retrospective pretest designs have been used in the field of education to examine the effectiveness of academic instruction (Coulter, 2012), professional development (Sullivan & Haley, 2009), and teacher efficacy beliefs (Cantrell, 2003).

Further, response shift bias poses a threat to internal validity for RCTs (Howard & Dailey, 1979; Howard et al., 1979). Response shift bias occurs when the standards participants use for responding to self-report measures change over time in repeated-measures studies. In the case of the pandemic-induced shift to online instruction, responses about online instruction may be influenced by the experience of shifting from face-to-face to remote instruction, so that responses from before the shift are based on different expectations than responses from after the shift. In addition, RCTs, by definition, cannot be applied in situations where the researcher does not have control over the predictor variable and participants cannot be randomly assigned to conditions. For these reasons, a retrospective pretest design offers a useful approach for comparing student experiences before and after the shift to online instruction.
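
The paper reports no analysis code, but the design is easy to make concrete. The sketch below uses toy ratings and hypothetical column names (none of which come from the study) to show how paired “then” and “now” responses from a single sitting yield change scores and a paired-samples comparison:

```python
# Minimal sketch of scoring retrospective pretest items; toy data and
# hypothetical column names, not the study's instrument or responses.
# Each quality-of-instruction item is rated twice in one sitting:
# "then" (before the shift) and "now" (after the shift).
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "engagement_then": [4, 5, 3, 4, 5, 4],  # retrospective pretest ratings
    "engagement_now":  [2, 4, 3, 3, 4, 2],  # posttest ratings
})

# Positive gain = perceived improvement after the shift; negative = a drop
df["engagement_gain"] = df["engagement_now"] - df["engagement_then"]

# Paired-samples t-test on the "then" vs. "now" ratings
t, p = stats.ttest_rel(df["engagement_now"], df["engagement_then"])
print(f"mean gain = {df['engagement_gain'].mean():.2f}, t = {t:.2f}, p = {p:.3f}")
```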

Instrument

The instrument included two demographic items based on an earlier internal survey conducted by the university. These two items asked for academic standing (class) and major. The survey also included an item asking “How many university-level online courses had you completed before the Spring of 2020?” (Wang, 2014).

Four items asked about the quality of access students had to course content. These items asked about the reliability of their Internet service, access to communication software (e.g., Zoom), reliability of devices such as computers and smartphones, and the quality of experiences with online replacements for face-to-face collaboration (e.g., digital breakout rooms, whiteboards, and discussion groups) (Gladhart, 2010; Murphy et al., 2019).

One dichotomous item asked if participants preferred online or face-to-face learning (Erickson & Larwin, 2016; Ilgaz & Gulbahar, 2017; Kent et al., 2018). Another dichotomous item asked if participants were eligible for disability services at the university. This item was based on the previous internal survey.

Four items asked about the frequency and helpfulness of communications with instructors before and after the pandemic-induced transition to online learning (Wang, 2014). Six items asked about instructors’ implementation of the three principles of Universal Design for Learning before and after the transition to online learning (Rao et al., 2015; Rao & Tanners, 2011; Singleton et al., 2019; Westine et al., 2019). The instrument is included in Additional file 1: Appendix S1 of this manuscript.

Data analysis

  • What changes in quality of instruction did university students experience related to the transition to remote instruction due to the pandemic?

    The results from a series of paired-samples t-tests addressed this research question (Sagarin et al., 2014). For each of these tests, the mean of students’ reported experiences before the transition to online instruction was compared to the mean of students’ reported experiences after the transition.

  • Were quality of instruction experiences different for university students eligible for disability services?

    Using one-way ANOVAs, I tested the differences in gain score means between students eligible for disability services and students not eligible for disability services. Because some comparisons failed a test of homogeneity of variance, results for the Welch statistic, which adjusts for unequal variances, are reported (Sagarin et al., 2014).

  • Was access to instruction and course materials different across specialties, classes, and preferences for online versus face-to-face instruction?

    An omnibus ANOVA was run on responses to each of the four access items to determine whether there were any significant differences across specialties, with planned Tukey tests to identify any significant pairwise differences. A parallel analysis compared mean responses to the same four items across seven groups of students based on their academic standing – first year, sophomore, junior, senior, masters, doctoral, and graduate students not in a degree program. Using one-way ANOVAs, I also compared mean accessibility scores between students who preferred online instruction and those who preferred face-to-face instruction. Because some of these comparisons failed a test of homogeneity of variance, Welch statistics are reported.

  • Was access to instruction and course materials different for university students eligible for disability services?

ANOVA was used to compare the scores on the four access items for the eligible students with the scores for the rest of the participants.

Given the large number of statistical tests run for this study, the results are vulnerable to Type I error inflation (Sagarin et al., 2014). In addition, most of the sample sizes are large, providing enough statistical power to detect quite small effects. For these reasons, results are primarily discussed in terms of overall patterns, exceptions to patterns, and effect sizes. When interpreting the results, it is also important to recognize that participants reported their own experiences, which were not confirmed by independent measures. In addition, the study involved no manipulation of an intervention and no randomization of participants into groups. For both of these reasons, causal inferences and recommendations for interventions must remain speculative until confirming research results are available.

Results

What changes in quality of instruction did university students experience related to the transition to remote instruction due to the pandemic?

Table 1 reports the results from a series of t-tests checking for significant differences between retrospective pretest response means and posttest response means for each of the three UDL principles, as well as the two measures of instructor communication. The table includes the number of students who responded to both items (N), the test statistic (t), the probability of a Type I error if the null hypothesis is rejected (p), and a standardized mean difference effect size (g*). A standardized mean difference is independent of statistical significance, making it insensitive to sample size and generalizable across different analyses and studies (Ives, 2003). However, Cohen’s d and Hedges’ g are both susceptible to small-sample bias. The effect size measure I used was Hedges’ g with a correction for this small-sample bias (Durlak, 2009; Hedges & Olkin, 1985). Although the sample sizes here would not be considered small, I adopted this measure as a matter of good practice and consistency.
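
As an illustration of the bias-corrected effect size, here is a minimal sketch. The exact standardizer behind the reported g* is not specified in the text beyond the cited sources, so the pooled-SD choice below is an assumption:

```python
# Sketch of a small-sample-corrected standardized mean difference
# (Hedges & Olkin, 1985). Standardizing the mean change by the pooled
# SD of the two sets of ratings is one common choice for paired
# designs; it is assumed here, not taken from the paper.
import numpy as np

def hedges_g(then, now):
    then, now = np.asarray(then, float), np.asarray(now, float)
    n = len(then)
    pooled_sd = np.sqrt((np.var(then, ddof=1) + np.var(now, ddof=1)) / 2)
    d = (now.mean() - then.mean()) / pooled_sd   # uncorrected Cohen's d
    correction = 1 - 3 / (4 * (n - 1) - 1)       # small-sample bias correction
    return d * correction                        # Hedges' g

# Rough benchmarks (Cohen, 1988): |g| of 0.2 small, 0.5 medium, 0.8 large
print(hedges_g([4, 5, 3, 4, 5, 4], [2, 4, 3, 3, 4, 2]))
```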

Table 1 Significant changes in reported quality of instruction before and after the shift to remote learning

All five of these comparisons yielded statistically significant differences between experiences before and after the pandemic-induced shift to online instruction. In addition, these results show that the tests had adequate statistical power to detect small effects (Cohen, 1988). Four of the measures of quality of instruction became poorer after the switch to online instruction. Two of these had large effect sizes (frequency of communication and Engagement), one had a medium effect size (helpfulness of communication), and one had a small effect size (Action & Expression). In addition, students reported that Representation improved after the transition to online instruction, with a small effect size.

Two hundred and twenty-eight of the participants reported preferring online classes over face-to-face instruction, while 1,125 reported preferring face-to-face instruction. For all participants, quality of instruction scores from before the online shift were subtracted from quality of instruction scores after the shift to create gain scores for each participant on each of the five measures of quality of instruction. Using one-way analyses of variance (ANOVAs), I compared the quality of instruction gain scores across these two groups. Because some of these comparisons failed a test of homogeneity of variance, Welch statistics are reported. Results are reported in Table 2. In every case, students who preferred face-to-face instruction reported significantly poorer experiences with quality of instruction than students who preferred online instruction. Three of the effect sizes are large, while the other two are small.
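
For readers who want to reproduce this kind of comparison, the sketch below implements Welch’s heteroscedastic one-way F test directly (scipy does not provide one) and applies it to toy gain scores for two preference groups; the numbers are illustrative only, not the study’s data:

```python
# Sketch of a one-way Welch ANOVA (robust to unequal group variances);
# toy gain scores only, not the study's data.
import numpy as np
from scipy.stats import f as f_dist

def welch_anova(*groups):
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    means = np.array([np.mean(g) for g in groups])
    variances = np.array([np.var(g, ddof=1) for g in groups])
    w = n / variances                          # precision weights
    grand_mean = np.sum(w * means) / np.sum(w)
    numerator = np.sum(w * (means - grand_mean) ** 2) / (k - 1)
    lam = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    f_stat = numerator / (1 + 2 * (k - 2) * lam / (k ** 2 - 1))
    df2 = (k ** 2 - 1) / (3 * lam)             # approximate denominator df
    return f_stat, f_dist.sf(f_stat, k - 1, df2)

prefer_online = [0.2, -0.1, 0.0, 0.3, -0.2, 0.1]    # hypothetical gain scores
prefer_f2f    = [-1.0, -0.7, -1.2, -0.4, -0.9, -0.8]
f_stat, p = welch_anova(prefer_online, prefer_f2f)
print(f"Welch F = {f_stat:.2f}, p = {p:.4f}")
```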

Table 2 Comparison of gain scores between students who preferred online instruction and those who preferred face-to-face (f2f) instruction for quality of instruction before and after the shift to remote learning

Were quality of instruction experiences different for university students eligible for disability services?

One hundred and forty-seven of the participants reported being eligible for disability services. Descriptive statistics for both eligible students and other students are reported in Table 3. Negative mean gain scores indicate a reduction in instructional quality. Consistent with the findings for the first research question, both groups reported positive gain scores for Representation, indicating an improvement in Representation following the transition to online instruction, and students eligible for disability services reported a greater improvement. Both groups reported poorer quality on all four of the other quality of instruction items, but in each case, students eligible for disability services reported a smaller drop in instructional quality. Overall, the move to online instruction seemed to have a less negative effect on students eligible for disability services.

Table 3 Descriptive statistics for quality of instruction items for students eligible for disability services and other students
Table 4 Differences in quality of instruction gain scores between students eligible for disability services and other students
Table 5 Means (standard deviations) across specialties for the four access items
Table 6 Means (standard deviations) across class standing for the four access items
Table 7 Significance (effect sizes) for significant differences across class standing for the four access items
Table 8 Comparison of gain scores between students who preferred online instruction and those who preferred face-to-face (f2f) instruction for accessibility before and after the shift to remote learning

The results for the comparisons between groups are reported in Table 4. Although the effect of the move to online instruction was less negative for students eligible for disability services on all five measures of instructional quality, only the Engagement comparison reached a conventional level of statistical significance. This was also the only comparison for which the effect size rose to the level of a small effect, indicating that students eligible for disability services had a significantly smaller reduction in Engagement in their classes than students who were not eligible.

Was access to instruction and course materials different across specialties, classes, or preferences for online or face-to-face instruction?

Student specialties were identified by the university college or school that housed their primary field of study, and categorized based on the international standards established by the United Nations Educational, Scientific and Cultural Organization (UNESCO Institute for Statistics, 2015). Although students studying journalism reported the best experiences on all four items, and students studying education reported the poorest experiences on three of the four items, there were no statistically significant differences between the means of any of the 11 units or of students who were undeclared. Means and standard deviations for these comparisons are reported in Table 5.

For the analysis across classes, all four of the omnibus ANOVAs were significant. Tukey tests identified several significantly different pairs of means for each of the four access items – a total of 25 significant pairwise comparisons. Across all four items, doctoral students reported poorer experiences with access than each of the four undergraduate classes, accounting for 16 of the significant mean differences. Effect sizes for these differences spanned the range from medium to large. Seven of the remaining significant differences were between masters students and some of the undergraduate classes. The effect sizes for these differences were almost all in the small range. Only one of the significant comparisons involved comparing undergraduates to undergraduates, and one involved comparing sophomores to graduate students who were not in a degree program. Table 6 reports means and standard deviations for the four access items across class standing. Table 7 reports p-values and effect size measures for the significant pairwise comparisons.
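
A minimal sketch of this omnibus-then-Tukey workflow is shown below, using statsmodels’ pairwise_tukeyhsd on toy access ratings; the group labels and values are illustrative only:

```python
# Sketch of an omnibus one-way ANOVA followed by Tukey HSD pairwise
# comparisons across class standings; toy access ratings only.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

ratings = np.array([4, 5, 4, 4, 4, 3, 4, 5, 2, 2, 3, 1])  # e.g., collaboration-tool access
standing = np.array(["senior"] * 4 + ["masters"] * 4 + ["doctoral"] * 4)

f_stat, p = f_oneway(ratings[standing == "senior"],
                     ratings[standing == "masters"],
                     ratings[standing == "doctoral"])
if p < 0.05:   # follow up with pairwise tests only if the omnibus test is significant
    print(pairwise_tukeyhsd(ratings, standing))
```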

Comparisons of accessibility between students who preferred face-to-face instruction and those who did not are reported in Table 8. In every case, students who preferred face-to-face instruction also reported significantly poorer experiences with accessibility than students who preferred online instruction. Effect sizes ranged from small to large.

Was access to instruction and course materials different for university students eligible for disability services?

The sample of participants for this study included 153 students who reported being eligible for disability services. The eligible students reported better access for three of the four items, and poorer access for one of them. However, none of the mean differences between the two groups approached significance (all p-values were > 0.18), suggesting that the access experiences of the two groups were similar.

Discussion and conclusion

Quality of instruction

Based on prior research, a drop in reported quality of instruction would be expected after the transition to online instruction. At the same time, perceived quality of online instruction is related to how accepting students are of online instruction (Larmuseau, 2019). The present results showed such a drop for both the frequency and helpfulness of instructor communication. This result is consistent with prior work showing that instructor availability is a predictor of student perceptions of quality of instruction in online classes (Slaydon et al., 2020). These results suggest that instructors could improve the perceived quality of their online instruction by enhancing their availability and communication with students.

Also related to quality of instruction, students in this study reported that two principles of Universal Design for Learning (UDL), Engagement and Action & Expression, were significantly poorer after the transition to online instruction. In fact, the drop in Engagement was the largest of all the effect sizes for the five measures of quality of instruction. Synchronous online activities are related to improved engagement in online classes (Weiler, 2012), and prior research has shown that some design elements for online activities are more effective at enhancing student engagement than others (Cundell & Sheepy, 2018).

Students reported that the Representation element of UDL actually improved after the transition to online learning. This result held across all participants, as well as for the separate subgroups of participants eligible or not eligible for disability services. One hypothesis to explain this finding is that the constraints of online instruction may require instructors to be more creative about their presentation of content than in face-to-face classrooms. This hypothesis should be addressed in future research.

Changes in perceived quality of instruction may be attributable to changes in actual quality of instruction. However, other variables may also help to account for these changes. For example, a general perception that online instruction is of poorer quality may introduce a bias in perceptions relative to face-to-face instruction. In this context, students who preferred online over face-to-face instruction reported significantly more positive experiences about quality of online instruction, compared to face-to-face instruction, than did students who preferred face-to-face instruction.

These results regarding the quality of instruction suggest that the shift to online instruction related to the COVID-19 pandemic did not produce a uniform overall reduction in quality of instruction. Instead, the impact may be more complex, with some elements of instructional quality actually benefitting from the shift. Most notably, the Representation principle of UDL improved after the move to online instruction. This complexity warrants further research to understand more clearly where quality of instruction would most benefit from improved resources.

Students who were eligible for disability services reported a significantly smaller drop in Engagement after the online transition than other students reported. Students eligible for disability services also reported less negative changes on the other four measures of quality of instruction, but these mean differences were not statistically significant. A plausible hypothesis to be tested by future research is that contact with campus offices providing disability services may have helped to support the engagement of eligible students. The other four measures of quality of instruction are more directly related to what happens within a course, and may not benefit from the work of the support staff providing disability services. The results of this study indicate that the shift to online instruction may have differentially impacted some students more than others. While this possibility warrants further investigation, such differential impacts may justify more differentiated interventions for student support.

Accessibility

There were no statistically significant differences between means across colleges or schools within the university for any of the four measures of accessibility. In addition, there were no significant differences on the accessibility measures between students eligible for disability services, and those who were not.

Across students with different class standing, the results were more varied. The most striking pattern was that doctoral students reported more difficulty on all four of the measures of accessibility than any of the four undergraduate groups. The effect sizes were mostly small for Internet access and reliability of devices, medium for communication software, and large for collaboration tools. This pattern might suggest that doctoral students were getting less support for accessibility, but it is not clear why that might be. Some evidence suggests that doctoral students, at least in the United Kingdom, found that the pandemic lockdown interfered with their ability to pursue their research activities, while they also had no assurance of extensions or other considerations from their universities (Byrom, 2020). Another hypothesis is that doctoral students are typically older than undergraduates, and doctoral students, at least in some fields, may be less facile with digital technologies. This is a troubling finding that warrants further investigation. Masters students also reported significantly greater difficulty with communication software than each of the four undergraduate classes, with effect sizes in the small range. These results indicate that the shift to online instruction differentially impacted accessibility for some students more than others. This research, and systematic replications, can guide decisions about student support.

Not surprisingly, students who preferred online instruction reported significantly less difficulty with accessibility on all four measures. Three of those results yielded small effect sizes, while the result for collaboration tools yielded a large effect size. Perhaps collaboration tools are particularly vulnerable to the transition to online instruction; this pattern is also reflected in the results for doctoral students discussed above. Students who preferred online instruction also reported having taken significantly more online courses than other students prior to the transition (Welch = 292.581, p = 0.009, g* = 0.21).

The circumstances of the COVID-19 pandemic should be taken into consideration when interpreting these results. First, the shift to online instruction was mandated rather than voluntary. Second, the initial transition in the spring of 2020 occurred while classes were already in progress. That means students made the initial transition to online instruction within the same classes they were already taking face-to-face. Third, during the pandemic, students faced additional emotional challenges that may have influenced their experiences with the transition. It is not clear what these results might imply for situations where students have voluntarily chosen online versus face-to-face instruction when not facing a significant crisis like the COVID-19 pandemic.

In most cases, these results reflect perceived losses in both quality of instruction and accessibility of online resources and content following the transition to online instruction impelled by the COVID-19 pandemic. While the risks of overgeneralizing and overinterpreting these results must be acknowledged, they do support a recommendation that institutions of higher education focus on helping instructors improve the quality of online instruction and access, particularly in the areas of student engagement, instructor communication, and use of collaboration tools. These data also support more focus on supporting quality of instruction and access for doctoral students. At the same time, there are some encouraging results as well. The Representation principle of Universal Design for Learning reportedly improved after the shift to online instruction. In addition, students eligible for disability services reported less of a loss in quality of instruction and access than other students reported. Given that the use of online instruction in higher education was increasing before the COVID-19 pandemic, and may be given further impetus by the pandemic, the implications of research on these issues are broad and have long-term importance. The results of this study justify conducting similar studies, not just for systematic replication, but because the results can inform policy and practice in online instruction in higher education.

Availability of data and materials

Data and materials are available from the author. The instrument is attached as an Additional file 1: Appendix S1.

Abbreviations

ANOVA:

Analysis of variance

ERIC:

Educational Resources Information Center

MOOCs:

Massive open online courses

RCTs:

Randomized controlled trials

UDL:

Universal Design for Learning

References

  • Al-Azawei, A., Parslow, P., & Lundqvist, K. (2017). The Effect of Universal Design for Learning (UDL) Application on E-learning Acceptance: A Structural Equation Model. International Review of Research in Open and Distributed Learning, 18(6), 54–87.

  • Al-Azawei, A., Serenelli, F., & Lundqvist, K. (2016). Universal Design for Learning (UDL): A content analysis of peer- reviewed journal papers from 2012 to 2015. Journal of the Scholarship of Teaching and Learning, 16(3), 39–56. https://doi.org/10.14434/josotl.v16i3.19295

  • Al-Nofaie, H. (2020). Saudi University Students’ Perceptions towards Virtual Education During Covid-19 Pandemic: A Case Study of Language Learning via Blackboard. Arab World English Journal, 11(3), 4–20. https://doi.org/10.24093/awej/vol11no3.1

  • Alsalem, G. M. D., & Abu, I. (2018). Access education: What is needed to have accessible higher education for students with disabilities in Jordan? International Journal of Special Education, 33(3), 541–561.

  • Arici, B. (2020). Analysis of university students’ opinions on the Covid- 19 process and the distance education method applied in this process: The sample of Muş Alparslan University. African Educational Research Journal, 8(2), S344–S352. https://doi.org/10.30918/AERJ.8S2.20.064

  • Banerjee, M. (2020). An exploratory study of online equity: Differential levels of technological access and technological efficacy among underserved and underrepresented student populations in higher education. Interdisciplinary Journal of e-Skills and Lifelong Learning, 16, 93–121. https://doi.org/10.28945/4664

  • Batanero, C., de-Marcos, L., Holvikivi, J., Hilera, J. R., & Otón, S. (2019). Effects of new supportive technologies for blind and deaf engineering students in online learning. IEEE Transactions on Education, 62(4), 270–276.

  • Blažic, B. J., & Blažic, A. J. (2020). Overcoming the digital divide with a modern approach to learning digital skills for the elderly adults. Education and Information Technologies, 25(1), 259–279. https://doi.org/10.1007/s10639-019-09961-9

  • Blizak, D., Blizak, S., Bouchenak, O., & Yahiaoui, K. (2020). Students’ perceptions regarding the abrupt transition to online learning during the COVID-19 pandemic: Case of Faculty of Chemistry and Hydrocarbons at the University of Boumerdes, Algeria. Journal of Chemical Education, 97, 2466–2471. https://doi.org/10.1021/acs.jchemed.0c00668

  • Boateng, J. K. (2016). Accessibility considerations for e learning in Ghana. Journal of Education and e-Learning Research, 3(4), 124–129. https://doi.org/10.20448/journal.509/2016.3.4/509.4.124.129

  • Bond, M., Buntin, K., Bedenlie, S., Zawacki-Richter, O., & Kerres, M. (2020). Mapping research in student engagement and educational technology in higher education: a systematic evidence map. International Journal of Educational Technology in Higher Education, 17, 2. https://doi.org/10.1186/s41239-019-0176-8

  • Bond, M., Bedenlier, S., Marín, V. I., & Händel, M. (in press). Emergency remote teaching in higher education: Mapping the first global online semester. International Journal of Educational Technology in Higher Education.

  • Boothe, K. A., & Lohmann, M. J. (2020). Enhancing student learning in the online instructional environment through the use of universal design for learning. Networks, 22(1).

  • Buzzetto-Hollywood, N. A., Wang, H. C., & Elobeid, M. (2018). Addressing information literacy and the digital divide in higher education. Interdisciplinary Journal of e-Skills and Lifelong Learning, 14, 77–93.

  • Byrom, N. (2020). The challenges of lockdown for early-career researchers. ELife. https://doi.org/10.7554/eLife.59634

  • Cantrell, P. (2003). Traditional vs retrospective pretests for measuring science teaching efficacy beliefs in preservice teachers. School Science and Mathematics, 103(4), 177–185. https://doi.org/10.1111/j.1949-8594.2003.tb18116.x

  • Capp, M. J. (2017). The effectiveness of universal design for learning: A meta-analysis of literature between 2013 and 2016. International Journal of Inclusive Education, 21(8), 791–807. https://doi.org/10.1080/13603116.2017.1325074

  • Carvajal, C. M., Piqueras, R. F., & Mérida, J. F. C. (2018). Evaluation of web accessibility of higher education institutions in Chile. International Education Studies, 11(12), 140–148.

  • CAST. (2021). About Universal Design for Learning. CAST. Retrieved January 8, 2021 from https://www.cast.org/impact/universal-design-for-learning-udl

  • Catalano, A. (2014). Improving distance education for students with special needs: A qualitative study of students’ experiences with an online library research course. Journal of Library & Information Services in Distance Learning, 8(1–2), 17–31. https://doi.org/10.1080/1533290X.2014.902416

  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (Second ed.). Erlbaum.

  • Coulter, S. E. (2012). Using the retrospective pretest to get usable, indirect evidence of student learning. Assessment & Evaluation in Higher Education, 37(3), 321–334. https://doi.org/10.1080/02602938.2010.534761

  • Cundell, A., & Sheepy, E. (2018). Student perceptions of the most effective and engaging online learning activities in a blended graduate seminar. Online Learning, 22(3), 87–102.

  • Durlak, J. A. (2009). How to select, calculate, and interpret effect sizes. Journal of Pediatric Psychology, 34(9), 917–928. https://doi.org/10.1093/jpepsy/jsp004

  • Eligi, I., & Mwantimwa, K. (2017). ICT accessibility and usability to support learning of visually-impaired students in Tanzania. International Journal of Education and Development Using Information and Communication Technology, 13(2), 87–102.

  • Erickson, M. J., & Larwin, K. H. (2016). The potential impact of online/distance education for students with disabilities in higher education. International Journal of Evaluation and Research in Education, 5(1), 76–81.

  • Ferreiro-Lago, E., & Osuna-Acedo, S. (2017). Factors affecting the participation of the deaf and hard of hearing in e-learning and their satisfaction: A quantitative study. International Review of Research in Open and Distributed Learning, 18(7), 267–291.

  • Gladhart, M. A. (2010). Determining faculty needs for delivering accessible electronically delivered instruction in higher education. Journal of Postsecondary Education and Disability, 22(3), 185–196.

  • Hebebci, M. T., Bertiz, Y., & Alan, S. (2020). Investigation of views of students and teachers on distance education practices during the coronavirus (COVID-19) pandemic. International Journal of Technology in Education and Science, 4(4), 267–282.

  • Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Academic Press.

  • Howard, G. S., & Dailey, P. R. (1979). Response-shift bias: A source of contamination of self-report measures. Journal of Applied Psychology, 66(2), 144–150.

  • Howard, G. S., Ralph, K. M., Gulanick, N. A., Maxwell, S. E., Nance, S. W., & Gerber, S. K. (1979). Internal invalidity in pre-test-post-test self-report evaluations and a re-evaluation of retrospective pre-tests. Applied Psychological Measurement, 3, 1–23.

  • Ilgaz, H., & Gulbahar, Y. (2017). Why do learners choose online learning: The learners' voices. IADIS International Conference e-Learning 2017, Lisbon, Portugal.

  • Ives, B. (2003). Effect size use in studies of learning disabilities. Journal of Learning Disabilities, 36, 490–504.

  • Kent, M., Ellis, K., & Giles, M. (2018). Students with disabilities and eLearning in Australia: Experiences of accessibility and disclosure at Curtin University. TechTrends, 62, 654–663. https://doi.org/10.1007/s11528-018-0337-y

  • Kim, C. J. H., & Padilla, A. M. (2020). Technology for Educational Purposes Among Low-Income Latino Children Living in a Mobile Park in Silicon Valley: A Case Study Before and During COVID-19. Hispanic Journal of Behavioral Sciences, 42(4), 497–514. https://doi.org/10.1177/0739986320959764

  • Lannan, A. (2019). A virtual assistant on campus for blind and low vision students. The Journal of Special Education Apprenticeship, 8(2), 1–14.

  • Larmuseau, C. (2019). Perceptions of instructional quality: Impact on acceptance and use of an online learning environment. Interactive Learning Environments, 27(7), 953–964.

  • Lassoued, Z., Alhendawi, M., & Bashitialshaaer, R. (2020). An Exploratory Study of the Obstacles for Achieving Quality in Distance Learning during the COVID-19 Pandemic. Education Sciences. https://doi.org/10.3390/educsci10090232

  • Little, T. D., Chang, R., Gorrall, B. K., Waggenspack, L., Fukuda, E., Allen, P. J., & Noam, G. G. (2020). The retrospective pretest–posttest design redux: On its validity as an alternative to traditional pretest–posttest measurement. International Journal of Behavioral Development, 44(2), 175–183. https://doi.org/10.1177/0165025419877973

  • Massengale, L. R., & Vasquez, E., III. (2016). Assessing accessibility: How accessible are online courses for students with disabilities? Journal of the Scholarship of Teaching and Learning, 16(1), 69–79. https://doi.org/10.14434/josotl.v16i1.19101

  • Murphy, A., Malenczak, D., & Ghajar, M. (2019). Identifying challenges and benefits of online education for students with a psychiatric disability. Journal of Postsecondary Education and Disability, 32(4), 395–409.

  • Murphy, L., Eduljee, N. B., & Croteau, K. (2020). College student transition to synchronous virtual classes during the COVID-19 pandemic in northeastern United States. Pedagogical Research, 5(4). https://doi.org/10.29333/pr/8485

  • Park, K., So, H.-J., & Cha, H. (2019). Digital equity and accessible MOOCs: Accessibility evaluations of mobile MOOCs for learners with visual impairments. Australasian Journal of Educational Technology, 35(6), 48–63. https://doi.org/10.14742/ajet.5521

  • Pelfrey, W. V., Sr., & Pelfrey, W. V., Jr. (2009). Curriculum evaluation and revision in a nascent field: The utility of the retrospective pretest–posttest model in a Homeland Security program of study. Evaluation Review, 33(1), 54–82.

  • Perets, E. A., Chabeda, D., Gong, A. Z., Huang, X., Fung, T. S., Ng, K. Y., Bathgate, M., & Yan, E. C. Y. (2020). Impact of the emergency transition to remote teaching on student engagement in a non-STEM undergraduate chemistry course in the time of COVID-19. Journal of Chemical Education, 97, 2439–2447. https://doi.org/10.1021/acs.jchemed.0c00879

  • Pittman, C., & Heiselt, A. (2014). Increasing accessibility: Using Universal Design principles to address disability impairments in the online learning environment. Online Journal of Distance Learning Administration, 17(3).

  • Rao, K., Edelen-Smith, P., & Wailehua, C.-U. (2015). Universal design for online courses: Applying principles to pedagogy. Open Learning, 30(1), 35–52. https://doi.org/10.1080/02680513.2014.991300

  • Rao, K., & Tanners, A. (2011). Curb cuts in cyberspace: universal instructional design for online courses. Journal of Postsecondary Education and Disability, 24(3), 211–229.

  • Sagarin, B. J., Ambler, J. K., & Lee, E. M. (2014). An ethical approach to peeking at data. Perspectives on Psychological Science, 9(3), 293–304. https://doi.org/10.1177/1745691614528214

  • Scott, L. A., Temple, P., & Marshall, D. (2015). UDL in online college coursework: insights of infusion and educator preparedness. Online Learning, 19(5), 99–119.

  • Singleton, K., Evmenova, A., Jerome, M. K., & Clark, K. (2019). Integrating UDL strategies into the online course development process: Instructional designers’ perspectives. Online Learning, 23(1), 206–235. https://doi.org/10.24059/olj.v23i1.1407

  • Slaydon, J., Rose, D., & Allen, L. (2020). Quantifying the personal factor of FTF in an online world. Journal of Instructional Pedagogies, 23(1).

  • Sullivan, L. G., & Haley, K. J. (2009). Using a retrospective pretest to measure learning in professional development programs. Community College Journal of Research and Practice, 33(3–4), 346–362. https://doi.org/10.1080/10668920802565052

  • Sutiah, S., Slamet, S., Shafqat, A., & Supriyono, S. (2020). Implementation of distance learning during the COVID-19 in Faculty of Education and Teacher Training. Cypriot Journal of Educational Sciences, 15(1), 1204–1214. https://doi.org/10.18844/cjes.v15i5.5151

  • Syauqi, K., Munadi, S., & Triyono, M. B. (2020). Students’ perceptions toward vocational education on online learning during the COVID-19 pandemic. International Journal of Evaluation and Research in Education, 9(4), 881–886.

  • Top Hat. (2020). Top Hat field report: Higher education students grade the fall 2020 semester. https://tophat.com/teaching-resources/interactive/student-survey-report/

  • UNESCO Institute for Statistics. (2015). International Standard Classification of Education: Fields of education and training 2013 (ISCED-F 2013) – Detailed field descriptions. Montreal: UIS. http://uis.unesco.org/sites/default/files/documents/international-standard-classification-of-education-fields-of-education-and-training-2013-detailed-field-descriptions-2015-en.pdf

  • Wang, Y. D. (2014). Building student trust in online learning environments. Distance Education, 35(3), 345–359. https://doi.org/10.1080/01587919.2015.955267

  • Weiler, S. C. (2012). Quality virtual instruction: The use of synchronous online activities to engage international students in meaningful learning. Journal of International Education and Leadership, 2(2).

  • Westine, C. D., Oyarzun, B., Ahlgrim-Delzell, L., Casto, A., Okraski, C., Park, G., Person, J., & Steele, L. (2019). Familiarity, current use, and interest in Universal Design for Learning among online university instructors. International Review of Research in Open and Distributed Learning, 20(5), 20–41.

Funding

Not applicable.

Author information


Contributions

As sole author, Bob Ives completed all of the work for this study, including research design, data collection, data analyses, and writing of the manuscript. The author read and approved the final manuscript.

Corresponding author

Correspondence to Bob Ives.

Ethics declarations

Ethics approval and consent to participate

This study was conducted in accordance with, and following, a determination by the Institutional Review Board at the University of Nevada, Reno, USA, that the study was Exempt from additional review.

Consent for publication

Not applicable

Competing interests

The author has no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Ives, B. University students experience the COVID-19 induced shift to remote instruction. Int J Educ Technol High Educ 18, 59 (2021). https://doi.org/10.1186/s41239-021-00296-5

  • DOI: https://doi.org/10.1186/s41239-021-00296-5
