The roles of academic engagement and digital readiness in students’ achievements in university e-learning environments


Abstract

University students, often assumed to be digital natives, are exposed to campus e-learning environments from the beginning of their academic careers as a means of improving their academic performance. However, previous studies of students’ perceptions of e-learning have produced inconsistent results with respect to predicting academic achievement. The goal of this study was to examine university students’ perceptions of e-learning, based on their experiences, and the mediating roles of academic engagement and digital readiness in the relationship between those perceptions and academic achievement in university e-learning environments. A total of 614 undergraduate students enrolled in a Korean university participated in this study. Using partial least squares modeling for theory development, we examined students engaging in university e-learning environments in relation to their perceptions of e-learning, digital readiness, academic engagement, and academic achievement (i.e., grade point average). The results highlight the importance of students’ academic engagement and digital readiness as mediators between their perceptions of e-learning and their academic achievement. Although students perceived their on-campus e-learning experiences positively, they must have strong digital skills for academic work and commit to effortful involvement in academic learning in university e-learning environments. Our results provide practical implications for ways to enhance the effective adoption of e-learning environments by college students, educators, and administrators.

Introduction

In recent years, higher education institutions have shown a persistent concern with enhancing students’ academic performance through the use of innovative technologies that offer new ways of delivering and producing university education (Deng & Tavares, 2013; Orton-Johnson, 2009). E-learning environments in universities assist with distributing educational resources, supporting instructor-to-student communication, facilitating student learning communities, managing student learning progress, and enabling students to take e-learning courses (Islam, 2013). The e-learning experiences of students in higher education institutions tend to be integrated with academic experiences for sustainable learning improvement because they are relevant not only to academic success but also to lifelong learning for personal success. The e-learning environment in a higher education institution is a learning ecosystem that integrates digital technology with teaching and learning practices as a significant educational innovation, by advancing technology-enabled platforms (Eze, Chinedu-Eze, & Bello, 2018). The benefits of e-learning environments for students and universities are saving substantial costs for physical teaching and learning infrastructure, contributing to the digitization of course contents to easily share and adopt learning contents anytime and anywhere, and integrating the global educational environment (Pham, Limbu, Bui, Nguyen, & Pham, 2019).

Recently, technology-driven learning experiences in university education have followed the changing educational paradigm from instructor-led to learner-centered learning strategies (Ituma, 2011; Olelewe & Agomuo, 2016). In Korea, university students in recent years have been engaging in university e-learning courses from the start of their academic lives, in some cases as early as pre-college programs. To develop higher quality and learner-centered education, universities have built enriching e-learning environments that meet various educational needs (Islam, 2013). For example, high-quality digital learning resources trigger the transformation of traditional classrooms into flipped classrooms or blended learning environments (Álvarez, Martín, Fernández-Castro, & Urretavizcaya, 2013), thereby enhancing face-to-face instruction through digital learning resources and offering students a more intellectually engaging learning experience (Woods, Baker, & Hopper, 2004). These changes reflect the idea that traditional instruction can be enhanced by using e-learning environments. Thus, universities are investing in the development of campus e-learning environments as students’ preferred method of course delivery or as a supplement to traditional face-to-face courses, on the assumption that technologically savvy, digital native students are familiar with such learning environments (Parkes, Stein, & Reading, 2015). In Korea, this university-driven effort has been stimulated by governmental policies affecting higher education and by the needs of university members such as faculty and students. Considering students as digital natives implies that, when engaged in a university e-learning environment, they are expected to have experience and confidence in using that type of learning environment. However, university educators face unexpected results from students who engage in university e-learning environments, such as students’ mixed perceptions of e-learning (Hunley et al., 2005; Levy, 2007).

To achieve a high rate of student academic success, e-learning in higher education encompasses the use of digital technologies to build educational materials for teaching and learning, to teach learners, and to regulate courses (Fry, 2001; Parkes et al., 2015). E-learning has expanded rapidly with the popularization and advancement of multimedia and network technologies such as high-speed Internet, high-definition video, smart devices, and intelligent functionalities of learning management systems (Cidral, Oliveira, Di Felice, & Aparicio, 2018; Eze et al., 2018). Advances in e-learning environments at universities around the world continue to contribute to improving students’ academic success (Castillo-Merino & Serradell-López, 2014; Naveed, Muhammed, Sanober, Qureshi, & Shah, 2017). Technological tools and systems in e-learning environments enhance the quality of learning experiences and outcomes by providing adaptive materials and strategies for the needs and preferences of individual learners (Means, Toyama, Murphy, & Bakia, 2013).

From simple adoption of in-person technology instruction to complex adoption using lecture capture, online chat, discussion boards, and social networking services, the higher education sector adopts blended learning as the norm to improve the effects of using e-learning environments as more active approaches to drive student engagement (López-Pérez, Pérez-López, & Rodríguez-Ariza, 2011). Interestingly, these types of dynamic adoption of e-learning systems show mixed results for students’ academic success such as increased satisfaction with the learning experience (Lyons & Evans, 2013), a positive effect in reducing dropout rates (López-Pérez et al., 2011), higher academic performance (López-Pérez et al., 2011; Roffe, 2002), and reflective and critical thinking (Saadé et al., 2012). By contrast, many studies also showed that there was no relationship or a negative relationship between satisfaction with e-learning courses and grade point average (GPA) (Levy, 2007) or technology use and student GPA (Hunley et al., 2005) among students who fully or partially access campus e-learning environments. The reason for the mixed results may be students’ different levels of digital skills, engagement, and other characteristics, including attitudes, motivations, and confidence about using university e-learning technology for academic activities (Roffe, 2002).

Research model and hypotheses

With positive support for the advancement of e-learning at universities, college students may have more opportunities to engage effectively in e-learning environments to achieve academic success. The goal of this study was to examine student perceptions of e-learning based on their experiences, as well as the roles of academic engagement and digital readiness within the university context of an e-learning environment. Figure 1 depicts our research model and the rationale for the proposed hypotheses. We propose and test a research model consisting of five factors: e-learning adoption, e-learning attitude, digital readiness, academic engagement, and academic achievement.

Fig. 1

The research model

Academic achievement, engagement, and e-learning

Academic achievement, represented by GPA as an outcome of student experiences at university, is a typical factor in examining the effects of instructional activities. GPA is one of the best predictors of college success in academic activities (Moore & Shulock, 2009). Achievement at university tends to be determined by previously acquired knowledge, skills, abilities, and various factors related to time and resources devoted to studying and attending classes (Plant, Ericsson, Hill, & Asberg, 2005). According to Carini, Kuh, and Klein (2006), academic performance and students’ learning engagement show statistically significant positive relationships. Students’ academic engagement refers to commitment to or effortful involvement in the context of academic learning throughout a student’s entire school experience (Coates, 2006; Henrie, Halverson, & Graham, 2015). Engagement refers to the quality of effort made by students in educationally purposeful activities and contributes to desired academic outcomes (Kuh, 2001). Students’ deeper engagement can lead them to beneficial educational practices, which further lead to comprehensive learning (Coates, 2006; Hodge, Wright, & Bennett, 2017).

University students’ academic achievement is predicted by students’ e-learning experiences (Chou & Liu, 2005; Goh, Leong, Kasmin, Hii, & Tan, 2017; Kiviniemi, 2014). Students using an e-learning environment demonstrated improved learning performance and satisfaction (Chou & Liu, 2005). Similar results in Malaysia were reported by Goh et al. (2017). Studies examining the use of e-learning resources for academic work tend to illustrate the lack of student access to e-learning systems (Lust, Juarez Collazo, Elen, & Clarebout, 2012). In a research experiment, Kiviniemi (2014) found that blended learning approaches incorporating both in-person and e-learning course components improved student performance. Thus, we developed the following hypotheses:

  • Hypothesis 1 (H1): Students’ e-learning adoption is positively related to academic achievement (GPA).

  • Hypothesis 2 (H2): Students’ academic engagement is positively related to academic achievement (GPA).

  • Hypothesis 3 (H3): Students’ e-learning attitude is positively related to academic achievement (GPA).

Academic engagement and e-learning

The e-learning environment provides learning assistance that allows students to be more engaged and perform better in their academic courses (Islam, 2013). Academic engagement is important in any learning context, including face-to-face, online, and blended courses (Henrie et al., 2015). In studies examining higher education, academic engagement tends to be a strong predictor of academic development (Carini et al., 2006). Coates (2006), using a more inclusive and holistic perspective of the student experience, asserted that student engagement in academia develops from the dynamic association between students and their institutional circumstances. In that study, academic engagement focused more on student experiences in internal and formal instructional environments. Academic engagement can be increased by using technology to connect students, instructors, and the course content to facilitate academic success (Mehdinezhad, 2011). In this study, academic engagement plays the role of mediator for college students to support education via the adoption of e-learning in their academic work.

University students can be empowered by including e-learning in their academic experience. Students’ digital learning experiences increase their quality of learning by allowing them to easily access support, facilities, and additional content, and by facilitating interactions with the instructor or other students (Abbad, Morris, & de Nahlik, 2009). In higher education institutions, e-learning has become more important in enhancing educational experiences by delivering course materials and even entire courses, thus supporting traditional teaching and learning methods in the classroom. University students benefit from the adoption of e-learning in numerous ways, including through the flexibility of learning in terms of both time and place, the efficacy of accessing knowledge and information, educational interactivity, differentiation according to individual students, and self-pacing (Arkorful & Abaidoo, 2015). Thus, we developed the following hypotheses:

  • Hypothesis 4 (H4): Students’ e-learning adoption is positively related to academic engagement.

  • Hypothesis 5 (H5): Students’ e-learning adoption is positively related to digital readiness.

In this study, we examined students’ perceptions of behavior and control to understand and predict student behavior regarding e-learning experiences in university settings. Attitude toward e-learning is defined here as the degree to which a student perceives their behavior as favorable or unfavorable in e-learning (Ajzen, 1991). In addition, perceived behavioral control in e-learning is defined as the level of confidence an individual has in their ability to perform a behavior based on how easy or difficult it is perceived to be with respect to hindrances or facilitators (Liao, Chen, & Yen, 2007). Perceived behavioral control is also known to strengthen a person’s intention to perform a behavior and increase effort and perseverance (Ajzen, 2002). The achievement of the behavior depends on the availability of resources and opportunities (Ajzen, 1991). Thus, we developed the following hypotheses:

  • Hypothesis 6 (H6): Students’ e-learning attitudes are positively related to academic engagement.

  • Hypothesis 7 (H7): Students’ e-learning attitudes are positively related to digital readiness.

Digital readiness for academic engagement

Digital readiness for college students comprises the technology-related knowledge, skills, attitudes, and competencies needed to use digital technologies to meet educational aims and expectations in higher education (Hong & Kim, 2018). Students’ academic engagement in higher education institutions tends to be enhanced by their adoption of digital technology, as students are assumed to be naturally proficient with technology because of their exposure to technology-rich environments (Jones, 2012). Kim, Hong, and Song (2018) claimed that university students in Korea, though digital natives, may or may not effectively apply digital technologies to academic activities or associate them with academic literacy. Current university students demonstrate a wide gap between digital skills in informal contexts and in formal learning (Margaryan, Littlejohn, & Vojt, 2011). Digital readiness for college students encompasses the meaningful use of digital skills for academic work, the development of digital media ability through active participation in and critical evaluation of digital culture, and the application of information literacy skills and strategies to academic work. It can be one of the significant connections between a student’s e-learning experience and academic achievement. Thus, we developed the following hypothesis:

  • Hypothesis 8 (H8): Students’ digital readiness for academic engagement is positively related to academic engagement.

Materials and methods

Data collection

The data were collected as part of a larger study on the quality of undergraduate educational experiences at the university, particularly students’ digital learning experiences. The survey was conducted through a self-administered online questionnaire using SurveyMonkey. The students who volunteered were sent an email in which they were asked to click on an attached Internet address that linked to the target survey.

Participants

Table 1 shows the demographic characteristics of the respondents. All participants were provided with a description of the study and proceeded only after giving their consent, although our study conditions did not require approval from the Institutional Review Board (IRB) of the university where it was conducted. The 614 respondents were undergraduate students in South Korea, ranging in age from 18 to 27 years (mean = 21.37, SD = 2.09); 215 were male, and 399 were female. One hundred and seventy students were freshmen, 152 were sophomores, 198 were juniors, and 94 were seniors (see Table 1).

Table 1 Demographic characteristics

To explore student experiences with e-learning environments on campus, we asked the students to rate their level of experience through a series of statements assessed on a Likert scale. The students showed that they were good adopters of e-learning applications and devices, including smartphones, tablets, and laptops, for their academic work. Students reported searching the Internet to find information (M = 4.23, SD = .89), studying with various types of online content (YouTube, etc.) (M = 4.17, SD = .86), creating academic documents using various tools (M = 3.90, SD = .93), using mobile apps for academic work (M = 3.76, SD = 1.04), and using networking software to connect with classmates (M = 4.26, SD = .85). However, the students reported little experience with the university’s own e-learning environments. They lacked experience using lecture videos as e-learning activities (M = 2.71, SD = 1.49) and with mobile e-learning courses on their smart devices (e.g., smartphones or tablets) (M = 2.22, SD = 1.39). The data also show that the students lacked experience in taking credit-based e-learning courses (M = 2.07, SD = 1.45) and in using university e-learning platforms for Massive Open Online Courses (MOOCs) for either credit or noncredit courses (M = 1.91, SD = 1.30).

Instruments

In addition to questions regarding demographic information such as gender, age, grade, academic discipline, and grade point average (GPA), the survey included several questions to measure the students’ perceptions of e-learning environments on campus (see Appendix). The level of student experience with university e-learning environments was measured using items developed to collect such data, including e-learning systems with on-campus and off-campus components, the level of information and communication technology (ICT) required for the major, ways of learning to use e-learning resources and systems, and previous experience with e-learning prior to college.

As an antecedent to academic engagement and performance in university e-learning environments, e-learning attitude was measured as the positive or negative perceptions of students of the use of e-learning technology; it was measured using the seven items described by Chu and Chen (2016). Sample items included, “Studying with e-learning is a good idea” and “All things considered, using the e-learning system is beneficial to me.” The scale showed strong reliability in the present study: Cronbach’s alpha was equal to .94.

For e-learning adoption, perceived behavioral control in e-learning was measured through a student’s evaluation of resources and their capabilities when engaged in e-learning (Chu & Chen, 2016). College students’ individual evaluations of their personal capabilities and resources are also antecedents of the adoption of e-learning components. Perceived behavioral control is known to be a positive predictor of the intention to adopt e-learning (Chu & Chen, 2016). The three items were adapted from Chu and Chen (2016). Sample items included, “I have the necessary knowledge for using the university e-learning system,” “Using the university e-learning system is entirely within my control,” and “I have the necessary resources for using the university e-learning system.” The scale showed strong reliability in this study: Cronbach’s alpha was equal to .85.

Digital readiness was adopted from Hong and Kim (2018), who measured college students’ perceived digital competencies for academic engagement. Digital readiness is regarded as necessary for college students’ academic success. All 17 items were measured using a 5-point scale ranging from 1 (strongly disagree) to 5 (strongly agree). The scale showed strong reliability in this study: Cronbach’s alpha was equal to .91.

Academic engagement was measured using a scale developed by Handelsman, Briggs, Sullivan, and Towler (2005). Academic engagement is defined here as a student’s psychological and behavioral efforts and investment in learning, understanding or mastering skills, and knowledge in academic work (Fredricks, Blumenfeld, & Paris, 2004). Student experiences with e-learning systems can affect academic engagement. The scale showed strong reliability in this study: Cronbach’s alpha was equal to .84.

Data analysis and results

To analyze the collected data, descriptive statistics were used to calculate the means, standard deviations, correlations, independent t-test results, and ANOVA using IBM SPSS 23 software. Partial least squares structural equation modeling (PLS-SEM) was adopted to test the research model by empirically assessing a structural model together with a measurement model (Fornell & Larcker, 1981). To explore and develop a theoretical model, we assessed the research model using the SmartPLS 3.0 software (Ringle, Wende, & Becker, 2015), following the two-step approach of first evaluating the measurement model and then the structural model (Anderson & Gerbing, 1988).

PLS-SEM was used to explore a hypothetical research model by analyzing latent variables with multiple observed variables using regression-based methods (Chin, 1998a, 1998b; Hair, Hult, Ringle, & Sarstedt, 2017). Also, PLS-SEM is a more exploratory means of understanding the specific path coefficients and variance of the dependent variable explained by the independent variables in the research model, rather than examining the goodness of fit (Chin, Marcolin, & Newsted, 2003; Petter, 2018). PLS-SEM is known to be a more effective approach to developing a theory with limited conditions such as multivariate normality assumptions, smaller sample sizes, residual distribution, and measurement scales than those in covariance-based structural equation modeling (CB-SEM) (Chin & Newsted, 1999; Hair, Sarstedt, Ringle, & Mena, 2012). For the fit indices of the model, we adapted Chin’s (1998a, 1998b) catalog of criteria.

For PLS-SEM, the sample size should be at least 10 times the number of constructs related to a single endogenous dependent construct (Chin, 1998a, 1998b; Wixom & Watson, 2001). In the research model, four constructs are related to the endogenous dependent construct of academic achievement, so the minimum sample size for applying PLS-SEM is 40 (10 × 4 constructs). Thus, our sample of 614 far exceeded the recommended size for drawing statistical inferences.
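This rule of thumb can be sketched as a quick check (a minimal illustration; the function name is ours, not part of any PLS package):

```python
# The 10-times rule (Chin, 1998): the minimum PLS-SEM sample size is ten
# times the largest number of structural paths aimed at any single
# endogenous construct in the model.
def min_sample_size_10x(max_paths_into_construct: int) -> int:
    """Minimum sample size under the 10-times rule of thumb."""
    return 10 * max_paths_into_construct

# Academic achievement receives four paths in the research model,
# so the minimum sample is 40; the study's sample of 614 exceeds it.
print(min_sample_size_10x(4))         # 40
print(614 >= min_sample_size_10x(4))  # True
```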

The initial analysis of the research model assessed the approximate fit of the estimated model using the standardized root mean square residual (SRMR). The model showed an SRMR of 0.051, below the recommended cutoff of 0.08 (Hu & Bentler, 1998), suggesting that the research model fits the data well. Moreover, the fit index of the saturated model also showed a value of 0.051, confirming good model fit (Hair et al., 2017).
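As a rough illustration of what SRMR measures: it is the root mean square of the residuals between the observed and model-implied correlations (a simplified sketch; SmartPLS computes this internally, and the helper and toy matrices below are our own):

```python
import numpy as np

def srmr(observed_corr, implied_corr):
    """Root mean square of the residuals between the observed and
    model-implied correlation matrices, over the lower triangle."""
    R = np.asarray(observed_corr, float)
    S = np.asarray(implied_corr, float)
    idx = np.tril_indices_from(R)   # lower triangle incl. diagonal
    resid = R[idx] - S[idx]
    return float(np.sqrt(np.mean(resid ** 2)))

# Toy 2-variable example: observed r = .50, model-implied r = .30.
R = [[1.0, 0.5], [0.5, 1.0]]
S = [[1.0, 0.3], [0.3, 1.0]]
print(srmr(R, S))  # values below .08 indicate acceptable approximate fit
```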

Measurement model

We evaluated the measurement model for the reliability, discriminant validity, and convergent validity of the constructs.

First, we examined reliability using the Cronbach’s alpha and composite reliability values in Table 2. For both, the recommended cutoff is 0.7 for extensive evidence of reliability and 0.8 or higher for exemplary evidence of reliability (Bearden, Netemeyer, & Mobley, 1993). As Table 2 shows, all constructs in the measurement model have a Cronbach’s alpha of 0.833 or higher and composite reliability of 0.883 or higher. All average variance extracted (AVE) values, ranging from 0.601 to 0.834, exceeded the recommended level of 0.5 (Fornell & Larcker, 1981), meaning that the construct accounts for 50% or more of the variance in its items (Chin, 1998a, 1998b). Thus, the measurement model shows satisfactory convergent validity and reliability.
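For readers unfamiliar with these indices, composite reliability and AVE can be computed directly from standardized item loadings. The sketch below uses the usual reflective-measurement formulas; the loadings are invented for illustration, not taken from the study:

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error
    variances), with error variance 1 - loading^2 for standardized items."""
    lam = np.asarray(loadings, float)
    num = lam.sum() ** 2
    return float(num / (num + np.sum(1.0 - lam ** 2)))

def ave(loadings):
    """Average variance extracted: mean of squared standardized loadings."""
    lam = np.asarray(loadings, float)
    return float(np.mean(lam ** 2))

# Three items loading at .80 each (illustrative values only):
print(round(ave([0.8, 0.8, 0.8]), 2))                    # 0.64, above .50
print(round(composite_reliability([0.8, 0.8, 0.8]), 2))  # 0.84, above .80
```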

Table 2 Descriptive statistics, Cronbach’s alpha, composite reliability, and average variance extracted
Table 3 Discriminant validity analysis

Second, discriminant validity is confirmed by the results in Tables 3 and 4. The loadings of each item on its own construct and its cross-loadings on all other constructs were first assessed; each item should load more highly on its own construct than on any other. Table 3 applies two further criteria for discriminant validity: the Fornell-Larcker criterion (Fornell & Larcker, 1981) and the heterotrait-monotrait ratio of correlations (HTMT; Hair et al., 2017). Under the Fornell-Larcker criterion, the square root of the AVE for each construct exceeds all of its cross-correlations with other constructs (Fornell & Larcker, 1981). In addition, all HTMT values are below 0.85, suggesting satisfactory discriminant validity (Kline, 2011).
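The HTMT criterion divides the average heterotrait (between-construct) item correlation by the geometric mean of the average monotrait (within-construct) item correlations. A simplified sketch of that formula, with a toy correlation matrix of our own rather than the study’s data:

```python
import numpy as np

def htmt(corr, items_a, items_b):
    """Heterotrait-monotrait ratio for two constructs, given a full item
    correlation matrix and the index lists of each construct's items."""
    R = np.asarray(corr, float)
    hetero = R[np.ix_(items_a, items_b)].mean()  # between-construct mean
    def mono(items):                             # within-construct mean
        sub = R[np.ix_(items, items)]
        iu = np.triu_indices_from(sub, k=1)
        return sub[iu].mean()
    return float(hetero / np.sqrt(mono(items_a) * mono(items_b)))

# Toy example: two constructs with two items each; within-construct
# correlations of .60, between-construct correlations of .30.
R = [[1.0, 0.6, 0.3, 0.3],
     [0.6, 1.0, 0.3, 0.3],
     [0.3, 0.3, 1.0, 0.6],
     [0.3, 0.3, 0.6, 1.0]]
print(htmt(R, [0, 1], [2, 3]))  # 0.5, well below the .85 threshold
```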

Table 4 Matrix of loadings and cross-loadings of variables in the measurement model

Structural model

In the structural model, PLS-SEM was performed to estimate path coefficients, their significance, and the coefficient of determination. Table 5 reports the PLS-SEM results, including the path coefficients and the t-values corresponding to each path in the structural model. A bootstrapping technique with 5000 resamples was used to test the significance of the path coefficients; the critical t-values are 1.96 at the 0.05 significance level and 2.58 at the 0.01 significance level.
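The bootstrapping logic can be illustrated for a single standardized path: resample cases with replacement, re-estimate the coefficient, and divide the original estimate by the bootstrap standard error. This is a minimal sketch, not SmartPLS’s exact algorithm, and the simulated data are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_t(x, y, n_boot=5000):
    """Original standardized slope and its bootstrap t-value."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    def beta(xs, ys):  # standardized simple-regression slope = correlation
        return float(np.corrcoef(xs, ys)[0, 1])
    est, n = beta(x, y), len(x)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)      # resample cases with replacement
        boots.append(beta(x[idx], y[idx]))
    se = float(np.std(boots, ddof=1))
    return est, est / se  # |t| > 1.96 means significant at the .05 level

# Illustrative data with a genuine relationship:
x = rng.normal(size=300)
y = 0.5 * x + rng.normal(size=300)
est, t = bootstrap_t(x, y, n_boot=1000)
print(round(est, 2), t > 1.96)
```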

Table 5 Hypotheses, path coefficients, and results

Figure 2 shows the standardized path coefficients and significance levels for each hypothesis; hypotheses were considered supported at p < .05. The results show that academic engagement (β = 0.297, p < 0.001, supporting H2) had a significant influence on academic achievement, whereas e-learning adoption (β = 0.067, p > 0.05, not supporting H1) and e-learning attitude (β = −0.052, p > 0.05, not supporting H3) did not predict academic achievement. For digital learning perceptions, e-learning adoption (β = 0.127, p < 0.01, supporting H4) and digital readiness for academic engagement (β = 0.272, p < 0.001, supporting H8) had positive and significant effects on academic engagement, whereas e-learning attitude did not (β = 0.020, p > 0.05, not supporting H6). E-learning adoption (β = 0.191, p < 0.001, supporting H5) and e-learning attitude (β = 0.151, p < 0.01, supporting H7) were positively related to digital readiness.

Fig. 2

PLS model of attitude toward e-learning and academic performance

To examine how students’ perceptions of digital learning affect academic achievement, we examined the mediating effects of academic engagement and digital readiness, shown in Table 6. We used the bootstrapping method to perform the mediation analysis in PLS-SEM (Nitzl, Roldan, & Cepeda, 2016; Streukens & Leroi-Werelds, 2016), with bias-corrected confidence estimates (Hayes, 2013) and 95% confidence intervals for the indirect effects.
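The indirect-effect test can be sketched as follows. For brevity this uses a plain percentile bootstrap rather than the bias-corrected variant the study applies, and the variable names and simulated data are ours:

```python
import numpy as np

rng = np.random.default_rng(1)

def indirect_effect_ci(x, m, y, n_boot=2000, alpha=0.05):
    """Percentile bootstrap CI for the indirect effect a*b, where a is the
    X -> M slope and b is the M -> Y slope controlling for X."""
    x, m, y = (np.asarray(v, float) for v in (x, m, y))
    n = len(x)
    def ab(idx):
        xs, ms, ys = x[idx], m[idx], y[idx]
        a = np.polyfit(xs, ms, 1)[0]                       # X -> M
        X2 = np.column_stack([ms, xs, np.ones(len(idx))])
        b = np.linalg.lstsq(X2, ys, rcond=None)[0][0]      # M -> Y | X
        return a * b
    boots = np.array([ab(rng.integers(0, n, n)) for _ in range(n_boot)])
    lo, hi = np.quantile(boots, [alpha / 2, 1 - alpha / 2])
    return float(lo), float(hi)  # CI excluding 0 -> significant mediation

# Illustrative data in which X affects Y only through M:
x = rng.normal(size=300)
m = 0.6 * x + rng.normal(size=300)
y = 0.5 * m + rng.normal(size=300)
lo, hi = indirect_effect_ci(x, m, y)
print(lo > 0)  # the interval excludes zero, so mediation is detected
```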

Table 6 Mediation effects

Discussion and implications

In this study, we examined a research model of the relationship between university students’ e-learning experiences and academic achievement (GPA), using additional factors to enhance the model’s explanatory power. Mixed results regarding the effects of e-learning environments on academic success motivated the design of the research model. Thus, the present study examined the mediating roles of digital readiness and academic engagement in the relationship between e-learning and academic achievement within the university setting.

This study identifies factors that enhance students’ academic success in university e-learning environments. The findings show that the two factors representing students’ perceptions of e-learning on campus, e-learning adoption and e-learning attitude, did not directly predict academic achievement. This result is consistent with previous findings that students’ participation in e-learning in university settings did not significantly affect their level of performance (Davies & Graff, 2005). There are possible explanations for this unexpected finding. First, it seems to be related to the high level of effort required for academic activities involving e-learning. According to Kuh (2001), the quality of effort students devote to educationally purposeful activities contributes to academic outcomes. The lack of a significant relationship may be related to a student’s commitment or effort, that is, academic engagement, toward achieving good academic performance (Rodgers, 2008). In other words, students’ e-learning experiences and perceptions do not directly predict their achievement without engagement in academic activities. Second, various factors, both personal and school-related, influence student achievement (Plant et al., 2005), including gender, ethnicity, family income, and the socioeconomic environment (Betts & Morell, 1999); study hours and the pedagogical environment (Clifton, Perry, Stubbs, & Roberts, 2004); self-regulatory learning strategies and academic self-efficacy (Richardson, Abraham, & Bond, 2012); and others. Thus, examining the effects of university students’ e-learning on their academic achievement without controlling for other factors may explain the weak direct relationship.

The research model revealed that students’ e-learning adoption and attitudes in the university context predict academic achievement through the mediation of digital readiness and academic engagement. To lead students toward better outcomes using university e-learning environments, it is necessary to enhance their meaningful academic engagement (Axelson & Flick, 2010; Kearsley & Shneiderman, 1998). One of the most important goals of higher education using e-learning environments is to make students more active in the learning process through dynamic engagement that fosters the cognitive and non-cognitive skills needed for academic success (Ituma, 2011; Saadé, Morin, & Thomas, 2012). In particular, e-learning adoption, rather than e-learning attitude, plays the role of significant antecedent, which means that students’ actual experiences in adopting e-learning can contribute to academic engagement. Additionally, university students who are confident in their digital skills for academic work, that is, who have digital readiness, encounter more possibilities for academic achievement in university e-learning environments. In other words, positive perceptions or experiences of university e-learning environments and of course delivery via learning management systems are not enough to produce strong academic achievement. The findings imply that students who actively adopt e-learning and hold confident attitudes still need to commit to and make the effort to learn with digital materials, along with a pedagogical approach that entails self-directed learning, to achieve academically in a university e-learning environment (Davies & Graff, 2005). In addition, given that academic engagement mediates the relationship between e-learning use and academic achievement, e-learning environments should be designed to deepen students’ level of engagement in academic activities.
Moreover, universities have not fully considered students’ digital competencies in relation to academic success in university e-learning environments (Parkes et al., 2015). A university should therefore focus on supporting learning assistance and community building to ensure that students have enriched experiences of using e-learning systems for their learning (Islam, 2013). For example, universities can support students’ use of electronic portfolios for academic courses, with a university assistant acting as a learning mentor, and can allow internal and external e-learning content (e.g., MOOCs as large learning communities) to count toward academic course credit.

The practical implications of this study provide university administrators with suggestions for an integrated approach involving both e-learning and offline environments. The findings suggest the need to provide opportunities for students to learn and adapt to e-learning resources and infrastructure as a way of deepening their academic experiences. Universities need to provide training, direction, and support according to students’ profiles, derived from regular examination of their experiences and level of e-learning adoption, to foster academic engagement and achievement. For example, universities have recently adopted intelligent systems that use students’ profiles to predict future performance and analyze current learning gaps. With such a system, a university can recommend activities (e.g., technology-based learning workshops) and content (e.g., e-learning content for learning Microsoft Excel for coursework) to deepen students’ academic experiences in integrated learning environments. Although students, as digital natives, have experience using e-learning, universities should consider advancing the adoption of e-learning and other technologies in both curricular and extracurricular settings. For example, students could attend on- and offline campus workshops and general education courses on technology integration for academic courses. Above all, faculty must recognize the need for technology integration in their courses and make an effort to integrate campus e-learning environments into the curriculum. E-learning environments on campus should be tools that support students’ efforts in their academic work. To be supportive, the user experience design of e-learning systems is key to enhancing student and faculty experiences and creating effective teaching and learning activities.
Although members of the young generation are regarded as digital natives, they still need to be prepared to integrate digital competency with academic work (Hong & Kim, 2018).

Further, integrated e-learning should be pursued by instructors, together with students’ experiences of academic engagement, to enhance the effect of higher education on academic achievement. In particular, blended learning, which integrates instructor-led instruction with technology-driven teaching strategies or materials, has proven to be an effective approach to enhancing student learning outcomes and academic satisfaction in higher education (López-Pérez et al., 2011; Lyons & Evans, 2013). Instructors should consider how to integrate campus e-learning tools and infrastructure into their classroom teaching and student learning. Effective immersion in a university e-learning environment that provides a successful academic experience depends strongly on the instructor’s guidance and teaching approaches within that environment. Instructors should find and apply digital tools or infrastructure suited to the characteristics of each course in order to accomplish its objectives.

Limitations and future research

It is necessary to examine additional antecedents to control for unacknowledged confounding factors. The research model can be extended with antecedents that predict digital readiness, academic engagement, and academic achievement, the constructs with low R-squared values. Possible antecedents include student background information (e.g., early experiences of technology adoption and parental support at an early age; see Kim et al., 2018); academic experiences (e.g., instructors’ efforts to integrate e-learning into courses and school support); e-learning experiences (e.g., taking intensive courses with digital technology, courses with mentors, and the number of assignments using digital technology); and the peer-group culture of technology use. For future studies, the following are possible considerations: (1) different levels of adoption of university e-learning environments should be considered as a control factor in the model; (2) the research model needs to be extended to improve the findings on the roles of digital readiness and academic engagement as mediators; (3) the research model of the present study can be applied to students in other institutions or countries to generalize the model; and (4) further studies could examine the relationship between university administrators’ and faculty members’ perceptions of e-learning adoption in campus-based and online courses and compare these with student engagement and performance.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the authors on reasonable request.

References

  1. Abbad, M. M., Morris, D., & de Nahlik, C. (2009). Looking under the bonnet: Factors affecting student adoption of e-learning systems in Jordan. International Review of Research in Open and Distance Learning, 10(2), 1–25. https://doi.org/10.19173/irrodl.v10i2.596.

  2. Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T.

  3. Ajzen, I. (2002). Perceived behavioral control, self-efficacy, locus of control, and the theory of planned behavior. Journal of Applied Social Psychology, 32(4), 665–683. https://doi.org/10.1111/j.1559-1816.2002.tb00236.x.

  4. Álvarez, A., Martín, M., Fernández-Castro, I., & Urretavizcaya, M. (2013). Blending traditional teaching methods with learning environments: Experience, cyclical evaluation process and impact with MAgAdI. Computers & Education, 68, 129–140. https://doi.org/10.1016/j.compedu.2013.05.006.

  5. Anderson, J. C., & Gerbing, D. W. (1988). Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin, 103(3), 411–423.

  6. Arkorful, V., & Abaidoo, N. (2015). The role of e-learning, advantages and disadvantages of its adoption in higher education. International Journal of Instructional Technology and Distance Learning, 12(1), 29–42.

  7. Axelson, R. D., & Flick, A. (2010). Defining student engagement. Change: The Magazine of Higher Learning, 43(1), 38–43. https://doi.org/10.1080/00091383.2011.533096.

  8. Bearden, W. O., Netemeyer, R. G., & Mobley, M. F. (1993). Handbook of marketing scales: Multi item measures for marketing and consumer behavior research. Newbury Park: Sage.

  9. Betts, J. R., & Morell, D. (1999). The determinants of undergraduate grade point average: The relative importance of family background, high school resources, and peer group effects. The Journal of Human Resources, 34(2), 268–293.

  10. Carini, R. M., Kuh, G. D., & Klein, S. P. (2006). Student engagement and student learning: Testing the linkages. Research in Higher Education, 47(1), 1–32. https://doi.org/10.1007/s11162-005-8150-9.

  11. Castillo-Merino, D., & Serradell-López, E. (2014). An analysis of the determinants of students’ performance in e-learning. Computers in Human Behavior, 30, 476–484. https://doi.org/10.1016/j.chb.2013.06.020.

  12. Chin, W. W. (1998a). The partial least squares approach to structural equation modeling. In G. A. Marcoulides (Ed.), Modern methods for business research, (pp. 295–336). Mahwah: Lawrence Erlbaum Associates.

  13. Chin, W. W. (1998b). Issues and opinion on structural equation modeling. MIS Quarterly, 22(1), 7–16.

  14. Chin, W. W., Marcolin, B. L., & Newsted, P. R. (2003). A partial least squares latent variable modeling approach for measuring interaction effects: Results from a Monte Carlo simulation study and an electronic-mail emotion/adoption study. Information Systems Research, 14(2), 189–217. https://doi.org/10.1287/isre.14.2.189.16018.

  15. Chin, W. W., & Newsted, P. R. (1999). Structural equation modeling analysis with small samples using partial least squares. In R. H. Hoyle (Ed.), Statistical strategies for small sample research, (pp. 307–341). Thousand Oaks: Sage Publications.

  16. Chou, S.-W., & Liu, C.-H. (2005). Learning effectiveness in a web-based virtual learning environment: A learner control perspective. Journal of Computer Assisted Learning, 21(1), 65–76.

  17. Chu, T. H., & Chen, Y. Y. (2016). With good we become good: Understanding e-learning adoption by theory of planned behavior and group influences. Computers & Education, 92-93, 37–52. https://doi.org/10.1016/j.compedu.2015.09.013.

  18. Cidral, W. A., Oliveira, T., Di Felice, M., & Aparicio, M. (2018). E-learning success determinants: Brazilian empirical study. Computers & Education, 122, 273–290. https://doi.org/10.1016/j.compedu.2017.12.001.

  19. Clifton, R. A., Perry, R. P., Stubbs, C. A., & Roberts, L. W. (2004). Faculty environments, psychosocial dispositions, and the academic achievement of college students. Research in Higher Education, 45(8), 801–828. https://doi.org/10.1007/s11162-004-5950-2.

  20. Coates, H. (2006). Student engagement in campus-based and online education: University connections. New York: Routledge. https://doi.org/10.4324/9780203969465.

  21. Davies, D., & Graff, M. (2005). Performance in e-learning: Online participation and student grades. British Journal of Education Technology, 36, 657–663.

  22. Deng, L., & Tavares, N. J. (2013). From Moodle to Facebook: Exploring students’ motivation and experiences in online communities. Computers & Education, 68, 167–176.

  23. Eze, S. C., Chinedu-Eze, V. C., & Bello, A. O. (2018). The utilisation of e-learning facilities in the educational delivery system of Nigeria: A study of M-University. International Journal of Educational Technology in Higher Education, 15(34), https://doi.org/10.1186/s41239-018-0116-z.

  24. Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.

  25. Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059.

  26. Fry, K. (2001). E-learning markets and providers: Some issues and prospects. Education + Training, 43(4/5), 233–239. https://doi.org/10.1108/EUM0000000005484.

  27. Goh, F. C., Leong, M. C., Kasmin, K., Hii, K. P., & Tan, K. O. (2017). Students’ experiences, learning outcomes and satisfaction in e-learning. Journal of E-learning and Knowledge Society, 13(2), 117–128. https://doi.org/10.20368/1971-8829/1298.

  28. Hair, J. F., Sarstedt, M., Ringle, C. M., & Mena, J. A. (2012). An assessment of the use of partial least squares structural equation modeling in marketing research. Journal of the Academy of Marketing Science, 40(3), 414–433. https://doi.org/10.1007/s11747-011-0261-6.

  29. Hair, J. F. J., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2017). A primer on partial least squares structural equation modeling (PLS-SEM), (2nd ed., ). Thousand Oaks: Sage.

  30. Handelsman, M. M., Briggs, W. L., Sullivan, N., & Towler, A. (2005). A measure of college student course engagement. The Journal of Educational Research, 98(3), 184–192. https://doi.org/10.3200/JOER.98.3.184-192.

  31. Hayes, A. (2013). Introduction to mediation, moderation, and conditional process analysis. New York, NY: Guilford.

  32. Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53. https://doi.org/10.1016/j.compedu.2015.09.005.

  33. Hodge, B., Wright, B., & Bennett, P. (2017). The role of grit in determining engagement and academic outcomes for university students. Research in Higher Education, 59(4), 448–460. https://doi.org/10.1007/s11162-017-9474-y.

  34. Hong, A. J., & Kim, H. J. (2018). College Students’ Digital Readiness for Academic Engagement (DRAE) Scale: Scale development and validation. Asia-Pacific Education Researcher, 27(4), 303–312. https://doi.org/10.1007/s40299-018-0387-0.

  35. Hu, L., & Bentler, P. M. (1998). Fit indices in covariance structure modeling: Sensitivity to under parameterized model misspecification. Psychological Methods, 3(4), 424–453.

  36. Hunley, S. A., Evans, J. H., Delgado-Hachey, M., Krise, J., Rich, T., & Schell, C. (2005). Adolescent computer use and academic achievement. Adolescence, 40(158), 307–318.

  37. Islam, A. K. M. N. (2013). Investigating e-learning system usage outcomes in the university context. Computers & Education, 69, 387–399. https://doi.org/10.1016/j.compedu.2013.07.037.

  38. Ituma, A. (2011). An evaluation of students’ perceptions and engagement with e-learning components in a campus based university. Active Learning in Higher Education, 12(1), 57–68. https://doi.org/10.1177/1469787410387722.

  39. Jones, C. (2012). Networked learning, stepping beyond the Net Generation and digital natives. In L. Dirckinck-Holmfeld, V. Hodgson, & D. McConnell (Eds.), Exploring the theory, pedagogy and practice of networked learning (pp. 27–41). New York, NY: Springer.

  40. Kearsley, G., & Shneiderman, B. (1998). Engagement theory: A framework for technology-based teaching and learning. Educational Technology, 38(5), 20–23.

  41. Kim, H., Hong, A., & Song, H. D. (2018). The relationships of family, perceived digital competence and attitude, and learning agility in sustainable student engagement in higher education. Sustainability, 10(12), 4635.

  42. Kiviniemi, M. T. (2014). Effects of a blended learning approach on student outcomes in a graduate-level public health course. BMC Medical Education, 14(1), 1–7. https://doi.org/10.1186/1472-6920-14-47.

  43. Kline, R. B. (2011). Methodology in the social sciences. Principles and practice of structural equation modeling, (3rd ed., ). New York: Guilford Press.

  44. Kuh, G. D. (2001). The national survey of student engagement: Conceptual framework and overview of psychometric properties. Bloomington: Indiana University Center for Postsecondary Research and Planning.

  45. Levy, Y. (2007). Comparing dropouts and persistence in e-learning courses. Computers & Education, 48(2), 185–204. https://doi.org/10.1016/j.compedu.2004.12.004.

  46. Liao, C., Chen, J. L., & Yen, D. C. (2007). Theory of planning behavior (TPB) and customer satisfaction in the continued use of e-service: An integrated model. Computers in Human Behavior, 23(6), 2804–2822. https://doi.org/10.1016/j.chb.2006.05.006.

  47. López-Pérez, M. V., Pérez-López, M. C., & Rodríguez-Ariza, L. (2011). Blended learning in higher education: Students’ perceptions and their relation to outcomes. Computers & Education, 56(3), 818–826.

  48. Lust, G., Juarez Collazo, N. A., Elen, J., & Clarebout, G. (2012). Content management systems: Enriched learning opportunities for all? Computers in Human Behavior, 28(3), 795–808. https://doi.org/10.1016/j.chb.2011.12.009.

  49. Lyons, T., & Evans, M. M. (2013). Blended learning to increase student satisfaction: An exploratory study. Internet Reference Services Quarterly, 18(1), 43–53. https://doi.org/10.1080/10875301.2013.800626.

  50. Margaryan, A., Littlejohn, A., & Vojt, G. (2011). Are digital natives a myth or reality? University students’ use of digital technologies. Computers & Education, 56(2), 429–440. https://doi.org/10.1016/j.compedu.2010.09.004.

  51. Means, B., Toyama, Y., Murphy, R., & Bakia, M. (2013). The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115(3), 1–47.

  52. Mehdinezhad, V. (2011). First year students’ engagement at the university. International Online Journal of Educational Sciences, 3(1), 47–66.

  53. Moore, C., & Shulock, N. (2009). Student progress toward degree completion: Lessons learned from the research literature. Sacramento: Institute for Higher Education Leadership & Policy.

  54. Naveed, Q. N., Muhammed, A., Sanober, S., Qureshi, M. R. N., & Shah, A. (2017). Barriers effecting successful implementation of e-learning in Saudi Arabian universities. International Journal of Emerging Technologies in Learning, 12(6), 94–107.

  55. Nitzl, C., Roldan, J. L., & Cepeda, G. (2016). Mediation analysis in partial least squares path modeling: Helping researchers discuss more sophisticated models. Industrial Management & Data Systems, 116(9), 1849–1864.

  56. Olelewe, C. J., & Agomuo, E. E. (2016). Effects of B-learning and F2F learning environments on students’ achievement in QBASIC programming. Computers & Education, 103, 76–86. https://doi.org/10.1016/j.compedu.2016.09.012.

  57. Orton-Johnson, K. (2009). “I’ve stuck to the path I’m afraid”: Exploring student non-use of blended learning. British Journal of Educational Technology, 40(5), 837–847. https://doi.org/10.1111/j.1467-8535.2008.00860.x.

  58. Parkes, M., Stein, S., & Reading, C. (2015). Student preparedness for university e-learning environments. The Internet and Higher Education, 25, 1–10. https://doi.org/10.1016/j.iheduc.2014.10.002.

  59. Petter, S. (2018). “Haters gonna hate”: PLS and information systems research. ACM SIGMIS Database, 49(2), 10–13. https://doi.org/10.1145/3229335.3229337.

  60. Pham, L., Limbu, Y. B., Bui, T. K., Nguyen, H. T., & Pham, H. T. (2019). Does e-learning service quality influence e-learning student satisfaction and loyalty? Evidence from Vietnam. International Journal of Educational Technology in Higher Education, 16(7), 1–26. https://doi.org/10.1186/s41239-019-0136-3.

  61. Plant, E. A., Ericsson, K. A., Hill, L., & Asberg, K. (2005). Why study time does not predict grade point average across college students: Implications of deliberate practice for academic performance. Contemporary Educational Psychology, 30(1), 96–116. https://doi.org/10.1016/j.cedpsych.2004.06.001.

  62. Richardson, M. D., Abraham, C., & Bond, R. (2012). Psychological correlates of university students’ academic performance: A systematic review and meta-analysis. Psychological Bulletin, 138(2), 353–387.

  63. Ringle, C. M., Wende, S., & Becker, J.-M. (2015). SmartPLS 3.0. Boenningstedt: SmartPLS GmbH.

  64. Rodgers, T. (2008). Student engagement in the e-learning process and the impact on their grades. International Journal of Cyber Society and Education, 1(2), 143–156.

  65. Roffe, I. (2002). E-learning: Engagement, enhancement and execution. Quality Assurance in Education, 10(1), 40–50. https://doi.org/10.1108/09684880210416102.

  66. Saadé, R. G., Morin, D., & Thomas, J. D. E. (2012). Critical thinking in e-learning environments. Computers in Human Behavior, 28(5), 1608–1617. https://doi.org/10.1016/j.chb.2012.03.025.

  67. Streukens, S., & Leroi-Werelds, S. (2016). Bootstrapping and PLS-SEM: A step-by-step guide to get more out of your bootstrap results. European Management Journal, 34(6), 618–632. https://doi.org/10.1016/j.emj.2016.06.003.

  68. Wixom, B. H., & Watson, H. J. (2001). An empirical investigation of the factors affecting data warehousing success. MIS Quarterly, 25(1), 17–41. https://www.jstor.org/stable/pdf/3250957.pdf.

  69. Woods, R., Baker, J. D., & Hopper, D. (2004). Hybrid structures: Faculty use and perception of web-based courseware as a supplement to face-to-face instruction. The Internet and Higher Education, 7(4), 281–297. https://doi.org/10.1016/j.iheduc.2004.09.002.

Acknowledgements

We want to thank Ms. Juyeon Hong, a Ph.D. student, for her help with the data collection. Furthermore, we wish to thank the anonymous reviewers.

Funding

This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2017S1A3A2066878).

Author information

HJK, AJH, and HDS conceived the research idea and designed the research framework. HJK collected and analyzed the data. HJK, AJH, and HDS contributed equally to preparing, revising, and approving the manuscript. All three authors read and approved the final manuscript.

Correspondence to Hae-Deok Song.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Table 7 Survey instrument item

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Kim, H.J., Hong, A.J. & Song, H. The roles of academic engagement and digital readiness in students’ achievements in university e-learning environments. Int J Educ Technol High Educ 16, 21 (2019) doi:10.1186/s41239-019-0152-3

Keywords

  • Digital learning environments
  • Academic engagement
  • Academic performance