
Technology-supported management education: a systematic review of antecedents of learning effectiveness

Abstract

This paper provides a systematic, multidisciplinary review of antecedents of the effectiveness of technology-supported management learning and highlights potential directions for future research. Passive knowledge acquisition in physical classrooms is no longer the hallmark of higher education. Instead, the introduction of new technologies allows for active knowledge construction in increasingly virtual spaces. Such changes in the learning environment affect the education of the managers of tomorrow. Nevertheless, research on technology-supported management learning and its implications for management educators is fragmented and inconsistent across research areas. This paper uses a systematic approach to structure and integrate results from the fields of educational psychology, educational technology, higher education, and management education. This allows us to derive a comprehensive overview of the antecedents of the effectiveness of technology-supported management learning from the various disciplines. Our work reveals several areas that require further investigation, including: (i) the best way to blend and flip formats for different management disciplines and content types, (ii) the selection, design, and richness of the technologies used, (iii) the instructor’s teaching style, including feedback and deliberate confusion, and (iv) learners’ affective states, such as their motivations and emotions, and the role of prior knowledge.

Introduction

Technology has reshaped management education—in contrast to the traditional format of passive knowledge acquisition in synchronous and analog classrooms, much of management education now involves active knowledge construction in increasingly asynchronous and virtual learning spaces (Arbaugh, 2000c; Garrison & Kanuka, 2004). The formerly prevalent objectivist model of learning assumes that there is an objective reality that can be transferred, which supports the traditional lecture format (Leidner & Jarvenpaa, 1995). In contrast, the constructivist model of learning posits several representations of reality, and assumes that students learn better when they construct knowledge themselves by actively engaging with and making sense of information (Arbaugh & Benbunan-Fich, 2006). The constructivist model is typically facilitated by technology. Sun, Tsai, Finger, Chen, and Yeh (2008) thus regard technology-supported management learning as the “paradigm of modern education.”

This technological penetration of management education has triggered a substantial amount of research into management learning beyond the traditional classroom (Arbaugh, 2014; Arbaugh & Duray, 2002; Redpath, 2012). Both conceptual and empirical work has been conducted in various disciplines. For instance, research has emerged in the fields of educational psychology (Leutner, 2014; Mayer, 2002; Moreno & Mayer, 2007; Park, Plass, & Brünken, 2014), educational technology (Alavi, 1994; Evans, 2008; Piccoli, Ahmad, & Ives, 2001; Selim, 2003, 2007; Sun et al., 2008), higher education (Liu, 2012; O’Neill & Sai, 2014; Snowball, 2014; Xu & Jaggars, 2014), and management education (Alavi & Gallupe, 2003; Arbaugh & Benbunan-Fich, 2006; Arbaugh, DeArmond, & Rau, 2013). According to Arbaugh et al. (2009), “the volume and quality of research in online and blended business education has increased dramatically during the past decade.”

However, the different research areas pursue different objectives and approaches. Educational psychologists, for example, tend to follow a learner-centered approach: they investigate how learning occurs within the human cognitive architecture and propose technical applications to facilitate the related processes. Educational technology scholars, in contrast, take a technology-centered approach in which they push technological innovations into the classroom while expecting learners to adapt (Mayer, 2002). Moreover, the extant research shows that some antecedents of technology-supported management learning have similar effects across disciplines, while others lead to contradictory outcomes. Thus, the current state of the literature is highly fragmented and partially inconsistent. No available literature review integrates findings from the various fields, much less with a dedicated focus on management education.

Therefore, this paper addresses the widespread academic discourse on technology-supported management learning by systematically investigating the antecedents of that learning. As Buttner and Black (2014) note, “no single learning theory accounts for all aspects of learning.” Thus, we contrast and integrate prevailing concepts from educational psychology and educational technology research with central themes in the management education and higher education literature. In addition, this paper enriches established theories with more recent research topics, such as confusion and emotions (D’Mello, Lehman, Pekrun, & Graesser, 2014; Dindar & Akbulut, 2016; Knoerzer, Bruenken, & Park, 2016).

Our paper makes two contributions. First, by conducting a systematic, interdisciplinary review of the extant literature, we integrate the dispersed knowledge on the antecedents of the effectiveness of technology-supported management learning from the various disciplines. Second, we critically reflect on conceptual and empirical findings from prior work, and we derive an agenda for future research based on the identified commonalities, inconsistencies, and research gaps. On this basis, we encourage scholars to explore different ways of blending and flipping management learning environments to identify the ideal instruction formats for the different management disciplines and content types. This includes an in-depth study of the impact of collaboration and interaction. In addition, we ask researchers to examine different technology applications and related features to more systematically and effectively select and design learning technologies. We also emphasize the importance of additional research on instructors’ teaching styles in technology-supported management education, as instructors continue to play a critical but changing role. This examination includes feedback and deliberate confusion. Moreover, we call for more research on the prior knowledge and affective states of learners, particularly regarding motivation and emotions, which are still under-researched but can be expected to play an important mediating and/or moderating role in learning outcomes.

Background on the research topic

Management education research is a subdiscipline of the business sciences. According to Arbaugh and Hwang (2015), it can be defined as “formal business and management education learning in the context of higher education in academic institutions.” Even though precursors of the Journal of Education for Business date back to 1928, today’s predominant publication outlet, the Academy of Management Learning and Education, only came into existence in 2002. The most-cited articles in this field were published during the last 20 years (Arbaugh & Hwang, 2015). Hence, management education is an emerging research area.

One stream of research in the management education literature investigates the importance of information technologies and attempts to bring them into the management learning space (Arbaugh, 2000b; Arbaugh & Duray, 2002). Publications include narratives by instructors, examinations of learner perceptions, and experiments with different formats and technologies. Experimental conditions range from technological advances in traditional lectures (Alavi, 1994) to flipped environments (Lancellotti, Thomas, & Kohli, 2016) to full online programs (Eom, Wen, & Ashill, 2006). Given the limited history of the field of management education (Arbaugh & Hwang, 2015) and the lack of dedicated scholars of management learning and education (Arbaugh, 2016), the respective studies build on research from related disciplines, such as educational psychology (Mayer, 2002; Moreno & Mayer, 2007), education technology (Selim, 2007; Sun et al., 2008), and higher education (Liu, 2012; Snowball, 2014).

Educational psychology research follows a learner-centered approach (Mayer, 2002). It assumes that the human system for information processing remains constant in different learning environments (Mayer, 2003). Therefore, educational psychologists study how learning occurs in the human cognitive system, explore the cognitive processes behind selected learner characteristics, and propose technical applications to facilitate these processes. Research results indicate that cognitive and affective factors, such as learner attitude (Scheiter & Gerjets, 2007), motivation (Mayer, 2014), metacognition (Moreno & Mayer, 2007), and emotions (Leutner, 2014), as well as prior knowledge (Seufert, 2003) are important for learning effectiveness independent of the learning environment. These learner characteristics can partially be influenced by the instructor’s teaching style, guidance and feedback behavior (D’Mello et al., 2014; Mayer & Moreno, 2003; Park, Moreno, Seufert, & Brünken, 2011).

Educational technology research, on the other hand, follows a technology-centered approach, which attempts to bring technological innovations into the classroom, while learners are expected to adapt (Mayer, 2002). It primarily examines the role of technology characteristics based on the technology acceptance model (TAM) developed by Davis (1986) and the task-technology fit (TTF) proposed by Goodhue and Thompson (1995). Frequently analyzed factors resulting from these concepts are perceived ease of use, perceived usefulness, technology quality, technology reliability, and technology richness (Huang, 2014; McGill & Klobas, 2009; Selim, 2003; Song, Singleton, Hill, & Koh, 2004). The effects of these technology characteristics are further differentiated based on learner characteristics, such as demographics, prior experiences, and motivation (López-Pérez, Pérez-López, & Rodríguez-Ariza, 2011; Woo, 2014), instructor characteristics, such as attitude, control over the technology, and teaching style (Selim, 2007; Webster & Hackley, 1997), and format characteristics, such as flexibility, interaction, and assessment diversity (Concannon, Flynn, & Campbell, 2005; Sun et al., 2008).

Higher education research on technology-supported learning environments builds on these two approaches and examines learners’ perceptions and their engagement with different formats of instruction, i.e., different levels of technology use in higher education (Carini, Kuh, & Klein, 2006; Ituma, 2011; Zhao & Kuh, 2004). This includes an investigation of the opinions of learners who are in favor of or against technology-supported learning (O’Neill & Sai, 2014; Snowball, 2014). Furthermore, scholars examine the impact of different learner characteristics, such as demographics, motivation, and learning approaches (Haggis, 2009; Xu & Jaggars, 2014), format characteristics, such as flexibility and community (Reed & Reay, 2015; Zhao & Kuh, 2004), and technology characteristics, such as technology selection and quality (Kintu, Zhu, & Kagambe, 2017). In addition, higher education research places particular emphasis on student engagement (Carini et al., 2006; Ituma, 2011).

Across these disciplines, online activity (Asarta & Schmidt, 2013; Fritz, 2011), technology self-efficacy (Piccoli et al., 2001; Webster & Hackley, 1997), cognitive processing (Mayer, 2003; Mayer & Moreno, 2003), perceived learning (Arbaugh, 2000a; Evans, 2008), test performance (Arbaugh, 2000c; Krentler & Willis-Flurry, 2005), satisfaction (Concannon et al., 2005; Wu, Tennyson, & Hsia, 2010), and dropout rates (Deschacht & Goeman, 2015; López-Pérez et al., 2011) are commonly used as measures of effectiveness.

The brief overview of research activities in the fields of management education, educational psychology, educational technology, and higher education highlights that the antecedents of technology-supported management learning effectiveness can be classified into four dimensions: learner, instructor, format, and technology characteristics. These dimensions are illustrated in Fig. 1 and serve as the basis for our work.

Fig. 1 Dimensions of Antecedents of Effectiveness of Technology-Supported Management Learning

Methodology

The search for relevant literature was carried out in three steps as illustrated in Fig. 2. First, we identified potentially relevant publications through a database search and snowballing. Second, those publications were prioritized by skimming abstracts and full texts. Third, prioritized publications were classified according to their analytical focus.

Fig. 2 Systematic Literature Search Process

In the first step, we conducted a keyword search for leading peer-reviewed publications to ensure the relevance and quality of potential sources. We searched the EBSCO Academic Search Premier and EBSCO Business Source Premier databases for the following journals in the educational psychology, educational technology, higher education, and management education fields: Academy of Management Learning and Education, British Journal of Educational Technology, Computers and Education, Decision Sciences Journal of Innovative Education, Educational Psychologist, Educational Psychology Review, Educational Technology Research and Development, Higher Education, Information Systems Research, Innovative Higher Education, International Journal of Management Education, Internet and Higher Education, Journal of Computer Assisted Learning, Journal of Education for Business, Journal of Educational Psychology, Journal of Educational Technology and Society, Journal of Higher Education, Journal of Management Education, Learning and Instruction, Management Learning, MIS Quarterly, Research in Higher Education, and Studies in Higher Education. We then searched the abstracts in these journals for keywords related to student learning (i.e., education, learner, learning, student), learning effectiveness (i.e., achievement, effective, effectiveness, outcome, performance, success), technology support (i.e., computer, digital, electronic, internet, multimedia, online, technology), and management (i.e., accounting, business, economics, finance, management, marketing). Literature with abstracts containing any of the following terms was excluded, as it typically does not focus on technology-supported management education: children, knowledge management, machine learning, organizational learning, school. In addition, we searched the reference lists of the identified articles to uncover any frequently cited scholars and publications that had not yet been found. 
We repeated this snowballing process several times. In total, 317 potentially relevant publications were identified.
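The step-1 abstract screen described above can be sketched as a simple filter: an abstract qualifies if it matches at least one keyword from each of the four inclusion groups and contains none of the exclusion terms. The keyword lists are taken from the search description; the function name and the plain substring matching are illustrative assumptions rather than the authors' actual tooling, since the real search ran against EBSCO's query engine, which handles stemming and phrasing differently.

```python
# Illustrative sketch of the step-1 screening rule (not the authors' actual tooling).
# An abstract is a candidate if it matches every inclusion group and no exclusion term.

INCLUSION_GROUPS = {
    "student_learning": ["education", "learner", "learning", "student"],
    "effectiveness": ["achievement", "effective", "effectiveness",
                      "outcome", "performance", "success"],
    "technology_support": ["computer", "digital", "electronic", "internet",
                           "multimedia", "online", "technology"],
    "management": ["accounting", "business", "economics", "finance",
                   "management", "marketing"],
}

EXCLUSION_TERMS = ["children", "knowledge management", "machine learning",
                   "organizational learning", "school"]

def is_candidate(abstract: str) -> bool:
    """Apply the inclusion/exclusion keyword screen to one abstract."""
    text = abstract.lower()
    # Exclusion terms signal topics outside technology-supported management education.
    if any(term in text for term in EXCLUSION_TERMS):
        return False
    # Every inclusion group must be represented by at least one keyword.
    return all(any(keyword in text for keyword in group)
               for group in INCLUSION_GROUPS.values())
```

Note that naive substring matching over-approximates the database search (e.g., "effectiveness" also matches "effective"); this is acceptable for a screening sketch because borderline hits are resolved in step 2 anyway.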

In the second step, the abstracts of the identified publications were reviewed to determine whether the findings were related to this paper’s objective. Papers had to meet five criteria for inclusion in our review: investigate human learning rather than organizational learning, study learning effectiveness, go beyond the traditional lecture mode to take technology support into account, focus on higher education situations in which management is taught, and enable a transfer of findings to management education if the findings were not already related to management. If an abstract alone was insufficient for this evaluation, the full text was reviewed. As a result, we selected 79 publications for this review.

In the third step, the selected publications were classified for a detailed review. Based on their analytical focus, the articles were assigned to one or more of the previously identified dimensions of antecedents of the effectiveness of technology-supported management learning: learner, instructor, format, and technology. The selected publications and their key findings are listed in Table 1.

Table 1 Overview of Findings from the Extant Literature

Antecedents of effectiveness of technology-supported management learning

Technology characteristics

The integration of technologies into learning environments has been studied for about 30 years. Davis (1986) developed the first version of the technology acceptance model (TAM) to examine the antecedents of a technology’s acceptance. He proposed that the capabilities of a technology trigger learners’ motivation to use it, which in turn leads to actual use. More specifically, the features of a technology are assumed to affect perceived ease of use and perceived usefulness, which then affect attitudes toward using that technology and, thus, actual use. Although this model is not explicitly tailored to learning, it has become a basis for educational technology research. Several studies of technology-supported management learning show that perceived ease of use and perceived usefulness affect satisfaction but do not directly predict perceived learning (Arbaugh, 2000a, 2000b; Huang, 2014). Terpend et al. (2014) find that perceived ease of use predicts technology adoption. Selim (2003) also provides evidence that perceived ease of use and usefulness predict technology acceptance, and reveals that the effect of ease of use is largely mediated by usefulness. Sun et al. (2008) conclude that ease of use enables e-learners to focus on the content rather than the technology.

Goodhue and Thompson (1995) introduce task-technology fit (TTF) and argue that “for an information technology to have a positive impact on individual performance, the technology must be utilized and must be a good fit with the tasks it supports.” Related antecedents of technology-supported management learning effectiveness that are frequently analyzed include technology quality and technology reliability. In an early experiment with synchronous technology-supported distance learning based on online lectures and videos, Webster and Hackley (1997) find that both variables influence attitude toward the format and the technology, and that technology quality also influences the relative advantage of the format (i.e., perceived learning). They argue that reliable, efficient, and effective technology interfaces promote learner motivation, while technical complications have the opposite effect. However, they do not find relationships with involvement and participation, cognitive engagement, technology self-efficacy, or usefulness of the technology. Song et al. (2004) confirm that technical problems are perceived as disadvantages for online learning. Sun et al. (2008) examine technology and internet quality in e-learning but find no effects on the satisfaction of management students. Notably, internet quality may be taken for granted. McGill and Klobas (2009) examine the role of learning management systems and provide empirical evidence that TTF strongly influences perceived learning and weakly affects actual learning. They also show an indirect relationship between TTF and perceived learning through learners’ attitudes toward technology utilization and actual use. Interestingly, they also reveal an effect of TTF on the expected consequences of technology use, although this does not affect actual usage.

Webster and Hackley (1997) note that technology richness has a positive impact on involvement and participation, cognitive engagement, technology self-efficacy, perceived usefulness, attitudes toward technology and format, and perceived learning. They argue that technology richness supports the accessibility of instructors and their feedback, which moderates learner motivation, thereby predicting technology use and perceived learning. Yourstone et al. (2008) state that immediate feedback technologies, such as clickers, can have a positive impact on learning outcomes. Work by Snowball (2014) confirms that passive online activities, such as videos, can be useful for introducing new concepts, while more active components, such as quizzes, are more beneficial for learning. Sloan and Lewis (2014) suggest that lecture-capture videos are related to higher exam scores. Kember et al. (2010) find that technological features that promote constructive dialogue and interactive learning improve understanding. Volery and Lord (2000) and Wu et al. (2010) note that the design and functionality of a learning management system predict perceived learning. Arbaugh and Rau (2007) investigate online learning with different systems and, interestingly, find a negative relationship between technology variety and perceived learning but a positive relationship between technology variety and satisfaction. In addition, Huang (2014) identifies a positive relationship between technology playfulness and satisfaction in a mobile learning environment. He finds that learners’ self-management skills moderate the effects of usefulness and playfulness on satisfaction. These technology-related antecedents of the effectiveness of technology-supported management learning are summarized in Fig. 3.

Fig. 3 Technology-Related Antecedents

Format characteristics

While the format of instruction has traditionally been based on the physical classroom, the advent of technologies in management education allows for the emergence of new settings. Higher education research proposes a blended learning environment that is independent from the technology employed. According to Garrison and Kanuka (2004), this format is an “integration of face-to-face and online learning experiences – not a layering of one on top of the other.” López-Pérez et al. (2011) show that blended environments that combine face-to-face classes with online activities (e.g., crosswords, matching, fill in the blank, multiple-choice tests, wikis, forums) reduce dropout rates and improve exam performance. In line with TAM, they show that the perceived utility of online learning is correlated with the motivation generated by the technology, which in turn predicts satisfaction. However, they find that actual learning mainly depends on variables unrelated to blended environments, such as learners’ age, class attendance, or prior experiences—perceived utility and satisfaction do not predict actual learning. Notably, according to Grabe and Christopherson (2008), a lack of class attendance may be offset through online resources. Deschacht and Goeman (2015) find better exam performance for blended environments that integrate self-study, online collaboration, and classroom teaching. However, they also find that these environments are associated with higher dropout rates. They argue that the learning effect may be subject to survivorship bias. McLaren (2004) demonstrates that persistence in online delivery is significantly lower, while learning performance is independent of the format.

Although blended learning environments capture the benefits of technological innovations, such as flexibility in terms of time and place and learner control over pace and content, they also capture the benefits of physical classrooms (i.e., personal interaction through collaboration and community) (Arbaugh, 2014; Concannon et al., 2005). Educational technology research has found that course flexibility leads to e-learning satisfaction (Arbaugh, 2000b; Sun et al., 2008). The rationale is that flexibility allows learners to balance their personal commitments, such as work, family, and other activities, with their studies. Higher education research suggests that learner independence is crucial for building critical thinking skills (Garrison & Kanuka, 2004). Educational psychology research emphasizes that learner control over materials can have a positive impact on cognitive processing due to the possibility of pacing (Mayer et al., 2003; Moreno & Mayer, 2007). Pacing refers to a flexible presentation speed that encompasses pause, rewind, and fast-forward options. While pausing allows learners to restrict cognitive processing at a certain point of time, rewinding can intensify cognitive processing because the learner repeatedly receives the same information. The fast-forward option allows for certain sections to be skipped so that learners end up with shorter sections, which also benefit cognitive processing. The presentation of information in separate parts gives learners the opportunity to gradually build multiple mental representations that can be integrated later (Mayer & Chandler, 2001). Scheiter and Gerjets (2007) note that learner control in multimedia environments stimulates interest and motivation and, thereby, triggers more active and constructive processing. 
While Arbaugh and Duray (2002) show positive relationships between flexibility and both perceived learning and satisfaction in web-based environments, Arbaugh (2000a) finds no direct relationship between flexibility and perceived learning.

In blended learning environments, the flexibility of online learning is integrated with the preeminent characteristic of classroom teaching: interaction. Alavi (1994) finds that technology-supported learner collaboration and the associated interaction lead to greater satisfaction, self-reported learning, and enhanced exam performance. Collaboration can empower the structuring and sharing of information, leading to exposure to different views and opinions. This requires reiterating prior information when explaining knowledge to others, resolving opposing perspectives through discussions, and internalizing explanations from more knowledgeable peers. Eventually, this leads to more active knowledge processing and construction (Kreijns et al., 2013).

Eid and Al-Jabri (2016) provide evidence that online discussions and chats promote the exchange of knowledge that predicts perceived learning. Furthermore, networking via discussion forums leads to better performance (Walker et al., 2013). Arbaugh (2000a) also finds connections between perceived learning and interaction ease, interaction emphasis, and classroom dynamics. Arbaugh and Benbunan-Fich (2006) investigate online learning among 579 MBA students and find that group learning leads to higher perceived learning and satisfaction than individual learning. While group learning is moderated by an objectivist teaching approach, individual learning is moderated by constructivist instruction. Song et al. (2004) find that a perceived lack of community is detrimental to perceived online learning. In contrast, Eom et al. (2006) state that distance interactions lead to an adaptation of information that assists learners in overcoming feelings of remoteness. They find that interaction predicts satisfaction with online learning, which in turn fosters perceived learning. However, they do not find a direct link between interaction and perceived learning. Concannon et al. (2005) also find that interaction affects the satisfaction of e-learners, while Sun et al. (2008) find no relationship. Eom and Ashill (2018) find direct relationships between both learner-learner and learner-instructor interaction and perceived online learning. They also show that peer interactions in e-learning are beneficial for the self-regulation that predicts perceived learning. Perceived learning, in turn, causes satisfaction (Wu et al., 2010). Hazari et al. (2013) suggest that peer interactions via blogs lead to constructive feedback and self-assessments. On the other hand, Arbaugh and Rau (2007) find that peer interaction in online courses can negatively influence satisfaction, while it can positively affect perceived learning. Wu et al. (2010) reveal that the learning climate in a blended environment mediates the effect of interaction on satisfaction. According to Solimeno et al. (2008), online interaction can be even more beneficial for learning than personal interaction, as the former overcomes much of the interpersonal noise.

A variant of blended environments is flipped learning. According to higher education research, there is no single approach to flipped learning. However, the most important aspects include the provision of content in advance and higher-order learning during face time (O’Flaherty & Phillips, 2015). Therefore, introductions, explanations, and theories are studied individually and asynchronously at each student’s own pace, typically facilitated by a learning management system, while application and transfer problems are handled during class time. Solimeno et al. (2008) emphasize the benefits of asynchronous preparation, including flexibility in consulting materials and reviewing online comments from peers. Such a shift in the individual workload from reworking to preparing fosters ownership before class and enables deeper discussions in class that can be initiated by the learners themselves (O’Flaherty & Phillips, 2015). Flipped learning also supports the pretraining effect proposed in educational psychology research (Moreno & Mayer, 2007). The aim in this regard is to provide learners with relevant prior knowledge or to reactivate it if it is already available. This prepares the human memory with selected knowledge, which can later be integrated with new information. Consequently, pretraining facilitates meaning making and improves cognitive processing (Moreno & Mayer, 2007).

Educational technology research finds that assessment diversity in online environments increases satisfaction, as it enables multiple forms of feedback (Sun et al., 2008). Concannon et al. (2005) suggest that the use of some online tests during a semester reshapes study patterns by triggering continuous review and feedback. These format-related antecedents of the effectiveness of technology-supported management learning are outlined in Fig. 4.

Fig. 4 Format-Related Antecedents

Instructor characteristics

Instructors play a central role in any learning environment (Webster & Hackley, 1997). This role remains important in technology-supported management education, but it is changing (Daspit & D’Souza, 2012; Volery & Lord, 2000). Therefore, examinations of instructor characteristics should consider not only the personalities of instructors but also their roles, particularly with regard to learner-instructor interactions.

Research on instructors’ personality in technology-supported environments mainly focuses on instructors’ attitudes toward and control over the technology. Webster and Hackley (1997) find that the instructor’s attitude toward the technology affects learners’ attitudes toward the format and technology, technology self-efficacy, and perceived learning. In turn, learners’ technology self-efficacy predicts perceived learning (Wu et al., 2010). However, Webster and Hackley (1997) find no relationship between the instructor’s attitude toward the technology and learners’ involvement and participation, cognitive engagement, or perceived usefulness of the technology. Concannon et al. (2005) find a positive relationship between the instructor’s attitude toward the technology and e-learners’ motivation to use that technology. López-Pérez et al. (2011) show that learner motivation influences actual learning in both the physical and virtual elements of blended environments. In addition, Sun et al. (2008) show a positive effect of the instructor’s attitude on the satisfaction of e-learners. They also emphasize the importance of the instructor’s technical competence.

Webster and Hackley (1997) demonstrate that the instructor’s control over the technology has a positive impact on learners’ attitudes toward a technology, its perceived usefulness, cognitive engagement, and perceived learning. However, they do not find relationships with involvement and participation or technology self-efficacy. Selim (2007) confirms that both attitudes toward and control over the technology affect business students’ e-learning satisfaction.

While the purpose of a traditional lecture is to deliver knowledge, instructors in a technology-supported environment should support active learning as facilitators and mentors (Solimeno et al., 2008). Markel (1999) proposes a change from “a sage on the stage into a guide on the side,” while Volery and Lord (2000) expect the role of the instructor to shift toward being “a learning catalyst and knowledge navigator.” Webster and Hackley (1997) find that such an interactive teaching style has a positive impact on learners’ involvement and participation, cognitive engagement, and attitudes toward format and technology. They find no relationships between an interactive teaching style and the perceived usefulness of the technology, technology self-efficacy, or perceived learning. However, Arbaugh (2000a) shows that efforts to create an interactive online environment predict perceived learning, and that the emphasis on interaction is directly related to satisfaction (Arbaugh, 2000b). Selim (2007) also shows that instructor characteristics, including the teaching style, influence business students’ satisfaction with e-learning.

Interactions between learners and instructors comprise both guidance (i.e., process input) and feedback (i.e., essential input) (Moreno & Mayer, 2007). On the one hand, process-related input promotes learners’ engagement in the right activities, especially the selection, organization, and integration of relevant information that strengthens relevant cognitive processing (Mayer & Moreno, 2003). On the other hand, essential input reduces learners’ extraneous cognitive processing by replacing misconceptions in the human memory (Moreno & Mayer, 2007). Extraneous processing refers to cognitive processes that are irrelevant for making sense of information and, thus, should be minimized. However, feedback must be well designed to avoid additional extraneous processing. For technology-supported environments, Demetriadis et al. (2008) suggest that scaffolding, a technique of appropriate questioning, can trigger learner reflection and deeper processing. They find that scaffolding leads to more knowledge acquisition and knowledge transfer. Moreno and Mayer (2007) confirm that reflection on prior information leads to more active organization and integration of new information. According to Eom et al. (2006), both guidance and feedback increase learner satisfaction, but only feedback improves perceived learning in an online environment. Hwang and Arbaugh (2006) show that feedback does not influence actual learning in blended environments. However, if the search for feedback is triggered by a competitive attitude (i.e., getting ahead of others or preventing others from getting ahead of oneself), it has a positive impact on actual learning. Sun et al. (2008) show that the timeliness of an instructor’s response has no influence on satisfaction with e-learning.

Instructor feedback in technology-supported environments has also been studied in connection with learners’ prior knowledge. Seufert (2003) finds that feedback in a computer-based learning task barely affects learners with a high level of prior knowledge. However, it positively moderates the comprehension of learners with intermediate prior knowledge, presumably due to its summarizing and repetitive nature. At the same time, feedback negatively moderates the recall performance of learners with little prior knowledge. Interestingly, in a computer-based simulation, Nihalani et al. (2011) find that learners with low prior knowledge learn better with the support of the instructor than in cooperation with other beginners and that feedback is disadvantageous for learners with high levels of prior knowledge.

As a variant of feedback, educational psychology scholars study confusion in online environments, which is defined as “the result of contradictions, conflicts, anomalies, erroneous information, and other discrepant events” (Park et al., 2014). They propose that when confusion is “induced, regulated, and resolved appropriately,” it can positively influence learning. D’Mello et al. (2014) find that knowledge and transfer are higher when confusion is deliberately triggered and successfully resolved. Learners’ prior knowledge has small moderation effects. Confusion is assumed to lead to deeper engagement with new information, thereby improving learning (Leutner, 2014).

Although feedback embodies interaction between instructors and learners, the physical presence of the instructor is not essential for improving cognitive processing (Redpath, 2012). Personal interaction can occur through a collaborative online environment or personalized online communication (Arbaugh, 2000c). Mayer (2002) proposes the personalization principle, which posits more effective processing for a conversational communication style in learning materials than for a formal communication style. This increases learners’ attention and encourages them to relate the content to themselves (Moreno, 2006). In addition, Beege et al. (2017) find that frontal, as opposed to lateral, instructor orientation in learning videos promotes retention, as para-social interactions can trigger deeper cognitive processing and beneficial affective states. The lack of body language in online settings can be addressed through the use of humor, anecdotes, or emoticons (Whitaker, New, & Ireland, 2016). Guo et al. (2014) find that instructors who speak faster and with more enthusiasm in learning videos increase learner engagement. These instructor-related antecedents of technology-supported management learning effectiveness are illustrated in Fig. 5.

Fig. 5 Instructor-Related Antecedents

Learner characteristics

The learners themselves play an important role in the effectiveness of technology-supported management learning. Educational technology research initially examined the demographic background and prior experience of learners in technology-supported formats. While it is unclear whether gender predicts perceived learning in an online environment (Arbaugh, 2000a, 2008; Volery & Lord, 2000), both Arbaugh (2000b) and Arbaugh (2008) find that gender does not influence satisfaction. Furthermore, Lancellotti et al. (2016) find no connection between gender and actual learning. Age does not influence perceived e-learning (Arbaugh, 2000a), but it positively predicts actual learning in the physical and virtual settings of a blended environment (López-Pérez et al., 2011).

Prior technological experience also influences actual online learning (López-Pérez et al., 2011), while its relationships with perceived learning and satisfaction are not always significant (Arbaugh, 2000a, 2008; Arbaugh & Rau, 2007; Selim, 2007; Song et al., 2004; Volery & Lord, 2000). Piccoli et al. (2001) examine 146 management students and posit that previous technology experience can be beneficial, while a lack of such experience can promote feelings of anxiety and isolation. Sun et al. (2008) find that computer anxiety has a negative impact on satisfaction with e-learning, as it can hamper a learner’s attitude, which is essential for technology-supported learning (Scheiter & Gerjets, 2007). Solimeno et al. (2008) show that technology promotes perceived and actual learning among learners with low computer anxiety.

In addition to previous technological experience, research has examined the role of prior academic achievements. Nemanich et al. (2009) and Palocsay and Stevens (2008) find that learners’ academic abilities are associated with learning outcomes, particularly in online environments. Scheiter and Gerjets (2007) assume that a high level of prior knowledge moderates learning in multimedia environments. Asarta and Schmidt (2017) show that blended formats have a positive influence on exam performance for learners with high prior performance, while weaker students perform better in traditional formats. Owston et al. (2013) find that high achievers show the highest satisfaction with blended learning environments because they view blended learning as more convenient and engaging, and they feel that they learn key concepts better than in traditional classes.

Educational psychology scholars have considered affective aspects, such as learner motivation and emotions (Park et al., 2014). Motivation is defined as an “internal state that initiates, maintains, and energizes the learner’s effort to engage in learning processes” (Mayer, 2014). The corresponding work is based on the assumption that motivational factors can mediate learning by increasing or decreasing cognitive engagement (Moreno & Mayer, 2007). Selim (2007) shows that motivation affects e-learning acceptance and satisfaction. According to Song et al. (2004), e-learners expect their motivation to be related to learning. López-Pérez et al. (2011) find that motivation predicts actual learning in both the physical and virtual settings of a blended environment. Woo (2014) confirms the correlation between motivation and actual online learning. Eom et al. (2006) also find that motivation in an online environment affects satisfaction, although they do not find a direct link to perceived learning.

Plass et al. (2014) and Um et al. (2012) investigate emotions induced by videos in online learning, and find that positive emotions can promote comprehension and transfer. Their findings suggest that round, face-like shapes and warm colors reinforce the positive emotions that not only reduce the perceived difficulty of the task but also increase motivation and cognitive processing. This effect of emotions on performance can be mediated by motivation and/or moderated by prior knowledge (Leutner, 2014). In contrast, Knoerzer et al. (2016) find that positive emotions induced through music and autobiographic recall reduce actual online learning, possibly because they distract learners from the focal material. However, they find that negative emotions increase learning, possibly due to a perceived need for deeper information processing. They find no connection between emotions and motivation.

Educational psychology research on multimedia learning further posits that “metacognitive factors mediate learning by regulating cognitive processing and affect” (Moreno & Mayer, 2007). Metacognition mainly occurs in the form of self-regulation and reflection during the organization and integration of new information. Moreno and Mayer (2007) find that reflection is beneficial for cognitive processing, which leads to better learning outcomes. Eom and Ashill (2018) show that self-regulation in an e-learning environment mediates the relationship between motivation and perceived learning, which is related to satisfaction. Metacognition seems to be particularly important for non-interactive (i.e., distance) phases in which it is not triggered by interactions. However, metacognition is also important in an interactive setting if “the lesson can be performed in a superficial or automatic fashion” (Moreno & Mayer, 2007).

According to Fryer and Bovee (2016), “although a variety of factors influence learning, few are as important as time on task.” Macfadyen and Dawson (2010) distinguish between online activity and time online, noting that online activity (i.e., written posts, sent messages, completed assessments) indicates learner engagement and predicts actual outcomes, while time online does not. Fritz (2011) also shows that higher activity in the learning management system affects actual learning, while Asarta and Schmidt (2013) as well as Buttner and Black (2014) find no correlation between time online and learning. Based on learning analytics, Zacharis (2015) finds that four online activities explain 52% of the variance in the final grade: number of files viewed, reading and posting messages, content creation contribution, and quiz efforts. These learner-related antecedents of technology-supported management education are illustrated in Fig. 6.
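Findings of this kind rest on an ordinary least squares regression of the final grade on activity counts, with the R² statistic giving the share of grade variance explained. The sketch below is a minimal illustration with synthetic data; the variable names, cohort size, and coefficients are our own illustrative assumptions, not Zacharis’s (2015) model or dataset.

```python
import numpy as np

# Synthetic learning-analytics data in the spirit of Zacharis (2015).
# All values below are illustrative assumptions, not the original dataset.
rng = np.random.default_rng(42)
n = 134  # hypothetical cohort size

files_viewed = rng.poisson(40, n).astype(float)   # files viewed
messages = rng.poisson(25, n).astype(float)       # reading and posting messages
contributions = rng.poisson(10, n).astype(float)  # content creation contribution
quiz_efforts = rng.poisson(8, n).astype(float)    # quiz efforts

# Simulated final grade loosely driven by the four activities plus noise
grade = (0.3 * files_viewed + 0.5 * messages + 0.8 * contributions
         + 1.0 * quiz_efforts + rng.normal(0, 10, n))

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), files_viewed, messages, contributions, quiz_efforts])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)

# R^2: share of grade variance explained by the four activity predictors
pred = X @ beta
r2 = 1 - np.sum((grade - pred) ** 2) / np.sum((grade - grade.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```

In a real learning-analytics study, the activity counts would be exported from the learning management system rather than simulated, and the R² value would then quantify how much of the grade variance the logged activities account for.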

Fig. 6 Learner-Related Antecedents

Discussion

In this paper, we have presented a systematic and comprehensive review of peer-reviewed, scientific publications from several research disciplines related to the effectiveness of technology-supported management learning. Although our search for literature was not limited to a specific timeframe, the current relevance of the topic is evident from the identified publications. Research on this topic began to emerge in the 1990s and has since flourished. With regard to the field of management education, the most cited articles were published in the current millennium (Arbaugh & Hwang, 2015). We found that the antecedents of technology-supported management learning effectiveness include more than technological characteristics and learners’ abilities to deal with them. More specifically, the introduction of technologies into the management learning space has implications for formats, instructors, and learner characteristics, all of which are highly interdependent. The desired format of instruction, for example, which is chosen by the instructor, determines the appropriate technology and the role of the instructor. Characteristics of the selected technology, such as quality, reliability, and richness, and characteristics of the instructor, such as attitude, control, and teaching style, impact learners’ perceptions, metacognition, and affect. These relationships are, in turn, moderated by learners’ demographic characteristics and previous experiences. Eventually, all four dimensions—learner, instructor, format, and technology—directly or indirectly influence technology-supported learning effectiveness. These findings are independent of the measurement of effectiveness (i.e., online activity, cognitive processing, perceived learning, satisfaction, actual results, or dropout rates).

These antecedents of technology-supported management learning effectiveness are summarized in Fig. 7. The subsequent section derives detailed implications for future research based on the identified inconsistencies and interdependencies.

Fig. 7 Integrated Perspective on Antecedents of Technology-Supported Management Learning Effectiveness

Implications for future research

In investigating antecedents of technology-supported management learning effectiveness, we have identified several inconsistencies and research gaps in the extant literature. We encourage management education scholars to study these issues in order to develop additional insights into technology-supported management learning. Such research will advance the young field of management education and make a positive contribution to overall management research and education. In Table 2, we highlight aspects that provide opportunities for further research.

Table 2 Selected Opportunities for Further Research

As far as the overall effectiveness of technology-supported formats is concerned, research has produced a number of inconsistent results. For instance, there is disagreement about the impact of blended environments on dropout rates (Deschacht & Goeman, 2015; López-Pérez et al., 2011). Moreover, whether the use of technology is beneficial for learning remains unclear. Twenty years ago, Arbaugh (2000a) found that the format of instruction is more important than the specific technology employed. To date, theoretical concepts on how to blend and flip learning content in relation to subject areas and content types are still lacking (O’Flaherty & Phillips, 2015). Although there might not be a “one-size-fits-all” approach, it is possible to examine which course structures and format features, such as collaboration and interaction, are more appropriate for certain types of content. Due to the wide variety of management disciplines, scholars in management education are well positioned to investigate different variants of blended and flipped learning (Arbaugh & Rau, 2007). Such studies can reveal connections among content type, optimal course format, and technology use.

Another key question is why learners continue to prefer face-to-face classes to online courses (O’Neill & Sai, 2014) even though they regularly use electronic devices and increasingly strive for individualized and flexible learning. As technologies are likely to continue to play a central role in society, different learning formats should be studied in relation to specific technologies and their richness of features. Such studies can further investigate whether the use of technology actually equalizes learners’ performance (Krentler & Willis-Flurry, 2005). Moreover, Piccoli et al. (2001) argue that the investigation of new formats and technologies for management education requires an examination of optimal class sizes. They argue for an inverted U-shaped relationship between class size and learning effectiveness, as the presence of more learners increases perspectives until a point is reached at which information overload and coordination difficulties outweigh the benefits of additional learners. However, this requires further examination.

Scholars agree that instructors play an important role in technology-supported management education, but how their role will change remains unclear (Arbaugh, 2000a; Volery & Lord, 2000). Some suggest a shift from “a sage on the stage into a guide on the side” (Markel, 1999), which implies a shift from an objectivist to a constructivist teaching approach. Nevertheless, collaborative management learning in a technology-supported environment seems to be moderated by an objectivist teaching approach (Arbaugh & Benbunan-Fich, 2006), which contradicts the plea for an interactive teaching style (Selim, 2007; Webster & Hackley, 1997). Furthermore, findings on the role and effects of feedback are inconsistent, particularly with regard to the moderating role of learners’ prior knowledge (Nihalani et al., 2011; Seufert, 2003). Deliberate confusion, a variant of feedback, has also been under-researched, and there are some indications that learners’ prior knowledge could play a moderating role (D’Mello et al., 2014; Leutner, 2014). Therefore, the design and impact of teaching style and instructor feedback on cognitive processing and actual learning should be further investigated, especially with regard to potential moderating variables, such as learners’ prior knowledge.

Since Moreno and Mayer (2007) proposed a cognitive-affective theory of learning with media, it has become clear that learning also depends on affective aspects, such as motivation and emotions. Although the related antecedents have not yet been fully researched, initial results suggest that the design of multimedia materials and interfaces should take into account features that trigger motivation and emotion (Mayer, 2014). However, while Plass et al. (2014) and Um et al. (2012) find that positive emotions can strengthen comprehension and transfer, Knoerzer et al. (2016) come to the opposite conclusion when they induce emotions in a different way. Another unresolved aspect of inducing emotions is whether the instructor should be shown speaking in educational videos. While this can create a positive sense of personalization, it may also increase the extraneous load (Kizilcec et al., 2015; Mayer, 2003). Furthermore, Leutner (2014) suggests that the effect of emotions on learning might be mediated by motivation or moderated by prior knowledge. As such, the interdependence and effects of motivation and emotions on cognitive processing and actual learning deserve further investigation. In addition, potentially moderating variables, such as learners’ prior knowledge, should be investigated.

Limitations

Although this review followed a systematic procedure, it has some limitations that can be attributed to either our methodology or our research focus. With regard to our methodology, the literature-identification process revealed that numerous publications from a variety of research areas have examined technology-supported learning. Although we have tried to systematically identify all major publications investigating this issue that are relevant for the management context, we cannot guarantee that our results are exhaustive. Furthermore, although we broadened our scope to include publications beyond management education research, we deliberately limited our search to educational psychology, educational technology, and higher education research. These three disciplines appeared to be the most promising during an initial interdisciplinary skimming of the literature. However, we cannot exclude the possibility that relevant research may have been conducted in other disciplines. Moreover, given the interdisciplinary nature of the sources, our literature prioritization and classification revealed that some results were more general in nature, while others were developed explicitly from management education research. In our search in the field of educational technology, we tried to limit our findings to those that came from a management context. Nevertheless, this paper also includes findings from other disciplines when they appeared to be transferable to the management environment. Decisions regarding this transferability were made by the authors.

In terms of the research focus, management is a broad field covering various sub-disciplines, including accounting, economics, finance, marketing, and strategy. Some of these fields are comparable in terms of concepts and terminologies, while others are not. Some fields are rather qualitative, and others are strongly quantitative. In addition, the spectrum of management learners ranges from freshmen in undergraduate programs to highly senior MBA students participating in executive programs. Similarly, the use of technologies in education covers a broad field ranging from traditional classroom teaching sporadically facilitated using electronic devices to programs taught fully online. As our objective was to examine antecedents of management learning in a technology-supported environment as a whole, we did not restrict the learning environment in terms of the technologies employed.

Concluding remarks

This paper has shown that educational technologies are quickly becoming an integral part of management education, both in theory and in practice. Although we have identified a number of research gaps and ideas for further research, educational authorities, institutions, and practitioners should not wait for additional research to be completed. Passive knowledge transfer in synchronous, analog classroom sessions can no longer be viewed as the most effective educational format. In addition, there are already some indications of what constitutes effective technology-supported management education. In the meantime, researchers from different disciplines should pursue investigations of technology-supported settings in relation to management education and beyond.

Availability of data and materials

Not applicable.

References

  1. Alavi, M. (1994). Computer-mediated collaborative learning: An empirical evaluation. MIS Quarterly, 18, 159–174. https://doi.org/10.2307/249763 .

  2. Alavi, M., & Gallupe, R. B. (2003). Using information technology in learning: Case studies in business and management education programs. Academy of Management Learning and Education, 2, 139–153. https://doi.org/10.5465/amle.2003.9901667 .

  3. Arbaugh, J. B. (2000a). How classroom environment and student engagement affect learning in internet-based MBA courses. Business Communication Quarterly, 63, 9–26. https://doi.org/10.1177/108056990006300402 .

  4. Arbaugh, J. B. (2000b). Virtual classroom characteristics and student satisfaction with internet-based MBA courses. Journal of Management Education, 24, 32–54. https://doi.org/10.1177/105256290002400104 .

  5. Arbaugh, J. B. (2000c). Virtual classroom versus physical classroom: An exploratory study of class discussion patterns and student learning in an asynchronous internet-based MBA course. Journal of Management Education, 24, 213–233. https://doi.org/10.1177/105256290002400206 .

  6. Arbaugh, J. B. (2008). Does the community of inquiry framework predict outcomes in online MBA courses? International Review of Research in Open and Distributed Learning, 9, 1–21. https://doi.org/10.19173/irrodl.v9i2.490 .

  7. Arbaugh, J. B. (2014). What might online delivery teach us about blended management education? Prior perspectives and future directions. Journal of Management Education, 38, 784–817. https://doi.org/10.1177/1052562914534244 .

  8. Arbaugh, J. B. (2016). Where are the dedicated scholars of management learning and education? Management Learning, 47, 230–240. https://doi.org/10.1177/1350507615595773 .

  9. Arbaugh, J. B., & Benbunan-Fich, R. (2006). An investigation of epistemological and social dimensions of teaching in online learning environments. Academy of Management Learning and Education, 5, 435–447. https://doi.org/10.5465/amle.2006.23473204 .

  10. Arbaugh, J. B., DeArmond, S., & Rau, B. L. (2013). New uses for existing tools? A call to study on-line management instruction and instructors. Academy of Management Learning and Education, 12, 635–655. https://doi.org/10.5465/amle.2011.0018A .

  11. Arbaugh, J. B., & Duray, R. (2002). Technological and structural characteristics, student learning, and satisfaction with web-based courses: An exploratory study of two on-line MBA programs. Management Learning, 33, 331–347. https://doi.org/10.1177/1350507602333003 .

  12. Arbaugh, J. B., Godfrey, M. R., Johnson, M., Pollack, B. L., Niendorf, B., & Wresch, W. (2009). Research in online and blended learning in the business disciplines: Key findings and possible future directions. Internet and Higher Education, 12, 71–87. https://doi.org/10.1016/j.iheduc.2009.06.006 .

  13. Arbaugh, J. B., & Hwang, A. (2015). What are the 100 most cited articles in business and management education research, and what do they tell us? Organization Management Journal, 12, 154–175. https://doi.org/10.1080/15416518.2015.1073135 .

  14. Arbaugh, J. B., & Rau, B. L. (2007). A study of disciplinary, structural, and behavioral effects on course outcomes in online MBA courses. Decision Sciences Journal of Innovative Education, 5, 65–95. https://doi.org/10.1111/j.1540-4609.2007.00128.x .

  15. Asarta, C. J., & Schmidt, J. R. (2013). Access patterns of online materials in a blended course. Decision Sciences Journal of Innovative Education, 11, 107–123. https://doi.org/10.1111/j.1540-4609.2012.00366.x .

  16. Asarta, C. J., & Schmidt, J. R. (2017). Comparing student performance in blended and traditional courses: Does prior academic achievement matter? Internet and Higher Education, 32, 29–38. https://doi.org/10.1016/j.iheduc.2016.08.002 .

  17. Beege, M., Schneider, S., Nebel, S., & Rey, G. D. (2017). Look into my eyes! Exploring the effect of addressing in educational videos. Learning and Instruction, 49, 113–120. https://doi.org/10.1016/j.learninstruc.2017.01.004 .

  18. Buttner, E. H., & Black, A. N. (2014). Assessment of the effectiveness of an online learning system in improving student test performance. Journal of Education for Business, 89, 248–256. https://doi.org/10.1080/08832323.2013.869530 .

  19. Carini, R. M., Kuh, G. D., & Klein, S. P. (2006). Student engagement and student learning: Testing the linkages. Research in Higher Education, 47, 1–32. https://doi.org/10.1007/s11162-005-8150-9 .

  20. Concannon, F., Flynn, A., & Campbell, M. (2005). What campus-based students think about the quality and benefits of e-learning. British Journal of Educational Technology, 36, 501–512. https://doi.org/10.1111/j.1467-8535.2005.00482.x .

  21. D’Mello, S., Lehman, B., Pekrun, R., & Graesser, A. (2014). Confusion can be beneficial for learning. Learning and Instruction, 29, 153–170. https://doi.org/10.1016/j.learninstruc.2012.05.003 .

  22. Daspit, J. J., & D’Souza, D. E. (2012). Using the Community of Inquiry framework to introduce wiki environments in blended-learning pedagogies: Evidence from a business capstone course. Academy of Management Learning and Education, 11, 666–683. https://doi.org/10.5465/amle.2010.0154 .

  23. Davis, F. D. (1986). A technology acceptance model for empirically testing new end-user information systems: Theory and results (Doctoral thesis). Massachusetts Institute of Technology.

  24. Demetriadis, S. N., Papadopoulos, P. M., Stamelos, I. G., & Fischer, F. (2008). The effect of scaffolding students’ context-generating cognitive activity in technology-enhanced case-based learning. Computers and Education, 51, 939–954. https://doi.org/10.1016/j.compedu.2007.09.012 .

  25. Deschacht, N., & Goeman, K. (2015). The effect of blended learning on course persistence and performance of adult learners: A difference-in-differences analysis. Computers and Education, 87, 83–89. https://doi.org/10.1016/j.compedu.2015.03.020 .

  26. Dindar, M., & Akbulut, Y. (2016). Effects of multitasking on retention and topic interest. Learning and Instruction, 41, 94–105. https://doi.org/10.1016/j.learninstruc.2015.10.005 .

  27. Eid, M. I. M., & Al-Jabri, I. M. (2016). Social networking, knowledge sharing, and student learning: The case of university students. Computers and Education, 99, 14–27. https://doi.org/10.1016/j.compedu.2016.04.007 .

  28. Eom, S. B., & Ashill, N. J. (2018). A system’s view of e-learning success model. Decision Sciences Journal of Innovative Education, 16, 42–76. https://doi.org/10.1111/dsji.12144 .

  29. Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4, 215–235. https://doi.org/10.1111/j.1540-4609.2006.00114.x .

  30. Evans, C. (2008). The effectiveness of m-learning in the form of podcast revision lectures in higher education. Computers and Education, 50, 491–498. https://doi.org/10.1016/j.compedu.2007.09.016 .

  31. Fritz, J. (2011). Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers. Internet and Higher Education, 14, 89–97. https://doi.org/10.1016/j.iheduc.2010.07.007 .

  32. Fryer, L. K., & Bovee, H. N. (2016). Supporting students’ motivation for e-learning: Teachers matter on and offline. Internet and Higher Education, 30, 21–29. https://doi.org/10.1016/j.iheduc.2016.03.003 .

  33. Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. Internet and Higher Education, 7, 95–105. https://doi.org/10.1016/j.iheduc.2004.02.001 .

  34. Goodhue, D. L., & Thompson, R. L. (1995). Task-technology fit and individual performance. MIS Quarterly, 19, 213–236. https://doi.org/10.2307/249689 .

  35. Grabe, M., & Christopherson, K. (2008). Optional student use of online lecture resources: Resource preferences, performance and lecture attendance. Journal of Computer Assisted Learning, 24, 1–10. https://doi.org/10.1111/j.1365-2729.2007.00228.x .

  36. Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In M. Sahami, A. Fox, M. A. Hearst, & M. T. H. Chi (Eds.), Proceedings of the first ACM conference on learning @ scale, (pp. 41–50). New York: ACM Press.

  37. Haggis, T. (2009). What have we been thinking of?: A critical overview of 40 years of student learning research in higher education. Studies in Higher Education, 34, 377–390. https://doi.org/10.1080/03075070902771903 .

  38. Hazari, S., Brown, C. O’M., & Rutledge, R. (2013). Investigating marketing students’ perceptions of active learning and social collaboration in blogs. Journal of Education for Business, 88, 101–108. https://doi.org/10.1080/08832323.2011.654141 .

  39. Huang, R.-T. (2014). Exploring the moderating role of self-management of learning in mobile English learning. Educational Technology and Society, 17, 255–267.

  40. Hwang, A., & Arbaugh, J. B. (2006). Virtual and traditional feedback-seeking behaviors: Underlying competitive attitudes and consequent grade performance. Decision Sciences Journal of Innovative Education, 4, 1–28. https://doi.org/10.1111/j.1540-4609.2006.00099.x .

  41. Ituma, A. (2011). An evaluation of students’ perceptions and engagement with e-learning components in a campus based university. Active Learning in Higher Education, 12, 57–68. https://doi.org/10.1177/1469787410387722 .

  42. Kember, D., McNaught, C., Chong, F. C. Y., Lam, P., & Cheng, K. F. (2010). Understanding the ways in which design features of educational websites impact upon student learning outcomes in blended learning environments. Computers and Education, 55, 1183–1192. https://doi.org/10.1016/j.compedu.2010.05.015 .

  43. Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education, 14, 746. https://doi.org/10.1186/s41239-017-0043-4 .

  44. Kizilcec, R. F., Bailenson, J. N., & Gomez, C. J. (2015). The instructor’s face in video instruction: Evidence from two large-scale field studies. Journal of Educational Psychology, 107, 724–739. https://doi.org/10.1037/edu0000013 .

  45. Knoerzer, L., Bruenken, R., & Park, B. (2016). Facilitators or suppressors: Effects of experimentally induced emotions on multimedia learning. Learning and Instruction, 44, 97–107. https://doi.org/10.1016/j.learninstruc.2016.04.002 .

  46. Kreijns, K., Kirschner, P. A., & Vermeulen, M. (2013). Social aspects of CSCL environments: A research framework. Educational Psychologist, 48, 229–242. https://doi.org/10.1080/00461520.2012.750225 .

  47. Krentler, K. A., & Willis-Flurry, L. A. (2005). Does technology enhance actual student learning? The case of online discussion boards. Journal of Education for Business, 80, 316–321. https://doi.org/10.3200/joeb.80.6.316-321 .

  48. Lancellotti, M., Thomas, S., & Kohli, C. (2016). Online video modules for improvement in student learning. Journal of Education for Business, 91, 19–22. https://doi.org/10.1080/08832323.2015.1108281 .

  49. Leidner, D. E., & Jarvenpaa, S. L. (1995). The use of information technology to enhance management school education: A theoretical view. MIS Quarterly, 19, 265–291. https://doi.org/10.2307/249596 .

  50. Leutner, D. (2014). Motivation and emotion as mediators in multimedia learning. Learning and Instruction, 29, 174–175. https://doi.org/10.1016/j.learninstruc.2013.05.004 .

  51. Liu, O. L. (2012). Student evaluation of instruction: In the new paradigm of distance education. Research in Higher Education, 53, 471–486. https://doi.org/10.1007/s11162-011-9236-1 .

  52. López-Pérez, M. V., Pérez-López, M. C., & Rodríguez-Ariza, L. (2011). Blended learning in higher education: Students’ perceptions and their relation to outcomes. Computers and Education, 56, 818–826. https://doi.org/10.1016/j.compedu.2010.10.023 .

  53. Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers and Education, 54, 588–599. https://doi.org/10.1016/j.compedu.2009.09.008 .

  54. Markel, M. (1999). Distance education and the myth of the new pedagogy. Journal of Business and Technical Communication, 13, 208–222. https://doi.org/10.1177/1050651999013002005 .

  55. Mayer, R. E. (2002). Multimedia learning. Psychology of Learning and Motivation, 41, 85–139. https://doi.org/10.1016/S0079-7421(02)80005-6 .

  56. Mayer, R. E. (2003). The promise of multimedia learning: Using the same instructional design methods across different media. Learning and Instruction, 13, 125–139. https://doi.org/10.1016/S0959-4752(02)00016-6 .

  57. Mayer, R. E. (2014). Incorporating motivation into multimedia learning. Learning and Instruction, 29, 171–173. https://doi.org/10.1016/j.learninstruc.2013.04.003 .

  58. Mayer, R. E., & Chandler, P. (2001). When learning is just a click away: Does simple user interaction foster deeper understanding of multimedia messages? Journal of Educational Psychology, 93, 390–397. https://doi.org/10.1037//0022-0663.93.2.390 .

  59. Mayer, R. E., Dow, G. T., & Mayer, S. (2003). Multimedia learning in an interactive self-explaining environment: What works in the design of agent-based microworlds? Journal of Educational Psychology, 95, 806–812. https://doi.org/10.1037/0022-0663.95.4.806 .

  60. Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educational Psychologist, 38, 43–52. https://doi.org/10.1207/S15326985EP3801_6 .

  61. McGill, T. J., & Klobas, J. E. (2009). A task–technology fit view of learning management system impact. Computers and Education, 52, 496–508. https://doi.org/10.1016/j.compedu.2008.10.002 .

  62. McLaren, C. H. (2004). A comparison of student persistence and performance in online and classroom business statistics experiences. Decision Sciences Journal of Innovative Education, 2, 1–10. https://doi.org/10.1111/j.0011-7315.2004.00015.x .

  63. Moreno, R. (2006). Does the modality principle hold for different media? A test of the method-affects-learning hypothesis. Journal of Computer Assisted Learning, 22, 149–158. https://doi.org/10.1111/j.1365-2699.2006.01595.x .

  64. Moreno, R., & Mayer, R. E. (2007). Interactive multimodal learning environments. Educational Psychology Review, 19, 309–326. https://doi.org/10.1007/s10648-007-9047-2 .

  65. Nemanich, L., Banks, M., & Vera, D. (2009). Enhancing knowledge transfer in classroom versus online settings: The interplay among instructor, student, content, and context. Decision Sciences Journal of Innovative Education, 7, 123–148. https://doi.org/10.1111/j.1540-4609.2008.00208.x .

  66. Nihalani, P. K., Mayrath, M., & Robinson, D. H. (2011). When feedback harms and collaboration helps in computer simulation environments: An expertise reversal effect. Journal of Educational Psychology, 103, 776–785. https://doi.org/10.1037/a0025276 .

  67. O’Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. Internet and Higher Education, 25, 85–95. https://doi.org/10.1016/j.iheduc.2015.02.002 .

  68. O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education, 68, 1–14. https://doi.org/10.1007/s10734-013-9663-3 .

  69. Owston, R., York, D., & Murtha, S. (2013). Student perceptions and achievement in a university blended learning strategic initiative. Internet and Higher Education, 18, 38–46. https://doi.org/10.1016/j.iheduc.2012.12.003 .

  70. Palocsay, S. W., & Stevens, S. P. (2008). A study of the effectiveness of web-based homework in teaching undergraduate business statistics. Decision Sciences Journal of Innovative Education, 6, 213–232. https://doi.org/10.1111/j.1540-4609.2008.00167.x .

  71. Park, B., Moreno, R., Seufert, T., & Brünken, R. (2011). Does cognitive load moderate the seductive details effect? A multimedia study. Computers in Human Behavior, 27, 5–10. https://doi.org/10.1016/j.chb.2010.05.006 .

  72. Park, B., Plass, J. L., & Brünken, R. (2014). Cognitive and affective processes in multimedia learning. Learning and Instruction, 29, 125–127. https://doi.org/10.1016/j.learninstruc.2013.05.005 .

  73. Piccoli, G., Ahmad, R., & Ives, B. (2001). Web-based virtual learning environments: A research framework and a preliminary assessment of effectiveness in basic IT skills training. MIS Quarterly, 25, 401–426. https://doi.org/10.2307/3250989 .

  74. Plass, J. L., Heidig, S., Hayward, E. O., Homer, B. D., & Um, E. R. (2014). Emotional design in multimedia learning: Effects of shape and color on affect and learning. Learning and Instruction, 29, 128–140. https://doi.org/10.1016/j.learninstruc.2013.02.006 .

  75. Redpath, L. (2012). Confronting the bias against on-line learning in management education. Academy of Management Learning and Education, 11, 125–140. https://doi.org/10.5465/amle.2010.0044 .

  76. Reed, P., & Reay, E. (2015). Relationship between levels of problematic internet usage and motivation to study in university students. Higher Education, 70, 711–723. https://doi.org/10.1007/s10734-015-9862-1 .

  77. Scheiter, K., & Gerjets, P. (2007). Learner control in hypermedia environments. Educational Psychology Review, 19, 285–307. https://doi.org/10.1007/s10648-007-9046-3 .

  78. Selim, H. M. (2003). An empirical investigation of student acceptance of course websites. Computers and Education, 40, 343–360. https://doi.org/10.1016/S0360-1315(02)00142-2 .

  79. Selim, H. M. (2007). Critical success factors for e-learning acceptance: Confirmatory factor models. Computers and Education, 49, 396–413. https://doi.org/10.1016/j.compedu.2005.09.004 .

  80. Seufert, T. (2003). Supporting coherence formation in learning from multiple representations. Learning and Instruction, 13, 227–237. https://doi.org/10.1016/S0959-4752(02)00022-1 .

  81. Sloan, T. W., & Lewis, D. A. (2014). Lecture capture technology and student performance in an operations management course. Decision Sciences Journal of Innovative Education, 12, 339–355. https://doi.org/10.1111/dsji.12041 .

  82. Snowball, J. D. (2014). Using interactive content and online activities to accommodate diversity in a large first year class. Higher Education, 67, 823–838. https://doi.org/10.1007/s10734-013-9708-7 .

  83. Solimeno, A., Mebane, M. E., Tomai, M., & Francescato, D. (2008). The influence of students’ and teachers’ characteristics on the efficacy of face-to-face and computer supported collaborative learning. Computers and Education, 51, 109–128. https://doi.org/10.1016/j.compedu.2007.04.003 .

  84. Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. Internet and Higher Education, 7, 59–70. https://doi.org/10.1016/j.iheduc.2003.11.003 .

  85. Sun, P.-C., Tsai, R. J., Finger, G., Chen, Y.-Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers and Education, 50, 1183–1202. https://doi.org/10.1016/j.compedu.2006.11.007 .

  86. Terpend, R., Gattiker, T. F., & Lowe, S. E. (2014). Electronic textbooks: Antecedents of students' adoption and learning outcomes. Decision Sciences Journal of Innovative Education, 12, 149–173. https://doi.org/10.1111/dsji.12066 .

  87. Um, E. R., Plass, J. L., Hayward, E. O., & Homer, B. D. (2012). Emotional design in multimedia learning. Journal of Educational Psychology, 104, 485–498. https://doi.org/10.1037/a0026609 .

  88. Volery, T., & Lord, D. (2000). Critical success factors in online education. International Journal of Educational Management, 14, 216–223. https://doi.org/10.1108/09513540010344731 .

  89. Walker, K., Curren, M. T., Kiesler, T., Lammers, H. B., & Goldenson, J. (2013). Scholarly networking among business students: Structured discussion board activity and academic outcomes. Journal of Education for Business, 88, 249–252. https://doi.org/10.1080/08832323.2012.690352 .

  90. Webster, J., & Hackley, P. (1997). Teaching effectiveness in technology-mediated distance learning. Academy of Management Journal, 40, 1282–1309. https://doi.org/10.2307/257034 .

  91. Whitaker, J., New, J. R., & Ireland, R. D. (2016). MOOCs and the online delivery of business education: What’s new? What’s not? What now? Academy of Management Learning and Education, 15, 345–365. https://doi.org/10.5465/amle.2013.0021 .

  92. Woo, J.-C. (2014). Digital game-based learning supports student motivation, cognitive success, and performance outcomes. Educational Technology and Society, 17, 291–307.

  93. Wu, J.-H., Tennyson, R. D., & Hsia, T.-L. (2010). A study of student satisfaction in a blended e-learning system environment. Computers and Education, 55, 155–164. https://doi.org/10.1016/j.compedu.2009.12.012 .

  94. Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education, 85, 633–659. https://doi.org/10.1353/jhe.2014.0028 .

  95. Yourstone, S. A., Kraye, H. S., & Albaum, G. (2008). Classroom questioning with immediate electronic response: Do clickers improve learning? Decision Sciences Journal of Innovative Education, 6, 75–88. https://doi.org/10.1111/j.1540-4609.2007.00166.x .

  96. Zacharis, N. Z. (2015). A multivariate approach to predicting student outcomes in web-enabled blended learning courses. Internet and Higher Education, 27, 44–53. https://doi.org/10.1016/j.iheduc.2015.05.002 .

  97. Zhao, C.-M., & Kuh, G. D. (2004). Adding value: Learning communities and student engagement. Research in Higher Education, 45, 115–138. https://doi.org/10.1023/B:RIHE.0000015692.88534.de .

Acknowledgements

Not applicable.

Funding

Not applicable.

Author information

Contributions

FAM developed the research idea, conducted the systematic literature analysis, structured the results, and derived future research priorities. TW drafted the work, was a major contributor, and substantially revised the manuscript. All authors read and approved the final manuscript. All authors have agreed both to be personally accountable for the author’s own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature.

Corresponding author

Correspondence to Fabian Alexander Müller.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Müller, F.A., Wulf, T. Technology-supported management education: a systematic review of antecedents of learning effectiveness. Int J Educ Technol High Educ 17, 47 (2020). https://doi.org/10.1186/s41239-020-00226-x

Keywords

  • Educational technology
  • Learning effectiveness
  • Management education
  • Systematic review