Alternatives to the conventional ‘Oxford’ tutorial model: a scoping review

Abstract

In higher education, one commonly used teaching approach intended to develop deep learning is the ‘Oxford’ tutorial—a personalized Socratic approach in which an instructor discusses course-related issues with a handful of students. Although this conventional tutorial model is well supported in the literature, it may be neglected by research-driven academics and is expensive to operate. The latter issue has placed tutorials in the spotlight because higher education institutions worldwide are facing huge funding cuts. In light of these problems, a scoping review was conducted to explore financially viable alternatives to the Oxford tutorial for management education. Four premier, highly ranked management education and development journals were searched, and a database of 48 published articles was compiled. These articles were reviewed by two independent raters in order to arrive at eight alternatives to the Oxford tutorial model that can achieve similar objectives while reducing costs. These alternative tutorial models all involve the application of information communication technologies to tutorials and include peer instruction, simulations and games, online collaborative learning, syndicates, flipped classrooms, communication systems, tailored learning, and portfolios. Challenges and implementation guidelines are explained for each alternative tutorial model.

Introduction

For many faculty, an ideal teaching environment is Socrates sitting under the linden tree with three or four dedicated and interested students. Unfortunately, the reality of mass higher education or ‘massification’ (Hornsby & Osman, 2014) makes this impossible for all but the most elite and expensive institutions. Instead of small classes in which students are mentored by ‘a Socrates’, lecturing is used as an economical and efficient way to transfer knowledge and, hopefully, improve learning. Lecturing remains the most common method of teaching in higher education. However, lectures, or any form of teaching at scale, should be augmented by other forms of teaching because large classes present challenges for implementing student-focused teaching, and thus for quality learning (Ryan et al., 2019; Sweeney, 2004).

Tutorials offer one way to augment lectures via smaller classes. The tutorial approach to teaching, in which students are taught in small, intimate groups, originated at the University of Oxford and the University of Cambridge in the eleventh century. At that time, the purpose of tutorials was for a tutor to instruct as well as manage the conduct of younger colleagues (Moore, 1968). Towards the nineteenth century, the framework for tutorials was further established by the teachings of Professor Benjamin Jowett of the University of Oxford, who became renowned for his Socratic approach to teaching (Markham, 1967). Professor Jowett’s use of Socratic dialogue permeated not only the tutorial system throughout the University of Oxford, but also the concept of a tutorial in general. The tutorial system is regarded as the foundation of Oxbridge education, and is a cornerstone of the overall British education system. In fact, Lord Curzon, Chancellor of Oxford University in 1909, stated that the Oxford tutorial has ‘stamped its mark on the lives and characters of generations of men, and has excited the outspoken envy of other nations’ (Curzon, 1909, p. 122). It is no surprise, then, that the Oxford tutorial is regarded as the ‘jewel in the crown’ (Palfreyman, 2008, p. 15).

In some disciplines, tutorials can be composed of as few as one to six students, while in other disciplines such as social sciences, sciences, and engineering, tutorials tend to be larger (Commission of Inquiry, 1997). The relatively larger class sizes for lectures are often better suited for disseminating knowledge (an information transfer/teacher-centric approach). However, the smaller class sizes for tutorials can facilitate (a) close tutor–student interactions, thus allowing for individual attention and dyadic knowledge creation and (b) independent and self-directed preparation beforehand via reading, essay-writing, and/or preparing answers to problems (a conceptual change/student-centric approach) (Commission of Inquiry, 1997; Sweeney et al., 2004).

These two contrasting approaches to teaching have been found to be related to students’ approaches to learning. Students’ approaches to learning can be categorized as either deep or surface (Marton & Saljo, 1997). The surface approach involves rote learning for the purpose of memorization, recall, and other routine processing activities (Heikkilä & Lonka, 2006). A deep approach to learning means that students try to genuinely understand the underlying meaning of the content through the use of active problem solving and deep thinking skills (Heikkilä & Lonka, 2006). Unsurprisingly, students in a teacher-centric classroom environment (lectures) are more likely to adopt a surface approach to learning whereas students in a student-centric classroom environment (tutorials) are more likely to adopt a deep approach to learning (Prosser & Trigwell, 1999). Herein lies the importance of tutorials: a deep approach to learning is related to high-quality learning outcomes such as application, analysis, and critical thinking (Prosser & Trigwell, 1999).

Defining tutorials

Given the rich history of tutorials along with empirical support for their benefits, it is of little surprise that tutorials are common in higher education teaching. Despite their popularity, a clear definition of the tutorial is largely absent from the academic literature. Two challenges in defining ‘tutorial’ are that (1) the concept has changed over time because of its flexibility and dynamism in practice (Mills & Alexander, 2013) and (2) there are inherent differences in tutorials between subject areas (Commission of Inquiry, 1997). Therefore, in order to arrive at a general definition of the tutorial, we searched the web pages of the top 10 universities ranked by Times Higher Education in 2020. Given that these are world-leading universities, they are likely to provide current conceptualizations of the modern-day tutorial. The definitions provided by these universities are shown in Table 1.

Table 1 Definitions of ‘Tutorial’ Across the Top 10 Universities as Ranked by Times Higher Education (THE)

Based on these definitions, the concept of tutorials seems to be composed of four main features. First, tutorials are characterized by personalized attention because they are conducted in relatively smaller classes than lectures. Individualized instruction can be delivered via one-on-one tutoring, but for tutorial groups, the ideal class size is between 5 and 8 students per group—fewer than 5 reduces diversity and variety, and thus the quality of interactions, while more than 8 leads to a reduction in contribution from some members (Exley & Dennick, 2004). Larger ‘small’ groups of 20 to 25 students can provide similar advantages to small-group learning, but require careful planning to do so (Channon & Walker, 1984). Second, tutorials provide a safe space for deeper engagement with the subject by facilitating the testing of ideas, clarification of applications and problems, and hands-on practice, all while receiving regular feedback. Third, tutorials are intended to develop confidence, critical and independent thinking, and course-related problem-solving skills. Fourth, tutorials are student-centered because they feature high levels of interaction between the students and the tutor, and among the students themselves. Being student-centered also means that students are generally expected to prepare work in advance in order to contribute to the tutorial. Taking these four features into account, a tutorial can then be defined as: personalized and student-centered small-group sessions that provide a safe space for deeper engagement with the subject area in order to develop important skills and abilities that are targeted by the course.

Problem statement

Tutorials, in the traditional ‘Oxford’ sense (a teacher and a few students in a classroom), are conducive to deep learning, and thus higher-quality learning outcomes, as explained earlier. However, tutorials are coming under increased scrutiny because of huge funding cuts to higher education institutions (HEIs) (Fazackerley, 2019) taken together with the rapid rise in student enrolment figures (also called ‘massification’) (Hornsby & Osman, 2014). This perfect storm creates a situation in which (1) some academics short-change their teaching in order to focus on their more recognized research for job security reasons (Palfreyman, 2008) and (2) HEIs are being pressured to cut expenditure while simultaneously becoming more student-centric in their teaching approaches in order to compete with other HEIs. The first issue is problematic because tutorials run the risk of becoming ‘paste’ (i.e., they become neglected and melt away into an Oxbridge myth) (Palfreyman, 2008). The second issue is also problematic because student-centric tutorials can represent a significant financial cost to universities (Exley & Dennick, 2004). Both problems may be alleviated if there are financially viable alternatives to the Oxford tutorial.

Purpose statement

In light of the problem statement, the purpose of this paper is to explore creative alternatives to the Oxford tutorial that are aligned with the definition of the tutorial outlined earlier. The following two questions guided our review of the literature:

  1. What are cost-effective ways of re-designing the Oxford tutorial model for management education?

  2. Do these alternative tutorial models promote high-quality learning outcomes?

The expected outcomes of this research are (1) models for designing tutorials that broaden our sights beyond the traditional Oxford tutorial model and (2) a synthesis of empirical research on the outcomes of these novel tutorial models.

Methods

We used a scoping review method to answer the research questions. A scoping review (or scoping study) refers to ‘a form of knowledge synthesis that addresses an exploratory research question aimed at mapping key concepts, types of evidence, and gaps in research related to a defined area or field by systematically searching, selecting, and synthesizing existing knowledge’ (Colquhoun et al., 2014, p. 1292). We chose a scoping review method because of the broad nature of our exploratory question and potentially diverse body of literature on tutorial alternatives. Scoping reviews are also considered appropriate when there is a need to (1) include a greater range of study methodologies than a systematic review or meta-analysis and (2) outline a descriptive review rather than synthesizing evidence across studies (Pham et al., 2014). Accordingly, we used a scoping review because we needed to conduct a mapping of the research on potential alternatives to a conventional tutorial, without prior knowledge of these alternatives or their effects. Furthermore, one of the main intentions of a scoping review is to inform practice, and, accordingly, the main aim of this paper was to inform practice at HEIs with respect to designing tutorials in alternative ways. In order to conduct this scoping review, we followed the guidelines outlined by Arksey and O’Malley (2005) and Levac et al. (2010).

Identifying relevant studies

We conducted a search of peer-reviewed academic articles that were published before December 2019. A major challenge when searching the term ‘tutorial’ is that it is commonly used in the literature to refer to a guide on how to perform a task via a series of stages (instead of ‘tutorial’ as defined in this paper). For this reason, database searching was impractical (e.g., searching the keyword ‘tutorial’ on ScienceDirect and PsycINFO yielded 66,944 and 3535 hits respectively, with the articles focusing on ‘how-to’ guides for a variety of topics). Therefore, we hand-searched all ‘Management Education and Development’ journals that were rated 3 or 4 according to the 2018 Academic Journal Guide produced by the Chartered Association of Business Schools (CABS). These journals were Academy of Management Learning and Education; British Educational Research Journal; Management Learning; and Studies in Higher Education. We searched for peer-reviewed articles containing the keyword ‘tutorial’.

Study selection

A total of 1563 records were produced, as shown in Table 2. We then analyzed all titles and abstracts in order to determine relevancy to the tutorial concept. Because of the broad scope of this review, the keyword ‘tutorial’ did not have to be explicitly stated in the title or abstract. Instead, the inclusion criteria were expansive: we first screened for articles that pointed toward any form of small-group teaching technique, student-centered learning, general teaching and learning approaches, or anything relating to teaching and instruction. Note that the inclusion criteria stated here were not determined in a linear fashion, but were instead developed in an iterative process, as is typical of scoping reviews (Colquhoun et al., 2014). Specifically, we refined the search strategy by adding new inclusion criteria as we developed familiarity with the subject matter through reading the articles.

Table 2 Number of selected articles by journal

The inclusion criteria were used to select articles for further review. As shown in Table 2, Academy of Management Learning and Education produced 80 records, of which 5 were deemed eligible; the British Educational Research Journal produced 400 records, of which 8 were deemed eligible; Management Learning produced 237 records, of which 11 were deemed eligible; and Studies in Higher Education produced 846 records, of which 34 were deemed eligible. A total of 58 articles were eligible for full-text review, from which 48 were included based on a full paper reading (see Fig. 1 for a PRISMA flow diagram). Most of the articles that focused on tutorials but were excluded, either at the first screening step or after the full paper reading, focused on the training of tutors; the traditional Oxford tutorial model; pedagogical approaches, aids, and techniques that did not provide an alternative model to tutorials (e.g., problem-based learning and self-directed learning); institutional supports that complement tutorials (e.g., development centers); or principles/characteristics of student-centeredness rather than practical recommendations. Other excluded articles focused on topics that were not remotely related to tutorials or personalized small-group teaching.

Fig. 1
figure1

PRISMA flow diagram outlining the stepwise screening and filtering of the literature

Charting the data

We charted the data in an iterative process by determining which tutorial approach to extract from each article in order to answer the research questions. In charting the data, each tutorial approach was mapped onto four domains that were developed by two raters in an iterative process while reading through the articles. These four domains were information and communications technology; self-regulated learning; peer interaction; and small-group teaching. In the next section, we discuss the findings for the alternative tutorial models in relation to these four domains.

Results

In collating the findings from the 48 included studies, two raters independently developed a thematic framework to collate the tutorial approaches according to the four domains (see Table 3). Here, both raters grouped together approaches that were similar according to the four domains. Interrater agreement was 88%, and the raters discussed any differences in terminology in order to arrive at eight financially viable alternatives to the traditional Oxford tutorial model. These alternatives are peer instruction, simulations and games, online collaborative learning, flipped classrooms, syndicates, communication systems, tailored learning, and portfolios.

Table 3 Charting the data from included studies to map alternative models for tutorials

Peer instruction

Peer instruction was the most popular finding for an alternative to the Oxford tutorial. Peer instruction is used in this paper to capture a wide range of terms such as peer learning, proctoring, peer mentoring, peer-tutoring, peer teaching, peer modeling, peer education, and peer monitoring (even though these terms are often used interchangeably in the literature, see Topping, 2005 for the minor conceptual differences between certain terms). Interestingly, there is a rich history of research on peer instruction and its potential to replace the Oxford tutorial. In fact, investigative studies and test-runs of peer instruction date back to the 1980s and 1990s. The enthusiasm for peer instruction in the academic literature has gradually increased since then and has been accompanied by its increasing usage in higher education (Rees et al., 2016).

Peer instruction can be defined as ‘the acquisition of knowledge and skill through active helping and supporting among status equals or matched companions. It involves people from similar social groups who are not professional teachers helping each other learn and learning themselves by so doing’ (Topping, 2005, p. 631). Peer instruction typically takes the form of students tutoring or coaching other students under a lecturer’s supervision (the lecturer is not directly involved in teaching), more advanced students testing less advanced students before the latter write exams, student officials who are responsible for discipline, and final-year students who supervise first-year undergraduate projects. These peer instruction programs can be highly structured and sequenced via the use of workbooks in order to facilitate monitoring (Arco-Tirado et al., 2019). Also, the peer-tutor is typically assessed for credit in two ways: (1) a written report of their own progress and experiences throughout the year and (2) responses from tutees to questions focused on peer-tutoring (Arco-Tirado et al., 2019; see example questions in Saunders, 1992). The written report can take the form of a reflection in which the tutor engages in a conversation about actions taken during the course, which may increase meta-cognitive abilities (i.e., the ability to monitor and supervise the learning process itself) (Cortese, 2005).

Peer instruction suggests equality of status, but this approach often includes interactions between more advanced and less advanced students (Saunders, 1992). The aims of peer instruction programs are to (1) create a friendly, supportive, safe, and less inhibited environment in which well-informed students are more willing to share ideas, ask questions, and seek feedback in a free and frank manner, (2) develop tutors’ and tutees’ communicative, leadership, and analytical thinking skills, (3) provide social support, particularly with respect to adjusting to university life, (4) create a sense of community and belonging via increased student interactions, and (5) conserve faculty resources (Dancer et al., 2015; Frankham, 1998; Magin & Churches, 1995; Saunders, 1992). The first four aims are based on the premise that students and peer-tutors share similar cognitive structures, and thus are likely to be highly susceptible to their peer group (Frankham, 1998). In other words, peer-tutors can ‘speak the same language’ as their tutees, and thus may not only develop close relationships characterized by trust, but also be aware of approaches to educating that are aligned with their peers’ ways of learning (Frankham, 1998).

Peer instruction is related to a myriad of desirable outcomes. Quantitative studies showed that peer instruction can improve students’ (including at-risk students’) academic performance (Arco-Tirado et al., 2019; Dancer et al., 2015; Morales et al., 2016), reduce dropout problems that particularly affect freshmen (Arco-Tirado et al., 2019), and promote effective learning strategies and problem-solving skills (Longfellow et al., 2008). Quantitative studies that used experimental research designs to compare peer-tutoring to traditional instruction found that peer-tutoring can be superior: peer instruction was more strongly related to deep learning, students showed better constructive alignment of classes with final exams, and students’ grade point averages were higher (Arco-Tirado et al., 2011; Lueg et al., 2016). As mentioned earlier, deep learning is a critical outcome of tutorials, and peer instruction improves deep learning because students develop shared values, cohesion, feelings of responsibility to contribute, and elaboration by having to discuss and listen to different views about course content (Lueg et al., 2016). Qualitative studies found that peer instruction results in better counseling and advising of first- and second-year students on a wide range of issues (Saunders, 1992), increased individual assistance and immediate responses to questions, a more relaxed learning climate, and greater empathetic understanding (Magin & Churches, 1995).

In addition to these empirical findings, peer instruction exposes tutees to their peers’ feedback. There is a wealth of research on peer feedback that goes beyond the scope of this review. That said, peer feedback can provide tutees with a different perspective to their teacher, and may improve tutees’ critical reasoning skills, evaluative judgment, reflections, and academic writing (López-Pellisa et al., 2020; Tai et al., 2018). Peer feedback also provides students with comments and dialogue that tend to be richer and more voluminous than that provided by a single teacher (Nicol, 2010) and may reduce learned dependence whereby students shift from ‘cue-seekers’ who ‘hunt for hints’ to maximize grades to autonomous learners who can now judge their peers’ work, and thus develop understandings of quality (Yorke, 2003). For further guidelines on designing peer feedback, see Falchikov and Goldfinch (2000) and Li et al. (2016).

Despite the clear benefits of peer instruction, there are a couple of challenges in implementing this approach. First, while peer-tutoring can be beneficial to both tutors and tutees, highly erratic and irresponsible tuition can occur at times. Furthermore, tutors can struggle with basic facilitation and with knowing how to ask questions that improve understanding of a problem (Saunders, 1992). Second, in close peer-tutoring systems (i.e., between students at the same level), ambitious students may feel insufficiently tutored by weaker students (Lueg et al., 2016).

Therefore, student tutors need special training before adopting peer instruction (Saunders, 1992; Smith, 2008). Such training can be provided by a university-appointed tutor who leads tutorial classes, but in a significantly smaller capacity than a traditional tutorial model (i.e., a hybrid tutorial in which some sessions are led by the university tutor and other sessions are led by a peer-tutor). Instructors should also provide guiding questions to each tutor so that there is some consistency between classes, and because guided peer instruction is related to better learning outcomes than unguided peer instruction (Winters & Alexander, 2011). Arco-Tirado et al. (2019) outlined a training program for a dyadic cross-year peer instruction program that consisted of three training sessions, which focused on counseling approaches, developing self-regulated learning, subject-specific knowledge, and social adjustment to university study demands. Such training programs can be delivered via simulation exercises, role-play exercises, and traditional classroom learning (for more specifics on designing and delivering a training program see Goodlad et al., 1979). Peer feedback also requires teachers to develop students’ feedback literacy by showing students high and low-quality feedback examples as well as training in the use of assessment rubrics (Carless & Boud, 2018; Hanrahan & Isaacs, 2001).

Technology can also enhance the implementation of peer tutoring in various ways. First, the tutoring itself can not only be delivered over the internet (e.g., e-mail, videoconferencing software, instant messaging, etc.) but also be tracked and facilitated via web-based applications such as Opal (Online Peer-Assisted Learning) or ClassWide Peer Tutoring (Abbott et al., 2006; Evans & Moore, 2013). Second, Opal can determine tutor eligibility via digital problem solving, thus providing a gated approach to filtering tutors and tutees according to their competency in the subject matter with minimal effort required from instructors (Evans & Moore, 2013). Third, a peer tutoring web application may be used to offer flexibility to tutors and tutees to select each other by accepting and making requests respectively (Akobe et al., 2019). The use of technology in peer tutoring has been shown to improve student learning (Evans & Moore, 2013), and thus it is worth exploring combinations of these tools to further improve ways in which students connect to each other (e.g., gated approaches together with flexibility in the selection of tutors and tutees).

Overall, peer instruction shows promise. Once peer instruction is implemented properly as part of the curriculum, and careful thought is given to what form of organization fits the purpose and context (see Topping, 2005 for twelve questions that should be addressed prior to implementation), peer instruction appears not only to be an efficient and effective alternative to the Oxford tutorial, but also to surpass the traditional approach in important ways without adding considerably to academics’ workloads.

Simulations and games

Business games and simulations in management education and development originated in North America in the 1950s (Leemkuil & de Jong, 2012). The intention of these games and simulations was to bridge the gap between formal academic teaching and on-the-job practical experience. Games and simulations are not identical and may serve different purposes. Games, or simulation games, are ‘based on a model of a (natural or artificial) system or process’ and require learners to achieve a challenging goal under specific constraints or uncertain conditions (Leemkuil & de Jong, 2012, p. 654). In contrast, pure simulations require learners to alter the values of input variables to observe the impact on output variables, without a specific goal or constraints (Leemkuil & de Jong, 2012). In both cases, students typically work in small groups to engage in deeper learning via application of course content to different scenarios (Lynn & Taylor, 1993; Simmons, 2017).

Empirically, business games and simulations are an attractive form of experiential and active learning (Kolb, 1984). They have even been shown to be superior to traditional teaching/lectures (DeNeve & Heppner, 1997; Pasin & Giroux, 2011). In addition, games and simulations can increase students' confidence, employability skills, and conceptual understanding of business and entrepreneurship (Neck & Greene, 2011). From a broader perspective, meta-analytic findings show that games and simulations are positively related to numerous desirable affective (e.g., satisfaction, motivation, attitudes), behavioral (e.g., participation, social skills, teamwork), and cognitive (e.g., learning, problem-solving, content understanding, critical thinking) outcomes (Vlachopoulos & Makri, 2017).

A major challenge with games and simulations is that learners may experience problems with the learning experience. Specifically, students may not like imprecisely defined problems, and thus may adopt an ‘engineering approach’ (working toward a certain goal rather than testing hypotheses), interpret findings incorrectly, and have trouble correcting their pre-existing ideas even when data contradict those ideas (De Jong, 2006; Lynn & Taylor, 1993). Therefore, support is needed to address these issues (Leemkuil & de Jong, 2012; Pasin & Giroux, 2011).

Reid et al. (2003) propose that three types of support be provided to students. First, interpretative support focuses on providing students with background knowledge so that they create sound hypotheses (e.g., online information or assignments that direct students towards variables that should be manipulated). Second, experimental support helps students to design and interpret experiments properly (e.g., adaptive feedback via pop-up windows or a ‘virtual advisor’). Third, reflective support involves prompting learners to think about specific aspects of the process and the knowledge gained. Taken together, the overall aim of these three types of support is to enhance cognitive learning by ensuring that students view the games and/or simulations as relevant by aligning the experience with students’ theoretical knowledge in the course.

Overall, business games and simulations with support offer a powerful alternative model to the Oxford tutorial. Specifically, with proper guidance (online and/or in-class) and integration with the curriculum, games and simulations mimic the features of tutorials (i.e., small personalized and interactive groups that are engaged in deep learning), but in a non-traditional manner because the supports for the game/simulation, along with the experience itself, act as a tutor of sorts. Games and simulations can even be combined with other tutorial approaches described in this paper. For instance, flipped classrooms and syndicates may work well here because students can be required to play the game in their personal time and then operate in class as ‘Boards of Directors’ with specific roles such as Managing Director, Operations Director, and Human Resource Manager (Simmons, 2017). Here, students are encouraged to move away from the computer to make decisions, and can further be encouraged to document board meetings and include rationales for decisions (Simmons, 2017). Practitioners from industry and video clips can even be incorporated in order to provide timely advice and guidance on practical issues (Barrett & Lally, 2000; Lynn & Taylor, 1993). Another possible combination is the use of game elements with peer instruction (see Indriasari et al., 2020 for an outline of a ‘gamified peer review model’). A few popular examples of games and simulations include SimVenture, Mike’s Bikes, Capitalism Lab, simCEO, MobLab, and MIT Sloan Management Simulation Games.

Online collaborative learning

We used the term ‘online collaborative learning’ (OCL) to capture a multiplicity of similar concepts including online co-creation, computer-supported collaborative learning, online discussions, web-mediated discussions, web-based bulletin boards, and computer-mediated tutorials. OCL is based on social constructivist and sociocultural perspectives of learning which assert that all higher mental processes occur between people before being internalized (Dysthe, 2002). Specifically, meaning and understanding develop via multi-voicedness whereby there is a reciprocity of differences and similarities in points of view that leads to a ‘complexification’ of the issue, thus counteracting oversimplification of complex issues (Dysthe, 2002; Pee, 2020). Accordingly, OCL can be defined as ‘a process of social negotiation or collaborative sense making, mentoring and joint knowledge construction’ that is facilitated via electronic tools (Dysthe, 2002; Zhu, 1998, p. 234).

Electronic tools can facilitate synchronous or asynchronous discussions. Synchronous means that students communicate in real time. These real-time communications typically take place in chat applications such as WhatsApp and Discord—both of which have been shown to be effective when integrated into higher education classrooms (Lacher & Biehl, 2019; Minalla, 2018). These synchronous forms of discussion are likely to appeal to today’s learners, who are ‘always on’ and are familiar with and enthusiastic about using instant messaging (Minalla, 2018). Asynchronous means that students communicate at their convenience (e.g., electronic bulletin boards on learning management systems such as Moodle). While synchronous discussions closely resemble informal oral speech, asynchronous discussions seem to be a hybrid between informal writing (‘free writing’) and presentation writing (Dysthe, 2002). Presentation writing means that the message is prepared with an audience in mind.

The typical format for OCL is one in which a teacher poses an open-ended question or assignment for which there is no single correct response. Then, each student is encouraged to contribute toward a co-constructed understanding via online discussions and/or voting (Dysthe, 2002; Pee, 2020). There is no consensus on the degree of teacher involvement necessary for productive discussions. But, research suggests that the instructor serves an important supporting or moderating function, with the degree of support being dependent on the context (Dysthe, 2002). In addition to this whole-class discussion format, groups may also be used, particularly in large classes. Here, a major advantage of OCL over traditional tutorials is that each group’s response/solution can then be shared online, and students can then be required to comment on other groups’ responses (cross-team solution co-creation) and even vote on the best response (Pee, 2020). Such cross-team interaction represents a unique departure from the Oxford tutorial, in which there is typically no communication between tutorial groups.

Empirically, OCL can stimulate higher phases of knowledge construction (Schellens & Valcke, 2005) and improve students' grades (Webb et al., 2004). Interestingly, students’ course performance improves even when (1) students simply read posts on the forum without posting (in voluntary online forums) and (2) instructors invest little time on the forum (Cheng et al., 2011). A few additional benefits to OCL are that it allows students to (1) organize their tutorial learning to suit their lifestyle, (2) spend time to think and develop a proper response/answer, (3) contribute more evenly without being overpowered by more assertive students as typical in face-to-face tutorials, thus leading to a broad range of ideas, and (4) feel more confident about contributing because of reduced race and gender-based inhibitions (Light et al., 2000; Sweeney et al., 2004).

While OCL can replace the Oxford tutorial with fewer resources (i.e., a limited or no role for a tutor), there are a few caveats. First, one danger of asynchronous discussions is that students tend to ‘say their piece’ in relation to the issue without considering others' views (Dysthe, 2002). Dysthe (2002) suggests that students should be encouraged to consider others' thoughts in order to develop dialogue rather than simply present new information. Specifically, the instructor should explicitly encourage students to steer their discussions in relation to previous student entries in order to fully develop prior thoughts on an issue (Dysthe, 2002). Second, OCL requires assignments that are challenging and relatable to stimulate dialogue (Dysthe, 2002). Third, although no instructor involvement can create a more immediate and collaborative environment, this approach can be susceptible to ‘flaming’ contributions (Light et al., 2000). Fourth, students should be advised to keep messages short and use paralinguistic cues (e.g., ‘I agree…’, ‘I feel strongly …’) to create dialogue. Overall, once support structures are put in place to guide and shape the interaction between students, OCL can provide a cheaper alternative to the Oxford tutorial.

Syndicates

Syndicates blur the lines between small and large group teaching approaches. Syndicates refer to an approach in which an instructor divides a larger group of students into a series of smaller working groups (Exley & Dennick, 2004). The instructor can then use various student-focused, small-group teaching approaches within these smaller groups and act as a sort of resource, coordinator, or synthesizer (Exley & Dennick, 2004; Ryan et al., 2019). In this subsection, we focus on syndicates that are instructor-led because other forms are covered earlier (i.e., peer instruction covers student-led syndicates and online collaborative learning covers virtual syndicates). Syndicates are typically found in classes that focus on problem-solving, practical work, or group project work (Exley & Dennick, 2004).

In large lectures, syndicates can serve the same purpose as tutorials, but must be coordinated properly and embedded seamlessly into the lecture by the instructor. For instance, each learning objective in the lecture can be accompanied immediately by syndicate work such as in-class activities (e.g., simulations, games, and/or experiential learning exercises), group questions/quizzes, problem-solving in pairs, small-group discussions, etc. (Ryan et al., 2019). Some of these activities can be delivered and/or augmented by educational technology as discussed later on (e.g., student response systems that provide real-time feedback on students’ learning). This sort of active participation in lectures shifts the lecture from a passive learning environment to one that can harness the benefits of small group teaching such as deep learning, higher-order skills, and generally high-quality learning outcomes (Ryan et al., 2019). Syndicates also seem particularly suited to flipped classrooms.

Flipped classrooms

The flipped classroom is ‘a blended learning approach which moves lecture content out of the classroom [to] online, freeing up class time for more active learning methods’ (Price & Walker, 2019, p. 1). The flipped classroom involves using pedagogical approaches that (1) transfer the dissemination of information out of class (typically online via screencasting and/or vodcasting), (2) use class time for active learning activities, and (3) necessitate pre- and/or post-class preparation in order to fully benefit from class activities (Abeysekera & Dawson, 2015). Lectures can then utilize group discussions and problem-solving activities via syndicates, particularly in large cohorts (Price & Walker, 2019).

The flipped classroom involves a change in the traditional use of in-class and out-of-class time that has been facilitated by educational technology (Abeysekera & Dawson, 2015; Lo, 2018). Educational technology enables the creation of pre-recorded self-paced instructional videos, online quizzes with immediate feedback, and online learning content, all of which can be delivered via e-learning portals (Kim et al., 2014; Lo, 2018). Instructional videos are commonly used for out-of-class time (e.g., Edpuzzle) (Mehring, 2016), and advancements in technology mean that these videos can take various forms such as animated videos (e.g., Powtoon and VideoScribe) and/or an instructor presenting alongside slides (e.g., Prezi Present). There are also technological advancements that can be used to enhance the active learning that takes place in-class, including clickers/student response systems and real-time monitoring and evaluation systems (e.g., Pear Deck and Kahoot). One recent technological advancement in flipped classrooms is that of ‘seamless flipped learning’ in which mobile and wireless communication technologies allow learners to connect home learning, in-class activities, and field learning (Hwang et al., 2015). Overall, educational technology has not only facilitated flipped classrooms but also continues to shape how flipped classrooms evolve.

Evidence suggests that the flipped classroom may be a sound alternative to the traditional lecture/tutorial format. Theoretically, the flipped classroom is expected to improve student motivation, help manage cognitive load, and improve academic performance (Abeysekera & Dawson, 2015). Flipped classrooms also represent a fundamental shift from lower-order knowledge and comprehension in lectures to higher-order application, analysis, evaluation, and synthesis via the active learning activities (Krathwohl, 2002). From a student’s perspective, the flipped classroom provides greater flexibility over the pace of learning because online lectures can be viewed at any time prior to scheduled class time (Price & Walker, 2019). Empirically, a few studies found that flipped classrooms are related to students’ motivation, engagement, attendance, learning, and more effective use of class time (Nouri, 2016; O’Flaherty & Phillips, 2015).

In using flipped classrooms, the first order learning from lectures is shifted from the lecture hall to online, while the richer second-order learning from tutorials is shifted to the lecture hall (Bonvillian & Singer, 2013). While this approach can sometimes place greater demands on the instructor than traditional lecture/tutorials (Price & Walker, 2019), a creative and possibly less demanding variation is to use massively open online courses (MOOCs) for lectures and assignments, and class time for practical work (Fox, 2013). Additionally, flipped classrooms must be managed carefully in order to reap similar benefits as the Oxford tutorial. Specifically, flipped classrooms require (1) a clear structure, in terms of the content being taught online and appropriate study timings (O’Flaherty & Phillips, 2015) and (2) students to engage in self-regulation (Abeysekera & Dawson, 2015). For specific guidelines on shifting a traditional course towards a flipped classroom, see Price and Walker (2019).

Communication systems

We used the label ‘communication systems’ to refer to the use of technology to facilitate student-centered pedagogical approaches. Rather than shift the student-centered approach to teaching from large lectures to smaller tutorials, modern technology facilitates the use of student-centered teaching in large lectures. In other words, communication systems can facilitate personalized interactions and discussions in large classes, and thus complement earlier themes such as peer instruction and syndicates (see Nicol & Boyle, 2003 for a sequence of activities when using communication systems with syndicates). There are four main ways in which technology can be utilized to serve purposes similar to those of tutorials: student response systems, intelligent tutoring systems (ITSs), automatic writing evaluation tools, and personal feedback.

First, student response systems like ‘clickers’, Kahoot!, Socrative, and Poll Everywhere enable instructors to receive real-time anonymized data from students’ responses to in-class questions (Ryan et al., 2019). Instructors can thus use these technologies to quickly gauge students’ understanding and deliver immediate feedback to students (Ryan et al., 2019). Moreover, these technologies may facilitate wider exploration and perspectives than traditional tutorial settings because, unlike tutorials that tend to be dominated by a few students, student response systems provide anonymity for reticent students, and thus these students tend to be more motivated to contribute (Jones et al., 2006). Empirically, student response systems are related to student engagement, learning, and academic performance (Abdulla, 2018; Klein & Kientz, 2013).

Second, ITSs are “computer learning environments designed to help students master difficult knowledge and skills by implementing powerful intelligent algorithms that adapt to the learner at a fine-grained level and that instantiate complex principles of learning” (Graesser et al., 2018, p. 2). An ITS tracks learners’ knowledge, skills, abilities, and other psychological characteristics and adaptively responds to each learner’s level of subject mastery using powerful algorithms (Graesser et al., 2018). The artificial intelligence underlying ITSs is often based on cognitive science (Sottilare et al., 2014), and ITSs are widely available in STEM subjects (e.g., Cognitive Tutors for algebra and geometry or SHERLOCK for technology and engineering). Some ITSs also use conversational agents, which are talking heads that mimic humans’ verbal and nonverbal interactions and generate adaptive dialogue in response to students’ emotions (e.g., confusion or boredom) and knowledge (e.g., AutoTutor or DeepTutor) (Graesser, 2016).
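
To illustrate the kind of learner modeling such systems perform, the sketch below uses Bayesian knowledge tracing, a standard student-modeling technique; it is offered only as an illustration (the named ITSs above use their own, far richer models), and the parameter values are invented for the example:

```python
# Minimal sketch of Bayesian knowledge tracing (BKT): after each observed
# response, update the probability that the learner has mastered a skill.
# Parameter values are illustrative only, not taken from any real system.

def update_mastery(p_know: float, correct: bool,
                   p_slip: float = 0.1,    # P(wrong answer despite mastery)
                   p_guess: float = 0.2,   # P(correct answer without mastery)
                   p_learn: float = 0.15   # P(acquiring the skill after practice)
                   ) -> float:
    """Return the updated probability that the learner has mastered the skill."""
    if correct:
        evidence = p_know * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_know) * p_guess)
    else:
        evidence = p_know * p_slip
        posterior = evidence / (evidence + (1 - p_know) * (1 - p_guess))
    # Allow for learning between practice opportunities.
    return posterior + (1 - posterior) * p_learn


def next_skill(mastery: dict) -> str:
    """Adaptively select the least-mastered skill to practice next."""
    return min(mastery, key=mastery.get)
```

A full ITS would combine many such per-skill estimates with a pedagogical policy (e.g., presenting items on a skill until its mastery estimate crosses a threshold), which is what makes its responses adaptive at a fine-grained level.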

Third, automatic writing evaluation tools are ‘web-based software applications that offer automated assessment of students’ written work’ (Ryan et al., 2019, p. 6). These tools provide students with immediate feedback on grammar, style, content, and structure, and thus minimize time spent on assessing lower-level writing (Ranalli et al., 2017). Examples of such tools include Criterion, WriteToLearn, WriteLab, and Grammarly.

Fourth, in large classes, instructors can further reduce their assessment workload (or at least reduce the number of tutors required) by utilizing personal feedback at scale in two ways (Ryan et al., 2019). First, instructors can provide feedback via digital or audio recordings instead of text-based feedback. Recorded feedback is not only faster for conveying complex information but also improves the quality of feedback, and is perceived by students as caring and supportive, thus enhancing the relationship between instructor and student (Mahoney et al., 2019). Second, learning analytics involves the use of personalized comments that are tailored to different levels of student interactions. For instance, creating specific comments to students’ interactions with a multiple-choice quiz: one for students who did not complete the quiz, one for students who partially completed the quiz, one for students who completed the quiz once, etc. (Pardo et al., 2019). This personalized feedback approach can lead to a marked improvement in students’ satisfaction and academic performance (Pardo et al., 2019). Pardo et al. (2019) provide an open-source tool called OnTask that can be used to provide personalized feedback at little cost.
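
The rule-based logic behind such tiered comments can be sketched as follows; this is a minimal illustration in the spirit of the approach just described, and the categories and messages are hypothetical rather than OnTask's actual implementation:

```python
# Hypothetical rule-based personalized feedback keyed to quiz interaction
# levels, illustrating the tiered-comment approach described by
# Pardo et al. (2019). Categories and messages are invented for this sketch.

def quiz_feedback(attempted: bool, completed: bool, attempts: int) -> str:
    """Map a student's quiz activity to one of several pre-written comments."""
    if not attempted:
        return ("You haven't started this week's quiz yet. "
                "It takes about ten minutes and previews the exam format.")
    if not completed:
        return ("You started the quiz but didn't finish. "
                "The remaining questions cover material we build on next week.")
    if attempts == 1:
        return ("Well done on completing the quiz. "
                "Take a moment to review any questions you missed.")
    return ("Great persistence: repeated attempts are an effective way "
            "to consolidate the material.")
```

In a tool like OnTask, the instructor writes such conditional messages once and the system merges them with each student's recorded activity, so the whole cohort receives individually tailored feedback without per-student effort.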

Overall, communication systems can create the illusion of a tutorial atmosphere in large lectures, particularly when used with other themes in this review such as syndicates. Instructors can use a combination of student response systems, automatic writing evaluation tools, digital recordings, and learning analytics in order to, not only achieve similar outcomes of the Oxford tutorial, but also to reduce reliance on such a format or possibly replace it entirely depending on the context.

Tailored learning

Tailored learning is used here to refer to courses that tailor skills development to each student via self-directed learning. Here, power is taken out of the hands of instructors and placed into the hands of students (Harrison, 1975). Students must identify their needs, set goals to achieve these needs, and evaluate their progress toward goals through reflection (Robertson, 1987; Stansfield, 1996). Ideally, students should review and revise their needs and goals throughout a course that uses tailored learning (Robertson, 1987). Also, in order to qualify as an alternative to tutorials, peer assessments should be used to assist in identifying skill gaps and developing skills (Robertson, 1987). Using this approach, the instructor provides resource guides (e.g., books, articles, case studies, lectures/films, exercises, etc.) and can utilize peer feedback while maintaining minimal involvement (Harrison, 1975; Robertson, 1987). In fact, instructors need to consciously give students considerable latitude in order to guide students to ‘find out for themselves’ through experimentation, while simultaneously intervening when necessary without hindering the self-directed nature of such approaches—this is not an easy thing to do (Robertson, 1987).

These forms of tailored programs are well-suited to managerial programs/courses that focus on developing skills and competencies such as self-awareness, career development skills, presentation skills, creative thinking, interpersonal skills such as persuasion and communication, counseling, coaching and mentoring, etc. (Robertson, 1987; Stansfield, 1996). Also, such self-development methods can increase motivation to learn, create an atmosphere conducive to learning, promote gains in knowledge and understanding, and develop meta-cognitive skills (Stansfield, 1996).

In spite of the benefits of tailored programs, these programs present certain challenges. Tailored programs that are characterized by self-directed learning (1) require considerable time from students; (2) require instructors to relinquish considerable control over learning and tolerate fairly ‘messy’ learning; (3) can be perceived as lacking direction, particularly at the start of the program; (4) may be unfamiliar to students, which can increase anxiety; and (5) can subject the instructor to mystification because of the limited interactions (Robertson, 1987; Stansfield, 1996). Moreover, students with active learning styles (pragmatists) tend to be more responsive to self-development than students with passive learning styles (theorists) (Stansfield, 1996).

Overall, the use of tailored programs drastically alters the traditional lecture/tutorial model so that tutorials become unnecessary. But, such programs appear to be useful in fairly niche circumstances. Perhaps the principles that underlie these tailored approaches can translate to other non-managerial type courses, but this notion requires further investigation.

Portfolios

A portfolio refers to ‘a purposeful collection of student work that exhibits the student’s efforts, progress, and achievements’ (Paulson et al., 1991, p. 60). In higher education settings, portfolios are typically used for assessment or development and take the form of learning journals or learning diaries (Händel et al., 2020). Students are typically required to write their reflections on content that was taught in class as well as how this content relates to their own behavior (Händel et al., 2020).

In order to qualify as a potential alternative to the Oxford tutorial, portfolios should involve interaction with fellow students in a discussion forum as part of the reflection process (Händel et al., 2020). Discussion in the development of portfolios offers the opportunity for students to exchange thoughts, ideas, solutions, and problems with their peers, and is feasible via e-portfolios (Händel et al., 2020). E-portfolios are digital systems that facilitate portfolios as defined earlier but extend pencil-and-paper portfolios to allow for easier organization and sharing with others to create a learning community (e.g., edublogs) (Aguaded et al., 2013; Händel et al., 2020). E-portfolios can be created via free blogging websites that are easy to use (e.g., Weebly or WordPress).

Portfolios can shift the tutorial mindset from one in which learning is managed externally to one in which students take responsibility for their own learning and build their own knowledge communities (Aguaded et al., 2013; Estienne, 1991). Moreover, portfolios may achieve similar outcomes to tutorials such as deepening understanding and facilitating retention (Händel et al., 2020). Empirically, portfolios have been shown to increase writing self-efficacy, quality of students’ reflections, and academic performance (Händel et al., 2020; Hj Ebil et al., 2020; Schmitz & Perels, 2011). However, the extent to which portfolios work well depends on the degree to which students manage their own learning (self-regulated learning) (Händel et al., 2020).

Overall, a learning portfolio is a helpful learning tool that was not designed to replace tutorials. But, portfolios require that students collaborate and reflect to construct their own knowledge while reducing their reliance on external instruction, and thus can reduce the need for tutorials. Furthermore, portfolios may work synergistically with other themes in this review such as simulations and games, flipped classrooms, online collaborative learning, and tailored learning in particular (Estienne, 1991). Therefore, combining portfolios with one or more of the approaches in this review can potentially replace the Oxford tutorial.

Discussion and conclusion

In this paper, we searched leading management education and development journals for cost-effective ways to redesign the traditional Oxford tutorial model, and then conducted further research on the effectiveness of these alternative models. The results show that higher education teaching and learning is evolving, and this evolution presents intriguing alternatives to the Oxford tutorial model.

The main limitation of this scoping review is that we did not search all CABS journals and the gray literature because of practical considerations. Nonetheless, we observed that themes had reached saturation within the existing search because no new themes emerged later in the screening process. Still, it is likely that other tutorial alternatives exist. On reflection, one idea for further research on this topic is to conduct a database search for ‘small group teaching’. In spite of this limitation, there are a couple of strengths of this scoping review. First, we used a clear protocol as outlined by Arksey and O’Malley (2005) and Levac et al. (2010), and thus maintained a high level of transparency and rigor throughout the process. Second, each included article was reviewed independently by two raters who met to resolve conflicts in the thematic analysis.

Tutorials have a rich history in higher education, and rightfully so. They have been used to discourage docility in learning while teaching students to think for themselves. But, this scoping review shows that there may be cheaper ways to achieve these goals. These cheaper alternatives can still carry the moniker, ‘tutorial’, because conceptually they all align with the definition proposed in this paper. However, for the most part, these tutorial alternatives shift dramatically away from the Oxford tutorial model. Given the evidence presented in this paper, higher education institutions may need to consider transitioning toward these alternative tutorial models, not only to cut expenditure, but also to prevent the ‘jewel in the crown’ from becoming ‘paste’.

Availability of data and materials

The charting form used for the scoping review is Table 3 of the manuscript.

Notes

  1. I would like to acknowledge Darlene Balwant (MSc, BSc, MBPsS) for her assistance as a secondary rater.

References

  1. Abbott, M., Greenwood, C. R., Buzhardt, J., & Tapia, Y. (2006). Using technology-based teacher support tools to scale up the classwide peer tutoring program. Reading & Writing Quarterly, 22(1), 47–64. https://doi.org/10.1080/10573560500203525

  2. Abdulla, M. H. (2018). The use of an online student response system to support learning of Physiology during lectures to medical students. Education and Information Technologies, 23(6), 2931–2946. https://doi.org/10.1007/s10639-018-9752-0

  3. Abeysekera, L., & Dawson, P. (2015). Motivation and cognitive load in the flipped classroom: Definition, rationale and a call for research. Higher Education Research & Development, 34(1), 1–14. https://doi.org/10.1080/07294360.2014.934336

  4. Aguaded, J. I., López-Meneses, E., & Jaén Martínez, A. (2013). University e-portfolios as a new higher education teaching method. The development of a multimedia educational material (MEM). RUSC Revista de Universidad y Sociedad del Conocimiento, 10(1), 7–28/188–209. https://doi.org/10.7238/rusc.v10i1.1333

  5. Akobe, D., Popoola, S. I., Atayero, A. A., Oseni, O. F., & Misra, S. (2019). A web framework for online peer tutoring application in a smart campus. In S. Misra, O. Gervasi, B. Murgante, E. Stankova, V. Korkhov, & C. Torre (Eds.), Computational Science and Its Applications—ICCSA 2019 (Vol. 11623, pp. 316–326). Springer International Publishing. https://doi.org/10.1007/978-3-030-24308-1_26

  6. Arco-Tirado, J. L., Fernández-Martín, F. D., & Fernández-Balboa, J.-M. (2011). The impact of a peer-tutoring program on quality standards in higher education. Higher Education, 62(6), 773–788. https://doi.org/10.1007/s10734-011-9419-x

  7. Arco-Tirado, J. L., Fernández-Martín, F. D., & Hervás-Torres, M. (2019). Evidence-based peer-tutoring program to improve students’ performance at the university. Studies in Higher Education. https://doi.org/10.1080/03075079.2019.1597038

  8. Arksey, H., & O’Malley, L. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology, 8(1), 19–32. https://doi.org/10.1080/1364557032000119616

  9. Barrett, E., & Lally, V. (2000). Meeting new challenges in educational research training: The signposts for educational research CD-ROM. British Educational Research Journal, 26(2), 271–290. https://doi.org/10.1080/01411920050000999

  10. Bonvillian, W. B., & Singer, S. R. (2013). The online challenge to higher education. Issues in Science and Technology, XXIX(4). https://issues.org/the-online-challenge-to-higher-education/. Accessed 12 January 2020

  11. Carless, D., & Boud, D. (2018). The development of student feedback literacy: Enabling uptake of feedback. Assessment & Evaluation in Higher Education, 43(8), 1315–1325. https://doi.org/10.1080/02602938.2018.1463354

  12. Channon, L. D., & Walker, W.-L. (1984). A note on teaching larger ‘small’ groups. Studies in Higher Education, 9(1), 83–86. https://doi.org/10.1080/03075078412331378943

  13. Cheng, C. K., Paré, D. E., Collimore, L.-M., & Joordens, S. (2011). Assessing the effectiveness of a voluntary online discussion forum on improving students’ course performance. Computers & Education, 56(1), 253–261. https://doi.org/10.1016/j.compedu.2010.07.024

  14. Collier, K. G. (1980). Peer-group learning in higher education: The development of higher order skills. Studies in Higher Education, 5(1), 55–62. https://doi.org/10.1080/03075078012331377306

  15. Colquhoun, H. L., Levac, D., O’Brien, K. K., Straus, S., Tricco, A. C., Perrier, L., et al. (2014). Scoping reviews: Time for clarity in definition, methods, and reporting. Journal of Clinical Epidemiology, 67(12), 1291–1294. https://doi.org/10.1016/j.jclinepi.2014.03.013

  16. Commission of Inquiry. (1997). Commission of Inquiry Report. https://www.admin.ox.ac.uk/coi/commissionofinquiryreport/. Accessed 23 December 2019

  17. Corlett, S. (1971). Alternative methods of teaching business policy. Management Education and Development, 2(2), 64–76. https://doi.org/10.1177/135050767100200203

  18. Cortese, C. G. (2005). Learning through teaching. Management Learning, 36(1), 87–115. https://doi.org/10.1177/1350507605049905

  19. Curry, B. U., & Moutinho, L. (1992). Using computer simulations in management education. Management Education and Development, 23(2), 155–167. https://doi.org/10.1177/135050769202300212

  20. Curzon, G. (1909). Principles and Methods of University Reform. The Clarendon Press. http://www.educationengland.org.uk/documents/curzon1909/curzon.html#11

  21. Dancer, D., Morrison, K., & Tarr, G. (2015). Measuring the effects of peer learning on students’ academic achievement in first-year business statistics. Studies in Higher Education, 40(10), 1808–1828. https://doi.org/10.1080/03075079.2014.916671

  22. De Jong, T. (2006). Computer simulations: Technological advances in inquiry learning. Science, 312(5773), 532–533. https://doi.org/10.1126/science.1127750

  23. DeNeve, K. M., & Heppner, M. J. (1997). Role play simulations: The assessment of an active learning technique and comparisons with traditional lectures. Innovative Higher Education, 21(3), 231–246. https://doi.org/10.1007/BF01243718

  24. Dochy, F., Segers, M., & Sluijsmans, D. (1999). The use of self-, peer and co-assessment in higher education: A review. Studies in Higher Education, 24(3), 331–350. https://doi.org/10.1080/03075079912331379935

  25. Dysthe, O. (2002). The learning potential of a web-mediated discussion in a university course. Studies in Higher Education, 27(3), 339–352. https://doi.org/10.1080/03075070220000716

  26. Estienne, M. (1991). A personal development file: Self-development among business studies students. Management Education and Development, 22(1), 15–22. https://doi.org/10.1177/135050769102200102

  27. Evans, M. J., & Moore, J. S. (2013). Peer tutoring with the aid of the Internet. British Journal of Educational Technology, 44(1), 144–155. https://doi.org/10.1111/j.1467-8535.2011.01280.x

  28. Exley, K., & Dennick, R. (2004). Small Group Teaching: Tutorials, Seminars and Beyond. Taylor & Francis. https://doi.org/10.4324/9780203465066

  29. Falchikov, N., & Goldfinch, J. (2000). Student peer assessment in higher education: A meta-analysis comparing peer and teacher marks. Review of Educational Research, 70(3), 287–322. https://doi.org/10.3102/00346543070003287

  30. Fazackerley, A. (2019). “It’s a dangerous time”: Can UK and US universities survive funding cuts? The Guardian. https://www.theguardian.com/education/2019/may/09/its-a-dangerous-time-can-uk-and-us-universities-survive-funding-cuts. Accessed 12 January 2020

  31. Fox, A. (2013). From MOOCs to SPOCs. Communications of the ACM, 56(12), 38–40. https://doi.org/10.1145/2535918

  32. Frankham, J. (1998). Peer Education: The unauthorised version. British Educational Research Journal, 24(2), 179–193. https://doi.org/10.1080/0141192980240205

  33. Gibbs, I., & Harland, J. (1987). Approaches to teaching in colleges of higher education. British Educational Research Journal, 13(2), 159–173. https://doi.org/10.1080/0141192870130205

  34. Gibson, W., Hall, A., & Callery, P. (2006). Topicality and the structure of interactive talk in face-to-face seminar discussions: Implications for research in distributed learning media. British Educational Research Journal, 32(1), 77–94. https://doi.org/10.1080/01411920500402029

  35. Goodlad, S., Abidi, A., Anslow, P., & Harris, J. (1979). The Pimlico Connection: Undergraduates as tutors in schools. Studies in Higher Education, 4(2), 191–201. https://doi.org/10.1080/03075077912331376967

  36. Graesser, A. C. (2016). Conversations with AutoTutor help students learn. International Journal of Artificial Intelligence in Education, 26(1), 124–132. https://doi.org/10.1007/s40593-015-0086-4

  37. Graesser, A. C., Hu, X., Nye, B. D., VanLehn, K., Kumar, R., Heffernan, C., et al. (2018). ElectronixTutor: An intelligent tutoring system with multiple learning resources for electronics. International Journal of STEM Education, 5(1), 15. https://doi.org/10.1186/s40594-018-0110-y

  38. Händel, M., Wimmer, B., & Ziegler, A. (2020). E-portfolio use and its effects on exam performance—a field study. Studies in Higher Education, 45(2), 258–270. https://doi.org/10.1080/03075079.2018.1510388

  39. Hanrahan, S. J., & Isaacs, G. (2001). Assessing self- and peer-assessment: The students’ views. Higher Education Research & Development, 20(1), 53–70. https://doi.org/10.1080/07294360123776

  40. Harrison, R. (1975). An experiment in self directed learning. Management Education and Development, 6(1), 19–25. https://doi.org/10.1177/135050767500600103

  41. Havnes, A. (2008). Peer-mediated learning beyond the curriculum. Studies in Higher Education, 33(2), 193–204. https://doi.org/10.1080/03075070801916344

  42. Heikkilä, A., & Lonka, K. (2006). Studying in higher education: Students’ approaches to learning, self-regulation, and cognitive strategies. Studies in Higher Education, 31(1), 99–117. https://doi.org/10.1080/03075070500392433

  43. Hj Ebil, S., Salleh, S. M., & Shahrill, M. (2020). The use of E-portfolio for self-reflection to promote learning: a case of TVET students. Education and Information Technologies, 25(6), 5797–5814. https://doi.org/10.1007/s10639-020-10248-7

  44. Hornsby, D. J., & Osman, R. (2014). Massification in higher education: Large classes and student learning. Higher Education, 67(6), 711–719. https://doi.org/10.1007/s10734-014-9733-1

  45. Hwang, G.-J., Lai, C.-L., & Wang, S.-Y. (2015). Seamless flipped learning: A mobile technology-enhanced flipped classroom with effective learning strategies. Journal of Computers in Education, 2(4), 449–473. https://doi.org/10.1007/s40692-015-0043-0

  46. Indriasari, T. D., Luxton-Reilly, A., & Denny, P. (2020). Gamification of student peer review in education: A systematic literature review. Education and Information Technologies, 25(6), 5205–5234. https://doi.org/10.1007/s10639-020-10228-x

  47. Jackson, M. W., & Prosser, M. T. (1989). Less lecturing, more learning. Studies in Higher Education, 14(1), 55–68. https://doi.org/10.1080/03075078912331377612

  48. Jones, C., Connolly, M., Gear, A., & Read, M. (2006). Collaborative learning with group interactive technology: A case study with postgraduate students. Management Learning, 37(3), 377–396. https://doi.org/10.1177/1350507606067173

  49. Jones, R. J., & Andrews, H. (2019). Understanding the rise of faculty–student coaching: An academic capitalism perspective. Academy of Management Learning & Education, 18(4), 606–625. https://doi.org/10.5465/amle.2017.0200

  50. Kim, M. K., Kim, S. M., Khera, O., & Getman, J. (2014). The experience of three flipped classrooms in an urban university: An exploration of design principles. The Internet and Higher Education, 22, 37–50. https://doi.org/10.1016/j.iheduc.2014.04.003

  51. Klein, K., & Kientz, M. (2013). A model for successful use of student response systems. Nursing Education Perspectives, 34(5), 334–338. https://doi.org/10.5480/1536-5026-34.5.334

  52. Kniveton, B. H. (1992). The impact of group size on the behaviour and involvement of male mature students in tutorless seminars. British Educational Research Journal, 18(3), 287–296. https://doi.org/10.1080/0141192920180306

  53. Kolb, D. A. (1984). Experiential Learning: Experience as the Source of Learning and Development. (1st ed.). Prentice Hall.

  54. Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory Into Practice, 41(4), 212–218. https://doi.org/10.1207/s15430421tip4104_2

  55. Lacher, L. L., & Biehl, C. (2019). Investigating team effectiveness using Discord: A case study using a gaming collaboration tool for the CS classroom. Presented at the International Conference on Frontiers in Education: Computer Science and Computer Engineering, Las Vegas, USA.

  56. Leemkuil, H., & de Jong, T. (2012). Adaptive advice in learning with a computer-based knowledge management simulation game. Academy of Management Learning & Education, 11(4), 653–665. https://doi.org/10.5465/amle.2010.0141

  57. Levac, D., Colquhoun, H., & O’Brien, K. K. (2010). Scoping studies: Advancing the methodology. Implementation Science, 5(1), 69. https://doi.org/10.1186/1748-5908-5-69

  58. Li, H., Xiong, Y., Zang, X., Kornhaber, M. L., Lyu, Y., Chung, K. S., & Suen, H. K. (2016). Peer assessment in the digital age: A meta-analysis comparing peer and teacher ratings. Assessment & Evaluation in Higher Education, 41(2), 245–264. https://doi.org/10.1080/02602938.2014.999746

  59. Light, V., Nesbitt, E., Light, P., & Burns, J. R. (2000). “Let’s you and me have a little discussion”: Computer mediated communication in support of campus-based university courses. Studies in Higher Education, 25(1), 85–96. https://doi.org/10.1080/030750700116037

  60. Lo, C. K. (2018). Grounding the flipped classroom approach in the foundations of educational technology. Educational Technology Research and Development, 66(3), 793–811. https://doi.org/10.1007/s11423-018-9578-x

  61. Longfellow, E., May, S., Burke, L., & Marks-Maran, D. (2008). ‘They had a way of helping that actually helped’: A case study of a peer-assisted learning scheme. Teaching in Higher Education, 13(1), 93–105. https://doi.org/10.1080/13562510701794118

  62. Lopez, M., & Elton, L. (1980). A course taught through a learning centre: An evaluation. Studies in Higher Education, 5(1), 91–99. https://doi.org/10.1080/03075078012331377366

  63. López-Pellisa, T., Rotger, N., & Rodríguez-Gallego, F. (2020). Collaborative writing at work: Peer feedback in a blended learning environment. Education and Information Technologies. https://doi.org/10.1007/s10639-020-10312-2

  64. Lueg, R., Lueg, K., & Lauridsen, O. (2016). Aligning seminars with Bologna requirements: Reciprocal peer tutoring, the solo taxonomy and deep learning. Studies in Higher Education, 41(9), 1674–1691. https://doi.org/10.1080/03075079.2014.1002832

  65. Lundy, J. (1991). Cognitive learning from games: Student approaches to business games. Studies in Higher Education, 16(2), 179–188. https://doi.org/10.1080/03075079112331382964

  66. Lynn, N., & Taylor, J. E. (1993). Personal and business skills development: A project-based approach at the University of Salford. Studies in Higher Education, 18(2), 137–150. https://doi.org/10.1080/03075079312331382329

  67. Magin, D. J. (1982). Collaborative peer learning in the laboratory. Studies in Higher Education, 7(2), 105–117. https://doi.org/10.1080/03075078212331379191

  68. Magin, D. J., & Churches, A. E. (1995). Peer tutoring in engineering design: A case study. Studies in Higher Education, 20(1), 73–85. https://doi.org/10.1080/03075079512331381810

  69. Mahoney, P., Macfarlane, S., & Ajjawi, R. (2019). A qualitative synthesis of video feedback in higher education. Teaching in Higher Education, 24(2), 157–179. https://doi.org/10.1080/13562517.2018.1471457

70. Markham, F. M. H. (1967). Oxford (1st ed.). Weidenfeld & Nicolson.

  71. Martin, P. (1988). Self development groups in the context of a structured management development programme. Management Education and Development, 19(4), 281–297. https://doi.org/10.1177/135050768801900401

72. Marton, F., & Saljo, R. (1997). Approaches to learning. In F. Marton, D. Hounsell, & N. J. Entwistle (Eds.), The experience of learning (3rd (Internet) ed., pp. 39–58). University of Edinburgh, Centre for Teaching, Learning, and Assessment. http://www.ed.ac.uk/schools-departments/institute-academic-development/learning-teaching/staff/advice/researching/publications/experience-of-learning

  73. McConlogue, T. (2015). Making judgements: Investigating the process of composing and receiving peer feedback. Studies in Higher Education, 40(9), 1495–1506. https://doi.org/10.1080/03075079.2013.868878

  74. Mcconnell, D. (1994). Managing open learning in computer supported collaborative learning environments. Studies in Higher Education, 19(3), 341–358. https://doi.org/10.1080/03075079412331381920

  75. Mehring, J. (2016). Present research on the flipped classroom and potential tools for the EFL classroom. Computers in the Schools, 33(1), 1–10. https://doi.org/10.1080/07380569.2016.1139912

  76. Mills, D., & Alexander, P. (2013, March). Small group teaching: A toolkit for learning. The Higher Education Academy. https://www.heacademy.ac.uk/sites/default/files/resources/Small_group_teaching_1.pdf

  77. Minalla, A. A. (2018). The effect of Whatsapp chat group in enhancing EFL learners’ verbal interaction outside classroom contexts. English Language Teaching, 11(3), 1–7

  78. Moore, W. G. (1968). The Tutorial System and Its Future. Pergamon Press.

  79. Morales, E. E., Ambrose-Roman, S., & Perez-Maldonado, R. (2016). Transmitting success: Comprehensive peer mentoring for at-risk students in developmental math. Innovative Higher Education, 41(2), 121–135. https://doi.org/10.1007/s10755-015-9335-6

  80. Neck, H. M., & Greene, P. G. (2011). Entrepreneurship education: Known worlds and new frontiers. Journal of Small Business Management, 49(1), 55–70. https://doi.org/10.1111/j.1540-627X.2010.00314.x

  81. Nicol, D. (2010). From monologue to dialogue: Improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5), 501–517. https://doi.org/10.1080/02602931003786559

  82. Nicol, D., & Boyle, J. T. (2003). Peer instruction versus class-wide discussion in large classes: A comparison of two interaction methods in the wired classroom. Studies in Higher Education, 28(4), 457–473. https://doi.org/10.1080/0307507032000122297

83. Nouri, J. (2016). The flipped classroom: For active, effective and increased learning—especially for low achievers. International Journal of Educational Technology in Higher Education, 13(1), 33. https://doi.org/10.1186/s41239-016-0032-z

  84. O’Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education, 25, 85–95. https://doi.org/10.1016/j.iheduc.2015.02.002

  85. Orsmond, P., Merry, S., & Callaghan, A. (2013). Communities of practice and ways to learning: Charting the progress of biology undergraduates. Studies in Higher Education, 38(6), 890–906. https://doi.org/10.1080/03075079.2011.606364

  86. Palfreyman, D. (Ed.). (2008). The Oxford Tutorial. The Oxford Centre for Higher Education Policy Studies.

  87. Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback: Learning analytics to scale personalised feedback. British Journal of Educational Technology, 50(1), 128–138. https://doi.org/10.1111/bjet.12592

  88. Pasin, F., & Giroux, H. (2011). The impact of a simulation game on operations management education. Computers & Education, 57(1), 1240–1254. https://doi.org/10.1016/j.compedu.2010.12.006

89. Paulson, F. L., Paulson, P. R., & Meyer, C. A. (1991). What makes a portfolio a portfolio? Educational Leadership, 48(5), 60–63

  90. Pee, L. G. (2020). Enhancing the learning effectiveness of ill-structured problem solving with online co-creation. Studies in Higher Education, 45(11), 2341–2355. https://doi.org/10.1080/03075079.2019.1609924

  91. Pham, M. T., Rajić, A., Greig, J. D., Sargeant, J. M., Papadopoulos, A., & McEwen, S. A. (2014). A scoping review of scoping reviews: Advancing the approach and enhancing the consistency. Research Synthesis Methods, 5(4), 371–385. https://doi.org/10.1002/jrsm.1123

  92. Pittaway, L., & Cope, J. (2007). Simulating entrepreneurial learning: Integrating experiential and collaborative approaches to learning. Management Learning, 38(2), 211–233. https://doi.org/10.1177/1350507607075776

  93. Price, C., & Walker, M. (2019). Improving the accessibility of foundation statistics for undergraduate business and management students using a flipped classroom. Studies in Higher Education. https://doi.org/10.1080/03075079.2019.1628204

94. Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in higher education (1st ed.). SRHE and Open University Press.

  95. Ranalli, J., Link, S., & Chukharev-Hudilainen, E. (2017). Automated writing evaluation for formative assessment of second language writing: Investigating the accuracy and usefulness of feedback as part of argument-based validation. Educational Psychology, 37(1), 8–25. https://doi.org/10.1080/01443410.2015.1136407

  96. Rees, E. L., Quinn, P. J., Davies, B., & Fotheringham, V. (2016). How does peer teaching compare to faculty teaching? A systematic review and meta-analysis. Medical Teacher, 38(8), 829–837. https://doi.org/10.3109/0142159X.2015.1112888

  97. Reid, D. J., Zhang, J., & Chen, Q. (2003). Supporting scientific discovery learning in a simulation environment: Learning in a simulation environment. Journal of Computer Assisted Learning, 19(1), 9–20. https://doi.org/10.1046/j.0266-4909.2003.00002.x

  98. Roach, K., & Hammond, R. (1976). Zoology by self-instruction. Studies in Higher Education, 1(2), 179–196. https://doi.org/10.1080/03075077612331376739

  99. Robertson, G. (1987). How “self” directed is self-directed learning? Management Education and Development, 18(2), 75–87. https://doi.org/10.1177/135050768701800201

  100. Ryan, T., French, S., & Kennedy, G. (2019). Beyond the Iron Triangle: improving the quality of teaching and learning at scale. Studies in Higher Education. https://doi.org/10.1080/03075079.2019.1679763

  101. Saunders, D. (1992). Peer tutoring in higher education. Studies in Higher Education, 17(2), 211–218. https://doi.org/10.1080/03075079212331382677

  102. Schellens, T., & Valcke, M. (2005). Collaborative learning in asynchronous discussion groups: What about the impact on cognitive processing? Computers in Human Behavior, 21(6), 957–975. https://doi.org/10.1016/j.chb.2004.02.025

  103. Schmitz, B., & Perels, F. (2011). Self-monitoring of self-regulation during math homework behaviour using standardized diaries. Metacognition and Learning, 6(3), 255–273. https://doi.org/10.1007/s11409-011-9076-6

104. Simmons, E. L. (2017). Evolution in business simulations: A review of the SimVenture Evolution platform (www.simventure.co.uk), created by Paul and Peter Harrington. Academy of Management Learning & Education, 16(4), 629–632. https://doi.org/10.5465/amle.2017.0284

  105. Smith, T. (2008). Integrating undergraduate peer mentors into liberal arts courses: A pilot study. Innovative Higher Education, 33(1), 49–63. https://doi.org/10.1007/s10755-007-9064-6

106. Sottilare, R., Graesser, A. C., Hu, X., & Goldberg, B. (2014). Design recommendations for intelligent tutoring systems: Instructional management (Vol. 2). U.S. Army Research Laboratory.

  107. Squires, G. (1983). Innovation through recession: An overview. Studies in Higher Education, 8(1), 71–77. https://doi.org/10.1080/03075078312331379131

  108. Stansfield, L. M. (1996). Is self-development the key to the future?: Participant views of self-directed and experiential learning methods. Management Learning, 27(4), 429–445. https://doi.org/10.1177/1350507696274003

  109. Stefani, L. A. J. (1994). Peer, self and tutor assessment: Relative reliabilities. Studies in Higher Education, 19(1), 69–75. https://doi.org/10.1080/03075079412331382153

110. Sweeney, J., O’Donoghue, T., & Whitehead, C. (2004). Traditional face-to-face and web-based tutorials: A study of university students’ perspectives on the roles of tutorial participants. Teaching in Higher Education, 9(3), 311–323. https://doi.org/10.1080/1356251042000216633

  111. Tai, J., Ajjawi, R., Boud, D., Dawson, P., & Panadero, E. (2018). Developing evaluative judgement: Enabling students to make decisions about the quality of work. Higher Education, 76(3), 467–481. https://doi.org/10.1007/s10734-017-0220-3

  112. Topping, K. J. (2005). Trends in peer learning. Educational Psychology, 25(6), 631–645. https://doi.org/10.1080/01443410500345172

  113. Tribe, D. M., & Tribe, A. J. (1987). Lawteach: An interactive method for effective large group teaching. Studies in Higher Education, 12(3), 299–310. https://doi.org/10.1080/03075078712331378082

  114. Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education, 14(1), 22. https://doi.org/10.1186/s41239-017-0062-1

  115. Webb, E., Jones, A., Barker, P., & van Schaik, P. (2004). Using e-learning dialogues in higher education. Innovations in Education and Teaching International, 41(1), 93–103. https://doi.org/10.1080/1470329032000172748

116. Whitaker, J., New, J. R., & Ireland, R. D. (2016). MOOCs and the online delivery of business education. What’s new? What’s not? What now? Academy of Management Learning & Education, 15(2), 345–365. https://doi.org/10.5465/amle.2013.0021

  117. Winters, F. I., & Alexander, P. A. (2011). Peer collaboration: The relation of regulatory behaviors to learning with hypermedia. Instructional Science, 39(4), 407–427. https://doi.org/10.1007/s11251-010-9134-5

  118. Yorke, M. (2003). Formative assessment in higher education: Moves towards theory and the enhancement of pedagogic practice. Higher Education, 45(4), 477–501. https://doi.org/10.1023/A:1023967026413

  119. Zhu, E. (1998). Learning and mentoring: Electronic discussion in a distance learning course. In C. J. Bonk & K. S. King (Eds.), Electronic Collaborators: Learner-Centred Technologies for Literacy, Apprenticeship, and Discourse. (pp. 159–183). Lawrence Erlbaum Associates.

Acknowledgements

Not applicable.

Funding

The authors of this manuscript received no specific funding for this work.

Author information

Corresponding author

Correspondence to Paul Tristen Balwant.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Balwant, P.T., Doon, R. Alternatives to the conventional ‘Oxford’ tutorial model: a scoping review. Int J Educ Technol High Educ 18, 29 (2021). https://doi.org/10.1186/s41239-021-00265-y

Keywords

  • Deep learning
  • Learning
  • Scoping review
  • Teaching
  • Tutorial