Are open educational resources (OER) and practices (OEP) effective in improving learning achievement? A meta-analysis and research synthesis
International Journal of Educational Technology in Higher Education volume 20, Article number: 54 (2023)
While several studies have investigated the various effects of open educational resources (OER) and open educational practices (OEP), few have focused on their connection to learning achievement. The related scientific literature is divided about the effects of OER and OEP with regard to their contribution to learning achievement. To address this tension, a meta-analysis and research synthesis of 25 studies (N = 119,840 participants) was conducted to quantitatively investigate the effects of OER and OEP on students’ learning achievement. The analysis included course subject, level of education, intervention duration, sample size, geographical distribution, and research design as moderating variables of the obtained effects. The findings revealed that OER and OEP have a significant yet negligible (g = 0.07, p < 0.001) effect. Additionally, the analysis found that the obtained effect can be moderated by several variables, including course subject, level of education, and geographical distribution. The study findings can help various stakeholders (e.g., educators, instructional designers, or policy makers) understand what might hinder the effect of OER and OEP on learning achievement, hence supporting better learning outcomes and more effective interventions.
Open educational resources and practices
The term Open Educational Resources (OER) was first coined at UNESCO’s 2002 Forum on Open Courseware, and it was defined in the recent UNESCO Recommendation on OER as “learning, teaching, and research materials in any format and medium that reside in the public domain or are under copyright that have been released under an open license that permit no-cost access, reuse, repurpose, adaptation, and redistribution by others” (UNESCO, 2019). Several studies have since reported the advantages of OER in reducing learning costs (Hilton, 2016), increasing accessibility to educational resources, including for students with disabilities (Zhang et al., 2020a), and enhancing learning quality (Yuan & Recker, 2015; Weller et al., 2015; Zhang et al., 2020b). Wiley (2014) further outlined five key characteristics of using OER, also known as the 5Rs, namely: (1) retain—each person has the right to make and own copies of the published resource; (2) reuse—each person has the right to use the educational resource in different ways depending on the learning context (e.g., formal or informal learning); (3) revise—each person has the right to revise the educational resource for different purposes (e.g., adapting it to a learning context or enhancing it); (4) remix—each person has the right to create a new educational resource by combining existing learning content; and (5) redistribute—each person has the right to share with others copies of the original, revised, or remixed educational resource. The 5Rs can support innovation in teaching and learning, since OER can be created, used, shared, and repurposed in ways that traditionally copyrighted educational materials cannot.
Building on the idea of innovation in educational resources and the idea of openness in education (Bozkurt et al., 2023), the Open e-Learning Content Observatory Services (OLCOS) functions as a Transversal Action under the European eLearning Programme and is committed to advancing the creation, sharing, and global utilization of OER (OLCOS, 2007). In 2007, OLCOS conducted a roadmap study that emphasized the significance of integrating innovative teaching methods with OER (OLCOS, 2007). The project underscores that merely delivering OER within traditional teacher-centered frameworks might not sufficiently prepare individuals for educational success. It advocates for the incorporation of innovative educational practices alongside OER, and notably introduced the concept of Open Educational Practices (OEP). Based on this perspective, OEP can be defined as OER-enabled pedagogies, or “the set of teaching and learning practices that are only possible or practical in the context of the 5R permissions which are characteristic of OER” (Wiley & Hilton III, 2018, p. 135; cf. Bali et al., 2020). Ehlers (2011, p. 4) defined OEP as “practices which support the (re)use and production of Open Educational Resources through institutional policies, promote innovative pedagogical models, and respect and empower learners as co-producers on their lifelong learning paths.” In a comprehensive review, Huang et al. (2020) identified five dimensions for the possible implementation of OEP, namely: OER, open teaching, open collaboration, open assessment, and facilitating technologies. Some research suggests that these practices can help enhance learning quality, access, and effectiveness in universities. Given this positive potential, the adoption of OER and OEP in education has increased rapidly over the past years.
A significant moment in the history of open education came with the UNESCO (2019) Recommendation on OER, which provides strategic policy support for the uptake and monitoring of OER. Accordingly, the recommendation calls upon member states to develop national policies for the adoption of OER, including activities such as creating guidelines and strategies to incorporate OER within educational institutions or facilitating the generation and sharing of OER materials among educators. This recommendation has drawn considerable attention and investment to OER and OEP projects without certainty about their positive effects. At present, despite the great potential of OER and OEP in education, a majority of educators remain unaware of the transformative potential of open practice; some consider OEP to be one of the most significant teaching forms of the twenty-first century (Shear et al., 2015), while others are oblivious to its existence. It is also important to note that OEP is not an orthodoxy so much as a concept that can be realized in a multitude of different ways.
Research gap and study objectives
Dotson and Foley (2017) emphasized that changing the curriculum content (i.e., from proprietary to open) does not produce a change in students’ learning achievement. Harvey and Bond (2022) also argued that there is a need to investigate whether a change in learning content licensing has an impact on students’ learning achievement. Despite a growing body of evidence regarding the effectiveness of OER and OEP in learning, open education research has focused on other variables (e.g., affordability, accessibility). Less attention has been paid to whether OER and OEP can enhance students’ learning achievement compared to traditionally copyrighted materials (Robinson, 2015). For instance, Hilton (2016) conducted a systematic review of articles on OER, learning achievement, and perception written between 2002 and August 2015, and found that only seven of sixteen studies focused on learning achievement. In a follow-up systematic review of twenty-nine OER-focused articles written between September 2015 and December 2018, only nine new learning achievement studies were obtained (Hilton, 2020). This reflects the limited attention paid to the relationship between OER/OEP and learning achievement since 2002. Moreover, the literature on the effects of OER and OEP on students’ learning achievement is divided: some studies reported positive effects (e.g., Colvard et al., 2018), no effects (e.g., Fortney, 2021; Grissett & Huffman, 2019), or even negative effects (e.g., Gurung, 2017), implying that some students who used traditionally copyrighted materials performed better than those who used OER.
The question of the relative efficacy of OER and OEP remains open. The main rationale for this study, therefore, is to examine whether OER and OEP can enhance learning achievement. Two systematic reviews (Hilton, 2016, 2020) attempted to investigate the above-mentioned phenomenon; however, they were purely qualitative, and their results did not effectively reveal the effects of OER and OEP on learning achievement. Clinton and Khan (2019) conducted a meta-analysis related to this topic; however, it investigated only the effect of open textbooks on post-secondary students’ learning achievement in the USA and Canada. Consequently, the previously obtained results do not reflect a comprehensive and in-depth investigation of the effect of OER and OEP on learning achievement.
This present investigation aims at a more in-depth coverage of the current literature by including a range of types of OER (e.g., textbooks, videos, etc.) across many countries and educational levels. Smith (2013) highlighted the importance of researching improvements in achievement and attainment associated with OER, urging further investigation into interventions that could yield significant enhancements in educational outcomes. In the same vein, Hilton (2020) suggested conducting more sophisticated meta-analyses, in which effect sizes across studies are calculated, to understand the measurable effect of OER on learning achievement. In response, this study employs a systematic analysis of the OER/OEP literature to comprehensively investigate whether the data support the hypothesis that the use of OER and OEP can improve students’ learning achievement in a range of subjects. Therefore, to address this research gap, this study consisted of a meta-analysis and research synthesis of the relevant literature to provide quantitative evidence on the effects of OER and OEP on learning achievement. Meta-analysis uses statistical methods to accurately measure the effect of a given intervention and the associated moderators of this effect (Rosenthal & DiMatteo, 2001).
Additionally, several studies reported that the effects of OER and OEP on learning achievement might vary due to different confounders, such as demographic information, the type of course delivered, educational level (grade), and intervention duration, among others (e.g., Hilton, 2016, 2020). Therefore, the present study takes a step forward by analyzing whether these variables moderate the effect of OER and OEP on learning achievement. Specifically, this study addressed the following research questions:
RQ1. What is the effect of OER and OEP on students’ learning achievement?
RQ2. How does the effect of OER and OEP on students’ learning achievement vary according to the educational subject?
RQ3. How does the effect of OER and OEP on students’ learning achievement vary according to the educational level?
RQ4. How does the effect of OER and OEP on students’ learning achievement vary according to the intervention duration?
RQ5. How does the effect of OER and OEP on students’ learning achievement vary according to the sample size?
RQ6. How does the effect of OER and OEP on students’ learning achievement vary according to geographical distribution of students?
RQ7. How does the effect of OER and OEP on students’ learning achievement vary according to the research design?
This study identifies the effects of using OER and OEP on learning achievement through meta-analysis. To ensure the selection of the most relevant literature to be meta-analyzed, the researchers of the current study followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines (Page et al., 2021). Additionally, the researchers followed the recommendations outlined by Kitchenham and Charters (2007). This procedure suggests three stages, namely: planning, conducting, and reporting the review. Although these guidelines were originally proposed for conducting systematic reviews, they have been successfully employed in meta-analyses (e.g., Garzón et al., 2019). All the processes related to the selection and codification of the studies were carried out by two coders.
Planning the review
To ensure that only relevant studies were retrieved (recall) while maintaining a high precision rate (Ting, 2010), “open educational resources” and “open educational practices” were used as search keywords. The abbreviations OER and OEP were deliberately not used as search keywords because, in scientific writing, the full name of a term is provided before its abbreviation is used. The search process was undertaken in the following databases: Web of Science, Scopus, Taylor and Francis, and ERIC. These databases were selected because they are popular in the field of educational technology (Bedenlier et al., 2020; Wang et al., 2023). ERIC, in particular, focuses on educational science, especially OER (Otto et al., 2021), and Scopus is known as the largest database of scholarly publications. The publication interval was from 2012 up to 2023. The year 2012 was selected as the starting date because it marked the release of the “UNESCO Paris OER Declaration”, which urged governments to promote the use of OER and called for publicly funded educational materials to be released in a freely reusable form. As a result, several OER initiatives were launched worldwide, which catalyzed the development of the OER field. Due to the novelty of the topic, conference papers and doctoral dissertations were considered for inclusion in the research corpus, as suggested by several studies (e.g., Chen et al., 2020; Denden et al., 2022).
The search was conducted on April 4, 2023, at which date the researchers were able to identify 643 studies (Web of Science: 117, Scopus: 38, Taylor and Francis: 262, and ERIC: 226). After eliminating duplicates (n = 324), a total of 319 publications were selected for further analysis. The first filter was based on each article’s title and keywords. This process allowed us to identify and remove 75 papers that were not relevant to the purpose of this present study. Then, the abstracts of the remaining 244 papers were read and analyzed comprehensively. This process allowed us to remove 135 papers that were not relevant. Finally, we analyzed the remaining 109 studies against the following criteria: (1) empirical studies, (2) studies that specifically used OER or OEP, and (3) studies that provided sufficient information (i.e., mean, median, standard deviation) to calculate the effect size.
Therefore, a study was excluded if (1) it was not empirical research, (2) it did not focus on using OER or OEP, (3) it was qualitative or review research, (4) it did not provide sufficient information to calculate the effect size, or (5) it was not written in English. This process limited the corpus for investigation to 25 papers (23 journal papers, 1 conference paper and 1 PhD dissertation) to be further examined and included in the analysis. At the end of this process, the reference section of each paper was then reviewed. However, this process did not provide additional studies. Figure 1 shows the PRISMA flowchart (Page et al., 2021) of the study selection process, where inter-rater reliability in each phase was above 0.7, which is considered very good (Cohen, 1960).
Conducting the review
This stage included the coding scheme for the data extraction process. In an effort to minimize the potential for bias, an online electronic data extraction form was designed (Kitchenham & Charters, 2007). To answer the aforementioned research questions, the following information was coded in each study: (1) OER type: the type of resource used for teaching, such as textbooks, videos, etc.; (2) course subject: the subject taught using OER and OEP, such as mathematics, psychology, etc.; (3) educational level: the student grade in which OER and OEP were used, such as primary, bachelors, etc.; (4) intervention duration: the length of time over which OER and OEP were used (i.e., course duration); (5) sample size: the number of participants in each study. Following Cheung and Slavin (2016), sample sizes were classified as small, where the number of participants is less than or equal to 250, or large, where the number of participants is greater than 250; (6) region: the region (country) where the experiment was conducted; and (7) research design: the research design followed when conducting the experiment.
Calculation of the effect size
Comprehensive Meta-Analysis V.4 (Borenstein, 2022) software was used to conduct the present meta-analysis. Additionally, Hedges’ g was used to calculate the effect sizes (Hedges, 1981). Hedges’ g was preferred over Cohen’s d because differences in sample size between studies can bias the estimated effect size; this bias particularly affects studies with a sample size smaller than 20, for which Hedges’ g provides more reliable estimates than Cohen’s d (Hedges & Olkin, 1985). Eleven studies followed the pretest–posttest-control (PPC) research design, in which students are randomly assigned to experimental and control treatments and are evaluated before and after the treatment. As stated by Morris (2008), this design provides better results regarding the accuracy of d values and the control of threats to internal validity. The remaining fourteen studies followed the posttest-only-with-control (POWC) design, in which students are assigned to experimental and control treatments and assessed only once, after the treatment (i.e., learning using OER or OEP).
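As a minimal illustration of the calculation (this is a sketch, not the routine implemented in the CMA software, and the group statistics in the example are hypothetical), Hedges’ g applies a small-sample correction factor J to Cohen’s d computed from the two groups’ means, standard deviations, and sample sizes:

```python
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across treatment (OER/OEP) and control groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp             # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)      # correction factor J
    return j * d                           # Hedges' g

# Hypothetical example: OER group vs. traditional-textbook group
g = hedges_g(75.0, 10.0, 30, 73.0, 10.0, 30)
```

A positive g indicates the OER/OEP group scored higher; the correction slightly shrinks d toward zero, which matters most for the small-sample studies mentioned above.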
According to the guidelines provided by Thalheimer and Cook (2002) for interpreting effect size, an effect size is negligible if − 0.15 < g < 0.15; small if 0.15 ≤ g < 0.40; medium if 0.40 ≤ g < 0.75; large if 0.75 ≤ g < 1.10; very large if 1.10 ≤ g < 1.45; and huge if g ≥ 1.45. Additionally, to test for heterogeneity in the effect sizes across the reviewed studies, the Q and I2 statistics were evaluated (Konstantopoulos & Hedges, 2019). Specifically, a preplanned analysis was conducted to investigate whether the field of education, the level of education, or the learning setting influenced the overall average effect size.
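The interpretation thresholds and the heterogeneity statistics above can be sketched as follows (a simplified illustration, assuming per-study variances are available; the label function applies the thresholds to the magnitude of g):

```python
def interpret_g(g):
    """Effect-size label per the Thalheimer and Cook (2002) thresholds."""
    a = abs(g)
    if a < 0.15:
        return "negligible"
    if a < 0.40:
        return "small"
    if a < 0.75:
        return "medium"
    if a < 1.10:
        return "large"
    if a < 1.45:
        return "very large"
    return "huge"

def heterogeneity(effects, variances):
    """Cochran's Q and I^2 (% of variation from between-study factors)."""
    w = [1 / v for v in variances]          # inverse-variance weights
    pooled = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    q = sum(wi * (g - pooled) ** 2 for wi, g in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2
```

For instance, the study’s overall g = 0.07 falls in the negligible band, and a high I² (such as the 96.60% reported below) signals that moderator analyses are warranted.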
Three methods were used to assess publication bias: Rosenthal’s classic fail-safe N, Orwin’s fail-safe N, and the trim-and-fill method. Rosenthal’s (1979) fail-safe number aims to determine the number of unpublished studies with nonsignificant results needed to nullify the mean effect size. A fail-safe number larger than 5k + 10 (where k is the original number of studies included in the meta-analysis) is considered robust, meaning that the effect size of unpublished studies is not likely to affect the average effect size of the meta-analysis. However, this method assumes that the mean effect size in the missing studies is zero (Borenstein et al., 2021). To overcome this issue, Orwin (1983) proposed a more stringent method to identify how many missing studies would bring the overall effect to a specific non-zero value. This method permits selecting a value that represents the smallest effect of substantive importance and identifying how many missing studies it would take to bring the overall effect below this value. Alternatively, the trim-and-fill method was proposed by Duval and Tweedie (2000) to identify publication bias by means of a funnel plot in which the studies are represented by dots. If the dots are distributed on both sides of a vertical line representing the average effect size, there is no publication bias. Conversely, if most of the dots are located at the bottom of the funnel or on one side of the vertical line, publication bias is present (Borenstein et al., 2010).
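Two of these checks reduce to simple arithmetic and can be sketched as follows (the helper names and example values are illustrative; Rosenthal’s fail-safe N itself is computed from the studies’ combined z-scores and is not reproduced here — only its 5k + 10 robustness rule is shown):

```python
def rosenthal_robust(k, fail_safe_n):
    """Rosenthal's rule of thumb: robust if the fail-safe N exceeds 5k + 10."""
    return fail_safe_n > 5 * k + 10

def orwin_failsafe_n(k, mean_effect, trivial_effect):
    """Orwin (1983): number of missing null-result studies needed to drag
    the mean effect down to a chosen trivial (smallest important) value.
    Assumes the missing studies have a mean effect of zero."""
    return k * (mean_effect - trivial_effect) / trivial_effect
```

With the study’s k = 25, the robustness threshold is 5 × 25 + 10 = 135; Orwin’s formula then asks how many null studies would pull the overall effect below whatever trivial value the analyst selects.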
Description of the included sample
Table 1 presents the 25 studies included in this present meta-analysis. Most of the studies (n = 19) were conducted with bachelor students. Additionally, OER and OEP were mostly used to teach psychology (n = 6), mathematics (n = 5), or varied courses (n = 6). Among the 25 studies, 10 used a small sample (less than or equal to 250 participants) and 15 used a large sample (more than 250 participants). Hedges’ g was also calculated for each study. A positive Hedges’ g indicates that students using OER and OEP had better achievement than those who used traditionally copyrighted resources, and vice versa. Table 1 shows that 10 studies had a negative Hedges’ g value.
Publication bias assessment
Borenstein et al. (2010) stated that a symmetric funnel plot—when the dots (studies) are distributed on both sides of the vertical line (combined effect size)—implies that there is no publication bias. However, if most of the dots are situated at the bottom of the funnel or on one side of the vertical line, there is publication bias. Figure 2 shows that the dots in this study are distributed symmetrically around the vertical line. Additionally, although some dots are outside the triangle of the funnel plot, most of them are in the upper part of Fig. 2 and not at the bottom. Therefore, it can be argued that the reliability of the present meta-analysis is not affected by publication bias.
Overall effect size for learning achievement
The meta-analysis yielded an overall effect size of g = 0.07, p < 0.001, indicating that OER and OEP had a negligible effect on students’ learning achievement (see Table 2). Specifically, documents (g = − 0.20; 95% CI = − 0.14 to 0.10; n = 1), interactive (text)books (g = 0.13; 95% CI = 0.11 to 0.15; n = 18), and interactive courses (g = − 0.11; 95% CI = − 0.14 to − 0.08; n = 5) had a negligible effect on students’ learning achievement, while videos (g = 0.20; 95% CI = − 0.33 to 0.73; n = 1) had a small effect.
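The pooling itself is performed by the CMA software; as a rough sketch of how per-study effects and variances combine into an overall g, assuming a DerSimonian–Laird random-effects model (the input values below are hypothetical):

```python
def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect size."""
    w = [1 / v for v in variances]          # fixed-effect weights
    mean_fe = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    q = sum(wi * (g - mean_fe) ** 2 for wi, g in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)           # between-study variance estimate
    # Re-weight each study by total (within + between) variance
    w_re = [1 / (v + tau2) for v in variances]
    return sum(wi * g for wi, g in zip(w_re, effects)) / sum(w_re)
```

The between-study variance tau² flattens the weights, so large studies dominate the pooled estimate less than under a fixed-effect model.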
The I2 statistic showed that 96.60% of variance resulted from between-study factors, implying that other variables might moderate the effect size of OER (as pointed out in the background of this study).
The forest plot presents the variation in effect size across the 25 included studies (see Fig. 3). The black square represents each study’s effect size, with the size of the square proportional to the study’s weight in the meta-analysis. The horizontal line through each square represents the confidence interval of the associated effect size. The overall mean effect size (g = 0.073) is presented in the last row of the forest plot. Interestingly, almost half of the studies had a negative effect size with varying confidence intervals, implying that in those studies the use of traditionally copyrighted materials had a better impact on learning achievement than the use of OER and OEP. This further explains the obtained negligible overall effect of OER and OEP on students’ learning achievement (see Table 2).
Effect sizes of learning achievement for moderator variables
Meta-regression was used to investigate any possible variation in effect sizes across educational subjects (Liesa-Orus et al., 2023). According to Table 3, the meta-regression result indicates that the course subject model is associated with the effect sizes of learning achievement under OER, as the p-value is 0.05 (Borenstein, 2022). Moreover, the subject statistics indicate that using OER in history (p = 0.001) is likely to relate to the effect size. Specifically, the coefficient indicates that the expected mean effect size for studies using OER in history is 1.14 points higher than that for studies using OER in psychology, with a standard error of 0.33 and a confidence interval of 0.49 to 1.78. In other words, OER used in history is likely to have a significantly better effect on learning achievement than OER used in psychology.
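To sketch what such a coefficient means: with a single dummy-coded moderator (e.g., history vs. the psychology reference category), a weighted meta-regression coefficient equals the difference between the two groups’ weighted mean effects. This toy version uses fixed-effect weights and hypothetical numbers, whereas CMA fits a mixed-effects model:

```python
def dummy_moderator_coefficient(effects, variances, in_group_b):
    """Coefficient of a single dummy moderator in a fixed-effect weighted
    meta-regression: group B's weighted mean effect minus group A's."""
    def weighted_mean(indices):
        weights = [1 / variances[i] for i in indices]
        return sum(w * effects[i] for w, i in zip(weights, indices)) / sum(weights)

    group_a = [i for i, b in enumerate(in_group_b) if not b]  # reference
    group_b = [i for i, b in enumerate(in_group_b) if b]
    return weighted_mean(group_b) - weighted_mean(group_a)
```

A positive coefficient means the dummy-coded category (here, group B) shows a higher expected mean effect size than the reference category, which is how the subject, level, and region coefficients reported in Tables 3, 4, and 7 are read.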
Meta-regression was used to investigate any possible variation in effect sizes across educational levels (Chaudhary & Singh, 2022). According to Table 4, the meta-regression result indicates that the educational level model is associated with the effect sizes of learning achievement under OER, as the p-value is 0.001 (Borenstein, 2022). Moreover, the educational level statistics indicate that using OER in professional development (p = 0.001) is likely to relate to the effect size. Specifically, the coefficient indicates that the expected mean effect size for studies using OER in professional development is 2.26 points higher than that for studies using OER in bachelor programs, with a standard error of 0.48 and a confidence interval of 1.31 to 3.20. In other words, OER used in professional development is likely to have a significantly better effect on learning achievement than OER used at the bachelor level.
Meta-regression was used to investigate any possible variation in effect sizes across intervention durations (Shi et al., 2023). According to Table 5, the meta-regression result indicates that the intervention duration model is not associated with the effect sizes of learning achievement under OER, as the p-value is 0.99 (Borenstein, 2022).
Meta-regression was used to investigate any possible variation in effect sizes across sample sizes (Cheung & Slavin, 2016). According to Table 6, the meta-regression result indicates that the sample size model is not associated with the effect sizes of learning achievement under OER, as the p-value is 0.08 (Borenstein, 2022).
Meta-regression was used to investigate any possible variation in effect sizes across regions (Liesa-Orus et al., 2023). According to Table 7, the meta-regression result indicates that the region model is associated with the effect sizes of learning achievement under OER, as the p-value is 0.01 (Borenstein, 2022). Moreover, the region statistics indicate that using OER in Asia (p = 0.001) is likely to relate to the effect size. Specifically, the coefficient indicates that the expected mean effect size for studies using OER in Asia is 1.01 points higher than that for studies using OER in North America, with a standard error of 0.33 and a confidence interval of 0.36 to 1.65. In other words, OER used in Asia is likely to have a significantly better effect on learning achievement than OER used in North America.
Meta-regression was used to explore any possible association between research design and effect sizes (Geissbühler et al., 2021). According to Table 8, the meta-regression result indicates that the research design model is not associated with the effect sizes of learning achievement under OER, as the p-value is 0.77 (Borenstein, 2022).
Finally, to investigate possible covariance among the confounding variables, a meta-regression including all the individually significant moderators, namely subject, educational level, and region, was conducted. Table 9 reveals that subject (p = 0.01) and educational level (p = 0.001) remained significant when the moderators were analyzed jointly.
This meta-analysis aimed to comprehensively assess the effectiveness of Open Educational Resources (OER) and Open Educational Practices (OEP) in relation to learning achievement. The analysis of 25 independent studies revealed that the impact of OER and OEP on learning achievement is generally negligible. These quantitative findings support the conclusions drawn from qualitative (Hilton, 2016, 2020) and quantitative (Clinton & Khan, 2019) reviews that compare learning achievement between courses using open and commercial textbooks. Additionally, it is found that course subject, educational level and the region of students might moderate the effects of OER and OEP. The obtained findings of this study can be discussed and explained from the following perspectives.
Improvement in access does not imply improvement in learning achievement: a holistic design is needed
The use of OER and OEP is often considered an effective learning intervention due to its potential to provide equal access to educational resources for all students (Grimaldi et al., 2019). However, the results of this meta-analysis do not substantiate this hypothesis. Dotson and Foley (2017) also argued that changing a curriculum content license from proprietary to open does not always lead to a change in students’ learning achievement. In other words, we cannot expect an improvement in learning achievement simply by changing the license of a given educational resource from proprietary to open. It requires a more comprehensive approach that involves changing not only the license but also the instructional approach used, the way the educational resources are designed, and so on. Based on the review of the 25 included studies, it is found that ensuring an improvement in learning achievement goes beyond simple access to educational resources, and several elements should be considered, some of which are discussed below, namely: OER quality, instructional factors, and learners’ individual factors.
The quality of OER and the effective implementation of OEP are crucial factors that significantly influence learning achievement. High-quality OER, characterized by accurate and up-to-date content, clear learning objectives, and appropriate instructional design, have been shown to positively impact student learning outcomes (Butcher, 2015). Learners who have access to well-designed OER that align with the curriculum and provide meaningful learning experiences are more likely to engage with the materials and effectively acquire knowledge and skills. However, it is important to acknowledge that not all OER meet the necessary standards of accuracy, coherence, and pedagogical effectiveness. Research has indicated significant variability in the quality of OER, resulting in inconsistent learning experiences and potentially limiting their impact on learning achievement (Weller, 2017). To address this, quality assurance processes, peer review, and evaluation mechanisms are essential to ensure that the content and resources meet established standards. While there was early skepticism and critique with regard to OER quality based on design and economic production models (see Kahle, 2008; Weller, 2010), there is nothing inherently different between open and closed/proprietary content beyond the intellectual property rights. In other words, the quality criteria that apply to proprietary/closed resources also apply to OER, and we should not expect quality differences in OER produced under the same production modes (e.g., by experienced publishers and designers). The results of this study, which found no significant difference, substantiate this assertion.
Achieving improvements in learning achievement also depends on how students engage with OER and on the effective implementation of OEP. On the one hand, despite the easy access to learning materials provided by OER, learners may not use them at all (Feldstein et al., 2012) or may not have sufficient time to engage with them (Westermann Juárez & Venegas Muggli, 2017). On the other hand, while OER offer the advantage of making learning more individualized, students may encounter a broader range of perspectives through OER, but the content they learn may not align with objective measures of learning (Gurung, 2017). In the same vein, Zulaiha and Triana (2023) stated that OER must be accompanied by a proper teaching method and learning strategy in order to be effectively leveraged and make a significant impact on student learning. An older study by Slavin and Lake (2008) similarly found that the choice of instructional approach has a larger impact on learning achievement than the choice of curriculum content. In addition, numerous educators face challenges such as time constraints, insufficient skills and competences (e.g., digital), a lack of understanding about what OER or OEP actually mean, and a lack of incentives to engage in open practices. Consequently, the widespread adoption of OEP remains limited, potentially hindering its impact on learning achievement (Tlili et al., 2021; Zhang et al., 2020b).
Learners’ individual factors
The impact of OER and OEP on learning achievement is influenced by individual learner characteristics, including prior knowledge and motivation. Tlili and Burgos (2022) emphasized the importance of providing personalized learning as students in open education might have different backgrounds and competencies. It is crucial to acknowledge that students may show diverse responses to open educational initiatives, and some may require extra support or guidance to fully reap the benefits of these resources. For example, studies have shown that students from lower socioeconomic backgrounds, who may lack essential skills in effectively utilizing OER, tend to attain lower learning outcomes compared to their peers (Robinson, 2015). Thus, recognizing and addressing the diverse needs of learners is important in enhancing the impact of OER and OEP on learning achievement.
Adequate experimental design is crucial for accurately measuring learning achievement
Beyond OER and OEP selection and implementation, the study indicates that the applied experimental design might hinder the accurate measurement of OER and OEP effects on learning achievement. Across the 25 reviewed studies, most used quasi-experiments, given that random assignment is not always possible in open education. As a result, the effects of OER and OEP on learning achievement may be measured inaccurately (Griggs & Jackson, 2017; Gurung, 2017).
Additionally, separating the effects of OER and OEP from other effects is also a challenge in the conducted experiments. Wiley (2022) described several ways in which research that purports to show the impact of OER adoption on student learning actually shows the impact of other interventions associated with OER adoption (e.g., when faculty receive support from an instructional designer to redesign a course after adopting OER). Pawlyshyn et al. (2013) also reported an improvement in learning achievement when OER was adopted simultaneously with flipped classrooms; however, it is not clear whether this improvement was due to the use of OER or the flipped classrooms. OER are often employed alongside other interventions, which makes isolating their effect methodologically problematic. This challenge of attributing improvements in learning achievement to the use of OER and OEP was also reported by other researchers (Griggs & Jackson, 2017; Gurung, 2017).
Most of the reviewed studies used final exam scores or GPA (grade point average) to measure learning achievement when incorporating OER and OEP. However, this method is questionable, as the designed exam may vary depending on the taught course subject and its requirements, leading to variation in the measured learning achievement. It is therefore recommended to use standardized instruments when measuring learning achievement with OER and OEP (Hendricks et al., 2017; Hilton, 2020). This normalization might lead to competence validation or even credit recognition through alternative credentials (e.g., Alternative Digital Credentials, ADCs), which is one of the open challenges around open education (Griffiths et al., 2022).
Confounding variables might lead to a variation of learning achievement
The present meta-analysis revealed that several confounding variables could affect the learning achievement of students when using OER and OEP. One of these variables is the course subject. This might be explained by the fact that some subjects have quality OER published online while others do not. For example, Lawrence and Lester (2018) highlighted a specific concern in the open content space for subjects like political science, which is the lack of available textbook options. Similarly, Choi and Carpenter (2017) found it challenging to find a suitable OER for their interdisciplinary Human Factors and Ergonomics course. The researchers discovered that OER for the Human Factors and Ergonomics course often provided in-depth content for individual topics, including extra information that is relevant to their subject of focus but not directly related to the course learning objectives. Furthermore, the limited number of OER options creates difficulties for instructors in applying some of their preferred pedagogical approaches.
The obtained findings also revealed that students' geographic region can moderate the effect of OER and OEP on learning achievement. This could be explained by the fact that several regions, such as East Asia, have made remarkable progress in raising awareness of and adopting OER and OEP (Tlili et al., 2019), while others, like the Arab region and sub-Saharan Africa, are still behind (Tlili et al., 2020, 2022). This might result in a divide between regions in terms of students' perceptions and acquired competencies in using OER and OEP, hence the varied learning achievement across regions.
Conclusions, implications and limitations
This study included a meta-analysis and research synthesis to investigate the effects of OER and OEP on students' learning achievement. This analysis describes how this effect is moderated across different variables (i.e., course subject, level of education, intervention duration, sample size and geographical distribution). As discussed above, to the best of our knowledge, no previous study has conducted a similar analysis. Based on the findings, it can be argued that holistic OER learning design may be needed to optimize learning outcomes; that researchers should employ adequate experimental designs when investigating the relationship between OER and learning achievement; and that confounding factors can lead to variation in learning achievement when using OER.
This study supports previous research in identifying no significant differences between interventions using open and closed approaches (content or practice), a conclusion consistent with the existing literature on media/intermedia comparison studies (e.g., Clark, 1994; Salomon & Clark, 1977).
The conundrum for comparison studies such as those included in this meta-analysis is thus: if a true experiment were designed to evaluate the influence of OER on learning achievement, the only variable would be the OER itself, in other words, the intellectual property license of the content (considering this to be the defining characteristic of OER). If this were possible, one could only expect the affordances of open licensing, such as reduced cost or ease of access, to relate to achievement (e.g., Fischer et al., 2015). But such a design would offer minimal new insights beyond what we could already expect in principle: it stands to reason that not having access to resources designed to be part of a course would reduce achievement (a comparison of whether students actually did or did not access and make use of resources in the treatment and control conditions is another study entirely).
However, the truly intriguing and critical questions pertain to practical applications. If we do allow practice to vary, for example if the OER affords some different sort of practice (as OEP is defined in OER-enabled pedagogy), then we are really measuring something more holistic: the practice that includes the resource. As Salomon and Clark (1977, p. 102) conclude: “In short, when only the least significant aspects of instruction are allowed to vary, nothing of interest could, and did, result.”
This study might then point us to valuable avenues for further research. Perhaps course instructors and designers are attempting to faithfully replicate existing courses using OER simply to test possible outcomes in achievement; here, clearly, we should expect no difference to emerge. Furthermore, instructors may not be truly leveraging OER-enabled pedagogy or more expansive perspectives of OEP.
Additionally, this meta-analysis might help discourage further comparison studies based on OER and achievement. It points us to the urgency of expanding the object of analysis beyond the intrinsic characteristics of OER and focusing on how principles of openness might significantly alter the nature of practices and courses themselves, might lead to outcomes that are not measured simply by achievement gains, and might or might not cater to different types of students.
This present study can contribute to the literature from different perspectives. From a theoretical perspective, it adds to the two-decade-long debate about the effectiveness of OER and OEP by revealing what might moderate their effectiveness. From a practical perspective, it can contribute to the Sustainable Development Goals (SDGs) (UN, 2021), specifically SDG 4 (quality education), by highlighting the different variables (e.g., quality, the pedagogical approach used) that different stakeholders (e.g., educators, instructional designers) should consider when adopting OER and OEP for better learning achievement. Finally, from a methodological perspective, it contributes to the literature by pointing out various experimental criteria (e.g., standardized measurements, design) that should be considered when designing research experiments to effectively measure the true effect of OER, hence providing more accurate results that could advance the field in this regard.
Limitations and future directions
It should be noted that statistical power was not examined in this present meta-analysis, which is the case in the majority of published meta-analyses in the literature (Burçin, 2022; Dumas-Mallet et al., 2017; Thorlund & Mills, 2012). A significant barrier to the widespread computation of statistical power in meta-analysis is the difficulty of understanding how it can be computed, due to the various variables that must be considered in each study as well as the heterogeneity of the conducted studies (Cafri et al., 2010; Ioannidis et al., 2014; Vankov et al., 2014). Additionally, there is a lack of accessible and easy-to-use software or R scripts that can help compute statistical power (Griffin, 2021; Thomas & Krebs, 1997). In this context, various software packages, such as G*Power, have been developed to calculate statistical power for primary research, allowing for the widespread implementation of power analysis there (Faul et al., 2007). However, despite the similarity in procedure, analogous software options do not exist for meta-analysis. Consequently, to calculate statistical power for a given meta-analysis, researchers must manually perform the calculations, use an online calculator, or utilize a user-defined script (e.g., Cafri et al., 2009). These methods can be limited in functionality and difficult to integrate into a reproducible workflow (Griffin, 2021).
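To illustrate that such a calculation is tractable even without dedicated software, the sketch below approximates the power of the overall-effect test in a meta-analysis of standardized mean differences, following the normal-approximation approach used by user-defined scripts of the kind cited above (e.g., Cafri et al., 2009; Griffin, 2021). The function name and the example study configuration are illustrative assumptions, not values drawn from this meta-analysis.

```python
from math import sqrt
from statistics import NormalDist

def meta_power(effect, n_pairs, alpha=0.05, tau2=0.0):
    """Approximate power of the test of the pooled effect in a meta-analysis.

    effect  : hypothesized true standardized mean difference (e.g., g)
    n_pairs : list of (n_treatment, n_control) tuples, one per study
    tau2    : assumed between-study variance (0.0 -> fixed-effect model)

    Illustrative sketch based on the standard normal approximation;
    not the exact procedure used in any particular published study.
    """
    z = NormalDist()
    weights = []
    for n1, n2 in n_pairs:
        # Within-study variance of a standardized mean difference
        v = (n1 + n2) / (n1 * n2) + effect**2 / (2 * (n1 + n2))
        weights.append(1.0 / (v + tau2))  # inverse-variance weight
    se = sqrt(1.0 / sum(weights))         # SE of the pooled effect
    lam = effect / se                     # noncentrality parameter
    crit = z.inv_cdf(1 - alpha / 2)       # two-sided critical value
    return (1 - z.cdf(crit - lam)) + z.cdf(-crit - lam)

# Hypothetical scenario: 25 studies of 40 vs. 40 participants,
# small true effect (g = 0.07) under a fixed-effect model
print(round(meta_power(0.07, [(40, 40)] * 25), 3))
```

Under these hypothetical inputs the small effect yields modest power, and adding between-study variance (tau2 > 0) lowers it further, which is exactly why reporting power alongside heterogeneity matters for interpreting small pooled effects.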
Although the reliability of the obtained results was validated through the bias assessment, this study has some other limitations that should be acknowledged. For instance, the obtained results might be limited by the keywords and electronic databases used. Additionally, the analysis was based only on courses conducted in English; studies of non-English courses might reveal different results. Moreover, while the present meta-regression yielded valuable insights about the effect of OER and OEP on learning achievement, as well as the moderating variables of this effect, the limited number of included studies might affect the generalizability of the findings. Therefore, future researchers are encouraged to complement this work by covering more databases and analyzing non-English courses, hence providing a more comprehensive view of OER and OEP effects. Additionally, this present meta-analysis did not consider teacher variables (e.g., whether the same teacher taught with OER and non-OER materials), which could moderate the effects of OER and OEP on learning achievement (Hilton, 2020); future studies could focus on this line of research. Finally, this present meta-analysis did not consider OER quality, which has been shown to have a significant impact on students' learning outcomes (Butcher, 2015). Future research could systematically assess and incorporate OER quality as a moderating variable, hence further enhancing the understanding of the intricate relationship between OER and learning achievement. Despite these limitations, this present study provided quantitative evidence about the effects of OER and OEP on students' learning achievement.
Availability of data and materials
The datasets generated and/or analyzed during the current study are presented within this study.
References with an asterisk (*) indicate studies included in the analysis
*Allen, G., Guzman-Alvarez, A., Smith, A., Gamage, A., Molinaro, M., & Larsen, D. S. (2015). Evaluating the effectiveness of the open-access ChemWiki resource as a replacement for traditional general chemistry textbooks. Chemistry Education Research and Practice, 16(4), 939–948. https://doi.org/10.1039/c5rp00084j
Bali, M., Cronin, C., & Jhangiani, R. S. (2020). Framing open educational practices from a social justice perspective. Journal of Interactive Media in Education, 2020(1), 1. https://doi.org/10.5334/jime.565
*Basu Mallick D., Grimaldi P. J., Whittle J., Waters A. E., & Baraniuk R. G. (2018). Impact of OER textbook adoption on student academic outcomes. Paper presented at the 15th Annual Open Education Conference, Niagara Falls, NY.
Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020). Facilitating student engagement through educational technology in higher education: A systematic review in the field of arts and humanities. Australasian Journal of Educational Technology, 36, 126–150. https://doi.org/10.14742/ajet.5477
Borenstein, M. (2022). Comprehensive meta-analysis software. In M. Egger, J. P. T. Higgins, & G. D. Smith (Eds.), Systematic reviews in health research: Meta-analysis in context (pp. 535–548). Wiley. https://doi.org/10.1002/9781119099369.ch27
Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. R. (2010). A basic introduction to fixed-effect and random-effects models for meta-analysis. Research Synthesis Methods, 1(2), 97–111. https://doi.org/10.1002/jrsm.12
Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2021). Introduction to Meta-Analysis (2nd ed.). John Wiley & Sons. https://doi.org/10.1016/b978-0-12-209005-9.50005-9
Bozkurt, A., Gjelsvik, T., Adam, T., Asino, T. I., Atenas, J., Bali, M., Blomgren, C., Bond, M., Bonk, C. J., Brown, M., Burgos, D., Conrad, D., Costello, E., Cronin, C., Czerniewicz, L., Deepwell, M., Deimann, M., DeWaard, H. J., Dousay, T. A., Ebner, M., Farrow, R., Gil-Jaurena, I., Havemann, L., Inamorato, A., Irvine, V., Karunanayaka, S. P., Kerres, M., Lambert, S., Lee, K., Makoe, M., Marín, V. I., Mikroyannidis, A., Mishra, S., Naidu, S., Nascimbeni, F., Nichols, M., Olcott. Jr., D., Ossiannilsson, E., Otto, D., Padilla Rodriguez, B. C., Paskevicius, M., Roberts, V., Saleem, T., Schuwer, R., Sharma, R. C., Stewart, B., Stracke, C. M., Tait, A., Tlili, A., Ubachs, G., Weidlich, J., Weller, M., Xiao, J., & Zawacki-Richter, O. (2023). Openness in Education as a Praxis: From Individual Testimonials to Collective Voices. Open Praxis, 15(2), 76–112. https://doi.org/10.55982/openpraxis.15.2.574
Burçin, Ö. N. E. R. (2022). Evaluation of statistical power in random effect meta analyses for correlation effect size. Sakarya University Journal of Science, 26(3), 554–567. https://doi.org/10.16984/saufenbilder.1089793
Butcher, N. (2015). Basic guide to open educational resources (OER). Commonwealth of Learning (COL). https://doi.org/10.56059/11599/36
Cafri, G., Kromrey, J. D., & Brannick, M. T. (2009). A sas macro for statistical power calculations in metaanalysis. Behavior Research Methods, 41(1), 35–46. https://doi.org/10.3758/brm.41.1.35
Cafri, G., Kromrey, J. D., & Brannick, M. T. (2010). A meta-meta-analysis: Empirical review of statistical power, type I error rates, effect sizes, and model selection of meta-analyses published in psychology. Multivariate Behavioral Research, 45(2), 239–270. https://doi.org/10.1080/00273171003680187
Chaudhary, P., & Singh, R. K. (2022). A meta analysis of factors affecting teaching and student learning in higher education. Frontiers in Education, 6, 824504. https://doi.org/10.3389/feduc.2021.824504
Chen, Z., Chen, W., Jia, J., & An, H. (2020). The effects of using mobile devices on language learning: A meta-analysis. Educational Technology Research and Development, 68(4), 1769–1789. https://doi.org/10.1007/s11423-020-09801-5
Cheung, A., & Slavin, R. E. (2016). How methodological features of research studies affect effect sizes. Educational Researcher, 45(5), 283–292. https://doi.org/10.3102/0013189X16656615
*Chiorescu, M. (2017). Exploring open educational resources for college algebra. International Review of Research in Open and Distributed Learning, 18(4), 50–59. https://doi.org/10.19173/irrodl.v18i4.3003
*Choi, Y. M., & Carpenter, C. (2017). Evaluating the impact of open educational resources: A case study. Portal: Libraries and the Academy, 17(4), 685–693. https://doi.org/10.1353/pla.2017.0041
Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(2), 21–30. https://doi.org/10.1007/BF02299088
*Clinton, V. (2018). Savings without sacrifices: A case study of open textbook adoption. Open Learning: The Journal of Open, Distance, and e-Learning, 33(3), 177–189. https://doi.org/10.1080/02680513.2018.1486184
Clinton, V., & Khan, S. (2019). Efficacy of open textbook adoption on learning performance and course withdrawal rates: A meta-analysis. AERA Open, 5(3), 2332858419872212. https://doi.org/10.1177/2332858419872212
Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37–46. https://doi.org/10.1177/001316446002000104
*Colvard, N. B., Watson, C. E., & Park, H. (2018). The impact of open educational resources on various student success metrics. International Journal of Teaching and Learning in Higher Education, 30(2), 262–276.
Denden, M., Tlili, A., Chen, N. S., Abed, M., Jemni, M., & Essalmi, F. (2022). The role of learners’ characteristics in educational gamification systems: A systematic meta-review of the literature. Interactive Learning Environments. https://doi.org/10.1080/10494820.2022.2098777
Dumas-Mallet, E., Button, K. S., Boraud, T., Gonon, F., & Munafò, M. R. (2017). Low statistical power in biomedical science: A review of three human research domains. Royal Society Open Science, 4(2), 160254. https://doi.org/10.1098/rsos.160254
Ehlers, U.-D. (2011). Extending the territory: From open educational resources to open educational practices. Journal of Open Flexible and Distance Learning, 15(2), 1–10. http://www.jofdl.nz/index.php/JOFDL/index
*Engler, J. N., & Shedlosky-Shoemaker, R. (2019). Facilitating student success: The role of open educational resources in introductory psychology courses. Psychology Learning & Teaching, 18(1), 36–47. https://doi.org/10.1177/1475725718810241
Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2), 175–191. https://doi.org/10.3758/bf03193146
*Feldstein, A. P., Martin, M., Hudson, A., Warren, K., Hilton, J., III, & Wiley, D. (2012). Open textbooks and increased student access and outcomes. European Journal of Open, Distance and E-Learning. https://old.eurodl.org/?p=archives&year=2012&halfyear=2&article=533
Fischer, L., Hilton, J., III., Robinson, T. J., & Wiley, D. A. (2015). A multi-institutional study of the impact of open textbook adoption on the learning outcomes of post-secondary students. Journal of Computing in Higher Education, 27(3), 159–172. https://doi.org/10.1007/s12528-015-9101-x
Fortney, A. (2021). OER textbooks versus commercial textbooks: Quality of student learning in psychological statistics. Locus: The Seton Hall Journal of Undergraduate Research, 4(1), 4.
Garzón, J., Pavón, J., & Baldiris, S. (2019). Systematic review and meta-analysis of augmented reality in educational settings. Virtual Reality, 23(4), 447–459. https://doi.org/10.1007/s10055-019-00379-9
Geissbühler, M., Hincapié, C. A., Aghlmandi, S., Zwahlen, M., Jüni, P., & da Costa, B. R. (2021). Most published meta-regression analyses based on aggregate data suffer from methodological pitfalls: A meta-epidemiological study. BMC Medical Research Methodology, 21, 123. https://doi.org/10.1186/s12874-021-01310-0
*Grewe, K., & Davis, W. P. (2017). The impact of enrollment in an OER course on student learning outcomes. The International Review of Research in Open and Distributed Learning. https://doi.org/10.19173/irrodl.v18i4.2986
Griffin, J. W. (2021). Calculating statistical power for meta-analysis using metapower. The Quantitative Methods for Psychology., 17(1), 24–39. https://doi.org/10.20982/tqmp.17.1.p024
Griffiths, D., Burgos, D., & Aceto, S. (2022). Credentialing learning in the European OER Ecosystem. Retrieved July 14, 2023, from https://encoreproject.eu/2022/09/06/credentialing-learning-in-the-european-oerecosystem/
Griggs, R. A., & Jackson, S. L. (2017). Studying open versus traditional textbook effects on students’ course performance: Confounds abound. Teaching of Psychology, 44(4), 306–312. https://doi.org/10.1177/0098628317727641
Grimaldi, P. J., Basu Mallick, D., Waters, A. E., & Baraniuk, R. G. (2019). Do open educational resources improve student learning? implications of the access hypothesis. PLoS ONE. https://doi.org/10.1371/journal.pone.0212508
*Grissett, J. O., & Huffman, C. (2019). An open versus traditional psychology textbook: Student performance, perceptions, and use. Psychology Learning & Teaching, 18(1), 21–35. https://doi.org/10.1177/1475725718810181
*Gurung, R. A. (2017). Predicting learning: Comparing an open educational resource and standard textbooks. Scholarship of Teaching and Learning in Psychology, 3(3), 233–248. https://doi.org/10.1037/stl0000092
*Hardin, E. E., Eschman, B., Spengler, E. S., Grizzell, J. A., Moody, A. T., Ross-Sheehy, S., & Fry, K. M. (2019). What happens when trained graduate student instructors switch to an open textbook? A controlled study of the impact on student learning outcomes. Psychology Learning & Teaching, 18(1), 48–64. https://doi.org/10.1177/1475725718810909
*Harvey, P., & Bond, J. (2022). The effects and implications of using open educational resources in secondary schools. The International Review of Research in Open and Distributed Learning, 23(2), 107–119. https://doi.org/10.19173/irrodl.v22i3.5293
Hedges, L. (1981). Distribution theory for glass’s estimator of effect size and related estimators. Journal of Educational Statistics, 6(2), 107–128. https://doi.org/10.3102/10769986006002107
Hedges, L., & Olkin, I. (1985). Statistical methods for meta-analysis. Academic Press.
*Hendricks, C., Reinsberg, S. A., & Rieger, G. W. (2017). The adoption of an open textbook in a large physics course: An analysis of cost, outcomes, use, and perceptions. International Review of Research in Open and Distributed Learning, 18(4), 78–99. https://doi.org/10.19173/irrodl.v18i4.3006
Hilton, J. III. (2016). Open educational resources and college textbook choices: A review of research on efficacy and perceptions. Educational Technology Research and Development, 64(4), 573–590. https://doi.org/10.1007/s11423-016-9434-9
Hilton, J., III. (2020). Open educational resources, student efficacy, and user perceptions: A synthesis of research published between 2015 and 2018. Educational Technology Research and Development, 68(3), 853–876. https://doi.org/10.1007/s11423-019-09700-4
*Hilton, J., III., Fischer, L., Wiley, D., & Williams, L. (2016). Maintaining momentum toward graduation: OER and the course throughput rate. International Review of Research in Open and Distributed Learning, 17(6), 18–27. https://doi.org/10.19173/irrodl.v17i6.2686
*Hilton, J. L., III., Gaudet, D., Clark, P., Robinson, J., & Wiley, D. (2013). The adoption of open educational resources by one community college math department. International Review of Research in Open and Distributed Learning, 14(4), 37–50. https://doi.org/10.19173/irrodl.v14i4.1523
Huang, R., Tlili, A., Chang, T. W., Zhang, X., Nascimbeni, F., & Burgos, D. (2020). Disrupted classes, undisrupted learning during COVID-19 outbreak in China: application of open educational practices and resources. Smart Learning Environments, 7, 1-15. https://doi.org/10.1186/s40561-020-00125-8
Ioannidis, J. P. A., Greenland, S., Hlatky, M. A., Khoury, M. J., Macleod, M. R., Moher, D., Schulz, K. F., & Tibshirani, R. (2014). Increasing value and reducing waste in research design, conduct, and analysis. The Lancet, 383(9912), 166–175. https://doi.org/10.1016/S0140-6736(13)62227-8
*Jhangiani, R. S., Dastur, F. N., Le Grand, R., & Penner, K. (2018). As good or better than commercial textbooks: Students’ perceptions and outcomes from using open digital and open print textbooks. Canadian Journal for the Scholarship of Teaching and Learning. https://doi.org/10.5206/cjsotl-rcacea.2018.1.5
Kahle, D. (2008). Designing open educational technology. In T. Iiyoshi and M. S. Vijay Kumar (Eds.), Opening up education: The collective advancement of education through open technology, open content, and open knowledge (pp. 27–45). MIT Press. https://mitpress.mit.edu/9780262515016/opening-up-education/
*Kelly, D. P., & Rutherford, T. (2017). Khan Academy as supplemental instruction: A controlled study of a computer-based mathematics intervention. The International Review of Research in Open and Distributed Learning. https://doi.org/10.19173/irrodl.v18i4.2984
Kitchenham, B. A., & Charters, S. (2007). Guidelines for performing systematic literature reviews in software engineering. (EBSE 2007–001). Keele University and Durham University Joint Report.
Konstantopoulos, S., & Hedges, L. V. (2019). Statistically analyzing effect sizes: Fixed- and random-effects models. In The Handbook of Research Synthesis and Meta-Analysis (pp. 245–280). Russell Sage Foundation. https://doi.org/10.7758/9781610448864.15
*Lawrence, C. N., & Lester, J. A. (2018). Evaluating the effectiveness of adopting open educational resources in an introductory American government course. Journal of Political Science Education, 14(4), 555–566. https://doi.org/10.1080/15512169.2017.1422739
Liesa-Orus, M., Lozano Blasco, R., & Arce-Romeral, L. (2023). Digital Competence in University Lecturers: A Meta-Analysis of Teaching Challenges. Education Sciences, 13(5), 508. https://doi.org/10.3390/educsci13050508
*Medley-Rath, S. (2018). Does the type of textbook matter? Results of a study of free electronic reading materials at a community college. Community College Journal of Research and Practice, 42(12), 908–918. https://doi.org/10.1080/10668926.2017.1389316
Morris, S. B. (2008). Estimating effect sizes from pretest-posttest-control group designs. Organizational Research Methods, 11(2), 364–386.
OLCOS. (2007). Open Educational Practices and Resources. Available online: https://www.olcos.org/cms/upload/docs/olcos_roadmap.pdf
Orwin, R. G. (1983). A fail-safe N for effect size in meta-analysis. Journal of Educational Statistics, 8(2), 157–159. https://doi.org/10.3102/10769986008002157
Otto, D., Schroeder, N., Diekmann, D., & Sander, P. (2021). Trends and gaps in empirical research on open educational resources (OER): A systematic mapping of the literature from 2015 to 2019. Contemporary Educational Technology, 13(4), ep325. https://doi.org/10.30935/cedtech/11145
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ, https://doi.org/10.1136/bmj.n71
Pawlyshyn, N., Braddlee, D., Casper, L., & Miller, H. (2013). Adopting OER: A case study of cross-institutional collaboration and innovation. Educause Review. Accessed on May 20, 2023 from https://er.educause.edu/articles/2013/11/adopting-oer-a-case-study-of-crossinstitutional-collaboration-and-innovation.
*Robinson, T. J. (2015). The effects of open educational resource adoption on measures of post-secondary student success. Brigham Young University.
Rosenthal, R., & DiMatteo, M. R. (2001). Meta-analysis: Recent developments in quantitative methods for literature reviews. Annual Review of Psychology, 52(1), 59–82. https://doi.org/10.1146/annurev.psych.52.1.59
Salomon, G., & Clark, R. (1977). Reexamining the methodology of research on media and technology in education. Review of Educational Research, 47(1), 99–120. https://doi.org/10.3102/00346543047001099
Shear, L., Means, B., & Lundh, P. (2015). Research on open: OER research hub review and futures for research on OER. SRI International: Menlo Park, CA, USA.
*Shemy, N., & Al-Habsi, M. (2021). The effect of a training program based on open educational resources on the teachers' online professional development and their attitudes towards it of AL-Dakhliya Governorate in Sultanate of Oman. Journal of e-Learning and Knowledge Society, 17(1), 18–28. https://doi.org/10.20368/1971-8829/1135283
Shi, W., Ghisi, G. L. M., Zhang, L., Hyun, K., Pakosh, M., & Gallagher, R. (2023). Systematic review, meta-analysis and meta-regression to determine the effects of patient education on health behaviour change in adults diagnosed with coronary heart disease. Journal of Clinical Nursing, 32(15–16), 5300–5327. https://doi.org/10.1111/jocn.16519
Slavin, R. E., & Lake, C. (2008). Effective programs in elementary mathematics: A best-evidence synthesis. Review of Educational Research, 78(3), 427-515. https://doi.org/10.3102/0034654308317473
Smith, M. (2013). Ruminations on Research on Open Educational Resources. William and Flora Hewlett Foundation. Retrieved from https://hewlett.org/library/ruminations-on-research-on-open-educational-resources/
*Sulisworo, D., & Basriyah, K. (2021). Problem based learning using open educational resources to enhance higher order thinking skills in physics learning. Journal of Physics: Conference Series, 1783(1), 012108. https://doi.org/10.1088/1742-6596/1783/1/012108
Thalheimer, W., & Cook, S. (2002). How to calculate effect sizes from published research: A simplified methodology. Work-Learning Research, 1(9).
Thomas, L., & Krebs, C. J. (1997). A review of statistical power analysis software. Bulletin of the Ecological Society of America, 78(2), 126–138.
Thorlund, K., & Mills, E. J. (2012). Sample size and power considerations in network meta-analysis. Systematic Reviews, 1, 1-13. https://doi.org/10.1186/2046-4053-1-41
Ting, K. M. (2010). Precision and recall. In C. Sammut & G. I. Webb (Eds.), Encyclopedia of Machine Learning (p. 781). Springer US. https://doi.org/10.1007/978-0-387-30164-8_652
Tlili, A., Altinay, F., Huang, R., Altinay, Z., Olivier, J., Mishra, S., Jemni, M., & Burgos, D. (2022). Are we there yet? A systematic literature review of Open Educational Resources in Africa: A combined content and bibliometric analysis. PLoS ONE, 17(1), e0262615. https://doi.org/10.1371/journal.pone.0262615
Tlili, A., & Burgos, D. (2022). Unleashing the power of Open Educational Practices (OEP) through Artificial Intelligence (AI): Where to begin? Interactive Learning Environments. https://doi.org/10.1080/10494820.2022.2101595
Tlili, A., Huang, R., Chang, T. W., Nascimbeni, F., & Burgos, D. (2019). Open educational resources and practices in China: A systematic literature review. Sustainability, 11(18), 4867. https://doi.org/10.3390/su11184867
Tlili, A., Jemni, M., Khribi, M. K., Huang, R., Chang, T. W., & Liu, D. (2020). Current state of open educational resources in the Arab region: An investigation in 22 countries. Smart Learning Environments, 7, 1–15. https://doi.org/10.1186/s40561-020-00120-z
Tlili, A., Zhang, J., Papamitsiou, Z., Manske, S., Huang, R., Kinshuk, & Hoppe, H. U. (2021). Towards utilising emerging technologies to address the challenges of using Open Educational Resources: A vision of the future. Educational Technology Research and Development, 69, 515–532. https://doi.org/10.1007/s11423-021-09993-4
UN. (2021). Sustainable Development Goals. United Nations. https://www.un.org/sustainabledevelopment/
UNESCO. (2019). Recommendation on Open Educational Resources. UNESCO: Paris, France. Retrieved from https://www.unesco.org/en/legal-affairs/recommendation-open-educational-resources-oer
Vankov, I., Bowers, J., & Munafò, M. R. (2014). Article commentary: On the persistence of low power in psychological science. Quarterly Journal of Experimental Psychology, 67(5), 1037–1040. https://doi.org/10.1080/17470218.2014.885986
Wang, H., Tlili, A., Huang, R., Cai, Z., Li, M., Cheng, Z., Yang, D., Li, M., Zhu, X., & Fei, C. (2023). Examining the applications of intelligent tutoring systems in real educational contexts: A systematic literature review from the social experiment perspective. Education and Information Technologies. https://doi.org/10.1007/s10639-022-11555-x
Weller, M. (2010). Big and Little OER. Open Ed, Barcelona. http://hdl.handle.net/10609/4851
Weller, M. (2017). The Development of New Disciplines in Education – the Open Education Example. https://oro.open.ac.uk/49737/
Weller, M., de los Arcos, B., Farrow, R., Pitt, B., & McAndrew, P. (2015). The Impact of OER on Teaching and Learning Practice. Open Praxis, 7(4), 351–361. https://doi.org/10.5944/openpraxis.7.4.227
*Westermann Juárez, W., & Venegas Muggli, J. I. (2017). Effectiveness of OER use in first-year higher education students’ mathematical course performance: A case study. In C. Hodgkinson-Williams & P. B. Arinto (Eds.), Adoption and impact of OER in the Global South (pp. 187–229). https://doi.org/10.5281/zenodo.601203
Wiley, D. (2014). The Access Compromise and the 5th R [blog post]. Iterating toward openness. Improving Learning: Eclectic, Pragmatic, Enthusiastic. https://opencontent.org/blog/archives/3221
Wiley, D. (2022). On the Relationship Between Adopting OER and Improving Student Outcomes. https://opencontent.org/blog/archives/6949
Wiley, D., & Hilton, J. L., III. (2018). Defining OER-enabled pedagogy. International Review of Research in Open and Distributed Learning. https://doi.org/10.19173/irrodl.v19i4.3601
*Winitzky-Stephens, J. R., & Pickavance, J. (2017). Open educational resources and student course outcomes: A multilevel analysis. International Review of Research in Open and Distributed Learning, 18(4), 35–49. https://doi.org/10.19173/irrodl.v18i4.3118
Yuan, M., & Recker, M. (2015). Not all rubrics are equal: A review of rubrics for evaluating the quality of open educational resources. International Review of Research in Open and Distributed Learning, 16(5), 16–38. https://doi.org/10.19173/irrodl.v16i5.2389
Zhang, X., Tlili, A., Huang, R., Chang, T., Burgos, D., Yang, J., & Zhang, J. (2020b). A case study of applying open educational practices in higher education during COVID-19: Impacts on learning motivation and perceptions. Sustainability, 12(21), 9129. https://doi.org/10.3390/su12219129
Zhang, X., Tlili, A., Nascimbeni, F., Burgos, D., Huang, R., Chang, T. W., Jemni, M., & Khribi, M. K. (2020a). Accessibility within open educational resources and practices for disabled learners: a systematic literature review. Smart Learning Environments, 7, 1–19. https://doi.org/10.1186/s40561-019-0113-2
Zulaiha, D., & Triana, Y. (2023). Students’ perception toward the use of open educational resources to improve writing skills. Studies in English Language and Education, 10(1), 174–196. https://doi.org/10.24815/siele.v10i1.25797
The authors have no conflict of interest to declare.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Tlili, A., Garzón, J., Salha, S. et al. Are open educational resources (OER) and practices (OEP) effective in improving learning achievement? A meta-analysis and research synthesis. Int J Educ Technol High Educ 20, 54 (2023). https://doi.org/10.1186/s41239-023-00424-3