
OER impact reports
Reasons to use OER, related research, and publications by CUNY faculty.
Organizations that research OER
- Open Education Group
- A Comparison of Academic Outcomes in Courses Taught With Open Educational Resources and Publisher Content. Bol, Linda; Esqueda, Monica Christina; Ryan, Diane; Kimmel, Sue C. Educational Researcher, 2021-10-18.
- Practitioner Perspectives: The DOERS3 Collaborative on OER in Tenure and Promotion Published March 2, 2021 In the following Practitioner Perspective, Andrew McKinney, OER coordinator at the City University of New York (CUNY), and Amanda Coolidge, director of Open Education at BCcampus in British Columbia, Canada, share the development of an adaptable matrix to help faculty include OER (Open Educational Resources) in their tenure and promotion portfolios.

This list is presented in reverse chronological order. A subscription may be required in some cases.
The resources here were compiled by Stacy Katz and are available in list form.


Empirical Study
Perceived usefulness of open educational resources: Impact of switching to online learning for face-to-face and distance learners
- 1 Information Technology Office, Hong Kong Metropolitan University, Hong Kong, China
- 2 Office of Research Affairs, Hong Kong Metropolitan University, Hong Kong, China
- 3 School of Open Learning, Hong Kong Metropolitan University, Hong Kong, China
This paper reports a study on university students' perceived usefulness of open educational resources (OER) in relation to the switch to online learning during the COVID-19 pandemic. The participants comprised two groups of students, one studying in a face-to-face mode and the other in a distance learning mode. They took part in a survey conducted in 2019, before the pandemic (with a total of 912 responses), and again in 2021, during the pandemic (with a total of 1,018 responses). The results show that both groups of students generally perceived OER to be more useful during the pandemic. The specific types of OER perceived as relatively more useful include open online courses and open access textbooks. Face-to-face students showed a higher level of perceived usefulness of OER for preparing tests and examinations, while distance learning students perceived OER as more useful for supplementing course materials. Both groups were concerned about the limitations of OER, especially their accuracy and comprehensiveness. The findings suggest the importance of recognizing the diverse needs of the two groups of students and offering appropriate OER support for them.
1. Introduction
Since their development in the early 2000s, open educational resources (OER) have become increasingly prevalent not only as an open source of learning resources but also as an agent for transforming teaching and learning, with the aim of making learning more accessible and equitable ( Miao et al., 2016 ). OER are teaching and learning resources released under an open license that allows reuse, revision, and redistribution with no or limited restrictions ( Li and Wong, 2021 ). Such resources are widely available for higher education. Popular types include open courseware and course materials, open online courses and tutorials, open e-books and e-journals, and open-source learning tools ( Cheung et al., 2013 ; Blomgren, 2018 ; Wong and Li, 2019 ).
The use of OER has been shown to bring a broad range of benefits for learners. It helps learners access and use learning materials easily, reduces their costs and dismantles learning barriers, facilitates the sharing of learning materials and collaboration, and improves learning performance. The extent to which learners can benefit from OER depends on factors such as the availability of support, learners' computer literacy, and the availability of suitable resources ( Li and Wong, 2015 ).
Learners' perception of the usefulness of OER is one of the major factors affecting their intention and behavior in engaging with OER ( Wong et al., 2016 ). A series of studies has been conducted in recent years on learners' perceived usefulness of OER for learning purposes ( Cheung, 2017 , 2018 , 2019 ). These studies addressed university students in Hong Kong, who in general possess an adequate level of readiness for OER ( Li and Wong, 2014 ). A number of characteristics of the students' perceptions of the usefulness of OER were identified. For example, the students tend to consider OER more useful for supplementing course textbooks and doing assignments and projects than for preparing tests and examinations. They also tend to consider open courseware, open e-books, and online dictionaries more useful than other categories of OER.
The switch by many higher education institutions from typical classroom-based learning to online learning during COVID-19, made to accommodate social distancing requirements, provided an opportunity to examine the impact of learning mode on learners' perceived usefulness of OER. Studies have shown that the benefits of using OER for learners in online learning differ from those in conventional classroom-based learning ( Wiley et al., 2017 ). However, the impact of learning mode on learners' perceived usefulness of OER has not been systematically studied, and very little is known about how OER are perceived by learners after a rapid change of learning mode caused by the pandemic.
This paper reports the results of a study addressing this gap in the literature. The study involved two identical surveys, conducted in 2019 (when classroom-based learning was adopted) and 2021 (when online learning was adopted), in which university students studying in a face-to-face mode and a distance learning mode in Hong Kong participated. The surveys focused on students' perceptions of OER, revealing the impact of a rapid switch to online learning on learners' perceived usefulness of OER. The results obtained during the pandemic period differ from those of the earlier survey, and the change in perceived usefulness occurred for students in both the face-to-face mode and the distance learning mode. The following research questions are addressed:
• How did learners' perceived usefulness of OER change after the switch to online learning?
• What are the differences between face-to-face learners and distance learners in their perceived usefulness of OER?
The rest of this paper is structured as follows. Following this introduction, Section 2 presents a review of related studies on the usefulness of OER as well as the impact of learning mode. Section 3 describes the research method, and Section 4 reports the results of this study, which addressed learners’ perceived usefulness of OER for learning purposes as well as concerns about the shortcomings of OER. In Section 5, the results are discussed in relation to those in the related studies. Section 6 concludes this paper, where potential future studies are suggested.
2. Literature review
2.1. Usefulness of OER
The use of OER has shown various benefits for education. It helps faculty members familiarize themselves with the skills and expertise relevant to OER use, and enhances their knowledge of up-to-date instructional technology developments as well as their technological and pedagogical competencies ( Okada et al., 2012 ; Stagg et al., 2018 ). Positive evaluations have also been made by university teachers using OER, who reported an improvement in course quality, for example in terms of content relevance, as they need to carefully screen materials appropriate to the courses being taught ( Bliss et al., 2013 ; Ives and Pringle, 2013 ).
From the learner perspective, the usefulness of OER has been widely reported. Learners can access education more easily, especially those living in places where learning opportunities are very limited ( Conole, 2012 ; Beetham, 2013 ; Bowen et al., 2014 ). OER also allow them to reduce expenditure on learning materials, textbooks in particular, which has been associated with higher completion rates in the courses they take ( Hilton and Laman, 2012 ). Through studying with relevant OER materials, learners may flexibly choose their learning goals, from gaining educational qualifications to developing their careers ( Shank, 2013 ; Hatzipanagos and Gregson, 2015 ; Grewe and Davis, 2017 ). The use of OER in online courses has been observed to encourage differentiated learning by reaching students at their present academic levels ( Wiley et al., 2017 ).
The study of OER usefulness has commonly been related to learners' perceptions ( Otto et al., 2021 ). Harsasi (2015) looked at university students' perceptions of the usefulness of integrating OER into e-learning, and observed positive comments from students reporting that using OER facilitates their understanding of the subject knowledge taught in lessons and enhances their technological competencies. Lin and Tang (2017) explored how university students perceived the adoption of OER to reduce statistics anxiety in their research methodology courses. They found that the adoption helps students master and make use of research and problem-solving skills in their courses. Other relevant studies include Cooney (2016) , Gurung (2017a , b) , Jhangiani et al. (2018) , and Ocean et al. (2019) , in which positive perceived usefulness of OER was noted among university students, indicating that OER are helpful in improving their learning.
Cheung (2017 , 2018 , 2019 ) performed a series of studies examining how university students, both full-time and distance learning, perceived the use of OER in their learning. The studies noted high levels of perceived usefulness of OER among both groups, who regarded OER as helpful for supplementing course textbooks and materials, acquiring more knowledge as a learning reference, and gaining resources for completing assignments and projects and for preparing course assessments. Both groups considered open courseware, open e-books, and open learning software or tools useful for learning. In terms of open online courses and online learning platforms, distance learning students tended to view them as more useful than full-time students did.
2.2. Impacts of the COVID-19 pandemic on educational delivery and OER
The outbreak of the COVID-19 pandemic has brought tremendous changes to teaching and learning. A number of studies have analyzed how the pandemic has influenced teaching and learning in various educational contexts, broadly in terms of teaching and learning materials and approaches ( Chen, 2020 ; Kanoksilapatham, 2021 ; Williams and Werth, 2021 ). With respect to studies of the impacts of COVID-19 on educational delivery, March et al. (2021) noted distinct changes in both the completion rates and the delivery modes of courses during the pandemic. They found, for example, that the completion rates of asynchronous online courses increased considerably more than those of live in-person courses, which may be attributed to the implementation of social distancing during the pandemic. Torda and Shulruf (2021) investigated whether face-to-face and online modes of delivery influenced learning outcomes, social outcomes, and wellbeing among students during the pandemic.
However, the impacts of the pandemic on learners’ perceived usefulness of OER have not been systematically explored in the literature. Very little is known about how OER are perceived by students especially after a rapid change from face-to-face learning to online learning caused by the pandemic crisis. To address this gap, the present study extended the line of prior research in the field of OER by examining how the change in the mode of learning during the pandemic has impacted on the perceived usefulness of OER among university students.
3. Research method
This study aimed to examine university students' perceived usefulness of OER in relation to the change in learning mode during the COVID-19 pandemic. It covered two groups of students—one studying in a face-to-face mode and the other in a distance learning mode. Before the pandemic, the face-to-face students studied in a traditional classroom setting, while the distance learning students studied in a flexible setting that was mainly online with some face-to-face elements. During the pandemic, both groups rapidly switched to fully online learning. The study was conducted at Hong Kong Metropolitan University, which offers both face-to-face and distance learning programs.
An online survey was developed for this study. It was conducted twice—once in December 2019 (before the COVID-19 pandemic) and again in February 2021 (during the pandemic). The participants included face-to-face students and distance learning students of Hong Kong Metropolitan University. For the survey conducted in 2019, a total of 489 and 423 valid responses were received from face-to-face students and distance learning students, respectively; their average ages were 21.65 years and 32.32 years, respectively. For the survey conducted in 2021, a total of 624 and 394 valid responses were received from the two groups, respectively; their average ages were 21.82 and 31.51 years, respectively.
The survey was based on the one used in a series of studies on students' perceived usefulness of OER ( Cheung 2017 , 2018 , 2019 ). It contains three sections. The first section collects students' overall feedback on the usefulness of OER for learning purposes, asking students to indicate their agreement or disagreement with the usefulness of OER for different learning purposes. The second section focuses on students' perceived usefulness of various categories of OER, asking students to indicate their agreement or disagreement with the usefulness of different categories of OER. The third section addresses students' perceived shortcomings of and concerns about OER, asking students to indicate their agreement or disagreement with different shortcomings of and concerns about OER.
A 5-point Likert scale is adopted for the response to each question, namely, “strongly agree,” “agree,” “neutral,” “disagree,” and “strongly disagree.” Each response is converted to a numeric score: 2 for “strongly agree,” 1 for “agree,” 0 for “neutral,” −1 for “disagree,” and −2 for “strongly disagree.” For analysis purposes, a weighted score is calculated as follows:

weighted score = (2 × p_sa + 1 × p_a + 0 × p_n − 1 × p_d − 2 × p_sd) / 100

where p_sa, p_a, p_n, p_d, and p_sd are the percentages of “strongly agree,” “agree,” “neutral,” “disagree,” and “strongly disagree” responses, respectively (the five percentages sum to 100).
The weighted score is used to interpret the students' general level of agreement (or disagreement) on a quantitative scale between −2 and 2. A weighted score close to 0 (between −0.5 and 0.5) means that students are generally neutral to a question. A weighted score greater than 0.5 means that students generally agree with a question, with a higher score implying stronger agreement. In contrast, a weighted score less than −0.5 means that students generally disagree with a question, with a lower score implying stronger disagreement.
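To make the calculation concrete, the following is a minimal sketch (in Python, not part of the original paper) of how such a weighted score could be computed from the response percentages; the example distribution is hypothetical.

```python
def weighted_score(pct_sa, pct_a, pct_n, pct_d, pct_sd):
    """Convert Likert response percentages (summing to 100) into the
    weighted score on the -2 to 2 scale described above."""
    # Numeric scores: strongly agree = 2, agree = 1, neutral = 0,
    # disagree = -1, strongly disagree = -2
    return (2 * pct_sa + 1 * pct_a + 0 * pct_n - 1 * pct_d - 2 * pct_sd) / 100

# Hypothetical distribution: 30% strongly agree, 40% agree, 20% neutral,
# 7% disagree, 3% strongly disagree
score = weighted_score(30, 40, 20, 7, 3)
print(round(score, 2))  # 0.87 -> above 0.5, i.e., general agreement
```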
4. Results
4.1. Overall usefulness of OER for learning purposes
Table 1 shows the students' perceptions of the overall usefulness of OER for various learning purposes. For each survey, the students' responses on OER usefulness for each learning purpose are shown, together with the difference between the scores from the two surveys. OER were generally perceived to be more useful for supplementing course textbooks and materials and for getting resources for doing assignments and projects. Comparing the two surveys, the perceived usefulness of OER increased for all learning purposes from the 2019/2020 survey to the 2020/2021 survey.

Table 1 . Overall usefulness of OER for various learning purposes.
The result suggests that OER served as a more useful support for learning in a fully online mode during the pandemic. For supplementing course textbooks and materials, there is only a slight increase (0.02) in the perceived usefulness from face-to-face students, but a larger increase (0.11) from distance learning students. For getting resources for preparing tests and examinations, there is a larger increase (0.16) from face-to-face students but a slight increase (0.08) from distance learning students. The results suggest that the extent of the usefulness of OER depends on the study mode.
4.2. Usefulness of specific categories of OER for learning purposes
The second section of the survey covers students' perceived usefulness of OER by category. Table 2 reports students' perceived usefulness of open courseware and course materials. Between the two surveys, the perceived usefulness among face-to-face students in 2020/2021 is higher than that in 2019/2020 for all types of open courseware and course materials, with differences of 0.12–0.19 in their scores. However, there is no clear difference in the scores of distance learning students between the two surveys (differences of −0.02 to 0.08). Open courseware and course materials, in particular supplementary online learning materials, were perceived as more useful by face-to-face students than by distance learning students during the switch to online learning.

Table 2 . Usefulness of various types of open courseware and course materials.
Table 3 presents the perceived usefulness of open online courses, tutorials and forums. Both face-to-face students and distance learning students overall indicated a higher level of usefulness for all types of open online courses, tutorials and forums in the 2020/2021 survey. The increase was more prominent for face-to-face students on open online self-contained courses as well as open online tutorials on specific topics, with differences of 0.32 and 0.29 in their scores, respectively. Comparatively, the differences in scores for distance learning students for these two types of OER were 0.11 and 0.13, respectively. Despite this increase, it is worth noting that the perceived usefulness of both groups of students for small-scale mobile learning courses and applications as well as online help desks and forums remained at a relatively low level in the 2020/2021 survey (0.35–0.54). These two types of OER were thus perceived as relatively less useful than the others.

Table 3 . Usefulness of various types of open online courses, tutorials, and forums.
Table 4 reports the perceived usefulness of open access e-books, journals, reports and other documents. There was an increase in the perceived usefulness for all of these in the 2020/2021 survey for both groups of students. The increase was particularly prominent for face-to-face students for open access e-books which include both self-contained textbooks and reference books, with a difference of 0.32 and 0.29 in scores for these two types, respectively. The result suggests that open access e-books would be relatively more helpful for face-to-face students for learning in an online mode.

Table 4 . Usefulness of various types of open access e-books, journals, reports, and other documents.
Table 5 reports the perceived usefulness of open source learning software, tools and platforms. All of these types of OER were perceived by distance learning students as more useful in 2020/2021. For face-to-face students, only online anti-plagiarism checkers and grammar checkers as well as online learning platforms for self and collaborative learning were perceived as more useful in 2020/2021. There was no clear change in their perceived usefulness of open online dictionaries and encyclopedias as well as online learning software, with differences of −0.07 and 0.02 in scores between the two surveys for these two types, respectively.

Table 5 . Usefulness of various types of open source learning software, tools, and platforms.
4.3. Concerns about the shortcomings of OER
Like many open online resources on the Internet, OER also have shortcomings, such as inaccurate, irrelevant and incomplete content. The accuracy, readability, completeness, comprehensiveness and relevancy of OER content are students' major concerns ( Tang, 2020 ). Some OER providers have established self-regulatory guidelines or measures for assuring the quality of content before publishing it on the Internet. These are usually found in open access textbook platforms, where quality assurance is essential ( Cheung et al., 2015 ; Belikov and McLure, 2020 ; Cheung, 2020 ).
Table 6 reports the students’ concerns about the shortcomings of OER for learning purposes, covering the accuracy, updatedness, comprehensiveness and organization of OER contents. For both surveys, the students’ concerns were rather neutral. Both face-to-face and distance learning students had relatively more concerns about the accuracy and comprehensiveness of the contents. In the 2020/2021 survey, the extent of concerns of face-to-face students was slightly higher for the accuracy and updatedness of OER contents, while distance learning students showed less concern for the updatedness and comprehensiveness of OER contents.

Table 6 . Concerns about OER for learning purposes.
5. Discussion
For many years, OER have been used for different learning purposes and their usefulness has been widely studied ( Ocean et al., 2019 ; Otto et al., 2021 ). In response to social distancing measures since the outbreak of the pandemic, the switch by higher education institutions, within a short span of time, from their usual classroom-based mode to fully online learning has posed an unprecedented challenge in ensuring the continuity of teaching and learning while maintaining teaching quality and learning effectiveness. The change in teaching mode has affected not only face-to-face students but also distance learning students. Being freely and openly available on the Internet, OER have served as a suitable and timely support for both groups of students during the pandemic.
This study has shown overall positive feedback from the students on the usefulness of OER during the pandemic. Both groups of students showed an increase in their perceived usefulness of OER for getting more reference materials for learning and for doing assignments and projects. This finding is consistent with related studies, in which OER allow learners to access educational materials more easily ( Beetham, 2013 ; Bowen et al., 2014 ). Also, learners may flexibly select suitable materials to meet their learning goals ( Hatzipanagos and Gregson, 2015 ; Grewe and Davis, 2017 ).
There are also differences between face-to-face and distance learning students in how their perceived usefulness changed in online learning. Face-to-face students indicated a relatively higher extent of perceived usefulness of OER for preparing tests and examinations during the pandemic, while distance learning students indicated a higher extent for supplementing course textbooks and materials. These results may reflect changes in their learning processes, learning styles and habits, as well as changes in their usage of OER and their perception of its usefulness ( Wong and Li, 2020 ; Wong and Wong, 2020 ). Following the finding in Wiley et al. (2017) that the use of OER in online courses encouraged differentiated learning, the differences in perceived usefulness between face-to-face and distance learning students suggest the need to recognize the diverse needs of these two groups of students when tailoring learning activities for them.
The diverse needs of face-to-face and distance learning students are more clearly shown in their perceived usefulness of specific types of OER. Face-to-face students perceived open courseware and course materials as more useful during the pandemic, while distance learning students did not show an observable change in their perceived usefulness of these types of OER. This may be because distance learning students were usually already provided with a complete set of course materials, including online references for their self-study ( Harsasi, 2015 ; Cheung, 2018 ), so their demand for additional course materials after switching to online learning may not have been as large as that of face-to-face students. On the other hand, both groups of students showed a relatively high level of perceived usefulness for resources such as open access textbooks, reference books, and online courses. This suggests that the quality of these resources was recognized by the students, which has also been observed in related studies ( Gurung, 2017a , b ; Jhangiani et al., 2018 ). The students may also have needed more instructional support during fully online teaching, as reflected in their need for these resources.
As explained in Cheung (2021) , many higher education institutions responded to the pandemic by delivering lectures and tutorials online via video-conferencing tools, keeping the same lesson duration and without any change in the content. Learning effectiveness may be affected when only the delivery mode is switched without any changes to the content and form of teaching. The students' needs for various types of OER imply potential problems they faced in teaching and learning during fully online learning, and faculty members should tailor their teaching to meet students' needs for learning in the online mode ( Wong et al., 2021 ).
The students' concerns about OER for learning purposes remained at a relatively low level in both surveys. This result is consistent with their overall positive feedback on the perceived usefulness of OER, which has also been reported in related studies ( Cheung, 2017 , 2018 , 2019 ). There are differences between face-to-face and distance learning students in the concerns shown in the two surveys. The face-to-face students revealed a slight increase in the extent of their concerns about the accuracy, updatedness and comprehensiveness of OER contents, while the distance learning students showed a slight decrease in the extent of these concerns. These differences may be related to changes in their learning needs and their experience with OER for learning during the pandemic period. Relevant studies have noted that learners' perceived difficulties with OER use during the COVID-19 pandemic are related to their familiarity ( Chen, 2020 ) and competence in working with the resources ( Huang et al., 2020 ).
6. Conclusion
This paper reports the usefulness of OER as perceived by university students after a rapid switch to online learning during the pandemic period. The findings contribute to revealing learners' needs for OER in relation to the change of learning mode. They show that the students generally perceived OER to be more useful when learning in the online mode, in particular for supplementing course textbooks and materials and for getting resources for doing assignments and projects. The specific types of OER perceived as relatively more useful include openly shared course materials, open online courses, open access e-textbooks, open online dictionaries, and anti-plagiarism and grammar checkers. On the other hand, the students were concerned about the shortcomings of OER, especially the accuracy and comprehensiveness of the contents. The findings suggest that students' perceived usefulness of OER is related to their learning mode. These findings supplement relevant literature with other focuses, such as advocacy of OER use in response to the pandemic ( Huang et al., 2020 ; Lee and Lee, 2021 ) and the development of related resources ( Chen, 2020 ).
This paper also addresses the diverse needs of face-to-face students and distance learning students. The results show that face-to-face students perceived resources such as open courseware and course materials as more useful, while distance learning students did not show an observable change in their perceived usefulness of these resources after switching to online learning. Their needs for various types of resources reveal potential problems they faced during online learning. The results thus help faculty members tailor their teaching to meet students' diverse needs for learning in the online mode.
The limitations of this study should be noted. The surveys focused on the perspective of university students in Hong Kong. For a comprehensive understanding of the usefulness of OER, future work should address other regions where learners' needs for OER may be different. The perspectives of other stakeholders, such as faculty members, should also be covered.
The results of this study open up opportunities for future work to examine how OER support can be tailored for various groups of students. The differences in perceived usefulness of OER between face-to-face and distance learning students suggest that future studies examine their diverse needs for learning resources and instructional support when tailoring learning activities for them. Moreover, students' learning mode can be expected to switch again after the pandemic, whether back to conventional face-to-face classroom teaching or to a hybrid mode. Students' needs for and perceived usefulness of OER may change again at that time. Future studies should address how this factor affects students' perceptions and actual use of OER.
Data availability statement
The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.
Author contributions
All authors listed have made a substantial, direct, and intellectual contribution to the work and approved it for publication.
Acknowledgments
The work described in this paper was partially supported by a grant from Hong Kong Metropolitan University (2021/011).
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Supplementary material
The Supplementary material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2022.1004459/full#supplementary-material
Beetham, H. (2013). Rethinking pedagogy for a digital age: Designing for 21st century learning (2nd Ed.). New York: Routledge.
Belikov, O., and McLure, M. (2020). A qualitative analysis of open textbook reviews authored by postsecondary educators. Int. J. Open Educ. Resour. 3, 77–114. doi: 10.18278/ijoer.3.2.6
Bliss, T. J., Hilton, J., Wiley, D., and Thanos, K. (2013). The cost and quality of online open textbooks: perceptions of community college faculty and students. First Monday 13, 10–28. doi: 10.5210/fm.v18i1.3972
Blomgren, C. (2018). OER awareness and use: the affinity between higher education and K-12. Int. Rev. Res. Open Dist. Learn. 19, 55–70. doi: 10.19173/irrodl.v19i2.3431
Bowen, W. G., Chingos, M. M., Lack, K. A., and Nygren, T. I. (2014). Interactive learning online at public universities: evidence from a six-campus randomized trial. J. Policy Anal. Manage. 33, 94–111. doi: 10.1002/pam.21728
Chen, H. H. J. (2020). Developing an OER website and analysing its use during the COVID-19 pandemic. English Teach. Learn. 44, 451–461. doi: 10.1007/s42321-020-00067-x
Cheung, K. S., Li, K. C., and Yuen, K. S. (2013). “An overview of open education resources for higher education” in Knowledge Sharing Through Technology (ICT 2013), Communications in Computer and Information Science (Singapore: Springer), 26–34.
Cheung, S. K. S., Yuen, K. S., Li, K. C., Tsang, E. Y. M., and Wong, A. (2015). Open textbooks: Engaging education stakeholders to share learning resources. International Journal of Services and Standards 10, 225–239.
Cheung, K. S. (2017). “Distance-learning students’ perception on usefulness of open educational resources” in Blended Learning: New Challenges and Innovative Practices . eds. K. S. Cheung, L. F. Kwok, W. K. Ma, L. K. Lee, and H. Yang, Lecture Notes in Computer Science , vol. 10309 (Switzerland: Springer), 389–399.
Cheung, K. S. (2018). “Perceived usefulness of open educational resources between full-time and distance-learning students” in Blended Learning: Enhancing Learning Success . eds. K. S. Cheung, L. F. Kwok, K. Kubota, L. K. Lee, and J. Tokito, Lecture Notes in Computer Science, vol. 10949 (Switzerland: Springer), 357–367.
Cheung, K. S. (2019). “A study on the university students’ use of open educational resources for learning purposes” in Technology in Education: Pedagogical Innovations, Communications in Computer and Information Science, 1048 . eds. K. S. Cheung, J. L. Jiao, L. K. Lee, X. B. Zhang, K. C. Li, and Z. H. Zhan (Singapore: Springer), 146–155.
Cheung, K. S. (2020). “A review of open access textbook platforms” in Blended Learning: Education in a Smart Learning Environment , Lecture Notes in Computer Science, vol. 12218 (Switzerland: Springer), 114–125.
Cheung, S. K. S. (2021). “Implication on perceived usefulness of open educational resources after a rapid switch to online learning mode” in Blended Learning: Re-thinking and Re-defining the Learning Process . eds. R. Li, S. K. S. Cheung, C. Iwasaki, L. F. Kwok, and M. Kageto (Switzerland: Springer), 298–308.
Conole, G. (2012). Designing for learning in an open world . New York: Springer.
Cooney, C. (2016). How do open educational resources (OER) impact students? A qualitative study at new York City College of technology, CUNY. Unpublished masters’ thesis. City University of New York. Available at: http://academicworks.cuny.edu/gc_etds/1347/
Grewe, K. E., and Davis, W. P. (2017). The impact of enrolment in an OER course on student learning outcomes. Int. Rev. Res. Open Dist. Learn. 18, 231–238. doi: 10.19173/irrodl.v18i4.2986
Gurung, R. A. (2017a). “Are OE resources high quality?” in Open: The philosophy and practices that are revolutionizing education and science . eds. R. S. Jhangiani and R. Biswas-Diener (London: Ubiquity Press), 79–86.
Gurung, R. A. (2017b). Predicting learning: comparing an open educational resource and standard textbooks. Scholarsh. Teach. Learn. Psychol. 3, 233–248. doi: 10.1037/stl0000092
Harsasi, M. (2015). The use of open educational resources in online learning: a study of students’ perceptions. Turk. Online J. Dist. Educ. 9, 74–87. doi: 10.17718/tojde.46469
Hatzipanagos, S., and Gregson, J. (2015). The role of open access and open educational resources: a distance learning perspective. Electr. J. E-Learn. 13, 97–105.
Hilton, J., and Laman, C. (2012). One college’s use of an open psychology textbook. Open Learn. 27, 265–272. doi: 10.1080/02680513.2012.716657
Huang, R., Tlili, A., Chang, T. W., Zhang, X., Nascimbeni, F., and Burgos, D. (2020). Disrupted classes, undisrupted learning during COVID-19 outbreak in China: application of open educational practices and resources. Smart Learn. Environ. 7:19. doi: 10.1186/s40561-020-00125-8
Ives, C., and Pringle, M. M. (2013). Moving to open educational resources at Athabasca University: a case study. Int. Rev. Res. Open Dist. Learn. 14, 14–26. doi: 10.19173/irrodl.v14i2.1534
Jhangiani, R. S., Dastur, F. N., le Grand, R., and Penner, K. (2018). As good or better than commercial textbooks: students’ perceptions and outcomes from using open digital and open print textbooks. Can. J. Scholarsh. Teach. Learn. 9, 1–22. doi: 10.5206/cjsotl-rcacea.2018.1.5
Kanoksilapatham, B. (2021). OER as language online lessons to enhance Thai university students’ English language skills in the COVID-19 pandemic era. Southeast Asian J. Engl. Lang. Stud. 27, 130–143. doi: 10.17576/3l-2021-2702-10
Lee, D., and Lee, E. (2021). International perspectives on using OER for online learning. Educ. Technol. Res. Dev. 69, 383–387. doi: 10.1007/s11423-020-09871-5
Li, K. C., and Wong, B. T. M. (2014). Readiness development of open educational resources in Hong Kong. International Journal of Continuing Education and Lifelong Learning 7, 119–137.
Li, K. C., and Wong, B. T. M. (2015). “Computer literacy and use of open educational resources: A study of university students in Hong Kong” in Technology in Education. Transforming Educational Practices with Technology . eds. K. C. Li, et al. (Berlin: Springer-Verlag), 206–214.
Li, K. C., and Wong, B. T. M. (2021). A review of the use of open educational resources: The benefits, challenges and good practices in higher education. International Journal of Innovation and Learning 30, 279–298.
Lin, Y. J., and Tang, H. (2017). Exploring student perceptions of the use of open educational resources to reduce statistics anxiety. J. Format. Design Learn. 1, 110–125. doi: 10.1007/s41686-017-0007-z
March, J. A., Scott, J., Camarillo, N., Bailey, S., Holley, J. E., and Taylor, S. E. (2021). Effects of COVID-19 on EMS refresher course completion and delivery. Prehosp. Emerg. Care 26, 617–622. doi: 10.1080/10903127.2021.1977876
Miao, F., Mishra, S., and McGreal, R. (Eds.) (2016). Open educational resources: Policy, costs and transformation . UNESCO and Commonwealth of Learning. Available at: http://oasis.col.org/bitstream/handle/11599/2306/2016_Perspectives-OER-Policy-Transformation-Costs.pdf?sequence=1&isAllowed=y
Ocean, M., Thompson, C., Allen, R., and Lyman, K. S. (2019). TIPs as texts: community college students’ perceptions of open educational resources. Int. J. Teach. Learn. Higher Educ. 31, 238–248.
Okada, A., Connolly, T., and Scott, P. J. (2012). Collaborative learning 2.0: Open educational resources . Hershey PA: Information Science Reference.
Otto, D., Schroeder, N., Diekmann, D., and Sander, P. (2021). Trends and gaps in empirical research on open educational resources (OER): a systematic mapping of the literature from 2015 to 2019. Contemp. Educ. Technol. 13:11145. doi: 10.30935/cedtech/11145
Shank, J. (2013). Interactive open educational resources . Hoboken, NJ: John Wiley & Sons.
Stagg, A., Nguyen, L., Bossu, C., Partridge, H., Funk, J., and Judith, K. (2018). Open educational practices in Australia: a first-phase national audit of higher education. Int. Rev. Res. Open Dist. Learn. 19, 172–201. doi: 10.19173/irrodl.v19i3.3441
Tang, H. (2020). A qualitative inquiry of K-12 teachers’ experience with open educational practices: perceived benefits and barriers of implementing open educational resources. Int. Rev. Res. Open Dist. Learn. 21, 212–229. doi: 10.19173/irrodl.v21i3.4750
Torda, A., and Shulruf, B. (2021). It’s what you do, not the way you do it – online versus face-to-face small group teaching in first year medical school. BMC Med. Educ. 21:541. doi: 10.1186/s12909-021-02981-5
Wiley, D., Webb, A., Weston, S., and Tonks, D. (2017). A preliminary exploration of the relationships between student-created OER, sustainability, and student success. Int. Rev. Res. Open Dist. Learn. 18, 60–69. doi: 10.19173/irrodl.v18i4.3022
Williams, K., and Werth, E. (2021). A case study in mitigating COVID-19 inequities through free textbook implementation in the U.S. J. Interact. Media Educ. 14, 1–14. doi: 10.5334/jime.650
Wong, B. T. M., Li, K. C., Yuen, K. S., and Wu, J. W. S. (2016). Adopting and adapting open textbooks: School teachers’ readiness and expectations. International Journal of Services and Standards 11, 160–175.
Wong, B.T.M., and Li, K.C. (2019). Using open educational resources for teaching in higher education: A review of case studies . In Proceedings of the 5th International Symposium on Educational Technology (ISET) (pp. 186–190). Hradec Králové, Czech Republic.
Wong, B. T. M., and Li, K. C. (2020). “Meeting diverse student needs for support services: A comparison between face-to-face and distance-learning students” in Innovating Education in Technology-Support Environments . eds. K. C. Li, E. Y. M. Tsang, and B. T. M. Wong (Springer), 253–268.
Wong, B. T. M., and Wong, B. Y. Y. (2020). “Student support needs for wellness in open and distance learning” in Innovating Education in Technology-Support Environments . eds. K. C. Li, E. Y. M. Tsang, and B. T. M. Wong (Springer), 227–240.
Wong, B. T. M., Kwan, R., Li, K. C., and Wu, M. M. F. (2021). “Evaluation of hybrid teaching effectiveness: Feedback from academics” in The International Conference on Open and Innovative Education (ICOIE) (Hong Kong, China).
Keywords: open educational resources, online learning, face-to-face learner, distance learner, COVID-19
Citation: Cheung SKS, Wong BTM and Li KC (2023) Perceived usefulness of open educational resources: Impact of switching to online learning for face-to-face and distance learners. Front. Psychol . 13:1004459. doi: 10.3389/fpsyg.2022.1004459
Received: 27 July 2022; Accepted: 19 December 2022; Published: 18 January 2023.
Copyright © 2023 Cheung, Wong and Li. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Simon K. S. Cheung, ✉ [email protected]
- Review article
- Open access
- Published: 13 October 2023
Are open educational resources (OER) and practices (OEP) effective in improving learning achievement? A meta-analysis and research synthesis
- Ahmed Tlili ORCID: orcid.org/0000-0003-1449-7751 1 ,
- Juan Garzón ORCID: orcid.org/0000-0002-0374-8570 2 ,
- Soheil Salha ORCID: orcid.org/0000-0003-2791-9925 3 ,
- Ronghuai Huang ORCID: orcid.org/0000-0003-4651-5248 1 ,
- Lin Xu ORCID: orcid.org/0000-0001-5426-1570 1 ,
- Daniel Burgos ORCID: orcid.org/0000-0003-0498-1101 4 ,
- Mouna Denden ORCID: orcid.org/0000-0003-0035-3490 5 , 6 ,
- Orna Farrell ORCID: orcid.org/0000-0001-9519-2380 7 ,
- Robert Farrow ORCID: orcid.org/0000-0002-7625-8396 8 ,
- Aras Bozkurt ORCID: orcid.org/0000-0002-4520-642X 9 ,
- Tel Amiel ORCID: orcid.org/0000-0002-1775-1148 10 ,
- Rory McGreal ORCID: orcid.org/0000-0003-4393-0921 11 ,
- Aída López-Serrano ORCID: orcid.org/0000-0001-9008-7960 4 &
- David Wiley ORCID: orcid.org/0000-0001-6722-4744 12
International Journal of Educational Technology in Higher Education volume 20 , Article number: 54 ( 2023 ) Cite this article
While several studies have investigated the various effects of open educational resources (OER) and open educational practices (OEP), few have focused on their connection to learning achievement. The related scientific literature is divided about the effects of OER and OEP with regard to their contribution to learning achievement. To address this tension, a meta-analysis and research synthesis of 25 studies ( N = 119,840 participants) was conducted to quantitatively investigate the effects of OER and OEP on students’ learning achievement. The analysis included course subject, level of education, intervention duration, sample size, geographical distribution, and research design as moderating variables of the obtained effects. The findings revealed that OER and OEP have a significant yet negligible ( g = 0.07, p < 0.001) effect. Additionally, the analysis found that the obtained effect can be moderated by several variables, including course subject, level of education and geographical distribution. The study findings can help various stakeholders (e.g., educators, instructional designers or policy makers) understand what might hinder the effect of OER and OEP on learning achievement, hence accommodating better learning outcomes and more effective interventions.
Introduction
Open educational resources and practices
The term Open Educational Resources (OER) was first coined at UNESCO’s 2002 Forum on Open Courseware, and it was defined in the recent UNESCO Recommendation on OER as “learning, teaching, and research materials in any format and medium that reside in the public domain or are under copyright that have been released under an open license that permit no-cost access, reuse, repurpose, adaptation, and redistribution by others” (UNESCO, 2019 ). Several studies have since reported the advantages of OER in reducing learning costs (Hilton, 2016 ), increasing accessibility to educational resources, including for students with disabilities (Zhang et al., 2020a ), and enhancing learning quality (Yuan & Recker, 2015 ; Weller et al., 2015 ; Zhang et al., 2020b ). Wiley ( 2014 ) further outlined five key characteristics, also known as the 5Rs, of using OER, namely: (1) retain—each person has the right to make and own copies of the published resource; (2) reuse—each person has the right to use the educational resource’s content in different ways depending on the learning context (e.g., formal or informal learning); (3) revise—each person has the right to revise the educational resource for different purposes (e.g., adapting it to a learning context or enhancing it); (4) remix—each person has the right to create a new educational resource by combining one or more pieces of learning content together; and (5) redistribute—each person has the right to share with others copies of the original, revised or remixed educational resource. The 5Rs can support innovation in teaching and learning, since OER can be created, used, shared and repurposed differently from traditionally copyrighted educational materials.
Building on the idea of innovation in educational resources and the idea of openness in education (Bozkurt et al., 2023 ), the Open e-Learning Content Observatory Services (OLCOS) functions as a Transversal Action under the European eLearning Programme and is committed to advancing the creation, sharing, and global utilization of OER (OLCOS, 2007 ). In 2007, OLCOS conducted a roadmap study that emphasized the significance of integrating innovative teaching methods with OER (OLCOS, 2007 ). The project underscores that merely delivering OER within traditional teacher-centered frameworks might not sufficiently prepare individuals for educational success. It advocates for the incorporation of innovative educational practices alongside OER, and notably introduced the concept of Open Educational Practices (OEP). From this perspective, OEP can be defined as OER-enabled pedagogies, or “the set of teaching and learning practices that are only possible or practical in the context of the 5R permissions which are characteristic of OER” (Wiley & Hilton III, 2018 , p. 135; cf. Bali et al., 2020 ). Ehlers ( 2011 , p. 4) defined OEP as “practices which support the (re)use and production of Open Educational Resources through institutional policies, promote innovative pedagogical models, and respect and empower learners as co-producers on their lifelong learning paths.” In a comprehensive review, Huang et al. ( 2020 ) identified five dimensions for the possible implementation of OEP, namely: OER, open teaching, open collaboration, open assessment and facilitating technologies. Some research suggests that these practices can help enhance learning quality, access, and effectiveness in universities. Given the positive potential of OER and OEP in education, their adoption has increased rapidly in recent years. A significant moment in the history of open education came with the UNESCO ( 2019 ) Recommendation on OER, which provides strategic policy support for the uptake and monitoring of OER. Accordingly, the UNESCO recommendation calls upon member states to develop national policies for the adoption of OER, including activities such as creating guidelines and strategies to incorporate OER within educational institutions or facilitating the generation and sharing of OER materials among educators. This recommendation draws considerable attention and investment to OER and OEP projects without certainty about their positive effects. At present, despite the great potential of OER and OEP in education, a majority of educators remain unaware of the transformative potential of open practice; some consider OEP to be one of the most significant teaching forms of the twenty-first century (Shear et al., 2015 ) while others are oblivious of its existence. It is also important to note that OEP is not an orthodoxy so much as a concept that can be realized in a multitude of different ways.
Research gap and study objectives
Dotson and Foley (2017) emphasized that changing the curriculum content (i.e., from proprietary to open) does not produce a change in students’ learning achievement. Harvey and Bond ( 2022 ) also argued that there is a need to investigate whether a change in the licensing of learning content has an impact on students’ learning achievement. Despite a growing body of evidence regarding the effectiveness of OER and OEP in learning, open education research has focused on other variables (e.g., affordability, accessibility). Less attention has been paid to whether OER and OEP can enhance students’ learning achievement compared to traditionally copyrighted materials (Robinson, 2015 ). For instance, Hilton ( 2016 ) conducted a systematic review of articles focusing on OER issues and learning achievement and perception, written between 2002 and August of 2015, and found that only seven of sixteen studies focused on learning achievement. The same researcher conducted another systematic review of twenty-nine OER-focused articles, written between September 2015 and December 2018, in which only nine new learning achievement studies were identified (Hilton, 2020 ). This reflects the decline in attention being paid to OER/OEP and learning achievement since 2002. Moreover, the literature about the effects of OER and OEP on students’ learning achievement is divided: some studies reported positive effects (e.g., Colvard et al., 2018 ), no effects (e.g., Fortney, 2021 ; Grissett & Huffman, 2019 ) or even negative effects (e.g., Gurung, 2017 ), implying that some students who used traditionally copyrighted materials had better outcomes than those who used OER.
The question of the relative efficacy of OER and OEP remains open. The main rationale for this study, therefore, is to examine whether or not OER and OEP can enhance learning achievement. Two systematic reviews (Hilton, 2016 , 2020 ) attempted to investigate the above-mentioned phenomenon; however, they were purely qualitative, and their results did not effectively reveal the effects of OER and OEP on learning achievement. One study by Clinton and Khan ( 2019 ) conducted a meta-analysis related to this topic; however, it investigated only the effect of open textbooks on post-secondary students’ learning achievement in the USA and Canada. Consequently, the previously obtained results do not reflect a comprehensive and in-depth investigation of the effect of OER and OEP on learning achievement.
The present investigation aims at a more in-depth coverage of the current literature by including a range of types of OER (e.g., textbooks, videos, etc.) in many countries and at many educational levels. Smith ( 2013 ) highlighted the importance of researching improvements in achievement and attainment with OER, urging further investigation into interventions that could result in significant enhancements in educational outcomes. In the same vein, Hilton ( 2020 ) suggested conducting sophisticated meta-analyses, in which effect sizes across studies are calculated, to understand the measurable effect of OER on learning achievement. In response, this study employs a systematic analysis of the OER/OEP literature to comprehensively investigate whether the data support the hypothesis that the use of OER and OEP can improve students’ learning achievement in a range of subjects. Therefore, to address this research gap, this study consists of a meta-analysis and research synthesis of the relevant literature to provide quantitative evidence on the effects of OER and OEP on learning achievement. Meta-analysis, utilizing statistical methods, was employed to accurately measure the effect of a given intervention and the associated moderators of this effect (Rosenthal & DiMatteo, 2001 ).
Additionally, several studies reported that the effects of OER and OEP on learning achievement might vary due to different confounders, such as demographic information, the type of the course delivered, educational level (grade), intervention duration, among others (e.g., Hilton, 2016 , 2020 ). Therefore, the present study takes a forward step towards analyzing if these variables might moderate the effect of OER and OEP on learning achievement. Specifically, this study addressed the following research questions:
RQ1. What is the effect of OER and OEP on students’ learning achievement?
RQ2. How does the effect of OER and OEP on students’ learning achievement vary according to the educational subject?
RQ3. How does the effect of OER and OEP on students’ learning achievement vary according to the educational level?
RQ4. How does the effect of OER and OEP on students’ learning achievement vary according to the intervention duration?
RQ5. How does the effect of OER and OEP on students’ learning achievement vary according to the sample size?
RQ6. How does the effect of OER and OEP on students’ learning achievement vary according to geographical distribution of students?
RQ7. How does the effect of OER and OEP on students’ learning achievement vary according to the research design?
Methodology
This study identifies the effects of using OER and OEP on learning achievement through meta-analysis. To secure the selection of the most relevant literature to be meta-analyzed, the researchers of the current study followed the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines (Page et al., 2021 ). Additionally, the researchers followed recommendations outlined by Kitchenham and Charters ( 2007 ). This procedure suggests three stages, namely: planning, conducting, and reporting the review. Although these guidelines were originally proposed for conducting systematic reviews, they have been successfully employed in meta-analyses (e.g., Garzón et al., 2019 ). All the processes related to the selection and codification of the studies were carried out by two coders.
Planning the review
To ensure that only relevant studies were retrieved (recall) for this meta-analysis, and hence obtain a high precision rate (Ting, 2010 ), “open educational resources” and “open educational practices” were used as search keywords. The abbreviations OER and OEP were not used as search keywords because, in scientific writing, the full name of a term is provided before its abbreviation is used. The search was undertaken in the following databases: Web of Science, Scopus, Taylor and Francis and ERIC. These databases were selected because they are popular in the field of educational technology (Bedenlier et al., 2020 ; Wang et al., 2023 ). ERIC, in particular, focuses on educational science, especially OER (Otto et al., 2021 ), and Scopus is known as the largest database for scholarly publications. The publication interval was from 2012 up to 2023. The year 2012 was selected as the starting date because it marked the release of the “UNESCO Paris OER Declaration”, which urged governments to promote the use of OER and called for publicly funded educational materials to be released in a freely reusable form. As a result, several OER initiatives were launched worldwide, which catalyzed the development of the OER field. Due to the novelty of the topic, conference papers and doctoral dissertations were considered for inclusion in the research corpus, as suggested by several studies (e.g., Chen et al., 2020 ; Denden et al., 2022 ).
The search was conducted on April 4, 2023, and identified 643 studies (Web of Science: 117, Scopus: 38, Taylor and Francis: 262, and ERIC: 226). After eliminating duplicates (n = 324), a total of 319 publications were retained for further analysis. The first filter was based on each article's title and keywords; this step removed 75 papers that were not relevant to the purpose of the present study. The abstracts of the remaining 244 papers were then read and analyzed, which removed a further 135 irrelevant papers. Finally, the remaining 109 studies were assessed against the following criteria: (1) empirical studies, (2) studies that specifically used OER or OEP, and (3) studies that provided sufficient information (i.e., mean, median, standard deviation) to calculate the effect size.
Accordingly, a study was excluded if (1) it was not empirical research, (2) it did not focus on using OER or OEP, (3) it was qualitative or review research, (4) it did not provide sufficient information to calculate the effect size, or (5) it was not written in English. This process limited the corpus to 25 papers (23 journal papers, 1 conference paper, and 1 PhD dissertation) for further examination and inclusion in the analysis. Finally, the reference section of each included paper was reviewed, but this step did not yield additional studies. Figure 1 shows the PRISMA flowchart (Page et al., 2021) of the study selection process; inter-rater reliability in each phase was above 0.7, which is considered very good (Cohen, 1960).
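The inter-rater reliability reported here is based on Cohen's (1960) kappa, which compares observed coder agreement with the agreement expected by chance. A minimal sketch follows, with an invented 2×2 include/exclude agreement table used purely for illustration:

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 coder-agreement table.

    table[i][j] = number of studies coder A placed in category i
    and coder B placed in category j (e.g., include/exclude).
    """
    total = sum(sum(row) for row in table)
    observed = sum(table[i][i] for i in range(len(table))) / total
    expected = sum(
        (sum(table[i]) / total) * (sum(row[i] for row in table) / total)
        for i in range(len(table))
    )
    return (observed - expected) / (1 - expected)

# Invented example: 90 agreed inclusions, 190 agreed exclusions, 39 disagreements.
example = [[90, 14], [25, 190]]
print(round(cohens_kappa(example), 2))  # ~0.73, above the 0.7 threshold cited above
```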

Figure 1. PRISMA flowchart for the search protocol
Conducting the review
This stage included the coding scheme for the data extraction process. To minimize the potential for bias, an online data extraction form was designed (Kitchenham & Charters, 2007). To answer the research questions above, the following information was coded for each study: (1) OER type: the type of resource used for teaching, such as textbooks or videos; (2) course subject: the subject taught using OER and OEP, such as mathematics or psychology; (3) educational level: the level at which OER and OEP were used, such as primary or bachelor's; (4) intervention duration: the length of time over which OER and OEP were used (i.e., course duration); (5) sample size: the number of participants in each study, divided following Cheung and Slavin (2016) into small (250 participants or fewer) and large (more than 250 participants); (6) region: the region (country) where the experiment was conducted; and (7) research design: the research design followed when conducting the experiment.
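To make the coding scheme concrete, each included study can be thought of as one record of the extraction form. The sketch below is illustrative only; the field names and example values are hypothetical, and the small/large split follows the Cheung and Slavin (2016) cut-off of 250 participants:

```python
from dataclasses import dataclass

@dataclass
class CodedStudy:
    """One row of a hypothetical data extraction form (illustrative field names)."""
    oer_type: str           # e.g., "open textbook", "video"
    course_subject: str     # e.g., "mathematics", "psychology"
    educational_level: str  # e.g., "primary", "bachelor's"
    duration_weeks: int     # intervention (course) duration
    sample_size: int        # number of participants
    region: str             # country/region of the experiment
    research_design: str    # e.g., "PPC", "POWC"

    @property
    def sample_size_category(self) -> str:
        # Cheung and Slavin (2016): small if n <= 250, large otherwise.
        return "small" if self.sample_size <= 250 else "large"

# Invented example record, purely to illustrate the coded fields.
study = CodedStudy("open textbook", "psychology", "bachelor's", 15, 320,
                   "North America", "POWC")
print(study.sample_size_category)  # "large"
```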
Calculation of the effect size
Comprehensive Meta-Analysis V.4 (Borenstein, 2022) software was used to conduct the present meta-analysis, and Hedges' g was used to calculate the effect sizes (Hedges, 1981). Hedges' g was preferred over Cohen's d because differences in sample size between studies can bias the estimated effect size; this bias affects studies with sample sizes smaller than 20, for which Hedges' g provides more reliable estimates than Cohen's d (Hedges & Olkin, 1985). Eleven studies followed the pretest–posttest-control (PPC) research design, in which students are randomly assigned to experimental and control treatments and are evaluated both before and after the treatment. As stated by Morris (2008), this design provides better results regarding the accuracy of d values and the control of threats to internal validity. The remaining fourteen studies followed the posttest-only with control (POWC) design, in which students are assigned to experimental and control treatments and assessed only once, after the treatment (i.e., after learning using OER or OEP).
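For reference, Hedges' g applies a small-sample correction factor to the standardized mean difference (Cohen's d). A minimal sketch for the two-group case, using invented summary statistics, shows the correction:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Hedges' g for two independent groups (treatment vs. control)."""
    # Pooled standard deviation
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled       # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c - 2) - 1)   # small-sample correction factor
    return j * d

# Invented example with small groups, where the correction matters most.
print(round(hedges_g(mean_t=75, mean_c=72, sd_t=10, sd_c=9, n_t=12, n_c=12), 3))
# ~0.305 (vs. Cohen's d of ~0.315 before correction)
```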
According to the guidelines provided by Thalheimer and Cook (2002) for interpreting effect size, an effect size is negligible if −0.15 < g < 0.15; small if 0.15 ≤ g < 0.40; medium if 0.40 ≤ g < 0.75; large if 0.75 ≤ g < 1.10; very large if 1.10 ≤ g < 1.45; and huge if g ≥ 1.45. Additionally, to test for heterogeneity in the variation of effect sizes across the reviewed studies, Q and I² were evaluated (Konstantopoulos & Hedges, 2019). Specifically, a preplanned analysis was conducted to investigate whether the field of education, the level of education, or the learning setting influenced the overall average effect size.
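The sketch below illustrates how these interpretation bands and heterogeneity statistics can be computed; the per-study effect sizes and variances are invented and serve only to show the calculation, not to reproduce the values reported in this study:

```python
def interpret_g(g):
    """Label a Hedges' g value using the Thalheimer and Cook (2002) bands
    (treated as symmetric around zero here)."""
    bands = [(0.15, "negligible"), (0.40, "small"), (0.75, "medium"),
             (1.10, "large"), (1.45, "very large")]
    for upper, label in bands:
        if abs(g) < upper:
            return label
    return "huge"

def heterogeneity(effects, variances):
    """Cochran's Q and the I^2 statistic under fixed-effect weighting."""
    weights = [1 / v for v in variances]
    pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    q = sum(w * (g - pooled) ** 2 for w, g in zip(weights, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

print(interpret_g(0.07))  # "negligible", matching the overall result reported later
# Invented per-study effects and variances, purely to show the calculation.
q, i2 = heterogeneity([0.5, -0.2, 0.9, 0.1], [0.02, 0.05, 0.03, 0.04])
print(round(q, 1), round(i2, 1))
```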
Publication bias
Three methods were used to assess publication bias: Rosenthal's classic fail-safe N, Orwin's fail-safe N, and the trim-and-fill method. Rosenthal's (1979) fail-safe number estimates how many unpublished studies with nonsignificant results would be needed to nullify the mean effect size. A fail-safe number larger than 5k + 10 (where k is the number of studies included in the meta-analysis) is considered robust, meaning that the effect sizes of unpublished studies are unlikely to affect the average effect size of the meta-analysis. However, this method assumes that the mean effect size in the missing studies is zero (Borenstein et al., 2021). To overcome this issue, Orwin (1983) proposed a more stringent method that identifies how many missing studies would bring the overall effect down to a specific non-zero value: the researcher selects a value representing the smallest effect of substantive importance and determines how many missing studies it would take to bring the overall effect below that value. Alternatively, the trim-and-fill method was proposed by Duval and Tweedie (2000) to identify publication bias by means of a funnel plot in which the studies are represented by dots. If the dots are distributed on both sides of a vertical line representing the average effect size, there is no publication bias; conversely, if most of the dots are located at the bottom of the funnel or on one side of the vertical line, publication bias is present (Borenstein et al., 2010).
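To illustrate the fail-safe-N logic described above, the following sketch computes Rosenthal's robustness threshold (5k + 10) and Orwin's (1983) fail-safe N; the criterion value used in the example is invented for illustration:

```python
def rosenthal_threshold(k):
    """Robustness threshold for Rosenthal's fail-safe N: 5k + 10."""
    return 5 * k + 10

def orwin_fail_safe_n(k, mean_effect, criterion, missing_effect=0.0):
    """Orwin's (1983) fail-safe N: number of missing studies with effect
    `missing_effect` needed to pull the mean effect down to `criterion`."""
    return k * (mean_effect - criterion) / (criterion - missing_effect)

k = 25  # number of studies included in this meta-analysis
print(rosenthal_threshold(k))  # 135: a fail-safe N above this is considered robust
# Invented criterion: how many null studies would shrink a mean g of 0.07 to 0.05?
print(round(orwin_fail_safe_n(k, mean_effect=0.07, criterion=0.05), 1))  # 10.0
```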
Description of the included sample
Table 1 presents the 25 studies included in this meta-analysis. Most of the studies (n = 19) were conducted with bachelor's students. Additionally, OER and OEP were used mostly to teach psychology (n = 6), mathematics (n = 5), or varied courses (n = 6). Among the 25 studies, 10 used a small sample size (250 participants or fewer) and 15 used a large sample size (more than 250). Hedges' g was also calculated for each study: a positive Hedges' g indicates that students using OER and OEP achieved better results than those who used traditionally copyrighted resources, and vice versa. Table 1 shows that 10 studies had a negative Hedges' g value.
Publication bias assessment
Borenstein et al. ( 2010 ) stated that a symmetric funnel plot—when the dots (studies) are distributed on both sides of the vertical line (combined effect size)—implies that there is no publication bias. However, if most of the dots are situated at the bottom of the funnel or on one side of the vertical line, there is publication bias. Figure 2 shows that the dots in this study are distributed symmetrically around the vertical line. Additionally, although some dots are outside the triangle of the funnel plot, most of them are in the upper part of Fig. 2 and not at the bottom. Therefore, it can be argued that the reliability of the present meta-analysis is not affected by publication bias.

Figure 2. Funnel plot of standard error by Hedges' g

Overall effect size for learning achievement
The meta-analysis yielded an overall effect size of g = 0.07, p < 0.001, indicating that OER and OEP had a negligible effect on students' learning achievement (see Table 2). Specifically, Document (g = −0.20; 95% CI = −0.14 to 0.10; n = 1), Interactive (text) book (g = 0.13; 95% CI = 0.11 to 0.15; n = 18) and Interactive course (g = −0.11; 95% CI = −0.14 to −0.08; n = 5) had a negligible effect on students' learning achievement. Video (g = 0.20; 95% CI = −0.33 to 0.73; n = 1) had a small effect on students' learning achievement.
The I² statistic showed that 96.60% of the variance resulted from between-study factors, implying that other variables might moderate the effect size of OER (as pointed out in the background of this study).
The forest plot presents the variation in effect size across the 25 included studies (see Fig. 3). Each black square represents a study's effect size, with the size of the square reflecting the study's weight in the analysis, and the horizontal line through each square represents the confidence interval of that effect size. The overall mean effect size (g = 0.073) is presented in the last row of the forest plot. Notably, almost half of the studies had a negative effect size, implying that in those studies the use of traditionally copyrighted materials was associated with better learning achievement than the use of OER and OEP. This further explains the negligible overall effect of OER and OEP on students' learning achievement (see Table 2).

Figure 3. Forest plot of the Hedges' g estimates and the confidence intervals of all included studies
Effect sizes of learning achievement for moderator variables
Course subject
Meta-regression was used to investigate possible variation in effect sizes across course subjects (Liesa-Orus et al., 2023). According to Table 3, the course subject model is associated with the effect sizes for learning achievement under OER (p = 0.05) (Borenstein, 2022). Moreover, the subject statistics indicate that using OER in history (p = 0.001) is related to the effect size: the coefficient indicates that the expected mean effect size for studies using OER in history is 1.14 points higher than that for studies using OER in psychology, with a standard error of 0.33 and a 95% confidence interval of 0.49–1.78. In other words, OER used in history is likely to have a significantly better effect on learning achievement than OER used in psychology.
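The confidence intervals reported for these meta-regression coefficients appear consistent with the usual normal approximation (coefficient ± 1.96 × standard error). A quick check on the history coefficient is sketched below; the same arithmetic applies to the other coefficients reported in this section, with small discrepancies attributable to rounding of the reported coefficients and standard errors:

```python
def coef_ci(coef, se, z=1.96):
    """Approximate 95% confidence interval for a meta-regression coefficient."""
    return coef - z * se, coef + z * se

low, high = coef_ci(coef=1.14, se=0.33)
print(round(low, 2), round(high, 2))  # 0.49 1.79, close to the reported 0.49-1.78
```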
Educational level
Meta-regression was used to investigate possible variation in effect sizes across educational levels (Chaudhary & Singh, 2022). According to Table 4, the educational level model is associated with the effect sizes for learning achievement under OER (p = 0.001) (Borenstein, 2022). Moreover, the educational level statistics indicate that using OER in professional development (p = 0.001) is related to the effect size: the coefficient indicates that the expected mean effect size for studies using OER in professional development is 2.26 points higher than that for studies using OER at the bachelor's level, with a standard error of 0.48 and a 95% confidence interval of 1.31–3.20. In other words, OER used in professional development is likely to have a significantly better effect on learning achievement than OER used at the bachelor's level.
Intervention duration
Meta-regression was used to investigate possible variation in effect sizes across intervention durations (Shi et al., 2023). According to Table 5, the intervention duration model is not associated with the effect sizes for learning achievement under OER (p = 0.99) (Borenstein, 2022).
Sample size
Meta-regression was used to investigate possible variation in effect sizes across sample sizes (Cheung & Slavin, 2016). According to Table 6, the sample size model is not associated with the effect sizes for learning achievement under OER (p = 0.08) (Borenstein, 2022).
Geographical distribution
Meta-regression was used to investigate possible variation in effect sizes across regions (Liesa-Orus et al., 2023). According to Table 7, the region model is associated with the effect sizes for learning achievement under OER (p = 0.01) (Borenstein, 2022). Moreover, the region statistics indicate that using OER in Asia (p = 0.001) is related to the effect size: the coefficient indicates that the expected mean effect size for studies using OER in Asia is 1.01 points higher than that for studies using OER in North America, with a standard error of 0.33 and a 95% confidence interval of 0.36–1.65. In other words, OER used in Asia is likely to have a significantly better effect on learning achievement than OER used in North America.
Research design
Meta-regression was used to explore possible associations between research design and effect sizes (Geissbühler et al., 2021). According to Table 8, the research design model is not associated with the effect sizes for learning achievement under OER (p = 0.77) (Borenstein, 2022).
Finally, to further investigate possible covariance between the confounding variables, a meta-regression including all the individual variables that yielded statistically significant results (namely subject, educational level, and region) was conducted. Table 9 reveals that subject (p = 0.01) and educational level (p = 0.001) remained significant, indicating covariance between these confounding variables.
Discussions
This meta-analysis aimed to comprehensively assess the effectiveness of Open Educational Resources (OER) and Open Educational Practices (OEP) in relation to learning achievement. The analysis of 25 independent studies revealed that the impact of OER and OEP on learning achievement is generally negligible. These quantitative findings support the conclusions drawn from qualitative (Hilton, 2016, 2020) and quantitative (Clinton & Khan, 2019) reviews comparing learning achievement between courses using open and commercial textbooks. Additionally, course subject, educational level, and student region were found to potentially moderate the effects of OER and OEP. These findings can be discussed and explained from the following perspectives.
Improvement in access does not imply improvement in learning achievement: a holistic design is needed
The use of OER and OEP is often considered an effective learning intervention because of its potential to provide equal access to educational resources for all students (Grimaldi et al., 2019). However, the results of this meta-analysis do not substantiate this hypothesis. Dotson and Foley (2017) likewise argue that changing a curriculum's content license from proprietary to open does not by itself lead to a change in students' learning achievement. In other words, we cannot expect an improvement in learning achievement simply by changing the license of a given educational resource from proprietary to open; a more comprehensive approach is required, one that changes not only the license but also the instructional approach, the way the educational resources are designed, and so on. Based on the review of the 25 included studies, ensuring an improvement in learning achievement goes beyond simple access to educational resources, and several elements should be considered, three of which are discussed below: OER quality, the instructional approach, and learners' individual factors.
OER quality
The quality of OER and the effective implementation of OEP are crucial factors that significantly influence learning achievement. High-quality OER, characterized by accurate and up-to-date content, clear learning objectives, and appropriate instructional design, have been shown to positively affect student learning outcomes (Butcher, 2015). Learners who have access to well-designed OER that align with the curriculum and provide meaningful learning experiences are more likely to engage with the materials and effectively acquire knowledge and skills. However, not all OER meet the necessary standards of accuracy, coherence, and pedagogical effectiveness. Research has indicated significant variability in the quality of OER, resulting in inconsistent learning experiences and potentially limiting their impact on learning achievement (Weller, 2017). To address this, quality assurance processes, peer review, and evaluation mechanisms are essential to ensure that content and resources meet established standards. While there was early skepticism and critique regarding OER quality based on design and economic production models (see Kahle, 2008; Weller, 2010), there is nothing inherently different between open and closed/proprietary content beyond the intellectual property rights. In other words, the quality criteria that apply to proprietary/closed resources also apply to OER, and we should not expect quality differences in OER produced under the same production modes (e.g., by experienced publishers and designers). The results of this study, which found no significant difference, substantiate this assertion.
Instructional approach
Improvements in learning achievement also depend on how students engage with OER and on the effective implementation of OEP. On the one hand, despite the easy access to learning materials that OER provide, learners may not use them at all (Feldstein et al., 2012) or may not have sufficient time to engage with them (Westermann Juárez and Venegas Muggli, 2017). On the other hand, while OER offer the advantage of making learning more individualized and may expose students to a broader range of perspectives, the content they learn may not align with objective measures of learning (Gurung, 2017). In the same vein, Zulaiha and Triana (2023) stated that a proper teaching method and learning strategy must accompany OER for them to be effectively leveraged to improve students' skills and to make a significant impact on student learning. An older study by Slavin and Lake (2008) similarly found that the choice of instructional approach has a larger impact on learning achievement than the choice of curriculum content. Besides, many educators face challenges such as time constraints, insufficient skills and competences (e.g., digital), a lack of understanding of what OER or OEP actually mean, and a lack of incentives to engage in open practices. Consequently, the widespread adoption of OEP remains limited, potentially hindering its impact on learning achievement (Tlili et al., 2021; Zhang et al., 2020b).
Learners’ individual factors
The impact of OER and OEP on learning achievement is influenced by individual learner characteristics, including prior knowledge and motivation. Tlili and Burgos ( 2022 ) emphasized the importance of providing personalized learning as students in open education might have different backgrounds and competencies. It is crucial to acknowledge that students may show diverse responses to open educational initiatives, and some may require extra support or guidance to fully reap the benefits of these resources. For example, studies have shown that students from lower socioeconomic backgrounds, who may lack essential skills in effectively utilizing OER, tend to attain lower learning outcomes compared to their peers (Robinson, 2015 ). Thus, recognizing and addressing the diverse needs of learners is important in enhancing the impact of OER and OEP on learning achievement.
Adequate experimental design is crucial for accurately measuring learning achievement
Beyond the selection and implementation of OER and OEP, the study indicates that the experimental designs applied might hinder accurate measurement of OER and OEP effects on learning achievement. Most of the 25 reviewed studies used quasi-experiments, given that random assignment is not always possible in open education. As a result, the effects of OER and OEP on learning achievement may be harder to measure accurately (Griggs & Jackson, 2017; Gurung, 2017).
Additionally, separating the effects of OER and OEP from other effects is also a challenge in the conducted experiments. Wiley ( 2022 ) described several ways in which research that purports to show the impact of OER adoption on student learning actually shows the impact of other interventions that are associated with OER adoption (e.g., when faculty receive support from an instructional designer to redesign a course after adopting OER). Pawlyshyn et al. ( 2013 ) also reported an improvement in learning achievement when OER was adopted simultaneously with flipped classrooms. However, it is not clear whether this improvement was due to the use of OER or flipped classrooms. OER are often employed alongside other interventions which can make isolating their effect methodologically problematic. This challenge of correlating improvements in learning achievement with the use of OER and OEP was also reported by other researchers (Griggs & Jackson, 2017 ; Gurung, 2017 ).
Most of the reviewed studies used final exam scores or GPA (grade point average) to measure learning achievement when incorporating OER and OEP. However, this approach is questionable, as exam design varies with the course subject and requirements, leading to variation in the measured learning achievement. It is therefore recommended to use standardized instruments when measuring learning achievement with OER and OEP (Hendricks et al., 2017; Hilton, 2020). Such normalization might also enable competence validation or even credit recognition through alternative credentials (e.g., Alternative Digital Credentials, ADCs), which is one of the open challenges around open education (Griffiths et al., 2022).
Confounding variables might lead to a variation of learning achievement
The present meta-analysis revealed that several confounding variables could affect the learning achievement of students when using OER and OEP. One of these variables is the course subject. This might be explained by the fact that some subjects have quality OER published online while others do not. For example, Lawrence and Lester ( 2018 ) highlighted a specific concern in the open content space for subjects like political science, which is the lack of available textbook options. Similarly, Choi and Carpenter ( 2017 ) found it challenging to find a suitable OER for their interdisciplinary Human Factors and Ergonomics course. The researchers discovered that OER for the Human Factors and Ergonomics course often provided in-depth content for individual topics, including extra information that is relevant to their subject of focus but not directly related to the course learning objectives. Furthermore, the limited number of OER options creates difficulties for instructors in applying some of their preferred pedagogical approaches.
The findings also revealed that students' geographic region can moderate the effect of OER and OEP on learning achievement. This could be explained by the fact that several regions, such as East Asia, have made remarkable progress in raising awareness of and adopting OER and OEP (Tlili et al., 2019), while others, such as the Arab region and sub-Saharan Africa, still lag behind (Tlili et al., 2020, 2022). This divide may translate into differences in students' perceptions of, and competencies in using, OER and OEP, and hence into varied learning achievement across regions.
Conclusions, implications and limitations
This study comprised a meta-analysis and research synthesis investigating the effects of OER and OEP on students' learning achievement, and it describes how this effect is moderated by different variables (i.e., course subject, level of education, intervention duration, sample size, and geographical distribution). To the best of our knowledge, no previous study has conducted a similar analysis. Based on the findings, it can be argued that holistic OER learning design may be needed to optimize learning outcomes, that researchers should employ adequate experimental designs when investigating the relationship between OER and learning achievement, and that confounding factors can lead to variation in learning achievement when using OER.
Implications
This study supports previous research in finding no significant differences between interventions using open and closed approaches (content or practice). This conclusion is in line with the existing literature on media/intermedia comparison studies (e.g., Clark, 1994; Salomon & Clark, 1977).
The conundrum for comparison studies such as those included in this meta-analysis is this: if a true experiment were designed to evaluate the influence of OER on learning achievement, the only variable would be the OER itself, in other words, the intellectual property license of the content (taking this to be the defining characteristic of OER). Even if this were possible, one could only expect the affordances of open licensing, such as reduced cost or ease of access, to relate to achievement (e.g., Fischer et al., 2015). But such a study would offer minimal new insight beyond what we could already expect in principle: it stands to reason that not having access to resources designed to be part of a course would reduce achievement (whether students actually did or did not access and make use of the resources in the treatment and control conditions is another study entirely).
However, the truly intriguing and critical questions pertain to practical applications. If we allow practice to vary, for example if the OER affords a different sort of practice (as OEP is defined in OER-enabled pedagogy), then we are really measuring something more holistic: the practice that includes the resource. As Salomon and Clark (1977, p. 102) conclude: "In short, when only the least significant aspects of instruction are allowed to vary, nothing of interest could, and did, result."
This study might then point us to valuable avenues for further research. Perhaps course instructors and designers are attempting to faithfully replicate courses that make use of OER simply to test possible outcomes in achievement; here, clearly, we should expect no difference to emerge. Furthermore, instructors may not really be leveraging OER-enabled pedagogy or more expansive perspectives of OEP.
Additionally, this meta-analysis might help discourage further comparison studies of OER and achievement. It points to the urgency of expanding the object of analysis beyond the intrinsic characteristics of OER and focusing on how principles of openness might significantly alter the nature of practices and courses themselves, might lead to outcomes that are not measured simply by achievement gains, and might or might not cater to different types of students.
This study can contribute to the literature from different perspectives. From a theoretical perspective, it adds to the debate about the effectiveness of OER and OEP that has been ongoing for the past twenty years by revealing what might moderate that effectiveness. From a practical perspective, it can contribute to the Sustainable Development Goals (SDGs) (UN, 2021), specifically SDG 4 (quality education), by highlighting the variables (e.g., quality, the pedagogical approach used) that different stakeholders (e.g., educators, instructional designers) should consider when adopting OER and OEP for better learning achievement. Finally, from a methodological perspective, it points out experimental criteria (e.g., standardized measurements, research design) that should be considered when designing experiments to measure the true effect of OER, hence providing more accurate results that could advance the field.
Limitations and future directions
It should be noted that statistical power was not examined in this meta-analysis, as is the case in the majority of published meta-analyses (Burçin, 2022; Dumas-Mallet et al., 2017; Thorlund & Mills, 2012). A significant barrier to the widespread assessment of statistical power in meta-analysis is the difficulty of understanding how it can be computed, given the various variables that must be considered in each study and the heterogeneity of the conducted studies (Cafri et al., 2010; Ioannidis et al., 2014; Vankov et al., 2014). Additionally, there is a lack of accessible and easy-to-use software or R scripts for computing statistical power (Griffin, 2021; Thomas & Krebs, 1997). Software such as G*Power has been developed to calculate statistical power for primary research, allowing for its widespread implementation there (Faul et al., 2007), but despite the similarity in procedure, no analogous software exists for meta-analysis. Consequently, to calculate statistical power for a given meta-analysis, researchers must perform the calculations manually, use an online calculator, or use a user-defined script (e.g., Cafri et al., 2009); these options can be limited in functionality and difficult to integrate into a reproducible workflow (Griffin, 2021).
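For orientation only, the kind of calculation involved can be sketched in a few lines. The function below follows a common normal-approximation approach to the power of the test of the fixed-effect mean; it is a generic sketch with invented inputs, not the procedure of any particular package and not a power analysis of the present meta-analysis:

```python
from math import sqrt
from statistics import NormalDist

def fixed_effect_power(delta, k, n_per_group, alpha=0.05):
    """Approximate power of the test of the mean effect in a fixed-effect
    meta-analysis of k two-arm studies with n_per_group participants per arm.
    A sketch of the usual normal-approximation approach."""
    z = NormalDist()
    # Within-study variance of a standardized mean difference (equal arms).
    v = 2 / n_per_group + delta ** 2 / (4 * n_per_group)
    se_mean = sqrt(v / k)             # standard error of the fixed-effect mean
    lam = delta / se_mean             # noncentrality parameter
    c = z.inv_cdf(1 - alpha / 2)      # two-sided critical value
    return 1 - z.cdf(c - lam) + z.cdf(-c - lam)

# Invented scenario: 25 studies, 100 participants per arm, true mean g = 0.07.
print(round(fixed_effect_power(delta=0.07, k=25, n_per_group=100), 2))  # ~0.70
```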
Although the reliability of the results was supported by the bias assessment, this study has other limitations that should be acknowledged. The results might be limited by the keywords and electronic databases used. Additionally, the analysis was based only on courses conducted in English; studies of non-English courses might reveal different results. Moreover, while the present meta-regression yielded valuable insights about the effect of OER and OEP on learning achievement and the variables that moderate this effect, the limited number of included studies might affect the generalizability of the findings. Future researchers are therefore encouraged to complement this work by covering more databases and analyzing non-English courses, hence providing a more comprehensive view of OER and OEP effects. Additionally, this meta-analysis did not consider teacher variables (e.g., whether the same teacher taught with OER and non-OER materials), which could moderate the effects of OER and OEP on learning achievement (Hilton, 2020); future studies could focus on this line of research. Finally, this meta-analysis did not consider OER quality, which has been shown to have a significant impact on students' learning outcomes (Butcher, 2015). Future research could systematically assess and incorporate OER quality as a moderating variable, further enhancing understanding of the intricate relationship between OER and learning achievement. Despite these limitations, this study provides quantitative evidence about the effects of OER and OEP on students' learning achievement.
Availability of data and materials
The datasets generated and/or analyzed during the current study are presented within this study.
References with an asterisk (*) indicate studies included in the analysis
*Allen, G., Guzman-Alvarez, A., Smith, A., Gamage, A., Molinaro, M., & Larsen, D. S. (2015). Evaluating the effectiveness of the open-access ChemWiki resource as a replacement for traditional general chemistry textbooks. Chemistry Education Research and Practice, 16 (4), 939–948. https://doi.org/10.1039/c5rp00084j
Bali, M., Cronin, C., & Jhangiani, R. S. (2020). Framing open educational practices from a social justice perspective. Journal of Interactive Media in Education, 2020 (1), 1. https://doi.org/10.5334/jime.565
*Basu Mallick D., Grimaldi P. J., Whittle J., Waters A. E., & Baraniuk R. G. (2018). Impact of OER textbook adoption on student academic outcomes. Paper presented at the 15th Annual Open Education Conference, Niagara Falls, NY.
Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020). Facilitating student engagement through educational technology in higher education: A systematic review in the field of arts and humanities. Australasian Journal of Educational Technology, 36 , 126–150. https://doi.org/10.14742/ajet.5477
Borenstein, M. (2022). Comprehensive meta-analysis software. In M. Egger, J. P. T. Higgins, & G. D. Smith (Eds.), Systematic reviews in health research: Meta-analysis in context (pp. 535–548). Wiley. https://doi.org/10.1002/9781119099369.ch27
Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. R. (2010). A basic introduction to fixed-effect and random-effects models for meta-analysis. Research Synthesis Methods, 1 (2), 97–111. https://doi.org/10.1002/jrsm.12
Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2021). Introduction to Meta-Analysis (2nd ed.). John Wiley & Sons. https://doi.org/10.1016/b978-0-12-209005-9.50005-9
Bozkurt, A., Gjelsvik, T., Adam, T., Asino, T. I., Atenas, J., Bali, M., Blomgren, C., Bond, M., Bonk, C. J., Brown, M., Burgos, D., Conrad, D., Costello, E., Cronin, C., Czerniewicz, L., Deepwell, M., Deimann, M., DeWaard, H. J., Dousay, T. A., Ebner, M., Farrow, R., Gil-Jaurena, I., Havemann, L., Inamorato, A., Irvine, V., Karunanayaka, S. P., Kerres, M., Lambert, S., Lee, K., Makoe, M., Marín, V. I., Mikroyannidis, A., Mishra, S., Naidu, S., Nascimbeni, F., Nichols, M., Olcott. Jr., D., Ossiannilsson, E., Otto, D., Padilla Rodriguez, B. C., Paskevicius, M., Roberts, V., Saleem, T., Schuwer, R., Sharma, R. C., Stewart, B., Stracke, C. M., Tait, A., Tlili, A., Ubachs, G., Weidlich, J., Weller, M., Xiao, J., & Zawacki-Richter, O. (2023). Openness in Education as a Praxis: From Individual Testimonials to Collective Voices. Open Praxis, 15 (2), 76–112. https://doi.org/10.55982/openpraxis.15.2.574
Burçin, Ö. N. E. R. (2022). Evaluation of statistical power in random effect meta analyses for correlation effect size. Sakarya University Journal of Science, 26 (3), 554–567. https://doi.org/10.16984/saufenbilder.1089793
Butcher, N. (2015). Basic guide to open educational resources (OER). Commonwealth of Learning (COL). https://doi.org/10.56059/11599/36
Cafri, G., Kromrey, J. D., & Brannick, M. T. (2009). A sas macro for statistical power calculations in metaanalysis. Behavior Research Methods, 41 (1), 35–46. https://doi.org/10.3758/brm.41.1.35
Cafri, G., Kromrey, J. D., & Brannick, M. T. (2010). A meta-meta-analysis: Empirical review of statistical power, type I error rates, effect sizes, and model selection of meta-analyses published in psychology. Multivariate Behavioral Research, 45 (2), 239–270. https://doi.org/10.1080/00273171003680187
Chaudhary, P., & Singh, R. K. (2022). A meta analysis of factors affecting teaching and student learning in higher education. Front. Educ., 6 , 824504. https://doi.org/10.3389/feduc.2021.824504
Chen, Z., Chen, W., Jia, J., & An, H. (2020). The effects of using mobile devices on language learning: A meta-analysis. Educational Technology Research and Development, 68 (4), 1769–1789. https://doi.org/10.1007/s11423-020-09801-5
Cheung, A., & Slavin, R. E. (2016). How methodological features of research studies affect effect sizes. Educational Researcher, 45 (5), 283–292. https://doi.org/10.3102/0013189X16656615
*Chiorescu, M. (2017). Exploring open educational resources for college algebra. International Review of Research in Open and Distributed Learning, 18 (4), 50–59. https://doi.org/10.19173/irrodl.v18i4.3003
*Choi, Y. M., & Carpenter, C. (2017). Evaluating the impact of open educational resources: A case study. Portal: Libraries and the Academy, 17 (4), 685–693. https://doi.org/10.1353/pla.2017.0041
Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42 (2), 21–30. https://doi.org/10.1007/BF02299088
*Clinton, V. (2018). Savings without sacrifices: A case study of open textbook adoption. Open Learning: THe Journal of Open, Distance, and e-Learning, 33 (3), 177–189. https://doi.org/10.1080/02680513.2018.1486184
Clinton, V., & Khan, S. (2019). Efficacy of open textbook adoption on learning performance and course withdrawal rates: A meta-analysis. AERA Open, 5 (3), 2332858419872212. https://doi.org/10.1177/2332858419872212
Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20 (1), 37–46. https://doi.org/10.1177/001316446002000104
*Colvard, N. B., Watson, C. E., & Park, H. (2018). The impact of open educational resources on various student success metrics. International Journal of Teaching and Learning in Higher Education, 30 (2), 262–276.
Denden, M., Tlili, A., Chen, N. S., Abed, M., Jemni, M., & Essalmi, F. (2022). The role of learners’ characteristics in educational gamification systems: A systematic meta-review of the literature. Interactive Learning Environments . https://doi.org/10.1080/10494820.2022.2098777
Dumas-Mallet, E., Button, K. S., Boraud, T., Gonon, F., & Munafò, M. R. (2017). Low statistical power in biomedical science: A review of three human research domains. Royal Society Open Science, 4 (2), 160254. https://doi.org/10.1098/rsos.160254
Ehlers, U.-D. (2011). Extending the territory: From open educational resources to open educational practices. Journal of Open Flexible and Distance Learning , 15(2), 1–10. http://www.jofdl.nz/index.php/JOFDL/index
*Engler, J. N., & Shedlosky-Shoemaker, R. (2019). Facilitating student success: The role of open educational resources in introductory psychology courses. Psychology Learning & Teaching, 18 (1), 36–47. https://doi.org/10.1177/1475725718810241
Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39 (2), 175–191. https://doi.org/10.3758/bf03193146
*Feldstein DPS, A. P., Martin, M., Hudson, A., Warren, K., Hilton III, J., & Wiley, D. (2012). Open textbooks and increased student access and outcomes. European Journal of Open, Distance and E-Learning . https://old.eurodl.org/?p=archives&year=2012&halfyear=2&article=533
Fischer, L., Hilton, J., III., Robinson, T. J., & Wiley, D. A. (2015). A multi-institutional study of the impact of open textbook adoption on the learning outcomes of post-secondary students. Journal of Computing in Higher Education, 27 (3), 159–172. https://doi.org/10.1007/s12528-015-9101-x
Fortney, A. (2021). OER textbooks versus commercial textbooks: Quality of student learning in psychological statistics. Locus: The Seton Hall Journal of Undergraduate Research, 4 (1), 4.
Garzón, J., Pavón, J., & Baldiris, S. (2019). Systematic review and meta-analysis of augmented reality in educational settings. Virtual Reality, 23 (4), 447–459. https://doi.org/10.1007/s10055-019-00379-9
Geissbühler, M., Hincapié, C. A., Aghlmandi, S., Zwahlen, M., Jüni, P., & da Costa, B. R. (2021). Most published meta-regression analyses based on aggregate data suffer from methodological pitfalls: A meta-epidemiological study. BMC Medical Research Methodology, 21 , 123. https://doi.org/10.1186/s12874-021-01310-0
*Grewe, K., & Davis, W. P. (2017). The impact of enrollment in an OER course on student learning outcomes. The International Review of Research in Open and Distributed Learning . https://doi.org/10.19173/irrodl.v18i4.2986
Griffin, J. W. (2021). Calculating statistical power for meta-analysis using metapower. The Quantitative Methods for Psychology., 17 (1), 24–39. https://doi.org/10.20982/tqmp.17.1.p024
Griffiths, D., Burgos, D., & Aceto, S. (2022). Credentialing learning in the European OER Ecosystem. Retrieved July 14, 2023, from https://encoreproject.eu/2022/09/06/credentialing-learning-in-the-european-oerecosystem/
Griggs, R. A., & Jackson, S. L. (2017). Studying open versus traditional textbook effects on students’ course performance: Confounds abound. Teaching of Psychology, 44 (4), 306–312. https://doi.org/10.1177/0098628317727641
Grimaldi, P. J., Basu Mallick, D., Waters, A. E., & Baraniuk, R. G. (2019). Do open educational resources improve student learning? implications of the access hypothesis. PLoS ONE . https://doi.org/10.1371/journal.pone.0212508
*Grissett, J. O., & Huffman, C. (2019). An open versus traditional psychology textbook: Student performance, perceptions, and use. Psychology Learning & Teaching, 18 (1), 21–35. https://doi.org/10.1177/1475725718810181
*Gurung, R. A. (2017). Predicting learning: Comparing an open educational resource and standard textbooks. Scholarship of Teaching and Learning in Psychology, 3 (3), 233–248. https://doi.org/10.1037/stl0000092
*Hardin, E. E., Eschman, B., Spengler, E. S., Grizzell, J. A., Moody, A. T., Ross-Sheehy, S., & Fry, K. M. (2019). What happens when trained graduate student instructors switch to an open textbook? A controlled study of the impact on student learning outcomes. Psychology Learning & Teaching, 18 (1), 48–64. https://doi.org/10.1177/1475725718810909
*Harvey, P., & Bond, J. (2022). The effects and implications of using open educational resources in secondary schools. The International Review of Research in Open and Distributed Learning, 23 (2), 107–119. https://doi.org/10.19173/irrodl.v22i3.5293
Hedges, L. (1981). Distribution theory for glass’s estimator of effect size and related estimators. Journal of Educational Statistics, 6 (2), 107–128. https://doi.org/10.3102/10769986006002107
Hedges, L., & Olkin, I. (1985). Statistical methods for meta-analysis . Academic Press.
*Hendricks, C., Reinsberg, S. A., & Rieger, G. W. (2017). The adoption of an open textbook in a large physics course: An analysis of cost, outcomes, use, and perceptions. International Review of Research in Open and Distributed Learning, 18 (4), 78–99. https://doi.org/10.19173/irrodl.v18i4.3006
Hilton, J. III. (2016). Open educational resources and college textbook choices: A review of research on efficacy and perceptions. Educational Technology Research and Development, 64 (4), 573–590. https://doi.org/10.1007/s11423-016-9434-9
Hilton, J., III. (2020). Open educational resources, student efficacy, and user perceptions: A synthesis of research published between 2015 and 2018. Educational Technology Research and Development, 68 (3), 853–876. https://doi.org/10.1007/s11423-019-09700-4
*Hilton, J., III., Fischer, L., Wiley, D., & Williams, L. (2016). Maintaining momentum toward graduation: OER and the course throughput rate. International Review of Research in Open and Distributed Learning, 17 (6), 18–27. https://doi.org/10.19173/irrodl.v17i6.2686
*Hilton, J. L., III., Gaudet, D., Clark, P., Robinson, J., & Wiley, D. (2013). The adoption of open educational resources by one community college math department. International Review of Research in Open and Distributed Learning, 14 (4), 37–50. https://doi.org/10.19173/irrodl.v14i4.1523
Huang, R., Tlili, A., Chang, T. W., Zhang, X., Nascimbeni, F., & Burgos, D. (2020). Disrupted classes, undisrupted learning during COVID-19 outbreak in China: application of open educational practices and resources. Smart Learning Environments, 7 , 1-15. https://doi.org/10.1186/s40561-020-00125-8
Ioannidis, J. P. A., Greenland, S., Hlatky, M. A., Khoury, M. J., Macleod, M. R., Moher, D., Schulz, K. F., & Tibshirani, R. (2014). Increasing value and reducing waste in research design, conduct, and analysis. The Lancet, 383 (9912), 166–175. https://doi.org/10.1016/S0140-6736(13)62227-8
*Jhangiani, R. S., Dastur, F. N., Le Grand, R., & Penner, K. (2018). As good or better than commercial textbooks: Students’ perceptions and outcomes from using open digital and open print textbooks. Canadian Journal for the Scholarship of Teaching and Learning . https://doi.org/10.5206/cjsotl-rcacea.2018.1.5
Kahle, D. (2008). Designing open educational technology. In T. Iiyoshi and M. S. Vijay Kumar (Eds.), Opening up education: The collective advancement of education through open technology, open content, and open knowledge (pp. 27–45). MIT Press. https://mitpress.mit.edu/9780262515016/opening-up-education/
*Kelly, D. P., & Rutherford, T. (2017). Khan Academy as supplemental instruction: A controlled study of a computer-based mathematics intervention. The International Review of Research in Open and Distributed Learning . https://doi.org/10.19173/irrodl.v18i4.2984
Kitchenham, B. A., & Charters, S. (2007). Guidelines for performing systematic literature reviews in software engineering. (EBSE 2007–001). Keele University and Durham University Joint Report.
Konstantopoulos, S. P. Y. R. O. S., & Hedges, L. V. (2019). Statistically analyzing effect sizes: Fixed-and random-effects models. The Handbook of Research Synthesis and Meta-Analysis (pp. 245–280). Russell Sage Foundation. https://doi.org/10.7758/9781610448864.15
*Lawrence, C. N., & Lester, J. A. (2018). Evaluating the effectiveness of adopting open educational resources in an introductory American government course. Journal of Political Science Education, 14 (4), 555–566. https://doi.org/10.1080/15512169.2017.1422739
Liesa-Orus, M., Lozano Blasco, R., & Arce-Romeral, L. (2023). Digital Competence in University Lecturers: A Meta-Analysis of Teaching Challenges. Education Sciences, 13 (5), 508. https://doi.org/10.3390/educsci13050508
*Medley-Rath, S. (2018). Does the type of textbook matter? Results of a study of free electronic reading materials at a community college. Community College Journal of Research and Practice, 42 (12), 908–918. https://doi.org/10.1080/10668926.2017.1389316
Morris, S. B. (2008). Estimating effect sizes from pretest-posttest-control group designs. Organizational Research Methods, 11 (2), 364–386.
OLCOS. (2007). Open Educational Practices and Resources. Available online: https://www.olcos.org/cms/upload/docs/olcos_roadmap.pdf
Orwin, R. G. (1983). A fail-safe N for effect size in meta-analysis. Journal of Educational Statistics, 8 (2), 157–159. https://doi.org/10.3102/10769986008002157
Otto, D., Schroeder, N., Diekmann, D., & Sander, P. (2021). Trends and gaps in empirical research on open educational resources (OER): A systematic mapping of the literature from 2015 to 2019. Contemporary Educational Technology, 13 (4), ep325. https://doi.org/10.30935/cedtech/11145
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ, https://doi.org/10.1136/bmj.n71
Pawlyshyn, N., Braddlee, D., Casper, L., & Miller, H. (2013). Adopting OER: A case study of cross-institutional collaboration and innovation. Educause Review. Accessed on May 20, 2023 from https://er.educause.edu/articles/2013/11/adopting-oer-a-case-study-of-crossinstitutional-collaboration-and-innovation .
*Robinson, T. J. (2015). The effects of open educational resource adoption on measures of post-secondary student success. Brigham Young University.
Rosenthal, R., & DiMatteo, M. R. (2001). Meta-analysis: Recent developments in quantitative methods for literature reviews. Annual Review of Psychology, 52 (1), 59–82. https://doi.org/10.1146/annurev.psych.52.1.59
Salomon, G., & Clark, R. (1977). Reexamining the methodology of research on media and technology in education. Review of Educational Research, 47 (1), 99–120. https://doi.org/10.3102/00346543047001099
Shear, L., Means, B., & Lundh, P. (2015). Research on open: OER research hub review and futures for research on OER. SRI International: Menlo Park, CA, USA.
*Shemy, N., & Al-Habsi, M. (2021). The effect of a Training Program based on Open Educational Resources on the Teachers Online Professional Development and their Attitudes towards it of AL-Dakhliya Governorate in Sultanate of Oman. Journal of Elearning and Knowledge Society, 17 (1), 18–28. https://doi.org/10.20368/1971-8829/1135283
Shi, W., Ghisi, G. L. M., Zhang, L., Hyun, K., Pakosh, M., & Gallagher, R. (2023). Systematic review, meta-analysis and meta-regression to determine the effects of patient education on health behaviour change in adults diagnosed with coronary heart disease. Journal of Clinical Nursing, 32 (15–16), 5300–5327. https://doi.org/10.1111/jocn.16519
Slavin, R. E., & Lake, C. (2008). Effective programs in elementary mathematics: A best-evidence synthesis. Review of Educational Research, 78 (3), 427-515. https://doi.org/10.3102/0034654308317473
Smith, M. (2013). Ruminations on Research on Open Educational Resources. William and Flora Hewlett Foundation. Retrieved from https://hewlett.org/library/ruminations-on-research-on-open-educational-resources/
*Sulisworo, D., & Basriyah, K. (2021). Problem based learning using open educational resources to enhance higher order thinking skills in physics learning. Journal of Physics: Conference Series, 1783 (1), 012108. https://doi.org/10.1088/1742-6596/1783/1/012108
Thalheimer, W., & Cook, S. (2002). How to calculate effect sizes from published research: A simplified methodology. Work-Learning Research , 1(9).
Thomas, L., & Krebs, C. J. (1997). A review of statistical power analysis software. Bulletin of the Ecological Society of America, 78 (2), 126–138.
Thorlund, K., & Mills, E. J. (2012). Sample size and power considerations in network meta-analysis. Systematic Reviews, 1 , 1-13. https://doi.org/10.1186/2046-4053-1-41
Ting, K. M. (2010). Precision and recall. In C. Sammut & G. I. Webb (Eds.), Encyclopedia of machine learning (p. 781). Springer US. https://doi.org/10.1007/978-0-387-30164-8_652
Tlili, A., Altinay, F., Huang, R., Altinay, Z., Olivier, J., Mishra, S., Jemni, M., & Burgos, D. (2022). Are we there yet? A systematic literature review of Open Educational Resources in Africa: A combined content and bibliometric analysis. PLoS ONE, 17 (1), e0262615. https://doi.org/10.1371/journal.pone.0262615
Tlili, A., & Burgos, D. (2022). Unleashing the power of Open Educational Practices (OEP) through Artificial Intelligence (AI): Where to begin? Interactive Learning Environments . https://doi.org/10.1080/10494820.2022.2101595
Tlili, A., Huang, R., Chang, T. W., Nascimbeni, F., & Burgos, D. (2019). Open educational resources and practices in China: A systematic literature review. Sustainability, 11 (18), 4867. https://doi.org/10.3390/su11184867
Tlili, A., Jemni, M., Khribi, M. K., Huang, R., Chang, T. W., & Liu, D. (2020). Current state of open educational resources in the Arab region: An investigation in 22 countries. Smart Learning Environments, 7 , 1–15. https://doi.org/10.1186/s40561-020-00120-z
Tlili, A., Zhang, J., Papamitsiou, Z., Manske, S., Huang, R., Kinshuk, & Hoppe, H. U. (2021). Towards utilising emerging technologies to address the challenges of using Open Educational Resources: a vision of the future. Educational Technology Research and Development , 69 , 515-532. https://doi.org/10.1007/s11423-021-09993-4
UN. (2021). Sustainable Development Goals. United Nations. https://www.un.org/sustainabledevelopment/
UNESCO. (2019). Recommendation on Open Educational Resources . UNESCO: Paris, France. Accessible from: https://www.unesco.org/en/legal-affairs/recommendation-open-educational-resources-oer
Vankov, I., Bowers, J., & Munafò, M. R. (2014). Article commentary: On the persistence of low power in psychological science. Quarterly Journal of Experimental Psychology, 67 (5), 1037–1040. https://doi.org/10.1080/17470218.2014.885986
Wang, H., Tlili, A., Huang, R., Cai, Z., Li, M., Cheng, Z., Yang, D., Li, M., Zhu, X., & Fei, C. (2023). Examining the applications of intelligent tutoring systems in real educational contexts: A systematic literature review from the social experiment perspective. Education and Information Technologies . https://doi.org/10.1007/s10639-022-11555-x
Weller, M. (2010). Big and Little OER. Open Ed, Barcelona. http://hdl.handle.net/10609/4851
Weller, M. (2017). The Development of New Disciplines in Education – the Open Education Example. https://oro.open.ac.uk/49737/
Weller, M., de los Arcos, B., Farrow, R., Pitt, B., & McAndrew, P. (2015). The Impact of OER on Teaching and Learning Practice. Open Praxis, 7 (4), 351–361. https://doi.org/10.5944/openpraxis.7.4.227
*Westermann Juárez, W., & Venegas Muggli, J. I. (2017). Effectiveness of OER use in first-year higher education students' mathematical course performance: A case study. In C. Hodgkinson-Williams & P. B. Arinto (Eds.), Adoption and impact of OER in the Global South (pp. 187–229). https://doi.org/10.5281/zenodo.601203
Wiley, D. (2014). The Access Compromise and the 5th R [blog post]. Iterating toward openness. Improving Learning: Eclectic, Pragmatic, Enthusiastic. https://opencontent.org/blog/archives/3221
Wiley, D. (2022). On the Relationship Between Adopting OER and Improving Student Outcomes. https://opencontent.org/blog/archives/6949
Wiley, D., & Hilton, J. L., III. (2018). Defining OER-enabled pedagogy. International Review of Research in Open and Distributed Learning . https://doi.org/10.19173/irrodl.v19i4.3601
*Winitzky-Stephens, J. R., & Pickavance, J. (2017). Open educational resources and student course outcomes: A multilevel analysis. International Review of Research in Open and Distributed Learning, 18 (4), 35–49. https://doi.org/10.19173/irrodl.v18i4.3118
Yuan, M., & Recker, M. (2015). Not all rubrics are equal: A review of rubrics for evaluating the quality of open educational resources. International Review of Research in Open and Distributed Learning, 16 (5), 16–38. https://doi.org/10.19173/irrodl.v16i5.2389
Zhang, X., Tlili, A., Huang, R., Chang, T., Burgos, D., Yang, J., & Zhang, J. (2020b). A case study of applying open educational practices in higher education during COVID-19: Impacts on learning motivation and perceptions. Sustainability, 12 (21), 9129. https://doi.org/10.3390/su12219129
Zhang, X., Tlili, A., Nascimbeni, F., Burgos, D., Huang, R., Chang, T. W., Jemni, M., & Khribi, M. K. (2020a). Accessibility within open educational resources and practices for disabled learners: a systematic literature review. Smart Learning Environments, 7 , 1–19. https://doi.org/10.1186/s40561-019-0113-2
Zulaiha, D., & Triana, Y. (2023). Students’ perception toward the use of open educational resources to improve writing skills. Studies in English Language and Education, 10 (1), 174–196. https://doi.org/10.24815/siele.v10i1.25797
Acknowledgements
Not applicable.
Author information
Authors and affiliations.
Smart Learning Institute of Beijing Normal University, Beijing, China
Ahmed Tlili, Ronghuai Huang & Lin Xu
Faculty of Engineering, Universidad Católica de Oriente, Rionegro, Colombia
Juan Garzón
Faculty of Educational Sciences and Teachers’ Training, An-Najah National University, Nablus, Palestine
Soheil Salha
Research Institute for Innovation and Technology in Education (UNIR iTED), Universidad Internacional de la Rioja (UNIR), Logroño, Spain
Daniel Burgos & Aída López-Serrano
Univ. Polytechnique Hauts-de-France, LAMIH, CNRS, UMR 8201, 59313, Valenciennes, France
Mouna Denden
INSA Hauts-de-France, 59313, Valenciennes, France
Dublin City University (DCU), Dublin, Ireland
Orna Farrell
The Open University, Milton Keynes, UK
Robert Farrow
Anadolu University, Eskisehir, Turkey
Aras Bozkurt
University of Brasília, Brasília, Brazil
Athabasca University, Athabasca, Canada
Rory McGreal
Lumen Learning and Brigham Young University, Provo, UT, USA
David Wiley
Contributions
Each author contributed evenly to this manuscript. All authors read and approved the final manuscript.
Corresponding author
Correspondence to Juan Garzón .
Ethics declarations
Competing interests.
The authors have no conflict of interest to declare.
Additional information
Publisher's note.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .
About this article
Cite this article
Tlili, A., Garzón, J., Salha, S. et al. Are open educational resources (OER) and practices (OEP) effective in improving learning achievement? A meta-analysis and research synthesis. Int J Educ Technol High Educ 20, 54 (2023). https://doi.org/10.1186/s41239-023-00424-3
Received: 19 July 2023
Accepted: 26 September 2023
Published: 13 October 2023
DOI: https://doi.org/10.1186/s41239-023-00424-3
Keywords
- Open educational resources
- Open educational practices
- Learning achievement
- Meta-analysis
- Meta-synthesis
Research into Open Educational Resources for Development
Higher education plays an important role in helping developing countries reach their development goals. Yet higher education institutions in many developing countries face a number of challenges, among them growing demand for postsecondary education at a time when most universities lack sufficient funds, human resources, and up-to-date curricula. To put this into context, the number of undergraduate and graduate students in China has grown by approximately 30% per year since 1999, yet the supply of quality postsecondary educational opportunities has not kept pace.

One potentially effective response to these challenges is open educational resources (OER). The Hewlett Foundation defines OERs as "teaching, learning, and research resources that reside in the public domain or have been released under an intellectual property license that permits their free use and re-purposing by others." OERs are gaining significant reach globally, thanks in part to greater access to the Internet and new, flexible intellectual property licenses. The Massachusetts Institute of Technology (MIT) created one of the world's first OERs (MIT OpenCourseWare) by putting course materials, syllabi, and lectures freely online. The site receives over one million visits per month, of which 27% originate from East and South Asia alone.

While OERs are receiving considerable attention in universities and education ministries, and among donors, questions remain about the extent to which they help meet the demand for high-quality tertiary education in developing countries. Research is needed to move beyond the rhetoric and to establish whether, and how, OERs bridge the educational gap.

The Researching Open Educational Resources for Development (ROER4D) project is a global research network on OERs and development. The University of Cape Town in South Africa coordinates the network, which includes 12 sub-projects in South America, sub-Saharan Africa, and Asia. ROER4D seeks to improve our understanding of the use and impacts of OERs in developing countries. More specifically, the project will:
- build an empirical knowledge base on the use and impact of OERs for postsecondary education;
- develop researchers' capacity to analyze open educational resources;
- build a network of scholars focused on the contributions of OERs; and
- communicate research results to inform educational policy and practice.
