Open Access

Learning/training video clips: an efficient tool for improving learning outcomes in Mechanical Engineering

  • Silvia De la Flor López1,
  • Francesc Ferrando1 and
  • Albert Fabregat-Sanjuan1
International Journal of Educational Technology in Higher Education 2016, 13:6

https://doi.org/10.1186/s41239-016-0011-4

Received: 13 October 2014

Accepted: 13 January 2015

Published: 19 February 2016

Abstract

An innovative blended-learning strategy is presented for the subject Laboratory of Elasticity and Strength of Materials, based on the use of different tools combining face-to-face methods with e-learning technologies to improve learning outcomes in Mechanical Engineering. The enhanced teaching environment includes upgraded teaching material, new self-assessment methods, and video clips providing detailed instructions for each practical session. Our challenge was to improve the learning method by changing from a model that simply presents the practical sessions to a model that actively involves students in the learning process. The results of an anonymous student satisfaction survey show that these improvements have been very well received; students consider that the new techniques are very useful both for better understanding the subject and for preparing for the oral exam. Rates of success demonstrate that the improvements have had a direct and significant impact on performance, as well as reducing the dropout rate. As an overall conclusion, we can state that all the new techniques are effective tools for improving learning outcomes in the Degree in Mechanical Engineering.

Keywords

Video clips · Blended learning · Self-assessment · Learning outcomes · Student satisfaction · Rate of success

Introduction

Since the incorporation of Spanish universities into the European Higher Education Area, teaching methodologies have been radically overhauled, creating a more open and student-centred educational experience that fosters self-directed, participatory, active, group-oriented and engaged learning (Márquez & Jiménez, 2014).

The integration of practical training into the university curriculum, in order for students to develop a suitable professional profile, has become an issue of particular relevance since the launch of the Bologna Process (Molina et al. 2008), under which practical training is given greater prominence to strengthen the bond between theory and practice, which is fundamental to the acquisition of quality professional competencies (Molina et al. 2008; Colombo & Gómez Pradas 2014). With the implementation of the Bologna Process, teaching approaches have changed significantly and now focus more explicitly on student performance. Consequently, lecturers have looked for new methodologies and learning tools that strengthen the role of students, enabling them to take control of their own learning processes (Gámiz Sánchez et al. 2014). The new methodologies are geared towards fostering student participation to promote an autonomous learning process in a student-centred environment (Kramarski & Michalsky 2009; De Miguel, 2006).

Based on these principles, the Degree in Mechanical Engineering at Rovira i Virgili University uses unique teaching methods that bolster the students’ aptitudes and abilities, applying advanced experimental and practical teaching techniques. The experimental techniques are used for a number of practical subjects (laboratory subjects) that dovetail with the corresponding theoretical subjects but which have their own assessment, management and organisational structures.

The laboratory subject analysed in this paper is Laboratory of Elasticity and Strength of Materials, taught in the second semester of the second year of the Degree in Mechanical Engineering. Students complete 60 face-to-face teaching hours in one semester to acquire a practical understanding of the main concepts of materials strength as applied to engineering systems. The basic content is divided into three modules:
  • Module I, characterisation, testing and inspection of structural materials, with four practical sessions.

  • Module II, experimental characterisation of strain gauge measurement techniques, with four practical sessions.

  • Module III, development of the basic principles of the finite element method, with four computer sessions.

Each week, students attend a 4-hour practical session for one of the modules.

Each module has a coordinator, who is responsible for the theoretical content, the teaching material, the assessment system and the coordination of the five other lecturers who teach the subject. All the supporting teaching material can be accessed in the corresponding virtual space on Moodle.

The main learning outcomes of this subject are, broadly, the ability to apply experimentally the fundamentals of elasticity and strength of materials (derived from a specific competence), the ability to apply engineering knowledge to practice (derived from a cross-disciplinary competence) and the ability to produce correct oral communication, structured, clear and appropriate to the communicative situation (derived from a core competence). The latter two competencies are also considered Generic Competencies in the Dublin Descriptors: competencies that are key, cross-disciplinary and transferable to a wide variety of personal, social, academic and professional contexts throughout life. By acquiring these skills, on completion of the degree students will possess not only technical competencies but also methodological, human and social competencies (García-Aracil & Van der Velden, 2008; Villa, 2007).

The assessment system for this subject focuses on both the content and the acquisition of these competences. One of the evaluation techniques used in this system is the oral exam, during which the student gives a five-minute presentation on one of the practical sessions, chosen at random, and then answers questions from the teaching staff for another five minutes. The learning outcomes evaluated in this oral exam are as follows:
  • The practical session is presented in a proper, clear and organised manner.

  • The time allocated to the presentation is observed.

  • The student is capable of discussing and answering the questions posed.

The other evaluation items in the assessment system for this subject are the marks obtained in each practical session (see Table 1).
Table 1

Evaluation items and weighting

Oral Exam (70 %)
  • 30 % First Oral Exam (Modules I & II)
  • 40 % Final Oral Exam (all modules)

Reports from Practical Sessions (PS) (30 %), all PS equally rated
  • 10 % Personal attitude in the lab
  • 20 % Written report on each PS

From our experience of teaching this subject, we have identified certain problems that impede or delay the acquisition of the learning outcomes, significantly reduce the rate of success (the ratio of students who pass the exam to the total number of students taking the exam) and increase the dropout rate (the ratio of students who do not take the exam to the total number of students enrolled in the subject) (Fernández Rico et al. 2007; Galán & Cabrera, 2002). In addition, a systematic decline in student satisfaction has been detected in each academic year.
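The two ratios defined above follow directly from the enrolment and exam counts. A minimal sketch, using hypothetical counts loosely modelled on the 2013/14 cohort (75 enrolled, figures for illustration only):

```python
def rate_of_success(passed: int, took_exam: int) -> float:
    """Ratio of students who pass the exam to those who take it."""
    return passed / took_exam

def dropout_rate(enrolled: int, took_exam: int) -> float:
    """Ratio of students who do not take the exam to those enrolled."""
    return (enrolled - took_exam) / enrolled

# Hypothetical cohort: 75 enrolled, 57 take the exam, 51 pass
print(f"success: {rate_of_success(51, 57):.0%}")  # success: 89%
print(f"dropout: {dropout_rate(75, 57):.0%}")     # dropout: 24%
```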

These problems may be caused by various factors:
  • There is too much time between the first practical sessions and the exam date (about four months), as a result of which the students do not remember the main concepts of the first sessions.

  • In the oral exam, the students are unable to summarise the chosen practical session in five minutes.

  • The students do not receive guidance or model questions similar to those posed by the examiners, so they do not know how to prepare for the oral exam.

Another, less significant, problem that might noticeably reduce student satisfaction is the lack of uniformity between the different teaching materials and teaching methods, mainly due to the involvement of five different lecturers.

In order to reverse this trend, we intend to drastically change the teaching and learning methodology used for our laboratory subject, adapting it to a more interactive system by introducing a blended-learning strategy based on the use of different teaching tools that combine face-to-face methods with e-learning technologies. This type of strategy is characterised by the convergence of traditional face-to-face systems and online distance learning systems, thus forming integrated and complementary environments (Graham, 2006; Osorio, 2010; Gámiz Sánchez et al. 2014). Online methodologies in general and blended-learning models in particular have been well received by lecturers and students because of their usefulness, the flexibility they allow and the greater degree of student involvement and participation they permit (Sancho & Escudero, 2012; Seluakumaran et al. 2011). The implementation of blended-learning strategies has also been found to improve academic performance (Cabero et al. 2013; Garrison and Vaughan 2008).

The tools developed and implemented in our project are also aimed at improving autonomous learning before and after each practical session. The most important of these tools are a series of video clips summarising the lab sessions and a set of self-assessment activities. Several studies have related the use of a learning strategy based on self-assessment activities to improvements in student performance (Boud, 2003; Ćukušić et al. 2014; Snodin, 2013), so they would seem to be an excellent aid for our subject. Additionally, the incorporation of web tools, such as video clips accessible via YouTube, can be effective in improving levels of student engagement in the learning process, facilitating a learning-by-doing approach and better levels of retention and competence acquisition (Manca and Ranierit 2013; Rama, 2014).

The challenge is to introduce this set of e-learning tools in combination with traditional face-to-face practical training (essential in this kind of subject) to improve learning outcomes for laboratory subjects in Mechanical Engineering. This project therefore focuses on the careful design, development, implementation and evaluation of a new teaching methodology for practical training in Degree in Mechanical Engineering laboratory subjects, aided by e-learning environments.

Method

The improved teaching tools

We have introduced a number of interactive tools to enhance the learning experience: teaching material, with illustrated presentations (called learning by reading); video clips, with detailed instructions for each practical session (with a QR code linking to the video on YouTube for mobile devices), called learning by watching; and self-assessment methods to be carried out before (Are you ready?) and after (What have you learned?) each practical session.

These new resources have been integrated into the Moodle space for the subject and uploaded to YouTube, and are structured to achieve a high level of performance in learning activities. Our challenge is to improve the learning method by changing from a model that simply presents the practical sessions to a model that actively involves students in the learning process. We aim to convert our educational model to a “user-centred” model that places the students at the centre of the learning process and empowers them to guide their own educational experience.
  • Learning by reading (Fig. 1). We have developed enhanced teaching material for each practical session, which unifies the content, provides explanatory illustrations and photographs of the experimental methods, and highlights the main objectives and conclusions. The new materials also provide clear and concise step-by-step instructions for each practical session and are accompanied by Are you ready? and What have you learned? self-assessment materials on the main questions related to the session. A QR code on the front cover of each document links to the associated video clip. QR codes have been widely used in education, either to link to collections of student resources (Román & Martín, 2014; Allueva, 2013) or to integrate audio-visual media (Geyer, 2010; Marquis, 2012; Watson, 2013). In our laboratory environment, these codes can link to detailed instructions on how to use the equipment for the practical session. The aim is to integrate a full range of techniques so that students can use any of them when studying for the oral exam. With the improvements made to this material, we aim to help students achieve the specific learning outcome evaluated in the oral exam: The practical session is presented in a proper, clear and organised manner.
    Fig. 1

    Examples of teaching material for two practical sessions: a Module I; b Module II

  • Learning by watching (Fig. 2): This is the most significant improvement to the subject; it has also required the most time and dedication to prepare. For each practical session, we have prepared a 5–10 min illustrative video, which presents the fundamental aspects (overall objectives, operational objectives, key ideas, conclusions, etc.). The videos are intended to help students summarise each practical session for the oral exam. They are also a useful reminder of the content of sessions completed at the beginning of the semester. The addition of this e-learning tool enables students to acquire the necessary competences for the oral exam: The practical session is presented in a proper, clear and organised manner and The time allocated to the presentation is observed.
    Fig. 2

    Example slides from video clips: a, b Module I; c Module II; d Module III

  • Are you ready? We have also designed self-assessment tests, one for each practical session, consisting of 20–25 items classified by difficulty and containing a combination of numerical and multiple-choice questions. Part of each test focuses on knowledge that we consider essential for successfully completing the practical session. Students should complete the tests, which are accessible via Moodle, before the corresponding practical sessions.

  • What have you learned? In order to help students prepare for the oral exam, part of the self-assessment tests also focuses on concepts that students must be familiar with by the end of each practical session. These concepts will be discussed in the oral exam and must therefore be learned to achieve the desired learning outcome: The student is capable of discussing and answering the questions posed.

Both self-assessment methods are implemented in Moodle with feedback and allow an unlimited number of attempts (see Fig. 3).
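As an illustration, a single multiple-choice item with per-answer feedback can be expressed in Moodle's XML import format; the question content below is invented for the example (the actual items reside in the subject's Moodle space):

```xml
<quiz>
  <question type="multichoice">
    <name><text>Are you ready? - Strain gauges</text></name>
    <questiontext format="html">
      <text>Which quantity does a strain gauge measure directly?</text>
    </questiontext>
    <answer fraction="100">
      <text>A change in electrical resistance proportional to strain</text>
      <feedback><text>Correct: strain is inferred from the resistance change.</text></feedback>
    </answer>
    <answer fraction="0">
      <text>Stress in the specimen</text>
      <feedback><text>No: stress is calculated afterwards from the measured strain.</text></feedback>
    </answer>
    <shuffleanswers>true</shuffleanswers>
  </question>
</quiz>
```

Note that the unlimited-attempts behaviour is configured in the Moodle quiz activity settings ("Attempts allowed: Unlimited"), not in the question file itself.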
Fig. 3

Self-assessment questions for Module II

Tools for the analysis

The study was conducted on students taking the Laboratory of Elasticity and Strength of Materials subject in the 2013/14 academic year. The academic results were analysed for 75 students, of whom only 5 % were female (four female and 71 male), which is a common proportion in engineering degrees (Ferrando et al. 2013). The average age of participants was 20 years (age interval 19–23 years).

To measure the students’ perceptions of and satisfaction with the new resources, a survey was conducted via Moodle, containing twenty questions divided into three main categories:
  • Questions related to rating the usefulness of the improvements for better understanding the overall subject, and their usefulness for preparing for the oral exam. A modified 10-point Likert scale was used (1 = not at all, 10 = totally agree).

  • Questions with closed-answer text to analyse which of the improvements were most highly valued by students.

  • Closed questions (YES/NO) regarding the usefulness of the video clips and self-assessment methods for achieving the main learning outcomes.

The survey was conducted at the end of the semester, after the final oral exam. Of the 75 students enrolled in the subject, 50 answered this survey (4 female and 46 male). The data collected from the survey were analysed to obtain the respective percentages for each question posed.
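Obtaining the percentage breakdown for each question is a simple tally over the responses. A minimal sketch with invented YES/NO answers (the 49/1 split below is hypothetical, chosen only to illustrate the calculation):

```python
from collections import Counter

def percentages(responses):
    """Percentage of respondents choosing each answer option."""
    counts = Counter(responses)
    total = len(responses)
    return {option: 100 * n / total for option, n in counts.items()}

# Hypothetical YES/NO answers from 50 respondents
answers = ["yes"] * 49 + ["no"] * 1
print(percentages(answers))  # {'yes': 98.0, 'no': 2.0}
```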

To measure the results of the autonomous learning process with the implementation of this interactive teaching system, the students’ final grades for the subject were also analysed, considering the rate of successful task completion (rate of success) and the participation rate (dropout).

Results and discussion

Rate of successful task completion (rate of success) and participation rate (dropout)

To analyse whether the improvements have helped to increase the rate of success, Fig. 4 presents a comparison between the grades obtained in previous years with those obtained in the current year (when the new techniques were introduced). This comparison helps to understand the advantages brought by these improvements and to quantify the benefits of their implementation.
Fig. 4

Grades obtained in the last three academic years

We can see from Fig. 4 that the results for the first two years were similar and that grades have risen dramatically in the current year. The number of dropout students (NP) has been considerably reduced (from 41 % in 2011/12 to 24 % in 2013/14), as has the number of failed students (from 20 % to 8 %). By contrast, the number of students achieving good grades (between 7 and 9) has increased noticeably, from 3 % in 2011/12 to 20 % in 2013/14. Unfortunately, there are still no students in the highest grade band, although this is probably due to the use of continuous assessment, which produces a more spread-out grade distribution.

Based on the results presented above, we can conclude that the effective use of our improved activities has had a considerable impact on the dropout rate and on the students’ grades. It would be useful to know if the students who have actively used these improved teaching methods are those who have obtained better grades, but since the survey was anonymous, the two facts cannot be correlated.

Students’ perceptions and satisfaction

To analyse the students’ opinions of the new resources, a survey was conducted via the Moodle platform. The 20 questions were designed to determine the degree of satisfaction with the improvements introduced.
  • Rating the usefulness of the improvements for better understanding the overall subject and preparing for the oral exam.

Figure 5 shows the students’ ratings with respect to the question: Rate the usefulness of the improvements for better understanding the overall subject: (i) video clips; (ii) self-assessment methods; and (iii) teaching material. The most noticeable result is that all of the students’ ratings fall between 5 and 10, even though a scale of 1 to 10 was used, which suggests that the improvements were generally well received.
Fig. 5

Student ratings of the usefulness of each improvement for better understanding the overall subject

Similarly, an initial observation of the data indicates that the students considered video clips to be the most useful item (with a mean rating of 8.9) and teaching material to be the least useful, although they still valued it highly (a mean rating of 8.3).

Figure 6 shows particularly illustrative results for the second question: Rate the usefulness of the improvements for preparing for the oral exam: (i) video clips; (ii) self-assessment methods; and (iii) teaching material. In this case, by comparing the results with those shown in Fig. 5, it is clear that students consider the improvements to be significantly more useful for preparing for the oral exam than for better understanding the overall subject, and that they found video clips (mean rating of 9.2) and self-assessment methods (mean rating of 9.3) the most useful. Surprisingly, although teaching material was considered useful for better understanding the overall subject (mean rating of 8.3), it was less highly valued for preparing for the oral exam (mean rating of 7.6).
Fig. 6

Student ratings of the usefulness of each improvement for preparing for the oral exam

Nevertheless, from the results for the first two questions we can conclude that, in general terms, the improvements have been very well received by students, and that they are considered very useful both for better understanding the overall subject and for preparing for assessment.
  • Items with closed-answer text to analyse which of the improvements were most highly valued by students.

In order to identify which aspects of the teaching material and video clips were most highly valued by the students, the following questions were asked:
  1. Which of the following aspects of the teaching material do you consider most valuable?

     a. They are uniform in structure

     b. A QR code is inserted in the front cover

     c. They are clearly structured, organised and explained

     d. They include explanatory illustrations and photographs

  2. Which of the following aspects of the video clips do you consider most valuable?

     a. They are short and concise

     b. They can be accessed via QR code

     c. They are good reminders of the sessions

     d. They are uniform in structure

Figure 7 shows the students’ qualitative assessments of the improvements. In Fig. 7(a) it can be clearly seen that students found the structure, organisation and clarity of the teaching material to be the most valuable aspect (60 %), followed by uniformity (21 %) and illustrations and photographs (19 %). This confirms our hypothesis that the decline in student satisfaction was due, in part, to the lack of uniformity of the different teaching materials and teaching methods and suggests that the problem has been corrected with the introduction of the improved teaching material.
Fig. 7

Qualitative assessments of the (a) teaching material and (b) video clips

Figure 7(b), as expected, shows that the most highly valued aspect of the video clips was that they provide a clear reminder of the lab sessions (69 %), particularly the sessions carried out at the beginning of the semester. Again, our hypothesis that the decline in student satisfaction was also due to too much time elapsing between the first practical sessions and the exam date is clearly confirmed, and the problem has been corrected. We also hypothesised that dissatisfaction arose from the fact that, in the oral exam, the students are unable to summarise the chosen practical session in the allotted time; we therefore expected students to value the short length and conciseness of the video clips, yet only 24 % selected this as the most valuable aspect.

Finally, the most surprising result shown in Fig. 7 (a, b) is the total lack of value – 0 % – given to the QR codes, which totally contradicts our initial assumptions about their usefulness. The students’ views are clear: they do not value this aspect of the improved teaching resources.
  • Closed questions (YES/NO) on the usefulness of the most relevant resources (video clips and self-assessment methods).

In order to ascertain whether the video clips and self-assessment methods have helped students to achieve two of the main learning outcomes (the specific competence, the ability to apply experimentally the fundamentals of elasticity and strength of materials; and the core competence, the ability to produce correct oral communication), we asked four questions:
  1. Do you think that the video clips have helped you understand and complete each lab session?

  2. Do you think that the self-assessment methods have helped you understand and complete each lab session?

  3. Do you think that the video clips can help to reduce your preparation time for the oral exam?

  4. Do you think that the self-assessment methods can help to reduce your preparation time for the oral exam?

The results, shown in Table 2, are largely what we expected to find: all the students (100 %) consider that the self-assessment tests help to reduce preparation time for the oral exam, and 98 % of students consider that the video clips help them understand the lab sessions. These data therefore support our hypothesis that the resources help students achieve two of the main learning outcomes: the specific competence, in the case of the video clips; and the core competence, in the case of the self-assessment methods. However, in contrast to our initial assumptions, the results also show that most students (60 %) did not find the self-assessment tests helpful for understanding and carrying out the lab sessions, so we can conclude that the Are you ready? questions completed before each session are of limited use for acquiring the specific competence.
Table 2

Results for questions relating to the video clips and self-assessment methods

         Video clips                              Self-assessment methods
         Understanding    Reducing preparation    Understanding    Reducing preparation
         lab sessions     time for oral exam      lab sessions     time for oral exam
Yes      98 %             55 %                    40 %             100 %
No       2 %              45 %                    60 %             0 %

Conclusions

Video clips for teaching and learning, combined with other interactive techniques (self-assessment methods and enhanced teaching material), are effective tools for improving learning outcomes in the Degree in Mechanical Engineering. These resources have significantly improved student grades, considerably reduced the dropout rate, and increased the overall pass rate. Moreover, the degree of student satisfaction, ascertained through an anonymous survey, has increased noticeably, so it can be concluded that the improvements have been well received. The students consider the new resources to be very useful both for better understanding the overall subject and for preparing for the oral exam. Our principal hypotheses regarding the causes of the annual decline in student satisfaction have been confirmed, and the underlying problems have been corrected, with a consequent improvement in the students' acquisition of the specific and core competences.

To follow up this work, it would be interesting to monitor grades and student satisfaction over the coming years to gauge whether the improvements observed in the study are maintained or if additional improvements are recorded.

Declarations

Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors’ Affiliations

(1)
Departament d’Enginyeria Mecànica, Universitat Rovira i Virgili (URV)

References

  1. Allueva A (2013) Experiencia de uso de los códigos QR en docencia. EuLES. Retrieved from http://eules.unizar.es/experiencia-de-uso-de-los-codigos-qr-en-docencia/
  2. Boud D (2003) Enhancing Learning through Self-assessment. RoutledgeFalmer, Taylor & Francis Group, New York
  3. Cabero J, Llorente MC, Morales JA (2013) Contributions to e-Learning from a Best Practices Study at Andalusian Universities. RUSC Univ Knowl Soc J 10(1):45–60. doi:10.7238/rusc.v10i1.1159
  4. Colombo A, Gómez Pradas M (2014) SIMULACRE: A proposal for practical training in e-learning environments. RUSC Univ Knowl Soc J 11(3):4–20. doi:10.7238/rusc.v11i3.1781
  5. Ćukušić M, Garača Ž, Jadrić M (2014) Online self-assessment and students' success in higher education institutions. Comput Educ 72:100–109. doi:10.1016/j.compedu.2013.10.018
  6. De Miguel M (2006) Metodologías de enseñanza y aprendizaje para el desarrollo de competencias. Orientaciones para el profesorado universitario ante el Espacio Europeo de Educación Superior. Alianza Editorial, Madrid
  7. Fernández Rico JE, Fernández Fernández S, Álvarez Suárez A (2007) Academic Success and Student Satisfaction with University Teaching. RELIEVE Revista ELectrónica de Investigación y EValuación Educativa 13(2):203–214. Retrieved from http://www.uv.es/RELIEVE/v13n2/RELIEVEv13n2_4.htm
  8. Ferrando PJ, Gutiérrez-Colón M, Paleo P, De la Flor S, Ferrando F (2013) Distortions and gender-related differences in the perception of mechanical engineering in high school students. Psicothema 25(4):494–499. doi:10.7334/psicothema2013.15
  9. Galán Delgado E, Cabrera Guillén P (2002) Factores contextuales y rendimiento académico. Revista Electrónica Interuniversitaria de Formación del Profesorado 5(3). Retrieved from http://dialnet.unirioja.es/servlet/articulo?codigo=1034506
  10. Gámiz Sánchez V, Montes Soldado R, Pérez López M (2014) Self-assessment via a blended-learning strategy to improve performance in an accounting subject. RUSC Univ Knowl Soc J 11(2):41–54. doi:10.7238/rusc.v11i2.2055
  11. García-Aracil A, Van der Velden R (2008) Competencies for Young European Higher Education Graduates: Labor Market Mismatches and their Payoffs. High Educ 55:219–239. doi:10.1007/s10734-006-9050-4
  12. Garrison R, Vaughan H (2008) Blended learning in higher education: Framework, principles and guidelines. Jossey-Bass, San Francisco
  13. Geyer S (2010) 7 ways higher education can use QR codes to connect with current and prospective students. Retrieved from http://blogem.ruffalonl.com/2010/11/24/7-ways-higher-education-qr-codes-connect-current-prospective-students/
  14. Graham CR (2006) Chapter 1: Blended Learning Systems: Definitions, Current Trends and Future Directions. In: Bonk CJ, Graham CR (eds) The Handbook of Blended Learning: Global Perspectives, Local Designs. Pfeiffer Publishing, San Francisco, pp 3–21
  15. Kramarski B, Michalsky T (2009) Investigating Preservice Teachers' Professional Growth in Self-Regulated Learning Environments. J Educ Psychol 101(1):161–175. doi:10.1037/a0013101
  16. Manca S, Ranierit M (2013) Is it a tool suitable for learning? A critical review of the literature on Facebook as a technology-enhanced learning environment. J Comput Assist Learn 29(6):487–504. doi:10.1111/jcal.12007
  17. Márquez Lepe E, Jiménez-Rodrigo ML (2014) Project-based learning in virtual environments: a case study of a university teaching experience. RUSC Univ Knowl Soc J 11(1):76–90. doi:10.7238/rusc.v11i1.1762
  18. Marquis J (2012) Your Quick-Guide To Using QR Codes In Education. Retrieved from http://www.teachthought.com/technology/your-quick-guide-to-using-qr-codes-in-education/
  19. Molina E, Iranzo P, Lopez MC, Molina MA (2008) Procedimientos de análisis, evaluación y mejora de la formación práctica. Revista de Educación 346:335–361. Retrieved from http://www.mecd.gob.es/revista-de-educacion/numeros-revista-educacion/numeros-anteriores/2008/re346/re346_13.html
  20. Osorio L (2010) Características de los ambientes híbridos de aprendizaje: estudio de caso de un programa de posgrado de la Universidad de los Andes. RUSC Univ Knowl Soc J 7(1). doi:10.7238/rusc.v9i2.1285
  21. Rama C (2014) University virtualization in Latin America. RUSC Univ Knowl Soc J 11(3):32–41. doi:10.7238/rusc.v11i3.1729
  22. Román Graván P, Martín Gutiérrez A (2014) Social networks as tools for acquiring competences at university: QR codes through Facebook. RUSC Univ Knowl Soc J 11(2):26–40. doi:10.7238/rusc.v11i2.2050
  23. Sancho T, Escudero N (2012) A Proposal for Formative Assessment with Automatic Feedback on an Online Mathematics Subject. RUSC Univ Knowl Soc J 9(2):59–79. doi:10.7238/rusc.v9i2.1285
  24. Seluakumaran K, Jusof FF, Ismail R, Husain R (2011) Integrating an open-source course management system (Moodle) into the teaching of a first-year medical physiology course: a case study. Adv Physiol Educ 35(4):369–377. doi:10.1152/advan.00008.2011
  25. Snodin NS (2013) The effects of blended learning with a CMS on the development of autonomous learning: A case study of different degrees of autonomy achieved by individual learners. Comput Educ 61:209–216. doi:10.1016/j.compedu.2012.10.004
  26. Villa A (2007) Aprendizaje basado en competencias. Una propuesta para la evaluación de las competencias genéricas. Ed. Mensajero, Universidad de Deusto, Bilbao
  27. Watson E (2013) Discovering the innate potential of QR codes in Education. Retrieved from http://www.educationandtech.com/2013/12/qr-codes-secrets-in-education.html

Copyright

© The Author(s) 2016