- Research article
- Open Access
Student use and perception of technology enhanced learning in a mass lecture knowledge-rich domain first year undergraduate module
© The Author(s) 2017
- Received: 22 May 2017
- Accepted: 5 November 2017
- Published: 1 December 2017
The purpose of this study was to examine student use and perceptions of technology enhanced learning tools (TELTs), including their value for learning. Students enrolled on a 12-week undergraduate science module had access to four TELTs each week: (i) a recording of the lecture (Panopto™), (ii) an animated mini review of the lecture (ShowMe), (iii) a multiple choice quiz hosted on Blackboard® (MCQ), and (iv) a module Twitter feed. Ninety-five students completed a survey at the end of the module, comprising quantitative and qualitative questions, to examine whether they perceived the TELTs to be useful for their learning. Analysis of the quantitative data suggests that Twitter was used significantly less than the other three TELTs (p < 0.001), with fewer students agreeing that it helped their learning (p < 0.001), whilst ShowMe and MCQ received an ‘agree’ rating more often than Twitter and Panopto (p ≤ 0.002). A thematic analysis of the qualitative responses identified assessment as a common theme across all four TELTs: a positive factor for Panopto, ShowMe and MCQ, but a negative one for Twitter. Students highlighted ShowMe as particularly useful for simplifying information. Based on this study, TELTs similar to ShowMe (i.e. animations) are most recommended, as ShowMe was one of the two highly rated TELTs (alongside MCQ) but may have more potential for crossover with other subjects, and students found it useful for more than just assessment.
- Social network
- E-learning
Technology enhanced learning (TEL) has the potential to enhance the student experience by facilitating self-paced learning, lowering inhibition thresholds for asking questions, and allowing access to learning on an as-and-when basis (Kamath, 2015), all of which are factors that may contribute to informal and incidental learning outside of the formal learning space (Peart et al., 2014). There is a body of research examining the role of the teacher in facilitating TEL, and staff reservation or anxiety has been described with respect to the technology acceptance model (Blackwell et al., 2014; Gong et al., 2004; Louw, 2015; Teo et al., 2008). Such problems may present a barrier to the promotion of TEL to support student learning. However, an aspect that has been less researched is the contribution of students to the impact of TEL, and in particular their perspectives on TEL tools (TELTs). There is a wealth of information on student perceptions of a selection of ‘learning objects’ (LOs) and ‘web based learning tools’ (WBLTs) (Cochrane, 2005; R. Kay, 2011; R. H. Kay & Knaack, 2009; Nurmi & Jaakkola, 2006; Vargo et al., 2003). Such studies can provide useful insight for LO and WBLT developers, but their implications from a teaching perspective are somewhat limited unless the teacher is likely to use that very specific tool in their teaching. For example, Cochrane (2005) evaluated two audio engineering LOs (an interactive mixing desk and a microphone chooser), and Nurmi and Jaakkola (2006) evaluated three LOs to teach fractions, the Finnish language and electrical DC circuits. Such tools are typically designed to be used in the classroom, and as a result do not promote extra-curricular and informal/incidental learning.
It may be more beneficial from a teaching perspective to understand the benefits of more generic TELTs, over which staff can have more control, implemented over a longer period of time. However, current examples are limited, and are focused primarily upon an institutional virtual learning environment (VLE) or social networking. Šumak et al. (2011) collated Electrical Engineering and Computer Science students’ perceptions of using a general VLE (Moodle) and reported that perceived usefulness was a strong predictor of intention to use the VLE. This has limited scope today, as the use of a VLE is now commonplace within Higher Education. However, the identification of perceived usefulness as a predictor of acceptance is important, as it can direct research to determine what characteristics students consider useful, or indeed how students define ‘useful’. Junco et al. (2011) incorporated Twitter into a semester-long module for pre-health professional majors, where students were encouraged to continue class discussions, organise study groups, and connect with each other and with staff. They concluded that the use of Twitter successfully increased student engagement, which may present a promising development for a wide range of teachers, as these uses could be incorporated into any subject of study. However, the authors did not provide any data on student perception of the use of Twitter, so it is unclear why the students engaged with Twitter, and without knowing this it is difficult to predict what other TELTs may be well accepted.
Lecture capture is another example of a generic TELT available to teaching staff. A narrative review identified that students use lecture capture to review content, but there is mixed evidence as to its effect on student grades and attendance (Karnad, 2013). This practice has been examined specifically in higher education science students, namely in the areas of veterinary medicine (Danielson et al., 2014) and pharmacy (Marchand et al., 2014). Danielson et al. (2014) reported that students perceived lecture capture to be most useful for learning in content-driven lecture sessions compared to group work sessions, particularly for reviewing segments flagged in their notes, recapping a fast lecture, studying for examinations, and reviewing content missed due to absence. Staff agreed that lecture capture was beneficial for students reviewing lecture content, but identified reduced attendance as a risk. Marchand et al. (2014) also identified reduced attendance as a concern for staff; however, neither study reported actual attendance data, so it is unknown whether the staff concerns were realised. In a letter to the editor, Lach and McCarthy (2015) challenged the findings of Marchand et al. (2014), stating that attendance should not be a concern, as attendance is not a learning outcome and does not guarantee that learning will occur. They argue that staff focusing on the possible negative effect upon attendance may overshadow the opportunities afforded by technology.
A form of TELT currently more researched with younger learners is the use of electronic whiteboards to simplify and share information (Castek & Beach, 2013; Maher, 2013). The potential for these to support learning in higher education has received little attention in the literature, other than being described as a useful tool for in-the-moment teaching (Archibald et al., 2014). Using such a tool to deliver a short summary video of the lecture may act to reinforce learning from the lecture, rather than replace it. Simplifying the lecture content in such a way may also help students direct their learning, and has been proposed as a way to help students overcome troublesome knowledge, as it presents scientific mechanisms in the absence of other barriers such as new terminology (Peart et al., 2014).
An understanding of student perceptions of different TELTs could be of benefit for a number of reasons: (i) identification of shared characteristics between well-perceived TELTs may help predict what tools are likely to work in future; (ii) targeting particular types of TELT may act to reduce staff anxiety by reducing choice and preventing over-saturation with TELTs; and (iii) such an understanding can contribute to the planning of departmental and institutional TEL strategies. The objective of this study was to integrate four different TELTs that the teaching staff could control into a semester-long undergraduate sport and exercise science module, with the aim of examining student use and perceptions of the TELTs. Of particular interest were their views on accessibility, use and value for learning.
The study focused on a 12-week Level 4 (first-year undergraduate) module called Energetics of Exercise, which included 210 students from BSc Applied Sport and Exercise Science, BSc Sport, Exercise and Nutrition and BSc Psychology with Sport Sciences. The summative assessment for the module consisted of three multiple-choice examinations throughout the semester. All procedures were approved by the institution’s ethics committee, and all participants were provided with verbal and written information to ensure informed consent.
A Panopto™ recording of the lecture, which allowed the students to download a video file (Fig. 1a). This was chosen as a simple TELT that would require no technological knowledge or extra-curricular effort from the staff, with the purpose of allowing students to recap the lecture content.
An animated mini review of the lecture using an interactive whiteboard mobile application (ShowMe), which consisted of a < 5-min video developed by the lecturer, focusing on what the lecturer deemed to be the essential part of the lecture (Fig. 1b). It was anticipated that the students would use these videos to recap the main points as a starting point for, or alternative to, further reading.
A multiple choice quiz (MCQ) hosted on the VLE (Blackboard®) (Fig. 1c). Devised by the lecturer, these quizzes mimicked the summative assessment for the module to provide the students with an opportunity for formative assessment.
A module Twitter feed sharing relevant information (Fig. 1d). Previous studies have identified Twitter as a tool to facilitate student engagement in a course, and encourage student discussion (Gikas and Grant, 2013; Junco et al., 2011).
Survey design and analysis
A representation of the questions included in the online questionnaire
Why did you access X?
□ To recap weekly content □ To prepare for the exam □ For general interest □ I did not use it □ Other….
How did you access X?
□ Phone/tablet □ PC/Laptop □ I did not use it □ Other….
Where did you access X?
□ At University □ At home □ While travelling □ I did not use it □ Other….
I strongly disagree
I somewhat disagree
I somewhat agree
I strongly agree
I find X useful for learning.
X helped me develop confidence in the subject area.
I find X easy to use.
Using X is a bad idea (negative).
X makes learning more interesting.
I would like to use X in future modules.
What in particular did you find useful about X?
Is there a way that the use of X could be improved?
Frequency of student responses on whether each TELT was useful to support their learning. Agree = median score 6–7; Unsure = median score 3–5; Disagree = median score 1–2
Did not use
Χ²(9, N = 380) = 87.76, p < 0.001
* p ≤ 0.002
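The reported statistic comes from a chi-square test of independence on the TELT-by-response contingency table (4 rows × 4 response categories, 95 respondents per TELT giving N = 380). As a rough illustration of how such a statistic is computed, the sketch below uses hypothetical placeholder counts, not the study's actual frequencies; only the degrees of freedom (9) and overall N (380) are chosen to match the reported analysis.

```python
# Sketch of a chi-square test of independence behind a result such as
# X^2(9, N = 380) = 87.76, p < 0.001. The counts below are hypothetical
# placeholders, NOT the study's actual response frequencies.

def chi_square_statistic(observed):
    """Chi-square statistic and degrees of freedom for a two-way
    contingency table given as a list of rows of counts."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    n = sum(row_totals)
    chi2 = sum(
        (count - row_totals[i] * col_totals[j] / n) ** 2
        / (row_totals[i] * col_totals[j] / n)
        for i, row in enumerate(observed)
        for j, count in enumerate(row)
    )
    dof = (len(observed) - 1) * (len(observed[0]) - 1)
    return chi2, dof

# Rows = TELTs (Panopto, ShowMe, MCQ, Twitter); columns = response
# categories (agree, unsure, disagree, did not use). 95 respondents
# per TELT gives N = 380, as in the reported test.
observed = [
    [60, 15,  5, 15],
    [65, 12,  3, 15],
    [62, 14,  4, 15],
    [ 4, 20, 21, 50],
]
chi2, dof = chi_square_statistic(observed)
# The critical value for df = 9 at alpha = 0.001 is ~27.88, so any
# statistic above that implies p < 0.001.
print(f"X^2({dof}, N = {sum(map(sum, observed))}) = {chi2:.2f}")
```

With scipy available, `scipy.stats.chi2_contingency(observed)` performs the same test in one call and also returns the exact p-value and expected counts.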
A thematic analysis is presented in Table 4. A main finding was that assessment preparation emerged as a first-order theme shared across all four TELTs. This theme was identified as a positive for Panopto (‘helped prepare for the exam’), ShowMe (‘can pick out main points for exam revision’) and MCQ (‘helped to know what to expect on the exam’). Specific reasons for these TELTs being perceived as useful for assessment could have been that they helped check understanding (MCQ; ‘helps you find out what you actually know’), tailored and personalised the delivery (Panopto; ‘can jump to specific points’), or provided information in a different format (ShowMe; ‘easier to understand than the lectures’). Conversely, for Twitter the theme of assessment was identified as an area for improvement (‘not clearly linked to the exams’). In fact, the only positive theme for Twitter was convenience (‘easy to access links’).
The purpose of this study was to implement different TELTs into the delivery of a first year undergraduate science module, and to collect student perceptions of their use and perceived value for learning. It was found that approximately 80% of the students surveyed accessed Panopto, ShowMe and MCQ to support their learning. However, fewer than half of the cohort accessed Twitter, significantly fewer than for the other three methods. Furthermore, only 4% of students agreed that it helped them with their studies, again in stark contrast to the other three methods (Table 2). This is despite students identifying it as a convenient way to get information (Table 4). Previous authors have advocated the use of Twitter to support Higher Education students, including Junco et al. (2011), who observed greater levels of engagement in students assigned to a Twitter group as opposed to a control. Gikas and Grant (2013) also reported generally positive student perceptions of using Twitter, including being able to embed learning within their normal lives and the ability to have discussions with classmates. Interestingly, interaction was a theme under suggested improvements in the current study, with students commenting that ‘everyone should follow each other’ and that they should ‘share content’. A possible reason for the lack of interaction was a lack of student awareness, with one student commenting that an improvement would be to ‘make students more aware’. However, the way in which staff made students aware of Twitter was no different from the other TELTs. Furthermore, staff regularly updated the Twitter feed with both course-specific information and relevant sources for further learning (e.g. recent articles linked to that week’s course content), but no students posted their own tweets. The fact that Twitter was accessed more on mobile devices may suggest that it was used more for keeping up to date rather than for active engagement.
Of note is that the intervention implemented by Junco and colleagues used Twitter as the sole channel for course information (e.g. in place of the discussion board, announcements page and reading lists), with no competing TELTs being used simultaneously. It may be that the use of Twitter in the current study was influenced by the fact that the traditional VLE was also being used for the roles described by Junco et al. (2011), or that three other TELTs were available at the same time, so students directed their attention elsewhere. These factors may have had more of an influence if students were unfamiliar with the workings of Twitter.
A summary of student access to the TELTs
Why did you access X?
To recap weekly content
To prepare for the examination
For general interest
How did you access X?
Where did you access X?
Thematic analysis of qualitative responses for each TELT

Panopto
- Able to pause at any point to make notes / Can jump to specific points
- Tailor lecture delivery and pacing
- Good for exam revision / Helped prepare for the exam
- Suggested improvements: Microphone poor / Missing video / Logging in was difficult

ShowMe
- A good quick recap / It condenses all of the information / Short and concise / Straight to the point / Short, sharp and accurate summary
- Gives step by step views / Makes complicated information compact and easy to understand / The verbal and visual information at the same time is useful / Simplifies lectures / Easier to understand than the lectures
- Delivery of information
- Easy to use / Easy to find the information
- Can pick out main points for exam revision
- Suggested improvements: Some videos too short / Lecturer sometimes quiet / Sometimes talk too fast

MCQ
- Useful to see the types of questions / Good tool for revision / You can practice for the exam / Prepared you for the exam / Helped to know what to expect on the exam
- Helps recap / Helps you find out what you actually know / Helped check where I may have been wrong with my notes / Shows what you know and what you don’t / Can track my learning
- Suggested improvements: More questions / Links to webpages relating to the subject area / Not just multiple choice questions / A variety of questions

Twitter
- Easy to use / Easy to access links
- Suggested improvements: Everyone should follow each other / Should share content / Make students more aware / Give more examples / Have a specific page / Make more links available / Not clearly linked to the exams
It is perhaps clear why the MCQ was rated highly by the students, as it mimicked the summative assessment of the module. Some of the comments in the ‘confirming understanding’ theme suggest it may have also been of benefit for formative feedback (‘can track my learning’) (Table 4); however, it is unclear whether this TELT would have been perceived as useful if the mode of summative assessment had been different, e.g. an essay or report. Another factor that may explain the positive perception of the MCQs is that this was the only TELT that required the student to take an active part. However, no students alluded to this in the questionnaires.
As students appeared to be extrinsically motivated by assessment, the TELTs were rarely used for general interest (Table 3). All of the TELTs could be used on mobile devices; however, they were used primarily on PCs/laptops (Table 3). This may suggest that the potential for incidental or informal learning was not enhanced by using the TELTs. Each TELT was accessed mostly at home, therefore perhaps promoting extra-curricular study as it was hoped the ShowMe videos would, but still in a formal and structured fashion. However, these points are speculative, as we have no comparison to the students’ learning habits prior to the study. Furthermore, ‘general interest’ is quite vague and open to individual interpretation; perhaps this would have been better termed ‘further study’ or ‘further understanding’.
In summary, the current study has identified shared characteristics between the TELTs that students engage with and perceive to be useful. Such characteristics include links to the summative assessment and offering an alternative method of content delivery to the traditional lecture. Previous research has shown that time is a barrier for staff to incorporate TELTs into their teaching (Reed, 2014), so an appropriate practical message may be to start with one type of TELT. Based on this study, TELTs similar to the ShowMe app are most recommended: it was one of the two highly rated TELTs (alongside MCQ), but has more scope for crossover with other subjects, as students found it useful for more than just assessment. Moreover, the lecturer has full control over the content, so can make each animation specific to the current topic of study. Further work should examine the potential transfer of animations for learning in other subject disciplines, and investigate whether alternative TELTs can meet the same purpose of simplifying information. It should be considered that this paper describes only those students that completed the survey (45% response rate), and the perceptions of the other students enrolled on the module are unknown. Furthermore, we cannot discount that using the TELTs in combination may have influenced the perception of each TELT, and we relied upon student self-reported usage for each TELT. Future work should monitor actual use of the TELTs using tracking statistics, and examine the predictive validity of student perceptions of TELTs, to gain more insight into what student perception means for learning and academic performance, and what the implications are for teaching staff.
The authors wish to thank the students that took the time to complete the survey.
Availability of data and material
The data can be uploaded to a repository upon acceptance.
The authors received no funding for this study.
DP conceived the study, and DP, PR and LA designed the study. DP, PR and KK taught on the module and implemented the protocol. All authors reviewed the results and DP performed the final analysis. All authors approved the final version of the manuscript.
Ethics approval and consent to participate
The study was approved following the institutional ethics processes of Northumbria University.
Competing interests
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
- Archibald, D., Macdonald, C. J., Plante, J., Hogue, R. J., & Fiallos, J. (2014). Residents’ and preceptors’ perceptions of the use of the iPad for clinical teaching in a family medicine residency program. BMC Medical Education, 14(1), 174.
- Barak, M., Ashkar, T., & Dori, Y. J. (2011). Learning science via animated movies: Its effect on students’ thinking and motivation. Computers & Education, 56(3), 839–846.
- Blackwell, C. K., Lauricella, A. R., & Wartella, E. (2014). Factors influencing digital technology use in early childhood education. Computers & Education, 77, 82–90.
- Castek, J., & Beach, R. (2013). Using apps to support disciplinary literacy and science learning. Journal of Adolescent & Adult Literacy, 56(7), 554–564.
- Cochrane, T. (2005). Interactive QuickTime: Developing and evaluating multimedia learning objects to enhance both face-to-face and distance e-learning environments. Interdisciplinary Journal of Knowledge and Learning Objects, 1(1), 33–54.
- Danielson, J., Preast, V., Bender, H., & Hassall, L. (2014). Is the effectiveness of lecture capture related to teaching approach or content type? Computers & Education, 72, 121–131.
- Gikas, J., & Grant, M. M. (2013). Mobile computing devices in higher education: Student perspectives on learning with cellphones, smartphones & social media. The Internet and Higher Education, 19, 18–26.
- Gong, M., Xu, Y., & Yu, Y. (2004). An enhanced technology acceptance model for web-based learning. Journal of Information Systems Education, 15(4), 365–374.
- Junco, R., Heiberger, G., & Loken, E. (2011). The effect of twitter on college student engagement and grades. Journal of Computer Assisted Learning, 27(2), 119–132.
- Kamath, A. (2015). A review of use of eLearning in pharmacology. International Journal of Integrative Medical Sciences, 2(9), 157–162.
- Karnad, A. (2013). Student use of recorded lectures: A report reviewing recent research into the use of lecture capture technology in higher education, and its impact on teaching methods and attendance. London: London School of Economics and Political Science.
- Kay, R. (2011). Evaluating learning, design, and engagement in web-based learning tools (WBLTs): The WBLT evaluation scale. Computers in Human Behavior, 27(5), 1849–1856.
- Kay, R. H., & Knaack, L. (2009). Assessing learning, quality and engagement in learning objects: The learning object evaluation scale for students (LOES-S). Educational Technology Research and Development, 57(2), 147–168.
- Lach, M. K., & McCarthy Jr., B. C. (2015). Student and faculty member perspectives on lecture capture in pharmacy education. American Journal of Pharmaceutical Education, 79(8), 126.
- Lin, L., & Atkinson, R. K. (2011). Using animations and visual cueing to support learning of scientific concepts and processes. Computers & Education, 56(3), 650–658.
- Louw, A. (2015). Developing a lecturer workshop for using tablets in the classroom. International Journal of Teaching and Learning in Higher Education, 27(3), 294–309.
- Maher, D. (2013). Pre-service primary teachers' use of iPads to support teaching: Implications for teacher education. Educational Research for Social Change, 2(1), 48–63.
- Marchand, J.-P., Pearson, M. L., & Albon, S. P. (2014). Student and faculty member perspectives on lecture capture in pharmacy education. American Journal of Pharmaceutical Education, 78(4), 74.
- Nurmi, S., & Jaakkola, T. (2006). Effectiveness of learning objects in various instructional settings. Learning, Media and Technology, 31(3), 233–247.
- Peart, D. J., Johnstone, S., Brown, J., & Bangani, P. (2014). Supporting teaching and learning in biosciences with mobile technology: Staff and student perspectives. The Journal of Research in Higher and Further Education, 2(1), 5–10.
- Reed, P. (2014). Staff experience and attitudes towards technology enhanced learning initiatives in one Faculty of Health & Life Sciences. Research in Learning Technology, 22, 22770.
- Rossing, J. P., Miller, W. M., Cecil, A. K., & Stamper, S. E. (2012). iLearning: The future of higher education? Student perceptions on learning with mobile tablets. Journal of the Scholarship of Teaching and Learning, 12(2), 1–26.
- Šumak, B., Heričko, M., Pušnik, M., & Polančič, G. (2011). Factors affecting acceptance and use of Moodle: An empirical study based on TAM. Informatica, 35(1), 91–100.
- Teo, T., Lee, C. B., & Chai, C. S. (2008). Understanding pre-service teachers’ computer attitudes: Applying and extending the technology acceptance model. Journal of Computer Assisted Learning, 24(2), 128–143.
- Vargo, J., Nesbit, J. C., Belfer, K., & Archambault, A. (2003). Learning object evaluation: Computer-mediated collaboration and inter-rater reliability. International Journal of Computers and Applications, 25(3), 198–205.