
Bringing technology to the mature classroom: age differences in use and attitudes

Abstract

Mature students are anecdotally thought to be more anxious about technology than younger students, to the extent that they avoid using it. This is a problem in today’s higher education classrooms, which often use a range of learning technologies, particularly as cohorts are increasingly likely to contain mature students. Previous work examining the attitudes of mature students to technology no longer reflects contemporary student age profiles or the current technological landscape. This study asks whether modern mature students in a UK university have more negative attitudes towards technology than younger students, and whether their usage of technology is different. A new diagnostic instrument, the Technology Attitude Questionnaire, was developed to determine how students use technology for course activities and personal use, and what their attitudes are towards technology more generally. It was found that mature students use fewer technologies than younger students and use them less frequently, but have used them for a longer period over their lives. No difference in attitudes towards technology was found between the mature and younger groups. This research aims to contribute to the wider field of technology attitudes and use, particularly for the modern mature student cohort. These findings can be used to inform how educators design learning resources and use technology on their courses, working towards an age-inclusive programme.

Introduction

This paper conducts a timely exploration of students’ attitudes towards digital technology and how students use technology, using a new instrument designed for purpose. It asks whether mature students have more negative attitudes towards technology than younger students, and how usage differs between these two groups. It will be of use to educators who are designing resources using technology for use in higher education (HE) classrooms, which are increasingly likely to contain mature students as part of a changing cohort. Additionally, it makes a wider contribution to the study of technology attitudes and use, a field that continues to evolve alongside the changing technology landscape and student cohort.

This section will introduce mature students as a group, and what is meant by technology enhanced learning. It will then discuss previous studies on students’ attitudes to technology, and finally set out the purpose of the study, including the research questions.

Mature students

Most UK universities define mature students as those who are aged 21 and over at entry; however, Lewis (2018) found that students aged between 21 and 25 felt there was little difference between them and 18-year-old entrants. Some studies suggest that ‘mature’ should signify students who are older than this, for example above 30 (e.g. Mackey et al., 2018). Baxter and Britton (2001) define mature students as those who enter HE at an age of 26 or above. This is the definition that has been adopted for the purposes of this study, not least because it acknowledges the students’ perspectives reported by Lewis (2018).

Mature students have been a growing group of applicants to higher education for several decades (Evans & Nation, 1996; Pearce, 2017; Schuetze, 2014), although application numbers have fallen in recent years (UCAS, 2017). The Universities and Colleges Admissions Service (UCAS) (2017) found that in 2017, 10.4% of successful HE applicants from the UK were mature, and in 2018, the number of acceptances for students aged 26 and over increased by 6.7% (UCAS, 2018). Initial UCAS data for 2019 show that acceptances among older age groups, particularly students aged over 30, have increased significantly (UCAS, 2019). This follows a general pattern over the last few years of increasing acceptance rates for students in older age groups. It is therefore worth our efforts to ensure our pedagogies are appropriate for mature students.

It should be noted that mature students are a diverse, heterogeneous group, consisting of adults from different genders, cultures, socioeconomic groups, and educational backgrounds, all with different reasons for studying (Schuetze, 2014; Waller, 2006). It is important, therefore, to maintain awareness of the diversity of mature students; being ‘mature’ is just one facet of their complex status.

Technology enhanced learning

There is no single agreed definition of technology enhanced learning (TEL), due to its extremely diverse and evolving nature. This study draws on the definition suggested by Law, Niederhauser, Christensen, and Shear (2016), and defines TEL as learning in an environment that is enriched by the integration of digital technology. The types of digital technology used for TEL are equally diverse, and may include hardware such as laptops, mobile telephones, televisions and e-readers, or software such as social networking, office suites, online forums, and videos (Antoniadis et al., 2009; Loughlin, 2017).

TEL has been shown to benefit students and improve their HE experience, both pedagogically and otherwise (Akçayır & Akçayır, 2018; Awidi & Paynter, 2019), and in turn, lecturers choose TEL such as VLEs, social media, and videos to improve student experience (Loughlin, 2017). Screen readers, recording tools and planning tools increase the accessibility of courses for disabled students; however, this raises concerns surrounding the ‘digital capital’ these students have - the social and cultural support and resources a person can access. This is something that can particularly affect mature students as well (Seale, Georgeson, Mamas, & Swain, 2015).

The use of technology in the learning environment can develop students’ higher-level thinking by moving beyond simple memorisation and recall (Lee & Choi, 2017). Students who exhibit higher-order thinking are more likely to be academically successful (Zohar & Dori, 2003). Mature students have been found to be more likely to adopt higher-order approaches over memorisation approaches (Richardson, 1994), and Lee and Choi (2017) found that the use of technology can help them to do so. Students’ attitudes towards technology indirectly affect higher-order thinking, which in turn increases academic success. Therefore attitudes are an important factor in order for design approaches to work as intended, and it is vital to explore these when designing TEL resources (Lee & Choi, 2017).

TEL also increases collaboration. This could be through providing resource-sharing platforms (Al-Emran, Elsherif, & Shaalan, 2016), enabling resource creation (e.g. student podcasts in Lee, McLoughlin, & Chan, 2008), or simply allowing easy communication between students, from which peer feedback and reflection will naturally arise (Lee & Choi, 2017). Furthermore, technology enables the development of interactive teaching approaches such as blended learning (Dalsgaard & Godsk, 2007) or flipped learning (Akçayır & Akçayır, 2018). These have been found to increase attainment (Al-Qahtani & Higgins, 2013; Charles-Ogan & Williams, 2015), as well as decrease subject-specific anxieties (Marshall, Staddon, Wilson, & Mann, 2017).

It is therefore unsurprising that TEL has been, and continues to be, a growing focus in higher education. HE institutions are integrating technology throughout, and encouraging (and sometimes pressuring) lecturers and tutors to use it in innovative ways. In turn, all students are expected to engage with technology in some form, irrespective of level or background. This presents challenges in designing learning activities that are accessible to all. In particular, it is vital to consider the learning needs of all students.

Attitudes to technology

It is often anecdotally thought that mature students are more anxious about technology than younger students, and that they are generally poorer and slower at gaining digital literacy skills (Broady, Chan, & Caputi, 2010). Some studies have found that older people are less likely to engage with technology than younger people (Czaja et al., 2006); however, when they perceive that the technology is useful, their motivation to use and learn it increases (Czaja & Sharit, 1998; Mitzner et al., 2010). The extent to which there are observable differences between mature and younger students in their approaches and attitudes to technology has not yet been clarified for the modern cohort.

The extent to which students choose to accept or reject technologies can have positive or negative effects on their education, since universities are embracing TEL more and more (Henderson, Selwyn, & Aston, 2015). Attitude is one important factor in technology acceptance, affecting whether students adopt technologies, but attitudes are also subject to change over time, often dependent on whether one is having positive or negative experiences (Broady et al., 2010; Straub, 2009).

Attitude can be difficult to define, as it has several dimensions and is used in various ways according to the needs of each author or instrument (Di Martino & Zan, 2010). Broadly, it is an individual’s disposition towards a subject, and whether it is positive or negative. Hart (1989) breaks it down into three components: emotional response, beliefs, and behaviour. This is a particularly useful definition, as it explicitly includes the behavioural aspect, allowing links to pedagogical methods and outcomes. This is the definition used for the purposes of this paper.

An overall attitude can be multi-dimensional, with several factors (Czaja & Sharit, 1998). Factors relating to TEL have been explored in previous literature, usually for younger students. They include confidence level (Garland & Noyes, 2005), previous experience (Garland & Noyes, 2004), and perception of the required knowledge level to engage with a resource (Levine & Donitsa-Schmidt, 1998). These can be classified as relating to the emotional response, behaviour, and belief attitude components respectively. Purpose, usefulness, and support also contribute to the overall attitude (Czaja & Sharit, 1998).

Gardner, Dukes, and Discenza (1993) found that students who use computers more are more confident with computers, and therefore have a more positive attitude towards them. They propose two factors affecting computer attitude: frequency of use, and how long the user has been using the technology. It is worth noting that the Gardner et al. (1993) study was conducted over 25 years ago, when computers were less common and less user-friendly, and therefore frequency of use may have had more of an effect on confidence than it does today. In contrast, Garland and Noyes (2005) found that computer confidence isn’t the main factor affecting attitude, but confidence in learning from computers is. The distinction between the two may arise from the passage of time, in which those who have used computers the least, and are thus still learning, are less confident as they perceive they have less computer knowledge. This may mean they are more apprehensive about technology they are not yet comfortable with, which may manifest itself as a ‘negative’ attitude. ‘Technology learning’ confidence may, then, be analogous to length of time of use, as explored in Gardner et al. (1993). Other factors that potentially affect attitude include: whether the technology is used for home use or in educational institutions (Gardner et al., 1993); self-perceived knowledge level (Mitzner et al., 2010); and perceived usefulness of the technology (Czaja & Sharit, 1998).

In 2005, Garland and Noyes found that mature students had lower computer confidence than younger students, both for general use and learning from computers. Interestingly, distance-learning students who were mature actually had more confidence in general computer use than younger students, but still had lower confidence for learning from computers. This again fits with the idea of more experience giving higher confidence, since distance-learning students use computers almost exclusively for learning. In contrast, Broady et al. (2010) found that older students have an initial lack of confidence that improves with use, which could be interpreted as lower confidence when learning about computers.

The factors that affect how users adopt technology are numerous and complex, and technology acceptance models have been used and studied for decades (Scherer, Siddiq, & Tondeur, 2019; Wingo, Ivankova, & Moss, 2017). One of the most famous and widely used is the Technology Acceptance Model, or TAM (Davis, 1989). The original TAM study suggested that the two main factors contributing to attitude are perceived usefulness and perceived ease of use. The TAM2 (Venkatesh & Davis, 2000), an update to the TAM, looked at voluntary versus compulsory use of technology, as well as other factors such as social influence processes and cognitive instrumental factors. Both the TAM and TAM2 are old instruments designed in a world where technology was not as prevalent. They also focus on job performance and productivity, not education, and although they have been used to study student attitudes over the years (e.g. Levine & Donitsa-Schmidt, 1998; Ngai, Poon, & Chan, 2007), they may perform better in business environments (Legris, Ingham, & Collerette, 2003).

Studies that examine the attitudes of mature students to technology are often out of date, sometimes dating from more than a decade ago (Czaja & Sharit, 1998; Gardner et al., 1993; Garland & Noyes, 2005), or focus on distance learning students (Jelfs & Richardson, 2013). Technology has evolved rapidly, and technological advances have changed how students learn (Kim, Song, & Yoon, 2011), reducing the validity of the older scales (Garland & Noyes, 2008). With changing technology, attitudes and use will also have evolved (Broady et al., 2010). More recent studies on attitudes to technology have their own limitations, such as only exploring one aspect such as frequency of use (Kennedy, Judd, Dalgarno, & Waycott, 2010), focussing on mode of study (Arrosagaray, González-Peiteado, Pino-Juste, & Rodríguez-López, 2019) or gender (Cai, Fan, & Du, 2017), or being too course-specific rather than allowing students to reflect on the use of technology in their everyday lives (Awidi & Paynter, 2019; Edmunds, Thorpe, & Conole, 2012). Other studies are specific to certain types of technology, such as mobile devices (Al-Emran et al., 2016) or simply computers (Garland & Noyes, 2004).

Purpose of the study

Higher education has a changing cohort, with increasing acceptance rates for mature students. Universities are adopting more and more technology, and expecting lecturers and tutors to integrate it as widely as possible (Shelton, 2014). As researchers, we therefore need to ask whether it is pedagogically sound to treat our modern cohort the same as a traditional cohort. This calls for a deeper understanding of the technological learning needs of mature students, and how they differ from those of younger, more ‘traditional’ students. This understanding is also crucial for the design of TEL resources (Lee & Choi, 2017). A diagnostic exploration will allow us either to reassure ourselves of the probable efficacy of current practice, or compel us to amend our learning environments.

This paper presents the findings from a quantitative study exploring students’ use of technology and their attitudes towards it. This study is timely because previous work examining the attitudes of mature students to technology may no longer reflect contemporary student age profiles. Technology has evolved much over the years, and therefore attitudes and use will also have evolved, particularly surrounding specific technologies that may have dropped out of use or evolved beyond recognition. No existing instruments were found to be suitable for the task, and so a new instrument was created for this study, based on Hart’s (1989) three attitude components, and also exploring students’ technology use. A factor analysis was carried out in order to determine the dimensions of students’ attitudes towards technology.

The following research questions were posited: (1) Are mature students more negative about technology enhanced learning than younger students? (2) Is there an attitudinal difference between different ages of mature student? The answers to these questions will inform a discussion of the pedagogical implications for designing age-inclusive classrooms, allowing us to begin to address the gap between intended and actual learning.

Methods

This section contains five subsections. “Survey Design” describes the process of how the questionnaire used for this study was created. “Pilot” explains how the survey was tested and validated. The subsection “Technology Attitude Questionnaire” introduces the new instrument. “Participants for the Main Study” presents the demographic data for the participants and describes how they were recruited. The final subsection, “Data Analysis”, explains the strategies and statistical analyses that were carried out on the data generated by the survey.

Survey design

A semi-systematic search was conducted using the Google Scholar database to find existing attitude-surveying instruments from the 10 years spanning 2006 to 2016, using the search terms “attitude(s)”, “technology”, and “instrument(s)”. Only English-language instruments were considered. Some studies from this ten-year time frame used older instruments, and these older instruments were included for review as they had already been accepted for more recent use. After the instruments were collated, none were found to be suitable in their entirety, either because they seemed out of date or because they focused on a specific technology. A new survey instrument was therefore designed, drawing on elements from the previously validated instruments.

Two hundred and sixty-five Likert-style items from 16 unique instruments were initially considered for inclusion. From this list, items were removed if they were duplicates or very similar to each other. Items about specific technologies that would be difficult to adapt to general technology were also removed, along with items that were unclear. This left 57 items. References to specific technologies were replaced with just “technology”. Unsuccessful adaptations, or items that had now become too similar, were again removed, resulting in 51 items being used for a pilot study. These covered the expected aspects of usefulness, enjoyment, ease, confidence, interest, support, and importance. The items included positive and reversed items in order to combat acquiescence bias (Oppenheim, 1998).

Pilot

A small pilot survey (n = 24) was conducted to validate the instrument. It drew on students from the participant pool expected for the main study, and achieved a good spread of participants from the target groups (different ages, disciplines, etc.). The questionnaire was trialled, and participants were interviewed to determine whether any questions were unclear and whether they felt anything was missing from the survey.

The pilot data was initially checked for errors (e.g. those due to duplicate questions). A correlation matrix was checked for items that were highly correlated (> 0.8) with other items (Field, 2005). Where this was found, it was always due to items asking about similar concepts; in all cases, the more clearly-worded item was kept and the other item removed (Oppenheim, 1998). From this, 37 items remained. A principal components analysis (PCA) was then run. Horn’s parallel analysis was used to determine how many factors to extract (Horn, 1965). Parallel analysis is considered to be the most accurate method of factor extraction since it outperforms the K1 rule and scree plots (Ledesma & Valero-Mora, 2007; Matsunaga, 2010; O’Connor, 2000; Zwick & Velicer, 1986). Two factors were identified from the parallel analysis. The PCA showed that two items loaded onto both factors with a difference of less than 0.4 between the primary and secondary loadings, so these were discarded (Matsunaga, 2010). The first factor contained items about attitudes towards technology utility, and the second factor contained items about comfort and confidence when using technology.
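Purely as an illustration of the parallel analysis step, the sketch below shows one way to implement Horn’s method, assuming a hypothetical `responses` array of pilot Likert scores (participants × items). It is a minimal sketch of the technique in Python, not the exact procedure used in this study.

```python
# Minimal sketch of Horn's parallel analysis (illustrative only), assuming
# `responses` is a hypothetical (n_participants x n_items) NumPy array of scores.
import numpy as np

def parallel_analysis(responses, n_iter=1000, percentile=95, seed=0):
    """Count how many observed eigenvalues exceed the chosen percentile of
    eigenvalues obtained from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = responses.shape
    # Eigenvalues of the observed correlation matrix, sorted largest first
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(responses, rowvar=False)))[::-1]
    rand_eig = np.empty((n_iter, p))
    for i in range(n_iter):
        rand = rng.standard_normal((n, p))
        rand_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False)))[::-1]
    threshold = np.percentile(rand_eig, percentile, axis=0)
    return int(np.sum(obs_eig > threshold))  # number of factors to retain
```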

Internal consistency of the whole attitudes section was assessed using Cronbach’s alpha coefficient, and was found to be 0.935. The coefficient is sufficiently high, above the minimum recommendation of 0.7 (Nunnally & Bernstein, 1994). It is common for tests with higher numbers of items to result in higher levels of alpha (Tavakol & Dennick, 2011), so the relatively high number of items in this test may explain why alpha is so high.
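Cronbach’s alpha compares the individual item variances with the variance of the total scale score. As a minimal illustration (not the exact software routine used here), assuming a hypothetical `items` array of scores with one column per item:

```python
# Minimal sketch of Cronbach's alpha, assuming `items` is a hypothetical
# (n_participants x n_items) array of Likert scores with no missing values.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```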

After piloting and validation, 35 items were accepted for use in the final instrument for the main study, which will now be discussed.

Technology attitude questionnaire

The new instrument is called the Technology Attitude Questionnaire (TAQ), and adapts its 35 attitude items from Al-Emran et al. (2016), Bonanno and Kommers (2008), Edmunds et al. (2012), Garland and Noyes (2005), Jay and Willis (1992), Knezek, Christensen, and Miyashita (1998), Lee and Clarke (2015), Liaw, Huang, and Chen (2007), Nguyen, Hseih, and Allen (2006), Pierce, Stacey, and Barkatsas (2007), Saadé and Kira (2007), Sagin Simsek (2008), Teo (2008), and Teo, Lee, and Chai (2008). In addition to this attitudes section, questions about how students use technology were also included. The TAQ was administered online using Survey Monkey. A copy of the final TAQ is included in the supplementary materials.

The TAQ is divided into three sections. The first section gives students a list of 24 types of technology, both hardware (e.g., laptops, mobile telephones, televisions, e-readers) and software (e.g., social networking, office suites, online forums, videos), and explores which of these they have used for course activities, non-course activities, or both. The second section asks how often they use each of the technologies (daily, weekly, monthly, less often than monthly, never, adapted from Kennedy et al. (2010)), and how long they have been using them (less than a year, 1–2 years, 3–5 years, 6–10 years, more than 10 years, never). These first two sections satisfy the behavioural component of attitude, as they describe how the student uses technology (Di Martino & Zan, 2010). The third section explores students’ attitudes to technology and attempts to find underlying factors. This satisfies the beliefs and emotion aspects of attitude towards technology. A 7-point verbal-rating Likert scale was chosen since 7-point scales are more reliable than 5-point scales and offer more opportunity to discriminate between values (Schwarz, Knäuper, Hippler, Noelle-Neumann, & Clark, 1991). The Likert responses were adapted from Beshai, Branco, and Dobson (2013): 1. “Entirely disagree”, 2. “Mostly disagree”, 3. “Somewhat disagree”, 4. “Neither agree nor disagree”, 5. “Somewhat agree”, 6. “Mostly agree”, and 7. “Entirely agree”. A “pass” option was also included to allow participants to abstain from each question should they wish.

The demographics collected were age group, course discipline by faculty, and whether the participants were full or part time. For analysis, both the individual age groups were used (18–21, 22–25, 26–30, 31–40, 41–50, 51–60, 61–70, 71+), as well as collating the participants into “mature” (consisting of the groups aged 26 and over) and “non-mature” (consisting of the groups under 26) groups. Other demographics such as gender and ethnicity were not collected as they were not to be analysed in this study.
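As a small illustration of this grouping step only, the recorded age bands can be collapsed into the binary mature/non-mature categorisation as follows; the function name is hypothetical and the bands are assumed to be stored as plain ASCII strings.

```python
# Minimal sketch of collapsing the recorded age bands into the mature /
# non-mature grouping used in the analysis (26 and over = mature).
MATURE_BANDS = {"26-30", "31-40", "41-50", "51-60", "61-70", "71+"}

def maturity_group(age_band: str) -> str:
    """Map an age band such as '18-21' or '41-50' to 'mature' or 'non-mature'."""
    return "mature" if age_band in MATURE_BANDS else "non-mature"

# e.g. maturity_group("22-25") -> "non-mature"; maturity_group("31-40") -> "mature"
```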

Participants for the main study

The main study was based at a Russell Group university in Northern England. A request to participate in an online survey was sent to students subscribed to the University’s student volunteers mailing list. Students who completed the survey were eligible to be entered into a prize draw for an Amazon voucher. One hundred ninety four participants began the survey, with 161 participants completing it (83.0% completion rate). All participants gave informed consent.

30% (n = 49) of the sample were mature students, while 70% (n = 112) were non-mature; a more detailed breakdown of the participant ages is given in Table 1. Participants were from a variety of courses and disciplines, shown in Table 2, and both part time (n = 18, 11.2%) and full time students (n = 143, 88.8%) were included.

Table 1 Frequencies and percentages of students of the different age ranges in the sample
Table 2 Frequencies and percentages of course discipline of students in the sample

Data analysis

The overall goal was to identify the main attitude dimensions from the TAQ and determine if they were different across the different age groups.

The data was analysed using the IBM SPSS Statistics package (IBM Corp, 2013). In order to determine the factor structure of the TAQ, an exploratory factor analysis (EFA), a data-driven method for identifying underlying relationships between variables, was used. The Kaiser-Meyer-Olkin measure of sampling adequacy was 0.888 and is therefore considered “good” (Parsian & Dunning, 2009), and Bartlett’s test of sphericity was significant (χ2 (136) = 1712.280, p < 0.001), showing that the data is suitable for factor analysis (Williams, Onsman, & Brown, 2010). Horn’s parallel analysis was used to determine how many factors to extract (Horn, 1965). Three factors were identified by the parallel analysis, but the third factor was considered to be a method factor, as it contained all of the reversed items relating to factor one (Zhang, Noor, & Savalei, 2016). The items loading onto the third factor were therefore removed, giving two final factors. Additionally, items were removed if they had factor loadings below 0.4 or cross-loadings over 0.4. In total, 14 items were removed. A two-factor structure with 17 items was confirmed through EFA.
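The analysis itself was run in SPSS. Purely as an illustrative sketch of the same pipeline (suitability checks, two-factor extraction, and the loading/cross-loading rule), the following Python code assumes a hypothetical `responses` DataFrame of the Likert items with no missing values and uses the third-party factor_analyzer package; the oblique rotation shown is an assumption, not necessarily the rotation used in the study.

```python
# Illustrative sketch only; requires the third-party `factor_analyzer` package.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def run_efa(responses: pd.DataFrame, n_factors: int = 2):
    # Suitability checks: Bartlett's test of sphericity and KMO sampling adequacy
    chi_square, p_value = calculate_bartlett_sphericity(responses)
    _, kmo_total = calculate_kmo(responses)
    print(f"Bartlett chi2 = {chi_square:.1f}, p = {p_value:.4f}; KMO = {kmo_total:.3f}")

    fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin")  # rotation is an assumption
    fa.fit(responses)
    loadings = pd.DataFrame(fa.loadings_, index=responses.columns)

    # Keep items with a primary loading of at least 0.4 and no cross-loading over 0.4
    primary = loadings.abs().max(axis=1)
    secondary = loadings.abs().apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
    keep = (primary >= 0.4) & (secondary <= 0.4)
    return loadings, keep
```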

For the non-Likert data, the normality of each distribution was assessed using Shapiro-Wilk tests and visual inspection of distributions and Q-Q plots where appropriate (Ghasemi & Zahediasl, 2012). Where the distributions were found to be normal, t-tests were used to assess for differences between the mature and non-mature groups, and analyses of variance (ANOVAs) were used to explore the differences between the age groups. Homogeneity of variance was assessed, and Welch’s ANOVA used where the groups’ variances were found to be different. Hedges’ g and eta squared (η2) are used as the respective effect sizes for the t-tests and ANOVAs (Fritz, Morris, & Richler, 2012; Kotrlik & Atherton, 2011; Tomczak & Tomczak, 2014). Where normality was not found, the non-parametric Mann-Whitney U and Kruskal-Wallis H tests were used, with effect size measures of r and η2H used respectively (Fritz et al., 2012; Kotrlik & Atherton, 2011; Tomczak & Tomczak, 2014).
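For transparency, the effect-size measures named above can be computed directly from the test statistics. The following is a minimal sketch using hypothetical score arrays `x` and `y` and the usual large-sample approximations (no tie correction); it illustrates the formulas rather than reproducing the SPSS output.

```python
# Minimal sketch of the effect sizes used in this study, for hypothetical data.
import numpy as np
from scipy import stats

def mann_whitney_r(x, y):
    """Effect size r = z / sqrt(N) from a Mann-Whitney U test (no tie correction)."""
    u, _ = stats.mannwhitneyu(x, y, alternative="two-sided")
    n1, n2 = len(x), len(y)
    z = (u - n1 * n2 / 2) / np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return z / np.sqrt(n1 + n2)

def kruskal_eta_squared(h, k, n):
    """Eta-squared for a Kruskal-Wallis H test with k groups and n observations
    (Tomczak & Tomczak, 2014)."""
    return (h - k + 1) / (n - k)

def hedges_g(x, y):
    """Hedges' g: Cohen's d with an approximate small-sample bias correction."""
    n1, n2 = len(x), len(y)
    pooled_sd = np.sqrt(((n1 - 1) * np.var(x, ddof=1) + (n2 - 1) * np.var(y, ddof=1)) / (n1 + n2 - 2))
    d = (np.mean(x) - np.mean(y)) / pooled_sd
    return d * (1 - 3 / (4 * (n1 + n2) - 9))
```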

Results

This section contains four subsections, each considering one aspect of the results. “Number of different technologies used” examines how many technologies students use from the list provided. “Frequency of use of technology” and “Length of time of use of technology” look at how often students use technology, and how many years they have been using technologies for, respectively. “Attitudes” finds the factor structure and dimensions of students’ attitudes to technology. Age differences are explored within each subsection.

Number of different technologies used

Across all students, the minimum number of technologies used was 9, and the maximum was 24 (out of a total of 24 different technologies presented). Mature students used a median of 21 types of technology and non-mature students used a median of 22.

There is a statistically-significant difference in the overall number of different technologies used by mature (mean rank = 68.59) and non-mature (mean rank = 86.43) students, shown by a Mann-Whitney test (U = 2136, z = − 2.259, p = 0.024). There is therefore evidence that mature students use fewer different technologies than non-mature students; however, the effect size for this is small (r = − 0.178). A Kruskal-Wallis H test shows that there is no significant difference between the individual mature age groups (χ2(4) = 6.576, p = 0.160, η2H = 0.059).

The difference between the two groups was also explored for course-only activities and for non-course activities. A t-test shows that mature students used fewer technologies for their course-only activities (M = 12.76, SD = 3.55) than their non-mature counterparts (M = 13.88, SD = 3.11). The difference (M = 1.12, 95% CI [0.23, 2.22], t(159) = 2.015, p = 0.046), while significant, is still relatively small (g = − 0.348). A one-way ANOVA showed no significant difference between the mature age brackets (F(4, 44) = 1.049, p = 0.393, η2 = 0.09). In contrast, a Mann-Whitney test shows that there is little evidence of a difference in number of technologies used for non-course activities between mature (mean rank = 73.13) and non-mature (mean rank = 84.44) students (U = 2358, z = − 1.425, p = 0.154, r = − 0.112), and the Kruskal-Wallis test shows there is no evidence of a difference between the different mature age brackets either (χ2(4) = 7.611, p = 0.107, η2H = 0.08).

This shows that mature students use fewer technologies than younger students overall, as well as for learning on their course, although the medians are similar and the effect sizes are small. However, students use the same number of technologies for their non-course activities.

Frequency of use of technology

There is a significant difference in the frequency of use of technology between mature (mean rank = 91.86) and non-mature (mean rank = 76.25) students (U = 2212, z = − 2.176, p = 0.030). Due to the direction of the question asked in the questionnaire, mature students having a higher mean rank means that they use technology less often than their non-mature counterparts. The effect size is, however, small (r = − 0.171). The difference in frequency of use across the different mature age brackets was not significant (χ2(4) = 7.168, p = 0.127, η2H = 0.072). This shows that mature students use technology less often than younger students.

Length of time of use of technology

Mature (mean rank = 121.33) students have used technologies for a significantly longer period of time over their lives than non-mature (mean rank = 63.36) students (U = 768, z = − 7.726, p < 0.001), and this has a large effect size (r = − 0.609). Between the mature age groups only, there was found to be no evidence of a difference (χ2(4) = 5.553, p = 0.235, η2H = 0.035). This shows that mature students have used technology for a longer time over their lives than younger students.

Attitudes

Exploratory factor analysis found a two-factor structure with 17 items, shown in Table 3. Factor 1 contains eight items, and the themes of these items are about comfort, confidence, and perceived confidence; factor 1 was therefore labelled “confidence”. This factor bears similarities to the ‘ease of use’ factor found in TAM-based studies (Davis, 1989). However, the items contained here are more attitudinal in nature rather than a judgement of one’s personal skills, which ‘ease of use’ would indicate. Factor 2 contains nine items, about when and why students use technology, so factor 2 was labelled “utility”. This is also similar to the TAM’s ‘perceived usefulness’ factor (Davis, 1989). These factors are very similar in theme to those identified in the pilot study, suggesting that the instrument performs consistently.

Table 3 Factor structure of students’ attitudes to technology

Internal consistency of the whole attitudes section was assessed using Cronbach’s alpha coefficient, and was found to be 0.916, with Cronbach’s alpha of the confidence factor calculated as 0.923, and the utility factor as 0.825. All coefficients are sufficiently high, above the minimum recommendation of 0.7 (Nunnally & Bernstein, 1994). The total variance explained by the confidence factor was 38.0%, and by the utility factor was 14.6%.

Mundfrom, Shaw, and Ke (2005) found that the minimum required sample size is sensitive to both the ratio of variables to factors and communality strength; they recommend a minimum variable-to-factor ratio of 7:1 for the agreement between sample and population factor structure to be “good”. The variable-to-factor ratio for this study was approximately 8:1. The level of communality was found to be “wide”, ranging approximately between 0.2 and 0.8 (Mundfrom et al., 2005). Therefore, using Mundfrom et al.’s (2005) guidelines, the study should meet the criterion of a minimum sample of 65 participants for excellent agreement. Since the sample size for this study is n = 161, the sample size is considered excellent.

Comparing the confidence dimension between mature (mean rank = 74.61) and non-mature (mean rank = 83.79) students, it was found that there is no significant difference (U = 2431, z = − 1.152, p = 0.249, r = − 0.091), with a very small effect size. There was also no difference between the different mature age brackets (χ2(4) = 3.651, p = 0.455, η2H = − 0.01).

There is also no significant difference (mature M = 5.40, SD = 0.917; non-mature M = 5.55, SD = 0.757) in the utility dimension (M = 0.14, 95% CI [− 0.13, 0.42], t(159) = 1.037, p = 0.302, g = − 0.185). Again, comparing the mature age brackets, this showed no difference, with a small-medium effect size (Welch’s F(3, 44) = 2.839, p = 0.075, η2 = 0.14).

The overall attitude, i.e. a combination of the two dimensions, was also compared between the mature (mean rank = 74.66) and non-mature students (mean rank = 83.77). This, as with the individual dimensions, was found to be non-significant (U = 2433, z = − 1.141, p = 0.254, r = − 0.090). For the comparison across the mature age brackets, a standard ANOVA was used since the test for homogeneity of variance was non-significant; this also showed no difference (F(4, 44) = 0.904, p = 0.470, η2 = 0.08).

There is therefore no discernible difference in attitudes between mature students and younger students.

Discussion

Existing research comparing age and attitude has tended either to focus on specific technologies or to examine attitudes to technology in general without taking age into account as a factor, and studies that do examine age differences in attitudes to general technology are usually out of date. Since the use of technology is a continually-changing, evolving, and dynamic field, and the student body is itself a changing cohort, it was important to conduct an up-to-date exploration of technology attitudes among mature and younger students. The current study conducts this exploration of age differences in technology attitude and use using a new instrument. Table 4 shows a comparison with other studies in this field that look at the differences between older and younger students, and presents the similarities and differences in findings. It also shows how this study addresses the limitations of these previous studies, particularly those discussed in the ‘Attitudes to technology’ subsection above.

Table 4 Comparison summary of this study and previous studies that compare age groups

Two research questions were proposed: the first asks whether mature students are more negative about technology enhanced learning than younger students; and the second question asks if there are differences between the mature age groups. This section addresses how students use technology, discusses students’ attitudes to technology, and sets out the limitations of the study.

How students use technology

Mature students use fewer technologies than younger students overall, as well as for learning on their course. It is interesting that they show no difference in the number of technologies used for non-course activities, since this indicates that the conception that mature students have a fear or apprehension of technology is incorrect: they are choosing to use it in their personal lives. One might expect differences to arise across the different mature age groups, with the older mature students beginning to use less technology (Czaja & Sharit, 1998; Selwyn, 2004); however, this also does not seem to be the case.

Ching et al. (2005) suggest that students who choose technology for their personal lives also choose it for everything else, but the difference in the number of technologies adopted for course and non-course activities seems to dispute this. The difference seems to suggest that the place and purpose for which students use technology affect their technology choices. Hawthorn (2007) found that older students have selective tendencies towards technology, limiting tasks to those they know they can do in order to minimise errors. This may be supported by the finding that mature students use fewer technologies for learning; mature students may be selecting the technologies they are most confident with in order to carry out the tasks that ‘matter’, for their course and assessments, so as to reduce potential problems. This may result in students avoiding unfamiliar technologies for learning.

Mature students use technology less often. This may be due to older students being more selective with the technology they use and when, as suggested by Hawthorn (2007), which may also be linked to them using fewer technologies overall. Students in different stages of life may choose to use technology differently, depending on whether they are family- or career-focussed and on their “life-fit” with technology (Selwyn, 2004). Family and work, particularly part-time work for mature students, could simply be keeping older students busier, thus reducing their frequency of use. Older adults may have jobs that do not require as much technology as younger adults’ jobs, or they may even be retired, although retired adults often feel pressure from younger family to use technology (Selwyn, 2004). It would be interesting to look further at the use of technology in jobs, using a finer-grained time scale than this instrument and perhaps considering types of technology as well. However, using technology less often should not be interpreted as students not wanting to use technology: based on the number of technologies they use, they seem comfortable with it. It may be that mature students simply feel the need to use technology less, as they are used to not having technology for specific tasks in their lives. Using technology less often may even be a choice, particularly with today’s media demonising regular technology use as ‘addiction’ (e.g. Manjoo, 2018). Some studies (Czaja et al., 2006; Gardner et al., 1993) do, however, suggest that frequency of use and computer experience have an effect on attitude, with increased use giving rise to more positive attitudes (or at least, less anxiety). This is interesting since the present study found no attitudinal difference between mature and non-mature students, despite there being a difference in frequency of use between the two groups.

Mature students have used technology for a longer time. The difference here may be because mature students, being older, adopted these technologies earlier than younger students, particularly if the technology has been around for a while. Some older students may even be early adopters of older technologies (Ching et al., 2005) who have not felt the need to upgrade since, which might explain the difference between number of technologies used and the overall use time.

It is possible that age-inclusiveness is a concern for the introduction of specific new technologies, not for technology as a whole, since older students may use older technologies (Selwyn, 2004). Mature students may have plenty of experience and comfort with technology in general, but due to using technology less frequently, it may take them longer to become comfortable with new technologies (Rogers, Meyer, Walker, & Fisk, 1998). This isn’t specific to mature students either: many younger students use technology less frequently than we might expect, and it may also take them longer to get used to new technologies. It might even be worth designing learning resources with students’ frequency of technology use in mind, as well as their age. New technologies should therefore be carefully introduced to the whole cohort in a way that makes them accessible to everyone, including offering training opportunities if necessary. We could conclude that students of all ages don’t tend to need classes on how to use a computer, but they might need classes on how to use specific software on the computer.

Students’ attitudes to technology

Despite the differences in how students use technology, it is interesting that there is no indicated difference between mature and non-mature students’ attitudes overall, nor for any of the dimensions. Neither their confidence nor their utility attitudes differ, which is in contrast to Garland and Noyes (2005) but in agreement with Czaja and Sharit (1998), although Czaja and Sharit looked at lay adults rather than mature students in HE. This situation, where students are using technology differently but do not have different attitudes, may simply be an indication of the diversity with which technology is used. The lack of difference in attitudes further indicates that the perception that mature students have a fear of technology is false, or at least that this fear is a thing of the past.

It is worth noting that this finding doesn’t show that no students are anxious about technology or hold negative attitudes. It has been shown in past studies that negative experiences lead to negative attitudes (Broady et al., 2010; Straub, 2009), and the use of technology is no different. However, negative attitudes are not specific to one particular age group. It is hoped that recognising this will help prevent the further perpetuation of stereotypes that may affect how educators approach learning with technology when they are teaching mature students. In our experience, lecturers simply being positive about technologies can reduce student anxiety, and therefore a positive, supportive learning environment should be fostered.

Generally, educators shouldn’t be avoiding technology when designing age-inclusive learning resources. However, we should communicate to students how frequently they are expected to use technology, for example, checking their emails daily, or using their mobile phones for quizzes in class. This will also allow students the time to procure technologies they need but may not currently have, or find alternative arrangements. It is important not to expect all students to become comfortable with newly introduced technologies within the same timeframe, as students who use technology less often, such as mature students, might take longer to become more fluent with it. We therefore need to design learning timetables, particularly if there are summative assessment points, that include the time for students to become familiar with technologies.

Limitations

The study has a number of limitations. It was set within one Russell Group institution in the UK, and may therefore not be generalisable to other societies, particularly where technology is not as accessible. Future research could invite participants from several institutions of different types throughout the UK. It would also be interesting to do a cross-country comparison with societies that do not have the same widespread access to technology.

This study looks at a relatively small sample, and some of the older age groups had very small sample sizes. This is indicative of the negative correlation between age and number of students present in the HE system. The small sample sizes may have prevented an indication of a digital divide between older age groups, such as that found by Czaja et al. (2006).

Participants were volunteers recruited via email, which in itself could have implications on the validity of the study, since students who are less confident with technology may not check their email as much or at all, or be less likely to do online questionnaires. Furthermore, “maturity” is only one of a number of qualities possessed by students, alongside socioeconomic status, gender, culture, and whether they are from a rural or urban area (Kennedy et al., 2010); these qualities form a complex status within each individual, and while this research has focussed just on age, it must be recognised that many factors will have effects upon technology attitudes and use.

Conclusion

Mature students do not seem to have different attitudes than younger students towards technology, or technology learning, but they do use it differently. They use fewer technologies, and seem to have been using these technologies for a long time, so they appear to be more loyal to the technologies that they choose to use. The results from the study imply that mature students are generally as confident and happy to use technology as younger students, particularly if they already use it. They seem to be equally confident learning new technology skills as younger students, and generally do not have the fear or apprehension towards technology that some lecturers or tutors may attribute to them.

This study has shown that there are opportunities for educators to move away from perceived stereotypes about older students, and it can inform how educators design age-inclusive learning resources and use technology on their courses. This may involve giving students enough time to adapt to new technologies, particularly if they do not use them often, and potentially providing explicit training sessions on new or unfamiliar technologies. This study contributes to the ongoing field of technology attitudes and use, particularly since technology is an evolving and persistent part of the global higher education landscape. It presents a new instrument, and explores attitudes of the modern cohort of mature students. Knowing how different groups of students engage is vital in order to enable them to reach their full potential.

In terms of future work, qualitative interviews would have the potential to allow the development of a more detailed understanding of the factors behind students’ attitudes. An observational study of students using technology in the live classroom might also reveal whether their actual use reflects their attitudes, and what real difficulties they encounter, both generally and with specific technologies. This would perhaps allow educators to choose the technology they employ in their classroom more effectively.

Availability of data and materials

The TAQ instrument is available in the supplementary materials. The dataset used and analysed during the current study is available from the corresponding author on reasonable request.

Abbreviations

ANOVA: Analysis of variance
EFA: Exploratory factor analysis
HE: Higher education
IBM: International Business Machines
PCA: Principal components analysis
SPSS: Statistical Package for the Social Sciences
TAM: Technology Acceptance Model
TAQ: Technology Attitude Questionnaire
TEL: Technology enhanced learning
UK: United Kingdom

References

  • Akçayır, G., & Akçayır, M. (2018). The flipped classroom: A review of its advantages and challenges. Computers & Education, 126, 334–345.

  • Al-Emran, M., Elsherif, H. M., & Shaalan, K. (2016). Investigating attitudes towards the use of mobile learning in higher education. Computers in Human Behavior, 56, 93–102.

  • Al-Qahtani, A. A. Y., & Higgins, S. E. (2013). Effects of traditional, blended and e-learning on students’ achievement in higher education: E-learning, blended and traditional learning. Journal of Computer Assisted Learning, 29(3), 220–234.

  • Antoniadis, G., Granger, S., Kraif, O., Ponton, C., Medori, J., & Zampa, V. (2009). Integrated digital language learning. In N. Balacheff, S. Ludvigsen, T. D. Jong, A. Lazonder, & S. Barnes (Eds.), Technology-enhanced learning, (pp. 89–103). Netherlands: Springer.

  • Arrosagaray, M., González-Peiteado, M., Pino-Juste, M., & Rodríguez-López, B. (2019). A comparative study of Spanish adult students’ attitudes to ICT in classroom, blended and distance language learning modes. Computers & Education, 134, 31–40.

  • Awidi, I. T., & Paynter, M. (2019). The impact of a flipped classroom approach on student learning experience. Computers & Education, 128, 269–283.

  • Baxter, A., & Britton, C. (2001). Risk, identity and change: Becoming a mature student. International Studies in Sociology of Education, 11(1), 87–104.

  • Beshai, S., Branco, L. D., & Dobson, K. S. (2013). Lemons into lemonade: Development and validation of an inventory to assess dispositional thriving. Europe's Journal of Psychology, 9(1), 62–76.

  • Bonanno, P., & Kommers, P. A. M. (2008). Exploring the influence of gender and gaming competence on attitudes towards using instructional games. British Journal of Educational Technology, 39(1), 97–109.

  • Broady, T., Chan, A., & Caputi, P. (2010). Comparison of older and younger adults’ attitudes towards and abilities with computers: Implications for training and learning. British Journal of Educational Technology, 41, 473–485.

  • Cai, Z., Fan, X., & Du, J. (2017). Gender and attitudes toward technology use: a meta-analysis. Computers & Education, 105, 1–13.

  • Charles-Ogan, G., & Williams, C. (2015). Flipped classroom versus a conventional classroom in the learning of mathematics. British Journal of Education, 3(6), 71–77.

  • Ching, C. C., Basham, J. D., & Jang, E. (2005). The legacy of the digital divide: Gender, socioeconomic status, and early exposure as predictors of full-spectrum technology use among young adults. Urban Education, 40(4), 394–411.

  • Czaja, S. J., Charness, N., Fisk, A. D., Hertzog, C., Nair, S. N., Rogers, W. A., & Sharit, J. (2006). Factors predicting the use of technology: Findings from the center for research and education on aging and technology enhancement (CREATE). Psychology and Aging, 21(2), 333–352.

  • Czaja, S. J., & Sharit, J. (1998). Age differences in attitudes toward computers. The Journals of Gerontology Series B: Psychological Sciences and Social Sciences, 53(5), 329–340.

  • Dalsgaard, C., & Godsk, M. (2007). Transforming traditional lectures into problem-based blended learning: Challenges and experiences. Open Learning: The Journal of Open, Distance and e-Learning, 22(1), 29–42.

  • Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.

  • Di Martino, P., & Zan, R. (2010). ‘Me and maths’: Towards a definition of attitude grounded on students’ narratives. Journal of Mathematics Teacher Education, 13(1), 27–48.

  • Edmunds, R., Thorpe, M., & Conole, G. (2012). Student attitudes towards and use of ICT in course study, work, and social activity: A technology acceptance model approach. British Journal of Educational Technology, 43(1), 71–84.

  • Evans, T., & Nation, D. (1996). Opening education: Policies and practices from open and distance education. London: Routledge.

  • Field, A. (2005). Discovering statistics using IBM SPSS statistics. Thousand Oaks: SAGE Publications.

  • Fritz, C. O., Morris, P. E., & Richler, J. J. (2012). Effect size estimates: Current use, calculations, and interpretation. Journal of Experimental Psychology: General, 141(1), 2–18.

  • Gardner, D. G., Dukes, R. L., & Discenza, R. (1993). Computer use, self-confidence, and attitudes: A causal analysis. Computers in Human Behavior, 9(4), 427–440.

  • Garland, K., & Noyes, J. (2004). Computer experience: a poor predictor of computer attitudes. Computers in Human Behavior, 20(6), 823–840.

  • Garland, K., & Noyes, J. (2005). Attitudes and confidence towards computers and books as learning tools: A cross-sectional study of student cohorts. British Journal of Educational Technology, 36, 85–91.

  • Garland, K., & Noyes, J. (2008). Computer attitude scales: How relevant today? Computers in Human Behavior, 24(2), 563–575.

  • Ghasemi, A., & Zahediasl, S. (2012). Normality tests for statistical analysis: A guide for non-statisticians. International Journal of Endocrinology and Metabolism, 10(2), 486–489.

  • Hart, L. E. (1989). Describing the affective domain: Saying what we mean. In D. B. McLeod, & V. M. Adams (Eds.), Affect and mathematical problem solving: A new perspective, (pp. 37–45). New York: Springer.

  • Hawthorn, D. (2007). Interface design and engagement with older people. Behaviour & Information Technology, 26(4), 333–341.

  • Henderson, M., Selwyn, N., & Aston, R. (2015). What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Studies in Higher Education, 42(8), 1567–1579.

  • Horn, J. L. (1965). A rationale and test for the number of factors in factor analysis. Psychometrika, 30(2), 179–185.

  • IBM Corp (2013). IBM SPSS statistics for Macintosh, version 22.0. Armonk: IBM Corp.

  • Jay, G. M., & Willis, S. L. (1992). Influence of direct computer experience on older adults’ attitudes towards computers. Journal of Gerontology, 47(4), 250–257.

  • Jelfs, A., & Richardson, J. T. E. (2013). The use of digital technologies across the adult life span in distance education. British Journal of Educational Technology, 44(2), 338–351.

  • Kennedy, G., Judd, T., Dalgarno, B., & Waycott, J. (2010). Beyond natives and immigrants: Exploring types of net generation students. Journal of Computer Assisted Learning, 26(5), 332–343.

  • Kim, S., Song, S.-M., & Yoon, Y.-I. (2011). Smart learning services based on cloud computing. Sensors, 11(8), 7835–7850.

  • Knezek, G., Christensen, R., & Miyashita, K. (1998). Instruments for assessing attitudes toward information technology. Texas: Texas Center for Educational Technology.

  • Kotrlik, J. W., & Atherton, J. C. (2011). Reporting and interpreting effect size in quantitative agricultural education research. Journal of Agricultural Education, 52(1), 132–142.

  • Law, N., Niederhauser, D. S., Christensen, R., & Shear, L. (2016). A multilevel system of quality technology-enhanced learning and teaching indicators. Journal of Educational Technology & Society, 19(3), 72–83.

  • Ledesma, R. D., & Valero-Mora, P. (2007). Determining the number of factors to retain in EFA: An easy-to-use computer program for carrying out parallel analysis. Practical Assessment, Research & Evaluation, 12(2), 1–11.

  • Lee, J., & Choi, H. (2017). What affects learner’s higher-order thinking in technology-enhanced learning environments? The effects of learner factors. Computers & Education, 115, 143–152.

  • Lee, J. J., & Clarke, C. L. (2015). Nursing students’ attitudes towards information and communication technology: An exploratory and confirmatory factor analytic approach. Journal of Advanced Nursing, 71(5), 1183–1193.

  • Lee, M. J. W., McLoughlin, C., & Chan, A. (2008). Talk the talk: Learner-generated podcasts as catalysts for knowledge creation. British Journal of Educational Technology, 39(3), 501–521.

  • Legris, P., Ingham, J., & Collerette, P. (2003). Why do people use information technology? A critical review of the technology acceptance model. Information & Management, 40(3), 191–204.

  • Levine, T., & Donitsa-Schmidt, S. (1998). Computer use, confidence, attitudes, and knowledge: A causal analysis. Computers in Human Behavior, 14(1), 125–146.

  • Lewis, I. (2018). The student experience of higher education. London: Routledge.

  • Liaw, S.-S., Huang, H.-M., & Chen, G.-D. (2007). Surveying instructor and learner attitudes toward e-learning. Computers & Education, 49(4), 1066–1080.

  • Loughlin, C. (2017). Staff perceptions of technology enhanced learning in higher education. In European Conference on e-Learning (pp. 335–343). Kidmore End: Academic Conferences International Limited.

  • Mackey, S., Kwok, C., Anderson, J., Hatcher, D., Laver, S., Dickson, C., & Stewart, L. (2018). Australian student nurse’s knowledge of and attitudes toward primary health care: A cross-sectional study. Nurse Education Today, 60, 127–132.

  • Manjoo, F. (2018). Even the tech elite are worrying about tech addiction. The New York Times. Retrieved from https://www.nytimes.com/interactive/2018/02/09/technology/the-addiction-wrought-by-techies.html.

  • Marshall, E. M., Staddon, R. V., Wilson, D. A., & Mann, V. E. (2017). Addressing maths anxiety within the curriculum. MSOR Connections, 15(3), 28–35.

  • Matsunaga, M. (2010). How to factor-analyze your data right: Do’s, don’ts, and how-to’s. International Journal of Psychological Research, 3(1), 97–110.

  • Mitzner, T. L., Boron, J. B., Fausset, C. B., Adams, A. E., Charness, N., Czaja, S. J., … Sharit, J. (2010). Older adults talk technology: Technology usage and attitudes. Computers in Human Behavior, 26(6), 1710–1721.

  • Mundfrom, D. J., Shaw, D. G., & Ke, T. L. (2005). Minimum sample size recommendations for conducting factor analysis. International Journal of Testing, 5(2), 159–168.

  • Ngai, E. W. T., Poon, J. K. L., & Chan, Y. H. C. (2007). Empirical examination of the adoption of WebCT using TAM. Computers & Education, 48(2), 250–267.

  • Nguyen, D. M., Hsieh, Y.-C. J., & Allen, G. D. (2006). The impact of web-based assessment and practice on students’ mathematics learning attitudes. The Journal of Computers in Mathematics and Science Teaching, 25(3), 251–279.

  • Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.

  • O’Connor, B. P. (2000). SPSS and SAS programs for determining the number of components using parallel analysis and Velicer’s MAP test. Behavior Research Methods, Instruments, & Computers, 32, 396–402.

  • Oppenheim, A. (1998). Questionnaire design, interviewing and attitude measurement. London: Continuum.

  • Parsian, N., & Dunning, T. (2009). Developing and validating a questionnaire to measure spirituality: A psychometric process. Global Journal of Health Science, 1(1), 2–11.

  • Pearce, N. (2017). Exploring the learning experiences of older mature undergraduate students. Widening Participation and Lifelong Learning, 19, 59–76.

  • Pierce, R., Stacey, K., & Barkatsas, A. (2007). A scale for monitoring students’ attitudes to learning mathematics with technology. Computers & Education, 48(2), 285–300.

  • Richardson, J. T. E. (1994). Mature students in higher education: I. A literature survey on approaches to studying. Studies in Higher Education, 19(3), 309–325.

  • Rogers, W. A., Meyer, B., Walker, N., & Fisk, A. D. (1998). Functional limitations to daily living tasks in the aged: A focus group analysis. Human Factors, 40, 111–125.

  • Saadé, R. G., & Kira, D. (2007). Mediating the impact of technology usage on perceived ease of use by anxiety. Computers & Education, 49(4), 1189–1204.

  • Sagin Simsek, C. S. (2008). Students’ attitudes towards integration of ICTs in a reading course: A case in Turkey. Computers & Education, 51(1), 200–211.

  • Scherer, R., Siddiq, F., & Tondeur, J. (2019). The technology acceptance model (TAM): A meta-analytic structural equation modelling approach to explaining teachers’ adoption of digital technology in education. Computers & Education, 128, 13–35.

  • Schuetze, H. G. (2014). From adults to non-traditional students to lifelong learners in higher education: Changing contexts and perspectives. Journal of Adult and Continuing Education, 20(2), 37–55.

  • Schwarz, N., Knäuper, B., Hippler, H.-J., Noelle-Neumann, E., & Clark, L. (1991). Rating scales: Numeric values may change the meaning of scale labels. The Public Opinion Quarterly, 55(4), 570–582.

  • Seale, J., Georgeson, J., Mamas, C., & Swain, J. (2015). Not the right kind of ‘digital capital’? An examination of the complex relationship between disabled students, their technologies and higher education institutions. Computers & Education, 82, 118–128.

  • Selwyn, N. (2004). The information aged: A qualitative study of older adults’ use of information and communication technology. Journal of Aging Studies, 18, 369–384.

  • Shelton, C. (2014). “Virtually mandatory”: A survey of how discipline and institutional commitment shape university lecturers’ perceptions of technology. British Journal of Educational Technology, 45, 748–759.

  • Straub, E. T. (2009). Understanding technology adoption: Theory and future directions for informal learning. Review of Educational Research, 79(2), 625–649.

  • Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach’s alpha. International Journal of Medical Education, 2, 53–55.

  • Teo, T. (2008). Assessing the computer attitudes of students: An Asian perspective. Computers in Human Behavior, 24(4), 1634–1642.

  • Teo, T., Lee, C., & Chai, C. (2008). Understanding pre-service teachers’ computer attitudes: Applying and extending the technology acceptance model. Journal of Computer Assisted Learning, 24(2), 128–143.

  • Tomczak, M., & Tomczak, E. (2014). The need to report effect size estimates revisited. An overview of some recommended measures of effect size. Trends in Sport Sciences, 1, 19–25.

  • UCAS (2017). 2017 end of cycle report: Patterns by age. Cheltenham: UCAS. Available at: https://www.ucas.com/corporate/data-and-analysis/ucas-undergraduate-releases/ucas-undergraduate-analysis-reports/2017-end-cycle-report (Accessed 6 May 2018).

  • UCAS (2018). End of cycle report 2018: Summary of applicants and acceptances. Cheltenham: UCAS. Available at: https://www.ucas.com/data-and-analysis/undergraduate-statistics-and-reports/ucas-undergraduate-end-cycle-reports/2018-end-cycle-report (Accessed 7 April 2019).

  • UCAS (2019). UCAS end of cycle report 2019: Chapter 1: Summary of applicants and acceptances. Cheltenham: UCAS. Available at: https://www.ucas.com/data-and-analysis/undergraduate-statistics-and-reports/ucas-undergraduate-end-cycle-reports/2019-end-cycle-report (Accessed 17 Jan 2020).

  • Venkatesh, V., & Davis, F. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46, 186–204.

  • Waller, R. (2006). ‘I don’t feel like “a student”, I feel like “me”!’: The over-simplification of mature learners’ experience(s). Research in Post-Compulsory Education, 11(1), 115–130.

  • Williams, B., Onsman, A., & Brown, T. (2010). Exploratory factor analysis: A five-step guide for novices. Australasian Journal of Paramedicine, 8(3), 1–13.

  • Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning, 21(1), 15–35.

  • Zhang, X., Noor, R., & Savalei, V. (2016). Examining the effect of reverse worded items on the factor structure of the need for cognition scale. PLoS One, 11(6), 1–15.

  • Zohar, A., & Dori, Y. J. (2003). Higher order thinking skills and low-achieving students: Are they mutually exclusive? Journal of the Learning Sciences, 12(2), 145–181.

  • Zwick, W. R., & Velicer, W. F. (1986). Comparison of five rules for determining the number of components to retain. Psychological Bulletin, 99(3), 432.

Acknowledgements

Thank you to Jon Scaife and Andy McLean for their support and useful discussions about this research.

Thank you also to the reviewers, whose comments helped to improve this manuscript.

Funding

No funding was received for this research.

Author information

Contributions

All work was carried out by the sole author. The author read and approved the final manuscript.

Corresponding author

Correspondence to Rachel V. Staddon.

Ethics declarations

Ethics approval and consent to participate

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional research committee (University Research Ethics Committee) and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. The project was reviewed via the University of Sheffield Ethics Review Procedure, as administered by the School of Education.

Informed consent was obtained from all individual participants included in the study. This article does not contain any studies with animals performed by any of the authors.

Competing interests

The author declares she has no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Technology Attitudes Questionnaire

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Staddon, R.V. Bringing technology to the mature classroom: age differences in use and attitudes. Int J Educ Technol High Educ 17, 11 (2020). https://doi.org/10.1186/s41239-020-00184-4


Keywords