- Research article
- Open Access
Assessing digital competence and its relationship with the socioeconomic level of Chilean university students
International Journal of Educational Technology in Higher Education volume 19, Article number: 46 (2022)
Digital competence (DC) is one of the key aspects of citizen development in the digital age. DC is particularly important in the training of university students and future teachers. This article presents the main results of a study that evaluated DC and its relationship with the socioeconomic level of first-year pedagogy students at three Chilean public universities, located in the north, center, and south of the country. A quantitative research methodology was used with a sample of 817 students; the data were collected through the DIGCOMP-PED evaluation instrument, which assesses DC development using the DIGCOMP framework. The results were analyzed both overall and by socioeconomic level, through the variables of the educational establishment where students attended high school and the territorial area of the university they attend. The main results indicate that the level of DC achievement is intermediate; the areas with the highest levels of achievement were “network security” and “online communication and collaboration,” while the lowest levels of achievement were reached in “information and digital literacy,” “digital content creation,” and “problem solving.” The level of DC is higher among students from private establishments and those who attend universities located in the central zone.
The European Commission (2007) defines the key competences for twenty-first century citizens as a set of knowledge, skills, and attitudes that must be attained in order to participate in society and learn throughout life; one of these competences is digital competence (DC). DC is increasingly important in university-level training, where it is essential for students to develop progressive levels of autonomy and learning using digital technologies (DT), adapting to the continuous changes and advances of the digital society (Sánchez-Caballé et al., 2020). Students who reach university, despite being part of the digital-age generation, have significant weaknesses in the use of internet tools (Liesa-Orús et al., 2016). In recent years, digital competence has become a highly relevant line of research in the field of educational technology, both for teachers and students (Durán et al., 2016). Diagnosing DC is a cornerstone for generating training plans aimed at developing these competences in students. Future teachers need to develop DC as a basis for developing digital teaching competence (Silva et al., 2019). A digitally competent teacher is key to embedding DT in education (Engen, 2019).
According to several authors, the socioeconomic level of families is relevant to the development of pupils’ digital competence (Zhong, 2011; Claro et al., 2012). In this sense, some studies highlight the economic influence (Román & Murillo, 2013), showing that having a computer at home increases pupils’ digital competence (Gómez-Pablos et al., 2020; Kuhlmeier & Hemker, 2007; Livingstone & Helsper, 2007). There is also evidence that students’ socioeconomic background influences their ability to use new technologies through the resources available to them outside school (Fernández-Mellizo & Manzano, 2018).
The socioeconomic level (SES) in Chile is directly related to the type of establishment in which students enroll (Barrientos-Oradini & Araya Castillo, 2018). This has an impact on their scores on university admission tests and, therefore, on the university they have access to (Araneda-Guirrimán et al., 2018). Municipal establishments receive students from low SES, private subsidized establishments receive students from middle SES, and private fee-paying establishments receive students from high SES. Top-ranking universities are accessed by students achieving the highest scores on the selection tests (Brunner, 2012). In Chile, there are marked differences in the type of institution that young people access according to their socioeconomic level; admission to universities is related to the score on the selection test, which limits the possibilities of young people with fewer resources, as they systematically obtain lower scores on this test (Catalán & Santelices, 2014). At a general level, SES is one of the variables that most influence students’ DC (Hatlevik et al., 2018). This aspect is no different in Chile, where studies show that SES is directly related to students’ DC level (Claro et al., 2015; Jara et al., 2015). For this reason, this study seeks to determine the DC level of first-year pedagogy students, crossed with two socioeconomic variables: the type of educational establishment where they attended secondary school and the territorial area of the university they attend.
Digital competence (DC) involves “the confident, critical and responsible use of, and engagement with, digital technologies for learning, at work, and for participation in society” (European Commission, 2018, p. 4). DC is understood as the sum of skills, knowledge, and attitudes in technological, informational, multimedia, and communicative aspects, which give rise to a complex multiple literacy (Ferrari, 2012).
Several entities around the world have developed guiding frameworks to define dimensions and indicators for DC, the best known of which are: iSkills (Pérez-Escoda et al., 2019), the International Society for Technology in Education standards (Fuller, 2020), DigiLit Leicester (Fraser et al., 2013), ICILS by the IEA (Punter et al., 2017), and finally, DIGCOMP by the European Commission (Redecker & Punie, 2017) and its latest version, DIGCOMP 2.1 (Carretero et al., 2017). The latter framework takes a global view of DC that includes knowledge, skills, and attitudes, and covers the areas of information, communication, content creation, security, and problem solving. It is currently being used in several studies to assess DC at the university level in general (López-Meneses et al., 2020; Vázquez-Cano et al., 2017) and in the field of pedagogy (González-Calatayud et al., 2018; Gutiérrez & Serrano, 2016).
DC plays an important role in the personal and professional development of university students. For Gros (2015), students need to develop important skills to guide their educational processes, including digital skills. Developing digital skills is considered crucial for university students (Aguaded & Cabero, 2013). However, students do not demonstrate high proficiency in the use of DT for learning (Prendes-Espinosa & Román-García, 2017), and they reach university with a basic level of digital competence (Sánchez-Caballé et al., 2020). The university student body is part of a digital generation; however, they do not learn better simply because DT are used, and we need to work with them to develop DC (Gutiérrez et al., 2018). This means that students’ DC needs to be assessed in order to design extracurricular training plans and to incorporate activities that encourage DC development into the curriculum.
Numerous studies have used the DIGCOMP framework. López-Meneses et al. (2020) evaluated the competence of university students from three European universities in three areas: “information and data literacy,” “communication and collaboration,” and “digital content creation.” The results showed that these future graduates had a high-intermediate level of competence in “information and digital literacy” and “communication and collaboration,” but a low-intermediate level in “digital content creation.” González-Calatayud et al. (2018) carried out a study whose main purpose was to improve the DC of second year pedagogy students through tasks focused on working on each of the areas of DIGCOMP. Students generally showed a medium level of digital competence in all areas: “problem solving,” “information and data literacy,” and “digital content creation” recorded the lowest average values, while “communication and collaboration” and “network security” showed higher averages. Gutiérrez and Serrano (2016) analyzed DC in first year students of primary education pedagogy. In accordance with the DIGCOMP framework, the results indicate that participants consider themselves competent in the most basic aspects of digital competence.
Research on students’ socioeconomic level has indicated that factors related to students’ family background influence their ICT literacy outcomes (Siddiq et al., 2017). Students from low SES households express less self-efficacy in ICT (Vekiri, 2010). The ACARA study (2012) notes that children of parents with low educational levels showed poorer ICT literacy than children of parents with higher education. In a study conducted in 15 countries, Hatlevik et al. (2018) found that socioeconomic status was the most important predictor of computer and information competence in all countries.
In Chile, the national digital skills test (SIMCE TIC), which evaluates proficiency levels in ICT skills for learning through simulated environments and was applied to high school students in 2013, shows that 46.9% of the students reached the initial level, 51.3% the intermediate level, and only 1.8% the advanced level. In addition, students in private schools show the highest levels of ICT skills and students in municipal schools the lowest (Rodríguez-Garcés & Muñoz-Soto, 2018). Analyzing the results of this ICT skills test, Jara et al. (2015) note that students who have a computer at home score better than those who do not, and that students from families with a higher SES and more cultural possessions scored higher. For their part, Claro et al. (2015) found that parents’ education level was the most relevant factor in explaining students’ scores.
There are several tools to assess DC in undergraduate education at the level of self-perception or diagnosis, including INCOTIC 2.0 (González et al., 2018), ACUTIC (Mirete et al., 2015), CODIEU (Casillas et al., 2018), REATIC (Moya-Martínez et al., 2011), and INCODIES (Guillén-Gámez & Mayorga-Fernández, 2020). The latter follows the structure of the European DIGCOMP framework, which can serve as a structure and basis for developing specific DC evaluations (Petterson, 2017). Measuring DC is a critical challenge to better understand its development, so further evaluation tools need to be developed for this measurement (He & Zhu, 2017).
Objective assessment tools are increasingly required, which are not based only on the user’s perception but measure the level of DC by having students solve situations or problems aligned with the indicators to be evaluated (Villa-Sánchez & Poblete-Ruiz, 2011, p. 150). It is also important to emphasize that there are differences between university students’ perception of their own digital competence and the skills they actually demonstrate (Gabarda-Méndez et al., 2017). One way forward is to combine evaluation tests with self-assessment tests (Rosman, 2015). Therefore, the challenge is to use an objective DC evaluation test that allows the knowledge of university students to be assessed validly and reliably.
General and specific objectives
General Objective: To determine the level of development of Digital Competence (DC) of first year pedagogy students in Chilean public universities, and its relationship with socioeconomic level, through the variables: educational establishment where they attended high school and the university they attend.
Objective 1: Evaluate the level of digital competence of a sample of students from three Chilean public universities.
Objective 2: Study the relationship between the level of achievement of digital competence and the educational establishment where they attended secondary education.
Objective 3: Study the relationship between the level of achievement of digital competence and the university in which they enroll.
The sample of this study was made up of 817 first year students of higher education who belonged to pedagogy programs of three public universities in northern, central, and southern Chile. This study was conducted during the 2020 academic year. The characteristics of the participants in this research are reported in Table 1.
65.1% of the students are female and 33.8% male. 52.6% attended private subsidized (co-financed) establishments and 39.2% municipal establishments. Regarding the programs in which students are enrolled, 53.5% study secondary education pedagogy and 29.4% primary education pedagogy. Regarding the university they attend, 54.8% are enrolled in a university in the center of the country; 33.9%, in the north; and 11.3%, in the south.
In order to determine the DC level of pedagogy students, the DIGCOMP-PED evaluation instrument was used, which presents a variety of situations that first year university students face in their daily life regarding different uses of digital technology. The instrument was built considering the DIGCOMP framework (Redecker & Punie, 2017), specifically its latest version, DIGCOMP 2.1 (Carretero et al., 2017). The instrument evaluates 21 indicators grouped into five areas of competence (Table 2).
After analyzing the dimensions and indicators considered by DIGCOMP 2.1, we generated a test-type evaluation instrument composed of closed, multiple-choice items. The initial version had four items for each of the 21 indicators, 84 items in total.
To ensure the content validity of the instrument, the 84 initial items were validated through expert judgment. The panel comprised five experts in higher education linked to the area of technology and education and related to initial teacher training, four from Chile and one from Spain. Validation matrices were used, in which each expert rated the validity conditions of each item with Yes (1) or No (0). Based on the scores assigned by the experts, the overall quality of the items was established, ranging from 73 to 100%; items rated above 80% were selected.
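The item-selection rule described above (each expert rates an item 1 or 0, and items whose approval exceeds 80% are kept) can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and the rating matrix is hypothetical.

```python
# Sketch of the expert-judgment selection rule: each of 5 experts rates a
# candidate item 1 (valid) or 0 (not valid); items whose approval
# percentage exceeds the 80% threshold are kept. Ratings are hypothetical.

def item_approval(ratings):
    """Percentage of experts who rated the item as valid (1)."""
    return 100.0 * sum(ratings) / len(ratings)

def select_items(rating_matrix, threshold=80.0):
    """Indices of items whose approval strictly exceeds the threshold."""
    return [i for i, ratings in enumerate(rating_matrix)
            if item_approval(ratings) > threshold]

# Hypothetical ratings: 4 candidate items x 5 experts.
ratings = [
    [1, 1, 1, 1, 1],  # 100% -> kept
    [1, 1, 1, 1, 0],  # 80%  -> not kept (not strictly above 80%)
    [1, 1, 1, 0, 0],  # 60%  -> not kept
    [1, 1, 1, 1, 1],  # 100% -> kept
]
print(select_items(ratings))  # [0, 3]
```

Whether 80% itself counts as "over 80%" is a design choice left open by the text; the sketch treats the threshold as strict.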
The final version of the instrument comprised the three highest rated items for each of the 21 indicators, giving 63 items in total, distributed across the five areas: “information and digital literacy,” 9 items; “online communication and collaboration,” 18 items; “digital content creation,” 12 items; “network security,” 12 items; and “problem solving,” 12 items. Indicator 1 was made up of the first three items; indicator 2, of the next three items; and so on. Each item was scored 1 point if correct and 0 if incorrect. Indicator scores therefore range from 0 to 3, and the overall instrument score from 0 to 63.
The reliability of the instrument was evaluated using the Kuder-Richardson 21 coefficient (McGahee & Ball, 2009), indicating that the consistency of the responses at the total level was acceptable (KR-21 = 0.60); Cronbach’s alpha was 0.702. The difficulty level of the test was adequate (DL = 55.06%), and the minimum acceptable performance score (MAP) was 60%. Some example items are shown below (Figs. 1 and 2).
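The reliability statistics reported above (KR-21 and Cronbach's alpha) can be computed directly from a binary response matrix in which rows are students and columns are the dichotomous items. The sketch below is illustrative only, not the authors' code; it assumes sample variances, and the small response matrix is hypothetical.

```python
# Minimal sketch of KR-21 and Cronbach's alpha for a 0/1 scored test.
# Assumption: sample variance (n-1 denominator) is used throughout.
from statistics import mean, variance

def kr21(total_scores, k):
    """Kuder-Richardson 21 from total scores on k dichotomous items."""
    m = mean(total_scores)
    var = variance(total_scores)
    return (k / (k - 1)) * (1 - m * (k - m) / (k * var))

def cronbach_alpha(matrix):
    """Cronbach's alpha: rows are respondents, columns are items."""
    k = len(matrix[0])
    item_cols = list(zip(*matrix))               # transpose to per-item columns
    sum_item_vars = sum(variance(col) for col in item_cols)
    totals = [sum(row) for row in matrix]
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))

# Tiny hypothetical data: 4 students x 5 items scored 1 (correct) / 0.
responses = [
    [1, 1, 1, 0, 0],
    [1, 1, 1, 1, 0],
    [1, 1, 0, 0, 0],
    [1, 1, 1, 1, 1],
]
totals = [sum(row) for row in responses]         # [3, 4, 2, 5]
print(round(kr21(totals, k=5), 4))               # 0.4625
print(round(cronbach_alpha(responses), 3))       # 0.625
```

For dichotomous items, Cronbach's alpha reduces to KR-20; KR-21 is the simpler approximation that assumes all items have equal difficulty.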
This research, which involves human participants, was approved by the ethics committee of the University of Santiago de Chile (Nº 410/2019). Participation in the research was voluntary and was not mediated by any incentive or reward. The teams responsible for the study safeguarded anonymity and complied with data-transfer requirements, requesting informed consent from the participants prior to the application. The instrument was answered digitally through the link provided. It was applied at the three universities at the beginning of the 2020 academic year, during the administration of the mandatory diagnostic tests required by the Chilean Ministry of Education, a process that lasted one month. The instrument is not part of the mandatory diagnosis; it is an initiative of the three participating universities. The students’ answers were downloaded and saved in an Excel spreadsheet and then exported to the SPSS statistical program.
To answer the research questions, the analysis of the assessment instrument’s results first considered a descriptive analysis of the data at the level of areas and indicators. Then, independent-samples t-tests were performed to evaluate mean differences in the scores obtained in the indicators and areas according to the variables studied. In addition, one-way ANOVA tests were performed to evaluate differences in the scores according to the variables: educational establishment where students attended high school and university attended. Tukey post-hoc tests were performed to identify the pairs of groups between which the ANOVA detected statistically significant differences. All analyses were run in SPSS (IBM Corp., 2016).
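The independent-samples t-tests and one-way ANOVAs described above were run in SPSS; as an illustration of what those statistics compute, the same test statistics can be sketched in plain Python. The grouped scores below are hypothetical.

```python
# Pure-Python sketch (for illustration only; the study used SPSS) of an
# independent-samples t statistic with pooled variance and a one-way
# ANOVA F statistic. Hypothetical DC scores for three school types.
from math import sqrt
from statistics import mean, variance

def t_independent(a, b):
    """Student's t statistic for two independent samples (pooled variance)."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(pooled * (1 / na + 1 / nb))

def one_way_anova_f(groups):
    """F statistic for k independent groups of scores."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    # Between-groups and within-groups sums of squares.
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

municipal, subsidized, private = [1, 2, 3], [2, 3, 4], [4, 5, 6]
print(round(t_independent(municipal, private), 3))                  # -3.674
print(round(one_way_anova_f([municipal, subsidized, private]), 3))  # 7.0
```

In practice, library routines such as SciPy's `scipy.stats.ttest_ind` and `scipy.stats.f_oneway` also return the corresponding p-values, and Tukey's post-hoc comparisons require the studentized range distribution, which is why statistical packages are used for the full analysis.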
Analysis and results
Level of achievement in areas and indicators
Regarding the areas of digital competence (Fig. 3), the overall level of achievement was 55.1%. “Information and digital literacy,” 47.7%; “problem solving,” 47.3%; and “digital content creation,” 45.5%, were the areas of lowest achievement, with percentages below 50%. Meanwhile, “network security,” 73.2%, and “online communication and collaboration,” 58.2%, were the areas of digital competence that obtained the highest percentages of achievement.
As shown in Table 3, the area of highest achievement, “network security,” presents four indicators; the highest mean is obtained by the Device protection indicator (M = 0.813, SD = 0.232) and the lowest by the Health and wellbeing protection indicator (M = 0.670, SD = 0.241). The area of lowest achievement, “digital content creation,” also presents four indicators; the highest mean corresponds to the Content development indicator (M = 0.585, SD = 0.281) and the lowest to the Programming indicator (M = 0.282, SD = 0.248).
Digital competence by establishment of origin
The results of the level of digital competence by students’ school of origin (Fig. 4) show that the average is 53.5% for the municipal sector; 55.6% for private subsidized schools; and 58.9% for private schools. The dimension with the highest achievement across the educational centers is “network security,” with achievement values above 70%. For the private and private subsidized schools, the dimension with the lowest achievement is “digital content creation,” with an achievement level below 47%.
Table 4 shows that for municipal, private subsidized, and private establishments the indicator with the highest mean is Device protection, with (M = 0.800, SD = 0.246), (M = 0.817, SD = 0.224), and (M = 0.854, SD = 0.205), respectively. The indicator with the lowest mean for municipal and private establishments is Creative use of digital technology, (M = 0.259, SD = 0.260) and (M = 0.284, SD = 0.283) respectively, while for private subsidized establishments it is Programming (M = 0.272, SD = 0.248).
Statistically significant differences were found in the total score (F(3, 813) = 5.404, p < 0.001) and in the areas: “information and digital literacy” (F(3, 813) = 3.499, p < 0.05), “online communication and collaboration” (F(3, 813) = 3.499, p < 0.05), “network security” (F(3, 813) = 3.128, p < 0.05), and “problem solving” (F(3, 813) = 3.076, p < 0.05). In these areas, students from private schools have significantly higher scores than those from municipal and private subsidized schools.
Differences were found in the following indicators: Digital data, information, and contents management (F(3, 813) = 3.285, p < 0.05), Interacting through digital technologies (F(3, 813) = 3.046, p < 0.05), Citizen participation through digital technologies (F(3, 813) = 2.845, p < 0.05), Collaboration through digital technologies (F(3, 813) = 2.738, p < 0.05), Environmental protection (F(3, 813) = 2.718, p < 0.05), Identifying technological needs and responses (F(3, 813) = 3.076, p < 0.05), Creative use of digital technology (F(3, 813) = 3.414, p < 0.05), Identifying gaps in digital skills (F(3, 813) = 5.468, p < 0.05). In general, students from private fee-paying schools tend to have significantly higher scores than those from municipal and private subsidized schools.
Digital competence by university admittance
The results of the level of digital competence by university of admission (Fig. 5) show that the average for the central zone is 57.0%; for the north zone, 50%; and for the south zone, 53.3%. The highest achievement for the three institutions was in “network security,” with levels above 71.0%. The lowest achievement for the three institutions was in “digital content creation,” with levels below 47%.
Table 5 shows that for the three universities the indicator with the highest mean is Device protection: north zone (M = 0.839, SD = 0.203), center zone (M = 0.771, SD = 0.265), and south zone (M = 0.789, SD = 0.257). The indicator with the lowest mean for the three universities is Creative use of digital technology: north zone (M = 0.296, SD = 0.275), center zone (M = 0.246, SD = 0.251), and south zone (M = 0.294, SD = 0.279).
Statistically significant differences were found overall (F(3, 813) = 23.576, p < 0.001) and in each area: “information and digital literacy” (F(3, 813) = 4.0.854, p < 0.05), “online communication and collaboration” (F(3, 813) = 21.714, p < 0.05), “digital content creation” (F(3, 813) = 8.447, p < 0.05), “network security” (F(3, 813) = 6.800, p < 0.05), and “problem solving” (F(3, 813) = 10.536, p < 0.05). In all areas, there is a tendency for students from central zone universities to have significantly higher scores than students from the northern and southern zone universities.
The indicators also showed differences: Digital data, information, and contents management (F(3, 813) = 3.169, p < 0.05), Interacting through digital technologies (F(3, 813) = 5.370, p < 0.05), Citizen participation through digital technologies (F(3, 813) = 22.419, p < 0.05), Online behavior (F(3, 813) = 3.611, p < 0.05), Integrating and re-elaborating digital content (F(3, 813) = 11.584, p < 0.05), Device protection (F(3, 813) = 5.830, p < 0.05), Environmental protection (F(3, 813) = 4.490, p < 0.05), Technical problem solving (F(3, 813) = 3.334, p < 0.05), Identifying technological needs and responses (F(3, 813) = 2.469, p = 0.085), Identifying gaps in digital skills (F(3, 813) = 11.359, p < 0.05). In all the above areas and indicators, students from the central zone university tend to have significantly higher scores than those from the northern zone university, followed by those from the southern zone university. This trend does not occur for the indicator Evaluating data, information, and digital content (F(3, 813) = 3.285, p < 0.05), where the northern zone university scores higher than the central zone university.
Discussion and conclusions
The general objective of this study was to determine the level of Digital Competence (DC) development of first year pedagogy students in Chilean public universities and its relationship with socioeconomic level, through the variables: educational institution where they attended high school and university where they pursue their higher studies. The results presented in the previous section are discussed according to the specific objectives. The first objective was to evaluate the level of digital competence of a sample of students from three Chilean public universities. The results show that students have an intermediate level of achievement in the five DC areas of the DIGCOMP framework (55.1%). These results agree with Rodríguez-Garcés and Muñoz-Soto (2018), who report that 51.3% of students reached an intermediate level on the national ICT skills test. They differ from the self-perception study of digital skills according to DIGCOMP by Segrera-Arellana (2020), where most university students consider themselves at the advanced level. They also differ slightly from the study by Gutiérrez and Serrano (2016), where students in the first year of primary education pedagogy consider themselves competent in the most basic DC aspects of the DIGCOMP framework. The data agree with González et al. (2018), who, using the same framework, found that university students generally show an average level of DC. This differs from other studies showing that university students have a basic level of DC (Liesa-Orús et al., 2016; Sánchez-Caballé et al., 2020).
The areas of highest achievement were “network security” and “online communication and collaboration,” whereas “problem solving,” “information and digital literacy,” and “content creation” recorded the lowest values. These results agree with those reported by González-Calatayud et al. (2018), who conducted a study with second year pedagogy students using the DIGCOMP framework. University students have a good perception of their computer security knowledge and an awareness of performing backups frequently (Roque Hernández & Juárez Ibarra, 2018). The result in “network security” coincides with the study by Gallego-Arrufat et al. (2019), conducted with future undergraduate teachers from Spain and Portugal, where the area of digital security was the best evaluated: students have good attitudes towards safety, but less knowledge, skills, and practices related to the safe and responsible use of the Internet. Data from similar studies that used the DIGCOMP framework with pedagogy students are discussed below. The level of achievement in “online communication and collaboration” agrees with the high intermediate level found by López-Meneses et al. (2020) and Gutiérrez and Serrano (2016). Among the areas of lower achievement, “information and digital literacy” obtained a score below 50%, which differs from the high intermediate level obtained by students in the study by López-Meneses et al. (2020); Gutiérrez and Serrano (2016) and Napal-Frailen et al. (2018) report similarly higher results, with this area recording the highest achievement in the latter study. The results in “content creation” coincide with the studies of López-Meneses et al. (2020) and Gutiérrez and Serrano (2016), who, like our study, report low levels for this area.
In addition, the study by Napal-Frailen et al. (2018) shows, like our study, that “problem solving” and “content creation” are among the areas of lowest achievement.
The second and third objectives were to study the relationship between socioeconomic level and achievement of DC through the variables of the school where they attended high school and the university they entered. In both, we sought to study possible differences between levels of DC development.
The results show a higher level of achievement among students from private schools (58.9%) and a lower achievement in the municipal sector (53.5%). The area with the highest level of achievement for the three types of establishments was “network security;” the lowest was “digital content creation” for private and private subsidized schools, and “problem solving” for municipal schools. Statistically significant differences were found on the scale in the following areas: “information and digital literacy,” “online communication and collaboration,” “network security,” and “problem solving,” with students from private establishments scoring significantly higher than those from municipal and private subsidized establishments.
At the university level, the highest level of achievement is found among students who entered the university in the central zone of the country (57%) and the lowest among those at the university in the north of the country (50.7%). This finding is related to the fact that students with a higher SES, who come from educational establishments with a higher SES, obtain higher scores on the university selection tests and access the best-ranked universities, which are located in the central zone of the country. The areas with the highest levels of achievement for the three universities were “network security” and “online communication and collaboration;” the lowest level of achievement for the universities in the central and northern zones was “content creation” and, for the one in the southern zone, “problem solving.” Statistically significant differences were found in the five areas of DIGCOMP, with students from the central zone university scoring significantly higher than students from the northern and southern zone universities.
The results obtained in this study confirm the relationship between the achievement level of ICT competencies and the socioeconomic level of Chilean students. SES is directly related to students’ DC level (Claro et al., 2015; Jara et al., 2015) and is one of the variables that most influence students’ DC (Hatlevik et al., 2018). The use of ICT outside school is closely linked to SES, as its use in low socioeconomic contexts is more limited (Hollingworth et al., 2011), and having a computer has an impact on the level of DC (Jara et al., 2015). In a study with Iranian university students, Nami and Vaezi (2018) found that students with a personal computer demonstrated higher levels of technological literacy. The low levels of DC in the Chilean student body would thus be conditioned by SES, with students from low SES families displaying lower achievement than those with high SES (Rodríguez & Muñoz, 2018).
The findings of this study show that, according to the DIGCOMP framework, first year pedagogy students have not adequately developed DC. This ratifies the need to diagnose the level of DC through evaluation instruments and to develop training opportunities to improve its development, integrating DC into the curriculum of university programs, especially pedagogy programs, so that students use technology for their academic and personal development. Domingo-Coscolla et al. (2020) indicate that communication and collaboration should be prioritized during the teaching and learning process, using digital technologies that favor them.
It is necessary to promote innovative actions in education that consider the use of technologies and the development of DC in students (European Commission, 2018). For Guzmán-Simón et al. (2017), university teaching should incorporate DC as part of students’ academic training, especially for students who are preparing to become teachers. Developing DC in the early years is fundamental to achieving DC in the later years (Silva et al., 2019). González-Calatayud et al. (2018) show that after a training process the deficient areas of DC improve considerably. Students with basic DC levels perceive that formative work developed within an ICT subject helps them improve their DC (Gutiérrez & Serrano, 2016).
Measuring DC is a critical challenge to better understand its development in practice (He & Zhu, 2017). Self-perception tests yield higher DC ratings than assessment tests (Gabarda-Méndez et al., 2017); one possibility is to combine them (Rosman, 2015). Therefore, it is appropriate to move forward in the construction of instruments to assess the level of DC, based on existing publications and DC frameworks. In this context, it is useful to adopt a proven framework that identifies and describes DC, such as DIGCOMP. The DIGCOMP-PED instrument used in this study is a good starting point for assessing DC in university students, as it consists of a set of questions that confront the student with concrete situations whose solution requires the use of DC.
Among the corrective measures to improve students’ level of digital competence, regardless of their socioeconomic level, is training that covers the areas of digital competence according to DIGCOMP; each area can be a module, or all areas together a full course. There is the experience of developing the digital competencies of university teachers according to the DigCompEdu framework through a MOOC (Cabero-Almenara et al., 2021). MOOCs contribute to the 2030 Agenda for Sustainable Development because they can guarantee inclusion, as they are massive, open, free, and accessible; in addition, they develop competences for autonomous work.
The main limitations of the study concern the instrument. Regarding the number of items, it would be advisable to consider a larger item pool validated by a greater number of experts. Another aspect to consider is that each item admitted only one correct answer. Finally, the instrument was answered voluntarily online; therefore, the different areas of teacher education are not equally represented at each university.
As proposals for future work, we consider it interesting:
- To improve the instrument, expanding the questions for each indicator in future research.
- To apply the instrument to university students from other areas of knowledge, such as engineering, medicine, and law, among others.
- To carry out comparative studies between public and private universities in the same country, and between Latin American and European universities, since this subject is of growing interest.
Availability of data and materials
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Abbreviations
ICILS: International Computer and Information Literacy Study
ACARA. (2012). National Assessment Program – ICT Literacy Years 6 & 10 Report 2011. Sydney: Australian Curriculum, Assessment and Reporting Authority. http://www.nap.edu.au/verve/_resources/nap_ictl_2011_public_report_final.pdf
Aguaded, I., & Cabero, J. (2013). Tecnologías y medios para la educación en la e-sociedad. Alianza Editorial.
Araneda-Guirriman, C., Sallán, J. G., Pedraja-Rejas, L., & Rodríguez-Ponce, E. (2018). Percepciones sobre el perfil del estudiante universitario en el contexto de la educación superior de masas: Aproximaciones desde Chile. Interciencia, 43(12), 864–870.
Barrientos-Oradini, N., & Araya-Castillo, L. (2018). Educación Superior en Chile: una visión sistémica. Aletheia. Revista de Desarrollo Humano, Educativo y Social Contemporáneo, 10(2), 80–109. https://aletheia.cinde.org.co/index.php/ALETHEIA/article/view/507/277
Brunner, J. J. (2012). La lucha por la educación de las elites: Campo y canales formativos. Revista UDP, 9(1), 119–143.
Cabero-Almenara, J., Barragán-Sánchez, R., Palacios-Rodríguez, A., & Martín-Párraga, L. (2021). Design and Validation of t-MOOC for the Development of the Digital Competence of Non-University Teachers. Technologies, 9(4), 84. https://doi.org/10.3390/technologies9040084
Carretero, S., Vuorikari, R., & Punie, Y. (2017). The digital competence framework for citizens with eight proficiency levels and examples of use. Luxembourg: Publications Office of the European Union. https://doi.org/10.2760/38842
Casillas, S., Cabezas, M., Sanches-Ferreira, M., & Teixeira, F.L. (2018). Psychometric Study of a Questionnaire to Measure the Digital Competence of University Students (CODIEU). Education in the Knowledge Society, 19(3), 69–81. https://doi.org/10.14201/eks20181936981.
Catalán, X., & Santelices, M. (2014). Rendimiento académico de estudiantes de distinto nivel socioeconómico en universidades: El caso de la Pontificia Universidad Católica de Chile. Calidad En La Educación, 40, 21–52. https://doi.org/10.4067/S0718-45652014000100002
Claro, M., Preiss, D., San Martín, E., Jara, I., Hinostroza, J. E., Valenzuela, S., et al. (2012). Assessment of 21st century ICT skills in Chile: Test design and results from high school level students. Computers & Education, 59(3), 1042–1053. https://doi.org/10.1016/j.compedu.2012.04.004
Claro, M., Cabello, T., San Martin, E., & Nussbaum, M. (2015). Comparing marginal effects of Chilean students’ economic, social and cultural status on digital versus reading and mathematics performance. Computers & Education, 82, 1–10. https://doi.org/10.1016/j.compedu.2014.10.018
Domingo-Coscolla, M., Bosco, A., Carrasco-Segovia, S., & Sánchez-Valero, J. A. (2020). Fomentando la competencia digital docente en la universidad: Percepción de estudiantes y docentes. Revista De Investigación Educativa, 38(1), 167–782. https://doi.org/10.6018/rie.340551
Durán, M., Gutiérrez, I., & Prendes, M. (2016). Análisis conceptual de modelos de competencia digital del profesorado universitario. RELATEC: Revista Latinoamericana de Tecnología Educativa, 15(1), 97–114. http://hdl.handle.net/10662/5790
Engen, B. K. (2019). Understanding social and cultural aspects of teachers’ digital competencies. Comunicar, 27(61), 9–19. https://doi.org/10.3916/C61-2019-01
European Commission (Ed.) (2018), Proposal for a council recommendation on key competences for lifelong learning, Brussels, The council of the European union. https://bit.ly/2YsyGNz
Fernández-Mellizo, M., & Manzano, D. (2018). Anàlisi de les diferències en la competència digital dels alumnes espanyols. Papers: revista de Sociología, 103(2), 175–198. https://raco.cat/index.php/Papers/article/view/336759
Ferrari, A. (2012). Digital competence in practice: An analysis of frameworks. Sevilla: JRC-IPTS. http://ftp.jrc.es/EURdoc/JRC68116.pdf
Fraser, J., Atkins, L. & Hall, R. (2013) DigiLit Leicester: Supporting teachers, promoting digital literacy, transforming learning, Leicester, Leicester City Council http://www.josiefraser.com/wp-content/uploads/2013/10/DigiLit-Leicester-report-130625-FINAL.pdf
Fuller, M. T. (2020). ISTE standards for students, digital learners, and online Learning. In Handbook of Research on Digital Learning (pp. 284–290). IGI Global.
Gabarda-Méndez, V., Rodríguez Martín, A., & Moreno Rodríguez, M. D. (2017). La competencia digital en estudiantes de magisterio. Análisis competencial y percepción personal del futuro maestro. Educatio Siglo XXI, 35(2), 253. https://doi.org/10.6018/j/298601
Gallego-Arrufat, M.-J., Torres-Hernández, N., & Pessoa, T. (2019). Competence of future teachers in the digital security area. Comunicar, 27(61), 57–67. https://doi.org/10.3916/C61-2019-05
Gómez-Pablos, V., García-Valcárcel, A., Casillas-Martín, S., & Cabezas-González, M. (2020). Evaluación de competencias informacionales en escolares y estudio de algunas variables influyentes. Revista Complutense de Educación, 4(31), 517–528. https://doi.org/10.5209/rced.65835
González, J., Esteve, F. M., Larraz, V., Espuny, C., & Gisbert, M. (2018). INCOTIC 2.0: Una nueva herramienta para la autoevaluación de la competencia digital del alumnado universitario. Profesorado, 22(4), 133–152. https://doi.org/10.30827/profesorado.v22i4.8401
González-Calatayud, V., Román-García, M., & Prendes-Espinosa, M. P. (2018). Formación en competencias digitales para estudiantes universitarios basada en el modelo DIGCOMP. Edutec Revista Electrónica de Tecnología Educativa, 65, 1–15. https://doi.org/10.21556/edutec.2018.65.1119
Gros, B. (2015). The fall of the walls of knowledge in the digital society and the emerging pedagogies. Education in the Knowledge Society (EKS), 16(1), 58–68. https://doi.org/10.14201/eks20151615868
Guillén-Gámez, F. D., & Mayorga-Fernández, M. J. (2020). Quantitative-comparative research on digital competence in students, graduates and professors of faculty education: An analysis with ANOVA. Education and Information Technologies, 25(5), 4157–4174. https://doi.org/10.1007/s10639-020-10160-0
Gutiérrez, I., Román, M., & Sánchez, M. M. (2018). Estrategias para la comunicación y el trabajo colaborativo en red de los estudiantes universitarios. Revista Comunicar, 54, 91–100. https://doi.org/10.3916/C54-2018-09
Gutiérrez, I., & Serrano, J. L. (2016). Evaluation and development of digital competence in future primary school teachers at the University of Murcia. Journal of New Approaches in Educational Research, 5(1), 51–56. https://doi.org/10.7821/naer.2016.1.152
Guzmán-Simón, F., García-Jiménez, E., & López-Cobo, I. (2017). Undergraduate students’ perspectives on digital competence and academic literacy in a Spanish University. Computers in Human Behavior, 74, 196–204. https://doi.org/10.1016/j.chb.2017.04.040
Hatlevik, O. E., Throndsen, I., Loi, M., & Gudmundsdottir, G. B. (2018). Students’ ICT self-efficacy and computer and information literacy: Determinants and relationships. Computers & Education, 118, 107–119. https://doi.org/10.1016/j.compedu.2017.11.011
He, T., & Zhu, C. (2017). Digital informal learning among Chinese university students: The effects of digital competence and personal factors. International Journal of Educational Technology in Higher Education, 14, 1. https://doi.org/10.1186/s41239-017-0082-x
Hollingworth, S., Mansaray, A., Allen, K., & Rose, A. (2011). Parents’ Perspectives on Technology and Children’s Learning in the Home: Social Class and the Role of the Habitus. Journal of Computer Assisted Learning, 27(4), 347–360. https://doi.org/10.1111/j.1365-2729.2011.00431.x
Jara, I., Claro, M., Hinostroza, J. E., San Martín, E., Rodríguez, P., Cabello, T., & Labbé, C. (2015). Understanding factors related to Chilean students’ digital skills: A mixed methods analysis. Computers & Education, 88, 387–398. https://doi.org/10.1016/j.compedu.2015.07.016
Kuhlemeier, H., & Hemker, B. (2007). The impact of computer use at home on students’ Internet skills. Computers & Education, 49(2), 460–480.
Liesa-Orús, M., Vázquez-Toledo, S., & Lloret-Gazo, J. (2016). Identificación de las fortalezas y debilidades de la competencia digital en el uso de aplicaciones de internet del alumno de primer curso del Grado de Magisterio. Revista Complutense De Educación, 27(2), 845–862. https://doi.org/10.5209/rev_RCED.2016.v27.n2.48409
Livingstone, S., & Helsper, E. (2007). Gradations in digital inclusion: Children, young people and the digital divide. New Media & Society, 9(4), 671–696.
López-Meneses, E., Sirignano, F. M., Vázquez-Cano, E., & Ramírez-Hurtado, J. M. (2020). University students’ digital competence in three areas of the DIGCOMP 2.1 model: A comparative study at three European universities. Australasian Journal of Educational Technology, 1, 69–88. https://doi.org/10.14742/ajet.5583
McGahee, T., & Ball, J. (2009). How to read and really use an item analysis. Nurse Educator, 34, 166–171. https://doi.org/10.1097/NNE.0b013e3181aaba94
Mirete, A., García, F., & Hernández, F. (2015). Cuestionario para el estudio de la actitud, el conocimiento y el uso de TIC (ACUTIC) en Educación Superior: Estudio de fiabilidad y validez. Revista Interuniversitaria de Formación del Profesorado, 83, 75–89. https://www.redalyc.org/articulo.oa?id=27443659006
Moya-Martínez, M. del V., Hernández Bravo, J. R., Hernández Bravo, J. A. & Cózar Gutiérrez, R. (2011). Análisis de los estilos de aprendizaje y las TIC en la formación personal del alumnado universitario a través del cuestionario REATIC. Revista de Investigación Educativa, 29(1), 137–156. https://revistas.um.es/rie/article/view/110481
Nami, F., & Vaezi, S. (2018). How ready are our students for technology-enhanced learning? Students at a university of technology respond. Journal of Computing in Higher Education, 30(3), 510–529. https://doi.org/10.1007/s12528-018-9181-5
Napal-Fraile, M., Peñalva-Vélez, A., & Mendióroz-Lacambra, A. (2018). Development of digital competence in secondary education teachers' training. Education Sciences, 8(3), 104. https://doi.org/10.3390/educsci8030104
Pérez-Escoda, A., García-Ruiz, R., & Aguaded, I. (2019). Dimensions of digital literacy based on five models of development/Dimensiones de la alfabetización digital a partir de cinco modelos de desarrollo. Cultura y Educación, 31(2), 232–266. https://doi.org/10.1080/11356405.2019.1603274
Prendes-Espinosa, M., & Román-García, M. (2017). Entornos Personales de Aprendizaje. Una visión actual de cómo aprender con tecnologías. Octaedro.
Punter, R., Meelissen, M., & Glas, C. (2017). Gender differences in computer and information literacy: An exploration of the performances of girls and boys in ICILS 2013. European Educational Research Journal, 16(6), 762–780. https://doi.org/10.1177/1474904116672468
Redecker, C., & Punie, Y. (2017). European framework for the digital competence of educators: DIGCOMPEdu. In Y. Punie (Ed.), EUR 28775 EN. Luxembourg: Publications Office of the European Union. https://doi.org/10.2760/159770
Rodríguez-Garcés, C. R., & Muñoz-Soto, J. A. (2018). Habilidades TIC para el aprendizaje en estudiantes chilenos: Una insuficiente y segmentada instalación de competencias en la escuela. Paradigma, 39(1), 208–228.
Román, M., & Murrillo, J. (2013). Estimación del efecto escolar para la competencia digital. Aporte del liceo en el desarrollo de las habilidades TIC en estudiantes de secundaria en Chile [Investigation into the effect of school on digital competency: The contribution of the lyceum to the development of ICT in Chilean high school students]. In CEPPE, Desarrollo de habilidades digitales para el siglo XXI en Chile: ¿Qué dice el SIMCE TIC? [Developing digital skills for the twenty-first century in Chile: What does ICT SIMCE say?] (pp. 141–176). Santiago, Chile: LOM Ediciones.
Roque Hernández, R. V., & Juárez Ibarra, C. M. (2018). Concientización y capacitación para incrementar la seguridad informática en estudiantes universitarios. PAAKAT, 8, 14.
Rosman, T., Mayer, A.-K., & Krampen, G. (2015). Combining self-assessments and achievement tests in information literacy assessment: Empirical results and recommendations for practice. Assessment & Evaluation in Higher Education, 40(5), 740–754. https://doi.org/10.1080/02602938.2014.950554
Sánchez-Caballé, A., Gisbert-Cervera, M., & Esteve-Mon, F. (2020). The digital competence of university students: A systematic literature review. Aloma: Revista de Psicologia, Ciències de l'Educació i de l'Esport, 38(1), 63–74. https://doi.org/10.51698/aloma.2020.38.1.63-74
Segrera-Arellana, J. R., Paez-Logreira, H. D., & Polo-Tovar, A. A. (2020). Competencias digitales de los futuros profesionales en tiempos de pandemia. Utopía y Praxis Latinoamericana, 25(11), 222–232. https://doi.org/10.5281/zenodo.42783
Siddiq, F., Gochyyev, P., & Wilson, M. (2017). Learning in Digital Networks – ICT literacy: A novel assessment of students’ 21st century skills. Computers and Education, 109, 11–37. https://doi.org/10.1016/j.compedu.2017.01.014.
Silva, J., Usart, M., & Lázaro-Cantabrana, J.-L. (2019). Teacher’s digital competence among final year Pedagogy students in Chile and Uruguay. Comunicar, 27(61), 33–43. https://doi.org/10.3916/C61-2019-03
Vázquez-Cano, E., Meneses, E. L., & García-Garzón, E. (2017). Differences in basic digital competences between male and female university students of Social Sciences in Spain. International Journal of Educational Technology in Higher Education, 14, 27. https://doi.org/10.1186/s41239-017-0065-y.
Vekiri, I. (2010). Socioeconomic differences in elementary students’ ICT beliefs and out-of-school experiences. Computers & Education, 54(4), 941–950.
Villa-Sánchez, A., & Poblete-Ruiz, M. (2011). Evaluación de competencias genéricas: Principios, oportunidades y limitaciones. Bordón, 63(1), 147–170.
Zhong, Z. J. (2011). From access to usage: The divide of self-reported digital skills among adolescents. Computers & Education, 56(3), 736–746.
We would like to thank Verónica Yañez for the translation of this article into English.
This work has been funded by Universidad de Santiago de Chile (USACH), through Proyecto USA 1756_DICYT, Departamento de Investigaciones Científicas y Tecnológicas, Universidad de Santiago de Chile, and by a cooperation agreement between Universidad de Salamanca, Spain, and Universidad de Santiago de Chile.
The authors declare that they have no competing interests. There are no conflicts with the journal's policies. All authors certify approval of and compliance with the article submitted. This manuscript has not been published elsewhere and is not under consideration by another journal.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Silva-Quiroz, J., Morales-Morgado, E.M. Assessing digital competence and its relationship with the socioeconomic level of Chilean university students. Int J Educ Technol High Educ 19, 46 (2022). https://doi.org/10.1186/s41239-022-00346-6
Keywords
- Higher education
- University students
- New literacy
- Educational quality
- Quantitative analysis