Article

Statistical Validation of the “ECODIES” Questionnaire to Measure the Digital Competence of Colombian High School Students in the Subject of Mathematics

by Ana Bertha Betín de la Hoz 1, Antonio Rodríguez-Fuentes 2,*, María Jesús Caurcel Cara 3 and Carmen del Pilar Gallardo Montes 2

1 Secretary of Education of the District of Bogotá, 111321 Bogotá, Colombia
2 Department of Didactics and School Organization, University of Granada, 18071 Granada, Spain
3 Department of Developmental and Educational Psychology, University of Granada, 18071 Granada, Spain
* Author to whom correspondence should be addressed.
Submission received: 20 November 2022 / Revised: 16 December 2022 / Accepted: 19 December 2022 / Published: 22 December 2022

Abstract

Education in the 21st century faces the challenge of digitalization; the acquisition and development of digital skills in students is therefore indispensable, not only for their learning processes but also for their lives. This study aims to validate the “ECODIES” test, which was used to assess the level of development of digital competence of students in a public high school in Bogotá (Colombia). The test is based on the DigComp model and was administered to a sample of 777 students aged between 11 and 19. The results obtained in the exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and reliability analysis demonstrate the quality of the test. We conducted an instrumental study to analyze the psychometric properties of the questionnaire. It is concluded that “ECODIES” is a test with the reliability and validity required to assess digital competence in the Colombian context; in this way, we hope to foster enough research on this topic to contribute to the development of digital competence in Colombian students.

1. Introduction

In 2020, the global education system faced the challenge of virtual classes due to the emergence of COVID-19. Educational institutions were forced to close their doors to prevent the spread of the virus [1,2]. However, the teaching–learning process had to continue. The pandemic accentuated the gaps already existing in most countries, among which technology was one of the most obvious [3]. A lack of connectivity and resources excluded at least a third of students from continuing to learn [4], straining the relationship between information and communication technologies (ICT) and teaching and learning and exposing what had already been identified as a digital divide. Moving teaching and learning to the virtual environment due to confinement meant a change in the roles of students, teachers, and families [5]. Paradigms regarding both classroom practice and work outside the classroom also changed. Digital skills became vitally important for both students and teachers [6].
These new modalities shone a spotlight on the work of teachers, demanding that they deal with and shift to a virtual form of education based on digital tools that are both synchronous and asynchronous. Technology has thus been the “great ally” of all the social restructuring that has taken place up until now [7].
Certainly, the COVID-19 crisis marked a turning point in the teaching–learning process. It was, and still is, the time to implement new methodologies. ICTs are constantly evolving, and education must adapt to these changes [8]. Teachers believed that the teaching–learning process in a digital environment would be easy, since current students are often referred to as digital natives [9] who, in theory, possess a high level of digital skills and the abilities needed to use ICT appropriately [10]. However, being born in this digitalized era does not mean, per se, that students will properly use ICTs as learning tools or that they have acquired the necessary skills to do so. Neuroscientific research, and its educational offshoot of neuroeducation, has revealed the labeling of digital natives and immigrants to be a neuromyth, that is, a false belief about how the brain works. While it is true that today’s young people have grown up among a multitude of digital options and display skills and abilities that older generations lack, this does not imply, as mentioned above, that they are digitally competent. In the long term, this points to a generation better educated in and more aware of the benefits, applicability, and usefulness of ICT, but the need to improve these skills is a current issue.
Digital competence in teachers and students is not a recent issue. Research has increased due to the continuous use of digital technologies, and educational systems have seen the need to promote the acquisition and development of these skills, especially in students, not only to improve their learning process but also for performance in their future personal and professional lives [11]. Likewise, the development of these competences is closely linked to the success, creativity, and employability of each individual [12], so much so that there are numerous studies focused on analyzing the digital competence of teachers [13,14,15,16,17] and of students of all educational stages (infant, primary, secondary, and higher) [18,19,20] in a variety of contexts. Attention has even been paid to digital training for working with people with functional diversity, both among university students [21,22] and current educators [23,24]. Given the conceptual variety around this competence, it is difficult to arrive at a universal, valid, and accepted definition of digital competence [25]. This contributes to the scarcity of instruments that allow for the measurement of the level of digital competence in the school environment [26]. This competence, also known by other terms such as “digital literacy”, “digital skills”, or “information literacy” [6], was included by the European Commission [27] as one of the eight core competences for lifelong learning. Therefore, it is necessary and indispensable that educational institutions promote and guarantee training in digital competence across the educational community. This contributes to social inclusion, the participation of all individuals regardless of their conditions, and the use of the opportunities associated with digital technologies that our new knowledge society offers [27,28,29].
Thus, digital competence is key to consolidating these dimensions in a pedagogical way in the classroom [30]. Indeed, many education systems and national legislations include it among their objectives and competences. Specifically, its relevance has been growing in the Colombian context through various national programs promoted by the Ministry of ICT, such as MisiónTic 2022, Digital Citizenship, Digital School, and Redvolución, whose purpose is to ensure that students, teachers, and the community in general are competent in the digital field.
Therefore, training in digital competence must be present in the educational process at all levels [31], regardless of the disciplinary field. It is transversal and transferable to any context and field of knowledge (hence its denominations as basic, key, transversal competence, etc.), since it helps in the acquisition of other specific competences, given its importance in contemporary digitalization [32,33,34]. This training should not be reduced to the simple development of instrumental skills for the management of hardware and software [35,36]; on the contrary, digital competence is understood as the ability not only to use ICT, but also to search, understand, evaluate, create, and communicate digital information, transform it into knowledge, and share it. It requires both cognitive and technical skills, but also involves the conscious, safe, and critical use of digital resources [37,38,39].
Along the same lines, Ferrari [40] defines digital competence as the set of skills and attitudes (as well as strategies, values, and awareness) that any citizen requires when using ICT or other digital media to perform tasks, solve problems, communicate, manage information, collaborate, create, share content, and build knowledge in a way that is creative, effective, critical, appropriate, reflective, ethical, autonomous, and flexible. This definition coincides with that of Area and Pessoa [41], confirming that digital competence requires not only instrumental knowledge but also axiological and emotional components since, being immersed in a digital society, we interact constantly, generating emotional states, whether positive or negative, associated with the use of ICTs [42].
Therefore, the purpose of this work is to validate the “ECODIES” instrument in the Colombian context and determine whether it can be used to evaluate digital competence in secondary school students in the territory. The study results will serve as a starting point for research on digital competence in Colombian students so that plans can be drawn up for studies that contribute to the acquisition and development of these competences from all areas of the curriculum of educational institutions.

Assessing Digital Competence

Assessing digital competence in students has become challenging; however, several current models and standards of digital competence can be adapted to the school environment (Table 1).
Researchers such as García-Valcárcel et al. [43] and Paredes-Labra et al. [44] distinguish two research approaches to the evaluation of digital competence in students: (a) self-assessment tests and (b) execution tests. In the first approach, digital competence is assessed by means of surveys in which students express their own opinions on their levels of competence. These tests are subjective, and the results therefore questionable, since students tend to overestimate their own abilities. In the second case, the tests are based on the execution of tasks, troubleshooting, and performance of activities. Although they are designed around technical competences rather than formal skills, they are more appropriate and reliable for evaluating a student’s performance when facing a problem that requires the use of said competence, as shown by Baeza González et al. [45], Casillas-Martín et al. [46], González-Segura et al. [47], and Paredes Labra et al. [44].
Much of the research on the assessment of digital competence leans towards the self-perception approach. The works of Rodríguez-Conde et al. [48] and Casillas Martín et al. [49] analyze the self-perception that university students have of their digital competences. The ADO (Online Digital Literacy) test [50] assesses the level of media proficiency in the general population, focusing specifically on the search, dissemination, and creation of digital content on the internet. Along the same line of self-perception scales, there are studies by Almedina et al. [51], Colás Bravo et al. [42], and De Pablos Pons et al. [26] aimed at students of primary and secondary basic education, based on Likert-type scales. Similarly, the INCOTIC (Inventory of ICT Competences) tool [52], used in different contexts in both Spain and Latin America, aims to identify the prior knowledge of digital competence of students in the last grades of secondary education and at the beginning of university. At the national level in Colombia, there is the work of Contreras-German et al. [53], who designed and validated an instrument for the last years of secondary basic education called the Digital Competence Assessment Scale (EVCD).
In line with the works that focus on objective tests for the evaluation of digital competence, there is the questionnaire called “Digital Campus” of Restrepo-Palacio and Cifuentes [54], whose objective is to evaluate the knowledge of university students at the University of the Sabana in Bogotá (Colombia). For primary and secondary basic education, some researchers, such as Martínez-Piñeiro et al. [55] and García-Valcárcel Muñoz-Repiso et al. [43], conclude that most of the instruments for assessing digital competence in this population are based on individuals’ self-perception and evaluate only some of the dimensions of digital competence, making conclusive results difficult to obtain. Similarly, to measure the level of digital competence of primary and secondary school students, it is necessary to consider not only which digital skills to develop, how, and when, but also the people who influence that development, such as the socio-family environment and teachers.
Despite this, there are some instruments for assessing digital competence in basic education students, such as the instrument created by Baeza-González et al. [45] as part of the project “Mind the gap: a snapshot of e-skills gender differences in Spain”, aimed at students in the last grades of primary education. For secondary education, the “ECODIES” test is available, based on the “DigComp” digital competence assessment model and developed by the GITE group of the University of Salamanca [56]; it evaluates digital competence through questions on knowledge, skills, and attitudes.
Finally, the scarcity of works that study and evaluate digital competence in primary and secondary basic education in the Colombian context is highlighted. Given the absence of instruments to assess digital competence in the Colombian context, the following study aims to analyze the psychometric properties of the “ECODIES” instrument, with the objectives of: (a) validating and adapting the instrument through its application to a pilot sample in the Colombian context; (b) determining the multidimensionality of the instrument through exploratory factor analysis; (c) confirming the multidimensionality of the instrument through confirmatory factor analysis; and (d) analyzing the reliability of the instrument.

2. Materials and Methods

Following Montero and León [57], the present research is framed within instrumental studies, involving the adaptation and validation of the psychometric properties of the “ECODIES” instrument with a sample different from the original. This type of design is appropriate for validating newly created instruments or for validating a previously developed instrument for use in another context [58]. The study follows a quantitative, descriptive approach with a non-experimental, cross-sectional design; data were collected at a single point in time [59].

2.1. Participants

The population under study was made up of students of secondary basic education at a public district school (IED) located in the town of Usme in the city of Bogotá (Colombia), with a population of 1081 students. The sample of 777 students was amply representative, corresponding to a sampling error of 1.9%, well below the 5% commonly accepted in research. Sample selection was performed by stratified random probability sampling [59,60] of students in grades 6–11. Of the participants, 48% were female, 51.5% male, and 0.5% identified with another gender. The students were between the ages of 11 and 19, with a mean age of 13.9 years (Table 2).
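As an illustration of how the reported figures fit together, the following minimal Python sketch (not part of the original analysis; it assumes a 95% confidence level, maximum variance p = 0.5, and a finite-population correction) reproduces the 1.9% sampling error from N = 1081 and n = 777:

import math

def sampling_error(N: int, n: int, z: float = 1.96, p: float = 0.5) -> float:
    """Margin of error for a proportion, with a finite-population correction."""
    variance = p * (1 - p) / n
    fpc = (N - n) / (N - 1)  # finite-population correction
    return z * math.sqrt(variance * fpc)

# Population of 1081 students, sample of 777 (values from the paper)
print(round(sampling_error(N=1081, n=777) * 100, 1))  # -> 1.9 (%)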

2.2. Instrument

The instrument used in this study (“ECODIES”) is modelled on the Common Framework for the Development and Understanding of Digital Competence in Europe (DigComp), established by the European Commission in 2013 [40], which defined five competence areas, twenty-one competences, and three proficiency levels. In 2016, an update known as the European Framework for Digital Competence of Citizens (DigComp 2.0) was published [61], which retained practically the same structure as DigComp 1.0. Finally, Carretero et al. [62] presented the latest update, known as DigComp 2.1, which comprises 21 competences grouped into 5 areas, as shown in Table 3, with 8 proficiency levels (Figure 1).
The “ECODIES” test, created by the researchers of the GITE group [56], is based on the DigComp model [40]. Initially, indicators were developed for each of the areas that make up digital competence: Area 1 (A1: problem solving), Area 2 (A2: information literacy), Area 3 (A3: security), Area 4 (A4: communication and collaboration), and Area 5 (A5: content creation) [43]. Subsequently, external validation of the indicator model was carried out by expert judges from different educational levels. To establish the level of agreement among the judges, the Lawshe model [63] and the Tristán-López revision [64] were used. The instrument was improved after a pilot test carried out during the 2017–2018 academic year. The final version of the test consists of 108 items distributed among knowledge, skills, and attitudes (Table 4). Finally, the instrument obtained a Cronbach’s alpha internal consistency value of 0.89, which is considered high [65].
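For readers unfamiliar with the content validity indices named above, a small sketch of the Lawshe [63] content validity ratio and the Tristán-López [64] normalization; the panel figures below are hypothetical, for illustration only:

def lawshe_cvr(n_essential: int, n_judges: int) -> float:
    """Lawshe's content validity ratio, CVR = (ne - N/2) / (N/2), in [-1, 1]."""
    return (n_essential - n_judges / 2) / (n_judges / 2)

def tristan_cvr(n_essential: int, n_judges: int) -> float:
    """Tristán-López's normalized ratio, CVR' = ne / N, in [0, 1] (cutoff ~0.58)."""
    return n_essential / n_judges

# Hypothetical panel: 7 of 9 expert judges rate an item "essential"
print(round(lawshe_cvr(7, 9), 2))   # -> 0.56
print(round(tristan_cvr(7, 9), 2))  # -> 0.78, retained under the 0.58 cutoff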
For the knowledge and ability items, a variable was created that summed the correct answers to obtain each area’s score: correct answers were coded as “1” and the other three incorrect options as “0”, for a total score of 78 points. The attitude items were coded on a scale of 1 to 5 ((1) strongly disagree, (2) disagree, (3) indifferent, (4) agree, (5) strongly agree), for a total of 30 points to be obtained by students in this component.
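A minimal sketch of this coding scheme in Python (the item lists are assumptions; we also read the 30-point attitude maximum as applying per area, since each area contains six attitude items):

from typing import List

LIKERT = {"strongly disagree": 1, "disagree": 2, "indifferent": 3,
          "agree": 4, "strongly agree": 5}

def score_knowledge_skill(answers: List[str], key: List[str]) -> int:
    # Correct option coded "1", the three distractors "0" (78 items, max 78 points)
    return sum(int(a == k) for a, k in zip(answers, key))

def score_area_attitude(responses: List[str]) -> int:
    # Six Likert items per area, coded 1-5 (max 30 points per area, on our reading)
    return sum(LIKERT[r] for r in responses)

# Hypothetical responses for illustration only
print(score_area_attitude(["agree", "strongly agree", "indifferent",
                           "agree", "agree", "disagree"]))  # -> 22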

2.3. Procedure

For the collection of these data, we had the endorsement of the ethics committee of the University of Granada, with registration number 2982/CEIH/2022, the permission of the rectory of the educational institution, and the informed consent of the families of the participating minors. The test was presented in the form of an online questionnaire conducted on Google Forms for this purpose. We collaborated with the afternoon and morning teachers, who provided spaces for meeting with the students.
The questionnaire was administered for this validation during mathematics classes, making use of the institution’s computer room. The researchers orally explained the purpose of the research, guaranteed anonymity and the use of the data exclusively for research purposes, and invited students to participate voluntarily. Instructions for completing the test were given orally, although they could also be found in writing on the form; possible doubts were clarified, and the administration then proceeded. It was carried out in two sessions to avoid participant fatigue.

2.4. Data Analysis

The internal consistency of the instrument was calculated using Cronbach’s alpha coefficient in IBM SPSS 22 [66]. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were performed using the statistical programs Jamovi 2.2 [67] and JASP 0.16 [68].

3. Results

For the structural validity of the test, the sample was randomly divided into two equal parts. The exploratory factor analysis (EFA) was performed with the first half and the confirmatory factor analysis (CFA) with the second half.
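A sketch of such a random half-split in Python (the 'responses' DataFrame and the seed are assumptions for illustration):

import numpy as np
import pandas as pd

def split_in_half(df: pd.DataFrame, seed: int = 42):
    """Randomly split respondents into two equal halves (one for EFA, one for CFA)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(df))
    half = len(df) // 2
    return df.iloc[idx[:half]], df.iloc[idx[half:]]

# efa_half, cfa_half = split_in_half(responses)  # 'responses': 777 rows x items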

3.1. Exploratory Factor Analysis (EFA) of the ECODIES Test in the Study Population

In the principal component EFA, for each area and for the attitude test, the Kaiser–Meyer–Olkin (KMO) coefficient and Bartlett’s sphericity test were obtained (Table 5).
The KMO coefficient obtained in each of the areas ranged between 0.598 and 0.755, that is, between low and acceptable, since the values exceeded the 0.50 minimum [69], and the attitude test obtained a coefficient of 0.950 (excellent). Bartlett’s sphericity test was highly significant (p < 0.05), indicating sufficient correlation among the variables. Therefore, the application of factor analysis was considered appropriate, and the sample adequate.
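The same adequacy checks can be reproduced outside SPSS/Jamovi; a sketch assuming the Python factor_analyzer package and a hypothetical item-score DataFrame:

import pandas as pd
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                             calculate_kmo)

def sampling_adequacy(items: pd.DataFrame) -> None:
    """Report Bartlett's sphericity test and the overall KMO index."""
    chi_square, p_value = calculate_bartlett_sphericity(items)
    _, kmo_total = calculate_kmo(items)
    print(f"Bartlett chi2 = {chi_square:.1f}, p = {p_value:.4f}")
    print(f"Overall KMO   = {kmo_total:.3f}")  # >= 0.50 is the usual minimum

# sampling_adequacy(efa_half[area1_items])  # hypothetical column subset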
Principal component analysis with Kaiser-normalized varimax rotation was used to group items into common variables and thus reduce their number. The new variables were named after the subareas of the original DigComp document [40]. Two factors were extracted for this analysis (Factor 1: “Attitude”; Factor 2: “Knowledge and Ability”) [46] (Table 6).
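A sketch of a comparable two-factor extraction with varimax rotation, again assuming the factor_analyzer package (variable names are illustrative, not the actual ECODIES labels):

import pandas as pd
from factor_analyzer import FactorAnalyzer

def two_factor_solution(items: pd.DataFrame) -> pd.DataFrame:
    """Extract two varimax-rotated principal-axis factors and return the loadings."""
    fa = FactorAnalyzer(n_factors=2, rotation="varimax", method="principal")
    fa.fit(items)
    print(fa.get_factor_variance())  # variance, proportion, cumulative proportion
    return pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=["Factor 1 (Attitude)",
                                 "Factor 2 (Knowledge/Ability)"])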
The area with the highest loading on the “Knowledge/Ability” factor (Factor 2) was “Communication and Collaboration” and the lowest was “Content Creation”; likewise, in the “Attitudes” factor (Factor 1), the “Content Creation” area shows the lowest factor loading. On the other hand, in a one-component rotation, the “Knowledge/Ability” test explained more than 50% of the overall test variance (58.8%). The percentage of variance explained in the total test (“Knowledge/Ability” together with “Attitude”) was 65.2%, a good indicator of the validity of the test (Table 7).
In the matrix of principal components (Table 8), most of the subareas obtained values greater than 0.40 and saturated Factor 2 (“Knowledge/Ability”), thus confirming the original placement by the expert committee that validated the test [46]. Similarly, the attitude items saturated Factor 1 (“Attitude”), with values higher than 0.50. Given these values, both the subareas and the attitude items are considered determinant for the test. No values below 0.40 were found; however, two relatively low values stand out in Factor 2, both in Area 4 (“Communication and collaboration”): the subarea “Interacting through technologies” (0.524) and the subarea “Sharing information and content” (0.548).

3.2. Confirmatory Factor Analysis (CFA) of the ECODIES Test in the Study Population

With the other half of the sample, the CFA [70] was carried out to verify what the EFA yielded: two factors, corresponding to the “Knowledge/Ability” and “Attitude” items of the different areas of digital competence. As can be seen (Table 9), all the goodness-of-fit values obtained for each area were very positive [71,72].
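A hedged sketch of how such a two-factor CFA per area could be reproduced in Python with the semopy package (item names k1–k4 and a1–a6 are placeholders, not the actual ECODIES item labels; column names follow semopy 2.x):

import semopy

# Placeholder measurement model mirroring the two factors found in the EFA
MODEL_DESC = """
knowledge =~ k1 + k2 + k3 + k4
attitude  =~ a1 + a2 + a3 + a4 + a5 + a6
"""

def fit_cfa(data):
    """Fit the two-factor CFA and report the usual goodness-of-fit indices."""
    model = semopy.Model(MODEL_DESC)
    model.fit(data)
    stats = semopy.calc_stats(model)  # one-row DataFrame of fit indices
    return stats[["chi2", "DoF", "CFI", "TLI", "RMSEA"]]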
Factor loadings for problem solving ranged from 0.39 to 0.99, which is considered acceptable and significant (Figure 2).
Factor loadings for the area of information literacy ranged from 0.43 to 0.83, which is considered acceptable and significant (Figure 3).
Factor loadings for the security area ranged from 0.47 to 0.74, which is considered acceptable and significant (Figure 4).
Factor loadings for the area of communication and collaboration ranged from 0.50 to 0.83, which is considered acceptable and significant (Figure 5).
Factor loadings for the content creation area ranged from 0.47 to 0.81, which is considered acceptable and significant (Figure 6).
The factor loadings obtained in this analysis confirmed the placement of the items in the factors identified in the EFA.

3.3. Reliability of the ECODIES Test in the Study Population

The internal consistency reliability was calculated using McDonald’s omega and Cronbach’s alpha coefficients for each of the two components (“Attitudes” and “Knowledge/Ability”) and for the total test. The results for the “Attitudes” component and the complete test indicated a good level of reliability (>0.80) for both statistics. However, the values for the “Knowledge/Ability” component were weaker, especially for McDonald’s omega (Table 10).
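Both coefficients can be computed from first principles; a minimal sketch (formulas only; standardized one-factor loadings are assumed as the input for omega):

import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

def mcdonald_omega(loadings: np.ndarray) -> float:
    """omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of uniquenesses)."""
    lam = np.asarray(loadings)
    common = lam.sum() ** 2
    return common / (common + (1 - lam ** 2).sum())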

4. Discussion and Conclusions

As has been justified, digital competence is a part of knowledge and learning that is indispensable for secondary school students. The acquisition and development of digital competence will ensure access to the opportunities provided by our knowledge society. This training must go beyond the management of hardware and software, which for many years has been the only teaching students have received in computer classrooms. Making changes to curricula to address this training first requires reliable and valid instruments to collect information [49,73], since the importance of digital training is mentioned in the educational legislation itself [74]. However, before drawing a digital competence and attitudinal profile of students, we must start with a validated instrument to map digital competence and optimize it, which was the objective of this work.
For this reason, we set out to validate “ECODIES”, the instrument for the evaluation of digital competence in secondary school students created and validated in Spain by the GITE group [56], for the Colombian population. The EFA yielded two differentiated factors in which the items were grouped, “Knowledge/Ability” and “Attitudes”, as established in the original study validated for Spain [46] and as can also be differentiated in the validations by area of competence [43,75,76]; these two factors explained 65.2% of the variance. The results of the EFA were subsequently confirmed with the CFA; although this analysis is not available in the original validation, the RMSEA values obtained for each area were less than 0.05. Likewise, the CFI and TLI were higher than 0.9, which demonstrates the suitability of the instrument.
Regarding reliability, an acceptable result (0.841) was obtained for the total test, approaching that obtained in the original test validated for Spain (0.89) [33]. However, when reviewing the reliability index of some studies focused on individual competence areas [67,68,69,70,71,72,73,74,75], the Cronbach’s alpha index for the “Knowledge/Ability” factor is not acceptable (<0.70). This is because Cronbach’s alpha is not recommended for calculating reliability on scales with fewer than five categories [77,78], which can be verified in the present study: the “Knowledge/Ability” component obtained reliability indices lower than 0.70, although close to it. In the “Attitude” component, the reliability index is quite acceptable for both statistics, similar to that obtained in studies focused on competence areas.
The values obtained guaranteed the test’s validity and reliability in such a way that it becomes an option to be applied as a diagnostic test of knowledge about digital competence in the context for which it has been validated. In addition, it is considered a complete instrument, since it covers the 5 areas and 21 competences of the DigComp model.
The validation of the instrument for the context in which it is applied will allow identification of the shortcomings of secondary education students in the Colombian context and thus support interventions at the curricular, pedagogical, didactic, and methodological levels for the acquisition and development of digital competence in educational institutions. As Henríquez Coronel et al. [79] note, assessing digital competence in students is essential today, given that these results will serve as a basis for designing and implementing digital literacy proposals in educational institutions.
It is known that most of the instruments to assess digital competence have focused on student self-perception [43,55]. However, this test also allowed us to evaluate the knowledge and attitude that students possess in the different areas of competence and thus identify the shortcomings that students have in the areas of digital competence. Applying these types of questionnaires also helps teachers improve their digital competence for integration into their teaching practice.
However, despite being a complete instrument, its length stands out as its main shortcoming, since its administration requires an average of ninety minutes; this means that students may show some tiredness and apathy, especially in the lower grades, which could influence their responses. We recommend administering the instrument by areas to avoid this loss of interest when it is applied in full. Another limitation concerns the sample: the data were collected in a single educational institution. Although the sample was adequate for the analysis, it is recommended to apply the test in more educational institutions in the locality.
To conclude, it is important to recognize that digital competence is indispensable for the interaction of young people in different digital environments and that digitalization is here to stay. In Colombia, there is no model to evaluate digital competence in basic education. For this reason, advancing and deepening the subject in classroom practices is essential to promote the acquisition and development of digital competence and not leave the digital learning of students in the hands of the socio-family environment, which itself presents many shortcomings (socioeconomic and academic) that will accompany students in their academic process. This would continue to widen the digital divide between those with diverse resources and knowledge and those without. We hope this work becomes a starting point for applying the test in the Colombian context so that, later, new results from many educational institutions and regions of the country will allow for a deepening of the field of digital competence in students of secondary basic education.

Author Contributions

Conceptualization, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; methodology, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; software, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; validation, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; formal analysis, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; investigation, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; resources, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; data curation, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; writing—original draft preparation, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; writing—review and editing, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; visualization, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; supervision, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; project administration, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M.; funding acquisition, A.B.B.d.l.H., A.R.-F., M.J.C.C. and C.d.P.G.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of University of Granada (protocol code 2982/CEIH/2022, and date of approval of 12/02/2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kawaoka, N.; Ohashi, K.; Fukuhara, S.; Miyachi, T.; Asai, T.; Imaeda, M.; Saitoh, S. Impact of school closures due to COVID-19 on children with neurodevelopmental disorders in Japan. J. Autism Dev. Disord. 2021, 52, 2149–2155. [Google Scholar] [CrossRef] [PubMed]
  2. Liu, G.; Wang, S.; Liao, J.; Ou, P.; Huang, L.; Xie, N.; He, Y.; Lin, J.; He, H.G.; Hu, R. The efficacy of WeChat-Based parenting training on the psychological well-being of mothers with children with autism during the COVID-19 pandemic: Quasi-experimental study. JMIR Ment. Health 2021, 8, e23917. [Google Scholar] [CrossRef] [PubMed]
  3. Rodicio-García, M.L.; Ríos-de-Deus, M.P.; Mosquera-González, M.J.; Penado Abilleira, M. La Brecha Digital en Estudiantes Españoles ante la Crisis de la Covid-19. Rev. Int. Educ. Justicia Soc. 2020, 9, 103–125. [Google Scholar] [CrossRef]
  4. UNESCO. La Educación en Tiempos de la Pandemia de COVID-19. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000374075 (accessed on 25 July 2022).
  5. Fernández-Río, J.; Lopez-Aguado, M.; Pérez-Pueyo, Á.; Hortigüela-Alcalá, D.; Manso-Ayuso, J. La brecha digital destapada por la pandemia del coronavirus: Una investigación sobre profesorado y familias. Rev. Complut. Educ. 2022, 33, 351–360. [Google Scholar] [CrossRef]
  6. Díaz-Arce, D.; Loyola-Illescas, E. Competencias digitales en el contexto COVID 19: Una mirada desde la educación. Rev. Innova Educ. 2021, 3, 120–150. [Google Scholar] [CrossRef]
  7. Da Silva, L. Educación, cultura y aprendizaje en tiempos de COVID-19: El caso brasileño. Univ. Esc. Soc. 2020, 9, 43–54. [Google Scholar] [CrossRef]
  8. Cifuentes-Faura, J. Docencia online y Covid-19: La necesidad de reinventarse. Rev. Estilos Aprendiz. 2020, 13, 115–127. [Google Scholar] [CrossRef]
  9. Prensky, M. Digital Natives, Digital Immigrants Part 1. On Horiz. 2001, 9, 1–6. [Google Scholar] [CrossRef] [Green Version]
  10. Acosta–Silva, D.A. Tras las competencias de los nativos digitales: Avances de una metasíntesis. Rev. Latinoam. Cienc. Soc. Niñez Juv. 2017, 15, 471–489. [Google Scholar] [CrossRef]
  11. Lucas, M. Facilitating Students’ Digital Competence: Did They Do It? In Transforming Learning with Meaningful Technologies—EC-TEL 2019; Scheffel, M., Broisin, J., Pammer-Schindler, V., Ioannou, A., Schneider, J., Eds.; Springer: Delft, The Netherlands, 2019; Volume 11722, pp. 3–14. [Google Scholar] [CrossRef]
  12. Cabero-Almenara, J.; Estrada-Vidal, L.; Gutiérrez-Castillo, J.J. Diseño y validación de un instrumento de evaluación de la competencia digital del estudiante universitario. Rev. Espac. 2017, 38, 16. Available online: https://www.revistaespacios.com/a17v38n10/17381018.html (accessed on 5 July 2022).
  13. Abel, V.R.; Tondeur, J.; Sang, G. Teacher Perceptions about ICT Integration into Classroom Instruction. Educ. Sci. 2022, 12, 609. [Google Scholar] [CrossRef]
  14. Koh, K.T.; Tan, L.Q.W.; Camiré, M.; Alcantara, M.A.; Chua, W.C.A. Teachers’ and students’ perceptions of factors influencing the adoption of information and communications technology in physical education in Singapore schools. Eur. Phys. Educ. Rev. 2021, 28, 100–119. [Google Scholar] [CrossRef]
  15. Nikolopoulou, K.; Gialamas, V.; Lavidas, K. Mobile learning-technology barriers in school education: Teachers’ views. Technol. Pedagog. Educ. 2022, 1–16. [Google Scholar] [CrossRef]
  16. Ricardo, C.; Llinas, H.; Medina, A.; Cacheiro, M.L.; Villegas, A.; Lafaurie, A.; Navarro, V. Teachers’ perceptions of culturally appropriate pedagogical strategies in virtual learning environments: A study in Colombia. Turk. Online J. Distance Educ. 2022, 23, 113–130. [Google Scholar] [CrossRef]
  17. Thannimalai, T.; Ponniah, K.; Nawastheen, F.M.; Jose, F.; Jaiseelan, S. Attitudes and acceptance of information and communication technology (ICT) among urban and rural teachers in teaching and facilitation. Int. J. Adv. Appl. Sci. 2022, 9, 16–23. [Google Scholar] [CrossRef]
  18. Carrión, R.V. Frecuencia de uso de las TIC y evaluación del perfil de competencias digitales en estudiantes de educación. Cienc. Lat. Rev. Cient. Multidiscip. 2021, 5, 4120–4142. [Google Scholar] [CrossRef]
  19. García-Martín, J.; García-Sánchez, J. Pre-service teachers’ perceptions of the competence dimensions of digital literacy and of psychological and educational measures. Comput. Educ. 2017, 107, 54–67. [Google Scholar] [CrossRef]
  20. Ouahbi, I.; Darhmaoui, H.; Kaddari, F. “Visual Block-based Programming for ICT Training of Prospective Teachers in Morocco”. Int. J. Mod. Educ. Comput. Sci. 2022, 14, 56–64. [Google Scholar] [CrossRef]
  21. Colomo-Magaña, E.; Colomo-Magaña, A.; Basgall, L.; Cívico-Ariza, A. Pre-service teachers’ perceptions of the role of ICT in attending to students with functional diversity. Educ. Inf. Technol. 2022, 1–17. [Google Scholar] [CrossRef]
  22. Gialamas, V.; Nikolopoulou, K.; Kutromanos, G. Student teachers’ perceptions about the impact of internet usage on their learning and jobs. Comput. Educ. 2013, 62, 1–7. [Google Scholar] [CrossRef]
  23. Fernández-Batanero, J.M. TIC y la discapacidad. Conocimiento del profesorado de Educación Especial. Hekademos Rev. Educ. Digit. 2018, 24, 19–29. Available online: https://bit.ly/3jAveOk (accessed on 14 December 2022).
  24. Mañanes, J.; García-Martín, J. La competencia digital del Profesorado de Educación Primaria durante la pandemia (COVID-19). Profesorado. Rev. Curríc. Form. De Profr. 2022, 26, 125–140. [Google Scholar] [CrossRef]
  25. Ala-Mutka, K. Mapping Digital Competence: Towards a Conceptual Understanding; Institute for Prospective Technological Studies: Sevilla, Spain, 2011. [Google Scholar] [CrossRef]
  26. De Pablos Pons, J.; Colás Bravo, P.; Conde Jiménez, J.; Reyes de Cózar, S. La competencia digital de los estudiantes de educación no universitaria: Variables predictivas. Bordón. Rev. Pedagog. 2016, 68, 169–185. [Google Scholar] [CrossRef] [Green Version]
  27. European Commission. Key Competencies for Lifelong Learning: European Reference Framework; Office for Official Publications of the European Communities: Luxembourg, 2007. Available online: https://www.britishcouncil.org/sites/default/files/youth-in-action-keycomp-en.pdf (accessed on 10 July 2022).
  28. Napal Fraile, M.; Peñalva-Vélez, A.; Mendióroz Lacambra, A. Development of Digital Competence in Secondary Education Teachers’ Training. Educ. Sci. 2018, 8, 104. [Google Scholar] [CrossRef] [Green Version]
  29. Sánchez Antolín, P.; Muñoz Álvarez, T.; Paredes Labra, J. El trabajo en el aula y la competencia digital en el modelo 1a1 de la Comunidad de Madrid. Pixel-Bit Rev. Medios Educ. 2015, 47, 211–222. Available online: https://recyt.fecyt.es/index.php/pixel/article/view/61681/37692 (accessed on 14 July 2022). [CrossRef] [Green Version]
  30. García Peñalvo, F.; Ramirez Montoya, M. Aprendizaje, Innovación y Competitividad. RED Rev. Educ. Distancia 2017, 52. Available online: https://revistas.um.es/red/article/view/282141/205641 (accessed on 20 July 2022).
  31. Luján, R. Enseñanza de las TIC para el desarrollo de competencias tecnológicas en docentes de educación básica alternativa. Hamut’ay 2016, 3, 19–30. [Google Scholar] [CrossRef]
  32. López-Romero, L.; Aguaded-Gómez, M.d.l.C. Teaching Media Literacy in Colleges of Education and Communication. Comunicar 2015, 22, 187–195. [Google Scholar] [CrossRef] [Green Version]
  33. Sá, M.J.; Serpa, S. COVID-19 and the Promotion of Digital Competences in Education. Univers. J. Educ. Res. 2020, 8, 4520–4528. [Google Scholar] [CrossRef]
  34. Sá, M.; Serpa, S. Transversal Competences: Their Importance and Learning Processes by Higher Education Students. Educ. Sci. 2018, 8, 126. [Google Scholar] [CrossRef] [Green Version]
  35. Area, M.; Gros, B.; Marzal, M.A. Alfabetizaciones y TIC’; Síntesis: Madrid, Spain, 2008. [Google Scholar]
  36. Eshet, Y. Thinking in the Digital Era: A Revised Model for Digital Literacy. In Issues in Informing Science and Information Technology; Informing Science Institute: Santa Rosa, CA, USA, 2012; Volume 9, pp. 267–276. [Google Scholar]
  37. Amaro Agudo, A.; González García, E.; Martínez-Heredia, N. Desafíos para una ciudadanía inclusiva: Competencia digital entre adultos mayores y jóvenes. Comun. Mídia Consumo 2020, 17, 11. Available online: https://link.gale.com/apps/doc/A625500430/AONE?u=anon~b08a1473&sid=bookmark-AONE&xid=3fe6f2a9 (accessed on 20 July 2022). [CrossRef]
  38. Cabero-Almenara, J.; Palacios-Rodríguez, A. Marco Europeo de Competencia Digital Docente «DigCompEdu». Traducción y adaptación del cuestionario «DigCompEdu Check-In. Edmetic 2020, 9, 213–234. [Google Scholar] [CrossRef] [Green Version]
  39. Klassen, A. Deconstructing Paper-Lined Cubicles: Digital Literacy and Information Technology Resources in the Workplace. Int. J. Adv. Corp. Learn. 2019, 12, 5. [Google Scholar] [CrossRef] [Green Version]
  40. Ferrari, A. DigComp: A Framework for Developing and Understanding Digital Competence in Europe; Office of the European Union: Luxembourg, 2013. [Google Scholar] [CrossRef]
  41. Area, M.; Pessoa, T. De lo sólido a lo líquido: Las nuevas alfabetizaciones ante los cambios culturales de la Web 2.0. Comunicar 2012, 19, 13–20. [Google Scholar] [CrossRef] [Green Version]
  42. Colás-Bravo, P.; Conde-Jiménez, J.; Reyes-De Cózar, S. Competencias digitales del alumnado no universitario. Rev. Latinoam. Tecnol. Educ. 2017, 16, 7. [Google Scholar] [CrossRef]
  43. García-Valcárcel, M.A.; Salvador Blanco, L.; Casillas Martín, S.; Basilotta Gómez-Pablos, V. Evaluación de las competencias digitales sobre seguridad de los estudiantes de Educación Básica. RED Rev. Educ. Distancia 2019, 19. [Google Scholar] [CrossRef]
  44. Paredes-Labra, J.; Freitas, A.; Sánchez-Antolín, P. De la iniciación al manejo tolerado de tecnologías. La competencia digital de los estudiantes madrileños antes de la educación secundaria. RED Rev. Educ. Distancia 2019, 19. [Google Scholar] [CrossRef]
  45. Baeza-González, A.; Lázaro-Cantabrana, J.L.; Sanroma, M. Evaluación de la competencia digital del alumnado de ciclo superior de primaria en Cataluña. Rev. Medios Educ. Píxel–Bit 2022, 64, 265–298. [Google Scholar] [CrossRef]
  46. Casillas-Martín, S.; Cabezas-González, M.; García-Valcárcel, A. Análisis psicométrico de una prueba para evaluar la competencia digital de estudiantes de Educación Obligatoria. Rev. Electron. Investig. Eval. Educ. 2020, 26, 1–23. [Google Scholar] [CrossRef]
  47. González-Segura, C.M.; García-García, M.; Menéndez, V.H. Análisis de la evaluación de competencias y su aplicación en un sistema de gestión del aprendizaje. Un caso de estudio. RED Rev. Educ. Distancia 2018, 58. [Google Scholar] [CrossRef]
  48. Rodríguez-Conde, M.J.; Olmos, S.; Martínez, F. Propiedades métricas y estructura dimensional de la adaptación española de una escala de evaluación de competencia informacional autopercibida (IL-HUMASS). Rev. Investig. Educ. 2012, 30, 347–365. [Google Scholar] [CrossRef]
  49. Cabezas González, M.; Casillas Martín, S.; Sanches Ferreira, M.; Teixeira, F. Validación De Un Instrumento Para Medir La Competencia Digital De Estudiantes Universitarios (CODIEU). Rev. Estud. Investig. Psicol. Educ. 2018, 19, 69–81. [Google Scholar] [CrossRef] [Green Version]
  50. Dornaleteche, J.; Buitrago, A.; Moreno, L. Categorización, selección de ítems y aplicación del test de alfabetización digital online como indicador de la competencia mediática. Comunicar 2015, 22, 177–185. [Google Scholar] [CrossRef] [Green Version]
  51. Armor, I.; Serrano, R. An evaluation of Primary-School pupils’ Digital Competence. Espacios 2019, 40, 12. Available online: https://www.revistaespacios.com/a19v40n21/19402112.html (accessed on 2 August 2022).
  52. González-Martínez, J.; Esteve-Mon, F.M.; Larraz Rada, V.; Espuny Vidal, C.; Gisbert Cervera, M. INCOTIC 2.0. Una nueva herramienta para la autoevaluación de la competencia digital del alumnado universitario. Profr. Rev. Curríc. Form. Profr. 2018, 22, 133–152. [Google Scholar] [CrossRef]
  53. Contreras-German, J.; Piedrahita-Ospina, A.; Ramírez-Velásquez, I. Competencias digitales, desarrollo y validación de un instrumento para su valoración en el contexto colombiano. Trilogía Cienc. Technol. Soc. 2019, 11, 205–223. [Google Scholar] [CrossRef]
  54. Restrepo-Palacio, S.; Cifuentes, Y. Diseño y validación de un instrumento de evaluación de la competencia digital en Educación Superior. Ens. Avaliação Políticas Públicas Educ. 2020, 28, 932–961. Available online: https://www.redalyc.org/articulo.oa?id=399565425005 (accessed on 25 July 2022). [CrossRef] [Green Version]
  55. Martínez-Piñeiro, E.; Gewerc, A.; Rodríguez-Groba, A. Nivel de competencia digital del alumnado de educación primaria en Galicia. La influencia sociofamiliar. RED Rev. Educ. Distancia 2019, 19. [Google Scholar] [CrossRef]
  56. GITE–USAL: Grupo de Investigación en Tecnología Educativa. Universidad de Salamanca. Available online: https://gite.usal.es/ (accessed on 10 June 2022).
  57. Montero, I.; León, O.G. A guide for naming research studies in Psychology. Int. J. Clin. Health Psychol. 2007, 7, 847–862. Available online: https://bit.ly/3xSBzIFB (accessed on 25 September 2022).
  58. Ato, M.; López-García, J.; Benavente, A. Un sistema de clasificación de los diseños de investigación en psicología. An. De Psicol. / Ann. Psychol. 2013, 29, 1038–1059. [Google Scholar] [CrossRef] [Green Version]
  59. Bisquerra, R.; Alzina, R.B. Metodología de la Investigación Educativa, 6th ed.; Editorial La Muralla: Madrid, Spain, 2004. [Google Scholar]
  60. Otzen, T.; Manterola, C. Técnicas de muestreo sobre una población a estudio. Int. J. Morphol. 2017, 35, 227–232. [Google Scholar] [CrossRef] [Green Version]
  61. European Commission; Joint Research Centre. DigComp 2.0: The Digital Competence Framework for Citizens. Available online: https://data.europa.eu/doi/10.2791/11517 (accessed on 15 July 2022).
  62. European Commission. DigComp 2.1: The Digital Competence Framework for Citizens with Eight Proficiency Levels and Examples of Use; Publications Office of the European Union: Luxembourg, 2018. [CrossRef]
  63. Lawshe, C. A Quantitative Approach to Content Validity. Pers. Psychol. 1975, 28, 563–575. [Google Scholar] [CrossRef]
  64. Tristán-López, A. Modificación al modelo de Lawshe para el dictamen cuantitativo de la validez de contenido de un instrumento objetivo. Av. Med. 2008, 6, 37–48. Available online: https://www.humanas.unal.edu.co/lab_psicometria/application/files/9716/0463/3548/VOL_6._Articulo4_Indice_de_validez_de_contenido_37-48.pdf (accessed on 16 September 2022).
  65. Corral, Y. Validez y confiabilidad de los instrumentos de investigación para la recolección de datos. Rev. De Cienc. De La Educ. 2009, 19, 228–247. Available online: http://servicio.bc.uc.edu.ve/educacion/revista/n33/art12.pdf (accessed on 20 June 2022).
  66. IBM Corp. IBM SPSS Statistics for Windows (Version 22.0). [Computer Software]. 2020. Available online: https://www.ibm.com/es-es/products/spss-statistics/support (accessed on 14 June 2022).
  67. The Jamovi Project. Jamovi (Version 2.2) [Computer Software]. 2022. Available online: https://www.jamovi.org (accessed on 15 June 2022).
  68. Jasp. JASP (Version. 0.16) [Computer Software]. 2019. Available online: https://jasp-stats.org/download/ (accessed on 20 June 2022).
  69. Kaiser, H.F. An index of factorial simplicity. Psychometrika 1974, 39, 32–36. [Google Scholar] [CrossRef]
  70. Anderson, J.C.; Gerbing, D.W. Structural equation modeling in practice: A review and recommended two-step approach. Psychol. Bull. 1988, 103, 411–423. Available online: https://www3.nd.edu/~kyuan/courses/sem/readpapers/ANDERSON.pdf (accessed on 1 July 2022). [CrossRef]
  71. MacCallum, R.C.; Widaman, K.F.; Zhang, S.; Hong, S. Sample size in factor analysis. Psychol. Methods 1999, 4, 84–99. [Google Scholar] [CrossRef]
  72. West, R.F.; Meserve, R.J.; Stanovich, K.E. Cognitive sophistication does not attenuate the bias blind spot. J. Personal. Soc. Psychol. 2012, 103, 506–519. Available online: http://www.keithstanovich.com/Site/Research_on_Reasoning_files/West_Stanovich_JPSP2012.pdf (accessed on 13 September 2022). [CrossRef] [Green Version]
  73. Flores-Lueg, C.; Roig Vila, R. Percepción de estudiantes de Pedagogía sobre el desarrollo de su competencia digital a lo largo de su proceso formativo. Estud. Pedagógicos 2016, 42, 129–148. [Google Scholar] [CrossRef] [Green Version]
  74. Plan Sectorial de Educación 2020–2024. Available online: https://misioneducadores.educacionbogota.edu.co/sites/default/files/2022-02/PlanSectorial2020-2024.pdf (accessed on 12 December 2022).
  75. Cabezas-González, M.; Casillas-Martín, S.; García-Valcárcel-Muñoz-Repiso, A.; Basilotta-Gómez-Pablos, V. Validación de prueba para evaluar la competencia digital en el área de resolución de problemas en estudiantes de educación obligatoria. Rev. Electron. Educ. 2021, 25, 18–38. [Google Scholar] [CrossRef]
  76. Casillas-Martín, S.; Cabezas-González, M.; García-Valcárcel, A. Influencia del uso de WhatsApp y correo electrónico en la competencia digital en el área de comunicación. Estud. Sobre Educ. 2021, 41, 227–249. [Google Scholar] [CrossRef]
  77. Oliden, P.E.; Zumbo, B.D. Coeficientes de fiabilidad para escalas de respuesta categórica ordenada. Psicothema 2008, 20, 896–901. Available online: https://www.redalyc.org/pdf/727/72720458.pdf (accessed on 20 September 2022).
  78. Zumbo, B.D.; Gadermann, A.M.; Zeisser, C. Ordinal versions of coefficients alpha and theta for Likert rating scales. J. Mod. Appl. Stat. Methods 2007, 6, 20–29. Available online: http://digitalcommons.wayne.edu/jmasm/vol6/iss1/4 (accessed on 20 September 2022). [CrossRef] [Green Version]
  79. Henríquez-Coronel, P.M.; Cervera, M.G.; Fernández, I.F. La evaluación de la competencia digital de los estudiantes: Una revisión al caso latinoamericano. Chasqui Rev. Latinoam. Comun. 2018, 137, 93–112. Available online: https://revistachasqui.org/index.php/chasqui/article/view/3511 (accessed on 15 August 2022).
Figure 1. DigComp 2.1 framework. Own elaboration.
Figure 2. Factor loadings for the problem-solving area.
Figure 3. Factor loadings for the area of information literacy.
Figure 4. Factor loadings for the security area.
Figure 5. Factor loadings for the area of communication and collaboration.
Figure 6. Factor loadings for the content creation area.
Table 1. Frameworks and benchmarks on the conceptualization and assessment of DC.

Frame/Standard | Context | Conceptualization | Reference
ISTE (Standard for Students) | International | 7 areas: (1) Empowered learners; (2) Digital citizenship; (3) Knowledge-building; (4) Innovative design; (5) Computational thinking; (6) Creative communication; (7) Global collaboration | International Society for Technology in Education (ISTE) (2016)
DigComp 2.1: The Digital Competence Framework for Citizens | European | 21 competences defined in 8 levels of development and grouped in 5 areas: (1) Information; (2) Communication and collaboration; (3) Content creation; (4) Safety; (5) Problem-solving | Carretero et al. (2017)
DQ Framework. Digital Standard for Digital Literacy | International | 24 competences defined in 3 levels of development and grouped in 8 areas: (1) Identity; (2) Use; (3) Safety; (4) Security; (5) Emotional intelligence; (6) Literacy; (7) Communication; (8) Rights | DQ Institute (2019)
Digital Kids Asia Pacific (DKAP) Framework | International | 16 competences grouped into 5 domains: (1) Digital literacy; (2) Online security and resilience; (3) Online participation and citizenship; (4) Digital emotional intelligence; (5) Digital creativity and innovation | UNESCO (2019)
Table 2. Distribution of the sample by grade, gender, and age.

Grade | Frequency | Female | Male | Other
6° | 24.7% | 82 | 110 | 0
7° | 13.5% | 45 | 60 | 0
8° | 16.5% | 67 | 59 | 2
9° | 18.0% | 64 | 74 | 2
10° | 17.9% | 73 | 66 | 0
11° | 9.4% | 40 | 33 | 0
Total | 100% | 371 | 402 | 4

Age | Frequency
11 | 179 (23.0%)
12 | 74 (9.5%)
13 | 88 (11.3%)
14 | 98 (12.6%)
15 | 138 (17.8%)
16 | 108 (13.9%)
17 | 61 (7.9%)
18 | 25 (3.2%)
19 | 6 (0.6%)
Total | 777 (100%)
Table 3. Competence areas and competences of the DigComp 2.1 framework. Source: Carretero et al. [62].

Competence Areas | Competences
Problem-solving | 1. Solving technical problems; 2. Identifying needs and technological responses; 3. Innovating and creatively using technology; 4. Identifying digital competence gaps
Information | 5. Browsing, searching, and filtering information; 6. Evaluating information; 7. Storing and retrieving information
Safety | 8. Protecting devices; 9. Protecting personal data; 10. Protecting health; 11. Protecting the environment
Communication and collaboration | 12. Interacting through technologies; 13. Sharing information and content; 14. Engaging in online citizenship; 15. Collaborating through digital channels; 16. Netiquette; 17. Managing digital identity
Content creation | 18. Developing content; 19. Integrating and re-elaborating; 20. Copyright and licenses; 21. Programming
Table 4. Number of items that make up the “ECODIES” test.

Competence Areas | Knowledge/Skill | Attitudes | Total
A1: Problem solving | 16 | 6 | 22
A2: Information | 12 | 6 | 18
A3: Safety | 16 | 6 | 22
A4: Communication and collaboration | 18 | 6 | 24
A5: Content creation | 16 | 6 | 22
Total | 78 | 30 | 108

Note: own elaboration.
Table 5. KMO (Kaiser–Meyer–Olkin) and Bartlett’s test.

Competence Areas | KMO Test | Bartlett’s Test: Chi-Square | gl | Sig.
Area 1 (PS) | 0.598 | 293 | 120 | 0.000
Area 2 (I) | 0.715 | 296 | 66 | 0.000
Area 3 (SA) | 0.609 | 369 | 120 | 0.000
Area 4 (CO) | 0.755 | 647 | 153 | 0.000
Area 5 (CC) | 0.616 | 313 | 120 | 0.000
Total test “ECODIES” | 0.691 | 54793 | 3003 | 0.000
Attitude scale | 0.950 | 5805 | 435 | 0.000

Note: Area 1: problem-solving; Area 2: information; Area 3: safety; Area 4: communication and collaboration; Area 5: content creation; gl: degrees of freedom.
Table 6. Factor loadings.

Area | Factor 1 (Attitudes): Loading | Uniqueness | Factor 2 (Knowledge/Skill): Loading | Uniqueness
Area 1 (PS) | 0.805 | 0.352 | 0.753 | 0.434
Area 2 (I) | 0.837 | 0.300 | 0.762 | 0.419
Area 3 (SA) | 0.842 | 0.290 | 0.757 | 0.427
Area 4 (CO) | 0.882 | 0.223 | 0.837 | 0.299
Area 5 (CC) | 0.868 | 0.246 | 0.708 | 0.499

Note: Area 1: problem-solving; Area 2: information; Area 3: safety; Area 4: communication and collaboration; Area 5: content creation.
Table 7. Total explained variance.

Factor | Total | % of Variance | Cumulative %
Attitude | 3.54 | 35.4 | 35.4
Knowledge/Skills | 2.98 | 29.8 | 65.2
Table 8. Matrix of the principal components.

Area 1 (Problem solving):
Variable | Factor 1 | Factor 2
C1: Solving technical problems | 0.019 | 0.736
C2: Identifying needs and technological responses | −0.022 | 0.553
C3: Innovating and creatively using technology | 0.098 | 0.560
C4: Identifying digital competence gaps | 0.072 | 0.668
Attitude 1 | 0.824 | 0.189
Attitude 2 | 0.734 | 0.128
Attitude 3 | 0.792 | −0.097
Attitude 4 | 0.745 | −0.026
Attitude 5 | 0.779 | 0.091
Attitude 6 | 0.637 | 0.021

Area 2 (Information):
Variable | Factor 1 | Factor 2
C1: Browsing, searching, and filtering information | 0.082 | 0.669
C2: Evaluating information | 0.068 | 0.763
C3: Storing and retrieving information | 0.082 | 0.764
Attitude 1 | 0.636 | 0.239
Attitude 2 | 0.784 | 0.121
Attitude 3 | 0.626 | 0.164
Attitude 4 | 0.843 | 0.133
Attitude 5 | 0.765 | 0.007
Attitude 6 | 0.711 | −0.100

Area 3 (Safety):
Variable | Factor 1 | Factor 2
C1: Protecting devices | 0.114 | 0.680
C2: Protecting personal data | 0.056 | 0.726
C3: Protecting health | 0.100 | 0.682
C4: Protecting the environment | 0.081 | 0.580
Attitude 1 | 0.607 | −0.151
Attitude 2 | 0.733 | 0.190
Attitude 3 | 0.674 | 0.214
Attitude 4 | 0.711 | 0.154
Attitude 5 | 0.675 | 0.056
Attitude 6 | 0.732 | 0.164

Area 4 (Communication and collaboration):
Variable | Factor 1 | Factor 2
C1: Interacting through technologies | 0.017 | 0.524
C2: Sharing information and content | 0.080 | 0.548
C3: Engaging in online citizenship | 0.141 | 0.783
C4: Collaborating through digital channels | 0.040 | 0.742
C5: Netiquette | 0.175 | 0.720
C6: Managing digital identity | 0.126 | 0.614
Attitude 1 | 0.742 | 0.128
Attitude 2 | 0.813 | 0.126
Attitude 3 | 0.806 | 0.088
Attitude 4 | 0.752 | 0.141
Attitude 5 | 0.807 | 0.069
Attitude 6 | 0.745 | 0.096

Area 5 (Content creation):
Variable | Factor 1 | Factor 2
C1: Developing content | 0.168 | 0.691
C2: Integrating and re-elaborating | −0.013 | 0.741
C3: Copyright and licenses | 0.094 | 0.619
C4: Programming | 0.100 | 0.675
Attitude 1 | 0.806 | 0.110
Attitude 2 | 0.774 | 0.124
Attitude 3 | 0.735 | 0.170
Attitude 4 | 0.756 | 0.114
Attitude 5 | 0.729 | −0.027
Attitude 6 | 0.710 | 0.117

Note: Area 1: problem-solving; Area 2: information; Area 3: safety; Area 4: communication and collaboration; Area 5: content creation; C1: competence 1.
Table 9. Goodness-of-fit indices of the model (CFA).

Area | Model | χ² | gl | RMR | TLI | CFI | IFI | RMSEA (90% CI)
Problem-solving | 2 factors | 45.665 | 34 | 0.048 | 0.999 | 0.999 | 0.999 | 0.030 (0.000–0.050)
Information | 2 factors | 35.99 | 26 | 0.046 | 0.994 | 0.996 | 0.996 | 0.032 (0.000–0.050)
Safety | 2 factors | 39.962 | 34 | 0.044 | 0.997 | 0.998 | 0.998 | 0.021 (0.000–0.050)
Communication and collaboration | 2 factors | 75.927 | 53 | 0.046 | 0.998 | 0.998 | 0.998 | 0.033 (0.000–0.050)
Content creation | 2 factors | 45.514 | 34 | 0.044 | 0.997 | 0.998 | 0.998 | 0.030 (0.000–0.050)

Note: χ²: chi-square; gl: degrees of freedom; RMR: root mean residual; TLI: Tucker–Lewis index; CFI: comparative fit index; IFI: incremental fit index; RMSEA: root mean square error of approximation; CI: confidence interval.
Table 10. Reliability analysis.

Statistic | Total | Component 1 (Attitudes) | Component 2 (Knowledge/Ability)
McDonald’s omega | 0.819 | 0.950 | 0.48
Cronbach’s alpha | 0.841 | 0.950 | 0.649
