Article

Developing a Holistic Success Model for Sustainable E-Learning: A Structural Equation Modeling Approach

by Ahmad Samed Al-Adwan 1, Nour Awni Albelbisi 2, Omar Hujran 3,*, Waleed Mugahed Al-Rahmi 4 and Ali Alkhalifah 5
1 Electronic Business and Commerce Department, Business School, Al-Ahliyya Amman University, Amman 19328, Jordan
2 Faculty of Education, University of Malaya, Kuala Lumpur 50603, Malaysia
3 Department of Analytics in the Digital Era, College of Business and Economics, United Arab Emirates University, Al Ain 15551, United Arab Emirates
4 Self-Development Skills Department, College of Common First Year, King Saud University, Riyadh 11451, Saudi Arabia
5 Information Technology Department, College of Computer, Qassim University, Buraydah 52571, Saudi Arabia
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(16), 9453; https://0-doi-org.brum.beds.ac.uk/10.3390/su13169453
Submission received: 14 July 2021 / Revised: 17 August 2021 / Accepted: 19 August 2021 / Published: 23 August 2021

Abstract:
E-learning systems have become prominent tools in higher education worldwide. The evident importance of e-learning in higher education has resulted in a phenomenal increase in the number of e-learning systems delivering various forms of services, especially when traditional (face-to-face) education was suddenly forced to move online due to the COVID-19 outbreak. Accordingly, assessing e-learning systems is pivotal in the interest of effective use and successful implementation. Drawing on the related literature, an extensive model is developed by integrating the information system success model (ISSM) and the technology acceptance model (TAM) to illustrate key factors that influence the success of e-learning systems. Based on the proposed model, theory-based hypotheses are tested through structural equation modeling employing empirical data gathered through a survey questionnaire of 537 students from three private universities in Jordan. The findings demonstrate that quality factors, including instructor, technical system, support service, educational system, and course content quality, have a direct positive influence on students’ satisfaction, perceived usefulness, and system use. Moreover, self-regulated learning negatively affects students’ satisfaction, perceived usefulness, and system use. Students’ satisfaction, perceived usefulness, and system use are key predictors of their academic performance. These findings provide e-learning stakeholders with important implications that guarantee the effective, successful use of e-learning that positively affects students’ learning.

1. Introduction

Most economic aspects have been affected by the development of information technology [1], which has contributed to tremendous change in the educational sector. Students’ increased use of the web and Internet has encouraged educational institutions to replace traditional learning and teaching approaches. Consequently, higher education is now changing, and higher education institutions (HEIs) (i.e., universities) must accordingly fulfil students’ requirements, needs, and expectations [2]. E-learning is essential to enable and execute these activities as it allows students to access learning resources regardless of location and time. According to Cidral et al. [3], e-learning systems offer personalized, flexible learning; reduce the cost of learning; and enable learning on demand. Al-Fraihat [4] (p. 57) states that e-learning has “a significant role in shifting from teacher-centered to student-centered education.”
Although many e-learning initiatives have been successfully implemented, most of them did not attain their intended objectives, progressed slowly, and observed increased dropout rates [4]. Additionally, assessing the success of e-learning represents a concern for e-learning systems’ stakeholders. HEIs have substantially invested in e-learning systems to enhance and support learning [5]. However, integrating innovative e-learning systems to support both teaching and learning is a major challenge for HEIs. An argument has been that the value of investing in e-learning systems significantly relies on the implementation of these systems by instructors and students [6]. Students’ acceptance and use of e-learning systems determine their success, and poor use of such systems prevents realizing their benefits, which leads to unsuccessful e-learning systems and thus poor return on investment [7]. These assumptions hold true especially in light of the rapid, erratic, and sudden outbreak of COVID-19. While the role of technology in supporting people in ordinary conditions is widely recognized, the extent to which technology can respond to their needs in unusual circumstances and extreme conditions, such as COVID-19, remains unclear [8].
Consequently, to help e-learning stakeholders successfully implement e-learning systems, many researchers have explored the key success factors of e-learning. Although these researchers have attempted to identify the requirements of e-learning systems, they have not revealed all the requirements [4]. The factors regarded as most significant for measuring the success of e-learning systems diverge across the literature. Few studies have considered developing a holistic model that evaluates the success of e-learning systems from various angles. As Eom and Ashill [9] suggest, a holistic success model of e-learning should cover multiple levels of success. E-learning systems are a form of information system (IS) that incorporates human factors (i.e., learners and instructors) and nonhuman factors (i.e., learning management systems). Thus, examining various aspects of success related to human and nonhuman factors is vital. Al-Fraihat et al. [10] and Cidral et al. [3] have identified two prominent streams of e-learning adoption-related research. One stream focuses on adoption behaviors (e.g., system use, adoption, intention to use, usability), technological aspects, and the system. Nevertheless, because technology has become widely accessible and reliable, a more recent research stream has explored instructors’ and students’ interactions and attitudes as critical determinants of e-learning success. Accordingly, this study develops a holistic success model of e-learning that considers human and technological perspectives of e-learning.

2. Significance of the Study

The literature on e-learning is focused on e-learning adoption and postadoption. Two streams of research are related to the adoption and postadoption of e-learning. The first stream of research assumes users’ postadoption behavior of e-learning systems as an extension of the initial acceptance behavior of e-learning systems and employs the same variables to explain acceptance and continued use [1,11,12,13,14,15,16]. Studies related to this stream have mainly used the technology acceptance model (TAM) [17] as the underlying theoretical framework and extended it using other supplementary theories and models, such as the unified theory of acceptance and use of technology (UTAUT) [18], UTAUT2 [19], and the theory of planned behavior (TPB) [20], to examine the adoption and continuance behavior of e-learning systems. The second stream of research has used the expectation confirmation model (ECM) of Bhattacherjee [21] as the focal theoretical base [22,23,24]. Researchers of the second stream have been inclined to integrate various other frameworks, such as the IS success model (ISSM) [25], TPB [20], and TAM [17] along with the IS continuance model.
However, the aforementioned streams’ studies have primarily assessed factors that predict e-learning systems’ adoption and postadoption and do not consider how such factors or the actual use of e-learning systems relates to learning outcomes. To fill this gap in the literature, many scholars have extended their investigation beyond adoption behaviors (i.e., use or continued use) and have examined the impact of adoption behaviors on e-learning use outcomes. In this regard, two schools of research have emerged. In the first school, self-developed models and frameworks have been employed as the theoretical base [26,27,28]. Such studies have been conducted with different outcome variables that use various explanatory variables, resulting in models with a weak connection to theory. Accordingly, this weakness makes generalizing the findings of these studies difficult. Consequently, these studies are insufficient to explain how e-learning use and its determinants affect the outcomes of e-learning use. Conversely, scholars of the second school have relied on the ISSM, which offers theoretical support for the association between the behaviors of e-learning adoption and outcomes. However, an argument is that the ISSM has limited theoretical support concerning the relationships between determinants and behaviors of e-learning adoption [29].
Therefore, this study first indicates the limitations of research on technology adoption and IS success and then applies a combination of these research directions to overcome their limitations. A reasonable endeavor is to pursue theoretical support from both research streams to develop a holistic model that considers the determinants, behaviors, and outcomes of e-learning adoption. Specifically, the proposed model is theoretically grounded in both TAM and the ISSM. Thus, the proposed model retains the quality factors of the ISSM as the main determinants of e-learning adoption behavior (i.e., use). Additionally, the model borrows two constructs from TAM, perceived usefulness (PU) and perceived ease of use (PEOU), as further key determinants of e-learning use behavior. The conceptualization of the proposed model is discussed in detail in Section 4.

3. Information System Success Model (ISSM)

DeLone and McLean [30] developed the model of IS success. This model comprises six success factors that measure IS success, namely, organizational impact, individual impact, use, satisfaction (SAT), information quality, and system quality. The model suggests that quality factors (system and information quality) directly influence user satisfaction and IS use. Additionally, satisfaction and system use affect the individual impact and subsequently influence organizational impact. Subsequently, DeLone and McLean [25] updated and modified the original model by adding another quality factor, service quality, to predict IS use and user SAT (see Figure 1). Another modification integrates organizational and individual impacts into one factor called net benefits. The updated model of IS success (D&M model) has captured scholars’ attention in the field of IS. It posits that IS success can be assessed using a set of quality factors (i.e., service quality, system quality, and information quality), which in turn influence user satisfaction and use. The D&M model theoretically indicates that a high-quality IS results in high user satisfaction and thus high levels of IS use and an increased perception of net benefits.
Although the D&M model has been investigated in various areas of IS, the model has limitations, particularly in e-learning-related research. Many studies [31,32] have confirmed that the determining factors of the net benefits variable (the outcome construct of the D&M model) are not abundantly acknowledged. This assertion holds true, especially because the net benefits variable is case-specific and therefore differs based on the requirements of each user and entirely relies on the objectives and type of a particular information system. Consequently, additional research is necessary to identify additional determining factors of net benefits, particularly in the educational technology context. Thus, this study suggests that self-regulated learning (SRL) and PU are additional determinants of the net benefits variable.
Various studies that employed the D&M model in the e-learning context have reported diverse estimates of variance explained (R2) by quality factors. Eom et al. [33] (p. 158) observe that “the DeLone and McLean model has limited explanatory power for explaining the role of e-learning systems on the outcomes of e-learning.” Hence, to enhance the explanatory power of the D&M model, scholars have indicated that additional research is required to explore additional quality factors of e-learning systems [10]. This study responds to these requests for further research by contextualizing and extending the original constructs of the D&M model to fit e-learning settings. The contextualization and the operationalization of the proposed model are discussed in the next section.

4. Theoretical Foundation

This study suggests a contextualized model of e-learning success factors (see Figure 2) that is based on the D&M model and TAM. The proposed model is expected to be a driver of the development, design, and delivery of e-learning systems initiatives. Recognizing the aforementioned applications of the D&M model from the literature, this study adopts and contextualizes the key relationships of the D&M model’s constructs into the setting of this study, that is, e-learning. Specifically, because of the aforementioned limitations, the following modifications have been made.
  • According to Cheng [34], in the e-learning environment, service quality is the assistance provided by instructors and support service technicians. Accordingly, service quality in e-learning can be decomposed into two key dimensions: instructor quality (IQ) and support service quality (SSQ).
  • System quality is a vital indicator of e-learning quality and has two dimensions: educational system quality (ESQ) and technical system quality (TSQ). ESQ is related to the presence of education-related features such as diverse learning styles, evaluation styles, and communication and interactivity tools [35]. TSQ is concerned with technology-related aspects such as usability, availability, and reliability. Davis [17] demonstrates that PEOU reflects the extent to which users perceive a system’s use as effortless. Hence, PEOU is a key indicator of TSQ.
  • Similar to its use in the literature [36,37], information quality is used in the D&M model as a general quality measure of IS success. Thus, to produce a more fitting model that adapts to the requirements of e-learning systems, information quality is reworded as course content quality (CCQ).
  • The proposed model includes SRL as another critical factor in e-learning environments. SRL is a major success factor in e-learning environments [38]. It represents a context-specific dimension that reflects students’ ability to achieve their learning objectives through controlling their motivations and learning behaviors [39].
  • The proposed model retains the original relationships of the D&M model, except the relationship between user SAT and USE, which has been criticized and regarded as confusing and theoretically weak [10,40]. Therefore, in the proposed model, PU mediates the relationship between USE and user SAT.
  • This study facilitates the building of a context-specific model that adheres to the requirements and specifications of e-learning success by replacing the construct of net benefits with that of academic performance (ACP) as the outcome variable.
In summary, the quality factors and SRL described act as external variables that affect PU, user SAT, and USE. Additionally, PU directly influences user SAT, ACP, and USE. ACP is influenced by PU, user SAT, and USE. The following section explains the proposed hypotheses of the research model.

5. Hypothesis Development

5.1. Instructor Quality

Cheng [34] (p. 237) defines IQ as “the degree to which learners perceive that the instructor’s attitude that relates to the instructor’s response timeliness, teaching style, and help toward learners via the e-learning system.” IQ represents instructors’ teaching styles and attitudes that discernibly affect learners’ participation, attitudes, and enthusiasm toward e-learning systems. Pham et al. [41] state that the instructors’ feedback is a crucial factor in online classes; moreover, learners’ perception of quality and timeliness of instructor feedback is pivotal in the success of online courses. Additionally, Rajabalee and Santally [42] observe that instructor support is a crucial element in shaping learners’ SAT. Specifically, learners in e-learning environments may sense frustration and express negative feelings when they receive inadequate instructor support even if they perform well.
Additionally, appropriate instructor involvement in active academic guidance has been reported to directly contribute to developing learners’ SAT, performance, and achievement. Addressing learners’ needs or problems in a quick, efficient manner; providing online feedback on learners’ activities in e-learning systems (i.e., assignments); and sharing significant information on the online course are indispensable aspects to ensure learners’ SAT with e-learning systems [43]. Lee et al. [44] emphasize that providing proper academic guidance increases learners’ interest in high achievements and desire toward self-improvement. Accordingly, when instructors are involved in active academic guidance and provide quality and timely feedback using e-learning systems, learners’ acceptance and motivation to use such systems increases. Instructors in e-learning environments are responsible for setting learning goals and providing learning materials and activities (i.e., quizzes, assignments) to accomplish those goals. Accordingly, instructors’ technical competencies have been suggested to be critical in integrating ICT into the educational process [45,46]. This finding indicates that instructors must possess proper pedagogical and technological knowledge. Such competencies enable instructors to engage and be involved effectively in active academic guidance, which increases learners’ usefulness perception of e-learning systems [43]. As a result, such a high IQ will increase learners’ SAT and their usefulness perception of e-learning systems.
Thus, Hypothesis 1 (H1) is proposed:
Hypothesis 1a (H1a).
IQ positively influences perceived SAT with the e-learning system.
Hypothesis 1b (H1b).
IQ positively influences PU of the e-learning system.
Hypothesis 1c (H1c).
IQ positively influences the use of the e-learning system.

5.2. Course Content Quality

Information quality assesses the quality of the information provided by IS [25]. Information quality of an e-learning system is a fundamental aspect in assessing the success of the system as poor information quality may generate serious problems in attaining learning goals [10]. Information quality in the context of e-learning represents CCQ. Notably, CCQ is the content quality provided by an e-learning system [37,47]. The key attributes of CCQ are accuracy, usefulness, reliability, comprehensibility, availability, relevancy, completeness, and being up-to-date. Updated, comprehensive content fulfils learners’ expectations, making them feel pleased with the e-learning systems [34]. Additionally, e-learning systems are perceived as useful when constantly updated, rich course content is provided, and the course content can be customized to learners’ needs. Mtebe and Raisamo [48] explain that well-conceived courses that address intended learning outcomes enable learners to perform effectively in courses delivered by e-learning systems. Hence, a suggestion is that e-learning systems with high-quality course content would support learners in improving their outcomes and grades. Additionally, these courses with high-quality content would encourage learners to persist in using such systems, thereby increasing their SAT levels. Similarly, Yakubu and Dasuki [1] assert that CCQ is pivotal in motivating learners to use e-learning systems by improving their SAT with such systems.
Thus, Hypothesis 2 (H2) is proposed:
Hypothesis 2a (H2a).
CCQ positively influences perceived SAT with the e-learning system.
Hypothesis 2b (H2b).
CCQ positively influences PU of the e-learning system.
Hypothesis 2c (H2c).
CCQ positively influences the use of the e-learning system.

5.3. Educational System Quality

ESQ is a necessary element for achieving the targeted goals set by institutions [36]. It denotes the degree to which the features of e-learning are desirable in educational settings. ESQ focuses on measuring the quality of e-learning systems in terms of the presence of features such as communication facilities (i.e., chatting), learning styles’ diversity, evaluation material, and collaboration and interactivity methods [7,10]. The literature has demonstrated that educational tools and features of e-learning systems such as active learning; adequate information sharing; efficient, effective communication and collaborative tools (i.e., chatrooms, discussion forums); storage; and document sharing can remarkably maximize e-learning systems use and learners’ SAT [3,36,37]. Moreover, Goh et al. [49] reveal that interaction with peers and instructors through e-learning systems is a key aspect that determines SAT and facilitates accomplishing learning outcomes. Additionally, there is a significant connection between ESQ and PU [50].
Thus, Hypothesis 3 (H3) is proposed:
Hypothesis 3a (H3a).
ESQ positively influences students’ SAT with the e-learning system.
Hypothesis 3b (H3b).
ESQ positively influences PU of the e-learning system.
Hypothesis 3c (H3c).
ESQ positively influences the USE of the e-learning system.

5.4. Support Service Quality

Cheng [34] (p. 237) defines SSQ as the “degree to which a learner perceives that the overall quality of personal support services from the e-learning system.” SSQ reflects the quality of assistance and support delivered to users by IT technicians and the IS unit (i.e., helpdesk, training). Lee [51] and Pham et al. [41] state that SSQ is a key determinant of learners’ SAT and acceptance of e-learning systems. The literature has demonstrated that offering suitable support services by services administrators or a helpdesk for e-learning systems increases learners’ perception of the usefulness, SAT, and acceptance of e-learning systems [43,52,53]. Cheok and Wong [54] assert that technical support is fundamental in using e-learning systems. Insufficient technical support may cause frustration and problems for learners during their interactions with e-learning systems, leading learners to perceive that the benefits of such systems do not outweigh the problems [24]. Technical support is an important factor that facilitates integrating and using technology in learning. Turugare and Rudhumbu [45] suggest that the availability of technical support to assist in adopting e-learning systems is a critical success factor that ensures that instructors and learners will not have to manage technical difficulties beyond their capabilities. Thus, technical support plays a critical role in shaping perceptions of the usefulness of the various functions of e-learning systems and thus enhances learners’ performance and interaction with such systems. The literature anticipates that services offered by IT technicians and IS units in support of an e-learning system considerably affect learners’ perception of the system’s usefulness; this usefulness perception subsequently influences learners’ perceived SAT with the system [10]. Technical support participates effectively in generating favorable perceptions toward e-learning systems as it makes learners believe that the benefits of using e-learning systems are worth the effort.
Thus, Hypothesis 4 (H4) is proposed:
Hypothesis 4a (H4a).
SSQ positively influences perceived SAT with the e-learning system.
Hypothesis 4b (H4b).
SSQ positively influences PU of the e-learning system.
Hypothesis 4c (H4c).
SSQ positively influences the use of the e-learning system.

5.5. Technical System Quality

According to Seta et al. [37] and Lee and Jeon [47], system quality denotes the desirable functionalities and technical characteristics of IS. It relates to the efficiency, accuracy, and technical success of IS because it accounts for the absence or existence of bugs in the systems [35]. TSQ of an e-learning system is evaluated by measuring the quality of technical characteristics such as security; ease of navigation; availability; user friendliness; reliability; ability to integrate with other systems; ease of access; ease of finding information; efficiency of functionalities; and the provision of a structured, interactive design [1,37,55]. Additionally, a suggestion is that TSQ is assessed by the extent to which the functionalities of an e-learning system are easy to learn [4]. The presence of friendly user interfaces and modern graphical interfaces improves learners’ SAT and stimulates their interest in using the system [56]. The high technical quality of an e-learning system positively influences learners’ perceptions that the system can deliver useful, reliable functionalities for learning activities, which increases learners’ SAT, and they thus become keen on the system. System quality is a key determinant of learners’ SAT [10], USE [1], and usefulness [55].
Thus, Hypothesis 5 (H5) is proposed:
Hypothesis 5a (H5a).
TSQ positively influences perceived SAT with the e-learning system.
Hypothesis 5b (H5b).
TSQ positively influences PU of the e-learning system.
Hypothesis 5c (H5c).
TSQ positively influences the USE of the e-learning system.

5.6. Self-Regulated Learning

Zimmerman [57] (p. 541) describes SRL as “metacognitive, motivational, and behavioral processes that are personally initiated to acquire knowledge and skill, such as goal setting, planning, learning strategies, self-reinforcement, self-recording, and self-instruction.” Similarly, Landrum [58] states that self-regulation is attained when students can employ self-managing actions and independently implement learning processes. Al-Adwan et al. [38] indicate that SRL promotes autonomous, self-directed, and independent learning. Self-regulated students are dedicated participants who competently control the activities of their learning. For instance, self-regulated students monitor their learning processes, organize and review the content to be learned, manage their time, set up and follow plans for accomplishing all required learning-related tasks on time, and maintain positive motivational beliefs in their abilities and the value of learning [59]. SRL is a necessary success factor for e-learning; that is, students must possess a high level of autonomy in their learning, especially given the low presence of instructors or peers [60]. Unlike in the traditional (face-to-face) learning mode, the low presence of instructors in e-learning shifts the task of learning control from instructors to individual students. Tasks executed by instructors (i.e., setting learning goals and progress assessment) are becoming the duty of the individual student. Accordingly, students with a low SRL level would experience major difficulties in such considerably autonomous learning settings; thus, they would become dissatisfied, view the e-learning system as not useful, and resist using it.
Thus, Hypothesis 6 (H6) is proposed:
Hypothesis 6a (H6a).
SRL positively influences students’ SAT with the e-learning system.
Hypothesis 6b (H6b).
SRL positively influences students’ PU of the e-learning system.
Hypothesis 6c (H6c).
SRL positively influences the USE of the e-learning system.

5.7. Perceived Usefulness

Davis [17] (p. 320) defines PU as “the degree to which a person believes that using a particular system would enhance his/her job performance.” In TAM-related research, PU has been remarkably salient as a primary determinant of technology acceptance. It reflects the instrumental value of IS, such as e-learning systems. E-learning systems are expected to deliver useful features, such as downloading learning material and interacting with peers and instructors, that improve students’ learning. The literature has indicated that PU positively influences students’ SAT, USE, and ACP [10,61,62,63]. When students perceive that the e-learning system benefits their learning, they are inclined to be satisfied and subsequently inclined to use it. Moreover, when the system adds value by facilitating students in achieving higher performance and attaining learning goals, their ACP is enhanced.
Thus, Hypothesis 7 (H7) is proposed:
Hypothesis 7a (H7a).
PU positively influences e-learning USE.
Hypothesis 7b (H7b).
PU positively influences students’ SAT with the e-learning system.
Hypothesis 7c (H7c).
PU positively influences students’ ACP.

5.8. Satisfaction

SAT is a vital factor in assessing the success of IS [64,65]. Salam and Farooq [31] (p. 12) state that SAT is “the extent, to which its users (i.e., teachers and students etc.) are contented with the system functionalities, for a productive learning experience, and how well is its performance, to meet the expectations of all the stakeholders.” Literature has confirmed the distinct association between e-learning system use and user SAT [10] and has acknowledged that if learners are more satisfied with an e-learning system, their intention to use these systems is substantially increased. Additionally, Cidral et al. [3] indicate that SAT is fundamental in evaluating the long-term adoption of e-learning systems. Moreover, the impact of SAT on ACP has been validated [32,66]. Specifically, a positive user experience with the e-learning system favorably affects learners’ overall ACP.
Thus, Hypothesis 8 (H8) is proposed:
Hypothesis 8 (H8).
Students’ SAT positively influences their ACP.

5.9. System Use and Academic Performance

Salam and Farooq [31] (p. 13) state that use behavior demonstrates the degree to which “a user uses, the entire spectrum, of the available features, in a particular system, to fulfill his/her needs.” More specifically, USE represents users’ assessment of the overall use of a specific IS. In the literature, USE has been measured by, for example, the duration, nature, and frequency of use and users’ perceived effectiveness and usefulness of IS to satisfy their requirements [67,68]. Net benefits represent all anticipated benefits of IS for an organization, a group, or an individual [69,70]. Contributions and outcomes of the e-learning system are weighted by the resultant impact on the organizational and individual levels.
This study focuses on individual benefits, and not organizational benefits. Hence, the construct of net benefits is replaced with ACP to add a more contextualized construct to the proposed model, one that fits the e-learning context and measures individual benefits for students. Maqableh et al. [27] define ACP as “students’ ability to carry out academic tasks, and it measures their achievement across different academic subjects using objective measures such as final course grades and grading point average.” Literature has reported an association between e-learning use and students’ ACP [32] and has demonstrated that e-learning use supports students in their learning, for example, the provision of effective interaction, fast information interaction, and enhanced collaboration [29,35]. The actual use of the e-learning system designates that students acknowledge the use of the system as fulfilling their learning needs and helping them attain the learning outcomes. Accordingly, e-learning USE generates ACP.
Thus, Hypothesis 9 (H9) is proposed:
Hypothesis 9 (H9).
E-learning USE positively influences students’ ACP.

6. Methodology

6.1. Participants and Target System

This study used a sample of three private universities in Jordan that use Moodle, the target e-learning system, as a supplementary educational platform to traditional (face-to-face) teaching. Using both approaches—the online activities on Moodle and traditional teaching—in one course results in what is called a hybrid course, which is designed by instructors. The unit of analysis in this study was students who have used Moodle at least once in their learning. The accidental (convenience) sampling method was employed to select students. According to Gravetter and Forzano [71], this sampling method recruits participants based on their availability and willingness to participate. This study employed the accidental (convenience) sampling method because it is easier and faster to administer than other methods.
Data were collected between December 2019 and March 2020, at the beginning of lockdown actions by the official authorities in Jordan. At that time, the actions of the official authorities in the country were viewed as precautionary measures rather than alarming ones. Nevertheless, businesses around the country were severely affected, and a considerable number of organizations (including HEIs) were forced to move online. Such sudden and radical change occurred instantly. Hence, the survey captures students’ acceptance of e-learning systems under actual conditions, without any specialized preparation or training.
As noted, students were selected using convenience sampling, and data were collected using a web-based questionnaire survey. The survey was distributed at the end of a 3-month academic period (the length of the academic semester). The instructors were asked to share the survey link on the pages of their courses on Moodle. As a result, the survey link was posted on 80 randomly selected courses on Moodle; in total, 1200 students were registered in these courses. To increase the response rate, instructors were asked to send students a reminder 1 week after posting the survey. The survey was available for nearly 3 weeks. Of the 1200 students who received the survey, 577 students responded, resulting in a 48% response rate. After excluding incomplete and invalid responses (N = 40), 537 valid responses were obtained and subsequently used for validating and testing the research model. Table 1 presents the respondents’ profile.

6.2. Instrument Design

As mentioned above, data were collected using a web-based questionnaire survey that comprised two main sections. The first section collected respondents’ demographic data. The second section measured the ten proposed constructs in the research model; it comprised 40 measurement items, as each construct was measured by four items. All items were adopted from the related literature (Appendix A) and measured on a 5-point Likert scale from 1 (strongly agree) to 5 (strongly disagree).
Before final data collection, the validity and appropriateness of the questionnaire survey were evaluated using two procedures. In the first procedure, the measurement items were assessed by an expert academic panel of four individuals with wide expertise in the field of IS. The assessment outcome reveals that the degree of agreement among the four members was 90.5%, and the panel’s suggestions for enhancing reliability and readability were incorporated. In the second procedure, a pilot study was performed on 60 students to evaluate the reliability of the ten constructs. The results indicate that all constructs have adequate internal consistency, as Cronbach’s alpha for each construct exceeded 0.7 [72].
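For reference, a standard formulation of Cronbach’s alpha (stated here as a general formula rather than one reproduced from the article) for a construct with \(k\) items is

\[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right), \]

where \(\sigma^{2}_{Y_i}\) is the variance of item \(i\) and \(\sigma^{2}_{X}\) is the variance of the summed construct score; values of at least 0.7 are conventionally interpreted as adequate internal consistency.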

7. Results

Structural equation modeling (SEM) was used to examine the relationships among the constructs of the proposed research model. SEM is deemed appropriate for data analysis because it enables scholars to analyze and manage complex models with many dependent and independent variables concurrently and comprehensively [73]. Accordingly, partial least squares SEM (PLS-SEM) was used for data analysis. Ketchen [74] states that PLS-SEM is adequate for assessing predictive power and estimating highly complex models. Accordingly, PLS-SEM has been used by many scholars in various fields, such as e-commerce [75,76,77,78,79,80,81], information systems [82], e-government [83,84], and educational technology-related research [85,86,87,88]. Following Anderson and Gerbing [89], data analysis was performed using SmartPLS v.3.3.3 in two key analytical stages. In the first stage, the reliability and validity of the measurement model were examined. Afterward, the structural model was tested to estimate the path coefficients (hypotheses). The significance of the estimated path coefficients and loadings was assessed by a bootstrapping resampling procedure with 5000 resamples. Before proceeding to the measurement model, common method variance (CMV) was evaluated using Harman’s single-factor test [72]. Accordingly, an unrotated exploratory factor analysis (EFA) was run on all measurement items. Ten factors emerged, and none of them accounted for ≥50% of the variance among the measurement items; the construct of academic performance (ACP) had the highest explained variance, accounting for 30.2% of the total variance. Hence, CMV was not a concern.
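As an illustrative sketch (not the authors’ code), the Harman single-factor check described above can be approximated in Python with an unrotated principal-component extraction; the file name and item layout below are hypothetical assumptions rather than details from the study:

import pandas as pd
from sklearn.decomposition import PCA

# Hypothetical input: 537 responses x 40 Likert-scale items (column names assumed).
items = pd.read_csv("survey_items.csv")
X = (items - items.mean()) / items.std(ddof=0)  # standardize the items

pca = PCA()  # unrotated extraction over all measurement items
pca.fit(X)

first_factor_share = pca.explained_variance_ratio_[0]
print(f"Variance explained by the first factor: {first_factor_share:.1%}")
# If no single factor accounts for 50% or more of the total variance,
# common method variance is unlikely to be a serious concern.

Note that principal components are used here as a convenient stand-in for an unrotated factor extraction; a dedicated EFA routine would serve the same purpose.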

7.1. Measurement Model

In this stage, the validity and reliability of the research model’s constructs and their corresponding measurement items were assessed. Following Hair et al. [72], tests were performed for internal consistency reliability, discriminant validity, and convergent validity. Confirming convergent validity requires the items to load on their intended theoretical constructs at ≥0.708 and the average variance extracted (AVE) to be ≥0.5. Regarding internal reliability, composite reliability and Cronbach’s alpha estimates are required to be ≥0.7. Table 2 shows that all items acquired loadings greater than the recommended value of 0.708, and each construct had an AVE ≥0.5 and Cronbach’s alpha and composite reliability values higher than 0.7. These results confirm the presence of convergent validity and internal reliability in the dataset.
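For reference, under common PLS-SEM conventions (general formulas, not reproduced from the article), the AVE and composite reliability (CR) of a construct with \(k\) standardized indicator loadings \(\lambda_i\) are

\[ \mathrm{AVE} = \frac{1}{k}\sum_{i=1}^{k}\lambda_i^{2}, \qquad \mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}{\left(\sum_{i=1}^{k}\lambda_i\right)^{2} + \sum_{i=1}^{k}\left(1-\lambda_i^{2}\right)}, \]

so the loading threshold of 0.708 corresponds to each indicator sharing at least half of its variance with its construct (0.708^2 ≈ 0.5).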
Discriminant validity was evaluated by employing two criteria. First, the Fornell and Larcker [90] criterion was applied, which requires the square root of each construct’s AVE to be greater than its correlation with any other construct in the model. Table 3 demonstrates that this condition was satisfied, confirming the presence of discriminant validity.
Second, the heterotrait–monotrait ratio (HTMT) approach was employed [91]. In Table 4, all values are ≤0.85, corroborating the Fornell–Larcker results and confirming that discriminant validity is present. In summary, all measures used in this study demonstrated satisfactory validity and reliability.
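In formula form (a standard statement of both criteria rather than a quotation from the article), discriminant validity between constructs \(i\) and \(j\) requires

\[ \sqrt{\mathrm{AVE}_i} > \left| r_{ij} \right| \quad \text{(Fornell–Larcker)}, \qquad \mathrm{HTMT}_{ij} = \frac{\bar{r}_{\mathrm{hetero}}}{\sqrt{\bar{r}_{\mathrm{mono},i}\,\bar{r}_{\mathrm{mono},j}}} \le 0.85, \]

where \(r_{ij}\) is the correlation between the two constructs, \(\bar{r}_{\mathrm{hetero}}\) is the mean correlation between indicators belonging to different constructs, and \(\bar{r}_{\mathrm{mono},i}\) is the mean correlation among the indicators of construct \(i\).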

7.2. Goodness of Fit (GOF)

The measurement model was assessed for satisfactory goodness of fit (GOF); the overall model fit was evaluated using five main indices (Table 5). According to Henseler et al. [92] and Benitez et al. [93], the actual values of all GOF indices fall within the recommended thresholds, indicating that the proposed model fits the dataset.

7.3. Structural Model

Structural model evaluation was used to examine the paths between the research model’s constructs with regard to the proposed hypotheses, R2 (the coefficient of determination, or predictive power), and predictive relevance (Q2) [72]. However, before evaluating the proposed structural relationships, it is vital to examine collinearity to confirm that the regression outcomes are unbiased. Consequently, as Hair et al. [72] advise, collinearity was evaluated using the variance inflation factor (VIF) values of the inner model (among constructs). To confirm that collinearity is absent, VIF estimates should be close to or below 3. Table 6 indicates that all constructs acquired a VIF less than 3; thus, collinearity issues were absent.
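For reference, the standard VIF formula (stated generally, not taken from the article) for a predictor construct \(j\) is

\[ \mathrm{VIF}_j = \frac{1}{1 - R_j^{2}}, \]

where \(R_j^{2}\) is the proportion of variance in construct \(j\) explained by the other predictors of the same endogenous construct; values near or below 3 indicate that collinearity is not a concern.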
R2 and Q2 were used to assess the predictive accuracy of the proposed model. As Hair et al. [94] suggest, all paths were tested using a bootstrapping procedure with 5000 resamples. Additionally, the blindfolding procedure was performed to calculate Q2 estimates. Table 7 demonstrates that all the dependent variables possess a Q2 estimate greater than 0, and the R2 values are all higher than 0.25, indicating that the proposed model possesses adequate predictive accuracy.
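In its standard blindfolding form (a general formula rather than one quoted from the article), predictive relevance is computed as

\[ Q^{2} = 1 - \frac{\sum_{D}\mathrm{SSE}_{D}}{\sum_{D}\mathrm{SSO}_{D}}, \]

where, for each omission distance \(D\), \(\mathrm{SSE}_{D}\) is the sum of squared errors of the blindfolded predictions and \(\mathrm{SSO}_{D}\) is the sum of squares of the omitted observations; \(Q^{2} > 0\) indicates that the model has predictive relevance for the endogenous construct.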
In Table 7, SAT, USE, and PU explain 57.6% of the variance in ACP (R2 = 0.576). Seven constructs contributed to explaining 64.1% and 61% of the variance in SAT (R2 = 0.641) and USE (R2 = 0.61), respectively, and five constructs contributed to explaining 64% of the variance in PU (R2 = 0.64). According to Chin [95], such explanatory power is classified as moderate to substantial. Furthermore, the PLSpredict procedure was performed to evaluate predictive power [72]. In terms of RMSE, each indicator of the endogenous factors had a lower prediction error for the research model than for the linear model (LM) benchmark, except for a few items (PU1, SAT3, USE4) (see Table 8). Together with the positive Q2 values for all indicators, these results demonstrate that the research model possesses medium to high predictive power.
Regarding the path analysis in Figure 3 and Table 9, the results indicate that H1–H5 and H7–H9 are supported, whereas H6 is not supported. Specifically, SRL has a negative influence on PU, SAT, and USE. PU is the strongest predictor of ACP (β = 0.299, p = 0.000); CCQ is the main predictor of PU (β = 0.289, p = 0.000); ESQ has the strongest effect on SAT (β = 0.169, p = 0.001); and PU is the strongest predictor of USE (β = 0.230, p = 0.000).

8. Discussion

The findings of this study confirm the positive effect of IQ on SAT and USE (similar to [62]) and on PU (similar to [10]). This indicates that providing timely, effective feedback from instructors using the e-learning system increases students’ SAT and encourages them to use the e-learning system. Additionally, students are expected to develop high perceptions of e-learning usefulness when instructors respond to them in a timely manner and possess adequate technical skills and knowledge of course contents [45,61]. If instructors can manage all aspects of students’ online learning and promptly respond to requests and questions using the e-learning system, students consider the system useful. Because students recognize instructors as a fundamental element of the e-learning environment, Kim et al. [96] indicate that instructors are a key success factor in e-learning environments: they increase students’ SAT and motivate them to engage in learning opportunities. Instructors are considered content experts responsible for facilitating course delivery and managing students’ learning [62]. Furthermore, instructors in e-learning plan the curriculum and adopt effective pedagogical strategies to take advantage of available technologies to ensure a successful implementation of e-learning [97]. When instructors provide effective e-learning activities and respond to students’ inquiries and problems in a timely fashion, students are more likely to use the e-learning system.
Consistent with the literature, CCQ is a significant enabler of students’ SAT and USE [40,66] and an important facilitator of PU [10]. These findings suggest that when course content delivered by the e-learning system is considered well designed, regularly updated, and sufficiently comprehensive, students’ PU and SAT increase. High-quality course content makes students feel contented with the e-learning system and thus encourages them to use the system. Additionally, when students believe that the system provides rich, frequently updated course content that can be customized to suit their individual needs, they consider the e-learning system to be useful for their learning. The availability of CCQ facets such as providing sufficient, clear, and accurate information; updated content; and content with an attractive, intuitive design is essential for a pleasing experience with the e-learning system, and thus contributes to students’ overall SAT. Similarly, logical, understandable organization of course components and content enables students to complete their learning tasks and responsibilities efficiently and effectively, and thus they perceive the system as useful. Offering well-structured course designs helps students achieve the learning outcomes more effectively, enhancing SAT. Accordingly, the adequacy of courses’ content encourages students’ e-learning USE [37].
Similarly, the effect of ESQ on USE, SAT, and PU is significant. Although the significant influence of ESQ on USE and SAT has been confirmed in the literature [35,36], the significant influence of ESQ on PU and USE does not support the findings of Al-Fraihat et al. [10]. Our findings demonstrate that when the e-learning system provides high-quality functions to accomplish learning goals and tasks and facilitates the learning process, students perceive the usefulness of the system in delivering beneficial functions for effective learning. Moreover, students will perceive the system as useful if compatible functionalities are provided and students can continually access learning materials and contact their peers and instructors. These factors result in students’ SAT with the system and increase their willingness to use the system.
The findings demonstrate that SSQ is a key determinant of PU, USE, and SAT. The significant influence of SSQ on SAT and USE is supported by [37], but the significant influence of SSQ on PU and USE contradicts the findings of Al-Fraihat et al. [10]. Our findings suggest that if students receive proper technical support services from a help desk or technical personnel, their SAT with the e-learning system increases. An appropriate quality of technical services provided by technical personnel to students significantly helps them develop a high usefulness perception toward the e-learning system and increases their use of it. In e-learning environments, students often encounter learning difficulties, for example, in conceptual understanding, technical issues, ease of access, and digital literacy [42]. If such difficulties are not resolved in a timely manner, the effectiveness of and motivation for learning might decrease, with adverse effects on students’ overall SAT, PU, and USE of the e-learning system. Markova et al. [98] highlight that student support may include technical or instructional support such that instructors assist students in overcoming issues encountered during online courses. This type of support, particularly when students encounter technical problems, is essential to overcome challenges that affect students’ SAT [97]. Accordingly, the availability of helpful technical support leads to positive perceptions among students of SAT and e-learning system use. Additionally, technical support plays a fundamental role in revealing the novelties and key functionalities of e-learning systems to students, increasing students’ PU of the system.
The findings demonstrate that TSQ has a positive impact on PU, USE, and SAT. Notably, the literature [3,40] has confirmed the positive influence of TSQ on SAT and USE [10], whereas the relationship between TSQ and PU has been reported as nonsignificant. Our finding demonstrates that an e-learning system with high technical quality (i.e., easy to use, error free, user friendly) requires students to allocate minimal effort to learn and manage the system’s functionalities; as a result, they can focus on the educational activities and content. Moreover, an appropriate technical design is a key determinant of how effectively students can collaborate, interact, and share learning material.
SRL is a key inhibitor of PU, SAT, and USE. This finding contradicts those in the literature [58,99]. It suggests that students lack SRL skills, which negatively influences their SAT, usefulness perceptions, and e-learning USE. The e-learning environment is open in nature, and direct interaction with peers and instructors is low; thus, the responsibility of controlling learning processes shifts to the student [60]. As the e-learning system enables students to read and download learning materials at their convenience, students are obligated to control and manage their learning activities and processes. Hence, students with highly autonomous learning skills are more likely to be satisfied with the e-learning system than those with limited autonomous learning skills. Additionally, a high SRL level allows students to recognize the value of the e-learning system’s usefulness, such as the availability of learning material and the ability to interact with peers and instructors at their convenience.
Consistent with the literature, the impact of PU on SAT [100], USE [101], and ACP [10] is significant. Such findings indicate that students are inclined to use the e-learning system if the system provides useful features that facilitate the completion of learning-related tasks efficiently and successfully. Moreover, students’ SAT is increased by providing useful features that add value to the learning activities accomplished through the system. Thus, an increased perception of the e-learning system’s usefulness will increase their ACP, and they will learn more effectively.

9. Research Implications

9.1. Theoretical Implications

E-learning adoption studies (based on TAM-related models) such as those on predicting e-learning adoption and postadoption behaviors (i.e., use, continued use) have ignored how such behaviors influence learning outcomes and ACP. Likewise, the D&M model has been criticized because of the limited theoretical support for relationships between determinants and behaviors of e-learning adoption. Accordingly, to overcome these limitations, this study suggests a novel e-learning success model that integrates the D&M model and TAM. Such integration considers the strengths of each model and thus overcomes the weaknesses in both models. The proposed model is comprehensive because it includes various perspectives related to quality, social factors, usefulness, acceptance, and benefits of using e-learning systems for the ACP of students. This study advances the literature and empirically examines the research model developed by combining a set of context-specific factors that are drivers of the success of e-learning systems. Particularly, five quality factors (CCQ, IQ, TSQ, ESQ, and SSQ) are suggested and empirically tested. These quality factors act as antecedents of PU, SAT, USE, and ACP. All these factors have demonstrated importance as valid measures that contribute to the recognition of e-learning success factors. Additionally, unlike much of the literature, this study has introduced relationships not widely investigated in empirical studies. Particularly, new relationships arise from the inclusion of SRL as a new context-specific factor. According to a review of the literature, this study is among the first to comprehensively investigate and empirically test the relationships between SRL and the aforementioned antecedents with PU, SAT, USE, and ACP in a single model. Finally, the proposed model demonstrates considerable predictive power for PU, SAT, ACP, and USE. The proposed model moderately to substantially explains 57.6%, 64.1%, 61%, and 64% of the variance in ACP, SAT, e-learning use, and PU, respectively. Such predictive power is considerably higher than that reported in previous research (e.g., [1,37,66]).

9.2. Practical Implications

The findings demonstrate that IQ and support services affect the USE, PU, and SAT with the e-learning system. Consequently, instructors should receive adequate, constructive training before using the e-learning system [2]. This action will help instructors acquire a broad understanding of the functionalities and features of the system and thus increase their confidence and efficiency when using the system. Additionally, instructors should master how to design online educational materials and how to teach online in ways that suit the nature and requirements of online learning environments. Providing technical and administrative support to students to overcome learning difficulties or technical issues that may occur during their interaction with systems is also essential. Thus, a suggestion is that student support should be recognized as a key component of quality assurance that should be embedded in technology-enabled learning policies [42]. This may improve students’ learning experience and overall perceptions and thus prevent poor interaction and frustration. Moreover, technical support might be delivered to students through training courses and manuals on how to use and manage the system in their learning. The quality of course content shows a positive influence on PU, SAT, and USE. Accordingly, designing the content of educational materials to suit e-learning settings is essential. As Pham et al. [41] indicate, educational materials should be relevant and continually updated to fulfil learning requirements and, more essentially, help students enjoy learning. Furthermore, it has been noted that incorporating virtual reality in e-learning platforms can significantly increase the attractiveness of learning and students’ satisfaction, creativity, and motivation [102].
Educational and technical features of the e-learning systems are imperative aspects that impact PU, SAT, and USE. Hence, system features such as customization, integration between the system’s components, reliability, and usability should be enhanced to produce systems that are more appealing, reliable, intuitive, user friendly, customized, and convenient to navigate [10]. In addition to accessing the e-learning systems from ordinary websites, providing mobile access with appropriate designs would enhance students’ SAT and sense of flexibility. Furthermore, the e-learning system must support students in developing collaborative, interactive learning environments through developing social networks with instructors and peers. Such collaboration and interactions may lead to higher learning effectiveness and, subsequently, higher grades. All these features contribute positively to enhancing students’ PU, SAT, and USE.
Given the significant negative impact of SRL on students’ PU, SAT, and the use of e-learning systems, universities should build modern pedagogical curricula that strengthen students’ self-regulation efficacy. This goal can be achieved by shifting from instructor-centered to student-centered learning. Such an approach to learning requires students to possess self-regulatory competencies to be responsible for their learning. Student-centered learning in online learning environments should be supported with the use of pedagogical tools, such as multimedia tools and asynchronous and synchronous communication channels, to facilitate educational interactions with instructors and peers. Simultaneously, instructors are pivotal in enhancing students’ SRL, so they must deliver the curricula and seek innovative methods to regularize and embed SRL skills in learning activities [103,104]. Training courses might be useful in developing SRL strategies (i.e., time management, planning, goal setting), especially at early stages in study programs [105]. Additionally, the developers of e-learning systems play an essential role in raising SRL competencies. They should focus on finding methods and solutions that support students in time and task management, personalized planning, and goal setting.
The findings reveal that students’ PU of the e-learning system is a major determinant of SAT, USE, and ACP. What is critical is that e-learning systems deliver various valuable, beneficial tools and features that support students’ learning. For instance, the ability to receive course-related announcements; download learning content; take online examinations; and access instant group discussion tools and channels that facilitate peer–instructor interaction, support, and feedback in real-time are strongly recognized as useful features [106]. Additionally, students’ awareness of the benefits and usefulness of the e-learning system should be increased to enhance the popularity and use rates of the e-learning systems. Such awareness can be raised by conducting regular workshops and seminars to convey and demonstrate how the system benefits students.
Finally, SAT and USE are the main antecedents of ACP. Thus, students should be encouraged to embrace technology-enabled learning, and this support can be promoted by educating them on how e-learning use can enhance their ACP. Moreover, as Abdous [107] argues, if the main driver for adopting e-learning is enhancing students' on-campus learning experience, institutional policies must be directed toward technology-enabled pedagogies and digital learning. Additionally, in line with Rajabalee and Santally [42], institutional leaders and e-learning policy makers should rely on learning analytics to improve their understanding of students' learning experiences and patterns.

10. Conclusions and Future Research

This study examines the key drivers of e-learning system success by developing a holistic research model that integrates TAM and the ISSM. The proposed model comprised a total of 23 hypotheses. To examine the research model, empirical data were collected from students at three private universities in Jordan. The results indicated that 20 of the 23 hypotheses were supported. In particular, the proposed quality factors, namely instructor quality, course content quality, technical system quality, support service quality, and educational system quality, had a direct positive impact on the perceived usefulness, satisfaction, and use of e-learning. In addition, usage behavior, perceived usefulness, and satisfaction positively influenced students' academic performance. Surprisingly, self-regulated learning emerged as a key inhibitor of e-learning system success, negatively affecting the system's perceived usefulness, satisfaction, and use. Accordingly, e-learning stakeholders should introduce effective strategies to address students' lack of self-regulated learning.

This study is subject to several limitations. Its cross-sectional design captures users' perceptions and behaviors at a single point in time; longitudinal surveys are therefore encouraged, given that individuals' preferences and perceptions are likely to change as they gain experience. Furthermore, the sample was limited to three private universities, which raises questions about the generalizability of the findings, particularly to public universities, which enroll more students in Jordan than private ones. Further research should therefore cover larger, more representative samples with different demographic and psychological attributes and should include public universities.
It has also been suggested that different cultures require different strategies for managing the factors that affect e-learning adoption [35]. For example, whereas ease of use is perceived as essential in Eastern cultures, especially in developing countries, PU is more salient in Western cultures. Moreover, Eastern cultures are more socially oriented than Western cultures, which signals the prominence of social factors in such settings; subjective norms, absorptive capacity, and individual mobility therefore warrant closer attention in Eastern cultures than in Western ones. Consequently, further research should investigate cultural effects. Finally, the proposed research model was tested using self-reported measures; future research should use objective measures to test the proposed model.

Author Contributions

Conceptualization, A.S.A.-A. and N.A.A.; methodology, A.S.A.-A.; software, A.S.A.-A.; validation, A.S.A.-A., O.H. and W.M.A.-R.; formal analysis, A.S.A.-A.; investigation, A.S.A.-A.; resources, A.A. and O.H.; data curation, A.S.A.-A.; writing—original draft preparation, A.S.A.-A.; writing—review and editing, O.H., W.M.A.-R., N.A.A. and A.A.; visualization, A.S.A.-A.; supervision, A.S.A.-A.; project administration, A.S.A.-A.; funding acquisition, A.A. and A.S.A.-A. All authors have read and agreed to the published version of the manuscript.

Funding

The study was supported by a grant from the United Arab Emirates University (UAEU), Fund number: 31B121.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Exclude.

Acknowledgments

We would like to gratefully acknowledge the assistance and support of the faculty members of the departments of Electronic Business and Commerce, and Management Information Systems at Al-Ahliyya Amman University for providing insightful feedback.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Questionnaire Form

Construct | Item | Source
Instructor Quality (IQ) | IQ1: "I use Moodle as recommended by my instructors." | [10]
| IQ2: "I think an instructor's enthusiasm about using Moodle stimulates my desire to learn." |
| IQ3: "I receive a prompt response to questions and concerns from my instructors in Moodle." |
| IQ4: "I think communicating and interacting with instructors are important and valuable in Moodle." |
Course Content Quality (CCQ) | CCQ1: "The content and information available in Moodle is timely." | [36]
| CCQ2: "The content and information available in Moodle is useful and easy to understand." |
| CCQ3: "The content and information available in Moodle can be relied upon." |
| CCQ4: "The content and information available in Moodle is accurate." |
Educational System Quality (ESQ) | ESQ1: "I believe that communication facilities have been effective learning components in my study." | [37]
| ESQ2: "Moodle provides evaluation components and assessment materials (e.g., quizzes, assignments)." |
| ESQ3: "Moodle provides me with different learning styles (e.g., flash animation, video, audio, text, simulation, etc.) and they are interesting and appropriate in my study." |
| ESQ4: "Moodle provides interactivity and communication facilities such as chat, forums, and announcements." |
Support Service Quality (SSQ) | SSQ1: "The IT services staff understands the specific needs of students." | [3]
| SSQ2: "I receive a satisfactory and timely response from the IT services staff." |
| SSQ3: "The IT services staff is available and cooperative when facing an error at Moodle." |
| SSQ4: "Moodle provides proper online assistance and help." |
Technical System Quality (TSQ) | TSQ1: "It is easy to understand the structure of Moodle and how to use it." | [37]
| TSQ2: "Moodle includes the necessary features and functions I need." |
| TSQ3: "Moodle navigation is easy to use." |
| TSQ4: "Overall, Moodle is easy to use." |
Self-regulated learning (SRL) | SRL1: "When it comes to learning and studying, I am a self-directed person." | [108]
| SRL2: "In my studies, I am self-disciplined and find it easy to set aside reading and homework time." |
| SRL3: "In my studies, I set goals and have a high degree of initiative." |
| SRL4: "I am able to manage my study time effectively and easily complete assignments on time." |
Perceived Usefulness (PU) | PU1: "Using Moodle enables me to accomplish my tasks more quickly." | [35]
| PU2: "Using Moodle improves my learning performance." |
| PU3: "Using Moodle helps me learn effectively." |
| PU4: "Overall Moodle is useful." |
Satisfaction (SAT) | SAT1: "I am satisfied with the performance of Moodle." | [1]
| SAT2: "I enjoy using Moodle in my study." |
| SAT3: "Moodle satisfies my educational needs." |
| SAT4: "Overall, I am pleased with the experience of using Moodle." |
System Use (USE) | USE1: "I use Moodle frequently." | [40]
| USE2: "I depend on Moodle in my study." |
| USE3: "I use Moodle regularly." |
| USE4: "On average, I spend a long time on using Moodle." |
Academic Performance (ACP) | ACP1: "Moodle has helped me to achieve the learning goals of the module." | [29]
| ACP2: "I had good grades in such courses where Moodle is used heavily." |
| ACP3: "Moodle makes communication easier with the instructor and other classmates." |
| ACP4: "Moodle is a very effective educational tool and has helped me to improve my learning process." |

References

  1. Yakubu, N.; Dasuki, S. Assessing eLearning systems success in Nigeria: An application of the DeLone and McLean information systems success model. J. Inf. Technol. Educ. Res. 2018, 17, 183–203. [Google Scholar] [CrossRef] [Green Version]
  2. Coman, C.; Țîru, L.G.; Meseșan-Schmitz, L.; Stanciu, C.; Bularca, M.C. Online teaching and learning in higher education during the coronavirus pandemic: Students’ perspective. Sustainability 2020, 12, 10367. [Google Scholar] [CrossRef]
  3. Cidral, W.A.; Oliveira, T.; Di Felice, M.; Aparicio, M. E-learning success determinants: Brazilian empirical study. Comput. Educ. 2018, 122, 273–290. [Google Scholar] [CrossRef] [Green Version]
  4. Al-Fraihat, D.; Joy, M.; Sinclair, J. A comprehensive model for evaluating e-learning systems success. Distance Learn. Educ. Train. Lead. 2018, 15, 57–74. [Google Scholar]
  5. Popovici, A.; Mironov, C. Students’ perception on using eLearning technologies. Procedia-Soc. Behav. Sci. 2015, 180, 1514–1519. [Google Scholar] [CrossRef] [Green Version]
  6. Sharma, S.K.; Gaur, A.; Saddikuti, V.; Rastogi, A. Structural equation model (SEM)-neural network (NN) model for predicting quality determinants of e-learning management systems. Behav. Inf. Technol. 2017, 36, 1053–1066. [Google Scholar] [CrossRef]
  7. Almaiah, M.A.; Al-Khasawneh, A.; Althunibat, A. Exploring the critical challenges and factors influencing the e-learning system usage during COVID-19 pandemic. Educ. Inf. Technol. 2020, 25, 5261–5280. [Google Scholar] [CrossRef] [PubMed]
  8. Sitar-Taut, D.A.; Mican, D. Mobile learning acceptance and use in higher education during social distancing circumstances: An expansion and customization of UTAUT2. Online Inf. Rev. 2021. [Google Scholar] [CrossRef]
  9. Eom, S.B.; Ashill, N.J. A system’s view of e-learning success model. Decis. Sci. J. Innov. Educ. 2018, 16, 42–76. [Google Scholar] [CrossRef]
  10. Al-Fraihat, D.; Joy, M.; Sinclair, J. Evaluating E-learning systems success: An empirical study. Comput. Hum. Behav. 2020, 102, 67–86. [Google Scholar] [CrossRef]
  11. Kim, E.J.; Kim, J.J.; Han, S.H. Understanding student acceptance of online learning systems in higher education: Application of social psychology theories with consideration of user innovativeness. Sustainability 2021, 13, 896. [Google Scholar] [CrossRef]
  12. Siron, Y.; Wibowo, A.; Narmaditya, B.S. Factors affecting the adoption of e-learning in Indonesia: Lesson from COVID-19. J. Technol. Sci. Educ. 2020, 10, 282–295. [Google Scholar] [CrossRef]
  13. Rizun, M.; Strzelecki, A. Students’ acceptance of the COVID-19 impact on shifting higher education to dis-tance learning in Poland. Int. J. Environ. Res. Public Health 2020, 17, 6468. [Google Scholar] [CrossRef] [PubMed]
  14. Sukendro, S.; Habibi, A.; Khaeruddin, K.; Indrayana, B.; Syahruddin, S.; Makadada, F.A.; Hakim, H. Using an extended Technology Acceptance Model to understand students’ use of e-learning during COVID-19: Indonesian sport science education context. Heliyon 2020, 6, e05410. [Google Scholar] [CrossRef] [PubMed]
  15. He, T.; Huang, Q.; Yu, X.; Li, S. Exploring students’ digital informal learning: The roles of digital competence and DTPB factors. Behav. Inf. Technol. 2020, 1–11. [Google Scholar] [CrossRef]
  16. Handoko, B.L. Application of UTAUT theory in higher education online learning. In Proceedings of the 2019 10th International Conference on E-Business, Management and Economics, Beijing, China, 15–17 July 2019; pp. 259–264. [Google Scholar] [CrossRef]
  17. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef] [Green Version]
  18. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef] [Green Version]
  19. Venkatesh, V.; Thong, J.Y.L.; Xu, X. Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Q. Manag. Inf. Syst. 2012, 36, 157–178. [Google Scholar] [CrossRef] [Green Version]
  20. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211. [Google Scholar] [CrossRef]
  21. Bhattacherjee, A. Understanding information systems continuance: An expectation-confirmation model. MIS Q. 2001, 25, 351–370. [Google Scholar] [CrossRef]
  22. Safsouf, Y.; Mansouri, K.; Poirier, F. An analysis to understand the online learners’ success in public higher education in Morocco. J. Inf. Technol. Educ. 2020, 19, 87–112. [Google Scholar] [CrossRef] [Green Version]
  23. Yumei, L.; Qiongwei, Y.; Luoyan, M. An empirical research on influence factor of college students’ continued intentions of online self-regulated learning based on the model of ECM and TAM. In Proceedings of the 3rd International Conference on Social Science and Higher Education (ICSSHE), Sanya, China, 28–30 September 2017; pp. 28–30. [Google Scholar]
  24. Lee, J.-W. Online support service quality, online learning acceptance, and student satisfaction. Internet High. Educ. 2010, 13, 277–283. [Google Scholar] [CrossRef]
  25. DeLone, W.H.; McLean, E.R. The DeLone and McLean model of information systems success: A ten-year update. J. Manag. Inf. Syst. 2003, 19, 9–30. [Google Scholar] [CrossRef]
  26. Baber, H. Determinants of students’ perceived learning outcome and satisfaction in online learning during the pandemic of COVID-19. J. Educ. E-Learn. Res. 2020, 7, 285–292. [Google Scholar] [CrossRef]
  27. Maqableh, M.; Jaradat, M.; Azzam, A. Exploring the determinants of students’ academic performance at university level: The mediating role of internet usage continuance intention. Educ. Inf. Technol. 2021, 26, 4003–4025. [Google Scholar] [CrossRef]
  28. Al-Hariri, M.T.; Al-Hattami, A.A. Impact of students’ use of technology on their learning achievements in physiology courses at the University of Dammam. J. Taibah Univ. Med. Sci. 2017, 12, 82–85. [Google Scholar] [CrossRef] [Green Version]
  29. Islam, A.N. Investigating e-learning system usage outcomes in the university context. Comput. Educ. 2013, 69, 387–399. [Google Scholar] [CrossRef]
  30. DeLone, W.H.; McLean, E.R. Information systems success: The quest for the dependent variable. Inf. Syst. Res. 1992, 3, 60–95. [Google Scholar] [CrossRef] [Green Version]
  31. Salam, M.; Farooq, M.S. Does sociability quality of web-based collaborative learning information system influence students’ satisfaction and system usage? Int. J. Educ. Technol. High. Educ. 2020, 17, 1–39. [Google Scholar] [CrossRef]
  32. Martins, J.; Branco, F.; Gonçalves, R.; Au-Yong-Oliveira, M.; Oliveira, T.; Naranjo-Zolotov, M.; Cruz-Jesus, F. Assessing the success behind the use of education management information systems in higher education. Telemat. Inf. 2019, 38, 182–193. [Google Scholar] [CrossRef]
  33. Eom, S.; Ashill, N.J.; Arbaugh, J.B.; Stapleton, J.L. The role of information technology in e-learning systems success. Hum. Syst. Manag. 2012, 31, 147–163. [Google Scholar] [CrossRef]
  34. Cheng, Y.M. Extending the expectation-confirmation model with quality and flow to explore nurses’ continued blended e-learning intention. Inf. Technol. People 2014, 27, 230–258. [Google Scholar] [CrossRef]
  35. Mohammadi, H. Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Comput. Hum. Behav. 2015, 45, 359–374. [Google Scholar] [CrossRef]
  36. Al Mulhem, A. Investigating the effects of quality factors and organizational factors on university students’ satisfaction of e-learning system quality. Cogent Educ. 2020, 7, 1787004. [Google Scholar] [CrossRef]
  37. Seta, H.B.; Wati, T.; Muliawati, A.; Hidayanto, A.N. E-learning success model: An extention of DeLone and McLean IS’ success model. Indones. J. Electr. Eng. Inf. 2018, 6, 281–291. [Google Scholar] [CrossRef] [Green Version]
  38. Al-Adwan, A.S.; Al-Madadha, A.; Zvirzdinaite, Z. Modeling students’ readiness to adopt mobile learning in higher education: An empirical study. Int. Rev. Res. Open Distrib. Learn. 2018, 19, 221–241. [Google Scholar] [CrossRef] [Green Version]
  39. Zhu, Y.; Zhang, J.H.; Au, W.; Yates, G. University students’ online learning attitudes and continuous intention to undertake online courses: A self-regulated learning perspective. Educ. Technol. Res. Dev. 2020, 68, 1485–1519. [Google Scholar] [CrossRef]
  40. Shahzad, A.; Hassan, R.; Aremu, A.Y.; Hussain, A.; Lodhi, R.N. Effects of COVID-19 in e-learning on higher education institution students: The group comparison between male and female. Qual. Quant. 2021, 55, 805–827. [Google Scholar] [CrossRef]
  41. Pham, L.; Limbu, Y.B.; Bui, T.K.; Nguyen, H.T.; Pham, H.T. Does e-learning service quality influence e-learning student satisfaction and loyalty? Evidence from Vietnam. Int. J. Educ. Technol. High. Educ. 2019, 16, 1–26. [Google Scholar] [CrossRef] [Green Version]
  42. Rajabalee, Y.B.; Santally, M.I. Learner satisfaction, engagement and performances in an online module: Implications for institutional e-learning policy. Educ. Inf. Technol. 2021, 26, 2623–2656. [Google Scholar] [CrossRef]
  43. Keržič, D.; Tomaževič, N.; Aristovnik, A.; Umek, L. Exploring critical factors of the perceived usefulness of blended learning for higher education students. PLoS ONE 2019, 14, e0223767. [Google Scholar] [CrossRef]
  44. Lee, S.J.; Lee, H.; Kim, T.T. A study on the instructor role in dealing with mixed contents: How it affects learner satisfaction and retention in e-learning. Sustainability 2018, 10, 850. [Google Scholar] [CrossRef] [Green Version]
  45. Turugare, M.; Rudhumbu, N. Integrating technology in teaching and learning in universities in Lesotho: Opportunities and challenges. Educ. Inf. Technol. 2020, 25, 3593–3612. [Google Scholar] [CrossRef]
  46. Almerich, G.; Orellana, N.; Suárez-Rodríguez, J.; Díaz-García, I. Teachers’ information and communication technology competences: A structural approach. Comput. Educ. 2016, 100, 110–125. [Google Scholar] [CrossRef]
  47. Lee, E.-Y.; Jeon, Y.J.J. The difference of user satisfaction and net benefit of a mobile learning management system according to self-directed learning: An investigation of cyber university students in hospitality. Sustainability 2020, 12, 2672. [Google Scholar] [CrossRef] [Green Version]
  48. Mtebe, J.S.; Raisamo, R. A model for assessing learning management system success in higher education in Sub-Saharan countries. Electron. J. Inf. Syst. Dev. Ctries. 2014, 61, 1–17. [Google Scholar] [CrossRef]
  49. Goh, C.; Leong, C.; Kasmin, K.; Hii, P.; Tan, O. Students’ experiences, learning outcomes and satisfaction in e-learning. J. E-Learn. Knowl. Soc. 2017, 13, 117–128. [Google Scholar] [CrossRef]
  50. Almaiah, M.A.; Jalil, M.A.; Man, M. Extending the TAM to examine the effects of quality features on mobile learning acceptance. J. Comput. Educ. 2016, 3, 453–485. [Google Scholar] [CrossRef]
  51. Lee, M.C. Explaining and predicting users’ continuance intention toward e-learning: An extension of the expectation–confirmation model. Comput. Educ. 2010, 54, 506–516. [Google Scholar] [CrossRef]
  52. Ilgaz, H.; Gülbahar, Y. Examining e-learners’ preferences and readiness satisfaction: A holistic modelling approach. Open Prax. 2020, 12, 209–222. [Google Scholar]
  53. Abbad, M.; Jaber, F. Evaluating e-learning systems: An empirical investigation on students’ perception in higher education area. Int. J. Emerg. Technol. Learn. 2014, 9, 27–34. [Google Scholar] [CrossRef]
  54. Cheok, M.L.; Wong, S.L. Predictors of e-learning satisfaction in teaching and learning for school teachers: A literature review. Int. J. Instr. 2015, 8, 75–90. [Google Scholar] [CrossRef]
  55. Mahmoodi, Z.; Esmaelzadeh-Saeieh, S.; Lotfi, R.; Eftekhari, M.B.; Kamrani, M.A.; Tourzani, Z.M.; Salehi, K. The evaluation of a virtual education system based on the DeLone and McLean model: A path analysis. F1000Research 2017, 6, 1631. [Google Scholar] [CrossRef] [Green Version]
  56. Bauk, S.; Šćepanović, S.; Kopp, M. Estimating students’ satisfaction with web based learning system in blended learning environment. Educ. Res. Int. 2014, 2014, 1–11. [Google Scholar] [CrossRef]
  57. Zimmerman, B.J. Self-regulated learning: Theories, measures, and outcomes. In International Encyclopedia of the Social Behavioral Sciences, 2nd ed.; Wright, J.D., Ed.; Elsevier: Amsterdam, The Netherlands, 2015; pp. 541–546. [Google Scholar] [CrossRef]
  58. Landrum, B. Examining students’ confidence to learn online, self-regulation skills and perceptions of satisfaction and usefulness of online classes. Online Learn. 2020, 24, 128–146. [Google Scholar] [CrossRef]
  59. Liaw, S.S.; Huang, H.M. Perceived satisfaction, perceived usefulness and interactive learning environments as predictors to self-regulation in e-learning environments. Comput. Educ. 2013, 60, 14–24. [Google Scholar] [CrossRef]
  60. Al-Adwan, A.S. Investigating the drivers and barriers to MOOCs adoption: The perspective of TAM. Educ. Inf. Technol. 2020, 25, 5771–5795. [Google Scholar] [CrossRef]
  61. Dorobăţ, I.; Corbea, A.M.I.; Muntean, M. Integrating student trust in a conceptual model for assessing learning management system success in higher education: An empirical analysis. IEEE Access 2019, 7, 69202–69214. [Google Scholar] [CrossRef]
  62. Mtebe, J.S.; Raphael, C. Key factors in learners’ satisfaction with the e-learning system at the University of Dar es Salaam, Tanzania. Australas. J. Educ. Technol. 2018, 34, 107–122. [Google Scholar] [CrossRef] [Green Version]
  63. Asoodar, M.; Vaezi, S.; Izanloo, B. Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Comput. Hum. Behav. 2016, 63, 704–716. [Google Scholar] [CrossRef]
  64. Foster, Y.; Hergeth, S.; Naujoks, F.; Krems, J.F.; Keinath, A. What and how to tell beforehand: The effect of user education on understanding, interaction and satisfaction with driving automation. Transp. Res. Part F Traffic Psychol. Behav. 2020, 68, 316–335. [Google Scholar] [CrossRef]
  65. Xu, F.; Du, J.T. Examining differences and similarities between graduate and undergraduate students’ user satisfaction with digital libraries. J. Acad. Librariansh. 2019, 45, 102072. [Google Scholar] [CrossRef]
  66. Cidral, W.; Aparicio, M.; Oliveira, T. Students’ long-term orientation role in e-learning success: A Brazilian study. Heliyon 2020, 6, e05735. [Google Scholar] [CrossRef]
  67. Farooq, M.S.; Salam, M.; Jaafar, N.; Fayolle, A.; Ayupp, K.; Radovic-Markovic, M.; Sajid, A. Acceptance and use of lecture capture system (LCS) in executive business studies: Extending UTAUT2. Interact. Technol. Smart Educ. 2017, 14, 329–348. [Google Scholar] [CrossRef]
  68. Alzahrani, A.I.; Mahmud, I.; Ramayah, T.; Alfarraj, O.; Alalwan, N. Modelling digital library success using the DeLone and McLean information system success model. J. Librariansh. Inf. Sci. 2017, 51, 291–306. [Google Scholar] [CrossRef]
  69. Makhni, E.C. Editorial commentary: Making the leap to the patient-reported outcomes measurement information system: A paradigm shift that will ultimately benefit our patients. Arthrosc. J. Arthrosc. Relat. Surg. 2020, 36, 521–523. [Google Scholar] [CrossRef] [Green Version]
  70. Zhang, L.; Thompson, R.G. Understanding the benefits and limitations of occupancy information systems for couriers. Transp. Res. Part C Emerg. Technol. 2019, 105, 520–535. [Google Scholar] [CrossRef]
  71. Gravetter, F.J.; Forzano, L.A.B. Research Methods for the Behavioral Sciences; Cengage Learning: Boston, MA, USA, 2018. [Google Scholar]
  72. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to use and how to report the results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  73. Bollen, K.A.; Noble, M.D. Structural equation models and the quantification of behavior. Proc. Natl. Acad. Sci. USA 2011, 108, 15639–15646. [Google Scholar] [CrossRef] [Green Version]
  74. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling; Sage Publications: Thousand Oaks, CA, USA, 2013; pp. 184–185. [Google Scholar]
  75. Al-Adwan, A.S.; Sammour, G. What Makes Consumers Purchase Mobile Apps: Evidence from Jordan. J. Theor. Appl. Electron. Commer. Res. 2021, 16, 562–583. [Google Scholar] [CrossRef]
  76. Mican, D.; Sitar-Tăut, D.-A.; Moisescu, O.-I. Perceived usefulness: A silver bullet to assure user data availability for online recommendation systems. Decis. Support Syst. 2020, 139, 113420. [Google Scholar] [CrossRef]
  77. Al-Adwan, A.S.; Kokash, H.; Adwan, A.A.; Alhorani, A.; Yaseen, H. Building customer loyalty in online shopping: The role of online trust, online satisfaction and electronic word of mouth. Int. J. Electron. Mark. Retail. 2020, 11, 278–306. [Google Scholar] [CrossRef]
  78. Al-Adwan, A.S. Revealing the influential factors driving social commerce adoption. Interdiscip. J. Inf. Knowl. Manag. 2019, 14, 295–324. [Google Scholar] [CrossRef]
  79. Al-Adwan, A.S.; Kokash, H. The driving forces of Facebook social commerce. J. Theor. Appl. Electron. Comer. Res. 2019, 14, 15–32. [Google Scholar] [CrossRef] [Green Version]
  80. Al-Adwan, A.S.; Alrousan, M.; Al-Soud, A.; Al-Yaseen, H. Revealing the black box of shifting from electronic commerce to mobile commerce: The case of Jordan. J. Theor. Appl. Electron. Commer. Res. 2019, 14, 51–67. [Google Scholar] [CrossRef] [Green Version]
  81. Migdadi, M.M.; Abu Zaid, M.K.S.; Al-Hujran, O.S.; Aloudat, A.M. An empirical assessment of the antecedents of electronic-business implementation and the resulting organizational performance. Internet Res. 2016, 26, 661–688. [Google Scholar] [CrossRef]
  82. Hair, J.F.; Hollingsworth, C.L.; Randolph, A.B.; Chong, A.Y.L. An updated and expanded assessment of PLS-SEM in information systems research. Ind. Manag. Data Syst. 2017, 117, 442–458. [Google Scholar] [CrossRef]
  83. Hujran, O.; Abu-Shanab, E.; Aljaafreh, A. Predictors for the adoption of e-democracy: An empirical evaluation based on a citizen-centric approach. Transform. Gov. People Process Policy 2020, 14, 523–544. [Google Scholar] [CrossRef]
  84. Al-Hujran, O.; Al-Debei, M.M.; Chatfield, A.; Migdadi, M. The imperative of influencing citizen attitude toward e-government adoption and use. Comput. Hum. Behav. 2015, 53, 189–203. [Google Scholar] [CrossRef]
  85. Sitar-Tăut, D.A. Mobile learning acceptance in social distancing during the COVID-19 outbreak: The mediation effect of hedonic motivation. Hum. Behav. Emerg. Technol. 2021, 3, 366–378. [Google Scholar] [CrossRef]
  86. Kumar, J.A.; Bervell, B.; Annamalai, N.; Osman, S. Behavioral intention to use mobile learning: Evaluating the role of self-efficacy, subjective norm, and WhatsApp use habit. IEEE Access 2020, 8, 208058–208074. [Google Scholar] [CrossRef]
  87. Al-Adwan, A.S.; Khdour, N. Exploring Student Readiness to MOOCs in Jordan: A Structural Equation Modelling Approach. J. Inf. Technol. Educ. 2020, 19, 223–242. [Google Scholar] [CrossRef] [Green Version]
  88. Al-Adwan, A.S.; Albelbisi, N.A.; Aladwan, S.H.; Horani, O.; Al-Madadha, A.; Al Khasawneh, M.H. Investigating the Impact of Social Media Use on Student’s Perception of Academic Performance in Higher Education: Evidence from Jordan. J. Inf. Technol. Educ. Res. 2020, 19, 953–975. [Google Scholar] [CrossRef]
  89. Anderson, J.C.; Gerbing, D.W. Structural equation modeling in practice: A review and recommended two-step approach. Psychol. Bull. 1988, 103, 411–423. [Google Scholar] [CrossRef]
  90. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  91. Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef] [Green Version]
  92. Henseler, J.; Hubona, G.; Ray, A. Using PLS path modelling in new technology research: Updated guide-lines. Ind. Manag. Data Syst. 2016, 116, 2–20. [Google Scholar] [CrossRef]
  93. Benitez, J.; Henseler, J.; Castillo, A.; Schuberth, F. How to perform and report an impactful analysis using partial least squares: Guidelines for confirmatory and explanatory IS research. Inf. Manag. 2020, 57, 103168. [Google Scholar] [CrossRef]
  94. Hair, J.F.; Ringle, C.M.; Sarstedt, M. Partial least squares structural equation modeling: Rigorous applications, better results and higher acceptance. Long Range Plan. 2013, 46, 1–12. [Google Scholar] [CrossRef]
  95. Chin, W.W. The partial least squares approach to structural equation modeling. Mod. Methods Bus. Res. 1998, 295, 295–336. [Google Scholar]
  96. Kim, K.; Trimi, S.; Park, H.; Rhee, S. The impact of CMS quality on the outcomes of e-learning systems in higher education: An empirical study. Decis. Sci. J. Innov. Educ. 2012, 10, 575–587. [Google Scholar] [CrossRef]
  97. Teo, T.; Wong, S.L. Modeling key drivers of e-learning satisfaction among student teachers. J. Educ. Comput. Res. 2013, 48, 71–95. [Google Scholar] [CrossRef]
  98. Markova, T.; Glazkova, I.; Zaborova, E. Quality issues of online distance learning. Procedia-Soc. Behav. Sci. 2017, 237, 685–691. [Google Scholar] [CrossRef]
  99. Su, Y.; Li, Y.; Liang, J.C.; Tsai, C.C. Moving literature circles into wiki-based environment: The role of online self-regulation in EFL learners’ attitude toward collaborative learning. Comput. Assist. Lang. Learn. 2019, 32, 556–586. [Google Scholar] [CrossRef]
  100. Ghazal, S.; Al-Samarraie, H.; Aldowah, H. I am still learning: Modeling LMS critical success factors for promoting students’ experience and satisfaction in a blended learning environment. IEEE Access 2018, 6, 77179–77201. [Google Scholar] [CrossRef]
  101. Larmuseau, C.; Evens, M.; Elen, J.; Van Den Noortgate, W.; Desmet, P.; Depaepe, F. The relationship between acceptance, actual use of a virtual learning environment and performance: An ecological approach. J. Comput. Educ. 2018, 5, 95–111. [Google Scholar] [CrossRef]
  102. Bucea-Manea-Țoniş, R.; Bucea-Manea-Țoniş, R.; Simion, V.E.; Ilic, D.; Braicu, C.; Manea, N. Sustainability in higher education: The relationship between work-life balance and XR e-learning facilities. Sustainability 2020, 12, 5872. [Google Scholar] [CrossRef]
  103. Cazan, A.M. An intervention study for the development of self-regulated learning skills. Curr. Psychol. 2020. [Google Scholar] [CrossRef]
  104. Vrieling, E.; Stijnen, S.; Bastiaens, T. Successful learning: Balancing self-regulation with instructional planning. Teach. High. Educ. 2018, 23, 685–700. [Google Scholar] [CrossRef] [Green Version]
  105. Pedrotti, M.; Nistor, N. How students fail to self-regulate their online learning experience. In Transforming Learning with Meaningful Technologies; Scheffel, M., Broisin, J., Pammer-Schindler, V., Ioannou, A., Schneider, J., Eds.; Springer: Berlin, Germany, 2019; pp. 377–385. [Google Scholar] [CrossRef] [Green Version]
  106. Althobaiti, M.M.; Mayhew, P. Assessing the usability of learning management system: User experience study. In E-Learning, E-Education, and Online Training; Springer: Berlin, Germany, 2016; pp. 9–18. [Google Scholar] [CrossRef]
  107. Abdous, M. Influence of satisfaction and preparedness on online students’ feelings of anxiety. Internet High. Educ. 2019, 41, 34–44. [Google Scholar] [CrossRef]
  108. Al-Adwan, A.S.; Al-Adwan, A.; Berger, H. Solving the mystery of mobile learning adoption in higher education. Int. J. Mob. Commun. 2018, 16, 24–49. [Google Scholar] [CrossRef]
Figure 1. Information system success model (DeLone and McLean [25]).
Figure 2. The research model.
Figure 3. Structural model analysis (T statistics). CCQ: course content quality, TSQ: technical system quality, IQ: instructor quality, SSQ: support service quality, ESQ: educational systems quality, SRL: self-regulated learning, SAT: satisfaction, PU: perceived usefulness, USE: system use, APC: academic performance.
Table 1. Respondents’ profile.
Variable | Category | Frequency | Percentage
Gender | Male | 357 | 66%
Gender | Female | 180 | 34%
Age in years | <20 | 202 | 38%
Age in years | 20–30 | 241 | 45%
Age in years | >30 | 94 | 17%
Enrolled course | Bachelor’s | 493 | 92%
Enrolled course | Master’s | 44 | 8%
Experience using the e-learning system | <1 year | 163 | 30%
Experience using the e-learning system | 1–2 years | 331 | 62%
Experience using the e-learning system | >2 years | 43 | 8%
N = 537.
Table 2. Construct reliability and validity.
Construct | Item | Loading | α | CR | AVE
Academic Performance (ACP) | APC1 | 0.93 | 0.94 | 0.96 | 0.85
| APC2 | 0.92 | | |
| APC3 | 0.91 | | |
| APC4 | 0.92 | | |
Course Content Quality (CCQ) | CCQ1 | 0.85 | 0.90 | 0.93 | 0.77
| CCQ2 | 0.89 | | |
| CCQ3 | 0.88 | | |
| CCQ4 | 0.90 | | |
Educational System Quality (ESQ) | ESQ1 | 0.89 | 0.90 | 0.93 | 0.77
| ESQ2 | 0.88 | | |
| ESQ3 | 0.87 | | |
| ESQ4 | 0.87 | | |
Instructor Quality (IQ) | IQ1 | 0.88 | 0.86 | 0.91 | 0.71
| IQ2 | 0.82 | | |
| IQ3 | 0.85 | | |
| IQ4 | 0.83 | | |
Perceived Usefulness (PU) | PU1 | 0.94 | 0.94 | 0.96 | 0.85
| PU2 | 0.93 | | |
| PU3 | 0.90 | | |
| PU4 | 0.93 | | |
Satisfaction (SAT) | SAT1 | 0.94 | 0.96 | 0.97 | 0.88
| SAT2 | 0.94 | | |
| SAT3 | 0.93 | | |
| SAT4 | 0.94 | | |
Self-regulated learning (SRL) | SRL1 | 0.82 | 0.87 | 0.91 | 0.72
| SRL2 | 0.88 | | |
| SRL3 | 0.83 | | |
| SRL4 | 0.85 | | |
Support Service Quality (SSQ) | SSQ1 | 0.84 | 0.86 | 0.91 | 0.71
| SSQ2 | 0.82 | | |
| SSQ3 | 0.85 | | |
| SSQ4 | 0.86 | | |
Technical System Quality (TSQ) | TSQ1 | 0.86 | 0.88 | 0.92 | 0.74
| TSQ2 | 0.87 | | |
| TSQ3 | 0.85 | | |
| TSQ4 | 0.86 | | |
System Use (USE) | USE1 | 0.89 | 0.90 | 0.93 | 0.78
| USE2 | 0.86 | | |
| USE3 | 0.89 | | |
| USE4 | 0.89 | | |
α: Cronbach’s alpha, CR: composite reliability, AVE: average variance extracted.
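For readers who wish to verify the reliability indices in Table 2 outside a dedicated PLS-SEM package, the following minimal Python sketch applies the standard formulas for composite reliability and AVE to standardized outer loadings. The helper names are our own, and the check simply re-derives the ACP row from its reported loadings; it is an illustration, not the authors' SmartPLS workflow.

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    assuming standardized loadings and uncorrelated measurement errors."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return num / (num + (1.0 - lam ** 2).sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

# Illustrative check against the ACP row of Table 2 (loadings 0.93, 0.92, 0.91, 0.92).
acp = [0.93, 0.92, 0.91, 0.92]
print(round(composite_reliability(acp), 2))       # ~0.96, as reported
print(round(average_variance_extracted(acp), 2))  # ~0.85, as reported
```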
Table 3. Fornell and Larcker’s test.
Construct | IQ | APC | CCQ | ESQ | PU | SAT | SRL | SSQ | TSQ | USE
IQ | 0.84 * | | | | | | | | |
APC | 0.63 ** | 0.92 | | | | | | | |
CCQ | 0.59 | 0.63 | 0.88 | | | | | | |
ESQ | 0.62 | 0.65 | 0.60 | 0.88 | | | | | |
PU | 0.64 | 0.69 | 0.70 | 0.64 | 0.92 | | | | |
SAT | 0.65 | 0.66 | 0.66 | 0.67 | 0.69 | 0.94 | | | |
SRL | −0.56 | −0.58 | −0.59 | −0.63 | −0.62 | −0.62 | 0.85 | | |
SSQ | 0.55 | 0.56 | 0.60 | 0.61 | 0.62 | 0.63 | −0.56 | 0.84 | |
TSQ | 0.62 | 0.60 | 0.62 | 0.59 | 0.66 | 0.65 | −0.57 | 0.61 | 0.86 |
USE | 0.62 | 0.67 | 0.64 | 0.64 | 0.69 | 0.63 | −0.62 | 0.61 | 0.63 | 0.88
* Numbers on the leading diagonal are the square root of AVE for each construct, ** correlation among the constructs, CCQ: course content quality, TSQ: technical system quality, IQ: instructor quality, SSQ: support service quality, ESQ: educational systems quality, SRL: self-regulated learning, SAT: satisfaction, PU: perceived usefulness, USE: system use, APC: academic performance.
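The Fornell–Larcker criterion behind Table 3 compares the square root of each construct's AVE (the diagonal) with that construct's correlations with all other constructs. A small sketch of the check is given below; the function name is ours, and the two-construct example reuses the IQ/APC values reported in Tables 2 and 3 purely for illustration.

```python
import numpy as np
import pandas as pd

def fornell_larcker(corr: pd.DataFrame, ave: pd.Series) -> pd.DataFrame:
    """Place sqrt(AVE) on the diagonal of the construct correlation matrix.
    Discriminant validity holds when every diagonal entry exceeds all
    off-diagonal entries in its row and column."""
    mat = corr.to_numpy(dtype=float).copy()
    np.fill_diagonal(mat, np.sqrt(ave.loc[corr.index].to_numpy(dtype=float)))
    return pd.DataFrame(mat, index=corr.index, columns=corr.columns)

# Two-construct illustration using the IQ/APC values reported above.
corr = pd.DataFrame([[1.00, 0.63], [0.63, 1.00]], index=["IQ", "APC"], columns=["IQ", "APC"])
ave = pd.Series({"IQ": 0.71, "APC": 0.85})
print(fornell_larcker(corr, ave).round(2))  # diagonal ~0.84 and ~0.92, both above 0.63
```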
Table 4. Heterotrait–monotrait test.
Construct | IQ | APC | CCQ | ESQ | PU | SAT | SRL | SSQ | TSQ | USE
IQ | - | | | | | | | | |
APC | 0.70 | - | | | | | | | |
CCQ | 0.66 | 0.68 | - | | | | | | |
ESQ | 0.70 | 0.70 | 0.67 | - | | | | | |
PU | 0.71 | 0.73 | 0.76 | 0.69 | - | | | | |
SAT | 0.71 | 0.69 | 0.71 | 0.72 | 0.72 | - | | | |
SRL | 0.64 | 0.65 | 0.66 | 0.71 | 0.68 | 0.68 | - | | |
SSQ | 0.64 | 0.62 | 0.68 | 0.69 | 0.68 | 0.69 | 0.64 | - | |
TSQ | 0.71 | 0.65 | 0.69 | 0.66 | 0.73 | 0.71 | 0.65 | 0.70 | - |
USE | 0.70 | 0.73 | 0.71 | 0.72 | 0.75 | 0.68 | 0.69 | 0.69 | 0.70 | -
CCQ: course content quality, TSQ: technical system quality, IQ: instructor quality, SSQ: support service quality, ESQ: educational systems quality, SRL: self-regulated learning, SAT: satisfaction, PU: perceived usefulness, USE: system use, APC: academic performance.
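For reference, the heterotrait–monotrait ratio reported in Table 4 follows the definition of Henseler et al. [91]: the average heterotrait–heteromethod correlation divided by the geometric mean of the average monotrait–heteromethod correlations of the two constructs. With $K_i$ and $K_j$ indicators for constructs $i$ and $j$ and $r$ denoting indicator correlations,

$$
\mathrm{HTMT}_{ij}=\frac{\dfrac{1}{K_iK_j}\displaystyle\sum_{g=1}^{K_i}\sum_{h=1}^{K_j}\left|r_{ig,jh}\right|}{\sqrt{\dfrac{2}{K_i(K_i-1)}\displaystyle\sum_{g<h}\left|r_{ig,ih}\right|\cdot\dfrac{2}{K_j(K_j-1)}\displaystyle\sum_{g<h}\left|r_{jg,jh}\right|}}.
$$

All entries in Table 4 fall below the conservative 0.85 threshold (and the more liberal 0.90), which is the conventional reading in support of discriminant validity.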
Table 5. Model fit.
Index | Acceptable Value/Condition | Actual Value
SRMR (standardized root mean square residual) | <0.08 | 0.045
d_ULS (unweighted least squares discrepancy) | d_ULS < bootstrapped HI 95% of d_ULS | 0.435
d_G (geodesic discrepancy) | d_G < bootstrapped HI 95% of d_G | 0.357
NFI (normed fit index) | >0.9 | 0.912
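As a reminder of what the SRMR in Table 5 measures (this is the standard definition, not a quantity specific to this study), it is the root mean squared difference between the empirical correlation matrix $R=(r_{ij})$ and the model-implied correlation matrix $\hat{R}=(\hat{r}_{ij})$ over the $p$ observed indicators:

$$
\mathrm{SRMR}=\sqrt{\frac{2}{p(p+1)}\sum_{i=1}^{p}\sum_{j\le i}\left(r_{ij}-\hat{r}_{ij}\right)^{2}}.
$$

Values below 0.08 are conventionally read as acceptable fit, consistent with the 0.045 reported above.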
Table 6. Collinearity test.
Independent variables (columns) by dependent variable (rows):
Dependent Variable | IQ | CCQ | ESQ | PU | SAT | SRL | SSQ | TSQ | USE
APC | - | - | - | 2.41 | 2.07 | - | - | - | 2.11
PU | 2.08 | 2.15 | 2.29 | - | - | 2.00 | 2.06 | 2.20 | -
SAT | 2.16 | 2.38 | 2.32 | 2.78 | - | 2.04 | 2.09 | 2.29 | -
USE | 2.16 | 2.38 | 2.32 | 2.78 | - | 2.04 | 2.09 | 2.29 | -
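The inner-model collinearity statistics in Table 6 are variance inflation factors, VIF_k = 1/(1 − R²_k), obtained by regressing each predictor construct's scores on the remaining predictors of the same endogenous construct. The sketch below is a generic helper of our own (not the authors' SmartPLS output) that reproduces this calculation from a table of construct scores; values below the common threshold of 5 (or the stricter 3.3) indicate that collinearity is not a concern, consistent with the reported maximum of 2.78.

```python
import numpy as np
import pandas as pd

def vif(predictors: pd.DataFrame) -> pd.Series:
    """Variance inflation factor per column: VIF_k = 1 / (1 - R^2_k), where R^2_k
    comes from an OLS regression of column k on all remaining columns."""
    X = predictors.to_numpy(dtype=float)
    result = {}
    for k, name in enumerate(predictors.columns):
        y = X[:, k]
        Z = np.column_stack([np.ones(len(X)), np.delete(X, k, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r_squared = 1.0 - resid.var() / y.var()
        result[name] = 1.0 / (1.0 - r_squared)
    return pd.Series(result)

# Hypothetical usage: vif(scores[["IQ", "CCQ", "ESQ", "PU", "SRL", "SSQ", "TSQ"]])
# would give the SAT row of Table 6 if "scores" held the latent construct scores.
```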
Table 7. Predictive accuracy.
Construct | R² | Q²
APC | 0.576 | 0.48
PU | 0.64 | 0.54
SAT | 0.641 | 0.56
USE | 0.61 | 0.47
Table 8. PLS predict assessment.
Item | RMSE (PLS) | Q²_predict (PLS) | RMSE (LM) | RMSE_PLS < RMSE_LM | Predictive Power
APC1 | 0.366 | 0.464 | 0.375 | Yes | High
APC2 | 0.355 | 0.48 | 0.359 | Yes |
APC3 | 0.358 | 0.475 | 0.365 | Yes |
APC4 | 0.37 | 0.446 | 0.375 | Yes |
PU1 | 0.344 | 0.526 | 0.337 | No | Medium
PU2 | 0.337 | 0.543 | 0.344 | Yes |
PU3 | 0.331 | 0.559 | 0.343 | Yes |
PU4 | 0.347 | 0.515 | 0.354 | Yes |
SAT1 | 0.325 | 0.552 | 0.327 | Yes | Medium
SAT2 | 0.322 | 0.576 | 0.328 | Yes |
SAT3 | 0.34 | 0.523 | 0.337 | No |
SAT4 | 0.341 | 0.524 | 0.343 | Yes |
USE1 | 0.372 | 0.443 | 0.379 | Yes | Medium
USE2 | 0.37 | 0.454 | 0.372 | Yes |
USE3 | 0.357 | 0.485 | 0.358 | Yes |
USE4 | 0.384 | 0.409 | 0.376 | No |
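In Table 8, an indicator is taken to have predictive relevance when Q²_predict is above zero and the PLS-based out-of-sample RMSE is lower than that of the naive linear-model (LM) benchmark. The helpers below show the two quantities being compared; the function names are our own, and `actual`/`predicted` stand for hold-out observations and their predictions from either model.

```python
import numpy as np

def rmse(actual, predicted):
    """Root mean squared prediction error for one indicator."""
    actual, predicted = np.asarray(actual, dtype=float), np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((actual - predicted) ** 2)))

def q2_predict(actual, predicted):
    """Q2_predict = 1 - SSE(model) / SSE(mean benchmark); here the mean of the
    hold-out values stands in for the training-sample mean used in PLSpredict."""
    actual, predicted = np.asarray(actual, dtype=float), np.asarray(predicted, dtype=float)
    sse = np.sum((actual - predicted) ** 2)
    sso = np.sum((actual - actual.mean()) ** 2)
    return float(1.0 - sse / sso)

# An item "wins" over the benchmark when rmse(y, y_hat_pls) < rmse(y, y_hat_lm),
# which is what the "RMSE_PLS < RMSE_LM" column in Table 8 records.
```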
Table 9. Path analysis summary.
Hypothesis | Path | β | SD | T Statistics | p Value | Result
H1a | IQ → SAT | 0.155 | 0.050 | 3.167 | 0.002 | Supported
H1b | IQ → PU | 0.159 | 0.047 | 3.430 | 0.001 | Supported
H1c | IQ → USE | 0.113 | 0.044 | 2.519 | 0.012 | Supported
H2a | CCQ → SAT | 0.153 | 0.053 | 2.897 | 0.004 | Supported
H2b | CCQ → PU | 0.289 | 0.052 | 5.577 | 0.000 | Supported
H2c | CCQ → USE | 0.118 | 0.057 | 2.103 | 0.035 | Supported
H3a | ESQ → SAT | 0.169 | 0.051 | 3.320 | 0.001 | Supported
H3b | ESQ → PU | 0.111 | 0.051 | 2.158 | 0.031 | Supported
H3c | ESQ → USE | 0.151 | 0.052 | 2.935 | 0.003 | Supported
H4a | SSQ → SAT | 0.118 | 0.049 | 2.413 | 0.014 | Supported
H4b | SSQ → PU | 0.110 | 0.044 | 2.483 | 0.013 | Supported
H4c | SSQ → USE | 0.101 | 0.043 | 2.340 | 0.019 | Supported
H5a | TSQ → SAT | 0.119 | 0.052 | 2.284 | 0.022 | Supported
H5b | TSQ → PU | 0.180 | 0.048 | 3.843 | 0.000 | Supported
H5c | TSQ → USE | 0.106 | 0.047 | 2.67 | 0.023 | Supported
H6a | SRL → SAT | −0.117 | 0.047 | 2.488 | 0.013 | Not Supported
H6b | SRL → PU | −0.125 | 0.049 | 2.554 | 0.011 | Not Supported
H6c | SRL → USE | −0.130 | 0.047 | 2.743 | 0.006 | Not Supported
H7a | PU → USE | 0.230 | 0.060 | 3.864 | 0.000 | Supported
H7b | PU → SAT | 0.148 | 0.063 | 2.347 | 0.019 | Supported
H7c | PU → APC | 0.299 | 0.059 | 5.036 | 0.000 | Supported
H8 | SAT → APC | 0.265 | 0.055 | 4.818 | 0.000 | Supported
H9 | USE → APC | 0.295 | 0.051 | 5.767 | 0.000 | Supported
SD: Standard deviation.
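The T statistics and p values in Table 9 come from bootstrapping: each path coefficient is divided by the standard deviation of its estimates across bootstrap resamples, and the p value is read from the resulting statistic (two-tailed, using a standard normal approximation). A minimal sketch of that step is shown below; `path_estimate` and `bootstrap_estimates` are placeholders for the original-sample coefficient and its resampled values, not output from the authors' analysis.

```python
import numpy as np
from scipy import stats

def bootstrap_t_and_p(path_estimate: float, bootstrap_estimates) -> tuple[float, float]:
    """T = original path coefficient / standard deviation of bootstrap estimates;
    two-tailed p value from the standard normal approximation."""
    sd = np.std(np.asarray(bootstrap_estimates, dtype=float), ddof=1)
    t = path_estimate / sd
    p = 2.0 * (1.0 - stats.norm.cdf(abs(t)))
    return t, p

# Rough check against H1a (IQ -> SAT): beta = 0.155 and SD = 0.050 give T ~ 3.1 and
# p ~ 0.002, of the same order as the 3.167 / 0.002 reported in Table 9 (the exact
# values depend on the bootstrap distribution, which is not reproduced here).
```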
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
