
Journal of Applied Sciences

Year: 2012 | Volume: 12 | Issue: 14 | Page No.: 1474-1480
DOI: 10.3923/jas.2012.1474.1480
Factors Determining e-learning Service Quality in Jordanian Higher Education Environment
Nabeel Farouq Al-Mushasha and Ayman Bassam Nassuora

Abstract: The rapid development of computer and communication technologies has changed methods of education and training and spurred the growth of e-learning programs. Technology-based training and electronic learning (e-learning) is one of the major trends in the field of human resource development. Nevertheless, there is a lack of research addressing electronic learning service quality in the higher education environment. This study aims to identify the factors that lead to service quality of e-learning in the Jordanian higher education environment, using a modified theoretical model based on the SERVQUAL model. Data from a survey of 189 students were used to test the research model. Exploratory Factor Analysis (EFA) was conducted to examine the reliability and validity of the measurement model and multiple regression analysis was used to test the research model. The findings revealed that the factors that lead to service quality of e-learning in the Jordanian higher education environment were interface design, reliability, responsiveness, trust and personalization.


How to cite this article
Nabeel Farouq Al-Mushasha and Ayman Bassam Nassuora, 2012. Factors Determining e-learning Service Quality in Jordanian Higher Education Environment. Journal of Applied Sciences, 12: 1474-1480.

Keywords: Electronic learning, SERVQUAL, service quality and information systems quality

INTRODUCTION

The walls of the classroom have been torn down: the evolution of computer technology widened the educational activities available to instructors and students in the 1990s. Internet technology has removed time and space constraints for instructors as well as students. With the rapid diffusion of the Internet, computers and telecommunications, new approaches to learning were created (Berge and Collins, 1995; Crosta, 2004). Online courses appeared as a new method of course delivery and, since then, interest in the development and use of distance learning in higher education has been steadily increasing (Dabbagh and Kitsantas, 2004). This rapid diffusion of the Internet and its deployment in learning and online course delivery is represented by Electronic Learning (e-learning). The demands of e-learning, in connection with the possibilities offered by modern technology, pose new opportunities and new challenges to educational systems. In recent years, with the use of Information and Communication Technologies (ICT) in education, institutes of higher learning have had the opportunity to revitalize the process of teaching and learning via electronic learning. Due to this advancement, students at all academic levels have developed far more sophisticated expectations, demands and study patterns than ever before (Al-Mushasha, 2010). Since electronic learning contents are delivered via wired or wireless Internet, they are information-oriented products and services (Martinez-Arguelles et al., 2010). Thus, each service sector should have service quality criteria that specifically fit its features and characteristics (Dedeke, 2003).

During the past three decades, a number of researchers have sought to discover the attributes of services that contribute most significantly to quality assessments (Gronroos, 1984; Parasuraman et al., 1985; Pitt et al., 1995). Among them, the work of Parasuraman et al. (1985) has been regarded as the most prominent; it reveals ten dimensions of service quality: (1) Tangibles; (2) Reliability; (3) Responsiveness; (4) Communication; (5) Credibility; (6) Security; (7) Competence; (8) Courtesy; (9) Understanding the customer and (10) Access. Based on these dimensions, Parasuraman et al. (1985) developed a model of the determinants of perceived service quality, in which perceived service quality is the consumer's comparison between the expected service and the perceived service. Parasuraman et al. (1988) further purified and distilled these ten dimensions into five: (1) Tangibles; (2) Reliability; (3) Responsiveness; (4) Assurance and (5) Empathy. In turn, these five attributes constitute the base of a global measurement device for service quality, called SERVQUAL. From the customers' perspective, understanding customer satisfaction with service providers helps organizations assess current and potential ICT service providers. Previous research on ICT service quality can therefore be applied to the understanding of e-learning service quality. E-learning users do not just want an electronic device; rather, they seek a system that satisfies their electronic learning service needs and, above all, they demand the service quality that leads to their satisfaction (Martinez-Arguelles et al., 2010). Nevertheless, there is a lack of research in the area of electronic learning service quality; therefore, there is a need for a study that examines the factors that lead to service quality of e-learning in the higher education environment.
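For reference, the gap formulation that underlies SERVQUAL (perceived quality as the comparison between expected and perceived service) can be written compactly as follows; the symbols are ours, not the authors':

```latex
% Gap formulation underlying SERVQUAL (Parasuraman et al., 1985, 1988):
% the quality gap on item i is the perception score minus the expectation score,
% and a dimension score is the mean gap over the items belonging to that dimension.
Q_i = P_i - E_i, \qquad
\mathrm{SQ}_d = \frac{1}{|I_d|} \sum_{i \in I_d} Q_i
```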

This study attempts to derive the instrument dimensions of e-learning service quality by modifying the SERVQUAL model to suit the online learning context and develops a research model to examine how e-learning service quality dimensions affect overall service quality. To test the usefulness of the research model, data were collected from 189 students representing different Jordanian universities. Exploratory Factor Analysis (EFA) was conducted to examine the reliability and validity of the measurement model and multiple regression analysis, carried out in SPSS 16.0, was used to test the research model. The findings revealed that the factors that lead to service quality of e-learning in a higher education environment were interface design, reliability, responsiveness, trust and personalization. Understanding the determinants of e-learning service quality provides valuable guidance to both vendors (universities) and customers (learners). E-learning vendors can benefit from this study by focusing on the factors that affect user satisfaction. Customers can benefit from it when selecting e-learning vendors who provide as many of these e-learning service quality attributes as possible.

THEORETICAL BACKGROUND

Electronic service quality
Conceptual foundations:
From the literature, the researchers found that the terms website service quality and online service quality have been used interchangeably (Aladwani and Palvia, 2002; Lee and Lin, 2005; Piccoli et al., 2004; Van Riel et al., 2001; Zeithaml et al., 2002). As pioneers who introduced the concept of electronic service quality (e-SQ) and examined the service quality of websites and their role in delivering service quality to customers, Zeithaml et al. (2002) defined e-SQ and website service quality as “the extent to which a website facilitates efficient and effective shopping, purchasing and delivery of products and services”. This definition suggests that a quality website provides sufficient service for customers to shop comfortably and confidently, expecting fast delivery and reliable service. To achieve that, companies should understand customers' perceptions of service quality and how customers evaluate it (Zeithaml et al., 2002).

E-SQ dimensions, as identified by previous researchers including Gounaris and Dimitriadis (2003) and Novak et al. (2000), have their origin in the Technology Acceptance Model (TAM) developed by Davis (1989). Davis (1989) defined perceived ease of use as “the degree to which the prospective user expects the target system to be free of effort” and perceived usefulness as “the degree to which a person believes that using a particular system would enhance his or her job performance”. These dimensions can help companies predict consumers' behavior when they decide to use a specific technology; it is proposed that the ease of use and usefulness of a particular system affect customers' adoption of that system (Davis, 1989). Zeithaml et al. (2002) described eight dimensions used by customers when they evaluate e-SQ and the quality of websites. These criteria are:

Information availability: It refers to the availability and sufficiency of information that helps consumers find and fully enquire about the products they are interested in
Ease of use/usability: It refers to the ease of using the website, including downloading speed, design and organization
Privacy/security: Privacy refers to the degree of protecting customers’ personal information by not sharing their personal information with other websites (as in selling lists), protecting anonymity and providing informed consent. As far as security is concerned, it refers to protecting users from the risk of fraud and financial loss when they use their credit card or any other financial information. Security also refers to providing data confidentiality, security auditing, encryption and anti-virus protection
Graphic style: It refers to the attributes of a website in terms of choice of colors, layout, print size and type, photographs, graphics, animation, 3D-effects and multimedia
Fulfillment/reliability: It refers to the actual performance of the company rather than the website's performance; it is defined as the provider's ability to deliver the service or product as promised
Access: It refers to the availability of the contact information on the company’s website
Responsiveness: It refers to the promptness with which the company’s personnel give feedback to customers via e-mails
Personalization: It refers to the website’s ability to address customers’ preferences by providing personalized and customized services

Zeithaml et al. (2002), in another exploratory study, categorized the dimensions into four core and three recovery dimensions that can be used to measure customers' perceptions of e-SQ. The core dimensions include:

Efficiency: It refers to the customer’s ability to effectively access the website and check any relevant information with minimal effort
Fulfillment: It refers to the company’s actual performance in respect to the accuracy of service promises, the availability of products in stock and delivery time
Reliability: It refers to the technical functioning of a website, such as the extent to which it is available and functions properly
Privacy: It refers to the company's will and ability to maintain the integrity of customer data

The three recovery dimensions are mainly concerned with situations in which a problem needs to be solved and a “personal service” is required. These recovery dimensions include:
Responsiveness: It refers to the company’s ability to provide an appropriate problem-solving mechanism, i.e., online complaint ability, handling returns mechanism, online guarantees
Compensation: It refers to a money-back guarantee, return of shipping and handling costs
Contact points: It refers to the company’s ability to offer a live contact and customer support in real-time via online or other communication means

Along the same lines, Parasuraman (2004) suggests that there are eleven e-SQ criteria that influence customers' perceptions of website quality and e-SQ: access, ease of navigation, efficiency, customization/personalization, security/privacy, responsiveness, assurance/trust, price knowledge, site aesthetics, reliability and flexibility.

CONCEPTUAL RESEARCH MODEL AND HYPOTHESES

A broad review of the relevant literature in marketing, service quality and e-service quality provided a foundation for developing the theoretical framework and for identifying, on the basis of previous research findings, the variables that might be important. E-learning service quality in the academic environment is a new area of research and most of the dimensions examined here are new to it; it is very difficult to find related studies, supported by evidence, that focus on e-learning service quality, but similarities between e-learning and e-services can be expected since both are Internet-based. In this study, much attention is paid to the measurement model of e-learning service quality in the higher education environment, based on the well-known SERVQUAL model. Many previous studies of service quality suggest that it is necessary to add to and modify items of the SERVQUAL scale developed by Parasuraman et al. (1985, 1988) and to create a unique and comprehensive conceptual model of service quality, depending on the nature of the service sector under investigation (Carman, 1990; Cronin and Taylor, 1992; Parasuraman and Grewal, 2000). Based upon this suggestion, this study proposed and tested a multi-dimensional model of service quality for e-learning.

The dimensions adopted by this study were interface design, reliability, responsiveness, trust and personalization; these dimensions are expected to shape students' overall perception of e-learning service quality in the higher education environment. The first dimension, interface design, refers to the appearance of the e-learning portal and is consistent with the tangibility dimension in the SERVQUAL model. While Parasuraman et al. (1988) define the tangibles dimension as physical appearance, such as facilities, equipment and personnel, many researchers replace this definition with the user interface when adapting to the e-service context (Aladwani and Palvia, 2002; Wolfinbarger and Gilly, 2003; Lee and Lin, 2005). The reliability dimension in the SERVQUAL model is composed of the consistency, dependability and accuracy of promised service performance (Parasuraman et al., 1988). Studies of new service-delivery options made available by computer technology found that consistency and dependability of performance is an important dimension in the measurement of service quality, because users weigh the performance risks of a new technology-based service (Cox and Dale, 2001; Dabholkar, 1996; Lee and Lin, 2005). The third dimension is responsiveness, which is similar to the responsiveness dimension in the SERVQUAL model. The SERVQUAL model (Parasuraman et al., 1988) defines responsiveness as the willingness of employees to provide prompt service and to deal with consumer complaints. According to Wang (2003), responsiveness measures the company's ability to support customers with appropriate information when a problem occurs; it also refers to the mechanism for handling returns and the ability to arrange online guarantees.

Fig. 1: Hypothesized research model

A quick response to customers' requests is an indication that the company is customer-oriented; this, in turn, reduces uncertainty and increases customers' perceived convenience (Gummerus et al., 2004; Wolfinbarger and Gilly, 2003; Lee and Lin, 2005). According to Lee and Lin (2005), trust is at the center of e-service, with much academic discourse surrounding security, privacy and confidence; it is similar to the assurance dimension in the SERVQUAL model. Kimery and McCord (2002) argue that trust is the user's willingness to accept the vulnerability of an online transaction based on positive expectations regarding future online provider behavior. Reichheld and Schefter (2000) point out that trust is a significant antecedent of participation in online settings because of the increased ease with which parties to an online transaction can behave opportunistically. Personalization in relation to service quality has been defined as caring, individualized attention for the consumer and subject knowledge of employees (Parasuraman et al., 1988). Van Riel et al. (2001) additionally define personalization in the e-service context as the degree of customization of communication and the service provider's awareness of consumer needs. Personalization is a key feature of most e-commerce business models because it offers real value to customers and creates a perception of high-quality service; the heart of personalization is satisfying the individual customer's unique needs (Huang and Lin, 2005; Van Riel et al., 2001; Lee and Lin, 2005). Figure 1 presents the hypothesized research model.

Thus, the hypotheses are stated as follows:

H1: There will be a significant positive relationship between interface design and overall service quality
H2: There will be a significant positive relationship between reliability and overall service quality
H3: There will be a significant positive relationship between responsiveness and overall service quality
H4: There will be a significant positive relationship between trust and overall service quality
H5: There will be a significant positive relationship between personalization and overall service quality
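Written as a regression specification, the hypothesized model corresponds to the following linear form (a notational sketch; the symbols are ours and H1-H5 each predict a positive coefficient):

```latex
% Overall service quality (SQ) as a linear function of the five dimensions;
% each hypothesis H1-H5 predicts a positive coefficient beta_1 ... beta_5.
\mathrm{SQ} = \beta_0
  + \beta_1\,\mathrm{InterfaceDesign}
  + \beta_2\,\mathrm{Reliability}
  + \beta_3\,\mathrm{Responsiveness}
  + \beta_4\,\mathrm{Trust}
  + \beta_5\,\mathrm{Personalization}
  + \varepsilon
```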

RESEARCH METHODOLOGY

Both primary and secondary data were collected for this research. A questionnaire was used as the main instrument for data collection. A pre-test of the questionnaire was conducted to assess the content validity of the measurement scales. Furthermore, the questionnaire was pilot tested to gain additional support for content validity and to obtain initial indications of construct validity and reliability. The subjects of this study were confined to users who had experienced e-learning services; a screening question, “Have you used e-learning before?”, was asked before the self-administered surveys were handed out. Of the 240 questionnaires obtained, 51 were incomplete or contained unreliable answers, leaving 189 usable responses from students representing different Jordanian universities. Of the students responding, 50.8% were male and 49.2% female; 88.3% were under the age of 26, while 11.3% were between 26 and 40 years old. In terms of study level, undergraduates made up the largest group with 85%, followed by master's students with 12.3% and Ph.D. students with 2.7%. In terms of university, Yarmouk University (YU) made up the largest group of respondents (46%), followed by Jordanian University (JU) with 31% and Jerash Private University (JPU) with 23%.
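For readers who want to reproduce this kind of screening and demographic summary, a minimal pandas sketch is given below. The file name, screening column and item prefixes are hypothetical placeholders, since the questionnaire coding is not published with the article.

```python
import pandas as pd

# Hypothetical file and column names; the actual questionnaire coding is not published.
df = pd.read_csv("survey_responses.csv")

# Keep only respondents who passed the screening question "Have you used e-learning before?"
df = df[df["used_elearning_before"] == "yes"]

# Drop questionnaires that are incomplete on the Likert scale items (columns prefixed "q")
scale_items = [c for c in df.columns if c.startswith("q")]
df = df.dropna(subset=scale_items)

# Demographic breakdown as percentages (cf. the proportions reported above)
for col in ["gender", "age_group", "study_level", "university"]:
    print(df[col].value_counts(normalize=True).mul(100).round(1))
```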

EMPIRICAL RESULTS

The validity and reliability of the instrument were evaluated. Construct validity was examined by factor analysis, using the principal components method with a varimax rotation. The final Exploratory Factor Analysis (EFA) solution resulted in an average of four items per dimension, with one dimension having only two items. Factor reliabilities, represented by Cronbach's alpha in the last column of Table 1, were between 0.76 and 0.82 for each factor; reliability coefficients above 0.60 are typically considered satisfactory (Pallant, 2001). Principal component factor analysis was performed and five constructs were extracted. As shown in Table 1, there were no cross-loading items. Table 1 reports the factor loadings, eigenvalues, cumulative variance explained and the Kaiser-Meyer-Olkin measure of sampling adequacy.

Table 1: Exploratory factor loading and reliability test
Cumulative variance explained (%): 69.972; Extraction method: Principal component factor; Rotation method: Varimax with Kaiser normalization; Kaiser-Meyer-Olkin measure of sampling adequacy: 0.786
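The extraction, rotation, KMO and reliability figures summarized above can be reproduced on comparable data with standard Python tooling. The sketch below uses the factor_analyzer package for principal-component extraction with varimax rotation and a hand-rolled Cronbach's alpha; the file name and item groupings are hypothetical, not the authors' instrument.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items measuring one construct."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)


items = pd.read_csv("scale_items.csv")  # hypothetical file of Likert-scale item responses

# Sampling adequacy (the paper reports KMO = 0.786)
_, kmo_overall = calculate_kmo(items)
print(f"KMO = {kmo_overall:.3f}")

# Principal-component extraction with varimax rotation, five factors retained
fa = FactorAnalyzer(n_factors=5, method="principal", rotation="varimax")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.round(2))  # cf. the factor loadings in Table 1

# Reliability per dimension (the paper reports alphas between 0.76 and 0.82);
# the item groupings below are illustrative only.
dimensions = {
    "interface_design": ["q1", "q2", "q3", "q4"],
    "reliability": ["q5", "q6", "q7", "q8"],
}
for name, cols in dimensions.items():
    print(name, round(cronbach_alpha(items[cols]), 2))
```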

Table 2: Correlations matrix between overall service quality and five e-learning service quality dimensions

Additionally, items intended to measure the same construct exhibited prominently and distinctly higher factor loadings on a single construct than on other constructs, suggesting adequate convergent and discriminant validity (Hair et al., 1998). Jointly, the observed reliability and construct validity suggested the adequacy of the measurements used in the study. Table 2 demonstrates that the correlations between overall service quality and the five e-learning service quality factors were consistently high; the correlation coefficients indicate significant positive relationships between the independent variables and the dependent variable at the 0.01 level. The regression analysis used overall service quality as the dependent variable and the five e-learning service quality factors as the independent variables. Following the initial regression run, outliers were detected by examining the standardized residuals; one outlier was found and eliminated. Missing values were handled by choosing the option “exclude cases pairwise”, which means that only cases with complete data for the pair of constructs being correlated were used to compute the correlation coefficients on which the regression analysis is based. This procedure produced 189 effective samples. The mean of all scale items within a factor was used to represent that factor.
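A minimal sketch of the correlation and regression steps described above, using statsmodels, follows. The data file, variable names and the |z| > 3 outlier cut-off are assumptions; the paper states only that one outlier was detected from the standardized residuals.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical file: one row per respondent, mean item score per factor plus overall SQ
data = pd.read_csv("factor_scores.csv")
predictors = ["interface_design", "reliability", "responsiveness", "trust", "personalization"]

# Correlation matrix between overall service quality and the five dimensions (cf. Table 2)
print(data[predictors + ["overall_sq"]].corr().round(2))

# Multiple regression of overall service quality on the five dimensions (cf. Table 3)
X = sm.add_constant(data[predictors])
y = data["overall_sq"]
model = sm.OLS(y, X).fit()
print(model.summary())

# Drop cases with large standardized residuals (an assumed |z| > 3 rule), then refit
std_resid = model.get_influence().resid_studentized_internal
keep = abs(std_resid) <= 3
refit = sm.OLS(y[keep], X[keep]).fit()
print(round(refit.rsquared, 3), round(refit.rsquared_adj, 3), round(refit.fvalue, 3))
```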

Table 3 shows the results of the multiple regression analysis. The five factors account for 67.9% of the explained variance in the overall service quality evaluation, which is significant as indicated by the F-value.

Table 3: Regression analysis results between overall service quality and e-learning service quality dimensions
Dependent variable: Overall service quality (F = 106.825, p<0.001, R = 0.824, R2 = 0.679, Adjusted R2 = 0.673); best predictor: Interface design (Beta = 0.505, p<0.001, R2 = 0.534)

F is the value of the F statistic calculated by SPSS. The significance value (Sig.) is the likelihood of committing a Type I error when rejecting the null hypothesis. In this case, Sig. = 0.000 (p<0.001), which is below the criterion alpha level of 0.05; therefore, the regression equation as computed is statistically significant for all factors. To identify the most important factors determining learners' perception of e-learning service quality in the higher education environment, a stepwise regression was used, in which the number of independent variables entered and the order of entry are determined by statistical criteria generated by the stepwise procedure. The results of this procedure appear under the SPSS output heading Coefficients, which shows which of the variables are statistically significant predictors of the dependent variable.
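SPSS's stepwise procedure has no single-call equivalent in common Python libraries; the sketch below approximates it with a simple forward selection on p-values using statsmodels. It is a rough stand-in (SPSS's stepwise method also removes previously entered variables that become non-significant), and the variable names are the hypothetical ones used in the earlier sketches.

```python
import pandas as pd
import statsmodels.api as sm


def forward_stepwise(X: pd.DataFrame, y: pd.Series, alpha_enter: float = 0.05):
    """Forward selection by p-value: at each step, add the candidate with the
    smallest p-value, stopping when no candidate meets the entry criterion."""
    selected: list[str] = []
    remaining = list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            fit = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = fit.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha_enter:
            break
        selected.append(best)
        remaining.remove(best)
    return selected, sm.OLS(y, sm.add_constant(X[selected])).fit()


data = pd.read_csv("factor_scores.csv")  # hypothetical factor-score file from the sketch above
predictors = ["interface_design", "reliability", "responsiveness", "trust", "personalization"]
order, final = forward_stepwise(data[predictors], data["overall_sq"])
print(order)            # entry order; interface design was the best predictor in the study
print(final.summary())  # coefficient table, analogous to the SPSS Coefficients output
```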

DISCUSSION

This section summarizes and discusses the results associated with the development of a suitable instrument for measuring students' perception of e-learning service quality. This includes the definition and analysis of validity and reliability as well as content and face validity. The instrument was pilot tested, evaluated and refined; sample data were analyzed with exploratory factor analysis and initial assessments were made of validity and reliability. A major issue was adjusting the scale items: deleting, adding and rewording. Exploratory factor analysis was also used to test the relationship between each set of manifest variables and the associated dimension; the questions (manifest variables) found to have significant loadings represent the common set of items that students agree constitute the e-learning service quality factors. The primary data obtained from university students support the findings of recent studies. Interface design, reliability and responsiveness in particular are the most relevant and significant drivers of e-learning service quality. In addition, the study provides support for the important role of the personalization and trust constructs derived from SERVQUAL.

CONTRIBUTIONS AND FUTURE RESEARCH

This study investigated factors that lead to service quality of e-learning in the higher education environment, which can potentially trigger a new stream of research. The following is a brief discussion of the most important contributions this research offers to theory and practice. These contributions benefit students, academic organizations and, hopefully, society and non-academic organizations as well. The study emphasizes that if a university rewards students with good academic services, it contributes to students' achievements and therefore to society. In addition, the findings can be of much use to e-government, e-commerce and electronic industry service implementations in general. Although many researchers have conducted studies designed to understand the concept of service quality and to develop reliable and valid measurements of it in different areas, a conceptual gap still exists in the academic domain. Moreover, it is very difficult to find related studies, supported by evidence, that focus on service quality of e-learning in the higher education environment. The major contribution of this research is that it is among the pioneering studies to investigate the factors that lead to service quality of e-learning in the higher education environment. As a further contribution to the general evaluation of service quality, this study has developed an extended SERVQUAL model in the e-learning service quality context. In addition, the results are in line with the overall findings across several studies in the information systems service quality area (Kettinger and Lee, 1995; Yang et al., 2004; Jiang et al., 2002; Lee and Lin, 2005).

REFERENCES

• Aladwani, A.M. and P.C. Palvia, 2002. Developing and validating an instrument for measuring user-perceived web quality. Inf. Manage., 39: 467-476.
• Al-Mushasha, N.F., 2010. Has the time for university's mobile learning come? Determining students' perception. Proceedings of the 12th International Conference on Information Integration and Web-based Applications and Services, November 8-10, 2010, Paris, France.
• Berge, L.Z. and M.P. Collins, 1995. Computer Mediated Communication and the Online Classroom in Distance Learning. Hampton Press, Cresskill, New Jersey.
• Carman, J.M., 1990. Consumer perceptions of service quality: An assessment of the SERVQUAL dimensions. J. Retailing, 66: 33-55.
• Cox, J. and B.G. Dale, 2001. Service quality and e-commerce: An exploratory analysis. Managing Serv. Qual., 11: 121-131.
• Cronin, Jr. J.J. and S.A. Taylor, 1992. Measuring service quality: A reexamination and extension. J. Market., 56: 55-68.
• Crosta, L., 2004. Beyond the use of new technologies in adult distance courses: An ethical approach. Int. J. E-Learn., 3: 48-60.
• Dabbagh, N. and A. Kitsantas, 2004. Supporting self-regulation in student-centered web-based learning environments. Int. J. E-Learn., 3: 40-47.
• Dabholkar, P.A., 1996. Consumer evaluations of new technology-based self-service options: An investigation of alternative models of service quality. Int. J. Res. Market., 13: 29-51.
• Davis, F.D., 1989. Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Quart., 13: 319-340.
• Dedeke, A., 2003. Service quality: A fulfillment-oriented and interactions-centered approach. Managing Serv. Qual., 13: 276-289.
• Gounaris, S. and S. Dimitriadis, 2003. Assessing service quality on the web: Evidence from business-to-consumer portals. J. Serv. Market., 17: 529-548.
• Gronroos, C., 1984. A service quality model and its marketing implications. Eur. J. Market., 18: 36-44.
• Gummerus, J., V. Liljander, M. Pura and A. van Riel, 2004. Customer loyalty to content-based web sites: The case of an online health-care service. J. Serv. Market., 18: 175-186.
• Hair, J.F., R.E. Anderson, R.L. Tatham and W.C. Black, 1998. Multivariate Data Analysis. Prentice Hall, New Jersey, USA.
• Huang, E.Y. and C.Y. Lin, 2005. Customer-oriented financial service personalization. Ind. Manage. Data Syst., 105: 26-44.
• Jiang, J.J., G. Klein and C.L. Carr, 2002. Measuring information system service quality: SERVQUAL from the other side. MIS Quart., 26: 145-166.
• Kettinger, W.J. and C.C. Lee, 1995. Exploring a gap model of information services quality. Inform. Resour. Manage. J., 9: 5-16.
• Kimery, K.M. and M. McCord, 2002. Third-party assurances: The road to trust in online retailing. Proceedings of the 35th Annual Hawaii International Conference on System Sciences, January 7-10, 2002, Big Island, Hawaii, pp: 63-82.
• Lee, G.G. and H.F. Lin, 2005. Customer perceptions of e-service quality in online shopping. Int. J. Retail Distrib. Manage., 33: 161-176.
• Martinez-Arguelles, M., J. Castan and A. Juan, 2010. How do students measure service quality in e-learning? A case study regarding an internet-based university. Electron. J. E-Learn., 8: 151-160.
• Novak, T.P., D.L. Hoffman and Y.F. Yung, 2000. Measuring the customer experience in online environments: A structural modelling approach. Market. Sci., 19: 22-42.
• Pallant, J., 2001. SPSS Survival Manual: A Step by Step Guide to Data Analysis Using SPSS for Windows. Allen and Unwin, Australia.
• Parasuraman, A., 2004. Assessing and improving service performance for maximum impact: Insights from a two-decade-long research journey. Perform. Meas. Metrics, 5: 45-52.
• Parasuraman, A. and D. Grewal, 2000. The impact of technology on the quality-value-loyalty chain: A research agenda. J. Acad. Market. Sci., 28: 168-174.
• Parasuraman, A., V.A. Zeithaml and L.L. Berry, 1985. A conceptual model of service quality and its implications for future research. J. Market., 49: 41-50.
• Parasuraman, A., V.A. Zeithaml and L.L. Berry, 1988. SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality. J. Retail., 64: 12-40.
• Piccoli, G., M.K. Brohman, R.T. Watson and A. Parasuraman, 2004. Net-based customer service systems: Evolution and revolution in web site functionalities. Decis. Sci., 35: 423-455.
• Pitt, L.F., R.T. Watson and C.B. Kavan, 1995. Service quality: A measure of information systems effectiveness. MIS Quart., 19: 173-187.
• Reichheld, F.F. and P. Schefter, 2000. E-loyalty: Your secret weapon on the web. Harvard Bus. Rev., 78: 105-113.
• Van Riel, A.C.R., V. Liljander and P. Jurriens, 2001. Exploring consumer evaluations of e-services: A portal site. Int. J. Service Ind. Manage., 12: 359-377.
• Wang, M., 2003. Assessment of e-service quality via e-satisfaction in e-commerce globalization. Electron. J. Inform. Syst. Dev. Countries, 11: 1-4.
• Wolfinbarger, M. and M.C. Gilly, 2003. eTailQ: Dimensionalizing, measuring and predicting etail quality. J. Retailing, 79: 183-198.
• Yang, Z., M. Jun and R.T. Peterson, 2004. Measuring customer perceived online service quality. Int. J. Oper. Prod. Manage., 24: 1149-1174.
• Zeithaml, V.A., A. Parasuraman and A. Malhotra, 2002. Service quality delivery through web sites: A critical review of extant knowledge. J. Acad. Market. Sci., 30: 362-375.
