ABSTRACT
In the pursuit of excellence, it is increasingly important to identify the demands and values of customers, and service quality has been identified as one such demand. The purpose of the current study is therefore to develop a multidimensional scale to measure the service quality of higher education. A questionnaire consisting of 43 items was developed to measure the construct and its dimensions. Construct validation using exploratory factor analysis showed an interpretable latent structure consisting of twelve factors. It is thus concluded that service quality in the higher education setting comprises twelve factors: visual appeal, outcome, campus, reputation, input quality (students), industry interaction, support facilities, input quality (faculty), interpersonal relationships, curriculum, academic facilities and processes.
DOI: 10.3923/ajm.2010.144.154
URL: https://scialert.net/abstract/?doi=ajm.2010.144.154
INTRODUCTION
Following the manufacturing and service industries, higher education is one of the latest sectors to be challenged with Total Quality Management (TQM) concepts and methodologies. A customer-oriented approach to quality, adopted from this philosophy, necessitates investigating the needs of customers. This is reinforced by the fact that quality of service is, in general, subjective, unlike quality of products, which can be measured objectively; an appropriate way of measuring it is therefore to assess the perceptions of consumers. In order to attract customers, serve their needs and retain them, service providers and researchers are actively involved in understanding consumers' expectations and perceptions of service quality. In a highly competitive environment, students have become more discriminating in their selection of, and more demanding toward, the colleges and universities they choose. It is therefore important for universities to understand their expectations. Constant research on and analysis of education service quality is a necessary prerequisite for its improvement.
The identification of the dimensions that signal quality, and the achievement of excellence in higher education, have emerged in this decade as key issues facing academia. In the area of higher education, the concept of what constitutes quality has not been thoroughly addressed, although some interesting studies exist (Srikanthan and Dalrymple, 2003; Hill et al., 2003; Abdullah, 2006a, b; Sahney et al., 2006). There is a vast field of general research in service quality. According to Chumpitaz and Swaen (2002), the number and nature of service quality dimensions are directly related to the service under investigation; there is thus a need for a sector-specific scale to measure service quality. The present research therefore tries to find which dimensions are used by students, the primary customers of the education system, in evaluating the service quality of institutes of higher education.
The current research begins with a review of the relevant literature, which leads to the research objectives, followed by a brief description of the research methodology. Subsequently, results are presented and discussed. Finally, limitations and directions for future research are provided.
Researchers have tried to define general quality dimensions, particularly concerning services. The best-known set of dimensions has been proposed by Parasuraman et al. (1985) and Zeithaml et al. (1996). The authors later developed their framework and condensed the original ten dimensions into five comprehensive dimensions (Berry and Parasuraman, 1991). In an alternative framework, Gronroos (2000) presents a compilation of seven criteria of service quality. Gerhard et al. (1997) found two dimensions of service quality, while Carman (1990) suggested seven dimensions of service quality. Cronin and Taylor (1992) argued against the conclusion that service quality is multidimensional and instead proposed that the construct is unidimensional. In the end, researchers have concluded that the dimensionality of service quality is situation-specific (Holdford and Patkar, 2003). Some researchers have classified service quality under two broad categories: technical and functional (Gronroos, 1984; Parasuraman et al., 1985, 1988, 1991; Lewis, 1989). Thus, we can conclude that there is considerable debate in the literature (Cronin and Taylor, 1992, 1994; Teas, 1993, 1994; Parasuraman et al., 1994) about how best to conceptualize and operationalize the service quality construct, and about the relationship between, and relative importance of, the key variables that relate to it. The service quality construct is still considered by many as unresolved (Caruana et al., 2000) and/or far from conclusive (Athanassopoulos, 2000).
There is considerable debate about the best way to define service quality in HE (Becket and Brookes, 2006). Quality in education has been defined variously as excellence in education (Peters and Waterman, 1982); fitness for purpose, and fitness of the educational outcome and experience for use (Juran and Gryna, 1988); conformance of education output to planned goals, specifications and requirements (Crosby, 1979); defect avoidance in the education process (Crosby, 1979); and meeting or exceeding customers' expectations of education (Parasuraman et al., 1985). Sahney et al. (2006) defined quality in education as a multiple concept that includes within its ambit the quality of inputs in the form of students, faculty, support staff and infrastructure; the quality of processes in the form of the learning and teaching activity; and the quality of outputs in the form of the enlightened students who move out of the system. In fact, it is all-permeating, covering all aspects of academic life. Allen and Davis (1991) and Holdford and Patkar (2003) defined educational service quality as a student's overall evaluation of the services received as part of their educational experience.
Table 1 shows the service quality dimensions identified by various researchers in different higher education settings. Assimilation of the literature reviewed suggests reputation, outcome quality, input quality (faculty/students), curriculum, academic facilities, academic processes, campus, support facilities, non-academic processes and interpersonal relationships as the set of factors students consider while evaluating the service quality offered by an institute. Thus, the research is carried out with the objectives:
• To validate the dimensions of service quality in higher education
• To validate the instrument to measure service quality in the higher education setting
Table 1: Service quality dimensions as identified by various researchers
MATERIALS AND METHODS
A quantitative study, involving the administration of a survey, was conducted during the period July 2008-July 2009 in order to empirically validate the factors of service quality in higher education. The survey approach was chosen as it is by far the most common method of primary data collection in marketing research. It has the advantages of ease, reliability and simplicity, and it also simplifies the coding, analysis and interpretation of data.
Sample
A total of 280 students from six technical institutions were surveyed, from whom 256 properly filled questionnaires were obtained. Most of the respondents were enrolled in the second or third year of BE, MBA, MCA, PGHMCT and M.Arch programs. As for the overall rating of service quality, the majority of respondents gave positive views, with a mean of 3.8 on the 5-point Likert scale.
Tools
Data Collection
A self-developed instrument consisting of 43 items was used for data collection. Respondents were asked to indicate their perception of each service quality item on a five-point scale ranging from 1 (strongly disagree) to 5 (strongly agree).
Data Analysis
Factor analysis has been recognized as a powerful and indispensable method of construct validation (Kerlinger, 1973) that is at the heart of the measurement of psychological constructs (Nunnally and Bernstein, 1994). Accordingly, factor analysis was performed using SPSS to analyze the data in the present study.
RESULTS
Content Validity
The objective of item creation is to ensure content validity. Content validity is the representativeness of the construct domain (Carmines and Zeller, 1979; Kerlinger, 1973). To generate a representative sample of items and achieve content validity, a variety of procedures were employed in this study. The first procedure was a content analysis of the literature. The selected literature spanned both academic and professional journals and books in service quality, as well as other disciplines. The content analysis of the literature elicited 54 items associated with the service quality construct. These items depict the primary means, mentioned in the literature, by which the service quality construct may be realized in practice. As such, they represent indicators of the service quality extended by institutes of higher education to their students. Accordingly, an instrument was designed to measure service quality. The questionnaire was then pretested with seven professors in various fields of management. Respondents remarked on the descriptions of the items purported to comprise the service quality construct. They were asked to note any item that should be added, deleted or modified, and they also commented on each item's meaningfulness and readability. Refinements were made to the instrument based on their suggestions, and eight service quality items were deleted. A pilot test of the instrument was then undertaken with eleven subject experts, who were asked to evaluate the questionnaire in a similar fashion to the pretest respondents. In addition, these eleven respondents were asked to give an overall impression of the instrument's ability to capture the multidimensional nature of the service quality construct, and wording and format refinements were made based on their comments. Their responses suggested that the items adequately covered the content domain, as they made no suggestions for additional items.
The content validity of the measurement instrument was then investigated by executing a procedure developed by Lawshe (1975) for quantitatively assessing content validity. This procedure employed a content evaluation panel of individuals knowledgeable about the concept being measured. The panel consisted of thirteen experts from both industry and academia. The panelists responded to each activity item's relation to service quality on a two-point scale: YES means item measures service quality and NO means item does not measure service quality. Based on these data, a Content Validity Ratio (CVR) was computed for each item from the formula:
CVR = N*/N
where N* is the frequency count of the number of panelists rating the item as YES and N is the total number of respondents. All but three items exhibited a CVR of 0.9, indicating that the panel considered them important to the service quality concept. According to Lawshe (1975), this majority vote indicates content validity for the items.
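As a minimal sketch, the CVR computation described above (the agreement proportion N*/N used in this study) can be reproduced in a few lines. The panel ratings shown here are hypothetical, not the study's actual data:

```python
def content_validity_ratio(yes_votes, n_panelists):
    """CVR as defined in the text: the proportion of panelists rating the item YES."""
    return yes_votes / n_panelists

# Hypothetical YES counts for three items from a 13-member panel
ratings = {"item_1": 13, "item_2": 12, "item_3": 9}
for item, yes in ratings.items():
    print(f"{item}: CVR = {content_validity_ratio(yes, 13):.2f}")
```

Items whose CVR falls below the chosen cutoff (0.9 in this study) would be candidates for deletion.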
Construct Validity
The construct validation study focused on a single research question: will exploratory factor analysis of items in the service quality survey result in an interpretable factor structure of latent value constructs consistent with the ten-factor model suggested by the authors after the literature review? An exploratory principal components factor analysis was conducted to assess the construct validity of the instrument. The factor analysis was run without specifying the number of factors to be extracted. The Kaiser-Meyer-Olkin Measure of Sampling Adequacy (MSA), a measure of the data set's appropriateness for factor analysis, was 0.85; data sets with MSA values above 0.80 are considered appropriate for factor analysis (Hair et al., 1998). The result was a solution with 12 factors exhibiting eigenvalues greater than 1.0, which explained 66.98% of the covariance among the items. The rotated factor solution met the following three criteria: simplicity (Sethi and King, 2007), interpretability and the percent of variance explained (Lederer and Sethi, 1992).
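The retention rule used above (keep components whose eigenvalue exceeds 1.0, i.e., the Kaiser criterion) can be illustrated outside SPSS. The sketch below, using only NumPy on simulated data (not the study's 256 x 43 response matrix), computes the eigenvalues of the item correlation matrix, counts how many exceed 1.0, and reports the variance they explain:

```python
import numpy as np

def kaiser_retained(R):
    """Given a correlation matrix R, return the number of components with
    eigenvalue > 1 (Kaiser criterion) and the % of variance they explain."""
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending eigenvalues
    retained = eigvals[eigvals > 1.0]
    pct_explained = 100.0 * retained.sum() / eigvals.sum()
    return len(retained), pct_explained

# Hypothetical data: simulated responses for 256 students on 43 items
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 43))
R = np.corrcoef(X, rowvar=False)          # 43 x 43 item correlation matrix
k, pct = kaiser_retained(R)
print(k, round(pct, 2))
```

With real Likert data one would also rotate the retained solution (e.g., varimax) before interpreting loadings, as the study does.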
The first factor (eigenvalue = 11.67) was labeled processes and accounted for 9.55% of the covariance. The seven items defining this factor, with loadings ranging from 0.72 to 0.44, include: opportunities to participate in and organize a variety of co-curricular activities; opportunities to participate in and organize a variety of sports activities; seminars/workshops are organized; the institute provides career information and guidance; the orientation/induction program is helpful in settling down; administrative processes such as registration and examination are hassle-free; and contemporary teaching methods are used.
The second factor (eigenvalue = 3.4) was labeled academic facilities and accounted for 8.9% of the covariance. The five items describing this factor, with loadings ranging from 0.81 to 0.63, relate to: sufficient academic equipment (e.g., laboratories, workshops); up-to-date computer laboratories; clean, spacious and well-equipped classrooms; easy access to information sources (e.g., books, journals, software, information networks); and a library offering a wide range of resources.
The third factor (eigenvalue = 2.61) was labeled curriculum and accounted for 7.1% of the covariance. The five items comprising this factor, with loadings ranging from 0.79 to 0.58, include: course content reflects industry and social needs; curricula are research-based; the institute is responsive to industry evaluations of the curriculum; the institute's curricula are balanced, relevant and well organized; and the institute is responsive to student evaluations of the curriculum.
The fourth factor (eigenvalue = 2.1) was labeled interpersonal relationships and accounted for 6.0% of the covariance. The three items describing this factor, with loadings ranging from 0.71 to 0.61, include: interaction with classmates, course mates and alumni is good; interaction with faculty is good and motivating; and interaction with staff is good and supportive.
The fifth factor (eigenvalue = 1.7) was labeled input quality (faculty) and accounted for 5.9% of the covariance. The two items comprising this factor, with loadings of 0.74 and 0.52, are: faculty and staff are competent; and faculty and staff keep themselves updated.
The sixth factor (eigenvalue = 1.6) was labeled support facilities and accounted for 5.1% of the covariance. The four items describing this factor, with loadings ranging from 0.70 to 0.45, include: healthcare facilities are available and approachable; the dining hall provides a variety of food at convenient hours; recreational facilities are available and approachable; and the institute provides clean and safe accommodation.
The seventh factor (eigenvalue = 1.43) was labeled industry interaction and accounted for 5.0% of the covariance. The four items defining this factor, with loadings ranging from 0.79 to 0.72, refer to: on-the-job training is organized; summer internships are organized; industrial tours are organized; and guest lectures from industry experts are organized.
The eighth factor (eigenvalue = 1.39) was labeled input quality (students) and accounted for 4.9% of the covariance. The two items defining this factor, with loadings of 0.84 and 0.79, are: the admission procedure is appropriate; and the admission procedure is fair.
The ninth factor (eigenvalue = 1.25) was labeled reputation and accounted for 3.7% of the covariance. The four items describing this factor, with loadings ranging from 0.64 to 0.56, include: alumni are doing well in their respective fields; the institute has a good academic reputation; the degree obtained helps in getting a good placement; and the degrees awarded have high credibility.
The tenth factor (eigenvalue = 1.18) was labeled campus and accounted for 3.5% of the covariance. The three items describing this factor, with loadings ranging from 0.79 to 0.42, include: the institute is ideally located; the campus has a good layout and appearance; and the campus provides an ambience conducive to study/research.
The eleventh factor (eigenvalue = 1.1) was labeled outcome and accounted for 3.2% of the covariance. The two items describing this factor, with loadings of 0.73 and 0.43, include: technical knowledge/skills, communication and soft skills acquired help in career development; and knowledge generated by research at the institute is acknowledged in academic circles, industry and society.
The twelfth factor (eigenvalue = 1.0) was labeled visual appeal and accounted for 3.2% of the covariance. Only one item loaded above 0.4 on this factor, with a loading of 0.74.
These twelve factors specify service quality in higher education and provide evidence supporting the construct validity of the instrument.
Reliability
Reliability refers to the lack of measurement error in the items on a scale (Kerlinger, 1973). The reliability of the instrument in this study was determined by computing the internal consistency coefficient, Cronbach's alpha, for each of the dimensions determined from the factor analysis. Nunnally (1967) advised that a magnitude of 0.5 to 0.6 for Cronbach's alpha is sufficient in the early stages of basic research, but that an alpha of 0.8 is more desirable. The alpha coefficients for all twelve factors were above 0.6, with eight above 0.8.
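Cronbach's alpha for a factor's items can be computed directly from the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The following sketch uses hypothetical 5-point Likert responses, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses for one factor: 6 respondents x 4 Likert items
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(round(cronbach_alpha(scores), 2))
```

In the study this computation was run once per factor, with each factor's items grouped according to the rotated loadings.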
DISCUSSION
The results of the present research relate to the determinants of service quality and support the existing literature. They show that education service quality is a multidimensional concept, which is in line with the findings of Abdullah (2006a, b), who developed a framework on service quality in higher education named HEdPERF. It is important to note that the twelve factors extracted did not conform to the well-known SERVQUAL instrument, in which five factors were identified, namely responsiveness, reliability, empathy, assurance and tangibles, said to represent the generic dimensions of service quality (Parasuraman et al., 1991). Likewise, many subsequent studies of service quality in a variety of service industries have also failed to recover the five dimensions of service quality (Buttle, 1996; Robinson, 1999).
Results of the factor analysis deviate a little from the ten factors we proposed initially. Our conceptual model had two factors related to academic and non-academic processes, but the factor analysis has shown processes as one factor containing items related to both; the reason may be that students do not discriminate between academic and non-academic processes. Our proposed model had a single factor for input quality (students and faculty), but the factor analysis has separated student and faculty quality into two different factors and given faculty quality more importance than student quality. Industry interaction has emerged as a new factor. This may be because professional education needs a great deal of practical exposure and, as the area of our research is limited to technical education, all the students in the sample are from professional courses. In line with the findings of Bitner (1990), the factor analysis has also yielded a new factor named visual appeal.
Reputation is well accepted by most researchers (Joseph et al., 2005; Joseph and Beatriz, 1997; LeBlanc and Nguyen, 1999; Sohail and Shaikh, 2004; Ford et al., 1999; Adee, 1997) as an important factor in determining the service quality of educational institutes. Alumni, through their contribution to industry and society, help enhance the perceived image of the institute in its stakeholders' minds. Although no past study in the education setting has given importance to this factor, the authors strongly suggest it as a contributor to the reputation of an institute, based on research concluding that nothing can be more effective than the product's successful performance.
Faculty knowledge and competence (Shank et al., 1996; Sahney et al., 2006; Owlia and Aspinwall, 1996), contact personnel (Sohail and Shaikh, 2004; LeBlanc and Nguyen, 1999) and academic staff (Joseph et al., 2005) have been accepted by researchers as important dimensions of service quality. The entrance procedure determines the quality and caliber of incoming students (India Today, 2004).
Outcome quality refers to the outcome of the service and represents what the customer gains from it (McDougall and Levesque, 1994). Campus opportunities (Joseph and Beatriz, 1997; Ford et al., 1999) and preparation for employment (Joseph et al., 2005) are factors identified in earlier studies. Competencies gained by students in terms of knowledge, leadership qualities, team building and communication make them better suited for future jobs, and industry as a stakeholder lays stress on this factor. Research quality has not been taken as a factor of service quality in the literature searched; the probable reason is that most studies are conducted assuming students to be the only customers of the institute.
Curriculum (Sohail and Shaikh, 2004; Kwan and Ng, 1999; LeBlanc and Nguyen, 1999; Sahney et al., 2006; Owlia and Aspinwall, 1996), academic facilities (Joseph and Beatriz, 1997; Lagrosen et al., 2004; Holdford and Patkar, 2003; Owlia and Aspinwall, 1996; Shank et al., 1996) and teaching methodology (Sohail and Shaikh, 2004; LeBlanc and Nguyen, 1999; Kwan and Ng, 1999; Cook, 1997; Sahney et al., 2006; Owlia and Aspinwall, 1996; Lagrosen et al., 2004) are all well accepted factors for determining service quality in higher education setting.
Students spend an important part of their lives on the institute's campus, so the quality of life of students on campus becomes an important factor in determining the perceived service quality of the institute. Support facilities, social activities and interpersonal relationships are important dimensions of quality of life. As can be noted in Table 1, almost all the researchers in the literature searched seem to agree that these three determinants of quality of life are measures of service quality in higher education.
First, as a theoretical and empirical contribution, the study identifies and validates the factors of service quality in higher education. The twelve identified factors are particularly useful for comparing institutes and thus for recognizing competitive advantages. The instrument developed can be used to measure levels of student-perceived service quality for institutions; service quality with respect to each factor can also be measured. The service quality score for a particular factor indicates the service quality delivered by an institution from the students' perspective with respect to that factor, while the scores for all factors provide a comprehensive picture of the level of service quality. Institutions can use these indices as reference points to highlight those service quality aspects that may need further enhancement.
CONCLUSION AND FUTURE RESEARCH
It is important that the findings of this empirical research be evaluated in the light of certain limitations, since acknowledging these limitations suggests new directions for future studies. The present study was conducted in a single service sector, namely higher technical education; thus some of the results, particularly the dimensions of service quality, may be specific to that service setting. Some researchers argue that this approach may raise concerns about a lack of generalizability, but such a technique also eliminates problems associated with the effects of industry differences. Another limitation concerns the measurement items in the instrument, which were entirely described in positively worded statements and may lead to yea-saying. Some researchers consider it good practice to include both positively and negatively worded items; however, such an approach may have consequences for respondents, who can make comprehension errors and take more time to read the 43-item questionnaire. It is also suggested to carry out confirmatory factor analysis to analyze the relationships between the factors. Future research may examine which of the factors discriminate most significantly among institutions. It may also be worthwhile to develop a measuring instrument from a different perspective, that is, from other customer groups, namely internal customers, employers, government, parents and the general public. Although in higher education students must now be considered the primary customers, the industry generally has a number of complementary and contradictory customers. This study has concentrated on the student customer only, but it is recognized that education has other customer groups which must also be satisfied.
ACKNOWLEDGMENTS
Rajani Jain, Dr. Gautam Sinha and Dr. S.K. De developed the concept and research design. Rajani Jain collected and analysed the data. Discussion and implications are synthesised by Rajani Jain and Dr. Gautam Sinha.
REFERENCES
- Abdullah, F., 2006. The development of HEdPERF: A new measuring instrument of service quality for the higher education sector. Int. J. Consumer Stud., 30: 569-581.
- Allen, J. and D. Davis, 1991. Searching for excellence in marketing education: The relationship between service quality and three outcome variables. J. Market. Educ., 13: 47-55.
- Athanassopoulos, A.D., 2000. Customer satisfaction cues to support market segmentation and explain switching behavior. J. Bus. Res., 47: 191-207.
- Angell, R.J., T.W. Heffernan and P. Megicks, 2008. Service quality in postgraduate education. Qual. Assurance Educ., 16: 236-254.
- Athiyaman, A., 1997. Linking student satisfaction and service quality perceptions: The case of university education. Eur. J. Market., 31: 528-540.
- Becket, N. and M. Brookes, 2006. Evaluating quality management in university departments. Qual. Assurance Educ., 14: 123-142.
- Bitner, M.J., 1990. Evaluating service encounters: The effects of physical surroundings and employee responses. J. Market., 54: 69-82.
- Buttle, F., 1996. SERVQUAL: Review, critique, research agenda. Eur. J. Market., 30: 8-32.
- Carman, J.M., 1990. Consumer perceptions of service quality: An assessment of the SERVQUAL dimensions. J. Retail., 66: 33-55.
- Caruana, A., M.T. Ewing and B. Ramaseshan, 2000. Assessment of the three-column format SERVQUAL: An experimental approach. J. Bus. Res., 49: 57-65.
- Cook, M.J., 1997. A student perspective of service quality in education. Total Qual. Manage. Bus. Excellence, 8: 120-125.
- Cronin, J.J. Jr. and S.A. Taylor, 1992. Measuring service quality: A reexamination and extension. J. Market., 56: 55-68.
- Cronin, J.J. Jr. and S.A. Taylor, 1994. SERVPERF versus SERVQUAL: Reconciling performance-based and perceptions-minus-expectations measurement of service quality. J. Market., 58: 125-131.
- Abdullah, F., 2006. Measuring service quality in higher education: Three instruments compared. Int. J. Res. Method Educ., 29: 71-89.
- Ford, J.B., M. Joseph and B. Joseph, 1999. Importance-performance analysis as a strategic tool for service marketers: The case of service quality perceptions of business students in New Zealand and the USA. J. Service Market., 13: 171-186.
- Hill, Y., L. Lomas and J. MacGregor, 2003. Students' perceptions of quality in higher education. Qual. Assurance Educ., 11: 15-20.
- Joseph, M. and B. Joseph, 1997. Service quality in education: A student perspective. Qual. Assurance Educ., 5: 15-21.
- Juran, J.M. and F.M. Gryna Jr., 1988. Juran's Quality Control Handbook. 4th Edn., McGraw-Hill, New York, ISBN-10: 0070331766, pp: 1808.
- Kwan, P.Y.K. and P.W.K. Ng, 1999. Quality indicators in higher education: Comparing Hong Kong and China's students. Manage. Audit. J., 14: 20-27.
- Lagrosen, S., R. Seyyed-Hashemi and M. Leitner, 2004. Examination of the dimensions of quality in higher education. Qual. Assurance Educ., 12: 61-69.
- Lawshe, C.H., 1975. A quantitative approach to content validity. Personnel Psychol., 28: 563-575.
- Lederer, A.L. and V. Sethi, 1992. Root causes of strategic information systems planning implementation problems. J. Manage. Inform. Syst., 9: 25-45.
- LeBlanc, G. and N. Nguyen, 1999. Listening to the customer's voice: Examining perceived service value among business college students. Int. J. Educ. Manage., 13: 187-198.
- Lewis, B.R., 1989. Quality in the service sector: A review. Int. J. Bank Market., 7: 4-12.
- Mahapatra, S.S. and M.S. Khan, 2007. Assessment of quality in technical education: An exploratory study. J. Services Res., Vol. 7.
- McDougall, G.H.G. and T.J. Levesque, 1994. A revised view of service quality dimensions: An empirical investigation. J. Prof. Serv. Market., 11: 189-209.
- Gerhard, M., B. Christ and N. Deon, 1997. The dimensions of service quality: The original European perspective revisited. Service Ind. J., 17: 173-189.
- Owlia, M.S. and E.M. Aspinwall, 1996. Quality in higher education: A survey. Total Qual. Manage., 7: 161-172.
- Parasuraman, A., V.A. Zeithaml and L.L. Berry, 1985. A conceptual model of service quality and its implications for future research. J. Market., 49: 41-50.
- Parasuraman, A., L.L. Berry and V.A. Zeithaml, 1991. Refinement and reassessment of the SERVQUAL scale. J. Retail., 67: 420-450.
- Parasuraman, A., V.A. Zeithaml and L.L. Berry, 1994. Reassessment of expectations as a comparison standard in measuring service quality: Implications for further research. J. Market., 58: 111-124.
- Holdford, D. and A. Patkar, 2003. Identification of the service quality dimensions of pharmaceutical education. Am. J. Pharm. Educ., Vol. 67, No. 4.
- Robinson, S., 1999. Measuring service quality: Current thinking and future requirements. Market. Intell. Plann., 17: 21-32.
- Sahney, S., D.K. Banwet and S. Karunes, 2006. An integrated framework for quality in education: Application of quality function deployment, interpretive structural modelling and path analysis. Total Qual. Manage., 17: 265-285.
- Sethi, V. and W.R. King, 2007. Construct measurement in information systems research: An illustration in strategic systems. Decision Sci., 22: 455-472.
- Shank, M.D., M. Walker and T. Hayes, 1996. Understanding professional service expectations: Do we know what our students expect in a quality education? J. Prof. Services Market., 13: 71-89.
- Srikanthan, G. and J. Dalrymple, 2003. Developing alternative perspectives for quality in higher education. Int. J. Educ. Manage., 17: 126-136.
- Sohail, S. and N.M. Shaikh, 2004. Quest for excellence in business education: A study of student impression of service quality. Int. J. Educ. Manage., 18: 58-65.
- Soutar, G. and M. McNeil, 1996. Measuring service quality in a tertiary institution. J. Educ. Adm., 34: 72-82.
- Joseph, M., M. Yakhou and G. Stone, 2005. An educational institution's quest for service quality: Customers' perspective. Qual. Assurance Educ., 13: 66-82.
- Teas, R.K., 1993. Expectations, performance evaluation and consumers' perceptions of quality. J. Market., 57: 18-34.
- Teas, R.K., 1994. Expectations as a comparison standard in measuring service quality: An assessment of a reassessment. J. Market., 58: 132-139.
- Zeithaml, V.A., L.L. Berry and A. Parasuraman, 1996. The behavioural consequences of service quality. J. Market., 60: 31-46.
- Gronroos, C., 1984. A service quality model and its marketing implications. Eur. J. Market., 18: 36-44.
- Parasuraman, A., V.A. Zeithaml and L.L. Berry, 1988. SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality. J. Retail., 64: 12-40.