Research Article
 

Benchmarking as a Tool for Quality Improvement in College of Business Administration: An Application of AHP



Abdul Malik Syed and Mohammad Naushad
 
ABSTRACT

Higher Education Institutions (HEIs) globally face many challenges in formalizing and systematizing benchmarking. Many HEIs simply imitate best practices without considering whether they compete on a level playing field, which ultimately results in a mismatch and brings chaos instead of improvement. Our study provides a framework for formal benchmarking as a tool, using the Analytical Hierarchy Process (AHP), for adapting best practices for quality enhancement in the College of Business Administration Al-Kharj (CBAK). Our survey revealed interesting results: the majority of respondents maintain that the most challenging task is the selection of a benchmarking partner. We construct an AHP model to choose an ideal benchmarking institution for CBAK; pre- and post-performance and dynamic sensitivity analyses identified the College of Industrial Management, KFUPM and the Business School, National University of Singapore as the ideal benchmarking partners. With the proposed benchmarking framework, CBAK can assess how well it performs in relation to these identified colleges, identify their good practices and adapt them for quality enhancement.


 
  How to cite this article:

Abdul Malik Syed and Mohammad Naushad, 2014. Benchmarking as a Tool for Quality Improvement in College of Business Administration: An Application of AHP. Journal of Applied Sciences, 14: 2087-2097.

DOI: 10.3923/jas.2014.2087.2097

URL: https://scialert.net/abstract/?doi=jas.2014.2087.2097
 
Received: November 11, 2013; Accepted: February 13, 2014; Published: May 07, 2014



INTRODUCTION

Benchmarking has emerged as a buzzword during the last decade as a tool for quality assessment and improvement. It is a planned, systematic effort with clear objectives and processes to measure and compare with the best in class. As UNESCO noted in its benchmarking study, the desire to learn from each other and to share aspects of good practice is almost as old as the university itself. Thus, improving performance by collaboration or comparison with other universities is nothing new in higher education.

Recent efforts on benchmarking, however, stress the formalization of such comparisons, which is not as easy as it seems. Higher Education Institutions (HEIs) globally face many challenges in formalizing and systematizing the benchmarking approach. Many HEIs simply imitate best practices without considering whether they compete on a level playing field, which ultimately results in a mismatch and brings chaos instead of improvement. This study provides a framework for formal benchmarking as a tool, using the Analytical Hierarchy Process (AHP), for adapting best practices for quality enhancement in Saudi Arabian business schools.

The motivation for the present study came from the benchmarking efforts and the problems and challenges faced by the College of Business Administration Al-Kharj, henceforth CBAK, the institution where the researchers themselves serve. The curriculum of CBAK, and of many other colleges in Saudi Arabia, was designed with the top business schools in the USA in mind. This was certainly informal benchmarking, done without any scientific basis, and many senior faculty lament that it could not deliver the desired results within the Saudi context. Moreover, the curriculum taught in US universities had been developed incrementally over several decades.

In addition, there is a great difference between the university entry-level requirements of Saudi and US students; US students also have English as their mother tongue, whereas Saudi students typically learn English only at the undergraduate level. Only recently, at the behest of the National Commission for Assessment and Academic Accreditation (NCAAA), have efforts been made to upgrade entry requirements in Saudi universities, with the introduction of a compulsory preparatory year, for students weak in English and/or mathematics, before formal enrolment in an undergraduate program.

Another important aspect is cultural difference: Saudi Arabia, being a Muslim country, has Islamic requirements that the curriculum must fulfil, whereas students in the USA have no such requirements.

LITERATURE REVIEW

Benchmarking in higher education: Benchmarking has emerged as one of the most efficient tools for management and quality improvement in various types of organizations. The literature on benchmarking has evolved over the years and grows richer year by year. Notable reviews include Jackson et al. (1994), Zairi and Youssef (1995, 1996), Vig (1995), Czuchry et al. (1995), Dorsch and Yasin (1998) and Dattakumar and Jagadeesh (2003).

Any sector-specific study can find ample literature to frame its research problem and the present study is no exception. In the field of education, the use of benchmarking, its importance and its different models and approaches have been studied and applied by various authors in independent studies and funded projects. Major contributors in this stream include Tang and Zairi (1998), Yarrow and Prabhu (1999), McKinnon et al. (2000), Wan Endut et al. (2000), Weeks (2000), Prasnikar et al. (2005) and Stella and Woodhouse (2007), among others. The list of such studies is extensive. Most studies in this category emphasize the use of benchmarking in Higher Education Institutions (HEIs) as a vital tool for total quality management and performance improvement.

AHP and benchmarking: The Analytical Hierarchy Process (AHP) is a multi-criteria decision-making method initially developed by Thomas L. Saaty in the late seventies. The technique has matured over time and is widely used by researchers, including frequently in benchmarking studies across various fields. Kabir et al. (2012) used a fuzzy analytic hierarchy process (FAHP) approach to support the online retail benchmarking process. Faisal et al. (2011) applied AHP to rank seventeen Total Quality Management (TQM) practices in the service industry. Joshi et al. (2011) integrated Delphi-AHP-TOPSIS techniques into a benchmarking framework for evaluating the cold-chain performance of a company. Kannan (2009) used AHP to assist ocean container carrier operators in benchmarking service quality. Raharjo et al. (2007) used AHP with Quality Function Deployment (QFD) analysis to integrate the dynamics of competitors' performance and of customer preference, along with their interaction, for benchmarking. Stella and Woodhouse (2007) used AHP to identify the best benchmarking partner for Value Management (VM) in China. Shee (2006) proposed a framework for Competence Set (CS) expansion using the analytical hierarchy process, presenting a case study of a small and medium enterprise (SME). Chan et al. (2006) employed a double-AHP methodology to benchmark the logistics performance of the postal industry.

Wang et al. (1998), after comparing the two techniques, concluded that if time, cost and difficulty are the major concerns in a company's product improvement, the prioritization-matrix method should be preferred; where accuracy is the major requirement, AHP is the better choice.

Although AHP is a well-developed tool, its applications are rarely seen in HEIs and are almost absent from benchmarking in the Saudi higher-education sector.

METHODOLOGY

The study is an empirical work based on primary data collected by administering a questionnaire; survey responses were analyzed using the Statistical Package for the Social Sciences (SPSS). The questionnaire is available from the authors upon request. Secondary data was collected mainly from the public domain. The following paragraphs explain the AHP approach adopted in this study.

AHP approach: The Analytic Hierarchy Process (AHP) is a procedure suited to complex technological, economic and socio-political decision-making (Saaty, 1990). AHP was developed by Thomas L. Saaty in the early 1970s to help individuals and groups deal with multi-criteria decisions. By incorporating both subjective and objective data into a logical hierarchical framework, AHP provides decision-makers with an intuitive, common-sense approach to evaluating the importance of every element of a decision through a pairwise comparison process (Saaty and Vargas, 1991).

There are four steps in the AHP method (Saaty, 2000). The first step involves decomposing the problem into attributes; each attribute is further decomposed into sub-attributes/alternatives down to the lowest level of the hierarchy.

The second step involves weighing each pair of attributes and sub-attributes using the rating scale developed by Saaty (2000).

The third step, evaluating, involves calculating the weight of each attribute. This step yields the overall priority for each alternative; the best choice is the alternative with the largest overall priority value.
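As an illustrative sketch only (the study itself used the Expert Choice software, described later), the weight calculation in this step can be approximated in a few lines of code: the row-geometric-mean method is a common stand-in for Saaty's principal-eigenvector computation. The judgment matrix below is a hypothetical example, not data from the study.

```python
from math import prod

def priority_weights(A):
    """Approximate the AHP priority vector by normalized row geometric means."""
    n = len(A)
    gm = [prod(row) ** (1.0 / n) for row in A]  # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]              # normalize so weights sum to 1

# Hypothetical 3x3 judgment matrix on the Saaty scale: attribute 1 moderately
# preferred over attribute 2 (3) and strongly over attribute 3 (5).
A = [[1, 3, 5],
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
weights = priority_weights(A)  # the largest weight marks the top attribute
```

The lower triangle holds the reciprocals of the upper triangle, as AHP requires, so only the upper-triangle judgments need to be elicited from experts.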

The fourth step, selecting, measures how consistent the judgments have been relative to large samples of purely random judgments (Coyle, 2004). The consistency ratio (CR) is calculated as follows:

Calculate the largest eigenvalue (λmax)
Compute the consistency index (μ) for each matrix of order n by the equation:

μ = (λmax - n)/(n - 1)

Finally, the CR of a pairwise comparison matrix is the ratio of its consistency index, μ, to the corresponding Random Index (RI) value in Table 1, i.e., CR = μ/RI

RI is obtained from a large number of simulation runs and varies with the order of the matrix (Kannan, 2009). Table 1 shows RI values for matrices of order 1-15, obtained by approximating random indices over 50,000 simulations (Saaty, 2009).

If the value of CR is equal to or less than 0.1, the evaluation within the matrix is acceptable. If CR is more than 0.1, the judgments within that matrix are inconsistent and the evaluation process should be reviewed, reconsidered and improved (Crowe et al., 1998).
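The CR computation just described can be sketched directly in code. This is an illustrative implementation, not the Expert Choice computation used in the study; the RI values below are one commonly cited table and may differ slightly from the 50,000-simulation values in Table 1.

```python
from math import prod

# Commonly cited random-index values for matrices of order 3-10; Table 1 of
# the paper (Saaty, 2009) may report slightly different simulated values.
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def consistency_ratio(A):
    """Estimate lambda_max, the consistency index mu and CR = mu/RI (n >= 3)."""
    n = len(A)
    gm = [prod(row) ** (1.0 / n) for row in A]
    s = sum(gm)
    w = [g / s for g in gm]                            # approximate priorities
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(Aw[i] / w[i] for i in range(n)) / n  # largest eigenvalue
    mu = (lam_max - n) / (n - 1)                       # consistency index
    return mu / RI[n]

# A perfectly consistent matrix (a_ij = w_i/w_j) yields CR near zero, well
# under the 0.1 acceptance threshold; a circular one fails the check.
consistent = [[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]]
circular = [[1, 3, 1/3], [1/3, 1, 3], [3, 1/3, 1]]
```

A matrix with circular preferences (1 beats 2, 2 beats 3, 3 beats 1) produces a CR far above 0.1, which is exactly the situation the review-and-revise procedure below is meant to repair.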

Saaty (2009) describes three steps to take when CR is larger than desired: (1) find the most inconsistent judgment in the matrix, (2) determine the range of values to which that judgment could be changed so that the inconsistency would improve and (3) ask the judge to consider, if possible, changing the judgment to a reasonable value in that range. Several software packages are available to carry out these steps, e.g., Smart Choice, Super Decisions and even an AHP Excel template; they not only automate the four steps mentioned above but also accelerate the work and perform it with precision. For our analysis we used the Expert Choice software.
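Step (1) above, locating the most inconsistent judgment, can also be sketched: a judgment a_ij is most inconsistent when it deviates most from the ratio w_i/w_j implied by the derived priorities. The matrix below is a hypothetical example with one deliberately distorted entry, not data from the study.

```python
from math import log, prod

def priority_weights(A):
    """Approximate AHP priorities via normalized row geometric means."""
    n = len(A)
    gm = [prod(row) ** (1.0 / n) for row in A]
    s = sum(gm)
    return [g / s for g in gm]

def most_inconsistent_judgment(A):
    """Return (i, j) of the upper-triangle judgment deviating most from w_i/w_j."""
    w = priority_weights(A)
    n = len(A)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return max(pairs, key=lambda p: abs(log(A[p[0]][p[1]] * w[p[1]] / w[p[0]])))

# Hypothetical matrix, fully consistent with weights (0.4, 0.3, 0.2, 0.1)
# except that A[0][2] was distorted from 2 to 8 (and A[2][0] to 1/8).
A = [[1, 4/3, 8, 4],
     [3/4, 1, 1.5, 3],
     [1/8, 2/3, 1, 2],
     [1/4, 1/3, 0.5, 1]]
worst = most_inconsistent_judgment(A)  # flags the distorted pair (0, 2)
```

A tool would then suggest moving that judgment toward w_i/w_j and ask the expert whether the revised value is acceptable, mirroring Saaty's step (3).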

Benchmarking survey response analysis: Collection of primary data was the most tedious and time-consuming part of this project. Initially we administered the survey online through Google and, owing to the poor response rate (less than five percent), were forced to distribute hard copies. The online survey targeted the opinions of policy makers at business schools throughout the Kingdom. The survey link was emailed to the Deans and Vice-deans of various business schools and reminders were followed up with telephone calls, but unfortunately the response rate remained poor. At a later stage we decided to visit nearby business schools in the Riyadh region in person.

The survey revealed some interesting results about benchmarking in the Kingdom. In what follows, we will present the survey instrument with the demographics of the respondents and statistical analysis of the collected data followed by the discussion of the results and conclusion.

Benchmarking survey and demographics of respondents: The survey instrument was distributed to various business schools in the Riyadh region. The sample and the characteristics of the sampled subjects are as follows.

Analysis of the sample: About 150 questionnaires were distributed to Deans, Vice-deans and heads of academic departments, in addition to faculty members, and follow-ups were made later. As a result, there were 52 respondents (response rate 34.67%) and 98 non-respondents. It is important to note that each respondent answered only one questionnaire.

Analysis of the responses to the questionnaire: Among the 52 respondents who responded to the questionnaire, there were:

Overall, 42.31% had experience of using benchmarking in their institution
About 53.85% had not used or participated in a benchmarking project at the institutional level
There were 3.85% missing cases, i.e., respondents who preferred not to answer

Table 1: Random index from 1-15 (Saaty, 2009)

Fig. 1: Summary of Institution’s experience with benchmarking

Fig. 2: Priority for adapting the benchmarking as a tool

Therefore, the affirmative response rate equals 44.01% whereas the negative response rate represents 55.09% (Fig. 1).

The majority of respondents who said they did not use benchmarking rated the adoption of benchmarking as a tool as a high priority (Fig. 2).

Characteristics of the respondents: The 52 respondents are characterized in terms of their job title, administrative position, nationality, academic department and work experience (Fig. 3).

Analysis of data: The analysis of the data collected through the survey revealed the main purposes for benchmarking in the higher education environment, process of benchmarking and the challenges associated with designing and implementing effective benchmarking regimes in the institutions.

Table 2 shows the descriptive statistics for each of ten identified purposes (McKinnon et al., 2000) for which benchmarking could be applied in the higher education environment.

In general, benchmarking has mostly been seen as building performance assessment into learning and teaching. Interestingly, benchmarking has also been seen as important for building organization-wide commitment to goals, research performance and strategic planning. General management improvement and strengthening service-support links were viewed as only moderately important objectives for undertaking benchmarking. Areas such as keeping ahead of the competition, improvement of particular functional areas and staff development were rated low as uses for benchmarking. Interestingly, while respondents agreed on the major points, some still advocated that replication can also work when undertaking benchmarking.

Process of benchmarking: To gauge respondents' perceptions of the benchmarking process, some questions addressed the process itself. Respondents across job titles agreed that benchmarking is a continuous process. In addition, they opined that benchmarking could be conducted against other higher-education institutions with similar characteristics, in the Kingdom or overseas. The majority also agreed that benchmarking could be against other business firms/organizations and, indeed, against self-determined goals based on the circumstances and life stage of the institution at the time.

Major challenges for benchmarking in business schools: The following are the major challenges raised by the respondents in reply to the open-ended question, “What are the challenges associated with designing and implementing effective benchmarking regimes in your institutions?” While the majority of respondents held that the most challenging task in benchmarking is the “selection of benchmarking partner institutions”, the responses can be summed up as follows:

Lack of information
Lack of manpower and resources

Fig. 3(a-e): Characteristics of the respondents, (a) Administrative title, (b) Job title, (c) Nationality, (d) Department and (e) Work experience

Table 2: Descriptive statistics

Lack of time, earmarked resources, availability of information for comparisons
Gap between benchmarking institutions and our students
Designing and implementing effective benchmarking regimes from the faculty perspective, since faculty lack orientation in the philosophy of benchmarking
Finding the right benchmarking institutions/partner and then adapting their way of handling issues
Lack of proper direction and availability and accessibility of various data
Convincing the management/higher authorities about the gap areas for benchmarking
Faculty and staff involvement
Lack of conceptual clarity and ad hoc usage of benchmarking
Trying to benchmark only on the secondary data and not having partnerships or collaborations to get primary data which is confidential
Transparency and integrity while sharing data/information

Thus, from the above discussion we can conclude that choosing the right partner is the biggest challenge perceived by the majority of respondents, the decision makers in benchmarking. Moreover, the comprehensive survey of literature discussed earlier found no clear guideline on a scientific approach to the selection of a benchmarking partner. Therefore, an attempt is made in the subsequent section to develop and suggest a scientific approach for benchmarking-partner selection.

ANALYTIC HIERARCHY PROCESS (AHP) MODEL

Setting up of AHP model in Expert Choice: Applying the first step of the AHP method discussed in the methodology section above, we define the objective as “identifying the ideal benchmarking partner”. The objective is then decomposed into seven criteria on which the benchmarking partner is assessed. The criteria are structured into a hierarchical form to represent the relationships between the identified factors, as shown in Fig. 4, along with the alternatives, i.e., the candidates for benchmarking.

Choosing the right criteria was a cumbersome task. We relied not only on the literature but also gave due importance to the views of major stakeholders. To obtain the views of the major stakeholders, i.e., faculty members and managers (Dean, Vice-deans and HODs), we organized a workshop in which almost all participated voluntarily. The first half of the workshop was devoted to educating the participants about benchmarking methodology in higher education and the second half was dedicated to brainstorming to identify criteria for selecting a benchmarking partner. The outcome of the workshop was the seven criteria for identifying the ideal benchmarking partner depicted in Table 3.

Next, we needed to choose the sample of candidates for benchmarking. We populated the sample with guidance from the strategic plan of our university, Salman bin Abdul-Aziz University. Moreover, we used two criteria, (1) the Academic Ranking of World Universities and (2) accreditation, to shortlist the candidates. The final list had seven universities out of the ten as possible candidates for benchmarking, shown in Table 4.

Determining weights from the pairwise comparison matrix: The initial set of seven indicators identified in step 1 was evaluated by seven experts representing six academic departments, including top management, all belonging to the authors' own college, i.e., CBAK. To fill out the pairwise comparisons, participants were given an orientation covering basic information about AHP and an extensive description of how to use the Saaty scale for pairwise comparison. Participants were then asked for their judgments, which the researchers fed directly into Expert Choice. A sample pairwise comparison matrix is shown in Fig. 5.

Fig. 4: AHP benchmarking model in expert choice

Table 3: Benchmarking criteria with explanation

Table 4: Prospective benchmarking partners/alternatives in AHP model

Fig. 5: Sample of the pair-wise comparison matrix.

AHP uses linear algebra and graph theory to calculate the relative importance weightings of the selection criteria. Once the pairwise comparison is done, Expert Choice calculates the Consistency Ratio (CR) for the criteria. Saaty's rule of thumb is to accept only judgment matrices with CR<0.1.

Initially our combined matrix had an inconsistency of 0.11, so to meet the strict limit of 0.1 we revisited the first and second judgments, i.e., those with the highest inconsistency, finally arriving at CR = 0.09. Figure 6 depicts the derived priorities for the selection of the benchmarking partner.

The majority of experts gave the highest priority to the criterion ‘has superior performance in the areas to be benchmarked’, though it remains debatable how the performance of a prospective benchmarking partner should be judged.

Table 5: Ratings of alternatives in AHP model

Fig. 6: Synthesize results for priorities with respect to goal

Fig. 7: Combined rating of alternatives

However, this raises a series of questions that need a scientific approach to answer: Who is doing it best? How do they do it? How can we adapt what they do to our institution? How can we be better than the best? To address these questions we build a second-stage AHP model that answers them by providing ratings for the alternatives (candidates to be benchmarked). In other words, we synthesize the combined ratings to find the ideal benchmarking partner and conduct sensitivity analysis across all criteria.

Rating alternatives (candidates to be benchmarked): The combined ratings of alternatives are presented in Table 5 and Fig. 7.

On synthesis, CIM-KFUPM and BS-NUS emerge as the benchmarks, with combined priorities of 75.7 and 71.5%, respectively, followed by CSB-UA with 64.4% and CSB-IU with 61.9% when the threshold is fixed at 60%.
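The synthesis step amounts to a weight-times-rating sum for each alternative. The sketch below uses hypothetical criterion names and scores (the study's actual criteria are in Table 3 and its ratings in Table 5), only to show the mechanics:

```python
# Hypothetical criterion weights (summing to 1) and alternative ratings;
# these are illustrative numbers, not the values from the study.
criteria_weights = {"superior_performance": 0.30, "accreditation": 0.20,
                    "similar_context": 0.20, "data_access": 0.15,
                    "reputation": 0.15}

ratings = {  # alternative -> rating on each criterion, scaled 0..1
    "CIM-KFUPM": {"superior_performance": 0.90, "accreditation": 0.80,
                  "similar_context": 0.90, "data_access": 0.70,
                  "reputation": 0.60},
    "BS-NUS":    {"superior_performance": 0.95, "accreditation": 0.90,
                  "similar_context": 0.40, "data_access": 0.50,
                  "reputation": 0.90},
}

# Overall priority of each alternative = sum over criteria of weight * rating.
overall = {alt: sum(criteria_weights[c] * r[c] for c in criteria_weights)
           for alt, r in ratings.items()}
best = max(overall, key=overall.get)  # the ideal benchmarking partner
```

With these made-up numbers the strong-everywhere alternative edges out the one that excels only on the heaviest criteria, which is the behavior a weighted synthesis is designed to capture.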

To see whether a relative change in the weights of the criteria causes any change in the ranks of the benchmarking partners, we performed sensitivity analysis. After a series of sensitivity analyses, CIM-KFUPM and BS-NUS emerged as winners, since there was no change in their ranking. Figures 8 and 9(a, b) show the performance and dynamic sensitivity graphs pre- and post-sensitivity, respectively.
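Dynamic sensitivity analysis of the kind Expert Choice performs can be imitated with a simple sketch: raise one criterion's weight, rescale the others proportionally so the weights still sum to one and re-rank the alternatives. All names and numbers below are hypothetical.

```python
def adjust_weight(weights, criterion, new_value):
    """Set one criterion's weight; rescale the rest so the total stays 1."""
    old_rest = sum(v for k, v in weights.items() if k != criterion)
    scale = (1.0 - new_value) / old_rest
    return {k: (new_value if k == criterion else v * scale)
            for k, v in weights.items()}

def rank(weights, ratings):
    """Alternatives sorted by overall priority, best first."""
    overall = {alt: sum(weights[c] * r[c] for c in weights)
               for alt, r in ratings.items()}
    return sorted(overall, key=overall.get, reverse=True)

weights = {"performance": 0.5, "accreditation": 0.3, "similarity": 0.2}
ratings = {"A": {"performance": 0.9, "accreditation": 0.6, "similarity": 0.7},
           "B": {"performance": 0.6, "accreditation": 0.9, "similarity": 0.8}}

base = rank(weights, ratings)                    # ranking at the base weights
# Push 'accreditation' from 0.3 to 0.6 and see whether the leader changes.
shifted = rank(adjust_weight(weights, "accreditation", 0.6), ratings)
```

In this toy case the leader flips when accreditation dominates, which is exactly the instability a sensitivity analysis is meant to expose; a robust winner, as reported above for the study's data, keeps its rank under such perturbations.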

Thus, the selection of ideal benchmarking partners is critical to the success of any benchmarking effort; the multitude of criteria makes the selection a difficult and complex task.

Fig. 8(a-b): (a) Performance sensitivity (preliminary) and (b) Dynamic sensitivity (preliminary)

Fig. 9(a-b): (a) Performance sensitivity (post) and (b) Dynamic sensitivity (post)

An analytical model based on AHP was developed for the selection of ideal HEI benchmarking partners.

CONCLUSION

The current study conducted a benchmarking survey and used the Analytical Hierarchy Process (AHP) to present a benchmarking framework that supports decision makers/managers in adapting best practices for quality improvement at CBAK.

The benchmarking survey revealed that the majority of respondents had not used or participated in any benchmarking project and regarded adopting benchmarking as a tool as their top priority. The majority of respondents also held that the most challenging task is the selection of benchmarking partner institutions.

Using a scientific approach, viz. an AHP model, the current study identified CIM-KFUPM and BS-NUS as the ideal benchmarking partners under all circumstances, based on pre- and post-performance sensitivity and dynamic sensitivity, respectively.

With the proposed benchmarking framework, CBAK can readily understand its strengths and weaknesses relative to the seven colleges chosen for this study; it can identify good practices and benchmark against them to remedy its weaknesses.

Indeed, gathering information from these partners is not an easy task, even though some information can be collected from the public domain without contacting them directly. We recommend that, in gathering benchmarking data, CBAK forge partnerships with the two ideal benchmarking partners, viz., CIM-KFUPM and BS-NUS, in an ethical and legal manner.

Furthermore, CBAK need not simply copy the best practices learnt from its partners; it can adapt them, go beyond them and use innovative means to create what is most relevant to its operational strategy. In this way it can instill a culture of continuous organizational learning, a process that provides continuous development and innovation on the way to becoming best-in-class.

A suggestion for future research could be the use of proposed framework to identify benchmarking partners for the Key Performance Indicators identified by NCAAA.

ACKNOWLEDGMENT

We are grateful to the Deanship of Scientific Research, Salman bin Abdulaziz University, KSA for providing financial assistance for our project, Grant No: 1432/?/62.

REFERENCES
1:  Jackson, A.E., R.R. Safford and W.W. Swart, 1994. Roadmap to current benchmarking literature. J. Manage. Eng., 10: 60-67.

2:  Chan, F.T.S., H.K. Chan, H.C.W. Lau and R.W.L. Ip, 2006. An AHP approach in benchmarking logistics performance of the postal industry. Benchmarking: Int. J., 13: 636-661.

3:  Coyle, G., 2004. The Analytic Hierarchy Process (AHP). In: Practical Strategy: Structured Tools and Techniques, Coyle, G. (Ed.). Pearson Education Ltd., Glasgow, Scotland, ISBN-13: 9780273682202, pp: 1-11.

4:  Crowe, T.J., J.S. Noble and J.S. Machimada, 1998. Multi-attribute analysis of ISO 9000 registration using AHP. Int. J. Qual. Reliability Manage., 15: 205-222.

5:  Czuchry, A.J., M.M. Yasin and J.J. Dorsch, 1995. A review of benchmarking literature: A proposed model for implementation. Int. J. Mater. Prod. Technol., 10: 27-45.

6:  Dattakumar, R. and R. Jagadeesh, 2003. A review of literature on benchmarking. Benchmarking: Int. J., 10: 176-209.

7:  Raharjo, H., M. Xie, T.N. Goh and A.C. Brombacher, 2007. A methodology to improve higher education quality using the quality function deployment and analytic hierarchy process. Total Qual. Manage. Bus. Excell., 18: 1097-1115.

8:  Dorsch, J.J. and M.M. Yasin, 1998. A framework for benchmarking in the public sector: Literature review and directions for future research. Int. J. Public Sector Manage., 11: 91-115.

9:  Joshi, R., D.K. Banwet and R. Shankar, 2011. A Delphi-AHP-TOPSIS based benchmarking framework for performance improvement of a cold chain. Expert Syst. Appl., 38: 10170-10182.

10:  Kannan, G., 2009. Fuzzy approach for the selection of third party reverse logistics provider. Asia Pac. J. Marketing Logist., 21: 397-416.

11:  Kabir, G., M. Ahsan and A. Hasin, 2012. Framework for benchmarking online retailing performance using fuzzy AHP and TOPSIS method. Int. J. Ind. Eng. Comput., 3: 561-576.

12:  McKinnon, K.R., S.H. Walker and D. Davis, 2000. Benchmarking: A Manual for Australian Universities. Department of Education, Training and Youth Affairs, Higher Education Division, Canberra, Australia, ISBN-13: 9780642239716, Pages: 167.

13:  Zairi, M. and M. Youssef, 1995. A review of key publications on benchmarking part I. Benchmark. Qual. Manage. Technol., 2: 65-72.

14:  Zairi, M. and M.A. Youssef, 1996. A review of key publications on benchmarking: Part II. Benchmarking Qual. Manage. Technol., 3: 45-49.

15:  Prasnikar, J., Z. Debeljak and A. Ahcan, 2005. Benchmarking as a tool of strategic management. Total Qual. Manage. Bus. Excellence, 16: 257-275.

16:  Shee, D.Y., 2006. An analytic framework for competence set expansion: Lessons learned from an SME. Total Qual. Manage. Bus. Excellence, 17: 981-997.

17:  Vig, S.N., 1995. Benchmarking: A select bibliography. Productivity, 36: 521-524.

18:  Saaty, T.L. and L.G. Vargas, 1991. Prediction, Projection and Forecasting: Applications of the Analytic Hierarchy Process in Economics, Finance, Politics, Games and Sports. 2nd Edn., Kluwer Academic Publishers, Boston, MA., ISBN-13: 9780792391043, Pages: 251.

19:  Saaty, T.L., 1990. The Analytic Hierarchy Process: Planning, Priority Setting, Resource Allocation. 2nd Edn., RWS, University of Pittsburgh, USA., ISBN-13: 9780962031724, Pages: 287.

20:  Saaty, T.L., 2000. Fundamentals of Decision Making and Priority Theory with the Analytic Hierarchy Process. 2nd Edn., RWS Publications, Pittsburgh, PA., ISBN-13: 978-0962031762, Pages: 527.

21:  Saaty, T.L., 2009. Principia Mathematica Decernendi = Mathematical Principles of Decision Making: Generalization of the Analytic Network Process to Neural Firing and Synthesis. RWS, Pittsburgh PA., ISBN-13: 9781888603101.

22:  Stella, A. and D. Woodhouse, 2007. Benchmarking in Australian higher education: A thematic analysis of AUQA audit reports. Australian Universities Quality Agency, Melbourne, VIC. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.71.8&rep=rep1&type=pdf.

23:  Faisal, T., Z. Rahman and M.N. Qureshi, 2011. Prioritising the practices of total quality management: An analytic hierarchy process analysis for the service industries. Total Qual. Manage. Bus. Excellence, 22: 1331-1351.

24:  Wan Endut, W.J., M. Abdullah and N. Husain, 2000. Benchmarking institutions of higher education. Total Qual. Manage., 11: 796-799.

25:  Tang, K.H. and M. Zairi, 1998. Benchmarking quality implementation in a service context: A comparative analysis of financial services and institutions of higher education. Part III. Total Qual. Manage., 9: 669-679.

26:  Wang, H., M. Xie and T.N. Goh, 1998. A comparative study of the prioritization matrix method and the analytic hierarchy process technique in quality function deployment. Total Qual. Manage., 9: 421-430.

27:  Weeks, P., 2000. Benchmarking in higher education: An Australian case study. Innovations Educ. Train. Int., 37: 59-67.

28:  Yarrow, D.J. and V.B. Prabhu, 1999. Collaborating to compete: Benchmarking through regional partnerships. Total Qual. Manage., 10: 793-802.
